Wtf is matrix multiplication and addition??? like I get the rules...

wtf is matrix multiplication and addition??? like I get the rules, but I don't understand it in an abstract sense or geometrically. can someone help me grasp it? It's bugging me a lot


Are there good online resources for learning matrices? I assume having a deeper understanding of them would help with NN writing.

A matrix is a transformation. It "distorts" a space in a way that can be visualized as the stretching and rotation of vectors within that space. Matrix multiplication allows you to represent the successive application of two such transformations in a single object.

Matrix addition is harder to appreciate visually. If you understand vector addition visually, and see a matrix as a collection of column vectors describing where the matrix sends the standard basis vectors, you're off to a good start.

>Matrix multiplication
fancy dot product
>Matrix addition
fancy vector addition

For multiplication you need to realize that matrices are just linear transformations of vectors.
A linear transformation is a transformation of a vector such that: (let A be a matrix/linear transformation, v & w be vectors, and c & k be constants)
A(v+w)=(Av)+(Aw) (The transformation of 2 vectors added is the same as the 2 vectors transformed and then added)
A(c*v)=c*Av (The transformation of scaled vector v by c is the same as the transformed vector v then scaled by c)
or more compactly:
A(c*v+k*w)=c*Av+k*Aw

Therefore to figure out what a linear transformation does we can decompose a vector v into a sum of the components.
v=a*x+b*y+c*z ...
where x is the (unit) vector in the x dimension of length 1 and a is the magnitude in the x direction.
Therefore to know what a linear transformation L does to a vector v it is enough to know what it does to each dimension since:
L(v)=L(a*x+b*y+c*z)=a*L(x)+b*L(y)+c*L(z) (by it being linear)
We group these vectors L(x), L(y), L(z) into the columns of a matrix A. Multiplying A*v then means taking the first component of v, a, times the first column of A, L(x), plus the second component, b, times the second column, L(y), plus the third component, c, times the third column, L(z). That sum is a*L(x)+b*L(y)+c*L(z)=L(v)=Av.
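To make the column picture concrete, here's a small NumPy sketch (the matrix and vector values are made up for illustration): multiplying A by v gives exactly the sum of each component of v times the corresponding column of A.

```python
import numpy as np

# A hypothetical 3x3 matrix: its columns are L(x), L(y), L(z),
# i.e. where the transformation sends each basis vector.
A = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [4., 0., 1.]])

v = np.array([2., -1., 5.])  # v = 2*x + (-1)*y + 5*z

# A @ v is the linear combination of A's columns weighted by v's components
by_columns = v[0] * A[:, 0] + v[1] * A[:, 1] + v[2] * A[:, 2]

print(A @ v)        # [ 0. 14. 13.]
print(by_columns)   # same vector, built column by column
```

Same answer both ways, which is the whole point: the matrix-vector product is just bookkeeping for "weighted sum of transformed basis vectors."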

Composition of matrices is now obvious. To find where the first basis vector goes under the product B*A, take the first column of A and see where B sends it by ordinary matrix-vector multiplication, then place the result in the first column of the product. Repeat for all the dimensions and you have the result matrix.
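A quick sketch of that rule (with matrices made up for illustration): column j of B @ A is B applied to column j of A.

```python
import numpy as np

# Two arbitrary 2x2 transformations (values made up for illustration)
A = np.array([[0., -1.],
              [1.,  0.]])   # rotate 90 degrees counterclockwise
B = np.array([[2., 0.],
              [0., 2.]])    # scale everything by 2

BA = B @ A   # "first apply A, then B"

# Column 0 of B @ A is just B applied to column 0 of A
first_column = B @ A[:, 0]
print(BA[:, 0], first_column)   # both are [0. 2.]
```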

Linear Algebra Done Wrong
math.brown.edu/~treil/papers/LADW/LADW.html

They're binary operations on a noncommutative ring. They also happen to represent a large amount of stuff besides vector spaces, but in this context that's beside the point.

If you're looking at a matrix as a linear transformation of vector spaces, then you're just composing the transformations. So if your transformation A maps a basis vector [math]a=(1,0) \mapsto 2\cdot a[/math], then AA is like function composition, so [math]A(Aa)=(A^2)a=(4,0)[/math].
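Sketching that composition numerically, with a concrete (assumed) A that doubles the first coordinate:

```python
import numpy as np

A = np.array([[2., 0.],
              [0., 1.]])   # doubles the x-coordinate, leaves y alone
a = np.array([1., 0.])

print(A @ a)          # [2. 0.]
print(A @ (A @ a))    # [4. 0.] -- apply A twice
print((A @ A) @ a)    # [4. 0.] -- compose first, then apply: same thing
```

Applying A twice and applying the single composed matrix A² give the same vector, which is exactly the function-composition picture.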

They teach you the shorthand "rules" that make sense for composing linear transformations, just like how differentiation in calculus is just shorthand "rules" for finding a derivative function. If you take linear algebra at an upper division level, you should see this broken down and it will be clearer.

So uh, do they have a beef with Linear Algebra Done Right?

Obviously, or they wouldn't have written yet another linear algebra book.

Do they explain their beef? I read their introduction and they didn't explain it.

They don't hide determinants like LADR does, and unlike pretty much any other introductory LA book they introduce linear transformations first.

Think of a number.

But then think of that number as made of smaller numbers.

So, if you want to do any mathematical operations with that Big Number, you gotta make sure to take care of the smaller numbers that compose it.

Easy?

You can think of a matrix as a way to cleanly write down a system of linear equations. As you probably know, linear equations are polynomials where there is no exponent (or rather, the highest exponent is 1)

say we have a collection of points (x1,y1), (x2,y2), etc. and we want to scale them, i.e. double each point's distance from the center, making the little square in pic related bigger. This could be written out as a series of linear equations, as shown in pic related. The variables are in blue and the coefficients are in green. For each of the x' answers, we double the x value by multiplying it by two, and we ignore the y value (we just multiply it by zero). For the y' answers we go the other way around, zeroing out the x and doubling the y.
To the right, this same system is written in matrix form. Not only does it look better, but a computer can also handle and calculate the data more efficiently. Scaling is one of the simplest transforms. But think of something more complicated like a perspective transform. In perspective, the further back something is, the smaller it appears. So the z depth value needs to be included in the x and y equations.
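Since the image isn't attached here, the scaling example as code: the matrix [[2,0],[0,2]] doubles every point's distance from the origin in one multiplication.

```python
import numpy as np

scale = np.array([[2., 0.],
                  [0., 2.]])   # x' = 2x + 0y,  y' = 0x + 2y

# corners of a unit square
points = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])

# apply the transform to every point at once
scaled = points @ scale.T
print(scaled)   # each corner is now twice as far from the origin
```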

The second part of the image shows how matrices are useful for rotation. Say we want to rotate a point (x,y) 30 degrees around the origin. It would be tedious to calculate the Euclidean distance, use it as the hypotenuse to recover the old angle, and so on.
Instead, we can say x' is equal to the cosine of the old angle (theta) plus 30 degrees, and y' is equal to the sine of theta plus 30 degrees. A known trigonometric identity is that, given two angles a and b: cos(a + b) = cos(a)*cos(b) - sin(a)*sin(b) and sin(a + b) = sin(a)*cos(b) + cos(a)*sin(b).
Since we know that the sine of the old angle is just y, and the cosine is x, we can rewrite it as in the image. As you can see, it's a linear equation and can be rewritten as a matrix: that's the 2D rotation matrix.
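The resulting 2D rotation matrix, checked numerically (assuming the standard counterclockwise convention):

```python
import math

def rotate(x, y, degrees):
    """Rotate (x, y) counterclockwise about the origin.
    Rows of the 2D rotation matrix:
      [cos t, -sin t]
      [sin t,  cos t]
    """
    t = math.radians(degrees)
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

# rotating (1, 0) by 90 degrees lands on (0, 1), up to floating-point error
print(rotate(1.0, 0.0, 90))
```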

The final part of the image shows that studying the properties of matrices can make calculations more efficient. We have a system of linear equations with 5 unknown variables. We can rewrite this as a matrix equation: the coefficient matrix C times the vector of unknowns V equals the answer vector A.
Using the principles of algebra, we can rearrange the equation to get V = C⁻¹A. You can't divide matrices, but as with ordinary numbers you can multiply by the multiplicative inverse, which amounts to the same thing. Some square matrices have an inverse, and this particular C does. Computing the inverse naively takes a long time for large matrices, but with algorithms like Gauss-Jordan elimination you can find the inverse and solve the equation much faster than expected. That's why studying matrices is important.
Using the inverse matrix, we can calculate the answer to the problem.
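Since the original pic isn't here, a sketch with a made-up 3-unknown system. In practice you'd call a solver (which does elimination under the hood) rather than forming the inverse explicitly, but both give the same answer:

```python
import numpy as np

# A made-up system C @ v = a with 3 unknowns
C = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])
a = np.array([3., 8., 5.])

v_inverse = np.linalg.inv(C) @ a    # v = C^-1 a: works, but slower and less stable
v_solve = np.linalg.solve(C, a)     # elimination-based solve, the preferred route

print(v_solve)   # [0.5 2.  1.5]
```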

matrix multiplication is just composition of linear transformations

First chapter of Numerical Linear Algebra by Trefethen. Read it.

Think of matrices as a way to abbreviate sets of equations. They don't necessarily carry any geometrical information, and when they do it's mostly matrix times column vector (which is a simple transformation).
In general, matrices are meant to store numbers, and some operations have been defined to work with them, but those operations don't have to represent geometry, at least not in classic Euclidean space.

youtube.com/watch?v=XkY2DOUCWMU

Check out the rest of the series too, it's pretty good.

This, 3blue is pretty good at getting an intuitive concept out, even when you already know the concept in question.

thank you anons, this is very useful and I am genuinely grateful. I'm taking finite math for a [spoiler] business [/spoiler] degree and we're learning about matrices as systems of equations, and now we're suddenly performing operations with them, which made no sense to me.

Then this thread was for nothing.

kek

Best way I could describe I guess

youtube.com/watch?v=kjBOesZCoqc&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab&index=1

Here you go OP

I'm genuinely considering switching degrees, or at least taking math as well. I loved it in high school and I thought that math for business and economics would be in-depth, but it's literally mind-numbing. My only good classes right now are data science and statistics.

I mean, you're on Veeky Forums, so the odds you're normal enough to survive in business are slim. Go math.

an everyday example of a linear transformation by matrix multiplication is seeing the 3D world as a 2D image. you can think of it as multiplying a 2x3 matrix with a 3x1 vector to get a 2x1 vector (however, this representation doesn't cover the points at infinity)
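A minimal sketch of that 3D-to-2D idea (the simplest possible version, which just drops the z-coordinate; a real pinhole camera would also divide by depth for perspective):

```python
import numpy as np

# Orthographic-style projection: keep x and y, discard z.
# (A real perspective camera would also divide by z.)
P = np.array([[1., 0., 0.],
              [0., 1., 0.]])   # 2x3 matrix

point_3d = np.array([3., 4., 10.])   # 3x1 vector
point_2d = P @ point_3d              # 2x1 result

print(point_2d)   # [3. 4.]
```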

>youtube.com/watch?v=kYB8IZa5AuE&t=247s

Thank me later.

On the other hand, an actual math degree is seriously difficult compared to the applied math you get in other subjects.

The stuff you're asking about is the basic introduction to one of the easiest topics.

>I thought that math for business and economics would be in depth but its literally mind numbing

How does that even happen? Don't you have any friends? Have you never spoken to any advisers?