Wtf is matrix multiplication and addition??? like I get the rules...

wtf is matrix multiplication and addition??? like I get the rules, but I don't understand it in an abstract sense or geometrically. Can someone help me grasp it? It's bugging me a lot

Other urls found in this thread:

math.brown.edu/~treil/papers/LADW/LADW.html
youtube.com/watch?v=XkY2DOUCWMU
youtube.com/watch?v=kjBOesZCoqc&list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab&index=1
youtube.com/watch?v=kYB8IZa5AuE&t=247s


Are there good online resources for learning matrices? I assume having a deeper understanding of them would help with NN writing.

A matrix is a transformation. It "distorts" a space in a way that can be visualized as the stretching and rotation of vectors within that space. Matrix multiplication allows you to represent the successive application of two such transformations in a single object.
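This can be checked numerically. A quick NumPy sketch (NumPy isn't mentioned in the thread; it's just a convenient way to play with this): a rotation and a stretch applied one after the other give the same result as multiplying the matrices first and applying the product once.

```python
import numpy as np

rotate = np.array([[0.0, -1.0],
                   [1.0,  0.0]])   # rotates vectors 90 degrees counterclockwise
stretch = np.array([[2.0, 0.0],
                    [0.0, 1.0]])   # doubles the x-component

v = np.array([1.0, 0.0])

# Applying the transformations one after the other...
step_by_step = stretch @ (rotate @ v)

# ...is the same as applying their product as a single transformation.
combined = (stretch @ rotate) @ v

print(step_by_step)   # [0. 1.]
print(combined)       # [0. 1.]
```

(rotate sends (1,0) to (0,1), and the stretch leaves (0,1) alone, so both paths land on (0,1).)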

Matrix addition is harder to appreciate visually. If you understand vector addition visually, and appreciate a matrix as a collection of column vectors that describe where it sends the standard basis vectors, you're off to a good start.

>Matrix multiplication
fancy dot product
>Matrix addition
fancy vector addition

For multiplication you need to realize that matrices are just linear transformations of vectors.
A linear transformation is a map on vectors such that (let A be a matrix/linear transformation, v and w be vectors, and c and k be constants):
A(v+w) = Av + Aw (transforming the sum of two vectors is the same as transforming each vector and then adding)
A(c*v) = c*(Av) (transforming a vector scaled by c is the same as transforming the vector and then scaling by c)
or, more compactly:
A(c*v + k*w) = c*(Av) + k*(Aw)
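That compact identity holds for any matrix, so it's easy to sanity-check with random data (NumPy here is my choice, not something the thread assumes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # any matrix acts as a linear map
v = rng.standard_normal(3)
w = rng.standard_normal(3)
c, k = 2.0, -3.0

# A(c*v + k*w) == c*(A v) + k*(A w)
left = A @ (c * v + k * w)
right = c * (A @ v) + k * (A @ w)
print(np.allclose(left, right))   # True
```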

Therefore, to figure out what a linear transformation does, we can decompose a vector v into a sum of its components:
v = a*x + b*y + c*z ...
where x is the unit vector in the x direction and a is the magnitude of v in that direction.
So to know what a linear transformation L does to a vector v, it is enough to know what it does to each basis vector, since:
L(v) = L(a*x + b*y + c*z) = a*L(x) + b*L(y) + c*L(z) (by linearity)
We group these vectors L(x), L(y), L(z) into the columns of a matrix A. Multiplying A*v then means taking the first component of v, a, and multiplying it by the first column of A, L(x); adding the second component b times the second column L(y); and adding the third component c times the third column L(z). That gives a*L(x) + b*L(y) + c*L(z) = L(v) = Av.

Composite multiplication of matrices is now obvious. To find where the first basis vector goes under the product of two matrices B*A, take the first column of A and see where B sends it by ordinary matrix-vector multiplication; that becomes the first column of the resulting matrix. Repeat for all the dimensions and you have the product matrix.
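The column-by-column recipe is exactly what B @ A computes. A quick check with random matrices (again just a NumPy sketch, not something from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Column j of B @ A is B applied to column j of A:
BA = B @ A
for j in range(3):
    assert np.allclose(BA[:, j], B @ A[:, j])

# And the product acts like "apply A first, then B":
v = rng.standard_normal(3)
print(np.allclose(BA @ v, B @ (A @ v)))   # True
```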

Linear Algebra Done Wrong
math.brown.edu/~treil/papers/LADW/LADW.html

They're binary operations on a noncommutative ring. They also happen to be able to represent a large amount of stuff besides vector spaces, but in this context that's beside the point.

If you're looking at a matrix as a linear transformation of vector spaces, then you're just composing the transformations. So if your transformation A maps a basis vector [math]a=(1,0) \mapsto 2\cdot a[/math], then AA is like function composition, so [math]A(Aa)=(A^2)a=(4,0)[/math].
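That example is easy to reproduce. A sketch using a concrete matrix that doubles the x-component (my choice; the post only specifies what A does to a):

```python
import numpy as np

# A doubles the x-component of any vector, so A sends a=(1,0) to 2a
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
a = np.array([1.0, 0.0])

print(A @ a)                              # [2. 0.]
print(A @ (A @ a))                        # [4. 0.]
print(np.linalg.matrix_power(A, 2) @ a)   # [4. 0.], same as A(Aa)
```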

They teach you the shorthand "rules" that make sense for composing linear transformations, just like the differentiation "rules" in calculus are shorthand for finding a derivative function. If you take linear algebra at the upper-division level, you should see this broken down and it will be clearer.

So uh, do they have a beef with Linear Algebra Done Right?

Obviously, or they wouldn't have written yet another linear algebra book.