What is the difference between a tensor and a matrix? Is tensor just the patrician term for matrix?

Other urls found in this thread:

youtube.com/watch?v=f5liqUk0ZTw

>Is tensor just the patrician term for matrix?
this is what cs-shitters actually believe

A matrix is just a way to represent a second-order tensor. Tensors can have any order. Do they seem like the same thing to you?

So, tensors are better than matrices?

a matrix is a rank 2 tensor. tensors are generalizations

So they are better right?

A tensor is a multilinear map.

A matrix is a representation of a linear map in a fixed basis.
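
For concreteness, here's a rough NumPy sketch (my own example, nothing standard about the names): a 2-index array can be the components of a linear map in a fixed basis, or just as well the components of a bilinear map, and a multilinear map with more arguments needs more indices.

import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])       # components of a linear map R^2 -> R^2
B = np.array([[1.0, 0.0], [0.0, -1.0]])      # components of a bilinear map R^2 x R^2 -> R
T = np.zeros((2, 2, 2)); T[0, 1, 1] = 5.0    # components of a trilinear map R^2 x R^2 x R^2 -> R

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
w = np.array([5.0, 6.0])

print(A @ v)                                 # the linear map applied to v
print(np.einsum('ij,i,j->', B, u, v))        # the bilinear map evaluated at (u, v)
print(np.einsum('ijk,i,j,k->', T, u, v, w))  # the trilinear map evaluated at (u, v, w)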

a tensor is anything that obeys certain symmetry and rotation rules.

a sum of two vectors is not a tensor but a vector minus another vector is a tensor.

youtube.com/watch?v=f5liqUk0ZTw

i mean, i guess? you could think of tensors as a group and matrices as one member of that group if that helps

A tensor is also a representation of a multilinear transformation in fixed bases

But vector minus a vector is just a sum of two vectors, v-w=v+(-w), so your statement can't be true

Why as a group? Don't they form a ring like matrices do? If so then matrices are better

A tensor can be represented in a basis, but it doesn't have to be in general.

Really a tensor should just be an element of the tensor algebra of a module. By the universal property of the tensor product, this is equivalent to multilinear maps.

This

I think he meant group in the informal sense (ie. a gathering)

So he's a brainlet, isn't he? Then should I trust him that tensors are better than matrices?

Considering that you use the word brainlet and that your question does not make sense, I'm not sure how to respond.
Tensors are used to represent multilinear maps, matrices represent linear maps and that's about all there is to it.

>Is tensor just the patrician term for matrix?
Yes.

Wrong.

A tensor is just something that can be indexed:
-A vector is a 1-tensor since it is singly indexed.
-A matrix is a 2-tensor since it is doubly indexed.
-A 3-tensor is triply indexed, and so on.

Tensors also satisfy the obvious generalizations of the multiplication rules you've learned for vectors and matrices. Anyone who claims they are more complicated than this has no idea what they're talking about.
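
A rough NumPy sketch of that (my own example, not from anyone's post): matrix-vector and matrix-matrix products are just contractions of one index against another, and the same operation works for arrays with any number of indices.

import numpy as np

v = np.arange(3.0)                    # 1-tensor: one index
M = np.arange(12.0).reshape(4, 3)     # 2-tensor: two indices
T = np.arange(24.0).reshape(2, 4, 3)  # 3-tensor: three indices

print(M @ v)                                       # ordinary matrix-vector product
print(np.tensordot(T, v, axes=([2], [0])).shape)   # contract T's last index with v -> (2, 4)
print(np.einsum('abj,cj->abc', T, M).shape)        # contract T's last index with M's last -> (2, 4, 4)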

So I think I understand the difference. My question now is what are the most common ranks of tensors? It seems to me usefulness drops off as rank increases

is there an analog of tensors for nonlinear maps? is that a useful concept?

Tensors are cooler because tensor spaces can be constructed for arbitrary modules while matrices only represent something meaningful in the context of free modules.

>It seems to me usefulness drops off as rank increases
This is true. You'll never see any useful tensors above rank 4.

No not really. The point behind the tensor product is to be the "right" notion of combining two vector spaces. So the whole point of the tensor product is linearity.

i.e. Given two vector spaces V and W, the tensor product V⊗W is uniquely characterized by the fact that, for any vector space P, every bilinear map VxW-->P factors through a linear map V⊗W-->P.
(All over the same field of course)

Define a (p,q)-tensor (on a vector space V) as an element of V*⊗...⊗V*⊗V⊗...⊗V, the tensor product of p copies of V* and q copies of V. The above universal property (plus identifying V with V** in finite dimensions) makes it clear that a (p,q)-tensor can be characterized as a multilinear map
T: Vx...xVxV*x...xV*--->F, with p copies of V and q copies of V*, where F is the underlying field.
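
A coordinate-level illustration of that universal property (my own toy example, everything finite dimensional): a bilinear map b(u,v) = u^T B v on R^2 x R^3 factors through a linear map applied to the tensor product u⊗v, represented below by the outer product of coordinate vectors.

import numpy as np

B = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])   # components of a bilinear map R^2 x R^3 -> R

def b(u, v):
    return u @ B @ v               # the bilinear map itself

def b_tilde(t):
    return np.sum(B * t)           # the induced LINEAR map on R^2 ⊗ R^3, viewed as 2x3 arrays

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])
print(np.isclose(b(u, v), b_tilde(np.outer(u, v))))   # True: b factors through u⊗v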

2 (because matrices...)
3 (because they represent bilinear maps between vector spaces)
4 (because of curvature)
then, n

no, it doesn't make sense. a multilinear map between vector spaces may be represented by a tensor because it is defined by its values at basis vectors (ie. finitely many numbers). there's no reason to expect that a map with no structure at all could have a simple representation. that being said, the local behavior of smooth maps between open subsets of euclidean space is partially determined by tensors: the jacobian, the hessian, etc.
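
to make that last sentence concrete (toy example of mine, not from the post): the nonlinear map f(x,y) = (xy, x + y^2) isn't described by any single tensor, but its first-order local behavior at a point is a 2-index array, the jacobian.

import numpy as np

def f(p):
    x, y = p
    return np.array([x * y, x + y ** 2])

def jacobian(f, p, h=1e-6):
    # forward finite differences: column j approximates df/dp_j
    f0 = f(p)
    cols = []
    for j in range(len(p)):
        dp = np.zeros_like(p)
        dp[j] = h
        cols.append((f(p + dp) - f0) / h)
    return np.stack(cols, axis=1)

p = np.array([1.0, 2.0])
print(jacobian(f, p))   # approximately [[2, 1], [1, 4]], i.e. [[y, x], [1, 2y]] at (1, 2)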

dumb remark

This is unrelated, but what does "tensor" mean in the context of Google's TensorFlow and Tensor Processing Units?

The same tensors as in OP you mong, it's literally on TensorFlow's wiki page:
>The name TensorFlow derives from the operations that such neural networks perform on multidimensional data arrays. These arrays are referred to as "tensors"
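
If you want to see it for yourself, a minimal sketch (assumes a TensorFlow 2.x install; the numbers are made up):

import tensorflow as tf

t = tf.constant([[[1.0, 2.0], [3.0, 4.0]],
                 [[5.0, 6.0], [7.0, 8.0]]])   # a rank-3 tensor of shape (2, 2, 2)
print(t.shape)                                # (2, 2, 2)
print(tf.rank(t).numpy())                     # 3
print(tf.matmul(t[0], t[1]))                  # ordinary 2x2 matrix product of two slices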

You're the brainlet here.

>Is tensor just the patrician term for matrix?
I would say every matrix is a tensor but not every tensor is a matrix.

Also, the Cauchy stress tensor is not a matrix, even though for practical purposes we treat it as such.
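
Toy illustration of how that gets used (numbers made up by me): store the stress components as a 3x3 symmetric array and it acts as a linear map sending a unit surface normal n to the traction vector t = σ·n.

import numpy as np

sigma = np.array([[10.0,  2.0,  0.0],
                  [ 2.0,  5.0, -1.0],
                  [ 0.0, -1.0,  3.0]])   # stress components (say MPa), symmetric
n = np.array([0.0, 0.0, 1.0])            # unit normal of the cut surface
t = sigma @ n                            # traction vector on that surface
print(t)                                 # [ 0. -1.  3.]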

tensors are basically n-dimensional matrices

A tensor is an ndarray.

You seem like someone who knows about continuum mechanics, could you help me with some of the terminology that's confusing me?

You can have nonlinear stress tensors, right?
But we only call them tensors if they describe multilinear mappings.

That is, are nonlinear stress measures still linear mappings in some way?

CS BTFO

Square matrices form a ring, and the invertible ones form a group.

Tensors obey the tensor transformation law. Matrices only do this sometimes... when they are rank (0,2) tensors.

physicists pls go away

you mean numpy user :^)

wrong, they're (1,1)
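
Quick sketch of the difference being argued about (my own example): under a change of basis P, the array of a linear map (a (1,1)-tensor) transforms as P^-1 A P, while the array of a bilinear form (a (0,2)-tensor) transforms as P^T B P. Same kind of 2-index array, different transformation law.

import numpy as np

A = np.array([[1.0, 2.0], [0.0, 3.0]])   # components of a linear map
B = np.array([[2.0, 1.0], [1.0, 4.0]])   # components of a bilinear form
P = np.array([[1.0, 1.0], [0.0, 1.0]])   # invertible change-of-basis matrix

A_new = np.linalg.inv(P) @ A @ P         # (1,1)-tensor transformation rule
B_new = P.T @ B @ P                      # (0,2)-tensor transformation rule

v = np.array([1.0, 2.0])                 # coordinates in the new basis
print(np.allclose(P @ (A_new @ v), A @ (P @ v)))          # True: same linear map
print(np.allclose(v @ B_new @ v, (P @ v) @ B @ (P @ v)))  # True: same bilinear form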

>Is tensor just the patrician term for matrix?
No. Tensor is an abstract term that doesn't have to describe just matrices.
>What is the difference between a tensor and a matrix?
The same as with a 1xn row matrix and a vector. Vector is an abstract term. A row matrix is one example of a representation of a vector. But a vector can also be represented by arrows (sometimes called euclidean/spatial/geometric vectors). Anything that obeys certain rules can be called a vector. Those rules say that vectors are objects that may be added to other vectors or multiplied by scalar values. Both row matrices and arrows obey those rules, so they can be called vectors.
If you make your slaves somehow obey those rules, then they can be called vectors too. Same with tensors.
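
Toy version of that point (my own example): any class whose addition and scalar multiplication satisfy the vector-space axioms gives you vectors, no rows or arrows required.

class Poly:
    """Polynomials a + b*x, stored by their coefficients."""
    def __init__(self, a, b):
        self.a, self.b = a, b
    def __add__(self, other):
        return Poly(self.a + other.a, self.b + other.b)
    def __rmul__(self, scalar):
        return Poly(scalar * self.a, scalar * self.b)
    def __repr__(self):
        return f"{self.a} + {self.b}x"

p = Poly(1, 2)
q = Poly(3, -1)
print(p + q)   # 4 + 1x
print(2 * p)   # 2 + 4x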

>What is the difference between a tensor and a matrix?
a matrix is something you had in your linear algebra course and a tensor is something you didn't manage to reach