In mathematics, a symmetric tensor is an unmixed tensor that is invariant under a permutation of its vector arguments:

T(v_1, v_2, \ldots, v_r) = T(v_{\sigma(1)}, v_{\sigma(2)}, \ldots, v_{\sigma(r)})

for every permutation σ of the symbols {1, 2, ..., r}. Alternatively, a symmetric tensor of order r represented in coordinates as a quantity with r indices satisfies

T_{i_1 i_2 \cdots i_r} = T_{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(r)}}.
The space of symmetric tensors of order r on a finite-dimensional vector space V is naturally isomorphic to the dual of the space of homogeneous polynomials of degree r on V. Over fields of characteristic zero, the graded vector space of all symmetric tensors can be naturally identified with the symmetric algebra on V. A related concept is that of the antisymmetric tensor or alternating form. Symmetric tensors occur widely in engineering, physics and mathematics.
Let V be a vector space and

T \in V \otimes V \otimes \cdots \otimes V \quad (k \text{ factors})

a tensor of order k. Then T is a symmetric tensor if

\tau_\sigma T = T

for the braiding maps τ_σ associated to every permutation σ on the symbols {1, 2, ..., k} (or equivalently for every transposition on these symbols).
Given a basis {e_i} of V, any symmetric tensor T of rank k can be written as

T = \sum_{i_1, \ldots, i_k} T^{i_1 i_2 \cdots i_k} \, e_{i_1} \otimes e_{i_2} \otimes \cdots \otimes e_{i_k}

for some unique list of coefficients T^{i_1 i_2 \cdots i_k} (the components of the tensor in the basis) that are symmetric on the indices. That is to say

T^{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(k)}} = T^{i_1 i_2 \cdots i_k}

for every permutation σ.
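The component symmetry above can be checked numerically: a tensor's components are symmetric exactly when the component array is invariant under every permutation of its axes. A minimal sketch using NumPy; the example tensor v ⊗ v ⊗ v is illustrative, not taken from the text.

```python
from itertools import permutations

import numpy as np

def is_symmetric(T):
    """True if the component array T is invariant under every axis permutation."""
    k = T.ndim
    return all(np.allclose(T, np.transpose(T, perm))
               for perm in permutations(range(k)))

# An order-3 symmetric tensor built from a single vector: v ⊗ v ⊗ v.
v = np.array([1.0, 2.0, 3.0])
T = np.einsum('i,j,k->ijk', v, v, v)
print(is_symmetric(T))                               # True
print(is_symmetric(np.arange(8.).reshape(2, 2, 2)))  # False
```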
The space of all symmetric tensors of order k defined on V is often denoted by S^k(V) or Sym^k(V). It is itself a vector space, and if V has dimension N then the dimension of Sym^k(V) is the binomial coefficient

\dim \operatorname{Sym}^k(V) = \binom{N + k - 1}{k}.
We then construct Sym(V) as the direct sum of Sym^k(V) for k = 0, 1, 2, ...
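The dimension count is the number of multisets of size k drawn from N basis vectors; a short sketch (the function name `sym_dim` is ours, for illustration):

```python
from math import comb

def sym_dim(N, k):
    """Dimension of Sym^k(V) for dim V = N: C(N + k - 1, k), the number
    of multisets of k indices chosen from {1, ..., N}."""
    return comb(N + k - 1, k)

# dim Sym^2(R^3) = 6: the independent entries of a symmetric 3x3 matrix.
print(sym_dim(3, 2))  # 6
```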
There are many examples of symmetric tensors. Some include the metric tensor g_{\mu\nu}, the Einstein tensor G_{\mu\nu}, and the Ricci tensor R_{\mu\nu}.
Many material properties and fields used in physics and engineering can be represented as symmetric tensor fields; for example: stress, strain, and anisotropic conductivity. Also, in diffusion MRI one often uses symmetric tensors to describe diffusion in the brain or other parts of the body.
Ellipsoids are examples of algebraic varieties; and so, for general rank, symmetric tensors, in the guise of homogeneous polynomials, are used to define projective varieties, and are often studied as such.
Given a Riemannian manifold M equipped with its Levi-Civita connection, the covariant curvature tensor is a symmetric order-2 tensor over the vector space of differential 2-forms. This corresponds to the fact that, viewing R_{abcd} \in (T^*M)^{\otimes 4}, we have the symmetry R_{abcd} = R_{cdab} between the first and second pairs of arguments in addition to antisymmetry within each pair: R_{abcd} = -R_{bacd} = -R_{abdc}.[1]
Suppose V is a vector space over a field of characteristic 0. If T ∈ V^{⊗k} is a tensor of order k, then the symmetric part of T is the symmetric tensor defined by

\operatorname{Sym} T = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} \tau_\sigma T,

the summation extending over the symmetric group on k symbols. In terms of a basis, and employing the Einstein summation convention, if

T = T^{i_1 i_2 \cdots i_k} \, e_{i_1} \otimes e_{i_2} \otimes \cdots \otimes e_{i_k},

then

\operatorname{Sym} T = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} T^{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(k)}} \, e_{i_1} \otimes e_{i_2} \otimes \cdots \otimes e_{i_k}.
The components of the tensor appearing on the right are often denoted by

T^{(i_1 i_2 \cdots i_k)} = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} T^{i_{\sigma(1)} i_{\sigma(2)} \cdots i_{\sigma(k)}}

with parentheses () around the indices being symmetrized. Square brackets [] are used to indicate anti-symmetrization.
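The symmetrization operator Sym can be sketched concretely as an average of a component array over all axis permutations; a minimal NumPy illustration (the example tensor is arbitrary):

```python
from itertools import permutations
from math import factorial

import numpy as np

def symmetrize(T):
    """Symmetric part of T: (1/k!) * sum of T over all k! axis permutations."""
    k = T.ndim
    return sum(np.transpose(T, p) for p in permutations(range(k))) / factorial(k)

T = np.arange(8.).reshape(2, 2, 2)   # an arbitrary order-3 tensor
S = symmetrize(T)
# S is symmetric, and symmetrization is idempotent: Sym(Sym T) = Sym T.
print(np.allclose(S, np.transpose(S, (1, 0, 2))))  # True
print(np.allclose(symmetrize(S), S))               # True
```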
If T is a simple tensor, given as a pure tensor product

T = v_1 \otimes v_2 \otimes \cdots \otimes v_k,

then the symmetric part of T is the symmetric product of the factors:

v_1 \odot v_2 \odot \cdots \odot v_k = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} v_{\sigma(1)} \otimes v_{\sigma(2)} \otimes \cdots \otimes v_{\sigma(k)}.
In general we can turn Sym(V) into an algebra by defining the commutative and associative product ⊙.[2] Given two tensors T_1 ∈ Sym^{k_1}(V) and T_2 ∈ Sym^{k_2}(V), we use the symmetrization operator to define:

T_1 \odot T_2 = \operatorname{Sym}(T_1 \otimes T_2) \quad \in \operatorname{Sym}^{k_1 + k_2}(V).
It can be verified (as is done by Kostrikin and Manin[2]) that the resulting product is in fact commutative and associative. In some cases the operator is omitted: T_1 T_2 = T_1 ⊙ T_2.
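Commutativity follows because symmetrizing the tensor product erases the order of the factors; a small numerical check, assuming NumPy and our own helper names:

```python
from itertools import permutations
from math import factorial

import numpy as np

def symmetrize(T):
    """Symmetric part: average of T over all axis permutations."""
    k = T.ndim
    return sum(np.transpose(T, p) for p in permutations(range(k))) / factorial(k)

def sym_product(T1, T2):
    """T1 ⊙ T2 = Sym(T1 ⊗ T2)."""
    return symmetrize(np.tensordot(T1, T2, axes=0))

a = np.array([1.0, 2.0])
b = np.array([3.0, 5.0])
# a ⊙ b = b ⊙ a even though a ⊗ b ≠ b ⊗ a.
print(np.allclose(sym_product(a, b), sym_product(b, a)))  # True
```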
In some cases an exponential notation is used:

v^{\odot k} = \underbrace{v \odot v \odot \cdots \odot v}_{k \text{ times}} = \underbrace{v \otimes v \otimes \cdots \otimes v}_{k \text{ times}} = v^{\otimes k},

where v is a vector. Again, in some cases the ⊙ is left out:

v^k = \underbrace{v \odot v \odot \cdots \odot v}_{k \text{ times}}.
In analogy with the theory of symmetric matrices, a (real) symmetric tensor of order 2 can be "diagonalized". More precisely, for any tensor T ∈ Sym^2(V), there is an integer r, non-zero unit vectors v_1, ..., v_r ∈ V and weights λ_1, ..., λ_r such that

T = \sum_{i=1}^{r} \lambda_i \, v_i \otimes v_i.
The minimum number r for which such a decomposition is possible is the (symmetric) rank of T. The vectors appearing in this minimal expression are the principal axes of the tensor, and generally have an important physical meaning. For example, the principal axes of the inertia tensor define Poinsot's ellipsoid representing the moment of inertia. Also see Sylvester's law of inertia.
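For order 2 this decomposition is just the eigendecomposition of a symmetric matrix; a brief sketch using `numpy.linalg.eigh` (the matrix T is an illustrative example):

```python
import numpy as np

# Diagonalize a real symmetric order-2 tensor: T = sum_i lambda_i v_i ⊗ v_i,
# with the orthonormal eigenvectors v_i as the principal axes.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, V = np.linalg.eigh(T)   # eigenvalues (ascending) and eigenvector columns
recon = sum(l * np.outer(v, v) for l, v in zip(lam, V.T))
print(np.allclose(recon, T))  # True
print(lam)                    # [1. 3.]
```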
For symmetric tensors of arbitrary order k, decompositions

T = \sum_{i=1}^{r} \lambda_i \, v_i^{\otimes k}

are also possible. The minimum number r for which such a decomposition is possible is the symmetric rank of T.[3] This minimal decomposition is called a Waring decomposition; it is a symmetric form of the tensor rank decomposition. For second-order tensors this corresponds to the rank of the matrix representing the tensor in any basis, and it is well known that the maximum rank is equal to the dimension of the underlying vector space. However, for higher orders this need not hold: the rank can be higher than the number of dimensions in the underlying vector space. Moreover, the rank and symmetric rank of a symmetric tensor may differ.[4]