In linear algebra, the Gram matrix (or Gramian matrix, Gramian) of a set of vectors $v_1, \dots, v_n$ in an inner product space is the Hermitian matrix of inner products, whose entries are given by the inner product $G_{ij} = \langle v_i, v_j \rangle$.[1] If the vectors $v_1, \dots, v_n$ are the columns of a matrix $X$, then the Gram matrix is $X^\dagger X$ in the general case that the vector coordinates are complex numbers, which simplifies to $X^\top X$ for the case that the vector coordinates are real numbers.
An important application is to compute linear independence: a set of vectors is linearly independent if and only if the Gram determinant (the determinant of the Gram matrix) is non-zero.
For finite-dimensional real vectors in $\mathbb{R}^n$ with the usual Euclidean dot product, the Gram matrix is $G = V^\top V$, where $V$ is a matrix whose columns are the vectors $v_k$ and $V^\top$ is its transpose whose rows are the vectors $v_k^\top$. For complex vectors in $\mathbb{C}^n$, $G = V^\dagger V$, where $V^\dagger$ is the conjugate transpose of $V$.
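As a concrete numerical sketch (the specific vectors below are made up for illustration), the Gram matrix and the Gram determinant can be computed with NumPy; a vanishing determinant signals linear dependence:

```python
import numpy as np

# Columns of X are three vectors in R^3; the third is 2*v1 + 2*v2,
# so the set is linearly dependent.
X = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 2.0],
              [1.0, 1.0, 4.0]])

G = X.T @ X                      # Gram matrix: G[i, j] = <v_i, v_j>
print(np.linalg.det(G))          # Gram determinant: ~0 for dependent vectors

# For complex coordinates, use the conjugate transpose instead:
Z = np.array([[1.0 + 1.0j, 0.0],
              [0.0,        2.0 - 1.0j]])
G_c = Z.conj().T @ Z             # Hermitian by construction
```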
For any bilinear form $B$ on a finite-dimensional vector space over any field we can define a Gram matrix $G$ attached to a set of vectors $v_1, \dots, v_n$ by $G_{ij} = B(v_i, v_j)$. The matrix will be symmetric if the bilinear form $B$ is symmetric.
In Riemannian geometry, given an embedded $k$-dimensional Riemannian manifold $M \subset \mathbb{R}^n$ and a parametrization $\phi: U \to M$ defined on an open set $U \subset \mathbb{R}^k$ with coordinates $(x_1, \dots, x_k)$, the volume form $\omega$ on $M$ induced by the embedding may be computed using the Gramian of the coordinate tangent vectors:
$$\omega = \sqrt{\det G}\; dx_1 \cdots dx_k, \qquad G = \left[ \left\langle \frac{\partial \phi}{\partial x_i}, \frac{\partial \phi}{\partial x_j} \right\rangle \right].$$
This generalizes the classical surface integral of a parametrized surface $\phi: U \to S \subset \mathbb{R}^3$ for $(u, v) \in U \subset \mathbb{R}^2$:
$$\int_S f \, dA = \iint_U f(\phi(u, v)) \left| \frac{\partial \phi}{\partial u} \times \frac{\partial \phi}{\partial v} \right| du \, dv.$$
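For instance, for the unit sphere in the standard spherical parametrization, the Gramian of the coordinate tangent vectors reproduces the familiar area element $|\phi_u \times \phi_v| = \sin u$. A numerical sketch (the sample point is arbitrary):

```python
import numpy as np

# phi(u, v) = (sin u cos v, sin u sin v, cos u): unit sphere, 0 < u < pi.
def tangents(u, v):
    phi_u = np.array([np.cos(u) * np.cos(v), np.cos(u) * np.sin(v), -np.sin(u)])
    phi_v = np.array([-np.sin(u) * np.sin(v), np.sin(u) * np.cos(v), 0.0])
    return phi_u, phi_v

u, v = 0.7, 1.2
pu, pv = tangents(u, v)

# Gramian of the coordinate tangent vectors (the first fundamental form)
G = np.array([[pu @ pu, pu @ pv],
              [pv @ pu, pv @ pv]])

# sqrt(det G) agrees with the cross-product area element |phi_u x phi_v| = sin u
print(np.sqrt(np.linalg.det(G)), np.sin(u))
```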
If the vectors $v_1, \dots, v_n$ are centered random variables, the Gramian is approximately proportional to the covariance matrix, with the scaling determined by the number of elements in the vector.
Gramian matrices arise in covariance structure model fitting (see e.g., Jamshidian and Bentler, 1993, Applied Psychological Measurement, Volume 18, pp. 79–94).
In the finite element method, the Gram matrix arises from approximating a function from a finite-dimensional space; the Gram matrix entries are then the inner products of the basis functions of the finite-dimensional subspace.
The Gram matrix is symmetric in the case that the inner product is real-valued; it is Hermitian in the general, complex case by definition of an inner product.
The Gram matrix is positive semidefinite, and every positive semidefinite matrix is the Gramian matrix for some set of vectors. The fact that the Gramian matrix is positive-semidefinite can be seen from the following simple derivation:
$$x^\top \mathbf{G} x = \sum_{i,j} x_i x_j \left\langle v_i, v_j \right\rangle = \sum_{i,j} \left\langle x_i v_i, x_j v_j \right\rangle = \left\langle \sum_i x_i v_i, \sum_j x_j v_j \right\rangle = \left\| \sum_i x_i v_i \right\|^2 \geq 0.$$
The first equality follows from the definition of matrix multiplication, the second and third from the bilinearity of the inner product, and the last from the positive definiteness of the inner product. Note that this also shows that the Gramian matrix is positive definite if and only if the vectors $v_i$ are linearly independent (that is, $\sum_i x_i v_i \neq 0$ for all nonzero $x$).[1]
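Numerically, positive semidefiniteness can be observed in the eigenvalues of a Gram matrix. A sketch with randomly generated vectors (generic random columns are linearly independent with probability 1):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((5, 3))   # three generic (independent) vectors in R^5
G = V.T @ V                       # their Gram matrix

eig = np.linalg.eigvalsh(G)
# All eigenvalues are nonnegative (positive semidefinite); here they are
# strictly positive because the columns are linearly independent.
print(eig)
```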
For the converse, suppose $M$ is a Hermitian positive semidefinite $n \times n$ matrix of rank $k$, so that it admits a factorization $M = B^\dagger B$ for some $k \times n$ matrix $B$ (obtained, for example, from the eigendecomposition of $M$). The columns $b^{(1)}, \dots, b^{(n)}$ of $B$ can be seen as $n$ vectors in $\mathbb{C}^k$ (or $k$-dimensional Euclidean space $\mathbb{R}^k$, in the real case). Then
$$M_{ij} = b^{(i)} \cdot b^{(j)},$$
where the dot product $a \cdot b = \sum_{\ell=1}^{k} \overline{a_\ell}\, b_\ell$ is the usual inner product on $\mathbb{C}^k$.
Thus a Hermitian matrix $M$ is positive semidefinite if and only if it is the Gram matrix of some vectors $b^{(1)}, \dots, b^{(n)}$. Such vectors are called a vector realization of $M$. The infinite-dimensional analog of this statement is Mercer's theorem.
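A vector realization can be constructed explicitly from the eigendecomposition, as sketched below (the matrix $M$ is an arbitrary example chosen for illustration):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # Hermitian (real symmetric) positive semidefinite

# M = U diag(d) U^T  =>  B = diag(sqrt(d)) U^T  satisfies  B^T B = M,
# so the columns of B are a vector realization of M.
d, U = np.linalg.eigh(M)
B = np.sqrt(d)[:, None] * U.T     # rows of U.T scaled by sqrt(d)

print(B.T @ B)                    # recovers M
```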
If $G$ is the Gram matrix of vectors $v_1, \dots, v_n$ in $\mathbb{R}^k$ then applying any rotation or reflection of $\mathbb{R}^k$ (any orthogonal transformation, that is, any Euclidean isometry preserving 0) to the sequence of vectors results in the same Gram matrix. That is, for any $k \times k$ orthogonal matrix $Q$, the Gram matrix of $Q v_1, \dots, Q v_n$ is also $G$.
This is the only way in which two real vector realizations of $G$ can differ: the vectors $v_1, \dots, v_n$ are unique up to orthogonal transformations. In other words, the dot products $v_i \cdot v_j$ and $w_i \cdot w_j$ are equal if and only if some rigid transformation of $\mathbb{R}^k$ transforms the vectors $v_1, \dots, v_n$ to $w_1, \dots, w_n$ and 0 to 0.
The same holds in the complex case, with unitary transformations in place of orthogonal ones. That is, if the Gram matrix of vectors $v_1, \dots, v_n$ is equal to the Gram matrix of vectors $w_1, \dots, w_n$ in $\mathbb{C}^k$ then there is a unitary $k \times k$ matrix $U$ (meaning $U^\dagger U = I$) such that $v_i = U w_i$ for $i = 1, \dots, n$.[3]
Because $G = G^\dagger$, it is necessarily the case that $G$ and $G^\dagger$ commute. That is, a real or complex Gram matrix $G$ is also a normal matrix.
The Gram matrix of any orthonormal basis is the identity matrix. Equivalently, the Gram matrix of the rows or the columns of a real rotation matrix is the identity matrix. Likewise, the Gram matrix of the rows or columns of a unitary matrix is the identity matrix.
The rank of the Gram matrix of vectors in $\mathbb{R}^k$ or $\mathbb{C}^k$ equals the dimension of the space spanned by these vectors.[1]
The Gram determinant or Gramian is the determinant of the Gram matrix:
$$\bigl|G(v_1, \dots, v_n)\bigr| = \begin{vmatrix} \langle v_1, v_1 \rangle & \langle v_1, v_2 \rangle & \cdots & \langle v_1, v_n \rangle \\ \langle v_2, v_1 \rangle & \langle v_2, v_2 \rangle & \cdots & \langle v_2, v_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_n, v_1 \rangle & \langle v_n, v_2 \rangle & \cdots & \langle v_n, v_n \rangle \end{vmatrix}.$$
If $v_1, \dots, v_n$ are vectors in $\mathbb{R}^m$ then it is the square of the $n$-dimensional volume of the parallelotope formed by the vectors. In particular, the vectors are linearly independent if and only if the parallelotope has nonzero $n$-dimensional volume, if and only if the Gram determinant is nonzero, if and only if the Gram matrix is nonsingular. When $n > m$ the determinant and volume are zero. When $n = m$, this reduces to the standard theorem that the absolute value of the determinant of $n$ $n$-dimensional vectors is the $n$-dimensional volume. The volume of the simplex formed by the vectors is $\operatorname{Volume}(\text{parallelotope}) / n!$.
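For example, the area of a parallelogram spanned by two vectors in $\mathbb{R}^3$ is the square root of their Gram determinant, which can be cross-checked against the classical cross product (illustrative values):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 2.0, 0.0])
V = np.column_stack([v1, v2])

G = V.T @ V
area = np.sqrt(np.linalg.det(G))               # 2-dimensional volume (area)
area_cross = np.linalg.norm(np.cross(v1, v2))  # classical cross-product area

print(area, area_cross)                        # both give 2.0
# The triangle (simplex) spanned by v1, v2 has area 2.0 / 2! = 1.0
```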
When the vectors $v_1, \dots, v_n \in \mathbb{R}^m$ are linearly independent, the distance between a point $v$ and the linear span of $v_1, \dots, v_n$ is $\sqrt{\dfrac{\bigl|G(v_1, \dots, v_n, v)\bigr|}{\bigl|G(v_1, \dots, v_n)\bigr|}}$.
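This ratio of Gram determinants can be verified in a simple case where the distance is known (hypothetical numbers: the span of $v_1$ is the $x$-axis, so the distance from $v$ is its $y$-coordinate):

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v = np.array([3.0, 4.0])

G_n = np.array([[v1 @ v1]])                  # Gram matrix of v1 alone
G_n1 = np.array([[v1 @ v1, v1 @ v],
                 [v @ v1,  v @ v]])          # Gram matrix of (v1, v)

dist = np.sqrt(np.linalg.det(G_n1) / np.linalg.det(G_n))
print(dist)                                  # 4.0: distance from (3, 4) to the x-axis
```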
Consider the moment problem: given $(t_1, \dots, t_n) \in \mathbb{R}^n$, find a vector $v$ such that $\langle v, v_i \rangle = t_i$ for all $1 \leq i \leq n$. Assuming the $v_i$ are linearly independent (so that $G$ is invertible), there exists a unique solution with minimal norm, namely $v = \sum_j x_j v_j$ with $x = G^{-1} t$.[4]: 38 

The Gram determinant can also be expressed in terms of the exterior product of vectors by
$$\bigl|G(v_1, \dots, v_n)\bigr| = \| v_1 \wedge \cdots \wedge v_n \|^2.$$
The Gram determinant therefore supplies an inner product for the space $\textstyle\bigwedge^n(\mathbb{R}^m)$. If an orthonormal basis $e_i$, $i = 1, 2, \dots, m$ on $\mathbb{R}^m$ is given, the vectors
$$e_{i_1} \wedge \cdots \wedge e_{i_n}, \qquad i_1 < i_2 < \cdots < i_n,$$
will constitute an orthonormal basis of $n$-dimensional volumes on the space $\textstyle\bigwedge^n(\mathbb{R}^m)$. Then the Gram determinant $\bigl|G(v_1, \dots, v_n)\bigr|$ amounts to an $n$-dimensional Pythagorean theorem for the volume of the parallelotope formed by the vectors $v_1, \dots, v_n$ in terms of its projections onto the basis volumes.
When the vectors $v_1, \dots, v_n \in \mathbb{R}^m$ are defined from the positions of points $p_1, \dots, p_n$ relative to some reference point $p_0$,
$$(v_1, v_2, \dots, v_n) = (p_1 - p_0, p_2 - p_0, \dots, p_n - p_0),$$
then the Gram determinant can be written as the difference of two Gram determinants,
$$\bigl|G(v_1, \dots, v_n)\bigr| = \bigl|G((p_0, 1), (p_1, 1), \dots, (p_n, 1))\bigr| - \bigl|G(p_0, p_1, \dots, p_n)\bigr|,$$
where each $(p_j, 1)$ is the corresponding point supplemented with the coordinate value of 1 for an $(m+1)$-st dimension.[citation needed] Note that in the common case that $n = m$, the second term on the right-hand side will be zero.
Given a set of linearly independent vectors $\{v_i\}$ with Gram matrix $G$ defined by $G_{ij} = \langle v_i, v_j \rangle$, one can construct an orthonormal basis
$$u_i := \sum_j \bigl(G^{-1/2}\bigr)_{ji} v_j.$$
In matrix notation, $U = V G^{-1/2}$, where $U$ has orthonormal basis vectors $\{u_i\}$ as columns and the matrix $V$ is composed of the given column vectors $\{v_i\}$.
The matrix $G^{-1/2}$ is guaranteed to exist. Indeed, $G$ is Hermitian, and so can be decomposed as $G = W D W^\dagger$ with $W$ a unitary matrix and $D$ a real diagonal matrix. Additionally, the $v_i$ are linearly independent if and only if $G$ is positive definite, which implies that the diagonal entries of $D$ are positive. $G^{-1/2}$ is therefore uniquely defined by $G^{-1/2} = W D^{-1/2} W^\dagger$. One can check that these new vectors are orthonormal:
$$\langle u_i, u_j \rangle = \sum_k \sum_\ell \overline{\bigl(G^{-1/2}\bigr)_{ki}} \,\langle v_k, v_\ell \rangle \,\bigl(G^{-1/2}\bigr)_{\ell j} = \bigl(G^{-1/2}\, G\, G^{-1/2}\bigr)_{ij} = \delta_{ij},$$
using that $G^{-1/2}$ is Hermitian, so that $\overline{\bigl(G^{-1/2}\bigr)_{ki}} = \bigl(G^{-1/2}\bigr)_{ik}$.
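This construction (often called symmetric or Löwdin orthogonalization) is straightforward to carry out numerically via the eigendecomposition of $G$, as in the following sketch with arbitrary random input vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.standard_normal((4, 3))      # columns: linearly independent vectors (generic)

G = V.T @ V                          # Gram matrix, symmetric positive definite
d, W = np.linalg.eigh(G)             # G = W diag(d) W^T with d > 0
G_inv_sqrt = W @ np.diag(1.0 / np.sqrt(d)) @ W.T

U = V @ G_inv_sqrt                   # columns are the orthonormalized vectors
# Check orthonormality: U^T U = G^{-1/2} G G^{-1/2} = I
print(np.allclose(U.T @ U, np.eye(3)))
```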