
Gram matrix

From Wikipedia, the free encyclopedia
Matrix of inner products of vectors

In linear algebra, the Gram matrix (or Gramian matrix, Gramian) of vectors {\displaystyle v_{1},\dots ,v_{n}} in an inner product space is the Hermitian matrix of inner products, whose entries are given by the inner product {\displaystyle G_{ij}=\left\langle v_{i},v_{j}\right\rangle }.[1] If the vectors {\displaystyle v_{1},\dots ,v_{n}} are the columns of a matrix {\displaystyle X}, then the Gram matrix is {\displaystyle X^{\dagger }X} in the general case that the vector coordinates are complex numbers, which simplifies to {\displaystyle X^{\top }X} when the vector coordinates are real numbers.

An important application is to test linear independence: a set of vectors is linearly independent if and only if the Gram determinant (the determinant of the Gram matrix) is non-zero.

It is named after Jørgen Pedersen Gram.

Examples


For finite-dimensional real vectors in {\displaystyle \mathbb {R} ^{n}} with the usual Euclidean dot product, the Gram matrix is {\displaystyle G=V^{\top }V}, where {\displaystyle V} is a matrix whose columns are the vectors {\displaystyle v_{k}} and {\displaystyle V^{\top }} is its transpose, whose rows are the vectors {\displaystyle v_{k}^{\top }}. For complex vectors in {\displaystyle \mathbb {C} ^{n}}, {\displaystyle G=V^{\dagger }V}, where {\displaystyle V^{\dagger }} is the conjugate transpose of {\displaystyle V}.
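The two formulas can be illustrated with a short NumPy sketch (the matrices and their entries below are arbitrary examples, not taken from the article):

```python
import numpy as np

# Real case: columns of V are the vectors v_1, v_2 in R^3.
V = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])
G = V.T @ V                       # G[i, j] = <v_i, v_j> = v_i . v_j

# Complex case: use the conjugate transpose, G = V^dagger V.
W = np.array([[1.0 + 1.0j, 2.0],
              [0.0,        1.0j]])
G_c = W.conj().T @ W              # Hermitian: G_c equals its conjugate transpose
```

Here `G` works out to [[2, 2], [2, 5]], the pairwise dot products of the two columns.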

Given square-integrable functions {\displaystyle \{\ell _{i}(\cdot ),\,i=1,\dots ,n\}} on the interval {\displaystyle \left[t_{0},t_{f}\right]}, the Gram matrix {\displaystyle G=\left[G_{ij}\right]} is:

{\displaystyle G_{ij}=\int _{t_{0}}^{t_{f}}\ell _{i}^{*}(\tau )\ell _{j}(\tau )\,d\tau .}

where {\displaystyle \ell _{i}^{*}(\tau )} is the complex conjugate of {\displaystyle \ell _{i}(\tau )}.
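Numerically, the integrals can be approximated on a grid. The sketch below (an illustrative choice, not from the article) uses a Riemann sum for the Gram matrix of sin and cos on a full period, where the off-diagonal entries vanish by orthogonality:

```python
import numpy as np

# Sample l_1 = sin and l_2 = cos on [0, 2*pi) with a uniform grid.
N = 10_000
t = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
dt = t[1] - t[0]
funcs = [np.sin(t), np.cos(t)]

n = len(funcs)
G = np.empty((n, n))
for i in range(n):
    for j in range(n):
        # G_ij ~ integral of conj(l_i(tau)) * l_j(tau) d tau
        G[i, j] = np.sum(np.conj(funcs[i]) * funcs[j]) * dt

# sin and cos are orthogonal over a full period, so G is approximately
# diag(pi, pi).
```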

For any bilinear form {\displaystyle B} on a finite-dimensional vector space over any field we can define a Gram matrix {\displaystyle G} attached to a set of vectors {\displaystyle v_{1},\dots ,v_{n}} by {\displaystyle G_{ij}=B\left(v_{i},v_{j}\right)}. The matrix will be symmetric if the bilinear form {\displaystyle B} is symmetric.


Properties


Positive-semidefiniteness


The Gram matrix is symmetric in the case the inner product is real-valued; it is Hermitian in the general, complex case by definition of an inner product.

The Gram matrix is positive semidefinite, and every positive semidefinite matrix is the Gram matrix for some set of vectors. That the Gram matrix is positive semidefinite can be seen from the following simple derivation:

{\displaystyle x^{\dagger }\mathbf {G} x=\sum _{i,j}x_{i}^{*}x_{j}\left\langle v_{i},v_{j}\right\rangle =\sum _{i,j}\left\langle x_{i}v_{i},x_{j}v_{j}\right\rangle ={\biggl \langle }\sum _{i}x_{i}v_{i},\sum _{j}x_{j}v_{j}{\biggr \rangle }={\biggl \|}\sum _{i}x_{i}v_{i}{\biggr \|}^{2}\geq 0.}

The first equality follows from the definition of matrix multiplication, the second and third from the sesquilinearity of the inner product (bilinearity in the real case), and the last from its positive definiteness. Note that this also shows that the Gram matrix is positive definite if and only if the vectors {\displaystyle v_{i}} are linearly independent (that is, {\textstyle \sum _{i}x_{i}v_{i}\neq 0} for all nonzero {\displaystyle x}).[1]
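The identity {\displaystyle x^{\dagger }\mathbf {G} x=\|\sum _{i}x_{i}v_{i}\|^{2}} can be checked numerically; the following sketch uses random real data (an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((4, 3))      # columns are three vectors in R^4
G = V.T @ V                          # their Gram matrix

for _ in range(100):
    x = rng.standard_normal(3)
    quad = x @ G @ x
    # x^T G x equals the squared norm of the combination sum_i x_i v_i = V @ x
    assert np.isclose(quad, np.linalg.norm(V @ x) ** 2)
    assert quad > -1e-9              # nonnegative up to rounding

# Equivalently, every eigenvalue of a Gram matrix is nonnegative.
assert np.all(np.linalg.eigvalsh(G) >= -1e-9)
```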

Finding a vector realization

See also: Positive definite matrix § Decomposition

Given any positive semidefinite matrix {\displaystyle M}, one can decompose it as:

{\displaystyle M=B^{\dagger }B},

where {\displaystyle B^{\dagger }} is the conjugate transpose of {\displaystyle B} (or {\displaystyle M=B^{\textsf {T}}B} in the real case).

Here {\displaystyle B} is a {\displaystyle k\times n} matrix, where {\displaystyle k} is the rank of {\displaystyle M}. Various ways to obtain such a decomposition include computing the Cholesky decomposition or taking the non-negative square root of {\displaystyle M}.
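Both routes can be sketched in NumPy; the 2×2 matrix below is an illustrative positive definite example. Note that `np.linalg.cholesky` requires strict positive definiteness, while the square-root route via the eigendecomposition also covers the semidefinite case:

```python
import numpy as np

M = np.array([[4.0, 2.0],
              [2.0, 3.0]])          # example positive definite matrix

# Cholesky: M = L L^T with L lower triangular, so B = L^T gives M = B^T B.
L = np.linalg.cholesky(M)
B = L.T

# Non-negative square root: from M = Q diag(w) Q^T, the symmetric matrix
# B2 = Q diag(sqrt(w)) Q^T satisfies B2 @ B2 = M, so B2 also works.
w, Q = np.linalg.eigh(M)
B2 = Q @ np.diag(np.sqrt(w)) @ Q.T
```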

The columns {\displaystyle b^{(1)},\dots ,b^{(n)}} of {\displaystyle B} can be seen as n vectors in {\displaystyle \mathbb {C} ^{k}} (or k-dimensional Euclidean space {\displaystyle \mathbb {R} ^{k}}, in the real case). Then

{\displaystyle M_{ij}=b^{(i)}\cdot b^{(j)}}

where the dot product {\textstyle a\cdot b=\sum _{\ell =1}^{k}a_{\ell }^{*}b_{\ell }} is the usual inner product on {\displaystyle \mathbb {C} ^{k}}.

Thus a Hermitian matrix {\displaystyle M} is positive semidefinite if and only if it is the Gram matrix of some vectors {\displaystyle b^{(1)},\dots ,b^{(n)}}. Such vectors are called a vector realization of {\displaystyle M}. The infinite-dimensional analog of this statement is Mercer's theorem.

Uniqueness of vector realizations


If {\displaystyle M} is the Gram matrix of vectors {\displaystyle v_{1},\dots ,v_{n}} in {\displaystyle \mathbb {R} ^{k}} then applying any rotation or reflection of {\displaystyle \mathbb {R} ^{k}} (any orthogonal transformation, that is, any Euclidean isometry preserving 0) to the sequence of vectors results in the same Gram matrix. That is, for any {\displaystyle k\times k} orthogonal matrix {\displaystyle Q}, the Gram matrix of {\displaystyle Qv_{1},\dots ,Qv_{n}} is also {\displaystyle M}.

This is the only way in which two real vector realizations of {\displaystyle M} can differ: the vectors {\displaystyle v_{1},\dots ,v_{n}} are unique up to orthogonal transformations. In other words, the dot products {\displaystyle v_{i}\cdot v_{j}} and {\displaystyle w_{i}\cdot w_{j}} are equal if and only if some rigid transformation of {\displaystyle \mathbb {R} ^{k}} transforms the vectors {\displaystyle v_{1},\dots ,v_{n}} to {\displaystyle w_{1},\dots ,w_{n}} and 0 to 0.

The same holds in the complex case, with unitary transformations in place of orthogonal ones. That is, if the Gram matrix of vectors {\displaystyle v_{1},\dots ,v_{n}} is equal to the Gram matrix of vectors {\displaystyle w_{1},\dots ,w_{n}} in {\displaystyle \mathbb {C} ^{k}} then there is a unitary {\displaystyle k\times k} matrix {\displaystyle U} (meaning {\displaystyle U^{\dagger }U=I}) such that {\displaystyle v_{i}=Uw_{i}} for {\displaystyle i=1,\dots ,n}.[3]
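The invariance direction is easy to verify numerically, since {\displaystyle (QV)^{\top }(QV)=V^{\top }Q^{\top }QV=V^{\top }V}. A minimal real-case sketch with random data (the unitary complex case is analogous):

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.standard_normal((3, 5))      # columns: five vectors in R^3
G = V.T @ V

# An orthogonal Q, here obtained from a QR factorization of a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Rotating/reflecting every vector leaves the Gram matrix unchanged.
G_rotated = (Q @ V).T @ (Q @ V)
```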


Gram determinant


The Gram determinant or Gramian is the determinant of the Gram matrix:

{\displaystyle {\bigl |}G(v_{1},\dots ,v_{n}){\bigr |}={\begin{vmatrix}\langle v_{1},v_{1}\rangle &\langle v_{1},v_{2}\rangle &\dots &\langle v_{1},v_{n}\rangle \\\langle v_{2},v_{1}\rangle &\langle v_{2},v_{2}\rangle &\dots &\langle v_{2},v_{n}\rangle \\\vdots &\vdots &\ddots &\vdots \\\langle v_{n},v_{1}\rangle &\langle v_{n},v_{2}\rangle &\dots &\langle v_{n},v_{n}\rangle \end{vmatrix}}.}

If {\displaystyle v_{1},\dots ,v_{n}} are vectors in {\displaystyle \mathbb {R} ^{m}} then it is the square of the n-dimensional volume of the parallelotope formed by the vectors. In particular, the vectors are linearly independent if and only if the parallelotope has nonzero n-dimensional volume, if and only if the Gram determinant is nonzero, if and only if the Gram matrix is nonsingular. When n > m the determinant and volume are zero. When n = m, this reduces to the standard theorem that the absolute value of the determinant of n n-dimensional vectors is the n-dimensional volume. The volume of the simplex formed by the vectors is Volume(parallelotope) / n!.
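A small numerical illustration (the vectors are an arbitrary example): for two vectors in {\displaystyle \mathbb {R} ^{3}}, the Gram determinant equals the squared area of the parallelogram they span, and a dependent set gives determinant zero.

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 2.0, 0.0])
V = np.column_stack([v1, v2])

# Gram determinant = (area of the parallelogram)^2 = |v1 x v2|^2.
gram_det = np.linalg.det(V.T @ V)
area_sq = np.linalg.norm(np.cross(v1, v2)) ** 2

# Linearly dependent vectors give Gram determinant zero.
W = np.column_stack([v1, 2.0 * v1])
dep_det = np.linalg.det(W.T @ W)
```

Here the parallelogram has area 2, so `gram_det` is 4.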

When {\displaystyle v_{1},\dots ,v_{n}} are linearly independent, the distance between a point {\displaystyle x} and the linear span of {\displaystyle v_{1},\dots ,v_{n}} is {\displaystyle {\sqrt {\frac {|G(x,v_{1},\dots ,v_{n})|}{|G(v_{1},\dots ,v_{n})|}}}}.
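The distance formula can be cross-checked against an ordinary least-squares projection; the vectors below are an illustrative example (the span is the xy-plane, so the distance of {\displaystyle x} from it is its z-component):

```python
import numpy as np

def gram_det(*vecs):
    """Gram determinant of the given vectors."""
    V = np.column_stack(vecs)
    return np.linalg.det(V.T @ V)

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
x  = np.array([1.0, 2.0, 3.0])

# Distance from x to span{v1, v2} via the ratio of Gram determinants.
dist = np.sqrt(gram_det(x, v1, v2) / gram_det(v1, v2))

# Cross-check: residual norm of the least-squares projection onto the span.
V = np.column_stack([v1, v2])
coeffs, *_ = np.linalg.lstsq(V, x, rcond=None)
residual = np.linalg.norm(x - V @ coeffs)
```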

Consider the moment problem: given {\displaystyle c_{1},\dots ,c_{n}\in \mathbb {C} }, find a vector {\displaystyle v} such that {\textstyle \left\langle v,v_{i}\right\rangle =c_{i}} for all {\textstyle 1\leqslant i\leqslant n}. There exists a unique solution with minimal norm:[4]: 38

{\displaystyle v=-{\frac {1}{G\left(v_{1},v_{2},\ldots ,v_{n}\right)}}\det {\begin{bmatrix}0&c_{1}&c_{2}&\cdots &c_{n}\\v_{1}&\left\langle v_{1},v_{1}\right\rangle &\left\langle v_{1},v_{2}\right\rangle &\cdots &\left\langle v_{1},v_{n}\right\rangle \\v_{2}&\left\langle v_{2},v_{1}\right\rangle &\left\langle v_{2},v_{2}\right\rangle &\cdots &\left\langle v_{2},v_{n}\right\rangle \\\vdots &\vdots &\vdots &\ddots &\vdots \\v_{n}&\left\langle v_{n},v_{1}\right\rangle &\left\langle v_{n},v_{2}\right\rangle &\cdots &\left\langle v_{n},v_{n}\right\rangle \end{bmatrix}}}

The Gram determinant can also be expressed in terms of the exterior product of vectors by

{\displaystyle {\bigl |}G(v_{1},\dots ,v_{n}){\bigr |}=\|v_{1}\wedge \cdots \wedge v_{n}\|^{2}.}

The Gram determinant therefore supplies an inner product for the space {\displaystyle {\textstyle \bigwedge }^{\!n}(V)}. If an orthonormal basis e_i, i = 1, 2, ..., n on {\displaystyle V} is given, the vectors

{\displaystyle e_{i_{1}}\wedge \cdots \wedge e_{i_{n}},\quad i_{1}<\cdots <i_{n},}

will constitute an orthonormal basis of n-dimensional volumes on the space {\displaystyle {\textstyle \bigwedge }^{\!n}(V)}. Then the Gram determinant {\displaystyle {\bigl |}G(v_{1},\dots ,v_{n}){\bigr |}} amounts to an n-dimensional Pythagorean theorem for the volume of the parallelotope {\displaystyle v_{1}\wedge \cdots \wedge v_{n}} in terms of its projections onto the basis volumes {\displaystyle e_{i_{1}}\wedge \cdots \wedge e_{i_{n}}}.

When the vectors {\displaystyle v_{1},\ldots ,v_{n}\in \mathbb {R} ^{m}} are defined from the positions of points {\displaystyle p_{1},\ldots ,p_{n}} relative to some reference point {\displaystyle p_{n+1}},

{\displaystyle (v_{1},v_{2},\ldots ,v_{n})=(p_{1}-p_{n+1},p_{2}-p_{n+1},\ldots ,p_{n}-p_{n+1})\,,}

then the Gram determinant can be written as the difference of two Gram determinants,

{\displaystyle {\bigl |}G(v_{1},\dots ,v_{n}){\bigr |}={\bigl |}G((p_{1},1),\dots ,(p_{n+1},1)){\bigr |}-{\bigl |}G(p_{1},\dots ,p_{n+1}){\bigr |}\,,}

where each {\displaystyle (p_{j},1)} is the corresponding point {\displaystyle p_{j}} supplemented with the coordinate value 1 for an {\displaystyle (m+1)}-st dimension.[citation needed] Note that in the common case that n = m, the second term on the right-hand side will be zero.
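Since the identity above is stated without citation, a quick numerical check is reassuring; the sketch below verifies it for randomly chosen points (with m > n, so that the second Gram determinant is nonzero):

```python
import numpy as np

def gram_det(P):
    """Gram determinant of the columns of P."""
    return np.linalg.det(P.T @ P)

rng = np.random.default_rng(2)
m, n = 4, 2
points = rng.standard_normal((m, n + 1))     # columns p_1, ..., p_{n+1} in R^m

# Left-hand side: Gram determinant of v_j = p_j - p_{n+1}.
V = points[:, :n] - points[:, [n]]
lhs = gram_det(V)

# Right-hand side: lift each point with an extra coordinate equal to 1,
# then take the difference of the two Gram determinants.
lifted = np.vstack([points, np.ones((1, n + 1))])
rhs = gram_det(lifted) - gram_det(points)
```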

Constructing an orthonormal basis


Given a set of linearly independent vectors {\displaystyle \{v_{i}\}} with Gram matrix {\displaystyle G} defined by {\displaystyle G_{ij}:=\langle v_{i},v_{j}\rangle }, one can construct an orthonormal basis

{\displaystyle u_{i}:=\sum _{j}{\bigl (}G^{-1/2}{\bigr )}_{ji}v_{j}.}

In matrix notation, {\displaystyle U=VG^{-1/2}}, where {\displaystyle U} has the orthonormal basis vectors {\displaystyle \{u_{i}\}} as columns and the matrix {\displaystyle V} is composed of the given column vectors {\displaystyle \{v_{i}\}}.

The matrix {\displaystyle G^{-1/2}} is guaranteed to exist. Indeed, {\displaystyle G} is Hermitian, and so can be decomposed as {\displaystyle G=UDU^{\dagger }} with {\displaystyle U} a unitary matrix and {\displaystyle D} a real diagonal matrix. Additionally, the {\displaystyle v_{i}} are linearly independent if and only if {\displaystyle G} is positive definite, which implies that the diagonal entries of {\displaystyle D} are positive. {\displaystyle G^{-1/2}} is therefore uniquely defined by {\displaystyle G^{-1/2}:=UD^{-1/2}U^{\dagger }}. One can check that these new vectors are orthonormal:

{\displaystyle {\begin{aligned}\langle u_{i},u_{j}\rangle &=\sum _{i'}\sum _{j'}{\Bigl \langle }{\bigl (}G^{-1/2}{\bigr )}_{i'i}v_{i'},{\bigl (}G^{-1/2}{\bigr )}_{j'j}v_{j'}{\Bigr \rangle }\\[10mu]&=\sum _{i'}\sum _{j'}{\bigl (}G^{-1/2}{\bigr )}_{ii'}G_{i'j'}{\bigl (}G^{-1/2}{\bigr )}_{j'j}\\[8mu]&={\bigl (}G^{-1/2}GG^{-1/2}{\bigr )}_{ij}=\delta _{ij}\end{aligned}}}

where we used {\displaystyle {\bigl (}G^{-1/2}{\bigr )}^{\dagger }=G^{-1/2}}.
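This construction (sometimes called symmetric, or Löwdin, orthogonalization) is straightforward to sketch in NumPy for the real case, computing {\displaystyle G^{-1/2}} from the eigendecomposition as in the text (the input vectors are a random illustrative example):

```python
import numpy as np

rng = np.random.default_rng(3)
V = rng.standard_normal((5, 3))       # columns: linearly independent v_i

G = V.T @ V                           # Gram matrix, positive definite

# G^{-1/2} = Q diag(w^{-1/2}) Q^T from the eigendecomposition G = Q diag(w) Q^T.
w, Q = np.linalg.eigh(G)
G_inv_sqrt = Q @ np.diag(w ** -0.5) @ Q.T

U = V @ G_inv_sqrt                    # columns u_i are orthonormal
```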


References

  1. ^ Horn & Johnson 2013, p. 441, Theorem 7.2.10.
  2. ^ Lanckriet, G. R. G.; Cristianini, N.; Bartlett, P.; Ghaoui, L. E.; Jordan, M. I. (2004). "Learning the kernel matrix with semidefinite programming". Journal of Machine Learning Research. 5: 27–72 [p. 29].
  3. ^ Horn & Johnson 2013, p. 452, Theorem 7.3.11.
  4. ^ Garcia, Stephan Ramon; Mashreghi, Javad; Ross, William T. (2023). Operator Theory by Example. Oxford University Press. doi:10.1093/oso/9780192863867.001.0001.

Retrieved from "https://en.wikipedia.org/w/index.php?title=Gram_matrix&oldid=1311913677"