Definiteness of Gram matrix
Definition 1 Definiteness of a matrix
Let A be an n \times n real symmetric matrix, and let x \in \mathbb{R}^n be a column vector.
- A is positive (negative) definite if x^TAx > 0 \ (x^TAx < 0),\ \forall x \neq 0.
- A is positive (negative) semidefinite if x^TAx \geq 0 \ (x^TAx \leq 0),\ \forall x \in \mathbb{R}^n, and x^TAx = 0 for at least one x \neq 0.
- A is indefinite if x^TAx takes both positive and negative values (for different nonzero x's).
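For a real symmetric matrix, these quadratic-form conditions are equivalent to sign conditions on the eigenvalues, which gives a practical test. Below is a minimal sketch in NumPy (not from the source); the helper name `classify_definiteness` and the tolerance `tol` are illustrative choices, and the classification follows this post's convention that "semidefinite" excludes the definite case.

```python
import numpy as np

def classify_definiteness(A, tol=1e-10):
    """Classify a real symmetric matrix by the signs of its eigenvalues."""
    eigvals = np.linalg.eigvalsh(A)  # eigenvalues of a symmetric matrix, ascending
    if np.all(eigvals > tol):
        return "positive definite"
    if np.all(eigvals < -tol):
        return "negative definite"
    if np.all(eigvals >= -tol):
        return "positive semidefinite"   # some eigenvalue is (numerically) zero
    if np.all(eigvals <= tol):
        return "negative semidefinite"
    return "indefinite"                  # both signs occur

print(classify_definiteness(np.array([[2.0, 0.0], [0.0, 1.0]])))   # positive definite
print(classify_definiteness(np.array([[1.0, 1.0], [1.0, 1.0]])))   # positive semidefinite
print(classify_definiteness(np.array([[1.0, 0.0], [0.0, -1.0]])))  # indefinite
```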
Definition 2 Gram matrix
The Gram matrix of a set of vectors v_1, \dots, v_n in an inner product space is the Hermitian matrix of inner products, whose entries are given by (1).
\begin{align} G_{ij} = \langle v_i, v_j \rangle \tag{1} \end{align}
For finite-dimensional real vectors in \mathbb{R}^n with the usual Euclidean dot product, the Gram matrix is simply G = V^TV, where the i-th column of V is v_i.
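As a concrete illustration (a sketch assuming NumPy; the vectors below are arbitrary example data), the Gram matrix of the columns of V can be computed as G = V^TV and checked entry by entry against (1):

```python
import numpy as np

# Columns of V are the vectors v_1, v_2, v_3 in R^4 (arbitrary example data).
V = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 1.0, 1.0]])

G = V.T @ V  # G[i, j] = <v_i, v_j> under the Euclidean dot product

# Entry-wise check against the defining inner products in (1).
for i in range(V.shape[1]):
    for j in range(V.shape[1]):
        assert np.isclose(G[i, j], V[:, i] @ V[:, j])
print(G)
```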
Theorem 1 A symmetric matrix G is a Gram matrix if and only if it is positive semidefinite or positive definite.
Proof
(1) A Gram matrix G = V^TV is positive semidefinite or positive definite.
x^TGx = x^TV^TVx = (Vx)^T(Vx) = \| Vx \|^2 \geq 0, \ \forall x \in \mathbb{R}^n
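A quick numerical sanity check of this identity, as a sketch assuming NumPy with arbitrary random data:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((5, 3))  # arbitrary real matrix; G = V^T V is its Gram matrix
G = V.T @ V

for _ in range(1000):
    x = rng.standard_normal(3)
    # x^T G x equals ||Vx||^2, hence is nonnegative for every x
    assert np.isclose(x @ G @ x, np.linalg.norm(V @ x) ** 2)
    assert x @ G @ x >= 0
```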
(2) A positive semidefinite or positive definite matrix is a Gram matrix.
By the Spectral Theorem (G is symmetric, hence Hermitian), a positive semidefinite or positive definite n \times n matrix G with rank r \leq n can be decomposed as (2)-(a), where T = \mathrm{diag}(\lambda_1, \dots, \lambda_n) is the diagonal matrix containing the eigenvalues on its main diagonal, and U = (u_1, \dots, u_n) is the orthogonal matrix containing the corresponding eigenvectors of G in its columns, in the order of the eigenvalues. Since G is positive semidefinite, every \lambda_i \geq 0, so \sqrt{T} = \mathrm{diag}(\sqrt{\lambda_1}, \dots, \sqrt{\lambda_n}) is real and well defined.
\begin{align} G \stackrel{(a)}{=} UTU^H = U \sqrt{T} \sqrt{T} U^H = (\sqrt{T} U^H)^H (\sqrt{T} U^H) \tag{2} \end{align}
Hence G = V^HV with V = \sqrt{T} U^H, which for real G reduces to G = V^TV with V = \sqrt{T} U^T, so G is the Gram matrix of the columns of V.
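The construction in (2) can be replayed numerically: a minimal sketch assuming NumPy, using `np.linalg.eigh` for the eigendecomposition of the symmetric matrix G and clipping tiny negative eigenvalues that arise from rounding:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
G = B.T @ B  # a positive semidefinite (here, almost surely positive definite) matrix

eigvals, U = np.linalg.eigh(G)          # G = U T U^T with T = diag(eigvals)
eigvals = np.clip(eigvals, 0.0, None)   # PSD => eigenvalues >= 0; clip rounding noise
V = np.diag(np.sqrt(eigvals)) @ U.T     # V = sqrt(T) U^T

assert np.allclose(V.T @ V, G)          # G is the Gram matrix of the columns of V
```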
Theorem 2 A Gram matrix G = V^TV is positive definite if and only if the column vectors of V are linearly independent (equivalently, \ker V = \{0\}).
Indeed, by the computation above, x^TGx = \| Vx \|^2 vanishes exactly when Vx = 0; if the columns of V are linearly independent this forces x = 0, so x^TGx > 0 for all x \neq 0, and conversely a nonzero x \in \ker V gives x^TGx = 0.
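The contrast with Theorem 1 can be seen on a small example (a sketch assuming NumPy; the matrix below is illustrative, with a deliberately dependent third column):

```python
import numpy as np

# Linearly dependent columns: v_3 = v_1 + v_2
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
G = V.T @ V

x = np.array([1.0, 1.0, -1.0])   # a nonzero vector in ker V
print(V @ x)                     # [0. 0. 0.]
print(x @ G @ x)                 # 0.0 -> G is semidefinite, not positive definite
print(np.linalg.matrix_rank(G))  # 2 < 3, so G is singular
```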