In mathematics, a matrix polynomial is a polynomial with square matrices as variables. Given an ordinary, scalar-valued polynomial
:P(x) = \sum_{i=0}^n a_i x^i = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n,
this polynomial evaluated at a matrix ''A'' is
:P(A) = \sum_{i=0}^n a_i A^i = a_0 I + a_1 A + a_2 A^2 + \cdots + a_n A^n,
where ''I'' is the identity matrix.

A matrix polynomial equation is an equality between two matrix polynomials, which holds for the specific matrices in question. A matrix polynomial identity is a matrix polynomial equation which holds for all matrices ''A'' in a specified matrix ring ''M_n''(''R'').
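
As a concrete illustration, the following sketch evaluates a scalar polynomial at a square matrix using Horner's scheme. It assumes NumPy is available; the helper name matrix_polynomial and the example matrix and coefficients are purely illustrative.

import numpy as np

def matrix_polynomial(coeffs, A):
    """Evaluate a_0 I + a_1 A + ... + a_n A^n for a square matrix A (Horner form)."""
    size = A.shape[0]
    result = np.zeros((size, size))
    for a_k in reversed(coeffs):           # Horner: ((a_n A + a_{n-1} I) A + ...) + a_0 I
        result = result @ A + a_k * np.eye(size)
    return result

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
coeffs = [1.0, 2.0, 3.0]                   # P(x) = 1 + 2x + 3x^2
print(matrix_polynomial(coeffs, A))        # equals I + 2A + 3(A @ A)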


Characteristic and minimal polynomial

The characteristic polynomial of a matrix ''A'' is a scalar-valued polynomial, defined by p_A(t) = \det\left(tI - A\right). The Cayley–Hamilton theorem states that if this polynomial is viewed as a matrix polynomial and evaluated at the matrix ''A'' itself, the result is the zero matrix: p_A(A) = 0. The characteristic polynomial is thus a polynomial which annihilates ''A''.

There is a unique monic polynomial of minimal degree which annihilates ''A''; this polynomial is the minimal polynomial. Any polynomial which annihilates ''A'' (such as the characteristic polynomial) is a multiple of the minimal polynomial. It follows that, given two polynomials ''P'' and ''Q'', we have P(A) = Q(A) if and only if
:P^{(j)}(\lambda_i) = Q^{(j)}(\lambda_i) \qquad \text{for } j = 0,\ldots,n_i-1 \text{ and } i = 1,\ldots,s,
where P^{(j)} denotes the ''j''th derivative of ''P'' and \lambda_1, \dots, \lambda_s are the eigenvalues of ''A'' with corresponding indices n_1, \dots, n_s (the index of an eigenvalue is the size of its largest Jordan block).


Matrix geometrical series

Matrix polynomials can be used to sum a matrix geometrical series as one would an ordinary geometric series:
:S = I + A + A^2 + \cdots + A^n
:AS = A + A^2 + A^3 + \cdots + A^{n+1}
:(I-A)S = S - AS = I - A^{n+1}
:S = (I-A)^{-1}\left(I - A^{n+1}\right)
If ''I'' − ''A'' is nonsingular, one can evaluate this expression for the sum ''S''.
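
The telescoping identity above is easy to verify numerically. The following sketch (assuming NumPy, with an arbitrary 2×2 matrix chosen so that ''I'' − ''A'' is nonsingular) compares the direct sum of powers with the closed form.

import numpy as np

A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
n = 5
I = np.eye(2)

# Direct summation: S = I + A + A^2 + ... + A^n
S_direct = sum(np.linalg.matrix_power(A, k) for k in range(n + 1))

# Closed form from (I - A)S = I - A^{n+1}
S_closed = np.linalg.inv(I - A) @ (I - np.linalg.matrix_power(A, n + 1))

print(np.allclose(S_direct, S_closed))     # True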


See also

* Latimer–MacDuffee theorem
* Matrix exponential
* Matrix function

