In linear algebra, two matrices A and B are said to commute if AB=BA, or equivalently if their commutator [A,B]=AB-BA is zero. A set of matrices A_1, \ldots, A_k is said to commute if its members commute pairwise, meaning that every pair of matrices in the set commutes.
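The definition is easy to test directly: two matrices commute exactly when their commutator AB-BA is the zero matrix. Below is a minimal numerical sketch, assuming NumPy is available; the helper name commute and the example matrices are illustrative, not part of the article.

<pre>
import numpy as np

def commute(A, B, tol=1e-12):
    """Return True when the commutator AB - BA is (numerically) the zero matrix."""
    return np.allclose(A @ B - B @ A, 0.0, atol=tol)

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(commute(A, np.eye(2)))                            # True: every matrix commutes with the identity
print(commute(A, np.array([[0.0, 1.0], [0.0, 0.0]])))   # False: these two do not commute
</pre>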


Characterizations and properties

* Commuting matrices preserve each other's eigenspaces. As a consequence, commuting matrices over an algebraically closed field are simultaneously triangularizable; that is, there are bases over which they are all upper triangular. In other words, if A_1,\ldots,A_k commute, there exists an invertible matrix P such that P^{-1} A_i P is upper triangular for all i \in \{1, \ldots, k\}. The converse is not necessarily true, as the following counterexample of two non-commuting upper triangular matrices shows (it is also checked numerically in the sketch after this list):
*:\begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 3 \\ 0 & 3 \end{pmatrix} \ne \begin{pmatrix} 1 & 5 \\ 0 & 3 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix}.
: However, if the square of the commutator of two matrices is zero, that is, [A,B]^2 = 0, then the converse is true.
* Two diagonalizable matrices A and B commute (AB=BA) if they are simultaneously diagonalizable (that is, there exists an invertible matrix P such that both P^{-1} A P and P^{-1} B P are diagonal). The converse also holds, provided that one of the matrices has no multiple eigenvalues; see the numerical sketch after this list.
* If A and B commute, they have a common eigenvector. If A has distinct eigenvalues, and A and B commute, then the eigenvectors of A are also eigenvectors of B.
* If one of the matrices has the property that its minimal polynomial coincides with its characteristic polynomial (that is, it has the maximal degree), which happens in particular whenever the characteristic polynomial has only simple roots, then the other matrix can be written as a polynomial in the first.
* As a direct consequence of simultaneous triangularizability, the eigenvalues of two commuting complex matrices ''A'', ''B'' with their algebraic multiplicities (the multisets of roots of their characteristic polynomials) can be matched up as \alpha_i \leftrightarrow \beta_i in such a way that the multiset of eigenvalues of any polynomial P(A,B) in the two matrices is the multiset of the values P(\alpha_i,\beta_i). This theorem is due to Frobenius.
* Two Hermitian matrices commute if their eigenspaces coincide. In particular, two Hermitian matrices without multiple eigenvalues commute if they share the same set of eigenvectors. This follows by considering the eigenvalue decompositions of both matrices. Let A and B be two Hermitian matrices. A and B have common eigenspaces when they can be written as A = U \Lambda_1 U^\dagger and B = U \Lambda_2 U^\dagger. It then follows that
*: AB = U \Lambda_1 U^\dagger U \Lambda_2 U^\dagger = U \Lambda_1 \Lambda_2 U^\dagger = U \Lambda_2 \Lambda_1 U^\dagger = U \Lambda_2 U^\dagger U \Lambda_1 U^\dagger = BA.
* The property of two matrices commuting is not transitive: a matrix A may commute with both B and C while B and C do not commute with each other. As an example, the identity matrix commutes with all matrices, which between them do not all commute. If the set of matrices considered is restricted to Hermitian matrices without multiple eigenvalues, then commutativity is transitive, as a consequence of the characterization in terms of eigenvectors.
* Lie's theorem, which shows that any representation of a solvable Lie algebra is simultaneously upper triangularizable, may be viewed as a generalization.
* An ''n'' × ''n'' matrix A commutes with every other ''n'' × ''n'' matrix if and only if it is a scalar matrix, that is, a matrix of the form \lambda I, where I is the ''n'' × ''n'' identity matrix and \lambda is a scalar. In other words, the center of the group of invertible ''n'' × ''n'' matrices under multiplication is the subgroup of nonzero scalar matrices.
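Two of the characterizations above lend themselves to a quick numerical check: commuting symmetric (Hermitian) matrices can be diagonalized in a common basis, while simultaneously triangularizable matrices need not commute. The sketch below assumes NumPy; the matrices Q, D1, D2, T1, T2 are illustrative choices constructed for the demonstration, not taken from the text.

<pre>
import numpy as np

rng = np.random.default_rng(0)

# Random orthogonal matrix Q from the QR factorization of a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

D1 = np.diag([1.0, 2.0, 3.0, 4.0])   # distinct eigenvalues
D2 = np.diag([5.0, -1.0, 0.5, 2.0])

A = Q @ D1 @ Q.T    # symmetric, with eigenbasis Q
B = Q @ D2 @ Q.T    # symmetric, with the same eigenbasis

print(np.allclose(A @ B, B @ A))     # True: diagonal in the same basis, so they commute

# Because A has distinct eigenvalues, any orthonormal eigenbasis of A also diagonalizes B.
_, V = np.linalg.eigh(A)
M = V.T @ B @ V
print(np.allclose(M, np.diag(np.diag(M))))   # True: off-diagonal entries vanish

# The converse of simultaneous triangularizability fails: these two upper
# triangular matrices (the counterexample above) do not commute.
T1 = np.array([[1.0, 2.0], [0.0, 3.0]])
T2 = np.array([[1.0, 1.0], [0.0, 1.0]])
print(np.allclose(T1 @ T2, T2 @ T1))         # False
</pre>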


Examples

* The identity matrix commutes with all matrices.
* Jordan blocks commute with upper triangular matrices that have the same value along bands.
* If the product of two symmetric matrices is symmetric, then they must commute. That also means that every diagonal matrix commutes with all other diagonal matrices.
* Circulant matrices commute. They form a commutative ring, since the sum of two circulant matrices is circulant. (See the numerical sketch after this list.)
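The claim that circulant matrices commute is easy to verify numerically. The sketch below assumes NumPy and builds circulant matrices with a small hand-rolled helper; the helper name circulant and the example vectors are illustrative.

<pre>
import numpy as np

def circulant(first_row):
    """Build a circulant matrix whose rows are successive cyclic shifts of first_row."""
    first_row = np.asarray(first_row, dtype=float)
    return np.array([np.roll(first_row, k) for k in range(len(first_row))])

C1 = circulant([1.0, 2.0, 3.0, 4.0])
C2 = circulant([0.0, -1.0, 5.0, 2.0])
print(np.allclose(C1 @ C2, C2 @ C1))   # True: circulant matrices commute

# Diagonal matrices also commute with each other.
D1, D2 = np.diag([1.0, 2.0, 3.0]), np.diag([4.0, 5.0, 6.0])
print(np.allclose(D1 @ D2, D2 @ D1))   # True
</pre>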


History

The notion of commuting matrices was introduced by Cayley in his memoir on the theory of matrices, which also provided the first axiomatization of matrices. The first significant result proved about them was the above result of Frobenius in 1878.

