In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix}, while an example of a 3×3 diagonal matrix is \begin{bmatrix} 6 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}. An identity matrix of any size, or any multiple of it (a scalar matrix), is a diagonal matrix. A diagonal matrix is sometimes called a scaling matrix, since matrix multiplication with it results in changing scale (size). Its determinant is the product of its diagonal values.
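As a small illustrative sketch (not from the source; the names D and v are made up for the example), the 2×2 diagonal matrix above scales each coordinate of a vector by the corresponding diagonal entry, and its determinant is the product of those entries:

```python
import numpy as np

D = np.array([[3.0, 0.0],
              [0.0, 2.0]])        # the 2×2 diagonal matrix from the example above

v = np.array([1.0, 1.0])
print(D @ v)                      # [3. 2.]: each coordinate scaled by its diagonal entry
print(np.linalg.det(D))           # 6.0 = 3 * 2, the product of the diagonal values
```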


Definition

As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix D = (d_{i,j}) with ''n'' columns and ''n'' rows is diagonal if
:\forall i,j \in \{1, 2, \ldots, n\},\ i \ne j \implies d_{i,j} = 0.
However, the main diagonal entries are unrestricted.

The term ''diagonal matrix'' may sometimes refer to a ''rectangular diagonal matrix'', which is an ''m''-by-''n'' matrix with all the entries not of the form d_{i,i} being zero. For example:
:\begin{bmatrix} 1 & 0 & 0\\ 0 & 4 & 0\\ 0 & 0 & -3\\ 0 & 0 & 0 \end{bmatrix} or \begin{bmatrix} 1 & 0 & 0 & 0 & 0\\ 0 & 4 & 0 & 0 & 0\\ 0 & 0 & -3 & 0 & 0 \end{bmatrix}

More often, however, ''diagonal matrix'' refers to square matrices, which can be specified explicitly as ''square diagonal matrices''. A square diagonal matrix is a symmetric matrix, so it can also be called a ''symmetric diagonal matrix''. The following matrix is an example of a square diagonal matrix:
:\begin{bmatrix} 1 & 0 & 0\\ 0 & 4 & 0\\ 0 & 0 & -2 \end{bmatrix}
If the entries are real numbers or complex numbers, then it is a normal matrix as well.

In the remainder of this article we will consider only square diagonal matrices, and refer to them simply as "diagonal matrices".
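A minimal NumPy sketch of this definition (the helper name `is_diagonal` is my own, for illustration): it checks that every entry with i ≠ j is zero, so it also covers the rectangular case.

```python
import numpy as np

def is_diagonal(A: np.ndarray) -> bool:
    """Return True if every off-diagonal entry (i != j) of A is zero."""
    i, j = np.indices(A.shape)
    return bool(np.all(A[i != j] == 0))

print(is_diagonal(np.array([[1, 0, 0], [0, 4, 0], [0, 0, -2]])))              # True
print(is_diagonal(np.array([[1, 0, 0], [0, 4, 0], [0, 0, -3], [0, 0, 0]])))   # True (rectangular)
print(is_diagonal(np.array([[1, 2], [0, 3]])))                                # False
```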


Vector-to-matrix diag operator

A diagonal matrix D can be constructed from a vector \mathbf{a} = \begin{bmatrix} a_1 & \dotsm & a_n \end{bmatrix}^\textsf{T} using the \operatorname{diag} operator:
:D = \operatorname{diag}(a_1, \dots, a_n)
This may be written more compactly as D = \operatorname{diag}(\mathbf{a}). The same operator is also used to represent block diagonal matrices as A = \operatorname{diag}(A_1, \dots, A_n), where each argument A_i is a matrix. The \operatorname{diag} operator may be written as
:\operatorname{diag}(\mathbf{a}) = \left(\mathbf{a} \mathbf{1}^\textsf{T}\right) \circ I,
where \circ represents the Hadamard product and \mathbf{1} is a constant vector with elements 1.
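A short NumPy sketch of the vector-to-matrix direction (illustrative only): `np.diag` builds the diagonal matrix from a vector, and the Hadamard-product identity above can be checked directly. For the block diagonal case, `scipy.linalg.block_diag` plays the analogous role.

```python
import numpy as np

a = np.array([1.0, 4.0, -2.0])

# Vector-to-matrix: diag(a) is the square matrix with a on the main diagonal.
D = np.diag(a)

# Check the identity diag(a) = (a 1^T) ∘ I, with ∘ the Hadamard (entrywise) product.
ones = np.ones_like(a)
assert np.allclose(D, np.outer(a, ones) * np.eye(len(a)))
```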


Matrix-to-vector diag operator

The inverse matrix-to-vector \operatorname{diag} operator is sometimes denoted by the identically named
:\operatorname{diag}(D) = \begin{bmatrix} a_1 & \dotsm & a_n \end{bmatrix}^\textsf{T},
where the argument is now a matrix and the result is a vector of its diagonal entries. The following property holds:
:\operatorname{diag}(AB) = \sum_j \left(A \circ B^\textsf{T}\right)_{ij}
That is, the ''i''-th diagonal entry of AB is the ''i''-th row sum of the Hadamard product A \circ B^\textsf{T}.
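This property can be verified numerically with a short NumPy sketch (variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

lhs = np.diag(A @ B)            # matrix-to-vector diag: the diagonal of AB
rhs = (A * B.T).sum(axis=1)     # row sums of the Hadamard product A ∘ B^T

assert np.allclose(lhs, rhs)
```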


Scalar matrix

A diagonal matrix with equal diagonal entries is a scalar matrix; that is, a scalar multiple ''λ'' of the identity matrix. Its effect on a vector is scalar multiplication by ''λ''. For example, a 3×3 scalar matrix has the form:
:\begin{bmatrix} \lambda & 0 & 0 \\ 0 & \lambda & 0 \\ 0 & 0 & \lambda \end{bmatrix} \equiv \lambda \boldsymbol{I}_3

The scalar matrices are the center of the algebra of matrices: that is, they are precisely the matrices that commute with all other square matrices of the same size. By contrast, over a field (like the real numbers), a diagonal matrix with all diagonal elements distinct only commutes with diagonal matrices (its centralizer is the set of diagonal matrices). That is because if a diagonal matrix D = \operatorname{diag}(a_1, \dots, a_n) has a_i \neq a_j, then given a matrix M with m_{ij} \neq 0, the (i, j) entries of the two products are (DM)_{ij} = a_i m_{ij} and (MD)_{ij} = m_{ij} a_j, and a_i m_{ij} \neq m_{ij} a_j (since one can divide by m_{ij}), so D and M do not commute unless the off-diagonal entries of M are zero. Diagonal matrices where the diagonal entries are not all equal or all distinct have centralizers intermediate between the whole space and only the diagonal matrices.

For an abstract vector space ''V'' (rather than the concrete vector space K^n), the analog of scalar matrices are scalar transformations. This is true more generally for a module ''M'' over a ring ''R'', with the endomorphism algebra End(''M'') (algebra of linear operators on ''M'') replacing the algebra of matrices. Formally, scalar multiplication is a linear map, inducing a map R \to \operatorname{End}(M) (from a scalar ''λ'' to its corresponding scalar transformation, multiplication by ''λ'') exhibiting End(''M'') as an ''R''-algebra. For vector spaces, the scalar transforms are exactly the center of the endomorphism algebra, and, similarly, the invertible scalar transforms are the center of the general linear group GL(''V''). The former is true more generally for free modules M \cong R^n, for which the endomorphism algebra is isomorphic to a matrix algebra.
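A numerical sanity check of the commutation claims (a sketch, not from the source): a scalar matrix commutes with an arbitrary square matrix, while a diagonal matrix with distinct entries commutes with other diagonal matrices but generally not with a matrix that has nonzero off-diagonal entries.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
M = rng.standard_normal((n, n))       # generic matrix; off-diagonal entries nonzero almost surely

S = 2.5 * np.eye(n)                   # scalar matrix: lies in the center
assert np.allclose(S @ M, M @ S)

D = np.diag([1.0, 2.0, 3.0])          # distinct diagonal entries
E = np.diag([4.0, 5.0, 6.0])
assert np.allclose(D @ E, E @ D)      # commutes with other diagonal matrices
assert not np.allclose(D @ M, M @ D)  # but not with a generic M
```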


Vector operations

Multiplying a vector by a diagonal matrix multiplies each of the terms by the corresponding diagonal entry. Given a diagonal matrix D = \operatorname{diag}(a_1, \dots, a_n) and a vector \mathbf{x} = \begin{bmatrix} x_1 & \dotsm & x_n \end{bmatrix}^\textsf{T}, the product is:
:D\mathbf{x} = \operatorname{diag}(a_1, \dots, a_n) \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} a_1 \\ & \ddots \\ & & a_n \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} a_1 x_1 \\ \vdots \\ a_n x_n \end{bmatrix}.

This can be expressed more compactly by using a vector instead of a diagonal matrix, \mathbf{a} = \begin{bmatrix} a_1 & \dotsm & a_n \end{bmatrix}^\textsf{T}, and taking the Hadamard product of the vectors (entrywise product), denoted \mathbf{a} \circ \mathbf{x}:
:D\mathbf{x} = \mathbf{a} \circ \mathbf{x} = \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} \circ \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} a_1 x_1 \\ \vdots \\ a_n x_n \end{bmatrix}.

This is mathematically equivalent, but avoids storing all the zero terms of this sparse matrix. This product is thus used in machine learning, such as computing products of derivatives in backpropagation or multiplying IDF weights in TF-IDF, since some BLAS frameworks, which multiply matrices efficiently, do not include Hadamard product capability directly.
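A brief NumPy illustration of the equivalence (a sketch with made-up numbers): forming the dense diagonal matrix and multiplying gives the same result as the entrywise product, which never materializes the zeros.

```python
import numpy as np

a = np.array([2.0, -1.0, 0.5])     # diagonal entries
x = np.array([3.0,  4.0, 8.0])     # vector to be scaled

dense = np.diag(a) @ x             # explicit diagonal matrix times vector
hadamard = a * x                   # Hadamard (entrywise) product, no n×n matrix stored

assert np.allclose(dense, hadamard)   # both give [6., -4., 4.]
```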


Matrix operations

The operations of matrix addition and matrix multiplication are especially simple for diagonal matrices. Write \operatorname{diag}(a_1, \dots, a_n) for a diagonal matrix whose diagonal entries starting in the upper left corner are a_1, \dots, a_n. Then, for addition, we have
:\operatorname{diag}(a_1, \dots, a_n) + \operatorname{diag}(b_1, \dots, b_n) = \operatorname{diag}(a_1 + b_1, \dots, a_n + b_n)
and for matrix multiplication,
:\operatorname{diag}(a_1, \dots, a_n) \operatorname{diag}(b_1, \dots, b_n) = \operatorname{diag}(a_1 b_1, \dots, a_n b_n).

The diagonal matrix \operatorname{diag}(a_1, \dots, a_n) is invertible if and only if the entries a_1, \dots, a_n are all nonzero. In this case, we have
:\operatorname{diag}(a_1, \dots, a_n)^{-1} = \operatorname{diag}(a_1^{-1}, \dots, a_n^{-1}).
In particular, the diagonal matrices form a subring of the ring of all ''n''-by-''n'' matrices.

Multiplying an ''n''-by-''n'' matrix ''A'' from the ''left'' with \operatorname{diag}(a_1, \dots, a_n) amounts to multiplying the ''i''-th ''row'' of ''A'' by a_i for all ''i''; multiplying the matrix ''A'' from the ''right'' with \operatorname{diag}(a_1, \dots, a_n) amounts to multiplying the ''i''-th ''column'' of ''A'' by a_i for all ''i''.
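These rules can be checked with a short NumPy sketch (illustrative values):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
Da, Db = np.diag(a), np.diag(b)

# Addition and multiplication act entrywise on the diagonals.
assert np.allclose(Da + Db, np.diag(a + b))
assert np.allclose(Da @ Db, np.diag(a * b))

# All entries of a are nonzero, so Da is invertible, with reciprocal diagonal.
assert np.allclose(np.linalg.inv(Da), np.diag(1.0 / a))

# Left multiplication scales rows; right multiplication scales columns.
A = np.arange(9.0).reshape(3, 3)
assert np.allclose(Da @ A, a[:, None] * A)   # row i scaled by a_i
assert np.allclose(A @ Da, A * a[None, :])   # column i scaled by a_i
```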


Operator matrix in eigenbasis

As explained in determining coefficients of operator matrix, there is a special basis, \mathbf{e}_1, \dots, \mathbf{e}_n, for which the matrix A takes the diagonal form. Hence, in the defining equation A \mathbf{e}_j = \sum_i a_{i,j} \mathbf{e}_i, all coefficients a_{i,j} with i \ne j are zero, leaving only one term per sum. The surviving diagonal elements, a_{i,i}, are known as eigenvalues and designated with \lambda_i in the equation, which reduces to A \mathbf{e}_i = \lambda_i \mathbf{e}_i. The resulting equation is known as the eigenvalue equation and is used to derive the characteristic polynomial and, further, eigenvalues and eigenvectors.

In other words, the eigenvalues of \operatorname{diag}(\lambda_1, \dots, \lambda_n) are \lambda_1, \dots, \lambda_n, with associated eigenvectors \mathbf{e}_1, \dots, \mathbf{e}_n.
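As a quick numerical illustration (not from the source), the standard basis vectors are eigenvectors of a diagonal matrix, with the diagonal entries as eigenvalues:

```python
import numpy as np

lam = np.array([3.0, -1.0, 2.0])
D = np.diag(lam)

# Each standard basis vector e_i satisfies D e_i = λ_i e_i.
for i in range(len(lam)):
    e_i = np.eye(len(lam))[:, i]
    assert np.allclose(D @ e_i, lam[i] * e_i)

# np.linalg.eig recovers exactly the diagonal entries as eigenvalues.
eigenvalues, _ = np.linalg.eig(D)
assert np.allclose(np.sort(eigenvalues), np.sort(lam))
```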


Properties

* The determinant of \operatorname{diag}(a_1, \dots, a_n) is the product a_1 \cdots a_n (see the numerical check after this list).
* The adjugate of a diagonal matrix is again diagonal.
* Where all matrices are square:
** A matrix is diagonal if and only if it is triangular and normal.
** A matrix is diagonal if and only if it is both upper- and lower-triangular.
** A diagonal matrix is symmetric.
* The identity matrix ''I''''n'' and the zero matrix are diagonal.
* A 1×1 matrix is always diagonal.
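A small NumPy check of the determinant and triangularity/symmetry properties (illustrative values only):

```python
import numpy as np

D = np.diag([2.0, -3.0, 5.0])

# The determinant is the product of the diagonal entries.
assert np.isclose(np.linalg.det(D), 2.0 * -3.0 * 5.0)

# A diagonal matrix is both upper- and lower-triangular, and symmetric.
assert np.allclose(D, np.triu(D)) and np.allclose(D, np.tril(D))
assert np.allclose(D, D.T)
```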


Applications

Diagonal matrices occur in many areas of linear algebra. Because of the simple description of the matrix operations and eigenvalues/eigenvectors given above, it is typically desirable to represent a given matrix or linear map by a diagonal matrix.

In fact, a given ''n''-by-''n'' matrix ''A'' is similar to a diagonal matrix (meaning that there is a matrix ''X'' such that X^{-1} A X is diagonal) if and only if it has ''n'' linearly independent eigenvectors. Such matrices are said to be diagonalizable.

Over the field of real or complex numbers, more is true. The spectral theorem says that every normal matrix is unitarily similar to a diagonal matrix (if A A^* = A^* A, then there exists a unitary matrix ''U'' such that U A U^* is diagonal). Furthermore, the singular value decomposition implies that for any matrix ''A'', there exist unitary matrices ''U'' and ''V'' such that U^* A V is diagonal with positive entries.
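A hedged NumPy sketch of both statements (a real symmetric matrix is used as a convenient special case of a normal matrix; names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Spectral theorem (special case): a real symmetric matrix is orthogonally diagonalizable.
A = rng.standard_normal((4, 4))
S = A + A.T                              # symmetric, hence normal
w, U = np.linalg.eigh(S)                 # orthogonal U, real eigenvalues w
assert np.allclose(U.T @ S @ U, np.diag(w), atol=1e-10)

# Singular value decomposition: U* B V is diagonal for any square matrix B.
B = rng.standard_normal((4, 4))
U2, s, Vh = np.linalg.svd(B)             # B = U2 @ diag(s) @ Vh
assert np.allclose(U2.conj().T @ B @ Vh.conj().T, np.diag(s), atol=1e-10)
```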


Operator theory

In operator theory, particularly the study of PDEs, operators are particularly easy to understand and PDEs easy to solve if the operator is diagonal with respect to the basis with which one is working; this corresponds to a separable partial differential equation. Therefore, a key technique to understanding operators is a change of coordinates (in the language of operators, an integral transform), which changes the basis to an eigenbasis of eigenfunctions and thus makes the equation separable. An important example of this is the Fourier transform, which diagonalizes constant coefficient differentiation operators (or, more generally, translation invariant operators), such as the Laplacian operator, say, in the heat equation.

Especially easy are multiplication operators, which are defined as multiplication by (the values of) a fixed function; the values of the function at each point correspond to the diagonal entries of a matrix.
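As a finite-dimensional illustration of this idea (a sketch of my own, not from the source), the discrete Fourier transform diagonalizes circulant matrices, which are the discrete translation-invariant operators; the example uses a periodic discrete Laplacian stencil.

```python
import numpy as np

n = 6
c = np.array([2.0, -1.0, 0.0, 0.0, 0.0, -1.0])            # periodic discrete Laplacian stencil
C = np.column_stack([np.roll(c, k) for k in range(n)])    # circulant (translation-invariant) matrix

F = np.fft.fft(np.eye(n))                 # DFT matrix
D = F @ C @ np.linalg.inv(F)              # change of basis to the Fourier eigenbasis

# D is (numerically) diagonal, with the DFT of the first column on its diagonal.
assert np.allclose(D, np.diag(np.diag(D)), atol=1e-9)
assert np.allclose(np.diag(D), np.fft.fft(c), atol=1e-9)
```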


See also

* Anti-diagonal matrix
* Banded matrix
* Bidiagonal matrix
* Diagonally dominant matrix
* Diagonalizable matrix
* Jordan normal form
* Multiplication operator
* Tridiagonal matrix
* Toeplitz matrix
* Toral Lie algebra
* Circulant matrix

