In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition A^\textsf{T} = -A. In terms of the entries of the matrix, if a_{ij} denotes the entry in the i-th row and j-th column, then the skew-symmetric condition is equivalent to a_{ji} = -a_{ij} for all i and j.


Example

The matrix A = \begin{bmatrix} 0 & 2 & -45 \\ -2 & 0 & -4 \\ 45 & 4 & 0 \end{bmatrix} is skew-symmetric because A^\textsf{T} = \begin{bmatrix} 0 & -2 & 45 \\ 2 & 0 & 4 \\ -45 & -4 & 0 \end{bmatrix} = -A.
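
A quick numerical check of this example; a minimal sketch assuming NumPy is available:

```python
import numpy as np

# The example matrix from above.
A = np.array([[  0,  2, -45],
              [ -2,  0,  -4],
              [ 45,  4,   0]])

# A matrix is skew-symmetric exactly when its transpose equals its negative.
print(np.array_equal(A.T, -A))  # True
```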


Properties

Throughout, we assume that all matrix entries belong to a field \mathbb{F} whose characteristic is not equal to 2. That is, we assume that 1 + 1 \neq 0, where 1 denotes the multiplicative identity and 0 the additive identity of the given field. If the characteristic of the field is 2, then a skew-symmetric matrix is the same thing as a symmetric matrix.

* The sum of two skew-symmetric matrices is skew-symmetric.
* A scalar multiple of a skew-symmetric matrix is skew-symmetric.
* The elements on the diagonal of a skew-symmetric matrix are zero, and therefore its trace equals zero.
* If A is a real skew-symmetric matrix and \lambda is a real eigenvalue, then \lambda = 0, i.e. the nonzero eigenvalues of a skew-symmetric matrix are non-real.
* If A is a real skew-symmetric matrix, then I + A is invertible, where I is the identity matrix.
* If A is a skew-symmetric matrix, then A^2 is a symmetric negative semi-definite matrix.

Several of these properties are easy to check numerically, as in the sketch below.
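
A minimal NumPy sketch verifying several of the listed properties on a random real skew-symmetric matrix (M - M^T is skew-symmetric for any square M):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random real skew-symmetric matrix.
M = rng.standard_normal((4, 4))
A = M - M.T

print(np.allclose(np.diag(A), 0))                  # diagonal entries are zero
print(np.isclose(np.trace(A), 0))                  # hence the trace vanishes
print(np.allclose(np.linalg.eigvals(A).real, 0))   # eigenvalues are purely imaginary
print(np.linalg.det(np.eye(4) + A) != 0)           # I + A is invertible

S = A @ A                                          # A^2 is symmetric negative semi-definite
print(np.allclose(S, S.T), np.all(np.linalg.eigvalsh(S) <= 1e-12))
```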


Vector space structure

As a result of the first two properties above, the set of all skew-symmetric matrices of a fixed size forms a vector space. The space of n \times n skew-symmetric matrices has dimension \frac{1}{2}n(n - 1).

Let \mbox{Mat}_n denote the space of n \times n matrices. A skew-symmetric matrix is determined by \frac{1}{2}n(n - 1) scalars (the number of entries above the main diagonal); a symmetric matrix is determined by \frac{1}{2}n(n + 1) scalars (the number of entries on or above the main diagonal). Let \mbox{Skew}_n denote the space of n \times n skew-symmetric matrices and \mbox{Sym}_n denote the space of n \times n symmetric matrices. If A \in \mbox{Mat}_n then A = \tfrac{1}{2}\left(A - A^\textsf{T}\right) + \tfrac{1}{2}\left(A + A^\textsf{T}\right). Notice that \frac{1}{2}\left(A - A^\textsf{T}\right) \in \mbox{Skew}_n and \frac{1}{2}\left(A + A^\textsf{T}\right) \in \mbox{Sym}_n. This is true for every square matrix A with entries from any field whose characteristic is different from 2. Then, since \mbox{Mat}_n = \mbox{Skew}_n + \mbox{Sym}_n and \mbox{Skew}_n \cap \mbox{Sym}_n = \{0\}, we have \mbox{Mat}_n = \mbox{Skew}_n \oplus \mbox{Sym}_n, where \oplus denotes the direct sum. A sketch of this decomposition is given below.

Denote by \langle \cdot, \cdot \rangle the standard inner product on \R^n. The real n \times n matrix A is skew-symmetric if and only if \langle Ax, y \rangle = -\langle x, Ay \rangle \quad \text{for all } x, y \in \R^n. This is also equivalent to \langle x, Ax \rangle = 0 for all x \in \R^n (one implication being obvious, the other a plain consequence of \langle x + y, A(x + y)\rangle = 0 for all x and y). Since this definition is independent of the choice of basis, skew-symmetry is a property that depends only on the linear operator A and a choice of inner product.

3 \times 3 skew-symmetric matrices can be used to represent cross products as matrix multiplications. Furthermore, if A is a skew-symmetric matrix, then x^\textsf{T} A x = 0 for all x \in \C^n, since the scalar x^\textsf{T} A x equals its own transpose x^\textsf{T} A^\textsf{T} x = -x^\textsf{T} A x.
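
A minimal NumPy sketch of the decomposition of a square matrix into its skew-symmetric and symmetric parts:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# Unique decomposition of a square matrix into skew-symmetric and symmetric parts.
skew = 0.5 * (A - A.T)
sym  = 0.5 * (A + A.T)

print(np.allclose(skew, -skew.T))   # skew part is skew-symmetric
print(np.allclose(sym, sym.T))      # symmetric part is symmetric
print(np.allclose(skew + sym, A))   # and they sum back to A
```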


Determinant

Let A be an n \times n skew-symmetric matrix. The determinant of A satisfies \det(A) = \det\left(A^\textsf{T}\right) = \det(-A) = (-1)^n \det(A). In particular, if n is odd, and since the underlying field is not of characteristic 2, the determinant vanishes. Hence, all odd-dimensional skew-symmetric matrices are singular, as their determinants are always zero. This result is called Jacobi's theorem, after Carl Gustav Jacobi (Eves, 1980).

The even-dimensional case is more interesting. It turns out that the determinant of A for n even can be written as the square of a polynomial in the entries of A, which was first proved by Cayley: \det(A) = \operatorname{Pf}(A)^2. This polynomial is called the ''Pfaffian'' of A and is denoted \operatorname{Pf}(A). Thus the determinant of a real skew-symmetric matrix is always non-negative. However, this last fact can be proved in an elementary way as follows: the eigenvalues of a real skew-symmetric matrix are purely imaginary (see below) and to every eigenvalue there corresponds the conjugate eigenvalue with the same multiplicity; therefore, as the determinant is the product of the eigenvalues, each one repeated according to its multiplicity, it follows at once that the determinant, if it is not 0, is a positive real number. A numerical check of Jacobi's theorem and the Pfaffian identity is sketched below.

The number of distinct terms s(n) in the expansion of the determinant of a skew-symmetric matrix of order n was considered already by Cayley, Sylvester, and Pfaff. Due to cancellations, this number is quite small compared with the number of terms of the determinant of a generic matrix of order n, which is n!. The sequence s(n) is 1, 0, 1, 0, 6, 0, 120, 0, 5250, 0, 395010, 0, … and it is encoded in the exponential generating function \sum_{n=0}^\infty \frac{s(n)}{n!}x^n = \left(1 - x^2\right)^{-\frac{1}{4}}\exp\left(\frac{x^2}{4}\right). The latter yields the asymptotics (for n even) s(n) = \pi^{-\frac{1}{2}}\, 2^{\frac{3}{4}}\, \Gamma\left(\tfrac{3}{4}\right)\left(\frac{n}{e}\right)^{n-\frac{3}{4}} \left(1 + O\left(\tfrac{1}{n}\right)\right). The number of positive and negative terms is approximately half of the total, although their difference takes larger and larger positive and negative values as n increases.
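
A rough numerical check of these two facts; a sketch assuming NumPy, using the standard closed form of the 4 \times 4 Pfaffian, \operatorname{Pf}(A) = a_{12}a_{34} - a_{13}a_{24} + a_{14}a_{23}:

```python
import numpy as np

rng = np.random.default_rng(2)

def random_skew(n, rng):
    # M - M^T is skew-symmetric for any square M.
    M = rng.standard_normal((n, n))
    return M - M.T

# Odd order: the determinant vanishes (Jacobi's theorem).
print(np.isclose(np.linalg.det(random_skew(5, rng)), 0))

# Even order: the determinant is the square of the Pfaffian, hence non-negative.
A = random_skew(4, rng)
pf = A[0, 1]*A[2, 3] - A[0, 2]*A[1, 3] + A[0, 3]*A[1, 2]
print(np.isclose(np.linalg.det(A), pf**2))
```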


Cross product

Three-by-three skew-symmetric matrices can be used to represent cross products as matrix multiplications. Consider two vectors \mathbf{a} = \left(a_1, a_2, a_3\right) and \mathbf{b} = \left(b_1, b_2, b_3\right). The cross product \mathbf{a}\times\mathbf{b} is a bilinear map, which means that by fixing one of the two arguments, for example \mathbf{a}, it induces a linear map with an associated transformation matrix [\mathbf{a}]_{\times}, such that \mathbf{a}\times\mathbf{b} = [\mathbf{a}]_{\times}\mathbf{b}, where [\mathbf{a}]_{\times} = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix}. This can be immediately verified by computing both sides of the previous equation and comparing each corresponding element of the results.

One actually has [\mathbf{a}\times\mathbf{b}]_{\times} = [\mathbf{a}]_{\times}[\mathbf{b}]_{\times} - [\mathbf{b}]_{\times}[\mathbf{a}]_{\times}; i.e., the commutator of skew-symmetric three-by-three matrices can be identified with the cross product of two vectors. Since the skew-symmetric three-by-three matrices are the Lie algebra of the rotation group SO(3), this elucidates the relation between three-space \mathbb{R}^3, the cross product and three-dimensional rotations. More on infinitesimal rotations can be found below.
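
A minimal NumPy sketch of both identities (cross_matrix is an illustrative helper name, not a library function):

```python
import numpy as np

def cross_matrix(a):
    """Skew-symmetric matrix [a]_x with cross_matrix(a) @ b == np.cross(a, b)."""
    return np.array([[    0, -a[2],  a[1]],
                     [ a[2],     0, -a[0]],
                     [-a[1],  a[0],     0]])

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# The cross product as a matrix multiplication.
print(np.allclose(cross_matrix(a) @ b, np.cross(a, b)))

# The commutator of [a]_x and [b]_x represents a x b.
C = cross_matrix(a) @ cross_matrix(b) - cross_matrix(b) @ cross_matrix(a)
print(np.allclose(C, cross_matrix(np.cross(a, b))))
```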


Spectral theory

Since a matrix is similar to its own transpose, they must have the same eigenvalues. Moreover, since A^\textsf{T} = -A, the eigenvalues of A coincide with those of -A. It follows that the eigenvalues of a skew-symmetric matrix always come in pairs \pm\lambda (except in the odd-dimensional case, where there is an additional unpaired 0 eigenvalue). From the spectral theorem, for a real skew-symmetric matrix the nonzero eigenvalues are all pure imaginary and thus are of the form \lambda_1 i, -\lambda_1 i, \lambda_2 i, -\lambda_2 i, \ldots where each of the \lambda_k are real.

Real skew-symmetric matrices are normal matrices (they commute with their adjoints) and are thus subject to the spectral theorem, which states that any real skew-symmetric matrix can be diagonalized by a unitary matrix. Since the eigenvalues of a real skew-symmetric matrix are imaginary, it is not possible to diagonalize one by a real matrix. However, it is possible to bring every skew-symmetric matrix to a block diagonal form by a special orthogonal transformation. Specifically, every 2n \times 2n real skew-symmetric matrix can be written in the form A = Q\Sigma Q^\textsf{T} where Q is orthogonal and \Sigma = \begin{bmatrix} \begin{matrix} 0 & \lambda_1 \\ -\lambda_1 & 0 \end{matrix} & 0 & \cdots & 0 \\ 0 & \begin{matrix} 0 & \lambda_2 \\ -\lambda_2 & 0 \end{matrix} & & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & \begin{matrix} 0 & \lambda_r \\ -\lambda_r & 0 \end{matrix} \\ & & & & \begin{matrix} 0 \\ & \ddots \\ & & 0 \end{matrix} \end{bmatrix} for real positive \lambda_k. The nonzero eigenvalues of this matrix are \pm\lambda_k i. In the odd-dimensional case \Sigma always has at least one row and column of zeros.

More generally, every complex skew-symmetric matrix can be written in the form A = U \Sigma U^\textsf{T} where U is unitary and \Sigma has the block-diagonal form given above with \lambda_k still real positive. This is an example of the Youla decomposition of a complex square matrix.
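
The eigenvalue pairing and the block form can be checked numerically; a sketch assuming SciPy is available (the real Schur form of a normal matrix is block diagonal, up to rounding):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = M - M.T  # real skew-symmetric

# Eigenvalues are purely imaginary and come in pairs +/- lambda*i.
w = np.linalg.eigvals(A)
print(np.allclose(w.real, 0))
print(np.allclose(np.sort(w.imag), -np.sort(w.imag)[::-1]))

# Real Schur form: A = Q @ Sigma @ Q.T with Q orthogonal and Sigma made of
# 2x2 blocks [[0, lam], [-lam, 0]] on the diagonal, as described above.
Sigma, Q = schur(A, output='real')
print(np.allclose(Q @ Sigma @ Q.T, A))
print(np.round(Sigma, 3))
```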


Skew-symmetric and alternating forms

A skew-symmetric form \varphi on a vector space V over a field K of arbitrary characteristic is defined to be a bilinear form \varphi: V \times V \to K such that for all v, w in V, \varphi(v, w) = -\varphi(w, v). This defines a form with desirable properties for vector spaces over fields of characteristic not equal to 2, but in a vector space over a field of characteristic 2, the definition is equivalent to that of a symmetric form, as every element is its own additive inverse.

Where the vector space V is over a field of arbitrary characteristic including characteristic 2, we may define an alternating form as a bilinear form \varphi such that for all vectors v in V, \varphi(v, v) = 0. This is equivalent to a skew-symmetric form when the field is not of characteristic 2, as seen from 0 = \varphi(v + w, v + w) = \varphi(v, v) + \varphi(v, w) + \varphi(w, v) + \varphi(w, w) = \varphi(v, w) + \varphi(w, v), whence \varphi(v, w) = -\varphi(w, v).

A bilinear form \varphi will be represented by a matrix A such that \varphi(v, w) = v^\textsf{T}Aw, once a basis of V is chosen, and conversely an n \times n matrix A on K^n gives rise to a form sending (v, w) to v^\textsf{T}Aw. For each of symmetric, skew-symmetric and alternating forms, the representing matrices are symmetric, skew-symmetric and alternating respectively.
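
A small numerical illustration of a skew-symmetric bilinear form and its alternating property over the reals (characteristic 0); a minimal sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((3, 3))
A = M - M.T  # skew-symmetric representing matrix

def phi(v, w, A=A):
    """Bilinear form represented by A: phi(v, w) = v^T A w."""
    return v @ A @ w

v = rng.standard_normal(3)
w = rng.standard_normal(3)

print(np.isclose(phi(v, w), -phi(w, v)))  # skew-symmetric form
print(np.isclose(phi(v, v), 0))           # hence alternating (char != 2)
```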


Infinitesimal rotations


Coordinate-free

More intrinsically (i.e., without using coordinates), skew-symmetric linear transformations on a vector space V with an inner product may be defined as the bivectors on the space, which are sums of simple bivectors (2-blades) v \wedge w. The correspondence is given by the map v \wedge w \mapsto v \otimes w - w \otimes v; in orthonormal coordinates these are exactly the elementary skew-symmetric matrices. This characterization is used in interpreting the curl of a vector field (naturally a 2-vector) as an infinitesimal rotation or "curl", hence the name.
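
In orthonormal coordinates the map v \wedge w \mapsto v \otimes w - w \otimes v can be realized with outer products; a minimal sketch assuming NumPy:

```python
import numpy as np

# Image of the simple bivector v ^ w under v (x) w - w (x) v.
v = np.array([1.0, 0.0, 2.0])
w = np.array([0.0, 3.0, 1.0])

B = np.outer(v, w) - np.outer(w, v)
print(np.allclose(B, -B.T))  # the image is a skew-symmetric matrix
```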


Skew-symmetrizable matrix

An n \times n matrix A is said to be skew-symmetrizable if there exists an invertible diagonal matrix D such that DA is skew-symmetric. For real n \times n matrices, the condition that D have positive entries is sometimes added.
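
A hypothetical 2 \times 2 example (the matrices A and D below are illustrative choices, not from the source):

```python
import numpy as np

# A is not skew-symmetric, but D A is, for an invertible diagonal D with
# positive entries; hence A is skew-symmetrizable.
A = np.array([[ 0.0,  2.0],
              [-1.0,  0.0]])
D = np.diag([1.0, 2.0])

DA = D @ A
print(np.allclose(DA, -DA.T))  # True
```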


See also

* Cayley transform
* Symmetric matrix
* Skew-Hermitian matrix
* Symplectic matrix
* Symmetry in mathematics

