
In mathematics, in particular linear algebra, the matrix determinant lemma computes the determinant of the sum of an invertible matrix A and the dyadic product, uv^T, of a column vector u and a row vector v^T.


Statement

Suppose A is an invertible square matrix and u, v are column vectors. Then the matrix determinant lemma states that

:\det\left(\mathbf{A} + \mathbf{u}\mathbf{v}^\textsf{T}\right) = \left(1 + \mathbf{v}^\textsf{T}\mathbf{A}^{-1}\mathbf{u}\right)\,\det\left(\mathbf{A}\right)\,.

Here, uv^T is the outer product of the two vectors u and v. The theorem can also be stated in terms of the adjugate matrix of A:

:\det\left(\mathbf{A} + \mathbf{u}\mathbf{v}^\textsf{T}\right) = \det\left(\mathbf{A}\right) + \mathbf{v}^\textsf{T}\,\mathrm{adj}\left(\mathbf{A}\right)\mathbf{u}\,,

in which case it applies whether or not the square matrix A is invertible.
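As a concrete sanity check, the identity can be verified numerically. The sketch below (plain Python; the 3-by-3 matrix A and the vectors u, v are arbitrary illustrative values, not taken from the text) compares det(A + uv^T) computed directly against (1 + v^T A^{-1} u) det(A):

```python
# Numerical check of the matrix determinant lemma on a small example.

def det(M):
    """Determinant by Laplace expansion along the first row (fine for tiny n)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def solve(M, b):
    """Solve M x = b by Cramer's rule (fine for tiny matrices)."""
    d = det(M)
    n = len(b)
    return [det([[b[i] if k == j else M[i][k] for k in range(n)]
                 for i in range(n)]) / d for j in range(n)]

A = [[4.0, 1.0, 0.0],
     [2.0, 5.0, 1.0],
     [0.0, 1.0, 3.0]]
u = [1.0, 0.0, 2.0]
v = [3.0, 1.0, 0.0]

# Left-hand side: det(A + u v^T), computed from scratch.
A_plus = [[A[i][j] + u[i] * v[j] for j in range(3)] for i in range(3)]
lhs = det(A_plus)

# Right-hand side: (1 + v^T A^{-1} u) det(A); A^{-1} u obtained by solving A x = u.
x = solve(A, u)
rhs = (1.0 + sum(v[i] * x[i] for i in range(3))) * det(A)

print(abs(lhs - rhs) < 1e-9)  # the two sides agree
```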


Proof

First, the proof of the special case A = I follows from the equality:

:\begin{pmatrix} \mathbf{I} & 0 \\ \mathbf{v}^\textsf{T} & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} + \mathbf{u}\mathbf{v}^\textsf{T} & \mathbf{u} \\ 0 & 1 \end{pmatrix} \begin{pmatrix} \mathbf{I} & 0 \\ -\mathbf{v}^\textsf{T} & 1 \end{pmatrix} = \begin{pmatrix} \mathbf{I} & \mathbf{u} \\ 0 & 1 + \mathbf{v}^\textsf{T}\mathbf{u} \end{pmatrix}.

The determinant of the left-hand side is the product of the determinants of the three matrices. Since the first and third matrices are triangular with unit diagonal, their determinants are just 1. The determinant of the middle matrix is our desired value. The determinant of the right-hand side is simply (1 + v^T u). So we have the result:

:\det\left(\mathbf{I} + \mathbf{u}\mathbf{v}^\textsf{T}\right) = 1 + \mathbf{v}^\textsf{T}\mathbf{u}.

The general case then follows by factoring A + uv^T = A(I + (A^{-1}u)v^T) and applying the special case:

:\begin{align} \det\left(\mathbf{A} + \mathbf{u}\mathbf{v}^\textsf{T}\right) &= \det\left(\mathbf{A}\right) \det\left(\mathbf{I} + \left(\mathbf{A}^{-1}\mathbf{u}\right)\mathbf{v}^\textsf{T}\right)\\ &= \det\left(\mathbf{A}\right) \left(1 + \mathbf{v}^\textsf{T}\left(\mathbf{A}^{-1}\mathbf{u}\right)\right). \end{align}


Application

If the determinant and inverse of A are already known, the formula provides a numerically cheap way to compute the determinant of A corrected by the matrix uv^T. The computation is cheap because the determinant of A + uv^T does not have to be computed from scratch (which in general is expensive). Using unit vectors for u and/or v, individual columns, rows or elements of A may be manipulated and a correspondingly updated determinant computed relatively cheaply in this way. When the matrix determinant lemma is used in conjunction with the Sherman–Morrison formula, both the inverse and determinant may conveniently be updated together.
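A minimal sketch of such a joint update, assuming det(A) and A^{-1} are already on hand (the `rank_one_update` helper and the 2-by-2 values below are illustrative, not from the text). The determinant update uses the matrix determinant lemma and the inverse update uses the Sherman–Morrison formula; both cost O(n^2), versus O(n^3) for recomputation from scratch:

```python
# Maintain det(A) and A^{-1} together under a rank-one update A -> A + u v^T.
# Pure-Python lists are used for illustration.

def rank_one_update(det_A, A_inv, u, v):
    n = len(A_inv)
    # w = A^{-1} u  and  lam = 1 + v^T A^{-1} u
    w = [sum(A_inv[i][k] * u[k] for k in range(n)) for i in range(n)]
    lam = 1.0 + sum(v[i] * w[i] for i in range(n))
    # Matrix determinant lemma: det(A + u v^T) = lam * det(A)
    new_det = lam * det_A
    # Sherman-Morrison: (A + u v^T)^{-1} = A^{-1} - (A^{-1} u)(v^T A^{-1}) / lam
    vtAinv = [sum(v[k] * A_inv[k][j] for k in range(n)) for j in range(n)]
    new_inv = [[A_inv[i][j] - w[i] * vtAinv[j] / lam for j in range(n)]
               for i in range(n)]
    return new_det, new_inv

# A = [[2, 0], [0, 3]]: det(A) = 6, A^{-1} = [[0.5, 0], [0, 1/3]].
det_A, A_inv = 6.0, [[0.5, 0.0], [0.0, 1.0 / 3.0]]
u, v = [1.0, 1.0], [1.0, 0.0]   # update A -> A + u v^T = [[3, 0], [1, 3]]
new_det, new_inv = rank_one_update(det_A, A_inv, u, v)
print(new_det)  # det([[3, 0], [1, 3]]) = 9.0
```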


Generalization

Suppose A is an invertible ''n''-by-''n'' matrix and U, V are ''n''-by-''m'' matrices. Then

:\det\left(\mathbf{A} + \mathbf{U}\mathbf{V}^\textsf{T}\right) = \det\left(\mathbf{I_m} + \mathbf{V}^\textsf{T}\mathbf{A}^{-1}\mathbf{U}\right)\det(\mathbf{A}).

In the special case \mathbf{A} = \mathbf{I_n} this is the Weinstein–Aronszajn identity. Given additionally an invertible ''m''-by-''m'' matrix W, the relationship can also be expressed as

:\det\left(\mathbf{A} + \mathbf{U}\mathbf{W}\mathbf{V}^\textsf{T}\right) = \det\left(\mathbf{W}^{-1} + \mathbf{V}^\textsf{T}\mathbf{A}^{-1}\mathbf{U}\right)\det\left(\mathbf{W}\right)\det\left(\mathbf{A}\right).
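The rank-m identity can likewise be checked on a small example. The sketch below uses an arbitrary diagonal 3-by-3 matrix A (so A^{-1} is immediate) and illustrative 3-by-2 matrices U, V, and compares the n-by-n determinant on the left against the cheaper m-by-m determinant on the right:

```python
# Check of det(A + U V^T) = det(I_m + V^T A^{-1} U) det(A) with n = 3, m = 2.
from itertools import permutations

def det(M):
    """Determinant via the Leibniz permutation formula (fine for tiny n)."""
    n = len(M)
    total = 0.0
    for p in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    sign = -sign
        prod = 1.0
        for i in range(n):
            prod *= M[i][p[i]]
        total += sign * prod
    return total

A_diag = [2.0, 4.0, 5.0]                       # A = diag(2, 4, 5)
U = [[1.0, 0.0], [2.0, 1.0], [0.0, 3.0]]       # n x m
V = [[1.0, 1.0], [0.0, 2.0], [1.0, 0.0]]       # n x m

# Left-hand side: det(A + U V^T), an n x n determinant.
lhs = det([[(A_diag[i] if i == j else 0.0) +
            sum(U[i][k] * V[j][k] for k in range(2)) for j in range(3)]
           for i in range(3)])

# Right-hand side: det(I_m + V^T A^{-1} U) det(A), only an m x m determinant.
S = [[(1.0 if a == b else 0.0) +
      sum(V[i][a] * U[i][b] / A_diag[i] for i in range(3)) for b in range(2)]
     for a in range(2)]
rhs = det(S) * A_diag[0] * A_diag[1] * A_diag[2]

print(abs(lhs - rhs) < 1e-9)  # the two sides agree
```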


See also

* The Sherman–Morrison formula, which shows how to update the inverse, A^{-1}, to obtain (A + uv^T)^{-1}.
* The Woodbury formula, which shows how to update the inverse, A^{-1}, to obtain (A + UCV^T)^{-1}.
* The binomial inverse theorem for (A + UCV^T)^{-1}.

