In mathematics, in particular linear algebra, the matrix determinant lemma computes the determinant of the sum of an invertible matrix A and the dyadic product, uv^T, of a column vector u and a row vector v^T.
Statement
Suppose A is an invertible square matrix and u, v are column vectors. Then the matrix determinant lemma states that
:\det\left(A + uv^\mathsf{T}\right) = \left(1 + v^\mathsf{T} A^{-1} u\right)\,\det(A).
Here, uv^T is the outer product of two vectors u and v.
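As an illustration, the identity can be checked numerically. The following sketch assumes NumPy is available; the matrix and vectors are arbitrary example values, not taken from any source:
<syntaxhighlight lang="python">
import numpy as np

# Arbitrary invertible 3x3 matrix and vectors (illustrative values only).
A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
u = np.array([1.0, 2.0, 0.5])
v = np.array([0.0, 1.0, 3.0])

lhs = np.linalg.det(A + np.outer(u, v))                      # det(A + u v^T), computed directly
rhs = (1.0 + v @ np.linalg.inv(A) @ u) * np.linalg.det(A)    # right-hand side of the lemma

print(lhs, rhs)   # the two values agree up to floating-point error
</syntaxhighlight>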
The theorem can also be stated in terms of the adjugate matrix of A:
:\det\left(A + uv^\mathsf{T}\right) = \det(A) + v^\mathsf{T}\,\mathrm{adj}(A)\,u,
in which case it applies whether or not the matrix A is invertible.
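Because the adjugate form does not involve A^{-1}, it can be checked even for a singular A. A minimal sketch, assuming NumPy and using a small hypothetical helper `adjugate` built from cofactors (illustrative values only):
<syntaxhighlight lang="python">
import numpy as np

def adjugate(M):
    """Adjugate via cofactors (fine for small matrices; illustrative only)."""
    n = M.shape[0]
    C = np.empty_like(M, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T   # adj(M) is the transpose of the cofactor matrix

# A singular matrix (second row is twice the first), so A^{-1} does not exist,
# but the adjugate form of the lemma still applies.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])
u = np.array([1.0, 0.0, 2.0])
v = np.array([3.0, 1.0, 0.0])

lhs = np.linalg.det(A + np.outer(u, v))
rhs = np.linalg.det(A) + v @ adjugate(A) @ u
print(lhs, rhs)   # both sides agree up to rounding
</syntaxhighlight>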
Proof
First the proof of the special case A = I follows from the equality:
:\begin{pmatrix} I & 0 \\ v^\mathsf{T} & 1 \end{pmatrix}\begin{pmatrix} I + uv^\mathsf{T} & u \\ 0 & 1 \end{pmatrix}\begin{pmatrix} I & 0 \\ -v^\mathsf{T} & 1 \end{pmatrix} = \begin{pmatrix} I & u \\ 0 & 1 + v^\mathsf{T}u \end{pmatrix}.
The determinant of the left-hand side is the product of the determinants of the three matrices. Since the first and third matrix are triangular matrices with unit diagonal, their determinants are just 1. The determinant of the middle matrix is our desired value. The determinant of the right-hand side is simply (1 + v^Tu). So we have the result:
:\det\left(I + uv^\mathsf{T}\right) = 1 + v^\mathsf{T}u.
Then the general case can be found as:
:\det\left(A + uv^\mathsf{T}\right) = \det(A)\,\det\left(I + \left(A^{-1}u\right)v^\mathsf{T}\right) = \det(A)\left(1 + v^\mathsf{T}A^{-1}u\right).
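The block factorization used in the special case can also be verified numerically. A short sketch, assuming NumPy, with arbitrary illustrative vectors:
<syntaxhighlight lang="python">
import numpy as np

n = 3
u = np.array([[1.0], [2.0], [0.5]])   # column vector, shape (n, 1)
v = np.array([[0.0], [1.0], [3.0]])
I = np.eye(n)

# The three factors appearing in the special case A = I.
L = np.block([[I,           np.zeros((n, 1))], [ v.T,              np.ones((1, 1))]])
M = np.block([[I + u @ v.T, u               ], [ np.zeros((1, n)), np.ones((1, 1))]])
R = np.block([[I,           np.zeros((n, 1))], [-v.T,              np.ones((1, 1))]])

P = L @ M @ R
# P equals [[I, u], [0, 1 + v^T u]]; det(L) = det(R) = 1,
# so det(M) = det(I + u v^T) = 1 + v^T u.
target = np.block([[I, u], [np.zeros((1, n)), 1.0 + v.T @ u]])
print(np.allclose(P, target))
print(np.linalg.det(M), 1.0 + (v.T @ u).item())   # same value
</syntaxhighlight>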
Application
If the determinant and inverse of A are already known, the formula provides a numerically cheap way to compute the determinant of A corrected by the matrix uv^T. The computation is relatively cheap because the determinant of A + uv^T does not have to be computed from scratch (which in general is expensive). Using unit vectors for u and/or v, individual columns, rows or elements of A may be manipulated and a correspondingly updated determinant computed relatively cheaply in this way.
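For instance, replacing the j-th column of A by a new column c is the rank-one update A + (c − Ae_j)e_j^T, where e_j is the j-th standard unit vector. A minimal sketch of this use, assuming NumPy; the matrix, column and index are arbitrary illustrative values:
<syntaxhighlight lang="python">
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
A_inv = np.linalg.inv(A)        # assumed already known
det_A = np.linalg.det(A)        # assumed already known

j = 1                           # replace column 1 of A ...
c = np.array([1.0, -2.0, 4.0])  # ... by this new column (illustrative values)

e_j = np.zeros(3)
e_j[j] = 1.0
u = c - A[:, j]                 # rank-one update: A_new = A + u e_j^T

# Updated determinant via the lemma: O(n^2) given A_inv, instead of O(n^3) from scratch.
det_new = (1.0 + e_j @ A_inv @ u) * det_A
print(det_new, np.linalg.det(A + np.outer(u, e_j)))   # same value
</syntaxhighlight>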
When the matrix determinant lemma is used in conjunction with the
Sherman–Morrison formula, both the inverse and determinant may be conveniently updated together.
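A sketch of such a combined update, assuming NumPy; the helper name `rank_one_update` and the example values are illustrative, and the two formulas share the intermediate quantity 1 + v^T A^{-1} u:
<syntaxhighlight lang="python">
import numpy as np

def rank_one_update(A_inv, det_A, u, v):
    """Return the inverse and determinant of A + u v^T, given A^{-1} and det(A).

    Uses the Sherman–Morrison formula for the inverse and the matrix
    determinant lemma for the determinant; both reuse A^{-1} u and v^T A^{-1}.
    """
    Ainv_u = A_inv @ u
    vT_Ainv = v @ A_inv
    denom = 1.0 + v @ Ainv_u                 # 1 + v^T A^{-1} u appears in both formulas
    new_inv = A_inv - np.outer(Ainv_u, vT_Ainv) / denom
    new_det = denom * det_A
    return new_inv, new_det

# Illustrative values.
A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])
u = np.array([1.0, 2.0, 0.5])
v = np.array([0.0, 1.0, 3.0])

new_inv, new_det = rank_one_update(np.linalg.inv(A), np.linalg.det(A), u, v)
print(np.allclose(new_inv, np.linalg.inv(A + np.outer(u, v))),
      np.isclose(new_det, np.linalg.det(A + np.outer(u, v))))
</syntaxhighlight>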
Generalization
Suppose A is an invertible ''n''-by-''n'' matrix and U, V are ''n''-by-''m'' matrices. Then
:\det\left(A + UV^\mathsf{T}\right) = \det\left(I_m + V^\mathsf{T}A^{-1}U\right)\,\det(A).
In the special case A = I_n this is the Weinstein–Aronszajn identity.
Given additionally an invertible ''m''-by-''m'' matrix W, the relationship can also be expressed as
:\det\left(A + UWV^\mathsf{T}\right) = \det\left(W^{-1} + V^\mathsf{T}A^{-1}U\right)\,\det(W)\,\det(A).
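Both generalized forms can be checked numerically in the same way as the rank-one case. A brief sketch, assuming NumPy; the random matrices below are arbitrary illustrative values, shifted along the diagonal so that A and W are (in practice) invertible:
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned invertible A
U = rng.standard_normal((n, m))
V = rng.standard_normal((n, m))
W = rng.standard_normal((m, m)) + m * np.eye(m)   # invertible m-by-m W

A_inv = np.linalg.inv(A)

# det(A + U V^T) = det(I_m + V^T A^{-1} U) det(A)
print(np.isclose(np.linalg.det(A + U @ V.T),
                 np.linalg.det(np.eye(m) + V.T @ A_inv @ U) * np.linalg.det(A)))

# det(A + U W V^T) = det(W^{-1} + V^T A^{-1} U) det(W) det(A)
print(np.isclose(np.linalg.det(A + U @ W @ V.T),
                 np.linalg.det(np.linalg.inv(W) + V.T @ A_inv @ U)
                 * np.linalg.det(W) * np.linalg.det(A)))
</syntaxhighlight>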
See also
* The Sherman–Morrison formula, which shows how to update the inverse, A^{-1}, to obtain (A + uv^T)^{-1}.
* The Woodbury formula, which shows how to update the inverse, A^{-1}, to obtain (A + UCV^T)^{-1}.
* The binomial inverse theorem for (A + UCV^T)^{-1}.