Householder Matrix
In linear algebra, a Householder transformation (also known as a Householder reflection or elementary reflector) is a linear transformation that describes a reflection about a plane or hyperplane containing the origin. The Householder transformation was used in a 1958 paper by Alston Scott Householder.


Definition


Operator and transformation

The Householder operator may be defined over any finite-dimensional inner product space V with inner product \langle \cdot, \cdot \rangle and unit vector u \in V as
: H_u(x) := x - 2\,\langle x, u \rangle\, u\,.
It is also common to choose a non-unit vector q \in V and normalize it directly in the Householder operator's expression:
: H_q(x) = x - 2\, \frac{\langle x, q \rangle}{\langle q, q \rangle}\, q\,.
Such an operator is linear and self-adjoint.

If V = \mathbb{C}^n, note that the reflection hyperplane can be defined by its ''normal vector'', a unit vector \vec v \in V (a vector with length 1) that is orthogonal to the hyperplane. The reflection of a point x about this hyperplane is the Householder transformation:
: \vec x - 2\langle \vec x, \vec v\rangle \vec v = \vec x - 2\vec v\left(\vec v^* \vec x\right),
where \vec x is the vector from the origin to the point x, and \vec v^* is the conjugate transpose of \vec v.


Householder matrix

The matrix constructed from this transformation can be expressed in terms of an outer product as
: P = I - 2\vec v\,\vec v^*,
which is known as the Householder matrix, where I is the identity matrix.
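The construction above can be sketched directly in NumPy (a minimal real-valued sketch; the helper name `householder_matrix` is ours, not from the text):

```python
import numpy as np

def householder_matrix(v):
    """Build P = I - 2 v v^T from a (possibly unnormalized) real vector v."""
    v = np.asarray(v, dtype=float).reshape(-1, 1)
    v = v / np.linalg.norm(v)              # normalize so v is a unit vector
    return np.eye(len(v)) - 2.0 * (v @ v.T)

# Reflect a point across the hyperplane with normal v = e_1 (the y-z plane):
P = householder_matrix([1.0, 0.0, 0.0])
x = np.array([3.0, 4.0, 5.0])
print(P @ x)                                # → [-3.  4.  5.]
```

With this normal vector the hyperplane is the y-z plane, so only the first coordinate changes sign.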


Properties

The Householder matrix has the following properties:
* it is Hermitian: P = P^*,
* it is unitary: P^{-1} = P^* (via the Sherman–Morrison formula),
* hence it is involutory: P = P^{-1}.
* A Householder matrix has eigenvalues \pm 1. To see this, notice that if \vec x is orthogonal to the vector \vec v which was used to create the reflector, then P_v\vec x = (I - 2\vec v\vec v^*)\vec x = \vec x - 2\langle\vec v, \vec x\rangle\vec v = \vec x, i.e., 1 is an eigenvalue of multiplicity n - 1, since there are n - 1 independent vectors orthogonal to \vec v. Also, notice that P_v\vec v = (I - 2\vec v\vec v^*)\vec v = \vec v - 2\langle\vec v, \vec v\rangle\vec v = -\vec v (since \vec v is by definition a unit vector), and so -1 is an eigenvalue with multiplicity 1.
* The determinant of a Householder reflector is -1, since the determinant of a matrix is the product of its eigenvalues, in this case one of which is -1 with the remainder being 1 (as in the previous point), or via the matrix determinant lemma.
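These properties are easy to check numerically; the following NumPy snippet verifies them for a random unit normal (an illustrative check, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(5)
v /= np.linalg.norm(v)                      # unit normal vector
P = np.eye(5) - 2.0 * np.outer(v, v)        # Householder matrix (real case)

assert np.allclose(P, P.T)                  # Hermitian (symmetric, real case)
assert np.allclose(P @ P.T, np.eye(5))      # unitary (orthogonal)
assert np.allclose(P @ P, np.eye(5))        # involutory: P^{-1} = P
assert np.isclose(np.linalg.det(P), -1.0)   # determinant is -1

eigvals = np.sort(np.linalg.eigvalsh(P))    # one -1, the rest 1
print(eigvals)                               # → [-1.  1.  1.  1.  1.]
```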


Example

Consider the normalization of a vector of 1's,
: \vec v = \frac{1}{\sqrt 2}\begin{pmatrix} 1 \\ 1 \end{pmatrix}.
Then the Householder matrix corresponding to this vector is
: P_v = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} - 2\left(\frac{1}{\sqrt 2}\begin{pmatrix} 1 \\ 1 \end{pmatrix}\right)\left(\frac{1}{\sqrt 2}\begin{pmatrix} 1 & 1 \end{pmatrix}\right) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} - \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 0 & -1 \\ -1 & 0 \end{pmatrix}.
Note that if we have a vector representing a coordinate in the 2D plane, \begin{pmatrix} x \\ y \end{pmatrix}, then in this case P_v swaps and negates the x and y coordinates; in other words,
: P_v\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} -y \\ -x \end{pmatrix},
which corresponds to reflecting the vector across the line y = -x, to which our original vector \vec v is normal.
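A quick numerical check of this example (NumPy; variable names are ours):

```python
import numpy as np

v = np.array([1.0, 1.0]) / np.sqrt(2)        # normalized vector of 1's
P = np.eye(2) - 2.0 * np.outer(v, v)         # should be [[0, -1], [-1, 0]]
assert np.allclose(P, [[0.0, -1.0], [-1.0, 0.0]])

# Reflecting (x, y) = (3, 7) across the line y = -x gives (-7, -3):
print(P @ np.array([3.0, 7.0]))              # → [-7. -3.]
```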


Applications


Geometric optics

In geometric optics, specular reflection can be expressed in terms of the Householder matrix (see Specular reflection).


Numerical linear algebra

Householder transformations are widely used in numerical linear algebra, for example, to annihilate the entries below the main diagonal of a matrix, to perform QR decompositions, and in the first step of the QR algorithm. They are also widely used for transforming a matrix to Hessenberg form. For symmetric or Hermitian matrices, the symmetry can be preserved, resulting in tridiagonalization. Because they involve only a rank-one update and make use of low-level BLAS-1 operations, they can be quite efficient.
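Because the reflector is a rank-one modification of the identity, it is normally applied without ever forming P explicitly; a sketch of the rank-one update (the helper name is ours):

```python
import numpy as np

def apply_householder(v, A):
    """Compute (I - 2 v v^T) A as a rank-one update: O(mn) work,
    instead of the O(m^2 n) cost of building P and multiplying."""
    w = v @ A                          # one matrix-vector product
    return A - 2.0 * np.outer(v, w)    # one rank-one update

rng = np.random.default_rng(1)
v = rng.standard_normal(4)
v /= np.linalg.norm(v)                 # unit normal vector
A = rng.standard_normal((4, 3))

# Agrees with forming P = I - 2 v v^T explicitly:
P = np.eye(4) - 2.0 * np.outer(v, v)
assert np.allclose(apply_householder(v, A), P @ A)
```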


QR decomposition

Householder transformations can be used to calculate a QR decomposition. Consider a matrix already triangularized up to column i; our goal is to construct Householder matrices that act upon the trailing principal submatrix of
: \begin{pmatrix} a_{11} & a_{12} & \cdots & & & a_{1n} \\ 0 & a_{22} & \cdots & & & a_{2n} \\ \vdots & & \ddots & & & \vdots \\ 0 & \cdots & 0 & x_1 = a_{ii} & \cdots & a_{in} \\ 0 & \cdots & 0 & \vdots & & \vdots \\ 0 & \cdots & 0 & x_{n-i+1} = a_{ni} & \cdots & a_{nn} \end{pmatrix}
via the matrix
: \begin{pmatrix} I_{i-1} & 0 \\ 0 & P_v \end{pmatrix}.
(Note that we already established that Householder transformations are unitary matrices, and since the product of unitary matrices is itself unitary, this gives us the unitary matrix of the QR decomposition.)

If we can find a \vec v so that
: P_v \vec x = \alpha \vec e_1
for some constant \alpha, we can accomplish this. Thinking geometrically, we are looking for a hyperplane such that the reflection of \vec x about it lands directly on the span of the basis vector \vec e_1. For this to happen, we must have \vec v \propto \vec x - \alpha\vec e_1, and since \vec v is a unit vector, this means that we must have
: \vec v = \frac{\vec x - \alpha\vec e_1}{\left\| \vec x - \alpha\vec e_1 \right\|_2}.
Substituting this expression for \vec v back into the reflection equation, we get
: \vec x - \alpha\vec e_1 = 2\left\langle \vec x, \frac{\vec x - \alpha\vec e_1}{\left\| \vec x - \alpha\vec e_1 \right\|_2} \right\rangle \frac{\vec x - \alpha\vec e_1}{\left\| \vec x - \alpha\vec e_1 \right\|_2},
or, by comparing the scalars in front of the vector \vec x - \alpha\vec e_1, we must have
: \left\| \vec x - \alpha\vec e_1 \right\|_2^2 = 2\langle \vec x, \vec x - \alpha\vec e_1 \rangle,
that is,
: 2\left(\| \vec x \|_2^2 - \alpha x_1\right) = \| \vec x \|_2^2 - 2\alpha x_1 + \alpha^2,
which means that we can solve for \alpha as
: \alpha = \pm\| \vec x \|_2.
This completes the construction; however, in practice we want to avoid catastrophic cancellation in computing \vec x - \alpha\vec e_1. To do so, we choose the sign of \alpha as
: \alpha = -\operatorname{sign}(\operatorname{Re}(x_1))\, \| \vec x \|_2.
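The construction above translates into a short QR routine; the following is a real-valued sketch (the function name and loop structure are ours, not a reference implementation):

```python
import numpy as np

def householder_qr(A):
    """QR decomposition via Householder reflections (real case sketch)."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    R = A.copy()
    Q = np.eye(m)
    for i in range(min(m - 1, n)):
        x = R[i:, i]
        # alpha = -sign(x_1) ||x||_2 avoids catastrophic cancellation
        alpha = -np.sign(x[0]) * np.linalg.norm(x) if x[0] != 0 else -np.linalg.norm(x)
        v = x.copy()
        v[0] -= alpha                        # v proportional to x - alpha e_1
        norm_v = np.linalg.norm(v)
        if norm_v == 0:                      # column already reduced
            continue
        v /= norm_v
        # Apply P_v = I - 2 v v^T to the trailing block; accumulate Q = P_1 P_2 ...
        R[i:, :] -= 2.0 * np.outer(v, v @ R[i:, :])
        Q[:, i:] -= 2.0 * np.outer(Q[:, i:] @ v, v)
    return Q, R

A = np.array([[4., 1., -2.], [1., 2., 0.], [-2., 0., 3.]])
Q, R = householder_qr(A)
assert np.allclose(Q @ R, A)                 # A = QR
assert np.allclose(Q.T @ Q, np.eye(3))       # Q orthogonal
assert np.allclose(np.tril(R, -1), 0)        # R upper triangular
```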


Tridiagonalization (Hessenberg)

This procedure is presented in ''Numerical Analysis'' by Burden and Faires, and works when the matrix is symmetric. In the non-symmetric case, it is still useful, since a similar procedure can result in a Hessenberg matrix. It uses a slightly altered \operatorname{sign} function with \operatorname{sign}(0) = 1.

In the first step, to form the Householder matrix we need to determine \alpha and r, which are:
: \begin{align} \alpha &= -\operatorname{sign}\left(a_{21}\right)\sqrt{\sum_{j=2}^n a_{j1}^2}; \\ r &= \sqrt{\frac{\alpha^2 - a_{21}\alpha}{2}}; \end{align}
From \alpha and r, construct the vector \vec v:
: \vec v^{(1)} = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix},
where v_1 = 0, v_2 = \frac{a_{21} - \alpha}{2r}, and
: v_k = \frac{a_{k1}}{2r} for each k = 3, 4, \ldots, n.
Then compute:
: \begin{align} P^1 &= I - 2\vec v^{(1)} \left(\vec v^{(1)}\right)^\textsf{T} \\ A^{(2)} &= P^1 A P^1 \end{align}
Having found P^1 and computed A^{(2)}, the process is repeated for k = 2, 3, \ldots, n - 2 as follows:
: \begin{align} \alpha &= -\operatorname{sign}\left(a^k_{k+1,k}\right)\sqrt{\sum_{j=k+1}^n \left(a^k_{jk}\right)^2} \\ r &= \sqrt{\frac{\alpha^2 - a^k_{k+1,k}\alpha}{2}} \\ v^k_1 &= v^k_2 = \cdots = v^k_k = 0 \\ v^k_{k+1} &= \frac{a^k_{k+1,k} - \alpha}{2r} \\ v^k_j &= \frac{a^k_{jk}}{2r} \text{ for } j = k + 2,\ k + 3,\ \ldots,\ n \\ P^k &= I - 2\vec v^{(k)} \left(\vec v^{(k)}\right)^\textsf{T} \\ A^{(k+1)} &= P^k A^{(k)} P^k \end{align}
Continuing in this manner, the tridiagonal and symmetric matrix is formed.
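The procedure can be sketched as follows (real symmetric case; the function names are ours, and no attempt is made at the in-place updates a production code would use):

```python
import numpy as np

def sign(t):
    """Altered sign function with sign(0) = 1, as used above."""
    return 1.0 if t >= 0 else -1.0

def householder_tridiagonalize(A):
    """Reduce a symmetric matrix to a similar tridiagonal matrix."""
    A = np.asarray(A, dtype=float).copy()
    n = A.shape[0]
    for k in range(n - 2):
        alpha = -sign(A[k + 1, k]) * np.sqrt(np.sum(A[k + 1:, k] ** 2))
        r = np.sqrt((alpha * alpha - A[k + 1, k] * alpha) / 2.0)
        if r == 0.0:
            continue                          # column already in tridiagonal form
        v = np.zeros(n)                       # v_1 = ... = v_k = 0
        v[k + 1] = (A[k + 1, k] - alpha) / (2.0 * r)
        v[k + 2:] = A[k + 2:, k] / (2.0 * r)
        P = np.eye(n) - 2.0 * np.outer(v, v)  # Householder matrix for this step
        A = P @ A @ P                         # similarity transform preserves eigenvalues
    return A

A = np.array([[4., 1., -2., 2.],
              [1., 2., 0., 1.],
              [-2., 0., 3., -2.],
              [2., 1., -2., -1.]])
T = householder_tridiagonalize(A)
assert np.allclose(np.triu(T, 2), 0) and np.allclose(T, T.T)       # tridiagonal, symmetric
assert np.allclose(np.linalg.eigvalsh(T), np.linalg.eigvalsh(A))   # similar to A
```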


Examples

In this example, also from Burden and Faires, the given matrix is transformed to the similar tridiagonal matrix A_3 by using the Householder method.
: \mathbf{A} = \begin{pmatrix} 4 & 1 & -2 & 2 \\ 1 & 2 & 0 & 1 \\ -2 & 0 & 3 & -2 \\ 2 & 1 & -2 & -1 \end{pmatrix}.
Following those steps in the Householder method, we have the first Householder matrix:
: Q_1 = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & -\frac{1}{3} & \frac{2}{3} & -\frac{2}{3} \\ 0 & \frac{2}{3} & \frac{2}{3} & \frac{1}{3} \\ 0 & -\frac{2}{3} & \frac{1}{3} & \frac{2}{3} \end{pmatrix},
: A_2 = Q_1 A Q_1 = \begin{pmatrix} 4 & -3 & 0 & 0 \\ -3 & \frac{10}{3} & 1 & \frac{4}{3} \\ 0 & 1 & \frac{5}{3} & -\frac{4}{3} \\ 0 & \frac{4}{3} & -\frac{4}{3} & -1 \end{pmatrix}.
A_2 is then used to form
: Q_2 = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -\frac{3}{5} & -\frac{4}{5} \\ 0 & 0 & -\frac{4}{5} & \frac{3}{5} \end{pmatrix},
: A_3 = Q_2 A_2 Q_2 = \begin{pmatrix} 4 & -3 & 0 & 0 \\ -3 & \frac{10}{3} & -\frac{5}{3} & 0 \\ 0 & -\frac{5}{3} & -\frac{33}{25} & \frac{68}{75} \\ 0 & 0 & \frac{68}{75} & \frac{149}{75} \end{pmatrix}.
As we can see, the final result is a tridiagonal symmetric matrix which is similar to the original one. The process is finished after two steps.
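The two steps above can be checked numerically (an illustrative verification using the matrices from the example):

```python
import numpy as np

A = np.array([[4., 1., -2., 2.],
              [1., 2., 0., 1.],
              [-2., 0., 3., -2.],
              [2., 1., -2., -1.]])

Q1 = np.array([[1., 0., 0., 0.],
               [0., -1/3, 2/3, -2/3],
               [0., 2/3, 2/3, 1/3],
               [0., -2/3, 1/3, 2/3]])
A2 = Q1 @ A @ Q1                     # zeroes rows 3-4 of column 1

Q2 = np.array([[1., 0., 0., 0.],
               [0., 1., 0., 0.],
               [0., 0., -3/5, -4/5],
               [0., 0., -4/5, 3/5]])
A3 = Q2 @ A2 @ Q2                    # zeroes row 4 of column 2

# A3 is tridiagonal, symmetric, and similar to A (same eigenvalues):
assert np.allclose(np.triu(A3, 2), 0) and np.allclose(A3, A3.T)
assert np.allclose(np.linalg.eigvalsh(A3), np.linalg.eigvalsh(A))
print(np.round(A3, 4))
```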


Quantum computation

As unitary matrices are useful in quantum computation, and Householder transformations are unitary, they are very useful in quantum computing. One of the central algorithms where they are useful is Grover's algorithm, in which we are trying to solve for a representation of an oracle function represented by what turns out to be a Householder transformation:
: \begin{cases} U_\omega |x\rangle = -|x\rangle & \text{for } x = \omega \text{, that is, } f(x) = 1, \\ U_\omega |x\rangle = |x\rangle & \text{for } x \ne \omega \text{, that is, } f(x) = 0. \end{cases}
(Here |x\rangle is part of the bra–ket notation and is analogous to the \vec x we were using previously.) This is done via an algorithm that iterates via the oracle function U_\omega and another operator U_s, known as the ''Grover diffusion operator'', defined by
: |s\rangle = \frac{1}{\sqrt{N}} \sum_{x=0}^{N-1} |x\rangle
and
: U_s = 2|s\rangle\langle s| - I.
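In matrix form the oracle is exactly the Householder matrix I - 2|\omega\rangle\langle\omega| built from the basis vector |\omega\rangle, while U_s is the negative of a Householder matrix. A small NumPy sketch of Grover iterations (the state-space size and marked index are our illustrative choices):

```python
import numpy as np

N, omega = 8, 3                      # 3-qubit state space, marked item |3>
e_omega = np.zeros(N)
e_omega[omega] = 1.0
U_omega = np.eye(N) - 2.0 * np.outer(e_omega, e_omega)   # oracle = Householder matrix

s = np.full(N, 1.0 / np.sqrt(N))     # uniform superposition |s>
U_s = 2.0 * np.outer(s, s) - np.eye(N)                   # diffusion = -(Householder)

# Each Grover iteration amplifies the amplitude of |omega>:
state = s.copy()
for _ in range(2):                   # about (pi/4) sqrt(8) ~ 2 iterations
    state = U_s @ (U_omega @ state)
print(np.round(state ** 2, 3))       # probability concentrated on index 3
```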


Computational and theoretical relationship to other unitary transformations

The Householder transformation is a reflection about a hyperplane with unit normal vector \vec v, as stated earlier. An N-by-N unitary transformation U satisfies UU^* = I. Taking the determinant (N-th power of the geometric mean) and trace (proportional to the arithmetic mean) of a unitary matrix reveals that its eigenvalues \lambda_i have unit modulus. This can be seen directly and swiftly:
: \frac{\operatorname{trace}\left(UU^*\right)}{N} = \frac{\sum_{j=1}^N \left|\lambda_j\right|^2}{N} = 1, \qquad \operatorname{det}\left(UU^*\right) = \prod_{j=1}^N \left|\lambda_j\right|^2 = 1.
Since the arithmetic and geometric means are equal if the variables are constant (see the inequality of arithmetic and geometric means), we establish the claim of unit modulus.

For the case of real-valued unitary matrices we obtain orthogonal matrices, UU^\textsf{T} = I. It follows rather readily (see Orthogonal matrix) that any orthogonal matrix can be decomposed into a product of 2-by-2 rotations, called Givens rotations, and Householder reflections. This is appealing intuitively, since multiplication of a vector by an orthogonal matrix preserves the length of that vector, and rotations and reflections exhaust the set of (real-valued) geometric operations that render a vector's length invariant.

The Householder transformation was shown to have a one-to-one relationship with the canonical coset decomposition of unitary matrices defined in group theory, which can be used to parametrize unitary operators in a very efficient manner.

Finally, we note that a single Householder transform, unlike a solitary Givens transform, can act on all columns of a matrix, and as such exhibits the lowest computational cost for QR decomposition and tridiagonalization. The penalty for this "computational optimality" is, of course, that Householder operations cannot be as deeply or efficiently parallelized. As such, Householder is preferred for dense matrices on sequential machines, while Givens is preferred for sparse matrices and/or parallel machines.


See also

* Block reflector
* Givens rotation
* Jacobi rotation

