
In linear algebra, an orthogonal transformation is a linear transformation ''T'' : ''V'' → ''V'' on a real inner product space ''V'' that preserves the inner product. That is, for each pair of elements u, v of ''V'', we have
: \langle u,v \rangle = \langle Tu,Tv \rangle \, .
Since the lengths of vectors and the angles between them are defined through the inner product, orthogonal transformations preserve lengths of vectors and angles between them. In particular, orthogonal transformations map orthonormal bases to orthonormal bases.

Orthogonal transformations are injective: if Tv = 0, then 0 = \langle Tv,Tv \rangle = \langle v,v \rangle, hence v = 0, so the kernel of T is trivial.

Orthogonal transformations in two- or three-dimensional
Euclidean space are rigid rotations, reflections, or combinations of a rotation and a reflection (also known as improper rotations). Reflections are transformations that reverse direction front to back, perpendicular to the mirror plane, as real-world mirrors do. The matrices corresponding to proper rotations (without reflection) have determinant +1; transformations with a reflection are represented by matrices with determinant −1. This allows the concepts of rotation and reflection to be generalized to higher dimensions.

In finite-dimensional spaces, the matrix representation (with respect to an orthonormal basis) of an orthogonal transformation is an orthogonal matrix. Its rows are mutually orthogonal vectors with unit norm, so the rows constitute an orthonormal basis of ''V''; the columns of the matrix form another orthonormal basis of ''V''. If an orthogonal transformation is invertible (which is always the case when ''V'' is finite-dimensional), then its inverse is another orthogonal transformation, and its matrix representation is the transpose of the matrix representation of the original transformation.
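These properties are easy to verify numerically. The following sketch (pure Python, not from the original article; the helper names `rotation`, `mat_vec`, and `dot` are illustrative) checks that a 2×2 rotation matrix preserves the standard inner product and that its transpose times itself gives the identity:

```python
import math

def rotation(theta):
    # 2x2 rotation matrix, stored as a list of rows
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def mat_vec(M, v):
    # matrix-vector product
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    # standard Euclidean inner product
    return sum(a * b for a, b in zip(u, v))

T = rotation(math.pi / 3)
u, v = [1.0, 2.0], [-3.0, 0.5]

# <u, v> = <Tu, Tv>: the inner product is preserved
assert math.isclose(dot(u, v), dot(mat_vec(T, u), mat_vec(T, v)))

# rows and columns are orthonormal: T^T T = I, so the transpose is the inverse
Tt = [list(row) for row in zip(*T)]  # transpose
P = [[sum(Tt[i][k] * T[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]
assert all(math.isclose(P[i][j], float(i == j), abs_tol=1e-12)
           for i in range(2) for j in range(2))
```

Because lengths are inner products of vectors with themselves, the same check confirms that the transformation preserves norms.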


Examples

Consider the inner-product space (\mathbb{R}^2, \langle\cdot,\cdot\rangle) with the standard Euclidean inner product and standard basis. Then the matrix transformation
: T = \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix} : \mathbb{R}^2 \to \mathbb{R}^2
is orthogonal. To see this, consider
: Te_1 = \begin{pmatrix} \cos(\theta) \\ \sin(\theta) \end{pmatrix}, \qquad Te_2 = \begin{pmatrix} -\sin(\theta) \\ \cos(\theta) \end{pmatrix}
Then
: \begin{align} \langle Te_1,Te_1 \rangle &= \begin{pmatrix} \cos(\theta) & \sin(\theta) \end{pmatrix} \begin{pmatrix} \cos(\theta) \\ \sin(\theta) \end{pmatrix} = \cos^2(\theta) + \sin^2(\theta) = 1 \\ \langle Te_1,Te_2 \rangle &= \begin{pmatrix} \cos(\theta) & \sin(\theta) \end{pmatrix} \begin{pmatrix} -\sin(\theta) \\ \cos(\theta) \end{pmatrix} = \sin(\theta)\cos(\theta) - \sin(\theta)\cos(\theta) = 0 \\ \langle Te_2,Te_2 \rangle &= \begin{pmatrix} -\sin(\theta) & \cos(\theta) \end{pmatrix} \begin{pmatrix} -\sin(\theta) \\ \cos(\theta) \end{pmatrix} = \sin^2(\theta) + \cos^2(\theta) = 1 \end{align}
The previous example can be extended to construct all orthogonal transformations. For example, the following matrices define orthogonal transformations on (\mathbb{R}^3, \langle\cdot,\cdot\rangle):
: \begin{pmatrix} \cos(\theta) & -\sin(\theta) & 0 \\ \sin(\theta) & \cos(\theta) & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad \begin{pmatrix} \cos(\theta) & 0 & -\sin(\theta) \\ 0 & 1 & 0 \\ \sin(\theta) & 0 & \cos(\theta) \end{pmatrix}, \quad \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos(\theta) & -\sin(\theta) \\ 0 & \sin(\theta) & \cos(\theta) \end{pmatrix}
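Products of orthogonal transformations are again orthogonal, so the three block rotations above can be composed to build further examples. A small pure-Python sketch (the helper names `rot_z`, `rot_y`, `mat_mul`, and `det3` are illustrative, not a standard API) checks that a composition of two of these matrices has orthonormal columns and determinant +1, i.e. is a proper rotation:

```python
import math

def rot_z(t):
    # rotation about the z-axis (first matrix above)
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_y(t):
    # rotation about the y-axis (second matrix above)
    c, s = math.cos(t), math.sin(t)
    return [[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]]

def mat_mul(A, B):
    # 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def det3(M):
    # determinant by cofactor expansion along the first row
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

R = mat_mul(rot_z(0.4), rot_y(1.1))  # composing two rotations

# the columns of R are orthonormal, so R is an orthogonal matrix
col = lambda j: [R[i][j] for i in range(3)]
for a in range(3):
    for b in range(3):
        inner = sum(x * y for x, y in zip(col(a), col(b)))
        assert math.isclose(inner, 1.0 if a == b else 0.0, abs_tol=1e-12)

# determinant +1: a proper rotation, with no reflection component
assert math.isclose(det3(R), 1.0)
```

Replacing one factor by a reflection (for example, negating one row of `rot_z(t)`) would flip the determinant to −1 while leaving the columns orthonormal.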


See also

* Improper rotation
* Linear transformation
* Orthogonal matrix
* Unitary transformation


References

{{Reflist}}

Linear algebra