In mathematics, particularly linear algebra, an orthonormal basis for an inner product space $V$ with finite dimension is a basis for $V$ whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other. For example, the standard basis for a Euclidean space $\mathbb{R}^n$ is an orthonormal basis, where the relevant inner product is the dot product of vectors. The image of the standard basis under a rotation or reflection (or any orthogonal transformation) is also orthonormal, and every orthonormal basis for $\mathbb{R}^n$ arises in this fashion.
For a general inner product space $V,$ an orthonormal basis can be used to define normalized orthogonal coordinates on $V.$ Under these coordinates, the inner product becomes a dot product of vectors. Thus the presence of an orthonormal basis reduces the study of a finite-dimensional inner product space to the study of $\mathbb{R}^n$ under the dot product. Every finite-dimensional inner product space has an orthonormal basis, which may be obtained from an arbitrary basis using the Gram–Schmidt process.
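The Gram–Schmidt process can be sketched in a few lines; the following is a minimal NumPy illustration (the classical variant, not the numerically preferred modified variant):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors with respect to
    the standard dot product (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in basis:
            w = w - np.dot(u, w) * u  # remove the component along u
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

# Orthonormalize a non-orthogonal basis of R^3.
B = gram_schmidt([[1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])
# The Gram matrix of an orthonormal set is the identity.
print(np.allclose(B @ B.T, np.eye(3)))  # True
```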
In functional analysis, the concept of an orthonormal basis can be generalized to arbitrary (infinite-dimensional) inner product spaces. Given a pre-Hilbert space $H,$ an ''orthonormal basis'' for $H$ is an orthonormal set of vectors with the property that every vector in $H$ can be written as an infinite linear combination of the vectors in the basis. In this case, the orthonormal basis is sometimes called a Hilbert basis for $H.$ Note that an orthonormal basis in this sense is not generally a Hamel basis, since infinite linear combinations are required. Specifically, the linear span of the basis must be dense in $H,$ but it may not be the entire space.
If we go on to Hilbert spaces, a non-orthonormal set of vectors having the same linear span as an orthonormal basis may not be a basis at all. For instance, any square-integrable function on the interval $[-1,1]$ can be expressed (almost everywhere) as an infinite sum of Legendre polynomials (an orthonormal basis), but not necessarily as an infinite sum of the monomials $x^n.$
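This expansion can be checked numerically. The sketch below (assuming NumPy) expands $f(x) = |x|$ in Legendre polynomials on $[-1,1]$, normalizing the classical $P_n$ (whose squared norm is $2/(2n+1)$) so they form an orthonormal set, and shows the $L^2$ error of the partial sums shrinking:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

def trapezoid(y, x):
    """Trapezoid-rule approximation of the integral of y over x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

x = np.linspace(-1.0, 1.0, 20001)
f = np.abs(x)  # square-integrable, but not a polynomial

def partial_sum(N):
    """Sum of the first N+1 terms of the Legendre expansion of f."""
    approx = np.zeros_like(x)
    for n in range(N + 1):
        # Normalize: P_n has squared norm 2/(2n+1) on [-1, 1].
        Pn = Legendre.basis(n)(x) / np.sqrt(2.0 / (2 * n + 1))
        approx += trapezoid(f * Pn, x) * Pn
    return approx

err10 = np.sqrt(trapezoid((f - partial_sum(10)) ** 2, x))
err40 = np.sqrt(trapezoid((f - partial_sum(40)) ** 2, x))
print(err40 < err10)  # True: the L^2 error decreases with more terms
```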
A different generalisation is to pseudo-inner product spaces, finite-dimensional vector spaces $M$ equipped with a non-degenerate symmetric bilinear form known as the metric tensor. In an orthonormal basis for such a space, the metric takes the form $\text{diag}(+1, \cdots, +1, -1, \cdots, -1)$ with $p$ positive ones and $q$ negative ones.
Examples

* For $\mathbb{R}^3$, the set of vectors $\left\{ e_1 = (1,0,0), e_2 = (0,1,0), e_3 = (0,0,1) \right\}$ is called the standard basis and forms an orthonormal basis of $\mathbb{R}^3$ with respect to the standard dot product. Note that both the standard basis and standard dot product rely on viewing $\mathbb{R}^3$ as the Cartesian product $\mathbb{R} \times \mathbb{R} \times \mathbb{R}$.
*: Proof: A straightforward computation shows that the inner products of these vectors equal zero, $\left\langle e_1, e_2 \right\rangle = \left\langle e_1, e_3 \right\rangle = \left\langle e_2, e_3 \right\rangle = 0,$ and that each of their magnitudes equals one, $\left\| e_1 \right\| = \left\| e_2 \right\| = \left\| e_3 \right\| = 1.$ This means that $\left\{ e_1, e_2, e_3 \right\}$ is an orthonormal set. All vectors $(x, y, z) \in \mathbb{R}^3$ can be expressed as a sum of the basis vectors scaled, $$(x,y,z) = x e_1 + y e_2 + z e_3,$$ so $\left\{ e_1, e_2, e_3 \right\}$ spans $\mathbb{R}^3$ and hence must be a basis. It may also be shown that the standard basis rotated about an axis through the origin or reflected in a plane through the origin also forms an orthonormal basis of $\mathbb{R}^3$.
* For $\mathbb{R}^n$, the standard basis and inner product are similarly defined. Any other orthonormal basis is related to the standard basis by an orthogonal transformation in the group $\text{O}(n)$.
* For pseudo-Euclidean space $\mathbb{R}^{p,q}$, an orthogonal basis $\{e_\mu\}$ with metric $\eta$ instead satisfies $\eta(e_\mu, e_\nu) = 0$ if $\mu \neq \nu$, $\eta(e_\mu, e_\mu) = +1$ if $1 \leq \mu \leq p$, and $\eta(e_\mu, e_\mu) = -1$ if $p+1 \leq \mu \leq p+q$. Any two orthonormal bases are related by a pseudo-orthogonal transformation. In the case $(p,q) = (1,3)$, these are Lorentz transformations.
* The set $\left\{ f_n : n \in \mathbb{Z} \right\}$ with $f_n(x) = \exp(2 \pi i n x),$ where $\exp$ denotes the exponential function, forms an orthonormal basis of the space of functions with finite Lebesgue integrals, $L^2([0,1]),$ with respect to the 2-norm. This is fundamental to the study of Fourier series.
* The set $\left\{ e_b : b \in B \right\}$ with $e_b(c) = 1$ if $b = c$ and $e_b(c) = 0$ otherwise forms an orthonormal basis of $\ell^2(B).$
* Eigenfunctions of a Sturm–Liouville eigenproblem.
* The column vectors of an orthogonal matrix form an orthonormal set.
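Several of these examples can be verified numerically. The sketch below (assuming NumPy) checks that the columns of an orthogonal matrix have the identity as their Gram matrix, and that the Fourier exponentials $f_n(x) = \exp(2\pi i n x)$ are orthonormal on $[0,1]$:

```python
import numpy as np

rng = np.random.default_rng(0)
# QR factorization of a random matrix yields an orthogonal Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Gram matrix of the columns: entry (i, j) is <q_i, q_j>.
G = Q.T @ Q
print(np.allclose(G, np.eye(4)))  # True: the columns are orthonormal

# Check orthonormality of f_n(x) = exp(2*pi*i*n*x) on [0, 1].
x = np.linspace(0.0, 1.0, 4096, endpoint=False)
f = lambda n: np.exp(2j * np.pi * n * x)
inner = lambda g, h: np.mean(np.conj(g) * h)  # Riemann sum for the L^2 inner product
print(np.isclose(inner(f(2), f(2)), 1.0))  # unit norm
print(abs(inner(f(2), f(3))) < 1e-10)      # orthogonal
```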
Basic formula

If $B$ is an orthogonal basis of $H,$ then every element $x \in H$ may be written as $$x = \sum_{b \in B} \frac{\langle x, b \rangle}{\|b\|^2} b.$$ When $B$ is orthonormal, this simplifies to $$x = \sum_{b \in B} \langle b, x \rangle b$$ and the square of the norm of $x$ can be given by
$$\|x\|^2 = \sum_{b \in B} |\langle x, b \rangle|^2.$$
Even if $B$ is uncountable, only countably many terms in this sum will be non-zero, and the expression is therefore well-defined. This sum is also called the ''Fourier expansion'' of $x,$ and the formula is usually known as Parseval's identity.
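In the finite-dimensional case both formulas can be demonstrated directly; the following sketch (assuming NumPy) expands a vector in a random orthonormal basis of $\mathbb{R}^5$ and checks the expansion and Parseval's identity:

```python
import numpy as np

rng = np.random.default_rng(1)
# A random orthonormal basis of R^5: the rows of `basis`.
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
basis = Q.T

x = rng.standard_normal(5)
coeffs = basis @ x          # the coefficients <b, x>, one per basis vector b
x_rebuilt = coeffs @ basis  # x = sum over b of <b, x> b

print(np.allclose(x_rebuilt, x))                    # Fourier expansion recovers x
print(np.isclose(np.sum(coeffs**2), np.dot(x, x)))  # Parseval's identity
```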
If $B$ is an orthonormal basis of $H,$ then $H$ is ''isomorphic'' to $\ell^2(B)$ in the following sense: there exists a bijective linear map $\Phi : H \to \ell^2(B)$ such that
$$\langle \Phi(x), \Phi(y) \rangle = \langle x, y \rangle \quad \text{for all } x, y \in H.$$
Incomplete orthogonal sets

Given a Hilbert space $H$ and a set $S$ of mutually orthogonal vectors in $H,$ we can take the smallest closed linear subspace $V$ of $H$ containing $S.$ Then $S$ will be an orthogonal basis of $V,$ which may of course be smaller than $H$ itself, being an ''incomplete'' orthogonal set, or be $H,$ when it is a ''complete'' orthogonal set.

Existence

Using Zorn's lemma and the Gram–Schmidt process (or more simply well-ordering and transfinite recursion), one can show that ''every'' Hilbert space admits an orthonormal basis; furthermore, any two orthonormal bases of the same space have the same cardinality (this can be proven in a manner akin to that of the proof of the usual dimension theorem for vector spaces, with separate cases depending on whether the larger basis candidate is countable or not). A Hilbert space is separable if and only if it admits a countable orthonormal basis. (One can prove this last statement without using the axiom of choice.)
Choice of basis as a choice of isomorphism

For concreteness we discuss orthonormal bases for a real, $n$-dimensional vector space $V$ with a positive definite symmetric bilinear form $\phi = \langle \cdot, \cdot \rangle$. One way to view an orthonormal basis with respect to $\phi$ is as a set of vectors $\mathcal{B} = \{e_i\}$, which allow us to write $v = v^i e_i$ for $v \in V$, with $v^i \in \mathbb{R}$ or $(v^i) \in \mathbb{R}^n$. With respect to this basis, the components of $\phi$ are particularly simple: $\phi(e_i, e_j) = \delta_{ij}.$ We can now view the basis as a map $\psi_\mathcal{B} : V \rightarrow \mathbb{R}^n$ which is an isomorphism of inner product spaces: to make this more explicit we can write
:$\psi_\mathcal{B} : (V, \phi) \rightarrow (\mathbb{R}^n, \delta_{ij}).$
Explicitly we can write $(\psi_\mathcal{B}(v))^i = e^i(v) = \phi(e_i, v),$ where $e^i$ is the dual basis element to $e_i$. The inverse is a component map
:$C_\mathcal{B} : \mathbb{R}^n \rightarrow V, \quad (v^i) \mapsto \sum_{i=1}^n v^i e_i.$
These definitions make it manifest that there is a bijection
:$\{\text{orthonormal bases for } (V, \phi)\} \leftrightarrow \{\text{isomorphisms } (\mathbb{R}^n, \delta_{ij}) \rightarrow (V, \phi)\}.$
The space of isomorphisms admits actions of orthogonal groups at either the $V$ side or the $\mathbb{R}^n$ side. For concreteness we fix the isomorphisms to point in the direction $\mathbb{R}^n \rightarrow V$, and consider the space of such maps, $\text{Iso}(\mathbb{R}^n \rightarrow V)$.
This space admits a left action by the group of isometries of $V$, that is, $R \in \text{Iso}(V)$ such that $\phi(\cdot, \cdot) = \phi(R\cdot, R\cdot)$, with the action given by composition: $R * C = R \circ C.$ This space also admits a right action by the group of isometries of $\mathbb{R}^n$, that is, $R_{\mathbb{R}^n} \in \text{O}(n) \subset \text{Mat}_{n \times n}(\mathbb{R})$, with the action again given by composition: $C * R_{\mathbb{R}^n} = C \circ R_{\mathbb{R}^n}$.

As a principal homogeneous space

The set of orthonormal bases for $\mathbb{R}^n$ with the standard inner product is a principal homogeneous space or G-torsor for the orthogonal group $G = \text{O}(n),$ and is called the Stiefel manifold $V_n(\mathbb{R}^n)$ of orthonormal $n$-frames.
In other words, the space of orthonormal bases is like the orthogonal group, but without a choice of base point: given the space of orthonormal bases, there is no natural choice of orthonormal basis, but once one is given one, there is a one-to-one correspondence between bases and the orthogonal group.
Concretely, a linear map is determined by where it sends a given basis: just as an invertible map can take any basis to any other basis, an orthogonal map can take any ''orthogonal'' basis to any other ''orthogonal'' basis.
The other Stiefel manifolds $V_k(\mathbb{R}^n)$ for $k < n$ of ''incomplete'' orthonormal bases (orthonormal $k$-frames) are still homogeneous spaces for the orthogonal group, but not ''principal'' homogeneous spaces: any $k$-frame can be taken to any other $k$-frame by an orthogonal map, but this map is not uniquely determined.
* The set of orthonormal bases for $\mathbb{R}^{p,q}$ is a G-torsor for $G = \text{O}(p,q)$.
* The set of orthonormal bases for $\mathbb{C}^n$ is a G-torsor for $G = \text{U}(n)$.
* The set of orthonormal bases for $\mathbb{C}^{p,q}$ is a G-torsor for $G = \text{U}(p,q)$.
* The set of right-handed orthonormal bases for $\mathbb{R}^n$ is a G-torsor for $G = \text{SO}(n)$.
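The torsor structure can be made concrete: given two orthonormal bases of $\mathbb{R}^n$, the linear map carrying one onto the other is orthogonal and uniquely determined. A sketch, assuming NumPy, with each basis stored as the rows of an orthogonal matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
# Two orthonormal bases of R^4: the rows of A and the rows of B.
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))
B, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# The unique linear map R with R a_i = b_i for every basis vector:
# stacking the a_i as columns of A.T gives R @ A.T = B.T, and since
# A is orthogonal, (A.T)^{-1} = A, so R = B.T @ A.
R = B.T @ A

print(np.allclose(R @ A.T, B.T))        # R carries basis A onto basis B
print(np.allclose(R.T @ R, np.eye(4)))  # and R is orthogonal
```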
See also

References

External links

* This Stack Exchange Post discusses why the set of Dirac delta functions is not a basis of $L^2([0,1])$.