Normal Matrix
In mathematics, a complex square matrix A is normal if it commutes with its conjugate transpose A^*:

A \text{ is normal} \iff A^*A = AA^*.

The concept of normal matrices can be extended to normal operators on infinite-dimensional normed spaces and to normal elements in C*-algebras. As in the matrix case, normality means commutativity is preserved, to the extent possible, in the noncommutative setting. This makes normal operators, and normal elements of C*-algebras, more amenable to analysis. The spectral theorem states that a matrix is normal if and only if it is unitarily similar to a diagonal matrix, and therefore any matrix satisfying the equation A^*A = AA^* is diagonalizable. (The converse does not hold, because diagonalizable matrices may have non-orthogonal eigenspaces.) Thus A = U D U^* and A^* = U D^* U^*, where D is a diagonal matrix whose diagonal values are in general complex. The left and right singular vectors in the singular value decomposition of a normal matrix A = U D V^* differ ...
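As a concrete check of the defining identity, here is a minimal NumPy sketch; the helper name is_normal and the tolerance are illustrative choices for this example, not a standard API.

```python
import numpy as np

# A minimal sketch: testing normality numerically via A*A == AA*.
def is_normal(A, tol=1e-10):
    """Return True if A commutes with its conjugate transpose."""
    A = np.asarray(A, dtype=complex)
    return np.allclose(A.conj().T @ A, A @ A.conj().T, atol=tol)

# A rotation matrix is unitary, hence normal, though not symmetric.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(is_normal(R))   # True

# A generic upper triangular matrix is typically not normal.
T = np.array([[1.0, 2.0],
              [0.0, 3.0]])
print(is_normal(T))   # False
```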
Complex Number
In mathematics, a complex number is an element of a number system that extends the real numbers with a specific element denoted i, called the imaginary unit and satisfying the equation i^2 = -1; every complex number can be expressed in the form a + bi, where a and b are real numbers. Because no real number satisfies the above equation, i was called an imaginary number by René Descartes. For the complex number a + bi, a is called the real part, and b is called the imaginary part. The set of complex numbers is denoted by either of the symbols \mathbb C or C. Despite the historical nomenclature, "imaginary" complex numbers have a mathematical existence as firm as that of the real numbers, and they are fundamental tools in the scientific description of the natural world. Complex numbers allow solutions to all polynomial equations, even those that have no solutions in real numbers. More precisely, the fundamental theorem of algebra asserts that every non-constant polynomial equation with real or complex coefficients has a solution which is a complex number.
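A short illustration using Python's built-in complex type, just to make the definitions concrete:

```python
import cmath

z = 3 + 4j             # a + bi with a = 3, b = 4
print(z.real, z.imag)  # 3.0 4.0: the real and imaginary parts
print(1j ** 2)         # (-1+0j): the defining identity i^2 = -1

# x^2 + 1 = 0 has no real solution, but it has the complex roots +/- i:
print(cmath.sqrt(-1))  # 1j
```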
Symmetric Matrix
In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally, A \text{ is symmetric} \iff A = A^\mathsf{T}. Because equal matrices have equal dimensions, only square matrices can be symmetric. The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if a_{ij} denotes the entry in the ith row and jth column, then a_{ij} = a_{ji} for all indices i and j. Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative. In linear algebra, a real symmetric matrix represents a self-adjoint operator expressed in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one that has real-valued entries.
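A brief NumPy sketch of the two defining identities (transpose for a real symmetric matrix, conjugate transpose for a Hermitian one):

```python
import numpy as np

# Symmetric: equal to its transpose, entries mirrored across the diagonal.
A = np.array([[1, 7, 3],
              [7, 4, 5],
              [3, 5, 6]])
print(np.allclose(A, A.T))          # True

# Hermitian: equal to its conjugate transpose.
H = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
print(np.allclose(H, H.conj().T))   # True
```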
Simultaneously Triangularizable
In mathematics, a triangular matrix is a special kind of square matrix. A square matrix is called lower triangular if all the entries ''above'' the main diagonal are zero. Similarly, a square matrix is called upper triangular if all the entries ''below'' the main diagonal are zero. Because matrix equations with triangular matrices are easier to solve, they are very important in numerical analysis. By the LU decomposition algorithm, an invertible matrix may be written as the product of a lower triangular matrix ''L'' and an upper triangular matrix ''U'' if and only if all its leading principal minors are non-zero.

Description

A matrix of the form

L = \begin{pmatrix} \ell_{1,1} & & & & 0 \\ \ell_{2,1} & \ell_{2,2} & & & \\ \ell_{3,1} & \ell_{3,2} & \ddots & & \\ \vdots & \vdots & \ddots & \ddots & \\ \ell_{n,1} & \ell_{n,2} & \ldots & \ell_{n,n-1} & \ell_{n,n} \end{pmatrix}

is called a lower triangular matrix ...
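To illustrate why triangular systems are easy to solve, here is a minimal sketch of forward substitution for L x = b; forward_substitution is a hypothetical helper name written for this example.

```python
import numpy as np

# Forward substitution: each x[i] follows from the already-computed
# x[0..i-1], so solving L x = b costs only O(n^2) operations.
def forward_substitution(L, b):
    n = len(b)
    x = np.zeros(n)
    for i in range(n):
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])
b = np.array([2.0, 5.0, 32.0])
x = forward_substitution(L, b)
print(np.allclose(L @ x, b))  # True
```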
Commuting Matrices
In linear algebra, two matrices A and B are said to commute if AB=BA, or equivalently if their commutator [A,B] = AB-BA is zero. Matrices A that commute with matrix B are called the commutant of matrix B (and vice versa). A set of matrices A_1, \ldots, A_k is said to commute if they commute pairwise, meaning that every pair of matrices in the set commutes.

Characterizations and properties

* Commuting matrices preserve each other's eigenspaces. As a consequence, commuting matrices over an algebraically closed field are simultaneously triangularizable; that is, there are bases over which they are both upper triangular. In other words, if A_1,\ldots,A_k commute, there exists a similarity matrix P such that P^{-1} A_i P is upper triangular for all i \in \{1,\ldots,k\}. The converse is not necessarily true, as the following counterexample shows:
*: \begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 3 \\ 0 & 3 \end{pmatrix} \ne \begin{pmatrix} 1 & 5 \\ 0 & 3 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix} ...
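A small NumPy sketch of the commutator test, reusing the counterexample above; the helper name commutator is our own choice for this example.

```python
import numpy as np

# The commutator [A, B] = AB - BA is zero exactly when A and B commute.
def commutator(A, B):
    return A @ B - B @ A

A = np.array([[1, 2], [0, 3]])
B = np.array([[1, 1], [0, 1]])
print(commutator(A, B))                       # [[0 -2], [0 0]]: A, B do not commute
print(np.allclose(commutator(A, A @ A), 0))   # True: A commutes with its own powers
```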
Simultaneously Diagonalizable
In linear algebra, a square matrix A is called diagonalizable or non-defective if it is similar to a diagonal matrix. That is, if there exists an invertible matrix P and a diagonal matrix D such that P^{-1}AP = D. This is equivalent to A = PDP^{-1}. (Such P and D are not unique.) This property exists for any linear map: for a finite-dimensional vector space V, a linear map T:V\to V is called diagonalizable if there exists an ordered basis of V consisting of eigenvectors of T. These definitions are equivalent: if T has a matrix representation A = PDP^{-1} as above, then the column vectors of P form a basis consisting of eigenvectors of T, and the diagonal entries of D are the corresponding eigenvalues of T; with respect to this eigenvector basis, T is represented by D. Diagonalization is the process of finding the above P and D, and it makes many subsequent computations easier. One can raise a diagonal matrix D to a power by simply raising the diagonal entries to that power. The determinant of a diagonal matrix is simply the product of its diagonal entries ...
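A minimal NumPy sketch of diagonalization and of the fast-power computation it enables:

```python
import numpy as np

# Diagonalize A as P D P^{-1}; columns of P are eigenvectors.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True

# A^5 via the diagonal form: raise only the diagonal entries to the power.
A5 = P @ np.diag(eigvals ** 5) @ np.linalg.inv(P)
print(np.allclose(A5, np.linalg.matrix_power(A, 5)))  # True
```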
Real Number
In mathematics, a real number is a number that can be used to measure a continuous one-dimensional quantity such as a duration or temperature. Here, ''continuous'' means that pairs of values can have arbitrarily small differences. Every real number can be almost uniquely represented by an infinite decimal expansion. The real numbers are fundamental in calculus (and in many other branches of mathematics), in particular by their role in the classical definitions of limits, continuity and derivatives. The set of real numbers, sometimes called "the reals", is traditionally denoted by a bold R, often using blackboard bold, \mathbb{R}. The adjective ''real'', used in the 17th century by René Descartes, distinguishes real numbers from imaginary numbers such as the square roots of -1. The real numbers include the rational numbers, such as the integers and fractions. The rest of the real numbers are called irrational numbers. Some irrational numbers (as well as all the rationals) are algebraic ...
Self-adjoint
In mathematics, an element of a *-algebra is called self-adjoint if it is the same as its adjoint (i.e. a = a^*).

Definition

Let \mathcal{A} be a *-algebra. An element a \in \mathcal{A} is called self-adjoint if a = a^*. The set of self-adjoint elements is referred to as \mathcal{A}_{sa}. A subset \mathcal{B} \subseteq \mathcal{A} that is closed under the involution *, i.e. \mathcal{B} = \mathcal{B}^*, is called self-adjoint. A special case of particular importance is the case where \mathcal{A} is a complete normed *-algebra that satisfies the C*-identity (\left\| a^*a \right\| = \left\| a \right\|^2 \ \forall a \in \mathcal{A}), which is called a C*-algebra. Especially in the older literature on *-algebras and C*-algebras, such elements are often called hermitian. Because of that, the notations \mathcal{A}_h, \mathcal{A}_H or H(\mathcal{A}) for the set of self-adjoint elements are also sometimes used, even in the more recent literature.

Examples

* Each positive element of a C*-algebra is self-adjoint.
* For each element a of a *-algebra, the elements aa^* and a^*a are self-adjoint ...
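A minimal sketch in the *-algebra of complex square matrices, where the involution is the conjugate transpose. The decomposition a = x + iy into self-adjoint x and y is a standard identity included here for illustration; it is not stated in the excerpt above.

```python
import numpy as np

star = lambda x: x.conj().T   # the involution: conjugate transpose

a = np.array([[1 + 2j, 3.0],
              [0.0, 4j]])

# a a* is always self-adjoint, since (a a*)* = a a*.
print(np.allclose(star(a @ star(a)), a @ star(a)))   # True

# Every element splits into self-adjoint "real" and "imaginary" parts:
x = (a + star(a)) / 2
y = (a - star(a)) / 2j
print(np.allclose(a, x + 1j * y))                    # True
print(np.allclose(x, star(x)), np.allclose(y, star(y)))  # True True
```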
Schur Decomposition
In the mathematical discipline of linear algebra, the Schur decomposition or Schur triangulation, named after Issai Schur, is a matrix decomposition. It allows one to write an arbitrary complex square matrix as unitarily similar to an upper triangular matrix whose diagonal elements are the eigenvalues of the original matrix.

Statement

The complex Schur decomposition reads as follows: if A is an n \times n square matrix with complex entries, then A can be expressed as

A = Q U Q^{-1}

for some unitary matrix ''Q'' (so that the inverse ''Q''−1 is also the conjugate transpose ''Q''* of ''Q''), and some upper triangular matrix ''U''. This is called a Schur form of ''A''. Since ''U'' is similar to ''A'', it has the same spectrum, and since it is triangular, its eigenvalues are the diagonal entries of ''U''. The Schur decomposition implies that there exists a nested sequence of ''A''-invariant subspaces \{0\} = V_0 \subset V_1 \subset \cdots \subset V_n = \mathbb{C}^n, and that there exists an ...
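A quick numerical sketch, assuming SciPy is available, computing a complex Schur form A = Q U Q^* with scipy.linalg.schur:

```python
import numpy as np
from scipy.linalg import schur

# Rotation by 90 degrees: real entries, but complex eigenvalues +/- i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
U, Q = schur(A, output='complex')   # U upper triangular, Q unitary
print(np.allclose(A, Q @ U @ Q.conj().T))  # True: A = Q U Q*
print(np.diag(U))                   # the eigenvalues of A on the diagonal
```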
Orthogonal
In mathematics, orthogonality is the generalization of the geometric notion of ''perpendicularity''. Although many authors use the two terms ''perpendicular'' and ''orthogonal'' interchangeably, the term ''perpendicular'' is more specifically used for lines and planes that intersect to form a right angle, whereas ''orthogonal'' is used in generalizations, such as ''orthogonal vectors'' or ''orthogonal curves''. ''Orthogonality'' is also used with various meanings that are often weakly related or not related at all with the mathematical meanings.

Etymology

The word comes from the Ancient Greek ''orthós'', meaning "upright", and ''gōnía'', meaning "angle". The Ancient Greek ''orthogṓnion'' and Classical Latin ''orthogonium'' originally denoted a rectangle. Later, they came to mean a right triangle. In the 12th century, the post-classical Latin word ''orthogonalis'' came to mean a right angle or something related to a right angle.

Optics

In optics, polarization ...
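As a one-line numerical illustration of orthogonal vectors, two vectors are orthogonal exactly when their dot product is zero:

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 5.0])
print(np.dot(u, v))   # 0.0, so u and v are orthogonal
```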
Eigenspace
In linear algebra, an eigenvector or characteristic vector is a vector that has its direction unchanged (or reversed) by a given linear transformation. More precisely, an eigenvector \mathbf v of a linear transformation T is scaled by a constant factor \lambda when the linear transformation is applied to it: T\mathbf v=\lambda \mathbf v. The corresponding eigenvalue, characteristic value, or characteristic root is the multiplying factor \lambda (possibly a negative or complex number). Geometrically, vectors are multi-dimensional quantities with magnitude and direction, often pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon which it acts. A linear transformation's eigenvectors are those vectors that are only stretched or shrunk, with neither rotation nor shear ...
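As an illustrative sketch, the eigenspace for an eigenvalue \lambda can be computed numerically as the null space of A - \lambda I; eigenspace_basis is a hypothetical helper written for this example, using the SVD to extract the null space.

```python
import numpy as np

# The eigenspace for lambda is the null space of (A - lambda I):
# right-singular vectors with near-zero singular values span it.
def eigenspace_basis(A, lam, tol=1e-10):
    n = A.shape[0]
    _, s, Vt = np.linalg.svd(A - lam * np.eye(n))
    return Vt[s < tol].conj().T   # columns form a basis of the eigenspace

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
E2 = eigenspace_basis(A, 2.0)
print(E2.shape[1])   # 2: eigenvalue 2 has a two-dimensional eigenspace
```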
Orthonormal Basis
In mathematics, particularly linear algebra, an orthonormal basis for an inner product space V with finite dimension is a basis for V whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other. For example, the standard basis for a Euclidean space \R^n is an orthonormal basis, where the relevant inner product is the dot product of vectors. The image of the standard basis under a rotation or reflection (or any orthogonal transformation) is also orthonormal, and every orthonormal basis for \R^n arises in this fashion. An orthonormal basis can be derived from an orthogonal basis via normalization. The choice of an origin and an orthonormal basis forms a coordinate frame known as an ''orthonormal frame''. For a general inner product space V, an orthonormal ...
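A minimal sketch of the Gram-Schmidt process, one standard way to obtain an orthonormal basis from an arbitrary basis; gram_schmidt is our own helper name for this example.

```python
import numpy as np

# Gram-Schmidt: subtract components along earlier basis vectors,
# then normalize, so the result is orthonormal.
def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, q) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

V = [np.array([3.0, 1.0]), np.array([2.0, 2.0])]
Q = gram_schmidt(V)
print(np.round(Q @ Q.T, 10))   # identity matrix: rows are orthonormal
```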
Eigenvector
In linear algebra, an eigenvector or characteristic vector is a vector that has its direction unchanged (or reversed) by a given linear transformation. More precisely, an eigenvector \mathbf v of a linear transformation T is scaled by a constant factor \lambda when the linear transformation is applied to it: T\mathbf v=\lambda \mathbf v. The corresponding eigenvalue, characteristic value, or characteristic root is the multiplying factor \lambda (possibly a negative or complex number). Geometrically, vectors are multi-dimensional quantities with magnitude and direction, often pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon which it acts. A linear transformation's eigenvectors are those vectors that are only stretched or shrunk, with neither rotation nor shear. The corresponding eigenvalue is the factor by which an eigenvector is stretched or shrunk. If the eigenvalue is negative, the eigenvector's direction is reversed.
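A short NumPy sketch verifying T\mathbf v = \lambda\mathbf v for computed eigenpairs, and showing a negative eigenvalue reversing an eigenvector's direction:

```python
import numpy as np

# Verify T v = lambda v for each eigenpair returned by np.linalg.eig.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(T)
for lam, v in zip(eigvals, eigvecs.T):   # eigenvectors are the columns
    print(np.allclose(T @ v, lam * v))   # True for each pair

# A negative eigenvalue keeps the eigenvector's line but flips its direction:
M = np.array([[-3.0, 0.0],
              [0.0, 1.0]])
v = np.array([1.0, 0.0])
print(M @ v)   # [-3.  0.]: same line as v, opposite direction
```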