Generalized Eigenvector
In linear algebra, a generalized eigenvector of an n\times n matrix A is a vector which satisfies certain criteria which are more relaxed than those for an (ordinary) eigenvector. Let V be an n-dimensional vector space; let \phi be a linear map in L(V), the set of all linear maps from V into itself; and let A be the matrix representation of \phi with respect to some ordered basis. There may not always exist a full set of n linearly independent eigenvectors of A that form a complete basis for V. That is, the matrix A may not be diagonalizable. This happens when the algebraic multiplicity of at least one eigenvalue \lambda_i is greater than its geometric multiplicity (the nullity of the matrix (A-\lambda_i I), or the dimension of its nullspace). In this case, \lambda_i is called a defective eigenvalue and A is called a defective matrix. A generalized eigenvector x_i corresponding to \lambda_i, together with the matrix (A-\lambda_i I), generates a Jordan chain of linearly independent generalized eigenvectors ...
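
A minimal numerical sketch (the matrix and vectors below are an assumed illustration, not taken from the excerpt above): the matrix A = [[1, 1], [0, 1]] is defective, with the single eigenvalue 1 of algebraic multiplicity 2 but geometric multiplicity 1, so a generalized eigenvector x_2 satisfying (A - \lambda I)x_2 = x_1 is needed to complete a basis.

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])        # defective: only one independent ordinary eigenvector
    lam = 1.0                         # the sole eigenvalue of A
    N = A - lam * np.eye(2)           # (A - lambda*I)

    x1 = np.array([1.0, 0.0])                     # ordinary eigenvector: N @ x1 = 0
    x2 = np.linalg.lstsq(N, x1, rcond=None)[0]    # generalized eigenvector: N @ x2 = x1

    print(np.allclose(N @ x1, 0))                             # True: the chain ends at the zero vector
    print(np.allclose(N @ x2, x1))                            # True: x2 maps to x1 under (A - lam I)
    print(np.allclose(np.linalg.matrix_power(N, 2) @ x2, 0))  # True: (A - lam I)^2 x2 = 0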

Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as a_1x_1+\cdots +a_nx_n=b, linear maps such as (x_1, \ldots, x_n) \mapsto a_1x_1+\cdots +a_nx_n, and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to spaces of functions. Linear algebra is also used in most sciences and fields of engineering, because it allows modeling many natural phenomena, and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, it is often used for dealing with first-order approximations, using the fact that the differential of a multivariate function at a point is the linear map that best approximates the function near that point ...
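
As a small, assumed illustration (the numbers are not from the text): a system of such linear equations, one equation per row, can be written as Ax = b and solved with standard linear-algebra routines.

    import numpy as np

    A = np.array([[2.0, 1.0],      # 2*x1 + 1*x2 = 5
                  [1.0, 3.0]])     # 1*x1 + 3*x2 = 10
    b = np.array([5.0, 10.0])

    x = np.linalg.solve(A, b)
    print(x)                       # [1. 3.]
    print(np.allclose(A @ x, b))   # True: the solution satisfies every equation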




Invariant Subspace
In mathematics, an invariant subspace of a linear mapping ''T'' : ''V'' → ''V'', i.e. from some vector space ''V'' to itself, is a subspace ''W'' of ''V'' that is preserved by ''T''; that is, ''T''(''W'') ⊆ ''W''. General description: Consider a linear mapping T : V \to V. An invariant subspace W of T has the property that all vectors \mathbf{v} \in W are transformed by T into vectors also contained in W. This can be stated as \mathbf{v} \in W \implies T(\mathbf{v}) \in W. Trivial examples of invariant subspaces: * \mathbb{R}^n, since T maps every vector in \mathbb{R}^n into \mathbb{R}^n. * \{\mathbf{0}\}, since a linear map has to map 0 \mapsto 0. 1-dimensional invariant subspace ''U'': A basis of a 1-dimensional space is simply a non-zero vector \mathbf{b}. Consequently, any vector \mathbf{v} \in U can be represented as \lambda \mathbf{b} where \lambda is a scalar. If we represent T by a matrix A then, for U to be an invariant subspace, it must satisfy \forall \mathbf{v} \in U \; \exists \alpha \in ...
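
A brief numerical sketch (the matrix and eigenvector are assumed for illustration): an eigenvector \mathbf{b} of A spans a 1-dimensional invariant subspace U, since A sends every multiple of \mathbf{b} to another multiple of \mathbf{b}.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    b = np.array([1.0, 1.0])        # eigenvector of A for eigenvalue 3

    for lam in (-2.0, 0.5, 7.0):    # arbitrary vectors lam*b in U = span{b}
        v = lam * b
        w = A @ v                   # image of v under A
        # w stays in U: it is again a multiple of b (here 3*lam*b)
        print(np.allclose(w, 3.0 * lam * b))   # True for each lam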


Characteristic Polynomial
In linear algebra, the characteristic polynomial of a square matrix is a polynomial which is invariant under matrix similarity and has the eigenvalues as roots. It has the determinant and the trace of the matrix among its coefficients. The characteristic polynomial of an endomorphism of a finite-dimensional vector space is the characteristic polynomial of the matrix of that endomorphism over any basis (that is, the characteristic polynomial does not depend on the choice of a basis). The characteristic equation, also known as the determinantal equation, is the equation obtained by equating the characteristic polynomial to zero. In spectral graph theory, the characteristic polynomial of a graph is the characteristic polynomial of its adjacency matrix. Motivation: In linear algebra, eigenvalues and eigenvectors play a fundamental role, since, given a linear transformation, an eigenvector is a vector whose direction is not changed by the transformation, and the corresponding ...
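
A quick check with an arbitrarily chosen matrix (an assumed example, not from the text): the eigenvalues are roots of the characteristic polynomial, and for a 2 × 2 matrix the coefficients come from the trace and the determinant, since det(tI - A) = t^2 - tr(A) t + det(A).

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

    coeffs = np.poly(A)                  # coefficients of det(tI - A), highest degree first
    eigs = np.linalg.eigvals(A)

    print(np.allclose(np.polyval(coeffs, eigs), 0))   # True: eigenvalues are roots
    print(np.isclose(coeffs[1], -np.trace(A)))        # True: coefficient of t is -trace(A)
    print(np.isclose(coeffs[2], np.linalg.det(A)))    # True: constant term is det(A)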



Field (mathematics)
In mathematics, a field is a set on which addition, subtraction, multiplication, and division are defined and behave as the corresponding operations on rational and real numbers do. A field is thus a fundamental algebraic structure which is widely used in algebra, number theory, and many other areas of mathematics. The best known fields are the field of rational numbers, the field of real numbers and the field of complex numbers. Many other fields, such as fields of rational functions, algebraic function fields, algebraic number fields, and ''p''-adic fields are commonly used and studied in mathematics, particularly in number theory and algebraic geometry. Most cryptographic protocols rely on finite fields, i.e., fields with finitely many elements. The relation of two fields is expressed by the notion of a field extension. Galois theory, initiated by Évariste Galois in the 1830s, is devoted to understanding the symmetries of field extensions. Among other res ...
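
A tiny illustration (values assumed): the integers modulo a prime form a finite field, so all four operations behave as they do for rational or real numbers; in particular every nonzero element has a multiplicative inverse, which is what makes division possible.

    p = 7                       # a prime, so the integers mod p form the finite field GF(7)
    a, b = 3, 5

    print((a + b) % p, (a - b) % p, (a * b) % p)   # 1 5 1
    b_inv = pow(b, -1, p)                          # multiplicative inverse of b (Python 3.8+)
    print(b_inv, (b * b_inv) % p)                  # 3 1
    print((a * b_inv) % p)                         # 2, i.e. "a / b" in GF(7)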


Generalized Modal Matrix
In linear algebra, the modal matrix is used in the diagonalization process involving eigenvalues and eigenvectors. Specifically, the modal matrix M for the matrix A is the ''n'' × ''n'' matrix formed with the eigenvectors of A as columns in M. It is utilized in the similarity transformation D = M^{-1}AM, where D is an ''n'' × ''n'' diagonal matrix with the eigenvalues of A on the main diagonal of D and zeros elsewhere. The matrix D is called the spectral matrix for A. The eigenvalues must appear left to right, top to bottom in the same order as their corresponding eigenvectors are arranged left to right in M. Example: The matrix A = \begin{pmatrix} 3 & 2 & 0 \\ 2 & 0 & 0 \\ 1 & 0 & 2 \end{pmatrix} has eigenvalues and corresponding eigenvectors \lambda_1 = -1, \quad \mathbf b_1 = \left( -3, 6, 1 \right); \quad \lambda_2 = 2, \quad \mathbf b_2 = \left( 0, 0, 1 \right); \quad \lambda_3 = 4, \quad \mathbf b_3 = \left( 2, 1, 1 \right). A diagonal matrix D, similar to A, is D = \begin{pmatrix} -1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 4 \end{pmatrix} ...
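
A numerical check of the example above (numpy is used here as a convenience, not something the text prescribes): stacking b_1, b_2, b_3 as the columns of the modal matrix M and forming M^{-1}AM reproduces the spectral matrix D = diag(-1, 2, 4), with eigenvalues in the same order as the columns.

    import numpy as np

    A = np.array([[3.0, 2.0, 0.0],
                  [2.0, 0.0, 0.0],
                  [1.0, 0.0, 2.0]])

    M = np.column_stack([[-3.0, 6.0, 1.0],   # b1 for lambda = -1
                         [ 0.0, 0.0, 1.0],   # b2 for lambda =  2
                         [ 2.0, 1.0, 1.0]])  # b3 for lambda =  4

    D = np.linalg.inv(M) @ A @ M
    print(np.round(D, 10))                   # diag(-1, 2, 4), zeros elsewhere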




Invertible Matrix
In linear algebra, an ''n''-by-''n'' square matrix A is called invertible (also nonsingular or nondegenerate), if there exists an ''n''-by-''n'' square matrix B such that \mathbf{AB} = \mathbf{BA} = \mathbf{I}_n, where \mathbf{I}_n denotes the ''n''-by-''n'' identity matrix and the multiplication used is ordinary matrix multiplication. If this is the case, then the matrix B is uniquely determined by A, and is called the (multiplicative) ''inverse'' of A, denoted by A^{-1}. Matrix inversion is the process of finding the matrix that satisfies the prior equation for a given invertible matrix A. A square matrix that is ''not'' invertible is called singular or degenerate. A square matrix is singular if and only if its determinant is zero. Singular matrices are rare in the sense that if a square matrix's entries are randomly selected from any finite region on the number line or complex plane, the probability that the matrix is singular is 0, that is, it will "almost never" be singular. Non-square matrices (''m''-by-''n'' matrices for which m \neq n) do not have an inverse ...
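
A short sketch with arbitrarily chosen matrices (assumed example): an invertible matrix has a nonzero determinant and an inverse B with AB = BA = I_n, while a matrix with determinant zero is singular.

    import numpy as np

    A = np.array([[4.0, 7.0],
                  [2.0, 6.0]])

    print(np.linalg.det(A))                      # 10.0, nonzero, so A is nonsingular
    B = np.linalg.inv(A)                         # the (multiplicative) inverse of A
    I2 = np.eye(2)
    print(np.allclose(A @ B, I2), np.allclose(B @ A, I2))   # True True

    S = np.array([[1.0, 2.0],
                  [2.0, 4.0]])                   # singular: second row is twice the first
    print(np.isclose(np.linalg.det(S), 0.0))     # True, so S has no inverse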


Linear Transformation
In mathematics, and more specifically in linear algebra, a linear map (also called a linear mapping, linear transformation, vector space homomorphism, or in some contexts linear function) is a mapping V \to W between two vector spaces that preserves the operations of vector addition and scalar multiplication. The same names and the same definition are also used for the more general case of modules over a ring; see Module homomorphism. If a linear map is a bijection then it is called a ''linear isomorphism''. In the case where V = W, a linear map is called a (linear) ''endomorphism''. Sometimes the term ''linear operator'' refers to this case, but the term "linear operator" can have different meanings for different conventions: for example, it can be used to emphasize that V and W are real vector spaces (not necessarily with V = W), or it can be used to emphasize that V is a function space, which is a common convention in functional analysis. Sometimes the term ''linear function'' has the same meaning as ''linear map'' ...
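
A quick numerical check (the map is an assumed example): a matrix A defines a linear map x ↦ Ax from R^3 to R^2, which preserves addition and scalar multiplication, whereas an affine map x ↦ Ax + c with c ≠ 0 does not.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((2, 3))
    c = np.array([1.0, -2.0])
    x, y = rng.standard_normal(3), rng.standard_normal(3)

    T = lambda v: A @ v            # linear map represented by the matrix A
    S = lambda v: A @ v + c        # affine, not linear

    print(np.allclose(T(x + y), T(x) + T(y)), np.allclose(T(3.0 * x), 3.0 * T(x)))  # True True
    print(np.allclose(S(x + y), S(x) + S(y)))                                       # False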


Zero Vector
In mathematics, a zero element is one of several generalizations of 0, the number zero, to other algebraic structures. These alternate meanings may or may not reduce to the same thing, depending on the context. Additive identities: An additive identity is the identity element in an additive group. It corresponds to the element 0 such that for all x in the group, x + 0 = 0 + x = x. Some examples of additive identity include: * The zero vector under vector addition: the vector of length 0 and whose components are all 0. Often denoted as \mathbf{0} or \vec{0}. * The zero function or zero map, which sends every x to 0, under pointwise addition. * The empty set under set union. * An empty sum or empty coproduct. * An initial object in a category (an empty coproduct, and so an identity under coproducts). Absorbing elements: An absorbing element in a multiplicative semigroup or semiring generalises the property 0 \cdot x = 0. Examples include: ...




Identity Matrix
In linear algebra, the identity matrix of size n is the n\times n square matrix with ones on the main diagonal and zeros elsewhere. Terminology and notation: The identity matrix is often denoted by I_n, or simply by I if the size is immaterial or can be trivially determined by the context. I_1 = \begin{pmatrix} 1 \end{pmatrix},\ I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},\ I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix},\ \dots,\ I_n = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{pmatrix}. The term unit matrix has also been widely used, but the term ''identity matrix'' is now standard. The term ''unit matrix'' is ambiguous, because it is also used for a matrix of ones and for any unit of the ring of all n\times n matrices. In some fields, such as group theory or quantum mechanics, the identity matrix is sometimes denoted by a boldface one, \mathbf{1}, or called "id" (short for identity) ...
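
A brief check (the example matrix is assumed): multiplying by I_n on either side leaves any conformable matrix unchanged.

    import numpy as np

    I3 = np.eye(3)                               # the 3x3 identity matrix
    print(I3)

    A = np.arange(9.0).reshape(3, 3)             # an arbitrary 3x3 matrix
    print(np.allclose(I3 @ A, A) and np.allclose(A @ I3, A))   # True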


Ordinary Differential Equation
In mathematics, an ordinary differential equation (ODE) is a differential equation whose unknowns consist of one (or more) functions of a single variable and which involves the derivatives of those functions. The term ''ordinary'' is used in contrast with the term partial differential equation, which may be with respect to ''more than'' one independent variable. Differential equations: A linear differential equation is a differential equation that is defined by a linear polynomial in the unknown function and its derivatives, that is an equation of the form a_0(x)y + a_1(x)y' + a_2(x)y'' + \cdots + a_n(x)y^{(n)} + b(x) = 0, where a_0(x), \ldots, a_n(x) and b(x) are arbitrary differentiable functions that do not need to be linear, and y', \ldots, y^{(n)} are the successive derivatives of the unknown function y of the variable x. Among ordinary differential equations, linear differential equations play a prominent role for several reasons. Most elementary and special functions that are encountered in physics and applied mathematics ...
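
A minimal sketch (the equation and its coefficients are assumed, not from the text): the first-order linear ODE y' + 2y = 0 with y(0) = 1 has the closed-form solution y(x) = e^{-2x}, which a numerical integrator reproduces.

    import numpy as np
    from scipy.integrate import solve_ivp

    f = lambda x, y: -2.0 * y                   # rewrite y' + 2y = 0 as y' = -2y
    sol = solve_ivp(f, (0.0, 2.0), [1.0], dense_output=True, rtol=1e-8, atol=1e-10)

    xs = np.linspace(0.0, 2.0, 5)
    print(np.allclose(sol.sol(xs)[0], np.exp(-2.0 * xs), atol=1e-6))   # True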