Symplectic Vector Space
In mathematics, a symplectic vector space is a vector space V over a field F (for example the real numbers \mathbb{R}) equipped with a symplectic bilinear form. A symplectic bilinear form is a mapping \omega : V \times V \to F that is:
* Bilinear: linear in each argument separately;
* Alternating: \omega(v, v) = 0 holds for all v \in V; and
* Non-degenerate: \omega(v, u) = 0 for all v \in V implies that u = 0.
If the underlying field has characteristic not 2, alternation is equivalent to skew-symmetry. If the characteristic is 2, skew-symmetry is implied by, but does not imply, alternation. In this case every symplectic form is a symmetric form, but not vice versa. Working in a fixed basis, \omega can be represented by a matrix. The conditions abo ...
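As a concrete illustration (a minimal sketch, not part of the article): the standard symplectic form on \mathbb{R}^{2n} is \omega(u, v) = u^\textsf{T} \Omega v, and its defining properties can be checked numerically. The function names below are illustrative only.

```python
import numpy as np

def standard_symplectic_form(n):
    """Matrix of the standard symplectic form on R^(2n)."""
    I, Z = np.eye(n), np.zeros((n, n))
    return np.block([[Z, I], [-I, Z]])

def omega(u, v, Om):
    """Evaluate the bilinear form omega(u, v) = u^T Om v."""
    return u @ Om @ v

n = 2
Om = standard_symplectic_form(n)
rng = np.random.default_rng(0)
u, v = rng.normal(size=2 * n), rng.normal(size=2 * n)

print(abs(omega(u, u, Om)) < 1e-12)                    # alternating: omega(u, u) = 0
print(abs(omega(u, v, Om) + omega(v, u, Om)) < 1e-12)  # hence skew-symmetric over R
print(np.linalg.det(Om) != 0)                          # non-degenerate: invertible matrix
```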
Mathematics
Mathematics is a field of study that discovers and organizes methods, theories and theorems that are developed and proved for the needs of empirical sciences and mathematics itself. There are many areas of mathematics, which include number theory (the study of numbers), algebra (the study of formulas and related structures), geometry (the study of shapes and spaces that contain them), analysis (the study of continuous changes), and set theory (presently used as a foundation for all mathematics). Mathematics involves the description and manipulation of abstract objects that consist of either abstractions from nature or, in modern mathematics, purely abstract entities that are stipulated to have certain properties, called axioms. Mathematics uses pure reason to prove properties of objects, a ''proof'' consisting of a succession of applications of in ...
Symplectic Matrix
In mathematics, a symplectic matrix is a 2n\times 2n matrix M with real entries that satisfies the condition
M^\textsf{T} \Omega M = \Omega,
where M^\textsf{T} denotes the transpose of M and \Omega is a fixed 2n\times 2n nonsingular, skew-symmetric matrix. This definition can be extended to 2n\times 2n matrices with entries in other fields, such as the complex numbers, finite fields, ''p''-adic numbers, and function fields. Typically \Omega is chosen to be the block matrix
\Omega = \begin{pmatrix} 0 & I_n \\ -I_n & 0 \end{pmatrix},
where I_n is the n\times n identity matrix. The matrix \Omega has determinant +1 and its inverse is \Omega^{-1} = \Omega^\textsf{T} = -\Omega.
Properties
Generators for symplectic matrices
Every symplectic matrix has determinant +1, and the 2n\times 2n symplectic matrices with real entries form a subgroup of the general linear group \mathrm{GL}(2n;\mathbb{R}) under matrix multiplication, since being symplectic is a property stable under matrix multiplication. Topologically, this symplectic group is a conn ...
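A quick numerical check of the defining condition (a sketch, not from the article): block matrices of the form [[I, B], [0, I]] with B symmetric are a standard family of symplectic matrices, used here only as a convenient test case.

```python
import numpy as np

n = 2
I, Z = np.eye(n), np.zeros((n, n))
Omega = np.block([[Z, I], [-I, Z]])

# [[I, B], [0, I]] is symplectic whenever B is symmetric.
B = np.array([[1.0, 2.0],
              [2.0, 3.0]])
M = np.block([[I, B], [Z, I]])

print(np.allclose(M.T @ Omega @ M, Omega))        # defining condition holds
print(np.isclose(np.linalg.det(M), 1.0))          # determinant is +1
print(np.allclose(np.linalg.inv(Omega), -Omega))  # Omega^{-1} = -Omega
```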
Basis (Linear Algebra)
In mathematics, a set B of elements of a vector space V is called a basis (plural: bases) if every element of V can be written in a unique way as a finite linear combination of elements of B. The coefficients of this linear combination are referred to as components or coordinates of the vector with respect to B. The elements of a basis are called basis vectors. Equivalently, a set B is a basis if its elements are linearly independent and every element of V is a linear combination of elements of B. In other words, a basis is a linearly independent spanning set. A vector space can have several bases; however, all the bases have the same number of elements, called the dimension of the vector space. This article deals mainly with finite-dimensional vector spaces. However, many of the principles are also valid for infinite-dimensional vector spaces. Basis vectors find applications in the study of crystal structures and frames of reference. ...
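To make the uniqueness of coordinates concrete, here is a small sketch (not from the article; the matrix P is an arbitrary example): if the basis vectors are the columns of an invertible matrix, the coordinates of any vector are found by solving one linear system.

```python
import numpy as np

# The columns of P form a basis of R^3 (P is invertible, so its columns
# are linearly independent and span the space).
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 4.0])

# The coordinates c of v with respect to this basis solve P @ c = v,
# and they are unique because P is invertible.
c = np.linalg.solve(P, v)
print(c)
print(np.allclose(P @ c, v))  # v is recovered as the linear combination
```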
Direct Sum Of Vector Spaces
In abstract algebra, the direct sum is a construction which combines several modules into a new, larger module. The direct sum of modules is the smallest module which contains the given modules as submodules with no "unnecessary" constraints, making it an example of a coproduct. Contrast with the direct product, which is the dual notion. The most familiar examples of this construction occur when considering vector spaces (modules over a field) and abelian groups (modules over the ring Z of integers). The construction may also be extended to cover Banach spaces and Hilbert spaces. See the article decomposition of a module for a way to write a module as a direct sum of submodules.
Construction for vector spaces and abelian groups
We give the construction first in these two cases, under the assumption that we have only two objects. Then we generalize to an arbitrary family of arbitrary modules. The key elements of the general construction are more clearly identified by con ...
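For vector spaces, the two-object construction amounts to forming pairs with componentwise operations. A minimal sketch (names are my own, not the article's): elements of V \oplus W are represented here as concatenated coordinate arrays.

```python
import numpy as np

def direct_sum(v, w):
    """Represent an element (v, w) of V (+) W as a concatenated array."""
    return np.concatenate([v, w])

v1, v2 = np.array([1.0, 2.0]), np.array([3.0, 4.0])  # elements of V = R^2
w1, w2 = np.array([5.0]), np.array([6.0])            # elements of W = R^1

# Addition (and scalar multiplication) act componentwise on the pairs.
x = direct_sum(v1, w1) + direct_sum(v2, w2)
print(np.allclose(x, direct_sum(v1 + v2, w1 + w2)))  # True
print(len(direct_sum(v1, w1)))                       # dim = 2 + 1 = 3
```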
Dual Space
In mathematics, any vector space ''V'' has a corresponding dual vector space (or just dual space for short) consisting of all linear forms on ''V'', together with the vector space structure of pointwise addition and scalar multiplication by constants. The dual space as defined above is defined for all vector spaces, and to avoid ambiguity may also be called the ''algebraic dual space''. When defined for a topological vector space, there is a subspace of the dual space, corresponding to continuous linear functionals, called the continuous dual space. Dual vector spaces find application in many branches of mathematics that use vector spaces, such as in tensor analysis with finite-dimensional vector spaces. When applied to vector spaces of functions (which are typically infinite-dimensional), dual spaces are used to describe measures, distributions, and Hilbert spaces. Consequently, the dual space is an important concept in functional analysis. Early terms for ''dual'' include ''polarer Raum'' (Hahn 1927) ...
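In the finite-dimensional case, a linear form on \mathbb{R}^n is just a row vector, which makes the pointwise vector-space structure easy to see. A small sketch (example values are my own):

```python
import numpy as np

# A linear form on R^3 can be represented by a row vector a, acting
# as phi(x) = a @ x.
a = np.array([1.0, -2.0, 3.0])
b = np.array([0.0, 1.0, 1.0])
x = np.array([4.0, 5.0, 6.0])

phi = lambda v: a @ v
psi = lambda v: b @ v

# Pointwise addition and scalar multiplication of functionals correspond
# to adding and scaling the representing row vectors.
print(np.isclose(phi(x) + psi(x), (a + b) @ x))  # True
print(np.isclose(2.0 * phi(x), (2.0 * a) @ x))   # True
```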
Skew-symmetric Matrix
In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition A^\textsf{T} = -A. In terms of the entries of the matrix, if a_{ij} denotes the entry in the i-th row and j-th column, then the skew-symmetric condition is equivalent to a_{ji} = -a_{ij} for all i and j.
Example
The matrix
A = \begin{pmatrix} 0 & 2 & -45 \\ -2 & 0 & -4 \\ 45 & 4 & 0 \end{pmatrix}
is skew-symmetric because
A^\textsf{T} = \begin{pmatrix} 0 & -2 & 45 \\ 2 & 0 & 4 \\ -45 & -4 & 0 \end{pmatrix} = -A.
Properties
Throughout, we assume that all matrix entries belong to a field \mathbb{F} whose characteristic is not equal to 2. That is, we assume that 1 + 1 \neq 0, where 1 denotes the multiplicative identity and 0 the additive identity of the given field. If the characteristic of the field is 2, then a skew-symmetric matrix is the same thing as a symmetric matrix.
* The sum of two skew-symmetric matrices is skew-symmetric.
* A scalar ...
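A quick check of the example and the listed closure properties (a sketch; only the matrix A is taken from the text above):

```python
import numpy as np

# The example matrix from the text.
A = np.array([[  0,  2, -45],
              [ -2,  0,  -4],
              [ 45,  4,   0]])

print(np.array_equal(A.T, -A))  # True: the transpose equals the negative
print((np.diag(A) == 0).all())  # diagonal entries are forced to be zero

# Sums and scalar multiples of skew-symmetric matrices are skew-symmetric.
B = 3 * A
print(np.array_equal((A + B).T, -(A + B)))  # True
```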
Dual Basis
In linear algebra, given a vector space V with a basis B of vectors indexed by an index set I (the cardinality of I is the dimension of V), the dual set of B is a set B^* of vectors in the dual space V^* with the same index set I such that B and B^* form a biorthogonal system. The dual set is always linearly independent but does not necessarily span V^*. If it does span V^*, then B^* is called the dual basis or reciprocal basis for the basis B. Denoting the indexed vector sets as B = \{v_i\}_{i \in I} and B^* = \{v^i\}_{i \in I}, being biorthogonal means that the elements pair to have an inner product equal to 1 if the indexes are equal, and equal to 0 otherwise. Symbolically, evaluating a dual vector in V^* on a vector in the original space V:
: v^i \cdot v_j = \delta^i_j = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \ne j, \end{cases}
where \delta^i_j is the Kronecker delta symbol.
Introduction
To perform operations with a vector, we must have a straightforward method of calculating its components. In a Cartesia ...
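In finite dimensions the dual basis can be computed directly: if the basis vectors are the columns of an invertible matrix P, the dual basis covectors are the rows of P^{-1}, since P^{-1}P = I is exactly the biorthogonality condition. A sketch (the matrix P is an arbitrary example of mine):

```python
import numpy as np

# Basis vectors v_1, v_2, v_3 of R^3, stored as the columns of P.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# The dual basis covectors are the rows of P^{-1}: row i of P^{-1}
# applied to column j of P gives exactly delta_ij.
D = np.linalg.inv(P)
print(np.allclose(D @ P, np.eye(3)))  # biorthogonality: v^i(v_j) = delta_ij

# Applying v^1 to a vector extracts its first coordinate in the basis.
v = 2 * P[:, 0] + 3 * P[:, 1] - P[:, 2]
print(np.isclose(D[0] @ v, 2.0))      # True
```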
Symplectic Basis
In linear algebra, a standard symplectic basis is a basis \mathbf{e}_i, \mathbf{f}_i of a symplectic vector space, which is a vector space with a nondegenerate alternating bilinear form \omega, such that \omega(\mathbf{e}_i, \mathbf{e}_j) = 0 = \omega(\mathbf{f}_i, \mathbf{f}_j) and \omega(\mathbf{e}_i, \mathbf{f}_j) = \delta_{ij}. A symplectic basis of a symplectic vector space always exists; it can be constructed by a procedure similar to the Gram–Schmidt process (a sketch of this construction appears below).Maurice de Gosson: ''Symplectic Geometry and Quantum Mechanics'' (2006), p. 7 and pp. 12–13. The existence of the basis implies in particular that the dimension of a symplectic vector space is even if it is finite.
See also
* Darboux theorem
* Symplectic frame bundle
* Symplectic spinor bundle
* Symplectic vector space
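The following is a sketch of the Gram–Schmidt-like construction mentioned above, under the assumption that the form is given by a nondegenerate skew-symmetric matrix; the function name and tolerance handling are my own, not from the cited reference.

```python
import numpy as np

def symplectic_gram_schmidt(A, tol=1e-10):
    """Build a symplectic basis for omega(u, v) = u @ A @ v, where A is a
    nondegenerate skew-symmetric matrix. Returns arrays E, F whose rows
    satisfy omega(e_i, e_j) = omega(f_i, f_j) = 0 and omega(e_i, f_j) = delta_ij."""
    omega = lambda u, v: u @ A @ v
    pool = list(np.eye(A.shape[0]))  # start from the standard basis
    es, fs = [], []
    while pool:
        e = pool.pop(0)
        # Non-degeneracy guarantees a partner f with omega(e, f) != 0.
        k = next(i for i, w in enumerate(pool) if abs(omega(e, w)) > tol)
        f = pool.pop(k)
        f = f / omega(e, f)          # normalize so that omega(e, f) = 1
        es.append(e)
        fs.append(f)
        # Project the remaining vectors onto the symplectic complement
        # of span{e, f}, so they pair to zero with both e and f.
        pool = [u + omega(f, u) * e - omega(e, u) * f for u in pool]
        pool = [u for u in pool if np.linalg.norm(u) > tol]
    return np.array(es), np.array(fs)

rng = np.random.default_rng(1)
B = rng.normal(size=(4, 4))
A = B - B.T                      # random skew form, almost surely nondegenerate
E, F = symplectic_gram_schmidt(A)
S = np.vstack([E, F])
print(np.round(S @ A @ S.T, 8))  # Gram matrix of the form: [[0, I], [-I, 0]]
```

Unlike the orthogonal case, vectors are processed in conjugate pairs (e, f) rather than one at a time, which is why the resulting basis comes in two families and the dimension must be even.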
Gram–Schmidt Process
In mathematics, particularly linear algebra and numerical analysis, the Gram–Schmidt process or Gram–Schmidt algorithm is a way of finding a set of two or more vectors that are perpendicular to each other. By technical definition, it is a method of constructing an orthonormal basis from a set of vectors in an inner product space, most commonly the Euclidean space \mathbb{R}^n equipped with the standard inner product. The Gram–Schmidt process takes a finite, linearly independent set of vectors S = \{v_1, \ldots, v_k\} for k \le n and generates an orthogonal set S' = \{u_1, \ldots, u_k\} that spans the same k-dimensional subspace of \mathbb{R}^n as S. The method is named after Jørgen Pedersen Gram and Erhard Schmidt, but Pierre-Simon Laplace had been familiar with it before Gram and Schmidt. In the theory of Lie group decompositions, it is generalized by the Iwasawa decomposition. The application of the Gram–Schmidt process to the column vectors of a full column rank mat ...
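A minimal implementation of the classical process (a sketch; the example vectors are my own): each vector has its projections onto the previously produced vectors subtracted away.

```python
import numpy as np

def gram_schmidt(S):
    """Classical Gram-Schmidt: takes linearly independent vectors (rows of S)
    and returns an orthogonal set spanning the same subspace."""
    U = []
    for v in S:
        # Subtract from v its projection onto each vector already produced.
        u = v - sum(((w @ v) / (w @ w)) * w for w in U)
        U.append(u)
    return np.array(U)

S = np.array([[3.0, 1.0],
              [2.0, 2.0]])
U = gram_schmidt(S)
print(np.round(U @ U.T, 8))  # off-diagonal zeros: the rows are orthogonal
# Dividing each row by its norm would make the set orthonormal.
```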
Identity Matrix
In linear algebra, the identity matrix of size n is the n\times n square matrix with ones on the main diagonal and zeros elsewhere. It has unique properties; for example, when the identity matrix represents a geometric transformation, the object remains unchanged by the transformation. In other contexts, it is analogous to multiplying by the number 1.
Terminology and notation
The identity matrix is often denoted by I_n, or simply by I if the size is immaterial or can be trivially determined by the context.
I_1 = \begin{pmatrix} 1 \end{pmatrix},\ I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},\ I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix},\ \dots,\ I_n = \begin{pmatrix} 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{pmatrix}.
The term unit matrix has also been widely used, but the term ''identity matrix'' is now standard. The term ''unit matrix'' is ambiguous, because it is also used for a matrix of ones ...
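A one-line demonstration of the "multiplying by 1" analogy (a sketch; the matrix A is an arbitrary example):

```python
import numpy as np

I3 = np.eye(3)                    # the 3 x 3 identity matrix
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

# Multiplying by the identity leaves any matrix or vector unchanged,
# just as multiplying a number by 1 does.
print(np.array_equal(I3 @ A, A))  # True
print(np.array_equal(A @ I3, A))  # True
```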
Block Matrix
In mathematics, a block matrix or a partitioned matrix is a matrix that is interpreted as having been broken into sections called blocks or submatrices. Intuitively, a matrix interpreted as a block matrix can be visualized as the original matrix with a collection of horizontal and vertical lines which break it up, or partition it, into a collection of smaller matrices. For example, the 3\times 4 matrix presented below is divided by horizontal and vertical lines into four blocks: the top-left 2\times 3 block, the top-right 2\times 1 block, the bottom-left 1\times 3 block, and the bottom-right 1\times 1 block.
: \left[ \begin{array}{ccc|c} a_{11} & a_{12} & a_{13} & b_1 \\ a_{21} & a_{22} & a_{23} & b_2 \\ \hline c_1 & c_2 & c_3 & d \end{array} \right]
Any matrix may be interpreted as a block matrix in one or more ways, with each interpretation defined by how its rows and columns are partitioned. This notion can be made more precise for an n by m matrix M by partitioning n into a collection \text{rowgroups}, and then partitioning m into a collection \text{colgroups}. The original m ...
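The same partition can be assembled programmatically. A sketch mirroring the 3\times 4 example above (the numeric entries are placeholders of mine, not values from the article):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # top-left 2 x 3 block
b = np.array([[7],
              [8]])          # top-right 2 x 1 block
c = np.array([[9, 10, 11]])  # bottom-left 1 x 3 block
d = np.array([[12]])         # bottom-right 1 x 1 block

# np.block stitches the four blocks back into the full matrix.
M = np.block([[A, b],
              [c, d]])
print(M.shape)               # (3, 4)
```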