Totally Unimodular Matrix
In mathematics, a unimodular matrix ''M'' is a square integer matrix having determinant +1 or −1. Equivalently, it is an integer matrix that is invertible over the integers: there is an integer matrix ''N'' that is its inverse (these are equivalent under Cramer's rule). Thus every equation ''Mx'' = ''b'', where ''M'' and ''b'' both have integer components and ''M'' is unimodular, has an integer solution. The ''n'' × ''n'' unimodular matrices form a group called the ''n'' × ''n'' general linear group over \mathbb{Z}, which is denoted \operatorname{GL}_n(\mathbb{Z}).
Examples of unimodular matrices
Unimodular matrices form a subgroup of the general linear group under matrix multiplication, i.e. the following matrices are unimodular:
* Identity matrix
* The inverse of a unimodular matrix
* The product of two unimodular matrices
Other examples include:
* Pascal matrices ...
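
As a quick numeric sketch of the definition, the following Python snippet checks the two equivalent properties for one hand-picked integer matrix: its determinant is ±1 and its inverse is again an integer matrix (the matrix M below is an assumed example, not taken from the article).

    import numpy as np

    # Assumed example: a 2x2 integer matrix with determinant 1.
    M = np.array([[2, 1],
                  [1, 1]])

    print(round(np.linalg.det(M)))      # 1, so M is unimodular

    # By Cramer's rule the inverse has integer entries as well.
    M_inv = np.round(np.linalg.inv(M)).astype(int)
    print(M_inv)                        # [[ 1 -1]
                                        #  [-1  2]]
    print(M @ M_inv)                    # identity matrix, confirming N = M^-1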

Integer Number
An integer is the number zero (0), a positive natural number (1, 2, 3, etc.) or a negative integer with a minus sign (−1, −2, −3, etc.). The negative numbers are the additive inverses of the corresponding positive numbers. In the language of mathematics, the set of integers is often denoted by the boldface '''Z''' or blackboard bold \mathbb{Z}. The set of natural numbers \mathbb{N} is a subset of \mathbb{Z}, which in turn is a subset of the set of all rational numbers \mathbb{Q}, itself a subset of the real numbers \mathbb{R}. Like the natural numbers, \mathbb{Z} is countably infinite. An integer may be regarded as a real number that can be written without a fractional component. For example, 21, 4, 0, and −2048 are integers, while 9.75 is not. The integers form the smallest group and the smallest ring containing the natural numbers. In algebraic number theory, the integers are sometimes qualified as ...

Permutation Matrix
In mathematics, particularly in matrix theory, a permutation matrix is a square binary matrix that has exactly one entry of 1 in each row and each column and 0s elsewhere. Each such matrix, say ''P'', represents a permutation of ''m'' elements and, when used to multiply another matrix, say ''A'', results in permuting the rows (when pre-multiplying, to form ''PA'') or columns (when post-multiplying, to form ''AP'') of the matrix ''A''.
Definition
Given a permutation ''π'' of ''m'' elements,
:\pi : \lbrace 1, \ldots, m \rbrace \to \lbrace 1, \ldots, m \rbrace
represented in two-line form by
:\begin{pmatrix} 1 & 2 & \cdots & m \\ \pi(1) & \pi(2) & \cdots & \pi(m) \end{pmatrix},
there are two natural ways to associate the permutation with a permutation matrix; namely, starting with the ''m'' × ''m'' identity matrix ''I''_''m'', either permute the columns or permute the rows according to ''π''. Both methods of defining permutation matrices appear in the literature, and the properties expressed in one representation can be easily converted to th ...
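
A small Python sketch of the idea: build a permutation matrix by permuting the rows of the identity matrix, then observe the effect of pre- and post-multiplication. The 3-element permutation used here is an assumed example.

    import numpy as np

    # Assumed permutation of 3 elements, zero-based: row i of P has its 1 in column pi[i].
    pi = [1, 2, 0]

    # Build the permutation matrix by permuting the rows of the identity matrix.
    P = np.eye(3, dtype=int)[pi]
    print(P)
    # [[0 1 0]
    #  [0 0 1]
    #  [1 0 0]]

    A = np.arange(9).reshape(3, 3)
    print(P @ A)   # pre-multiplying permutes the rows of A
    print(A @ P)   # post-multiplying permutes the columns of A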

Transpose
In linear algebra, the transpose of a matrix is an operator which flips a matrix over its diagonal; that is, it switches the row and column indices of a matrix ''A'' by producing another matrix, often denoted by ''A''^\operatorname{T} (among other notations). The transpose of a matrix was introduced in 1858 by the British mathematician Arthur Cayley. In the case of a logical matrix representing a binary relation ''R'', the transpose corresponds to the converse relation ''R''^\operatorname{T}.
Transpose of a matrix
Definition
The transpose of a matrix ''A'', denoted by ''A''^\operatorname{T}, may be constructed by any one of the following methods:
# Reflect ''A'' over its main diagonal (which runs from top-left to bottom-right) to obtain ''A''^\operatorname{T}
# Write the rows of ''A'' as the columns of ''A''^\operatorname{T}
# Write the columns of ''A'' as the rows of ''A''^\operatorname{T}
Formally, the ''i''-th row, ''j''-th column element of ''A''^\operatorname{T} is the ''j''-th row, ''i''-th column element of ''A'':
:\left[\mathbf{A}^\operatorname{T}\right]_{ij} = \left[\mathbf{A}\right]_{ji}.
If ''A'' is an ''m'' × ''n'' matrix, then ''A''^\operatorname{T} is an ''n'' × ''m'' matrix. In the case of square matrices, ...
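
A short numpy check of the defining identity [A^T]_ij = [A]_ji, using an assumed 2 × 3 example matrix:

    import numpy as np

    # Assumed example: a 2 x 3 matrix, whose transpose is 3 x 2.
    A = np.array([[1, 2, 3],
                  [4, 5, 6]])
    At = A.T
    print(At)
    # [[1 4]
    #  [2 5]
    #  [3 6]]

    # The (i, j) entry of A^T equals the (j, i) entry of A.
    assert all(At[i, j] == A[j, i] for i in range(3) for j in range(2))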

Converse (logic)
In logic and mathematics, the converse of a categorical or implicational statement is the result of reversing its two constituent statements. For the implication ''P'' → ''Q'', the converse is ''Q'' → ''P''. For the categorical proposition ''All S are P'', the converse is ''All P are S''. Either way, the truth of the converse is generally independent of that of the original statement. (Robert Audi, ed. (1999), ''The Cambridge Dictionary of Philosophy'', 2nd ed., Cambridge University Press: "converse".)
Implicational converse
Let ''S'' be a statement of the form ''P implies Q'' (''P'' → ''Q''). Then the converse of ''S'' is the statement ''Q implies P'' (''Q'' → ''P''). In general, the truth of ''S'' says nothing about the truth of its converse, unless the antecedent ''P'' and the consequent ''Q'' are logically equivalent. For example, consider the true statement "If I am a human, then I am mortal." The converse of that statement is "If I am mortal, then I am ...
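
A brief Python truth-table sketch makes the independence concrete: the implication and its converse disagree exactly when one of P, Q is true and the other is false.

    from itertools import product

    # Material implication: P -> Q is false only when P is true and Q is false.
    def implies(p, q):
        return (not p) or q

    print("P      Q      P->Q   Q->P")
    for p, q in product([False, True], repeat=2):
        print(f"{p!s:6} {q!s:6} {implies(p, q)!s:6} {implies(q, p)!s:6}")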

Submatrix
In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. For example, \begin{bmatrix} 1 & 9 & -13 \\ 20 & 5 & -6 \end{bmatrix} is a matrix with two rows and three columns. This is often referred to as a "two by three matrix", a "2 × 3 matrix", or a matrix of dimension 2 × 3. Without further specifications, matrices represent linear maps, and allow explicit computations in linear algebra. Therefore, the study of matrices is a large part of linear algebra, and most properties and operations of abstract linear algebra can be expressed in terms of matrices. For example, matrix multiplication represents composition of linear maps. Not all matrices are related to linear algebra. This is, in particular, the case in graph theory, of incidence matrices and adjacency matrices. ''This article focuses on matrices related to linear algebra, and, unle ...
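
To make the last point about composition concrete, here is a short Python sketch using the 2 × 3 matrix above; the second matrix B is an assumed example. Applying A and then B to a vector gives the same result as applying the single matrix BA.

    import numpy as np

    # The 2 x 3 example matrix from the text: a linear map from R^3 to R^2.
    A = np.array([[ 1, 9, -13],
                  [20, 5,  -6]])

    # An assumed second map B from R^2 to R^2 (swap the two coordinates).
    B = np.array([[0, 1],
                  [1, 0]])

    x = np.array([1, 0, 0])
    # Composition of linear maps corresponds to matrix multiplication: (B o A)(x) = (BA)x.
    assert np.array_equal(B @ (A @ x), (B @ A) @ x)
    print(B @ A)   # the matrix of the composed map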

Joseph Kruskal
Joseph Bernard Kruskal, Jr. (January 29, 1928 – September 19, 2010) was an American mathematician, statistician, computer scientist and psychometrician.
Personal life
Kruskal was born into a Jewish family in New York City to a successful fur wholesaler, Joseph B. Kruskal, Sr. His mother, Lillian Rose Vorhaus Kruskal Oppenheimer, became a noted promoter of origami during the early era of television. Kruskal had two notable brothers: Martin David Kruskal, co-inventor of solitons, and William Kruskal, who developed the Kruskal–Wallis one-way analysis of variance. One of Joseph Kruskal's nephews is the computer scientist and professor Clyde Kruskal.
Education and career
He was a student at the University of Chicago, earning a bachelor of science in mathematics in 1948 and a master of science in mathematics in 1949. After his time at the University of Chicago, Kruskal attended Princeton University, where he completed his Ph.D. in 1954, nom ...

Alan Hoffman (mathematician)
Alan Jerome Hoffman (May 30, 1924 – January 18, 2021) was an American mathematician and IBM Fellow emeritus at the T. J. Watson Research Center, IBM, in Yorktown Heights, New York. He was the founding editor of the journal ''Linear Algebra and its Applications'', and held several patents. He contributed to combinatorial optimization and the eigenvalue theory of graphs. Hoffman and Robert Singleton constructed the Hoffman–Singleton graph, which is the unique Moore graph of degree 7 and diameter 2. Hoffman died on January 18, 2021, at the age of 96.
Early life
Alan Hoffman was born and raised in New York City, residing first in Bensonhurst, Brooklyn and then on the Upper West Side of Manhattan, with his sister Mildred and his parents Muriel and Jesse. Alan knew from an early age that he wanted a career in mathematics. He was a good student in all disciplines, finding inspiration in both the liberal arts and the sciences. But he was enthralled by the rigor of deductive reasoning ...

Claude Berge
Claude Jacques Berge (5 June 1926 – 30 June 2002) was a French mathematician, recognized as one of the modern founders of combinatorics and graph theory.
Biography and professional history
Claude Berge's parents were André Berge and Geneviève Fourcade. André Berge (1902–1995) was a physician and psychoanalyst who, in addition to his professional work, had published several novels. He was the son of René Berge, a mining engineer, and Antoinette Faure. Félix François Faure (1841–1899) was Antoinette Faure's father; he was President of France from 1895 to 1899. André Berge married Geneviève in 1924, and Claude was the second of their six children. His five siblings were Nicole (the eldest), Antoine, Philippe, Edith, and Patrick. Claude attended a famous private school near Verneuil-sur-Avre, west of Paris. The school, founded by the sociologist Edmond Demolins in 1899, attracted students from all over France to its innovative educational program. At this stage ...

Kronecker Product
In mathematics, the Kronecker product, sometimes denoted by ⊗, is an operation on two matrices of arbitrary size resulting in a block matrix. It is a generalization of the outer product (which is denoted by the same symbol) from vectors to matrices, and gives the matrix of the tensor product linear map with respect to a standard choice of basis. The Kronecker product is to be distinguished from the usual matrix multiplication, which is an entirely different operation. The Kronecker product is also sometimes called the matrix direct product. The Kronecker product is named after the German mathematician Leopold Kronecker (1823–1891), even though there is little evidence that he was the first to define and use it. The Kronecker product has also been called the ''Zehfuss matrix'' and the ''Zehfuss product'', after Zehfuss, who in 1858 described this matrix operation, but Kronecker product is currently the most widely used name.
Definition
If A is an ''m'' × ''n'' matrix and B is a ''p'' × ''q'' matrix, then the ...
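
A minimal numpy sketch, with two assumed 2 × 2 input matrices, shows the block structure that the definition describes: each block of the result is a_ij times B.

    import numpy as np

    # Assumed example inputs.
    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[0, 1],
                  [1, 0]])

    K = np.kron(A, B)     # Kronecker product: block (i, j) of K equals A[i, j] * B
    print(K)
    # [[0 1 0 2]
    #  [1 0 2 0]
    #  [0 3 0 4]
    #  [3 0 4 0]]
    # An m x n input paired with a p x q input yields an (mp) x (nq) block matrix.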

Hermite Normal Form
In linear algebra, the Hermite normal form is an analogue of reduced echelon form for matrices over the integers Z. Just as reduced echelon form can be used to solve problems about the solution to the linear system Ax=b where x is in R^''n'', the Hermite normal form can solve problems about the solution to the linear system Ax=b where this time x is restricted to have integer coordinates only. Other applications of the Hermite normal form include integer programming, cryptography, and abstract algebra.
Definition
Various authors may prefer to talk about Hermite normal form in either row-style or column-style. They are essentially the same up to transposition.
Row-style Hermite normal form
An m by n matrix A with integer entries has a (row) Hermite normal form H if there is a square unimodular matrix U where H=UA and H has the following restrictions:
# H is upper triangular (that is, h_''ij'' = 0 for ''i'' > ''j''), and any rows of zeros are located below any other row.
# The leadin ...
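
As a concrete, hand-picked instance of the row-style definition (an assumed example, not from the article), the snippet below exhibits a unimodular U for a small integer matrix A and checks that H = UA is upper triangular.

    import numpy as np

    # Assumed example: A, and a unimodular U found by integer row operations.
    A = np.array([[3, 1],
                  [1, 1]])
    U = np.array([[ 0, 1],
                  [-1, 3]])              # det(U) = 1, so U is unimodular

    H = U @ A
    print(H)                              # [[1 1]
                                          #  [0 2]]  -- upper triangular
    print(round(np.linalg.det(U)))        # 1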

Lattice Reduction
In mathematics, the goal of lattice basis reduction is to find a basis with short, nearly orthogonal vectors when given an integer lattice basis as input. This is realized using different algorithms, whose running time is usually at least exponential in the dimension of the lattice. Nearly orthogonal One measure of ''nearly orthogonal'' is the orthogonality defect. This compares the product of the lengths of the basis vectors with the volume of the parallelepiped they define. For perfectly orthogonal basis vectors, these quantities would be the same. Any particular basis of n vectors may be represented by a matrix B, whose columns are the basis vectors b_i, i = 1, \ldots, n. In the fully dimensional case where the number of basis vectors is equal to the dimension of the space they occupy, this matrix is square, and the volume of the fundamental parallelepiped is simply the absolute value of the determinant of this matrix \det(B). If the number of vectors is less than the dimensi ...
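
The orthogonality defect mentioned above is easy to compute directly; the two-dimensional basis below is an assumed example for illustration.

    import numpy as np

    # Columns of B are the basis vectors b_1, b_2 (assumed example basis).
    B = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    lengths = np.linalg.norm(B, axis=0)    # ||b_1||, ||b_2||
    volume = abs(np.linalg.det(B))         # volume of the fundamental parallelepiped
    defect = np.prod(lengths) / volume
    print(defect)   # about 1.414; a perfectly orthogonal basis would give exactly 1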

Reflection Matrix
In linear algebra, linear transformations can be represented by matrices. If T is a linear transformation mapping \mathbb{R}^n to \mathbb{R}^m and \mathbf x is a column vector with n entries, then T( \mathbf x ) = A \mathbf x for some m \times n matrix A, called the transformation matrix of T. Note that A has m rows and n columns, whereas the transformation T is from \mathbb{R}^n to \mathbb{R}^m. There are alternative expressions of transformation matrices involving row vectors that are preferred by some authors.
Uses
Matrices allow arbitrary linear transformations to be displayed in a consistent format, suitable for computation. This also allows transformations to be composed easily (by multiplying their matrices). Linear transformations are not the only ones that can be represented by matrices. Some transformations that are non-linear on an ''n''-dimensional Euclidean space '''R'''^''n'' can be represented as linear transformations on the (''n''+1)-dimensional space '''R'''^(''n''+1). These include both af ...
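
As a closing sketch (an added illustration with assumed matrices), the reflection about the x-axis in the plane is represented by a 2 × 2 transformation matrix, and composing it with another transformation amounts to multiplying their matrices.

    import numpy as np

    # Reflection about the x-axis as a transformation matrix.
    A = np.array([[1,  0],
                  [0, -1]])

    x = np.array([3, 2])
    print(A @ x)                 # [ 3 -2]: T(x) = Ax reflects the vector

    # Composition by matrix multiplication: rotate by 90 degrees after reflecting.
    R90 = np.array([[0, -1],
                    [1,  0]])
    print(R90 @ A)               # the matrix of the composed transformation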