Expander Graph
In graph theory, an expander graph is a sparse graph that has strong connectivity properties, quantified using vertex, edge or spectral expansion. Expander constructions have spawned research in pure and applied mathematics, with several applications to complexity theory, the design of robust computer networks, and the theory of error-correcting codes.

Definitions
Intuitively, an expander graph is a finite, undirected multigraph in which every subset of the vertices that is not "too large" has a "large" boundary. Different formalisations of these notions give rise to different notions of expanders: ''edge expanders'', ''vertex expanders'', and ''spectral expanders'', as defined below. A disconnected graph is not an expander, since the boundary of a connected component is empty. Every connected finite graph is an expander; however, different connected graphs have different expansion parameters. The complete graph has the best expansion property, but it has the largest possible degree.
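As a concrete illustration of spectral expansion (a sketch added here, not part of the original article; it assumes Python with numpy, and the helper name is ours), the snippet below compares the second-largest eigenvalue modulus of the adjacency matrix of a cycle with that of a complete graph on the same number of vertices; the further this value lies below the degree d, the stronger the expansion:

import numpy as np

def second_largest_eigenvalue_modulus(adj):
    """Largest |eigenvalue| of the adjacency matrix after discarding the trivial one."""
    moduli = np.sort(np.abs(np.linalg.eigvalsh(adj)))[::-1]   # symmetric matrix, real spectrum
    return moduli[1]

n = 11
cycle = np.zeros((n, n), dtype=int)
for i in range(n):                          # 2-regular cycle graph C_11
    cycle[i, (i + 1) % n] = cycle[(i + 1) % n, i] = 1
complete = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)   # 10-regular complete graph K_11

print(second_largest_eigenvalue_modulus(cycle))      # about 1.92, close to d = 2: weak expansion
print(second_largest_eigenvalue_modulus(complete))   # 1.0, far below d = 10: excellent expansion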
Graph Theory
In mathematics and computer science, graph theory is the study of ''graphs'', which are mathematical structures used to model pairwise relations between objects. A graph in this context is made up of ''vertices'' (also called ''nodes'' or ''points'') which are connected by ''edges'' (also called ''arcs'', ''links'' or ''lines''). A distinction is made between undirected graphs, where edges link two vertices symmetrically, and directed graphs, where edges link two vertices asymmetrically. Graphs are one of the principal objects of study in discrete mathematics.

Definitions
Definitions in graph theory vary. The following are some of the more basic ways of defining graphs and related mathematical structures.

Graph
In one restricted but very common sense of the term, a graph is an ordered pair G=(V,E) comprising:
* V, a set of vertices (also called nodes or points);
* E, a set of edges (also called links or lines), each of which is an unordered pair of vertices \{x, y\}.
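To make the ordered-pair definition concrete, here is a minimal Python sketch (an addition, not from the source text; the vertex labels are arbitrary) that stores a small undirected graph as a vertex set V and a set E of unordered pairs:

# G = (V, E): V is a set of vertices, E a set of edges given as unordered pairs.
V = {"a", "b", "c", "d"}
E = {frozenset(p) for p in [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]}

def neighbours(v):
    """Vertices joined to v by an edge of E."""
    return {w for e in E if v in e for w in e if w != v}

print(sorted(neighbours("c")))   # ['a', 'b', 'd']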
Linear Algebra
Linear algebra is the branch of mathematics concerning linear equations such as a_1x_1+\cdots +a_nx_n=b, linear maps such as (x_1, \ldots, x_n) \mapsto a_1x_1+\cdots +a_nx_n, and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, it is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to function spaces. Linear algebra is also used in most sciences and fields of engineering because it allows modeling many natural phenomena, and computing efficiently with such models. For nonlinear systems, which cannot be modeled with linear algebra, it is often used for dealing with first-order approximations.
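As a small worked example (added here under the assumption that Python with numpy is available), a system of linear equations a_1x_1+\cdots +a_nx_n=b can be written in matrix form A x = b and solved numerically:

import numpy as np

# Two linear equations in two unknowns:  3x + 2y = 12  and  x - y = 1.
A = np.array([[3.0, 2.0],
              [1.0, -1.0]])
b = np.array([12.0, 1.0])

x = np.linalg.solve(A, b)        # solve the linear system A x = b
print(x)                         # [2.8 1.8]
print(np.allclose(A @ x, b))     # True: the solution satisfies both equations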
Markov Transition Matrix
In mathematics, a stochastic matrix is a square matrix used to describe the transitions of a Markov chain. Each of its entries is a nonnegative real number representing a probability. It is also called a probability matrix, transition matrix, ''substitution matrix'', or Markov matrix. The stochastic matrix was first developed by Andrey Markov at the beginning of the 20th century, and has found use throughout a wide variety of scientific fields, including probability theory, statistics, mathematical finance and linear algebra, as well as computer science and population genetics. There are several different definitions and types of stochastic matrices:
* A right stochastic matrix is a square matrix of nonnegative real numbers, with each row summing to 1 (so it is also called a row stochastic matrix).
* A left stochastic matrix is a square matrix of nonnegative real numbers, with each column summing to 1 (so it is also called a column stochastic matrix).
* A ''doubly stochastic matrix'' is a square matrix of nonnegative real numbers, with each row and each column summing to 1.
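The sketch below (an illustrative addition assuming Python with numpy; the two-state chain is a made-up example) builds a right stochastic matrix, verifies the row-sum property, and applies it to a probability row vector, which is exactly how such a matrix describes one step of a Markov chain:

import numpy as np

# Right stochastic matrix: entry P[i, j] is the probability of moving from state i to state j.
P = np.array([[0.9, 0.1],    # e.g. sunny -> sunny, sunny -> rainy
              [0.5, 0.5]])   #      rainy -> sunny, rainy -> rainy

assert np.allclose(P.sum(axis=1), 1.0)   # each row sums to 1
assert np.all(P >= 0)                    # all entries are nonnegative probabilities

p0 = np.array([1.0, 0.0])                # start in state 0 with certainty
p1 = p0 @ P                              # distribution after one transition
print(p1)                                # [0.9 0.1]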
2-norm
In mathematics, a norm is a function from a real or complex vector space to the non-negative real numbers that behaves in certain ways like the distance from the origin: it commutes with scaling, obeys a form of the triangle inequality, and is zero only at the origin. In particular, the Euclidean distance in a Euclidean space is defined by a norm on the associated Euclidean vector space, called the Euclidean norm, the 2-norm, or, sometimes, the magnitude or length of the vector. This norm can be defined as the square root of the inner product of a vector with itself. A seminorm satisfies the first two properties of a norm but may be zero for vectors other than the origin. A vector space with a specified norm is called a normed vector space. In a similar manner, a vector space with a seminorm is called a ''seminormed vector space''. The term pseudonorm has been used for several related meanings. It may be a synonym of "seminorm". It can also refer to a norm that can take infinite values.
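A quick numerical check (added here, assuming numpy) of the statement that the Euclidean norm is the square root of the inner product of a vector with itself, and that it commutes with (positive) scaling:

import numpy as np

x = np.array([3.0, 4.0, 12.0])

norm_from_inner_product = np.sqrt(x @ x)     # sqrt of <x, x>
norm_from_library = np.linalg.norm(x)        # the 2-norm computed directly
print(norm_from_inner_product, norm_from_library)                      # both 13.0
print(np.isclose(np.linalg.norm(2.5 * x), 2.5 * norm_from_library))    # True: ||c x|| = |c| ||x||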
Rayleigh Quotient
In mathematics, the Rayleigh quotient for a given complex Hermitian matrix M and nonzero vector ''x'' is defined as R(M,x) = \frac{x^* M x}{x^* x}. For real matrices and vectors, the condition of being Hermitian reduces to that of being symmetric, and the conjugate transpose x^* to the usual transpose x'. Note that R(M, c x) = R(M,x) for any non-zero scalar ''c''. Recall that a Hermitian (or real symmetric) matrix is diagonalizable with only real eigenvalues. It can be shown that, for a given matrix, the Rayleigh quotient reaches its minimum value \lambda_\min (the smallest eigenvalue of ''M'') when ''x'' is v_\min (the corresponding eigenvector). Similarly, R(M, x) \leq \lambda_\max and R(M, v_\max) = \lambda_\max. The Rayleigh quotient is used in the min-max theorem to get exact values of all eigenvalues. It is also used in eigenvalue algorithms (such as Rayleigh quotient iteration) to obtain an eigenvalue approximation from an eigenvector approximation. The range of the Rayleigh quotient (for any matrix, not necessarily Hermitian) is called a numerical range and contains its spectrum.
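The sketch below (added for illustration, assuming numpy; the matrix is arbitrary) evaluates the Rayleigh quotient and confirms that it attains \lambda_\min and \lambda_\max at the corresponding eigenvectors, with other vectors giving values in between:

import numpy as np

def rayleigh_quotient(M, x):
    """R(M, x) = (x* M x) / (x* x) for a nonzero vector x."""
    return (x.conj() @ M @ x) / (x.conj() @ x)

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])                      # real symmetric, hence Hermitian

eigenvalues, eigenvectors = np.linalg.eigh(M)   # eigenvalues in ascending order
v_min, v_max = eigenvectors[:, 0], eigenvectors[:, -1]

print(rayleigh_quotient(M, v_min), eigenvalues[0])    # R attains lambda_min at v_min
print(rayleigh_quotient(M, v_max), eigenvalues[-1])   # R attains lambda_max at v_max
print(rayleigh_quotient(M, np.array([1.0, 1.0])))     # 3.5, between the two eigenvalues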
Orthogonal
In mathematics, orthogonality is the generalization of the geometric notion of ''perpendicularity''. Although many authors use the two terms ''perpendicular'' and ''orthogonal'' interchangeably, the term ''perpendicular'' is more specifically used for lines and planes that intersect to form a right angle, whereas ''orthogonal'' is used in generalizations, such as ''orthogonal vectors'' or ''orthogonal curves''. ''Orthogonality'' is also used with various meanings that are often weakly related or not related at all to the mathematical meanings.

Etymology
The word comes from the Ancient Greek ''orthós'', meaning "upright", and ''gōnía'', meaning "angle". The Ancient Greek ''orthogṓnion'' and Classical Latin ''orthogonium'' originally denoted a rectangle. Later, they came to mean a right triangle. In the 12th century, the post-classical Latin word ''orthogonalis'' came to mean a right angle or something related to a right angle.

Optics
In optics, polarization states are said to be orthogonal when they propagate independently of each other.
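In the vector-space sense, two vectors are orthogonal precisely when their inner product is zero; the short check below (an addition for illustration, assuming numpy) is the standard way to test this numerically:

import numpy as np

u = np.array([2.0, 1.0])
v = np.array([-1.0, 2.0])    # u rotated by a right angle

print(np.dot(u, v))                      # 0.0: u and v are orthogonal
print(np.dot(u, np.array([1.0, 1.0])))   # 3.0: nonzero, so not orthogonal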
Spectral Gap
In mathematics, the spectral gap is the difference between the moduli of the two largest eigenvalues of a matrix or operator; alternately, it is sometimes taken as the smallest non-zero eigenvalue. Various theorems relate this difference to other properties of the system. The spectral gap gets its name from the ''matrix spectrum'', that is, for a matrix, the list of its eigenvalues. For a matrix associated with a graph or network, the spectral gap provides insight into diffusion: identifying the spectral gap with the smallest non-zero eigenvalue, it corresponds to the mode of the network state that shows the slowest exponential decay over time.

See also
* Cheeger constant (graph theory)
* Cheeger constant (Riemannian geometry)
* Eigengap
* Spectral gap (physics)
* Spectral radius
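A short sketch (added here, assuming numpy; the matrix is an arbitrary two-state transition matrix) computing the spectral gap in the first sense above, as the difference between the moduli of the two largest eigenvalues:

import numpy as np

def spectral_gap(A):
    """Difference between the moduli of the two largest eigenvalues of A."""
    moduli = np.sort(np.abs(np.linalg.eigvals(A)))[::-1]
    return moduli[0] - moduli[1]

# Transition matrix of a two-state Markov chain: the largest eigenvalue modulus is 1,
# and a larger gap means the slowest non-stationary mode decays faster.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(spectral_gap(P))   # approximately 0.6 (eigenvalues 1 and 0.4)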
Degree (Graph Theory)
In graph theory, the degree (or valency) of a vertex of a graph is the number of edges that are incident to the vertex; in a multigraph, a loop contributes 2 to a vertex's degree, for the two ends of the edge. The degree of a vertex v is denoted \deg(v) or \deg v. The maximum degree of a graph G is denoted by \Delta(G), and is the maximum of G's vertices' degrees. The minimum degree of a graph is denoted by \delta(G), and is the minimum of G's vertices' degrees. In a regular graph, every vertex has the same degree, and so we can speak of ''the'' degree of the graph. A complete graph (denoted K_n, where n is the number of vertices in the graph) is a special kind of regular graph where all vertices have the maximum possible degree, n-1. In a signed graph, the number of positive edges connected to the vertex v is called positive deg(v) and the number of connected negative edges is called negative deg(v).
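The sketch below (an illustrative addition, assuming numpy) reads vertex degrees off an adjacency matrix as row sums and reports \Delta(G) and \delta(G); for the complete graph K_n every degree is n-1:

import numpy as np

# Adjacency matrix of an undirected graph on 4 vertices.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]])

degrees = A.sum(axis=1)                 # deg(v) is the row sum for vertex v
print(degrees)                          # [1 3 2 2]
print(degrees.max(), degrees.min())     # Delta(G) = 3, delta(G) = 1

K4 = np.ones((4, 4), dtype=int) - np.eye(4, dtype=int)   # complete graph K_4
print(K4.sum(axis=1))                   # every vertex has degree n - 1 = 3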
Eigenvector
In linear algebra, an eigenvector or characteristic vector is a vector that has its direction unchanged (or reversed) by a given linear transformation. More precisely, an eigenvector \mathbf v of a linear transformation T is scaled by a constant factor \lambda when the linear transformation is applied to it: T\mathbf v=\lambda \mathbf v. The corresponding eigenvalue, characteristic value, or characteristic root is the multiplying factor \lambda (possibly a negative or complex number). Geometrically, vectors are multidimensional quantities with magnitude and direction, often pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon which it acts. A linear transformation's eigenvectors are those vectors that are only stretched or shrunk, with neither rotation nor shear. The corresponding eigenvalue is the factor by which an eigenvector is stretched or shrunk. If the eigenvalue is negative, the eigenvector's direction is reversed.
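A minimal numerical check (added here, assuming numpy; the matrix is arbitrary) of the defining relation T\mathbf v=\lambda \mathbf v: applying the matrix to each of its eigenvectors only scales it by the corresponding eigenvalue.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

for k, lam in enumerate(eigenvalues):
    v = eigenvectors[:, k]               # k-th eigenvector, stored as a column
    print(np.allclose(A @ v, lam * v))   # True: A merely scales v by lambda
print(eigenvalues)                       # the eigenvalues 3 and 1 (order may vary)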
Stationary Distribution
Stationary distribution may refer to:
* A special distribution for a Markov chain such that if the chain starts with its stationary distribution, the marginal distribution of all states at any time will always be the stationary distribution. Assuming irreducibility, the stationary distribution is always unique if it exists, and its existence can be implied by positive recurrence of all states. The stationary distribution has the interpretation of the limiting distribution when the chain is irreducible and aperiodic.
* The marginal distribution of a stationary process or stationary time series
* The set of joint probability distributions of a stationary process or stationary time series
In some fields of application, the term stable distribution is used for the equivalent of a stationary (marginal) distribution, although in probability and statistics the term has a rather different meaning: see stable distribution. Crudely stated, all of the above are specific cases of a common general concept.
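To illustrate the Markov-chain sense of the term, the sketch below (an addition, assuming numpy; the chain is a made-up two-state example) computes the stationary distribution \pi as the normalised left eigenvector of the transition matrix for eigenvalue 1 and checks that \pi P = \pi:

import numpy as np

# Right stochastic transition matrix of an irreducible two-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

eigenvalues, eigenvectors = np.linalg.eig(P.T)    # left eigenvectors of P
k = np.argmin(np.abs(eigenvalues - 1.0))          # pick the eigenvalue equal to 1
pi = np.real(eigenvectors[:, k])
pi = pi / pi.sum()                                # normalise to a probability vector

print(pi)                        # [0.8333... 0.1666...]
print(np.allclose(pi @ P, pi))   # True: started from pi, the distribution never changes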
Expander Mixing Lemma
The expander mixing lemma intuitively states that the edges of certain d-regular graphs are evenly distributed throughout the graph. In particular, the number of edges between two vertex subsets S and T is always close to the expected number of edges between them in a random d-regular graph, namely \frac{d}{n}|S||T|.

''d''-Regular Expander Graphs
Define an (n, d, \lambda)-graph to be a d-regular graph G on n vertices such that all of the eigenvalues of its adjacency matrix A_G except one have absolute value at most \lambda. The d-regularity of the graph guarantees that its largest absolute value of an eigenvalue is d. In fact, the all-1's vector \mathbf1 is an eigenvector of A_G with eigenvalue d, and the eigenvalues of the adjacency matrix will never exceed the maximum degree of G in absolute value. If we fix d and \lambda then (n, d, \lambda)-graphs form a family of expander graphs with a constant spectral gap.

Statement
Let G = (V, E) be an (n, d, \lambda)-graph. For any two subsets S, T \subseteq V, let e(S, T) be the number of ordered pairs (x, y) \in S \times T with \{x, y\} \in E (so edges with both endpoints in S \cap T are counted twice). Then \left| e(S, T) - \frac{d|S||T|}{n} \right| \leq \lambda \sqrt{|S||T|}.
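The sketch below (added for illustration, assuming numpy; the graph and the subsets S, T are arbitrary choices) checks the mixing bound numerically on a complete graph, which is an (n, d, \lambda)-graph with d = n - 1 and \lambda = 1:

import numpy as np

n, S, T = 8, {0, 1, 2}, {2, 3, 4, 5}

A = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)    # complete graph K_8, d = 7
d = n - 1
lam = np.sort(np.abs(np.linalg.eigvalsh(A)))[-2]         # largest non-trivial eigenvalue modulus (= 1)

# e(S, T): ordered pairs (x, y) in S x T joined by an edge; pairs with both endpoints in
# the intersection of S and T are counted twice, matching the statement above.
e_S_T = sum(A[x, y] for x in S for y in T)

lhs = abs(e_S_T - d * len(S) * len(T) / n)
rhs = lam * np.sqrt(len(S) * len(T))
print(lhs, rhs, lhs <= rhs)      # 0.5  3.46...  True: the mixing inequality holds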
Spectral Theorem
In linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized (that is, represented as a diagonal matrix in some basis). This is extremely useful because computations involving a diagonalizable matrix can often be reduced to much simpler computations involving the corresponding diagonal matrix. The concept of diagonalization is relatively straightforward for operators on finite-dimensional vector spaces but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modeled by multiplication operators, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative C*-algebras. See also spectral theory for a historical perspective. Examples of operators to which the spectral theorem applies are self-adjoint operators or more generally normal operators on Hilbert spaces.
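As a finite-dimensional illustration (an addition, assuming numpy; the matrix is arbitrary), the real symmetric matrix below is diagonalised by an orthonormal eigenbasis, A = Q D Q^T, exactly as the spectral theorem guarantees:

import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])          # real symmetric, hence diagonalizable

eigenvalues, Q = np.linalg.eigh(A)       # Q has orthonormal eigenvector columns

D = np.diag(eigenvalues)
print(np.allclose(Q @ D @ Q.T, A))       # True: A is diagonal in the basis given by Q
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: the eigenbasis is orthonormal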