




Concentration Of Measure
In mathematics, concentration of measure (about a median) is a principle that is applied in measure theory, probability and combinatorics, and has consequences for other fields such as Banach space theory. Informally, it states that "A random variable that depends in a Lipschitz way on many independent variables (but not too much on any of them) is essentially constant". The concentration of measure phenomenon was put forth in the early 1970s by Vitali Milman in his works on the local theory of Banach spaces, extending an idea going back to the work of Paul Lévy. It was further developed in the works of Milman and Gromov, Maurey, Pisier, Schechtman, Talagrand, Ledoux, and others. The general setting: Let (X, d) be a metric space with a measure \mu on the Borel sets with \mu(X) = 1. Let :\alpha(\epsilon) = \sup \left\{ 1 - \mu(A_\epsilon) \,:\, A \text{ a Borel set},\ \mu(A) \geq \tfrac{1}{2} \right\}, where :A_\epsilon = \left\{ x \,:\, d(x, A) < \epsilon \right\} is the \epsilon-''extension'' (also called \epsilon-fattening in the context of the Hausdorff distance) of a set A. The f ...
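
A minimal numerical sketch of the phenomenon described above (an added illustration, assuming NumPy; not part of the quoted article): the first coordinate x_1 is a 1-Lipschitz function on the unit sphere S^(n-1), and under the uniform surface measure it concentrates around its median 0 as the dimension grows.

import numpy as np

rng = np.random.default_rng(0)

def uniform_on_sphere(n_points, dim):
    # Normalizing standard Gaussian vectors yields points uniform on S^(dim-1).
    g = rng.standard_normal((n_points, dim))
    return g / np.linalg.norm(g, axis=1, keepdims=True)

eps = 0.1
for dim in (3, 30, 300, 3000):
    x1 = uniform_on_sphere(100_000, dim)[:, 0]   # a 1-Lipschitz observable
    print(f"dim={dim:5d}  P(|x_1| > {eps}) = {np.mean(np.abs(x1) > eps):.4f}")

The printed probabilities drop rapidly with the dimension, which is exactly the informal statement quoted above.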



Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in the case of abstraction from nature, some basic properties that are considered true starting points of t ...



Measure (mathematics)
In mathematics, the concept of a measure is a generalization and formalization of geometrical measures (length, area, volume) and other common notions, such as mass and probability of events. These seemingly distinct concepts have many similarities and can often be treated together in a single mathematical context. Measures are foundational in probability theory and integration theory, and can be generalized to assume negative values, as with electrical charge. Far-reaching generalizations of measure (such as spectral measures and projection-valued measures) are widely used in quantum physics and physics in general. The intuition behind this concept dates back to ancient Greece, when Archimedes tried to calculate the area of a circle. But it was not until the late 19th and early 20th centuries that measure theory became a branch of mathematics. The foundations of modern measure theory were laid in the works of Émile Borel, Henri Lebesgue, Nikolai Luzin, Johann Radon, C ...
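
For reference, the standard formal definition runs as follows (a worked statement added here; the symbol \Sigma for the \sigma-algebra is just this note's notation): a measure \mu on a \sigma-algebra \Sigma of subsets of a set X is a function \mu : \Sigma \to [0, \infty] such that :\mu(\varnothing) = 0 and, for every sequence A_1, A_2, \ldots \in \Sigma of pairwise disjoint sets, :\mu\!\left( \bigcup_{k=1}^{\infty} A_k \right) = \sum_{k=1}^{\infty} \mu(A_k). For example, the Lebesgue measure of an interval [a, b] \subset \mathbb{R} is its length b - a.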


Thermal Fluctuations
In statistical mechanics, thermal fluctuations are random deviations of an atomic system from its average state that occur in a system at equilibrium. In statistical mechanics they are often simply referred to as fluctuations. All thermal fluctuations become larger and more frequent as the temperature increases, and likewise they decrease as temperature approaches absolute zero. Thermal fluctuations are a basic manifestation of the temperature of systems: A system at nonzero temperature does not stay in its equilibrium microscopic state, but instead randomly samples all possible states, with probabilities given by the Boltzmann distribution. Thermal fluctuations generally affect all the degrees of freedom of a system: There can be random vibrations (phonons), random rotations (rotons), random electronic excitations, and so forth. Thermodynamic variables, such as pressure, temperature, or entropy, likewise undergo thermal fluctuations. For example, for a system that has an eq ...
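
The Boltzmann distribution mentioned above can be written out explicitly (a standard form, added for reference): a microstate with energy E_i is visited with probability :p_i = \frac{e^{-E_i/(k_B T)}}{\sum_j e^{-E_j/(k_B T)}}, where k_B is the Boltzmann constant. As T increases the exponential weights flatten and higher-energy states are sampled more often, which is why thermal fluctuations grow with temperature and die out as T approaches absolute zero.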


Canonical Ensemble
In statistical mechanics, a canonical ensemble is the statistical ensemble that represents the possible states of a mechanical system in thermal equilibrium with a heat bath at a fixed temperature. The system can exchange energy with the heat bath, so that the states of the system will differ in total energy. The principal thermodynamic variable of the canonical ensemble, determining the probability distribution of states, is the absolute temperature (symbol: T). The ensemble typically also depends on mechanical variables such as the number of particles in the system (symbol: N) and the system's volume (symbol: V), each of which influence the nature of the system's internal states. An ensemble with these three parameters is sometimes called the NVT ensemble. The canonical ensemble assigns a probability P to each distinct microstate given by the following exponential: :P = e^{(F - E)/(k T)}, where E is the total energy of the microstate, and k is the Boltzmann constant. The number F is the free e ...
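
One short step connects this exponential with the more familiar Boltzmann form (a standard identity, added for reference): writing Z = \sum_i e^{-E_i/(kT)} for the partition function and using F = -kT \ln Z, one has e^{F/(kT)} = 1/Z, so :P = e^{(F - E)/(kT)} = \frac{e^{-E/(kT)}}{Z}, which makes the normalization \sum_i P_i = 1 explicit.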


Microcanonical Ensemble
In statistical mechanics, the microcanonical ensemble is a statistical ensemble that represents the possible states of a mechanical system whose total energy is exactly specified. The system is assumed to be isolated in the sense that it cannot exchange energy or particles with its environment, so that (by conservation of energy) the energy of the system does not change with time. The primary macroscopic variables of the microcanonical ensemble are the total number of particles in the system (symbol: N), the system's volume (symbol: V), as well as the total energy in the system (symbol: E). Each of these is assumed to be constant in the ensemble. For this reason, the microcanonical ensemble is sometimes called the NVE ensemble. In simple terms, the microcanonical ensemble is defined by assigning an equal probability to every microstate whose energy falls within a range centered at E. All other microstates are given a probability of zero. Since the probabilities must add up to 1, the ...
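
In symbols (a standard convention, added for reference): if W denotes the number of microstates whose energy lies in the allowed window around E, then each such microstate has probability :P = \frac{1}{W}, every other microstate has P = 0, and the associated Boltzmann entropy is :S = k_B \ln W.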



Liouville Measure
In differential geometry, a subject of mathematics, a symplectic manifold is a smooth manifold M, equipped with a closed nondegenerate differential 2-form \omega, called the symplectic form. The study of symplectic manifolds is called symplectic geometry or symplectic topology. Symplectic manifolds arise naturally in abstract formulations of classical mechanics and analytical mechanics as the cotangent bundles of manifolds. For example, in the Hamiltonian formulation of classical mechanics, which provides one of the major motivations for the field, the set of all possible configurations of a system is modeled as a manifold, and this manifold's cotangent bundle describes the phase space of the system. Motivation: Symplectic manifolds arise from classical mechanics; in particular, they are a generalization of the phase space of a closed system. In the same way the Hamilton equations allow one to derive the time evolution of a system from a set of differential equations, ...
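
The Liouville measure named in the heading can be made explicit (a standard construction, added for reference; normalization conventions vary by author): on a 2n-dimensional symplectic manifold (M, \omega) it is the measure induced by the volume form :\frac{\omega^n}{n!} = \frac{1}{n!}\, \omega \wedge \cdots \wedge \omega \quad (n \text{ factors}), and in canonical coordinates (q_1, \ldots, q_n, p_1, \ldots, p_n), where \omega = \sum_i dq_i \wedge dp_i, this reduces to the ordinary phase-space volume dq_1\, dp_1 \cdots dq_n\, dp_n.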



Phase Space
In dynamical system theory, a phase space is a space in which all possible states of a system are represented, with each possible state corresponding to one unique point in the phase space. For mechanical systems, the phase space usually consists of all possible values of position and momentum variables. It is the outer product of direct space and reciprocal space. The concept of phase space was developed in the late 19th century by Ludwig Boltzmann, Henri Poincaré, and Josiah Willard Gibbs. Introduction In a phase space, every degree of freedom or parameter of the system is represented as an axis of a multidimensional space; a one-dimensional system is called a phase line, while a two-dimensional system is called a phase plane. For every possible state of the system or allowed combination of values of the system's parameters, a point is included in the multidimensional space. The system's evolving state over time traces a path (a phase-space trajectory for the s ...
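
A standard concrete example (added here for illustration): for a one-dimensional harmonic oscillator of mass m and spring constant k, the state is the pair (x, p), and conservation of energy :H(x, p) = \frac{p^2}{2m} + \frac{1}{2} k x^2 = E confines each trajectory to an ellipse in the (x, p) plane, so the phase-space portrait is a family of nested ellipses labelled by the energy E.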



Albert Einstein
Albert Einstein (14 March 1879 – 18 April 1955) was a German-born theoretical physicist, widely acknowledged to be one of the greatest and most influential physicists of all time. Einstein is best known for developing the theory of relativity, but he also made important contributions to the development of the theory of quantum mechanics. Relativity and quantum mechanics are the two pillars of modern physics. His mass–energy equivalence formula E = mc^2, which arises from relativity theory, has been dubbed "the world's most famous equation". His work is also known for its influence on the philosophy of science. He received the 1921 Nobel Prize in Physics "for his services to theoretical physics, and especially for his discovery of the law of the photoelectric effect", a pivotal step in the development of quantum theory. His intellectual achievements and originality resulted in "Einstein" becoming synonymous with "genius". In 1905, a year sometimes described as his ' ...



Josiah Willard Gibbs
Josiah Willard Gibbs (; February 11, 1839 – April 28, 1903) was an American scientist who made significant theoretical contributions to physics, chemistry, and mathematics. His work on the applications of thermodynamics was instrumental in transforming physical chemistry into a rigorous inductive science. Together with James Clerk Maxwell and Ludwig Boltzmann, he created statistical mechanics (a term that he coined), explaining the laws of thermodynamics as consequences of the statistical properties of ensembles of the possible states of a physical system composed of many particles. Gibbs also worked on the application of Maxwell's equations to problems in physical optics. As a mathematician, he invented modern vector calculus (independently of the British scientist Oliver Heaviside, who carried out similar work during the same period). In 1863, Yale awarded Gibbs the first American doctorate in engineering. After a three-year sojourn in Europe, Gibbs spent the rest of hi ...




Dvoretzky's Theorem
In mathematics, Dvoretzky's theorem is an important structural theorem about normed vector spaces proved by Aryeh Dvoretzky in the early 1960s, answering a question of Alexander Grothendieck. In essence, it says that every sufficiently high-dimensional normed vector space will have low-dimensional subspaces that are approximately Euclidean. Equivalently, every high-dimensional bounded symmetric convex set has low-dimensional sections that are approximately ellipsoids. A new proof found by Vitali Milman in the 1970s was one of the starting points for the development of asymptotic geometric analysis (also called ''asymptotic functional analysis'' or the ''local theory of Banach spaces''). Original formulations For every natural number ''k'' ∈ N and every ''ε'' > 0 there exists a natural number ''N''(''k'', ''ε'') ∈ N such that if (''X'', ‖·‖) is any normed space of dimension ''N''(''k'', ''ε''), t ...
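
The conclusion that is cut off above reads, in its usual formulation (added here for reference): there exist a subspace E \subset X with \dim E = k and a positive definite quadratic form Q on E such that :\sqrt{Q(x)} \;\leq\; \|x\| \;\leq\; (1 + \epsilon)\, \sqrt{Q(x)} \quad \text{for every } x \in E, i.e. the restriction of the norm to E is, up to the factor 1 + \epsilon, a Euclidean norm, so the unit ball of E is approximately an ellipsoid.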


Concentration Inequality
In probability theory, concentration inequalities provide bounds on how a random variable deviates from some value (typically, its expected value). The law of large numbers of classical probability theory states that sums of independent random variables are, under very mild conditions, close to their expectation with a large probability. Such sums are the most basic examples of random variables concentrated around their mean. Recent results show that such behavior is shared by other functions of independent random variables. Concentration inequalities can be sorted according to how much information about the random variable is needed in order to use them. Markov's inequality: Let X be a random variable that is non-negative (almost surely). Then, for every constant a > 0, : \Pr(X \geq a) \leq \frac{\operatorname{E}(X)}{a}. Note the following extension to Markov's inequality: if \Phi is a strictly increasing and non-negative function, then :\Pr(X \geq a) = \Pr(\Phi (X) \geq \Phi (a)) \leq \frac{\operatorname{E}(\Phi(X))}{\Phi(a)}. Cheb ...
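
A minimal numerical sanity check of the two bounds above (an added sketch, assuming NumPy; the exponential test variable is an arbitrary choice, not taken from the source):

import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=1_000_000)   # non-negative, E[X] = 1, Var[X] = 1
a = 3.0

# Markov's inequality: P(X >= a) <= E[X] / a
print("empirical P(X >= a)      :", np.mean(x >= a))
print("Markov bound E[X]/a      :", x.mean() / a)

# Chebyshev (Markov applied to Phi(t) = t^2 and the variable |X - E[X]|):
# P(|X - E[X]| >= a) <= Var[X] / a^2
print("empirical P(|X-EX| >= a) :", np.mean(np.abs(x - x.mean()) >= a))
print("Chebyshev bound Var/a^2  :", x.var() / a ** 2)

Both empirical frequencies come out well below the corresponding bounds, as the inequalities require.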


Spherical Measure
In mathematics — specifically, in geometric measure theory — spherical measure ''σ''''n'' is the "natural" Borel measure on the ''n''-sphere S''n''. Spherical measure is often normalized so that it is a probability measure on the sphere, i.e. so that ''σ''''n''(S''n'') = 1. Definition of spherical measure: There are several ways to define spherical measure. One way is to use the usual "round" or "arclength" metric ''ρ''''n'' on S''n''; that is, for points ''x'' and ''y'' in S''n'', ''ρ''''n''(''x'', ''y'') is defined to be the (Euclidean) angle that they subtend at the centre of the sphere (the origin of R''n''+1). Now construct ''n''-dimensional Hausdorff measure ''H''''n'' on the metric space (S''n'', ''ρ''''n'') and define :\sigma^n = \frac{H^n}{H^n(S^n)}. One could also have given S''n'' the metric that it inherits as a subspace of the Euclidean space R''n''+1; the same spherical measure results from this choice of metric. Ano ...
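
A concrete low-dimensional instance (added for illustration): on the circle S^1 the 1-dimensional Hausdorff measure of an arc is its arclength and H^1(S^1) = 2\pi, so :\sigma^1(A) = \frac{\operatorname{arclength}(A)}{2\pi}, which is indeed a probability measure, since \sigma^1(S^1) = 1.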