DTMC Games
In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable depends only on the value of the current variable, and not on any variables in the past. For instance, a machine may have two states, ''A'' and ''E''. When it is in state ''A'', there is a 40% chance of it moving to state ''E'' and a 60% chance of it remaining in state ''A''. When it is in state ''E'', there is a 70% chance of it moving to ''A'' and a 30% chance of it staying in ''E''. The sequence of states of the machine is a Markov chain. If we denote the chain by X_0, X_1, X_2, \ldots then X_0 is the state in which the machine starts and X_{10} is the random variable describing its state after 10 transitions. The process continues forever, indexed by the natural numbers. An example of a stochastic process which is not a Markov chain is the model of a machine which has states ''A'' and ''E'' and moves to ''A'' from either state with ...
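The two-state machine above is small enough to simulate directly. Below is a minimal Python sketch (the seed and step count are illustrative choices); the long-run fractions approach the stationary distribution (\pi_A, \pi_E) = (7/11, 4/11) \approx (0.636, 0.364):

```python
import random

# Transition probabilities of the two-state machine from the text:
# from A: 60% stay in A, 40% -> E; from E: 70% -> A, 30% stay in E.
P = {"A": {"A": 0.6, "E": 0.4},
     "E": {"A": 0.7, "E": 0.3}}

def step(state):
    # Draw the next state using only the current state (the Markov property).
    return "A" if random.random() < P[state]["A"] else "E"

random.seed(0)
state, counts = "A", {"A": 0, "E": 0}
for _ in range(100_000):
    state = step(state)
    counts[state] += 1

# Empirical long-run fractions, close to (7/11, 4/11).
print({s: c / 100_000 for s, c in counts.items()})
```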


Maximal Set
In recursion theory, the mathematical theory of computability, a maximal set is a coinfinite recursively enumerable subset ''A'' of the natural numbers such that for every further recursively enumerable subset ''B'' of the natural numbers, either ''B'' is cofinite or ''B'' is a finite variant of ''A'' or ''B'' is not a superset of ''A''. This gives an easy definition within the lattice of the recursively enumerable sets. Maximal sets have many interesting properties: they are simple, hypersimple, hyperhypersimple and r-maximal; the latter property says that every recursive set ''R'' contains either only finitely many elements of the complement of ''A'' or almost all elements of the complement of ''A''. There are r-maximal sets that are not maximal; some of them do not even have maximal supersets. Myhill (1956) asked whether maximal sets exist and Friedberg (1958) constructed one. Soare (1974) showed that the maximal sets form an orbit with respect to automorphisms of the recursi ...



Markov Chain Monte Carlo
In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution, one can construct a Markov chain whose elements' distribution approximates it – that is, the Markov chain's equilibrium distribution matches the target distribution. The more steps that are included, the more closely the distribution of the sample matches the actual desired distribution. Markov chain Monte Carlo methods are used to study probability distributions that are too complex or too high-dimensional to study with analytic techniques alone. Various algorithms exist for constructing such Markov chains, including the Metropolis–Hastings algorithm. General explanation Markov chain Monte Carlo methods create samples from a continuous random variable, with probability density proportional to a known function. These samples can be used to e ...
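As a sketch of the idea, the following Python snippet runs a random-walk Metropolis sampler (a special case of Metropolis–Hastings with a symmetric proposal) against an unnormalized standard-normal density; the target, step size, and sample count are illustrative choices, not part of the text above:

```python
import math, random

def target(x):
    # Unnormalized target density: a standard normal, up to a constant.
    return math.exp(-0.5 * x * x)

def metropolis(n_steps, step_size=1.0, x0=0.0, seed=0):
    """Random-walk Metropolis: a Markov chain whose equilibrium
    distribution is proportional to `target`."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, target(proposal) / target(x)).
        if rng.random() < target(proposal) / target(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(50_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"mean ~ {mean:.3f}, variance ~ {var:.3f}")  # close to 0 and 1
```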


Kolmogorov's Criterion
In probability theory, Kolmogorov's criterion, named after Andrey Kolmogorov, is a theorem giving a necessary and sufficient condition for a Markov chain or continuous-time Markov chain to be stochastically identical to its time-reversed version. Discrete-time Markov chains The theorem states that an irreducible, positive recurrent, aperiodic Markov chain with transition matrix ''P'' is reversible if and only if its transition probabilities satisfy : p_{j_1 j_2} p_{j_2 j_3} \cdots p_{j_{n-1} j_n} p_{j_n j_1} = p_{j_1 j_n} p_{j_n j_{n-1}} \cdots p_{j_3 j_2} p_{j_2 j_1} for all finite sequences of states : j_1, j_2, \ldots, j_n \in S . Here p_{ij} are components of the transition matrix ''P'', and ''S'' is the state space of the chain. That is, the chain-multiplication along any cycle is the same forwards and backwards. Example Consider this figure depicting a section of a Markov chain with states ''i'', ''j'', ''k'' and ''l'' and the corresponding transition probabilities. Here Kolmogorov's criterion implies that the product of probabilities when t ...
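The criterion is easy to check numerically. The sketch below builds a chain that is reversible by construction (a random walk on a symmetrically weighted graph, so detailed balance holds) and verifies that the product of transition probabilities around every 3-cycle equals the product along the reversed cycle; the edge weights are illustrative:

```python
import itertools

# A reversible chain: random walk on a weighted graph. Detailed balance
# holds with pi_i proportional to the total weight at node i, so
# Kolmogorov's criterion must hold for every cycle.
w = [[0, 1, 2],
     [1, 0, 3],
     [2, 3, 0]]   # symmetric edge weights (illustrative values)
P = [[w[i][j] / sum(w[i]) for j in range(3)] for i in range(3)]

def cycle_product(P, cycle):
    # Product of transition probabilities along the closed path.
    prod = 1.0
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        prod *= P[a][b]
    return prod

for cycle in itertools.permutations(range(3)):
    fwd = cycle_product(P, list(cycle))
    bwd = cycle_product(P, list(cycle)[::-1])
    print(cycle, fwd, bwd, abs(fwd - bwd) < 1e-12)
```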


Balance Equation
In probability theory, a balance equation is an equation that describes the probability flux associated with a Markov chain in and out of states or a set of states. Global balance The global balance equations (also known as full balance equations) are a set of equations that characterize the equilibrium distribution (or any stationary distribution) of a Markov chain, when such a distribution exists. For a continuous time Markov chain with state space \mathcal{S}, transition rate from state i to j given by q_{ij} and equilibrium distribution given by \pi, the global balance equations are given by ::0 = \sum_{j \in S} \pi_j q_{ji} (with the convention q_{ii} = -\sum_{j \neq i} q_{ij}), or equivalently :: \pi_i \sum_{j \in S,\, j \neq i} q_{ij} = \sum_{j \in S,\, j \neq i} \pi_j q_{ji} for all i \in S. Here \pi_i q_{ij} represents the probability flux ...
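A minimal numeric sketch, assuming NumPy and an illustrative 3-state rate matrix: solve \pi Q = 0 together with \sum_i \pi_i = 1, then confirm that the flux out of each state equals the flux in:

```python
import numpy as np

# Illustrative transition-rate matrix Q of a 3-state CTMC:
# off-diagonal q_ij >= 0, each row sums to zero.
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -1.5,  0.5],
              [ 1.0,  1.0, -2.0]])

# Global balance: pi Q = 0, with the normalization sum(pi) = 1.
# Stack the normalization row onto Q^T and solve the least-squares system.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)

# Check each balance equation: flux out of i equals flux into i.
for i in range(3):
    out_flux = pi[i] * (Q[i].sum() - Q[i, i])   # pi_i * sum_{j != i} q_ij
    in_flux = sum(pi[j] * Q[j, i] for j in range(3) if j != i)
    assert abs(out_flux - in_flux) < 1e-10
```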




Detailed Balance
The principle of detailed balance can be used in kinetic systems which are decomposed into elementary processes (collisions, or steps, or elementary reactions). It states that at equilibrium, each elementary process is in equilibrium with its reverse process. History The principle of detailed balance was explicitly introduced for collisions by Ludwig Boltzmann. In 1872, he proved his H-theorem using this principle. Boltzmann, L. (1964), ''Lectures on Gas Theory'', Berkeley, CA, USA: U. of California Press. The arguments in favor of this property are founded upon microscopic reversibility. Tolman, R. C. (1938). ''The Principles of Statistical Mechanics''. Oxford University Press, London, UK. Five years before Boltzmann, James Clerk Maxwell used the principle of detailed balance for gas kinetics with reference to the principle of sufficient reason. He compared the idea of detailed balance with other types of balancing ...
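In the Markov-chain setting of the other entries on this page, detailed balance is the condition \pi_i p_{ij} = \pi_j p_{ji}: at equilibrium, each elementary transition is balanced by its reverse. A minimal numeric check, reusing the two-state machine from the DTMC entry above (any two-state chain satisfies detailed balance):

```python
# Detailed balance for a Markov chain: pi_i * p_ij == pi_j * p_ji,
# i.e. every elementary transition is balanced by its reverse.
P = {("A", "A"): 0.6, ("A", "E"): 0.4,
     ("E", "A"): 0.7, ("E", "E"): 0.3}
pi = {"A": 7 / 11, "E": 4 / 11}   # stationary distribution of this chain

flux_AE = pi["A"] * P[("A", "E")]   # probability flux A -> E
flux_EA = pi["E"] * P[("E", "A")]   # probability flux E -> A
print(flux_AE, flux_EA, abs(flux_AE - flux_EA) < 1e-12)
```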



Absorbing Markov Chain
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that, once entered, cannot be left. Like general Markov chains, there can be continuous-time absorbing Markov chains with an infinite state space. However, this article concentrates on the discrete-time discrete-state-space case. Formal definition A Markov chain is an absorbing chain if (1) there is at least one absorbing state and (2) it is possible to go from any state to at least one absorbing state in a finite number of steps. In an absorbing Markov chain, a state that is not absorbing is called transient. Canonical form Let an absorbing Markov chain with transition matrix ''P'' have ''t'' transient states and ''r'' absorbing states. Unlike a typical transition matrix, the rows of ''P'' represent sources, while columns represent destinations. Then : P = \begin{pmatrix} Q & R \\ \mathbf{0} & I_r \end{pmatrix}, where ''Q'' is a ''t'' ...
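A short worked example, assuming NumPy: the classic random walk on \{0, 1, 2, 3, 4\} with absorbing endpoints (an illustrative choice). The fundamental matrix N = (I - Q)^{-1} gives expected visit counts, N\mathbf{1} the expected number of steps to absorption, and NR the absorption probabilities:

```python
import numpy as np

# Random walk on 0..4 with absorbing ends 0 and 4.
# Canonical form P = [[Q, R], [0, I]]: transient states 1, 2, 3 first.
Q = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])   # transient -> transient
R = np.array([[0.5, 0.0],
              [0.0, 0.0],
              [0.0, 0.5]])        # transient -> absorbing (states 0, 4)

# Fundamental matrix N = (I - Q)^-1: N[i, j] is the expected number of
# visits to transient state j when starting from transient state i.
N = np.linalg.inv(np.eye(3) - Q)
t = N @ np.ones(3)   # expected steps until absorption from each start
B = N @ R            # absorption probabilities into each absorbing state
print(t)   # [3, 4, 3]
print(B)   # e.g. from the middle state: 0.5 / 0.5 into either end
```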



Expected Value
In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average. Informally, the expected value is the mean of the possible values a random variable can take, weighted by the probability of those outcomes. Since it is obtained through arithmetic, the expected value sometimes may not even be included in the sample data set; it is not the value you would expect to get in reality. The expected value of a random variable with a finite number of outcomes is a weighted average of all possible outcomes. In the case of a continuum of possible outcomes, the expectation is defined by integration. In the axiomatic foundation for probability provided by measure theory, the expectation is given by Lebesgue integration. The expected value of a random variable is often denoted by E(X), E[X], or EX, with a ...
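A small illustration of the weighted average, using a fair six-sided die (an illustrative example; exact arithmetic with fractions makes the point that the result need not itself be a possible outcome):

```python
from fractions import Fraction

# Expected value as a probability-weighted average: a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
prob = Fraction(1, 6)          # each outcome is equally likely
E = sum(prob * x for x in outcomes)
print(E)   # 7/2 -- note that 3.5 is not itself a possible roll
```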


If And Only If
In logic and related fields such as mathematics and philosophy, "if and only if" (often shortened as "iff") is paraphrased by the biconditional, a logical connective between statements. The biconditional is true in two cases, where either both statements are true or both are false. The connective is biconditional (a statement of material equivalence), and can be likened to the standard material conditional ("only if", equal to "if ... then") combined with its reverse ("if"); hence the name. The result is that the truth of either one of the connected statements requires the truth of the other (i.e. either both statements are true, or both are false), though it is controversial whether the connective thus defined is properly rendered by the English "if and only if"—with its pre-existing meaning. For example, ''P if and only if Q'' means that ''P'' is true whenever ''Q'' is true, and the only case in which ''P'' is true is if ''Q'' is also true, whereas in the case of ''P if Q ...
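A biconditional is mechanically checkable; the tiny Python sketch below tabulates P \leftrightarrow Q and confirms it agrees with the conjunction of the two conditionals (the variable names are illustrative):

```python
# Truth table for "P if and only if Q": true exactly when P and Q
# have the same truth value.
for P in (True, False):
    for Q in (True, False):
        iff = (P == Q)                                    # biconditional
        both = (not P or Q) and (not Q or P)              # (P -> Q) and (Q -> P)
        print(P, Q, iff, iff == both)
```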


Hitting Time
In the study of stochastic processes in mathematics, a hitting time (or first hit time) is the first time at which a given process "hits" a given subset of the state space. Exit times and return times are also examples of hitting times. Definitions Let T be an ordered index set such as the natural numbers \N, the non-negative real numbers [0, +\infty), or a subset of these; elements t \in T can be thought of as "times". Given a probability space and a measurable state space S, let X : \Omega \times T \to S be a stochastic process, and let A be a measurable subset of the state space S. Then the first hit time \tau_A : \Omega \to [0, +\infty] is the random variable defined by :\tau_A (\omega) := \inf \{ t \in T : X_t(\omega) \in A \}. The first exit time (from A) is defined to be the first hit time for S \setminus A, the complement of A in S. Confusingly, this is also often denoted by \tau_A. The first return time is defined to be the first hit time for the singleton set ...
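A hitting time is straightforward to estimate by simulation. The sketch below (a simple random walk, with illustrative target sets and horizon) returns the first step at which the walk enters a given set, with None standing in for \tau_A = +\infty within the horizon:

```python
import random

def first_hit_time(target_set, max_steps=10_000, seed=0):
    """First time a simple random walk started at 0 enters `target_set`,
    or None if it does not do so within max_steps."""
    rng = random.Random(seed)
    x = 0
    for t in range(1, max_steps + 1):
        x += rng.choice((-1, 1))
        if x in target_set:
            return t
    return None   # stands in for tau_A = +infinity within the horizon

# Hitting time of the set A = {3}; since counting starts after the first
# step, first_hit_time({0}) is the first return time to the start state.
print(first_hit_time({3}))
print(first_hit_time({0}))
```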



Greatest Common Divisor
In mathematics, the greatest common divisor (GCD), also known as greatest common factor (GCF), of two or more integers, which are not all zero, is the largest positive integer that divides each of the integers. For two integers x, y, the greatest common divisor of x and y is denoted \gcd (x,y). For example, the GCD of 8 and 12 is 4, that is, \gcd(8, 12) = 4. In the name "greatest common divisor", the adjective "greatest" may be replaced by "highest", and the word "divisor" may be replaced by "factor", so that other names include highest common factor, etc. Historically, other names for the same concept have included greatest common measure. This notion can be extended to polynomials (see ''Polynomial greatest common divisor'') and other commutative rings (see below). Overview Definition The ''greatest common divisor'' (GCD) of integers a and b, at least one of which is nonzero, is the greatest positive integer d such that d is a divisor of both a and b; that is, there are integers e and f such that ...
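The standard way to compute a GCD is the Euclidean algorithm; here is a minimal implementation (Python's standard library also provides math.gcd):

```python
def gcd(x, y):
    """Greatest common divisor via the Euclidean algorithm:
    gcd(x, y) = gcd(y, x mod y), ending when the remainder is 0."""
    x, y = abs(x), abs(y)
    while y:
        x, y = y, x % y
    return x

print(gcd(8, 12))    # 4, matching the example in the text
print(gcd(54, 24))   # 6
```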


Periodic Function
A periodic function, also called a periodic waveform (or simply periodic wave), is a function that repeats its values at regular intervals or periods. The repeatable part of the function or waveform is called a ''cycle''. For example, the trigonometric functions, which repeat at intervals of 2\pi radians, are periodic functions. Periodic functions are used throughout science to describe oscillations, waves, and other phenomena that exhibit periodicity. Any function that is not periodic is called ''aperiodic''. Definition A function f is said to be periodic if, for some nonzero constant P, it is the case that :f(x+P) = f(x) for all values of x in the domain. A nonzero constant P for which this is the case is called a period of the function. If there exists a least positive constant P with this property, it is called the fundamental period (also primitive period, basic period, or prime period). Often, "the" period of a function is used to mean its fundamental period. A funct ...
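Periodicity can be sanity-checked numerically on a grid, though sampling finitely many points is evidence rather than proof. A small sketch with illustrative grid and tolerance, using \sin and its period 2\pi:

```python
import math

def is_period(f, P, samples=1000, span=20.0, tol=1e-9):
    """Numerically check f(x + P) == f(x) on a grid (a sanity check,
    not a proof: only finitely many points are sampled)."""
    return all(
        abs(f(x + P) - f(x)) < tol
        for x in (i * span / samples for i in range(samples))
    )

print(is_period(math.sin, 2 * math.pi))   # True: 2*pi is a period
print(is_period(math.sin, math.pi))       # False: pi is not a period
```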