Voter Model
In the mathematical theory of probability, the voter model is an interacting particle system introduced by Richard A. Holley and Thomas M. Liggett in 1975. One can imagine a "voter" at each point of a connected graph, where the connections indicate that some form of interaction exists between a pair of voters (nodes). The opinion of any given voter on some issue changes at random times under the influence of the opinions of its neighbours. A voter's opinion at any given time takes one of two values, labelled 0 and 1. At random times, a random individual is selected, and that voter's opinion is updated according to a stochastic rule: one of the chosen voter's neighbours is selected according to a given set of probabilities, and that neighbour's opinion is transferred to the chosen voter. An alternative interpretation is in terms of spatial conflict. Suppose two nations control the areas (sets of nodes) labelled 0 or 1. A flip from 0 to 1 at a given loca ...
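A minimal simulation sketch in Python, assuming a discrete-time caricature of the continuous-time dynamics on an undirected graph given as an adjacency list (the names voter_step and neighbors are illustrative, not part of the model's definition):

    import random

    def voter_step(opinions, neighbors):
        # One update of a discrete-time caricature of the dynamics: pick a
        # uniformly random voter, then copy the opinion of one of its
        # uniformly chosen neighbours.
        v = random.choice(list(opinions))
        u = random.choice(neighbors[v])
        opinions[v] = opinions[u]

    # Example: a cycle of 10 voters with random initial opinions in {0, 1}.
    n = 10
    neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
    opinions = {i: random.randint(0, 1) for i in range(n)}
    for _ in range(2000):
        voter_step(opinions, neighbors)
    print(opinions)  # on a finite connected graph, consensus is reached almost surely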


Probability
Probability is a branch of mathematics and statistics concerning events and numerical descriptions of how likely they are to occur. The probability of an event is a number between 0 and 1; the larger the probability, the more likely the event is to occur (''Kendall's Advanced Theory of Statistics, Volume 1: Distribution Theory'', Alan Stuart and Keith Ord, 6th ed., 2009; William Feller, ''An Introduction to Probability Theory and Its Applications'', vol. 1, 3rd ed., Wiley, 1968). This number is often expressed as a percentage (%), ranging from 0% to 100%. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes ("heads" and "tails") are equally probable; the probability of "heads" equals the probability of "tails", and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2 (which could also be written as 0.5 or 50%). These concepts have been given an axiomatic mathematical formaliza ...
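A quick numerical illustration of the fair-coin example above, as a sketch in Python (the observed frequency varies from run to run):

    import random

    # Empirical check of the fair-coin example: the observed frequency of
    # "heads" should be close to the probability 1/2.
    flips = [random.choice(["heads", "tails"]) for _ in range(100000)]
    print(flips.count("heads") / len(flips))  # approximately 0.5, i.e. 50%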


Almost Surely
In probability theory, an event is said to happen almost surely (sometimes abbreviated as a.s.) if it happens with probability 1 (with respect to the probability measure). In other words, the set of outcomes on which the event does not occur has probability 0, even though the set might not be empty. The concept is analogous to the concept of "almost everywhere" in measure theory. In probability experiments on a finite sample space with a non-zero probability for each outcome, there is no difference between ''almost surely'' and ''surely'' (since having a probability of 1 entails including all the sample points); however, this distinction becomes important when the sample space is an infinite set, because an infinite set can have non-empty subsets of probability 0. Some examples of the use of this concept include the strong and uniform versions of the law of large numbers, the continuity of the paths of Brownian motion, and the infinite monkey theorem. The terms almost certai ...
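A standard illustration, not taken from the excerpt above: let X be uniformly distributed on [0, 1]. The event \{X \neq 1/2\} occurs almost surely, since
:P(X \neq 1/2) = 1 - P(X = 1/2) = 1 - 0 = 1,
yet it does not occur surely, because the excluded outcome 1/2 still belongs to the sample space.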


Contact Process (mathematics)
The contact process is a stochastic process used to model population growth on the set of sites S of a graph, in which occupied sites become vacant at a constant rate, while vacant sites become occupied at a rate proportional to the number of occupied neighboring sites. Therefore, if we denote the proportionality constant by \lambda, each site remains occupied for a random time period which is exponentially distributed with parameter 1, and places descendants at every vacant neighboring site at the event times of a Poisson process with parameter \lambda during this period. All of these processes are independent of one another and of the random periods of time for which sites remain occupied. The contact process can also be interpreted as a model for the spread of an infection, by thinking of the particles as a bacterium spreading over individuals positioned at the sites of S: occupied sites correspond to infected individuals, whereas vacant sites correspond to healthy ones. The main quantity of interest is th ...
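A continuous-time (Gillespie-style) simulation sketch in Python; the choice of graph (a cycle) and the name contact_process are illustrative assumptions rather than part of the definition above:

    import random

    def contact_process(neighbors, occupied, lam, t_max):
        # Occupied sites become vacant at rate 1 and attempt a birth onto each
        # neighbouring site at rate lam; births onto occupied sites have no effect.
        t, occupied = 0.0, set(occupied)
        while t < t_max and occupied:
            sites = list(occupied)
            rates = [1 + lam * len(neighbors[x]) for x in sites]   # total event rate per site
            t += random.expovariate(sum(rates))                    # waiting time to the next event
            x = random.choices(sites, weights=rates)[0]            # site where the event happens
            if random.random() < 1 / (1 + lam * len(neighbors[x])):
                occupied.remove(x)                                 # death event (rate 1)
            else:
                occupied.add(random.choice(neighbors[x]))          # birth attempt (rate lam per neighbour)
        return occupied

    # Example: a cycle of 50 sites, started from a single occupied site.
    n = 50
    neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
    print(len(contact_process(neighbors, {0}, lam=2.0, t_max=10.0)))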


Sequential Dynamical System
Sequential dynamical systems (SDSs) are a class of graph dynamical systems. They are discrete dynamical systems which generalize many aspects of, for example, classical cellular automata, and they provide a framework for studying asynchronous processes over graphs. The analysis of SDSs uses techniques from combinatorics, abstract algebra, graph theory, dynamical systems and probability theory.

Definition

An SDS is constructed from the following components:
* A finite ''graph'' ''Y'' with vertex set v[''Y''] = {1, 2, ..., ''n''}. Depending on the context the graph can be directed or undirected.
* A state ''xi'' for each vertex ''i'' of ''Y'', taken from a finite set ''K''. The ''system state'' is the ''n''-tuple ''x'' = (''x''1, ''x''2, ... , ''xn''), and ''x''[''i''] is the tuple consisting of the states associated to the vertices in the 1-neighborhood of ''i'' in ''Y'' (in some fixed order).
* A ''vertex function'' ''fi'' for each vertex ''i''. The vertex function maps the state of vertex ''i'' at tim ...
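A small sketch in Python of the induced system map, under the common convention that the vertex functions are applied one after another in a fixed update order, each seeing the current states of its closed 1-neighbourhood; the name sds_map and the NOR example are illustrative choices:

    def sds_map(neighbors, vertex_functions, order, state):
        # Apply the local functions sequentially in the given order; each
        # vertex function sees the current states of its closed 1-neighbourhood,
        # including updates already made earlier in the sweep.
        state = dict(state)
        for i in order:
            nbhd = [i] + sorted(neighbors[i])    # fixed order for the neighbourhood
            state[i] = vertex_functions[i]([state[j] for j in nbhd])
        return state

    # Example: NOR functions over K = {0, 1} on a 4-cycle, update order (0, 1, 2, 3).
    n = 4
    neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
    nor = lambda vals: int(not any(vals))
    f = {i: nor for i in range(n)}
    print(sds_map(neighbors, f, order=(0, 1, 2, 3), state={i: 0 for i in range(n)}))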


Stochastic Cellular Automata
Stochastic cellular automata, also called probabilistic cellular automata (PCA), random cellular automata or locally interacting Markov chains, are an important extension of cellular automata. Cellular automata are discrete-time dynamical systems of interacting entities whose states are discrete. The state of the collection of entities is updated at each discrete time step according to some simple homogeneous rule, and all entities' states are updated in parallel (synchronously). Stochastic cellular automata are CA whose updating rule is stochastic, meaning that the entities' new states are chosen according to some probability distributions; the result is a discrete-time random dynamical system. Through the spatial interaction between the entities, and despite the simplicity of the updating rules, complex behaviour such as self-organization may emerge. As a mathematical object, a PCA may be considered, in the framework of stochastic processes, as an interacting particle system in discrete time. See for a more detai ...
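A sketch in Python of one synchronous update of a binary stochastic cellular automaton on a ring; the specific rule (noisy local majority with flip probability p) is an illustrative choice, not one prescribed by the text:

    import random

    def pca_step(config, p=0.1):
        # Synchronous update of a binary stochastic CA on a ring: each cell looks
        # at itself and its two neighbours, takes the local majority, and flips
        # that value with small probability p. All cells update in parallel.
        n = len(config)
        new = []
        for i in range(n):
            majority = int(config[i - 1] + config[i] + config[(i + 1) % n] >= 2)
            new.append(majority if random.random() > p else 1 - majority)
        return new

    config = [random.randint(0, 1) for _ in range(20)]
    for _ in range(5):
        config = pca_step(config)
    print(config)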


Renewal Theory
Renewal theory is the branch of probability theory that generalizes the Poisson process for arbitrary holding times. Instead of exponentially distributed holding times, a renewal process may have any independent and identically distributed (IID) holding times that have finite expectation. A renewal-reward process additionally has a random sequence of rewards incurred at each holding time, which are IID but need not be independent of the holding times. A renewal process has asymptotic properties analogous to the strong law of large numbers and central limit theorem. The renewal function m(t) (expected number of arrivals) and reward function g(t) (expected reward value) are of key importance in renewal theory. The renewal function satisfies a recursive integral equation, the renewal equation. The key renewal equation gives the limiting value of the convolution of m'(t) with a suitable non-negative function. The superposition of renewal processes can be studied as a special case of M ...
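In the common formulation, with F the distribution function of a single holding time and \mu its finite mean, the renewal equation mentioned above reads
:m(t) = F(t) + \int_0^t m(t-s)\, dF(s),
and the elementary renewal theorem states that m(t)/t \to 1/\mu as t \to \infty.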


Rate Function
In mathematics, and specifically in large deviations theory, a rate function is a function used to quantify the probabilities of rare events. Such functions are used to formulate large deviation principles. A large deviation principle quantifies the asymptotic probability of rare events for a sequence of probability measures. A rate function is also called a Cramér function, after the Swedish probabilist Harald Cramér.

Definitions

Rate function: An extended real-valued function I : X \to [0, +\infty] defined on a Hausdorff topological space X is said to be a rate function if it is not identically +\infty and is lower semi-continuous, ''i.e.'' all the sub-level sets
:\{ x \in X : I(x) \leq c \} \mbox{ for } c \geq 0
are closed in X. If, furthermore, these sets are compact, then I is said to be a good rate function. A family of probability measures (\mu_\delta)_{\delta > 0} on X is said to satisfy the large deviation principle with rate function I : X \to [0, +\infty) (and rate 1/\delta) if, for every closed set F \subseteq X a ...
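For completeness (the excerpt above is cut off mid-definition), the standard upper and lower bounds of the large deviation principle are: for every closed set F \subseteq X and every open set G \subseteq X,
:\limsup_{\delta \downarrow 0} \delta \log \mu_\delta(F) \leq - \inf_{x \in F} I(x), \qquad \liminf_{\delta \downarrow 0} \delta \log \mu_\delta(G) \geq - \inf_{x \in G} I(x).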


Borel–Cantelli Lemma
In probability theory, the Borel–Cantelli lemma is a theorem about sequences of events. In general, it is a result in measure theory. It is named after Émile Borel and Francesco Paolo Cantelli, who stated the lemma in the first decades of the 20th century. A related result, sometimes called the second Borel–Cantelli lemma, is a partial converse of the first Borel–Cantelli lemma. The lemma states that, under certain conditions, an event will have probability of either zero or one. Accordingly, it is the best-known of a class of similar theorems, known as zero-one laws. Other examples include Kolmogorov's zero–one law and the Hewitt–Savage zero–one law.

Statement of lemma for probability spaces

Let ''E''1, ''E''2, ... be a sequence of events in some probability space. The Borel–Cantelli lemma states: if the sum of the probabilities of the events is finite,
:\sum_{n=1}^{\infty} \Pr(E_n) < \infty,
then the probability that infinitely many of them occur is 0, that is,
:\Pr\Big(\limsup_{n\to\infty} E_n\Big) = 0.
Here, "lim sup" denotes the limit supremum of the sequence of events. That is, lim sup ''E''''n'' is the outcome that infinitely many of ...
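For reference, the partial converse mentioned above (the second Borel–Cantelli lemma) states that if the events ''E''''n'' are independent and the sum of their probabilities diverges, then with probability 1 infinitely many of them occur:
:\sum_{n=1}^{\infty} \Pr(E_n) = \infty \ \text{ and } (E_n) \text{ independent} \quad \Longrightarrow \quad \Pr\Big(\limsup_{n\to\infty} E_n\Big) = 1.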


Chebyshev's Inequality
In probability theory, Chebyshev's inequality (also called the Bienaymé–Chebyshev inequality) provides an upper bound on the probability that a random variable (with finite variance) deviates from its mean. More specifically, the probability that a random variable deviates from its mean by more than k\sigma is at most 1/k^2, where k is any positive constant and \sigma is the standard deviation (the square root of the variance). In statistics, the rule is often called Chebyshev's theorem and concerns the range of standard deviations around the mean. The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers. Its practical usage is similar to the 68–95–99.7 rule, which applies only to normal distributions; Chebyshev's inequality is more general, stating that a minimum of just 75% of values must lie within two standard deviations of the ...
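In symbols, the bound described above is
:\Pr(|X - \mu| \geq k\sigma) \leq \frac{1}{k^2},
which for k = 2 gives \Pr(|X - \mu| \geq 2\sigma) \leq 1/4, matching the figure of at least 75% of values within two standard deviations quoted above.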


Invariant Measure
In mathematics, an invariant measure is a measure that is preserved by some function. The function may be a geometric transformation. For example, circular angle is invariant under rotation, hyperbolic angle is invariant under squeeze mapping, and a difference of slopes is invariant under shear mapping. Ergodic theory is the study of invariant measures in dynamical systems. The Krylov–Bogolyubov theorem proves the existence of invariant measures under certain conditions on the function and space under consideration.

Definition

Let (X, \Sigma) be a measurable space and let f : X \to X be a measurable function from X to itself. A measure \mu on (X, \Sigma) is said to be invariant under f if, for every measurable set A in \Sigma,
:\mu\left(f^{-1}(A)\right) = \mu(A).
In terms of the pushforward measure, this states that f_*(\mu) = \mu. The collection of measures (usually probability measures) on X that are invariant under f is sometimes denoted M_f(X). The collection of ergod ...
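A small finite-space sketch in Python (the helper name is_invariant is illustrative): on a finite set where every subset is measurable, invariance can be checked on singletons, since \mu(f^{-1}(\{y\})) = \mu(\{y\}) for every point y implies \mu(f^{-1}(A)) = \mu(A) for every set A.

    from collections import defaultdict

    def is_invariant(mu, f, tol=1e-12):
        # mu: dict mapping each point to its mass; f: dict describing a map X -> X.
        # Compute the pushforward f_*(mu) and compare it with mu pointwise.
        pushforward = defaultdict(float)
        for x, mass in mu.items():
            pushforward[f[x]] += mass
        return all(abs(pushforward[y] - mu[y]) < tol for y in mu)

    # Example: the uniform measure on {0, 1, 2, 3} is invariant under the cyclic shift.
    mu = {x: 0.25 for x in range(4)}
    shift = {x: (x + 1) % 4 for x in range(4)}
    print(is_invariant(mu, shift))  # True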


Interacting Particle System
In probability theory, an interacting particle system (IPS) is a stochastic process (X(t))_{t \geq 0} on some configuration space \Omega = S^G given by a site space, a countably infinite graph G, and a local state space, a compact metric space S. More precisely, IPS are continuous-time Markov jump processes describing the collective behavior of stochastically interacting components. IPS are the continuous-time analogue of stochastic cellular automata. Among the main examples are the voter model, the contact process, the asymmetric simple exclusion process (ASEP), the Glauber dynamics and, in particular, the stochastic Ising model. IPS are usually defined via their Markov generator, which gives rise to a unique Markov process by way of Markov semigroups and the Hille–Yosida theorem. The generator, in turn, is given via so-called transition rates c_\Lambda(\eta,\xi) > 0, where \Lambda \subset G is a finite set of sites and \eta, \xi \in \Omega with \eta_i = \xi_i for all i \notin \Lambda. The rates ...
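As a sketch of the standard formulation, in the notation above, the generator acts on suitable test functions f : \Omega \to \mathbb{R} as
:(Lf)(\eta) = \sum_{\Lambda \subset G \text{ finite}} \; \sum_{\xi :\, \xi_i = \eta_i \ \forall i \notin \Lambda} c_\Lambda(\eta, \xi) \, \bigl[ f(\xi) - f(\eta) \bigr],
i.e. the process jumps from \eta to \xi, changing the configuration only on the finite set \Lambda, at rate c_\Lambda(\eta, \xi).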


Ergodic Process
In physics, statistics, econometrics and signal processing, a stochastic process is said to be in an ergodic regime if an observable's ensemble average equals the time average. In this regime, any collection of random samples from the process must represent the average statistical properties of the entire regime. Conversely, a regime of a process that is not ergodic is said to be a non-ergodic regime. A regime refers to a time window of a process over which the ergodicity measure is applied.

Specific definitions

One can discuss the ergodicity of various statistics of a stochastic process. For example, a wide-sense stationary process X(t) has constant mean
:\mu_X = E[X(t)]
and autocovariance
:r_X(\tau) = E[(X(t)-\mu_X)(X(t+\tau)-\mu_X)]
that depends only on the lag \tau and not on time t. The properties \mu_X and r_X(\tau) are ''ensemble averages'' (calculated over all possible sample functions X), not time averages. The process X(t) is said to be mean-ergodic (Papoulis, p. 428) or m ...
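For reference (the excerpt above is cut off at this point), the usual requirement for X(t) to be mean-ergodic in the mean-square sense is that the time average over [0, T] converges to the ensemble mean \mu_X:
:\lim_{T \to \infty} E\left[ \left( \frac{1}{T} \int_0^T X(t)\, dt - \mu_X \right)^{2} \right] = 0.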