




Feller Process
In probability theory relating to stochastic processes, a Feller process is a particular kind of Markov process.

Definitions

Let X be a locally compact Hausdorff space with a countable base. Let C_0(X) denote the space of all real-valued continuous functions on X that vanish at infinity, equipped with the sup-norm ||f||. From analysis, we know that C_0(X) with the sup norm is a Banach space. A Feller semigroup on C_0(X) is a collection (T_t)_{t ≥ 0} of positive linear maps from C_0(X) to itself such that
* ||T_t f|| ≤ ||f|| for all t ≥ 0 and f in C_0(X), i.e., it is a contraction (in the weak sense);
* the semigroup property: T_{t+s} = T_t \circ T_s for all s, t ≥ 0;
* lim_{t → 0} ||T_t f − f|| = 0 for every f in C_0(X ...
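A standard concrete instance (stated here from general knowledge as a sketch, not taken from the truncated text above) is the heat semigroup of one-dimensional Brownian motion, which is a Feller semigroup on C_0(\mathbb R):

    % Heat semigroup of Brownian motion on X = \mathbb{R} (standard example, sketch only).
    (T_t f)(x) = \mathbb{E}[f(x + W_t)]
               = \int_{\mathbb{R}} \frac{1}{\sqrt{2\pi t}} e^{-(y-x)^2/(2t)} f(y)\,\mathrm{d}y,
      \qquad t > 0,\ f \in C_0(\mathbb{R}), \qquad T_0 = \mathrm{id}.
    % Positivity and the contraction bound ||T_t f|| <= ||f|| hold because the Gaussian
    % kernel is nonnegative and integrates to 1; T_{t+s} = T_t \circ T_s is the
    % Chapman-Kolmogorov equation; ||T_t f - f|| -> 0 as t -> 0 by uniform continuity of f.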



Probability Theory
Probability theory or probability calculus is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms of probability. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is no ...
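As a minimal illustration of these notions (a standard textbook example, not drawn from the truncated text), a single roll of a fair die can be modelled as a probability space:

    % Probability space for one fair die roll.
    \Omega = \{1, 2, 3, 4, 5, 6\}, \qquad \mathcal F = 2^{\Omega}, \qquad
    P(A) = \frac{|A|}{6} \ \text{for } A \in \mathcal F.
    % The event "an even number is rolled" is A = \{2, 4, 6\}, with P(A) = 3/6 = 1/2.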


Resolvent Formalism
In mathematics, the resolvent formalism is a technique for applying concepts from complex analysis to the study of the spectrum of operators on Banach spaces and more general spaces. Formal justification for the manipulations can be found in the framework of holomorphic functional calculus. The resolvent captures the spectral properties of an operator in the analytic structure of the resolvent function. Given an operator A, the resolvent may be defined as
: R(z;A) = (A - zI)^{-1}.
Among other uses, the resolvent may be used to solve inhomogeneous Fredholm integral equations; a commonly used approach is a series solution, the Liouville–Neumann series. The resolvent of A can be used to directly obtain information about the spectral decomposition of A. For example, suppose \lambda is an isolated eigenvalue in the spectrum of A. That is, suppose there exists a simple closed curve C_\lambda in the complex plane that separates \lambda from the rest of the spectrum of A. Then the residue
: -\frac ...
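The following small worked example (standard material, supplied as a sketch rather than taken from the truncated text) shows the resolvent of a 2×2 diagonal matrix and the contour-integral residue that recovers the spectral projection onto an isolated eigenvalue:

    % Resolvent of A = diag(1, 3) and the projection onto the eigenvalue \lambda = 1.
    A = \begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix}, \qquad
    R(z;A) = (A - zI)^{-1} = \begin{pmatrix} \tfrac{1}{1-z} & 0 \\ 0 & \tfrac{1}{3-z} \end{pmatrix}.
    % Integrating around a small circle C_1 enclosing z = 1 but not z = 3:
    -\frac{1}{2\pi i} \oint_{C_1} R(z;A)\, \mathrm{d}z = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},
    % which is exactly the projection onto the eigenspace of \lambda = 1.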


Hunt Process
In probability theory, a Hunt process is a type of Markov process, named for mathematician Gilbert A. Hunt who first defined them in 1957. Hunt processes were important in the study of probabilistic potential theory until they were superseded by right processes in the 1970s.

History

Background

In the 1930s–50s the work of mathematicians such as Joseph Doob, William Feller, Mark Kac, and Shizuo Kakutani developed connections between Markov processes and potential theory. In 1957–58 Gilbert A. Hunt published a triplet of papers which deepened that connection. The impact of these papers on the probabilist community of the time was significant. Joseph Doob said that "Hunt’s great papers on the potential theory generated by Markov transition functions revolutionized potential theory." Ronald Getoor described them as "a monumental work of nearly 170 pages that contained an enormous amount of truly original mathematics." Gustave Choquet wrote that Hunt's papers were "fundamental memoir ...



Markov Chain
In probability theory and statistics, a Markov chain or Markov process is a stochastic process describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs now." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). Markov processes are named in honor of the Russian mathematician Andrey Markov. Markov chains have many applications as statistical models of real-world processes. They provide the basis for general stochastic simulation methods known as Markov chain Monte Carlo, which are used for simulating sampling from complex probability distributions, and have found application in areas including Bayesian statistics, biology, chemistry, economics, fin ...
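As an illustrative sketch (not part of the excerpt above), the following Python snippet simulates a simple two-state discrete-time Markov chain; the state names and transition matrix are invented for the example:

    import numpy as np

    # Hypothetical two-state "weather" chain; states and transition probabilities
    # are illustrative only.
    states = ["sunny", "rainy"]
    P = np.array([[0.9, 0.1],   # transition probabilities from "sunny"
                  [0.5, 0.5]])  # transition probabilities from "rainy"

    rng = np.random.default_rng(0)
    state = 0                   # start in "sunny"
    path = [states[state]]
    for _ in range(10):
        # The next state depends only on the current state (Markov property).
        state = rng.choice(2, p=P[state])
        path.append(states[state])
    print(path)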


David Williams (mathematician)
David Williams FRS is a Welsh mathematician who works in probability theory.

Early life and education

David Williams was born at Gorseinon, near Swansea, Wales. He was educated at Gowerton Grammar School, winning a mathematics scholarship to Jesus College, Oxford, and went on to obtain a DPhil under the supervision of David George Kendall and Gerd Edzard Harry Reuter, with a thesis titled "Random time substitution in Markov chains".

Career

Williams held posts at Stanford University (1962–63), the University of Durham, the University of Cambridge (1966–69), and Swansea University (1969–85), where he was promoted to a personal chair in 1972. In 1985, he was elected to the Professorship of Mathematical Statistics at the University of Cambridge, where he remained until 1992, serving as Director of the Statistical Laboratory between 1987 and 1991. Following this, he held the Chair of Mathematical Sciences jointly with the Mathematics and Statistics Groups at the Univers ...




Stopping Time
In probability theory, in particular in the study of stochastic processes, a stopping time (also Markov time, Markov moment, optional stopping time or optional time) is a specific type of "random time": a random variable whose value is interpreted as the time at which a given stochastic process exhibits a certain behavior of interest. A stopping time is often defined by a stopping rule, a mechanism for deciding whether to continue or stop a process on the basis of the present position and past events, and which will almost always lead to a decision to stop at some finite time. Stopping times occur in decision theory, and the optional stopping theorem is an important result in this context. Stopping times are also frequently applied in mathematical proofs to "tame the continuum of time", as Chung put it in his book (1982).

Definition

Discrete time

Let \tau be a random variable, which is defined on the filtered probability space (\Omega, \mathcal F, (\mathcal F_n)_{n \in \mathbb N}, P) w ...
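A standard example, sketched here from general knowledge rather than taken from the truncated definition: the first time a process enters a set is a stopping time, because the decision to stop can be made from the information available so far:

    % First hitting time of a set A by a discrete-time process (X_n) adapted to (\mathcal F_n).
    \tau = \inf\{\, n \ge 0 : X_n \in A \,\}, \qquad
    \{\tau \le n\} = \bigcup_{k=0}^{n} \{X_k \in A\} \in \mathcal F_n .
    % By contrast, the time of the last visit to A is generally not a stopping time,
    % since deciding it requires knowledge of the future of the process.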



Strong Markov Property
In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process, which means that, given its present state, its future evolution is independent of its history. It is named after the Russian mathematician Andrey Markov. The term strong Markov property is similar to the Markov property, except that the meaning of "present" is defined in terms of a random variable known as a stopping time. The term Markov assumption is used to describe a model where the Markov property is assumed to hold, such as a hidden Markov model. A Markov random field extends this property to two or more dimensions or to random variables defined for an interconnected network of items. An example of a model for such a field is the Ising model. A discrete-time stochastic process satisfying the Markov property is known as a Markov chain.

Introduction

A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditi ...
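For concreteness (a standard formulation, not part of the excerpt), the Markov property for a discrete-time process can be written as:

    % Discrete-time Markov property: conditioning on the whole past equals conditioning on the present.
    P(X_{n+1} \in B \mid X_0, X_1, \ldots, X_n) = P(X_{n+1} \in B \mid X_n).
    % The strong Markov property asserts the analogous identity with the fixed time n
    % replaced by an almost surely finite stopping time \tau.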


Filtered Probability Space
In the theory of stochastic processes, a subdiscipline of probability theory, filtrations are totally ordered collections of subsets that are used to model the information that is available at a given point and therefore play an important role in the formalization of random (stochastic) processes.

Definition

Let (\Omega, \mathcal A, P) be a probability space and let I be an index set with a total order \leq (often \mathbb N, \mathbb R^+, or a subset of \mathbb R^+). For every i \in I let \mathcal F_i be a sub-σ-algebra of \mathcal A. Then
: \mathbb F := (\mathcal F_i)_{i \in I}
is called a filtration, if \mathcal F_k \subseteq \mathcal F_\ell for all k \leq \ell. So filtrations are families of σ-algebras that are ordered non-decreasingly. If \mathbb F is a filtration, then (\Omega, \mathcal A, \mathbb F, P) is called a filtered probability space.

Example

Let (X_n)_{n \in \mathbb N} be a stochastic process on the probability space (\Omega, \mathcal A, P). Let \sigma(X_k ...
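The example truncated above is the natural filtration; a sketch of the standard construction, supplied here from general knowledge, is:

    % Natural filtration of a discrete-time process (X_n): \mathcal F_n records exactly
    % the information revealed by observing X_0, ..., X_n.
    \mathcal F_n := \sigma(X_k \mid k \le n), \qquad
    \mathcal F_0 \subseteq \mathcal F_1 \subseteq \cdots \subseteq \mathcal A,
    % so (\mathcal F_n)_{n \in \mathbb N} is a filtration and
    % (\Omega, \mathcal A, (\mathcal F_n)_{n \in \mathbb N}, P) is a filtered probability space.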


Adapted Process
In the study of stochastic processes, a stochastic process is adapted (also referred to as a non-anticipating or non-anticipative process) if information about the value of the process at a given time is available at that same time. An informal interpretation is that X is adapted if and only if, for every realisation and every n, X_n is known at time n. The concept of an adapted process is essential, for instance, in the definition of the Itō integral, which only makes sense if the integrand is an adapted process.

Definition

Let
* (\Omega, \mathcal F, \mathbb P) be a probability space;
* I be an index set with a total order \leq (often, I is \mathbb N, \mathbb N_0, [0, T] or [0, +\infty));
* (\mathcal F_i)_{i \in I} be a filtration of the sigma algebra \mathcal F;
* (S, \Sigma) be a measurable space, the state space;
* X: I \times \Omega \to S be a stochastic process.
The stochastic process (X_i)_{i \in I} is said to be adapted to the filtration (\mathcal F_i)_{i \in I} if the random variable X_i: \Omega \to S is a (\ma ...
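A small standard example (not from the excerpt): with respect to the natural filtration of a process, the process itself is adapted, while a version shifted forward in time is not:

    % Let \mathcal F_n = \sigma(X_0, \ldots, X_n) be the natural filtration of X.
    Y_n := X_n \quad \text{is adapted to } (\mathcal F_n), \qquad
    Z_n := X_{n+1} \quad \text{is in general not adapted},
    % because Z_n depends on the value of the process one step in the future and
    % therefore need not be \mathcal F_n-measurable.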



Lipschitz Continuous
In mathematical analysis, Lipschitz continuity, named after German mathematician Rudolf Lipschitz, is a strong form of uniform continuity for functions. Intuitively, a Lipschitz continuous function is limited in how fast it can change: there exists a real number such that, for every pair of points on the graph of this function, the absolute value of the slope of the line connecting them is not greater than this real number; the smallest such bound is called the Lipschitz constant of the function (and is related to the modulus of uniform continuity). For instance, every function that is defined on an interval and has a bounded first derivative is Lipschitz continuous. In the theory of differential equations, Lipschitz continuity is the central condition of the Picard–Lindelöf theorem which guarantees the existence and uniqueness of the solution to an initial value problem. A special type of Lipschitz continuity, cal ...
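As a quick worked illustration of the condition |f(x) − f(y)| ≤ K|x − y| (standard examples, not from the excerpt):

    % The sine function is Lipschitz on \mathbb R with constant K = 1,
    % since its derivative \cos is bounded by 1 (mean value theorem):
    |\sin x - \sin y| \le |x - y| \quad \text{for all } x, y \in \mathbb R.
    % By contrast, f(x) = \sqrt{x} on [0, 1] is not Lipschitz, because
    % |\sqrt{x} - \sqrt{0}| / |x - 0| = 1/\sqrt{x} \to \infty as x \to 0^+.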


Stochastic Differential Equation
A stochastic differential equation (SDE) is a differential equation in which one or more of the terms is a stochastic process, resulting in a solution which is also a stochastic process. SDEs have many applications throughout pure mathematics and are used to model various behaviours of stochastic models such as stock prices (Musiela, M., and Rutkowski, M. (2004), Martingale Methods in Financial Modelling, 2nd Edition, Springer Verlag, Berlin), random growth models or physical systems that are subjected to thermal fluctuations. SDEs have a random differential that is in the most basic case random white noise calculated as the distributional derivative of a Brownian motion or more generally a semimartingale. However, other types of random behaviour are possible, such as jump processes like Lévy processes or semimartingales with jumps. Stochastic differential equations are in general neither differential equations nor random differential equations. Random differential equation ...
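Since the excerpt mentions stock-price models, here is a minimal Python sketch (all parameter values invented for illustration) of simulating the geometric Brownian motion SDE dS_t = μ S_t dt + σ S_t dW_t with the Euler–Maruyama scheme:

    import numpy as np

    # Euler-Maruyama simulation of dS_t = mu*S_t*dt + sigma*S_t*dW_t.
    # mu, sigma, S0, T and the step count are illustrative values only.
    mu, sigma, S0 = 0.05, 0.2, 100.0
    T, n_steps = 1.0, 1000
    dt = T / n_steps

    rng = np.random.default_rng(42)
    S = np.empty(n_steps + 1)
    S[0] = S0
    for k in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))            # Brownian increment ~ N(0, dt)
        S[k + 1] = S[k] + mu * S[k] * dt + sigma * S[k] * dW

    print(f"simulated S_T = {S[-1]:.2f}")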




Bessel Process
In mathematics, a Bessel process, named after Friedrich Bessel, is a type of stochastic process.

Formal definition

The Bessel process of order n is the real-valued process X given (when n ≥ 2) by
:X_t = \| W_t \|,
where \| \cdot \| denotes the Euclidean norm in R^n and W is an n-dimensional Wiener process (Brownian motion). For any n, the n-dimensional Bessel process is the solution to the stochastic differential equation (SDE)
:dX_t = dW_t + \frac{n-1}{2} \frac{dt}{X_t}
where W is a 1-dimensional Wiener process (Brownian motion). Note that this SDE makes sense for any real parameter n (although the drift term is singular at zero).

Notation

A notation for the Bessel process of dimension n start ...
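A minimal simulation sketch (parameters assumed for illustration, not taken from the excerpt) that realises the Bessel process as the Euclidean norm of an n-dimensional Brownian motion:

    import numpy as np

    # Simulate a Bessel process of dimension n as X_t = ||W_t|| for an
    # n-dimensional Brownian motion W. n, T and the step count are illustrative.
    n, T, n_steps = 3, 1.0, 1000
    dt = T / n_steps

    rng = np.random.default_rng(7)
    # Brownian increments for each of the n coordinates, shape (n_steps, n).
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_steps, n))
    W = np.vstack([np.zeros((1, n)), np.cumsum(dW, axis=0)])  # path with W_0 = 0

    X = np.linalg.norm(W, axis=1)   # Bessel process path X_t = ||W_t||
    print(f"X_T = {X[-1]:.3f}")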