Recursive Bayesian estimation

In probability theory, statistics, and machine learning, recursive Bayesian estimation, also known as a Bayes filter, is a general probabilistic approach for estimating an unknown probability density function (PDF) recursively over time using incoming measurements and a mathematical process model. The process relies heavily upon mathematical concepts and models that are theorized within a study of prior and posterior probabilities known as Bayesian statistics.


In robotics

A Bayes filter is an algorithm used in computer science for calculating the probabilities of multiple beliefs to allow a robot to infer its position and orientation. Essentially, Bayes filters allow robots to continuously update their most likely position within a coordinate system, based on the most recently acquired sensor data. This is a recursive algorithm consisting of two parts: prediction and innovation. If the variables are normally distributed and the transitions are linear, the Bayes filter becomes equal to the Kalman filter. In a simple example, a robot moving throughout a grid may have several different sensors that provide it with information about its surroundings. The robot may start out with certainty that it is at position (0,0). However, as it moves farther and farther from its original position, the robot becomes less and less certain about its position; using a Bayes filter, a probability can be assigned to the robot's belief about its current position, and that probability can be continuously updated from additional sensor information.


Model

The measurements z are the manifestations of a hidden Markov model (HMM), which means the true state x is assumed to be an unobserved Markov process. This structure can be represented as the Bayesian network of an HMM. Because of the Markov assumption, the current true state, given the immediately previous one, is conditionally independent of the earlier states.

:p(\textbf{x}_k \mid \textbf{x}_{k-1},\textbf{x}_{k-2},\dots,\textbf{x}_0) = p(\textbf{x}_k \mid \textbf{x}_{k-1})

Similarly, the measurement at the ''k''-th timestep depends only upon the current state, so it is conditionally independent of all other states given the current state.

:p(\textbf{z}_k \mid \textbf{x}_k,\textbf{x}_{k-1},\dots,\textbf{x}_0) = p(\textbf{z}_k \mid \textbf{x}_k)

Using these assumptions, the probability distribution over all states of the HMM can be written simply as:

:p(\textbf{x}_0,\dots,\textbf{x}_k,\textbf{z}_1,\dots,\textbf{z}_k) = p(\textbf{x}_0)\prod_{i=1}^k p(\textbf{z}_i \mid \textbf{x}_i)\, p(\textbf{x}_i \mid \textbf{x}_{i-1}).

However, when using the Kalman filter to estimate the state x, the probability distribution of interest is that of the current state conditioned on the measurements up to the current timestep. (This is achieved by marginalising out the previous states and dividing by the probability of the measurement set.) This leads to the ''predict'' and ''update'' steps of the Kalman filter written probabilistically. The probability distribution associated with the predicted state is the sum (integral) over all possible \textbf{x}_{k-1} of the product of the probability distribution associated with the transition from the (''k'' − 1)-th timestep to the ''k''-th and the probability distribution associated with the previous state.

:p(\textbf{x}_k \mid \textbf{z}_{1:k-1}) = \int p(\textbf{x}_k \mid \textbf{x}_{k-1})\, p(\textbf{x}_{k-1} \mid \textbf{z}_{1:k-1}) \, d\textbf{x}_{k-1}

The probability distribution of the update is proportional to the product of the measurement likelihood and the predicted state.

:p(\textbf{x}_k \mid \textbf{z}_{1:k}) = \frac{p(\textbf{z}_k \mid \textbf{x}_k)\, p(\textbf{x}_k \mid \textbf{z}_{1:k-1})}{p(\textbf{z}_k \mid \textbf{z}_{1:k-1})} \propto p(\textbf{z}_k \mid \textbf{x}_k)\, p(\textbf{x}_k \mid \textbf{z}_{1:k-1})

The denominator

:p(\textbf{z}_k \mid \textbf{z}_{1:k-1}) = \int p(\textbf{z}_k \mid \textbf{x}_k)\, p(\textbf{x}_k \mid \textbf{z}_{1:k-1}) \, d\textbf{x}_k

is constant relative to x, so it can always be replaced by a normalising coefficient \alpha, which can usually be ignored in practice. The numerator can be calculated and then simply normalized, since its integral must be unity.
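When the transition and measurement models are linear with Gaussian noise, the two integrals above have closed forms and reduce to the Kalman recursion. The scalar sketch below is a minimal, assumed example (state model x_k = a·x_{k-1} + noise, measurement z_k = h·x_k + noise) of the predict and update equations written in code; it is an illustration under those assumptions rather than a full Kalman filter implementation.

```python
def kalman_predict(mean, var, a, process_var):
    """p(x_k | z_{1:k-1}): push the previous posterior through x_k = a * x_{k-1} + noise."""
    return a * mean, a * a * var + process_var

def kalman_update(mean, var, z, h, meas_var):
    """p(x_k | z_{1:k}) ∝ p(z_k | x_k) p(x_k | z_{1:k-1}) for z_k = h * x_k + noise."""
    gain = var * h / (h * h * var + meas_var)
    return mean + gain * (z - h * mean), (1.0 - gain * h) * var

# Track a scalar state with a random-walk model (a = h = 1) over three measurements.
mean, var = 0.0, 1.0
for z in [0.9, 1.1, 1.05]:
    mean, var = kalman_predict(mean, var, a=1.0, process_var=0.1)
    mean, var = kalman_update(mean, var, z, h=1.0, meas_var=0.5)
print(mean, var)
```

Because every distribution stays Gaussian, carrying the mean and variance through these two functions is exactly the integral and product written out above.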


Applications

* Kalman filter, a recursive Bayesian filter for multivariate normal distributions
* Particle filter, a sequential Monte Carlo (SMC) based technique, which models the PDF using a set of discrete points (a minimal sketch follows this list)
* Grid-based estimators, which subdivide the PDF into a deterministic discrete grid
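As a minimal illustration of the particle-filter entry above, the following sketch performs one bootstrap (sequential importance resampling) cycle per measurement; the scalar random-walk transition and Gaussian likelihood are assumptions made purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, z, process_std=0.1, meas_std=0.5):
    """One predict/update cycle of a bootstrap particle filter."""
    # Predict: sample each particle from the (assumed) random-walk transition model.
    particles = particles + rng.normal(0.0, process_std, size=particles.shape)
    # Update: weight by the Gaussian measurement likelihood p(z_k | x_k).
    weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    weights /= weights.sum()
    # Resample so the discrete point set again represents the posterior PDF.
    return rng.choice(particles, size=particles.size, p=weights)

particles = rng.normal(0.0, 1.0, size=1000)  # initial belief as a cloud of points
for z in [0.9, 1.1, 1.05]:
    particles = particle_filter_step(particles, z)
print(particles.mean(), particles.std())
```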


Sequential Bayesian filtering

Sequential Bayesian filtering is the extension of Bayesian estimation for the case when the observed value changes in time. It is a method to estimate the real value of an observed variable that evolves in time. The method is named:

;filtering: when estimating the ''current'' value given past and current observations,
;smoothing: when estimating ''past'' values given past and current observations, and
;prediction: when estimating a probable ''future'' value given past and current observations.

The notion of sequential Bayesian filtering is extensively used in control and robotics.
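In the notation used above, with x the hidden state and z the measurements, the three problems target different conditional densities; the index conventions below are the standard ones rather than something spelled out in the original text.

:filtering: p(\textbf{x}_k \mid \textbf{z}_1,\dots,\textbf{z}_k)
:smoothing: p(\textbf{x}_j \mid \textbf{z}_1,\dots,\textbf{z}_k), \quad j < k
:prediction: p(\textbf{x}_{k+m} \mid \textbf{z}_1,\dots,\textbf{z}_k), \quad m > 0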


Further reading

* {{cite journal |first1=Alexander |last1=Volkov |title=Accuracy bounds of non-Gaussian Bayesian tracking in a NLOS environment |journal=Signal Processing |volume=108 |pages=498–508 |year=2015 |doi=10.1016/j.sigpro.2014.10.025}}