A Markov logic network (MLN) is a probabilistic logic which applies the ideas of a Markov network to first-order logic, enabling uncertain inference. Markov logic networks generalize first-order logic, in the sense that, in a certain limit, all unsatisfiable statements have a probability of zero, and all tautologies have probability one.


History

Work in this area began in 2003 with Pedro Domingos and Matt Richardson, who introduced the term MLN to describe it.


Description

Briefly, it is a collection of formulas from first-order logic, to each of which is assigned a real number, the weight. Taken as a Markov network, the vertices of the network graph are atomic formulas, and the edges are the logical connectives used to construct the formula. Each formula is considered to be a clique, and the Markov blanket is the set of formulas in which a given atom appears. A potential function is associated to each formula, and takes the value of one when the formula is true, and zero when it is false. The potential function is combined with the weight to form the Gibbs measure and partition function for the Markov network.

The above definition glosses over a subtle point: atomic formulas do not have a truth value unless they are grounded and given an interpretation; that is, until they are ground atoms with a Herbrand interpretation. Thus, a Markov logic network becomes a Markov network only with respect to a specific grounding and interpretation; the resulting Markov network is called the ground Markov network. The vertices of the graph of the ground Markov network are the ground atoms. The size of the resulting Markov network thus depends strongly (exponentially) on the number of constants in the domain of discourse.
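For concreteness, the resulting distribution is usually written as follows (this display follows the standard notation for Markov logic rather than anything stated above): the Gibbs measure assigns to each possible world x, i.e. each truth assignment to the ground atoms, the probability

    P(X = x) = \frac{1}{Z} \exp\left( \sum_i w_i \, n_i(x) \right),
    \qquad
    Z = \sum_{x'} \exp\left( \sum_i w_i \, n_i(x') \right),

where the sum runs over the formulas, w_i is the weight of the i-th formula, n_i(x) is the number of its true groundings in the world x, and Z is the partition function obtained by summing over all possible worlds x'.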


Inference

The goal of inference in a Markov logic network is to find the stationary distribution of the system, or one that is close to it; that this may be difficult or not always possible is illustrated by the richness of behaviour seen in the Ising model. As in a Markov network, the stationary distribution finds the most likely assignment of probabilities to the vertices of the graph; in this case, the vertices are the ground atoms of an interpretation. That is, the distribution indicates the probability of the truth or falsehood of each ground atom. Given the stationary distribution, one can then perform inference in the traditional statistical sense of conditional probability: obtain the probability P(A | B) that formula A holds, given that formula B is true.

Inference in MLNs can be performed using standard Markov network inference techniques over the minimal subset of the relevant Markov network required for answering the query. These techniques include Gibbs sampling, which is effective but may be excessively slow for large networks, belief propagation, and approximation via pseudolikelihood.
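As an illustration of Gibbs sampling over a ground Markov network, the following sketch builds a small hypothetical MLN (the "smokers and friends" formulas and the weights 1.5 and 1.1 are illustrative assumptions, not taken from the text above), grounds it over two constants, and estimates a conditional query by repeatedly resampling each non-evidence ground atom from its conditional distribution. It is a minimal sketch under these assumptions, not a reference implementation of any of the engines listed below.

import math
import random

# Hypothetical toy domain with two constants.
constants = ["A", "B"]

# Ground atoms of the assumed predicates Smokes/1, Cancer/1, Friends/2.
atoms = (["Smokes(%s)" % c for c in constants]
         + ["Cancer(%s)" % c for c in constants]
         + ["Friends(%s,%s)" % (a, b) for a in constants for b in constants])

# Weighted ground clauses: each entry is (weight, list of (atom, sign) literals),
# and the clause is the disjunction of its literals.
# Formula 1 (weight 1.5): Smokes(x) => Cancer(x), i.e. !Smokes(x) v Cancer(x).
# Formula 2 (weight 1.1): Friends(x,y) ^ Smokes(x) => Smokes(y).
ground_clauses = []
for c in constants:
    ground_clauses.append((1.5, [("Smokes(%s)" % c, False), ("Cancer(%s)" % c, True)]))
for a in constants:
    for b in constants:
        ground_clauses.append((1.1, [("Friends(%s,%s)" % (a, b), False),
                                     ("Smokes(%s)" % a, False),
                                     ("Smokes(%s)" % b, True)]))

def clause_true(world, clause):
    # A literal (atom, sign) is satisfied when the atom's truth value equals sign.
    return any(world[atom] == sign for atom, sign in clause)

def log_weight(world):
    # Sum of weights of satisfied ground formulas: the exponent of the Gibbs measure.
    return sum(w for w, clause in ground_clauses if clause_true(world, clause))

# Evidence atoms are clamped; the query asks for P(Cancer(B) | evidence).
evidence = {"Smokes(A)": True, "Friends(A,B)": True}
query = "Cancer(B)"

random.seed(0)
world = {atom: evidence.get(atom, random.random() < 0.5) for atom in atoms}
free_atoms = [a for a in atoms if a not in evidence]

burn_in, samples, hits = 1000, 10000, 0
for step in range(burn_in + samples):
    for atom in free_atoms:
        # Gibbs step: resample this atom from its conditional distribution
        # given the current values of all other ground atoms.
        world[atom] = True
        lw_true = log_weight(world)
        world[atom] = False
        lw_false = log_weight(world)
        p_true = 1.0 / (1.0 + math.exp(lw_false - lw_true))
        world[atom] = random.random() < p_true
    if step >= burn_in:
        hits += world[query]

print("Estimated P(%s | evidence) ~ %.3f" % (query, hits / samples))

For efficiency, a real implementation would only re-evaluate the clauses containing the resampled atom (its Markov blanket) rather than recomputing the full log-weight, and would ground only the part of the network relevant to the query.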


See also

* Markov random field
* Statistical relational learning
* Probabilistic logic network
* Probabilistic soft logic




External links


University of Washington Statistical Relational Learning group

Alchemy 2.0: Markov logic networks in C++

pracmln: Markov logic networks in Python

ProbCog: Markov logic networks in Python and Java that can use its own inference engine or Alchemy's

markov thebeast: Markov logic networks in Java

RockIt: Markov logic networks in Java (with web interface/REST API)

Tuffy: A learning and inference engine with strong RDBMS-based optimization for scalability

Felix: A successor to Tuffy, with prebuilt submodules to speed up common subtasks


Figaro: Scala-based MLN language (https://www.cra.com/commercial-solutions/probabilistic-modeling-services.asp)
LoMRF: Logical Markov Random Fields, an open-source implementation of Markov Logic Networks in Scala