Radical Probabilism
Radical probabilism is a hypothesis in philosophy, in particular epistemology, and probability theory that holds that no facts are known for certain. That view has profound implications for statistical inference. The philosophy is particularly associated with Richard Jeffrey, who wittily characterised it with the ''dictum'' "It's probabilities all the way down."


Background

Bayes' theorem states a rule for updating a probability conditioned on other information. In 1967, Ian Hacking argued that, in its static form, Bayes' theorem only connects probabilities that are held simultaneously; it does not tell the learner how to update probabilities when new evidence becomes available over time, contrary to what contemporary Bayesians suggested. According to Hacking, adopting Bayes' theorem is a temptation. Suppose that a learner forms probabilities ''P''old(''A'' & ''B'') = ''p'' and ''P''old(''B'') = ''q''. If the learner subsequently learns that ''B'' is true, nothing in the axioms of probability or the results derived therefrom tells him how to behave. He might be tempted to adopt Bayes' theorem by analogy and set his ''P''new(''A'') = ''P''old(''A'' | ''B'') = ''p''/''q''.

In fact, that step, Bayes' rule of updating, can be justified as both necessary and sufficient through a ''dynamic'' Dutch book argument that is additional to the arguments used to justify the probability axioms. The argument was first put forward by David Lewis in the 1970s, though he never published it. The dynamic Dutch book argument for Bayesian updating has been criticised by Hacking, Kyburg, Christensen, and Maher, and defended by Brian Skyrms.
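As a minimal illustration of the updating step just described, the following Python sketch applies the tempting rule to made-up values of ''p'' and ''q'' (the numbers are assumptions for illustration only, not taken from any source):

```python
# Bayes' rule of updating, applied to illustrative (made-up) numbers.
p_old_A_and_B = 0.20   # p = P_old(A & B)
p_old_B = 0.50         # q = P_old(B)

# On learning that B is certainly true, the learner sets
# P_new(A) = P_old(A | B) = p / q.
p_new_A = p_old_A_and_B / p_old_B

print(p_new_A)  # 0.4
```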


Certain and uncertain knowledge

Updating by Bayes' rule works when the new data is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain". There must, on Lewis' account, be some certain facts on which probabilities are conditioned. However, the principle known as Cromwell's rule declares that nothing, apart from a logical law, if that, can ever be known for certain. Jeffrey famously rejected C. I. Lewis' ''dictum'' and later quipped, "It's probabilities all the way down," a reference to the "turtles all the way down" metaphor for the infinite regress problem. He called this position ''radical probabilism''.


Conditioning on an uncertainty – probability kinematics

When the new evidence is itself uncertain, Bayes' rule cannot capture a mere subjective change in the probability of some critical fact. The new evidence may not have been anticipated or even be capable of being articulated after the event. It seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating in much the same way as Bayes' theorem was.

: ''P''new(''A'') = ''P''old(''A'' | ''B'')''P''new(''B'') + ''P''old(''A'' | not-''B'')''P''new(not-''B'')

Adopting such a rule is sufficient to avoid a Dutch book but not necessary. Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning.
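As an illustration, the following Python sketch applies probability kinematics with made-up numbers, where the evidence merely shifts the learner's probability of ''B'' rather than making it certain:

```python
# Jeffrey conditioning (probability kinematics) with illustrative numbers.
p_old_A_given_B = 0.8      # P_old(A | B)
p_old_A_given_notB = 0.3   # P_old(A | not-B)

# Uncertain evidence: the learner's probability for B rises to 0.9
# without B becoming certain (an assumed figure for illustration).
p_new_B = 0.9

# Keep the conditional probabilities within each cell of the partition
# {B, not-B} fixed and reweight them by the new cell probabilities.
p_new_A = p_old_A_given_B * p_new_B + p_old_A_given_notB * (1 - p_new_B)

print(p_new_A)  # 0.75; conditioning on B as certain would give 0.8 instead
```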


Alternatives to probability kinematics

Probability kinematics is not the only sufficient updating rule for radical probabilism. Others have been advocated, including E. T. Jaynes' maximum entropy principle and Skyrms' principle of reflection. It turns out that probability kinematics is a special case of maximum entropy inference. However, maximum entropy is not a generalisation of all such sufficient updating rules.
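To illustrate the special-case claim numerically, the following Python sketch (with assumed numbers, using SciPy for the constrained optimisation) checks that the distribution minimising relative entropy with respect to the old joint distribution, subject only to a constraint on the new probability of ''B'', coincides with the Jeffrey update:

```python
# Illustrative check that the minimum-relative-entropy update subject to a
# constraint on P(B) reproduces Jeffrey conditioning (made-up numbers).
import numpy as np
from scipy.optimize import minimize

# Old joint distribution over the cells (A&B, A&not-B, not-A&B, not-A&not-B).
p_old = np.array([0.40, 0.15, 0.10, 0.35])
in_B = np.array([1.0, 0.0, 1.0, 0.0])   # cells where B holds
p_new_B = 0.9                           # constrained new probability of B

def relative_entropy(q):
    # Kullback-Leibler divergence of the candidate update q from p_old.
    return float(np.sum(q * np.log(q / p_old)))

constraints = [
    {"type": "eq", "fun": lambda q: np.sum(q) - 1.0},
    {"type": "eq", "fun": lambda q: np.dot(q, in_B) - p_new_B},
]
result = minimize(relative_entropy, p_old, method="SLSQP",
                  bounds=[(1e-9, 1.0)] * 4, constraints=constraints)

# Jeffrey conditioning: rescale the B cells and the not-B cells separately.
scale_B = p_new_B / np.dot(p_old, in_B)
scale_notB = (1 - p_new_B) / (1 - np.dot(p_old, in_B))
q_jeffrey = p_old * np.where(in_B == 1.0, scale_B, scale_notB)

print(np.round(result.x, 4))   # approximately [0.72, 0.03, 0.18, 0.07]
print(np.round(q_jeffrey, 4))  # [0.72, 0.03, 0.18, 0.07]
```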


References


Further reading

* Jeffrey, R (1990) ''The Logic of Decision''. 2nd ed. University of Chicago Press.
* — (1992) ''Probability and the Art of Judgment''. Cambridge University Press.
* — (2004) ''Subjective Probability: The Real Thing''. Cambridge University Press.
* Skyrms, B (2012) ''From Zeno to Arbitrage: Essays on Quantity, Coherence & Induction''. Oxford University Press. (Features most of the papers cited below.)


External links


Stanford Encyclopedia of Philosophy entry on Bayes' theorem