Cox's theorem, named after the physicist Richard Threlkeld Cox, is a derivation of the laws of probability theory from a certain set of postulates. This derivation justifies the so-called "logical" interpretation of probability, as the laws of probability derived by Cox's theorem are applicable to any proposition. Logical (also known as objective Bayesian) probability is a type of Bayesian probability. Other forms of Bayesianism, such as the subjective interpretation, are given other justifications.
Cox's assumptions
Cox wanted his system to satisfy the following conditions:
- Divisibility and comparability – The plausibility of a proposition is a real number and is dependent on information we have related to the proposition.
- Common sense – Plausibilities should vary sensibly with the assessment of plausibilities in the model.
- Consistency – If the plausibility of a proposition can be derived in many ways, all the results must be equal.
The postulates as stated here are taken from Arnborg and Sjödin.[3][4][5] "Common sense" includes consistency with Aristotelian logic in the sense that logically equivalent propositions shall have the same plausibility.
The postulates as originally stated by Cox were not mathematically rigorous (although more so than the informal description above), as noted by Halpern.[6][7] However, it appears to be possible to augment them with various mathematical assumptions made either implicitly or explicitly by Cox to produce a valid proof.
Cox's notation:
The plausibility of a proposition A given some related information X is denoted by A ∣ X.

Cox's postulates and functional equations are:
- The plausibility of the conjunction AB of two propositions A, B, given some related information X, is determined by the plausibility of A given X and that of B given AX. In the form of a functional equation: AB ∣ X = g(A ∣ X, B ∣ AX).
- Additionally, Cox postulates the function g to be monotonic.
- In case A given X is certain, we have AB ∣ X = B ∣ X and B ∣ AX = B ∣ X due to the requirement of consistency. The general equation then leads to B ∣ X = g(A ∣ X, B ∣ X).
- In case A given X is impossible, we have AB ∣ X = A ∣ X and A ∣ BX = A ∣ X due to the requirement of consistency. The general equation (with the A and B factors switched) then leads to A ∣ X = g(B ∣ X, A ∣ X).
- The plausibility of a proposition determines the plausibility of the proposition's negation. In the form of a functional equation: not A ∣ X = f(A ∣ X).
- Furthermore, Cox postulates the function f to be monotonic.
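Because conjunction is associative in ordinary logic, consistency forces the combining function g to satisfy the associativity functional equation g(g(x, y), z) = g(x, g(y, z)). A minimal numeric sketch (the product combiner used here is just one well-known solution of that equation, not Cox's general g):

```python
# Sketch: the product rule g(x, y) = x * y is one solution of the
# associativity functional equation g(g(x, y), z) = g(x, g(y, z))
# that consistency imposes on the conjunction combiner g.

def g(x, y):
    """Candidate combiner for plausibilities (product rule)."""
    return x * y

# Check associativity on a grid of plausibility values in [0, 1].
vals = [i / 10 for i in range(11)]
assoc_ok = all(
    abs(g(g(x, y), z) - g(x, g(y, z))) < 1e-12
    for x in vals for y in vals for z in vals
)
print(assoc_ok)  # True: product combining is associative
```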
Implications of Cox's postulates
The laws of probability derivable from these postulates are the following.[8] Let A ∣ B be the plausibility of the proposition A given B satisfying Cox's postulates. Then there is a function w mapping plausibilities to the interval [0, 1] and a positive number m such that
- Certainty is represented by w(A ∣ B) = 1.
- w^m(A ∣ B) + w^m(not A ∣ B) = 1.
- w(AB ∣ C) = w(A ∣ C) w(B ∣ AC) = w(B ∣ C) w(A ∣ BC).
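As an illustrative sketch of these general laws (with made-up numbers, not part of the derivation): any ordinary probability assignment Pr yields such a w by taking w = Pr^(1/m), for any positive m:

```python
# Sketch: if Pr is an ordinary probability assignment, then
# w = Pr ** (1/m) satisfies the two derived laws for any m > 0.
# The values of m, Pr(A|C), Pr(B|AC) below are illustrative.

m = 2.5
pA, pB_given_A = 0.3, 0.5          # Pr(A|C), Pr(B|AC)
pAB = pA * pB_given_A              # Pr(AB|C) by the product rule

def w(p):
    return p ** (1 / m)

# Law 2: w^m(A|C) + w^m(not A|C) = 1
assert abs(w(pA) ** m + w(1 - pA) ** m - 1) < 1e-12

# Law 3: w(AB|C) = w(A|C) * w(B|AC)
assert abs(w(pAB) - w(pA) * w(pB_given_A)) < 1e-12
print("both laws hold")
```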
It is important to note that the postulates imply only these general properties. We may recover the usual laws of probability by setting a new function, conventionally denoted P or Pr, equal to w^m. Then we obtain the laws of probability in a more familiar form:
- Certain truth is represented by Pr(A ∣ B) = 1, and certain falsehood by Pr(A ∣ B) = 0.
- Pr(A ∣ B) + Pr(not A ∣ B) = 1.
- Pr(AB ∣ C) = Pr(A ∣ C) Pr(B ∣ AC) = Pr(B ∣ C) Pr(A ∣ BC).
Rule 2 is a rule for negation, and rule 3 is a rule for conjunction. Given that any proposition containing conjunction, disjunction, and negation can be equivalently rephrased using conjunction and negation alone (by De Morgan's laws), we can now handle any compound proposition.
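As a sketch of that rephrasing (using a made-up toy joint distribution), the plausibility of a disjunction can be obtained from the negation and conjunction rules alone via De Morgan's law, A or B = not(not A and not B):

```python
from fractions import Fraction

# Sketch with a toy joint distribution (illustrative numbers):
# compute Pr(A or B) using only negation (rule 2) applied to a
# conjunction, via De Morgan: A or B = not(not A and not B).

# Joint probabilities over the four truth assignments of (A, B).
joint = {
    (True, True): Fraction(1, 6),
    (True, False): Fraction(1, 3),
    (False, True): Fraction(1, 4),
    (False, False): Fraction(1, 4),
}

pr_notA_and_notB = joint[(False, False)]
pr_A_or_B = 1 - pr_notA_and_notB      # rule 2 applied to the conjunction

# Cross-check by direct enumeration over the joint distribution.
direct = sum(p for (a, b), p in joint.items() if a or b)
assert pr_A_or_B == direct
print(pr_A_or_B)  # 3/4
```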
The laws thus derived yield finite additivity of probability, but not countable additivity. The measure-theoretic formulation of Kolmogorov assumes that a probability measure is countably additive. This slightly stronger condition is necessary for certain results. An elementary example (in which this assumption merely simplifies the calculation rather than being necessary for it) is that the probability of seeing heads for the first time after an even number of flips in a sequence of coin flips is 1/3.[9]
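The 1/3 value is the sum of the geometric series Σₖ (1/2)^(2k) = 1/3, and finite additivity already determines every partial sum; a short sketch:

```python
from fractions import Fraction

# The first head appears on flip 2k (an even flip) with probability
# (1/2)^(2k): the first 2k-1 flips are tails, flip 2k is heads.
# Finite additivity gives every partial sum of this geometric series,
# and the partial sums converge to 1/3.

partial = sum(Fraction(1, 2) ** (2 * k) for k in range(1, 50))
print(float(partial))  # approaches 1/3
assert abs(partial - Fraction(1, 3)) < Fraction(1, 10**20)
```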
Interpretation and further discussion
Cox's theorem has come to be used as one of the justifications for the use of Bayesian probability theory. For example, in Jaynes it is discussed in detail in chapters 1 and 2 and is a cornerstone for the rest of the book.[10] Probability is interpreted as a formal system of logic, the natural extension of Aristotelian logic (in which every statement is either true or false) into the realm of reasoning in the presence of uncertainty.
It has been debated to what degree the theorem excludes alternative models for reasoning about uncertainty. For example, if certain "unintuitive" mathematical assumptions were dropped then alternatives could be devised, e.g., an example provided by Halpern.[11] However, Arnborg and Sjödin[12][13][14] suggest additional "common sense" postulates, which would allow the assumptions to be relaxed in some cases while still ruling out the Halpern example. Other approaches were devised by Hardy[15] or Dupré and Tipler.[16]
The original formulation of Cox's theorem is in Cox (1946), which is extended with additional results and more discussion in Cox (1961). Jaynes[17] cites Abel[18] for the first known use of the associativity functional equation. János Aczél[19] provides a long proof of the "associativity equation" (pages 256–267). Jaynes[20] (p. 27) reproduces the shorter proof by Cox in which differentiability is assumed. A guide to Cox's theorem by Van Horn aims at comprehensively introducing the reader to all these references.[21]
Baoding Liu, the founder of uncertainty theory, criticizes Cox's theorem for presuming that the truth value of the conjunction P ∧ Q is a twice differentiable function f of the truth values of the two propositions P and Q, i.e., T(P ∧ Q) = f(T(P), T(Q)), which excludes uncertainty theory's "uncertain measure" from the start, because the function f(x, y) = x ∧ y,[22] used in uncertainty theory, is not differentiable with respect to x and y.[23] According to Liu, "there does not exist any evidence that the truth value of conjunction is completely determined by the truth values of individual propositions, let alone a twice differentiable function."[24]
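Liu's objection can be illustrated with a short sketch (not Liu's own code): the minimum operator has mismatched one-sided slopes along x = y, so it is not even once differentiable there, whereas the product combiner is smooth:

```python
# Compare two candidate conjunction combiners for truth values in [0, 1]:
# the product (smooth, as Cox's derivation assumes) and the minimum
# operator of uncertainty theory (not differentiable along x == y).

def product(x, y):
    return x * y

def minimum(x, y):
    return min(x, y)

# One-sided difference quotients of min(x, 0.5) in x at x = 0.5:
h = 1e-6
left = (minimum(0.5, 0.5) - minimum(0.5 - h, 0.5)) / h   # slope 1 from the left
right = (minimum(0.5 + h, 0.5) - minimum(0.5, 0.5)) / h  # slope 0 from the right
print(left, right)  # the one-sided slopes disagree, so no derivative at x = y

# The product combiner has matching one-sided slopes there (derivative y):
pl = (product(0.5, 0.5) - product(0.5 - h, 0.5)) / h
pr = (product(0.5 + h, 0.5) - product(0.5, 0.5)) / h
assert abs(pl - pr) < 1e-6
```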
Further reading
- Fine, Terrence L. (1973). Theories of Probability: An Examination of Foundations. New York: Academic Press. ISBN 0-12-256450-2.
- Smith, C. Ray; Erickson, Gary (1989). "From Rationality and Consistency to Bayesian Probability". In Skilling, John (ed.). Maximum Entropy and Bayesian Methods. Dordrecht: Kluwer. pp. 29–44. doi:10.1007/978-94-015-7860-8_2. ISBN 0-7923-0224-9.
References
1. Cox, R. T. (1946). "Probability, Frequency and Reasonable Expectation". American Journal of Physics. 14 (1): 1–10. Bibcode:1946AmJPh..14....1C. doi:10.1119/1.1990764.
2. Cox, R. T. (1961). The Algebra of Probable Inference. Baltimore, MD: Johns Hopkins University Press.
3. Stefan Arnborg and Gunnar Sjödin, "On the foundations of Bayesianism", Preprint: Nada, KTH (1999). http://www.stats.org.uk/cox-theorems/ArnborgSjodin2001.pdf
4. Stefan Arnborg and Gunnar Sjödin, "A note on the foundations of Bayesianism", Preprint: Nada, KTH (2000a). http://www.stats.org.uk/bayesian/ArnborgSjodin1999.pdf
5. Stefan Arnborg and Gunnar Sjödin, "Bayes rules in finite models", in European Conference on Artificial Intelligence, Berlin (2000b). https://frontiersinai.com/ecai/ecai2000/pdf/p0571.pdf
6. Joseph Y. Halpern, "A counterexample to theorems of Cox and Fine", Journal of AI Research, 10, 67–85 (1999). http://www.jair.org/media/536/live-536-2054-jair.ps.Z (archived 2015-11-25 at the Wayback Machine)
7. Joseph Y. Halpern, "Technical Addendum, Cox's theorem Revisited", Journal of AI Research, 11, 429–435 (1999). http://www.jair.org/media/644/live-644-1840-jair.ps.Z (archived 2015-11-25 at the Wayback Machine)
8. Edwin Thompson Jaynes, Probability Theory: The Logic of Science, Cambridge University Press (2003); preprint version (1996) archived. Chapters 1 to 3 of the published version at http://bayes.wustl.edu/etj/prob/book.pdf
9. Price, David T. (1974). "Countable additivity for probability measures". American Mathematical Monthly. 81 (8): 886–889. doi:10.2307/2319450. JSTOR 2319450. MR 0350798.
10. Edwin Thompson Jaynes, Probability Theory: The Logic of Science, Cambridge University Press (2003). Chapters 1 to 3 of the published version at http://bayes.wustl.edu/etj/prob/book.pdf
11. Joseph Y. Halpern, "A counterexample to theorems of Cox and Fine", Journal of AI Research, 10, 67–85 (1999). http://www.jair.org/media/536/live-536-2054-jair.ps.Z (archived 2015-11-25 at the Wayback Machine)
12. Stefan Arnborg and Gunnar Sjödin, "On the foundations of Bayesianism", Preprint: Nada, KTH (1999). http://www.stats.org.uk/cox-theorems/ArnborgSjodin2001.pdf
13. Stefan Arnborg and Gunnar Sjödin, "A note on the foundations of Bayesianism", Preprint: Nada, KTH (2000a). http://www.stats.org.uk/bayesian/ArnborgSjodin1999.pdf
14. Stefan Arnborg and Gunnar Sjödin, "Bayes rules in finite models", in European Conference on Artificial Intelligence, Berlin (2000b). https://frontiersinai.com/ecai/ecai2000/pdf/p0571.pdf
15. Michael Hardy, "Scaled Boolean algebras", Advances in Applied Mathematics, August 2002, pages 243–292 (or preprint); Hardy has said, "I assert there that I think Cox's assumptions are too strong, although I don't really say why. I do say what I would replace them with." (The quote is from a Wikipedia discussion page, not from the article.)
16. Dupré, Maurice J. & Tipler, Frank J. (2009). "New Axioms for Rigorous Bayesian Probability". Bayesian Analysis. 4 (3): 599–606. http://projecteuclid.org/download/pdf_1/euclid.ba/1340369856
17. Edwin Thompson Jaynes, Probability Theory: The Logic of Science, Cambridge University Press (2003). Chapters 1 to 3 of the published version at http://bayes.wustl.edu/etj/prob/book.pdf
18. Niels Henrik Abel, "Untersuchung der Functionen zweier unabhängig veränderlichen Gröszen x und y, wie f(x, y), welche die Eigenschaft haben, dasz f[z, f(x,y)] eine symmetrische Function von z, x und y ist.", Jour. Reine u. angew. Math. (Crelle's Jour.), 1, 11–15 (1826).
19. János Aczél, Lectures on Functional Equations and their Applications, Academic Press, New York (1966).
20. Edwin Thompson Jaynes, Probability Theory: The Logic of Science, Cambridge University Press (2003). Chapters 1 to 3 of the published version at http://bayes.wustl.edu/etj/prob/book.pdf
21. Van Horn, K. S. (2003). "Constructing a logic of plausible inference: A guide to Cox's theorem". International Journal of Approximate Reasoning. 34: 3–24. doi:10.1016/S0888-613X(03)00051-3.
22. Liu uses the symbol ∧ as the minimum operator, most likely referring to a binary operation that takes two numbers and returns the smaller of the two.
23. Liu, Baoding (2015). Uncertainty Theory. Springer Uncertainty Research (4th ed.). Berlin, Heidelberg: Springer. pp. 459–460. ISBN 978-3-662-44354-5.
24. Liu, Baoding (2015). Uncertainty Theory. Springer Uncertainty Research (4th ed.). Berlin, Heidelberg: Springer. pp. 459–460. ISBN 978-3-662-44354-5.