

Single Idea 17600

[filed under theme 14. Science / C. Induction / 6. Bayes's Theorem ]

Full Idea

It is well known that the general problem with Bayesian inference is that it is computationally intractable, so the algorithms used for computing posterior probabilities have to be approximations.
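The intractability Thagard alludes to is that the exact posterior requires a normalising sum (or integral) over the whole hypothesis space, which is infeasible in general, so practical algorithms estimate it instead. A minimal sketch (not from Thagard's text) of one such approximation, self-normalised Monte Carlo sampling, applied to a toy coin-bias model where the exact answer happens to be available for comparison:

```python
import random

def likelihood(theta, heads, tails):
    """P(data | theta) for a coin with bias theta."""
    return theta ** heads * (1 - theta) ** tails

def mc_posterior_mean(heads, tails, n_samples=200_000, seed=0):
    """Monte Carlo estimate of E[theta | data]: draw theta from the
    uniform prior, weight each draw by its likelihood, and take the
    weighted average. This sidesteps the normalising integral."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        theta = rng.random()                    # sample from the prior
        w = likelihood(theta, heads, tails)     # importance weight
        num += w * theta
        den += w
    return num / den

# With a uniform (Beta(1,1)) prior, the exact posterior is Beta(1+h, 1+t),
# whose mean is (1+h)/(2+h+t). That closed form exists only because this
# toy model is conjugate; in general there is none, which is the point.
heads, tails = 7, 3
exact = (1 + heads) / (2 + heads + tails)
approx = mc_posterior_mean(heads, tails)
```

Here the approximation lands close to the exact value; in realistic models the exact value is unobtainable and the sampled estimate is all one has, which is the sense in which Bayesian inference "has to be" approximate.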

Gist of Idea

Bayesian inference is forced to rely on approximations

Source

Paul Thagard (Coherence: The Price is Right [2012], p.45)

Book Ref

-: 'Southern Journal of Philosophy' [-], p.45


A Reaction

Thagard makes this sound devastating, but then concedes that all theories have to rely on approximations, so I haven't quite grasped this idea. He gives references.

Related Idea

Idea 17599 The best theory has the highest subjective (Bayesian) probability? [Thagard]


The 7 ideas from 'Coherence: The Price is Right'

Coherence problems have positive and negative constraints; solutions maximise constraint satisfaction [Thagard]
Coherence is explanatory, deductive, conceptual, analogical, perceptual, and deliberative [Thagard]
Explanatory coherence needs symmetry, explanation, analogy, data priority, contradiction, competition, acceptance [Thagard]
Bayesian inference is forced to rely on approximations [Thagard]
The best theory has the highest subjective (Bayesian) probability? [Thagard]
Verisimilitude comes from including more phenomena, and revealing what underlies [Thagard]
Neither a priori rationalism nor sense-data empiricism accounts for scientific knowledge [Thagard]