Full Idea
It is well known that the general problem with Bayesian inference is that it is computationally intractable, so the algorithms used for computing posterior probabilities have to be approximations.
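The intractability Thagard mentions can be made concrete with a small sketch (not from Thagard; a minimal illustration under an assumed coin-flip model): for most models the posterior has no closed form, so it is estimated by sampling. Here rejection sampling approximates a posterior whose exact answer happens to be known, so the two can be compared.

```python
import random

# Hypothetical example: coin with unknown bias theta, uniform prior,
# observed 7 heads in 10 flips. This conjugate case has an exact posterior,
# Beta(8, 4); in general no closed form exists, hence approximation.
random.seed(0)
heads, flips = 7, 10

def likelihood(theta):
    return theta ** heads * (1 - theta) ** (flips - heads)

# Rejection sampling: draw theta from the prior and accept it with
# probability proportional to its likelihood.
max_lik = likelihood(heads / flips)  # likelihood peaks at the MLE
samples = []
while len(samples) < 20000:
    theta = random.random()          # draw from the uniform prior
    if random.random() < likelihood(theta) / max_lik:
        samples.append(theta)

approx_mean = sum(samples) / len(samples)
exact_mean = (heads + 1) / (flips + 2)   # mean of Beta(8, 4) = 2/3
print(f"approximate posterior mean: {approx_mean:.3f}")
print(f"exact posterior mean:       {exact_mean:.3f}")
```

With 20,000 accepted samples the Monte Carlo estimate agrees with the exact mean to about three decimal places; the point is that the sampled answer is always an approximation, however close.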
Gist of Idea
Bayesian inference is forced to rely on approximations
Source
Paul Thagard (Coherence: The Price is Right [2012], p.45)
Book Ref
'Southern Journal of Philosophy', p.45
A Reaction
Thagard makes this sound devastating, but then concedes that all theories have to rely on approximations, so I haven't quite grasped this idea. He gives references.
Related Ideas
Idea 17599 The best theory has the highest subjective (Bayesian) probability? [Thagard]
Idea 17596 Coherence problems have positive and negative constraints; solutions maximise constraint satisfaction [Thagard]
Idea 17597 Coherence is explanatory, deductive, conceptual, analogical, perceptual, and deliberative [Thagard]
Idea 17598 Explanatory coherence needs symmetry, explanation, analogy, data priority, contradiction, competition, acceptance [Thagard]
Idea 17602 Verisimilitude comes from including more phenomena, and revealing what underlies [Thagard]
Idea 17601 Neither a priori rationalism nor sense-data empiricism account for scientific knowledge [Thagard]