Combining Philosophers

All the ideas for Myles F. Burnyeat, Ian Hacking and E Conee / R Feldman



24 ideas

1. Philosophy / C. History of Philosophy / 4. Later European Philosophy / b. Seventeenth century philosophy
Gassendi is the first great empiricist philosopher [Hacking]
     Full Idea: Gassendi is the first in the great line of empiricist philosophers that gradually came to dominate European thought.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.5)
     A reaction: Epicurus, of course, was clearly an empiricist. British readers should note that Gassendi was not British.
2. Reason / D. Definition / 3. Types of Definition
A decent modern definition should always imply a semantics [Hacking]
     Full Idea: Today we expect that anything worth calling a definition should imply a semantics.
     From: Ian Hacking (What is Logic? [1979], §10)
     A reaction: He compares this with Gentzen 1935, who was attempting purely syntactic definitions of the logical connectives.
4. Formal Logic / B. Propositional Logic PL / 2. Tools of Propositional Logic / d. Basic theorems of PL
'Thinning' ('dilution') is the key difference between deduction (which allows it) and induction [Hacking]
     Full Idea: 'Dilution' (or 'Thinning') provides an essential contrast between deductive and inductive reasoning; for the introduction of new premises may spoil an inductive inference.
     From: Ian Hacking (What is Logic? [1979], §06.2)
     A reaction: That is, inductive logic (if there is such a thing) is clearly non-monotonic, whereas classical deductive logic is monotonic.
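     A sketch of the contrast, in standard sequent-calculus notation (my layout, not Hacking's): thinning says that adding a premise never spoils a deduction, which is exactly the monotonicity that inductive inference lacks.
     % Thinning (left weakening); plain LaTeX display
     \[
       \frac{\Gamma \vdash \Theta}{\Gamma, A \vdash \Theta}
       \qquad \text{(Thinning)}
     \]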
Gentzen's Cut Rule (or transitivity of deduction) is 'If A |- B and B |- C, then A |- C' [Hacking]
     Full Idea: If A |- B and B |- C, then A |- C. This generalises to: If Γ |- A,Θ and Γ,A |- Θ, then Γ |- Θ. Gentzen called this 'cut'. It is the transitivity of a deduction.
     From: Ian Hacking (What is Logic? [1979], §06.3)
     A reaction: I read the generalisation as 'If A can be either a premise or a conclusion, you can bypass it'. The first version is just transitivity (which bypasses the middle step).
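     The generalised rule quoted above, displayed as an inference figure (standard notation; assumes amsmath for \text):
     \[
       \frac{\Gamma \vdash A, \Theta \qquad \Gamma, A \vdash \Theta}{\Gamma \vdash \Theta}
       \qquad \text{(Cut)}
     \]
     Read it as: if A is derivable alongside Θ from Γ, and Θ is derivable once A is added to Γ, then A can be cut out and Θ derived from Γ directly.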
Only Cut reduces complexity, so logic is constructive without it, and it can be dispensed with [Hacking]
     Full Idea: Only the cut rule can have a conclusion that is less complex than its premises. Hence when cut is not used, a derivation is quite literally constructive, building up from components. Any theorem obtained by cut can be obtained without it.
     From: Ian Hacking (What is Logic? [1979], §08)
5. Theory of Logic / A. Overview of Logic / 4. Pure Logic
The various logics are abstractions made from terms like 'if...then' in English [Hacking]
     Full Idea: I don't believe English is by nature classical or intuitionistic etc. These are abstractions made by logicians. Logicians attend to numerous different objects that might be served by 'If...then', like material conditional, strict or relevant implication.
     From: Ian Hacking (What is Logic? [1979], §15)
     A reaction: The idea that they are 'abstractions' is close to my heart. Abstractions from what? Surely 'if...then' has a standard character when employed in normal conversation?
5. Theory of Logic / A. Overview of Logic / 5. First-Order Logic
First-order logic is the strongest complete compact theory with Löwenheim-Skolem [Hacking]
     Full Idea: First-order logic is the strongest complete compact theory with a Löwenheim-Skolem theorem.
     From: Ian Hacking (What is Logic? [1979], §13)
A limitation of first-order logic is that it cannot handle branching quantifiers [Hacking]
     Full Idea: Henkin proved that there is no first-order treatment of branching quantifiers, which do not seem to involve any idea that is fundamentally different from ordinary quantification.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: See Hacking for an example of branching quantifiers. Hacking is impressed by this as a real limitation of the first-order logic which he generally favours.
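     A standard illustration of a branching (Henkin) quantifier prefix (my example, not necessarily the one Hacking uses; assumes amsmath for the matrix):
     \[
       \begin{pmatrix} \forall x \; \exists y \\ \forall z \; \exists w \end{pmatrix} \varphi(x, y, z, w)
       \quad \text{is read as} \quad
       \exists f \, \exists g \; \forall x \, \forall z \; \varphi(x, f(x), z, g(z))
     \]
     Here y may depend only on x, and w only on z; no linear first-order quantifier prefix enforces that pattern of dependence, which is the limitation being pointed to.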
5. Theory of Logic / A. Overview of Logic / 7. Second-Order Logic
Second-order completeness seems to need intensional entities and possible worlds [Hacking]
     Full Idea: Second-order logic has no chance of a completeness theorem unless one ventures into intensional entities and possible worlds.
     From: Ian Hacking (What is Logic? [1979], §13)
5. Theory of Logic / E. Structures of Logic / 2. Logical Connectives / a. Logical connectives
With a pure notion of truth and consequence, the meanings of connectives are fixed syntactically [Hacking]
     Full Idea: My doctrine is that the peculiarity of the logical constants resides precisely in that given a certain pure notion of truth and consequence, all the desirable semantic properties of the constants are determined by their syntactic properties.
     From: Ian Hacking (What is Logic? [1979], §09)
     A reaction: He opposes this to Peacocke 1976, who claims that the logical connectives are essentially semantic in character, concerned with the preservation of truth.
5. Theory of Logic / E. Structures of Logic / 4. Variables in Logic
Perhaps variables could be dispensed with, by arrows joining places in the scope of quantifiers [Hacking]
     Full Idea: For some purposes the variables of first-order logic can be regarded as prepositions and place-holders that could in principle be dispensed with, say by a system of arrows indicating what places fall in the scope of which quantifier.
     From: Ian Hacking (What is Logic? [1979], §11)
     A reaction: I tend to think of variables as either pronouns, or as definite descriptions, or as temporary names, but not as prepositions. Must address this new idea...
5. Theory of Logic / J. Model Theory in Logic / 3. Löwenheim-Skolem Theorems
If it is a logic, the Löwenheim-Skolem theorem holds for it [Hacking]
     Full Idea: A Löwenheim-Skolem theorem holds for anything which, on my delineation, is a logic.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: I take this to be an unusually conservative view. Shapiro is the chap who can give you an alternative view of these things, or Boolos.
10. Modality / B. Possibility / 6. Probability
Probability was fully explained between 1654 and 1812 [Hacking]
     Full Idea: There is hardly any history of probability to record before Pascal (1654), and the whole subject is very well understood after Laplace (1812).
     From: Ian Hacking (The Emergence of Probability [1975], Ch.1)
     A reaction: An interesting little pointer on the question of whether the human race is close to exhausting all the available intellectual problems. What then?
Probability is statistical (behaviour of chance devices) or epistemological (belief based on evidence) [Hacking]
     Full Idea: Probability has two aspects: the degree of belief warranted by evidence, and the tendency displayed by some chance device to produce stable relative frequencies. These are the epistemological and statistical aspects of the subject.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.1)
     A reaction: The most basic distinction in the subject. Later (p.124) he suggests that the statistical form (known as 'aleatory' probability) is de re, and the other is de dicto.
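     One rough way to display the two aspects side by side (an illustration, not Hacking's own formulation):
     \[
       P(\text{heads}) \approx \frac{\text{heads observed}}{\text{tosses made}}
       \qquad \text{(statistical / aleatory: a stable relative frequency of a chance device)}
     \]
     \[
       P(\text{hypothesis} \mid \text{evidence})
       \qquad \text{(epistemological: the degree of belief the evidence warrants)}
     \]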
Epistemological probability is based either on logical implications or on coherent judgments [Hacking]
     Full Idea: Epistemological probability is torn between Keynes etc saying it depends on the strength of logical implication, and Ramsey etc saying it is personal judgement which is subject to strong rules of internal coherence.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.2)
     A reaction: See Idea 7449 for epistemological probability. My immediate intuition is that the Ramsey approach sounds much more plausible. In real life there are too many fine-grained particulars involved for straight implication to settle a probability.
11. Knowledge Aims / A. Knowledge / 4. Belief / c. Aim of beliefs
If the only aim is to believe truths, that justifies recklessly believing what is unsupported (if it is right) [Conee/Feldman]
     Full Idea: If it is intellectually required that one try to believe all and only truths (as Chisholm says), ...then it is possible to believe some unsubstantiated proposition in a reckless endeavour to believe a truth, and happen to be right.
     From: E Conee / R Feldman (Evidentialism [1985], 'Justification')
     A reaction: This implies doxastic voluntarism. Sorry! I meant, this implies that we can control what we believe, when actually we believe what impinges on us as facts.
13. Knowledge Criteria / A. Justification Problems / 2. Justification Challenges / c. Knowledge closure
We don't have the capacity to know all the logical consequences of our beliefs [Conee/Feldman]
     Full Idea: Our limited cognitive capacities lead Goldman to deny a principle instructing people to believe all the logical consequences of their beliefs, since they are unable to have the infinite number of beliefs that following such a principle would require.
     From: E Conee / R Feldman (Evidentialism [1985], 'Doxastic')
     A reaction: This doesn't sound like much of an objection to epistemic closure, which I took to be the claim that you know the 'known' entailments of your knowledge.
13. Knowledge Criteria / B. Internal Justification / 3. Evidentialism / a. Evidence
In the medieval view, only deduction counted as true evidence [Hacking]
     Full Idea: In the medieval view, evidence short of deduction was not really evidence at all.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.3)
     A reaction: Hacking says the modern concept of evidence comes with probability in the 17th century. That might make it one of the most important ideas ever thought of, allowing us to abandon certainties and live our lives in a more questioning way.
Formerly evidence came from people; the new idea was that things provided evidence [Hacking]
     Full Idea: In the medieval view, people provided the evidence of testimony and of authority. What was lacking was the seventeenth century idea of the evidence provided by things.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.4)
     A reaction: A most intriguing distinction, which seems to imply a huge shift in world-view. The culmination of this is Peirce's pragmatism, in Idea 6948, of which I strongly approve.
13. Knowledge Criteria / B. Internal Justification / 3. Evidentialism / b. Evidentialism
Evidentialism says justifications supervene on the available evidence [Conee/Feldman]
     Full Idea: Fundamentally Evidentialism is a supervenience thesis, according to which facts about whether or not a person is justified in believing a proposition supervene on facts describing the evidence the person has.
     From: E Conee / R Feldman (Introduction to 'Evidentialism' [2004], p.1)
     A reaction: If facts 'describe', does that make them linguistic? That's not how I use 'facts'. A statement of a fact is not the same as the fact. An ugly fact can be beautifully expressed. I am, however, in favour of evidence.
14. Science / A. Basis of Science / 3. Experiment
An experiment is a test, or an adventure, or a diagnosis, or a dissection [Hacking, by PG]
     Full Idea: An experiment is a test (if T, then E implies R, so try E, and if R follows, T seems right), an adventure (no theory, but try things), a diagnosis (reading the signs), or a dissection (taking apart).
     From: report of Ian Hacking (The Emergence of Probability [1975], Ch.4) by PG - Db (ideas)
     A reaction: A nice analysis. The Greeks did diagnosis, then the alchemists tried adventures, then Vesalius began dissections, then the followers of Bacon concentrated on the test, setting up controlled conditions. 'If you don't believe it, try it yourself'.
14. Science / D. Explanation / 2. Types of Explanation / a. Types of explanation
Follow maths for necessary truths, and jurisprudence for contingent truths [Hacking]
     Full Idea: Mathematics is the model for reasoning about necessary truths, but jurisprudence must be our model when we deliberate about contingencies.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.10)
     A reaction: Interesting. Certainly a huge amount of thought has gone into the law, especially since the Romans, and into creating rules of evidence. Maybe all philosophers should study law and mathematics?
20. Action / C. Motives for Action / 3. Acting on Reason / b. Intellectualism
Intellectualism is an excessive emphasis on reasoning in moral philosophy [Burnyeat]
     Full Idea: Intellectualism, a one-sided preoccupation with reason and reasoning, is a perennial failing in moral philosophy.
     From: Myles F. Burnyeat (Aristotle on Learning to be Good [1980], p.70)
     A reaction: But Kant's reply would be that while there is much more to moral behaviour, the only part which matters in morality is the reasoning part. And Socrates' view (ignorance is evil) is not obviously wrong.
20. Action / C. Motives for Action / 3. Acting on Reason / c. Reasons as causes
Rational decisions are either taken to be based on evidence, or to be explained causally [Conee/Feldman]
     Full Idea: In decision theory, there is a view according to which the rational basis for all decisions is evidential. This kind of decision theory is typically contrasted with causal decision theory.
     From: E Conee / R Feldman (Introduction to 'Evidentialism' [2004], p.3)
     A reaction: Your Kantian presumably likes rational reflection on evidence, and your modern reductive scientist prefers causality (which doesn't really sound very rational).