Combining Philosophers

All the ideas for H.Putnam/P.Oppenheim, Ian Hacking and Douglas Lackey



22 ideas

1. Philosophy / C. History of Philosophy / 4. Later European Philosophy / b. Seventeenth century philosophy
Gassendi is the first great empiricist philosopher [Hacking]
     Full Idea: Gassendi is the first in the great line of empiricist philosophers that gradually came to dominate European thought.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.5)
     A reaction: Epicurus, of course, was clearly an empiricist. British readers should note that Gassendi was not British.
2. Reason / D. Definition / 3. Types of Definition
A decent modern definition should always imply a semantics [Hacking]
     Full Idea: Today we expect that anything worth calling a definition should imply a semantics.
     From: Ian Hacking (What is Logic? [1979], §10)
     A reaction: He compares this with Gentzen 1935, who was attempting purely syntactic definitions of the logical connectives.
4. Formal Logic / B. Propositional Logic PL / 2. Tools of Propositional Logic / d. Basic theorems of PL
'Thinning' ('dilution') is the key difference between deduction (which allows it) and induction [Hacking]
     Full Idea: 'Dilution' (or 'Thinning') provides an essential contrast between deductive and inductive reasoning; for the introduction of new premises may spoil an inductive inference.
     From: Ian Hacking (What is Logic? [1979], §06.2)
     A reaction: That is, inductive logic (if there is such a thing) is clearly non-monotonic, whereas classical deductive logic is monotonic.
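The contrast can be put in sequent-calculus terms. Thinning says that adding premises never invalidates a deduction (monotonicity), which is exactly what fails for induction:

```latex
\frac{\Gamma \vdash \Theta}{\Gamma, A \vdash \Theta} \quad (\text{Thinning / Dilution})
```

Inductively, '1000 observed ravens were black, so all ravens are black' is spoiled by the new premise 'one observed raven was white'; deductively, no added premise can spoil a valid inference.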
Gentzen's Cut Rule (or transitivity of deduction) is 'If A |- B and B |- C, then A |- C' [Hacking]
     Full Idea: If A |- B and B |- C, then A |- C. This generalises to: If Γ|-A,Θ and Γ,A |- Θ, then Γ |- Θ. Gentzen called this 'cut'. It is the transitivity of a deduction.
     From: Ian Hacking (What is Logic? [1979], §06.3)
     A reaction: I read the generalisation as 'If A can be either a premise or a conclusion, you can bypass it'. The first version is just transitivity (which by-passes the middle step).
Only Cut reduces complexity, so logic is constructive without it, and it can be dispensed with [Hacking]
     Full Idea: Only the cut rule can have a conclusion that is less complex than its premises. Hence when cut is not used, a derivation is quite literally constructive, building up from components. Any theorem obtained by cut can be obtained without it.
     From: Ian Hacking (What is Logic? [1979], §08)
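The two versions of Cut quoted above can be set out in standard sequent notation:

```latex
\frac{A \vdash B \qquad B \vdash C}{A \vdash C}
\qquad\qquad
\frac{\Gamma \vdash A, \Theta \qquad \Gamma, A \vdash \Theta}{\Gamma \vdash \Theta} \quad (\text{Cut})
```

The cut formula A appears in the premises but not in the conclusion, which is why Cut alone can make a conclusion less complex than its premises; Gentzen's Hauptsatz (cut elimination) is the result that any theorem obtained with Cut can be obtained without it.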
5. Theory of Logic / A. Overview of Logic / 4. Pure Logic
The various logics are abstractions made from terms like 'if...then' in English [Hacking]
     Full Idea: I don't believe English is by nature classical or intuitionistic etc. These are abstractions made by logicians. Logicians attend to numerous different objects that might be served by 'If...then', like material conditional, strict or relevant implication.
     From: Ian Hacking (What is Logic? [1979], §15)
     A reaction: The idea that they are 'abstractions' is close to my heart. Abstractions from what? Surely 'if...then' has a standard character when employed in normal conversation?
5. Theory of Logic / A. Overview of Logic / 5. First-Order Logic
First-order logic is the strongest complete compact theory with Löwenheim-Skolem [Hacking]
     Full Idea: First-order logic is the strongest complete compact theory with a Löwenheim-Skolem theorem.
     From: Ian Hacking (What is Logic? [1979], §13)
A limitation of first-order logic is that it cannot handle branching quantifiers [Hacking]
     Full Idea: Henkin proved that there is no first-order treatment of branching quantifiers, which do not seem to involve any idea that is fundamentally different from ordinary quantification.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: See Hacking for an example of branching quantifiers. Hacking is impressed by this as a real limitation of the first-order logic which he generally favours.
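A standard illustration (not necessarily Hacking's own example) is the Henkin quantifier, in which the rows are evaluated independently, so that y depends only on x and v only on u:

```latex
\begin{pmatrix} \forall x \; \exists y \\ \forall u \; \exists v \end{pmatrix} \varphi(x, y, u, v)
```

No linear ordering of the four ordinary quantifiers captures these independent dependencies, which is the substance of the result Hacking cites.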
5. Theory of Logic / A. Overview of Logic / 7. Second-Order Logic
Second-order completeness seems to need intensional entities and possible worlds [Hacking]
     Full Idea: Second-order logic has no chance of a completeness theorem unless one ventures into intensional entities and possible worlds.
     From: Ian Hacking (What is Logic? [1979], §13)
5. Theory of Logic / E. Structures of Logic / 2. Logical Connectives / a. Logical connectives
With a pure notion of truth and consequence, the meanings of connectives are fixed syntactically [Hacking]
     Full Idea: My doctrine is that the peculiarity of the logical constants resides precisely in that given a certain pure notion of truth and consequence, all the desirable semantic properties of the constants are determined by their syntactic properties.
     From: Ian Hacking (What is Logic? [1979], §09)
     A reaction: He opposes this to Peacocke 1976, who claims that the logical connectives are essentially semantic in character, concerned with the preservation of truth.
5. Theory of Logic / E. Structures of Logic / 4. Variables in Logic
Perhaps variables could be dispensed with, by arrows joining places in the scope of quantifiers [Hacking]
     Full Idea: For some purposes the variables of first-order logic can be regarded as prepositions and place-holders that could in principle be dispensed with, say by a system of arrows indicating what places fall in the scope of which quantifier.
     From: Ian Hacking (What is Logic? [1979], §11)
     A reaction: I tend to think of variables as either pronouns, or as definite descriptions, or as temporary names, but not as prepositions. Must address this new idea...
5. Theory of Logic / J. Model Theory in Logic / 3. Löwenheim-Skolem Theorems
If it is a logic, the Löwenheim-Skolem theorem holds for it [Hacking]
     Full Idea: A Löwenheim-Skolem theorem holds for anything which, on my delineation, is a logic.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: I take this to be an unusually conservative view. Shapiro is the chap who can give you an alternative view of these things, or Boolos.
5. Theory of Logic / L. Paradox / 5. Paradoxes in Set Theory / b. Cantor's paradox
Sets always exceed terms, so all the sets must exceed all the sets [Lackey]
     Full Idea: Cantor proved that the number of sets in a collection of terms is larger than the number of terms. Hence Cantor's Paradox says the number of sets in the collection of all sets must be larger than the number of sets in the collection of all sets.
     From: Douglas Lackey (Intros to Russell's 'Essays in Analysis' [1973], p.127)
     A reaction: The sets must count as terms in the next iteration, but that is a normal application of the Power Set axiom.
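The theorem behind the paradox (Cantor's theorem: a collection of n terms yields 2^n subsets, always more than n) can be checked at small scale. A minimal Python sketch, with illustrative terms:

```python
from itertools import combinations

def powerset(s):
    """Return all subsets of s (the 'sets' in a collection of terms)."""
    items = list(s)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

terms = {"a", "b", "c"}
subsets = powerset(terms)
# 3 terms yield 2**3 = 8 subsets: the sets always outnumber the terms
print(len(terms), len(subsets))
```

The paradox arises only at the limit, when the 'terms' are themselves all the sets, so the same theorem forces the collection of all sets to outnumber itself.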
5. Theory of Logic / L. Paradox / 5. Paradoxes in Set Theory / c. Burali-Forti's paradox
It seems that the ordinal number of all the ordinals must be bigger than itself [Lackey]
     Full Idea: The ordinal series is well-ordered and thus has an ordinal number, and a series of ordinals to a given ordinal exceeds that ordinal by 1. So the series of all ordinals has an ordinal number that exceeds its own ordinal number by 1.
     From: Douglas Lackey (Intros to Russell's 'Essays in Analysis' [1973], p.127)
     A reaction: Formulated by Burali-Forti in 1897.
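In outline, with Ω as the ordinal number of the well-ordered series of all ordinals:

```latex
\Omega \text{ is itself an ordinal}
\;\Rightarrow\;
\mathrm{ord}(\text{ordinals up to and including } \Omega) = \Omega + 1
\;\Rightarrow\;
\Omega + 1 \le \Omega
```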
10. Modality / B. Possibility / 6. Probability
Probability is statistical (behaviour of chance devices) or epistemological (belief based on evidence) [Hacking]
     Full Idea: Probability has two aspects: the degree of belief warranted by evidence, and the tendency displayed by some chance device to produce stable relative frequencies. These are the epistemological and statistical aspects of the subject.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.1)
     A reaction: The most basic distinction in the subject. Later (p.124) he suggests that the statistical form (known as 'aleatory' probability) is de re, and the other is de dicto.
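The statistical ('aleatory') aspect can be illustrated by simulation: a chance device produces relative frequencies that stabilise as trials accumulate. A minimal sketch (the simulated coin and the trial counts are illustrative, not from Hacking):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def relative_frequency(trials, p=0.5):
    """Relative frequency of 'heads' from a chance device with bias p."""
    heads = sum(random.random() < p for _ in range(trials))
    return heads / trials

# As trials grow, the relative frequency settles near the device's tendency
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

The epistemological aspect, by contrast, would be modelled not as a long-run frequency but as a degree of belief in a proposition, updated as evidence comes in (e.g. by Bayes's theorem).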
Probability was fully explained between 1654 and 1812 [Hacking]
     Full Idea: There is hardly any history of probability to record before Pascal (1654), and the whole subject is very well understood after Laplace (1812).
     From: Ian Hacking (The Emergence of Probability [1975], Ch.1)
     A reaction: An interesting little pointer on the question of whether the human race is close to exhausting all the available intellectual problems. What then?
Epistemological probability is based either on logical implications or on coherent judgments [Hacking]
     Full Idea: Epistemological probability is torn between Keynes etc saying it depends on the strength of logical implication, and Ramsey etc saying it is personal judgement which is subject to strong rules of internal coherence.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.2)
     A reaction: See Idea 7449 for epistemological probability. My immediate intuition is that the Ramsey approach sounds much more plausible. In real life there are too many fine-grained particulars involved for straight implication to settle a probability.
13. Knowledge Criteria / B. Internal Justification / 3. Evidentialism / a. Evidence
In the medieval view, only deduction counted as true evidence [Hacking]
     Full Idea: In the medieval view, evidence short of deduction was not really evidence at all.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.3)
     A reaction: Hacking says the modern concept of evidence comes with probability in the 17th century. That might make it one of the most important ideas ever thought of, allowing us to abandon certainties and live our lives in a more questioning way.
Formerly evidence came from people; the new idea was that things provided evidence [Hacking]
     Full Idea: In the medieval view, people provided the evidence of testimony and of authority. What was lacking was the seventeenth century idea of the evidence provided by things.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.4)
     A reaction: A most intriguing distinction, which seems to imply a huge shift in world-view. The culmination of this is Peirce's pragmatism, in Idea 6948, of which I strongly approve.
14. Science / A. Basis of Science / 3. Experiment
An experiment is a test, or an adventure, or a diagnosis, or a dissection [Hacking, by PG]
     Full Idea: An experiment is a test (if T, then E implies R, so try E, and if R follows, T seems right), an adventure (no theory, but try things), a diagnosis (reading the signs), or a dissection (taking apart).
     From: report of Ian Hacking (The Emergence of Probability [1975], Ch.4) by PG - Db (ideas)
     A reaction: A nice analysis. The Greeks did diagnosis, then the alchemists tried adventures, then Vesalius began dissections, then the followers of Bacon concentrated on the test, setting up controlled conditions. 'If you don't believe it, try it yourself'.
14. Science / D. Explanation / 2. Types of Explanation / a. Types of explanation
Follow maths for necessary truths, and jurisprudence for contingent truths [Hacking]
     Full Idea: Mathematics is the model for reasoning about necessary truths, but jurisprudence must be our model when we deliberate about contingencies.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.10)
     A reaction: Interesting. Certainly huge thinking, especially since the Romans, has gone into the law, and creating rules of evidence. Maybe all philosophers should study law and mathematics?
14. Science / D. Explanation / 2. Types of Explanation / j. Explanations by reduction
Six reduction levels: groups, lives, cells, molecules, atoms, particles [Putnam/Oppenheim, by Watson]
     Full Idea: There are six 'reductive levels' in science: social groups, (multicellular) living things, cells, molecules, atoms, and elementary particles.
     From: report of H.Putnam/P.Oppenheim (Unity of Science as a Working Hypothesis [1958]) by Peter Watson - Convergence 10 'Intro'
     A reaction: I have the impression that fields are seen as more fundamental than elementary particles. What is the status of the 'laws' that are supposed to govern these things? What is the status of space and time within this picture?