Combining Texts

All the ideas for 'What is Logic?' (Ian Hacking), 'On Assertion and Indicative Conditionals' and 'On Multiplying Entities'



17 ideas

2. Reason / B. Laws of Thought / 6. Ockham's Razor
The quest for simplicity drove scientists to posit new entities, such as molecules in gases [Quine]
     Full Idea: It is the quest for system and simplicity that has kept driving the scientist to posit further entities as values of his variables. By positing molecules, Boyle's law of gases could be assimilated into a general theory of bodies in motion.
     From: Willard Quine (On Multiplying Entities [1974], p.262)
     A reaction: Interesting that a desire for simplicity might lead to multiplications of entities. In fact, I presume molecules had been proposed elsewhere in science, and were adopted in gas-theory because they were thought to exist, not because simplicity is nice.
In arithmetic, ratios, negatives, irrationals and imaginaries were created in order to generalise [Quine]
     Full Idea: In classical arithmetic, ratios were posited to make division generally applicable, negative numbers to make subtraction generally applicable, and irrationals and finally imaginaries to make exponentiation generally applicable.
     From: Willard Quine (On Multiplying Entities [1974], p.263)
     A reaction: This is part of Quine's proposal (c.f. Idea 8207) that entities have to be multiplied in order to produce simplicity. He is speculating. Maybe they are proposed because they are just obvious, and the generality is a nice side-effect.
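Quine's examples of generalising by positing new entities can be seen directly in Python's numeric tower (an illustrative sketch of my own, not from the text): rationals close division, integers close subtraction, and complex numbers close exponentiation.

```python
from fractions import Fraction
import cmath

# Ratios make division generally applicable: the integers are not
# closed under division, but the rationals are.
q = Fraction(3, 4) / Fraction(2, 5)    # Fraction(15, 8)

# Negatives make subtraction generally applicable: 2 - 5 has no answer
# among the naturals, but does among the integers.
d = 2 - 5                              # -3

# Imaginaries make exponentiation generally applicable: sqrt(-1) has no
# real value, but a complex one.
r = cmath.sqrt(-1)                     # 1j
```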
2. Reason / D. Definition / 3. Types of Definition
A decent modern definition should always imply a semantics [Hacking]
     Full Idea: Today we expect that anything worth calling a definition should imply a semantics.
     From: Ian Hacking (What is Logic? [1979], §10)
     A reaction: He compares this with Gentzen 1935, who was attempting purely syntactic definitions of the logical connectives.
4. Formal Logic / B. Propositional Logic PL / 2. Tools of Propositional Logic / d. Basic theorems of PL
'Thinning' ('dilution') is the key difference between deduction (which allows it) and induction [Hacking]
     Full Idea: 'Dilution' (or 'Thinning') provides an essential contrast between deductive and inductive reasoning; for the introduction of new premises may spoil an inductive inference.
     From: Ian Hacking (What is Logic? [1979], §06.2)
     A reaction: That is, inductive logic (if there is such a thing) is clearly non-monotonic, whereas classical deductive logic is monotonic.
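The contrast can be illustrated with a brute-force semantic check (a sketch; the `entails` helper is my own, not Hacking's): a deductive entailment survives any added premise, since extra premises only shrink the set of truth assignments that must be checked.

```python
from itertools import product

def entails(premises, conclusion, n_vars=3):
    """Semantic entailment, by enumerating every truth assignment."""
    return all(conclusion(*v)
               for v in product([False, True], repeat=n_vars)
               if all(p(*v) for p in premises))

premises = [lambda a, b, c: a]          # premise: A
conclusion = lambda a, b, c: a or b     # conclusion: A v B

assert entails(premises, conclusion)    # A |- A v B
# Thinning: adding an unrelated premise C cannot spoil the deduction.
assert entails(premises + [lambda a, b, c: c], conclusion)
```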
Gentzen's Cut Rule (or transitivity of deduction) is 'If A |- B and B |- C, then A |- C' [Hacking]
     Full Idea: If A |- B and B |- C, then A |- C. This generalises to: If Γ|-A,Θ and Γ,A |- Θ, then Γ |- Θ. Gentzen called this 'cut'. It is the transitivity of a deduction.
     From: Ian Hacking (What is Logic? [1979], §06.3)
     A reaction: I read the generalisation as 'if A can occur either as a premise or as a conclusion, you can bypass it'. The first version is just transitivity (which bypasses the middle step).
Only Cut reduces complexity, so logic is constructive without it, and it can be dispensed with [Hacking]
     Full Idea: Only the cut rule can have a conclusion that is less complex than its premises. Hence when cut is not used, a derivation is quite literally constructive, building up from components. Any theorem obtained by cut can be obtained without it.
     From: Ian Hacking (What is Logic? [1979], §08)
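The first version of Cut, as plain transitivity, can be verified by the same kind of brute-force semantic check (again an illustrative helper of my own, not Hacking's notation):

```python
from itertools import product

def entails(premises, conclusion, n_vars=3):
    """Semantic entailment, by enumerating every truth assignment."""
    return all(conclusion(*v)
               for v in product([False, True], repeat=n_vars)
               if all(p(*v) for p in premises))

A = lambda p, q, r: p and q   # A := P & Q
B = lambda p, q, r: p         # B := P
C = lambda p, q, r: p or r    # C := P v R

assert entails([A], B)   # A |- B
assert entails([B], C)   # B |- C
assert entails([A], C)   # Cut: A |- C, bypassing the middle formula B
```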
5. Theory of Logic / A. Overview of Logic / 4. Pure Logic
The various logics are abstractions made from terms like 'if...then' in English [Hacking]
     Full Idea: I don't believe English is by nature classical or intuitionistic etc. These are abstractions made by logicians. Logicians attend to numerous different objects that might be served by 'If...then', like material conditional, strict or relevant implication.
     From: Ian Hacking (What is Logic? [1979], §15)
     A reaction: The idea that they are 'abstractions' is close to my heart. Abstractions from what? Surely 'if...then' has a standard character when employed in normal conversation?
5. Theory of Logic / A. Overview of Logic / 5. First-Order Logic
First-order logic is the strongest complete compact theory with Löwenheim-Skolem [Hacking]
     Full Idea: First-order logic is the strongest complete compact theory with a Löwenheim-Skolem theorem.
     From: Ian Hacking (What is Logic? [1979], §13)
A limitation of first-order logic is that it cannot handle branching quantifiers [Hacking]
     Full Idea: Henkin proved that there is no first-order treatment of branching quantifiers, which do not seem to involve any idea that is fundamentally different from ordinary quantification.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: See Hacking for an example of branching quantifiers. Hacking is impressed by this as a real limitation of the first-order logic which he generally favours.
5. Theory of Logic / A. Overview of Logic / 7. Second-Order Logic
Second-order completeness seems to need intensional entities and possible worlds [Hacking]
     Full Idea: Second-order logic has no chance of a completeness theorem unless one ventures into intensional entities and possible worlds.
     From: Ian Hacking (What is Logic? [1979], §13)
5. Theory of Logic / E. Structures of Logic / 2. Logical Connectives / a. Logical connectives
With a pure notion of truth and consequence, the meanings of connectives are fixed syntactically [Hacking]
     Full Idea: My doctrine is that the peculiarity of the logical constants resides precisely in that given a certain pure notion of truth and consequence, all the desirable semantic properties of the constants are determined by their syntactic properties.
     From: Ian Hacking (What is Logic? [1979], §09)
     A reaction: He opposes this to Peacocke 1976, who claims that the logical connectives are essentially semantic in character, concerned with the preservation of truth.
5. Theory of Logic / E. Structures of Logic / 4. Variables in Logic
Perhaps variables could be dispensed with, by arrows joining places in the scope of quantifiers [Hacking]
     Full Idea: For some purposes the variables of first-order logic can be regarded as prepositions and place-holders that could in principle be dispensed with, say by a system of arrows indicating what places fall in the scope of which quantifier.
     From: Ian Hacking (What is Logic? [1979], §11)
     A reaction: I tend to think of variables as either pronouns, or as definite descriptions, or as temporary names, but not as prepositions. Must address this new idea...
5. Theory of Logic / J. Model Theory in Logic / 3. Löwenheim-Skolem Theorems
If it is a logic, the Löwenheim-Skolem theorem holds for it [Hacking]
     Full Idea: A Löwenheim-Skolem theorem holds for anything which, on my delineation, is a logic.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: I take this to be an unusually conservative view. Shapiro is the chap who can give you an alternative view of these things, or Boolos.
7. Existence / B. Change in Existence / 4. Events / c. Reduction of events
Explaining events just by bodies can't explain two events identical in space-time [Quine]
     Full Idea: An account of events just in terms of physical bodies does not distinguish between events that happen to take up just the same portion of space-time. A man's whistling and walking would be identified with the same temporal segment of the man.
     From: Willard Quine (On Multiplying Entities [1974], p.260)
     A reaction: We wouldn't want to make his 'walking' and his 'strolling' two events. Whistling and walking are different because different objects are involved (lips and legs). Hence a man is not (ontologically) a single object.
10. Modality / A. Necessity / 11. Denial of Necessity
Necessity could be just generalisation over classes, or (maybe) quantifying over possibilia [Quine]
     Full Idea: The need to add a note of necessity to 'all black crows are black' could be met by a generalisation over classes (what belongs to sets x and y belongs to y), or maybe be quantifying over possible particulars.
     From: Willard Quine (On Multiplying Entities [1974], p.262)
     A reaction: He dislikes the second strategy because 'unactualized particulars are an obscure and troublesome lot'. The second is the strategy of Lewis. I think necessity starts to creep back in as soon as you ask WHY a generalisation holds true.
10. Modality / B. Possibility / 8. Conditionals / c. Truth-function conditionals
'If A,B' affirms that A⊃B, and also that this wouldn't change if A were certain [Jackson, by Edgington]
     Full Idea: According to Jackson, in asserting 'If A,B' the speaker expresses his belief that A⊃B, and also indicates that this belief is 'robust' with respect to the antecedent A - the speaker would not abandon A⊃B if he were to learn that A.
     From: report of Frank Jackson (On Assertion and Indicative Conditionals [1979]) by Dorothy Edgington - Conditionals (Stanf) 4.2
     A reaction: The point is that you must not believe A⊃B solely on the dubious grounds of ¬A. This is 'to ensure an assertable conditional is fit for modus ponens' - that is, that you really will affirm B when you learn that A is true. Nice idea.
Conditionals are truth-functional, but should only be asserted when belief in them is robust [Jackson, by Edgington]
     Full Idea: Jackson holds that conditionals are truth-functional, but are governed by rules of assertability, rather like 'but' compared to 'and'. The belief must be 'robust' - the speaker would not abandon his belief that A⊃B if he were to learn that A.
     From: report of Frank Jackson (On Assertion and Indicative Conditionals [1979]) by Dorothy Edgington - Conditionals 17.3.2
     A reaction: This seems to spell out more precisely the pragmatic approach to conditionals pioneered by Grice, in Idea 13767. The idea is to make conditionals 'fit for modus ponens'. They mustn't just be based on a belief that ¬A.
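Jackson's point can be made concrete with the material conditional's truth table (an illustrative sketch, not Jackson's own presentation): ¬A alone is enough to make A⊃B true, yet a belief held on that ground alone settles nothing about B once A is learned.

```python
def implies(a: bool, b: bool) -> bool:
    """Material conditional A ⊃ B: false only when A is true and B is false."""
    return (not a) or b

# Believing ¬A is enough, on its own, to make A ⊃ B true (vacuously):
assert implies(False, True) and implies(False, False)

# But a belief held on that ground alone is not 'robust': once A is
# learned, the conditional's truth depends entirely on B, so it was
# never fit for modus ponens.
assert implies(True, True)
assert not implies(True, False)
```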