Combining Texts

All the ideas for 'What is Logic?', 'Explanation and Reference' and 'Difference and Repetition'



22 ideas

1. Philosophy / H. Continental Philosophy / 1. Continental Philosophy
'Difference' refers to that which eludes capture [Deleuze, by May]
     Full Idea: 'Difference' is a term which Deleuze uses to refer to that which eludes capture.
     From: report of Gilles Deleuze (Difference and Repetition [1968]) by Todd May - Gilles Deleuze 3.03
     A reaction: Presumably its ancestor is Kant's noumenon. This is one of his concepts used to 'palpate' our ossified conceptual scheme.
2. Reason / D. Definition / 3. Types of Definition
A decent modern definition should always imply a semantics [Hacking]
     Full Idea: Today we expect that anything worth calling a definition should imply a semantics.
     From: Ian Hacking (What is Logic? [1979], §10)
     A reaction: He compares this with Gentzen 1935, who was attempting purely syntactic definitions of the logical connectives.
4. Formal Logic / B. Propositional Logic PL / 2. Tools of Propositional Logic / d. Basic theorems of PL
'Thinning' ('dilution') is the key difference between deduction (which allows it) and induction [Hacking]
     Full Idea: 'Dilution' (or 'Thinning') provides an essential contrast between deductive and inductive reasoning; for the introduction of new premises may spoil an inductive inference.
     From: Ian Hacking (What is Logic? [1979], §06.2)
     A reaction: That is, inductive logic (if there is such a thing) is clearly non-monotonic, whereas classical deductive logic is monotonic.
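     A sketch of the rule in question, in Gentzen-style sequent notation (my formulation, not a quotation from Hacking): thinning adds an unused premise (or an extra conclusion) without disturbing the derivation, which is exactly what an inductive inference cannot guarantee.
     \[
     \frac{\Gamma \vdash \Theta}{\Gamma, A \vdash \Theta}\ (\text{thinning, left})
     \qquad
     \frac{\Gamma \vdash \Theta}{\Gamma \vdash \Theta, A}\ (\text{thinning, right})
     \]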
Gentzen's Cut Rule (or transitivity of deduction) is 'If A |- B and B |- C, then A |- C' [Hacking]
     Full Idea: If A |- B and B |- C, then A |- C. This generalises to: If Γ|-A,Θ and Γ,A |- Θ, then Γ |- Θ. Gentzen called this 'cut'. It is the transitivity of a deduction.
     From: Ian Hacking (What is Logic? [1979], §06.3)
     A reaction: I read the generalisation as 'If A can be either a premise or a conclusion, you can bypass it'. The first version is just transitivity (which bypasses the middle step).
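     The generalised version quoted above can be set out as a sequent rule (a standard rendering, assuming Gentzen's notation rather than Hacking's exact layout):
     \[
     \frac{\Gamma \vdash \Theta, A \qquad \Gamma, A \vdash \Theta}{\Gamma \vdash \Theta}\ (\text{cut})
     \]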
Only Cut reduces complexity, so logic is constructive without it, and it can be dispensed with [Hacking]
     Full Idea: Only the cut rule can have a conclusion that is less complex than its premises. Hence when cut is not used, a derivation is quite literally constructive, building up from components. Any theorem obtained by cut can be obtained without it.
     From: Ian Hacking (What is Logic? [1979], §08)
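     One standard way to put the point about constructiveness (my gloss, not Hacking's wording) is the subformula property of cut-free proofs:
     \[
     \text{In a cut-free derivation of } \Gamma \vdash \Theta, \text{ every formula occurring is a subformula of some formula in } \Gamma \cup \Theta.
     \]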
5. Theory of Logic / A. Overview of Logic / 4. Pure Logic
The various logics are abstractions made from terms like 'if...then' in English [Hacking]
     Full Idea: I don't believe English is by nature classical or intuitionistic etc. These are abstractions made by logicians. Logicians attend to numerous different objects that might be served by 'If...then', like material conditional, strict or relevant implication.
     From: Ian Hacking (What is Logic? [1979], §15)
     A reaction: The idea that they are 'abstractions' is close to my heart. Abstractions from what? Surely 'if...then' has a standard character when employed in normal conversation?
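     For illustration, two of the 'objects' Hacking mentions can be defined in a line (standard definitions, not drawn from his text), while relevant implication resists any such one-line reduction because it further requires the antecedent to be relevant to the consequent:
     \[
     A \supset B \;:=\; \neg A \vee B \quad (\text{material conditional})
     \qquad
     A \Rightarrow B \;:=\; \Box(A \supset B) \quad (\text{strict implication})
     \]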
5. Theory of Logic / A. Overview of Logic / 5. First-Order Logic
First-order logic is the strongest complete compact theory with Löwenheim-Skolem [Hacking]
     Full Idea: First-order logic is the strongest complete compact theory with a Löwenheim-Skolem theorem.
     From: Ian Hacking (What is Logic? [1979], §13)
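     Hacking's claim here echoes Lindström's characterisation theorem; a schematic statement (my formulation, for reference):
     \[
     \mathcal{L} \supseteq \mathcal{L}_{\mathrm{FO}} \ \wedge\ \text{Compactness}(\mathcal{L}) \ \wedge\ \text{L\"owenheim--Skolem}(\mathcal{L}) \ \Rightarrow\ \mathcal{L} \equiv \mathcal{L}_{\mathrm{FO}}
     \]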
A limitation of first-order logic is that it cannot handle branching quantifiers [Hacking]
     Full Idea: Henkin proved that there is no first-order treatment of branching quantifiers, which do not seem to involve any idea that is fundamentally different from ordinary quantification.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: See Hacking for an example of branching quantifiers. Hacking is impressed by this as a real limitation of the first-order logic which he generally favours.
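     A standard example of a branching (Henkin) quantifier prefix, with its second-order reading via Skolem functions (not necessarily the example Hacking himself gives):
     \[
     \begin{pmatrix} \forall x\,\exists y \\ \forall u\,\exists v \end{pmatrix} \varphi(x,y,u,v)
     \;\Longleftrightarrow\;
     \exists f\,\exists g\,\forall x\,\forall u\; \varphi(x,f(x),u,g(u))
     \]
     Here y depends only on x and v only on u, a dependency pattern no linear first-order quantifier prefix can express.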
5. Theory of Logic / A. Overview of Logic / 7. Second-Order Logic
Second-order completeness seems to need intensional entities and possible worlds [Hacking]
     Full Idea: Second-order logic has no chance of a completeness theorem unless one ventures into intensional entities and possible worlds.
     From: Ian Hacking (What is Logic? [1979], §13)
5. Theory of Logic / E. Structures of Logic / 2. Logical Connectives / a. Logical connectives
With a pure notion of truth and consequence, the meanings of connectives are fixed syntactically [Hacking]
     Full Idea: My doctrine is that the peculiarity of the logical constants resides precisely in that given a certain pure notion of truth and consequence, all the desirable semantic properties of the constants are determined by their syntactic properties.
     From: Ian Hacking (What is Logic? [1979], §09)
     A reaction: He opposes this to Peacocke 1976, who claims that the logical connectives are essentially semantic in character, concerned with the preservation of truth.
5. Theory of Logic / E. Structures of Logic / 4. Variables in Logic
Perhaps variables could be dispensed with, by arrows joining places in the scope of quantifiers [Hacking]
     Full Idea: For some purposes the variables of first-order logic can be regarded as prepositions and place-holders that could in principle be dispensed with, say by a system of arrows indicating what places fall in the scope of which quantifier.
     From: Ian Hacking (What is Logic? [1979], §11)
     A reaction: I tend to think of variables as either pronouns, or as definite descriptions, or as temporary names, but not as prepositions. Must address this new idea...
5. Theory of Logic / F. Referring in Logic / 1. Naming / a. Names
Using proper names properly doesn't involve necessary and sufficient conditions [Putnam]
     Full Idea: The important thing about proper names is that it would be ridiculous to think that having linguistic competence can be equated in their case with knowledge of a necessary and sufficient condition.
     From: Hilary Putnam (Explanation and Reference [1973], II B)
5. Theory of Logic / J. Model Theory in Logic / 3. Löwenheim-Skolem Theorems
If it is a logic, the Löwenheim-Skolem theorem holds for it [Hacking]
     Full Idea: A Löwenheim-Skolem theorem holds for anything which, on my delineation, is a logic.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: I take this to be an unusually conservative view. Shapiro is the chap who can give you an alternative view of these things, or Boolos.
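     For reference, the downward theorem in its simplest form (a standard statement, not Hacking's wording):
     \[
     \text{If a countable set of first-order sentences has a model, it has a countable model.}
     \]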
7. Existence / A. Nature of Existence / 3. Being / a. Nature of Being
'Being' is univocal, but its subject matter is actually 'difference' [Deleuze]
     Full Idea: Being is said in a single and same sense of everything of which it is said, but that of which it is said differs: it is said of difference itself.
     From: Gilles Deleuze (Difference and Repetition [1968], p.36), quoted by Todd May - Gilles Deleuze 3.03
     A reaction: This is an attempt to express the Heraclitean view of reality, as process, movement, multiplicity - something which always eludes our attempts to pin it down.
Ontology can be continual creation, not to know being, but to probe the unknowable [Deleuze]
     Full Idea: Ontology can be an ontology of difference ....where what is there is not the same old things but a process of continual creation, an ontology that does not seek to reduce being to the knowable, but widens thought to palpate the unknowable.
     From: Gilles Deleuze (Difference and Repetition [1968]), quoted by Todd May - Gilles Deleuze 5.05
     A reaction: I'm inclined to think that the first duty of ontology is to face up to the knowable. I'm not sure that probing the unknowable, with no success or prospect of it, is a good way to spend a life. Probing ('palpating') can sometimes discover things.
7. Existence / A. Nature of Existence / 3. Being / i. Deflating being
Ontology does not tell what there is; it is just a strange adventure [Deleuze, by May]
     Full Idea: In Deleuze's hands ontology is not a matter of telling us what there is, but of taking us on strange adventures.
     From: report of Gilles Deleuze (Difference and Repetition [1968]) by Todd May - Gilles Deleuze 3.03
     A reaction: Presumably you only indulge in the strange adventure because you have no idea how to specify what there is. This sounds like the essence of post-modernism, in which life is just a game.
Being is a problem to be engaged, not solved, and needs a new mode of thinking [Deleuze, by May]
     Full Idea: In Deleuze, Being is not a puzzle to be solved but a problem to be engaged. It is to be engaged by a thought that moves as comfortably among problems as it does among solutions, as fluidly among differences as it does among identities.
     From: report of Gilles Deleuze (Difference and Repetition [1968]) by Todd May - Gilles Deleuze 4.01
     A reaction: This sounds like what I've always known as 'negative capability' (thanks to Keats). Is philosophy just a hobby, like playing darts? It seems that the aim of the process is 'liberation', about which I would like to know more.
9. Objects / D. Essence of Objects / 5. Essence as Kind
Putnam bases essences on 'same kind', but same kinds may not share properties [Mackie,P on Putnam]
     Full Idea: The only place for essentialism to come from in Putnam's semantic account is out of the 'same kind' relation. But if the same kind relation can be cashed out in terms that do not involve sharing properties (apart from 'being water') there is a gap.
     From: comment on Hilary Putnam (Explanation and Reference [1973]) by Penelope Mackie - How Things Might Have Been 10.4
     A reaction: [This is the criticism of Salmon and Mellor] See Mackie's discussion for details. I would always have thought that relations result from essences, so could never be used to define them.
14. Science / B. Scientific Theories / 2. Aim of Science
Science aims at truth, not at 'simplicity' [Putnam]
     Full Idea: Scientists are not trying to maximise some formal property of 'simplicity'; they are trying to maximise truth.
     From: Hilary Putnam (Explanation and Reference [1973], III B)
     A reaction: This seems to be aimed at the Mill-Ramsey-Lewis account of laws of nature, as the simplest axioms of experience. I'm with Putnam (as he was at this date).
19. Language / B. Reference / 3. Direct Reference / b. Causal reference
I now think reference by the tests of experts is a special case of being causally connected [Putnam]
     Full Idea: In previous papers I suggested that the reference is fixed by a test known to experts; it now seems to me that this is just a special case of my use being causally connected to an introducing event.
     From: Hilary Putnam (Explanation and Reference [1973], II C)
     A reaction: I think he was probably right the first time, and has now wandered off course.
26. Natural Theory / B. Natural Kinds / 5. Reference to Natural Kinds
Natural kind stereotypes are 'strong' (obvious, like tiger) or 'weak' (obscure, like molybdenum) [Putnam]
     Full Idea: Natural kinds can be associated with 'strong' stereotypes (giving a strong picture of a typical member, like a tiger), or with 'weak' stereotypes (with no idea of a sufficient condition, such as molybdenum or elm).
     From: Hilary Putnam (Explanation and Reference [1973], II C)
Express natural kinds as a posteriori predicate connections, not as singular terms [Putnam, by Mackie,P]
     Full Idea: Putnam implies dispensing with the designation of natural kinds by singular terms in favour of the postulation of necessary but a posteriori connections between predicates. ...We might call this 'predicate essentialism', but not 'de re essentialism'.
     From: report of Hilary Putnam (Explanation and Reference [1973]) by Penelope Mackie - How Things Might Have Been 10.1
     A reaction: It is characteristic of modern discussion that the logical form of natural kind statements is held to be crucial, rather than an account of nature in any old ways that do the job. So do I prefer singular terms, or predicate-connections? Hm.