Combining Philosophers

All the ideas for Augustin-Louis Cauchy, Ian Hacking and Albert Casullo



28 ideas

1. Philosophy / C. History of Philosophy / 4. Later European Philosophy / b. Seventeenth century philosophy
Gassendi is the first great empiricist philosopher [Hacking]
     Full Idea: Gassendi is the first in the great line of empiricist philosophers that gradually came to dominate European thought.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.5)
     A reaction: Epicurus, of course, was clearly an empiricist. British readers should note that Gassendi was not British.
2. Reason / D. Definition / 3. Types of Definition
A decent modern definition should always imply a semantics [Hacking]
     Full Idea: Today we expect that anything worth calling a definition should imply a semantics.
     From: Ian Hacking (What is Logic? [1979], §10)
     A reaction: He compares this with Gentzen 1935, who was attempting purely syntactic definitions of the logical connectives.
4. Formal Logic / B. Propositional Logic (PL) / 2. Tools of Propositional Logic / d. Basic theorems of PL
'Thinning' ('dilution') is the key difference between deduction (which allows it) and induction [Hacking]
     Full Idea: 'Dilution' (or 'Thinning') provides an essential contrast between deductive and inductive reasoning; for the introduction of new premises may spoil an inductive inference.
     From: Ian Hacking (What is Logic? [1979], §06.2)
     A reaction: That is, inductive logic (if there is such a thing) is clearly non-monotonic, whereas classical deductive logic is monotonic.
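     In modern sequent notation (a standard rendering, offered as an illustration rather than a quotation from Hacking), Thinning licenses adding a premise without loss:
          \[ \frac{\Gamma \vdash \Theta}{\Gamma, A \vdash \Theta}\ (\text{Thinning}) \]
     A stock example of why induction lacks this rule: 'Tweety is a bird' inductively supports 'Tweety flies', but adding the premise 'Tweety is a penguin' destroys the inference, so inductive support is not preserved under extra premises.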
Gentzen's Cut Rule (or transitivity of deduction) is 'If A |- B and B |- C, then A |- C' [Hacking]
     Full Idea: If A |- B and B |- C, then A |- C. This generalises to: If Γ|-A,Θ and Γ,A |- Θ, then Γ |- Θ. Gentzen called this 'cut'. It is the transitivity of a deduction.
     From: Ian Hacking (What is Logic? [1979], §06.3)
     A reaction: I read the generalisation as 'If A can be either a premise or a conclusion, you can bypass it'. The first version is just transitivity (which by-passes the middle step).
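     In the same sequent notation (again a standard rendering rather than Hacking's own layout), the generalised Cut rule is:
          \[ \frac{\Gamma \vdash A, \Theta \qquad \Gamma, A \vdash \Theta}{\Gamma \vdash \Theta}\ (\text{Cut}) \]
     If A is obtainable as one of the conclusions from Γ, and Γ together with A yields Θ, then A can be cut out and Γ yields Θ directly.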
Only Cut reduces complexity, so logic is constructive without it, and it can be dispensed with [Hacking]
     Full Idea: Only the cut rule can have a conclusion that is less complex than its premises. Hence when cut is not used, a derivation is quite literally constructive, building up from components. Any theorem obtained by cut can be obtained without it.
     From: Ian Hacking (What is Logic? [1979], §08)
5. Theory of Logic / A. Overview of Logic / 4. Pure Logic
The various logics are abstractions made from terms like 'if...then' in English [Hacking]
     Full Idea: I don't believe English is by nature classical or intuitionistic etc. These are abstractions made by logicians. Logicians attend to numerous different objects that might be served by 'If...then', like material conditional, strict or relevant implication.
     From: Ian Hacking (What is Logic? [1979], §15)
     A reaction: The idea that they are 'abstractions' is close to my heart. Abstractions from what? Surely 'if...then' has a standard character when employed in normal conversation?
5. Theory of Logic / A. Overview of Logic / 5. First-Order Logic
First-order logic is the strongest complete compact theory with Löwenheim-Skolem [Hacking]
     Full Idea: First-order logic is the strongest complete compact theory with a Löwenheim-Skolem theorem.
     From: Ian Hacking (What is Logic? [1979], §13)
A limitation of first-order logic is that it cannot handle branching quantifiers [Hacking]
     Full Idea: Henkin proved that there is no first-order treatment of branching quantifiers, which do not seem to involve any idea that is fundamentally different from ordinary quantification.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: See Hacking for an example of branching quantifiers. Hacking is impressed by this as a real limitation of the first-order logic which he generally favours.
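     A standard example of a branching (Henkin) quantifier, given as an illustration rather than quoted from Hacking's paper:
          \[ \begin{pmatrix} \forall x\, \exists y \\ \forall u\, \exists v \end{pmatrix} \phi(x,y,u,v) \quad\text{holds iff}\quad \exists f\, \exists g\, \forall x\, \forall u\ \phi(x, f(x), u, g(u)) \]
     Here y is chosen depending only on x, and v only on u; that mutual independence of the two rows is what cannot be captured by any linear ordering of first-order quantifiers.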
5. Theory of Logic / A. Overview of Logic / 7. Second-Order Logic
Second-order completeness seems to need intensional entities and possible worlds [Hacking]
     Full Idea: Second-order logic has no chance of a completeness theorem unless one ventures into intensional entities and possible worlds.
     From: Ian Hacking (What is Logic? [1979], §13)
5. Theory of Logic / E. Structures of Logic / 2. Logical Connectives / a. Logical connectives
With a pure notion of truth and consequence, the meanings of connectives are fixed syntactically [Hacking]
     Full Idea: My doctrine is that the peculiarity of the logical constants resides precisely in that given a certain pure notion of truth and consequence, all the desirable semantic properties of the constants are determined by their syntactic properties.
     From: Ian Hacking (What is Logic? [1979], §09)
     A reaction: He opposes this to Peacocke 1976, who claims that the logical connectives are essentially semantic in character, concerned with the preservation of truth.
5. Theory of Logic / E. Structures of Logic / 4. Variables in Logic
Perhaps variables could be dispensed with, by arrows joining places in the scope of quantifiers [Hacking]
     Full Idea: For some purposes the variables of first-order logic can be regarded as prepositions and place-holders that could in principle be dispensed with, say by a system of arrows indicating what places fall in the scope of which quantifier.
     From: Ian Hacking (What is Logic? [1979], §11)
     A reaction: I tend to think of variables as either pronouns, or as definite descriptions, or as temporary names, but not as prepositions. Must address this new idea...
5. Theory of Logic / J. Model Theory in Logic / 3. Löwenheim-Skolem Theorems
If it is a logic, the Löwenheim-Skolem theorem holds for it [Hacking]
     Full Idea: A Löwenheim-Skolem theorem holds for anything which, on my delineation, is a logic.
     From: Ian Hacking (What is Logic? [1979], §13)
     A reaction: I take this to be an unusually conservative view. Shapiro is the chap who can give you an alternative view of these things, or Boolos.
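     For reference, a standard statement of the theorem (not a quotation from Hacking): if a first-order theory T in a countable language has an infinite model, then
          \[ T \text{ has a model of cardinality } \kappa \text{ for every infinite cardinal } \kappa \]
     The downward half (a countable model exists) is what full second-order logic fails, which connects with Hacking's use of the theorem as a demarcation line.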
6. Mathematics / A. Nature of Mathematics / 5. The Infinite / k. Infinitesimals
Values that approach zero, becoming less than any quantity, are 'infinitesimals' [Cauchy]
     Full Idea: When the successive absolute values of a variable decrease indefinitely in such a way as to become less than any given quantity, that variable becomes what is called an 'infinitesimal'. Such a variable has zero as its limit.
     From: Augustin-Louis Cauchy (Cours d'Analyse [1821], p.19), quoted by Philip Kitcher - The Nature of Mathematical Knowledge 10.4
     A reaction: The creator of the important idea of the limit still talked in terms of infinitesimals. In the next generation the limit took over completely.
6. Mathematics / A. Nature of Mathematics / 5. The Infinite / l. Limits
When successive variable values approach a fixed value, that is its 'limit' [Cauchy]
     Full Idea: When the values successively attributed to the same variable approach indefinitely a fixed value, eventually differing from it by as little as one could wish, that fixed value is called the 'limit' of all the others.
     From: Augustin-Louis Cauchy (Cours d'Analyse [1821], p.19), quoted by Philip Kitcher - The Nature of Mathematical Knowledge 10.4
     A reaction: This seems to be a highly significant proposal, because you can now treat that limit as a number, and add things to it. It opens the door to Cantor's infinities. Is the 'limit' just a fiction?
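     Cauchy's two definitions translate directly into the modern ε-N form (a standard rendering, not Cauchy's own notation):
          \[ \lim_{n\to\infty} x_n = \ell \;\iff\; \forall \varepsilon > 0\ \exists N\ \forall n > N\ |x_n - \ell| < \varepsilon \]
     An 'infinitesimal' in his sense is then simply a variable whose limit ℓ is 0, i.e. whose absolute value eventually falls below any given positive quantity.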
10. Modality / A. Necessity / 11. Denial of Necessity
Maybe modal sentences cannot be true or false [Casullo]
     Full Idea: Some people claim that modal sentences do not express truths or falsehoods.
     From: Albert Casullo (A Priori Knowledge [2002], 3.2)
     A reaction: I can only imagine this coming from a narrow hardline empiricist. It seems to me obvious that we make true or false statements about what is possible or impossible.
10. Modality / B. Possibility / 6. Probability
Probability was fully explained between 1654 and 1812 [Hacking]
     Full Idea: There is hardly any history of probability to record before Pascal (1654), and the whole subject is very well understood after Laplace (1812).
     From: Ian Hacking (The Emergence of Probability [1975], Ch.1)
     A reaction: An interesting little pointer on the question of whether the human race is close to exhausting all the available intellectual problems. What then?
Probability is statistical (behaviour of chance devices) or epistemological (belief based on evidence) [Hacking]
     Full Idea: Probability has two aspects: the degree of belief warranted by evidence, and the tendency displayed by some chance device to produce stable relative frequencies. These are the epistemological and statistical aspects of the subject.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.1)
     A reaction: The most basic distinction in the subject. Later (p.124) he suggests that the statistical form (known as 'aleatory' probability) is de re, and the other is de dicto.
Epistemological probability is based either on logical implications or coherent judgments [Hacking]
     Full Idea: Epistemological probability is torn between Keynes etc saying it depends on the strength of logical implication, and Ramsey etc saying it is personal judgement which is subject to strong rules of internal coherence.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.2)
     A reaction: See Idea 7449 for epistemological probability. My immediate intuition is that the Ramsey approach sounds much more plausible. In real life there are too many fine-grained particulars involved for straight implication to settle a probability.
10. Modality / D. Knowledge of Modality / 1. A Priori Necessary
If the necessary is a priori, so is the contingent, because the same evidence is involved [Casullo]
     Full Idea: If one can only know a priori that a proposition is necessary, then one can know only a priori that a proposition is contingent. The evidence relevant to determining the latter is the same as that relevant to determining the former.
     From: Albert Casullo (A Priori Knowledge [2002], 3.2)
     A reaction: This seems a telling point, but I suppose it is obvious. If you see that the cat is on the mat, nothing in the situation tells you whether this is contingent or necessary. We assume it is contingent, but that may be an a priori assumption.
12. Knowledge Sources / A. A Priori Knowledge / 1. Nature of the A Priori
Epistemic a priori conditions concern either the source, defeasibility or strength [Casullo]
     Full Idea: There are three suggested epistemic conditions on a priori knowledge: the first regards the source of justification, the second regards the defeasibility of justification, and the third appeals to the strength of justification.
     From: Albert Casullo (A Priori Knowledge [2002], 2)
     A reaction: [compressed] He says these are all inspired by Kant. The suggested non-epistemic conditions involve necessity or analyticity. The source would have to be entirely mental; the defeaters could not be experiential; the strength would be certainty.
The main claim of defenders of the a priori is that some justifications are non-experiential [Casullo]
     Full Idea: The leading claim of proponents of the a priori is that sources of justification are of two significantly different types: experiential and nonexperiential. Initially this difference is marked at the phenomenological level.
     From: Albert Casullo (A Priori Knowledge [2002], 5)
     A reaction: He cites Plantinga and Bealer for the phenomenological starting point (that some knowledge just seems rationally obvious, certain, and perhaps necessary).
12. Knowledge Sources / A. A Priori Knowledge / 4. A Priori as Necessities
Analysis of the a priori by necessity or analyticity addresses the proposition, not the justification [Casullo]
     Full Idea: There is reason to view non-epistemic analyses of a priori knowledge (in terms of necessity or analyticity) with suspicion. The a priori concerns justification. Analysis by necessity or analyticity concerns the proposition rather than the justification.
     From: Albert Casullo (A Priori Knowledge [2002], 2.1)
     A reaction: [compressed] The fact that the a priori is entirely a mode of justification, rather than a type of truth, is the modern view, influenced by Kripke. Given that assumption, this is a good objection.
12. Knowledge Sources / A. A Priori Knowledge / 10. A Priori as Subjective
Maybe imagination is the source of a priori justification [Casullo]
     Full Idea: Some maintain that experiments in imagination are the source of a priori justification.
     From: Albert Casullo (A priori/A posteriori [1992], p.1)
     A reaction: What else could assessments of possibility and necessity be based on except imagination?
13. Knowledge Criteria / A. Justification Problems / 1. Justification / c. Defeasibility
'Overriding' defeaters rule it out, and 'undermining' defeaters weaken it [Casullo]
     Full Idea: A justified belief that a proposition is not true is an 'overriding' defeater, ...and the belief that a justification is inadequate or defective is an 'undermining' defeater.
     From: Albert Casullo (A Priori Knowledge [2002], n 40)
     A reaction: Sounds more like a sliding scale than a binary option. Quite useful, though.
13. Knowledge Criteria / B. Internal Justification / 3. Evidentialism / a. Evidence
In the medieval view, only deduction counted as true evidence [Hacking]
     Full Idea: In the medieval view, evidence short of deduction was not really evidence at all.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.3)
     A reaction: Hacking says the modern concept of evidence comes with probability in the 17th century. That might make it one of the most important ideas ever thought of, allowing us to abandon certainties and live our lives in a more questioning way.
Formerly evidence came from people; the new idea was that things provided evidence [Hacking]
     Full Idea: In the medieval view, people provided the evidence of testimony and of authority. What was lacking was the seventeenth century idea of the evidence provided by things.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.4)
     A reaction: A most intriguing distinction, which seems to imply a huge shift in world-view. The culmination of this is Peirce's pragmatism, in Idea 6948, of which I strongly approve.
14. Science / A. Basis of Science / 3. Experiment
An experiment is a test, or an adventure, or a diagnosis, or a dissection [Hacking, by PG]
     Full Idea: An experiment is a test (if T, then E implies R, so try E, and if R follows, T seems right), an adventure (no theory, but try things), a diagnosis (reading the signs), or a dissection (taking apart).
     From: report of Ian Hacking (The Emergence of Probability [1975], Ch.4) by PG - Db (ideas)
     A reaction: A nice analysis. The Greeks did diagnosis, then the alchemists tried adventures, then Vesalius began dissections, then the followers of Bacon concentrated on the test, setting up controlled conditions. 'If you don't believe it, try it yourself'.
14. Science / D. Explanation / 2. Types of Explanation / a. Types of explanation
Follow maths for necessary truths, and jurisprudence for contingent truths [Hacking]
     Full Idea: Mathematics is the model for reasoning about necessary truths, but jurisprudence must be our model when we deliberate about contingencies.
     From: Ian Hacking (The Emergence of Probability [1975], Ch.10)
     A reaction: Interesting. Certainly huge thinking, especially since the Romans, has gone into the law, and creating rules of evidence. Maybe all philosophers should study law and mathematics?