Combining Texts

All the ideas for 'The Case for Closure' [Hawthorne], 'Outline of a System of Utilitarianism' [Smart] and 'Brainchildren' [Dennett]



14 ideas

10. Modality / D. Knowledge of Modality / 1. A Priori Necessary
Philosophers regularly confuse failures of imagination with insights into necessity [Dennett]
11. Knowledge Aims / B. Certain Knowledge / 2. Common Sense Certainty
Commitment to 'I have a hand' only makes sense in a context where it has been doubted [Hawthorne]
13. Knowledge Criteria / A. Justification Problems / 2. Justification Challenges / c. Knowledge closure
How can we know the heavyweight implications of normal knowledge? Must we distort 'knowledge'? [Hawthorne]
We wouldn't know the logical implications of our knowledge if small risks added up to big risks [Hawthorne]
Denying closure means denying we know P even when we know the conjunction 'P and Q', which is absurd in such simple cases [Hawthorne]
13. Knowledge Criteria / B. Internal Justification / 4. Foundationalism / f. Foundationalism critique
That every mammal has a mother is a secure reality, but without foundations [Dennett]
15. Nature of Minds / B. Features of Minds / 1. Consciousness / a. Consciousness
Does consciousness need the concept of consciousness? [Dennett]
15. Nature of Minds / B. Features of Minds / 1. Consciousness / c. Parts of consciousness
Maybe language is crucial to consciousness [Dennett]
15. Nature of Minds / B. Features of Minds / 4. Intentionality / b. Intentionality theories
Unconscious intentionality is the foundation of the mind [Dennett]
17. Mind and Body / C. Functionalism / 1. Functionalism
Could a robot be made conscious just by software? [Dennett]
18. Thought / B. Mechanics of Thought / 4. Language of Thought
A language of thought doesn't explain content [Dennett]
18. Thought / D. Concepts / 5. Concepts and Language / c. Concepts without language
Maybe there can be non-conscious concepts (e.g. in bees) [Dennett]
23. Ethics / E. Utilitarianism / 1. Utilitarianism
Negative utilitarianism implies that the world should be destroyed, to avoid future misery [Smart]
23. Ethics / E. Utilitarianism / 3. Motivation for Altruism
Any group interested in ethics must surely have a sentiment of generalised benevolence [Smart]