Combining Philosophers

Ideas for David-Hillel Ruben, Sarah Bakewell and David Papineau

8 ideas

15. Nature of Minds / B. Features of Minds / 1. Consciousness / a. Consciousness
Whether octopuses feel pain is unclear, because our phenomenal concepts are too vague [Papineau]
     Full Idea: Our phenomenal concepts are irredeemably vague in certain dimensions, in ways that preclude there being any fact of the matter about whether octopuses feel phenomenal pain, or silicon-based humanoids would have any phenomenal consciousness.
     From: David Papineau (Thinking about Consciousness [2002], Intro §7)
     A reaction: It would be hard for Papineau to prove this point, but clearly our imagination finds it very hard to grasp the idea of a thing which is 'somewhat conscious'. The concept of being much more conscious than humans also bewilders us.
Our concept of consciousness is crude, and lacks theoretical articulation [Papineau]
     Full Idea: Our phenomenal concept of consciousness-as-such is a crude tool, lacking theoretical articulation.
     From: David Papineau (Thinking about Consciousness [2002], 7.13)
     A reaction: This is a point well made. Given that the human brain is the most complex thing (for its size) in the known universe, we shouldn't expect it to divide up into three or four clear-cut activities. Compare the precision of 'geography' as a concept.
We can’t decide what 'conscious' means, so it is undecidable whether cats are conscious [Papineau]
     Full Idea: If consciousness is availability for HOT judgements, then cats are not conscious, but if it consists in attention, then they are. I say the concept of consciousness is indefinite between the two, so there is no fact about whether cats are conscious.
     From: David Papineau (Thinking about Consciousness [2002], 7.16)
     A reaction: Nice point. My personal view is that the question of whether cats are conscious is hopeless because philosophers insist on making consciousness all-or-nothing (e.g. Idea 5786). If I experienced cat mentality, I might say I was 'semi-conscious'.
15. Nature of Minds / B. Features of Minds / 1. Consciousness / e. Cause of consciousness
Maybe a creature is conscious if its mental states represent things in a distinct way [Papineau]
     Full Idea: The thesis of 'representational theories of consciousness' is that a creature is conscious just in case it is in a certain kind of representational state, some state which represents in a certain way.
     From: David Papineau (Thinking about Consciousness [2002])
     A reaction: [He cites Harman, Dretske and Tye] The immediate impediment I see to this view is the extreme difficulty of explaining what the special 'way' is that turns representations into consciousness. Some mental states are not representational, and vice versa.
15. Nature of Minds / B. Features of Minds / 1. Consciousness / f. Higher-order thought
The 'actualist' HOT theory says consciousness comes from actual higher judgements of mental states [Papineau]
     Full Idea: The 'actualist' HOT theory says that a state is conscious if the subject is 'aware' of it, where this is understood as a matter of the subject forming some actual Higher-Order judgement about it.
     From: David Papineau (Thinking about Consciousness [2002], 7.11)
     A reaction: As stated there seems an obvious regress problem. Is the consciousness in the mental state, or in the higher awareness of it? If the former, how does being observed make it conscious? If the latter, what gives the higher level its consciousness?
Actualist HOT theories imply that a non-conscious mental event could become conscious when remembered [Papineau]
     Full Idea: Actualist HOT theories face an awkward problem with memory judgements: ...how can an earlier mental state be rendered conscious by some later act of memory? As when I see a red pillar box with no higher-order judgement, and then recall it later.
     From: David Papineau (Thinking about Consciousness [2002], 7.11)
     A reaction: [See 7886 for 'Actualist' HOT theories] This is not altogether absurd. A red pillar box could be somewhere in my field of vision, and then I might suddenly become conscious of it (if it moved!). Police interrogation reminds me of what I only glimpsed.
States are conscious if they could be the subject of higher-order mental judgements [Papineau]
     Full Idea: The 'dispositional' HOT thesis says that a state is conscious just in case it could have been the subject of an introspective Higher-Order judgement, even if it wasn't actually so subject.
     From: David Papineau (Thinking about Consciousness [2002], 7.13)
     A reaction: [He cites Dennett and Carruthers for this view] This is designed to meet other problems, but it sounds odd. Does it really make no difference whether the higher-order judgement actually occurs? How can conscious events be distinguished once they've gone?
Higher-order judgements may be possible where the subject denies having been conscious [Papineau]
     Full Idea: Dispositional Higher-Order judgeability will be present in some cases which the empirical methodology catalogues as not conscious (as when a subject denies having heard a sound, or seen a bird).
     From: David Papineau (Thinking about Consciousness [2002], 7.13)
     A reaction: (This attacks Idea 7887) This confirms my intuition, that we can be quite unconscious of things which can still be recalled at a later date. Of course, one could always challenge the reliability of the subject's report in such a case.