Full Idea
Searle originally directed the Chinese Room against machine intentionality rather than consciousness, arguing that it is "understanding" that the room lacks; but on Searle's view intentionality requires consciousness.
Gist of Idea
Maybe understanding doesn't need consciousness, despite what Searle seems to think.
Source
report of John Searle (Minds, Brains and Science [1984]) by David J. Chalmers - The Conscious Mind 4.9.4
Book Ref
Chalmers, David J.: 'The Conscious Mind' [OUP 1997], p.322
A Reaction
I doubt whether 'understanding' is a sufficiently clear and distinct concept to support Searle's claim. Understanding comes in degrees, and we often think and act with minimal understanding.
2427 | Maybe understanding doesn't need consciousness, despite what Searle seems to think [Searle, by Chalmers]
7389 | A program won't contain understanding if it is small enough to imagine [Dennett on Searle] |
7390 | If bigger and bigger brain parts can't understand, how can a whole brain? [Dennett on Searle] |
5789 | I now think syntax is not in the physics, but in the eye of the beholder [Searle] |
3496 | A program for Chinese translation doesn't need to understand Chinese [Searle] |
3384 | The person couldn't run Searle's Chinese Room without understanding Chinese [Kim] |
3216 | Is the room functionally the same as a Chinese speaker? [Rey] |
3220 | Searle is guilty of the fallacy of division - attributing a property of the whole to a part [Rey] |
2428 | Maybe the whole Chinese Room understands Chinese, though the person doesn't [Chalmers] |
6654 | A computer program is equivalent to the person AND the manual [Lowe] |
7335 | The Chinese Room should be able to ask itself questions in Mandarin [Westaway] |