
17. Mind and Body / C. Functionalism / 7. Chinese Room

[counterexample of non-conscious function]

11 ideas
Maybe understanding doesn't need consciousness, despite what Searle seems to think [Searle, by Chalmers]
     Full Idea: Searle originally directed the Chinese Room against machine intentionality rather than consciousness, arguing that it is "understanding" that the room lacks, ... but on Searle's view intentionality requires consciousness.
     From: report of John Searle (Minds, Brains and Science [1984]) by David J. Chalmers - The Conscious Mind 4.9.4
     A reaction: I doubt whether 'understanding' is a sufficiently clear and distinct concept to support Searle's claim. Understanding comes in degrees, and we often think and act with minimal understanding.
A program won't contain understanding if it is small enough to imagine [Dennett on Searle]
     Full Idea: There is nothing remotely like genuine understanding in any hunk of programming small enough to imagine readily.
     From: comment on John Searle (Minds, Brains and Science [1984]) by Daniel C. Dennett - Consciousness Explained 14.1
     A reaction: We mustn't hide behind 'complexity', but I think Dennett is right. It is important to think of speed as well as complexity. Searle gives the impression that he knows exactly what 'understanding' is, but I doubt if anyone else does.
If bigger and bigger brain parts can't understand, how can a whole brain? [Dennett on Searle]
     Full Idea: The argument that begins "this little bit of brain activity doesn't understand Chinese, and neither does this bigger bit..." is headed for the unwanted conclusion that even the activity of the whole brain won't account for understanding Chinese.
     From: comment on John Searle (Minds, Brains and Science [1984]) by Daniel C. Dennett - Consciousness Explained 14.1
     A reaction: In other words, Searle is guilty of a fallacy of composition (in negative form - the parts don't have it, so the whole can't have it). Dennett is right. The whole shebang of the full brain will obviously do wonderful (and commonplace) things that brain bits can't.
I now think syntax is not in the physics, but in the eye of the beholder [Searle]
     Full Idea: It seems to me now that syntax is not intrinsic to the physics of the system, but is in the eye of the beholder.
     From: John Searle (The Mystery of Consciousness [1997], Ch.1)
     A reaction: This seems right, in that whether strung beads are a toy or an abacus depends on the user. It doesn't follow that the 'beholder' stands outside the physics. A beholder is another physical system, of a particular type of high complexity.
A program for Chinese translation doesn't need to understand Chinese [Searle]
     Full Idea: A computer, me for example, could run the steps in the program for some mental capacity, such as understanding Chinese, without understanding a word of Chinese.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.II)
     A reaction: I don't think this is true. I could recite a bit of Chinese without comprehension, but giving flexible answers to complex questions isn't plausible just by gormlessly implementing a procedure.
The person couldn't run Searle's Chinese Room without understanding Chinese [Kim]
     Full Idea: It is by no means clear that any human could manage to do what Searle imagines himself to be doing in the Chinese Room - that is, short of throwing away the rule book and learning some real Chinese.
     From: Jaegwon Kim (Philosophy of Mind [1996], p.100)
     A reaction: It is not clear how a rule book could contain answers to an infinity of possible questions. The Chinese Room is just a very poor analogy with what is envisaged in the project of artificial intelligence.
Is the room functionally the same as a Chinese speaker? [Rey]
     Full Idea: The question for a computational-representation theory about the Chinese Room is: is what is happening inside the room functionally equivalent to what is happening inside a normal Chinese speaker?
     From: Georges Rey (Contemporary Philosophy of Mind [1997], 10.2.1)
     A reaction: Certainly the Room lacks morality ('how can I torture my sister?'). It won't spot connections between recent questions. It won't ask itself questions. It will take years to spot absurd questions.
Searle is guilty of the fallacy of division - attributing a property of the whole to a part [Rey]
     Full Idea: You should no more attribute understanding of Chinese to this one part of the system than you should ascribe the properties of the entire British Empire to Queen Victoria. This is the fallacy of division.
     From: Georges Rey (Contemporary Philosophy of Mind [1997], 10.2.3)
     A reaction: This very nicely pinpoints what is wrong with the Chinese Room argument (nice analogy, too). If you carefully introspect what is involved when you 'understand' something, it is immensely complex, though it feels instant and simple.
Maybe the whole Chinese Room understands Chinese, though the person doesn't [Chalmers]
     Full Idea: Opponents typically reply to Searle's argument by conceding that the person in the room does not understand Chinese, and arguing that the understanding should instead be attributed to the system consisting of the person and the pieces of paper.
     From: David J. Chalmers (The Conscious Mind [1996], 4.9.4)
     A reaction: Searle himself spotted this reply. It seems plausible to say that a book contains 'understanding', so the translation dictionary may have it. A good Room would cope with surprise questions.
A computer program is equivalent to the person AND the manual [Lowe]
     Full Idea: A computer executing its program is not equivalent to the English-speaker in the Chinese Room, but to the combination of the English-speaker and the operation manual.
     From: E.J. Lowe (Introduction to the Philosophy of Mind [2000], Ch. 8)
     A reaction: Searle replies that there would be no understanding even if the person learned the manual off by heart. However, if we ask 'Is there any understanding of the universe in Newton's book?' the answer has to be 'yes'. So the manual contains understanding.
The Chinese Room should be able to ask itself questions in Mandarin [Westaway]
     Full Idea: If the Chinese Room is functionally equivalent to a Mandarin speaker, it ought to be able to ask itself questions in Mandarin (and it can't).
     From: Luke Westaway (talk [2005]), quoted by PG - Db (ideas)
     A reaction: Searle might triumphantly say that this proves there is no understanding in the room, but the objection won't go away, because the room is presumably functionally equivalent to a speaker, and not just a mere translator (who might use mechanical tricks).