Combining Philosophers

Ideas for François-Marie Voltaire, John Searle and Barbara Vetter




8 ideas

17. Mind and Body / C. Functionalism / 1. Functionalism
Functionalists like the externalist causal theory of reference [Searle]
     Full Idea: Functionalism has been rejuvenated by being joined to externalist causal theories of reference.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 2.VIII)
     A reaction: This, however, seems to be roughly the reason why Putnam gave up his functionalist theory. See Ideas 2332 and 2071. However the causal network of mind can incorporate environmental features.
17. Mind and Body / C. Functionalism / 7. Chinese Room
Maybe understanding doesn't need consciousness, despite what Searle seems to think [Searle, by Chalmers]
     Full Idea: Searle originally directed the Chinese Room against machine intentionality rather than consciousness, arguing that it is "understanding" that the room lacks, ...but on Searle's view intentionality requires consciousness.
     From: report of John Searle (Minds, Brains and Science [1984]) by David J. Chalmers - The Conscious Mind 4.9.4
     A reaction: I doubt whether 'understanding' is a sufficiently clear and distinct concept to support Searle's claim. Understanding comes in degrees, and we often think and act with minimal understanding.
A program won't contain understanding if it is small enough to imagine [Dennett on Searle]
     Full Idea: There is nothing remotely like genuine understanding in any hunk of programming small enough to imagine readily.
     From: comment on John Searle (Minds, Brains and Science [1984]) by Daniel C. Dennett - Consciousness Explained 14.1
     A reaction: We mustn't hide behind 'complexity', but I think Dennett is right. It is important to think of speed as well as complexity. Searle gives the impression that he knows exactly what 'understanding' is, but I doubt if anyone else does.
If bigger and bigger brain parts can't understand, how can a whole brain? [Dennett on Searle]
     Full Idea: The argument that begins "this little bit of brain activity doesn't understand Chinese, and neither does this bigger bit..." is headed for the unwanted conclusion that even the activity of the whole brain won't account for understanding Chinese.
     From: comment on John Searle (Minds, Brains and Science [1984]) by Daniel C. Dennett - Consciousness Explained 14.1
     A reaction: In other words, Searle is guilty of a fallacy of composition (in negative form - parts don't have it, so whole can't have it). Dennett is right. The whole shebang of the full brain will obviously do wonderful (and commonplace) things brain bits can't.
A program for Chinese translation doesn't need to understand Chinese [Searle]
     Full Idea: A computer, me for example, could run the steps in the program for some mental capacity, such as understanding Chinese, without understanding a word of Chinese.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.II)
     A reaction: I don't think this is true. I could recite a bit of Chinese without comprehension, but giving flexible answers to complex questions isn't plausible just by gormlessly implementing a procedure.
I now think syntax is not in the physics, but in the eye of the beholder [Searle]
     Full Idea: It seems to me now that syntax is not intrinsic to the physics of the system, but is in the eye of the beholder.
     From: John Searle (The Mystery of Consciousness [1997], Ch.1)
     A reaction: This seems right, in that whether strung beads are a toy or an abacus depends on the user. It doesn't follow that the 'beholder' stands outside the physics. A beholder is another physical system, of a particular type of high complexity.
17. Mind and Body / C. Functionalism / 8. Functionalism critique
Computation presupposes consciousness [Searle]
     Full Idea: Most of the works I have seen in the computational theory of the mind commit some variation on the homunculus fallacy.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.VI)
     A reaction: This will be because there is an unspoken user for the inner computer. But see Fodor's view (Idea 2506). The key idea here is Dennett's: that not all regresses are vicious. My mind controller isn't like all of me.
If we are computers, who is the user? [Searle]
     Full Idea: If the brain is a digital computer, we are still faced with the question 'Who is the user?'
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.VI)
     A reaction: A very nice question. Our whole current concept of a computer involves the unmentioned user. We don't have to go all mystical about persons, though. Robots aren't logically impossible.