Combining Philosophers

Ideas for Michael Stanford, Kurt Gödel and John Searle




30 ideas

17. Mind and Body / A. Mind-Body Dualism / 2. Interactionism
Mind and brain don't interact if they are the same [Searle]
     Full Idea: There is no "link" between consciousness and the brain, any more than there is a link between the liquidity of water and the H2O molecules.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 4.III)
     A reaction: We say of some properties that 'x is F', and of others that 'x has F', and of others that 'x is F because of y' (as in a knife having sharpness because it is thin and hard). Consciousness might fit the third case just as well as the first.
17. Mind and Body / A. Mind-Body Dualism / 7. Zombies
Without internal content, a zombie's full behaviour couldn't be explained [Searle]
     Full Idea: There could be no intentional zombie, because (unlike with a conscious agent) there simply is no fact of the matter as to exactly which aspectual shapes its alleged intentional states have. Is it seeking water or H2O?
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 7.III)
     A reaction: The obvious response to this is behaviourist talk of 'dispositions'. The dispositions of scientist when seeking water and when seeking H2O are different. Zombies behave identically to us, so their intentional states have whatever is needed to do the job.
17. Mind and Body / B. Behaviourism / 4. Behaviourism Critique
Mental states only relate to behaviour contingently, not necessarily [Searle]
     Full Idea: I believe that the relation of mental states to behaviour is purely contingent.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 1.V.5)
     A reaction: I don't think I agree, though it will depend on where you draw the line between mental states and behaviour. No two identical states have ever occurred since the beginning of time, so it is a little hard to test this one.
Wanting H2O only differs from wanting water in its mental component [Searle]
     Full Idea: If a person exhibits water-seeking behaviour, they also exhibit H2O-seeking behaviour, so there is no way the behaviour itself, without reference to a mental component, can constitute wanting water rather than H2O.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 7.II.4)
     A reaction: What about the behaviour of responding to the discovery that this stuff isn't actually H2O? Or the disposition to choose the real thing rather than ersatz water? An interesting comment, though.
17. Mind and Body / C. Functionalism / 1. Functionalism
Functionalists like the externalist causal theory of reference [Searle]
     Full Idea: Functionalism has been rejuvenated by being joined to externalist causal theories of reference.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 2.VIII)
     A reaction: This, however, seems to be roughly the reason why Putnam gave up his functionalist theory. See Ideas 2332 and 2071. However, the causal network of mind can incorporate environmental features.
17. Mind and Body / C. Functionalism / 2. Machine Functionalism
Basic logic can be done by syntax, with no semantics [Gödel, by Rey]
     Full Idea: Gödel in his completeness theorem for first-order logic showed that a certain set of syntactically specifiable rules was adequate to capture all first-order valid arguments. No semantics (e.g. reference, truth, validity) was necessary.
     From: report of Kurt Gödel (On Formally Undecidable Propositions [1931]) by Georges Rey - Contemporary Philosophy of Mind 8.2
     A reaction: This implies that a logic machine is possible, but we shouldn't raise our hopes for proper rationality. Validity can be shown for purely algebraic arguments, but rationality requires truth as well as validity, and that needs propositions and semantics.
17. Mind and Body / C. Functionalism / 7. Chinese Room
Maybe understanding doesn't need consciousness, despite what Searle seems to think [Searle, by Chalmers]
     Full Idea: Searle originally directed the Chinese Room against machine intentionality rather than consciousness, arguing that it is "understanding" that the room lacks, …but on Searle's view intentionality requires consciousness.
     From: report of John Searle (Minds, Brains and Science [1984]) by David J. Chalmers - The Conscious Mind 4.9.4
     A reaction: I doubt whether 'understanding' is a sufficiently clear and distinct concept to support Searle's claim. Understanding comes in degrees, and we often think and act with minimal understanding.
A program won't contain understanding if it is small enough to imagine [Dennett on Searle]
     Full Idea: There is nothing remotely like genuine understanding in any hunk of programming small enough to imagine readily.
     From: comment on John Searle (Minds, Brains and Science [1984]) by Daniel C. Dennett - Consciousness Explained 14.1
     A reaction: We mustn't hide behind 'complexity', but I think Dennett is right. It is important to think of speed as well as complexity. Searle gives the impression that he knows exactly what 'understanding' is, but I doubt if anyone else does.
If bigger and bigger brain parts can't understand, how can a whole brain? [Dennett on Searle]
     Full Idea: The argument that begins "this little bit of brain activity doesn't understand Chinese, and neither does this bigger bit..." is headed for the unwanted conclusion that even the activity of the whole brain won't account for understanding Chinese.
     From: comment on John Searle (Minds, Brains and Science [1984]) by Daniel C. Dennett - Consciousness Explained 14.1
     A reaction: In other words, Searle is guilty of a fallacy of composition (in negative form - the parts don't have it, so the whole can't have it). Dennett is right. The whole shebang of the full brain will obviously do wonderful (and commonplace) things brain bits can't.
A program for Chinese translation doesn't need to understand Chinese [Searle]
     Full Idea: A computer, me for example, could run the steps in the program for some mental capacity, such as understanding Chinese, without understanding a word of Chinese.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.II)
     A reaction: I don't think this is true. I could recite a bit of Chinese without comprehension, but giving flexible answers to complex questions isn't plausible just by gormlessly implementing a procedure.
I now think syntax is not in the physics, but in the eye of the beholder [Searle]
     Full Idea: It seems to me now that syntax is not intrinsic to the physics of the system, but is in the eye of the beholder.
     From: John Searle (The Mystery of Consciousness [1997], Ch.1)
     A reaction: This seems right, in that whether strung beads are a toy or an abacus depends on the user. It doesn't follow that the 'beholder' stands outside the physics. A beholder is another physical system, of a particular type of high complexity.
17. Mind and Body / C. Functionalism / 8. Functionalism critique
Computation presupposes consciousness [Searle]
     Full Idea: Most of the works I have seen in the computational theory of the mind commit some variation on the homunculus fallacy.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.VI)
     A reaction: This will be because there is an unspoken user for the inner computer. But see Fodor's view (Idea 2506). The key idea here is Dennett's: that not all regresses are vicious. My mind controller isn't like all of me.
If we are computers, who is the user? [Searle]
     Full Idea: If the brain is a digital computer, we are still faced with the question 'Who is the user?'
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.VI)
     A reaction: A very nice question. Our whole current concept of a computer involves the unmentioned user. We don't have to go all mystical about persons, though. Robots aren't logically impossible.
17. Mind and Body / D. Property Dualism / 1. Reductionism critique
Consciousness has a first-person ontology, so it cannot be reduced without omitting something [Searle]
     Full Idea: Consciousness has a first-person or subjective ontology and so cannot be reduced to anything that has third-person or objective ontology. If you try to reduce or eliminate one in favour of the other you leave something out.
     From: John Searle (The Mystery of Consciousness [1997], Concl 2.10)
     A reaction: Misconceived. There is no such thing as 'first-person' ontology, though there are subjective viewpoints, but then a camera has a viewpoint which is lost if you eliminate it. If consciousness is physical events, that leaves viewpoints untouched.
17. Mind and Body / D. Property Dualism / 3. Property Dualism
Property dualists tend to find the mind-body problem baffling [Searle]
     Full Idea: Property dualists (e.g. Nagel and McGinn) think that the mind-body problem is frightfully difficult, perhaps altogether insoluble.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 1.I)
     A reaction: Nagel's problem is that our concepts aren't up to it; McGinn's is that the very structure of our minds isn't up to it. My view is that the difficulty is the complexity we are up against, not the ontology.
Property dualism denies reductionism [Searle]
     Full Idea: What is property dualism but the view that there are irreducible mental properties?
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 5.III)
     A reaction: Being red and being square are separate, but they are both entailed by the material basis, and hence are reducible. Properties may not link directly, but they must link indirectly.
Consciousness is a brain property as liquidity is a water property [Searle]
     Full Idea: Consciousness is a higher-level or emergent property of the brain, but only in the sense that solidity is an emergent property of water when it is ice, and liquidity when it melts.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 1.IV)
     A reaction: It is hard to know which side Searle is on. These examples are highly reductive, and make him a thoroughgoing reductive physicalist (with which I agree).
Property dualism is the reappearance of Cartesianism [Searle]
     Full Idea: Opponents of materialism tend to embrace "property dualism", thus accepting the Cartesian apparatus that I had thought long discredited.
     From: John Searle (The Rediscovery of the Mind [1992], Intro)
     A reaction: This seems to be precisely the current situation. Cartesian dualism is thoroughly marginalised (but still whimpering in the corner), and the real battle is between physicalism and property dualism. The latter is daft.
17. Mind and Body / D. Property Dualism / 4. Emergentism
There is non-event causation between mind and brain, as between a table and its solidity [Searle]
     Full Idea: The solidity of a table is explained causally by the behaviour of the molecules of which it is composed, but the solidity is not an extra event, it is just a feature of the table. This non-event causation models the relationship of mind and brain.
     From: John Searle (The Mystery of Consciousness [1997], Ch.1)
     A reaction: He calls it 'non-event' causation, while referring to the 'behaviour of molecules'. Ask a physicist what a 'feature' is. Better to think of it as one process 'emerging' as another process at the macro-level.
17. Mind and Body / D. Property Dualism / 5. Supervenience of mind
If mind-brain supervenience isn't causal, this implies epiphenomenalism [Searle]
     Full Idea: There are constitutive and causal notions of supervenience. Kim claims that mental events have no causal role, and merely supervene on brain events which do (which implies epiphenomenalism). But it seems obvious that mind is caused by brain.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 5.V)
     A reaction: Personally I think the whole discussion is doomed to confusion because it is riddled with a priori dualism. There is no all-or-nothing boundary between 'mind' and 'brain'. Kim's views have changed.
Mental events can cause even though supervenient, like the solidity of a piston [Searle]
     Full Idea: That mental features supervene on neuronal features in no way diminishes their causal efficacy. The solidity of the piston is supervenient on its molecular structure, but this does not make solidity epiphenomenal.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 5.V)
     A reaction: Searle's examples never seem to quite fit what he is saying. Molecules and solidity are supervenient because they are identical (solidity is the presence of certain molecules). Solidity doesn't have causal powers that molecules lack.
Mind and brain are supervenient in respect of cause and effect [Searle]
     Full Idea: Mind is supervenient on brain in the following respect: type-identical neurophysiological causes have type-identical mentalistic effects.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 5.V)
     A reaction: An interesting statement of what might be meant by 'supervenience'. Searle's version implies necessity in the link (but not identity). I take him to imply that a zombie is impossible.
Upwards mental causation makes 'supervenience' irrelevant [Searle]
     Full Idea: Once you recognise the existence of bottom-up, micro to macro forms of causation, the notion of supervenience no longer does any work in philosophy.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 5.V)
     A reaction: I'm not sure if the notion of supervenience ever did any work. Davidson only fished up the word because none of the normal relationships between things seemed to apply (and he was wrong about that).
17. Mind and Body / D. Property Dualism / 6. Mysterianism
Consciousness seems indefinable by conditions or categories [Searle]
     Full Idea: We can't define "consciousness" by necessary and sufficient conditions, or by the Aristotelian method of genus and differentia.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 4.I)
     A reaction: We may not be able to 'define' it, but we can 'characterise' it. The third approach to definition is a catalogue of essential properties, which might tail off rather vaguely.
17. Mind and Body / E. Mind as Physical / 1. Physical Mind
The pattern of molecules in the sea is much more complex than the pattern of brain neurons [Searle]
     Full Idea: The pattern of molecules in the ocean is vastly more complex than any pattern of neurons in my brain.
     From: John Searle (The Mystery of Consciousness [1997], Concl 2.6)
     A reaction: A nice warning for anyone foolish enough to pin their explanatory hopes simply on 'complexity', but we would not be so foolish. A subtler account of complexity (e.g. by Edelman and Tononi) might make brains much more complex than oceans.
17. Mind and Body / E. Mind as Physical / 2. Reduction of Mind
Can the homunculus fallacy be beaten by recursive decomposition? [Searle]
     Full Idea: The idea (of Dennett and others) is that recursive decomposition will eliminate the homunculi.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.VI)
     A reaction: Lycan is the best exponent of this view, which I like. My brain clearly has a substantial homunculus which I call my PA; it regularly reminds me of what I have to do in an hour's time. I am sure it is composed of smaller brain components working as a team.
Searle argues that biology explains consciousness, but physics won't explain biology [Searle, by Kriegel/Williford]
     Full Idea: Searle appears to argue that phenomenal consciousness is explained in biological terms, but that biological properties are irreducible to purely (micro)physical ones.
     From: report of John Searle (The Rediscovery of the Mind [1992]) by U Kriegel / K Williford - Intro to 'Self-Representational Consciousness' n1
     A reaction: Searle is very hard to pin down, and this account suggests the reason very clearly - because he is proposing something which is bizarrely implausible. The reduction of biology-to-physics looks much more likely than consciousness-to-biology.
If mind is caused by brain, does this mean mind IS brain? [Searle]
     Full Idea: I hold a view of mind/brain relations that is a form of causal reduction (mental features are caused by biological processes), but does this imply ontological reduction? (…No!)
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 5.II.5)
     A reaction: What exactly is his claim? Presumably 'causal reduction' implies identity of (philosophical) substance. This seems to imply 'emergence' in a rather old-fashioned and dramatic way, though elsewhere Searle denies this.
17. Mind and Body / E. Mind as Physical / 7. Anti-Physicalism / a. Physicalism critique
If tree rings contain information about age, then age contains information about rings [Searle]
     Full Idea: You could say that tree-rings contain information about the age of a tree, but you could as well say that the age of a tree in years contains information about the number of rings in a tree stump. …'Information' is not a real causal feature of the world.
     From: John Searle (The Mystery of Consciousness [1997], Concl 2.5)
     A reaction: A nice point for fans of 'information' to ponder. However, you cannot deny the causal connection between the age and the rings. Information has a subjective aspect, but you cannot, for example, eliminate the role of DNA in making organisms.
17. Mind and Body / E. Mind as Physical / 7. Anti-Physicalism / b. Multiple realisability
If mind is multiply realisable, it is possible that anything could realise it [Searle]
     Full Idea: The same principle that implies multiple realisability would seem to imply universal realisability. …Any object whatever could have syntactical ascriptions made to it.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.V)
     A reaction: This leads to rather weak reductio objections to functionalism. Logically there may be no restriction on how to implement a mind, but naturally there are very tight restrictions. Sticking to neurons seems the best strategy.