
17. Mind and Body / C. Functionalism / 8. Functionalism critique

[criticisms of the functionalist theory of mind]

18 ideas
Is there just one computational state for each specific belief? [Putnam]
     Full Idea: The idea that there is one computational state that every being who believes that there are lots of cats in the neighbourhood is in must be false.
     From: Hilary Putnam (Representation and Reality [1988], §5 p.84)
     A reaction: It is tempting to say that the mental states of such people must have SOMETHING in common, until you realise that all you can specify is that all their states are about cats.
Functionalism can't explain reference and truth, which are needed for logic [Putnam]
     Full Idea: Functionalism has as much trouble with physical accounts of reference as of meaning. Reference is the main tool used in formal theories of truth. But 'truth' isn't folk psychology, it is central to logic, which everyone wants.
     From: Hilary Putnam (Representation and Reality [1988], Int p.xiv)
     A reaction: All logic is defined in terms of truth and falsehood resulting from reasoning, but it could be that 'true' and 'false' have no more content than 1 and 0 in binary electronics. They are distinct, but empty.
If concepts have external meaning, computational states won't explain psychology [Putnam]
     Full Idea: Computational models of the brain/mind will not suffice for cognitive psychology. We cannot individuate concepts and beliefs without reference to the environment. Meanings aren't "in the head".
     From: Hilary Putnam (Representation and Reality [1988], p.73)
     A reaction: Mr Functionalism quits!
Computation presupposes consciousness [Searle]
     Full Idea: Most of the works I have seen in the computational theory of the mind commit some variation on the homunculus fallacy.
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.VI)
     A reaction: This will be because there is an unspoken user for the inner computer. But see Fodor's view (Idea 2506). The key idea here is Dennett's: that not all regresses are vicious. My mind controller isn't like all of me.
If we are computers, who is the user? [Searle]
     Full Idea: If the brain is a digital computer, we are still faced with the question 'Who is the user?'
     From: John Searle (The Rediscovery of the Mind [1992], Ch. 9.VI)
     A reaction: A very nice question. Our whole current concept of a computer involves the unmentioned user. We don't have to go all mystical about persons, though. Robots aren't logically impossible.
How do functional states give rise to mental causation? [Kim]
     Full Idea: On the functionalist account of mental properties, just where does a mental property get its causal powers?
     From: Jaegwon Kim (Philosophy of Mind [1996], p.118)
     A reaction: That is the key problem. Something can only have a function if it has intrinsic powers (corkscrews are rigid and helix-shaped). It can't be irrelevant that pain hurts.
Could a creature without a brain be in the right functional state for pain? [Block]
     Full Idea: If pain is a functional state, it cannot be a brain state, because creatures without brains could realise the same Turing machine as creatures with brains.
     From: Ned Block (Troubles with Functionalism [1978], p.70)
     A reaction: This strikes me as being a poorly grounded claim. There may be some hypothetical world where brainless creatures implement all our functions, but from here brains look the only plausible option.
Not just any old functional network will have mental states [Block]
     Full Idea: If there are any fixed points in the mind-body problem, one of them is that the economy of Bolivia could not have mental states, no matter how it is distorted.
     From: Ned Block (Troubles with Functionalism [1978], p. 86)
     A reaction: It is hard to disagree with this, but then it can hardly be a serious suggestion that anyone could see how to reconfigure an economy so that it mapped the functional state of the human brain. This is not a crucial problem.
In functionalism, what are the special inputs and outputs of conscious creatures? [Block]
     Full Idea: In functionalism, it is very hard to see how there could be a single physical characterization of the inputs and outputs of all and only creatures with mentality.
     From: Ned Block (Troubles with Functionalism [1978], p. 87)
     A reaction: It would be theoretically possible if the only way to achieve mentality was to have a particular pattern of inputs and outputs. I don't think, though, that 'mentality' is an all-or-nothing concept.
Functionalism needs causation and intentionality to explain actions [Papineau]
     Full Idea: The functionalist approach to the mind needs to invoke assumptions about what desires are for and beliefs are about, in order to infer what agents will do.
     From: David Papineau (Philosophical Naturalism [1993], 3.2)
     A reaction: Isn't the idea that you discover what desires are for and what beliefs are about by examining their function, and what the agent does? At which end should we start?
Role concepts either name the realising property, or the higher property constituting the role [Papineau]
     Full Idea: Role concepts can be of two kinds: they can name whichever property realises the role, or they can name the higher property which constitutes the role.
     From: David Papineau (Thinking about Consciousness [2002], 4.2 n1)
     A reaction: This point strikes me as being crucial to discussions of mental functions. Perhaps labels of Realising Properties and Constituting Properties would help. Analytical philosophy rules.
One computer program could either play chess or fight a war [Rey]
     Full Idea: It is always possible to provide incompatible interpretations of formal theories, so that a computer could use the same program one day to play chess, the next to fight a war.
     From: Georges Rey (Contemporary Philosophy of Mind [1997], 9.1.3)
     A reaction: This seems to present a huge gulf between human chess players (who 'understand' what they are doing) and machines, but I don't accept it. Giving the machine cameras and multi-level software would fix it.
The Chinese Mind doesn't seem conscious, but then nor do brains from outside [Chalmers]
     Full Idea: While it may be intuitively implausible that Block's 'mind' made of the population of China would give rise to conscious experience, it is equally intuitively implausible that a brain should give rise to experience.
     From: David J. Chalmers (The Conscious Mind [1996], 3.7.2)
     A reaction: This sounds like good support for functionalism, but I am more inclined to see it as a critique of 'intuition' as a route to truth where minds are concerned. Intuition isn't designed for that sort of work.
Functionalism cannot explain consciousness just by functional organisation [Heil]
     Full Idea: Functionalism has been widely criticized on the grounds that it is implausible to think that functional organization alone could suffice for conscious experience.
     From: John Heil (From an Ontological Point of View [2003], 20.2)
     A reaction: He cites Block's 'Chinese Mind' as an example. The obvious reply is that you can't explain consciousness with a lump of meat, or with behaviour, or with an anomalous property, or even with a non-physical substance.
If you are a functionalist, there appears to be no room for qualia [Heil]
     Full Idea: If you are a functionalist, there appears to be no room for qualia.
     From: John Heil (Philosophy of Mind [1998], Ch.6)
     A reaction: The problem is not that qualia must be denied, but that there is strong pressure to class them as epiphenomena. However, a raw colour can have a causal role (e.g. in an art gallery). Best to say (with Chalmers?) that functions cause qualia?
Functionalism can't distinguish our experiences in spectrum inversion [Lowe]
     Full Idea: It seems that functionalism can recognise no difference between my colour experiences and yours, in the case of spectrum inversion, suggesting that it fails to characterise colour experience adequately, by omitting its qualitative character.
     From: E.J. Lowe (Introduction to the Philosophy of Mind [2000], Ch. 3)
     A reaction: This is a standard objection to functionalism, but then it is an objection to most other theories as well. Even dualism just offers a mystery as to why experiences have qualities. Observing a patch of red involves about three billion brain connections.
Functionalism only discusses relational properties of mental states, not intrinsic properties [Lowe]
     Full Idea: Functionalism has nothing positive to say about the intrinsic properties of mental states, only something about their relational properties.
     From: E.J. Lowe (Introduction to the Philosophy of Mind [2000], Ch. 3)
     A reaction: This seems to me highly significant. All references to function (e.g. in Aristotle) invite the question of what enables something to have that function. Maybe the core question of philosophy of mind is whether mental states are intrinsic or relational.
Functionalism commits us to bizarre possibilities, such as 'zombies' [Lowe]
     Full Idea: Functionalism seems to commit us to bizarre possibilities, such as 'zombies'.
     From: E.J. Lowe (Introduction to the Philosophy of Mind [2000], Ch. 3)
     A reaction: This goes with the tendency of functionalism to imply epiphenomenalism - that is, to make the intrinsic character of mental states irrelevant to thinking. I'd love to eavesdrop on two zombies in an art gallery.