
17. Mind and Body / C. Functionalism / 2. Machine Functionalism

[mind is in principle a Turing machine]

9 ideas
The soul's faculties depend on the brain, and are simply the brain's organisation [La Mettrie]
     Full Idea: All the soul's faculties depend so much on the specific organisation of the brain and of the whole body that they are clearly nothing but that organisation.
     From: Julien Offray de La Mettrie (Machine Man [1747], p.26)
     A reaction: An interesting idea because it suggests that La Mettrie is a functionalist, rather than simply a reductive physicalist.
Basic logic can be done by syntax, with no semantics [Gödel, by Rey]
     Full Idea: Gödel in his completeness theorem for first-order logic showed that a certain set of syntactically specifiable rules was adequate to capture all first-order valid arguments. No semantics (e.g. reference, truth, validity) was necessary.
     From: report of Kurt Gödel (On Formally Undecidable Propositions [1931]) by Georges Rey - Contemporary Philosophy of Mind 8.2
     A reaction: This implies that a logic machine is possible, but we shouldn't raise our hopes for proper rationality. Validity can be shown for purely algebraic arguments, but rationality requires truth as well as validity, and that needs propositions and semantics.
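     Rey's point lends itself to a concrete illustration: a machine can derive new formulas by treating them as uninterpreted strings. In the minimal Python sketch below (the rule and the formula names are merely illustrative, not drawn from Gödel or Rey), modus ponens is implemented as pure pattern matching; no notion of truth or reference appears anywhere in the derivation.

     # Purely syntactic inference: formulas are uninterpreted strings,
     # and the inference rule is nothing but string manipulation.
     def modus_ponens(premise: str, conditional: str) -> str | None:
         """From 'P' and 'P -> Q', derive 'Q' by pattern matching alone."""
         antecedent, arrow, consequent = conditional.partition(" -> ")
         return consequent if arrow and antecedent == premise else None

     formulas = {"it_rains", "it_rains -> ground_wet", "ground_wet -> slippery"}
     changed = True
     while changed:  # forward-chain until no new formulas appear
         new = {c for p in formulas for q in formulas
                if (c := modus_ponens(p, q)) and c not in formulas}
         changed = bool(new)
         formulas |= new

     print(sorted(formulas))  # 'ground_wet' and 'slippery' are derived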
Instances of pain are physical tokens, but the nature of pain is more abstract [Putnam, by Lycan]
     Full Idea: In machine functionalism, pain tokens (individual instances of pain) are identical with particular neurophysiological states, but pain itself, the kind, universal, or 'type', can be identified only with something more abstract.
     From: report of Hilary Putnam (The Mental Life of Some Machines [1967]) by William Lycan - Introduction - Ontology p.6
     A reaction: This is where the "what is it like?" question seems important. Pain doesn't seem like a physical object, or an abstract idea. Personally I think the former is more likely to be correct than the latter. Causation by pain is not like causation by gravity.
Functionalism says robots and people are the same at one level of abstraction [Putnam]
     Full Idea: My "functionalism" insisted that a robot, a human being, a silicon creature and a disembodied spirit could all work much the same way when described at the relevant level of abstraction, and it is wrong to think the essence of mind is hardware.
     From: Hilary Putnam (Representation and Reality [1988], Int p.xii)
     A reaction: This is the key point about the theory - that it is an abstract theory of mind, saying nothing about substances. It drew, however, some misguided criticisms suggesting silly implementations.
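     The 'relevant level of abstraction' is exactly what a programming interface captures. In this toy Python sketch (our illustration, not Putnam's), the functional role is fixed by a protocol, and a 'neural' and a 'silicon' realisation both satisfy it despite quite different inner workings.

     from typing import Protocol

     class PainRole(Protocol):  # the functional description
         def stimulus(self, intensity: int) -> str: ...

     class Human:
         """Realises the role by (pretend) computation over inputs."""
         def stimulus(self, intensity: int) -> str:
             return "ouch" if intensity > 5 else "fine"

     class Robot:
         """Realises the same role by a fixed internal table."""
         _table = {i: ("ouch" if i > 5 else "fine") for i in range(11)}
         def stimulus(self, intensity: int) -> str:
             return self._table[intensity]

     def functionally_identical(a: PainRole, b: PainRole) -> bool:
         return all(a.stimulus(i) == b.stimulus(i) for i in range(11))

     print(functionally_identical(Human(), Robot()))  # True: same role, different 'hardware'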
A representational theory of the mind is an externalist theory of the mind [Dretske]
     Full Idea: A representational theory of the mind is an externalist theory of the mind.
     From: Fred Dretske (Naturalizing the Mind [1997], §2)
     A reaction: Presumably brain events bring the world into the mind, so the world must be mentioned in explaining the mind. Maybe 'externalism' sounds grand but merely states the boringly obvious; actual explanations of the mind need no mention of external particulars.
In the Representational view, concepts play the key linking role [Fodor]
     Full Idea: If the Representational Theory of Mind is true, then concepts are constituents of beliefs, the units of semantic evaluation, a locus of causal interactions among mental representations, and formulas in Mentalese.
     From: Jerry A. Fodor (LOT 2 [2008], Ch.2.1)
     A reaction: I like this aspect of the theory, but then I can't really think of a theory about how the mind works that doesn't make concepts central to it.
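     The structural claim can be made vivid in code. In this toy sketch (ours, not Fodor's), concepts are atomic symbols, a belief is a structured combination of them (a 'Mentalese' formula), and semantic evaluation is defined over that structure.

     from dataclasses import dataclass

     @dataclass(frozen=True)
     class Concept:
         name: str                  # an atomic mental symbol

     @dataclass(frozen=True)
     class Belief:                  # concepts are its constituents
         subject: Concept
         predicate: Concept

     def evaluate(belief: Belief, world: set[tuple[str, str]]) -> bool:
         """The belief is the unit of semantic evaluation."""
         return (belief.subject.name, belief.predicate.name) in world

     fido_barks = Belief(Concept("fido"), Concept("barks"))
     print(evaluate(fido_barks, {("fido", "barks")}))  # True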
Any piece of software can always be hard-wired [Fodor]
     Full Idea: For any machine that computes a function by executing an explicit algorithm, there exists a hard-wired machine that computes the same function by not executing an explicit algorithm.
     From: Jerry A. Fodor (Psychosemantics [1987], p. 23)
     A reaction: It is certainly vital for functionalists to understand that software can be hard-wired. Presumably we should understand a hard-wired algorithm as 'implicit'?
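     Fodor's claim has a direct computational analogue: over a finite domain, any function computed by stepping through an algorithm can instead be read off a precompiled table, in which the algorithm survives only implicitly. A minimal sketch, with our own example function:

     def parity_algorithm(n: int) -> str:
         """Explicit algorithm: count the bits of n one at a time."""
         count = 0
         while n:
             count += n & 1
             n >>= 1
         return "even" if count % 2 == 0 else "odd"

     # 'Hard-wire' the function by tabulating it once, as if burning a ROM.
     PARITY_TABLE = {n: parity_algorithm(n) for n in range(256)}

     def parity_hardwired(n: int) -> str:
         """No algorithm runs at lookup time; the table just is the function."""
         return PARITY_TABLE[n]

     assert all(parity_algorithm(n) == parity_hardwired(n) for n in range(256))

     The algorithm is executed once while the table is built (the analogue of compiling into hardware); thereafter the machine computes the same function without executing it.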
The distinction between software and hardware is not clear in computing [Lycan]
     Full Idea: Even the software/hardware distinction as it is literally applied within computer science is philosophically unclear.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: This is true, and very important for functionalist theories of the mind. Even very volatile software is realised in 'hard' physics, and rewritable discs etc. blur the distinction between 'programmable' and 'hard-wired'.
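     The blur is easy to exhibit. In this toy Python sketch (ours, not Lycan's), the fixed interpreter loop looks like 'hardware' and the transition table like 'software', yet the table is just more machine state, and rewriting it is hard to distinguish in principle from rewiring.

     def run(wiring: dict[str, str], state: str, steps: int) -> str:
         for _ in range(steps):          # the fixed 'hardware' loop
             state = wiring[state]       # behaviour is set by the table
         return state

     blinker = {"on": "off", "off": "on"}    # a two-state oscillator
     print(run(blinker, "on", 2))            # -> 'on' (oscillates back)

     blinker["off"] = "off"                  # reprogrammed, or rewired?
     print(run(blinker, "on", 2))            # -> 'off' (now latches)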
Functionalism has three linked levels: physical, functional, and mental [Lycan]
     Full Idea: Functionalism has three distinct levels of description: a neurophysiological description, a functional description (relative to a program which the brain is realising), and it may have a further mental description.
     From: William Lycan (Introduction - Ontology [1999], p.6)
     A reaction: I have always thought that the 'levels of description' idea was very helpful in describing the mind/brain. I feel certain that we are dealing with a single thing, so this is the only way we can account for the diverse ways in which we discuss it.