Full Idea
In functionalism, it is very hard to see how there could be a single physical characterization of the inputs and outputs of all and only creatures with mentality.
Gist of Idea
In functionalism, what are the special inputs and outputs of conscious creatures?
Source
Ned Block (Troubles with Functionalism [1978], p. 87)
Book Ref
'The Philosophy of Mind', ed/tr. Beakley,B. /Ludlow,P. [MIT 1992], p.87
A Reaction
It would be theoretically possible if the only way to achieve mentality were to have a particular pattern of inputs and outputs. I don't think, though, that 'mentality' is an all-or-nothing concept.
2574 | Behaviour requires knowledge as well as dispositions [Block]
2576 | In functionalism, desires are internal states with causal relations [Block]
2575 | Functionalism is behaviourism, but with mental states as intermediaries [Block]
2578 | Could a creature without a brain be in the right functional state for pain? [Block]
2577 | Simple machine-functionalism says mind just is a Turing machine [Block]
2580 | A Turing machine, given a state and input, specifies an output and the next state [Block]
2579 | Physicalism is prejudiced in favour of our neurology, when other systems might have minds [Block]
2581 | Intuition may say that a complex sentence is ungrammatical, but linguistics can show that it is not [Block]
2582 | A brain looks no more likely than anything else to cause qualia [Block]
2583 | You might invert colours, but you can't invert beliefs [Block]
2584 | Lobotomised patients can cease to care about a pain [Block]
2585 | Not just any old functional network will have mental states [Block]
2586 | In functionalism, what are the special inputs and outputs of conscious creatures? [Block]