Combining Philosophers

Ideas for Anaxarchus, Daniel C. Dennett and Nicholas Rescher



16 ideas

17. Mind and Body / A. Mind-Body Dualism / 6. Epiphenomenalism
If an epiphenomenon has no physical effects, it has to be undetectable [Dennett]
     Full Idea: Psychologists mean a by-product by an 'epiphenomenon', ...but the philosophical meaning is too strong: it yields a concept of no utility whatsoever. Since x has no physical effects (according to the definition), no instrument can detect it.
     From: Daniel C. Dennett (Consciousness Explained [1991], 12.5)
     A reaction: Well said! This has always been my half-formulated intuition about the claim that the mind (or anything) might be totally epiphenomenal. Something like the reflection on a lake can only ever be irrelevant to the functioning of the system it accompanies.
17. Mind and Body / A. Mind-Body Dualism / 8. Dualism of Mind Critique
Dualism wallows in mystery, and to accept it is to give up [Dennett]
     Full Idea: Given the way dualism wallows in mystery, accepting dualism is giving up.
     From: Daniel C. Dennett (Consciousness Explained [1991], 2.4)
     A reaction: Some things, of course, might be inherently mysterious to us, and we might as well give up. The big dualist mystery is how such different substances could interact. But then, how do even two physical substances manage to interact?
17. Mind and Body / B. Behaviourism / 3. Intentional Stance
Beliefs and desires aren't real; they are prediction techniques [Dennett]
     Full Idea: Intentional systems don't really have beliefs and desires, but one can explain and predict their behaviour by ascribing beliefs and desires to them. This strategy is pragmatic, not right or wrong.
     From: Daniel C. Dennett (Brainstorms: Essays on Mind and Psychology [1978], p.7?)
     A reaction: If the ascription of beliefs and desires explains behaviour, then that is good grounds for thinking they might be real features of the brain, and even if that is not so, they are real enough as abstractions from brain events, like the 'economic climate'.
The active self is a fiction created because we are ignorant of our motivations [Dennett]
     Full Idea: Faced with our inability to 'see' where the centre or source of our free actions is, …we exploit the gaps in our self-knowledge by filling them with a mysterious entity, the unmoved mover, the active self.
     From: Daniel C. Dennett (Elbow Room: varieties of free will [1984], §4.1)
     A reaction: I am convinced that there is no such thing as free will; the idea originates in religion, where it is a necessary feature of a supreme God. I don't believe for a moment that we need to believe in free will.
If mind is just an explanation, the explainer must have beliefs [Rey on Dennett]
     Full Idea: If something has beliefs only if something else is disposed to "treat it" (i.e. think of it) as though it does, then we seem at least to have an infinite regress of appeals to believers.
     From: comment on Daniel C. Dennett (works [1985]) by Georges Rey - Contemporary Philosophy of Mind 3.2.1
     A reaction: This sounds like a serious difficulty for behaviourists, but is not insurmountable. We need a community of interlocking behaviours, with a particular pattern of behaviour being labelled (for instrumental convenience) as 'beliefs'.
The 'intentional stance' is a way of interpreting an entity by assuming it is rational and self-aware [Dennett]
     Full Idea: The 'intentional stance' is the tactic of interpreting an entity by adopting the presupposition that it is an approximation of the ideal of an optimally designed (i.e. rational) self-regarding agent.
     From: Daniel C. Dennett (Daniel Dennett on himself [1994], p.239)
     A reaction: This is Dennett's 'instrumentalism', a descendant of behaviourism, which strikes me as a pragmatist's evasion of the ontological problems of mind that should interest philosophers.
17. Mind and Body / C. Functionalism / 1. Functionalism
Could a robot be made conscious just by software? [Dennett]
     Full Idea: How could you make a robot conscious? The answer, I think, is to be found in software.
     From: Daniel C. Dennett (Brainchildren [1998], Ch.6)
     A reaction: This seems to be a commitment to strong AI, though Dennett is keen to point out that brains are the only plausible implementation of such software. Most find his claim baffling.
17. Mind and Body / C. Functionalism / 6. Homuncular Functionalism
All functionalism is 'homuncular', of one grain size or another [Dennett]
     Full Idea: All varieties of functionalism can be viewed as 'homuncular' functionalism of one grain size or another.
     From: Daniel C. Dennett (Consciousness Explained [1991], 9.2)
     A reaction: This seems right, as any huge and complex mechanism (like a moon rocket) will be made up of some main systems, then sub-systems, then sub-sub-sub.... This assumes that there are one or two overarching purposes, which there are in people.
We descend from robots, and our intentionality is composed of billions of crude intentional systems [Dennett]
     Full Idea: We are descended from robots, and composed of robots, and all the intentionality we enjoy is derived from the more fundamental intentionality of billions of crude intentional systems.
     From: Daniel C. Dennett (Kinds of Minds [1996], Ch.2)
     A reaction: A more grand view of intentionality (such as Searle's) seems more attractive than this, but the crucial fact about Dennett is that he takes the implications of evolution much more seriously than other philosophers. He's probably right.
17. Mind and Body / E. Mind as Physical / 1. Physical Mind
There is no more anger in adrenaline than silliness in a bottle of whiskey [Dennett]
     Full Idea: There is no more fear or anger in adrenaline than there is silliness in a bottle of whiskey.
     From: Daniel C. Dennett (Kinds of Minds [1996], Ch.3)
     A reaction: Not exactly an argument, but a nice rhetorical point against absurd claims about identity and reduction and elimination. We may say that there is no fear without adrenaline, and no adrenaline in a live brain without fear.
17. Mind and Body / E. Mind as Physical / 2. Reduction of Mind
Intelligent agents are composed of nested homunculi, of decreasing intelligence, ending in machines [Dennett]
     Full Idea: As long as your homunculi are more stupid and ignorant than the intelligent agent they compose, the nesting of homunculi within homunculi can be finite, bottoming out, eventually, with agents so unimpressive they can be replaced by machines.
     From: Daniel C. Dennett (Sweet Dreams [2005], Ch.6)
     A reaction: [Dennett first proposed this in 'Brainstorms' 1978]. This view was developed well by Lycan. I rate it as one of the most illuminating ideas in the modern philosophy of mind. All complex systems (like aeroplanes) have this structure.
17. Mind and Body / E. Mind as Physical / 3. Eliminativism
I don't deny consciousness; it just isn't what people think it is [Dennett]
     Full Idea: I don't maintain, of course, that human consciousness does not exist; I maintain that it is not what people often think it is.
     From: Daniel C. Dennett (Sweet Dreams [2005], Ch.3)
     A reaction: I consider Dennett to be as near as you can get to an eliminativist, but he is not stupid. As far as I can see, the modern philosopher's bogey-man, the true total eliminativist, simply doesn't exist. Eliminativists usually deny propositional attitudes.
It is arbitrary to say which moment of brain processing is conscious [Dennett]
     Full Idea: If one wants to settle on some moment of processing in the brain as the moment of consciousness, this has to be arbitrary.
     From: Daniel C. Dennett (Consciousness Explained [1991], 5.3)
     A reaction: Seems eliminativist, as it implies that all that is really going on is 'processing'. But 'arbitrary' has two senses here: that calling any of it consciousness is arbitrary (which seems wrong), or that there is no single moment at which the mind abruptly becomes conscious (which seems right).
Maybe there is a minimum brain speed for supporting a mind [Dennett]
     Full Idea: Perhaps there is a minimum speed for a mind, rather like the minimum escape velocity required to overcome gravity and leave the planet.
     From: Daniel C. Dennett (Kinds of Minds [1996], Ch.3)
     A reaction: Dennett rejects this speculation, but he didn't stop to imagine what it would be LIKE if your brain slowed down, and he never considers Edelman's view that mind is a process. Put the two together…
Visual experience is composed of neural activity, which we find pleasing [Dennett]
     Full Idea: All visual experience is composed of activities of neural circuits whose very activity is innately pleasing to us.
     From: Daniel C. Dennett (Consciousness Explained [1991], 12.6)
     A reaction: This is the nearest I can find to Dennett saying something eliminativist. It seems to beg the question of who 'us' refers to, and what is being pleased, and how it is 'pleased' by these neural circuits. The Hard Question?
17. Mind and Body / E. Mind as Physical / 7. Anti-Physicalism / b. Multiple realisability
The materials for a mind only matter because of speed, and a need for transducers and effectors [Dennett]
     Full Idea: I think there are only two good reasons why, when you make a mind, the materials matter: speed, and the ubiquity of transducers and effectors throughout the nervous system.
     From: Daniel C. Dennett (Kinds of Minds [1996], Ch.3)
     A reaction: This sounds roughly right, because it gives you something between multiple realisability (minds made of cans and string), and type-type identity (minds ARE a particular material). Call it 'biological functionalism'?