Combining Texts

Ideas for 'Intensional Logic', 'The Mystery of Consciousness' and 'Consciousness: matter becomes imagination'


11 ideas

15. Nature of Minds / B. Features of Minds / 1. Consciousness / a. Consciousness
A system is either conscious or it isn't, though the intensity varies a lot [Searle]
     Full Idea: A system is either conscious or it isn't, but within the field of consciousness there are states of intensity ranging from drowsiness to full awareness.
     From: John Searle (The Mystery of Consciousness [1997], Ch.1)
     A reaction: I think this all-or-nothing view is the last vestige of Cartesian dualism, and is quite wrong. Heaps of neuroscience (about blindsight, subliminal awareness, neurosis etc.) say we will never understand the mind if we think it is only the conscious part.
Consciousness has a first-person ontology, which only exists from a subjective viewpoint [Searle]
     Full Idea: Consciousness has a first-person or subjective ontology, by which I mean that conscious states only exist when experienced by a subject and they exist only from the first-person point of view of that subject.
     From: John Searle (The Mystery of Consciousness [1997], Ch.5 App)
     A reaction: I think this is nonsense, and I don't think Searle believes it. He ruthlessly attacks so-called 'eliminativists', but the definition he gives here would make him an eliminativist about other minds. There is no such thing as 'first-person' ontology.
There isn't one consciousness (information-processing) which can be investigated, and another (phenomenal) which can't [Searle]
     Full Idea: There are not two kinds of consciousness, an information-processing consciousness that is amenable to scientific investigation and a phenomenal, what-it-subjectively-feels-like form of consciousness that will forever remain mysterious.
     From: John Searle (The Mystery of Consciousness [1997], Concl.1)
     A reaction: Fodor appears to be the main target of this remark. The view that we can explain intentionality but not qualia is currently very fashionable. I am sympathetic to Searle here. Consciousness isn't an epiphenomenon, it is essential to all thought.
15. Nature of Minds / B. Features of Minds / 1. Consciousness / b. Essence of consciousness
Consciousness is a process (of neural interactions), not a location, thing, property, connectivity, or activity [Edelman/Tononi]
     Full Idea: Consciousness is neither a thing, nor a simple property. ...The conscious 'dynamic core' of the brain is a process, not a thing or a place, and is defined in terms of neural interactions, not in terms of neural locations, connectivity or activity.
     From: G Edelman / G Tononi (Consciousness: matter becomes imagination [2000], Ch.12)
     A reaction: This must be of great interest to philosophers. Edelman is adamant that it is not any specific neurons. The nice question is: what would it be like to have your brain slowed down? Presumably we would experience steps in the process. Is he a functionalist?
15. Nature of Minds / B. Features of Minds / 1. Consciousness / c. Parts of consciousness
The three essentials of conscious experience are privateness, unity and informativeness [Edelman/Tononi]
     Full Idea: The fundamental aspects of conscious experience that are common to all its phenomenological manifestations are: privateness, unity, and informativeness.
     From: G Edelman / G Tononi (Consciousness: matter becomes imagination [2000], Ch. 3)
     A reaction: Interesting, coming from neuroscientists. The list strikes me as rather passive. It is no use having good radar if you can't make decisions. Privacy and unity are overrated. Who gets 'informed'? Personal identity must be basic.
15. Nature of Minds / B. Features of Minds / 1. Consciousness / d. Purpose of consciousness
Consciousness can create new axioms, but computers can't do that [Edelman/Tononi]
     Full Idea: Conscious human thought can create new axioms, which a computer cannot do.
     From: G Edelman / G Tononi (Consciousness: matter becomes imagination [2000], Ch.17)
     A reaction: A nice challenge for the artificial intelligence community! I don't understand their confidence in making this assertion. Nothing in Gödel's Theorem seems to prevent the reassignment of axioms, and Quine implies that it is an easy and trivial game.
15. Nature of Minds / B. Features of Minds / 1. Consciousness / e. Cause of consciousness
Consciousness arises from high speed interactions between clusters of neurons [Edelman/Tononi]
     Full Idea: Our hypothesis is that the activity of a group of neurons can contribute directly to conscious experience if it is part of a functional cluster, characterized by strong interactions among a set of neuronal groups over a period of hundreds of milliseconds.
     From: G Edelman / G Tononi (Consciousness: matter becomes imagination [2000], Ch.12)
     A reaction: This is their 'dynamic core' hypothesis. It doesn't get at the Hard Questions about consciousness, but this is a Nobel prize winner hot on the trail of the location of the action. It gives support to functionalism, because the neurons vary.
15. Nature of Minds / B. Features of Minds / 4. Intentionality / a. Nature of intentionality
Dreams and imagery show the brain can generate awareness and meaning without input [Edelman/Tononi]
     Full Idea: Dreaming and imagery are striking phenomenological demonstrations that the adult brain can spontaneously and intrinsically produce consciousness and meaning without any direct input from the periphery.
     From: G Edelman / G Tononi (Consciousness: matter becomes imagination [2000], Ch.11)
     A reaction: This offers some support for Searle's claim that brains produce 'intrinsic' (rather than 'derived') intentionality. Of course, one can have a Humean impressions/ideas theory about how the raw material got there. We SEE meaning in our experiences.
15. Nature of Minds / B. Features of Minds / 4. Intentionality / b. Intentionality theories
Physicists see information as a measure of order, but for biologists it is symbolic exchange between animals [Edelman/Tononi]
     Full Idea: Physicists may define information as a measure of order in a far-from-equilibrium state, but it is best seen as a biological concept which emerged in evolution with animals that were capable of mutual symbolic exchange.
     From: G Edelman / G Tononi (Consciousness: matter becomes imagination [2000], Ch.17)
     A reaction: The physicists' definition seems to open the road to the possibility of non-conscious intentionality (Dennett), where the biological view seems to require consciousness of symbolic meanings (Searle). Tree-rings contain potential information?
15. Nature of Minds / B. Features of Minds / 5. Qualia / a. Nature of qualia
The use of 'qualia' seems to imply that consciousness and qualia are separate [Searle]
     Full Idea: I am hesitant to use the word 'quale/qualia', because it gives the impression that there are two separate phenomena, consciousness and qualia.
     From: John Searle (The Mystery of Consciousness [1997], Ch.1)
     A reaction: He is trying to resist going back to 'sense-data', sitting uneasily between reality and our experience of it. Personally I am quite happy with qualia as an aspect of consciousness - just as I am happy with consciousness as an 'aspect' of brain.
15. Nature of Minds / B. Features of Minds / 5. Qualia / c. Explaining qualia
The sensation of red is a point in neural space created by dimensions of neuronal activity [Edelman/Tononi]
     Full Idea: The pure sensation of red is a particular neural state identified by a point within the N-dimensional neural space defined by the integrated activity of all the groups of neurons that constitute the dynamic core.
     From: G Edelman / G Tononi (Consciousness: matter becomes imagination [2000], Ch.13)
     A reaction: This hardly answers the Hard Question (why experience it? why that experience?), but it is interesting to see a neuroscientist fishing for an account of qualia. He says three types of neuron firing generate the dimensions of the 'space'.