Combining Texts

All the ideas for 'Consciousness', 'Review: Meinong 'Untersuchungen zur..'' and 'Action'



42 ideas

1. Philosophy / F. Analytic Philosophy / 6. Logical Analysis
We can't sharply distinguish variables, domains and values, if symbols frighten us [Russell]
     Full Idea: Whoever is afraid of symbols can hardly hope to acquire exact ideas where it is necessary to distinguish 1) the variable in itself as opposed to its value, 2) any value of the variable, 3) all values, 4) some value.
     From: Bertrand Russell (Review: Meinong 'Untersuchungen zur..' [1905], p.84)
     A reaction: Not the best example, perhaps, of the need for precision, but a nice illustration of the new attitude Russell brought into philosophy.
4. Formal Logic / F. Set Theory ST / 8. Critique of Set Theory
Physicalism requires the naturalisation or rejection of set theory [Lycan]
     Full Idea: Eventually set theory will have to be either naturalised or rejected, if a thoroughgoing physicalism is to be maintained.
     From: William Lycan (Consciousness [1987], 8.4)
     A reaction: Personally I regard Platonism as a form of naturalism (though a rather bold and dramatic one). The central issue seems to be the ability of the human mind/brain to form 'abstract' notions about the physical world in which it lives.
7. Existence / C. Structure of Existence / 2. Reduction
Institutions are not reducible as types, but they are as tokens [Lycan]
     Full Idea: Institutional types are irreducible, though I assume that institutional tokens are reducible in the sense of strict identity, all the way down to the subatomic level.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: This seems a promising distinction, as the boundaries of 'institutions' disappear when you begin to reduce them to lower levels (cf. Idea 4601), and yet plenty of institutions are self-evidently no more than physics. Plans are invisible as physics.
Types cannot be reduced, but levels of reduction are varied groupings of the same tokens [Lycan]
     Full Idea: If types cannot be reduced to more physical levels, this is not an embarrassment, as long as our institutional categories, our physiological categories, and our physical categories are just alternative groupings of the same tokens.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: This is a self-evident truth about a car engine, so I don't see why it wouldn't apply equally to a brain. Lycan's identification of the type as the thing which cannot be reduced seems a promising explanation of much confusion among philosophers.
7. Existence / C. Structure of Existence / 3. Levels of Reality
One location may contain molecules, a metal strip, a key, an opener of doors, and a human tragedy [Lycan]
     Full Idea: One space-time slice may be occupied by a collection of molecules, a metal strip, a key, an allower of entry to hotel rooms, a facilitator of adultery, and a destroyer of souls.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: Desdemona's handkerchief is a nice example. This sort of remark seems to be felt by some philosophers to be heartless wickedness, and yet it is so screamingly self-evident that it is impossible to deny.
7. Existence / E. Categories / 3. Proposed Categories
I see the 'role'/'occupant' distinction as fundamental to metaphysics [Lycan]
     Full Idea: I see the 'role'/'occupant' distinction as fundamental to metaphysics.
     From: William Lycan (Consciousness [1987], 4.0)
     A reaction: A passing remark in a discussion of functionalism about the mind, but I find it appealing. Causation is basic to materialistic metaphysics, and it creates networks of regular causes. It leaves open the essentialist question of WHY it has that role.
9. Objects / A. Existence of Objects / 4. Impossible objects
Common sense agrees with Meinong (rather than Russell) that 'Pegasus is a flying horse' is true [Lackey on Russell]
     Full Idea: Meinong's theory says that 'Pegasus is a flying horse' is true, while Russell's says that this assertion is false. The average man, if he knows his mythology, would probably agree with Meinong.
     From: comment on Bertrand Russell (Review: Meinong 'Untersuchungen zur..' [1905]) by Douglas Lackey - Intros to Russell's 'Essays in Analysis' p.19
     A reaction: It seems obvious that some disambiguation is needed here. Assenting to that assertion would be blatantly contextual. No one backs Pegasus at a race track.
I prefer to deny round squares, and deal with the difficulties by the theory of denoting [Russell]
     Full Idea: I should prefer to say that there is no such object as 'the round square'. The difficulties of excluding such objects can, I think, be avoided by the theory of denoting.
     From: Bertrand Russell (Review: Meinong 'Untersuchungen zur..' [1905], p.81)
     A reaction: The 'theory of denoting' is his brand new theory of definite descriptions, which makes implicit claims of existence explicit, so that they can be judged. Why can't we just say that a round square can be an intentional object, but not a real object?
11. Knowledge Aims / C. Knowing Reality / 1. Perceptual Realism / b. Direct realism
I think greenness is a complex microphysical property of green objects [Lycan]
     Full Idea: Personally I favour direct realism regarding secondary qualities, and identify greenness with some complex microphysical property exemplified by green physical objects.
     From: William Lycan (Consciousness [1987], 8.4)
     A reaction: He cites D.M.Armstrong (1981) as his source. Personally I find this a bewildering proposal. Does he think there is greenness in grass AS WELL AS the emission of that wavelength of electro-magnetic radiation? Is greenness zooming through the air?
15. Nature of Minds / B. Features of Minds / 4. Intentionality / a. Nature of intentionality
Intentionality comes in degrees [Lycan]
     Full Idea: Intentionality comes in degrees.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: I agree. A footprint is 'about' a foot, in the sense of containing concentrated information about it. Can we, though, envisage a higher degree than human thought? Is there a maximum degree? Everything is 'about' everything, in some respect.
15. Nature of Minds / B. Features of Minds / 4. Intentionality / b. Intentionality theories
Teleological views allow for false intentional content, unlike causal and nomological theories [Lycan]
     Full Idea: The teleological view begins to explain intentionality, and in particular allows brain states and events to have false intentional content; causal and nomological theories of intentionality tend to falter on this last task.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: Certainly if you say thought is 'caused' by the world, false thoughts become puzzling. I'm not sure I understand the rest of this, but it is an intriguing remark about a significant issue…
15. Nature of Minds / B. Features of Minds / 5. Qualia / c. Explaining qualia
Pain is composed of urges, desires, impulses etc, at different levels of abstraction [Lycan]
     Full Idea: Our phenomenal experience of pain has components - it is a complex, consisting (perhaps) of urges, desires, impulses, and beliefs, probably occurring at quite different levels of institutional abstraction.
     From: William Lycan (Consciousness [1987], 5.5)
     A reaction: This seems to be true, and offers the reductionist a strategy for making inroads into the supposed irreducible and fundamental nature of qualia. What's it like to be a complex hierarchically structured multi-functional organism?
The right 'level' for qualia is uncertain, though top (behaviourism) and bottom (particles) are false [Lycan]
     Full Idea: It is just arbitrary to choose a level of nature a priori as the locus of qualia, even though we can agree that high levels (such as behaviourism) and low levels (such as the subatomic) can be ruled out as totally improbable.
     From: William Lycan (Consciousness [1987], 5.6)
     A reaction: Very good. People scream 'qualia!' whenever the behaviour level or the atomic level are proposed as the locations of the mind, but the suggestion that they are complex, and are spread across many functional levels in the middle sounds good.
17. Mind and Body / A. Mind-Body Dualism / 8. Dualism of Mind Critique
If energy in the brain disappears into thin air, this breaches physical conservation laws [Lycan]
     Full Idea: By interacting causally, Cartesian dualism seems to violate the conservation laws of physics (concerning matter and energy). This seems testable, and afferent and efferent pathways disappearing into thin air would suggest energy is not conserved.
     From: William Lycan (Consciousness [1987], 1.1)
     A reaction: It would seem to be no problem as long as outputs were identical in energy to inputs. If the experiment could actually be done, the result might astonish us.
In lower animals, psychology is continuous with chemistry, and humans are continuous with animals [Lycan]
     Full Idea: Evolution has proceeded in all other known species by increasingly complex configurations of molecules and organs, which support primitive psychologies; our human psychologies are more advanced, but undeniably continuous with lower animals.
     From: William Lycan (Consciousness [1987], 1.1)
     A reaction: Personally I find the evolution objection to dualism highly persuasive. I don't see how anyone can take evolution seriously and be a dualist. If there is a dramatic ontological break at some point, a plausible reason would be needed for that.
17. Mind and Body / B. Behaviourism / 4. Behaviourism Critique
Two behaviourists meet. The first says, "You're fine; how am I?" [Lycan]
     Full Idea: Old joke: two Behaviourists meet in the street, and the first says, "You're fine; how am I?"
     From: William Lycan (Consciousness [1987], n1.6)
     A reaction: This invites the response that introspection is uniquely authoritative about 'how we are', but this has been challenged quite a lot recently, which pushes us to consider whether these stupid behaviourists might actually have a good point.
17. Mind and Body / C. Functionalism / 1. Functionalism
If functionalism focuses on folk psychology, it ignores lower levels of function [Lycan]
     Full Idea: 'Analytical functionalists', who hold that meanings of mental terms are determined by the causal roles associated with them by 'folk psychology', deny themselves appeals to lower levels of functional organisation.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: Presumably folk psychology can fit into the kind of empirical methodology favoured by behaviourists, whereas 'lower levels' are going to become rather speculative and unscientific.
Functionalism must not be too abstract to allow inverted spectrum, or so structural that it becomes chauvinistic [Lycan]
     Full Idea: The functionalist must find a level of characterisation of mental states that is not so abstract or behaviouristic as to rule out the possibility of inverted spectrum etc., nor so specific and structural as to fall into chauvinism.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: If too specific then animals and aliens won't be able to implement the necessary functions; if the theory becomes very behaviouristic, then it loses interest in the possibility of an inverted spectrum. He is certainly right to hunt for a middle ground.
17. Mind and Body / C. Functionalism / 2. Machine Functionalism
The distinction between software and hardware is not clear in computing [Lycan]
     Full Idea: Even the software/hardware distinction as it is literally applied within computer science is philosophically unclear.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: This is true, and very important for functionalist theories of the mind. Even very volatile software is realised in 'hard' physics, and rewritable discs etc blur the distinction between 'programmable' and 'hardwired'.
17. Mind and Body / C. Functionalism / 5. Teleological Functionalism
Mental types are a subclass of teleological types at a high level of functional abstraction [Lycan]
     Full Idea: I am taking mental types to form a small subclass of teleological types occurring for the most part at a high level of functional abstraction.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: He goes on to say that he understands teleology in evolutionary terms. There is always a gap between how you characterise or individuate something, and what it actually is. To say spanners are 'a small subclass of tools' is not enough.
Teleological characterisations shade off smoothly into brutely physical ones [Lycan]
     Full Idea: Highly teleological characterisations, unlike naïve and explicated mental characterisations, have the virtue of shading off fairly smoothly into (more) brutely physical ones.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: Thus the purpose of a car engine, and a spark plug, and the spark, and the temperature, and the vibration of molecules show a fading away of the overt purpose, disappearing into the pointless activity of electrons and quantum levels.
17. Mind and Body / E. Mind as Physical / 1. Physical Mind
Identity theory is functionalism, but located at the lowest level of abstraction [Lycan]
     Full Idea: 'Neuron' may be understood as a physiological term or a functional term, so even the Identity Theorist is a Functionalist - one who locates mental entities at a very low level of abstraction.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: This is a striking observation, and somewhat inclines me to switch from identity theory to functionalism. If you ask what is the correct level of abstraction, Lycan's teleological-homuncular version refers you to all the levels.
17. Mind and Body / E. Mind as Physical / 2. Reduction of Mind
We reduce the mind through homuncular groups, described abstractly by purpose [Lycan]
     Full Idea: I am explicating the mental in a reductive way, by reducing mental characterizations to homuncular institutional ones, which are teleological characterizations at various levels of functional abstraction.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: I think this is the germ of a very good physicalist account of the mind. More is needed than a mere assertion about what the mind reduces to at the very lowest level; this offers a decent account of the descending stages of reduction.
Teleological functionalism helps us to understand psycho-biological laws [Lycan]
     Full Idea: Teleological functionalism helps us to understand the nature of biological and psychological laws, particularly in the face of Davidsonian scepticism about the latter.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: Personally I doubt the existence of psycho-physical laws, but only because of the vast complexity. They would be like the laws of weather. 'Psycho-physical' laws seem to presuppose some sort of dualism.
17. Mind and Body / E. Mind as Physical / 7. Anti-Physicalism / b. Multiple realisability
A Martian may exhibit human-like behaviour while having very different sensations [Lycan]
     Full Idea: Quite possibly a Martian's humanoid behaviour is prompted by his having sensations somewhat unlike ours, despite his superficial behavioural similarities to us.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: I think this firmly refutes the multiple realisability objection to type-type physicalism. Mental events are individuated by their phenomenal features (known only to the user), and by their causal role (publicly available). These are separate.
20. Action / A. Definition of Action / 1. Action Theory
Actions include: the involuntary, the purposeful, the intentional, and the self-consciously autonomous [Wilson/Schpall]
     Full Idea: There are different levels of action, including at least: unconscious and/or involuntary behaviour, purposeful or goal-directed activity, intentional action, and the autonomous acts or actions of self-consciously active human agents.
     From: Wilson,G/Schpall,S (Action [2012], 1)
     A reaction: The fourth class is obviously designed to distinguish us from the other animals. It immediately strikes me as very optimistic to distinguish four (at least) clear categories, but you have to start somewhere.
20. Action / A. Definition of Action / 4. Action as Movement
Maybe bodily movements are not actions, but only part of an agent's action of moving [Wilson/Schpall]
     Full Idea: Some say that the movements of an agent's body are never actions. It is only the agent's direct moving of, say, his leg that constitutes a physical action; the leg movement is merely caused by and/or incorporated as part of the act of moving.
     From: Wilson,G/Schpall,S (Action [2012], 1.2)
     A reaction: [they cite Jennifer Hornsby 1980] It seems normal to deny a twitch the accolade of an 'action', so I suppose that is right. Does the continual movement of my tongue count as action? Only if I bring it under control? Does it matter? Only in forensics.
Is the action the arm movement, the whole causal process, or just the trying to do it? [Wilson/Schpall]
     Full Idea: Some philosophers have favored the overt arm movement the agent performs, some favor the extended causal process he initiates, and some prefer the relevant event of trying that precedes and 'generates' the rest.
     From: Wilson,G/Schpall,S (Action [2012], 1.2)
     A reaction: [Davidson argues for the second, Hornsby for the third] There seems no way to settle this, and a compromise looks best. Mere movement won't do, and mere trying won't do, and whole processes get out of control.
20. Action / B. Preliminaries of Action / 1. Intention to Act / a. Nature of intentions
To be intentional, an action must succeed in the manner in which it was planned [Wilson/Schpall]
     Full Idea: If someone fires a bullet to kill someone, misses, and dislodges hornets that sting him to death, this implies that an intentional action must include succeeding in a manner according to the original plan.
     From: Wilson,G/Schpall,S (Action [2012], 2)
     A reaction: [their example, compressed] This resembles Gettier's problem cases for knowledge. If the shooter deliberately and maliciously brought down the hornet's nest, that would be intentional murder. Sounds right.
If someone believes they can control the lottery, and then wins, the relevant skill is missing [Wilson/Schpall]
     Full Idea: If someone enters the lottery with the bizarre belief that they can control who wins, and then wins it, that suggests that intentional action must not depend on sheer luck, but needs the competent exercise of the relevant skill.
     From: Wilson,G/Schpall,S (Action [2012], 2)
     A reaction: A nice companion to Idea 20022, which shows that a mere intention is not sufficient to motivate and explain an action.
We might intend two ways of acting, knowing only one of them can succeed [Wilson/Schpall]
     Full Idea: If an agent tries to do something by two different means, only one of which can succeed, then the behaviour is rational, even though one of them is an attempt to do an action which cannot succeed.
     From: Wilson,G/Schpall,S (Action [2012], 2)
     A reaction: [a concise account of a laborious account of an example from Bratman 1984, 1987] Bratman uses this to challenge the 'Simple View', that intention leads straightforwardly to action.
20. Action / B. Preliminaries of Action / 1. Intention to Act / c. Reducing intentions
On one model, an intention is a combination of belief-desire states, and intentional actions relate to beliefs and desires [Wilson/Schpall]
     Full Idea: On the simple desire-belief model, an intention is a combination of desire-belief states, and an action is intentional in virtue of standing in the appropriate relation to these simpler terms.
     From: Wilson,G/Schpall,S (Action [2012], 4)
     A reaction: This is the traditional view found in Hume, and is probably endemic to folk psychology. They cite Bratman 1987 as the main opponent of the view.
20. Action / B. Preliminaries of Action / 1. Intention to Act / d. Group intentions
Groups may act for reasons held by none of the members, so maybe groups are agents [Wilson/Schpall]
     Full Idea: Rational group action may involve a 'collectivising of reasons', with participants acting in ways that are not rationally recommended from the individual viewpoint. This suggests that groups can be rational, intentional agents.
     From: Wilson,G/Schpall,S (Action [2012], 2)
     A reaction: [Pettit 2003] is the source for this. Gilbert says individuals can have joint commitment; Pettit says the group can be an independent agent. The matter of shared intentions is interesting, but there is no need for the ontology to go berserk.
If there are shared obligations and intentions, we may need a primitive notion of 'joint commitment' [Wilson/Schpall]
     Full Idea: An account of mutual obligation to do something may require that we give up reductive individualist accounts of shared activity and posit a primitive notion of 'joint commitment'.
     From: Wilson,G/Schpall,S (Action [2012], 2)
     A reaction: [attributed to Margaret Gilbert 2000] If 'we' are trying to do something, that seems to give an externalist picture of intentions, rather like all the other externalisms floating around these days. I don't buy any of it, me.
20. Action / C. Motives for Action / 2. Acting on Beliefs / b. Action cognitivism
Strong Cognitivism identifies an intention to act with a belief [Wilson/Schpall]
     Full Idea: A Strong Cognitivist is someone who identifies an intention with a certain pertinent belief about what she is doing or about to do.
     From: Wilson,G/Schpall,S (Action [2012], 1.1)
     A reaction: (Sarah Paul 2009 makes this distinction) The belief, if so, seems to be as much counterfactual as factual. Hope seems to come into it, which isn't exactly a belief.
Weak Cognitivism says intentions are only partly constituted by a belief [Wilson/Schpall]
     Full Idea: A Weak Cognitivist holds that intentions are partly constituted by, but are not identical with, relevant beliefs about the action. Grice (1971) said an intention is willing an action, combined with a belief that this will lead to the action.
     From: Wilson,G/Schpall,S (Action [2012], 1.1)
     A reaction: [compressed] I didn't find Strong Cognitivism appealing, but it seems hard to argue with some form of the weak version.
Strong Cognitivism implies a mode of 'practical' knowledge, not based on observation [Wilson/Schpall]
     Full Idea: Strong Cognitivists say intentions/beliefs are not based on observation or evidence, and are causally reliable in leading to appropriate actions, so this is a mode of 'practical' knowledge that has not been derived from observation.
     From: Wilson,G/Schpall,S (Action [2012], 1.1)
     A reaction: [compressed - Stanford unnecessarily verbose!] I see no mention in this discussion of 'hoping' that your action will turn out OK. We are usually right to hope, but it would be foolish to say that when we reach for the salt we know we won't knock it over.
20. Action / C. Motives for Action / 3. Acting on Reason / b. Intellectualism
Maybe the explanation of an action is in the reasons that make it intelligible to the agent [Wilson/Schpall]
     Full Idea: Some have maintained that we explain why an agent acted as he did when we explicate how the agent's normative reasons rendered the action intelligible in his eyes.
     From: Wilson,G/Schpall,S (Action [2012], Intro)
     A reaction: Modern psychology is moving against this, by showing how hidden biases can predominate over conscious reasons (as in Kahneman's work). I would say this mode of explanation works better for highly educated people (but you can chuckle at that).
20. Action / C. Motives for Action / 3. Acting on Reason / c. Reasons as causes
Causalists allow purposive explanations, but then reduce the purpose to the action's cause [Wilson/Schpall]
     Full Idea: Most causalists allow that reason explanations are teleological, but say that such purposive explanations are analysable causally, where the primary reasons for the act are the guiding causes of the act.
     From: Wilson,G/Schpall,S (Action [2012], 3)
     A reaction: The authors observe that it is hard to adjudicate on this matter, and that the concept of the 'cause' of an action is unclear.
It is generally assumed that reason explanations are causal [Wilson/Schpall]
     Full Idea: The view that reason explanations are somehow causal explanations remains the dominant position.
     From: Wilson,G/Schpall,S (Action [2012], Intro)
     A reaction: I suspect that this is only because no philosopher has a better idea, and the whole issue is being slowly outflanked by psychology.
26. Natural Theory / A. Speculations on Nature / 2. Natural Purpose / b. Limited purposes
We need a notion of teleology that comes in degrees [Lycan]
     Full Idea: We need a notion of teleology that comes in degrees.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: Anyone who says that key concepts, such as those concerning the mind, should come 'in degrees' wins my instant support. A whole car engine requires a very teleological explanation, the spark in the sparkplug far less so.
27. Natural Reality / B. Modern Physics / 4. Standard Model / a. Concept of matter
'Physical' means either figuring in physics descriptions, or just located in space-time [Lycan]
     Full Idea: An object is specifically physical if it figures in explanations and descriptions of features of ordinary non-living matter, as in current physics; it is more generally physical if it is simply located in space-time.
     From: William Lycan (Consciousness [1987], 8.5)
     A reaction: This gives a useful distinction when trying to formulate a 'physicalist' account of the mind, where type-type physicalism says only the 'postulates of physics' can be used, whereas 'naturalism' about the mind uses the more general concept.