Combining Texts

All the ideas for 'Consciousness', 'The Web of Belief' and 'In a Critical Condition'



66 ideas

1. Philosophy / F. Analytic Philosophy / 1. Nature of Analysis
Philosophers have given precise senses to deduction, probability, computability etc [Quine/Ullian]
     Full Idea: Successful explications (giving a precise sense to a term) have been found for the concepts of deduction, probability and computability, to name just three.
     From: W Quine / J Ullian (The Web of Belief [1970], 65), quoted by Alex Orenstein - W.V. Quine Ch.3
     A reaction: Quine also cites the concept of an 'ordered pair'. Orenstein adds Tarski's definition of truth, Russell's definite descriptions, and the explication of existence in terms of quantifications. Cf. Idea 2958.
1. Philosophy / F. Analytic Philosophy / 4. Conceptual Analysis
It seems likely that analysis of concepts is impossible, but justification can survive without it [Fodor]
     Full Idea: Lots of philosophers fear that if concepts don't have analyses, justification breaks down. My own guess is that concepts don't have analyses and that justification will survive all the same.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 3 n2)
1. Philosophy / F. Analytic Philosophy / 7. Limitations of Analysis
Despite all the efforts of philosophers, nothing can ever be reduced to anything [Fodor]
     Full Idea: The general truth is that nothing ever reduces to anything, however hard philosophers may try.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
2. Reason / A. Nature of Reason / 8. Naturalising Reason
Turing invented the idea of mechanical rationality (just based on syntax) [Fodor]
     Full Idea: The most important thing that has happened in cognitive science was Turing's invention of the notion of mechanical rationality (because some inferences are rational in virtue of the syntax of their sentences).
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
2. Reason / E. Argument / 2. Transcendental Argument
Transcendental arguments move from knowing Q to knowing P because it depends on Q [Fodor]
     Full Idea: Transcendental arguments ran: "If it weren't that P, we couldn't know (now 'say' or 'think' or 'judge') that Q; and we do know (now…) that Q; therefore P". Old and new arguments tend to be equally unconvincing, because of their empiricist preconceptions.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 3)
4. Formal Logic / F. Set Theory ST / 8. Critique of Set Theory
Physicalism requires the naturalisation or rejection of set theory [Lycan]
     Full Idea: Eventually set theory will have to be either naturalised or rejected, if a thoroughgoing physicalism is to be maintained.
     From: William Lycan (Consciousness [1987], 8.4)
     A reaction: Personally I regard Platonism as a form of naturalism (though a rather bold and dramatic one). The central issue seems to be the ability of the human mind/brain to form 'abstract' notions about the physical world in which it lives.
7. Existence / C. Structure of Existence / 2. Reduction
Institutions are not reducible as types, but they are as tokens [Lycan]
     Full Idea: Institutional types are irreducible, though I assume that institutional tokens are reducible in the sense of strict identity, all the way down to the subatomic level.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: This seems a promising distinction, as the boundaries of 'institutions' disappear when you begin to reduce them to lower levels (cf. Idea 4601), and yet plenty of institutions are self-evidently no more than physics. Plants are invisible as physics.
Types cannot be reduced, but levels of reduction are varied groupings of the same tokens [Lycan]
     Full Idea: If types cannot be reduced to more physical levels, this is not an embarrassment, as long as our institutional categories, our physiological categories, and our physical categories are just alternative groupings of the same tokens.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: This is a self-evident truth about a car engine, so I don't see why it wouldn't apply equally to a brain. Lycan's identification of the type as the thing which cannot be reduced seems a promising explanation of much confusion among philosophers.
7. Existence / C. Structure of Existence / 3. Levels of Reality
One location may contain molecules, a metal strip, a key, an opener of doors, and a human tragedy [Lycan]
     Full Idea: One space-time slice may be occupied by a collection of molecules, a metal strip, a key, an allower of entry to hotel rooms, a facilitator of adultery, and a destroyer of souls.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: Desdemona's handkerchief is a nice example. This sort of remark seems to be felt by some philosophers to be heartless wickedness, and yet it is so screamingly self-evident that it is impossible to deny.
7. Existence / E. Categories / 3. Proposed Categories
I see the 'role'/'occupant' distinction as fundamental to metaphysics [Lycan]
     Full Idea: I see the 'role'/'occupant' distinction as fundamental to metaphysics.
     From: William Lycan (Consciousness [1987], 4.0)
     A reaction: A passing remark in a discussion of functionalism about the mind, but I find it appealing. Causation is basic to materialistic metaphysics, and it creates networks of regular causes. It leaves open the essentialist question of WHY it has that role.
8. Modes of Existence / B. Properties / 7. Emergent Properties
The world is full of messy small things producing stable large-scale properties (e.g. mountains) [Fodor]
     Full Idea: Damn near everything we know about the world (e.g. a mountain) suggests that unimaginably complicated to-ings and fro-ings of bits and pieces at the extreme microlevel manage somehow to converge on stable macrolevel properties.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 2)
     A reaction: This is clearly true, and is a vital part of the physicalist picture of the mind. Personally I prefer the word 'processes' to 'properties', since no one seems to really know what a property is. A process is an abstraction from events.
8. Modes of Existence / D. Universals / 6. Platonic Forms / b. Partaking
Don't define something by a good instance of it; a good example is a special case of the ordinary example [Fodor]
     Full Idea: It's a mistake to try to construe the notion of an instance in terms of the notion of a good instance (e.g. Platonic Forms); the latter is patently a special case of the former, so the right order of exposition is the other way round.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 4)
11. Knowledge Aims / A. Knowledge / 4. Belief / e. Belief holism
How do you count beliefs? [Fodor]
     Full Idea: There is no agreed way of counting beliefs.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.16)
11. Knowledge Aims / C. Knowing Reality / 1. Perceptual Realism / b. Direct realism
I think greenness is a complex microphysical property of green objects [Lycan]
     Full Idea: Personally I favour direct realism regarding secondary qualities, and identify greenness with some complex microphysical property exemplified by green physical objects.
     From: William Lycan (Consciousness [1987], 8.4)
     A reaction: He cites D.M.Armstrong (1981) as his source. Personally I find this a bewildering proposal. Does he think there is greenness in grass AS WELL AS the emission of that wavelength of electro-magnetic radiation? Is greenness zooming through the air?
11. Knowledge Aims / C. Knowing Reality / 3. Idealism / c. Empirical idealism
Berkeley seems to have mistakenly thought that chairs are the same as after-images [Fodor]
     Full Idea: Berkeley seems to have believed that tables and chairs are logically homogeneous with afterimages. I assume that he was wrong to believe this.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.16)
12. Knowledge Sources / B. Perception / 6. Inference in Perception
Maybe explaining the mechanics of perception will explain the concepts involved [Fodor]
     Full Idea: Why mightn't fleshing out the standard psychological account of perception itself count as learning what perceptual justification amounts to?
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 1)
12. Knowledge Sources / C. Rationalism / 1. Rationalism
Rationalism can be based on an evolved computational brain with innate structure [Fodor]
     Full Idea: Pinker's rationalism involves four main ideas: mind is a computational system, which is massively modular with a lot of innate structure resulting from evolution.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
12. Knowledge Sources / D. Empiricism / 2. Associationism
According to empiricists abstraction is the fundamental mental process [Fodor]
     Full Idea: According to empiricists, the fundamental mental process is not theory construction but abstraction.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.12)
12. Knowledge Sources / D. Empiricism / 5. Empiricism Critique
Rationalists say there is more to a concept than the experience that prompts it [Fodor]
     Full Idea: That there is more in the content of a concept than there is in the experiences that prompt us to form it is the burden of the traditional rationalist critique of empiricism (as worked out by Leibniz and Kant).
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.12)
15. Nature of Minds / A. Nature of Mind / 1. Mind / b. Purpose of mind
Empirical approaches see mind connections as mirrors/maps of reality [Fodor]
     Full Idea: Empirical approaches to cognition say the human mind is a blank slate at birth; experiences write on the slate, and association extracts and extrapolates trends from the record of experience. The mind is an image of statistical regularities of the world.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
     A reaction: The 'blank slate' is an exaggeration. The mind at least has the tools to make associations. He tries to make it sound implausible, but the word 'extrapolates' contains a wealth of possibilities that could build into a plausible theory.
The function of a mind is obvious [Fodor]
     Full Idea: Like hands, you don't have to know how the mind evolved to make a pretty shrewd guess at what it's for; for example, that it's to think with.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
     A reaction: I like this. This is one of the basic facts of philosophy of mind, and it frequently gets lost in the fog. It is obvious that the components of the mind (say, experience and intentionality) will be better understood if their function is remembered.
15. Nature of Minds / B. Features of Minds / 4. Intentionality / a. Nature of intentionality
Do intentional states explain our behaviour? [Fodor]
     Full Idea: Intentional Realism is the idea that our intentional mental states causally explain our behaviour; so holistic semantics (which says no two people have the same intentional states, or share generalisations) is irrealistic about intentional mental states.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: ...presumably because two people CAN have the same behaviour. The key question would be whether the intentional states have to be conscious.
Intentionality comes in degrees [Lycan]
     Full Idea: Intentionality comes in degrees.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: I agree. A footprint is 'about' a foot, in the sense of containing concentrated information about it. Can we, though, envisage a higher degree than human thought? Is there a maximum degree? Everything is 'about' everything, in some respect.
15. Nature of Minds / B. Features of Minds / 4. Intentionality / b. Intentionality theories
Teleological views allow for false intentional content, unlike causal and nomological theories [Lycan]
     Full Idea: The teleological view begins to explain intentionality, and in particular allows brain states and events to have false intentional content; causal and nomological theories of intentionality tend to falter on this last task.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: Certainly if you say thought is 'caused' by the world, false thought become puzzling. I'm not sure I understand the rest of this, but it is an intriguing remark about a significant issue…
15. Nature of Minds / B. Features of Minds / 5. Qualia / c. Explaining qualia
Pain is composed of urges, desires, impulses etc, at different levels of abstraction [Lycan]
     Full Idea: Our phenomenal experience of pain has components - it is a complex, consisting (perhaps) of urges, desires, impulses, and beliefs, probably occurring at quite different levels of institutional abstraction.
     From: William Lycan (Consciousness [1987], 5.5)
     A reaction: This seems to be true, and offers the reductionist a strategy for making inroads into the supposed irreducible and fundamental nature of qualia. What's it like to be a complex hierarchically structured multi-functional organism?
The right 'level' for qualia is uncertain, though top (behaviourism) and bottom (particles) are false [Lycan]
     Full Idea: It is just arbitrary to choose a level of nature a priori as the locus of qualia, even though we can agree that high levels (such as behaviourism) and low levels (such as the subatomic) can be ruled out as totally improbable.
     From: William Lycan (Consciousness [1987], 5.6)
     A reaction: Very good. People scream 'qualia!' whenever the behaviour level or the atomic level are proposed as the locations of the mind, but the suggestion that they are complex, and are spread across many functional levels in the middle sounds good.
16. Persons / B. Nature of the Self / 6. Self as Higher Awareness
If I have a set of mental modules, someone had better be in charge of them! [Fodor]
     Full Idea: If there is a community of computers living in my head, there had also better be somebody who is in charge; and, by God, it had better be me.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
     A reaction: Dennett quotes this as a quaintly old-fashioned view. I agree quite strongly with Fodor, for reasons that Dennett should like - evolutionary ones. A mind is a useless tool without central co-ordination. What makes my long-term plans? It isn't anarchy!
17. Mind and Body / A. Mind-Body Dualism / 8. Dualism of Mind Critique
If energy in the brain disappears into thin air, this breaches physical conservation laws [Lycan]
     Full Idea: By interacting causally, Cartesian dualism seems to violate the conservation laws of physics (concerning matter and energy). This seems testable, and afferent and efferent pathways disappearing into thin air would suggest energy is not conserved.
     From: William Lycan (Consciousness [1987], 1.1)
     A reaction: It would seem to be no problem as long as outputs were identical in energy to inputs. If the experiment could actually be done, the result might astonish us.
In lower animals, psychology is continuous with chemistry, and humans are continuous with animals [Lycan]
     Full Idea: Evolution has proceeded in all other known species by increasingly complex configurations of molecules and organs, which support primitive psychologies; our human psychologies are more advanced, but undeniably continuous with lower animals.
     From: William Lycan (Consciousness [1987], 1.1)
     A reaction: Personally I find the evolution objection to dualism highly persuasive. I don't see how anyone can take evolution seriously and be a dualist. If there is a dramatic ontological break at some point, a plausible reason would be needed for that.
17. Mind and Body / B. Behaviourism / 4. Behaviourism Critique
Two behaviourists meet. The first says,"You're fine; how am I?" [Lycan]
     Full Idea: Old joke: two Behaviourists meet in the street, and the first says,"You're fine; how am I?"
     From: William Lycan (Consciousness [1987], n1.6)
     A reaction: This invites the response that introspection is uniquely authoritative about 'how we are', but this has been challenged quite a lot recently, which pushes us to consider whether these stupid behaviourists might actually have a good point.
17. Mind and Body / C. Functionalism / 1. Functionalism
If functionalism focuses on folk psychology, it ignores lower levels of function [Lycan]
     Full Idea: 'Analytical functionalists', who hold that meanings of mental terms are determined by the causal roles associated with them by 'folk psychology', deny themselves appeals to lower levels of functional organisation.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: Presumably folk psychology can fit into the kind of empirical methodology favoured by behaviourists, whereas 'lower levels' are going to become rather speculative and unscientific.
Functionalists see pains as properties involving relations and causation [Fodor]
     Full Idea: Functionalists claim that pains and the like are higher-order, relational properties that things have in virtue of the pattern of causal interactions that they (can or do) enter into.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 2)
     A reaction: The whole idea of a property being purely 'relational' strikes me as dubious (or even nonsense). "Is north of" is a relation, but it is totally derived from more basic physical geographical properties.
Functionalism must not be too abstract to allow inverted spectrum, or so structural that it becomes chauvinistic [Lycan]
     Full Idea: The functionalist must find a level of characterisation of mental states that is not so abstract or behaviouristic as to rule out the possibility of inverted spectrum etc., nor so specific and structural as to fall into chauvinism.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: If too specific then animals and aliens won't be able to implement the necessary functions; if the theory becomes very behaviouristic, then it loses interest in the possibility of an inverted spectrum. He is certainly right to hunt for a middle ground.
17. Mind and Body / C. Functionalism / 2. Machine Functionalism
The distinction between software and hardware is not clear in computing [Lycan]
     Full Idea: Even the software/hardware distinction as it is literally applied within computer science is philosophically unclear.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: This is true, and very important for functionalist theories of the mind. Even very volatile software is realised in 'hard' physics, and rewritable discs etc blur the distinction between 'programmable' and 'hardwired'.
17. Mind and Body / C. Functionalism / 5. Teleological Functionalism
Teleological characterisations shade off smoothly into brutely physical ones [Lycan]
     Full Idea: Highly teleological characterisations, unlike naïve and explicated mental characterisations, have the virtue of shading off fairly smoothly into (more) brutely physical ones.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: Thus the purpose of a car engine, and a spark plug, and the spark, and the temperature, and the vibration of molecules show a fading away of the overt purpose, disappearing into the pointless activity of electrons and quantum levels.
Mental types are a subclass of teleological types at a high level of functional abstraction [Lycan]
     Full Idea: I am taking mental types to form a small subclass of teleological types occurring for the most part at a high level of functional abstraction.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: He goes on to say that he understand teleology in evolutionary terms. There is always a gap between how you characterise or individuate something, and what it actually is. To say spanners are 'a small subclass of tools' is not enough.
17. Mind and Body / D. Property Dualism / 3. Property Dualism
Why bother with neurons? You don't explain bird flight by examining feathers [Fodor]
     Full Idea: Compare Churchland's strategy rooted in neurological modelling with "if it's flight you want to understand, what you need to look at is feathers".
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 8)
     A reaction: Sounds good, but may be a false analogy. You learn a lot about snake movement if you examine their scales.
17. Mind and Body / E. Mind as Physical / 1. Physical Mind
Type physicalism is a stronger claim than token physicalism [Fodor]
     Full Idea: "Type" physicalism is supposed, by general consensus, to be stronger than "token" physicalism; stronger, that is, than the mere claim that all mental states are necessarily physically instantiated.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 2)
     A reaction: Such philosopher's terminology always seems cut-and-dried, until you ask exactly what is identical to what. The word 'type' is a very broad concept. Are trees the same type of thing as roses? A thought always requires the same 'type' of brain event?
Identity theory is functionalism, but located at the lowest level of abstraction [Lycan]
     Full Idea: 'Neuron' may be understood as a physiological term or a functional term, so even the Identity Theorist is a Functionalist - one who locates mental entities at a very low level of abstraction.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: This is a striking observation, and somewhat inclines me to switch from identity theory to functionalism. If you ask what is the correct level of abstraction, Lycan's teleological-homuncular version refers you to all the levels.
17. Mind and Body / E. Mind as Physical / 2. Reduction of Mind
We reduce the mind through homuncular groups, described abstractly by purpose [Lycan]
     Full Idea: I am explicating the mental in a reductive way, by reducing mental characterizations to homuncular institutional ones, which are teleological characterizations at various levels of functional abstraction.
     From: William Lycan (Consciousness [1987], 4.3)
     A reaction: I think this is the germ of a very good physicalist account of the mind. More is needed than a mere assertion about what the mind reduces to at the very lowest level; this offers a decent account of the descending stages of reduction.
Teleological functionalism helps us to understand psycho-biological laws [Lycan]
     Full Idea: Teleological functionalism helps us to understand the nature of biological and psychological laws, particularly in the face of Davidsonian scepticism about the latter.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: Personally I doubt the existence of psycho-physical laws, but only because of the vast complexity. They would be like the laws of weather. 'Psycho-physical' laws seem to presuppose some sort of dualism.
17. Mind and Body / E. Mind as Physical / 4. Connectionism
Modern connectionism is just Hume's theory of the 'association' of 'ideas' [Fodor]
     Full Idea: Churchland is pushing a version of connectionism ….in which if you think of the elements as "ideas" and call the connections between them "associations", you've got a psychology that is no great advance on David Hume.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 8)
     A reaction: See Fodor's book 'Humean Variations' on how Hume should be improved. This idea strikes me as important for understanding Hume, who is very reticent about what his real views are on the mind.
17. Mind and Body / E. Mind as Physical / 7. Anti-Physicalism / b. Multiple realisability
A Martian may exhibit human-like behaviour while having very different sensations [Lycan]
     Full Idea: Quite possibly a Martian's humanoid behaviour is prompted by his having sensations somewhat unlike ours, despite his superficial behavioural similarities to us.
     From: William Lycan (Consciousness [1987], 5.4)
     A reaction: I think this firmly refutes the multiple realisability objection to type-type physicalism. Mental events are individuated by their phenomenal features (known only to the user), and by their causal role (publicly available). These are separate.
18. Thought / A. Modes of Thought / 1. Thought
The goal of thought is to understand the world, not instantly sort it into conceptual categories [Fodor]
     Full Idea: The question whether there are recognitional concepts is really the question what thought is for - for directing action, or for discerning truth. And Descartes was right on this: the goal of thought is to understand the world, not to sort it.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 4)
18. Thought / B. Mechanics of Thought / 3. Modularity of Mind
Blindness doesn't destroy spatial concepts [Fodor]
     Full Idea: Blind children are not, in general, linguistically impaired; not even in their talk about space.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: This is offered to demonstrate that spatial concepts are innate, even in the blind. But then we would expect anyone who has to move in space to develop spatial concepts from experience.
Modules analyse stimuli, they don't tell you what to do [Fodor]
     Full Idea: The thinking involved in "figuring out" what to do is a quite different kind of mental process than the stimulus analysis that modules perform.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: My PA theory fits this perfectly. My inner assistant keeps providing information about needs, duties etc., but takes no part in my decisions. Psychology must include the Will.
Rationality rises above modules [Fodor]
     Full Idea: Probably, modular computation doesn't explain how minds are rational; it's just a sort of precursor. You work through it to get a view of how horribly hard our rationality is to understand.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
     A reaction: The choice is between a Self which weighs and judges the inputs, or merely decisions that automatically result from the balance of inputs. The latter seems unlikely. Vetoes are essential.
Modules make the world manageable [Fodor]
     Full Idea: Modules function to present the world to thought under descriptions that are germane to the success of behaviour.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: "Descriptions" might be a bold word to use about something so obscure, but this pinpoints the evolutionary nature of modularity theory, to which I subscribe.
Babies talk in consistent patterns [Fodor]
     Full Idea: "Who Mummy love?" is recognizably baby talk; but "love Mummy who?" is not.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.14)
     A reaction: Not convincing. If she is embracing Daddy, and asking baby, she might get the answer "Daddy", after a bit of coaxing. Who knows what babies up the Amazon respond to?
Modules have in-built specialist information [Fodor]
     Full Idea: Modules contain lots of specialized information about the problem domains that they compute in.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
     A reaction: At this point we must be cautious about modularity. I doubt whether 'information' is the right word. I think 'specialized procedures' might make more sense.
Modules have encapsulation, inaccessibility, private concepts, innateness [Fodor]
     Full Idea: The four essential properties of modules are: encapsulation (information doesn't flow, as in the persistence of illusions); inaccessibility (unreportable); domain specificity (they have private concepts); innateness (genetically preprogrammed).
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.11)
     A reaction: If they have no information flow, and are unreportable and private, this makes empirical testing of Fodor's hypothesis a little tricky. He must be on to something, though.
Something must take an overview of the modules [Fodor]
     Full Idea: It is not plausible that the mind could be made only of modules; one does sometimes manage to balance one's checkbook, and there can't be an innate, specialized intelligence for doing that.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: I agree strongly with this. My own mind strikes me as being highly modular, but as long as I am aware of the output of the modules, I can pass judgement. The judger is more than a 'module'.
Obvious modules are language and commonsense explanation [Fodor]
     Full Idea: The best candidates for the status of mental modules are language (the first one, put there by Chomsky), commonsense biology, commonsense physics, commonsense psychology, and aspects of visual form perception.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: My favourite higher level module is my Personal Assistant, who keeps nagging me to do sundry things, only some of which I agree to. It is an innate superego, but still a servant of the Self.
18. Thought / B. Mechanics of Thought / 4. Language of Thought
Mentalese may also incorporate some natural language [Fodor]
     Full Idea: I don't think it is true that all thought is in Mentalese. It is quite likely (e.g. in arithmetic algorithms) that Mentalese co-opts bits of natural language.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: Presumably language itself would have to be coded in mentalese. If there is some other way for thought to work, the whole mind could use it, and skip mentalese.
Language is ambiguous, but thought isn't [Fodor]
     Full Idea: Thinking can't just be in sequences of English words since, notoriously, thought needs to be ambiguity-free in ways that mere word sequences are not.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: I think this is a strong argument in favour of (at least) propositions. Thoughts are unambiguous, but their expression need not be. Sentences could be expanded to achieve clarity.
Mentalese doesn't require a theory of meaning [Fodor]
     Full Idea: Mentalese doesn't need Grice's theory of natural-language meaning, or indeed any theory of natural-language meaning whatsoever.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: Presumably what is represented by mentalese is a quite separate question from whether there exists a mentalese that does some sort of representing. Sounds plausible.
18. Thought / C. Content / 9. Conceptual Role Semantics
Content can't be causal role, because causal role is decided by content [Fodor]
     Full Idea: Functional role semantics wants to analyze the content of a belief in terms of its inferential (causal) relations; but that seems the wrong way round. The content of a belief determines its causal role.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: This is one of my favourite ideas, which keeps coming to mind when considering functional accounts of mental life. The buck of explanation must, however, stop somewhere.
18. Thought / D. Concepts / 2. Origin of Concepts / c. Nativist concepts
Experience can't explain itself; the concepts needed must originate outside experience [Fodor]
     Full Idea: Experience can't explain itself; eventually, some of the concepts that explaining experience requires have to come from outside it. Eventually, some of them have to be built in.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.12)
18. Thought / D. Concepts / 3. Ontology of Concepts / b. Concepts as abilities
Are concepts best seen as capacities? [Fodor]
     Full Idea: Virtually all modern theorists about philosophy, mind or language tend to agree that concepts are capacities, in particular concepts are epistemic capacities.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 3)
     A reaction: This view seems to describe concepts in functional terms, which generates my perennial question: what is it about concepts that enables them to fulfil that particular role?
For Pragmatists having a concept means being able to do something [Fodor]
     Full Idea: It's a paradigmatically Pragmatist idea that having a concept consists in being able to do something.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 3)
     A reaction: If you defined a bicycle simply by what you could do with it, you wouldn't explain much. I wonder if pragmatism and functionalism come from the same intellectual stable?
19. Language / A. Nature of Meaning / 3. Meaning as Speaker's Intention
It seems unlikely that meaning can be reduced to communicative intentions, or any mental states [Fodor]
     Full Idea: Nobody now thinks that the reduction of the meaning of English sentences to facts about the communicative intentions of English speakers - or to any facts about mental states - is likely to go through.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: Most attempts at 'reduction' of meaning seem to go rather badly. I assume it would be very difficult to characterise 'intentions' without implicit reference to meaning.
19. Language / A. Nature of Meaning / 7. Meaning Holism / b. Language holism
If to understand "fish" you must know facts about them, where does that end? [Fodor]
     Full Idea: If learning that fish typically live in streams is part of learning "fish", typical utterances of "pet fish" (living in bowls) are counterexamples. This argument iterates (e.g. "big pet fish"). So learning where they live can't be part of learning "fish".
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 5)
     A reaction: Using 'typical' twice is rather misleading here. Town folk can learn 'fish' as typically living in bowls. There is no one way to learn a word meaning.
19. Language / E. Analyticity / 3. Analytic and Synthetic
Analysis is impossible without the analytic/synthetic distinction [Fodor]
     Full Idea: If there is no analytic/synthetic distinction then there are no analyses.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 3)
     A reaction: There are no precise analyses. I see no reason why a holistic view of language prohibits the careful elucidation of key concepts in the system. It's just a bit fluid.
19. Language / F. Communication / 4. Private Language
The theory of the content of thought as 'Mentalese' explains why the Private Language Argument doesn't work [Fodor]
     Full Idea: If the Mentalese story about the content of thought is true, then there couldn't be a Private Language Argument. Good. That explains why there isn't one.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: Presumably Mentalese implies that all language is, in the first instance, intrinsically private. Dogs, for example, need Mentalese, since they self-evidently think.
26. Natural Theory / A. Speculations on Nature / 2. Natural Purpose / b. Limited purposes
We need a notion of teleology that comes in degrees [Lycan]
     Full Idea: We need a notion of teleology that comes in degrees.
     From: William Lycan (Consciousness [1987], 4.4)
     A reaction: Anyone who says that key concepts, such as those concerning the mind, should come 'in degrees' wins my instant support. A whole car engine requires a very teleological explanation, the spark in the sparkplug far less so.
27. Natural Reality / B. Modern Physics / 4. Standard Model / a. Concept of matter
'Physical' means either figuring in physics descriptions, or just located in space-time [Lycan]
     Full Idea: An object is specifically physical if it figures in explanations and descriptions of features of ordinary non-living matter, as in current physics; it is more generally physical if it is simply located in space-time.
     From: William Lycan (Consciousness [1987], 8.5)
     A reaction: This gives a useful distinction when trying to formulate a 'physicalist' account of the mind, where type-type physicalism says only the 'postulates of physics' can be used, whereas 'naturalism' about the mind uses the more general concept.