Combining Philosophers

Ideas for Lynch,MP/Glasgow,JM, Jerry A. Fodor and Simone de Beauvoir


21 ideas

18. Thought / B. Mechanics of Thought / 3. Modularity of Mind
Something must take an overview of the modules [Fodor]
     Full Idea: It is not plausible that the mind could be made only of modules; one does sometimes manage to balance one's checkbook, and there can't be an innate, specialized intelligence for doing that.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: I agree strongly with this. My own mind strikes me as being highly modular, but as long as I am aware of the output of the modules, I can pass judgement. The judger is more than a 'module'.
Modules have in-built specialist information [Fodor]
     Full Idea: Modules contain lots of specialized information about the problem domains that they compute in.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
     A reaction: At this point we must be cautious about modularity. I doubt whether 'information' is the right word. I think 'specialized procedures' might make more sense.
Mental modules are specialised, automatic, and isolated [Fodor, by Okasha]
     Full Idea: Fodor argues that mental modules have three important features: 1) they are domain-specific, 2) their operation is mandatory, 3) they are informationally encapsulated.
     From: report of Jerry A. Fodor (The Modularity of Mind [1983]) by Samir Okasha - Philosophy of Science: Very Short Intro (2nd ed) 6
     A reaction: Mandatory is interesting. When I hear an English sentence I can't decide not to process it. Modules cannot be too isolated or they couldn't participate in the team. Each one needs a comms manager.
Modules have encapsulation, inaccessibility, private concepts, innateness [Fodor]
     Full Idea: The four essential properties of modules are: encapsulation (information doesn't flow, as in the persistence of illusions); inaccessibility (unreportable); domain specificity (they have private concepts); innateness (genetically preprogrammed).
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.11)
     A reaction: If they have no information flow, and are unreportable and private, this makes empirical testing of Fodor's hypothesis a little tricky. He must be on to something, though.
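As a rough illustration (mine, not Fodor's), encapsulation and mandatoriness can be sketched in a few lines of Python: the vision module computes its output from the stimulus alone, so the central system's knowledge that the lines are equal never reaches it, and the Müller-Lyer illusion persists. All class and method names are invented for the example.

     class VisionModule:
         """Domain-specific and encapsulated: no input channel from beliefs."""
         def perceive(self, stimulus: str) -> str:
             # Mandatory: given the stimulus, this computation always runs.
             if stimulus == "muller-lyer figure":
                 return "the lines look unequal"
             return "the lines look equal"

     class CentralSystem:
         def __init__(self, module: VisionModule):
             self.module = module
             self.beliefs = {"the lines are equal"}  # knowledge the module cannot see

         def report(self, stimulus: str) -> str:
             appearance = self.module.perceive(stimulus)
             # The illusion persists in the module's output, belief notwithstanding.
             return f"I know the lines are equal, yet {appearance}."

     print(CentralSystem(VisionModule()).report("muller-lyer figure"))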
Obvious modules are language and commonsense explanation [Fodor]
     Full Idea: The best candidates for the status of mental modules are language (the first one, put there by Chomsky), commonsense biology, commonsense physics, commonsense psychology, and aspects of visual form perception.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: My favourite higher level module is my Personal Assistant, who keeps nagging me to do sundry things, only some of which I agree to. It is an innate superego, but still a servant of the Self.
Modules make the world manageable [Fodor]
     Full Idea: Modules function to present the world to thought under descriptions that are germane to the success of behaviour.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: "Descriptions" might be a bold word to use about something so obscure, but this pinpoints the evolutionary nature of modularity theory, to which I subscribe.
Babies talk in consistent patterns [Fodor]
     Full Idea: "Who Mummy love?" is recognizably baby talk; but "love Mummy who?" is not.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.14)
     A reaction: Not convincing. If she is embracing Daddy and asking the baby, she might get the answer "Daddy" after a bit of coaxing. Who knows what babies up the Amazon respond to?
Rationality rises above modules [Fodor]
     Full Idea: Probably, modular computation doesn't explain how minds are rational; it's just a sort of precursor. You work through it to get a view of how horribly hard our rationality is to understand.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.17)
     A reaction: The choice is between a Self which weighs and judges the inputs, and mere decisions that result automatically from the balance of inputs. The latter seems unlikely. Vetoes are essential.
Modules analyse stimuli, they don't tell you what to do [Fodor]
     Full Idea: The thinking involved in "figuring out" what to do is a quite different kind of mental process than the stimulus analysis that modules perform.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: My PA theory fits this perfectly. My inner assistant keeps providing information about needs, duties etc., but takes no part in my decisions. Psychology must include the Will.
Blindness doesn't destroy spatial concepts [Fodor]
     Full Idea: Blind children are not, in general, linguistically impaired; not even in their talk about space.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch.13)
     A reaction: This is offered to demonstrate that spatial concepts are innate, even in the blind. But then we would expect anyone who has to move in space to develop spatial concepts from experience.
18. Thought / B. Mechanics of Thought / 4. Language of Thought
Belief and desire are structured states, which need mentalese [Fodor]
     Full Idea: A defence of the language of thought has to be an argument that believing and desiring are typically structured states.
     From: Jerry A. Fodor (Psychosemantics [1987], p.136)
     A reaction: A structure is one thing, and a language is another. Both believings and desirings can be extremely vague, to the point where the owner is unsure what is believed or desired. They can, of course, be extremely precise.
Language is ambiguous, but thought isn't [Fodor]
     Full Idea: Thinking can't just be in sequences of English words since, notoriously, thought needs to be ambiguity-free in ways that mere word sequences are not.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: I think this is a strong argument in favour of (at least) propositions. Thoughts are unambiguous, but their expression need not be. Sentences could be expanded to achieve clarity.
Mentalese may also incorporate some natural language [Fodor]
     Full Idea: I don't think it is true that all thought is in Mentalese. It is quite likely (e.g. in arithmetic algorithms) that Mentalese co-opts bits of natural language.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: Presumably language itself would have to be coded in mentalese. If there is some other way for thought to work, the whole mind could use it, and skip mentalese.
Mentalese doesn't require a theory of meaning [Fodor]
     Full Idea: Mentalese doesn't need Grice's theory of natural-language meaning, or indeed any theory of natural-language meaning whatsoever.
     From: Jerry A. Fodor (In a Critical Condition [2000], Ch. 6)
     A reaction: Presumably what is represented by mentalese is a quite separate question from whether there exists a mentalese that does some sort of representing. Sounds plausible.
Ambiguities in English are the classic reason for claiming that we don't think in English [Fodor]
     Full Idea: That there are ambiguities in English is the classic reason for claiming that we don't think in English.
     From: Jerry A. Fodor (LOT 2 [2008], Ch.3.5)
     A reaction: I have always been impressed by this simple observation, which is my main reason for believing in propositions (as brain events). 'Propositions' may just be useful chunks of mentalese.
Since the language of thought is the same for all, it must be something like logical form [Fodor, by Devlin]
     Full Idea: Fodor and Jackendoff argue that since the internal language of thought, or conceptual structure, has to be more or less the same for all people, of whatever language, it will surely be something like logical form.
     From: report of Jerry A. Fodor (The Language of Thought [1975]) by Keith Devlin - Goodbye Descartes Ch.8
     A reaction: The discovery (by, e.g., Frege and Russell) that there is something called 'logical form', which we can track down and represent in precise and fairly unambiguous symbolism, may be one of the greatest of all human discoveries. Perhaps.
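A standard illustration (added here; it is not Devlin's example) of what logical form separates: the single English sentence "Everyone loves someone" corresponds to two distinct thoughts, which the symbolism keeps apart.

     ∀x ∃y Loves(x,y)   (each person loves some person or other)
     ∃y ∀x Loves(x,y)   (there is one person whom everybody loves)

One word sequence, two logical forms, and hence two unambiguous thoughts.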
We must have expressive power BEFORE we learn language [Fodor]
     Full Idea: I am denying that one can learn a language whose expressive power is greater than that of a language that one already knows.
     From: Jerry A. Fodor (How there could be a private language [1975], p.389)
     A reaction: I presume someone who had a native language of limited vocabulary could learn a new language with a vast vocabulary. I can increase my expressive power with a specialist vocabulary (e.g. legal).
18. Thought / B. Mechanics of Thought / 5. Mental Files
Mental representations name things in the world, but also files in our memory [Fodor]
     Full Idea: Mental representations can serve both as names for things in the world and as names of files in the memory.
     From: Jerry A. Fodor (LOT 2 [2008], Ch.3 App)
     A reaction: I am laughed at for liking this idea (given the present files of ideas before you), but I think it is very powerful. Chicken before egg. I was drawn to databases precisely because they seemed to map how the mind worked.
We think in file names [Fodor]
     Full Idea: We think in file names.
     From: Jerry A. Fodor (LOT 2 [2008], Ch.3 App)
     A reaction: This is Fodor's new view. He cites Treisman and Schmidt (1982) for raising it, and Pylyshyn (2003) for discussing it. I love it. It exactly fits my introspective view of how I think, and I think it would fit animals. It might not fit some other people!
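In the database spirit of the reactions above, here is a minimal sketch (an assumption-laden toy, not Fodor's own model) of the 'mental files' picture in Python: one and the same label names a thing in the world and keys a file of stored predicates, so 'thinking in file names' is cheap retrieval by label. The function names are invented.

     from collections import defaultdict

     mental_files = defaultdict(list)  # file name -> stored predicates

     def add_belief(file_name: str, predicate: str) -> None:
         """File a new predicate under an existing or fresh file name."""
         mental_files[file_name].append(predicate)

     def think_about(file_name: str) -> list[str]:
         """Retrieve everything filed under a name, searching nowhere else."""
         return mental_files[file_name]

     add_belief("Cicero", "was a Roman orator")
     add_belief("Cicero", "denounced Catiline")
     print(think_about("Cicero"))  # ['was a Roman orator', 'denounced Catiline']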
18. Thought / B. Mechanics of Thought / 6. Artificial Thought / a. Artificial Intelligence
Is thought a syntactic computation using representations? [Fodor, by Rey]
     Full Idea: The modest mentalism of the Computational/Representational Theory of Thought (CRTT), associated with Fodor, says mental processes are computational, defined over syntactically specified entities, and these entities represent the world (are also semantic).
     From: report of Jerry A. Fodor (works [1986]) by Georges Rey - Contemporary Philosophy of Mind Int.3
     A reaction: This seems to imply that if you built a machine that did all these things, it would become conscious, which sounds unlikely. Do footprints 'represent' feet, or does representation need prior consciousness?
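A toy version (not Rey's or Fodor's code) of computation 'defined over syntactically specified entities': modus ponens below fires on the shape of the symbols alone, and the fact that the tuples also represent the world is invisible to the procedure, which is just what 'syntactic' means in CRTT.

     def modus_ponens(beliefs: set) -> set:
         """Add Q whenever P and ('if', P, Q) are present, by pattern alone."""
         derived = set(beliefs)
         for b in beliefs:
             if isinstance(b, tuple) and len(b) == 3 and b[0] == "if" and b[1] in beliefs:
                 derived.add(b[2])
         return derived

     beliefs = {("rain",), ("if", ("rain",), ("streets are wet",))}
     print(modus_ponens(beliefs))  # derives ('streets are wet',) without consulting meanings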
Frame Problem: how to eliminate most beliefs as irrelevant, without searching them? [Fodor]
     Full Idea: The frame problem is, precisely: How does one know that none of one's beliefs about Jupiter are germane to the current question, without having to recall and search one's beliefs about Jupiter?
     From: Jerry A. Fodor (LOT 2 [2008], Ch.4.4)
     A reaction: Presumably good chess-playing computers have made some progress with this problem. The only answer, as far as I can see, is that brains have a lot in common with relational databases. The mind is structured around a relevance-pattern.
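To make the reaction's database suggestion concrete, here is a rough sketch (my illustration, with invented names and topic tags) of relevance-indexed beliefs: as with an index in a relational database, a query touches only the germane file, and the Jupiter beliefs are never recalled or searched at all.

     from collections import defaultdict

     index = defaultdict(set)  # topic -> belief ids
     contents = {}             # belief id -> belief

     def store(belief_id: int, belief: str, topics: list[str]) -> None:
         contents[belief_id] = belief
         for topic in topics:
             index[topic].add(belief_id)

     def germane(topic: str) -> list[str]:
         """Cost tracks the size of one topic's file, not the whole belief set."""
         return [contents[i] for i in index[topic]]

     store(1, "Jupiter has dozens of moons", ["Jupiter", "astronomy"])
     store(2, "the 08:15 train is usually late", ["trains", "commuting"])
     print(germane("trains"))  # Jupiter beliefs never touched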