21 ideas
19544 | Closure says if you know P, and also know P implies Q, then you must know Q [Dretske] |
19545 | We needn't regret the implications of our regrets; regretting drinking too much implies the past is real [Dretske] |
19546 | Knowing by visual perception is not the same as knowing by implication [Dretske] |
19547 | Reasons for believing P may not transmit to its implication, Q [Dretske] |
19548 | The only way to preserve our homely truths is to abandon closure [Dretske] |
19549 | P may imply Q, but evidence for P doesn't imply evidence for Q, so closure fails [Dretske] |
19550 | We know past events by memory, but we don't know the past is real (an implication) by memory [Dretske] |
2584 | Lobotomised patients can cease to care about a pain [Block] |
2582 | A brain looks no more likely than anything else to cause qualia [Block] |
5692 | Introspection is not perception, because there are no extra qualities apart from the mental events themselves [Rosenthal] |
2574 | Behaviour requires knowledge as well as dispositions [Block] |
2576 | In functionalism, desires are internal states with causal relations [Block] |
2575 | Functionalism is behaviourism, but with mental states as intermediaries [Block] |
2583 | You might invert colours, but you can't invert beliefs [Block] |
2578 | Could a creature without a brain be in the right functional state for pain? [Block] |
2585 | Not just any old functional network will have mental states [Block] |
2586 | In functionalism, what are the special inputs and outputs of conscious creatures? [Block] |
2579 | Physicalism is prejudiced in favour of our neurology, when other systems might have minds [Block] |
2577 | Simple machine-functionalism says mind just is a Turing machine [Block] |
2580 | A Turing machine, given a state and input, specifies an output and the next state [Block] |
2581 | Intuition may say that a complex sentence is ungrammatical, but linguistics can show that it is not [Block] |