Key Concepts in Philosophy of Mind
What is the Problem of Other Minds?
The problem of other minds asks how we can know whether other beings have conscious thoughts and feelings. It stems from the fact that we can only ever observe others’ behavior, never their inner experiences directly.
Russell’s Solution
Bertrand Russell proposed that we can infer the existence of other minds based on analogy:
- We observe that our own intelligent behavior is linked to our conscious thoughts.
- We observe similar intelligent behaviors in others.
- Therefore, we can infer that others likely have conscious thoughts similar to our own.
In essence, we generalize from our own case to the case of others. If someone curses and shouts, we infer they are experiencing anger, similar to how we feel when we exhibit those behaviors.
What is Behaviorism?
Behaviorism, a dominant materialist theory of mind in the 1950s, asserts that the mind is not something separate from behavior, but rather *is* the behavior itself. Mental states are simply dispositions to behave in certain ways.
Armstrong’s Criticisms of Behaviorism
D.M. Armstrong, while finding Behaviorism a step in the right direction, criticized it for several reasons. A key criticism is that people can have mental states without acting on them. For example, someone might feel angry but suppress any outward display of that anger. This suggests that mental states and behaviors are not identical.
Armstrong argued that mental states are *linked* to behaviors, but are not the behaviors themselves. He proposed that mental states are “inner” states apt to cause behavior, analogous to dispositions. For instance, a glass’s brittleness is an inner property (its molecular structure) that causes it to shatter when struck, even though a particular glass may never actually break.
Armstrong, like Descartes, believed mental states *cause* behavior. Behaviorists, conversely, believed mental states *are* behavior. Unlike Descartes, Armstrong maintained that mental states are physical.
Example: A long-distance driver might zone out, continuing to drive safely without conscious awareness of every detail. This demonstrates a separation between the mental state of perceiving the road and the conscious awareness of that perception.
What is Identity Theory?
Identity theory posits that mental states are identical to specific brain states. For example, the experience of pain is equated to a particular pattern of neural activity.
Criticism of Identity Theory
A common criticism points to variability across brains. If pain is identical to one *specific* brain state, how can different individuals, whose brains differ in structure, experience the same sensation of pain? Animals, too, experience pain, yet their brain states differ significantly from those of humans. This is the multiple realizability objection: one and the same mental state can apparently be realized by many different physical states.
What is Frank Jackson’s Mary Argument?
Frank Jackson’s “Mary’s Room” thought experiment challenges physicalism, the view that everything is physical.
The Argument
- Mary is a color scientist who knows everything about the physical processes involved in color vision.
- However, Mary has lived her entire life in a black and white room, never experiencing color firsthand.
- When Mary leaves the room and sees a red tomato for the first time, she learns something new: what it’s *like* to see red.
This is supposed to show that physicalism is false. If Mary had all the physical facts in the room but learns something new upon experiencing color, then there must be non-physical facts (qualia, or subjective experiences).
Armstrong’s Higher-Order Perception Theory
Armstrong’s Higher-Order Perception (HOP) theory of consciousness distinguishes between levels of mental states:
- Lower-order perceptions: Basic sensory experiences (e.g., seeing a car, hearing a sound).
- Higher-order perceptions: Conscious awareness *of* those lower-order perceptions (e.g., being consciously aware that you are seeing a car).
Application to the Long-Distance Driver
The long-distance driver, while “zoned out,” still has lower-order perceptions of the road, allowing them to drive safely. However, they lack the higher-order perception of being consciously aware of those perceptions. When they “snap out of it,” they regain that higher-order awareness.
Cocktail Party Effect: You may not be consciously attending to background conversations, but you still have lower-order perceptions of them, as evidenced by your reaction to hearing your name or a trigger word.
Folk Psychology and Eliminative Materialism
- What is Folk Psychology? Folk psychology is our everyday understanding of how people think and behave, attributing beliefs, desires, and intentions to explain their actions.
Churchland’s Eliminative Materialism
Paul Churchland advocates for eliminative materialism, arguing that folk psychology is a fundamentally flawed theory that should be eliminated and replaced by a more accurate neuroscientific account of the mind.
He draws analogies to past scientific revolutions, where outdated theories (like the theory of humors or the geocentric model of the universe) were replaced by more accurate ones. He argues that folk psychology has:
- Widespread explanatory failures.
- A history of being wrong (inductive argument).
- No prospect of reduction to neuroscience, whose explanations do not invoke concepts like “belief” or “desire.”
Folk Psychology and Jerry Fodor’s View
- What is Folk Psychology? As above, it’s our common-sense understanding of minds.
Fodor’s Defense of Folk Psychology
Jerry Fodor, in contrast to Churchland, defends folk psychology as a true and indispensable theory. He argues that:
- It is highly successful in predicting and explaining behavior.
- It is indispensable; we cannot function without it, and there’s no viable alternative.
- It is deeply embedded in our social and scientific practices (e.g., linguistics, economics).
Fodor’s Representational Theory of Mind
Fodor’s Representational Theory of Mind (RTM), also known as the Language of Thought hypothesis, proposes that mental states are relations to mental representations that have a language-like structure.
Key Ideas
- Mental states are propositional attitudes, consisting of an attitude (e.g., belief, desire) and a proposition (the content of the belief or desire).
- Mental states are causally efficacious; they cause other mental states and behaviors.
- Thinking involves manipulating symbols in the mind according to logical rules, similar to a language.
Dennett’s Criticism
Daniel Dennett criticizes RTM by using the analogy of two chess-playing computer programs. If one program is criticized for “getting its queen out early,” we wouldn’t expect to find a specific line of code explicitly stating “get the queen out early.” The behavior emerges from the interaction of many rules, not a single, explicit representation. Dennett argues that mental states might similarly emerge from complex interactions, rather than being explicitly represented in a language of thought.
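Dennett’s point can be illustrated with a toy sketch (all names and numbers here are hypothetical, not from any real chess engine): the move-chooser below is built only from generic heuristics, yet the queen “gets out early” as an emergent result of their interaction, with no rule that mentions the queen at all.

```python
# Hypothetical sketch: a development-move chooser with no explicit
# "get the queen out early" rule anywhere in the code.
PIECE_VALUE = {"pawn": 1, "knight": 3, "bishop": 3, "rook": 5, "queen": 9}

# Assumed rough mobility each piece would gain by developing early.
EARLY_MOBILITY = {"pawn": 2, "knight": 5, "bishop": 4, "rook": 2, "queen": 6}

def pick_development(pieces):
    """Prefer developing the piece with the highest value-weighted mobility."""
    return max(pieces, key=lambda p: PIECE_VALUE[p] * EARLY_MOBILITY[p])

# The queen is chosen (9 * 6 = 54, the highest score), even though no
# line of code explicitly represents "develop the queen early".
print(pick_development(["pawn", "knight", "bishop", "rook", "queen"]))
```

Just as a programmer inspecting this code would find no “queen” rule to point to, Dennett suggests that inspecting a mind might reveal no explicit sentence-like representation behind a given mental state.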
The Turing Test
The Turing Test, proposed by Alan Turing, is a test of a machine’s ability to exhibit intelligent behavior indistinguishable from that of a human.
Purpose
The purpose is to determine whether a machine can *think*. It’s based on the “Imitation Game,” where a human interrogator tries to distinguish between a human and a computer based solely on their text-based responses to questions.
Ex Machina Variation
In the film Ex Machina, the test is modified. Caleb, the interrogator, *knows* Ava is a machine. The test is not whether Ava can fool him into thinking she’s human, but whether he will develop feelings for her *despite* knowing she’s a robot. This explores a deeper level of interaction and emotional connection, going beyond simple imitation.
Searle’s Chinese Room Argument
John Searle’s Chinese Room Argument is a thought experiment designed to challenge the idea that machines can truly *understand* or have a mind.
The Argument
Imagine a person inside a room who doesn’t understand Chinese. They receive written Chinese questions through a slot, and using a rule book, they manipulate symbols to produce Chinese answers. To an outside observer, it might seem like the room “understands” Chinese, but the person inside is simply manipulating symbols according to rules, without any comprehension.
Searle argues that this is analogous to a computer. Computers manipulate symbols according to programmed rules, but they don’t *understand* the meaning of those symbols. Therefore, according to Searle, computers cannot have minds in the same way that humans do.
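The symbol manipulation Searle describes can be sketched as a simple lookup table (the question/answer pairs below are placeholder examples, not a real rule book): the program matches character sequences and emits scripted replies without any grasp of what the symbols mean.

```python
# A minimal sketch of the Chinese Room as pure symbol manipulation.
# The "rule book" is just a table pairing input strings with output
# strings; the entries are hypothetical placeholders.
RULE_BOOK = {
    "你好吗": "我很好",            # "How are you?" -> "I am fine."
    "你叫什么名字": "我叫房间",     # "What is your name?" -> "I am called Room."
}

def room(question: str) -> str:
    """Return the scripted answer for a question. Nothing here
    understands Chinese: the function only matches character
    sequences against the table."""
    return RULE_BOOK.get(question, "请再说一遍")  # default: "please repeat"

print(room("你好吗"))
```

To an outside observer the replies look fluent, yet by construction the program contains no comprehension at all; this is exactly the gap Searle says separates symbol processing from understanding.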
The ‘Heads in the Sand’ Objection
This objection to the possibility of thinking machines expresses fears about the potential consequences:
- Loss of human superiority: If machines can think, humans would no longer be unique in their possession of reason.
- Risk of being supplanted: Machines might become more intelligent than humans.
- Risk of domination: Machines might take over and enslave or exterminate humanity.
This is not a logical argument against machine thought, but rather an expression of anxieties. It raises ethical considerations about the development of artificial intelligence, but doesn’t refute the possibility of machines thinking.