Aphasia Therapy, Pragmatics, and Psycholinguistics
Predicting Communicative Success in Aphasia
The primary goal of aphasia therapy is not to restore language but to restore communication, so that the person with aphasia is not isolated from family, friends, and the other people they need to interact with. In a large proportion of aphasia cases, the language impairment is severe enough that a full return to normal language is not realistic. Most people with aphasia need to supplement their language with gesture, drawing, and whatever writing they can muster, and to monitor both the quality of their own speech and whether their listeners truly understand them or are just nodding supportively. Brain damage can make each of these things more difficult, but therapy and counseling can help bring some of them back within reach. Controlling what your attention is focused on is part of what psychologists call executive function. If two patients in a clinic have equal degrees and similar kinds of language difficulty but very different communication abilities, the one who communicates better most likely has better executive function.
Pragmatics
Pragmatics is the part of linguistics sometimes called "how to do things with words," and it is the last part of our native language that we acquire. The topic of pragmatics is huge but easy to grasp. People need to know polite ways to ask for things in a new language, even when the polite forms are complex: English learners, for example, need to be able to use "Could I please have…" instead of "I want…". It is hard enough to be a foreigner; one doesn't also want to be perceived as rude. The differences between written and spoken language are also matters of pragmatics: compare a colloquial narrative like "So then this guy comes up to me and says…" with a formal alternative like "At that point, a man I didn't know came toward me and said…". People with aphasia typically have most of their sense of pragmatics spared by their brain injuries, even when they cannot find the syntax or the words they need to express themselves.
Formulas and Clichés
Prefabricated units save construction costs in assembling sentences, just as they do in assembling houses. Completely ready-to-go units, including greetings like "'Sup?" (or the older "What's up?"), clichés like "as dead as a doornail," and proverbs like "A stitch in time saves nine," seem to roll off our tongues. But more important in everyday language are formulas: sequences of words with a slot or two for a word of your own choosing, such as "May I please have some (more) [name of food or drink]?" or the standard information question "Are we at [name of destination] yet?" These are the units second language learners rely on most.
- Children pick up on many of these formulas early because of their frequency.
- Second language learners find them extremely useful to memorize because they help one sound more fluent.
- For people with some kinds of aphasia, they may be among the few fluent word sequences that they can produce.
Blend errors can occur when two relevant formulas are aroused at the same time.
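A toy sketch can make the slot-and-blend idea concrete. The code below is purely illustrative (the formula set, the `fill` and `blend` functions, and the half-and-half splicing rule are my own assumptions, not a model from the text): a formula is a fixed word sequence with an open slot, and a blend error splices the beginning of one activated utterance onto the end of another.

```python
# Illustrative only: formulas as templates with slots, and a crude
# simulation of a blend error when two formulas are activated at once.

FORMULAS = {
    "request": "May I please have some {food}?",
    "arrival": "Are we at {place} yet?",
}

def fill(formula_name, **slots):
    """Produce an utterance by filling a formula's open slot(s)."""
    return FORMULAS[formula_name].format(**slots)

def blend(utterance1, utterance2):
    """Simulate a blend error: the first half of one planned
    utterance spliced onto the second half of another."""
    w1, w2 = utterance1.split(), utterance2.split()
    return " ".join(w1[: len(w1) // 2] + w2[len(w2) // 2 :])

a = fill("request", food="coffee")      # "May I please have some coffee?"
b = fill("arrival", place="Grandma's")  # "Are we at Grandma's yet?"
print(blend(a, b))                      # "May I please at Grandma's yet?"
```

The point of the sketch is only that both formulas contribute a coherent chunk, which is why real blend errors tend to sound fluent even when they make no sense as wholes.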
Origins of Psycholinguistic Experiments
Where do ideas for psycholinguistic experiments come from? The first source is observation: watching people have real-life language problems, in situations where they are likely to make temporary errors. For skilled readers, these are usually places where top-down processing happens to make false predictions. A familiar example is the proofreader's error: failing to notice a typing mistake when reading over something important. Proofreading is still necessary in the computer age because grammar-checker programs do not catch errors that form real words and violate no obvious grammar rule. One test of proofreading ability is to hand people an ordinary printed page and ask them to circle every "f" on it. If you try this (…). In written language, readers are likely to have difficulty with garden path sentences (…). Slips of the ear are another kind of observation that has given rise to experiments on what happens when people listen to slightly peculiar materials. Observing how people hesitate when they speak, observing children trying to form past tenses of irregular verbs, and observing one's own efforts to remember items on a shopping list have all been starting points for illuminating experiments.
Errors in Phonological Encoding
Planning the sequence of phonemes in a phrase: everybody enjoys looking at errors involving sounds, such as a shift of a segment from one word to another…. What is important to note about these errors is, again, how neat they are. They create nonwords that obey English phonotactic patterns, and most of the time a sound moves from its place in the target syllable to a very similar, if not identical, place in the error syllable. These errors involve the order of the sounds, not their articulation (the level of speech gestures). Psycholinguistic models of how sounds get into the wrong places are quite similar to the models used for errors in the placement of words or morphemes, but in our production model these exchanges have to happen after the Positional Level. All of this happens in the phonological encoding model. "Encoding" means that, to convey what we want to other people, we have to encode those meanings in sounds (or in signs, in sign language). Because the order of the sounds in a word is a crucial part of the information needed to produce it, psycholinguists often describe these complete activated forms as ordered strings of phonemes. Our model explains phonological errors that exchange or blend sounds across words (like the ones we have just been examining) by saying that these ordered strings of phonemes, after they have been aroused by their lemmas, must wait in a phonological buffer until we are on the point of producing them. When all goes well, each phoneme comes out in the right order and takes its place; phonological errors happen when one phoneme is too strongly activated and pushes into the wrong place, or is too weakly activated and drops out of the buffer.
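The buffer account above can be sketched as a toy simulation. This is my own simplification for illustration (the threshold values and the one-slot anticipation rule are invented, not taken from any published model): phonemes wait in order with activation levels; an over-activated phoneme pushes one slot ahead of its target, and an under-activated one drops out of the buffer.

```python
def produce(phonemes, activations, high=1.5, low=0.3):
    """Toy phonological buffer: return the produced sequence, given
    the planned phoneme order and per-phoneme activation levels."""
    # Under-activated phonemes drop out of the buffer entirely.
    planned = [(p, a) for p, a in zip(phonemes, activations) if a >= low]
    out = [p for p, _ in planned]
    # An over-activated phoneme pushes one slot ahead of its target
    # (an anticipation/exchange error).
    for i, (_, a) in enumerate(planned):
        if a >= high and i > 0:
            out[i - 1], out[i] = out[i], out[i - 1]
    return "".join(out)

print(produce("cat", [1.0, 1.0, 1.0]))  # "cat": all goes well
print(produce("cat", [1.0, 1.6, 1.0]))  # "act": the first two sounds exchange
print(produce("cat", [1.0, 0.1, 1.0]))  # "ct": the weak middle sound drops out
```

Note what the sketch preserves from the text: the errors are errors of ordering and survival in the buffer, not errors of articulation.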
How to Talk and Listen with Someone with Aphasia
- Speak slowly, and encourage them to do the same thing; act as if you have all the time in the world.
- Look at your aphasic conversation partner; watch his or her face for indications of understanding or confusion.
- Use multiple channels.
- Say words in phrases or sentences, and repeat them if the person you're talking to apparently didn't understand them the first time. People with aphasia, just like the rest of us, rely on top-down processing to help identify a difficult word.
- Check your understanding! Try saying back to your partner what you think they said, and watch their reactions.
- Think about what you’re going to say, and keep your sentences as simple as you can.
- Treat your aphasic conversation partner like an intelligent adult.
Troubleshooting Language Problems on Familiar Tests
The first language tests we all took were probably, linguistically speaking, spelling tests, or perhaps circling the right answer in a workbook of multiple-choice questions about pictures. In a second or foreign language class, we had to translate words, sentences, and perhaps longer passages from the new language back into the language of the school and, much harder, from the familiar language into the new one. All these tests deal with language skills: speaking, writing, reading. Conversations made us feel both idiotic and anxious; there is a special anxiety in learning another language. Troubleshooting someone's language processing problems takes two major steps:
- Task Analysis: breaking a language task into its component steps, which sets you up for the second step in troubleshooting.
- Error Analysis: thinking about the kinds of errors a person makes and why they might be making those particular errors; for example, studying the characteristic errors of native Spanish speakers learning English, or the other way around.
Task analyses are annoyingly pedestrian (like reading a complete analysis of a joke), but they are essential to figuring out what to do in challenging cases. After all, any of the many steps in saying, writing, reading, or understanding a word or sentence can go wrong while other steps are working properly. You want to focus your intervention efforts on the parts of a language task that your clients or students find difficult, and not waste their time (and yours) on the parts that are not part of their problem.
- Generativists are interested in the deep level; their main concern is language processes.
- Pragmatists think that studying disfluencies, together with gestures, is a good way to study language.
- Empiricists are interested only in the results, the surface level.
How Can a Bunch of Cells Learn or Know Something?
All those neurons are sending information to one another. If information (someone's name, the taste of a particular kind of cheese, as much as you can recall of a gorgeous sunset, etc.) or a skill (how to tie a bow, how to parallel park a car, etc.) is going to stay in your brain, it must change your brain a little. Every experience, conscious or unconscious, makes tiny changes in the strengths of the synaptic connections between some of our neurons: specifically, in how many bunches of neurotransmitter molecules will be sent from a neuron to its neighbors if the first neuron becomes active. When you learn a word, its sound, its meaning, and the grammatical and social contexts for its use, as well as the voice of the person who said it or the look of the page where you read it, all get linked up to other information of the same kinds; these links are memory associations. The idea that the brain keeps changing with experience as synaptic connections change strength is called plasticity. Brains are much more plastic than we used to think: in deaf people, neurons that would have been used for sound perception are used for processing visual language, and in blind people, the area for processing sounds expands into the area that normally processes vision.
- In deaf people, neurons that would have processed sound are recruited for processing visual language.
- In blind people, areas that would have processed vision are recruited for processing sound.
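The idea that every experience nudges synaptic strengths can be illustrated with a minimal Hebbian-style sketch. The update rule, learning rate, and ceiling below are my own illustrative assumptions, not a claim about real neurophysiology: when the sending and receiving units are active together, the connection between them grows slightly stronger.

```python
def hebbian_update(weight, pre_active, post_active, rate=0.1):
    """Strengthen a connection only when the sending (pre) and
    receiving (post) neurons are active at the same time."""
    if pre_active and post_active:
        weight += rate * (1.0 - weight)  # grow toward a ceiling of 1.0
    return weight

w = 0.2                      # initial connection strength
for _ in range(5):           # five experiences that co-activate the pair
    w = hebbian_update(w, True, True)
print(round(w, 3))           # prints 0.528: the link has strengthened
```

Each tiny update stands in for one experience; knowledge in this picture is nothing more than the accumulated pattern of such connection strengths.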
So your brain's knowledge exists, at least in large measure, as an enormous set of synaptic connection strengths. These connections mean something because they are ultimately linked to parts of the brain that are tightly connected to sensory neurons and motor neurons. When you remember how something looks, sounds, or feels, you are reactivating a small part of the sensory activation caused by the original experience. If the memory is not quite true to the original event (memories can be modified by time, emotions, and later experience), it is because the synaptic connections that stored the original information have been changed. When you remember how to do something, your motor neurons get activated; chemical changes happen inside them that make them ready and able to send signals to the muscles you would need to carry out that action. What activates them? Your wanting to: intention plus memory. The memory was formed from signals that came back to your brain from your mouth and ears, or your hands and eyes, as you learned to perform these highly skilled acts. Some of these were sensory signals that became linked to the motor signals you had just sent out; your intention then activates those linked motor neurons.