Presidential Column
How Our Bodies Do — and Don’t — Shape Our Minds
René Descartes famously declared, “I think, therefore I am,” but some modern theories about the mind would belie that proposition. More appropriate, perhaps, is writer Haruki Murakami’s version: “I move, therefore I am.”
Indeed, emerging evidence suggests that how our bodies engage with the environment around them has significant influence on how we see, feel, learn, and communicate. At the Presidential Symposium at the 2017 APS Annual Convention in Boston, APS President Susan Goldin-Meadow gathered four psychological scientists from a wide spectrum of domains and methodologies to discuss their latest findings and current theories regarding how our bodies shape our mental lives.
One of the leading researchers in this area is APS Fellow Jessica Witt, an associate professor of cognitive psychology at Colorado State University. Witt presented the results of several studies in which she examined the relationship between action and perception in a variety of settings, from the laboratory to the football field.
She showed that baseball players with higher batting averages tended to see the ball as larger, while golfers performing well saw the hole as larger than those who spent their day stuck in sand traps. Adding to these correlational results, Witt found that participants who were more successful in a field-goal-kicking session rated the goal as larger than less successful kickers did, but only after they had kicked, indicating that performance plays a causal role in perceived goal size.
Far from being a merely athletic phenomenon, the effect of action on visual measures such as perceived distance or size has been demonstrated in more basic motor tasks such as grasping for a just-out-of-reach target. Witt has found that body size matters as well: People who weigh more tend to perceive a target they must walk to as farther away than lighter people do.
“Does that mean that all action-specific effects are perceptual? No,” Witt said. “But I think we’ll be able to fine-tune which ones are.”
One particularly interesting facet of these results is that a person’s beliefs — be they about athletic ability or body size — do not appear to influence the effect of action on perception. As with an optical illusion, in which our misperception persists even though we know it is wrong, these effects are what Witt calls “cognitively impenetrable” — you cannot think your way out of them.
“These aren’t cognitive effects on perception, but may be related more to lower-level motor processes,” said Witt.
A specific type of action, hand gesturing, has implications for learning. Gesture is understood to be helpful in conveying ambiguous or subjective information, but Susan Wagner Cook, a professor at the University of Iowa, wanted to investigate its role specifically in learning. Of particular interest was whether gesture can aid the learning of more formal, abstract material such as math.
In an experiment teaching students about the equal sign as a symbol of mathematical equivalence (rather than as a “put answer here” symbol, which younger children often take it to be), she found that when the instructor gestured while teaching the lesson, students’ test performance did indeed improve, both immediately and days later.
To investigate possible confounding variables such as other nonverbal behaviors or the quality of speech, Cook created an animated avatar whose movements could be tightly controlled when teaching the math lessons. Once again, subjects who saw the gesturing avatar learned the concept more fully, and also were more successful at generalizing the concept to other math problems.
Gesture appears to play a significant role in children’s learning, but Cook wondered about adults, who are more linguistically advanced and therefore might not need to rely as much on nonverbal cues. It turns out that adults, too, learn more when taught with gesture.
Cook’s theory is that gesture helps to cue the important information being presented, setting the stage to allow the learner to make sense of upcoming information. To test this theory, she set up experiments in which she disrupted the coordination of the avatar’s gestures with its speech, so that the gestures either came before or after the point at which they would naturally occur. She found that gestures enhanced learning when they came on time or early but not when they came after the relevant information.
“One possibility is that we use gesture to constrain our understanding in the moment, and so we need gesture to come before the speech or it’s too late to influence how we interpret that speech,” Cook said.
If this is true, then gesture should be especially helpful for people who have language-processing deficits or when speech is ambiguous or vague.
“In those moments when you’re not sure what the speaker is going to say next, you might need to rely on the gesture to help you get ready for what’s coming,” Cook said.
Our corporeality is linked to how we perceive our visual environment and how we learn, but some researchers hypothesize that our bodies also can affect how we feel and, possibly, act.
Harvard Business School professor Amy Cuddy discussed the research on the feedback effects of body posture on psychological outcomes. Differences in statistical methodology, she said, have led researchers to disparate conclusions about the true effect of body posture on how a person feels and behaves. In fact, the distinction between those two categories of effects — measures of feelings and other subjective mental states versus more observable data such as hormone levels and performance tests — appears to be key, according to Cuddy’s analyses.
In aggregate, studies examining the effects of postural feedback on feelings of power and mental states have shown stronger evidence than those that focus on behavioral and physiological outcomes, but the “kitchen-sink” method of meta-analysis that combines all these factors results in substantially weaker support for postural feedback effects, she said.
“The mental states do seem to be the stronger effects,” she said. “At this point, we cannot say for certain that there is something going on with the behaviors, but I don’t think we should throw the baby out with the bathwater.”
Bodily influence extends beyond internal targets such as perception and feelings to communication strategies. Ted Supalla, a professor of neurology, linguistics, and psychology at Georgetown University, studies the evolution of signed languages all over the world. Sign language is inherently embodied, rooted in imagistic depiction and the body itself, but as it has developed into a full language it has shed many of those bodily and imagistic constraints, Supalla explained.
Gesture typically relies heavily on transparency; if you are trying to communicate with someone with whom you do not share a common language, you likely will use gestures that paint a picture or act out a scene that can be universally understood. (For an example, think about how you would gesture the verb “walk” with your hands.) However, as gesture develops into formalized language, it changes toward more opaque morphology; the form and meaning no longer have an obvious connection. Grammatical elements replace visual analogy, restructuring morphological components of gesture into a formalized symbolic system.
Supalla discussed several universal requirements of sign language structure, including the importance of gestural sequence as a way to express complex meanings. Referring to the sun, for example, by tracing one’s face (describing a round object) before pointing upward helps convey that you mean a round object in the sky rather than the ceiling.
Space is often used for reference, so the directional path of a sign — for a signed verb, for example, where its movement path begins and ends — provides information about the subject and the object or recipient involved. Facial expression is certainly useful for conveying affective information, but many nonsigners are unaware that these expressions also communicate syntactic information. Raised eyebrows indicate that a yes-or-no question is being asked, for instance, and the shape of the mouth serves an adverbial function, communicating how an action happened (e.g., carelessly or intensely).
Supalla’s collaborative brain-imaging work has revealed that signers showed lateralized activation in their left hemispheres when viewing a person signing, as expected given the left hemisphere’s role in language processing regardless of modality. Nonsigners showed more activation in their right hemispheres, where motion perception is processed, since they were viewing the person’s hand movements but not interpreting them as language. Nonsigners showed the same pattern when viewing nonlinguistic gesture, whereas signers still showed increased left-hemisphere activation, indicating that they were processing gesture in a structured way, much as they processed language.
“This tells us that, throughout the evolution of a signed language, the language changes to become a better formal system, and the brain lateralizes this processing using the left hemisphere, just as in the evolution and processing of spoken languages,” Supalla said.
All the symposium speakers have plans for further research into their respective subjects, hoping to shed ever more light on the hidden connections between our bodies and our mental lives.