Summary: Constraining hand movements affects the processing of object meaning, a finding that supports the theory of embodied cognition.
Source: Osaka Metropolitan University
How do we understand words? Scientists still do not fully understand what happens in the brain when it processes a word's meaning. A research group led by Professor Shogo Makioka from the Graduate School of Sustainable System Sciences at Osaka Metropolitan University wanted to test the idea of embodied cognition.
Embodied cognition proposes that people understand words for objects through the way they interact with them, so researchers designed a test to observe the semantic processing of words when the ways in which participants could interact with objects were limited.
Words are defined in relation to other words; a “cup,” for example, can be “a glass vessel used for drinking.” However, you can only use a cup if you understand that to drink from a cup of water, you hold it in your hand and bring it to your mouth, and that if you drop the cup, it will smash on the ground.
Without this understanding, it would be difficult to build a robot capable of manipulating a real cup. In artificial intelligence research, this is known as the symbol grounding problem: how to map symbols onto the real world.
How do humans ground symbols? Cognitive psychology and cognitive science propose the concept of embodied cognition, in which objects are given meaning through interactions with the body and the environment.
To test embodied cognition, researchers conducted experiments to see how participants’ brains responded to words describing objects that can be manipulated by hand, when participants’ hands could move freely compared to when they were restrained.
“It was very difficult to establish a method for measuring and analyzing brain activity. The first author, Ms. Sae Onishi, worked tirelessly to come up with a task, so that brain activity could be measured with sufficient accuracy,” Professor Makioka explained.
In the experiment, two words such as “mug” and “broom” were presented to participants on a screen. They were asked to compare the relative sizes of the objects represented by these words and verbally respond which object was larger – in this case, “broom”.
Comparisons were made between words describing two types of objects: hand-manipulable objects, such as “mug” or “broom,” and non-manipulable objects, such as “building” or “street lamp,” to observe how each type was processed.
During the tests, the participants rested their hands on a desk, where they were either free or restrained by a transparent acrylic plate. When the two words were presented on the screen, to answer which represented a larger object, participants had to think of the two objects and compare their sizes, forcing them to process the meaning of each word.
Brain activity was measured by functional near-infrared spectroscopy (fNIRS), which has the advantage of taking measurements without imposing additional physical constraints.
The measurements focused on the intraparietal sulcus and the inferior parietal lobule (supramarginal gyrus and angular gyrus) of the left hemisphere, which are responsible for tool-related semantic processing.
Verbal response speed was measured to determine how quickly the participant responded after the words appeared on the screen.
The results showed that left-hemisphere activity in response to hand-manipulable objects was significantly reduced by the hand restraints. Verbal responses were also affected by the restraints.
These results indicate that constraining hand movement affects object meaning processing, supporting the idea of embodied cognition. These findings suggest that the idea of embodied cognition might also be effective for artificial intelligence to learn the meaning of objects.
About this cognitive research news
Author: Yoshiko Tani
Source: Osaka Metropolitan University
Contact: Yoshiko Tani – Osaka Metropolitan University
Image: The image is credited to Makioka, Osaka Metropolitan University
Original research: Free access.
“Hand restraint reduces brain activity and affects the speed of verbal responses on semantic tasks” by Sae Onishi et al. Scientific Reports
Manual constraint reduces brain activity and affects the speed of verbal responses on semantic tasks
According to the theory of embodied cognition, semantic processing is closely related to body movements. For example, awkward hand movements inhibit memory of objects that can be manipulated with the hands. However, it has not been confirmed whether bodily restraint reduces semantic-related brain activity.
We measured the effect of manual constraint on semantic processing in the parietal lobe using functional near-infrared spectroscopy.
Pairs of words naming hand-manipulable (e.g., cup or pencil) or non-manipulable (e.g., windmill or fountain) objects were presented, and participants were asked to identify which object was bigger.
The reaction time (RT) in the judgment task and the activation of the left intraparietal sulcus (LIPS) and the left inferior parietal lobule (LIPL), including the supramarginal gyrus and the angular gyrus, were analyzed. We found that hand movement constraint suppressed brain activity in LIPS toward hand-manipulable objects and affected RT in the size judgment task.
These results indicate that bodily restraint reduces the activity of brain regions involved in semantics. Manual restraint could inhibit motor simulation, which, in turn, would inhibit body-related semantic processing.