Language and the Hands: The Effects of the Hands on Semantic Processing
The hands are closely tied to language through their role in gesture, sign language, and the mirror neuron system for grasping. Studies within embodied cognition have found that motor resonance occurs when people observe graspable objects represented as pictures or words, which has raised questions about the influence of the hands on reading. This experiment investigated the effects of the hands on semantic processing. Participants completed a semantic categorization task and a visual task in two different postures: a proximal posture, in which their hands were near the text, and a distal posture, in which their hands were in their lap and out of view. It was hypothesized, in line with previous research, that semantic processing would be reduced when the hands were in the proximal posture because of a trade-off between semantic and spatial processing. Contrary to this hypothesis, results showed no effect of hand posture on semantic processing. Response times were slower in the proximal posture condition than in the distal posture condition of the visual task. Semantic processing of tool words was faster in the proximal posture condition, but this difference was not statistically significant. These results suggest that the effects of the hands on semantic processing are not robust and that further research is necessary to clarify any influence of the hands on reading. Future experiments on a possible facilitation effect of the hands during the semantic processing of graspable objects would help shed light on the relationship between the hands and higher levels of cognition.