Utilization of visual feedback of the hand according to target view availability in the online control of prehension movements
Introduction
The reach-to-grasp movement is a fundamental human skill used in daily life. It serves as a form of interaction with the external world and has been a research focus for the last three decades (see Castiello & Begliomini, 2008; Grafton, 2010; Rosenbaum et al., 2012, for recent reviews). In early studies, Jeannerod (1981, 1984) proposed that this movement consists of two components: a transport component, which is thought to direct the arm to the spatial location of the target, and a manipulation component, which is involved in grasping a three-dimensional object. Behaviorally, the fingers first open gradually and form the appropriate configuration for the target object to be grasped (“preshaping”). The fingers then open wider than the size of the target object and stop opening at a point about 60–70% into the movement (i.e., the peak grip aperture; PGA). They then enclose the object and finally touch its surface (e.g., Jeannerod & Marteniuk, 1992). The PGA is highly correlated with the target size (Marteniuk, Leavitt, MacKenzie, & Athènes, 1990).
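To make the PGA concrete for readers working with motion-capture data: it is simply the maximum thumb–index separation over the movement, and its relative timing is the position of that maximum expressed as a fraction of movement duration. A minimal sketch follows; the function names and the synthetic Gaussian aperture profile are illustrative, not taken from the studies cited above.

```python
import numpy as np

def grip_aperture_profile(thumb_xyz, index_xyz):
    """Euclidean thumb-index distance at each sample (inputs: N x 3 arrays)."""
    return np.linalg.norm(thumb_xyz - index_xyz, axis=1)

def peak_grip_aperture(thumb_xyz, index_xyz):
    """Return the PGA and its timing as a fraction of movement duration."""
    aperture = grip_aperture_profile(thumb_xyz, index_xyz)
    i = int(np.argmax(aperture))
    return aperture[i], i / (len(aperture) - 1)

# Synthetic example: aperture rises, peaks ~65% into the movement, then encloses.
t = np.linspace(0.0, 1.0, 101)
aperture_cm = 6.0 + 4.0 * np.exp(-((t - 0.65) / 0.2) ** 2)
thumb = np.zeros((101, 3))
index = np.column_stack([aperture_cm, np.zeros(101), np.zeros(101)])
pga, rel_time = peak_grip_aperture(thumb, index)
# pga = 10.0 (cm), rel_time = 0.65
```

On this synthetic profile the peak falls at 65% of movement duration, matching the 60–70% range reported in the behavioral literature.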
Proper performance of this movement requires visuomotor transformation, indicating that online vision is essential for goal-directed movements. Since Woodworth’s pioneering work (Woodworth, 1899), many researchers have investigated how online vision affects the control of goal-directed movements (e.g., Beggs & Howarth, 1970; Keele & Posner, 1968; Vince, 1948; Zelaznik et al., 1983). Previous studies on goal-directed movements focused on: (i) which source of vision was critical for the control of movement, and (ii) how that source affected the movement kinematics. In aiming and reaching tasks, visual feedback of hand position came to be viewed as important in the later phase of the movement (e.g., Beaubaton & Hay, 1986; Carlton, 1981; Chua & Elliott, 1993), in line with the notion that sensory feedback is too slow for appropriate control of the hand trajectory because of the inherent delay of the sensorimotor loop (Gerdes & Happee, 1994; Hollerbach, 1982). However, in addition to a rather classical study by Bard, Hay, and Fleury (1985), more recent investigations have challenged this notion. For example, Saunders and Knill (2003) demonstrated that visual feedback of hand position in the early phase of reaching was used in a continuous fashion to correct the movement (see also Bedard & Proteau, 2004; Ma-Wyatt & McKee, 2007; Sheth & Shimojo, 2002). This observation has been interpreted as evidence for an internal forward model (e.g., Kawato, 1999; Miall & Wolpert, 1996), which uses an efference copy of the motor command to predict the current state of an effector and can thereby compensate for sensory delays in feedback processing (Desmurget & Grafton, 2000). Sarlegna et al. (2003) focused on the relative contributions of viewing the hand and the target for the online control of reaching, and demonstrated a predominant role of target position and a lesser contribution of visual information about hand position (see also Berkinblit, Fookson, Smetanin, Adamovich, & Poizner, 1995). Recently, Elliott et al. (2010) presented a multiple-process model of limb control in aiming. They assumed that at least three types of online regulation are operative: (a) early efferent control based on a comparison of the efference copy to efferent outflow; (b) early and continuing afferent control based on a comparison of the early dynamic properties of the limb movement (visual and proprioceptive) to the expected sensory consequences; and (c) late visual control based on a comparison of limb and target position as the limb enters central vision.
Reach-to-grasp movements show a significantly larger PGA when online vision of the entire visual field is absent during prehension (Bradshaw & Elliott, 2003; Fukui & Inui, 2006; Jakobson & Goodale, 1991; Wing et al., 1986). Furthermore, preventing participants from seeing their hands during movement affected both movement duration and finger aperture (e.g., Berthier et al., 1996; Gentilucci et al., 1994): movement duration was longer and PGA was larger when the moving hand was not visible (but see Connolly & Goodale, 1999; Jeannerod, 1984). One possible reason for a larger PGA when online vision is lacking is that the greater margin of hand aperture allows for error in the movement and prevents collision of the fingers with the target object (e.g., Edwards et al., 2005; Haggard & Wing, 1995; Rand et al., 2007; Wing et al., 1986). Therefore, PGA has been regarded as an indicator of the influence of online vision on grasping.
In a previous study (Fukui & Inui, 2006), we investigated: (i) whether online vision in the early phase of movement influences the control of reach-to-grasp movements, and (ii) how vision of the target object and of the participant’s moving limb in the early phase affects the online control of movement. We used liquid crystal shutter goggles and liquid crystal shutter plates to manipulate the duration of online vision during the movement; the shutter plates also allowed us to manipulate views of the target and the hand separately. We found that: (i) the presence or absence of vision in the early phase of the movement (i.e., 150–350 ms following movement onset) affected the PGA at a movement duration typical of prehension over a moderate distance in daily life (approximately 1000 ms, with 50 cm between the start position and the target), and (ii) early occlusion of both target and hand led to a larger PGA than was seen with full vision during prehension, whereas early occlusion of only the hand had no impact on the PGA. These findings indicated that online vision, especially of the target object, during the early phase of prehension movements is critical to the control of grasping.
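The early-phase occlusion manipulation can be sketched as a simple schedule keyed to time since movement onset. The snippet below is purely illustrative: the 150–350 ms window mirrors the interval described above, but the function and its name are hypothetical, and the actual experiments drove liquid crystal shutters in hardware.

```python
def shutter_transparent(t_ms, occlusion_window=(150.0, 350.0)):
    """Hypothetical schedule for one early-occlusion trial: the liquid
    crystal shutter is opaque only during the early-phase window,
    with t_ms measured from movement onset."""
    start, end = occlusion_window
    return not (start <= t_ms < end)

# shutter_transparent(100)  -> True  (vision available before the window)
# shutter_transparent(200)  -> False (early phase occluded)
# shutter_transparent(400)  -> True  (vision restored after the window)
```

Independently applying such a schedule to a plate in front of the target and a plate in front of the hand is what allows the two views to be occluded separately.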
Although we were able to demonstrate a prominent role of the early-phase target view in the online control of reach-to-grasp movements (Fukui & Inui, 2006), it remains unclear in which situations visual feedback of the hand in the early phase of the movement contributes to the online control of grasping. Two possibilities can be considered.
- (1) Hand position is determined by vision and proprioception (e.g., Sober & Sabes, 2005; van Beers et al., 1999, 2002). If initial views of the target and the hand before the movement (i.e., during the reaction-time phase) are provided (cf. Rossetti, Stelmach, Desmurget, Prablanc, & Jeannerod, 1994), proprioception supplies ample information to guide the hand to the target location and generate an appropriate finger configuration, even with no view of the target during the movement. Therefore, hand view in the early phase would not contribute substantially to the online control of grasping.
- (2) In the absence of target information, an increase in the relative influence of the hand view in the early phase would be expected for the online control of grasping. Therefore, visual feedback of the hand, in addition to proprioception, would contribute to the online control of grasping.
The aim of the present study was to test which of these two possibilities is more plausible by separately manipulating the early views of the target and the hand during reach-to-grasp movements using liquid crystal shutter plates. Specifically, we examined whether an effect of hand view emerges when the target view cannot be used.
Section snippets
Participants
Eight self-reported right-handed students (3 males, 5 females; aged 18–22 years) participated in this experiment after giving their informed consent according to the Declaration of Helsinki. All participants reported normal or corrected-to-normal vision, and none had any motor or sensory abnormalities. They were naive as to the purpose of the experiment and were paid for their participation.
Apparatus
Fig. 1A shows the experimental setup. Participants wore liquid crystal shutter goggles (Takei Scientific
Results
Mean values and SDs of the kinematic parameters for each target object (diameter: 4 cm or 6 cm) are shown in Table 1. Participants reported no difficulties in performing the task, because the views of both the target object and the hand were available at the moment of grasping (i.e., at contact with the object surface), and they were able to grasp the object adequately in every condition. Even so, subsequent findings indicated that the availability of early-phase online vision influenced
Discussion
This study explored the effects of viewing the target and hand on the online control of grasping, with a particular focus on the effect of hand view in relation to the visual condition of the target. In addition to the temporal manipulation of target view, we independently manipulated the early phase view of the hand and the object by using two liquid crystal shutter plates (Fig. 1A). This concurrent manipulation of online vision was a methodological expansion of previous studies on
Acknowledgment
We thank Prof. D. Elliott for his helpful comments on an earlier version of this article.
References (98)
- Contribution of visual information to feedforward and feedback processes in rapid pointing movements. Human Movement Science (1986).
- Grasping versus pointing and the differential use of visual feedback. Human Movement Science (1993).
- Visual regulation of manual aiming. Human Movement Science (1993).
- Forward modeling allows feedback control for fast reaching movements. Trends in Cognitive Sciences (2000).
- Pointing and grasping in unilateral visual neglect: Effect of on-line visual feedback in grasping. Neuropsychologia (1999).
- Cortical topography of human anterior intraparietal cortex active during visually guided grasping. Cognitive Brain Research (2005).
- The effect of viewing the moving limb and target object during the early phase of movement on the online control of grasping. Human Movement Science (2006).
- Computers, brains and the control of movement. Trends in Neurosciences (1982).
- Functional characteristics of prehension: From data to artificial neural networks.
- Internal models for motor control and trajectory planning. Current Opinion in Neurobiology (1999).
- Inferring online and offline processing of visual feedback in target-directed movements from kinematic data. Neuroscience and Biobehavioral Reviews.
- Functional relationships between grasp and transport components in a prehension task. Human Movement Science.
- Forward models for physiological motor control. Neural Networks.
- When feeling is more important than seeing in sensorimotor adaptation. Current Biology.
- Accuracy demands in natural prehension. Human Movement Science.
- Differential influence of vision and proprioception on control of movement distance. Experimental Brain Research.
- Role of peripheral vision in the directional control of rapid aiming movements. Canadian Journal of Psychology.
- On-line vs. off-line utilization of peripheral visual afferent information to ensure spatial accuracy of goal-directed movements. Experimental Brain Research.
- Movement control in a repetitive motor task. Nature.
- The effects of intermittent vision on prehension under binocular and monocular viewing. Motor Control.
- The interaction of visual and proprioceptive inputs in pointing to actual and remembered targets. Experimental Brain Research.
- Visual information and object size in the control of reaching. Journal of Motor Behavior.
- Spatial and effector processing in the human parietofrontal network for reaches and saccades. Journal of Neurophysiology.
- A fronto-parietal circuit for object manipulation in man: Evidence from an fMRI-study. The European Journal of Neuroscience.
- On-line versus off-line control of rapid aiming movements. Journal of Motor Behavior.
- The speed-accuracy trade-off in manual prehension: Effects of movement amplitude, object size and object width on kinematic characteristics. Experimental Brain Research.
- The role of binocular information in the ‘on-line’ control of prehension. Spatial Vision.
- Fast responses of the human hand to changes in target position. Journal of Motor Behavior.
- The effect of trial number on the emergence of the ‘broken escalator’ locomotor aftereffect. Experimental Brain Research.
- Visual information: The control of aiming movements. Quarterly Journal of Experimental Psychology.
- The cortical control of visually guided grasping. Neuroscientist.
- Randomizing visual feedback in manual aiming: Reminiscence of the previous trial condition and prior knowledge of feedback availability. Experimental Brain Research.
- Vision of the hand and environmental context in human prehension. Experimental Brain Research.
- The role of visual feedback of hand position in the control of manual prehension. Experimental Brain Research.
- On the relation between object shape and grasping kinematics. Journal of Neurophysiology.
- Visually guided grasping produces fMRI activation in dorsal but not ventral stream brain areas. Experimental Brain Research.
- Voluntary modification of automatic arm movements evoked by motion of a visual target. Experimental Brain Research.
- Representation of hand position prior to movement and motor variability. Canadian Journal of Physiology and Pharmacology.
- Knowing your nose better than your thumb: Measures of over-grasp reveal that face-parts are special for grasping. Experimental Brain Research.
- The utilization of visual feedback information during rapid pointing movements. The Quarterly Journal of Experimental Psychology.
- Goal-directed aiming: Two components but multiple processes. Psychological Bulletin.
- Human cortical control of hand movements: Parietofrontal networks for reaching, grasping, and pointing. The Neuroscientist.
- Odd sensation induced by moving-phantom which triggers subconscious motor program. PLoS One.
- Visuomotor transformation process in goal-directed prehension: Utilization of online vision during preshaping phase of grasping. Japanese Psychological Research.
- The role of proprioception in the control of prehension movements: A kinematic study in a peripherally deafferented patient and in normal subjects. Experimental Brain Research.
- The use of internal representation in fast goal-directed movements: A modeling approach. Biological Cybernetics.
- Large adjustments in visually guided reaching do not depend on vision of the hand or perception of target displacement. Nature.
- Evidence for automatic on-line adjustments of hand orientation during natural reaching movements to stationary targets. Journal of Neurophysiology.
- The cognitive neuroscience of prehension: Recent developments. Experimental Brain Research.
Cited by (9)
- Gait and reach-to-grasp movements are mutually modified when performed simultaneously. Human Movement Science (2015).
- Eye–hand coordination: memory-guided grasping during obstacle avoidance. Experimental Brain Research (2022).
- Some binocular advantages for planning reach, but not grasp, components of prehension. Experimental Brain Research (2019).
- Gaze anchoring guides real but not pantomime reach-to-grasp: support for the action–perception theory. Experimental Brain Research (2018).
- How removing visual information affects grasping movements. Experimental Brain Research (2018).
- Grasping in absence of feedback: systematic biases endure extensive training. Experimental Brain Research (2016).