Unifying the Sensory and Motor Components of Sensorimotor Adaptation
Proc. Advances in Neural Information Processing Systems (NIPS '08)
Date
2008
Authors
Haith, Adrian
Jackson, Carl
Miall, Chris
Vijayakumar, Sethu
Abstract
Adaptation of visually guided reaching movements in novel visuomotor environments (e.g. wearing prism goggles) comprises not only motor adaptation but also substantial sensory adaptation, corresponding to shifts in the perceived spatial location of visual and proprioceptive cues. Previous computational models of the sensory component of visuomotor adaptation have assumed that it is driven purely by the discrepancy introduced between visual and proprioceptive estimates of hand position and is independent of any motor component of adaptation. We instead propose a unified model in which sensory and motor adaptation are jointly driven by optimal Bayesian estimation of the sensory and motor contributions to perceived errors. Our model is able to account for patterns of performance errors during visuomotor adaptation as well as the subsequent perceptual aftereffects. This unified model also makes the surprising prediction that force field adaptation will elicit similar perceptual shifts, even though there is never any discrepancy between visual and proprioceptive observations. We confirm this prediction with an experiment.
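
The central idea of the abstract, sensory and motor adaptation arising jointly from Bayesian estimation of their respective contributions to perceived errors, can be illustrated with a minimal Kalman-filter sketch. Everything below (the state layout, observation matrix, noise magnitudes, and the simulated prism-like displacement) is an illustrative assumption for exposition, not the model structure or parameters fitted in the paper.

```python
import numpy as np

# Minimal sketch: joint Bayesian (Kalman filter) estimation of sensory and
# motor contributions to perceived reach errors. All quantities here are
# illustrative assumptions, not the paper's fitted parameters.

# Assumed state: [visual shift, proprioceptive shift, motor perturbation]
A = np.eye(3)                      # each component assumed to follow a random walk
Q = np.diag([1e-4, 1e-4, 1e-3])    # assumed process (drift) noise per trial

# Assumed observations per trial: visually sensed error and proprioceptively
# sensed error, each mixing the motor perturbation with a modality-specific shift.
H = np.array([[1.0, 0.0, 1.0],     # visual error  ~ visual shift + motor term
              [0.0, 1.0, 1.0]])    # proprio error ~ proprio shift + motor term
R = np.diag([0.01, 0.04])          # observation noise (vision assumed more precise)

def kalman_step(x, P, y):
    """One trial of prediction and correction for the 3-component state."""
    # Predict forward under the random-walk dynamics.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Correct using the two sensed errors.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

# Simulate adaptation to a purely visual (prism-like) displacement of 0.1.
rng = np.random.default_rng(0)
x, P = np.zeros(3), np.eye(3) * 0.1
true_visual_shift = 0.1
for trial in range(200):
    y = np.array([true_visual_shift, 0.0]) + rng.multivariate_normal(np.zeros(2), R)
    x, P = kalman_step(x, P, y)

print("estimated [visual shift, proprio shift, motor]:", np.round(x, 3))
```

Under these assumptions the filter splits credit for the perceived error between the sensory-shift and motor components, which is the qualitative behaviour the abstract describes: both perceptual and motor adaptation emerge from a single estimation process.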