Visual-haptic integration during tool use


  • Chie Takahashi

  • PhD, School of Psychology

Abstract

To integrate visual and haptic information effectively, the brain should only combine information that refers to the same object. It must therefore solve a 'correspondence problem': determining whether two signals relate to the same object or not. This could be achieved by considering the similarity of the two sensory signals in time and space. For example, if two size estimates are spatially separated or conflicting, it is unlikely that they originate from the same object, so sensory integration should not occur. However, humans are adept at using tools such as pliers, which can systematically change the spatial relationship between (visual) object size and the opening of the hand. Here we investigate whether and how the brain solves this visual-haptic correspondence problem during tool use. In a series of psychophysical experiments we measured object-size discrimination performance and compared it to statistically optimal predictions derived from a computational model of sensory integration. We manipulated the spatial offset between seen and felt object positions, as well as the relative gain between object size and hand opening. When using a tool, we changed these spatial properties by manipulating tool length and pivot position (for a pliers-like tool). We found that the brain integrates visual and haptic information near-optimally when using tools (independent of spatial offset and size conflict between the raw sensory signals), but only when the hand opening was appropriately remapped onto object coordinates by the tool geometry. This suggests that visual-haptic integration is not based on the similarity between raw sensory signals, but instead on the similarity between the distal causes of the visual and haptic estimates.
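The "statistically optimal" benchmark in such studies is conventionally maximum-likelihood (inverse-variance-weighted) cue combination. A minimal sketch of that standard model, using illustrative names (s_v, s_h for visual and haptic size estimates; sigma_v, sigma_h for their noise SDs) rather than the thesis's own notation:

```python
def integrate(s_v, sigma_v, s_h, sigma_h):
    """Fuse visual and haptic size estimates by inverse-variance weighting.

    Returns the combined estimate and its predicted standard deviation,
    which is always lower than either single-cue SD.
    """
    r_v = 1.0 / sigma_v ** 2          # visual reliability
    r_h = 1.0 / sigma_h ** 2          # haptic reliability
    w_v = r_v / (r_v + r_h)           # visual cue weight
    w_h = 1.0 - w_v                   # haptic cue weight
    s_hat = w_v * s_v + w_h * s_h     # fused size estimate
    sigma_hat = (1.0 / (r_v + r_h)) ** 0.5
    return s_hat, sigma_hat
```

With equally reliable cues (sigma_v = sigma_h = 1), a 10 mm visual and 12 mm haptic estimate fuse to 11 mm, with the predicted SD reduced to about 0.71.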
We also showed that perceived size from haptics, and haptic reliability, changed with tool gain. Moreover, the cue weights for the same object size were altered by the tool geometry, suggesting that the brain dynamically takes spatial changes into account when using a tool. These findings can be explained within a Bayesian framework of multisensory integration. We conclude that the brain takes into account the dynamics and geometry of tools, allowing the visual-haptic correspondence problem to be solved correctly under a range of circumstances. We explore the theoretical implications of this for understanding sensory integration, as well as practical implications for the design of visual-haptic interfaces.
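One way to see why tool gain should change haptic reliability and cue weights: if a pliers-like tool maps hand opening to tip opening with gain g, then a haptic estimate re-expressed in object (tool-tip) coordinates, and its noise, both scale by g. A hedged sketch of this geometric remapping (function and variable names are illustrative, not the thesis's notation):

```python
def remap_haptic(hand_opening, sigma_hand, gain):
    """Re-express a haptic hand-opening estimate in object coordinates.

    If tip opening = gain * hand opening, the remapped size estimate and
    its standard deviation both scale by the tool gain.
    """
    return gain * hand_opening, gain * sigma_hand

def haptic_weight(sigma_v, sigma_h):
    """Inverse-variance weight given to the haptic cue."""
    r_v = 1.0 / sigma_v ** 2
    r_h = 1.0 / sigma_h ** 2
    return r_h / (r_v + r_h)
```

For example, with equal single-cue SDs of 1, a gain-2 tool doubles the remapped haptic SD, so the haptic weight should drop from 0.5 to 0.2, consistent with cue weights being altered by tool geometry.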

Details

Original language: English
Thesis sponsors
  • EPSRC
Award date: Jan 2012