Optimal visual-haptic integration with articulated tools

Research output: Contribution to journal › Article › peer-review

Standard

Optimal visual-haptic integration with articulated tools. / Takahashi, Chie; Watt, Simon.
In: Experimental Brain Research, Vol. 235, 05.2017, p. 1361-1373.


Harvard

Takahashi, C & Watt, S 2017, 'Optimal visual-haptic integration with articulated tools', Experimental Brain Research, vol. 235, pp. 1361-1373. https://doi.org/10.1007/s00221-017-4896-5


Vancouver

Takahashi C, Watt S. Optimal visual-haptic integration with articulated tools. Experimental Brain Research. 2017 May;235:1361-1373. Epub 2017 Feb 18. doi: 10.1007/s00221-017-4896-5

Author

Takahashi, Chie; Watt, Simon. / Optimal visual-haptic integration with articulated tools. In: Experimental Brain Research. 2017; Vol. 235. pp. 1361-1373.

RIS

TY  - JOUR
T1  - Optimal visual-haptic integration with articulated tools
AU  - Takahashi, Chie
AU  - Watt, Simon
PY  - 2017/5
Y1  - 2017/5
N2  - When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory ‘correspondence problem’ underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world—seeing and feeling the same thing—and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual–haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools’ properties.
AB  - When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory ‘correspondence problem’ underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world—seeing and feeling the same thing—and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual–haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools’ properties.
UR  - https://static-content.springer.com/esm/art%3A10.1007%2Fs00221-017-4896-5/MediaObjects/221_2017_4896_MOESM1_ESM.docx
U2  - 10.1007/s00221-017-4896-5
DO  - 10.1007/s00221-017-4896-5
M3  - Article
VL  - 235
SP  - 1361
EP  - 1373
JO  - Experimental Brain Research
JF  - Experimental Brain Research
SN  - 0014-4819
ER  -
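
The ‘maximum-likelihood integrator’ referred to in the abstract is the standard minimum-variance cue-combination model. As a brief sketch of the predictions it makes (stated here in the conventional notation of the cue-combination literature, e.g. Ernst and Banks 2002, rather than in symbols taken from the paper itself), the integrated estimate is a reliability-weighted average of the single-cue estimates:

\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad w_H = 1 - w_V

\sigma_{VH}^2 = \frac{\sigma_V^2 \, \sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min\left(\sigma_V^2, \sigma_H^2\right)

Here \hat{S}_V and \hat{S}_H are the vision-only and haptics-only size estimates, and \sigma_V^2 and \sigma_H^2 are their variances. The second line gives the ‘optimal prediction’ for the precision of combined estimates: integrated variance is never worse than the better single cue, which is the benchmark against which the study compares measured visual-haptic precision.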