Holmes, Nicholas P. and Dakwar, Azar (2015) Online control of reaching and pointing to visual, auditory, and multimodal targets: Effects of target modality and method of determining correction latency. Vision Research, 117, pp. 105-116. ISSN 0042-6989. (doi:10.1016/j.visres.2015.08.019) (KAR id:93245)
The full text of this publication is not currently available from this repository.
Official URL: https://doi.org/10.1016/j.visres.2015.08.019
Abstract
Movements aimed towards objects occasionally have to be adjusted when the object moves. These online adjustments can be very rapid, occurring in as little as 100 ms. More is known about the latency and neural basis of online control of movements to visual than to auditory target objects. We examined the latency of online corrections in reaching-to-point movements to visual and auditory targets that could change side and/or modality at movement onset. Visual or auditory targets were presented on the left or right sides, and participants were instructed to reach and point to them as quickly and as accurately as possible. On half of the trials, the targets changed side at movement onset, and participants had to correct their movements to point to the new target location as quickly as possible. Given different published approaches to measuring the latency for initiating movement corrections, we examined several different methods systematically. What we describe here as the optimal methods involved fitting a straight-line model to the velocity of the correction movement, rather than using a statistical criterion to determine correction onset. In the multimodal experiment, these model-fitting methods produced significantly lower latencies for correcting movements away from the auditory targets than away from the visual targets. Our results confirm that rapid online correction is possible for auditory targets, but further work is required to determine whether the underlying control system for reaching and pointing movements is the same for auditory and visual targets.
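For illustration only, the sketch below contrasts the two general approaches the abstract mentions for estimating correction onset: extrapolating a straight-line fit to the correction velocity back to its zero crossing, versus a simple statistical (threshold) criterion on the velocity trace. The function names, fit window, threshold rule, and simulated velocity data are all assumptions made for this example; they are not the authors' implementation or parameters.

```python
import numpy as np

def onset_line_fit(t, v, fit_window):
    """Estimate correction onset by fitting a straight line to the velocity
    during the early part of the correction (fit_window: an index slice) and
    extrapolating back to the time where the fitted line crosses zero.
    Illustrative sketch only; the paper's exact procedure is not reproduced here."""
    slope, intercept = np.polyfit(t[fit_window], v[fit_window], 1)
    return -intercept / slope  # time at which the fitted line crosses zero velocity

def onset_threshold(t, v, baseline_end, n_sd=3.0):
    """Estimate correction onset with a generic statistical criterion: the first
    sample where velocity exceeds the baseline mean by n_sd standard deviations
    (again, an example rule, not the one used in the paper)."""
    baseline = v[:baseline_end]
    criterion = baseline.mean() + n_sd * baseline.std()
    above = np.nonzero(v > criterion)[0]
    return t[above[0]] if above.size else np.nan

# Hypothetical usage with a simulated lateral hand-velocity trace (arbitrary units):
# a correction beginning at 150 ms, sampled at 1 kHz, with small measurement noise.
t = np.linspace(0.0, 0.5, 501)
v = np.where(t > 0.15, 4.0 * (t - 0.15), 0.0)
v += np.random.default_rng(0).normal(0.0, 0.01, t.size)

print(onset_line_fit(t, v, slice(200, 300)))        # ~0.15 s
print(onset_threshold(t, v, baseline_end=100))      # slightly later than 0.15 s
```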
| Item Type: | Article |
|---|---|
| DOI/Identification number: | 10.1016/j.visres.2015.08.019 |
| Uncontrolled keywords: | Multisensory, Multimodal Space, Online control, Methods |
| Divisions: | Divisions > Division of Natural Sciences > Biosciences |
| Depositing User: | Azar Dakwar |
| Date Deposited: | 18 Feb 2022 06:29 UTC |
| Last Modified: | 17 Aug 2022 11:02 UTC |
| Resource URI: | https://kar.kent.ac.uk/id/eprint/93245 |