Haynes, Joshua D., Gallagher, Maria, Culling, John F., Freeman, Tom C. A. (2024) The precision of signals encoding active self-movement. Journal of Neurophysiology, 132 (2). pp. 389-402. ISSN 0022-3077. E-ISSN 1522-1598. (doi:10.1152/jn.00370.2023) (KAR id:106247)
Available files:
- PDF (Publisher PDF), English. This work is licensed under a Creative Commons Attribution 4.0 International License.
- PDF (Author's Accepted Manuscript), English.

Official URL: https://doi.org/10.1152/jn.00370.2023
Abstract
Everyday actions like moving the head, walking around and grasping objects are typically self-controlled. This presents a problem when studying the signals encoding such actions because active self-movement is difficult to control experimentally. Available techniques demand repeatable trials, but each action is unique, making it difficult to measure fundamental properties like psychophysical thresholds. We present a novel paradigm that recovers both the precision and bias of self-movement signals with minimal constraint on the participant. The paradigm relies on linking image motion to previous self-movement, and on two experimental phases to extract the signal encoding the latter. It accounts for a hidden source of external noise not previously considered in techniques that link display motion to self-movement in real time (e.g., virtual reality). We use head rotations as an example of self-movement and show that the precision of the signals encoding head movement depends on whether they are used to judge visual motion or auditory motion. We find that perceived motion is slowed during head movement in both cases. The 'non-image' signals encoding active head rotation (motor commands, proprioception and vestibular cues) are therefore biased towards lower speeds and/or displacements. In a second experiment, we trained participants to rotate their heads at different rates and found that the imprecision of the head rotation signal rises proportionally with head speed (Weber's Law). We discuss the findings in terms of the different motion cues used by vision and hearing, and the implications they have for Bayesian models of motion perception.
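As a rough illustration of the Weber's Law result described in the second experiment, the sketch below simulates an internal head-rotation signal whose noise (standard deviation) grows in proportion to head speed. The Weber fraction and the example head speeds are hypothetical values chosen for illustration only, not estimates taken from the paper.

```python
import numpy as np

# Hypothetical Weber fraction: noise SD as a fixed proportion of head speed.
# The value 0.2 is illustrative, not a figure from the study.
WEBER_FRACTION = 0.2

def simulate_head_rotation_signal(true_speed_dps, n_trials=10_000, rng=None):
    """Simulate noisy internal estimates of head rotation speed (deg/s).

    Noise SD scales proportionally with speed, i.e. Weber's Law.
    """
    rng = np.random.default_rng() if rng is None else rng
    sd = WEBER_FRACTION * true_speed_dps
    return rng.normal(loc=true_speed_dps, scale=sd, size=n_trials)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for speed in (10.0, 20.0, 40.0):  # example head speeds in deg/s
        estimates = simulate_head_rotation_signal(speed, rng=rng)
        print(f"speed {speed:5.1f} deg/s: "
              f"mean {estimates.mean():6.2f}, SD {estimates.std():5.2f}, "
              f"Weber fraction {estimates.std() / speed:.2f}")
```

Under this proportional-noise assumption, the recovered Weber fraction stays roughly constant across speeds while the absolute imprecision of the signal rises with head speed.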
| Item Type: | Article |
|---|---|
| DOI/Identification number: | 10.1152/jn.00370.2023 |
| Uncontrolled keywords: | motion; psychophysics; self-movement; vestibular; motor; Weber’s law; head movement |
| Subjects: | H Social Sciences |
| Institutional Unit: | Schools > School of Psychology > Psychology |
| Former Institutional Unit: | Divisions > Division of Human and Social Sciences > School of Psychology |
| Funders: | Leverhulme Trust (https://ror.org/012mzw131) |
| Depositing User: | Maria Gallagher |
| Date Deposited: | 13 Jun 2024 09:54 UTC |
| Last Modified: | 22 Jul 2025 09:19 UTC |
| Resource URI: | https://kar.kent.ac.uk/id/eprint/106247 (The current URI for this page, for reference purposes) |