Vazquez-Alvarez, Yolanda, Aylett, Matthew P., Brewster, Stephen A., von Jungenfeld, Rocio, Virolainen, Antti (2016) Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality. ACM Transactions on Computer-Human Interaction, 23 (1). Article Number 3. ISSN 1073-0516. E-ISSN 1557-7325. (doi:10.1145/2829944) (KAR id:58619)
Official URL: http://dx.doi.org/10.1145/2829944
Abstract
Auditory interfaces offer a solution to the problem of effective eyes-free mobile interaction. In this article, we investigate the use of multilevel auditory displays to enable eyes-free mobile interaction with indoor location-based information in non-guided audio-augmented environments. A top-level exocentric sonification layer advertises information in a gallery-like space. A secondary interactive layer is used to evaluate three conditions that varied in the presentation (sequential versus simultaneous) and spatialisation (non-spatialised versus egocentric/exocentric spatialisation) of multiple auditory sources. Our findings show that (1) participants spent significantly more time interacting with spatialised displays; (2) using the same design for the primary and the interactive secondary display (simultaneous exocentric) had a negative impact on the user experience, increased workload, and substantially increased participant movement; and (3) the other spatial interactive secondary display designs (simultaneous egocentric, sequential egocentric, and sequential exocentric) increased time spent stationary but had no negative impact on the user experience, suggesting a more exploratory experience. A follow-up qualitative and quantitative analysis of user behaviour supports these conclusions. These results provide practical guidelines for designing effective eyes-free interactions for far richer auditory soundscapes.
| Item Type | Article |
|---|---|
| DOI/Identification number | 10.1145/2829944 |
| Uncontrolled keywords | eyes-free interaction, auditory displays, spatial audio, mobile audio-augmented reality, exploratory behaviour |
| Subjects | T Technology > TA Engineering (General). Civil engineering (General) > TA168 Systems engineering |
| Divisions | Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Engineering and Digital Arts |
| Depositing User | Rocio von Jungenfeld |
| Date Deposited | 18 Jan 2017 15:40 UTC |
| Last Modified | 05 Nov 2024 10:50 UTC |
| Resource URI | https://kar.kent.ac.uk/id/eprint/58619 (the current URI for this page, for reference purposes) |