Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality

Vazquez-Alvarez, Yolanda, Aylett, Matthew P., Brewster, Stephen A., von Jungenfeld, Rocio, and Virolainen, Antti (2016) Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality. ACM Transactions on Computer-Human Interaction, 23(1). ISSN 1073-0516. E-ISSN 1557-7325. doi:10.1145/2829944

The full text of this publication is not currently available from this repository. You may be able to access a copy if URLs are provided.
Official URL
http://dx.doi.org/10.1145/2829944

Abstract

Auditory interfaces offer a solution to the problem of effective eyes-free mobile interaction. In this article, we investigate the use of multilevel auditory displays to enable eyes-free mobile interaction with indoor location-based information in non-guided audio-augmented environments. A top-level exocentric sonification layer advertises information in a gallery-like space. A secondary interactive layer is used to evaluate three conditions that varied in the presentation (sequential versus simultaneous) and spatialisation (non-spatialised versus egocentric/exocentric) of multiple auditory sources. Our findings show that (1) participants spent significantly more time interacting with spatialised displays; (2) using the same design for the primary and the interactive secondary display (simultaneous exocentric) had a negative impact on the user experience, increased workload, and substantially increased participant movement; and (3) the other spatial interactive secondary display designs (simultaneous egocentric, sequential egocentric, and sequential exocentric) showed an increase in time spent stationary but no negative impact on the user experience, suggesting a more exploratory experience. A follow-up qualitative and quantitative analysis of user behaviour supports these conclusions. These results provide practical guidelines for designing effective eyes-free interactions for far richer auditory soundscapes.
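The egocentric/exocentric distinction in the abstract can be illustrated with a minimal sketch. The article does not publish its implementation, so the functions and coordinate conventions below are hypothetical: an exocentric source is anchored in the room, so its rendered azimuth depends on the listener's position and heading, while an egocentric source keeps a fixed angle relative to the listener's head.

```python
import math

def exocentric_azimuth(listener_pos, listener_heading_deg, source_pos):
    """Azimuth from the listener's facing direction to a room-anchored
    source, in degrees. Positive = to the listener's right.

    Coordinates are 2-D (x, y); heading 0 degrees faces the +y axis.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    world_bearing = math.degrees(math.atan2(dx, dy))  # bearing in room frame
    # Wrap the difference into (-180, 180] so turning left/right is symmetric.
    return (world_bearing - listener_heading_deg + 180) % 360 - 180

def egocentric_azimuth(fixed_offset_deg):
    """An egocentric source is head-locked: its azimuth is constant
    regardless of where the listener moves or turns."""
    return fixed_offset_deg
```

For example, a room-anchored source straight ahead of a north-facing listener renders at 0 degrees, but rotating the listener 90 degrees moves it to -90 degrees (hard left), whereas an egocentric source pinned at 45 degrees stays at 45 degrees throughout.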

Item Type: Article
Subjects: T Technology > TA Engineering (General). Civil engineering (General) > TA168 Systems engineering, cybernetics and intelligent systems
Divisions: Faculties > Sciences > School of Engineering and Digital Arts
Depositing User: Rocio von Jungenfeld
Date Deposited: 18 Jan 2017 15:40 UTC
Last Modified: 19 Jan 2017 14:49 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/58619
ORCiD (Vazquez-Alvarez, Yolanda): http://orcid.org/0000-0002-0029-3478
ORCiD (von Jungenfeld, Rocio): http://orcid.org/0000-0002-2154-8054