Kent Academic Repository

Using Eye Tracking and Heart-Rate Activity to Examine Crossmodal Correspondences QoE in Mulsemedia

Mesfin, Gebremariam, Hussain, Nadia, Covaci, Alexandra, Ghinea, Gheorghita (2019) Using Eye Tracking and Heart-Rate Activity to Examine Crossmodal Correspondences QoE in Mulsemedia. ACM Transactions on Multimedia Computing, Communications, and Applications, 15 (2). pp. 1-22. ISSN 1551-6857. (doi:10.1145/3303080) (KAR id:77598)

PDF Author's Accepted Manuscript (English)
Official URL:
http://dx.doi.org/10.1145/3303080

Abstract

Different senses provide us with information at various levels of precision and enable us to construct a more precise representation of the world. Rich multisensory stimulation is thus beneficial for comprehension, memory reinforcement, and retention of information. Crossmodal correspondences are the systematic associations often made between different sensory modalities (e.g., high pitch is matched with angular shapes) that govern multisensory processing. A great deal of research effort has been devoted to exploring crossmodal correspondences in cognitive science; however, the possibilities they open up in the digital world remain relatively unexplored. Multiple sensorial media (mulsemedia) provides users with a highly immersive experience and enhances their Quality of Experience (QoE) in the digital world. We therefore consider that studying the plasticity and effects of crossmodal correspondences in a mulsemedia setup can yield interesting insights for improving human-computer dialogue and experience. In our experiments, we exposed users to videos with particular visual dimensions (brightness, color, and shape) and investigated whether pairing them with a crossmodally matching sound (high or low pitch) and the corresponding auto-generated vibrotactile effects (produced by a haptic vest) leads to an enhanced QoE. To this end, we captured users' eye gaze and heart rate while they experienced mulsemedia, and at the end of the experiment we asked them to complete a set of questions targeting their enjoyment and perception. Results showed differences in eye-gaze patterns and heart rate between the experimental and control groups, indicating changes in participants' engagement when videos were accompanied by matching crossmodal sounds (this effect was strongest for the video displaying angular shapes with high-pitch audio) and transitively generated crossmodal vibrotactile effects.

Item Type: Article
DOI/Identification number: 10.1145/3303080
Uncontrolled keywords: Mulsemedia, audio, crossmodal correspondence, gaze tracking, haptics, heart-rate variability, quality of experience, video
Divisions: Division of Computing, Engineering and Mathematical Sciences > School of Engineering and Digital Arts
Depositing User: Alexandra Covaci
Date Deposited: 18 Oct 2019 16:17 UTC
Last Modified: 05 Nov 2024 12:42 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/77598
