Kent Academic Repository

VREED: Virtual Reality Emotion Recognition Dataset using Eye Tracking & Physiological Measures

Tabbaa, Luma, Searle, Ryan, Mirzaee, Saber, Hossain, Md. Moinul, Intarasirisawat, Jittrapol, Glancy, Maxine, Ang, Chee Siang (2021) VREED: Virtual Reality Emotion Recognition Dataset using Eye Tracking & Physiological Measures. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 5 (4). pp. 1-20. (doi:10.1145/3495002) (Access to this publication is currently restricted. You may be able to access a copy if URLs are provided) (KAR id:91242)

PDF (Author's Accepted Manuscript)
Language: English

Restricted to Repository staff only
Official URL: http://dx.doi.org/10.1145/3495002

Abstract

The paper introduces a multimodal affective dataset named VREED (VR Eyes: Emotions Dataset) in which emotions were triggered using immersive 360° Video-Based Virtual Environments (360-VEs) delivered via a Virtual Reality (VR) headset. Behavioural (eye tracking) and physiological signals (Electrocardiogram (ECG) and Galvanic Skin Response (GSR)) were captured, together with self-reported responses, from healthy participants (n=34) experiencing 360-VEs (n=12, 1-3 min each) selected through focus groups and a pilot trial. Statistical analysis confirmed the validity of the selected 360-VEs in eliciting the desired emotions. Preliminary machine learning analysis was carried out, demonstrating performance on par with the state of the art reported in the affective computing literature for non-immersive modalities. VREED is among the first multimodal VR datasets for emotion recognition using behavioural and physiological signals. VREED is made publicly available on Kaggle. We hope that this contribution encourages other researchers to further utilise VREED to understand emotional responses in VR and ultimately enhance the design of VR experiences in applications where emotion elicitation plays a key role, e.g. healthcare, gaming, and education.
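As an illustration of how the public Kaggle release might be used, the sketch below loads a per-trial feature table with pandas and fits a simple cross-validated baseline classifier with scikit-learn. The file name vreed_features.csv and the label column "quadrant" are hypothetical placeholders, not the dataset's documented layout; consult the Kaggle page for the actual file structure.

    # Minimal usage sketch (not from the paper). File and column names are
    # hypothetical placeholders; the real Kaggle release may be organised differently.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Assumed: one row per participant/VE trial with summary ECG, GSR and
    # eye-tracking features plus a self-reported emotion label.
    df = pd.read_csv("vreed_features.csv")      # placeholder file name

    X = df.drop(columns=["quadrant"])           # "quadrant" is an assumed label column
    y = df["quadrant"]

    # Simple 5-fold cross-validated baseline; the paper's own models and
    # evaluation protocol may differ.
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"Mean 5-fold accuracy: {scores.mean():.3f}")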

Item Type: Article
DOI/Identification number: 10.1145/3495002
Uncontrolled keywords: Dataset, Virtual Reality, ECG, GSR, Affective Computing
Subjects: Q Science > QA Mathematics (inc Computing science) > QA 76 Software, computer programming > QA76.9.H85 Human computer interaction
Divisions: Divisions > Division of Computing, Engineering and Mathematical Sciences > School of Computing
Depositing User: Jim Ang
Date Deposited: 01 Nov 2021 12:32 UTC
Last Modified: 01 Feb 2022 10:01 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/91242 (The current URI for this page, for reference purposes)

University of Kent Author Information

Tabbaa, Luma
Creator's ORCID: https://orcid.org/0000-0002-0947-4988

Searle, Ryan

Mirzaee, Saber

Hossain, Md. Moinul
Creator's ORCID: https://orcid.org/0000-0003-4184-2397

Ang, Chee Siang
Creator's ORCID: https://orcid.org/0000-0002-1109-9689
