The first one-day workshop on Momentary Emotion Elicitation and Capture (MEEC) will take place on April 25th or 26th, 2020 at the ACM CHI 2020 Conference in Honolulu (Hawaii, USA).
Deadline: 11 February 2020
Recognizing human emotions and responding appropriately has the potential to radically change the way we interact with technology. However, to train machines to sensibly detect and recognize human emotions, we need valid emotion ground truths. We face a fundamental challenge concerning temporal resolution in emotion elicitation and measurement: although emotions manifest continuously and can be measured in real time, whether as microexpressions or bodily changes, self-reports cannot match that temporal resolution. Several factors contribute to this mismatch, including differing levels of emotional awareness across individuals, non-linearity in time perception, and the way emotions themselves alter time perception.
In this workshop, we address the challenge of Momentary Emotion Elicitation and Capture (MEEC): eliciting and capturing emotions from individuals continuously and in real time, without adversely affecting the user experience. The goals for this first edition of the one-day CHI 2020 workshop are to:
- Explore and define novel elicitation tasks
- Survey sensing and annotation techniques
- Create a taxonomy of when and where to apply an elicitation method
Call for papers
To train machines to sensibly detect and recognize human emotions, we need valid emotion ground truths. A fundamental challenge here is the momentary emotion elicitation and capture (MEEC) from individuals continuously and in real time, without adversely affecting user experience. In this one-day CHI 2020 workshop, we will (a) explore and define novel elicitation tasks, (b) survey sensing and annotation techniques, and (c) create a taxonomy of when and where to apply an elicitation method.
Topics of interest
We seek contributions across disciplines that explore how emotions can be naturally elicited and captured in the moment. Topics include:
- multi-modal (e.g., film, music) and multi-sensory (e.g., auditory, taste, olfactory) elicitation
- emotion elicitation across domains (e.g., automotive, healthcare)
- elicitation and immersiveness (e.g., AR, VR)
- elicitation over time (e.g., mood)
- ethical considerations
- emotion models (dimensional, discrete)
- annotation modalities (e.g., speech, gestures) and tools (e.g., questionnaires, ESMs)
- devices (e.g., mobile, wearable) and sensors (e.g., RGB / thermal cameras, EEG)
- attention considerations (e.g., interruptions)
Papers Deadline: 11-Feb-2020
Papers Decisions: 28-Feb-2020
Workshop: 25 or 26-April-2020
We invite position papers, posters, and demos that present emotion elicitation and/or capture methods. Each submission will be reviewed by two peer reviewers in a single-blind process and selected for its potential to spark discussion. Submissions must include author names and be 2-8 pages long, including references. Submissions should use the SIGCHI Extended Abstracts format and be submitted as a PDF through EasyChair. Accepted submissions will be made available on the workshop website. At least one author must register for the workshop and for at least one day of the conference.
- Abdallah El Ali - CWI, Amsterdam
- Monica Perusquía-Hernández - NTT, Japan
- Pete Denman - Intel, USA
- Yomna Abdelrahman - BU Munich, Germany
- Mariam Hassib - BU Munich, Germany
- Alexander Meschtscherjakov - Uni Salzburg, Austria
- Denzil Ferreira - Uni Oulu, Finland
- Niels Henze - Uni Regensburg, Germany