Bevilacqua, Fernando (ORCID iD: orcid.org/0000-0001-6479-4856)
Publications (9 of 9)
Bevilacqua, F., Engström, H. & Backlund, P. (2018). Accuracy Evaluation of Remote Photoplethysmography Estimations of Heart Rate in Gaming Sessions with Natural Behavior (1 ed.). In: Adrian David Cheok, Masahiko Inami, Teresa Romão (Ed.), Advances in Computer Entertainment Technology: 14th International Conference, ACE 2017, London, UK, December 14-16, 2017, Proceedings. Springer Publishing Company
Accuracy Evaluation of Remote Photoplethysmography Estimations of Heart Rate in Gaming Sessions with Natural Behavior
2018 (English). In: Advances in Computer Entertainment Technology: 14th International Conference, ACE 2017, London, UK, December 14-16, 2017, Proceedings / [ed] Adrian David Cheok, Masahiko Inami, Teresa Romão, Springer Publishing Company, 2018, 1. Chapter in book (Refereed)
Abstract [en]

Remote photoplethysmography (rPPG) can be used to remotely estimate the heart rate (HR) of users in order to infer their emotional state. However, natural body movements and facial actions significantly impact such techniques, so their reliability in contexts involving natural behavior must be checked. We present an experiment focused on the accuracy evaluation of an established rPPG technique in a gaming context. The technique was applied to estimate the HR of subjects behaving naturally in gaming sessions whose games were carefully designed to be casual-themed, similar to off-the-shelf games, and to have a difficulty level that progresses linearly from a boring to a stressful state. Estimations presented a mean error of 2.99 bpm and a Pearson correlation of r = 0.43 (p < 0.001), however with significant variations among subjects. Our experiment is the first to measure the accuracy of an rPPG technique using boredom/stress-inducing casual games with subjects behaving naturally.
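The accuracy figures above (mean error in bpm and Pearson correlation with a p-value) can be computed from paired HR series with a short comparison routine. The sketch below is a minimal illustration of that evaluation step, not the study's pipeline; the function name hr_accuracy and its toy inputs are assumptions.

```python
# Minimal sketch: comparing rPPG heart-rate estimates against a reference
# sensor using mean absolute error and Pearson correlation. The input
# arrays are hypothetical; the paper's windowing and rPPG method are not
# reproduced here.
import numpy as np
from scipy import stats

def hr_accuracy(hr_rppg, hr_reference):
    """Return (mean absolute error in bpm, Pearson r, p-value)."""
    hr_rppg = np.asarray(hr_rppg, dtype=float)
    hr_reference = np.asarray(hr_reference, dtype=float)
    mae = np.mean(np.abs(hr_rppg - hr_reference))
    r, p = stats.pearsonr(hr_rppg, hr_reference)
    return mae, r, p

if __name__ == "__main__":
    # Toy values in beats per minute, purely illustrative.
    estimated = [72.0, 75.5, 80.2, 90.1, 95.3]
    reference = [70.0, 74.0, 82.0, 88.0, 97.0]
    mae, r, p = hr_accuracy(estimated, reference)
    print(f"MAE = {mae:.2f} bpm, r = {r:.2f}, p = {p:.3f}")
```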

Place, publisher, year, edition, pages
Springer Publishing Company, 2018 Edition: 1
Series
Information Systems and Applications, incl. Internet/Web, and HCI
Keywords
Games, Emotion assessment, Remote photoplethysmography, Computer vision, Affective computing
National Category
Interaction Technologies
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14772 (URN), 10.1007/978-3-319-76270-8 (DOI), 000432607700035 (), 2-s2.0-85043535153 (Scopus ID), 978-3-319-76269-2 (ISBN), 978-3-319-76270-8 (ISBN)
Funder
EU, European Research Council, Project Gamehub Scandinavia
Available from: 2018-02-23. Created: 2018-02-23. Last updated: 2018-06-14.
Bevilacqua, F., Engström, H. & Backlund, P. (2018). Automated analysis of facial cues from videos as a potential method for differentiating stress and boredom of players in games. International Journal of Computer Games Technology, Article ID 8734540.
Automated analysis of facial cues from videos as a potential method for differentiating stress and boredom of players in games
2018 (English). In: International Journal of Computer Games Technology, ISSN 1687-7047, E-ISSN 1687-7055, article id 8734540. Article in journal (Refereed). Published
Abstract [en]

Facial analysis is a promising approach to detect the emotions of players unobtrusively; however, approaches are commonly evaluated in contexts not related to games, or facial cues are derived from models not designed for the analysis of emotions during interaction with games. We present a method for the automated analysis of facial cues from videos as a potential tool for detecting stress and boredom of players behaving naturally while playing games. Computer vision is used to automatically and unobtrusively extract 7 facial features aimed at detecting the activity of a set of facial muscles. Features are mainly based on the Euclidean distance of facial landmarks and do not rely on pre-defined facial expressions, the training of a model or the use of facial standards. An empirical evaluation was conducted on video recordings of an experiment involving games as emotion elicitation sources. Results show statistically significant differences in the values of facial features during boring and stressful periods of gameplay for 5 of the 7 features. We believe our approach is more user-tailored, convenient and better suited for contexts involving games.
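To illustrate the kind of feature the abstract describes (Euclidean distances between facial landmarks rather than pre-defined expressions), the sketch below computes a few normalized landmark distances. The landmark indices assume the common 68-point annotation scheme, and the chosen pairs are illustrative assumptions, not the paper's seven features.

```python
# Minimal sketch: facial features as normalized Euclidean distances between
# landmark pairs, e.g. as produced by a 68-point facial landmark detector.
# The landmark pairs below are illustrative, not the paper's exact features.
import numpy as np

# Hypothetical pairs: eyebrow-to-eye, upper-to-lower inner lip, mouth corners.
FEATURE_PAIRS = [(19, 37), (62, 66), (48, 54)]

def facial_features(landmarks):
    """landmarks: array of shape (68, 2) with (x, y) pixel coordinates."""
    landmarks = np.asarray(landmarks, dtype=float)
    # Normalize by the distance between the outer eye corners (points 36 and
    # 45) so the features are invariant to face size in the frame.
    inter_ocular = np.linalg.norm(landmarks[36] - landmarks[45])
    return np.array([
        np.linalg.norm(landmarks[a] - landmarks[b]) / inter_ocular
        for a, b in FEATURE_PAIRS
    ])
```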

Place, publisher, year, edition, pages
Hindawi Publishing Corporation, 2018
Keywords
games, boredom, stress, facial expression, affective computing, computer vision
National Category
Interaction Technologies
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14771 (URN), 10.1155/2018/8734540 (DOI), 000427897600001 (), 2-s2.0-85046279378 (Scopus ID)
Funder
EU, European Research Council, Project Game Hub Scandinavia
Available from: 2018-02-23. Created: 2018-02-23. Last updated: 2018-05-17. Bibliographically approved.
Bevilacqua, F., Engström, H. & Backlund, P. (2018). Changes in heart rate and facial actions during a gaming session with provoked boredom and stress. Entertainment Computing, 24, 10-20
Changes in heart rate and facial actions during a gaming session with provoked boredom and stress
2018 (English). In: Entertainment Computing, ISSN 1875-9521, E-ISSN 1875-953X, Vol. 24, p. 10-20. Article in journal (Refereed). Published
Abstract [en]

This paper presents an experiment aimed at exploring the relation between facial actions (FA), heart rate (HR) and emotional states, particularly stress and boredom, during the interaction with games. Subjects played three custom-made games with a linear and constant progression from a boring to a stressful state, without pre-defined levels, modes or stopping conditions. Such a configuration gives our experiment a novel approach to the exploration of FA and HR in relation to emotional states, since we can categorize information according to the induced (and theoretically known) emotional states on a user level. The HR data was divided into segments, whose HR means were calculated and compared by period (boring/stressful part of the games). Additionally, the 6 h of recordings were manually analyzed, and FA were annotated and categorized in the same periods. Findings show that variations of HR and FA on a group and on an individual level differ when comparing boring and stressful parts of the gaming sessions. This paper contributes information regarding variations of HR and FA in the context of games, which can potentially be used as input candidates to create user-tailored models for emotion detection with game-based emotion elicitation sources.
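The segment-and-compare analysis can be illustrated as follows: an HR series is split into fixed-length segments, each segment is assigned to the boring or stressful period according to when it occurs, and the per-period means are compared. The segment length, sampling rate and period boundary in this sketch are assumptions, not values from the study.

```python
# Minimal sketch: split an HR series into segments, label each segment as
# "boring" or "stressful" by when it occurs, and compare period means.
# Segment length and the boring/stressful boundary are illustrative.
import numpy as np

def period_means(hr_series, sample_rate_hz=1.0, segment_s=30, boundary_s=300):
    """Return the mean of per-segment HR means for each period."""
    hr = np.asarray(hr_series, dtype=float)
    seg_len = int(segment_s * sample_rate_hz)
    n_segments = len(hr) // seg_len
    means = {"boring": [], "stressful": []}
    for i in range(n_segments):
        segment = hr[i * seg_len:(i + 1) * seg_len]
        start_time_s = i * segment_s
        period = "boring" if start_time_s < boundary_s else "stressful"
        means[period].append(segment.mean())
    return {period: float(np.mean(values))
            for period, values in means.items() if values}
```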

Place, publisher, year, edition, pages
Elsevier, 2018
Keywords
Games, Boredom, Stress, Facial expression, Multifactorial, Heart rate
National Category
Human Computer Interaction
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14267 (URN), 10.1016/j.entcom.2017.10.004 (DOI), 000418497800002 (), 2-s2.0-85032270414 (Scopus ID)
Funder
Interreg Öresund-Kattegat-Skagerrak, project Game Hub Scandinavia
Available from: 2017-10-30. Created: 2017-10-30. Last updated: 2018-02-14. Bibliographically approved.
Venson, J. E., Bevilacqua, F., Berni, J., Onuki, F. & Maciel, A. (2018). Diagnostic concordance between mobile interfaces and conventional workstations for emergency imaging assessment. International Journal of Medical Informatics, 113, 1-8
Diagnostic concordance between mobile interfaces and conventional workstations for emergency imaging assessment
2018 (English). In: International Journal of Medical Informatics, ISSN 1386-5056, E-ISSN 1872-8243, Vol. 113, p. 1-8. Article in journal (Refereed). Published
Abstract [en]

Introduction

Mobile devices and software are now available with sufficient computing power, speed and complexity to allow for real-time interpretation of radiology exams. In this paper, we perform a multivariable user study that investigates concordance of image-based diagnoses provided using mobile devices on the one hand and conventional workstations on the other hand.

Methods

We performed a between-subjects task-analysis using CT, MRI and radiography datasets. Moreover, we investigated the adequacy of the screen size, image quality, usability and the availability of the tools necessary for the analysis. Radiologists, members of several teams, participated in the experiment under real work conditions. A total of 64 studies with 93 main diagnoses were analyzed.

Results

Our results showed that 56 cases were classified with complete concordance (87.69%), 5 cases with almost complete concordance (7.69%) and 1 case (1.56%) with partial concordance. Only 2 studies presented discordance between the reports (3.07%). The main cause of those disagreements was the lack of a multiplanar reconstruction tool in the mobile viewer. Screen size and image quality had no direct impact on the mobile diagnosis process.

Conclusion

We concluded that for images from emergency modalities, a mobile interface provides accurate interpretation and swift response, which could benefit patients' healthcare.

Place, publisher, year, edition, pages
Elsevier, 2018
Keywords
Mobile diagnosis, Radiology, Medical imaging
National Category
Medical and Health Sciences; Health Care Service and Management, Health Policy and Services and Health Economy; Human Computer Interaction
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14770 (URN), 10.1016/j.ijmedinf.2018.01.019 (DOI), 000431199300001 (), 29602428 (PubMedID), 2-s2.0-85042208743 (Scopus ID)
Available from: 2018-02-23. Created: 2018-02-23. Last updated: 2018-08-31. Bibliographically approved.
Bevilacqua, F. (2018). Game-calibrated and user-tailored remote detection of emotions: A non-intrusive, multifactorial camera-based approach for detecting stress and boredom of players in games. (Doctoral dissertation). Skövde: University of Skövde
Game-calibrated and user-tailored remote detection of emotions: A non-intrusive, multifactorial camera-based approach for detecting stress and boredom of players in games
2018 (English). Doctoral thesis, monograph (Other academic)
Abstract [en]

Questionnaires and physiological measurements are the most common approaches used to obtain data for emotion estimation in the field of human-computer interaction (HCI) and games research. Both approaches interfere with the natural behavior of users. Initiatives based on computer vision and the remote extraction of user signals for emotion estimation exist; however, they are limited. Experiments in such initiatives have been performed under extremely controlled situations with few game-related stimuli. Users had a passive role with limited possibilities for interaction or emotional involvement, in contrast to game-based emotion stimuli, where users take an active role in the process, making decisions and directly interacting with the media. Previous works also focus on predictive models based on a group perspective. As a consequence, a model is usually trained from the data of several users, which in practice describes the average behavior of the group, excluding or diluting key individualities of each user. In that light, there is a lack of initiatives focusing on non-obtrusive, user-tailored emotion detection models, in particular regarding stress and boredom, within the context of games research based on emotion data generated from game stimuli. This research aims to fill that gap, providing the HCI and games research communities with an emotion detection process that can be used to remotely study users' emotions in a non-obtrusive way within the context of games.

The main knowledge contribution of this research is a novel process for emotion detection that is non-obtrusive, user-tailored and game-based. It uses remotely acquired signals, namely heart rate (HR) and facial actions (FA), to create a user-tailored model, i.e. a trained neural network, able to detect the emotional states of boredom and stress of a given subject. The process is automated and relies on computer vision and remote photoplethysmography (rPPG) to acquire user signals, so that specialized equipment, e.g. HR sensors, is not required and only an ordinary camera is needed. The approach comprises two phases: training (or calibration) and testing. In the training phase, a model is trained using a user-tailored approach, i.e. data from a given subject playing calibration games is used to create a model for that subject. Calibration games are a novel emotion elicitation material introduced by this research. These games are carefully designed to present a difficulty level that constantly and linearly progresses over time, without a pre-defined stopping point. They induce emotional states of boredom and stress, accounting for particularities at an individual level. Finally, the testing phase occurs in a game session involving a subject playing any ordinary, non-calibration game, e.g. Super Mario. During the testing phase, the subject's signals are remotely acquired and fed into the model previously trained for that particular subject. The model subsequently outputs the estimated emotional state of that subject for that particular testing game.

The method for emotion detection proposed in this thesis has been conceived on the basis of established theories and has been carefully evaluated in experimental setups. Results show a statistically significant classification of emotional states with a mean accuracy of 61.6%. Finally, this thesis presents a series of systematic evaluations conducted in order to understand the relation between psychophysiological signals and emotions. Facial behavior and physiological signals, i.e. HR, are analyzed and discussed as indicators of emotional states. This research reveals that individualities can be detected regarding facial activity, e.g. an increased number of facial actions during the stressful part of games. Regarding physiological signals, findings are aligned with and reinforce previous research indicating a higher mean HR during stressful situations in a gaming context. Results also suggest that changes in HR during gaming sessions are a promising indicator of stress. The method for the remote detection of emotions presented in this thesis is feasible, but does have limitations. Nevertheless, it is a solid initiative to move away from questionnaires and physical sensors towards a non-obtrusive, remote-based solution for the evaluation of user emotions.
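The two-phase idea (per-subject calibration followed by testing on an ordinary game) can be sketched with a small classifier trained only on that subject's calibration data. The feature layout and the use of scikit-learn's MLPClassifier are illustrative assumptions; the thesis's own network architecture and feature set are not reproduced here.

```python
# Minimal sketch of the calibration/testing idea: train a per-subject model
# on feature windows from the calibration games (labeled boring/stressful by
# game progression) and apply it to windows from an ordinary testing game.
# Feature layout and classifier choice are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def train_user_model(calibration_windows, labels):
    """calibration_windows: (n_windows, n_features) of HR/facial features
    from one subject; labels: 0 = boring, 1 = stressful."""
    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
    model.fit(np.asarray(calibration_windows), np.asarray(labels))
    return model

def detect_emotion(model, testing_windows):
    """Return per-window predictions (0 = boring, 1 = stressful)."""
    return model.predict(np.asarray(testing_windows))
```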

Abstract [sv]

Questionnaires and sensor-based physiological measurements are currently the most common methods for collecting data that can be used to identify users' emotional states in human-computer interaction and games research. These methods, however, affect users' natural behavior, since they are either intrusive during the actual use situation (for example EEG and ECG sensors) or are carried out only after it. Newer methods try to reduce the direct impact on the user by collecting user data with computer vision and various remote-sensing tools (for example eye-tracking), but these are currently limited. Many of these methods can only be used in carefully controlled situations with stimuli from experiment-specific software. To give the measurement instruments clean data in these experimental situations, users often have relatively limited possibilities to interact with purpose-built games, which makes it doubtful that they represent the complexity of real gaming situations. The methods also often rely on prediction models based on average data from large user groups, which means that individual peculiarities of users are often overlooked. With this in mind, there is a great need for new tools and measurement methods that are both non-intrusive and user-specific. This thesis presents a research project in which such a tool is developed and evaluated.

The main knowledge contribution of this research is a novel process for emotion measurement that is non-intrusive, user-specific and game-based. The process uses remote acquisition of heart rate (HR) and facial muscle movements to train a user-specific neural network that can identify whether the user is bored or stressed. The solution is fully automated, uses computer vision and photoplethysmography applied to video recordings to collect user data, and requires no specialized equipment (for example HR sensors). The process consists of two phases: a training (or calibration) phase and a testing phase. In the training phase, a model of a user's emotional response is constructed and trained while the user plays specially designed calibration games. These calibration games are developed to elicit different emotional responses, in the form of stress and boredom, by exposing users to challenges of varying difficulty. In the testing phase, the user plays an ordinary game (for example Super Mario). During this session, physiological user data is acquired remotely and processed by the previously constructed model, which is tailored to interpret data from that particular user. The model finally produces an estimate of the user's emotional state during the play session.

The method for emotion measurement proposed in this thesis is based on previously established theories and has been evaluated in a series of controlled experiments. The evaluation results show a statistically significant identification of emotional states with an accuracy of 61.6%. In addition to the presentation of the developed tool for emotion measurement, a series of systematic evaluations of the relationship between psychophysiological signals and emotions is presented. The use of facial muscles and physiological signals (for example HR) is analyzed, and their role as indicators of emotional states is discussed. This research shows that individual peculiarities in people's facial expressions can be identified (for example an increased number and intensity of facial actions during stress-inducing game segments). Regarding physiological signals, the results are consistent with, and strengthen, previous research that draws parallels between HR and feelings of stress in gaming situations. The method for remote measurement of emotional states presented in this thesis is usable, but has certain limitations. Nevertheless, it is a promising first step away from questionnaires and physically intrusive sensors towards remote-sensing solutions for assessing users' emotional states.

Place, publisher, year, edition, pages
Skövde: University of Skövde, 2018. p. 170
Series
Dissertation Series ; 27
Keywords
human-computer interaction, computer vision, non-obtrusive, remote sensing, affective computing, games, rPPG
National Category
Human Computer Interaction
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-16347 (URN), 978-91-984187-9-8 (ISBN)
Public defence
2018-11-19, Insikten, Portalen, Skövde, 13:00 (English)
Note

This work has been performed with support from: CNPq, Conselho Nacional de Desenvolvimento Científico e Tecnológico - Brasil; University of Skövde; EU Interreg ÖKS project Game Hub Scandinavia; UFFS, Federal University of Fronteira Sul.

Available from: 2018-10-30. Created: 2018-10-29. Last updated: 2018-10-30. Bibliographically approved.
Bevilacqua, F. (2017). Game-calibrated and user-tailored remote detection of emotions: A non-intrusive, multifactorial camera-based approach for detecting stress and boredom of players in games.
Game-calibrated and user-tailored remote detection of emotions: A non-intrusive, multifactorial camera-based approach for detecting stress and boredom of players in games
2017 (English). Report (Other academic)
Abstract [en]

Questionnaires and physiological measurements are the most common approaches used to obtain data for emotion estimation in the field of human-computer interaction (HCI) and games research. Both approaches interfere with the natural behavior of users, which affects any research procedure. Initiatives based on computer vision and the remote extraction of user signals for emotion estimation exist; however, they are limited. Experiments in such initiatives have been performed under extremely controlled situations with few game-related stimuli. Users had a passive role with limited possibilities for interaction or emotional involvement, unlike game-based emotion stimuli, where users take an active role in the process, making decisions and directly interacting with the media. Previous works also focus on predictive models based on a group perspective. As a consequence, a model is usually trained from the data of several users, which in practice describes the average behavior of the group, excluding or diluting key individualities of each user. In that light, there is a lack of initiatives focusing on non-obtrusive, user-tailored emotion detection models, in particular regarding stress and boredom, within the context of games research based on emotion data generated from game stimuli.

This thesis proposal presents research that aims to fill that gap, providing the HCI and games research communities with an emotion detection process, instantiated as a software tool, which can be used to remotely study users' emotions in a non-obtrusive way within the context of games. The main knowledge contribution of this research is a novel process for emotion detection that is remote (non-contact) and constructed from a game-based, multifactorial, user-tailored calibration phase. The process relies on computer vision and remote photoplethysmography (rPPG) to read user signals, e.g. heart rate (HR) and facial actions, without physical contact during the interaction with games, in order to detect the stress/boredom levels of users. The approach is automated and uses an ordinary camera to collect information, so specialized equipment, e.g. HR sensors, is not required.

Current results of this research show that individualities can be detected regarding facial activity, e.g. an increased number of facial actions during the stressful part of games. Regarding physiological signals, findings are aligned with and reinforce previous research that indicates a higher mean HR during stressful situations in a gaming context. The findings also suggest that changes in HR during gaming sessions are a promising indicator of stress, which can be incorporated into a model aimed at emotion detection. The literature reviews, the experiments conducted so far and the planned future tasks support the idea of using a set of signals, e.g. facial activity, body movement and HR estimations, as sources of information in a multifactorial analysis for the identification of stress and boredom in games. This will produce a novel user-tailored approach for emotion detection focused on the behavioral particularities of each user instead of the average group pattern. The proposed approach will be implemented as a software tool that can be used by researchers and practitioners in games research.
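As a rough illustration of camera-only signal acquisition, the sketch below extracts the mean green-channel value inside a detected face region for each video frame, which is the raw trace many rPPG methods start from. It is not the specific rPPG technique used in this research, and the OpenCV Haar-cascade face detector is an assumption of the example.

```python
# Minimal sketch: extract a raw rPPG trace as the mean green-channel value
# inside a detected face region, frame by frame. This is only the common
# starting point of many rPPG methods, not the technique used in this work.
import cv2
import numpy as np

def green_channel_signal(video_path):
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    capture = cv2.VideoCapture(video_path)
    signal = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue  # skip frames where no face is found
        x, y, w, h = faces[0]
        roi = frame[y:y + h, x:x + w]
        signal.append(float(np.mean(roi[:, :, 1])))  # green channel (BGR)
    capture.release()
    return np.array(signal)
```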

Publisher
p. 127
Keywords
games, human-computer interaction, emotions, affective computing, rPPG, stress, boredom, remote
National Category
Human Computer Interaction
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14042 (URN)
Note

Thesis proposal, PhD programme, University of Skövde

Available from: 2017-08-25. Created: 2017-08-25. Last updated: 2018-01-13. Bibliographically approved.
Eduardo Venson, J., Bevilacqua, F., Onuki, F., Cordeiro d’Ornellas, M. & Anderson, M. (2016). Efficient medical image access in diagnostic environments with limited resources. Research on Biomedical Engineering, 32(4), 347-357
Efficient medical image access in diagnostic environments with limited resources
2016 (English). In: Research on Biomedical Engineering, ISSN 2446-4732, Vol. 32, no 4, p. 347-357. Article in journal (Refereed). Published
Abstract [en]

Introduction

A medical application running outside the workstation environment has to deal with several constraints, such as reduced available memory and low network bandwidth. The aim of this paper is to present an approach to optimize the data flow for fast image transfer and visualization on mobile devices and remote stationary devices.

Methods

We use a combination of client- and server-side procedures to reduce the amount of information transferred by the application. Our approach was implemented on top of a commercial PACS and evaluated through user experiments with specialists performing typical diagnosis tasks. The quality of the system outcome was measured in relation to the accumulated amount of network data transferred and the amount of memory used on the host device. In addition, the system's quality of use (usability) was measured through participants' feedback.

Results

Contrary to previous approaches, ours keeps the application within the memory constraints, minimizing data transfer whenever possible and allowing the application to run on a variety of devices. Moreover, it does so without sacrificing the user experience. Experimental data indicate that over 90% of the users did not notice any delays or degraded image quality, and when they did, these did not impact the clinical decisions.

Conclusion

The combined activities and orchestration of our methods allow the image viewer to run in resource-constrained environments, such as those with low network bandwidth or little available memory. These results demonstrate the ability to explore the use of mobile devices as a support tool in the medical workflow.

Place, publisher, year, edition, pages
Sociedade Brasileira de Engenharia Biomédica, 2016
Keywords
mHealth, Teleradiology, Radiology, Mobile viewer
National Category
Biomedical Laboratory Science/Technology
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-13451 (URN), 10.1590/2446-4740.05915 (DOI), 2-s2.0-85012299138 (Scopus ID)
Projects
Appification of Medical Reports
Available from: 2017-03-24. Created: 2017-03-24. Last updated: 2017-11-27. Bibliographically approved.
Bevilacqua, F., Backlund, P. & Engström, H. (2016). Variations of Facial Actions While Playing Games with Inducing Boredom and Stress. In: 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-Games). Paper presented at International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), Barcelona, Spain, September 7-9, 2016. IEEE
Variations of Facial Actions While Playing Games with Inducing Boredom and Stress
2016 (English). In: 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), IEEE, 2016. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents an experiment aimed at empirically exploring the variations of facial actions (FA) during gaming sessions with induced boredom and stress. Twenty adults of different ages and gaming experience played three games while being recorded by a video camera and monitored by a heart rate sensor. The games were carefully designed to have a linear progression from a boring to a stressful state. Self-reported answers indicate that participants perceived the games as boring at the beginning and stressful at the end. The 6 hours of recordings of all subjects were manually analyzed and FA were annotated. We annotated FA that appeared in the recordings at least twice; annotations were categorized by the period in which they occurred (boring/stressful part of the games) and analyzed on a group and on an individual level. Group-level analysis revealed that FA patterns were related to no more than 25% of the subjects. The individual-level analysis revealed particular patterns for 50% of the subjects. More FA annotations were made during the stressful part of the games. We conclude that, for the context of our experiment, FA provide an unclear foundation for the detection of boredom/stress states when observed from a group-level perspective, while the individual-level perspective might produce more information.

Place, publisher, year, edition, pages
IEEE, 2016
Series
International Conference on Games and Virtual Worlds for Serious Applications, ISSN 2474-0470
Keywords
Games, Stress, Heart rate, Context, Physiology, Cameras, Predictive models
National Category
Computer Engineering
Research subject
Technology; Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-13071 (URN), 10.1109/VS-GAMES.2016.7590374 (DOI), 000386980000041 (), 2-s2.0-85013187616 (Scopus ID), 978-1-5090-2722-4 (ISBN), 978-1-5090-2723-1 (ISBN)
Conference
International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), Barcelona, Spain, September 7-9, 2016
Available from: 2016-11-03. Created: 2016-11-03. Last updated: 2018-08-30. Bibliographically approved.
Bevilacqua, F., Backlund, P. & Engström, H. (2015). Proposal for Non-contact Analysis of Multimodal Inputs to Measure Stress Level in Serious Games. In: Per Backlund, Henrik Engström & Fotis Liarokapis (Ed.), VS-Games 2015: 7th International Conference on Games and Virtual Worlds for Serious Applications. Paper presented at IEEE 7th International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), Skövde, September 16-18, 2015 (pp. 171-174). Red Hook, NY: IEEE Computer Society
Proposal for Non-contact Analysis of Multimodal Inputs to Measure Stress Level in Serious Games
2015 (English). In: VS-Games 2015: 7th International Conference on Games and Virtual Worlds for Serious Applications / [ed] Per Backlund, Henrik Engström & Fotis Liarokapis, Red Hook, NY: IEEE Computer Society, 2015, p. 171-174. Conference paper, Published paper (Refereed)
Abstract [en]

The process of monitoring user emotions in serious games or human-computer interaction is usually obtrusive. The workflow is typically based on sensors that are physically attached to the user, and sometimes those sensors completely disturb the user experience, such as finger sensors that prevent the use of the keyboard/mouse. This short paper presents techniques used to remotely measure different signals produced by a person, e.g. heart rate, through the use of a camera and computer vision techniques. The analysis of a combination of such signals (multimodal input) can be used in a variety of applications, such as emotion assessment and the measurement of cognitive stress. We present a research proposal for the measurement of players' stress levels based on a non-contact analysis of multimodal user inputs. Our main contribution is a survey of commonly used methods to remotely measure user input signals related to stress assessment.

Place, publisher, year, edition, pages
Red Hook, NY: IEEE Computer Society, 2015
Keywords
Serious Games, Emotion Assessment, Remote Sensing, Computer Vision, Multimodal Input, Affective Computing
National Category
Computer Sciences; Human Computer Interaction; Computer Vision and Robotics (Autonomous Systems)
Research subject
Technology; Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-11588 (URN), 10.1109/VS-GAMES.2015.7295783 (DOI), 000380426500026 (), 2-s2.0-84954479671 (Scopus ID), 978-1-4799-8102-1 (ISBN), 978-1-4799-8101-4 (ISBN)
Conference
IEEE 7th International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), Skövde, September 16-18, 2015
Available from: 2015-10-08. Created: 2015-10-08. Last updated: 2018-08-03. Bibliographically approved.