Bevilacqua, Fernando (ORCID: orcid.org/0000-0001-6479-4856)
Publications (10 of 11)
Bevilacqua, F., Engström, H. & Backlund, P. (2019). Game-Calibrated and User-Tailored Remote Detection of Stress and Boredom in Games. Sensors, 19(13), 1-43, Article ID 2877.
2019 (English) In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 19, no. 13, pp. 1-43, article id 2877. Article in journal (Refereed) Published
Abstract [en]

Emotion detection based on computer vision and remote extraction of user signals commonly rely on stimuli where users have a passive role with limited possibilities for interaction or emotional involvement, e.g., images and videos. Predictive models are also trained on a group level, which potentially excludes or dilutes key individualities of users. We present a non-obtrusive, multifactorial, user-tailored emotion detection method based on remotely estimated psychophysiological signals. A neural network learns the emotional profile of a user during the interaction with calibration games, a novel game-based emotion elicitation material designed to induce emotions while accounting for particularities of individuals. We evaluate our method in two experiments (n = 20 and n = 62) with mean classification accuracy of 61.6%, which is statistically significantly better than chance-level classification. Our approach and its evaluation present unique circumstances: our model is trained on one dataset (calibration games) and tested on another (evaluation game), while preserving the natural behavior of subjects and using remote acquisition of signals. Results of this study suggest our method is feasible and an initiative to move away from questionnaires and physical sensors into a non-obtrusive, remote-based solution for detecting emotions in a context involving more naturalistic user behavior and games.
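
The abstract above describes a user-tailored pipeline: signals remotely estimated during calibration games are used to train a per-user model, which is then applied to an evaluation game. The sketch below illustrates that idea only in broad strokes; it is not the authors' implementation, and the feature layout, label encoding and network size are assumptions for illustration (Python with scikit-learn).

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler

    def train_user_model(calib_features, calib_labels):
        """Train a per-user model on calibration-game data.
        calib_features: (n_windows, n_signals) remotely estimated signals,
        e.g. HR- and facial-action-derived features (hypothetical layout).
        calib_labels: 0 = boredom, 1 = stress (assumed encoding)."""
        scaler = StandardScaler().fit(calib_features)
        model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        model.fit(scaler.transform(calib_features), calib_labels)
        return scaler, model

    def classify_evaluation_game(scaler, model, eval_features):
        """Apply the user's own model to windows from the evaluation game."""
        return model.predict(scaler.transform(eval_features))

    # Synthetic example for one user
    rng = np.random.default_rng(0)
    X_calib, y_calib = rng.normal(size=(120, 8)), rng.integers(0, 2, 120)
    scaler, model = train_user_model(X_calib, y_calib)
    print(classify_evaluation_game(scaler, model, rng.normal(size=(10, 8))))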

Place, publisher, year, edition, pages
MDPI, 2019
Keywords
human–computer interaction, games, affective computing, remote photoplethysmography
National subject category
Interaction Technologies
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-17485 (URN)10.3390/s19132877 (DOI)000477045000038 ()31261716 (PubMedID)2-s2.0-85069267193 (Scopus ID)
Available from: 2019-07-29 Created: 2019-07-29 Last updated: 2019-11-08 Bibliographically reviewed
Fank, E., Bevilacqua, F., Duarte, D. & Scapinello, A. (2019). INSIDe: Image recognition tool aimed at helping visually impaired people contextualize indoor environments. Revista Brasileira de Computação Aplicada, 11(3), 59-71
2019 (English) In: Revista Brasileira de Computação Aplicada, ISSN 2176-6649, Vol. 11, no. 3, pp. 59-71. Article in journal (Refereed) Published
Abstract [en]

Visually impaired (VI) people face a set of challenges when trying to orient and contextualize themselves. Computer vision and mobile devices can be valuable tools to help them improve their quality of life. This work presents a tool based on computer vision and image recognition to assist VI people in better contextualizing themselves indoors. The tool works as follows: the user takes a picture rho using a mobile application; rho is sent to the server; rho is compared to a database of previously taken pictures; the server returns the metadata of the database image that is most similar to rho; finally, the mobile application gives audio feedback based on the received metadata. The similarity test between the database images and rho is based on a nearest-neighbor search over key points extracted from the images with SIFT descriptors. Three experiments are presented to support the feasibility of the tool. We believe our solution is a low-cost, convenient approach that can leverage existing IT infrastructure, e.g. wireless networks, and does not require any physical adaptation of the environment where it will be used.
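
Since the abstract outlines the matching step (SIFT key points plus a nearest-neighbor search between the query picture rho and the database images), a generic sketch of that technique is given below. It uses OpenCV's SIFT implementation and Lowe's ratio test and is an illustration under those assumptions, not the INSIDe code.

    import cv2

    def match_score(query_path, db_path, ratio=0.75):
        """Number of good SIFT matches between the query photo and one
        database image; the database image with the highest score would
        supply the metadata used for the audio feedback."""
        query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
        candidate = cv2.imread(db_path, cv2.IMREAD_GRAYSCALE)
        sift = cv2.SIFT_create()
        _, des_q = sift.detectAndCompute(query, None)
        _, des_c = sift.detectAndCompute(candidate, None)
        if des_q is None or des_c is None:
            return 0
        # For each query descriptor, find its two nearest neighbours in the
        # candidate image and keep only clearly unambiguous matches.
        matches = cv2.BFMatcher().knnMatch(des_q, des_c, k=2)
        good = [pair[0] for pair in matches
                if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
        return len(good)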

Place, publisher, year, edition, pages
UNIV PASSO FUNDO, 2019
Keywords
Android system, computer vision, SIFT, Visually impaired
National subject category
Computer Sciences
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-18024 (URN)10.5335/rbca.v11i3.9455 (DOI)000493127600006 ()
Available from: 2019-12-19 Created: 2019-12-19 Last updated: 2019-12-23 Bibliographically reviewed
Bevilacqua, F., Engström, H. & Backlund, P. (2018). Accuracy Evaluation of Remote Photoplethysmography Estimations of Heart Rate in Gaming Sessions with Natural Behavior (1 ed.). In: Adrian David Cheok, Masahiko Inami, Teresa Romão (Ed.), Advances in Computer Entertainment Technology: 14th International Conference, ACE 2017, London, UK, December 14-16, 2017, Proceedings. Paper presented at 14th International Conference, ACE 2017, London, UK, December 14-16, 2017 (pp. 508-530). Springer
2018 (English) In: Advances in Computer Entertainment Technology: 14th International Conference, ACE 2017, London, UK, December 14-16, 2017, Proceedings / [ed] Adrian David Cheok, Masahiko Inami, Teresa Romão, Springer, 2018, 1, pp. 508-530. Conference paper, Published paper (Refereed)
Abstract [en]

Remote photoplethysmography (rPPG) can be used to remotely estimate the heart rate (HR) of users to infer their emotional state. However, natural body movement and facial actions of users significantly impact such techniques, so their reliability within contexts involving natural behavior must be checked. We present an experiment focused on the accuracy evaluation of an established rPPG technique in a gaming context. The technique was applied to estimate the HR of subjects behaving naturally in gaming sessions whose games were carefully designed to be casual-themed, similar to off-the-shelf games, and to have a difficulty level that linearly progresses from a boring to a stressful state. Estimations presented a mean error of 2.99 bpm and a Pearson correlation of r = 0.43, p < 0.001, however with significant variations among subjects. Our experiment is the first to measure the accuracy of an rPPG technique using boredom/stress-inducing casual games with subjects behaving naturally.
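
For reference, the two accuracy figures quoted above (mean error in bpm and Pearson correlation) correspond to standard computations such as the sketch below; the arrays are illustrative, not the study's data.

    import numpy as np
    from scipy.stats import pearsonr

    hr_rppg = np.array([72.1, 75.4, 80.2, 78.9, 85.0])  # remote estimates (bpm)
    hr_ref = np.array([70.5, 74.0, 81.0, 80.2, 86.3])   # contact sensor (bpm)

    mean_error = np.mean(np.abs(hr_rppg - hr_ref))      # mean absolute error in bpm
    r, p_value = pearsonr(hr_rppg, hr_ref)
    print(f"mean error = {mean_error:.2f} bpm, r = {r:.2f}, p = {p_value:.3f}")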

Place, publisher, year, edition, pages
Springer, 2018. Edition: 1
Series
Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349 ; 10714
Keywords
Games, Emotion assessment, Remote photoplethysmography, Computer vision, Affective computing
National subject category
Interaction Technologies
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14772 (URN)10.1007/978-3-319-76270-8_35 (DOI)000432607700035 ()2-s2.0-85043535153 (Scopus ID)978-3-319-76269-2 (ISBN)978-3-319-76270-8 (ISBN)
Conference
14th International Conference, ACE 2017, London, UK, December 14-16, 2017
Research funder
EU, European Research Council, Project Gamehub Scandinavia
Note

Also part of the Information Systems and Applications, incl. Internet/Web, and HCI book sub series (LNISA, volume 10714)

Available from: 2018-02-23 Created: 2018-02-23 Last updated: 2019-08-23 Bibliographically reviewed
Bevilacqua, F., Engström, H. & Backlund, P. (2018). Automated analysis of facial cues from videos as a potential method for differentiating stress and boredom of players in games. International Journal of Computer Games Technology, Article ID 8734540.
2018 (English) In: International Journal of Computer Games Technology, ISSN 1687-7047, E-ISSN 1687-7055, article id 8734540. Article in journal (Refereed) Published
Abstract [en]

Facial analysis is a promising approach to detect emotions of players unobtrusively; however, approaches are commonly evaluated in contexts not related to games, or facial cues are derived from models not designed for the analysis of emotions during interactions with games. We present a method for automated analysis of facial cues from videos as a potential tool for detecting stress and boredom of players behaving naturally while playing games. Computer vision is used to automatically and unobtrusively extract 7 facial features aimed at detecting the activity of a set of facial muscles. Features are mainly based on the Euclidean distance of facial landmarks and do not rely on pre-defined facial expressions, training of a model or the use of facial standards. An empirical evaluation was conducted on video recordings of an experiment involving games as emotion elicitation sources. Results show statistically significant differences in the values of facial features during boring and stressful periods of gameplay for 5 of the 7 features. We believe our approach is more user-tailored, convenient and better suited for contexts involving games.
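
As the abstract notes, the features are mainly Euclidean distances between facial landmarks. A minimal sketch of such a feature is shown below, assuming the common 68-point landmark scheme (e.g. as produced by dlib); the landmark pairs and the normalization are illustrative and do not reproduce the paper's seven features.

    import numpy as np

    def landmark_distance(landmarks, i, j):
        """Euclidean distance between landmarks i and j.
        landmarks: (68, 2) array of (x, y) pixel coordinates."""
        return float(np.linalg.norm(landmarks[i] - landmarks[j]))

    def mouth_width_feature(landmarks):
        # Distance between the mouth corners (points 48 and 54 in the
        # 68-point scheme), normalized by the jaw width (points 0 and 16)
        # so the value is comparable across frames and subjects.
        return landmark_distance(landmarks, 48, 54) / landmark_distance(landmarks, 0, 16)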

Place, publisher, year, edition, pages
Hindawi Publishing Corporation, 2018
Keywords
games, boredom, stress, facial expression, affective computing, computer vision
National subject category
Interaction Technologies
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14771 (URN)10.1155/2018/8734540 (DOI)000427897600001 ()2-s2.0-85046279378 (Scopus ID)
Research funder
EU, European Research Council, Project Game Hub Scandinavia
Available from: 2018-02-23 Created: 2018-02-23 Last updated: 2018-05-17 Bibliographically reviewed
Bevilacqua, F., Engström, H. & Backlund, P. (2018). Changes in heart rate and facial actions during a gaming session with provoked boredom and stress. Entertainment Computing, 24, 10-20
2018 (English) In: Entertainment Computing, ISSN 1875-9521, E-ISSN 1875-953X, Vol. 24, pp. 10-20. Article in journal (Refereed) Published
Abstract [en]

This paper presents an experiment aimed at exploring the relation between facial actions (FA), heart rate (HR) and emotional states, particularly stress and boredom, during the interaction with games. Subjects played three custom-made games with a linear and constant progression from a boring to a stressful state, without pre-defined levels, modes or stopping conditions. Such a configuration gives our experiment a novel approach for the exploration of FA and HR regarding their connection to emotional states, since we can categorize information according to the induced (and theoretically known) emotional states on a user level. The HR data was divided into segments, whose HR means were calculated and compared between periods (the boring/stressful parts of the games). Additionally, the 6 h of recordings were manually analyzed, and FA were annotated and categorized in the same periods. Findings show that variations of HR and FA on a group and on an individual level are different when comparing the boring and stressful parts of the gaming sessions. This paper contributes information regarding variations of HR and FA in the context of games, which can potentially be used as input candidates to create user-tailored models for emotion detection with game-based emotion elicitation sources.
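
The segment-based comparison described above (HR divided into segments whose means are compared between the boring and stressful periods) can be pictured with the sketch below; the segment length, the half/half split and the use of a t-test are assumptions for illustration, not the paper's exact procedure.

    import numpy as np
    from scipy import stats

    def compare_periods(hr, segment_len=30):
        """hr: 1-D array of HR samples from one gaming session, assumed to
        progress linearly from a boring to a stressful state."""
        n_segments = len(hr) // segment_len
        seg_means = hr[:n_segments * segment_len].reshape(n_segments, segment_len).mean(axis=1)
        boring, stressful = np.array_split(seg_means, 2)
        t_stat, p_value = stats.ttest_ind(boring, stressful)
        return boring.mean(), stressful.mean(), p_value

    rng = np.random.default_rng(1)
    session = np.concatenate([rng.normal(70, 3, 900), rng.normal(78, 3, 900)])
    print(compare_periods(session))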

Place, publisher, year, edition, pages
Elsevier, 2018
Keywords
Games, Boredom, Stress, Facial expression, Multifactorial, Heart rate
National subject category
Human Computer Interaction (Interaction Design)
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14267 (URN)10.1016/j.entcom.2017.10.004 (DOI)000418497800002 ()2-s2.0-85032270414 (Scopus ID)
Research funder
Interreg Öresund-Kattegat-Skagerrak, project Game Hub Scandinavia
Available from: 2017-10-30 Created: 2017-10-30 Last updated: 2018-02-14 Bibliographically reviewed
Venson, J. E., Bevilacqua, F., Berni, J., Onuki, F. & Maciel, A. (2018). Diagnostic concordance between mobile interfaces and conventional workstations for emergency imaging assessment. International Journal of Medical Informatics, 113, 1-8
2018 (English) In: International Journal of Medical Informatics, ISSN 1386-5056, E-ISSN 1872-8243, Vol. 113, pp. 1-8. Article in journal (Refereed) Published
Abstract [en]

Introduction

Mobile devices and software are now available with sufficient computing power, speed and complexity to allow for real-time interpretation of radiology exams. In this paper, we perform a multivariable user study that investigates concordance of image-based diagnoses provided using mobile devices on the one hand and conventional workstations on the other hand.

Methods

We performed a between-subjects task-analysis using CT, MRI and radiography datasets. Moreover, we investigated the adequacy of the screen size, image quality, usability and the availability of the tools necessary for the analysis. Radiologists, members of several teams, participated in the experiment under real work conditions. A total of 64 studies with 93 main diagnoses were analyzed.

Results

Our results showed that 56 cases were classified with complete concordance (87.69%), 5 cases with almost complete concordance (7.69%) and 1 case (1.56%) with partial concordance. Only 2 studies presented discordance between the reports (3.07%). The main reason for those disagreements was the lack of a multiplanar reconstruction tool in the mobile viewer. Screen size and image quality had no direct impact on the mobile diagnosis process.

Conclusion

We concluded that for images from emergency modalities, a mobile interface provides accurate interpretation and swift response, which could benefit patients' healthcare.

Place, publisher, year, edition, pages
Elsevier, 2018
Keywords
Mobile diagnosis, Radiology, Medical imaging
National subject category
Medical and Health Sciences; Health Care Service and Management, Health Policy and Services and Health Economy; Human Computer Interaction (Interaction Design)
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14770 (URN)10.1016/j.ijmedinf.2018.01.019 (DOI)000431199300001 ()29602428 (PubMedID)2-s2.0-85042208743 (Scopus ID)
Available from: 2018-02-23 Created: 2018-02-23 Last updated: 2019-03-13 Bibliographically reviewed
Bevilacqua, F. (2018). Game-calibrated and user-tailored remote detection of emotions: A non-intrusive, multifactorial camera-based approach for detecting stress and boredom of players in games. (Doctoral dissertation). Skövde: University of Skövde
2018 (English) Doctoral thesis, monograph (Other academic)
Abstract [en]

Questionnaires and physiological measurements are the most common approach used to obtain data for emotion estimation in the field of human-computer interaction (HCI) and games research. Both approaches interfere with the natural behavior of users. Initiatives based on computer vision and the remote extraction of user signals for emotion estimation exist, however they are limited. Experiments of such initiatives have been performed under extremely controlled situations with few game-related stimuli. Users had a passive role with limited possibilities for interaction or emotional involvement, instead of game-based emotion stimuli, where users take an active role in the process, making decisions and directly interacting with the media. Previous works also focus on predictive models based on a group perspective. As a consequence, a model is usually trained from the data of several users, which in practice describes the average behavior of the group, excluding or diluting key individualities of each user. In that light, there is a lack of initiatives focusing on non-obtrusive, user-tailored emotion detection models, in particular regarding stress and boredom, within the context of games research that is based on emotion data generated from game stimuli. This research aims to fill that gap, providing the HCI and the games research community with an emotion detection process that can be used to remotely study user's emotions in a non-obtrusive way within the context of games.

The main knowledge contribution of this research is a novel process for emotion detection that is non-obtrusive, user-tailored and game-based. It uses remotely acquired signals, namely, heart rate (HR) and facial actions (FA), to create a user-tailored model, i.e. trained neural network, able to detect the emotional states of boredom and stress of a given subject. The process is automated and relies on computer vision and remote photoplethysmography (rPPG) to acquire user signals, so that specialized equipment, e.g. HR sensors, is not required and only an ordinary camera is needed. The approach comprises two phases: training (or calibration) and testing. In the training phase, a model is trained using a user-tailored approach, i.e. data from a given subject playing calibration games is used to create a model for that given subject. Calibration games are a novel emotion elicitation material introduced by this research. These games are carefully designed to present a difficulty level that constantly and linearly progresses over time without a pre-defined stopping point. They induce emotional states of boredom and stress, accounting for particularities at an individual level. Finally, the testing phase occurs in a game session involving a subject playing any ordinary, non-calibration game, e.g. Super Mario. During the testing phase, the subject's signals are remotely acquired and fed into the model previously trained for that particular subject. The model subsequently outputs the estimated emotional state of that given subject for that particular testing game.

The method for emotion detection proposed in this thesis has been conceived on the basis of established theories and it has been carefully evaluated in experimental setups. Results show a statistically significant classification of emotional states with a mean accuracy of 61.6%. Finally, this thesis presents a series of systematic evaluations conducted in order to understand the relation between psychophysiological signals and emotions. Facial behavior and physiological signals, i.e. HR, are analyzed and discussed as indicators of emotional states. This research reveals that individualities can be detected regarding facial activity, e.g. an increased number of facial actions during the stressful part of games. Regarding physiological signals, findings are aligned with and reinforce previous research that indicates a higher HR mean during stressful situations in a gaming context. Results also suggest that changes in HR during gaming sessions are a promising indicator of stress. The method for the remote detection of emotions, presented in this thesis, is feasible, but does contain limitations. Nevertheless, it is a solid initiative to move away from questionnaires and physical sensors into a non-obtrusive, remote-based solution for the evaluation of user emotions.
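
Because the calibration games progress linearly from boring to stressful without a pre-defined stopping point, training labels can in principle be derived from the position of a window within the session. The sketch below shows one such labelling scheme; the one-third cut-offs are an assumption for illustration, not the exact procedure of the thesis.

    import numpy as np

    def label_calibration_windows(n_windows):
        """Label windows of a calibration-game session: early windows as
        boredom (0), late windows as stress (1), middle discarded (-1)."""
        labels = np.full(n_windows, -1)
        cut = n_windows // 3
        labels[:cut] = 0   # start of session: low difficulty, boredom
        labels[-cut:] = 1  # end of session: high difficulty, stress
        return labels

    print(label_calibration_windows(12))  # [0 0 0 0 -1 -1 -1 -1 1 1 1 1]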

Abstract [sv]

Questionnaires and physiological measurements using sensors are currently the most common methods for collecting data that can be used to identify users' emotional states in human-computer interaction and games research. These methods, however, affect users' natural behavior, since they are either intrusive during the actual use situation (for example EEG and ECG sensors) or are carried out only after the use situation. Newer methods try to reduce the direct impact on the user by collecting user data with the help of computer vision and various remote acquisition tools (for example eye-tracking), but these are currently limited. Many of these methods can only be used in carefully controlled situations with stimuli from experiment-specific software. In order for the measurement instruments to obtain clean data in these experimental situations, users often have comparatively limited possibilities to interact with specially developed games, which makes it doubtful that they represent the complexity of real gaming situations. The methods also often rely on prediction models based on average data from large user groups, which means that individual peculiarities of users are often overlooked. With this in mind, there is a great need for new tools and measurement methods that are both non-intrusive and user-specific. This thesis presents a research project in which such a tool is developed and evaluated.

The main knowledge contribution of this research is a novel process for emotion measurement that is non-intrusive, user-specific and game-based. The process uses remote acquisition of heart rate (HR) and facial muscle movements to train a user-specific neural network that can identify whether the user is bored or stressed. The solution is fully automated, uses computer vision and photoplethysmography applied to video recordings to collect user data, and requires no specially adapted equipment (for example HR sensors). The process consists of two phases: a training (or calibration) phase and a testing phase. In the training phase, a model of a user's emotional response is constructed and trained while the user plays specially designed calibration games. These calibration games are developed to elicit different types of emotional response, in the form of stress and boredom, by exposing the user to challenges of varying difficulty. In the testing phase, the user plays an ordinary game (for example Super Mario). During this gameplay, physiological user data is remotely acquired and processed by the previously constructed model, which is tailored to interpret data from that particular user. Finally, the model produces an estimate of the user's emotional state during the gaming session.

The method for emotion measurement proposed in this thesis is based on previously established theories and has also been evaluated in a series of controlled experiments. Results from the evaluation show a statistically significant identification of emotional states with an accuracy of 61.6%. In addition to the presentation of the developed tool for emotion measurement, a series of systematic evaluations of the relationship between psychophysiological signals and emotions is presented. The use of facial muscles and physiological signals (for example HR) is analyzed, and their role as indicators of emotional states is discussed. This research shows that individual peculiarities in people's facial expressions can be identified (for example an increased number and intensity of facial expressions during stress-inducing game segments). Regarding physiological signals, the results are consistent with, and strengthen, previous research that draws parallels between HR and feelings of stress in gaming situations. The method for remote measurement of emotional states presented in this thesis is useful, but has certain limitations. Nevertheless, it is a promising first step away from the use of questionnaires and physically intrusive sensors and towards remote-acquisition-based solutions for evaluating users' emotional states.

Place, publisher, year, edition, pages
Skövde: University of Skövde, 2018. p. 170
Series
Dissertation Series ; 27
Keywords
human-computer interaction, computer vision, non-obtrusive, remote sensing, affective computing, games, rPPG
National subject category
Human Computer Interaction (Interaction Design)
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-16347 (URN)978-91-984187-9-8 (ISBN)
Public defence
2018-11-19, Insikten, Portalen, Skövde, 13:00 (English)
Opponent
Supervisors
Note

This work has been performed with support from: CNPq, Conselho Nacional de Desenvolvimento Científico e Tecnológico - Brasil; University of Skövde; EU Interreg ÖKS project Game Hub Scandinavia; UFFS, Federal University of Fronteira Sul.

Available from: 2018-10-30 Created: 2018-10-29 Last updated: 2018-10-30 Bibliographically reviewed
Bevilacqua, F. (2017). Game-calibrated and user-tailored remote detection of emotions: A non-intrusive, multifactorial camera-based approach for detecting stress and boredom of players in games.
2017 (English) Report (Other academic)
Abstract [en]

Questionnaires and physiological measurements are the most common approach used to obtain data for emotion estimation in the field of human-computer interaction (HCI) and games research. Both approaches interfere with the natural behavior of users, which affects any research procedure. Initiatives based on computer vision and remote extraction of user signals for emotion estimation exist, however they are limited. Experiments of such initiatives have been performed under extremely controlled situations with few game-related stimuli. Users had a passive role with limited possibilities for interaction or emotional involvement, differently than game-based emotion stimuli, where users take an active role in the process, making decisions and directly interacting with the media. Previous works also focus on predictive models based on a group perspective. As a consequence, a model is usually trained from data of several users, which in practice describes the average behavior of the group, excluding or diluting key individualities of each user. In that light, there is a lack of initiatives focusing on non-obtrusive, user-tailored emotion detection models, in particular regarding stress and boredom, within the context of games research that is based on emotion data generated from game stimuli.

This thesis proposal presents a research project that aims to fill that gap, providing the HCI and the games research community with an emotion detection process, instantiated as a software tool, which can be used to remotely study users' emotions in a non-obtrusive way within the context of games. The main knowledge contribution of this research is a novel process for emotion detection that is remote (non-contact) and constructed from a game-based, multifactorial, user-tailored calibration phase. The process relies on computer vision and remote photoplethysmography (rPPG) to read user signals, e.g. heart rate (HR) and facial actions, without physical contact during the interaction with games in order to detect stress/boredom levels of users. The approach is automated and uses an ordinary camera to collect information, so specialized equipment, e.g. HR sensors, is not required.

Current results of this research show that individualities can be detected regarding facial activity, e.g. increased number of facial actions during the stressful part of games. Regarding physiological signals, findings are aligned with and reinforce previous research that indicates higher HR mean during stressful situations in a gaming context. The findings also suggest that changes in the HR during gaming sessions are a promising indicator of stress, which can be incorporated into a model aimed at emotion detection. The literature reviews, the experiments conducted so far and the planned future tasks support the idea of using a set of signals, e.g. facial activity, body movement, and HR estimations as sources of information in a multifactorial analysis for the identification of stress and boredom in games. It will produce a novel user-tailored approach for emotion detection focused on the behavioral particularities of each user instead of the average group pattern. The proposed approach will be implemented as a software tool, which can be used by researchers and practitioners for games research.

Publisher
p. 127
Keywords
games, human-computer interaction, emotions, affective computing, rPPG, stress, boredom, remote
National subject category
Human Computer Interaction (Interaction Design)
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-14042 (URN)
Note

Thesis proposal, PhD programme, University of Skövde

Available from: 2017-08-25 Created: 2017-08-25 Last updated: 2018-01-13 Bibliographically reviewed
Eduardo Venson, J., Bevilacqua, F., Onuki, F., Cordeiro d’Ornellas, M. & Anderson, M. (2016). Efficient medical image access in diagnostic environments with limited resources. Research on Biomedical Engineering, 32(4), 347-357
2016 (English) In: Research on Biomedical Engineering, ISSN 2446-4732, Vol. 32, no. 4, pp. 347-357. Article in journal (Refereed) Published
Abstract [en]

Introduction

A medical application running outside the workstation environment has to deal with several constraints, such as reduced available memory and low network bandwidth. The aim of this paper is to present an approach to optimize the data flow for fast image transfer and visualization on mobile devices and remote stationary devices.

Methods

We use a combination of client- and server-side procedures to reduce the amount of information transferred by the application. Our approach was implemented on top of a commercial PACS and evaluated through user experiments with specialists in typical diagnosis tasks. The quality of the system outcome was measured in relation to the accumulated amount of network data transferred and the amount of memory used in the host device. In addition, the system's quality of use (usability) was measured through participants' feedback.

Results

Contrary to previous approaches, ours keeps the application within the memory constraints, minimizing data transfer whenever possible and allowing the application to run on a variety of devices. Moreover, it does so without sacrificing the user experience. Experimental data indicate that over 90% of the users did not notice any delays or degraded image quality, and when they did, these did not impact the clinical decisions.

Conclusion

The combined activities and orchestration of our methods allow the image viewer to run on resource-constrained environments, such as those with low network bandwidth or little available memory. These results demonstrate the ability to explore the use of mobile devices as a support tool in the medical workflow.

Place, publisher, year, edition, pages
Sociedade Brasileira de Engenharia Biomédica, 2016
Keywords
mHealth, Teleradiology, Radiology, Mobile viewer
National subject category
Biomedical Laboratory Science/Technology
Research subject
Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-13451 (URN)10.1590/2446-4740.05915 (DOI)2-s2.0-85012299138 (Scopus ID)
Project
Appification of Medical Reports
Available from: 2017-03-24 Created: 2017-03-24 Last updated: 2017-11-27 Bibliographically reviewed
Bevilacqua, F., Backlund, P. & Engström, H. (2016). Variations of Facial Actions While Playing Games with Inducing Boredom and Stress. In: 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-Games). Paper presented at International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), Barcelona, Spain, September 7-9, 2016. IEEE
2016 (English) In: 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), IEEE, 2016. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents an experiment aimed at empirically exploring the variations of facial actions (FA) during gaming sessions with induced boredom and stress. Twenty adults with different ages and gaming experiences played three games while being recorded by a video camera and monitored by a heart rate sensor. The games were carefully designed to have a linear progression from a boring to a stressful state. Self-reported answers indicate participants perceived the games as being boring at the beginning and stressful at the end. The 6 hours of recordings of all subjects were manually analyzed and FA were annotated. We annotated FA that appeared in the recordings at least twice; annotations were categorized by the period when they happened (boring/stressful part of the games) and analysed on a group and on an individual level. Group level analysis revealed that FA patterns were related to no more than 25% of the subjects. The individual level analysis revealed particular patterns for 50% of the subjects. More FA annotations were made during the stressful part of the games. We conclude that, for the context of our experiment, FA provide an unclear foundation for detection of boredom/stressful states when observed from a group level perspective, while the individual level perspective might produce more information.

Place, publisher, year, edition, pages
IEEE, 2016
Series
International Conference on Games and Virtual Worlds for Serious Applications, ISSN 2474-0470
Keywords
Games, Stress, Heart rate, Context, Physiology, Cameras, Predictive models
National subject category
Computer Engineering
Research subject
Technology; Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-13071 (URN)10.1109/VS-GAMES.2016.7590374 (DOI)000386980000041 ()2-s2.0-85013187616 (Scopus ID)978-1-5090-2722-4 (ISBN)978-1-5090-2723-1 (ISBN)
Conference
International Conference on Games and Virtual Worlds for Serious Applications (VS-Games), Barcelona, Spain, September 7-9, 2016
Available from: 2016-11-03 Created: 2016-11-03 Last updated: 2018-08-30 Bibliographically reviewed