Högskolan i Skövde

The ANEMONE: Theoretical Foundations for UX Evaluation of Action and Intention Recognition in Human-Robot Interaction
Lindblom, Jessica. University of Skövde, School of Informatics; University of Skövde, Informatics Research Environment (Interaction Lab). ORCID iD: 0000-0003-0946-7531
Alenljung, Beatrice. University of Skövde, School of Informatics; University of Skövde, Informatics Research Environment (Interaction Lab). ORCID iD: 0000-0002-7554-2301
2020 (English). In: Sensors, E-ISSN 1424-8220, Vol. 20, no 15, article id 4284. Article in journal (Refereed). Published
Abstract [en]

The coexistence of robots and humans in shared physical and social spaces is expected to increase. A key enabler of high-quality interaction is a mutual understanding of each other's actions and intentions. In this paper, we motivate and present a systematic user experience (UX) evaluation framework of action and intention recognition between humans and robots from a UX perspective, because there is an identified lack of this kind of evaluation methodology. The evaluation framework is packaged into a methodological approach called ANEMONE (action and intention recognition in human-robot interaction). ANEMONE has its foundation in cultural-historical activity theory (AT) as the theoretical lens, the seven stages of action model, and user experience (UX) evaluation methodology, which together are useful in motivating and framing the work presented in this paper. The proposed methodological approach of ANEMONE provides guidance on how to measure, assess, and evaluate the mutual recognition of actions and intentions between humans and robots for investigators of UX evaluation. The paper ends with a discussion, addresses future work, and offers some concluding remarks.
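As an illustrative aside (not taken from the article itself), the seven stages of action model that ANEMONE builds on can be sketched as a simple scaffold for organising UX evaluation prompts. The stage names below follow Norman's model; the checklist structure and question wording are hypothetical and stand in for whatever instruments ANEMONE actually prescribes.

```python
# Illustrative sketch only: NOT code from the ANEMONE article. It shows how
# Norman's seven stages of action (one of the stated theoretical foundations)
# could scaffold UX evaluation prompts for human-robot interaction.
from dataclasses import dataclass
from typing import Optional

SEVEN_STAGES = [
    "Forming the goal",
    "Forming the intention",
    "Specifying the action",
    "Executing the action",
    "Perceiving the state of the world",
    "Interpreting the perception",
    "Evaluating the outcome",
]

@dataclass
class EvaluationItem:
    stage: str                    # one of Norman's seven stages
    question: str                 # hypothetical UX probe for that stage
    rating: Optional[int] = None  # to be filled in during a user study

def build_checklist() -> list:
    """Pair each stage with a generic, HRI-flavoured UX question."""
    return [
        EvaluationItem(
            stage=stage,
            question=(f"While '{stage.lower()}', could the user recognise "
                      f"the robot's actions and intentions?"),
        )
        for stage in SEVEN_STAGES
    ]

if __name__ == "__main__":
    for item in build_checklist():
        print(f"[{item.stage}] {item.question}")
```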

Place, publisher, year, edition, pages
MDPI, 2020. Vol. 20, no 15, article id 4284
Keywords [en]
human-robot interaction, human-robot collaboration, user-centered, evaluation, action recognition, intention recognition, activity theory, seven stages of action, user experience (UX)
National Category
Human Computer Interaction
Research subject
INF302 Autonomous Intelligent Systems; Interaction Lab (ILAB)
Identifiers
URN: urn:nbn:se:his:diva-18862
DOI: 10.3390/s20154284
ISI: 000559213800001
PubMedID: 32752008
Scopus ID: 2-s2.0-85088948958
OAI: oai:DiVA.org:his-18862
DiVA, id: diva2:1456096
Funder
Knowledge Foundation, 20140220
EU, Horizon 2020, 637107
Note

CC BY 4.0. This article belongs to the Special Issue "Human-Robot Interaction and Sensors for Social Robotics".

Available from: 2020-07-31. Created: 2020-07-31. Last updated: 2024-09-02. Bibliographically approved.

Open Access in DiVA

fulltext (3021 kB)
File information
File name: FULLTEXT02.pdf
File size: 3021 kB
Checksum (SHA-512): 185cf4b94fcafaaf37239ac46f71f0c10d082701a9044b12a9eb6f68a47b13bc4b33f9611582be376296f3789500227df7ae383c57c0e258fc3593cdabf99403
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text | PubMed | Scopus

Authority records

Lindblom, Jessica; Alenljung, Beatrice
