Högskolan i Skövde

Eklund, Malin
Publications (4 of 4)
Eklund, M., Forslund, J. & Billing, E. (2024). Effects of body language during conversation with socially assistive robots. In: Jonas Olofsson; Teodor Jernsäther-Ohlsson; Sofia Thunberg; Linus Holm; Erik Billing (Ed.), Proceedings of the 19th SweCog Conference. Paper presented at Annual conference of the Swedish Cognitive Science Society (SweCog), Stockholm, October 10-11, 2024 (pp. 106-106). Skövde: University of Skövde, Article ID P60.
Effects of body language during conversation with socially assistive robots
2024 (English). In: Proceedings of the 19th SweCog Conference / [ed] Jonas Olofsson; Teodor Jernsäther-Ohlsson; Sofia Thunberg; Linus Holm; Erik Billing, Skövde: University of Skövde, 2024, p. 106-106, article id P60. Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

It has been shown that interaction methods such as body language and gestures in socially assistive robots (SAR) contribute to engagement, attention, and entertainment value. Studies in social cognition emphasize the significance of body language for facilitating interaction in social exchanges. Inspired by these results, an independent-groups experiment (N=45) was designed to investigate how body language, as an interaction method in SAR, affects perceived social presence. Participants engaged in semi-structured conversations with the social robot Pepper, equipped with a ChatGPT-based dialogue system, with or without body language. Perceived social presence was measured using the Almere questionnaire. Contrary to our hypothesis, the results did not show any significant differences in perceived social presence. Detailed analysis did, however, show that the interactive condition enhanced the feeling of being seen and tended to make the robot more entertaining. The lack of support for the hypothesis suggests that the robot's body language might be less significant than previously thought, possibly due to method and design factors, as well as the robot's advanced dialogue system. This study highlights the potential of large language models for SAR and could indicate that some aspects of the robot's design might overshadow others.
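The abstract does not report the statistical procedure in detail, so the following is only a minimal, hypothetical sketch of how an independent-groups comparison of Almere social presence scores could be run in Python; the group sizes, scores, and test choice are all assumptions.

```python
# Hypothetical sketch: comparing mean perceived-social-presence scores
# (Almere questionnaire) between two independent groups. All data below
# are invented; the abstract reports no raw scores or test choice.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
with_gestures = rng.uniform(2.5, 4.5, size=23)     # body-language condition
without_gestures = rng.uniform(2.5, 4.5, size=22)  # static condition (23 + 22 = 45)

t, p = stats.ttest_ind(with_gestures, without_gestures)
print(f"t = {t:.2f}, p = {p:.3f}")  # a non-significant p would match the reported result
```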

Place, publisher, year, edition, pages
Skövde: University of Skövde, 2024
Series
Skövde University Studies in Informatics: SUSI, ISSN 1653-2325 ; 2024:1
National Category
Psychology (excluding Applied Psychology); Computer graphics and computer vision; Human Computer Interaction
Research subject
User Centred Product Design; Interaction Lab (ILAB)
Identifiers
urn:nbn:se:his:diva-24712 (URN) 978-91-989038-1-2 (ISBN)
Conference
Annual conference of the Swedish Cognitive Science Society (SweCog), Stockholm, October 10-11, 2024
Projects
Social robots in home environments for older persons' quality of life - needs, opportunities and obstacles (RO-LIV)
Available from: 2024-11-19. Created: 2024-11-19. Last updated: 2025-02-01. Bibliographically approved.
Billing, E., Brolin, A., Quesada Díaz, R., Eklund, M. & Lämkull, D. (2024). Predicting repetitive worker behaviour using eye-gaze. In: Silje-Adelen Nenseth; Ruud van der Weel; Audrey van der Meer (Ed.), Studies in Perception and Action XVII: 22nd International Conference on Perception and Action. Paper presented at The XXII International Conference on Perception and Action (ICPA), June 25-28, 2024, Trondheim, Norway (pp. 4-4). Trondheim.
Predicting repetitive worker behaviour using eye-gaze
2024 (English). In: Studies in Perception and Action XVII: 22nd International Conference on Perception and Action / [ed] Silje-Adelen Nenseth; Ruud van der Weel; Audrey van der Meer, Trondheim, 2024, p. 4-4. Conference paper, Poster (with or without abstract) (Refereed)
Place, publisher, year, edition, pages
Trondheim, 2024
National Category
Psychology; Computer graphics and computer vision
Research subject
Interaction Lab (ILAB); User Centred Product Design; Virtual Production Development (VPD)
Identifiers
urn:nbn:se:his:diva-24264 (URN)
Conference
The XXII International Conference on Perception and Action (ICPA), June 25-28, 2024, Trondheim, Norway
Projects
Empowering Human Workers for Assembly of Wire Harnesses (EWASS)
Funder
Vinnova, 2022-01279
Note

CC BY-NC-ND 4.0

The present work was financially supported by the Swedish innovation agency Vinnova through the research and innovation programme Produktion2030, grant #2022-01279: Empowering Human Workers for Assembly of Wire Harnesses (EWASS).

Available from: 2024-07-08. Created: 2024-07-08. Last updated: 2025-02-01. Bibliographically approved.
Billing, E., Quesada Díaz, R., Eklund, M. & Brolin, A. (2024). Proactive eye-gaze for predicting repetitive worker behavior. In: Jonas Olofsson; Teodor Jernsäther-Ohlsson; Sofia Thunberg; Linus Holm; Erik Billing (Ed.), Proceedings of the 19th SweCog Conference. Paper presented at Annual conference of the Swedish Cognitive Science Society (SweCog), Stockholm, October 10-11, 2024 (pp. 151-154). Skövde: University of Skövde, Article ID P57.
Proactive eye-gaze for predicting repetitive worker behavior
2024 (English). In: Proceedings of the 19th SweCog Conference / [ed] Jonas Olofsson; Teodor Jernsäther-Ohlsson; Sofia Thunberg; Linus Holm; Erik Billing, Skövde: University of Skövde, 2024, p. 151-154, article id P57. Conference paper, Published paper (Refereed)
Abstract [en]

Proactive eye-gaze (PEG) is a behavioural pattern where eye fixations precede actions, such as reaching. With the proliferation of eye-tracking technology, PEG shows promise for predicting human actions, with many applications, for example within industrial human-robot collaboration (HRC). This study investigates PEG in repetitive assembly tasks. Eye-tracking data from four experienced workers were recorded and analysed. The study recorded 57 assembly sessions, identifying 3793 fixations, of which 35% were proactive gazes. The mean PEG interval was 795 ms. Contrary to the hypothesis, PEG was at least as strong in repetitive tasks as in the other contexts investigated in previous studies. These findings suggest PEG could be a reliable predictor of worker actions in repetitive tasks, enhancing coordination in HRC.
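As a minimal illustration of the PEG measure described above, the interval can be computed as the time from fixation onset on an object to the onset of the reaching action toward it. The abstract publishes no code, so the data layout and values below are assumptions.

```python
# Hypothetical sketch: proactive eye-gaze (PEG) intervals from paired
# event timestamps (ms). A fixation counts as proactive when it precedes
# the corresponding reach. All values here are invented for illustration.
fixation_onsets = [1200, 4100, 7050, 9900]   # fixation lands on target
action_onsets = [1950, 4880, 7900, 10020]    # matching reach begins

intervals = [a - f for f, a in zip(fixation_onsets, action_onsets)]
proactive = [d for d in intervals if d > 0]  # gaze arrived before the hand

share = 100 * len(proactive) / len(intervals)
mean_peg = sum(proactive) / len(proactive)
print(f"proactive fixations: {share:.0f}%, mean PEG interval: {mean_peg:.0f} ms")
```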

Place, publisher, year, edition, pages
Skövde: University of Skövde, 2024
Series
Skövde University Studies in Informatics: SUSI, ISSN 1653-2325 ; 2024:1
National Category
Human Computer Interaction; Computer graphics and computer vision; Psychology (excluding Applied Psychology)
Research subject
Interaction Lab (ILAB); Virtual Production Development (VPD); User Centred Product Design
Identifiers
urn:nbn:se:his:diva-24711 (URN) 978-91-989038-1-2 (ISBN)
Conference
Annual conference of the Swedish Cognitive Science Society (SweCog), Stockholm, October 10-11, 2024
Projects
Empowering Human Workers for Assembly of Wire Harnesses (EWASS)
Funder
Vinnova, 2022-01279
Available from: 2024-11-19. Created: 2024-11-19. Last updated: 2025-02-01. Bibliographically approved.
Iriondo Pascual, A., Eklund, M. & Högberg, D. (2024). Towards automated hand force predictions: Use of random forest to classify hand postures. Paper presented at The 22nd Triennial Congress of the International Ergonomics Association (IEA), August 25-29, 2024, ICC JEJU, Republic of Korea.
Towards automated hand force predictions: Use of random forest to classify hand postures
2024 (English). Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

SUMMATIVE STATEMENT:

This paper studies the use of motion capture to record hand motions and the use of the random forest machine learning algorithm for classification of motion capture data into categories and subcategories of the HandPak ergonomics evaluation method.

KEYWORDS:

Ergonomics, Motion capture, Posture recognition, Hand evaluation, Random forest.

PROBLEM STATEMENT:

Today, different technologies are available for ergonomics evaluations in the workplace. Technologies such as camera-based or inertial measurement unit (IMU) based motion capture systems make it possible to measure and digitalize human postures over time. These motion capture systems are now being integrated into the process of performing ergonomics evaluations in production systems, to evaluate the well-being of workers in a more efficient and objective manner (Rybnikár, Kačerová, Hořejší, & Šimon, 2023).

Ergonomics evaluation methods are commonly used to assess worker well-being, and motion capture systems enable automating the assessment of posture-related exposure criteria in such methods. However, the most commonly used ergonomics evaluation methods are observational. Observational methods were originally created to give ergonomists a structure for observation-based risk assessment (Takala et al., 2010). They therefore rely on the ergonomist's judgement, and their criteria are often defined subjectively, which can lead to inconsistent assessments between different ergonomists (Nyman et al., 2023). At the same time, these subjective criterion definitions make it difficult to automate the assessment using data obtained from motion capture systems.

One ergonomics evaluation method for quantifying acceptable forces and torques on the forearm, wrist, and hand is HandPak (Potvin, 2024). Performing a HandPak evaluation requires selecting one of nine categories depending on the grip and force that the worker applies. Within each category, it is also necessary to select a subcategory of hand posture. For example, the category “Torque: Wrist Flexion or Extension” has a subcategory “Type of Grip/Pinch” that must be classified as “Power Grip”, “Lateral Pinch”, or “Pull Pinch”. This classification and subclassification is not easily quantifiable and cannot be expressed as a logical set of rules over the finger joint angles.
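Purely as an illustration of the nesting described above, the one example category from the text could be encoded as follows; the dictionary layout is our assumption, not part of HandPak itself.

```python
# Hypothetical encoding of one HandPak category and its grip/pinch
# subcategories, using the names given in the text. The remaining eight
# categories are not enumerated in this abstract, so they are left out.
handpak_categories = {
    "Torque: Wrist Flexion or Extension": {
        "Type of Grip/Pinch": ["Power Grip", "Lateral Pinch", "Pull Pinch"],
    },
    # ... eight further categories
}
```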

OBJECTIVE/QUESTION:

The objective of this article is to automatically recognize hand postures from motion capture data, to support the categorization of hand postures in the HandPak (Potvin, 2024) ergonomics evaluation method.

METHODOLOGY:

In this paper, we study the use of motion capture systems to record hand motions, and the use of the random forest machine learning algorithm (Cutler, Cutler, & Stevens, 2012) for classifying motion capture data into categories and subcategories of the HandPak ergonomics evaluation method. We created random forests for the categorization of three different hand postures based on a dataset of more than 10,000 data points.
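A minimal sketch of the classification step just described, assuming scikit-learn and invented joint-angle features and labels; the paper does not specify its implementation.

```python
# Hypothetical sketch: random forest classification of hand postures
# from finger joint angles. Feature layout, labels, and scikit-learn
# usage are assumptions; only the ~10,000-sample scale comes from the text.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_angles = 10_000, 20
X = rng.uniform(0.0, 90.0, size=(n_samples, n_angles))  # joint angles in degrees
y = rng.choice(["Power Grip", "Lateral Pinch", "Pull Pinch"], size=n_samples)

# A random split keeps the sketch self-contained; a subject-wise split
# would better test generalisation across people (see the discussion).
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```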

RESULTS:

The study is ongoing and the results will be added in the full paper.

DISCUSSION:

The study shows that random forests can be used to classify hand postures, based on hand joint angle data from a motion capture system, into subcategories of the HandPak ergonomics evaluation method, without the overfitting issues that decision trees usually present. The study is limited in that it only considers three subcategories of the HandPak ergonomics evaluation method. Other subcategories in HandPak, such as frequency or duration, are difficult to automate without manual input. In addition, the training and test data were obtained from only two subjects (one male and one female); adding more subjects to capture posture variation could improve the accuracy of the random forest model.

CONCLUSIONS:

The use of machine learning for categorization of hand postures enables partial automation of criterion evaluation in hand-focused ergonomics evaluation methods such as HandPak (Potvin, 2024) that would otherwise require manual input. Reducing the need for manual input is argued to make the use of ergonomics evaluation methods faster and less subjective.

REFERENCES:

Cutler, A., Cutler, D. R., & Stevens, J. R. (2012). Random Forests. In C. Zhang & Y. Ma (Eds.), Ensemble Machine Learning: Methods and Applications (pp. 157–175). New York, NY: Springer.

Douwes, M., & de Kraker, H. (2012). HARM overview and its application: Some practical examples. Work, 41(Suppl. 1), 4004–4009.

Nyman, T., Rhén, I.-M., Johansson, P. J., Eliasson, K., Kjellberg, K., Lindberg, P., Fan, X., et al. (2023). Reliability and Validity of Six Selected Observational Methods for Risk Assessment of Hand Intensive and Repetitive Work. International Journal of Environmental Research and Public Health, 20(8), 5505.

Potvin, J. R. (2024). HandPak. Retrieved March 11, 2024, from https://potvinbiomechanics.com/handpak/

Rybnikár, F., Kačerová, I., Hořejší, P., & Šimon, M. (2023). Ergonomics Evaluation Using Motion Capture Technology—Literature Review. Applied Sciences, 13(1), 162.

Takala, E.-P., Pehkonen, I., Forsman, M., Hansson, G.-A., Mathiassen, S. E., Neumann, W. P., Sjøgaard, G., et al. (2010). Systematic evaluation of observational methods assessing biomechanical exposures at work. Scandinavian Journal of Work, Environment & Health, 36(1), 3–24.

Keywords
Ergonomics, Motion capture, Posture recognition, Hand evaluation, Random forest
National Category
Production Engineering, Human Work Science and Ergonomics
Research subject
User Centred Product Design; VF-KDO
Identifiers
urn:nbn:se:his:diva-24859 (URN)
Conference
The 22nd Triennial Congress of the International Ergonomics Association (IEA), August 25-29, 2024, ICC JEJU, Republic of Korea
Available from: 2025-01-23. Created: 2025-01-23. Last updated: 2025-03-11. Bibliographically approved.