
Lanzoni, D., Cattaneo, A., Vitali, A., Regazzoni, D., & Rizzi, C. (2023). Markerless Motion Capture and Virtual Reality for Real-Time Ergonomic Analysis of Operators in Workstations with Collaborative Robots: a preliminary study. Retrieved from http://hdl.handle.net/10446/229291

Markerless Motion Capture and Virtual Reality for Real-Time Ergonomic Analysis of Operators in Workstations with Collaborative Robots: a preliminary study

Lanzoni, Daniel; Cattaneo, Andrea; Vitali, Andrea; Regazzoni, Daniele; Rizzi, Caterina
2023-01-01

Abstract

Collaborative robots (cobots) are designed to directly interact with human beings within a shared workspace. To minimize the risk of musculoskeletal disorders for the workers, a physical ergonomic assessment of their interaction is needed. Virtual reality (VR) and motion capture (Mocap) systems can aid designers in building low-hazard collaborative environments. This work presents a framework based on VR and Mocap systems for the ergonomic evaluation of collaborative robotic workstations. Starting from the 3D models of the cobot and workstation components, a virtual environment is built in Unity, and ROS is employed to manage the cobot behavior. Physical ergonomics is evaluated by means of the RULA methodology, exploiting the body-tracking capabilities of the Azure Kinect, a low-cost markerless Mocap device. The framework has been tested by building a virtual environment for the collaborative control of flanges with different diameters, in which the worker interacts with a six-axis Niryo One cobot to move parts on the workstation. The ergonomic assessment is performed in real time, and a report is generated for later use and evaluation. The proposed framework fosters the design of collaborative robotic workstations based on an objective assessment of ergonomics. The results of this research allow planning future development steps: the emulation of more complex workstations with cobots, and the use of augmented reality to evaluate how to modify existing workstations when introducing a cobot.
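As a rough illustration of how joint positions streamed by a markerless Mocap device can feed a real-time RULA computation, the Python sketch below maps an upper-arm flexion angle, estimated from four 3D joints, onto the standard RULA upper-arm band (1-4). The joint names, coordinate conventions, and the omission of RULA adjustments (abduction, raised shoulder, arm support) are assumptions made for this example; it does not reproduce the implementation described in the paper.

# Hypothetical sketch: from 3D joint positions (e.g., Azure Kinect body tracking)
# to a RULA upper-arm score. Joint names and simplifications are illustrative only.
import numpy as np

def angle_between(v1: np.ndarray, v2: np.ndarray) -> float:
    """Return the angle in degrees between two 3D vectors."""
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def rula_upper_arm_score(shoulder: np.ndarray,
                         elbow: np.ndarray,
                         spine_chest: np.ndarray,
                         pelvis: np.ndarray) -> int:
    """Score the upper-arm posture (1-4) using the standard RULA flexion bands.

    The flexion angle is approximated as the angle between the upper-arm vector
    (shoulder -> elbow) and the downward trunk vector (spine_chest -> pelvis);
    adjustments for abduction, shoulder raise, or arm support are omitted here.
    """
    upper_arm = elbow - shoulder
    trunk_down = pelvis - spine_chest
    flexion = angle_between(upper_arm, trunk_down)
    if flexion <= 20:
        return 1          # arm close to the trunk
    if flexion <= 45:
        return 2          # moderate flexion (or extension beyond 20 degrees)
    if flexion <= 90:
        return 3
    return 4              # flexion beyond 90 degrees

if __name__ == "__main__":
    # Example with made-up joint coordinates (metres, camera frame).
    score = rula_upper_arm_score(
        shoulder=np.array([0.20, 1.40, 1.00]),
        elbow=np.array([0.25, 1.15, 0.90]),
        spine_chest=np.array([0.00, 1.30, 1.00]),
        pelvis=np.array([0.00, 0.95, 1.00]),
    )
    print(f"RULA upper-arm score: {score}")

In a complete pipeline of this kind, analogous scores for the lower arm, wrist, neck, trunk, and legs would be combined through the RULA tables to obtain the overall grand score logged in the real-time report.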
File attached to this record:
Final.pdf
Description: Paper
Version: publisher's version
License: Aisberg default license
Size: 1.24 MB
Format: Adobe PDF
Access: archive administrators only


Use this identifier to cite or link to this document: https://hdl.handle.net/10446/229291