Towards a Multifidelity Digital Human Model for Use in Simulation Environments for Tactile Human/Robot Interactions

  • Rajarshi Roy
  • Qianwen Zhao
  • Aldrin Padua
  • Kent Butz
  • Chad Spurlock
  • Ethan Quist
  • Nathan Fisher
  • Kevin Lister
  • Long Wang

Research output: Contribution to journal › Article › peer-review

Abstract

Introduction

There has been increasing research interest in the development of robotics and autonomous systems (RAS) capable of battlefield casualty extraction and assistive diagnostics/interventions while mitigating risk to human rescue teams. A common scenario involves a vision-enabled robot estimating the casualty's anthropometry, identifying grasp locations on the body, planning the necessary human repositioning maneuvers, and finally executing the robot trajectories to enable casualty extraction. The efficacy of autonomous robotic extraction, however, is predicated on safe manipulation during human-robot interaction (HRI) to avoid causing additional injury. This calls for accurate modeling of robotic grasping and manipulation of the human in robotic simulation environments, followed by iterative improvements to path-planning algorithms. However, existing digital human models (DHMs) differ in biofidelity and scalability, and are presently not suitable for integration into robotic simulation environments. To this end, a multifidelity DHM (M-DHM) has been developed by integrating low-fidelity multibody dynamics (MD) DHMs with a high-fidelity whole-body finite-element (FE) DHM, providing path planners with both point loading (joint forces/moments) and distributed loading (stress/strain) during an HRI event.

Materials and Methods

The M-DHM consists of a software architecture integrating two low-fidelity MD-DHMs and one high-fidelity FE-DHM to accomplish the following objectives: (a) human joint kinematics and applied forces can be exchanged between the MD-DHM(s) and the FE-DHM, (b) multifidelity anatomical loading (M-AL) from both the MD-DHM(s) and the FE-DHM can be retrieved by a planning tool for a desired HRI event, and (c) M-AL data can be estimated via reduced-order models for HRI events wherever real-time simulation data are unavailable.

Results

Successful demonstration of the M-DHM has been carried out for grasping and palpation of the lower arms, and for repositioning and dragging of the leg.

Conclusions

Future development of the M-DHM will advance next-generation planning algorithms for both military use (e.g., search-and-rescue missions) and civilian use (e.g., RAS for patient manipulation).
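The query pattern described in objectives (b) and (c) — a planner retrieving multifidelity anatomical loading (M-AL) for an HRI event, with a reduced-order estimate standing in when precomputed simulation data are unavailable — can be sketched roughly as follows. This is a minimal illustration only: the class and field names are hypothetical, and the reduced-order model is stood in here by simple linear interpolation between precomputed grasp forces, which the abstract does not specify.

```python
from dataclasses import dataclass


@dataclass
class AnatomicalLoading:
    """Hypothetical M-AL record combining both fidelity levels."""
    joint_moment_nm: float     # point loading from the MD-DHM (N*m)
    peak_tissue_strain: float  # distributed loading from the FE-DHM


class MultifidelityDHM:
    """Illustrative sketch of the M-AL query interface, not the actual software."""

    def __init__(self):
        # Precomputed HRI events, keyed by grasp force (N).
        self.lookup = {}

    def register_simulation(self, grasp_force, loading):
        """Store M-AL produced offline by the MD-DHM/FE-DHM pipeline."""
        self.lookup[grasp_force] = loading

    def query(self, grasp_force):
        """Return M-AL for a grasp force, falling back to a reduced-order estimate."""
        if grasp_force in self.lookup:
            return self.lookup[grasp_force]
        # Reduced-order stand-in: linearly interpolate between the two
        # nearest precomputed events (assumes the query is bracketed).
        keys = sorted(self.lookup)
        lo = max(k for k in keys if k < grasp_force)
        hi = min(k for k in keys if k > grasp_force)
        t = (grasp_force - lo) / (hi - lo)
        a, b = self.lookup[lo], self.lookup[hi]
        return AnatomicalLoading(
            joint_moment_nm=a.joint_moment_nm
            + t * (b.joint_moment_nm - a.joint_moment_nm),
            peak_tissue_strain=a.peak_tissue_strain
            + t * (b.peak_tissue_strain - a.peak_tissue_strain),
        )
```

A planner would call `query()` per candidate grasp and reject trajectories whose estimated loading exceeds injury thresholds; the actual M-DHM exchanges full joint kinematics and force histories rather than a single scalar.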

Original language: English
Pages (from-to): 64-72
Number of pages: 9
Journal: Military Medicine
Volume: 190
Issue number: Supplement_2
DOIs
State: Published - 1 Sep 2025

