TY - JOUR
T1 - Real-Time, Free-Viewpoint Holographic Patient Rendering for Telerehabilitation via a Single Camera
T2 - A Data-Driven Approach With 3D Gaussian Splatting for Real-World Adaptation
AU - Cao, Shengting
AU - Zhao, Jiamiao
AU - Hu, Fei
AU - Gan, Yu
N1 - Publisher Copyright:
© 1995-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Telerehabilitation is a cost-effective alternative to in-clinic rehabilitation. Although convenient, it lacks immersive and free-viewpoint patient visualization. Current research explores two solutions to this issue. Mesh-based methods use 3D models and motion capture for AR visualization. However, they are labor-intensive and less photorealistic than 2D images. Microsoft’s Holoportation generates photorealistic 3D models with eight RGBD cameras in real time. However, it requires complex setups, high GPU power, and high-speed communication infrastructure, making deployment challenging. This article presents a Real-Time Free-Viewpoint Holographic Patient Rendering (RT-FVHP) system for telerehabilitation. Unlike traditional methods that require manually crafted assets such as 3D meshes, texture maps, and skeletal rigging, our data-driven approach eliminates the need for explicit asset definitions. Inspired by the HumanNeRF framework, we retarget dynamic human poses to a canonical pose and leverage 3D Gaussian Splatting to train a neural network in canonical space for patient representation. The trained model generates 2D RGBσ outputs via Gaussian Splatting rasterization, guided by camera parameters and human pose inputs. Compatible with HoloLens 2 and web-based platforms, RT-FVHP operates effectively under real-world conditions, including handling occlusions caused by treadmills. Occlusion handling is accomplished using our Shape-Enforced Gaussian Density Control (SGDC), which initializes and densifies 3D Gaussians in occluded regions using estimated SMPL human body priors. This approach minimizes manual intervention while ensuring complete body reconstruction. With efficient Gaussian rasterization, the model delivers real-time performance of up to 400 FPS at 1080p resolution on a dedicated RTX6000 GPU.
KW - Gaussian splatting
KW - HumanNeRF
KW - SMPL
KW - Telerehabilitation
KW - Unity3D
KW - extended reality (XR)
KW - gait rehabilitation
KW - metaverse
KW - neural radiance field (NeRF)
KW - telepresence
UR - https://www.scopus.com/pages/publications/85218798216
UR - https://www.scopus.com/pages/publications/85218798216#tab=citedBy
U2 - 10.1109/TVCG.2025.3544297
DO - 10.1109/TVCG.2025.3544297
M3 - Article
C2 - 40031791
AN - SCOPUS:85218798216
SN - 1077-2626
VL - 31
SP - 7311
EP - 7323
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
IS - 10
ER -