TY - JOUR
T1 - Deep Heterogeneous Dilation of LSTM for Transient-Phase Gesture Prediction Through High-Density Electromyography
T2 - Towards Application in Neurorobotics
AU - Sun, Tianyun
AU - Hu, Qin
AU - Libby, Jacqueline
AU - Atashzar, S. Farokh
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022/4/1
Y1 - 2022/4/1
N2 - Deep networks have recently been proposed to estimate motor intention using conventional bipolar surface electromyography (sEMG) signals for the myoelectric control of neurorobots. In this regard, deep networks are generally challenged by long training times (affecting practicality and calibration), complex model architectures (affecting the predictability of the outcomes), and a large number of trainable parameters (increasing the need for Big Data). Capitalizing on our recent work on homogeneous temporal dilation in a Recurrent Neural Network (RNN) model, this letter proposes, for the first time, heterogeneous temporal dilation in an LSTM model and applies it to high-density surface electromyography (HD-sEMG), allowing for the decoding of dynamic temporal dependencies with tunable temporal foci. In this letter, a 128-channel HD-sEMG signal space is considered because of its potential to enhance the spatiotemporal resolution of human-robot interfaces. Accordingly, this letter addresses a challenging motor intention decoding problem for neurorobots, namely, transient intention identification. Our approach uses only the dynamic, transient phase of gesture movements, when the signals have not yet stabilized or plateaued, which can significantly enhance the temporal resolution of human-robot interfaces. This would ultimately facilitate seamless real-time implementation. Additionally, this letter introduces the concept of 'dilation foci' to modulate the modeling of temporal variation in transient phases. In this work, a high number of gestures (65) is included, which adds to the complexity and significance of this understudied problem. Our results show state-of-the-art performance for gesture prediction in terms of accuracy, training time, and model convergence.
AB - Deep networks have recently been proposed to estimate motor intention using conventional bipolar surface electromyography (sEMG) signals for the myoelectric control of neurorobots. In this regard, deep networks are generally challenged by long training times (affecting practicality and calibration), complex model architectures (affecting the predictability of the outcomes), and a large number of trainable parameters (increasing the need for Big Data). Capitalizing on our recent work on homogeneous temporal dilation in a Recurrent Neural Network (RNN) model, this letter proposes, for the first time, heterogeneous temporal dilation in an LSTM model and applies it to high-density surface electromyography (HD-sEMG), allowing for the decoding of dynamic temporal dependencies with tunable temporal foci. In this letter, a 128-channel HD-sEMG signal space is considered because of its potential to enhance the spatiotemporal resolution of human-robot interfaces. Accordingly, this letter addresses a challenging motor intention decoding problem for neurorobots, namely, transient intention identification. Our approach uses only the dynamic, transient phase of gesture movements, when the signals have not yet stabilized or plateaued, which can significantly enhance the temporal resolution of human-robot interfaces. This would ultimately facilitate seamless real-time implementation. Additionally, this letter introduces the concept of 'dilation foci' to modulate the modeling of temporal variation in transient phases. In this work, a high number of gestures (65) is included, which adds to the complexity and significance of this understudied problem. Our results show state-of-the-art performance for gesture prediction in terms of accuracy, training time, and model convergence.
KW - Human-centered robotics
KW - high-density sEMG
KW - neurorobotics
KW - recurrent neural networks
KW - temporal dilation
UR - http://www.scopus.com/inward/record.url?scp=85123316587&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85123316587&partnerID=8YFLogxK
U2 - 10.1109/LRA.2022.3142721
DO - 10.1109/LRA.2022.3142721
M3 - Article
AN - SCOPUS:85123316587
VL - 7
SP - 2851
EP - 2858
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 2
ER -