TY - GEN
T1 - Data Driven Kinematic Modeling of Human Gait for Synthesize Joint Trajectory
AU - Singh, Bharat
AU - Vijayvargiya, Ankit
AU - Kumar, Rajesh
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Synthesis of reference joint trajectories for a legged robot is a difficult task due to its high degrees of freedom. A gait dataset can be used to develop models that provide the required references. This paper presents the kinematic modeling of human gait data, which is used as the reference joint trajectory for a biped robot; eight deep learning models are proposed. A gait dataset of 120 subjects, all in the 5-60 years age group, was collected at the RAMAN Lab, MNIT Jaipur, India, using a vision-based methodology. Four types of novel mappings are developed: one-to-one (knee-to-knee, hip-to-hip, and ankle-to-ankle), many-to-one (knee+hip+ankle-to-knee/hip/ankle), one-to-many (knee/ankle/hip-to-knee+hip+ankle), and many-to-many (knee+hip+ankle-to-knee+hip+ankle). These mappings provide the reference trajectories for the biped robot, and the relationships between the knee, hip, and ankle trajectories are also obtained. The performance of the developed models is evaluated using average error, maximum error, and root mean square error. Results show that the bidirectional deep learning technique performs better across the different mappings. Finally, the applicability of the developed mappings to real biped robots is discussed.
AB - Synthesis of reference joint trajectories for a legged robot is a difficult task due to its high degrees of freedom. A gait dataset can be used to develop models that provide the required references. This paper presents the kinematic modeling of human gait data, which is used as the reference joint trajectory for a biped robot; eight deep learning models are proposed. A gait dataset of 120 subjects, all in the 5-60 years age group, was collected at the RAMAN Lab, MNIT Jaipur, India, using a vision-based methodology. Four types of novel mappings are developed: one-to-one (knee-to-knee, hip-to-hip, and ankle-to-ankle), many-to-one (knee+hip+ankle-to-knee/hip/ankle), one-to-many (knee/ankle/hip-to-knee+hip+ankle), and many-to-many (knee+hip+ankle-to-knee+hip+ankle). These mappings provide the reference trajectories for the biped robot, and the relationships between the knee, hip, and ankle trajectories are also obtained. The performance of the developed models is evaluated using average error, maximum error, and root mean square error. Results show that the bidirectional deep learning technique performs better across the different mappings. Finally, the applicability of the developed mappings to real biped robots is discussed.
KW - Deep learning
KW - Gait generation
KW - Kinematic modeling
KW - Mapping models
UR - http://www.scopus.com/inward/record.url?scp=85126800688&partnerID=8YFLogxK
U2 - 10.1109/CENTCON52345.2021.9688100
DO - 10.1109/CENTCON52345.2021.9688100
M3 - Conference contribution
AN - SCOPUS:85126800688
T3 - Proceedings of IEEE International Conference on Disruptive Technologies for Multi-Disciplinary Research and Applications, CENTCON 2021
SP - 27
EP - 32
BT - Proceedings of IEEE International Conference on Disruptive Technologies for Multi-Disciplinary Research and Applications, CENTCON 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 IEEE International Conference on Disruptive Technologies for Multi-Disciplinary Research and Applications, CENTCON 2021
Y2 - 19 November 2021 through 21 November 2021
ER -