In fields such as sports rehabilitation, wearable devices, and human-computer interaction, precisely capturing human motion remains a core technological challenge. The shoulder, one of the body's most complex and mobile joints, is especially difficult to monitor stably in 3D over long periods. Traditional optical motion-capture systems offer high accuracy but depend on expensive equipment and laboratory settings; IMU-based wearable solutions, by contrast, suffer from drift, with errors accumulating over prolonged use. Achieving high-precision, low-cost, long-term shoulder motion tracking in real-world scenarios has therefore become a critical open problem in this domain.

To address it, Professor Conor J. Walsh's team at Harvard University developed a multimodal wearable sensing system that integrates inertial measurement units (IMUs) with flexible strain sensors (SS) and combines them with machine learning to deliver drift-free, high-precision 3D shoulder tracking without complex calibration, sustaining performance for over an hour. The starting point is clothing itself: an ordinary-looking compression sports top. This "smart shirt" carries two kinds of sensors: IMUs mounted on the torso and upper arm, and eight flexible strain sensors arranged around the shoulder. The strain sensors detect fabric stretch in real time, indirectly reflecting joint movement.

Just as crucial is the algorithm design. The team proposed a fusion framework called FIS: a convolutional neural network (CNN) first predicts drift-free shoulder motion from the strain-sensor signals, and this prediction is then used to correct the yaw drift that accumulates in the IMUs over time.
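To make the strain-sensor branch concrete, here is a minimal forward-pass sketch of a 1-D CNN mapping a window of 8-channel strain signals to three shoulder angles. The channel count matches the eight sensors described above, but the window length, layer sizes, kernel widths, and the choice of three output angles are my illustrative assumptions, not the paper's actual architecture, and the weights are random and untrained:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Valid 1-D convolution. x: (C_in, T), w: (C_out, C_in, K), b: (C_out,)."""
    c_out, c_in, k = w.shape
    t_out = x.shape[1] - k + 1
    out = np.empty((c_out, t_out))
    for i in range(t_out):
        # Sum over input channels and kernel taps for every output channel.
        out[:, i] = np.tensordot(w, x[:, i:i + k], axes=([1, 2], [0, 1])) + b
    return out

def predict_angles(strain_window, params):
    """Strain window (8, T) -> 3 shoulder-angle estimates (hypothetical units)."""
    w1, b1, w2, b2, w_fc, b_fc = params
    h = np.maximum(conv1d(strain_window, w1, b1), 0.0)  # conv + ReLU
    h = np.maximum(conv1d(h, w2, b2), 0.0)              # conv + ReLU
    feat = h.mean(axis=1)                               # global average pool
    return w_fc @ feat + b_fc                           # linear head

# Randomly initialized parameters, for shape checking only.
params = (
    rng.normal(0, 0.1, (16, 8, 5)), np.zeros(16),
    rng.normal(0, 0.1, (32, 16, 5)), np.zeros(32),
    rng.normal(0, 0.1, (3, 32)), np.zeros(3),
)

window = rng.normal(0, 1, (8, 100))  # e.g. 1 s of 8-channel strain data at 100 Hz
angles = predict_angles(window, params)
print(angles.shape)
```

Because each prediction depends only on the current strain window, its output cannot accumulate error over time, which is what makes it usable as a drift-free reference.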
This complementary strategy retains the short-term accuracy of the IMUs while exploiting the long-term stability of the flexible sensors to eliminate drift. In practice, the system requires less than 2.5 minutes of simple calibration: users train the model simply by moving their arms freely, with no laboratory equipment or complex procedures, which significantly lowers the barrier to use.
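The complementary idea can be illustrated with a toy simulation: IMU yaw is smooth but drifts linearly, the strain-based estimate is noisier but drift-free, and a first-order complementary filter (my simplification, not the paper's actual FIS correction) blends IMU increments with the drift-free reference. The drift rate, noise level, and blending constant below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, duration = 100, 3600                      # 100 Hz sampling, one hour
t = np.arange(fs * duration) / fs

true_yaw = 45.0 * np.sin(2 * np.pi * 0.2 * t)        # ground-truth motion (deg)
imu_yaw = true_yaw + 0.005 * t                       # IMU: smooth, +18 deg/h drift
strain_yaw = true_yaw + rng.normal(0, 2.0, t.size)   # strain: noisy, drift-free

alpha = 0.99  # trust IMU increments over short horizons
fused = np.empty_like(t)
fused[0] = strain_yaw[0]
for i in range(1, t.size):
    imu_delta = imu_yaw[i] - imu_yaw[i - 1]  # short-term IMU increment
    # High-pass the IMU path, low-pass the strain path:
    fused[i] = alpha * (fused[i - 1] + imu_delta) + (1 - alpha) * strain_yaw[i]

print(f"final IMU error:   {abs(imu_yaw[-1] - true_yaw[-1]):.1f} deg")
print(f"final fused error: {abs(fused[-1] - true_yaw[-1]):.2f} deg")
```

After an hour the raw IMU estimate has drifted by many degrees, while the fused estimate stays within a fraction of a degree of ground truth: the strain reference continually pulls the integrated IMU signal back, while the IMU increments smooth out the strain sensors' noise.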
Overall, this study presents a wearable motion-capture solution that balances accuracy, stability, and practicality. By integrating IMUs with flexible strain sensors and introducing lightweight machine learning, it addresses the long-standing drift problem in the field and achieves long-term, high-precision 3D shoulder tracking in real-world scenarios. With further algorithm optimization and hardware integration, the system could run in real time on embedded hardware and extend to other joints or even full-body motion capture. Combined with AI and digital-twin technologies, its potential in medical rehabilitation, smart wearables, and soft robotics could be further realized.
