Building on Yobionic, I implemented a full robotic arm with APIs that mimic a participant's hand gestures and movements.
The arm and its software support:
1. Programmable tasks for different hand poses, gestures and movements.
2. Serving as a demonstration for the FingerTrak project (driving the arm from the deep-learning output of a hand-movement model).
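To illustrate how deep-learning hand-pose output can drive a robotic arm, here is a minimal sketch: three 3D joint positions along a finger are converted to a flexion angle, which is then mapped to a standard RC-servo pulse width. All function names here (`finger_flexion_deg`, `to_servo_pulse_us`) and the servo mapping are illustrative assumptions, not the actual Yobionic API.

```python
import math

def finger_flexion_deg(base, mid, tip):
    """Flexion angle at the middle joint, from three 3D points (degrees).

    Hypothetical helper: the real arm API may use a different convention.
    """
    v1 = [b - m for b, m in zip(base, mid)]
    v2 = [t - m for t, m in zip(tip, mid)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def to_servo_pulse_us(angle_deg, min_us=1000, max_us=2000):
    """Map a 0-180 degree joint angle onto a typical RC-servo pulse width."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return int(min_us + (max_us - min_us) * angle_deg / 180.0)

# A fully straight finger: base, mid, and tip are collinear -> 180 degrees.
angle = finger_flexion_deg((0, 0, 0), (1, 0, 0), (2, 0, 0))
pulse = to_servo_pulse_us(angle)
```

In a demo loop, one such angle would be computed per finger joint from the model's predicted 3D positions and streamed to the corresponding servo channel.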
FingerTrak: Continuous 3D Hand Pose Tracking by Deep Learning Hand Silhouettes Captured by Miniature Thermal Cameras on Wrist.
Published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) / UbiComp '20
Fang Hu, Peng He, Songlin Xu, Yin Li, Cheng Zhang
Abstract: In this paper, we present FingerTrak, an intelligent, minimally-obtrusive wristband that enables continuous 3D finger tracking and hand pose estimation with four miniature thermal cameras mounted closely on a form-fitting wristband. FingerTrak explores the feasibility of continuously reconstructing full-hand posture (20 hand joint positions) without the need to actually see the fingers directly. We demonstrate that our system is able to estimate full-hand posture by observing only the outline or the contour of the hand (hand silhouettes) from the wrist using low-resolution thermal cameras. A customized deep neural network is developed to learn to "stitch" these multi-view images and estimate 20 joint positions in 3D space. Our user study with 11 participants shows that the system can achieve an average angular error of 6.46 degrees when tested under the same background, and 8.06 degrees when tested under a different background. FingerTrak also demonstrates encouraging results after re-mounting the device and has the potential to reconstruct more complicated poses. We conclude this paper with further discussion on the opportunities and challenges of implementing this technology into the real world.
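The angular errors quoted above (6.46 and 8.06 degrees) are averages over joints; a minimal sketch of that kind of metric is shown below. The exact per-joint angle definitions in the paper may differ; this is only the mean-absolute-error shape of the computation, with made-up sample numbers.

```python
def mean_angular_error_deg(pred_angles, true_angles):
    """Mean absolute difference between predicted and ground-truth joint
    angles, in degrees. Illustrative sketch, not the paper's exact metric."""
    assert len(pred_angles) == len(true_angles)
    return sum(abs(p - t) for p, t in zip(pred_angles, true_angles)) / len(pred_angles)

# Hypothetical per-joint angles for one frame (degrees).
pred = [85.0, 92.0, 170.0, 45.0]
true = [90.0, 90.0, 175.0, 40.0]
err = mean_angular_error_deg(pred, true)  # (5 + 2 + 5 + 5) / 4 = 4.25
```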