
Online Recognition of Bimanual Coordination Using a Theoretical Framework with Application to Robot Assisted Surgery

As robotic systems become more ubiquitous and more closely integrated with human operators, it may be critically important for these systems to predict how the operator coordinates his or her limbs. This is particularly true in high-risk teleoperation tasks that demand high precision from the operator, such as robotic surgery. We developed an online recognition method for human bimanual coordination modes based on geometric analysis. The method uses a modified Procrustes analysis, in which the rotation and reflection component is restricted to a set of pre-defined matrices, to determine the type of bimanual symmetry (e.g., mirror, point, or visual symmetry). We tested the method on two motion data sets: a 2D bimanual path-following task and 3D bimanual robotic surgical training tasks. Using an objective ground truth derived from the ideal trajectories in the 2D path-following task, we computed overall recognition accuracies for both the type of symmetry and its direction. To demonstrate the applicability of this algorithm to realistic tasks, three surgical gestures from the JIGSAWS surgical robotics dataset (needle transfer, making a C loop, and suture pulling) were analyzed to characterize the bimanual coordination modes associated with each gesture. The results generally match the bimanual motions expected to complete each gesture. Overall, this paper presents a promising method not only to recognize bimanual coordination modes, but to do so at time scales fast enough for online integration with bimanual robotic systems.
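
To illustrate the core idea, the sketch below shows one plausible way to restrict Procrustes analysis to a fixed set of candidate rotation/reflection matrices, one per symmetry mode. The specific 2D matrices, the normalized residual, and the names (SYMMETRY_MATRICES, procrustes_residual, classify_symmetry) are illustrative assumptions, not taken from the paper; the actual candidate transforms and scoring may differ.

```python
import numpy as np

# Candidate rotation/reflection matrices for 2D bimanual symmetry modes.
# These particular choices are assumptions for illustration: "visual"
# symmetry maps one hand's path onto the other unchanged (identity),
# "mirror" reflects across the vertical midline, and "point" rotates
# by 180 degrees about the centroid.
SYMMETRY_MATRICES = {
    "visual": np.eye(2),
    "mirror": np.array([[-1.0, 0.0], [0.0, 1.0]]),
    "point":  np.array([[-1.0, 0.0], [0.0, -1.0]]),
}

def procrustes_residual(X, Y, R):
    """Procrustes-style disparity between trajectories X and Y (n x 2 arrays)
    with the rotation/reflection fixed to the candidate matrix R. Translation
    and isotropic scale are still fit in closed form, as in ordinary
    Procrustes analysis. Assumes non-degenerate (non-constant) trajectories."""
    Xc = X - X.mean(axis=0)            # remove translation
    Yc = Y - Y.mean(axis=0)
    Yt = Yc @ R.T                      # apply the fixed symmetry transform
    # Least-squares isotropic scale aligning Yt to Xc.
    s = np.trace(Xc.T @ Yt) / np.trace(Yt.T @ Yt)
    return np.linalg.norm(Xc - s * Yt) ** 2 / np.linalg.norm(Xc) ** 2

def classify_symmetry(left, right):
    """Return the symmetry mode whose fixed transform best explains the data,
    along with the residual for every candidate mode."""
    residuals = {mode: procrustes_residual(left, right, R)
                 for mode, R in SYMMETRY_MATRICES.items()}
    return min(residuals, key=residuals.get), residuals

if __name__ == "__main__":
    # Synthetic check: the right hand mirrors the left hand's circular path.
    t = np.linspace(0.0, 2.0 * np.pi, 200)
    left = np.stack([np.cos(t), np.sin(t)], axis=1) + np.array([0.5, 0.0])
    right = left @ SYMMETRY_MATRICES["mirror"].T
    mode, scores = classify_symmetry(left, right)
    print(mode, scores)   # expected: "mirror" with near-zero residual
```

For online recognition, one would presumably apply classify_symmetry over a short sliding window of recent samples from each manipulator, so that the detected coordination mode can be updated as the motion unfolds.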