Learning Bimanual Cloth Manipulation with Vision-based Tactile Sensing via a Single Robotic Arm
This paper introduces Touch G.O.G., a cost-effective single-arm framework that pairs a novel vision-based tactile gripper with deep learning models to achieve high-precision bimanual-style cloth manipulation. By addressing the core challenges of deformable-object handling and occlusion, the framework enables reliable unfolding of crumpled fabrics.