Precise Robotic Needle-Threading with Tactile Perception and Reinforcement Learning

Published: 30 Aug 2023, Last Modified: 10 Oct 2023, CoRL 2023 Poster
Keywords: tactile perception, needle threading
TL;DR: A tactile-perception-based solution to the robotic needle-threading task
Abstract: This work presents T-NT, a novel tactile-perception-based method for the needle-threading task, an application of deformable linear object (DLO) manipulation. The task is divided into two stages: \textit{Tail-end Finding} and \textit{Tail-end Insertion}. In the first stage, the agent traces the contour of the thread twice using vision-based tactile sensors mounted on the gripper fingers; the two tracing runs together locate the tail-end of the thread. In the second stage, a tactile-guided reinforcement learning (RL) model drives the robot to insert the thread into the target needle eyelet. The RL model is trained in a Unity-based simulated environment that supports tactile rendering, producing realistic tactile images and thread models. During insertion, the positions of the poke point and the center of the eyelet are obtained from a pre-trained segmentation model, Grounded-SAM, which predicts masks for both the needle eye and the thread imprint. These positions are fed into the RL model, aiding a smoother transition to real-world deployment. Extensive experiments on real robots demonstrate the efficacy of our method. More experiments and videos can be found in the supplementary materials and on the website: \url{}.
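The insertion stage described above feeds segmentation-derived positions (poke point and eyelet center) into the RL policy as part of its observation. A minimal sketch of that observation-building step is shown below; the function names and the toy masks are hypothetical stand-ins (the paper's actual interface to Grounded-SAM and its RL observation layout are not specified in the abstract):

```python
import numpy as np

def mask_center(mask: np.ndarray) -> np.ndarray:
    """Centroid (row, col) of a binary mask, e.g. a predicted eyelet
    or thread-imprint mask."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def build_observation(eyelet_mask: np.ndarray,
                      thread_mask: np.ndarray) -> np.ndarray:
    """Concatenate the eyelet center and the thread poke-point center
    into a 4-dim vector for the RL policy (hypothetical layout)."""
    return np.concatenate([mask_center(eyelet_mask),
                           mask_center(thread_mask)])

# Toy masks standing in for Grounded-SAM predictions.
eyelet = np.zeros((8, 8), dtype=bool)
eyelet[2:4, 2:4] = True
thread = np.zeros((8, 8), dtype=bool)
thread[5:7, 5:7] = True

obs = build_observation(eyelet, thread)  # -> array([2.5, 2.5, 5.5, 5.5])
```

Extracting low-dimensional positions from masks, rather than feeding raw tactile or camera images to the policy, is one common way to narrow the sim-to-real gap, which matches the abstract's claim that these positions aid transfer to the real robot.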
Student First Author: yes
Supplementary Material: zip
Instructions: I have read the instructions for authors.
Publication Agreement: pdf
Poster Spotlight Video: mp4