Robots have become remarkably advanced, capable of executing complex tasks with impressive accuracy, speed, and coordination. Yet, despite these leaps, they often struggle with something as fundamental as sensing and responding to touch.
That limitation may soon change.
A team from the University at Buffalo has developed an electronic textile (e-textile) that mimics how human skin senses pressure, slippage, and movement.
The breakthrough could transform how robots interact with physical objects in collaborative settings such as manufacturing, as well as in surgery and prosthetics.
It represents a notable step toward more sensitive and responsive machines, especially in settings that demand fine motor control.
“The applications are very exciting… The technology could be used in manufacturing tasks like assembling products and packaging them – basically any situation where humans and robots collaborate,” said Jun Liu, assistant professor in UB’s Department of Mechanical and Aerospace Engineering and the study’s corresponding author. “It could also help improve robotic surgery tools and prosthetic limbs.”
Faster-than-human response times
The textile sensor generates electricity through the tribovoltaic effect, a process where friction between materials produces a direct-current (DC) signal.
Researchers mounted this sensor onto 3D-printed robotic fingers and connected them to a compliant gripper developed at UB.
The system proved exceptionally responsive: depending on the experiment, it reacted in as little as 0.76 milliseconds and no more than 38 milliseconds.
Human touch receptors typically respond within 1 to 50 milliseconds.
“The system is incredibly fast, and well within the biological benchmarks set forth by human performance,” Liu said. “We found that the stronger or faster the slip, the stronger the response is from the sensor – this is fortuitous because it makes it easier to build control algorithms to enable the robot to act with precision.”
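To see why a monotonic signal is so convenient for control, consider the minimal Python sketch below. It uses made-up calibration numbers and a hypothetical estimate_slip_speed helper (the study's actual signal processing is not described here): because a stronger or faster slip always produces a stronger reading, a single fitted curve can be inverted to estimate how quickly an object is slipping.

```python
# Minimal sketch: a monotonic sensor response makes slip estimation a simple
# inversion problem. All numbers below are hypothetical placeholders, not
# values from the UB study.
import numpy as np

# Hypothetical calibration data: DC output (mV) measured at known slip speeds (mm/s).
slip_speeds_mm_s = np.array([1.0, 5.0, 10.0, 20.0, 40.0])
sensor_output_mv = np.array([3.2, 14.8, 29.5, 61.0, 118.0])

# Because the response rises monotonically with slip speed, a simple linear fit
# (or any monotonic curve) can be inverted directly.
slope, intercept = np.polyfit(slip_speeds_mm_s, sensor_output_mv, 1)

def estimate_slip_speed(voltage_mv: float) -> float:
    """Invert the calibration to estimate slip speed from a sensor reading."""
    return max(0.0, (voltage_mv - intercept) / slope)

print(estimate_slip_speed(45.0))  # roughly 15 mm/s with the fake calibration above
```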
Real-time grip adjustments
In one test, researchers attempted to pull a copper weight from the robot’s grip. The system sensed the movement and automatically increased its grip to secure the object without crushing it.
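The behavior described in that test (detect slippage, then tighten just enough) could in principle be implemented with a loop like the Python sketch below. The thresholds, force values, and the read_sensor_mv/set_grip_force interfaces are hypothetical stand-ins for illustration, not the UB team's actual control code.

```python
# Minimal closed-loop sketch of slip-triggered grip adjustment.
# read_sensor_mv() and set_grip_force() are hypothetical stand-ins for the
# real sensor and gripper interfaces, which are not described in the article.
import time

SLIP_THRESHOLD_MV = 5.0   # above this, treat the reading as a slip event
FORCE_STEP_N = 0.2        # how much to tighten per detected slip
MAX_FORCE_N = 8.0         # hard ceiling so the gripper never crushes the object
BASELINE_FORCE_N = 2.0

def control_loop(read_sensor_mv, set_grip_force, period_s=0.001):
    """Poll the tactile sensor and tighten the grip whenever slip is detected."""
    grip_force = BASELINE_FORCE_N
    set_grip_force(grip_force)
    while True:
        reading = read_sensor_mv()
        if reading > SLIP_THRESHOLD_MV:
            # Stronger slip means a stronger signal, so scale the correction
            # with the reading, but never exceed the safety ceiling.
            grip_force = min(MAX_FORCE_N,
                             grip_force + FORCE_STEP_N * (reading / SLIP_THRESHOLD_MV))
        set_grip_force(grip_force)
        time.sleep(period_s)  # ~1 kHz polling, comparable to the reported latencies
```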
“The integration of this sensor allows the robotic gripper to detect slippage and dynamically adjust its compliance and grip force,” said Ehsan Esfahani, associate professor at UB and co-author of the study. “This sensor is the missing component that brings robotic hands one step closer to functioning like a human hand.”
PhD candidate Vashin Gautham, the study’s first author, added: “Our sensor functions like human skin — it’s flexible, highly sensitive, and uniquely capable of detecting not just pressure, but also subtle slip and movement of objects.”
The researchers plan to test the system further by integrating reinforcement learning to refine its control algorithms.
Potential use cases include advanced prosthetics, precision robotic tools for surgery, and enhanced human-machine interaction systems.
The team is also exploring the sensor’s adaptability to different robotic platforms.