New research from Carnegie Mellon University's Robotics Institute can help robots sense layers of fabric rather than relying on computer vision alone. The work could allow robots to help people with household chores like folding laundry.
Humans use their senses of sight and touch to grab a drink or pick up a piece of cloth. These tasks are so routine that we rarely think about them, but for robots they are extremely difficult. The data gathered through touch is hard to quantify, and the sense itself has been difficult to simulate in robotics – until recently.
“Humans look at something, we grab it, and then we use touch to make sure we’re in the right position to grab it,” said David Held, an assistant professor in the School of Computer Science and head of the Robots Perceiving and Doing Lab (R-PAD). “A lot of the tactile sensing that humans do is natural to us. We don’t think about it much, so we don’t realize how valuable it is.”
For example, to fold laundry, a robot needs a sensor that mimics the way a human’s fingers can feel the top layer of a towel or shirt while sensing the layers beneath it. Researchers could teach a robot to feel and grab the top layer of fabric, but without sensing the other layers, the robot would only ever grab the top one and never fold the fabric successfully.
“How do we fix this?” Held asked. “Well, maybe what we need is touch sensing.”
ReSkin, developed by researchers at Carnegie Mellon and Meta AI, was the ideal solution. The open-source touch-sensing “skin” is made of a thin, elastic polymer embedded with magnetic particles that measure three-axis touch signals. In recent work, the researchers used ReSkin to help the robot sense layers of fabric rather than relying on its vision sensors to see them.
“By reading changes in the magnetic fields from depressions or movements of the skin, we can get tactile sensing,” said Thomas Weng, a Ph.D. student in the R-PAD Lab, who worked on the project with Robotics Institute postdoc Daniel Seita and graduate student Sashank Tirumala. “We can use this tactile sensing to determine how many layers of fabric we’ve picked up by pinching with the sensor.”
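The sensing step Weng describes can be sketched as a simple signal-to-count mapping. The sketch below is a hypothetical illustration only – the feature (mean deviation of the three-axis field from a resting baseline) and the thresholds are invented for this example, not taken from the researchers' implementation:

```python
import numpy as np

def classify_layers(bxyz_window, thresholds=(0.05, 0.15)):
    """Hypothetical layer classifier: map a window of three-axis magnetic
    readings (shape N x 3, arbitrary units) to 0, 1, or 2+ grasped layers.

    Idea only: pinching more fabric deforms the elastic skin more, which
    shifts the embedded magnetic particles and changes the measured field.
    """
    # Feature: average deviation of the field from its resting baseline.
    baseline = bxyz_window[0]                       # first sample as rest pose
    deviation = np.linalg.norm(bxyz_window - baseline, axis=1).mean()
    t0, t1 = thresholds
    if deviation < t0:
        return 0        # nothing between the fingers
    elif deviation < t1:
        return 1        # one layer pinched
    return 2            # two or more layers
```

In practice a learned model would replace the hand-set thresholds, but the structure is the same: tactile signal in, layer count out.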
Other research has used touch sensing to grasp rigid objects, but fabric is deformable, meaning it changes shape as you touch it, which makes the task even harder. Adjusting the robot’s grip on the fabric changes both the fabric’s pose and the sensor readings.
The researchers didn’t teach the robot how or where to grab the fabric. Instead, they taught it how many layers of fabric it was grabbing: the robot first estimates the layer count using ReSkin’s sensors, then adjusts its grip and tries again. The team evaluated the robot on picking up one and two layers of fabric, and used different fabric textures and colors to demonstrate generalization beyond the training data.
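The estimate-then-adjust strategy described above amounts to a feedback loop. Here is a minimal sketch of that loop; the gripper interface, the one-step adjustment, and the attempt limit are all assumptions made for illustration, not the authors' control policy:

```python
def grasp_n_layers(target_layers, estimate_layers, adjust_grip, max_attempts=10):
    """Hypothetical control loop: pinch, estimate the grasped layer count
    from tactile feedback, and nudge the grip until the count matches.

    estimate_layers() -> int : layers currently pinched (tactile estimate).
    adjust_grip(delta)       : move the grip deeper (+1) or shallower (-1).
    """
    for _ in range(max_attempts):
        grasped = estimate_layers()
        if grasped == target_layers:
            return True                     # success: correct layer count
        # Too few layers: pinch deeper into the stack; too many: back off.
        adjust_grip(+1 if grasped < target_layers else -1)
    return False                            # gave up after max_attempts
```

The key point the sketch captures is that the robot never needs to be told where to grab – the tactile estimate alone drives the correction.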
The finesse and flexibility of the ReSkin sensor made it possible to teach robots to manipulate something as delicate as layers of fabric.
“The profile of this sensor is so small that we were able to do this very fine task, inserting it between layers of fabric, which we cannot do with other sensors, especially optical sensors,” Weng said. “We were able to use it to do things that weren’t possible before.”
There is, however, a lot of research left to do before a robot can take over the laundry hamper. It all starts with steps like smoothing a wrinkled fabric, picking up the right number of layers to fold, and then folding the fabric the right way.
“It’s really an exploration of what we can do with this new sensor,” Weng said. “We’re exploring how to make robots feel with this magnetic skin for things that are soft, and exploring simple strategies for manipulating the fabric we’ll need for robots to eventually do our laundry.”
The team’s research paper, “Learning to Singulate Layers of Cloth Using Tactile Feedback,” will be presented at the 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct. 23–27 in Kyoto, Japan. It also received the Best Paper Award at the conference’s ROMADO-SI 2022 workshop.
Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of press releases posted on EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.