ReSkin: a versatile, replaceable, low-cost skin for AI research on tactile perception

Nov. 1

Our sense of touch helps us navigate the world around us. With it, we can gather information about objects — such as whether they’re light or heavy, soft or hard, stable or unstable — that we use to accomplish everyday tasks, from putting on our shoes to preparing a meal. AI today effectively incorporates senses like vision and sound, but touch remains an ongoing challenge. That’s in part due to limited access to tactile-sensing data in the wild. As a result, AI researchers hoping to incorporate touch into their models struggle to exploit the richness and redundancy of touch sensing the way people do.

That’s why we’re excited to announce ReSkin, a new open source touch-sensing “skin” created by Meta AI researchers, in collaboration with Carnegie Mellon University, that can help researchers advance their AI’s tactile-sensing skills quickly and at scale. Leveraging advances in machine learning and magnetic sensing, ReSkin offers an inexpensive, versatile, durable, and replaceable solution for long-term use. It employs a self-supervised learning algorithm to help autocalibrate the sensor, making it generalizable and able to share data between sensors and systems.

We’ll be releasing the design, relevant documentation, code, and base models to help AI researchers use ReSkin without having to collect their own data sets or train their own models. That in turn should help advance AI’s tactile-sensing skills quickly and at scale.

A generalized tactile sensing skin like ReSkin will provide a source of rich contact data that could be helpful in advancing AI in a wide range of touch-based tasks, including object classification, proprioception, and robotic grasping. And AI models trained with learned tactile sensing skills will be capable of many types of tasks, including those that require higher sensitivity, such as working in health care settings, or greater dexterity, such as maneuvering small, soft, or sensitive objects. ReSkin can also be integrated with other sensors to collect visual, sound, and touch data outside the lab and in unstructured environments. Combining multimodal data sets helps build physically realistic models of the world while also exploiting redundancy for self-supervised learning.

ReSkin: a better soft sensor

ReSkin is inexpensive to produce, costing less than $6 each at 100 units and even less at larger quantities. It’s 2-3 mm thick and can be used for more than 50,000 interactions, while also having a high temporal resolution of up to 400 Hz and a spatial resolution of 1 mm with 90 percent accuracy. These specifications make it ideal for form factors as varied as robot hands, tactile gloves, arm sleeves, and even dog shoes, all of which can help researchers collect tactile data for new AI models that would previously have been difficult or impossible to gather. ReSkin can also provide high-frequency three-axis tactile signals for fast manipulation tasks like slipping, throwing, catching, and clapping. And when it wears out, it can be easily stripped off and replaced.


Because ReSkin is a deformable elastomer with embedded magnetic particles, when it deforms in any way, the surrounding magnetic signal changes. We can measure these changes with nearby magnetometers and use data-driven techniques to translate this data into information such as contact location and amount of applied force.
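
To make the data-driven step concrete, here is a minimal sketch (not the released ReSkin code) of a regressor that maps raw magnetometer readings to contact location and applied force. The number of magnetometers, array shapes, and variable names are illustrative assumptions; the stand-in random data would be replaced by readings and labels from a calibration rig.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Illustrative assumption: 5 three-axis magnetometers -> 15 raw channels per sample.
# Labels: contact position (x, y) in mm and normal force in N, from a calibration rig.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 15))   # stand-in for magnetometer readings
y = rng.normal(size=(2000, 3))    # stand-in for (x, y, force) labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small MLP that learns the mapping from magnetic-field changes to contact and force.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X_train, y_train)

x_mm, y_mm, force_n = model.predict(X_test[:1])[0]
print(f"predicted contact at ({x_mm:.1f}, {y_mm:.1f}) mm, force {force_n:.2f} N")
```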

Currently, most tactile-sensing experiments rely on a single sensor, because a new model has to be learned whenever a skin is replaced, which is inefficient and impractical. To sidestep the need to retrain every time a skin is replaced, we worked on creating a generalizable skin.

However, soft skins like ReSkin have typically proved difficult to generalize because of the manufacturing variation that naturally occurs with soft materials. Each sensor needs to go through an initial, thorough calibration routine to determine its individual response. Further, soft materials change properties over time, and they change differently depending on how they are used, so the calibration must adapt to these changes on its own. Early soft-sensor development has often focused on detailed analysis of the sensing principle, but many efforts fail to study long-term response (for example, after a week of use) or to develop automatic calibration procedures suitable for nonexpert use.

With ReSkin, we overcome these challenges with three key insights.

First, ReSkin removes the need for an electrical connection between the soft material and traditional measurement electronics. Magnetic signals rely on proximity instead, so the electronics only need to be nearby. ReSkin is also more effective than typical soft sensors because it separates the passive sensing interface from the internal electronic circuitry. This makes replacing worn-out skins as easy as peeling off and putting on a sticker.


Second, rather than relying on data from a single sensor, we train the mapping function on data from multiple sensors. This exposes the model to a greater diversity of data, which helps it produce more accurate and generalizable output.
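
A rough sketch of this idea, under the same assumed data layout as in the earlier example: calibration data from several physical skins are pooled into one training set, so the learned mapping is exposed to manufacturing variation rather than the quirks of a single sensor.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical calibration sets from 8 different ReSkin samples, each with the
# same layout as before: 15 magnetometer channels in, (x, y, force) out.
rng = np.random.default_rng(1)
per_sensor_data = [
    (rng.normal(size=(500, 15)), rng.normal(size=(500, 3))) for _ in range(8)
]

# Pool every sensor's data into one training set so the model has to cope with
# manufacturing variation instead of fitting one skin's individual response.
X = np.concatenate([readings for readings, _ in per_sensor_data])
y = np.concatenate([labels for _, labels in per_sensor_data])
shared_model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300).fit(X, y)
```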

Finally, instead of collecting calibration data for every new sensor, we exploit advances in self-supervised learning to fine-tune sensors automatically using small amounts of unlabeled data. We found that a self-supervised model performed significantly better than models that were not fine-tuned this way; instead of providing ground-truth force labels, we can use the relative positions of unlabeled contact points to help fine-tune the sensor’s calibration. For example, we know that out of three contact points, the two that are physically closer to each other will produce more similar tactile signals.
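
Here is one way that relative-position signal could be used, sketched as a triplet-style fine-tuning step in PyTorch. The encoder architecture, loss, and data layout are assumptions for illustration, not the authors’ released training code; the key point is that the only supervision is which pair of contact points is physically closer, never a ground-truth force label.

```python
import torch
import torch.nn as nn

# Sketch of self-supervised fine-tuning on a new, unlabeled skin (all sizes assumed).
encoder = nn.Sequential(nn.Linear(15, 64), nn.ReLU(), nn.Linear(64, 16))
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
triplet_loss = nn.TripletMarginLoss(margin=0.5)

def order_triplet(signals, positions):
    """Out of three contacts, treat the one physically closer to the anchor as
    the positive example and the farther one as the negative."""
    anchor, b, c = signals
    d_ab = torch.linalg.norm(positions[0] - positions[1])
    d_ac = torch.linalg.norm(positions[0] - positions[2])
    positive, negative = (b, c) if d_ab < d_ac else (c, b)
    return anchor, positive, negative

# One illustrative update with stand-in data: three unlabeled tactile readings and
# the (x, y) locations where the probe touched the skin.
signals = torch.randn(3, 15)
positions = torch.randn(3, 2)
anchor, positive, negative = order_triplet(signals, positions)
loss = triplet_loss(encoder(anchor[None]), encoder(positive[None]), encoder(negative[None]))
optimizer.zero_grad()
loss.backward()
optimizer.step()
```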

Together, these insights make ReSkin a versatile, scalable, and inexpensive tactile-sensing module of a kind that isn’t possible with existing systems. Camera-based tactile sensors, for example, require a minimum distance between the sensing surface and the camera, which results in much bulkier designs. By comparison, ReSkin can be incorporated as a thin layer over both human and robot hands and arms.

“Robust tactile sensing is a significant bottleneck in robotics,” says Lerrel Pinto, an assistant professor of computer science at NYU. “Current sensors are either too expensive, offer poor resolution, or are simply too unwieldy for custom robots. ReSkin has the potential to overcome several of these issues. Its light weight and small form factor make it compatible with arbitrary grippers, and I’m excited to further explore applications of this sensor on our lab’s robots.”

ReSkin in action

To highlight ReSkin’s utility and show how it can help researchers advance AI with a diverse range of tactile data that would previously have been difficult to collect, we demonstrated it on several sample applications:

In-hand manipulation

ReSkin offers useful tactile-sensing capabilities for in-hand manipulation, such as training robots to use a key to unlock a door or to grasp delicate objects like grapes or blueberries. We demonstrated its effectiveness on a robot gripper.


With two magnetic skins and flexible circuit boards placed on either side of a parallel-jaw gripper, an onboard microcontroller samples the data and provides force feedback. The gripper’s built-in force sensing is not sensitive enough to complete the task, but with ReSkin it can sense force feedback well enough to control the grasp. The system works just as well, with no additional tuning required, when a skin is replaced with a new one.
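
A simplified sketch of such a force-feedback grasp loop is below. The gripper interface and the read_reskin_force helper are hypothetical placeholders, not part of the released code; a real controller would also handle slip and release.

```python
import time

# Hypothetical interfaces (assumed for illustration): read_reskin_force() would
# return the force estimate from the learned model, and gripper.close_by() would
# narrow a parallel-jaw gripper by a small step in millimetres.
TARGET_FORCE_N = 1.5   # enough to hold the object without crushing it
STEP_MM = 0.2
TIMEOUT_S = 5.0

def grasp_with_force_feedback(gripper, read_reskin_force):
    """Close the gripper in small steps until the skin reports the target force."""
    deadline = time.monotonic() + TIMEOUT_S
    while time.monotonic() < deadline:
        if read_reskin_force() >= TARGET_FORCE_N:
            return True               # contact force reached: stop closing
        gripper.close_by(STEP_MM)     # otherwise keep closing gently
    return False                      # never made firm contact within the timeout
```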

Measuring tactile forces in the wild

ReSkin’s compact and unobtrusive design makes it ideal for measuring tactile forces in the wild. To showcase this, we placed one magnetic skin and a flexible circuit board inside the sole of a dog shoe. With data collected onboard, the sensor tracks the magnitude and direction of the applied force while the dog rests, walks, and runs.


Human-object interactions

ReSkin is also useful for measuring forces during natural human-object interactions, such as when we pick things up or push on objects with our hands. We placed a skin and a rigid circuit board on a subject’s right-hand index finger, under a nitrile glove. With data collected onboard, we measured the sensor output while the subject made a red bean bun.


Contact localization

ReSkin can also be scaled up for contact localization across larger surface areas. This is useful for building models that rely on knowing where contact is made. For example, if a researcher wants to teach a robot to pick something up reliably, the robot needs to know where it is touching the object and with how much force.
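
As a rough illustration of contact localization over a larger surface, the sketch below estimates a contact point as a signal-weighted centroid over a grid of magnetometers. The grid size, pitch, and baseline handling are assumptions, not the released design; a learned model like the one sketched earlier could replace the centroid for finer resolution.

```python
import numpy as np

# Assumed layout: a 4x4 grid of magnetometers at a 10 mm pitch. The contact point is
# estimated from how much each magnetometer's field deviates from its no-contact baseline.
GRID_X, GRID_Y = np.meshgrid(np.arange(4) * 10.0, np.arange(4) * 10.0)

def localize_contact(readings, baseline):
    """Estimate the (x, y) contact location in mm from three-axis field readings."""
    # readings/baseline: (16, 3) three-axis field at each of the 16 magnetometers
    change = np.linalg.norm(readings - baseline, axis=1).reshape(4, 4)
    weights = change / change.sum()
    return float((weights * GRID_X).sum()), float((weights * GRID_Y).sum())

baseline = np.zeros((16, 3))                                   # no-contact field
readings = np.random.default_rng(2).normal(scale=0.05, size=(16, 3))
readings[5] += 5.0                                             # strong change near one magnetometer
print(localize_contact(readings, baseline))                    # close to the magnetometer at (10, 10) mm
```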


What’s next

Our research into generalizable tactile sensing led to today’s ReSkin, which is low-cost, compact, and long-lasting. With skin that’s as easy to replace as a bandage, it can be used immediately, and our learned models perform strongly on new skins out of the box. It’s a powerful tool that will help researchers build AI models that will power a broad diversity of applications.

Our work on ReSkin is part of our larger commitment to advancing tactile sensing as an AI research domain. Alongside ReSkin, we’re also announcing an open source ecosystem for touch processing that includes high-resolution tactile hardware (DIGIT), simulators (TACTO), a touch-processing library (PyTouch), and data sets.

We believe these advances, taken together, should make tactile perception far more accessible and attractive for physical-world applications by advancing the sensing abilities of AI beyond its current proficiency in vision and sound. We’re excited to see how the research community evolves these tools to build models that will power a broad diversity of tactile-sensing applications and, as a result, further advance the field of AI.

Read the full paper and learn more about the work

Written By

Research Manager

Tess Hellebrekers

Postdoctoral Researcher