Sensor-Invariant Tactile Representation

Categories: cs.RO, cs.CV, cs.LG

Statistics

Citations: 2
References: 34
Authors

Harsh Gupta, Yuchen Mo, Shengmiao Jin, Wenzhen Yuan
Project Resources

Name                     Type    Source
ArXiv Paper              Paper   arXiv
Semantic Scholar Paper   Paper   Semantic Scholar
Abstract

High-resolution tactile sensors have become critical for embodied perception and robotic manipulation. However, a key challenge in the field is the lack of transferability between sensors due to design and manufacturing variations, which result in significant differences in tactile signals. This limitation hinders the ability to transfer models or knowledge learned from one sensor to another. To address this, we introduce a novel method for extracting Sensor-Invariant Tactile Representations (SITR), enabling zero-shot transfer across optical tactile sensors. Our approach utilizes a transformer-based architecture trained on a diverse dataset of simulated sensor designs, allowing it to generalize to new sensors in the real world with minimal calibration. Experimental results demonstrate the method's effectiveness across various tactile sensing applications, facilitating data and model transferability for future advancements in the field.
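Below is a minimal PyTorch sketch of the kind of architecture the abstract describes: a transformer encoder that maps a raw tactile image, together with a few calibration images from the same sensor, to a shared sensor-invariant embedding. This is an illustration under stated assumptions, not the authors' implementation; the class name SITREncoderSketch, the mean-pooled calibration tokens, and all layer sizes are hypothetical.

```python
# Illustrative sketch only: the real SITR architecture, losses, and
# calibration procedure are described in the paper, not reproduced here.
import torch
import torch.nn as nn


class SITREncoderSketch(nn.Module):
    """Hypothetical transformer encoder for sensor-invariant tactile features."""

    def __init__(self, img_size=224, patch=16, dim=256, depth=4, heads=8):
        super().__init__()
        # Patch embedding, as in a standard ViT.
        self.patchify = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        n_patches = (img_size // patch) ** 2
        # Learned [CLS] token and positional embeddings for the image patches.
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos = nn.Parameter(torch.zeros(1, n_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, 4 * dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)

    def embed(self, imgs):
        # (B, 3, H, W) -> (B, n_patches, dim)
        return self.patchify(imgs).flatten(2).transpose(1, 2)

    def forward(self, tactile_img, calib_imgs):
        # tactile_img: (B, 3, H, W) signal to encode.
        # calib_imgs: (B, K, 3, H, W) reference captures that tell the model
        # which sensor produced the signal (one assumed way calibration
        # could enter the network).
        B, K = calib_imgs.shape[:2]
        main = self.embed(tactile_img) + self.pos[:, 1:]
        # Mean-pool each calibration image into a single conditioning token.
        calib = self.embed(calib_imgs.flatten(0, 1)).mean(1).view(B, K, -1)
        cls = self.cls.expand(B, -1, -1) + self.pos[:, :1]
        tokens = torch.cat([cls, main, calib], dim=1)
        return self.encoder(tokens)[:, 0]  # sensor-invariant embedding


enc = SITREncoderSketch()
z = enc(torch.randn(2, 3, 224, 224), torch.randn(2, 4, 3, 224, 224))
print(z.shape)  # torch.Size([2, 256])
```

In this reading, zero-shot transfer means a new optical tactile sensor only needs to supply its K calibration images at inference time; the encoder weights stay fixed.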
