Binding touch to everything: Learning unified multimodal tactile representations

F Yang, C Feng, Z Chen, H Park… - Proceedings of the …, 2024 - openaccess.thecvf.com
The ability to associate touch with other modalities has huge implications for humans and
computational systems. However, multimodal learning with touch remains challenging due to …

Beyond flat gelsight sensors: Simulation of optical tactile sensors of complex morphologies for sim2real learning

DF Gomes, P Paoletti, S Luo - arXiv preprint arXiv:2305.12605, 2023 - arxiv.org
Recently, several morphologies, each with its advantages, have been proposed for the
GelSight high-resolution tactile sensors. However, existing simulation methods are limited …

Cross-modal generation of tactile friction coefficient from audio and visual measurements by transformer

R Song, X Sun, G Liu - IEEE Transactions on Instrumentation …, 2023 - ieeexplore.ieee.org
Generating tactile data (e.g., friction coefficient) from audio and visual modalities can avoid
time-consuming practical measurements and ensure high-fidelity haptic rendering of surface …

Controllable visual-tactile synthesis

R Gao, W Yuan, JY Zhu - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Deep generative models have various content creation applications such as graphic design,
e-commerce, and virtual try-on. However, current works mainly focus on synthesizing …

Marker or Markerless? Mode-Switchable Optical Tactile Sensing for Diverse Robot Tasks

N Ou, Z Chen, S Luo - IEEE Robotics and Automation Letters, 2024 - ieeexplore.ieee.org
Optical tactile sensors play a pivotal role in robot perception and manipulation tasks. The
membrane of these sensors can be painted with markers or remain markerless, enabling …

Multimodal zero-shot learning for tactile texture recognition

G Cao, J Jiang, D Bollegala, M Li, S Luo - Robotics and Autonomous …, 2024 - Elsevier
Tactile sensing plays an irreplaceable role in robotic material recognition. It enables robots
to distinguish material properties such as their local geometry and textures, especially for …

A Case Study on Visual-Audio-Tactile Cross-Modal Retrieval

J Jiang, S Luo - The 2024 IEEE/RSJ International Conference on …, 2024 - kclpure.kcl.ac.uk
Cross-Modal Retrieval (CMR), which retrieves relevant items from one modality (e.g., audio)
given a query in another modality (e.g., visual), has undergone significant advancements in …

CM-AVAE: Cross-Modal Adversarial Variational Autoencoder for Visual-to-Tactile Data Generation

Q Xi, F Wang, L Tao, H Zhang… - IEEE Robotics and …, 2024 - ieeexplore.ieee.org
Vibration acceleration signals allow humans to perceive the surface characteristics of
textures during tool-surface interactions. However, acquiring acceleration signals requires a …

Low-Cost Teleoperation with Haptic Feedback through Vision-based Tactile Sensors for Rigid and Soft Object Manipulation

M Lippi, MC Welle, MK Wozniak, A Gasparri… - arXiv preprint arXiv …, 2024 - arxiv.org
Haptic feedback is essential for humans to successfully perform complex and delicate
manipulation tasks. A recent rise in tactile sensors has enabled robots to leverage the sense …

A Case Study on Visual-Audio-Tactile Cross-Modal Retrieval

J Wojcik, J Jiang, J Wu, S Luo - arXiv preprint arXiv:2407.20709, 2024 - arxiv.org
Cross-Modal Retrieval (CMR), which retrieves relevant items from one modality (e.g., audio)
given a query in another modality (e.g., visual), has undergone significant advancements in …