Toward general-purpose robots via foundation models: A survey and meta-analysis

Y Hu, Q Xie, V Jain, J Francis, J Patrikar… - arXiv preprint arXiv …, 2023 - arxiv.org
Building general-purpose robots that operate seamlessly in any environment, with any
object, and utilizing various skills to complete diverse tasks has been a long-standing goal in …

Technological development and optimization of pushing and grasping functions in robot arms: A review

A Efendi, YH Shao, CY Huang - Measurement, 2024 - Elsevier
Pushing and grasping are fundamental actions in robotic manipulation, applicable across
diverse fields ranging from industrial automation to assistive robotics. Robotic arms face …

Jointly improving parsing and perception for natural language commands through human-robot dialog

J Thomason, A Padmakumar, J Sinapov… - Journal of Artificial …, 2020 - jair.org
In this work, we present methods for using human-robot dialog to improve language
understanding for a mobile robot agent. The agent parses natural language to underlying …

Transferring implicit knowledge of non-visual object properties across heterogeneous robot morphologies

G Tatiya, J Francis, J Sinapov - 2023 IEEE International …, 2023 - ieeexplore.ieee.org
Humans leverage multiple sensor modalities when interacting with objects and discovering
their intrinsic properties. Using the visual modality alone is insufficient for deriving intuition …

Mosaic: Learning unified multi-sensory object property representations for robot learning via interactive perception

G Tatiya, J Francis, HH Wu, Y Bisk… - 2024 IEEE International …, 2024 - ieeexplore.ieee.org
A holistic understanding of object properties across diverse sensory modalities (e.g., visual,
audio, and haptic) is essential for tasks ranging from object categorization to complex …

Cross-tool and cross-behavior perceptual knowledge transfer for grounded object recognition

G Tatiya, J Francis, J Sinapov - arXiv preprint arXiv:2303.04023, 2023 - arxiv.org
Humans learn about objects via interaction and multiple modes of perception, such as vision,
sound, and touch. While vision can provide information about an object's appearance, non …

A framework for sensorimotor cross-perception and cross-behavior knowledge transfer for object categorization

G Tatiya, R Hosseini, MC Hughes… - Frontiers in Robotics and …, 2020 - frontiersin.org
From an early age, humans learn to develop an intuition for the physical nature of the
objects around them by using exploratory behaviors. Such exploration provides …

Haptic knowledge transfer between heterogeneous robots using kernel manifold alignment

G Tatiya, Y Shukla, M Edegware… - 2020 IEEE/RSJ …, 2020 - ieeexplore.ieee.org
Humans learn about object properties using multiple modes of perception. Recent advances
show that robots can use non-visual sensory modalities (i.e., haptic and tactile sensory data) …

CITR: A Coordinate-Invariant Task Representation for Robotic Manipulation

P So, RIC Muchacho, RJ Kirschner… - … on Robotics and …, 2024 - ieeexplore.ieee.org
The basis for robotics skill learning is an adequate representation of manipulation tasks
based on their physical properties. As manipulation tasks are inherently invariant to the …

A framework for multisensory foresight for embodied agents

X Chen, R Hosseini, K Panetta… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
Predicting future sensory states is crucial for learning agents such as robots, drones, and
autonomous vehicles. In this paper, we couple multiple sensory modalities with exploratory …