3D point-of-intention determination using a multimodal fusion of hand pointing and eye gaze for a 3D display

S Yeamkuan, K Chamnongthai - Sensors, 2021 - mdpi.com
This paper proposes a three-dimensional (3D) point-of-intention (POI) determination method
using multimodal fusion between hand pointing and eye gaze for a 3D virtual display. In the …

Interactive multimodal robot dialog using pointing gesture recognition

S Constantin, FI Eyiokur, D Yaman, L Bärmann… - European conference on …, 2022 - Springer
Pointing gestures are an intuitive and ubiquitous way of human communication and thus
constitute a crucial aspect of human-robot interaction. However, isolated pointing …

Multimodal Error Correction with Natural Language and Pointing Gestures

S Constantin, FI Eyiokur, D Yaman… - Proceedings of the …, 2023 - openaccess.thecvf.com
Error correction is crucial in human-computer interaction, as it can provide supervision for
incrementally learning artificial intelligence. If a system maps entities like objects or persons …

Augmented pointing gesture estimation for human-robot interaction

Z Hu, Y Xu, W Lin, Z Wang, Z Sun - … International Conference on …, 2022 - ieeexplore.ieee.org
With recent advancements in CV (computer vision) and AI (Artificial Intelligence)
technologies, pointing gesture is becoming an emerging trend for human-robot interaction …

A 3D point-of-intention estimation method using multimodal fusion of hand pointing, eye gaze and depth sensing for collaborative robots

S Yeamkuan, K Chamnongthai… - IEEE Sensors …, 2021 - ieeexplore.ieee.org
Hand pointing psychologically expressing intention has been fused with eye gaze to assist
in detecting the point of intention (POI). Ideally, a POI detection approach using a pair of …

Synchronized Colored Petri Net based Multimodal Modeling and Real-time Recognition of Conversational Spatial Deictic Gestures

A Singh, AK Bansal - Science and Information Conference, 2023 - Springer
Gestures are an important part of intelligent human-robot interactions. Co-speech gestures
are a subclass of gestures that integrate speech and dialogs with synchronous combinations …

Neuro-Symbolic Reasoning for Multimodal Referring Expression Comprehension in HMI Systems

A Jain, AR Kondapally, K Yamada… - New Generation …, 2024 - Springer
Conventional Human–Machine Interaction (HMI) interfaces have predominantly
relied on GUI and voice commands. However, natural human communication also consists …

Exophora Resolution of Linguistic Instructions with a Demonstrative based on Real-World Multimodal Information

A Oyama, S Hasegawa, H Nakagawa… - 2023 32nd IEEE …, 2023 - ieeexplore.ieee.org
To enable a robot to provide support in a home environment through human-robot
interaction, exophora resolution is crucial for accurately identifying the target of ambiguous …

Sharing Cognition: Human Gesture and Natural Language Grounding Based Planning and Navigation for Indoor Robots

G Kumar, S Maity, B Bhowmick - arXiv preprint arXiv:2108.06478, 2021 - arxiv.org
Cooperation among humans makes it easy to execute tasks and navigate seamlessly even
in unknown scenarios. With our individual knowledge and collective cognition skills, we can …

Methods and systems for enabling human robot interaction by sharing cognition

S Maity, G Kumar, R Roy Choudhury… - US Patent …, 2023 - Google Patents
2021-02-04: Assigned to Tata Consultancy Services Limited; assignment of assignors interest (see …