Analysis of the hands in egocentric vision: A survey

A Bandini, J Zariffa - IEEE Transactions on Pattern Analysis and …, 2020 - ieeexplore.ieee.org
Egocentric vision (also known as first-person vision, FPV) applications have thrived over the past few
years, thanks to the availability of affordable wearable cameras and large annotated …

Joint hand motion and interaction hotspots prediction from egocentric videos

S Liu, S Tripathi, S Majumdar… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
We propose to forecast future hand-object interactions given an egocentric video. Instead of
predicting action labels or pixels, we directly predict the hand motion trajectory and the …

The EPIC-KITCHENS dataset: Collection, challenges and baselines

D Damen, H Doughty, GM Farinella… - … on Pattern Analysis …, 2020 - ieeexplore.ieee.org
Since its introduction in 2018, EPIC-KITCHENS has attracted attention as the largest
egocentric video benchmark, offering a unique viewpoint on people's interaction with …

Wanderlust: Online continual object detection in the real world

J Wang, X Wang, Y Shang-Guan… - Proceedings of the …, 2021 - openaccess.thecvf.com
Online continual learning from data streams in dynamic environments is a critical direction in
the computer vision field. However, realistic benchmarks and fundamental studies in this line …

Depth sensors-based action recognition using a modified K-ary entropy classifier

M Batool, SS Alotaibi, MH Alatiyyah… - IEEE …, 2023 - ieeexplore.ieee.org
Surveillance systems are attracting ample interest in the field of computer vision. Existing
surveillance systems usually rely on optical or wearable sensors for indoor and outdoor …

Ego-exo: Transferring visual representations from third-person to first-person videos

Y Li, T Nagarajan, B Xiong… - Proceedings of the …, 2021 - openaccess.thecvf.com
We introduce an approach for pre-training egocentric video models using large-scale third-
person video datasets. Learning from purely egocentric data is limited by low dataset scale …

CAPturAR: An augmented reality tool for authoring human-involved context-aware applications

T Wang, X Qian, F He, X Hu, K Huo, Y Cao… - Proceedings of the 33rd …, 2020 - dl.acm.org
Recognition of human behavior plays an important role in context-aware applications.
However, it is still a challenge for end-users to build personalized applications that …

Egocentric vision-based action recognition: A survey

A Núñez-Marcos, G Azkune, I Arganda-Carreras - Neurocomputing, 2022 - Elsevier
The egocentric action recognition (EAR) field has recently grown in popularity due to the
affordable and lightweight wearable cameras available nowadays, such as GoPro and …

Predicting the future from first person (egocentric) vision: A survey

I Rodin, A Furnari, D Mavroeidis… - Computer Vision and …, 2021 - Elsevier
Egocentric videos can convey a wealth of information about how humans perceive the world and
interact with the environment, which can be beneficial for the analysis of human behaviour …

Human action recognition using machine learning in uncontrolled environment

IM Nasir, M Raza, JH Shah, MA Khan… - 2021 1st International …, 2021 - ieeexplore.ieee.org
Video-based Human Action Recognition (HAR) is an active research field in Machine
Learning (ML), and human detection in videos is the most important step in action …