Recently, event-based vision sensors have gained attention for autonomous driving applications, as conventional RGB cameras face limitations in handling challenging …
Z Liu, J Wu, G Shi, W Yang, J Ma - Pattern Recognition, 2025 - Elsevier
Integrating cameras suffer motion blur during relative object displacement, degrading image quality and the performance of image-based algorithms. Event cameras …
J Hu, C Guo, Y Luo, Z Mao - Information Fusion, 2025 - Elsevier
Objectives: We explore an unsupervised learning-based model that can take advantage of a single image and events to estimate dense, time-continuous optical flow …
Visual sensors are not only capturing higher-quality images but have also steadily increased their ability to process data on their own, on-chip. Yet the …
H Han, W Zhai, Y Cao, B Li, Z Zha - arXiv preprint arXiv:2412.01300, 2024 - arxiv.org
Tracking Any Point (TAP) plays a crucial role in motion analysis. Video-based approaches rely on iterative local matching for tracking, but they assume linear motion during the blind …
Y Sekikawa, J Nagata - … of the IEEE/CVF Conference on …, 2023 - openaccess.thecvf.com
Optical flow estimation is a fundamental functionality in computer vision. An event-based camera, which asynchronously detects sparse intensity changes, is an ideal device for …
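Several of the snippets above describe event cameras as asynchronously reporting sparse, per-pixel intensity changes rather than full frames. As a minimal sketch of that data model (the function name and event tuple layout are illustrative assumptions, not taken from any of the cited papers), such events can be accumulated into a signed frame for downstream processing:

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate (x, y, t, polarity) events into a signed 2D frame:
    +1 per brightness-increase event, -1 per brightness-decrease event."""
    frame = np.zeros((height, width), dtype=np.int32)
    for x, y, t, p in events:
        frame[y, x] += 1 if p > 0 else -1
    return frame

# Hypothetical event stream: (x, y, timestamp, polarity)
events = [(1, 0, 0.01, +1), (1, 0, 0.02, +1), (2, 3, 0.03, -1)]
frame = events_to_frame(events, height=4, width=4)
print(frame[0, 1])  # two positive events at (x=1, y=0) -> 2
print(frame[3, 2])  # one negative event at (x=2, y=3) -> -1
```

Real event-based pipelines (e.g. for optical flow or point tracking) typically use richer representations such as time surfaces or voxel grids, but the asynchronous, polarity-signed event tuple is the common starting point.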