Title | Authors | Venue | Cited by | Year
Anticipative feature fusion transformer for multi-modal action anticipation | Z Zhong, D Schneider, M Voit, R Stiefelhagen, J Beyerer | Proceedings of the IEEE/CVF Winter Conference on Applications of Computer …, 2023 | 42 | 2023
A survey on deep learning techniques for action anticipation | Z Zhong, M Martin, M Voit, J Gall, J Beyerer | arXiv preprint arXiv:2309.17257, 2023 | 5 | 2023
DiffAnt: Diffusion models for action anticipation | Z Zhong, C Wu, M Martin, M Voit, J Gall, J Beyerer | arXiv preprint arXiv:2311.15991, 2023 | 3 | 2023
Unsupervised 3D skeleton-based action recognition using cross-attention with conditioned generation capabilities | DJ Lerch, Z Zhong, M Martin, M Voit, J Beyerer | Proceedings of the IEEE/CVF Winter Conference on Applications of Computer …, 2024 | 2 | 2024
Mixed probability models for aleatoric uncertainty estimation in the context of dense stereo matching | Z Zhong, M Mehltretter | ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information …, 2021 | 2 | 2021
QueryMamba: A Mamba-based encoder-decoder architecture with a statistical verb-noun interaction module for video action forecasting @ Ego4D Long-Term Action Anticipation … | Z Zhong, M Martin, F Diederichs, J Beyerer | arXiv preprint arXiv:2407.04184, 2024 | 1 | 2024
Rethinking attention module design for point cloud analysis | C Wu, K Wang, Z Zhong, H Fu, J Zheng, J Zhang, J Pfrommer, J Beyerer | arXiv preprint arXiv:2407.19294, 2024 | | 2024
Activities that correlate with motion sickness in driving cars – an international online survey | F Diederichs, A Herrmanns, D Lerch, Z Zhong, D Piechnik, LA Mathis, ... | International Conference on Human-Computer Interaction, 3-12, 2024 | | 2024
SynthAct: Towards generalizable human action recognition based on synthetic data | D Schneider, M Keller, Z Zhong, K Peng, A Roitberg, J Beyerer, ... | 2024 IEEE International Conference on Robotics and Automation (ICRA), 13038 …, 2024 | | 2024
Long-term action anticipation: A quick survey | Z Zhong | Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and …, 2023 | | 2023