All one needs to know about metaverse: A complete survey on technological singularity, virtual ecosystem, and research agenda

LH Lee, T Braud, PY Zhou, L Wang… - … and trends® in …, 2024 - nowpublishers.com
Since the popularisation of the Internet in the 1990s, cyberspace has kept evolving. We
have created various computer-mediated virtual environments, including social networks …

Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks

L Wang, KJ Yoon - IEEE transactions on pattern analysis and …, 2021 - ieeexplore.ieee.org
In recent years, deep neural models have been successful in almost every field, even
solving the most complex problem statements. However, these models are huge in size with …

Deep dual-resolution networks for real-time and accurate semantic segmentation of traffic scenes

H Pan, Y Hong, W Sun, Y Jia - IEEE Transactions on Intelligent …, 2022 - ieeexplore.ieee.org
Using light-weight architectures or reasoning on low-resolution images, recent methods
realize very fast scene parsing, even running at more than 100 FPS on a single GPU …

Artificial intelligence for the metaverse: A survey

T Huynh-The, QV Pham, XQ Pham, TT Nguyen… - … Applications of Artificial …, 2023 - Elsevier
Along with the massive growth of the Internet from the 1990s until now, various innovative
technologies have been created to bring users breathtaking experiences with more virtual …

Point-to-voxel knowledge distillation for lidar semantic segmentation

Y Hou, X Zhu, Y Ma, CC Loy… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
This article addresses the problem of distilling knowledge from a large teacher model to a
slim student network for LiDAR semantic segmentation. Directly employing previous …

Knowledge distillation from a stronger teacher

T Huang, S You, F Wang, C Qian… - Advances in Neural …, 2022 - proceedings.neurips.cc
Unlike existing knowledge distillation methods that focus on baseline settings, where the
teacher models and training strategies are not as strong and competitive as state-of-the-art …

Knowledge distillation with the reused teacher classifier

D Chen, JP Mei, H Zhang, C Wang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Knowledge distillation aims to compress a powerful yet cumbersome teacher model
into a lightweight student model without much sacrifice of performance. For this purpose …

Cross-image relational knowledge distillation for semantic segmentation

C Yang, H Zhou, Z An, X Jiang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Current Knowledge Distillation (KD) methods for semantic segmentation often
guide the student to mimic the teacher's structured information generated from individual …

Diffusion probabilistic model made slim

X Yang, D Zhou, J Feng… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Despite the visually-pleasing results achieved, the massive computational cost has been a
long-standing flaw for diffusion probabilistic models (DPMs), which, in turn, greatly limits …

Masked generative distillation

Z Yang, Z Li, M Shao, D Shi, Z Yuan, C Yuan - European Conference on …, 2022 - Springer
Knowledge distillation has been applied to various tasks successfully. The current
distillation algorithm usually improves students' performance by imitating the output of the …