Visual tuning

BXB Yu, J Chang, H Wang, L Liu, S Wang… - ACM Computing …, 2023 - dl.acm.org
Fine-tuning visual models has shown promising performance on many
downstream visual tasks. With the remarkable development of pre-trained visual foundation …

Generalized focal loss: Towards efficient representation learning for dense object detection

X Li, C Lv, W Wang, G Li, L Yang… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Object detection is a fundamental computer vision task that simultaneously predicts the
category and localization of the targets of interest. Recently, one-stage (also termed “dense”) …
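For context, the focal loss that GFL generalizes concentrates training on hard examples by down-weighting well-classified ones. A minimal sketch of the original binary focal loss with its usual defaults (the `focal_loss` helper and the example values are illustrative, not from the paper):

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss for a single prediction.

    p     : predicted probability of the positive class
    y     : ground-truth label (1 = positive, 0 = negative)
    gamma : focusing parameter; larger values down-weight easy examples more
    alpha : class-balance weight for the positive class
    """
    pt = p if y == 1 else 1.0 - p          # probability assigned to the true class
    at = alpha if y == 1 else 1.0 - alpha  # class-balance term
    return -at * (1.0 - pt) ** gamma * math.log(pt)

# Easy, well-classified examples contribute far less than hard ones,
# so dense detectors are not swamped by the many easy negatives:
easy = focal_loss(0.95, 1)  # confident, correct prediction
hard = focal_loss(0.10, 1)  # confident, wrong prediction
```

With `gamma=0` and `alpha=0.5` the expression reduces to (half of) the ordinary cross-entropy, which makes the role of the focusing term easy to see.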

When object detection meets knowledge distillation: A survey

Z Li, P Xu, X Chang, L Yang, Y Zhang… - … on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Object detection (OD) is a crucial computer vision task that has seen the development of
many algorithms and models over the years. While the performance of current OD models …

Mutual-assistance learning for object detection

X Xie, C Lang, S Miao, G Cheng, K Li… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Object detection is a fundamental yet challenging task in computer vision. Despite the great
strides made over recent years, modern detectors may still produce unsatisfactory …

BEVDistill: Cross-modal BEV distillation for multi-view 3D object detection

Z Chen, Z Li, S Zhang, L Fang, Q Jiang… - arXiv preprint arXiv …, 2022 - arxiv.org
3D object detection from multiple image views is a fundamental and challenging task for
visual scene understanding. Owing to its low cost and high efficiency, multi-view 3D object …

PKD: General distillation framework for object detectors via Pearson correlation coefficient

W Cao, Y Zhang, J Gao, A Cheng… - Advances in Neural …, 2022 - proceedings.neurips.cc
Knowledge distillation (KD) is a widely used technique to train compact models in
object detection. However, there is still a lack of study on how to distill between …
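The mechanism named in the title, the Pearson correlation coefficient, yields a feature-imitation loss that is invariant to the scale and shift of activations, so the student need only match the teacher's feature *pattern*. A minimal sketch of one such correlation-based loss (an illustrative reading of the idea, not the paper's exact formulation):

```python
import math

def pearson_distill_loss(student_feat, teacher_feat):
    """Feature-imitation loss based on the Pearson correlation coefficient.

    Both inputs are flattened lists of activations. Each is standardized
    (zero mean, unit variance) and the loss is 1 - r, so perfectly
    correlated features give zero loss regardless of their magnitude.
    """
    def standardize(x):
        n = len(x)
        mu = sum(x) / n
        var = sum((v - mu) ** 2 for v in x) / n
        return [(v - mu) / math.sqrt(var + 1e-8) for v in x]

    s = standardize(student_feat)
    t = standardize(teacher_feat)
    r = sum(a * b for a, b in zip(s, t)) / len(s)  # Pearson r
    return 1.0 - r

# Scale- and shift-invariance: a linearly rescaled copy of the teacher
# feature incurs (near-)zero loss, unlike a plain MSE on raw features.
teacher = [0.2, 1.5, -0.7, 3.0]
student = [2.0 * v + 5.0 for v in teacher]
loss = pearson_distill_loss(student, teacher)
```

Perfectly anti-correlated features give the maximum loss of 2, so the loss is bounded as well as normalization-free.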

ScaleKD: Distilling scale-aware knowledge in small object detector

Y Zhu, Q Zhou, N Liu, Z Xu, Z Ou… - Proceedings of the …, 2023 - openaccess.thecvf.com
Despite the prominent success of general object detection, the performance and efficiency of
Small Object Detection (SOD) are still unsatisfactory. Unlike existing works that struggle to …

Teacher-student architecture for knowledge distillation: A survey

C Hu, X Li, D Liu, H Wu, X Chen, J Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Although deep neural networks (DNNs) have shown a strong capacity to solve large-scale
problems in many areas, such DNNs are hard to deploy in real-world systems due to …
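The teacher-student recipe these surveys examine classically matches the student's temperature-softened class distribution to the teacher's (response-based KD in the sense of Hinton et al.). A minimal sketch with hypothetical helper names:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax (numerically stabilized)."""
    m = max(l / T for l in logits)
    exps = [math.exp(l / T - m) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Classic response-based distillation loss: KL divergence between the
    temperature-softened teacher and student distributions, scaled by T^2
    so gradient magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

teacher_logits = [5.0, 1.0, -2.0]
aligned = kd_loss(teacher_logits, teacher_logits)   # identical outputs
mismatch = kd_loss([0.0, 3.0, -1.0], teacher_logits)
```

A high temperature exposes the teacher's "dark knowledge" (the relative probabilities of wrong classes), which is precisely what a hard one-hot label cannot convey.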

Bridging cross-task protocol inconsistency for distillation in dense object detection

L Yang, X Zhou, X Li, L Qiao, Z Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
Knowledge distillation (KD) has shown potential for learning compact models in
dense object detection. However, the commonly used softmax-based distillation ignores the …

Spatial self-distillation for object detection with inaccurate bounding boxes

D Wu, P Chen, X Yu, G Li, Z Han… - Proceedings of the …, 2023 - openaccess.thecvf.com
Object detection via inaccurate bounding box supervision has attracted broad interest, due
to the expense of high-quality annotation data or the occasional inevitability of low annotation …