Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks

L Wang, KJ Yoon - IEEE transactions on pattern analysis and …, 2021 - ieeexplore.ieee.org
Deep neural models have, in recent years, been successful in almost every field, even
solving the most complex problems. However, these models are huge in size, with …

Optimized 3D bioprinting technology based on machine learning: A review of recent trends and advances

J Shin, Y Lee, Z Li, J Hu, SS Park, K Kim - Micromachines, 2022 - mdpi.com
The need for organ transplants has risen, but the number of available organ donations for
transplants has stagnated worldwide. Regenerative medicine has been developed to make …

In defense of pseudo-labeling: An uncertainty-aware pseudo-label selection framework for semi-supervised learning

MN Rizve, K Duarte, YS Rawat, M Shah - arXiv preprint arXiv:2101.06329, 2021 - arxiv.org
The recent research in semi-supervised learning (SSL) is mostly dominated by consistency
regularization based methods which achieve strong performance. However, they heavily …
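The confidence-threshold selection that this line of work builds on can be sketched in a few lines. This is an illustrative baseline only, not the UPS framework itself (which additionally weighs prediction uncertainty); the function name and threshold value are assumptions for the example.

```python
# Illustrative confidence-threshold pseudo-label selection: keep only
# unlabeled samples whose top predicted probability clears a threshold.
# (Baseline scheme only; UPS adds an uncertainty-aware selection step.)

def select_pseudo_labels(probs, threshold=0.9):
    """probs: list of per-sample class-probability lists.
    Returns (sample index, argmax label) pairs whose top
    probability is at least `threshold`."""
    selected = []
    for i, p in enumerate(probs):
        conf = max(p)
        if conf >= threshold:
            selected.append((i, p.index(conf)))
    return selected

probs = [[0.95, 0.05], [0.55, 0.45], [0.10, 0.90]]
print(select_pseudo_labels(probs))  # → [(0, 0), (2, 1)]
```

Only the first and third samples are confident enough to be pseudo-labeled; the ambiguous second sample is discarded rather than risking a noisy label.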

Rethinking pre-training and self-training

B Zoph, G Ghiasi, TY Lin, Y Cui, H Liu… - Advances in neural …, 2020 - proceedings.neurips.cc
Pre-training is a dominant paradigm in computer vision. For example, supervised ImageNet
pre-training is commonly used to initialize the backbones of object detection and …

A survey on semi-supervised learning

JE Van Engelen, HH Hoos - Machine learning, 2020 - Springer
Semi-supervised learning is the branch of machine learning concerned with using labelled
as well as unlabelled data to perform certain learning tasks. Conceptually situated between …

Self-training with noisy student improves imagenet classification

Q Xie, MT Luong, E Hovy… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
We present a simple self-training method that achieves 88.4% top-1 accuracy on ImageNet,
which is 2.0% better than the state-of-the-art model that requires 3.5B weakly labeled …
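The teacher-student loop underlying self-training can be sketched with a toy model. This is a minimal illustration, not the paper's Noisy Student implementation; the centroid "model", margin-based confidence, and threshold are all stand-ins for the example.

```python
# Minimal self-training sketch: a teacher labels unlabeled points,
# confident pseudo-labels are added to the training set, and a
# student is retrained on the union. (Toy 1-D centroid classifier;
# illustrative only, not the Noisy Student method.)

def centroid_fit(points, labels):
    """Toy 'model': mean of the 1-D points in each class."""
    groups = {}
    for x, y in zip(points, labels):
        groups.setdefault(y, []).append(x)
    return {y: sum(xs) / len(xs) for y, xs in groups.items()}

def centroid_predict(centroids, x):
    """Return (label, confidence), where confidence is the margin
    between the nearest and second-nearest centroid distances."""
    dists = sorted((abs(x - c), y) for y, c in centroids.items())
    label = dists[0][1]
    margin = dists[1][0] - dists[0][0] if len(dists) > 1 else float("inf")
    return label, margin

def self_train(labeled_x, labeled_y, unlabeled_x, threshold=1.0):
    teacher = centroid_fit(labeled_x, labeled_y)
    xs, ys = list(labeled_x), list(labeled_y)
    for x in unlabeled_x:
        label, conf = centroid_predict(teacher, x)
        if conf >= threshold:        # keep only confident pseudo-labels
            xs.append(x)
            ys.append(label)
    return centroid_fit(xs, ys)      # student retrained on the union

student = self_train([0.0, 10.0], ["a", "b"], [1.0, 9.0, 5.1])
print(centroid_predict(student, 2.0)[0])  # → "a"
```

The ambiguous point 5.1 is filtered out by the confidence threshold, so the student's centroids shift only toward the confidently labeled points.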

Confidence regularized self-training

Y Zou, Z Yu, X Liu, BVK Kumar… - Proceedings of the …, 2019 - openaccess.thecvf.com
Recent advances in domain adaptation show that deep self-training presents a powerful
means for unsupervised domain adaptation. These methods often involve an iterative …

Comatch: Semi-supervised learning with contrastive graph regularization

J Li, C Xiong, SCH Hoi - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Semi-supervised learning has been an effective paradigm for leveraging unlabeled data to
reduce the reliance on labeled data. We propose CoMatch, a new semi-supervised learning …

Pseudo-labeling and confirmation bias in deep semi-supervised learning

E Arazo, D Ortego, P Albert… - … joint conference on …, 2020 - ieeexplore.ieee.org
Semi-supervised learning, i.e., jointly learning from labeled and unlabeled samples, is an
active research topic due to its key role in relaxing human supervision. In the context of …

Semi-supervised and unsupervised deep visual learning: A survey

Y Chen, M Mancini, X Zhu… - IEEE transactions on …, 2022 - ieeexplore.ieee.org
State-of-the-art deep learning models are often trained with a large amount of costly labeled
training data. However, requiring exhaustive manual annotations may degrade the model's …