Learning from future: A novel self-training framework for semantic segmentation

Y Du, Y Shen, H Wang, J Fei, W Li… - Advances in …, 2022 - proceedings.neurips.cc
Self-training has shown great potential in semi-supervised learning. Its core idea is to use
the model learned on labeled data to generate pseudo-labels for unlabeled samples, and in
turn teach itself. To obtain valid supervision, active attempts typically employ a momentum
teacher for pseudo-label prediction yet observe the confirmation bias issue, where the
incorrect predictions may provide wrong supervision signals and get accumulated in the
training process. The primary cause of such a drawback is that the prevailing self-training …
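The abstract describes the baseline that the paper improves upon: classic self-training where a momentum (EMA) teacher produces pseudo-labels for unlabeled images and the student trains on them. Below is a minimal sketch of that baseline, not the paper's proposed FST method; all function names, thresholds, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch of momentum-teacher self-training for segmentation
# (the baseline critiqued in the abstract), written in PyTorch. Names and
# values are assumptions, not the paper's implementation.
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, momentum=0.999):
    # Momentum teacher: exponential moving average of the student's weights.
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(momentum).add_(s_param, alpha=1.0 - momentum)

def self_training_step(student, teacher, optimizer,
                       labeled_images, labels, unlabeled_images,
                       conf_threshold=0.95):
    # Supervised loss on labeled data (255 used as the ignore index).
    sup_logits = student(labeled_images)                     # (B, C, H, W)
    loss = F.cross_entropy(sup_logits, labels, ignore_index=255)

    # Teacher predicts pseudo-labels for unlabeled data; low-confidence
    # pixels are masked out. Confident but wrong pseudo-labels are the
    # source of the confirmation bias mentioned in the abstract.
    with torch.no_grad():
        probs = torch.softmax(teacher(unlabeled_images), dim=1)
        conf, pseudo = probs.max(dim=1)
        pseudo[conf < conf_threshold] = 255

    unsup_logits = student(unlabeled_images)
    loss = loss + F.cross_entropy(unsup_logits, pseudo, ignore_index=255)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    ema_update(teacher, student)
    return loss.item()
```

In a typical setup the teacher starts as a copy of the student (e.g. `copy.deepcopy(student)`) with gradients disabled, and is only updated through the EMA rule above, which is what the abstract refers to as guiding the current state with previous knowledge.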

Learning from Future: A Novel Self-Training Framework for Semantic Segmentation (Supplementary Material)

Y Du, Y Shen, H Wang, J Fei, W Li, L Wu, R Zhao, Z Fu… - proceedings.neurips.cc
The supplementary material is organized as follows. Sec. B gives more dataset and
implementation details. Sec. C provides more ablation studies of our FST, including an
ablation on SYNTHIA → Cityscapes and an evaluation of various segmentation decoders. Sec.
D and Sec. E present more comparisons of our FST with state-of-the-art methods on both
UDA and SSL benchmarks. Sec. F analyzes the training process of our method and shows
more visualization comparisons with classical self-training. Sec. G discusses the social …