FOSTER: Feature boosting and compression for class-incremental learning

FY Wang, DW Zhou, HJ Ye, DC Zhan - European Conference on Computer …, 2022 - Springer
The ability to learn new concepts continually is necessary in this ever-changing world.
However, deep neural networks suffer from catastrophic forgetting when learning new …

A survey on long-tailed visual recognition

L Yang, H Jiang, Q Song, J Guo - International Journal of Computer Vision, 2022 - Springer
The heavy reliance on data is one of the major factors currently limiting the development
of deep learning. Data quality directly determines the effectiveness of deep learning models, and the …

Introspective distillation for robust question answering

Y Niu, H Zhang - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Question answering (QA) models are well known to exploit data bias, e.g., the language prior
in visual QA and the position bias in reading comprehension. Recent debiasing methods …

Constructing balance from imbalance for long-tailed image recognition

Y Xu, YL Li, J Li, C Lu - European Conference on Computer Vision, 2022 - Springer
Long-tailed image recognition presents massive challenges to deep learning systems since
the imbalance between majority (head) classes and minority (tail) classes severely skews …
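
To make the head/tail imbalance described above concrete, here is a minimal sketch of class-balanced resampling using PyTorch's WeightedRandomSampler. This is a generic long-tailed baseline, not the balance-construction method proposed in the cited paper, and the `labels` argument is an assumed list of integer class ids for the training set.

```python
from collections import Counter

import torch
from torch.utils.data import WeightedRandomSampler

def balanced_sampler(labels):
    # Count how many training examples each class has.
    counts = Counter(labels)
    # Sample each example with probability inversely proportional to its
    # class size, so head and tail classes are drawn roughly equally often.
    sample_weights = torch.tensor([1.0 / counts[y] for y in labels], dtype=torch.double)
    return WeightedRandomSampler(sample_weights, num_samples=len(labels), replacement=True)
```

The returned sampler would then be passed to a DataLoader via its sampler argument in place of shuffling.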

Robust distillation for worst-class performance: on the interplay between teacher and student objectives

S Wang, H Narasimhan, Y Zhou… - Uncertainty in …, 2023 - proceedings.mlr.press
Knowledge distillation is a popular technique that can produce remarkable gains in
average accuracy. However, recent work has shown that these gains …
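
As background for this distillation setting, a minimal sketch of the standard Hinton-style knowledge-distillation loss follows. It illustrates the average-accuracy-oriented objective such work starts from, not the robust worst-class variant studied in the cited paper; the temperature T and mixing weight alpha are illustrative choices.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-scaled distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```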

Deep hierarchical distillation proxy-oil modeling for heterogeneous carbonate reservoirs

G Cirac, J Farfan, GD Avansi, DJ Schiozer… - … Applications of Artificial …, 2023 - Elsevier
This paper presents a novel few-shot proxy modeling approach for the oil and gas industry
to reduce reliance on numerical simulators for reservoir analysis. The strategy introduces a …

Gradient reweighting: Towards imbalanced class-incremental learning

J He - Proceedings of the IEEE/CVF Conference on …, 2024 - openaccess.thecvf.com
Class-Incremental Learning (CIL) trains a model to continually recognize new
classes from non-stationary data while retaining learned knowledge. A major challenge of …
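
To illustrate the imbalance issue in this setting, below is a minimal sketch of inverse-frequency loss reweighting, a common baseline for imbalanced classification. It is not the gradient-reweighting scheme proposed in the cited paper; `class_counts` is an assumed per-class sample count for the classes seen so far.

```python
import torch
import torch.nn.functional as F

def reweighted_cross_entropy(logits, labels, class_counts):
    # Per-class weights inversely proportional to class frequency,
    # normalized so the weights sum to the number of classes.
    counts = torch.as_tensor(class_counts, dtype=torch.float)
    weights = (1.0 / counts) * (len(counts) / (1.0 / counts).sum())
    return F.cross_entropy(logits, labels, weight=weights.to(logits.device))
```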

NICEST: Noisy label correction and training for robust scene graph generation

L Li, J Xiao, H Shi, H Zhang, Y Yang… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Nearly all existing scene graph generation (SGG) models have overlooked the ground-truth
annotation qualities of mainstream SGG datasets, i.e., they assume: 1) all the manually …

PKDN: Prior Knowledge Distillation Network for bronchoscopy diagnosis

P Yan, W Sun, X Li, M Li, Y Jiang, H Luo - Computers in Biology and …, 2023 - Elsevier
Bronchoscopy plays a crucial role in diagnosing and treating lung diseases. The deep
learning-based diagnostic system for bronchoscopic images can assist physicians in …

FoPro-KD: Fourier prompted effective knowledge distillation for long-tailed medical image recognition

M Elbatel, R Martí, X Li - IEEE Transactions on Medical Imaging, 2023 - ieeexplore.ieee.org
Representational transfer from publicly available models is a promising technique for
improving medical image classification, especially in long-tailed datasets with rare diseases …