Many machine learning algorithms are trained and evaluated by splitting data from a single source into training and test sets. While such focus on in-distribution learning scenarios has …
J Lee, E Kim, J Lee, J Lee… - Advances in Neural …, 2021 - proceedings.neurips.cc
Image classification models tend to make decisions based on peripheral attributes of data items that have a strong correlation with a target variable (i.e., dataset bias). These biased …
E Kim, J Lee, J Choo - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Deep neural networks often make decisions based on the spurious correlations inherent in the dataset, failing to generalize to an unbiased data distribution. Although previous …
Y Xiao, Z Tang, P Wei, C Liu… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
Deep learning models are challenged by the distribution shift between training data and test data. Recently, large models pre-trained on diverse data have demonstrated …
Approaches based on deep neural networks have achieved striking performance when test data and training data share a similar distribution, but can fail significantly otherwise …
We are interested in learning data-driven representations that can generalize well, even when trained on inherently biased data. In particular, we face the case where some …
Neural networks are prone to being biased towards spurious correlations between classes and latent attributes exhibited in a major portion of training data, which ruins their generalization …
Despite their impressive prediction ability, machine learning models show discrimination towards certain demographics and suffer from unfair prediction behavior. To alleviate the …
Active learning (AL) aims to minimize labeling efforts for data-demanding deep neural networks (DNNs) by selecting the most representative data points for annotation. However …