H Lee, SW Li, NT Vu - arXiv preprint arXiv:2205.01500, 2022 - arxiv.org
Deep learning has been the mainstream technique in the natural language processing (NLP) area. However, these techniques require large amounts of labeled data and are less generalizable across …
Given a long list of anomaly detection algorithms developed in the last few decades, how do they perform with regard to (i) varying levels of supervision, (ii) different types of anomalies …
N Karim, MN Rizve, N Rahnavard… - Proceedings of the …, 2022 - openaccess.thecvf.com
Supervised deep learning methods require a large repository of annotated data; hence, label noise is inevitable. Training with such noisy data negatively impacts the generalization …
R Liu, F Bai, Y Du, Y Yang - Advances in Neural …, 2022 - proceedings.neurips.cc
Setting up a well-designed reward function has been challenging for many reinforcement learning applications. Preference-based reinforcement learning (PbRL) …
S Choe, SV Mehta, H Ahn… - Advances in neural …, 2024 - proceedings.neurips.cc
Despite its flexibility to learn diverse inductive biases in machine learning programs, meta-learning (i.e., "learning to learn") has long been recognized to suffer from poor scalability due …
Graph classification is a fundamental problem with diverse applications in bioinformatics and chemistry. Due to the intricate procedures of manual annotations in graphical domains …
Learning with noisy labels (LNL) aims to ensure model generalization given a label-corrupted training set. In this work, we investigate a rarely studied scenario of LNL on fine …
YC Yu, HT Lin - Proceedings of the IEEE/CVF Conference …, 2023 - openaccess.thecvf.com
Semi-Supervised Domain Adaptation (SSDA) involves learning to classify unseen target data with a few labeled and lots of unlabeled target data, along with many labeled …
Few-shot learning (FSL) methods typically assume clean support sets with accurately labeled samples when training on novel classes. This assumption can often be unrealistic …