We study the problem of uncertainty quantification via prediction sets, in an online setting where the data distribution may vary arbitrarily over time. Recent work develops online …
Conformal prediction (CP) is a framework for quantifying the uncertainty of machine learning classifiers, including deep neural networks. Given a test example and a trained classifier …
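As a concrete illustration of the CP framework described above, here is a minimal sketch of split conformal prediction for classification. The function name, nonconformity score (one minus the softmax probability of the true class), and miscoverage level `alpha` are assumptions for illustration, not taken from the snippet:

```python
import numpy as np

def conformal_prediction_set(cal_scores, cal_labels, test_scores, alpha=0.1):
    """Return a prediction set that covers the true label with
    probability roughly 1 - alpha (split conformal sketch).

    cal_scores:  (n, k) softmax scores on a held-out calibration set
    cal_labels:  (n,)   true labels for the calibration set
    test_scores: (k,)   softmax scores for one test example
    """
    n = len(cal_labels)
    # Nonconformity score: 1 - softmax probability of the true class.
    nonconf = 1.0 - cal_scores[np.arange(n), cal_labels]
    # Conformal quantile with the finite-sample correction (n + 1).
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    qhat = np.quantile(nonconf, q_level, method="higher")
    # Include every class whose nonconformity falls below the threshold.
    return np.where(1.0 - test_scores <= qhat)[0]

# Usage with synthetic calibration data (the "classifier" is assumed
# to put its highest score on the true class).
rng = np.random.default_rng(0)
cal = rng.dirichlet(np.ones(3), size=100)
labels = cal.argmax(axis=1)
pred_set = conformal_prediction_set(cal, labels, np.array([0.7, 0.2, 0.1]))
```

Note that the snippet concerns the online setting with distribution shift, where the exchangeability assumption behind this split-conformal guarantee no longer holds as stated; this sketch shows only the static baseline.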
S Destercke - Conformal and Probabilistic Prediction with …, 2022 - proceedings.mlr.press
Dealing with uncertain data in statistical estimation problems or in machine learning is not really a new issue. However, such uncertainty has so far mostly been modelled either as …
Machine learning for malware classification shows encouraging results, but real deployments suffer from performance degradation as malware authors adapt their …
Selective classification allows models to abstain from making predictions (e.g., say "I don't know") when in doubt, in order to obtain better effective accuracy. While typical selective …
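The abstention mechanism described above can be sketched with a simple softmax-response rule: predict the top class only when the model's confidence clears a threshold, and otherwise abstain. The function name and threshold value are illustrative assumptions, not from the snippet:

```python
import numpy as np

def selective_predict(probs, threshold=0.8):
    """Return the argmax class, or None ("I don't know") when the
    top softmax probability falls below `threshold` (assumed value)."""
    probs = np.asarray(probs, dtype=float)
    if probs.max() < threshold:
        return None  # abstain when the model is unsure
    return int(probs.argmax())

# Confident prediction vs. abstention on an ambiguous input.
confident = selective_predict([0.90, 0.05, 0.05])
unsure = selective_predict([0.40, 0.35, 0.25])
```

Effective accuracy is then measured only on the examples the model chooses to answer, which is the trade-off the snippet refers to.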
Deep learning models achieve high predictive accuracy across a broad spectrum of tasks, but rigorously quantifying their predictive uncertainty remains challenging. Usable estimates …