Conformal PID control for time series prediction

A Angelopoulos, E Candes, et al. - Advances in Neural Information Processing Systems, 2024 - proceedings.neurips.cc
We study the problem of uncertainty quantification for time series prediction, with the goal of providing easy-to-use algorithms with formal guarantees.
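As a rough illustration of the online interval updates this line of work builds on (a sketch in that spirit, not the paper's PID algorithm; alpha, eta, and q0 are illustrative choices), a miscoverage-tracking loop might look like:

    import numpy as np

    def online_interval_tracking(y, y_hat, alpha=0.1, eta=0.05, q0=1.0):
        # Track an interval half-width q so that the long-run miscoverage
        # rate drifts toward alpha: widen after a miss, shrink otherwise.
        q = q0
        intervals, errors = [], []
        for t in range(len(y)):
            lo, hi = y_hat[t] - q, y_hat[t] + q
            intervals.append((lo, hi))
            err = float(not (lo <= y[t] <= hi))  # 1 if the interval missed y[t]
            errors.append(err)
            q = max(q + eta * (err - alpha), 0.0)
        return intervals, float(np.mean(errors))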

Accurate uncertainties for deep learning using calibrated regression

V Kuleshov, N Fenner, S Ermon - International Conference on Machine Learning, 2018 - proceedings.mlr.press
Methods for reasoning under uncertainty are a key building block of accurate and reliable machine learning systems. Bayesian methods provide a general framework to quantify uncertainty.

On calibration of modern neural networks

C Guo, G Pleiss, Y Sun, et al. - International Conference on Machine Learning, 2017 - proceedings.mlr.press
Confidence calibration, the problem of predicting probability estimates representative of the true correctness likelihood, is important for classification models in many applications.
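Several papers in this list quantify miscalibration with binned reliability statistics; below is a minimal sketch of an expected-calibration-error check under the usual definition, with the bin count and max-probability confidence as assumed conventions rather than values taken from this paper:

    import numpy as np

    def expected_calibration_error(probs, labels, n_bins=15):
        # Binned ECE: weighted average of |accuracy - confidence| over
        # equal-width confidence bins.
        # probs: (n, k) class probabilities; labels: (n,) true class indices.
        probs = np.asarray(probs, dtype=float)
        labels = np.asarray(labels)
        conf = probs.max(axis=1)                      # confidence = max probability
        correct = (probs.argmax(axis=1) == labels).astype(float)
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        ece = 0.0
        for lo, hi in zip(edges[:-1], edges[1:]):
            in_bin = (conf > lo) & (conf <= hi)
            if in_bin.any():
                ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
        return ece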

Trainable calibration measures for neural networks from kernel mean embeddings

A Kumar, S Sarawagi, U Jain - International Conference on Machine Learning, 2018 - proceedings.mlr.press
Modern neural networks have recently been found to be poorly calibrated, primarily in the direction of over-confidence.

Soft calibration objectives for neural networks

A Karandikar, N Cain, D Tran, et al. - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Optimal decision making requires that classifiers produce uncertainty estimates consistent with their empirical accuracy. However, deep neural networks are often under- or over-confident.

Maximum likelihood with bias-corrected calibration is hard-to-beat at label shift adaptation

A Alexandari, A Kundaje, et al. - International Conference on Machine Learning, 2020 - proceedings.mlr.press
Label shift refers to the phenomenon where the prior class probability p(y) changes between the training and test distributions, while the conditional probability p(x|y) stays the same.
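For context, a minimal sketch of the standard prior-correction rule under label shift is given below; it assumes the target prior is already known, whereas estimating it from unlabeled target data is precisely what adaptation methods such as this one address.

    import numpy as np

    def prior_corrected_probs(probs_src, prior_src, prior_tgt):
        # Bayes-rule correction under label shift:
        #   p_tgt(y|x) is proportional to p_src(y|x) * p_tgt(y) / p_src(y),
        # valid because p(x|y) is assumed unchanged.
        probs_src = np.asarray(probs_src, dtype=float)          # shape (n, k)
        w = np.asarray(prior_tgt, dtype=float) / np.asarray(prior_src, dtype=float)
        adjusted = probs_src * w                                 # reweight each class column
        return adjusted / adjusted.sum(axis=1, keepdims=True)    # renormalize per example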

Non-parametric calibration for classification

J Wenger, H Kjellström, et al. - International Conference on Artificial Intelligence and Statistics, 2020 - proceedings.mlr.press
Many applications of classification methods not only require high accuracy but also reliable estimation of predictive uncertainty.

DOCTOR: A simple method for detecting misclassification errors

F Granese, M Romanelli, D Gorla, et al. - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
Deep neural networks (DNNs) have been shown to perform very well on large-scale object recognition problems, leading to widespread use in real-world applications.

Calibrated and sharp uncertainties in deep learning via density estimation

V Kuleshov, S Deshpande - International Conference on Machine Learning, 2022 - proceedings.mlr.press
Accurate probabilistic predictions can be characterized by two properties: calibration and sharpness. However, standard maximum likelihood training yields models that are poorly calibrated.
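As a rough illustration of these two properties for a model that outputs predictive quantiles (a sketch under assumed conventions, not this paper's exact diagnostics), calibration can be read off as agreement between nominal and empirical coverage, and sharpness as interval width:

    import numpy as np

    def calibration_and_sharpness(quantile_fn, y, levels=np.linspace(0.05, 0.95, 19)):
        # quantile_fn(p) returns the model's predicted p-quantile for each target,
        # shape (n,). Calibration: nominal level vs. empirical fraction of targets
        # below the predicted quantile. Sharpness: mean width of the central 90%
        # interval (an illustrative choice).
        y = np.asarray(y, dtype=float)
        empirical = np.array([np.mean(y <= quantile_fn(p)) for p in levels])
        calibration_error = float(np.mean((empirical - levels) ** 2))
        sharpness = float(np.mean(quantile_fn(0.95) - quantile_fn(0.05)))
        return calibration_error, sharpness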

Calibrated prediction with covariate shift via unsupervised domain adaptation

S Park, O Bastani, J Weimer, et al. - International Conference on Artificial Intelligence and Statistics, 2020 - proceedings.mlr.press
Reliable uncertainty estimates are an important tool for helping autonomous agents or human decision makers understand and leverage predictive models.
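One common way to assess calibration under covariate shift, sketched below under assumptions (the importance weights are taken as given; this paper's contribution is estimating them with unsupervised domain adaptation, which is not shown), is to importance-weight the usual binned calibration error:

    import numpy as np

    def importance_weighted_ece(probs, labels, weights, n_bins=15):
        # ECE with per-example importance weights w(x) approximating
        # p_target(x) / p_source(x), so calibration is assessed under the
        # target covariate distribution.
        probs = np.asarray(probs, dtype=float)
        w = np.asarray(weights, dtype=float)
        conf = probs.max(axis=1)
        correct = (probs.argmax(axis=1) == np.asarray(labels)).astype(float)
        edges = np.linspace(0.0, 1.0, n_bins + 1)
        ece, total = 0.0, w.sum()
        for lo, hi in zip(edges[:-1], edges[1:]):
            in_bin = (conf > lo) & (conf <= hi)
            if in_bin.any():
                wb = w[in_bin]
                acc = np.average(correct[in_bin], weights=wb)
                avg_conf = np.average(conf[in_bin], weights=wb)
                ece += (wb.sum() / total) * abs(acc - avg_conf)
        return ece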