Bayesian neural networks (BNNs) provide a formalism to quantify and calibrate uncertainty in deep learning. Current inference approaches for BNNs often resort to few-sample …
Y Addad, A Lechervy, F Jurie - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
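The snippet above mentions that BNN inference often resorts to few-sample approximations without showing what that looks like. As a rough illustration only (not the cited paper's method), the sketch below averages softmax outputs over a handful of stochastic forward passes, assuming MC dropout as the approximate posterior; the model, dimensions, and sample count are all illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy classifier with dropout; keeping dropout active at test time
# (MC dropout) is one common way to draw approximate posterior samples.
class MCDropoutNet(nn.Module):
    def __init__(self, in_dim=16, hidden=64, n_classes=3, p=0.2):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.drop = nn.Dropout(p)
        self.fc2 = nn.Linear(hidden, n_classes)

    def forward(self, x):
        return self.fc2(self.drop(F.relu(self.fc1(x))))

@torch.no_grad()
def few_sample_predictive(model, x, n_samples=5):
    """Average softmax outputs over a few stochastic forward passes."""
    model.train()  # keep dropout masks stochastic at test time
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return probs.mean(dim=0)  # Monte Carlo estimate of the predictive distribution

model = MCDropoutNet()
x = torch.randn(4, 16)
print(few_sample_predictive(model, x, n_samples=5))
```

With only a few samples the estimate is noisy, which is exactly the trade-off between cost and calibration that such few-sample approaches navigate.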
In this paper, we propose a test-time resource-efficient neural architecture for image classification. Building on MSDNet [12], our multi-exit architecture excels in both anytime …
J Yu, L Zhang, D Cheng, W Huang… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Early-exiting has recently provided an ideal solution for accelerating activity inference by attaching internal classifiers to deep neural networks. It allows easy activity samples to be …
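The mechanism this snippet describes, internal classifiers attached to intermediate stages so that easy samples can stop early, can be sketched as follows. The backbone, exit heads, and confidence threshold are hypothetical and not taken from the cited work; for clarity the loop handles a single input rather than per-sample exits within a batch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical backbone split into stages, each followed by an internal
# classifier ("exit head"); names, sizes, and threshold are illustrative.
class EarlyExitNet(nn.Module):
    def __init__(self, in_dim=32, hidden=64, n_classes=10, n_stages=3):
        super().__init__()
        dims = [in_dim] + [hidden] * n_stages
        self.stages = nn.ModuleList(
            [nn.Sequential(nn.Linear(dims[i], dims[i + 1]), nn.ReLU())
             for i in range(n_stages)]
        )
        self.exits = nn.ModuleList(
            [nn.Linear(hidden, n_classes) for _ in range(n_stages)]
        )

    @torch.no_grad()
    def adaptive_forward(self, x, threshold=0.9):
        """Return the first exit whose max softmax confidence clears the threshold."""
        h = x
        for i, (stage, exit_head) in enumerate(zip(self.stages, self.exits)):
            h = stage(h)
            probs = F.softmax(exit_head(h), dim=-1)
            last_stage = i == len(self.stages) - 1
            if probs.max().item() >= threshold or last_stage:
                return probs, i  # "easy" inputs stop at an early stage

model = EarlyExitNet()
probs, exit_idx = model.adaptive_forward(torch.randn(1, 32))
print(exit_idx, probs.argmax(dim=-1))
```

Lowering the threshold lets more samples leave at shallow exits, trading accuracy for latency; raising it pushes harder samples through the full network.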
The rising interest in Bayesian deep learning (BDL) has led to a plethora of methods for estimating the posterior distribution. However, efficient computation of inferences, such as …
Early-exit neural networks (EENNs) facilitate adaptive inference by producing predictions at multiple stages of the forward pass. In safety-critical applications, these predictions are only …
Vision-language models (VLMs), such as CLIP and SigLIP, have found remarkable success in classification, retrieval, and generative tasks. For this, VLMs deterministically map images …
Due to the vast testing space, the increasing demand for effective and efficient testing of deep neural networks (DNNs) has led to the development of various DNN test case …
Scaling machine learning models significantly improves their performance. However, such gains come at the cost of inference being slow and resource-intensive. Early-exit neural …
Distinguishing epistemic from aleatoric uncertainty is a central idea to out-of-distribution (OOD) detection. By interpreting adversarial and OOD inputs from this perspective, we can …
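The snippet does not say how the two kinds of uncertainty are separated. One standard decomposition, shown below over the predictions of an ensemble (or posterior samples), takes the entropy of the averaged prediction as total uncertainty, the average per-member entropy as the aleatoric part, and their difference (the mutual information) as the epistemic part. The function name and toy numbers are illustrative, not from the cited work.

```python
import torch

def uncertainty_decomposition(member_probs, eps=1e-12):
    """Split predictive uncertainty into aleatoric and epistemic parts.

    member_probs: tensor of shape (n_members, n_classes), each row a softmax output.
    total     = entropy of the mean prediction
    aleatoric = mean of the per-member entropies
    epistemic = total - aleatoric  (mutual information)
    """
    mean_p = member_probs.mean(dim=0)
    total = -(mean_p * (mean_p + eps).log()).sum()
    aleatoric = -(member_probs * (member_probs + eps).log()).sum(dim=-1).mean()
    return total, aleatoric, total - aleatoric

# Members that agree -> low epistemic term; members that disagree -> high epistemic term.
agree = torch.tensor([[0.90, 0.05, 0.05], [0.88, 0.07, 0.05]])
disagree = torch.tensor([[0.90, 0.05, 0.05], [0.05, 0.90, 0.05]])
print(uncertainty_decomposition(agree)[2], uncertainty_decomposition(disagree)[2])
```

Inputs on which the members disagree (a large epistemic term) are natural candidates for OOD or adversarial flags, while ambiguity that all members agree on shows up in the aleatoric term.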