Learning with differentiable perturbed optimizers

Q Berthet, M Blondel, O Teboul… - Advances in neural …, 2020 - proceedings.neurips.cc
Abstract Machine learning pipelines often rely on optimization procedures to make discrete
decisions (e.g., sorting, picking closest neighbors, or shortest paths). Although these discrete …
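The core idea of this line of work is to replace a discrete argmax by its expectation under random perturbations of the input, which yields a smooth, differentiable surrogate. A minimal Monte Carlo sketch (illustrative only; `sigma` and the Gaussian noise choice are assumptions, not the paper's exact estimator):

```python
import numpy as np

def perturbed_argmax(theta, sigma=0.5, n_samples=2000, rng=None):
    """Monte Carlo estimate of E[argmax(theta + sigma * Z)], Z ~ N(0, I).

    Each perturbed argmax is a one-hot vertex; averaging the vertices over
    random perturbations smooths the discrete argmax into a probability
    vector that varies continuously with theta (sigma controls smoothing).
    """
    rng = rng or np.random.default_rng(0)
    theta = np.asarray(theta, dtype=float)
    Z = rng.standard_normal((n_samples, theta.size))
    winners = np.argmax(theta + sigma * Z, axis=1)  # discrete solutions
    onehot = np.eye(theta.size)[winners]            # embed as vertices
    return onehot.mean(axis=0)                      # smoothed expectation

probs = perturbed_argmax([2.0, 1.0, 0.0])
# probs is a probability vector concentrated on index 0
```

Increasing `sigma` spreads the mass over competing solutions; as `sigma` shrinks, the estimate collapses back to the hard one-hot argmax.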

Adaptive perturbation-based gradient estimation for discrete latent variable models

P Minervini, L Franceschi, M Niepert - Proceedings of the AAAI …, 2023 - ojs.aaai.org
The integration of discrete algorithmic components in deep learning architectures has
numerous applications. Recently, Implicit Maximum Likelihood Estimation, a class of …

High dimensional inference with random maximum a-posteriori perturbations

T Hazan, F Orabona, AD Sarwate… - IEEE Transactions …, 2019 - ieeexplore.ieee.org
This paper presents a new approach, called perturb-max, for high-dimensional statistical
inference in graphical models that is based on applying random perturbations followed by …
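The perturb-and-MAP principle underlying perturb-max rests on the Gumbel-max trick: adding i.i.d. Gumbel noise to every configuration's log-potential and taking the MAP yields an exact Gibbs sample. A toy sketch of that exact (low-dimensional) case, with the caveat that the paper's contribution is approximating it in high dimensions with per-variable perturbations rather than one noise term per joint configuration:

```python
import numpy as np

def perturb_and_map_sample(log_potentials, n_samples, rng=None):
    """Exact Gibbs sampling via the Gumbel-max trick.

    For each draw, perturb every configuration's log-potential with i.i.d.
    Gumbel(0, 1) noise and return the argmax (the MAP of the perturbed
    model). The samples follow p(x) proportional to exp(phi(x)).
    """
    rng = rng or np.random.default_rng(0)
    phi = np.asarray(log_potentials, dtype=float)
    gumbel = rng.gumbel(size=(n_samples, phi.size))
    return np.argmax(phi + gumbel, axis=1)

samples = perturb_and_map_sample(np.log([0.7, 0.2, 0.1]), n_samples=5000)
freq = np.bincount(samples, minlength=3) / samples.size
# empirical frequencies approximate [0.7, 0.2, 0.1]
```

Enumerating all configurations is intractable for graphical models with many variables, which is exactly the regime the perturb-max analysis targets.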

Marginal weighted maximum log-likelihood for efficient learning of perturb-and-map models

T Shpakova, F Bach, A Osokin - arXiv preprint arXiv:1811.08725, 2018 - arxiv.org
We consider the structured-output prediction problem through probabilistic approaches and
generalize the "perturb-and-MAP" framework to more challenging weighted Hamming …

On the parameter learning for Perturb-and-MAP models

T Shpakova - 2019 - theses.hal.science
Probabilistic graphical models encode hidden dependencies between random variables for
data modelling. Parameter estimation is a crucial part of handling such probabilistic models …

[PDF][PDF] Variable clamping for optimization-based inference

J Zhao, J Djolonga… - NIPS Workshop on …, 2016 - approximateinference.org
While central to the application of probabilistic models to discrete data, the problem of
marginal inference is in general intractable and efficient approximation schemes need to …

High Dimensional Inference with Random Maximum A-Posteriori Perturbations

S Maji, TS Jaakkola - 2019 - dspace.mit.edu
This paper presents a new approach, called perturb-max, for high-dimensional statistical
inference in graphical models that is based on applying random perturbations followed by …

[PDF][PDF] Improving Optimization-Based Approximate Inference by Clamping Variables.

J Zhao, J Djolonga, S Tschiatschek, A Krause - UAI, 2017 - tschiatschek.net
While central to the application of probabilistic models to discrete data, the problem of
marginal inference is in general intractable and efficient approximation schemes need to …

[CITATION][C] High Dimensional Inference with Random Maximum A-Posteriori Perturbations

T Jaakkola - arXiv preprint arXiv:1602.03571, 2016