Bayesian model ensembling using meta-trained recurrent neural networks

L Ambrogioni, Y Berezutskaya, U Güçlü… - 2017 - repository.ubn.ru.nl
In this paper we demonstrate that a recurrent neural network meta-trained on an ensemble of arbitrary classification tasks can be used as an approximation of the Bayes optimal classifier. This result is obtained by relying on the framework of ε-free approximate Bayesian inference, in which the Bayesian posterior is approximated by training a neural network on synthetic samples. We call the resulting model the neural ensembler. We show that a single neural ensembler trained on a large set of synthetic data achieves competitive classification performance on multiple real-world classification problems without additional training.
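To make the idea concrete, the sketch below illustrates the general recipe the abstract describes: meta-train a recurrent network across many synthetic classification tasks so that, after reading a labelled support sequence, it predicts the label of a query point. The architecture (a GRU), the task distribution (random linear boundaries), and all names and hyper-parameters are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the paper's code): meta-training an RNN
# over synthetic classification tasks so it behaves like an amortized
# Bayes-optimal classifier for the task distribution it was trained on.
import torch
import torch.nn as nn

def sample_task(n_support=20, dim=2):
    """Draw one synthetic binary task: labels from a random linear boundary."""
    w = torch.randn(dim)
    x = torch.randn(n_support + 1, dim)   # last point acts as the query
    y = (x @ w > 0).long()
    return x, y

class NeuralEnsembler(nn.Module):
    """GRU that reads (x, one-hot y) pairs and predicts the query label."""
    def __init__(self, dim=2, n_classes=2, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(dim + n_classes, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)
        self.n_classes = n_classes

    def forward(self, x, y):
        y_onehot = nn.functional.one_hot(y, self.n_classes).float()
        y_onehot[-1].zero_()               # hide the query label from the model
        seq = torch.cat([x, y_onehot], dim=-1).unsqueeze(0)
        h, _ = self.rnn(seq)
        return self.head(h[0, -1])         # logits for the query point

model = NeuralEnsembler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):                   # meta-training: one synthetic task per step
    x, y = sample_task()
    logits = model(x, y)
    loss = nn.functional.cross_entropy(logits.unsqueeze(0), y[-1:])
    opt.zero_grad(); loss.backward(); opt.step()
```

After meta-training, the same network can be applied to a new labelled dataset without further gradient updates: the support examples are fed through the recurrent state and the query prediction is read off the final output, which is the "no additional training" usage the abstract claims.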