We introduce a novel training principle for generative probabilistic models that is an alternative to maximum likelihood. The proposed Generative Stochastic Networks (GSNs) …
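The abstract is truncated here, but the principle it names can be stated compactly (a sketch of the published GSN formulation, not text recovered from the snippet): rather than maximizing \log p_\theta(x) directly, a GSN learns the transition operator of a Markov chain that alternates a fixed corruption process C with a learned reconstruction distribution P_\theta,

\[ \tilde{x}_t \sim C(\tilde{x} \mid x_t), \qquad x_{t+1} \sim P_\theta(x \mid \tilde{x}_t), \]

so that the stationary distribution of the chain serves as the model's estimator of the data-generating distribution.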
M Zöhrer, F Pernkopf - Advances in Neural Information …, 2014 - proceedings.neurips.cc
We extend generative stochastic networks to supervised learning of representations. In particular, we introduce a hybrid training objective considering a generative and …
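The objective itself is cut off in the snippet; a common shape for such hybrid generative-discriminative objectives (an illustrative form, with the weighting \lambda and the two cost terms as assumptions rather than details recovered from the abstract) is

\[ \mathcal{L}(\theta) = \lambda\, \mathcal{L}_{\mathrm{gen}}(\theta) + (1 - \lambda)\, \mathcal{L}_{\mathrm{disc}}(\theta), \qquad \lambda \in [0, 1], \]

where \mathcal{L}_{\mathrm{gen}} is, e.g., a reconstruction cost on the inputs and \mathcal{L}_{\mathrm{disc}} is, e.g., a cross-entropy cost on the supervised targets.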
In this paper, we use deep representation learning for model-based single-channel source separation (SCSS) and artificial bandwidth extension (ABE). Both tasks are ill-posed and …
J Rudy, G Taylor - arXiv preprint arXiv:1412.7009, 2014 - arxiv.org
Recent work by Bengio et al. (2013) proposes a sampling procedure for denoising autoencoders that involves learning the transition operator of a Markov chain. The …
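A minimal sketch of the sampling procedure this snippet describes, assuming a trained denoiser is available (the `denoise` callable below is a hypothetical placeholder, and using its output as a point estimate rather than sampling from P_\theta(x \mid \tilde{x}) is a simplification):

```python
import numpy as np

def corrupt(x, noise_std=0.5, rng=np.random):
    """Fixed corruption process C(x_tilde | x): additive Gaussian noise."""
    return x + noise_std * rng.randn(*x.shape)

def sample_chain(denoise, x0, n_steps=100):
    """Run the Markov chain whose transition operator is corrupt-then-denoise.

    Each step draws x_tilde ~ C(. | x) and then applies the learned denoiser,
    so successive states approximate samples from the model's distribution.
    """
    x = x0
    samples = []
    for _ in range(n_steps):
        x_tilde = corrupt(x)   # corruption step
        x = denoise(x_tilde)   # learned reconstruction step (point estimate)
        samples.append(x)
    return samples
```

Called as, e.g., `sample_chain(model.predict, x0=np.zeros(784))`, the returned list holds the successive states of the chain.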
Single channel source separation (SCSS) is ill-posed and thus challenging. In this paper, we apply general stochastic networks (GSNs), a deep neural network architecture, to SCSS …
Autoregressive models factorize a multivariate joint probability distribution into a product of one-dimensional conditional distributions. The variables are assigned an ordering, and the …
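The factorization the snippet refers to is the probability chain rule: once the D variables are assigned an ordering,

\[ p(x_1, \dots, x_D) = \prod_{d=1}^{D} p\bigl(x_d \mid x_1, \dots, x_{d-1}\bigr), \]

so each factor is a one-dimensional conditional that can be modeled by, e.g., a neural network conditioned on the preceding variables.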
S Sen, R Das, S Dasgupta, U Maulik - Deep Learning Techniques for …, 2020 - Springer
Recent discoveries in the field of biology have transformed it into a data-rich domain. This has invited multiple machine learning applications and, in particular, deep learning, a set of …
The goal of this thesis is to present a body of work that serves as my modest contribution to humanity's quest to understand intelligence and to implement intelligent systems. This is a …