How auto-encoders could provide credit assignment in deep networks via target propagation

Y Bengio - arXiv preprint arXiv:1407.7906, 2014 - arxiv.org
We propose to exploit reconstruction as a layer-local training signal for deep learning.
Reconstructions can be propagated in a form of target propagation playing a role similar to …
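The abstract is truncated here, but the mechanism it describes is that each layer receives a target computed through the reconstruction (decoder) path rather than a back-propagated gradient. As a rough sketch of that idea in my own notation (f_l an encoder, g_{l+1} the decoder paired with the layer above; these symbols are not taken from the paper):

    \[
    \hat{h}_l = g_{l+1}(\hat{h}_{l+1}), \qquad
    \mathcal{L}_l(\theta_l) = \big\lVert f_l(h_{l-1}) - \hat{h}_l \big\rVert^2 ,
    \]

so layer l is trained locally to hit a target propagated downward through the auto-encoders' reconstruction mappings.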

GSNs: generative stochastic networks

G Alain, Y Bengio, L Yao, J Yosinski… - … and Inference: A …, 2016 - academic.oup.com
We introduce a novel training principle for generative probabilistic models that is an
alternative to maximum likelihood. The proposed Generative Stochastic Networks (GSNs) …
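As background on the training principle: rather than maximizing the data likelihood directly, a model of this kind learns a reconstruction distribution that, combined with a corruption process, defines the transition operator of a Markov chain. A minimal, hypothetical Python sketch of one such training term (corrupt and reconstruct_logprob are placeholder callables, not the authors' code):

    def denoising_training_loss(x, corrupt, reconstruct_logprob, rng):
        """One stochastic training term for a GSN/denoising-style model:
        corrupt the example, then score the clean example under the model's
        reconstruction distribution P(x | x_tilde)."""
        x_tilde = corrupt(x, rng)                 # sample the corruption C(x_tilde | x)
        return -reconstruct_logprob(x, x_tilde)   # negative log P(x | x_tilde)

The appeal of the principle, as argued in this line of work, is that the reconstruction distribution is typically far simpler than the data distribution itself, which makes it easier to learn than a full maximum-likelihood objective.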

General stochastic networks for classification

M Zöhrer, F Pernkopf - Advances in Neural Information …, 2014 - proceedings.neurips.cc
We extend generative stochastic networks to supervised learning of representations. In
particular, we introduce a hybrid training objective considering a generative and …
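The snippet is cut off, but a hybrid objective of this kind generically weights a generative (reconstruction-style) cost against a supervised one; purely as an assumed illustration of the form, not the paper's exact cost:

    \[
    \mathcal{C} = (1 - \alpha)\,\mathcal{C}_{\text{gen}} + \alpha\,\mathcal{C}_{\text{sup}},
    \qquad \alpha \in [0, 1],
    \]

with \alpha trading off how much the learned representation is shaped by reconstruction versus by the supervised task.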

Representation learning for single-channel source separation and bandwidth extension

M Zöhrer, R Peharz, F Pernkopf - IEEE/ACM Transactions on …, 2015 - ieeexplore.ieee.org
In this paper, we use deep representation learning for model-based single-channel source
separation (SCSS) and artificial bandwidth extension (ABE). Both tasks are ill-posed and …

Generative class-conditional autoencoders

J Rudy, G Taylor - arXiv preprint arXiv:1412.7009, 2014 - arxiv.org
Recent work by Bengio et al. (2013) proposes a sampling procedure for denoising
autoencoders which involves learning the transition operator of a Markov chain. The …
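The sampling procedure referred to alternates corruption and reconstruction, so that the learned reconstruction distribution acts as the transition operator of a Markov chain; in the class-conditional setting one additionally conditions on a label. A minimal, hypothetical sketch (corrupt and reconstruct_sample are placeholder callables; y is the optional class label):

    def sample_dae_chain(x0, corrupt, reconstruct_sample, rng, y=None, steps=200):
        """Run the Markov chain whose transition is: corrupt the current state,
        then sample from the model's reconstruction distribution."""
        x, samples = x0, []
        for _ in range(steps):
            x_tilde = corrupt(x, rng)                # C(x_tilde | x)
            x = reconstruct_sample(x_tilde, y, rng)  # P(x | x_tilde) or P(x | x_tilde, y)
            samples.append(x)
        return samples

Under suitable conditions the chain's stationary distribution approximates the data distribution, or the class-conditional distribution when y is held fixed.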

Single channel source separation with general stochastic networks

M Zöhrer, F Pernkopf - Interspeech, 2014 - isca-archive.org
Single channel source separation (SCSS) is ill-posed and thus challenging. In this paper,
we apply general stochastic networks (GSNs), a deep neural network architecture, to SCSS …

Connectionist multivariate density-estimation and its application to speech synthesis

B Uria - 2016 - era.ed.ac.uk
Autoregressive models factorize a multivariate joint probability distribution into a product of
one-dimensional conditional distributions. The variables are assigned an ordering, and the …
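Concretely, for variables x_1, ..., x_D under an ordering o, the factorization in question is the chain rule of probability:

    \[
    p(\mathbf{x}) = \prod_{d=1}^{D} p\!\left(x_{o_d} \mid x_{o_1}, \ldots, x_{o_{d-1}}\right),
    \]

so the joint density is modelled entirely by one-dimensional conditionals, each conditioned on the variables that precede it in the ordering.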

Application of Deep Architecture in Bioinformatics

S Sen, R Das, S Dasgupta, U Maulik - Deep Learning Techniques for …, 2020 - Springer
Recent discoveries in the field of biology have transformed it into a data-rich domain. This
has invited multiple machine learning applications and, in particular, deep learning, a set of …

Auto-Encoders, Distributed Training and Information Representation in Deep Neural Networks

G Alain - 2019 - papyrus.bib.umontreal.ca
The goal of this thesis is to present a body of work that serves as my modest contribution to
humanity's quest to understand intelligence and to implement intelligent systems. This is a …