Predictive state recurrent neural networks

C Downey, A Hefny, B Boots… - Advances in Neural Information Processing Systems, 2017 - proceedings.neurips.cc
Abstract
We present a new model, Predictive State Recurrent Neural Networks (PSRNNs), for filtering and prediction in dynamical systems. PSRNNs draw on insights from both Recurrent Neural Networks (RNNs) and Predictive State Representations (PSRs), and inherit advantages from both types of models. Like many successful RNN architectures, PSRNNs use (potentially deeply composed) bilinear transfer functions to combine information from multiple sources. We show that such bilinear functions arise naturally from state updates in Bayes filters like PSRs, in which observations can be viewed as gating belief states. We also show that PSRNNs can be learned effectively by combining Backpropagation Through Time (BPTT) with an initialization derived from a statistically consistent learning algorithm for PSRs called two-stage regression (2SR). Finally, we show that PSRNNs can be factorized using tensor decomposition, reducing model size and suggesting interesting connections to existing multiplicative architectures such as LSTMs and GRUs. We apply PSRNNs to 4 datasets, and show that we outperform several popular alternative approaches to modeling dynamical systems in all cases.
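To make the "bilinear state update with observations gating the belief state" concrete, here is a minimal sketch of a single PSRNN-style filtering layer. It is an illustrative assumption of how such a cell could look (the class name, shapes, and normalization details are hypothetical, not the authors' reference implementation): the predictive state q is updated by contracting a 3-mode tensor W with the current state and an embedded observation, then normalizing.

```python
import numpy as np

class BilinearPSRNNCell:
    """Hypothetical sketch of one PSRNN-style layer (assumed shapes/names).

    State update: q_{t+1} = normalize( W x_2 q_t x_3 phi(o_t) ),
    i.e. a bilinear combination of the current predictive state and the
    observation features, matching the "observations gate belief states"
    view described in the abstract.
    """

    def __init__(self, state_dim, obs_dim, rng=None):
        rng = np.random.default_rng(rng)
        # 3-mode tensor: (next state) x (current state) x (observation features).
        # In the paper this would be initialized via two-stage regression (2SR)
        # and refined with BPTT; here it is random for illustration only.
        self.W = rng.normal(scale=0.1, size=(state_dim, state_dim, obs_dim))
        self.q0 = np.ones(state_dim) / np.sqrt(state_dim)  # initial predictive state

    def step(self, q, obs_feat):
        # Bilinear contraction: combine state and observation multiplicatively.
        q_next = np.einsum('ijk,j,k->i', self.W, q, obs_feat)
        return q_next / (np.linalg.norm(q_next) + 1e-8)  # divisive normalization

    def filter(self, obs_seq):
        # Bayes-filter-style recursion over a sequence of observation features.
        q = self.q0
        states = []
        for o in obs_seq:
            q = self.step(q, o)
            states.append(q)
        return np.stack(states)


if __name__ == "__main__":
    cell = BilinearPSRNNCell(state_dim=8, obs_dim=5, rng=0)
    obs = np.random.default_rng(1).normal(size=(10, 5))  # toy observation features
    print(cell.filter(obs).shape)  # (10, 8)
```

The factorized variant mentioned in the abstract would replace the full 3-mode tensor W with a low-rank (e.g. CP) decomposition, which is what yields the reduced model size and the multiplicative-gating structure reminiscent of LSTMs and GRUs.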