The technology disclosed proposes using a combination of a computationally cheap, less-accurate bag-of-words (BoW) model and a computationally expensive, more-accurate long …
J Bradbury - US Patent 10,565,318, 2020 - Google Patents
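The cheap-model/expensive-model combination described above can be sketched as a confidence-gated cascade: answer with the bag-of-words model when it is confident, and fall back to the slower model otherwise. This is a minimal illustrative sketch, not the patented implementation; all names (`cheap_bow_score`, `expensive_model`, `CONF_THRESHOLD`) and the toy word lists are assumptions.

```python
# Illustrative sketch (hypothetical names): a two-stage cascade that
# answers with a cheap bag-of-words classifier when it is confident,
# and falls back to an expensive model otherwise.
from collections import Counter

POSITIVE = {"good", "great", "excellent"}
NEGATIVE = {"bad", "awful", "terrible"}
CONF_THRESHOLD = 0.75  # route to the expensive model below this


def cheap_bow_score(text):
    """Return (label, confidence) from simple word counts."""
    words = Counter(text.lower().split())
    pos = sum(words[w] for w in POSITIVE)
    neg = sum(words[w] for w in NEGATIVE)
    total = pos + neg
    if total == 0:
        return "neutral", 0.0
    label = "positive" if pos >= neg else "negative"
    return label, max(pos, neg) / total


def expensive_model(text):
    """Stand-in for a slow, accurate model (e.g. an LSTM); toy rule only."""
    return "negative" if "not" in text.lower().split() else "positive"


def classify(text):
    """Use the cheap model when confident; otherwise escalate."""
    label, conf = cheap_bow_score(text)
    if conf >= CONF_THRESHOLD:
        return label, "bow"
    return expensive_model(text), "expensive"
```

The design point is the routing rule: only inputs the cheap model cannot decide pay the cost of the expensive model.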
We introduce an attentional neural model for the task of machine translation that accomplishes the longstanding goal of natural language processing to take …
A computer-implemented method for dual sequence inference using a neural network model includes generating a codependent representation based on a first input representation of a …
A method for sequence-to-sequence prediction using a neural network model includes generating an encoded representation based on an input sequence using an encoder of the …
LU Jiasen, C Xiong, R Socher - US Patent 10,558,750, 2020 - Google Patents
The technology disclosed presents a novel spatial attention model that uses current hidden state information of a decoder long short-term memory (LSTM) to guide attention and to …
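The idea of using the decoder LSTM's current hidden state to guide attention can be sketched as follows: the hidden state scores each spatial location of an image feature map, and a softmax over those scores yields attention weights used to pool a context vector. The shapes and the dot-product scoring function here are assumptions for illustration, not the patented design.

```python
# Illustrative sketch: decoder-hidden-state-guided spatial attention.
# Scoring by dot product is an assumption; the patent's formulation
# may differ.
import numpy as np


def spatial_attention(features, hidden):
    """features: (num_locations, d) feature map; hidden: (d,) decoder state."""
    scores = features @ hidden                       # one score per location
    scores -= scores.max()                           # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over locations
    context = weights @ features                     # weighted spatial pool
    return context, weights


rng = np.random.default_rng(0)
feats = rng.normal(size=(49, 8))   # e.g. a 7x7 feature map, flattened
h = rng.normal(size=8)             # current decoder hidden state
ctx, w = spatial_attention(feats, h)
```

Because the weights depend on the decoder's current state, each generated word can attend to a different region of the image.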
Y Zhou, C Xiong - US Patent 10,573,295, 2020 - Google Patents
The disclosed technology teaches a deep end-to-end speech recognition model, including using multi-objective learning criteria to train a deep end-to-end speech recognition model …
C Xiong, SHU Tianmin, R Socher - US Patent 11,562,287, 2023 - Google Patents
US11562287B2 - Hierarchical and interpretable skill acquisition in multi-task reinforcement learning - Google Patents
Systems and methods for dense captioning of a video include a multi-layer encoder stack configured to receive information extracted from a plurality of video frames, a proposal …
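A multi-layer encoder stack over per-frame features, as the snippet describes, can be sketched minimally: each layer transforms the frame representations and feeds the next. The layer internals below (a residual linear map with ReLU) are a toy stand-in, purely illustrative of stacking, not the claimed architecture.

```python
# Minimal sketch of a multi-layer encoder stack over per-frame video
# features. Layer internals are a toy (residual linear + ReLU).
import numpy as np


def encoder_layer(x, w):
    """x: (frames, d) features; w: (d, d) layer weights. Residual + ReLU."""
    return x + np.maximum(x @ w, 0.0)


def encoder_stack(frames, weights):
    """Apply each layer in turn; `weights` is a list of (d, d) matrices."""
    x = frames
    for w in weights:
        x = encoder_layer(x, w)
    return x


rng = np.random.default_rng(1)
frames = rng.normal(size=(16, 4))                    # 16 frames, 4-d features
stack = [rng.normal(size=(4, 4)) * 0.1 for _ in range(3)]  # 3 toy layers
encoded = encoder_stack(frames, stack)
```

The stack preserves the per-frame layout, so downstream modules (e.g. a proposal module) can still index individual frames.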