Coattention based BiLSTM for answer selection

L Zhang, L Ma - 2017 IEEE International Conference on Information and Automation, 2017 - ieeexplore.ieee.org
Attention-based recurrent neural networks have achieved great success in answer selection, an important subtask of question answering (QA). However, previous work used a fixed representation of the question to compute the attention information for answers, which fails to capture the influence that the answers have on the question. In our work, we propose a coattention-based bidirectional LSTM for answer selection. Distinct from previous attention-based work, we use a coattention encoder to capture the interactions between the question and the answers, which generates different question representations according to the answers. We carry out large-scale experiments on the WikiQA dataset. Our proposed model achieves a good MAP (0.7148) and MRR (0.7289) on the test set and outperforms many strong baselines.
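To make the coattention idea in the abstract concrete (conditioning the question representation on each candidate answer rather than keeping it fixed), here is a minimal PyTorch sketch. The module names, dimensions, shared encoder, affinity-matrix formulation, max-pooling, and cosine-similarity scoring are all assumptions for illustration; this is not the authors' implementation.

```python
# Hypothetical sketch of coattention over BiLSTM encodings of a
# question/answer pair; details are assumed, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoattentionBiLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=150):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Shared BiLSTM encoder for both question and answer tokens.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)

    def encode(self, tokens):
        out, _ = self.bilstm(self.embed(tokens))     # (batch, len, 2*hidden)
        return out

    def forward(self, question, answer):
        Q = self.encode(question)                    # (batch, n_q, d)
        A = self.encode(answer)                      # (batch, n_a, d)
        # Affinity between every question position and every answer position.
        affinity = torch.bmm(Q, A.transpose(1, 2))   # (batch, n_q, n_a)
        # Question attends over the answer, so the question representation
        # changes with each candidate answer; and vice versa.
        q_given_a = torch.bmm(F.softmax(affinity, dim=2), A)                   # (batch, n_q, d)
        a_given_q = torch.bmm(F.softmax(affinity, dim=1).transpose(1, 2), Q)   # (batch, n_a, d)
        # Pool to fixed-size vectors and score the pair by cosine similarity.
        q_vec = q_given_a.max(dim=1).values
        a_vec = a_given_q.max(dim=1).values
        return F.cosine_similarity(q_vec, a_vec, dim=1)
```

In an answer-selection setting such as WikiQA, a model like this would score each candidate answer against the question and rank candidates by the returned similarity, with MAP and MRR computed over the ranked lists.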