which is an important subtask of question answering (QA). However, previous work used a fixed representation of the question to compute attention over the answers, which fails to capture the influence that answers exert on the question. In our work, we propose a coattention-based bidirectional LSTM for answer selection. Distinct from previous attention-based work, we resort to a coattention encoder to capture the interactions between the …