RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

T Lai, L Cheng, D Wang, H Ye, W Zhang - Applied Intelligence, 2022 - Springer
Abstract
The task of extracting entities and relations has evolved from distributed extraction to joint extraction. The joint model overcomes the disadvantages of the distributed extraction method and strengthens the information interaction between entities and relations. However, existing joint models rarely attend to the semantic information between words, which limits their ability to resolve overlapping relations. In this paper, we propose the RMAN model for joint extraction of entities and relations, which includes a multi-feature fusion encoder for sentence representation and a decoder for sequence annotation. We first add a multi-head attention layer after a Bi-LSTM to obtain sentence representations, and leverage the attention mechanism to capture relation-based sentence representations. Then, we perform sequence annotation on the sentence representations to obtain entity pairs. Experiments on the NYT-single, NYT-multi, and WebNLG datasets demonstrate that our model can efficiently extract overlapping triples and outperforms other baselines.
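The encoder step described above — a multi-head attention layer applied to Bi-LSTM hidden states — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Bi-LSTM output is simulated with a random matrix, the projection weights are random stand-ins for learned parameters, and all dimensions are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(H, num_heads, rng):
    """Apply multi-head scaled dot-product self-attention to H.

    H: (seq_len, d_model) array, e.g. Bi-LSTM hidden states.
    Returns an array of the same shape.
    """
    seq_len, d_model = H.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    # random projections stand in for the learned Q/K/V/output weights
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    heads = []
    for h in range(num_heads):
        sl = slice(h * d_k, (h + 1) * d_k)
        # scaled dot-product attention for one head
        scores = Q[:, sl] @ K[:, sl].T / np.sqrt(d_k)
        heads.append(softmax(scores) @ V[:, sl])
    # concatenate heads and apply the output projection
    return np.concatenate(heads, axis=-1) @ Wo

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))  # stand-in for Bi-LSTM output: 5 tokens, d_model=8
out = multi_head_self_attention(H, num_heads=2, rng=rng)
print(out.shape)  # the sentence representation keeps the (seq_len, d_model) shape
```

In the model described by the abstract, the resulting relation-aware sentence representation would then be fed to the decoder for sequence annotation; that stage is omitted here.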