Cross-domain aspect extraction using transformers augmented with knowledge graphs

P. Howard, A. Ma, V. Lal, A. P. Simoes, D. Korat, O. Pereg, M. Wasserblat, G. Singer
Proceedings of the 31st ACM International Conference on Information and Knowledge Management (CIKM), 2022 - dl.acm.org
The extraction of aspect terms is a critical step in fine-grained sentiment analysis of text. Existing approaches for this task have yielded impressive results when the training and testing data are from the same domain. However, these methods show a drastic decrease in performance when applied to cross-domain settings where the domain of the testing data differs from that of the training data. To address this lack of extensibility and robustness, we propose a novel approach for automatically constructing domain-specific knowledge graphs that contain information relevant to the identification of aspect terms. We introduce a methodology for injecting information from these knowledge graphs into Transformer models, including two alternative mechanisms for knowledge insertion: via query enrichment and via manipulation of attention patterns. We demonstrate state-of-the-art performance on benchmark datasets for cross-domain aspect term extraction using our approach and investigate how the amount of external knowledge available to the Transformer impacts model performance.
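The abstract mentions two knowledge-insertion mechanisms, one of which is query enrichment. The sketch below is a minimal, assumed illustration of that general idea: retrieving related terms from a small domain knowledge graph and appending them to the model input before tokenization. The toy graph, the `[KG]` separator, and the enrichment format are placeholders for illustration and do not reflect the paper's actual graph construction or injection procedure.

```python
# Minimal sketch of knowledge insertion via query enrichment (assumed form):
# neighbors of matched tokens in a toy domain knowledge graph are appended
# to the input sentence before it is passed to a Transformer encoder.

from typing import Dict, List

# Toy domain knowledge graph: surface terms mapped to related aspect candidates.
RESTAURANT_KG: Dict[str, List[str]] = {
    "pasta": ["food", "dish", "menu"],
    "waiter": ["service", "staff"],
    "ambience": ["atmosphere", "decor"],
}

def enrich_query(sentence: str, kg: Dict[str, List[str]], max_terms: int = 5) -> str:
    """Append knowledge-graph neighbors of matched tokens to the input sentence."""
    related: List[str] = []
    for token in sentence.lower().split():
        for neighbor in kg.get(token.strip(".,!?"), []):
            if neighbor not in related:
                related.append(neighbor)
    # The [KG] separator is a placeholder; a real model would use its own special tokens.
    if not related:
        return sentence
    return f"{sentence} [KG] " + " ".join(related[:max_terms])

if __name__ == "__main__":
    print(enrich_query("The pasta was great but the waiter was rude.", RESTAURANT_KG))
    # -> "The pasta was great but the waiter was rude. [KG] food dish menu service staff"
```

The enriched string would then be tokenized and fed to the Transformer in place of the raw sentence; the alternative mechanism named in the abstract, manipulating attention patterns, instead biases which positions attend to the injected knowledge rather than concatenating it to the input.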