A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-supervised learning, have yielded promising performance on various tasks in …

Back to common sense: Oxford dictionary descriptive knowledge augmentation for aspect-based sentiment analysis

W Jin, B Zhao, L Zhang, C Liu, H Yu - Information Processing & …, 2023 - Elsevier
Aspect-based Sentiment Analysis (ABSA) is a crucial natural language
understanding (NLU) research field which aims to accurately recognize reviewers' opinions …

MCL-NER: Cross-Lingual Named Entity Recognition via Multi-View Contrastive Learning

Y Mo, J Yang, J Liu, Q Wang, R Chen… - Proceedings of the AAAI …, 2024 - ojs.aaai.org
Cross-lingual named entity recognition (CrossNER) faces challenges stemming from
uneven performance due to the scarcity of multilingual corpora, especially for non-English …

Knowledgeable parameter efficient tuning network for commonsense question answering

Z Zhao, L Hu, H Zhao, Y Shao… - Proceedings of the 61st …, 2023 - aclanthology.org
Commonsense question answering is important for making decisions about everyday
matters. Although existing commonsense question answering works based on fully fine …

A multi-task semantic decomposition framework with task-specific pre-training for few-shot NER

G Dong, Z Wang, J Zhao, G Zhao, D Guo, D Fu… - Proceedings of the …, 2023 - dl.acm.org
The objective of few-shot named entity recognition is to identify named entities with limited
labeled instances. Previous works have primarily focused on optimizing the traditional token …

Knowledge prompting in pre-trained language model for natural language understanding

J Wang, W Huang, Q Shi, H Wang, M Qiu, X Li… - arXiv preprint arXiv …, 2022 - arxiv.org
Knowledge-enhanced Pre-trained Language Models (PLMs), which aim to incorporate factual knowledge into PLMs, have recently received significant
attention. However, most existing …

Distinguish before answer: Generating contrastive explanation as knowledge for commonsense question answering

Q Chen, G Xu, M Yan, J Zhang, F Huang, L Si… - arXiv preprint arXiv …, 2023 - arxiv.org
Existing knowledge-enhanced methods have achieved remarkable results in certain QA
tasks via obtaining diverse knowledge from different knowledge bases. However, limited by …

3D-EX: A unified dataset of definitions and dictionary examples

F Almeman, H Sheikhi, L Espinosa-Anke - arXiv preprint arXiv:2308.03043, 2023 - arxiv.org
Definitions are a fundamental building block in lexicography, linguistics and computational
semantics. In NLP, they have been used for retrofitting word embeddings or augmenting …

DG Embeddings: The unsupervised definition embeddings learned from dictionary and glossary to gloss context words of Cloze task

X Liu, R Rzepka, K Araki - Knowledge-Based Systems, 2024 - Elsevier
For both humans and machines to acquire vocabulary, it is effective to learn words from
context while using dictionaries as an auxiliary tool. It has been shown in previous linguistic …

Rethinking dictionaries and glyphs for Chinese language pre-training

Y Wang, J Wang, D Zhao, Z Zheng - Findings of the Association …, 2023 - aclanthology.org
We introduce CDBert, a new learning paradigm that enhances the semantic understanding
ability of Chinese PLMs with dictionary knowledge and the structure of Chinese characters …