A survey on deep learning for named entity recognition

J Li, A Sun, J Han, C Li - IEEE transactions on knowledge and …, 2020 - ieeexplore.ieee.org
Named entity recognition (NER) is the task of identifying mentions of rigid designators in text
belonging to predefined semantic types such as person, location, organization, etc. NER …
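
A minimal illustration of the task defined above, using the common BIO (Begin/Inside/Outside) tagging scheme; the sentence and labels are hypothetical toy data, not material from the survey.

    # Each token receives a label from a predefined set of semantic types.
    tokens = ["Barack", "Obama", "visited", "Paris", "in", "2015"]
    labels = ["B-PER", "I-PER", "O", "B-LOC", "O", "O"]  # PER = person, LOC = location

    for token, label in zip(tokens, labels):
        print(f"{token}\t{label}")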

Named entity recognition and relation extraction: State-of-the-art

Z Nasar, SW Jaffry, MK Malik - ACM Computing Surveys (CSUR), 2021 - dl.acm.org
With the advent of Web 2.0, many online platforms now produce massive amounts of textual
data. With ever-increasing textual data at hand, it is of immense importance to …

FLAT: Chinese NER using flat-lattice transformer

X Li, H Yan, X Qiu, X Huang - arXiv preprint arXiv:2004.11795, 2020 - arxiv.org
Recently, the character-word lattice structure has proven effective for Chinese
named entity recognition (NER) because it incorporates word information. However, since the …
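
A sketch of the core idea under stated assumptions (the toy sentence and lexicon are hypothetical, and the model itself is omitted): the lattice of characters and lexicon-matched words is flattened into one sequence of spans, each carrying head and tail character positions that the transformer can turn into relative position encodings.

    # Start with one span per character: (text, head, tail).
    sentence = "南京市长江大桥"  # "Nanjing City Yangtze River Bridge"
    lexicon = {"南京", "南京市", "市长", "长江", "大桥", "长江大桥"}
    spans = [(ch, i, i) for i, ch in enumerate(sentence)]

    # Append every multi-character lexicon word matching a contiguous span.
    n = len(sentence)
    for i in range(n):
        for j in range(i + 2, n + 1):
            word = sentence[i:j]
            if word in lexicon:
                spans.append((word, i, j - 1))

    for text, head, tail in spans:
        print(text, head, tail)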

How to fine-tune BERT for text classification?

C Sun, X Qiu, Y Xu, X Huang - … : 18th China national conference, CCL 2019 …, 2019 - Springer
Language model pre-training has proven to be useful in learning universal
language representations. As a state-of-the-art pre-trained language model, BERT …
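
A minimal sketch of the standard fine-tuning recipe the paper studies, written against the Hugging Face transformers API (a tooling assumption; the paper does not prescribe this library), with hypothetical toy data:

    import torch
    from transformers import BertTokenizerFast, BertForSequenceClassification

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

    texts = ["a great movie", "a dull movie"]  # hypothetical toy batch
    labels = torch.tensor([1, 0])

    inputs = tokenizer(texts, padding=True, truncation=True,
                       return_tensors="pt")
    model.train()
    optimizer.zero_grad()
    loss = model(**inputs, labels=labels).loss  # cross-entropy over [CLS]
    loss.backward()
    optimizer.step()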

Contextual string embeddings for sequence labeling

A Akbik, D Blythe, R Vollgraf - Proceedings of the 27th …, 2018 - aclanthology.org
Recent advances in language modeling using recurrent neural networks have made it
viable to model language as distributions over characters. By learning to predict the next …
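
A minimal usage sketch via the flair library released by these authors (version-specific details are assumptions): forward and backward character-level language models are stacked to produce one contextual embedding per token.

    from flair.data import Sentence
    from flair.embeddings import FlairEmbeddings, StackedEmbeddings

    embeddings = StackedEmbeddings([
        FlairEmbeddings("news-forward"),   # forward character LM
        FlairEmbeddings("news-backward"),  # backward character LM
    ])

    sentence = Sentence("George Washington went to Washington .")
    embeddings.embed(sentence)  # attaches one vector per token, in place

    for token in sentence:
        print(token.text, tuple(token.embedding.shape))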

Universal language model fine-tuning for text classification

J Howard, S Ruder - arXiv preprint arXiv:1801.06146, 2018 - arxiv.org
Inductive transfer learning has greatly impacted computer vision, but existing approaches in
NLP still require task-specific modifications and training from scratch. We propose Universal …
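
One of the paper's concrete techniques, discriminative fine-tuning, assigns each layer its own learning rate, decayed by a factor of 2.6 from the top layer down. A minimal PyTorch sketch with a hypothetical stand-in model:

    import torch
    import torch.nn as nn

    model = nn.Sequential(  # stand-in for the LSTM language model's layers
        nn.Linear(32, 32), nn.Linear(32, 32), nn.Linear(32, 2))

    base_lr, decay = 1e-3, 2.6  # eta^(l-1) = eta^l / 2.6, per the paper
    layers = list(model.children())
    param_groups = [
        {"params": layer.parameters(),
         "lr": base_lr / decay ** (len(layers) - 1 - i)}
        for i, layer in enumerate(layers)
    ]
    optimizer = torch.optim.SGD(param_groups, lr=base_lr)
    print([group["lr"] for group in optimizer.param_groups])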

Exploiting BERT for end-to-end aspect-based sentiment analysis

X Li, L Bing, W Zhang, W Lam - arXiv preprint arXiv:1910.00883, 2019 - arxiv.org
In this paper, we investigate the modeling power of contextualized embeddings from
pre-trained language models, e.g., BERT, on the E2E-ABSA task. Specifically, we build a series of …
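
A minimal sketch of the overall design: a token-level classification head on top of a pre-trained BERT encoder, predicting unified aspect-sentiment tags. The simple linear head stands in for the several downstream layers the paper compares; the tag set and sentence are illustrative.

    import torch
    import torch.nn as nn
    from transformers import BertModel, BertTokenizerFast

    tag_set = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    bert = BertModel.from_pretrained("bert-base-uncased")
    head = nn.Linear(bert.config.hidden_size, len(tag_set))

    inputs = tokenizer("The fish was great but the room was dirty",
                       return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**inputs).last_hidden_state  # (1, seq_len, 768)
        logits = head(hidden)                      # (1, seq_len, num_tags)
    print(logits.argmax(-1))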

Chinese NER using lattice LSTM

Y Zhang, J Yang - arXiv preprint arXiv:1805.02023, 2018 - arxiv.org
We investigate a lattice-structured LSTM model for Chinese NER, which encodes a
sequence of input characters as well as all potential words that match a lexicon. Compared …
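
A minimal sketch of the lattice construction the model consumes, reusing the toy sentence from the flat-lattice sketch above: for each character position, collect the lexicon words ending there; the lattice LSTM then gates each word's cell state into the character cell at that position (the gating itself is omitted here).

    sentence = "南京市长江大桥"
    lexicon = {"南京", "南京市", "市长", "长江", "大桥", "长江大桥"}

    words_ending_at = {t: [] for t in range(len(sentence))}
    for i in range(len(sentence)):
        for j in range(i + 2, len(sentence) + 1):  # words of length >= 2
            word = sentence[i:j]
            if word in lexicon:
                words_ending_at[j - 1].append((word, i))

    for t, ch in enumerate(sentence):
        print(t, ch, words_ending_at[t])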

TENER: adapting transformer encoder for named entity recognition

H Yan, B Deng, X Li, X Qiu - arXiv preprint arXiv:1911.04474, 2019 - arxiv.org
Bidirectional long short-term memory networks (BiLSTM) have been widely used as
encoders in models solving the named entity recognition (NER) task. Recently, the …
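
A toy sketch of two of TENER's modifications to vanilla self-attention, un-scaled dot-product scores and a direction-aware relative position term; the simple learned bias here only gestures at the paper's full relative-position formulation.

    import torch
    import torch.nn.functional as F

    seq_len, d_k = 5, 16
    q, k, v = (torch.randn(seq_len, d_k) for _ in range(3))

    # Signed relative distances j - i preserve direction information.
    pos = torch.arange(seq_len)
    rel = pos[None, :] - pos[:, None]        # (seq_len, seq_len)
    rel_bias = torch.randn(2 * seq_len - 1)  # learnable in practice
    bias = rel_bias[rel + seq_len - 1]

    scores = q @ k.T + bias                  # note: no / sqrt(d_k) scaling
    out = F.softmax(scores, dim=-1) @ v
    print(out.shape)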

Natural language processing advancements by deep learning: A survey

A Torfi, RA Shirvani, Y Keneshloo, N Tavaf… - arXiv preprint arXiv …, 2020 - arxiv.org
Natural Language Processing (NLP) helps empower intelligent machines by enabling a
better understanding of human language for linguistic-based human-computer …