Z Nasar, SW Jaffry, MK Malik - ACM Computing Surveys (CSUR), 2021 - dl.acm.org
With the advent of Web 2.0, there exist many online platforms that result in massive textual data production. With ever-increasing textual data at hand, it is of immense importance to …
Recently, the character-word lattice structure has proven effective for Chinese named entity recognition (NER) by incorporating word information. However, since the …
C Sun, X Qiu, Y Xu, X Huang - … : 18th China National Conference, CCL 2019 …, 2019 - Springer
Language model pre-training has proven useful in learning universal language representations. As a state-of-the-art pre-trained language model, BERT …
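The fine-tuning pattern this entry studies can be sketched in a few lines with the Hugging Face transformers API. This is a minimal sketch only: the checkpoint name, toy texts, labels, and hyperparameters below are illustrative assumptions, not the paper's experimental setup.

```python
# Minimal sketch of fine-tuning BERT for text classification.
# Assumes the Hugging Face `transformers` and `torch` packages;
# model name, data, and hyperparameters are illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

texts = ["the movie was great", "the movie was terrible"]  # toy data
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps on the toy batch
    out = model(**batch, labels=labels)
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```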
A Akbik, D Blythe, R Vollgraf - Proceedings of the 27th …, 2018 - aclanthology.org
Recent advances in language modeling using recurrent neural networks have made it viable to model language as distributions over characters. By learning to predict the next …
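The underlying technique can be sketched as a character-level LSTM trained to predict the next character, whose per-character hidden states then serve as contextual embeddings. The vocabulary, sizes, and training string below are toy assumptions; this is a sketch of the general idea, not the paper's implementation.

```python
# Sketch of a character-level LSTM language model trained to predict
# the next character; the per-character hidden states are the kind of
# representation read out as contextual string embeddings. Toy sizes.
import torch
import torch.nn as nn

text = "recent advances in language modeling"
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}

class CharLM(nn.Module):
    def __init__(self, vocab, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, x):
        h, _ = self.lstm(self.emb(x))   # h: per-character hidden states
        return self.head(h), h          # next-char logits, embeddings

ids = torch.tensor([[stoi[c] for c in text]])
model = CharLM(len(chars))
logits, hidden = model(ids[:, :-1])
loss = nn.functional.cross_entropy(
    logits.reshape(-1, len(chars)), ids[:, 1:].reshape(-1)
)
loss.backward()  # one training step over the toy string
```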
Inductive transfer learning has greatly impacted computer vision, but existing approaches in NLP still require task-specific modifications and training from scratch. We propose Universal …
In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g., BERT, on the E2E-ABSA task. Specifically, we build a series of …
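The simplest member of such a series can be sketched as a linear tagging head over BERT's contextualized token embeddings, assuming the Hugging Face transformers API. The unified tag set below (aspect boundary plus sentiment polarity) and the example sentence are illustrative assumptions.

```python
# Sketch of a linear tagging layer over BERT's contextualized token
# embeddings for end-to-end aspect-based sentiment analysis (E2E-ABSA).
# The unified tag set here is illustrative, not the paper's exact one.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

TAGS = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
tagger = nn.Linear(bert.config.hidden_size, len(TAGS))

batch = tokenizer(["the fajitas were great"], return_tensors="pt")
hidden = bert(**batch).last_hidden_state   # (1, seq_len, hidden_size)
logits = tagger(hidden)                    # per-token tag scores
pred = [TAGS[i] for i in logits.argmax(-1)[0].tolist()]
```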
Y Zhang, J Yang - arXiv preprint arXiv:1805.02023, 2018 - arxiv.org
We investigate a lattice-structured LSTM model for Chinese NER, which encodes a sequence of input characters as well as all potential words that match a lexicon. Compared …
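The lattice construction step described here, matching every lexicon word against spans of the input characters, can be sketched as below. The toy sentence, lexicon, and the helper name lattice_spans are illustrative assumptions; the paper's model then gates these word spans into the character LSTM.

```python
# Sketch of the lattice construction step: find every lexicon word that
# matches a contiguous span of the input characters. Toy sentence and
# lexicon; the resulting spans feed a lattice LSTM in the actual model.
def lattice_spans(chars, lexicon, max_len=4):
    """Return (start, end, word) for every lexicon match in `chars`."""
    spans = []
    for i in range(len(chars)):
        for j in range(i + 1, min(i + max_len, len(chars)) + 1):
            word = "".join(chars[i:j])
            if word in lexicon:
                spans.append((i, j, word))
    return spans

sentence = list("南京市长江大桥")   # "Nanjing Yangtze River Bridge"
lexicon = {"南京", "南京市", "市长", "长江", "大桥", "长江大桥"}
print(lattice_spans(sentence, lexicon))
```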
H Yan, B Deng, X Li, X Qiu - arXiv preprint arXiv:1911.04474, 2019 - arxiv.org
Bidirectional long short-term memory networks (BiLSTM) have been widely used as encoders in models solving the named entity recognition (NER) task. Recently, the …
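The BiLSTM-encoder baseline this entry refers to can be sketched in PyTorch: embed tokens, encode them bidirectionally, and score a tag per token. The sizes and tag count below are toy assumptions, and production NER systems typically add a CRF decoding layer on top.

```python
# Sketch of the common BiLSTM encoder baseline for NER: embed tokens,
# run a bidirectional LSTM, and score NER tags per token. Toy sizes.
import torch
import torch.nn as nn

VOCAB, DIM, TAGS = 100, 32, 5   # toy vocabulary, hidden size, tag count

class BiLSTMTagger(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.enc = nn.LSTM(DIM, DIM, bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * DIM, TAGS)  # forward + backward states

    def forward(self, x):
        h, _ = self.enc(self.emb(x))
        return self.out(h)                   # (batch, seq_len, TAGS)

tokens = torch.randint(0, VOCAB, (1, 7))     # one toy sentence
print(BiLSTMTagger()(tokens).shape)          # torch.Size([1, 7, 5])
```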
Natural Language Processing (NLP) helps empower intelligent machines by enabling a better understanding of human language for linguistic-based human-computer …