Natural Language Processing (NLP) empowers intelligent machines by fostering a better understanding of human language for linguistic-based human-computer …
We design a predictive layer for structured-output prediction (SOP) that can be plugged into any neural network, guaranteeing that its predictions are consistent with a set of predefined …
Current language models can generate high-quality text. Are they simply copying text they have seen before, or have they learned generalizable linguistic abstractions? To tease apart …
Pretrained contextualized embeddings are powerful word representations for structured prediction tasks. Recent work found that better word representations can be obtained by …
The activations of language transformers like GPT-2 have been shown to linearly map onto brain activity during speech comprehension. However, the nature of these activations …
This work proposes a syntax-enhanced grammatical error correction (GEC) approach named SynGEC that effectively incorporates dependency syntactic information into the …
Y Zhang, H Zhou, Z Li - arXiv preprint arXiv:2008.03736, 2020 - arxiv.org
Estimating probability distributions is one of the core issues in the NLP field. However, in both the deep learning (DL) and pre-DL eras, unlike the vast applications of the linear-chain CRF in …
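The snippet above contrasts general probability estimation with the linear-chain CRF. As a hypothetical illustration (not code from the cited paper), the CRF's log-partition function can be computed with the forward algorithm; `emit` (per-position tag scores) and `trans` (tag-transition scores) are assumed names:

```python
import math

def log_sum_exp(xs):
    # Numerically stable log(sum(exp(x))) over a list of scores.
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def crf_log_partition(emit, trans):
    """Log-partition of a linear-chain CRF.

    emit:  T x K list of emission scores (T positions, K tags).
    trans: K x K list of transition scores trans[i][j] for tag i -> tag j.
    """
    K = len(emit[0])
    # alpha[j] = log of the summed scores of all prefixes ending in tag j.
    alpha = list(emit[0])
    for t in range(1, len(emit)):
        alpha = [
            emit[t][j] + log_sum_exp([alpha[i] + trans[i][j] for i in range(K)])
            for j in range(K)
        ]
    return log_sum_exp(alpha)
```

Dividing any single sequence's exponentiated score by the exponentiated partition yields its probability, which is what makes the linear-chain factorization tractable.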
C Lou, S Yang, K Tu - arXiv preprint arXiv:2203.04665, 2022 - arxiv.org
Nested named entity recognition (NER) has been receiving increasing attention. Recently, Fu et al. (2021) adapted a span-based constituency parser to tackle nested NER …
Z Yuan, C Tan, S Huang, F Huang - arXiv preprint arXiv:2110.07480, 2021 - arxiv.org
Nested entities are observed in many domains due to their compositionality and cannot be easily recognized by the widely used sequence labeling framework. A natural solution is …
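The last two snippets describe span-based approaches to nested NER. A minimal sketch of the underlying idea, assuming a hypothetical `enumerate_spans` helper: enumerate every span up to a maximum width so that overlapping (nested) candidates can each be classified independently, unlike token-level sequence labeling:

```python
def enumerate_spans(tokens, max_width=4):
    """Enumerate all candidate spans (start, end, text) with end exclusive.

    Nested entities such as "New York" inside "New York University" appear
    as distinct spans, so each can receive its own entity label.
    """
    spans = []
    n = len(tokens)
    for i in range(n):
        for j in range(i + 1, min(i + max_width, n) + 1):
            spans.append((i, j, tokens[i:j]))
    return spans
```

A span classifier then scores each candidate; the quadratic number of spans is the usual price for handling nesting, which is why the papers above explore parser-based alternatives.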