A primer on neural network models for natural language processing

Y Goldberg - Journal of Artificial Intelligence Research, 2016 - jair.org
Over the past few years, neural networks have re-emerged as powerful machine-learning
models, yielding state-of-the-art results in fields such as image recognition and speech …

ScispaCy: fast and robust models for biomedical natural language processing

M Neumann, D King, I Beltagy, W Ammar - arXiv preprint arXiv …, 2019 - arxiv.org
Despite recent advances in natural language processing, many statistical models for
processing text perform extremely poorly under domain shift. Processing biomedical and …

[BOOK][B] Neural network methods in natural language processing

Y Goldberg - 2017 - books.google.com
Neural networks are a family of powerful machine learning models and this book focuses on
their application to natural language data. The first half of the book (Parts I and II) covers the …

Findings of the 2017 Conference on Machine Translation (WMT17)

O Bojar, R Chatterjee, C Federmann, Y Graham… - 2017 - doras.dcu.ie
This paper presents the results of the WMT17 shared tasks, which included three machine
translation (MT) tasks (news, biomedical, and multimodal), two evaluation tasks (metrics and …

Semi-supervised sequence modeling with cross-view training

K Clark, MT Luong, CD Manning, QV Le - arXiv preprint arXiv:1809.08370, 2018 - arxiv.org
Unsupervised representation learning algorithms such as word2vec and ELMo improve the
accuracy of many supervised NLP models, mainly because they can take advantage of large …

A stack-propagation framework with token-level intent detection for spoken language understanding

L Qin, W Che, Y Li, H Wen, T Liu - arXiv preprint arXiv:1909.02188, 2019 - arxiv.org
Intent detection and slot filling are two main tasks for building a spoken language
understanding (SLU) system. The two tasks are closely tied and the slots often highly …

Linguistically-informed self-attention for semantic role labeling

E Strubell, P Verga, D Andor, D Weiss… - arXiv preprint arXiv …, 2018 - arxiv.org
Current state-of-the-art semantic role labeling (SRL) uses a deep neural network with no
explicit linguistic features. However, prior work has shown that gold syntax trees can …

A joint many-task model: Growing a neural network for multiple NLP tasks

K Hashimoto, C Xiong, Y Tsuruoka… - arXiv preprint arXiv …, 2016 - arxiv.org
Transfer and multi-task learning have traditionally focused on either a single source-target
pair or very few, similar tasks. Ideally, the linguistic levels of morphology, syntax and …

A survey of syntactic-semantic parsing based on constituent and dependency structures

MS Zhang - Science China Technological Sciences, 2020 - Springer
Syntactic and semantic parsing has been investigated for decades and remains a primary
topic in the natural language processing community. This article aims to give a brief survey on …

[PDF] Predictor-estimator using multilevel task learning with stack propagation for neural quality estimation

H Kim, JH Lee, SH Na - Proceedings of the second conference on …, 2017 - aclanthology.org
In this paper, we present a two-stage neural quality estimation model that uses multilevel
task learning for translation quality estimation (QE) at the sentence, word, and phrase levels …