N Kitaev, D Klein - arXiv preprint arXiv:1805.01052, 2018 - arxiv.org
We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead to improvements to a state-of-the-art discriminative constituency parser. The use of attention …
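The snippet names the architectural swap but not the mechanism. A minimal sketch of single-head scaled dot-product self-attention, in plain numpy with toy dimensions and identity projections (an illustration of the general technique, not the authors' parser):

```python
import numpy as np

def self_attention(X):
    """Single-head scaled dot-product self-attention over a sentence.

    X: (seq_len, d_model) matrix of token representations.
    Returns contextualized representations of the same shape.
    """
    d = X.shape[1]
    # A real encoder derives Q, K, V from learned projections of X;
    # identity projections keep the sketch short.
    Q, K, V = X, X, X
    scores = Q @ K.T / np.sqrt(d)                    # pairwise compatibility
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # attention-weighted mix

tokens = np.random.randn(5, 16)      # 5 tokens, 16-dim embeddings (toy sizes)
print(self_attention(tokens).shape)  # (5, 16)
```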
Head-driven phrase structure grammar (HPSG) enjoys a uniform formalism representing rich contextual syntactic and even semantic meaning. This paper makes the first attempt to …
MS Zhang - Science China Technological Sciences, 2020 - Springer
Syntactic and semantic parsing has been investigated for decades and is a primary topic in the natural language processing community. This article aims to give a brief survey on …
Y Zhang, Z Li, M Zhang - arXiv preprint arXiv:2005.00975, 2020 - arxiv.org
In the deep learning (DL) era, parsing models have been greatly simplified with little loss in performance, thanks to the remarkable capability of multi-layer BiLSTMs in context …
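As a sketch of the BiLSTM context encoding this refers to, using PyTorch's stock nn.LSTM with toy sizes (actual parsers add pretrained embeddings, dropout, and span-scoring layers on top):

```python
import torch
import torch.nn as nn

# Toy sizes; real parsers use pretrained embeddings and larger dimensions.
vocab_size, emb_dim, hidden = 100, 32, 64

embed = nn.Embedding(vocab_size, emb_dim)
# Multi-layer BiLSTM: each token's representation sees both left and
# right context, which is what makes simple span scorers viable.
bilstm = nn.LSTM(emb_dim, hidden, num_layers=2,
                 bidirectional=True, batch_first=True)

sentence = torch.randint(0, vocab_size, (1, 7))   # batch of 1, 7 tokens
states, _ = bilstm(embed(sentence))
print(states.shape)  # torch.Size([1, 7, 128]): forward+backward concatenated
```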
Attention mechanisms have improved the performance of NLP tasks while allowing models to remain explainable. Self-attention is currently widely used; however, interpretability is …
Y Zhang, H Zhou, Z Li - arXiv preprint arXiv:2008.03736, 2020 - arxiv.org
Estimating probability distributions is one of the core issues in the NLP field. However, in both the deep learning (DL) and pre-DL eras, unlike the vast applications of linear-chain CRF in …
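For reference, the core computation a linear-chain CRF needs is the log-partition function over all tag sequences, obtained by the forward algorithm. A sketch in log space with made-up scores (not code from the paper):

```python
import numpy as np

def crf_log_partition(emissions, transitions):
    """Forward algorithm for a linear-chain CRF, in log space.

    emissions:   (seq_len, n_tags) per-position tag scores.
    transitions: (n_tags, n_tags) score of moving from tag i to tag j.
    Returns log Z, the normalizer over all tag sequences.
    """
    alpha = emissions[0]  # log-scores of sequences ending at position 0
    for t in range(1, len(emissions)):
        # sum over previous tag i of exp(alpha[i] + transitions[i, j]),
        # done stably in log space, then add the emission at position t
        alpha = np.logaddexp.reduce(alpha[:, None] + transitions, axis=0) \
                + emissions[t]
    return np.logaddexp.reduce(alpha)

rng = np.random.default_rng(0)
E, T = rng.normal(size=(6, 4)), rng.normal(size=(4, 4))  # 6 tokens, 4 tags
print(crf_log_partition(E, T))  # log Z in p(y|x) = exp(score(x, y)) / Z
```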
In this paper, we present Linguistics Informed Multi-Task BERT (LIMIT-BERT) for learning language representations across multiple linguistic tasks by Multi-Task Learning (MTL) …
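A minimal sketch of the shared-encoder, per-task-head pattern that MTL over linguistic tasks typically uses; the modules and sizes here are toy stand-ins, not LIMIT-BERT itself:

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """One shared encoder, one small classification head per task."""
    def __init__(self, vocab=100, dim=32, tasks=None):
        super().__init__()
        tasks = tasks or {"pos": 17, "ner": 9}   # hypothetical label counts
        self.embed = nn.Embedding(vocab, dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2)
        self.heads = nn.ModuleDict(
            {task: nn.Linear(dim, n) for task, n in tasks.items()})

    def forward(self, token_ids, task):
        h = self.encoder(self.embed(token_ids))  # shared representation
        return self.heads[task](h)               # task-specific logits

model = MultiTaskModel()
x = torch.randint(0, 100, (1, 7))
print(model(x, "pos").shape)  # torch.Size([1, 7, 17])
```

Training alternates (or mixes) batches across tasks, so the encoder's parameters receive gradients from every task while each head stays specialized.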
In this work, we propose a novel constituency parsing scheme. The model predicts a vector of real-valued scalars, named syntactic distances, one for each split position in the input …
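The decoding such a scheme implies is greedy top-down splitting: the largest predicted distance marks the top split, and the two halves are parsed recursively. A label-free sketch with made-up distances:

```python
def distances_to_tree(words, dists):
    """Greedy top-down decoding from syntactic distances.

    dists[i] scores the split between words[i] and words[i+1];
    splitting at the largest distance first yields a binary tree.
    (A sketch of the decoding idea; constituent labels and the
    learned distance predictor are omitted.)
    """
    if len(words) == 1:
        return words[0]
    i = max(range(len(dists)), key=dists.__getitem__)  # highest split point
    left = distances_to_tree(words[: i + 1], dists[:i])
    right = distances_to_tree(words[i + 1:], dists[i + 1:])
    return (left, right)

words = ["the", "cat", "sat", "down"]
dists = [1.2, 3.5, 0.7]      # made-up predictions, one per split position
print(distances_to_tree(words, dists))
# (('the', 'cat'), ('sat', 'down'))
```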
Y Tian, Y Song, F Xia, T Zhang - arXiv preprint arXiv:2010.07543, 2020 - arxiv.org
Constituency parsing is a fundamental task for natural language understanding, and a good representation of contextual information can help with it. N …
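The snippet truncates before explaining the n-gram component, so the following is only a guess at the setup: enumerating the n-grams inside a candidate span, whose embeddings a model could then pool or attend over when scoring the span:

```python
def span_ngrams(tokens, i, j, max_n=3):
    """Enumerate the n-grams (n <= max_n) inside the span tokens[i:j].

    A sketch of how n-gram context for a candidate constituent can be
    collected; how their embeddings are weighted (e.g. by attention)
    is the modeling question left to the parser.
    """
    span = tokens[i:j]
    return [tuple(span[k: k + n])
            for n in range(1, max_n + 1)
            for k in range(len(span) - n + 1)]

print(span_ngrams(["the", "old", "cat", "sat"], 0, 3))
# [('the',), ('old',), ('cat',), ('the', 'old'), ('old', 'cat'),
#  ('the', 'old', 'cat')]
```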