Rethinking self-attention: Towards interpretability in neural parsing

K Mrini, F Dernoncourt, Q Tran, T Bui, W Chang… - arXiv preprint arXiv …, 2019 - arxiv.org
Attention mechanisms have improved the performance of NLP tasks while allowing models
to remain explainable. Self-attention is currently widely used; however, interpretability is …
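
As background for this entry, the following is a minimal NumPy sketch of scaled dot-product self-attention; the attention weight matrix it returns is the quantity that interpretability analyses typically inspect. The single head, random projections, and shapes are illustrative simplifications, not the authors' model.

```python
# Minimal scaled dot-product self-attention (NumPy); a generic sketch of the
# mechanism discussed in the paper, not the authors' specific architecture.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (n, d) token representations; Wq/Wk/Wv: (d, d) projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])        # (n, n) token-pair scores
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)      # row-wise softmax
    return weights @ V, weights                    # weights are what
                                                   # interpretability work inspects

rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))
out, attn = self_attention(X, *(rng.normal(size=(d, d)) for _ in range(3)))
print(attn.round(2))  # each row sums to 1: how much token i attends to token j
```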

Improving constituency parsing with span attention

Y Tian, Y Song, F Xia, T Zhang - arXiv preprint arXiv:2010.07543, 2020 - arxiv.org
Constituency parsing is a fundamental task for natural language
understanding, where a good representation of contextual information can help. N …

Strongly incremental constituency parsing with graph neural networks

K Yang, J Deng - Advances in neural information …, 2020 - proceedings.neurips.cc
Parsing sentences into syntax trees can benefit downstream applications in NLP. Transition-
based parsers build trees by executing actions in a state transition system. They are …
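
To make "executing actions in a state transition system" concrete, here is a minimal shift-reduce sketch; the SHIFT/REDUCE action set is a common simplification and not necessarily the transition system used in this paper.

```python
# Minimal state machine of a transition-based constituency parser.
# Action names and the gold action sequence below are illustrative.
def parse(tokens, actions):
    """Apply an action sequence to build a bracketed tree string."""
    stack, buffer = [], list(tokens)
    for act in actions:
        if act == "SHIFT":                      # move next word onto the stack
            stack.append(buffer.pop(0))
        else:                                   # ("REDUCE", label, arity):
            _, label, k = act                   # combine top-k items into one
            children = stack[-k:]
            del stack[-k:]
            stack.append(f"({label} {' '.join(children)})")
    assert len(stack) == 1 and not buffer
    return stack[0]

actions = ["SHIFT", ("REDUCE", "NP", 1), "SHIFT", "SHIFT",
           ("REDUCE", "NP", 1), ("REDUCE", "VP", 2), ("REDUCE", "S", 2)]
print(parse(["she", "reads", "books"], actions))
# (S (NP she) (VP reads (NP books)))
```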

Direct output connection for a high-rank language model

S Takase, J Suzuki, M Nagata - arXiv preprint arXiv:1808.10143, 2018 - arxiv.org
This paper proposes a state-of-the-art recurrent neural network (RNN) language model that
combines probability distributions computed not only from the final RNN layer but also from …
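
A hedged sketch of the underlying idea: compute a softmax distribution from each of several layers' hidden states and mix them, rather than using only the final layer. The shared output matrix and fixed mixture weights are simplifying assumptions, not the paper's exact parameterization.

```python
# Output distribution as a weighted mixture of per-layer softmaxes.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def mixture_output(layer_states, W_out, mix_weights):
    """layer_states: list of (d,) hidden vectors, one per RNN layer;
    W_out: (V, d) output embedding; mix_weights: simplex weights per layer."""
    dists = [softmax(W_out @ h) for h in layer_states]     # one softmax per layer
    return sum(w * p for w, p in zip(mix_weights, dists))  # convex combination

rng = np.random.default_rng(0)
d, V = 16, 100
states = [rng.normal(size=d) for _ in range(3)]
probs = mixture_output(states, rng.normal(size=(V, d)), [0.2, 0.3, 0.5])
print(probs.sum())  # ~1.0: still a valid distribution, but higher rank
```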

A simple and strong baseline for end-to-end neural RST-style discourse parsing

N Kobayashi, T Hirao, H Kamigaito, M Okumura… - arXiv preprint arXiv …, 2022 - arxiv.org
To promote and further develop RST-style discourse parsing models, we need a strong
baseline that can be regarded as a reference for reporting reliable experimental results. This …

Subword-based compact reconstruction of word embeddings

S Sasaki, J Suzuki, K Inui - … of the 2019 Conference of the North …, 2019 - aclanthology.org
The idea of subword-based word embeddings has been proposed in the literature, mainly
for solving the out-of-vocabulary (OOV) word problem observed in standard word-based …
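
As a concrete illustration of the subword idea, the sketch below composes a vector for an unseen word from character n-gram embeddings, fastText-style; the paper's compact reconstruction method is more involved than this simple averaging.

```python
# Reconstructing an out-of-vocabulary word vector from character n-grams.
import numpy as np

def char_ngrams(word, n_min=3, n_max=4):
    w = f"<{word}>"                               # boundary markers
    return [w[i:i+n] for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

def oov_vector(word, ngram_table, dim=8):
    """Average the embeddings of the word's known character n-grams."""
    grams = [g for g in char_ngrams(word) if g in ngram_table]
    if not grams:
        return np.zeros(dim)
    return np.mean([ngram_table[g] for g in grams], axis=0)

rng = np.random.default_rng(0)
table = {g: rng.normal(size=8) for g in char_ngrams("parsing")}
print(oov_vector("parse", table).shape)  # vector for an unseen word
```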

Large Language Models Are No Longer Shallow Parsers

Y Tian, F Xia, Y Song - Proceedings of the 62nd Annual Meeting of …, 2024 - aclanthology.org
The development of large language models (LLMs) brings significant changes to the field of
natural language processing (NLP), enabling remarkable performance in various high-level …

Enriched in-order linearization for faster sequence-to-sequence constituent parsing

D Fernández-González… - arXiv preprint arXiv …, 2020 - arxiv.org
Sequence-to-sequence constituent parsing requires a linearization to represent trees as
sequences. Top-down tree linearizations, which can be based on brackets or shift-reduce …
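
For concreteness, here is a minimal bracket-based linearization that turns a constituency tree into a token sequence a seq2seq model can emit; the paper's enriched in-order linearization arranges these symbols differently.

```python
# Top-down bracket linearization of a constituency tree.
def linearize(tree):
    """tree: (label, children) for nonterminals, plain str for words."""
    if isinstance(tree, str):
        return [tree]
    label, children = tree
    out = [f"({label}"]                 # opening bracket with its label
    for child in children:
        out.extend(linearize(child))
    out.append(")")                     # matching closing bracket
    return out

tree = ("S", [("NP", ["she"]), ("VP", ["reads", ("NP", ["books"])])])
print(" ".join(linearize(tree)))
# (S (NP she ) (VP reads (NP books ) ) )
```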

A conditional splitting framework for efficient constituency parsing

TT Nguyen, XP Nguyen, S Joty, X Li - arXiv preprint arXiv:2106.15760, 2021 - arxiv.org
We introduce a generic seq2seq parsing framework that casts constituency parsing
problems (syntactic and discourse parsing) into a series of conditional splitting decisions …
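
A sketch of the splitting view of parsing: each span is recursively divided at a predicted split point until only single words remain. The `choose_split` callback here stands in for the paper's learned conditional decoder and is purely illustrative.

```python
# Parsing a sentence as a series of span-splitting decisions.
def split_parse(words, i, j, choose_split):
    """Build a binary tree over words[i:j] by repeated splitting decisions."""
    if j - i == 1:
        return words[i]
    k = choose_split(i, j)              # conditional decision given the span
    left = split_parse(words, i, k, choose_split)
    right = split_parse(words, k, j, choose_split)
    return (left, right)

words = ["she", "reads", "books"]
# Toy splitter: always split off the first word (right-branching tree).
print(split_parse(words, 0, len(words), lambda i, j: i + 1))
# ('she', ('reads', 'books'))
```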

Discontinuous grammar as a foreign language

D Fernández-González, C Gómez-Rodríguez - Neurocomputing, 2023 - Elsevier
Syntactic constituent parsing is a vital step towards deep natural language understanding,
highly demanded by many artificial intelligence systems to process both text and …