Seq2seq dependency parsing

Z Li, J Cai, S He, H Zhao - … of the 27th International Conference on …, 2018 - aclanthology.org
This paper presents a sequence to sequence (seq2seq) dependency parser by directly
predicting the relative position of head for each given word, which therefore results in a truly …

Pushing the limits of ChatGPT on NLP tasks

X Sun, L Dong, X Li, Z Wan, S Wang, T Zhang… - arXiv preprint arXiv …, 2023 - arxiv.org
Despite the success of ChatGPT, its performance on most NLP tasks is still well below the
supervised baselines. In this work, we looked into the causes, and discovered that its subpar …

BenchCLAMP: A benchmark for evaluating language models on syntactic and semantic parsing

S Roy, S Thomson, T Chen, R Shin… - Advances in …, 2024 - proceedings.neurips.cc
Recent work has shown that generation from a prompted or fine-tuned language model can
perform well at semantic parsing when the output is constrained to be a valid semantic …

Transition-based parsing with stack-transformers

RF Astudillo, M Ballesteros, T Naseem… - arXiv preprint arXiv …, 2020 - arxiv.org
Modeling the parser state is key to good performance in transition-based parsing. Recurrent
Neural Networks considerably improved the performance of transition-based systems by …

Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing

J Zhou, T Naseem, RF Astudillo, YS Lee… - arXiv preprint arXiv …, 2021 - arxiv.org
Predicting linearized Abstract Meaning Representation (AMR) graphs using pre-trained
sequence-to-sequence Transformer models has recently led to large improvements on AMR …

Chinese text classification based on attention mechanism and feature-enhanced fusion neural network

J Xie, Y Hou, Y Wang, Q Wang, B Li, S Lv… - Computing, 2020 - Springer
Owing to the uneven distribution of key features in Chinese texts, key features play different
roles in text recognition in Chinese text classification tasks. We propose a feature-enhanced …

Viable dependency parsing as sequence labeling

M Strzyz, D Vilares, C Gómez-Rodríguez - arXiv preprint arXiv:1902.10505, 2019 - arxiv.org
We recast dependency parsing as a sequence labeling problem, exploring several
encodings of dependency trees as labels. While dependency parsing by means of …

Scheduled multi-task learning: From syntax to translation

E Kiperwasser, M Ballesteros - Transactions of the Association for …, 2018 - direct.mit.edu
Neural encoder-decoder models of machine translation have achieved impressive results,
while learning linguistic knowledge of both the source and target languages in an implicit …

Graph-to-graph transformer for transition-based dependency parsing

A Mohammadshahi, J Henderson - arXiv preprint arXiv:1911.03561, 2019 - arxiv.org
We propose the Graph2Graph Transformer architecture for conditioning on and predicting
arbitrary graphs, and apply it to the challenging task of transition-based dependency …

Hiformer: Sequence modeling networks with hierarchical attention mechanisms

X Wu, H Lu, K Li, Z Wu, X Liu… - IEEE/ACM Transactions …, 2023 - ieeexplore.ieee.org
The attention-based encoder-decoder structure, such as the Transformer, has achieved
state-of-the-art performance on various sequence modeling tasks, e.g., machine translation …