Recent work has shown that generation from a prompted or fine-tuned language model can perform well at semantic parsing when the output is constrained to be a valid semantic …
Modeling the parser state is key to good performance in transition-based parsing. Recurrent Neural Networks considerably improved the performance of transition-based systems by …
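The snippet above refers to the parser state in transition-based parsing without spelling it out. As a minimal sketch (not the method of any paper listed here), the state of a standard arc-standard parser is a stack, a buffer of remaining tokens, and the arcs built so far, updated by SHIFT / LEFT-ARC / RIGHT-ARC transitions; the `oracle` argument stands in for whatever model scores the next transition:

```python
# A minimal arc-standard transition-based parser state; illustrative only.
def parse(words, oracle):
    """Run arc-standard transitions chosen by `oracle` over `words`.

    State: a stack of token indices, a buffer of remaining indices,
    and the set of dependency arcs (head, dependent) built so far.
    """
    stack, buffer, arcs = [], list(range(len(words))), []
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer, arcs)
        if action == "SHIFT":
            stack.append(buffer.pop(0))   # move next token onto the stack
        elif action == "LEFT-ARC":
            dep = stack.pop(-2)           # second-from-top becomes dependent
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC":
            dep = stack.pop()             # top becomes dependent
            arcs.append((stack[-1], dep))
    return arcs

# Toy run with a scripted oracle for "She eats fish"
# (heads: eats -> She, eats -> fish):
actions = iter(["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC"])
print(parse(["She", "eats", "fish"], lambda s, b, a: next(actions)))
# [(1, 0), (1, 2)]
```

A neural transition-based system replaces the scripted oracle with a model (e.g. an RNN) that reads this stack/buffer/arc state and predicts the next transition.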
Landgrebe and Smith (Synthese 198 (March): 2061–2081, 2021) present an unflattering diagnosis of recent advances in what they call language-centric artificial …
In this paper, we trace the history of neural networks applied to natural language understanding tasks, and identify key contributions which the nature of language has made …
Higher-order methods for dependency parsing can partially but not fully address the issue that edges in dependency trees should be constructed at the text span/subtree level rather …
Dependency parsing aims to extract the syntactic or semantic dependency structure of a sentence. Existing methods for dependency parsing include …
The state-of-the-art models for coreference resolution are based on independent pairwise mention decisions. We propose a modelling approach that learns coreference at the …
We propose the Graph2Graph Transformer architecture for conditioning on and predicting arbitrary graphs, and apply it to the challenging task of transition-based dependency …
Y Tian, Y Song, F Xia - … of the 29th International Conference on …, 2022 - aclanthology.org
Dependency parsing is a fundamental natural language processing task that analyzes the syntactic structure of an input sentence by identifying the syntactic relations …
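Several of the snippets above speak of the "dependency structure" of a sentence; as a concrete illustration (the sentence and labels are a made-up example, but the encoding follows CoNLL-style treebanks), a dependency tree can be stored as one head index per token, with 0 marking the root:

```python
# Illustrative only: a dependency tree as a 1-based head index per token
# (0 = root), the encoding used by CoNLL-style treebank formats.
sentence = ["Economic", "news", "had", "little", "effect"]
heads    = [2, 3, 0, 5, 3]  # e.g. token 1 "Economic" modifies token 2 "news"

def dependents(h):
    """1-based positions of the tokens whose head is token h."""
    return [i + 1 for i, head in enumerate(heads) if head == h]

print(dependents(3))  # tokens headed by "had" -> [2, 5]
```

Graph-based parsers score candidate (head, dependent) pairs in this representation directly, while transition-based parsers build the same arcs incrementally.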