Efficient second-order TreeCRF for neural dependency parsing

Y Zhang, Z Li, M Zhang - arXiv preprint arXiv:2005.00975, 2020 - arxiv.org
In the deep learning (DL) era, parsing models have been greatly simplified with little loss in
performance, thanks to the remarkable capability of multi-layer BiLSTMs in context …

Learning to decouple relations: Few-shot relation classification with entity-guided attention and confusion-aware training

Y Wang, J Bao, G Liu, Y Wu, X He, B Zhou… - arXiv preprint arXiv …, 2020 - arxiv.org
This paper aims to enhance few-shot relation classification, especially for sentences that
jointly describe multiple relations. Due to the fact that some relations usually keep high co …

Parsing as pretraining

D Vilares, M Strzyz, A Søgaard… - Proceedings of the AAAI …, 2020 - aaai.org
Recent analyses suggest that encoders pretrained for language modeling capture certain
morpho-syntactic structure. However, probing frameworks for word vectors still do not report …

An efficient confusing choices decoupling framework for multi-choice tasks over texts

Y Wang, J Bao, C Duan, Y Wu, X He, C Zhu… - Neural Computing and …, 2024 - Springer
This paper focuses on multi-choice tasks, which aim to select the correct choice for a
given query by reasoning over texts, such as sentences and passages. Benefiting from the …

Automated Orthodontic Diagnosis from a Summary of Medical Findings

T Ohtsuka, T Kajiwara, C Tanikawa… - Proceedings of the …, 2023 - aclanthology.org
We propose a method to automate orthodontic diagnosis with natural language processing.
It is worthwhile to assist dentists with such technology to prevent errors by inexperienced …

Applying Occam's Razor to transformer-based dependency parsing: what works, what doesn't, and what is really necessary

S Grünewald, A Friedrich, J Kuhn - arXiv preprint arXiv:2010.12699, 2020 - arxiv.org
The introduction of pre-trained transformer-based contextualized word embeddings has led
to considerable improvements in the accuracy of graph-based parsers for frameworks such …

Headed-span-based projective dependency parsing

S Yang, K Tu - arXiv preprint arXiv:2108.04750, 2021 - arxiv.org
We propose a new method for projective dependency parsing based on headed spans. In a
projective dependency tree, the largest subtree rooted at each word covers a contiguous …

Context analysis for pre-trained masked language models

YA Lai, G Lalwani, Y Zhang - Findings of the Association for …, 2020 - aclanthology.org
Pre-trained language models that learn contextualized word representations from a large un-
annotated corpus have become a standard component for many state-of-the-art NLP …

Nucleus composition in transition-based dependency parsing

J Nivre, A Basirat, L Dürlich, A Moss - Computational Linguistics, 2022 - direct.mit.edu
Dependency-based approaches to syntactic analysis assume that syntactic structure can be
analyzed in terms of binary asymmetric dependency relations holding between elementary …

Combining (second-order) graph-based and headed-span-based projective dependency parsing

S Yang, K Tu - arXiv preprint arXiv:2108.05838, 2021 - arxiv.org
Graph-based methods, which decompose the score of a dependency tree into scores of
dependency arcs, have been popular in dependency parsing for decades. Recently, …