This paper aims to enhance few-shot relation classification, especially for sentences that jointly describe multiple relations. Because some relations usually keep high co …
Recent analyses suggest that encoders pretrained for language modeling capture certain morpho-syntactic structure. However, probing frameworks for word vectors still do not report …
Y Wang, J Bao, C Duan, Y Wu, X He, C Zhu… - Neural Computing and …, 2024 - Springer
This paper focuses on multi-choice tasks, which aim to select the correct choice for a given query by reasoning over texts such as sentences and passages. Benefiting from the …
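A minimal sketch of one common way to frame such multi-choice tasks with a pretrained encoder: score each (query, choice) pair and pick the highest-scoring choice. The checkpoint name, query, and choices below are placeholders, and this illustrates only the generic task setup, not the reasoning model of the paper above.

import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMultipleChoice.from_pretrained("roberta-base")

query = "Which city hosted the 2000 Summer Olympics?"          # placeholder query
choices = ["Sydney", "Athens", "Beijing", "Atlanta"]            # placeholder choices

# Pair the query with every choice, then add a batch dimension of 1 so the
# inputs have shape (1, num_choices, seq_len), as the multiple-choice head expects.
enc = tokenizer([query] * len(choices), choices, padding=True, return_tensors="pt")
inputs = {k: v.unsqueeze(0) for k, v in enc.items()}

with torch.no_grad():
    logits = model(**inputs).logits           # (1, num_choices), one score per choice
print(choices[logits.argmax(dim=-1).item()])  # classification head is untrained here, so the prediction is arbitrary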
T Ohtsuka, T Kajiwara, C Tanikawa… - Proceedings of the …, 2023 - aclanthology.org
We propose a method to automate orthodontic diagnosis with natural language processing. It is worthwhile to assist dentists with such technology to prevent errors by inexperienced …
The introduction of pre-trained transformer-based contextualized word embeddings has led to considerable improvements in the accuracy of graph-based parsers for frameworks such …
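As a hedged illustration of how contextualized embeddings typically enter a graph-based parser, the sketch below feeds transformer token vectors into a biaffine arc scorer in the style of Dozat and Manning (2017); the checkpoint name, dimensions, and class names are assumptions for illustration, not details of the paper above, and subword-to-word pooling is skipped for brevity.

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
encoder = AutoModel.from_pretrained("bert-base-cased")

class BiaffineArcScorer(torch.nn.Module):
    """Scores every (head, dependent) token pair from contextualized vectors."""
    def __init__(self, enc_dim=768, arc_dim=256):
        super().__init__()
        self.head_mlp = torch.nn.Linear(enc_dim, arc_dim)
        self.dep_mlp = torch.nn.Linear(enc_dim, arc_dim)
        # Extra row lets the head representation carry a bias term.
        self.W = torch.nn.Parameter(torch.randn(arc_dim + 1, arc_dim) * 0.01)

    def forward(self, h):                         # h: (seq_len, enc_dim)
        head = torch.relu(self.head_mlp(h))       # (seq_len, arc_dim)
        dep = torch.relu(self.dep_mlp(h))         # (seq_len, arc_dim)
        head = torch.cat([head, torch.ones(head.size(0), 1)], dim=-1)
        return head @ self.W @ dep.t()            # scores[i, j]: arc with head i, dependent j

sentence = "Economic news had little effect on financial markets"
enc = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**enc).last_hidden_state[0]  # (num_subwords, 768)
arc_scores = BiaffineArcScorer()(hidden)          # real parsers pool subwords into words first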
S Yang, K Tu - arXiv preprint arXiv:2108.04750, 2021 - arxiv.org
We propose a new method for projective dependency parsing based on headed spans. In a projective dependency tree, the largest subtree rooted at each word covers a contiguous …
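The property the snippet relies on can be checked directly: in a projective tree, the subtree rooted at each word covers a contiguous span of the sentence. The sketch below assumes a tree given as a head array; the function name is illustrative and not taken from the paper.

def headed_spans(heads):
    """Return, for each word, the (left, right) positions covered by its subtree.

    heads[i] is the 0-based index of word i's head; the root has head -1.
    The tree is projective iff every subtree's span is contiguous, i.e. its
    width equals its size, making (left, i, right) the word's headed span.
    """
    n = len(heads)
    children = [[] for _ in range(n)]
    root = None
    for i, h in enumerate(heads):
        if h == -1:
            root = i
        else:
            children[h].append(i)

    left, right, size = list(range(n)), list(range(n)), [1] * n

    def dfs(u):
        for c in children[u]:
            dfs(c)
            left[u] = min(left[u], left[c])
            right[u] = max(right[u], right[c])
            size[u] += size[c]

    dfs(root)
    projective = all(right[i] - left[i] + 1 == size[i] for i in range(n))
    return list(zip(left, right)), projective

# "She read a book": heads of (She, read, a, book) are (read, ROOT, book, read).
spans, ok = headed_spans([1, -1, 3, 1])
print(spans, ok)  # [(0, 0), (0, 3), (2, 2), (2, 3)] True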
Pre-trained language models that learn contextualized word representations from a large unannotated corpus have become a standard component for many state-of-the-art NLP …
Dependency-based approaches to syntactic analysis assume that syntactic structure can be analyzed in terms of binary asymmetric dependency relations holding between elementary …
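A minimal illustration, not tied to any particular formalism named above, of what "binary asymmetric dependency relations" look like as data: each relation links exactly two words, a head and a dependent, and the direction matters.

from dataclasses import dataclass

@dataclass(frozen=True)
class Dependency:
    head: int       # index of the governing word (0 is an artificial ROOT here)
    dependent: int  # index of the word it governs
    label: str      # relation type, e.g. "nsubj", "obj"

words = ["ROOT", "She", "read", "a", "book"]
analysis = {
    Dependency(0, 2, "root"),
    Dependency(2, 1, "nsubj"),
    Dependency(2, 4, "obj"),
    Dependency(4, 3, "det"),
}
# Asymmetry: Dependency(2, 1, "nsubj") relates "read" -> "She";
# the reversed pair "She" -> "read" is not part of the analysis.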
S Yang, K Tu - arXiv preprint arXiv:2108.05838, 2021 - arxiv.org
Graph-based methods, which decompose the score of a dependency tree into scores of dependency arcs, have been popular in dependency parsing for decades. Recently, \citet …
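The arc-factored decomposition mentioned in the snippet is simply that the score of a tree is the sum of the scores of its arcs. A small sketch under the assumption that arc scores are already available as a matrix (e.g. from a scorer like the biaffine one sketched earlier); names and numbers are illustrative.

def tree_score(arc_scores, heads):
    """Sum arc scores for a tree; heads[d] is the head of word d+1 (0 = ROOT)."""
    return sum(arc_scores[h][d] for d, h in enumerate(heads, start=1))

# 3-word sentence at positions 1..3 plus ROOT at position 0.
# heads = [2, 0, 2]: word 1 <- word 2, word 2 <- ROOT, word 3 <- word 2.
arc_scores = [[0.0, 1.2, 3.5, 0.1],
              [0.0, 0.0, 0.4, 0.2],
              [0.0, 2.1, 0.0, 1.8],
              [0.0, 0.3, 0.9, 0.0]]
print(tree_score(arc_scores, [2, 0, 2]))  # 2.1 + 3.5 + 1.8 = 7.4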