X Wang, K Tu - arXiv preprint arXiv:2010.05003, 2020 - arxiv.org
In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our …
Higher-order methods for dependency parsing can partially but not fully address the issue that edges in dependency trees should be constructed at the text span/subtree level rather …
S Yang, K Tu - Proceedings of the 61st Annual Meeting of the …, 2023 - aclanthology.org
We present a simple and unified approach for both continuous and discontinuous constituency parsing via autoregressive span selection. Constituency parsing aims to …
We investigate the ability of transformer models to approximate the CKY algorithm, using them to directly predict a sentence's parse and thus avoid the CKY algorithm's cubic …
Memory-based learning can be characterized as a lazy learning method in machine learning terminology because it delays the processing of input by storing it until …
Y Gu, Y Hou, Z Wang, X Duan, Z Li - arXiv preprint arXiv:2309.11888, 2023 - arxiv.org
This work visits the topic of jointly parsing constituency and dependency trees, i.e., producing compatible constituency and dependency trees simultaneously for input sentences, which is …
Y Gu, Y Hou, Z Wang, X Duan, Z Li - Proceedings of the 2024 …, 2024 - aclanthology.org
This work revisits the topic of jointly parsing constituency and dependency trees, i.e., producing compatible constituency and dependency trees simultaneously for input sentences …
G Wang, K Tu - Proceedings of the 28th International Conference …, 2020 - aclanthology.org
Manual annotation for dependency parsing is both laborious and time-consuming, making it difficult to learn practical dependency parsers for many languages due to the lack of …
KM Kurniawan - 2023 - minerva-access.unimelb.edu.au
Source-Free Transductive Transfer Learning for Structured Prediction, by Kemal Maulana Kurniawan …