Discrete opinion tree induction for aspect-based sentiment analysis

C Chen, Z Teng, Z Wang, Y Zhang - … of the 60th Annual Meeting of …, 2022 - aclanthology.org
Dependency trees have been intensively used with graph neural networks for aspect-based
sentiment classification. Though effective, such methods rely on external dependency …

Toward understanding the communication in sperm whales

J Andreas, G Beguš, MM Bronstein, R Diamant… - iScience, 2022 - cell.com
Machine learning has been advancing dramatically over the past decade. Most strides are
in human-based applications due to the availability of large-scale datasets; however …

LM-Critic: Language models for unsupervised grammatical error correction

M Yasunaga, J Leskovec, P Liang - arXiv preprint arXiv:2109.06822, 2021 - arxiv.org
Training a model for grammatical error correction (GEC) requires a set of labeled
ungrammatical/grammatical sentence pairs, but manually annotating such pairs can be …

Contextual distortion reveals constituency: Masked language models are implicit parsers

J Li, W Lu - arXiv preprint arXiv:2306.00645, 2023 - arxiv.org
Recent advancements in pre-trained language models (PLMs) have demonstrated that
these models possess some degree of syntactic awareness. To leverage this knowledge, we …

Ensemble distillation for unsupervised constituency parsing

B Shayegh, Y Cao, X Zhu, JCK Cheung… - arXiv preprint arXiv …, 2023 - arxiv.org
We investigate the unsupervised constituency parsing task, which organizes words and
phrases of a sentence into a hierarchical structure without using linguistically annotated …

Tree-Averaging Algorithms for Ensemble-Based Unsupervised Discontinuous Constituency Parsing

B Shayegh, Y Wen, L Mou - … of the 62nd Annual Meeting of the …, 2024 - aclanthology.org
We address unsupervised discontinuous constituency parsing, where we observe a high
variance in the performance of the only previous model in the literature. We propose to build …

Neural bi-lexicalized PCFG induction

S Yang, Y Zhao, K Tu - arXiv preprint arXiv:2105.15021, 2021 - arxiv.org
Neural lexicalized PCFGs (L-PCFGs) have been shown effective in grammar induction.
However, to reduce computational complexity, they make a strong independence …

PCFGs can do better: Inducing probabilistic context-free grammars with many symbols

S Yang, Y Zhao, K Tu - arXiv preprint arXiv:2104.13727, 2021 - arxiv.org
Probabilistic context-free grammars (PCFGs) with neural parameterization have been shown
to be effective in unsupervised phrase-structure grammar induction. However, due to the …

ContextRef: Evaluating Referenceless Metrics For Image Description Generation

E Kreiss, E Zelikman, C Potts, N Haber - arXiv preprint arXiv:2309.11710, 2023 - arxiv.org
Referenceless metrics (e.g., CLIPScore) use pretrained vision-language models to assess
image descriptions directly without costly ground-truth reference texts. Such methods can …

Improved latent tree induction with distant supervision via span constraints

Z Xu, A Drozdov, JY Lee, T O'Gorman… - arXiv preprint arXiv …, 2021 - arxiv.org
For over thirty years, researchers have developed and analyzed methods for latent tree
induction as an approach for unsupervised syntactic parsing. Nonetheless, modern systems …