Latent variable models (LVMs) with discrete compositional latents are an important but challenging setting due to a combinatorially large number of possible configurations of the …
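For concreteness on the combinatorial blow-up (a standard counting fact, not taken from the snippet itself): even an unlabeled binary branching over an n-word sentence already admits a Catalan number of tree configurations,

    C_{n-1} = \frac{1}{n}\binom{2(n-1)}{n-1},    e.g.  C_9 = 4862 for n = 10  and  C_{19} \approx 1.77 \times 10^9 for n = 20,

so exact marginalization over such discrete latents has to rely on dynamic programming rather than enumeration.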
Y Kim - Advances in Neural Information Processing …, 2021 - proceedings.neurips.cc
Sequence-to-sequence learning with neural networks has become the de facto standard for sequence modeling. This approach typically models the local distribution over the next …
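The "local distribution over the next word" here is the standard autoregressive factorization used by neural seq2seq models (standard notation, not specific to this paper):

    p_\theta(y \mid x) = \prod_{t=1}^{|y|} p_\theta(y_t \mid y_{<t}, x),

i.e. each target token is predicted from the full prefix and the source, which is the purely local parameterization the grammar-based alternatives in this listing contrast themselves with.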
J Li, W Lu - arXiv preprint arXiv:2306.00645, 2023 - arxiv.org
Recent advancements in pre-trained language models (PLMs) have demonstrated that these models possess some degree of syntactic awareness. To leverage this knowledge, we …
We describe a neural transducer that maintains the flexibility of standard sequence-to-sequence (seq2seq) models while incorporating hierarchical phrases as a source of …
B Shayegh, Y Wen, L Mou - … of the 62nd Annual Meeting of the …, 2024 - aclanthology.org
We address unsupervised discontinuous constituency parsing, where we observe a high variance in the performance of the only previous model in the literature. We propose to build …
S Yang, Y Zhao, K Tu - arXiv preprint arXiv:2105.15021, 2021 - arxiv.org
Neural lexicalized PCFGs (L-PCFGs) have been shown effective in grammar induction. However, to reduce computational complexity, they make a strong independence …
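To make the "strong independence assumption" concrete (a generic illustration of how lexicalized rule probabilities get factorized, not this paper's exact parameterization): a lexicalized rule A[h] → B[h] C[c] carries head words on its nonterminals, and tractable L-PCFGs typically factor its probability roughly as

    p(A[h] → B[h] C[c]) ≈ p(B, C \mid A, h) · p(c \mid C),

so the newly introduced head word c is generated with little or no conditioning on the other head word h; relaxing this is the natural route to modeling bilexical dependencies.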
Video-aided grammar induction aims to leverage video information for finding more accurate syntactic grammars for accompanying text. While previous work focuses on building systems …
S Yang, RP Levy, Y Kim - arXiv preprint arXiv:2212.09140, 2022 - arxiv.org
We study grammar induction with mildly context-sensitive grammars for unsupervised discontinuous parsing. Using the probabilistic linear context-free rewriting system (LCFRS) …
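As a small illustration of what a probabilistic LCFRS adds over a PCFG (example rules invented here for exposition, not taken from the paper): a nonterminal may yield a tuple of strings rather than a single contiguous span, so a discontinuous constituent such as a particle verb can be covered by one nonterminal with fan-out 2,

    VP(x_1, x_2) → V(x_1) PRT(x_2)
    S(x_1 y x_2) → VP(x_1, x_2) NP(y)

where the second rule wraps the VP's two pieces around the intervening NP (e.g. "wake [the kids] up"), something a context-free rule over single spans cannot express.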
S Yang, W Liu, K Tu - arXiv preprint arXiv:2205.00484, 2022 - arxiv.org
Hidden Markov Models (HMMs) and Probabilistic Context-Free Grammars (PCFGs) are widely used structured models, both of which can be represented as factor graph grammars …
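Both models share the same kind of dynamic-programming inference; as a minimal illustrative sketch (plain NumPy, not the factor-graph-grammar or rank-space machinery studied in the paper), the HMM forward algorithm marginalizes over all state sequences in O(T K^2) time:

import numpy as np

def hmm_forward_loglik(pi, A, B, obs):
    # pi: (K,) initial state probs; A: (K, K) transitions; B: (K, V) emissions.
    # alpha[i] is kept proportional to p(x_1..x_t, z_t = i); rescaling each step
    # avoids underflow, and the accumulated log scales give log p(x_1..x_T).
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        loglik += np.log(s)
        alpha = alpha / s
    return loglik

# Toy usage: 2 hidden states, 3 observation symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(hmm_forward_loglik(pi, A, B, [0, 2, 1, 2]))

The inside algorithm plays the analogous role for PCFGs, running in O(n^3) in sentence length, which is why work on scaling these structured models targets the cost of exactly these dynamic programs.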