Compound probabilistic context-free grammars for grammar induction

Y Kim, C Dyer, AM Rush - arXiv preprint arXiv:1906.10225, 2019 - arxiv.org
We study a formalization of the grammar induction problem that models sentences as being
generated by a compound probabilistic context-free grammar. In contrast to traditional …

Holographic CCG Parsing

R Yamaki, T Taniguchi… - Proceedings of the 61st …, 2023 - aclanthology.org
We propose a method for formulating CCG as a recursive composition in a continuous
vector space. Recent CCG supertagging and parsing models generally demonstrate high …

PCFGs can do better: Inducing probabilistic context-free grammars with many symbols

S Yang, Y Zhao, K Tu - arXiv preprint arXiv:2104.13727, 2021 - arxiv.org
Probabilistic context-free grammars (PCFGs) with neural parameterization have been shown
to be effective in unsupervised phrase-structure grammar induction. However, due to the …

Linguistic parameters of spontaneous speech for identifying mild cognitive impairment and Alzheimer disease

V Vincze, MK Szabó, I Hoffmann, L Tóth… - Computational …, 2022 - direct.mit.edu
In this article, we seek to automatically identify Hungarian patients suffering from mild
cognitive impairment (MCI) or mild Alzheimer disease (mAD) based on their speech …

Recursive Bayesian networks: Generalising and unifying probabilistic context-free grammars and dynamic Bayesian networks

R Lieck, M Rohrmeier - Advances in Neural Information …, 2021 - proceedings.neurips.cc
Probabilistic context-free grammars (PCFGs) and dynamic Bayesian networks (DBNs) are
widely used sequence models with complementary strengths and limitations. While PCFGs …

PRIMAL-GMM: PaRametrIc MAnifold learning of Gaussian mixture models

Z Liu, L Yu, JH Hsiao, AB Chan - IEEE Transactions on Pattern …, 2021 - ieeexplore.ieee.org
We propose a ParametRIc MAnifold Learning (PRIMAL) algorithm for Gaussian mixture
models (GMMs), assuming that GMMs lie on or near a manifold of probability distributions …

Latent variable sentiment grammar

L Zhang, K Tu, Y Zhang - arXiv preprint arXiv:1907.00218, 2019 - arxiv.org
Neural models have been investigated for sentiment classification over constituent trees.
They learn phrase composition automatically by encoding tree structures but do not explicitly …

Best Trees Extraction and Contextual Grammars for Language Processing

A Jonsson - 2021 - diva-portal.org
In natural language processing, the syntax of a sentence refers to the words used in the
sentence, their grammatical role, and their order. Semantics concerns the concepts …

Improved N-Best Extraction with an Evaluation on Language Data

J Björklund, F Drewes… - Computational …, 2022 - submissions.cljournal.org
We show that a previously proposed algorithm for the N-best trees problem can be made
more efficient by changing how it arranges and explores the search space. Given an integer …

Unsupervised structure induction and multimodal grounding

Y Zhao - 2023 - era.ed.ac.uk
Structured representations, building upon symbolic abstraction (e.g., words in natural language
and visual concepts in natural images), offer a principled way of encoding our perceptions …