Decoupling sparsity and smoothness in the Dirichlet variational autoencoder topic model

S Burkhardt, S Kramer - Journal of Machine Learning Research, 2019 - jmlr.org
Recent work on variational autoencoders (VAEs) has enabled the development of
generative topic models using neural networks. Topic models based on latent Dirichlet …
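
For orientation, the generative decoder shared by Dirichlet topic models such as LDA and its variational autoencoder variants can be sketched in a few lines of Python; the topic count, vocabulary size, and hyperparameters below are illustrative assumptions, not values from the paper.

    # Minimal sketch of the Dirichlet topic-model generative process (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    K, V = 5, 1000                                   # topics, vocabulary size (assumed)
    alpha = np.full(K, 0.1)                          # Dirichlet prior over topic proportions
    beta = rng.dirichlet(np.full(V, 0.01), size=K)   # K topic-word distributions

    def generate_document(n_words=50):
        theta = rng.dirichlet(alpha)   # document-topic proportions
        word_dist = theta @ beta       # mixture over the vocabulary
        return rng.choice(V, size=n_words, p=word_dist)

    doc = generate_document()

Roughly speaking, a VAE-based topic model replaces the Dirichlet inference step with a neural encoder mapping a document's word counts to approximately Dirichlet-distributed proportions, while a decoder of the form above stays essentially unchanged.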

GASC: Genre-aware semantic change for Ancient Greek

V Perrone, M Palma, S Hengchen, A Vatri… - arXiv preprint arXiv …, 2019 - arxiv.org
Word meaning changes over time, depending on linguistic and extra-linguistic factors.
Associating a word's correct meaning in its historical context is a central challenge in …

A new SVD approach to optimal topic estimation

ZT Ke, M Wang - arXiv preprint arXiv:1704.07016, 2017 - academia.edu
In probabilistic topic models, the quantity of interest, a low-rank matrix consisting of topic
vectors, is hidden in the text corpus matrix, masked by noise, and Singular Value …
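
For orientation, the matrix formulation behind such SVD-based estimators, in generic notation (the symbols are placeholders and need not match the paper's): the observed word-document frequency matrix is modeled as a noisy low-rank product,

$D \approx A W$, with $D \in \mathbb{R}^{p \times n}$, $A \in \mathbb{R}^{p \times K}$, $W \in \mathbb{R}^{K \times n}$,

where the columns of $A$ are the $K$ topic vectors over a vocabulary of size $p$, the columns of $W$ are per-document topic weights (both lying in the probability simplex), and the noise arises from the multinomial sampling of words within each document.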

A topic modeling and image classification framework: The Generalized Dirichlet variational autoencoder

AO Ojo, N Bouguila - Pattern Recognition, 2024 - Elsevier
The latent Dirichlet allocation (LDA) model has been widely used in topic modeling. Recent
works have shown the effectiveness of integrating neural network mechanisms with this …

Neural dynamic focused topic model

K Cvejoski, RJ Sánchez, C Ojeda - … of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org
Topic models and all their variants analyse text by learning meaningful representations
through word co-occurrences. As pointed out by previous work, such models implicitly …

The sketched Wasserstein distance for mixture distributions

X Bing, F Bunea, J Niles-Weed - arXiv preprint arXiv:2206.12768, 2022 - researchgate.net
The Sketched Wasserstein Distance ($W_S$) is a new probability distance specifically
tailored to finite mixture distributions. Given any metric d defined on a set A of probability …
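
As a hedged pointer to the construction (this is the generic optimal-transport distance between finite mixtures; the paper's precise definition may differ in its details): given mixtures $p = \sum_{k} a_k p_k$ and $q = \sum_{j} b_j q_j$ whose components lie in a set $\mathcal{A}$ equipped with a metric $d$, one compares the mixing weights through the transport problem

$\min_{\gamma \ge 0} \sum_{k,j} \gamma_{kj}\, d(p_k, q_j)$ subject to $\sum_{j} \gamma_{kj} = a_k$ and $\sum_{k} \gamma_{kj} = b_j$,

so the resulting distance depends on each mixture only through its weights and the pairwise distances between components.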

Regularized weighted low rank approximation

F Ban, D Woodruff, R Zhang - Advances in neural …, 2019 - proceedings.neurips.cc
The classical low rank approximation problem is to find a rank $k$ matrix $UV$ (where $U$
has $k$ columns and $V$ has $k$ rows) that minimizes the Frobenius norm of $A - UV$ …
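
For reference, the unweighted, unregularized version of this problem has a classical closed-form solution via the truncated SVD (Eckart-Young); the sketch below shows that baseline only, with placeholder matrix sizes, and is not the regularized weighted algorithm studied in the paper.

    # Best rank-k approximation of A in Frobenius norm via truncated SVD (baseline only).
    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 80))   # placeholder data matrix
    k = 5                                # target rank (assumed)

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    U_k = U[:, :k] * s[:k]               # U with k columns (singular values absorbed)
    V_k = Vt[:k, :]                      # V with k rows
    A_k = U_k @ V_k                      # best rank-k approximation of A in Frobenius norm

    err = np.linalg.norm(A - A_k, 'fro')

With non-uniform weights on the entries, no such closed form is available in general, hence the interest in approximation algorithms for the weighted and regularized settings.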

Learning topic models: Identifiability and finite-sample analysis

Y Chen, S He, Y Yang, F Liang - Journal of the American Statistical …, 2023 - Taylor & Francis
Topic models provide a useful text-mining tool for learning, extracting, and discovering latent
structures in large text corpora. Although a plethora of methods have been proposed for …

Bridging insight gaps in topic dependency discovery with a knowledge-inspired topic model

YK Tang, H Huang, X Shi, XL Mao - Information Processing & Management, 2025 - Elsevier
Discovering intricate dependencies between topics in topic modeling is challenging due to
the noisy and incomplete nature of real-world data and the inherent complexity of topic …

Lexical semantic change for Ancient Greek and Latin

V Perrone, S Hengchen, M Palma, A Vatri… - … to semantic change, 2021 - library.oapen.org
Change and its precondition, variation, are inherent in languages. Over time, new words
enter the lexicon, others become obsolete, and existing words acquire new senses …