Unsupervised grouped axial data modeling via hierarchical Bayesian nonparametric models with Watson distributions

W Fan, L Yang, N Bouguila - IEEE Transactions on Pattern …, 2021 - ieeexplore.ieee.org
This paper proposes an unsupervised hierarchical nonparametric Bayesian
framework for modeling axial data (i.e., observations are axes of direction) that can be …
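
For reference (the snippet does not reproduce it), the Watson density used for axial data is antipodally symmetric on the unit hypersphere; a standard form is

\[ f(\pm\mathbf{x} \mid \boldsymbol{\mu}, \kappa) \;=\; M\!\left(\tfrac{1}{2}, \tfrac{D}{2}, \kappa\right)^{-1} \exp\!\left\{\kappa\,(\boldsymbol{\mu}^{\top}\mathbf{x})^{2}\right\}, \qquad \|\mathbf{x}\| = \|\boldsymbol{\mu}\| = 1, \]

where \(M\) is Kummer's confluent hypergeometric function and \(\kappa\) a concentration parameter. Because \(f(\mathbf{x}) = f(-\mathbf{x})\), the density identifies \(\mathbf{x}\) with \(-\mathbf{x}\), which is exactly what makes it suited to axes rather than directions.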

Variational Bayesian learning for Dirichlet process mixture of inverted Dirichlet distributions in non-Gaussian image feature modeling

Z Ma, Y Lai, WB Kleijn, YZ Song… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
In this paper, we develop a novel variational Bayesian learning method for the Dirichlet
process (DP) mixture of inverted Dirichlet distributions, which has been shown to be very …
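
For context, the inverted Dirichlet density on the positive orthant, to which the snippet refers, is commonly written as

\[ p(\mathbf{x} \mid \boldsymbol{\alpha}) \;=\; \frac{\Gamma(\alpha_{+})}{\prod_{d=1}^{D+1}\Gamma(\alpha_{d})}\; \prod_{d=1}^{D} x_{d}^{\alpha_{d}-1} \left(1 + \sum_{d=1}^{D} x_{d}\right)^{-\alpha_{+}}, \qquad \alpha_{+} = \sum_{d=1}^{D+1}\alpha_{d}, \quad x_{d} > 0, \]

whose strictly positive support and skewed shape are what make it a natural alternative to the Gaussian for the non-negative image features mentioned in the title.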

Truncation-free online variational inference for Bayesian nonparametric models

C Wang, D Blei - Advances in neural information …, 2012 - proceedings.neurips.cc
We present a truncation-free online variational inference algorithm for Bayesian
nonparametric models. Unlike traditional (online) variational inference algorithms that …
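
For context, standard variational treatments of Dirichlet process mixtures truncate the stick-breaking representation at a fixed level \(T\), using a factorized family of the form

\[ q(\mathbf{v}, \boldsymbol{\theta}, \mathbf{z}) \;=\; \prod_{k=1}^{T-1} q(v_{k}) \prod_{k=1}^{T} q(\theta_{k}) \prod_{n=1}^{N} q(z_{n}), \qquad q(v_{T}=1)=1, \]

which forces all mixture weights beyond \(T\) to zero, so the truncation level must be chosen in advance. A truncation-free method removes that choice, allowing the variational posterior to add components as the data demand.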

Bayesian estimation of the von-Mises Fisher mixture model with variational inference

J Taghia, Z Ma, A Leijon - IEEE transactions on pattern analysis …, 2014 - ieeexplore.ieee.org
This paper addresses the Bayesian estimation of the von-Mises Fisher (vMF) mixture model
with variational inference (VI). The learning task in VI consists of optimization of the …
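
For reference, the von Mises-Fisher density on the unit hypersphere \(S^{D-1}\) is

\[ f(\mathbf{x} \mid \boldsymbol{\mu}, \kappa) \;=\; C_{D}(\kappa)\, e^{\kappa\,\boldsymbol{\mu}^{\top}\mathbf{x}}, \qquad C_{D}(\kappa) \;=\; \frac{\kappa^{D/2-1}}{(2\pi)^{D/2}\, I_{D/2-1}(\kappa)}, \]

with mean direction \(\boldsymbol{\mu}\) (\(\|\boldsymbol{\mu}\|=1\)), concentration \(\kappa \ge 0\), and \(I_{\nu}\) the modified Bessel function of the first kind; the Bessel-function normalizer is the main obstacle to closed-form posterior updates for \(\kappa\), which is part of what the variational treatment has to handle.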

Nonparametric empirical Bayes for the Dirichlet process mixture model

JD McAuliffe, DM Blei, MI Jordan - Statistics and Computing, 2006 - Springer
The Dirichlet process prior allows flexible nonparametric mixture modeling. The number of
mixture components is not specified in advance and can grow as new data arrive. However …
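
The "components can grow as new data arrive" behaviour comes from the discreteness of a Dirichlet process draw; in Sethuraman's stick-breaking representation,

\[ G \;=\; \sum_{k=1}^{\infty} \pi_{k}\, \delta_{\theta_{k}}, \qquad \pi_{k} \;=\; v_{k} \prod_{j<k} (1 - v_{j}), \qquad v_{k} \sim \mathrm{Beta}(1, \alpha), \quad \theta_{k} \sim G_{0}, \]

so a DP mixture has countably many components, of which any finite dataset occupies only finitely many; the empirical-Bayes question is then how to set the hyperparameters (the concentration \(\alpha\) and base measure \(G_{0}\)) from the data rather than by fiat.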

MAD-Bayes: MAP-based asymptotic derivations from Bayes

T Broderick, B Kulis, M Jordan - International Conference on …, 2013 - proceedings.mlr.press
The classical mixture of Gaussians model is related to K-means via small-variance
asymptotics: as the covariances of the Gaussians tend to zero, the negative log-likelihood of …
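
As a concrete illustration of the small-variance limit the abstract describes, here is a minimal NumPy sketch (not the paper's own algorithm) of DP-means, the K-means-like procedure obtained when the same asymptotics are applied to a Dirichlet process Gaussian mixture; it locally minimizes \(\sum_{i}\|\mathbf{x}_{i}-\boldsymbol{\mu}_{z_{i}}\|^{2} + \lambda K\), where \(K\) is the number of clusters used and \(\lambda\) the cost of opening a new one.

import numpy as np

def dp_means(X, lam, n_iter=50):
    # DP-means: the small-variance-asymptotics limit of a DP Gaussian mixture.
    # lam is the penalty for opening a new cluster.
    means = [X[0].copy()]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assignment step: nearest mean, or a new cluster if every
        # existing mean is farther than lam in squared distance.
        for i, x in enumerate(X):
            d2 = np.array([np.sum((x - m) ** 2) for m in means])
            k = int(d2.argmin())
            if d2[k] > lam:
                means.append(x.copy())
                k = len(means) - 1
            labels[i] = k
        # Update step: drop empty clusters, relabel, recompute means.
        live = np.unique(labels)
        labels = np.searchsorted(live, labels)
        means = [X[labels == k].mean(axis=0) for k in range(len(live))]
    return np.array(means), labels

Larger lam yields fewer clusters, mirroring the role played by the DP concentration parameter after rescaling.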

Hierarchical mixture modeling with normalized inverse-Gaussian priors

A Lijoi, RH Mena, I Prünster - Journal of the American Statistical …, 2005 - Taylor & Francis
In recent years the Dirichlet process prior has enjoyed great success in the context of
Bayesian mixture modeling. The idea of overcoming the discreteness of its realizations by …

Robust Bayesian hierarchical modeling and inference using scale mixtures of normal distributions

L Ouyang, S Zhu, K Ye, C Park, M Wang - IISE Transactions, 2022 - Taylor & Francis
Empirical models that relate multiple quality features to a set of design variables play a vital
role in many industrial process optimization methods. Many of the current modeling methods …
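
For context, a scale mixture of normals builds a heavy-tailed, outlier-tolerant error model hierarchically:

\[ y_{i} \mid \lambda_{i} \;\sim\; \mathcal{N}\!\left(\mu_{i}, \sigma^{2}/\lambda_{i}\right), \qquad \lambda_{i} \;\sim\; \pi(\lambda), \]

e.g. \(\lambda_{i} \sim \mathrm{Gamma}(\nu/2, \nu/2)\) marginalizes to Student-t errors with \(\nu\) degrees of freedom, so an aberrant observation is absorbed by a small \(\lambda_{i}\) (an inflated local variance) instead of distorting the fit, which is the mechanism behind the robustness in the title.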

Applied Bayesian non- and semi-parametric inference using DPpackage

A Jara - SpherWave: An R Package for Analyzing Scattered …, 2007 - 130.225.254.116
In many practical situations, a parametric model cannot be expected to appropriately
describe the chance mechanism generating an observed dataset, and unrealistic …

Inferring parameters and structure of latent variable models by variational Bayes

H Attias - arXiv preprint arXiv:1301.6676, 2013 - arxiv.org
Current methods for learning graphical models with latent variables and a fixed structure
estimate optimal values for the model parameters. Whereas this approach usually produces …
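
The object optimized in variational Bayes is the evidence lower bound: for latent variables (and, in the fully Bayesian case, parameters) \(\mathbf{z}\) and an approximating posterior \(q\),

\[ \log p(\mathbf{x}) \;\ge\; \mathbb{E}_{q(\mathbf{z})}\!\left[\log p(\mathbf{x}, \mathbf{z}) - \log q(\mathbf{z})\right] \;=\; \mathcal{L}(q), \]

and maximizing \(\mathcal{L}\) over a tractable (e.g. mean-field) family yields both an approximate posterior and a lower bound on the marginal likelihood, which can then be compared across candidate model structures, the use the title points to.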