Deep neural networks with dependent weights: Gaussian process mixture limit, heavy tails, sparsity and compressibility

H Lee, F Ayed, P Jung, J Lee, H Yang… - arXiv preprint arXiv …, 2022 - arxiv.org
This article studies the infinite-width limit of deep feedforward neural networks whose
weights are dependent, and modelled via a mixture of Gaussian distributions. Each hidden …

Heavy-Tailed NGG-Mixture Models

VP Ramírez, M de Carvalho, L Gutiérrez - Bayesian Analysis, 2024 - projecteuclid.org
Heavy tails are often found in practice, and yet they are an Achilles heel of a variety of
mainstream random probability measures such as the Dirichlet process (DP). The first …

Batch and online variational learning of hierarchical Pitman-Yor mixtures of multivariate beta distributions

N Manouchehri, N Bouguila… - 2021 20th IEEE …, 2021 - ieeexplore.ieee.org
In this paper, we propose hierarchical Pitman-Yor process mixtures of multivariate Beta
distributions and learn this novel clustering method by online variational inference. The …
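
As background for the entry above, the Pitman-Yor process admits a stick-breaking construction with sticks V_j ~ Beta(1 - d, alpha + j*d); setting d = 0 recovers the Dirichlet process. A minimal truncated sketch of the weights (illustrative only, not the authors' variational method; the function and parameter names are ours):

```python
import random

def pitman_yor_weights(alpha, d, k, rng=random.Random(0)):
    """Truncated stick-breaking weights for a Pitman-Yor process.

    V_j ~ Beta(1 - d, alpha + j * d), w_j = V_j * prod_{i<j} (1 - V_i).
    d = 0 recovers the Dirichlet process.
    """
    weights, remaining = [], 1.0
    for j in range(1, k + 1):
        v = rng.betavariate(1.0 - d, alpha + j * d)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights

w = pitman_yor_weights(alpha=1.0, d=0.5, k=50)
# the truncated weights are positive and sum to less than 1;
# the leftover mass is what a truncation bound would control
```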

Sparse networks with core-periphery structure

C Naik, F Caron, J Rousseau - 2021 - projecteuclid.org
We propose a statistical model for graphs with a core-periphery structure. We give a precise
notion of what it means for a graph to have this structure, based on the sparsity properties of …

Asymptotic behavior of the number of distinct values in a sample from the geometric stick-breaking process

P De Blasi, RH Mena, I Prünster - Annals of the Institute of Statistical …, 2021 - Springer
Discrete random probability measures are a key ingredient of Bayesian nonparametric
inference. A sample generates ties with positive probability and a fundamental object of both …
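
The geometric stick-breaking process referenced above replaces the random beta sticks of the Dirichlet process with a fixed stick length λ, so the weights w_j = λ(1−λ)^(j−1) are geometric and a cluster label reduces to a geometric variate. A minimal sketch of sampling and counting distinct values (assuming a fixed λ for simplicity; the construction can also place a prior on λ; this is not the authors' code):

```python
import random

def geometric_sb_sample(n, lam, rng=random.Random(1)):
    """Draw n cluster labels from a geometric stick-breaking process.

    Weights decay geometrically: w_j = lam * (1 - lam)**(j - 1),
    so a label is simply a Geometric(lam) trial count.
    """
    labels = []
    for _ in range(n):
        j = 1
        while rng.random() >= lam:  # keep breaking until a "success"
            j += 1
        labels.append(j)
    return labels

labels = geometric_sb_sample(n=1000, lam=0.2)
k_n = len(set(labels))  # number of distinct values in the sample
```

The asymptotics of `k_n` as n grows is the quantity studied in the paper.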

On sparsity, power-law, and clustering properties of graphex processes

F Caron, F Panero, J Rousseau - Advances in Applied Probability, 2023 - cambridge.org
This paper investigates properties of the class of graphs based on exchangeable point
processes. We provide asymptotic expressions for the number of edges, number of nodes …

A General Purpose Approximation to the Ferguson-Klass Algorithm for Sampling from Lévy Processes Without Gaussian Components

D Bernaciak, JE Griffin - arXiv preprint arXiv:2407.01483, 2024 - arxiv.org
We propose a general-purpose approximation to the Ferguson-Klass algorithm for
generating samples from Lévy processes without Gaussian components. We show that the …
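
The Ferguson-Klass series represents the jumps of a completely random measure as J_i = U⁻¹(G_i), where G_1 < G_2 < … are arrival times of a unit-rate Poisson process and U is the tail mass of the Lévy intensity. For the σ-stable intensity this inverse has closed form, so no approximation is needed; the paper targets cases (such as the gamma process) where it does not. A minimal sketch of the exact stable case (our own illustration, not the authors' implementation):

```python
import math
import random

def ferguson_klass_stable_jumps(sigma, m, rng=random.Random(0)):
    """First m jumps of a sigma-stable CRM via the Ferguson-Klass series.

    For nu(ds) = sigma * s**(-1 - sigma) / Gamma(1 - sigma) ds, the tail mass
    is U(x) = x**(-sigma) / Gamma(1 - sigma), which inverts in closed form:
    J_i = (Gamma(1 - sigma) * G_i) ** (-1 / sigma).
    """
    jumps, g = [], 0.0
    for _ in range(m):
        g += rng.expovariate(1.0)  # next unit-rate Poisson arrival time
        jumps.append((math.gamma(1.0 - sigma) * g) ** (-1.0 / sigma))
    return jumps  # jumps come out in strictly decreasing order

j = ferguson_klass_stable_jumps(sigma=0.5, m=20)
```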

A unified construction for series representations and finite approximations of completely random measures

J Lee, X Miscouridou, F Caron - Bernoulli, 2023 - projecteuclid.org
Bernoulli 29(3), 2023, 2142–2166. https://doi.org/10.3150/22-BEJ1536

Similarity-based Random Partition Distribution for Clustering Functional Data

T Wakayama, S Sugasawa, G Kobayashi - arXiv preprint arXiv …, 2023 - arxiv.org
The random partition distribution is a powerful tool for model-based clustering. However,
implementing it in practice can be challenging for functional spatial data such as hourly …