When Gaussian process meets big data: A review of scalable GPs

H Liu, YS Ong, X Shen, J Cai - IEEE transactions on neural …, 2020 - ieeexplore.ieee.org
The vast quantity of information brought by big data as well as the evolving computer
hardware encourages success stories in the machine learning community. In the …

Rethinking Bayesian learning for data analysis: The art of prior and inference in sparsity-aware modeling

L Cheng, F Yin, S Theodoridis… - IEEE Signal …, 2022 - ieeexplore.ieee.org
Sparse modeling for signal processing and machine learning, in general, has been a
focus of scientific research for over two decades. Among others, supervised sparsity-aware …

Simulation intelligence: Towards a new generation of scientific methods

A Lavin, D Krakauer, H Zenil, J Gottschlich… - arXiv preprint arXiv …, 2021 - arxiv.org
The original "Seven Motifs" set forth a roadmap of essential methods for the field of scientific
computing, where a motif is an algorithmic method that captures a pattern of computation …

Functional variational Bayesian neural networks

S Sun, G Zhang, J Shi, R Grosse - arXiv preprint arXiv:1903.05779, 2019 - arxiv.org
Variational Bayesian neural networks (BNNs) perform variational inference over weights, but
it is difficult to specify meaningful priors and approximate posteriors in a high-dimensional …

Exact Gaussian processes on a million data points

K Wang, G Pleiss, J Gardner, S Tyree… - Advances in neural …, 2019 - proceedings.neurips.cc
Gaussian processes (GPs) are flexible non-parametric models, with a capacity that grows
with the available data. However, computational constraints with standard inference …

Kernel methods through the roof: handling billions of points efficiently

G Meanti, L Carratino, L Rosasco… - Advances in Neural …, 2020 - proceedings.neurips.cc
Kernel methods provide an elegant and principled approach to nonparametric learning, but
so far could hardly be used in large scale problems, since naïve implementations scale …

Hilbert space methods for reduced-rank Gaussian process regression

A Solin, S Särkkä - Statistics and Computing, 2020 - Springer
This paper proposes a novel scheme for reduced-rank Gaussian process regression. The
method is based on an approximate series expansion of the covariance function in terms of …

End-to-end meta-Bayesian optimisation with transformer neural processes

A Maraval, M Zimmer, A Grosnit… - Advances in Neural …, 2024 - proceedings.neurips.cc
Meta-Bayesian optimisation (meta-BO) aims to improve the sample efficiency of Bayesian
optimisation by leveraging data from related tasks. While previous methods successfully …

Posterior and computational uncertainty in Gaussian processes

J Wenger, G Pleiss, M Pförtner… - Advances in …, 2022 - proceedings.neurips.cc
Gaussian processes scale prohibitively with the size of the dataset. In response, many
approximation methods have been developed, which inevitably introduce approximation …

Permutation search of tensor network structures via local sampling

C Li, J Zeng, Z Tao, Q Zhao - International Conference on …, 2022 - proceedings.mlr.press
Recent works put much effort into tensor network structure search (TN-SS), aiming to select
suitable tensor network (TN) structures, involving the TN-ranks, formats, and so on, for the …