Sparse polynomial chaos expansions: Literature survey and benchmark

N Lüthen, S Marelli, B Sudret - SIAM/ASA Journal on Uncertainty …, 2021 - SIAM
Sparse polynomial chaos expansions (PCE) are a popular surrogate modelling method that
takes advantage of the properties of PCE, the sparsity-of-effects principle, and powerful …
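
A minimal sketch of the regression-based sparse PCE idea this survey benchmarks, assuming two inputs uniform on [-1, 1], a tensorized Legendre basis, and LASSO for sparsity; the toy model and truncation degree are illustrative choices, not from the paper.

```python
# Minimal sketch of a regression-based sparse PCE (not the paper's benchmark code).
# Assumptions: 2 inputs uniform on [-1, 1], orthonormal Legendre basis, LASSO for sparsity.
import numpy as np
from itertools import product
from numpy.polynomial.legendre import legval
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
d, p, n = 2, 6, 200                      # input dimension, total degree, sample size

def model(x):                            # toy model to be surrogated
    return np.sin(np.pi * x[:, 0]) + 0.5 * x[:, 1] ** 2

X = rng.uniform(-1.0, 1.0, size=(n, d))
y = model(X)

# Multi-indices of total degree <= p and the corresponding design matrix
alphas = [a for a in product(range(p + 1), repeat=d) if sum(a) <= p]

def legendre_1d(x, k):                   # orthonormal Legendre polynomial of degree k
    c = np.zeros(k + 1); c[k] = 1.0
    return legval(x, c) * np.sqrt(2 * k + 1)

Psi = np.column_stack([
    np.prod([legendre_1d(X[:, j], a[j]) for j in range(d)], axis=0)
    for a in alphas
])

# Sparse regression: most coefficients are driven to (near) zero
pce = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000).fit(Psi, y)
active = np.flatnonzero(np.abs(pce.coef_) > 1e-8)
print(f"{len(active)} active terms out of {len(alphas)} candidate basis functions")
```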

Validation free and replication robust volume-based data valuation

X Xu, Z Wu, CS Foo, BKH Low - Advances in Neural …, 2021 - proceedings.neurips.cc
Data valuation arises as a non-trivial challenge in real-world use cases such as
collaborative machine learning, federated learning, trusted data sharing, data marketplaces …
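
A short sketch of the underlying volume idea, assuming the value of a dataset is tied to the Gram determinant of its feature matrix; the paper's replication-robust variant (volume over a discretized input cube) is not reproduced here.

```python
# Sketch of volume-based data valuation: a dataset's value is tied to the volume
# spanned by its feature matrix, Vol(X) = sqrt(det(X^T X)). The replication-robust
# variant proposed in the paper is not shown.
import numpy as np

def volume(X):
    """Volume of the parallelepiped spanned by the columns of X (n_samples x n_features)."""
    gram = X.T @ X
    return float(np.sqrt(np.linalg.det(gram)))

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))
X_rep = np.tile(X, (5, 1))                     # same information, replicated 5x

print("volume(X)          ~", volume(X))
print("volume(replicated) ~", volume(X_rep))   # inflated by 5**(d/2): motivates the robust variant
```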

TTOpt: A maximum volume quantized tensor train-based optimization and its application to reinforcement learning

K Sozykin, A Chertkov, R Schutski… - Advances in …, 2022 - proceedings.neurips.cc
We present a novel procedure for optimization based on the combination of efficient
quantized tensor train representation and a generalized maximum matrix volume principle …
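
The sketch below shows only the classical maxvol iteration (selecting an r-by-r submatrix of quasi-maximal volume from a tall matrix), the principle TTOpt builds on; the quantized tensor-train sweep itself is not reproduced, and the initialization via pivoted QR is an assumption of this sketch.

```python
# Sketch of the classical maxvol iteration: pick r rows of a tall n x r matrix whose
# submatrix has (quasi-)maximal |det|.
import numpy as np
from scipy.linalg import qr

def maxvol(A, tol=1.05, max_iter=200):
    n, r = A.shape
    # initialize with rows selected by column-pivoted QR of A^T
    _, _, piv = qr(A.T, pivoting=True)
    rows = list(piv[:r])
    for _ in range(max_iter):
        B = A @ np.linalg.inv(A[rows])           # B[rows] == identity
        i, j = np.unravel_index(np.argmax(np.abs(B)), B.shape)
        if abs(B[i, j]) <= tol:                  # no swap increases |det| enough
            break
        rows[j] = i                              # swap row j of the submatrix for row i
    return np.array(rows)

rng = np.random.default_rng(2)
A = rng.normal(size=(500, 8))
rows = maxvol(A)
print("selected rows:", rows)
print("|det| of selected submatrix:", abs(np.linalg.det(A[rows])))
```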

Robust CUR decomposition: Theory and imaging applications

HQ Cai, K Hamm, L Huang, D Needell - SIAM Journal on Imaging Sciences, 2021 - SIAM
This paper considers the use of robust principal component analysis (RPCA) in a CUR
decomposition framework and applications thereof. Our main algorithms produce a robust …
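
For reference, a plain CUR skeleton looks as follows; this is only the classical decomposition the paper builds its robust (RPCA-based) variants on, shown on an exactly low-rank matrix with randomly sampled rows and columns.

```python
# Plain CUR skeleton sketch: A ~= C @ U @ R with C = A[:, J], R = A[I, :] and
# U = pinv(A[I, J]); exact when rank(A[I, J]) == rank(A). The paper's robust
# (RPCA-based) variant is not reproduced here.
import numpy as np

rng = np.random.default_rng(3)
m, n, r = 300, 200, 10
A = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))    # exactly rank-r matrix

I = rng.choice(m, size=3 * r, replace=False)             # sampled rows
J = rng.choice(n, size=3 * r, replace=False)             # sampled columns

C = A[:, J]
R = A[I, :]
U = np.linalg.pinv(A[np.ix_(I, J)])

err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
print("relative CUR error:", err)                        # ~ machine precision here
```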

Sobol tensor trains for global sensitivity analysis

R Ballester-Ripoll, EG Paredes, R Pajarola - Reliability Engineering & …, 2019 - Elsevier
Sobol indices are a widespread quantitative measure for variance-based global sensitivity
analysis, but computing and utilizing them remains challenging for high-dimensional …
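
The paper extracts Sobol indices from a tensor-train surrogate; as a point of reference, the quantity being computed can be estimated with the standard pick-freeze Monte Carlo estimator, sketched here on the Ishigami test function (the function and sample size are illustrative choices).

```python
# Sketch of first-order Sobol indices via the pick-freeze (Saltelli-style) Monte
# Carlo estimator on the Ishigami function. The tensor-train machinery of the
# paper is not reproduced here.
import numpy as np

def ishigami(X, a=7.0, b=0.1):
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.sin(x1) + a * np.sin(x2) ** 2 + b * x3 ** 4 * np.sin(x1)

rng = np.random.default_rng(4)
d, N = 3, 200_000
A = rng.uniform(-np.pi, np.pi, size=(N, d))
B = rng.uniform(-np.pi, np.pi, size=(N, d))

fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # "freeze" all inputs except x_i
    S_i = np.mean(fB * (ishigami(ABi) - fA)) / var
    print(f"S_{i + 1} ~ {S_i:.3f}")          # analytic values: 0.314, 0.442, 0.0
```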

Parallel cross interpolation for high-precision calculation of high-dimensional integrals

S Dolgov, D Savostyanov - Computer Physics Communications, 2020 - Elsevier
We propose a parallel version of the cross interpolation algorithm and apply it to calculate
high-dimensional integrals motivated by the Ising model in quantum physics. In contrast to …
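
The two-dimensional cross (skeleton) idea that the paper generalizes to parallel tensor-train cross interpolation can be sketched with a full-pivot adaptive cross approximation on a matrix sampled from a smooth kernel; the kernel and rank below are illustrative choices.

```python
# Two-dimensional cross (skeleton) approximation via full-pivot ACA, as a toy
# illustration of the idea extended by the paper to parallel TT cross interpolation.
import numpy as np

def aca_full_pivot(M, rank):
    """Rank-`rank` cross approximation built from `rank` rows and columns of M."""
    R = M.copy()                               # residual
    skel = np.zeros_like(M)
    for _ in range(rank):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
        if R[i, j] == 0.0:
            break
        update = np.outer(R[:, j], R[i, :]) / R[i, j]
        skel += update
        R -= update
    return skel

x = np.linspace(0.0, 1.0, 400)
M = 1.0 / (1.0 + np.add.outer(x, x) ** 2)      # smooth kernel, numerically low rank

approx = aca_full_pivot(M, rank=8)
print("relative error:", np.linalg.norm(M - approx) / np.linalg.norm(M))
```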

Deep composition of tensor-trains using squared inverse Rosenblatt transports

T Cui, S Dolgov - Foundations of Computational Mathematics, 2022 - Springer
Characterising intractable high-dimensional random variables is one of the fundamental
challenges in stochastic computation. The recent surge of transport maps offers a …
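
A plain inverse Rosenblatt transport in two dimensions can be sketched on a grid: push uniform samples through the marginal inverse CDF of the first coordinate and the conditional inverse CDF of the second. The squared tensor-train parametrization and deep composition of such maps, which are the paper's contribution, are not reproduced; the banana-shaped target density is an illustrative choice.

```python
# Sketch of a grid-based inverse Rosenblatt transport in 2D (marginal inverse CDF,
# then conditional inverse CDF of x2 | x1), pushing uniforms to an unnormalized target.
import numpy as np

g = np.linspace(-4.0, 4.0, 400)
X1, X2 = np.meshgrid(g, g, indexing="ij")
dens = np.exp(-0.5 * X1 ** 2 - 0.5 * (X2 - X1 ** 2 / 2.0) ** 2)   # banana density

def inv_cdf(pdf_vals, grid, u):
    """Inverse CDF of a tabulated 1D density via interpolation."""
    cdf = np.cumsum(pdf_vals)
    cdf = cdf / cdf[-1]
    return np.interp(u, cdf, grid)

rng = np.random.default_rng(5)
n = 5
U = rng.uniform(size=(n, 2))

samples = np.empty((n, 2))
marg1 = dens.sum(axis=1)                          # unnormalized marginal of x1
for k in range(n):
    x1 = inv_cdf(marg1, g, U[k, 0])               # first Rosenblatt coordinate
    row = dens[np.argmin(np.abs(g - x1)), :]      # conditional density of x2 | x1 (nearest grid row)
    samples[k, 0] = x1
    samples[k, 1] = inv_cdf(row, g, U[k, 1])
print(samples)
```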

Mode-wise tensor decompositions: Multi-dimensional generalizations of CUR decompositions

HQ Cai, K Hamm, L Huang, D Needell - Journal of machine learning …, 2021 - jmlr.org
Low rank tensor approximation is a fundamental tool in modern machine learning and data
science. In this paper, we study the characterization, perturbation analysis, and an efficient …
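
A minimal numpy sketch of a Chidori-style mode-wise CUR on an exactly low-Tucker-rank 3-way tensor, assuming random index selection rather than the sampling schemes analyzed in the paper: keep a core subtensor and, for each mode, the fibers through the other modes' selected indices, then reconstruct by mode products with pseudoinverses of the core unfoldings.

```python
# Sketch of a mode-wise (Chidori-style) tensor CUR: T ~= R x_n (C_n @ pinv(R_(n))),
# where R is the core subtensor and C_n the mode-n fibers through the other modes'
# selected indices. Exact here because T has low Tucker rank and the random index
# sets capture it.
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

rng = np.random.default_rng(6)
dims, ranks = (40, 50, 60), (3, 4, 5)
core = rng.normal(size=ranks)
factors = [rng.normal(size=(d, r)) for d, r in zip(dims, ranks)]
T = core
for n, U in enumerate(factors):
    T = mode_mult(T, U, n)                       # exactly Tucker-rank (3, 4, 5)

idx = [rng.choice(d, size=2 * r, replace=False) for d, r in zip(dims, ranks)]
R = T[np.ix_(*idx)]                              # core subtensor

approx = R
for n in range(3):
    sel = [idx[m] if m != n else np.arange(dims[n]) for m in range(3)]
    C_n = unfold(T[np.ix_(*sel)], n)             # mode-n fibers through selected indices
    approx = mode_mult(approx, C_n @ np.linalg.pinv(unfold(R, n)), n)

print("relative error:", np.linalg.norm(approx - T) / np.linalg.norm(T))
```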

Tensor Network Space-Time Spectral Collocation Method for Time-Dependent Convection-Diffusion-Reaction Equations

D Adak, DP Truong, G Manzini, KØ Rasmussen… - Mathematics, 2024 - mdpi.com
Emerging tensor network techniques for solutions of partial differential equations (PDEs),
known for their ability to break the curse of dimensionality, deliver new mathematical …
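
The collocation building block the paper combines with tensor networks can be sketched in one dimension: a Chebyshev differentiation matrix applied to a steady convection-diffusion-reaction problem with a manufactured solution. The space-time tensor-network formulation itself is not reproduced, and the coefficients below are illustrative.

```python
# Sketch of Chebyshev spectral collocation (Trefethen's differentiation matrix)
# for a steady 1D convection-diffusion-reaction problem: -nu u'' + vel u' + react u = f.
import numpy as np

def cheb(N):
    """Chebyshev points and differentiation matrix on [-1, 1]."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))
    return D, x

nu, vel, react = 0.1, 1.0, 2.0
N = 32
D, x = cheb(N)
D2 = D @ D

u_exact = (1.0 - x ** 2) * np.sin(np.pi * x)        # manufactured solution, zero at x = +-1
du = -2.0 * x * np.sin(np.pi * x) + np.pi * (1.0 - x ** 2) * np.cos(np.pi * x)
d2u = (-2.0 - np.pi ** 2 * (1.0 - x ** 2)) * np.sin(np.pi * x) - 4.0 * np.pi * x * np.cos(np.pi * x)
f = -nu * d2u + vel * du + react * u_exact

L = -nu * D2 + vel * D + react * np.eye(N + 1)
u = np.zeros(N + 1)
u[1:-1] = np.linalg.solve(L[1:-1, 1:-1], f[1:-1])   # impose u(+-1) = 0 by trimming boundary rows/cols

print("max error:", np.max(np.abs(u - u_exact)))    # spectral accuracy expected
```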

Generalized separable nonnegative matrix factorization

J Pan, N Gillis - IEEE transactions on pattern analysis and …, 2019 - ieeexplore.ieee.org
Nonnegative matrix factorization (NMF) is a linear dimensionality reduction technique for nonnegative
data with applications such as image analysis, text mining, audio source separation, and …
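
The plain separable case that the paper generalizes can be sketched with the classic successive projection algorithm (SPA): when every basis column of W appears among the columns of the data matrix, SPA recovers those columns greedily. The synthetic data construction below is an illustrative assumption, not the paper's setup.

```python
# Sketch of separable NMF via the successive projection algorithm (SPA): X = W H
# with the columns of W planted among the columns of X; SPA picks them greedily.
# The paper's generalized separability condition is not reproduced here.
import numpy as np

def spa(X, r):
    """Return indices of r columns of X that (approximately) span its columns."""
    R = X.astype(float).copy()
    picked = []
    for _ in range(r):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))      # column with largest residual norm
        picked.append(j)
        u = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(u, u @ R)                             # project out the chosen direction
    return picked

rng = np.random.default_rng(7)
m, n, r = 30, 200, 5
W = rng.random((m, r))
H = rng.dirichlet(np.ones(r), size=n).T                    # columns of X are convex combos of W's columns
H[:, :r] = np.eye(r)                                       # plant the pure (separable) columns up front
X = W @ H

print("recovered column indices:", sorted(spa(X, r)))      # should be [0, 1, 2, 3, 4]
```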