R Minster, AK Saibaba, ME Kilmer - SIAM Journal on Mathematics of Data …, 2020 - SIAM
Many applications in data science and scientific computing involve large-scale datasets that are expensive to store and manipulate. However, these datasets possess inherent …
A Holmes, AY Matsuura - 2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org
Effective quantum computation relies upon making good use of the exponential information capacity of a quantum machine. A large barrier to designing quantum algorithms for …
The tensor-train (TT) format is a highly compact low-rank representation for high-dimensional tensors. TT is particularly useful when representing approximations to the …
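To make the TT format concrete, the following is a minimal sketch, assuming NumPy, of the sequential-SVD construction commonly known as TT-SVD; the function names are illustrative and not taken from the entry above. It factors a dense array into a chain of three-way cores and contracts them back for a sanity check.

import numpy as np

def tt_svd(tensor, rel_tol=1e-10):
    # Sequentially split off one mode at a time with truncated SVDs.
    shape = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(1, -1)
    for k in range(len(shape) - 1):
        mat = mat.reshape(rank * shape[k], -1)
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = max(1, int(np.sum(s > rel_tol * s[0])))
        cores.append(U[:, :new_rank].reshape(rank, shape[k], new_rank))
        mat = s[:new_rank, None] * Vt[:new_rank, :]   # carry the remainder forward
        rank = new_rank
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_to_full(cores):
    # Contract the chain of cores back into a dense array.
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))

T = np.random.rand(4, 5, 6, 3)
cores = tt_svd(T)
print([c.shape for c in cores])
print(np.allclose(tt_to_full(cores), T))   # True up to the truncation tolerance

The per-mode singular-value threshold controls the trade-off between the TT ranks (storage) and the approximation error.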
Randomized algorithms are efficient techniques for big data tensor analysis. In this tutorial paper, we review and extend a variety of randomized algorithms for decomposing large …
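As an illustration of the randomized flavor such tutorials cover, here is a hedged sketch, assuming NumPy and with hypothetical function names, of the standard sketch-then-project recipe (in the spirit of Halko, Martinsson, and Tropp) applied mode by mode to obtain approximate Tucker/HOSVD factor matrices; it is not the specific algorithms of the paper above.

import numpy as np

def randomized_svd_left(A, rank, oversample=10, seed=0):
    # Estimate the top left singular vectors from a Gaussian sketch of A.
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ omega)                    # basis for the sampled range
    U_small, _, _ = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U_small)[:, :rank]

def randomized_hosvd_factors(T, ranks, seed=0):
    # One randomized factor per mode, each computed from a sketched mode unfolding.
    factors = []
    for mode, r in enumerate(ranks):
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        factors.append(randomized_svd_left(unfolding, r, seed=seed + mode))
    return factors

# Synthetic tensor with multilinear rank (3, 3, 3)
rng = np.random.default_rng(0)
core = rng.standard_normal((3, 3, 3))
U = [rng.standard_normal((30, 3)) for _ in range(3)]
T = np.einsum('abc,ia,jb,kc->ijk', core, *U)

F = randomized_hosvd_factors(T, ranks=(3, 3, 3))
G = np.einsum('ijk,ia,jb,kc->abc', T, *F)            # project onto estimated subspaces
T_hat = np.einsum('abc,ia,jb,kc->ijk', G, *F)
print(np.linalg.norm(T - T_hat) / np.linalg.norm(T))  # near machine precision here

The key point of the randomized approach is that only sketched products of the unfoldings are ever decomposed, which is what makes such methods attractive for large tensors.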
L Li, W Yu, K Batselier - Journal of Computational and Applied Mathematics, 2022 - Elsevier
In recent years, the application of tensors has become more widespread in fields that involve data analytics and numerical computation. Due to the explosive growth of data, low-rank …
L Ma, E Solomonik - Advances in Neural Information …, 2022 - proceedings.neurips.cc
This work discusses tensor network embeddings, which are random matrices ($S$) with tensor network structure. These embeddings have been used to perform dimensionality …
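A minimal example of an embedding with tensor network structure, under the assumption of the simplest two-factor (Kronecker) network and assuming NumPy: applying S = S1 ⊗ S2 through its factors gives exactly the same result as the dense matrix while never materializing it.

import numpy as np

rng = np.random.default_rng(0)

# x lives in R^(n1*n2); the embedding maps it to R^(m1*m2).
n1, n2, m1, m2 = 64, 64, 8, 8

# Two small Gaussian maps; their Kronecker product is the structured embedding S.
S1 = rng.standard_normal((m1, n1)) / np.sqrt(m1)
S2 = rng.standard_normal((m2, n2)) / np.sqrt(m2)
x = rng.standard_normal(n1 * n2)

# Unstructured application: materialize S explicitly (m1*m2 by n1*n2 entries).
y_dense = np.kron(S1, S2) @ x

# Structured application: reshape x and contract the two factors separately,
# costing O(m1*n1*n2 + m1*m2*n2) and never forming S.
y_fast = (S1 @ x.reshape(n1, n2) @ S2.T).reshape(-1)

print(np.allclose(y_dense, y_fast))   # True: identical embedding, far cheaper to apply

More elaborate tensor network embeddings replace the two Gaussian factors with a larger network of small cores, but the principle of applying the map factor by factor is the same.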
We present an object-oriented optimization framework that can be employed to solve small- and large-scale problems based on the concept of vectors and operators. By using such a …
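The framework itself is not reproduced in this snippet; the sketch below, with all class and function names hypothetical and NumPy assumed, only illustrates the general vector/operator abstraction it alludes to, in which a solver touches the problem exclusively through forward and adjoint actions and therefore runs unchanged from small dense matrices to large matrix-free operators.

import numpy as np

class Operator:
    # Abstract linear operator: a solver only ever needs forward and adjoint actions.
    def forward(self, x):
        raise NotImplementedError
    def adjoint(self, y):
        raise NotImplementedError

class MatrixOperator(Operator):
    # Small-scale case: the operator is an explicit dense matrix.
    def __init__(self, A):
        self.A = np.asarray(A, dtype=float)
    def forward(self, x):
        return self.A @ x
    def adjoint(self, y):
        return self.A.T @ y

def gradient_descent_lsq(op, b, x0, step, iters=200):
    # Minimize 0.5 * ||op(x) - b||^2 using nothing but the operator interface,
    # so the same code applies matrix-free for large-scale problems.
    x = x0.copy()
    for _ in range(iters):
        x = x - step * op.adjoint(op.forward(x) - b)
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = gradient_descent_lsq(MatrixOperator(A), b, np.zeros(2), step=0.05)
print(x)   # close to [2, 3], the solution of Ax = b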
Z Chen, H Jiang, G Yu, L Qi - arXiv preprint arXiv:2309.08093, 2023 - arxiv.org
Tensor train decomposition is one of the most powerful approaches for processing high-dimensional data. For low-rank tensor train decomposition of large tensors, the alternating …
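For orientation only, here is a minimal dense-tensor version, assuming NumPy and with illustrative names, of a single alternating-least-squares core update: one TT core is solved for exactly via pseudoinverses of the left and right interface matrices. Large-scale and randomized variants, which are what such papers target, replace these dense contractions with structured or sketched ones.

import numpy as np

def als_core_update(T, cores, k):
    # Left interface L: cores 0..k-1 contracted into a (n_0*...*n_{k-1}) x r_{k-1} matrix.
    L = np.ones((1, 1))
    for c in cores[:k]:
        L = np.tensordot(L, c, axes=([-1], [0])).reshape(-1, c.shape[-1])
    # Right interface R: cores k+1..d-1 contracted into an r_k x (n_{k+1}*...*n_{d-1}) matrix.
    R = np.ones((1, 1))
    for c in reversed(cores[k + 1:]):
        R = np.tensordot(c, R, axes=([-1], [0])).reshape(c.shape[0], -1)
    # Exact least-squares update of core k: for each physical index i,
    # minimize ||L @ G[:, i, :] @ R - T_k[:, i, :]||_F via pseudoinverses.
    Tk = T.reshape(L.shape[0], T.shape[k], R.shape[1])
    return np.einsum('pa,aib,bq->piq', np.linalg.pinv(L), Tk, np.linalg.pinv(R))

def tt_full(cores):
    out = cores[0]
    for c in cores[1:]:
        out = np.tensordot(out, c, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

# Rank-1 target and an over-parameterized random TT initial guess
rng = np.random.default_rng(0)
T = np.einsum('i,j,k->ijk', *[rng.standard_normal(n) for n in (4, 5, 6)])
cores = [rng.standard_normal(s) for s in [(1, 4, 2), (2, 5, 2), (2, 6, 1)]]
for sweep in range(5):
    for k in range(len(cores)):
        cores[k] = als_core_update(T, cores, k)
    # exact per-core updates make this error non-increasing from sweep to sweep
    print(sweep, np.linalg.norm(tt_full(cores) - T) / np.linalg.norm(T))

Because each core update is an exact least-squares solve, the fit error cannot increase, and for an exactly representable target it typically falls rapidly over the first few sweeps.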
In a recent Letter [Phys. Rev. Lett. 130, 246402 (2023)], Gleis, Li, and von Delft present an algorithm for expanding the bond dimension of a Matrix Product State wave function, giving …
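The controlled bond expansion scheme at issue in the Letter and this Comment is not reproduced here; the sketch below, assuming NumPy and a (left, physical, right) shape convention, only shows the naive zero-padding that makes "expanding the bond dimension" concrete, enlarging a bond without changing the represented state.

import numpy as np

def pad_bond(mps, site, extra):
    # Enlarge the bond between `site` and `site`+1 by zero-padding the two
    # adjacent tensors; the extra bond indices contribute nothing, so the
    # state is unchanged and only variational room is added.
    A, B = mps[site], mps[site + 1]
    A_new = np.pad(A, ((0, 0), (0, 0), (0, extra)))
    B_new = np.pad(B, ((0, extra), (0, 0), (0, 0)))
    mps = list(mps)
    mps[site], mps[site + 1] = A_new, B_new
    return mps

def contract(mps):
    out = mps[0]
    for t in mps[1:]:
        out = np.tensordot(out, t, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

# Random 4-site MPS with physical dimension 2 and bond dimension 3
rng = np.random.default_rng(1)
dims = [(1, 2, 3), (3, 2, 3), (3, 2, 3), (3, 2, 1)]
mps = [rng.standard_normal(s) for s in dims]

expanded = pad_bond(mps, 1, extra=2)
print(np.allclose(contract(mps), contract(expanded)))   # True: same state, larger bond

Methods such as the one debated in the exchange aim to choose the added bond directions more intelligently than zero padding, so that subsequent optimization steps can exploit them efficiently.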