Randomized algorithms for computation of Tucker decomposition and higher order SVD (HOSVD)

S Ahmadi-Asl, S Abukhovich, MG Asante-Mensah… - IEEE …, 2021 - ieeexplore.ieee.org
Big data analysis has become a crucial part of new emerging technologies such as the
internet of things, cyber-physical analysis, deep learning, anomaly detection, etc. Among …

Randomized algorithms for low-rank tensor decompositions in the Tucker format

R Minster, AK Saibaba, ME Kilmer - SIAM Journal on Mathematics of Data …, 2020 - SIAM
Many applications in data science and scientific computing involve large-scale datasets that
are expensive to store and manipulate. However, these datasets possess inherent …

Efficient quantum circuits for accurate state preparation of smooth, differentiable functions

A Holmes, AY Matsuura - 2020 IEEE International Conference …, 2020 - ieeexplore.ieee.org
Effective quantum computation relies upon making good use of the exponential information
capacity of a quantum machine. A large barrier to designing quantum algorithms for …

Randomized algorithms for rounding in the tensor-train format

H Al Daas, G Ballard, P Cazeaux, E Hallman… - SIAM Journal on …, 2023 - SIAM
The tensor-train (TT) format is a highly compact low-rank representation for high-
dimensional tensors. TT is particularly useful when representing approximations to the …

Randomized algorithms for fast computation of low rank tensor ring model

S Ahmadi-Asl, A Cichocki, AH Phan… - Machine Learning …, 2020 - iopscience.iop.org
Randomized algorithms are efficient techniques for big data tensor analysis. In this tutorial
paper, we review and extend a variety of randomized algorithms for decomposing large …

Faster tensor train decomposition for sparse data

L Li, W Yu, K Batselier - Journal of Computational and Applied Mathematics, 2022 - Elsevier
In recent years, the application of tensors has become more widespread in fields that involve
data analytics and numerical computation. Due to the explosive growth of data, low-rank …

Cost-efficient Gaussian tensor network embeddings for tensor-structured inputs

L Ma, E Solomonik - Advances in Neural Information …, 2022 - proceedings.neurips.cc
This work discusses tensor network embeddings, which are random matrices ($ S $) with
tensor network structure. These embeddings have been used to perform dimensionality …

An object-oriented optimization framework for large-scale inverse problems

E Biondi, G Barnier, RG Clapp, F Picetti… - Computers & Geosciences, 2021 - Elsevier
We present an object-oriented optimization framework that can be employed to solve small-
and large-scale problems based on the concept of vectors and operators. By using such a …

Low-rank tensor train decomposition using TensorSketch

Z Chen, H Jiang, G Yu, L Qi - arXiv preprint arXiv:2309.08093, 2023 - arxiv.org
Tensor train decomposition is one of the most powerful approaches for processing high-
dimensional data. For low-rank tensor train decomposition of large tensors, the alternating …

Comment on "Controlled Bond Expansion for Density Matrix Renormalization Group Ground State Search at Single-Site Costs" (Extended Version)

IP McCulloch, JJ Osborne - arXiv preprint arXiv:2403.00562, 2024 - arxiv.org
In a recent Letter [Phys. Rev. Lett. 130, 246402 (2023)], Gleis, Li, and von Delft present an
algorithm for expanding the bond dimension of a Matrix Product State wave function, giving …