Publication (title, authors, venue) | Cited by | Year |
Frequency principle: Fourier analysis sheds light on deep neural networks ZQJ Xu, Y Zhang, T Luo, Y Xiao, Z Ma arXiv preprint arXiv:1901.06523, 2019 | 577 | 2019 |
Two-layer neural networks for partial differential equations: Optimization and generalization theory T Luo, H Yang arXiv preprint arXiv:2006.15733, 2020 | 89 | 2020 |
Theory of the frequency principle for general deep neural networks T Luo, Z Ma, ZQJ Xu, Y Zhang arXiv preprint arXiv:1906.09235, 2019 | 86 | 2019 |
Overview frequency principle/spectral bias in deep learning ZQJ Xu, Y Zhang, T Luo Communications on Applied Mathematics and Computation, 1-38, 2024 | 74 | 2024 |
Improving “fast iterative shrinkage-thresholding algorithm”: Faster, smarter, and greedier J Liang, T Luo, CB Schönlieb SIAM Journal on Scientific Computing 44 (3), A1069-A1091, 2022 | 73 | 2022 |
Phase diagram for two-layer ReLU neural networks at infinite-width limit T Luo, ZQJ Xu, Z Ma, Y Zhang Journal of Machine Learning Research 22 (71), 1-47, 2021 | 73 | 2021 |
A type of generalization error induced by initialization in deep neural networks Y Zhang, ZQJ Xu, T Luo, Z Ma Mathematical and Scientific Machine Learning, 144-164, 2020 | 61 | 2020 |
Embedding principle of loss landscape of deep neural networks Y Zhang, Z Zhang, T Luo, ZQJ Xu Advances in Neural Information Processing Systems 34, 14848-14859, 2021 | 39 | 2021 |
MOD-Net: A machine learning approach via model-operator-data network for solving PDEs L Zhang, T Luo, Y Zhang, ZQJ Xu, Z Ma arXiv preprint arXiv:2107.03673, 2021 | 39 | 2021 |
Explicitizing an implicit bias of the frequency principle in two-layer neural networks Y Zhang, ZQJ Xu, T Luo, Z Ma arXiv preprint arXiv:1905.10264, 2019 | 38 | 2019 |
Towards understanding the condensation of neural networks at initial training H Zhou, Q Zhou, T Luo, Y Zhang, ZQJ Xu Advances in Neural Information Processing Systems 35, 2184-2196, 2022 | 33 | 2022 |
Embedding principle: a hierarchical structure of loss landscape of deep neural networks Y Zhang, Y Li, Z Zhang, T Luo, ZQJ Xu arXiv preprint arXiv:2111.15527, 2021 | 32 | 2021 |
A linear frequency principle model to understand the absence of overfitting in neural networks Y Zhang, T Luo, Z Ma, ZQJ Xu Chinese Physics Letters 38 (3), 038701, 2021 | 24 | 2021 |
Empirical phase diagram for three-layer neural networks with infinite width H Zhou, Q Zhou, Z Jin, T Luo, Y Zhang, ZQJ Xu Advances in Neural Information Processing Systems 35, 26021-26033, 2022 | 23 | 2022 |
On the exact computation of linear frequency principle dynamics and its generalization T Luo, Z Ma, ZQJ Xu, Y Zhang SIAM Journal on Mathematics of Data Science 4 (4), 1272-1292, 2022 | 19 | 2022 |
From Atomistic Model to the Peierls–Nabarro Model with γ-surface for Dislocations T Luo, P Ming, Y Xiang Archive for Rational Mechanics and Analysis 230 (2), 735-781, 2018 | 12 | 2018 |
Phase diagram of initial condensation for two-layer neural networks Z Chen, Y Li, T Luo, Z Zhou, ZQJ Xu arXiv preprint arXiv:2303.06561, 2023 | 11 | 2023 |
A regularised deep matrix factorised model of matrix completion for image restoration Z Li, ZQJ Xu, T Luo, H Wang IET Image Processing 16 (12), 3212-3224, 2022 | 10 | 2022 |
Revisit of the Peierls-Nabarro model for edge dislocations in Hilbert space Y Gao, JG Liu, T Luo, Y Xiang Discrete & Continuous Dynamical Systems-B 26 (6), 2021 | 8 | 2021 |
Linear stability hypothesis and rank stratification for nonlinear models Y Zhang, Z Zhang, L Zhang, Z Bai, T Luo, ZQJ Xu arXiv preprint arXiv:2211.11623, 2022 | 7 | 2022 |