Scientific machine learning through physics-informed neural networks: Where we are and what's next

S Cuomo, VS Di Cola, F Giampaolo, G Rozza… - Journal of Scientific …, 2022 - Springer
Abstract Physics-Informed Neural Networks (PINNs) are neural networks (NNs) that encode
model equations, like Partial Differential Equations (PDEs), as a component of the neural …

Artificial intelligence: A powerful paradigm for scientific research

Y Xu, X Liu, X Cao, C Huang, E Liu, S Qian, X Liu… - The Innovation, 2021 - cell.com
Artificial intelligence (AI), coupled with promising machine learning (ML) techniques well
known from computer science, is broadly affecting many aspects of various fields including …

Towards revealing the mystery behind chain of thought: a theoretical perspective

G Feng, B Zhang, Y Gu, H Ye, D He… - Advances in Neural …, 2024 - proceedings.neurips.cc
Recent studies have discovered that Chain-of-Thought prompting (CoT) can dramatically
improve the performance of Large Language Models (LLMs), particularly when dealing with …

Spiking neural networks and their applications: A review

K Yamazaki, VK Vo-Ho, D Bulsara, N Le - Brain Sciences, 2022 - mdpi.com
The past decade has witnessed the great success of deep neural networks in various
domains. However, deep neural networks are very resource-intensive in terms of energy …

Recent advances and applications of deep learning methods in materials science

K Choudhary, B DeCost, C Chen, A Jain… - npj Computational …, 2022 - nature.com
Deep learning (DL) is one of the fastest-growing topics in materials data science, with
rapidly emerging applications spanning atomistic, image-based, spectral, and textual data …

Score approximation, estimation and distribution recovery of diffusion models on low-dimensional data

M Chen, K Huang, T Zhao… - … Conference on Machine …, 2023 - proceedings.mlr.press
Diffusion models achieve state-of-the-art performance in various generation tasks. However,
their theoretical foundations fall far behind. This paper studies score approximation …

A comprehensive and fair comparison of two neural operators (with practical extensions) based on fair data

L Lu, X Meng, S Cai, Z Mao, S Goswami… - Computer Methods in …, 2022 - Elsevier
Neural operators can learn nonlinear mappings between function spaces and offer a new
simulation paradigm for real-time prediction of complex dynamics for realistic diverse …

How attentive are graph attention networks?

S Brody, U Alon, E Yahav - arXiv preprint arXiv:2105.14491, 2021 - arxiv.org
Graph Attention Networks (GATs) are one of the most popular GNN architectures and are
considered the state-of-the-art architecture for representation learning with graphs. In …

Graph neural networks: foundation, frontiers and applications

L Wu, P Cui, J Pei, L Zhao, X Guo - … of the 28th ACM SIGKDD Conference …, 2022 - dl.acm.org
The field of graph neural networks (GNNs) has seen rapid and incredible strides over the
recent years. Graph neural networks, also known as deep learning on graphs, graph …

Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators

L Lu, P Jin, G Pang, Z Zhang… - Nature machine …, 2021 - nature.com
It is widely known that neural networks (NNs) are universal approximators of continuous
functions. However, a lesser-known but powerful result is that a NN with a single hidden layer …