Normalization techniques are essential for accelerating the training and improving the generalization of deep neural networks (DNNs), and have successfully been used in various …
J Gasteiger, F Becker… - Advances in Neural …, 2021 - proceedings.neurips.cc
Effectively predicting molecular interactions has the potential to accelerate molecular dynamics by multiple orders of magnitude and thus revolutionize chemical simulations …
X Li, M Jiang, X Zhang, M Kamp, Q Dou - arXiv preprint arXiv:2102.07623, 2021 - arxiv.org
The emerging paradigm of federated learning (FL) strives to enable collaborative training of deep models on the network edge without centrally aggregating raw data and hence …
This open-source book represents our attempt to make deep learning approachable, teaching readers the concepts, the context, and the code. The entire book is drafted in …
C Nwankpa, W Ijomah, A Gachagan… - arXiv preprint arXiv …, 2018 - arxiv.org
Deep neural networks have been successfully used in diverse emerging domains to solve real world complex problems with many more deep learning (DL) architectures, being …
Object detection, one of the most fundamental and challenging problems in computer vision, seeks to locate object instances from a large number of predefined categories in natural …
In this paper, we investigate the problem of overfitting in deep reinforcement learning. Among the most common benchmarks in RL, it is customary to use the same environments …
X Zhou, Y Lin, W Zhang… - … Conference on Machine …, 2022 - proceedings.mlr.press
Invariant Risk Minimization (IRM) is an emerging technique for extracting invariant features to help generalization under distributional shift. However, we find that there exists a …
K Lyu, Z Li, S Arora - Advances in Neural Information …, 2022 - proceedings.neurips.cc
Normalization layers (e.g., Batch Normalization, Layer Normalization) were introduced to help with optimization difficulties in very deep nets, but they clearly also help …