Gradient flow in sparse neural networks and how lottery tickets win

U Evci, Y Ioannou, C Keskin, Y Dauphin - Proceedings of the AAAI …, 2022 - ojs.aaai.org
Abstract: Sparse Neural Networks (NNs) can match the generalization of dense NNs using a
fraction of the compute/storage for inference, and have the potential to enable efficient …

Prune and tune ensembles: low-cost ensemble learning with sparse independent subnetworks

T Whitaker, D Whitley - Proceedings of the AAAI Conference on Artificial …, 2022 - ojs.aaai.org
Ensemble Learning is an effective method for improving generalization in machine learning.
However, as state-of-the-art neural networks grow larger, the computational cost associated …

Measuring the Energy Consumption and Efficiency of Deep Neural Networks: An Empirical Analysis and Design Recommendations

CE Tripp, J Perr-Sauer, J Gafur, A Nag… - arXiv preprint arXiv …, 2024 - arxiv.org
Addressing the so-called "Red-AI" trend of rising energy consumption by large-scale neural
networks, this study investigates the actual energy consumption, as measured by node-level …