Authors
Ariel Keller Rorabaugh, Silvina Caino-Lores, Travis Johnston, Michela Taufer
Publication date
2022/1/7
Journal
IEEE Transactions on Parallel and Distributed Systems
Volume
33
Issue
11
Pages
2913-2926
Publisher
IEEE
Description
Neural networks (NNs) are used in high-performance computing and high-throughput analysis to extract knowledge from datasets. Neural architecture search (NAS) automates NN design by generating, training, and analyzing thousands of NNs. However, NAS requires massive computational power for NN training. To address the challenges of efficiency and scalability, we propose PENGUIN, a decoupled fitness prediction engine that informs the search without interfering in it. PENGUIN uses parametric modeling to predict the fitness of NNs. Existing NAS methods and parametric modeling functions can be plugged into PENGUIN to build flexible NAS workflows. Through this decoupling and flexible parametric modeling, PENGUIN reduces training costs: it predicts the fitness of NNs, enabling NAS to terminate training NNs early. Early termination increases the number of NNs that fixed compute resources can evaluate …
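To make the early-termination idea concrete, below is a minimal sketch of learning-curve-based fitness prediction in the spirit of the parametric modeling the abstract describes. The specific curve form (a power law), the epoch budget, the `threshold` value, and the function names are illustrative assumptions, not PENGUIN's actual design.

```python
# Sketch: fit a parametric learning curve to partial training history,
# extrapolate the final validation accuracy, and decide whether to stop
# training a candidate NN early. All constants here are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def power_law(epoch, a, b, c):
    """Assumed parametric form: accuracy approaches c as epochs grow."""
    return c - a * np.power(epoch, -b)

def predict_final_fitness(epochs_seen, val_acc_seen, final_epoch=100):
    """Fit the curve to the epochs observed so far and extrapolate
    validation accuracy at the full (assumed) training budget."""
    params, _ = curve_fit(
        power_law, epochs_seen, val_acc_seen,
        p0=[0.5, 0.5, max(val_acc_seen)],  # rough initial guess
        maxfev=5000,
    )
    return power_law(final_epoch, *params)

def should_terminate_early(epochs_seen, val_acc_seen, threshold=0.85):
    """Stop a candidate NN if its predicted final fitness falls below
    a (hypothetical) acceptance threshold set by the NAS workflow."""
    return predict_final_fitness(epochs_seen, val_acc_seen) < threshold

# Example: validation accuracy observed over the first 5 epochs.
epochs = np.array([1, 2, 3, 4, 5], dtype=float)
accs = np.array([0.42, 0.55, 0.61, 0.65, 0.68])
print(should_terminate_early(epochs, accs))
```

Because the predictor only reads the partial training history and returns a stop/continue decision, it can sit outside the NAS loop, which is the decoupling the abstract emphasizes.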