Weight-sharing neural architecture search: A battle to shrink the optimization gap

L Xie, X Chen, K Bi, L Wei, Y Xu, L Wang… - ACM Computing Surveys (CSUR), 2021 - dl.acm.org
Neural architecture search (NAS) has attracted increasing attention. In recent years, individual search methods have been replaced by weight-sharing search methods for higher search efficiency, but the latter often suffer from lower stability. This article provides a literature review of these methods and attributes this issue to the optimization gap. From this perspective, we summarize existing approaches into several categories according to their efforts in bridging the gap, and we analyze both the advantages and disadvantages of these methodologies. Finally, we share our opinions on the future directions of NAS and AutoML. Due to the expertise of the authors, this article mainly focuses on the application of NAS to computer vision problems.
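
For readers unfamiliar with weight-sharing search, the sketch below illustrates the general idea the abstract refers to: all candidate operations live inside one super-network and share its weights, while architecture parameters weight their outputs. This is a minimal, DARTS-style illustration in PyTorch, not code from the article; the class and operation names are assumptions made for the example.

```python
# Illustrative DARTS-style weight-sharing edge (not taken from the surveyed article).
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical candidate operation set; real search spaces are larger.
CANDIDATES = {
    "conv3x3": lambda c: nn.Conv2d(c, c, 3, padding=1, bias=False),
    "conv5x5": lambda c: nn.Conv2d(c, c, 5, padding=2, bias=False),
    "skip":    lambda c: nn.Identity(),
    "maxpool": lambda c: nn.MaxPool2d(3, stride=1, padding=1),
}

class MixedOp(nn.Module):
    """One edge of the super-network: a softmax-weighted sum of all candidates."""
    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([build(channels) for build in CANDIDATES.values()])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        # Every candidate is evaluated with the shared super-network weights;
        # the discrete architecture is only derived afterwards (e.g., argmax of
        # alpha), and the mismatch between the two is one way to picture the
        # optimization gap the survey discusses.
        return sum(w * op(x) for w, op in zip(weights, self.ops))

if __name__ == "__main__":
    edge = MixedOp(channels=16)
    x = torch.randn(2, 16, 32, 32)
    print(edge(x).shape)            # torch.Size([2, 16, 32, 32])
    print(edge.alpha.softmax(0))    # current operation mixture
```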