Weight-sharing neural architecture search: A battle to shrink the optimization gap

L Xie, X Chen, K Bi, L Wei, Y Xu, L Wang… - ACM Computing …, 2021 - dl.acm.org
Neural architecture search (NAS) has attracted increasing attention. In recent years,
individual search methods have been replaced by weight-sharing search methods for higher …
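To make the weight-sharing idea concrete, here is a minimal sketch in the single-path one-shot style: all candidate operations live in one supernet, and each sampled sub-network reuses those shared weights instead of being trained individually. The toy search space, module names, and uniform sampling scheme are illustrative assumptions, not the setup of the surveyed methods.

```python
import random
import torch
import torch.nn as nn

class MixedLayer(nn.Module):
    """Holds several candidate ops; a sampled subnet picks exactly one."""
    def __init__(self, channels):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),
        ])

    def forward(self, x, choice):
        return self.candidates[choice](x)

class SuperNet(nn.Module):
    def __init__(self, channels=16, depth=4):
        super().__init__()
        self.layers = nn.ModuleList([MixedLayer(channels) for _ in range(depth)])

    def forward(self, x, arch):
        for layer, choice in zip(self.layers, arch):
            x = layer(x, choice)
        return x

supernet = SuperNet()
x = torch.randn(2, 16, 8, 8)
# Each training step samples a random architecture; its ops reuse the shared
# supernet weights, so gradients from any sampled subnet update the supernet.
arch = [random.randrange(len(layer.candidates)) for layer in supernet.layers]
out = supernet(x, arch)
```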

Angle-based search space shrinking for neural architecture search

Y Hu, Y Liang, Z Guo, R Wan, X Zhang, Y Wei… - Computer Vision–ECCV …, 2020 - Springer
In this work, we present a simple and general search space shrinking method, called Angle-
Based search space Shrinking (ABS), for Neural Architecture Search (NAS). Our approach …
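A rough sketch of the angle metric as I read it: a candidate is scored by the angle between its flattened weight vector at supernet initialization and after supernet training, and the lowest-scoring candidates are dropped to shrink the search space. The helper names and the keep ratio below are assumptions for illustration, not the exact procedure of the paper.

```python
import torch
import torch.nn.functional as F

def weight_vector(state_dict):
    """Flatten and concatenate all parameters into a single vector."""
    return torch.cat([p.flatten().float() for p in state_dict.values()])

def angle_score(init_state, trained_state):
    """Angle (radians) between the initial and trained weight vectors."""
    v0, v1 = weight_vector(init_state), weight_vector(trained_state)
    cos = F.cosine_similarity(v0, v1, dim=0)
    return torch.acos(cos.clamp(-1.0, 1.0)).item()

def shrink(candidates, init_states, trained_states, keep_ratio=0.5):
    """Keep the candidates whose weights moved the most (largest angle)."""
    ranked = sorted(
        candidates,
        key=lambda c: angle_score(init_states[c], trained_states[c]),
        reverse=True,
    )
    return ranked[: max(1, int(len(ranked) * keep_ratio))]
```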

Optimizing neural networks through activation function discovery and automatic weight initialization

G Bingham - arXiv preprint arXiv:2304.03374, 2023 - arxiv.org
Automated machine learning (AutoML) methods improve upon existing models by
optimizing various aspects of their design. While present methods focus on hyperparameters …
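As a loose illustration of what activation function discovery can look like, the sketch below assembles candidate activations from a small pool of unary primitives and keeps the one with the best user-supplied validation score. The primitive pool, the Swish-like composition template, and the random-search loop are assumptions for illustration; they do not reproduce the paper's evolutionary procedure or its weight-initialization component.

```python
import random
import torch

# Small pool of unary primitives; candidates compose two of them
# as x * outer(inner(x)), a template chosen purely for illustration.
UNARY = {
    "identity": lambda x: x,
    "tanh": torch.tanh,
    "sigmoid": torch.sigmoid,
    "relu": torch.relu,
}

def make_activation(outer, inner):
    return lambda x: x * UNARY[outer](UNARY[inner](x))

def random_search(evaluate, trials=20, seed=0):
    """evaluate(fn) -> validation score, higher is better (assumed interface)."""
    rng = random.Random(seed)
    best_fn, best_score, best_desc = None, float("-inf"), None
    for _ in range(trials):
        outer, inner = rng.choice(list(UNARY)), rng.choice(list(UNARY))
        fn = make_activation(outer, inner)
        score = evaluate(fn)
        if score > best_score:
            best_fn, best_score, best_desc = fn, score, f"x * {outer}({inner}(x))"
    return best_fn, best_desc
```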

M²NAS: Joint Neural Architecture Optimization System With Network Transmission

L Wang, L Xie, K Bi, K Zhao, J Guo… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Differentiable neural architecture search (NAS) methods have achieved competitive results with low search costs and high performance. Existing differentiable methods focus on …
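For context, the core differentiable-NAS relaxation (in the DARTS style) replaces a discrete operation choice with a softmax-weighted sum of candidate operations, so the architecture parameters receive gradients alongside the network weights. The sketch below shows only that relaxation; the toy operation set is an assumption and it is not the M²NAS system or its network-transmission component.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DifferentiableMixedOp(nn.Module):
    """One edge of the search graph: a softmax-weighted mix of candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.AvgPool2d(3, stride=1, padding=1),
        ])
        # One architecture parameter per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

mixed = DifferentiableMixedOp(channels=8)
x = torch.randn(1, 8, 16, 16)
loss = mixed(x).mean()
loss.backward()  # gradients flow to both the conv weights and alpha
```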