Weight-sharing neural architecture search: A battle to shrink the optimization gap

L Xie, X Chen, K Bi, L Wei, Y Xu, L Wang… - ACM Computing …, 2021 - dl.acm.org
Neural architecture search (NAS) has attracted increasing attention. In recent years,
individual search methods have been replaced by weight-sharing search methods for higher …

Deep learning for mining protein data

Q Shi, W Chen, S Huang, Y Wang… - Briefings in …, 2021 - academic.oup.com
The recent emergence of deep learning to characterize complex patterns of protein big data
reveals its potential to address the classic challenges in the field of protein data mining …

NAS evaluation is frustratingly hard

A Yang, PM Esperança, FM Carlucci - arXiv preprint arXiv:1912.12522, 2019 - arxiv.org
Neural Architecture Search (NAS) is an exciting new field which promises to be as much of a
game-changer as Convolutional Neural Networks were in 2012. Despite many great works …

BNAS: Efficient neural architecture search using broad scalable architecture

Z Ding, Y Chen, N Li, D Zhao, Z Sun… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Efficient neural architecture search (ENAS) achieves novel efficiency for learning
high-performance architectures via parameter sharing and reinforcement learning (RL) …

Neural architecture generator optimization

R Ru, P Esperanca, FM Carlucci - Advances in Neural …, 2020 - proceedings.neurips.cc
Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art
performance through the discovery of new architecture patterns, without human intervention …

Efficient guided evolution for neural architecture search

V Lopes, M Santos, B Degardin… - Proceedings of the …, 2022 - dl.acm.org
Neural Architecture Search methods have been successfully applied to image tasks with
excellent results. However, NAS methods are often complex and tend to quickly converge for …

Are neural architecture search benchmarks well designed? A deeper look into operation importance

V Lopes, B Degardin, LA Alexandre - Information Sciences, 2023 - Elsevier
Neural Architecture Search (NAS) benchmarks significantly improved the capability
of developing and comparing NAS methods while at the same time drastically reducing the …

Data-Free Quantization via Pseudo-label Filtering

C Fan, Z Wang, D Guo, M Wang - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Quantization for model compression can efficiently reduce the network complexity and
storage requirements, but the original training data is necessary to remedy the performance …

Direct federated neural architecture search

A Garg, AK Saha, D Dutta - arXiv preprint arXiv:2010.06223, 2020 - academia.edu
Neural Architecture Search (NAS) is a collection of methods to craft the way neural
networks are built. We apply this idea to Federated Learning (FL), wherein predefined …

Toward Less Constrained Macro-Neural Architecture Search

V Lopes, LA Alexandre - IEEE Transactions on Neural …, 2023 - ieeexplore.ieee.org
Networks found with neural architecture search (NAS) achieve state-of-the-art
performance in a variety of tasks, outperforming human-designed networks. However, most …