X Luo, D Liu, H Kong, S Huai, H Chen… - ACM Transactions on …, 2024 - dl.acm.org
Deep neural networks (DNNs) have recently achieved impressive success across a wide range of real-world vision and language processing tasks, ranging from image …
H Shi, S Ren, T Zhang, SJ Pan - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
In this paper, we propose a novel progressive parameter-sharing strategy (MPPS) for effectively training multitask learning models on diverse computer vision tasks …
Based on the weight-sharing mechanism, one-shot NAS methods train a supernet and then inherit the pre-trained weights to evaluate sub-models, largely reducing the search cost …
Neural parameter allocation search (NPAS) automates parameter sharing by obtaining weights for a network given an arbitrary, fixed parameter budget. Prior work has two major …
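The fixed-budget sharing described in the NPAS snippet above can be sketched as follows. This is an illustrative toy, not the paper's method: the wrap-around allocation rule, the `bank`/`allocate` names, and the sizes are all invented. The point it shows is that many "virtual" layer weights can be backed by one shared bank whose size is capped at an arbitrary budget.

```python
import random

# Hypothetical sketch of a fixed parameter budget shared across layers:
# every layer draws its weights from one shared bank, so the real parameter
# count stays at `BUDGET` no matter how many layers the network has.

BUDGET = 10  # fixed number of real parameters

rng = random.Random(0)
bank = [rng.uniform(-1.0, 1.0) for _ in range(BUDGET)]

def allocate(layer_sizes, budget):
    # Map each layer to indices into the shared bank; indices wrap around,
    # so layers reuse slots once the budget is exhausted.
    allocations, offset = [], 0
    for size in layer_sizes:
        allocations.append([(offset + j) % budget for j in range(size)])
        offset += size
    return allocations

layers = allocate([6, 8, 4], BUDGET)
virtual = sum(len(idxs) for idxs in layers)
print(f"{virtual} virtual weights backed by {len(bank)} real parameters")
```

With a budget of 10, the three layers above request 18 weights in total, so some bank slots are reused across layers; a real allocation search would learn *which* slots to share rather than wrapping round-robin.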
One-shot neural architecture search (NAS) substantially improves the search efficiency by training one supernet to estimate the performance of every possible child architecture (i.e., …
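The weight-sharing mechanism that the two one-shot NAS snippets above describe can be sketched in a few lines. This is a toy, not any cited paper's algorithm: the `Supernet` class, the candidate op names, and the scalar "score" standing in for accuracy are all assumptions for illustration. It shows the key idea that child architectures inherit supernet weights and are ranked without per-child training.

```python
import random

# Minimal sketch of one-shot weight sharing: the supernet holds one shared
# weight per candidate op per layer, and every sampled child architecture
# inherits those weights instead of being trained from scratch.

class Supernet:
    def __init__(self, num_layers, candidate_ops, rng):
        self.candidate_ops = candidate_ops
        # shared_weights[layer][op] is trained once, reused by all children
        self.shared_weights = [
            {op: rng.uniform(-1.0, 1.0) for op in candidate_ops}
            for _ in range(num_layers)
        ]

    def sample_child(self, rng):
        # A child architecture picks one candidate op per layer.
        return [rng.choice(self.candidate_ops) for _ in self.shared_weights]

    def evaluate_child(self, child):
        # Stand-in score: children are ranked using inherited weights only,
        # which is what removes the per-child training cost.
        return sum(self.shared_weights[i][op] for i, op in enumerate(child))

rng = random.Random(0)
net = Supernet(num_layers=4, candidate_ops=["conv3x3", "conv5x5", "skip"], rng=rng)
children = [net.sample_child(rng) for _ in range(8)]
best = max(children, key=net.evaluate_child)
print(best)
```

In a real system the evaluation step runs the child on validation data with the inherited weights; the search cost then scales with the number of evaluations, not the number of trainings.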
Neural architecture search tries to shift the manual design of neural network (NN) architectures to algorithmic design. In this setting, the NN architecture itself can be viewed …
G Yuan, B Xue, M Zhang - Proceedings of the Genetic and Evolutionary …, 2023 - dl.acm.org
Neural architecture search (NAS) is becoming increasingly popular for its ability to automatically search for an appropriate network architecture, avoiding laborious manual …
X Ning, Y Zheng, Z Zhou, T Zhao… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Neural architecture search (NAS) can automatically discover well-performing architectures in a large search space and has been shown to bring improvements to various applications …
While Chain of Thought (CoT) prompting approaches have significantly strengthened the reasoning capabilities of large language models (LLMs), they still face limitations that …