Authors
Thanh Phung Truong, The-Vi Nguyen, Wonjong Noh, Sungrae Cho
Publication date
2021/3/9
Journal
IEEE Internet of Things Journal
Volume
8
Issue
17
Pages
13196-13208
Publisher
IEEE
Description
Mobile-edge computing (MEC) and nonorthogonal multiple access (NOMA) have been regarded as promising technologies for beyond fifth-generation (B5G) and sixth-generation (6G) networks. This study aims to reduce the computational overhead (weighted sum of consumed energy and latency) in a NOMA-assisted MEC network by jointly optimizing the computation offloading policy and channel resource allocation under dynamic network environments with time-varying channels. To this end, we propose a deep reinforcement learning algorithm named ACDQN that utilizes the advantages of both actor-critic and deep Q-network methods and provides low complexity. The proposed algorithm considers partial computation offloading, where users can split computation tasks so that some are performed on the local terminal while some are offloaded to the MEC server. It also considers a hybrid multiple access …
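The overhead metric is described above as a weighted sum of consumed energy and latency under partial computation offloading. Below is a minimal Python sketch of such an objective; the function name `weighted_overhead`, all numeric constants, and the parallel local/offload latency model are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def weighted_overhead(task_bits, offload_ratio,
                      local_cycles_per_bit=1000, local_freq_hz=1e9,
                      kappa=1e-27,              # effective switched-capacitance constant (assumed)
                      uplink_rate_bps=5e6, tx_power_w=0.2,
                      edge_cycles_per_bit=1000, edge_freq_hz=10e9,
                      w_energy=0.5, w_latency=0.5):
    """Weighted sum of energy and latency for partial offloading.

    `offload_ratio` is the fraction of the task offloaded to the MEC server;
    the remainder is computed locally. All defaults are illustrative.
    """
    local_bits = (1.0 - offload_ratio) * task_bits
    off_bits = offload_ratio * task_bits

    # Local execution: latency = cycles / frequency, energy = kappa * cycles * f^2
    local_cycles = local_bits * local_cycles_per_bit
    t_local = local_cycles / local_freq_hz
    e_local = kappa * local_cycles * local_freq_hz ** 2

    # Offloading: uplink transmission followed by edge execution (result download ignored)
    t_tx = off_bits / uplink_rate_bps
    e_tx = tx_power_w * t_tx
    t_edge = off_bits * edge_cycles_per_bit / edge_freq_hz

    # Local and offloaded parts run in parallel, so latency is the slower branch
    latency = max(t_local, t_tx + t_edge)
    energy = e_local + e_tx          # user-side energy only
    return w_energy * energy + w_latency * latency


# Example: sweep the offloading ratio for a 1-Mbit task
ratios = np.linspace(0.0, 1.0, 11)
costs = [weighted_overhead(1e6, r) for r in ratios]
print("best offload ratio:", ratios[int(np.argmin(costs))])
```

In the paper this objective is minimized jointly over the offloading policy and channel resource allocation by the proposed ACDQN agent under time-varying channels; the sketch only illustrates how a single user's overhead could be evaluated for a given offloading decision.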
Total citations