Authors
Hyelin Nam, Jihong Park, Seong-Lyun Kim
Publication date
2023/5/28
Conference
2023 IEEE International Conference on Communications Workshops (ICC Workshops)
Pages
825-830
Publisher
IEEE
Description
In split learning (SL), a promising edge learning framework, edge devices and a server split a neural network and jointly train it by exchanging smashed data and the corresponding gradients. To alleviate the frequent communication cost, we propose a new SL framework that includes a tiny server. The tiny server acts as the output layers of the devices' model, determining whether smashed data is worth transmitting to the server for training. Our spatio-temporal distillation method enables the tiny server to make this evaluation in line with the server's view. The proposed SL framework improves performance while reducing the overall communication cost by 50%.
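The gating idea described above, a tiny server-side head that decides whether smashed activations are worth uploading, can be sketched minimally as follows. This is an illustrative sketch only: `tiny_server_score`, `select_for_upload`, the linear output weights, and the confidence threshold are all hypothetical names and choices, not the paper's actual architecture or its spatio-temporal distillation procedure.

```python
import math

def tiny_server_score(smashed, weights):
    # Hypothetical tiny output layer: a linear map over the smashed
    # activations followed by softmax; the score is the confidence
    # (max class probability) of the tiny server's prediction.
    logits = [sum(w * x for w, x in zip(row, smashed)) for row in weights]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return max(e / z for e in exps)

def select_for_upload(batch, weights, threshold=0.9):
    # Keep only the smashed data the tiny server is unsure about;
    # confident samples are trained locally and never transmitted,
    # which is where the communication saving would come from.
    return [s for s in batch if tiny_server_score(s, weights) < threshold]

# Example: a 2-class toy head. The first sample yields a high-confidence
# prediction and is filtered out; the second is ambiguous and uploaded.
weights = [[1.0, 0.0], [0.0, 1.0]]
uploaded = select_for_upload([[5.0, 0.0], [0.1, 0.0]], weights)
print(uploaded)  # → [[0.1, 0.0]]
```

The threshold trades communication cost against how much of the training signal reaches the full server model; the paper's distillation step (not modeled here) is what aligns the tiny server's judgment with the server's.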