Chen-Yu Ho
ByteDance Inc.
Verified email at bytedance.com - Homepage
Title    Cited by    Year
Scaling Distributed Machine Learning with In-Network Aggregation
A Sapio, M Canini, CY Ho, J Nelson, P Kalnis, C Kim, A Krishnamurthy, ...
Proceedings of the 18th USENIX Symposium on Networked Systems Design and …, 2021
417    2021
Natural compression for distributed deep learning
S Horváth, CY Ho, L Horvath, AN Sahu, M Canini, P Richtárik
Mathematical and Scientific Machine Learning, 129-141, 2022
161    2022
GRACE: A compressed communication framework for distributed machine learning
H Xu, CY Ho, AM Abdelmoniem, A Dutta, EH Bergou, K Karatsenidis, ...
2021 IEEE 41st international conference on distributed computing systems …, 2021
158*    2021
On the discrepancy between the theoretical analysis and practical implementations of compressed communication for distributed deep learning
A Dutta, EH Bergou, AM Abdelmoniem, CY Ho, AN Sahu, M Canini, ...
Proceedings of the AAAI Conference on Artificial Intelligence 34 (04), 3817-3824, 2020
94    2020
Efficient sparse collective communication and its application to accelerate distributed deep learning
J Fei, CY Ho, AN Sahu, M Canini, A Sapio
Proceedings of the 2021 ACM SIGCOMM 2021 Conference, 676-691, 2021
83    2021
A Comprehensive Empirical Study of Heterogeneity in Federated Learning
AM Abdelmoniem, CY Ho, P Papageorgiou, M Canini
IEEE Internet of Things Journal, 2023
64*    2023
Tackling the Communication Bottlenecks of Distributed Deep Learning Training Workloads
CY Ho
2023
Articles 1–7