Distillation-Based Semi-Supervised Federated Learning for Communication-Efficient Collaborative Training with Non-IID Private Data

S Itahara, T Nishio, Y Koda, M Morikura… - IEEE Transactions …, 2021 - ieeexplore.ieee.org
This study develops a federated learning (FL) framework that overcomes the large communication
costs incurred by model-size-dependent exchange in typical frameworks without compromising model …
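The core communication-efficiency idea behind distillation-based FL is that clients exchange per-sample model outputs (soft labels) on a shared open dataset instead of full model parameters, so upload size scales with the shared dataset and class count rather than with model size. The toy softmax "model outputs" and function names below are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch of the federated-distillation idea: clients upload
# logits on a shared dataset; the server averages the resulting
# probabilities into global soft labels that clients then distill from.
# (Illustrative only -- not the paper's exact algorithm.)
from math import exp
from typing import List

def softmax(logits: List[float]) -> List[float]:
    """Numerically stable softmax over one logit vector."""
    m = max(logits)
    exps = [exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def aggregate_soft_labels(client_logits: List[List[List[float]]]) -> List[List[float]]:
    """Average clients' softmax outputs per shared sample.

    client_logits[k][i] is client k's logit vector for shared sample i.
    Returns one averaged probability vector (global soft label) per sample.
    """
    n_clients = len(client_logits)
    n_samples = len(client_logits[0])
    targets = []
    for i in range(n_samples):
        probs = [softmax(client_logits[k][i]) for k in range(n_clients)]
        n_classes = len(probs[0])
        targets.append(
            [sum(p[c] for p in probs) / n_clients for c in range(n_classes)]
        )
    return targets

# Illustrative cost comparison: parameter exchange scales with model
# size; logit exchange scales with |shared data| * number of classes.
model_params = 1_000_000            # floats per round, parameter exchange
shared_samples, classes = 1_000, 10
print(model_params)                 # → 1000000
print(shared_samples * classes)     # → 10000
```

With these (made-up) sizes, logit exchange uploads two orders of magnitude fewer values per round than parameter exchange, which is the communication saving the abstract refers to.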

Other indexed versions of the same work:
- arXiv preprint, 2020 - arxiv.org / ui.adsabs.harvard.edu
- IEEE Transactions on Mobile Computing, 2023 - computer.org / repository.kulib.kyoto-u.ac.jp / cir.nii.ac.jp