Blockchain-enabled federated learning: A survey

Y Qu, MP Uddin, C Gan, Y Xiang, L Gao, et al. - ACM Computing Surveys (CSUR), 2022 - dl.acm.org
Federated learning (FL) has grown rapidly in recent years, driven jointly by advances in
machine learning and artificial intelligence along with emerging privacy …

Demystifying parallel and distributed deep learning: An in-depth concurrency analysis

T Ben-Nun, T Hoefler - ACM Computing Surveys (CSUR), 2019 - dl.acm.org
Deep Neural Networks (DNNs) are becoming an important tool in modern computing
applications. Accelerating their training is a major challenge, and techniques range from …

Federated learning with hierarchical clustering of local updates to improve training on non-IID data

C Briggs, Z Fan, P Andras - 2020 International Joint Conference on Neural Networks (IJCNN), 2020 - ieeexplore.ieee.org
Federated learning (FL) is a well-established method for performing machine learning tasks
over massively distributed data. However, in settings where data is distributed in a non-IID …
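
The grouping idea lends itself to a short illustration: cluster clients by the similarity of their flattened model updates, then train or aggregate per cluster. Below is a minimal sketch using agglomerative (Ward) clustering from SciPy; the toy update vectors, distance threshold, and two-group structure are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Toy stand-ins for flattened local model updates from 8 clients:
# two groups of clients drawn around different "data distributions".
updates = np.vstack([
    rng.normal(loc=0.0, scale=0.1, size=(4, 16)),  # group A
    rng.normal(loc=1.0, scale=0.1, size=(4, 16)),  # group B
])

# Agglomerative (hierarchical) clustering on the update vectors.
Z = linkage(updates, method="ward")

# Cut the dendrogram at an illustrative distance threshold to form clusters;
# each resulting cluster can then be trained and aggregated separately.
labels = fcluster(Z, t=2.0, criterion="distance")
print(labels)  # e.g. [1 1 1 1 2 2 2 2]
```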

Parallel restarted SGD with faster convergence and less communication: Demystifying why model averaging works for deep learning

H Yu, S Yang, S Zhu - Proceedings of the AAAI Conference on Artificial Intelligence, 2019 - ojs.aaai.org
In distributed training of deep neural networks, parallel mini-batch SGD is widely used to
speed up the training process. It uses multiple workers to sample …
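
The scheme the title alludes to, often called local SGD, is easy to sketch: each worker takes several independent SGD steps, after which the models are averaged ("restarted"). A minimal numpy sketch on a toy quadratic; the objective, step size, noise level, and averaging period are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared toy least-squares objective; workers see noisy gradients,
# standing in for stochastic mini-batch gradients on local data.
A = np.diag([1.0, 2.0])
b = np.array([1.0, -1.0])

def grad(w):
    return A @ w - b  # gradient of 0.5 * w^T A w - b^T w

W = rng.normal(size=(4, 2))    # 4 workers, each holding its own copy of w
lr, local_steps = 0.1, 5       # averaging period I = 5 (illustrative)

for _ in range(20):            # 20 communication rounds
    for _ in range(local_steps):               # independent local SGD steps
        noise = rng.normal(scale=0.1, size=W.shape)
        W -= lr * (np.apply_along_axis(grad, 1, W) + noise)
    W[:] = W.mean(axis=0)      # "restart": replace all models by their average

print(W[0])                    # close to the minimizer A^{-1} b = [1, -0.5]
```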

Aishell-1: An open-source mandarin speech corpus and a speech recognition baseline

H Bu, J Du, X Na, B Wu, H Zheng - Oriental COCOSDA, 2017 - ieeexplore.ieee.org
An open-source Mandarin speech corpus called AISHELL-1 is released. It is by far the
largest corpus suitable for conducting speech recognition research and building …

Deep neural network embeddings for text-independent speaker verification

D Snyder, D Garcia-Romero, D Povey, S Khudanpur - Interspeech, 2017 - isca-archive.org
This paper investigates replacing i-vectors for text-independent speaker verification with
embeddings extracted from a feedforward deep neural network. Long-term speaker …
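
The general recipe described here, frame-level layers followed by pooling over time and segment-level embedding layers, can be sketched compactly. Below is a minimal PyTorch sketch with statistics (mean and standard deviation) pooling; the layer sizes, dimensions, and classification head are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class SpeakerEmbeddingNet(nn.Module):
    """Frame-level layers -> statistics pooling -> segment-level embedding.
    All sizes are illustrative, not the paper's exact configuration."""
    def __init__(self, feat_dim=24, embed_dim=128, num_speakers=100):
        super().__init__()
        # Frame-level layers (1-D convs act like a small TDNN over time).
        self.frame = nn.Sequential(
            nn.Conv1d(feat_dim, 256, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(256, 256, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Segment-level layers after pooling mean and std over time.
        self.embed = nn.Linear(2 * 256, embed_dim)
        self.classify = nn.Linear(embed_dim, num_speakers)  # train-time head

    def forward(self, x):                  # x: (batch, feat_dim, frames)
        h = self.frame(x)                  # (batch, 256, frames)
        stats = torch.cat([h.mean(dim=2), h.std(dim=2)], dim=1)
        e = self.embed(stats)              # fixed-length speaker embedding
        return e, self.classify(torch.relu(e))

net = SpeakerEmbeddingNet()
emb, logits = net(torch.randn(8, 24, 300))  # 8 utterances, 300 frames each
print(emb.shape, logits.shape)              # (8, 128) and (8, 100)
```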

Communication-efficient learning of deep networks from decentralized data

B McMahan, E Moore, D Ramage, et al. - Artificial Intelligence and Statistics (AISTATS), 2017 - proceedings.mlr.press
Modern mobile devices have access to a wealth of data suitable for learning models, which
in turn can greatly improve the user experience on the device. For example, language …
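
The aggregation step at the heart of this paper's FedAvg algorithm is a data-size-weighted average of locally trained models. A minimal numpy sketch of that server-side step; the client parameter vectors and dataset sizes are toy values.

```python
import numpy as np

# Locally trained parameter vectors w_k and local dataset sizes n_k
# (toy values; in FedAvg each w_k results from several local SGD epochs).
client_weights = [np.array([0.9, 1.1]), np.array([1.2, 0.8]), np.array([1.0, 1.0])]
client_sizes = np.array([100, 300, 600])

# Server update: w <- sum_k (n_k / n) * w_k
coeffs = client_sizes / client_sizes.sum()
global_weights = sum(c * w for c, w in zip(coeffs, client_weights))
print(global_weights)  # [1.05 0.95]
```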

On the convergence of local descent methods in federated learning

F Haddadpour, M Mahdavi - arXiv preprint arXiv:1910.14425, 2019 - arxiv.org
In federated distributed learning, the goal is to optimize a global training objective defined
over distributed devices, where the data shard at each device is sampled from a possibly …
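
The global objective referred to here is conventionally written as a weighted sum of per-device losses; the notation below follows that common convention rather than the paper verbatim.

```latex
\min_{w \in \mathbb{R}^d} \; f(w) = \sum_{i=1}^{m} p_i F_i(w),
\qquad
F_i(w) = \mathbb{E}_{\xi \sim \mathcal{D}_i}\!\left[ \ell(w; \xi) \right]
```

Here p_i >= 0 with sum_i p_i = 1 (often p_i = n_i / n), and D_i is device i's local data distribution, which may differ across devices in the non-IID case.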

Batch normalization: Accelerating deep network training by reducing internal covariate shift

S Ioffe, C Szegedy - International conference on machine …, 2015 - proceedings.mlr.press
Training deep neural networks is complicated by the fact that the distribution of
each layer's inputs changes during training, as the parameters of the previous layers …
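
The transform the abstract refers to standardizes each layer input over the mini-batch and then restores representational capacity with learned scale and shift parameters. A minimal numpy sketch of the forward pass; epsilon and the toy batch are illustrative.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch-normalize activations x of shape (batch, features)."""
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero-mean, unit-variance inputs
    return gamma * x_hat + beta            # learned scale and shift

x = np.random.default_rng(0).normal(loc=5.0, scale=3.0, size=(32, 4))
y = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(y.mean(axis=0).round(6), y.std(axis=0).round(3))  # ~0 and ~1
```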

Federated learning for keyword spotting

D Leroy, A Coucke, T Lavril, et al. - ICASSP, 2019 - ieeexplore.ieee.org
We propose a practical approach based on federated learning to solve out-of-domain issues
with continuously running embedded speech-based models such as wake word detectors …