Distributed machine learning (ML) has been extensively studied to meet the explosive growth of training data. A wide range of machine learning models are trained by a family of …
In this paper we tackle the challenge of making the stochastic coordinate descent algorithm differentially private. Compared to the classical gradient descent algorithm where updates …
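The snippet is cut off, so the paper's actual privacy mechanism is not shown here. The sketch below only illustrates the general idea under one common assumption: each exact coordinate step is clipped to bound sensitivity and then perturbed with Gaussian noise. The function name dp_scd, the ridge-regression setting, and the clip/sigma parameters are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def dp_scd(X, y, lam=0.1, epochs=20, clip=1.0, sigma=0.1, seed=0):
    # Hypothetical sketch of differentially private stochastic coordinate
    # descent on ridge regression: each exact coordinate step is clipped
    # (bounding its sensitivity) and perturbed with Gaussian noise.
    # Calibrating (clip, sigma) to a formal (epsilon, delta) budget is omitted.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    r = y.astype(float).copy()        # residual y - X @ w (w starts at 0)
    col_sq = (X ** 2).sum(axis=0)     # per-coordinate curvature, precomputed
    for _ in range(epochs):
        for j in rng.permutation(d):
            grad_j = -X[:, j] @ r + lam * w[j]
            step = -grad_j / (col_sq[j] + lam)        # exact 1-D minimizer
            step = float(np.clip(step, -clip, clip))  # bound per-update sensitivity
            step += sigma * rng.standard_normal()     # noise for privacy
            w[j] += step
            r -= X[:, j] * step       # keep residual consistent with w
    return w
```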
We propose a generic algorithmic building block to accelerate training of machine learning models on heterogeneous compute systems. Our scheme allows us to efficiently employ …
In this work we propose an asynchronous, GPU-based implementation of the widely-used stochastic coordinate descent algorithm for convex optimization. We define the class of …
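The GPU implementation itself is not included in the snippet; the toy sketch below only illustrates the lock-free access pattern of asynchronous coordinate descent, using CPU threads in place of GPU threads. The name async_scd, the ridge-regression objective, and all parameters are assumptions for illustration.

```python
import numpy as np
import threading

def async_scd(X, y, lam=0.1, updates_per_worker=2000, n_workers=4, seed=0):
    # Hypothetical CPU-thread sketch of asynchronous (Hogwild-style)
    # coordinate descent on ridge regression: workers pick random
    # coordinates and update a shared weight vector without locks,
    # tolerating stale reads. The paper's GPU kernels are not reproduced.
    n, d = X.shape
    w = np.zeros(d)                   # shared state, updated lock-free
    col_sq = (X ** 2).sum(axis=0)

    def worker(wid):
        rng = np.random.default_rng(seed + wid)
        for _ in range(updates_per_worker):
            j = int(rng.integers(d))
            r = y - X @ w             # possibly stale snapshot of w
            grad_j = -X[:, j] @ r + lam * w[j]
            w[j] -= grad_j / (col_sq[j] + lam)

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return w
```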
Y Ma, F Rusu, M Torres - arXiv preprint arXiv:1802.08800, 2018 - arxiv.org
There is an increased interest in building data analytics frameworks with advanced algebraic capabilities both in industry and academia. Many of these frameworks, e.g. …
H Bal, A Pal - Future Generation Computer Systems, 2020 - Elsevier
This editorial is for the Special Issue of the journal Future Generation Computer Systems, consisting of selected papers from the 6th International Workshop on Parallel and …
In this paper we propose a novel parallel stochastic coordinate descent (SCD) algorithm with convergence guarantees that exhibits strong scalability. We start by studying a state-of …
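Since the snippet is truncated, the paper's actual algorithm and its convergence analysis are not shown; the sketch below only illustrates one bulk-synchronous block-parallel pattern that such methods build on. The name parallel_scd, the ridge-regression setting, and all parameters are hypothetical.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def parallel_scd(X, y, lam=0.1, rounds=50, n_blocks=4, seed=0):
    # Hypothetical bulk-synchronous sketch: each round, workers compute
    # exact 1-D steps for disjoint coordinate blocks from the same
    # residual snapshot, then all steps are applied together. The
    # paper's actual scheme and analysis are not reproduced here.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)

    def block_steps(block, r):
        # Steps are computed against the stale snapshot r, not the
        # partially updated w of other blocks in the same round.
        return [(j, (X[:, j] @ r - lam * w[j]) / (col_sq[j] + lam)) for j in block]

    with ThreadPoolExecutor(max_workers=n_blocks) as pool:
        for _ in range(rounds):
            r = y - X @ w
            blocks = np.array_split(rng.permutation(d), n_blocks)
            results = list(pool.map(lambda b: block_steps(b, r), blocks))
            for steps in results:
                for j, step in steps:
                    w[j] += step
    return w
```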
The ever-growing number of edge devices (e.g., smartphones) and the exploding volume of sensitive data they produce call for distributed machine learning techniques that are privacy …