Y Lu, X Huang, K Zhang, S Maharjan… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
In the Internet of Vehicles (IoV), data sharing among vehicles for collaborative analysis can improve the driving experience and service quality. However, the bandwidth, security, and …
We study the asynchronous stochastic gradient descent algorithm for distributed training over $n$ workers that may be heterogeneous. In this algorithm, workers compute …
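As a rough illustration of the setting this abstract describes, here is a minimal Python sketch of asynchronous SGD in which gradients from heterogeneous workers arrive with arbitrary staleness. The quadratic objective, delay model, and all names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal simulation of asynchronous SGD on a least-squares objective.
# Each (simulated) worker computes its gradient at a stale copy of the
# parameters; the server applies updates as they arrive, unsynchronized.

rng = np.random.default_rng(0)
d, n_workers, lr, steps = 5, 4, 0.05, 200
A = rng.normal(size=(50, d))
b = rng.normal(size=50)

def grad(x):
    """Gradient of 0.5 * ||Ax - b||^2."""
    return A.T @ (A @ x - b)

x = np.zeros(d)
history = [x.copy()]                 # past iterates, indexed by step
for t in range(steps):
    # A slow worker returns a gradient computed tau steps ago.
    tau = rng.integers(0, min(t, n_workers) + 1)
    stale_x = history[t - tau]
    x = x - lr * grad(stale_x)       # server update with the stale gradient
    history.append(x.copy())

print("final residual:", np.linalg.norm(A @ x - b))
```

Despite the staleness, the iterates still converge here because the delays are bounded; that bounded-delay assumption is what convergence analyses of asynchronous SGD typically rely on.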
Most commonly used distributed machine learning systems are either synchronous or centralized asynchronous. Synchronous algorithms like AllReduce-SGD perform poorly in a …
We analyze (stochastic) gradient descent (SGD) with delayed updates on smooth quasi-convex and non-convex functions and derive concise, non-asymptotic, convergence rates …
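The delayed-update rule analyzed in work of this kind typically takes the form below, where the step-$t$ update uses a stochastic gradient evaluated at an iterate from $\tau_t$ steps earlier; this is a generic reconstruction of the setting, not the paper's exact statement.

```latex
% Delayed (stochastic) gradient descent with bounded staleness:
\[
x_{t+1} = x_t - \eta \,\nabla f\bigl(x_{t-\tau_t};\, \xi_t\bigr),
\qquad 0 \le \tau_t \le \tau_{\max},
\]
% where \eta is the step size, \xi_t the sampled data point, and
% \tau_{\max} a uniform bound on the delay.
```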
Asynchronous parallel implementations of stochastic gradient (SG) methods have been broadly used for training deep neural networks and have achieved many successes in practice recently …
S Yang, X Zhang, M Wang - Advances in neural information …, 2022 - proceedings.neurips.cc
Bilevel optimization has gained growing interest, with numerous applications in meta-learning, minimax games, reinforcement learning, and nested composition …
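The applications listed in this abstract instantiate the standard bilevel program sketched below: an outer objective evaluated at the solution of an inner minimization. This is the generic formulation, not the specific algorithm the paper proposes.

```latex
% Generic bilevel program: outer objective f evaluated at the solution
% y^*(x) of an inner problem parameterized by x.
\[
\min_{x \in \mathbb{R}^{d_x}} \; f\bigl(x, y^{*}(x)\bigr)
\quad \text{s.t.} \quad
y^{*}(x) \in \operatorname*{arg\,min}_{y \in \mathbb{R}^{d_y}} g(x, y).
\]
```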
G Jeong, HY Kim - Expert Systems with Applications, 2019 - Elsevier
We study trading systems that use reinforcement learning, with three newly proposed methods, to maximize total profit and reflect real financial market conditions while overcoming the …
X Gu, K Huang, J Zhang… - Advances in Neural …, 2021 - proceedings.neurips.cc
Federated learning (FL) coordinates numerous heterogeneous devices to collaboratively train a shared model while preserving user privacy. Despite its multiple …
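For the general FL loop this abstract refers to, here is a minimal FedAvg-style sketch with a random subset of available devices per round. The synthetic per-device objectives and all names are illustrative assumptions; this shows the generic federated averaging pattern, not the algorithm proposed in the cited paper.

```python
import numpy as np

# Minimal FedAvg-style loop: each round, only a subset of devices is
# available; each runs local SGD from the current global model, and the
# server averages the returned models. Synthetic quadratic objectives
# stand in for per-device data (illustrative only).

rng = np.random.default_rng(1)
d, n_devices, rounds, local_steps, lr = 5, 10, 30, 5, 0.1
targets = rng.normal(size=(n_devices, d))   # each device's local optimum

def local_update(w, target):
    """Run a few local SGD steps on 0.5 * ||w - target||^2."""
    for _ in range(local_steps):
        w = w - lr * (w - target)
    return w

w_global = np.zeros(d)
for r in range(rounds):
    # Heterogeneous availability: only some devices participate this round.
    available = rng.choice(n_devices, size=4, replace=False)
    updates = [local_update(w_global.copy(), targets[i]) for i in available]
    w_global = np.mean(updates, axis=0)      # server-side model averaging

print("distance to mean optimum:", np.linalg.norm(w_global - targets.mean(0)))
```

With uniformly random availability the averaged model drifts toward the mean of the device optima; biased or arbitrary unavailability is exactly the complication such papers set out to handle.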