Collecting and analyzing data from smart device users with local differential privacy

TT Nguyên, X Xiao, Y Yang, SC Hui, H Shin… - arXiv preprint arXiv …, 2016 - arxiv.org
Organizations with a large user base, such as Samsung and Google, can potentially benefit
from collecting and mining users' data. However, doing so raises privacy concerns, and risks …

Tempered sigmoid activations for deep learning with differential privacy

N Papernot, A Thakurta, S Song, S Chien… - Proceedings of the …, 2021 - ojs.aaai.org
Because learning sometimes involves sensitive data, machine learning algorithms have
been extended to offer differential privacy for training data. In practice, this has been mostly …

Differentially private high-dimensional data publication via sampling-based inference

R Chen, Q Xiao, Y Zhang, J Xu - Proceedings of the 21th ACM SIGKDD …, 2015 - dl.acm.org
Releasing high-dimensional data enables a wide spectrum of data mining tasks. Yet,
individual privacy has been a major obstacle to data sharing. In this paper, we consider the …

Efficient and secure outsourcing of differentially private data publishing with multiple evaluators

J Li, H Ye, T Li, W Wang, W Lou… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
As big data becomes a main impetus for the next generation of the IT industry, data privacy
has received considerable attention in recent years. To deal with the privacy challenges …

Correlated differential privacy: Feature selection in machine learning

T Zhang, T Zhu, P Xiong, H Huo, Z Tari… - IEEE Transactions on …, 2019 - ieeexplore.ieee.org
Preserving privacy in machine learning is a crucial issue in industrial informatics, since the data
used for training in industry usually contains sensitive information. Existing differentially …

Preserving differential privacy in deep neural networks with relevance-based adaptive noise imposition

M Gong, K Pan, Y Xie, AK Qin, Z Tang - Neural Networks, 2020 - Elsevier
In recent years, deep learning has achieved remarkable results in the field of artificial
intelligence. However, the training process of deep neural networks may cause the leakage …

Heavy hitter estimation over set-valued data with local differential privacy

Z Qin, Y Yang, T Yu, I Khalil, X Xiao, K Ren - Proceedings of the 2016 …, 2016 - dl.acm.org
In local differential privacy (LDP), each user perturbs her data locally before sending the
noisy data to a data collector. The latter then analyzes the data to obtain useful statistics …
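
For context, the local perturbation step this snippet describes can be illustrated with classic one-bit randomized response; the sketch below is an illustrative assumption, not the mechanism proposed in the paper, and the function names, epsilon parameter, and frequency estimator are hypothetical.

    import math
    import random

    def randomized_response(bit, epsilon):
        # Each user perturbs her own bit locally: report the true value with
        # probability e^eps / (e^eps + 1), otherwise report its flip.
        p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
        return bit if random.random() < p_truth else 1 - bit

    def estimate_frequency(noisy_reports, epsilon):
        # The data collector sees only the noisy reports and inverts the
        # perturbation for an unbiased estimate of the true fraction of 1s:
        # observed = p*f + (1-p)*(1-f)  =>  f = (observed + p - 1) / (2p - 1)
        p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
        observed = sum(noisy_reports) / len(noisy_reports)
        return (observed + p - 1.0) / (2.0 * p - 1.0)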

DPIS: An enhanced mechanism for differentially private SGD with importance sampling

J Wei, E Bao, X Xiao, Y Yang - Proceedings of the 2022 ACM SIGSAC …, 2022 - dl.acm.org
Nowadays, differential privacy (DP) has become a well-accepted standard for privacy
protection, and deep neural networks (DNNs) have been immensely successful in machine …