Z Huang, W Lu, C Hong, J Ding - 31st USENIX Security Symposium …, 2022 - usenix.org
Secure two-party neural network inference (2PC-NN) can offer privacy protection for both the client and the server and is a promising technique in the machine-learning-as-a-service …
Y Miao, Z Liu, H Li, KKR Choo… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Federated learning enables clients to train a machine learning model jointly without sharing their local data. However, due to the centralized nature of the federated learning framework and the …
Neural networks (NNs) have become one of the most important tools for artificial intelligence. Well-designed and trained NNs can perform inference (e.g., make decisions or …
M Hao, H Li, H Chen, P Xing, G Xu… - Advances in neural …, 2022 - proceedings.neurips.cc
We initiate the study of private inference on Transformer-based models in the client-server setting, where clients have private inputs and servers hold proprietary models. Our main …
M Rathee, C Shen, S Wagh… - 2023 IEEE Symposium on …, 2023 - ieeexplore.ieee.org
Federated learning (FL) is an increasingly popular approach for machine learning (ML) in cases where the training dataset is highly distributed. Clients perform local training on their …
JL Watson, S Wagh, RA Popa - 31st USENIX Security Symposium …, 2022 - usenix.org
Secure multi-party computation (MPC) is an essential tool for privacy-preserving machine learning (ML). However, secure training of large-scale ML models currently requires a …
Complex machine learning (ML) inference algorithms like recurrent neural networks (RNNs) use standard functions from math libraries like exponentiation, sigmoid, tanh, and reciprocal …
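These elementary functions appear directly inside RNN cells, which is what makes them the bottleneck for secure inference. The sketch below (illustrative only, not drawn from any of the listed papers) shows a single gated recurrent step in NumPy, making the reliance on exponentiation, reciprocal, sigmoid, and tanh explicit; all names and dimensions are hypothetical.

```python
import numpy as np

def sigmoid(x):
    # Sigmoid built from exponentiation and reciprocal: 1 / (1 + exp(-x)).
    return np.reciprocal(1.0 + np.exp(-x))

def rnn_step(x, h, W_x, W_h, b):
    # One simplified gated recurrent step: a sigmoid gate decides how much
    # of the tanh candidate state replaces the previous hidden state.
    z = sigmoid(W_x @ x + W_h @ h + b)         # update gate
    h_tilde = np.tanh(W_x @ x + W_h @ h + b)   # candidate hidden state
    return (1.0 - z) * h + z * h_tilde

# Tiny usage example with random weights (hypothetical dimensions).
rng = np.random.default_rng(0)
d = 4
x, h = rng.standard_normal(d), np.zeros(d)
W_x, W_h, b = rng.standard_normal((d, d)), rng.standard_normal((d, d)), np.zeros(d)
h_next = rnn_step(x, h, W_x, W_h, b)
```

Every call to exp, reciprocal, and tanh above is a nonlinear operation that a secure-computation protocol must evaluate on secret data, which is why specialized math-library protocols are the focus of this line of work.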
D Kim, C Guyot - IEEE Transactions on Information Forensics …, 2023 - ieeexplore.ieee.org
Inference of machine learning models with data privacy guarantees has been widely studied as privacy concerns are attracting growing attention from the community. Among others, secure …
LKL Ng, SSM Chow - 2023 IEEE Symposium on Security and …, 2023 - ieeexplore.ieee.org
We studied 53 privacy-preserving neural-network papers in 2016-2022 based on cryptography (without trusted processors or differential privacy), 16 of which only use …