DAS-AST: Defending against model stealing attacks based on adaptive softmax transformation

J Chen, C Wu, S Shen, X Zhang, J Chen - Information Security and …, 2021 - Springer
Deep Neural Networks (DNNs) have been widely applied to diverse real-life
applications and dominated in most cases. Considering the hardware consumption for DNN …
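The title points to a defense that transforms the softmax outputs returned by the API. As a rough, generic illustration of that family of defenses (not the paper's adaptive transformation; the temperature value below is an arbitrary assumption), a temperature re-scaling that keeps the predicted class unchanged looks like this:

import numpy as np

def transform_softmax(probs, temperature=0.5):
    """Re-scale a probability vector with a temperature and renormalize.

    Generic sketch of an output-transformation defense: T < 1 sharpens and
    T > 1 flattens the returned distribution, while the argmax (the label a
    benign user relies on) stays the same.
    """
    logits = np.log(np.clip(probs, 1e-12, 1.0))   # recover pseudo-logits
    scaled = logits / temperature
    scaled -= scaled.max()                        # numerical stability
    exp = np.exp(scaled)
    return exp / exp.sum()

probs = np.array([0.55, 0.30, 0.10, 0.05])
print(transform_softmax(probs))                   # sharpened distribution, same top-1 class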

Defending against neural network model stealing attacks using deceptive perturbations

T Lee, B Edwards, I Molloy, D Su - 2019 IEEE Security and …, 2019 - ieeexplore.ieee.org
Machine learning architectures are readily available, but obtaining the high-quality labeled
data for training is costly. Pre-trained models available as cloud services can be used to …

APMSA: Adversarial perturbation against model stealing attacks

J Zhang, S Peng, Y Gao, Z Zhang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Training a Deep Learning (DL) model requires proprietary data and computing-intensive
resources. To recoup their training costs, a model provider can monetize DL models through …

AugSteal: Advancing Model Steal with Data Augmentation in Active Learning Frameworks

L Gao, W Liu, K Liu, J Wu - IEEE Transactions on Information …, 2024 - ieeexplore.ieee.org
With the proliferation of machine learning models in diverse applications, the issue of model
security has increasingly become a focal point. Model steal attacks can cause significant …

Army of Thieves: Enhancing Black-Box Model Extraction via Ensemble based sample selection

A Jindal, V Goyal, S Anand… - Proceedings of the IEEE …, 2024 - openaccess.thecvf.com
Machine Learning (ML) models become vulnerable to Model Stealing Attacks
(MSA) when they are deployed as a service. In such attacks, the deployed model is queried …
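The title names ensemble-based sample selection for extraction queries. A minimal sketch of one common selection heuristic, assuming the attacker maintains several surrogate models and scores a candidate pool by the uncertainty of their averaged prediction (the pool shapes and the 16-query budget below are illustrative assumptions, not the paper's procedure):

import numpy as np

def disagreement_scores(member_probs):
    """Score candidate queries by ensemble uncertainty.

    member_probs: array of shape (n_members, n_samples, n_classes) with the
    softmax outputs of each surrogate ensemble member on a candidate pool.
    Returns the entropy of the averaged prediction per sample; higher values
    mark samples the ensemble is least sure about, i.e. more informative queries.
    """
    mean_probs = member_probs.mean(axis=0)                          # (n_samples, n_classes)
    return -(mean_probs * np.log(mean_probs + 1e-12)).sum(axis=1)

rng = np.random.default_rng(0)
pool = rng.dirichlet(np.ones(10), size=(3, 100))                    # 3 members, 100 candidates, 10 classes
top_queries = np.argsort(disagreement_scores(pool))[::-1][:16]      # pick the 16 most contested candidates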

Prediction poisoning: Towards defenses against DNN model stealing attacks

T Orekondy, B Schiele, M Fritz - arXiv preprint arXiv:1906.10908, 2019 - arxiv.org
High-performance Deep Neural Networks (DNNs) are increasingly deployed in many real-world
applications, e.g., cloud prediction APIs. Recent advances in model functionality …
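Defenses of this kind perturb the posted probability vector so that it misleads an extractor while remaining useful to legitimate clients. The sketch below is a simplified, generic illustration of that idea, not the optimization-based algorithm from the cited paper; the noise scale is an arbitrary assumption, and the only invariant enforced is that the top-1 label survives the perturbation:

import numpy as np

def poison_prediction(probs, noise_scale=0.1, rng=None):
    """Return a perturbed probability vector with the top-1 class unchanged."""
    rng = rng or np.random.default_rng()
    noisy = probs + rng.normal(0.0, noise_scale, size=probs.shape)  # distort the posteriors
    noisy = np.clip(noisy, 1e-6, None)
    noisy /= noisy.sum()
    top = int(np.argmax(probs))
    if int(np.argmax(noisy)) != top:          # restore the original label if the noise flipped it
        noisy[top] = noisy.max() + 1e-3
        noisy /= noisy.sum()
    return noisy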

Efficient defense against model stealing attacks on convolutional neural networks

K Khaled, M Dhaouadi… - 2023 International …, 2023 - ieeexplore.ieee.org
Model stealing attacks have become a serious concern for deep learning models, where an
attacker can steal a trained model by querying its black-box API. This can lead to intellectual …

ES Attack: Model stealing against deep neural networks without data hurdles

X Yuan, L Ding, L Zhang, X Li… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Deep neural networks (DNNs) have become the essential components for various
commercialized machine learning services, such as Machine Learning as a Service …
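For context on what such attacks automate, the following is a minimal sketch of the generic black-box extraction loop they build on (not the data-free procedure from the cited paper): query a victim API with synthetic inputs, collect its labels, and fit a surrogate on the query/label pairs. The stand-in victim, the Gaussian query distribution, and the sklearn models are all illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Stand-in "victim" trained on private data the attacker never sees.
X_private = rng.normal(size=(2000, 20))
y_private = (X_private[:, :5].sum(axis=1) > 0).astype(int)
victim = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300,
                       random_state=0).fit(X_private, y_private)

# Attacker: synthesize queries, collect black-box responses, train a surrogate.
X_query = rng.normal(size=(1000, 20))          # synthetic queries, no real data needed
y_query = victim.predict(X_query)              # labels returned by the API
surrogate = LogisticRegression(max_iter=1000).fit(X_query, y_query)

# Functional similarity: how often the copy agrees with the victim.
agreement = (surrogate.predict(X_private) == victim.predict(X_private)).mean()
print(f"surrogate/victim agreement: {agreement:.2%}")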

CloudLeak: Large-Scale Deep Learning Models Stealing Through Adversarial Examples

H Yu, K Yang, T Zhang, YY Tsai, TY Ho, Y Jin - NDSS, 2020 - ndss-symposium.org
Cloud-based Machine Learning as a Service (MLaaS) is gradually gaining acceptance as a
reliable solution to various real-life scenarios. These services typically utilize Deep Neural …
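The title indicates that adversarial-style inputs are used as extraction queries. As a heavily simplified illustration of that general idea (not the paper's pipeline), one can take an FGSM-style step against a local linear surrogate so the crafted queries cluster near its decision boundary before being sent to the victim; the linear surrogate and the step size eps are assumptions made only for this sketch:

import numpy as np

def fgsm_queries(X, y, w, b, eps=0.1):
    """Craft boundary-probing queries from seed inputs using a linear surrogate.

    X: seed inputs (n_samples, n_features); y: surrogate labels in {0, 1};
    w, b: weights and bias of a logistic-regression surrogate.
    """
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))        # surrogate confidence
    grad = (p - y)[:, None] * w[None, :]          # d(cross-entropy)/dx for a logistic model
    return X + eps * np.sign(grad)                # FGSM-style step toward the boundary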

Isolation and induction: Training robust deep neural networks against model stealing attacks

J Guo, X Zheng, A Liu, S Liang, Y Xiao, Y Wu… - Proceedings of the 31st …, 2023 - dl.acm.org
Despite the broad application of Machine Learning as a Service (MLaaS), the deployed models are
vulnerable to model stealing attacks. These attacks can replicate the model functionality by …