Private gradient descent for linear regression: Tighter error bounds and instance-specific uncertainty estimation

G Brown, K Dvijotham, G Evans, D Liu, A Smith… - arXiv preprint arXiv …, 2024 - arxiv.org
We provide an improved analysis of standard differentially private gradient descent for linear
regression under the squared error loss. Under modest assumptions on the input, we …
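The baseline this paper analyzes is the standard noisy (differentially private) gradient descent loop for least-squares regression: clip each per-example gradient, average, add Gaussian noise, and step. Below is a minimal NumPy sketch of that baseline under those assumptions; it is not the paper's refined analysis, and the function name, default hyperparameters, and the explicit `noise_std` parameter (whose calibration to a given (epsilon, delta) depends on the clipping norm, dataset size, iteration count, and the privacy accountant) are illustrative choices.

```python
import numpy as np

def dp_gradient_descent(X, y, epochs=100, lr=0.1, clip=1.0, noise_std=1.0, seed=0):
    """Noisy gradient descent for least-squares linear regression.

    Per-example gradients are clipped to L2 norm `clip`, averaged, and
    perturbed with Gaussian noise before each step. `noise_std` is an
    assumed, externally calibrated noise multiplier (illustrative only).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        residuals = X @ w - y                      # shape (n,)
        grads = residuals[:, None] * X             # per-example gradients, shape (n, d)
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))  # clip to norm `clip`
        # Gaussian noise scaled to the sensitivity of the averaged gradient (~ clip / n)
        noisy_grad = grads.mean(axis=0) + rng.normal(0.0, noise_std * clip / n, size=d)
        w -= lr * noisy_grad
    return w
```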

Better private linear regression through better private feature selection

T Dick, J Gillenwater, M Joseph - Advances in Neural …, 2024 - proceedings.neurips.cc
Existing work on differentially private linear regression typically assumes that end users can
precisely set data bounds or algorithmic hyperparameters. End users often struggle to meet …

Differentially private and explainable boosting machine with enhanced utility

I Baek, YD Chung - Neurocomputing, 2024 - Elsevier
In this paper, we introduce DP-EBM*, an enhanced utility version of the Differentially Private
Explainable Boosting Machine (DP-EBM). DP-EBM* offers predictions for both classification …

Revisiting differentially private XGBoost: are random decision trees really better than greedy ones?

E Wang, A Kolbeinsson, L Foschini, YX Wang - openreview.net
Boosted Decision Trees (e.g., XGBoost) are among the strongest and most widely used
machine learning models. Motivated by applications in sensitive domains, various versions …
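The contrast posed in the title is between two ways a private tree learner can choose splits: picking them uniformly at random from public feature ranges (which spends no privacy budget on split selection) versus greedily scoring candidate splits on the data and sampling one with the exponential mechanism. The sketch below illustrates that generic contrast at a single node; it is not the paper's algorithm, and the score function, the bound `delta_u` on its sensitivity, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_split(X, feature_ranges):
    """Random split: pick a feature and threshold uniformly from public
    per-feature ranges. Data-independent, so no privacy budget is spent."""
    j = rng.integers(X.shape[1])
    lo, hi = feature_ranges[j]
    return j, rng.uniform(lo, hi)

def variance_gain(X, y, j, t):
    """Split utility: reduction in the sum of squared deviations of y."""
    left, right = y[X[:, j] <= t], y[X[:, j] > t]
    sse = lambda v: ((v - v.mean()) ** 2).sum() if len(v) else 0.0
    return sse(y) - sse(left) - sse(right)

def greedy_private_split(X, y, candidates, epsilon, delta_u):
    """Greedy DP split: score each candidate (feature, threshold) pair and
    sample one via the exponential mechanism, assuming the utility's
    sensitivity is bounded by `delta_u` (e.g., via bounded labels)."""
    scores = np.array([variance_gain(X, y, j, t) for j, t in candidates])
    logits = epsilon * scores / (2.0 * delta_u)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]
```

The trade-off the sketch makes visible: the random rule wastes no budget but ignores the data, while the greedy rule adapts to the data at the cost of per-node privacy spend and noise in the selection.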