ML with Differential Privacy using JAX and DP-SGD

A tutorial on JAX: a great tool for DP-SGD

Differential Privacy
Deep Learning
Machine Learning
Nicolas Grislain

In this notebook 📔 we demonstrate how to use JAX-based DP-SGD together with the dp-accounting library to train deep-learning models with Differential Privacy (DP). We illustrate these tools on a small toy problem, histogram estimation, and compare this generic approach to some well-known benchmarks.
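To make the approach concrete, here is a minimal sketch of one DP-SGD step in JAX, assuming a hypothetical single-example squared-error loss for a linear model (the function names and default hyper-parameters are illustrative, not the notebook's actual code). Per-example gradients are obtained by composing `jax.grad` with `jax.vmap`, then clipped in L2 norm, summed, and perturbed with calibrated Gaussian noise:

```python
import jax
import jax.numpy as jnp

def example_loss(params, x, y):
    # Squared error on a single example (hypothetical linear model).
    return (jnp.dot(x, params) - y) ** 2

def dp_sgd_step(params, xs, ys, key, l2_clip=1.0, noise_multiplier=1.1, lr=0.1):
    # 1. Per-example gradients: vmap the gradient over the batch axis.
    grads = jax.vmap(jax.grad(example_loss), in_axes=(None, 0, 0))(params, xs, ys)
    # 2. Clip each example's gradient to L2 norm <= l2_clip.
    norms = jnp.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * jnp.minimum(1.0, l2_clip / norms)
    # 3. Sum, add Gaussian noise scaled by noise_multiplier * l2_clip, average.
    noise = noise_multiplier * l2_clip * jax.random.normal(key, params.shape)
    noisy_mean = (clipped.sum(axis=0) + noise) / xs.shape[0]
    # 4. Plain gradient-descent update on the privatized gradient.
    return params - lr * noisy_mean
```

Because the noise scale is tied to the clipping threshold, the sensitivity of each step is bounded by `l2_clip`, which is what makes the privacy accounting below possible.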

Along the way we encounter some of the problems one commonly faces when tuning the hyper-parameters of DP-SGD, in particular its clipping threshold.

If you own privacy-sensitive data and would like more people to use it without worrying about privacy risks, you might be interested in what Sarus Technologies does.

Read the notebook 📔

Per-sample gradient clipping may introduce a strong bias in our model
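A toy illustration of this bias (the numbers are made up for the example, not taken from the notebook): on a skewed batch where one outlier gradient dominates, an aggressive clipping threshold can shrink the outlier so much that the averaged gradient changes sign entirely.

```python
import jax.numpy as jnp

# Per-example gradients for a skewed batch: three examples push -1,
# one outlier pushes +10, so the true average gradient is positive.
grads = jnp.array([-1.0, -1.0, -1.0, 10.0])
true_mean = grads.mean()  # (-3 + 10) / 4 = 1.75

# Clip each per-example gradient to magnitude <= 1 before averaging.
clip = 1.0
clipped = grads * jnp.minimum(1.0, clip / jnp.abs(grads))
clipped_mean = clipped.mean()  # (-3 + 1) / 4 = -0.5
```

Here clipping flips the sign of the update: the optimizer would move in the wrong direction regardless of how little noise is added, which is why the clipping threshold deserves careful tuning.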

About the author

Nicolas Grislain

Cofounder & CSO @ Sarus

©2022 Sarus Technologies.
All rights reserved.