Curated notebooks on how to train neural networks using differential privacy and federated learning.
Before you start learning about Differential Privacy and Federated Learning, it's important to understand tensors, the fundamental data structures for neural networks.
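As a quick illustration (a minimal sketch assuming PyTorch is installed), tensors are just multi-dimensional arrays you can create and combine directly:

```python
import torch

# A 2x3 tensor of random values and a tensor of ones with the same shape
x = torch.rand(2, 3)
y = torch.ones(2, 3)

# Element-wise addition and a matrix multiplication with y's transpose
z = x + y        # shape: (2, 3)
w = x @ y.t()    # shape: (2, 2)

print(z.shape, w.shape)  # torch.Size([2, 3]) torch.Size([2, 2])
```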
Most of the time you won't want to train a whole convolutional network yourself. Training a modern ConvNet on a huge dataset like ImageNet takes weeks on multiple GPUs. Transfer learning solves this problem by letting you reuse a network that was already trained on a large dataset and fine-tune it for your own task.
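A common pattern looks like the sketch below (assuming a recent torchvision; `num_classes` is a placeholder for your own dataset): freeze the pretrained feature extractor and train only a new classifier head.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a ResNet-18 pretrained on ImageNet
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor so its weights are not updated
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a new head for our own task
num_classes = 10  # placeholder: set to your dataset's number of labels
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Only the new head's parameters are passed to the optimizer
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```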
Differential Privacy is a set of techniques for preventing a model from accidentally memorizing secrets present in a training dataset during the learning process.
For it to work, we need to uphold the following:
- Make a promise to the data subject: you won't be affected, adversely or otherwise, by allowing your data to be used in any analysis, no matter what other studies, datasets, or information sources are available.
- Ensure that models learning from sensitive data learn only what they are supposed to learn, without accidentally picking up what they are not supposed to learn from that data.
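As a toy sketch of the intuition (not taken from the notebooks; the epsilon value and the count query are illustrative assumptions), one classic way to uphold this promise is to add calibrated noise to a query's answer, here with the Laplace mechanism:

```python
import numpy as np

def laplace_count(db, epsilon=0.5):
    """Differentially private count over a binary database.

    The true count has sensitivity 1 (adding or removing one person
    changes it by at most 1), so Laplace noise with scale 1/epsilon
    gives epsilon-differential privacy.
    """
    true_count = np.sum(db)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Toy database: one bit of sensitive information per person
db = np.random.randint(0, 2, size=100)
print(laplace_count(db))  # noisy count; no single individual's bit is revealed
```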
Here are some notebooks that explain the concept further:
Instead of bringing all the data to one place for training, federated learning works by bringing the model to the data. This allows a data owner to keep the only copy of their information.
This notebook on federated learning explains it in more detail.
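Here is a minimal sketch of the idea in plain PyTorch (the workers' datasets are hypothetical synthetic tensors; real setups use frameworks such as PySyft): each worker trains a copy of the model on its own private data, and only the resulting weights are averaged on the server.

```python
import copy
import torch
import torch.nn as nn

def local_update(model, data, target, lr=0.1):
    """Train a local copy of the model on one worker's private data."""
    local_model = copy.deepcopy(model)
    optimizer = torch.optim.SGD(local_model.parameters(), lr=lr)
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(local_model(data), target)
    loss.backward()
    optimizer.step()
    return local_model.state_dict()

def federated_average(model, worker_states):
    """Average the workers' weights into the global model (FedAvg-style)."""
    avg_state = copy.deepcopy(worker_states[0])
    for key in avg_state:
        for state in worker_states[1:]:
            avg_state[key] += state[key]
        avg_state[key] /= len(worker_states)
    model.load_state_dict(avg_state)
    return model

# Toy global model and two workers' private (synthetic) datasets
global_model = nn.Linear(2, 1)
workers = [(torch.randn(8, 2), torch.randn(8, 1)) for _ in range(2)]

# One round: local training on each worker, then averaging on the server
states = [local_update(global_model, x, y) for x, y in workers]
global_model = federated_average(global_model, states)
```

Note that only model weights leave each worker; the raw data never does.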