September 18, 2021


NeuralDP: Differentially private neural networks by design (arXiv:2107.14582v1 [cs.LG])

The application of differential privacy to the training of deep neural
networks holds the promise of allowing large-scale (decentralized) use of
sensitive data while providing rigorous privacy guarantees to the individual.
The predominant approach to differentially private training of neural networks
is DP-SGD, which relies on norm-based gradient clipping as a method for
bounding sensitivity, followed by the addition of appropriately calibrated
Gaussian noise. In this work we propose NeuralDP, a technique for privatising
the activations of a layer within a neural network, which, by the post-processing
property of differential privacy, yields a differentially private network. We
experimentally demonstrate on two datasets (MNIST and Pediatric Pneumonia
Dataset (PPD)) that our method offers substantially improved privacy-utility
trade-offs compared to DP-SGD.
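
For readers less familiar with the mechanisms named in the abstract, the short NumPy sketch below illustrates the norm-based clipping plus calibrated Gaussian noise that DP-SGD applies to per-example gradients, and then applies the same generic recipe to the per-example activations of a toy layer to give a rough picture of the activation-level idea. It is an illustrative sketch only, not code from the paper: the clipping norm C, the noise multiplier sigma, the batch size, and the toy layer are all assumptions, and the paper's actual layer-wise mechanism and privacy accounting are described in the preprint.

import numpy as np

rng = np.random.default_rng(0)

C = 1.0      # clipping norm (illustrative assumption)
sigma = 1.0  # noise multiplier (illustrative assumption)

def clip_and_noise(vectors, C, sigma, rng):
    # Clip each row to L2 norm <= C, sum the rows, and add Gaussian noise
    # with standard deviation sigma * C: the Gaussian mechanism applied to
    # a sum whose L2 sensitivity is bounded by C.
    norms = np.linalg.norm(vectors, axis=1, keepdims=True)
    clipped = vectors * np.minimum(1.0, C / np.maximum(norms, 1e-12))
    total = clipped.sum(axis=0)
    return total + rng.normal(0.0, sigma * C, size=total.shape)

# DP-SGD style: privatise the per-example gradients of one training step.
per_example_grads = rng.normal(size=(32, 10))   # toy gradients, batch of 32
noisy_grad = clip_and_noise(per_example_grads, C, sigma, rng) / 32
# an optimiser would now take a step using noisy_grad

# Activation-level idea (schematic only, not the paper's mechanism): apply
# the same clip-and-noise recipe to the per-example activations of one layer.
x = rng.normal(size=(32, 10))                   # toy batch of inputs
W = rng.normal(size=(10, 8))                    # toy layer weights
activations = np.maximum(x @ W, 0.0)            # ReLU activations, one row per example
noisy_activations = clip_and_noise(activations, C, sigma, rng) / 32

Because differential privacy is closed under post-processing, anything computed from noisy_activations (for example, the remaining layers of the network) inherits the same privacy guarantee, which is the property the abstract appeals to.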