Differential Privacy considered not practical
Theoretical Antagonist

Nice post. I couldn’t agree more with practitioners’ complaints about differential privacy, but I think reverting to k-anonymity is probably not the right way forward. It always takes time before theory turns into technology.

Finding weaker yet principled notions of privacy that permit more utility in practice is an active area of research. For instance, personalized and on-average notions of differential privacy (https://arxiv.org/pdf/1605.02277.pdf) attempt to address exactly this problem. Also, at least in the learning setting, there are already success stories of using DP, e.g.

In recommendation systems with matrix factorization: https://arxiv.org/abs/1505.01419

In MNIST and CIFAR with neural networks: https://arxiv.org/abs/1607.00133
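The second reference is the noisy-SGD (DP-SGD) idea: clip each example’s gradient so no single record can have too much influence, then add Gaussian noise calibrated to that clipping bound before the update. Below is a minimal numpy sketch of that idea; the hyperparameter values are illustrative placeholders of my own choosing, and the paper additionally tracks the cumulative privacy loss with a moments accountant, which this sketch omits.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.1, lr=0.05):
    """One noisy-SGD update in the spirit of arXiv:1607.00133.

    per_example_grads has shape (batch_size, dim): the gradient of the loss
    at each individual example. Hyperparameters here are illustrative only.
    """
    # 1. Clip each example's gradient to bound its sensitivity.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_norm / (norms + 1e-12))
    clipped = per_example_grads * scale

    # 2. Sum, add Gaussian noise calibrated to the clipping bound, then average.
    noisy_sum = clipped.sum(axis=0) + np.random.normal(
        0.0, noise_multiplier * clip_norm, size=params.shape)
    noisy_grad = noisy_sum / len(per_example_grads)

    # 3. Ordinary gradient step on the privatized gradient.
    return params - lr * noisy_grad
```

In practice, the clipping norm and noise multiplier are the knobs that trade privacy against accuracy, which is exactly the landscape I have in mind below.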

The point is, it’s true that DP is not yet practical for most tasks, but it will be eventually. The key is to figure out a smoother landscape for trading off privacy and utility.
