Defining Security and Privacy Tradeoffs

Josh
Vehicular Security and Privacy
2 min read · May 9, 2017

A secure system is tasked with protecting assets; real-life examples might be security guards or locks on doors. Typically, security gets layered on as the system or asset grows: a large building might have multiple safeguards (e.g., multiple guards, automated swipe doors, surveillance, hourly guard rounds).

In computing, security takes its place in the form of Confidentiality, Integrity, and Availability (CIA). Encryption with cryptographic keys ensures confidentiality at the cost of usability: managing those keys carries a large overhead. In a layered secure system, usability thus gets worse as the security becomes stronger (think airport TSA: random passenger screening, random baggage screening).
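
As a minimal sketch of that key-management burden (using Python's cryptography package; the data string is purely illustrative), note how the key immediately becomes a second asset to protect:

```python
from cryptography.fernet import Fernet

# Confidentiality: without the key, the ciphertext reveals nothing.
key = Fernet.generate_key()
cipher = Fernet(key)

token = cipher.encrypt(b"vehicle location: 40.4406, -79.9959")
print(cipher.decrypt(token))  # b'vehicle location: 40.4406, -79.9959'

# The usability cost: the key is now an asset of its own. Lose it and
# the data is gone; leak it and confidentiality is gone. Every backup,
# rotation, and distribution step is more management overhead.
```

Every safeguard then wrapped around the key (vaults, rotation policies, access controls) is another security layer bought with usability.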

Security has been understood for most of human history: when someone works hard for something, it becomes natural to secure that asset and accept the usability cost.

Personal privacy, on the other hand, is painted as a “new” idea that should somehow be controlled by someone other than the individual. However, mass data collection comes with a price that either government agents or democracy itself must pay.

Disclosure of government agents’ own data already seems a sure issue. And what are we to say about profiling voters and using that data to shape a campaign accordingly? How can any of this be verified? How transparent can the process be when it comes to big data analytics?

Privacy in computer science has been shaped as a tradeoff between accuracy and privacy: the more accurate we wish data to be, the less private the data must be. However, is this the only tradeoff?
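
One well-known formalization of this tradeoff is differential privacy, which makes the accuracy/privacy dial explicit: a query answer is perturbed with noise scaled by a privacy parameter ε. Below is a minimal sketch (a counting query with sensitivity 1; the data and ε values are illustrative, not from any particular system):

```python
import numpy as np

def dp_count(data, predicate, epsilon):
    """Counting query released via the Laplace mechanism.
    A count changes by at most 1 per individual (sensitivity 1),
    so noise drawn from Laplace(0, 1/epsilon) suffices."""
    true_count = sum(1 for x in data if predicate(x))
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

ages = [23, 35, 41, 29, 52, 38, 47]

# Smaller epsilon -> stronger privacy -> noisier (less accurate) answers.
for eps in (0.1, 1.0, 10.0):
    print(eps, round(dp_count(ages, lambda a: a > 40, eps), 2))
```

The tradeoff sits right in the scale term: shrinking ε buys privacy by paying in accuracy.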

Once data is collected and linked with other data, new patterns and insights emerge. Can this even be controlled? Say location data is released privately. How well can I limit how that information is used? Is such a limitation even possible? One might argue that data is released only once, after which its use can be limited. Yet personal data is continuously generated. How can constraints be applied to a fountain of data? It might be time to ask what exactly privacy is, and what we can actually protect.
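
As a toy illustration of that linkage risk (the names, fields, and records here are entirely hypothetical), two releases that each look harmless can re-identify a person once joined on shared quasi-identifiers, the pattern behind Latanya Sweeney's well-known re-identification of “anonymized” records via public voter rolls:

```python
# "Anonymized" location release: no names, only quasi-identifiers.
released_locations = [
    {"zip": "15213", "birth": "1990-04-02", "home": (40.44, -79.96)},
]

# A separate public dataset, e.g. a voter roll, with names attached.
public_records = [
    {"name": "Alice", "zip": "15213", "birth": "1990-04-02"},
]

# Linking on (zip, birthdate) re-identifies the location record.
for loc in released_locations:
    for person in public_records:
        if (loc["zip"], loc["birth"]) == (person["zip"], person["birth"]):
            print(person["name"], "lives near", loc["home"])
```

Neither dataset violates privacy on its own; the join does, and the releaser of the first dataset has no technical lever over that join after the fact.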
