First they came for GPG keys
and you didn’t speak up, because what the hell is a GPG key
I saw this (not great advice for 2016, just to be very clear)
A few thousand words later.
Data security is a hard problem to solve. It seems like it shouldn’t be — humans have been encoding and encrypting things for as long as they’ve had an alphabet. But securing your own personal communications and data is a uniquely individual problem.
Many drive-by workshops and tutorials are obsessively focused on overviews of tooling. There’s a good reason for this — only tooling information is truly general purpose and one-size-fits-most. Even when a workshop isn’t focused that way, participants tend to desperately search for a mythological, perfect tl;dr: one that distills everything down to a single pithy tool choice. Unfortunately, the objectives of using said tool, the amount of information transferred, the number of participants in a secret exchange and their platform choices/technical sophistication (among many other factors) are all variables that make the decision more nuanced. What’s the tl;dr on information security? There probably isn’t one. To put it in medical terms: you can self-prescribe, or listen to a witch doctor, and it will probably be ok. Until it isn’t.
It’s easier to say (or remember, or tweet) “Use Signal, use Tor” than it is to say “Please apply all the OS updates when released and don’t install crapware on your notebook computer first, then talk about your tools for secure communication”. (Yes, it is equally possible to tweet both, but the latter seems orthogonal at first glance to communicating secretly with someone else)
Not to sound like Mr Miyagi, but information security is much more about an ongoing process and mindset than it is about a set of tools. In many ways, there is still a tendency to treat data security as something fixable by an appliance or an arsenal of magic bullets; in the same way that one would employ a microwave oven or refrigerator.
It doesn’t quite work that way.
I used to consult on information security practices for local and international organisations a few years ago. I now spend some of my time working on the same things in-house, with slightly more rewarding results. My focus with clients wasn’t on the dark, much-Hollywood-hyped arts of whoa dude hacking. My remit usually started off differently (make us really, really secure), and most of my time actually ended up being spent helping clients define what needed protecting, and from whom. Quite often, clients needed help articulating objectives; and once they did, they found the necessary changes to their day-to-day processes (which mostly lacked information security hygiene in the first place) were prohibitively expensive.
Most organisations (not just Sri Lankan NGOs) have a significant, ongoing problem with disseminating and sharing information in visible, meaningful ways. Adoption of information systems is low, and prone to resistance from more entrenched workers. Skill levels are climbing, but not steeply enough in my opinion. If you want to share electronic documents between remote field offices where no such capacity existed in the first place, you will usually go with the first workable solution proposed, whether by a consultant, through an informal consult, or by one of your employees. Any kind of collaboration or information sharing is still a huge, huge win in many sectors. So what happens when someone wants to stop this march towards progress for something as nebulous as “information security”?
A typical brief goes something like this —
So, as you know — we face a number of threats to our data, from many different sources. We handle a lot of sensitive information, and we would like to communicate with our employees and also external parties securely.
Small print: we don’t have much of a budget for this. Even smaller print: We don’t have (m)any trained specialists, we just have a part-time network guy who fixes our printers and helps us with Word documents. He doesn’t show up to work every day. Extra small print: most of our staff aren’t very technical people.
For your conventional Sri Lankan NGO that may be engaged in sensitive work in the field, doing threat analysis (PDF) is of vital importance. I’ll be honest — I sometimes gauge how serious clients are about information security by offering to work with them on defining their threat model. How many hand-wave past this? Uncomfortably many. Threat analysis is no mere theoretical exercise: defining the potential for damage, and the sophistication of your potential attacker, vastly limits the changes to your existing practices — and their costs — by letting you concentrate resources on the most critical pieces of information.
[A threat model is] a way of narrowly thinking about the sorts of protection you want for your data. It’s impossible to protect against every kind of trick or attacker, so you should concentrate on which people might want your data, what they might want from it, and how they might get it. Coming up with a set of possible attacks you plan to protect against is called threat modeling. Once you have a threat model, you can conduct a risk analysis. Source: EFF glossary entry on threat models
Here’s a hierarchical sampling of entities that may compromise a non-governmental organisation’s data security:
a) Organisations that belong to or are sponsored by foreign states
b) the state; and/or its intelligence apparatus (with varying degrees of complicity from ISPs/telcos)
c) Technically skilled third parties hired either by (b) or competitors/interest groups
d) Insiders (disgruntled, technically incompetent or indiscreet employees, potentially those with a financial or social incentive)
e) Parties with whom you are sharing sensitive information
f) Technically skilled (possibly ideologically motivated) outsiders
g) Casual outsiders [ever had a friend pick up your phone and post to Facebook impersonating you? yeah, those people]
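The EFF definition above can be made concrete. As a purely illustrative sketch (the assets, adversaries and scores below are entirely hypothetical, not drawn from any real organisation), a first-pass threat model is little more than a worksheet that forces you to rank what you are protecting, from whom, and how badly a compromise would hurt:

```python
# A minimal, hypothetical threat-model worksheet. The assets, adversaries
# and 1-5 scores are invented for illustration only; a real risk analysis
# is far more nuanced than consequence x likelihood.

from dataclasses import dataclass

@dataclass
class Threat:
    asset: str          # what are we protecting?
    adversary: str      # who do we need to protect it from?
    consequence: int    # how bad is a compromise? (1 = minor, 5 = severe)
    likelihood: int     # how likely is it to succeed? (1 = rare, 5 = near-certain)

    @property
    def risk(self) -> int:
        # A crude ranking score, good enough to order a conversation.
        return self.consequence * self.likelihood

# Hypothetical NGO assets, mapped loosely to the actor categories (a)-(g) above.
threats = [
    Threat("field-source identities", "state intelligence (b)", 5, 3),
    Threat("donor funding proposals", "hired third parties (c)", 3, 3),
    Threat("internal payroll records", "disgruntled insiders (d)", 2, 4),
    Threat("unlocked office phones", "casual outsiders (g)", 1, 5),
]

# Spend limited resources on the highest-ranked risks first.
for t in sorted(threats, key=lambda t: t.risk, reverse=True):
    print(f"risk={t.risk:2d}  {t.asset}  vs  {t.adversary}")
```

Even a toy exercise like this surfaces the uncomfortable questions — note that in this invented example the highest-ranked risk is not the most frequent attacker, but the one whose success would do the most damage.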
Wait. Does your prospective client say “we need to be bulletproof against the NSA, on a shoestring”? Walk, no, run away if they really mean it. Sri Lankan organisations have had countless breaches (and that’s just the stuff we know about) by far less sophisticated adversaries. Do your candidate clients have difficulty prioritising types of information? That is an indication of superficial thought being given to the actual mechanics of information security, and it probably needs to be fixed before any tools can be applied to a given problem. Do they have problems telling you who should have access to which types of information? This is less common, but still a surprisingly frequent occurrence in organisations where information systems are bolted on top of paper-based record keeping. It’s probably possible to tool up your organisation to be impenetrable to most of the entities above; but no one gets there on day 1 (or, to be frank, day 100).
What does all of this mean for those potentially curious about communicating securely, and protecting their data; ie, the promised land of everything being secret?
Some random truisms:
- Tool choices can and do come with caveats. Inappropriate use of a security tool is perhaps the easiest way to leak information. Unfortunately, tool choices are also a moving target. There was a time when people used Telegram for secure communications simply because WhatsApp communicated in the clear back then, and other secret-chat tooling (Threema, products by Silent Circle) was not widely available. Tomorrow, some genius cryptographer may discover a vulnerability in the Signal protocol, just as researchers did with MTProto, the protocol used by Telegram.
- Using a tool (even with great training) is just a single step — PGP/GPG key IDs had a flaw exploited just last week. Updating the tool itself, as well as your organisational code of best practices around using it, is equally important. Even where GPG adoption has happened locally, most users work off a series of memorised recipes for encryption/decryption, which is not ideal.
- A tool cannot magically fix your own information security practices if they were subpar earlier. Encrypting an email is great, but less so if you slip up and add all the recipients on the cc: list. Encrypted chats are great, unless you slip up and let an unencrypted backup land on another service like Google Drive. Encrypting data at rest is meaningless if you slap a post-it with the passphrase on the backup disk. Sending someone an encrypted message, then putting up a Facebook status asking if they got it, is … not ideal.
- The weakest link is not necessarily your own organisation but some other, perhaps higher profile establishment — for instance Cablegate did a great little number on local NGOs and some of their employees. Be careful of what is shared, with whom; and when. Funding proposals, requests for information and initial contact with potential sources can all be intercepted or leaked by third parties.
- Spear phishing is by far a more likely vector of attack than using a previously undisclosed software vulnerability on your computing device.
- The best way to protect secrets is to not have them in the first place — many jurisdictions have a key disclosure law. Other places indulge in rubber hose cryptanalysis; which is not even remotely as cutesy as the name might suggest. Full disk encryption for your computing devices as a means of preserving secrets, therefore, is very useful in case of theft or misplacing a device but less useful against determined adversaries or border security in some countries.
- Recommending tools on their own is meaningless. Install something on your mobile phone? What if the phone is already compromised? On a computer? What if your internet connection is not secure? Nonetheless …
- Some potential tool choices to investigate in August 2016 (this list could be invalidated tomorrow by newer findings or advances in the field) — Tor, Signal, Wire, G(nu)PG (of those, GPG/PGP tends to divide opinion the most. Great tool, but poor usability). Look at VPNs for securing network traffic (but remember VPN traffic is not difficult to disrupt and VPN providers also serve a large market of copyright violators).
- Related to VPNs above, people/groups will use the same tools you do to facilitate illegal or objectionable activities. This means that using some of these tools can be seen as an indicator of criminal activity in some jurisdictions. Nonetheless, a society where everyone uses encrypted, secure communications is arguably safer.
- Oh dear heavens, no.
Do not take any advice on security tooling or practices from people who do not work in the field. Even if they do work in the field, cross check their advice with multiple, credible sources. Ask these sources if a tool is an appropriate fit for your circumstances. There is a lot of misinformation out there, most of it dated or appearing to be unequivocal and/or authoritative without any basis for being so.
Even the guides by the EFF have their share of (legitimate) concerns and detractors, so much so that the first version of their Secure Messaging Scorecard is no longer available online. Crypto nerds argue over these things to the death.
- Be aware of local context when receiving advice (or offering it)
Burner phones (or throwaway SIMs) are more difficult to acquire in Sri Lanka because of the requirement to produce valid identification on purchase. Sure, you can social engineer a guy off the street to buy one for you, so perhaps it isn’t a big barrier; but when a telco is complicit in surveillance, burner phones tend to have a fairly short half-life. If you think you know how to circumvent the checks, you’re probably missing something (or don’t need this guide in the first place).
- Sri Lankan surveillance infrastructure will benefit massively in the near future (in cost and efficiency) from advances being field-tested in other countries at present. In addition, Sri Lanka also appears on the list of countries that are vulnerable to a complete internet shutdown (the article is from 2012, but we have only added two more links in the intervening half-decade, so not much has changed). Our ISPs are already complicit in blocking access to various websites, with no definitive list of what is being blocked, or when.
Did you read this far and not come away with a single concrete recommendation on software; nor an actionable item that will magically make all your stuff secure? Then I think I’ve done the job, because choosing your personal security tools on the strength of a single blog post or workshop is the very definition of insanity. Talk to practitioners about an audit of your organisation first, before discussing tool choices. Ask for caveats and alternatives for most proposals; as there are few instances where a single tool will be universal proof against all scenarios.
In my personal (and probably controversial) opinion, workshops are a useful first step at best for figuring out an information security strategy; be it personal or organisational. Paradoxically, given how I’ve tried to avoid tool recommendations — workshops really shine at giving a set of people (with disparate backgrounds) a good foundation on how to use a specific tool.
There is a truism that the most important information is (still) rarely committed to electronic memory. Sometimes the best information security strategy can be to not need one for very much information at all.