Data in the wrong hands is dangerous, just ask 007
If you have just seen Spectre, you will be aware that data can be used for good and for ill.
Spectre still — James Bond and Q — copyright Twentieth Century Fox
I’ve lost count of the times I have talked/preached/ranted about using data, and more so about bringing data together for insight generation. However, with data comes great responsibility.
For the record, when I say data, I mean…
“Data is” slide from NUX Liverpool presentation — behind every great idea is great data
If you are lucky enough to get data through to the boardroom, you’re one of the few.
However I’m pretty sure that if you are successful this data will often be:
- aggregated (data that has been combined but in reality should have been kept separate)
- lacking basic QA checks (biased from the get-go; think including internal users)
- based on metrics/terminology you don’t really understand (think page views, hits…)
- opinionated (normally headed up by marketing/agencies)
- not enough or too much (often a single piece of data or a year’s worth; either extreme will be used against you to argue the data is invalid)
Where this gets even more troublesome is that it’s this data that then gets acted on. (Granted, that’s still slightly better than making decisions from no data.)
Don’t just take my word for it; the good people at the Government Digital Service (GDS) agree too…
Like any tool, there are pitfalls if it’s not used the right way. Research findings can be incredibly powerful in helping a team make good decisions. But without careful thought and analysis, it can be used to jump to poor decisions.
So what do we do about it?
1. Limit access to your raw data
Whether we are talking analytics or research, keep access to the tools to a minimum and restricted to those who understand them. This means agencies too.
(NB that doesn’t mean limit access to your “analysed” data)
Rationale: Raw data lacks analysis and, most importantly, insight. If others have access to it, the data becomes open to debate in meetings (what does it mean? is that good?…) and this devalues all of your data, not just this piece.
2. Focus the data on organisational goals and objectives
Data collection should be built around organisational goals/objectives. Let’s face it, this is why you are employed in the first place. Aligning your data with organisational objectives makes it easier to use and to gain buy-in for when going up the chain, because this is already familiar territory.
Rationale: Whilst the above makes sense, so many people don’t do it. You might have good data (high quality and insightful), but if it isn’t related to objectives and goals then it’s time wasted; shown to others, it wastes their time too and makes you and your position look a little less important.
3. Share “user friendly” data
GDS Dashboard — Practical driving test bookings
By creating your own dashboards and crib sheets, you instantly have something of worth to share. If done correctly, you should replace jargon with user-friendly language so that the dashboards make sense to anyone. You can then also produce different data visuals for different stakeholder groups: for example, insights just for your IT/digital services teams on the top browsers to build for, or for your marketing teams on how campaigns have really performed.
Rationale: As the phrase goes, nothing ventured, nothing gained. In this case: nothing shared, nothing gained. Sharing aids learning, and learning leads to improvements. For real impact, rather than hard-copy or static digital dashboards, get them online. This not only makes them highly accessible, usable and shareable but also alive (real time), which raises data awareness throughout your organisation and beyond.
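The jargon-swapping idea above can be sketched in a few lines. This is a minimal, hypothetical example (the metric names, labels and figures are all invented, not from any real dashboard): a plain lookup table translates raw analytics keys into the user-friendly language a stakeholder report should use.

```python
# A minimal sketch of replacing analytics jargon with user-friendly labels
# before sharing. All metric names and values below are hypothetical.

FRIENDLY_LABELS = {
    "unique_pageviews": "Pages people actually looked at",
    "bounce_rate": "Visitors who left after viewing one page",
    "avg_session_duration": "Typical time spent per visit",
}

def friendly_report(raw_metrics):
    """Return the metrics keyed by plain-language labels.

    Unknown keys pass through unchanged, so nothing is silently dropped.
    """
    return {FRIENDLY_LABELS.get(key, key): value
            for key, value in raw_metrics.items()}

report = friendly_report({
    "unique_pageviews": 12450,
    "bounce_rate": "41%",
    "avg_session_duration": "2m 10s",
})
for label, value in report.items():
    print(f"{label}: {value}")
```

The same lookup table could then drive different report variants for different stakeholder groups, as described above.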
4. Create a collaborative way of thinking
Many of you will work with marketing teams/agencies, especially around PPC or campaigns. Whilst it might not be an area you are an expert in (hence the agency involvement), leaving them to their own devices isn’t good for them or for you. (Remember points 1 & 2.)
So the solution is to work together. Create a solution that works for both of you, which for me means access to online dashboards tailored to the information they need. (A tailored offline digital report is equally good.)
Rationale: By providing a tailored dashboard/data visualisation you have a single point of truth that can be shared and explained to all, removing the mixed messages that come from separate data sets. Keep evolving it to show not just data but true insights. The fact that you have created a collaboration also pleases those whose access you removed in step 1.
5. Always review data quality (from the start)
The QA role in service delivery is often overlooked in small teams, simply due to resources. The QA role in data delivery is not just overlooked; it’s not even on the radar. For those involved in data capture, be that quantitative or qualitative, grow the QA role into everyone’s daily activities. Simple steps will prevent biased data and reinforce quality as the default.
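Those simple steps can be as basic as dropping internal traffic and incomplete records before any analysis starts. Here is a minimal sketch, assuming (hypothetically) that each captured record is a dict with "ip" and "page" fields; the internal network ranges are placeholders you would swap for your own.

```python
# A minimal sketch of everyday QA checks on captured analytics records.
# The record shape ("ip", "page") and the network ranges are assumptions.
from ipaddress import ip_address, ip_network

INTERNAL_NETWORKS = [ip_network("10.0.0.0/8"), ip_network("192.168.0.0/16")]

def is_internal(ip):
    """True if the IP falls inside one of our internal network ranges."""
    addr = ip_address(ip)
    return any(addr in net for net in INTERNAL_NETWORKS)

def qa_filter(records):
    """Split records into clean data and rejects, with a reason per reject."""
    clean, rejected = [], []
    for rec in records:
        if not rec.get("ip") or not rec.get("page"):
            rejected.append((rec, "incomplete record"))
        elif is_internal(rec["ip"]):
            rejected.append((rec, "internal user"))
        else:
            clean.append(rec)
    return clean, rejected

clean, rejected = qa_filter([
    {"ip": "192.168.1.5", "page": "/home"},  # internal user, excluded
    {"ip": "203.0.113.9", "page": "/home"},  # external visitor, kept
    {"ip": "203.0.113.9", "page": None},     # incomplete, excluded
])
```

Keeping the rejects (with reasons) rather than silently discarding them means the QA itself can be audited, which reinforces quality as the default.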
Rationale: As mentioned before, if data is inaccurate it loses its impact and carries a risk if used. Making sure the basics are covered is data 101. Read Geoff Kenyon’s post on moz.com for some great tips on QA for Google Analytics, Digital Tonic’s post for tips on QA for A/B tests, or the Nielsen Norman Group on QA for user research.
As a caveat, don’t get too hung up on a perfect data-quality standard from the start. Get the basics in place and refine the standard as you learn. For the analyst ninjas out there, don’t let a lack of data quality stop you from gleaning insights.
#emetrics @SHamel67 Stop complaining about data quality. A great analyst can provide insights even with imperfect data. — Yes!
— Dean Abbott (@deanabb) November 4, 2015
This article was originally posted on dominichurst.com