Our Feeble Human Brains are Terrible at Objective Analytics
One problem that scientists run into from time to time is (intentionally or not) leading the data where they want it to go, instead of following the data where it actually leads. This isn’t just due to grifters looking to make a name for themselves: we humans are phenomenal at fooling ourselves into thinking we understand more than we do!
There’s already a lot of great information out there about how the human brain is hardwired to look for patterns, in no small part because our ancestors needed to survive in the wild against much bigger, tougher predators.
My most vivid experience with this was during an electronics lab in college. We were tasked with building an optical transmitter and receiver for sound signals, basically a laser radio. We were graded on how well we could pick up the signal under varying challenging circumstances.
As was often the case in this class, my lab partner and I were so busy trying to complete the lab in the impossibly short window allotted that we didn’t have time to go back to our dorm for about two days. This was before the days when everyone had clouds full of music stuffed in their pockets, so our only source of test sound with an audio output port was my Sony Discman and the single CD it contained: my then-favorite album.
We toiled ceaselessly to get our apparatus working, straining to hear the Sweater Song, or really anything approximating noises created by humans, against the deluge of white noise coming out of our apparatus’ tiny speaker. We were working in near-complete darkness most of the time because we were, after all, trying to get music to transmit over a blinking light, and we were terrible at building it, so we needed to give ourselves every edge possible.
Finally we heard “… no one cares about my ways!” crackle faintly against the static and we rejoiced! We could finally sleep, eat, bathe, and rejoin humanity.
Then one of our friends came over and covered the light receiver with his hand. We kept hearing the song. Pulled his hand away, no change. Blocked the little blinking light. Still heard the song. Unplugged the light. Still Weezer went on and on about the joys of being in a garage.
We had been working from a single breadboard, and so I learned about ground loops (basically, our transmitter was leaking signal into the circuit that was powering the receiver).
After much panic and profanity we got our apparatus working (for real this time, guys) about 10 minutes before our turn came up to demonstrate in front of the whole class. Success! We took our A-/B+ and went home to shower.
Exhausted, I peeled off the clothes I’d been wearing for about 60 hours and stood under the hot stream of water. And heard it. Faintly, but very distinctly, in the sound of the water spraying out of the showerhead.
“In the garage where I belong…”
I’d spent the last three days essentially in intense sensory and sleep deprivation straining to pick up the sounds of Weezer out of a hiss of static. I had effectively re-programmed my brain to listen for that very specific signal in a sea of noise.
I haven’t listened to the blue album since.
In science there’s even a phrase for looking at a confused and messy data set and trying to clean it up so it tells a coherent story. It’s called “massaging the data,” which sounds incredibly relaxing if you’re a data point.
Massaging the data can actually be important and useful, too! It’s not necessarily dishonest. For example, you can identify a source of experimental error and eliminate the points that reflect that specific error, or you can perform other analysis to separate signal from noise and improve the quality of your data set. As long as you’re transparent about what you’re doing and preserve the full, unmodified data set, there’s no problem with it.
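As a sketch of what that transparent kind of cleanup can look like in practice, here’s a minimal example that flags suspected glitches without ever touching the raw data. The `flag_outliers` helper and the median-absolute-deviation cutoff are illustrative choices on my part, not anything from the original lab:

```python
from statistics import median

def flag_outliers(raw, k=5.0):
    """Split readings into (kept, excluded) using a median absolute
    deviation (MAD) test. The raw list is never modified, so the full
    original data set is always preserved alongside the cleaned one."""
    m = median(raw)
    mad = median(abs(x - m) for x in raw)
    kept = [x for x in raw if abs(x - m) <= k * mad]
    excluded = [x for x in raw if abs(x - m) > k * mad]
    return kept, excluded

# One reading (42.0) looks like a glitch against otherwise steady values.
readings = [9.8, 9.9, 10.0, 10.1, 10.2, 42.0]
kept, excluded = flag_outliers(readings)
print("kept:", kept)          # the cleaned data set
print("excluded:", excluded)  # report exactly what was removed, and why
```

The key habit is the last line: publishing which points were excluded, and by what rule, is what separates honest cleanup from massaging a pattern into existence.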
The problem comes when you end up massaging something into existence from whole cloth because it’s what you expect to see. Like hearing Weezer out of the showerhead for a week.
That course taught me a fair bit about electronics, physics, and soldering without burning your fingertips.
It also taught me to be reasonably skeptical of my own perceptions. Just because I’m completely sure that I saw or heard something a certain way doesn’t mean that’s what really happened. Just because I’m seeing patterns that completely support my views doesn’t mean that they’re really there or that no other perspective is needed or useful.
There are certainly larger life lessons to be drawn about the perils and very real consequences that can unfold from uncritically trusting our senses.
But there’s also a very specific one in today’s big-data and analytics-driven business culture. Think critically about the patterns you’re seeing and what you believe the data is telling you. Talk to someone else about it, especially someone who is likely to be skeptical. Keep an open mind and be ready to be convinced you’re wrong. We succeeded in our lab not because we stuck to our guns when we thought we were done, but because we listened to the feedback we’d gotten and fixed what was really broken.
Jonathan is an Assistant Director of Research Information Systems at UCLA. After obtaining a Physics degree from Stanford University he went on to spend over 10 years working in information systems architecture, data-driven business process improvement, and organizational management. He is also the founder of Peach Pie Apps Workshop, a company that focuses on building data solutions for non-profits and small businesses.