I received an email this week from a vendor trying to sell me a product. Attached was a “study” the company had done to show how their product helped students grow in a given area. The email’s headline claim was that the study showed “3% growth” for students who used their product.
Let’s be clear: before I even read the article, I couldn’t help but laugh at the 3% assertion. Generally speaking, in the statistical world, 3% isn’t going to get you out of the margin of error, so this email was essentially saying that their program may have accidentally helped kids do better…but also maybe not.
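To make the margin-of-error point concrete, here is a rough back-of-the-envelope sketch. Every number in it (group sizes, means, spread) is an assumption I made up for illustration, not a figure from the vendor’s email:

```python
import math

# Hypothetical setup, assumed for illustration only:
# a control group and a product group of 50 students each, scored 0-100.
n = 50
control_mean = 70.0
treatment_mean = 72.1   # about 3% higher than the control mean
sd = 15.0               # an assumed spread for classroom test scores

# Standard error of the difference between two independent group means
se_diff = math.sqrt(sd**2 / n + sd**2 / n)

# 95% margin of error for that difference (z is about 1.96)
margin = 1.96 * se_diff

diff = treatment_mean - control_mean
print(f"observed difference: {diff:.1f} points")
print(f"95% margin of error: +/- {margin:.1f} points")
print("distinguishable from zero?", diff > margin)
```

With these made-up but plausible numbers, the 2.1-point gap sits well inside a roughly ±5.9-point margin of error, so the “growth” can’t be told apart from noise. Larger samples or smaller spread would change that, which is exactly why the study’s details matter.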
At that point, I dived into reading the “article” — an article that was published by the vendor and paid for by a grant from the vendor. In other words, the entire piece is the vendor’s spin on everything it discusses.
To be fair, there were real statistics in the piece. The authors talked about effect size, analysis of covariance, and even those silly things called reliability and validity. And it looked really nice! The charts were great! But here’s the deal…when you actually read through the article, dissect the statistics, and make sense of what they are really getting at, you can’t help but see this “study” is a farce! There was no generalizability in the results, the test group and the control group had almost nothing in common…except that they were humans…and then there is the whole “our product made this happen” piece. Really? You can show that your product, and your product ALONE, helped these kids learn more…and NOTHING else had an impact on it? I’m thinking no…
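Effect size is worth pausing on, because it puts a raw gain on a common scale instead of a flattering percentage. A minimal sketch of Cohen’s d, using numbers I am assuming for illustration (not figures from the vendor’s study):

```python
# Assumed numbers, for illustration only
control_mean = 70.0
treatment_mean = 72.1   # roughly a 3% bump over control
pooled_sd = 15.0        # assumed pooled standard deviation of scores

# Cohen's d: the difference in means measured in standard deviations
d = (treatment_mean - control_mean) / pooled_sd
print(f"Cohen's d = {d:.2f}")
```

Under these assumptions d comes out near 0.14, below the 0.2 threshold conventionally labeled a “small” effect. So even when a vendor waves the term “effect size” around, ask what the number actually was.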
So what’s my point? Don’t be a fool! Especially if you work in my field, education. Companies know that data is the key to getting in the door; without data, the education world doesn’t seem to want to hear about it. That said, statistics can be slippery, and those numbers can be tweaked, skewed, or interpreted in any number of ways to help the author reach their desired outcome.
A couple of strategies to help you not get suckered:
- If it sounds too good to be true, it probably is!
- In educational research it is basically impossible to prove causality! If a company says their product alone made crazy gains for students, they are blowing smoke.
- Ask for the data and read it…and read it well! Compare your student population with the population in the study. Look at the comparisons — do they make sense? Read the content with a critical eye, because it is too easy to make “data” look like real information!
So do your homework, don’t be wowed by the ridiculous, and above all else, always remember that statistics can be some of the most impressive, artistic lies ever told.
Chad Jones