ANOVA explained without math: Why analyze variances to compare means?
People who are new to statistics are often confused by ANOVA (analysis of variance), and one common question is why we analyze variances in order to compare means. Here is an extremely nontechnical introduction. I am going to use an extreme example because it makes things clearer. There will be no formulas.
ANOVA is used when we want to compare means among three or more groups (with only two groups you could still use ANOVA, but a t-test is simpler and gives an equivalent result).
For example, suppose you have a bunch of men and you are interested in comparing their weights. You have
- (American) football linemen (NOTE: These guys are huge, often 2 meters tall and around 150 kg.)
- Thoroughbred jockeys (NOTE: These guys are tiny. They weigh less than 50 kg.)
- College professors (not known for being extreme on weight).
Now, suppose that, rather than put them into groups by profession, you do so alphabetically: A through G, H through N, O through Z. The weights within each group will vary wildly, but the means of the three groups will be roughly equal (not exactly equal, but close). There will be a lot of variation within groups and little variation between groups. For instance, the first few weights in each group might look like:
- A through G: 45, 160, 90, 48, 42, 74……
- H through N: 180, 160, 43, 70, 85 ….
- O through Z: 50, 100, 110, 180, 45 ….
Then you put them into groups by profession. There will still be variation in each group (especially the professors) but it will be much less. But there will be huge variation between the groups. It might look like this:
- Linemen: 180, 160, 150, 145 …..
- Jockeys: 44, 45, 46, 48, 50 ….
- Professors: 100, 80, 75, 90 ….
So, when we divide them up in a way that matters, we get huge variation between and little variation within. When we divide them in a silly way, we get huge variation within and little variation between.
That’s why we look at variances to compare means.
ANOVA just makes all that formal and mathematical.
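If you do want to see the formal version in action, here is a minimal sketch using SciPy's `f_oneway` (one-way ANOVA) on just the handful of illustrative weights listed above, with the trailing "..." entries dropped. The F statistic is essentially the between-group variation divided by the within-group variation, so the silly alphabetical grouping should give a small F and the profession grouping a huge one:

```python
# One-way ANOVA on the two groupings from the example above.
# F = (variation between groups) / (variation within groups).
from scipy.stats import f_oneway

# Alphabetical (silly) grouping: lots of variation within, little between.
a_g = [45, 160, 90, 48, 42, 74]
h_n = [180, 160, 43, 70, 85]
o_z = [50, 100, 110, 180, 45]

# Profession (meaningful) grouping: little variation within, lots between.
linemen = [180, 160, 150, 145]
jockeys = [44, 45, 46, 48, 50]
professors = [100, 80, 75, 90]

f_alpha, p_alpha = f_oneway(a_g, h_n, o_z)
f_prof, p_prof = f_oneway(linemen, jockeys, professors)

print(f"alphabetical grouping: F = {f_alpha:.2f}, p = {p_alpha:.3f}")  # small F, large p
print(f"profession grouping:   F = {f_prof:.2f}, p = {p_prof:.2g}")    # large F, tiny p
```

With these numbers the alphabetical grouping gives an F below 1 (the group means are about as far apart as chance alone would put them), while the profession grouping gives an F in the hundreds: exactly the within-versus-between contrast described above, made quantitative.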