The ethics of citations has been a hot topic as of late, and deservedly so, as evidenced by Lior Pachter’s detailed case study of one paper’s citations. In the post he popularized the term “drive-by citations” and coined the term “hit-and-run citations”. It got me thinking about what other types of citations might be out there in the wild waiting to be discovered.
Since the start of Cornell’s pizzagate, I’ve consumed a lot of work by Brian Wansink and couldn’t help but notice some unusual citations. Every time I thought I had seen it all he somehow managed to pleasantly surprise me. I’d like to tell you about these rare creatures that I’ve discovered.
First, let me show you how this game works. Let’s take a very common occurrence and name it.
We have all read a paper where the methods section says something along the lines of “as described in X”. You then open up paper X and in its methods section it says “as described in Y”. Soon you have 10 PDFs open with no end in sight. I would call this a “Russian doll citation” since every reference you follow just leads you to another reference to open.
Now the fun begins: let’s get to the never-before-seen types of citations and start naming them.
Case Study 1: Fool me once…you can’t get fooled again…a Rickroll
In “De-Marketing Obesity” (2005), Brian Wansink and Mike Huckabee (yes, the politician) write:
Although such packaging can increase production costs, the $43 billion spent last year on diet-related products is evidence that there is a portion-predisposed segment that would be willing to pay a premium for packaging that enabled them to eat less of a food in a single serving and to enjoy it more. For instance, results from a survey of 770 North Americans indicated that 57% of them would be willing to pay up to 15% more for these portion-controlled items.⁷
7. Brian Wansink, Mindless Eating: The Hidden Persuaders that Make Us Lose and Gain Weight (New York, NY: Bantam-Dell, forthcoming 2006).
It seems unusual to cite a book for a fact like this, especially a book meant for laypeople that was unpublished at the time. Is the book going to include never-before-seen data? Wait a minute, the very next reference is a book by Mike Huckabee:
8. Mike Huckabee, Quit Digging Your Grave with a Knife and Fork: A 12-Stop Program to End Bad Habits and Begin a Healthy Lifestyle (New York, NY: Center Street Books, 2005).
They didn’t write this review just so they could cite their books and get some free publicity, did they?
As a citation connoisseur, I found my appetite whetted, and, ravenous to learn more about this survey of 770 people, I devoured Mindless Eating.
…nothing about this survey.
There is this nugget however:
we found that half of the loyal users of one popular snack food said they would pay 15 percent more for a new package that helped them better control how much they ate
Half is close enough to 57%, so it seems I found what he’s referring to in his review with Huckabee, but there’s no mention of a survey and no reference to follow.
Is this unpublished data?
If so why didn’t he just say that in the Huckabee paper?
I just read an entire book only to get RICKROLLED.
And that takes us to our first new type of citation.
A “Rickroll citation”: when you follow a reference for some info only to find nothing. No additional information, no references to follow.
Man, I was really interested in that survey, but I guess we’ll never learn about this data set…
HA! You knew I wouldn’t give up that easily.
This is where things start to get (more) bizarre, but before we move on I want to show you the fabled technique of a “delayed Rickroll citation”.
In “Can ‘Low-Fat’ Nutrition Labels Lead to Obesity?” (2006), Brian Wansink and Pierre Chandon write:
For example, a loyalty program survey of current customers of a Kraft product indicated that 57% of them would be willing to pay up to 15% more for portion-controlled packaging (Wansink and Huckabee 2005).
But as we know, the Huckabee review references Mindless Eating, which does not provide a reference for these numbers. Legendary.
Wait, what’s that? These authors did it again? In “Slim by design: Redirecting the accidental drivers of mindless overeating” (2014), Wansink and Chandon write:
Although such packaging can increase production costs, the $43 billion spent in 2013 on diet-related products is evidence that there is a portion-predisposed segment that would be willing to pay a premium for packaging that enabled them to eat less of a food in a single serving and to enjoy it more. For instance, results from a survey of 770 North Americans indicated that 57% of them would be willing to pay up to 15% more for these portion-controlled items (Wansink & Huckabee, 2005). Although targeting this “portion-prone” segment will not initially address the immediate needs of all consumers, it can provide the critical impetus that companies need to develop profitable win–win solutions.
Hmm, that $43 billion number looks familiar. Oh yeah, Wansink used the same figure in the Huckabee review for sales in 2004. So sales didn’t change for a decade?
I’m not even going to try and figure out where that number is coming from, I’ve already performed more rectal exams than I’d like (I do not miss med school).
Case Study 2: Mindless citing, why there are fewer facts than you think
Searching for the survey of 770 people proved impossible. I even tried looking through Wansink’s other work to see if he ever cited it again and gave an appropriate reference. Nothing.
So what now?
Well, since I noticed Wansink’s blog post back in December I’ve read a lot of the media coverage he’s received, watched his videos, and read his papers. At this point I’m something of a Wansink whisperer. Even though Google can’t locate this survey, maybe I can.
Then an assistant professor of marketing at the University of Pennsylvania’s Wharton School of Business…He and his grad students had planned to dump Wheat Thins and M&M’s into large Ziploc bags, but by mistake they also brought some tiny, snack-sized ones. Since there weren’t enough large bags to go around, some moviegoers got four small ones instead. Something surprising happened: Most people who received the four small bags finished only one or two. In a follow-up questionnaire, Wansink asked the participants how much more they would pay for snacks that came in lots of small packages instead of one big one. A majority said they’d spend 20 percent more.
Finally! A survey!
57% is technically a majority, but this sounds like a small survey, and 15% has changed to 20%. And there’s still no reference telling us which article this refers to. Hmm...
The passage did contain a lot of nice background information though, plenty to perform a Google search that led me to another one of Wansink’s books, Slim by Design.
Apparently this is an experiment he’s famous for; it’s even in the introduction of the book!
When they got to the end of their first or second 110-calorie pack, they just stopped eating.³ Even crazier, more than half also said they’d pay 20 percent more money for snacks if companies put them in smaller packages.⁴
O M G, not one, but two references! I can’t contain my excitement.
Let’s go take a look at them…
3. We published a similar study for validation…“The 100-Calorie Semi-Solution: Sub-Packaging Most Reduces Intake Among the Heaviest”
So the original study with Ziploc bags was never published?
4. This same study was replicated in a lab study at Wharton with similar results.
This lab replication at Wharton also was never published?
I scoured Google Scholar. Where is the 770 person survey? My head is starting to hurt.
So I guess we’ve really reached the end of the line now.
HA! Don’t be silly. I never give up. I’m your Huckabee-ry.
A carefully crafted Google search eventually led me to “Helping Consumers Eat Less” (2007), by Brian Wansink. There he writes:
In 1996, my laboratory began experimenting with mini-size packs to determine how they would influence how much consumers paid for them and how much they ate **(Wansink, 1996)**. We found that a sizable percentage of people were willing to pay more for something that would help them control their portions and that it would help 70% of these people eat less in a single sitting. Although the packages would be more expensive per ounce than the larger packages, some people would not mind paying more to eat less or to eat better.
We’re back on the scent, boys! Where does (Wansink, 1996) lead us?
“Can package size accelerate usage volume?” (1996), Brian Wansink.
Okay, sounds promising, let me have a look…
…and I’m back.
Nowhere in the 14 pages is there any eating or anything to suggest people would pay more to control their portions. The goal of the paper was to determine if the perceived unit price of a product influences how much is used. To determine this, Wansink ran five different studies involving PTA members where he had them measure out different amounts of products such as canola oil, spaghetti noodles, M&Ms, water, cleaner, or bleach. There was NO eating involved.
What? Is this even the correct reference?
I thought maybe the wrong reference got inserted, so I checked for other papers by Wansink around this time period that would substantiate his claims, but couldn’t find anything.
Now I was REALLY curious. What’s going on here?
How about we go and see how he used this citation elsewhere? I remembered seeing it mentioned in Mindless Eating, so I thought that would be a good place to start:
For many of the breakfast, lunch, and dinner foods we have studied, the result is about the same — people eat 20–25 percent more on average from the larger packages.²
2. Much of this section’s discussion on package size is based on the paper, Brian Wansink, “Can Package Size Accelerate Usage Volume?” Journal of Marketing 60:3 (July 1996): 1–14.
^No one in that study ate a single thing.
It’s been estimated that 72 percent of our calories come from food that we eat from bowls, plates, and glasses.⁶
6. See Brian Wansink, “Can Package Size Accelerate Usage Volume?”
^This is just craziness, THERE WAS NO EATING IN THAT STUDY!
Perhaps there was another mistake in the references. Maybe in both “Helping Consumers Eat Less” and Mindless Eating he accidentally cited “Can Package Size Accelerate Usage Volume?” instead of other papers.
So I moved on to Slim by Design, where it is the first reference. Hard to mess that one up, right?
The bigger the package, the more people ate and the more they liked it. This was true for everyone we tested, and it’s probably true for everyone from competitive hot dog eaters in Coney Island to desperate housewives in Beverly Hills.¹
1. While we didn’t do packaging size studies with competitive eaters, we have done other studies with them, and they won’t be much different than the rest of the world when it comes to overeating during a nonwork day. “Can Package Size Accelerate Usage Volume?”
^Were the PTA members competitive eaters?
And it gets cited again…
Imagine you are making a spaghetti dinner and on one day you have medium-size ingredients, including a box of spaghetti, jar of spaghetti sauce, and package of ground beef — and on another day we give you large-size versions of all three. What will you do? You’ll make and eat 22 percent more food.²⁴
24. This study on the package sizes was the basis for the 100-calorie pack: “Can Package size accelerate Usage Volume?”
^Soooooo…the study did have people put some noodles in a pot, but there was no sauce, beef, cooking, or spaghetti to eat, AND NO ONE ATE ANYTHING.
This is hard to explain…
Okay, perhaps he just doesn’t remember that paper very well. I mean, it was back in 1996; maybe he was trying to reference some other spaghetti paper.
I remembered reading about spaghetti in Mindless Eating, so I went back there and searched for this spaghetti study.
With spaghetti, for instance, we found that the people who were given the large package of pasta, sauce, and meat typically prepared 23 percent more — around 150 extra calories — than those given the medium packages. Did they eat it all? Yes. We find over and over that if people serve themselves, they tend to eat most — 92 percent — of what they serve.¹
1. This 92 percent figure pops up in our studies again and again. See Brian Wansink and Matthew M. Cheney, “Super Bowls: Serving Bowl Size and Food Consumption,” Journal of the American Medical Association 293:14 (April 2005): 1727–28.
…the reference is just about the 92% thing.
So where is this spaghetti study? I don’t know, I couldn’t find it anywhere.
And we never found the survey with 770 people…
…but we did find plenty of other surveys with 770 respondents, as documented by Nick Brown.
But never mind that, let’s get to our next definition! These citations are similar to “drive-by citations”:
references to a work that make a very quick appearance, extract a very small, specific point from the work, and move on without really considering the existence or depth of connection [to] the cited work.
Except Wansink’s are way worse than that, especially considering they are self-citations. How do you not cite yourself correctly? There are a lot of vulgar ways to describe this, but let’s try and keep this post PG for the tone police and just call it…
A “mindless citation”: when you incorrectly cite your own work, whether out of laziness or for unethical reasons, such as trying to make it appear you have evidence to support your argument when you don’t, while increasing your h-index in the process.
Case Study 3: That’s rather bold…an Icarus sighting
I have a lot of respect for bold people and bold actions, but there is such a thing as being TOO bold and flying too close to the sun.
During my search for the survey of 770 people mentioned in the Huckabee review I came across this paper, “Cooking Habits Provide a Key to 5 A Day Success” (2004), by Brian Wansink and Kyoungmi Lee. It reads:
Two thousand Americans were randomly selected from census data and paid $6.00 to complete a mailed survey. The 770 people (38.5%) who completed it within 6 weeks had an average of 1.6 children living at home, were 37.3 years old, had a median household income of $38,000, were 70% Anglo-American, and were 61% female. Of these, 508 could be categorized as vegetable lovers or as fruit lovers using a cross-classification technique based on their preference ratings for fruits and vegetables and by their self-perceptions.
Given the interesting citation practices of one of the authors I thought I’d see how this paper was cited.
It was cited in “The sweet tooth hypothesis: How fruit consumption relates to snack consumption” (2006), by Brian Wansink, Ganael Bascoul, and Gary Chen. There they write:
A survey was mailed to a random sample of 2000 North Americans along with an honor check of $6.00 that they could cash if they completed the survey. Within a 6 week period 770 (38.5%) responded and were included in the study. Respondents were 61.0% female, lived in a household with an average of 3.1 people (SD = 1.83), were 70.2% Anglo-American and had an average age of 37.3 yr.
Great, they describe the exact same survey, so we’re good here, right?
No…I wouldn’t be talking about it if we were.
Although the surveys are exactly the same, the authors give the impression that they were two differently designed surveys.
In the 2004 paper the authors write:
To do this, we first conducted in-depth interviews about the cooking habits and food preferences of 37 supermarket shoppers in Illinois and Michigan. Based on these findings and the literature, a survey approved by the Institutional Review Board was developed [the 770 person survey].
In the 2006 paper they write:
To complement the CSFII investigation and to better assess the reliability of this fruit-sweet snack relationship, a follow-up study [the 770 person survey] was designed to determine whether sweet snack consumption was related more strongly to fruit consumption than to vegetable consumption.
Okay, admittedly in my field of biology it is pretty typical to come up with creative stories about why a study was done. What often happens is you get some unexpected result while doing an experiment, then you weave a narrative about how you expected to find what you had found. It doesn’t sound great to admit you got your result by forgetting about a Petri dish and having mold grow on it.
But once you have a story about how you arrived at your experiment and result, you stick to it. And in this case it’s not like the authors stumbled upon a survey of 770 people and then retroactively needed to justify why they did the survey. Unless of course they just have some giant survey somewhere where they asked hundreds of questions and they dip back into that well whenever they want to write a paper, and just come up with a different reason for doing the survey every time.
Whatever the case, this citation is unusual for several reasons. First, if this is indeed the exact same survey, you would expect it to be referenced in the introduction or the results as prior work done by the authors. But it’s not. It’s cited in the discussion:
Exploratory efforts have shown that vegetable lovers, for instance, enjoy cooking, entertaining and using new recipes more than fruit lovers (Wansink & Lee, 2004).
Reading that brief statement you would have no idea the reference used the exact same survey results. They make it sound as if it was a completely different study. You have to wonder if the journal and reviewers would have been as interested in the 2006 paper if they knew it was just another slice of salami from the sausage the 2004 paper is based on.
The citation is also unusual because of the risk it invites. If you aren’t going to be upfront about the salami slicing, why sneak in a reference to a study with the exact same data set? Is the extra self-citation really worth it? That’s playing with fire, and it brings us to our next definition.
An “Icarus citation”: a citation that could raise concerns about your work but that you make anyway for greedy reasons, such as increasing your h-index.
Case Study 4: How did you know? Psychic citations.
This last definition is a fun one.
In “Position of the American Dietetic Association: food and nutrition misinformation” (2006), Brian Wansink writes:
Consumer spending on functional foods, dietary supplements, natural/organic foods, and natural personal care products totaled $168 billion in 2004.¹⁰
10. US Department of Health and Human Services. Health, Information and the Use of Questionable Treatments: A Study of the American Public. Washington, DC: US Department of Health and Human Services; 1987.
That’s amazing, how did a study in 1987 know how much money would be spent in 2004?
Food and nutrition misinformation may be especially detrimental because people spend increasing amounts of money on weight-loss solutions ($43 billion in 2004).¹¹
11. Sarubin A. The Health Professional’s Guide to Popular Dietary Supplements. Chicago, IL: American Dietetic Association; 2000.
And a paper in 2000 knew how much money would be spent in 2004?
This wide range of herbal, botanical, and sports supplements, which comprise over half of the dietary supplement industry, has helped sales increase $13.9 billion in 2004.¹²
12. The Dietary Supplement Health and Education Act of 1994. Pub L No. 103–417 (S 784) (codified at 42 USC §287C-11), 1994.
A law in 1994 knew how much sales would increase in 2004?
Finally we are at our last definition:
A “psychic citation”: when a reference you cite somehow had the ability to predict the future and support your random fact.
Citations have a strange power. They are kind of like starting a rumor. Once someone says there’s evidence for something and provides a reference, people will believe them and perpetuate the rumor.
Science is often described as a type of religion because we have to trust the sermons of others. But when an entire gospel is composed of one person’s sermons this becomes dangerous.
Donald Trump often begins his remarks with “People tell me” or “Many reports say”, which gives what he says some weight and forces (at least some) people to take him seriously.
Why do people think Donald Trump is a successful businessman? Because he repeatedly tells us he is.