Video games and real-world violence: No connection

How an 18-month search came up empty

By Edward Castronova, Professor of Media, Indiana University

Do violent games cause violent crime? People seem to think so. According to Pew, 60 percent of Americans think that video games cause violence.[1] On what basis? What evidence is there that people who play violent games go out and wreak havoc?

The answer is “not much,” not enough to hang a hat on, really. In this article I recount my personal journey — as economist, statistician, and game scholar — into the wilds of video game violence research. After looking hard for 18 months, I found that there’s no there there. There’s no evidence that game violence moves the needle on social violence. There’s tons of evidence that it doesn’t and couldn’t, however. The research that has been done asks the wrong questions, uses questionable methods, and finds weak effects anyway. I tried a new angle and it too came up empty.

You can’t publish nothingburgers in academic journals, but you can write an article for public consumption. Hence this quasi-academic, quasi-personal story of an effort to find something that is apparently not there.

The starting point

I started looking at games and violence because of a Supreme Court case. In 2011, the Court was asked to decide whether California could regulate the sale of violent games. The Court said no; game companies have First Amendment rights to produce and sell their content.

SCOTUS: Media violence skeptics

But it wasn’t the judgment that piqued my curiosity. It was the Court’s total rejection of all the research that had been done supposedly linking game violence to real-world violence.[2] Of the nine justices, eight completely ignored the research. One, Antonin Scalia, positively denounced it:

The State must specifically identify an “actual problem” in need of solving … California cannot meet that standard. At the outset, it acknowledges that it cannot show a direct causal link between violent video games and harm to minors. Rather, … the State claims that it need not produce such proof because the legislature can make a predictive judgment that such a link exists, based on competing psychological studies … The State’s evidence is not compelling. California relies primarily on the research of Dr. Craig Anderson and a few other research psychologists whose studies purport to show a connection between exposure to violent video games and harmful effects on children.

These studies have been rejected by every court to consider them, and with good reason: They do not prove that violent video games cause minors to act aggressively (which would at least be a beginning).

Instead, ‘[n]early all of the research is based on correlation, not evidence of causation, and most of the studies suffer from significant, admitted flaws in methodology … They show at best some correlation between exposure to violent entertainment and minuscule real-world effects, such as children’s feeling more aggressive or making louder noises in the few minutes after playing a violent game than after playing a nonviolent game … Even taking for granted Dr. Anderson’s conclusions that violent video games produce some effect on children’s feelings of aggression, those effects are both small and indistinguishable from effects produced by other media. In his testimony in a similar lawsuit, Dr. Anderson admitted that the ‘effect sizes’ of children’s exposure to violent video games are ‘about the same’ as that produced by their exposure to violence on television … And he admits that the same effects have been found when children watch cartoons starring Bugs Bunny or the Road Runner or when they play video games like Sonic the Hedgehog that are rated “E” (appropriate for all ages), or even when they “vie[w] a picture of a gun.”[3]

I judge this scathing criticism to be entirely on point. The existing research says nothing of interest about the critical issue: Whether violent gameplay makes people violent in the real world.[4]

It’s a hard question. We cannot answer it, even if we apply the best methods we have. The best research practice in Communications (of which game research is a subfield) is to conduct controlled experiments with groups of research subjects. In practice, the method looks something like this:

· 200 people are invited to the lab

· 100 chosen at random are asked to read a magazine

· The other 100 play a violent video game for 30 minutes

· Both groups are given a survey or exercise that captures their level of aggressiveness

· If the level of aggression is higher among the game players, it is concluded that violent video game play increased aggression
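To make the weakness of this design concrete, here is a toy simulation of the protocol above. Everything in it is hypothetical — the group sizes, the 0–10 score scale, and the score distribution are my own inventions, not data from any actual study. It randomizes 200 simulated subjects into the two conditions and compares mean aggression scores.

```python
import random
import statistics

# Toy sketch of the lab protocol described above. All numbers are
# hypothetical; no real study data is used.
random.seed(0)
subjects = list(range(200))
random.shuffle(subjects)
magazine_group, game_group = subjects[:100], subjects[100:]

# Pretend aggression survey scores on a 0-10 scale, drawn from the
# SAME distribution for both groups (i.e., no true treatment effect).
def score():
    return random.gauss(5.0, 1.5)

magazine_scores = [score() for _ in magazine_group]
game_scores = [score() for _ in game_group]

# Difference of group means; any gap here is chance alone.
diff = statistics.mean(game_scores) - statistics.mean(magazine_scores)
print(round(diff, 2))
```

Even when such a design does find a difference, it measures minutes-later survey responses, not criminal behavior — which is precisely the gap the next paragraphs discuss.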

There are many problems with this method, but the larger problem is that it provides an answer to questions no one is asking. Society as a whole is not wondering whether people feel aggressive after playing a game for 30 minutes. Society wants to know whether people who immerse themselves in violent games are more likely to become violent felons. The game scholars in Communications are asking the wrong question.

These small-scale behavioral experiments tell us nothing about the effect of media on society. On the whole, the existing evidence is, as Scalia thunderously points out, more than a little weak in terms of real-world significance.[5]

There is also the problem that journalists overinterpret findings. A study that says, “Brief exposure to the game Bloodborne is associated with slightly higher aggression survey scores” is transformed into a headline that says, “Study: Games make people kill.” I exaggerate. But there is little question that minor research findings get blown up into broad but inaccurate statements about the social effects of games.

The upshot is that fairly quickly I concluded that we do not know anything solid about the questions we care about. We want to know if the existence and popularity of violent video games has something to do with violence in society. Existing research tells us nothing about that.

What would be better?

With other methods, it is possible to directly answer the question that society is asking: Do violent games make society more violent? It is a macro-level question. It calls for macro-level research: Comparisons of societies. Does a society with more violent gaming also have a higher murder rate? That would be a statistical approach that directly addresses the question everyone wants to know about.

Specifically, the central questions about games and violence could be answered using US geographic data. There are datasets reporting violent crime by state, zip code, and census tract. If we had data on violent video game play at these levels, we could do a regression study where violent crime was the dependent variable and violent video gaming was the key independent variable. In English, we could isolate the correlation between gaming and violence, removing the influence of other factors like family structure, education, poverty, and so on. The result of such a study would look like this: “Consider two census tracts that are exactly the same in all respects: Population, income, family structures, etc. Assume, however, that tract A has more violent video gaming than tract B. We find that tract A also has more violent crime. All else equal, locations with more violent gaming have more violent crime.” That would be a powerful finding. In fact, a study like this was used by two economists to debunk the idea that movies cause crime.[6]
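As a sketch of what such a regression looks like, here is a minimal OLS example on simulated tract-level data. The variable names, the data-generating process, and every number are my own assumptions for illustration. In this fake world, crime depends on income and poverty but not on gaming, and the regression correctly recovers a near-zero gaming coefficient while holding the other factors constant.

```python
import numpy as np

# Simulated tract-level data; all relationships are invented for illustration.
rng = np.random.default_rng(0)
n = 500
gaming = rng.uniform(40, 90, n)      # violent-gaming intensity (index points)
income = rng.normal(55, 10, n)       # median income, in $1,000s
poverty = rng.uniform(5, 25, n)      # poverty rate, percent

# True model: crime depends on income and poverty, NOT on gaming.
crime = 2.0 - 0.02 * income + 0.05 * poverty + rng.normal(0, 0.2, n)

# OLS with a constant: regress crime on gaming while controlling for
# income and poverty -- the "all else equal" comparison in the text.
X = np.column_stack([np.ones(n), gaming, income, poverty])
beta, *_ = np.linalg.lstsq(X, crime, rcond=None)
coefs = dict(zip(["const", "gaming", "income", "poverty"], beta.round(3)))
print(coefs)  # the gaming coefficient comes out near zero
```

The controlled comparison is the whole point: without income and poverty in the regression, any correlation between gaming and crime could simply be those factors in disguise.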

The movie economists had geographic data on violent movie watching. As my journey unfolded, I became determined to find geographic data on violent gaming.

There’s no good geographic data on violent gaming

Unfortunately, there are many false leads.

IPUMS

A vast trove of data about the US population is published by IPUMS, the Integrated Public Use Microdata Series. These data include a time-use component, and there is a variable measuring hours of game play. Unfortunately, it does not isolate violent game play. This, plus the fact that there were holes in the data (missing values for small states, for example), led me to abandon IPUMS.[7]

NPD

NPD is a private company that collects and reports sales data to major corporate customers. In the game industry, they gather information on what games are selling well, and where. This would include violent games. NPD has the geographic data that we would need.

However, for some mysterious reason, they declined to sell me the data. I was told that the data could not be released because companies in the game industry did not want it to be used for publicly-available research.

One wonders. Did they run some regressions and find positive correlations between violent gaming and violent crime? Or were they just worried that some professor might find actual evidence to that effect?

It seems odd, though. The game industry suffers from the general public impression that games cause violence, and the weak research currently being produced does nothing to dispel that impression. You would think game companies would be willing to gamble on a stronger study. Apparently not, however; NPD was a bust.

Pew

Pew is known for public opinion surveys about the internet and games. When I asked about data linking violent games and crime, I was pointed to a study in which they had asked people whether they think games cause violence. As noted at the start of the article, many people do. But this is another case of asking the wrong question. Society does not have a pressing need to know what people think about games and violence; it needs to know what the relationship between games and violence actually is. A Pew representative told me that they generally do not attempt to gather data that would establish causal connections; they are interested primarily in opinion surveys.

Google Trends

Despite three strikes I decided to swing again; and I got a bloop single, I think. The fourth avenue I pursued, and the most fruitful one, was Google search data. Google Trends lets anyone view and download state-level data on searches. The results are presented as an index, not a raw number of searches. The state where a topic accounts for the largest share of all searches is assigned the value 100. Every other state is given a number less than 100 representing its search share for that topic relative to that top state. Thus if Alabama had 400 searches in total and 200 for ‘cat,’ its cat search share is 200/400 = 0.50. If Alaska had 100 searches of which 10 were for ‘cat,’ its cat search share would be 10/100 = 0.10. Google Trends would report Alabama as 100 and Alaska as 20, because 0.10 is 20 percent of 0.50.
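The index construction can be written out directly. The search counts below are just the hypothetical Alabama/Alaska numbers from the cat example:

```python
# The article's hypothetical Alabama/Alaska example, computed explicitly.
searches = {
    "Alabama": {"total": 400, "topic": 200},
    "Alaska": {"total": 100, "topic": 10},
}

# Step 1: each state's share of its own searches devoted to the topic.
shares = {state: d["topic"] / d["total"] for state, d in searches.items()}

# Step 2: rescale so the top state's share maps to an index of 100.
top_share = max(shares.values())
index = {state: round(100 * s / top_share) for state, s in shares.items()}
print(index)  # {'Alabama': 100, 'Alaska': 20}
```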

Google Trends allows the comparison of topic interest across states. If the topics are violent video games, this gives us an opportunity to link violent crime to searches for violent games. Interest in violent games is not the same thing as playing violent games. But it stands to reason that if a state has a lot of searches for a violent game, it probably has a lot of people playing it too.

The rest of the paper, then, reports results of a study of violent crime and video game violence, where the indicator of state-level violent gaming is given by Google Trends search data. Specifically: I used Trends to produce search indices for the following eight violent games:

· Battlefield: Hardline

· Bloodborne

· Dying Light

· Hatred

· Mad Max

· The Phantom Pain

· The Order: 1886

· Until Dawn

These eight games were taken from Common Sense Media’s list of the top ten most violent games of 2015, the same year as the FBI crime data used in the study (the most recent available at the time). Common Sense Media is a major watchdog group that reviews media for violence and other offensive content. Two games on CSM’s list (OneeChanbara and Mortal Kombat) had too few searches in one or more states, so Trends did not report an index for those states.

A state’s level of interest in violent games in general was measured as the average of its index numbers across these eight games. A state with many searches for each of these titles will thus have a higher average search index, which for this study means a higher overall interest in violent games. It is reasonable to assume that high Google-measured interest in specific violent games is closely related to buying and playing them. It is certainly more closely related to the central issues of our study than anything else I found.[8]
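For concreteness, the averaging step looks like this. The index values below are invented for illustration, not a real state’s data:

```python
# Hypothetical state's Trends index for each of the eight titles
# (all index values invented for illustration).
indices = {
    "Battlefield: Hardline": 62,
    "Bloodborne": 71,
    "Dying Light": 55,
    "Hatred": 48,
    "Mad Max": 66,
    "The Phantom Pain": 73,
    "The Order: 1886": 59,
    "Until Dawn": 70,
}

# The state's overall violent-game interest is the simple average.
violent_game_interest = sum(indices.values()) / len(indices)
print(violent_game_interest)  # 63.0
```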

A study of violent game searches and violent crime in US States

Let’s begin with a simple scatter plot of US states. On the X-axis we have the average of the search indices for the eight games listed above; the search data range from about 40 to about 90. On the Y-axis we have violent crime rates per 1,000 population, as reported by the FBI.[9] All of the information comes from 2015, the most recent year for which crime data were available when the study began. Each dot in the picture represents one US state.

In this picture, Alaska and Virginia fit the theory that games cause violence. Virginia’s crime rate is below 1.00 per 1,000 population, and Virginians’ search interest in violent games is the lowest in the country. Alaska has the highest search interest in violent games, and its crime rate is three times higher. That would suggest a link between violent games and violent crime.

HOWEVER — and it is a big however — Alaska and Virginia are not the only states in America, and the other states do not fit this story at all. Delaware has by far the highest crime rate, but its search intensity for violent games is below the median. North Dakota and West Virginia have very high interest in violent games, but their crime rates are below the median.

As for stereotypes about rural people and game-inspired massacres, there’s no supporting evidence here. Rural states like Alaska, West Virginia, and North Dakota do have more searches for violent games, but North Dakota and West Virginia have below-median crime rates. Virginia has both low interest in violent games and a lower crime rate. There’s no obvious pattern.

There’s no obvious pattern to any of it, really. A basic trendline calculation reveals a slight positive correlation between searches for violent games and violent crime. The slope of the trend line — 0.0112 — implies the following: If a state had a 10-point higher interest in games, its crime rate would be 0.112 points higher. What does this mean for the real world? If a state had a crime rate of 1.050, a 10-point increase in violent game interest, from 60 to 70, would raise its crime rate from 1.050 to 1.162. This is a meaningless change. When Alaska has a crime rate over 3 and Delaware’s is above 8, a shift from 1.05 to 1.16 is not big enough to matter. Interest in violent games does not move the needle as far as violent crime is concerned.
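The back-of-the-envelope arithmetic behind those numbers is simply the slope times the change; the baseline crime rate here is the hypothetical one from the text:

```python
# Trend-line arithmetic: slope of 0.0112 crime-rate points per
# index point of violent-game interest (from the fitted line).
slope = 0.0112
base_rate = 1.050          # hypothetical state: crimes per 1,000 population
interest_change = 70 - 60  # a 10-point rise in the search index

new_rate = base_rate + slope * interest_change
print(round(new_rate, 3))  # 1.162 -- trivial next to Delaware's rate above 8
```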

Scalia’s standard for evidence is that it must reveal some kind of a social problem being caused by video games. There’s no evidence of that to be found in basic state-level correlation data.

What about other factors?

Perhaps there is truly a large effect of violent game interest, but it is being masked by other factors that influence crime. Several regressions are reported below. The result — actually the lack of a result — is interesting. Some factors seem significant for explaining crime rates, but interest in violent games is not among them.

The variables selected for analysis are based on standard models of crime. Because enforcement likely matters, a measure of police presence is included in the study. Economic variables like income and unemployment are included, as is the prevalence of young men, the demographic that commits most violent crimes. Measures of education are included, as well as cultural factors like diversity and out-of-wedlock births.

Although dozens of regressions were run — as is common in studies like this — space only permits a few to be reported. The models reported here are as follows:

· Model A: The simple trend line from Figure 1.

· Model B: A basic demographic model. Factors accounted for include police, unemployment, men aged 20–34.

· Model C: Accounts for everything in Model B, plus data on income and education (the high school graduation rate).

· Model D: Adds cultural factors, including the level of diversity in a state and the frequency of out-of-wedlock births.

· Model E: Adds a cognitive ability score.

· Model F: Included as the most robust regression found. It includes some of the above explanatory factors but uses only one game, Hatred, as the violent game variable (as opposed to an average of all eight games).

I include Model F mostly as a caution. If one runs enough regressions, eventually a strong-looking set of results will appear. It is tempting to report just that one shiny cherry out of the lot, but it makes little sense as a model. Why would interest in the single game Hatred make the effect of police on crime stronger and clearer? There is no theory to support that; it must be a random artifact of this particular data set. Instead of cherry-picking, it is better to report a set of results that reflect roughly what was found in running dozens of different specifications.

The table below shows the selected results in terms of effect sizes. Raw results with coefficients and T-stats are in the appendix. A double-star ‘**’ indicates a variable that meets the standard of statistical significance at the 0.95 level. A single star ‘*’ indicates statistical significance at the 0.90 level.

Table 1. Effect of different factors on violent crime at the state level

Read the table as follows:

All else equal, a 1 percent increase in {Variable} is associated with a {number} percent increase / decrease in violent crime rates. Example: In Model B, a 1 percent increase in Police is associated with a 0.380 percent decrease in violent crime.

All data from 2015. Definitions: Violent game interest: State average of Google Trends index scores for eight violent games. Police: Police per 1,000 population in state. Unemployment: State unemployment rate. Young men: Percent in state that are male aged 20–34. Income: Median state income. Graduation rate: High school graduation rate. Diversity: Percent of population that is non-white. Out-of-wedlock births: State out of wedlock birth rate. Math score: National Assessment of Educational Progress, 8th-grade math test, state average. Sources in the appendix. Data are available from the author on request.

For those unfamiliar with regression analysis, the results can be interpreted as follows: Numbers with an asterisk are estimated with some amount of accuracy; numbers without an asterisk are likely to be inaccurate. By “accuracy” is meant the likelihood that the same number would result if the study were done again and again with new data. Accurately-estimated numbers should re-appear in roughly the same magnitude during runs with new data.[10]
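The elasticity reading described in the table note can be made mechanical with a small helper. The baseline crime rate below is hypothetical; the -0.380 figure is the Model B police elasticity quoted in the reading example.

```python
# Turn a table elasticity into a change in levels.
def apply_elasticity(baseline: float, elasticity: float, pct_change: float) -> float:
    """New outcome after a pct_change (%) move in the explanatory factor."""
    return baseline * (1 + elasticity * pct_change / 100)

crime_rate = 1.00  # hypothetical state: violent crimes per 1,000 population

# Model B: elasticity of crime with respect to police is -0.380, so a
# 1 percent increase in police goes with 0.380 percent less violent crime.
print(round(apply_elasticity(crime_rate, -0.380, 1), 4))   # 0.9962
print(round(apply_elasticity(crime_rate, -0.380, 10), 3))  # 0.962
```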

Model A shows the effect of violent game interest on violent crime based on the simple trend line from Figure 1. The entry in the table says that if a state has a 1 percent higher level of interest in violent games, it will have a 0.545 percent higher violent crime rate. Is this large or small? Let us examine some of the other models.

Model B adds police, unemployment, and the population share of young men. Now a state with 1 percent higher violent game interest — and exactly the same police, unemployment, and young men — should have a 0.271 percent higher violent crime rate. Accounting for the effect of police, unemployment, and young men has cut the effect of violent gaming in half.

Model C, which takes into account income and education levels, brings the effect of violent game interest back up to a half percent. The effect seems sensitive to the presence or absence of other variables in the model.

Up to this point, none of the models has produced an accurate estimate. Were the study to be repeated with new data, we would expect the estimates to vary considerably from what we have here.

Model D, which adds diversity and out-of-wedlock births, is the one in which I have most confidence. It is the first of these models to exhibit any robustness in terms of accuracy.[11] Three variables are revealed to be somewhat accurately estimated: Police, income, and out-of-wedlock births. Note that the effect sizes are large. The value for out-of-wedlock birth says that if two states are exactly the same in terms of police, income, etc., but one has a 1 percent higher rate of out-of-wedlock births, that state will also have a 3.086 percent higher rate of violent crime — three times higher than the stimulus of 1 percent. We could go into some detail discussing the reasons why these factors matter, but we are most interested in the effect of violent game interest, which has become negative and very small, -0.050. The effect of out-of-wedlock birth on violent crime is 60 times larger than the effect of violent game interest.

Model E includes all the factors above plus a measure of cognition, an 8th-grade math score in the state. Its coefficient is large, as is the effect of the high school graduation rate. But neither of these is accurately measured. The effects of police, income, and out-of-wedlock births remain large and accurately estimated. The effect of violent game interest returns to positive territory and is now one-tenth the size of the out-of-wedlock birth effect.

The overall accuracy of Model E is not strong; it survives only a weaker test of accuracy than is standard in the literature.[12] Still, it is more accurate than Models A-C.

Model F is included, as I said above, as a cautionary tale. As I tortured the data, I came across this odd but strangely strong regression. Model F drops the math score and the young-male demographic. It also switches the violent game variable from an average of all eight games to the Google Trends index for a single game, Hatred. Lo and behold, the three primary variables (police, income, and out-of-wedlock births) all pass a strong accuracy test (statistical significance at the 95 percent level). The violent game effect rises to 0.725, the largest of any model reported here. The out-of-wedlock birth effect is still five times larger.

Looking over all the results, the important thing to note is that some variables consistently have the same effect across many specifications, whereas the violent game effect whipsaws up and down around zero. When a variable moves around like this, it means that the true effect is so small that it is hard to measure with any accuracy. One could not conclude from these regressions (and the many others that I ran) that state levels of interest in violent video games are correlated with state violent crime rates. There’s nothing there.

Perspective: Crime, shootings, and cars

Violent crime rates are tiny. Other than a few outliers, the states are clustered around 1 violent crime per 1,000 people. How much difference does it make, for the real world of daily living, that the violent crime rate is 0.5 per 1,000 or 1.5 per 1,000? Violent crime rates are so low that changes are not noticeable to the average person going about his daily affairs. The situation is similar to what happens when a study announces a doubling of some rare health risk. It sounds scary that the risk of getting cancer doubles if you eat one carrot a day. But if the original risk is tiny, doubling it is still tiny. “If you eat that carrot, your risk of getting cancer goes from 0.00000000001 to 0.00000000002.” So what?

In this case, the largest difference in violent crime risk across states runs from Illinois at 0.000319 (that is, 0.000319 crimes per person) to Delaware at 0.008546. Both are tiny. The only practical conclusion from the data is that the chances of being involved in a violent crime are so low that changes are not noticeable. We should not move from Delaware to Illinois just to lower our exposure to violent crime; we wouldn’t notice the difference.

The same is true of mass shootings. Shootings get a great deal of media play but are simply too rare to be grounds for a change in personal behavior or public policy.

Unfortunately, the public seems to think that we could dramatically reduce mass shootings by restricting access to video games. This is crazy.

Suppose your cat caused your house to burn down, as follows. She jumped on the piano keyboard, got startled, rocketed up to the ceiling, and came down on a candle, knocking it over into the pile of pine needles that your five-year-old decided to bring in and put on the table. A rare event. Would it make any sense to get rid of the cat, because she was the “cause?” Wouldn’t it be even crazier for the government to ban cats, because of their “tragic consequences when mixed with musical instruments?” Isn’t this all rather insane? And what makes it insane is the extreme low likelihood of these events.

Wikipedia defines “mass shooting” as any shooting with four or more victims. In 2018 there were 323 shootings of this kind, in which 387 people died.[13] A review of the individual cases reveals that most of these were the results of gang violence, domestic disputes, or escalating fights. The number of walk-in-the-door-and-open-fire shootings is exceedingly small, perhaps 10 or 12.

Meanwhile, 36,750 people died on the highways in 2018.[14] Stop and think: You know people who died in car crashes. Not a lot, but a few. That level of awareness was generated by decades of auto deaths in the tens of thousands. But notice: Nobody is worried about the level of reckless, stupid driving going on in this country, and there’s a lot of it (especially in Ohio). Nobody is saying we should ban cars. Nor is anyone saying that racing and car games should be banned because they encourage reckless driving.

Yet 10 to 12 cases of shooting insanity are enough to make people argue that video games should be restricted or banned. In order to justify such restrictions, video games would have to be mighty powerful. And nothing in the evidence suggests even a moderate effect of game violence on crime, much less mass shootings.

Conclusion

In the existing literature and in this present effort to look into new data and methods, there has been no solid, consistent evidence of an impact of violent video games on violent crime.

At some point, absence of evidence becomes evidence of absence. I have been looking for the rhinoceros in my backyard for over ten years and still have not found him. At what point might I conclude that he actually is not there? We continue to produce studies showing weak or no effect of games on violence; perhaps it is time to conclude that there is no effect of games on violence.

I’m ready to make this conclusion, especially for policy purposes. Having spent the last 18 months researching, I’m convinced that the link between games and violence is so weak — if it exists at all — that it cannot support a public policy change. There’s nothing in the data to support a restriction on violent game play, either by parents or government, out of a concern for its purported effect on violence. There is none.

I would note, however, that I discourage violent games with my kids. This is not because violent games make my kids aggressive (they don’t). Rather, it’s because the more violent games also tend to suck away too much time. When a kid is in the middle of an intense level or a death match, it’s hard to get him to come to the table for dinner. In fact, I view the main problem with games and social media to be not the content but the time diversion. Eyeballs devoted to screens are not making contact with family members and friends, and that can become a problem if it happens too often.

A second reason I discourage the more violent games is that they are usually low quality. Lack of quality is the same reason I don’t encourage my kids to watch slasher flicks. Whether it’s slasher flicks or bloody games, there are better things to do. It doesn’t bother me if they play something cheesy once in a while. But I would prefer they devote their attention to games with some depth, like Minecraft or Undertale, or the heavier board games, like Kingmaker and Labyrinth: The War on Terror, or role-playing games like Dungeons and Dragons.

Games are part of a healthy media diet, for teenagers as well as adults. Their effects on society, like those of film, music, and books, are generally neutral. The proper role of society, and parents, is not to ban but to encourage gently the pursuit of aesthetic excellence, which appears in games far more often than most people realize.


Appendix

The unit of observation in the regression study is the US state, excluding the District of Columbia; N = 50. The method is OLS regression, with the dependent variable being the incidence of violent crimes per 1,000 population.

During the study an effort was made to apply a two-stage least squares method, with varying sets of instrumental variables. A two-equation system was assumed, with video game hours from IPUMS and the violent crime rate from the FBI as the dependent variables. The goal was to explore the possible endogeneity of video game play. The results differed very little from OLS results. A Hausman specification test indicated that video games were not endogenous. However, given the small size of the data set, it is unlikely that an advanced method would yield reliable results in any case. It was decided to report only basic OLS regressions and accept the possibility that real-world violence might result in more people searching for violent video games. This seems an unlikely theory, but it is possible and is a potential limitation of the study.
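For readers unfamiliar with the two-stage approach, here is a bare-bones 2SLS sketch on simulated data. The instrument, the sample size, and every coefficient are invented for illustration (I use a large simulated sample rather than 50 states so the estimates are stable). In this fake world the true effect of gaming on crime is zero, an unobserved confounder biases naive OLS upward, and the two-stage estimate removes the bias.

```python
import numpy as np

# Simulated data; all relationships invented. True gaming -> crime effect is 0.
rng = np.random.default_rng(1)
n = 2000
z = rng.normal(size=n)   # instrument: shifts gaming but not crime directly
u = rng.normal(size=n)   # unobserved confounder of gaming and crime
gaming = 0.8 * z + 0.5 * u + rng.normal(size=n)   # endogenous regressor
crime = 1.0 * u + rng.normal(size=n)              # gaming has no true effect

def ols(y, X):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)

# Naive OLS: biased upward, because u drives both gaming and crime.
b_ols = ols(crime, np.column_stack([ones, gaming]))

# Stage 1: project the endogenous regressor onto the instrument.
a = ols(gaming, np.column_stack([ones, z]))
gaming_hat = a[0] + a[1] * z

# Stage 2: regress crime on the instrument-predicted gaming.
b_2sls = ols(crime, np.column_stack([ones, gaming_hat]))

# The OLS slope is biased away from zero; the 2SLS slope is near zero.
print(round(b_ols[1], 2), round(b_2sls[1], 2))
```

The design choice in the appendix — falling back to OLS after a Hausman test found no endogeneity — avoids exactly the instability that a weak or small-sample 2SLS estimate would introduce.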

A deeper discussion of these and other methodological issues may be found in a working paper titled “MMFX: A Method for Estimating Macro-Level Media Effects,” available from the author. A caution, however: The results in that paper are strictly exploratory. The paper was written only to explore different econometric issues. The main finding was that the current paper should be written and released in a public forum.

Data Sources

All data were gathered at the level of US states.

Violent crime
“Crime in the United States — FBI Uniform Crime Reporting, 2015.” Violent crimes include offenses of murder and nonnegligent manslaughter, rape, robbery, and aggravated assault. Property crimes are offenses of burglary, larceny-theft, motor vehicle theft, and arson.

Violent Game Searches
Google Trends searches for 2015 in the United States. Games selected using Common Sense Media’s list of top 10 most violent games of 2015. Two games (Mortal Kombat and OneeChanbara) did not have data for all 50 states and were dropped.

For each game, the search entered the game’s name and then selected the sub-topic “[Name] video game.” The resulting variable is the interest index: An index of the share of all searches in a state that were for this term, relative to the state with the most searches, which sets the index value at 100.

Total population. Comparative Demographic Estimates, American Community Survey, 2015.

Police and Sheriff Officers, May 2015. BLS Occupational Employment Statistics.

University of Kentucky Center for Poverty Research, State Welfare Data, 2015.

Young Men
Comparative demographic estimates, 2015 American Community Survey.

Household median income, in units of 1,000. Bureau of the Census, American FactFinder, Income. Households, 2015.

Graduation rate
National Center for Education Statistics, Public high school 4-year adjusted cohort graduation rate (ACGR), by selected student characteristics and state: 2014–15 school year.

100 minus percent population white. Comparative demographic estimates, American Community Survey, 2015

Out-of-Wedlock Births
CDC — National Center for Health Statistics, Live Birth Data, 2015.

Math score
National Center for Education Statistics. Average National Assessment of Educational Progress (NAEP) mathematics scale score of 8th-grade public school students, by state: 2015.

Descriptive Statistics

OLS Regression Results



[2] For an overview of this research, and some of the wilder claims made about it, see Anderson CA, Shibuya A, Ihori N, Swing EL, Bushman BJ, Sakamoto A, Rothstein HR, Saleem M, “Violent video game effects on aggression, empathy, and prosocial behavior in eastern and western countries: a meta-analytic review,” Psychol Bull 136(2) 2010, 151–173. The review concludes, “The evidence strongly suggests that exposure to violent video games is a causal risk factor for increased aggressive behavior, aggressive cognition, and aggressive affect and for decreased empathy and prosocial behavior.” CJ Ferguson notes the numerous methodological problems: Ferguson CJ, Kilburn J, “Much ado about nothing: the misestimation and overinterpretation of violent video game effects in eastern and western nations: comment on Anderson et al. (2010),” Psychol Bull 136(2) 2010, 182–187. But the deeper problem is that there’s little of broader interest to be gleaned from data on aggressive feelings during an experiment. This literature goes to great lengths to establish correlations between minor amounts of game play and certain feelings that people might have. Nobody cares about this. We care about possible social problems caused by video games.

[3] Brown v. Entertainment Merchants Ass’n, 2011.

[4] In a particularly egregious failure, Huesmann and colleagues surveyed young adults and matched the results with data from when those people were children. Two problems prevent us from taking anything meaningful from the study. First, when measuring aggression, they lumped actionable policy events such as arrests together with many other measures such as “aggressive personality.” The composite measure makes no sense for policy; it mixes together things that matter for society with things that do not. Second, the authors fail to account for other factors that could cause both childhood violent TV and adult violence. They claim to account for many factors, but their approach to regression analysis is simply wrong: they control for one or two variables at a time, rather than including all the relevant variables at once. In controlling for just one or two factors, they leave childhood TV viewing to pick up the influence of all the other, uncontrolled factors. The point of controlling for different factors in a study is to hold constant external things like parental violence and income while isolating the relationship we care about, between childhood TV and adult violence. In English: failing to include all the relevant variables in the regression means that the effect of childhood TV on adult violence is mismeasured. The literature is filled with studies like this. I could not find a single study that did a ceteris paribus regression of real-world violence indicators on media consumption.
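The omitted-variable problem described in this footnote is easy to demonstrate by simulation. The sketch below is illustrative only: the variable names and effect sizes are invented, with family background driving both childhood TV viewing and adult violence while TV itself has no true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Invented data-generating process: family background causes both
# childhood TV viewing and adult violence; TV's true effect is zero.
family = rng.normal(size=n)
child_tv = family + rng.normal(size=n)
adult_violence = family + rng.normal(size=n)

# Short regression (confounder omitted): TV picks up the family effect.
X_short = np.column_stack([np.ones(n), child_tv])
b_short, *_ = np.linalg.lstsq(X_short, adult_violence, rcond=None)

# Long regression (confounder included): TV's coefficient collapses toward zero.
X_long = np.column_stack([np.ones(n), child_tv, family])
b_long, *_ = np.linalg.lstsq(X_long, adult_violence, rcond=None)

print(f"TV coefficient, family omitted:  {b_short[1]:.2f}")  # near 0.5 — spurious
print(f"TV coefficient, family included: {b_long[1]:.2f}")   # near 0.0
```

The short regression attributes family’s influence to TV; only the regression that includes all the relevant variables recovers the true (zero) effect.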

[5] Problems with these studies abound. They are typically based on college students who do the experiment for course credit. They typically play one game. One game! The “aggression” indicators are some kind of survey or minor exercise; how do we know the subjects are taking any of it seriously? The only deviation from small-scale experiments is an even weaker approach, where researchers compare a gamer group to a non-gamer group, purportedly discovering that the gamer group is worse in some way. But this method fails to account for other factors that could affect both gaming and these bad outcomes. The first thing an econometrician would ask is, “Have you accounted for family?” The issue is that there could be a third variable that causes both violent crime and violent game play. Perhaps people from violent families, who are therefore more likely to be violent themselves, play violent games to exorcise their inner demons. If so, then violent games and violent acts would go together — not because one causes the other, but because both are caused by something else. These methods are open to more criticisms than can be crammed into a single paper. Fortunately, the Supreme Court summarized things nicely: the evidence is too weak to have any real-world significance.

[6] Gordon Dahl and Stefano DellaVigna, “Does Movie Violence Increase Violent Crime?” Quarterly Journal of Economics, 2009.

[7] I did, however, use the IPUMS measure of game play hours to conduct a lengthy study of methodological issues related to model specification. I estimated systems of equations that accounted for endogeneity between game play and violence, and explored many different sets of instruments for both factors. In the end, I abandoned these more advanced regression methods because the results were so close to OLS as to be indistinguishable. This stands to reason; with fewer than 50 data points (some states being missing), the data set is too small to support anything beyond the most basic operations. These advanced methods can be found in a working paper, “MMFX: A Method for Estimating Macro-Level Media Effects,” available from the author. Those methods should be used if data from Census tracts become available.

[8] The Google Trends interest index measure is unlikely to be subject to reverse causation. One might argue, for example, that states with high violence cause more people to search for “violent video games” or “game violence.” But these terms are not the source of the indices I am using. The indices here are based on individual game titles. It seems unlikely to me that someone concerned about violent video games would search for “Bloodborne” — how would they know that this is one of the most violent games? Instead they would begin with more generic searches like “violent games.” Furthermore, none of the games on the list is particularly famous, well-known, or notorious (that honor belongs to Grand Theft Auto). It seems likely that the only people who would search for “Bloodborne” specifically are those who have heard about the game and want to play it.

[9] See the appendix for data source references.

[10] This way of discussing t-statistics is unorthodox, but I would argue that it better captures the true meaning of classical significance testing. When an estimate fails a t-test, we say that the hypothesis of zero cannot be rejected. Some scholars then proceed as if the effect is zero and completely ignore the variable. This is wrong. When the zero hypothesis cannot be rejected, another hypothesis cannot be rejected either: that the true coefficient is twice the estimate. Both 2b and 0 lie within two standard errors of the estimate b. The failed t-test means only that in repeated samples the estimate may be zero (not must be), and may also be very much larger. The import of a failed t-test is that the variance of the estimate is wide relative to its location, and nothing more. Colloquially speaking, a failed t-test means that the estimate is inaccurate. It should not be ignored; it should be reported and then labeled as inaccurately estimated. Another way of saying this is that the magnitude of an estimate has meaning regardless of the t-test. The location of the estimate’s distribution is an interesting statistical result, and so is its variance, so we should report both. That is done here by reporting the magnitude and then indicating whether the estimate is accurate or inaccurate.
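The point can be illustrated with a toy number. Suppose (hypothetically) an estimated coefficient of 1.5 with a standard error of 1.0:

```python
b, se = 1.5, 1.0           # hypothetical estimate and standard error

t_stat = b / se            # 1.5, below the conventional 1.96 cutoff: the t-test fails
ci_low, ci_high = b - 2 * se, b + 2 * se   # rough 95% interval: (-0.5, 3.5)

# Zero cannot be rejected...
assert ci_low < 0 < ci_high
# ...but neither can a coefficient twice as large as the estimate.
assert ci_low < 2 * b < ci_high
# The failed test signals imprecision, not a zero effect.
```

Both 0 and 2b sit comfortably inside the same interval, which is exactly why a failed t-test cannot justify treating the effect as zero.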

[11] It also represents a large increase in variance explained. The R2 of Model D is almost 20 percent, whereas Models A-C do not exceed 4 percent. See the appendix for the exact figures.

[12] Model E’s variables pass only against a 90 percent confidence threshold, rather than the standard 95 percent threshold.




