
Wednesday, July 24, 2019

Stanford Study Openly Lies about Gun Violence

Not a subject I normally write about, but it needs to be addressed. When people use bad economic methodology to "prove" that guns cause violent crime, I get irritated...


A news story from Bloomberg, circulating on social media, claims to report a decisive conclusion on Right to Carry laws, namely that those laws cause an increase in violent crime.

The news story, which is also published by Yahoo Finance, explains:
Using different statistical approaches, and operating with the most complete data yet compiled, the researchers tested the effects of right-to-carry firearm laws in 33 states that adopted them between 1981 and 2007 – tracking violent crime before and after, and controlling for other factors. … No matter which [statistical] model was used, the study found that in states that adopted RTC, “violent crime is substantially higher after 10 years than would have been the case had the RTC law not been adopted.”
This would be a stunning conclusion, if it were true.

Which it isn’t.

The very claim that “violent crime is substantially higher” is utterly false. Nonsense.

Unlike the journalists who wrote those articles, I have read the actual research report. Here are some of the most egregious errors in their article.

Lie #1. “There is not even the slightest hint in the data from any econometrically sound regression that RTC laws reduce violent crime”

Actually, there is. If we review the data state by state, there is more than a hint that RTC laws actually work.

Lie #2. The research report “found that state laws making it easy to carry concealed firearms lead to more violent crime.”

No, it doesn’t.

Lie #3. “Defensive gun uses against criminals are rare.”

The report does not show this. Not at all.

Lie #4. “Because aggressors are opportunistic, and retain the element of surprise, even trained professionals, including police officers, may have limited ability to repel an armed assault.”

The research report does not show this. But even if it did, it would be an argument against spending any money on patrol officers in the first place.

Lie #5. “…so-called good guys with guns regularly supply criminals with weapons” because guns are stolen.

The study does not correlate this to any crime rate. They do not study how gun theft correlates with gun violence. They mention two examples, but present no data. Using anecdotes as the sole source of evidence is a pathetically banal violation of the most basic principles of good research.

Lie #6. “The Stanford study…”

No, it is not a Stanford study. It was published by the National Bureau of Economic Research.

With so many facts wrong, it is only logical that the Yahoo news story (replicated in numerous other media) completely misses the actual data in the research report. Keeping in mind what the news story says about it, we are going to review their data on a state-by-state basis, but first a word on their methodology – which leaves a thing or two to be desired.

The conclusions reported in the actual study are drawn from what is claimed to be a comparison between Right to Carry (RTC) states and non-RTC states. The researchers used what they call a “synthetic” sample of comparison states.

This sounds sophisticated, and it is presented with great fanfare. Yet it is really nothing more than a method for tailoring the statistical method to the outcomes you want. In plain English: the researchers compare each RTC state to a group of comparison states that they select – for each state. It is not the same group for each state. It changes depending on what state they are looking at.

For example,

  • Texas is compared to California, Nebraska and Wisconsin;
  • Pennsylvania is compared to Delaware, Hawaii, Maryland, Nebraska, New Jersey, Ohio and Wisconsin.

No sane quantitative researcher would do this. This is like saying “I want to see if liberals are dumb, so I will pick a liberal and compare him, based on his intelligence, to a sample of conservatives of my choosing.”

It does not help that the researchers call their comparison sample for each state “perfect”, especially since they neither explain nor report the data on which they based their selection of comparison states.
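For readers who want to see what this looks like mechanically, here is a minimal sketch in Python – with made-up numbers, not the study's data – of how a per-state “synthetic” comparison is built: a weighted mix of donor states is fitted to track the treated state's pre-law crime series, then projected forward as a counterfactual. (Real synthetic-control fits also match on covariates; this is only the skeleton of the idea.)

```python
import numpy as np

def synthetic_control(treated_pre, donors_pre, donors_post):
    """Fit non-negative weights so a mix of donor states tracks the
    treated state's pre-law crime series, then project the mix forward
    as the "what if no RTC law" counterfactual. Illustration only."""
    # Least squares on the pre-law period: donors_pre @ w ~ treated_pre
    w, *_ = np.linalg.lstsq(donors_pre, treated_pre, rcond=None)
    w = np.clip(w, 0, None)          # crude non-negativity constraint
    if w.sum() > 0:
        w = w / w.sum()              # crude sum-to-one constraint
    return donors_post @ w           # projected counterfactual series

# Toy numbers (NOT the study's data): 5 pre-law years, 3 donor states.
treated_pre = np.array([500., 490., 480., 470., 460.])
donors_pre  = np.array([[520., 300., 480.],
                        [510., 310., 470.],
                        [500., 320., 460.],
                        [490., 330., 450.],
                        [480., 340., 440.]])
donors_post = np.array([[470., 350., 430.],   # 2 post-law years
                        [460., 360., 420.]])

counterfactual = synthetic_control(treated_pre, donors_pre, donors_post)
print(counterfactual)
```

The point of the sketch is that the counterfactual depends entirely on which states go into the donor pool – which is exactly the degree of freedom the researchers give themselves when they hand-pick a different comparison group for every state.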

It goes without saying that demographically and socio-economically, Texas does not have much in common with Nebraska. For one, Texas is a main route for organized crime from Mexico, which alone creates a major crime problem for the Lone Star State.

Which brings us to another methodological problem with the study: it does not distinguish between crime committed by organized gangs vs. crime committed by individual citizens. If a state is penetrated by gangs and cartels, it is likely to see a rise in crime as a result.

Nor do they control for the origin of any firearms used, or separate crimes depending on weapon. This last point is significant: there is emerging evidence that London and Stockholm, two European capitals in gun-strict countries, have higher rates of violent crime than comparable cities in the United States. The main choice of weapon: a knife.

But the manipulation of data does not stop here. It actually goes so far that the researchers openly admit that they had to manipulate some of it to get the results they wanted. Their claim that Pennsylvania is plagued with higher crime as a result of the RTC law is based solely on one factor: the difference made by Philadelphia. In their data, Pennsylvania without Philadelphia is peaceful; with Philadelphia, it is violent.

And this is supposed to be a “Stanford study”.

But we are not done yet. The researchers even lie when they claim that crime went up in all 33 RTC states. Even in their artificially crafted results – based on control groups tailored to “perfection” for each comparison – one quarter of the RTC states saw a decline in crime.

Only 25 experienced an increase. And that increase is not even absolute. In many cases crime declined but deviated from the crafted, concocted control group in a way that the researchers interpreted as a rise in crime.

Toward the end of the research report they actually publish a time series of violent-crime rates for each RTC state. When they do, the narrative about RTC laws causing violent crime to increase falls apart at the seams:

  • In Alaska, the rise in crime they report (compared to Hawaii, Delaware and Maryland of all states) is simply the work of a time lag. This is a classic misinterpretation of time-series data. I see it all the time in economics research. If you adjust for the time lag there is no rise in violent crime in Alaska.
  • In Arkansas, the comparison states (California being one of them…) make it look like crime has risen. This, however, is clearly the result of a difference in volatility over time – not a rise in crime. It is a statistical phenomenon common in comparisons of an individual state or country to a sample of other jurisdictions: the sample creates a trend which, being an average, is almost by definition less volatile than the individual example. The researchers also fail to report that Arkansas saw a spectacular drop in crime after RTC was adopted.
  • Arizona is compared to California and Hawaii. Since Hawaii does not have the crime problem related to the Mexican border, it is entirely predictable that Arizona would look worse than its “control group” of states.
  • In Colorado the violent crime rate has declined since RTC was passed. The entire claim that it has “increased” is based on the same problem as with Arkansas: the choice of a comparison group with a smoother aggregated trend. Standard trick in statistical analysis, and also a standard flaw with econometrics as a methodology.
  • The results for Florida and Georgia are similar. In Kansas, Michigan, Missouri, New Mexico and Virginia, the crime rate tracks almost perfectly with the control states, yet the researchers report them as states with “increased” violent crime. Ohio exhibits largely the same pattern.
  • Louisiana is listed as having an increase in violent crime, yet even a cursory look at the time series for that state tells you that it is temporary and entirely attributable to the Katrina disaster. In Mississippi there is a similar, albeit larger temporary spike, which goes away after 3-4 years.
  • In Nevada, crime falls faster than the carefully crafted comparison group, with what could very well be a temporary spike in the last two years.
  • Crime in North Carolina follows perfectly along the trend defined by the artificial comparison group.
  • Texas has a falling crime rate, with a deviation attributable in its entirety to the choice of Nebraska and Wisconsin for comparison.
  • In Wyoming, crime falls faster than in the artificial comparison group. Guess what that group is? Rhode Island and Wisconsin…
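
The volatility point that recurs throughout the list above can be demonstrated in a few lines of Python, using simulated data (not the study's): average several equally noisy series and you get a smoother trend, so any individual state will deviate from a donor-pool average even when nothing real has changed.

```python
import numpy as np

rng = np.random.default_rng(0)
years = 20

# Five simulated state crime series: same flat level, independent noise.
states = 500 + rng.normal(0, 30, size=(5, years))

single = states[0]           # one "treated" state
pool   = states[1:].mean(0)  # donor-pool average of the other four

# The average is markedly less volatile than any single series,
# purely because averaging cancels out independent noise.
print(single.std(), pool.std())
```

A deviation from a smoother average is therefore not, by itself, evidence of a rise in crime; it is what averaging does.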

We can go on and on. It has been a long time since I saw research so poorly done and so openly tailored toward its desired results.

If you can draw any conclusions from this research, it is that RTC laws work and those who hate guns want to do everything they can to cover it up.

Oh, and when someone tells you that "research shows guns lead to more violence", don't believe them. Because now you know that it doesn't.
