
Tuesday, October 16, 2012

A few more studies on concealed carry (CCW)

A few more studies have been published since 2011 looking into the concealed carry debate, specifically its effects on crime. Most studies have found some decrease in crime. In fact, many of the studies that claim no effect actually present data showing a decrease. Looking at the data, nearly all studies show some type of decrease in crime. However, a few studies find an increase. In 2003, Ayres and Donohue published an analysis in the Stanford Law Review finding that CCW had no effect on crime and, in many cases, a definite increase. So, I will go down the list of the studies claiming an increase in crime and evaluate them.

A) Ayres and Donohue 2003

This article was the first to argue an increase in crime (academically, at least, mainly because it's nearly impossible to argue that position).

The commonly argued "robbery effect" holds that robbery continued to rise in states after CCW laws were passed, which supposedly means no correlation exists. Sadly, this argument is a great example of cherry picking. Although on average robbery did still increase after the CCW laws were passed, one must first look at the earlier trends. Before these bills became laws, robbery was already increasing. After the laws were passed, robbery kept rising, yes, but at a lower rate. The rate of increase dropped substantially, which shows that some correlation with CCW does exist. Further research finds that the initial increase is not significant (as noted), and that over time the states that passed these laws showed an overall downward trend. Ayres and Donohue (A and D) merely looked at before/after averages, which weakens their results. As Plassmann, Lott, and Whitley note, "While Ayres and Donohue acknowledge the problems in using simple before-and-after averages in evaluating the impact of the law, yet they do not consistently apply that insight when discussing the evidence." In other words, A and D essentially admit their approach is flawed, which effectively damages the robbery-effect hypothesis.
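
To make the point about before/after averages versus trends concrete, here is a minimal sketch with made-up numbers (not real crime data, just an illustration of the statistical point): a hypothetical robbery rate rises 5 units per year before a law and only 1 unit per year afterward, so a simple comparison of averages still shows an "increase" even though the growth rate collapsed.

```python
import numpy as np

# Hypothetical robbery rates per 100,000 (invented numbers, not real data):
# rising 5 units/year before the law, only 1 unit/year after it.
pre_law  = np.array([100.0, 105.0, 110.0, 115.0, 120.0])   # years -5 .. -1
post_law = np.array([121.0, 122.0, 123.0, 124.0, 125.0])   # years  0 .. +4

# A simple before/after comparison of averages (the approach criticized above):
print("pre-law average: ", pre_law.mean())    # 110.0
print("post-law average:", post_law.mean())   # 123.0 -> looks like an "increase"

# Comparing trends instead: fit a straight line to each period and look at the slope.
slope_pre  = np.polyfit(np.arange(-5, 0), pre_law,  1)[0]   # about +5 per year
slope_post = np.polyfit(np.arange(0, 5),  post_law, 1)[0]   # about +1 per year
print("yearly increase before:", slope_pre, " after:", slope_post)
```

The averages alone say robbery went up after the law; comparing slopes shows the rate of increase fell sharply, which is exactly the distinction drawn above.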

As for murder, A and D truly had to hide their results. Their own data showed a decrease in murder, yet they tried to argue there was "little movement". A and D also point to a murder increase in Maine, but is that true? Not really; they had the wrong dates. Plassmann et al. note, "The values for these four years show up in the data only because Ayres and Donohue recode Maine’s right-to-carry law as going into effect in 1981 instead of 1985 as previous research had done." Murder only increased in the four years before the law actually took effect. Because A and D counted those four years as post-law when they should not have, they skewed the average, producing an increase that never existed.

A and D argue that rape and assault also rise after a CCW law is passed, but again the data is full of errors. Their analysis using county-level data had mixed results. Most of their data tables showed rape rising before the law and falling after. Even after regression (a technique used to control for other variables, like crime cycles), the results showed a decrease in rape. In only one year did rape actually rise. In other words, rape decreased after CCW in most years, and in only one year did other factors erase CCW's deterrent effect. Using their state-level data, rape declines for 10 years after CCW and stays below pre-CCW rates for at least 12 years. Only when Maine is the last state remaining in the sample does rape rise. But remember, Maine is not credible in this study because its law date was changed, so in all honesty CCW decreases rape in all states. A and D's own data also shows assaults decreasing after the law, to 3% below their pre-law rate.
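
For readers unfamiliar with the regression step mentioned above, here is a minimal sketch of the general technique on an invented state-year panel: a law indicator plus state and year fixed effects, the latter absorbing things like nationwide crime cycles. The data and numbers are made up for illustration; this is not the authors' actual model or dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Build a tiny hypothetical state-year panel (invented numbers, purely illustrative).
law_years = {"A": 1990, "B": 1992, "C": 9999, "D": 9999}  # C and D never adopt CCW
rows = []
for state, law_year in law_years.items():
    for year in range(1985, 2000):
        has_law = int(year >= law_year)
        # Invented crime rate: a national upward trend plus a drop once the law is in force.
        rate = 500.0 + 10.0 * (year - 1985) - 25.0 * has_law + rng.normal(0.0, 5.0)
        rows.append({"state": state, "year": year, "law": has_law, "crime_rate": rate})
df = pd.DataFrame(rows)

# Regress crime on a law indicator, with state and year fixed effects controlling
# for fixed state differences and shocks common to all states (e.g. crime cycles).
model = smf.ols("crime_rate ~ law + C(state) + C(year)", data=df).fit()
print(model.params["law"])  # should recover roughly the built-in -25 effect
```

With the year effects soaking up trends common to all states, the coefficient on `law` is the kind of estimate these studies argue over.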

Overall, the 2003 Ayres and Donohue study is not credible.

Reference: Plassmann, Florenz, John R. Lott, Jr., and John Whitley. "Confirming More Guns, Less Crime." Stanford Law Review (2003)

B) Aneja, Donohue, and Zhang, 2011

This is the biggest eye-opener for me... but sadly, it is flawed like its older cousin.

Here are the main criticisms (a short sketch after this list shows how such data problems can be checked):

  • The observations for county 2060 in Alaska are repeated 73 times for 1996.
  • The dataset codes Kansas as having the law (and uses Kansas data) from 1996, but the law was not passed until 2006. This obviously ruins the data, as ten years of wrongly coded observations could easily throw the average off.
  • The first year of data used for Florida is 1989, but the law was passed in 1987, so the results are skewed by the loss of two years of potentially important data.
  • South Dakota's first data is from 1987, however the law was passed in 1985. Again, two years of important data are lost (the first years after a law is passed are crucial when determining its effects).
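
As promised above, here is a rough sketch of how problems like these can be caught mechanically in a county-year panel: duplicated county-year rows (the Alaska case) and law coding that disagrees with the actual adoption years (the Kansas, Florida, and South Dakota cases). The file name and column names are assumptions for illustration, not the paper's actual dataset.

```python
import pandas as pd

# Assumed layout: a county-year panel with columns fips, state, year, shall_issue.
# The file name and column names are placeholders, not the paper's actual files.
panel = pd.read_csv("county_panel.csv")

# 1) Duplicated county-year rows (e.g. one county repeated dozens of times in a year).
dupes = panel.groupby(["fips", "year"]).size()
print(dupes[dupes > 1])

# 2) Law coding that disagrees with the actual adoption years mentioned above.
adoption_year = {"KS": 2006, "FL": 1987, "SD": 1985, "ME": 1985}
for state, adopted in adoption_year.items():
    sub = panel[panel["state"] == state]
    coded_too_early = sub[(sub["year"] < adopted) & (sub["shall_issue"] == 1)]
    missing_years = sorted(set(range(adopted, adopted + 3)) - set(sub["year"]))
    print(state,
          "| rows coded as having the law before adoption:", len(coded_too_early),
          "| early post-law years missing from the data:", missing_years)
```
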
Interestingly enough, the NRC reached no conclusion on Lott's work. They did not say he was wrong, but they did not say he was right either. To me, though, it is sad that they cannot admit Lott is correct: their own results are almost exactly the same as Lott's! Only once was a result different, for murder, and even then it differed by only .03. So, on balance, Lott's work really has been replicated by the NRC, while the Aneja, Donohue, and Zhang results have not been replicated. Lott et al. responded to Aneja, Donohue, and Zhang (ADZ): "ADZ couldn’t replicate the NRC results with the NRC data. They jumped to the conclusion that it was due to bad data from Lott. We now know that the data that Lott provided to the NRC was the same as that provided to hundreds of other researchers. Using Lott’s data, we could replicate the NRC results for both the dummy variable and trend model, corresponding to the NRC Tables 6-1 and 6-2 and, since the NRC was also able to replicate the original Lott and Mustard results, the NRC must have been using the same data. We were also able to replicate the NRC “no covariate” model for the 1977-2000 sample. We find it hard to believe that ADZ couldn’t replicate those results with the NRC data. Researchers cannot be held responsible for errors committed by others who request their data."
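
For readers unfamiliar with the terms in the quote, a "dummy variable model" estimates a one-time shift in crime once a law is in force, while a "trend model" lets the law change the slope of crime over time. Here is a minimal sketch of the two specifications on an invented panel; it illustrates the general idea only and is not the NRC's or Lott's actual code or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Invented state-year panel (illustration only, not the NRC data).
adoption = {"A": 1990, "B": 1992, "C": 9999, "D": 9999}  # C and D never adopt
rows = []
for state, adopted in adoption.items():
    for year in range(1985, 2000):
        years_since = max(0, year - adopted)
        rows.append({
            "state": state,
            "year": year,
            "law": int(year >= adopted),
            "years_since": years_since,
            # crime falls a little more each year the law has been in force
            "crime_rate": 500.0 + 10.0 * (year - 1985)
                          - 3.0 * years_since + rng.normal(0.0, 5.0),
        })
df = pd.DataFrame(rows)

# "Dummy variable" model (in the spirit of NRC Table 6-1): a one-time shift after adoption.
dummy_model = smf.ols("crime_rate ~ law + C(state) + C(year)", data=df).fit()

# "Trend" model (in the spirit of NRC Table 6-2): the law changes the slope over time.
trend_model = smf.ols("crime_rate ~ years_since + C(state) + C(year)", data=df).fit()

print("one-time shift estimate:", dummy_model.params["law"])
print("post-law slope estimate:", trend_model.params["years_since"])
```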

Concluding that Lott's data is flawed is also odd when Donohue spoke so kindly of the NRC data in his NPR debate with Lott. If Donohue really does like the NRC data, he must concede one of two things. One, the NRC and Lott used the same data, so Lott's data is clean. Or two, Lott originally had bad data, but this 2012 study uses a better technique for determining the effects of CCW, and its conclusion (that CCW lowers crime) therefore stands. Either way, he either admits that CCW decreases crime or admits that his hypothesis is weaker than he leads others to believe. I would also note that James Q. Wilson, a member of the NRC committee who examined the data the NRC used, was completely justified in arguing that CCW decreased crime and that the data is not corrupt.

To conclude on this study:

a) its data is highly flawed, either omitting relevant data or including data that does not belong, both of which severely harm the results and invalidate the conclusion.
b) its own data supports the more-guns-less-crime hypothesis.

Reference: The paper can be found here: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2026957


Two studies published in 2012

1) The first one I recently cited. The first study's abstract: "In a recent article, Aneja, Donohue and Zhang claim that they are unable to replicate the regressions published by the National Research Council in Chapter 6 of Firearms and Violence. They conclude that the NRC regressions must have been based on bad data supplied by John Lott. The implication is that earlier studies that found that right-to-carry laws reduced crime were flawed because of bad data. However, we can replicate the NRC results with Lott’s original data and with the data set used by the NRC. The earlier studies are not flawed by bad data."

2) The second study is written by John Lott alone. He surveys the existing research and finds that 18 studies find deterrence, one (only one) finds an increase in crime, and 10 claim to have found no effect (which is false; Lott, in the 2010 edition of More Guns, Less Crime, shows their data actually supports a decrease). Note that he is only counting top-tier studies here (the 2003 A and D article is not in the refereed category, so it is not counted in these results). But if we count ALL of the studies, including those in non-refereed journals, 21 find deterrence, 11 find no effect, and 3 find an increase in crime. Because this survey was written before his other 2012 study, adding that one brings the count to 22 finding deterrence, double the number claiming no effect and over 7 times the number claiming an increase. Lott concludes, "In state after state when right-to-carry laws have been adopted, the entire debate [on whether or not crime will explode] quickly becomes a non-issue within a year after the laws are passed."
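
Since tallies like these are easy to garble, here is a trivial check of the arithmetic using the counts quoted above (all studies, refereed or not, with the later 2012 replication paper added to the deterrence column):

```python
# Counts quoted above: all studies (refereed or not) from Lott's survey,
# with the 2012 replication paper added to the deterrence column.
deterrence = 21 + 1   # 22
no_effect  = 11
increase   = 3

print(deterrence / no_effect)   # 2.0  -> "double the number claiming no effect"
print(deterrence / increase)    # ~7.3 -> "over 7 times" the number finding an increase
```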

Conclusion:

Overall, it seems obvious that the argument that CCW leads to large increases in crime is beginning to crumble, while the case for passing concealed carry laws grows stronger with every passing day. I hope Americans, politicians, and judges wake up to these facts, conclude that CCW is a good idea, and pass it in every state. And states that currently restrict teachers (and others) from carrying in their workplaces should consider changing those laws.
 
