For months John Lott had tried to get a response from the Department of Justice and the authors of an FBI report on “active shooters” regarding errors that the CPRC had found. Lott then published our research in the March 2015 issue of ACJS Today, the journal of the Academy of Criminal Justice Sciences, pointing out how the errors mistakenly led to the conclusion that attacks were increasing over time. That finally got a response last week from two of the FBI study’s authors, Pete Blair and M. Hunter Martaindale. As to the data errors in the FBI report, here is really all that Blair and Martaindale write (p. 34):
We acknowledge in the FBI report that our data are imperfect. Even “official” data have substantial issues, most of which have been thoroughly dissected by scholars (see Kelling, 1996; Stephens, 1999; Wolfgang, 1963). However, we believe we have collected the best data currently available on active shooter events. We are also constantly trying to improve our data. As new cases come to our attention, they are vetted and incorporated, as appropriate (we are doing this with the cases identified by Lott). . . .
In conclusion, because official data did not contain the information we needed, we had to develop our own. This required choices between various options with different strengths and weaknesses. While our data is imperfect, it nonetheless represents the best attempt to date to comprehensively capture active shooter events. Because it is the best available data, it can help inform response procedures and hopefully help save lives. . . .
There is absolutely no defense offered in response to the data problems that Lott pointed out in his piece.
Jason Riley’s article in the Wall Street Journal describes their response:
. . . But late last week, J. Pete Blair and M. Hunter Martaindale, two academics at Texas State University who co-authored the FBI report, acknowledged that “our data is imperfect.” They said that the news media “got it wrong” last year when they “mistakenly reported mass shootings were on the rise.”
Mind you, the authors did not issue this mea culpa in the major news outlets that supposedly misreported the original findings. Instead, the authors published it in ACJS Today, an academic journal published by the Academy of Criminal Justice Sciences. . . .
John Lott of the Crime Prevention Research Center—who has studied FBI crime data for three decades—told me in an interview that the FBI report is better understood as a political document than as a work of serious social science. For example, the authors chose the year 2000 as their starting point “even though anyone who has studied these trends knows that 2000 and 2001 were unusually quiet and had few mass shootings.” Data going back to the mid-1970s is readily available but was ignored. How come? Over the past 40 years, there has been no statistically significant increase in mass shootings in the U.S.
Another problem with the study: The data used seemed selectively chosen to achieve certain results. The researchers somehow “missed 20 mass-shooting cases,” Mr. Lott said. “There’s one case where nine people were murdered. You just don’t miss that.” Also, the omissions helped create an “upward trend, because they were primarily missed at the beginning of the period.” This, he said, “is disturbing.”
Mr. Lott told me that he had reached out repeatedly to the FBI and to the authors for an explanation after the original report came out, but none was forthcoming until last week. The Journal recently described Mr. Obama’s tenure as the “least transparent administration in history,” and the White House seems to have no interest in proving its critics wrong. . . .
The Media Research Center also had a very detailed discussion of the history of this false FBI report.
Media Matters has put out its typical misinformation, this time on Jason Riley’s piece about the now-acknowledged errors in the FBI report on public shootings. As I wrote last October in a post at the Crime Prevention Research Center:
While the FBI report provides graphs illustrating “active shooting incidents,” not mass shootings, given the way that the report was written, the media has understandably interpreted the report as implying that mass public shootings have massively increased over time. . . .
The bottom line here is that neither Media Matters nor the authors of an FBI report on public shootings were able to defend the data used in that report, so they focus on a red herring: whether the media got confused about the report being on mass public shootings. Media Matters doesn’t even try to defend the 20 missing mass shooting cases that clearly should have been included in the Blair and Martaindale data set. Nor does Media Matters try to defend the way that the FBI report measured active shooter cases.
Jason Riley explicitly mentions that Blair and Martaindale complain the media misunderstood their point about mass shootings versus active shooters (Riley writes: “[Blair and Martaindale] said that the news media ‘got it wrong’ last year when they ‘mistakenly reported mass shootings were on the rise’”). Riley provides a direct link to their piece in ACJS Today, and John Lott’s report explicitly spends a great deal of time discussing the problems with their measure of “active shooters.” Whether or not one wants to lump together mass public shootings with cases where a shot is fired in public and no one is hit, Lott’s point was that Blair and Martaindale’s measure of active shooter cases is biased toward picking up recent cases and thus tends to make it look as if public shootings increased over time.