School Shootings That Weren't: An Unsurprising Result

NPR’s Morning Edition story “The School Shootings that Weren’t” investigated a statistic reported in the U.S. Department of Education’s School Climate and Safety Report for 2015-2016: “Nearly 240 schools (0.2 percent of all schools) reported at least 1 incident involving a school-related shooting.” NPR found, by contacting the schools with reported shootings, that two-thirds of them did not, in fact, have a shooting. NPR was able to confirm only 11 shootings among the 240 schools.

What’s going on here? How could the statistic reported by the Department of Education be so far off?

There’s a simple statistical explanation. The statistic comes from the Department of Education’s 2015-2016 Civil Rights Data Collection. All public schools were asked to submit data between January and June 2017—a lot of data.

The school form has 79 pages of instructions for supplying the requested information on student and faculty characteristics, course and program enrollment, absenteeism, disciplinary actions, retention, sports participation, and other items “related to the nation's public school districts and elementary and secondary schools’ obligation to provide equal educational opportunity.”

Item 27 concerns school shootings, with the question:

[Image: Question 27 from the school form]

Remember, a school staff member is providing the answer to this and the hundreds of other data items requested. It is inevitable that some errors will be made: sometimes the incorrect box will be checked, or the answer to another question will be entered on that line.

Let’s look at a hypothetical example to see how data entry errors can affect the statistic. For simplicity, let’s suppose that there are 100,000 schools in the data, and 50 of them had real shootings.

Now suppose that one-tenth of one percent of the items have errors. That means we would expect one-tenth of one percent of the 99,950 schools without shootings to erroneously report that they had a shooting (about 100 schools). Of the 50 schools with shootings, we'd expect one-twentieth of a school (0.05) to have an error; rounding to the nearest whole number (zero), let's say all schools with shootings report the item accurately.

Here are the numbers in a table:

                           Had a shooting   No shooting      Total
  Reported a shooting                  50           100        150
  Reported no shooting                  0        99,850     99,850
  Total                                50        99,950    100,000

A total of 150 schools in this hypothetical example report a shooting, but two-thirds of the reports are from schools that did not have one.
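The arithmetic behind the hypothetical example is short enough to check directly. Here is a small Python sketch using the numbers above (100,000 schools, 50 real shootings, a 0.1 percent error rate):

```python
# Hypothetical numbers from the example above.
n_schools = 100_000
true_shootings = 50
error_rate = 0.001  # one-tenth of one percent

no_shooting = n_schools - true_shootings        # 99,950 schools without a shooting
false_reports = no_shooting * error_rate        # expected erroneous reports: 99.95, about 100
missed = true_shootings * error_rate            # expected 0.05, rounds to zero

total_reported = true_shootings + false_reports # about 150 schools report a shooting
share_false = false_reports / total_reported    # about two-thirds of reports are errors

print(round(false_reports))      # -> 100
print(round(total_reported))     # -> 150
print(round(share_false, 2))     # -> 0.67
```

The two-thirds figure falls straight out of the expected counts: roughly 100 of the roughly 150 reported shootings come from schools that had none.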

This is a simple consequence of an error rate, which, though small, is not zero. In this example, errors are equally likely to occur in either direction. But there are waaaaaay more schools without shootings than schools with shootings, so there are a lot more opportunities to mistakenly report a school shooting when one did not occur.

Statisticians call this a false positive result. False positives frequently occur in medicine, particularly when screening the population for diseases. Mammograms detect most instances of breast cancer, but about 7 to 12 percent of mammograms (the higher percentage is for younger women) have a false positive result—they show a cancer when there is none.

That is why positive results from mammograms are followed by more, and different, tests. And why error rates should be considered for statistics counting the number of rare events. Even a very low false positive rate can mean that the majority of reported events are false reports.
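The point generalizes: whenever the event is rare, the share of reports that are false depends on both the error rate and the rarity. A short sketch (using the school numbers from the example above, with hypothetical error rates) shows that even cutting the error rate by a factor of ten still leaves a sizable fraction of reports false:

```python
def share_of_reports_false(n_units, true_events, fp_rate):
    """Expected fraction of reported events that are false positives,
    assuming units without the event err at fp_rate and all true
    events are reported correctly (the simplification used above)."""
    false_pos = (n_units - true_events) * fp_rate
    return false_pos / (true_events + false_pos)

# 100,000 schools, 50 real shootings, a range of hypothetical error rates:
for rate in (0.001, 0.0005, 0.0001):
    print(rate, round(share_of_reports_false(100_000, 50, rate), 2))
# 0.001  -> 0.67  (two-thirds of reports are false)
# 0.0005 -> 0.5   (half of reports are false)
# 0.0001 -> 0.17  (still one report in six is false)
```

The function and the error rates other than 0.001 are illustrative assumptions, not figures from the Department of Education data; they simply show how slowly the false share shrinks when the true events are rare.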

Copyright (c) 2018 Sharon L. Lohr
