
The Truth Behind NJ’s Student Safety Data: When Safer Schools Are Just Safer Numbers
November 24, 2025
John Migueis is a Licensed Clinical Social Worker and the administrator of NJ21st.com, a platform dedicated to analyzing and reporting on municipal, education, and state issues affecting New Jersey’s 21st district.
Ed. Note: The NJ Department of Education has not responded to a request for comment.
In 2023-24, the New Jersey Department of Education reorganized its Student Safety and Discipline Report, and at first glance the numbers look encouraging: fewer total incidents, fewer police referrals, fewer weapons. But as is so often the case with state reporting, especially from the NJ DOE, a closer look leads to a very different conclusion. It also raises bigger questions about why we continue to spend money on increased police presence that is disconnected from research and produces no meaningful outcomes, while needed services like high-impact tutoring remain underfunded or unfunded.
For the first time, the DOE is placing far less emphasis on the outcomes of incidents such as suspensions, removals, and expulsions, moving them out of the upfront charts and into a smaller section of the report. While these counts still exist, they are no longer connected, visually or narratively, to the broader picture. At the same time, student arrest data was removed entirely, despite being included in previous years.
The way the data is presented in the new report has changed, and key context that once made the discipline landscape clearer has become harder to see. As with changing proficiency benchmarks or assessment models, the way data is framed can create the appearance of improvement even when underlying conditions have not changed.
The Data
From 2018-19 through 2022-23, several indicators climbed steadily: police reports increased from 7,799 to 10,082, mandatory referrals increased from 3,629 to 4,842, suspensions ranged from 52,135 to 61,132, and weapons incidents increased from 924 to 1,537. For all the police, these numbers show increased stress on schools, not improved safety.
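To put those raw counts in relative terms, here is a quick sketch (ours, not the DOE's) that computes the percent change each figure represents over the 2018-19 to 2022-23 window; the start and end values are the ones cited above.

```python
# Percent increases implied by the DOE figures cited above (2018-19 vs 2022-23).
# The start/end values come from the report; the helper itself is just an illustration.
figures = {
    "police reports": (7799, 10082),
    "mandatory referrals": (3629, 4842),
    "weapons incidents": (924, 1537),
}

def pct_change(start, end):
    """Percent change from start to end, rounded to one decimal place."""
    return round((end - start) / start * 100, 1)

for label, (start, end) in figures.items():
    print(f"{label}: {start:,} -> {end:,} (+{pct_change(start, end)}%)")
```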
NJ school districts have steadily increased spending on school police, Class III officers, security staff, camera networks and physical security upgrades over the past decade.
Our own 7-District dashboard, based on the most recent ACFRs, bears this reality out. Seven districts, among some of the wealthiest and safest, spent over $2 million combined, a 52% increase in security spending in just one year.

Pulling just one town from the Dashboard, Berkeley Heights, a small NJ community of 13,000, we can see the exponential spend in security despite declining enrollment.


These numbers aren’t enough for the Mayor, Town Administrator and, oddly enough, the former Recreation Commissioner? They advocated for no caps and for the District to cover police benefit time. Yet with all this investment, the data do not show a reduction in the types of incidents police presence is supposed to deter, which should not surprise anyone because it lines up with what the research on police in schools has shown for the past 20 years. Between 2018-19 and 2022-23, police referrals rose by nearly 30 percent, mandatory referrals increased every year, weapons incidents continued climbing, and fights and assaults increased even before the pandemic disruptions subsided.
More policing did not generate fewer problems. In many cases it coincided with more. And as the data suggest, when staff become more comfortable with police presence in schools, they are also more likely to send children to them.
In 2023-24, the numbers suddenly dropped:
- Total incidents fell to 33,526, down from 36,039.
- Weapons incidents dropped to 1,146.
- Mandatory referrals fell to 3,813, though still higher than pre-COVID levels.
These numbers suggest a safer environment until you look beneath the surface.
- Fights increased (6,265 compared to 6,024).
- Marijuana remained overwhelmingly the most common substance (5,309 incidents).
- Restraint and seclusion affected 2,747 students, disproportionately preschoolers and children with disabilities.

Indicators didn’t improve; in some important areas they worsened.
The decline in total incidents was driven by reductions in a few categories, specifically weapons and certain substance incidents. It did not represent a broad improvement in school safety. Because the DOE shifted how the data is structured, it is now harder to understand how incidents connect to discipline outcomes. Important information is missing or less visible in the public charts: arrest data disappeared entirely, the category “Other Incidents Leading to Removal” disappeared from the summary charts, and disciplinary outcomes were relocated rather than highlighted, making comparisons harder.
The DOE appendix lists 59,877 suspensions, 1,099 removals, and 18 expulsions, but these figures are no longer tied to the incident totals in the main charts. This change in framing alters the public’s perception even when the raw numbers show little improvement, or worsening trends.
The DOE might say the missing arrest data wasn’t deleted, just reorganized. They might point out that suspensions, removals, and expulsions still show up in the Disciplinary Actions section, which is technically true.
But the bigger issue is what happens when you take discipline data out of the main charts and remove arrests entirely from the very report that is supposed to track school safety statewide.
That’s not reorganizing; that’s rolling transparency back.
Having incidents show up in one place and the consequences for those incidents somewhere else breaks the link between the two. The public is left with a safety report that shows a clean downward curve of “incidents,” while the reality of how schools are responding to those incidents is tucked away into a small and disconnected section most folks won’t find.
It also forces people to flip between multiple documents just to answer the most basic question of all: “Did schools become safer, or did we just arrest fewer students for the same behaviors?”
When you separate the behavior from the consequence, you’re not informing the public; you’re managing the narrative. We’re watching the same pattern play out in academic data: first it was changes to the NJGPA cut scores, and now it’s adaptive testing and reporting shifts that no one has really explained.
For families, educators, and policymakers, the message is pretty simple: safer schools require more than safer statistics. The public should pay close attention to what happens next with proficiency testing and reporting, because superintendents and boards of education are staying unusually quiet.



