COMMENTARY: New ‘Report’ on Camden Students’ Academic Growth Fails the Smell Test
May 17, 2021
Last week New Jersey Policy Perspective (NJPP) released a “report” that tries to prove the 2013 state takeover of Camden Public Schools didn’t result in improved student academic performance. The three authors, all from Rutgers University, skew data, ignore facts, and construct a narrative that is less a quantitative analysis of student performance in Camden over the last eight years than an attack on the schools parents there prefer: traditional charter schools and hybrid district/charter renaissance schools. In doing so, they undermine Rutgers, NJPP, and their own scholarly reputations.
This is not really a surprise. NJPP is largely funded by the New Jersey Education Association, whose executives are long-time foes of charter schools. Two of the authors, Michael Hayes and Pengju Zhang, are experts in budgeting and finance, not student growth. And the third author, Julia Sass Rubin, is the founder of Save Our Schools-NJ, the Princeton-based anti-charter/accountability group that draws its members primarily from wealthy N.J. suburbs. She’s also an NJPP trustee. Rubin made news when she told a reporter that parents in Newark and Camden “don’t have the bandwidth to even evaluate charter schools.” (One Newark mom, Crystal Williams, responded, “Who is Julia Sass Rubin and what does she have against my kids?”)
As such, readers would be wise to take the authors’ conclusion, “The analysis finds no evidence that state control of the district improved standardized test scores in Camden,” with a grain of salt.
Or, better yet, make it a barrel.
Here’s why.
The authors are forced to concede that student proficiency rates have, in fact, risen since the 2013 state intervention, which contradicts their primary thesis. (Maybe that’s the scholarly influence of Hayes and Zhang.) Throughout the report there are acknowledgments of Camden students’ actual academic gains, like “As shown in Figure 1, Camden had positive growth in standardized test scores for all subjects” and “Figure 2 shows a positive spike in 11th grade ELA test scores for Camden City schools starting in the first year of the state takeover.”
There are three graphs. Each shows that student achievement, as measured by state standardized test scores, went up after 2013. Each shows that the gap between student performance in Camden and student performance in other Abbott districts has narrowed over time in every subject, as Camden schools improved at a faster rate than schools in other Abbott districts.
Need more evidence of this confluence of truth-telling and data-spinning? How about the fact that over the last four years students in Camden public schools (district, renaissance, and charter) have improved their proficiency in reading and math? How about the fact that the district graduation rate has risen 20 points since the state took over in 2013? How about the fact that the high school drop-out rate is down by almost 50% since the state intervention?
It’s true that Camden hasn’t yet caught up with other state takeover districts like Newark. But it has been under state oversight for the shortest time and started at an abysmal point. In 2013, 23 of the state’s 26 “priority” schools (the bottom 5%) were in Camden, and just three high school seniors reached the benchmark for “college and career-ready.”
Now? A recent Stanford report found that by 2017 Camden students were achieving roughly 85 days more of learning in math and 30 days more in reading than they had two years earlier. And a recent survey of Camden parents showed that 70% believed the public schools available to their children have improved over the last 5 years. (Twelve percent said they are worse.)
Perhaps the authors’ failure to prove their thesis is why this “report” is atypically short, omits a standard methodology section, and wasn’t peer-reviewed. (There is a cursory explanation of how they accounted for the state’s 2015 switch from basic-skills standardized tests to grade-level PARCC tests: “to compare test scores across school years, the y-axis represents normalized test scores by converting each school’s test score into standard deviations from the average performing New Jersey school.” But we’re still comparing apples and oranges.)
Or perhaps this is just another hit job, provoked by declining enrollment in Camden’s traditional district schools now that almost 60% of the city’s students attend renaissance and charter schools.
I have no problem with people who share their opinions about local control and what’s best for students. I do have a problem with people who cloak a blatantly political agenda as scholarly data analysis. Maybe Rutgers University needs a state intervention.