N.J.’s Score on Data Quality
November 25, 2009
The Data Quality Campaign has posted its Annual Progress Report on State Data Systems, which measures each state’s development of robust longitudinal data systems to accurately track student progress. The Report calls national progress “remarkable”: more and more states are reaching the ten “elements” the Campaign deems necessary for valid and reliable decision-making in education. New Jersey is one of 31 states that have met at least eight of the ten elements. (Eleven states have all ten.) We’re missing “Statewide Teacher Identifier with a Teacher-Student Match” and “Student-Level Course Completion Transcript Data.” Not too shabby, although the former is a Race To The Top criterion. Here’s N.J.’s evaluation.
3 Comments
It's striking that the states with the best data systems are generally those with the worst education systems – UT and WA included in that (going by the map of the number of data elements available). They include the two lowest-effort states on ed funding (LA and DE), states with fewer than 80% of their school-aged kids even attending public or public charter schools (LA and DE), and states with the lowest overall funding and lowest overall outcomes to match (even controlling for poverty variation). It's as if the assumption is that good data will trump good education. Not likely.
That said, I anxiously await the day that NJ gets on the ball with better data. I continue to do most of my work on other states because of NJ's data quality. Still, the data quality report doesn't seem to identify the states that I believe have much better and more useful data than others.
What do you think N.J. needs to do to get better data? Is it the two weaknesses cited in the Data Quality report or something else?
New Jersey needs to link its teachers over time with unique IDs, and it needs to link teacher personnel records (for currently practicing teachers) to their academic credential records, so we can figure out the differences in the academic backgrounds and preparation of teachers and principals across NJ schools.
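The linkage described here is essentially a left join on a statewide unique teacher ID. A minimal sketch of that idea, assuming purely hypothetical IDs, schools, and field names (none of these reflect actual NJ records):

```python
# Hypothetical personnel records for currently practicing teachers,
# keyed on a statewide unique teacher ID (all data is illustrative).
personnel = [
    {"teacher_id": "T001", "school": "Newark HS", "year": 2009},
    {"teacher_id": "T002", "school": "Camden MS", "year": 2009},
    {"teacher_id": "T003", "school": "Trenton HS", "year": 2009},
]

# Hypothetical academic credential records, keyed on the same unique ID.
credentials = {
    "T001": {"undergrad": "Rutgers", "route": "traditional"},
    "T002": {"undergrad": "TCNJ", "route": "alternate"},
}

# Left-join-style linkage: keep every practicing teacher, and attach
# credential data wherever the unique ID matches. Unmatched teachers
# (None fields) reveal gaps in the credential data.
linked = [
    {**row, **credentials.get(row["teacher_id"], {"undergrad": None, "route": None})}
    for row in personnel
]
```

Without a stable unique ID carried across both files and across years, this kind of merge (and any analysis of how teacher preparation varies across schools) simply isn't possible.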
Then there's the issue of the quality of the assessment data, which we should learn more about as NJSMART gets up and running and accumulates a few years of data. As of now, I am not comfortable with the precision, accuracy, reliability, or validity of NJ assessments, and NJDOE has certainly done little to establish their predictive validity. We need more immediate access to student-level scale scores, not this lump proficiency category stuff.
Of course, the final issue that complicates analyses of NJ data is the various configurations and sizes of NJ districts. Typically, when conducting statistical tests across different types of schools or districts, one has to use dummy variables to cut the data into the various schooling configurations. NJ, of course, has too many and little excuse. NJ also has too many schools and districts that are too small to report any subgroups in the aggregate data – hence the need for unmasked student-level data to do almost any analysis. No excuse. And NJ is so highly segregated that many districts just don't have subgroups. They're all black or all white – right next door to each other.
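The dummy-variable approach mentioned above can be sketched in a few lines. This is a generic illustration of indicator coding for grade configurations, with made-up districts and categories, not actual NJ configurations:

```python
# Illustrative districts with differing grade configurations.
districts = [
    {"district": "A", "config": "K-8"},
    {"district": "B", "config": "K-12"},
    {"district": "C", "config": "9-12"},
    {"district": "D", "config": "K-12"},
]

# One indicator (0/1) column per configuration, dropping the first
# category as the reference to avoid perfect collinearity when these
# dummies enter a regression alongside an intercept.
configs = sorted({d["config"] for d in districts})
for d in districts:
    for c in configs[1:]:
        d[f"is_{c}"] = 1 if d["config"] == c else 0
```

Every additional configuration eats a degree of freedom and fragments the comparison groups, which is why a state with an unusually large number of tiny, oddly configured districts is harder to analyze than one with a few uniform ones.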
NJDOE is also very slow to release the detailed (line by line) annual financial report data. The finance staff are easy to work with though, and responsive.
I could go on, but my major concern is the first one – academic credentials of teachers and administrators. Not so surprisingly, a large body of good research indicates that strength of academic preparation matters – and that it is disparately distributed. I've got a new article coming out using Missouri and Wisconsin data on this topic soon.