Yesterday I reacted to the news, revealed during Wednesday’s State Board of Education meeting, that student achievement, as measured by standardized tests, flat-lined after four consecutive years of growth. I also discussed Education Commissioner Lamont Repollet’s proposal — already in progress despite Board members’ admonitions to Repollet last Spring to not lower standards — to, well, lower standards by creating easier tests in fewer grades. I noted that I didn’t have the data on achievement gaps.
I have the achievement gap data now and I’d like to offer some hypotheses for why our children’s steady gains in proficiency in English Language Arts (ELA) and math have started to backslide.
Let’s start with the achievement gaps. Here, I’m looking at the DOE data (no link, sorry) in ELA and math over the last five years of PARCC testing. I’ve compared two sets of subgroups: white students and Black students, and low-income students and all other students. (There is data for Asian students; in every category they scored higher than white students but let’s leave that be for now because Asian students comprise only about 10% of NJ students. There is also data for Latinx students; they scored better than Black students and worse than white students.)
The first year we administered PARCC tests, in 2014-15, the achievement gap in ELA between white students and Black students was 29.3 percentage points. In 2018-19 the achievement gap for the same groups was 29.2 points — a difference of just 0.1 points, which is essentially no change.
In 2014-15 the achievement gap between low-income students and everyone else in ELA was 19.1 points. Last Spring it was 18.1 points. So one point better. Your call whether that change is significant.
Now let’s look at math. In 2014-15 the achievement gap between white and Black students was 27.6 points. In 2018-19 it was 31.7 points. That means the achievement gap widened by about 4 points — a substantial change, and not a good sign.
Regarding low-income students and everyone else in math, in 2014-15 the achievement gap was 17.1 points. In 2018-19 it was 18.4 points. So that’s a little worse.
I’ll note that this data on achievement gaps tracks with what we call the “gold standard” of student assessments, NAEP testing, also known as “the Nation’s Report Card.”
Now let’s get to less data-driven discussion: Why has student achievement stalled in just this past year? And what is behind Repollet’s crusade (backed by Gov. Murphy) to lower standards and interfere with our ability to measure student growth, the most basic form of accountability for a Department of Education?
Gov. Murphy was inaugurated in January 2018 but during his campaign (and we all knew he’d beat Republican candidate Kim Guadagno) he was hailed by NJEA leaders as an ally. In fact, at the 2016 NJEA Convention, he swore to members that he would “get rid of PARCC Day 1.”
Additionally, during his campaign Murphy made clear his disdain for linking student academic growth to teacher evaluations, which was put in place — with the transient support of NJEA bosses — during negotiations over the 2012 “Teacher Effectiveness and Accountability for the Children of New Jersey (TEACHNJ) Act.” The law requires “a provision ensuring that performance measures used in the rubric are linked to student achievement” — at the time, 30% of a teacher’s evaluation was linked to student outcomes, a share later temporarily reduced to 15% — but Murphy made it clear that he intended to lower that percentage even further.
He did: Now only 5% of teacher evaluations are linked to student growth, rendering this form of accountability meaningless. School staff knew this was coming down the pike and, while certainly the vast majority of NJ teachers do everything they can to improve student learning, the decimation of TEACHNJ may be a factor in the drop in student outcomes.
Now, of course, Murphy was unable to fulfill his promise to immediately dump PARCC. (Shortly after his inauguration he was asked by a reporter about the process of eliminating PARCC exams. Murphy conceded, “the answer to the logistics of how it’s done, honestly, I don’t know.” I explain the process here.) In fact, the current standardized tests in NJ, renamed the New Jersey Student Learning Assessments, are still PARCC; we own the bank of questions and continue to use them. And that’s why Repollet’s plan to spend the next two years creating new tests — which will look just like PARCC because tests have to be aligned with course content — is a waste of millions in taxpayer dollars.
Regardless, last year districts knew PARCC was a goner, at least in its original form. Major modifications were made to placate (white, upper-class) anti-testers and their lobbyists, including NJEA, Education Law Center, and Save Our Schools-NJ. So why care about thoroughly covering course content if the tests were headed for the scrap heap?
This political stance — not an educational one — may have had an impact on the collective behaviors of students, parents, teachers, and principals. For comparison, consider Washington, D.C., which has maintained PARCC assessments for four years.
There, student outcomes continue to increase each year. “We are seeing progress,” D.C. State Superintendent of Education Hanseul Kang said last month. “This data is particularly valuable because we know that this assessment is truly a high-quality one that measures real-world skills like problem solving and critical thinking.”
Students at almost every neighborhood high school in the traditional public system improved on the English portion of the exam. The passing rate on the English exam at Anacostia High, where more than 80 percent of students are considered at-risk, jumped 8 percentage points, to 12.5 percent.
At Coolidge and Dunbar high schools, two campuses with high populations of at-risk students and those with special-education needs, passing rates more than doubled in English, with 10 percent passing at Coolidge and 15.8 percent at Dunbar.
When NJ was committed to accountability, to honest examination of student growth and how to improve student proficiency, our scores rose each year, just like in D.C. We have four years of data to prove it. But Murphy/Repollet’s aversion to transparency coincides with this year’s sudden dip in our children’s academic growth as measured by standardized tests.
It’s too soon to know if this is correlation or causation. At the very least, it’s troubling.
In a guest post this past July, State Board of Education Vice President Andrew Mulvihill wrote, “When you provide standards, choice, and accountability, miracles happen for children.”
We’re not seeing miracles. We’re seeing regression. That can happen when you have state leaders more focused on optics than children, when decisions that should be driven by student academic need are instead driven by lobbyists and political debt. Over the next two years, while we spend taxpayer money to duplicate what we already own, we’ll see more PARCC data. I hope we see achievement gaps narrow. I hope we see increased levels of proficiency. And if wishes were horses, beggars would ride.