
A Reason to Celebrate the ESSA State Plans

Admit it—you’re reading Every Student Succeeds Act (ESSA) plans and feeling like state after state has let you down in some way. While no state will ever please everyone, I believe there is at least one area in which states are exceeding expectations: data. As I read, I’m continually struck by the way states are using information from their data systems to craft more nuanced plans.

Think about it—a decade ago, states collected only aggregate (not student-level) data from districts, and linkages (like student to teacher and K–12 to postsecondary) were nascent at best. State policymakers were therefore limited in their ability to dig into the data and model different policy options to understand how their decisions would play out on the ground (where it matters!). Data systems were a blunt instrument rather than the nimble tool they are today.

Fast forward to 2017, and ESSA has created an opportunity for states to demonstrate exactly how they’re using their state data systems to inform goal setting, develop richer indicators of student learning, understand the impact of n-size decisions, and ensure equity for all students. With these data practices out in the open, policymakers across all states can now have more robust conversations about their plans. The following are some examples of what that looks like in practice (with no judgments about the quality of these states’ plans, analyses, or indicators):

  • In its plan to ensure equitable access to teachers, Tennessee examined student math and English assessment results to develop “equity gap” analyses based on the percentage of individual subgroups with access to highly effective teachers.
  • To mitigate some of the challenges associated with a large number of very small schools, Vermont conducted analyses to develop two additional subgroups (historically marginalized and historically privileged students), an equity index, a second tier of district level accountability, and an interesting approach to n-size.
  • With an eye toward performance management, New Jersey outlines a plan to leverage its chief intervention officer, data governance system, and supplemental data collected through its state data system to provide better support to schools and districts throughout implementation.
  • Of the 16 states (including DC) that have submitted plans, every single one includes at least one indicator requiring longitudinal data. All include the four-year adjusted cohort graduation rate, and each also includes at least one of the following: student growth data, postsecondary enrollment, ninth-grade on-track status, or chronic absence.

As data use continues to evolve from hammer to flashlight—that is, from accountability only to shining a light on what works—is it possible we’ve also created a more sophisticated hammer along the way? These ESSA plans suggest that we have. And that is what it looks like when we make data work for students.