Note: The Data Quality Campaign has convened a working group to develop a fuller brief on this issue, including recommendations, to be released this summer. Stay tuned for more on two-year growth.
This post is by Brennan McMahon Parton, director of policy and advocacy at the Data Quality Campaign, and Chad Aldeman, senior associate partner at Bellwether Education Partners.
How much students learn over time is the most equitable measure of student performance we have, revealing trends and patterns that status measures like proficiency rates might miss. Student growth information will be critical to understanding the extent of the impact of current school closures and to informing recovery efforts.
Student growth does not measure student performance at just one point in time. Instead, it provides the best estimate of school and district contributions to student learning. States recognize and value the importance of measuring student growth: all but two states include growth in their accountability systems for elementary and middle schools, and 20 do so for high schools. Our (forthcoming) annual review of report cards also found that 43 states included growth data, up from 39 in 2019. Parents and advocates recognize the value of growth data too and are calling for more.
In normal times, growth measures typically track how much students learn over the course of a single year. But with states canceling their 2020 spring assessments, there are valid questions about whether states can and will continue to measure student growth. Fortunately, it is possible to measure growth over two years’ time.
States can measure student growth in 2021 using data they already have, if they think creatively. While the process isn’t the same as normal, states can produce valid, usable estimates of student growth. Without 2020 assessment data, states can instead use student data from the 2019 and 2021 spring assessments. This approach works with many types of growth models (e.g., value-added models and student growth percentiles), and the resulting data can still be disaggregated by student group – which will be especially important as states and districts seek to understand how the crisis affected students differently across communities.
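As a rough illustration of what a two-year growth calculation can look like, the sketch below computes simplified student growth percentiles directly from spring 2019 and spring 2021 scores, skipping the missing 2020 year. This is a minimal toy model under stated assumptions, not any state's actual methodology: production SGP models condition on exact prior scores via quantile regression, whereas this sketch bins students into coarse prior-score deciles, and all function and variable names are illustrative.

```python
def two_year_sgp(records):
    """records: list of (student_id, score_2019, score_2021) tuples.

    Returns {student_id: growth percentile, 0-99}, where each student's
    2021 score is ranked only against peers in the same 2019 score band.
    (Toy model: real SGP methods use quantile regression on prior scores.)
    """
    scores_2019 = sorted(r[1] for r in records)
    n = len(scores_2019)

    def band(score):
        # Decile of the 2019 score distribution (0 through 9).
        rank = sum(1 for x in scores_2019 if x <= score)
        return min(9, (rank - 1) * 10 // n)

    # Group students by prior-achievement band.
    by_band = {}
    for sid, s19, s21 in records:
        by_band.setdefault(band(s19), []).append((sid, s21))

    # Within each band, a student's growth percentile is the share of
    # band peers whose 2021 score is strictly below theirs.
    sgps = {}
    for peers in by_band.values():
        outcomes = sorted(s21 for _, s21 in peers)
        m = len(outcomes)
        for sid, s21 in peers:
            below = sum(1 for x in outcomes if x < s21)
            sgps[sid] = round(100 * below / m)
    return sgps
```

The key property this preserves is the one the post relies on: two students who started from the same 2019 point are compared on where they landed in 2021, so the measure reflects two years of growth rather than a single status snapshot, and the per-student results can still be aggregated by school or student group.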
This might seem like uncharted territory, but it’s been done before. States routinely use two-year growth measures when prior-year assessment data is unavailable or unusable. For instance, 20 states hold high schools accountable for student growth, even though many states do not test all students in all grades. Massachusetts, for example, compares a student’s 8th grade test scores with their 10th grade scores to calculate how much the student grew over 9th and 10th grade combined.
Other states have dealt with this issue due to testing disruptions. In Tennessee, for example, the state had technical issues with the rollout of its new online exam in 2016. The following year, the Tennessee Department of Education released guidance that data from 2015 would serve as the baseline for calculating value-added results in the spring of 2017. Research from the SAS Education Value-Added Assessment System found that Tennessee’s two-year growth estimates were highly correlated with its standard one-year growth estimates – making this strategy a viable path forward given the absence of 2020 assessment data.
However, these two-year growth measures should be thought of as a temporary stopgap rather than a replacement for the normal year-over-year calculations.
States should think about how this data should and should not be used. Because it captures how much students have progressed over the course of two school years, states will need alternate business rules to allocate credit across those years, especially for students who attended multiple schools. States may also want to consider treating the spring of 2021 as a new baseline and de-emphasizing immediate stakes or consequences for teachers or school leaders.
As states work to rebuild and recover, the data they collect, calculate, and share matters. States can use growth data to understand how they can best support students.
States can and should use two-year growth estimates to understand how school closures have impacted student success. Having growth data next year will allow states to see the extent of the “COVID slide” among different student groups. States can also compare the effectiveness of schools’ efforts to get students back on track – identifying and promoting best practices, and providing additional support where needed. Sharing this information transparently on report cards and in other public resources will open a dialogue with communities and help parents support educators through distance learning.
Two-year growth measures are not the norm, but they are what’s available and possible for states to use now. These measures have worked before and will work for states to understand where students have grown (or not) during this difficult time, and to share this valuable information with teachers, parents, and communities. States valued growth data before the current crisis and should continue to. State leaders need all the data they can get right now – and two-year growth measures will help districts, teachers, students, and families understand what they need to do to promote recovery.