Research and Reflections

What the Data Can’t Tell Us

Concerns about summative assessment in a pandemic year

Lately, the four of us have been involved in many conversations with our colleagues at SAP about rehumanizing assessment practices. Some of these are inspired by the work of the amazing educators at Newport-Mesa Unified School District, who are working to make their assessments more student-centered (check out the webinar or some of the recent Aligned posts focused on ways that teachers can humanize their assessment practices). It is a tremendously important conversation, as assessment is a piece of the educational system that, for all its potential to support students, has actually harmed many kids for quite some time.

That is why we read the recent guidance from the Biden administration around summative testing with such trepidation. In a year when students, whom our education system is supposedly built to serve, have faced so much disruption, we have serious concerns about the way that summatives, both in their design and in the way we use their results, contribute to the dehumanization of our students. Any concerns about summatives in previous years will only be magnified in this pandemic year.

Why? Well, let’s start with what we know to be true about summative testing. First is the question of what is tested. Summatives are designed to measure what students have learned in school that year in ELA/literacy and math. This means that the blueprints (the documents outlining the design of the test) must align to the standards, as, after all, that is what is taught. Enter 2020. States, districts, and schools had to make decisions about how to get kids learning again, and those decisions often involved prioritizing academic content to meet the unique needs of this year. Limited instructional time has meant that schools prioritized certain content in order to focus on what is most important, and no two classrooms, districts, or states did this in exactly the same way. The National Academy of Education, a leading group of education researchers, recently released a report on this exact issue; for more, check it out here.

Yet we have limited evidence that state tests have made similar adjustments to assessment blueprints. While some states did release revised blueprints earlier in the year, it is unclear whether this happened everywhere, or whether districts and teachers had enough time to ensure that the content they prioritized matched the content the state says is most important. So it is likely that tests will cover content that students did not get this year, through no fault of their own.

And here is one area where serious concerns about this decision come in. Summative testing data can be used in many different ways, some of which make sense in the life of a school and some of which quite simply don’t. At its best, this data can be used at the school or district level to develop a big-picture understanding of how groups of students are doing and to make year-to-year comparisons. Practically speaking, this means it can be used to find places where things went well, to understand which districts and schools need the most support, and then to deploy additional resources to get students what they need.

Yes, this group-level data comes from individual students, but serious risks come into play when it is used to make decisions about individual kids. This practice has the potential to inaccurately label some students. Analysis of test data at the student level often results in kids getting reduced to a “bubble kid” or a “struggling reader” instead of the full, brilliant, amazing humans that they are. These labels are always damaging to kids, but in this disrupted year they have even more potential to cause harm for years to come.

So what is an educator to do? After all, state testing windows open very soon, and some states have said they will test regardless of federal guidance. We have a couple of suggestions for you to consider as you do the hard work of teaching and learning in the upcoming weeks:

  1. If you are welcoming students back to physical buildings for the first time, continue to engage them in content. Students need your love, support, and encouragement, not test prep.
  2. Speaking of test prep, or changing plans to try to cram in content that you haven’t yet covered this year: don’t. In the long run, it is better for your students to develop deep knowledge of the content you have prioritized than to pick up a few tips and tricks that may or may not actually help them on the test.
  3. In your PLCs, start talking about how you will share this data with parents. Parents need to know the limitations of what the state test can tell them about their kid. In a year when there are concerns about validity, you need to plan now for how you will communicate those limitations to parents.

And when that data comes back:

  1. Do not use it to make instructional decisions about individual students: it is limited in what it can tell you, so if you plan to use it to group students for academic supports next year, you may inadvertently place students with wildly different needs in the same group, or worse, keep students from engaging in work they’re absolutely ready for because a couple of test items told you otherwise.
  2. Do not use this year’s summative data as the sole piece of information about how well students mastered grade-level content; it provides an incomplete picture of student academic performance.
  3. Do remember the other sources of data that matter this year. Information from the classroom, such as attendance, assignment completion tracking, listening to students read aloud, listening to students talk about their mathematical thinking, family surveys, and conversations with caregivers about their kids’ interests and learning: all of these provide a more complete picture of student learning needs as we move into the 2021-22 school year.
  4. Do engage students in goal setting and self-reflection about their learning, not about their test scores. Supporting students in reflecting on their individual growth and progress builds their identity and agency as learners.

This year has been anything but easy for teachers and students. While we can’t change whether state tests occur, we do have a say in how they affect our students. Changing our mindsets and getting real about what summative test data can’t tell us is one step we can take toward making responsible, equitable decisions for kids.



About the Author: Katie Keown is a Senior ELA/Literacy Specialist on the Advisory Support team at Student Achievement Partners. She began her career as a Teach for America corps member, teaching middle school reading in rural Mississippi. She later moved on to teach high school English in Illinois, where her work revising the district's ELA curriculum led to a 24% jump in ACT writing scores across the district. Most recently, Katie worked for Houghton Mifflin Harcourt-Riverside, starting as a Content Development Specialist and working her way up to Supervisor of the High School ELA team. There she worked on assessments for district and state clients and helped run the development of assessments aligned to the Common Core State Standards. She holds a bachelor’s degree in English from the University of Illinois.

About the Author: Astrid Fossum is a Senior Mathematics Specialist on the Advisory Support team at Student Achievement Partners. Before coming to Student Achievement Partners, Astrid spent 18 years in mathematics teaching and administration, starting with 10 years as an elementary and middle school teacher in Minnesota, South Dakota, and Wisconsin. She served in district mathematics teacher leadership roles and as the Mathematics Curriculum Specialist for the Milwaukee Public Schools. Astrid has experience teaching pre-service and in-service teachers and administrators at the college level and has served on the board of the Wisconsin Mathematics Council. Astrid holds a bachelor's degree in Art History from St. Olaf College, teaching certification coursework from the University of St. Thomas, and a master's in curriculum and instruction from Viterbo University.

About the Author: Jun Li is a Mathematics Specialist on the Professional Learning team at Student Achievement Partners. Prior to joining the organization, she served as the Math Design Lead at Transcend Education, where she supported partners, such as Achievement First Greenfield and Montessori for All, in designing their math programs and managed a team that designed CCSS-aligned assessments and curriculum. Before that, she served in various network leadership roles at Mastery Charter Schools, including coaching teachers and leaders in math curriculum adoption and implementation, and earlier taught middle school math in Philadelphia, PA. Jun holds a Bachelor's degree in Sociology and Urban Studies and a Master's degree in Education from the University of Pennsylvania.

About the Author: John W. Young is a Senior Fellow for Research with Student Achievement Partners. He previously held positions as Head of Research for the International Baccalaureate Organization, Director of the Higher Education Research Group at the Educational Testing Service, and Associate Professor of Educational Psychology at Rutgers University. In 1999, he received the Early Career Contribution Award from the American Educational Research Association’s Committee on Scholars of Color in Education for his research on the academic achievement of students of color. In 2014, he served as the President of the Northeastern Educational Research Association. In his career, he has authored or co-authored more than 50 peer-reviewed publications and has delivered more than 100 presentations at regional, national, and international conferences. He received his bachelor’s degree in psychology from New York University, his master’s degree in education from Harvard University, and his master’s degree in statistics and Ph.D. in educational research from Stanford University.