On Standardized Tests, Students Face an ‘Online Penalty’
Switching from paper to online standardized tests caused significant drops in students’ scores—and vulnerable students were hurt the most.
Prior to the pandemic, dozens of states had begun offering standardized tests online, a trend that’s expected to grow rapidly as schools adapt to social distancing. But according to a large-scale 2019 study, students who took such tests online lagged behind their peers who took paper tests, performing as if they’d lost several months of academic learning.
The study also found that children from low-income families, English language learners, and students with disabilities were disproportionately harmed by switching to online tests. These students performed considerably worse than similar students who took paper tests, particularly in English language arts.
These findings may give pause to the many schools shifting to online testing. In Ohio, schools are increasingly giving students the option to take state standardized tests online: 65 percent of Ohio schools offered state standardized tests online in 2014, a number that grew to 98 percent in 2017. And in California, 22 percent of students—nearly 700,000—took statewide standardized tests online last year. As the pandemic forces schools to look for viable alternatives to in-person testing, many may shift to online versions, but they should first weigh the impact that shift may have on students, especially marginalized ones.
What We Can Learn From Massachusetts
In 2015 roughly half of the school districts in Massachusetts shifted to an online version of the PARCC test, a national assessment designed to measure student progress, while the other half administered a paper version. News reports and anecdotal evidence frequently pointed to lower scores for students taking the test online. Ben Backes and James Cowan, researchers at the American Institutes for Research, examined whether the drop in test scores could be explained by differences between the tests themselves or by other factors—for example, higher-performing school districts being more likely to offer the test online.
Backes and Cowan found that students who took the PARCC test online scored significantly worse than their peers who took the same exam on paper, performing as though they had lost the equivalent of 3.4 months of learning in math and 7.3 months of learning in ELA.
“The same student taking the same test will get a different result whether they’re taking it online or on paper,” Backes told me, calling the results “an online penalty.”
In the second year of the online test, scores improved, although students taking the test online still lagged behind counterparts who took it on paper. The disparity shrank by roughly two-thirds in math and by half in ELA, suggesting that familiarity with the online format could help reduce—but perhaps not eliminate—the penalty.
This study highlights the importance of anticipating and addressing potential technological issues when transitioning to computer-based testing, a lesson that can be applied more generally to remote learning during the pandemic. Two students who are equally proficient in a topic can have different test scores based on how familiar they are with computers.
“Ideally, you want to capture something inherent to the students or the teachers, and not these artifacts of the test or the test mode,” Backes told me. “You want to know how well are the students performing, how much are they learning, how much are their teachers influencing how much they're learning, things like that.”
“Computer-based tests may measure skills, such as computer literacy, for which student proficiency differs,” Backes and Cowan explain in the study. They point to advantages that students from wealthier families have, such as access to computers and broadband internet at home, that can increase their ability to navigate design features of online tests such as drop-down menus and radio buttons, which are commonly used to select answers to multiple-choice questions. Backes and Cowan note that “urban schools are also less likely to have computers with internet access,” so their students have fewer opportunities to practice taking digital tests.
Vulnerable Students Hit Hardest
If all students’ scores dropped equally on online tests, scores could simply be adjusted accordingly. But Backes and Cowan found that online tests widened the gap between high- and low-performing students, and the low performers were disproportionately children from low-income families, special education students, and English language learners.
In Springfield, home to Massachusetts’s second-largest school district, about 35 percent of Hispanic families and 28 percent of Black families reported having no computer access at home. Such disparities may help explain the gaps between paper and online test scores.
The use of computers also creates an obstacle for students who need special accommodations like text-to-speech readers or language translators: A 2011 study found that students with visual impairments did worse on computer-based tests that provided a digital reader than similar students who took paper tests with an adult reader.
Online testing is likely to grow in the coming years. The tests are easier to administer and grade, and easier to update when needed. But if some students are receiving lower scores as a result, there could be significant consequences, “including identification for gifted and talented programs, consideration for special education programs, and being flagged for grade retention,” according to the study. In addition, many states use standardized test scores to evaluate schools and teachers, so the results could influence accountability measures such as school ranking or teacher compensation.
While teachers may not be in a position to change policies around testing, there are a few strategies that may help.
- Be an advocate: Teachers can make the case that vulnerable students should be given every opportunity to perform as well on online tests as they would on paper tests, which includes being provided with language or disability accommodations such as additional test-taking time or live translators.
- Raise awareness about effective policies: Backes and Cowan point to a key strategy that Massachusetts implemented to improve the transition to online testing: a two-year “hold harmless provision” that prevented schools from being held accountable for any negative changes in test performance, giving them time to resolve any problems. To help students, schools can apply for grants under the federal E-Rate program to purchase broadband services and equipment, and parents and teachers can contact their legislators to support a $2 billion bill proposed earlier this year to expand E-Rate funds in response to the pandemic.
- Offer practice tests on a computer: A 2015 study found an association between students’ home computer use and their writing proficiency in computer-based tests. Students who had frequent practice writing on a computer not only had longer written responses but also were more skilled at using editing functions—such as cut and paste—to improve the quality of their writing. When students are unfamiliar with taking tests on a computer, they take more time to construct their responses and spend cognitive resources trying to figure out how to use word processing tools.