Is Online a Better Baseline? Comparing the Predictive Validity of Computer- and Paper-Based Tests
Prior work has documented a substantial penalty associated with taking the Partnership for Assessment of Readiness for College and Careers (PARCC) assessment online rather than on paper (Backes & Cowan, 2019). However, this penalty does not necessarily make online tests less useful. For example, computer literacy skills may be correlated with students' future ability to navigate high school coursework, in which case online scores could be more predictive of later outcomes. Using a statewide implementation of PARCC in Massachusetts, we test the relative predictive validity of online and paper tests. We are unable to detect a difference between the two and in most cases can rule out even modest differences. Finally, we estimate mode effects for the new Massachusetts statewide assessment. In contrast to the first years of PARCC implementation, we find very small mode effects, showing that it is possible to implement online assessments at scale without large online penalties.
Citation: Benjamin Backes, James Cowan (2020). Is Online a Better Baseline? Comparing the Predictive Validity of Computer- and Paper-Based Tests. CALDER Working Paper No. 241-0820