Taking the SAT looks different now than it did five years ago.
The test went fully digital in 2024, a shift College Board billed as shorter, more accessible, and better suited to how today's students actually learn. Participation responded in a big way. Scores went the other direction. And the schools that swore off testing during the pandemic? Some are quietly bringing it back.
A year of data is now in. The story it tells isn't the one anyone scripted.
The Numbers Don't Match the Marketing
Start with participation, because that part of the digital rollout worked. The class of 2025 became the first majority-digital cohort, with 97% taking the test on a screen. More than 2 million students sat for the SAT, the highest figure since 2020 and a jump from 1.97 million the year prior.
That's a reasonable victory lap for a redesigned test. Shorter format, adaptive sections, students bringing their own devices. On paper, accessibility improved.
But the average score kept sliding. Mean scores dropped from 1051 for the class of 2020 to 1028 in 2023, then to 1024 in 2024. The digital era arrived, and the downward trend didn't pause to acknowledge it.
So the data raises an obvious question: if the test is supposedly more student-friendly, why aren't scores following participation upward?
What the Drop Is Actually Telling Us
A surge in participation almost always pulls average scores down. When more students test, the pool widens to include kids who might have skipped it in a test-optional world.
But that explanation only goes so far. The decline began before the digital switch, which suggests something deeper than format is in play. Pandemic-era learning gaps, shifts in how students prepare, and the long tail of test-optional policies all factor in. The digital SAT didn't cause the slide. It just didn't reverse it either. Even at launch, education observers questioned whether a shorter, sleeker test would meaningfully change outcomes or simply repackage them.
And that matters, because the test was supposed to be a fresh start. Adaptive questions, a built-in calculator, no more bubbling answers with a pencil that's been chewed half to death. The redesign promised to meet students where they are. The numbers suggest "where they are" is a more complicated place than a format change can fix.
The Reinstatement Wave No One Saw Coming
Then there's the plot twist colleges handed everyone.
For a stretch, test-optional looked like the future. Then a wave of reversals hit. Dartmouth, Yale, Brown, MIT, Caltech, Stanford, Georgetown, UT Austin, and the public university systems in Florida and Georgia all moved back to requiring scores. According to FairTest, more than 80% of four-year colleges remain test-optional, but the schools breaking ranks are the ones with the loudest megaphones.
Their reasoning, broadly, is that scores helped them identify high-achieving students from under-resourced schools, the exact students test-optional policies were meant to protect. Whether that holds up across more institutions is a separate debate. The signal it sends to families is not.
For students staring down junior year, the message is muddled. Some schools want scores. Others don't. A few want them now after years of saying they didn't. Prep stopped being optional the moment a dream school changed its mind.
Why Prep Behavior Is Quietly Shifting
The digital format rewards a different kind of preparation than paper ever did.
Adaptive sections mean the second module's difficulty depends on how a student handles the first. Pacing matters more. Familiarity with the digital interface matters more. Strategies that worked for the paper SAT, like skipping around freely or marking up the booklet, don't translate cleanly. The test isn't harder, exactly. It's just different in ways that punish students who prep like it's still 2019.
Smart prep now means practicing on screen, learning the platform's tools, and building stamina for a test that's shorter but more concentrated. Students who treat the digital SAT like the paper version with a new coat of paint tend to find that out the hard way on test day. For students looking to adjust their approach, updated SAT tips built around the digital format are a better starting point than recycled paper-era advice.
The Bigger Picture
Standardized testing was supposed to be either dying or evolving. What's actually happening is messier and more interesting. The test changed format, participation grew, scores kept slipping, and the colleges with the most leverage started rewriting the rules again.
For students and families, that means the old playbook needs an update. Test prep isn't dead. It just looks different now, and the schools setting the standard aren't all reading from the same script.