Digital SAT Coming into Focus

Recently, the College Board released a series of documents bringing the Digital SAT into sharper focus: a high-level overview of the coming changes, a comprehensive (191-page) document analyzing the form, structure, and composition of the Digital SAT, and a suite of 33 sample questions.

Although full sample tests will not be available until the fall, we now have enough information to anticipate the changes that international students will see in March 2023 and that domestic students will see on the October 2023 PSAT and the March 2024 SAT.

We now know the structure and length of the Digital SAT

Previously, we knew that the Digital SAT would have four sections (two Reading & Writing, two Math) and that the second section of each subject would be adaptive, depending on how the student performs on the first section. We now have specifics on the question counts and timing for each section.

Section               Timing per section   Number of questions per section
Reading and Writing   32 minutes           27 questions: 25 operational and 2 experimental
Math                  35 minutes           22 questions: 20 operational and 2 experimental

 

Here is the basic format and timing of the test:

Initial setup
Reading and Writing baseline module                      32 minutes   27 questions
Reading and Writing adaptive module (easier or harder)   32 minutes   27 questions
10-minute break
Math baseline module                                     35 minutes   22 questions
Math adaptive module (easier or harder)                  35 minutes   22 questions

 

The total timing for the test administration, including the 10-minute break between the “verbal” (Reading and Writing) and math sections, is 2 hours and 24 minutes. This makes it a substantially shorter test than either the current SAT or the ACT, and the amount of time per question has increased significantly on both sections of the test.

The average amount of time per verbal question on the current SAT is 61.4 seconds. The Digital SAT allows students 71 seconds per verbal question, a 16% increase.

Similarly, the current SAT allows students 84.2 seconds per math question, while the Digital SAT allows 95.5 seconds, a 13.3% increase. Overall, the Digital SAT gives students roughly 15% more time per question than the current paper-based test.
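These per-question figures fall straight out of the published module timings. As a quick arithmetic sketch (the current-SAT baselines of 61.4 and 84.2 seconds are the figures cited above, taken as given):

```python
# Digital SAT time per question, derived from module timing.
dsat_verbal = 32 * 60 / 27   # each 32-minute verbal module has 27 questions
dsat_math = 35 * 60 / 22     # each 35-minute math module has 22 questions

# Percent increase over the current paper SAT's cited averages.
verbal_increase = (dsat_verbal / 61.4 - 1) * 100
math_increase = (dsat_math / 84.2 - 1) * 100

print(round(dsat_verbal, 1), round(dsat_math, 1))    # 71.1 95.5
print(round(verbal_increase), round(math_increase))  # 16 13
```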

The stark difference lies in comparing the time per question of the digital SAT to the ACT:

Question type       Seconds per question
dSAT verbal (R+W)   71.0
ACT verbal (R+E)    44.3
dSAT math           95.5
ACT math            60.0

 

Compared to the ACT, the Digital SAT allows a whopping 60.5% more time per question on verbal and 59% more time per question on math. An extra 60% of time per question is equivalent to a full extended-time testing accommodation, which is incredibly compelling. Students are going to like the decreased emphasis on processing speed and the opportunity to show what they know, rather than worry about how fast they can work.
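The ACT figures in the table can be reconstructed from its published section lengths; the 44.3-second verbal figure appears to be the simple average of the English and Reading per-question rates (an assumption on our part, since the College Board does not show its work):

```python
# ACT per-question times from published section lengths.
act_english = 45 * 60 / 75                     # 36.0 s per English question
act_reading = 35 * 60 / 40                     # 52.5 s per Reading question
act_verbal = (act_english + act_reading) / 2   # simple average, about 44.3 s
act_math = 60 * 60 / 60                        # 60.0 s per math question

dsat_verbal, dsat_math = 71.0, 95.5            # Digital SAT figures from the table

verbal_gap = (dsat_verbal / act_verbal - 1) * 100
math_gap = (dsat_math / act_math - 1) * 100
print(round(verbal_gap, 1), round(math_gap, 1))  # 60.5 59.2
```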

The decrease in focus on processing speed was informed directly by timing research conducted by College Board psychometricians. They found that by increasing the verbal section from 60 minutes to 64 minutes and cutting the number of questions from 56 to 54, an “additional segment of the test-taking population [was able to] finish the section without rushing.” The psychometricians similarly reduced the math section by 2 questions to achieve this same goal. 

It’s refreshing to see the testing agencies focusing on the testing experience of all students, both those with and without diagnosed disabilities and processing speed challenges.

Embedded pretesting questions

All testing agencies need to pretest questions for clarity, levels of difficulty, and potential bias. Bad or biased questions are discarded, and student performance on items leads to the calibration of levels of difficulty of each question. 

Typically, testing agencies separate their experimental items into a discrete section, but on the Digital SAT, the College Board has decided to embed its pretesting (experimental, or nonoperational) questions directly into the sections containing the “real” questions that inform the student’s score.

There have been concerns that students who know they are taking experimental questions that have no bearing on their score will not approach them with the same effort or attention. Differential performance on experimental items may have led to some of the challenges the SAT experienced in calibrating the difficulty of its test items, leading to greater variability in test difficulty across a calendar year. Hiding the experimental questions among the others may help address this.

On the other hand, experimental items may cause some students to get temporarily “hung up” or distracted by a bad or biased question. If the College Board does some preliminary screening of items, it can eliminate the most troublesome ones in advance of official test administrations. Thankfully, there will be only 2 experimental questions per module: 8 experimental items out of the 98 total questions per test. The College Board noted its intention to minimize the number of experimental questions to limit the impact of these nonoperational items on student performance.

Two-stage adaptive testing: easier or harder adaptive sections

The College Board has decided to keep things very simple with its adaptive delivery model. Every student will take a baseline module for both math and verbal that contains a mix of easy, medium, and hard questions. The subsequent module will adapt in only one of two ways: the questions will be, on average, either easier or harder than the baseline.

The College Board test writers will need to generate three discrete types of sections: (1) baseline sections which have a broad mix of questions, (2) harder adaptive sections, which skew towards a more difficult level of question, and (3) easier adaptive sections, which skew towards easier questions. Over time, College Board test writers will create many discrete forms of baseline and adaptive question sets which will be in circulation.  

The College Board explains that the digital SAT has “been designed and developed such that each student is administered a highly comparable but unique version of the test.” In other words, two students taking the harder adaptive math section on the same day but in different time zones would likely not be able to text each other the answer to question 14 because question 14 won’t be the same for them. This is just one concrete example of the vastly heightened test security afforded by digital adaptive testing.  

The question remains, however: how exactly will the College Board achieve this promise of “highly comparable but unique” tests for each student? One option is for test forms to have the same or similar questions but in a different order from one another. Another, more robust option is to have many different test forms, so that each student sees an entirely different set of questions from other test-takers. Technology can be used to generate more test forms, and it can also generate validated “template” items that are replicated across multiple forms with slight changes to the specifics, so that two questions test the same skill but the correct answer to one is 4 while the correct answer to the other is 16.

When the College Board shares more detail on their methodology for generating and varying test forms, we will provide another update.

College Board acknowledges potential concordance limitations 

The College Board has been steadfast in stating that the new digital test and its 1600 scale would correlate directly to the current paper-based test and the ACT concordance table. Buried in the analysis, however, the College Board acknowledges that this concordance is not yet backed up by psychometric criteria. (Psychometrics is a branch of psychology concerned with testing, measurement, and assessment.) The College Board states that the scores are linked (i.e., relatable) rather than equated (i.e., the same score). We appreciate the College Board’s honesty in this regard, although it may lead the ACT to construct a new concordance table comparing the ACT to the Digital SAT.

A paper-based testing accommodation for students with disabilities

For students with disabilities that render them unable to take a test on a computer, a paper option will be available. The paper accommodation will consist of a slightly longer, nonadaptive (i.e., linear) test form. While the Digital SAT is 2 hours and 14 minutes long (excluding the break), the linear test is 2 hours and 44 minutes long. A longer test form is required absent the advantages and efficiency gains of adaptive testing.

Where the 2-module digital math section has 40 total operational and 4 experimental items, the linear math section will consist of 54 operational items and no experimental items. Similarly, the linear verbal section will consist of 66 operational items and no experimental items, contrasted to the 50 operational and 4 experimental items on the digital SAT. The time per question will remain identical on both the digital and linear test forms.
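Since the time per question is identical on both forms, the linear form’s 2-hour-44-minute length follows directly from its higher question counts. A quick check, using the per-question rates derived from the digital modules:

```python
# Per-question rates carried over from the digital modules.
sec_per_verbal = 32 * 60 / 27   # about 71.1 s per verbal question
sec_per_math = 35 * 60 / 22     # about 95.5 s per math question

# Linear form: 66 verbal items + 54 math items at the same rates.
linear_minutes = (66 * sec_per_verbal + 54 * sec_per_math) / 60
hours, minutes = divmod(round(linear_minutes), 60)
print(hours, minutes)           # 2 44 -> 2 hours 44 minutes
```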

It is unclear who will be eligible for the paper testing accommodation. It is also unclear whether test security will be a concern for paper-based testing, depending upon how many discrete test forms the College Board creates for this relatively small population of students. Item exposure may be more of a concern if only a small number of linear test forms are in circulation.

The digital SAT offers support for English Learners, but with a big asterisk

One criticism of the current SAT is that it places too much emphasis on reading skills, which can hinder English learners. The Digital SAT acknowledges this feedback in several ways: writing questions will no longer test word choice (affect/effect, then/than), and math word problems now have a cap on both their length and how frequently they appear.

The digital SAT will also offer accommodations for English learners if they receive similar accommodations in school. These can include extended time and a bilingual word-to-word dictionary. 

The big caveat is that these accommodations are only available during school-day administrations of the Digital SAT and not during weekend testing. This means that schools with a large population of English learners will need to sign up for an in-school test if they want to support their student body. Clearly, the College Board wants to move towards in-school testing, and this looks to be a clever means of moving the needle.

Reading and Writing are blended together

The current SAT’s EBRW score is split straight down the middle, with 400 of the points coming from the reading section and 400 from the writing section. The Digital SAT not only blends question types, with many reading questions now feeling like Writing “expression of ideas” questions, but has also eliminated any subscores that delineate the difference between reading and writing. 

Reading looks short but tough

We’ve known that reading would use shorter passages with only one question per passage. Any illusions that this would make for an easy section should be dispelled by the published practice questions. All in all, the reading selections seemed tough, and the questions asked were rigorous.

The SAT promises to categorize reading passages into three bands: grades 6–8, grades 9–11, and grades 12–14. However, the released practice questions did not provide this metric, and it’s unclear where the College Board would place the current sample passages within those bands.

Yes, there’s more vocab, but no need to freak out

All indications before this release pointed towards a greater emphasis on vocabulary and words in context. In the Assessment Framework, the College Board addresses the three tiers of vocabulary. While not stated explicitly, the College Board seems most interested in tier 2 (words common in writing but less common in day-to-day speaking) rather than tier 3 (words used primarily in a specific context or discipline). Although sentence completion questions have been revived, students should not need to return to the dreary days of memorizing 1,000 SAT words to improve their verbal score.

On Math, data representation takes a dip

The test writers are moving to reduce the reading burden on the math section with fewer words per problem. On the Digital SAT, only 30% of math items will be set in the context of science, social studies, or real-world applications. The remaining 70% of items will be “pure” math problems. This marks a decrease from both the current SAT and the ACT. Going forward, the Digital SAT math section may be less reading-intensive than the ACT. That would be a very significant change.

To accomplish the decrease in reading burden, the test writers pared back many of the data representation questions that exist on the current SAT. Currently, 29% of the math questions fall into this category, which includes questions involving percentages, probability, averages, and study design. The Digital SAT lowers this to only 15% of all questions. The likely reason the College Board has chopped down the number of data representation questions is that currently, many of them are text-heavy word problems. 

While word problems are being reduced, the College Board is increasing its focus on geometry and trigonometry. The current SAT draws roughly 8% of its questions from the domains of Geometry and Trigonometry, but the Digital SAT will increase that share to 15% of all test items.

Time to get comfortable with Desmos

The Digital SAT includes Desmos, a well-loved online graphing calculator. Although the Digital SAT version of Desmos will not be as feature-rich as the full online version, it does allow students to solve some algebra questions by being clever with the graphing calculator. For students who struggle with algebraic solutions, the graphing calculator can offer an alternative approach to solving questions.

Students and administrators seem to prefer the new test

The data from the pilot study suggests that administrators and students alike prefer the shorter, streamlined digital SAT. Eighty percent of students who had previously taken a paper-based SAT (sample size of 5,564 students) reported a “better test-taking experience with the digital SAT.” Students prefer the shorter testing duration and the more spacious timing. Administrators prefer the relative simplicity of administering the test without paper.  

More details coming soon

Within a few months, the College Board will release its first full sample tests, as we move to prepare our international students for the March 2023 administration. Stay tuned for more comprehensive information as we analyze those new tests and share insights from our research.

 


Applerouth is a trusted test prep and tutoring resource. We combine the science of learning with a thoughtful, student-focused approach to help our clients succeed. Call or email us today at 202-558-5644 or info@applerouth.com.