ACT Forging Clear Path Towards Dynamic Digital Assessments in All Markets

Jed Applerouth, PhD
October 11, 2016

The ACT is poised to once again change the face of college admissions testing. In the fall of 2017, the ACT will migrate its overseas testing to a computer adaptive format. Once the kinks are worked out overseas, the US market will be next in line for the great digital and adaptive migration.

In late September I had the opportunity to speak directly with ACT CEO Marten Roorda and Senior VP of Client Relations Paul Weeks during the annual meeting of the National Association for College Admission Counseling (NACAC). Roorda and Weeks painted a clear picture of the ACT’s future: it’s digital, it’s adaptive, and it’s already well underway.

The ACT has been on a buying spree and a hiring spree of late as it prepares for the move to adaptive assessments. In the last year, the ACT acquired OpenEd and Pacific Metrics, enhancing its capabilities in psychometric research, online testing, test security, and scoring, and combined these entities to form the ACT Assessment Technologies group. This summer the ACT hired Wim van der Linden, the man who literally wrote the book on Item Response Theory, the foundation of adaptive testing, and Alina von Davier, an expert in data modeling, algorithms, and adaptive learning.

For two decades, the ACT has been building capacity in computer adaptive testing (CAT), primarily through the ACT Compass program and its decade-long collaboration with the GMAT, one of the most important CATs in the world of higher education. That experience leaves it well suited to lead the national movement toward adaptive college admissions tests.

Paper tests will inevitably disappear from the assessment landscape, replaced by more secure, efficient, dynamic, and engaging forms of assessment. CATs have been gaining broader acceptance across the educational landscape. For decades, graduate programs have used adaptive tests such as the GRE and GMAT for admissions purposes, and now dozens of states use computer-based and computer adaptive assessments to measure student achievement. The Smarter Balanced assessment (SBAC) is one of the most broadly utilized CATs on the market.

Benefits of CATs include efficiency and economy (i.e., shorter tests), heightened precision and accuracy (more specific measurement of student abilities through targeted questions), and greater security. Shifting to CAT does pose several challenges. Students must adapt their test-taking approach when shifting from paper to the screen. There are issues of access and facility with technology when moving to a digital format. And there are also issues of test security, though to a lesser extent than with fixed test forms.
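
To make the “targeted questions” idea concrete, here is a minimal sketch of a question-adaptive loop under a simple Rasch-style model. The item bank, the ability update rule, and the test length are illustrative assumptions for this sketch, not the ACT’s actual algorithm.

```python
import math
import random

def prob_correct(ability, difficulty):
    """Rasch (1PL) model: probability of answering an item correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def run_cat(item_bank, answer_fn, max_items=10):
    """Administer items one at a time, re-estimating ability after each."""
    ability = 0.0            # start at the middle of the ability scale
    administered = []
    for _ in range(max_items):
        # Pick the unused item whose difficulty is closest to the current
        # estimate: this is what makes each question "targeted."
        candidates = [d for d in item_bank if d not in administered]
        item = min(candidates, key=lambda d: abs(d - ability))
        administered.append(item)

        correct = answer_fn(item)
        # Crude shrinking-step update: move toward harder items after a
        # correct answer, easier items after a miss. Operational CATs use
        # maximum-likelihood or Bayesian estimation instead.
        step = 1.0 / len(administered)
        ability += step if correct else -step
    return ability

def respond(difficulty):
    """Simulated test taker with true ability 1.2 on the logit scale."""
    return random.random() < prob_correct(1.2, difficulty)

bank = [d / 4 for d in range(-12, 13)]   # 25 difficulties from -3.0 to +3.0
print(round(run_cat(bank, respond), 2))  # estimate converges toward 1.2
```

Because each question is pitched near the student’s current estimate, a handful of targeted items can pin down ability about as well as a much longer fixed form, which is where the shorter, more precise test comes from.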

Using cutting-edge technology to address test security concerns

People with the inclination to cheat will exploit any weakness inherent in a system. Fixed test forms can be fairly easily compromised, copied, and distributed, especially when test forms are recycled and administered across multiple time zones. The weakness of CATs, in contrast, lies in the issue of “item exposure.” If a test’s item bank is too small, students may be able to memorize particular question “pathways” through an adaptive exam and subsequently share them with other students (i.e., if you answer the first 6 math questions in this fashion, the 7th question will be X, and so on). When the Educational Testing Service (ETS) realized this form of cheating was taking place on the GRE, which ETS writes and administers, it decided to replace its question-adaptive format (i.e., your response to each question determines the next question) with a section-adaptive format (your performance on an introductory section determines the difficulty of the subsequent section). A question-adaptive format is superior in many ways and can be made more secure: with certain controls in place and a robust enough item bank, it is effectively impossible for students to memorize all the possible pathways through a test, rendering item exposure far less of a concern.
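
To see why a sufficiently large item bank defuses pathway memorization, consider a hedged sketch of one common exposure-control idea: rather than always serving the single best-matching item, the engine draws at random from the several closest matches, so students with identical answer histories still branch onto different questions. The bank, the selection rule, and the parameter below are illustrative assumptions, not the ACT’s actual security controls.

```python
import random

def next_item(item_bank, used_ids, ability, k=5, rng=random):
    """Choose the next item at random from the k unused items whose
    difficulty sits closest to the current ability estimate."""
    candidates = sorted(
        (item for item in item_bank if item["id"] not in used_ids),
        key=lambda item: abs(item["difficulty"] - ability),
    )
    return rng.choice(candidates[:k])

# A toy bank of 1,000 items with difficulties spread from -3.0 to +3.0.
bank = [{"id": n, "difficulty": (n % 61 - 30) / 10} for n in range(1000)]

# Two test takers at the same ability estimate, with identical histories,
# will usually be routed to different items, so a memorized "pathway"
# rarely replays for the next student.
print(next_item(bank, used_ids=set(), ability=0.8)["id"])
print(next_item(bank, used_ids=set(), ability=0.8)["id"])
```

The larger the bank, the more near-equivalent items there are to randomize over, which is why building out the item bank is itself a security measure.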

The ACT leadership is well aware of the challenges of creating secure CATs, and plans to use cutting-edge technology to build a sufficiently broad item bank to mitigate the risks of item exposure. ACT leaders hope that transitioning the 60,000-plus overseas ACT testers to CAT will be highly effective in eliminating broad-scale cheating stemming from the use of fixed test forms.

Scaling up effectively will be a key challenge

Roorda made it clear at NACAC that the endgame for the ACT is to migrate its full suite of assessments to CAT. In his early tenure as CEO, he’s making moves to accelerate this transition and bring the ACT to the forefront of the digital assessment age. Still, the ACT must move carefully through this transition: the early implementation of the digital ACT Aspire was fraught with technical challenges, similar to those faced by numerous states migrating to digital assessments.

We have ample evidence that CAT can be skillfully administered at scale. Each year the GRE is administered to 450,000-plus test takers and the GMAT to a quarter million test takers. The ACT is in active partnership with the writers of the GMAT, and will leverage the know-how it has gained from this alliance to eventually roll out adaptive ACT testing to the 2.1 million ACT test-takers.

Still, scaling up will present its challenges. Controlling the conditions at thousands of high schools and test centers will pose a greater logistical challenge than administering tests in professional testing centers, as is done for the GRE and GMAT.

A possible timing boon for test takers

One exciting opportunity adaptive testing may provide is more generous timing per question. The rigorous timing constraints of the ACT have long been a source of contention, as particular sections of the test seem to significantly advantage students with the highest processing speed. While processing speed is certainly one important element of cognitive ability, some have questioned whether it should play such a predominant role in determining student success on certain sections of the ACT.

Shifting to a more efficient CAT, in which each student receives the optimal subset of questions needed to assess his or her ability, may allow students to have more time per question without increasing the test’s total length. Roorda said the ACT plans to conduct timing studies on overseas students using computer-based testing, and would consider adjusting the ACT’s timing constraints based on the outcomes. A less speedy ACT would be a boon to students everywhere.
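
A back-of-the-envelope illustration: the paper ACT Math section currently allots 60 questions in 60 minutes. If an adaptive form could reach the same precision with, say, 35 targeted questions (a purely hypothetical number, not an ACT figure), the same hour would leave students far more time per item.

```python
SECTION_MINUTES = 60           # current ACT Math time allotment
PAPER_QUESTIONS = 60           # current fixed-form question count
ADAPTIVE_QUESTIONS = 35        # hypothetical adaptive count, for illustration

paper_pace = SECTION_MINUTES * 60 / PAPER_QUESTIONS        # 60 seconds/question
adaptive_pace = SECTION_MINUTES * 60 / ADAPTIVE_QUESTIONS  # ~103 seconds/question
print(f"{paper_pace:.0f}s vs {adaptive_pace:.0f}s per question")
```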

Closing thoughts

The move from paper tests to computer-based assessments is already underway in the US, at the district and state level with End of Course tests and Common Core assessments such as PARCC and Smarter Balanced. Soon we will see a broader shift to CAT, leading to shorter, more secure assessments. And finally we will see the transition to dynamic, visually rich, engaging assessments that leave multiple choice in the dust. That is the inevitable future of assessments, and the ACT is setting itself up to lead the charge.
