Fine Tuning the ACT
Hot on the heels of the College Board's announcement in March of a "redesigned" test, the ACT has responded with changes of its own, though of a much subtler nature than its competitor's. In a clever act of one-upmanship, the Iowa-based powerhouse used the College Board's dramatic announcement as an opportunity to contrast the ACT's "continuous improvement" and "gradually and thoughtfully" introduced "enhancements" with the "radical change," "unnecessary risks," and "total product reinvention" that currently describe the SAT. These not-so-veiled criticisms of the College Board's March announcement constitute the ACT's effort to draw a distinction between two companies whose tests are starting to look remarkably similar.
In a nuanced announcement of enhancements and improvements to the test, the ACT sent a clear message to the market: it is operating from a position of strength, with a time-proven product that it continually updates and tweaks, unlike its erratic and unreliable competitor.
The most substantial change to the ACT lies in the optional essay. Starting in 2015, the written portion will evaluate four writing areas: ideas and analysis, development and support, organization, and language use. In addition, students will be provided several perspectives and asked to create their own analysis of a complex issue, rather than responding to an open-ended question about a high-school-relevant subject. Gone is the essay where students simply take a position and brainstorm literary, historical, and personal examples.
This update is not exclusive to the ACT. Both the SAT and the ACT are tweaking their essays in similar fashion for their respective future tests. Many critics of both tests have lamented the fact that students can make up whatever details they want to support their point, and it seems that both companies have taken this criticism to heart. Both tests’ “redesigned” and “enhanced” essays will involve analysis of a text and an author’s use of that text to present an argument. Much like the current GRE, the college entrance exams will require students to analyze an author’s use of evidence, reasoning, and persuasive elements, a skill that is absent in the current essay format.
Outside of its announcement, the ACT has been slightly modifying its current tests, particularly the reading section, "borrowing" some of the SAT's strengths. Since October, reading sections have included comparison passages with questions that require students to identify the authors' opinions on a subject and where they might agree or disagree. Such questions bring the ACT into the territory of the College Board, whose short and medium comparison passages have long been a staple of the SAT's reading section. Additionally, the "constructed-response" portion to be piloted in various schools starting in 2015 calls to mind the grid-in section of the SAT math. While the ACT has positioned itself in the media's eyes as the crème de la crème of standardized tests, it has been (albeit covertly) importing some of the SAT into its test. If "imitation is the sincerest form of flattery," the SAT and the ACT remain big fans of one another.
The other changes to the test mostly involve new ways the ACT can use the five sections of the test (English, Math, Reading, Science, and Essay) to evaluate student readiness for college. Such "readiness indicators" will include a STEM score, evaluating performance on the math and science sections; an English language arts score for achievement on the English, reading, and writing sections; a "progress toward career readiness" indicator, which will "help students understand their progress toward career readiness"; and a "text complexity progress indicator," which will evaluate students' ability to understand the complex texts they may encounter in college and during their careers. The ACT will continue to provide students with the traditional ACT scores and ACT college readiness benchmarks. At the moment, it is uncertain how colleges and universities will use these new indicators, if at all.
Wayne Camara, senior vice president of research with the ACT, concluded that “the ACT will continue to be the tried-and-true achievement exam that students, colleges and states have trusted for more than 50 years.” As the two companies try to outdo one another in earning the nation’s trust, we can expect more “upgrading,” “enhancing,” and “borrowing” in the years to come.
More information on updates to the ACT can be found at act.org/actnext.