New SAT Is to Old SAT as…

Student study behavior, as recorded on a test preparation Web site, has changed with the introduction of the new SAT in March 2005. With the elimination of the popular analogy questions, students are spending less time overall on preparation and appear to view vocabulary drills as less important. Three years of data show that the broad pattern of student study activity is still not well-matched to the actual demands of the SAT. We emphasize the potential for large-scale databases to open up new frontiers for education researchers.

In March 2005 the College Board changed the Scholastic Aptitude Test (SAT), a college admissions test taken by 1.4 million students annually in the United States. The new version added a grammar and writing section, a written essay, additional reading questions, and more advanced math content. The changes were intended to make the test more reflective of the academic requirements of high school and college. Critics of the old SAT had complained that student preparation was specific to the test and not linked to developing broader academic skills. Richard Atkinson, former president of the University of California system, reported having seen his sixth-grade granddaughter already practicing analogies, perhaps the most famous question type on the old SAT (Atkinson, 2005).

This important concern about appropriate preparation raises the question: How do students actually study for the SAT, and has study behavior changed with the new SAT? Gathering data to answer these questions is challenging. Surveys indicate that most students engage in some type of test preparation (Powers, 1998), but self-report surveys are not objective, lack detail, and may not capture changes over time. Anecdotal observations, such as Atkinson’s, are also unsatisfactory. Fortunately, as students migrate to the Internet for studying, it becomes possible to gather very large amounts of objective data at a fine level of detail.

With three administrations of the new SAT in hand, we can report that the new test has in fact changed students’ preparation activity, based on usage data for nearly 100,000 students at a free online study tool, www.number2.com. The Web site offers tutorials, practice sessions, and vocabulary drills, from which each student can choose ad libitum. Students enrolling for SAT preparation on the site between February 1 and June 15 in 2003, 2004, and 2005 constitute the sample population. The longitudinal data (Table 1) demonstrate a strikingly stable pre-transition baseline: the distribution of student activity in 2003 and 2004 is essentially equivalent.
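As a minimal sketch of how such a cohort could be defined, assuming a hypothetical enrollment log with one row per student, an `enrolled_at` timestamp, and an `exam` column (the actual number2.com data layout is not described in this article):

```python
import pandas as pd

# Hypothetical enrollment log: one row per student, with an enrollment
# timestamp and the exam the student signed up to prepare for.
enrollments = pd.read_csv("enrollments.csv", parse_dates=["enrolled_at"])

def sat_prep_cohort(df: pd.DataFrame, year: int) -> pd.DataFrame:
    """Students who enrolled for SAT preparation between Feb 1 and June 15."""
    in_window = (df["enrolled_at"] >= f"{year}-02-01") & (
        df["enrolled_at"] <= f"{year}-06-15"
    )
    return df[in_window & (df["exam"] == "SAT")]

# One cohort per study year, mirroring the sampling window above.
cohorts = {year: sat_prep_cohort(enrollments, year) for year in (2003, 2004, 2005)}
```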

In 2003 and 2004 (i.e., before the SAT transition), analogy questions were the clear favorite. On average, students answered ~28 questions across all categories, studied ~40 words in the vocabulary builder, and viewed ~250 pages on the site overall. In 2005, the popular analogy questions were no longer available. Rather than transferring the time and effort previously devoted to analogies onto the other question categories, students simply studied less. The overall number of questions answered declined to ~23, words viewed in the vocabulary builder dropped to ~19 (students apparently did perceive analogy questions as a test of vocabulary), and total pages viewed declined to ~200.
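The year-over-year comparison amounts to averaging each activity measure within each cohort. A sketch, continuing the hypothetical schema above and assuming a per-student activity table with yearly totals:

```python
import pandas as pd

# Hypothetical per-student activity totals, one row per student per year:
# student_id, year, questions_answered, vocab_words, pages_viewed.
activity = pd.read_csv("activity.csv")

# Mean activity per cohort year. The article reports roughly 28/40/250
# (questions/words/pages) in 2003-2004, falling to roughly 23/19/200 in 2005.
summary = (
    activity.groupby("year")[["questions_answered", "vocab_words", "pages_viewed"]]
    .mean()
    .round(1)
)
print(summary)
```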

Across all years, study behavior is imbalanced and does not reflect the relative importance of the components of the test. Although reading questions account for a majority of a verbal test score (even more so in 2005), students are apparently unwilling to invest the time to read the comparatively lengthy passages and answer the questions. As for the new grammar questions, only half as many students attempted them in 2005 as had tried analogies in previous years. Students are no longer studying analogies, but they do not yet appear to be spending more effort on reading and math.

Although these results are based on a sample of almost 100,000 students, representing approximately 5 percent of the entire test-taking population in the period studied, there are some limitations. The sample may not be representative, and Web study behavior may not reflect studying with off-line resources. Furthermore, Web site design influences the choices students make. For instance, a 2002 experiment showed that randomizing whether the math or verbal categories were listed first on the student homepage reduced, but did not eliminate, a preference for verbal content over math (Loken et al., 2004).
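The 2002 intervention is a simple randomized ordering. As a hypothetical reconstruction of how such an assignment might be made server-side (all names here are illustrative, not the site's actual code):

```python
import random

def homepage_section_order(student_id: int) -> list[str]:
    """Randomly decide whether math or verbal categories are listed first.

    Seeding on the student ID keeps each student's assignment stable
    across visits, so the ordering behaves like a persistent random
    condition rather than reshuffling on every page load.
    """
    rng = random.Random(student_id)
    sections = ["math", "verbal"]
    if rng.random() < 0.5:
        sections.reverse()
    return sections

# Example: deterministic per-student ordering.
print(homepage_section_order(12345))
```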

The potential for large-scale databases to allow education researchers and public policy makers to ask and answer entirely new classes of questions is enormous. By monitoring student studying in real time, at a fine-grained level, and in very large populations, we will be able to calibrate and adapt study materials as they are used, evaluate longitudinal trends in the quality of materials used (for example, is there a “dumbing down” of materials and standards?), and integrate assessment with learning tools to provide continuous performance measurement for schools, districts, and states. Just as large-scale databases have opened new research frontiers in biology, physics, astronomy, and other sciences, so too can educational policy and research exploit these new sources of data.

References

  • Atkinson, R. (2005). College admissions and the SAT: A personal perspective. APS Observer, 18(5), 15-22.
  • Loken, E., Radlinski, F., Crespi, V., Cushing, L., & Millet, J. (2004). Online study behavior of 100,000 students preparing for the SAT, ACT and GRE. Journal of Educational Computing Research, 30, 255-262.
  • Powers, D. E. (1998). Preparing for the SAT I: Reasoning Test: An update (College Board Report No. 98-5). New York: College Entrance Examination Board.
