The SAT has changed, and so have admissions officers' attitudes about its value as an applicant evaluation tool.
Not long ago, I received an update from Russell Shaffer of Kaplan Test Prep. He keeps me informed about emerging trends in the world of standardized testing. His latest update was titled: Kaplan Test Prep Comments on the SAT® Scores Report and Releases Results from its 2016 Survey of College Admissions Officers, Showing an Environment of Uncertainty.
As I’ve mentioned before when sharing information from Russell, I have absolutely no connection whatever with Kaplan Test Prep. Through Russell, they provide me with up-to-date information regarding trends and issues in the world of standardized testing. Much of this information can be quite pertinent to aspiring college students.
The phrase “Environment of Uncertainty” relates to how college admissions officers have come to view the SAT as an indicator of an applicant’s readiness for higher education. This is especially crucial given the ever-increasing competitiveness of admissions at America’s top schools.
While it is true that many colleges use subjective, so-called “holistic” admissions criteria, objective, quantitative test scores are still a major part of their evaluation process. Private schools tend to use holistic admissions, in which test scores are just one component of the overall process. In case you’re not familiar with holistic admissions, these links should help you understand.
Public universities tend to be much more quantitatively oriented in their admissions. Thus, test scores, GPA, class rank, and the like weigh more heavily at those schools than at the private, holistically oriented institutions.
Getting back to that “environment of uncertainty,” Lee Weiss, vice president of Kaplan’s college admissions programs, made the following statement about the annual SAT scores report released by the College Board:
“This scores report shows another decline in average test scores, which can likely be attributed to the expansion and diversification of the test taking pool, a trend we’ve seen for several years. While that trend is notable, keep in mind that it only applies to the old SAT, which is no longer being administered. At this stage what students and parents are really thinking about is the current SAT, and we know there’s a lot of uncertainty about it as this is new territory for students, guidance counselors and colleges as well.”
The scores report takes into account only scores from the old SAT, which was last administered in January. It does not take into account performance on the new SAT, launched in March, for which no score data has yet been publicly released.
Weiss pointed to several results from Kaplan’s college admissions officers survey, showing the jury is still out on the new SAT:
- Mixed Views on Whether it’s a Better Measure: Just 29% of college admissions officers surveyed say the new SAT will help them better evaluate applicants’ ability to succeed at their school than the old one did; 31% say it won’t; and 40% are not sure.
- Skepticism on Essay Section: Only 4% of college admissions officers surveyed say their school will require the new SAT’s optional essay section; 21% say they recommend applicants submit it, but don’t require it; while the most popular policy, at 75%, is to neither require nor recommend that students take it.
- Increased Popularity of ACT®: Nearly half (49%) report more applicants submitting ACT scores compared to previous years, with 63% of this subset reporting the increase as “significant.” A plurality of admissions officers (43%) also say there is an increase in the number of applicants who submit scores from both exams.
- Confidence in Evaluating Old vs New Scores: Nearly two thirds (65%) of admissions officers are confident in the ability of a concordance table to compare scores from the old SAT to scores from the new SAT and ACT and evaluate applicants; 17% say they are doubtful; and 18% aren’t sure.
“It’s great that students are adapting to the new SAT, as we hear many questions from them and their parents about how colleges will evaluate the new test. From what admissions officers are telling us, they are still figuring out what is in the best interests of their schools when it comes to the new SAT,” Weiss said. “The good news for students is that everyone involved — the test maker, admissions officers, administrators — has a vested interest in getting it right.”
Two trends appear to be emerging: (1) the new SAT’s essay has very low credibility (terrific news for all those high school students who become anxious when faced with an extemporaneous writing challenge), and (2) the ACT is gaining on the SAT because of its apparently greater credibility. I’ve said before that, in my view, the College Board “improved” the SAT simply as a marketing move, in an attempt to slow the sales gains of the ACT. Call me cynical and/or conspiratorial, but I’ve been around sales and marketing people all my life, and that’s how the game is played.
Incidentally, Lee Weiss refers to Kaplan’s survey of college admissions officers. For their 2016 survey, Kaplan interviewed by telephone, between July and August 2016, admissions officers from 374 of the nation’s top national, regional and liberal arts colleges and universities, as compiled by U.S. News & World Report.
This information piqued my curiosity, so I did some research and found an interesting article, one of a fairly large group, that corroborates Kaplan’s survey findings. In SAT Essay Losing Steam Among Admissions Officers, Dian Schaffhauser notes that “University admissions officers are ho-hum about the essay question. According to a survey of 300 colleges and universities, only a handful of them will expect applicants to submit their score from the new SAT’s essay section.” …
“… When U Penn announced that it would no longer require the essay portion of the SAT or ACT, Yvonne Romero Da Silva, Penn’s director of admissions, said the decision was based on careful consideration. ‘Our internal analysis as well as a review of the extensive research provided by the College Board showed that the essay component of the SAT was the least predictive element of the overall writing section of the SAT. Given the impending redesign of the SAT and PSAT/NMSQT, which will make the essay portion of the assessment optional, we could no longer support requiring the essay portion of either exam given its weaker predictive power.'” …
Once again, we see the weak link in the “improved” SAT. For those of you who have yet to — or are about to — wade into the new SAT waters, here’s a helpful article for your enlightenment: Behind the SAT: The Good and Bad of the 2016 Redesign. U.S. News education reporter Allie Bidwell begins her coverage with some strong words:
It will be a game of wait-and-see when it comes to how effective the revamped SAT is in terms of predicting student success in college.
While some have praised the test redesign for better aligning with what’s actually taught in schools, others say there are risks of lowering standards. Still others say the changes really don’t matter too much because **the test never has been a sole indicator of student success**. … [my bold emphasis added]
I added that emphasis because, as most of my readers know, I am anti-SAT and anti-College Board marketing. However, as they say, your mileage may vary, and you might have good reason to trust the test. If so, that’s fine by me.
However, I have seen too many fully qualified and capable college applicants denied the opportunity to attend — and succeed at — top schools simply because their SAT scores reflected one unfortunate reality: they are not very good at taking standardized tests.
Once again, don’t get me started …
Check College Confidential for all of my college-related articles.