Join ArborBridge’s Weekly News Flash and receive the latest news in test prep and college admissions every week
We are committed to providing you with the most up-to-date resources and announcements from the college admissions testing landscape. Here are some of the top headlines from this past month:
SAT Percentiles Released
Summary: The College Board finally released its official SAT percentile charts. Up to this point, we had all been forced to use reverse-engineered percentiles based on SAT concordance with the PSAT and ACT to understand student positioning. Now we have official materials direct from the College Board. (Whew!) The link below will take you to a downloadable copy of the percentile charts. For more information on the new percentile charts, please be sure to check out our blog.
What this means: Just as we expected, scores are rising on the new SAT. Whereas a student once needed a Math score of 600 to be in the 75th percentile, she now needs a 610. These higher scores don’t mean that the test is any “easier” or that students are doing “better”. Rather, the redesigned SAT is simply a different animal and the percentile charts reflect that. The gap between old and new percentiles is even more pronounced for Reading/Writing. Whereas a student once needed a 570 in Reading/Writing to achieve the 75th percentile, she now needs a 610.
SAT Understanding Scores 2016 (College Board)
Turnitin Partners with College Board
Summary: Turnitin—the plagiarism-checking software every college student knows and dreads—has recently ventured into the test prep world with its formative writing instruction technology, Revision Assistant. While the new technology currently aims to provide students with computerized feedback on select Advanced Placement writing prompts, the College Board hopes to ultimately use it for SAT Essay preparation. The idea is that the program will give students individualized feedback so that they can adjust their writing and learn from their mistakes.
What this means: For years we’ve been hearing rumblings of the College Board using automated scoring technology to grade its essays. This is yet another sign that the College Board is slowly—but surely—heading in that direction. Unfortunately, such a program will likely have some pretty big pitfalls. First, because the grading program spans genres, it’s not likely to be able to give highly nuanced feedback. Rather, look for this program to identify what type of writing the student is engaging in and make blanket recommendations based on that genre. Second, the program may correctly identify what a student is doing wrong, but it won’t likely be able to explain why the student made the mistake in the first place. Students don’t just need to know they’re doing something wrong. They also need to recognize how to avoid those mistakes in the future. Although a student will likely need more essay support than the Revision Assistant alone, educators can use this tool to better understand the College Board’s grading process and expectations.
ACT “Enhances” its Test Score Reports
Summary: In a four-part announcement, the ACT has introduced brand new reporting categories for the 2016-2017 academic year. These new reporting changes will replace the old categories such as Intermediate Algebra/Coordinate Geometry with more detailed and diverse categories such as Preparing for Higher Math, Integrating Essential Skills, and Modeling. You can find a detailed description of all of these new categories at the link below. We’ll continue to update you on new developments as the ACT releases the remainder of this series.
What this means: Although the ACT claims to be making these changes in an effort to provide students with more detailed and informative data, many of the new scoring categories remain cryptic for students and tutors alike. Instead of breaking the test down by passage type and question type, the new score report breaks questions down by inferred areas of strength. Students may learn from these new reports what they’re good and bad at, but they’ll have little idea how to improve.
Read all four announcements below:
New ACT Score Reports (ACT)
ACT Cancelled in Hong Kong and South Korea
Summary: For the first time ever, the ACT joined the international cheating crackdown and cancelled the June 11th exam in both Hong Kong and South Korea due to a confirmed test leak. The confirmation, according to ACT spokesman Ed Colby, wasn’t made until the day before the exam, forcing the ACT to cancel 5,500+ students’ exams just hours before they were scheduled to test. In the midst of the confusion, thousands of students who did not receive the cancellation email showed up to empty test sites. While the ACT will issue full registration refunds for all students, many students from mainland China and the surrounding regions incurred additional travel expenses—such as airfare and hotel costs—which cannot be reclaimed. Moreover, students who had hoped to use the exam as a way to apply for early decision and seniors who had hoped to take the exam for scholarships, athletic eligibility, etc. will no longer be able to exercise these options.
What this means: For the past several months, we’ve speculated that the “international cheating ring” couldn’t possibly be unique to the SAT. Now that the ACT has acknowledged the problem as well, we’re hoping that the test prep mills and people who profit from cheating will learn—once and for all—that gaming the system only hurts students everywhere. Hoping, but we wouldn’t count on it.
ACT Cancels Entrance Exam in South Korea and Hong Kong After Test Leak (Reuters)
Leaked ACT College Admissions Test Cancelled Hours Before Students Were to Take It (Washington Post)
U.S. College Exam Cancelled in South Korea and Hong Kong After Leak (The Guardian)
Columbia Drops SAT Subject Test and Essay Requirements
Summary: Columbia University announced that it will no longer require applicants to take the SAT Subject Tests or the writing portions of the SAT/ACT. Columbia joins the University of Pennsylvania as the only Ivy League schools without either requirement. Despite these changes, both Columbia and UPenn will continue to require a standard SAT or ACT score.
What this means: The choice to drop these testing requirements signals Columbia’s shift in thinking toward a more holistic review process. By dropping these additional scores, though, we have to anticipate that Columbia will place greater emphasis on a student’s standard SAT/ACT score. It’s no surprise that yet another university is going essay-optional. What is surprising is that two of the Ivies are now essay-optional. For many years, the essay has been assumed to be a necessary evil because big-name Ivy League schools required it. Now that Columbia and UPenn have deprioritized the essay, others just might follow their lead.
Columbia Drops SAT Subject Test Requirement (Inside Higher Education)
Columbia Admissions Drops SAT Subject Test Standardized Writing Requirement (Columbia Spectator)
College Board Official Goes Rogue
Summary: In a scathing critique of the College Board, former SAT Executive Director of Assessment Design and Development Manuel Alfaro released a multi-part blog post on LinkedIn attempting to blow the whistle on a number of College Board practices he believes to be dishonest and unethical. Among these allegations are the College Board’s surreptitious use of Common Core testing standards, antiquated content management system, recycled exams, and—perhaps most shocking—untested questions on official student exams. When asked why he waited until now to make these allegations, Mr. Alfaro claimed that he needed the College Board to administer its new exam before he had the proof he needed to go public. Now he’s urging everyone to sign a petition calling for the federal government to investigate the College Board for making false claims about the SAT.
What this means: Even though Alfaro’s claims did not generate the uproar he was hoping for, they do raise a couple of issues students should take note of. First, as we’ve known for quite some time, the College Board is still getting its footing with the new test. Until we have multiple official exams available on the market, the only safe assumption is that the exam can and will change from test date to test date. Alfaro also reminds us that exam creators are not infallible. Flawed and weak questions can and do slip into official exams—especially experimental sections—from time to time. That’s why we always urge students never to get too hung up on any given question. All test questions are worth the same number of points, so if a question is taking too much time, seems strange, or just feels wrong, students should always feel comfortable skipping it and returning to it at the end of the passage or section.
Shining a Spotlight on the Dark Corners of the College Board (LinkedIn)
Former College Board Exec: New SAT Hastily Thrown Together (Deutsch29)
Former College Board Executive Blows the Whistle (The Critical Reader)
ACT Finds Disconnect Between Common Core and College Readiness
Summary: In its 2016 National Curriculum Survey, which polled thousands of K-12 and college instructors as well as workplace supervisors and employees from various industries, the ACT found that about 60% of respondents familiar with the Common Core didn’t think the English and math Common Core Standards are what students need to be academically and professionally successful. When it comes to writing, many college representatives favor a student’s ability to generate creative ideas, while the Common Core emphasizes the ability to analyze source text. Many teachers also reported covering non-Common Core content with their students in order to better prepare them for the future.
What this means: The ACT is loudly and proudly proclaiming the differences between its flagship exam and the Common Core Standards. From its persuasive essay, which calls upon the student to construct an argument supporting his or her position, to its science test, which treats scientific reasoning as a separate, non-integrated skill, the ACT remains a very different testing option for students and educators who may not favor the SAT’s Common Core-inspired format.
Are SAT/ACT Scores the Best Predictor of College Success?
Summary: A recent survey of the National Association for College Admission Counseling (NACAC)’s 1,354 member colleges shows that of the 424 that responded, only 51% of those that require SAT/ACT scores conduct predictive validity studies of how well those scores correlate with later college performance. NACAC interviewed 11 of these colleges to learn more about how student test scores predict college success. The study found that while there is a positive relationship between test scores and college performance, a student’s grade point average was a stronger predictor of success in college.
What this means: It’s hard to put much stock in a study that involved only 11 colleges (less than 1% of all NACAC member colleges). Apparently, only 40 schools that responded to the initial NACAC survey agreed to an interview, and of those, only 11 were hand-selected by NACAC to weigh in. These schools were intended to represent a large cross-section of the United States, including public and private schools, large and small schools, and East Coast and Midwest schools. With so few voices heard, however, it’s unlikely that the findings are truly representative. It is interesting to note one feature of all of the studies spotlighted in the Education Week spread: a consistently strong, positive correlation between GPA and college performance.
Barely Half of Colleges Validate Use of SAT, ACT as Success Predictors (Education Week)
Many Colleges Don’t Put Testing Requirements to the Test (The Chronicle of Higher Education)
Want more stories like this delivered straight to your inbox every Tuesday? Subscribe to ArborBridge’s Weekly News Flash!