Click the answer that best describes you: “My level of political awareness is: Very high, high, average, low or very low.”
This question is the basis for The Princeton Review’s annual ranking of the most politically active colleges published in “The Best 381 Colleges” book and sold to high school guidance counselors across the country. I own four of those books – one for every year I attended high school – and the page with the “most politically active” rankings still has my pink sticky-note on it because I used it to help me make my college decision, like many other students at GW.
This year, GW lost its No. 1 spot and dropped to 10th on the list, breaking a four-year streak. Columbia University took the top spot, surprising many students on campus, including me. Despite losing our title, GW students were hardly slacking off politically this past year. In fact, the impressive amount of political participation on campus last year, coupled with the shift in the rankings, shows that the Princeton Review’s methodology – the way it calculates the rankings – has flaws that leave considerable room for interpretation.
“Most politically active” was a common tagline in admissions recruiting brochures, a staple fun fact tossed out by Student Admissions Representatives – or STAR tour guides – and a title touted by administrators. This doesn’t have to change. All of the top 10 colleges boast politically active environments their students should be proud of, but if University administrators want to recapture this claim to fame next year, they need to encourage students to complete the survey and to rank their own levels of political activity accurately. In the meantime, students shouldn’t be ashamed to embrace the No. 10 ranking.
GW was blindsided by the drastic drop in the rankings, released in late August, but it’s no surprise that Columbia University is now top dog. A free-speech group at Columbia is currently suing President Donald Trump for blocking critics on Twitter. On the Princeton Review website, students at Columbia are characterized as “very ambitious” and as people who “aren’t afraid to speak out against what they think is wrong.” But that doesn’t sound much different from students here.
There isn’t a lack of activism on campus. In fact, GW might be more politically active than ever before. In the last year alone, three students were arrested during the Democracy Spring protests, a University-wide walkout to protest Trump’s election attracted 400 participants and an a cappella group went viral for its performance at the Women’s March. We are no less politically active than students at any of the universities ranked above us this year. It is not the quality or quantity of political activity that distinguishes one university from another in the Princeton Review rankings, but rather students’ perceptions of themselves. This is where the rankings fall short.
The Princeton Review notifies its contacts at the schools it plans to survey, and it depends largely on these officials to inform the student body that the survey is available. Students are trusted to self-report what school they attend and to complete the survey found on the Princeton Review’s website. One survey per student per academic year is recorded, but there simply aren’t enough students who participate. This is where the trouble starts.
The average number of students surveyed per campus is 359. Without increasing this sample size for larger universities like GW, the rankings have a pretty high margin of error, according to political science professor Steven Balla, who teaches Scope and Methods of Political Science.
“You are talking a margin of error of about 5 percent with a sample of that size,” Balla said in an email this week. “So it could be hard to differentiate the political activity of schools that are fairly close on the scale.”
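Balla’s 5 percent figure checks out: for a yes/no question, the worst-case margin of error at 95 percent confidence is roughly 1.96 times the square root of 0.25 divided by the sample size. A quick back-of-the-envelope sketch (note the formula assumes a simple random sample, which the Princeton Review survey is not, so the true uncertainty is likely even larger):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a proportion at 95 percent confidence.

    p=0.5 maximizes the variance, giving the most conservative estimate;
    z=1.96 is the standard normal critical value for a 95 percent interval.
    """
    return z * math.sqrt(p * (1 - p) / n)

# 359 is the average number of respondents per campus cited above
print(f"{margin_of_error(359):.1%}")  # roughly 5 percent
```

Quadrupling the sample would only halve that margin, since error shrinks with the square root of the sample size – which is why small fixed samples hit large schools like GW hardest.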
But perhaps the most glaring problem with the Princeton Review rankings is how they choose the students surveyed – the answer is that they don’t. Only the responses of the handful of students who intentionally seek out the 80-question survey on the Princeton Review’s website are recorded. This is called a convenience sample. Any introductory statistics class will tell you that such samples offer no way to ensure that the students polled represent a diverse cross-section of the whole University, so students should take these results with a grain of salt.
If GW wants to reclaim its place at the top of the list, administrators should simply prompt students to fill out the survey with an email announcing its availability. The Princeton Review claims on its website that schools that sent an email notification to the entire student body “yielded robust response rates,” which means a more accurate picture of the level of political activity at a school. When filling out the survey, students should reflect on their own political activity over the past year, instead of comparing themselves to their peers, so they can give more objective answers. Meanwhile, the Princeton Review should also report how many responses it receives from each university so that no one is misled about the margin of error of the rankings they are reading.
The Princeton Review does report that about 80 to 85 percent of students said their school’s profile was “extremely” or “very” accurate. But this percentage is suspect too, because it comes from the same self-selected respondents – and rankings are only as good as their methodology.
Sydney Erhardt, a senior majoring in international affairs, is a Hatchet columnist.
Want to respond to this piece? Submit a letter to the editor.