
Alex Schneider: A better strategy to gather student input

Media Credit: Hatchet File Photo by Elise Apelian | Senior Staff Photographer
The GW Law School and some peer institutions administer the Law School Survey of Student Engagement.

Quick, take a minute to answer these questions: Have you had serious conversations with students who are very different from you? Have you worked harder than you thought you could to meet faculty members’ standards or expectations? How often this year did you prepare two or more drafts of a paper before turning it in?

Now, a real challenge: Did your responses to those questions say anything meaningful about you or the education you receive at GW?

The GW Law School clearly thinks so. It has joined peers across the country in administering the standardized Law School Survey of Student Engagement.

The survey, now in its 10th year, is conducted every spring by researchers at Indiana University. The LSSSE’s staff tailors the questionnaire to each subscribing law school, like GW’s, and emails students there to encourage them to respond. With GW’s law school enrollment sitting at about 1,300 students, we likely paid $4,700 for the privilege.

The LSSSE not only helps individual schools interpret the data they collect, but also pools the information gathered from all of its subscribers to make broad observations about the state of law schools across the country.

During the past month, LSSSE staff members have inundated law students with emails begging us to take the 15-minute survey “to improve the educational quality and the student experience” at the school.

It’s obvious they’re interested in collecting our responses. But the data isn’t all that useful, for one important reason: online polls are the easy way out. Soliciting feedback by actually talking to students takes far more time, and the imbalance shows – survey emails far outnumber emails about office hours or faculty-student committees.

Last spring, interim GW Law School Dean Gregory Maggs held a town hall-style meeting that drew dozens of students, who asked pointed questions about the concerns they wanted administrators to address. The format not only made administrators aware of students’ thoughts, but also gave them an outlet to respond to those concerns.

Surveys don’t work the same way. After faceless administrators collect answers, that’s it. There’s no dialogue, no follow-up. The school can boast that it was responsive to students by initiating a survey, and it can report the results in a series of nice graphs. But it’s hard to know how our answers are received or if they go toward implementing real change.

And it’s difficult to make the case that vague prompts and one-to-five rankings actually have value. From a statistical perspective, they certainly do not: the students who bother to open and complete email surveys are a self-selected group, not a representative sample of the student body.

And many of the surveys we’re asked to take don’t seem all that scientific. In the case of law school course evaluations, for instance, I’ve never quite understood the difference between rating a professor as “exceptional” or “excellent.”

Departments, student groups and even outside companies send online questionnaires to solicit feedback all the time. These surveys generate a whole lot of data, and I’m sure that someone in a fancy office is paid to pore over the results and turn our responses to vague questions into in-depth reports – like they do with the LSSSE.

As students, we’re bombarded with surveys left and right – some scientific, like the law school engagement survey, and some very clearly not.

I shudder to think that anyone would take the results of these less scientific polls seriously. It’s concerning that clicks made just to stop the constant email reminders would ever translate into decisions about hiring, curriculum or student services.

The school could improve its surveys by relying more on open-ended responses. In the context of course evaluations, for instance, open-ended responses answer the often-neglected question of why a respondent answered as he or she did. After all, rating a professor as exceptional can mean different things to different students. For some, exceptional means a professor who does not call on students in class. For others, it can mean a professor with a 10-page résumé.

It’s great to see administrators interested in student trends and feedback. But the solution needs to be more dialogue, not more data collected from impersonal, vague online surveys.

Alex Schneider, a second-year law student, is a Hatchet columnist.
