
Data helps shape next steps at GW

Six years ago, University officials flagged a glaring weakness in an accreditation self-study: The data they collected to measure academic progress often went to waste, failing to prompt any real change.

Now, with a response to that report due June 1, administrators are trying to improve the University’s previously lackluster assessment practices as accreditors push for proof of success and colleges nationwide focus on the numbers.

They are adopting a “Moneyball” approach to decision-making – increasing oversight of academic assessment, rewarding departments that make data-based changes and studying applicants to shape admissions practices.

“There’s always been a lot of data, but we just haven’t used all the data,” said Senior Vice Provost for Academic Affairs and Planning Forrest Maltzman, who chaired that accreditation study and now serves as the provost’s second-in-command.

Since the 2007 self-study conducted for the Middle States Commission on Higher Education, the University has collected data from each of its schools on how students are learning, based on yearly department-level reviews of senior papers and capstone courses.

The University then created the Office of Academic Planning and Assessment in 2008, but administrators say they are still looking to do more, such as incentivizing academic departments to overhaul courses if they find their teaching isn’t working.

And now that the provost’s office oversees GW’s student life arm, those offices will undergo reviews, just like academic departments.

“I just think there’s this natural tendency that unless you do that periodically, you tend to make decisions on assumptions,” Provost Steven Lerman said. “We’re doing that in many different parts of the organization, asking, ‘How do you know what works and what doesn’t work?’ ”

The push by administrators and accreditors for open data has been a game changer, said Dan Ullman, Columbian College of Arts and Sciences associate dean of undergraduate students and a mathematics professor. He said it has helped by requiring departments to report more information to deans’ offices.

“In the old model, faculty were king of their fiefdom in the classroom. They got to assess their students, give their grades, determine what they wanted to teach the students,” Ullman said. “Now we’re saying, we have a communal responsibility to teach the students.”

Creating a culture of assessment

Data collection is becoming mandatory as accreditors rapidly raise their expectations that universities prove they are helping students learn, said Cheryl Beil, GW’s associate provost for academic planning and assessment.

She is heading the University’s required five-year check-up with its accreditor, the Middle States Commission on Higher Education. The report will be made public in the fall.

“It used to be, you wrote up a report, and it was fine,” Beil said. “They’ve really ramped up the assessment piece, and it’s assessment of student learning outcomes, but it’s also institutional assessment.”

The University’s arts and sciences, business and education schools all added positions to oversee assessment and data collection. GW has also given out $50,000 in the last four years for departments to revamp assessment.

Beil said the University has made significant strides since that 2007 report, which said that GW needed to shift from a “culture of data collection and investment to a culture of assessment and renewal.”

Those attempts were underway when administrators caught the admissions office in a decade-long practice of inflating freshman profile statistics. Maltzman said he discovered the error when he reviewed admissions office data last summer.

The error, which administrators maintain was not intentional, kicked GW out of the U.S. News & World Report top colleges ranking for one year. The Office of Academic Planning and Assessment, which Beil heads, now oversees data collection instead of the admissions office.

Beil said Lerman and Maltzman have made the difference in shifting the culture.

“I’ve been collecting data here on students since 1984, and I’ve seen a number of administrators. Forrest and Steve Lerman are data-driven people, so they really care about data and are using the data,” Beil said. “Partly, it reflects a national trend, but it’s also the people who are there.”

Former University President Stephen Joel Trachtenberg, who retired in 2007 after leading GW for 19 years, said his administration was built on a mix of numbers and gut. He said while “even back in the day we used data to make decisions,” he was skeptical that it would prove to be a cure-all.

“What you’re looking for is balance,” he said. “I don’t have patience for people who say we’re just doing what the numbers dictate. You always have to bring an intellectual, cognitive, human component to making decisions.”

But the data focus is sharpest, administrators say, in the science of admissions and enrollment management. The University hired its first enrollment chief last month, cementing its commitment not just to marketing GW to applicants, but also to shaping a class that’s academically stronger and more diverse.

Lerman and Maltzman said they are now polling and surveying students and separating them into control groups. As the University hosts more events for accepted students this year, administrators are tracking whether those students wind up in Foggy Bottom.

If admitted students check off that they intend to study a certain field – whether it is physics or accounting – they may get an email from a department chair offering to answer questions. The move is calculated, Maltzman said, and aimed at understanding how well administrators’ constant tinkering works.

“Between the two of us, we can typically come up with, for any given action, at least two hypotheses that actually lead you in different directions, and the only way to tell the two apart is to get some data,” Lerman said.

A practice usually ignored

When assistant economics professor Irene Foster arrived at GW in 2010, she heard that students in introductory classes lacked basic math skills. So in pursuit of raising standards, the economics department started handing out practice algebra tests in the first days of those classes.

Half of the students failed.

Despite student gripes about the tests, the department used the data from those exams to make evidence-based course redesigns. Foster said top administrators have supported the approach, handing out about $12,000 to help the department study its own classes.

“The econ department kind of stuck our neck out and did this because we didn’t know how it’d go down with students or the University,” Foster said. “We put some teeth behind it. Everyone knows we’re serious.”

Other departments, like biology, began instituting similar assessments this year. The political science department made a similar shift when it found it needed to raise writing standards.

Foster said universities typically neglect to make these kinds of assessment-based changes, at least in her experience teaching at universities like Vanderbilt, Indiana and Purdue.

“I haven’t seen anyone doing any assessment,” she said. “It should be more systematic. We owe it to our students.”

And now, as departments go through their first reviews since the 2007 study, there are big incentives to keep good track of their progress, Beil said.

“We’re asking programs to use those assessments if they want to ask for more faculty, and any changes they want to make need to be based on data,” she said. “We’re becoming much more data-driven.”
