Getting the facts right in evaluating the Regulatory Studies Center

Steve Balla is an associate professor of political science, international affairs and public policy and public administration.

Chris Carrigan is an associate professor of public policy and public administration and director of the master of public administration program.

Kathryn Newcomer is a professor of public policy and public administration and former director of the Trachtenberg School of Public Policy and Public Administration.

In a previous op-ed, two of us clarified some misperceptions about the GW Regulatory Studies Center. The source of many of these misperceptions is a report issued last year by Public Citizen. As professors who have taught research design and statistics to GW undergraduate and graduate students for decades, we want to set the record straight about problematic methodological practices in the report. We also want to raise a larger point about the importance of critical analysis in evaluating information.

In short, the report clouds the ability of readers to independently assess its claims and presents incomplete findings about the content of RSC publications. It is simply not true that “96 percent of its publications argue for deregulation,” as asserted in an op-ed recently published in The Hatchet.

First, the report focuses on a small fraction of the work produced by those affiliated with the RSC. RSC outputs consist of books, reports, working papers, articles, public comments, testimonies, insights and commentaries. All of these outputs are central to the RSC’s mission to “improve regulatory policy through research, education and outreach.” As highlighted in our previous op-ed, publications include research arguing against the perception that “regulation kills jobs” and supporting mass comment campaigns, a means of regulatory participation used by groups favoring stringent health, safety and environmental regulations. But the report’s “96 percent” finding is derived solely from public comments submitted in response to proposed agency regulations. To be more specific, the finding is based on just 7 percent of the RSC’s output. As a result, absent systematic analysis of the remaining 93 percent of the RSC’s work, claims that RSC publications almost universally advocate for deregulation are unsubstantiated.

Second, the report focuses not on all RSC comments, but rather on an unrepresentative sample. Of the population of 55 comments referenced in the report, 28 are excluded at various points in the analysis. For example, some comments are eliminated because they are “comments that focused on aspects of proposed rule that were not relevant to stringency of regulation; cases in which we could not discern a recommendation’s potential effect on stringency; and cases in which we deemed the prospective effect of the recommendation to be minimal.”

In other words, the report openly acknowledges that it is common for RSC comments to focus on issues other than making clear, strong recommendations about regulation and deregulation. Yet this element of RSC comments is discarded completely from the key analysis. In the end, the report’s “96 percent” finding is derived from an analysis of less than half of RSC comments.

Third, the report fails to make transparent a number of crucial details about its methodology. A central element of the report is determining whether RSC comments favor more or less regulation. Precise criteria for determining what constitutes “more” versus “less” regulation, however, are never provided. Similarly, the report never mentions having two or more individuals independently evaluate the content of RSC comments, a standard practice for demonstrating the reliability of subjective judgments. Absent such transparency, it is not possible for readers to be confident in the results of the analysis, let alone replicate and verify the findings themselves.

In sum, the report’s “96 percent” finding is the product of a number of questionable methodological practices. The fact that this finding has gained attention on campus underscores the importance of critical analysis in assessing information. All of us are continually inundated with information that speaks directly to our deeply held views. How are we, as members of an academic community, to judge this information? There is no substitute for dispassionate analysis based on rigorous methodological principles. In that spirit, we encourage interested members of the GW community to give the diverse, voluminous body of work of their faculty colleagues and student peers affiliated with the RSC a close, careful look for themselves.
