The National Academies: Advisers to the Nation on Science, Engineering, and Medicine
NATIONAL ACADEMY OF SCIENCES NATIONAL ACADEMY OF ENGINEERING INSTITUTE OF MEDICINE NATIONAL RESEARCH COUNCIL

Date: 04/17/2002
Session: 107th Congress (Second Session)
Witness(es): Robert W. Fri
Credentials: Visiting Scholar, Resources for the Future, and Chair, Committee on Benefits of DOE R&D on Energy Efficiency and Fossil Energy, Board on Energy & Environmental Systems, Division on Engineering and Physical Sciences, National Research Council, The National Academies
Chamber: House
Committee: Interior Subcommittee, Committee on Appropriations, U.S. House of Representatives
Subject: Energy Research: Interior & Related Agencies Appropriations, Fiscal 2003

TESTIMONY OF ROBERT FRI VISITING SCHOLAR RESOURCES FOR THE FUTURE AND CHAIRMAN COMMITTEE ON BENEFITS OF DOE R&D ON ENERGY EFFICIENCY AND FOSSIL ENERGY BOARD ON ENERGY & ENVIRONMENTAL SYSTEMS DIVISION ON ENGINEERING & PHYSICAL SCIENCES NATIONAL RESEARCH COUNCIL ON ENERGY RESEARCH -- MEASURING SUCCESS BEFORE THE SUBCOMMITTEE ON INTERIOR AND RELATED AGENCIES

COMMITTEE ON APPROPRIATIONS U.S. HOUSE OF REPRESENTATIVES

APRIL 17, 2002

Good morning, Mr. Chairman and members of the Subcommittee. My name is Robert W. Fri. I am currently a visiting scholar at Resources for the Future, an independent nonprofit organization that does research to help people make better decisions about the conservation and use of natural resources and the environment. Today, however, I am appearing as the chair of a National Research Council (NRC) committee that evaluated the benefits and costs of selected Department of Energy research and development (R&D) programs since 1978. Your subcommittee requested this study in the FY 2000 appropriations act. We completed the project last summer and published our findings in a report entitled Energy Research at DOE: Was It Worth It?

In addition to the substance of our assignment, a primary purpose of the NRC study was to develop a methodology for evaluating benefits and costs that could be applied uniformly to DOE’s R&D programs. My remarks today will focus on these methodological issues, not on the details of our evaluation of the benefits and costs of the R&D projects that we studied. Although my comments today draw extensively on the NRC experience, I want to stress that I am offering my personal views.

Specifically, I will discuss:

• the lessons learned from the NRC study;
• the adaptation of our methodology to prospective evaluation; and
• the implementation of a uniform evaluation system.

Before turning to these topics, however, I will very quickly summarize the methodology our committee developed.

The NRC Methodology

The methodology developed by our committee has two major elements. The first of these is a matrix for categorizing benefits, a copy of which is attached [in Microsoft PowerPoint].

The idea behind the matrix is quite straightforward:

• The rows of the matrix represent the major goals of DOE research. DOE’s mission is to ensure the Nation has reliable access to affordable and environmentally sound energy. Accordingly, the rows of the matrix correspond to the economic, environmental, and security goals implicit in the Department’s mission.

• The columns of the matrix reflect the uncertainty of the research process itself. Some applied research succeeds both technically and commercially, and falls into the column for realized benefits. Other technologies succeed technically but the economic and/or policy conditions necessary for commercial success fail to materialize; these are option benefits. Finally, research into technologies that are not (yet) successful technically may produce important knowledge benefits.

The matrix thus provides an accounting framework into which the results of a research project can logically be placed. In analyzing thirty-nine case studies using this framework, the NRC committee found that significant benefits could be assigned to each of the nine elements of the matrix.
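To make this accounting framework concrete, here is a minimal sketch in Python (my own illustration, not part of the NRC report; the cell names follow the matrix, but the benefit figures and the `record_benefit` helper are invented) showing how a case study's results might be tallied into the nine cells:

```python
from collections import defaultdict

# The two axes of the NRC benefits matrix.
GOALS = ("economic", "environmental", "security")   # rows: DOE mission goals
OUTCOMES = ("realized", "option", "knowledge")      # columns: research outcomes

# Accumulate benefits (here in dollars) into the nine matrix cells.
matrix = defaultdict(float)

def record_benefit(goal: str, outcome: str, amount: float) -> None:
    """File one case-study benefit into its matrix cell."""
    if goal not in GOALS or outcome not in OUTCOMES:
        raise ValueError(f"unknown cell: ({goal}, {outcome})")
    matrix[(goal, outcome)] += amount

# Hypothetical entries for a single case study (figures are made up):
record_benefit("economic", "realized", 120e6)    # commercialized efficiency gain
record_benefit("environmental", "option", 40e6)  # workable technology awaiting policy conditions
record_benefit("security", "knowledge", 5e6)     # useful knowledge, no deployed product yet

for goal in GOALS:
    row = "  ".join(f"{o}: ${matrix[(goal, o)]/1e6:6.1f}M" for o in OUTCOMES)
    print(f"{goal:13s} {row}")
```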

The second major element of the NRC methodology is the set of instructions for defining each of the nine types of benefits in the matrix. Appendix D of our report contains lengthy definitions that I will not try to summarize here. But to illustrate their importance, the definitions require that benefits of new technology be measured against the next best technology – not against some static baseline. Similarly, we do not give environmental credit for finding cheaper ways of meeting existing environmental standards, although we might assign an economic benefit in such a case.
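The baseline rule is worth a worked example with invented numbers (none of these figures come from the report). Suppose a new technology delivers a unit of energy service for $60, the next best technology available at the same time costs $70, and an outdated static baseline costs $100. The definitions credit $10 per unit, not $40:

```python
# Illustrative figures only; none of these numbers come from the NRC report.
cost_new_tech  = 60.0    # cost per unit of service with the new technology
cost_next_best = 70.0    # next best technology available at the same time
cost_static    = 100.0   # outdated static baseline

creditable = cost_next_best - cost_new_tech   # 10.0: what the NRC definitions allow
overstated = cost_static - cost_new_tech      # 40.0: what a static baseline would imply

print(f"credited benefit per unit: ${creditable:.2f}, not ${overstated:.2f}")
```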

I would be happy to answer questions about the details of the methodology, but believe that this brief summary provides sufficient background for the conclusions to which I now want to turn.

Lessons Learned from the NRC Study

Probably the greatest lesson we learned from analyzing the thirty-nine case studies presented in our report is that the methodology works. That is, it is feasible to design a uniform methodology and to apply it consistently across a diverse set of applied research programs. In the course of our study, we reviewed dozens of other evaluations that used many different evaluation methods. The lack of a uniform methodology has been one of the chief causes of confusion about the benefits of research, and the resulting frustration was, I believe, a primary reason for commissioning our project. The simple fact that we were able to apply a uniform method is therefore of considerable significance. For if that is in fact possible, then the stakeholders in DOE’s applied research program can reasonably expect the Department to develop and apply a uniform method for benefits estimation.

Although I am sure that our methodology can be refined and improved, I want to stress five critical features of it. I believe that these features are essential to any system that is ultimately used by the Department.

As you can imagine, this was not a particularly popular policy, but it made a huge difference. To illustrate, the NRC committee produced benefits estimates for fourteen programs that were from two to ninety-five percent lower than the Department’s own estimates. What is important here is not the size of the markdown but the spread. Our committee may have been overly conservative and perhaps should not have marked the DOE estimates down so sharply. Rather, I believe it was our consistency in applying the rules that led to the large range of markdowns, and in the end yielded a comparability among benefits estimates that had not existed before.

I noted earlier that our methodology can be refined and improved, and I urge that this be done. Two refinements are of particular importance: disaggregating the knowledge category, which we used as something of a catchall, and updating the definition of security to encompass homeland security issues.

Prospective Evaluation

The NRC study focused exclusively on a retrospective analysis of actual outcomes of applied research. This review of twenty-eight years of experience produced valuable insights and should be replicated periodically. But program planning and budget decisions are made prospectively, and the NRC methodology needs to be adapted to perform this function.

Our committee did not attempt to develop a prospective evaluation methodology. In my own view, however, the steps needed to adapt the NRC process to a forward-looking methodology are fairly straightforward. Let me offer a few suggestions in this regard, recognizing that considerable work is needed to describe a prospective methodology in as much detail as the retrospective methodology in the NRC report.

• Account for technical and other uncertainties. Analyzing actual outcomes involves few uncertainties, and none that depend on the analyst’s ability to predict the future. Prospective analysis is not so simple. Moreover, the uncertainties associated with different technologies are not the same; some research is inherently riskier than other research. Prospective analysis must cope with this spread of technical uncertainty, despite the fact that program advocates do not like to admit it exists. Other important uncertainties lodge in the economic and policy scenarios assumed to exist in the future. Tools exist for dealing with such uncertainties and must be included in any prospective methodology; a simple illustration appears after this list.

• Ensure compatibility with portfolio analysis. DOE plans and operates a portfolio of applied research that should be balanced in terms of risk, timing, and expected benefits. Prospective analysis of individual research projects should lend itself to constructing and evaluating such portfolios.

• The rows are fine but the columns need work. The benefits matrix rows represent the goals of DOE research and so apply to both retrospective and prospective analysis. Furthermore, most of the definitional material in Appendix D is the same for both methods. But the concept of realized, option, and knowledge benefits does not work so well prospectively. A straightforward modification would be to require each research project to define its expected outcome as either a fully commercialized technology or a specific bit of knowledge relevant to an applied problem (e.g., a pollutant characterization that determines the level of environmental control required in a power plant). In addition, the project should define the economic and policy scenario envisioned to exist at the time the technology outcome is available. The scenario should reflect the most likely case for those technologies that are intended for commercial deployment, but could be a different case if the intent is to produce a technological option as an insurance policy.

• Specify interim outputs. The research project plan should specify interim outputs that can be monitored with the NRC retrospective matrix as they develop. Used in this way, the matrix tracks the progress of research as it develops knowledge, creates a workable but uneconomic pilot technology, and ultimately finds its way into commercial deployment. This tracking mechanism then serves to support one of the major NRC recommendations – that research progress be evaluated regularly and funding stopped when progress does.
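To illustrate the kind of tool mentioned in the first bullet above, the sketch below (again my own illustration, not a method prescribed by the committee; the scenarios, probabilities, and benefits are all invented) computes a probability-weighted expected benefit across technical and policy scenarios:

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    probability: float  # likelihood of the scenario, technical risk included
    benefit: float      # benefit realized under the scenario, in dollars

# Invented scenarios for a hypothetical project; technical risk and policy
# assumptions are folded into the probabilities, which must sum to one.
scenarios = [
    Scenario("technical success, favorable policy", 0.25, 500e6),
    Scenario("technical success, current policy", 0.35, 150e6),
    Scenario("technical shortfall, knowledge benefits only", 0.40, 20e6),
]

assert abs(sum(s.probability for s in scenarios) - 1.0) < 1e-9

expected_benefit = sum(s.probability * s.benefit for s in scenarios)
print(f"expected benefit: ${expected_benefit/1e6:.1f}M")  # $185.5M
```

A portfolio version of this calculation would simply repeat it across projects and then examine the resulting balance of risk, timing, and expected benefits, in keeping with the portfolio point above.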

Implementation

The central recommendation of the NRC report is to adopt a uniform methodology for evaluating the benefits and costs of DOE’s applied research programs. Doing so requires that all of the stakeholders find common ground: the Congress, the Office of Management and Budget, DOE senior management, and the DOE program offices. This is not a simple task.

In my view, the leadership for this task can only come from DOE management. Neither the Congress nor OMB has the management resources to work through the details of the methodology or to coordinate the interests of all the stakeholders. The DOE program offices have these capabilities, but very little incentive to develop a methodology that applies uniformly to all of their activities. Thus, the ball falls in the court of DOE top management, and I recommend that you encourage the Department to take up this responsibility at the highest level.

At the same time, both the Congress and OMB should exercise careful oversight to ensure that the critical features of the methodology (as noted earlier in my testimony) are in fact incorporated in DOE’s work. Additionally, periodic review of the operation of the evaluation system is appropriate to maintain its integrity. (I should note that this is a review of the system, not of the progress of research programs.) Establishment of this oversight mechanism should go hand in glove with setting up operational responsibility at the top level of DOE.

Finally, let me say that my colleagues on the study committee and on the NRC’s Board on Energy and Environmental Systems are fully committed to providing whatever help we can to make the implementation of a uniform system of benefits evaluation successful. Not only is the objective important, but we believe that your interest, as well as the interest of the other stakeholders, means that the time is ripe for action.

Thank you. I would be happy to answer any questions you may have about my comments or about our study.
