The National Academies: Advisers to the Nation on Science, Engineering, and Medicine
NATIONAL ACADEMY OF SCIENCES NATIONAL ACADEMY OF ENGINEERING INSTITUTE OF MEDICINE NATIONAL RESEARCH COUNCIL

Date: 06/25/2009
Session: 111th Congress (First Session)
Witness(es): Micah D. Lowenthal
Credentials: Director, Nuclear Security and Nuclear Facility Safety Program, Nuclear and Radiation Studies Board, National Research Council, The National Academies; and Study Director, Committee on Advanced Spectroscopic Portals, Nuclear and Radiation Studies Board, Division on Earth and Life Studies, National Research Council, The National Academies
Chamber: House
Committee: Investigations and Oversight Subcommittee, Committee on Science and Technology, U.S. House of Representatives
Subject: The Science of Security: Lessons Learned in Developing, Testing and Operating Advanced Radiation Monitors

Hearing on the Science of Security:
Lessons Learned in Developing, Testing, and Operating Advanced Radiation Monitors

Testimony of

Micah D. Lowenthal, Ph.D.
Director, Nuclear Security and Nuclear Facility Safety Program
and
Study Director, Committee on Advanced Spectroscopic Portals
Nuclear and Radiation Studies Board
Division on Earth and Life Studies
National Research Council
The National Academies

before the

Subcommittee on Investigations and Oversight
Committee on Science and Technology
U.S. House of Representatives

June 25, 2009

Good morning, Chairman Miller, Ranking Member Broun, and members of the committee. My name is Micah Lowenthal, and I am the director of the program on nuclear security and nuclear facility safety in the National Research Council’s Nuclear and Radiation Studies Board.1 I am here to describe the recently issued interim report from a congressionally mandated National Research Council study on advanced spectroscopic portals (ASPs). I am the study director supporting the authoring committee of that report.2 The full report is classified, but an abbreviated version was also produced for unrestricted public release.3 My testimony is based on the abbreviated version. I will begin by providing background on the request for this study. I will then summarize the main messages of the report and discuss some of the points most relevant to this hearing.

BACKGROUND ON THE REQUEST FOR THE STUDY

Containerized cargo entering the United States at sea ports and land-border crossings for trucks is currently screened for radiation using detectors, called radiation portal monitors (RPMs), in conjunction with handheld radioisotope identifiers (RIIDs). The Department of Homeland Security (DHS) is seeking to deploy new radiation detectors, called advanced spectroscopic portals (ASPs), to replace the current RPM and RIID combination, which has known deficiencies. The ASPs consist of new detector equipment and new software, including algorithms for isotope identification.

Following some controversy over the testing and evaluation of the new ASPs, Congress required in Title IV of Division E of the Consolidated Appropriations Act, 2008 (Public Law 110-161) that the Secretary of Homeland Security submit to Congress a report certifying that ASPs would provide a "significant increase in operational effectiveness" over continued use of existing screening devices. This certification is a precondition for proceeding with full-scale procurement of ASPs. Congress also directed DHS to request that the National Academies advise the Secretary on the certification decision by helping to validate testing completed to date, providing support for future testing, assessing the costs and benefits of this technology, and bringing robustness and scientific rigor to the procurement process. Due to delays in the test and evaluation program, the Academies and DHS agreed that the study committee would issue an interim report that provides (1) the committee's evaluation of testing plans and execution it has seen, and (2) advice on how the Domestic Nuclear Detection Office (DNDO) can complete and make more rigorous its ASP evaluation for the Secretary and the nation.

This interim report is based on testing done before 2008 (referred to as past tests), plans for and preliminary results from performance tests carried out in 2008, and the agency's draft cost-benefit analysis as of October 2008. The committee received briefings on the performance test results and analysis and on the cost-benefit analysis, but it had not received any written reports on those topics by February 2009, when the interim report entered the Academies' peer-review process.

I will now discuss each element of the study task.

SUMMARY OF THE MAIN MESSAGES OF THE REPORT

First, I want to note that the committee focused much of its attention on performance testing. This is not because the other tests are unimportant—the portals will be of little use if they are incompatible with U.S. Customs and Border Protection's (CBP's) computer systems, for example—but because the design, execution, and evaluation of those tests are comparatively routine, even if solutions to problems revealed by the tests are not. The design, execution, and evaluation of performance tests for the ASPs are more challenging and involve more of the science and engineering principles on which the committee has advice to offer.

Past Performance Testing

Performance tests prior to 2008 had serious flaws that were identified by the Government Accountability Office and the Secretary's ASP Independent Review Team. All truck-conveyed containers at ports and border crossings currently pass through primary screening, which is conducted with an RPM. Containers that trigger an alarm are sent to secondary screening, which is conducted with an RPM and a RIID. The tests prior to 2008 did not adequately assess the capabilities of the ASP systems in primary and secondary screening compared with the currently deployed RPM-RIID screening systems, nor did they establish whether the ASP systems met performance criteria for procurement.

There were serious flaws in the testing protocol. Notably, DNDO used the same radiation sources in performance testing that were used to set up and calibrate that testing; device setup and any calibration must use radiation sources separate from those used for testing. Also, standard operating procedures for the use of RIIDs in secondary screening were not followed in the performance tests, which disadvantaged the RIIDs in comparisons with ASPs.
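The principle here is analogous to keeping calibration data and evaluation data disjoint. As a minimal sketch in Python (with hypothetical source names; this is not DNDO's actual protocol), one way to enforce the separation is to partition the available source inventory before any calibration begins:

```python
import random

# Hypothetical inventory of test sources; the names are illustrative only.
source_inventory = [
    "HEU-config-A", "HEU-config-B", "DU-shielded", "Cs-137-point",
    "Co-60-point", "Ba-133-point", "WGPu-surrogate", "NORM-cargo",
]

def split_sources(inventory, calibration_fraction=0.5, seed=1):
    """Randomly partition sources so calibration and test sets are disjoint."""
    rng = random.Random(seed)
    shuffled = inventory[:]
    rng.shuffle(shuffled)
    n_cal = int(len(shuffled) * calibration_fraction)
    calibration_set = shuffled[:n_cal]
    test_set = shuffled[n_cal:]
    # The key requirement: no source used for setup or calibration
    # reappears in the performance-test set.
    assert not set(calibration_set) & set(test_set)
    return calibration_set, test_set

calibration_set, test_set = split_sources(source_inventory)
print("Calibrate with:", calibration_set)
print("Test with:     ", test_set)
```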

2008 Performance Testing

DNDO staff acknowledged several pre-2008 deficiencies and designed its 2008 test plan to correct them. The study committee examined the revised test plan, observed tests, and questioned test personnel, and concluded that DNDO did address those problems.

Because of the ASP configurations and the size of their detectors, ASPs would be expected to improve isotope identification, provide greater consistency and coverage in screening, and increase the speed of screening compared with the current RPM-RIID combination when used in secondary screening. Consequently, DNDO's 2008 performance tests of ASPs in secondary screening focused on confirming and quantifying that advantage for several threat objects, cargos, and configurations.

When used for primary screening, an ASP system must be compared with the existing RPM-RIID combination for primary and secondary screening. This is because ASPs perform an isotope identification function in primary screening, whereas isotope identification is possible only in secondary screening with the current RPM-RIID system. DNDO's preliminary analysis did account for this difference.
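To make that comparison concrete, consider the following minimal sketch. The probabilities are placeholders rather than measured values, and multiplying the two stages assumes they act independently, which is a simplification for illustration only:

```python
def chain_identification_probability(p_rpm_alarm, p_riid_id_given_referral):
    """Probability that the current two-stage system both alarms in primary
    screening (RPM) and then correctly identifies the isotope in secondary
    screening (RIID), assuming the two stages act independently."""
    return p_rpm_alarm * p_riid_id_given_referral

# Placeholder values chosen for illustration; not results from the 2008 tests.
p_current_chain = chain_identification_probability(0.90, 0.70)
p_asp_primary = 0.80  # placeholder single-pass ASP identification probability

print(f"RPM + RIID chain (primary + secondary): {p_current_chain:.2f}")
print(f"ASP performing identification in primary: {p_asp_primary:.2f}")
# The relevant comparison is ASP-in-primary versus the whole RPM+RIID chain,
# not ASP versus RPM alone, because the RPM by itself identifies no isotopes.
```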

The study committee found that the 2008 performance tests were an improvement over previous tests. They enabled DNDO to identify and physically test some of the performance limits of ASP systems. However, the committee identified several shortcomings of these tests: (1) the selected test configurations were too limited to assess the performance of ASP systems against the range of threat objects, cargos, and configurations that could be encountered during cargo screening operations at ports without modeling to complement the physical experiments; (2) the sample sizes (the number of test runs of each case) were small and limit the confidence that can be placed in comparisons of ASP and RPM-RIID performance; and (3) some of the performance metrics used in DNDO's analysis are not the correct ones for comparing the operational performance of cargo screening systems. These shortcomings are described in greater detail in our report. In the committee's judgment, DHS cannot determine whether ASPs can consistently outperform current RPM-RIID systems in routine practice until these shortcomings are addressed. Better physical measurement and characterization of the performance of the systems are a necessary first step but may not be sufficient to enable DHS to conclude that the ASPs meet the criteria it has defined for achieving a "significant increase in operational effectiveness."
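As an illustration of the second shortcoming, the sketch below shows how sample size alone drives the width of the confidence interval on an estimated detection or identification rate. It uses the standard Wilson score interval; the counts are hypothetical, not results from the 2008 tests:

```python
import math

def wilson_interval(successes, trials, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    if trials == 0:
        raise ValueError("trials must be positive")
    p_hat = successes / trials
    denom = 1 + z**2 / trials
    center = (p_hat + z**2 / (2 * trials)) / denom
    half_width = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / trials
                                         + z**2 / (4 * trials**2))
    return center - half_width, center + half_width

# Illustrative only: 18 detections in 20 runs vs. 180 detections in 200 runs.
print(wilson_interval(18, 20))    # roughly (0.70, 0.97): a wide interval
print(wilson_interval(180, 200))  # roughly (0.85, 0.93): much tighter
```

Both hypothetical test campaigns estimate the same 90 percent rate, but only the larger one constrains it well enough to support a confident comparison between two systems.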

The committee recommends modifications to the testing procedures that are being used by DHS. These modifications would influence subsequent procurement steps, as described in the recommendations for the procurement process.

Recommended Approach for Testing and Evaluation

To make the testing and evaluation more scientifically rigorous, the committee recommends an iterative approach involving modeling and physical testing. The threat space--that is, the set of possible threat objects, configurations, surrounding cargos, and conditions of transport--is so large and multidimensional that DNDO needs an analytical basis for understanding the capabilities of ASP detectors for screening cargo. DNDO's current approach to testing the identification algorithms in ASP systems is to physically test small portions of the threat space and to use other experimental data to interpolate and extrapolate to other important parts of the threat space.

The committee recommends that DNDO use computer models of threat objects, radiation transport, and detector response to simulate ASP performance. DNDO can then use physical experiments to validate these computer models, which would allow a critique of the models' fidelity to reality and show where model refinements are needed. Physical testing and model refinement would proceed iteratively until the models provide an acceptably accurate depiction of reality. With validated models, DNDO could evaluate the performance of ASP systems over a larger, more meaningful range of the threat space than is feasible with physical tests alone.
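The recommended loop can be summarized schematically as follows. The functions below are placeholders standing in for a radiation-transport and detector-response code and for physical measurements; the refinement step is deliberately crude and serves only to show the structure of the iteration:

```python
# Schematic sketch of an iterative simulate -> compare -> refine loop.
# All functions are placeholders, not an actual radiation-transport code.

def simulate_response(model, scenario):
    """Placeholder for a radiation-transport / detector-response simulation."""
    return model["gain"] * scenario["true_signal"]

def measure_response(scenario):
    """Placeholder for a physical measurement of the same scenario."""
    return scenario["measured_signal"]

def refine(model, mean_discrepancy):
    """Placeholder model refinement (here, a crude gain correction)."""
    model = dict(model)
    model["gain"] *= 1 - 0.5 * mean_discrepancy
    return model

def validate(model, scenarios, tolerance=0.05, max_iterations=20):
    """Iterate until the model matches physical measurements within tolerance
    on every validation scenario, or the iteration budget runs out."""
    for _ in range(max_iterations):
        discrepancies = [
            (simulate_response(model, s) - measure_response(s)) / measure_response(s)
            for s in scenarios
        ]
        if max(abs(d) for d in discrepancies) <= tolerance:
            return model, True   # model judged acceptably accurate
        model = refine(model, sum(discrepancies) / len(discrepancies))
    return model, False          # more physical testing / refinement needed

scenarios = [{"true_signal": 100.0, "measured_signal": 92.0},
             {"true_signal": 250.0, "measured_signal": 228.0}]
model, validated = validate({"gain": 1.0}, scenarios)
print(model, validated)
```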

This kind of interaction between computer models and physical tests is standard in the development and deployment of some high-technology equipment and is essential for building scientific confidence in technology performance. The performance tests conducted in 2008, and even prior to 2008, can be used to help refine and validate models. The committee also notes that the skills required to proceed exist in the national laboratories.

Recommended Approach for the Procurement Process

The idea of an iterative approach also extends to deployment of ASP systems at ports of entry. The committee noted that DHS' testing philosophy is oriented toward a one-time certification decision in the near future. However, the mandate for passive radiation screening of cargo at ports of entry is expected to continue indefinitely. Rather than focusing on a one-time decision about the deployment of ASPs, the committee suggests that current testing be viewed as a first step in a continuous process of system improvement and adaptation to changes in the threat environment, the composition of container cargo, technological and analytical capabilities, and the nature of commerce at ports of entry. These factors have changed significantly over the last decade and can be expected to evolve—in both predictable and unpredictable ways—in the future. The committee recommends that DHS develop a process for incremental deployment and continuous improvement, with experience leading to refinements in both technologies and operations over time, rather than making a single product purchase to replace current screening technologies. The ASP deployment process should be developed to address and exploit such changes. This would enable DNDO to adapt and continually update its screening systems so that they do not become outdated, as they would after a one-time deployment.

As the first step in this process, the committee recommends that DHS deploy its currently unused low-rate initial production ASPs for primary and secondary inspection at various sites under a program of extended operational testing. Such deployment, even on a limited scale, would provide valuable data concerning ASP operation, reliability, and performance, and would allow DHS to better assess ASP capabilities in multiple environments without investing in a much larger acquisition at the outset.

The development of the hardware for radiation detection and the software for analyzing detector signals is separable. The current DHS procurement process is a competition among vendors to provide the combined systems, which has been useful. However, as DHS moves forward, the committee recommends that it match the best hardware to the best software (particularly the algorithms), drawing on tools developed by the competing companies and others, such as the national laboratories.

The deployment of ASPs will not eliminate the need for handheld detectors with spectroscopic capabilities. Because some of the improvement in isotope identification offered by the ASPs over the RIIDs is a result of software improvements, the committee recommends that these improvements be incorporated into handheld detectors. Improved software might significantly improve RIID performance and expand the range of deployment options available to Customs and Border Protection for cargo screening.

By separating the hardware and software elements of the system and engaging the broader science and engineering community, DHS would have increased confidence in its procurement of the best product available with current technology, and simultaneously could advance the state of the art.

Recommended Approach for Cost-Benefit Analysis

The preliminary analysis presented to the committee suggests that benefits of deploying the ASPs may not be clearly greater than the costs. Because DNDO's preliminary estimates indicate that the cost increases from replacing the RPM-RIID combination with ASPs exceed the cost reductions from operational efficiencies, it is important to consider carefully the conditions under which the benefits of deploying ASPs justify the program costs. A cost-benefit analysis (CBA) can provide a structure for evaluating whether a proposed program (such as the ASP program) is reasonable and justified.

The Secretary's decision on ASP certification is based, at least in part, on whether the ASPs meet the objectives in DHS' definition of "significant increase in operational effectiveness" (SIOE); however, other factors relating to the costs and benefits of the proposed ASP program will also need to be taken into account. DHS' definition of SIOE is a modest set of goals: As noted above, the increases in operational efficiency do not by themselves appear to outweigh the cost increases from replacing the RPM-RIID combination with ASPs, based on DNDO's preliminary estimates, and the criteria do not require a significantly improved ability to detect special nuclear material (an ingredient of a nuclear weapon) in primary screening. If the ASPs meet the defined goals and are able to detect the minimum quantities of nuclear threat material recommended by the "DOE guidance," DHS still will not know whether the benefits of the ASPs outweigh the additional costs associated with them, or whether the funds are more effectively spent on other elements of the Global Architecture.

A CBA can provide insight about alternative choices--for example, whether the benefits of a given program exceed its costs, and which choices are most cost-effective. To be effective, the CBA must include three key elements: (1) a clear statement of the objectives of the screening program; (2) an assessment of meaningful alternatives to deploying ASPs; and (3) a comprehensive, credible, and transparent analysis of in-scope benefits and costs.
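A minimal sketch of that comparative structure is shown below. The alternatives listed and the cost and benefit figures are placeholders chosen for illustration only; they are not DNDO estimates or committee findings:

```python
# Illustrative comparison of alternatives against a current-capability baseline.
# Costs and benefits are notional, discounted values in arbitrary units.

alternatives = {
    "keep current RPM-RIID":      {"lifecycle_cost": 1.0, "benefit": 1.0},
    "full ASP replacement":       {"lifecycle_cost": 2.4, "benefit": 1.6},
    "ASPs in secondary only":     {"lifecycle_cost": 1.4, "benefit": 1.3},
    "upgraded handheld software": {"lifecycle_cost": 1.1, "benefit": 1.2},
}

baseline = alternatives["keep current RPM-RIID"]

def incremental_net_benefit(option, baseline):
    """Benefit gained minus cost added, relative to the current deployment."""
    return ((option["benefit"] - baseline["benefit"])
            - (option["lifecycle_cost"] - baseline["lifecycle_cost"]))

for name, option in alternatives.items():
    print(f"{name:28s} incremental net benefit: "
          f"{incremental_net_benefit(option, baseline):+.2f}")
```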

The CBA should begin with a clear statement of what operational problem the ASPs are intended to address. This statement will define the role that the system plays in providing a layer in the defense against the importation of nuclear or radiological materials. The CBA should also include a narrative that clarifies how improving detection for containers at ports of entry to the United States fits into a larger effort to improve detection capabilities, in recognition of the many ways that materials could be brought into the United States through ports of entry that are not already screened, or across uncontrolled stretches of border.

Furthermore, to be useful in a procurement decision, a CBA must address whether funds are better spent to replace the currently deployed equipment or to expand coverage to other material pathways that currently have no radiation screening. DHS should consider tradeoffs among different options for allocating efforts and funds, looking at the overall system for ways to improve defenses against nuclear smuggling. Such an analysis is needed in the CBA for ASP systems because it is not evident that such an analysis has been provided elsewhere.

The CBA also needs to account for meaningful alternatives (including non-ASP systems) to reveal the scale of the benefits of ASPs for radiation screening and to determine whether these benefits outweigh the additional costs for procurement and deployment. The complexity of the container screening task suggests that there could be many different options worthy of consideration. These options include variations on ASP deployment configurations and operational processes, and application of technologies beyond the current RPM-RIID and ASP systems, such as deploying handheld passive detectors with state-of-the-art software and advanced methods for detecting nuclear materials. Considerations should include active interrogation, improved imaging systems, and integration of these existing technologies. These alternatives need to be compared with a baseline that reflects as realistically as possible the screening capability that DHS currently has in place. This baseline should reflect the number and placement of current RPM and RIID detectors, the sensitivity of these detectors based on how they are operated at each port, and the performance of existing handheld detectors as they are used in the field. The CBA must indicate what capability an investment in ASPs will provide beyond the existing systems as they are currently deployed and operated, or beyond alternative radiation detection technologies that could be developed and deployed at ports of entry.

In comparing alternatives, it is important that the CBA treat benefits and costs in a comprehensive, credible, and transparent manner. The benefit assessment should show how the ASP system would contribute to improving security with respect to prevention of the detonation of a nuclear device or radiological weapon in the United States. Because this is the primary objective of the ASP program, a CBA that is silent on this subject would be incomplete. Such an assessment is difficult and no assessment of such benefits will be definitive or unassailable. The cost assessment should cover all phases of the acquisition life cycle in a manner that is independent of contractor or program office biases, and it should also assess the risk of cost escalation associated with the estimate.

The committee recommends that DHS not proceed with further procurement until it has addressed the findings and recommendations in this report, and then only if the ASP is shown to be a favored option in the CBA.

“Lessons Learned”

For this hearing I was also asked to comment on "lessons learned" regarding the processes by which the ASPs have been researched, developed, and tested. Although the study committee only examined the testing of ASPs, my personal view is that three lessons could be learned:

First, the process of iterative modeling and testing can be applied more broadly to other complex technology development programs. Modeling coupled to validating experiments is a necessity for some technology applications because of the complex conditions in which these technologies must operate.

Second, incremental deployment with continuous improvement is a good strategy for deploying systems that have not fully matured, especially if they are envisioned to have an ongoing mission.

Third, a systems-level approach--that is, examining how to optimize choices within the overall system rather than narrowly focusing on one tradeoff decision--is applicable to almost every use of equipment in security applications.

This concludes my testimony to the committee. Thank you for the opportunity to testify on this important topic. I would be happy to elaborate on any of my comments during the question and answer period.

****
ENDNOTES

1 The National Research Council is the operating arm of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine of the National Academies, chartered by Congress in 1863 to advise the government on matters of science and technology. The Nuclear and Radiation Studies Board is responsible for oversight of National Research Council studies on safety and security of nuclear materials and waste.

2 Dr. Robert Dynes, a physicist at the University of California, a member of the National Academy of Sciences, and former president of the University of California, chaired this study.

3 The report is titled Evaluating Testing, Costs, and Benefits of Advanced Spectroscopic Portals for Screening Cargo at Ports of Entry: Interim Report. The abbreviated version of the report is available online at http://www.nap.edu/catalog.php?record_id=XXXX.
