This is the accessible text file for GAO report number GAO-11-86 
entitled 'Electronic Records Archive: National Archives Needs to 
Strengthen Its Capacity to Use Earned Value Techniques to Manage and 
Oversee Development' which was released on February 4, 2011. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to Congressional Requesters: 

January 2011: 

Electronic Records Archive: 

National Archives Needs to Strengthen Its Capacity to Use Earned Value 
Techniques to Manage and Oversee Development: 

GAO-11-86: 

GAO Highlights: 

Highlights of GAO-11-86, a report to congressional requesters. 

Why GAO Did This Study: 

Since 2001, the National Archives and Records Administration (NARA) 
has been working to develop an Electronic Records Archive (ERA) to 
preserve and provide access to massive volumes of all types of 
electronic records. However, in acquiring this system, NARA has 
repeatedly revised the program schedule and increased the estimated 
costs for completion from $317 million to $567 million. NARA is to 
manage this acquisition using, among other things, earned value 
management (EVM). EVM is a project management approach that, if 
implemented appropriately, provides objective reports of project 
status and unbiased estimates of anticipated costs at completion. 

GAO was asked to (1) assess whether NARA is adequately using EVM 
techniques to manage the acquisition and (2) evaluate the earned value 
data to determine ERA’s cost and schedule performance. To do so, GAO 
compared agency and contractor documentation with best practices, 
evaluated earned value data to determine performance trends, and 
interviewed cognizant officials. 

What GAO Found: 

NARA has, to varying degrees, established selected best practices 
needed to manage the ERA acquisition through EVM, but weaknesses exist 
in most areas (see table). For example, the scope of effort in ERA’s 
work breakdown structure is not adequately defined, thus impeding the 
ability to measure progress made on contractor deliverables. These 
weaknesses exist in part because NARA lacks a comprehensive EVM 
policy, training, and specialized resources and also frequently 
replans the program. As a result, NARA has not been positioned to 
identify potential cost and schedule problems early and thus has not 
been able to take timely actions to correct problems and avoid program 
schedule delays and cost increases. 

Table: Assessment of EVM Best Practices for ERA Program: 

EVM practice: Define the scope of effort using a work breakdown 
structure; 
GAO assessment: practice partially implemented. 

EVM practice: Identify who in the organization will perform the work; 
GAO assessment: practice fully implemented. 

EVM practice: Schedule the work; 
GAO assessment: practice partially implemented. 

EVM practice: Estimate the labor and material required to perform the 
work and authorize the budgets; 
GAO assessment: practice partially implemented. 

EVM practice: Determine objective measure of earned value; 
GAO assessment: practice partially implemented. 

EVM practice: Develop the performance measurement baseline; 
GAO assessment: practice partially implemented. 

EVM practice: Execute the work plan and record all costs; 
GAO assessment: practice fully implemented. 

EVM practice: Analyze EVM performance data and record variances; 
GAO assessment: practice partially implemented. 

EVM practice: Forecast estimates at completion; 
GAO assessment: practice not implemented. 

EVM practice: Take management action to mitigate risks; 
GAO assessment: practice partially implemented. 

EVM practice: Update the performance measurement baseline as changes 
occur; 
GAO assessment: practice not implemented. 

Sources: GAO analysis of agency and contractor data. 

[End of table] 

ERA’s earned value data trends do not accurately portray program 
status due to the program’s weaknesses in implementing EVM; however, 
historical program trends indicate that future cost overruns will 
likely be between $195 million and $433 million to fully develop ERA 
as planned and between $205 and $405 million at program end (see 
table). In contrast, the contractor’s estimated cost overrun is $2.7 
million. Without more useful earned value data, NARA will remain 
unprepared to effectively oversee contractor performance and make 
realistic projections of program costs. 

Table: Projected Cost Overruns for ERA Program: 

Estimate at completion: Development phase; 
Current NARA estimate: $567 million; 
GAO estimate[A]: $762 million to $1 billion; 
Net change (percentage change): $195 to $433 million (34 to 76 
percent). 

Estimate at completion: Life cycle; 
Current NARA estimate: $995 million; 
GAO estimate[A]: $1.2 to $1.4 billion; 
Net change (percentage change): $205 to $405 million (21 to 41 
percent). 

Sources: GAO analysis of agency and contractor data. 

[A] These estimates are reported as a range because they are rough 
estimates and thus incorporate assumptions made in the absence of 
validated cost inputs. 

[End of table] 

What GAO Recommends: 

GAO recommends, among other things, that NARA establish a 
comprehensive plan for all remaining work; improve the accuracy of 
earned value performance reports; and engage executive leadership in 
correcting negative trends. NARA generally concurred with GAO’s 
recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-11-86] or key 
components. For more information, contact David A. Powner at (202) 512-
9286 or pownerd@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

NARA Has Yet to Fully Establish Most EVM Practices to Manage the ERA 
Acquisition: 

ERA's Earned Value Data Do Not Reflect True Program Status or the 
Magnitude of Future Cost and Schedule Increases: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Comments from the National Archives and Records 
Administration: 

Appendix III: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Summary of Planned ERA System Capabilities by Increment: 

Table 2: Delays in Key ERA Program Milestones, as of June 2010: 

Table 3: Assessment of EVM Best Practices for NARA's ERA Program: 

Table 4: NARA and GAO Estimates at Completion for the ERA Program: 

Figures: 

Figure 1: ERA System Acquisition Strategy, as of June 2010: 

Figure 2: Simplified ERA Program Organizational Structure: 

Abbreviations: 

ANSI: American National Standards Institute: 

CIO: Chief Information Officer: 

EIA: Electronic Industries Alliance: 

EOP: Executive Office of the President: 

ERA: Electronic Records Archive: 

EVM: earned value management: 

IT: information technology: 

NARA: National Archives and Records Administration: 

OMB: Office of Management and Budget: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

January 13, 2011: 

The Honorable Thomas Carper: 
Chairman: 
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Wm. Lacy Clay: 
Chairman: 
Subcommittee on Information Policy, Census, and National Archives: 
Committee on Oversight and Government Reform: 
House of Representatives: 

Since 2001, the National Archives and Records Administration (NARA) 
has been developing a modern Electronic Records Archive (ERA). This 
major information system is intended to preserve and provide access to 
massive volumes of all types and formats of electronic records, 
independent of their original hardware or software. Moreover, ERA is 
to manage the entire life cycle of electronic records, from their 
ingestion through preservation and dissemination to customers. 
However, in acquiring this system, NARA has repeatedly revised the 
program schedule and increased the estimated costs for completion from 
$317 million to about $567 million. As a result, the direction of the 
program was recently changed in July 2010, and NARA is now planning to 
deploy an ERA system with reduced functionality by the end of fiscal 
year 2011. 

To more effectively manage such investments, the Office of Management 
and Budget (OMB) has a number of key initiatives under way--one of 
which was established in 2005 and directs agencies to fully implement 
earned value management (EVM).[Footnote 1] EVM is a project management 
approach that, if implemented appropriately, provides objective 
reports of project status, produces early warning signs of impending 
schedule delays and cost overruns, and provides unbiased estimates of 
anticipated costs at completion. More recently, in August 2010, OMB 
identified the ERA program as a high-priority program[Footnote 2] 
across the federal information technology (IT) portfolio. 

This report responds to your request that we review NARA's use of EVM 
to manage the ERA acquisition. Specifically, our objectives were to 
(1) assess whether NARA is adequately using EVM techniques to manage 
the acquisition and (2) evaluate the earned value data to determine 
ERA's cost and schedule performance. 

To address our objectives, we reviewed ERA's EVM-related 
documentation, including project work breakdown structures, project 
schedules, contractor performance reports, and executive management 
briefings. In doing so, we compared ERA's EVM practices with both 
OMB's requirements and key best practices recognized within the 
federal government and industry for the implementation of EVM. We also 
evaluated the earned value data from contractor performance reports to 
determine whether the program is projected to finish within planned 
cost and schedule targets. In addition, we interviewed relevant agency 
and contractor officials responsible for implementing EVM. 

We conducted this performance audit from March 2010 to January 2011, 
in accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives. Appendix I 
contains further details about our objectives, scope, and methodology. 

Background: 

The ability to find, organize, use, share, appropriately dispose of, 
and save records--the essence of records management--is vital for the 
effective functioning of the federal government. In the wake of the 
transition from paper-based to electronic processes, records are 
increasingly electronic, and the volumes of electronic records 
produced by federal agencies are vast and rapidly growing, providing 
challenges to NARA as the nation's record keeper and archivist. 

Furthermore, the Presidential Records Act gives the Archivist of the 
United States responsibility for the custody, control, and 
preservation of presidential records upon the conclusion of a 
President's term of office.[Footnote 3] The act states that the 
Archivist has an affirmative duty to make such records available to 
the public as rapidly and completely as possible consistent with the 
provisions of the act. 

In response to these widely recognized challenges, NARA began a 
research and development program to develop a modern archive for 
electronic records. The final operational ERA system is to consist of 
the following six key functions: 

* Ingest enables the transfer of electronic records from federal 
agencies. 

* Archival storage enables stored records to be managed in a way that 
guarantees their integrity and availability. 

* Records management supports scheduling,[Footnote 4] appraisal, 
[Footnote 5] description, and requests to transfer custody of all 
types of records, as well as ingesting and managing electronic 
records, including the capture of selected records data (such as 
origination date, format, and disposition). 

* Preservation enables secure and reliable storage of files in formats 
in which they were received, as well as creating backup copies for off-
site storage. 

* Local services and control regulates how the ERA components 
communicate with each other, manages internal security, and enables 
telecommunications and system network management. 

* Dissemination enables users to search descriptions and business data 
about all types of records and to search the content of electronic 
records and retrieve them. 

ERA Acquisition Strategy: 

In 2001, NARA began developing policies and plans to guide the overall 
acquisition of an electronic records system. Upon completion of the 
design phase, the agency awarded a cost-plus-award-fee[Footnote 6] 
contract to Lockheed Martin Corporation in September 2005, worth $317 
million, to develop the ERA system. 

The development contract is composed of six option periods[Footnote 
7]--the first option lasting 2 years and all subsequent options each 
lasting 1 year (to cover any uncompleted planned work and/or 
additional new work). The ERA contract is currently in the fifth 
option period. 

Within this contract structure, NARA is to deliver ERA system 
capabilities in five separate increments. Each period of performance 
includes specific capabilities associated with one or more increments 
to be delivered. Increments will overlap to allow the analysis and 
design activities for the next increment to begin while the testing of 
the final release of the current increment is under way. Figure 1 
illustrates the ERA program plan schedule prior to the recent change 
in program direction in July 2010 (as discussed later in this report). 

Figure 1: ERA System Acquisition Strategy, as of June 2010: 

[Refer to PDF for image: time line] 

Program start: 2002. 

September 2005: 
NARA selected Lockheed Martin Corporation to develop the ERA system. 
Option 1: 9/2005 through 11/2007; 
Option 2: 11/2007 through 4/2009; 
Option 3: 4/2009 through 4/2010; 
Option 4: 4/2010 through 10/2010; 
Option 5: 10/2010 through 9/2011; 
Option 6: 9/2011 through 10/2012. 

Increment 1: 9/2005 through 6/2008; 
Increment 2: 11/2007 through 12/2008; 
Increment 3: 4/2009 through 10/2010; 
Increment 4: 4/2010 through 4/2011; 
Increment 5: 4/2011 through 10/2012. 

June 2008: 
Initial Operating Capability: First use of the ERA system. 

December 2008: 
EOP Initial Operating Capability: First use of the ERA EOP system. 

September 2012: 
Full Operating Capability: Full use of the ERA system. 

Program retires: 2020. 

Source: GAO analysis of NARA data. 

[End of figure] 

Table 1 summarizes the planned system capabilities to be delivered by 
increment. 

Table 1: Summary of Planned ERA System Capabilities by Increment: 

Increment 1: 
Description: Deployed in two releases: Release 1 established the ERA 
base system--the hardware, software, and communications needed to 
deploy the system. Release 2 enabled functional archives with the 
ability to preserve electronic data in their original format, enable 
disposition agreements and scheduling, and receive unclassified and 
sensitive data from four federal agencies; according to NARA 
officials, this increment was certified as complete in June 2008. 
However, additional enhancements were made to Increment 1, Release 2, 
and were completed in March 2010. 

Increment 2: 
Description: Includes the Executive Office of the President (EOP) 
system, which was designed to handle records from the Executive Office 
of the President. This increment was to include the content searching 
and management for special access requests. The EOP system was 
certified for initial operating capability in December 2008. However, 
NARA did not finish ingesting the presidential records it received 
until September 2009, 9 months after initial operating capability. 

Increment 3: 
Description: Expected to include the following: 
* Storage and access capabilities for electronic records of the 
Congress and Supreme Court. NARA deployed the first release of 
Increment 3--the congressional component--in January 2010; 
* Upgrades to the ERA base system to, among other things, search, 
view, and print records; 
* Public access to provide the public with tools needed to search and 
access electronic records; 
* Planning for preservation to include development of a preservation 
framework prototype. The prototype is to include the capability to 
plan, execute, and monitor preservation activities. 

Increment 4: 
Description: Planned to build upon the base architecture delivered as 
part of Increment 3; NARA plans to insert newly available technology, 
particularly for preservation capabilities. NARA began 
work on this increment in 2010 and plans to complete it in fiscal year 
2011. We have previously reported that NARA has not fully defined the 
functionality to be included in Increment 4. 

Increment 5: 
Description: Expected to expand on system capabilities implemented in 
the prior increments. Our prior work has found that NARA has not fully 
defined the functionality for this increment. 

Source: GAO analysis of NARA data. 

[End of table] 

Since awarding the contract, NARA has made several modifications to 
the program schedule including, among other things, extending the 
first two option periods by 2 months and 7 months, respectively. NARA 
also reduced the period of performance for option period four by 6 
months. Additionally, NARA stated that Increment 3 was completed in 
October 2010[Footnote 8] and that it expects to complete Increment 4 
in early 2011, both of which are later than the milestones established 
in program planning documents. Table 2 shows a comparison of the 
original and revised ERA schedules. 

Table 2: Delays in Key ERA Program Milestones, as of June 2010: 

Milestone: Increment 1 (ERA Base Instance); 
Baseline schedule (September 2005 contract award): September 2007; 
Current schedule: June 2008; 
Status of milestone: Completed; 
Change: 9-month delay. 

Milestone: Increment 2 (ERA EOP System); 
Baseline schedule (September 2005 contract award): September 2008; 
Current schedule: December 2008; 
Status of milestone: Completed; 
Change: 3-month delay. 

Milestone: Increment 3; 
Baseline schedule (September 2005 contract award): September 2009; 
Current schedule: October 2010; 
Status of milestone: Completed; 
Change: 13-month delay. 

Milestone: Increment 4; 
Baseline schedule (September 2005 contract award): September 2010; 
Current schedule: March 2011; 
Status of milestone: In process; 
Change: 6-month delay. 

Milestone: Increment 5; 
Baseline schedule (September 2005 contract award): September 2011; 
Current schedule: September 2012; 
Status of milestone: Not yet begun; 
Change: 12-month delay. 

Source: GAO analysis of NARA documents. 

[End of table] 

Prior GAO Reviews Have Identified Cost and Schedule Issues in ERA's 
Progress: 

Since 2002, we have reported and testified on the technical and 
programmatic challenges that NARA has experienced in acquiring the ERA 
system, as well as on additional key risks facing the program. 
[Footnote 9] Our most recent report,[Footnote 10] in June 2010, 
reported that the estimated cost for ERA through March 2012 increased 
to more than $567 million. For example, NARA reportedly spent about 
$80 million on the base increment, compared with its planned cost of 
about $60 million. According to agency and contractor officials, 
factors contributing to the increase include unanticipated complexity 
of the system being developed. In order to enhance NARA's ability to 
complete the ERA development within reasonable funding and time 
constraints, we recommended that the agency ensure adequate executive-
level oversight by maintaining documentation of investment review 
results, including changes to the program's cost and schedule baseline 
and any other corrective actions taken as a result of changes in ERA 
cost, schedule, and performance. 

We further reported that, although NARA initially planned for the 
system to be capable of ingesting federal and presidential records in 
September 2007, the two system increments to support those records did 
not achieve initial operating capability until June 2008 and December 
2008, respectively. In addition, a number of functions originally 
planned for the base increment were deferred to later increments, 
including the ability to delete records and to ingest redacted 
records. More notably, we reported that NARA had not detailed what 
system capabilities would be delivered in the final two increments; it 
also had not effectively defined or managed ERA's requirements to 
ensure that the functionality delivered satisfies the objectives of 
the system. Although NARA established an initial set of high-level 
requirements, it lacked firm plans to implement about 43 percent of 
them. As a result, we recommended that NARA ensure that ERA's 
requirements are being managed using a disciplined process. 

OMB Directed Recent Changes to the ERA Program: 

As a result of our most recent report,[Footnote 11] OMB is working 
with NARA to remedy the problems we highlighted related to the cost, 
schedule, and performance of the ERA system. Specifically, in July 
2010, OMB directed NARA to halt all development activities by the end 
of fiscal year 2011 and develop an action plan to address our finding 
on the lack of defined system functionality for the final two 
increments of the ERA program and the need for improved strategic 
planning. 

In response, NARA has work under way to revise its program 
implementation plans and enter the operations and maintenance phase 
beginning in fiscal year 2012. For development work to be accomplished 
prior to this date, NARA is to prioritize existing requirements and 
develop realistic cost and schedule estimates to determine what can be 
accomplished by the deadline. In addition, NARA plans to 
prioritize remaining outstanding requirements (that are to be 
accomplished under the ERA contract); identify other requirements not 
yet met by the system; and determine ERA operations and maintenance 
requirements. 

Despite changes in program direction, the Archivist noted that the 
essential goals of ERA would remain unchanged. He stated that, 
beginning in fiscal year 2012, ERA would fully support the transfer of 
electronic records to an archival repository, as well as access to and 
preservation of electronic archival records. To do this, the Archivist 
stated that the agency would work on those elements determined to be 
the highest priorities in fiscal year 2011. According to NARA, this 
may lead to a second phase of the ERA development in the future. 

EVM Provides Insight on Program Cost and Schedule: 

Given the size and significance of the government's investment in IT, 
it is important that projects be managed effectively to ensure that 
public resources are wisely invested. Effectively managing projects 
entails, among other things, pulling together essential cost, 
schedule, and technical information in a meaningful, coherent fashion 
so that managers have an accurate view of the program's development 
status. Without meaningful and coherent cost and schedule information, 
program managers can have a distorted view of a program's status and 
risks. To address this issue, in the 1960s, the Department of Defense 
developed the EVM technique, which goes beyond simply comparing 
budgeted costs with actual costs. This technique measures the value of 
work accomplished in a given period and compares it with the planned 
value of work scheduled for that period and with the actual cost of 
work accomplished. 

Differences in these values are measured in both cost and schedule 
variances. Cost variances compare the value of the completed work 
(i.e., the earned value) with the actual cost of the work performed. 
For example, if a contractor completed $5 million worth of work, and 
the work actually cost $6.7 million, there would be a negative $1.7 
million cost variance. Schedule variances are also measured in 
dollars, but they compare the earned value of the completed work with 
the value of the work that was expected to be completed. For example, 
if a contractor completed $5 million worth of work at the end of the 
month, but was budgeted to complete $10 million worth of work, there 
would be a negative $5 million schedule variance. Positive variances 
indicate that activities are costing less or are completed ahead of 
schedule. Negative variances indicate activities are costing more or 
are falling behind schedule. These cost and schedule variances can 
then be used in estimating the cost and time needed to complete the 
program. 
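
To illustrate these calculations, the brief Python sketch below 
computes the cost and schedule variances using the hypothetical dollar 
figures from the examples above. It is only an illustration of the 
arithmetic described in this report and does not represent NARA or 
contractor tools; the variable names and figures are assumptions for 
the example. 

def cost_variance(earned_value, actual_cost):
    # Cost variance: value of completed work minus its actual cost.
    return earned_value - actual_cost

def schedule_variance(earned_value, planned_value):
    # Schedule variance (in dollars): value of completed work minus the
    # value of work scheduled to be completed in the period.
    return earned_value - planned_value

# Hypothetical figures from the examples above, in millions of dollars.
earned_value = 5.0    # $5 million worth of work completed
actual_cost = 6.7     # the completed work actually cost $6.7 million
planned_value = 10.0  # $10 million worth of work was scheduled

print(f"Cost variance: ${cost_variance(earned_value, actual_cost):.1f} million")
print(f"Schedule variance: ${schedule_variance(earned_value, planned_value):.1f} million")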

Without knowing the planned cost of completed work and work in 
progress (i.e., the earned value), it is difficult to determine a 
program's true status. Earned value allows for this key information, 
which provides an objective view of program status and is necessary 
for understanding the health of a program. As a result, EVM can alert 
program managers to potential problems sooner than using expenditures 
alone, thereby reducing the chance and magnitude of cost overruns and 
schedule slippages. Moreover, EVM directly supports the 
institutionalization of key processes for acquiring and developing 
systems and the ability to effectively manage investments--areas that 
are often found to be inadequate on the basis of our assessments of 
major IT investments. 

In 2005, OMB began requiring agencies, such as NARA, to fully 
implement EVM on major IT investments.[Footnote 12] Specifically, this 
guidance directs agencies to (1) develop comprehensive policies to 
ensure that their major IT investments are using EVM to plan and 
manage development; (2) include a provision and clause in major 
acquisition contracts or agency in-house project charters directing 
the use of an EVM system that is compliant with the American National 
Standards Institute (ANSI) standard;[Footnote 13] (3) provide 
documentation demonstrating that the contractor's or agency's in-house 
EVM system complies with the national standard; (4) conduct periodic 
surveillance reviews; and (5) conduct integrated baseline reviews 
[Footnote 14] on individual programs to finalize their cost, schedule, 
and performance goals. 

Building on OMB's requirements, in March 2009, we issued a guide on 
best practices for estimating and managing program costs.[Footnote 15] 
This guide highlights the policies and practices adopted by leading 
organizations to implement an effective EVM program. Specifically, in 
the guide, we identify 11 key practices that are implemented on 
acquisition programs of leading organizations. These practices include 
the need for organizational policies that establish clear criteria for 
which programs are required to use EVM, specify compliance with the 
ANSI standard, require a standard product-oriented structure for 
defining work products, require integrated baseline reviews, provide 
for specialized training, establish criteria and conditions for 
rebaselining programs, and require an ongoing surveillance function. 
In addition, we identify key practices that individual programs can 
use to ensure that they establish a sound EVM system, that the earned 
value data are reliable, and that the data are used to support 
decision making. 

NARA's Chief Information Officer Is Responsible for EVM Implementation: 

In October 2002, NARA established the ERA Program Management Office, 
which has primary responsibility for managing the ERA acquisition. The 
ERA program falls within the oversight of the NARA IT Executive 
Committee and the Chief Information Officer (CIO). Specifically, the 
executive committee is composed of senior NARA decision makers who 
manage NARA's IT capital planning and investment control process and 
the NARA IT investment portfolio, which includes the ERA investment. 
The NARA CIO oversees management of the ERA program and is responsible 
for EVM implementation across the agency's IT acquisitions. 

To support project managers in the execution of EVM, among other 
things, the CIO created the Capital Planning and Administration 
Branch to establish policy and guidance, analyze monthly project 
status reports, identify earned value trends, provide corrective 
action recommendations, and disseminate project information as 
appropriate. Furthermore, the ERA Program Director, who reports to the 
CIO, is responsible for the operational scope of work, performance, 
budget, and schedule of the program. Additionally, the NARA senior 
staff, which includes the Archivist and the Deputy Archivist, provide 
oversight and risk management as required. Figure 2 illustrates the 
organizational structure for the ERA program. 

Figure 2: Simplified ERA Program Organizational Structure: 

[Refer to PDF for image: Organizational Structure] 

Top level: 
Archivist of the United States; 
* Deputy Archivist of the United States; 
* Chief of Staff. 

Second level, reporting to Archivist of the United States: 
* Office of Records Services; 
* Office of Regional Records Services; 
* Office of Presidential Libraries; 
* Office of the Federal Register; 
* Office of Administration; 
* Office of Information Services. 

Third level, reporting to Office of Information Services: 
* Electronic Records Archives Program Office; 
- Assistant Program Director; 
- Contracting Officer. 

Fourth level, reporting to Electronic Records Archives Program Office: 
* Project Manager; 
* Customer Support and Logistics Division; 
* Program Support Division[A]; 
* Systems Engineering Division. 

Source: GAO analysis of NARA data. 

[A] EVM activities occur within the Program Support Division. 

[End of figure] 

NARA Has Yet to Fully Establish Most EVM Practices to Manage the ERA 
Acquisition: 

NARA has, to varying degrees, established certain best practices 
needed to manage the ERA acquisition through EVM. Our work on best 
practices in EVM identified 11 key practices that are implemented on 
acquisition programs of leading organizations. These practices can be 
organized into three management areas: establishing a comprehensive 
EVM system, ensuring reliable earned value data, and using those data 
to make decisions. The ERA program fully met 2 of the 11 key practices 
for implementing EVM, partially met 7 practices, and did not meet 2 
others. These weaknesses exist in part because NARA lacks a 
comprehensive EVM policy, as well as training and specialized 
resources. NARA also frequently replans the ERA program. Without 
effectively implementing EVM, NARA has not been positioned to identify 
potential cost and schedule problems early and thus has not been able to 
take timely actions to correct problems and avoid program schedule 
delays and cost increases. Table 3 lists the 11 key EVM practices by 
management area and summarizes the status of NARA's implementation of 
each practice. 

Table 3: Assessment of EVM Best Practices for NARA's ERA Program: 

Program management area: Establish a comprehensive EVM system; 
EVM practice: Define the scope of effort using a work breakdown 
structure; 
GAO assessment: The agency addressed some, but not all, aspects of 
this EVM practice. 

Program management area: Establish a comprehensive EVM system; 
EVM practice: Identify who in the organization will perform the work; 
GAO assessment: The agency addressed all aspects of this EVM practice. 

Program management area: Establish a comprehensive EVM system; 
EVM practice: Schedule the work; 
GAO assessment: The agency addressed some, but not all, aspects of 
this EVM practice. 

Program management area: Establish a comprehensive EVM system; 
EVM practice: Estimate the labor and material required to perform the 
work and authorize the budgets, including management reserve; 
GAO assessment: The agency addressed some, but not all, aspects of 
this EVM practice. 

Program management area: Establish a comprehensive EVM system; 
EVM practice: Determine objective measure of earned value; 
GAO assessment: The agency addressed some, but not all, aspects of 
this EVM practice. 

Program management area: Establish a comprehensive EVM system; 
EVM practice: Develop the performance measurement baseline; 
GAO assessment: The agency addressed some, but not all, aspects of 
this EVM practice. 

Program management area: Ensure that the data resulting from the EVM 
system are reliable; 
EVM practice: Execute the work plan, and record all costs; 
GAO assessment: The agency addressed all aspects of this EVM practice. 

Program management area: Ensure that the data resulting from the EVM 
system are reliable; 
EVM practice: Analyze EVM performance data, and record variances from 
the performance measurement baseline plan; 
GAO assessment: The agency addressed some, but not all, aspects of 
this EVM practice. 

Program management area: Ensure that the data resulting from the EVM 
system are reliable; 
EVM practice: Forecast estimates at completion; 
GAO assessment: The agency did not address any aspects of this EVM 
practice. 

Program management area: Ensure that the program management team is 
using earned value data for decision-making purposes; 
EVM practice: Take management action to mitigate risks; 
GAO assessment: The agency addressed some, but not all, aspects of 
this EVM practice. 

Program management area: Ensure that the program management team is 
using earned value data for decision-making purposes; 
EVM practice: Update the performance measurement baseline as changes 
occur; 
GAO assessment: The agency did not address any aspects of this EVM 
practice. 

Sources: GAO analysis of NARA and contractor data. 

[End of table] 

NARA Did Not Fully Establish a Comprehensive EVM System: 

The ERA program did not fully establish a comprehensive EVM system. Of 
the six key practices in this management area, the program fully 
implemented one, and partially met five. Specifically, the agency's 
program organization charts and contract work breakdown structure 
fully identified the personnel responsible for performing the defined 
work. However, critical weaknesses remain in the following other key 
practices: 

* Define the scope of effort using a work breakdown structure. The ERA 
program maintains a work breakdown structure that is consistent with 
work planned in the project schedule; however, this structure neither 
reflects the entire scope of the program, nor is it defined in such a 
way as to provide a meaningful understanding of the products or 
deliverables being developed. Specifically, the work breakdown 
structure did not include work planned for Increment 4 and beyond. 
Furthermore, the structure was defined by program increment rather 
than by major program/system component (e.g., ERA base, EOP), and the 
work planned in these increments was not broken down in a standardized 
fashion, thus making it difficult to track common work elements across 
increments. Without a work breakdown structure that is comprehensive, 
product-oriented, and standardized, ERA cannot efficiently track and 
measure progress made on contractor deliverables. 

* Schedule the work. The ERA project schedule had activities that were 
adequately sequenced; however, it also had a number of weaknesses that 
undermined the quality of the established performance baseline. These 
weaknesses included an invalid critical path (the sequence of 
activities that, if delayed, impacts the planned completion date of 
the project); a failure to assign resources to all activities; and the 
excessive or unjustified use of constraints, which impairs the 
program's ability to forecast the impact of ongoing delays on future 
planned work activities. 

To the contractor's credit, it is aware of many of the deviations from 
scheduling best practices and has controls in place to monitor them. 
However, these weaknesses remain a concern because the schedule serves 
as the performance baseline against which earned value is measured, 
and any weaknesses impair the use of the schedule as a management tool. 

* Estimate the labor and material required and authorize the budgets. 
The establishment of a sound baseline plan, which would include 
estimating the labor and materials required to perform the work, was 
not thoroughly completed through an integrated baseline review. 
Although NARA performed integrated baseline reviews prior to 
exercising each option period, as well as after a major rebaseline, 
the most recent review, held in December 2009, showed that none of the 
corrective actions needed to mitigate program risks--including 
reducing a large amount of work not being measured objectively--had 
been taken. Without a fully completed integrated baseline review, NARA 
has not taken the proper steps to determine whether the baseline plan 
contains an acceptable level of risk and that significant risks have 
been mitigated. While the contractor has established management 
reserves to cover realized risks in the baseline plan and reports 
reserve levels to NARA on a monthly basis, the lack of a sufficient 
review makes it difficult to determine whether the amount of reserve 
set aside is justified. 

* Determine objective measure of earned value. Objective measures were 
not always used to determine the progress of planned work. For 
example, as of February 2010, approximately 17 percent of the 
program's baseline budget was classified as nonobjective (also called 
level-of-effort[Footnote 16]). Our research shows that, if more than 
15 percent of the baseline is measured using level-of-effort, then 
that amount should be scrutinized because it does not allow schedule 
performance to be measured. NARA identified the use of nonobjective 
metrics as a concern in its most recent integrated baseline review; 
however, it did not take action to address this concern. Until NARA 
ensures that metrics used to measure the progress made on planned work 
elements are appropriate, it cannot be assured that ERA's measurements 
of accomplishments are sufficiently credible. 

* Develop the performance measurement baseline. ERA's performance 
measurement baseline[Footnote 17] does not contain sufficient budget 
to cover all remaining work on the program since Increment 4 and 
beyond have not yet been fully defined, and the work deferred to later 
increments was not reflected in the existing earned value data or 
other baseline planning documents. As such, NARA does not have a 
stable baseline against which to measure performance and to support 
predictions of future performance through completion. 

Program Did Not Adequately Ensure That EVM Data Were Reliable: 

The ERA program did not adequately ensure that ERA's earned value data 
were reliable. Of the three key practices in this management area, the 
program fully implemented one, partially met one, and did not meet the 
remaining one. Specifically, the program has processes in place to 
identify and record cost and schedule variances and review earned 
value data using monthly contractor EVM performance reports. In 
addition, the ERA program office reviews contractor EVM data on a 
regular basis to track contractor performance, including incorporating 
EVM data into monthly program management reviews. However, the program 
has not adequately recorded variances from the performance baseline or 
been able to forecast estimates at completion using EVM: 

* Analyze EVM performance data and record variances. The contractor's 
monthly reports include justifications for cost and schedule 
variances; however, these justifications are not sufficiently detailed 
for NARA program management to fully understand the reasons for the 
variances and the contractor's plan for resolving them. In particular, 
the justifications of variances for the base system augmentation work, 
a major part of Increment 3, did not discuss the impact of the problem 
and comprehensive corrective actions to be taken. As a result, the 
program office cannot track and mitigate related risks. 

Furthermore, the monthly reports showed a number of anomalies 
that raise questions regarding the reliability of the earned value 
data. Examples are as follows: 

- Planned work was removed from the baseline without also removing its 
corresponding budget. This is an inappropriate EVM practice and 
results in the appearance of favorable cost and schedule performance 
trends. 

- Work was shown as fully completed in one month's report but, in 
subsequent reports, the same work was reported as less than 100 
percent complete. For example, Increment 3 development work was 
reported as 100 percent complete in July 2009, but 2 months later, in 
September 2009, it was reported as 10 percent complete. In another 
example, program support activities for Increment 3 were reported as 
100 percent complete in August 2009, but in the subsequent month as 49 
percent complete. 

- Dollars were reported as spent in a given month, but no work was 
reported as scheduled or completed. 

NARA program and contractor officials provided justifications for 
these anomalies, such as extension of the period of performance. 
However, these justifications were not always valid. In particular, 
program officials cited lagging invoices as a major contributor to 
these anomalies. As such, the reconciliation of estimated costs to 
actual costs was not reflected in the earned value reports until, in 
some cases, up to 15 months after the fact. Lagging invoices can 
create false positive or negative variances and, as such, the timely 
reconciliation of these costs is necessary for obtaining reliable 
data. Until NARA improves its ability to assess contractor data and 
resolve anomalies, it risks using inaccurate data to manage the 
program, potentially resulting in additional cost overruns, schedule 
delays, and performance shortfalls. 

* Forecast estimates at completion. The ERA program is unable to 
forecast costs at program completion based on the earned value data it 
receives because these data reflect contractor performance trends in 
one increment, not the full development program. 
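
To illustrate the kind of forecast that reliable earned value data 
would support, the short Python sketch below applies a common textbook 
formula for an independent estimate at completion, which assumes that 
remaining work will be performed at the same cost efficiency as past 
work. The formula and sample figures are illustrative assumptions 
only; they do not reproduce NARA's, the contractor's, or GAO's 
calculations. 

def estimate_at_completion(budget_at_completion, earned_value, actual_cost):
    # Cost performance index (CPI): value earned per dollar actually spent.
    cpi = earned_value / actual_cost
    # Common independent forecast: EAC = AC + (BAC - EV) / CPI, which
    # assumes remaining work proceeds at the same efficiency as past work.
    return actual_cost + (budget_at_completion - earned_value) / cpi

# Hypothetical figures, in millions of dollars.
bac = 100.0  # total budget at completion
ev = 60.0    # value of work completed to date
ac = 75.0    # actual cost of the completed work

print(f"Forecast estimate at completion: ${estimate_at_completion(bac, ev, ac):.0f} million")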

Program Management Team Did Not Effectively Use Earned Value Data to 
Make Decisions: 

The ERA program did not effectively use earned value data to inform 
programmatic decisions. Of the two key practices, the program 
partially met one and did not meet the other. Specifically, 
the program office included earned value performance trend data in 
monthly performance management review briefings. In addition, the cost 
and schedule drivers causing poor trends (as identified in the monthly 
contractor reports) were generally consistent with the risks and 
issues contained in the program risk registers. Nevertheless, critical 
weaknesses remain in this management area. Examples of those 
weaknesses are as follows: 

* Take management action to mitigate risks. NARA management did not 
take all necessary actions to mitigate risks. First, according to NARA 
officials, the CIO, Program Director, and contractor executives meet 
weekly and discuss cost and schedule issues when appropriate. 
However, NARA does not document the results of these briefings, and 
thus there is little evidence that this body has reviewed and approved 
cost and schedule issues. There is also little evidence that it 
identified corrective actions and tracked them to closure. Second, the 
briefings to senior executives are inconsistent. For example, in 
January 2010, the program team reported to the Program Director that 
unless Increment 3 work was replanned into Increment 4, it 
anticipated a cost overrun of $2.0 million. However, in other 
briefings to senior NARA management and OMB, it was reported that the 
cost performance remained steady. 

Moreover, while ERA earned value data trends are included in briefing 
materials provided to NARA senior executives, these cost and schedule 
performance trends are not discussed in these management meetings. 
Until NARA uses earned value data to make program decisions, it will 
be unable to effectively identify areas of concern and make 
recommendations to reverse negative trends. 

* Update the performance measurement baseline. NARA was unable to 
demonstrate that it documents changes made to the program's 
performance measurement baseline as they occur. While the program 
office maintains a log of contract modifications, the changes it 
specified could not be mapped back to the baseline. Specifically, the 
changes detailed in this log did not identify the specific elements of 
the work breakdown structure being impacted, which makes it nearly 
impossible to determine whether these changes had been properly 
incorporated into the baseline. Additionally, the performance 
measurement baseline is not appropriately updated when functionality 
is deferred. While program officials stated that they remove the 
corresponding budget when work is moved out of the baseline, there is 
little evidence supporting this. Moreover, changes are not made to the 
baseline in a timely manner. For example, the base system augmentation 
replan was identified in September 2009, but it was not finalized into 
the baseline until June 2010, about 9 months later. 

Weaknesses in EVM Implementation Are Due in Part to Key Factors at the 
Program and Agency Levels: 

The weaknesses we identified in the three management areas exist, in 
part, because of a number of key factors: 

NARA-wide EVM policy: As we have previously reported,[Footnote 18] a 
comprehensive EVM policy is an important aspect of instituting a sound 
EVM program. NARA's policy, established in 2005, outlines clear 
criteria for which IT programs are to use EVM. However, it does not 
require EVM training for senior executives with oversight 
responsibility, program managers, or relevant program staff 
responsible for contract management. The policy also does not require 
annual EVM system surveillance to ensure program compliance with the 
industry standard. The ERA program office provided documentation that 
a surveillance review was performed in April 2009; however, a number 
of outstanding corrective action items resulting from this review were 
not closed. Moreover, the program could not provide documentation to 
show that regular surveillance reviews were performed in past years. 
Without such policies, NARA is not positioned to ensure that ERA's 
program staff have the appropriate skills to validate and interpret 
EVM data, and that its executives fully understand the data they are 
given in order to ask the right questions and make informed decisions. 

Specialized program resources: The program office lacks the 
appropriate levels of skilled EVM personnel. In a past governmentwide 
review,[Footnote 19] we reported on successful EVM implementation on 
major IT projects at the Department of Homeland Security and the 
Federal Aviation Administration; these projects, all similar in size 
to ERA, had between four and eight EVM specialists on staff to carry 
out EVM activities. At this time, the ERA program has two 
resident specialists on staff to oversee and monitor contractor 
performance for all components of the program; however, their 
responsibilities also extend beyond EVM to other areas of program 
control. Given the extent of earned value data anomalies we found, and 
the frequency with which the performance baseline is replanned, it is 
essential that the program office have the appropriate level of 
personnel in place to perform EVM analysis and oversight activities. 
Without an appropriate level of staffing, the program office will 
likely continue to experience issues in obtaining reliable earned 
value data. 

Acquisition strategy approach: Our body of work[Footnote 20] has shown 
that frequent rebaselines on a systems acquisition program allow real 
performance to be hidden, leading to distorted EVM data reporting. The 
weaknesses associated with ERA's performance baseline are largely due 
to frequent rebaselining. Program and contractor officials attributed 
this to ERA's current acquisition strategy approach, which calls for 
NARA to renegotiate the contract (or replan the baseline) with every 
option period. As such, NARA is unable to produce a stable and 
comprehensive baseline that reflects all development work planned for 
the system. Instead, a new baseline is created for each option period--
so work that was not completed in one option period gets replanned or 
removed in the subsequent one, thus resetting all past contractor cost 
and schedule performance. 

We agree that the program's current implementation of the acquisition 
strategy is inherently incompatible with the use of EVM. Moreover, 
this environment favorably positions the contractor to receive a high 
award fee for each period of performance because the constant 
rebaselining makes it easier for the contractor to excel at achieving 
the objectives measured by the award fee evaluation process. In 
addition, it makes the program highly inefficient because it 
must focus significant effort on program replanning instead of on the 
ERA system development work. 

Until NARA changes its acquisition strategy and establishes a 
comprehensive baseline for the program, its EVM practices will 
continue to be hampered by weaknesses, and its ability to obtain the 
insight needed to effectively manage the contractor will be impeded. 

ERA's Earned Value Data Do Not Reflect True Program Status or the 
Magnitude of Future Cost and Schedule Increases: 

ERA's earned value performance trends do not accurately portray 
program status, and our analysis of historical program trends indicates 
that future cost and schedule increases will likely be significant. 
Due to the limited implementation of EVM practices and the presence of 
data anomalies (both previously discussed), ERA's earned value data 
reflect only a small portion of the work actually being performed. As 
such, we relied on other historical ERA program performance data to 
construct a projected range of costs at completion (see app. I for 
details). We previously reported, in June 2010, that NARA had completed 
about 60 percent of ERA's system requirements.[Footnote 21] If NARA 
pursues its original set of requirements, and the contractor maintains 
its current rate of productivity, it is unlikely that more than 65 
percent of them will be completed by the revised contract end date of 
September 2011. We further project that the total cost overrun 
incurred at contract end could be roughly between $285 million and 
$334 million. 

Plans for the completion of the remaining development work once the 
contract ends are being reevaluated by NARA at the direction of OMB 
(as previously discussed). According to the Archivist, the essential 
goals of ERA will remain unchanged and may lead to a second phase of 
the development in the future. If NARA were to complete the full ERA 
system as originally designed, we project the development phase to be 
complete by March 2017 with a total cost overrun between $195 million 
and $433 million. We further project that the total cost overrun 
incurred at the end of the program life cycle will likely be between 
$205 million and $405 million. Table 4 shows our cost and schedule 
estimates as compared with NARA's estimates for the program. 

Our projection assumes that past trends are indicative of future 
performance and does not take into account the degree of difficulty of 
the work being performed. This is critical because the work that 
remains includes system integration and testing activities, which our 
reviews of similar IT programs have shown to be complex and often the 
most challenging to complete. Furthermore, in making our projection of total 
life cycle cost, we applied the same estimated operations and 
maintenance cost used by NARA. We did not validate the credibility of 
the operations and maintenance cost estimate. Based on these 
assumptions, we believe our rough estimates are conservative and that 
the final costs at completion could be even higher. 

Table 4: NARA and GAO Estimates at Completion for the ERA Program: 

Development phase completion; 
NARA estimate, as of Jan. 2002: September 2011; 
Current NARA estimate, as of July 2010: September 2011; 
GAO estimate: March 2017; 
Net change from current NARA estimate to GAO estimate (percentage 
change): 67 months. 

Development phase estimate at completion; 
NARA estimate, as of Jan. 2002: $317 million; 
Current NARA estimate, as of July 2010: $567 million; 
GAO estimate: $762 million to $1 billion; 
Net change from current NARA estimate to GAO estimate (percentage 
change): $195 to $433 million; (34 to 76%). 

Life cycle cost estimate; 
NARA estimate, as of Jan. 2002: $745 million; 
Current NARA estimate, as of July 2010: $995 million; 
GAO estimate: $1.2 to $1.4 billion; 
Net change from current NARA estimate to GAO estimate (percentage 
change): $205 to $405 million; (21 to 41%). 

Sources: GAO analysis of ERA program and contractor data. 

[End of table] 

In contrast, contractor-provided data from January 2009 to June 2010 
show that the contractor has exceeded its cost target by $1.6 million 
and has not completed about $2 million worth of planned work. The 
contractor reported that the negative cost and schedule variances are 
largely due to unanticipated development work required to integrate 
specific commercial-off-the-shelf products into the base system and 
unplanned software code growth in key areas, including ingest 
orchestration and archive search capability. Based on current 
performance trends, the contractor estimates it will incur a $2.7 
million overrun at the end of Increment 3. 

The earned value data reported in ERA's contractor reports are of 
limited use to the agency in monitoring ERA's performance and making 
decisions since they do not provide an accurate depiction of program 
status. Without data that can provide such insight, NARA will remain 
unprepared to effectively oversee contractor performance and make 
realistic projections of cost and schedule for the program. 

Conclusions: 

Overall, NARA has fallen short in its implementation of EVM to oversee 
and manage the ERA system acquisition. Most of the earned value 
process controls needed for sound implementation have yet to be fully 
established. Specifically, 

* the baseline for measuring contractor performance lacks sufficient 
accuracy and completeness to provide a meaningful basis for 
understanding performance; 

* the performance data measured against a flawed baseline are not 
reliable and are further impaired by the extent of anomalies found in 
the contractor performance reports; taken together, this hampers 
NARA's ability to produce reliable estimates of cost at completion; 
and: 

* the ability to take timely action to correct unfavorable results and 
trends is constrained. Moreover, because senior executives do not 
discuss and use earned value trends to oversee this investment, the 
production of reliable EVM performance reports will continue to be a 
low priority to the program office and ultimately the contractor. 

Many of the weaknesses found can be traced back to NARA's inadequate 
agency-level EVM policies, training, and specialized resources, as 
well as to its acquisition strategy for the ERA program. Until NARA 
addresses these underlying issues, it is not positioned to optimize 
EVM as a management tool on this program. 

In addition, the program's historical cost and schedule performance 
suggest that the ERA system, at full operational capability, will 
likely be deployed at least 67 months behind schedule (in March 2017) 
and that the total life cycle cost for the program could be at least 
$1.2 billion (a 21 percent increase). 

Recent changes made to the ERA program, as directed by OMB, could 
offer a significant opportunity for the agency to move quickly and 
take aggressive corrective action on this acquisition. However, if 
NARA does not strike a proper balance between the final revised 
program plan and its institutional capacity to execute that plan, the 
risk that the delivered system functionality will not satisfy mission 
objectives will persist, and the cost overruns we project for this 
program will likely be realized. 

Recommendations for Executive Action: 

To improve NARA's ability to effectively implement EVM on its ERA 
system acquisition program, we recommend that the Archivist of the 
United States direct the NARA CIO to take the following five actions 
while the current system development contract is active: 

* Direct the ERA program to establish a comprehensive baseline 
(through an integrated master schedule) for all remaining work on 
contract. 

* Ensure that the ERA program obtains reliable EVM performance 
reports, taking into consideration the data anomalies and weaknesses 
identified in this report. 

* Engage senior NARA and contractor leadership/oversight officials to 
direct attention to reversing current negative performance trends, as 
shown in the earned value data, and take action to mitigate the 
potential cost and schedule overruns. 

* Include, as part of its acquisition policy governing EVM, 
requirements for (1) EVM training for senior executives and program 
staff responsible for ERA investment oversight and (2) ongoing 
surveillance of the ERA program's EVM system to ensure its compliance 
with industry standards. 

* Ensure that the ERA program has the appropriate level of specialized 
staff in place to perform EVM analysis and oversight activities. 

Taking into consideration the new ERA program direction, we further 
recommend that the Archivist of the United States direct the CIO to 
take the following three actions: 

* Using a gap analysis of the work completed through fiscal year 2011 
against the original set of ERA requirements, determine and clearly 
define the remaining work that will be pursued in the future ERA 
system development phase (Phase 2). 

* Direct the ERA program to develop new cost and schedule estimates 
for a comprehensive Phase 2 baseline, as well as for the total program 
life cycle. In combination with the above action, this should provide 
the program with enough information to disclose to the Congress the 
exact work that will be accomplished and the cost of that work. 

* Upon completion of the above action, direct the ERA program to 
implement the EVM practices that address the detailed weaknesses that 
we identified in this report, taking into consideration the criteria 
used, including: 

- establishing a comprehensive Phase 2 baseline (through an integrated 
master schedule) that has been validated through an integrated 
baseline review and limits the use of nonobjective metrics; 

- ensuring that reliable reports of EVM performance are being 
produced, including records of work completed, forecasts of estimates 
at completion, and explanations/corrective actions for variances and 
data anomalies; and: 

- engaging senior NARA leadership/oversight officials to ensure that 
earned value data are being used for decision-making purposes, 
including holding and documenting executive meetings to ensure that 
cost and schedule risks/issues have been tracked to closure, negative 
performance trends are mitigated, and major updates made to the 
baseline have been validated through an integrated baseline review. 

Agency Comments and Our Evaluation: 

In written comments on a draft of this report, which are reprinted in 
appendix II, the Archivist of the United States generally concurred 
with our recommendations and stated that NARA plans to address most of 
them in a near-term action plan. He further stated that NARA would be 
unable to address the final three recommendations in this plan since 
those were specific to a future ERA development effort. In addition, 
the Archivist shared two perspectives regarding our methodology used 
to project ERA program costs. 

First, NARA stated that it believes the true cost of ERA's system 
development to be only $282 million, rather than our reported cost of 
$567 million, because NARA looks at total costs as two distinct parts: 
developmental costs versus nondevelopmental costs. Specifically, NARA 
considers costs such as project management, research and development, 
concept exploration and planning activities, and operations of the 
system to be nondevelopmental and thus excludes them from its 
projections. We disagree that this reflects the true cost of 
developing the system. True system development cost should include the 
costs for all program activities performed in the development phase of 
an acquisition's life cycle, including project management, research 
and development, and concept exploration and planning 
activities.[Footnote 22] The projections we have made in the report 
reflect this. 

Second, NARA stated that our cost projections' assumption that past 
trends are indicative of future performance does not hold true because 
of its cost category distinction (developmental versus 
nondevelopmental) and the impact of OMB's July 2010 memo, which 
redirected the scope of the entire program and directs that current 
development work end in September 2011. NARA further stated that, as a 
result, the agency cannot know now when new development efforts may 
start, or the scope or cost of such development. As discussed above, 
NARA's cost distinction does not provide for a comprehensive 
estimation of system development costs; therefore, we believe our cost 
projections are sound. We agree with NARA concerning the impact of the 
change in program direction and believe the appropriate caveats 
pertaining to ERA's future were placed on our cost projections in the 
report. Specifically, our report states that the plans for the 
completion of the remaining 35 percent of development work are being 
reevaluated and that our projections were based on the completion of 
the full ERA system as originally intended. 

We are sending copies of this report to the appropriate congressional 
committees, the Archivist of the United States, and other interested 
parties. The report also is available at no charge on the GAO Web site 
at [hyperlink, http://www.gao.gov]. 

If you or your staff members have questions on matters discussed in 
this report, please contact David Powner at (202) 512-9286 or 
pownerd@gao.gov. Contact points for our Offices of Congressional 
Relations and Public Affairs may be found on the last page of this 
report. GAO staff who made major contributions to this report are 
listed in appendix III. 

Signed by: 

David A. Powner: 
Director, Information Technology Management Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to (1) assess whether the National Archives and 
Records Administration (NARA) is adequately using earned value 
management (EVM) techniques to manage the Electronic Records Archives 
(ERA) acquisition and (2) evaluate the earned value data to determine 
ERA's cost and schedule performance. 

To accomplish our first objective, we analyzed program documentation, 
including project work breakdown structures, project schedules, 
integrated baseline review briefings, risk registers, contractor 
performance reports, and monthly program management review briefings 
for the ERA program. Specifically, we compared program documentation 
with EVM and scheduling best practices as identified in GAO's cost 
guide.[Footnote 23] We characterized the extent to which the program 
met each of the 11 practices as either fully implemented (all 
sub-elements of the practice were met), partially implemented (some 
but not all sub-elements were met), or not implemented (none of the 
sub-elements were met). To have fully implemented a key practice, the 
program must have implemented all sub-elements of the practice. We 
also interviewed program and contractor officials (and observed 
program status review meetings) to obtain clarification on how EVM 
practices are implemented and how the data are used for decision-
making purposes. 

To accomplish our second objective, we analyzed earned value data 
contained in contractor EVM performance reports and program budget 
reports sent to the Office of Management and Budget (OMB), as well as 
past GAO work on ERA costs and system requirements.[Footnote 24] To 
perform this analysis, we compared the cost of work completed with the 
budgeted cost of scheduled work in the contractor performance reports 
over an 18-month period to show trends in cost and schedule 
performance. 
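A minimal sketch of this month-over-month comparison follows; the 
reporting dates and monthly figures are hypothetical assumptions, not 
the actual ERA performance report data. 

# Sketch of trending cost and schedule variances from periodic
# contractor performance reports; all values are hypothetical
# cumulative figures in millions of dollars.

monthly_reports = [
    # (report month, work scheduled, work performed, actual cost)
    ("2009-01", 10.0, 9.8, 9.9),
    ("2009-06", 22.0, 21.4, 21.9),
    ("2009-12", 36.0, 34.9, 35.8),
    ("2010-06", 50.0, 48.0, 49.6),
]

for month, bcws, bcwp, acwp in monthly_reports:
    schedule_variance = bcwp - bcws   # negative: planned work not completed
    cost_variance = bcwp - acwp       # negative: cost target exceeded
    print(f"{month}: schedule variance {schedule_variance:+.1f}, "
          f"cost variance {cost_variance:+.1f}")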

We determined that the earned value cost data were not sufficiently 
reliable to estimate the likely costs at contract completion. As a 
result, we developed an alternative methodology by using other 
historical ERA performance data to make cost projections at contract 
completion, as well as to make further cost and schedule projections 
about the system development phase beyond the contractor's baseline 
plan.[Footnote 25] To do so, we used our past work to identify the 
percentage of ERA requirements completed through September 2010. Our 
alternative methodology was as follows (a simplified computational 
sketch appears after the description): 

* Completed requirements estimate: We divided the total number of 
completed requirements by the duration (in months) it took to complete 
them to calculate a productivity factor. We then multiplied this 
factor by the remaining duration of the contract to calculate our 
estimate of the percentage of requirements that will likely be 
completed at contract end. 

* Low end of contract completion cost estimate range: We divided the 
cost overrun incurred to complete the requirements described above by 
the duration (in months) it took to complete them to calculate a burn 
rate of overrun dollars per month. We then multiplied the burn rate by 
the remaining duration to determine an estimated total overrun beyond 
what had already been incurred. 

* High end of contract completion cost estimate range: We divided the 
current contract value by the total number of completed requirements 
to calculate an efficiency factor. We then multiplied this factor by 
our estimate of completed requirements at contract end (calculated as 
described in the first bullet) to determine our estimate. 

* Development phase schedule estimate: We used the productivity factor 
to estimate the duration to complete 100 percent of the requirements 
(i.e., the development phase). 

* Development phase cost estimate range: We applied the same general 
methodology as described above to determine both the low-end and 
high-end estimates. 

To generate our total life cycle cost estimates, we added the NARA- 
provided cost estimate for operations and maintenance to our estimated 
development phase costs. 
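The following sketch walks through the calculations described above. 
The input values (requirements completed, elapsed months, overrun to 
date, contract value, and the operations and maintenance estimate) are 
placeholders, and the way the computed increments are added to 
baseline figures reflects our interpretation of the steps rather than 
the actual worksheets or figures used. 

# Simplified sketch of the alternative estimating approach described
# above. All inputs are placeholder values for illustration only.

pct_requirements_completed = 65.0     # percent completed to date
months_elapsed = 60                   # months spent completing that work
overrun_to_date = 50.0                # overrun incurred so far ($M)
months_remaining = 12                 # months left on current contract
current_contract_value = 500.0        # current contract value ($M)
current_development_estimate = 570.0  # current development estimate ($M)
operations_and_maintenance = 430.0    # agency-provided O&M estimate ($M)

# Productivity factor: percent of requirements completed per month.
productivity = pct_requirements_completed / months_elapsed

# Completed requirements estimate: work done so far plus work
# completed at the historical pace over the remaining duration.
pct_complete_at_contract_end = (
    pct_requirements_completed + productivity * months_remaining
)

# Low end of the cost range: overrun "burn rate" applied to the
# remaining duration and added to the current estimate.
burn_rate = overrun_to_date / months_elapsed
low_end_cost = (
    current_development_estimate + burn_rate * months_remaining
)

# High end of the cost range: cost per percent of requirements
# completed, applied to the share of requirements expected to be done.
efficiency = current_contract_value / pct_requirements_completed
high_end_cost = efficiency * pct_complete_at_contract_end

# Development phase schedule: total months to complete 100 percent of
# the requirements at the historical productivity rate.
months_to_complete_development = 100.0 / productivity

# Total life cycle estimate: development plus operations and maintenance.
life_cycle_low = low_end_cost + operations_and_maintenance
life_cycle_high = high_end_cost + operations_and_maintenance

print(f"Percent complete at contract end: {pct_complete_at_contract_end:.0f}%")
print(f"Development cost range: ${low_end_cost:.0f}M to ${high_end_cost:.0f}M")
print(f"Months to complete development: {months_to_complete_development:.0f}")
print(f"Life cycle range: ${life_cycle_low:.0f}M to ${life_cycle_high:.0f}M")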

To assess the reliability of the budget cost data, we compared them 
with other available supporting documents (including financial reports 
to OMB); performed limited testing of the data to identify obvious 
problems with completeness or accuracy; and interviewed agency and 
contractor officials about the data. For the purposes of this report, 
we determined that the budget cost data were sufficiently reliable. We 
did not test the adequacy of the agency or contractor cost-accounting 
systems. Our evaluation of these cost data was based on what agency 
officials told us and the information they could provide. 

We conducted this performance audit from March 2010 to January 2011 at 
NARA offices in the Washington, D.C., metropolitan area. Our work was 
done in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Comments from the National Archives and Records 
Administration: 

National Archives And Records Administration: 
8601 Adelphi Road: 
College Park, MD 20740-6001: 
[hyperlink, http://www.archives.gov] 

November 22, 2010: 

Government Accountability Office: 
Director of Information Technology Management Issues: 
Mr. David A. Powner: 
441 G Street NW: 
Washington DC, 20548: 

Dear Dave: 

Thank you for the opportunity to comment on the draft report GAO-11-
86, Electronic Records Archive: National Archives Needs to Strengthen 
Its Capacity to Use Earned Value Techniques to Manage and Oversee 
Development. We appreciate the level of analysis provided and the 
attention of the audit team to our comments throughout the engagement. 

I would like to share two perspectives regarding the methodology that 
the audit team used to project program cost. Through various 
interactions with the audit team, we understand how you reached the 
conclusion that the estimate of the development costs of ERA through 
September 2011 is $567 million. We look at this total cost as two 
distinct parts: developmental costs and non-developmental costs. Non-
developmental costs such as project management, research and 
development, concept exploration and planning activities, and 
operations of the system, are not included in our projections. Thus we 
believe the true cost of ERA's system development for this same period 
is only $282 million. 

Second, the report notes that the projections assume that past trends 
are indicative of future performance. Normally this would hold true 
for the Lifecycle cost estimate except for two things: the 
developmental vs. non-developmental distinction discussed above, and 
the impact of the OMB "TechStat" memo dated July 2, 2010. This memo 
redirects the scope of the entire program and ends development at 
September 30, 2011. This does not spell the end of ERA. We expect that 
development will be continued at some point in the future based on an 
assessment of the impact of technological trends on records creation 
in the Government and NARA's need to address new record formats and to 
respond to new public access requirements. However, we cannot know now 
when new development efforts may start, nor the scope or cost of such 
development. 

The report also includes eight recommendations. We generally concur 
with all, but since the final three are all specific to a future ERA, 
we will be unable to address them in a near term action plan. If you 
have any questions regarding this memo or our action plan process, 
please contact Mary Drak, NARA's Audit Liaison at 301-837-1668 or via 
email at mary.drak@nara.gov. 

Signed by: 

David S. Ferriero: 
Archivist of the United States: 

[End of section] 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

David A. Powner, (202) 512-9286, or pownerd@gao.gov: 

Staff Acknowledgments: 

In addition to the individual named above, those making contributions 
to this report included Carol Cha, Assistant Director; Neil Doherty; 
Ronalynn Espedido; Jason Lee; Lee McCracken; Karen Richey; and Niti 
Tandon. 

[End of section] 

Footnotes: 

[1] OMB Memorandum, M-05-23 (Aug. 4, 2005). 

[2] As part of the administration's Accountable Government Initiative, 
OMB identified major IT investments across the government that are at 
risk of failure and require additional oversight. Program selection is 
based on the following risk factors: (1) significant cost or schedule 
variance from the current baseline, (2) performance targets or mission 
objectives have not been met, (3) frequent re-baselines, or (4) lack 
of essential executive sponsorship/leadership. 

[3] 44 U.S.C. § 2203(f)(1). 

[4] A record schedule is a document that describes agency records, 
establishes a period for their retention by the agency, and provides 
mandatory instructions for what to do with them when they are no 
longer needed for current government business. 

[5] Records appraisal is the process of determining the value and the 
final disposition of records, making them either temporary or 
permanent. 

[6] A cost-plus-award-fee contract is a cost reimbursement contract 
that provides for a fee consisting of a base amount fixed at inception 
of the contract plus an award amount that may be given based upon a 
judgmental evaluation by the government of contract performance. 

[7] Lockheed Martin's contract also includes a base period that 
reflects system design phase work, which was completed in August 2005. 

[8] We currently have work under way at NARA to verify the completion 
of Increment 3, among other things. 

[9] GAO, Information Management: Challenges in Managing and Preserving 
Electronic Records, [hyperlink, 
http://www.gao.gov/products/GAO-02-586] (Washington, D.C.: June 17, 
2002); Records Management: Planning for the Electronic Records 
Archives Has Improved, [hyperlink, 
http://www.gao.gov/products/GAO-04-927] (Washington, D.C.: Sept. 23, 
2004); Information Management: Acquisition of the Electronic Records 
Archives is Progressing, [hyperlink, 
http://www.gao.gov/products/GAO-05-802] (Washington, D.C.: July 15, 
2005); Electronic Records Archives: The National Archives and Records 
Administration's Fiscal Year 2006 Expenditure Plan, [hyperlink, 
http://www.gao.gov/products/GAO-06-906] (Washington, D.C.: Aug. 18, 
2006); Information Management: The National Archives and Records 
Administration's Fiscal Year 2007 Expenditure Plan, [hyperlink, 
http://www.gao.gov/products/GAO-07-987] (Washington, D.C.: July 27, 
2007); Information Management: Challenges in Implementing an 
Electronic Records Archive, [hyperlink, 
http://www.gao.gov/products/GAO-08-738T] (Washington, D.C.: May 14, 
2008); Electronic Records Archives: The National Archives and Records 
Administration's Fiscal Year 2009 Expenditure Plan, [hyperlink, 
http://www.gao.gov/products/GAO-09-733] (Washington, D.C.: July 24, 
2009); and National Archives: Progress and Risks in Implementing its 
Electronic Records Archive Initiative, [hyperlink, 
http://www.gao.gov/products/GAO-10-222T] (Washington, D.C.: Nov. 5, 
2009). 

[10] GAO, Electronic Records Archive: Status Update on the National 
Archives and Records Administration's Fiscal Year 2010 Expenditure 
Plan, [hyperlink, http://www.gao.gov/products/GAO-10-657] (Washington, 
D.C.: June 11, 2010). 

[11] [hyperlink, http://www.gao.gov/products/GAO-10-657]. 

[12] OMB Memorandum, M-05-23 (Aug. 4, 2005). 

[13] Recognizing the importance of ensuring quality earned value data, 
ANSI and the Electronic Industries Alliance (EIA) jointly established 
a national standard for EVM systems in May 1998 (ANSI/EIA-748-A-1998). 
This standard, commonly called the ANSI standard, is composed of 
guidelines to instruct programs on how to establish a sound EVM 
system. This document was updated in July 2007 and is referred to as 
ANSI/EIA-748-B. 

[14] An integrated baseline review is an evaluation of a program's 
baseline plan to determine whether all program requirements have been 
addressed, risks have been identified, mitigation plans are in place, 
and available and planned resources are sufficient to complete the 
work. 

[15] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[16] Level-of-effort is unmeasured effort of a general or supportive 
nature that does not produce definitive end products (e.g., program 
administration). 

[17] The performance measurement baseline represents the cumulative 
value of the planned work over time. It takes into account that 
program activities occur in a sequenced order, based on finite 
resources, with budgets representing those resources spread over time. 
Deviations from the baseline identify areas where management should 
focus attention. 

[18] GAO, Information Technology: Agencies Need to Improve the 
Implementation and Use of Earned Value Techniques to Help Manage Major 
System Acquisitions, [hyperlink, http://www.gao.gov/products/GAO-10-2] 
(Washington, D.C.: Oct. 8, 2009). 

[19] [hyperlink, http://www.gao.gov/products/GAO-10-2]. 

[20] GAO, Secure Border Initiative: DHS Needs to Strengthen Management 
and Oversight of its Prime Contractor, [hyperlink, 
http://www.gao.gov/products/GAO-11-6] (Washington, D.C.: Oct. 18, 
2010); Defense Acquisitions: Missile Defense Program Instability 
Affects Reliability of Earned Value Management Data, [hyperlink, 
http://www.gao.gov/products/GAO-10-676] (Washington, D.C.: July 14, 
2010); and National Airspace System: Better Cost Data Could Improve 
FAA's Management of the Standard Terminal Automation Replacement 
System, [hyperlink, http://www.gao.gov/products/GAO-03-343] 
(Washington, D.C.: Jan. 31, 2003). 

[21] [hyperlink, http://www.gao.gov/products/GAO-10-657]. 

[22] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[23] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[24] [hyperlink, http://www.gao.gov/products/GAO-09-733]. 

[25] At the direction of OMB, all remaining ERA system development 
work will be halted in September 2011, and the contract will end at 
that time. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: