This is the accessible text file for GAO report number GAO-14-219 entitled 'Recovery Act: Grant Implementation Experiences Offer Lessons for Accountability and Transparency' which was released on January 24, 2014. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Requesters: January 2014: Recovery Act: Grant Implementation Experiences Offer Lessons for Accountability and Transparency: GAO-14-219: GAO Highlights: Highlights of GAO-14-219, a report to congressional requesters. Why GAO Did This Study: In response to the recent serious recession, Congress enacted the Recovery Act to promote economic recovery, make investments, and minimize or avoid reductions in state and local government services. Approximately $219 billion was distributed as grants for use in states and localities, making grants a major component of the act. These grants covered a broad range of areas including education, transportation, energy, infrastructure, the environment, health care, and housing. GAO was asked to examine grant management lessons learned resulting from the Recovery Act. This report examines federal, state, and local experiences with implementing grants funded by the Recovery Act by identifying examples of good practices employed and challenges faced in meeting the act's accountability and transparency requirements. GAO reviewed relevant documents including OMB and Recovery Board guidance, relevant literature, and previous reports by GAO, federal inspectors general, and others. GAO also interviewed officials from OMB, the Recovery Board, four federal agencies, three state governments, and two local governments, among others. This report also draws on GAO's past bi-monthly reviews of selected states' and localities' use of Recovery funds. What GAO Found: Federal, state, and local officials responsible for implementing grants funded by the American Recovery and Reinvestment Act of 2009 (Recovery Act) as well as the external oversight community reported lessons learned regarding both useful practices and challenges to ensuring accountability. 
Faced with aggressive timelines for distributing billions of dollars, they adopted a number of practices to foster accountability, including (1) strong support by top leaders; (2) centrally-situated collaborative governance structures; (3) the use of networks and agreements to share information and work towards common goals; and (4) adjustments to, and innovations in, usual approaches to conducting oversight such as the increased use of up-front risk assessments, the gathering of “real time” information, earlier communication of audit findings, and the use of advanced data analytics. For example, in 2009, the Recovery Accountability and Transparency Board (Recovery Board) established the Recovery Operations Center, which used advanced data analysis techniques to identify potential fraud and errors before and after payments were made. The Recovery Act's emphasis on accountability also presented challenges for several states and federal agencies. These included limited resources for oversight at the state and local levels and the speed with which Recovery Act funds were distributed. One state addressed the challenge of limited resources by transferring funds from its central administration account to Recovery Act oversight. To facilitate the quick distribution of funds, maintenance-of-effort provisions concerning transportation projects (which prevented Recovery funds from being used for planned state projects) were rolled out before the Department of Transportation had time to issue sufficiently detailed definitions of what constituted “state funding.” To address this challenge, the department had to issue clarifying guidance to states seven times during the first year of the Recovery Act. Federal, state, and local officials also developed practices and encountered challenges related to the transparency of Recovery Act funds. One example of a good practice required by the Recovery Act was the creation of the Recovery.gov website. This site, as well as similar portals created by states and localities, demonstrated several leading practices for effective government websites. These included (1) establishing a clear purpose, (2) using social networking tools to garner interest, (3) tailoring the website to meet audience needs, and (4) obtaining stakeholder input during design. Efforts to increase transparency also led to challenges for several states and federal agencies. For example, some recipients lacked knowledge or expertise in using the data systems needed to report grant spending, while others faced challenges with reporting the same data to multiple systems. Early GAO reviews also found several problems with job reporting data, including discrepancies in how full-time equivalents were recorded and the capacity of recipients to meet reporting deadlines. The Office of Management and Budget (OMB) addressed these challenges by issuing additional guidance and providing technical support. Finally, agencies receiving Recovery Act funds were required to submit performance plans that identified measures on a program-by-program basis. The level of detail and the specificity of outcomes in these plans varied greatly for the agencies GAO examined, making it difficult to determine the extent to which some were making progress toward their goals and demonstrating results. What GAO Recommends: GAO is not making any recommendations in this report. View [hyperlink, http://www.gao.gov/products/GAO-14-219]. For more information, contact Stanley J.
Czerwinski at (202) 512-6806 or czerwinskis@gao.gov. [End of section] Contents: Letter: Background: Practices at Federal, State, and Local Levels Contributed to Improved Accountability of Recovery Act Grant Programs but Challenges Existed: The Recovery Act Resulted in Increased Transparency but Also Presented Challenges: Concluding Observations: Agency Comments: Appendix I: Objectives, Scope, and Methodology: Appendix II: GAO Contact and Staff Acknowledgments: Table: Table 1: Key Organizations and Their Primary Accountability and Oversight Responsibilities Regarding Implementation of the Recovery Act: Figures: Figure 1: Timeline of Selected Recovery Act Events: Figure 2: Overview of Recovery Act Spending by Program and Category, as of October 31, 2013: Figure 3: An Analyst Working in the Recovery Board's Recovery Operations Center and a Sample Output of One of ROC's Link Analysis Tools: Figure 4: Screenshot of Recovery.gov User Identification Feature: Figure 5: Screenshot of Recovery.gov's Recipient Projects Page for a Georgia ZIP Code, 30318: Figure 6: Screenshot of NYCStat Stimulus Tracker Mapping Feature: Abbreviations: ARRA: American Recovery and Reinvestment Act of 2009: API: Application Program Interface: BCN: ARRA Big City Network: CBO: Congressional Budget Office: DOE: Department of Energy: HUD: Department of Housing and Urban Development: DOT: Department of Transportation: FHWA: Federal Highway Administration: FTE: Full-time Equivalent: GPRAMA: GPRA Modernization Act of 2010: IG: Inspector General: MASSPIRG: Massachusetts Public Interest Research Group: NIAF: National Intergovernmental Audit Forum: NRT: National Review Team: OMB: Office of Management and Budget: Recovery Board: Recovery Accountability and Transparency Board: ROC: Recovery Operations Center: SFSF: State Fiscal Stabilization Fund: [End of section] United States Government Accountability Office: GAO: 441 G St. N.W. Washington, DC 20548: January 24, 2014: The Honorable Thomas R. Carper: Chairman: The Honorable Tom A. Coburn, M.D.: Ranking Member: Committee on Homeland Security and Governmental Affairs: United States Senate: The Honorable Claire McCaskill: Chairman: Subcommittee on Financial and Contracting Oversight: Committee on Homeland Security and Governmental Affairs: United States Senate: In response to the recent serious recession, Congress enacted the American Recovery and Reinvestment Act of 2009 (Recovery Act) to, among other purposes, promote economic recovery, make investments, and minimize and avoid reductions in state and local government services. [Footnote 1] A significant component of the Recovery Act was grants for use in states and localities. As of the end of October 2013, the Department of the Treasury had awarded approximately $219 billion of Recovery Act funds in the form of grants. These grants covered a broad range of areas including education, transportation, infrastructure, energy, the environment, health care, and housing. The importance of spending Recovery Act funds quickly was highlighted by the President's goal of spending 70 percent of the funds by September 30, 2010. You asked us to examine grant management lessons learned resulting from the Recovery Act and to provide examples of what worked well, as well as what challenges were experienced by federal, state, and local governments.
To better understand grant management lessons learned resulting from the Recovery Act, we focused on two key issues involving grant implementation--accountability[Footnote 2] and transparency[Footnote 3]--where Congress and the administration placed unprecedented emphasis when they crafted the Recovery Act. Specifically, this report identifies and provides examples of good practices employed and the challenges faced in meeting the Recovery Act's accountability and transparency requirements by select federal, state, and local agencies implementing grant programs funded by the Recovery Act. Additionally, in September 2013, we issued a related report examining federal efforts to increase the transparency of federal data and identifying lessons learned from operating existing data systems that could contribute to these efforts.[Footnote 4] To accomplish our objectives, we conducted a detailed literature review to identify relevant prior work by us and others regarding Recovery Act challenges and lessons learned.[Footnote 5] We then interviewed federal, state, and local officials involved in the implementation of the Recovery Act and obtained supporting documentation. The federal entities that we contacted with broad jurisdiction for the Recovery Act were the Recovery Implementation Office, the Office of Management and Budget (OMB), and the Recovery Accountability and Transparency Board (Recovery Board).[Footnote 6] We developed criteria to select a subset of four federal agencies, three state governments, and two local governments. We then contacted these entities in order to obtain a more in-depth understanding of their experiences with grant programs funded by the Recovery Act as well as to identify examples of challenges and good practices. Our selection criteria included factors such as the nature, type, and value of the grants handled by the organization; whether the grants involved were already well-established, greatly increased in size, or entirely new; and the extent to which the organizations were identified in our previous work or in the broader literature. The federal agencies we selected were the Departments of Education, Energy, Transportation, and Housing and Urban Development. We deemed Medicaid out of scope for the purposes of this review. Although it was the largest grant program funded by the Recovery Act, it is primarily an entitlement and is subject to specific rules that are not typical of program grants. The state and local governments we selected included California, Georgia, and Massachusetts, as well as New York, New York and Denver, Colorado. These three states are part of a core group of 16 states and the District of Columbia that we selected for our series of bimonthly reviews of how Recovery Act funds were being used by recipients. The Recovery Act required that we conduct these bimonthly reviews of how these funds were used by selected states and localities. This report also fulfills this requirement in that we examined the use of Recovery Act funds in the previously mentioned states and cities. To obtain a broader state perspective, we interviewed officials from the state Recovery Act coordinators' network, which included key state officials responsible for implementing the Recovery Act from several states. 
For additional context, we supplemented these interviews by meeting with officials from state and local advocacy organizations such as the National Association of State Auditors and Comptrollers; the National Association of State Budget Officers; and the National Association of Counties. We reviewed and synthesized the information provided by these officials, as well as previously issued work regarding challenges and good practices that relate to our two key themes, and we developed descriptive examples. We also reviewed and applied criteria established by HowTo.gov, a source of guidance and leading practices for government websites, to Recovery.gov and state and local Recovery websites. A full description of our objectives, scope, and methodology is provided in appendix I. We conducted this performance audit from December 2012 to January 2014 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: The stated purposes of the Recovery Act are to: * preserve and create jobs and promote economic recovery; * assist those most impacted by the recession; * provide investments needed to increase economic efficiency by spurring technological advances in science and health; * invest in transportation, environmental protection, and other infrastructure that will provide long-term economic benefits; and: * stabilize state and local government budgets, in order to minimize and avoid reductions in essential services and counterproductive state and local tax increases.[Footnote 7] While many Recovery Act projects focused on immediately jumpstarting the economy, some projects--such as those involving investments in technology, infrastructure, and the environment--are expected to contribute to economic growth for many years. The Recovery Act established the Recovery Accountability and Transparency Board (Recovery Board) to provide additional monitoring and oversight. The board was originally scheduled to terminate operations by September 30, 2013,[Footnote 8] but its mission has been extended until September 30, 2015, to provide oversight and monitoring of assistance provided in response to Hurricane Sandy, which hit the northeast in October 2012.[Footnote 9] The timeline in figure 1 displays selected events related to the Recovery Act and its requirements. Figure 1: Timeline of Selected Recovery Act Events: [Refer to PDF for image: timeline] February 2009: * Recovery Act is signed into law; * The Recovery Accountability and Transparency Board (Recovery Board) is established; * Recovery.gov website initially launches to provide projections for how, when, and where funds will be spent; * The Office of Management and Budget (OMB) issues the first in a series of implementation guidance for federal departments and agencies. March 2009: Recovery Act funds begin to be distributed to states and localities; * Federal agencies begin reporting on their use of Recovery Act funds. April 2009: * GAO reports on the first of its required bi-monthly reviews on the use of Recovery Act funds by selected states and localities. 
September 2009: * The President's Council of Economic Advisers issues first of required quarterly reports on the economic impact of the Recovery Act; * Recovery.gov is re-launched to provide additional information and new functionality. October 2009: * Recipients of Recovery Act funds begin reporting on funding used and jobs created. November 2009: * The Recovery Board launches the Recovery Operations Center; * Congressional Budget Office issues first of required estimates on the Recovery Act's economic impact; * GAO issues first of its required quarterly comments on recipients' reports of jobs created or retained. January 2013: * Disaster Relief Appropriations Act, 2013, extends the Recovery Board's mission beyond its initial expiration date of September 30, 2013, to track and monitor spending related to Hurricane Sandy relief efforts. October 2013: * As of October 31, 2013, $811.9 billion of total Recovery Act funds have been distributed (estimated 98 percent). Sources: GAO presentation of information from Congressional Budget Office, Office of Management and Budget, Recovery Board, and the White House. [End of figure] Recovery Act Funding: The Congressional Budget Office (CBO) initially estimated the cost of the Recovery Act to be approximately $787 billion; however, CBO's most recent estimate projects that the Recovery Act will cost approximately $830 billion over the 2009-2019 time period.[Footnote 10] As of October 31, 2013, the federal government provided a total of approximately $812 billion related to Recovery Act activities. This includes funding to 28 federal agencies that was distributed to states, localities, and other entities; individuals through a combination of tax benefits and cuts; entitlements; and loans, contracts, and grants. See figure 2 for an overview of Recovery Act spending by category and program. Although Medicaid was the single largest Recovery Act grant program, we did not include it in our review because it is primarily an entitlement program and subject to specific rules that are not typical of program grants.[Footnote 11] Accordingly, we included the Recovery Act funds directed to Medicaid in the entitlement category, rather than the grant category, in figure 2. Emphasizing the importance of spending Recovery Act funds quickly, the President established a goal that by September 30, 2010, 70 percent of Recovery Act funding should be spent (that is, both obligated and outlayed).[Footnote 12] Therefore, agencies had approximately 19 months to spend almost three-quarters of their Recovery funds. Figure 2: Overview of Recovery Act Spending by Program and Category, as of October 31, 2013: [Refer to PDF for image: 2 pie-charts; 1 vertical bar graph] All Recovery Act funding: Tax benefits: 36%; Entitlements: 32%; Contracts, grants and loans: 32%. Grants, contracts, and loans: Grants: 80%; Contracts: 15%; Loans: 5%.
Grant programs by category[A]: Program category: Education; Grant amount: $83.3 billion; Program category: Transportation; Grant amount: $47.2 billion; Program category: Energy and Environment; Grant amount: $22.1 billion; Program category: Infrastructure; Grant amount: $17.1 billion; Program category: Research and Development and Science; Grant amount: $15.7 billion; Program category: Housing; Grant amount: $11.5 billion; Program category: Family; Grant amount: $5.1 billion; Program category: Health; Grant amount: $5.6 billion; Program category: Public Safety; Grant amount: $5.7 billion; Program category: Job Training and Unemployment; Grant amount: $4.8 billion; Program category: Other; Grant amount: $0.8 billion. Sources: GAO analysis of Recovery.gov, U.S. Treasury Federal Agency Financial and Activity Reports, and Recovery Act recipient reporting data. [A] Does not include Medicaid, which is instead reflected in the entitlement category in the first pie chart above. [End of figure] Grant Programs Funded by the Recovery Act: Grants have played a key role in providing Recovery Act funds to recipients, with approximately $219 billion being awarded for use in states and localities through a wide variety of federal grant programs. With the intent of disbursing funds quickly to create and retain jobs and stabilize state and local budgets, a large majority of Recovery Act grant funding went to states and localities within 3 years of the law's enactment. Recipients reported receiving approximately 88 percent of their grant awards by the end of the second quarter of calendar year 2013. State and local spending was as follows: * Fiscal year 2009: spending totaled approximately $53 billion in actual outlays. * Fiscal year 2010: spending was at its highest level with approximately $112 billion in actual outlays. * Fiscal year 2011: spending decreased from its peak, with approximately $69 billion in actual outlays. The 28 federal agencies that received Recovery funds developed specific plans for spending the money.[Footnote 13] The agencies then awarded grants and contracts to state governments or, in some cases, directly to schools, hospitals, or other entities. OMB guidance directed these federal agencies to file weekly financial reports detailing how the money was being distributed. Recipients of the funds, in turn, were required by the Recovery Act to file quarterly reports on how they were spending the Recovery Act funds that they received. Recovery Act grants provided to states and localities covered a broad range of areas such as transportation, energy, and housing. Education programs were the largest recipients of Recovery Act grant awards. Of the education programs funded in the Recovery Act, the largest in terms of funding was the newly created State Fiscal Stabilization Fund (SFSF) program, which provided assistance to state governments to stabilize their budgets by minimizing budgetary cuts in education and other essential government services, such as public safety. The Recovery Act appropriated $53.6 billion for the SFSF program.[Footnote 14] As figure 2 (above) shows, grants represent over one-quarter of Recovery Act funding. Out of that category, funding received in the program areas of education, transportation, and energy and environment amounts to approximately $137 billion, or 70 percent, of Recovery Act grant spending to date.
Recovery Act Oversight and Accountability Responsibilities of Key Participants: The Recovery Act called for a large amount of federal funds to be spent (that is, obligated and outlayed) in a short period of time--approximately 19 months--by September 30, 2010. To assure the public that their tax dollars were being spent efficiently and effectively, the Recovery Act placed increased emphasis on accountability and transparency through enhanced reporting, auditing, and evaluation requirements for users of Recovery Act funds. The Recovery Act assigned some of these increased accountability and transparency responsibilities to existing organizations and entities as well as newly-created ones. See table 1 for details regarding the primary accountability and oversight responsibilities of key organizations involved in implementing the Recovery Act. Table 1: Key Organizations and Their Primary Accountability and Oversight Responsibilities Regarding Implementation of the Recovery Act: Federal government: Key organizations and entities: Congressional Budget Office; Primary accountability and oversight responsibilities: * Provide periodic estimates on the Recovery Act's effect on economic output and employment. (Recovery Act, div. A, § 1512(e), 123 Stat. at 288). Key organizations and entities: Council of Economic Advisers; Primary accountability and oversight responsibilities: * Prepare periodic reports on employment and economic impacts of Recovery Act spending. (Recovery Act, div. A, § 1513, 123 Stat. at 288). Key organizations and entities: Federal agencies; Primary accountability and oversight responsibilities: * Report how they are distributing the funds on a weekly basis. (Office of Management and Budget Guidance, OMB Memorandum M-09-10); * Use a separate accounting identifier for Recovery Act funded projects. (Office of Management and Budget Guidance, OMB Memorandum M-09-10); * Make publicly available recipient reports within 30 days of the end of each quarter. (Recovery Act, div. A, § 1512(d), 123 Stat. at 288). [Footnote 15] Key organizations and entities: Government Accountability Office; Primary accountability and oversight responsibilities: * Conduct bimonthly reviews and prepare reports of such reviews of selected states' and localities' use of Recovery Act funds. (Recovery Act, div. A, § 901(a), 123 Stat. at 191); * Comment on quarterly recipient reports on the number of jobs created or preserved. (Recovery Act, div. A, § 1512(e), 123 Stat. at 288); * Review areas such as trade adjustment assistance, new education incentive grants, and efforts to increase small business lending. (Recovery Act, div. B, § 1894, 123 Stat. at 423 and Recovery Act, div. A, § 14009, 123 Stat. at 285). Key organizations and entities: Inspectors general; Primary accountability and oversight responsibilities: * Audit the programs, grants, and projects funded under the Recovery Act within their particular agency or department, and collectively address concerns raised by the public. (Recovery Act, div. A, § 1514, 123 Stat. at 289); * Serve on the Recovery Board. (Recovery Act, div. A, § 1522(b), 123 Stat. at 290);[Footnote 16] * Conduct whistleblower reprisal investigations. (Recovery Act, div. A, § 1553, 123 Stat. at 297). Key organizations and entities: Office of Management and Budget (OMB); Primary accountability and oversight responsibilities: * Coordinate with federal agencies on recipient reporting guidance. (Recovery Act, div. A, § 1512(g), 123 Stat.
at 288); * Coordinate with the Council of Economic Advisers on their reports. (Recovery Act, div. A, § 1513(a), 123 Stat. at 288). Key organizations and entities: Recovery Accountability and Transparency Board (Recovery Board); Primary accountability and oversight responsibilities: * Review whether use of Recovery Act funds met applicable standards and specified purposes. (Recovery Act, div. A, § 1523(a)(1), 123 Stat. at 290); * Identify fraud, waste, and mismanagement related to the use of Recovery Act funds and refer such matters to the appropriate federal inspector general. (Recovery Act, div. A, § 1523(a)(2)(C), 123 Stat. at 290); * Review whether there are sufficient and qualified personnel overseeing Recovery Act funds. (Recovery Act, div. A, § 1523(a)(2)(D), 123 Stat. at 290); * Submit quarterly and annual reports to the President and Congress, as well as "flash reports" on potential problems that require immediate attention. (Recovery Act, div. A, § 1523(b), 123 Stat. at 291); * Make recommendations to federal agencies on measures to prevent fraud, waste, and mismanagement of Recovery Act funds. (Recovery Act, div. A, § 1523(c)(1), 123 Stat. at 291); * Establish and maintain the Recovery.gov website. (Recovery Act, div. A, § 1526(a), 123 Stat. at 293). Key organizations and entities: Recovery Implementation Office; Primary accountability and oversight responsibilities: * Support key administration officials and coordinate Recovery Act efforts at OMB; * Facilitate interagency coordination; * Complement oversight work led by the Recovery Board. Key organizations and entities: Recovery Independent Advisory Panel; Primary accountability and oversight responsibilities: * Provide recommendations to the Recovery Board to identify and prevent fraud, waste, and abuse in Recovery Act programs. (Recovery Act, div. A, § 1542, 123 Stat. at 295). State government: Key organizations and entities: Governors; Primary accountability and oversight responsibilities: * Certify, within 45 days of enactment, that the state will request and use funds provided by the Recovery Act in accordance with the law and that the funds will be used to create jobs and promote economic growth. (Recovery Act, div. A, § 1607, 123 Stat. at 303). Key organizations and entities: State auditors; Primary accountability and oversight responsibilities: * Work with the Recovery Board in coordinating oversight activities. (Recovery Act, div. A, § 1528, 123 Stat. at 294). Key organizations and entities: State government agencies; Primary accountability and oversight responsibilities: * Report on the amount of funds received on a quarterly basis. (Recovery Act, div. A, § 1512(c)(1), 123 Stat. at 287); * Report on their use of funds on a quarterly basis. (Recovery Act, div. A, § 1512(c)(2), 123 Stat. at 287); * Report on an estimate of the number of jobs created and the number of jobs retained on a quarterly basis. (Recovery Act, div. A, § 1512(c)(3)(D), 123 Stat. at 287); * Use a separate accounting identifier for Recovery Act funded projects. (Office of Management and Budget Guidance, OMB Memorandum M-09-10).
Local government and others: Key organizations and entities: Recipients of Recovery Act funds (local governments, universities and other research institutions, non-profit organizations, and private companies); Primary accountability and oversight responsibilities: * Report quarterly on Recovery Act funds, including on (1) the amount of funds received, (2) the amount of funds expended or obligated to projects and activities, (3) a detailed list of all projects and activities for which funds were expended or obligated, including a name, description, and evaluation of completion status for each project or activity, and (4) an estimate of the number of jobs created and the number of jobs retained by each project or activity. (Recovery Act, div. A, § 1512(c), 123 Stat. at 287-288). Source: GAO analysis of Recovery Act and OMB/White House memoranda. [End of table] Practices at Federal, State, and Local Levels Contributed to Improved Accountability of Recovery Act Grant Programs but Challenges Existed: Strong Support by Top Leaders, a Collaborative Approach, and Systematic Use of Data Were Key to Managing Recovery Act Implementation: Under the Recovery Act, accountability for timely and effective implementation of the law was a shared responsibility that included agencies involved in directly implementing the law as well as the external oversight community. On the operational side, among the practices that facilitated accountability were (1) strong support by top leaders, (2) centrally-situated collaborative governance structures, and (3) the regular and systematic use of data to support management reviews. Strong Support by Top Leaders: We have previously reported on the importance of having the active support of top leadership when undertaking large and complex activities.[Footnote 17] This was the case in the implementation of the Recovery Act where, at the federal level, the President and Vice President made clear that effective Recovery Act implementation was a high priority for them. The President assigned overall management responsibility for the Recovery Act to the Vice President and appointed a former OMB deputy director to head the newly-created Recovery Implementation Office with direct reporting responsibilities to both him and the Vice President. The former head of the Recovery Implementation Office told us that his position gave him access to top leadership in the administration. This official said he participated in daily morning staff meetings with the White House senior staff, briefing them on any issues related to the Recovery Act. He briefed the President directly approximately once a month. In addition, he typically met with the Vice President's staff on a daily basis after the President's staff meeting. He also met with the Vice President directly every 1 to 2 weeks. Finally, he frequently interacted with the head of OMB and sometimes also sat in on his staff meetings. In each of these roles he had direct access to, and support from, the highest levels of government. The former head of the Recovery Implementation Office stated that this was key to his ability to ensure cooperation and coordination with other federal departments during the Recovery Act. For example, he told us that senior government leaders knew that his office had the authority of the President and Vice President behind it, and if they did not do what was requested, they would have to explain their reasoning to senior White House officials.
This awareness of the Recovery Implementation Office's line of authority helped to ensure that federal officials coordinated and cooperated with the office. In turn, the involvement and engagement of top leaders at individual federal agencies were facilitated by OMB guidance that required each agency to identify a senior accountable official--generally at the deputy secretary or subcabinet level--to be responsible for Recovery planning, implementation, and performance activities within the agency.[Footnote 18] These senior agency leaders were regularly involved with overseeing and reporting on Recovery Act efforts. At the state level, several governors demonstrated top leadership support by establishing specific positions, offices, or both that were responsible for state Recovery efforts. For example, the Governor of Massachusetts created the Massachusetts Recovery and Reinvestment Office as a temporary program management office for the specific task of overseeing Recovery activities. The former director of the office stated that he reported directly to, and drew his authority from, the Governor. The Governor also elevated the office to the rank of a senior-level office. This action increased the office's visibility and gave it a seat at the Governor's weekly cabinet meetings, where its director would regularly report on the status of Recovery Act projects. In addition, no state Recovery Act program could be approved without the director's consent. The former director told us that the success of the office was attributable to the direct line of authority it had with the Governor of Massachusetts. In fiscal year 2012, Massachusetts' Office of Commonwealth Performance, Accountability, and Transparency was created, in part, as a direct result of the Recovery Act.[Footnote 19] According to Massachusetts state officials, this office is the state's attempt to take lessons from the state's experience with the Massachusetts Recovery and Reinvestment Office and apply them post-Recovery Act. Centrally-Situated Collaborative Governance Structures: Collaboration played a key role in ensuring timely implementation of, and accountability for, Recovery Act grant programs. Because the success of the Recovery Act relied on many programs being implemented quickly at the federal, state, and local levels, cooperation and collaboration among these groups were essential. While centrally-situated federal entities such as the Recovery Implementation Office, OMB, and the Recovery Board set the tone, issued guidance, and provided ongoing oversight, many implementation decisions were left to state and local partners directly engaged in managing Recovery Act programs. For example, agencies and grantees were given freedom to publish notices of funding availability and to run competitions in a manner consistent with their individual statutes, regulations, and agency practices. On the other hand, there was also centralization of oversight as demonstrated by the direct involvement of high-level officials such as the Vice President, cabinet secretaries, and senior accountable officials in federal agencies receiving Recovery Act funding, as well as centrally-placed policy and oversight organizations such as OMB and the Recovery Board.
This combination of centralized and decentralized approaches to managing the implementation of the Recovery Act represented a new method of managing grant oversight, one that simultaneously recognized the importance of collaboration while increasing the role of the center.[Footnote 20] Officials in the Recovery Implementation Office employed a collaborative, facilitative approach, while also leveraging the authority of the Vice President to encourage the participation of stakeholders. The office functioned as a convener and problem-solver that engaged with a wide range of federal, state, and local partners. This approach was embodied in the objectives identified by the Vice President when the office was established. These objectives included the expectation that office staff respond to requests and questions within 24 hours, cut across bureaucratic silos by reaching out to a variety of partners, and always be accessible. Toward this end, the office adopted the role of an "outcome broker," working closely with partners across organizational silos at all levels of government in order to foster implementation of the Recovery Act and achieve results.[Footnote 21] Another role of the Recovery Implementation Office was to closely monitor Recovery Act spending. One way it did so was to monitor grants to ensure that they were consistent with the objectives identified by the Vice President. A second way the office monitored spending was to review weekly financial reports on agency obligations and expenditures for programs receiving Recovery Act funds and to meet with the agencies on a regular basis. OMB sought to facilitate effective implementation of the Recovery Act by working to establish and strengthen relationships with the state and local governments that would ultimately implement the programs on the ground. This was done in two ways: (1) by soliciting feedback from state and local partners when formulating and revising rules and policies governing the implementation of Recovery Act programs and (2) by developing its capacity to respond to questions from the many states and localities that would be implementing those rules and policies. A senior OMB official directly involved in this work told us that the office had to move out of its traditional role as mainly a policy-making organization to adopt a more interactive and service-oriented approach. Under this approach, key activities involved engaging with and obtaining feedback from states and localities as well as providing technical support to these groups so that they could meet the Recovery Act's numerous reporting requirements. For example, to obtain feedback from state and local partners when developing key Recovery Act policies, OMB became actively involved in weekly conference calls that included a diverse group of federal, state, and local organizations. Starting in the spring of 2009, regular participants in these calls included OMB; GAO; the National Association of State Auditors, Comptrollers, and Treasurers; the National Governors' Association; the National Association of State Budget Officers; the Recovery Board; the National Association of Counties; the National Association of State Chief Information Officers; and the National Association of State Purchasing Officers. These weekly calls were scheduled after several of these organizations wrote to OMB and GAO to express their strong interest in coordinating on reporting and compliance aspects of the Recovery Act.
An important outcome of this regular information exchange was to make OMB aware of the need to clarify certain reporting requirements. The Recovery Act required federal agencies to make information publicly available on the estimate of the number of jobs created and the number of jobs retained as a result of activities funded by the act. Our previous Recovery Act work in the states raised the issue that some local officials needed clarification regarding definitions when reporting on job data. The local partners participating in these calls were able to corroborate what we reported and provide OMB with specific information about what additional guidance was needed. To obtain information to further guide refinements to the Recovery implementation process, at the end of 2009, OMB officials said they (1) interviewed and surveyed numerous stakeholders, including governors and state and local recipients, and (2) worked with GAO to identify best practices. Based on these efforts, OMB subsequently revised its guidance, which focused on lessons learned related to enhancing recipient reporting and compliance. [Footnote 22] To improve technical support provided to state and local governments implementing the Recovery Act, OMB worked with the Recovery Board to establish an assistance center based on an "incident command" model. [Footnote 23] One OMB official likened this approach to an extension of a traditional response model used during natural disasters, where the country's economic condition during the Great Recession was the "incident" and the Recovery Act was the intervention to be rolled out through many partners. To help implement this approach, OMB worked with officials from the Department of Agriculture who offered the services of one of their national emergency management teams to help set up and coordinate this effort. Given the large number of state and local governments that needed to be supported, OMB requested that each agency with grant programs receiving Recovery Act funds contribute personnel to support the center. According to OMB officials, from September to mid-December of 2009, the center responded to approximately 35,000 questions from states and localities. Regular and Systematic Use of Data to Support Management Reviews: Under the Recovery Act, some agencies used new data-driven approaches to inform how they managed programs, and some of those new approaches became institutionalized at the agencies post-Recovery. While the Government Performance and Results Act (GPRA) Modernization Act of 2010 (GPRAMA) laid out requirements for data-driven quarterly performance reviews, several Recovery Act efforts aided agencies in implementing those requirements.[Footnote 24] For example, in February 2013 we found that the Department of Energy (DOE) built on its Recovery Act-related performance reviews and established quarterly performance reviews, called business quarterly reviews, in 2011. [Footnote 25] Another control DOE implemented for large-dollar projects was a "Stage-Gate" process, which did not allow the funds to be disbursed all at one time; instead, it required the recipient to meet certain metrics or results before receiving additional funding at certain levels. DOE Office of Inspector General (OIG) officials believed this Stage-Gate approach was an effective internal control tool. Post-Recovery, DOE has institutionalized both the business quarterly reviews and Stage-Gate processes.
As part of the Department of Housing and Urban Development's (HUD) implementation of the Recovery Act, the agency piloted a new approach to data management and accountability called HUDStat. HUD's Recovery Act team collected data about the status of projects and progress towards financial goals. Armed with this information, HUD leaders could identify and neutralize spending delays across the agency's 80 field and regional offices. In some cases, a senior HUD official would make a phone call to a mayor or a governor to stress the need to spend funds quickly. In other cases, staff would refocus on regions where progress was slow and would work with grantees to move more quickly to promote economic growth. After the Recovery Act, and in accordance with GPRAMA requirements, HUD continued to use HUDStat to share data and resources across the agency. Heightened Accountability Requirements and Aggressive Implementation Timelines Led to Increased Coordination and Information Sharing: The Recovery Act contained increased accountability requirements in the areas of reporting, audits, and evaluations to help ensure that tax dollars were being spent efficiently and effectively. At the same time, the act provided aggressive timelines--approximately 19 months-- for the distribution of funds. The combination of these two factors placed high expectations on federal, state, and local governments and led to increased coordination both vertically across levels of government and horizontally within the same level of government to share information and work towards common goals. Networks Provided a Mechanism to Share Information: Organizations involved in overseeing and implementing grants funded by the Recovery Act made use of both new and established networks to share information. Shortly after the Recovery Act was signed into law, our then Acting Comptroller General and the Chair of the Council of the Inspectors General on Integrity and Efficiency hosted a coordination meeting with the OIGs or their representatives from 17 federal agencies to discuss an approach to coordination and information sharing going forward. We also worked with state and local auditors and their associations to facilitate regular conference calls to discuss Recovery Act issues with a broad community of interested parties. Participants included the Association of Government Accountants; the Association of Local Government Auditors; the National Association of State Auditors, Comptrollers, and Treasurers; the Recovery Board; and federal OIGs. Another active venue for information sharing was the National Intergovernmental Audit Forum (NIAF). The NIAF, led during this period by our then Acting Comptroller General, is an association that has existed for over three decades as a means for federal, state, and local audit executives to discuss issues of common interest and enhance accountability. NIAF's May 2009 meeting brought together these executives and others including OMB, to update them on the Recovery Act and provide another opportunity to discuss emerging issues and challenges. In addition, several Intergovernmental Audit Forum meetings were scheduled at the regional level across the country and sought to do the same. This regional coordination and information sharing directly contributed to our Recovery Act work in the states. For example, our western regional director made a presentation at the Pacific Northwest Audit Forum regarding our efforts to coordinate with state and local officials in conducting Recovery Act oversight. 
In conjunction with that forum and at other related forums, she regularly met with the principals of state and local audit entities to coordinate oversight of Recovery Act spending. Officials from New York City also played a role in creating networks to share information. Believing that large cities were probably facing similar issues and challenges, Recovery officials in New York City established the American Recovery and Reinvestment Act Big City Network (BCN) to serve as a peer exchange group and facilitate information sharing among large municipalities across the country. The group was composed of over 20 large cities with geographical diversity, such as Los Angeles, Philadelphia, Phoenix, and Seattle, that received a significant amount of federal stimulus funding. The former head of the BCN told us that the organization held frequent teleconferences and used this collaboration to elevate issues unique to large cities with OMB, the White House's Recovery Implementation Office, and the Recovery Board. For example, BCN informally surveyed its members in January 2010 concerning each grant and associated funds they received. From this survey, BCN officials assembled a list of cross-jurisdictional issues reflecting the perspectives and experiences of large cities and shared them with the White House, OMB, and the Recovery Board. Likewise, OMB, the Recovery Implementation Office, and the Recovery Board used BCN as a vehicle for getting information out to its partners on the ground. Similarly, at the state level, a network was established where state Recovery Act coordinators shared information and lessons learned on a weekly basis. This state-level network also discussed ongoing Recovery Act policy and operational issues with the White House, OMB, and the Recovery Board to ensure successful implementation. Federal officials joined the state calls on a regular basis. Both BCN and the state network proved to be especially helpful in fostering intergovernmental communications. For example, the former head of the BCN stated that in response to a Senate Committee request in 2012, New York City leveraged both BCN and the state Recovery Act coordinators' network to inform the current discussion on the Digital Accountability and Transparency Act, proposed legislation which seeks to improve grant transparency through increased reporting.[Footnote 26] Cities and states mobilized quickly and came together on key consensus principles for Congress' consideration. Organizations Worked Together in New Ways to Achieve Common Goals: Under the tight time frames set for implementation of the Recovery Act, federal agencies needed to work together to accomplish their goals. For example, HUD and DOE shared a goal of weatherizing low- income households through long-term energy efficiency improvements. To get the projects under way as quickly as possible, they worked together to ensure that homeowners met income standards. Before Recovery Act implementation, both DOE and HUD conducted their own independent income verifications. In May of 2009, DOE and HUD entered into a memorandum of understanding that eliminated the need for separate DOE income verification for people whose incomes had already been verified by HUD. According to DOE officials, this collaboration helped projects move faster, reduced the cost and administrative burden of duplicative verifications, and helped DOE weatherize numerous homes under the Recovery Act through 2013. 
DOE officials reported that between fiscal years 2010 and 2013, the joint effort helped weatherize approximately 1.7 million housing units, the majority of which were low-income. This policy of sharing low-income verifications for weatherizing homes has continued post-Recovery Act. At the state level, Massachusetts is an example where officials developed new ways of working together to achieve Recovery Act goals. For example, Massachusetts state officials established the Stimulus Oversight and Prevention (STOP) Fraud Task Force in 2009 to fulfill the Recovery Act's goal of preventing fraud, waste, and abuse of Recovery Act funds. This task force included the state OIG's office, the Attorney General's office, and the State Auditor. Over the next 2 years, the group met bimonthly to discuss fraud prevention and collaborated with several federal agencies including the Department of Justice, the Federal Bureau of Investigation, and HUD. The group also brought in federal OIGs including DOE and Education, the state Comptroller's office, and the Massachusetts Recovery and Reinvestment Office to discuss our report findings and OMB guidance. According to officials from the Massachusetts Attorney General's office, the task force improved communication and furthered efforts to avoid overlap. The Recovery Act Prompted Adjustments and Innovations in Oversight to Foster Accountability: Faced with the short time frames and accelerated roll out of Recovery Act funds, both the oversight community and agencies adjusted their oversight approach and innovated to foster accountability for Recovery Act funds at the federal and state agency levels. These organizations became more engaged in up-front analysis and monitoring of programs under the Recovery Act and their reviews were often issued before money was spent. These practices included (1) assessing and planning for risks up front; (2) reviewing programs before and while they were being funded rather than waiting until after programs were implemented; (3) communicating findings quickly through informal processes as opposed to regular full reports; and (4) using advanced data analytics. Increased Use of Up-Front Risk Assessments and Planning: At the federal level, several agency OIGs conducted up-front risk planning to proactively prepare for the influx of Recovery Act funds. For example, the Department of Transportation's (DOT) OIG instituted a three-phase risk assessment process for DOT programs that received Recovery Act funds. The OIG first identified existing program risks based on past reports; it next assessed what the department was doing to address those risks; and it then conducted the audit work. DOT's OIG is continuing to use this three-phase scan approach for its work on Hurricane Sandy. At the Department of Education, when the OIG realized that Education's discretionary grant budget would increase from a typical allotment of $60 billion annually to over $100 billion under the Recovery Act, officials put aside their initial work plan and developed a new one which focused on the Recovery Act. Toward this end, the OIG conducted up-front risk assessments by looking at its prior work to identify persistent implementation issues going back to fiscal year 2003. The OIG then issued a 2009 capping report that summarized these issues. 
This report and additional risk assessments on Recovery Act-specific issues guided the OIG's internal control audits that focused on the use of funds, cash management, subrecipient monitoring, and data quality for Recovery Act education programs. Shortly after the Recovery Act was signed, DOE's OIG reviewed the challenges the agency would need to address to effectively manage the unprecedented level of funding and to meet the goals of the Recovery Act. The resulting report was based on a body of work by the OIG to improve operations and management practices.[Footnote 27] The OIG identified specific risks that it had discovered during past reviews and investigations. The OIG also suggested actions that should be considered during Recovery Act planning and program execution to help reduce the likelihood that these historical problems would recur. Further, the OIG described the department's initial efforts to identify risks and to develop strategies to satisfy the Recovery Act's goals and objectives. In addition, the report outlined the OIG's planned oversight approach, which adopted a risk-based strategy that included, among other things, early evaluations of internal controls and assessments of performance outcomes. At HUD, regional offices conducted front-end risk assessments of programs that would be receiving Recovery Act funds. The HUD OIG considered these risk assessments when preparing its work plan and carrying out audits. The office also conducted capacity reviews for programs that field offices had identified as having known issues. The purpose of these capacity reviews was to enable the office to actively address and work to resolve known issues before Recovery Act funds were distributed to programs. At the state level, audit organizations also adjusted their usual approaches when planning and conducting reviews of grant programs that received Recovery Act funds. Several state auditors conducted extra audit work of state programs up front in an effort to identify risks and inform their work moving forward. For example, the Office of the California State Auditor conducted "readiness reviews" that highlighted known vulnerabilities in programs receiving Recovery Act money. The office used the information coming out of these reviews to identify specific issues to focus on in future work as well as to inform the oversight committees of the state legislature and other state officials involved in Recovery Act oversight and implementation. As a result of one such review that focused on DOE's Weatherization Assistance Program, the State Auditor was able to identify key implementation issues that needed attention at a joint meeting of state and federal officials organized by the Governor's Recovery Act Task Force. The readiness review identified specific areas where the program needed to improve and informed the frequency with which state auditors would go back to program officials to check on progress. According to the California state auditor, among the benefits of this approach were the feedback it provided to state agencies on their level of readiness as well as the detailed information given to both the state legislature and the Governor's Recovery Act Task Force on the agency's progress. The use of readiness reviews has continued post-Recovery Act. Most recently, the office employed the approach in 2013 as it prepared to audit the implementation of the Affordable Care Act in California.
Increased use of "Real Time" Information: The Recovery Act's short time frames prompted the oversight community to carry out some of its reviews in "real time" as Recovery funds were being rolled out, as opposed to the traditional approach of reviewing a program after implementation. Under this approach, members of the oversight community looked for ways to inform program officials of challenges and needed improvements much earlier in the process. For example, as described previously in table 1, the Recovery Act specified several roles for us, including conducting bimonthly reviews of selected states' and localities' use of funds made available under the Act. We subsequently selected a core group of 16 states and the District of Columbia to follow over the next few years to provide an ongoing longitudinal analysis of the use of funds provided in conjunction with the Recovery Act. The Recovery Act also assigned us a range of responsibilities to help promote accountability and transparency. Some were recurring requirements such as providing bimonthly reviews of the use of funds made available under various provisions of the Recovery Act by selected states and localities and reviews of quarterly reports on job creation and job retention as reported by Recovery Act fund recipients. Other requirements included targeted studies in several areas such as small business lending, education, and trade adjustment assistance. In total, we issued approximately 125 reports on, or related to, the Recovery Act resulting in more than 65 documented accomplishments. The interest in obtaining "real time" feedback concerning Recovery Act implementation was not limited to the oversight community. For example, DOT's Federal Highway Administration (FHWA) established National Review Teams (NRT) within 3 months of the Recovery Act's passage to help assist its division offices attain the greater level of accountability and transparency called for under the Recovery Act. As we previously reported, the NRTs were composed of FHWA staff-- separated from the rest of FHWA--to act as a neutral third party to conduct oversight.[Footnote 28] The mission of the NRTs was to conduct quick reviews of FHWA programs and assess processes and compliance with federal requirements in six key risk areas: (1) preliminary plans, specifications, and estimates; (2) contract administration; (3) quality assurance of construction materials; (4) local public agencies; (5) disadvantaged business enterprises; and (6) eligibility for payments. As a review progressed, the NRT discussed findings with division office and state transportation staff. According to FHWA officials, independent reviews had several benefits: * a consistent, comparative perspective on the oversight regularly conducted by division offices, and the collection of information at the national level on both best practices and recurring trouble spots across FHWA division offices; * additional "boots on the ground" for project-level oversight and increased awareness of federal oversight activity among states, Metropolitan Planning Organizations, and other transportation organizations receiving Recovery Act funds; and: * an independent outside voice to examine Recovery Act projects and point out problems, keeping the partnering relationship between the division offices and the state DOTs intact. Division offices and state officials with whom we spoke responded positively to the NRT reviews. 
The NRT was viewed as a success for FHWA, which has since added independent reviews based largely on the NRT model to provide corporate-level review of projects and programs in addition to other support services. Earlier Communication of Audit Findings: The rapid pace at which Recovery Act funds were being distributed also prompted audit organizations to communicate their findings earlier in the audit process. For example, DOT's OIG issued periodic advisories within the agency rather than waiting until an audit was completed to share its findings. According to OIG staff, these advisories informed the department of issues or concerns shortly after they were discovered, thereby permitting program staff to take corrective action much more quickly. In our first report on our bimonthly reviews of the use of Recovery Act funds by selected states and localities, we determined that the Single Audit process needed adjustment to provide the necessary level of focus and accountability over Recovery Act funds in a more timely manner than the existing schedule allowed.[Footnote 29] Subsequently, we recommended that the director of OMB adjust the Single Audit process to provide for review during 2009 of the design of internal controls over programs receiving Recovery Act funding, before significant expenditures occurred in 2010. In response, in October 2009 OMB implemented the Single Audit Internal Control Project--a collaborative effort between 16 volunteer states receiving Recovery Act funds, their auditors, and the federal government--to achieve more timely communication of internal control deficiencies for higher-risk Recovery Act programs.[Footnote 30] The project encouraged auditors to identify and communicate significant deficiencies and material weaknesses in internal controls over compliance for selected major Recovery Act programs 3 months sooner than the 9-month time frame required under statute. The project allowed program management officials at an audited agency to expedite corrective action and help mitigate the risk of improper Recovery Act expenditures. In May 2010, we reported that the project met some of its objectives and was helpful in identifying critical areas where further OMB actions were needed to improve the Single Audit process over Recovery Act funding.[Footnote 31] Auditors at the local level also communicated their findings early. For example, the Denver City Auditor's Office adopted new practices to provide more timely information on Recovery Act programs to the Mayor and other key officials, particularly on issues affecting compliance with Recovery Act reporting requirements. Using a tiered notification process, the auditor's office would initially notify the appropriate city department informally, through e-mail or similar means, of potential issues it was finding during an ongoing audit. The auditor's office would revisit the issues later and, if the office determined the issue had not been addressed, it would then formally communicate any substantive issue on a real-time basis through an "audit alert." These alerts were typically brief documents and went to the affected departments as well as directly to the Mayor's work group that oversaw the city's Recovery Act implementation. If appropriate action was still not forthcoming, the city auditor might issue a public alert or a full public audit report.
According to a senior city audit official, the alerts were beneficial because the city auditor did not have to conduct a full audit to communicate risks and findings to decision makers, allowing them to more quickly address problems. The city auditor issued its first audit alert in October 2009 and issued a second alert in February 2010 when the problems identified in the first had not been addressed. After the second alert, the city administration corrected the identified problems. Use of Advanced Data Analytics: To further increase accountability under the Recovery Act, the Recovery Board utilized innovative data analytics in carrying out its oversight responsibilities. Data analytics is a term typically used to describe a variety of techniques that can be used to analyze and interpret data to, among other things, help identify and reduce fraud, waste, and abuse. Specifically, predictive analytic technologies can be used to identify potential fraud and errors before payments are made, while other techniques, such as data-mining and data-matching of multiple databases, can identify fraud or improper payments that have already been made, thus assisting agencies in recovering these dollars.[Footnote 32] In October 2009, the Recovery Board established an innovative center to analyze the use of Recovery Act funds by employing data analytics (see figure 3). The Recovery Operations Center (ROC) served as a centralized location for analyzing Recovery Act funds and their recipients through the use of such predictive analytic technologies. According to Recovery Board staff, the results of these approaches provided the OIG community and other oversight authorities with information they could use to focus limited resources on cities, regions, and high-risk government programs where historical data and current trends suggested the likelihood of future risk. ROC analysts would cross-reference lists of grant recipients or sub-recipients against a variety of databases to look for risk indicators such as criminal convictions, lawsuits, tax liens, bankruptcies, risky financial deals, or suspension/debarment proceedings. One tool used to do this is link analysis, which assists the analyst in making connections by visually representing investigative findings. Link analysis charts visually depict how individuals and companies are connected, what awards an entity has received, and how these actors may be linked to any derogatory information obtained from the databases described above. Such tools, when combined with enhanced Geographic Information System capabilities, enable ROC analysts to conduct geospatial analysis by displaying data from multiple datasets on maps to help them make linkages and discover potential problems. For example, the ROC helped a federal agency investigate possible contract fraud related to over-billing on multiple contracts. ROC analysts found 99 recipient awards made to a single company totaling over $12 million. In another example, the ROC helped to investigate allegations of false claims and major fraud against the United States. ROC analysts found that officers of one company were also executives of more than 15 other companies, many of which were located at the same address, and collectively received millions in Recovery Act funds.
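The data-matching and link analysis described above can be illustrated, in highly simplified form, with the following Python sketch. The awards, risk-indicator list, and officer relationships shown here are hypothetical and are not drawn from the Recovery Operations Center's actual data sources or tools; the sketch shows only the general pattern of cross-referencing award recipients against risk indicators and connecting recipients that share company officers.

# Illustrative sketch only: cross-referencing award recipients against a
# risk-indicator list and grouping awards by shared company officers, in the
# spirit of the data-matching and link analysis described above. All names,
# records, and values are hypothetical.

from collections import defaultdict

awards = [
    # (award_id, recipient, officer, amount)
    ("A-001", "Acme Paving LLC", "J. Smith", 1_200_000),
    ("A-002", "Beta Builders Inc", "J. Smith", 900_000),
    ("A-003", "Civic Works Co", "R. Jones", 450_000),
]

# Hypothetical risk indicators drawn from other databases (for example,
# debarment or tax-lien lists in the real process described above).
flagged_entities = {"Beta Builders Inc"}

# Data matching: flag awards whose recipient appears on a risk-indicator list.
matches = [a for a in awards if a[1] in flagged_entities]

# Simple link analysis: connect recipients that share the same officer.
officer_links = defaultdict(list)
for award_id, recipient, officer, amount in awards:
    officer_links[officer].append((recipient, award_id, amount))

for award in matches:
    print("Risk-indicator match:", award)

for officer, linked in officer_links.items():
    if len(linked) > 1:
        total = sum(amount for _, _, amount in linked)
        print(f"Officer {officer} linked to {len(linked)} recipients, "
              f"${total:,} in awards")

An analyst would treat such output as a lead for further review, not as a finding; the value of the approach lies in directing limited investigative resources toward the highest-risk awards before or soon after funds are spent.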
More recently, the ROC has been used to track funds and help reduce fraud, waste, and abuse related to the tens of billions of dollars that have been awarded to states and communities to assist in their recovery after Hurricane Sandy hit in October 2012. Recovery Board staff have sought to leverage the expertise they have developed in analyzing financial spending and identifying potential fraud and high-risk indicators based on their experience with the Recovery Act. Figure 3: An Analyst Working in the Recovery Board's Recovery Operations Center and a Sample Output of One of ROC's Link Analysis Tools: [Refer to PDF for image: photograph and illustrative example of link analysis tool] Sources: GAO representation of Recovery Board information; Recovery Board (photo). [End of figure] The Recovery Act's Accountability Requirements Also Presented Oversight Challenges: To assure the public that their tax dollars were being spent efficiently and effectively, the Recovery Act called for increased oversight and accountability of those funds by oversight and program entities at the federal, state, and local levels. This increased emphasis on oversight and accountability presented challenges for those entities stemming from (1) a lack of financial resources to conduct oversight at the state and local levels, (2) human capital issues, and (3) the accelerated rollout of Recovery Act funds. Limited Resources to Conduct Oversight at the State and Local Levels: Officials with whom we spoke in several states expressed concerns that the Recovery Act did not provide funding to state oversight entities, although it placed additional federal requirements on them to provide proper accounting and to ensure transparency. Federal agency OIG offices received a significant amount of funding to conduct oversight of Recovery Act funds--ranging from $1 million to $48.25 million--distributed to more than 28 agencies. In contrast, states and localities relied on their existing budgets and human capital resources (supplemented, in some cases, by a small percentage of administrative funds) to carry out their additional oversight activities. Due to fiscal constraints, states reported significant declines in the number of management and oversight staff--limiting states' ability to ensure proper implementation and management of Recovery Act funds. With oversight capacity already strained in many states, the situation was further exacerbated by increased workloads resulting from implementation of new or expanded grant programs funded by the Recovery Act. For example, Massachusetts officials explained that the state oversight community faced budget cuts of about 10 percent. According to officials from the OIG and the State Auditor's office, their budgets were almost entirely composed of salaries, and any cuts in funding resulted in fewer staff available to conduct oversight. As a result of the cuts, the Inspector General stated that his department did not have the resources to conduct any additional oversight related to Recovery Act funds. Further, the Massachusetts State Auditor described how his department had to furlough staff for 6 days in fiscal year 2009. In recognition of this situation and reflective of the state's desire to pursue fraud in the Recovery Act program, for state fiscal years 2009 through 2012, the Massachusetts Recovery and Reinvestment Office allocated funds from the state's central administration account to the Attorney General, State Auditor, and OIG offices to ensure that oversight would take place.
The California State Auditor also cited the lack of federal funding for state and local oversight as a challenge to ensuring accountability in the implementation of the Recovery Act. In 2009 testimony before the California state budget committee, the State Auditor said that her office would need to conduct an additional 14 audits, based on an initial analysis of the estimated stimulus funds that California would receive. Furthermore, the programs that the office was auditing at the time received additional funds, which potentially increased the workload and cost to audit those programs as well. Finally, new requirements created by the Recovery Act for existing programs also affected the efforts of the State Auditor's office. The California State Auditor noted that, given the additional responsibilities her office faced due to the influx of stimulus funds, any budget cuts would adversely affect the office's ability to conduct audits. In another example, Colorado's state auditor reported that state oversight capacity was limited during Recovery Act implementation, noting that the Department of Health Care Policy and Financing had three controllers in 4 years and the state legislature's Joint Budget Committee cut field audit staff for the Department of Human Services in half. In addition, the Colorado DOT's deputy controller position was vacant, as was the Department of Personnel & Administration's internal auditor position. Colorado officials noted that these actions were due, in part, to administrative cuts made during a past economic downturn in an attempt to maintain program delivery levels. Accelerated Rollout of Recovery Act Programs Presented Oversight Challenges: The President's goal of quickly spending Recovery Act funds created a large spike in spending for a number of programs in the 28 agencies receiving Recovery Act funds. The act also created a number of new programs--requiring agencies to move quickly. As a result, under the Recovery Act's accelerated rollout requirements, some federal agencies and states faced oversight challenges. For example, DOT and states faced numerous challenges in implementing the Recovery Act's maintenance-of-effort oversight mechanism due to the accelerated rollout of funds. The Recovery Act contains maintenance-of-effort provisions designed to prevent recipients, such as state DOTs, public housing agencies, and private companies, from substituting planned spending for a given program with Recovery Act funds. That is, the provisions were intended to ensure that the increased federal spending would supplement rather than replace state, local, or private spending.[Footnote 33] The maintenance-of-effort provision for DOT in the Recovery Act required the governor of each state to certify that the state would maintain its planned level of transportation spending from February 17, 2009, through September 30, 2010.[Footnote 34] Twenty-one states did not meet their certified planned spending levels, and a January 2011 preliminary DOT report found that some of these states were unclear on what constituted "state funding." DOT also found that some of the states were uncertain about how well DOT guidance on calculating planned expenditures would work in the many different contexts in which it would have to operate. As a result, many problems came to light only after DOT had issued initial guidance and states had submitted their first certifications.
DOT issued guidance seven times during the first year after the act was signed to clarify how states were to calculate their planned or actual expenditures for their maintenance-of-effort certifications. Further, many states did not have an existing means to identify planned transportation expenditures for a specific period, and their financial and accounting systems did not capture that data. Therefore, according to DOT and some state officials, a more narrowly focused requirement applying only to programs administered by state DOTs or to programs that typically receive state funding could have helped address the maintenance-of-effort challenges. DOT and state officials told us that while the maintenance-of-effort requirement can be useful for ensuring continued investment in transportation, future provisions should allow more flexibility for differences among states and programs and for adjustments in response to unexpected changes in states' economic conditions. DOE initially encountered some challenges with fully developing a management and accountability infrastructure because of the large amount of Recovery Act funding it received in a short period of time. According to an official in the DOE OIG's office, this was especially true with the new Energy Efficiency and Conservation Block Grant program.[Footnote 35] This official told us that some states and localities also did not have the infrastructure in place (including the necessary training) to manage the large amount of additional federal funding. Further, DOE required recipients' weatherization plans to address how the respective state's current and expanded workforce (employees and contractors) would be trained. In May 2010, according to DOE, the agency was in the process of developing national standards for weatherization certification and accreditation. DOE estimated that developing the standards would take about 2 years--a time frame that did not match the accelerated distribution of Recovery Act funds. Several years after the Recovery Act was implemented, DOE reported that it had completed certain milestones toward developing national standards for weatherization training, certification, and accreditation, but was still working to finalize other elements such as its national certification program. The Recovery Act Resulted in Increased Transparency but Also Presented Challenges: Recovery Act Transparency Websites Embody Several Leading Practices: In an April 2009 memorandum, OMB directed agencies to follow leading practices for federal website development and management, such as those listed on HowTo.gov, a website managed by the Federal Web Managers Council and the General Services Administration.[Footnote 36] HowTo.gov makes available a list of the "Top 10 Best Practices" for federal websites as a resource to improve how agencies communicate and interact with customers and provide services and information.[Footnote 37] Recovery.gov, as well as selected state and city Recovery websites, demonstrated several of these leading practices, including establishing a clear purpose of the website, using social networking tools to garner interest in the website, tailoring websites to meet audience needs, and obtaining stakeholder input when designing the website. In addition, we found that some websites enabled place-based performance reporting.
Establish a Clear Purpose of the Website: Consistent with leading practices for the development of federal websites on HowTo.gov, Recovery.gov and selected state Recovery websites clearly identify for the user the purposes of the site and the ways it can be used to accomplish tasks efficiently. According to HowTo.gov, this is important because people often visit government websites with a specific task in mind, and if they cannot quickly find the information they need to complete that task, they will leave the site. Recovery.gov contains an entire page that outlines what users can do on the site, including how to use the raw data available through the website; report waste, fraud, and abuse; or find job and grant opportunities. Further, Recovery.gov has a "Get Started" page with an overview of the information on the site, including Recovery Act goals, the Recovery Board's mission, what information is not available on the website, and what users can do on the website. Similarly, Massachusetts' Recovery website has tabs on its homepage that link to information on how to use the website to track Recovery Act jobs, spending, vendors, and the impact of Recovery Act dollars in the state.[Footnote 38] For example, the "track jobs" page informs users how they can track jobs created and retained in their community and provides a user guide to assist them in their query. Use Social Networking Tools to Garner Interest in the Website: Another leading practice for federal websites includes the use of social networking tools. According to HowTo.gov, social media is transforming how government engages with citizens, allowing agencies to share information and deliver services more quickly and effectively than ever before. Recovery.gov and selected state and local Recovery websites use social networking tools to garner interest in their websites. These websites integrated Web 2.0 technologies to help people share and use the information they provide. For example, to develop web-based communities of interest, Recovery.gov has a dedicated social media web page that has links to Recovery's presence on various social-networking tools such as Facebook, Twitter, YouTube, and Flickr.[Footnote 39] Recovery.gov's social media page enables users to (1) download a Recovery application for iPhones and iPads with a mapping feature showing how Recovery Act funds were being spent, (2) sign up for a Recovery.gov month-in-review email, and (3) sign up to receive Recovery RSS web feeds.[Footnote 40] Finally, Recovery.gov also has a blog, written by Recovery Board staff, with the stated purpose of furthering a dialogue on transparency and accountability in government, as well as providing a forum for thoughts, comments, and suggestions from the public. New York City also made use of social networking to communicate information regarding Recovery Act implementation through a Tumblr blog.[Footnote 41] City officials used this blog to communicate stories and examples to city residents about how the city was using Recovery Act funds and the impact of those investments.[Footnote 42] City officials said the blog allowed them to get behind full-time equivalent numbers and dollar expenditures so that people could better understand how the Recovery Act was helping them tackle problems where they work and live.
For example, the blog described one project that had no net increase in jobs but still made a valuable difference for the city because Recovery Act funds were used to repair 300,000 potholes and move to zero diesel fuel emissions for city vehicles. Tailor Website to Meet Audience Needs: Organizing a website according to the needs of its audience is also a key leading practice for federal websites since an agency's goal is to build the right website for the people who need it and serve them effectively by learning as much as possible about the website's customers and what they do. Recovery.gov has dedicated pages for different audiences that compile and organize relevant resources according to their needs and interests. On its home page, Recovery.gov has a tab which provides links to pages designed with specific users in mind such as citizens, the press, and grant recipients. There are also links to pages on neighborhood Recovery Act projects, information on the Recovery Board, and other information users are looking for. For example, grant recipients have a dedicated page that provides resources such as reporting timelines, user guides, a service/help desk, recipient reporting information, and a recipient awards map. (See figure 4.) Figure 4: Screenshot of Recovery.gov User Identification Feature: [Refer to PDF for image: screenshot] Depicted: links: I Am: An Interested Citizen: See information about projects in your area, learn more about the Recovery Act, and connect with Recovery.gov via social media. A Data User: Download recipient data and find widgets that you can post on your own website. A Member of the Press: Learn about the Recovery Board, and get the details on recipient reporting. A Recipient: Review the reporting schedule for the current reporting cycle, and see the latest enhancements to FederalReporting.gov. I Am Looking For: Projects in My Neighborhood: Enter your zip code to see the Recovery projects in your area, and visit the Map Gallery. Opportunities and Benefits: Search for jobs through a variety of job searches, and find information about COBRA and Unemployment Insurance. Recovery Act and Recovery Board Information: Read about the Recovery Act, the Board and its members, and actions of the Inspectors General. A Snapshot of Recovery: See pictures on the Recovery.gov Flickr page, watch Recovery videos, and read stories about Recovery projects. Source: Recovery.Gov. Note: The link to the above web page is [hyperlink, http://www.recovery.gov/Pages/audience_landing.aspx]. [End of figure] On Recovery.gov's “Developer Center” web page, users can access data reported by recipients of Recovery awards through the Recovery application program interface (API) and the Mapping API.[Footnote 43] Users can also find widgets providing data summaries by state, county, congressional district, or ZIP code as reported by recipients. [Footnote 44] The web page also has a tool for users to build customized charts and graphs displaying information such as funds awarded and received by state, agencies by number of awards, and spending categories by funds awarded. The state of Massachusetts also tailored its Recovery Act website to meet its audience's needs. Prior to its implementation of the Recovery Act's transparency provisions, Massachusetts had little experience with electronic reporting and disclosure of federal contracts, grants, and loans. The MassRecovery website provided weekly citizen updates and testimonials of how spending has benefited lives. 
The Citizens' Update web page provides a summary of where the state's Recovery Act dollars are going, where jobs are being created and retained, and information on beneficiaries of funds received. In December 2009, MASSPIRG, an independent consumer research group, issued a brief pointing to the strengths of the Massachusetts Recovery website, including the ability of the Citizens' Update web page to show money spent and jobs created and retained in easy-to-read pie charts and tables; a summary of funds distributed through the state; and an interactive state map of Recovery Act spending. Further, in January 2010, Good Jobs First, a national policy resource center, reviewed and evaluated states' Recovery Act websites. The organization ranked Massachusetts' Recovery website on its top 10 list, citing such beneficial features as the site's comprehensive search engine, data download capability, and information on five key Recovery Act project elements--description, dollar amount, recipient name, status, and the text of the award. Obtain Stakeholder Input When Designing the Website: Leading website practices also recommend that developers obtain stakeholder input when designing federal websites by engaging potential users through focus groups and other outreach; regularly conducting usability tests to gather insight into navigation, the organization of content, and the ease with which different types of users can complete specific tasks; and collecting and analyzing performance, customer satisfaction, and other metrics. According to leading website practices, these efforts are important for collecting and analyzing information about audiences, their needs, and how they are using, or want to use, the website. The developers of Recovery.gov followed this leading practice by using input from user forums, focus groups, and usability testing with interested citizens to collect feedback and recommendations, which then informed the development of the website from its initial stages. For example, teaming with OMB and the National Academy of Public Administration, the developers of Recovery.gov hosted a week-long electronic town hall meeting at the end of April 2009 entitled "Recovery Dialogue on Information Technology Solutions." Over 500 citizens, information technology specialists, and website development experts registered for the event and submitted numerous ideas. Recovery.gov adopted some of the ideas immediately and included others in the re-launched version of the website in September 2009. These changes included a standardized reporting system for recipients, a greater use of maps, and a feedback section for users. Additionally, in October 2009, Recovery.gov developers conducted remote usability testing with 72 users, from whom the developers received suggested changes, some of which they later implemented. Further, in 2012, significant changes were made to Recovery.gov based on user feedback on the website. These changes included creating a recipient and agency data page, agency profiles, and a new Recipient Projects Map with a series of dropdown menus and checkboxes that enable users to filter data so they can see it in a targeted fashion (for example, by state, agency, or category). Enable Place-Based Performance Reporting: For websites covering numerous projects at various locations, a place-based geographic information system can be a useful tool.
According to the White House's Digital Government Strategy, the federal government needs to be customer-centric when designing digital service platforms such as websites. In other words, agencies need to be responsive to customers' needs by making it easy to find and share electronic information and accomplish important tasks. From the beginning, recipient-reported data on Recovery.gov was geo-coded in a way that made it possible for users to find awards and track the progress of projects on a block-by-block basis. The presentation of information on Recovery.gov and on many state websites generally targeted individual citizens who were not experts in data analysis. The format and content of data prioritized mapping capabilities and invited people to enter their ZIP code and locate projects in their immediate area. For example, figure 5 shows the map a user sees if ZIP code 30318 in Georgia is entered into this web page. From this map, the user can click on any of the dots that represent Recovery projects to find out information such as the project recipient name, the award amount, project description, the number of jobs created, and completion status. Additional information available to users includes the amount of funds received by recipients as well as the overall distribution of grants by funding categories for that area. Figure 5: Screenshot of Recovery.gov's Recipient Projects Page for a Georgia ZIP Code, 30318: [Refer to PDF for image: screenshot] Depicts the following: Recipient Projects; Total Funds Awarded; Total Awards; List of Awards. Source: Recovery.gov. Note: The link to access Recovery.gov's recipient projects page can be found at [hyperlink, http://www.recovery.gov/arra/Transparency/RecoveryData/Pages/RecipientReportedDataMap.aspx]. [End of figure] States and localities also utilized mapping features on their Recovery websites. For example, in New York City, Recovery officials launched a Recovery Act website, the NYCStat Stimulus Tracker, as an interactive, comprehensive reporting tool. The federal government's website, Recovery.gov, served as the design inspiration and, according to a senior city official, Stimulus Tracker was one of the first publicly accessible websites to report Recovery Act data for a local jurisdiction. City Recovery officials were able to develop and launch New York City's stimulus website more quickly than other locations--approximately 6 weeks from start to completion--because they were able to leverage a previously implemented information technology platform to support citywide performance reporting. Stimulus Tracker allowed the public to explore several levels deeper than what was at Recovery.gov, which reported at the funding award level. For example, Stimulus Tracker broke down each award into several projects, each of which had its own dashboard page that displayed information such as (1) the status of the project, (2) the percentage of total funds spent, (3) start date and spending deadlines, and (4) the number of jobs created or retained. Visitors to the site could drill into a record of every payment made with stimulus funds through the additional feature "Payment Tracker" and every contract to carry out stimulus-funded work through "Contract Tracker." Stimulus Tracker also offered an interactive map for site visitors who were interested in knowing how stimulus dollars were allocated geographically and where specific projects were located. This information was layered on top of the city's existing online map portal.
It included such items as the locations of schools, libraries, hospitals, and subways, as well as online property, building, statistics, and census information. As New York City's existing online map portal could already be navigated either by entering a specific address or simply using zoom and scroll tools, city Recovery Act officials were able to build on this application and include a city mapping tool for Recovery Act funds where the public could find any project with a discrete location. See figure 6 for a screenshot of New York City's mapping tool depicting the city's Recovery Act projects. Figure 6: Screenshot of NYCStat Stimulus Tracker Mapping Feature: [Refer to PDF for image: screenshot] Depicted on map: call out of site data: Federal Stimulus: Wards Island Wastewater Treatment Plant Replacement of Primary Sludge System; Description: Replacement of Primary Sludge System; Stimulus Tracker ID: 107006; Lead City Agency: DEP. This project is funded by the American Recovery and Reinvestment Act. Source: City of New York (Copyright © 2013, The City of New York). Note: The link to the NYC Stimulus Tracker webpage is [hyperlink, http://maps.nyc.gov/doitt/nycitymap/?featuretypes=STIMULUS]. [End of figure] Recovery Act Performance Was Mostly Measured by Outputs Rather than Outcomes, and Challenges with Both Existed: Recipients and Agencies Were Required to Report Amount and Speed of Funding but Faced Challenges in Doing So: The Recovery Act requires recipients to report on their use of funding and requires the agencies that provide those funds to make the reports publicly available.[Footnote 45] The Recovery Act's recipient reporting requirements apply only to nonfederal recipients of funding, including all entities other than individuals receiving Recovery Act funds directly from the federal government, such as state and local governments, private companies, educational institutions, nonprofits, and other private organizations. As required by section 1512(c) of the Recovery Act, recipients were to submit quarterly reports that included the total amount of Recovery Act funds received, the amount of funds expended or obligated to projects or activities, and a detailed list of those projects or activities.[Footnote 46] For each project or activity, the detailed list was to include the project's name, description, and an evaluation of its completion status. Also, the recipient reports were to include detailed information on any subcontracts or subgrants as required by the Federal Funding Accountability and Transparency Act of 2006.[Footnote 47] For example, recipient reports were also required to include details on sub-awards and other payments. With the Recovery Act's enhanced reporting requirements on spending, agencies and recipients faced several challenges. Many agencies and state and local partners were limited in their capacity to meet the enhanced reporting requirements due to a lack of knowledge and expertise. Others struggled with the burden of double reporting when they had to report to federal systems tracking Recovery dollars as well as to agency systems because, in some cases, agencies required more data to manage their programs. Finally, some had trouble reporting data for certain projects within the operational limitations of place-based data mapping systems. Capacity to meet reporting requirements. Many state and local partners were limited in their capacity to meet spending reporting requirements because they lacked knowledge and expertise.
Using a centralized mechanism like FederalReporting.gov to capture recipient reporting information was a new process that recipients and agencies had to learn. We have previously reported on the questions raised by state officials regarding the reporting capacities of some local organizations, particularly small rural entities, boards, or commissions, and private entities not used to doing business with the federal government.[Footnote 48] In addition, some state officials said that the Recovery Act's requirement that recipients report on the use of funds within 10 days after a quarter ended was a challenge because some sub-recipients were unable to send them the needed data on time. Officials at several agencies suggested that if FederalReporting.gov had allowed certain key award and identifying data fields to be pre-populated each quarter, it would likely have resulted in fewer data errors for agencies to address and eased the reporting burden on recipients. In our September 2013 report and testimony on federal data transparency, we concluded that the transparency envisioned under the Recovery Act for tracking spending was unprecedented for the federal government, requiring the development of a system that could track billions of dollars disbursed to thousands of recipients. Such a system needed to be operational quickly to enable posting of spending information rapidly for a variety of programs. However, because agency systems did not collect spending data in a consistent manner, the most expedient approach for Recovery Act reporting was to collect data directly from fund recipients.[Footnote 49] Recipients bore the additional burden of providing this information, and when the data had to be entered manually, the accuracy of the data could suffer.[Footnote 50] Thus, in September 2013 we recommended that the director of OMB, in collaboration with members of the Government Accountability and Transparency Board, develop a plan to implement comprehensive transparency reform, including a long-term timeline and requirements for data standards, such as establishing a uniform award identification system across the federal government.[Footnote 51] Earlier this year, the Recovery Board noted that agencies and OIGs also experienced difficulties adapting to the more frequent (every quarter) and more detailed (e.g., jobs created or individual project activities) reporting required of most government grant recipients.[Footnote 52] Agency officials acknowledged spending considerable staff hours training recipients, providing technical assistance to them, verifying and validating their data, and following up with them when issues arose. Despite efforts to streamline and enhance existing review protocols, agencies still needed skilled people to review and process applications for awards. Although agencies and OIGs credited outreach to recipients for reducing noncompliance with reporting requirements, the amount of staffing resources it took to conduct that outreach was significant. Double reporting. We have previously noted that recipients of Recovery Act funds were required to report similar information to both agency reporting systems and FederalReporting.gov.[Footnote 53] Several federal agency and state government officials we spoke with also mentioned that reporting to FederalReporting.gov resulted in double reporting for their agencies and grantees because several of them deemed their existing internal systems superior and therefore ended up reporting to both.
For example, at HUD, program offices were unable to abandon their established reporting systems because the agency's systems collected data necessary to support HUD's grants management and oversight processes. HUD officials told us that requiring grantees to report using two systems resulted in double reporting of data and proved burdensome to recipients and to HUD staff, who spent many hours correcting inaccurate entries. At DOT, officials preferred using the agency's own data because it was more detailed and was reported monthly--more frequently than the Recovery.gov data. In a focus group involving state transportation officials, several echoed concerns about the redundancy of reporting systems. These officials indicated that having to report to three systems--the internal state system, DOT's system, and FederalReporting.gov--increased their agencies' burden. As we reported in our previously mentioned September 2013 report and testimony on federal data transparency efforts, the lack of consistent data standards and commonality in how data elements are defined places an undue burden on federal fund recipients. This can result in their having to report the same information multiple times via disparate reporting platforms.[Footnote 54] To address this issue, as OMB developed procedures for reporting on the use of federal funds, it directed recipients of covered funds to use a series of standardized data elements. Further, rather than report to multiple government entities, each with its own disparate reporting requirements, all recipients of Recovery Act funds were required to centrally report into the Recovery Board's inbound reporting website, FederalReporting.gov. Difficulty of using place-based GIS data for some types of projects. While Recovery.gov enabled users to locate Recovery Act projects on a block-by-block basis, some of the Recovery Act data did not lend itself to the geospatial reporting presentation format on the website. For example, according to Recovery Board officials, the website only allowed one location to be reported per project even though some projects spanned multiple locations. Therefore, if a DOT highway project crossed multiple ZIP codes, only one location of performance could be reported. Further, certain locations were difficult to map, such as rural roads, post office boxes, county-level data, and consultant contractors who worked out of their homes. Recipients Were Required to Report the Outcome Measure of Jobs Created and Retained but Faced Some Challenges: The other major performance measure required under the Recovery Act focused on the estimated number of jobs created or retained as a result of funding provided by the act. In addition to the previously described reporting on funds spent and activities, recipients were required in their quarterly reports to estimate the number of jobs created or retained by that project or activity.[Footnote 55] OMB issued clarifying guidance for recipient reporting in June 2009, and recipients began reporting on jobs starting in October 2009.[Footnote 56] Among other things, the guidance clarified that recipients of Recovery Act funds were to report only on jobs directly created or retained by Recovery Act-funded projects, activities, and contracts. Recipients were not expected to report on the employment impact on materials suppliers ("indirect" jobs) or on the local community. Recipients had 10 days after the end of each calendar quarter to report.
OMB's guidance also provided additional instruction on calculating the number of jobs created or retained by Recovery Act funding on a full-time equivalent (FTE) basis.[Footnote 57] Recipients faced several challenges meeting these requirements. They had difficulty accurately defining FTEs, as various recipients interpreted and applied the FTE guidance from OMB differently. Further, many recipients struggled to meet reporting deadlines as they had little time to gather, analyze, and pass on information to the federal government at the end of each fiscal quarter. Definitional challenges and discrepancies in reporting FTEs. Under OMB guidance, jobs created or retained were to be expressed as FTEs. In our November 2009 report, we found that recipients reported data inconsistently even though OMB and federal agencies provided significant guidance and training.[Footnote 58] Specifically, we found that while FTE calculations should allow for different types of jobs--part time, full time, or temporary--to be aggregated, differing interpretations of the FTE guidance compromised the recipients' ability to aggregate the data. For example, in California, two higher education systems calculated FTEs differently. One chose to use a 2-month period as the basis for the FTE performance period. The other chose to use a year as the basis. The result was almost a three-to-one difference in the number of FTEs reported for each university system in the first reporting period. Although the Department of Education provided alternative methods for calculating an FTE, in neither case did the guidance explicitly state the period of performance of the FTE. We recommended that OMB clarify the definition of FTE jobs and encourage federal agencies to provide or improve program-specific guidance for recipients. Further, we recommended that OMB be more explicit that jobs created or retained are to be reported as hours worked and paid for by the Recovery Act. In general, OMB and agencies acted upon our recipient reporting-related recommendations, and later reporting periods indicated significant improvements in FTE calculations. When OMB's guidance changed the original formula, agencies had to rush to educate recipients about the changes. Agencies spent extra time and resources that quarter reviewing and validating recipient data to reduce errors. In some cases, agencies communicated daily with recipients via phone or e-mail to ensure their report submissions were accurate. Capacity of recipients to meet deadlines. The requirement to regularly report on jobs created and retained further strained the capacity of some recipients. Recipients had only 10 days after the end of each fiscal quarter to determine this information and pass it on to the federal government.[Footnote 59] Some state education officials told us that deadlines for reporting should have been extended by 1 to 2 weeks so they were not rushing to input data. One of these officials said she was directed by other state officials to put in "the best data you have, even if it's not correct…and go back and correct it later." City officials also reported concerns with the quick turnaround time for reporting. For example, one city official stated that, in order to meet reporting deadlines, it was necessary to enter data manually, which created additional work.
The Recovery Board accommodated these after-the-fact corrections by extending the quality assurance period to provide more time for agencies to review reports and for recipients to make corrections in FederalReporting.gov. As a result, recipients could change their reports up to about 2 weeks before the start of the next reporting period. Program-Specific Performance Measures Varied by Program and Agency, with Some Focusing More on Outputs than Outcomes: The administration required agencies receiving Recovery Act funds to submit performance plans that identified additional measures on a program-by-program basis. Consistent with existing GPRA requirements for agencies to set outcome-oriented performance goals and measures, OMB's initial Recovery Act implementation guidance required federal agencies to ensure that program goals were achieved. OMB required agencies to identify specific program outcome measures, supported by corresponding quantifiable output measures, as well as improved results on broader economic indicators.[Footnote 60] In responding to this requirement, agencies typically relied on existing measures in their grant programs' performance plans. This information is reported by agency and by program within each agency, as opposed to government-wide. While Recovery.gov provided a template for facilitating the reporting of this information, the level of detail and specificity of outcomes varied greatly for some of the agencies we reviewed, making it difficult to determine the extent to which some were making progress toward their goals and demonstrating results. For example, Education's performance plan described the agency's accountability mechanisms, the type and scope of project activities, and specific program performance measures. With the exception of the number of jobs created or retained, Education's plan stated the agency was primarily using existing established agency performance measures that applied to both Recovery and non-Recovery funds. For example, to measure the success of one type of education grant fund (specifically, Title I of the Elementary and Secondary Education Act of 1965, as amended), which the Recovery Act made available to local educational agencies, Education used existing agency performance measures, such as the percentage of economically disadvantaged students in grades 3 to 8 scoring at the proficient or advanced levels on state reading and mathematics assessments. On the other hand, DOT filled out the templates to report on its 12 programs, and its performance measures were generally less specific and outcome-oriented. For example, DOT's Capital Assistance for High Speed Rail Corridors and Intercity Passenger Rail Service performance plan metrics included whether interim guidance was published within time frames, the number of applications received for the program, and the number of grants awarded for the program. Further, as we previously reported, DOT released a series of performance plans in May 2009 to measure the impact of Recovery Act transportation programs, but these plans generally did not contain an extensive discussion of the specific goals and measures to assess the impact of Recovery Act projects.[Footnote 61] For example, while the plan for the highway program contained a section on anticipated results, three of its five measures were the percent of funds obligated and expended and the number of projects under construction.
The fourth measure was the percentage of vehicle miles traveled on pavement on the National Highway System rated in good condition, but the plan said that goals for improvement with Recovery Act funds were yet to be determined. The fifth goal was the number of miles of roadway improved, and DOT's plan reported that even with the addition of Recovery Act funds, the new target would remain the same as previously planned. As a result, we recommended in May 2010 that DOT ensure that the results of these projects were assessed and a determination made about whether these investments produced long-term benefits. DOT did not implement our recommendation. Concluding Observations: Created in response to the recent serious recession, the Recovery Act represents a significant financial investment in improving the economy. Grant programs were a key mechanism for distributing this support. By increasing accountability and transparency requirements while at the same time setting aggressive timelines for the distribution of funds, the Recovery Act created high expectations as well as uncertainty and risk for federal, state, and local governments responsible for implementing the law. Faced with these challenges, some of these organizations looked beyond their usual way of doing business and adjusted their practices to help ensure the accountability and transparency of Recovery Act funds. The oversight community adopted a faster and more flexible approach to how they conducted and reported on their audits and reviews so that their findings could inform programs of needed corrections before all Recovery funds were expended. They leveraged technology by using advanced data analytics to reduce fraud and to create easily accessible Internet resources that greatly improved the public's access to, and ability to make use of, data about grants funded by the Recovery Act. These and other experiences, as well as the challenges identified in this report, provide potentially valuable lessons for the future. Underlying many of these lessons is the importance of increased coordination and collaboration, both vertically--transcending federal, state, and local levels of government--and horizontally--across organizational silos within the federal community--to share information and work towards common goals. One question that remains unresolved is the extent to which good practices developed in response to the Recovery Act's special challenges and conditions can ultimately be incorporated in everyday practice for managing and overseeing grants. Some of the practices we found, such as the use of the Recovery Operations Center and state readiness reviews, have been able to make this transition. Others, such as some of the information-sharing networks established during the Recovery Act, have had more difficulty in doing so. Proposals under consideration by Congress and the administration to extend Recovery Act requirements for spending transparency to all federal grants suggest that this transition has been possible for tracking dollars.[Footnote 62] Still to be seen is whether it will be possible to provide this type of government-wide transparency to other measures of performance, such as grant outcomes. Agency Comments: We provided a draft of this report to the Secretaries of the Departments of Education, Energy, Housing and Urban Development, and Transportation; and to the Director of the Office of Management and Budget.
We also provided drafts of the examples included in this report to cognizant officials from the relevant state and local agencies to verify accuracy and completeness, and we made technical changes and clarifications where appropriate. The agencies generally agreed with our findings and provided technical comments which were incorporated in the report. We are sending copies of this report to other interested congressional committees; the Secretaries of the Departments of Education, Health and Human Services, Housing and Urban Development, and Transportation; and the Director of the Office of Management and Budget. In addition, the report will be available on our web site at [hyperlink, http://www.gao.gov]. If you or your staff have any questions regarding this report, please contact me at (202) 512-6806 or by email at czerwinskis@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix I. Signed by: Stanley J. Czerwinski: Director, Strategic Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: To better understand grant management lessons resulting from the American Recovery and Reinvestment Act of 2009 (Recovery Act), we focused on two key issues involving grant implementation during the Recovery Act: accountability and transparency. Specifically, this report identifies and provides examples of good practices employed and the challenges faced by select federal, state, and local agencies implementing grant programs funded by the Recovery Act, in the areas of accountability and transparency. To obtain a broad view of lessons learned during the implementation of grants funded by the Recovery Act, we conducted a detailed literature review of relevant reports describing lessons learned from implementing grants funded by the Recovery Act from GAO; federal and state inspectors general; federal agencies; state and local governments; accountability boards; state and local government advocacy organizations; think tanks; and academia.[Footnote 61] We developed selection criteria to identify relevant federal agencies and state and local governments to obtain their views related to the implementation of grant programs funded by the Recovery Act. We then selected four federal agencies, three states, and two localities based on the extent to which they had information related to our focus areas of accountability and transparency; information from our colleagues, subject matter experts, and academics; and citations in the literature. To capture a diverse mix of Recovery Act grants and identify potential good practices and challenges, we selected a variety of grants--some that had their funding structures already well established, others that had their funding greatly increased as a result of the Recovery Act, as well as new programs.[Footnote 62] Although Medicaid was the largest grant program funded by the Recovery Act, we deemed it out of scope for the purposes of this review since it is primarily an entitlement and subject to specific rules that are not typical of program grants. Further, Medicare and unemployment insurance were not included in the recipient reports we examined. To obtain illustrative examples of the good practices employed and the challenges faced during the implementation of grants funded by the Recovery Act related to accountability and transparency, we conducted interviews with a wide range of officials and experts. 
We interviewed cognizant officials and obtained supporting documentation from government-wide oversight entities at the federal level including the Recovery Implementation Office, the Office of Management and Budget, and the Recovery Accountability and Transparency Board. In addition, we interviewed and obtained supporting documentation from select federal agency officials from the Departments of Education; Energy; Housing and Urban Development; and Transportation; and their respective inspectors general. At the state level, we interviewed and obtained supporting documentation from agency and audit officials from the states of California, Georgia, and Massachusetts. To get a broader state perspective, we also interviewed officials from the state Recovery Act coordinators' network, which included key state officials involved in implementing the Recovery Act from 16 other states.[Footnote 65] At the local level, we interviewed officials from Denver, Colorado, and New York City, New York. We also interviewed officials from leading state and local advocacy organizations that were involved in Recovery Act implementation such as the National Association of State Auditors and Comptrollers, the National Association of State Budget Officers, and the National Association of Counties. We obtained additional information on lessons learned related to the Recovery Act from officials representing the Government Accountability and Transparency Board, the Sunlight Foundation, the Council on Governmental Relations, the National Council of Nonprofits, the Center for Effective Government, the Federal Demonstration Partnership, and the National Association of State Chief Information Officers. In addition, we conducted seven focus groups representing a range of federal fund recipients.[Footnote 66] Focus groups included: (1) state comptrollers; (2) state education and transportation officials; and (3) local government officials from both large and small municipalities. Each focus group had between four and eight participants who were recruited from randomized member lists provided by the recipient associations we interviewed. Lastly, we reviewed and synthesized information provided in previously issued reports related to the Recovery Act from the following sources: our previous work; the inspectors general of the Departments of Education, Energy, Housing and Urban Development, and Transportation; the Recovery Accountability and Transparency Board; the White House; and various non-governmental sources including the IBM Center for The Business of Government. In addition, we reviewed and applied criteria established by HowTo.gov, a source of guidance and leading practices for government websites, to Recovery.gov and state and local Recovery websites. The scope of our work did not include independent evaluation or verification of the effectiveness of the examples we identified. We also did not attempt to assess the prevalence of the practices or challenges we cite either within or across levels of government. Therefore, entities other than those cited for a particular practice may or may not have employed the same or similar practice, and it is not possible to generalize how prevalent the practices and challenges may be across all Recovery Act grants. We conducted this performance audit from December 2012 through January 2014 in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: GAO Contact and Staff Acknowledgments: GAO Contact: Stanley J. Czerwinski, Director (202) 512-6806 or czerwinskis@gao.gov: Staff Acknowledgments: In addition to the contact named above, Peter Del Toro, Assistant Director; Mark Abraham; and Jyoti Gupta made significant contributions to this report. Also contributing to this report were Tom Beall, Robert Gebhart, Jacob Henderson, Donna Miller, Robert Robinson, Beverly Ross, and Andrew J. Stephens. [End of section] Footnotes: [1] Pub. L. No. 111-5, 123 Stat. 115. [2] The Recovery Act established multiple mechanisms to provide accountability including those aimed at preventing fraud, waste, and abuse. According to the Office of Management and Budget's (OMB) initial Recovery Act implementation guidance, meeting the Recovery Act's accountability objectives included ensuring that funds were awarded and distributed in a prompt manner; benefits of these funds were reported in a timely manner; and projects funded avoided unnecessary delays and cost overruns. OMB further specified that a critical accountability objective is ensuring funds are used for authorized purposes and instances of fraud, waste, and abuse are mitigated. See OMB, Initial Implementing Guidance for the American Recovery and Reinvestment Act of 2009, OMB Memorandum M-09-10 (2009). [3] In 2009, the President and OMB issued memos stating that “transparency promotes accountability and provides information for citizens about what the Government is doing.” Further, the Federal Funding Accountability and Transparency Act of 2006, as amended, requires information disclosure concerning entities receiving federal financial assistance through federal awards such as federal contracts, sub-contracts, grants, and sub-grants. 31 U.S.C. § 6101 note. Office of Management and Budget, Open Government Directive, OMB Memorandum M-10-06 (2009); and Open Government Directive–Federal Spending Transparency and Subaward and Compensation Data Reporting (2010). [4] GAO, Federal Data Transparency: Opportunities Remain to Incorporate Lessons Learned as Availability of Spending Data Increases, [hyperlink, http://www.gao.gov/products/GAO-13-758] (Washington, D.C.: Sept. 12, 2013). [5] For additional information on GAO's broad body of work related to the Recovery Act, see [hyperlink, http://www.gao.gov/recovery/]. External reports we reviewed include studies by the White House, Recovery Accountability and Transparency Board, federal Inspectors General, IBM Center for the Business of Government, and George Mason University. See appendix I for additional information on studies reviewed. [6] The Recovery Implementation Office, with a staff of not more than eight full-time equivalents, was created by Vice President Biden to oversee the high-level management dimensions of the Recovery Act. The Recovery Accountability and Transparency Board was created by the Recovery Act to carry out a wide range of accountability and transparency functions. See table 1 for more specifics. [7] Recovery Act, § 3(a), 123 Stat. at 116. 
[8] The authority of the Recovery Board was expanded by the Consolidated Appropriations Act, 2011, under which the Recovery Accountability and Transparency Board was appropriated $28.35 million to carry out title XV of the Recovery Act and to "develop and test information technology resources and oversight mechanisms to enhance transparency of and detect and remediate waste, fraud, and abuse in federal spending." Pub. L. No. 112-74, 125 Stat. 786, 920 (2011). [9] Disaster Relief Appropriations Act, 2013, Pub. L. No. 113-2, § 904(d), 127 Stat. 4, 18. [10] Congressional Budget Office, Estimated Impact of the American Recovery and Reinvestment Act on Employment and Economic Output from October 2012 Through December 2012 (Washington, D.C.: Feb. 21, 2013). In January 2010, CBO estimated that the cost of the Recovery Act would be larger than originally estimated. Among the factors contributing to the higher estimates were higher than anticipated use of entitlement programs such as unemployment insurance and nutritional assistance, as well as higher than anticipated use of the Build America Bonds program. [11] Medicaid is a joint federal-state program that finances health care for certain categories of low-income individuals, including children, families, persons with disabilities, and persons who are elderly. [12] An "obligation" is a definite commitment that creates a legal liability of the government for the payment of goods and services ordered or received, or a legal duty on the part of the United States that could mature into a legal liability by virtue of actions on the part of the other party beyond the control of the United States. An "outlay" is the issuance of checks, disbursement of cash, or electronic transfer of funds made to liquidate a federal obligation. See GAO, A Glossary of Terms Used in the Federal Budget Process, [hyperlink, http://www.gao.gov/products/GAO-05-734SP] (Washington, D.C.: Sept. 2005). Spending by the federal government (i.e., funds obligated and outlayed) can represent just the first steps of using the funds. For instance, the Department of Education makes grants to state educational agencies. After the federal government has outlayed the funds to the state educational agencies, they may, in turn, make grants to local educational agencies, which would then, in turn, spend the money on salaries, school facilities, or other items. [13] Some programs administered by these agencies followed a different procedure. For example, some of the Recovery Act funds provided to the Department of Transportation were distributed to states and localities through the statutory formulas and rules governing existing programs. The states and localities retained responsibility for selecting projects, which were required to meet additional requirements specified in the act. [14] Under the Recovery Act, the Department of Education's SFSF Grant authorized three programs--SFSF State Grants, State Incentive Grants, and an Innovation Fund. Of the $53.6 billion appropriated for SFSF, the Department used $48.6 billion for the State Grants, $4.35 billion for Race to the Top Grants (authorized under the State Incentive Grants), and $650 million for the Investing in Innovation program (authorized under the Innovation Fund). Recovery Act, div. A, §§ 14001-14012, 123 Stat. at 279-286. [15] Although the law gave responsibility to agencies to make recipient reports available, recipients actually submitted information to a single website operated by the Recovery Board.
Data were then provided to agencies for review and recipients were requested to make any needed corrections. The Recovery Board then made the data available on its website, Recovery.gov. [16] The Recovery Accountability and Transparency Board is comprised of a chairperson appointed by the President and inspectors general from the Departments of Agriculture, Commerce, Education, Energy, Health and Human Services, Homeland Security, Justice, Transportation, and Treasury, and any other inspector general designated by the President from any agency that expends or obligates Recovery Act funds. [17] GAO, Managing for Results: Key Considerations for Implementing Interagency Collaborative Mechanisms, [hyperlink, http://www.gao.gov/products/GAO-12-1022] (Washington, D.C.: Sept. 27, 2012); Streamlining Government: Key Practices from Select Efficiency Initiatives Should Be Shared Governmentwide, [hyperlink, http://www.gao.gov/products/GAO-11-908] (Washington, D.C.: Sept. 30, 2011); Gulf Coast Rebuilding: Preliminary Observations on Progress to Date and Challenges for the Future, [hyperlink, http://www.gao.gov/products/GAO-07-574T] (Washington, D.C.: Apr. 12, 2007); and Catastrophic Disasters: Enhanced Leadership, Capabilities, and Accountability Controls Will Improve the Effectiveness of the Nation's Preparedness, Response, and Recovery System, [hyperlink, http://www.gao.gov/products/GAO-06-618] (Washington, D.C.: Sept. 6, 2006). [18] OMB, Initial Implementing Guidance for the American Recovery and Reinvestment Act of 2009, OMB Memorandum M-09-10 (2009). [19] Mass. Gen. Laws ch. 7, § 4A(e). [20] For a detailed discussion of this and related issues, see Posner, Paul L., et al., Implementation of the Recovery Act: Networks Under Stress, George Mason University, Centers on the Public Service (February 2013). [21] For more information on the concept of an "outcome broker", please see Frank Digiammarino, Can Government Work Like OpenTable? Innovation in the Collaborative Era (2012), accessed January 22, 2014, [hyperlink, http://www.scribd.com/doc/115361546/Can-Government-Work-Like-OpenTable]. [22] OMB, Updated Guidance on the American Recovery and Reinvestment Act - Data Quality, Non-Reporting Recipients, and Reporting of Job Estimates, OMB Memorandum M-10-08 (2009). [23] An example of an incident command system can be found within the Department of Homeland Security's (DHS) National Incident Management System, which is intended to provide a consistent framework for incident management at all jurisdictional levels regardless of cause, size, or complexity of the situation and to define the roles and responsibilities of federal, state, and local governments during an emergency event. DHS's system has an incident command system component designed to coordinate the communications, personnel, and procedures of different agencies and levels of government within a common organizational structure during an emergency. [24] Pub. L. No. 111-352, 124 Stat. 3866. GPRAMA updated the Government Performance and Results Act of 1993 (GPRA). Pub. L. No. 103-62, 107 Stat. 285. [25] GAO, Managing for Results: Data-Driven Performance Reviews Show Promise But Agencies Should Explore How to Involve Other Relevant Agencies, [hyperlink, http://www.gao.gov/products/GAO-13-228] (Washington, D.C.: Feb. 27, 2013). [26] Digital Accountability and Transparency Act of 2013, H.R. 2061, 113th Cong. (2013); Digital Accountability and Transparency Act of 2013, S. 994, 113th Cong. (2013).
[27] Special Report on The American Recovery and Reinvestment Act at the Department of Energy (OAS-RA-09-01, March 2009). [28] GAO, Highway Infrastructure: Federal-State Partnership Produces Benefits and Poses Oversight Risks, [hyperlink, http://www.gao.gov/products/GAO-12-474] (Washington, D.C.: Apr. 26, 2012). [29] GAO, Recovery Act: As Initial Implementation Unfolds in States and Localities, Continued Attention to Accountability Issues is Essential, [hyperlink, http://www.gao.gov/products/GAO-09-580] (Washington, D.C.: Apr. 23, 2009). [30] GAO, Recovery Act: States' and Localities' Use of Funds and Actions Needed to Address Implementation Challenges and Bolster Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] (Washington, D.C.: May 26, 2010); and Recovery Act: One Year Later, States' and Localities' Use of Funds and Opportunities to Strengthen Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-437] (Washington, D.C.: Mar. 3, 2010). [31] [hyperlink, http://www.gao.gov/products/GAO-10-604]. [32] GAO, Highlights of a Forum: Data Analytics for Oversight and Law Enforcement, [hyperlink, http://www.gao.gov/products/GAO-13-680SP] (Washington, D.C.: July 2013). [33] GAO, Recovery Act: Funding Used for Transportation Infrastructure Projects, but Some Requirements Proved Challenging, [hyperlink, http://www.gao.gov/products/GAO-11-600] (Washington, D.C.: June 29, 2011) and Recovery Act: Planned Efforts and Challenges in Evaluating Compliance with Maintenance of Effort and Similar Provisions, [hyperlink, http://www.gao.gov/products/GAO-10-247] (Washington, D.C.: Nov. 30, 2009). [34] Recovery Act, div. A, § 1201(a), 123 Stat. at 212. [35] Although the Energy Efficiency and Conservation Block Grant program was authorized by the Energy Independence and Security Act of 2007, it was not funded (and the monies could not be spent) until the Recovery Act was enacted in 2009 and therefore is considered a new program. [36] See [hyperlink, http://www.howto.gov/about-us], accessed December 12, 2013. [37] GAO, Managing for Results: Leading Practices Should Guide the Continued Development of Performance.gov, [hyperlink, http://www.gao.gov/products/GAO-13-517] (Washington, D.C.: June 6, 2013). [38] See [hyperlink, http://www.mass.gov/recovery/], accessed December 12, 2013. [39] Facebook is a social networking website that allows registered users to create profiles, upload photos and video, and send messages. Twitter is a social networking microblogging service that allows members to broadcast short posts called tweets. Twitter members can broadcast tweets and follow other users' tweets on multiple platforms and devices. YouTube is a video-sharing website that allows members to store and serve video content. Flickr is an image hosting and video hosting website that allows users to share and embed photographs. [40] A feed is a regularly updated summary of new content that also links to the content. The RSS feed allows Recovery.gov to publish frequently updated information by syndicating content automatically. [41] Tumblr is a microblogging platform and social networking website which allows users to post multimedia and other content to a short-form blog. [42] See [hyperlink, http://nycarra.tumblr.com/], accessed December 12, 2013. [43] An API specifies how some software components should interact with each other. APIs make it easier for software to interact with an outside program like a database or computer service such as a display control.
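To make the API concept in footnote 43 more concrete, the following brief sketch (added for illustration; it is not part of the original report) shows how an outside program might request grant data from a web-based API and read the response. The endpoint address, query parameters, and field names used here are hypothetical assumptions standing in for whatever a site such as Recovery.gov actually exposes; they are not the site's documented interface.

import json
import urllib.parse
import urllib.request

# Hypothetical endpoint; a real data site would publish its own address and parameters.
BASE_URL = "https://api.example.gov/recovery/awards"

def fetch_awards(state, limit=10):
    # Build a query string such as ?state=CO&limit=10 and request the data.
    query = urllib.parse.urlencode({"state": state, "limit": limit})
    with urllib.request.urlopen(BASE_URL + "?" + query) as response:
        # Assume the service returns a JSON list of award records.
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    # Print an identifier and dollar amount for each record returned (field names assumed).
    for award in fetch_awards("CO"):
        print(award.get("award_id"), award.get("amount"))

Because the data are retrieved programmatically rather than read from a web page, another website or analysis tool could refresh and redisplay the information automatically.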
[44] A widget is an application that allows a website to perform a function, ranging from displaying a simple clock to pulling data from multiple sources for display. Recovery.gov provides widgets that let users display state-level recipient data or search the Recovery.gov databases from their own websites. [45] When measuring the implementation of the Recovery Act, both the amount and speed of grant funding can be thought of as output measures. These measures typically describe an activity or effort including a description of the characteristics (e.g., amount or timeliness) established as standards for the activity (in this case grant funding). In contrast, outcome measures provide an assessment of results compared to the intended purpose to be achieved. [46] These reports were submitted to FederalReporting.gov, the central governmentwide data collection system for federal agencies and recipients of federal awards using Recovery Act funds. [47] 31 U.S.C. § 6101 note. [48] GAO, Recovery Act: As Initial Implementation Unfolds in States and Localities, Continued Attention to Accountability Issues is Essential, [hyperlink, http://www.gao.gov/products/GAO-09-580] (Washington, D.C.: Apr. 23, 2009). [49] GAO, Federal Data Transparency: Opportunities Remain to Incorporate Recovery Act Lessons Learned, [hyperlink, http://www.gao.gov/products/GAO-13-871T] (Washington, D.C.: Sept. 18, 2013); Federal Data Transparency: Opportunities Remain to Incorporate Lessons Learned as Availability of Spending Data Increases, [hyperlink, http://www.gao.gov/products/GAO-13-758] (Washington, D.C.: Sept. 12, 2013); and [hyperlink, http://www.gao.gov/products/GAO-13-680SP]. [50] After the first few quarters of Recovery Act reporting, the system stabilized so that recipients generally needed to update only a few data fields each cycle. [51] OMB staff agreed that the Government Accountability and Transparency Board's early plan provides an initial strategy and added that multiple initiatives are under way. One of these initiatives is the administration's fiscal year 2014 budget proposal that would operationalize comprehensive transparency through the transfer of USAspending.gov from the General Services Administration to the Department of the Treasury. [52] This report was issued by the Department of the Interior OIG on behalf of the Recovery Board and included input from sixteen agencies and their OIGs. For more, see U.S. Department of the Interior, Office of Inspector General, Lessons Learned from the Recovery Act: An Agency and OIG Retrospective, Report No. RO-SP-MOI-0008-2012 (Washington, D.C.: May 2013). [53] See [hyperlink, http://www.gao.gov/products/GAO-12-913T]. [54] [hyperlink, http://www.gao.gov/products/GAO-13-871T] and [hyperlink, http://www.gao.gov/products/GAO-13-758]. [55] Recovery Act, div. A, § 1512(c)(3)(D), 123 Stat. at 288. [56] These reporting requirements applied only to nonfederal recipients of funding, including all entities receiving Recovery Act funds directly from the federal government such as state and local governments, private companies, educational institutions, nonprofits, and other private organizations. [57] Office of Management and Budget, Implementing Guidance for the Reports on Use of Funds Pursuant to the American Recovery and Reinvestment Act of 2009, OMB Memorandum M-09-21 (2009).
[58] GAO, Recovery Act: Recipient Reported Jobs Data Provide Some Insight into Use of Recovery Act Funding, but Data Quality and Reporting Issues Need Attention, [hyperlink, http://www.gao.gov/products/GAO-10-223] (Washington, D.C.: Nov. 19, 2009). [59] According to a senior Recovery Board official, in an effort to accommodate weekends and holidays, the Recovery Board regularly allowed recipients 14 days to submit their data to FederalReporting.gov. [60] See OMB Memorandum M-09-10 (2009). This information was to be provided by all agencies receiving Recovery Act funds, covering each grant program using these funds, in the agencies' “Recovery Program Plans” submitted to OMB. Initially due on May 1, 2009, the plans were to be updated by the agencies as needed and were to be published on Recovery.gov as well as agency websites. These plans included information on each Recovery Act program's objectives, activities, delivery schedule, accountability plan, monitoring plan, and program performance measures. [61] GAO, Recovery Act: States' and Localities' Use of Funds and Actions Needed to Address Implementation Challenges and Bolster Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] (Washington, D.C.: May 26, 2010). [62] Digital Accountability and Transparency Act of 2013, H.R. 2061, 113th Cong. (2013); Digital Accountability and Transparency Act of 2013, S. 994, 113th Cong. (2013). [63] External reports reviewed include Khademian, Anne and Sang Choi, Virginia's Implementation of the American Recovery and Reinvestment Act: Forging a New Intergovernmental Partnership, IBM Center for the Business of Government (2011); DeSeve, G. Edward, Managing Recovery: An Insider's View, IBM Center for the Business of Government (2011); Posner, Paul L., et al., Implementation of the Recovery Act: Networks Under Stress, George Mason University, Centers on the Public Service (February 2013); U.S. Department of the Interior, Office of Inspector General, Lessons Learned from the Recovery Act: An Agency and OIG Retrospective, RO-SP-MOI-0008-2012 (Washington, D.C.: May 2013); White House, A New Way of Doing Business: How the Recovery Act is Leading the Way to 21st Century Government (February 2012); Rojas, Francisca M., Recovery Act Transparency: Learning from States' Experience, IBM Center for the Business of Government (2012); Callahan, Richard, Sandra O. Archibald, Kay A. Sterner, and H. Brinton Milward, Key Actions That Contribute to Successful Program Implementation: Lessons from the Recovery Act, IBM Center for the Business of Government (2012); U.S. Department of Energy, Office of Inspector General, Office of Audits and Inspections, Lessons Learned/Best Practices during the Department of Energy's Implementation of the American Recovery and Reinvestment Act of 2009, OAS-RA-12-03 (Washington, D.C.: January 2012); and U.S. Department of Housing and Urban Development, Office of Inspector General, American Recovery and Reinvestment Act Lessons Learned Initiative, Memorandum No. 2013-IE-0801 (October 18, 2012). [64] The specific grants we focused on were: at DOE, the Energy Efficiency and Conservation Block Grant and the Weatherization Assistance Program; at DOT, Transportation Investment Generating Economic Recovery (TIGER) grants and the Federal-Aid Highway Program; at HUD, Community Development Block Grants and the Tax Credit Assistance Program; and at Education, Race to the Top grants and the State Fiscal Stabilization Fund.
[65] The states represented in the State Coordinators' Network meeting were Arizona, Arkansas, Delaware, Florida, Maryland, Michigan, Minnesota, Missouri, Nebraska, Nevada, Oregon, Rhode Island, Tennessee, Texas, Utah, and Wisconsin. [66] Results from nongeneralizable samples cannot be used to make inferences about a population. [End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.” Order by Phone: The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on facebook, flickr, twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Katherine Siggerud, Managing Director, siggerudk@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, DC 20548. Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, DC 20548. [End of document]