This is the accessible text file for GAO report number GAO-14-211R entitled 'K-12 Education: Characteristics of the Investing in Innovation Fund' which was released on February 7, 2014.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer-term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version.

We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

GAO-14-211R:

[End of section]

United States Government Accountability Office:
GAO:
441 G St. N.W.
Washington, DC 20548:

February 7, 2014:

Congressional Committees:

K-12 Education: Characteristics of the Investing in Innovation Fund:

The Investing in Innovation (i3) Fund is a program that provides competitive grants to expand the use of innovative practices that improve student achievement and attainment.[Footnote 1] The U.S. Department of Education (Education) administers the program, which was created under the American Recovery and Reinvestment Act of 2009 (Recovery Act).[Footnote 2] Along with other programs funded by the Recovery Act,[Footnote 3] the i3 program is part of the administration's larger educational reform efforts to ensure that students are college and career ready.

The i3 program awards three types of grants. The size of these grants differs based on the level of prior evidence required to demonstrate an innovation's effectiveness and its readiness to expand in scale. The largest grants, known as scale-up grants, fund innovations with strong evidence of effectiveness that can be scaled to a national level. Validation grants are the next largest grant type and fund innovations with moderate evidence of effectiveness that can be scaled nationally or regionally. The smallest grants, known as development grants, fund innovations that are supported by evidence of promise or a strong theory that should be studied further.[Footnote 4] To receive an i3 grant, an applicant must, among other requirements, propose an innovation that addresses one of Education's priority areas for the program, such as improving the effectiveness of teachers and principals or serving rural communities. The i3 program provides grants to three types of groups: local educational agencies (LEAs), nonprofit organizations partnering with LEAs, and nonprofit organizations that join with a consortium of schools.[Footnote 5] As of November 2013, Education had awarded 92 i3 grants totaling about $937 million.
The Recovery Act mandated that GAO review the i3 fund.[Footnote 6] This report describes: (1) the distribution of i3 awards across grant and recipient types; (2) the criteria Education has used to make awards; (3) the distribution of i3 funds among the priorities that Education specified for the program; and (4) how Education supports grantees in implementing and evaluating their projects.

In conducting this work, we reviewed information pertaining to i3 grants awarded in 2010 through 2012 and reviewed information on grant requirements for 2010 through 2013.[Footnote 7] For the first objective, we obtained and analyzed publicly available data on grants and grantees. For the second objective, we reviewed information from the Federal Register as well as program documents that discussed the selection process, eligibility requirements, and the rules by which peer reviewers assess each application. For the third objective, we reviewed each grantee's i3 application and identified which of Education's priority areas the grantee asserted that its innovation addressed. We then combined this information with the data used in the first objective and analyzed it to determine the distribution of i3 funds across priority areas. We assessed the reliability of Education's publicly available data by (1) reviewing existing information about the data and the system that produced them, and (2) interviewing agency officials knowledgeable about the data. We determined that the data were sufficiently reliable for the purposes of this report. For the fourth objective, we reviewed relevant federal laws, regulations, guidance, and monitoring protocols. We also interviewed Education officials from the Office of Innovation and Improvement (OII)--the office that oversees the program--and from Education's Office of the Inspector General (OIG) to discuss the monitoring and technical assistance provided to i3 grantees. In addition, we interviewed officials from the contractor that Education hired to provide technical assistance related to evaluations of i3 projects. We did not interview grantees, so we did not verify Education's specific monitoring and technical assistance activities associated with grantees; however, we corroborated our discussions with Education officials against their monitoring protocols and guidance.

We conducted this performance audit from July 2013 to February 2014 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Results in Brief:

From fiscal years 2010 to 2012, Education awarded over half of i3 grant funds ($493 million of $937 million) as validation grants, and most awards went to partnerships involving nonprofit organizations. Nonprofit organizations partnering with school consortia accounted for a large portion of i3 funds, largely because they won four ($170 million) of the five scale-up grants that Education made through 2012. Education has the flexibility to change the selection criteria for any given i3 competition; however, four criteria have been included in each i3 competition. Education relies on outside peer reviewers to rate each application based on the selection criteria.
Peer reviewers apply the same selection criteria to each type of i3 grant, but the maximum number of points they may assign for each criterion varies by grant type. Education officials said they make final awards based on peer reviewers' scores and other factors.

i3 projects most often support one of four of the priority areas that Education identified for the 2010, 2011, and 2012 competitions: (1) supporting effective teachers and principals; (2) using high quality standards and assessments; (3) turning around low-performing schools; and (4) improving science, technology, engineering, and math (STEM) education. These projects often use methods that rely on teacher and principal development; through 2012, 62 i3 projects included teacher or principal professional development as a means to achieve their project goals.

Education supports i3 grantees through monitoring and technical assistance. According to Education officials and i3 monitoring protocols, i3 program officers communicate regularly with grantees to ensure that projects comply with Education's regulations, funds are spent appropriately, and progress is being made. Education hired contractors to provide technical assistance and to support implementation and evaluation of projects. According to Education officials, the technical assistance provided for project evaluations helps maximize their strength.

Background:

The i3 grant program supports research-based projects that are intended to help close achievement gaps and improve outcomes for high-need students. Education awards three types of i3 grants, with scale-up grants receiving the most funding and development grants the least (see table 1).

Table 1: Maximum Funding Available Per Grant, Fiscal Years (FY) 2010-2013:

Type of i3 Grant: Scale-up;
Maximum funding available per grant: FY 2010[A]: $50 million; FY 2011-FY 2012: $25 million[B]; FY 2013: $20 million.

Type of i3 Grant: Validation;
Maximum funding available per grant: FY 2010[A]: $30 million; FY 2011-FY 2012: $15 million; FY 2013: $12 million.

Type of i3 Grant: Development;
Maximum funding available per grant: FY 2010[A]: $5 million; FY 2011-FY 2012: $3 million; FY 2013: $3 million.

Source: U.S. Department of Education.

[A] Funding for all three grant types was higher for the fiscal year 2010 competition because the total funding for the i3 program was greater than in any subsequent year.

[B] Education did not award any scale-up grants in 2012.
[End of table]

In order to receive an i3 grant, an applicant must meet various requirements,[Footnote 8] including the following:[Footnote 9]

* It is an LEA, a nonprofit organization partnering with one or more LEAs, or a nonprofit organization partnering with a consortium of schools;[Footnote 10]

* The i3 project supports high-need students;[Footnote 11]

* The applicant has a record of improving student achievement;[Footnote 12]

* The proposed project addresses one of Education's priority areas, such as improving the effectiveness of teachers and principals;

* The innovation meets the evidence requirements for the type of grant the applicant seeks (e.g., strong evidence of effectiveness is required for scale-up grants).[Footnote 13]

In addition, applicants seeking i3 grants must secure a private sector partner that will provide matching funds and an independent third party to conduct an evaluation of the project's results.[Footnote 14] Applicants must also agree to participate in communities of practice with other grantees in order to solve problems or improve practices.

Education awarded 92 i3 grants totaling approximately $937 million in the first three years of the program (see table 2). Each i3 project must be completed within a 3- to 5-year time frame, although a grantee may extend the project period for up to 1 year without requiring prior approval, provided that the extension does not require additional federal funds or involve a change in the approved objectives and scope of the project. According to Education officials, Education will not provide additional federal funding to grantees that extend their projects; any additional funding must come from the grantee or its private sector partner.

Table 2: Total Funding and Number of New Awards for i3 Program, Fiscal Years 2010-2012:

Total award spending: FY 2010: $646 million; FY 2011: $148 million; FY 2012: $143 million; Total: $937 million.

Number of new awards: FY 2010: 49; FY 2011: 23; FY 2012: 20; Total: 92.

Source: U.S. Department of Education.

[End of table]

Education awarded the majority (47 of 92) of i3 grants for 2010 through 2012 as cooperative agreements.[Footnote 15] A cooperative agreement can be used whenever Education anticipates substantial federal involvement--beyond monitoring--with the grantee. For example, Education may be involved in the selection of key grantee personnel. In addition, according to an OIG report, cooperative agreements require grantees to outline more information about their projects at the beginning of the grant, which helps Education better plan its involvement with each grantee.[Footnote 16]

Education Awarded over Half of i3 Funds as Validation Grants, and Most i3 Awards Went to Partnerships Involving Nonprofit Organizations:

Since the inception of the i3 program, Education has awarded the largest portion of i3 funding as validation grants. As shown in figure 1, validation grants account for over one half of all i3 funds and represent 28 of the 92 grants made through the end of 2012.[Footnote 17] Education awarded approximately one-quarter of i3 funds through development grants and about one-quarter of program funds to the five scale-up grants.

Figure 1: Distribution of i3 Grants by Funding and Grant Type, 2010 to 2012:

[Refer to PDF for image: 2 pie-charts]

Value of i3 grants awarded by grant type:
Development: $224 million (24%)
Scale-up: $220 million (23%)
Validation: $493 million (53%).

Number of i3 grants awarded by grant type:
Development: 59 (64%)
Scale-up: 5 (6%)
Validation: 28 (30%).
Source: GAO analysis of U.S. Department of Education data.

[End of figure]

As of November 2013, Education had awarded 90 percent of total i3 funds and 70 of the 92 grants to nonprofit organizations (see figure 2). More specifically, Education awarded half of i3 funds to nonprofit organizations that partner with LEAs and 40 percent of i3 funds to nonprofits that partner with school consortia.

Figure 2: Distribution of i3 Grants to Recipient Groups, 2010-2012:

[Refer to PDF for image: 2 pie-charts]

Value of i3 grants awarded to recipient groups:
LEA: $88 million (9%)
Nonprofit with school consortium: $377 million (40%)
Nonprofit with LEA: $472 million (50%).

Number of i3 grants awarded to recipient groups:
LEA: 22 (24%)
Nonprofit with school consortium: 27 (29%)
Nonprofit with LEA: 43 (47%).

Source: GAO analysis of U.S. Department of Education data.

Note: The sum of the percentages for the value of i3 grants awarded to recipient groups does not equal 100 due to rounding.

[End of figure]

As shown in figure 3, the nonprofit organizations partnering with school consortia won four of the five scale-up grants that Education made through 2012. These four grants total about $170 million in i3 funding. In contrast, LEAs won 22 development grants, ranging between $2 million and $5 million each, and these grants account for about 9 percent of total i3 funding.

Figure 3: Recipient Group Awards by Grant Type, 2010-2012:

[Refer to PDF for image: stacked vertical bar graph]

Number of i3 awards:
LEAs: Development: 22; Validation: 0; Scale-up: 0.
Nonprofit with school consortium: Development: 14; Validation: 9; Scale-up: 4.
Nonprofit with LEA: Development: 23; Validation: 19; Scale-up: 1.

Source: GAO analysis of U.S. Department of Education data.

[End of figure]

Education's distribution of awards by grant type changed over time. In 2010, Education awarded four scale-up grants totaling $195 million. This amount represented 30 percent of the grant funds awarded that year. Education awarded one scale-up grant in 2011 and none in 2012. According to Education officials, the decision not to make any scale-up grants in 2012 enabled the department to fund more validation and development grants and create a more diverse portfolio of projects in the 2012 competition.[Footnote 18]

Education Applies Various Selection Criteria to Make i3 Awards:

Education has the flexibility to change the selection criteria for any given i3 competition. In 2011, Education revised the i3 program rules to give the department greater flexibility in establishing selection criteria.[Footnote 19] For example, in 2011, Education reduced the number of selection criteria from seven to four. According to Education, some selection criteria included in the 2010 i3 competition were not selection criteria in 2011 but remained eligibility requirements. For 2013, the selection criteria remained similar; however, Education established quality of the management plan and personnel as two distinct criteria and included additional factors under each for the applicant to address. Four selection criteria have been included in each i3 competition: project design, management plan, personnel, and project evaluation (see table 3).
Table 3: i3 Selection Criteria, 2010 to 2013 i3 Competitions:

2010: Need for the project and quality of the project design; Quality of the management plan and personnel; Quality of the project evaluation; Sustainability; Strength of research, significance of effect and magnitude of effect; Experience of the eligible applicant; Strategy and capacity to bring to scale or to further develop and bring to scale.

2011: Quality of the project design; Quality of the management plan and personnel; Need for the project; Quality of the project evaluation.

2012: Quality of the project design; Quality of the management plan and personnel; Significance; Quality of the project evaluation.

2013: Quality of the project design; Quality of the management plan; Personnel; Quality of the project evaluation; Significance.

Source: GAO analysis of U.S. Department of Education data.

[End of table]

Education selects outside peer reviewers to rate each application.[Footnote 20] Since the 2011 competition, a peer reviewer must demonstrate expertise either in one of Education's i3 priorities for that year or in the evaluation of education research.[Footnote 21] Education creates review panels based on its i3 priorities. Peer reviewers are assigned to a panel in which they have identified expertise.

When peer reviewers review and score applications, they apply the same selection criteria for that year's competition to all three types of i3 grants. However, the maximum number of points peer reviewers may give for each criterion may differ depending on the type of grant. For example, in the 2013 competition, applications for scale-up and validation grants could receive up to 30 points based on the quality of the project evaluation, while applications for development grants could receive no more than 15 points for this criterion (see table 4). This approach enabled Education to place a greater emphasis on project evaluation for scale-up and validation grants than for development grants.

Table 4: Total Selection Points an Application May Earn by Grant Type, 2013:

Selection Criteria: Quality of the project design; Development: 25; Validation: 20; Scale-Up: 20.

Selection Criteria: Quality of the management plan; Development: 15; Validation: 20; Scale-Up: 20.

Selection Criteria: Personnel; Development: 10; Validation: 10; Scale-Up: 10.

Selection Criteria: Quality of the project evaluation; Development: 15; Validation: 30; Scale-Up: 30.

Selection Criteria: Significance; Development: 35; Validation: 20; Scale-Up: 20.

Selection Criteria: Total Points; Development: 100; Validation: 100; Scale-Up: 100.

Source: U.S. Department of Education.

[End of table]

Education ranks the applications after they are scored by peer reviewers. It makes the final awards after considering the ranking and other factors, such as an applicant's performance, use of funds, and compliance history under a previous award from any department program.
i3 Projects Most Often Support Four of Education's Priority Areas for the i3 Program:

Each Project Reflects One of Education's Priorities:

As shown in figure 4, i3 projects most often address one of four of the priorities Education identified for the 2010, 2011, and 2012 competitions: (1) supporting effective teachers and principals; (2) using high quality standards and assessments; (3) turning around low-performing schools; and (4) improving STEM education.[Footnote 22] The majority of projects that support effective teachers and principals or improve standards and assessments received their grants in 2010, the year in which Education made the largest number of awards (49) and its largest grants.[Footnote 23] For example, 13 of the 20 projects that support effective teachers and principals won their awards in 2010. In contrast, projects supporting STEM education received five awards in each of the 2011 and 2012 competitions.

Figure 4: Distribution of i3 Projects by Priority and Number of Grants, 2010-2012:

[Refer to PDF for image: horizontal bar graph]

Project priority: Support effective teachers and principals; Number of grants: 20 (22%).
Project priority: Use high quality standards and assessments; Number of grants: 21 (23%).
Project priority: Turn around low-performing schools; Number of grants: 20 (22%).
Project priority: Improve STEM education; Number of grants: 10 (11%).
Project priority: Serve rural communities; Number of grants: 8 (9%).
Project priority: Parent and family engagement; Number of grants: 4 (4%).
Project priority: Improve the use of data; Number of grants: 9 (10%).

Source: GAO analysis of U.S. Department of Education data.

[End of figure]

As shown in figure 5, projects that support effective teachers and principals, use high quality standards and assessments, or seek to turn around low-performing schools received the most i3 funding in the 2010 to 2012 competitions. Projects that focus on supporting effective teachers and principals won about 22 percent of the total $937 million in i3 funds awarded for the 2010 to 2012 competitions by winning two scale-up awards (totaling $100 million) and eight validation awards (totaling $156 million). Projects that support the turnaround of low-performing schools also won 22 percent of total i3 funds by winning two scale-up grants (totaling $95 million) and six validation grants (totaling $118 million).

Figure 5: Distribution of i3 Projects by Priority and Total Grant Funding, 2010-2012:

[Refer to PDF for image: horizontal bar graph]

Project priority: Support effective teachers and principals; Development: $39 million; Validation: $156 million; Scale-up: $100 million.
Project priority: Turn around low-performing schools; Development: $48 million; Validation: $118 million; Scale-up: $95 million.
Project priority: Use high quality standards and assessments; Development: $59 million; Validation: $122 million; Scale-up: $0.
Project priority: Improve STEM education; Development: $17 million; Validation: $35 million; Scale-up: $25 million.
Project priority: Serve rural communities; Development: $18 million; Validation: $30 million; Scale-up: $0.
Project priority: Parent and family engagement; Development: $10 million; Validation: $0; Scale-up: $0.
Project priority: Improve the use of data; Development: $33 million; Validation: $33 million; Scale-up: $0.

Source: GAO analysis of U.S. Department of Education data.
[End of figure]

The Majority of i3 Projects Use Teacher and Principal Development as One Method to Achieve Goals:

Sixty-two of the 92 i3 projects awarded from 2010 to 2012 use teacher and principal professional development as a way to achieve results. As a group, these 62 projects address many of Education's program priorities, such as effectiveness of teachers and principals and effective standards and assessments. Thirty-one of the 62 projects use professional development as well as other measures, such as project-based learning, providing additional classroom time, and either one-on-one or small-group tutoring. The other 31 projects use professional development as the primary or sole method underlying their innovations.[Footnote 24] Combined, these 31 projects received about $457 million of the $937 million in i3 grants.

The ways in which these projects rely on professional development vary. For example, some projects emphasize training teachers within a particular subject area, such as reading and writing for middle and high school students, or science education for elementary and middle school students. Other projects attempt to develop teachers regardless of subject area based on approaches such as using measures of student achievement, monitoring by accomplished teachers, and providing a teacher training and support program developed by the project's sponsor. In addition, two of the 62 projects that use professional development focus on the development of principals and their leadership skills. In these cases, the projects seek to increase the number and placement of trained principals in schools.

Education Supports i3 Grantees through Monitoring and Technical Assistance:

Education Monitors Implementation of i3 Projects for Compliance and Progress:

According to Education officials and i3 monitoring protocols, program officers communicate regularly with grantees to ensure that their projects comply with Education's regulations, funds are spent appropriately, and they are making adequate progress implementing their projects. Program officers are required to communicate with grantees by e-mail or by phone on an ongoing basis regarding all aspects of their projects. In addition to this ongoing communication, program officers must meet with grantees by phone specifically to discuss project performance at least once a month for the first two years of the grant and every other month thereafter. Program officers may conduct site visits to grantees to discuss progress and implementation in greater detail.[Footnote 25] For example, issues discussed during the site visit may include initial results about the effectiveness of the project and efforts to expand the project.

According to Education's monitoring protocols, program officers review and approve, as appropriate, changes to a grantee's project from what was outlined in the grantee's approved application. Under Education's regulations, grantees must generally request approval for various program or budgetary changes, such as a proposed revision to the scope or objective of the project.[Footnote 26] According to Education officials, requests not covered by Education's regulations may need to be submitted in writing and, if needed, program officers confer with Education's General Counsel before approving a request.
Education's monitoring protocols also require that program officers conduct financial monitoring of grantees that includes a review of relevant financial information, such as drawdowns of grant funds, annual audit reports, and financial records. Program officers must ensure that grantees use funds in a manner consistent with their approved budgets and timelines. For example, according to post-award guidance for 2012 grantees, program officers monitor grantees to ensure that they make timely drawdowns commensurate with the project's approved scope and milestones and that they avoid maintaining large amounts of unexpended funds or making too many or too few drawdowns. In addition, according to i3 monitoring protocols, program officers review expenditures to ensure that costs are allowable, allocable, and reasonable.[Footnote 27] As part of their financial monitoring, program officers also review a grantee's progress toward fulfilling the program's private match requirement.[Footnote 28]

Education also requires grantees to submit annual performance reports to summarize project status, accomplishments, and financial information. These reports must include information on the progress that grantees have made toward their project objectives and on certain i3 program measures that Education uses to meet Government Performance and Results Act of 1993 requirements.[Footnote 29]

Education officials told us their monitoring process is complex and time-intensive because most i3 grants are cooperative agreements. Education awards grants as cooperative agreements when it anticipates having substantial involvement with the grantee over the course of the grant period. Because of this collaborative arrangement, Education officials said program officers communicate frequently with grantees, sometimes on a daily basis, to discuss projects. In addition, Education officials said i3 program officers generally manage a small number of grantees--on average 10 to 12--because they are so involved with the projects.

In a February 2013 report, Education's OIG found that program officers regularly engaged with i3 grantees and provided substantive monitoring, due in part to their manageable workloads and Education's decision to award grants as cooperative agreements. However, the report also found that Education did not hold i3 grantees accountable for responding to Education's requests for information. The report recommended that Education develop appropriate requirements or consequences for i3 grantees that do not respond to information requests or do not do so in a timely manner. Education's March 2013 monitoring protocols include guidance to program officers regarding this issue. The OIG also found potential risks to Education's ability to adequately monitor i3 grantees in the future and recommended that Education continue to monitor any increase in program officers' workload.[Footnote 30] Education officials told us they recently hired an additional program officer to help manage program officer workload.

Education Provides Technical Assistance to Support Project Implementation and Evaluation:

Education has hired contractors to provide technical assistance and to support implementation and evaluation of projects. Through a contract with Westat[Footnote 31] that began in September 2012, Education provides support and technical assistance to address implementation challenges.
The objectives of this technical assistance are to help grantees implement their projects successfully and perform the type of evidence-based work envisioned under the i3 program. According to Education officials, Westat conducted a needs assessment for each 2010, 2011, and 2012 grantee that considered, among other factors, the grantee's implementation challenges, level of stakeholder support, and capacity to monitor its own progress. Westat is using the results of that assessment to provide targeted technical assistance, such as individual consultations and research, access to a network of experts, and sharing of best practices across grantees. The needs assessments identified common areas of technical assistance across grantees, including how to use project management tools and how to address unforeseen obstacles, such as budget cuts. Scaling and sustaining projects and sharing information were identified as the highest needs across projects. To support cross-grantee collaboration, Westat established five communities of practice that focus on areas identified through the needs assessment.[Footnote 32]

According to Education's monitoring protocols, program officers also provide technical assistance during their routine monitoring of i3 grants. This assistance can include such activities as sharing information about other grantees conducting similar work, sharing articles and other resources, and sharing information about appropriate meetings and webinars. Education officials said most grantees have been able to use technical assistance and make adjustments when an innovation does not appear to be working. According to a July 2013 status update on the i3 program, some grantees adjusted their implementation plans to address challenges they encountered.[Footnote 33] For example, grantees built technological infrastructure in rural communities to support effective implementation of their projects.

To support the required evaluations of i3 projects,[Footnote 34] Education contracted with Abt Associates (Abt)[Footnote 35] to work with the independent evaluators chosen by i3 grantees on their evaluation plans and implementation.[Footnote 36] In their applications, grantees were required to submit information about their plans for evaluating their projects, such as proposed research questions, methods for addressing the questions, and proposed data collection methods. Following receipt of the award, grantees are asked to revise and refine these evaluation plans in consultation with Abt. According to the i3 monitoring protocols, Abt also provides feedback to the project evaluator on evaluation design and implementation on an ongoing basis. For example, Abt and the independent evaluators participate in regularly scheduled phone calls to discuss issues such as the progress of individual evaluations, progress on data collection, and how evaluators are measuring and documenting the implementation of i3 innovations. Grantees are required to provide Education with a final evaluation of the project, and Education officials said they ask grantees to cooperate with requests from Abt to share the information needed to periodically assess progress.

Abt's technical assistance is part of a larger effort to conduct a national evaluation of the i3 program. Using findings from the grantees' project evaluations, Abt plans to provide information on the results for different categories of key i3-funded practices, strategies, and programs.
Education plans to release an initial report in 2016 with findings from the 2010 grantees, and Education officials said Abt would provide annual addendums on findings from additional cohorts. Independent evaluators are also required to contribute to an annual data collection effort for the purposes of i3 annual reporting. According to Education, Abt's technical assistance helps maximize the strength of project evaluations.

In addition, a primary goal of the i3 program is to ensure that projects contribute significantly to improving the information available to practitioners and policymakers about which practices work, for which types of students, and in what contexts. Education expects project evaluations for scale-up and validation grants to meet What Works Clearinghouse (WWC) standards and expects project evaluations for development grants to provide evidence of the innovation's promise for improving student outcomes.[Footnote 37] The WWC uses evidence standards to assess the strengths and weaknesses of a study's methodology, such as the type of design it uses, the quality of the study's data, and the appropriateness of the study's statistical procedures. For example, the WWC has a standard by which it assesses the likelihood that a study's findings may be biased due to attrition of study participants over time.[Footnote 38] The WWC has three rating categories that indicate the level of evidence provided by the study with respect to an innovation's effectiveness: Meets Evidence Standards (the study provides strong evidence for an innovation's effectiveness); Meets Evidence Standards with Reservations (the study provides weaker evidence for an innovation's effectiveness); and Does Not Meet Evidence Standards (the study provides insufficient evidence for an innovation's effectiveness). Education officials said 77 of 92 i3 project evaluations currently have the potential to meet WWC standards.[Footnote 39]

Agency Comments:

We provided a draft of this report to the Secretary of Education for review and comment. Education provided technical comments, which we incorporated into the report as appropriate.

We are sending copies of this report to the appropriate congressional committees and the Secretary of Education. In addition, the report is available at no charge on GAO's Web site at [hyperlink, http://www.gao.gov].

If you or your staff have any questions about this report, please contact me at (202) 512-7215 or scottg@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. In addition to the contact named above, Elizabeth Morrison (Assistant Director), Nisha R. Hazra, and Daren Sweeney made key contributions to this report. Also contributing to this report were Deborah Bland, Holly Dye, Alex Galuten, Monika Gomez, Jean McSween, and Mimi Nguyen.

Sincerely yours,

Signed by:

George A. Scott:
Director, Education, Workforce, and Income Security Issues:

Enclosures - 1:

[End of section]

Enclosure 1: List of Congressional Committees:

The Honorable Tom Harkin:
Chairman:
The Honorable Lamar Alexander:
Ranking Member:
Committee on Health, Education, Labor and Pensions:
United States Senate:

The Honorable Thomas Carper:
Chairman:
The Honorable Tom Coburn:
Ranking Member:
Committee on Homeland Security and Governmental Affairs:
United States Senate:

The Honorable Tom Harkin:
Chairman:
The Honorable Jerry Moran:
Ranking Member:
Subcommittee on Labor, Health and Human Services, Education, and Related Agencies:
Committee on Appropriations:
United States Senate:

The Honorable John P. Kline:
Chairman:
The Honorable George Miller:
Ranking Member:
Committee on Education and the Workforce:
House of Representatives:

The Honorable Darrell E. Issa:
Chairman:
The Honorable Elijah Cummings:
Ranking Member:
Committee on Oversight and Government Reform:
House of Representatives:

The Honorable Jack Kingston:
Chairman:
The Honorable Rosa DeLauro:
Ranking Member:
Subcommittee on Labor, Health and Human Services, Education, and Related Agencies:
Committee on Appropriations:
House of Representatives:

[End of section]

Footnotes:

[1] For the purposes of this program, Education defines an innovation to be "a process, product, strategy, or practice that improves (or is expected to improve) significantly upon the outcomes reached with status quo options and that can ultimately reach widespread effective usage."

[2] Pub. L. No. 111-5, § 14007, 123 Stat. 115, 284.

[3] Other programs funded by the Recovery Act include the Race to the Top (RTT) grant fund and the School Improvement Grants (SIG) program. RTT provides incentives for states to implement large-scale, far-reaching reforms to K-12 education to improve student outcomes. The Recovery Act spurred Education to make substantive changes to the SIG program, which is designed to fund significant reforms in low-performing schools. For example, the persistently lowest-achieving schools receiving SIG funding must now implement one of four intervention models, each with specific requirements for reform interventions.

[4] These were the evidence requirements for scale-up, validation, and development grants for the 2013 grant competition. Under the 2010, 2011, and 2012 grant competitions, development grants required evidence of a reasonable hypothesis.

[5] For the purposes of the program, Education defines a "consortium of schools" to mean "two or more public elementary or secondary schools acting collaboratively for the purpose of applying for and implementing an i3 grant jointly with an eligible nonprofit organization."

[6] Pub. L. No. 111-5, § 14009, 123 Stat. 115, 285. In addition to reviewing the i3 program, the Act mandated that GAO conduct reviews of the programs under section 14006 of the Recovery Act, which include the RTT fund and the RTT Assessment Program. In the first review of RTT, completed in June 2011, GAO examined, among other things, how states were using their grants to reform their elementary and secondary education systems. See GAO, Race to the Top: Reform Efforts Are Under Way and Information Sharing Could Be Improved, GAO-11-658 (Washington, D.C.: June 30, 2011). GAO plans to issue a report on the RTT Assessment Program in the future.

[7] On December 13, 2013, Education announced the awards for the 2013 competition. This report does not include an analysis of these awards.
[8] Education officials told us they review these requirements throughout the application review process.

[9] These are the current requirements for the 2013 i3 grant competition (see generally 78 Fed. Reg. 18,682 (Mar. 27, 2013)); however, these requirements may have changed over time. See 75 Fed. Reg. 12,004 (Mar. 12, 2010) for the original requirements of the i3 grant competition, and 75 Fed. Reg. 18,407 (Apr. 12, 2010) and 76 Fed. Reg. 32,073 (June 3, 2011) for changes made in 2010 and 2011.

[10] Pub. L. No. 111-5, § 14007(a)(1), 123 Stat. 115, 284. The nonprofit organization may be an institution of higher education. 78 Fed. Reg. 18,682, 18,704 (Mar. 27, 2013).

[11] 78 Fed. Reg. 18,682, 18,702 (Mar. 27, 2013). For this program, Education defines a high-need student as a student who is "at risk of educational failure or otherwise in need of special assistance and support, such as students who are living in poverty, who attend high-minority schools […], who are far below grade level, who have left school before receiving a regular high school diploma, who are at risk of not graduating with a diploma on time, who are homeless, who are in foster care, who have been incarcerated, who have disabilities, or who are English learners."

[12] Pub. L. No. 111-5, § 14007(b), 123 Stat. 115, 284, as amended by Pub. L. No. 111-117, Div. D, § 307, 123 Stat. 3034, 3271 (2009). Applicants must generally show, among other things, that they have significantly closed achievement gaps among certain groups of students or have significantly increased student academic achievement for all of those groups of students. The groups represent students who have disabilities, are economically disadvantaged, represent major racial or ethnic groups, or have limited English proficiency. Nonprofit organizations can meet this requirement by showing that they have significantly improved student achievement, attainment, or retention.

[13] 78 Fed. Reg. 18,682, 18,703 (Mar. 27, 2013).

[14] Id.

[15] In 2010, Education awarded validation and development grants as discretionary grants and the scale-up grants as cooperative agreements. Beginning in 2011, all grants were awarded as cooperative agreements.

[16] See U.S. Department of Education Office of the Inspector General, American Recovery and Reinvestment Act: The Department's Monitoring of Investing in Innovation Program Grant Recipients, Final Inspection Report, ED-OIG/I13M0001 (Washington, D.C.: Feb. 21, 2013).

[17] Validation grants received a maximum of $30 million each in the fiscal year 2010 competition and a maximum of $15 million in the fiscal year 2011 and 2012 competitions.

[18] For more information on the 2012 grants, see Department of Education, Questions about the Highest-Rated Applications Announcement, November 8, 2012, accessed December 30, 2013, [hyperlink, http://www2.ed.gov/programs/innovation/i3hrafaq2012.doc].

[19] See 76 Fed. Reg. 32,073, 32,080 (June 3, 2011).

[20] Starting in 2012, Education began requiring that applicants seeking development grants submit pre-applications. Peer reviewers selected by Education review the pre-applications against an abbreviated set of the selection criteria used to score full applications. Education officials noted that this process requires fewer resources from applicants deemed to be less competitive, reduces the burden on applicants, and allows Education to devote its resources and additional time to applicants judged more competitive as they structure their full proposals.
[21] For the 2010 competition, peer reviewers had to show expertise in at least one of the following areas: education reform and policy, evidence, innovation, strategy, and application review.

[22] For more information on teacher and principal evaluation systems, see GAO, Race to the Top: States Implementing Teacher and Principal Evaluation Systems despite Challenges, [hyperlink, http://www.gao.gov/products/GAO-13-777] (Washington, D.C.: Sept. 18, 2013).

[23] In the 2011 and 2012 competitions, Education made 23 and 20 awards, respectively. In the fiscal year 2010 competition, i3 awards averaged about $13 million. In the fiscal year 2011 and 2012 competitions, awards averaged $6 million and $7 million, respectively.

[24] The 31 projects include 19 projects that support Education's "teacher and principal effectiveness" priority and 12 projects that support other Education priorities.

[25] According to Education officials, program officers strive to conduct at least one site visit for grantees that received 3-year awards and at least two site visits for grantees that received 5-year awards, contingent on the availability of travel funds.

[26] See 34 C.F.R. §§ 74.25 and 80.30.

[27] According to Education's guidance, allowable costs are those that are either permitted or not specifically prohibited and necessary for project success. Allocable costs are expended for a particular purpose or time period, and reasonable costs are those that would be incurred by any prudent person.

[28] 2010, 2011, and 2012 grantees had to submit evidence, following the peer review of applications, of the commitment by private sector organizations to fulfill the full amount of the required private-sector matching funds. 2013 grantees were required to submit evidence of securing at least 50 percent of the required private-sector match following peer review of applications and evidence of the remaining 50 percent within 6 months of the project start date.

[29] The Government Performance and Results Act of 1993 requires federal agencies to set goals, measure performance, and report on accomplishments. To help evaluate the i3 program as a whole, grantees report on certain performance indicators. For example, for validation grants, grantees report on the number of students served by the program. For validation and development grants, grantees also report on the cost per student served by the grant.

[30] ED-OIG/I13M0001.

[31] Westat is a research and statistical survey organization that provides a range of services in areas such as research design and analysis, evaluation, and data collection and management. Education and Westat are currently in the second year of the contract, with the option to extend up to another 3 years.

[32] The five communities of practice are (1) organization development and performance management, (2) partnership management, (3) communication, (4) sustainability and scale-up, and (5) dissemination of project information.

[33] U.S. Department of Education, Update on the Investing in Innovation (i3) Fund, July 2013, accessed November 4, 2013, [hyperlink, http://www2.ed.gov/programs/innovation/130726i3portfolioreview.ppt].

[34] Grantees are required to obtain an independent project evaluation, share the results of the evaluation broadly, and generally share the data sets with third-party researchers. In addition, grantees are required to cooperate with any technical assistance provided by Education.

[35] Abt provides research and program implementation services to public and private sector clients.
Education's Institute of Education Sciences is overseeing two technical assistance contracts with Abt related to the i3 program. The first, at a cost of $13.3 million over 5 years, provides technical assistance to the 2010 and 2011 cohorts of i3 grantees. The second, at a cost of $4.5 million over 5 years, provides technical assistance to the 2012 cohort of grantees. Education's OIG recommended that Education ensure that evaluation technical assistance is available for future cohorts of grantees as well. Education officials said the competition for the contract to provide technical assistance to the 2013 cohort of grantees was underway and that the contract would be awarded by December 31, 2013.

[36] For the i3 program, Education has defined an "independent evaluation" to mean that "the evaluation is designed and carried out independent of, but in coordination with, any employees of the entities who develop a process, product, strategy, or practice and are implementing it." According to Education, this independence helps ensure the objectivity of an evaluation and prevents the appearance of a conflict of interest.

[37] The WWC is a federal source of evidence about effective education practices. The WWC, which was created in 2002 by Education's Institute of Education Sciences, is operated by an independent contractor that conducts systematic reviews of education research and disseminates information on its website about the effectiveness of the practices reported in these research studies. For more information, see GAO, Department of Education: Improved Dissemination and Timely Product Release Would Enhance the Usefulness of the What Works Clearinghouse, [hyperlink, http://www.gao.gov/products/GAO-10-644] (Washington, D.C.: July 23, 2010).

[38] For more information, see Institute of Education Sciences, What Works Clearinghouse Procedures and Standards Handbook, v. 2.1 (Washington, D.C.: Sept. 2011).

[39] Education officials noted that the other 15 projects may still provide information on whether an innovation works, but may not meet WWC standards.

[End of section]

GAO's Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.”

Order by Phone:

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO:

Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov].

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm];
E-mail: fraudnet@gao.gov;
Automated answering system: (800) 424-5454 or (202) 512-7470.

Congressional Relations:

Katherine Siggerud, Managing Director, siggerudk@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, DC 20548.

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, DC 20548.

[End of document]