This is the accessible text file for GAO report number GAO-12-108 entitled 'Science, Technology, Engineering, and Mathematics Education: Strategic Planning Needed to Better Manage Overlapping Programs across Multiple Agencies' which was released on January 24, 2012. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Requesters: January 2012: Science, Technology, Engineering, and Mathematics Education: Strategic Planning Needed to Better Manage Overlapping Programs across Multiple Agencies: GAO-12-108: GAO Highlights: Highlights of GAO-12-108, a report to congressional requesters. Why GAO Did This Study: Science, technology, engineering, and mathematics (STEM) education programs help to enhance the nation’s global competitiveness. Many federal agencies have been involved in administering these programs. Concerns have been raised about the overall effectiveness and efficiency of STEM education programs. GAO examined (1) the number of federal agencies and programs that provided funding for STEM education programs in fiscal year 2010; (2) the extent to which STEM education programs have similar objectives, serve similar target groups, and provide similar types of services, and, if necessary, what opportunities exist to increase coordination; and (3) the extent to which STEM education programs measured effectiveness. To answer these questions, GAO reviewed relevant federal laws, regulations, and plans; surveyed federal STEM education programs; analyzed programs’ STEM evaluations; and interviewed relevant federal officials. An electronic supplement--GAO-12-110SP--provides survey results. What GAO Found: In fiscal year 2010, 13 federal agencies invested over $3 billion in 209 programs designed to increase knowledge of STEM fields and attainment of STEM degrees. The number of programs within agencies ranged from 3 to 46, with the Departments of Health and Human Services and Energy and the National Science Foundation administering more than half of these programs. Almost a third of the programs had obligations of $1 million or less, while some had obligations of over $100 million. Beyond programs specifically focused on STEM education, agencies funded other broad efforts that contributed to enhancing STEM education. 
Eighty-three percent of the programs GAO identified overlapped to some degree with at least one other program in that they offered similar services to similar target groups in similar STEM fields to achieve similar objectives. Many programs have a broad scope--serving multiple target groups with multiple services. However, even when programs overlap, the services they provide and the populations they serve may differ in meaningful ways and would therefore not necessarily be duplicative. Nonetheless, the programs are similar enough that they need to be well coordinated and guided by a robust strategic plan. Currently, though, less than half of the programs GAO surveyed indicated that they coordinated with other agencies that administer similar STEM education programs. Current efforts to inventory federal STEM education activities and develop a 5-year strategic plan present an opportunity to enhance coordination, align governmentwide efforts, and improve the efficiency of limited resources by identifying opportunities for program consolidation and reducing administrative costs. Agencies' limited use of performance measures and evaluations may hamper their ability to assess the effectiveness of their individual programs as well as the overall STEM education effort. Specifically, program officials varied in their ability to provide reliable output measures--for example, the number of students, teachers, or institutions directly served by their program. Further, most agencies did not use outcome measures in a way that is clearly reflected in their performance planning documents. This may hinder decision makers’ ability to assess how agencies' STEM education efforts contribute to agencywide performance goals and the overall federal STEM effort. In addition, a majority of programs have not conducted comprehensive evaluations since 2005 to assess effectiveness, and the evaluations GAO reviewed did not always align with program objectives. Finally, GAO found that completed STEM education evaluation results had not always been disseminated in a fashion that facilitated knowledge sharing among practitioners and researchers. What GAO Recommends: GAO recommends that as OSTP leads the governmentwide STEM education strategic planning effort, it should work with agencies to better align their activities with a governmentwide strategy, develop a plan for sustained coordination, identify programs for potential consolidation or elimination, and assist agencies in determining how to better evaluate their programs. OSTP provided technical comments that we incorporated as appropriate. OMB had no concerns with the report. View [hyperlink, http://www.gao.gov/products/GAO-12-108] or key components. To view the e-supplement online, click GAO-12-110SP. For more information, contact George A. Scott at (202) 512-7215 or scottg@gao.gov. 
[End of section] Contents: Letter: Background: Thirteen Federal Agencies Administered over 200 STEM Education Programs with Over $3 Billion in Obligated Funds: Most STEM Programs Overlapped to Some Degree, Highlighting the Need for Improved Coordination and Planning: Limited Use of Performance Measures and Evaluations May Hamper Ability to Assess Effectiveness: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Objectives, Scope, and Methodology: Appendix II: List of STEM Education Programs with Fiscal Year 2010 Obligations: Appendix III: Review of Evaluations: Appendix IV: GAO Contact and Staff Acknowledgments: Related GAO Products: Tables: Table 1: Agencies Administering STEM Education Programs: Table 2: STEM Fields of Focus and Target Groups of Federal STEM Education Programs: Table 3: Reasons for Exclusion of STEM Education Activities from Our Survey: Table 4: Program Evaluations and Evaluation Methods: Table 5: Committee of Visitors and Other Types of Reports Used to Assess Program Effectiveness: Figures: Figure 1: Number of STEM Education Programs Reported by Agency: Figure 2: Number of STEM Education Programs by Range of Obligations, Fiscal Year 2010: Figure 3: Overlapping Federal STEM Education Programs: Figure 4: Number of Target Groups per Federal STEM Education Program: Figure 5: Services Provided by Federal STEM Education Programs: Figure 6: Integration of Overall STEM Education Efforts in Agencies' Performance Plans and Reports: Figure 7: Integration of STEM Education Programs in Agencies' Performance Plans and Reports: Figure 8: Percentage of STEM Education Programs, by Status of Evaluations since 2005: Abbreviations: ACC: Academic Competitiveness Council: COMPETES: America COMPETES Act: DHS: Department of Homeland Security: DOD: Department of Defense: DOT: Department of Transportation: EPA: Environmental Protection Agency: GPRA: Government Performance and Results Act: HHS: Department of Health and Human Services: IHS: Indian Health Service: K-12: Kindergarten-12th grade: NASA: National Aeronautics and Space Administration: CoSTEM: Committee on Science, Technology, Engineering, and Math Education: NIH: National Institutes of Health: NRC: Nuclear Regulatory Commission: NSF: National Science Foundation: NSTC: National Science and Technology Council: OMB: Office of Management and Budget: OSTP: Office of Science and Technology Policy: PCAST: President's Council of Advisors on Science and Technology: SMD: NASA Science Mission Directorate: STEM: Science, Technology, Engineering, and Mathematics: USDA: Department of Agriculture: View GAO-12-108 key component: Science, Technology, Engineering, and Mathematics Education: Survey of Federal Programs [hyperlink, http://www.gao.gov/products/GAO-12-110SP], an e-supplement to [hyperlink, http://www.gao.gov/products/GAO-12-108: [End of section] United States Government Accountability Office: Washington, DC 20548: January 20, 2012: The Honorable John Kline: Chairman: Committee on Education and the Workforce: House of Representatives: The Honorable Duncan D. Hunter: Chairman: Subcommittee on Early Childhood, Elementary, and Secondary Education: Committee on Education and the Workforce: House of Representatives: Federally funded science, technology, engineering, and mathematics (STEM) education programs can serve an important role both by helping to prepare students and teachers for careers in STEM fields and by enhancing the nation's global competitiveness. 
In this effort, many federal agencies administer STEM education programs. In addition to the federal effort, state and local governments, universities and colleges, and the private sector have also developed programs that provide opportunities for students to pursue STEM education and occupations. Nonetheless, research continues to show that the United States lacks a strong pipeline of future workers in STEM fields and that U.S. students continue to lag behind students in other highly technological nations in mathematics and science achievement. Over the decades, Congress and the executive branch have continued to create new STEM education programs, even though, as we reported in 2005, there has been a general lack of assessment of how well STEM programs are working.[Footnote 1] A little more than a year after our report was issued, the Academic Competitiveness Council (ACC)--headed by the Department of Education--issued a report that outlined areas of potential overlap and recommended areas for better coordination and evaluation of STEM education programs.[Footnote 2] In this context, we were asked to examine the delivery and effectiveness of STEM education programs. Specifically, our objectives were to determine (1) the number of federal agencies and programs that provided funding for STEM education programs in fiscal year 2010; (2) the extent to which these STEM programs have similar objectives, serve similar target groups, and provide similar types of services and, if necessary, what opportunities exist to increase coordination; and (3) the extent to which federal STEM education programs have measured their effectiveness. To address our objectives, we collected and analyzed information through several methods. We reviewed relevant federal laws and regulations as well as previous GAO work on overlap, duplication, and fragmentation. We interviewed officials from the Office of Management and Budget (OMB) and the Office of Science and Technology Policy (OSTP), and officials from other federal agencies that administer STEM education programs. We reviewed relevant literature and past reports that catalog and assess the federal investment in STEM education. To gather information on federal STEM education programs and to assess the level of fragmentation, overlap, and potential duplication, we surveyed over 200 programs across 13 agencies that met our definition of a STEM education program, asking questions about program objectives, target populations, services provided, interagency coordination, outcome measures and evaluations, and funding.[Footnote 3] Our web-based survey, which was administered between May 2011 and August 2011 to federal agency program officials, achieved a 100 percent response rate. To assess the reliability of data provided in our survey, we incorporated questions about the reliability of the programs' data systems, reviewed documentation for a sample of selected questions, conducted internal reliability checks, and conducted follow-up as necessary. While we did not verify all responses, we determined that the data used in our report are sufficiently reliable for our purpose. To gather additional perspectives about federal STEM education programs, we attended several STEM education conferences. To gather information on program effectiveness, we reviewed evaluations provided by program officials as well as agencies' annual performance plans and reports. For more information on our scope and methodology, see appendix I. 
The STEM survey and selected results can be found in GAO-12-110SP, an e- supplement that is a companion to this report. We conducted this performance audit from February 2011 through January 2012, in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: Past Efforts to Assess Federal STEM Education Efforts: In 2005, we reported that 207 federal STEM education programs across 13 different agencies spent $2.8 billion in federal funds in fiscal year 2004.[Footnote 4] We noted that before increasing investment in STEM education, it is important to know the extent to which existing STEM education programs are appropriately targeted and whether or not they are making the best use of available federal resources. Additionally, information about the effectiveness of these programs could help guide policymakers and program managers. Since then, several other efforts have been conducted to identify federal STEM programs and provide recommendations to improve both coordination and program evaluation as well as reduce potential duplication. For example, in 2006, ACC, led by the Department of Education, created an inventory and assessed the effectiveness of federal STEM programs. ACC recommended further coordination among federal agencies administering STEM programs, states, and local school districts. In addition, ACC recommended that agencies adjust program designs and operations so that programs can be assessed and measurable results can be achieved and that funding for federal STEM education programs should not be increased unless a plan for rigorous, independent evaluation is in place. In 2010, the President's Council of Advisors on Science and Technology (PCAST), an advisory group of the nation's leading scientists and engineers housed in OSTP, published a report in response to the President's request to develop specific recommendations concerning the most important actions that the administration should take to ensure that the United States is a leader in STEM education in the coming decades.[Footnote 5] PCAST found that approaches to Kindergarten-12th grade (K-12) STEM education across agencies emerged largely without a coherent vision or careful oversight of goals and outcomes. PCAST also found that relatively little funding was targeted at efforts with the potential to transform STEM education, too little attention was paid to replication efforts to disseminate proven programs widely, and too little capacity at key agencies was devoted to strategy and coordination. Our past effort to inventory STEM education programs identified a multitude of agencies that administer such programs. The primary missions of these agencies vary, but most often, they are to promote and enhance an area that is related to a STEM field or enhance general education. See table 1 for relevant agencies and their missions. Table 1: Agencies Administering STEM Education Programs: Agency: Department of Agriculture (USDA); Mission: To provide leadership on food, agriculture, natural resources, and related issues based on sound public policy, the best available science, and efficient management. 
Agency: Department of Commerce (Commerce); Mission: To promote job creation, economic growth, sustainable development, and improved standards of living for all Americans by working in partnership with businesses, universities, communities, and our nation's workers. Agency: Department of Defense (DOD); Mission: To provide the military forces needed to deter war and to protect the security of our country. Agency: Department of Education (Education); Mission: To promote student achievement and preparation for global competitiveness by fostering educational excellence and ensuring equal access. Agency: Department of Energy (Energy); Mission: To ensure America's security and prosperity by addressing its energy, environmental and nuclear challenges through transformative science and technology solutions. Agency: Department of Health and Human Services (HHS); Mission: To enhance the health and well-being of Americans by providing for effective health and human services and by fostering sound, sustained advances in the sciences underlying medicine, public health, and social services. Agency: Department of Homeland Security (DHS); Mission: To ensure a homeland that is safe, secure, and resilient against terrorism and other hazards. Agency: Department of the Interior (Interior); Mission: To protect and manage the nation's natural resources and cultural heritage; to provide scientific and other information about those resources; and to honor its trust responsibilities or special commitments to American Indians, Alaska Natives, and affiliated island communities. Agency: Department of Transportation (DOT); Mission: To ensure a fast, safe, efficient, accessible and convenient transportation system that meets our vital national interests and enhances the quality of life of the American people, today and into the future. Agency: Environmental Protection Agency (EPA); Mission: To protect human health and the environment. Agency: National Aeronautics and Space Administration (NASA); Mission: To drive advances in science, technology, and exploration to enhance knowledge, education, innovation, economic vitality, and stewardship of Earth. Agency: National Science Foundation (NSF); Mission: To promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; to support science and engineering education, from prekindergarten through graduate school and beyond, among other things. Agency: Nuclear Regulatory Commission (NRC); Mission: To ensure the adequate protection of public health, safety, and the environment while promoting the common defense and security. Source: GAO review of agencies' websites and strategic plans. [End of table] As part of this effort, we also identified the role that the National Science and Technology Council (NSTC), a component of OSTP, plays in coordinating STEM education programs. NSTC was established in 1993 and is the principal means for the administration to coordinate science and technology with the federal government's larger research and development effort.[Footnote 6] NSTC is made up of the Vice President, the Director of the Office of Science and Technology Policy, and officials from other executive branch agencies with significant science and technology responsibilities. One objective of NSTC is to establish clear national goals for federal science and technology investments in areas ranging from information technologies and health research to improving transportation systems and strengthening fundamental research. 
NSTC is responsible for preparing research and development strategies that are coordinated across federal agencies in order to accomplish these multiple national goals. Federal Legislation: STEM education programs have been created in two ways--by Congress directly in legislation or through agencies' broad statutory authority to carry out their missions.[Footnote 7] The Higher Education Opportunity Act,[Footnote 8] the No Child Left Behind Act of 2001,[Footnote 9] and the National Science Foundation Act of 1950[Footnote 10] created programs at the Department of Education and the National Science Foundation (NSF)--two key agencies that administer many STEM education programs. In addition, since our 2005 review of STEM education programs, Congress has also passed legislation to examine the overall federal effort to improve STEM education. For example, the Deficit Reduction Act of 2005 established ACC.[Footnote 11] ACC consisted of officials from the Department of Education and other federal agencies with responsibility for managing mathematics and science education programs and was mandated to (1) identify all federal programs with a mathematics or science education focus, (2) identify the target populations being served by such programs, (3) determine the effectiveness of such programs, (4) identify areas of overlap or duplication in such programs, and (5) recommend processes to integrate and coordinate such programs. While various pieces of legislation directly created some STEM education programs, agencies reported using their broad statutory authority to create many programs as well. For example, according to agency officials, NSF created 25 of its 37 programs and the Department of Health and Human Services (HHS) created 40 of its 46 programs in this manner. More recently, the America COMPETES Act (COMPETES), enacted in 2007, authorized several programs to promote STEM education.[Footnote 12] In December 2010, Congress reauthorized COMPETES.[Footnote 13] The reauthorization approved new funding for some STEM education programs and made substantive changes to others by reducing certain nonfederal matching requirements. Additionally, it repealed many of the programs that went unfunded following the original COMPETES passage. The COMPETES reauthorization also sought to address coordination and oversight issues, including those associated with the coordination and potential duplication of federal STEM education efforts. Specifically, Congress required the Director of OSTP to establish a committee under NSTC to inventory, review, and coordinate federal STEM education programs.[Footnote 14] Congress also directed this NSTC committee to specify and prioritize annual and long-term objectives for STEM education, and to ensure that federal efforts do not duplicate each other, among other things. NSTC is required to report to Congress annually. Beyond STEM-specific efforts, the federal government as a whole is seeking to identify programmatic areas that could be better tracked and coordinated. One such effort revolves around the Government Performance and Results Act (GPRA) Modernization Act of 2010.[Footnote 15] The GPRA Modernization Act established a new framework aimed at taking a more crosscutting and integrated approach to focusing on results and improving government performance. It requires OMB, in coordination with agencies,[Footnote 16] to develop--at least every 4 years--long-term priority goals, including outcome-oriented goals covering a limited number of crosscutting policy areas. 
On an annual basis, OMB is to provide information on how these long-term crosscutting goals will be achieved. This approach could provide a basis for more fully integrating a wide array of federal activities as well as a cohesive perspective on the long-term goals of the federal government. GAO's Work on Fragmentation, Overlap, and Duplication: In 2010, Congress directed GAO to conduct routine investigations to identify programs, agencies, offices, and initiatives with duplicative goals and activities within departments and governmentwide and report annually to Congress.[Footnote 17] In March 2011, GAO issued its first annual report to Congress in response to this requirement.[Footnote 18] In that report, we identified 81 areas for consideration--34 areas of fragmentation, overlap, and potential duplication and 47 additional areas--where agencies or Congress may wish to consider taking action in an effort to reduce the cost of government operations or enhance revenue collections. Using the framework established in the March 2011 GAO report, we examine the extent to which federal STEM education programs are fragmented, overlapping, and duplicative. For the purposes of this report, the key terms are defined as follows: * Fragmentation occurs when more than one federal agency (or more than one organization within an agency) is involved in the same broad area of national need. * Overlap occurs when multiple programs offer similar services to similar target groups in similar STEM fields to achieve similar objectives. * Duplication occurs when multiple programs offer the same services to the same target beneficiaries in the same STEM fields. Thirteen Federal Agencies Administered over 200 STEM Education Programs with Over $3 Billion in Obligated Funds: Thirteen agencies administered 209 STEM education programs in fiscal year 2010.[Footnote 19] (See appendix I for our definition of a STEM education program.) Agencies reported that they developed the majority (130) of these programs through their general statutory authority and that Congress specifically directed agencies to create 59 of these programs.[Footnote 20] The number of programs each agency administered ranged from 3 to 46 with three agencies--HHS, the Department of Energy, and NSF--administering more than half of all programs--112 of 209. Figure 1 provides a summary of the number of programs by agency, and appendix II contains a list of the 209 STEM education programs and reported obligations for fiscal year 2010. Figure 1: Number of STEM Education Programs Reported by Agency: [Refer to PDF for image: plotted point graph] Agency: NASA; Fiscal year 2010 obligations: $209.6 million; Number of programs: 9. Agency: NSF; Fiscal year 2010 obligations: $1.1 billion; Number of programs: 37. Agency: NRC; Fiscal year 2010 obligations: $22.5 million; Number of programs: 3. Agency: USDA; Fiscal year 2010 obligations: $41.9 million; Number of programs: 11. Agency: Commerce; Fiscal year 2010 obligations: $87.9 million; Number of programs: 19. Agency: DOD; Fiscal year 2010 obligations: $157.0 million; Number of programs: 19. Agency: Education; Fiscal year 2010 obligations: $691.2 million; Number of programs: 12. Agency: Energy; Fiscal year 2010 obligations: $72.2 million; Number of programs: 29. Agency: HHS; Fiscal year 2010 obligations: $597.2 million; Number of programs: 46. Agency: DHS; Fiscal year 2010 obligations: $7.1 million; Number of programs: 5. Agency: Interior; Fiscal year 2010 obligations: $1.0 million; Number of programs: 3. 
Agency: DOT; Fiscal year 2010 obligations: $94.3 million; Number of programs: 6. Agency: EPA; Fiscal year 2010 obligations: $18.3 million; Number of programs: 10. Source: GAO analysis of survey responses. [End of figure] Having multiple agencies, with varying expertise, involved in delivering STEM education can be advantageous. One such advantage is that agencies may be better able to tailor programs to suit their specific missions and needs. For example, Energy officials said that their efforts to support students in pursuing a STEM course of study are related to Energy's mission and work in their labs and can be a way to attract new employees to their workforce. However, this could also make it challenging to develop a coherent federal approach to educating STEM students and creating a workforce with STEM skills. Having multiple agencies involved in the delivery of STEM education could also make it challenging to identify gaps and allocate resources across the federal government. Agencies obligated over $3 billion to STEM education programs in fiscal year 2010.[Footnote 21] Individual program obligations ranged from $15,000 to hundreds of millions of dollars. NSF and the Department of Education programs accounted for over half of this funding. Almost a third of the programs had obligations of $1 million or less, with 5 programs having obligations of more than $100 million each. See figure 2 for program obligation ranges. Figure 2: Number of STEM Education Programs by Range of Obligations, Fiscal Year 2010: [Refer to PDF for image: vertical bar graph] Total fiscal year 2010 obligations: $25,000 or less; Number of programs: 4. Total fiscal year 2010 obligations: $25,001 to $100,000; Number of programs: 5. Total fiscal year 2010 obligations: $100,001 to $500,000; Number of programs: 25. Total fiscal year 2010 obligations: $500,001 to $1 million; Number of programs: 30. Total fiscal year 2010 obligations: $1,000,001 to $5 million; Number of programs: 67. Total fiscal year 2010 obligations: More than $5 million; Number of programs: 78. Source: GAO analysis of survey responses. [End of figure] Agencies carried out other activities that did not fit our definition of a STEM education program because STEM education was their secondary or tertiary objective, rather than their primary objective. These efforts include broad-based programs with STEM components, programs that enhance the general public's knowledge of STEM, and research programs that may hire students.[Footnote 22] Selected examples of agencies' efforts as reported to us by agency officials include the following: Broad-Based Programs That Include STEM Components: * Several of the Department of Education's programs have STEM components. For example, Title I of the Elementary and Secondary Education Act of 1965, as amended, includes funding for the assessment of math for primary and secondary students, putting a renewed focus on educational attainment in these areas. In addition, the Race to the Top Fund, a competitive grant program, includes bonus points for states that report they will include efforts to enhance STEM education in their grant activities. * The Department of Transportation's State Maritime Academy program supports maritime training and education programs in an effort to improve the quality of the U.S. maritime industry with a secondary objective to encourage students to pursue careers in STEM fields that can contribute to the maritime industry. 
Programs to Educate the General Public: * The National Institutes of Health's (NIH) Science Education Drug Abuse Partnership Award provides support for the formation of partnerships among scientists and educators, media experts, community leaders, and other interested organizations for the development and evaluation of programs and materials that will enhance knowledge and understanding of science related to drug abuse. The intended focus is on topics not well addressed by existing educational, community, or media activities. Research Programs That Include Internships or Assistantships: * Energy's national laboratories, most of which are managed by contractors and engage in research activities on behalf of multiple federal agencies, sometimes partner with universities and offer students research opportunities in various disciplines, such as science and technology. The primary focus of these laboratories is on research and development, which is determined by the funding institution, and there is not always a requirement that they hire students. When research programs do hire students, this can enhance students' education and interest in STEM. * The Department of Defense has several programs with a primary objective to further research on a specific STEM topic. For example, it has programs that fund university faculty to conduct research on STEM topics; these faculty may hire students to assist with the research. * The Department of Homeland Security receives funding for technological research in areas that support its mission, and a portion of this may go to student research activities such as hiring a student for the summer or for several weeks to assist with the research. Nonmonetary Partnerships with Schools or through Private Partnerships: * The Department of the Interior participates in the GeoFORCE program--a precollege program that provides hands-on science learning experiences for middle and high school students (primarily underserved minorities)--which is mostly funded by private donations and the University of Texas.[Footnote 23] * The Environmental Protection Agency has a cooperative agreement with the Hispanic Association of Colleges and Universities that is intended to increase the diversity of students going into science and technology careers. The agreement includes activities such as EPA staff participation in lectures, conferences, and other events, as well as EPA staff members serving as mentors or coaches, among other things. Dedicated Funds for Education Programs: * NASA's Science Mission Directorate (SMD) requires each of its missions to fund SMD-related education and public outreach using a small percentage of the research and development program costs, but these funds are not specifically for STEM education. Most STEM Programs Overlapped to Some Degree, Highlighting the Need for Improved Coordination and Planning: Most Programs Overlapped to Some Degree in Their Primary Objectives, Target Groups, and Services Provided: As figure 3 illustrates, in fiscal year 2010, 83 percent of STEM education programs overlapped to some degree with another program in that they offered at least one similar service to at least one similar target group in at least one similar STEM field to achieve at least one similar objective. These programs ranged from being narrowly focused on a specific group or field of study to offering a range of services to students and teachers across STEM fields. 
This complicated patchwork of overlapping programs has largely resulted from federal efforts to both create and expand programs across many agencies in an effort to improve STEM education and increase the number of students going into STEM fields. Program officials reported that approximately one-third of STEM education programs funded in fiscal year 2010 were first funded between 2005 and 2010. Indeed, the creation of new programs during that time frame may have contributed to overlap and, ultimately, to inefficiencies in how STEM programs across the federal government are focused and delivered. Overlap among STEM education programs is not new. In 2007, ACC identified extensive overlap among STEM education programs, and, in 2009, we identified overlap among teacher quality programs, which include several programs focused on STEM education. Figure 3: Overlapping Federal STEM Education Programs: [Refer to PDF for image: illustration] 209 STEM education programs: Programs that have at least one similar target population: 100%; 209 programs. Programs that have at least one similar target population and also provide at least one similar service: 99%; 207 programs (2 programs do not overlap). Programs that have at least one similar target population and also provide at least one similar service and also at least one similar STEM field of focus: 83%; 173 programs (34 programs do not overlap). Programs that have at least one similar target population and also provide at least one similar service and also at least one similar STEM field of focus and also have at least one similar program objective: 83% (173 programs). Source: GAO analysis of survey responses. [End of figure] Similar Target Groups: Many programs provided services to similar target groups, such as K-12 students, postsecondary students, K-12 teachers, and college faculty and staff. The vast majority of programs (170) served postsecondary students. Ninety-five programs served college faculty and staff, 75 programs served K-12 students, and 70 programs served K-12 teachers. In addition, many programs served multiple target groups. In fact, as figure 4 illustrates, 177 programs were primarily intended to serve two or more target groups. Figure 4: Number of Target Groups per Federal STEM Education Program: [Refer to PDF for image: horizontal bar graph] Target groups per program: 1; Number of programs: 30. Target groups per program: 2; Number of programs: 130. Target groups per program: 3; Number of programs: 18. Target groups per program: 4; Number of programs: 29. Source: GAO analysis of survey responses. Note: Two programs indicated they did not serve any of the target groups listed in our survey. [End of figure] Similar Services Provided: As figure 5 illustrates, we also found many STEM programs providing similar services. * To support students, 167 different programs provided research opportunities, internships, mentorships, or career guidance. In addition, 144 programs provided short-term experiential learning opportunities and 127 long-term experiential learning opportunities. Short-term experiential learning activities include field trips, guest speakers, workshops, and summer camps. Long-term experiential learning activities last a semester in length or longer. Furthermore, 137 programs provided outreach and recognition to generate student interest, 124 provided classroom instruction, and 75 provided student scholarships or fellowships. 
* To support teachers, 115 programs provided curriculum development, 83 programs provided teacher in-service, professional development, or retention activities, and 52 programs provided preservice or recruitment activities. * To support STEM research, 68 programs reported conducting research to enhance the quality of STEM education. * To support institutions, 65 programs provided institutional support for management and administrative activities, and 46 programs provided support for expanding the facilities, classrooms, and other physical infrastructure of institutions. Figure 5: Services Provided by Federal STEM Education Programs: [Refer to PDF for image: horizontal bar graph] Service: Research opportunities, internships, mentorships, or career guidance; Number of programs: 167. Service: Short-term experiential learning activities; Number of programs: 144. Service: Long-term experiential learning activities; Number of programs: 127. Service: Outreach and recognition to generate student interest in STEM field(s); Number of programs: 137. Service: Classroom instruction; Number of programs: 124. Service: Student scholarships or fellowships; Number of programs: 75. Service: Curriculum development; Number of programs: 115. Service: Teacher in-service, professional development, or retention services; Number of programs: 83. Service: Teacher preservice or recruitment activities; Number of programs: 52. Service: Research to improve STEM education; Number of programs: 68. Service: Institutional support for management and administrative activities; Number of programs: 65. Service: Institutional support for infrastructure; Number of programs: 46. Service: Other; Number of programs: 31. Source: GAO analysis of survey responses. [End of figure] Many programs provided similar services to similar target groups. For example, 39 programs that listed chemistry as a primary field of focus provided student scholarships or fellowships to postsecondary students. Many of these programs offered scholarships and fellowships to minority, disadvantaged, or underrepresented students across a broad range of STEM fields. Specifically, some programs, like NASA's Minority University Research and Education Program (MUREP) and the Department of Commerce's Dr. Nancy Foster Scholarship Program, offered scholarships, along with a range of other services, to underrepresented and underserved students in overlapping STEM fields even though the programs focused on preparing students to work in fields that support the science mission of each agency. Overall, most programs provided an array of services to target groups--150 programs provided four or more services, while only 16 programs provided one service. Similar STEM Fields of Focus: In addition to serving multiple target groups, most programs also provided services in multiple STEM fields. Twenty-three programs targeted one specific STEM field, while 121 programs targeted four or more specific STEM fields. In addition, 26 programs indicated that they did not focus on any specific STEM field; instead, they provided services eligible for use in any STEM field. Five different STEM fields had over 100 programs that provided services. Biological sciences and technology were the most commonly selected STEM fields of focus. Agricultural sciences, which was the least commonly selected, still had 27 programs that provided services specifically to that STEM field. 
While the data show that many programs had similar target groups and similar STEM fields of focus, it is also important to compare programs' target groups and STEM fields of focus to get a better picture of the potential target beneficiaries that could be served within a given STEM discipline. For example, both the National Environmental Satellite, Data, and Information Service (NESDIS) Education and the Graduate Automotive Technology Education Program provided scholarships or fellowships to postsecondary students, but one focused on students in earth, atmospheric, and ocean sciences programs, and one on students in engineering, specifically in the areas of hybrid propulsion systems, fuel cells, biofuels, energy storage systems, lightweight materials, and advanced computation; therefore, the target beneficiaries served by these programs are quite different. Nevertheless, overlap within a given field remains common; for example, 72 programs provided services to postsecondary students in physics. As table 2 illustrates, many programs offered services to similar target groups in similar STEM fields of focus. Overlapping programs can lead to individuals and institutions being eligible for similar services in similar STEM fields offered through multiple programs and, without information sharing, could lead to the same service being provided to the same individual or institution. Table 2: STEM Fields of Focus and Target Groups of Federal STEM Education Programs: Target groups: K-12 students; Agricultural sciences: 8; Biology: 40; Chemistry: 36; Computer science: 30; Earth sciences: 38; Engineering: 32; Mathematics: 33; Physics: 31; Social sciences: 19; Technology: 43. Target groups: Postsecondary students; Agricultural sciences: 22; Biology: 99; Chemistry: 85; Computer science: 84; Earth sciences: 64; Engineering: 89; Mathematics: 79; Physics: 76; Social sciences: 62; Technology: 87. Target groups: K-12 teachers; Agricultural sciences: 5; Biology: 36; Chemistry: 33; Computer science: 25; Earth sciences: 39; Engineering: 26; Mathematics: 28; Physics: 29; Social sciences: 17; Technology: 38. Target groups: College faculty; Agricultural sciences: 17; Biology: 49; Chemistry: 42; Computer science: 43; Earth sciences: 35; Engineering: 47; Mathematics: 37; Physics: 36; Social sciences: 30; Technology: 50. Source: GAO analysis of survey results. Note: Many STEM education programs serve multiple target groups with multiple STEM fields of focus. The totals cited in table 2 will not sum to 209, the number of programs in our review. Earth sciences includes atmospheric and ocean sciences; social sciences includes psychology, sociology, anthropology, cognitive science, economics, and behavioral sciences. [End of table] Similar Objectives: Many STEM education programs had similar objectives. The vast majority (87 percent) of STEM education programs indicated that attracting and preparing students throughout their academic careers in STEM areas was a primary objective. In addition to attracting and preparing students throughout their academic careers in STEM areas, officials also indicated the following primary program objectives: * improving teacher education in STEM areas (teacher development)--26 percent; * improving or expanding the capacity of K-12 schools or postsecondary institutions to promote or foster education in STEM fields (institution capacity building)--24 percent; and * conducting research to enhance the quality of STEM education provided to students (STEM education research)--18 percent. Many programs also reported having multiple primary objectives. 
While 107 programs focused solely on student education, 82 others indicated having multiple primary objectives, and 9 programs reported having 4 or more primary objectives. Few programs reported focusing solely on teacher development, institution capacity building, or STEM education research. Most of these objectives were part of a larger program that also focused on attracting and preparing students in STEM education. Overlapping Programs Are Not Necessarily Duplicative: However, even when programs overlapped, the services they provided and the populations they served may differ in meaningful ways and would therefore not necessarily be duplicative: * There may be important differences between the specific field(s) of focus and the program's stated goals. For example, both Commerce's National Estuarine Research Reserve System Education Program and the Nuclear Regulatory Commission's Integrated University Program provided scholarships or fellowships to doctoral students in the field of physics. However, the National Estuarine Research Reserve System Education Program's goal was to increase environmental literacy related to estuaries and coastal watersheds by providing students with an opportunity to conduct research of local and national significance that focuses on enhancing coastal zone management; while the Integrated University Program focused on supporting education in nuclear science, engineering, and related fields with the goal of developing a workforce capable of designing, constructing, operating, and regulating nuclear facilities and capable of handling nuclear materials safely. * Programs may be primarily intended to serve different specific populations within a given target group. For example, 65 programs were primarily intended to serve minority, disadvantaged, or underrepresented groups and 10 programs limited their services to students or teachers in specific geographic areas.[Footnote 24] Indeed, of the 34 programs providing services to K-12 students in the field of technology, 10 were primarily intended to serve specific underrepresented, minority, or disadvantaged groups, and 2 were limited geographically to individual cities or universities. * Furthermore, individuals may receive assistance from different programs at different points throughout their academic careers that provide services that complement or build upon each other, simultaneously supporting a common goal rather than serving cross purposes. The America COMPETES Reauthorization Act of 2010 Requires Coordination and Strategic Planning of STEM Education Initiatives: Despite past recommendations from ACC and others to improve coordination among STEM education programs, efforts to coordinate STEM education programs across the government remain limited. Although 83 percent of STEM education programs overlapped to some degree with at least one other program, only 33 percent of programs reported coordinating with other agencies that provide similar STEM education services to similar program beneficiaries, not including basic governmentwide inventory efforts. Some program officials mentioned that they coordinate by employing informal mechanisms for information sharing such as conversations and meetings between program staff, sharing resources or best practices, and participating in conferences with other agency officials. 
Other efforts included developing memorandums of understanding, issuing joint guidance, cofunding programs, and establishing interagency working groups focused on specific science subjects or providing a specific service to a specific target group. With growing concern about the need for improved federal coordination and planning in STEM education, Congress passed the America COMPETES Reauthorization Act of 2010, which requires the Director of OSTP to establish a committee under NSTC to coordinate STEM education activities and programs among respective federal agencies and OMB. The NSTC Committee on Science, Technology, Engineering, and Math Education (CoSTEM), composed of representatives from 11 different federal agencies, convened its first meeting in March 2011. The statute requires NSTC to develop a 5-year governmentwide STEM education strategic plan and identify areas of duplication among federal programs. CoSTEM provides NSTC with an opportunity to improve coordination and be more strategic with the federal investment in STEM education. Best practices in interagency collaboration include developing ongoing mechanisms and processes to monitor, measure, and report agency progress toward NSTC's strategic planning goals and making the results publicly available to improve accountability. According to OSTP officials, a description of the 5-year strategic plan should be publicly available in early 2012; however, as called for in its charter, the committee will terminate no later than March 31, 2015, before the first 5-year plan is carried out, unless it is renewed by the Director of OSTP. Pursuant to requirements under the 2010 reauthorization of the COMPETES Act, NSTC has implemented several initiatives to enhance coordination. In December 2011, CoSTEM published a report on the inventory of the federal STEM education portfolio that, according to OSTP officials, will be used to improve coordination and inform the strategic planning process.[Footnote 25] Specifically, OSTP officials said the inventory will allow agencies to identify similar programs and share information and best practices. Without proper coordination, overlapping programs may not share information with other interested agencies about the results of the actions taken or research conducted, possibly leading to numerous programs providing assistance to address the same issue or area of research. To the extent that CoSTEM identifies duplicative programs, it will be important that it consider the trade-offs associated with program consolidation and assist agencies in determining the most effective and efficient way to reduce duplication. Cost savings might be achieved through the consolidation of duplicative program administrative structures. However, our past work has shown that program consolidation can be more expensive in the short term, and, in the long term, cost savings could be diminished if the workload associated with certain administrative activities remains the same, such as reviewing and assessing applications, providing technical assistance, and monitoring program recipients.[Footnote 26] Furthermore, over 90 percent of STEM education programs that reported on administrative costs estimated having administrative costs lower than 10 percent of their total program costs.[Footnote 27] Finally, the consolidation of some programs may require congressional action because some programs may be statutorily mandated. 
Limited Use of Performance Measures and Evaluations May Hamper Ability to Assess Effectiveness: Programs Varied in Their Ability to Provide Information on Populations Served: Program officials varied in their ability to provide reliable information on the number of students, teachers, or institutions directly served by their programs--which is a type of output measure. For example, among programs in our review that served postsecondary teachers and students in 2010, about one-fifth of them did not know the number served. However, depending on the service delivery structure of the program, it may be more difficult to track this number. In some cases, the program's agency did not maintain databases or contracts that would track the number of students served by the program. In other cases, programs may not have been able to provide information on the numbers of institutions they served because they provided grants to secondary recipients. For example, one program indicated that it gives grants to institutions to provide internships or scholarships but that funding goes directly to students, so it does not have information about the number of institutions served. Programs that provide informal educational activities or online services also reported difficulty in tracking the number of individuals who benefited from their programs. The validity and accuracy of the reported output data for some of these programs may be questionable and may hinder program planning and assessment. Programs that reported the numbers they served used varied approaches to collect this information, including annual reports from grant recipients, student enrollment counts, estimates of the expected number of participants reached, and reviews of funding proposals. Some programs had third parties track the numbers served, but did not always take steps to independently verify the data or review the process for how the information was collected. Further, the inconsistent collection of output measures across programs makes it challenging to aggregate the number of students, teachers, and institutions served and to assess the effectiveness of the overall federal effort. Output data are an important component to understanding whether programs are likely to meet their goals. For example, if a K-12 program has the goal of increasing the number of undergraduates pursuing coursework in STEM fields, it is important to know how many K-12 students were in the program. Without such data, it would be challenging to assess the intended outcome of the program-- for example, the number of students who actually went on to pursue such coursework. Outcome Data Are Not Clearly Reflected in Agencies' Performance Plans and Reports: Agencies in our review did not use outcome measures in a way that is clearly reflected in their performance plans and performance reports-- publicly available documents they use for performance planning. [Footnote 28] This may hinder decisionmakers' ability to assess how agencies' STEM efforts contribute to agencywide performance goals and the overall federal STEM effort. 
In our review of fiscal year 2010 annual performance plans and reports of the 13 agencies with STEM programs, we found that most agencies did not connect STEM education activities to agency goals or measure and report on the progress of those activities.[Footnote 29] These documents typically lay out agency performance goals that establish the level of performance to be achieved by program activities during a given fiscal year, the measures developed to track progress, and what progress has been made toward meeting those performance goals. As figure 6 illustrates, in our review of agencies' specific references to their overall STEM education initiatives, although 38 percent of agencies mentioned STEM education in their performance plans and 62 percent in their performance reports, fewer cited outcome measures related to STEM education. More specifically, in reporting on their progress toward meeting their performance goals, 46 percent of the agencies mentioned STEM education as contributing to one of these goals in their performance reports. Moreover, agencies that spent the most on STEM education were not necessarily more likely to mention their STEM efforts, connect them to agency performance goals, or measure and report on their progress. For instance, NASA, which administered 9 STEM programs with obligations of about $209.6 million in fiscal year 2010, mentioned its overall STEM education efforts and connected them to agency performance goals in its planning documents and measured and reported on progress in both its performance plan and report. On the other hand, HHS's National Institutes of Health, which administered the most STEM education programs (44), with obligations of about $573.6 million, referred to agency performance goals and outcome measures of its STEM education efforts only in some of its institutes' performance reports, but not in its NIH-wide performance plan. Figure 6: Integration of Overall STEM Education Efforts in Agencies' Performance Plans and Reports: [Refer to PDF for image: horizontal bar graph] Number of agencies: Performance plan: Mentioned STEM education; Yes: 5 (38%); No: 8 (62%). Performance plan: Connected STEM education to agency goals; Yes: 4 (31%); No: 9 (69%). Performance plan: Measured and reported on progress of STEM education; Yes: 2 (15%); No: 11 (85%). Performance report: Mentioned STEM education; Yes: 8 (62%); No: 5 (38%). Performance report: Connected STEM education to agency goals; Yes: 6 (46%); No: 7 (54%). Performance report: Measured and reported on progress of STEM education; Yes: 3 (23%); No: 10 (77%). Source: GAO analysis of survey responses. [End of figure] As figure 7 illustrates, in our review of agencies' specific references to their STEM education programs, while the 13 agencies combined mentioned 38 percent of their programs in their performance plans, they connected 19 percent of their STEM education programs to agency performance goals and measured and reported on progress of 9 percent of the programs. Agencies' STEM education obligations and number of programs did not correlate directly with their likelihood of connecting the programs to agency performance goals or measuring and reporting on their progress in performance plans and reports. For example, Interior, through the U.S. Geological Survey, which administered just 3 STEM education programs in fiscal year 2010, mentioned all of its programs in its performance plan. 
In contrast, NSF, which administered 37 STEM education programs and obligated about $1.1 billion in fiscal year 2010, connected only 2 of its programs to agency performance goals, while measuring and reporting on progress in its performance plan and report.

Figure 7: Integration of STEM Education Programs in Agencies' Performance Plans and Reports: [Refer to PDF for image: horizontal bar graph] Number of programs: Performance plan: Mentioned STEM education; Yes: 79 (38%); No: 139 (62%). Performance plan: Connected STEM education to agency goals; Yes: 39 (19%); No: 170 (81%). Performance plan: Measured and reported on progress of STEM education; Yes: 19 (9%); No: 190 (91%). Performance report: Mentioned STEM education; Yes: 41 (20%); No: 168 (80%). Performance report: Connected STEM education to agency goals; Yes: 23 (11%); No: 186 (89%). Performance report: Measured and reported on progress of STEM education; Yes: 15 (7%); No: 194 (93%). Source: GAO analysis of survey responses. [End of figure]

The GPRA Modernization Act of 2010 and the America COMPETES Reauthorization Act of 2010 afford agencies the opportunity to better utilize performance measures for both governmentwide and agency-specific STEM education efforts. For example, the GPRA Modernization Act will require agencies to identify program activities and other activities--which may include STEM education activities--that contribute to each performance goal. It also recognizes the importance of governmentwide performance goals by requiring OMB to develop, in coordination with agencies, long-term, crosscutting federal government priority goals that are to be updated or revised every 4 years and tracked quarterly to review progress and improve government performance. According to OMB guidance, OMB will announce interim federal government priority goals in February 2012 and finalize them in February 2014. The America COMPETES Reauthorization Act of 2010 also focuses on accountability through strategic planning and has specific requirements for agencies with STEM programs. Specifically, it requires NSTC to develop a STEM education strategic plan with long-term objectives, metrics to assess agencies' progress, and approaches taken by participating agencies to assess the effectiveness of their STEM programs and activities. However, while OSTP will be required to report on agencies' annual progress toward the long-term objectives, an OSTP official said there is no mechanism to make agencies align their performance measures with the goals and objectives in the strategic plan.

Most STEM Education Programs Have Not Completed Evaluations: Little is known about the effectiveness and performance of STEM education programs because the majority of them (66 percent) have not conducted an evaluation of their entire program since 2005 (as figure 8 illustrates). We define "evaluation" as an individual systematic study conducted periodically or on an ad hoc basis to assess how well a program is working, typically relative to its program objectives. Some programs that did not complete an evaluation reported that their grantees had completed one; however, in those cases, few programs used these grantee evaluations to inform a more comprehensive evaluation of the entire program that they or an external evaluator completed.

Figure 8: Percentage of STEM Education Programs, by Status of Evaluations since 2005: [Refer to PDF for image: pie-chart] Completed: 29%; None completed: 66%; In progress: 5%.
Source: GAO analysis of survey responses. Note: Of programs in our review, 12 percent (26 of 209) reported they planned to complete one, and 12 percent (26 of 209) used other reporting and evaluation activities such as memos summarizing program activities and evaluation planning documents. [End of figure]

In total, since 2005, agencies conducting 61 programs (representing about 61 percent of the $3.1 billion obligated in fiscal year 2010) responded that they had completed evaluations, which used a variety of methods and designs. We reviewed evaluations for 35 of the 61 programs.[Footnote 30] Most of the 35 program evaluations we reviewed used methods and designs that appropriately assessed how well the programs met their stated objectives. For instance, one evaluation selected a random sample of its former program participants and compared them with a sample of students who had applied to the same program, but had not participated. While former participants had some statistically significant academic outcomes when compared with the nonparticipants, the evaluation also noted other factors that may have influenced the favorable outcomes of the program--for example, that participants, on average, were more interested in careers in science and math than the nonparticipants, so the true effects of program participation may be overstated. Even though most of the 35 programs we reviewed employed appropriate methods and designs to assess their programs' effectiveness, we identified several ways to improve evaluations of STEM education, based on our review.
* Improved survey response rates: Many of the evaluations we reviewed had low response rates. Without better response rates, the ability to generalize from the results may be limited.
* Better alignment of the methods with other components of the evaluation: Specifically, 10 of the programs used evaluation methods that were not fully aligned with the evaluation questions and the program context.[Footnote 31] For example, 3 of these evaluations had data limitations, thus hindering the use of methods that could collect the full range of data to inform program outcomes.
* Robust use of criteria to measure outcomes: Among the 27 programs that measured outcomes, 9 did not evaluate them against any criteria. Without criteria to evaluate the outcomes, it may be difficult to establish programmatic impact and assess performance and effectiveness.

Furthermore, in order to influence program practice, evaluation results must be disseminated widely. While nearly all of the STEM education programs that reported completing an evaluation also reported using various mechanisms to disseminate results, they did not always share results in a way that facilitated knowledge sharing. Program officials reported that the most common means of disseminating their results were their websites and conferences or forums. However, these mechanisms have limits: according to a 2006 NSTC report, such methods require practitioners to actively seek out results and may therefore prevent research findings from reaching them. NSTC also reported that STEM education research results may not be conveyed to practitioners because the results often lack applicability, some are ambiguous, and the culture of teaching typically does not base decisions on research findings.
NSTC identified other issues with sharing information about STEM education program results and suggested several actions that agencies could take to improve dissemination, such as engaging practitioners to collaborate with researchers in setting research agendas.[Footnote 32] According to NSTC officials, most agencies do not share or disseminate evaluations in a way that could be useful for coordination.

Conclusions: Although the federal government invests billions of dollars annually in STEM education programs, concerns remain about U.S. economic and educational competitiveness, particularly with regard to the national educational system's ability to produce citizens literate in STEM subjects and to produce future scientists, technologists, engineers, and mathematicians. Prior reports on STEM education have highlighted the lack of federal governmentwide planning and coordination. Recently, both Congress and the administration have called for a more strategic and effective approach to the federal government's investment in STEM education. The America COMPETES Reauthorization Act of 2010 requires the Director of OSTP to establish a committee under NSTC to develop a 5-year strategic plan and submit annual reports, including a description of the plan, to Congress. The plan is expected to include common measures to assess progress toward its goals. In addition, the GPRA Modernization Act of 2010 requires agencies to identify program activities that contribute to each performance goal, and, as agencies implement this provision, more information about STEM education efforts in performance plans and reports can be expected. NSTC's ongoing strategic planning efforts provide an opportunity to develop guidance on how to incorporate STEM- and program-specific education goals and measures in agencies' performance planning and reporting process and to align their STEM education efforts with a governmentwide STEM education strategy. To further strengthen strategic planning and coordination efforts, an accountability and reporting framework is needed to ensure that agencies adhere to NSTC's strategic plan. While the STEM education programs we reviewed in this report are fragmented and overlapping to some degree, they are not necessarily duplicative of one another. More analysis is needed to identify areas of duplication among federal STEM education programs and ensure that the federal investment in these programs advances NSTC's 5-year strategic plan that is under development. In this era of budget constraints, governmentwide strategic planning can play a critical role in addressing concerns about program fragmentation, overlap, and duplication. Fragmentation and overlap can (1) frustrate federal officials' efforts to administer programs in a comprehensive manner, (2) limit the ability to determine which programs are most cost-effective, and (3) ultimately increase program administrative costs. Therefore, if NSTC's 5-year strategic plan is not developed in a way that aligns agencies' efforts to achieve governmentwide goals, enhances the federal government's ability to assess what works, and concentrates resources on those programs that advance the strategy, the federal government may spend limited funds in an inefficient and ineffective manner that does not best help to improve the nation's global competitiveness.
Understanding program performance and effectiveness is also key in determining where to strategically invest limited federal funds to achieve the greatest impact in developing a pipeline of future workers in STEM fields. Programs need to be appropriately evaluated to determine what is working and how improvements can be made. However, most agencies have not conducted comprehensive evaluations since 2005 to assess the effectiveness of their STEM education programs. Furthermore, methods for dissemination of program evaluations, especially to practitioners, could be improved. Agency and program officials would benefit from guidance and information sharing within and across agencies about what is working and how to best evaluate programs. This could not only help to improve individual program performance, but also inform agency and governmentwide decisions about which programs should continue to be funded. Without an understanding of what is working in some programs, it will be difficult to develop a clear strategy for how to spend limited federal funds. Recommendations for Executive Action: The Director of OSTP should direct NSTC to: 1. Develop guidance for how agencies can better incorporate each agency's STEM education efforts and the goals from NSTC's 5-year STEM education strategic plan into each agency's own performance plans and reports. 2. Develop a framework for how agencies will be monitored to ensure that they are collecting and reporting on NSTC strategic plan goals. This framework should include alternatives for a sustained focus on monitoring coordination of STEM programs if the NSTC Committee on STEM terminates in 2015 as called for in its charter. 3. Work with agencies, through its strategic planning process, to identify programs that might be candidates for consolidation or elimination. Specifically, this could be achieved through an analysis that includes information on program overlap, similar to the analysis conducted by GAO in this report, and information on program effectiveness. As part of this effort, OSTP should work with agency officials to identify and report any changes in statutory authority necessary to execute each specific program consolidation identified by NSTC's strategic plan. 4. Develop guidance to help agencies determine the types of evaluations that may be feasible and appropriate for different types of STEM education programs and develop a mechanism for sharing this information across agencies. This could include guidance and sharing of information that outlines practices for evaluating similar types of programs. Agency Comments and Our Evaluation: We provided a draft of this report to the Office of Science and Technology Policy (OSTP) and the Office of Management and Budget (OMB) for review and comment. OSTP provided technical comments that we incorporated as appropriate. OMB had no concerns with the report. As we agreed with your office, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. We are sending copies of this report to relevant congressional committees, OSTP, OMB, and other interested parties. In addition, this report will be available at no charge on GAO's website at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-7215 or scottg@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. 
GAO staff who made key contributions to this report are listed in appendix IV. Signed by: George A. Scott: Director, Education, Workforce, and Income Security Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: The objectives of our report were to determine (1) the number of federal agencies and programs that provided funding for science, technology, engineering, and mathematics (STEM) education programs in fiscal year 2010; (2) the extent to which STEM programs have similar objectives, serve similar target groups, provide similar types of services, and, if necessary, what opportunities exist to increase coordination; and (3) the extent to which STEM programs have measured their effectiveness. To inform all of our objectives, we reviewed relevant federal laws and regulations. We also reviewed previous work that was conducted to catalog and assess the federal investment in STEM education programs, including a 2005 GAO study,[Footnote 33] the 2007 Academic Competitiveness Council (ACC) report,[Footnote 34] and the 2010 Office of Management and Budget (OMB) inventory. We reviewed relevant literature and past reports on STEM education, including the 2010 President's Council of Advisors on Science and Technology (PCAST) report entitled Report to the President: Prepare and Inspire: K-12 Education in Science, Technology, Engineering, and Math (STEM) for America's Future [Footnote 35] and the National Academies Press report entitled Rising above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future: Committee on Prospering in the Global Economy of the 21st Century: An Agenda for American Science and Technology.[Footnote 36] In addition, we interviewed officials from OMB, the Office of Science and Technology Policy (OSTP), and 13 other federal agencies that administer STEM education programs to gather information on their STEM education efforts, the extent of coordination between programs, and the existence of program evaluations. We attended several STEM education conferences to gather additional perspectives about federal STEM education programs. Finally, we reviewed evaluations provided by program officials as well as agencies' annual performance plans and reports. To gather information on federal STEM education programs and to assess the level of fragmentation, overlap, and potential duplication among them, we first reviewed past GAO work on assessing the level of fragmentation, overlap, and duplication among other groups of federal programs. Next, we surveyed over 200 programs across 13 agencies that met our definition of a STEM education program (see below) with questions about program objectives, target populations, services provided, interagency coordination, outcome measures and evaluations, and funding information. In December 2011, NSTC's Committee on STEM Education released its inventory of the federal STEM education portfolio.[Footnote 37] The NSTC inventory differs from GAO's survey in that it counts investments and allocations, whereas GAO asked agencies to report on programs and obligations. 
Definition of STEM Education Program: For the purposes of our study, we defined a federally funded STEM education program as a program funded in fiscal year 2010 by congressional appropriation or allocation that includes one or more of the following as a primary objective: * attract or prepare students to pursue classes or coursework in STEM areas through formal or informal education activities (informal education programs provide support for activities provided by a variety of organizations that offer students learning opportunities outside of formal schooling through contests, science fairs, summer programs, and other means; outreach programs targeted to the general public should not be included), * attract students to pursue degrees (2-year, 4-year, graduate, or doctoral degrees) in STEM fields through formal or informal education activities, * provide training opportunities for undergraduate or graduate students in STEM fields (this can include grants, fellowships, internships, and traineeships that are targeted to students; general research grants that are targeted to researchers that may hire a student to work in the lab should not be considered a STEM education program), * attract graduates to pursue careers in STEM fields, * improve teacher (preservice or in-service) education in STEM areas, * improve or expand the capacity of K-12 schools or postsecondary institutions to promote or foster education in STEM fields, or: * conduct research to enhance the quality of STEM education programs provided to students.[Footnote 38] In addition, we defined STEM education programs to include grants, fellowships, internships, and traineeships. While programs designed to retain current employees in STEM fields were not included, programs that fund retraining of workers to pursue a degree in a STEM field were included because these programs help increase the number of students and professionals in STEM fields by helping retrain non-STEM workers to work in STEM fields. For the purposes of this study, we defined the term "program" as an organized set of activities supported by a congressional appropriation or allocation. Further, we defined a program as a single program even when its funds were allocated to other programs as well. We asked agency officials to provide a list of programs that received funds in fiscal year 2010. This included programs that received one-time, limited funds in fiscal year 2010, such as earmarks.[Footnote 39] Definition of STEM Fields: We determined that a STEM field should be considered any of the following broad disciplines: * agricultural sciences; * biological sciences; * chemistry; * computer science; * earth, atmospheric, and ocean sciences; * engineering; * mathematics; * physics; * social sciences (e.g., psychology, sociology, anthropology, cognitive science, economics, behavioral sciences); or: * technology. In addition, we determined that our definition of STEM education would include health care programs that train students for careers that are primarily in scientific research. We did not, however, include health care programs that train students for careers that are primarily in patient care, that is, those that trained nurses, doctors, dentists, psychologists, or veterinarians. Program Selection: To identify federally funded STEM education programs, first we developed a combined list of programs based on the findings of three previous STEM education inventory efforts completed by GAO in 2005, ACC in 2007, and OMB in 2010. 
Second, we shared our list with agency officials, provided our definition of a STEM education program, and asked officials to make an initial determination about which programs should remain on the list and which should be added. If agency officials indicated they wanted to remove a program from our list, we asked for additional information; for example, a program on our initial list may have been terminated or consolidated or may not have received federal funds in fiscal year 2010. In addition, we asked officials to provide program descriptions, program names, and contact information. Next, we reviewed each agency's submission, including the individual program information and determinations. We also gathered additional information on the programs, mainly through agency websites and program materials, and held discussions with program officials to understand the programs in more detail. On the basis of this additional information, we excluded programs that we found did not meet our definition of a STEM education program. Once our determinations were made, we asked each agency to confirm the list of programs and the names and contact information for the officials who would be responsible for completing the survey. In total, we determined that 274 programs should receive a survey.[Footnote 40] We also included several screening questions in the survey to provide additional verification that the programs met our definition of a STEM education program. Nineteen programs did not pass our screening questions and therefore were excluded from our analysis. All in all, 209 programs were included in our final analysis. For a list of the 209 STEM education programs by agency, see appendix II. For a summary of excluded programs and their exclusion rationales, see table 3. Furthermore, we provide aggregate survey responses from these programs in an e-supplement (GAO-12-110SP).

Table 3: Reasons for Exclusion of STEM Education Activities from Our Survey: Exclusion rationale: Program did not receive congressional appropriation or allocation in fiscal year 2010; Number of programs: 59. Exclusion rationale: Program was consolidated or part of another program; Number of programs: 48. Exclusion rationale: Program focused on professionals or postdoctoral positions; Number of programs: 41. Exclusion rationale: Entry was duplicative or not recognized by agency officials; Number of programs: 34. Exclusion rationale: Nonprogrammatic STEM activities; Number of programs: 24. Exclusion rationale: Program for which STEM is not a primary purpose; Number of programs: 17. Exclusion rationale: Research-related program, not focused on STEM education; Number of programs: 8. Exclusion rationale: Program is focused on patient care; Number of programs: 8. Exclusion rationale: General awareness program not focused on students or teachers; Number of programs: 1. Total entries excluded: Number of programs: 240. Source: GAO. [End of table]

Survey: Design and Implementation: We developed a web-based survey to collect information on federal STEM education programs. See GAO-12-110SP for a copy of the survey's full text. The survey included questions on program objectives, target groups served, services provided, academic fields of focus, output metrics, outcome measures, obligations, and program evaluations.
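To make the program selection and screening logic described above concrete, the following minimal sketch shows one way such a check could be expressed in code. It is purely illustrative: the field names, objective labels, and example records are hypothetical and simplified, and do not reproduce GAO's actual survey instrument or data.

```python
# Illustrative sketch only: a simplified screening check loosely based on the
# STEM education program definition and screening questions described above.
# All field names, objective labels, and records are hypothetical.

QUALIFYING_OBJECTIVES = {
    "attract or prepare students for STEM coursework",
    "attract students to STEM degrees",
    "provide STEM training for undergraduate or graduate students",
    "attract graduates to STEM careers",
    "improve STEM teacher education",
    "improve institutional capacity for STEM education",
    "conduct research on STEM education quality",
}

def passes_screening(program: dict) -> bool:
    """Return True if a hypothetical program record meets the simplified criteria."""
    if not program.get("fy2010_appropriation_or_allocation"):
        return False  # no fiscal year 2010 funding -> excluded
    if program.get("primarily_patient_care"):
        return False  # patient-care training -> excluded
    # At least one primary objective must be a qualifying STEM education objective.
    return bool(set(program.get("primary_objectives", [])) & QUALIFYING_OBJECTIVES)

# Hypothetical example records
candidates = [
    {"name": "Example teacher institute",
     "fy2010_appropriation_or_allocation": True,
     "primary_objectives": ["improve STEM teacher education"]},
    {"name": "Example clinical residency",
     "fy2010_appropriation_or_allocation": True,
     "primarily_patient_care": True,
     "primary_objectives": ["attract graduates to STEM careers"]},
]

included = [p["name"] for p in candidates if passes_screening(p)]
print(included)  # -> ['Example teacher institute']
```

In practice, as described above, such screening rested on agency consultation, program documentation, and survey screening questions rather than on an automated rule.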
To minimize errors arising from differences in how questions might be interpreted and to reduce variability in responses that should be qualitatively the same, we conducted pretests with 14 different programs in March and April 2011. To ensure that we obtained a variety of perspectives on our survey, we selected 14 programs from 11 different agencies that differed in program scope, objectives, services provided, target groups served, evaluations completed, and funding sources. We included budget staff as well as program officials in the pretests to ensure budget-related terms in the survey were understandable and available. An independent GAO reviewer also reviewed a draft of the survey prior to its administration. On the basis of feedback from these pretests and independent review, we revised the survey in order to improve its clarity. After completing the pretests, we administered the survey. On May 3, 2011, we sent an e-mail announcement of the survey to the officials responsible for the programs selected for our review, notifying them that our online survey would be activated within a week. On May 11, 2011, we sent a second e-mail message to officials that informed them that the survey was available online. In that e-mail message, we also provided them with unique passwords and usernames. We made telephone calls to officials and sent them follow-up e-mail messages, as necessary, to clarify their responses or obtain additional information. We received completed surveys from 269 programs, for a 100 percent response rate. We collected survey responses through August 31, 2011.

Analysis of Responses and Data Quality: We used standard descriptive statistics to analyze responses to the survey. Because this was not a sample survey, there were no sampling errors. To minimize other types of errors, commonly referred to as nonsampling errors, and to enhance data quality, we employed recognized survey design practices in the development of the survey and in the collection, processing, and analysis of the survey data. For instance, as previously mentioned, we pretested the survey with federal officials to minimize errors arising from differences in how questions might be interpreted and to reduce variability in responses that should be qualitatively the same. We further reviewed the survey to ensure the ordering of survey sections was appropriate and that the questions within each section were clearly stated and easy to comprehend. To reduce nonresponse bias, another source of nonsampling error, we sent out e-mail reminder messages to encourage officials to complete the survey. In reviewing the survey data, we performed automated checks to identify inappropriate answers. We further reviewed the data for missing or ambiguous responses and followed up with agency officials when necessary to clarify their responses. To assess output measures, we asked a series of questions about each agency's procedures, policies, and internal controls for ensuring the quality of the data provided in the survey. For program obligations questions, we sampled 10 percent of responses and reviewed documentary evidence to corroborate them. For evaluation questions, we reviewed the program evaluations provided to corroborate survey responses. To assess the reliability of data provided in our survey, we incorporated questions about the reliability of the programs' data systems, reviewed documentation for a sample of selected questions, conducted internal reliability checks, and conducted follow-up as necessary.
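The following minimal sketch illustrates, in simplified form, the kind of descriptive analysis, automated consistency checks, and 10 percent verification sample described above. The records, field names, and checks are hypothetical and are included only to make the steps concrete; they do not reproduce GAO's actual data or procedures.

```python
# Illustrative sketch only: simplified descriptive statistics, automated checks,
# and a 10 percent verification sample. All records below are hypothetical.
import random
import statistics

responses = [
    {"program": "Program A", "obligations": 1_200_000, "students_served": 350},
    {"program": "Program B", "obligations": 15_000_000, "students_served": None},
    {"program": "Program C", "obligations": None, "students_served": 40},
    {"program": "Program D", "obligations": 700_000, "students_served": 120},
    # ... in practice, one record per surveyed program
]

# Automated check: flag missing or implausible obligations for follow-up.
flagged = [r["program"] for r in responses
           if r["obligations"] is None or r["obligations"] < 0]
print("follow up needed:", flagged)

# Descriptive statistics on the remaining (unflagged) obligations.
valid = [r["obligations"] for r in responses
         if r["obligations"] is not None and r["obligations"] >= 0]
print("mean:", statistics.mean(valid), "median:", statistics.median(valid))

# Draw a 10 percent sample (at least one record) for documentary corroboration.
random.seed(0)  # fixed seed so the sketch is reproducible
sample_size = max(1, round(0.10 * len(responses)))
to_verify = random.sample(responses, sample_size)
print("verify against source documents:", [r["program"] for r in to_verify])
```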
While we did not verify all responses, on the basis of our application of recognized survey design practices and follow-up procedures, we determined that the data used in this report were of sufficient quality for our purposes. We did not report on data that we found of questionable reliability based on our review of data reliability questions in the survey--such as the number of students and teachers served. All data analysis programs were also independently verified by a GAO data analyst for accuracy.

Performance Evaluations: Program officials who indicated in their survey responses that an evaluation of their program had been completed in 2005 or later provided us with information about their most recent evaluations. GAO defines "evaluation" as an individual systematic study conducted periodically or on an ad hoc basis to assess how well a program is working. Studies are often conducted by experts external to the program, inside or outside the agency, as well as by program managers. Furthermore, an evaluation typically examines achievement of program objectives in the context of other aspects of program performance or in the context in which it occurs.[Footnote 41] After ensuring that the evaluations met this definition, we reviewed them to analyze their characteristics, including their methods and designs, and the extent to which program outcomes were measured. In addition, we examined whether the methods and designs were appropriate given the evaluation questions and program context. In total, 61 programs responded that they had completed a program evaluation since 2005, and we reviewed evaluations from 35 of those programs. We requested that officials provide us with a citation for their most recent evaluation, and we selected that evaluation for our review. We did not review evaluations from the remaining 26 programs for a variety of reasons. Specifically, they were committee of visitors reports or other types of reports that did not contain evaluation information aligned with the criteria by which we analyzed the other evaluations. In addition, we were unable to obtain 6 of these reports and, as a result, could not analyze them or determine whether they met GAO's definition of an evaluation. For more details about the evaluations in our review, see appendix III.

Agencies' Annual Performance Plans and Reports: We reviewed agencies' fiscal year 2010 required strategic planning documents--performance plans and performance reports--to determine the extent to which they incorporated program-specific and broad-based STEM goals and objectives.[Footnote 42] Some performance plans and reports were done at the agency level, while others were done at other levels, such as the institute or office level--in which case we reviewed the documents that covered the particular STEM program(s) in our review. When reviewing these documents, we determined the extent to which:
* agencies made any reference to agencywide STEM initiatives or particular STEM education programs, in general, but not in the context of agency goals or of outcome measures;
* agencies connected their STEM initiatives or their individual STEM programs to agency goals; and:
* agencies articulated outcome measures of their STEM initiatives or of individual STEM programs.

[End of section] Appendix II: List of STEM Education Programs with Fiscal Year 2010 Obligations: Agency: NASA; Program: Aeronautics Research Directorate-STEM Education activities; Fiscal Year 2010 STEM education program obligations[A]: $4,153,000.
Program: Exploration Systems Directorate-STEM Education activities; Fiscal Year 2010 STEM education program obligations[A]: $6,400,000. Program: Higher Education; Fiscal Year 2010 STEM education program obligations[A]: $18,346,329. Program: K-12 STEM Program; Fiscal Year 2010 STEM education program obligations[A]: $36,291,069. Program: Minority University Research and Education Program; Fiscal Year 2010 STEM education program obligations[A]: $28,862,619. Program: NASA Informal Education Opportunities (NIEO); Fiscal Year 2010 STEM education program obligations[A]: $14,295,934. Program: NASA Science Mission Directorate E/PO; Fiscal Year 2010 STEM education program obligations[A]: $30,057,100. Program: Space Grant/EPSCoR Program; Fiscal Year 2010 STEM education program obligations[A]: $68,910,696. Program: Space Operations Directorate-STEM Education activities; Fiscal Year 2010 STEM education program obligations[A]: 2,293,000. Agency: National Science Foundation; Program: Advanced Technological Education (ATE); Fiscal Year 2010 STEM education program obligations[A]: $64,510,000. Program: Alliances for Graduate Education and the Professoriate (AGEP); Fiscal Year 2010 STEM education program obligations[A]: $16,730,000. Program: Broadening Participation in Computing (BPC); Fiscal Year 2010 STEM education program obligations[A]: $14,000,000. Program: Centers for Ocean Science Education Excellence; Fiscal Year 2010 STEM education program obligations[A]: $5,700,000. Program: CISE Pathways to Revitalized Undergraduate Computing Education (CPATH); Fiscal Year 2010 STEM education program obligations[A]: $4,370,000. Program: Cyberinfrastructure Training, Education, Advancement, and Mentoring for Our 21st Century Workforce (CI-TEAM); Fiscal Year 2010 STEM education program obligations[A]: $4,850,000. Program: Discovery Research K-12 (DR-K12); Fiscal Year 2010 STEM education program obligations[A]: $118,380,000. Program: East Asia & Pacific Summer Institutes for U.S. Graduate Students (EAPSI); Fiscal Year 2010 STEM education program obligations[A]: $1,740,000. Program: Engineering Education (EE); Fiscal Year 2010 STEM education program obligations[A]: $13,740,000. Program: Enhancing the Mathematical Sciences Workforce in the 21st Century (EMSW21); Fiscal Year 2010 STEM education program obligations[A]: $15,070,000. Program: Ethics Education in Science & Engineering (EESE); Fiscal Year 2010 STEM education program obligations[A]: $2,650,000. Program: Federal Cyber Service: Scholarship for Service (SFS); Fiscal Year 2010 STEM education program obligations[A]: $14,870,000. Program: Geoscience Education; Fiscal Year 2010 STEM education program obligations[A]: $2,020,000. Program: Geoscience Teacher Training (GEO-Teach); Fiscal Year 2010 STEM education program obligations[A]: $2,980,000. Program: Global Learning and Observations to Benefit the Environment (GLOBE); Fiscal Year 2010 STEM education program obligations[A]: $1,100,000. Program: Graduate Research Fellowship Program (GRFP); Fiscal Year 2010 STEM education program obligations[A]: $136,130,000. Program: Graduate STEM Fellows in K-12 Education Program (GK-12); Fiscal Year 2010 STEM education program obligations[A]: $55,970,000. Program: Historically Black Colleges and Universities Undergraduate Program (HBCU-UP); Fiscal Year 2010 STEM education program obligations[A]: $32,060,000. Program: Informal Science Education (ISE); Fiscal Year 2010 STEM education program obligations[A]: $65,850,000. 
Program: Integrative Graduate Education and Research Traineeship (IGERT) Program; Fiscal Year 2010 STEM education program obligations[A]: $69,700,000. Program: Interdisciplinary Training for Undergraduates in Biological and Mathematical Sciences (UBM); Fiscal Year 2010 STEM education program obligations[A]: $2,700,000. Program: International Research Experiences for Students (IRES); Fiscal Year 2010 STEM education program obligations[A]: $3,430,000. Program: Louis Stokes Alliances for Minority Participation (LSAMP); Fiscal Year 2010 STEM education program obligations[A]: $44,550,000. Program: Math and Science Partnership; Fiscal Year 2010 STEM education program obligations[A]: $57,930,000. Program: Nanotechnology Undergraduate Education in Engineering; Fiscal Year 2010 STEM education program obligations[A]: $1,830,000. Program: Opportunities for Enhancing Diversity in the Geosciences; Fiscal Year 2010 STEM education program obligations[A]: $4,180,000. Program: Polar Education Program; Fiscal Year 2010 STEM education program obligations[A]: $1,500,000. Program: Research and Evaluation on Education in Science and Engineering (REESE); Fiscal Year 2010 STEM education program obligations[A]: $45,670,000. Program: Research Experiences for Teachers (RET) in Engineering and Computer Science; Fiscal Year 2010 STEM education program obligations[A]: $5,410,000. Program: Research Experiences for Undergraduates (REU); Fiscal Year 2010 STEM education program obligations[A]: $80,990,000. Program: Research in Disabilities Education (RDE); Fiscal Year 2010 STEM education program obligations[A]: $6,920,000. Program: Research on Gender in Science and Engineering (GSE); Fiscal Year 2010 STEM education program obligations[A]: $11,570,000. Program: Robert Noyce Teacher Scholarship Program; Fiscal Year 2010 STEM education program obligations[A]: $54,930,000. Program: Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP); Fiscal Year 2010 STEM education program obligations[A]: $31,640,000. Program: Transforming Undergrad Education in STEM (TUES); Fiscal Year 2010 STEM education program obligations[A]: $41,600,000. Program: Tribal Colleges and Universities Program (TCUP); Fiscal Year 2010 STEM education program obligations[A]: $13,350,000. Program: Undergraduate Research and Mentoring in the Biological Sciences (URM); Fiscal Year 2010 STEM education program obligations[A]: $9,000,000. Agency: Nuclear Regulatory Commission; Program: Integrated University Program; Fiscal Year 2010 STEM education program obligations[A]: $15,000,000. Program: Minority Serving Institutions Program (MSIP); Fiscal Year 2010 STEM education program obligations[A]: $2,838,500. Program: Nuclear Education Curriculum Development Grants; Fiscal Year 2010 STEM education program obligations[A]: 4,700,997. Agency: Department of Agriculture; Animal and Plant Health Inspection Service (APHIS); Program: AgDiscovery Program; Fiscal Year 2010 STEM education program obligations[A]: $15,000. National Institute of Food and Agriculture (NIFA); Program: 1890 Institution Teaching, Research and Extension Capacity Building Grants Program; Fiscal Year 2010 STEM education program obligations[A]: $17,167,994. Program: Agriculture in the Classroom; Fiscal Year 2010 STEM education program obligations[A]: $314,912. Program: Food and Agricultural Sciences National Needs Graduate and Postdoctoral Fellowships Grants Program; Fiscal Year 2010 STEM education program obligations[A]: $3,664,127. 
Program: Higher Education Challenge Grants Program; Fiscal Year 2010 STEM education program obligations[A]: $5,654,000. Program: Higher Education Multicultural Scholars Program; Fiscal Year 2010 STEM education program obligations[A]: $1,126,000. Program: Hispanic Education Partnership Grants; Fiscal Year 2010 STEM education program obligations[A]: $8,809,568. Program: New Era Rural Technology Competitive Grants Program; Fiscal Year 2010 STEM education program obligations[A]: $875,000. Program: Resident Instruction Grants for Institutions of Higher Education in Insular Areas; Fiscal Year 2010 STEM education program obligations[A]: $859,547. Program: Secondary Education, Two-Year Postsecondary Education and Agriculture in the K-12 Classroom Grants; Fiscal Year 2010 STEM education program obligations[A]: $983,000. Agency: Office of the Assistant Secretary for Departmental Management; Program: USDA/1890 National Scholars Program; Fiscal Year 2010 STEM education program obligations[A]: $2,398,947. Agency: Department of Commerce: Agency: National Institute of Standards and Technology (NIST); Program: NIST Summer Institute for Middle School Science Teachers; Fiscal Year 2010 STEM education program obligations[A]: $300,000. Program: Recovery Act Measurement Science and Engineering Fellowship Program; Fiscal Year 2010 STEM education program obligations[A]: $20,000,000. Program: Summer Undergraduate Research Fellowship (SURF) Program; Fiscal Year 2010 STEM education program obligations[A]: $595,641. Agency: National Oceanic and Atmospheric Administration (NOAA); Program: Bay Watershed Education and Training (B-WET) Program; Fiscal Year 2010 STEM education program obligations[A]: $9,700,000. Program: Climate Communications and Education Program; Fiscal Year 2010 STEM education program obligations[A]: $536,000. Program: Coral Reef Conservation Program; Fiscal Year 2010 STEM education program obligations[A]: $838,000. Program: Dr. Nancy Foster Scholarship Program; Fiscal Year 2010 STEM education program obligations[A]: $603,125. Program: Educational Partnership Program with Minority Serving Institutions; Fiscal Year 2010 STEM education program obligations[A]: $14,309,000. Program: Environmental Literacy Grants; Fiscal Year 2010 STEM education program obligations[A]: $10,388,185. Program: Ernest F. Hollings Undergraduate Scholarship Program; Fiscal Year 2010 STEM education program obligations[A]: $6,450,638. Program: Global Learning and Observations to Benefit the Environment; Fiscal Year 2010 STEM education program obligations[A]: $3,000,000. Program: National Environmental Satellite, Data, and Information Service (NESDIS) Education; Fiscal Year 2010 STEM education program obligations[A]: $2,700,000. Program: National Estuarine Research Reserve System Education Program; Fiscal Year 2010 STEM education program obligations[A]: $1,020,000. Program: National Marine Fisheries Service (NMFS) Education; Fiscal Year 2010 STEM education program obligations[A]: $3,084,750. Program: National Marine Sanctuaries Education Program; Fiscal Year 2010 STEM education program obligations[A]: $908,150. Program: National Ocean Service (NOS) Education; Fiscal Year 2010 STEM education program obligations[A]: $426,000. Program: National Sea Grant College Program-Education Component; Fiscal Year 2010 STEM education program obligations[A]: $9,378,529. Program: National Weather Service Outreach Program; Fiscal Year 2010 STEM education program obligations[A]: $3,070,000. 
Program: Teacher at Sea Program; Fiscal Year 2010 STEM education program obligations[A]: $600,000. Agency: Department of Defense: Agency: Air Force; Program: Awards to Stimulate and Support Undergraduate Research Experience (ASSURE); Fiscal Year 2010 STEM education program obligations[A]: $4,500,000. Program: National Defense Science and Engineering Graduate (NDSEG) Fellowship; Fiscal Year 2010 STEM education program obligations[A]: $38,695,132. Program: University NanoSatellite Program; Fiscal Year 2010 STEM education program obligations[A]: $660,000. Agency: Army; Program: Army Educational Outreach Program (AEOP); Fiscal Year 2010 STEM education program obligations[A]: $7,885,000. Program: Consortium Research Fellows Program (CRFP); Fiscal Year 2010 STEM education program obligations[A]: $1,634,050. Program: National Science Center (NSC); Fiscal Year 2010 STEM education program obligations[A]: $1,982,000. Agency: Office of the Secretary of Defense; Program: Autonomous Robotic Manipulation (ARM); Fiscal Year 2010 STEM education program obligations[A]: $8,180,000. Program: Computer Science in Science, Technology, Engineering, and Mathematics Education (CS-STEM); Fiscal Year 2010 STEM education program obligations[A]: $2,661,000. Program: DoD STARBASE Program; Fiscal Year 2010 STEM education program obligations[A]: $20,000,000. Program: ENGAGE; Fiscal Year 2010 STEM education program obligations[A]: $2,100,000. Program: National Defense Education Program (NDEP) K-12; Fiscal Year 2010 STEM education program obligations[A]: $13,595,000. Program: National Defense Education Program (NDEP) Science, Mathematics And Research for Transformation (SMART); Fiscal Year 2010 STEM education program obligations[A]: $47,400,000. Agency: Military Health System; Program: Uniformed Services University of the Health Sciences (USUHS); Fiscal Year 2010 STEM education program obligations[A]: $447,000. Agency: Navy; Program: Historically Black College and Universities/Minority Institutions Research Education Partnership; Fiscal Year 2010 STEM education program obligations[A]: $700,000. Program: Iridescent Learning; Fiscal Year 2010 STEM education program obligations[A]: $810,000. Program: Science and Engineering Apprentice Program (SEAP); Fiscal Year 2010 STEM education program obligations[A]: $755,000. Program: SeaPerch; Fiscal Year 2010 STEM education program obligations[A]: $700,000. Program: The Naval Research Enterprise Intern Program (NREIP); Fiscal Year 2010 STEM education program obligations[A]: $1,960,000. Program: University/Laboratory Initiative (ULI); Fiscal Year 2010 STEM education program obligations[A]: $2,350,000. Agency: Department of Education; Program: Developing Hispanic-Serving Institutions: STEM and Articulation Programs (mandatory); Fiscal Year 2010 STEM education program obligations[A]: 0[B]. Program: Graduate Assistance in Areas of National Need; Fiscal Year 2010 STEM education program obligations[A]: $31,005,248. Program: Mathematics and Science Partnerships; Fiscal Year 2010 STEM education program obligations[A]: $180,478,000. Program: Minority Science and Engineering Improvement Program; Fiscal Year 2010 STEM education program obligations[A]: $9,503,000. Program: National Science and Mathematics Access to Retain Talent Program; Fiscal Year 2010 STEM education program obligations[A]: $379,775,972. Program: Predominantly Black Institutions Competitive Grant Program; Fiscal Year 2010 STEM education program obligations[A]: 0[B]. 
Program: Research in Special Education; Fiscal Year 2010 STEM education program obligations[A]: $11,000,000. Program: Research, Development, and Dissemination; Fiscal Year 2010 STEM education program obligations[A]: $39,986,940. Program: Teachers for a Competitive Tomorrow: Baccalaureate Degrees in STEM and Critical Foreign Languages; Fiscal Year 2010 STEM education program obligations[A]: $1,092,000. Program: Teachers for a Competitive Tomorrow: Master's Degrees in STEM and Critical Foreign Languages; Fiscal Year 2010 STEM education program obligations[A]: $1,092,000. Program: Upward Bound Math-Science; Fiscal Year 2010 STEM education program obligations[A]: $34,873,057. Program: Women's Educational Equity; Fiscal Year 2010 STEM education program obligations[A]: $2,423,000. Agency: Department of Energy; Program: Academies Creating Teacher Scientists (DOE Acts); Fiscal Year 2010 STEM education program obligations[A]: $3,721,600. Program: Advanced Vehicle Competitions; Fiscal Year 2010 STEM education program obligations[A]: $2,000,000. Program: American Chemical Society Summer School in Nuclear and Radiochemistry; Fiscal Year 2010 STEM education program obligations[A]: $546,813. Program: ASCR-ORNL Research Alliance in Math and Science; Fiscal Year 2010 STEM education program obligations[A]: $250,000. Program: Community College Institute of Science and Technology; Fiscal Year 2010 STEM education program obligations[A]: $685,000. Program: Computational Science Graduate Fellowship; Fiscal Year 2010 STEM education program obligations[A]: $7,800,000. Program: Faculty and Student Teams; Fiscal Year 2010 STEM education program obligations[A]: $1,019,000. Program: Fusion Energy Sciences Graduate Fellowship Program; Fiscal Year 2010 STEM education program obligations[A]: $800,000. Program: Graduate Automotive Technology Education; Fiscal Year 2010 STEM education program obligations[A]: $1,000,000. Program: Hampton University Graduate Studies; Fiscal Year 2010 STEM education program obligations[A]: $48,000. Program: HBCU Mathematics, Science & Technology, Engineering and Research Workforce Development Program; Fiscal Year 2010 STEM education program obligations[A]: $8,967,507. Program: Industrial Assessment Centers; Fiscal Year 2010 STEM education program obligations[A]: $6,086,000. Program: Integrated University Program; Fiscal Year 2010 STEM education program obligations[A]: $5,000,000. Program: Laboratory Equipment Donation Program; Fiscal Year 2010 STEM education program obligations[A]: $150,000. Program: Mickey Leland Energy Fellowship; Fiscal Year 2010 STEM education program obligations[A]: $700,000. Program: Minority Serving Institutions Program; Fiscal Year 2010 STEM education program obligations[A]: $840,000. Program: Minority University Research Associates Program (MURA); Fiscal Year 2010 STEM education program obligations[A]: $591,880. Program: National Science Bowl; Fiscal Year 2010 STEM education program obligations[A]: $2,449,900. Program: National Undergraduate Fellowship Program in Plasma Physics and Fusion Energy Sciences; Fiscal Year 2010 STEM education program obligations[A]: $370,000. Program: Office of Science Graduate Fellowship (SCGF) program; Fiscal Year 2010 STEM education program obligations[A]: $17,500,000. Program: Pan American Advanced Studies Institute; Fiscal Year 2010 STEM education program obligations[A]: $200,000. Program: Plasma/Fusion Science Educator Programs; Fiscal Year 2010 STEM education program obligations[A]: $779,000. 
Program: Pre-Service Teacher Program; Fiscal Year 2010 STEM education program obligations[A]: $429,000. Program: QuarkNet; Fiscal Year 2010 STEM education program obligations[A]: $750,000. Program: Science Undergraduate Laboratory Internships; Fiscal Year 2010 STEM education program obligations[A]: $3,802,500. Program: Solar Decathlon; Fiscal Year 2010 STEM education program obligations[A]: $5,000,000. Program: Summer Applied Geophysical Experience (SAGE); Fiscal Year 2010 STEM education program obligations[A]: $100,000. Program: Technical Career Intern Program; Fiscal Year 2010 STEM education program obligations[A]: 0[C]. Program: Wind for Schools; Fiscal Year 2010 STEM education program obligations[A]: $630,000. Agency: Department of Health and Human Services; Agency: Health Resources and Services Administration; Program: Health Careers Opportunity Program; Fiscal Year 2010 STEM education program obligations[A]: $22,086,000. Program: Public Health Traineeship Program; Fiscal Year 2010 STEM education program obligations[A]: $1,510,000. Agency: National Institutes of Health; Program: Bridges to the Baccalaureate Program; Fiscal Year 2010 STEM education program obligations[A]: $6,460,988. Program: Bridges to the Doctorate; Fiscal Year 2010 STEM education program obligations[A]: $2,977,075. Program: Cancer Education Grants Program; Fiscal Year 2010 STEM education program obligations[A]: $6,756,869. Program: Cancer Research Interns; Fiscal Year 2010 STEM education program obligations[A]: $191,608. Program: CCR/JHU Master of Science in Biotechnology Concentration in Molecular Targets and Drug Discovery Technologies; Fiscal Year 2010 STEM education program obligations[A]: $445,000. Program: Clinical Research Training Program; Fiscal Year 2010 STEM education program obligations[A]: $1,100,000. Program: Community College Summer Enrichment Program; Fiscal Year 2010 STEM education program obligations[A]: $105,000. Program: Curriculum Supplement Series; Fiscal Year 2010 STEM education program obligations[A]: $341,849. Program: Education Programs for Population Research (R25); Fiscal Year 2010 STEM education program obligations[A]: $750,154. Program: Graduate Program Partnerships; Fiscal Year 2010 STEM education program obligations[A]: $16,720,000. Program: Initiative for Maximizing Student Development; Fiscal Year 2010 STEM education program obligations[A]: $21,412,146. Program: Intramural NIAID Research Opportunities; Fiscal Year 2010 STEM education program obligations[A]: $129,111. Program: MARC U-STAR NRSA Program; Fiscal Year 2010 STEM education program obligations[A]: $20,386,651. Program: Material Development for Environmental Health Curriculum; Fiscal Year 2010 STEM education program obligations[A]: $1,544,868. Program: National Cancer Institute Cancer Education and Career Development Program; Fiscal Year 2010 STEM education program obligations[A]: $20,442,233. Program: NCRR Science Education Partnership Award (SEPA); Fiscal Year 2010 STEM education program obligations[A]: $16,653,015. Program: NHLBI Minority Undergraduate Biomedical Education Program; Fiscal Year 2010 STEM education program obligations[A]: $475,970. Program: NIAID Science Education Awards; Fiscal Year 2010 STEM education program obligations[A]: $1,069,978. Program: NIDDK Education Program Grants; Fiscal Year 2010 STEM education program obligations[A]: $432,000. Program: NIH Academy; Fiscal Year 2010 STEM education program obligations[A]: $249,866.
Program: NIH Summer Research Experience Programs; Fiscal Year 2010 STEM education program obligations[A]: $1,679,422. Program: NINDS Diversity Research Education Grants in Neuroscience; Fiscal Year 2010 STEM education program obligations[A]: $821,800. Program: NLM Institutional Grants for Research Training in Biomedical Informatics; Fiscal Year 2010 STEM education program obligations[A]: $10,143,676. Program: Office of Science Education K-12 Program; Fiscal Year 2010 STEM education program obligations[A]: $2,270,151. Program: Post-baccalaureate Intramural Research Training Award Program; Fiscal Year 2010 STEM education program obligations[A]: $24,810,000. Program: Postbaccalaureate Research Education Program (PREP); Fiscal Year 2010 STEM education program obligations[A]: $5,780,503. Program: Recovery Act Limited Competition: NIH Challenge Grants in Health and Science Research; Fiscal Year 2010 STEM education program obligations[A]: $4,953,293. Program: Research Scientist Award for Minority Institutions; Fiscal Year 2010 STEM education program obligations[A]: $82,146. Program: Research Supplements to Promote Diversity in Health-Related Research; Fiscal Year 2010 STEM education program obligations[A]: $68,981,252. Program: RISE (Research Initiative for Scientific Enhancement); Fiscal Year 2010 STEM education program obligations[A]: $24,441,722. Program: Ruth L. Kirschstein National Research Service Award Institutional Research Training Grants**(T32, T35); Fiscal Year 2010 STEM education program obligations[A]: $230,840,328. Program: Ruth L. Kirschstein NRSA for Individual Predoctoral Fellows, including Underrepresented Racial/Ethnic Groups, Students from Disadvantaged Backgrounds; Fiscal Year 2010 STEM education program obligations[A]: $56,882,642. Program: Science Education Drug Abuse Partnership Award; Fiscal Year 2010 STEM education program obligations[A]: $2,294,996. Program: Short Courses in Integrative and Organ Systems Pharmacology; Fiscal Year 2010 STEM education program obligations[A]: $665,937. Program: Short Courses on Mathematical, Statistical, and Computational Tools for Studying Biological Systems; Fiscal Year 2010 STEM education program obligations[A]: $695,460. Program: Short Term Educational Experiences for Research (STEER) in the Environmental Health Sciences for Undergraduates and High School Students; Fiscal Year 2010 STEM education program obligations[A]: $568,298. Program: Short-Term Research Education Program to Increase Diversity in Health-Related Research; Fiscal Year 2010 STEM education program obligations[A]: $4,188,763. Program: Student Intramural Research Training Award Program; Fiscal Year 2010 STEM education program obligations[A]: $5,868,500. Program: Summer Genetics Institute; Fiscal Year 2010 STEM education program obligations[A]: $53,935. Program: Summer Institute for Training in Biostatistics; Fiscal Year 2010 STEM education program obligations[A]: $1,449,092. Program: Technical Intramural Research Training Award; Fiscal Year 2010 STEM education program obligations[A]: $2,240,000. Program: Training in Computational Neuroscience: From Biology to Model and Back Again; Fiscal Year 2010 STEM education program obligations[A]: $1,443,450. Program: Training in Neuroimaging: Integrating First Principles and Applications; Fiscal Year 2010 STEM education program obligations[A]: $1,356,252. Program: Undergraduate Scholarship Program for Individuals from Disadvantaged Backgrounds; Fiscal Year 2010 STEM education program obligations[A]: $2,426,137. 
Agency: Department of Homeland Security; Agency: Science and Technology Directorate; Program: HS-STEM Career Development Grants Program; Fiscal Year 2010 STEM education program obligations[A]: $2,300,000. Program: HS-STEM Scholars Program; Fiscal Year 2010 STEM education program obligations[A]: $1,920,000. Program: HS-STEM Summer Internship Program; Fiscal Year 2010 STEM education program obligations[A]: $363,000. Program: Minority Serving Institutions-Scientific Leadership Awards; Fiscal Year 2010 STEM education program obligations[A]: $2,400,000. Program: Minority Serving Institutions-Summer Research Team; Fiscal Year 2010 STEM education program obligations[A]: $116,000. Agency: Department of the Interior; Agency: U.S. Geological Survey (USGS); Program: EDMAP Component of the National Cooperative Geologic Mapping Program; Fiscal Year 2010 STEM education program obligations[A]: $566,161. Program: National Association of Geoscience Teachers (NAGT)-USGS Cooperative Summer Field Training Program; Fiscal Year 2010 STEM education program obligations[A]: $200,000. Program: Student Intern in Support of Native American Relations (SISNAR); Fiscal Year 2010 STEM education program obligations[A]: $204,013. Agency: Department of Transportation; Agency: Federal Aviation Administration (FAA); Program: Joint University Program; Fiscal Year 2010 STEM education program obligations[A]: $300,000. Program: National Center of Excellence for Aviation Operations Research (NEXTOR); Fiscal Year 2010 STEM education program obligations[A]: $5,393,000. Agency: Federal Highway Administration (FHWA); Program: Garrett A. Morgan Technology and Transportation Education Program; Fiscal Year 2010 STEM education program obligations[A]: $1,250,000. Program: National Summer Transportation Institute Program; Fiscal Year 2010 STEM education program obligations[A]: $2,602,999. Program: Summer Transportation Internship Program for Diverse Groups; Fiscal Year 2010 STEM education program obligations[A]: $1,425,000. Agency: Research and Innovative Technology Administration (RITA); Program: University Transportation Centers Program; Fiscal Year 2010 STEM education program obligations[A]: $83,370,600. Agency: Environmental Protection Agency; Program: Cooperative Training in Environmental Sciences Research; Fiscal Year 2010 STEM education program obligations[A]: $1,593,184. Program: Environmental Education Grants; Fiscal Year 2010 STEM education program obligations[A]: $3,450,882. Program: EPA Greater Research Opportunities (GRO) Fellowships for Undergraduate Environmental Study; Fiscal Year 2010 STEM education program obligations[A]: $1,532,099. Program: EPA Marshall Scholars Program; Fiscal Year 2010 STEM education program obligations[A]: $205,888. Program: National Environmental Education and Training Partnership; Fiscal Year 2010 STEM education program obligations[A]: $2,259,500. Program: National Network for Environmental Management Studies Fellowship Program; Fiscal Year 2010 STEM education program obligations[A]: $469,403. Program: P3 Award: National Student Design Competition for Sustainability; Fiscal Year 2010 STEM education program obligations[A]: $2,000,000. Program: President's Environmental Youth Awards; Fiscal Year 2010 STEM education program obligations[A]: $50,000. Program: Science to Achieve Results Graduate Fellowship Program; Fiscal Year 2010 STEM education program obligations[A]: $6,387,830. 
Program: University of Cincinnati/EPA Research Training Grant; Fiscal Year 2010 STEM education program obligations[A]: $333,153. 
Source: GAO analysis of survey results. 
[A] This number equals the total program obligations for fiscal year 2010, unless the survey respondent provided obligations for the STEM-only activities within the program. 
[B] Program funding was authorized in 2010, but was not obligated until 2011. 
[C] Fiscal year 2010 obligations for the Technical Career Intern Program are reflected in the Mickey Leland Program. 
[End of table] 
[End of section] 
Appendix III: Review of Evaluations: 
Different types of evaluation designs can provide rigorous evidence of effectiveness if designed well and implemented with a thorough understanding of their vulnerability to potential sources of bias. There are four main types of evaluations that GAO has identified: 
* Implementation evaluations (which assess the extent to which the program is operating as intended), 
* Impact evaluations (which include experimental and quasi-experimental designs), 
* Outcome evaluations (which assess the extent to which a program achieves its objectives), and 
* Cost-benefit and cost-effectiveness analyses (which compare a program's outputs or outcomes with the costs to produce them).[Footnote 43] 
Deciding which evaluation type to use involves a variety of considerations, as no single evaluation type is suitable for all programs. For instance, as we have previously reported, an impact evaluation is more likely to provide useful information about what works when the intervention consists of clearly defined activities and goals and has been well implemented.[Footnote 44] One type of impact evaluation--the quasi-experimental comparison group design--which compares outcomes for program participants with those of a similar group not in the program, is used in instances when random assignment to the participant and nonparticipant groups is not possible, ethical, or practical. It is most successful in providing credible estimates of program effectiveness when the groups are formed in parallel ways and are not based on self-selection. On the other hand, case studies are recommended for assessing the effectiveness of complex interventions only in limited circumstances, such as when a comprehensive reform is so deeply integrated with its context (for example, the community) that no truly adequate comparison case can be found. Furthermore, every research method has inherent limitations; therefore, it is often advantageous to combine multiple measures or two or more designs in a study or group of studies to obtain a more comprehensive picture of the program's effect. As we have also previously reported, the evaluation methods literature describes a variety of issues to consider in planning which methods to use in carrying out an evaluation, including the expected use of the evaluation, the nature and implementation of program activities, and the resources available for the evaluation.
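To make the quasi-experimental comparison group design described above concrete, the short Python sketch below computes a simple difference-in-means estimate of a program effect with a conventional standard error. It is a minimal illustration only: the outcome measure, group sizes, and values are hypothetical and are not drawn from any evaluation GAO reviewed. 

# Illustration only: hypothetical data, not drawn from any evaluation GAO reviewed.
# A quasi-experimental comparison group design compares outcomes for program
# participants with outcomes for a similar group that did not participate.
from math import sqrt
from statistics import mean, stdev

# Hypothetical binary outcome: 1 = attained a STEM degree, 0 = did not.
participants = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]  # program participants
comparison = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]    # matched nonparticipants

def difference_in_means(treated, control):
    """Estimated program effect: difference in mean outcomes between the groups."""
    return mean(treated) - mean(control)

def standard_error(treated, control):
    """Conventional standard error for a difference in independent means."""
    return sqrt(stdev(treated) ** 2 / len(treated) +
                stdev(control) ** 2 / len(control))

effect = difference_in_means(participants, comparison)
se = standard_error(participants, comparison)
print(f"Estimated program effect: {effect:.2f} (standard error: {se:.2f})")

As noted above, such an estimate is credible only when the comparison group is formed in a parallel way; in practice, evaluators typically also adjust for observable differences between the groups (for example, through matching or regression), because self-selection can bias a raw difference in means.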
We identified the following methods and designs of evaluation in our review, which may be used to carry out one or more of the main types of evaluation listed above: * committee of visitors, and other report types, which are generally external peer reviews that examine programs' managerial stewardship, compare plans with progress made, and evaluate outcomes to determine whether the research contributes to the agency's mission and goals; * experimental methods, which involve randomly assigning one group to a program and another to not participate in the program in order to compare outcomes of both groups; * mixed methods, which combine qualitative and quantitative designs; * qualitative, such as interviews or focus groups; * surveys, which involve the systematic collection of data from a respondent using a structured instrument (i.e., a questionnaire) to ensure that the collected data are as accurate as possible; and: * quasi-experimental comparison groups. In addition, there were two evaluations based solely on a compilation of grantee reports. As stated previously, other evaluations also used grantee evaluations, but these used other data sources to inform their results, and so were classified as using either mixed or qualitative methods. The most common evaluation designs that we classified programs as using were the committee of visitors and mixed methods. We reviewed 35 evaluations from the following agencies and programs, and determined their primary method for assessing effectiveness: Table 4: Program Evaluations and Evaluation Methods: Agency: Department of Commerce, National Institute of Standards and Technology; Program: Summer Institute for Middle School Science Teachers; Evaluation: Evaluation of the National Institute of Standards and Technology's (NIST) Summer Institute Year 3 Report; Methods: Mixed. Agency: Department of Commerce, National Institute of Standards and Technology; Program: Summer Undergraduate Research Fellowship Program; Evaluation: NIST SURF Program Assessment; Methods: Survey. Agency: Department of Commerce, National Oceanic and Atmospheric Administration; Program: Bay Watershed Education and Training Program (B-WET); Evaluation: An Evaluation of National Oceanic and Atmospheric Administration Chesapeake Bay Watershed Education and Training Program Meaningful Watershed Educational Experiences; Methods: Mixed. Agency: Department of Commerce, National Oceanic and Atmospheric Administration; Program: Global Learning and Observations to Benefit the Environment (GLOBE) Program; Evaluation: GLOBE 10 Year Evaluation: Into the Next Generation; Methods: Quasi-experimental. Agency: Department of Commerce, National Oceanic and Atmospheric Administration; Program: Teacher at Sea Program; Evaluation: Evaluation Report of the NOAA Teacher at Sea Program: 2005- 2009; Methods: Mixed. Agency: Department of Defense; Program: National Defense Education Program (NDEP) K-12; Evaluation: Recommended Resources for the National Defense Education Program Pre-Engineering Partnerships; Methods: Qualitative. Agency: Department of Defense; Program: DoD STARBASE Program; Evaluation: DoD STARBASE Program: 2010 Annual Report; Methods: Mixed. Agency: Department of Defense; Program: Army Educational Outreach Program (AEOP); Evaluation: The Talent Imperative in Science and Technology: An Evaluation of Army Educational Outreach Programs; Methods: Mixed. 
Agency: Department of Education; Program: Graduate Assistance in Areas of National Need; Evaluation: A Study of Four Federal Graduate Fellowship Programs: Education and Employment Outcomes; Methods: Survey. Agency: Department of Education; Program: Mathematics and Science Partnerships; Evaluation: Mathematics and Science Partnerships: Summary of Performance Period 2008 Annual Reports-Analytic and Technical Support for Mathematics and Science Partnerships; Methods: Analysis of grantee evaluations. Agency: Department of Education; Program: National Science and Mathematics Access to Retain Talent Program; Evaluation: Academic Competitiveness and National SMART Grant Programs: 2006-07 and 2007-08; Methods: Mixed. Agency: Department of Education; Program: Upward Bound Math-Science; Evaluation: The Impacts of Upward Bound Math-Science on Postsecondary Outcomes 7-9 Years After Scheduled High School Graduation: Final Report; Methods: Quasi-experimental. Agency: Department of Energy[A]; Program: Science Undergraduate Laboratory Internships; Evaluation: Making Comparisons and Examining Experiences: A Program Evaluation of the Department of Energy's Student Undergraduate Laboratory Internship (SULI) Program; Methods: Mixed. Agency: Department of Health and Human Services, National Institutes of Health; Program: Curriculum Supplement Series; Evaluation: The Relative Effects and Equity of Inquiry-Based and Commonplace Science Teaching on Students' Knowledge, Reasoning, and Argumentation; Methods: Experimental. Agency: National Institutes of Health, Department of Health and Human Services; Program: NIH Summer Research Experience Programs; Evaluation: Stimulating Science Education: NIH Summer Research Program Engages Students and Teachers in Science; Methods: Survey. Agency: National Institutes of Health, Department of Health and Human Services; Program: NIH Undergraduate Scholarship Program for Individuals from Disadvantaged Backgrounds; Evaluation: The NIH Undergraduate Scholarship Program: Career Outcomes of Scholars and Non-Awarded Finalists; Methods: Quasi-experimental. Agency: Environmental Protection Agency; Program: National Environmental Education and Training Partnership; Evaluation: The Third Environmental Education and Training Partnership: Summary of Year 5 Achievements; Methods: Analysis of grantee evaluations. Agency: Environmental Protection Agency; Program: EPA Greater Research Opportunities (GRO) Fellowships for Undergraduate Environmental Study; Evaluation: Review of the Office of Research and Development's Science to Achieve Results (STAR) and Greater Research Opportunities (GRO) Fellowship Programs at the U.S. Environmental Protection Agency; Methods: Mixed. Agency: Environmental Protection Agency; Program: Science to Achieve Results Graduate Fellowship Program; Evaluation: Review of the Office of Research and Development's Science to Achieve Results (STAR) and Greater Research Opportunities (GRO) Fellowship Programs at the U.S. Environmental Protection Agency; Methods: Mixed. Agency: NASA; Program: NASA Informal Education Opportunities (NIEO); Evaluation: NASA Informal Education: Final Report--A Descriptive Analysis of NASA's Informal Education Portfolio: Preliminary Case Studies; Methods: Mixed. Agency: NASA; Program: Space Grant/EPSCoR Program; Evaluation: 20th Year Program Evaluation Executive Summary: National Space Grant College and Fellowship Program; Methods: Mixed. 
Agency: National Science Foundation; Program: Alliances for Graduate Education and the Professoriate (AGEP); Evaluation: National Evaluation of the Alliances for Graduate Education and the Professoriate; Methods: Mixed. Agency: National Science Foundation; Program: CISE Pathways to Revitalized Undergraduate Computing Education (CPATH); Evaluation: Evaluation of CISE Pathways to Revitalized Undergraduate Computer Education (CPATH); Methods: Mixed. Agency: National Science Foundation; Program: Engineering Education (EE); Evaluation: Early Outcomes of the National Science Foundation's Grants Program on "How People Learn Engineering" (HPLE); Methods: Qualitative. Agency: National Science Foundation; Program: Federal Cyber Service: Scholarship for Service; Evaluation: Federal Cyber Service: Scholarship for Service Program Summative Evaluation Report; Methods: Mixed. Agency: National Science Foundation; Program: Integrative Graduate Education and Research Traineeship (IGERT) Program; Evaluation: Evaluation of the National Science Foundation's Integrative Graduate Education and Research Traineeship Program (IGERT): Follow-up Study of IGERT Graduates: Final Report; Methods: Quasi-experimental. Agency: National Science Foundation; Program: Louis Stokes Alliances for Minority Participation (LSAMP); Evaluation: Final Report on the Evaluation of the National Science Foundation Louis Stokes Alliances for Minority Participation Program; Methods: Quasi-experimental. Agency: National Science Foundation; Program: Research Experiences for Teachers (RET) in Engineering and Computer Science; Evaluation: Evaluation of the Research Experiences for Teachers (RET) Program: 2001-2006 Final Report; Methods: Survey. Agency: National Science Foundation; Program: Research Experiences for Undergraduates (REU); Evaluation: A Draft Report to the National Science Foundation: Research Experiences for Undergraduates (REU) in the Directorate for Engineering (ENG): 2003-2006 Participant Survey; Methods: Survey. Agency: National Science Foundation; Program: Research in Disabilities Education (RDE); Evaluation: Research in Disabilities Education Program Evaluation: Study 1 Methods and Results; Methods: Mixed. Agency: National Science Foundation; Program: Robert Noyce Teacher Scholarship Program; Evaluation: Robert Noyce Teacher Scholarship Program: Synopsis of 5 Years of Evaluation; Methods: Mixed. Agency: National Science Foundation; Program: Graduate STEM Fellows in K-12 Education Program (GK-12); Evaluation: Evaluation of the National Science Foundation's GK-12 Program: Final Report, Volumes I and II: Technical Report and Appendices; Methods: Quasi-experimental. Agency: National Science Foundation; Program: The Historically Black Colleges and Universities Undergraduate Program (HBCU-UP); Evaluation: Capacity Building to Diversify STEM Realizing Potential Among HBCUs: Findings from the National Evaluation of the Historically Black Colleges and Universities Undergraduate Program; Methods: Quasi-experimental. Agency: National Science Foundation; Program: Enhancing the Mathematical Sciences Workforce in the 21st Century (EMSW21); Evaluation: Evaluation of NSF's Program of Grants and Vertical Integration of Research and Education in the Mathematical Sciences (VIGRE); Methods: Mixed. Agency: Department of Agriculture; Program: 1890 Institution Teaching and Research Capacity Building Grants Program; Evaluation: Portfolio Annual Report 2008: Education; Methods: Other. Source: GAO analysis of survey results. 
[A] Department of Energy officials also submitted a committee of visitors report, which we list below in table 5. 
[End of table] 
The following are different types of reports, including the committee of visitors, that programs used to assess the effectiveness of their STEM education programs. As stated in appendix I, we considered these to be evaluations but did not review them because they did not align with the criteria we used to assess the evaluations. 
Table 5: Committee of Visitors and Other Types of Reports Used to Assess Program Effectiveness: 
Agency: Department of Defense; Program: Uniformed Services University of the Health Sciences (USUHS); Evaluation: Review of the School of Medicine. Agency: Department of Energy; Program: Academies Creating Teacher Scientists (DOE Acts); Evaluation: Committee of visitors. Agency: Department of Energy; Program: American Chemical Society Summer School in Nuclear and Radiochemistry; Evaluation: Peer-reviewed report. Agency: Department of Energy; Program: ASCR-ORNL Research Alliance in Math and Science; Evaluation: Internal report. Agency: Department of Energy; Program: Community College Institute of Science and Technology; Evaluation: Committee of visitors. Agency: Department of Energy; Program: Faculty and Student Teams; Evaluation: Committee of visitors. Agency: Department of Energy; Program: Hampton University Graduate Studies; Evaluation: Peer-reviewed report. Agency: Department of Energy; Program: Laboratory Equipment Donation Program; Evaluation: Committee of visitors. Agency: Department of Energy; Program: National Science Bowl; Evaluation: Committee of visitors. Agency: Department of Energy; Program: National Undergraduate Fellowship Program in Plasma Physics and Fusion Energy Sciences; Evaluation: Internal report. Agency: Department of Energy; Program: Office of Science Graduate Fellowship (SCGF) Program; Evaluation: Committee of visitors. Agency: Department of Energy; Program: Pan American Advanced Studies Institute; Evaluation: (Program officials did not provide us with the name of this evaluation, but did note that it is jointly supported with NSF and that NSF led the program's peer review). Agency: Department of Energy; Program: Plasma/Fusion Science Educator Programs; Evaluation: Committee of visitors. Agency: Department of Energy; Program: Pre-Service Teacher Program; Evaluation: Committee of visitors. Agency: Department of Energy; Program: Science Undergraduate Laboratory Internships; Evaluation: Committee of visitors. Agency: Department of Energy; Program: Summer Applied Geophysical Experience (SAGE); Evaluation: Peer-reviewed report. Agency: National Institutes of Health, Department of Health and Human Services; Program: Ruth L. Kirschstein National Research Service Award Institutional Research Training Grants (T32, T35); Evaluation: Online performance index. Agency: National Science Foundation; Program: Advanced Technological Education (ATE); Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Discovery Research K-12 (DR-K12); Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Ethics Education in Science and Engineering (EESE); Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Geoscience Education; Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Geoscience Teacher Training (GEO-Teach); Evaluation: Committee of visitors.
Agency: National Science Foundation; Program: Global Learning and Observations to Benefit the Environment (GLOBE); Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Graduate Research Fellowship Program (GRFP); Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Opportunities for Enhancing Diversity in the Geosciences; Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Transforming Undergraduate Education in STEM (TUES); Evaluation: Committee of visitors. Agency: National Science Foundation; Program: Undergraduate Research and Mentoring in the Biological Sciences (URM); Evaluation: Committee of visitors. 
Source: GAO analysis of survey results. 
[End of table] 
[End of section] 
Appendix IV: GAO Contact and Staff Acknowledgments: 
GAO Contact: George A. Scott, (202) 512-7215 or scottg@gao.gov: 
Staff Acknowledgments: The following staff members made key contributions to this report: Bill Keller, Assistant Director; Susan Baxter; James Bennett; Karen Brown; David Chrisinger; Melinda Cordero; Elizabeth Curda; Karen Febey; Jill Lacey; Ben Licht; Amy Radovich; James Rebbe; Nyree Ryder Tee; Martin Scire; Ryan Siegel; and Walter Vance. 
[End of section] 
Related GAO Products: 
Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue. [hyperlink, http://www.gao.gov/products/GAO-11-635T]. Washington, D.C.: May 25, 2011. 
Managing for Results: GPRA Modernization Act Implementation Provides Important Opportunities to Address Government Challenges. [hyperlink, http://www.gao.gov/products/GAO-11-617T]. Washington, D.C.: May 10, 2011. 
Performance Measurement and Evaluation: Definitions and Relationships (Supersedes GAO-05-739SP). [hyperlink, http://www.gao.gov/products/GAO-11-646SP]. Washington, D.C.: May 2011. 
Opportunities to Reduce Potential Duplication in Federal Teacher Quality Programs. [hyperlink, http://www.gao.gov/products/GAO-11-510T]. Washington, D.C.: April 13, 2011. 
Government Performance: GPRA Modernization Act Provides Opportunities to Help Address Fiscal, Performance, and Management Challenges. [hyperlink, http://www.gao.gov/products/GAO-11-466T]. Washington, D.C.: March 16, 2011. 
Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue. [hyperlink, http://www.gao.gov/products/GAO-11-441T]. Washington, D.C.: March 3, 2011. 
Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue. [hyperlink, http://www.gao.gov/products/GAO-11-318SP]. Washington, D.C.: March 1, 2011. 
Program Evaluation: Experienced Agencies Follow a Similar Model for Prioritizing Research. [hyperlink, http://www.gao.gov/products/GAO-11-176]. Washington, D.C.: January 14, 2011. 
America COMPETES Act: It Is Too Early to Evaluate Programs' Long-Term Effectiveness, but Agencies Could Improve Reporting of High-Risk, High-Reward Research Priorities. [hyperlink, http://www.gao.gov/products/GAO-11-127R]. Washington, D.C.: October 7, 2010. 
Federal Education Funding: Overview of K-12 and Early Childhood Education Programs. [hyperlink, http://www.gao.gov/products/GAO-10-51]. Washington, D.C.: January 27, 2010. 
Program Evaluation: A Variety of Rigorous Methods Can Help Identify Effective Interventions. [hyperlink, http://www.gao.gov/products/GAO-10-30]. Washington, D.C.: November 23, 2009. 
Government Performance: Strategies for Building a Results-Oriented and Collaborative Culture in the Federal Government.
[hyperlink, http://www.gao.gov/products/GAO-09-1011T]. Washington, D.C.: September 24, 2009. 
Teacher Quality: Sustained Coordination among Key Federal Education Programs Could Enhance State Efforts to Improve Teacher Quality. [hyperlink, http://www.gao.gov/products/GAO-09-593]. Washington, D.C.: July 6, 2009. 
Higher Education: Federal Science, Technology, Engineering, and Mathematics Programs and Related Trends. [hyperlink, http://www.gao.gov/products/GAO-06-114]. Washington, D.C.: October 12, 2005. 
Results-Oriented Government: Practices That Can Help Enhance and Sustain Collaboration among Federal Agencies. [hyperlink, http://www.gao.gov/products/GAO-06-15]. Washington, D.C.: October 21, 2005. 
[End of section] 
Footnotes: 
[1] GAO, Higher Education: Federal Science, Technology, Engineering, and Mathematics Programs and Related Trends, [hyperlink, http://www.gao.gov/products/GAO-06-114] (Washington, D.C.: Oct. 12, 2005). 
[2] U.S. Department of Education, Report of the Academic Competitiveness Council, Washington, D.C., 2007. 
[3] See appendix I for our definition of a STEM education program. 
[4] In [hyperlink, http://www.gao.gov/products/GAO-06-114], we counted three organizations--National Institutes of Health (NIH), Indian Health Service (IHS), and Health Resources and Services Administration--within the Department of Health and Human Services (HHS) as three separate agencies. In this report, we count all subagencies, agencies, and organizations of cabinet-level departments as one agency. Therefore, NIH and IHS are counted as one agency--HHS--in this report. 
[5] President's Council of Advisors on Science and Technology. Report to the President: Prepare and Inspire: K-12 Education in Science, Technology, Engineering, and Math (STEM) for America's Future. Washington, D.C., September 2010. 
[6] Exec. Order No. 12881 (1993). 
[7] STEM programs may fall under the jurisdiction of several congressional committees, including those that oversee science, space, and technology programs; defense and homeland security programs; and education programs. 
[8] Pub. L. No. 110-315, 122 Stat. 3078 (2008). 
[9] Pub. L. No. 107-110, 115 Stat. 1425 (2002), reauthorizing and amending the Elementary and Secondary Education Act of 1965. 
[10] Pub. L. No. 81-507, 64 Stat. 149. 
[11] Pub. L. No. 109-171, tit. VIII, § 8003, 120 Stat. 4, 155 (2006). 
[12] Pub. L. No. 110-69, 121 Stat. 572 (2007). COMPETES also focused on STEM research programs. 
[13] America COMPETES Reauthorization Act of 2010, Pub. L. No. 111-358, 124 Stat. 3982. 
[14] Pub. L. No. 111-358, § 101, 124 Stat. 3982, 3984. 
[15] Pub. L. No. 111-352, 124 Stat. 3866. 
[16] The GPRA Modernization Act of 2010 uses the term "agency," which is defined as an executive department, a government corporation, or an independent establishment, but does not include the Central Intelligence Agency, the Government Accountability Office, the U.S. Postal Service, and the Postal Regulatory Commission. 5 U.S.C. § 306(f). 
[17] Pub. L. No. 111-139, § 21, 124 Stat. 8, 29 (2010), codified at 31 U.S.C. § 712 note. 
[18] GAO, Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue, GAO-11-318SP (Washington, D.C.: Mar. 1, 2011). An interactive, web-based version of the report is available at: [hyperlink, http://www.gao.gov/ereport/GAO-11-318SP]. 
[19] Our analysis of programs does not include the 29 earmarks that were funded in 2010 because, according to our survey, 25 of these were not funded in 2011.
[20] Nine program officials indicated that they did not know whether the program was created under their agencies' general statutory authority or through congressional direction. 
[21] GAO asked survey respondents to report on obligations--defined as definite commitments that create a legal liability of the government for the payment of goods and services ordered or received, or a legal duty on the part of the United States that could mature into a legal liability. 
[22] These other federal efforts are not included in our analysis of federal STEM education programs because the primary objective is not STEM education, but a secondary or tertiary benefit would be to enhance STEM education. Examples in this section are not intended to be exhaustive, but rather to illustrate the different types of activities that can contribute to furthering STEM education and competitiveness. Further, many agencies administer efforts to promote STEM employment, such as funding postdoctoral students to perform research, which are not included in our review. 
[23] Through the GeoFORCE program, Interior has provided non-financial resources such as speakers, experts, science information materials, and mentoring. The agency has also provided the program with a small amount of financial assistance intermittently; however, the main contribution has been non-financial. 
[24] African-Americans and Hispanics or Latinos were the most frequently identified minority, disadvantaged, or underrepresented groups. 
[25] Committee on STEM Education. National Science and Technology Council. Executive Office of the President of the United States. The Federal Science, Technology, Engineering, and Mathematics (STEM) Education Portfolio. A Report from the Federal Inventory of STEM Education Fast-Track Action Committee. Washington, D.C.: December 2011. 
[26] [hyperlink, http://www.gao.gov/products/GAO-11-318SP]. 
[27] Not all programs use the same definition of administrative costs, and programs reported various methods to develop estimated costs. 
[28] For more details on our review of agencies' annual performance plans and reports, see appendix I. 
[29] In addition to reporting on STEM education programs through their performance plans and performance reports, there may be other ways to report on these efforts. However, our analysis was limited to these two documents. See appendix I for more information on our scope and methodology. 
[30] For more details on our evaluation review, see appendix I and appendix III. 
[31] Four of the evaluations did not have enough information for us to make a determination of the extent to which the methods, questions, objectives, and program context were aligned. 
[32] The National Science and Technology Council, Committee on Science, Subcommittee on Education and Workforce. Review and Appraisal of the Federal Investment in STEM Education Research. October 2006. 
[33] [hyperlink, http://www.gao.gov/products/GAO-06-114]. 
[34] U.S. Department of Education, Report of the Academic Competitiveness Council, Washington, D.C., 2007. 
[35] President's Council of Advisors on Science and Technology. Report to the President: Prepare and Inspire: K-12 Education in Science, Technology, Engineering, and Math (STEM) for America's Future. Washington, D.C., September 2010.
[36] National Academy of Sciences, National Academy of Engineering, and Institute of Medicine of the National Academies, Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future: Committee on Prospering in the Global Economy of the 21st Century: An Agenda for American Science and Technology. National Academies Press, 2007. 
[37] Committee on STEM Education. National Science and Technology Council. Executive Office of the President of the United States. The Federal Science, Technology, Engineering, and Mathematics (STEM) Education Portfolio. A Report from the Federal Inventory of STEM Education Fast-Track Action Committee. Washington, D.C.: December 2011. 
[38] To develop these objectives, we first examined the objectives used by previous GAO work to inventory federal STEM education programs. Second, we discussed these objectives with agency officials; the discussions resulted in clarifying language for some objectives and the addition of a new objective on conducting research to enhance the quality of STEM education. Third, after review and analysis of survey responses, we determined that for purposes of reporting out on survey responses, we would combine the first four objectives into one broader objective category--attracting and preparing students throughout their academic careers to enter STEM fields. 
[39] Although we surveyed 29 earmarks that were funded in 2010, we did not include earmark data in our analysis because, according to our survey, 25 of these were not funded in 2011. 
[40] After initial deployment of our survey, we became aware of four new programs not previously on our list--the National Security Agency's Cryptanalysis and Exploitation Services Summer Program, the National Institutes of Health's Material Development for Environmental Health Curriculum, the Animal and Plant Health Inspection Service's Daniel E. Salmon Scholarship, and the AgDiscovery program. After speaking with program officials and reviewing program information, we determined that these programs met our definition of a STEM education program, so we obtained program contact information, had each program fill out a survey, and added them to our review. In addition, there were five programs that program officials said should be excluded from our review after receiving the survey, even though agency officials had confirmed the list at the outset. After speaking with officials and reviewing program information, we determined that all five programs should be excluded from our list and should not fill out the survey. Specifically, two programs were part of another program, one was a duplicate entry, one was a nonprogrammatic STEM activity, and one was a research-related program not focused on STEM education. 
[41] [hyperlink, http://www.gao.gov/products/GAO-11-646SP]. 
[42] We did not assess agencies' plans and reports for compliance with GPRA and the GPRA Modernization Act of 2010 requirements, and our findings that some agencies did not include STEM education programs in their plans and reports should not be read to suggest that we identified instances of noncompliance. For example, we did not assess whether a particular STEM education program is a "program activity" as that term is defined by GPRA for purposes of determining what STEM education programs are required to be covered in agency performance plans and reports. 31 U.S.C. § 1115(h)(11).
[44] [hyperlink, http://www.gao.gov/products/GAO-10-30]. 
[End of section] 
GAO's Mission: 
The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. 
Obtaining Copies of GAO Reports and Testimony: 
The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." 
Order by Phone: 
The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. 
Connect with GAO: 
Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. 
To Report Fraud, Waste, and Abuse in Federal Programs: 
Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. 
Congressional Relations: 
Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548. 
Public Affairs: 
Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800, U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, DC 20548.