This is the accessible text file for GAO report number GAO-11-804 
entitled 'Recovery Act Education Programs: Funding Retained Teachers, 
but Education Could More Consistently Communicate Stabilization 
Monitoring Issues' which was released on September 22, 2011. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer-term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to Congress: 

September 2011: 

Recovery Act Education Programs: 

Funding Retained Teachers, but Education Could More Consistently 
Communicate Stabilization Monitoring Issues: 

GAO-11-804: 

GAO Highlights: 

Highlights of GAO-11-804, a report to Congress. 

Why GAO Did This Study: 

The American Recovery and Reinvestment Act of 2009 (Recovery Act) 
provided $70.3 billion for three education programs—the State Fiscal
Stabilization Fund (SFSF); Title I, Part A of the Elementary and 
Secondary Education Act (Title I); and Individuals with Disabilities 
Education Act (IDEA), Part B. One goal of the Recovery Act was to save 
and create jobs, and SFSF also requires states to report information 
expected to increase transparency and advance educational reform. 

This report responds to two ongoing GAO mandates under the Recovery
Act. It examines (1) how selected states and local recipients used the
funds; (2) what plans the Department of Education (Education) and 
selected states have to assess the impact of the funds; (3) what 
approaches are being used to ensure accountability of the funds; and 
(4) how Education and states ensure the accuracy of recipient-reported 
data. 

To conduct this review, GAO gathered information from 14 states and the
District of Columbia, conducted a nationally representative survey of
local educational agencies (LEA), interviewed Education officials,
examined recipient reports, and reviewed relevant policy documents. 

What GAO Found: 

As of September 9, 2011, in the 50 states and the District of 
Columbia, about 4 percent of the obligated Recovery Act funds remain 
available for expenditure. Teacher retention was the primary use of 
Recovery Act education funds, according to GAO's nationally 
representative survey of LEAs. The funds also allowed recipients to 
offset state budget shortfalls and maintain or increase services. 
However, the expiration of funds and state budget decreases may force 
LEAs to reduce services and take actions such as laying off teachers. 
GAO also found that nearly a quarter of LEAs reported lowering their 
local spending on special education, as allowed under IDEA provisions that 
provide eligible LEAs the flexibility to reduce local spending on 
students with disabilities by up to half of the amount of any increase 
in federal IDEA funding from the prior year. However, even with this 
flexibility, many LEAs reported having difficulty maintaining required 
levels of local special education spending. In addition, two states have
not been able to meet required state spending levels for IDEA or 
obtain a federal waiver from these requirements. States whose waivers 
were denied and cannot make up the shortfall in the fiscal year in 
question face a reduction in their IDEA funding equal to the 
shortfall, which may be long-lasting. 

Education plans to conduct two types of systematic program assessments 
to gauge the results of Recovery Act-funded programs that focus on 
educational reform: program evaluation and performance measurement. In 
the coming years, Education plans to produce an evaluation that will 
provide an in-depth examination of various Recovery Act programs’ 
performance in addressing educational reform. In addition, for the 
SFSF program, Education plans to measure states’ ability to collect 
and publicly report data on preestablished indicators and descriptors 
of educational reform, and it plans to provide a national view of 
states’ progress. Education intends for this reporting to be a means 
for improving accountability to the public in the shorter term. Further,
Education officials plan to use states’ progress to determine whether 
a state is qualified to receive funds under other future reform-
oriented grant competitions. 

Numerous entities help ensure accountability of Recovery Act funds 
through monitoring, audits, and other means, which have helped 
identify areas for improvement. Given the short time frame for 
spending these funds, Education’s new SFSF monitoring approach 
prioritized helping states resolve monitoring issues and allowed 
Education to target technical assistance to some states. However, some 
states did not receive monitoring feedback promptly, and this 
feedback was not communicated consistently because Education’s 
monitoring protocol lacked internal time frames for following up with 
states. 

Education and state officials reported using a variety of methods to 
ensure recipient-reported data are accurate. They also use recipient-
reported data to enhance their oversight and monitoring efforts. 
According to Recovery.gov, the Recovery Act funded approximately 
286,000 full-time equivalents (FTE) during the eighth round of 
reporting, which ended June 30, 2011, for the education programs GAO 
reviewed. Despite the limitations associated with FTE data, Education 
found these data to be useful in assessing the impact of grant 
programs on saving and creating jobs. 

What GAO Recommends: 

GAO recommends that the Secretary of Education establish mechanisms to
improve the consistency of communicating SFSF monitoring feedback to 
states. Education agreed with our recommendation. 

To view the e-supplement online, click on [hyperlink, 
http://www.gao.gov/products/GAO-11-885SP]. View [hyperlink, 
http://www.gao.gov/products/GAO-11-804] or key components. For more 
information, contact George A. Scott at (202) 512-7215 or 
scottg@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

LEAs Have Obligated Most of Their Funds, Primarily on Retaining 
Teachers, but the Funding Cliff May Reduce Educational Services: 

Education Plans to Assess Results of Recovery Act Funds: 

Education and States Help Ensure Accountability, but Education Did Not 
Consistently Communicate SFSF Monitoring Concerns to States: 

Education and States Continue to Oversee the Quality of Recipient 
Reporting Data in Eighth Round of Reporting: 

Conclusions: 

Recommendation for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Drawdown Rates by Program: 

Appendix III: Comments from the Department of Education: 

Appendix IV: Status of Prior Open Recommendations and Matters for 
Congressional Consideration: 

Appendix V: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Estimated Percentage of LEAs Reporting Maintaining, 
Increasing, or Decreasing Level of Service: 

Table 2: Approval Status and Outcomes for IDEA MOE Waivers as of 
August 2011: 

Table 3: Education's Planned Reports as Part of Evaluation of Recovery 
Act Programs That Include SFSF; IDEA, Part B; and, ESEA Title I, Part 
A: 

Table 4: Estimated Percentage of LEAs with and without Plans to 
Collect Data for Evaluation Purposes: 

Table 5: Percentage of Awarded Recovery Act SFSF; ESEA Title I, Part 
A; and IDEA, Part B Funds Drawn Down by States as of September 9, 2011: 

Figures: 

Figure 1: Estimated Percentage of LEAs That Used Various Amounts of 
SFSF Funds; ESEA Title I, Part A; and IDEA, Part B Recovery Act Funds 
to Retain Staff over the Entire Grant Period: 

Figure 2: Examples from Selected States of How LEAs Spent SFSF; ESEA 
Title I, Part A; and IDEA, Part B Recovery Act Funding: 

Figure 3: Estimated Percentage of LEAs with Funding-Level Changes in 
School Years 2009-2010 and 2010-2011, and Anticipated Funding Changes 
for School Year 2011-2012: 

Figure 4: Selected Actions Taken in School Year 2010-2011 and Likely 
Actions LEAs Will Take in School Year 2011-2012 Reported by LEAs 
Experiencing or Expecting Funding Decreases: 

Figure 5: Estimated Percentage of LEAs Reporting Recovery Act 
Monitoring by Various Entities: 

Figure 6: Number of Days between SFSF Monitoring Review and Issuance 
of Draft Report to State (as of 9/16/11): 

Figure 7: Status of Education's SFSF Monitoring Reviews and Reports 
(as of 8/31/11): 

Figure 8: FTEs Reported for Recovery Act SFSF; Title I, Part A; and 
IDEA, Part B in 50 States and DC for Quarters Ending December 2009 
through June 2011: 

Abbreviations: 

CCD: U.S. Department of Education's Common Core of Data: 

DATA Act: Digital Accountability and Transparency Act of 2011: 

Education: U.S. Department of Education: 

ESEA: Elementary and Secondary Education Act of 1965: 

FTE: full-time equivalent: 

IDEA: Individuals with Disabilities Education Act, as amended: 

IHE: institutions of higher education: 

LEA: local educational agencies: 

MOE: maintenance of effort: 

OIG: Office of Inspector General: 

SEA: state educational agencies: 

SFSF: State Fiscal Stabilization Fund: 

United States Government Accountability Office: 
Washington, DC 20548: 

September 22, 2011: 

Report to Congress: 

Amid the most severe recession in decades, state and local 
governments around the country faced record budget shortfalls that 
threatened to adversely affect services. In response to the economic 
crisis facing the nation and the fiscal challenges facing state and 
local governments, Congress enacted the American Recovery and
Reinvestment Act of 2009 (Recovery Act).[Footnote 1] Among other 
things, the purposes of the Recovery Act were to preserve and create 
jobs, promote national economic recovery, and provide long-term 
economic benefits through infrastructure investments, including 
education.[Footnote 2] The Recovery Act provided nearly $100 billion 
in fiscal year 2009 for elementary, secondary, and postsecondary 
education programs--a major responsibility for state and local 
governments--in an effort to ensure students continue to receive 
quality educational services.[Footnote 3] While a key purpose of these 
funds was to help address short-term fiscal challenges, newly created 
education programs also promoted progress on educational reform and 
the U.S. Department of Education encouraged recipients to invest in 
long-term capacity. We previously reported that school districts have 
used Recovery Act education funds primarily to fund jobs, but also 
that they have reported progress in key reform areas and have used 
funds for one-time investments in equipment and training. The funds 
are available until September 30, 2011, but the impact of how those 
funds were spent will not be clear for several more years. However, 
how school districts used the funds to invest in sustainable reform 
efforts could affect their ability to mitigate the effects of 
potential funding reductions. 

Our review of states’ use of Recovery Act funds covers three programs
administered by the U.S. Department of Education (Education)--the State 
Fiscal Stabilization Fund (SFSF) ($48.6 billion); Title I, Part A of the
Elementary and Secondary Education Act of 1965, as amended (ESEA)
($10 billion); and the Individuals with Disabilities Education Act, as
amended (IDEA), Part B ($11.7 billion).[Footnote 4] We chose to review 
these programs because, collectively, funding for these programs 
accounts for approximately $70.3 billion of the $275 billion in 
Recovery Act funding distributed through contracts, grants, and loans. 
Although all grants have been awarded, recipients have until September 
30, 2011, to obligate the remainder of their funds. Most recipients 
have obligated the majority of their funds already, but some 
recipients may continue to spend their funds after September 30, 2011, 
as they liquidate those obligations. 

The Recovery Act mandates that GAO conduct bimonthly reviews of the
funds used by states and determine whether the act is achieving its
stated purposes.[Footnote 5] The Recovery Act also requires GAO to 
comment and report quarterly on, among other things, estimates of job 
creation and retention, counted as full-time equivalent (FTE), as 
reported by recipients of Recovery Act funds.[Footnote 6] Consistent 
with the mandates in the Recovery Act, we examined (1) how selected 
states and local educational agencies (LEA) are using Recovery Act 
SFSF, ESEA Title I, and IDEA, Part B funds; (2) what plans Education 
and selected states have to assess the effect of the Recovery Act 
education funds and what is known about the resulting outcomes; (3) 
what approaches Education, selected states, and LEAs are taking to 
ensure accountability for Recovery Act education funds; and (4) what 
procedures Education and states are using to ensure required recipient 
reports contain accurate FTE information for education programs. 

To obtain national-level information on how Recovery Act funds made
available by Education under SFSF; ESEA, Title I; and IDEA, Part B, 
were used at the local level, we selected a stratified random sample of
LEAs—generally school districts—in all 50 states and the District of
Columbia, and administered a Web-based survey.[Footnote 7] We 
conducted our survey between March and May 2011, with a 78 percent 
final weighted response rate at the national level. Estimates based on 
our sample have margins of error of plus or minus 7 percentage points 
or less at the 95 percent confidence level, unless otherwise noted. 
We stratified the population based on size, urban status, 
and poverty status. Regarding size, we identified and included the 100 
largest LEAs in the country. This report does not contain all the 
results from the survey. The survey and a more complete tabulation of 
the results can be viewed at GAO-11-885SP. For further information on 
our survey, see appendix I. Furthermore, at the state and local level, 
we gathered information from 14 states and the District of Columbia to 
discuss how they were using, monitoring, and planning to evaluate the 
effect of their Recovery Act funds. We conducted site visits to four 
states (California, Iowa, Massachusetts, and Mississippi) and 
contacted an additional seven states (Alaska, Arizona, Georgia, 
Hawaii, North Carolina, New York, and Wyoming) and the District of 
Columbia. We selected these 
states based on drawdown rates, economic response to the recession, 
and data availability, with consideration of geography and recent 
federal monitoring coverage. In addition, we contacted officials from 
Florida, Kansas, and South Carolina for information regarding IDEA, 
Part B waivers. We also met with program officials at Education to 
discuss ongoing monitoring and evaluation efforts for Recovery Act 
funds provided through SFSF; ESEA Title I; and IDEA, Part B. We 
assessed recipient reports for these programs for the quarter
ending June 30, 2011, for completeness and accuracy and found them
sufficiently reliable for the purposes of this report.[Footnote 8] We 
also analyzed the reported FTE jobs data from recipient reports. 
Lastly, we reviewed relevant federal laws and regulations, as well as 
information on education reform efforts the four states we visited 
submitted for their SFSF applications. 

Our oversight of programs funded by the Recovery Act has resulted in
more than 100 related products with numerous recommendations since
we began reporting on the Recovery Act.[Footnote 9] This report 
updates agency actions in response to recommendations from previous 
bimonthly and recipient reporting reviews that have not been fully 
implemented (referred to as open recommendations) in appendix IV. 

We conducted our work from October 2010 to September 2011 in
accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for our findings and
conclusions based on our audit objectives. 

Background: 

Of the Education programs funded in the Recovery Act, the newly 
created SFSF program was the largest in terms of funding. It included 
approximately $48.6 billion awarded to states by formula and up to $5 
billion awarded as competitive grants.[Footnote 10] SFSF was created, 
in part, to help state and local governments stabilize their budgets 
by minimizing budgetary cuts in education and other essential 
government services, such as public safety. SFSF funds for education 
distributed under the Recovery Act were required to be used first to 
alleviate shortfalls in state support for education provided to LEAs 
and public institutions of higher education (IHE). 

States were required to use SFSF education stabilization funds to 
restore state funding to the greater of fiscal year 2008 or 2009 
levels for state support to LEAs and public IHEs. When distributing 
these funds to LEAs, states must use their primary education funding 
formula, but they can determine how to allocate funds to public IHEs. 
In general, LEAs maintain broad discretion in how they can use 
education stabilization funds, but states have some ability to direct 
IHEs in how to use these funds. 
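
To illustrate the "greater of" restoration rule with a minimal sketch 
(the dollar amounts below are hypothetical assumptions, not actual 
state data): 

# Python sketch of the SFSF restoration rule described above. 
# Both support amounts are hypothetical. 
fy2008_support = 5_200_000_000  # state support for LEAs and public IHEs, fiscal year 2008 
fy2009_support = 4_900_000_000  # state support, fiscal year 2009 
restoration_target = max(fy2008_support, fy2009_support) 
print(restoration_target)  # states must restore funding to 5,200,000,000 in this example 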

Several other programs received additional funding through the 
Recovery Act. For example, the Recovery Act provided $10 billion to 
help LEAs educate disadvantaged youth by making additional funds 
available beyond those regularly allocated for ESEA Title I, Part A. 
These additional funds are distributed through states to LEAs using 
existing federal funding formulas, which target funds based on such 
factors as high concentrations of students from families living in 
poverty. 

The Recovery Act also provided $12.2 billion in supplemental funding 
for programs authorized by IDEA, the major federal statute that 
supports the provision of early intervention and special education 
and related services for infants, toddlers, children, and youth with 
disabilities. Part B of IDEA funds programs that ensure preschool and 
school-aged children with disabilities have access to a free 
appropriate public education and is divided into two separate grants-- 
Part B grants to states (for school-age children) and Part B preschool 
grants.[Footnote 11] 

While one purpose of the Recovery Act was to preserve and create jobs, 
it also required states to report information quarterly to increase 
transparency, and SFSF required recipients to make assurances relating 
to progress on educational reforms. To receive SFSF, states were also 
required to provide several assurances, including that they would 
maintain state support for education at least at fiscal year 2006 
levels and that they would implement strategies to advance four core 
areas of education reform. The four core areas of education reform, as 
described by Education, are: 

1. Increase teacher effectiveness and address inequities in the 
distribution of highly qualified teachers. 

2. Establish a pre-K-through-college data system to track student 
progress and foster improvement. 

3. Make progress toward rigorous college- and career-ready standards 
and high-quality assessments that are valid and reliable for all 
students, including students with limited English proficiency and/or 
disabilities. 

4. Provide targeted, intensive support, and effective interventions to 
turn around schools identified for corrective action or restructuring. 

Education required states receiving SFSF funds to report on their 
collection and reporting of 34 different indicators and 3 descriptors 
related to these four core areas of education reform, or to provide 
plans for making information related to the education reforms publicly 
available no later than September 30, 2011. Previously, we reported 
that, while states are responsible for assuring advancement of these 
reform areas, LEAs were generally given broad discretion in how to 
spend the SFSF funds. It is not clear how LEA progress in advancing 
these four reforms would affect states' progress toward meeting their 
assurances.[Footnote 12] 

Additionally, Recovery Act recipients and subrecipients are 
responsible for complying with other requirements as a condition of 
receiving federal funds. For example, for Recovery Act education 
programs we reviewed, states and LEAs must meet applicable maintenance 
of effort (MOE) requirements, which generally require them to maintain 
their previous level of spending on these programs.[Footnote 13] 
Generally, this also helps to ensure that states continue to fund 
education even with the influx of the Recovery Act funds. 
Specifically, the newly created SFSF program required states to 
maintain support for elementary and secondary education, in fiscal 
years 2009, 2010, and 2011, at least at the level that the state 
provided in fiscal year 2006, but did not place any MOE requirements 
on subrecipients.[Footnote 14] IDEA generally prohibits states and 
LEAs from reducing their financial support, or MOE, for special 
education and related services for children with disabilities below 
the level of that support for the preceding year.[Footnote 15] For 
ESEA, Title I, states[Footnote 16] and LEAs are also required to 
maintain their previous level of funding with respect to the provision 
of free public education.[Footnote 17] As long as states met certain 
criteria, including maintaining MOE for SFSF funding, SFSF funds 
could be counted toward meeting MOE for other programs, including 
ESEA Title I and IDEA. 

In addition, section 1512 of the Recovery Act requires recipients to 
report certain information quarterly. Specifically, the Act requires, 
among other types of information, that recipients report the total 
amount of Recovery Act funds received, associated obligations and 
expenditures, and a detailed list of the projects or activities for 
which these obligations and expenditures were made. For each project 
or activity, the information must include the name and description of 
the project or activity, an evaluation of its completion status, and 
an estimate of the number of jobs funded through that project or 
activity. The job calculations are based on the total hours worked 
divided by the number of hours in a full-time schedule, expressed in 
FTEs--but they do not account for the total employment arising from 
the expenditure of Recovery Act funds. The prime recipient is 
responsible for the reporting of all data required by section 1512 of 
the Recovery Act each quarter for each of the grants it received under 
the act. 
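
As a rough illustration of the FTE calculation described above, 
consider the following sketch; the hours figures are hypothetical, not 
drawn from actual recipient reports: 

# Python sketch of the section 1512 FTE calculation: total Recovery 
# Act-funded hours worked divided by the hours in a full-time schedule 
# for the quarter. All figures are hypothetical. 
total_hours_worked = 3_120      # e.g., six full-time or twelve half-time employees 
full_time_schedule_hours = 520  # hours in one full-time schedule for the quarter 
ftes = total_hours_worked / full_time_schedule_hours 
print(ftes)  # 6.0 FTEs funded, regardless of how many individuals worked the hours 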

LEAs Have Obligated Most of Their Funds, Primarily on Retaining 
Teachers, but the Funding Cliff May Reduce Educational Services: 

According to our nationally representative survey of LEAs conducted in 
spring 2011, nearly all LEAs reported that they had obligated the 
majority of their Recovery Act funds, primarily for retaining 
instructional positions, which helped LEAs cope with shortfalls in 
state and local budgets over the 
past few school years. As a result of the fiscal stress states faced 
during the recession, a number of state educational agencies (SEA) and 
LEAs have had difficulty meeting their MOE requirements for 
IDEA. States that do not either fully meet their MOE requirements or 
receive a waiver from Education may face a reduction in future IDEA 
allocations. State and LEA officials we visited stated that the 
actions they have taken to deal with decreased budgets and the 
expiration of their Recovery Act funds--such as reducing instructional 
supplies and equipment and cutting instructional positions--could have 
a negative impact on the educational services they provide to students. 

Most LEAs Have Obligated the Majority of Their Recovery Act Education 
Funds: 

According to our survey, the majority of LEAs reported they had 
already obligated most of their SFSF; ESEA Title I, Part A; and IDEA, 
Part B funds. Nearly all of the LEAs--99 percent for SFSF; and 97 
percent for ESEA Title I and IDEA, Part B--reported that they expected 
to obligate all of their Recovery Act funds prior to September 30, 
2011.[Footnote 18] However, approximately one-quarter (23 percent) of 
LEAs reported that uncertainty about allowable uses of the funds 
impacted their ability to expend them in a timely and effective manner. 

According to data from Education, as of September 9, 2011, about 4 
percent of the states' obligated Recovery Act funds remain available 
for expenditure.[Footnote 19] See appendix II for percentages of 
awarded Recovery Act funds drawn down by states. As of September 9, 
2011, two states had drawn down 100 percent of their ESEA Title I, 
Part A funds. Additionally, 27 states had drawn down all of their SFSF 
education stabilization funds, while Wyoming, for example, had the 
lowest drawdown rate for SFSF--34 percent. Drawdowns can lag behind 
actual expenditures for various reasons. For example, SEA officials in 
Wyoming stated that funds for certain uses, such as professional 
development, tended to be expended in large amounts during the middle 
and end of the school year which did not require them to draw down 
funds at a constant rate throughout the school year. Additionally, SEA 
officials in Alaska told us their drawdown rates appeared low because 
the state draws down funds on a quarterly basis to reimburse LEAs 
after their allocations have been spent. 

SEA officials in the states we visited told us they provided guidance 
on obligating Recovery Act funds in an effort to assist LEAs in 
meeting the deadline for obligation of the funds. For example, SEA 
officials in Massachusetts told us that they sent four communiqués and 
conducted teleconferences with LEAs with the goal of ensuring that 
SFSF funds were spent appropriately and in a timely fashion. In 
Wyoming, SEA officials stated they requested that districts submit 
Periodic Expenditure Reports on a quarterly basis so that they could 
assess districts' spending of Recovery Act funds. They also told us 
that they contacted districts to determine if they were having 
challenges obligating the funds by the September 2011 deadline and 
sent e-mails to their districts notifying them of the amount of funds 
they had remaining. 

Recipients Used Most of Their Recovery Act Funds to Retain Teachers, 
Build Capacity, and Maintain Educational Services: 

Retaining staff was the top use of SFSF; IDEA, Part B; and ESEA Title 
I, Part A Recovery Act funding cited by LEAs over the entire grant 
period. According to our survey, about three-quarters of LEAs spent 51 
percent or more of their SFSF funds on job retention (see figure 1). A 
smaller, but substantial, percentage of LEAs also reported using 51 
percent or more of their ESEA Title I, Part A and IDEA, Part B 
Recovery Act funding--an estimated 43 percent and 38 percent, 
respectively--for job retention. Specifically, in the 2010-2011 school 
year, the large majority of LEAs (84 percent) used Recovery Act funds 
to retain instructional positions, which typically include classroom 
teachers and paraprofessionals. Salaries and benefits make up the 
majority of public school budgets, and funds authorized by the 
Recovery Act provided LEAs additional money to pay for the retention 
of education staff. 

Figure 1: Estimated Percentage of LEAs That Used Various Amounts of 
SFSF Funds; ESEA Title I, Part A; and IDEA, Part B Recovery Act Funds 
to Retain Staff over the Entire Grant Period: 

[Refer to PDF for image: 3 pie-charts] 

Percentage of funds used: 

SFSF: 
51% to 100%: 74%; 
26% to 50%: 7%; 
1% to 25%: 5%; 
0%: 10%; 
Not applicable or “don't know”: 3%. 

ESEA Title I, Part A: 
51% to 100%: 43%; 
26% to 50%: 14%; 
1% to 25%: 15%; 
0%: 23%; 
Not applicable or “don't know”: 4%. 

IDEA, Part B: 
51% to 100%: 38%; 
26% to 50%: 20%; 
1% to 25%: 16%; 
0%: 21%; 
Not applicable or “don't know”: 5%. 

Source: GAO survey of LEAs in school year 2010-11. 

[End of figure] 

In addition to retaining instructional positions, LEAs spent Recovery 
Act funds on one-time, nonrecurring purchases and sustainable items 
that built capacity without creating recurring costs. According to our 
survey, 78 percent of LEAs reported using 1 to 25 percent of at least 
one of their Recovery Act funding sources--SFSF; ESEA Title I, Part A; 
or IDEA, Part B--on one-time expenditures, such as professional 
development for instructional staff, computer technology, and 
instructional materials. For example, LEA officials in one district in 
Mississippi told us that they used Recovery Act funds to invest in 
technology, security equipment, and a handicapped-accessible school 
bus for students with special needs. In the New Bedford Public Schools 
district in Massachusetts, LEA officials stated that Recovery Act 
funds were used to rehabilitate and redeploy computers around the 
district, purchase iPad connections to enable online learning, and 
provide professional development to teachers on various technological 
efforts. See figure 2. 

Figure 2: Examples from Selected States of How LEAs Spent SFSF; ESEA 
Title I, Part A; and IDEA, Part B Recovery Act Funding: 

[Refer to PDF for image: 5 photographs] 

1) Mississippi summer school students work in a computer lab in a 
Water Valley elementary school. 

2) A classroom in Fairfield-Suisun in California that helps students
with cognitive disabilities and significant behavioral issues learn 
life skills, such as cooking and cleaning. 

3) A display showing images from security cameras around a school in 
Mississippi. 

4) A New Bedford School District teacher in Massachusetts using a 
Smart Board to help students with a writing assignment. 

5) A “Sensory Room” designed to aid the physical and cognitive 
development of students with special needs in New Bedford School 
District in Massachusetts. 

Source: GAO. 

[End of figure] 

Other one-time purchases made with Recovery Act funds enhanced 
districts' capacity to provide services in the future, sometimes with 
anticipated long-term cost savings. In Massachusetts, we visited two 
LEAs--Newton Public Schools and New Bedford Public Schools--that used 
IDEA, Part B Recovery Act funds to provide or expand their services 
for students with special needs instead of paying more expensive 
schools or facilities to provide the alternative programs and 
services. LEA officials in Fairfield-Suisun Unified School District in 
California told us they used IDEA, Part B Recovery Act funds to 
implement two initiatives they expected to lead to significant cost 
savings in the future. The first initiative involved partnering with 
the nearby University of the Pacific to recruit recent speech 
pathology graduates. In exchange for externships and student loan 
stipends paid for with Recovery Act funds, the students committed to 
working in the district for 3 years upon graduation. These newly 
licensed therapists would be paid salaries of around $45,000 per year, 
considerably less than the over $100,000 per year the contracted 
therapists cost the district. Further, because of the 3-year 
commitment, officials stated the graduates were more likely to 
continue working in the district as permanent employees. Officials 
estimated that this initiative could save them $800,000 in the 2011-
2012 school year. The second initiative used IDEA, Part B Recovery Act 
funds to start a public school for emotionally disturbed students who 
previously attended non-public schools at the district's expense. 
According to the officials, remodeling the old school building was 
both cost-effective and programmatically effective, since non-public 
schools for emotionally disturbed students could cost up to $85,000 
per student, with additional costs for occupational and speech therapy 
if needed. The new public school costs from $25,000 to $35,000 per 
student, according to district officials. Additionally, officials at 
Hinds Community College in Mississippi used SFSF education 
stabilization funds to invest in energy conservation. Specifically, 
the college contracted with an organization to help educate students 
and staff on energy conservation efforts, such as turning off lights 
and computers. The officials stated that they saved approximately $1 
million on utilities in fiscal year 2010, which offset the need to 
increase tuition. 

Compared with the year before receiving Recovery Act funds, a large 
majority of LEAs reported that, with the use of Recovery Act funds, 
they were able to maintain or increase the level of service they 
provided to students (see table 1). LEA officials in the Center Point-Urbana 
Community School District in Iowa told us that Recovery Act funds 
allowed the district to maintain its core curriculum, provide 
professional development to instructional staff, and maintain the 
collection of assessment data that helps them align the district's 
curriculum with the state's core curriculum. LEA officials in the 
Water Valley School District in Mississippi stated that SFSF funds 
allowed the district to maintain its reform efforts because they 
allowed students greater access to teachers. They explained that 
saving those teacher positions allowed them to keep class sizes small 
and offer more subjects, such as foreign language, fine arts, and 
business classes. 

However, an estimated 13 percent of LEAs were not able to maintain the 
same level of service even with Recovery Act SFSF funds. These LEAs 
reported a number of factors that contributed to their decreased 
level of service, including increases in class size, reductions in 
instructional and non-instructional programs, and reductions in staff 
development. For example, LEA officials at the Tipton Community School 
District in Iowa stated that, even with Recovery Act funding, they 
could not afford to maintain their high school agriculture program and 
middle school vocal music program on a full-time basis. 

Table 1: Estimated Percentage of LEAs Reporting Maintaining, 
Increasing, or Decreasing Level of Service: 

ESEA Title I, Part A; 
Maintain: 50%; 
Increase: 46%; 
Decrease: 3%. 

IDEA, Part B; 
Maintain: 58%; 
Increase: 38%; 
Decrease: 2%. 

SFSF; 
Maintain: 71%; 
Increase: 14%; 
Decrease: 13%. 

Source: GAO survey of LEAs in school year 2010-11. 

Note: There were variations in the wording of the survey questions 
that were used to create this table. We asked respondents about the 
overall effect of their Title I, Part A funds on education reform 
efforts and the overall effect of their IDEA, Part B funds on 
education reform for students with disabilities. We asked respondents 
how their SFSF funds affected their ability to maintain or raise the 
level of service in their LEA. 

[End of table] 

LEAs Reported Anticipating Continued Fiscal Constraints and Being 
Likely to Reduce Educational Services: 

The fiscal condition of LEAs across the country is mixed, but many 
school districts continued to face funding challenges in the 2010-2011 
school year. One sign of state fiscal stress has been mid-year budget 
reductions resulting from lower revenues than those forecasted. 
Nationwide, in state fiscal year 2011, one of the program areas where 
many states made mid-year general fund expenditure reductions was K-12 
education, according to the Fiscal Survey of States.[Footnote 20] Out 
of the 23 states that reported making mid-year reductions, 18 states 
reduced K-12 education funding. Looking forward to fiscal year 2012, 
reductions for K-12 education had been proposed in 16 states, 
according to the Fiscal Survey of States.[Footnote 21] Given that 
nearly half of education funding, on average, is provided by the 
states, state-level reductions to education funding could 
significantly affect LEA budgets. 

Over the course of our work on the Recovery Act, our surveys 
have shown a mixed but deteriorating fiscal situation for the nation's 
LEAs. Specifically, our survey of LEAs conducted in the 2009-2010 
school year indicated that an estimated one-third of LEAs reported 
experiencing funding decreases in that year. Our survey conducted in 
the 2010-2011 school year showed that an estimated 41 percent of LEAs 
reported experiencing funding decreases in that year. Moreover, nearly 
three-quarters (72 percent) anticipated experiencing funding-level 
decreases in school year 2011-2012 (see figure 3). Further, LEAs 
anticipated decreases of varying amounts--24 percent expected 
decreases between 1 and 5 percent, 29 percent expected decreases 
between 6 and 10 percent, and 19 percent expected decreases over 10 
percent. 

Figure 3: Estimated Percentage of LEAs with Funding-Level Changes in 
School Years 2009-2010 and 2010-2011, and Anticipated Funding Changes 
for School Year 2011-2012: 

[Refer to PDF for image: horizontal bar graph] 

Estimated Percentage of LEAs: 

School year: 2009-10 (actual); 
Decrease in funding: 33%; 
No change in funding: 12%; 
Increase in funding: 61%. 

School year: 2010-11 (actual); 
Decrease in funding: 41%; 
No change in funding: 14%; 
Increase in funding: 40%. 

School year: 2011-12 (anticipated); 
Decrease in funding: 72%; 
No change in funding: 10%; 
Increase in funding: 15%. 

Source: GAO survey of LEAs in school years 2009-10 and 2010-11. 

[End of figure] 

All types of LEAs have had to cope with declining budgets in the past 
few school years, but LEAs with high student poverty rates were 
especially hard hit. LEAs with high student poverty rates more often 
reported experiencing funding decreases (54 percent) than those with 
low student poverty rates (38 percent).[Footnote 22] 
Additionally, 45 percent of suburban LEAs reported experiencing a 
decrease in funding from the 2009-2010 school year to the 2010-2011 
school year.[Footnote 23] Likewise, 41 percent of rural LEAs and 33 
percent of urban LEAs reported experiencing funding decreases in the 
same year.[Footnote 24] In addition, 62 percent of LEAs that 
experienced a decrease in funding in the 2010-2011 school year 
reported that they formed or planned to form an advisory committee or 
hold meetings with community stakeholders to develop budget 
recommendations as a cost-saving strategy.[Footnote 25] 

To address their funding decreases in school year 2010-2011, about one-
quarter or more of LEAs reported taking actions such as reducing 
instructional supplies and equipment and cutting instructional 
positions. Moreover, about one-half of LEAs that expected a decrease 
in funding in the upcoming 2011-2012 school year reported that they 
would likely have to reduce instructional supplies and equipment or 
cut instructional and non-instructional positions in the 2011-2012 
school year to address the budget shortfall (see figure 4). 

Figure 4: Selected Actions Taken in School Year 2010-2011 and Likely 
Actions LEAs Will Take in School Year 2011-2012 Reported by LEAs 
Experiencing or Expecting Funding Decreases: 

[Refer to PDF for image: horizontal bar graph] 

Estimated percentage of LEAs with decreased funding: 

Reduce instructional supplies/equipment: 
Actions taken by LEAs in school year 2010-11: 27%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 54%. 

Cut instructional positions: 
Actions taken by LEAs in school year 2010-11: 24%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 51%. 

Cut non-instructional positions: 
Actions taken by LEAs in school year 2010-11: 25%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 51%. 

Reduce energy consumption: 
Actions taken by LEAs in school year 2010-11: 26%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 50%. 

Increase class size: 
Actions taken by LEAs in school year 2010-11: 22%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 49%. 

Defer maintenance: 
Actions taken by LEAs in school year 2010-11: 23%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 45%. 

Reduce professional development/teacher training: 
Actions taken by LEAs in school year 2010-11: 17%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 43%. 

Freeze pay: 
Actions taken by LEAs in school year 2010-11: 15%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 40%. 

Reduce custodial services: 
Actions taken by LEAs in school year 2010-11: 15%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 34%. 

Eliminate summer/alternate programs: 
Actions taken by LEAs in school year 2010-11: 11%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 28%. 

Reduce transportation services: 
Actions taken by LEAs in school year 2010-11: 11%; 
Actions LEAs are “likely to take” in school year 2011-12, as reported 
in school year 2010-11: 26%. 

Source: GAO survey of LEAs in school year 2010-11. 

Note: The LEAs who responded that they took actions in school year 
2010-11 may not be the same LEAs that reported that they anticipated 
being likely to take actions in school year 2011-12. 

[End of figure] 

LEAs across the country will soon exhaust their SFSF; ESEA Title I, 
Part A; and IDEA, Part B Recovery Act funds, which will place them at 
the edge of a funding cliff--meaning that they will not have these 
funds to help cushion budget declines in the upcoming 2011-2012 school 
year. However, many LEAs planned to spend Education Jobs Fund awards, 
which could mitigate some of the effects of the funding cliff. 
Congress created the Education Jobs Fund in 2010, which generally 
provides $10 billion to states to save or create education jobs for 
the 2010-2011 school year.[Footnote 26] States distribute the funds to 
LEAs, which may use the funds to pay salaries and benefits, and to 
hire, rehire, or retain education-related employees for the 2010-2011 
school year. According to our survey, an estimated 51 percent of LEAs 
spent or planned to spend 75 to 100 percent of their Education Jobs 
Fund allocation in the 2010-2011 school year, and about 49 percent 
planned to spend the same amount in the 2011-2012 school year. The 
large majority of LEAs (72 percent) spent or planned to spend most of 
the funds on retaining jobs, as opposed to hiring new staff or 
rehiring former staff. 

State and LEA officials we visited stated that the actions they have 
taken to deal with decreased budgets and the expiration of their 
Recovery Act funds could have an impact on the educational services 
they provide. For example, officials at the Fairfield-Suisun Unified 
School District in California told us that they tried to make cuts 
that had the least impact on the classroom, but they had begun making 
cuts that would impact the students. Specifically, they reported that 
they would increase class sizes, cut administrative and student support 
staff, eliminate summer school programs, and close schools because of 
their decreased budget. LEA officials at the Newton Public School 
District in Massachusetts stated that they cut many support services 
and were reviewing under-enrolled classes to determine which programs 
to eliminate. They stated that they tried to structure cuts to mitigate 
the impact on student learning, but stated that the cuts would 
nonetheless negatively impact the students' educational services. In 
Hawaii, SEA officials told us that their state was considering certain 
cost-saving scenarios to help mitigate the state's strained fiscal 
climate, including decreasing wages for all SEA employees, increasing 
class size, and eliminating school bus transportation for all students 
except those with special needs.[Footnote 27] Officials noted that 
eliminating bus transportation could lead to increased student 
absences and could be a challenge for students living in rural areas. 
Additionally, officials at the Center Point-Urbana School District in 
Iowa told us that they made several adjustments to save costs and be 
more efficient, such as reducing custodial staff. Because it is a 
small, rural district, Center Point-Urbana officials told us that any 
further cuts would jeopardize the quality of education it can provide 
to students. 

Further, a recent Center on Education Policy report found funding cuts 
also hampered progress on school reform.[Footnote 28] According to 
its national survey of school districts, the Center estimates that 66 
percent of school districts with budget shortfalls in 2010-2011 
responded to the cuts by either slowing progress on planned reforms or 
postponing or stopping reform initiatives. In addition, about half (54 
percent) of the districts that anticipated shortfalls in 2011-2012 
expected to take the same actions next school year. 

LEAs Reduced Support for Special Education Due to Flexibility Allowed 
under IDEA: 

According to our survey, over a quarter of LEAs decreased their 
spending on special education because of the local MOE spending 
flexibility allowed under IDEA and the large influx of Recovery Act 
IDEA funds. Under IDEA, LEAs must generally not reduce the level of 
local expenditures for children with disabilities below the level of 
those expenditures for the preceding year.[Footnote 29] The law allows 
LEAs the flexibility to adjust local expenditures, however, in certain 
circumstances. Specifically, in any fiscal year in which an LEA's 
federal IDEA, Part B Grants to States allocation exceeds the amount 
the LEA received in the previous year, an eligible LEA[Footnote 30] 
may reduce local spending on students with disabilities by up to 50 
percent of the amount of the increase.[Footnote 31] If an LEA elects 
to reduce local spending, those freed up funds must be used for 
activities authorized under the ESEA. Because Recovery Act funds for 
IDEA count as part of the LEA's overall federal IDEA allocation, 
[Footnote 32] the total increase in IDEA funding for LEAs was far 
larger than the increases in previous years, which allowed many LEAs 
the opportunity to reduce their local spending. 
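
The following minimal sketch illustrates this flexibility; the dollar 
amounts are hypothetical, not taken from any LEA's actual allocation: 

# Python sketch of the IDEA local MOE flexibility described above. 
# All dollar amounts are hypothetical. 
prior_year_allocation = 1_000_000    # LEA's federal IDEA, Part B allocation, prior year 
current_year_allocation = 1_600_000  # larger allocation, reflecting Recovery Act funds 
increase = current_year_allocation - prior_year_allocation  # 600,000 
max_local_reduction = increase * 0.5  # eligible LEA may reduce local spending by up to 300,000 
# Local funds freed up this way must be used for activities authorized under the ESEA. 
print(max_local_reduction) 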

As we have previously reported, the decision by LEAs to decrease their 
local spending may have implications for future spending on special 
education.[Footnote 33] To continue to receive IDEA funds, LEAs are 
required to assure that they will maintain their previous year's level 
of local, or state and local, spending on the education of children 
with disabilities. Thus, if an LEA lowers its spending using this 
flexibility, the spending level that it must meet in the following 
year will be this reduced level. If LEAs that use the 
flexibility to decrease their local spending do not voluntarily 
increase their spending in future years--after Recovery Act funds have 
expired--the total local, or state and local, spending for the 
education of students with disabilities may decrease, compared to 
spending before the Recovery Act. 

Many LEAs Anticipate Difficulty Meeting IDEA MOE Requirements, Which 
May Result in Financial Consequences: 

Many LEAs anticipate difficulty meeting the IDEA MOE requirement for 
the next few years and could experience financial consequences if they 
do not comply. Through our survey, we found that 10 percent of LEAs 
expected to have trouble meeting their MOE for school year 2010-11 
and, in the 2011-12 school year, this percentage jumps to 24 percent 
of LEAs. For example, Florida officials reported that nearly two-
thirds of the LEAs in their state may be in jeopardy of not meeting 
their MOE requirement. Further, Education officials told us the LEA 
MOE amount can be difficult to calculate because there are various 
exceptions and adjustments LEAs can make, such as considering spending 
changes in the case of students with high-cost services leaving a 
program, hiring lower salary staff to replace retirees, and extensive 
one-time expenditures like a computer system. Education officials 
reported that they provided technical assistance to help states and 
LEAs understand how to include these exceptions and adjustments in 
their MOE calculations. 

Of LEAs that exercised the flexibility to adjust their IDEA MOE 
amount, 15 percent reported they anticipated having difficulty meeting 
MOE in 2010-11 even though their required local spending level was 
reduced.[Footnote 34] In 2011-12, 33 percent of the LEAs that took 
advantage of the MOE adjustment still expected difficulty in meeting 
their MOE level.[Footnote 35] 

According to Education's guidance, if an LEA is found to have not 
maintained its MOE, the state is required to return to Education an 
amount equal to the amount by which the LEA failed to maintain effort. 
Additionally, IDEA does not provide LEAs an opportunity to receive a 
waiver from MOE requirements. 

Several States Applied for IDEA MOE Waivers, but Two States May 
Receive a Long-Lasting Reduction of IDEA Funding Because Full Waivers 
Were Not Approved: 

As of August 2011, seven states had applied for a total of 11 waivers 
of IDEA MOE requirements, and other states reported they were 
considering applying for a waiver because of fiscal declines in their 
states. In addition to LEAs, states must also meet MOE requirements. 
To be eligible for Part B funding, states must provide an assurance 
that they will not reduce the amount of state financial support for 
special education below the amount of that support for the preceding 
fiscal year,[Footnote 36] and must operate consistent with that 
assurance. However, Education may waive this state MOE requirement 
under certain circumstances.[Footnote 37] While Education has granted 
full waivers in five instances, it has also denied or partially 
granted[Footnote 38] waivers in five instances for Iowa, Kansas, 
Oregon, and South Carolina (twice) and is currently reviewing an 
additional waiver request from Kansas (see table 2). In their waiver 
requests, all seven states cited declining fiscal resources as the 
reason for not being able to maintain their spending on special 
education, but waiver requests varied in amount from nearly half a 
million dollars in West Virginia to over $75 million in South Carolina. 

Table 2: Approval Status and Outcomes for IDEA MOE Waivers as of 
August 2011: 

State: Alabama; 
State fiscal year: 2010; 
Amount requested: $9,204,462; 
Amount waived: $9,204,462; 
Approval status and outcome: Approved. 

State: Iowa[A]; 
State fiscal year: 2010; 
Amount requested: $38,102,897; 
Amount waived: $38,102,897; 
Approval status and outcome: Approved. 

State: Iowa[A]; 
State fiscal year: 2011; 
Amount requested: $4,082,923; 
Amount waived: 0; 
Approval status and outcome: Denied--state recalculated MOE and no 
shortfall remained.[A] 

State: Kansas; 
State fiscal year: 2010; 
Amount requested: $55,492,707; 
Amount waived: $53,306,253; 
Approval status and outcome: Partially granted--shortfall of about $2 
million. 

State: Kansas; 
State fiscal year: 2011; 
Amount requested: $34,193,605; 
Amount waived: [B]; 
Approval status and outcome: Currently under review. 

State: New Jersey; 
State fiscal year: 2010; 
Amount requested: $25,671,915; 
Amount waived: $25,671,915; 
Approval status and outcome: Approved. 

State: Oregon; 
State fiscal year: 2011; 
Amount requested: $15,674,579; 
Amount waived: 0; 
Approval status and outcome: Denied--state restored shortfall. 

State: South Carolina; 
State fiscal year: 2009; 
Amount requested: $20,312,122; 
Amount waived: $20,312,122; 
Approval status and outcome: Approved. 

State: South Carolina; 
State fiscal year: 2010; 
Amount requested: $67,402,525; 
Amount waived: $31,199,616; 
Approval status and outcome: Partially granted--shortfall of about $36 
million. 

State: South Carolina; 
State fiscal year: 2011; 
Amount requested: $75,343,070; 
Amount waived: 0; 
Approval status and outcome: Denied--state restored the entire 
shortfall. 

State: West Virginia; 
State fiscal year: 2010; 
Amount requested: $491,580; 
Amount waived: $491,580; 
Approval status and outcome: Approved. 

Source: GAO analysis of state waiver requests and Education's waiver 
determination letters. 

[A] According to Education officials, Iowa recalculated its MOE and 
was able to meet the MOE requirement. 

[B] Kansas applied for a waiver on August 17, 2011, and the request is 
currently being reviewed by Education officials. 

[End of table] 

Education's guidance states that it considers waiver requests on a 
case-by-case basis and seeks to ensure that reductions in the level of 
state support for special education are not proportionally greater than 
the reduction in state revenues. In addition, as part of its review of 
waiver requests, Education seeks to ensure that states are not 
reducing spending on special education programs more severely than 
other areas. When Education receives a request for a waiver, officials 
told us they request state budget data to better understand the 
state's calculation of MOE, and to assess whether granting a waiver 
would be appropriate. According to Education officials, as well as 
state officials, this process can be lengthy and may involve considerable 
back-and-forth between the department and the state to acquire the 
necessary information, understand the state's financial situation, and 
determine the state's required MOE level.[Footnote 39] Once all the 
data have been collected and reviewed to determine whether the state 
experienced exceptional or uncontrollable circumstances and whether 
granting a waiver would be equitable due to those circumstances, 
Education officials inform states that their waiver has either been 
approved, partially granted, or denied. 

According to Education officials, a state whose waiver is denied or 
partially granted must provide the amount of 
the MOE shortfall for special education during the state fiscal year 
in question or face a reduction in its federal IDEA grant award 
equal to the MOE shortfall.[Footnote 40] Education officials 
told us that because a state must maintain financial support for 
special education during a fiscal year, IDEA does not permit a state 
to make up this shortfall after that fiscal year is over. Education 
officials also told us that once a state's funding is reduced, the 
effect may be long-lasting in that the IDEA requires that each 
subsequent year's state allocation be based, in part, on the amount 
the state received in the prior year.[Footnote 41] Both Kansas and 
South Carolina now face reductions of IDEA awards for fiscal year 2012 
of approximately $2 million and $36 million, respectively. Education 
officials reported that it is impossible to predict with certainty the 
effect this may have on the states' future IDEA awards, but they 
indicated that these reductions may have a long-lasting negative 
effect on future IDEA awards. South Carolina has filed an "Appeal of 
Denial of Waiver Request/Reduction in Funds" with Education's Office 
of Hearings and Appeals regarding Education's decision to partially 
grant its waiver request for 2009-2010. 
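
The potentially long-lasting effect can be sketched as follows; the 
award amounts and growth factor are hypothetical assumptions, and the 
actual IDEA allocation formula weighs additional factors: 

# Python sketch: because each subsequent year's allocation is based, 
# in part, on the prior year's amount, a one-time reduction lowers the 
# baseline going forward. All figures are hypothetical; the statutory 
# formula is simplified. 
unreduced = 200_000_000           # hypothetical award with no MOE shortfall 
reduced = unreduced - 36_000_000  # hypothetical award after a $36 million reduction 
for year in (2013, 2014, 2015): 
    unreduced *= 1.02             # assumed annual growth, for illustration only 
    reduced *= 1.02 
    print(year, round(unreduced - reduced))  # the gap persists and compounds 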

Education Plans to Assess Results of Recovery Act Funds: 

Education plans to conduct two common types of systematic program 
assessment: program evaluation and performance measurement. In the 
coming years, Education plans to produce an evaluation that will 
provide an in-depth examination of various Recovery Act programs' 
performance in addressing educational reform. In addition to this 
overall assessment of the programs' results, for the SFSF program, 
Education plans to measure states' ability to collect and publicly 
report data on preestablished indicators and descriptors of 
educational reform. Education intends for this reporting to be a means 
for improving accountability to the public in the shorter term. 

Education's Planned Evaluations Are Intended to Assess Recovery Act- 
Funded Outcomes: 

Education plans to conduct a national evaluation to assess the results 
of Recovery Act-funded programs and initiatives addressing educational 
reform. The evaluation is intended to focus on efforts to drive 
innovation, improve school performance, and reduce student achievement 
gaps.[Footnote 42] According to Education officials, the programs 
covered by the evaluation include SFSF; IDEA, Part B; ESEA Title I, 
Part A; Race to the Top; the Teacher Incentive Fund; and Ed Tech. 
[Footnote 43] Including these Recovery Act education programs in one 
evaluation will allow for a broad view of the results of programs 
focused on education reform. 

As part of this integrated evaluation, Education plans to issue four 
reports over the next several years that are descriptive in nature, 
with the final report in 2014 including analysis of outcome data. The 
four planned reports are described in table 3. 

Table 3: Education's Planned Reports as Part of Evaluation of Recovery 
Act Programs That Include SFSF; IDEA, Part B; and ESEA Title I, Part 
A: 

Type of report and planned completion: 
* Descriptive; 
* Winter 2012; 
Focus of report: Analysis of the variation in funding to states, LEAs, 
and schools and how funds were distributed (e.g., from states to LEAs, 
directly to LEAs, etc.). 

Type of report and planned completion: 
* Descriptive; 
* Spring 2012; 
Focus of report: The extent to which the key strategies, such as the 
four reform assurances, are implemented over time and whether the 
funding seems related to the scope and pace of the activity. How the 
emphasis and extent of implementation varies by fiscal conditions, 
other state and LEA characteristics, and the types of Recovery Act 
program funds received. 

Type of report and planned completion: 
* Descriptive; 
* Spring 2013; 
Focus of report: The extent of support provided by one educational 
level to another and the match in implementation priorities across 
them. May also assess whether such factors as clear guidance, 
technical assistance, or shared priorities are associated with fewer 
challenges and greater implementation of Recovery Act strategies. 

Type of report and planned completion: 
* Descriptive and outcome; 
* Summer 2014; 
Focus of report: Relationships between levels of Recovery Act funding 
and implementation of specific reform strategies and how these may be 
associated with key outcomes (e.g., gains in student achievement, 
graduation rates). However, definitive causal conclusions cannot be 
drawn from this study. 

Source: GAO analysis of information provided by Education officials. 

Note: For this evaluation, Education's Institute of Education 
Sciences (IES) contracted with external research professionals, led 
by Westat. 

[End of table] 

In addition, according to Education officials, studies are planned to 
measure progress in meeting performance goals under the Recovery Act. 
For example, Education's Policy and Program 
Studies Service will issue a report in 2012 that will examine teacher 
evaluation and other teacher-related issues based on state reported 
data under SFSF and through Education's EDFacts database.[Footnote 44] 

Although the Recovery Act does not require states and LEAs to conduct 
evaluations of their Recovery Act-funded reform activities, officials 
in a few states and LEAs we talked with said they are considering 
conducting evaluations. For example, Mississippi has implemented LEA 
program evaluations of some Recovery Act-funded initiatives using 
student achievement data. At the local level, between about 43 and 56 
percent of LEAs reported that they are neither collecting nor planning 
to collect data that would allow for the evaluation of the use of 
SFSF; IDEA, Part B; or ESEA Title I, Part A funds, while between about 
19 and 31 percent of LEAs indicated they were either collecting or 
planning to collect information for this purpose. (See table 4.) For 
example, officials at one LEA in Massachusetts said that they are 
evaluating their use of IDEA Recovery Act funds to provide special 
education programs within the district rather than through private 
schools.[Footnote 45] 

Table 4: Estimated Percentage of LEAs with and without Plans to 
Collect Data for Evaluation Purposes: 

Program: SFSF; 
Not collecting or planning to collect: 56; 
Collecting or planning to collect: 19; 
Don't know: 25. 

Program: ESEA Title I, Part A; 
Not collecting or planning to collect: 43; 
Collecting or planning to collect: 31; 
Don't know: 26. 

Program: IDEA, Part B; 
Not collecting or planning to collect: 47; 
Collecting or planning to collect: 24; 
Don't know: 29. 

Source: GAO survey of LEAs. 

[End of table] 

Education Plans to Measure States' Progress on Reporting SFSF Reform 
Data: 

In addition to the more comprehensive evaluation, Education intends to 
assess each state's progress on collecting and publicly reporting data 
on all of the 37 SFSF-required indicators and descriptors of 
educational reform because, according to Education officials, the 
public will benefit from having access to that information.[Footnote 
46] States have until September 30, 2011, to report these performance 
data. As part of that assessment, Education officials said they have 
reviewed states' SFSF applications and self-reported annual progress 
reports on uses of funds and the results of those funds on education 
reform and other areas. Coupled with reviews of the applications and 
annual reports, Education requires states that receive SFSF funds to 
maintain a public Web site that displays information responsive to the 
37 indicator and descriptor requirements in the four reform areas. 
[Footnote 47] For example, as of August 2011, Iowa's SEA reported on 
its Web site 9 of the 12 required reporting indicators for its 
statewide longitudinal data system.[Footnote 48] 

These Web-based, publicly available data are intended for use within 
each state, according to Education officials, because individual 
states and communities have the greatest power to hold their SEAs and 
LEAs accountable for reforms. Specifically, Education intended this 
information to be available to state policymakers, educators, parents, 
and other stakeholders to assist in their efforts to further reforms 
by publicly displaying the strengths and weaknesses in education 
systems.[Footnote 49] Officials in most of the states we talked with 
said that the requirements to report this information are useful. For 
example, some state officials noted that publicly reporting such data 
could serve as a catalyst for reform by highlighting areas where the 
state could improve.[Footnote 50] 

In addition to each state publicly reporting this information, 
Education plans to report at the national level on states' progress 
toward complying with the conditions of the SFSF grant. Education 
officials said they will summarize states' ability to collect and 
report on the required indicators and descriptors across the four 
reform areas. Not all states were able to collect and report this 
information as of March 2011, but states have until September 30, 
2011, to do so. If a state could not report the information, it was 
required to create a plan to do so as soon as possible and by the 
September deadline.[Footnote 51] As part of its reporting, Education 
will summarize states' responses for certain indicators and 
descriptors and present them on its Web site.[Footnote 52] 

For many other indicators and all three descriptors, Education 
officials said that the department faces challenges in presenting a national 
perspective on states' progress. For example, there are no uniform 
teacher performance ratings among LEAs within states and across 
states, which limits Education's ability to present comparisons. 
Moreover, states are not required to present information in a 
consistent manner, making it difficult to present aggregated or 
comparative data for many of the indicators. Also, Education officials 
said that because information addressing the three descriptors is 
presented in narrative, it is difficult to provide summary 
information. According to Education officials, they did not provide 
specific guidance on how states are to report the other data elements 
because they did not want to be too prescriptive. However, according 
to Education officials, through their reviews of state Web sites they 
found cases where the sites do not clearly provide the information, 
and states have acted on Education's suggested improvements to the 
sites. 

Additionally, Education plans to use states' progress toward 
collecting and reporting this information to inform whether states are 
qualified to participate in or receive funds under future reform-
oriented grant competitions, as it did for the Race to the Top 
program.[Footnote 53] GAO has found that using applicants' past 
performance to inform eligibility for future grant competitions can be 
a useful performance accountability mechanism.[Footnote 54] Education 
communicated its intention to use this mechanism in the final 
requirements published in the Federal Register on November 12, 2009, 
but Education has not yet specified how this mechanism will be 
used.[Footnote 55] As a result, officials in most of the states we 
spoke with said they were unaware of how Education planned to use the 
indicators or how Education would assess them with regard to their 
efforts to meet assurances. Education officials said they also plan to 
use the information to inform future policy and technical assistance 
to states and LEAs. 

Education and States Help Ensure Accountability, but Education Did Not 
Consistently Communicate SFSF Monitoring Concerns to States: 

A Range of Accountability Efforts Is in Place and Has Identified 
Areas for Improvement: 

To help ensure accountability of Recovery Act funds, a wide variety of 
entities oversee and audit Recovery Act programs,[Footnote 56] and 
Education officials told us they routinely review monitoring and audit 
results from many of these sources. Federal and state entities we 
spoke with described various accountability mechanisms in place over 
Recovery Act education programs, including financial reviews, program 
compliance reviews, and recipient report reviews. For example, state 
auditors and independent public accountants conduct single audits that 
include tests of internal control over and compliance with grant 
requirements such as allowable costs, maintenance of effort, and cash 
management practices.[Footnote 57] The Department of Education, the 
Education Office of Inspector General (OIG), and various state 
entities also examine internal controls and financial management 
practices, as well as data provided quarterly by grant recipients and 
subrecipients as required by section 1512 of the Recovery Act. 
Additionally, many of these entities conduct programmatic reviews that 
include monitoring compliance with program requirements, such as 
funding allowable activities and achieving intended program goals. 

These accountability efforts have helped identify areas for 
improvement at the state and local levels, such as issues with cash 
management, subrecipient monitoring, and reporting requirements. For 
example, since 2009 the Education OIG has recommended that several 
states improve their cash management procedures after finding that 
states did not have adequate processes to both minimize LEA cash 
balances and ensure that LEAs were properly remitting interest earned 
on federal cash advances.[Footnote 58] The Education OIG also found 
that several states had not developed plans to monitor certain 
Recovery Act funds or had not incorporated Recovery Act-specific 
requirements into their existing monitoring protocols. With regard to 
recipient reporting, various recipients had difficulty complying with 
enhanced reporting requirements associated with Recovery Act grants. 
For example, an independent public accounting firm contracted by the 
Mississippi Office of the State Auditor found 32 instances of 
noncompliance with reporting requirements in the 43 LEAs it tested. 
Some of the findings included failing to file quarterly recipient 
reports on Recovery Act funds as required and providing data in the 
quarterly reports that differed from supporting documentation. 

During the fiscal year 2010 single audits of the state governments we 
visited, auditors identified noncompliance with certain requirements 
that could have a direct and material effect on major programs, 
including some education programs in California and Massachusetts. 
[Footnote 59] In Iowa and Mississippi, the auditors found that the 
states complied in all material respects with federal requirements 
applicable to each of the federal programs selected by the auditors 
for compliance testing. Auditors also identified material weaknesses 
and significant deficiencies in internal control over compliance with 
SFSF; ESEA Title I, Part A; and IDEA, Part B, for some SEAs and LEAs 
we visited.[Footnote 60] For example, auditors reported that 
California's SEA continued to have a material weakness because it 
lacked an adequate process for determining the cash needs of its ESEA 
Title I subrecipients.[Footnote 61] At the state level, Iowa, 
Massachusetts, and Mississippi were found to have no material 
weaknesses in internal control over compliance related to Recovery Act 
education funds, though auditors did identify significant deficiencies 
in Iowa. For example, in Iowa auditors found several instances of 
excess cash balances for the SFSF grant. According to our survey of 
LEAs, nearly 8 percent of all LEAs reported having single audit 
findings related to Recovery Act education funds. For example, an 
auditor found that one LEA in Iowa had a material weakness because it 
did not properly segregate duties for SFSF--one employee was tasked 
with both preparing checks and recording transactions in the general 
ledger. In Massachusetts, the auditors identified a material weakness 
because an LEA was not complying with Davis-Bacon Act 
requirements,[Footnote 62] such as failing to obtain certified 
payrolls for vendors contracted for special education construction 
projects in order to verify that employees were being paid in 
accordance with prevailing wage rates. As part of the single audit 
process, grantees are responsible for follow-up and corrective action 
on all audit findings reported by their auditor, which includes 
preparing a corrective action plan at the completion of the audit. For 
all the 2010 single audit findings described above, the recipients 
submitted corrective action plans. 

Our survey of LEAs showed that federal and state entities also have 
been monitoring and auditing their Recovery Act funds through both 
site visits and desk reviews. As figure 5 indicates, over a third of 
LEAs reported their SEA conducted a desk review to oversee their use 
of Recovery Act funds, and nearly a fifth reported their SEA conducted 
a site visit. States are responsible for ensuring appropriate use of 
funds and compliance with program requirements at the subrecipient 
level, and Education in turn works to ensure that states monitor 
subrecipients and administer federal funds appropriately. Education 
does select some school districts for desk reviews and site visits, as 
shown in figure 5. 

Figure 5: Estimated Percentage of LEAs Reporting Recovery Act 
Monitoring by Various Entities: 

[Refer to PDF for image: horizontal bar graph] 

Estimated percentage of LEAs: 

Department of Education: 
Desk review: 3.2%; 
Site visit: 2.9%. 

State Educational Agency: 
Desk review: 34.6%; 
Site visit: 18.0%. 

State Auditor Office: 
Desk review: 10.8%; 
Site visit: 12.8%. 

State Recovery Leader: 
Desk review: 2.1%; 
Site visit: 1.5%. 

Source: GAO survey of LEAs in school year 2010-11. 

Note: Percentages in figure may be underestimates, as survey 
respondents were instructed to check only one monitoring entity per 
type of review. 

[End of figure] 

While few LEAs reported that Education monitored their Recovery Act 
funds directly, Education program offices told us that as part of 
their oversight efforts, they routinely review and follow up on 
information from a broad range of other entities' monitoring and audit 
reports. SFSF; ESEA Title I, Part A; and IDEA, Part B program 
officials told us that information drawn from multiple sources helps 
to (1) inform their monitoring priorities, (2) ensure states follow up 
on monitoring findings, and (3) target technical assistance in some 
cases. 

Education's New SFSF Oversight Approach Helped Some States Address 
Findings Quickly but Communication Varied: 

Education's approach to ensuring accountability of SFSF funds, which 
was designed to take into consideration the short time frames for 
implementing this one-time stimulus program, as well as the need for 
unprecedented levels of accountability, has helped some states address 
issues quickly. Two of Education's goals in monitoring these funds are 
to (1) identify potential or existing problem areas or weaknesses and 
(2) identify areas where additional technical assistance is warranted. 
SFSF officials told us that they have prioritized providing upfront 
technical assistance to help states resolve management issues before 
they publish monitoring reports. This is intended to be an iterative 
process of communicating with states about issues found during 
monitoring, helping them develop action plans to address findings, and 
working with them to ensure successful completion of any corrections 
needed. 

Some states we spoke with told us that Education's approach to SFSF 
monitoring allowed them to resolve issues prior to Education issuing a 
final monitoring report to the state, and also allowed them to correct 
systemic or cross-programmatic issues beyond SFSF. For example, New 
York officials told us that after their monitoring review, Education 
provided a thorough explanation of the corrective actions that were 
required. This allowed the state the opportunity to resolve the 
issues, which were specific to individual subrecipients, prior to the 
issuance of Education's final monitoring report. North Carolina 
officials said Education's monitoring helped them to implement new 
cash management practices, and reported that Education staff were 
proactive about communicating with the state to enable the issue to be 
resolved. District of Columbia officials also stated that Education's 
SFSF monitoring raised awareness of subgrantee cash management 
requirements and the need for state policies for those requirements 
across programs. District of Columbia, New York, and North Carolina 
officials all reported that the technical assistance they received as 
part of Education's SFSF monitoring follow-up was timely and effective. 

While some states reported helpful and timely contact from Education 
after their monitoring reviews were completed, we found that 
communication varied during the follow-up process, which left some 
states waiting for information about potential issues. According to 
data provided by Education, most states that were monitored before 
June 2011 received contact at least once before the department issued 
a draft report with monitoring results. However, several states 
received no contact from Education before they received draft reports 
presenting areas of concern. Education officials explained that if 
complete documentation was available by the end of the state's 
monitoring review, the situation would require less follow-up 
communication than if the state needed to submit additional 
documentation. Additionally, while the department did contact most 
states after monitoring reviews, it did not consistently communicate 
feedback to states regarding their reviews. Some states that did not 
receive monitoring feedback promptly, either orally or in writing, 
have expressed concerns about their ability to take action on 
potential issues. For example, an Arizona official told us in June 
2011 that the state had not been contacted about the results of its 
monitoring visit in December 2010 and that follow up contact from 
Education would have been helpful to make any necessary adjustments 
during the final months of the SFSF program in the state. According to 
Education officials, the department did communicate with Arizona on 
several occasions following the monitoring visit, primarily to request 
additional information or clarification on such things as the state's 
method for calculating MOE. Education officials told us that as a 
result of receiving further information and documentation from the 
state, they were finalizing the state's draft report and would share 
the information with the state as soon as possible. In July 2011 
California officials told us they had not heard about the results of 
the monitoring review that was completed 10 months earlier in 
September 2010. California officials told us that Education raised a 
question during its review, but the state was unsure about the 
resolution and whether they would be required to take corrective 
action. Education officials told us in September 2011 that they had 
numerous communications with California officials, often to clarify 
issues such as the state's method for calculating MOE, and that they 
were still in communication with the state as part of the process of 
identifying an appropriate resolution. 

As a result of Education's approach to monitoring, the length of time 
between the Department's monitoring reviews and the issuance of the 
monitoring reports varied greatly--from as few as 25 business days to 
as many as 265 business days (see figure 6). The need to address 
issues identified during monitoring and the subsequent frequency of 
communication during monitoring follow-up can affect the amount of 
time it takes to issue reports with monitoring results. For example, 
after Maine's desk review in September 2010, Education contacted the 
state 10 times to request additional information and clarification 
before sending the state a draft interim report 7 months later in 
April 2011. In contrast, Rhode Island was contacted once after its 
site visit, and Education provided a draft report with results about a 
month later. In part because of the need for continuous collaboration 
with states, Education's written SFSF monitoring plan does not include 
specific internal time frames for when it will communicate the status 
of monitoring issues to states after desk reviews and site visits. In 
the absence of such time frames, the length of time between 
Education's monitoring reviews and issuance of draft interim reports 
with monitoring feedback varied widely across states. Education 
officials told us they believe states benefit more from the iterative 
monitoring process that emphasizes early resolution of issues than 
from the issuance of written monitoring reports. 
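
The business-day intervals cited here can be computed directly from 
review and report dates. As a minimal sketch, using a common 
numerical library and illustrative dates rather than the actual 
state-by-state data, weekends can be excluded as follows: 

# Counting business days between a monitoring review and the issuance
# of a draft report. The dates are illustrative only, not actual state
# data; holidays could also be excluded by passing a holiday list.
import numpy as np

review_date = np.datetime64("2010-09-15")  # hypothetical review date
report_date = np.datetime64("2011-04-20")  # hypothetical report date

# numpy's busday_count excludes weekends by default.
print(np.busday_count(review_date, report_date))

[End of code example] 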

Figure 6: Number of Days between SFSF Monitoring Review and Issuance 
of Draft Report to State (as of 9/16/11): 

[Refer to PDF for image: illustrated U.S. map] 

Number of business days waiting: 0-100; 
Alabama: 
Indiana: 
New Hampshire: 
Rhode Island: 
Utah: 
Washington: 
Wisconsin: 

Number of business days waiting: 101-150; 
Connecticut: 
Kansas: 
Minnesota: 
Missouri: 
Montana: 
Nebraska: 
New York: 
North Carolina: 
Tennessee: 
Virginia: 
West Virginia: 

Number of business days waiting: 151-200; 
Arizona[A]: 
Arkansas: 
Colorado: 
Georgia: 
Kentucky: 
Louisiana: 
Maine: 
Maryland: 
New Jersey: 
New Mexico: 
North Dakota: 
Ohio: 
Puerto Rico: 
South Carolina: 
Vermont[A]: 

Number of business days waiting: 201 or more; 
Alaska: 
California[A]: 
Delaware: 
District of Columbia: 
Idaho: 
Nevada: 
South Dakota: 

States not monitored before June 2011: 
Florida: 
Hawaii: 
Illinois: 
Iowa: 
Massachusetts: 
Michigan: 
Mississippi: 
Oklahoma: 
Oregon: 
Pennsylvania: 
Texas: 
Wyoming: 

[A] State had not received findings as of September 16, 2011. 

Source: GAO analysis of U.S. Department of Education data. 

[End of figure] 

Due to its SFSF monitoring approach, Education has publicly provided 
limited information on the results of its oversight efforts, but it 
plans to issue more detailed reports in the future on what it has 
found during monitoring, and it has taken steps to share information 
on common issues found among the states. While most SFSF monitoring 
reviews have been completed for the 2010-2011 cycle, Education has not 
communicated information about most of these reviews to the public and 
the states' governors. Of the 48 completed reviews, only three reports 
for site visits and 12 reports for desk reviews have been published 
(see figure 7). Additionally, the reports that have been published are 
brief and present a general description of the area of concern without 
detailing what the specific issues and their severity were. For 
example, in Tennessee's final letter report, Education wrote that it 
found issues with LEA funding applications, fiscal oversight, 
allowable activities, cash management, and subrecipient monitoring. 
However, Education officials told us that they planned to publish more 
detailed final reports after the 2011-2012 SFSF monitoring cycle, at 
which point they would have completed both a desk review and a site 
visit for each state. In the meantime, to help other states learn from 
common issues found during SFSF monitoring reviews, Education provided 
technical assistance to all states via a webinar in February 2011. The 
webinar highlighted lessons learned during monitoring reviews, 
including best practices for cash management and separate tracking of 
funds. 

Figure 7: Status of Education's SFSF Monitoring Reviews and Reports 
(as of 8/31/11): 

[Refer to PDF for image: list] 

Site visits: 

Review not completed (1): 
Pennsylvania. 

Review complete, report not published[A] (15): 
Arizona: 
District of Columbia: 
Florida: 
Hawaii: 
Illinois: 
Iowa: 
Massachusetts: 
Nevada: 
New Jersey: 
New Mexico: 
Puerto Rico: 
South Carolina: 
South Dakota: 
Texas: 
Washington: 

Report published (3): 
Maryland: 
Rhode Island: 
Tennessee: 

Desk reviews: 

Review not completed (3): 
Michigan: 
Mississippi: 
Oklahoma: 

Review complete, report not published[B] (18): 
Alaska: 
Arkansas: 
California: 
Connecticut: 
Delaware: 
Idaho: 
Indiana: 
Kansas: 
Kentucky: 
Louisiana: 
Montana: 
North Dakota: 
Ohio: 
Oregon: 
Utah: 
Vermont: 
Virginia: 
Wyoming: 

Report published (12): 
Alabama: 
Colorado: 
Georgia: 
Maine: 
Minnesota: 
Missouri: 
Nebraska: 
New Hampshire: 
New York: 
North Carolina: 
West Virginia: 
Wisconsin: 

Source: GAO analysis of U.S. Department of Education data. 

[A] For the completed site visits, Education has issued draft interim 
reports to seven states. 

[B] For the completed desk reviews, Education has issued draft interim 
reports to 12 states. 

[End of figure] 

Education and States Continue to Oversee the Quality of Recipient 
Reporting Data in Eighth Round of Reporting: 

To meet our mandate to comment on recipient reports, we continued to 
monitor recipient-reported data, including data on jobs funded. For 
this report, we focused our review on the quality of data reported by 
SFSF; ESEA Title I, Part A; and IDEA Part B education grant 
recipients. Using education recipient data from the eighth reporting 
period, which ended June 30, 2011, we continued to check for errors or 
potential problems by repeating analyses and edit checks reported in 
previous reports. 

Education and States Continue to Review Data Quality and Use Recipient 
Reports for Monitoring: 

Education uses various methods to review the accuracy of recipient 
reported data to help ensure data quality. Specifically, Education 
compared data from the agency's grant database and financial 
management system with recipient reported data. These systems contain 
internal data for every award made to states, including the award 
identification number, award date, award amount, outlays,[Footnote 63] 
and recipient names. Education program officials told us they verified 
expenditure data in states' quarterly reports by comparing them with data 
in their internal grants management system. Education officials told 
us that state expenditures can vary from outlays depending on how the 
state reimburses its subrecipients, but Education officials review the 
figures to determine if they are reasonable. In addition, SFSF 
officials told us they cross-walked the recipient reported data with 
previous quarterly reports to check for reasonableness. For example, 
the officials told us they compared the number of subrecipients and 
vendors from quarter to quarter to see if they increased or stayed the 
same, as would be expected for a cumulative data point. Education 
officials stated they worked directly with states to correct any 
issues found during their checks of recipient reported data. Overall, 
Education officials agreed that they have made significant progress in 
ensuring data quality, as compared to the early quarters when they had 
to focus on helping states understand basic reporting requirements. At 
this point, the program officials told us they do not generally see 
significant data quality issues or mistakes when they review recipient 
reports.[Footnote 64] In August 2011, the Education OIG reported that 
it performed 49,150 data quality tests of recipient reported data 
for grant awards and found anomalies in 4 percent of the tests. 
[Footnote 65] The OIG reported that the department's processes to 
ensure the accuracy and completeness of recipient reported data were 
generally effective. 
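
The quarter-to-quarter comparison SFSF officials described, in which 
a cumulative count such as the number of subrecipients should 
increase or stay the same, amounts to a simple edit check. The 
following is a minimal sketch in Python using hypothetical report 
records; it is not Education's actual review system: 

# Illustrative edit check for a cumulative data point: the number of
# subrecipients reported each quarter should never decrease. The
# record structure is hypothetical, not Education's actual schema.

def flag_decreasing_cumulatives(reports):
    """Return quarter pairs where a cumulative count dropped."""
    flags = []
    for prev, curr in zip(reports, reports[1:]):
        if curr["subrecipients"] < prev["subrecipients"]:
            flags.append((prev["quarter"], curr["quarter"]))
    return flags

reports = [
    {"quarter": "2010-Q3", "subrecipients": 410},
    {"quarter": "2010-Q4", "subrecipients": 415},
    {"quarter": "2011-Q1", "subrecipients": 402},  # flagged for follow-up
]
print(flag_decreasing_cumulatives(reports))  # [('2010-Q4', '2011-Q1')]

[End of code example] 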

In addition to Education's efforts to ensure data quality, selected 
state officials we spoke with said they examined recipient reports of 
individual subrecipients. For example, Georgia officials told us they 
reviewed FTE data for reasonableness, compared revenues and 
expenditures, and ensured all vendors were included in vendor payment 
reports. The officials stated that they followed up on any 
questionable items with district staff. As we previously reported, 
calculating FTE data presented initial challenges for many LEAs, and 
states worked to ensure the accuracy of the data through a variety of 
checks and systems. For example, the Mississippi Department of 
Education helped LEAs calculate FTE data correctly by providing LEAs 
spreadsheets with ready-made formulas. New York officials told us they 
examined the calculation of FTEs funded and compared that data with 
payroll records. North Carolina officials told us that through their 
review of LEA data, they identified issues with FTE figures that were 
budgeted but not ultimately verified against actual figures. To 
improve the accuracy of the data, the state now compares LEA payroll 
records to their budgeted figures. 
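
The FTE figure underlying these checks follows OMB's general formula: 
hours worked and funded by the Recovery Act in a quarter, divided by 
the hours in a full-time schedule for that quarter. A minimal sketch 
of the arithmetic, using hypothetical staff hours rather than actual 
LEA data, follows: 

# Illustrative FTE calculation following OMB's general formula:
#   FTE = Recovery Act-funded hours worked in the quarter
#         / hours in a full-time quarterly schedule.
# The staff hours below are hypothetical, not actual LEA data.

FULL_TIME_QUARTERLY_HOURS = 520  # e.g., 40 hours/week x 13 weeks

def quarterly_fte(funded_hours_by_employee):
    """Sum funded hours across staff and convert to FTEs."""
    return sum(funded_hours_by_employee) / FULL_TIME_QUARTERLY_HOURS

# Two fully funded full-time teachers plus one aide funded at half
# time would be reported as 2.5 FTEs for the quarter.
print(quarterly_fte([520, 520, 260]))  # 2.5

[End of code example] 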

Education and selected states told us they used recipient reports to 
obtain data on expenditures, FTEs, and other activities funded to 
enhance their oversight and management efforts. For example, 
Education's special education program officials and most selected 
states used recipient reported data to track the amount of Recovery 
Act funds LEAs spent. 

In particular, Education officials who administer the IDEA, Part B 
grant told us they monitored LEA expenditures through recipient 
reports because it was the only information they had on how much 
subrecipients had spent. Education and several selected states also 
told us they examined recipient reports as part of their monitoring 
efforts. For example, SFSF program officials reviewed recipient 
reports, particularly expenditure data and the subrecipient award 
amount, to help choose subrecipients for monitoring. Officials from 
Arizona, the District of Columbia, and North Carolina told us they 
used recipient reported data to assess risk and inform their 
monitoring efforts. For example, the District of Columbia tracks 
spending rates to ensure subrecipients meet the deadline for using the 
funds. If a subrecipient has lower than expected spending rates, they 
are subject to increased monitoring. Arizona uses recipient reported 
data to verify that internal controls are working, for instance by 
examining expenditure rates to see whether there may be cash 
management issues. In addition, Iowa and New York officials said they 
used recipient reported data to ensure appropriate uses of funds. 

State and LEA officials we spoke with continued to report greater ease 
in collecting and reporting data for recipient reports. As we 
previously reported, recipients told us they have gained more 
experience reporting and the reporting process was becoming routine. 
[Footnote 66] For example, Arizona officials told us that their 
centralized reporting process now runs with very little effort or 
burden on state and local recipients of Recovery Act education funds. 
Alaska officials stated that the early quarters were challenging for 
reporting, but the state training sessions with LEAs helped establish 
a smooth process by the third quarter. At the local level, an LEA 
official in Iowa told us that while recipient reporting was confusing 
in the beginning, her district changed some internal procedures and 
automated some calculations to make the process more efficient. One 
measure of recipients' understanding of the reporting process is the 
number of noncompliant recipients. There were no noncompliers in the 
eighth reporting period for recipients of SFSF; ESEA Title I, Part A; 
or IDEA, Part B funds.[Footnote 67] 

Although the recipient reporting process has become smoother over 
time, some states and LEAs noted that there continues to be a burden 
associated with meeting reporting requirements, particularly due to 
limited resources. For example, California officials stated that it 
had been burdensome to collect data from over 1,500 LEAs when there were 
significant budget cuts. Officials from the Massachusetts SEA stated 
that the most burdensome aspect of recipient reporting was the short 
time frame for collecting data from nearly 400 LEAs when local staff 
were already stretched thin. At the local level, officials at a rural 
Mississippi school district stated that gathering the supporting 
documents for their quarterly reports was cumbersome and took a 
significant amount of time. For example, in the previous quarter one 
staff member had to upload more than 70 supporting documents to the 
state's centralized reporting system. Further, Education officials 
noted that the improvements in the process for recipient reporting 
have not eliminated the burden on LEAs. Moreover, according to 
Education officials, although the primary goal of the Recovery Act 
grants was not reporting, grantees were spending significant amounts 
of time complying with the reporting process when the department 
already had some data elements, such as grant awards and drawdowns, 
from other sources. 

Two recent actions indicate that recipient reporting could be expanded 
to funds beyond those from the Recovery Act. A White House Executive 
Order dated June 13, 2011, established a Government Accountability and 
Transparency Board (Board) to provide strategic direction for 
enhancing the transparency of federal spending and advance efforts to 
detect and remediate fraud, waste, and abuse in federal programs, 
among other things.[Footnote 68] By December 2011, the Board is 
required to develop guidelines for integrating systems that support 
the collection and display of government spending data, ensuring the 
reliability of those data, and broadening the deployment of fraud 
detection technologies. In addition, one of the objectives of proposed 
legislation--the Digital Accountability and Transparency Act of 2011 
(DATA Act)--is to enhance transparency by broadening the requirement 
for reporting to include recipients of non-Recovery Act funds. 
[Footnote 69] 

While FTE Data Have Limitations, Education Found These Data to Be 
Useful: 

According to Recovery.gov, during the quarter beginning April 1, 2011, 
and ending June 30, 2011, the Recovery Act funded approximately 
286,000 FTEs using funds under the programs in our review (see figure 
8).[Footnote 70] Further, for this eighth round of reporting, similar 
to what we observed in previous rounds, education FTEs for these 
programs accounted for about half of all FTEs reported for the 
quarter. Following OMB guidance, states reported on FTEs directly paid 
for with Recovery Act funding, not the employment impact on suppliers 
of materials (indirect jobs) or on the local communities (induced 
jobs). According to Education officials, FTE numbers were expected to 
decrease over time because fewer prime recipients would be reporting 
as they exhaust all of their Recovery Act funds. 

Figure 8: FTEs Reported for Recovery Act SFSF; Title I, Part A; and 
IDEA, Part B in 50 States and DC for Quarters Ending December 2009 
through June 2011: 

[Refer to PDF for image: stacked vertical bar graph] 

FTEs (in thousands): 

Reporting quarter end date: December 2009; 
SFSF education stabilization: 245.07; 
SFSF government services: 43.37; 
IDEA, Part B: 47.71; 
ESEA Title I, Part A: 47.47. 

Reporting quarter end date: March 2010; 
SFSF education stabilization: 291.49; 
SFSF government services: 50.95; 
IDEA, Part B: 55.61; 
ESEA Title I, Part A: 44.52. 

Reporting quarter end date: June 2010; 
SFSF education stabilization: 265.38; 
SFSF government services: 54.59; 
IDEA, Part B: 63.3; 
ESEA Title I, Part A: 48.02. 

Reporting quarter end date: September 2010; 
SFSF education stabilization: 171.1; 
SFSF government services: 59.14; 
IDEA, Part B: 49.03; 
ESEA Title I, Part A: 44.79. 

Reporting quarter end date: December 2010; 
SFSF education stabilization: 179.09; 
SFSF government services: 16.29; 
IDEA, Part B: 49.08; 
ESEA Title I, Part A: 41.5. 

Reporting quarter end date: March 2011; 
SFSF education stabilization: 175.31; 
SFSF government services: 20.19; 
IDEA, Part B: 52.2; 
ESEA Title I, Part A: 45.04. 

Reporting quarter end date: June 2011; 
SFSF education stabilization: 156.51; 
SFSF government services: 26.67; 
IDEA, Part B: 56.83; 
ESEA Title I, Part A: 46.13. 

Source: GAO analysis of recipient reported data from Recovery.gov. 

Note: Recipient reported data were downloaded from Recovery.gov on 
July 30, 2011. We did not include FTE data from the first reporting 
quarter due to concerns about comparability. We did not include FTE 
counts associated with the Education Jobs Fund. 

[End of figure] 

FTE data provide an overall indication of the extent to which the 
Recovery Act met one of its intended goals of saving and creating jobs 
in order to help economic recovery, although some limitations with 
these data may make it difficult to determine the impact the Recovery 
Act made in any one particular reporting period. In May 2010, GAO 
identified a number of issues that could lead to under- or 
over-reporting of FTEs.[Footnote 71] 

Our analysis of the data on Recovery.gov showed variations in the 
number of FTEs reported, which Education officials said could be 
explained by states' broad flexibility in determining how and when 
they used and allocated Recovery Act SFSF funds. For 
example, Illinois reported less than 1 FTE in the second reporting 
round and over 40,000 in the third reporting round for the SFSF 
education stabilization funds. Education officials stated that rarely 
would the districts in one state hire 40,000 teachers in 1 quarter. 
Rather, Education officials said the state likely made a decision to 
allocate those funds in that quarter to teacher salaries. Similarly, 
from the fourth to fifth reporting rounds, the number of FTEs more 
than doubled in Arkansas and nearly doubled in Florida for the SFSF 
education stabilization funds. Education officials explained that any 
significant increase or decrease in FTEs likely reflects the state's 
decision to allocate the funds in one quarter rather than during 
another quarter. They noted that some states used their funds 
consistently over time, whereas others used a large portion of the 
funds at the beginning or end of a school year. Therefore, sharp 
increases or decreases in the FTE data are not uncommon or unexpected. 
Delaware reported no FTEs for SFSF government services funds in the 
eighth reporting round. Education officials stated that Delaware 
decided to use those funds on operating costs, not salaries. 

Education officials told us that recipient reported FTE data were 
useful to them when assessing the impact of grants on jobs funded. 
Education does not have any comparable data on jobs funded. Therefore, 
FTE data provided them a measure of the extent to which the Recovery 
Act programs, particularly SFSF, accomplished that specific goal of 
funding jobs. According to Education officials, determining jobs 
funded was an important, but secondary impact of the Recovery Act 
funding for the ESEA Title I, Part A and IDEA, Part B grants. The 
purpose of ESEA Title I is to ensure that all children have a fair, 
equal, and significant opportunity to obtain a high-quality education 
by providing financial assistance to LEAs and schools with high 
numbers or percentages of poor children. The purpose of IDEA, Part B 
is to ensure that all students with disabilities have available to 
them a free appropriate public education that emphasizes special 
education and related services designed to meet their unique needs. 
According to Education officials, some of the services provided to 
students using the ESEA Title I, Part A and IDEA, Part B Recovery Act 
funds led to the creation of jobs while others served the needs of 
children but did not directly create jobs. Therefore, while FTE data 
did provide a useful indication of jobs funded for those programs 
under the Recovery Act, other measures such as student outcomes will 
be more useful after the Recovery Act ends when assessing the impact 
of programs with education-related goals. 

Conclusions: 

A key goal of Recovery Act funding was to create and retain jobs and, 
for SFSF, to advance education reforms, and our work has consistently 
shown that LEAs primarily used their funding to cover the cost of 
retaining jobs. Additionally, the transparency required by Recovery 
Act reporting allowed the public access to data on the number of jobs 
funded and the amount of funds spent, but as the deadline for 
obligating funds approaches, little is currently known nationally 
about the advancement of the four areas of educational reform. 
Education's planned evaluation could make an important contribution to 
understanding any outcomes related to reform. This national evaluation 
could be especially important considering that officials in many of 
our selected states have not planned evaluations, and many LEAs 
reported that they are neither collecting nor planning to collect data 
to evaluate the effect of SFSF on education reform efforts. While 
Education will assess results through its own evaluation, it will not 
be fully completed for several years. In the shorter term, state 
reporting on the SFSF indicators and descriptors of reform is the 
mechanism through which Education and the public track the extent to 
which a state is making progress. As these final data become available 
at the end of this fiscal year, Education has plans for assessing 
state compliance and analyzing the results in order to present, where 
possible, information to policymakers and the public. Given the 
accountability and transparency required by the Recovery Act, we 
believe it is important for Education to follow through with its 
plans to hold states accountable for presenting performance 
information and with its efforts to assist the public and 
policymakers in understanding the reform progress made by states. 

In addition to evaluations and reporting, program accountability can 
be facilitated through monitoring and taking corrective action on 
audit findings. Because of the historic amount of Education funding 
included in the Recovery Act, effective oversight and internal 
controls are of fundamental importance in assuring the proper and 
effective use of federal funds to achieve program goals. Education's 
new SFSF monitoring process took into account the one-time nature of 
these funds and was designed to make states aware of monitoring and 
audit findings to help them resolve any issues or make improvements to 
their program prior to Education publishing a final report. However, 
Education's implementation of this process has varied, with some 
states waiting months to get feedback on monitoring results. When 
states do not receive timely feedback on monitoring findings, they may 
not have time to resolve these issues before they have obligated their 
SFSF funds. 

Recommendation for Executive Action: 

To ensure all states receive appropriate communication and technical 
assistance for SFSF, consistent with what some states received in 
response to SFSF monitoring reviews, we recommend that the Secretary 
of Education establish mechanisms to improve the consistency of 
communicating monitoring feedback to states, such as establishing 
internal time frames for conveying information found during monitoring. 

Agency Comments and Our Evaluation: 

We provided a draft copy of this report to Education for review and 
comment. Education's comments are reproduced in appendix III. 

Education agreed with our recommendation to improve the consistency of 
communicating SFSF monitoring feedback to states. Specifically, 
Education responded that its SFSF monitoring protocols should 
include procedures for effectively communicating the status of 
monitoring feedback to states. Additionally, Education officials 
reiterated that the new SFSF monitoring approach was designed as an 
iterative method to take into consideration the large amount of 
funding, the complexities of state budget situations, the need to 
expeditiously resolve monitoring issues due to the short time frames, 
and the large numbers and diverse types of grantees. Through this 
monitoring approach, Education officials noted that the department has 
completed reviews of all but one state and is currently planning the 
second cycle of monitoring. Education officials reported that the 
feedback provided to states through this new approach was ongoing and 
that not all states have required the same level of follow-up 
discussions. GAO agrees that this approach is appropriate given the 
one-time nature of the SFSF program and, as we point out in our 
report, this approach has helped states to quickly address potential 
issues. Since contacts between Education and the states can be 
numerous and involve multiple officials and agencies, we 
believe that any actions taken by the department to improve the 
consistency of communication with states will improve its monitoring 
process. 

Education also provided some additional and updated information about 
its monitoring efforts, and we modified the report to reflect the 
data they provided. In addition, Education provided us with several 
technical comments that we incorporated, as appropriate. 

We are sending copies of this report to relevant congressional 
committees, the Secretary of Education, and other interested parties. 
In addition, this report will be available at no charge on GAO's Web 
site at [hyperlink, http://www.gao.gov]. 

If you or your staff have any questions about this report, please 
contact me at (202) 512-7215 or scottg@gao.gov. Contact points for our 
Offices of Congressional Relations and Public Affairs may be found on 
the last page of this report. GAO staff who made key contributions to 
this report are listed in appendix V. 

Signed by: 

George A. Scott, Director: 
Education, Workforce, and Income Security Issues: 

List of Addressees: 

The Honorable Daniel K. Inouye: 
Chairman: 
The Honorable Thad Cochran: 
Vice Chairman: 
Committee on Appropriations: 
United States Senate: 

The Honorable Harold Rogers: 
Chairman: 
The Honorable Norman D. Dicks: 
Ranking Member: 
Committee on Appropriations: 
House of Representatives: 

The Honorable Joseph I. Lieberman: 
Chairman: 
The Honorable Susan M. Collins: 
Ranking Member: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Darrell E. Issa: 
Chairman: 
The Honorable Elijah Cummings: 
Ranking Member: 
Committee on Oversight and Government Reform: 
House of Representatives: 

The Honorable Tom Harkin: 
Chairman: 
The Honorable Michael B. Enzi: 
Ranking Member: 
Committee on Health, Education, Labor, and Pensions: 
United States Senate: 

The Honorable John Kline: 
Chairman: 
The Honorable George Miller: 
Ranking Member: 
Committee on Education and the Workforce: 
House of Representatives: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

To obtain national level information on how Recovery Act funds made 
available by the U.S. Department of Education (Education) under SFSF; 
ESEA Title I, Part A; and IDEA, Part B were used at the local level, 
we designed and administered a Web-based survey of local educational 
agencies (LEA) in the 50 states and the District of Columbia. We 
surveyed school district superintendents across the country to learn 
how Recovery Act funding was used and what impact these funds had on 
school districts. We selected a stratified[Footnote 72] random sample 
of 688 LEAs from the population of 15,994 LEAs included in our sample 
frame of data obtained from Education's Common Core of Data (CCD) in 
2008-09. We conducted our survey between March and May 2011, with a 78 
percent final weighted response rate. 

We took steps to minimize nonsampling errors by pretesting the survey 
instrument with officials in three LEAs in January 2011 and February 
2011. Because we surveyed a sample of LEAs, survey results are 
estimates of a population of LEAs and thus are subject to sampling 
errors that are associated with samples of this size and type. Our 
sample is only one of a large number of samples that we might have 
drawn. As each sample could have provided different estimates, we 
express our confidence in the precision of our particular sample's 
results as a 95 percent confidence interval. All estimates produced 
from the sample and presented in this report are representative of the 
in-scope population and have margins of error of plus or minus 7 
percentage points or less for our sample, unless otherwise noted. We 
excluded nine of the sampled LEAs because they were no longer 
operating in the 2010-11 school year or were not an LEA, and therefore 
were considered out of scope. This report does not contain all the 
results from the survey. The survey and a more complete tabulation of 
the results can be viewed at GAO-11-885SP. 
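
The margins of error cited above correspond to 95 percent confidence 
intervals for estimated proportions. As a minimal sketch, the 
half-width of such an interval can be approximated with the simple 
random-sample formula below; the report's actual estimates account 
for stratification and survey weights, and the sample size shown is 
hypothetical: 

# Simplified 95 percent confidence interval half-width for an
# estimated proportion (normal approximation). The report's actual
# estimates account for stratification and survey weights; the
# effective sample size below is a hypothetical example.
import math

def margin_of_error_95(p_hat, n):
    """Half-width of a 95 percent CI for a proportion."""
    return 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)

# An estimated proportion of 0.56 from roughly 530 respondents yields
# a margin of error of about plus or minus 4 percentage points.
moe = margin_of_error_95(0.56, 530)
print(f"+/- {moe * 100:.1f} percentage points")

[End of code example] 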

At the state and local level, we conducted site visits to four states 
(California, Iowa, Massachusetts, and Mississippi), and contacted an 
additional seven states (Alaska, Arizona, Georgia, Hawaii, North 
Carolina, New York, and Wyoming) and the District of Columbia to 
discuss how they were using, monitoring, and planning to evaluate the 
effect of their Recovery Act funds. In addition, we contacted 
officials from Florida, Kansas, and South Carolina for information 
regarding IDEA, Part B waivers. We selected these states in order to 
have an appropriate mix of recipients that varied across certain 
factors, such as drawdown rates, economic response to the recession, 
and data availability, with consideration of geography and recent 
federal monitoring coverage. 

During our site visits, we met with SFSF, ESEA Title I, and IDEA 
officials at the state level as well as LEAs and an Institution of 
Higher Education (IHE). For the additional seven states, we gathered 
information by phone or e-mail from state education program officials 
on fund uses, monitoring, and evaluation. We also met with program 
officials at Education to discuss ongoing monitoring and evaluation 
efforts for Recovery Act funds provided through SFSF, ESEA Title I, 
and IDEA, and we reviewed relevant federal laws, regulations, 
guidance, and communications to the states. Further, we obtained 
information from Education's Web site 
about the amount of funds these states have drawn down from their 
accounts with Education. 

The recipient reporting section of this report responds to the 
Recovery Act's mandate that we comment on the estimates of jobs 
created or retained by direct recipients of Recovery Act funds. 
[Footnote 73] For our review of the eighth submission of recipient 
reports covering the period from April 1, 2011, through June 30, 2011, 
we built on findings from our prior reviews of the reports. We 
performed edit checks and basic analyses on the eighth submission of 
recipient report data that became publicly available at Recovery.gov 
on July 30, 2011.[Footnote 74] To understand how the quality of jobs 
data reported by Recovery Act education grantees has changed over 
time, we compared the 8 quarters of recipient reporting data that were 
publicly available at Recovery.gov on July 30, 2011. 

In addition, we reviewed documentation and interviewed federal 
agency officials from Education who have responsibility for ensuring a 
reasonable degree of quality across their programs' recipient reports. 
Due to the limited number of recipients reviewed and the judgmental 
nature of the selection, the information we gathered about state 
reporting and oversight of FTEs is limited to those selected states in 
our review and not generalizable to other states. GAO's findings based 
on analyses of FTE data are limited to those Recovery Act education 
programs and time periods examined and are not generalizable to any 
other programs' FTE reporting. 

We compared, at the aggregate and state level, funding data reported 
directly by recipients on their quarterly reports against the 
recipient funding data maintained by Education. The cumulative funding 
data reported by the recipients aligned closely with the funding data 
maintained by the Department of Education. An Education Inspector 
General report included a similar analysis comparing agency data to 
recipient reported data from the first quarter of 2010.[Footnote 75] 
Although not directly comparable to our analysis, the OIG's assessment 
identified various discrepancies between agency and recipient reported 
data. We also noted some discrepancies across the education programs 
we reviewed, where state recipients' reported expenditures deviated by 
more than 10 percent from the respective outlays reported by 
Education. In general, however, we consider the recipient 
report data to be sufficiently reliable for the purpose of providing 
summary, descriptive information about FTEs or other information 
submitted on grantees' recipient reports. 
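
The expenditure-versus-outlay comparison described above is 
essentially a threshold check. The following is a minimal, 
hypothetical sketch of flagging awards whose reported expenditures 
deviate from agency-reported outlays by more than 10 percent; the 
award records are invented for illustration, and this is not GAO's 
actual analysis code: 

# Illustrative check flagging grants where recipient-reported
# expenditures deviate from Education-reported outlays by more than
# 10 percent. Records and award IDs are hypothetical examples.

THRESHOLD = 0.10

def flag_discrepancies(records):
    """Return award IDs whose expenditures deviate beyond the threshold."""
    flagged = []
    for r in records:
        deviation = abs(r["expenditures"] - r["outlays"]) / r["outlays"]
        if r["outlays"] and deviation > THRESHOLD:
            flagged.append(r["award_id"])
    return flagged

records = [
    {"award_id": "AWARD-001", "expenditures": 90.0e6, "outlays": 100.0e6},
    {"award_id": "AWARD-002", "expenditures": 70.0e6, "outlays": 100.0e6},
]
print(flag_discrepancies(records))  # ['AWARD-002']

[End of code example] 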

To update the status of open recommendations from previous bimonthly 
and recipient reporting reviews, we obtained information from agency 
officials on actions taken in response to the recommendations. 

We conducted this performance audit from October 2010 to September 
2011 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Drawdown Rates by Program: 

Table 5: Percentage of Awarded Recovery Act SFSF; ESEA Title I, Part 
A; and IDEA, Part B Funds Drawn Down by States as of September 9, 2011: 

Alaska: 
SFSF education stabilization funds: 87%; 
SFSF government services funds: 92%; 
SFSF education stabilization and government services funds: 88%; 
ESEA Title I, Part A: 95%; 
IDEA, Part B: 93%. 

Alabama: 
SFSF education stabilization funds: 95%; 
SFSF government services funds: 99%; 
SFSF education stabilization and government services funds: 96%; 
ESEA Title I, Part A: 93%; 
IDEA, Part B: 94%. 

Arkansas: 
SFSF education stabilization funds: 88%; 
SFSF government services funds: 85%; 
SFSF education stabilization and government services funds: 87%; 
ESEA Title I, Part A: 86%; 
IDEA, Part B: 86%; 

Arizona: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 82%; 
SFSF education stabilization and government services funds: 97%; 
ESEA Title I, Part A: 92%; 
IDEA, Part B: 92%; 

California: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 98%; 
IDEA, Part B: 92%; 

Colorado: 
SFSF education stabilization funds: 98%; 
SFSF government services funds: 90%; 
SFSF education stabilization and government services funds: 97%; 
ESEA Title I, Part A: 89%; 
IDEA, Part B: 90%; 

Connecticut: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 100%; 
IDEA, Part B: 99%; 

District of Columbia: 
SFSF education stabilization funds: 99%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 99%; 
ESEA Title I, Part A: 65%; 
IDEA, Part B: 97%; 

Delaware: 
SFSF education stabilization funds: 90%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 92%; 
ESEA Title I, Part A: 86%; 
IDEA, Part B: 81%; 

Florida: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 99%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 94%; 
IDEA, Part B: 97%; 

Georgia: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 84%; 
IDEA, Part B: 88%; 

Hawaii: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 93%; 
SFSF education stabilization and government services funds: 99%; 
ESEA Title I, Part A: 84%; 
IDEA, Part B: 92%; 

Iowa: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 99%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 96%; 
IDEA, Part B: 100%; 

Idaho: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 90%; 
SFSF education stabilization and government services funds: 98%; 
ESEA Title I, Part A: 90%; 
IDEA, Part B: 95%; 

Illinois: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 95%; 
IDEA, Part B: 94%; 

Indiana: 
SFSF education stabilization funds: 95%; 
SFSF government services funds: 94%; 
SFSF education stabilization and government services funds: 95%; 
ESEA Title I, Part A: 90%; 
IDEA, Part B: 90%; 

Kansas: 
SFSF education stabilization funds: 96%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 96%; 
ESEA Title I, Part A: 99%; 
IDEA, Part B: 100%; 

Kentucky: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 95%; 
IDEA, Part B: 93%; 

Louisiana: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 92%; 
IDEA, Part B: 88%; 

Massachusetts: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 96%; 
SFSF education stabilization and government services funds: 99%; 
ESEA Title I, Part A: 100%; 
IDEA, Part B: 100%; 

Maryland: 
SFSF education stabilization funds: 96%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 97%; 
ESEA Title I, Part A: 91%; 
IDEA, Part B: 86%; 

Maine: 
SFSF education stabilization funds: 95%; 
SFSF government services funds: 99%; 
SFSF education stabilization and government services funds: 96%; 
ESEA Title I, Part A: 93%; 
IDEA, Part B: 92%; 

Michigan: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 94%; 
IDEA, Part B: 94%; 

Minnesota: 
SFSF education stabilization funds: 99%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 99%; 
ESEA Title I, Part A: 89%; 
IDEA, Part B: 92%; 

Missouri: 
SFSF education stabilization funds: 91%; 
SFSF government services funds: 98%; 
SFSF education stabilization and government services funds: 93%; 
ESEA Title I, Part A: 95%; 
IDEA, Part B: 89%; 

Mississippi: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 81%; 
SFSF education stabilization and government services funds: 97%; 
ESEA Title I, Part A: 85%; 
IDEA, Part B: 81%; 

Montana: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 98%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 94%; 
IDEA, Part B: 95%; 

North Carolina: 
SFSF education stabilization funds: 99%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 99%; 
ESEA Title I, Part A: 98%; 
IDEA, Part B: 99%; 

North Dakota: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 88%; 
SFSF education stabilization and government services funds: 98%; 
ESEA Title I, Part A: 85%; 
IDEA, Part B: 94%; 

Nebraska: 
SFSF education stabilization funds: 94%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 95%; 
ESEA Title I, Part A: 61%; 
IDEA, Part B: 74%; 

New Hampshire: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 99%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 76%; 
IDEA, Part B: 80%; 

New Jersey: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 80%; 
IDEA, Part B: 81%; 

New Mexico: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 86%; 
SFSF education stabilization and government services funds: 98%; 
ESEA Title I, Part A: 93%; 
IDEA, Part B: 86%; 

Nevada: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 95%; 
IDEA, Part B: 98%; 

New York: 
SFSF education stabilization funds: 94%; 
SFSF government services funds: 97%; 
SFSF education stabilization and government services funds: 95%; 
ESEA Title I, Part A: 90%; 
IDEA, Part B: 85%; 

Ohio: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 93%; 
IDEA, Part B: 95%; 

Oklahoma: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 74%; 
SFSF education stabilization and government services funds: 95%; 
ESEA Title I, Part A: 89%; 
IDEA, Part B: 97%; 

Oregon: 
SFSF education stabilization funds: 98%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 99%; 
ESEA Title I, Part A: 97%; 
IDEA, Part B: 95%; 

Pennsylvania: 
SFSF education stabilization funds: 93%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 95%; 
ESEA Title I, Part A: 82%; 
IDEA, Part B: 92%; 

Rhode Island: 
SFSF education stabilization funds: 84%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 87%; 
ESEA Title I, Part A: 93%; 
IDEA, Part B: 92%; 

South Carolina: 
SFSF education stabilization funds: 96%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 97%; 
ESEA Title I, Part A: 90%; 
IDEA, Part B: 83%; 

South Dakota: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 99%; 
IDEA, Part B: 100%; 

Tennessee: 
SFSF education stabilization funds: 99%; 
SFSF government services funds: 85%; 
SFSF education stabilization and government services funds: 96%; 
ESEA Title I, Part A: 94%; 
IDEA, Part B: 91%; 

Texas: 
SFSF education stabilization funds: 95%; 
SFSF government services funds: 99%; 
SFSF education stabilization and government services funds: 96%; 
ESEA Title I, Part A: 92%; 
IDEA, Part B: 90%; 

Utah: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 87%; 
IDEA, Part B: 77%; 

Virginia: 
SFSF education stabilization funds: 89%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 91%; 
ESEA Title I, Part A: 79%; 
IDEA, Part B: 76%; 

Vermont: 
SFSF education stabilization funds: 97%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 98%; 
ESEA Title I, Part A: 98%; 
IDEA, Part B: 95%; 

Washington: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 90%; 
IDEA, Part B: 93%; 

Wisconsin: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 100%; 
SFSF education stabilization and government services funds: 100%; 
ESEA Title I, Part A: 89%; 
IDEA, Part B: 89%; 

West Virginia: 
SFSF education stabilization funds: 100%; 
SFSF government services funds: 64%; 
SFSF education stabilization and government services funds: 94%; 
ESEA Title I, Part A: 96%; 
IDEA, Part B: 90%; 

Wyoming: 
SFSF education stabilization funds: 34%; 
SFSF government services funds: 64%; 
SFSF education stabilization and government services funds: 39%; 
ESEA Title I, Part A: 84%; 
IDEA, Part B: 79%; 

Total: 
SFSF education stabilization funds: 98%; 
SFSF government services funds: 97%; 
SFSF education stabilization and government services funds: 98%; 
ESEA Title I, Part A: 91%; 
IDEA, Part B: 91%. 

Source: GAO analysis of U.S. Department of Education data. 

[End of table] 

[End of section] 

Appendix III: Comments from the Department of Education: 

United States Department Of Education: 
Office Of The Deputy Secretary: 
400 Maryland Ave. S.W. 
Washington, DC 20202: 

September 16, 2011: 

Mr. George A. Scott: 
Director: 
Education, Workforce, and Income Security Issues: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Scott: 

I am writing in response to the recommendation made in the draft U.S. 
Government Accountability Office (GAO) report, "Recovery Act Education 
Programs: Funding Retained Teachers, but Education Could More 
Consistently Communicate Stabilization Monitoring Issues" (GAO-11-
804). GAO reviewed the administration, implementation, and oversight 
of three programs administered by the U.S. Department of Education 
(Department) that received funding under the American Recovery and 
Reinvestment Act of 2009 (Recovery Act): the State Fiscal 
Stabilization Fund (SFSF); Title I, Part A of the Elementary and 
Secondary Education Act of 1965, as amended; and the Individuals with 
Disabilities Education Act, as amended (IDEA). The report examines: 
(1) how selected States and local recipients used the funds; (2) what 
plans the Department and selected States have to assess the impact of 
the funds; (3) what approaches are being used to ensure accountability 
of the funds; and (4) how the Department and States ensure the 
accuracy of recipient reported data. 

We appreciate the time and effort that your office devoted to 
conducting this review. We are particularly pleased that GAO found 
that the Recovery Act funds are helping States save the jobs of 
significant numbers of teachers. However, we are concerned that local 
educational agencies (LEAs) continue to face challenging fiscal 
conditions that may be exacerbated when funds under the Recovery Act 
and the Education Jobs Fund program are no longer available. 

The report has one recommendation. The Department's response to the 
recommendation follows: 

Recommendation: To ensure all states receive appropriate communication 
and technical assistance for SFSF, consistent with what some states 
received in response to SFSF monitoring reviews, we recommend that the 
Secretary of Education establish mechanisms to improve the
consistency of communicating monitoring feedback to states, such as 
establishing internal timeframes for conveying information found 
during monitoring. 

Department's Response to the Recommendation: The Department agrees 
that it should include in the SFSF monitoring protocols procedures for 
communicating more effectively to States the status of outstanding 
matters relative to the monitoring reviews. 

Through the SFSF program, the Department fostered a new cooperative 
relationship with States by working iteratively with them in 
implementing and monitoring the program. The Department based its SFSF 
monitoring procedures on the following assumptions: 

* Because SFSF was a one-time appropriation and States received a 
significant amount of funding, the Department's monitoring review 
needed to differ from that of other formula grant programs where an 
issue can be resolved by applying conditions on a future year's 
allocation or requiring States to implement corrective actions at a 
later date; 

* Due to the frequency of State budgetary changes, the Department had 
to give States the opportunity to provide updated data 
demonstrating compliance with the program's maintenance-of-effort and 
allocation requirements; 

* To help ensure State compliance with applicable requirements and the 
appropriate use of taxpayer resources during the relatively short life 
of the program, the Department should prioritize the expeditious 
resolution of issues over the release of monitoring reports; and 

* Since so many different types of entities served as subgrantees 
(e.g., LEAs, public institutions of higher education, and State 
agencies such as corrections and transportation agencies), the 
Department would need to develop a protocol and process that could 
address a wide variety of contexts and issues. In addition, since 
governors were not accustomed to administering large Department grant 
programs, there would be an added level of complexity and the need to 
orient governors' staffs to the monitoring process. 

The Department has made a concerted effort to monitor every State's 
implementation of the SFSF program annually. As of September 1, 2011, 
the Department had monitored all but one State and is currently 
planning the second cycle of monitoring. After each monitoring visit 
or desk 
review, Department staff conducted a detailed "wrap-up call" with 
State officials to discuss issues identified during the review. 
Depending upon the nature of the issues, there were often numerous 
follow-up calls with a State to discuss further strategies for 
resolving the issues identified. Not all States have required the same 
level of follow-up discussions. To date, 34 States have received draft 
reports and 28 States have received subsequent interim reports. 

The Department believes that States benefit significantly from 
an iterative monitoring process that emphasizes early resolution of 
issues rather than the development and issuance of a monitoring report 
by a specific deadline. The Department's ongoing communications with
States during the monitoring process provide them with immediate 
feedback on identified issues and opportunities to assist them in 
resolving the issues. The final report will summarize the results of 
the monitoring process and include an identification of the issues 
discovered during the monitoring review and a description of the 
actions taken by States to resolve those issues. In the limited 
instances when a State has not yet resolved all issues, the report 
will specify any corrective actions that are necessary to resolve the 
remaining issues. 

The Department will refine its monitoring procedures to ensure more 
effective communication so that each State understands fully the 
current status of its monitoring review. For example, the Department 
is developing a template for a form that each program officer would 
send to a State immediately after a wrap-up call. This form would 
provide the State with, among other things, a list of issues to be 
resolved. Further, the Department is including in its monitoring 
protocols timeframes for communicating with States on monitoring 
issues. 

Finally, there are some discussions in the report that the Department 
believes could benefit from additional clarification. We are enclosing 
suggested technical edits to the report. 

We appreciate the opportunity to provide this response. Please let us 
know if you need additional information. 

Sincerely, 

Signed by: 

Ann Whalen: 
Director, Policy and Program Implementation: 
Implementation and Support Unit: 

Enclosures: 

[End of section] 

Appendix IV: Status of Prior Open Recommendations and Matters for 
Congressional Consideration: 

In this appendix, we update the status of agencies' efforts to 
implement the 16 recommendations that remain open and are not 
implemented, 8 newly implemented recommendations, and 1 newly closed 
recommendation from our previous bimonthly and recipient reporting 
reviews.[Footnote 76] Recommendations that were listed as implemented 
or closed in a prior report are not repeated here. Lastly, we address 
the status of our matters for congressional consideration. 

Department of Energy: 

Open Recommendations: 

Given the concerns we have raised about whether program requirements 
were being met, we recommended in May 2010 that the Department of 
Energy (DOE), in conjunction with both state and local weatherization 
agencies, develop and clarify weatherization program guidance that: 
[Footnote 77] 

* Clarifies the specific methodology for calculating the average cost 
per home weatherized to ensure that the maximum average cost limit is 
applied as intended. 

* Accelerates current DOE efforts to develop national standards for 
weatherization training, certification, and accreditation, an effort 
currently expected to take 2 years to complete. 

* Sets time frames for development and implementation of state 
monitoring programs. 

* Revisits the various methodologies used in determining the 
weatherization work that should be performed based on the 
consideration of cost-effectiveness and develops standard 
methodologies that ensure that priority is given to the most cost-
effective weatherization work. To validate any methodologies created, 
this effort should include the development of standards for accurately 
measuring the long-term energy savings resulting from weatherization 
work conducted. 

In addition, given that state and local agencies have felt pressure to 
meet a large increase in production targets while effectively meeting 
program requirements and have experienced some confusion over 
production targets, funding obligations, and associated consequences 
for not meeting production and funding goals, we recommended that DOE 
clarify its production targets, funding deadlines, and associated 
consequences, while providing a balanced emphasis on the importance of 
meeting program requirements. 

Agency Actions: 

DOE generally concurred with these recommendations and has made some 
progress in implementing them. For example, to clarify the methodology 
for calculating the average cost per home, DOE has developed draft 
guidance to help grantees develop consistency in their average cost 
per unit calculations. The guidance further clarifies the general cost 
categories that are included in the average cost per unit. DOE had 
anticipated issuing this guidance in June 2011, but as of late July 
2011 the guidance had not yet been finalized. 

Newly Closed Recommendation: 

In response to our recommendation that it develop and clarify 
guidance that includes a best practice guide for key internal 
controls, DOE distributed a memorandum dated May 13, 2011, to 
grantees reminding 
them of their responsibilities to ensure compliance with internal 
controls and the consequences of failing to do so. DOE officials 
stated that they rely on existing federal, state, and local guidance; 
their role is to monitor states to ensure they enforce the rules. DOE 
officials felt that there were sufficient documents in place to 
require internal controls, such as the grant terms and conditions and 
a training module. Because all of the guidance is located in one 
place, the WAPTAC Web site, DOE officials commented that a best 
practice guide would be redundant. Therefore, DOE officials stated 
that they do not intend to fully implement GAO's recommendation. 

Open Recommendation: 

To better ensure that Energy Efficiency and Conservation Block Grant 
(EECBG) funds are used to meet Recovery Act and program goals, we 
recommended that DOE explore a means to capture information on the 
monitoring processes of all recipients to make certain that recipients 
have effective monitoring practices.[Footnote 78] 

Agency Actions: 

DOE generally concurred with this recommendation, stating that 
"implementing the report's recommendations will help ensure that the 
Program continues to be well managed and executed." DOE also provided 
additional information on changes it has implemented. DOE added 
questions related to subrecipient monitoring to its on-site 
monitoring checklists to help ensure that subrecipients are in 
compliance with the terms and conditions of the award. These changes 
will help improve DOE's oversight of recipients, especially larger 
recipients, which are more likely to be visited by DOE project 
officers. However, not all recipients receive on-site visits. As noted 
previously, we believe that the program could be more effectively 
monitored if DOE captured information on the monitoring practices of 
all recipients. 

Newly Implemented Recommendation: 

To better ensure that EECBG funds are used to meet Recovery Act 
and program goals, we recommended that DOE solicit information from 
recipients regarding the methodology they used to calculate their 
energy-related impact metrics and verify that recipients who use DOE's 
estimation tool use the most recent version when calculating these 
metrics.[Footnote 79] 

In our report, we concluded that DOE needed more information regarding 
the recipients' estimating methods in order to assess the 
reasonableness of energy-related estimates and thus determine the 
extent to which the EECBG program is meeting Recovery Act and program 
goals for energy-related outcomes. DOE officials noted that they have 
made changes to the way they collect impact metrics in order to apply 
one unified methodology to the calculation of impact metrics. DOE 
issued guidance effective June 23, 2011, that eliminates the 
requirement for grant recipients to calculate and report estimated 
energy savings. DOE officials said the calculation of estimated impact 
metrics will now be performed centrally by DOE by applying known 
national standards to existing grantee-reported performance metrics. 
Based on DOE's action, we concluded that DOE has addressed the intent 
of this recommendation. 

Department of Health and Human Services: 

Open Recommendation: 

To help ensure that grantees report consistent enrollment figures, we 
recommended that the Director of the Department of Health and Human 
Services' (HHS) Office of Head Start (OHS) should better communicate a 
consistent definition of "enrollment" to grantees for monthly and 
yearly reporting and begin verifying grantees' definition of 
"enrollment" during triennial reviews.[Footnote 80] 

Agency Actions: 

OHS issued informal guidance on its Web site clarifying monthly 
reporting requirements to make them more consistent with annual 
enrollment reporting. This guidance directs grantees to include in 
enrollment counts all children and pregnant mothers who are enrolled 
and have received a specified minimum of services (emphasis added). 
According to officials, OHS is considering further regulatory 
clarification. 

Newly Implemented Recommendation: 

To oversee the extent to which grantees are meeting the program goal 
of providing services to children and families and to better track the 
initiation of services under the Recovery Act, we recommended that the 
Director of OHS should collect data on the extent to which children 
and pregnant women actually receive services from Head Start and Early 
Head Start grantees.[Footnote 81] 

Agency Actions: 

OHS has reported that, in order to collect information on services 
provided to children and families, it plans to require grantees to 
report average daily attendance, beginning in the 2011-2012 school 
year. 

Newly Implemented Recommendation: 

To provide grantees consistent information on how and when they will 
be expected to obligate and expend federal funds, we recommended that 
the Director of OHS should clearly communicate its policy to grantees 
for carrying over or extending the use of Recovery Act funds from one 
fiscal year into the next.[Footnote 82] 

Agency Actions: 

Following our recommendation, HHS indicated that OHS would issue 
guidance to grantees on obligation and expenditure requirements, as 
well as improve efforts to effectively communicate the mechanisms in 
place for grantees to meet the requirements for obligation and 
expenditure of funds. HHS has subsequently reported that grantees have 
been reminded that the timely use of unobligated balances requires 
recipients to use the "first in/first out" principle for recognizing 
and recording obligations and expenditures of those funds. 

Newly Implemented Recommendation: 

To better consider known risks in scoping and staffing required 
reviews of Recovery Act grantees, we recommended that the Director of 
OHS should direct OHS regional offices to consistently perform and 
document Risk Management Meetings and incorporate known risks, 
including financial management risks, into the process for staffing 
and conducting reviews.[Footnote 83] 

Agency Actions: 

HHS reported OHS was reviewing the Risk Management process to ensure 
it is consistently performed and documented in its centralized data 
system and that it had taken related steps, such as requiring the 
grant officer to identify known or suspected risks prior to an on-site 
review. More recently, HHS has indicated that the results and action 
plans from the Risk Management Meetings are documented in the Head 
Start Enterprise System and used by reviewers to highlight areas where 
special attention is needed during monitoring reviews. HHS also notes 
that the Division of Payment Management (DPM) sends OHS monthly 
reports on grantees to assist OHS in performing ongoing oversight, 
monitoring grantee spending, and assessing associated risks. In 
addition, OHS has incorporated a new fiscal information form as a pre-
review requirement to ensure that fiscal information and concerns 
known to the regional office staff are shared with on-site reviewers. 

Department of Housing and Urban Development: 

Open Recommendation: 

Because the absence of third-party investors reduces the amount of 
overall scrutiny Tax Credit Assistance Program (TCAP) projects would 
receive and the Department of Housing and Urban Development (HUD) is 
currently not aware of how many projects lacked third-party investors, 
we recommended that HUD should develop a risk-based plan for its role 
in overseeing TCAP projects that recognizes the level of oversight 
provided by others.[Footnote 84] 

Agency Actions: 

HUD responded to our recommendation by saying it must wait for final 
reports from housing finance agencies on TCAP project financing 
sources in order to identify those projects that are in need of 
additional monitoring. When the final reports are received, HUD said 
it will develop a plan for monitoring those projects. HUD said it will 
begin identifying projects that may need additional monitoring at the 
end of September 2011 when sufficient information should be available 
to determine which projects have little Low-Income Housing Tax Credit 
investment and no other leveraged federal funds. 

Department of Labor: 

Newly Implemented Recommendations: 

To enhance the Department of Labor's (Labor) ability to manage its 
Recovery Act and regular Workforce Investment Act (WIA) formula grants 
and to build on its efforts to improve the accuracy and consistency of 
financial reporting, we recommended that the Secretary of Labor take 
the following actions:[Footnote 85] 

* To determine the extent and nature of reporting inconsistencies 
across the states and better target technical assistance, conduct a 
one-time assessment of financial reports that examines whether each 
state's reported data on obligations meet Labor's requirements. 

* To enhance state accountability and to facilitate their progress in 
making reporting improvements, routinely review states' reporting on 
obligations during regular state comprehensive reviews. 

Agency Actions: 

Labor reported that it has taken actions to implement our 
recommendations. To determine the extent of reporting inconsistencies, 
Labor awarded a contract in September 2010 and completed the 
assessment of state financial reports in June 2011. Labor is currently 
analyzing the findings and expects to have a final report and 
recommendations in the fall of 2011. To enhance states' accountability 
and facilitate their progress in making improvements in reporting, 
Labor issued guidance on federal financial management and reporting 
definitions on May 27, 2011, and conducted training on its financial 
reporting form and key financial reporting terms such as obligations 
and accruals. Labor also reported that it routinely monitors states' 
reporting on obligations as part of its oversight process and 
comprehensive on-site reviews. 

Newly Implemented Recommendation: 

Our September 2009 bimonthly report identified a need for additional 
federal guidance in defining green jobs and we made the following 
recommendation to the Secretary of Labor:[Footnote 86] 

* To better support state and local efforts to provide youth with 
employment and training in green jobs, provide additional guidance 
about the nature of these jobs and the strategies that could be used 
to prepare youth for careers in green industries. 

Agency Actions: 

Labor agreed with our recommendation and has taken several actions to 
implement it. Labor's Bureau of Labor Statistics (BLS) has developed a 
definition of green jobs, which was finalized and published in the 
Federal Register on September 21, 2010. In addition, Labor continues 
to host a Green Jobs Community of Practice, an online virtual 
community available to all interested parties. The department also 
hosted a symposium on April 28 and 29, 2011, with the green jobs state 
Labor Market Information Improvement grantees. Symposium participants 
shared recent research findings, including efforts to measure green 
jobs, occupations, and training in their states. In addition, the 
department released a new career exploration tool called "mynextmove" 
[hyperlink, http://www.mynextmove.gov] in February 2011 that includes 
the Occupational Information Network (O*NET) green leaf symbol to 
highlight green occupations. Additional green references have recently 
been added and are noted in the latest update, The Greening of the 
World of Work: O*NET Project's Book of References. Furthermore, Labor 
is planning to release a Training and Employment Notice this fall that 
will provide a summary of research and resources that have been 
completed by BLS and others on green jobs definitions, labor market 
information and tools, and the status of key Labor initiatives focused 
on green jobs. 

Executive Office of the President: Office of Management and Budget: 

Open Recommendations: 

To leverage Single Audits as an effective oversight tool for Recovery 
Act programs, we recommended that the Director of the Office of 
Management and Budget (OMB): 

1. take additional efforts to provide more timely reporting on 
internal controls for Recovery Act programs for 2010 and beyond; 
[Footnote 87] 

2. evaluate options for providing relief related to audit requirements 
for low-risk programs to balance new audit responsibilities associated 
with the Recovery Act;[Footnote 88] 

3. issue Single Audit guidance in a timely manner so that auditors can 
efficiently plan their audit work;[Footnote 89] 

4. issue the OMB Circular No. A-133 Compliance Supplement no later 
than March 31 of each year;[Footnote 90] 

5. explore alternatives to help ensure that federal awarding agencies 
provide their management decisions on the corrective action plans in a 
timely manner;[Footnote 91] and: 

6. shorten the time frames required for issuing management decisions 
by federal agencies to grant recipients.[Footnote 92] 

Agency Actions: 

GAO's recommendations to OMB are aimed toward improving the Single 
Audit's effectiveness as an accountability mechanism for federally 
awarded grants from Recovery Act funding. We previously reported that 
OMB has taken a number of actions to implement our recommendations 
since our first Recovery Act report in April 2009. We also reported 
that OMB had undertaken initiatives to examine opportunities for 
improving key areas of the single audit process over federal grant 
funds administered by state and local governments and nonprofit 
organizations based upon the directives in Executive Order 13520, 
Reducing Improper Payments and Eliminating Waste in Federal Programs, 
issued in November 2009. Two sections of the executive order related 
to federal grantees, including state and local governments, and 
required OMB to establish working groups to make recommendations to 
improve (1) the effectiveness of single audits of state and local 
governments and non-profit organizations that are expending federal 
funds and (2) the incentives and accountability of state and local 
governments for reducing improper payments. 

OMB formed several working groups as a result of the executive order, 
including two separate working groups on issues related to single 
audits. These two working groups developed recommendations and 
reported them to OMB in May and June of 2010. OMB formed a 
"supergroup" to review these recommendations for improving single 
audits and to provide a plan for OMB to further consider or implement 
them. The "supergroup" finalized its report in August 2011. OMB also 
formed a Single Audit Metrics Workgroup as a result of one of the 
recommendations made in June 2010 to improve the effectiveness of 
single audits. In addition, the President issued a memorandum entitled 
"Administrative Flexibility, Lower Costs, and Better Results for 
State, Local, and Tribal Governments" (M-11-21) in February 2011 that 
directed OMB to, among other things, lead an interagency workgroup to 
review OMB circular policies to enable state and local recipients to 
most effectively use resources to improve performance and efficiency. 
Agencies reported their actions and recommendations to OMB on August 
29, 2011. Among the recommendations included in the report were 
recommendations aimed toward improving single audits. Since most 
Recovery Act funds will be expended by 2013, some of the 
recommendations that OMB acts upon may not be implemented in time to 
affect single audits of grant programs funded under the Recovery Act. 
However, OMB's efforts to enhance single audits could, if properly 
implemented, significantly improve the effectiveness of the single 
audit as an accountability mechanism. OMB officials stated that they 
plan to review the "supergroup's" August 2011 report and develop a 
course of action for enhancing the single audit process, but have not 
yet developed a time frame for doing so. We will continue to monitor 
OMB's efforts in this area. 

(1) To address our recommendation to encourage timelier reporting on 
internal controls for Recovery Act programs for 2010 and beyond, we 
previously reported that OMB had commenced a second voluntary Single 
Audit Internal Control Project (project) in August 2010 for states 
that received Recovery Act funds in fiscal year 2010.[Footnote 93] The 
project has been completed and the results have been compiled as of 
July 6, 2011. One of the goals of these projects was to achieve more 
timely communication of internal control deficiencies for higher-risk 
Recovery Act programs so that corrective action could be taken more 
quickly. The project encouraged participating auditors to identify and 
communicate deficiencies in internal control to program management 3 
months sooner than the 9-month time frame required under statute. The 
projects also required that program management provide the federal 
awarding agency with a corrective action plan, aimed at correcting 
any deficiencies, 2 months earlier than required under statute. Upon 
receiving 
the corrective action plan, the federal awarding agency had 90 days to 
provide a written decision to the cognizant federal agency for audit 
detailing any concerns it may have with the plan.[Footnote 94] 

Fourteen states volunteered to participate in OMB's second project, 
submitted interim internal control reports by December 31, 2010, and 
developed auditee corrective action plans on audit findings by January 
31, 2011. However, although the federal awarding agencies were to have 
provided their interim management decisions to the cognizant agency 
for audit by April 30, 2011, only 2 of the 11 federal awarding 
agencies had completed the submission of all of their management 
decisions, according to an official from the Department of Health and 
Human Services, the cognizant agency for audit. In our review of the 
2009 project, we had noted similar concerns that federal awarding 
agencies' management decisions on proposed corrective actions were 
untimely, and our related recommendations are discussed later in this 
report. 

Overall, we found that the results for both projects were helpful in 
communicating internal control deficiencies earlier than required 
under statute. The projects' dependence on voluntary participation, 
however, limited their scope and coverage. This voluntary 
participation may also bias the projects' results by excluding from 
analysis states or auditors with practices that cannot accommodate the 
projects' requirement for early reporting of internal control 
deficiencies. Even though the projects' coverage could have been more 
comprehensive, the results provided meaningful information to OMB for 
better oversight of Recovery Act programs and for making future 
improvements to the single audit process. In August 2011, OMB 
initiated a third Single Audit Internal Control Project with 
requirements similar to those of the second project. 
The goal of this project is also to identify material weaknesses and 
significant deficiencies for selected Recovery Act programs 3 months 
sooner than the 9-month time frame currently required under statute so 
that the findings could be addressed by the auditee in a timely 
manner. This project also seeks to provide some audit relief for the 
auditors that participate in the project as risk assessments for 
certain programs are not required. We will continue to monitor the 
status of OMB's efforts to implement this recommendation and believe 
that OMB needs to continue taking steps to encourage timelier 
reporting on internal controls through Single Audits for Recovery Act 
programs. 

(2) We previously recommended that OMB evaluate options for providing 
relief related to audit requirements for low-risk programs to balance 
new audit responsibilities associated with the Recovery Act. OMB 
officials have stated that they are aware of the increase in workload 
for state auditors who perform Single Audits due to the additional 
funding to Recovery Act programs subject to audit requirements. OMB 
officials also stated that they solicited suggestions from state 
auditors to gain further insights to develop measures for providing 
audit relief. For state auditors that participated in the second and 
third OMB Single Audit Internal Control Projects, OMB provided some 
audit relief by modifying the requirements under Circular No. A-133 to 
reduce the number of low-risk programs to be included in some project 
participants' risk assessment requirements. However, OMB has not yet 
put in place a viable alternative that would provide relief to all 
state auditors that conduct Single Audits. 

(3) (4) With regard to issuing Single Audit guidance, such as the OMB 
Circular No. A-133 Compliance Supplement, in a timely manner, OMB has 
not yet achieved timeliness in its issuance of Single Audit guidance. 
We previously reported that OMB officials intended to issue the 2011 
Compliance Supplement by March 31, 2011, but instead issued it in 
June. OMB officials stated that the delay of this important guidance 
to auditors was due to the refocusing of its efforts to avert a 
governmentwide shutdown. OMB officials stated that although they had 
prepared to issue the 2011 Compliance Supplement by the end of March 
by taking steps such as starting the process earlier in 2010 and 
giving agencies strict deadlines for program submissions, they were 
not able to issue it until June 1, 2011. OMB officials developed a 
timeline for issuing the 2012 Compliance Supplement by March 31, 2012. 
In August 2011, they began the process of working with the federal 
agencies and others involved in issuing the Compliance Supplement. We 
will continue to monitor OMB's efforts in this area. 

(5) (6) Regarding the need for agencies to provide timely management 
decisions, OMB officials identified alternatives for helping to ensure 
that federal awarding agencies provided their management decisions on 
the corrective action plans in a timely manner, including possibly 
shortening the time frames required for federal agencies to provide 
their management decisions to grant recipients.[Footnote 95] OMB 
officials acknowledged that this issue continues to be a challenge. 
They told us they met individually with several federal awarding 
agencies that were late in providing their management decisions in the 
2009 project to discuss the measures that the agencies could take to 
improve the timeliness of their management decisions. However, as 
mentioned earlier in this report, most of the federal awarding 
agencies had not submitted all of their management decisions on the 
corrective actions by the April 30, 2011, due date in the second 
project (and still had not done so by July 6, 2011, when the results 
of the completed project were compiled). OMB officials have yet to 
decide on the course of action that they will pursue to implement this 
recommendation. 

OMB formed a Single Audit Metrics Workgroup to develop an 
implementation strategy for developing a baseline, metrics, and 
targets to track the effectiveness of single audits over time and 
increase the effectiveness and timeliness of federal awarding 
agencies' actions to resolve single audit findings. This workgroup 
reported its recommendations to OMB on June 21, 2011, proposing 
metrics that could be applied at the agency level, by program, to 
allow for analysis of single audit findings. OMB officials stated that 
they plan to initiate a pilot to implement the recommendations of this 
workgroup starting with fiscal year 2011 single audit reports. 

Newly Implemented Recommendation: 

We recommended that the Director of OMB provide more direct focus on 
Recovery Act programs through the Single Audit to help ensure that 
smaller programs with higher risk have audit coverage in the area of 
internal controls and compliance.[Footnote 96] 

Based on OMB's actions, we have concluded that OMB has addressed the 
intent of this recommendation. To provide direct focus on Recovery Act 
programs through the Single Audit to help ensure that smaller programs 
with higher risk have audit coverage in the area of internal controls 
and compliance, the OMB Circular No. A-133, Audits of States, Local 
Governments, and Non-Profit Organizations Compliance Supplement 
(Compliance Supplement) for fiscal years 2009 through 2011 required 
all federal programs with expenditures of Recovery Act awards to be 
considered as programs with higher risk when performing standard risk- 
based tests for selecting programs to be audited.[Footnote 97] The 
auditors' determinations of the programs to be audited are based upon 
their evaluation of the risks of noncompliance occurring that could be 
material to an individual major program. The Compliance Supplement has 
been the primary mechanism that OMB has used to provide Recovery Act 
requirements and guidance to auditors.[Footnote 98] One presumption 
underlying the guidance is that smaller programs with Recovery Act 
expenditures could be audited as major programs when using a risk-
based audit approach. The most significant risks are associated with 
newer programs that may not yet have the internal controls and 
accounting systems in place to help ensure that Recovery Act funds are 
distributed and used in accordance with program regulations and 
objectives. 

Since Recovery Act spending is projected to continue through 2016, we 
believe that it is essential that OMB provide direction in Single 
Audit guidance to help to ensure that smaller programs with higher 
risk are not automatically excluded from receiving audit coverage 
based on their size and standard Single Audit Act requirements. We 
spoke with OMB officials and reemphasized our concern that future 
Single Audit guidance provide instruction that helps to ensure that 
smaller programs with higher risk have audit coverage in the area of 
internal controls and compliance. OMB officials agreed and stated that 
such guidance will continue to be included in future Recovery Act 
guidance. We also performed an analysis of Recovery Act program 
selection for single audits of 10 states for fiscal year 
2010.[Footnote 99] In general, we found that auditors selected 
relatively more smaller, higher-risk programs with Recovery Act 
funding than in the previous period, resulting in a relative increase 
in the number of smaller Recovery Act programs selected for audit in 
7 of the 10 states we reviewed. 

Department of Transportation: 

Open Recommendations: 

To ensure that Congress and the public have accurate information on 
the extent to which the goals of the Recovery Act are being met, we 
recommended that the Secretary of Transportation direct the Department 
of Transportation's (DOT) Federal Highway Administration (FHWA) to 
take the following two actions:[Footnote 100] 

* Develop additional rules and data checks in the Recovery Act Data 
System, so that these data will accurately identify contract 
milestones such as award dates and amounts, and provide guidance to 
states to revise existing contract data. 

* Make publicly available--within 60 days after the September 30, 
2010, obligation deadline--an accurate accounting and analysis of the 
extent to which states directed funds to economically distressed 
areas, including corrections to the data initially provided to 
Congress in December 2009. 

Agency Actions: 

In its response, DOT stated that it implemented measures to further 
improve data quality in the Recovery Act Data System, including 
additional data quality checks, as well as providing states with 
additional training and guidance to improve the quality of data 
entered into the system. DOT also stated that as part of its efforts 
to respond to our draft September 2010 report in which we made this 
recommendation on economically distressed areas, it completed a 
comprehensive review of projects in these areas, which it provided to 
GAO for that report. DOT recently posted an accounting of the extent 
to which states directed Recovery Act transportation funds to projects 
located in economically distressed areas on its Web site, and we are 
in the process of assessing these data. 

Open Recommendation: 

To better understand the impact of Recovery Act investments in 
transportation, we believe that the Secretary of Transportation should 
ensure that the results of these projects are assessed and a 
determination made about whether these investments produced long-term 
benefits.[Footnote 101] Specifically, in the near term, we recommended 
that the Secretary direct FHWA and the Federal Transit Administration 
(FTA) to determine the types of data 
and performance measures they would need to assess the impact of the 
Recovery Act and the specific authority they may need to collect data 
and report on these measures. 

Agency Actions: 

In its response, DOT noted that it expected to be able to report on 
Recovery Act outputs, such as the miles of road paved, bridges 
repaired, and transit vehicles purchased, but not on outcomes, such as 
reductions in travel time, nor did it commit to assessing whether 
transportation investments produced long-term benefits. DOT further 
explained that limitations in its data systems, coupled with the 
magnitude of Recovery Act funds relative to overall annual federal 
investment in transportation, would make assessing the benefits of 
Recovery Act funds difficult. DOT indicated that, with these 
limitations in mind, it is examining its existing data availability 
and, as necessary, would seek additional data collection authority 
from Congress if it became apparent that such authority was needed. 
DOT plans to take some steps to assess its data needs, but it has not 
committed to assessing the long-term benefits of Recovery Act 
investments in transportation infrastructure. We are therefore keeping 
our recommendation on this matter open. 

Matters for Congressional Consideration: 

Matter: 

To the extent that appropriate adjustments to the Single Audit process 
are not accomplished under the current Single Audit structure, 
Congress should consider amending the Single Audit Act or enacting new 
legislation that provides for more timely internal control reporting, 
as well as audit coverage for smaller Recovery Act programs with high 
risk.[Footnote 102] 

We continue to believe that Congress should consider changes related 
to the Single Audit process. 

Matter: 

To the extent that additional coverage is needed to achieve 
accountability over Recovery Act programs, Congress should consider 
mechanisms to provide additional resources to support those charged 
with carrying out the Single Audit Act and related audits.[Footnote 
103] 

We continue to believe that Congress should consider changes related 
to the Single Audit process. 

Matter: 

To provide housing finance agencies (HFA) with greater tools for 
enforcing program compliance, in the event the Section 1602 Program is 
extended for another year, Congress may want to consider directing the 
Department of the Treasury to permit HFAs the flexibility to disburse 
Section 1602 Program funds as interest-bearing loans that allow for 
repayment. 

We have closed this Matter for Congressional Consideration because the 
Section 1602 Program has not been extended. 

[End of section] 

Appendix V: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

George Scott, (202) 512-5932 or scottg@gao.gov: 

Staff Acknowledgments: 

Phyllis Anderson, Cornelia Ashby, James Ashley, Thomas Beall, James 
Bennett, Deborah Bland, Barbara Bovbjerg, Robert Campbell, Sherwin 
Chapman, Andrew Ching, David Chrisinger, Stanley J. Czerwinski, Karen 
Febey, Alexander Galuten, Bryon Gordon, Monica Gomez, Sonya Harmeyer, 
Laura Heald, Sharon Hogan, Eric Holbrook, Tom James, Yvonne D. Jones, 
Jamila Kennedy, Ying Long, Amy Moran Lowe, Jean McSween, Sheila 
McCoy, Elizabeth Morrison, Maria Morton, Karen O'Conor, Robert Owens, 
Carol Patey, Kathy Peyman, Brenda Rabinowitz, Susan Ragland, Ellen 
Phelps Ranen, James Rebbe, Beverly Ross, Michelle Sager, Vernette 
Shaw, Glen Slocum, Jonathan Stehle, A.J. Stephens, Najeema Washington, 
and James Whitcomb. 

[End of section] 

Footnotes: 

[1] Pub. L. No. 111-5, 123 Stat. 115. 

[2] Across the United States, as of August 26, 2011, the Department of 
the Treasury has paid out $228.7 billion in Recovery Act funds for use 
in states and localities. Of that amount, $64 billion has been paid 
out since the beginning of fiscal year 2011 (Oct. 1, 2010). For 
updates, see [hyperlink, http://gao.gov/recovery]. 

[3] CRS-R40151, Funding for Education in the American Recovery and 
Reinvestment Act of 2009 (P.L. 111-5), Washington, D.C., April 14, 
2009. 

[4] There are some Recovery Act education funds that are not included 
in the scope of this review. For example, we did not review IDEA Part 
C grants or SFSF government services funds. 

[5] Pub. L. No. 111-5, § 901(a)(1), 123 Stat. 115, 191. 

[6] Pub. L. No. 111-5, § 1512(e), 123 Stat. 115, 288. FTE data provide 
insight into the use and impact of the Recovery Act funds, but 
recipient reports cover only direct jobs funded by the Recovery Act. 
These reports do not include the employment impact on suppliers 
(indirect jobs) or on the local community (induced jobs). Both data 
reported by recipients and other macroeconomic data and methods are 
necessary to understand the overall employment effects of the Recovery 
Act. 

[7] The 33 geographic districts comprising the New York City Public 
Schools were treated as one school district, which was placed in the 
stratum of the 100 largest LEAs. 

[8] In addition to our analyses of recipient report data for the 
education programs in our review, we continued, as in prior rounds, to 
perform edit checks and analyses on all prime recipient reports to 
assess data logic and consistency and identify unusual or atypical 
data. 

[9] See [hyperlink, http://gao.gov/recovery] for related GAO products. 

[10] States must use 81.8 percent of their SFSF formula grant funds to 
support education (these funds are referred to as education 
stabilization funds) and use the remaining 18.2 percent for public 
safety and other government services, which may include education 
(these funds are referred to as government services funds). The 
competitive grants included the Race to the Top program under which 
about $3.9 billion was awarded to 11 states and the District of 
Columbia, and the Investing in Innovation program under which nearly 
$650 million was awarded to 49 eligible entities, including school 
districts, non-profit education organizations, and institutions of 
higher education. 
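
To illustrate the allocation arithmetic, the following minimal Python 
sketch applies the 81.8/18.2 percent split to a hypothetical grant 
amount; it is not Education's actual computation. 

def sfsf_split(total_award):
    # 81.8 percent to education stabilization funds; the remaining
    # 18.2 percent to government services funds.
    return 0.818 * total_award, 0.182 * total_award

# Hypothetical $1 billion formula grant.
education, services = sfsf_split(1_000_000_000)
print(f"${education:,.0f} education stabilization")
print(f"${services:,.0f} government services")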

[11] Part B Section 611 funds are provided to assist states in 
providing special education and related services to children with 
disabilities aged 3 through 21. Part B Section 619 funds are provided 
to assist states in providing special education and related services 
to children with disabilities aged 3 through 5. Our review focused on 
the use of Part B Section 611 funds. 

[12] GAO, Recovery Act: Opportunities to Improve Management and 
Strengthen Accountability over States' and Localities' Uses of Funds, 
[hyperlink, http://www.gao.gov/products/GAO-10-999] (Washington, D.C.: 
Sept. 20, 2010). 

[13] For the SFSF program, only states, and not LEAs, have to meet MOE 
requirements. 

[14] The Recovery Act authorizes the Secretary of Education to waive 
the SFSF MOE requirement under certain circumstances. For more 
information on SFSF MOE see GAO, Recovery Act: Planned Efforts and 
Challenges in Evaluating Compliance with Maintenance of Effort and 
Similar Provisions, [hyperlink, 
http://www.gao.gov/products/GAO-10-247] (Washington, D.C.: Nov. 30, 
2009). 

[15] The standard for MOE differs for states and LEAs. IDEA prohibits 
states from reducing "state financial support for special education 
and related services for children with disabilities below the level of 
that support for the preceding fiscal year." 20 U.S.C. 
§ 1412(a)(18)(A). IDEA prohibits LEAs from reducing the level of 
expenditures for the education of children with disabilities below the 
level of those expenditures for the preceding fiscal year. 20 U.S.C. 
§ 1413(a)(2)(A)(iii). Education may waive the state MOE requirement for 
the Part B grants to states program under certain circumstances, but 
there is no provision allowing for a waiver for LEAs from the MOE 
requirement. 

[16] Generally, states are required to demonstrate "maintenance of 
effort" by showing that either their combined fiscal effort per 
student or the aggregate expenditures within the state with respect to 
the provision of free public education for the preceding fiscal year 
were not less than 90 percent of such combined fiscal effort or 
aggregate expenditures for the second preceding fiscal year. 20 U.S.C. 
§ 6337(e). 
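
Stated compactly (a restatement of the statutory test above, not 
additional authority), with E denoting either combined fiscal effort 
per student or aggregate expenditures for free public education, a 
state maintains effort for fiscal year y if: 

\[ E_{y-1} \geq 0.90 \times E_{y-2} \]

The LEA-level Title I test in the next note takes the same 90 percent 
form, applied to the LEA's total or per-pupil expenditures. 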

[17] An LEA may receive its full allocation of Title I, Part A funds 
for any fiscal year only if the state educational agency (SEA) 
determines that the LEA has maintained its fiscal effort in accordance 
with 20 U.S.C. § 7901. Specifically, an LEA must maintain its total or 
per-pupil expenditures in the preceding fiscal year at 90 percent or 
more of those expenditures in the second preceding fiscal year. 

[18] The Recovery Act education funds must be obligated by September 
30, 2011. 

[19] U.S. Department of Education, American Recovery and Reinvestment 
Act of 2009, Spending Report by Program, [hyperlink, 
http://www2.ed.gov/policy/gen/leg/recovery/spending/program.xls], 
September 9, 2011. 

[20] The National Governors Association and the National Association 
of State Budget Officers, The Fiscal Survey of States (Washington, 
D.C.: Spring 2011). Forty-six states begin their fiscal years in July 
and end them in June. The exceptions are Alabama and Michigan, with 
October to September fiscal years; New York, with an April to March 
fiscal year; and Texas, with a September to August fiscal year. 

[21] The Fiscal Survey of States, Spring 2011, "Table 9. Fiscal 2012 
Recommended Program Area Cuts," 11. 

[22] We used data from the U.S. Census Bureau's Small Area Income and 
Poverty Estimates (SAIPE) program to stratify LEAs by poverty status. 
The SAIPE program provides estimates by LEA of poverty rates, the 
number of children age 5 through 17 in poverty, and median household 
income. We defined an LEA to be high poverty if 20 percent or more of 
the children age 5 through 17 served by the LEA are from families with 
incomes below the poverty line. The margins of error for the estimates 
for LEAs with high and low poverty rates were plus or minus 9.3 and 
7.9 percentage points, respectively. 
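
For illustration only, a minimal sketch of the high-poverty 
classification described above (the function and example counts are 
illustrative assumptions; GAO's actual processing of SAIPE data may 
differ): 

# Classify an LEA as high poverty if 20 percent or more of the children
# age 5 through 17 it serves are from families below the poverty line.
def is_high_poverty(children_age_5_to_17, children_in_poverty):
    if children_age_5_to_17 <= 0:
        return False  # no served children; no rate to compute
    return children_in_poverty / children_age_5_to_17 >= 0.20

# Example: 1,200 of 5,000 served children in poverty is a 24 percent rate.
print(is_high_poverty(5_000, 1_200))  # True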

[23] The margin of error for this estimate was plus or minus 9.5 
percentage points. 

[24] The margins of error for the estimates for rural and urban LEAs 
were plus or minus 8.2 and 8.1 percentage points, respectively. 

[25] The margin of error for this estimate was plus or minus 8.4 
percentage points. 

[26] Pub. L. No. 111-226, § 101, 124 Stat. 2389 (2010). According to 
Education guidance, the funds are available for obligations that occur 
on or after August 10, 2010 (the date of enactment of the Act). An LEA 
that has funds remaining after the 2010-2011 school year may use them 
through September 30, 2012. 

[27] The Hawaii Department of Education is both an LEA and an SEA. 

[28] Center on Education Policy, Strained Schools Face Bleak Future: 
Districts Foresee Budget Cuts, Teacher Layoffs, and a Slowing of 
Education Reform Efforts (Washington, D.C.: June 2011). 

[29] 20 U.S.C. § 1413(a)(2)(A)(iii); 34 C.F.R. § 300.203(b). 

[30] To be eligible to exercise this flexibility, among other things, 
the LEA must be rated as "Meets Requirements" in its performance 
evaluation conducted by the SEA pursuant to 20 U.S.C. § 1416(a)(1)(C). 
20 U.S.C. § 1416(f). 

[31] 20 U.S.C. § 1413(a)(2)(C)(i). 

[32] Pub. L. No. 111-5, § 1601, 123 Stat. 115, 302. 

[33] GAO, Recovery Act: Funds Continue to Provide Fiscal Relief to 
States and Localities, While Accountability and Reporting Challenges 
Need to Be Fully Addressed, [hyperlink, 
http://www.gao.gov/products/GAO-09-1016] (Washington, D.C.: Sept. 23, 
2009). 

[34] The margin of error for this estimate was plus or minus 9.1 
percentage points. 

[35] The margin of error for this estimate was plus or minus 10.9 
percentage points. 

[36] 20 U.S.C. § 1412(a)(18)(A). 

[37] The Secretary may waive this requirement if the Secretary 
determines that granting a waiver would be equitable due to 
exceptional or uncontrollable circumstances, such as a natural 
disaster or a precipitous and unforeseen decline in the financial 
resources of the state. 20 U.S.C. § 1412(a)(18)(C)(i). 

[38] A partial waiver grants a portion of the amount requested by the 
state and denies the remainder. 

[39] Education officials reported that they try to reach agreement 
with the state on the state's required MOE level, the amount of the 
state's MOE shortfall, and other state budget data. 

[40] According to Education officials, whether and how much this 
reduction affects those states' future IDEA awards depends on a 
variety of factors in the IDEA funding formula, such as changes in 
federal appropriations for IDEA or changes in the state's population 
of children, including the number of children living in poverty. 

[41] Education officials told us this result is based on the 
interaction of two provisions of IDEA--20 U.S.C. § 1412(a)(18)(B) 
(requiring a reduction in a state's IDEA grant for any fiscal year 
following the fiscal year in which the state failed to maintain 
support) and 20 U.S.C. § 1411(d) (regarding IDEA funding requirements, 
which base funding in part upon the amount of IDEA funds a state 
received in the preceding fiscal year). 

[42] In addition to the integrated evaluation, Education's Institute 
of Education Sciences (IES) will conduct other Recovery Act-related 
evaluations that do not cover SFSF; IDEA, Part B; or ESEA Title I, 
Part A. For example, IES will evaluate the implementation and student 
outcomes related to the Race to the Top grants and will conduct an 
impact evaluation of the Teacher Incentive Fund, as mandated by the 
Recovery Act. According to IES officials, each evaluation went through 
an independent award process and has a different contractor. 

[43] Education established the $4 billion Race to the Top grant fund 
to encourage states to reform their elementary and secondary education 
systems and to reward states that have improved student outcomes, such 
as high school graduation rates. (For more information on Race to the 
Top, see GAO, Race to the Top: Reform Efforts Are Under Way and 
Information Sharing Could Be Improved, [hyperlink, 
http://www.gao.gov/products/GAO-11-658] (Washington, D.C.: June 30, 
2011)). Through the Teacher Incentive Fund, Education awards 
competitive grants to states and school districts to support efforts 
to develop and implement performance-based teacher and principal 
compensation systems in high-need schools. The Ed Tech program is 
intended to improve student academic achievement through the use of 
technology in elementary and secondary schools. 

[44] According to Education, EDFacts is an initiative to put 
performance data at the center of policy, management, and budget 
decisions for all K-12 education programs. It is a multidimensional 
data system that includes: (1) an electronic submission system that 
receives data from states, districts, and schools; (2) tools for 
analyzing submitted data; and (3) reporting tools for 
Education staff and data submitters to ensure better use of those data. 

[45] In April 2011, Mathematica Policy Research and the American 
Institutes for Research provided guidance to states on evaluating 
Recovery Act programs and other educational reforms. The guidance 
provides a framework for thinking about evaluations and examples of 
how to apply it, such as illustrating how recipients might evaluate 
professional development targeted to teachers and instructional 
leaders. 

[46] To receive the second phase of SFSF funding (Phase II), states 
had to complete an application describing their ability to provide 
data addressing 37 indicators and descriptors (34 indicators, 3 
descriptors) that support the four reform assurances they made to 
receive their initial SFSF funding: (1) advancing equity in teacher 
distribution; (2) enhancing standards and assessments; (3) supporting 
struggling schools; and (4) establishing a statewide longitudinal data 
system. Education officials said that states are only required to 
certify that they will meet the assurances and do not have to 
undertake new initiatives or otherwise indicate that Recovery Act 
funds are being spent directly on meeting the assurances. 

[47] As we have reported, finalizing the requirements for the SFSF 
program represented a significant effort by Education that will allow 
it to document and track the status of the SFSF reform assurances. 
Moreover, Education sought to use existing data to populate the 
indicators wherever possible so as to minimize the burden on states 
and LEAs. GAO, Recovery Act: States' and Localities' Uses of Funds and 
Actions Needed to Address Implementation Challenges and Bolster 
Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] 
(Washington, D.C.: May 26, 2010). 

[48] Establishing longitudinal data systems that include 12 specific 
elements is one of the assurances that states must make to be eligible 
to receive their portion of SFSF. One of the 12 elements, for example, 
is a teacher identifier system with the ability to match teachers with 
students. 

[49] GAO has found that entities should use performance information, 
not simply collect it as a compliance exercise. GAO, Managing for 
Results: Enhancing Agency Use of Performance Information for 
Management Decision Making, [hyperlink, 
http://www.gao.gov/products/GAO-05-927] (Washington, D.C.: Sept. 9, 
2005). 

[50] As we have reported previously, some of these reform goals, such 
as improving standards and assessments, are more likely to be pursued 
at the state level than at the local level, while others, such as 
supporting struggling schools, may not apply to all LEAs. 

[51] As we have reported, some states' applications for SFSF funding 
described plans and initiatives that were conditioned on the receipt 
of funds, in addition to SFSF, under separate federal competitive 
grants that had not yet been awarded. GAO, Recovery Act: States' and 
Localities' Uses of Funds and Actions Needed to Address Implementation 
Challenges and Bolster Accountability, GAO-10-604 (Washington, D.C.: 
May 26, 2010). 

[52] Three of the SFSF indicators are displayed, along with several 
non-SFSF indicators, on a "United States Education Dashboard." The 
Dashboard is intended to show how the nation is progressing toward the 
administration's goal of having the highest proportion of college 
graduates in the world. For example, the Dashboard provides the latest 
percentage of 4th and 8th graders proficient on the National 
Assessment of Educational Progress (NAEP) reading and mathematics 
assessments for 2009 and whether this is a change from 2007. 

[53] For SFSF, states are responsible for assuring advancement of the 
reform areas, but LEAs were generally given broad discretion in how to 
spend the SFSF funds. As a result, Education and states bear the bulk 
of the risk, since LEAs have received funds whether or not they have 
pursued activities related to the assurances. 

[54] GAO, Grants Management: Enhancing Performance Accountability 
Provisions Could Lead to Better Results, [hyperlink, 
http://www.gao.gov/products/GAO-06-1046] (Washington, D.C.: Sept. 29, 
2006). 

[55] 74 Fed. Reg. 58,436. 

[56] GAO, Recovery Act: States' and Localities' Uses of Funds and 
Actions Needed to Address Implementation Challenges and Bolster 
Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] 
(Washington, D.C.: May 26, 2010). 

[57] Congress passed the Single Audit Act, codified, as amended, at 31 
U.S.C. ch. 75, to promote, among other things, sound financial 
management, including effective internal controls, with respect to 
federal awards administered by nonfederal entities. A single audit 
consists of (1) an audit and opinion on the fair presentation of the 
financial statements and of the Schedule of Expenditures of Federal 
Awards; (2) gaining an understanding of and testing internal control 
over financial reporting and over the entity's compliance with laws, 
regulations, and contract or grant provisions that have a direct and 
material effect on certain federal programs; and (3) an audit and an 
opinion on compliance with applicable program requirements for certain 
federal programs. The Single Audit Act requirements apply to state and 
local governments and non-profit organizations that expend $500,000 or 
more of federal awards in a year. 

[58] For further information about cash management issues we have 
previously reported, see [hyperlink, 
http://www.gao.gov/products/GAO-09-1016], 57-59, and [hyperlink, 
http://www.gao.gov/products/GAO-10-604], 30-32. 

[59] As part of a single audit, auditors opine on whether a recipient 
of federal program funds complied in all material respects with 
requirements described in the OMB Circular A-133 Compliance Supplement 
that are applicable to each of the federal programs selected by the 
auditors for compliance testing. A "qualified" opinion indicates that 
the audited entity was in material compliance with program 
requirements except for certain requirements indicated in the 
auditor's report. Auditors qualified their opinion on California's 
compliance in part because they found noncompliance with cash 
management requirements for the Title I program. Auditors qualified 
their opinion on Massachusetts' compliance in part because they found 
noncompliance with requirements applicable to its Federal Family 
Education Loans, Federal Direct Student Loans, and Vocational 
Rehabilitation Cluster programs, which were not included in the scope 
of this Recovery Act education program review. 

[60] Internal control means a process, effected by an entity's 
management and other personnel, designed to provide reasonable 
assurance regarding the achievement of objectives in the following 
categories: (1) effectiveness and efficiency of operations; (2) 
reliability of financial reporting; and (3) compliance with applicable 
laws and regulations. A material weakness is a deficiency, or 
combination of deficiencies, in internal control, such that there is a 
reasonable possibility that a material misstatement of the entity's 
financial statements will not be prevented, or detected and corrected 
on a timely basis. A significant deficiency is a deficiency, or a 
combination of deficiencies, in internal control that is less severe 
than a material weakness yet important enough to merit attention by 
those charged with governance. We reviewed 2010 Single Audit findings 
for the states we visited in person: California, Iowa, Massachusetts, 
and Mississippi, as well as for the 2 LEAs and 1 institution of higher 
education (IHE) we visited in each state. Our review covered Catalog 
of Federal Domestic Assistance (CFDA) grant numbers 84.010, 84.389, 
84.027, 84.391, 84.392, and 84.394. 

[61] California's 2009 Single Audit also identified deficiencies in 
cash management of Title I funds. GAO has previously reported on the 
state's ongoing cash management issues and the actions the SEA has 
taken to address them--see [hyperlink, 
http://www.gao.gov/products/GAO-09-830SP], [hyperlink, 
http://www.gao.gov/products/GAO-09-1017SP], [hyperlink, 
http://www.gao.gov/products/GAO-10-232SP], and [hyperlink, 
http://www.gao.gov/products/GAO-10-467T]. When we spoke with 
California Department of Education officials in May 2011, they stated 
that the Web-based reporting system to track LEA cash balances, which 
they began developing in 2009, had been expanded to include Title I 
and all federal programs. 

[62] The Davis-Bacon Act was enacted in 1931 in part to protect 
communities and workers from the economic disruption caused by 
contractors hiring lower wage workers from outside the local 
geographic area, thus obtaining federal construction contracts by 
underbidding contractors who pay local wage rates. The act generally 
requires that employers pay locally prevailing wage rates, including 
fringe benefits, to laborers and mechanics employed on federally 
funded construction projects in excess of $2,000. 

[63] Outlays are defined as the amount of funds obligated by Education 
and paid to grantees. 

[64] We use the term "recipient report" to refer to the reports 
required by section 1512 of division A of the Recovery Act. Pub. L. 
No. 111-5, § 1512, 123 Stat. 115, 287. 

[65] To perform this work, the Education OIG used data from the March 
31, 2010, recipient report. 

[66] GAO, Recovery Act: States' and Localities' Uses of Funds and 
Actions Needed to Address Implementation Challenges and Bolster 
Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] 
(Washington, D.C.: May 26, 2010), 186; and GAO, Recovery Act: 
Opportunities to Improve Management and Strengthen Accountability over 
States' and Localities' Uses of Funds, [hyperlink, 
http://www.gao.gov/products/GAO-10-999] (Washington, D.C.: Sept. 20, 
2010), 143. 

[67] According to Education officials, one recipient, Minnesota, did 
not report during the eighth reporting period because the state 
government was shut down over budget issues during the grantee 
reporting period. Consistent with procedures approved by the Office of 
Management and Budget and the Recovery Accountability and Transparency 
Board, Education issued Minnesota a waiver for reporting in the April-
June 2011 quarter for all education grants. Although not required to, 
Minnesota did report on the SFSF government services funds. 

[68] Exec. Order No. 13,576, § 3, 76 Fed. Reg. 35,297 (June 16, 2011). 

[69] H.R. 2146, 112th Cong. (2011); S. 1222, 112th Cong. (2011). 

[70] We excluded FTE counts associated with grants whose funding 
agency was the U.S. Department of the Interior. 

[71] See GAO, Recovery Act: States' and Localities' Uses of Funds and 
Actions Needed to Address Implementation Challenges and Bolster 
Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] 
(Washington, D.C.: May 26, 2010). 

[72] We stratified the population based on size, poverty level, and 
urban status. Regarding size, we identified and included the 100 
largest LEAs in the country. The 33 geographic districts comprising 
the New York City Public Schools were treated as one school district, 
which was placed in the stratum containing the 100 largest LEAs. 

[73] Pub. L. No. 111-5 § 1512(e), 123 Stat. 115, 288. 

[74] As with our previous reviews, we conducted these checks and 
analyses on all prime recipient reports to assess data logic and 
consistency and identify unusual or atypical data. For this eighth 
round of reporting, we continued to see only minor variations in the 
number or percent of reports appearing atypical or showing some form 
of data discrepancy. 
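
For illustration only, a sketch of the kind of logic and consistency 
edit checks described above (the record fields, values, and check 
criteria are hypothetical assumptions, not GAO's actual procedures): 

# Return a list of data-quality flags for one prime recipient report.
# Checks shown here are illustrative, not GAO's actual criteria.
def edit_check(report):
    flags = []
    if report.get("fte", 0) < 0:
        flags.append("negative FTE count")
    if report.get("expenditure", 0) > report.get("award_amount", 0):
        flags.append("expenditures exceed award amount")
    if not report.get("award_number"):
        flags.append("missing award number")
    return flags

# Example record with one inconsistency (all values are hypothetical).
sample = {"award_number": "EXAMPLE-01", "award_amount": 500_000,
          "expenditure": 650_000, "fte": 12.5}
print(edit_check(sample))  # ['expenditures exceed award amount']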

[75] U.S. Department of Education Office of Inspector General, 
American Recovery and Reinvestment Act: The Effectiveness of the 
Department's Data Quality Review Processes, ED-OIG/A19K0010 
(Washington, D.C.: August 2011). 

[76] GAO, Recovery Act: As Initial Implementation Unfolds in States 
and Localities, Continued Attention to Accountability Issues Is 
Essential, [hyperlink, http://www.gao.gov/products/GAO-09-580] 
(Washington, D.C.: Apr. 23, 2009); Recovery Act: States' and 
Localities' Current and Planned Uses of Funds While Facing Fiscal 
Stresses, [hyperlink, http://www.gao.gov/products/GAO-09-829] 
(Washington, D.C.: July 8, 2009); Recovery Act: Funds Continue to 
Provide Fiscal Relief to States and Localities, While Accountability 
and Reporting Challenges Need to Be Fully Addressed, [hyperlink, 
http://www.gao.gov/products/GAO-09-1016] (Washington, D.C.: Sept. 23, 
2009); Recovery Act: Recipient Reported Jobs Data Provide Some Insight 
into Use of Recovery Act Funding, but Data Quality and Reporting 
Issues Need Attention, [hyperlink, 
http://www.gao.gov/products/GAO-10-223] (Washington, D.C.: Nov. 19, 
2009); Recovery Act: Status of States' and Localities' Use of Funds 
and Efforts to Ensure Accountability, [hyperlink, 
http://www.gao.gov/products/GAO-10-231] (Washington, D.C.: Dec. 10, 
2009); Recovery Act: One Year Later, States' and Localities' Uses of 
Funds and Opportunities to Strengthen Accountability, [hyperlink, 
http://www.gao.gov/products/GAO-10-437] (Washington, D.C.: Mar. 3, 
2010); Recovery Act: States' and Localities' Uses of Funds and Actions 
Needed to Address Implementation Challenges and Bolster 
Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] 
(Washington, D.C.: May 26, 2010); Recovery Act: Opportunities to 
Improve Management and Strengthen Accountability over States' and 
Localities' Uses of Funds, [hyperlink, 
http://www.gao.gov/products/GAO-10-999] (Washington, D.C.: Sept. 20, 
2010); Recovery Act: Head Start Grantees Expand Services, but More 
Consistent Communication Could Improve Accountability and Decisions 
about Spending, [hyperlink, http://www.gao.gov/products/GAO-11-166] 
(Washington, D.C.: Dec. 15, 2010); Recovery Act: Energy Efficiency and 
Conservation Block Grant Recipients Face Challenges Meeting 
Legislative and Program Goals and Requirements, [hyperlink, 
http://www.gao.gov/products/GAO-11-379] (Washington, D.C.: Apr. 7, 
2011); Recovery Act: Funding Used for Transportation Infrastructure 
Projects, but Some Requirements Proved Challenging, [hyperlink, 
http://www.gao.gov/products/GAO-11-600] (Washington, D.C.: June 29, 
2011); and Recovery Act: Funds Supported Many Water Projects, and 
Federal and State Monitoring Shows Few Compliance Problems, [hyperlink, 
http://www.gao.gov/products/GAO-11-608] (Washington, D.C.: June 29, 
2011). 

[77] [hyperlink, http://www.gao.gov/products/GAO-10-604], 124-125. 

[78] [hyperlink, http://www.gao.gov/products/GAO-11-379], 36. 

[79] [hyperlink, http://www.gao.gov/products/GAO-11-379], 36-37. 

[80] [hyperlink, http://www.gao.gov/products/GAO-11-166], 39. 

[81] [hyperlink, http://www.gao.gov/products/GAO-10-604], 184. 

[82] [hyperlink, http://www.gao.gov/products/GAO-11-166], 39. 

[83] [hyperlink, http://www.gao.gov/products/GAO-11-166], 39. 

[84] [hyperlink, http://www.gao.gov/products/GAO-10-999], 189. 

[85] [hyperlink, http://www.gao.gov/products/GAO-10-604], 244. 

[86] [hyperlink, http://www.gao.gov/products/GAO-09-1016], 78. 

[87] [hyperlink, http://www.gao.gov/products/GAO-10-604], 247. 

[88] [hyperlink, http://www.gao.gov/products/GAO-09-829], 127. 

[89] [hyperlink, http://www.gao.gov/products/GAO-10-604], 247. 

[90] [hyperlink, http://www.gao.gov/products/GAO-10-999], 194. 

[91] [hyperlink, http://www.gao.gov/products/GAO-10-604], 247-248. 

[92] [hyperlink, http://www.gao.gov/products/GAO-10-999], 194. 

[93] OMB's second project is similar to its first Single Audit 
Internal Control project, which started in October 2009. Sixteen 
states participated in the first project. We assessed the results of 
the project and reported them in GAO-10-999. 

[94] The Department of Health and Human Services (HHS), the cognizant 
agency for audit, has designated the HHS Office of Inspector General 
to perform certain responsibilities relating to Single Audits. 

[95] The project's guidelines called for the federal awarding agencies 
to (1) perform a risk assessment of the internal control deficiencies 
and identify those posing the greatest risk to Recovery Act funding 
and (2) identify corrective actions taken or planned by the auditee. 
OMB guidance requires this information to be included in a management 
decision that the federal agency was to have issued to the auditee's 
management, the auditor, and the cognizant agency for audit. 

[96] [hyperlink, http://www.gao.gov/products/GAO-09-829], 127. 

[97] Congress passed the Single Audit Act, codified, as amended, at 31 
U.S.C. ch. 75, to promote, among other things, sound financial 
management, 
including effective internal controls, with respect to federal awards 
administered by nonfederal entities. The Single Audit Act requires 
states, local governments, and nonprofit organizations expending 
$500,000 or more in federal awards in a year to obtain an audit in 
accordance with the requirements set forth in the act. A Single Audit 
consists of (1) an audit and opinions on the fair presentation of the 
financial statements and the Schedule of Expenditures of Federal 
Awards; (2) gaining an understanding of and testing internal control 
over financial reporting and the entity's compliance with laws, 
regulations, and contract or grant provisions that have a direct and 
material effect on certain federal programs (i.e., the program 
requirements); and (3) an audit and an opinion on compliance with 
applicable program requirements for certain federal programs. 

[98] In addition to the annual edition of the Compliance Supplement, 
OMB may issue Compliance Supplement addendums during the year to 
update or provide further Recovery Act guidance. 

[99] Analysis was based on 2010 Single Audit data submitted to the 
federal government in accordance with the Single Audit Act for 10 
randomly selected state governments. 

[100] [hyperlink, http://www.gao.gov/products/GAO-10-999], 187-188. 

[101] [hyperlink, http://www.gao.gov/products/GAO-10-604], 241-242. 

[102] [hyperlink, http://www.gao.gov/products/GAO-09-829], 128. 

[103] [hyperlink, http://www.gao.gov/products/GAO-09-829], 128. 

[104] [hyperlink, http://www.gao.gov/products/GAO-10-604], 251. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: