This is the accessible text file for GAO report number GAO-11-193 
entitled '2010 Census: Data Collection Operations Were Generally 
Completed as Planned, but Long-standing Challenges Suggest Need for 
Fundamental Reforms' which was released on December 14, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer-term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to Congressional Requesters: 

December 2010: 

2010 Census: 

Data Collection Operations Were Generally Completed as Planned, but 
Long-standing Challenges Suggest Need for Fundamental Reforms: 

GAO-11-193: 

GAO Highlights: 

Highlights of GAO-11-193, a report to congressional requesters. 

Why GAO Did This Study: 

Although the U.S. Census Bureau (Bureau) generally completed the field 
data collection phase of the 2010 Census consistent with its 
operational plans, at $13 billion, 2010 was the costliest census in 
the nation’s history. Moving forward, it will be important to both 
refine existing operations as well as to reexamine the fundamental 
approach to the census to better address long-standing issues such as 
securing participation and escalating costs. As requested, this report 
reviews (1) the conduct of nonresponse follow-up (NRFU), where 
enumerators collect data from households that did not return their 
census forms, (2) the implementation of other field operations 
critical to a complete count, and (3) potential reexamination areas 
that could help produce a more cost-effective 2020 Census. The report 
is based on GAO’s analysis of Bureau data and documents, surveys of 
local census office managers, and field observations. 

What GAO Found: 

Nationally, the Bureau was well positioned to implement NRFU and 
subsequent field operations. The Bureau achieved a mail response rate 
of 63 percent, which was within its expectations, and recruited nearly 
3.8 million total applicants for census jobs, which was 104 percent of 
its staffing goal. Moreover, the Bureau completed NRFU under budget, 
reportedly spending $1.59 billion on the operation, about $660 million 
(29 percent) less than the Bureau initially estimated. Most of the 
Bureau’s local census offices (LCO) also completed NRFU ahead of the 
10-week allotted time frame. Despite these operational successes, the 
Bureau encountered some notable challenges. For example, the pace of 
NRFU may have fostered a culture that tended to emphasize speed over 
quality, as those LCOs with higher percentages of less-complete 
questionnaires were more likely to have completed NRFU in 53 days or 
less (the average time LCOs took to complete NRFU). The Bureau also 
had to overcome issues with critical information technology (IT) 
systems. For example, performance problems with the IT system used to 
manage NRFU led to processing backlogs. Although the Bureau developed 
workarounds for the issue, it hindered the Bureau’s ability to fully 
implement quality-assurance procedures as planned. 

The Bureau generally completed, consistent with its plans, other 
follow-up operations designed to improve the accuracy of the data. One 
of 
these activities was the vacant/delete check (VDC), where enumerators 
verified housing units thought to be vacant or nonexistent. The Bureau 
completed VDC two days ahead of schedule, but encountered duplicate 
addresses on the address list used for the operation, which could 
indicate a more systemic problem with the quality of the Bureau’s 
address list. 

While it will be important to refine existing census-taking activities—
many of which have been in place since 1970—results of prior censuses 
point to the fact that simply improving current methods will not bring 
about the reforms needed to control costs and maintain accuracy. The 
cost of conducting the census has, on average, doubled each decade 
since 1970. At the same time, because of demographic and attitudinal 
trends, securing a complete count has become an increasing challenge. 
As a result, a fundamental reexamination of the nation’s approach to 
the census will be needed for a more cost-effective enumeration in 
2020. Potential focus areas include new data collection methods; the 
tenure of the Census Director; and ensuring the Bureau’s approaches to 
human-capital management, knowledge sharing, and other internal 
functions are aligned toward delivering more cost-effective outcomes. 
The Bureau recognizes that fundamental changes are needed and has 
already taken some important first steps, including developing a 
strategic plan. To help ensure the Bureau’s efforts stay on track and 
to avoid problems it had in planning for prior censuses, it will be 
important for the Bureau to issue a comprehensive operational plan 
that includes performance goals, milestones, cost estimates, and other 
critical information that could be updated regularly. 

What GAO Recommends: 

GAO recommends that the Census Director refine NRFU and other field 
follow-up efforts by, among other things, emphasizing quality as much 
as speed during NRFU and by incorporating best practices in its IT 
acquisition-management policy. To help ensure reform efforts stay on 
track, the Bureau should develop an operational plan that integrates 
performance, budget, and other information. The Department of Commerce 
generally agreed with GAO’s findings and recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-11-193] or key 
components. For more information, contact Robert Goldenkoff at (202) 
512-2757 or goldenkoffr@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

NRFU Was Generally Successful; Refinements Could Improve Procedures 
for 2020: 

Key Follow-up Operations Were Generally Completed as Planned: 

Fundamental Reforms Will Be Needed for a More Cost-Effective Census in 
2020: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Comments from the Department of Commerce: 

Appendix III: GAO Contact and Staff Acknowledgments: 

Related GAO Products: 

Figures: 

Figure 1: The Average Cost of Counting Each Housing Unit (in Constant 
2010 Dollars) Has Escalated Each Decade While Mail Response Rates Have 
Declined: 

Figure 2: The Bureau Met Its Minimum Mail Response Rate Goal of 59 
Percent in All but 11 States, but Rates Generally Declined Compared to 
2000: 

Figure 3: The Expected and Actual Number of Cases Completed during 
NRFU: 

Figure 4: Digital Fingerprint Scanner: 

Abbreviations: 

AA: assignment area: 

Bureau: U.S. Census Bureau: 

CCM: census coverage measurement: 

FBI: Federal Bureau of Investigation: 

IT: information technology: 

LCO: local census office: 

MaRCS: Matching Review and Coding System: 

NPC: National Processing Center: 

NRFU: nonresponse follow-up: 

PBOCS: Paper-Based Operations Control System: 

PI: person interviewing: 

VDC: vacant/delete check: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

December 14, 2010: 

The Honorable Thomas R. Carper: 
Chairman: 
The Honorable John McCain: 
Ranking Member: 
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Darrell E. Issa: 
Ranking Member: 
Committee on Oversight and Government Reform: 
House of Representatives: 

The Honorable Wm. Lacy Clay: 
Chairman: 
The Honorable Patrick T. McHenry: 
Ranking Member: 
Subcommittee on Information Policy, Census and National Archives: 
Committee on Oversight and Government Reform: 
House of Representatives: 

One of the final acts of the decade-long census life cycle is to occur 
in the remaining days of 2010 when, as required by law, the U.S. 
Census Bureau (Bureau) is to release to the President the state 
population counts used to apportion Congress.[Footnote 1] Although 
some additional work and more data releases lie ahead, and information 
on the accuracy of the count is not scheduled to be available until 
early 2012, this much is clear: the Bureau generally completed the 
enumeration phase of the 2010 Census on schedule and consistent with 
its operational plans, and largely surmounted a series of risks that 
jeopardized the success of the headcount. 

As you know, an operationally successful census was no small 
accomplishment. Various social and demographic trends such as an 
increasingly diverse population and a distrust of government made a 
complete count an extraordinary challenge in 2010. At the same time, 
the Bureau had to overcome a variety of internal management challenges 
including shortcomings with critical information technology (IT) 
systems. 

We have long reported that the decennial census is a shared national 
undertaking, where the Bureau, Congress, government agencies at all 
levels, private organizations, and ultimately the public at large, all 
play vital roles in securing a complete count. That the Bureau 
completed key operations on schedule, obtained an acceptable 
participation rate, and is on track for meeting legally mandated 
deadlines for reporting population figures is a tremendous credit to 
the people of this nation for completing their census forms and 
cooperating with enumerators; the hundreds of thousands of career and 
temporary Bureau employees who diligently implemented a vast array of 
census-taking activities, often under difficult circumstances; public, 
private, tribal, and nonprofit organizations of all sizes for 
voluntarily partnering with the Bureau and raising awareness of the 
census; and finally to Congress, which provided the necessary support 
while holding the Bureau accountable for results. 

Despite these impressive achievements, the 2010 Census required an 
unprecedented commitment of resources, including recruiting more than 
3.8 million total applicants--roughly equivalent to the entire 
population of Oregon--for its temporary workforce; and it escalated in 
cost from an initial estimate of $11.3 billion in 2001 to around $13 
billion, the most expensive population count in our nation's history. 
Further, our oversight of the 1990, 2000, and now 2010 Censuses 
suggests that the fundamental design of the enumeration--in many ways 
unchanged since 1970--is no longer capable of delivering a cost- 
effective headcount given the nation's increasing diversity and other 
sociodemographic trends. 

Indeed, beginning in 1990, we reported that rising costs, difficulties 
in securing public participation, and other long-standing challenges 
required a revised census methodology, a view that was shared by other 
stakeholders.[Footnote 2] For 2010, the Bureau eliminated the long-
form questionnaire in an effort to boost response rates and refined 
other census-taking activities, but the basic approach to the 
enumeration is essentially the same as it was 40 years ago. Achieving 
acceptable results using these conventional methods has required an 
ever-larger investment of fiscal resources--resources that in the 
coming years will become increasingly scarce. 

In short, as the nation turns the corner on the 2010 Census, it will 
be vitally important to both identify lessons learned from the current 
decennial census to improve existing census-taking activities, as well 
as to reexamine and perhaps fundamentally transform the way the Bureau 
plans, tests, implements, monitors, and evaluates future enumerations 
in order to address long-standing challenges. 

As requested, this report (1) assesses the implementation of 
nonresponse follow-up (NRFU), the largest and most costly census field 
operation, in which the Bureau sends enumerators to collect data from 
households that did not mail back their census forms; (2) assesses the 
implementation of other key follow-up field operations that were 
critical for ensuring a complete count; and (3) identifies key 
questions and focus areas that will be important for the Bureau, 
Congress, and census stakeholders to consider now that planning for 
the next enumeration is underway. 

This report is one of three we are releasing today. Among the other 
two, one focuses on the Bureau's efforts to reach out to and enumerate 
hard-to-count populations, while the other examines the implementation 
of operations aimed at reducing census coverage errors. Both reports 
identify preliminary lessons learned, as well as potential focus areas 
for improvement.[Footnote 3] 

In reviewing NRFU, we examined the pace of production, the 
fingerprinting of census workers as part of a background check, and 
the performance of a critical automated system. The follow-up 
operations we reviewed for this report include the vacant/delete check 
(VDC), where the Bureau verifies the status of housing units flagged 
earlier in the census as being unoccupied or nonexistent; and census 
coverage measurement (CCM), where the Bureau assesses the completeness 
and accuracy of the census count. 

For all three objectives, we (1) analyzed Bureau cost and progress 
data as well as planning and other pertinent documents; (2) conducted 
periodic surveys of the Bureau's 494 local census office (LCO) 
managers using a series of online questionnaires that asked about 
their experience in managing LCO activities; and (3) made field 
observations at 28 locations across the country selected for various 
factors such as their geographic and demographic diversity, and 
including parts of such urban areas as Atlanta, Boston, Chicago, 
Detroit, New Orleans, New York City, San Francisco, and Tucson, as 
well as less-populated areas such as Meridian, Mississippi, and New 
Castle, Delaware. We also interviewed Bureau officials at headquarters 
and LCO managers and staff, and reviewed our prior work on the 
planning and implementation of the 1990, 2000, and 2010 Censuses. 
Moreover, to help inform a reexamination of the nation's approach to 
the census, in addition to the above, we reviewed our prior work on 
governmentwide reexamination, as well as leading practices and 
attributes in the areas of IT management, organizational performance, 
collaboration, stewardship, and human capital.[Footnote 4] Appendix I 
includes additional information on our scope and methodology and a 
list of LCOs we visited. Data presented in this report measuring 
operational timeliness and data quality were drawn from Bureau 
management and operational data systems. To assess the reliability of 
the data, we reviewed Bureau electronic documentation to gain 
information about the data and their sources, and followed up with 
agency officials knowledgeable about the data in cases where we had 
questions about potential errors or inconsistencies. On the basis of 
our efforts, we determined that the data were sufficiently reliable 
for the purposes of supporting the findings and recommendations in 
this report. 

We conducted this performance audit from December 2009 until December 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audits 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

On December 7, 2010, the Secretary of Commerce provided written 
comments on a draft of this report (see app. II). The Department of 
Commerce generally agreed with the overall findings and 
recommendations of the report. In addition, the Secretary of Commerce 
provided the Bureau's technical comments and suggestions where 
additional context might be needed, and we revised the report to 
reflect these comments where appropriate. 

Background: 

The decennial census is a constitutionally mandated enterprise 
critical to our nation. Census data are used to apportion 
congressional seats, redraw congressional districts, and help allocate 
hundreds of billions of dollars in federal aid to state and local 
governments each year. A complete count of the nation's population is 
an enormous challenge requiring the successful alignment of thousands 
of activities, hundreds of thousands of temporary employees, and 
millions of forms. Indeed, over the past year, in an effort to secure 
a complete count, the Bureau mailed out questionnaires to about 120 
million housing units for occupants to complete and mail back; hand-
delivered approximately 12 million questionnaires--mostly in rural 
locations as well as in areas along the Gulf Coast affected by recent 
hurricanes--for residents to fill out and return by mail; went door-to-
door collecting data from the approximately 46.6 million households 
that did not mail back their census forms; and conducted operations 
aimed at counting people in less-conventional dwellings such as 
migrant-worker housing, boats, tent cities, homeless shelters, nursing 
homes, dormitories, and prisons. In short, the decennial census is 
large, logistically complex, and, at a cost now estimated at around 
$13 billion, expensive. 

In developing the 2010 Census, the Bureau faced three significant 
internal challenges: critical IT systems had performance problems 
during testing, cost-estimates lacked precision, and some key 
operations were not tested under census-like conditions. These were 
some of the issues that led us to designate the 2010 Census a GAO high-
risk area in 2008.[Footnote 5] 

Although every census has its decade-specific difficulties, 
sociodemographic trends such as concerns over personal privacy, more 
non-English speakers, and more people residing in makeshift and other 
nontraditional living arrangements make each decennial increasingly 
challenging and do not bode well for the cost-effectiveness of future 
counts. As shown in figure 1, the cost of enumerating each housing 
unit has escalated from around $16 in 1970 to around $98 in 2010, in 
constant 2010 dollars (an increase of over 500 percent). At the same 
time, the mail response rate--a key indicator of a successful census-- 
has declined from 78 percent in 1970 to 63 percent in 2010. The mail 
response rate is an important figure because it determines the NRFU 
workload and ultimately, NRFU costs. In many ways, the Bureau has to 
invest substantially more resources each decade just to match the 
prior decennial's response rate. 

Figure 1: The Average Cost of Counting Each Housing Unit (in Constant 
2010 Dollars) Has Escalated Each Decade While Mail Response Rates Have 
Declined: 

[Refer to PDF for image: combined vertical bar and line graph] 

Year: 1970; 
Average cost per housing unit (in constant 2010 dollars): $16; 
Mail response rate: 78%. 

Year: 1980; 
Average cost per housing unit (in constant 2010 dollars): $30; 
Mail response rate: 75%. 

Year: 1990; 
Average cost per housing unit (in constant 2010 dollars): $39; 
Mail response rate: 66%. 

Year: 2000; 
Average cost per housing unit (in constant 2010 dollars): $70; 
Mail response rate: 66%. 

Year: 2010; 
Average cost per housing unit (in constant 2010 dollars): $98 
(projected); 
Mail response rate: 63%. 

Source: GAO analysis of Census Bureau data. 

Note: In the 2010 Census the Bureau used only a short-form 
questionnaire. For this report we use the 1990 and 2000 Census short- 
form mail response rate when comparing 1990, 2000, and 2010 mail-back 
response rates. Census short-form mail response rates are unavailable 
for 1970 and 1980, so we use the overall response rate. 

[End of figure] 
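The escalation shown in figure 1 can be checked with a quick 
calculation (a sketch in Python, using the constant-2010-dollar 
figures above):

```python
# Average cost to enumerate one housing unit, in constant 2010 dollars,
# taken from figure 1.
costs = {1970: 16, 1980: 30, 1990: 39, 2000: 70, 2010: 98}

# Percentage increase from 1970 to 2010.
pct_increase = (costs[2010] - costs[1970]) / costs[1970] * 100
print(f"{pct_increase:.1f}%")  # 512.5% -- i.e., "an increase of over 500 percent"
```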

In our earlier work on high-performing organizations, we noted that 
the federal government must confront a range of new challenges to 
enhance performance, ensure accountability, and position the nation 
for the future.[Footnote 6] Nothing less than a fundamental 
transformation in the people, processes, technology, and environment 
that federal agencies use to pursue public goals will be necessary to 
meet public needs. Ultimately, however, the federal government 
needs to change its culture to be more results-oriented. For the 
Bureau, as with all federal agencies, this means ensuring, among other 
things, that its culture embraces results rather than outputs; follows 
matrixes rather than stovepipes; forms partnerships rather than 
protecting turf; focuses on risk management rather than risk 
avoidance; and takes proactive approaches rather than behaving 
reactively. 

NRFU Was Generally Successful; Refinements Could Improve Procedures 
for 2020: 

The Bureau Met Its Response Rate Goal, but Recruited More Enumerators 
Than Needed and Should Revisit Its Staffing Model: 

Nationally, in terms of workload (as determined by the mail response 
rate) and staffing levels, the Bureau was well positioned to implement 
NRFU. With respect to the response rate, the Bureau expected a level 
of 59 percent to 65 percent. The actual mail response rate on April 
19, when the Bureau initially determined the universe of housing 
units to visit for NRFU, was just over 63 percent, well within the 
Bureau's 
range of estimates. This translated into an initial workload of 48.6 
million housing units. 

Achieving this response rate was an important accomplishment as the 
nation's population is growing steadily larger, more diverse, and 
according to the Bureau, increasingly difficult to find and reluctant 
to participate in the census. High response rates are essential 
because they save taxpayer dollars. According to the Bureau, for every 
1 percentage point increase in mail response in 2010, the Bureau saved 
$85 million that would otherwise have been spent on the follow-up 
efforts. According to the Bureau, it costs 42 cents to mail back each 
census form in a postage-paid envelope, compared with an average 
estimate of around $57 for field activities necessary to enumerate 
each housing unit in person. Moreover, mail returns tend to have 
better-quality data, in part because as time goes on after Census Day 
(April 1), people move or may have difficulty recalling who was 
residing with them. 
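The cost gap behind those savings can be illustrated with a simple 
ratio (a sketch using the approximate per-unit figures above; the 
Bureau's $85 million estimate comes from its own fuller cost model, 
not from this calculation):

```python
# Approximate per-housing-unit costs cited by the Bureau.
mail_cost = 0.42    # postage-paid mail-back of a completed form
field_cost = 57.00  # average field cost to enumerate a unit in person

# In-person follow-up costs over a hundred times more per unit.
ratio = field_cost / mail_cost
print(f"in-person enumeration costs about {ratio:.0f} times a mailed return")
```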

As illustrated in figure 2, the Bureau met its expected response rate 
in all but 11 states. The highest response rate (71.7 percent) was in 
Minnesota, while the lowest response rate (51 percent) was in Alaska. 
At the same time, response rates in all but two states--Hawaii and 
South Carolina--as well as the District of Columbia, declined anywhere 
from 0.8 to 8.2 percentage points when compared to 2000, thus 
underscoring the difficulty the Bureau will face in the future in 
trying to sustain response rates.[Footnote 7] The mail response rate 
is important because it helps the Bureau determine the housing units 
that failed to mail back the census questionnaires, and thus are 
included in the NRFU workload. 

Figure 2: The Bureau Met Its Minimum Mail Response Rate Goal of 59 
Percent in All but 11 States, but Rates Generally Declined Compared to 
2000: 

[Refer to PDF for image: illustrated U.S. map] 

2010 mail response rates: 

Met Bureau minimum mail response rate goal: 
Arkansas: 59.5% (-5.2); 
California: 64.6% (-4.3); 
Colorado: 64.6% (-4.9); 
Connecticut: 66.1% (-2.8); 
Delaware: 60.4% (-2.2); 
District of Columbia: 60.4% (1.5); 
Florida: 59.1% (-3.6); 
Georgia: 59.2% (-5); 
Hawaii: 60.1% (1); 
Idaho: 74.7% (-2.2); 
Illinois: 67.3% (-0.8); 
Indiana: 67% (-1.9); 
Iowa: 71.3% (-3.4); 
Kansas: 67.4% (-2.3); 
Kentucky: 62.6% (-2.6); 
Maryland: 66.1% (-1.7); 
Massachusetts: 65% (-3.5); 
Michigan: 65.4% (-5.8); 
Minnesota: 71.7% (-2.8); 
Missouri: 65.5% (-3.2); 
Montana: 63% (-5.4); 
Nebraska: 69.1% (-5.1); 
New Hampshire: 61.8% (-5.0); 
New Jersey: 64.1% (-3.2); 
New York: 60.4% (-2.2); 
North Carolina: 61.5% (-0.9); 
North Dakota: 67.1% (-4.7); 
Ohio: 66.2% (-5.0); 
Oregon: 63.9% (-2.5); 
Pennsylvania: 67.6% (-2.1); 
Rhode Island: 62.4% (-3.7); 
South Carolina: 60.7% (2.7); 
South Dakota: 65.4% (-8.2); 
Tennessee: 63.4% (-0.8); 
Texas: 60% (-3.1); 
Utah: 65.3% (-1.8); 
Virginia: 66.2% (-4.1); 
Washington: 63.4% (-1.7); 
Wisconsin: 71.4% (-4.3); 
Wyoming: 61.3% (-3.8). 

Did not meet Bureau minimum mail response rate goal: 
Alabama: 58.6% (-2.1); 
Alaska: 51% (-4.8); 
Arizona: 58.2% (-4.5); 
Louisiana: 57.4% (-2.3); 
Maine: 55.2% (-6.2); 
Mississippi: 57.5% (-4.1); 
Nevada: 58.4% (-6.5); 
New Mexico: 56.7% (-4.0); 
Oklahoma: 58.8% (-5.2); 
Vermont: 57.9% (-4.2); 
West Virginia: 56.9% (-6.7). 

Source: GAO analysis of preliminary Census Bureau data; Map Resources 
(map). 

Note: Number reflects the 2010 response rate as of April 19, 2010. 
Number in parentheses reflects the percentage-point change in response 
rate from 2000 to 2010. 2000 response rate as of April 18, 2000. 

[End of figure] 

The mail response rate differs from the participation rate in that it 
is calculated as a percentage of all housing units in the mail-back 
universe, including those that are later found to be nonexistent or 
unoccupied. In contrast, the participation rate is the percentage of 
forms mailed back by households that received them and is a better 
measure of cooperation with the census. According to a Bureau press 
release dated October 21, 2010, the nation achieved a final mail 
participation rate of 74 percent, matching the final mail 
participation rate achieved for the 2000 Census. Participation rates 
for 22 states and the District of Columbia met or exceeded their 2000 
Census rates. 
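The distinction between the two rates can be sketched with 
hypothetical counts (the numbers below are invented for illustration 
and are not Bureau figures):

```python
# Hypothetical counts, for illustration only.
mailback_universe = 100_000_000    # all housing units in the mail-back universe
vacant_or_nonexistent = 8_000_000  # units later found unoccupied or nonexistent
forms_returned = 63_000_000

# Mail response rate: returns as a share of the entire mail-back universe.
response_rate = forms_returned / mailback_universe * 100

# Participation rate: returns as a share of households that actually received
# a form, so it is always at least as high as the response rate.
participation_rate = (
    forms_returned / (mailback_universe - vacant_or_nonexistent) * 100
)

print(f"response: {response_rate:.0f}%  participation: {participation_rate:.1f}%")
```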

Key factors aimed at improving the mail response rate included the 
mailing of an advance letter, a reminder postcard, and an aggressive 
marketing and outreach program. In addition, this was the first 
decennial census in which the Bureau sent a second, or "replacement," 
questionnaire to households. Replacement questionnaires were sent to 
around 25 million households in census tracts that had the lowest 
response rates in the 2000 Census, and 10 million replacement 
questionnaires were sent to nonresponding households in other census 
tracts that had low-to-moderate response rates in 2000. 

With respect to staffing levels, the Bureau set a recruitment goal of 
nearly 3.7 million total applicants and, by April 25, 2010 (almost a 
week before the start of NRFU), had recruited more than 3.8 million 
total applicants, or 104 percent of its goal. (Once the Bureau had an 
adequate pool of candidates for 2010, it attempted to limit the number 
of additional applicants, taking such steps as discontinuing the 
advertising of census jobs in mailed-out census materials.) 

According to the Bureau, based on past experience, it set its 
recruiting goal at five times the number of persons who needed to be 
trained, both to ensure an ample pool of candidates in specific areas 
with specific skills and to ensure a sufficient supply of enumerators 
during the course of its field operations. The Bureau's 
approach was similar to that used for the 2000 Census despite vast 
differences in the economy. During the 2000 Census, the Bureau was 
recruiting in the midst of one of the tightest labor markets in nearly 
three decades. In contrast, during the 2010 Census, the Bureau was 
recruiting workers during a period of high unemployment. While having 
too few enumerators could affect the Bureau's ability to complete NRFU 
on schedule, overrecruiting has its own costs. For example, there are 
costs associated with administering and processing the test taken at 
the time an individual applies for a census job, as well as a $2 
charge to have a name background check run on all applicants. 
Overrecruiting can also be burdensome on applicants, who must find a 
test site and take a test before they can be hired for a census job--a 
job that, because the Bureau has overrecruited, may not be available. 
In looking forward to 2020, it will be important for the 
Bureau to more precisely refine its recruiting model based on lessons 
learned from the labor markets in both 2000 and 2010, and use this 
information to develop more accurate recruiting targets. It will also 
be important for the Bureau to adhere to recruiting goals so that 
additional costs are not incurred. 

The Bureau Completed NRFU $660 Million under Budget: 

The Bureau budgeted that NRFU would cost around $2.25 billion. 
However, by the end of the operation, the Bureau reported using 
approximately $1.59 billion, which was 29 percent lower than budgeted. 
The Bureau, with congressional approval, also set up a contingency 
fund of $574 million to cover additional expenses that could have been 
caused by unfavorable weather and other unforeseen events. However, in 
the end, contingency money was not needed to complete NRFU. 
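The reported savings can be checked against the rounded figures above 
(a sketch):

```python
# NRFU budget and reported spending, from the figures above.
budgeted = 2.25e9  # about $2.25 billion budgeted
actual = 1.59e9    # about $1.59 billion reported spent

savings = budgeted - actual
pct_under = savings / budgeted * 100
print(f"${savings / 1e9:.2f} billion ({pct_under:.0f} percent) under budget")
# $0.66 billion (29 percent) under budget
```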

While the Bureau conducted NRFU under budget, the difference between 
actual and expected NRFU costs also highlights the need for the Bureau 
to develop an accurate cost model in order to establish more credible 
cost estimates for 2020. In addition to NRFU, other census operations 
had substantial variances between their initial cost estimates and 
their actual costs. In our 2008 report, we noted that the Bureau had 
insufficient policies and procedures and inadequately trained staff 
for conducting high-quality cost estimation for the decennial census, 
and recommended that the Bureau take a variety of steps to improve the 
credibility and accuracy of its cost estimates, including performing 
sensitivity and uncertainty analyses.[Footnote 8] The Bureau generally 
agreed with our recommendations and is taking steps to address them. 

Most Local Census Offices Finished NRFU ahead of Schedule, but the 
Bureau's Ambitious Production Schedule May Have Produced Mixed Results: 

In conducting NRFU, it is important for enumerators to follow Bureau 
procedures for collecting complete and accurate data while keeping 
production on schedule so that subsequent activities can begin as 
planned. Timely completion of NRFU is also important because as time 
goes on, people move or might have difficulty remembering who was 
living in a household on Census Day. 

The Bureau went to great lengths to obtain complete data directly from 
household members. For example, Bureau procedures generally called for 
enumerators to make six attempts to reach each household on different 
days of the week at different times until they obtained needed 
information on that household. However, in cases where household 
members could not be contacted or refused to answer all or part of the 
census questionnaire, enumerators were permitted to obtain data via 
proxy (a neighbor, building manager, or other nonhousehold member 
presumed to know about the household's residents). If, after the 
required six attempts, an enumerator was unable to collect data from 
either the household or a proxy respondent, the enumerator submitted 
the incomplete questionnaire to the LCO (this is referred to as a 
"closeout interview"). Closeout interviews are processed at Bureau 
headquarters, where statistical methods are used to determine 
household information. 

For the 2010 Census, NRFU began May 1 and was scheduled to finish July 
10, 2010. However, a majority of LCOs finished their NRFU workloads 
ahead of this 10-week time frame. For example, by June 28, 2010, week 
8 of the NRFU operation, 342 of the Bureau's 494 LCOs (about 69 
percent) had completed 100 percent of their workload. 
Figure 3 shows the production levels over the course of NRFU. 

Figure 3: The Expected and Actual Number of Cases Completed during 
NRFU: 

[Refer to PDF for image: multiple line graph] 

Week of nonresponse follow-up: 1; 
Percentage of cases completed: 0; 
Percentage of cases expected to be completed: 0. 

Week of nonresponse follow-up: 2; 
Percentage of cases completed: 8; 
Percentage of cases expected to be completed: 13. 

Week of nonresponse follow-up: 3; 
Percentage of cases completed: 21; 
Percentage of cases expected to be completed: 25. 

Week of nonresponse follow-up: 4; 
Percentage of cases completed: 35; 
Percentage of cases expected to be completed: 40. 

Week of nonresponse follow-up: 5; 
Percentage of cases completed: 62; 
Percentage of cases expected to be completed: 55. 

Week of nonresponse follow-up: 6; 
Percentage of cases completed: 82; 
Percentage of cases expected to be completed: 65. 

Week of nonresponse follow-up: 7; 
Percentage of cases completed: 94; 
Percentage of cases expected to be completed: 79. 

Week of nonresponse follow-up: 8; 
Percentage of cases completed: 98; 
Percentage of cases expected to be completed: 89. 

Week of nonresponse follow-up: 9; 
Percentage of cases completed: 99.73; 
Percentage of cases expected to be completed: 96. 

Week of nonresponse follow-up: 10; (Scheduled end of nonresponse 
follow-up 2010); 
Percentage of cases completed: 99.99; 
Percentage of cases expected to be completed: 99. 

Source: GAO analysis of Census Bureau data. 

[End of figure] 

A number of factors helped most LCOs complete NRFU ahead of schedule. 
For example, the Bureau removed almost 2 million late mail returns 
prior to the start of NRFU, reducing the follow-up workload from 48.6 
million to 46.6 million housing units (a 4 percent reduction in NRFU 
workload). The removal of the late mail returns resulted in a 1.5 
percentage point increase in the mail response rate, saving 
approximately $127.5 million (based on the Bureau's estimate that a 1 
percentage point increase in the mail response rate would decrease 
workload costs by around $85 million). 
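The workload and savings figures above can be checked with simple 
arithmetic. The sketch below uses only the numbers reported in this 
paragraph; the $85 million-per-percentage-point figure is the Bureau's 
own estimate: 

```python
# Verifying the NRFU workload reduction and estimated savings
# using the figures cited in the report.
late_returns_removed = 2.0    # million housing units removed before NRFU
original_workload = 48.6      # million housing units
reduced_workload = original_workload - late_returns_removed  # 46.6 million

workload_reduction_pct = late_returns_removed / original_workload * 100

response_rate_gain_points = 1.5  # percentage-point rise in mail response
savings_per_point = 85           # $ millions per point (Bureau estimate)
estimated_savings = response_rate_gain_points * savings_per_point

print(f"Reduced workload: {reduced_workload:.1f} million units")
print(f"Workload reduction: {workload_reduction_pct:.1f}%")    # about 4%
print(f"Estimated savings: ${estimated_savings:.1f} million")  # $127.5 million
```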

Another factor that was instrumental to the success of NRFU was 
retaining a sufficiently skilled workforce. Because of high 
unemployment rates, turnover was far lower than anticipated. 
Advertising census jobs locally helped to ensure an adequate number of 
applicants, and according to the Bureau, mileage reimbursement may 
have been lower, in part because enumerators lived in and had local 
knowledge about the neighborhoods they were assigned. Further, people 
may have been more willing to cooperate with enumerators who were from 
their own community. For example, at a Native American village in New 
Mexico, local enumerators were aware that according to the community's 
customs it was considered offensive to launch into business without 
first engaging in conversation. In addition, local enumerators in 
hurricane-affected rural areas of Louisiana were able to successfully 
locate households based on their knowledge of the geography. For 
example, based on his familiarity with the area, one enumerator we 
observed was able to locate an assigned household not included on a 
census map using only a brief description, such as "a white house with 
a green roof away from the road." 

The Bureau also used weekly production goals that helped LCOs focus on 
the need to stay on schedule and to track their progress. However, 
several measures we reviewed underscored the challenge that LCOs faced 
in hitting these production goals while still maintaining data quality. 

Significantly, our analysis of Bureau data found that the fast pace of 
NRFU was associated with the collection of less-complete household 
data.[Footnote 9] Indeed, after controlling for such variables as 
response rate and local enumeration challenges, we found that LCOs 
with higher percentages of proxy interviews and closeout interviews 
were more likely to have finished NRFU in 53 days or less (the average 
amount of time LCOs took to complete their NRFU workloads) compared to 
LCOs with lower percentages of proxy and closeout interviews. As noted 
above, proxy interviews contain data provided by a nonhousehold member 
(e.g., a neighbor) and may thus be less reliable than information 
collected directly from a household member, while a closeout interview 
is one where no interview is conducted and household information is 
later determined using statistical methods at Bureau headquarters 
during data processing. 
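The association described above can be illustrated with a simple 
comparison. The data below are entirely hypothetical; GAO's actual 
analysis used Bureau data and controlled for variables such as 
response rate and local enumeration challenges: 

```python
# Illustrative sketch (hypothetical data): comparing proxy and closeout
# rates between LCOs that finished NRFU in 53 days or less (the average
# completion time) and those that took longer.
lcos = [
    # (days_to_finish, proxy_pct, closeout_pct) -- hypothetical values
    (49, 24.0, 3.1), (51, 22.5, 2.8), (53, 21.0, 2.5),
    (55, 17.0, 1.9), (58, 16.2, 1.6), (61, 15.5, 1.4),
]

fast = [x for x in lcos if x[0] <= 53]  # finished at or under 53 days
slow = [x for x in lcos if x[0] > 53]

def mean(vals):
    return sum(vals) / len(vals)

# In this hypothetical data, as in the pattern the report describes,
# the faster-finishing LCOs show higher proxy and closeout rates.
print(f"Fast LCOs: proxy {mean([p for _, p, _ in fast]):.1f}%, "
      f"closeout {mean([c for _, _, c in fast]):.1f}%")
print(f"Slow LCOs: proxy {mean([p for _, p, _ in slow]):.1f}%, "
      f"closeout {mean([c for _, _, c in slow]):.1f}%")
```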

The pace of NRFU and its potential effect on data quality was also 
reflected in the responses of a number of LCO managers we surveyed. 
For example, although almost half of the LCO managers responding to 
our survey said they were satisfied with their ability to meet 
production goals while maintaining data quality, almost 30 percent of 
respondents were dissatisfied with their ability to meet production 
goals while maintaining data quality (around 20 percent responded that 
they were neither satisfied nor dissatisfied). Some of the LCO 
managers commented that they felt undue pressure to finish the 
operation early (sometimes a month earlier than planned) and that 
finishing early could have come at the expense of accuracy. 

In one example, an LCO manager noted that it appeared as though the 
LCOs were in a race to finish NRFU as fast as possible, even if the 
best data were not received. Another LCO manager said that even though 
his office was surpassing the daily production goals, he was still 
being pressured to finish faster, and that accuracy was not mentioned. 
Also, LCO managers expressed frustration at production goals being 
changed frequently or unexpectedly moved earlier. 

Further, during our field visits, some LCO managers we spoke with at 
the start of NRFU were concerned about meeting production goals as 
there were not enough assignment area (AA) binders containing maps and 
address registers for every enumerator due to problems with the 
Bureau's Paper-Based Operations Control System (PBOCS), a key IT 
system that we discuss below. To ensure that enumerators had 
sufficient work, some crew leaders split up AA binders between two or 
more enumerators. This is contrary to Bureau procedures, which require 
each enumerator to have his or her own AA binder. When the binders are 
split, only one enumerator has the required maps. Without maps, an 
enumerator 
is unable to determine an assignment area's boundaries and ensure that 
the locations of all housing units are accurately recorded, which can 
affect data quality. 

Later in NRFU, managers at two LCOs we visited said they felt pressure 
to finish NRFU ahead of schedule. At one LCO, managers explained that 
the regional office wanted to finish NRFU by June 12, or approximately 
4 weeks ahead of schedule. However, that LCO was only 85 percent 
complete by week 5, and because NRFU procedures instruct enumerators 
to make up to six attempts to contact a household, managers were not 
sure how they were going to finish by June 12 without having to accept 
more refusals and incomplete interviews--leading to potentially more 
proxy and closeout interviews, thus reducing data 
quality. 

At the other LCO, production goals were stretched 15 percentage points 
above the national goal in order to complete NRFU ahead of schedule. 
One of the field supervisors at that office told us that he was able 
to meet the revised production goals by having enumerators share their 
workload. For example, in the morning, one enumerator would work the 
AA, and any remaining cases were given to another enumerator in the 
evening to complete. While this approach might have enhanced 
efficiency, the sharing of enumerator assignments makes it more 
difficult for the Bureau's quality-assurance procedures to identify 
enumerators who are not following procedures and may need to be 
retrained. Under the Bureau's procedures, AAs are to be assigned to 
one enumerator at a time. 

In late May 2010, while NRFU was still underway, we discussed with 
Bureau officials the pace of the operation and whether enumerators 
were more often accepting less-complete household information. In 
response, 
Bureau officials notified the LCOs and reminded them of the importance 
of following prescribed procedures. Moving forward, as the Bureau 
conducts its evaluations of its 2010 NRFU operation and begins 
planning for 2020, it will be important for Bureau officials to 
closely examine the quality of data collected during NRFU and the pace 
of the operation, and determine whether it is placing appropriate 
emphasis on both objectives. 

The Bureau Improved Its Procedures for Fingerprinting Employees, but 
More Work Is Needed: 

To better screen its workforce of hundreds of thousands of temporary 
census workers, the Bureau fingerprinted its temporary workforce for 
the first time in the 2010 Census.[Footnote 10] In past censuses, 
temporary workers were only subject to a name background check that 
was completed at the time of recruitment. The Bureau, however, 
encountered problems capturing fingerprints during address canvassing, 
an operation that the Bureau conducted in the summer of 2009 to verify 
every address in the country. According to the Bureau, 22 percent of 
the approximately 162,000 workers hired for address canvassing had 
unclassifiable prints, or fingerprints that were incomplete or 
unreadable. The Federal Bureau of Investigation (FBI) determined that 
this problem was generally the result of errors made when the prints 
were first taken at the LCOs, which affected the readability of the 
two fingerprint cards created for each individual. 

To address these problems, the Bureau improved its training procedures 
and purchased additional equipment in order to fingerprint some 
580,000 NRFU temporary employees. Specifically, the Bureau refined 
training manuals used to instruct LCO staff on how to take 
fingerprints, scheduled fingerprint training closer to when the prints 
were captured, and increased the length of training. Further, the 
Bureau used an oil-free lotion during fingerprinting that is believed 
to raise the ridges on fingertips to improve the legibility of the 
prints. 

The Bureau also revised its procedures to digitally capture a third 
and fourth set of fingerprints when the first two sets of fingerprint 
cards could not be read. The Bureau purchased around a thousand 
digital fingerprint scanners (see fig. 4) for this new effort. The 
Bureau estimated that this additional step could reduce the percentage 
of temporary workers with unclassifiable prints from 22 percent to 
approximately 10 to 12 percent, or an estimated 60,000 to 72,000 
temporary workers for NRFU. As of May 25, 2010, the Bureau had 
reduced the percentage of temporary workers with unclassifiable 
prints to 8.6 percent of the 635,251 processed, or approximately 
54,000 temporary workers. 
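The improvement between the two operations can be expressed as simple 
proportions, using only the counts and rates reported above: 

```python
# Unclassifiable-fingerprint counts before and after the Bureau's
# procedural changes, computed from the figures in the report.
address_canvassing_workers = 162_000
canvassing_unclassifiable_rate = 0.22   # 22% during 2009 address canvassing

nrfu_workers_processed = 635_251
nrfu_unclassifiable_rate = 0.086        # 8.6% as of May 25, 2010

canvassing_unclassifiable = (address_canvassing_workers
                             * canvassing_unclassifiable_rate)
nrfu_unclassifiable = nrfu_workers_processed * nrfu_unclassifiable_rate

print(f"Address canvassing: ~{canvassing_unclassifiable:,.0f} workers "
      f"with unclassifiable prints")
print(f"NRFU: ~{nrfu_unclassifiable:,.0f} workers "  # ~54,000, as reported
      f"with unclassifiable prints")
```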

Figure 4: Digital Fingerprint Scanner: 

[Refer to PDF for image: photograph] 

Source: GAO (May 2010). 

[End of figure] 

Fingerprint cards were sent from each LCO to the Bureau's National 
Processing Center (NPC) in Indiana, where they were scanned and sent 
to the FBI. We visited the NPC during peak processing and observed 
that it handled the workload without any glitches. 
However, capturing fingerprints at training sites did not go as well. 
Some LCOs mentioned that collecting fingerprints took more time than 
expected, thus reducing the time available for enumerator field 
training. In our observations, at one LCO it took an extra 2 hours to 
fingerprint enumerators, and at another fingerprinting took so long it 
carried over to the next day (which put the NRFU instructor behind 
schedule). Furthermore, almost 50 percent of LCO managers responding 
to our survey reported dissatisfaction with fingerprinting procedures, 
compared to about 30 percent of LCO managers who were satisfied. For 
example, LCO managers commented that they did not have enough time to 
train staff conducting the fingerprinting or did not have adequate 
fingerprinting supplies, such as cards and ink pads. Several LCO 
managers said that the process was time-consuming, yet the additional 
time spent did not produce higher-quality prints, possibly because 
staff did not have fingerprinting expertise. Although some LCO 
managers said they would have preferred more digital fingerprinting, 
others reported that the digital fingerprint scanners did not work 
well and were time-consuming to use. Looking forward to 2020, the 
Bureau should revise its training so that field staff have numerous 
opportunities to practice collecting fingerprints prior to each 
operation. 

Workarounds Helped Mitigate PBOCS Issues, but Continuing Problems 
Hampered the Implementation of Key Quality-Assurance Procedures: 

Since 2005, we have reported on weaknesses in the Bureau's management 
and testing of key 2010 Census IT systems. Although the IT systems 
ultimately functioned well enough for the Bureau to carry out the 
census, workarounds developed to address performance problems with 
PBOCS--a workflow-management system crucial for the Bureau's field 
operations--adversely affected the Bureau's ability to implement key 
quality-assurance procedures as planned. 

In June 2005, we noted that the Bureau had not fully implemented key 
practices important to managing IT, including investment management, 
system development and management, and enterprise architecture 
[Footnote 11] management.[Footnote 12] As a result, we concluded that 
the Bureau's IT investments were at increased risk of mismanagement, 
and were more likely to experience cost and schedule overruns and 
performance shortfalls. 

As development of the IT systems progressed, these risks were 
realized. For example, the Field Data Collection Automation program, 
which included the development of handheld computers to collect 
information for address canvassing and NRFU, experienced substantial 
schedule delays and cost increases.[Footnote 13] As a result, the 
Bureau later decided to abandon the planned use of handheld data 
collection devices for NRFU and reverted to paper questionnaires. The 
Bureau developed PBOCS to manage the operation. However, as we stated 
in several testimonies, even with the approach of Census Day, PBOCS 
had not demonstrated the ability to function reliably under full 
operational loads required to complete NRFU.[Footnote 14] We noted 
that the limited amount of time remaining to improve its reliability 
before it would be needed for key operations created a substantial 
challenge for the Bureau. 

Although the Bureau worked aggressively to improve PBOCS performance, 
the system experienced significant issues at the start of NRFU. For 
example, despite efforts to upgrade its hardware and software, PBOCS 
continued to experience system outages, slow performance, and problems 
generating and maintaining timely progress reports. The Bureau has 
attributed these issues, in part, to the compressed development and 
testing schedule, as well as to inadequate performance and interface 
testing. 

To mitigate the system's performance issues, the Bureau implemented 
various workarounds. For example, the Bureau frequently restricted the 
number of hours that PBOCS was available to users in order to 
implement software upgrades and perform other system maintenance 
activities. In addition, the Bureau restricted the number of 
concurrent PBOCS users at each LCO to help reduce demand on the 
system. These restrictions often limited each LCO to 3 to 5 
concurrent users, or about 1,500 to 2,500 users nationwide. 
According to a Bureau official with responsibility for PBOCS, the 
system was originally intended to provide access for over 7,000 
concurrent users. While these workarounds improved the reliability of 
PBOCS, LCO managers who responded to our survey were consistently 
dissatisfied with the restrictions on the number of users allowed at 
one time, and many commented that the restrictions adversely affected 
their ability to keep up with the workload. Further, the limitations 
on the number of concurrent users, combined with PBOCS outages and 
slow performance, delayed the shipping of questionnaires to the data 
capture centers and resulted in a peak backlog of nearly 12 million 
questionnaires at the LCOs. 

The substantial backlog of questionnaires hampered the Bureau's 
ability to effectively monitor productivity and data quality during 
NRFU as planned. Nearly 75 percent of LCO manager survey respondents 
were dissatisfied with the usefulness of PBOCS reports to plan and 
monitor work during NRFU. One dissatisfied respondent wrote that the 
unavailability of reports greatly hampered his LCO's ability to 
conduct NRFU efficiently. Almost 80 percent of responding 
LCO managers indicated that their LCO needed to put forth a 
substantial amount of extra effort to manually prepare reports to 
track productivity outside of PBOCS. The use of manual processes 
increased costs at the LCOs and raised the risk of human error. 

The backlog of questionnaires also hampered the Bureau's ability to 
conduct NRFU reinterviews, a quality-assurance operation designed to 
identify enumerators who intentionally or unintentionally produced 
data errors. PBOCS was to select a sample of cases from each 
enumerator's completed workload, and these cases would be 
reinterviewed by another enumerator. Once cases were selected, a 
quality-assurance enumerator attempted to reinterview the original 
NRFU respondents to verify that accurate data were collected during 
the initial NRFU interview. 

However, the backlog of questionnaires delayed the selection of 
reinterview cases and, as a result, some could not be conducted. For 
example, in areas with large populations of college students, the 
Bureau conducted NRFU early in order to maximize the probability of 
enumerating people before they were likely to move out from where they 
were living on Census Day. In some of those cases, reinterviews could 
not be conducted since the students had moved out by the time an 
enumerator was given the case for reinterview. In addition, it took 
longer to detect and retrain enumerators with performance problems. 
For example, LCO staff reported to us that, because of the delay in 
carrying out reinterviews, it was often too late to retrain 
enumerators because they had already finished their assignments and 
were released before the errors were identified. In cases where an 
enumerator had intentionally falsified work, the enumerator was 
supposed to be released and all his or her work was to be redone. 
However, because of the PBOCS delays, falsified cases were sometimes 
identified after the enumerator had finished his or her assignment, 
requiring the entire assignment area to be reenumerated. 

Identifying errors and falsifications early in the operation would 
have minimized the number of housing units that needed to be reworked 
and reduced the burden for respondents. For example, an LCO manager 
told us that her office was not able to detect an enumerator's 
falsification until after NRFU, when the enumerator had already moved 
on to the next operation, requiring the LCO to rework nearly 200 
cases. According to our survey, approximately 30 percent of LCO 
managers who experienced backlogs reported that they had substantial 
difficulty detecting errors or fraudulent interviewing as a result of 
the backlog, while more than 20 percent reported moderate difficulty 
and nearly 50 percent reported slight to no difficulty detecting 
errors or fraudulent interviewing as a result of the backlog. 

The implementation of various workarounds helped the Bureau 
successfully complete NRFU. However, the lack of a fully functioning 
PBOCS limited the Bureau's ability to effectively monitor productivity 
or implement quality-assurance procedures as documented in its 
operational plans. 

More generally, as the Bureau prepares for 2020, it will be 
important, among other actions, for the Bureau to continue to improve 
its ability to manage its IT investments. Leading up to the 2010 
Census, we made 
numerous recommendations to the Bureau to improve its IT management 
practices by implementing best practices in risk management, 
requirements development, and testing, as well as establishing an IT 
acquisition-management policy that incorporates best practices. 
[Footnote 15] While the Bureau implemented many of our 
recommendations, it did not implement our broader recommendation to 
institutionalize these practices at the organizational level. The 
challenges experienced by the Bureau in acquiring and developing IT 
systems during the 2010 Census further demonstrate the importance of 
establishing and enforcing a rigorous IT acquisition management policy 
Bureau-wide. In addition, it will be important for the Bureau to 
improve its ability to consistently perform key IT management 
practices, such as IT investment management, system development and 
management, and enterprise architecture management. The effective use 
of these practices can better ensure that future IT investments will 
be pursued in a way that optimizes mission performance. 

Key Follow-up Operations Were Generally Completed as Planned: 

Vacant/Delete Check Operation Finished ahead of Schedule but over 
Budget: 

To help ensure that people are counted only once and in the right 
place, as well as to collect complete and correct information about 
them, after NRFU the Bureau conducts a number of operations designed 
to improve the accuracy of the data. One of these operations is the 
VDC operation, where enumerators verified the Census Day status of 
vacant and deleted (nonexistent) housing units. VDC also attempted to 
enumerate late additions to the Bureau's address file, such as newly 
constructed housing, and units for which the mail-out questionnaire 
was returned blank or incomplete. The Bureau refers to these 
additional addresses as supplemental cases. VDC has the potential to 
boost the accuracy of the census, especially among traditionally 
undercounted populations. A similar operation in 2000 found that 22 
percent of housing units previously identified as vacant, and 25 
percent of those previously flagged for deletion, were indeed 
occupied. Changing the status of these units led to a net gain of 3.1 
million people in the 2000 population count. 

The Bureau completed the VDC operation on August 23, slightly ahead of 
the original planned completion date of August 25, but also over 
budget. The Bureau spent about $281 million on VDC, approximately 15 
percent over its baseline budget of $244 million. Bureau officials we 
spoke to attributed the operation's progress to the retention of 
experienced NRFU staff for VDC. They noted that VDC staff were 
knowledgeable about procedures and the locations in which they worked, 
and required less training than they would have if they had been newly 
hired. With respect to the cost overruns, the Bureau is analyzing why 
VDC exceeded its budget. According to a Bureau official, additional 
costs may be related to VDC cases being located farther apart than 
expected (which would require more staff time and mileage 
reimbursement) and to enumerators adding more new addresses than 
expected. 

The VDC workload of 8.7 million housing units (5.6 million units 
vacant or flagged for deletion, 2.9 million supplemental addresses, 
and 0.2 million additions during the operation) was substantially less 
than the Bureau's previous estimate of 10.4 to 15.4 million units. 
During our review we found that while the Bureau had updated its total 
cost estimates for VDC, it had not adjusted day-to-day cost and 
progress expectations for VDC to account for the reduced workload. Not 
having the most recent targets for VDC could have impeded the Bureau's 
ability to effectively monitor the progress of enumerators in the 
field. We discussed this with Bureau officials, and in mid-July they 
revised VDC cost and progress estimates to account for the smaller 
workload, as well as other changes, including an earlier start date 
and reduced staffing. 

Further, during our field observations, LCO staff told us that some 
VDC supplemental addresses had already been enumerated as occupied 
units during NRFU. These supplemental addresses were slightly 
different from the NRFU addresses (e.g., 123 Main Street versus 123A 
Main Street) and appeared to be duplicate addresses. Duplicate 
addresses are supposed to be checked during field verification (an 
operation to confirm the existence of certain housing units added to 
the Bureau's address file) and should not have been in the VDC 
workload. Because the issue could indicate a nationwide problem, we 
notified Bureau officials, and in response they instituted a new 
procedure to identify and process duplicate addresses without making a 
follow-up visit to the housing unit. Identifying duplicate addresses 
before they are enumerated a second time is important because 
unnecessarily visiting a previously counted housing unit can reduce 
the accuracy of census data and increase costs. 

To assess the reasons why VDC ran over budget, and as recommended in 
our June 2008 report, it will be important for the Bureau to document 
lessons learned for cost elements whose actual costs differed from 
the estimate.[Footnote 16] Doing so will allow the Bureau to develop 
a more accurate cost estimate for VDC in 2020. 
In addition, to ensure the accuracy of data collected during VDC, it 
will be important for the Bureau to research how duplicates were 
inadvertently included in the VDC workload, as this information will 
help the Bureau compile a better address list for VDC operations in 
2020. 

Census Coverage Measurement Redesigned with Smaller Sample to Reduce 
Nonsampling Errors: 

The Bureau attempts to conduct a complete and accurate count of the 
nation's population; nonetheless, some degree of coverage error is 
inevitable because of the inherent complexity of counting the nation's 
large and diverse population and limitations in census-taking methods. 
These census coverage errors can take a variety of forms, including a 
person missed (an undercount), a person counted more than once (an 
overcount), or a person who should not have been counted, such as a 
child born after Census Day (another type of overcount). And because 
census data are central to so many critical functions, it is essential 
to assess census accuracy and improve the process when possible. 

Statistical measurements of census coverage are obtained by comparing 
and matching the housing units and people counted by an independent 
sample or CCM survey to those counted by the census in and around the 
sample areas. The Bureau developed separate address lists--one for 
the entire nation of over 134 million housing units that it used to 
conduct the census and one for coverage-measurement sample areas--and 
collected each set of data through independent operations. The 
Bureau collected its CCM data from households in sample areas 
nationwide, as part of an operation that began in the middle of August 
and was completed in October 2010. 

In our April 2010 report, we noted that in December 2009 the Bureau 
made numerous changes to the design of CCM that would reduce 
nonsampling error--such as human errors made when recording data 
during interviews--in CCM and its resulting estimates of census 
accuracy, thus providing census data users with more-reliable 
estimates.[Footnote 17] These changes include increasing quality-
assurance reinterviewing, hiring more CCM supervisors, and adding 
training for interviewers to improve interview techniques for local or 
other special situations (such as interviewing people who became 
homeless or have had to move frequently during the housing crisis). 
The December decision also reduced the CCM sample size by nearly 45 
percent. The Bureau believes that this reduction will generate cost 
savings to pay for changes to reduce nonsampling error. We believe 
that, overall, these changes are reasonable efforts to improve survey 
quality. Although the reduction in sample size will reduce the 
precision of the estimates, the proposed changes should reduce 
nonsampling errors and thus provide users with more-reliable 
estimates. 

Another challenge highlighted in our April 2010 report on CCM was 
determining the optimal time to collect data for some 170,000 housing 
units during person interviewing (PI), which began August 14 and ended 
October 16, 2010. The issue is that if the Bureau starts PI too early, 
it increases the chance that it overlaps with census data collection, 
possibly compromising the independence of the two different sets of 
data and introducing what is referred to as a "contamination bias" 
error into CCM data. However, if the Bureau starts PI too late, it 
increases the chance that respondents will not accurately remember 
household information from Census Day, introducing error (known as 
"recall bias") in the CCM count. In that report we recommended that 
the Bureau assess the trade-offs between starting early and 
introducing contamination bias or starting later and risking recall 
bias. The Bureau responded that it planned to study and measure some 
recall errors, but that there was no study planned to measure 
contamination bias in 2010 due to concerns with the possible 
contamination of census results in the study area. However, since both 
types of errors--contamination bias and recall bias--could affect the 
Bureau's conclusions about the accuracy of the census, it will be 
important for the Bureau to implement our recommendation and assess 
the trade-offs between the two types of biases in timing decisions. 
Moreover, this assessment could help the Bureau better inform the 
optimal timing for future census and coverage-measurement data 
collection operations. 

Fundamental Reforms Will Be Needed for a More Cost-Effective Census in 
2020: 

While it will be important to assess and revamp existing census-taking 
activities, the results of prior enumerations underscore the fact that 
simply refining current methods--many of which have been in place for 
decades--will not bring about the reforms needed to control costs 
while maintaining accuracy given ongoing and newly emerging societal 
trends. Since 1970, the Bureau has used essentially the same approach 
to count the vast majority of the population. The Bureau develops an 
address list of the nation's housing units and mails census forms to 
each one for occupants to fill out and mail back. Over time, because 
of demographic and attitudinal trends, securing an acceptable response 
rate has become an increasing challenge, and the Bureau has spent 
more money with each census to obtain a complete count. Indeed, 
the cost of conducting the census has, on average, doubled each decade 
since 1970, in constant 2010 dollars. If that rate of cost escalation 
continues into 2020, the nation could be facing a $30 billion census. 

Despite the nation's greater investment in each census, the results 
are often no better than those of the previous decennial. For 
example, as noted 
earlier, while the unit cost of the census jumped from an average of 
around $70 in 2000 to around $98 in 2010, the mail response rate 
declined in 48 states. Our concerns about the rising cost and 
diminishing returns of the census are not new. In the mid-1990s, for 
example, we and others concluded that the established approach for 
taking the census in 1990 had exhausted its potential for counting the 
population cost-effectively and that fundamental design changes were 
needed to reduce census costs and improve the quality of data 
collected.[Footnote 18] 

A fundamental reexamination of the nation's approach to the census 
will require the Bureau to rethink its approach to planning, testing, 
implementing, monitoring, and evaluating the census, and to address 
such questions as: Why was a certain program initiated? What was its 
intended goal? Have significant changes occurred that affect its 
purpose? Does it use leading practices? 

As one example, a critical factor affecting the cost of the census is 
the necessity for the Bureau to follow up on nonresponding housing 
units. The hourly wages of enumerators, their productivity, mileage 
reimbursement, and the need, in some cases, to return several times to 
an address to obtain a response can all drive up costs. Administrative 
records from other government agencies, such as driver's license and 
school records, could significantly control costs if used in lieu of 
making multiple visits to a housing unit. However, the Bureau would 
first need to resolve a number of questions, including the quality and 
coverage of the information supplied by the records and the policy and 
legal implications of accessing them. 

On the basis of our earlier work on high-performing organizations, 
fundamental reforms will mean ensuring that the Bureau's 
organizational culture and structure, as well as its approach to 
strategic planning, human-capital management, internal collaboration, 
knowledge sharing, capital decision making, risk and change 
management, and other internal functions are aligned toward delivering 
more cost-effective outcomes.[Footnote 19] Indeed, some of the 
operational problems that occurred during the 2010 and prior censuses 
are symptomatic of deeper organizational issues. For example, the lack 
of staff skilled in cost-estimation during the 2010 Census points to 
inadequate human-capital planning, while IT problems stemmed from not 
fully and consistently performing certain functions including IT 
investment management. 

Going forward, it will be important for the Bureau, Congress, and 
other stakeholders to reach consensus on a number of reexamination 
areas, including the following, which have particular implications for 
controlling costs and improving accuracy: 

* Which data collection approaches, the Internet and administrative 
records among them, have the potential to improve data quality without 
compromising other Bureau goals and mandates such as confidentiality 
and timeliness? 

* To what extent can private-sector and other sources of information 
such as maps, address lists, and geographic databases be employed to 
help support the census? 

* How can the Bureau strengthen its partnerships with governmental and 
nongovernmental organizations, data users, grassroots organizations, 
and advisory groups to obtain their input and possibly better leverage 
their knowledge and services? What is the best way to maintain 
congressional and stakeholder involvement and dialog throughout the 
course of the decade? 

* What opportunities exist for the Bureau to leverage innovations in 
technology and social media to more fully engage census stakeholders 
and the general public throughout the decade on census issues, 
possibly identifying more cost-effective methods? 

* To what extent can the Bureau use the American Community Survey--an 
ongoing Bureau survey of population and housing characteristics that 
is conducted throughout the decade--as a platform to test new census 
methods and systems? 

* What are the implications of the Bureau's goal to conduct the 2020 
Census at a lower cost than the 2010 Census on a cost per housing unit 
basis, adjusted for inflation? For example, how would this spending 
limit affect such considerations as accountability and data quality? 

* How can the Bureau best balance the acquisition of advanced 
technology, some of which might not be fully mature until later in the 
decade, with the need to commit to particular systems sufficiently 
early in the decade to ensure the systems are fully tested and will 
work under census-like conditions? 

* To what extent can the Bureau control costs and improve accuracy by 
targeting census-taking activities using local response rate and 
sociodemographic information from the 2010 Census, as well as other 
data sources and empirical evidence? 

* What options exist for controlling the costs of particularly labor-
intensive operations such as NRFU and building the Bureau's master 
address list without sacrificing accuracy? 

* Can stakeholders reach agreement on a set of criteria that could be 
used to weigh the trade-offs associated with the need for high levels 
of accuracy on the one hand, and the increasing cost of achieving that 
accuracy on the other hand? 

The Bureau, recognizing that it cannot afford to continue operating 
as it has without fundamentally changing its method of doing business, 
has already taken some important first steps toward addressing these 
questions and other areas. For example, the Bureau is 
looking to reform certain aspects of its IT systems planning, in part 
to ensure that the technical infrastructure needed for 2020 will be 
tested many times before operations begin. The Bureau is also 
rebuilding its research directorate to lead early planning efforts, 
and has plans to assess and monitor the skills and competencies needed 
for the 2020 headcount and evaluate the feasibility of administrative 
records. 

Further, the Bureau has already developed a strategic plan for 2020 
and other related documents that, among other things, lay out the 
structure of the Bureau's planning efforts; outline the Bureau's 
mission and vision for 2020 and the goals the Bureau seeks to meet to 
accomplish its mission; and describe the Bureau's plans for the 
research and testing phase of the next enumeration. 

The Bureau's early planning efforts are noteworthy given the Bureau's 
long-standing challenges in this area. For example, in 1988, just 
prior to the 1990 Census, we noted that the Bureau's planning efforts 
generally started late, experienced delays, were incomplete, and 
failed to fully explore innovative approaches.[Footnote 20] Planning 
for the 2000 Census also had its shortcomings. According to the 
Bureau, staff with little operational experience played key roles in 
the design process, which resulted in impractical reform ideas that 
could not be implemented. We also noted that the 2000 Census suffered 
from a persistent lack of priority-setting, coupled with minimal 
research, testing, and evaluation documentation to promote informed 
and timely decision making. And, while the planning process for the 
2010 Census was initially more rigorous than for past decennials, in 
2004 we reported that the Bureau's efforts lacked a substantial amount 
of supporting analysis, budgetary transparency, and other information, 
making it difficult for us, Congress, and other stakeholders to 
properly assess the feasibility of the Bureau's design and the extent 
to which it could lead to greater cost-effectiveness compared to 
alternative approaches.[Footnote 21] As a result, in 2004, we 
recommended that the Bureau develop an operational plan for 2010 that 
consolidated budget, methodological, and other relevant information 
into a single, comprehensive document. 

Although the Bureau later developed specific performance targets and 
an integrated project schedule for 2010, the other elements we 
recommended were only issued piecemeal, if available at all, and were 
never provided in a single, comprehensive document. Because this 
information was critical for facilitating a thorough, independent 
review of the Bureau's plans, as well as for demonstrating to Congress 
and other stakeholders that the Bureau could effectively design and 
manage operations and control costs, we believe that had it been 
available, it could have helped stave off, or at least reduce, the IT 
and other risks that confronted the Bureau as Census Day drew closer. 

The Bureau's strategic plan for 2020, first issued in 2009, is a 
"living" document that will be updated as planning efforts progress. 
As the approach for 2020 takes shape, it will be important for the 
Bureau to avoid some of the problems it had in documenting the 
planning process for the 2010 Census, and pull all the planning 
elements together into a tactical plan or road map. This will help 
ensure the Bureau's reform initiatives stay on track, do not lose 
momentum, and coalesce into a viable path toward a more cost-effective 
2020 Census. On the basis of our work on planning for the 2010 Census, 
a road map for 2020 could include, but not be limited to, the 
following elements that could be updated on a regular basis: 

* specific, measurable performance goals, how the Bureau's efforts, 
procedures, and projects would contribute to those goals, and what 
performance measures would be used; 

* descriptions of how the Bureau's approaches to human-capital 
management, organizational structure, IT acquisitions, and other 
internal functions are aligned with the performance goals; 

* an assessment of the risks associated with each significant 
decennial operation, including the interrelationships between the 
operations and a description of relevant mitigation plans; 

* detailed milestone estimates for each significant decennial 
operation, including estimated testing dates, and justification for 
any changes to milestone estimates; 

* detailed life-cycle cost estimates of the decennial census that are 
credible, comprehensive, accurate, and well-documented as stipulated 
by Office of Management and Budget and GAO guidance; and: 

* a detailed description of all significant contracts the Bureau plans 
to enter into and a justification for the contracts. 

A comprehensive road map could generate several important benefits. 
For example, it could help ensure a measure of transparency and 
facilitate a more collaborative approach to planning the next census. 
Specifically, an operational plan could function as a template for 
2020 giving stakeholders a common framework to assess and comment on 
the design of the census and its supporting infrastructure, the 
resources needed to execute the design, and the extent to which it 
could lead to greater cost-effectiveness compared to alternative 
approaches. Further, it could be used to monitor the Bureau's progress 
in implementing its approach, and hold the agency accountable for 
results. Importantly, to the extent the plan--or aspects of it--are 
made available using social media tools, it could prompt greater and 
perhaps more constructive civic engagement on the census, by fostering 
an ongoing dialog involving individuals and communities of 
stakeholders throughout the decade. On December 8, 2010, the Senate 
approved a bill, the Census Oversight Efficiency and Management Reform 
Act of 2010.[Footnote 22] If enacted, this bill, among its other 
provisions, would require the Director of the Census to submit an 
annual comprehensive status report on the next decennial census, 
beginning with the 2020 decennial census, to the appropriate 
congressional committees. The specific requirements in the bill for 
the annual plan include most of the elements discussed above. 

Given the magnitude of the planning and transformation efforts facing 
the Bureau, another reexamination question is that of long-term 
stewardship of the endeavor. Specifically, as the research, 
development, and testing efforts for 2020 will play out over the 
decade-long census life cycle, what is the optimal way to ensure 
continuity and accountability for an enterprise that takes years to 
complete and extends beyond the tenure of many elected political 
leaders? 

Although the Director of the Census Bureau can, in concept, provide a 
measure of continuity, the average tenure of the 11 census directors 
who have served since July 1969 (not including the current director) 
was around 3 years, and only one director has served more than 5 
years. Moreover, in the decade leading up to the 2010 Census, the 
Bureau was led by four different directors and several acting 
directors. The turnover in the Bureau's chief executive officer 
position makes it difficult to develop and sustain efforts that foster 
change, produce results, mitigate risks, and control costs over the 
long term. 

Currently, census directors are nominated by the President and 
confirmed by the Senate. In contrast, the heads of a number of 
executive agencies serve fixed terms, including the Director of the 
Office of Personnel Management (4 years), the Commissioner of Labor 
Statistics (4 years), and the Commissioner of Internal Revenue (5 
years). 

The census bill, recently passed by the Senate and discussed above, 
includes a provision for a 5-year tenure for the Census Director. We 
believe that the continuity resulting from a fixed-term appointment 
could provide the following benefits to the Bureau: 

* Strategic vision. The Director needs to build a long-term vision for 
the Bureau that extends beyond the current decennial census. Strategic 
planning, human-capital succession planning, and life-cycle cost 
estimates for the Bureau all span the decade. 

* Sustaining stakeholder relationships. The Director needs to 
continually expand and develop working relationships and partnerships 
with governmental, political, and other professional officials in both 
the public and private sectors to obtain their input, support, and 
participation in the Bureau's activities. 

* Accountability. The life-cycle cost for a decennial census spans a 
decade, and decisions made early in the decade about the next 
decennial census guide the research, investments, and tests carried 
out throughout the entire 10-year period. Institutionalizing 
accountability over an extended period may help long-term decennial 
initiatives provide meaningful and sustainable results. 

Overall, the obstacles to conducting a cost-effective census have 
grown with each decade, and as the Bureau looks toward the next 
enumeration, it might confront its biggest challenge to date. As the 
Bureau's past experience has shown, early investments in planning can 
help reduce the costs and risks of its downstream operations. 
Therefore, while Census Day 2020 is 10 years away, it is not too early 
for stakeholders to start considering the reforms needed to help 
ensure the next headcount is as cost-effective as possible. 

Conclusions: 

Although the complete results of the 2010 Census are still some years 
away, several preliminary lessons learned for the next enumeration 
have already begun to emerge. They include the benefits of a 
replacement questionnaire, the removal of late mail returns from the 
NRFU workload, and hiring locally. Focus areas for improvement include 
revisiting the Bureau's staffing model, ensuring the Bureau emphasizes 
quality as well as production during NRFU, improving IT management, 
and ensuring that a high-quality address file is used to carry out VDC 
operations. 

That said, perhaps the most important lesson learned comes from the 
collective experience gained from the 1990, 2000, and now 2010 
enumerations: the Bureau goes to great lengths each decade to improve 
specific census-taking activities, but these incremental modifications 
have not kept pace with societal changes that make the population 
increasingly difficult to locate and count cost-effectively. 
Therefore, as the Bureau looks toward 2020, it will be important for 
it to reexamine both the fundamental design of the enumeration and 
its management and culture to ensure that the Bureau's 
business practices and systems enhance its capacity to conduct an 
accurate count, control costs, manage risks, and be more nimble in 
adapting to social, demographic, technological, and other changes that 
can be expected in the years ahead. 

The Bureau is taking some initial steps toward rethinking the census. 
At the same time, past experience has shown that the Bureau cannot 
plan and execute a successful enumeration on its own. Indeed, the 
noteworthy achievements of the 2010 Census occurred because of the 
shared efforts of the Bureau; its parent organizations, the 
Department of Commerce and the Economics and Statistics 
Administration; Congress; and thousands of other parties. It will be 
important for these and additional stakeholders to maintain their 
focus on the census throughout the decade in order to achieve desired 
results. Certain census reforms could require legislative changes, and 
any new procedures will need to be thoroughly vetted, tested, and 
refined. Although the next enumeration is 10 years away, the 
groundwork for building a new census infrastructure is already under 
way. The bottom line is that while the urgency of the 2010 Census has 
subsided, it is by no means any less important to the nation. 

Recommendations for Executive Action: 

As the Bureau plans for the next decennial census in 2020, in order to 
support efforts to reexamine the fundamental design of the decennial 
census, and help refine existing operations should they be used again 
in the 2020 Census, we recommend that the Secretary of Commerce direct 
the Under Secretary of the Economics and Statistics Administration, as 
well as the Census Director, to take the following six actions: 

* To help enhance the Bureau's performance and accountability, improve 
the transparency of the planning process, gauge whether the Bureau is 
on-track toward a more cost-effective 2020 Census, and foster greater 
public dialog about the census, the Bureau should develop an 
operational plan or road map for 2020 that integrates performance, 
budget, methodological, schedule, and other information that would be 
updated as needed and posted on the Bureau's Web site and other social 
media outlets, and develop a mechanism that allows for and harnesses 
input from census stakeholders and individuals. 

* To refine its approach to recruiting, the Bureau should evaluate 
current economic factors that are associated with and predictive of 
employee interest in census work, such as national and regional 
unemployment levels, and use these available data to determine the 
potential temporary workforce pool and adjust its recruiting approach. 

* To help ensure that NRFU results in the collection of high-quality 
data, the Bureau's procedures for the timely completion of NRFU should 
emphasize data quality and proper enumeration techniques as much as 
speed. 

* To improve the fingerprinting process of temporary workers, the 
Bureau should revise or modify training so that field staff are 
provided with numerous practice opportunities for collecting 
fingerprints prior to each operation. 

* To ensure that the Bureau improves its ability to manage future IT 
acquisitions, the Bureau should immediately establish and enforce a 
system-acquisition management policy that incorporates best practices 
in system- and software-acquisition management. 

* To help ensure the Bureau compiles an accurate address list for VDC 
operations in 2020, the Bureau should research how duplicate addresses 
were inadvertently included in the VDC workload. 

Agency Comments and Our Evaluation: 

The Secretary of Commerce provided written comments on a draft of this 
report on December 7, 2010. The comments are reprinted in appendix II. 
The Department of Commerce generally agreed with the overall findings 
and recommendations of the report. In addition, the Secretary of 
Commerce provided the Bureau's technical comments and suggestions 
where additional context might be needed, and we revised the report to 
reflect these comments where appropriate. 

The Bureau noted that our report did not acknowledge the steps it took 
to modify its recruiting plans prior to NRFU. However, we do discuss 
the Bureau's modifications to its recruiting plans. Specifically, we 
stated that "once the Bureau had an adequate pool of candidates for 
2010, it attempted to limit the number of additional applicants, 
taking such steps as discontinuing the advertising of census jobs in 
mailed out census materials." 

The Bureau also commented that it wanted to discuss our analysis that 
found that the fast pace of NRFU was associated with the collection of 
less-complete household data, noting that its own analysis of a 
similar question did not yield the same finding. On December 7, 2010, 
we met with Bureau staff to discuss the methodologies and variables 
used in each analysis. After discussing our methodology and results, 
Bureau staff explained that their analysis was preliminary and not as 
comprehensive as our analysis. Further, they acknowledged that they 
used a different methodology and different variables. 

The Bureau, in commenting on our finding related to fingerprinting 
temporary workers, noted that it was unclear how extending training, 
which usually requires spending more time and money, would streamline 
fingerprinting efforts. To clarify this section, we changed the body 
of the report. The text now reads, "In 
looking forward to 2020, the Bureau should revise or modify training 
so that field staff are provided with numerous practice opportunities 
for collecting fingerprints prior to each operation." 

We are sending copies of this report to the Secretary of Commerce, the 
Under Secretary of Economic Affairs, the Director of the U.S. Census 
Bureau, and interested congressional committees. The report also is 
available at no charge on GAO's Web site at [hyperlink, 
http://www.gao.gov]. 

If you have any questions on matters discussed in this report, please 
contact Robert Goldenkoff at (202) 512-2757 or by e-mail at 
goldenkoffr@gao.gov. Contact points for our Offices of Congressional 
Relations and Public Affairs may be found on the last page of this 
report. Key contributors to this report are listed in appendix III. 

Signed by: 

Robert Goldenkoff: 
Director: 
Strategic Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

The objectives of this report were to (1) assess the implementation of 
nonresponse follow-up (NRFU), the largest and most costly census field 
operation, in which the U.S. Census Bureau (Bureau) sends enumerators 
to collect data from households that did not mail back their census 
forms; (2) assess the implementation of other key follow-up field 
operations that were critical for ensuring a complete count; and (3) 
identify key questions and focus areas that will be important for the 
Bureau, Congress, and census stakeholders to consider now that 
planning for the next enumeration is under way. 

To meet our objectives, we used a combination of approaches and 
methods to examine the conduct of these operations. These included 
statistical analyses; surveys of the Bureau's 494 local census office 
(LCO) managers; analysis of mail response and form check-in rates from 
Bureau cost and progress systems; interviews with key Bureau 
headquarters officials and LCO managers and staff; observation of 
LCOs' NRFU operations; and reviews of relevant documentation including 
our prior work on the planning and implementation of the 1990, 2000, 
and 2010 Censuses. 

To examine the factors that affected the implementation of NRFU and 
vacant/delete check operations (VDC), we interviewed LCO managers and 
other supervisory staff and observed operations at 28 LCOs we visited 
across the country. We selected LCOs because they were located in hard-
to-count areas as determined by data from the 2000 Census. To make 
these selections, we also used other factors such as their percentage 
of rural population to obtain diversity in urban/rural populations, 
and proximity to hurricane-affected areas. Selections for VDC 
observations were based primarily on locations with high rates of 
vacant and delete classifications, and they were chosen to include a 
mix of urban, suburban, and rural LCOs located in all regions of the 
country. (See below for a complete list of the offices we visited.) 
During these visits, which took place from April to July 2010, we 
observed office operations to see how office staff were processing 
questionnaires using the Paper-Based Operations Control System (PBOCS) 
and capturing fingerprints with live scanners, attended enumerator 
training, and observed enumerators in the field go door-to-door to 
collect census data for NRFU, NRFU reinterview, and VDC. Because 
offices were judgmentally selected, our findings from these visits 
cannot be projected to the universe of LCOs. 

To obtain a national perspective on the conduct of NRFU and other 
field data collection operations, we conducted a panel survey of all 
494 LCO managers from March to August 2010 using six questionnaires. 
The survey was designed to examine (1) factors that affect the cost 
and performance of local data collection efforts, and (2) LCO 
managers' satisfaction with information technology (IT) systems and 
other management support functions. The response rate was at least 75 
percent for each survey questionnaire. 

The practical difficulties of developing and administering a survey 
may introduce errors--from how a particular question is interpreted, 
for example, or from differences in the sources of information 
available to respondents when answering a question. Therefore, we 
included steps in developing and administering the questionnaire to 
minimize such errors. For instance, we conducted pretests to check 
that (1) the questions were clear and unambiguous, (2) terminology was 
used correctly, (3) the questionnaire did not place an undue burden on 
agency officials, (4) the information could feasibly be obtained, and 
(5) the survey was comprehensive and unbiased. Pretest sites were 
selected for each wave to emphasize variation among urban and rural 
LCOs. Pretests were conducted over the phone, mostly as cognitive 
pretests in which the respondent completed the survey during the 
pretest. We made changes to the content and format of the 
questionnaire after review by a GAO survey expert and after each of 
the pretests, based on the feedback we received. 

To examine whether the pace of NRFU was associated with the collection 
of less-complete data, in addition to the efforts described above, we 
analyzed Bureau proxy and closeout rates, and the time it took for an 
LCO to complete the NRFU workload. In order to determine whether the 
durations of 2010 NRFU production activities were associated with 
lower-quality work, we conducted regression analyses using data from 
the Bureau's Cost and Progress System, PBOCS, and Matching Review and 
Coding System (MaRCS). These analyses assessed whether indicators of 
lower-quality enumeration such as the collection of proxy data from a 
neighbor and closeout interviews, where a housing unit is occupied but 
no interview was obtained, were associated with the number of days 
that the LCO spent conducting NRFU production activities, after 
adjusting for other factors associated with the timeliness of 
completion and workload. We used two regression models: one model 
tested the association between the number of days it took each LCO to 
complete 100 percent of its workload and quality factors; the other 
regression model tested the association between quick completion and 
quality factors. We also analyzed cost data weekly for both NRFU and 
VDC to determine whether those operations were within their respective 
budgets. 
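
The report does not publish its model specifications, so the following 
is a hypothetical sketch of the kind of regression described: ordinary 
least squares relating an indicator of lower-quality enumeration (the 
proxy-interview rate) to the number of days an LCO spent on NRFU, 
adjusting for workload. The data and variable names are synthetic 
illustrations, not Bureau data. 

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic LCO-level data (illustrative only): days each office took to
# complete its NRFU workload, workload size, and a proxy-interview rate.
n = 494                                    # number of local census offices
days = rng.uniform(30, 70, n)
workload = rng.uniform(20_000, 80_000, n)
# Build the synthetic outcome so that faster completion (fewer days)
# raises the proxy rate, the association the analysis tests for.
proxy_rate = 0.30 - 0.002 * days + 1e-6 * workload + rng.normal(0, 0.01, n)

# Ordinary least squares: proxy_rate ~ intercept + days + workload.
X = np.column_stack([np.ones(n), days, workload])
coef, *_ = np.linalg.lstsq(X, proxy_rate, rcond=None)
intercept, b_days, b_workload = coef

# A negative days coefficient means quicker completion is associated
# with a higher proxy rate, that is, less-complete data.
print(f"days coefficient: {b_days:.4f}")
```

The report's second model, which tested quick completion rather than 
total days, could be sketched the same way with a different 
completion-time variable. 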

To assess the reliability of the data, we reviewed Bureau electronic 
documentation to gain information about the data and their sources. We 
examined data from the Bureau's Cost and Progress, PBOCS, and MaRCS 
systems to check for logical errors and inconsistencies, and followed 
up with agency officials knowledgeable about the data in cases where 
we had questions about potential errors or inconsistencies, and to 
inquire about the accuracy and completeness of the entry and 
processing of the data. Values are updated by the Bureau throughout 
the operations, and may be revised by the Bureau even after the 
operations close. On the basis of our efforts, we determined that the 
data were sufficiently reliable for the purposes of this engagement. 

Finally, to identify preliminary steps the Bureau can take to help 
transform its management and culture, we reviewed our prior work on 
governmentwide reexamination, as well as leading practices and 
attributes in the areas of IT management, organizational performance, 
collaboration, stewardship, and human capital. In addition, we 
reviewed census planning material, prior GAO work on census planning 
and development efforts, and spoke with Bureau officials about their 
needs and plans for management and cultural transformation. 

We conducted this performance audit from December 2009 until December 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

Local Census Offices Visited in This Review: 

Tucson, Arizona; 
Fresno and San Francisco, California; 
New Castle, Delaware; 
Fort Myers, Florida; 
Atlanta, Georgia; 
Chicago (2 locations), Illinois; 
New Orleans, Louisiana; 
Baltimore and Seat Pleasant, Maryland; 
Boston, Massachusetts; 
Detroit, Michigan; 
Meridian, Mississippi; 
Cape Girardeau and St. Louis, Missouri; 
Las Vegas (2 locations), Nevada; 
Albuquerque, New Mexico; 
Bronx, Brooklyn, and Manhattan, New York; 
Asheville and Fayetteville, North Carolina; 
Philadelphia, Pennsylvania; 
Dallas and Houston, Texas; 
Washington, District of Columbia. 

[End of section] 

Appendix II: Comments from the Department of Commerce: 

Note: Page numbers in the draft report may differ from those in this 
report. 

United States Department Of Commerce: 
The Secretary of Commerce: 
Washington, D.C. 20230: 

December 7, 2010: 

Mr. Robert Goldenkoff: 
Director: 
Strategic Issues: 
U.S. Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Goldenkoff: 

The Department of Commerce appreciates the opportunity to comment on 
the U.S. Government Accountability Office's draft report entitled 
"2010 Census: Data Collection Operations Were Generally Completed as 
Planned, But Longstanding Challenges Suggest Need for Fundamental 
Reforms" (GAO-11-193). The Department of Commerce's comments on this 
report are enclosed. 

Sincerely, 

Signed by: 

Gary Locke: 

Enclosure: 

[End of letter] 

Department of Commerce Comments on the United States Government 
Accountability Office Draft Report Entitled "2010 Census: Data 
Collection Operations Were Generally Completed as Planned, But 
Longstanding Challenges Suggest Need for Fundamental Reforms"
(GAO-11-193) December 2010. 

The Department of Commerce (Department) would like to thank the U.S. 
Government Accountability Office (GAO) for its efforts in examining 
the 2010 Census Nonresponse Followup (NRFU) and other field operations 
to seek out improved approaches to securing greater participation from 
the public and to reduce extensive operating costs in the next census. 

The Department generally agrees with the overall findings and with the 
recommendations regarding items suggested for study for conducting the 
2020 Census. The Census Bureau does, however, wish to provide a few 
comments about the statements and conclusions in this report: 

* Page 13, second paragraph: "With respect to staffing levels, the 
Bureau set a recruitment goal of nearly 3.7 million qualified 
applicants and achieved 104 percent of this goal..." 

Response: As clarification, the Census Bureau notes that our goal was 
3.8 million total applicants, in order to yield a sufficient number of 
qualified applicants. Not all applicants we recruit qualify for census 
work. In Census 2000, only about 73 percent of the
applicants ended up being qualified. For the 2010 Census, this figure 
was about 77 percent. This comment also applies to a similar statement 
on page 3 of the report. 

* Page 13, last paragraph: "According to the Bureau, based on past 
experience, it set its recruiting goal at five times the number of 
persons .... develop more accurate recruiting targets. It will also be 
important for the Bureau to adhere to recruiting goals so that 
additional costs are not incurred." 

Response: The Census Bureau agrees that initial plans for 2010 Census 
recruiting were driven by its Census 2000 experience; however, this 
section of the report does not acknowledge that, based on the state of 
the economy in late 2009 and early 2010, the Census Bureau took 
several steps to modify its recruiting plans prior to the peak 
recruiting for 2010 field operations, such as NRFU. The Census Bureau 
certainly agrees that monitoring economic conditions closely to 
develop and implement a recruiting strategy for the 2020 Census is 
important. 

* Page 16, first paragraph: "...if an enumerator was unable to collect 
data from either the household or a proxy respondent a "closeout 
interview" was used where household information was later determined 
using statistical methods based on neighborhood characteristics." 

and: 

Page 19, first paragraph: "... a closeout interview is one where no 
interview is conducted and household information is later determined 
using statistical methods." 

Response: The Census Bureau recommends some slight revisions to these 
sentences. As currently written, these statements might be read to 
imply that field staff was responsible for determining household 
characteristics based on the characteristics of neighboring units. 
Explaining that this imputation step takes place at headquarters 
during data processing would provide clarity and additional accuracy. 
Also, a statement that implies that no data are collected during a 
closeout interview is not correct. Field staff sometimes obtained a 
population count directly from a resident. 

* Page 18, last paragraph: "Significantly, our analysis of Bureau data 
found that the fast pace of NRFU was associated with the collection of 
less complete household data." 

Response: The Census Bureau would be interested in discussing these 
findings in more detail, because its analysis of a similar question 
did not yield the same finding. 

* Page 24, lines 16-17: "In looking forward to 2020, the Bureau could 
streamline fingerprint taking by extending training sessions to allot 
more time for the process." 

Response: The Census Bureau would appreciate additional clarity 
regarding this recommendation. The Census Bureau is unclear as to ways 
in which extending training, which usually requires spending more time 
and money, would streamline fingerprinting efforts. 

* Page 47, second paragraph: "...the noteworthy achievements of the 
2010 Census occurred because of the shared efforts of the Bureau, 
Congress and thousands of other parties." 

Response: This sentence should specifically include the Department of 
Commerce and the Economics and Statistics Administration (ESA). 
Particularly in 2009 and 2010, ESA played a significant role in 
helping to make the 2010 Census a success. 

* Page 48, second paragraph: "...we recommend that the Secretary of 
Commerce direct the Census Director to take the following six 
actions:..." 

Response: The Secretary of Commerce should direct the Under Secretary 
of the Economics and Statistics Administration (ESA) as well as the 
Census Director. ESA has management oversight responsibility of the 
Census Bureau and has been actively engaged in planning for the 2020 
Census, including development of effective, efficient, and forward 
thinking integrated management approaches and systems that will result 
in successful and cost-effective operations across the bureau's 
programs and activities. 

* Page 50, second paragraph: "We are sending copies of this report to 
the Secretary of Commerce, the Director of the U.S. Census Bureau, and 
interested congressional committees." 

Response: Please also send a copy of the report to the Under Secretary 
for Economic Affairs (ESA). 

In conclusion, we want to acknowledge the GAO's extensive work in 
reviewing these activities, and appreciate its ongoing efforts to help 
us develop a successful evaluation plan for the 2020 Census. 

[End of section] 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Robert Goldenkoff, (202) 512-2757 or goldenkoffr@gao.gov: 

Staff Acknowledgments: 

Other key contributors to this report include Lisa Pearson, Assistant 
Director; Mark Abraham; David Bobruff; Benjamin Crawford; Sara 
Daleski; Dewi Djunaidy; Vijay D'Souza; Elizabeth Fan; Ronald Fecso; 
Robert Gebhart; Richard Hung; Kristen Lauber; Andrea Levine; Ty 
Mitchell; Kathleen Padulchick; Michael Pahr; Tind Ryen; Jonathan 
Ticehurst; Timothy Wexler; Holly Williams; Monique Williams; and Kate 
Wulff. 

[End of section] 

Related GAO Products: 

2010 Census: Key Efforts to Include Hard-to-Count Populations Went 
Generally as Planned; Improvements Could Make the Efforts More 
Effective for Next Census. [hyperlink, 
http://www.gao.gov/products/GAO-11-45]. Washington, D.C.: December 14, 
2010. 

2010 Census: Follow-up Should Reduce Coverage Errors, but Effects on 
Demographic Groups Need to Be Determined. [hyperlink, 
http://www.gao.gov/products/GAO-11-154]. Washington, D.C.: December 
14, 2010. 

2010 Census: Plans for Census Coverage Measurement Are on Track, but 
Additional Steps Will Improve Its Usefulness. [hyperlink, 
http://www.gao.gov/products/GAO-10-324]. Washington, D.C.: April 23, 
2010. 

2010 Census: Data Collection Is Under Way, but Reliability of Key 
Information Technology Systems Remains a Risk. [hyperlink, 
http://www.gao.gov/products/GAO-10-567T]. Washington, D.C.: March 25, 
2010. 

2010 Census: Key Enumeration Activities Are Moving Forward, but 
Information Technology Systems Remain a Concern. [hyperlink, 
http://www.gao.gov/products/GAO-10-430T]. Washington, D.C.: February 
23, 2010. 

2010 Census: Census Bureau Continues to Make Progress in Mitigating 
Risks to a Successful Enumeration, but Still Faces Various Challenges. 
[hyperlink, http://www.gao.gov/products/GAO-10-132T]. Washington, 
D.C.: October 7, 2009. 

2010 Census: Census Bureau Should Take Action to Improve the 
Credibility and Accuracy of Its Cost Estimate for the Decennial 
Census. [hyperlink, http://www.gao.gov/products/GAO-08-554]. 
Washington, D.C.: June 16, 2008. 

2010 Census: Census at Critical Juncture for Implementing Risk 
Reduction Strategies. [hyperlink, 
http://www.gao.gov/products/GAO-08-659T]. Washington, D.C.: April 9, 
2008. 

Information Technology: Significant Problems of Critical Automation 
Program Contribute to Risks Facing 2010 Census. [hyperlink, 
http://www.gao.gov/products/GAO-08-550T]. Washington, D.C.: March 5, 
2008. 

Information Technology: Census Bureau Needs to Improve Its Risk 
Management of Decennial Systems. [hyperlink, 
http://www.gao.gov/products/GAO-08-259T]. Washington, D.C.: December 
11, 2007. 

Information Technology Management: Census Bureau Has Implemented Many 
Key Practices, but Additional Actions Are Needed. [hyperlink, 
http://www.gao.gov/products/GAO-05-661]. Washington, D.C.: June 16, 
2005. 

21st Century Challenges: Reexamining the Base of the Federal 
Government. [hyperlink, http://www.gao.gov/products/GAO-05-325SP]. 
Washington, D.C.: February 1, 2005. 

Information Technology Investment Management: A Framework for 
Assessing and Improving Process Maturity. [hyperlink, 
http://www.gao.gov/products/GAO-04-394G]. Washington, D.C.: March 1, 
2004. 

Comptroller General's Forum, High-Performing Organizations: Metrics, 
Means, and Mechanisms for Achieving High Performance in the 21st 
Century Public Management Environment. [hyperlink, 
http://www.gao.gov/products/GAO-04-343SP]. Washington, D.C.: February 
13, 2004. 

Human Capital: Key Principles for Effective Strategic Workforce 
Planning. [hyperlink, http://www.gao.gov/products/GAO-04-39]. 
Washington, D.C.: December 11, 2003. 

[End of section] 

Footnotes: 

[1] 13 U.S.C. § 141(b). 

[2] See for example: GAO, Decennial Census: Preliminary 1990 Lessons 
Learned Indicate Need to Rethink Census Approach, [hyperlink, 
http://www.gao.gov/products/GAO/T-GGD-90-18] (Washington, D.C.: Aug. 
8, 1990); and 2000 Census: Progress Made on Design, but Risks Remain, 
[hyperlink, http://www.gao.gov/products/GAO/GGD-97-142] (Washington, 
D.C.: July 14, 1997). 

[3] GAO, 2010 Census: Key Efforts to Include Hard-to-Count Populations 
Went Generally as Planned; Improvements Could Make the Efforts More 
Effective for Next Census, [hyperlink, 
http://www.gao.gov/products/GAO-11-45] (Washington, D.C.: Dec. 14, 
2010); and 2010 Census: Follow-up Should Reduce Coverage Errors, but 
Effects on Demographic Groups Need to Be Determined, [hyperlink, 
http://www.gao.gov/products/GAO-11-154] (Washington, D.C.: Dec. 14, 
2010). 

[4] See for example: GAO, Information Technology Investment 
Management: A Framework for Assessing and Improving Process Maturity, 
[hyperlink, http://www.gao.gov/products/GAO-04-394G] (Washington, 
D.C.: March 2004); Human Capital: Key Principles for Effective 
Strategic Workforce Planning, [hyperlink, 
http://www.gao.gov/products/GAO-04-39] (Washington, D.C.: Dec. 11, 
2003); Comptroller General's Forum, High-Performing Organizations: 
Metrics, Means, and Mechanisms for Achieving High Performance in the 
21st Century Public Management Environment, [hyperlink, 
http://www.gao.gov/products/GAO-04-343SP] (Washington, D.C.: Feb. 13, 
2004); 21st Century Challenges: Reexamining the Base of the Federal 
Government, [hyperlink, http://www.gao.gov/products/GAO-05-325SP] 
(Washington, D.C.: February 2005); and Results-Oriented Government: 
Practices That Can Help Enhance and Sustain Collaboration among 
Federal Agencies, [hyperlink, http://www.gao.gov/products/GAO-06-15] 
(Washington, D.C.: Oct. 21, 2005). 

[5] High-risk areas are areas GAO has called special attention to 
because of their vulnerability to mismanagement or their broad need 
for reform. GAO, Information Technology: Significant Problems of 
Critical Automation Program Contribute to Risks Facing 2010 Census, 
[hyperlink, http://www.gao.gov/products/GAO-08-550T] (Washington, 
D.C.: Mar. 5, 2008). 

[6] [hyperlink, http://www.gao.gov/products/GAO-04-343SP]. 

[7] In the 2000 Census, the Bureau mailed out both long- and short-form 
questionnaires. The short-form questionnaire had a higher response 
rate because it had fewer questions. For the 2010 Census, the Bureau 
used only a short-form questionnaire. For this report we use the 2000 
Census short-form mail response rate when comparing 2000 and 2010 mail-
back response rates. 

[8] GAO, 2010 Census: Census Bureau Should Take Action to Improve the 
Credibility and Accuracy of Its Cost Estimate for the Decennial 
Census, [hyperlink, http://www.gao.gov/products/GAO-08-554] 
(Washington, D.C.: June 16, 2008). 

[9] In order to determine whether the pace of the 2010 Census NRFU was 
associated with lower-quality work, we conducted regression analysis 
using Census data to assess whether indicators of lower-quality work 
were associated with NRFU completion time among the 494 LCOs after 
adjusting for other factors associated with the timeliness of 
completion and workload. 

[10] The National Crime Prevention and Privacy Compact, enacted in 
1998, generally requires that fingerprints be submitted with all 
requests for criminal history record checks for noncriminal justice 
purposes; 42 U.S.C. § 14616. For the 2000 Census, the Federal Bureau 
of Investigation (FBI) did not have the capacity to process the 
fingerprints of the Census's temporary workforce in a timely manner, 
so employees were subject only to a name-based background check. 

[11] A well-defined enterprise architecture provides a clear and 
comprehensive picture of an entity, whether it is an organization 
(e.g., a federal department) or a functional or mission area that cuts 
across more than one organization (e.g., personnel management). This 
picture consists of snapshots of both the enterprise's current or "As 
Is" environment and its target or "To Be" environment, as well as a 
capital-investment road map for transitioning from the current to the 
target environment. 

[12] GAO, Information Technology Management: Census Bureau Has 
Implemented Many Key Practices, but Additional Actions Are Needed, 
[hyperlink, http://www.gao.gov/products/GAO-05-661] (Washington, D.C.: 
June 16, 2005). 

[13] GAO, 2010 Census: Census at Critical Juncture for Implementing 
Risk Reduction Strategies, [hyperlink, 
http://www.gao.gov/products/GAO-08-659T] (Washington, D.C.: Apr. 9, 
2008); Information Technology: Census Bureau Needs to Improve Its Risk 
Management of Decennial Systems, [hyperlink, 
http://www.gao.gov/products/GAO-08-259T] (Washington, D.C.: Dec. 11, 
2007); and GAO-08-550T. 

[14] GAO, 2010 Census: Data Collection Is Under Way, but Reliability 
of Key Information Technology Systems Remains a Risk, [hyperlink, 
http://www.gao.gov/products/GAO-10-567T] (Washington, D.C.: Mar. 25, 
2010); 2010 Census: Key Enumeration Activities Are Moving Forward, but 
Information Technology Systems Remain a Concern, [hyperlink, 
http://www.gao.gov/products/GAO-10-430T] (Washington, D.C.: Feb. 23, 
2010); and 2010 Census: Census Bureau Continues to Make Progress in 
Mitigating Risks to a Successful Enumeration, but Still Faces Various 
Challenges, [hyperlink, http://www.gao.gov/products/GAO-10-132T] 
(Washington, D.C.: Oct. 7, 2009). 

[15] See for example: [hyperlink, 
http://www.gao.gov/products/GAO-05-661]; GAO, Census Bureau: Important 
Activities for Improving Management of Key 2010 Decennial Acquisitions 
Remain to be Done, [hyperlink, 
http://www.gao.gov/products/GAO-06-444T] (Washington, D.C.: Mar. 1, 
2006); Information Technology: Census Bureau Needs to Improve Its Risk 
Management of Decennial Systems, [hyperlink, 
http://www.gao.gov/products/GAO-08-79] (Washington, D.C.: Oct. 5, 
2007); and Information Technology: Census Bureau Testing of 2010 
Decennial Systems Can Be Strengthened, [hyperlink, 
http://www.gao.gov/products/GAO-09-262] (Washington, D.C.: Mar. 5, 
2009). 

[16] [hyperlink, http://www.gao.gov/products/GAO-08-554]. 

[17] GAO, 2010 Census: Plans for Census Coverage Measurement Are on 
Track, but Additional Steps Will Improve Its Usefulness, [hyperlink, 
http://www.gao.gov/products/GAO-10-324] (Washington, D.C.: Apr. 23, 
2010). 

[18] [hyperlink, http://www.gao.gov/products/GAO/GGD-97-142]. 

[19] See for example: [hyperlink, 
http://www.gao.gov/products/GAO-04-394G], [hyperlink, 
http://www.gao.gov/products/GAO-04-39], [hyperlink, 
http://www.gao.gov/products/GAO-04-343SP], [hyperlink, 
http://www.gao.gov/products/GAO-05-325SP], and [hyperlink, 
http://www.gao.gov/products/GAO-06-15]. 

[20] GAO, Transition Series: Commerce Issues, [hyperlink, 
http://www.gao.gov/products/OCG-89-11TR] (Washington, D.C.: Nov. 1, 
1988). 

[21] GAO, 2010 Census: Cost and Design Issues Need to Be Addressed 
Soon, [hyperlink, http://www.gao.gov/products/GAO-04-37] (Washington, 
D.C.: Jan. 15, 2004). 

[22] S. 3167, 111th Cong. § 2 (2010). 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: