United States Department of Labor

DOL Annual Report, Fiscal Year 2009
Performance and Accountability Report

Reporting Performance Results

The Performance Section of this report presents results at the Strategic Goal and Performance Goal levels. The four Strategic Goals established in our FY 2006-2011 Strategic Plan are general outcomes clearly linked to the Department's mission. Performance goals articulate more specific objectives associated with one or more programs administered by a distinct DOL agency. Progress in achieving these goals is measured by one or more quantifiable performance indicators, for which targets are established in the Performance Budget Overview which accompanies the Department's annual Congressional Budget Justification.

All performance targets in this report were finalized in DOL's FY 2010 budget. By the time DOL's FY 2010 budget was sent to Congress, the recession had already taken its toll on employment. Accordingly, PY 2008 and FY 2009 targets for most employment and training programs were adjusted (downward in each instance) from preliminary targets in DOL's FY 2009 budget using a statistical model that accounts for external factors, such as the increasing unemployment rate, lack of new jobs, and changes in individual demographics, all of which have larger implications for program outcomes during the recession.11

Each strategic goal section is introduced by performance highlights and a summary table of net costs. Complete results at the performance goal level are presented in separate narratives, each of which includes the following:

  • Performance Goal statements appear at the top of the page, followed by unique identifiers that help organize reporting on results and net costs. The first two digits correspond to the funding (budget) period; e.g., "09" indicates goals reporting on a fiscal year and "08" those reporting on a program year. The single digit following the hyphen identifies the strategic goal and the letter distinguishes the performance goal from others in the same group (e.g., 09-1A). The agency acronym (e.g., BLS) is in parentheses.12
  • Indicators, Targets and Results tables list each indicator, its targets and results for the reporting period and previous years that have data for the same indicators. Indicators that do not apply to the current year are not shown; however, a note indicates where additional historical performance information (legacy data) can be obtained. Where all data for any year are shown, goal achievement is indicated. Where "baseline" appears in the target cell for new indicators, no data were available for establishing a numerical target, and these data do not count towards goal achievement. If results improve over the prior year but do not reach the target, "I" appears in the target cell. Net cost associated with the goal and indicators is also provided.13
  • Program Perspectives and Logic narratives describe the purpose of the program, how its activities are designed and managed to have a positive impact on the goal, how it measures success, and external factors that influence performance. Photos and vignettes communicate programs' impact at the personal level.
  • Analysis and Future Plans tables identify what worked and what didn't work to improve results and describe strategies for improvement. Performance data at the indicator level and net cost at the goal level are displayed in charts where sufficient data are available to illustrate trends.
  • Program Assessments, Evaluations and Audits sections summarize Program Assessments and provide updated information on improvement plans. For relevant audits and evaluations completed during the fiscal year, tables summarize relevance, findings and recommendations, and actions.
  • Data Quality and Top Management Challenges narratives discuss DOL's confidence in the performance information reported for the goal's measures and address management challenges that may have significant implications for achievement of program performance goals.
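The goal identifier convention described in the first bullet above lends itself to a small parser. The sketch below is illustrative only (the function name and return structure are assumptions, not anything DOL publishes); it encodes the convention as stated: two digits for the funding period, a strategic goal digit, a performance goal letter, and an agency acronym in parentheses.

```python
import re

def parse_goal_id(goal_id: str) -> dict:
    """Parse a DOL performance goal identifier such as '09-1A (BLS)'.

    Per the report's convention: the first two digits give the funding
    period ('09' = fiscal year, '08' = program year), the digit after
    the hyphen is the strategic goal, the letter distinguishes the
    performance goal, and the parenthesized acronym names the agency.
    """
    m = re.fullmatch(r"(\d{2})-(\d)([A-Z])\s*\((\w+)\)", goal_id.strip())
    if m is None:
        raise ValueError(f"unrecognized goal identifier: {goal_id!r}")
    period, strategic, letter, agency = m.groups()
    return {
        "period_type": "fiscal year" if period == "09" else "program year",
        "strategic_goal": int(strategic),
        "performance_goal": letter,
        "agency": agency,
    }

print(parse_goal_id("09-1A (BLS)"))
```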

Data Quality

This report is published six weeks after the end of the fiscal year. Since the Department uses a wide variety of performance data submitted by diverse systems and governed by agreements with State agencies and grant recipients, it is not possible in all cases to report complete data for the reporting period. The Department requires each agency responsible for performance goals in this report to submit a Data Estimation Plan in February that identifies, for each indicator, whether complete data are expected by the deadline for final review of the report in early October. If the data will not be available by then, the agencies must submit an acceptable plan to estimate results for the remainder of the year. Methodologies developed by agencies' program analysts are reviewed by the Department's Center for Program Planning and Results and the independent Office of Inspector General (OIG). The most common methods are substitution or extrapolation of two or three quarters of data and — for data with significant seasonal variation — use of the missing period's results from the previous year. Estimates are clearly identified wherever they are used in this report. With very few exceptions, final (actual) data are available by the end of the calendar year; these data will be reported in the FY 2011 President's Budget and the FY 2010 Performance and Accountability Report.
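The two estimation methods described above can be sketched in a few lines. This is a hypothetical illustration of the general techniques (straight-line extrapolation from partial-year data, and prior-year substitution for seasonal data), not DOL's actual methodology, and the function names are assumptions.

```python
def extrapolate_full_year(quarters: list[float]) -> float:
    """Estimate a full-year total by extrapolating the average of the
    reported quarters over all four quarters (straight-line method)."""
    if not 1 <= len(quarters) <= 4:
        raise ValueError("expected one to four quarters of data")
    return sum(quarters) / len(quarters) * 4

def substitute_prior_year(quarters: list[float],
                          prior_year: list[float]) -> float:
    """For data with significant seasonal variation, fill each missing
    quarter with the same quarter's result from the previous year."""
    filled = quarters + prior_year[len(quarters):]
    return sum(filled)

# Three quarters reported, fourth quarter missing:
print(extrapolate_full_year([100.0, 110.0, 120.0]))          # 440.0
print(substitute_prior_year([100.0, 110.0, 120.0],
                            [90.0, 105.0, 115.0, 125.0]))    # 455.0
```

The seasonal variant matters because extrapolating from the average of reported quarters would overstate or understate a quarter that is systematically high or low each year.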

As required by OMB Circular No. A-11, Preparation, Submission, and Execution of the Budget, the Secretary's Message includes a statement on program performance data quality. Significant limitations, which are noted if applicable to any given report, are defined in this context as data that are insufficient to permit determination of goal achievement. This is an uncommon occurrence, as most DOL performance goals have sufficient indicators and historical data to allow reasonable estimation of results.

The Department's Data Quality Assessments systematically evaluate data systems using widely accepted criteria to improve the quality of performance information reported to the public. Designed to encompass more than the mechanics of data collection, the assessments also question the value of information collected and the extent to which it provides evidence of goal achievement. Increasing the transparency of data quality provides benchmarks for monitoring progress and stimulating change. Agency heads are held accountable by a requirement that they sign attestations to the data quality assessment for each of their agency's performance goals in this report. One of the most important outcomes of this process, aside from increasing the transparency of performance information reported in the PAR, is encouraging the development of plans to either maintain or improve data quality.

In 2006, DOL conducted baseline assessments of data for all performance goals. For each of the following years, agencies have updated these assessments based on changes to their data quality systems or procedures, new information from independent studies released during the fiscal year, or changes to their performance indicators. Agencies seeking an upgrade must provide evidence demonstrating how a data quality criterion was satisfied. By contrast, agencies must also defend their rating if evidence has emerged suggesting a criterion is not being met.

The rating system includes seven criteria, of which two — accuracy and relevance — are weighted twice as much as the others (see the Data Quality Rating System below). If data do not satisfy the standards for both of these criteria, the rating is Data Quality Not Determined. This reflects the DOL policy that further assessments of quality are irrelevant if the information is not reasonably correct or worthwhile.

Data Quality Rating System

Both bulleted descriptions under a criterion must be satisfied to receive points. No partial credit is awarded. The rating scale reflects 20 points for Section One "threshold" criteria plus additional points earned in Section Two. Data that do not satisfy both criteria presented in Section One are given the rating Data Quality Not Determined — regardless of the points achieved in Section Two. This rating indicates the agency is unable to assess data quality because it does not meet a minimum threshold.

Section One: 20 points

Accurate — Data are correct. (10 points)

  • Deviations can be anticipated or explained.
  • Errors are within an acceptable margin.

Relevant — Data are worth collecting and reporting. (10 points)

  • Data can be linked to program purpose to an extent that they are representative of overall performance.
  • The data represent a significant budget activity or policy objective.

Section Two: 25 points

Complete — Data should cover the performance period and all operating units or areas. (5 points)

  • If collection lags prevent reporting full-year data, a reasonably accurate estimation method is in place for planning and reporting purposes.
  • Data do not contain any significant gaps resulting from missing data.

Reliable — Data are dependable. (5 points)

  • Trends are meaningful; i.e., data are comparable from year to year.
  • Sources employ consistent methods of data collection and reporting and uniform definitions across reporting units and over time.

Timely — Data are available at regular intervals during the performance period. (5 points)

  • The expectation is that data are reported quarterly.
  • Data are current enough to be useful in decision-making and program management.

Valid — Data measure the program's effectiveness. (5 points)

  • The data indicate whether the agency is producing the desired result.
  • The data allow the agency and the public to draw conclusions about program performance.

Verifiable — Data quality is routinely monitored. (5 points)

  • Quality controls are used to determine whether the data are measured and reported correctly.
  • Quality controls are integrated into data collection systems.
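The rubric above reduces to a simple scoring function. The sketch below is an illustration, not a DOL tool: the criterion keys and function name are assumptions, and since the report does not publish the cutoffs that map point totals to ratings such as Good or Excellent, the sketch returns only the point total and the threshold (DQND) determination, both of which the rubric does specify.

```python
# Weights from the rubric: Section One threshold criteria are worth
# 10 points each; Section Two criteria are worth 5 points each
# (45 points maximum). No partial credit is awarded.
SECTION_ONE = {"accurate": 10, "relevant": 10}
SECTION_TWO = {"complete": 5, "reliable": 5, "timely": 5,
               "valid": 5, "verifiable": 5}

def score_data_quality(met: set[str]) -> tuple[int, bool]:
    """Return (points, dqnd) for the set of fully satisfied criteria.

    dqnd is True when either threshold criterion (accurate, relevant)
    is unmet, in which case the rating is Data Quality Not Determined
    regardless of the Section Two score.
    """
    all_criteria = {**SECTION_ONE, **SECTION_TWO}
    points = sum(p for c, p in all_criteria.items() if c in met)
    dqnd = not all(c in met for c in SECTION_ONE)
    return points, dqnd

print(score_data_quality({"accurate", "relevant", "complete", "timely"}))
# (30, False)
```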





FY 2009 DOL Performance Goal - Data Quality Scores

[Chart: distribution of performance goal data quality ratings, ranging from Data Quality Not Determined through Very Good]

After four years, DOL data quality continues to improve, but significant challenges remain. Data for 78 percent of performance goals are rated Very Good or Excellent. Thirteen percent of the goals fell into the middle category (Good). No performance goals were rated Unsatisfactory, nor were any rated Data Quality Not Determined (DQND) due to fundamental problems with accuracy and relevance. Three performance goals improved their rating; in each case, the upgrade reflects focused attention on specific issues within the goal's data systems. Two goals moved from Good to Very Good. Building upon momentum from FY 2008, ETA's Senior Community Service Employment Program (SCSEP) successfully addressed issues related to the reliability of data from year to year by implementing enhanced data checks, including careful monitoring of deviations in data over time. ESA's Office of Labor-Management Standards (OLMS) took advantage of a change in performance indicators to address timeliness and reliability issues; data for the new performance measures are generated by the agency database, which provides routine reports for agency management. Finally, Job Corps demonstrated that all criteria were met, earning an upgrade to Excellent. Job Corps targeted issues related to the verifiability of its data by implementing quality control procedures throughout the data collection process.

Data Quality Criteria Met

[Chart: percent of performance goals meeting each data quality criterion]
At the Departmental level, certain criteria are met more frequently than others. All DOL performance goals now satisfy the threshold criteria of accurate and relevant. Over three-quarters of performance goals are supported by data that are valid, timely, reliable, and complete. As the Data Quality Criteria Met chart indicates, the clear challenge for many performance goals is the ability to verify the data: fewer than half of all performance goals have data quality controls in place that routinely monitor data and are fully integrated into the data collection system. Verifiability is a predominant issue largely because of ETA's numerous grant programs and the difficulty of monitoring and enforcing standards among grantees' diverse data systems. The percent of performance goals with reliable data increased from last year due to the upgrade for SCSEP. Though still met by 17 of 23 goals, valid replaced reliable in FY 2009 as the second greatest opportunity for improvement. Goals not meeting this criterion are supported by one or more performance indicators that are not considered the most representative measure of whether the agency is achieving its desired results. As DOL embarks on a comprehensive revision of its strategic plan in FY 2010, agencies will consider data quality issues as they reexamine goals and indicators.

In FY 2009, in addition to the agencies' self-assessments, the Department underwent an independent evaluation of its data quality assessment process, including a review of data quality for two selected performance goals. The evaluation found that the data assessment process has established a solid foundation for assessing and improving DOL performance data quality. Because criteria and definitions for performance data quality vary across the Federal government, it was difficult to determine the overall accuracy of CPPR ratings. However, the study did analyze the individual criteria via comparisons to those of other data quality systems. It also examined data systems for the Workforce Investment Act Adult and Job Corps programs, which validated various findings of the broader assessment process. For example, in each case, the study found that systematically mapping data sources promotes comparisons to other programs and clarifies the relevance of data quality findings in external reports. Recommendations aimed to strengthen the assessment process so that the PAR will continue to promote strategic, transparent improvements to data quality. DOL will continue to examine data quality issues through a second-year continuation of the study. DOL will also use the strategic planning process, which is underway in FY 2010, as an opportunity to implement recommendations along with development of new performance goals and indicators.


11. This model was developed for ETA by an independent contractor using administrative data. Job Corps and VETS considered using the model. Job Corps did not adopt it and is studying other options to adjust its targets in FY 2010. VETS did not adopt the model for PY 2008 either, but made adjustments to its PY 2009 targets (reflected in the DOL FY 2010 budget) using additional factors that account for differences in historical outcomes for its veteran populations.
12. FY 2009 covers October 1, 2008 to September 30, 2009; PY 2008 covers July 1, 2008 to June 30, 2009.
13. Net costs for all strategic and performance goals for the last three years are provided in the DOL Program Performance and Net Costs section of the Program Performance Overview (Management's Discussion and Analysis).
