© Copyright 1999 by the Wyoming Department of Employment, Research & Planning

Performance Accountability in the Workforce Investment Act: An Application with Division of Vocational Rehabilitation Data, Part One
by: Tony Glover, Senior Analyst


"[Performance accountability] requires . . . a belief in the utility of meshing different kinds of professional knowledge and expertise to accomplish common goals, and an organizational commitment to do so." - Ann Blalock


The first of a three-part series, this article applies the performance accountability methods specified in the Workforce Investment Act (WIA) of 19981 to Wyoming’s Division of Vocational Rehabilitation (DVR) program data. Performance accountability studies demonstrate the connection between program participation and success in the labor market. DVR’s particular mission is to advance employment and independent living opportunities for persons with disabilities in Wyoming. This article discusses the history behind the implementation of the WIA, explains the difference between evaluation research and performance management, describes the methods used to calculate results for three of the four core indicators of performance specified by the WIA and interprets the results. Part Two (to be published in Wyoming Labor Force Trends, December 1999) addresses additional information gained from applying evaluation research methods across several cohorts of DVR’s program. Part Three will look at work behaviors of DVR clients, the control group and the population analyzed in Part Two.

History of WIA

The Government Performance and Results Act (GPRA) of 1993 set in motion the process of holding federally funded programs accountable for performance. The purposes section of GPRA outlines several goals.2 Central to the development of WIA’s performance accountability system is "systematically holding Federal agencies accountable for achieving program results."3 Furthermore, GPRA specifies improvement of "Federal program effectiveness and public accountability by promoting a new focus on results, service quality, and customer satisfaction."4 Simply put, GPRA attempts to assess the value of the federally funded programs relative to actual program costs.

WIA addresses the goals of GPRA by specifying core indicators of performance. Core indicators are the measures by which workforce investment activity is assessed. They include entry, retention, earnings received in unsubsidized employment and credential attainment rates (e.g., a training license). The core indicators shift accountability focus from client inputs to outcomes, from process to results and from management control to continuous improvement. The core indicators represent the foundation of the performance accountability system and are calculated for clients receiving the following workforce investment activities: services requiring registration, intensive employment planning services and training.5 However, core indicators do not consider the intrinsic value to the client of these types of services, especially in the absence of other support networks.

Performance Management versus Performance Evaluation

The WIA sets standards for performance management by specifying the core indicators used to track and implement a continuous improvement strategy. While this article focuses on the indicators used for performance management, Part Two of the series takes this process a step further into the realm of evaluation research. The differences between the two become clearer when one considers that performance management primarily concerns program monitoring and operational efficiency. Evaluation research, on the other hand, assesses year-to-year program performance, utilizes control groups and takes into account other influences such as changes in the economy.6 For example, knowing that the percent of clients entering employment decreased from 38 percent in year one to 25 percent in year two does not, in itself, indicate whether the decrease was due to poor program management practices or to a competing explanation such as the economy. Knowing, however, that during the same period the entered employment rate of a representative sub-group of the general population decreased from 30 to 15 percent makes it apparent that the program’s results were driven not by program management but rather by economic changes.

Evaluation research offers performance management a reference point for interpreting results. As a tool, performance management provides important information for tracking progress against goals and focuses on outcome measures. Evaluation research, on the other hand, provides unbiased information on system efficiency and effectiveness in light of surrounding conditions. Using both procedures concurrently ensures a clear understanding of a program’s activity and performance.

Methods for Calculating Performance Indicators

It is important to understand the methods used to calculate the first three performance indicators as defined by WIA. The next article in this series introduces the concept of using a randomly selected, matched control group from the general population of individuals found in the Department of Employment’s (DOE) administrative databases. In Part One, I suggest limitations to the interpretation of the performance indicators and build a foundation for the application of evaluation research discussed in Part Two.

This article uses three complete years of DVR data, calendar years (CY) 1994, 1995 and 1996. After consultation with Steve Miedziak7 of DVR, participants who had a case closure status of closed "rehabilitated," closed "not rehabilitated after program initialized" or closed "not rehabilitated before program initialized" were included in the analysis. The closure statuses chosen identify participants who received some level of services and exclude participants not accepted into the program (statuses 2, 6 and 8). The resulting cohorts for this analysis were CY94 (n=1,308), CY95 (n=1,256) and CY96 (n=1,397).
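As an illustration, the cohort selection described above amounts to a simple filter on closure year and closure status. The sketch below is hypothetical; the field names and record layout are assumptions for illustration, not DVR's actual case management schema.

```python
# Hypothetical sketch of the cohort selection described above; field names are
# assumptions, not DVR's actual case management schema.

INCLUDED_STATUSES = {
    "closed rehabilitated",
    "closed not rehabilitated after program initialized",
    "closed not rehabilitated before program initialized",
}

def select_cohort(records, year):
    """Keep participants closed in `year` whose status shows some services were received."""
    return [r for r in records
            if r["closure_year"] == year and r["closure_status"] in INCLUDED_STATUSES]
```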

In March 1999, the Federal Register released a consultation paper on performance accountability measurement for WIA, outlining methods for calculating the core indicators.5 The definitions given below reflect necessary adaptations of the calculation of core indicators. Core indicator 1 uses a subset of clients who were not employed at application. Two options were available to determine employment status at application. The first used wage records8 to ascertain whether the participant had wages in the quarter prior to application. The second relied on DVR staff recording employment status during the application interview. Wage records data offer the advantage of representing the view of an objective third party (the employer who reports the data). However, because wage records are collected quarterly, they do not allow a direct determination of employment status on the date of application; participants who worked only part of a quarter are counted as employed even though they had no job at application. The determination of "not employed at application" in the DVR database is entered by caseworkers in field offices and considers only whether the individual had worked in the week prior to application. This, of course, leads to subjective inconsistencies in the collection of employment status at application. For example, "Are you employed?", "Have you worked in the last week?" and "Do you have a job?" elicit different answers depending on who asks the question and who answers.

Using wage records to determine prior employment has numerous advantages: consistency and reliability of data collection, objectivity of the reporting source and, foremost, the ability to apply the method to comparable groups who are not asked their employment status (i.e., control groups and the population). For this analysis, wage records data determine employment status prior to application (Part Three of this series will use the alternative method).
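As a rough illustration of this approach, the sketch below flags employment at application from quarterly wage records. The record layout is an assumption for illustration only; it treats wage records as a set of (identifier, year, quarter) entries indicating any reported earnings.

```python
# Illustrative sketch only; the (id, year, quarter) wage record layout is an
# assumption, not the structure of Wyoming's actual wage records database.

def prior_quarter(year, quarter):
    """Return the calendar quarter immediately before (year, quarter)."""
    return (year - 1, 4) if quarter == 1 else (year, quarter - 1)

def employed_at_application(person_id, app_year, app_quarter, wage_records):
    """True if any wages were reported in the quarter prior to application."""
    return (person_id, *prior_quarter(app_year, app_quarter)) in wage_records

# Example: a participant who applied in 1995 Q3 and had wages reported in 1995 Q2
wage_records = {("A001", 1995, 2)}
print(employed_at_application("A001", 1995, 3, wage_records))  # True
```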

Calculations for core indicators 2 and 3 include participants who entered employment, as discussed in the previous paragraph, and incumbent workers. Incumbent workers are those who had employment in both the quarter prior to application and the quarter following closure.

Until now, the discussion has concerned the determination of employment status prior to application. Wage records also determine employment status following closure; the Federal Register document outlining the performance measures of the WIA dictates that wage records be used for follow-up employment data.

The reference period (the period in which the clients received services) for DVR clients varies. For example, some clients' duration of services spanned six quarters, but most clients completed the program within four quarters. In the case of a client whose participation spanned six quarters, the four quarters prior to application and the four quarters subsequent to closure were still used to calculate all core indicators.
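One way to picture this windowing, under the assumption that application and closure dates have been reduced to calendar quarters, is sketched below; the helper names are illustrative only.

```python
# Illustrative sketch of anchoring the comparison windows on the application and
# closure quarters, whatever the length of the reference period in between.

def shift_quarter(year, quarter, offset):
    """Move (year, quarter) forward or backward by `offset` quarters."""
    index = year * 4 + (quarter - 1) + offset
    return index // 4, index % 4 + 1

def comparison_quarters(app_year, app_qtr, close_year, close_qtr):
    """Four quarters before application and four quarters after closure."""
    pre = [shift_quarter(app_year, app_qtr, -i) for i in range(1, 5)]
    post = [shift_quarter(close_year, close_qtr, i) for i in range(1, 5)]
    return pre, post

# Example: application in 1994 Q2, closure in 1995 Q3 (a six-quarter reference period)
print(comparison_quarters(1994, 2, 1995, 3))
```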

Core Indicator 1: Entry into Unsubsidized Employment (Entered Employment Rate)

Workforce investment activity aims at placing clients in employment not dependent on public assistance. Unsubsidized employment is employment that no longer receives support from the program, in this case DVR. For the purposes of this article, an individual was determined to have entered unsubsidized employment if s/he had no earnings in wage records in the quarter prior to the reference period and had earnings in the first quarter following the reference period.

The entered employment rate is calculated as a ratio of those not employed in the quarter prior to application and employed in the quarter following program closure, divided by all the individuals not employed in the quarter prior to application.

Formula 1: Entered Employment Rate

Entered Employment Rate = (Individuals not employed 1 Qtr prior and employed 1 Qtr following) / (All individuals not employed 1 Qtr prior)
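Expressed as code, Formula 1 reduces to a simple ratio. The sketch below assumes each participant record has already been reduced to two wage-record-derived flags (employed in the quarter prior to application, employed in the first quarter after closure); the names are illustrative.

```python
# Minimal sketch of Formula 1; the flag layout is an assumption for illustration.

def entered_employment_rate(participants):
    """participants: iterable of (employed_prior, employed_after) boolean pairs."""
    not_employed_prior = [p for p in participants if not p[0]]
    if not not_employed_prior:
        return 0.0
    entered = [p for p in not_employed_prior if p[1]]
    return len(entered) / len(not_employed_prior)

# Example: 3 of 4 participants had no job at application; 2 of those 3 found work
cohort = [(False, True), (False, True), (False, False), (True, True)]
print(entered_employment_rate(cohort))  # 0.67 (rounded)
```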

Core Indicator 2: Retention in Unsubsidized Employment Six Months After (Six Month Retention Rate)

A second goal of workforce investment activity is not only to place clients in employment but also to ensure that they retain employment. Those who entered employment (the previous indicator) and incumbent workers are considered together for determining retention in employment.

For this indicator and core indicator 3, which follows, the calculations become more involved because different quarters are used for the entered employment group and the incumbent workers group.

Formula 2: Six Month Retention Rate

Six Month Retention Rate = [(Entered Employment: those not employed 1 Qtr prior, employed 1 Qtr after and still employed in the 3rd Qtr after) + (Incumbent Workers: those employed 1 Qtr prior, employed 1 Qtr after and still employed in the 4th Qtr after)] / (All employed 1 Qtr after)
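The same ratio logic carries Formula 2, with the numerator split between the two groups. The sketch below assumes each participant carries wage-record-derived flags for the quarter prior to application and the 1st, 3rd and 4th quarters after closure; the field names are illustrative.

```python
# Minimal sketch of Formula 2; field names are assumptions for illustration.

def six_month_retention_rate(participants):
    """participants: list of dicts with boolean keys 'prior', 'q1', 'q3', 'q4'."""
    employed_q1_after = [p for p in participants if p["q1"]]
    if not employed_q1_after:
        return 0.0
    entered_retained = sum(1 for p in employed_q1_after if not p["prior"] and p["q3"])
    incumbent_retained = sum(1 for p in employed_q1_after if p["prior"] and p["q4"])
    return (entered_retained + incumbent_retained) / len(employed_q1_after)

# Example: the entered-employment participant is retained to the 3rd quarter after;
# the incumbent worker is no longer employed in the 4th quarter after
cohort = [
    {"prior": False, "q1": True, "q3": True, "q4": True},
    {"prior": True, "q1": True, "q3": True, "q4": False},
]
print(six_month_retention_rate(cohort))  # 0.5
```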

Core Indicator 3: Earnings Received in Unsubsidized Employment Six Months After (Average Earnings Change in Six Months)

Core indicator 3 calculates the average change in participants’ six month earnings. This is done by contrasting half of the wages earned by participants in the year prior to the reference period with the earnings in two quarters following the reference period. As mentioned earlier, the two subsequent quarters used in the calculations differ for participants who entered employment and for incumbent workers.

Formula 3a: Six Month Earnings Gain (Participants Who Entered Employment)

Six Month Earnings Gain = (post-program Wages Q1 + Wages Q2) - (pre-program wages in year prior / 2)

Formula 3b: Six Month Earnings Gain (Incumbent Workers)

Six Month Earnings Gain = (post-program Wages Q2 + Wages Q3) - (pre-program wages in year prior / 2)

Formula 3c: Six Month Average Earnings Gain of Participants

Six Month Average Earnings Gain = (Sum of earnings gains for participants who entered employment and incumbent workers) / (All employed 1 Qtr after)
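Formulas 3a through 3c can be read as the average of individual gains over everyone employed in the first quarter after closure. The sketch below assumes post-program wages are available by quarter and pre-program wages as a single total for the year prior to application; the record layout is illustrative only.

```python
# Minimal sketch of Formulas 3a-3c; the record layout is an assumption.

def six_month_earnings_gain(pre_year_wages, post_wages, incumbent):
    """Formula 3a (entered employment) uses post Q1+Q2; Formula 3b (incumbent) uses Q2+Q3."""
    post = post_wages[1] + post_wages[2] if incumbent else post_wages[0] + post_wages[1]
    return post - pre_year_wages / 2

def average_earnings_gain(participants):
    """Formula 3c: mean gain over all participants employed in the 1st quarter after closure."""
    employed_after = [p for p in participants if p["post"][0] > 0]
    gains = [six_month_earnings_gain(p["pre"], p["post"], p["incumbent"])
             for p in employed_after]
    return sum(gains) / len(gains) if gains else 0.0

# Example: one participant who entered employment (gain 5,100) and one incumbent
# worker (gain 4,700 - 4,000 = 700); the average gain is 2,900
cohort = [
    {"pre": 0.0, "post": [2500, 2600, 2700, 2800], "incumbent": False},
    {"pre": 8000.0, "post": [2200, 2300, 2400, 2500], "incumbent": True},
]
print(average_earnings_gain(cohort))  # 2900.0
```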

Results

The Table and Figures 1, 2 and 3 show the results of the core indicator calculations for three cohorts of DVR participants. Note:

Nationally, state Vocational Rehabilitation Agencies are not required to utilize WIA's state adjusted levels of performance. The Federal Rehabilitation Services Administration has publicized (draft) evaluation standards and performance indicators which will apply to state Vocational Rehabilitation Agencies.

The entered employment rate shows a steady increase. This meets the continuous improvement requirements specified in the WIA. A variation on the calculation of this indicator uses the DVR case management database to determine employment status at the time of application. Doing so increases the percent entering employment by four percent across all cohorts. However, due to the method of obtaining this information and the lack of consistency across different programs (e.g., Job Training Partnership Act - JTPA, Employment Services), I recommend using wage records data to determine prior employment status.

The six month retention rate for employment shows an increase from cohort 1994 to cohort 1995, then levels off. This indicator measures the retention of employment by participants, once employed. It includes both participants who enter employment and those who maintain employment during the program. Strictly defined and interpreted, the retention rate results do not meet continuous improvement requirements. However, the DVR program did not display a decrease in performance for this indicator, and an additional year’s data would be needed to determine whether the stagnation in performance continued.

The six month earnings gain shows a decrease in performance from cohort 1994 to cohort 1995 and a slight increase from cohort 1995 to cohort 1996. However, even with the increase in the last cohort, the earnings gain fell 15 percent below the first cohort’s performance level. This is the worst-case scenario from the performance management perspective mentioned in the introduction of this article. WIA outlines the continuous improvement requirements, and a Department of Labor working paper states that

In an effort to drive positive results and continuous improvement, the Act contains strong ties between performance and funding. If a State fails to meet its expected level of performance in any year, it can request technical assistance from the Department of Labor. If a State continues to fail to meet its agreed-upon performance levels for a second year--its funding can be decreased by up to 5 percent.9

Based on the performance management analysis results, DVR is not meeting the continuous improvement requirements of the WIA for core indicator 3. Equally important, though, is the limitation of this analysis: it fails to offer an explanation for the decreasing performance. Programmatic changes might have led to the decrease in performance. However, competing explanations, such as economic changes, need further investigation (Part Two of this series).

Conclusions

The performance management processes addressed in this article should be uniformly applied within any Federal program under the WIA. The DVR case management system contains data on the severity of the participant's disability, age, sex and race. These participant characteristics would easily allow breaking the program into different levels of analysis. The DVR databases also contain data on the services provided, and the cost and duration of those services. Contrasting participants’ characteristics with the services received and the varied outcomes creates the opportunity to manage programs more effectively. For example, participants with severe disabilities may benefit more from On-the-Job training than from Classroom training. An inherent danger of this type of analysis, selection bias (creaming), occurs when participants are selected on the qualities that ensure a successful outcome. Awareness of this danger protects the program's integrity in light of the program's mission.

The degree of comparability across employment service activities (i.e., JTPA) relies on clear, concise definitions of outcome measures and consistency of data collection methods. For example, the defining factor of a participant's employment status at application, in this article, was having no wages in the quarter prior to services. This definition may adequately determine prior employment status across all employment service activities.

As mentioned in the introduction, this article is an applied "performance management" analysis of the WIA’s "Performance Accountability System." Part Two of this series takes the process to the next logical step. It is not enough to know how the DVR program performed relative to itself. Left with only the conclusions generated in this article, DVR would face the situation of explaining the decrease in performance for earnings gained in employment (Core Indicator 3). However, the DVR program, as well as other workforce investment activity, does not operate in a vacuum. Part Two applies "performance evaluation" methods in an attempt to discover factors external to the DVR program, and outside its control, that explain the decrease in earnings gained in employment. Part Two also addresses how the DVR program increased the entered employment rate for three consecutive years during a period when Wyoming’s economy was slowing and overall employment opportunities were decreasing.

1 The Workforce Investment Act of 1998, Pub. L. No. 105-220 (1998). Sec. 136

2 Government Performance and Results Act, Pub. L. No. 103-62 (1993).

3 Government Performance and Results Act, Pub. L. No. 103-62 (1993). Sec. 2(b)(1)

4 Government Performance and Results Act, Pub. L. No. 103-62 (1993). Sec. 2(b)(3)

5 "Consultation Paper on Performance Accountability Measurement for the Workforce Investment System Under Title 1 of the Workforce Investment Act; Notice" Federal Register March 24, 1999.

6 Ann Blalock. "Evaluation Research and the Performance Management Movement: From Estrangement to Useful Integration?" Evaluation, The International Journal of Theory, Research and Practice 5, no. 2 (1999).

7 Steve Miedziak. Interviewed by Tony Glover in Casper, Wyoming. February 3, 1999 and June 3, 1999.

8 Wage records form the administrative database used to calculate Unemployment Insurance (UI) benefits. By law, each employer who has covered employees must submit reports to the state showing each employee’s wages by quarter.

9 U.S. Department of Labor, Workforce Investment Act Implementation Taskforce Office. "Implementing the Workforce Investment Act of 1998."

