© Copyright 2003 by the Wyoming Department of Employment, Research & Planning

 

Workforce Investment Act Customer Satisfaction:  Lessons from the Field

by: Douglas W. Leonard, Research Analyst

"A 'one size fits all' approach has very little utility, especially when considering the barriers faced by states with small populations.  States must be given enough leeway so that reaching specified goals is not precluded by the procedures used to achieve them."

Reaching federal program performance targets can be difficult. With every new program, rules and procedures are developed to give states guidance. However, the rules can sometimes hinder the performance measurement process. State characteristics may complicate both the performance of the program and the survey processes aimed at measuring that performance. One challenge is to develop a survey system that helps the state reach its federally defined performance measurement goals. More importantly, implementation of the survey process and subsequent analysis can illuminate weaknesses in the federal performance methodology. Legislation should be driven by information obtained through empirical observation. 

Why measure performance? We measure performance because it allows us to make judgments about program effectiveness and return on investment. Programs judged to be ineffective through objective scientific methodologies could be modified or eliminated, while those judged successful might be expanded and improved. Before the development of programs like the Workforce Investment Act of 1998 (WIA), accountability focused on the process (e.g., how the service was delivered, how many were served) instead of results (e.g., do program benefits exceed costs?).1 In theory, performance measurement should lead to more effective program fiscal decisions, thereby reducing waste and providing better service to all stakeholders.

This article examines the implementation of customer satisfaction data collection and reporting procedures for WIA job seekers and employers. In particular, we address challenges associated with meeting federal targets for the collection of customer satisfaction surveys. Initial results show a marked improvement in WIA survey response rates.

Background Information

WIA requires states to collect performance data for three customer satisfaction measures.2 It specifies that customer satisfaction data are to be collected from both persons participating in the programs and their employers “…through surveys conducted after the conclusion of participation in the workforce investment activities.”3 To aid in this effort, the Employment Resources Division of the Wyoming Department of Workforce Services contracted with Research & Planning (R&P) to collect survey data. The separation of the statistical and program functions guarantees respondent confidentiality.

Figure 1 describes the workflow of the entire WIA training process. Column A illustrates employer and job seeker interaction with the One-Stop system through the local employment offices. In this step, job seekers apply for training under WIA, while employers and training providers either list employment opportunities or coordinate training. In Column B (Training/Service Provided), job seekers are matched to employers or training providers, and the estimated dates when job seekers will exit the program are negotiated and set (agreement end dates). Once the job seeker exits the program (see Column C), either on or before the originally negotiated date, records for that job seeker and the associated employer are transferred to the data management system. These data are then used in the customer satisfaction data collection efforts through the interface box shown in Column D. Column D is shown in additional detail in Figure 2.

The measure used to evaluate customer satisfaction for both job seekers and employers is the American Customer Satisfaction Index (ACSI) from the University of Michigan.4 The range of values with this instrument is 0 to 100, where 100 equals total satisfaction. The minimum acceptable score is 80 percent of the target score. The target scores for Wyoming in 2000 were 68.0 for job seekers and 66.0 for employers.
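To make the scoring arithmetic concrete, the following minimal sketch (in Python; the function name is ours, and the targets are the Wyoming 2000 values cited above) computes the 80 percent floor that separates acceptable from unacceptable scores.

```python
# Minimal sketch: the minimum acceptable ACSI score is 80 percent of the
# negotiated target. Targets below are Wyoming's 2000 values from this article.
TARGETS = {"job_seeker": 68.0, "employer": 66.0}

def minimum_acceptable(target: float) -> float:
    """Return the ACSI floor: 80 percent of the negotiated target score."""
    return 0.8 * target

for group, target in TARGETS.items():
    print(f"{group}: target={target}, floor={minimum_acceptable(target):.1f}")
# job_seeker: target=68.0, floor=54.4
# employer: target=66.0, floor=52.8
```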

The U.S. Department of Labor applies monetary performance incentives (rewards) and sanctions (penalties) to state agencies responsible for the program in relation to how well the negotiated WIA performance targets are met. States are ineligible for performance incentives unless they achieve at least 80 percent of negotiated target levels on all measures (including customer satisfaction). Conversely, sanctions are applied if a state fails to achieve 80 percent of negotiated performance for two consecutive years.5

Response rates play a critical role in the accurate determination of customer satisfaction levels. The U.S. Department of Labor requests that states achieve rates of no less than 50 percent for employer and job seeker surveys (for WIA satisfaction), regardless of the size of the population or sample surveyed.6 The 50 percent threshold matters because response rates below 50 percent call the validity of the survey results into question. While sanctions are not necessarily tied to response rates, higher response rates are generally viewed as providing more reliable customer satisfaction scores. U.S. Department of Labor guidelines specify a telephone survey with a minimum of five attempted telephone contacts per respondent.

Survey Results

What follows is a discussion of the results and findings for employers and job seekers in Wyoming during the 2000 and 2001 WIA program years. These data do not include comparisons between state and national results; instead, they describe how well Wyoming’s results compare to the performance targets negotiated with the U.S. Department of Labor. Data collection and reporting efforts for Program Year (PY) 2000 (July 1, 2000 - June 30, 2001) both commenced and concluded in November 2001. Delays in receiving the target respondent list prevented R&P from collecting data earlier.

In PY2000, employer and job seeker records were received by R&P less than 30 days prior to the reporting deadline, leaving R&P little time to collect data. Additionally, records were received several months after job seeker exit and employer agreement end dates (see Figure 1, Column C). These delays ran counter to a fundamental directive of U.S. Department of Labor, Employment and Training Administration Training and Employment Guidance Letter (TEGL) 7-99, which states that “…participants should be contacted within 60 days of exit date” [italics added for emphasis] and “employers should be contacted 60 days following completion of service or 30-60 days after a job order has been listed where no referrals have been made.”7 The employer-specific dates defined here are also known as agreement end dates, the estimated dates when the job seeker will complete the program. Although giving employers and job seekers the option of responding by mail (something not encouraged under the original TEGLs) raised response rates somewhat, they remained insufficient to yield “valid” results (i.e., the 50 percent minimum response rate criterion was not met).

Response rates differed markedly between the job seeker and employer groups in PY2000 (see sidebar). Of the 489 surveys mailed to job seekers in the initial mailing, only 55 had been returned as of November 26, 2001, for a response rate of 11.3 percent. The employer surveys had a much better response rate (125 out of 272, or 46.0%). Even though the job seeker survey did not meet the 50 percent response rate criterion for PY2000, the ACSI composite scores for both the job seeker (78.2) and employer (78.8) groups exceeded their respective target scores (68.0 for job seekers and 66.0 for employers).
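The arithmetic behind these rates is simple, and a small sketch helps show how close each group came to the 50 percent criterion. The function below is ours; the counts are the PY2000 employer figures reported above (actual reporting denominators may reflect program-rule exclusions).

```python
# Sketch: response rate as a percentage of surveys fielded, checked against
# the federal 50 percent criterion. Counts are the PY2000 employer figures.
def response_rate(completed: int, surveyed: int) -> float:
    return 100.0 * completed / surveyed

employer_rate = response_rate(125, 272)
print(f"Employer response rate: {employer_rate:.1f}%")   # 46.0%
print("Meets 50% criterion:", employer_rate >= 50.0)     # False
```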

Response Issues and Macro-Level Data Flow

Although ACSI customer satisfaction scores were acceptable for PY2000, the challenge of extremely low response rates remained. Following an analysis of PY2000 results, we determined that three factors hurt response rates: 
1) Record availability 
2) Delays in initial mailings of survey notifications due to poor data quality
3) Insufficient time for follow-up8

These were particularly problematic for the job seeker group because of the additional difficulty in contacting individual job seekers.

To address the issue of low response rates, R&P developed a survey data flow system to help contact program job seekers and employers in a more timely manner. The Wyoming WIA survey data flow system provides consistent data collection that goes well beyond that required by the U.S. Department of Labor guidelines.

Data Quality and Timeliness

A data quality issue arose with agreement end dates (see Figure 1, Column C). Many WIA job seekers exited the program (i.e., completed or left training prematurely) prior to the estimated date, leaving incorrect values in the employer table. This resulted in the assignment of employer records to incorrect time periods, which corrupted the output statistics and required manual data correction. A method is being developed to update the agreement end date field before record transfer to R&P.9

Another incoming data quality challenge involved employer primary contact names (see Figure 1, Column A). For example, several records had no employer contact name, incomplete names (first name only), or unusable data such as “Ask for Uncle Bob.” While it is possible to mail correspondence and make telephone survey attempts with these data, incomplete data present a highly unprofessional image to employer survey respondents and increase the probability of nonresponse. Some employer contacts did not wish to give their first and last names because this information could appear on job orders, potentially resulting in employer contacts receiving job request telephone calls at their private residences.10 To remedy this situation, employment center staff were instructed to enter a title (e.g., Mr., Mrs., Ms.) and last name only for each employer contact.11
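As an illustration only, the sketch below shows the kind of entry rule this implies: accept a title plus last name and reject incomplete or free-text entries. The pattern and function are hypothetical, not the registration system’s actual logic.

```python
import re

# Hypothetical validation of the "title plus last name" entry rule;
# this is a sketch, not the actual registration system's code.
VALID_CONTACT = re.compile(r"^(Mr|Mrs|Ms)\.\s+[A-Z][A-Za-z'\-]+$")

def contact_is_usable(name: str) -> bool:
    return bool(VALID_CONTACT.match(name.strip()))

assert contact_is_usable("Ms. Jones")
assert not contact_is_usable("Bob")                # first name only
assert not contact_is_usable("Ask for Uncle Bob")  # unusable free text
```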

Data quality was also compromised when records were received too late (see Figure 1, Column C). For example, in Table 1, a column called “Extract Date Range” identifies the dates that records were entered for job seekers in a particular program year and quarter. To ensure participants are assigned to the proper year and quarter, the latest extract date cannot be later than one week following the end of the respective quarter in the program year. The correct extract date ranges for each quarter are shown below:
Quarter 1: July 1 - September 30
Quarter 2: October 1 - December 31
Quarter 3: January 1 - March 31
Quarter 4: April 1 - June 30
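A minimal sketch of this assignment and timeliness rule follows; the function names are ours, and the one-week allowance is applied exactly as stated above.

```python
from datetime import date, timedelta

# Sketch of the quarter-assignment rule: program-year quarters run July-
# September (Q1) through April-June (Q4), and an extract is timely only if
# it arrives no more than one week after the relevant quarter ends.

def program_quarter(d: date) -> int:
    """Map a calendar date to its program-year quarter (1-4)."""
    return {7: 1, 8: 1, 9: 1, 10: 2, 11: 2, 12: 2,
            1: 3, 2: 3, 3: 3, 4: 4, 5: 4, 6: 4}[d.month]

def quarter_end(d: date) -> date:
    """Last day of the program-year quarter containing d."""
    end_month = {1: 9, 2: 12, 3: 3, 4: 6}[program_quarter(d)]
    last_day = {9: 30, 12: 31, 3: 31, 6: 30}[end_month]
    return date(d.year, end_month, last_day)

def extract_is_timely(exit_date: date, extract_date: date) -> bool:
    """Extract must arrive within one week of the quarter's end."""
    return extract_date <= quarter_end(exit_date) + timedelta(days=7)

# A September 15 exit extracted October 5 is timely (October 7 cutoff);
# the same exit extracted November 20 is not.
assert extract_is_timely(date(2001, 9, 15), date(2001, 10, 5))
assert not extract_is_timely(date(2001, 9, 15), date(2001, 11, 20))
```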

Only one quarter in PY2001 (fourth quarter) met this criterion (see Table 1). Consequently, some job seekers were surveyed much later than prescribed by the program. The fact that some job seekers moved following program exit increased the difficulty of contacting those individuals, further depressing response rates. Extract date range errors also occur in the employer data, but the problem is not as severe (see Table 2). Perhaps the most significant negative impact of incorrect date assignment is recall bias (responses can change when several months have elapsed between program exit and the survey date). 

All of the issues described in this section contributed to delays in processing and increased costs. Exacerbating this situation is the difficulty in making contact with respondents. Therefore, every effort must be made to reach every respondent as early as possible. Since the original survey management system was unable to do so, the new system shown in Figure 2 was developed.

Improvements for Evaluating Program Year 2001 Data

Using the system shown in Figure 2 for PY2001, new job seeker and employer information was transmitted every two weeks from the Department of Workforce Services data management system to R&P (see Row A). Once new employer and job seeker records were downloaded, an initial mailing containing an introductory letter and response card was sent to target respondents (see Row B). When response cards are sent, five outcomes are possible (see Row C). Three outcomes (Request Telephone Survey, Refusal, and Request Mail Survey) require some sort of action on the part of the respondent. The remaining two actions (Return To Sender and Two Weeks No Response) are passive actions that result in either follow-up mailings (see Row F) or cold calls (see Row H). Cold calls are defined as telephone calls made without respondent request because of inaction or an incorrect address. 

Respondents receive either scheduled telephone calls (left box, Row D), or mail surveys (right box, Row D). Row E details possible mail or telephone survey responses. New outcomes introduced at this stage include Complete Survey and Exhaust Attempts. Complete Survey occurs when a respondent answers all survey questions. Exhaust Attempts occurs when five or more attempts have been made to reach a respondent without result, not including the initial response card mailing. Respondents can also exhaust attempts by repeatedly requesting a different type of survey instead of answering questions. For those respondents requesting a telephone survey, no additional follow-up activity takes place. Those requesting a mail survey are given two weeks to complete the survey before a follow-up package is sent (see Row F). 

Follow-up packages contain a letter (reminding respondents that they agreed to be surveyed upon program exit), the survey, and a reply envelope. The possible courses of action from this step are shown in Row G. From this stage, respondents can request a telephone or mail survey, refuse to complete the follow-up survey, answer all questions and return it, or take no action. If no action is taken or if the package is returned to R&P, the respondent receives cold calls until the survey is completed or the respondent’s contact attempts are exhausted.
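The sketch below compresses this contact loop into simple attempt bookkeeping. The outcome names follow Figure 2 as described above; the class and counting details are hypothetical, not the actual survey management system.

```python
from dataclasses import dataclass

MAX_ATTEMPTS = 5  # per U.S. Department of Labor guidelines; the initial
                  # response card mailing does not count toward the limit

@dataclass
class Respondent:
    contact_id: str
    attempts: int = 0
    status: str = "Card Sent"

    def record_attempt(self, outcome: str) -> None:
        """Log one contact attempt (call or mailing) and update status."""
        self.attempts += 1
        if outcome in ("Complete Survey", "Refusal"):
            self.status = outcome
        elif self.attempts >= MAX_ATTEMPTS:
            self.status = "Exhaust Attempts"   # no further contact
        else:
            self.status = "Pending Follow-up"  # next mailing or cold call

r = Respondent("JS-0042")  # hypothetical identifier
for outcome in ["No Answer", "No Answer", "Request Mail Survey",
                "Two Weeks No Response", "No Answer"]:
    r.record_attempt(outcome)
print(r.status)  # Exhaust Attempts
```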

Our goal in establishing this process was to develop systematic guidelines for conducting surveys, thereby increasing data accuracy. Examples of the types of job seeker and employer correspondence are shown in Appendixes 1 through 4 on our website.12 The identification numbers shown in the examples (Appendixes 1 through 4) were printed on each correspondence and survey component to ensure correct delivery and data entry. 

Results of New System Implementation

Response rates for both job seeker (see Table 1) and employer (see Table 2) groups increased dramatically in PY2001 compared to PY2000. The job seeker response rate increased from 11.3 percent to 59.2 percent and the employer response rate increased from 46.0 percent to 88.0 percent. The job seeker response rate would likely have been higher if nearly one-third of all job seekers had not reached the Exhaust Attempts stage before survey completion. Table 1 shows 773 potential job seeker respondents. However, the program rules (established by WIA and TEGL directives) allow exclusion of job seekers who cannot be contacted because of a known illness, death, or incarceration; 58 potential respondents fit into this category, leaving 715 records available for survey data collection. We found that several individuals had moved without leaving forwarding addresses. In addition, many job seekers had disconnected telephone numbers, further depressing response rates. In spite of these challenges, our success rate was very high for respondents with whom we actually made contact (423 completed surveys out of 463 valid and contacted respondents, or 91.3%). Reaching respondents was not an issue with employers (see Table 2), as contacting them at work was much more likely to yield a completed survey.
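The job seeker arithmetic from Table 1 can be reproduced directly; the sketch below applies the exclusions before computing the response rate, using the counts reported above.

```python
# Sketch of the PY2001 job seeker response rate computation. Exclusions
# (known illness, death, or incarceration) come off the potential pool
# before the rate is calculated.
potential = 773   # potential job seeker respondents
excluded = 58     # excluded under WIA/TEGL program rules
completed = 423   # completed surveys

available = potential - excluded               # 715 records available
rate = 100.0 * completed / available
print(f"Available: {available}; response rate: {rate:.1f}%")  # 59.2%
```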

Conclusion

WIA customer satisfaction data collection poses challenges for state agencies, several of which are left to the states to sort out themselves without the benefit of clear and consistent guidelines. However, if flexibility exists, especially when considering the lower number of participants in states with small populations, many pitfalls can be avoided. A “one size fits all” approach has very little utility. States must be given enough leeway so that reaching specified goals is not precluded by the procedures used to achieve them. 

1. 105th Congress, "Workforce Investment Partnership Act of 1997," Report 105-109, October 15, 1997.

2. 105th Congress, "Workforce Investment Act of 1998," n.d., <http://www.doleta.gov/usworkforce/wia.asp> (May 30, 2001).

3. 105th Congress, "Workforce Investment Act of 1998," n.d., <http://www.doleta.gov/usworkforce/wia.asp> (May 30, 2001), Section 136A(B).

4. The properties of the American Customer Satisfaction Index (ACSI) are described in the following Training and Employment Guidance Letter (TEGL) locations: 7-99, pp. 39-40; 6-00, pp. 8-16; and 6-00 Change 1, pp. 5-7. ACSI scores are calculated using mean (average) question response values according to the procedures outlined in U.S. Department of Labor, Employment and Training Administration, Training and Employment Guidance Letter 6-00, September 21, 2000.

5. U.S. Department of Labor, Training and Employment Guidance Letter No. 8-99, March 3, 2000, pp. 6-10.

6. U.S. Department of Labor, Training and Employment Guidance Letter No. 7-99, March 3, 2000, p. 34.

7. U.S. Department of Labor, Training and Employment Guidance Letter No. 7-99, March 3, 2000, pp. 34-37.

8. In addition to receiving record data little more than one month before the report deadline, the survey period was further shortened by the Veterans Day and Thanksgiving holidays.

9. As of September 12, 2002, another data field suitable for this purpose was located and will be used for PY2002 employer survey management.

10. Colleen Anderson, Program Manager, Employment Resources Division, Wyoming Department of Workforce Services, personal conversation, July 11, 2002.

11. The contact information gathering procedure was further modified in September 2002 so that a first and last name (e.g., John Smith rather than Mr. Smith) had to be entered to complete registration. In addition, two-word department names were also acceptable if the first and last name fields were filled.

12. Appendix 1: Job Seeker Introductory Letter Example; Appendix 2: Job Seeker Telephone Survey Example; Appendix 3: Employer Follow-up Letter Example; Appendix 4: Employer Follow-up Survey Example.

 
