Chapter 2: Methodology
by: Lisa L. Knapp, Research Analyst
To gain a more complete view of the workplace, this study used two research methods. The first was the analysis of administrative data; these records contain information on age, wages, tenure, and industry, and this method is low cost and noninvasive. However, administrative databases can provide only part of the story. To gain perspective on the opinions and intentions of State of Wyoming employees, we also administered a mail questionnaire. The questionnaire asked how employees felt about their supervisors and co-workers, wages, and workload, and what factors they would like to see changed. To learn how many employees may retire in the near future, a central goal of succession planning, the questionnaire also asked employees about their retirement plans and their views on working after retirement (responses to survey questions are shown in Appendix A). This chapter describes both methods in greater detail. For more information on the strengths and weaknesses of each method and the reasons for using both, please see the methodology chapter of Retention of Nurses in Wyoming.
Administrative Records
Research & Planning (R&P) has access to and uses several administrative databases that are updated on a regular basis (quarterly in most cases). The first of these is the Wyoming Unemployment Insurance (UI) Wage Records file, which contains information on employment and wages for all persons working for a UI-covered Wyoming employer in any given quarter. Often data from the Quarterly Census of Employment and Wages program are added to these wage records in order to analyze employment by industry. We also add demographic data such as gender and age from the Wyoming Department of Transportation driver’s license files. The combination of these sources of information allows us to conduct nonintrusive analysis on the state’s labor market at very little cost.
Survey Research
In 2008, R&P was contracted to conduct a succession planning study for three Wyoming state agencies: the Department of Employment (DOE), the Department of Family Services (DFS), and the Department of Workforce Services (DWS). R&P had previously conducted this study for DOE in 2006. Because the survey instrument had already been created, tested, and refined, few changes were made in 2008. We used factor analysis (see Chapter 5) to determine which, if any, questions were conceptually redundant and subsequently removed three questions about workplace satisfaction and moved two questions regarding benefits to the demographics section of the instrument (see the Chapter 3 and Appendix C).
We began the questionnaire process in May 2008 by obtaining names and mailing addresses for all employees working in the three agencies from their respective human resources representatives. Because of the large number of employees working for DOE, DFS, and DWS (see Table 1), we used the first mailing of the questionnaire as a form of address refinement. When a questionnaire was returned due to an incorrect address, an e-mail requesting an address update was sent to that employee. Overall, 101 questionnaires (7.7%) were returned for this reason (see Table 2). Of these, 82 (81.2%) belonged to DFS employees, 6 (5.9%) to DOE employees, and 13 (12.9%) to DWS employees. Of those who received an e-mail requesting an address update, 67 (66.3%) responded and were re-sent a questionnaire; the remaining 34 questionnaires were never delivered to an employee. Of these 34, 28 (82.4%) belonged to DFS employees, 3 (8.8%) to DOE employees, and 3 (8.8%) to DWS employees.
Prior to mailing the questionnaires to state employees, the directors of each agency sent an introductory e-mail explaining the purpose of the study. Over the course of 10 weeks, employees were mailed up to three copies of the survey instrument (see Appendix C). Each employee was assigned a random, confidential number and was mailed a copy of the questionnaire, a cover letter again explaining the purpose of the survey and the confidentiality measures, and a postage-paid, addressed return envelope. The first mailing was sent to 1,306 employees between April 29 and May 19 and yielded a valid response rate of 50.7%. The second mailing was sent between May 20 and June 10 to those who did not respond to the first and increased the response rate to 67.5%. A final mailing was sent between June 11 and June 25 to employees who had not responded to either of the first two mailings, increasing the response rate to 73.8%.
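The cumulative rates reported above are simply completed questionnaires divided by the number of employees surveyed. The short sketch below reproduces that arithmetic; the completion counts per mailing are hypothetical figures chosen to approximate the reported percentages, not the study's exact tallies.

```python
# Cumulative response rate arithmetic (hypothetical completion counts).
employees_surveyed = 1306
completed_by_mailing = [662, 219, 83]  # completions after mailings 1-3

total = 0
for mailing, completed in enumerate(completed_by_mailing, start=1):
    total += completed
    print(f"After mailing {mailing}: {total} completed, "
          f"response rate = {total / employees_surveyed:.1%}")
```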
After the third round of questionnaires, the response rate for DFS (63.0%) was much lower than the rates for DOE (80.5%) and DWS (80.3%). Because of this, R&P conducted follow-up phone calls to DFS staff between June 25 and June 30. These calls accomplished three things. First, enough questionnaires were completed during this process to increase the response rate for DFS to 70.1%. Second, the calls identified staff members who no longer worked for the agency and could be removed from the sample (N = 21, 2.7%). Finally, a conversation with an employee and a department supervisor alerted us to the possibility that not all employees had received the introductory e-mail from the agency directors. The purpose of the survey was explained to this administrator, who then informed the employees in that section.
At the end of the collection period, the final response rate for all employees included in the study was 74.3% (N = 971): 70.1% for DFS (N = 536), 80.5% for DOE (N = 243), and 80.3% for DWS (N = 192).
Nonresponse Bias
In research it is often as important to know who did not respond to a questionnaire as it is to know who did. If a substantial portion of a demographic group did not respond, the reported results may be misleading. There are several possible reasons why a person might not respond. For this study, it may be that the employee was too new to the job to feel capable of rating his or her experiences in the work environment. Perhaps the employee was afraid a response would be relayed to a supervisor, causing negative consequences. It may even be that the employee did not care enough either way to give an opinion. Whatever the reason, nonrespondents may differ substantially from respondents. This may affect the ability of survey results to be generalized to the larger population of interest, which in this case would be the agency.
Without completed questionnaires, we cannot identify differences in reported satisfaction levels between respondents and nonrespondents. However, we can analyze differences in known factors such as age, gender, and tenure on the job. To determine significant differences for these variables (differences larger than would be expected by chance, which might affect the final results of the study), we used the chi-square statistic. The technical aspects of this statistic are covered in greater detail in Chapter 4, but in essence the chi-square statistic measures the differences between observed results and expected results. If this difference is statistically significant, the probability value (p-value) will be equal to or less than 0.05.
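In symbols, where O_i is the observed count in cell i of a table and E_i is the count expected if respondents and nonrespondents do not differ, the statistic is:

```latex
\chi^2 = \sum_i \frac{(O_i - E_i)^2}{E_i}
```

Larger values indicate a greater departure from what chance alone would produce; the p-value translates that departure into a probability.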
Table 3 shows the differences between respondents and nonrespondents at DFS. Employees younger than age 35 made up a significantly greater share of nonrespondents (30.0%) than of respondents (20.2%; p = 0.02). Table 4 shows these results for DWS, where the youngest age group likewise made up significantly more of the nonrespondents (22.7%) than of the respondents (9.9%; p = 0.03). Table 5 shows the differences between respondents and nonrespondents by age for DOE. The chi-square for this table is not statistically significant (p = 0.43), meaning that no age group was significantly overrepresented among nonrespondents. Because younger workers may have different work experiences than older workers, such as fewer years on the job or children at home that alter the way they view their workday, the missing respondents in DFS and DWS may have answered the questionnaire differently than the older respondents, thus affecting the final results for these agencies.
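As a brief illustration of how such a test is computed, the sketch below applies SciPy's chi-square test of independence to a hypothetical respondent/nonrespondent table; the counts are invented for demonstration and do not reproduce Tables 3 through 5.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts by age group (under 35, 35-54, 55 and older);
# rows are respondents and nonrespondents. Illustrative data only.
table = [
    [108, 280, 148],  # respondents
    [68, 110, 50],    # nonrespondents
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
# A p-value at or below 0.05 would indicate that the age distributions of
# respondents and nonrespondents differ by more than chance.
```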
As shown in Table 6, a significantly greater proportion of DFS male employees did not return a completed questionnaire (27.8%, p = 0.02). There were no significant differences between respondents and nonrespondents based on gender for either DOE (p = 0.23; see Table 7) or DWS (p = 0.63; see Table 8).
These results indicate that younger employees in DFS and DWS, as well as males in DFS, may not be fully represented in the findings. This matters because, had they responded, their answers may have differed from those of the employees who did respond. This may limit the ability to generalize the results, particularly for these two agencies.