Background and Purpose
A growing body of knowledge highlighting the importance of quality assurance, performance management, and program evaluation in public organizations has been emerging since the 1990s, largely due to the enactment of the Government Performance and Results Act (GPRA) of 1993. Though GPRA was initially met with criticism, it has laid important groundwork for data collection, transparency, and review of the performance of government entities (Moynihan & Kroll, 2016; Moynihan & Lavertu, 2012). While many studies largely rely on quantitative data (Gerrish, 2015; Lewis et al., 2003), qualitative studies also provide a “range of perspectives” that prove helpful in the development of a quality system (Cairns et al., 2005, p. 139).
Importance of Program Evaluation
Program evaluation is a valuable part of any organization because it plays a vital role in data-driven decision-making, increasing accountability, and improving the efficiency and effectiveness of a program’s service delivery (i.e., performance management) (Ammons & Rivenbark, 2008). Program evaluation and performance management go hand in hand; that is, each informs the other. Results from performance management programs are often described in program evaluations, which in turn influence the performance management system within the program itself (Kroll & Moynihan, 2018). Utilizing focus group feedback can help achieve what Teng et al. (2016) recommended as a critical component of successful performance management—gaining buy-in from every link in the “implementation chain.”
Furthermore, Ammons and Rivenbark (2008) described two goals of performance management: accountability and service improvement. Accountability captures the effectiveness and outcomes of the program, department, or employee; service improvement is about creatively solving problems to make the program, or the services it provides, better. In the case of the South Carolina Vocational Rehabilitation Department (SCVRD), this framework was used first to determine whether the new QA tool would sufficiently measure the program’s service delivery against WIOA’s common performance measures (accountability), and second to improve the tool based on focus group feedback (service improvement). The agency further utilized a multi-attribute utility (MAU) method, as outlined by Lewis et al. (2003), to analyze the quality of service, determine outcomes, and define new goals. Ammons and Rivenbark (2008) emphasized the importance of “managerial thinking” to do this successfully, that is, creatively rethinking policies, procedures, and programs to achieve new goals.
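At its core, an MAU analysis of this kind reduces to a weighted sum of attribute ratings. The following is a minimal sketch of that calculation; the attribute names, weights, and 0–100 rating scale are illustrative assumptions, not drawn from SCVRD’s actual analysis:

```python
# Hypothetical sketch of a multi-attribute utility (MAU) calculation.
# Attributes, weights, and ratings below are illustrative only.

def mau_score(ratings, weights):
    """Weighted-sum utility over attributes.

    ratings and weights are dicts keyed by attribute name; weights are
    normalized to sum to 1, and ratings share a common 0-100 scale.
    """
    total_weight = sum(weights.values())
    return sum(ratings[attr] * (weights[attr] / total_weight)
               for attr in weights)

# Example with three illustrative service-quality attributes.
weights = {"compliance": 0.5, "documentation": 0.3, "timeliness": 0.2}
ratings = {"compliance": 90, "documentation": 70, "timeliness": 80}
score = mau_score(ratings, weights)  # 0.5*90 + 0.3*70 + 0.2*80 = 82.0
```

In practice, the weights would be elicited from stakeholders and the ratings drawn from case review data; the normalization step lets evaluators express weights in any convenient units.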
Adoption & Implementation of New Performance Management Tools
During the VR process, the rehabilitation counselor guides a case through SCVRD’s program by coordinating individualized services for each consumer to address specific limitations, with the goal of leading that consumer to vocational success. To do this effectively, the counselor must demonstrate competency across key knowledge domains, including effectively coordinating and documenting services delivered as part of an overall performance management strategy (Leahy et al., 2013). Because a long-term aim of this project was to improve service to SCVRD consumers, it was important to understand how new performance management strategies, such as the electronic QA tool, can be successfully integrated into an organization’s operations. For a new performance management system to become common practice within an agency, it must first be “adopted.” Adoption refers to the development of measures of outputs, outcomes, and efficiency, as stakeholders, interested parties, and subject matter experts create a new program, system, or policy change (Julnes & Holzer, 2001). It is at this stage that the desired idea or theory becomes tangible. Underpinning factors must be in place for both adoption and implementation to work smoothly—namely, resources, information, and goal orientation (Julnes & Holzer, 2001). Without these factors established within the organization, the new system is less likely to succeed.
Beyond adoption and implementation, the organization must also integrate the system at all levels into the fabric of the organization; that is, it must become part of the organization’s culture (Cairns et al., 2005; Teng et al., 2016). Oftentimes, performance management systems morph into merely performance measurement systems, meaning the trends and data they produce are reduced to simple numbers, and the bigger picture of management and training initiatives is lost. To be successful, the organization’s leaders must show commitment and dedication to quality (Cairns et al., 2005). A recent meta-analysis provided evidence for keeping the “management” in performance management systems by encouraging leadership not to lose sight of the agency’s mission when new systems are implemented (Gerrish, 2016).
Performance Management Routines
Performance management systems are only as effective as the degree to which they are incorporated into the philosophy of an organization. As such, in order for a new QA tool to be integrated into an agency’s program evaluation process, evaluators must consider how performance management systems will effectively become established as a routine. Pentland and Feldman (2008) defined these routines as “generative systems that produce recognizable, repetitive patterns of interdependent actions, carried out by multiple actors” (p. 235). Moynihan and Kroll (2016) noted several elements that are key in developing effective performance management routines, including participation from agency leadership, a capacity to analyze performance data, follow-up mechanisms, and recognition of managers who meet their goals. The authors also contended that performance management goals must be made clear to increase buy-in among staff, leading to effective implementation.
Prior research also describes the challenges of acquiring performance management data and interpreting it to foster substantial programmatic change. While many organizations are successful at employing performance management systems to collect and disseminate data, fewer are able to use the data in an effective way (Moynihan & Lavertu, 2012). To be successful, there must be a shift from learning new tasks or procedures solely as a function of the agency to learning as a structured practice or culture within the agency (Moynihan & Lavertu, 2012).
The SCVRD has shifted its service model over the years to provide consumers with innovative resources, such as job readiness training centers, comprehensive evaluation centers, a muscular development center, an information technology (IT) training center, and a 28-day inpatient drug and alcohol treatment center. SCVRD’s ability to quickly adapt and implement changes in policies, procedures, and service delivery in keeping with current legislative requirements has positioned the agency to be a national leader in vocational rehabilitation services (Mann & Croake, 2018). Since the enactment of WIOA in 2014, SCVRD has modified many of its policies and procedures to better meet the standards of this act.
Even as leadership within SCVRD proactively worked to train program staff on the new WIOA performance measures, the agency was not closely evaluating the quality of services provided to consumers in light of these new measures. Prior to this in-depth investigation of statewide service delivery in terms of WIOA, the agency’s quality assurance team was utilizing an outdated paper form to measure and report case compliance. This form did not include the WIOA common performance measures, and both its language and its paper-and-pencil format were out of date. The purpose of this project was to develop and test a new quality assurance tool to align with contemporary agency needs, with a longer-term aim of improving service delivery and consumer outcomes.
Quality Assurance Instrument Development
Development Phase
For this project, multiple stakeholders collaborated to devise a new, electronic quality assurance (QA) tool that examines the quality of services rendered in terms of compliance with the new federal regulations and the strength of case note documentation. The program evaluation and quality assurance (PEQA) team was aware of the key skill domains and recommended they be included in the new QA tool to capture the effectiveness of service provision in SCVRD’s program. The PEQA team then developed the tool over the course of one year (January-December 2020) with the involvement of two focus groups: an “adoption” focus group that aided in the development of the questions and design of the tool, and an “implementation” focus group that tested the tool on VR cases and provided further suggestions for improvement. The first group, focusing on adoption, consisted of 21 individuals employed as counselors and area consumer service managers from each office around the state, who provided valuable insight on the development and design of the tool. Regular meetings were held with Information Technology (IT) and QA staff to make ongoing enhancements to the instrument. The task of the adoption focus group was to review the existing paper version of the tool and assist in the creation of quality assurance questions that better captured the rehabilitation knowledge domains and the new WIOA common performance measures.
Implementation Phase
Following the development phase, the second group consisted of six individuals employed as supervisors or area case service managers, and one senior counselor. This group focused on testing the new QA instrument prior to full implementation and was tasked with testing the new program developed by IT and providing feedback on how the program functioned. The implementation group began its work in August 2020 and met weekly with the PEQA team and IT to discuss the program’s effectiveness at evaluating cases in their areas. These rounds of testing over an 8-week period provided important information used to guide further improvements to the instrument and enhance usability, including development of the tool’s scoring system measuring compliance and strength of documentation. The tool was finalized in October 2020, and the implementation group was asked to complete a follow-up survey.
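The paper does not describe the tool’s scoring rubric in detail. As a purely hypothetical sketch of how a two-part case score could combine compliance and documentation strength, assuming each QA question yields a pass/fail compliance flag and a 0–3 documentation-strength rating (an invented rubric, not SCVRD’s actual one):

```python
# Hypothetical two-part QA case score; the rubric (pass/fail compliance
# plus a 0-3 documentation rating per question) is an assumption.

def score_case(items):
    """items: list of (compliant: bool, doc_strength: int 0-3) tuples,
    one per QA question on the reviewed case.

    Returns (percent of questions marked compliant,
             average documentation-strength rating).
    """
    if not items:
        return 0.0, 0.0
    compliance = 100.0 * sum(1 for c, _ in items if c) / len(items)
    doc_strength = sum(d for _, d in items) / len(items)
    return compliance, doc_strength

# Example: a case reviewed against four QA questions.
items = [(True, 3), (True, 2), (False, 1), (True, 3)]
compliance, strength = score_case(items)  # 75.0, 2.25
```

Separating the two components mirrors the tool’s stated aim of measuring both compliance with federal regulations and the strength of case note documentation, so a case can pass on one dimension while flagging training needs on the other.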
Qualitative feedback was gathered from each focus group at key points within the project to determine whether the tool sufficiently captured the WIOA performance standards and whether group members were satisfied with the usability of the tool. Overall, participants reported positive feedback, noting that the new tool would be helpful in informing training needs and that it aligned moderately well or extremely well with WIOA measures.
Recommendations and Implications for Practice
Implementation of the new QA tool was a significant and much-needed procedural change in the agency, but it will not be effective long-term without a cultural “paradigm shift” to the new federal performance measures. As SCVRD finalizes statewide implementation and works toward sustainability, it will be important for leadership to ensure the use of best management practices. Managers at all levels within the agency must understand the paradigm shift and adhere to the established managerial best practices. It will also be helpful to consider what benchmarks can be put into place as the new QA tool is implemented, such as incorporating its scores into SCVRD’s Employee Performance Management System.
Consistent with prior research (Kroll & Moynihan, 2018), training initiatives, in addition to managerial practices, must be examined. SCVRD will need to design a training model to ensure that the new electronic QA tool is well received within the agency and that the efforts of SCVRD staff align with the new WIOA common performance measures. Ultimately, the PEQA team hopes to see this project result in improved programmatic outcomes, including consumers successfully obtaining higher-paying jobs for longer periods of time, attaining a greater number of recognized credentials and measurable skill gains, and becoming better prepared to meet the needs of South Carolina’s growing labor market sectors. Based on this study, the following recommendations, limitations, and reflections are offered:
- The timing of the project, during the COVID-19 pandemic, limited the number and types of meetings that were held with the focus groups.
- The implementation focus group was small. It is recommended that the SCVRD PEQA team continue to gather input periodically and adjust the tool as needed as it is used statewide.
- Both the adoption and implementation focus groups provided positive feedback overall, indicating that the tool sufficiently captures the WIOA common performance measures, is easy to use, and is an improvement over the existing paper form.
- By using this new tool, the agency will have a universal framework for program evaluation measures, all agency staff will learn the new WIOA requirements, and training needs will be more easily identified.
- One of SCVRD’s primary goals in adopting and implementing the new QA tool as a performance management routine for enhanced program evaluation was to commit to continuous improvement. This was evident even in the development of the QA tool itself: the focus groups provided valuable feedback and suggestions, which were incorporated into the tool during development to improve its ability to capture the WIOA common performance measures and to improve its usability. When the tool is used statewide, it will assist leadership in identifying which programmatic areas can be improved.
- Using a collaborative and interactive approach to develop new evaluation and quality assurance tools within public vocational rehabilitation agencies is effective in establishing buy-in and ensuring that the instrument is practical in meeting the needs of both field staff and leadership.
Author Note
The contents of this paper were developed under a cooperative agreement with the U.S. Department of Education, Technical Assistance Center for Vocational Rehabilitation Agency Program Evaluation and Quality Assurance (PEQA-TAC) (Grant Award Number: H263B150004). However, the contents and views expressed in this publication do not necessarily represent the positions or policies of the U.S. Department of Education, and you should not assume endorsement by the Federal government.