Modern public service agencies face increasing competition for public funds and growing demands for accountability. Within the state-federal vocational rehabilitation (VR) program, the Rehabilitation Act of 1973, as amended, established evaluation standards and performance indicators, leading to an expansion of program evaluation and quality assurance systems and personnel within state VR agencies. Over time, these evaluation systems have become imperative for demonstrating program effectiveness, efficiency, and value, and for ensuring state and federal policy compliance. Data from evaluation systems are also important for use in state planning. VR agency program evaluation specialists play a pivotal role in meeting these demands by designing strategies and systems, collecting and assessing program performance data, managing projects, and communicating information to agency leaders and other stakeholders. The Vocational Rehabilitation Technical Assistance Center on Program Evaluation and Quality Assurance (PEQA-TAC), funded by the Rehabilitation Services Administration (RSA) within the U.S. Department of Education, was designed and implemented to assist state VR agencies in improving performance management by building their capacity to carry out high-quality program evaluations and quality assurance practices that promote continuous program improvement (Notice of Final Priority, 2015).

The purpose of this paper is to provide background information highlighting the need for formal in-service training in this specialized area, share an overview of the program model involving online coursework and capstone projects, and discuss recommendations for further enhancing state VR capacity.

Background and Need

The state-federal VR program, which serves over 1.2 million individuals and spends more than $3 billion each year, plays a large and influential role in helping persons with disabilities maximize their employment, independence, and integration into the community and the competitive labor market. It does so by improving early childhood, educational, and employment outcomes and by raising expectations among people with disabilities, their families, communities, businesses, and the nation (U.S. Department of Education, Office of Special Education and Rehabilitative Services, Rehabilitation Services Administration, 2020). Although the VR program has been reasonably successful in helping individuals obtain and maintain employment, it also faces critical contemporary challenges, as the employment outcome rate has declined continuously in recent years (U.S. Department of Education, Office of Special Education and Rehabilitative Services, Rehabilitation Services Administration, 2020).

Enactment of the Workforce Innovation and Opportunity Act of 2014 (WIOA) and its associated regulations renewed urgency and hastened the advancement of program evaluation and quality assurance to adapt to new rules and prescribed partnerships across governmental agencies (Sabella et al., 2018). The legislation substantially elevates the importance of program evaluation and quality assurance in public VR programs by requiring collaboration across workforce programs, setting new standards for performance measurement, enhancing internal controls, mandating interagency data alignment, broadly expanding reporting requirements, and extensively modifying the traditional scope of services (Sabella et al., 2018). Furthermore, the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act) requires federal agencies, and by extension state VR programs, to modify how data are collected and managed, to promote data transparency and use, and to facilitate data-driven decision making. This increased emphasis on accountability and improved outcomes, together with these legislative changes, is intended to address declining outcome rates and enhance agency awareness, further contributing to the need for program evaluation and quality assurance (PEQA) specialists within the public VR program.

In this era of accountability and evidence-based practice, state VR agency administrators are encouraged to transform their agency business models by adopting organizational innovations and harnessing policy and practice advances to deliver and measure value for individuals, programs, communities, and society at large (Groomes et al., 2017; Leahy, Chan, & Lui, 2014; Leahy, Chan, Lui, et al., 2014). As Kroll and Moynihan (2017) note, performance management is more likely to result in evidence-informed decisions when it is integrated with program evaluation. While case management systems provide access to routine information at low cost and in a timely manner, they cannot explain variation in performance. Program evaluation, however, is designed to offer such knowledge and can therefore help make sense of performance outcomes. Furthermore, employees engaged in performance management are more likely to use performance data if they are actively involved with program evaluation. The public VR system can continue to leverage feedback from its customers in the workforce programs, rethink its processes, position innovative ideas, and follow up on its change management functionality through performance management actions such as program evaluation (Groomes et al., 2017).

In the public VR system, program evaluation and quality assurance specialists are required to have a high level of proficiency in both evaluation and VR knowledge. However, state VR agency leaders tend to approach the hiring process for evaluation specialists in two distinct ways: either (a) developing individuals from within the organization who have knowledge of the VR system, assuming they can then acquire the necessary evaluation knowledge; or (b) hiring individuals with advanced evaluation and quality assurance abilities from external sources and teaching them state VR-specific knowledge (Cummings et al., 2011). External applicants face significant hurdles in understanding the VR process and applying regulations, but more importantly, they may lack the experience to arrive at meaningful, informed judgments in unique VR contexts. Frequently, agencies select internal candidates who have extensive VR experience but no formal training or expertise in program evaluation. These individuals are expected to perform highly specialized and technical evaluation and quality assurance tasks, and they may face significant challenges in fulfilling their new roles (Sabella et al., 2018).

As such, an online training program specific to PEQA competencies was developed to bridge this knowledge gap. The PEQA online training domains were intentionally designed to align with the Evaluator Competencies and Guiding Principles established by the American Evaluation Association (AEA), which apply to all professional evaluators (American Evaluation Association, 2018). These professional competencies were originally developed, and subsequently revised, based on input from the Guiding Principles Task Force, AEA leadership, and AEA members. Key knowledge and practice domains were identified using methods consistent with those recommended by the Joint Committee on Standards for Educational Evaluation (JCSEE). While assessing overall programmatic outcomes based on evaluator competencies can be challenging, research underscores the importance of using established taxonomies to guide the professional competency development of program evaluators (King & Stevahn, 2015; Stevahn et al., 2005a; Stevahn & King, 2014). The Guiding Principles are intended to guide the professional and ethical practice of evaluators and comprise (a) systematic inquiry, (b) competence, (c) integrity, (d) respect for people, and (e) common good and equity (American Evaluation Association, 2018). A participatory process is used to review and update the principles at least every five years, with the latest revision taking place in 2018 (see Table 1).

Table 1. The AEA Guiding Principles for Evaluators
Systematic Inquiry
    Evaluators conduct data-based inquiries that are thorough, methodical, and contextually relevant.
Competence
    Evaluators provide skilled professional services to stakeholders.
Integrity
     Evaluators behave with honesty and transparency in order to ensure the integrity of the evaluation.
Respect for People
     Evaluators honor the dignity, well-being, and self-worth of individuals and acknowledge the influence of culture within and across groups.
Common Good and Equity
     Evaluators strive to contribute to the common good and advancement of an equitable and just society.

PEQA Competency in State VR Programs

Effective quality assurance and program evaluation strategies are critical in supporting data-driven decision making and improving VR program performance (Del Valle et al., 2014; Leahy et al., 2009). To be poised to adapt in the ever-changing landscape, state VR agencies need leadership that values program evaluation and quality assurance efforts and is willing to invest in continuous improvement (McFarlane et al., 2011; Sherman et al., 2014). As a result, additional resources have been directed to evaluation units, and there are greater numbers of evaluation and quality assurance specialists who are redefining these roles within state VR agencies (Sabella et al., 2018). A study by Sabella et al. (2018) investigated the essential competencies necessary to be an effective VR program evaluator using a sample of 43 state VR agency PEQA specialists, administrators, and other related support personnel. Participants were asked to rank six PEQA competencies identified in an earlier study from the 36th Institute on Rehabilitation Issues monograph on performance management (Cummings et al., 2011) and update the competencies based on a decade of PEQA advancement. The mixed methods approach yielded five primary competencies: (a) methodology and data analysis, (b) knowledge of the state-federal VR system, (c) interpersonal skills and effective communication, (d) project management, and (e) critical thinking.

To be effective, program evaluators must have the right combination of PEQA technical skill, knowledge of the environment they inhabit, and the interpersonal skills to manage teams and communicate the message (Stevahn & King, 2014). The complexities of the state-federal VR program create a steep learning curve for external hires, leading many agencies to promote or assign PEQA specialists from within, even when they lack PEQA expertise (Cummings et al., 2011). Sabella et al. (2018) found that competency in program evaluation methods and data analysis was ranked highest in importance, surpassing VR knowledge; this acknowledges the technical skill requirement, but may also suggest that VR agency personnel feel they lack adequate training in this area. To enhance the use of program evaluation evidence in VR agencies, leadership is required that supports an ongoing focus on quality and continuous improvement, the use of evidence in decision making, and training for individuals assuming PEQA roles (Sabella et al., 2018; U.S. Government Accountability Office, 2013).

The Summit Group Resource

In 2008, the Summit Group on Performance Management in VR (www.vocational-rehab.com) was founded to support PEQA personnel within state VR agencies. The Summit Group was designed as a community of practice (CoP) with the fundamental premise of promoting collaboration among evaluators. According to CoP theory as articulated by Lave and Wenger (1991), learning is best achieved through a social and participatory process in which members engage with each other, share knowledge and tools, reflect on common concerns, and work through problems collectively. Within this model, CoPs are groups of people connected by a common purpose or passion who learn better practice through interaction among members (Wenger-Trayner & Wenger-Trayner, 2015).

For a CoP to be successful, the need has to be sufficiently urgent to unite members and sufficiently enduring to keep them engaged year after year. Over the past 15 years, state VR agencies have encountered a broad range of challenges, including implementation of WIOA, recruitment and retention of qualified personnel, economic downturns, demand for technological advancement, federal and state politics, and other unexpected environmental and contextual changes affecting their service systems and client base. Furthermore, contemporary legislation guiding public programs requires both fiscal and evaluative accountability and transparency to ensure resources are used effectively (Foundations for Evidence-Based Policymaking Act of 2018, H.R. 4174 § 312, 2018).

Formation of a Community of Practice

In 2000, the Rehabilitation Services Administration published evaluation standards and performance indicators that established specific measures of program performance and client outcomes (U.S. Department of Education, 2000), leading to an expansion of program evaluation systems and personnel in state VR agencies (Sabella et al., 2018). Then in 2007, new RSA monitoring protocols introduced a more comprehensive assessment of each agency's program evaluation and quality assurance systems (Shoemaker & Sabella, 2010). These new federal priorities reflected an evolution from simple outcome reporting to the use of data in decision making, evidence-informed practice, quality assurance systems, and, more broadly, continuous improvement in service delivery and outcomes (Sherman et al., 2014). Until then, many state VR agencies had viewed PEQA in narrower terms, often as a case file review process and/or the reporting of standards and indicators, without a clear understanding of comprehensive performance management (Cummings et al., 2011). This created an urgency to quickly develop PEQA literacy and capacity.

The Utah State Office of Rehabilitation initiated a series of conversations with allied agencies to survey the field about the state of their current PEQA processes and the primary needs of personnel doing the PEQA work (Shoemaker & Sabella, 2010). Their findings revealed there were few PEQA training resources available for state VR personnel, and PEQA specialists had very little contact with their counterparts in other agencies (Shoemaker & Sabella, 2010). These informal conversations were the impetus for the first Summit Group conference in Salt Lake City, UT, with 47 participants and representation from 15 different states. The success of the first conference and rising PEQA interest nationally drove an expansion of the Summit Group to today’s more than 450 listserv members, 13 annual conferences averaging 140 participants, and over 950 registrants for the 2020 virtual conference hosted by the PEQA-TAC at the University of Wisconsin-Stout Vocational Rehabilitation Institute. Participation in the CoP has continued to grow, signaling that PEQA pressures are intensifying rather than waning.

The need for advanced training remains high as contemporary VR program evaluation and quality assurance specialists face a host of new concerns, including transitioning to common performance measures, developing methods for evaluating and tracking pre-employment transition services, and negotiating with federal agents to set performance measure targets. Personnel are also hampered by changes to federal reporting systems, data access and sharing issues across state agencies, and barriers to collaboration across workforce core partners.

Capacity Building Model

The PEQA-TAC model intentionally and collaboratively leveraged knowledge and skills across multiple partners to develop and implement a tiered approach to program evaluation and quality assurance training, coaching, and technical assistance for state VR personnel. The key partners included the University of Wisconsin-Stout Vocational Rehabilitation Institute (SVRI), the University of Wisconsin-Stout Applied Research Center, the University of Wisconsin-Madison, Michigan State University (MSU), the Council of State Administrators of Vocational Rehabilitation (CSAVR), the Summit Group, Employment Resources, Inc., and RSA. Furthermore, a cadre of experienced and talented coaches and mentors supported participants as they progressed through the training program. The program involved a sequence of online training modules aligned with key evaluator competency domains, a series of advanced workshops and online training opportunities, participant capstone projects, and a CoP coordinated with the Summit Group. Throughout all elements of technical assistance provision, the partners actively collaborated to ensure consistency and open communication, working in close collaboration with state VR agencies and RSA.

Online Training and Applied Capstone Projects

Strong evidence supports the benefits of blended learning, which (a) produces participant learning outcomes consistent with or better than traditional learning environments, (b) is cost effective, and (c) enhances accessibility across populations and geographic locations (Bowen & Lack, 2012; Means et al., 2009). The PEQA program was designed as a blended learning model and continued to evolve across cohorts over the course of five years. Early formative evaluation efforts with initial cohort participants informed programmatic changes to the design and delivery of the online curriculum, earlier integration of the capstone projects, and the addition of a coach assigned to each participant to enhance engagement. The initial coursework was subsequently streamlined, and participants were able to progress through the curriculum more efficiently.

Certificate in Program Evaluation (CIPE)

The online, asynchronous Certificate in Program Evaluation (CIPE) was available at no charge to program evaluation personnel of state VR agencies. Participants were recruited from the 78 state VR agencies through collaborative outreach conducted by the PEQA-TAC, RSA, CSAVR, the Summit Group, and the National Clearinghouse on Rehabilitation Training Materials (NCRTM). These direct recruitment efforts resulted in applicant pools of at least eight individuals and/or teams per cohort from state VR programs. Applications were reviewed by the PEQA-TAC Admissions Committee, composed of Executive Committee members, staff, and coaches, and applicants were approved based on the following criteria: (a) educational background, (b) current role as a program evaluator for a state VR agency, (c) length of service in the program evaluator role, (d) professional certifications, and (e) support from their supervisor and state VR agency to enroll in the program. These criteria were initially used to differentiate program evaluators likely to benefit from the full training program from those more appropriate for intermediate training activities, including the advanced workshops. The support of the applicant's state VR agency was critical, as the online training culminated in the completion of a capstone project that addressed a specific evaluative need of the state VR agency and could be incorporated into the program evaluator's work assignment.

The CIPE curriculum and its delivery built upon the expertise of rehabilitation counselor educators, state VR partners, and researchers (employed by partnering entities) to create a learner-centered, engaging program designed to strengthen capacity in six key knowledge domains (see Figure 1).

Figure 1. Knowledge Development Areas of the PEQA CIPE

The AEA competencies are considered the "gold standard" and are frequently used to guide program evaluation training and education offered through universities (Stevahn et al., 2005b). The PEQA course modules were aligned with the AEA competency guidelines with the intent of increasing capacity for the provision of quality program evaluation and quality assurance services within the state-federal VR system (see Table 2).

Table 2. Certificate in Evaluation Studies Curriculum Overview

Domain 1: Overview of Evaluation Approaches & Theories
Description: Introduces major evaluation theories and approaches.
Knowledge and skills:
    Become familiar with various approaches that drive evaluation
    Understand the relevance of evaluation across multiple settings
    Identify personal evaluation competencies
    Develop a personal philosophy of evaluation

Domain 2: Methods
Description: Covers how to design quantitative and qualitative evaluations.
Knowledge and skills:
    Conduct a literature review and create logic models
    Identify and address related contextual and political issues
    Identify evaluation questions
    Develop a plan to answer evaluation questions
    Identify appropriate data collection strategies

Domain 3: Data Collection: Strategies & Techniques
Description: Introduces quantitative and qualitative data collection strategies.
Knowledge and skills:
    Design and implement basic focus group methods
    Design and implement basic survey methods
    Design and implement observational data collection methods

Domain 4: Data Analysis: Strategies & Techniques
Description: Introduces qualitative and quantitative analytical methods.
Knowledge and skills:
    Analyze and interpret data
    Share data with others

Domain 5: Applying Findings & Reporting Across Multiple Stakeholder Audiences
Description: Covers information and skills related to the practice of evaluation in the field.
Knowledge and skills:
    Understand differences in evaluation settings
    Apply situationally appropriate reporting techniques
    Assess information from previously conducted evaluations
    Build evaluation capacity in stakeholders
    Present results with data visualizations
    Facilitate action plans based on evaluation results
    Respond to requests for proposals (RFPs)
    Develop evaluation plans for grant proposals

Domain 6: Repository of Research & Evaluation Resources
Description: A collection of materials relevant to program evaluation and quality assurance.

Capstone Projects

Capstone projects provided PEQA-TAC participants with the opportunity to design and implement a meaningful evaluation for their state VR program by applying knowledge and skills acquired through the CIPE training program. A cadre of individuals representing various universities and national research firms served as capstone mentors, providing support and consultation to participants as they developed, implemented, and completed their projects. Michigan State University, well regarded for its work with Project Excellence, a program evaluation partnership with Michigan Rehabilitation Services initiated in 2001, served as the lead partner in developing the capstone projects.

The MSU team created an extensive list of resources for the state agency staff participating in the program, including a capstone handbook. The handbook was designed to support participants through their capstone experience and contained important information guiding each step of the process (e.g., interest/topic inventory, informal interview with the capstone team, proposal, final report), general expectations, and sources of support (e.g., project assistants). It consisted of five sections: (a) general program evaluation and quality assurance, (b) research, (c) previous research done in public VR, (d) evidence-based practices, and (e) WIOA. At the informal interview, each participant stated their research interests, and the capstone team provided further guidance on developing their ideas into a study, selecting a potential research method, and implementing the study. Based on their topic areas, each participant was assigned a capstone mentor. The mentors worked closely with participants using the handbook and course materials, and provided assistance throughout the capstone project (e.g., study plan, data collection, instrument/survey development, data analysis, report writing, presentation).

CIPE and Capstone Outcomes

A total of 76 individuals representing 37 state VR agencies were accepted into the PEQA-TAC training program between Fall 2016 and Winter 2020. Of these, 38 individuals representing 21 state agencies successfully completed the online training and 26 capstone projects aligned with state agency evaluation needs specific to WIOA implementation (see Table 3). The program was originally designed for individual participants, but expanded to accommodate state-based teams of up to three individuals, as well as participation by multiple individuals within a state VR agency across various cohorts. These modifications allowed for a more flexible approach to meet the changing needs of state VR programs.

Table 3. Completed State VR Program Capstone Projects
State VR Agency Capstone Project
Arkansas Division of Services for the Blind Examining Correlations between Pre-employment Transition Services and Vocational Rehabilitation Progression
Florida Division of Vocational Rehabilitation Pilot Evaluation of the VR Works Training and Implementation
Idaho Division of Vocational Rehabilitation Rural and Urban Counties in Idaho: Differences in Vocational Rehabilitation Service Delivery and Outcomes
Illinois Department of Human Services Pre-ETS Success, Pre-ETS Standardized or Teacher Created Curriculum: Does the Structure Determine Transition Success?
Indiana Bureau of Rehabilitation Services Case Service Reporting: Meeting Guidelines for WIOA and Supporting Documentation
Kentucky Office of Vocational Rehabilitation Evaluating the Long-Term Effectiveness of the SGA Enhanced Services Model in Kentucky: A Follow-Up Study on the SGA Demonstration Project
Louisiana Rehabilitation Services The Effect of Introductory Training on the Use of Motivational Interviewing in Vocational Rehabilitation
Massachusetts Commission for the Blind Exploring the Relationship between Cost of Purchased Services and Employment Outcomes
Massachusetts Rehabilitation Commission The Effectiveness of Integrated Resources for People with Mental Health and Employment
Michigan Rehabilitation Services The Effect of Benefits Counseling on Increasing Knowledge of Social Security Work Rules and Work Incentives
Minnesota Vocational Rehabilitation Services Repeat Customers: Minnesota Participants who Return for Multiple Series of Case Services
A Pilot Study Evaluating the Effectiveness of Person-Centered Planning
Missouri Vocational Rehabilitation Services The Impact of Early Work Experience on VR Outcomes
Evaluation of Employment Services Plus
New Jersey Commission for the Blind and Visually Impaired Developing a Dashboard: Preliminary Steps and What We Learned
South Carolina Commission for the Blind A Pilot Study to Develop a VR Case Review Instrument for WIOA Performance Measure Data Collection
South Carolina Vocational Rehabilitation Comprehensive Needs Assessment for Pre-Employment Transition Services
Evaluation of the South Carolina Vocational Rehabilitation Department (SCVRD) Information Technology Training Center
Improving Quality Assurance at SCVRD: Developing a New, Electronic QA Tool
South Dakota Division of Vocational Services Customized Employment Training Needs in South Dakota
Texas Workforce Commission-Vocational Rehabilitation Division A Critical Examination of Cases Closed Unsuccessfully After Application for VR Services
Utah State Office of Rehabilitation An Outcome Analysis of Community Rehabilitation Programs
Vermont Division of Vocational Rehabilitation Vermont Case Review Process Guide: Assessing and Delivering High Quality Services to VR Customers
Virginia Department for the Blind A Closer Look at Vocational Rehabilitation College Training Services at the Virginia Department for the Blind and Vision Impaired
Wisconsin Division of Vocational Rehabilitation Plan Twice, Measure Once: Designing a Financial Capability Service with Evaluation in Mind
Identifying Best Practices for Long-Term Success in Supported Employment

Discussion and Future Implications

The PEQA-TAC provided an opportunity to better understand the acute need for specialized in-service training and mentoring in program evaluation and quality assurance in the public VR program. The need for skilled professionals in this area continues as states implement the myriad complex requirements of WIOA, and as federal legislation and guidance, notably the Foundations for Evidence-Based Policymaking Act of 2018, require increased proficiency in promoting high-quality, data-driven decision making across employment programs (U.S. Department of Labor, n.d.).

There is also a considerable continuum of program evaluation and quality assurance knowledge, skill, and experience within state VR programs. Individuals employed in these positions tend to have either degrees and experience as evaluators or, alternatively and more commonly, internal experience as counselors or in related positions within state agencies. Ongoing training that is structured but flexible, and delivered in a manner that fits busy schedules and professional demands, is therefore important for effectively supporting participants.

Additionally, personnel turnover in state VR programs occurs regularly, and the need to continually train and support program evaluation and quality assurance staff should be anticipated. For example, of the 76 participants approved for the CIPE training program, 28 (37%) withdrew, with the primary reason being a job change either within their agency or to employment outside the VR program. Of the 38 participants or teams who successfully completed the online training program, five (13%) were promoted into other positions and/or left their employment with the state VR agency after completing the PEQA program.

The PEQA-TAC developed contemporary methods of training rehabilitation personnel in program evaluation and quality assurance, most notably the PEQA CIPE, which included the online training modules and applied capstone projects. These graduates are expected to make ongoing contributions that enhance their state VR agencies' performance management and increase data-driven decision making. As referenced above, challenges of the training program were also observed, especially the retention of participants who were busy employees of state VR agencies.

Further research and development in this specialized area is recommended as the need to continually improve systems and practice evolves. Development of a tool to effectively measure VR program evaluator competencies may help establish baseline knowledge and skills and inform future training and capacity building. As a starting point, the PEQA-TAC provided a strong foundational and systematic training program on which future efforts can build. Content from the Center, including a representative number of the published capstone projects, is archived through the National Clearinghouse on Rehabilitation Training Materials for use by current and future rehabilitation counselors, educators, and administrators.

Conclusion

Offering a formalized approach to training ensures that state VR program evaluation specialists have an opportunity to acquire knowledge of data analysis and interpretation, effective communication, project management, and critical thinking, as well as the ability to appropriately apply data within the public VR context (Sabella et al., 2018). Quality employment outcomes for individuals with disabilities remain a key priority of VR as workforce programs evolve and seek validation through data-driven policy and service delivery. The PEQA-TAC established the first formalized education and training program designed to build program evaluation and quality assurance capacity within state VR agencies, and collaboration with the Summit Group on Performance Management in Vocational Rehabilitation served to promote and enhance communication channels with state agency systems and personnel. While the federally funded PEQA-TAC training program was time-limited, it is expected to have an enduring impact as cohort participants engage with their counterparts across state VR agencies.


Author Note

The contents of this paper were developed under a cooperative agreement with the U.S. Department of Education, Technical Assistance Center for Vocational Rehabilitation Agency Program Evaluation and Quality Assurance (PEQA-TAC) (Grant Award Number: H263B150004). However, the contents and views expressed in this publication do not necessarily represent the positions or policies of the U.S. Department of Education, and you should not assume endorsement by the Federal government.