EVALUATION
To determine whether your recruitment and training efforts are successful, it is important to develop an evaluation plan.
This section provides an overview of the evaluation process used for the pilot project. We encourage you to use the tools and resources provided to evaluate the success of your project.
The pilot project was evaluated using a multi-layered approach that involved individual learner, agency, and overall project data collection.
INDIVIDUAL LEARNER EVALUATION
Learners were evaluated using surveys that were built into the online course, including:
INTRO SURVEY
Demographic questions
Respite experience questions
Competency-based confidence questions
PRE/POST-TEST
Scenario-based questions aligned with course objectives and core competencies (a simple scoring sketch follows the survey lists below)
POST-COURSE COMPLETION SURVEY
Competency-based confidence questions
Likelihood of providing respite care in the next 6 months
Potential barriers to providing respite care
General course evaluation (satisfaction and overall learning)
Additionally, surveys were emailed to learners 6 months after they completed the training, and to learners who had not accessed the course for more than 30 days.
6-MONTH FOLLOW-UP SURVEY
Are you currently providing respite care?
How well did the training prepare you to provide respite care?
INACTIVE SURVEY
Why did you not complete the course?
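If your course platform can export survey and test responses, a few lines of code can summarize pre/post-test changes across learners. The sketch below is a minimal example in Python; the file name learner_scores.csv and the columns learner_id, pre_score, and post_score are hypothetical placeholders, so adapt them to your platform's actual export.

    # pre_post_summary.py -- a minimal sketch; the CSV name and the
    # columns (learner_id, pre_score, post_score) are hypothetical
    import csv

    gains = []
    with open("learner_scores.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Skip learners missing either test (e.g., non-completers)
            if row["pre_score"] and row["post_score"]:
                gains.append(float(row["post_score"]) - float(row["pre_score"]))

    if gains:
        print(f"Learners with both tests: {len(gains)}")
        print(f"Average pre-to-post gain: {sum(gains) / len(gains):.1f} points")
        print(f"Learners who improved: {sum(g > 0 for g in gains)}")

The same pattern works for the competency-based confidence questions: export the intro and post-course responses, pair them by learner, and average the differences.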
AGENCY EVALUATION
Agencies worked with the evaluation consultant to identify and monitor progress toward marketing, recruitment, and training goals. Agency data were collected using worksheets provided to agencies throughout the pilot project period.
OVERALL PROJECT EVALUATION
We used the RE-AIM Framework to guide our overall project evaluation. RE-AIM encourages program planners, evaluators, readers of journal articles, funders, and policy-makers to pay more attention to essential program elements, including external validity, that can improve the sustainable adoption and implementation of effective, generalizable, evidence-based interventions. The measurement tools used for the pilot project were developed based on the RE-AIM Framework, and the guiding questions below are organized around its five dimensions: Reach, Effectiveness, Adoption, Implementation, and Maintenance.
REACH: How do I reach the targeted population with the intervention?
Consider the following:
The absolute number, proportion, and representativeness of individuals who are willing to participate in a given initiative, intervention, or program.
Are there specific target populations or underserved groups you hope to reach (e.g., rural communities, Native American communities, Spanish-speaking families)?
Set goals for the number of people you want to complete the training, be added to the registry, and work as respite providers.
What are potential barriers to participation in the training?
How can you minimize or introduce methods to address these barriers in order to enhance participation?
Were there differences in individual evaluation outcomes or course completion rates between groups? (See the sketch after this list.)
How well did your recruitment strategy work? What strategies worked best?
How much time did you spend tailoring recruitment materials for your audience?
What did you learn during recruitment? How could you have reached more people?
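As a concrete starting point for the Reach questions above, the sketch below computes the absolute number of enrollees, progress toward an enrollment goal, and completion rates by subgroup. The file name, the group and completed columns, and the goal of 100 are hypothetical; substitute your own data and targets.

    # reach_summary.py -- a minimal sketch; the CSV name, columns,
    # and enrollment goal are hypothetical placeholders
    import csv
    from collections import defaultdict

    ENROLLMENT_GOAL = 100          # set to your agency's own goal

    totals = defaultdict(int)      # enrollees per group
    completions = defaultdict(int) # completers per group

    with open("learners.csv", newline="") as f:
        for row in csv.DictReader(f):
            group = row["group"]   # e.g., rural vs. urban, language
            totals[group] += 1
            if row["completed"] == "yes":
                completions[group] += 1

    enrolled = sum(totals.values())
    print(f"Enrolled: {enrolled} ({enrolled / ENROLLMENT_GOAL:.0%} of goal)")
    for group in sorted(totals):
        rate = completions[group] / totals[group]
        print(f"  {group}: {totals[group]} enrolled, {rate:.0%} completed")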
EFFECTIVENESS: How do I know my intervention is effective?
Consider the following:
The impact of an intervention on important outcomes, including potential negative effects, quality of life, and economic outcomes.
What impact did the intervention have on program participants (e.g., improved knowledge and confidence)? A sketch for checking pre/post gains follows this list.
Are there differences in those who participated in the training program compared to those who didn’t participate or didn’t complete the training?
What impact did the intervention have on organizational outcomes? (number trained, number on registry, number providing respite care)
Were there any unintended consequences (positive or negative)?
What are some ways you know the intervention is working? What informed your decision?
If you could change one thing about the intervention right now, what would that be and why?
What surprised you about the outcomes of the intervention?
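To move beyond eyeballing average gains, a paired statistical test can indicate whether the pre-to-post improvement is larger than chance. The sketch below reuses the hypothetical learner_scores.csv from the individual learner section and applies scipy's paired t-test; an evaluator can advise whether this is the right test for your data.

    # effectiveness_test.py -- a minimal sketch using a paired t-test;
    # reuses the hypothetical learner_scores.csv described earlier
    import csv
    from scipy.stats import ttest_rel

    pre, post = [], []
    with open("learner_scores.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["pre_score"] and row["post_score"]:
                pre.append(float(row["pre_score"]))
                post.append(float(row["post_score"]))

    # Paired test: each learner serves as their own comparison
    result = ttest_rel(post, pre)
    print(f"Mean pre: {sum(pre) / len(pre):.1f}, mean post: {sum(post) / len(post):.1f}")
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")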
ADOPTION: How do I develop organizational support to deliver my intervention?
Consider the following:
The absolute number, proportion, and representativeness of settings and intervention agents (people who deliver the program) who are willing to initiate a program.
What are potential and actual barriers to implementing the recruitment and training program? How can you overcome these barriers?
How easy was it for your agency to participate in the program?
Were the materials provided easily replicated/adapted for your setting?
How well equipped did you feel to deliver the program based on the training you received? What could be improved?
IMPLEMENTATION: How do I ensure the intervention is delivered properly?
Consider the following:
At the setting level, implementation refers to the intervention agents’ fidelity to the various elements of an intervention’s protocol, including consistency of delivery as intended and the time and cost of the intervention. At the individual level, implementation refers to clients’ use of the intervention strategies.
What are actual and potential barriers/competing demands of staff who are going to be implementing the intervention?
How does this work fit into the organizational environment or individual job duties?
Did all staff/programs implement the program consistently/as intended?
What adaptations have you made? (You will need to track these throughout the project.)
Did your organization use all components of the program (recruitment and training)?
What costs were involved to implement the program (time or money)? How do these costs compare to other programs in your organization?
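A simple cost-per-outcome calculation can make these comparisons concrete. All of the figures below are hypothetical; substitute your own staff time, rates, and outcome counts.

    # cost_per_outcome.py -- a minimal sketch with hypothetical figures;
    # substitute your agency's actual hours, rates, and counts
    staff_hours = 120        # time spent on recruitment and support
    hourly_rate = 25.00      # loaded hourly cost of staff time
    other_costs = 500.00     # materials, printing, incentives, etc.
    learners_trained = 30
    providers_added = 12     # new providers added to the registry

    total_cost = staff_hours * hourly_rate + other_costs
    print(f"Total cost: ${total_cost:,.2f}")
    print(f"Cost per learner trained: ${total_cost / learners_trained:,.2f}")
    print(f"Cost per provider added: ${total_cost / providers_added:,.2f}")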
MAINTENANCE: What is the likelihood that you would continue with this intervention?
Consider the following:
The extent to which a program or policy becomes institutionalized or part of routine organizational practices and policies. Within the RE-AIM framework, maintenance also applies at the individual level, where it has been defined as the long-term effects of a program on outcomes 6 or more months after the most recent intervention contact.
What did you like best and least about the program?
What aspects would you be interested in continuing or modifying?
To what extent do you feel the intervention is integrated into the operations of your organization? What needs to happen to further integrate it?
What are the actual or potential barriers to continuing to support this program?
6 months after the pilot:
Are you still using the program?
Did you make any modifications?
Do you anticipate continuing to use the program?
Are your training and registry numbers still increasing?
Do you have a plan for sustainability?
What resources are available to support the program in the future?
If your agency is interested in taking evaluation a step further and you would like to learn more about the RE-AIM approach, visit https://re-aim.org/. We recommend engaging an evaluator or a university student to assist.
LESSONS LEARNED
The following are lessons learned from the pilot project related to evaluation:
Set realistic goals.
Agencies involved in the pilot often set unrealistic goals for the number of learners who would enroll in and complete the course as a result of their outreach efforts. This led to frustration and disappointment throughout the project. It is important to remember that it takes time to build the trusting relationships with partners that will enhance your recruitment efforts. We recommend setting modest goals and adjusting them over time.
Document outreach efforts.
Agencies involved in the pilot often neglected to document their outreach efforts in a timely fashion. As a result, they had to rely on a "best guess" of how much time they spent on outreach, which made it difficult to see the return on their investment of time. We recommend using the "Metrics Tracker and Dashboard" to keep track of outreach efforts as they happen.
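For teams that find a spreadsheet hard to keep current, even a tiny script can append outreach entries to a shared log so time is captured while it is fresh. The sketch below is a hypothetical stand-in, not the pilot's "Metrics Tracker and Dashboard"; it simply illustrates the habit of logging each activity as it happens.

    # log_outreach.py -- a minimal sketch; the file name and fields
    # are hypothetical, not the pilot's Metrics Tracker and Dashboard
    # Usage: python log_outreach.py "community fair tabling" 2.5
    import csv
    import sys
    from datetime import date

    activity, hours = sys.argv[1], float(sys.argv[2])
    with open("outreach_log.csv", "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), activity, hours])
    print(f"Logged {hours} hours: {activity}")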
Regularly assess the program with your team.
Engaging in regular assessment throughout the project will help your team better understand and address challenges that arise before they snowball into larger problems. It is also important to engage all members of your team who are involved in the project in the evaluation process. Agencies involved in the pilot that included multiple team members were able to gain insights from a variety of perspectives that helped to enhance their project.
“The most important thing that I have learned was to stay calm during any situation and to be professional. Make sure that I know what the care recipients' needs and wants. Develop a relationship with the care recipient so that they will be comfortable enough to trust me. I will have to make sure that I have all safety rules and company rules in order to keep the care recipient safe at all cost.” -Testimonial from a learner who completed the course evaluation.