DO YOU Evaluation
Is DO YOU Evidence-Based?
As the developers of DO YOU: Building Youth Resilience Through Creative Expression, we often get asked how DO YOU has been evaluated, and to what extent it is “evidence-based.” Let us clarify the evidence base at this stage of the program’s development.
What does “Evidence-Based Practice” Mean?
The concept of evidence-based practice (EBP) has its roots in the medical and public health fields, and can be defined in a number of ways. Generally, evidence-based practice means making decisions based on the best available research and evidence.
As public health research evolved, two equally important components of evidence were identified: practitioner expertise and participant/environmental context. The CDC lists the following as appropriate forms of evidence that should be used to inform violence prevention strategies:
• Best available research evidence comes from relevant literature and research.
• Experiential evidence comes from professionals in the field and includes professional insight, understanding, skill, and expertise.
• Contextual evidence is based on whether the strategy is useful, feasible to implement, and accepted by a particular community.
What forms of evidence were used in the development of DO YOU?
Best available research evidence:
In developing DO YOU, more than 20 sources were consulted to perform a thorough literature review of existing evidence on risk and protective factors, primary prevention strategies, and similar programs.
These sources included:
• Youth Risk Behavior Surveillance System
www.cdc.gov/HealthyYouth/yrbs/index.htm
• Youth Violence: A Report of the Surgeon General
www.ncbi.nlm.nih.gov/books/NBK44294/
• Striving to Reduce Youth Violence Everywhere (STRYVE)
www.vetoviolence.cdc.gov/apps/stryve/
• Blueprints for Healthy Youth Development
www.blueprintsprograms.com/
• Violence Prevention Education Base
www.preventviolence.info/evidencebase.aspx
• Search Institute
www.search-institute.org/
• National Centers of Excellence in Youth Violence Prevention
www.cdc.gov/violenceprevention/ace/index.html
Each of these sources played an important role in providing information about the best available evidence on violence prevention.
Experiential evidence (from professionals in the field):
DO YOU was created by an Advisory Committee composed of more than 30 experts in the fields of primary prevention, public health, and youth program engagement.
Contextual evidence (from the target audience):
DO YOU was developed by talking directly with teenagers all across Virginia, and ongoing conversations with teens, facilitators, and the Advisory Committee help determine whether this strategy remains useful, feasible, accepted, and desired. Focus groups were conducted with more than 100 14- to 16-year-olds to identify teen perceptions around dating and sexual violence and healthy relationships, and to develop practice-based campaign values and goals. Pre/post-tests, session evaluations, and focus groups were used to evaluate the overall effectiveness of DO YOU in the 2013 pilot and at ongoing evaluation sites.
So, is DO YOU Evidence-Based?
Yes, largely. However, because our data have not been collected by independent researchers, evaluated with a control group, or peer-reviewed, at this point it is most accurate to say that DO YOU is an evidence-informed primary prevention strategy.
What’s included in the DO YOU evaluation package?
The DO YOU evaluation package contains several components of a quasi-experimental design. Below you will find information about each of the tools that will be provided to facilitators, as well as some additional guidance and considerations for administering the evaluation tools.
To use with each lesson:
Attendance sheets. These are optional sheets you can use if you would like to, or need to, track attendance.
Participant feedback form. These sheets are just one way to collect information about how well the sessions are going – this is called process data. This tool is largely for facilitators to gauge how groups are responding to their delivery of the lesson so that adjustments can be made for future groups. We have provided these as an example, but there are many other ways to collect helpful process data that you can use instead.
Quick voting. Write statements on flip charts that correspond to specific activities or discussions for the lesson and ask participants to leave a dot sticker or place a check mark next to their favorite part of today’s lesson – you could use a different color dot or ask for an X next to their least favorite part of the lesson.
Check-out word or feeling. Ask participants to end the session with a word or phrase describing how they’re feeling; this will give you a sense of the lesson’s impact.
Instant polling. There are many online tools (Mentimeter, Poll Everywhere, etc.) that can be used to collect instant feedback through text voting. Anonymous answers can be submitted and then displayed back to the group to get an instant pulse of the group’s interest in or feelings about an activity.
Create your own word cloud. Design a half-sheet with a sea of words that reflect a spectrum of interest and excitement and ask participants to circle three words that describe how they felt about the lesson. This can also be done with emoji faces instead of words.
Comment cards. You can ask the same questions as on the handout but put each one on a colorful index card – this can help the task feel less like a pop quiz or homework assignment.
Activity-based assessments. If you have another teammate who can observe the group, you can identify points throughout the lesson where participants should be active and create a simple rubric to measure engagement. There is more on this approach in this toolkit from the Texas Association Against Sexual Assault.
Facilitator feedback form. These sheets are also just one way of collecting process data about how the lesson went from the perspective of the facilitator. The information can be useful to facilitators to review before implementing a new round of DO YOU lessons, and it is also useful to the Action Alliance’s goals of continually reviewing DO YOU for needed updates.
To use at the end of the program:
Participant Survey (retrospective evaluation). This tool measures participants’ skills and attitudes before and after the lessons. This type of evaluation is a relatively standard way to collect information about program outcomes. Participants are asked to respond to 34 questions that cover all of the learning objectives of the 10 lessons. Since DO YOU weaves concepts throughout the curriculum, outcomes are measured at the end of the program instead of after each lesson. Please share this data with the Action Alliance, as it helps us to make continuous improvements to the program.
Storing data:
We encourage facilitators to use a data storage tool for all process and outcome evaluation data from each round of DO YOU lesson implementation. Keeping paper forms from every participant takes up a lot of space, and people are less likely to review results this way. Taking some time right after the lessons to enter the data into an electronic tool makes it much easier to review and compare results later.
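For facilitators who are comfortable with a small amount of scripting, the sketch below shows one possible way to keep session-level process data in a single running file instead of a stack of paper forms. This is only an illustration and not part of the DO YOU materials; the file name, columns, and rating scale are assumptions you would adapt to whatever storage tool you actually use (a shared spreadsheet works just as well).

import csv
import os
from datetime import date

# Hypothetical file name and columns; adapt these to your own setting.
PATH = "do_you_process_data.csv"
COLUMNS = ["date", "lesson", "facilitator", "average_rating", "notes"]

def log_session_feedback(lesson, facilitator, average_rating, notes):
    """Append one row of session-level process data to a running CSV file."""
    is_new_file = not os.path.exists(PATH)
    with open(PATH, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new_file:
            writer.writerow(COLUMNS)  # write the header row the first time
        writer.writerow([date.today().isoformat(), lesson, facilitator, average_rating, notes])

# Example: recorded right after delivering a lesson.
log_session_feedback("Lesson 3", "J. Smith", 4.2, "Group most engaged during quick voting.")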
Optional on-going evaluation:
Not all facilitators will have access to participants from a DO YOU program to do a longer-term evaluation survey, but for those who do, this can be an interesting and informative way to collect information about retention of the key concepts from the program. We have provided a post-post survey in the evaluation materials for you to use. This tool can be administered anywhere from three to six months after the program. For those who continue working with participants through a DO SOMETHING initiative, this survey will be easier to administer.
Additional considerations:
Many of the evaluation tools have a space at the top for a Participant ID. This is useful because it allows for more robust analysis of evaluation data and makes it possible to look for indicators that may give context to results. In the past these tools asked for participants’ first and last names, but in an effort to provide participants with more confidentiality and collect only the information we really need, we are suggesting alternative ways of matching data. Possible Participant IDs that are less identifying include short birth dates (ex: 417 for April 17th), the last four digits of a phone number, or a locker number. While all of these IDs contain a personal connection, someone who comes across the data would be unlikely to associate them with particular participants. You can let participants know the number is just there so all of their forms can be looked at together, and won’t be associated with them specifically.
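If evaluation data are kept electronically, the Participant ID also makes it straightforward to match each participant’s forms across tools, for example joining the end-of-program Participant Survey with the optional follow-up survey. The sketch below is only an illustration and not part of the DO YOU materials; the file names and column name are hypothetical placeholders.

import pandas as pd

# Each row is one participant's responses; "participant_id" holds the short,
# low-identifying ID described above (ex: 417 for April 17th).
end_of_program = pd.read_csv("participant_survey.csv")   # retrospective pre/post survey
follow_up = pd.read_csv("follow_up_survey.csv")          # optional 3-6 month post-post survey

# Keep only participants who appear in both files so comparisons are like-for-like.
matched = end_of_program.merge(
    follow_up,
    on="participant_id",
    how="inner",
    suffixes=("_program_end", "_follow_up"),
)

print(f"Matched {len(matched)} participants across both surveys.")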