How to Isolate Your Program’s Unique Impact

Social change is complex, and many moving factors influence any specific client outcome. Even when rigorous program evaluation methods, including a control group, are in place, questions may linger about whether it was the program or some other factor that ultimately made the difference or influenced the desired change. For example, if a school-based program can demonstrate that its participants had increased school attendance and higher graduation rates, it is still unclear whether these improvements were due to the unique impact of the program itself or to other activities at the school, home, or community level.

A student who is at risk of dropping out of school may receive multiple interventions intended to keep him in school and on track to graduate from high school. His parents may intervene by spending more time with him on homework and school projects. He may also be involved in an afterschool program that pairs him with a mentor. Or he may have had previous involvement with the court and have an assigned probation officer. Finally, the school may be providing extra help. Given all these interventions in the student's life, how can the school know for certain that it was its program that impacted the student's attendance and graduation outcomes?

Unlike medical trials, where doctors can control the single pill or dosage an individual receives in order to evaluate the effect of a health intervention, social program evaluation makes it nearly impossible, and often undesirable, to isolate the treatment group and ignore other interventions. After all, social programs are designed to work in the real world, not in a laboratory. There is, however, a relatively easy and practical way to attempt to isolate a program or training effect: conducting a secondary evaluation.

Once program staff know the difference in outcomes achieved by the control group and the treatment group, they can survey key stakeholders to produce a score that helps isolate the treatment effect. For the school program highlighted above, key stakeholders may be teachers, parents, and other individuals involved with the students in the program. Each stakeholder would be administered a short questionnaire containing these three questions:

  • What percent of the student’s improvement can be attributed to the application of skills, techniques, and knowledge gained through this school program?
  • What confidence do you have in this estimate, expressed as a percentage?
  • What other factors contributed to this student’s improvement in performance?

To examine how these responses shape the interpretation of the data, let us assume that once the surveys are collected from all stakeholders involved with the student, the average answer to the first question, the percent of the student's improvement attributed to the application of skills learned in the program, is 50 percent. We will also assume that the average answer to the second question, the rater's confidence in that estimate, is 70 percent, meaning that stakeholders are 70 percent confident that the program contributed to about half of the student's success.

Other factors such as parental involvement, court intervention, or church activities may be listed in response to the third question as contributors to the remaining improvement. The confidence percentage (70 percent) is then multiplied by the attribution estimate (50 percent) to produce a usable intervention factor of 35 percent. This factor is then multiplied by the actual intervention results to isolate the effects of the program.
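For readers who like to see the arithmetic spelled out, here is a minimal sketch of that calculation. The variable names are purely illustrative, and the values are the hypothetical survey averages from this example:

```python
# Sketch of the usable intervention factor described above.
# Values are the hypothetical averages from the example survey.
attribution_estimate = 0.50  # average answer to question 1
confidence = 0.70            # average answer to question 2

intervention_factor = attribution_estimate * confidence
print(f"Usable intervention factor: {intervention_factor:.0%}")  # 35%
```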

Suppose this school program learned that only 55 percent of at-risk students who did not participate in the program graduated, compared with 85 percent of those who did. The comparison would indicate that the program increased graduation rates by 30 percentage points. To allow for the other outside factors that influenced these participants' success, the intervention factor can be used to further adjust this number.

To arrive at the isolated training effect, program leaders would multiply the 0.35 intervention factor by the 30-percentage-point gain. They could then say that they believe the program accounts for at least a 10.5-percentage-point gain in graduation rates. The remaining 19.5 percentage points reflect the combined influence of the school's program and the other factors listed.
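Putting the pieces together, a short sketch of the full isolation calculation, again using only the illustrative figures from this post, might look like this:

```python
# Sketch: isolate the program's share of the observed graduation gain,
# using the illustrative numbers from this post.
treatment_grad_rate = 0.85   # graduation rate of program participants
control_grad_rate = 0.55     # graduation rate of non-participants
intervention_factor = 0.35   # 50% attribution x 70% confidence

observed_gain = treatment_grad_rate - control_grad_rate  # 30 percentage points
isolated_gain = observed_gain * intervention_factor      # 10.5 percentage points

print(f"Observed gain: {observed_gain:.1%}")
print(f"Program's isolated gain: {isolated_gain:.1%}")
```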

Isolating the training effects is a great way for organizations to demonstrate to funders that they recognize they do not work in a vacuum, yet are still interested in the unique impact their organization contributes to participants' success. It also allows an organization to understand how stakeholders perceive the program. If participants experience the desired outcomes, but stakeholders report low confidence that it was the program that contributed to that success, then leaders know there are still things to work on. In addition, the insight provided about the other factors that contributed to participant success can help organizations understand which groups or organizations they should partner with to provide an even stronger social return on investment.

If you are looking to evaluate your programs and isolate their unique impact, Measurement Resources is here to help! We will help you design great measures, develop an evaluation plan that leads to reliable results, analyze your survey results, and isolate your program's unique impact. Our favorite part is celebrating our clients' success and their increased impact on the world! We'd love to help you make data-driven decisions with confidence. Contact us today for your free 20-minute strategy session.

Want more information on how to increase funding, morale, positive press, and organizational impact? Join the Achieving Excellence Community and receive our free eBook, Ten Tips to Open the Door to More Grants (and Other Funding): Overcoming Common Mistakes in Outcomes Measurement.

 

Sheri Chaney Jones

Measurement Resources Company

June 2014
