Is traditional social science the only way to learn about a nonprofit’s program effectiveness? No. Many nonprofits conduct activities and programs that are subtle and complex, deserving an approach to evaluation that honors that complexity. We suggest a mock “evaluation court” has much to offer.
When Minnesota’s Prevention Resource Center asked if we could evaluate its approach to “chemical dependency prevention and chemical health promotion,” then operating in the rural town of Monticello, some 40 miles west of Minneapolis, we proposed conducting an Evaluation Court.
We learned about “Science Court” from the National Science Foundation, which was testing this novel approach to studying the effectiveness of traditional classroom learning. Science Court was a way of submitting a good body of evidence to the judgment of a reputable panel of people equipped to make judgments. Proof of program effectiveness beyond a reasonable doubt? Let a jury decide. We were excited about the potential of such an approach as a way to bring forward evidence that compels informed judgment and educates the public.
We turned Science Court into Evaluation Court, and adapted some of the key features to be workable in the context of Monticello and the challenge of substance abuse prevention issues. Some of the major features of an evaluation court, at least the way we conducted it, are:
- It permits collecting a variety of evidence from a variety of sources, engaging respected members of the community. The collection would include testimony by experts, surveys of parents, school records – evidence with human appeal that’s not easily “rolled up” into a single score. This multi-method approach recognizes there are no simple answers to the question of effectiveness in areas like substance abuse prevention. It also relies on a collection of human brains to draw conclusions based on a critical examination of the collected evidence – like in a courtroom.
- Rather than conducting an adversarial, high-stakes courtroom battle of winners and losers (remember, this is a small town where everyone plays several noticeable roles and knows everyone else), our community partners tweaked the design, making it more of a “supportive inquiry.” An Evaluation Team helped them collect meaningful data, which they presented to their neighbors and colleagues in the community. In fact, the event packed the High School gym – on a Monday night!
- It provided the Monticello community an opportunity to better understand the policies and activities being conducted in their community to address the challenge of drug abuse prevention. The result was a community-wide examination of the evidence, and a more community-wide grounding in the issues of drug prevention.
- This was a very low-budget evaluation activity, and you get what you pay for. But it turns out you can get quite a lot on such a budget, especially when community members do so much of the labor-intensive work.
- Funders get a good return on this kind of investment in evaluation: community engagement, and the resulting knowledge and resolutions that come from it. This kind of evaluation inquiry has its roots in the US Department of Agriculture’s Depression-era experimentation in the field: test a new approach to growing corn, and invite the neighbors over to take a look.
We’re still very keen on this approach to judging effectiveness where “effectiveness” isn’t easily defined or demonstrated – a widespread condition! It’s also highly adaptable to local circumstances. In a free downloadable report, you can read our critique of how well the process worked, along with guidelines for others interested in this approach.
Download A Community Forum to Evaluate Community-Based Prevention Efforts: The Monticello Experience (pdf), Steven E. Mayer, Ph.D. and Steven Gray, Rainbow Research, 1983.
Posted by Steven E. Mayer, Ph.D / Effective Communities Project / April 2, 2021