In the lifespan of a project, it can be a nerve-wracking experience when the material that has been developed and honed for many months is first shared with the world. Will the audience we intend to reach respond to what we have done? Furthermore, will they find what we’ve done helpful? This is how the PREP team felt as we launched our first evaluation workshops last week. The Peel Region Evaluation Platform, or PREP, is a two-year partnership of Peel Leadership Centre and Peel Counts. It aims to increase the capacity of nonprofits in Peel Region to make effective and practical use of evaluation.

We’ve built a soon-to-be-launched website that will share evaluation resources, templates and tools. In line with our project plan, we’ve also begun to assist nonprofits with the planning and design of evaluations that are uniquely suited to their needs and capacity. And so, with two half-day workshops on evaluation, we have begun delivering the third aspect of the PREP project. Over these two days, we worked with about 30 staff from more than 20 nonprofits across Peel Region. We helped them gain a better understanding of what is meant by evaluation and how it can benefit their organizations. We also introduced tools they could use to assess their organization’s readiness to evaluate and to begin planning an evaluation.

Being true to the evaluation mindset we intend to cultivate within nonprofits in Peel Region, we’re taking time post-workshop to reflect on what happened and consider the feedback we received from our participants. We’re doing this so we can learn as we go and improve both what we deliver and how we deliver it in future learning experiences. We’re looking for impact.

Here are some of the things we noticed that will influence PREP’s offerings.

Successful organizations are learning organizations

This is something we should know from Peter Senge’s The Fifth Discipline, but it’s hard to remember in the busyness of our everyday work. Approaching evaluation with an openness to what it can teach us has many benefits. When a desire to learn from our experiences is adopted at all levels of our organizations, we create a sense of shared ownership and enhance our ability to adapt to change. With shared accountability, staff can feel empowered to develop and grow their own skills and leadership abilities in a drive towards better results for the people they serve.

Evaluation is a vital part of organizational learning, so we want to ensure that the PREP website has tools that will help organizations cultivate their own learning culture.

Assessment isn’t the same as evaluation

We learned that when we assess something, we’re measuring its effectiveness. This activity is sometimes mistakenly identified as evaluation. Assessment is something we tend to do at our organizations on a regular basis. It can take many forms, such as measuring our progress against target participant enrollment numbers or looking at our outreach strategies to discover whether they’re yielding increased engagement. We also apply assessment when we dive into our website’s analytics reports to see if visitors are interacting with our content in the way we want. Evaluation, however, takes us further into determining the value of our work. It is done methodically and can examine not only results but also how the work is done. Evaluation can also focus on different dimensions of a program, such as its relevance, its efficiency or its sustainability, among others. In short, evaluation isn’t assessment, but assessment is part of evaluation.

For PREP, this means that we have to be clear when we use terminology associated with the field of evaluation and performance monitoring. The more we can do to help distinguish amongst these terms, the better.

We informally evaluate all the time

Evaluation is in our blood! We’re constantly evaluating our experiences in life, comparing them with one another and using that information to make decisions. Consider the experience of going to a restaurant. How does our meal compare with past visits to this restaurant? How does it compare with other restaurants we’ve visited? Do we think the price we paid for the meal reflects its value? In addition to considering these questions ourselves, we will often ask our dining companions for their feedback as well. We’ve gathered data from multiple sources and used the information to help answer broader questions that matter to us, such as whether we’d rate the experience highly or would consider returning to this restaurant in the future.

For PREP, we see that evaluation needs to be a practical and useful activity – one that taps into this natural desire to learn from our experiences and helps organizations answer the questions that matter most to them.

Good evaluations use both quantitative and qualitative methods

During the workshop, a participant raised a question about the validity of qualitative evaluation. They’d heard that using only quantitative methods makes for a better evaluation because it generates numbers and facts that are unambiguous. We learned, however, that quantitative data only tells us what has happened; it isn’t able to tell us how it happened. That’s where qualitative methods can help. Focus groups, interviews and case studies are examples of qualitative evaluation methods that bring a holistic and balanced perspective to an evaluation study. Using qualitative methods helps us dig deeper into why and under what conditions a program works or doesn’t work.

Part of the work of PREP is to help build comfort with, and understanding of, data. This piece of learning reminds us to ensure that both quantitative and qualitative methods are included when we talk about the data to be gathered in an evaluation.

We’ve only hit the tip of the evaluation iceberg

In the short time we spent together during the workshops, we realized there is so much more to learn and discuss about evaluation. Each of us brought different levels of evaluation experience and knowledge, as well as different learning styles. This was clear from the feedback we received. Some participants would have preferred to spend more time on the essential concepts of evaluation. For others, there was a desire for more group exercises where they could learn to apply these concepts.

Observing this, we realize that PREP needs to share information about evaluation in ways that appeal both to those who are new to it and to those who are more advanced. We also learned that we need to offer different pathways and tools so that people can get clear on the concepts and apply what they’ve learned within their organizations.

We’re thankful to those who joined us last week at PREP’s first evaluation workshops. We’re hungry for more learning about evaluation and we hope you are too.

Until next time,

Liz

This week’s blog was written by Liz Dennis. As PLC’s Evaluation Manager, Liz brings her passion and 10 years’ experience in the nonprofit sector to PLC as the first point of contact with clients. She also works with clients to help them understand the impact of their work with PLC through evaluation and story. Connect with Liz at ldennis@peelleadershipcentre.org.


