Measure Your Success: so learning can begin #2

IFVP 2013 Conference, New York
– The International Forum for Visual Practitioners’ 2013 conference

Preparation for New York has begun in earnest! As the days tick by, I have been drawing together my presentation material for the International Forum for Visual Practitioners later this month.

The topic is ‘did we hit our target?’ and the session is aimed at those who design and facilitate workshops and events. It will be an opportunity to discuss evaluation methods in that context. This is an important topic: I think the ability to measure our impact as facilitators becomes increasingly critical in an environment where project dollars remain tight.

A lot of thinking has gone into evaluation methods for programs and projects generally, to help people track outcomes and report on their successes to funders and sponsors.

In my experience, however, the types of ‘evaluation’ we do as facilitators are often more superficial. In closing a workshop, we often ask the stalwart question: ‘how well do you think we did in meeting our workshop’s objectives?’ Participant responses generally provide a summary account of outcomes. I understand this is appropriate in many cases, as the resources and energy invested in gauging the outcomes of a workshop should be in line with the overall investment in the event. By comparison, medium to large-scale programs that span months or years may involve a total investment tenfold that of a workshop, and so require more structured and probing evaluations.

That said, I think we have room to improve the standards of our workshop evaluation. Before I launch into my ideas, I want to acknowledge Dr Jess Dart of Clear Horizon. Through her training courses, and through my work alongside her team as a co-facilitator on evaluation projects, Jess taught me the basics of program evaluation theory and practice. She is an evaluation guru in this country, as well as a talented facilitator and business leader.

Logic model metaphor
– A workshop is an intervention. Like a pebble in a pond, it will result in ripples.

Here’s the first key point. We need to see workshops for what they are:

INTERVENTIONS.

If we do that, then it makes sense to spend time being clear about the short-, medium- and longer-term outcomes we expect to flow from the workshop. If the pebble in the pond is the workshop, then we need to identify the ripples (the outcomes): What are they? How big do we expect them to be? And where do they go?

How does this thinking affect our practice?

I see three phases in which facilitators can translate the pebble-in-a-pond analogy into a clear framework for evaluating outcomes. They are:

  1. develop a clear STATEMENT of OUTCOMES at the commissioning and design stages of the workshop;
  2. with the client, develop a shared understanding of HOW the workshop will DELIVER the expected OUTCOMES; and
  3. design a process to check the EXPECTED against the ACTUAL OUTCOMES.

A critical tool for working through these phases is the logic model – a depiction of how the client and participants see change occurring as a result of a program or project. As it applies to workshop evaluation frameworks, I call it ‘Logic Model lite’, as it is a simpler beast than one developed for a large-scale program.

In my next post, I’ll provide an example of a logic model ‘lite’ for a workshop and show how to develop the evaluation questions that you will need to measure your event’s success.

Need help to get your CREATIVE on?

Curious Minds Co. is a consultancy firm passionate about helping people and organisations connect with their natural CREATIVITY and achieve their business and life goals.

You can contact me through [email protected]