Measuring Change: Exploring a Collegial Way to Share and Learn
by Birgitte Jallov
If we're truly going to understand development and how to evaluate it, we must engage in a dialogue that raises questions and identifies and clarifies values, beliefs, assumptions and forms of knowledge. So argued the Consortium's Ailish Byrne in her article in Mazi 13, November 2007.
And just such a conversation among media practitioners is what emerged from the 3rd Symposium Forum Media and Development: "Measuring Change: Planning, Monitoring, Evaluation in Media Development." The Catholic Media Council, or CAMECO, hosted the event on behalf of the German-language "Forum Medien und Entwicklung." It took place in Bad Honnef, near Bonn, in Germany, on September 27-28, 2007.
The symposium dealt with core questions: "How do we effectively promote a media system and an environment that foster democracy and contribute to overall development goals?" and "How do we achieve sustainable change?"
The symposium included a section called "Setting the Framework," which charted the challenges and options in monitoring and evaluation, in working with and defining indicators of media development, and in deciding whether to develop a handbook. Five concrete cases followed under "Concepts and Tools," covering such methods as outcome mapping, most significant change, and grassroots evaluations. The symposium also considered journalism training.
In the "Changing the Perspective" section, the symposium divided the 70 participants into three workshops. Each workshop discussed one of the following: lessons learned and recommendations to donors and implementing organisations; how to measure impact of journalism training; and critical assessment and development of ideas for the practitioners' handbook for media impact.
Below is a summary of the outcomes of the three workshops. The focus will be on the final one, which resulted in plans to develop a Wikipedia article on "Monitoring and Evaluation for Media and Development Communication," its working title.
1. Measuring Change: Lessons Learned
The "lessons learned" group presented their findings in two sections: general observations and a set of concrete proposals:
- Recognise that the complexity of the context of media assistance efforts requires a diverse toolkit of means and methods in programmes and projects, and their monitoring and evaluation;
- Encourage transparency among donors and implementers at all levels; this transparency can produce more open communications;
- Encourage efforts to gather evidence-based arguments that can clearly and strongly make the case that media assistance promotes democracy and development;
- Emphasise the importance of research and communications in all media assistance projects and urge firm commitments to this, from planning through evaluation;
- Focus on knowledge and capacity building and urge the fullest dissemination of useful tools and learning;
- Make visible the results of media assistance efforts and the benefits of cooperation to donor societies as well as recipients; and
- Encourage the creation of a "tool-kit" approach in developing a practitioner handbook on media assistance impact monitoring and evaluation.
- Create a media monitoring and evaluation expert working group that will carry forward conference discussions and promote discussion of the points above and other issues;
- Create a media assistance coordination group to encourage broader knowledge of ongoing efforts and to avoid duplication.
These two groups could be structured with open membership and use a virtual forum to exchange information; a wiki format should be considered for working documents. Both groups would best be launched with a clear, brief note introducing their purpose, and their discussions should be moderated, reviewed and periodically synthesised for dissemination.
2. Impact of Journalism Training
The second workshop had two objectives:
- Getting an overview of evaluations done for journalism training and identifying their objectives, levels and methods; and
- Finding reasons for the present status of evaluation in journalism training.
Results of the stocktaking exercise
- The workshop participants named various examples of evaluations from their professional backgrounds. These examples were clustered according to objectives of the training and levels of evaluation.
- Most current evaluations of journalism training are conducted mainly at the output level, i.e., what the participants have learned in the course, but they hardly ever address what larger effects the training has achieved.
- The usual method is a questionnaire administered before and immediately after the course, an approach that measures two different things: pre-course expectations of the training and perceived success of the training.
- A somewhat different approach applies to the set-up of training institutions; here, the sustainability and viability of the institution come into focus.
- The workshop participants could not establish whether these findings can be generalised. It can only be assumed that the media sector has not yet reached the outcome or impact level in evaluating its programmes. This assumption was consistent with the general impression at the conference.
Participants identified several factors hindering better evaluations of journalism training:
- Outcome-level tools and indicators are still missing;
- Tight budgets and deadlines set by donors;
- Lack of cooperation between researchers, implementers, other experts and beneficiaries;
- Suspicion that a "learning culture" is not widespread in the media assistance sector (as in development cooperation generally): everyone likes the image of being successful, but only a few will concede failures from which to learn; and
- Some misgivings about the results of evaluation.
However, there are also factors favouring more in-depth evaluations:
- They enable self-evaluation;
- A genuine interest exists among project implementers to know about successes, failures and sustainability; and
- Some confidence exists in the sector that evaluation results will be positive.
What to do? Due to time constraints, the recommendations could not be discussed as intensively as the other topics. A few ideas nevertheless emerged from the workshop discussions:
- There is a need to incorporate in our organisations a culture of "freedom to fail";
- Self-assessment tools should be elaborated;
- Organisations have to integrate care for follow-up into their structures;
- Additional evaluation steps are needed, for example checking whether the right people are being targeted in a way that will later achieve outcome and impact; and
- Evaluation needs to be integrated from the very beginning of projects.
3. Ideas for a Handbook on Media Impact Turn Into an Interactive Wikipedia Project
A Wikipedia project emerged from the workshop discussion, which had started with the idea of creating a handbook for practitioners on media impact in development. The idea is to build a network of persons and institutions interested in sharing methods and experiences in the field of monitoring and evaluation, and to start with a wiki as the content platform. A group of people to carry this work forward is emerging as Mazi 14 goes into production. The organisers write in an update:
The Wikipedia article will have four sections:
- Monitoring and evaluation in media development (institutional/project level);
- M&E in media development (macro level);
- M&E in the field of training; and
- M&E in communication for development.
WIKI project: M&E for Media and Development Communication
The Wikipedia article will be discussed further at a workshop at the Conference of the Global Media Forum, in Bonn, June 2-4, 2008, where the "Forum Medien und Entwicklung" will present and invite comments and participation in a project on media assistance monitoring and evaluation, tentatively titled Monitoring and Evaluation for Media and Development Communication.
The goal is to create a Web-based resource to help gather experiences and ideas that can be turned into practical toolkits for monitoring and evaluating journalism training, media performance, and development communications.
The project also aims to:
- Share information on M&E programmes and projects to encourage greater cooperation in this field;
- Share experiences in M&E for media initiatives in conflict prevention, for urgent interventions in conflict areas, and in post-conflict situations; and
- Pay particular attention to, and build knowledge about, early-warning media monitoring systems useful to organisations working in conflict areas.
Already, an array of foundations and NGOs has expressed interest in this initiative.
Initial coordination will be handled by the host of the Bad Honnef conference, the Catholic Media Council (CAMECO).
The organisers envision this as a strongly participatory process that will not be prescriptive. The process could well provide consensual frameworks that many organisations can use in media assistance monitoring and evaluation.
*A symposium publication is in preparation. It will include all presentations and the key discussions. For information and copies, contact sofie.jannusch@CAMECO.ORG. See also: http://www.dgroups.org/groups/MEMDeC, where useful CAMECO publications can be found in the resources section.