With the ten-year countdown to the Sustainable Development Goals underway, systems to evaluate the impact of policies and monitor their progress are more important than ever. To sustain the momentum of global efforts to promote monitoring and evaluation capacity, the CLEAR Initiative will convene the second annual gLOCAL Evaluation Week from June 1st to 5th of this year. To take part in this year’s gLOCAL Evaluation Week, simply register and submit your proposal. Applications close 6 March 2020.
Instituted in 2017, the Directorate of Monitoring, Evaluation and Inspection (DMEI) in the Office of the Prime Minister (OPM) is mandated to monitor and evaluate government policies, programmes and projects across Ministries, Departments and other public institutions. This mandate is fulfilled by generating evidence (e.g. data) on government interventions, good practices, challenges and lessons learnt, and disseminating these to the relevant stakeholders.
However, to fully meet the objectives of DMEI, a strategic communication function needed to be formulated. DMEI therefore decided to form a dedicated team of specialists in charge of reporting and disseminating monitoring, evaluation and inspection results. The design of a country-level Communications Strategy was supported by Twende Mbele, a multi-country peer-learning initiative that aims to strengthen the use of monitoring and evaluation systems, processes and results to improve the performance and accountability of African governments.
A journey towards the communication strategy
Under the DMEI, the Evaluation Communications Team (ECT) has been working with Twende Mbele to discern different stakeholder needs and strategies for engagement, so as to best build this results orientation. The first result of this work has been the development of a Communication Strategy that outlines:
- objectives of communication (broadly for the Directorate and for each campaign),
- target audience and key messages,
- channels of communication for effective delivery of the intended messages to each given audience,
- modes of communication,
- personnel responsible for implementing the communications strategy.
Further, we want to work with other government Ministries, Departments and Agencies and the public to:
- increase awareness of the evaluation process, the results and the actions being taken,
- gain support for government changes resulting from the evaluation,
- facilitate stakeholder input into the evaluation process.
By achieving our objectives through the Communications Strategy, we will strengthen DMEI’s work in guiding policy formulation and implementation, improving service delivery, and supporting appropriate resource allocation.
Finding the Audience
The focus of communication actions depends on the nature of the targeted audience and their influence in bringing about the desired changes. The OPM has a wide range of audiences with which to communicate information and results, including citizens, policy makers, legislators, institutions of learning, researchers, civil society organisations and other partners. Messaging and communication channels/formats for the different types of stakeholders are tailored to audience needs, based on their orientation, perception and influence towards government business.
Communications functions are integral to improving engagement with non-traditional actors, for example by reaching out to CSOs to review the National Public Policy on M&E and ascertain its responsiveness to the equity and capacity needs of non-government actors. Other, more participatory approaches are already underway, such as Barazas, community engagement fora that have been found to be effective in generating feedback for improving service delivery.
In the same vein, the audience is the basis for decisions on the communication channels and tools used to transmit the intended messages. Again, the diversity of stakeholders requires the ECT to be flexible and adaptable to differing needs and to take a learning approach to communications.
The ECT takes the lead in the evaluation communication process and, at the end of the year, will embark on evaluating the communication approaches used. Evaluation of the communication strategy is done retrospectively, by reflecting on the objectives that were set during the design of the strategy and measuring performance against them. For instance, the OPM will assess changes in policy formulation, service delivery, political support and resource allocation, among others, against the initial communication objectives.
Lessons for future success
It is essential that the initiators of an effective communication strategy are well capacitated to carry communication efforts further. Additionally, building the capacity of multiple stakeholder groups (particularly by working with the media on evaluation) to strengthen their understanding and skills will also be required. It has been observed that limited participation in capacity-building activities has resulted in weak ownership of some interventions.
For more information on the work of the DMEI or communications in the OPM M&E, please contact:
- Acting Director M&E: Mr. Timothy Lubanga
- Acting Assistant Commissioner M&E: Mr. Abdul Muwanika
- Information Scientist M&E: Mr. Joseph Muserero
- Twende Mbele National Coordinator M&E: Ms. Doris Kembabazi
- Information Systems Officer M&E: Ms. Florence Mbabazi
The need to understand Evaluation use by NGOs
This blog draws from research conducted by Elizabeth Asiimwe, a Twende Mbele PhD Fellow at the Mbarara University of Science and Technology, Uganda, in 2017. The study is premised on the assertion that, although there is a growing discourse on evaluation utilisation for evidence-based policy making in the public sector, very few studies have focused on evaluation utilisation in the NGO sector in Uganda. This is despite considerable investments by funding agencies such as the Department for International Development (DFID) and the United States Agency for International Development (USAID) in Uganda. More information on the USAID evaluation policy can be found at: https://www.usaid.gov/evaluation, and for DFID, in the Uganda Operational Plan 2011-2016 https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/389293/Uganda.pdf.
In essence, NGOs carry out important social change work that should inform policy making. In this regard, continuous improvement of NGOs’ work through monitoring and evaluation use is fundamental for enhancing evidence-based policy making – provided the system is set up to receive this information. However, in Uganda there is a large gap in studies that explore the utilisation of evaluation results. Asiimwe’s study aimed to contribute towards addressing this empirical gap.
Focus of the Study
The study assessed evaluation utilisation among non-governmental organisations in Uganda, aiming to generate evidence to guide policy makers, project planners, evaluation planners, commissioners and implementers on the determinants of evaluation utilisation. A mixed-methods approach was used to collect data from a random sample of 404 respondents across 101 NGOs, using an online tool (SurveyCTO), focus group discussions and key informant interviews. The study adopted a conceptual framework developed by Fleischer and Christie (2009) http://www.managingforimpact.org/sites/default/files/resource/evaluation_use_results_from_a_survey.pdf. The framework identifies three major factors that influence evaluation utilisation: evaluation implementation characteristics, organisational characteristics and the evaluator’s characteristics.
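The study’s sampling procedure is not described in detail in this summary. As a minimal sketch of how a simple random sample of respondents might be drawn across organisations (the NGO names, roster sizes and seed below are placeholders, not data from the study):

```python
import random

# Hypothetical sketch of drawing a simple random sample of respondents
# across NGOs. The NGO names and staff rosters are placeholders, not
# data from Asiimwe's study.
random.seed(42)  # reproducible draw

# 101 NGOs, each with a placeholder roster of 10 staff
ngos = {f"NGO_{i}": [f"NGO_{i}_staff_{j}" for j in range(10)]
        for i in range(101)}

# Pool all potential respondents, then sample 404 without replacement
all_staff = [person for roster in ngos.values() for person in roster]
sample = random.sample(all_staff, k=404)

print(len(sample))       # 404
print(len(set(sample)))  # 404 (sampling without replacement yields unique picks)
```

In practice a study like this might instead stratify by NGO type (national, community-based, international) so each stratum is represented proportionally; the simple random draw above is only the most basic design.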
What were the key research findings?
The respondents were from national, community-based and international NGOs working in the areas of health, governance, agriculture, water & sanitation, education and child support. National and international NGOs reported having a monitoring and evaluation department, M&E staff and an M&E strategy, and had conducted at least one evaluation in the past five years. The study established that 80% of the respondents had conducted an evaluation in the last five years (Figure 1) and, of these, 68% reported that they had utilised evaluation findings (Figure 2). Evaluation implementation factors that strongly determined evaluation utilisation were the relevance of the particular evaluation to the organisation’s current strategy, and the credibility and quality of the evaluation. Although the timeliness and quality of communicating evaluation results were seen as determinants, they were not reported as very crucial. Organisational factors identified included the organisation’s receptiveness to new ways of doing things, the need for evaluation information to support decision-making, the organisation’s trust in the results presented from a particular evaluation, and the perceived implications of the results for the continuity of projects/programmes. Evaluators’ characteristics included the evaluators’ experience and methodological expertise. The evaluators’ relationship or communication with staff did not have a strong influence on whether their evaluation results were utilised.
Figure 1: Proportion of organisations that have done evaluations
Figure 2: Use and non-use of evaluation findings
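As a rough illustration of what the reported proportions imply in headcounts (a hypothetical back-of-the-envelope sketch; it assumes, which the summary does not state explicitly, that the percentages apply to the full sample of 404 respondents):

```python
# Illustrative arithmetic for the reported survey proportions.
# Assumption (not from the study's dataset): the 80% and 68% figures
# apply to the full sample of 404 respondents.

total_respondents = 404
conducted = round(0.80 * total_respondents)  # ran >= 1 evaluation in 5 years
utilised = round(0.68 * conducted)           # of those, reported using findings

print(conducted)  # 323
print(utilised)   # 220
```

So roughly 323 respondents reported a recent evaluation, and about 220 of them reported actually using its findings, which is the gap between doing evaluations and using them that the study set out to explain.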
A proposed way forward
The study clearly illustrated that evaluation use is determined by an interplay of different factors. Not all factors that influence evaluation utilisation elsewhere (including the interest of other stakeholders in the evaluations, the relationship of staff with the evaluators, and the evaluator’s communication skills) actually determine it among non-governmental organisations in Uganda. It is therefore worthwhile for commissioners and/or evaluators among NGOs in Uganda to keenly consider those factors that do influence utilisation as they plan and execute evaluations. It is imperative that intended users are considered and are receptive to evaluation findings for utilisation to happen. Importantly, planners, donors and policy-makers ought to mainstream evaluations for policy and programming and to provide sufficient budgets for evaluation, to allow the generation of evidence that supports decision-making as well as informing programming in general. A full version of the study will be available on the Twende Mbele website soon.
By Doris Kembabazi
There is increasing pressure on policy makers to develop more effective policies that direct and manage resources in more focused and efficient ways, resulting in improved implementation and outcomes. Evidence-based policy-making is an approach that has become increasingly prevalent in Africa in recent years. It is based on the premise that better policies and better decisions result when they are based on sound empirical evidence and solid rational analysis. It is also critical to use evidence to improve implementation. Evidence-Based Policy-Making and Implementation (EBPM&I) therefore focuses on establishing rigorously objective evidence as a key input to policy, but also as a means of improving the implementation of public services. Evidence-based policy-making and implementation plays an important role in Uganda, especially in resource-constrained settings where informed decisions on resource allocation are paramount. Several knowledge translation models have been developed, but few have been applied to health policymaking in low-income countries like Uganda, which poses a significant challenge to policy makers.
The desire to use evaluation findings and the importance of credible evidence in decision-making is emphasised in Uganda’s National M&E Policy, the National Development Plan (NDP) and international frameworks, such as the Sustainable Development Goals (SDGs). The rationale for the use of evidence is exemplified by better Government decisions and effectiveness in their implementation; rational decisions in resource allocation in the choice of policies and programmes to implement; and provision of feedback to influence future policies and programmes.
As part of the Twende Mbele African Partnership Programme, the Evidence-Based Policy-Making and Implementation course in Uganda was an executive course for strategic leaders and top managers in the public service. It was adapted from a University of Cape Town (UCT) course designed to assist participants to use evidence to make well-informed decisions about policies, programmes, projects and services, and to improve government’s impact on society. This was the first time the course was run outside South Africa, and it was adapted and piloted in collaboration with UCT.
The Jinja course was officially opened by the Minister for General Duties, Honourable Mary Karooro Okurut, who gave opening remarks on behalf of the Government of Uganda. She acknowledged the benefits of the Twende Mbele initiative and the efforts that had been put in place. She noted that, in Uganda, evaluations are beginning to be viewed in a positive light, with growing demand and ownership by Ministries, Departments and Agencies (MDAs) and other stakeholders outside the public sector. In this regard, a number of evaluations of strategic areas have been undertaken successfully over the last decade.
The course achieved broad coverage: 35 Directors of the Ugandan Government attended, more than 80% of the full cohort of Directors. It was delivered over two days by facilitators from the Department of Planning, Monitoring and Evaluation (DPME), South Africa, Makerere University, and the Office of the Prime Minister, Uganda.
On the first day, participants were introduced to the EBPM&I approach and cycle, and to problem-diagnosis tools. Day two built on this by inspiring participants with case studies of evidence processes and then exposing them to the Theory of Change and evaluation tools, which can be drawn on in subsequent stages of the EBPM&I cycle.
This course was the first of its kind to bring together directors from MDAs, and they made a number of suggestions. First, they all appreciated the opportunity to have one voice in future policy-making and implementation. Another suggestion was to create a platform for continuing to share and learn from each other on a day-to-day basis. The participants also decided to work with the Office of the Prime Minister to strengthen M&E systems (for example, human resources) and to write a briefing paper specifically on practical strategies to institutionalise the use of evidence.
This course will be translated into French and run in Benin in the second half of 2018.
By Aisha Ali
Setting the broader context: The need for institutional reform
Institutional reforms have become the focus of many development programmes in developing countries. On the whole, the portfolio of World Bank funds dedicated to large-scale institutional reform programmes has grown by up to 80% since the 1990s. This is mainly because of the realisation within the development community, and in particular among multilateral and bilateral development agencies, that in developing countries institutions are often weak and poorly governed, and cannot support the success of investment and policy programmes in realising sustainable development outcomes. In some cases, developing countries have taken the lead in driving reforms as home-grown remedies for the troubles associated with weak institutions. An example is Rwanda’s decentralisation process, which sought to reform the hierarchical system of authority that characterised public-sector governance; it was largely driven by the Rwandan government with support from international development partners.