Monitoring and evaluation of results in the attention of the public

Anonymous

The growing demand for efficiency in customer service rests fundamentally on the awareness that it is not enough to produce “products”. Efficient, well-managed projects and their products would lose their relevance if they did not lead to improvements in processes and, ultimately, in service to the public.

For this reason, the Promotion Area is increasingly focused on results and on the best ways to contribute to achieving them.

To support this shift in strategic orientation towards results, a stronger and more coherent monitoring and evaluation framework is required, one that promotes learning and performance measurement. The framework should be simple and accessible enough for all staff to use it flexibly to improve the effectiveness of this area. The monitoring and evaluation framework outlined in this proposal is therefore geared towards three equally important objectives:

  1. Align the monitoring and evaluation system with results-based management.
  2. Encourage knowledge and learning about outcome evaluations.
  3. Simplify policies and procedures.

Several elements of this framework require a change of mindset and approach by staff, emphasizing coherent and long-term planning around results, building alliances for change in development processes, and fostering knowledge, learning, and the use of evidence provided by evaluations.

Therefore, some feedback will be necessary. While we fully expect to learn from this new framework and update it as it evolves, it is important to underline that its introduction represents an important step for this Promotion Area. The instruments and policies described here are intended to promote the use of evidence provided by evaluations so that management decisions and future programming incorporate lessons learned.

Furthermore, they are designed to help best serve the public to meet the challenge of choosing relevant outcomes, verifying the accuracy of our current processes, and demonstrating how and why change occurs where it matters most, that is, in the improvement of efficiency, effectiveness and quality towards the public.

Among the objectives of this proposal are:

  • Strengthen the results-oriented monitoring and evaluation function and capacity of the Promotion Area, in order to improve policies and programs, collective learning, accountability, organization and discipline.
  • Introduce simplified, agile and flexible approaches and instruments to track the achievement of outcomes, consistent with the organization's continuous-improvement initiative and based on the expertise of the area and its employees.
  • Present methods that link the products of projects, programs, policies, alliances and assistance with the achievement of outcomes, in the context of the Strategic Results Framework (SRF).
  • Explain new results-oriented monitoring and evaluation innovations and methodologies.
  • Provide practical guidance to field offices in monitoring and analyzing performance.
  • Improve collective learning in service and follow-up.
  • Ensure decision-making based on the information obtained.
  • Support the accountability of each of the people involved in the area.
  • Strengthen capacity in each of these areas and in monitoring and evaluation functions in general.

Monitoring and evaluation help improve performance and achieve results. Stated more precisely, the overarching goal of monitoring and evaluation is to measure and analyze performance in order to manage more effectively the outputs and outcomes that constitute performance.

Performance is defined as progress towards achieving results. As part of the emphasis placed on results currently in the Promotion Area, the need to demonstrate performance imposes new monitoring and evaluation demands on the field areas (CA) and the program units.
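Since performance is defined here as progress towards results, it is commonly expressed in results-based management as the share of the distance covered between a baseline and a target. A minimal sketch of that calculation (the indicator, baseline and target values below are hypothetical, not taken from this document):

```python
def progress_towards_target(baseline: float, current: float, target: float) -> float:
    """Return progress as the fraction of the distance from baseline to target.

    Works for both increasing targets (e.g. satisfaction scores) and
    decreasing targets (e.g. waiting times), since the signs cancel.
    """
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Hypothetical indicator: average days to resolve a public enquiry,
# to be reduced from 10 (baseline) to 4 (target); currently at 7.
progress = progress_towards_target(baseline=10, current=7, target=4)
print(f"{progress:.0%}")  # 50% of the way to the target
```

A reading above 1.0 would mean the target has been exceeded; a negative reading would mean the indicator has moved away from the target, which is exactly the kind of early warning monitoring is meant to surface.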

Traditionally, monitoring and evaluation functions have focused on results analysis and implementation processes. At present, the emphasis is on analyzing the contributions of different factors to the achievement of a given development effect, including, among them, products, partnerships, policy advice and dialogue, advocacy, and mediation / coordination.

Monitoring and evaluation help the staff involved in the ways described below.

These objectives are linked in a continuous process.

Learning from the past helps make more informed decisions.

Better decisions result in greater accountability among those involved. Good decisions also improve performance, enabling the activities of the Promotion Area to be continuously adjusted.

A close partnership with key stakeholders throughout the process also promotes the creation of shared knowledge and learning, contributes to the transfer of skills, and builds the capacity of field areas and projects in planning, monitoring and evaluation.

These staff also provide valuable feedback that can be used to improve performance and learning. In this way, good monitoring and evaluation practices are continually reinforced, making a positive contribution to the overall effectiveness of performance cooperation.

Definitions of monitoring and evaluation

Monitoring can be defined as an ongoing function whose main objective is to provide managers and key stakeholders, in the context of an ongoing intervention, with early indications of progress, or lack of progress, in achieving results.

The ongoing intervention can be a project, a program, or other support to achieve an effect.

Evaluation is a selective exercise that attempts to assess, systematically and objectively, progress towards an effect and its realization. Evaluation is not an isolated event but an exercise involving analyses of different scope and depth, carried out at different times in response to changing needs for knowledge and learning during the process of achieving a given effect.

All evaluations - even project evaluations that weigh their relevance, performance, and other criteria - need to be tied to outcomes, as opposed to just implementation or immediate outputs.

Reporting is an integral part of monitoring and evaluation.

Monitoring and evaluation operate at two distinct but closely linked levels.

Feedback is a process, in the framework of monitoring and evaluation, by which information and knowledge are disseminated and used to assess overall progress towards achieving results or to confirm achievement of results.

Feedback can consist of findings, conclusions, recommendations, and lessons learned from experience. It can be used to improve performance and as a basis for decision-making and to encourage learning in an organization.

Functions and Responsibilities in Monitoring and Evaluation

WHO? — Actors, roles and responsibilities. WHAT? — Information needed. WHY? — How the information is used.
Promotion Area. Main responsibilities:
  • Collaboration with other areas to determine the focus of attention and the results sought in customer service.
  • Identification and management of strategic alliances with other areas.
  • Evaluation of the performance and quality of the area (progress towards results and achievements).
Information needed: changes necessary for continuous quality improvement; initiation, follow-up and closure of customer service; progress, problems and trends in achieving results.

Patterns and issues in the volume and effectiveness of the use of material and human resources.

How the information is used: adapting continuous improvement to the changing conditions of customer service, forging solid alliances with other areas so that they cooperate in their functions, and resolving the main obstacles that hinder implementation so that results (effects) are more likely to be achieved.

Link results with resources.

Conduct active and results-based monitoring and evaluation.

Head of field promotion. Main responsibilities:
  • Managing a portfolio of clients, projects and programs that delivers efficient results and contributes to continuous-improvement effects.
Information needed: progress towards the achievement of effects; progress of the alliance strategies for those effects.

Rate and efficiency of resource use.

Analyze progress towards outcomes and their achievement. Evaluate the effectiveness of partnership strategies and take the necessary action.

Monitor the effectiveness of implementation strategies, identifying obstacles to achieving results (effects) and taking appropriate measures.

Ensure the effective use of resources, allocating them to maximize the possibility of achieving results (effects).

Area staff. Main responsibilities:
  • Active, participative management in the search for clients, in order to produce results above those expected and planned.
Information needed: the effect at which the project is aimed; progress towards results and their achievement.

Implementation problems and issues.

Practical collaboration with partners at the project level and monitoring of their contribution.

Resource management.

Framing the project in its broader context. Taking action to achieve service goals.

Ensure effective collaboration with partners.

Establish links with beneficiaries.

Ensure efficient use of resources.

Good monitoring focuses on results and follow-up actions. It seeks to identify "what is going well" and "what is not working" in terms of progress towards the desired results, records this in reports, makes recommendations, and follows up with decisions and actions.

Monitoring also benefits from the use of participatory mechanisms, in order to ensure commitment, ownership, continuation, and feedback on performance. This is essential for effect monitoring when progress cannot be analyzed without knowing what partners are doing. Participatory mechanisms must include all the personnel involved.

All monitoring and evaluation efforts should address, as a minimum:

  • Progress towards effects: periodically analyzing to what extent the desired effects have actually been achieved.
  • The factors that contribute to or prevent the achievement of an effect: monitoring the context of the program and the conditions in which results occur, including whether services or consultations are being produced as planned and whether they are contributing to the effects.
  • The alliance strategy: analyzing the design of alliance strategies, as well as the formation and operation of alliances, to ensure that the staff working towards an effect share a common appreciation of problems and needs and a synchronized strategy.

Performance measurement

Indicators are part of performance measurement, but they are not the only element. To assess performance it is not enough to know the achievements obtained; information is also needed on how they were obtained, the factors that had a positive or negative influence, whether the results were exceptionally good or bad, and who was mainly responsible.
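The point that indicator values alone are insufficient can be made concrete: a performance record should carry, alongside the measured value, the context needed to interpret it. A minimal sketch, with hypothetical field names and data not taken from this document:

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceRecord:
    """One indicator reading plus the context needed to interpret it."""
    indicator: str
    value: float
    target: float
    contributing_factors: list[str] = field(default_factory=list)  # what helped
    hindering_factors: list[str] = field(default_factory=list)     # what got in the way
    responsible: str = ""                                          # who was mainly accountable

    def met_target(self) -> bool:
        """A higher value is assumed to be better for this indicator."""
        return self.value >= self.target

# Hypothetical reading for a customer-service indicator.
record = PerformanceRecord(
    indicator="enquiries resolved within 48 h (%)",
    value=82.0,
    target=80.0,
    contributing_factors=["new triage step at reception"],
    hindering_factors=["staff shortage in week 12"],
    responsible="front-desk team",
)
print(record.met_target())  # True
```

Keeping the qualitative fields next to the number is one way of ensuring that reports answer not only "was the target met?" but also "how and why", which is what the text argues performance assessment requires.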

Traditionally, it was easier to measure financial or managerial performance, such as efficiency.

Results-based management today lays the foundation for substantive accountability and for weighing performance and effectiveness.

Annual Project Reports (IAPs), Evaluations, and Annual Results Oriented Report (IAOR) provide the means to weigh performance at the field office level.

The knowledge gained from monitoring and evaluation is the core of the structural learning process. Monitoring and evaluation provide information and data that, once accepted and incorporated, become knowledge that promotes learning.

Therefore, learning must be integrated into the general programming cycle through an effective feedback system. Information must be disseminated and available to potential users in order to become applied knowledge.

Learning complements performance measurement by giving evaluation a qualitative element. Even when the performance indicators are not good or clear, one can learn from the process and use the knowledge gained to improve it.

Learning is also a key tool for management and as such, the strategy of applying the knowledge acquired through evaluation is an important means of moving towards outcomes. The effects constitute a more ambitious and complex objective than the simple provision of inputs and the generation of outputs.

If asked for a learning checklist, it would read as follows:

  • Record and share lessons learned.
  • Be open-minded.
  • Plan evaluations strategically.
  • Engage staff strategically.
  • Provide real-time information.
  • Bring knowledge to staff.
  • Apply what has been learned.
  • Monitor how new knowledge is applied.

The success of all of the above depends on everyone learning from what worked and what did not, in order to improve progress towards better results.

Learning has been described as a continuous and dynamic process of inquiry in which the key elements are experience, knowledge, access and relevance.

It requires a culture of research and study, rather than simple answering and reporting. This can be more easily accomplished when people are given the opportunity to observe, engage, and invent or discover strategies for dealing with certain types of problems or issues.

The monitoring process should commit to improving collateral links between program and project staff, including feedback for learning purposes. The analysis of existing or potential links between programs and projects should be as critical, objective and detailed as possible.

Evaluation is a process that requires establishing a common baseline of information against which comparisons can be made. The difficulty lies in knowing from the outset each relevant factor and how all the factors affect one another. Without reliable and regular feedback, monitoring and evaluation cannot serve their purpose. Attention should be paid to experiences with potentially wider application.

Among the measures to improve the feedback of an evaluation, we have:

  • Understand how the learning process takes place inside and outside the organization (identify blockages).
  • Analyze how to improve the relevance and timeliness of evaluation feedback, and ensure that improvements are implemented.
  • Be explicit in determining the key recipients of evaluation feedback and the reasons for wanting to reach them.
  • Improve knowledge of target groups in order to understand what they expect from evaluations, how they use evaluation information, and how feedback systems can better meet these expectations.
  • Develop a more strategic view of how feedback approaches can be tailored to the needs of different audiences.
  • Ensure that the quality of evaluation products is adequate, especially in terms of conciseness, clarity and presentation.
  • Study a diversification of the approaches used to communicate with audiences, introducing innovative methods where appropriate.
  • Improve the websites and intranet dedicated to evaluation, recognizing that ease of access and use are key factors.
  • Ensure that full publication of evaluation reports is systematic and that adequate approval and notification processes exist, so that senior management and key partners are not surprised by controversial results.
  • Devote more attention to improving staff participation in evaluation tasks, including feedback on lessons learned from evaluations and recognition that language differences are a major constraint.

We can conclude by saying that the framework for monitoring and evaluation presented in this document is certainly not “fixed”, but is expected to evolve and improve as members of the area gain experience through its daily use. Some elements require a change in the mindset and behavior of the staff and therefore the area can be expected to continue learning for years to come.
