Monitoring and Evaluation (M&E) in Emergencies

Last updated: 17 Feb 2023

Overview

Monitoring and evaluation (M&E) are two important management processes that enable project staff to track progress and facilitate effective decision-making. Although donors require IOM to integrate M&E systems into projects to account for the use of the resources they provide, the greatest beneficiaries of effective M&E are the target population. By closely observing project activities and understanding their impact on the community, adjustments can be made to ensure that project design and activities are relevant, effective and efficient, and yield meaningful results for the community.

Description

What is Monitoring?

Monitoring can be defined as "a continuing function that uses systematic collection of data on specified indicators to provide management and the main stakeholders of an ongoing development intervention with indications of the extent of progress and achievement of objectives and progress in the use of allocated funds".1 It provides feedback for operational planning, cost management and budget use, and facilitates decision-making on remedial actions when necessary. Effective monitoring also requires regular consultations with project partners and beneficiaries. It measures progress against project objectives, outcomes and outputs. When done properly, the information gathered through monitoring can guide project revisions and confirm that aid is reaching the people intended. It enables decision-makers to respond to community feedback and to identify emerging problems and trends. Monitoring should gather data disaggregated by age and gender for different groups, including persons with specific needs.2
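
To make the idea of measuring progress against indicators with disaggregated data more concrete, the sketch below shows one minimal way such a record could be kept and progress against a target computed. It is purely illustrative: the indicator name, target, age groups and figures are hypothetical examples, not IOM or cluster standards.

```python
# Illustrative sketch only: one minimal way to record an indicator with data
# disaggregated by sex and age group, and to compute progress against a target.
# The indicator name, target and figures below are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class Indicator:
    name: str
    target: int
    # counts keyed by (sex, age_group), e.g. ("female", "18-59")
    counts: dict = field(default_factory=dict)

    def record(self, sex: str, age_group: str, number: int) -> None:
        key = (sex, age_group)
        self.counts[key] = self.counts.get(key, 0) + number

    def total(self) -> int:
        return sum(self.counts.values())

    def progress(self) -> float:
        """Share of the target reached so far (0.0 to 1.0)."""
        return self.total() / self.target if self.target else 0.0


# Example usage with made-up figures from a distribution round.
nfi = Indicator(name="Individuals receiving NFI kits", target=5000)
nfi.record("female", "18-59", 620)
nfi.record("male", "18-59", 540)
nfi.record("female", "0-17", 410)
print(f"{nfi.name}: {nfi.total()} of {nfi.target} ({nfi.progress():.0%})")
```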

Within the cluster system, cluster leads often require partners to provide regular updates on the progress of their response in a measurable and standardized manner. Donors are also placing increasing emphasis on receiving clear, measurable and results-based updates. Where country-based pooled funds are available, fund managers often assign M&E focal points to review project activities related to the funds. For more information, contact the Department of Operations and Emergencies (DOE) Regional Thematic Specialist (RTS) or the M&E focal point in the regional office.

 

  • Do not re-invent the wheel. Global Cluster Standard Indicators exist for emergency operations at the global level and can be used and, if needed, modified to fit the context of the emergency. Cluster leads at the country level often have a set of indicators developed and request that partners monitor their activities using these variables to measure qualitative and quantitative progress. In some cases, donors may also have standard indicators that they would like used to monitor project activities. Where these exist, use them as project indicators where appropriate.
  • To estimate the cost of M&E activities within the budget, consider the costs of measuring indicators with the means of verification available (such as surveys, field visits, assessments, etc.), as in the sketch below. The budget line should also include the costs of conducting an evaluation where appropriate.
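
As a purely illustrative aid to the budgeting point above, the sketch below shows one way an M&E budget line could be roughed out from the planned means of verification. All activities, unit costs and quantities are hypothetical placeholders, not IOM or donor rates.

```python
# Illustrative sketch only: rough costing of an M&E budget line from the
# means of verification planned for the project. All unit costs and
# quantities below are hypothetical placeholders.
monitoring_activities = [
    # (description, unit cost in USD, number of times carried out)
    ("Post-distribution monitoring survey", 3000, 2),
    ("Field monitoring visit", 800, 6),
    ("Third-party monitoring (per month)", 1500, 4),
]
final_evaluation = 12000  # include only where an evaluation is planned

monitoring_cost = sum(unit_cost * times for _, unit_cost, times in monitoring_activities)
total_me_budget = monitoring_cost + final_evaluation

print(f"Monitoring activities: USD {monitoring_cost:,}")
print(f"Evaluation:            USD {final_evaluation:,}")
print(f"Total M&E budget line: USD {total_me_budget:,}")
```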

 

Remote Monitoring: In general, IOM presence and proximity to the affected population and beneficiaries are critical to the effectiveness of our programmes. Unfortunately, as a result of security risks to IOM and other humanitarian and development actors, there will be cases where an operational adjustment must be made by IOM in the country.

When faced with a complex security environment, the objective for IOM and other actors is not to avoid the risks altogether, but to manage them in a way that allows us to remain present and effective. Based on programme criticality, this may involve the withdrawal or drastic reduction of staff in the field, requiring the country office to manage a programme remotely from a location other than where it is being implemented.

When the situation leads to remote management, the role of M&E can take on greater importance, due to the need to ensure the project is delivered according to plan while IOM staff have limited or no physical presence on site. There are several cases of past and current IOM humanitarian operations where remote management and M&E have been used, including in Iraq, Somalia, Sudan, and Syria.3 A range of innovative M&E approaches have been developed by IOM and the humanitarian and development community for settings in which access is limited. These can include, but are not limited to:

1. Call centres and regular debriefing meetings with local partners.

2. Community-based methods such as crowd sourcing, broadcasts, complaints boxes, and consulting local communities.

3. Photos and videos of distribution, web-based remote project monitoring, daily verbal reports and peer observations.

4. Use of vetted third-party monitors who may have access to the area of operation.

It is important to keep in mind that each of these approaches has its benefits and drawbacks, and none will fully address the challenges of monitoring operations in a contested and rapidly changing conflict situation. For further guidance on remote monitoring, contact the Regional M&E Officer.

 

ECHO has developed specific guidance on remote management in operations. When operating under (partial) remote management, they request specific information to be included in the project proposal, and require quarterly reports on monitoring and aid diversion. For more information or the templates, please contact RO Brussels.

What is Evaluation?

Evaluation is defined as "a systematic and objective assessment of an on-going or completed project, programme or policy, its design, implementation and results. The aim is to determine the relevance and fulfilment of objectives, development efficiency, effectiveness, impact and sustainability".4

Evaluations are useful tools for management, accountability, learning and for gathering information to improve interventions. Results of evaluations can also be used as a resource mobilization tool and to promote IOM activities. Evaluations also demonstrate IOM's commitment to carrying out humanitarian interventions in an effective and transparent manner. In the context of an emergency, evaluations can be conducted to analyze a specific project, programme or even an overall cluster response (including the response of cluster partners). Specific evaluation criteria have been developed for emergencies and their descriptions can be found on IOM's Evaluation webpage.

For Level 3 Emergencies, Real Time Evaluations (RTEs) are mandatory for IOM (see IOM SOPs for L3 Emergencies in the IOM Corporate Emergency Activation entry). RTEs are expected to produce immediate feedback on the results of the evaluation and contribute to improvements. The RTE team should deliver its report, or a substantive early draft, before leaving the field. RTEs are not meant to be time-consuming exercises, and their scope should be targeted and planned, taking into consideration the reality in the country office, where programme managers may have limited time to dedicate to the evaluation process.

In some cases, donors include evaluations as a requirement for receiving contributions; this is often stated in the project agreement. Donors may also opt to carry out evaluations directly. In the event that the donor requires an evaluation to be carried out, it is important to ensure that the relevant costs are included in the project budget, and country offices should discuss this with the donor as soon as possible. Various types of evaluations can be proposed (e.g. mid-term, final, RTE, humanitarian or process evaluations) and are designed on a case-by-case basis to fit the specific requirements of the context.

Key Considerations

Monitoring

Often, a single sector of intervention for IOM has several donors contributing to similar activities. Where possible, use the same indicators and ways of measuring progress across these projects; this makes it easier to gather data and consolidate the information to see the overall impact of the intervention, as in the sketch below.
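
As an illustration of how such consolidation can work, the sketch below aggregates results reported against a shared indicator from several projects funded by different donors. The project codes, donor names, indicator and figures are all invented for the example.

```python
# Illustrative sketch only: when several donors fund similar activities in one
# sector, reporting against a shared indicator lets figures be consolidated to
# show the overall reach of the intervention. All entries below are hypothetical.
from collections import defaultdict

# Each entry: (project code, donor, shared indicator, value achieved to date)
project_results = [
    ("SHL-001", "Donor A", "Households receiving emergency shelter", 1200),
    ("SHL-002", "Donor B", "Households receiving emergency shelter", 850),
    ("SHL-003", "Donor C", "Households receiving emergency shelter", 430),
]

consolidated = defaultdict(int)
for _, _, indicator, value in project_results:
    consolidated[indicator] += value

for indicator, total in consolidated.items():
    print(f"{indicator}: {total} households across {len(project_results)} projects")
```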

At the country level, internal sitreps are usually developed and shared by the country office with other IOM offices that are supporting the response (e.g. the Regional Office and Headquarters). Include project indicators in the templates that are shared with operations staff who contribute information for the sitrep. This ensures that:

  • Sitreps have measurable information on progress, and
  • The necessary updates against indicators are gathered regularly and not just before a donor report is due.

Evaluation

  • Be sure to read project agreements and project documents to identify whether evaluations are required in the proposal. Given the short timeframe of most emergency projects, it is important to start planning for these activities at the beginning of project implementation.
  • Evaluation reports are available on the IOM Evaluation webpage. These can be useful resources for reviewing evaluations conducted in the past and can provide ideas when planning your own evaluations.
  • Country offices are encouraged to send their evaluations to OIG for recording and uploading onto the evaluation webpage.

Relevance to IOM Emergency Operations

In line with the humanitarian community's efforts to improve the performance of the overall response to an emergency, including improving accountability to both the affected population and the donors who support the response, IOM is working to improve its approach to M&E by ensuring that results monitoring frameworks are developed and used in emergencies. These frameworks should also include tools that help measure the indicators and serve as an implementation plan for projects, including the planning of an evaluation. Having a results monitoring framework in place also helps the country office produce better information products that contribute to resource mobilization, such as external sitreps, fact sheets and annual reports. It also makes the development of project reports much easier and improves their overall quality.
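
To illustrate what a results monitoring framework can capture, the sketch below represents one outcome with its outputs, indicators, targets, actuals and means of verification, and computes progress per indicator. The structure and all entries are hypothetical and do not reproduce IOM's official templates.

```python
# Illustrative sketch only: a minimal representation of a results monitoring
# framework linking an outcome and its outputs to indicators, targets, actuals
# and means of verification. The structure and entries are hypothetical.
results_framework = {
    "outcome": "Affected households have access to safe emergency shelter",
    "outputs": [
        {
            "output": "Emergency shelter kits distributed to displaced households",
            "indicator": "# of households receiving shelter kits",
            "target": 2000,
            "actual": 1450,
            "means_of_verification": "Distribution lists; post-distribution monitoring",
        },
        {
            "output": "Damaged shelters repaired",
            "indicator": "# of shelters repaired to agreed standard",
            "target": 500,
            "actual": 310,
            "means_of_verification": "Technical inspection reports; field visits",
        },
    ],
}

# Print progress against each indicator for use in sitreps or donor reports.
for row in results_framework["outputs"]:
    pct = row["actual"] / row["target"]
    print(f"{row['indicator']}: {row['actual']}/{row['target']} ({pct:.0%})")
```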

Contacts

Thematic experts are available within IOM to help guide the development of projects. These experts can also provide specific guidance to develop realistic indicators. These thematic experts can be contacted by email:

For Camp Coordination and Camp Management (CCCM): [email protected]

For Shelter and Settlements: [email protected]

For Water, Sanitation and Hygiene (WASH): [email protected]

For the Displacement Tracking Matrix (DTM): [email protected]

For Health: [email protected]

For Psychosocial Support: [email protected]

For Protection: [email protected]

M&E focal points are also available in the Regional Office and can provide guidance and additional information.

The Office of the Inspector General (OIG) supports country offices in carrying out evaluations and can provide guidance on how to design terms of reference for an evaluation, as well as technical support for the planning and implementation of an evaluation. In addition, OIG can provide support in monitoring, such as developing and implementing monitoring frameworks and capacity development in M&E.

For support contact the Evaluations team: [email protected].

In addition, there is an M&E Practitioners SharePoint network which is open to IOM staff and provides resources and information about M&E. If you are interested in joining the network, you can email [email protected] to be included.

Footnotes

1 OECD's Development Assistance Committee, Working Party on Aid Evaluation, "Glossary of Key Terms in Evaluation and Results Based Management", 2010.

2 Sphere for Monitoring and Evaluation.

3 Syria is unique in that remote management has been the predominant form of operation since early in the crisis and is likely to continue for the duration of the conflict.

4 OECD's Development Assistance Committee, Working Party on Aid Evaluation, "Glossary of Key Terms in Evaluation and Results Based Management", 2010.