
Evaluation report

2018 Evaluation Office: Evaluation of the Coverage and Quality of the UNICEF Humanitarian Response in Complex Humanitarian Emergencies

Authors: Andy Featherstone, Tasneem Mowjee, David Fleming, Katie Tong, Clemens Gros, Leonora Evans-Gutierrez. Assisted by Abhijit Bhattacharjee, Kate Hale and Richard Burge.

Executive summary

With the aim of continuously improving the transparency and use of evaluation, the UNICEF Evaluation Office manages the Global Evaluation Reports Oversight System (GEROS). Within this system, an independent external company reviews and rates all evaluation reports. The quality rating scale for evaluation reports is as follows: “Highly Satisfactory”, “Satisfactory”, “Fair” or “Unsatisfactory”. You will find the link to the quality rating below, labelled as ‘Part 4’ of the report, and the executive feedback summary labelled as ‘Part 5’.


The purpose of this evaluation is to generate, through robust and systematic analysis across a range of country contexts, practical solutions that can inform how UNICEF improves the coverage and quality of its humanitarian response. It is anticipated that both the evaluation process and its results will contribute to a body of evidence and learning about the enablers of, and barriers to, delivering high-quality humanitarian action in complex humanitarian emergencies, and about how these have affected UNICEF’s performance and ability to reach affected populations. The evaluation will generate practical solutions and recommendations that enable UNICEF to scale up good practice and introduce innovations that will ultimately contribute to improving the coverage and quality of its humanitarian response.

The first phase of the evaluation included three pilot case studies, one field mission (Nigeria) and two desk reviews (Pakistan and Ukraine). The second phase included a further four field missions (Afghanistan, the Central African Republic, the Philippines and Somalia) and four desk reviews (Burundi, Mali, State of Palestine and the Syrian Arab Republic).

The evaluation has three objectives:

  • Assess UNICEF performance in achieving coverage and quality in complex humanitarian emergencies based on a sample of countries, including identifying internal and external enabling factors and challenges to UNICEF’s performance.
  • Identify internal and external enabling factors and challenges to UNICEF fulfilling its protection mandate and role in complex humanitarian emergencies, including its designated role in the Monitoring and Reporting Mechanism (MRM) resulting from United Nations Security Council resolutions on children affected by armed conflict.
  • Capture good practice and innovations that are improving humanitarian action and analyze their potential for more general application by UNICEF.


The evaluation used a mixed-methods approach for data collection and analysis. Qualitative data was collected through a literature review, semi-structured interviews and focus group discussions (FGDs). Quantitative programme performance data and funding information was collected and analysed. Key informant interviews (KIIs) were conducted with UNICEF staff, government and NGO partners, and UN agencies. FGDs were conducted with communities receiving UNICEF-funded assistance.

Findings and Conclusions

The evaluation findings reveal that UNICEF showed organizational courage and tenacity in sustaining its work in complex humanitarian emergencies, despite significant challenges. Across all the country case studies, UNICEF was among the largest and most important providers of humanitarian assistance and protection, and often worked in some of the most challenging areas. UNICEF programme coverage in these environments has been significant, and large populations have benefited greatly from the organization’s humanitarian action. However, to enhance and facilitate the provision of effective assistance and protection in complex humanitarian emergencies, the evaluation highlights several areas that require improvement, some of which are noted below:

  • With only a few exceptions, the evaluation found that when a trade-off between equity and coverage was required, coverage was prioritized. Equity programming often requires additional activities or programme areas, making it less cost-effective to deliver. 
  • While humanitarian principles are at the core of UNICEF’s policies and procedures, staff and partners vary in how they understand and interpret them. UNICEF staff tend to place the principle of ‘humanity’ above the others, frequently interpreting it as achieving humanitarian access and assistance ‘at all costs’; yet the more complex principles of impartiality, neutrality and independence are equally important.
  • The evaluation highlighted the effect that different partners had on programme coverage and quality at different times during a crisis. It found that UNICEF frequently has a good mix of partners; however, it needs to be more agile in reviewing this mix when there are significant changes in context. Managing UNICEF’s partnerships with governments in conflict situations can be particularly challenging, as it requires a willingness to moderate relationships where necessary to manage perceptions, defend principles or remind states of their obligations under international humanitarian law (IHL).
  • At the country office level, UNICEF often lacks a structured approach to routinely engaging with communities to ensure programme relevance and quality and to elicit the views of those receiving UNICEF-funded assistance. This is important given the links between delivering quality programmes and community acceptance, an important strategy for achieving humanitarian access.
  • UNICEF’s systems and procedures are comprehensive and, where applied, invaluable to staff engaged in humanitarian response. However, the evidence also points to inconsistencies in how the procedures are understood and implemented, and in where simplifications are granted.
  • The evaluation found that there is often insufficient evidence for UNICEF to judge key aspects of its humanitarian practice. The organization does not always have the information and analysis required to inform effective humanitarian action, or to systematically monitor changes in the context over time to ensure the continuing relevance of its assistance.


A few recommendations are highlighted below: 

  • There is a need for UNICEF to clarify its corporate expectations for the delivery of coverage with equity in complex humanitarian emergencies. This should explicitly address the concern highlighted in the evaluation of how country offices (COs) should balance reaching the greatest number of people with reaching those in greatest need.
  • UNICEF should undertake regular analysis in order to adapt programme approaches and partnerships to ensure their relevance and to maximize their potential to reach those in greatest need. Underpinning this should be an approach that consistently prioritizes agency presence and ensures the greatest proximity to affected people.
  • There is a need to strengthen the understanding and capacity of all UNICEF staff (both at headquarters and at CO level) and partners about the practical use of humanitarian principles to make structured, ethical decisions on programme access, coverage and quality.
  • UNICEF should take a more structured approach to identifying, equipping and supporting staff at country level who engage in humanitarian negotiations with non-State entities and host governments.
  • In fragile and conflict-prone countries, UNICEF must ensure that its engagement with the government is consistent with humanitarian principles and IHL. This is particularly important in situations when the government is party to the conflict, is not meeting its responsibilities under IHL or is otherwise contradicting humanitarian principles.


Read the UNICEF Connect blog on UNICEF's response to complex humanitarian emergencies.

Please find attached the following, for the Coverage and Quality Evaluation:

  • The main report - Report
  • The annexes - Part 2
  • The ToR (long version) - Part 3
  • GEROS Evaluation Review - Part 4
  • GEROS Feedback Summary - Part 5
  • Evaluation Management Response - Part 6

Full report in PDF




Report information



Evaluation Office


