
Evaluation report

2014 Maldives: An Evaluation of UNICEF Maldives Strategies in Addressing Issues Affecting Women and Children

Author: UNICEF Maldives Country Office

Executive summary

With the aim of continuously improving the transparency and use of evaluation, the UNICEF Evaluation Office manages the "Global Evaluation Reports Oversight System". Within this system, an external independent company reviews and rates all evaluation reports. Please check the quality rating of this evaluation report – whether it is “Outstanding, Best practice”, “Highly Satisfactory”, “Mostly Satisfactory” or “Unsatisfactory” – before using it. You will find the link to the quality rating below, labelled as ‘Part 2’ of the report.


The current Country Programme Document (CPD) identified specific outcomes and outputs expected to be achieved over the period 2011 – 2015. These outcomes were to be achieved through various strategies, including: (1) generation of evidence through monitoring and research; (2) strengthening of the legal and policy framework; (3) capacity building of sector partners; (4) evidence-based advocacy; (5) behavior change communication; and (6) engagement with the media and civil society. Despite the achievements recorded through the Mid-Term Review (MTR) conducted in 2013, many challenges remain, such as high malnutrition among children, an increase in reported cases of child abuse, and issues of poverty, vulnerability and deprivation affecting children, adolescents and women.

UNICEF Maldives is currently developing the new CPD for 2016 – 2020. The overall country context and programme areas, Maldives country priorities, and the results of the new UNICEF MTSP (2014-2017) will be examined to inform the programme priorities of the new CPD. This may mean re-affirming some current priority areas with a sharpened focus, continuing to strive for the achievement of existing priorities, and identifying new priorities and strategies, especially ones reflecting a more explicitly equity-focused approach within an evolving programme environment.

An evaluation is therefore required of the strategies applied, the results achieved, the lessons learnt and best practices that should be continued, and the new approaches and strategies that need to be introduced to accelerate progress, especially for disadvantaged children.


The purpose is to carry out an evaluation of UNICEF strategies in addressing issues affecting women and children in the Maldives. More specifically, the evaluation will assess whether the strategies of the Maldives Country Office (MCO) mentioned above have facilitated the achievement of the expected results of the CPD, whether they have contributed to wider development results at the national level, and whether lessons can be derived to inform future strategic positioning and programming. The MCO will also assess the extent to which the commitment to reduce disparities in the social development outcomes at the core of UNICEF’s mandate – health, nutrition, water and sanitation, education, child protection, HIV and emergencies – has been effectively mainstreamed across these programme areas and, particularly, in the contextualized state-level programmes.

The evaluation shall provide information on the strategies applied during the programme implementation, ascertain their relevance, effectiveness and sustainability in achieving target results, identify areas of improvement including remaining challenges and draw lessons learned to inform the next country programme 2016 - 2020. The evaluation will further provide a body of knowledge to inform policy and strategic interventions for improved results for children and women in the Maldives.


As a formative evaluation examining the early stages of programme implementation before the MTR, and the reflection of the resulting strategic priorities in the CPD and UNDAF action plan, this evaluation relies largely on qualitative data and requires analysis across various programmes. Analytical work was conducted through a desk review of secondary data, complemented by key informant interviews and focus group discussions. Data were reviewed with disaggregation by age, sex and locality.

Ethical considerations were given the utmost priority at all stages of the evaluation process, including methodology design, interviewing of respondents, management of data collectors and documentation. The UNEG ethical guidelines for evaluation (UNEG/FN/CoC[2008]) were adhered to throughout the evaluation.

Findings and Conclusions:

The programme has been very active and has led to multiple visible outputs in the areas of child protection (CP), education and health. In some cases this has translated into higher-level results, for instance in the scaling up of programmes in life skills education, early childhood development, family and child investigations, and child-friendly schools. The programme may have had more immediate outcomes than it is able to prove; it was let down by the lack of a systematic monitoring system that captured results. Whether these immediate effects would necessarily have led to higher-level outcomes is another matter, since this depends on external factors. However, the programme would have been better placed to achieve higher-level outcomes had it given more consideration to feasibility and sustainability when designing interventions. Much was said by staff about external constraints, but the evaluation is not convinced that any initiatives demonstrated by the programme to be viable for national scale-up fell by the wayside, beyond informative training courses.

While the external environment was challenging, characterised by political change, social unrest and uncertainty among national partner agencies, the programme itself was not well braced for dealing with such difficulties. A recurring theme across all evaluation findings points to questions over internal organisation. The programme was ambitious to start with, given the limited resources and capacities available to the MCO. This was further compounded by an office structure that hindered effective coordination and a results framework (RF) lacking logical, measurable targets. MCO staff were inevitably over-stretched, struggling to keep up with a diffuse and unrealistic plan without the time for proper consolidation, reflection and coordination. The common finding across all programme strategies is one of scattered interventions whose results fell through the cracks of the programme’s monitoring system.


Recommendations are directed to the MCO for consideration in developing the next country programme. Within the context of an ongoing country programme development exercise, and given the constraints faced by the evaluation itself, it would be inappropriate for the evaluator to make very directive recommendations. Instead, the evaluation makes a number of over-arching recommendations that can help put the new country programme on a better footing. The recommendations listed below are supplemented by commentaries in the final section of this report, which give further explanations and ideas on how they can be implemented.

1. Narrow the CP to a reduced number of outcomes, focus areas and topics
2. Ensure the results matrix is logical, with SMART (specific, measurable, achievable, relevant, time-bound) indicators
3. Set up a systematic process for monitoring the immediate outcomes of capacity development and knowledge management activities
4. Improve systems for the planning and testing of proposed interventions
5. Ensure the office structure facilitates coordination and oversight

Full report in PDF
