Evaluation report

2012 Philippines: UNICEF Evaluation on Alternative Delivery Modes: MISOSA and e-IMPACT



Author: INTEM; Department of Education

Executive summary

Background:

This report presents the results of an evaluation undertaken by the International Technology Management Corporation (Intem) for the United Nations Children’s Fund (UNICEF) on two alternative delivery modes (ADMs) of primary education in the Philippines, namely: (i) the Modified In-School, Off-School Approach (MISOSA); and (ii) the Enhanced Instructional Management by Parents, Community and Teachers (e-IMPACT).

The evaluation also sought to assess UNICEF's contributions towards the attainment of project-level results by scrutinizing the innovations through the evaluation lenses of relevance, effectiveness, efficiency and sustainability. It analysed how these contributions were able to influence policy change, build stakeholders' capacity, and provide materials and supervision support. The evaluation also revisited key processes undertaken to initiate and implement the innovations in order to identify strengths, weaknesses, opportunities and threats to sustaining such innovations, particularly at the school level, so that these may be replicated at the local and national levels within the context of the Basic Education Sector Reform Agenda (BESRA).

Purpose/Objective:

The main purpose of the evaluation was to analyse (i) key processes, in terms of effectiveness and efficiency in the ADMs' implementation, and (ii) project outcomes, in terms of achieving Child-Friendly School System (CFSS) goals and improving learning outcomes, with the end view of providing a basis for refining, scaling up and sustaining the innovations. The results of the evaluation will be useful for refining the implementation design of the next country programme for children (CPC) between the Government of the Philippines and UNICEF, as well as for enriching the Basic Education Sector Reform Agenda (BESRA) of the Department of Education (DepEd).

Methodology:

Mixed research methodologies and data-gathering techniques were used to determine the changes, per variable or group of indicators (relevance, effectiveness, efficiency and sustainability), during and after programme implementation in selected schools. The evaluation made extensive use of key informant interviews, individual surveys, focus group discussions, reviews of documents, literature and records, and direct (ocular) observations to gather data and information on the ADMs. To draw strong conclusions on the strengths of the ADMs, the evaluation used randomly selected samples from two groups: treatment schools and control schools. Research tools and protocols were established to guide the research team, and cross-validation (triangulation) was used to verify the data and information gathered. Trends, patterns and regularities were identified during analysis in order to develop hypotheses, theories and conclusions.

The following tools for analysis were used: (i) CFSS tools formulated according to the terms of reference (TOR) to derive relevant data and highlight the importance of CFSS in the evaluation; (ii) the Rubrics for Assessment of CFSS of High Schools (Rubrics), adopted from UNICEF; (iii) the CFSS checklist for elementary schools; and (iv) the Development Assessment Profile (DAP) from EQuALLS/USAID. To augment the quantitative analysis, the evaluation also used a multiple case study method to generate detailed data and analyses of the experiences of implementers, beneficiaries and other stakeholders at the ground, school and community levels, focusing on how the interventions were implemented and why stakeholders see them as either successes or failures.

Conclusions and Recommendations:

The piloting of ADMs as supported by UNICEF made notable contributions to improving educational outcomes, especially in terms of raising the mastery level and achievement test scores of children. Improvements in this area were robust. However, in terms of reducing dropout and repetition rates, the improvements were not as significant. Nonetheless, the ADMs enriched the efforts of BESRA in introducing innovations meant to make the teaching-learning environment more varied and stimulating, provide opportunities for children’s active participation, and enhance the engagement of parents and communities in education initiatives. The innovative strategies indeed proved significant in the context of current Philippine education, UN priority directions, and priorities of other development and donor agencies working for the country.

Lessons Learned:

The following are lessons learned from this evaluation that could be of value to DepEd policy planners, UNICEF, and other development partners:
1. There will be no dropouts as long as learners remain in contact with the school (through an alternative modality).
2. Emphasis on the individuality of learners is essential to enable their natural and free development within their physical and intellectual capacities.
3. The acquisition of knowledge and skills is not the main object of public elementary education; rather, it is the development and strengthening of learners' power to help themselves.
4. Any pilot innovation in education is best anchored on a ‘whole-school’ approach; this way, sustainable development can be integrated throughout the formal sector in a holistic manner rather than just on a stand-alone basis.
5. This assessment of the piloting of ADMs in the Philippines presents challenges to DepEd as it plans ADMs’ mainstreaming into the national education system; likewise, it offers lessons to UNICEF in regard to its programming for children.


