

2009 Cambodia: Impact Assessment of Expanded Basic Education Program (EBEP) Supported Training

Executive summary


“With the aim of continuously improving the transparency and use of evaluation, the UNICEF Evaluation Office manages the "Global Evaluation Reports Oversight System". Within this system, an external independent company reviews and rates all evaluation reports. Please check whether the quality rating of this evaluation report is “Outstanding”, “Good”, “Almost Satisfactory” or “Unsatisfactory” before using it. You will find the link to the quality rating below, labelled as ‘Part 2’ of the report.”

Background and Methods

The Ministry of Education, Youth, and Sports (MoEYS), within the Royal Government of Cambodia (RGoC), has identified capacity building as a key strategy to expand and enhance educational services in Cambodia. To support this strategy, the MoEYS Expanded Basic Education Program (EBEP), supported by UNICEF/Sida and currently in Phase II of operation (2006-2010), is composed of three projects1:

  1. Capacity Building for Sector-wide Reform and Decentralization
  2. Improving Equitable Access and Quality of Basic Education
  3. Expanded Learning Opportunities for Disadvantaged Young Children and Youth

The purpose of this evaluation is to assess the impact of UNICEF supported trainings thus far within the context of specific projects, in order to see how trainings might be improved to better support decentralization and de-concentration in the future. In addition, this evaluation is to determine how sustainable these trainings are from a broader individual and institutional capacity perspective.

The evaluation assessed the quality and impact of educational management training activities supported by UNICEF under EBEP. Both top-down and cascade approaches to training are used, so one training of each approach was included in this evaluation. The first, Educational Planning and Statistics training under Project 1, assessed training impact for provincial and district education planning staff and uses a top-down approach. The second, Effective Teaching and Learning under Project 2, assessed training impact with District Training and Monitoring Teams, School Directors, Technical Grade Leaders, and Teachers; this training uses a cascade approach. Both programs were evaluated on how they support knowledge gain as well as the transfer of knowledge to others.

The key questions to be evaluated focused on the following:

  • Training Quality: Were the training activities organized well? Were the training contents relevant to and useful for the trainees’ work? Did the trainees follow and understand the training content?
  • Knowledge Gain: Did the trainees gain intended knowledge and skills from the training?
  • Application in the Workplace: Did the trainees apply their new knowledge and skills in their daily work? Did the trainees share their new knowledge and skills with other colleagues in the workplace?
  • Impact: Did the training lead to improved work performance of the organization as a whole?

The methodology for this evaluation combined qualitative methods (literature review, interviews, and focus group discussions) with quantitative surveys. The evaluation focused on Phnom Penh and 2 of the 6 UNICEF-supported provinces (Kampong Thom and Stung Treng, a 33% sample). A total of 7 districts within those 2 provinces (a 16% sample of the 44 UNICEF-supported districts across the 6 provinces) were chosen, with a mix of 2 urban areas, 2 remote areas, 2 rural areas, and 1 rural/remote area.

A total of 2,716 surveys were collected (54 from the Educational Planning and Statistics training and 2,662 from the Effective Teaching and Learning training), and 130 people participated in interviews or focus group discussions.


The trainings have increased theoretical knowledge and have yielded improvements in work outcomes for specific skills. However, both trainings have produced limited outcomes in higher-level analytical thinking skills, and application of training knowledge is impeded by a variety of issues.

Educational Planning and Statistics

Training Quality

The majority of training participants surveyed reported that the training topics were relevant to their jobs and that the quality of the training and trainers was good to excellent. Participatory methods, including hands-on practice sessions, were used. The top-down approach appears to ensure better knowledge transfer than the cascade approach.

Issues exist that affected training quality. The lack of a functional tracking system for training means there is no way to ensure that the right people receive the right training. UNICEF’s involvement in the quality assurance of MoEYS training appears to be inadequate. The top-down approach puts heavy work burdens on DOP central trainers, which limits their ability to focus on other planning activities. There is no follow-up mechanism after trainees return to their workplaces to ensure that training knowledge is implemented correctly. Trainers are trained on technical content but not on how to train others. Training tends to focus on isolated skills and topics, preventing participants from gaining a whole-picture view of how their responsibilities fit into the larger scheme of educational reform. Five-day one-off training in educational planning and statistics is not enough to build the necessary planning capacity of POE and DOE staff.

Knowledge Gain

The training MoEYS DOP Staff received with UNICEF support increased the confidence and technical competence of staff. The majority of POE supervisors felt confident about their staff’s skills in educational planning, statistics and monitoring and evaluation. More than 70% of the surveyed POE staff felt confident in statistics, educational planning, and provincial strategic planning. More than 70% of the surveyed DOE staff felt confident in statistics, education indicators, and monitoring and evaluation.

While staff rated themselves as feeling confident on surveys, POE supervisors tend to overrate the skills and confidence levels of their staff in educational planning and statistics, while DOE supervisors tend to underestimate the skills and confidence levels of their staff. POE staff feel less confident in educational indicators and monitoring and evaluation, while DOE staff feel less confident in educational planning and provincial strategic planning. DOE staff also feel that they are not fully involved in educational planning processes; they tend to feel that their main role is collecting and sending school data to POE. DOP staff feel they need further capacity development in technical areas such as policy formulation, financial projection, and school mapping.

Knowledge Application

The majority of the surveyed POE and DOE staff reported that they get appropriate follow-up and support from others on how to apply what they learned in training. DOP staff appear to have internalized and applied what they learned from the IIEP training to their strategic planning and programming work. POE staff applied what they learned from training in the development of the three-year provincial education strategic plans. DOE staff’s ability to compile statistical data has been improving.

While evaluation participants felt that the training topics were relevant to their jobs, many staff reported that the skills learned were only somewhat applicable to their daily roles. DOE staff felt disconnected from strategic planning. DOE staff’s main responsibilities appear to be focused on data collection from schools and compilation of basic statistics for POEs; few staff understand how to apply the collected data for educational planning. Staff at all levels still do not have complete descriptions of their individual responsibilities, or of how their job performance will be rated: there is no way to hold staff accountable. Training tends to focus on “what to do,” rather than the “how” and “why” of doing activities. It cannot be assumed that staff who participated in training share the knowledge and skills they learned with other staff members. Staff felt that, besides a lack of financial resources, organizational structure and lack of opportunity are the main barriers affecting the application of their knowledge and skills to their work, which implies that institutional reform is essential in addition to individual skills development.

Training Impact

MoEYS Staff’s ability to discuss policy matters has been improving, especially among DOP Staff. DOP Staff, particularly those trained in IIEP, are considered resources by other departments in regard to educational planning and statistics. In the DOP, there has been more delegation of tasks and responsibilities between supervisor and staff as a result of increased confidence in staff capacity. In general, MoEYS has shown stronger ownership in implementing education reform. All POE and DOE Supervisors felt that training proved useful or very useful to their offices’ overall abilities in the area of educational planning and/or statistics. A majority of POE and DOE Staff felt that training proved very useful to improving their overall abilities. Three-year provincial education strategic plans were prepared in all provinces. Communication between POE and DOE has improved.

While basic knowledge of educational planning and statistics has been enhanced at all levels, higher-level analytical and strategic thinking skills continue to be a challenge. Many POE Staff are still not confident in overall DOE staff abilities in educational planning and statistics. DOEs do not develop their own plans. DOEs have become conduits of statistical information rather than true partners in the planning process. Coordination between departments within MoEYS, as well as between MoEYS and donors, remains a major barrier to realizing more comprehensive and cohesive sector-wide educational planning. The current educational planning cycle and system does not allow for sufficient participation of decentralized stakeholders, or proper coordination with donors’ planning processes.

Effective Teaching and Learning

Training Quality

The ETL Working Group, made up of PED Staff, worked in partnership with UNICEF Staff to create the ETL Facilitator Manual and Teacher Logbook used during trainings. Interviews indicated that the process of training is slowly improving; trainers have been provided training on how to train others. The majority of survey participants rated both the ETL training and the trainers as “Good”. UNICEF Staff are very active in supporting training preparatory sessions, attending trainings, and providing feedback as well as sharing ideas. Trainings used a variety of participatory teaching methods, including hands-on practice sessions. The majority of survey participants rated the TTM system as “Good”. Almost 80% of School Directors felt that TTM provided teachers the opportunity to learn new skills as well as learn how to apply them in the classroom. Most of those surveyed agreed that TTM provides opportunities for Teachers to ask questions and learn from each other. A monitoring system is newly in place to support training implementation, with a Monitoring Tool currently in development.

Issues exist that affect training quality:

  • Not every DTMT follows all the training activities in the Facilitator’s Manual clearly or consistently, which yielded inconsistencies during trainings.
  • Most ETL trainers have never trained before and do not know adult learning principles.
  • Teachers who are also TGLs do not receive any additional training to meet their TGL responsibilities.
  • Adding new topics to a training mainly focused on refresher activities for existing topics led to confusion and decreased mastery of the new topics.
  • ETL trainers were not fully competent in the training topics, which affected trainee understanding as well as confidence in the training itself.
  • TGLs are selected and voted on by their grade peers, which does not guarantee that those selected fully understand and implement ETL concepts themselves and are able to support others.
  • The six-day training length did not provide enough practice time for either the training of trainers or of the trainees themselves.
  • No resources were provided for the most recent “refresher course”, which also included new topics.
  • Training sessions do not include clear goals and objectives for each session.
  • Trainers were unable to summarize the main points, or go “off script” and offer alternative examples.
  • There is no way to assess training knowledge, and no system (or identified people) in place to provide targeted follow-up based on training-specific questions.
  • TTM topics for monthly meetings are pre-selected and scheduled by Central PED Staff for the entire school year, which does not support decentralization of topics to meet local needs.
  • Planning for TTM is inconsistent.
  • New DTMTs and TGLs were not provided additional training to support their new roles and responsibilities.
  • There are no guidelines with measurable targets for how many schools and teachers should be monitored, or how often.
  • Monitors at all levels (DTMTs, TGLs, and School Directors) review the same things, resulting in significant overlap.

Knowledge Gain

The majority of staff rated themselves as “confident” in all ETL concepts. Training participants were able to discuss all concepts on a theoretical level. TTMs are highly effective in providing opportunities for teachers to learn new things that improve their teaching.

However, not all promoted staff received training for their additional responsibilities. Staff were confused about the higher-level aspects of all ETL concepts.

Knowledge Application

More than half of all respondents self-reported that they felt confident in both their ability to apply ETL concepts in the classroom, as well as their ability to apply those concepts in different ways. The majority of School Directors indicated that the skills teachers learned how to apply were either “applicable” or “very applicable”. A third of those interviewed were able to provide specific examples of how ETL concepts are applied in the classroom, and how to adapt concepts for higher-level thinking skills and/or different subjects. Grade group discussions during TTM provide opportunities for staff to discuss how to apply ETL concepts. More than half of DTMTs and TGLs rated themselves as “confident” or “very confident” in their ability to train others, and in how to apply those skills in different ways. When used appropriately and consistently, the Teacher Record provides follow-up activities to the ETL training which are then reviewed during TTM. There is a system in place to monitor the quality of school and classroom functioning within the ETL context. UNICEF Staff at both the central and provincial levels are very involved with supporting knowledge application.

Application of ETL concepts was affected by a number of issues:

  • Staff at all levels do not have complete descriptions of their individual responsibilities, or of how their job performance will be rated; there is no way to hold staff accountable.
  • Comparing survey and interview results, staff may have inflated their survey replies when they rated themselves “confident” in applying ETL concepts.
  • More than half of those interviewed found higher-level analytical thinking skills complicated, and were unable to identify how ETL concepts could be used to support those skills in the classroom.
  • Most teachers indicated that making learning aids was difficult for them, and many admitted that they either do not make their own or know other teachers at their schools who do not.
  • Teachers do not use the Teacher Record consistently.
  • ETL concepts are monitored inconsistently, and specific roles and responsibilities for who monitors what were unclear.
  • There is a lack of accountability in monitoring follow-through, and no consequences for those who do not work toward improvement based on monitoring feedback.
  • Monitoring occurs inconsistently; some teachers interviewed stated they were never monitored.
  • There is a lack of coordination between donor projects at the school level.
  • There was little coordination between UNICEF and PED on the finalization of the new curriculum, which caused significant confusion at the school level about how to implement the new curriculum with ETL.
  • The TGL position is being used inconsistently.
  • UNICEF Provincial Staff are not involved with updates to curriculum, methodologies, etc.
  • Barriers to implementation highlighted in survey results and interviews included a lack of financial resources, lack of classroom resources, lack of time, and lack of opportunity to apply what was learned, including a lack of authority/decision-making.

Training Impact

The use of CFS and ETL as a way to influence education reform has shifted from a donor project to National Policy, with a Master Plan. The Royal Government of Cambodia has taken ownership of the program and is actively discussing how to expand it to all primary schools in the country. ETL Manuals and resources were developed under PED ownership with UNICEF input and support. The majority of those interviewed and surveyed felt that ETL was very useful in improving their overall abilities and that the training improved their abilities to implement ETL concepts. Overall, staff perceptions of each other’s improved abilities were rated as “good”. Staff indicated that the most significant changes from the ETL training were both tangible (including classroom arrangement, improved environments, portfolios, and the use of materials) and intangible (including increased student participation; improved classroom management; improved student-teacher-parent-community relations; reduced absences; increased student grade promotion; and improved staff attitudes).

Follow-up and support can be improved, and the support mechanisms between staff can be strengthened. Teacher complaints regarding increased responsibility and workload have not been adequately addressed. There is limited incentive for improvement, and a lack of accountability and of a system of consequences. Staff shortages have required staff to take on multiple roles, limiting their ability to complete all they need to do. There is not enough monitoring. District and provincial staff have made no attempt to begin introducing ETL concepts to schools in their jurisdictions that are not UNICEF-supported, even though they have been trained. There are no opportunities for staff to conduct site visits to observe quality ETL classrooms in order to gain ideas and see concepts in action.


Specific recommendations directly addressing the issues mentioned are discussed in detail in the full report. Overall, the recommendations can be summarized as follows:

  • Individual Capacity – Trainings need to be lengthened to provide more time for hands-on practice of skills, as well as further discussions on the application of skills in different ways. Trainer capacity needs to be further developed to facilitate their analytical skills so they can adequately train and support others. Training materials need to be updated to include specific goals and objectives for every training session in order to clearly state for both trainers and participants not only what is being covered, but the rationale for why, and how skills can be adapted. Continued higher education opportunities should be explored for DOP Staff, at all (central, province, and district) levels of government. UNICEF Staff, especially Provincial Staff, should provide more direct technical assistance. Post-tests should be done at the end of every training, and be used to measure knowledge gain as well as provide a clear record of how individual staff might need direct on-the-job support.
  • Organizational Capacity – Monitoring of job activities needs to be strengthened, as does follow-up support. Job descriptions with measurable responsibilities need to be implemented to hold staff accountable for their roles. While some incentives and per diems support the program, a shift to performance-based rewards and raises should be implemented, linked to job descriptions and measured activities. Internal evaluations of trainings should be done consistently in order to track the effectiveness of the trainings, and be used to adapt trainings as appropriate.
  • Systemic Capacity – Salaries should be revised in order to retain well-trained staff. Funding and coordination issues between the UNICEF and Program Budgets need to be addressed. District planning staff need greater roles and responsibilities in the areas of educational planning. Greater coordination between MoEYS departments is needed to best implement education reform. UNICEF can have a more direct role in providing technical assistance in the areas of educational planning at all levels of government.

Where possible, recommendations have been presented for both short-term and long-term consideration.

1 UNICEF (September 2005)

Full report in PDF

