
Evaluation report

2000 Global: Managing Knowledge: A Review of the UNICEF Evaluation Database



Author: Kruse, S. K.; Centre for Health and Social Development

Executive summary

Background

UNICEF has an Evaluation Database (EDB) containing summarised information on more than 11,000 evaluations and studies of UNICEF-supported programmes and projects dating back to 1987. The database was meant as a tool to facilitate information sharing within the organisation, support planning and evaluation functions, and build institutional memory on lessons learned. Major functional and technical shortcomings in the accessibility and use of the database have been identified. There is a need for a comprehensive assessment of the database and for better use of recent advances in information technology.

Purpose / Objective

Before making any significant changes, the Evaluation, Policy and Planning Division (EPP) at UNICEF New York decided to carry out a participatory review of the use, scope, content and management of the current database. The purpose of the review is to determine the type of improvements required to best meet the future needs of UNICEF users.

Methodology

Data and information have been collected through a combination of:
- Review of content and functionality of current database
- Review of related policy, guidelines and instructions regarding the updating and use of EDB
- An EDB user survey sent to a broad sample of UNICEF staff at all levels
- Interviews of key informants at HQ
- Visits to one Regional Office (RO), one Country Office (CO) and the International Child Development Centre (ICDC)
- Assessment of features and functioning of evaluation and research databases developed by other partners: UNDP, World Bank and IFAD

Key Findings and Conclusions

What are the Major Problems and Constraints?
Despite the long history of the Evaluation Database, strong support from Board decisions and Executive Directives, and follow-up from the central evaluation office in UNICEF, the EDB suffers from several problems and constraints, such as:
- The database is not as widely known in UNICEF as expected.
- The database is not as widely used as expected at any level of the organisation.
- It is unclear who should use the database and for what purpose.
- The current database is not perceived as user-friendly.
- The database is far from complete; a lot of information is missing.
- The quantity and quality of information in the database vary considerably.
- There is no clear perception of what the database should contain, with blurred distinctions between various types of reports.
- A model for entering data to ensure compliance and maintain quality is not in place.
- The current systems for and culture of evaluation in UNICEF do not support the effective and efficient functioning of a global evaluation database.

What is the Current Status of Compliance with Guidelines and Instructions?
Level of updating: Less than half of all Country Offices updated the database in 1999 (64 of 140), i.e. forwarded lists and documents to UNICEF HQ; this is higher than in previous years (only 49 in 1998).

Data entry is not complete: A recent review of the database from one region showed that evaluation summary sheets were filled in to only a limited extent. The highest completion rate for a summary sheet was 86% and the lowest 14%. The mean completion rate was 55%, but 43% of the evaluation sheets were less than 50% complete. Our survey reflected a low level of satisfaction with the quantity and quality of information.

Few fields are regularly updated: Six variables were filled in by all offices (Year, Sequence number, Type, Thematic, Country, Region) and two categories were not filled in by anyone (RO and HQ comments). Variables for which the report needs to be read ranked low in the completion rate assessment.

Low compliance with procedures: The level of shared institutionalised practice in updating the database is low. For maintaining a high-quality global database, it is not satisfactory that only half of the Country Offices follow clear routines and procedures. 27% of the respondents openly admit that compliance with procedures and deadlines is marginal to poor.

To What Extent and for What Purpose is the Database Used?
The Evaluation Database is not widely used: Almost 80% of the respondents in the survey have never or very seldom used the database: 32% report that they have never tried it and 47% have used it only seldom. There is a small number of active users, mostly M&E and Programme Officers. Utilisation is not driven primarily by demand for what the database can provide, but to a larger extent by Annual Report requirements.

The purpose and use of the database is unclear: The database should serve several purposes. Data from the questionnaire showed that the EDB is most often used for reporting and queries. It is closely linked to annual reporting requirements, and many Country Offices base their contribution on outputs from the database. Use of the database as a tool for policy development and advocacy/public information is less frequent. In practice, it is not feasible to use an incomplete database effectively for annual reporting or for monitoring the M&E function.

Why is the Database not Widely Used?
Limited access: 20% of our sample still lack access to a computer with a CD-ROM drive, almost all of them in Country Offices. The situation has improved, but only recently; HQ and Regional Offices have upgraded their computer equipment faster.

Low awareness: There is a lack of knowledge and awareness about the database (ranked first of eight reasons). Despite the fact that the database has existed since 1993 and has been repeatedly presented and supported in Executive Directives, awareness of the database is low. In an organisation where several initiatives compete for attention, the database comes low on the agenda. It is also perceived as an HQ initiative and said to be maintained because of formal requirements.

Limited data quality and completeness: As mentioned, information is missing from the database, and the existing data are often too general or of limited quality (inaccurate, or not providing a clear abstract of the report). A vicious circle is established: Country Offices find the database incomplete and make little effort to update it, with the result that the next version comes out with even more incomplete information.

The database is not found to be very user-friendly: The survey found that technical problems are not a major reason for not using the database (ranked seventh of eight reasons). The current CD-ROM does not seem difficult to use for most computer-literate people willing to spend a few minutes installing it and learning the search function. The problem is rather that the design is considered far from elegant, attractive and user-friendly by current standards. The speed is low and the flexibility limited. Users clearly express a need for a new software platform for the database.

The systems and culture of evaluation in UNICEF do not support the effective functioning of a database: Technical upgrading of the database is a necessary but not sufficient condition for more effective utilisation of the EDB. Systems and routines of evaluation are only partly institutionalised in UNICEF. Survey data revealed a low interest in M&E work (ranked fifth of eight reasons), combined with a perception that evaluation findings are not used. Interviews also underlined the lack of a strong culture of evaluation in the organisation. The simple argument is that technical improvements will increase use of the database up to a certain level, but a high-quality and broadly used evaluation database is primarily the result of a strong and effective evaluation system, and of an organisational culture that promotes and stimulates the active use of evaluation findings.

Weak staff motivation: The main obstacle to entering data identified in the survey was "limited interest among staff in the office". The serious message to UNICEF is that the main constraint lies in staff motivation. Combined with high scores for limited awareness and support from management, and a perception that too much work is required for entering data, it is understandable that the data entry results are unsatisfactory.

Low satisfaction with the database: It is alarming that 37% conclude that they find what they are looking for in the database to only a marginal or some extent. A tiny fraction is well satisfied, while a group of about 20% is relatively satisfied with what they find. The information in the database meets users' needs only to some extent, and there is only a small group of enthusiastic supporters. A large number of people have little or no trust in the quality of the data, which is a serious challenge for the revision of the database.

It is not suggested to abolish the database, but to abandon the current version and provide a new software platform. Few say they will miss the existing database, but 20% express appreciation for what is available and what has been achieved. There is said to be more in the database than most UNICEF staff seem to believe.

Several other organisations have established their own evaluation databases. We have looked at the experience of UNDP, the World Bank and IFAD. These organisations all have more centralised evaluation functions than UNICEF, making it easier to maintain a unified database; coverage and compliance are not major problems. The databases are also seen as part of the evaluation system of each organisation. All three organisations use their Intranet as the medium of access and, increasingly, the public homepage. UNFPA also distributes its database on CD-ROM. It is also increasingly common to make the full text of reports available on the Internet.

The establishment of these databases has been supported by top-management decisions, and sufficient resources have been made available for software development and presentation. The World Bank and UNDP use the database as an important tool in their performance measurement/management systems. The database is thus more integrated into the overall management of knowledge and information in these organisations.

Recommendations

- The evaluation function in UNICEF should be strengthened.
- Strengthening the key evaluation functions in UNICEF should be the overall purpose: accountability, programme improvement, organisational learning and institutional memory.
- The current database needs to be fundamentally improved.
- Top-level management should provide clear direction and active support to the database.
- The purpose of the database should be clarified.
- The scope of the database content should be reduced.
- Quality control should be introduced.
- The content of the database should be differentiated.
- The relationship with other corporate databases should be developed gradually.
- The long-term aim should be to make the database public.
- Country Offices should enter data and Regional Offices should control it.
- The rules for updating the database should be made stricter and enforced.
- The new database should be promoted in order to strengthen awareness and knowledge of its usefulness.
- A training module should be developed for maintaining and using the database.



Full report in PDF

Report information

Date: 2000
Region: Global
Country: Inter-regional
Type: Evaluation
Theme: Evaluation
Partners:
PIDB:
Follow Up:
Language: English
Sequence Number:
