
Strengthening Monitoring and Evaluation System (SMES) Project

SMES MONITORING AND EVALUATION TRAINING MANUAL

MODULE 4: MONITORING AND EVALUATION WITH LOGICAL FRAMEWORK APPROACH

(version-4)
November 2009
SMES Monitoring and Evaluation Training Manual

Module 4: Monitoring and Evaluation with Logical Framework Approach

Table of Contents

Abbreviations
Overview of Module 4
Introduction: Monitoring and Evaluation with Logical Framework Approach

Chapter 1: Monitoring with Logical Framework Approach
  1-1 What to Monitor?
  1-2 How to Monitor a Project with Logical Framework Approach

Chapter 2: Evaluation with Logical Framework Approach
  2-1 Before Conducting Evaluation
  2-2 Evaluation Procedure
    Step 1 Prepare Evaluation Basis
    Step 2 Make a Detailed Plan for the Evaluation Survey
    Step 3 Collect and Analyze Data
    Step 4 Draw Conclusion

Recommended Readings

Annex 1: Sample Format for Terms of Reference (TOR) for Ex-post Evaluation
Annex 2: Sample Formats - The Community Development and Forest/Watershed Conservation Project
Abbreviations

ADB: Asian Development Bank


DAC: Development Assistance Committee of the OECD
DFID: Department for International Development
GTZ: Gesellschaft für Technische Zusammenarbeit (German Agency for
Technical Cooperation)
JICA: Japan International Cooperation Agency
LFA: Logical Framework Approach
M&E: Monitoring and Evaluation
MDG: Millennium Development Goal
OECD: Organization for Economic Co-operation and Development
PMAS: Poverty Monitoring and Analysis System
PO: Plan of Operation
Q&A: Questions and Answers
RBM: Results-based Management
RF: Results Framework
SMES: Strengthening the Monitoring and Evaluation System (Project)
TOR: Terms of Reference
UNDP: United Nations Development Programme
Overview of Module 4
Module 4 explains methods to monitor and evaluate projects using the Logical Framework Approach (LFA).

Objectives:
- To understand the OECD-DAC five evaluation criteria and their relationship to the Logframe
- To understand the basic components of M&E
- To understand the steps of M&E using the Logframe
- To be able to develop an Evaluation Grid (evaluation questions, data, data sources, data collection methods) for all five DAC evaluation criteria through group work

Process:
- Review of the developed Logframe, and revision by groups if necessary (20 min)
- Lecture on the M&E evaluation criteria, M&E process, components for M&E, and steps of M&E with the Logframe (40 min)
- Instruction and Q&A (10 min)
- Group exercise on developing an Evaluation Grid (80 min)
- Group presentations and Q&A (40-50 min)
- Sharing of good examples of Evaluation Grids (5 min)
- Sharing of examples of the Table of Achievement (5 min)
- Sharing of good examples of evaluation reports (5 min)

Key Words:
- Evaluation Grid
- OECD-DAC five evaluation criteria (Relevance, Efficiency, Effectiveness, Impact and Sustainability)

Resources Required:
- Module 4 (including PowerPoint slides)
- Glossary of OECD-DAC
- Enlarged version of M&E steps by Logframe
- Case study
- Good examples of Logframes for Evaluation
- Good examples of Evaluation Grids
- Example of Table of Achievement
- Good examples of evaluation reports

Outputs Expected:
- Completed Evaluation Grids by groups

Introduction: Monitoring and Evaluation with Logical
Framework Approach

1. Introduction
This module presents how to monitor and evaluate a project using the Logframe. Before going into the details of each step, the link between M&E and the Logical
Framework Approach (LFA) is presented, including an introduction to the OECD-DAC
evaluation criteria, which are adopted by a number of major development agencies, and
the relationship between the criteria and the Logframe. The standard steps and
components of M&E using LFA are then presented in Chapters 1 and 2.

2. The Logical Framework Approach for M&E


(1) The Concept of Monitoring with LFA

When monitoring a project, it is effective to compare the plan with the results by using
the Logical Framework and the Plan of Operation or Activity Schedule. The Indicators and Means
of Verification in the Logframe provide the framework for monitoring. In the context of
monitoring, the Logframe matrix provides:

- A framework of objectives, indicators and sources of information which should be used to develop and implement the monitoring, review and reporting system;
- A list of key assumptions which must be monitored as part of the project's risk management arrangements, and
- A clear and consistent reference point and structure for completing progress reports.

The following figure shows the flow of monitoring using the Logframe.

Figure-1: Flow of Information and Feedback in Monitoring

(Flow: the Logframe and the Plan of Operation are used to understand the current status of the project. Analysis then examines the progress of the project against the plan, examines changes in the conditions surrounding the project, and analyzes factors that impede or accelerate the project. Appropriate action is taken by adopting measures to address the impeding factors and by reviewing the plan.)

Traditionally, monitoring was conducted against an activity chart, and the focus was on
whether the Activities were completed in a timely and appropriate manner. In this way,
however, it is impossible to assess whether results have been achieved. Therefore,
as described in Module 1, it is critical to develop a results-based monitoring system so
that not only implementation but also the level of achievement toward Outputs and the
Project Purpose can be monitored. The Logframe serves as a basis for results-based
management and monitoring because a properly developed Logframe sets clear and
measurable goals with Indicators and Means of Verification, which can be used to
monitor achievement.

Nonetheless, implementation monitoring is as important as results monitoring because
the status of Inputs and Activities needs to be monitored as well. The following figure
indicates the demarcation between implementation monitoring and results monitoring.

Figure-2: Implementation Monitoring and Results Monitoring

(Overall Goal, Project Purpose and Outputs are the subject of results monitoring; Activities and Inputs are the subject of implementation monitoring.)

(2) The Concept of Evaluation in LFA

In evaluating a project, the plan shown in the Logframe and the actual achievements are
compared and analyzed using the five evaluation criteria in order to draw conclusions. Based on
these conclusions, recommendations are made and lessons learned are drawn. The five
evaluation criteria and their relationship to the Logframe are summarized below.

Evaluation Criteria

The five evaluation criteria, which are Relevance, Effectiveness, Efficiency, Impact and
Sustainability, were proposed in 1991 by the Development Assistance Committee
(DAC) of the Organization for Economic Cooperation and Development (OECD) as a
basis for evaluation. The definition and major points to be analyzed under each criterion
are summarized as follows.

Table-1: Evaluation Criteria and Points to be Analyzed

Relevance
Definition: The extent to which the intervention is suited to the priorities and policies of the government and its citizens/the target group.
In evaluating Relevance, it is useful to consider the following questions:
- To what extent are the objectives of the intervention still valid?
- Are the Activities and Outputs of the intervention consistent with the Overall Goal and the attainment of the Project Purpose?
- Are the Activities and Outputs of the intervention consistent with the intended impacts and effects?

Effectiveness
Definition: A measure of the extent to which an intervention attains its Project Purpose.
In evaluating the Effectiveness of an intervention, it is useful to consider the following questions:
- To what extent was the Project Purpose achieved, or to what extent is it likely to be achieved?
- What were the major factors influencing the achievement or non-achievement of the Project Purpose?

Efficiency
Definition: Efficiency measures the Outputs in relation to the Inputs. It is an economic term which signifies that the intervention uses the least costly resources possible to achieve the desired results. This requires comparing alternative approaches to achieving the same Outputs in order to see whether the most efficient process has been adopted.
It is useful to consider the following questions:
- Were Activities cost-efficient?
- Were objectives achieved on time?
- Was the intervention implemented in the most efficient way compared to alternatives?

Impact
Definition: The positive and negative changes produced by an intervention, directly or indirectly, intended or unintended.
When evaluating Impact, it is useful to consider the following questions:
- What has happened as a result of the intervention?
- What real difference has the intervention made to the beneficiaries?
- How many people have been affected?

Sustainability
Definition: Sustainability is concerned with measuring whether the benefits of an intervention are likely to continue after the intervention is completed. Interventions need to be environmentally as well as financially sustainable.
It is useful to consider the following questions:
- To what extent have the benefits of the intervention continued after the intervention was completed?
- What were the major factors which influenced the achievement or non-achievement of sustainability of the intervention?

Please refer to Annex 2 of this Module (Sample Formats - The Community
Development and Forest/Watershed Conservation Project: 2) Sample Evaluation Grid)
to see how to apply the above criteria in a real project evaluation.

Countries and agencies can decide which evaluation criteria to adopt from among the five
criteria, together with any additional criteria of their own, depending on the type and objectives
of the evaluation. For example, in JICA/GTZ practice, both mid-term and final evaluations
utilize all five DAC evaluation criteria, while ex-post evaluations utilize only the Impact and
Sustainability criteria. Multilateral donors such as ADB and UNDP utilize only four
criteria (Validity, Effectiveness, Efficiency, and Sustainability).

Link between Five Evaluation Criteria and Logframe

The following figure illustrates the relationship between the five DAC evaluation
criteria and the Logframe. The colored cells show where to focus when investigating each
evaluation criterion: in analyzing the Relevance of a project, it is necessary to examine
whether the Project Purpose and Overall Goal are set in accordance with the needs of
local communities, target groups and government policies; in examining the
Effectiveness of a project, it is critical to see the degree to which the Project Purpose has
been achieved through the attainment of Outputs; in analyzing Efficiency, the focus is
on the degree to which Inputs were converted into Outputs through Activities; for
Impact, the level of achievement toward the Overall Goal in relation to the achievement
level of the Project Purpose is examined; and in examining Sustainability, the focus is on
whether the benefits obtained by achieving the Project Purpose and Overall Goal will be
maintained, considering the sustainability of Inputs, Activities and Outputs. This
serves as guidance for conducting project evaluation using the Logframe.

Figure-3: Relationship between Logframe and Five Evaluation Criteria

(A matrix with the Logframe levels - Overall Goal, Project Purpose, Outputs, Activities and Inputs - as rows and the five evaluation criteria - Relevance, Effectiveness, Efficiency, Impact and Sustainability - as columns; the shaded cells mark the Logframe levels examined under each criterion, as described above.)

In addition to evaluating from the perspective of the five evaluation criteria,
cross-cutting issues such as environmental sustainability, human rights, gender
equality and the systems of organizations and institutions need to be taken into
consideration, especially when examining the criteria of Relevance, Impact and
Sustainability. These are critical development issues which need to be appropriately
addressed throughout the project management cycle.

Chapter 1: Monitoring with Logical Framework Approach

1-1 What to Monitor?


In conducting monitoring, focus is placed on the following five points:
(1) Progress of Activities - check whether Activities are proceeding according to the Plan of Operation
(2) Achievement of Outputs - gather data on the Indicators of Outputs and check their achievement
(3) Achievement of Project Purpose - gather data on the Indicators of the Project Purpose and check its achievement
(4) Status of provision of Inputs
(5) Status of Important Assumptions and Pre-conditions

In monitoring these points, the tasks to be pursued and the weight to be placed on each
of the five points differ depending on the stage of the project cycle. While (4) and (5)
need to be monitored throughout the project cycle, the weight of monitoring shifts from
(1) to (3) as the project progresses. The different tasks at the different stages
of the project cycle are explained below.

1-2 How to Monitor a Project with Logical Framework Approach


(1) At the Beginning of a Project - Establishing a Monitoring System

To plan and carry out monitoring properly, it is essential to create a monitoring system
at the initial stage of the project cycle. Holding workshops to create the monitoring
system helps all project members become aware of the importance of monitoring and of
their roles in it, and also promotes team-building and ownership of the project among the
members. In a monitoring system, it is necessary to specify the following points:

1) Information to be collected
Indicators of Project Purpose and Outputs; achievement of Activities; the amount of
Inputs already provided; status of Important Assumptions, and other relevant
information inside and outside the project

2) Staffing, timing, method for information collection


Who will collect which information, when and how often the information should be
collected, and how it should be collected

3) Staffing, timing and method for information aggregation
Who will aggregate which information, when and how often the information should
be aggregated, and how it should be aggregated and analyzed

4) Decision makers and timing of decision


Persons or committees that make decisions as to whether or not revisions should be
made to the project plan, and when and how often these decisions should be made

5) Means and timing of feedback

Who should feed back decisions, and when and how they should feed back the decisions

As described above, the system involves various personnel: staff to collect information;
staff to aggregate information, and decision makers. The following chart summarizes
the flow of information and feedback among these personnel.

Figure-4: Flow of Information and Feedback

(Information is collected at several points, aggregated and analyzed, and passed to decision makers; decisions are then fed back to those who collect the information.)

For a sample Monitoring System, please refer to Annex 2 of this Module,
Sample Formats - The Community Development and Forest/Watershed Conservation
Project: 1) Sample Monitoring System.
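
For teams that keep their monitoring plan electronically, the five elements above can also be captured in a simple machine-readable form. The following Python sketch is only an illustration, not an SMES format; all field names and example values (for example the "M&E officer" role and the monthly frequency) are assumptions.

# Hypothetical sketch of a monitoring system specification.
# Field names and example values are illustrative, not an SMES standard.

monitoring_system = {
    # 1) information to be collected
    "information_to_collect": [
        "Indicators of Project Purpose and Outputs",
        "Achievement of Activities against the Plan of Operation",
        "Inputs already provided",
        "Status of Important Assumptions",
    ],
    # 2) staffing, timing and method for information collection
    "collection": {"who": "field staff", "frequency": "monthly",
                   "method": "monitoring sheets and site visits"},
    # 3) staffing, timing and method for information aggregation
    "aggregation": {"who": "M&E officer", "frequency": "quarterly",
                    "method": "compile into a Table of Achievement"},
    # 4) decision makers and timing of decisions
    "decision_making": {"who": "project steering committee",
                        "frequency": "biannually"},
    # 5) means and timing of feedback
    "feedback": {"who": "project manager", "frequency": "quarterly",
                 "means": "progress report and staff meeting"},
}

# Example use: print who collects what, and how often.
for item in monitoring_system["information_to_collect"]:
    c = monitoring_system["collection"]
    print(f"{item}: collected {c['frequency']} by {c['who']}")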

(2) During the Implementation Period - Monitoring a Project with the Monitoring System

In the beginning of project implementation, the focus of monitoring is placed on
the progress of Activities, but the focus gradually shifts toward the achievement of
Outputs. When the project is nearing completion, the focus is on the degree to which
the Project Purpose has been achieved. Some of the key issues to remember in
monitoring a project are summarized below.

Updating the Monitoring System

The monitoring system created at the initial stage is used to collect and analyze
information regularly once the project has started, but it needs to be improved when
necessary. The effectiveness of the information collected, the efficiency of the information
collection methods, and the budget and procedures required for monitoring operations are
some of the factors to be examined continuously in order to ensure the quantity and accuracy
of the collected information.

Monitoring Based on the Five Evaluation Criteria

Although a detailed account of the five evaluation criteria is given in the evaluation
chapter, project implementers need to consider the five criteria in monitoring as well.

Table-2: Monitoring Based on Five Evaluation Criteria

Relevance
- Are there any changes in the government's development policies or in the needs of its citizens/the target group?
- If the Project Purpose is becoming incompatible with the policies and needs, what remedial measures can be taken?

Effectiveness
- Is the Project Purpose expected to be achieved by the end of the planned period? If not, what remedial measures can be taken?
- To achieve the Project Purpose, which Outputs should be reinforced? Which Outputs can be reduced without any negative effect?

Efficiency
- Are Inputs being utilized properly to produce Outputs?
- How can Inputs be held down while producing the same Outputs?

Impact
- Is the project producing any negative effects?
- If so, how can they be resolved or minimized?

Sustainability
- Are the governmental agencies related to the project (ministry, department, regional and district offices, DDC, VDC, etc.) constructing a satisfactory system to sustain the benefits of the project after it is completed?
- If not, what can be done to reinforce the related agencies during the remaining period to ensure sustainability?

Besides the five criteria, it is critical to monitor external conditions such as the policies of
the country and the state of harmonization with related donor agencies, as well as the
implementation process, including financial and procurement management, team building
and the relationship with the target group, regardless of whether or not they are spelled out
in the Logframe. It is also important to monitor a project from the perspective of
cross-cutting issues such as environmental sustainability, human rights, gender
equality and the systems of organizations and institutions.

Analyzing the Data Obtained


A large amount of information produced through monitoring activities can be wasted if
it is not appropriately interpreted, analyzed, presented and used for management. Please
refer to Module 5 for basic approaches to data analysis, including topics such as the
collection of baseline data and the setting of target values.

Tracing and Analyzing Problems


If monitoring discloses gaps between the plan and the actual state, or reveals any other
problems, it is necessary to trace the causes of these problems and find out methods for
changing the course of the project or solving the problems. The vertical logic of the
Logframe may be useful for tracing the causes of the problems. For example, if
monitoring reveals that Project Purpose has not been achieved as planned, it is likely
that the appropriate Outputs are not obtained, or that Important Assumptions for the
Project Purpose are not fulfilled. If there is any problem with producing Outputs, the
cause of the problem may be in Activities and Important Assumptions related to the
Outputs. If it is difficult to identify causes by the vertical logic, check whether or not the
operations are affected by Important Assumptions that are not indicated in the Logframe,
and whether or not there are problems with the way the project is carried out. If you
cannot identify the causes by these analyses, there may be something wrong with the
logic of the Logframe itself.

Box-1: Risk Management

The integration of risk management within monitoring is critical, but commonly
overlooked. The achievement of project objectives is often subject to influences beyond
project managers' direct control. It is therefore important to monitor this external
environment to identify whether the assumptions that have already been made are
likely to hold true and what new risks may be emerging, and to take action to manage or
mitigate these risks where possible. How a project plans to manage risks and their
potential impact therefore needs to be identified and monitored as well. Creating the Risk
Management Matrix below helps identify key risks and strategies to manage them.

Risk Management Matrix

Columns: Potential risks | Potential adverse impact | Risk level (High/Medium/Low)* | Risk management strategy | Responsibility

* The risk level is estimated from the likelihood and the impact.

Risk Monitoring Table

- Risk:
- Has the risk changed since the last review? If so, explain why.
- New risks identified:
- Action being taken to monitor/manage risks:
- Recommended changes to plans or management strategies in respect of project-associated risks:
- Does the Logframe require revision?
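
The note to the matrix says the risk level is estimated from the likelihood and the impact. Purely as an illustration of that idea, and not as an SMES or donor rule, the sketch below combines the two on an assumed 1-3 scale with assumed thresholds; the example risks are invented.

# Hypothetical risk-level scoring; the 1-3 scales, thresholds and example
# risks are illustrative assumptions, not an SMES or donor standard.

def risk_level(likelihood: int, impact: int) -> str:
    """Combine likelihood and impact, each rated 1 (low) to 3 (high)."""
    score = likelihood * impact
    if score >= 6:
        return "High"
    if score >= 3:
        return "Medium"
    return "Low"

risk_register = [
    # (potential risk, likelihood, impact, management strategy, responsibility)
    ("Key counterpart staff are transferred", 2, 3,
     "Train several counterparts and document procedures", "Project manager"),
    ("Seasonal flooding delays field activities", 3, 1,
     "Schedule field work in the dry season", "Field coordinator"),
]

for risk, likelihood, impact, strategy, owner in risk_register:
    print(f"{risk}: {risk_level(likelihood, impact)} risk - {strategy} ({owner})")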

Box-2: Monitoring through Field Visits
Field visits can provide valuable qualitative and quantitative information that cannot be
obtained from written reports or conversations with project officials in the capital. Field
visits can be used to monitor project processes, results and participation, and to obtain a
better understanding of the project's setting. In order to make the most of a short visit, it is
useful to spend some time and effort on planning and preparing for it, for example by
familiarizing yourself with the content of the project through available project
documents, identifying the key issues to be addressed during the visit, and developing a
monitoring checklist. An example of a monitoring checklist for a Maternal Child Health
Clinic Support Project is shown below.

Field Monitoring Checklist - Maternal Child Health Clinics

Name of clinic:            Date visited:
District:                  Visited by:

Questions (answered Yes/No, with comments)
1. Was the Nurse Aide present during the visit? If no, state the reason.
2. Has the Nurse Aide received the new in-service training in the past six months?
3. Are the following equipment/supplies available?
   - Weighing scale for babies and adults
   - Oral rehydration salts
   - Supplies for the expanded immunization program, etc.
4. Are the registers properly maintained, namely:
   - List of clinic attendance
   - Growth charts
   - Age and weight
   - Birth register
   - Food stock register, etc.
5. Are the supply storage facilities adequate? Are they well kept in terms of stacking and cleanliness?
6. Is the Nurse Aide receiving his/her salary on time?
7. Other observations

Revisions to the Plan and Logframe
When making revisions to the plan of a project based on the results of monitoring, the
Logframe is revised if necessary. The reasons for the revisions also need to be noted
down. Revising the Logframe usually requires the project team to prepare a draft of the
revised Logframe and obtain the official approval of the government.
Please keep in mind that revisions need to be made in consultation with related
organizations and stakeholders.
When revising one part of the Logframe, remember that other parts of the Logframe,
the Plan of Operation and the Monitoring System need to be adjusted in accordance
with the revisions made.

Reporting
The results of monitoring need to be shared within the project, including with the target group,
and reported to organizations related to the project or to upper levels of the government.
Project managers decide whether to send the results as emergency information
or to include them in periodic reports, depending on the content, importance and
urgency of the monitoring results.

Chapter 2: Evaluation with Logical Framework Approach

2-1 Before Conducting Evaluation


In order to plan and conduct an evaluation, the basic design of the evaluation needs to be prepared
in advance. The following matters need to be identified in the design:

1) Project or group to be evaluated


2) Purpose of evaluation
3) Method of evaluation
4) Evaluators or evaluation team
5) Expenses
6) Time and period of evaluation
7) What will be reported, how and to whom

In defining the above points, developing Terms of Reference (TOR) for the expected
evaluation/evaluator is useful for those who organize the evaluation. A TOR is a
written document presenting the purpose and scope of the evaluation, the methods to be
used, the standard against which performance is to be assessed or analyses are to be
conducted, the resources and time allocated, and the reporting requirements. A TOR also
defines the expertise and tasks required of a contractor serving as an evaluator, and serves as a job
description for the evaluator. An example TOR for an evaluation is provided in Annex
1 of this module.

Box-3: Different Responsibilities in Evaluation

Evaluation organizers are usually the ones who are in charge of a particular project and
want to have the project evaluated in order to better manage project operations. The
responsibilities of evaluation organizers differ from those of evaluators, who are usually consultants
contracted for the evaluation. Tasks of the evaluation organizers include: 1) preparing
the TOR; 2) appointing evaluator(s); 3) securing the budget for the evaluation; 4) monitoring
the evaluation work; 5) providing comments on the draft; 6) publicizing the evaluation
report, and 7) providing feedback on the results to the concerned parties. The role of the
evaluator includes: 1) preparing the detailed evaluation design; 2) collecting and
analyzing information, and 3) preparing an evaluation report.

2-2 Evaluation Procedure
The box below is a summary of basic evaluation steps, followed by detailed explanation
of each step.

Step 1 Prepare Evaluation Basis
(1) Review existing planning documents (Logframes, Plan of Operation, etc.) and reporting documents (annual, biannual, trimester)
Step 2 Make a Detailed Plan for the Evaluation Survey
(1) Prepare evaluation questions with the Evaluation Grid
(2) Select data collection methods
(3) Examine the details of the evaluation survey
Step 3 Collect and Analyze Data
(1) Collect data
(2) Analyze data
Step 4 Draw Conclusion
(1) Conclude the results
(2) Make recommendations and draw lessons learned
(3) Present and disseminate evaluation results

Step 1 Prepare Evaluation Basis


(1) Review Existing Planning and Reporting Documents

Before conducting any analysis, it is crucial to prepare a foundation for evaluation. For
this purpose, it is important to familiarize yourself with the project by reviewing related
documents such as different versions of Logframe, Plan of Operation (PO), and
previously prepared progress reports (Trimester, Quarterly, Biannual, Annual), reports
from donors, and any specific surveys (Demographic Health Survey, Census, Living
Standard Survey, etc.). Other information related to organizational structure of the
project with decision-making process and information flow among key stakeholders is
also useful.

Step 2 Make a Detailed Plan for the Evaluation Survey


(1) Prepare Evaluation Questions with Evaluation Grid

What is the Evaluation Grid?

The Evaluation Grid is a matrix of evaluation questions, data needed, data sources and data
collection methods. It is used as a tool for evaluation planning and as a
work sheet when carrying out an evaluation. The following table shows what it looks
like.
Figure-5: Evaluation Grid

Columns: Criteria | Evaluation questions | Data needed | Data sources / Data collection methods
Rows: Relevance, Effectiveness, Efficiency, Impact, Sustainability

For an example of an Evaluation Grid, see Annex 2 of this Module, Sample Formats
- The Community Development and Forest/Watershed Conservation Project: 2) Sample
Evaluation Grid.
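
Where evaluation planning is done electronically, the grid can be kept as a simple list of rows keyed by criterion. The sketch below is only an illustration of that idea; the column names mirror Figure-5, the example question and sources are paraphrased from the case study, and the format itself is an assumption, not a prescribed SMES template.

# Hypothetical representation of Evaluation Grid rows; the structure is
# illustrative only. Column names follow Figure-5.

evaluation_grid = [
    {
        "criterion": "Sustainability",
        "question": ("Will GoN support the Project after the termination "
                     "of Japanese support?"),
        "data_needed": "Policy and plan of the government regarding the Project",
        "data_sources": ["National Development Plan",
                         "Counterparts at the Ministry of Forests and Soil Conservation"],
        "collection_methods": ["document review", "key informant interviews"],
    },
    # ... further rows for Relevance, Effectiveness, Efficiency and Impact
]

# Example use: list the planned questions criterion by criterion.
for row in evaluation_grid:
    print(f"{row['criterion']}: {row['question']}")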

How to Devise Evaluation Questions

Evaluation questions represent what one wants to know through the evaluation. Starting
from a general question, such as "Was the project meaningful?", more specific questions
should be developed to make the evaluation study operational - for example, in the case of
The Community Development and Forest/Watershed Conservation Project, "Did Ward
Conservation Committees develop the skills to manage their community resources
management system by themselves?"

Evaluation questions can be developed according to the five evaluation criteria. For
example, in the case of a terminal evaluation of the Mathematics and Science Teachers
Training Project, one question for determining the Effectiveness of the project could
be: "Was there any improvement in teachers' teaching methods as a result of the
project?" This question can be further broken down until one can imagine exactly what
data should be collected.

The following figure shows how to break down a main question.

Figure-6: How to Devise Evaluation Questions

Main question: "Was the project implementation valid?"

Break down into more specific questions:
- Relevance: Is there a need for improving math and science education?
- Effectiveness: Have teachers' teaching methods improved?
- Efficiency: Was the cost of developing a curriculum acceptable?
- Impact: Have students attained a certain academic achievement?
- Sustainability: Has the teacher training system been sustained?

These are then broken down further into more specific questions.

It is not necessary for one main question to cover all the five criteria, and evaluation
questions should not be automatically set based on the five evaluation criteria. What is
important is to develop the main questions that would give useful answers for
improving the project. Thus, emphasis among the criteria could be different. Evaluators
can prioritize questions according to the needs of the project.

Table-3: Main Evaluation Questions according to the Five Evaluation Criteria

Relevance (to examine the justifiability of, or necessity for, project implementation)
Necessity:
- Does the project match the needs of the target area or society?
- Does the project match the needs of the target group?
Priority:
- Is the project consistent with the country's development plans?
Relevance as a means:
- Is the project strategy producing impact?
- Are the selected target groups and the size of the target considered appropriate?
- Is the project relevant from the equality point of view?

Effectiveness (to examine project effects)
- Is the Project Purpose specific enough?
- Has the Project Purpose been achieved (or is it going to be achieved)?
- Did the achievement result from the Outputs?
- Is there any influence of Important Assumptions on the attainment of the Project Purpose?
- What are the hindering/contributing factors for Effectiveness?

Efficiency (to examine project efficiency)
- Was (or is) the cost of Inputs justified by the degree of achievement of Outputs and Project Purpose (this can be compared with similar projects)? Were there any alternatives that would have achieved the same level at lower cost? Could (can) a higher level of achievement be expected at the same cost?
- What are the factors that inhibit or contribute to the Efficiency of the project implementation process? (e.g., Were the timing, size and quality of the Inputs delivered appropriate? Is there any influence of Important Assumptions and Pre-conditions?)

Impact (to examine the project's effects, including ripple effects, in the long term)
- Has the Overall Goal been achieved (or is it going to be achieved)?
- Did (or does) the achievement of the Overall Goal result from the Project Purpose?
- Is there any influence of Important Assumptions on the attainment of the Overall Goal?
- Is there any unexpected positive or negative influence, including ripple effects? (Influences to be examined include: 1) influence on policies; 2) economic influence on the target society, implementing agency and beneficiaries; 3) influence on organizations, related regulations and legal arrangements; 4) influence on technological innovation; 5) influence on gender equality, human rights, disparities between the rich and the poor, and peace and conflict, and 6) environmental influence.)

Sustainability (to examine sustainability after the termination of an intervention)
- Are the expected effects described in the Project Purpose and Overall Goal going to be sustained after the termination of the intervention?
- What are the factors that inhibit or contribute to the appearance of those sustainable effects? (Factors to be examined include: 1) political support; 2) capacity, personnel allocation, budget and decision-making process of the organization; 3) regulations and the legal system; 4) financial support; 5) technologies and equipment; 6) negative influence on social and cultural aspects, and 7) influence on environmental aspects.)
- Was (or is) the ownership of the implementing agencies and related ministries assured?

(2) Select Data Collection Methods

When selecting data collection methods for each question, it is crucial to check the
methods against the following three criteria: 1) reliability of the data, 2) accessibility
of the data, and 3) cost of obtaining the data. Based on the assessment against these criteria, the
data collection methods to be adopted are proposed and agreed by the evaluation
team. The following table is an example of a checklist from the case study, The
Community Development and Forest/Watershed Conservation Project. In the case
below, after comparing the three criteria (Reliability, Accessibility, Cost),
records of the District Soil Conservation Office and project records were selected as the
final data collection methods, while interviews with residents in hill areas were not
selected because of their low reliability, poor accessibility and high cost.

Table-4: Example of a Checklist to Decide Data Collection Methods

Evaluation Question: Is the Impact of the Project - "The natural environment is improved in hill areas in Nepal through active management of community resources by the community people" - likely to be achieved?

Candidate data collection methods (each rated for Reliability, Accessibility and Cost using the legend below):
- Records of the District Soil Conservation Office - final selection: Yes
- Project records/documents - final selection: Yes
- Interviews with residents in hill areas - final selection: No

Rating legend:
1. Reliability of the information: ++ very high, + high, - low, -- very low
2. Accessibility of the information: ++ very easy to access, + easy to access, - difficult to access, -- very difficult to access
3. Cost of the collection method: ++ very economical, + within budget, - costly but possible if there is no other option, -- over budget

Details of various data collection methods will be presented later in Module 5.
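
The same screening can be mirrored in a small script that turns the ++/+/-/-- legend into numbers and keeps methods whose overall score is positive. This is only a sketch of the idea; the numeric values, the cut-off and the ratings assigned to each method are assumptions for illustration, not the case study's actual assessment.

# Hypothetical screening of candidate data collection methods.
# The numeric mapping, cut-off and per-method ratings are illustrative only.

RATING = {"++": 2, "+": 1, "-": -1, "--": -2}

candidates = [
    # (method, reliability, accessibility, cost)
    ("Records of the District Soil Conservation Office", "+",  "+",  "+"),
    ("Project records/documents",                         "++", "++", "++"),
    ("Interviews with residents in hill areas",           "-",  "-",  "--"),
]

for method, reliability, accessibility, cost in candidates:
    total = RATING[reliability] + RATING[accessibility] + RATING[cost]
    selected = "Yes" if total > 0 else "No"  # assumed cut-off for illustration
    print(f"{method}: score {total} -> final selection: {selected}")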

(3) Examine the Details of the Evaluation Survey Plan

Once the evaluation questions and the methods of collecting information have been selected, it is
necessary to review them from the following perspectives in order to raise the accuracy
and efficiency of the questions and data collection methods.

1) Validity of the evaluation questions
Can conclusions for the five evaluation criteria be drawn from the responses to the questions?

2) Importance of the evaluation questions
How important will the responses obtained be in making judgments about the five evaluation criteria?

3) Reliability of the information
Given the available sources of information and the methods of collecting it, how reliable will the obtained information be?

4) Accessibility of the information
Can the required information be easily accessed?

5) Cost
Is the cost required for obtaining the information appropriate?

Step 3 Collect and Analyze Data


(1) Collect Data

Based on the prepared Evaluation Grid, information is collected using the selected
data collection methods. The following table shows an example of an Evaluation Grid
with collected information. Whenever possible, it is desirable to have data disaggregated
by gender, age or socioeconomic group in order to examine the effects and influences
of the project on various groups. Details about data collection methods and data
collection will be presented in Module 5.

Table-5: Sample Evaluation Grid with Collected Information - The Community
Development and Forest/Watershed Conservation Project

Criterion: Sustainability - Policy
Question: Will GoN support the Project after the termination of Japanese support?

Data needed: Policy and plan of the government regarding the Project and its approach
Data sources: 1. National Development Plan; 2. Counterparts at the Ministry of Forests and Soil Conservation (central level)
Results: As community development and environment conservation are two of the country's priorities, which are stipulated in the government's documents, it is likely that the political support will continue.

Data needed: Likelihood of incorporation of the project approach as an activity of the Department of Soil Conservation and Watershed Management (DSCWM) and the District Soil Conservation Office (DSCO)
Data sources: 1. District Development Plan; 2. Counterparts at DSCWM and DSCO
Results: There are no concrete plans to incorporate the approach as an activity of DSCWM and DSCO at the moment. However, there is a possibility and a will to adopt the approach in the near future if the budget is secured.

(2) Analyze Data

Collected data are organized and analyzed based on the five evaluation criteria:
Relevance, Efficiency, Effectiveness, Impact and Sustainability. The above table, an
Evaluation Grid with project achievements/collected information, helps
organize and analyze the data based on the five criteria, the evaluation questions and the
project's original plan. Details about data analysis will be presented in Module 5.

Step 4 Draw Conclusion


(1) Conclude the Results

Based on the results of analysis, value judgments need to be made for each evaluation
criterion. At the same time, influential factors that affected the results should be
analyzed as well. There are two steps in this process of concluding the results: 1)
making value judgments about a project according to the five evaluation criteria, and 2)
drawing a conclusion based on those judgments.

1) Value judgments according to the five evaluation criteria
The first task is to evaluate the project against the five criteria and to specify the factors
that produced the evaluation results. For example, in evaluating the Effectiveness of a water
supply project, you can judge that Effectiveness is low if the data show that 60% of
villagers can access safe water as a result of the project, against a target value of 80%.
You then need to analyze the factors which inhibited the achievement of the objective.
When explaining hindering or contributing factors, specific evidence needs
to be presented in order to ensure the credibility of the evaluation.
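
The water-supply example reduces to simple arithmetic: compare the achieved indicator value with its Logframe target. The sketch below only restates that calculation; the verbal judgment remains with the evaluation team, as described above.

# Arithmetic behind the water-supply example above: achieved value versus
# the Logframe target. The judgment itself is made by the evaluation team.

target_pct = 80.0    # target: 80% of villagers with access to safe water
achieved_pct = 60.0  # observed by the evaluation survey

achievement_rate = achieved_pct / target_pct * 100   # 75% of the target
shortfall = target_pct - achieved_pct                # 20 percentage points

print(f"Indicator achieved {achieved_pct:.0f}% against a target of "
      f"{target_pct:.0f}% ({achievement_rate:.0f}% of target, "
      f"{shortfall:.0f} points short)")
# In the example above, the evaluators judge this as low Effectiveness and
# then analyze the hindering factors behind the shortfall.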

2) Conclusion based on the judgments

To draw a conclusion, the evaluator has to make a value judgment from a
comprehensive viewpoint, considering all five criteria. For example, for an ex-ante
evaluation, the evaluation team decides whether it is valid to conduct the project and
whether the contents of the plans are appropriate. In the case of a terminal evaluation, the
team judges whether the project has been successful and whether the assistance should be
terminated. The team also has to provide evidence from the results to support the judgment.
In stating the conclusion, it is necessary to refer to: 1) revisions of the project and the future
plan, 2) common factors and problems underlying the project, and 3) evaluation
criteria that are particularly noteworthy.

Box-4: Results-based M&E with LFA and the Results Framework (RF)

A properly developed Logframe supports RBM at the project level because it requires
setting clear and measurable goals together with Objectively Verifiable Indicators
and Means of Verification. The RF does the same, usually at the program, sector or
country level, because it takes a broader view, being linked with program, sector or
national goals which are usually achieved jointly with other partners. Thus, the RF is
useful to ensure assessment of achievement towards higher goals such as the MDGs.
Within a results-oriented environment, whether with the LFA or the RF, the emphasis of M&E
is placed on the following issues:
- Active learning through application of M&E information to the continuous improvement of strategies, programs and other activities;
- Monitoring of progress towards development results instead of just Inputs and implementation processes;
- Conducting timely M&E instead of conducting it only as an ex-post activity;
- Conducting M&E as a joint exercise with development partners.

(2) Make Recommendations and Draw Lessons Learned

Based on the conclusion, recommendations are made and lessons learned are drawn.
Recommendations include specific measures, suggestions and advice on the target project
for those concerned in the implementing agencies. Lessons learned are fed back to
similar projects that are being implemented or will be implemented in the future.

In drawing a conclusion, examining the project not only against the five evaluation criteria
but also in terms of causal relationships and the project implementation process is useful for
formulating recommendations and drawing lessons learned.

(3) Present and Disseminate Evaluation Results

Evaluation results should be reported in a simple and clear way, using figures and other
means to facilitate readers' understanding, and appending reference materials.
The following is an example of the cover page and outline of an Evaluation Report.

1) Example of Cover Page

XX Project
Evaluation Report
National Planning Commission
2066

2) Example of Report Outline

Overview: Executive Summary; Evaluation Framework
Main Part: Methodology; Background of the Project; Evaluation Results in the Evaluation Criteria; Conclusions; Recommendations; Lessons Learned
Annexes (Logframe, Evaluation Grid, etc.)

Several tips for preparing a report are given below:

- Avoid a redundant report. Keep the length of the main part to around 30 to 40 pages. Be sure to include a summary of the evaluation results.
- Write the report using specific expressions in a simple manner, emphasizing the issues to be conveyed. Avoid using technical terms too often.
- Use tables and figures in an appropriate and simple manner so that readers can grasp the messages in the data.
- State the limitations of the evaluation study.
- Provide the grounds for the judgments made on the survey results.
- Stipulate the sources of quoted data.
- Place the Evaluation Grid, the contents of the questionnaire, and the collected data in an appendix.

Box-5: Participatory Monitoring and Evaluation

In participatory projects of recent years, monitoring and evaluation is carried out not
just to manage the progress of the project, but also to increase its impact by
developing the M&E capacity of the organizations and beneficiaries that the project supports. In
addition to holding interviews with beneficiaries and carrying out questionnaire surveys,
it is necessary for the evaluators to involve the beneficiaries in the process of evaluation
and to feed back the results. Applying participatory approaches in monitoring and
evaluation has the following advantages:

1. Beneficiaries' participation in monitoring and evaluation promotes their ownership of project activities.
2. The transparency of the monitoring and evaluation process is improved.

The level of participation by beneficiaries differs depending on the purpose of the
monitoring and evaluation. It is necessary to consider the following in advance:
- The meaning and effect of taking a participatory approach
- Who will make judgments on the results of monitoring and evaluation
- How to revise the plan based on the results

CONGRATULATIONS! THIS IS THE END OF MODULE 4.
Please check whether you understand the contents of this module by
answering the following checklist.

CHECKLIST
- What are the OECD-DAC five evaluation criteria?
- What do we need to monitor?
- What is a Monitoring System?
- What kinds of elements are necessary in a Monitoring System?
- What do we need to identify/confirm before conducting an evaluation?
- What are the four steps of evaluation?

Recommended Readings:

- DFID, Tools for Development - A handbook for those engaged in development activity, 2002
- DFID, Essential Guide to Roles & Tools, 2005
- DFID, Guidance on Evaluation and Review for DFID Staff, 2005
- European Commission, Aid Delivery Methods, Volume 1: Project Cycle Management Guidelines, 2004
- GTZ, Results-based Monitoring: Guidelines for Technical Cooperation Projects and Programs, 2004
- GTZ, Guidelines on the Preparation of TZ Projects and Programmes
- JICA, JICA Guideline for Project Evaluation: Practical Methods for Project Evaluation, 2004
- JICA, Handbook for Selecting Outcome Indicators: A Guide to Practical Evaluation of Technical Cooperation, 2005
- JICA, Textbook for the Area Focused Training Course in Forum on Institutionalization of Evaluation System, 2007
- World Bank, Ten Steps to a Results-Based Monitoring and Evaluation System, 2004

Annex 1: Sample Format for Terms of Reference (TOR) for
Ex-post Evaluation [1]

Ex-post Evaluation Study on (project title)

1. Project Background and Evaluation Purpose

The (name of the project) commenced in (month/year), aimed at achieving (Project
Purpose) for the (beneficiary group) in (region/area), and was completed in (month/year).
Summarise the background of the project.

(Number of years) have passed since the completion of the above project, and (Your
Office) calls for an ex-post evaluation of the project. The purpose of this ex-post
evaluation is to feed back the lessons learnt for future policy and program
planning and to promote government accountability.
Explain any special expectations for undertaking this evaluation and using the
evaluation results.

2. Main Evaluation Questions

(Number of consultants) consultant(s) will be hired to undertake the ex-post evaluation
study. The evaluation is expected to verify the important issues relating to the project's
Impact and Sustainability observed three years after project completion. More
specifically, the evaluation will seek answers to the following main evaluation
questions:

(Sample questions relating to Impact)

a) To what extent has the project's Overall Goal been achieved since the time of the
terminal evaluation?
b) What positive and negative impacts has the project produced besides what was
originally intended?
c) Among the positive changes made, how has the project implementation empowered
the target group economically and socially? Has the project contributed to the
improved institutional capacity of the implementing agency?
d) What negative changes have been brought to the beneficiaries, including
minority and vulnerable groups? Has the project affected environmental and social
development negatively?
e) Are there any external factors that influenced the achievement of the project's
Overall Goal?

[1] The evaluation organizer of the Government of Nepal can decide the evaluation criteria to be adopted depending on the types and objectives of the evaluation. In JICA/GTZ practice, both mid-term and final evaluations utilize the five DAC evaluation criteria, and ex-post evaluations utilize the Impact and Sustainability criteria only.

(Sample questions relating to Sustainability)

a) Is the project organisation maintaining the benefits brought about as a result of
achieving the Project Purpose and Overall Goal?
b) How likely are the project outcomes to be maintained?
c) What are the factors that contribute to, or inhibit, the sustainability of the project
outcomes, such as the appropriateness of project planning, the technology
transferred, and external factors?

3. Suggested/Required Evaluation Methods

The evaluation will be carried out in (target area/region) and include a site visit to (area
where the survey is undertaken) to consult with (target group), (project stakeholders) and
(implementing organisation).
Provide adequate information on the site and size of the survey.

The consultants are responsible for identifying specific evaluation methods for data
analysis and data collection. It is suggested that the actual inquiries use methods which
can assess both quantitative and qualitative measurements of the changes. In principle,
the selection of data sources and data collection methods should be cost-effective, simple
and efficient.
Indicate preferred methods of analysis and data collection.

(Your Office) requires that all evaluation studies present recommendations and lessons
learnt in the final report. (Recommendations should document practical and specific
suggestions to improve the project that is subject to evaluation, whereas lessons learnt
present specific suggestions for the formulation of future projects in a similar context.
This is the JICA definition; you can modify it according to your office's practice.)

4. Consultant Qualifications
The evaluation study will be conducted by (number) local consultant(s). The following
experience and expertise are sought for the assignment:
1. Knowledge of the DAC evaluation criteria and the Logical Framework
2. Demonstrated experience in conducting similar evaluation studies
3. Familiarity with your office's (or the donor's) project management and evaluation
4. Knowledge of general social survey methods
Specify survey skills or technical knowledge to be required, if necessary.

5. Implementation Schedule
The study is scheduled to commence on (day/month/year) and to be completed by
(day/month/year). Consultant services will be required for (number of weeks) during
the period.
Explain any requirements for additional commitment, including revisions of the
deliverables.

6. Deliverables
The Consultants will prepare: (a) an Evaluation Grid, and (b) an Evaluation Report. The following
deliverables are to be presented to (Your Office) by (date/month/year).
(a) An Evaluation Grid is to be prepared within (number) days of the first meeting with
JICA and to be presented by (date/month). The Consultants will be requested to
modify their evaluation planning if (Your Office) finds it inappropriate given the
specified evaluation purpose and expectations.

(b) The Consultants will also submit an Evaluation Report, which presents the main
findings, supporting data, and recommendations/lessons learnt. The report should
be concise and no longer than 15 pages in A4 size. Graphic presentation of data
is recommended wherever applicable. (Number) copies in printed format and
(number) in digital format are to be submitted by the agreed date. The report should
include:
1. Scope of the evaluation study
2. Project overview
3. Evaluation methods used
4. Results of the evaluation
5. Conclusions
6. Recommendations
7. Lessons learnt
8. Annexes (Logical Framework/Evaluation Grid)

7. Remuneration
(Your Office) will pay consultant fees within the Government of Nepal remuneration
scale.
Specify the upper limit of the contract payment to the Consultant, if possible.

The Government of Nepal practices competitive bidding when granting its private
contracts. The Consultants will be asked to submit their price quotation to your office
by (date/month).

Specify the items to be shown in the cost estimation.

Annex 2: Sample Formats - The Community Development
and Forest/Watershed Conservation Project

1) Monitoring System Sheet

2) Evaluation Grid

