GEF – UNDP – FAO PROJECT: REDUCING RATE OF LOSS
OF BIODIVERSITY AT SELECTED CROSS-BORDER SITES IN EAST AFRICA.
An Introduction to the Three Planning, Monitoring & Evaluation Consultancy Reports
1. Introduction
This discussion note lays out the rationale and initial findings
emerging from the Regional Monitoring and Evaluation (M&E) Consultancy that
began in 1999. The consultancy,
implemented by Dr Sejal Worah (an ecologist cum project management design
specialist from India), has had three separate inputs:
· March–April 1999
· November–December 1999
· December 2000 – January 2001.
A further input is planned towards the end of 2001.
The note highlights some key issues that were discussed within the
project management teams in relation to project planning, implementation,
monitoring and evaluation. The note is extracted largely from Dr Worah's first M&E report.
Much of the initial discussion focused on issues related to
planning, because it is difficult, if not impossible, to separate
M&E from planning. An effective monitoring strategy requires a clearly
defined hierarchy of objectives against which indicators can be developed.
Clearly defined objectives, in turn, require an effective and participatory planning
process to have taken place at all levels, including the sites where project
activities are to be implemented.
Because of these links between planning, monitoring and evaluation
(PM&E), planning issues and processes were considered in some detail.
Developing a detailed monitoring plan led to a more rigorous assessment
of project objectives because of the need to identify specific, measurable and
reliable indicators to assess progress towards the objectives. Since this was
the first in-depth attempt at developing a monitoring plan for the project, it
led to a substantial review and discussion of the existing objectives. It also
raised the need to make the broadly defined objectives in the existing logframe
more specific so that site-specific monitoring plans could be developed. This
led to a number of discussions related to the participatory planning process
that would help define clear objectives at the site level. These issues are
discussed in detail in the first input and report. They are touched on below.
The overall approach taken throughout the consultancy was that of
capacity building: if the M&E strategy is to be successful, key project
staff should be able to design, implement, assess and continually refine the
strategy. In turn, if this is to be a participatory M&E approach, then the
project staff should be able to build the capacity of partners at the different
sites to enable them to design and carry out site-specific monitoring.
Therefore, training workshops and awareness-raising seminars (at the regional,
national, district and village levels) formed a substantial part of the
consultant’s work in all visits.
2. Consolidation and rationalisation of the logframe
Not surprisingly for a project of this complexity and scope, the
logical framework was itself a complex document: nine pages long, detailing 51
activities related to seven outputs identified through the wider planning process.
This made the project somewhat difficult to communicate to people not familiar
with the framework approach and to people who may not have been a part of the
planning process that led to the development of the logframe. Before discussing
monitoring and indicators, it was important to ensure that the hierarchy of
project objectives was clear and logical and that the project objectives
themselves were expressed in a manner that would enable them to be monitored
easily.
One of the first activities of the consultant, therefore, was to
summarise the logframe in a format that was more “user-friendly” and easily
communicable. The resulting “project map” is attached.
While the primary purpose of the consultancy was not to review the
logframe and the logic inherent in it, some level of review was unavoidable if
relevant indicators were to be defined. This process revealed that parts of the
logframe could be re-organised to further strengthen the hierarchy of objectives
and to make monitoring of objectives simpler.
These proposed organisational changes in the logframe are discussed
below:
i) Clustering activities: At the activity level, the logframe detailed all 51
activities, which included both relatively specific actions (such as ‘conducting a
training needs analysis’ or ‘providing essential equipment’) and much longer,
process-oriented actions (such as ‘developing collaborative management
protocols’ or ‘promoting political support’). This not only made the
logframe unnecessarily unwieldy but also made it difficult to identify
indicators, as different “levels” of activities were all lumped together.
To overcome these problems, the activities were “clustered” under broad
headings that illustrated the overall aim of each “set” of activities.
“Sub-activities” or tasks corresponding to each “activity cluster” were
linked to these (for example a number of activities related to training,
equipment provision, etc. for local agencies were all linked to the overall aim
of ‘developing the capacity of local agencies at each of the sites’). These
activity clusters are laid out at the lowest level in the “project map”.
ii) Re-linking activities and outputs: The process of reviewing the logframe and clustering
activities also helped to clarify the links between planned activities and proposed
outputs. In some cases, similar “sets” of activities needed to be conducted to
achieve different outputs (for example, the collection and analysis of
information for site specific interventions such as the development of
alternative resource use strategies or alternative livelihoods).
These were made more explicit and linked to the specific outputs for the
sake of clarity (B1, B2 and B3).
iii) Breaking up multiple objectives: It was found that some statements, especially
at the activity level, combined multiple objectives within one
sentence (e.g. A1.1 and A2.2). This, again, would make monitoring difficult
because of the confusion and lack of clarity about what was actually being aimed
for. These were therefore broken up into two distinct activity-level objectives.
iv) Adding a component on regional linkages: Although this project is nationally executed,
it has a regional and cross-border component that is an essential part of the
overall strategy. The logframe
focuses mostly on national and site-level objectives while the only explicitly
stated cross-border activity (B4.2) is found under the output that refers to
“externalities” (B4).
It was felt that it might “add value” to the project at a regional level to
make this a more explicitly stated output and develop activities for this so
that more proactive steps could be taken to address cross-border issues and
these could be monitored more effectively.
v) Rethinking indicators and verifiers: Once the revised and summarised logframe had been
developed, it was possible to focus on the indicators and verifiers. Some of the
indicators described in the original logframe merely restated the
objectives and were not really verifiable through the means
described. Some means of verification were actually indicators; others were
not very reliable. All of these needed to be re-discussed and revised. In addition,
indicators had been described for all levels of activities including specific
tasks. This would make the monitoring plan extremely unwieldy, difficult and
time-consuming to implement. It is proposed that process monitoring with
specifically defined indicators takes place at the level of activity clusters
while tasks are monitored mainly by assessing whether they were carried out
effectively or not.
3. Developing a common vision
For a project that aims to work largely through partnerships with both government
and non-government organisations at different levels, it is critical that all
potential partners have a common vision of the project. At different times, all
these partners will be involved in some level of project planning,
implementation and monitoring (and self-review/evaluation) and therefore need to
develop a common understanding on these processes. In particular, since this
project is conceived as an Integrated Conservation and Development Project (ICDP),
it is important that all partners understand basic issues and concepts related
to ICDPs before moving into implementation.
4. “Levels” of the project
This is a regional project that also aims to work at specific sites in
each of the three countries. It therefore operates at multiple levels – a
regional level (East Africa), a national level (Kenya, Uganda, Tanzania), a
“site” level (Same, Kajiado, Rakai, etc.) and a ward, district or village
level (Chome, etc.). To some extent, participatory planning involving
stakeholders at the regional and national levels has already been undertaken and
the wider project has been developed based on these consultations.
Specific interventions to be undertaken at the local level have not yet
been defined because participatory planning at this level is still to take
place. At the start of the consultancy, the project did not know in enough detail
what the site-specific issues were, either in relation to conservation
(values, threats, root causes) or to development (local needs, priorities,
institutions). Developing specific conservation and development interventions
therefore required more detailed information, as well as discussions at the
local level to identify stakeholder groups and to develop strategies for
working with them in the planning and implementation of site-specific
interventions.
A participatory approach to developing site-level plans was therefore adopted,
involving local stakeholders in both planning and implementation. This
approach, however, required more time for consultations with local communities
and partners, and greater skill in participatory planning approaches among the
planning teams.
The plans from the sites would be “nested” within the overall
logframe, with the same wider goals and objectives but more specific activities.
A diagrammatic representation of this is attached.
These issues were discussed with the project teams at the regional,
national and local levels, and with GEF/UNDP. It was agreed that the more
participatory approach was the preferred one.
5. The Monitoring and Evaluation Strategy
Monitoring was not to be seen as an additional, externally driven
activity that creates more work for project managers, staff and partners.
Rather, monitoring is an integral part of project implementation and management
that provides project partners with information that can help improve project
efficiency, effectiveness and accountability. This is an essential component of
decision-making in the complex and uncertain external environment within which
this project is operating. It is important to keep in mind that monitoring is
linked to objectives – it is objectives that drive the selection of indicators
for monitoring and not vice versa. Monitoring is the process by which
information is collated and analysed to assess progress towards defined
objectives.
A preliminary monitoring plan in the form of indicators and means of
verification already existed within the current logframe. These indicators
needed some re-assessment, and the redefined logframe would be used during this
process. It was also important, however, to discuss what these indicators mean: is
the project going to collect information at this broad level, or are these
indicators simply an aggregation of site-specific indicators?
Some indicators, especially quantitative ones, can be aggregated from
site-specific to national levels to give an idea of project progress. Other
indicators, particularly qualitative ones, cannot simply be aggregated to
provide useful information. It was agreed that the site-specific plans should
“drive” the overall project, and that it will probably be necessary to
modify the broad indicators in the overall logframe once the site-specific
action plans and monitoring plans have been developed.
6. Conclusion
The above notes were based on the first planning consultancy. Findings
were agreed by the National and Regional Steering Committees.
The second consultancy gave greater field experience in the use of
planning tools, including broad-based situation analysis, more specific site
threat analysis, stakeholder analysis and objective analysis. These led to the
site action plans.
The third consultancy reviewed the site action plans in the light of
implementation experience and helped provide a strategic refocus with
prioritisation. This led to stronger indicator processes.