
EASIER SAID THAN DONE

A REVIEW OF RESULTS - BASED MANAGEMENT

IN MULTILATERAL DEVELOPMENT INSTITUTIONS

Michael Flint


This document is an output from a project funded by the UK Department for International Development (DFID). The author is grateful to all those in DFID, UNDP, UNICEF, UNIFEM, IDB, and the World Bank who assisted with this study. The views expressed in this report are those of the author alone, and are not necessarily those of DFID.

Results-based approaches have continued to develop in the year since work started on this report. The information in this report does not therefore necessarily reflect the current situation in the institutions covered.

Michael Flint & Partners Wernddu

Pontrilas

Herefordshire HR2 0ED United Kingdom


CONTENTS

Summary

1. Introduction

2. What is results-based management?

3. The history of results-based approaches

4. Strategic planning

5. Monitoring and reporting

6. Managing

7. Issues in results-based management

8. Conclusions

Annexes

A. Strategic planning – country level
B. Strategic planning – corporate
C. References


ABBREVIATIONS AND ACRONYMS

ARDE Annual review of development effectiveness
AROE Annual report on operations evaluation
ARPE Annual report on projects in execution
ARPP Annual review of portfolio performance
CAS Country assistance strategy
CCF Country co-operation framework
CN Country note
CIDA Canadian International Development Agency
CPIA Country policy and institutional assessment
CPO Country programme outline
CSP Country strategy paper
DAC Development Assistance Committee
DER Development effectiveness report
DFID Department for International Development
IDA International Development Association
IDB Inter-American Development Bank
IMEP Integrated monitoring and evaluation plan
MDG Millennium Development Goal
MDI Multilateral development institution
M&E Monitoring and evaluation
MTSF Medium term strategic framework
MTP Medium term plan
MTSP Medium term strategic plan
MYFF Multi-year funding framework
OECD Organisation for Economic Co-operation and Development
OED Operations Evaluation Department (World Bank)
OVE Office of Evaluation and Oversight (IDB)
PCR Project completion report
PRSP Poverty reduction strategy paper
QUAG Quality Assurance Group (World Bank)
RBM Results-based management
ROAR Results orientated annual report
SBP Strategy and business plan
SRF Strategic results framework
TAPOMA Task force on portfolio management
UNDP United Nations Development Programme
UNICEF United Nations Children’s Fund


SUMMARY

1. The purpose of this report is to present a comparative study of the practice of results-based management in a sample of five multilateral development institutions: the United Nations Development Programme (UNDP); United Nations Children’s Fund (UNICEF); United Nations Development Fund for Women (UNIFEM); Inter-American Development Bank (IDB); and the World Bank. The report is based on a review of documents and a limited number of interviews with head office staff in mid-2002. As such, it does not claim to be definitive, nor necessarily fully up to date with developments since then.

2. The terms ‘results’ and ‘results-based management’ (RBM) are used in different ways in different institutions. Section 2 of the report provides some introductory definitions. Results are taken to include outputs, outcomes and impacts, but with an emphasis on outcomes and impacts. RBM is similar to, but not synonymous with, performance management.

3. All five institutions are, to a greater or lesser extent, engaging with results-based management. All have made a commitment to increase their results-focus. All have taken steps to improve, or are working on improving, the planning and reporting of results. As befits their different histories, mandates and cultures, there is enormous variety in their approaches to, and progress on, the four main components of RBM: strategic planning, monitoring, reporting and managing (using).

4. The introduction and implementation of RBM within large institutions is never quick and easy, as is shown by experience in the public sector in OECD countries. The introduction of RBM to international development agencies is even more challenging. Four particular issues can be identified:

• developing country capacity
• attribution
• aggregation
• incentives

5. Results-based management is the latest in a very long line of efforts to improve the measurement, monitoring and reporting of effectiveness. This is not to diminish its potential significance. Thinking about development in terms of outcomes and impacts, rather than inputs, activities and outputs, is a powerful idea that has major implications for how multilateral development institutions operate.

6. Five conclusions emerge from this study:

i. Results-based management is easier said than done, particularly for development institutions, and particularly given the new emphasis on country and global results. Institutions should not underestimate the challenge.

ii. Multilateral development institutions work through and with developing country governments to realise and measure results. This presents development agencies with a double challenge: introducing RBM both internally and within partner country governments. One without the other is unlikely to succeed. Greater support for the introduction of RBM in developing countries, and associated public sector reform, is essential.

iii. External accountability is driving much of the recent push for RBM. This needs to be accompanied by a greater emphasis on using results information for internal management.

iv. RBM in development co-operation has to face up to the challenge of attribution. For all practical purposes, development agencies have little option but to manage for outcomes in the medium to long term, but to manage by outputs, indicators and other measures of performance (e.g. partnership, strategy and process) in the short term.

v. Multilateral development institutions need to work to amend their internal incentive structures in favour of results. This implies working to correct the continued bias in favour of inputs and activities, as well as giving substance to results-based budgeting. Resources and recognition need to flow to those individuals, units, sectors and countries with the best record of managing for, and delivering, results.

7. Finally, this study has implications for those supporting and monitoring the progress of results-based management within multilateral development institutions. Assessing the quality and extent of management change is not a straightforward task. Increasing support for the introduction of RBM will need to be accompanied by a more sophisticated approach to its monitoring.


1. Introduction

1.1 Recent interest in results-based management in multilateral development institutions is the product of two related developments. The first was the definition of, and agreement on, global development goals. This process started in the mid-1990s, and culminated in the endorsement of the Millennium Development Goals in September 2000 by all 189 United Nations member states. The significance of this event is that, for the first time ever, all development agencies have a common set of results to which they are working, and against which their collective performance can be judged. This focus on results was confirmed at the United Nations Conference on Financing for Development in Monterrey in March 2002, and is matched by a broad consensus on development partnership and aid effectiveness. One key feature of this consensus is the emergence of the country as the primary unit of account.

1.2 The second development has been the drive to improve public sector performance in OECD member states. One response in many countries has been the adoption of results-based management (RBM) by public sector agencies, including those responsible for development co-operation. OECD countries are the major donors to the multilateral development institutions (MDIs). It was therefore only a matter of time before the MDIs themselves were influenced to embark upon a similar process of reform. This began to happen in the late 1990s. References to results and results-based approaches have become increasingly common among MDIs as a consequence. However, these references often mean different things in different institutions.


Study objectives

1.3 The purpose of this report is to present a comparative study of the practice of results-based management in a sample of UN development agencies and multilateral development banks. This was originally intended as background to a DFID-sponsored workshop on RBM. Outline conclusions on the value of RBM as currently practised, and the reforms needed to realise its full potential, were expected. In the event, DFID decided not to hold a workshop, in part because of the similar World Bank-sponsored workshop in June 2002.

1.4 Eight institutions were originally selected for study. With the agreement of DFID this was reduced to five:

- The United Nations Development Programme (UNDP)
- The United Nations Children’s Fund (UNICEF)
- The United Nations Development Fund for Women (UNIFEM)
- The World Bank
- The Inter-American Development Bank (IDB)

1.5 The consultant was asked to document and comment on the following aspects of RBM for each institution: the length of experience; changes made over time; organisation, effectiveness and timeliness; quality of information; commitment of operational staff; use made by management; and the quality of reports. This proved to be a hugely ambitious undertaking. RBM is a management approach, not a simple technical instrument. There is a huge difference between how it is meant to work on paper, how it is said to work, and how it actually works. Understanding RBM basically means understanding how these institutions are managed, both in head office and in the countries where they operate. This was clearly impossible in the time available (25 days in total). Each of these institutions alone would require this much time to do it justice. Useful meetings were held with all the institutions involved, but these could not really do more than scratch the surface. The result is a report that is inevitably more superficial than was originally intended, and which concentrates more on generic issues than on institutional specifics.

1.6 The report begins with a discussion of the key terms: results and results-based management (section 2). Section 3 contains a brief history of RBM in each institution. Sections 4-6 cover the main elements of RBM: planning, monitoring, and managing. The report ends with a discussion of the main issues in implementing and monitoring RBM.


2. What is Results-Based Management?

2.1 Results-based management can mean many different things. The area is bedevilled by different definitions. What one institution calls an ‘outcome’ is another’s ‘output’, ‘intermediate outcome’, or ‘impact’. Without agreement about what exactly RBM is, it is very difficult to assess or monitor its implementation. Some discussion of what these words mean is therefore required at the outset.

Results

2.2 The recent OECD DAC glossary of key terms defines a result as ‘the output, outcome or impact of a development intervention’ (Box 1). While this is the definition used in this report, it should be noted that it is broader than the definition used by some of the leading exponents of results-based management. According to the Treasury Board of Canada, a result is ‘the end or purpose for which a programme or activity is performed ... and refers exclusively to outcomes’.1 ‘Outcome’ in this usage covers both effects and impacts - but not outputs - and may be immediate, intermediate or final.

2.3 This is a potentially important distinction. A key feature of RBM is the requirement that managers look beyond inputs, activities and outputs, and instead focus on outcomes. Some would argue that to see outputs as results is therefore to weaken this fundamental shift in orientation towards outcomes. Others argue that outputs are results, and that the important feature of RBM is the link between these and changes at outcome level.


Box 1 – Results

OECD DAC definitions2:

Result: the output, outcome or impact of a development intervention.

- Output: the products, capital goods and services which result from a development intervention.

- Outcome: the likely or achieved short-term and medium-term effects of an intervention’s outputs.

- Impacts: positive and negative, primary and secondary long-term effects produced by a development intervention.

- Effect: intended or unintended change due directly or indirectly to an intervention.

2.4 The other key feature of a result is that it should represent attributable change resulting from a cause-and-effect relationship. In other words, there has to be a reasonable connection, or at least a credible linkage, between the specific outcome and the activities and outputs of the agency. If no attribution is possible, it is not a result.

2.5 The accepted way of linking inputs to outcomes, and of demonstrating attribution, is via a logical framework or results chain. An example of such a results chain is shown below. Inputs are immediately measurable and under the control of the MDI. Activities hopefully follow soon after the provision of inputs, but are dependent on the commitment and actions of the government and other development partners. The outputs, and even more so the outcomes, that result are generated after a lag of several years, are subject to many exogenous factors, and are only partly attributable to the inputs provided by the MDI. Impacts are even more time-lagged, subject to multidimensional causation, and are extremely difficult to attribute to one MDI.3

IMPACTS: Lower infant mortality
OUTCOMES: Reduced infection
OUTPUTS: Immunisation coverage
ACTIVITIES: Immunisation programmes
INPUTS: Finance and skills

2.6 It follows that the requirement for describable or measurable attribution presents a real challenge for development agencies as they move from projects to programmes, and as their focus shifts to shared country and global outcomes, as exemplified by the Millennium Development Goals (MDGs). The issue of attribution, and its implications for RBM, will be returned to later (para.7.5).

Results-based management

2.7 The OECD DAC defines results-based management as ‘a management strategy focusing on performance and the achievement of outputs, outcomes and impacts’. This is a wide definition. In addition to the reference to outputs, the definition also mentions performance. In doing so, OECD DAC is not implying that RBM is the same as performance management. Performance should include measures of process and efficiency, not just results. RBM is just one, albeit significant, approach to performance management.

2 Glossary of key terms in evaluation and results based management. OECD DAC (2002).

3 Measuring Outputs and Outcomes in IDA Countries. International Development Association. February 2002.

2.8 Another definition of RBM is provided by the Treasury Board of Canada. This rightly defines RBM as a comprehensive management approach which emphasises outcomes throughout the programming cycle. As will be discussed later, RBM implies and requires fundamental changes in organisational culture and incentives.

Box 2: Results-based management

A comprehensive, life-cycle approach to management that integrates business strategy, people, processes and measurement to improve decision-making and drive change. The approach focuses on getting the right design early in the process, implementing performance measurement, learning and changing, and reporting performance.

2.9 The application of RBM varies from country to country, and from agency to agency. However, there are four core elements to most RBM approaches4:

1. Strategic planning: defining clear and measurable results and indicators, based on a logic model or framework.

2. Monitoring: measuring and describing progress towards results, and the resources consumed, using appropriate indicators.

3. Reporting: internally and externally, on progress towards results.

4. Managing: using results information (and evaluation) for lesson-learning and management decision-making.


2.10 The experience and thinking of the five multilateral institutions with respect to these four elements is considered below, having first briefly outlined the history of results-based approaches in each.


3. The history of results-based approaches

3.1 This section documents the history of results-based approaches in the five institutions reviewed: UNDP, the World Bank Group, the Inter-American Development Bank (IDB), UNICEF and UNIFEM. The practice of RBM needs to be considered at three main levels5:

• Project

• Country

• Corporate

3.2 In the context of development co-operation, RBM at the project level has the longest history and is the best documented6. Work on introducing RBM at country and corporate level is much more recent. It is at these levels that the real challenge for RBM lies. This report will accordingly concentrate on RBM at country and corporate level.

3.3 This does not mean that RBM at project level should be ignored, for two reasons. First, despite the shift to a non-project development paradigm, projects still dominate the aid landscape7. Second, RBM is most applicable, and least problematic, at the project level. Despite this, the application of RBM and logical frameworks to projects has not been particularly successful. The limited success of RBM in the much simpler environment of projects should, at the very least, give pause for thought. This issue is discussed further below (section 7).

5 RBM is also applicable at a fourth, cross-cutting, level: the sector.

6 OECD DAC (2001) op cit.

7 ‘Development Cooperation and performance evaluation: the Monterrey challenge’. OED, World Bank Working Paper. June 2002. See also DFID DER.


3.4 It is important to emphasise that the degree to which RBM has been applied, or is claimed to be, is not necessarily correlated with effectiveness. The fact that most of the institutions have not yet adopted and implemented RBM in a formal sense does not mean that they are not implementing parts of the approach at some levels. It certainly does not mean that they are not producing development results.

United Nations Development Programme (UNDP)

3.5 UNDP has made the strongest commitment to RBM. It is the only institution of the five to have begun to implement RBM as an organising principle at all levels, and is the most advanced of all the UN agencies. Further advances have been made since the information on which this section is based was collected.8

3.6 UNDP’s advanced status has two origins. The first was the pressure of declining core funds in the 1990s. UNDP knew that it had to change if it was to recover the confidence of the donor community. In 1997 UNDP initiated a set of change management processes, known as UNDP 2001. The UNDP change process emphasised, among other things, the need for the organisation to become more results-orientated9.

3.7 In parallel, UNDP’s Evaluation Office (EO) had been working on developing results-based monitoring and evaluation policies, methodologies and tools. In 1997 EO commissioned a joint study with SIDA on results management10, and produced a handbook on results-orientated monitoring and evaluation for programme managers11. In 1998 EO was given lead responsibility for developing a framework for the measurement and assessment of programme results. This step initiated the introduction of RBM in UNDP12 and led to the Multi-Year Funding Framework (MYFF) in 1999. The MYFF was a four-year funding framework (2000-03) encompassing a Strategic Results Framework and a resource framework that integrated all financial allocations.

8 In a response to a draft version of this report, UNDP stated that this report does not take account of the many, more recent advancements UNDP has made in internalising RBM.

9 Annual Report of the Administrator for 1997. UNDP (1998)

3.8 Since then, UNDP has been working to ensure that “assessing and reporting on results is not a minority preoccupation but a way of doing business for the organisation as a whole”.13 Having been piloted in ten countries, RBM was introduced worldwide in only one year, with the first Results-Orientated Annual Report (ROAR) produced in 1999. Strategic choices were made to learn from others; to learn by doing; to tailor RBM to UNDP; to keep the system as simple as possible; not to over-invest in indicators; and to manage for (not by) results. The result is an approach that is still being adapted, but which has been mainstreamed throughout the organisation and its instruments. The next generation of RBM software is currently being introduced.

Box 3: UNDP’s Results-Based Management System

Planning Instruments:
• Strategic Results Framework
• Integrated Results Framework
• Multi-Year Funding Framework
• Country Office Management Plan

Reporting Instruments:
• Results-Orientated Annual Report
• Multi-Year Funding Framework Report
• Country Office Management Plan Report

11 Results-orientated Monitoring and Evaluation: a Handbook for Programme Managers. UNDP (1997)

12 Results Based Management – Overview and General Principles. UNDP.


United Nations Development Fund for Women (UNIFEM)

3.9 The history of results-based approaches within UNIFEM was not easy to discern on the basis of published documents and a single interview. In common with all the institutions in this study, the notion of results is not new to UNIFEM. In the Consultative Committee (CC) Report for 1997 UNIFEM reported on the introduction of RBM concepts into its programme. This work was initiated with support from the Canadian Government. UNIFEM’s Strategy and Business Plan (SBP) for 1997-99 also clearly listed the results that were to be achieved, and the SBP for 2000-03 includes a results framework which lists expected outcomes and indicators.

3.10 Since 1998 the CC report has used a results orientated format for reporting against the SBP. By virtue of its close association with UNDP, UNIFEM was influenced by the UNDP 2001 change process and by the introduction of RBM in that organisation. UNIFEM uses the Results and Competency Assessment developed by UNDP, and has an interface with the UNDP ROAR. However, UNIFEM has also been exploring, and been influenced by, the RBM approaches of other multilateral and bilateral agencies.

3.11 According to the recent Report of the Executive Director, each of UNIFEM’s three programming objectives “is measured and driven by a results-based framework designed to create a learning and knowledge based institution”.14 However, it is acknowledged in the same report that “new monitoring and evaluation mechanisms are needed, with greater focus on assessing progress towards results than completion of activities”. Thus, while UNIFEM has certainly become more results-orientated since 1997, the introduction of results-based management tools and internal support has some way to go.


United Nations Children’s Fund (UNICEF)

3.12 UNICEF is proof that a results-orientation is not necessarily new. While the use of the term RBM may be new, UNICEF has been practising large parts of the approach for at least twenty years. One of the best examples was the child survival campaign launched in 1982. By insisting on strategic action, measurable results, and clear accountability, the then Director of UNICEF (J.P.Grant) spearheaded extraordinary improvements in child survival and development over the following decade.

3.13 Over the last few years UNICEF has recognised the need to define more clearly the results it seeks to achieve. In 1996, a new Mission Statement was approved. This was followed by a Medium-Term Plan (MTP) for 1998-2001. Although containing a statement of priorities, these were numerous and wide-ranging, and were not mainstreamed within UNICEF. The MTP also lacked clearly defined targets against which to measure achievement. Significant progress was nevertheless made over the MTP period in achieving a stronger results-focus in programming and reporting, and in moving towards a more strategic approach.

3.14 In 2000, UNICEF produced a Multi-Year Funding Framework. This was seen as an opportunity to strengthen results-based management within the organisation. Analytical reporting on results linked to objectives and budget was identified as a core element of the framework. The Executive Director’s Annual Report in the same year was the first to use a results-based format.

3.15 Most recently, UNICEF has produced a Medium-Term Strategic Plan (MTSP) for 2002-2005, with results-based management as one of its guiding principles. This represents a clear shift towards results-based programming that goes beyond identifying broad goals and requires that specific results for children be identified, measured regularly, and systematically reported.


UNICEF also recognises that its evaluation function needs strengthening15. As the MTSP put it:

“UNICEF must establish its organisational priorities, define objectives, define the criteria of success for its work, strive to achieve its objectives, systematically monitor progress (or lack of it) and evaluate its work so it may learn how to maintain relevance, effectiveness and efficiency: this is results-based management”.16

The World Bank

3.16 The World Bank has been working to increase its results orientation for the past ten years. In 1992, the World Bank was criticised by the Wapenhans Report for giving more attention to the quantity of its lending than to its quality: a product of the so-called “approval culture”. The World Bank responded with a concerted effort to improve its focus on quality and results. In 1993 the World Bank issued a new plan entitled “Getting Results: the World Bank’s Agenda for Development Effectiveness” and initiated the “Next Steps” reform programme. In 1994 “Learning from the Past, Embracing the Future” was published, with a ‘results orientation’ as one of its six guiding principles.

3.17 This was followed by the “Renewal Process” in 1996. A raft of general and sector-specific performance monitoring indicators, and the logical framework, were introduced. In the same year, the Quality Assurance Group (QUAG) was established to improve, and allow management to keep track of, project design (quality-at-entry) and supervision. This added a significant quality element to the traditional measures of lending approvals (number and amount), project performance (projects at risk), and ratings of closed projects (outcome, sustainability, and institutional development impact).

15 Report on the Evaluation Function in the Context of the Medium-Term Strategic Plan. UNICEF (2002)


3.18 1997 saw the launch of the “Strategic Compact”. The Compact aimed to make the World Bank “more effective and efficient in achieving its main mission - reducing poverty” and included a commitment to “building a performance assessment system and to making management more performance based”. This led to further improvements in performance measurement and management, and to some increase in results-orientation. For example, the 1998 Annual Report on Operations Evaluation (AROE) concluded that while RBM had not been formally adopted – as had been recommended by the AROE in 1997 - “operations are moving in that direction”.

3.19 A similar judgement was made in 2001. The Strategy Update Paper summarised the situation in the following way:

“We also are much more explicitly focusing on results, particularly on how we can better measure, monitor and manage to achieve them. We have come a long way in developing measures of operational inputs and their quality, and these have helped us to make a steady improvement in Bank performance over the last several years. We now need to ratchet up our results focus, doing more to measure and explain how our work makes a difference in terms of country outcomes.”17

3.20 Recent IDA-13 and Monterrey discussions have given renewed impetus to the search for better ways of monitoring country outputs and the contribution to country outcomes. Most of the improvements in the 1990s were aimed at improving the quality of the design, implementation and monitoring of projects. The World Bank accepts that more needs to be done to increase its results orientation, particularly in areas other than projects. Improvements are planned in the planning and monitoring of country programmes, as well as for sector and thematic strategies. Further work on implementing the results agenda with respect to corporate reporting, staff incentives and training, and risk management is also underway.18

17 Strategy Update Paper for FY03-05: Implementing the World Bank’s Strategic Framework. Executive Summary p.i. March 2002.

Inter-American Development Bank (IDB)

3.21 The IDB has not experienced the same level of external pressure for reform and results, and the associated permanent management revolution, which has characterised the World Bank over the last decade. However, concern about the results-focus of the IDB has followed a broadly similar history19.

3.22 As with the World Bank, recent efforts to increase the results-focus of the IDB originated from a critical review of the Bank’s portfolio. In 1993 the Task Force on Portfolio Management (TAPOMA) found that the focus on the initial approval of projects and the subsequent control of execution took the focus away from managing for development results. It concluded that a concern for results needed to be paramount.

3.23 The IDB Board and management endorsed this shift of focus and responded in the mid-1990s with a series of improvements to the way projects were designed and monitored. The overall aim was to promote “a results-orientated dialogue among Bank staff, executing agencies, and national counterparts” and an increased results-focus in project design, monitoring and reporting. Improvements included the requirement for logical frameworks, impact indicators, and project completion reports based on data on outcomes or impacts. The new US Administration’s emphasis on results throughout 2001, and internal changes in the Office of Evaluation and Oversight, gave fresh momentum to RBM within IDB.

18 Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. September 2002.

19 This section draws extensively on the Development Effectiveness Report. Office of Evaluation and Oversight. IDB. February 2002.


3.24 Despite the increasing commitment of management to results, IDB accepts that it has some way to go. A recent report by the Office of Evaluation and Oversight concluded that IDB projects “are still not being designed and monitored so as to transparently demonstrate development results”. Further improvements to project and country results frameworks are under consideration, as are changes to the incentive framework to help sharpen the IDB’s focus on results and development effectiveness.20


4. Strategic planning

4.1 This section considers the extent to which strategic planning at the country and corporate level has a results-focus. For institutions that are implementing RBM, strategic planning should be about planning to achieve outcomes: management for results. Plans should contain clear, realistic and attributable results; defined indicators specifying exactly what will be achieved by when; a results chain or logic model linking inputs, activities, outputs and outcomes; and a strategy or strategies explaining how and why inputs will lead to outcomes, including a discussion of risk.

Country-level planning

4.2 All the institutions are, to a greater or lesser extent, struggling with three challenges. First, to align their programmes more explicitly to the country’s own plans, such as the Poverty Reduction Strategy Paper (PRSP). Second, to raise the sights of their programmes from the project level to the country level. And third, to define better country-level results frameworks.

4.3 Annex A contains a summary assessment of country planning documents. While most of the institutions now specify country-level results of some sort, none has developed logical frameworks for country programmes as a whole. UNICEF comes closest with its programme-level Integrated Monitoring and Evaluation Plan (para. 4.11). Unlike the others, UNIFEM plans regionally and sub-regionally rather than at country level. The regional and sub-regional programmes are developed within the framework of the Regional SBP. Logical frameworks are a requirement at the programme level.


4.4 The 2002 Development Effectiveness Report (DER) provides a frank assessment of country-level planning in the IDB. It found that, “with few exceptions, Bank programming does not establish ex-ante any specific results that it is seeking to obtain in working with an individual country”. IDB country programmes described project-level outputs rather than country level outcomes. The one area of activity where the IDB had anticipated outcomes was structural reform. The experience in this area shows very clearly the importance of an outcome-focus. IDB projects have been “very successful in producing the output of reform, but these reforms did not produce the outcome of growth in productivity”.

4.5 The three Country Papers reviewed21 support this conclusion. The Strategy Matrix is not a logical framework. Overall objectives are stated, but these are very general and are not accompanied by any indicators or targets. The Bank’s strategy then consists of priority areas, activities or focuses under each objective, often referencing specific IDB programmes. The performance benchmarks for the strategy are a mix of selected program outputs and country outcomes. Examples from the Country Paper for Chile are contained in Box 4. In all the Country Papers reviewed, the link between strategy outputs and country outcomes is not specified.

Box 4 : IDB Strategy Matrix – examples
Objective: Poverty reduction, human capital formation and social inclusion
Strategy: Improvements in execution of preschool education programs (Early Childcare Program)
Performance benchmark: Recovery of net enrollment ratios in rural primary school to at least 87% by the end of 2003

4.6 Recent IDB guidance recognises the importance of including in Country Papers (and distinguishing between) indicators that can be used to monitor progress specific to the Bank’s programme (ie. outputs), as well as the Bank’s contribution to country outcomes22. This is very much work in progress, but one idea is to juxtapose IDB programme outputs with associated country outcome targets.

4.7 Better anchoring of the Country Assistance Strategy (CAS) in the country’s specific priorities and objectives is central to the World Bank’s increased focus on results at country level. According to the Annual Review of Development Effectiveness (ARDE) for 2001, results would be improved if CASs included a logical framework (and results chains) linking Bank instruments with country objectives. In 2002 the World Bank reported that ‘results-based CASs’ are to be piloted in several countries. These will identify country outcomes (from the PRSP or similar) to which the World Bank will contribute, along with intermediate indicators linked to particular products and services that the Bank will provide23.

4.8 Current World Bank CASs24 are similar to the IDB Country Papers. Most include a Country Program Matrix detailing the main objectives or priorities, and the Country Strategy/Key Actions. Progress benchmarks or targets for each main strategy/action are given, but these refer to country outcomes rather than Bank outcomes or outputs. As with the IDB, country strategies do not yet contain clear results frameworks or chains, nor “clear, monitorable indicators for evaluating the development effectiveness of the Bank program”25. This is not to imply that this is easy to do. As recognised in an IDA-13 paper, part of the answer may lie in the identification of early indicators of output performance which have good eventual linkages to country outcome objectives26.

22 Country Paper Guidelines. IDB. February 2002.
23 World Bank (2002) op cit, p. 10.
24 Pakistan (2002), Chile (2002), and Belarus (2002).
25 Ten Features of a Good CAS. http://www.worldbank.org/html/pic/cas/tenfeat.htm


4.9 Recent UNDP Country Programme Outlines (CPOs) include a results and resources framework.27 This lists intended outcomes and outputs (with indicators) within ‘strategic areas of support’. Examples from the Malaysia CPO are contained in Box 5. Note that the UNDP outcomes are lower level outcomes (less ambitious and more attributable) than those specified by UNICEF, IDB or the World Bank.

Box 5 : UNDP results framework – examples
Strategic area of support: Sustainable human development
Intended outcome: National policies more effectively address the social impact of economic liberalisation
Indicator of outcome: Explicit analyses of the impact of global liberalisation on human resources development integrated in key national plans and policies
Output: Increased capacity to assess and predict human development needs and to monitor in relation to competitiveness

4.10 This is an advance on earlier UNDP Country Cooperation Frameworks (CCFs). These had merely listed the areas of support and made no mention of results.28 More recent CCFs had listed ‘key results’ under each strategic area of support, but had not distinguished between outcomes and outputs, nor included indicators29.

4.11 UNICEF Country Notes (CNs) describe overall objectives for the 5-year programme, plus specific objectives for each programme (eg. to reduce infant and child mortality by 25%). No overall results framework is presented for the country programme. However, a very detailed Integrated Monitoring and Evaluation Plan (IMEP) is then developed for each of the constituent programmes (eg. health, early education, etc.).

27 India CPO (2002); Malaysia CPO (2002). UNDP consider that the Malaysia CPO is not a good example from the RBM perspective.
28 Mongolia CCF (1997)

Examples from the Health Programme IMEP for Malawi (2002-06) are contained in Box 6.

Box 6 : UNICEF Integrated Monitoring and Evaluation Plan – examples
Overall objective: To create a conducive environment to realise rights to survival, development, protection and participation of children and women.
Program objective: To eliminate or decrease the major killers of children in UNICEF impact areas
Specific objective: To improve access to, and the quality of, healthcare at health facilities
Output: Health workers at health facilities trained in IMCI case management and obstetrical care
Baseline: 10%
Target: 80%
Critical assumption: Adequate number of qualified staff available

Corporate planning

4.12 There is a tension between corporate and country level objectives. Allowing priorities to be set at country level reduces the extent to which institutions can develop corporate-level results frameworks. In most cases this tension is resolved by restricting corporate planning to the definition of broad goals, priorities and principles. Few institutions have attempted to develop ex ante results frameworks at corporate level. A summary of the corporate plans for the five institutions is contained at Annex B.

4.13 UNDP is well aware of the tension between top-down and bottom-up planning, but has gone further than any of the other institutions in determining a corporate results framework. The Strategic Results Framework (SRF) for 2000-03 lists 7 goals, 24 sub-goals, 142 outcomes (with indicators), and 84 ‘strategic areas of support’. Box 7 contains examples from the SRF.


Box 7 : UNDP Strategic Results Framework – examples
Goal: To create an enabling environment for sustainable human development.
Sub-Goal: Strengthen capacity of key governance institutions for people-centred development and foster social cohesion
Strategic area of support: Reform and strengthen the system of justice, including legal structures and procedures
Intended outcome: Independent and efficient system of justice, accessible to all strata of the population, in particular the poor.
Indicator: Number of countries in which there has been a decrease in time required for disposal of civil and criminal court cases

4.14 In practice, only the goals, sub-goals and strategic areas of support are used to guide country-level programming. Outcomes and outputs are determined at country level. The utility of the corporate-level outcomes within the SRF is therefore unclear.

4.15 UNIFEM’s Strategy and Business Plan for 2000-03 followed a similar structure to that of the UNDP SRF, but without the sub-goals. 120 outcomes and indicators were listed. As with UNDP, no means of verification were given for the indicators. An example is given below.

Box 8 : UNIFEM Strategy and Business Plan – examples
Objective: Increase options and opportunities for women.
Thematic area: Economic empowerment and rights
Strategic area of support: Strengthening women’s economic capacity, rights and sustainable livelihoods as entrepreneurs, producers and home-based workers
Intended outcome: Reduction in the number of women in poverty through participation in viable economic activities


4.16 UNIFEM has subsequently revised and strengthened its results framework. In 2001 UNIFEM introduced a results indicators framework. In 2002, it consolidated the numerous outcomes of the SBP into a more logical and focused Outcome Framework with indicators and suggested means of verification. This reduced the number of outcomes to 48.30

4.17 The UNICEF Medium-Term Strategic Plan (MTSP) lays out 5 ‘organisational priorities’ (plus targets and indicators) and 89 ‘core intervention areas’. Each of the organisational priorities is related to relevant long-term international goals, such as the MDGs. Examples from the MTSP are given below.

Box 9 : UNICEF Medium Term Strategic Plan – examples
Organisational priority: Fighting HIV/AIDS
Long-term international goals: UN Special Session on HIV/AIDS Declaration of Commitment
MTSP target: By 2005 ensure that national policies, strategies and action plans are under implementation to prevent parent-to-child transmission of HIV in all countries affected by HIV/AIDS
Indicator: Number of countries with national strategies and action plans under implementation

4.18 The IDB and World Bank have not yet attempted to develop corporate plans to this level of detail. The IDB Institutional Strategy merely sets out four priority areas31. Two overarching objectives – poverty reduction and social equity, and environmentally sustainable growth – were added after a long debate. The Bank’s contribution to these objectives is to be measured through its contribution to country-level outputs and outcomes. Sector strategies are in the process of being finalised.

30 How are we doing? Tracking UNIFEM progress in achieving results for management and learning. Briefing Note. UNIFEM (2002)
31 Renewing the Commitment to Development: Report of the Working Group on Institutional Strategy. IDB (1999)

4.19 The World Bank takes a similarly minimalist approach to corporate planning. The Strategic Framework Paper acknowledges that the MDGs frame the World Bank’s strategy and provide a results-based framework for the international community.32 However, no attempt is made to specify global outcomes or outputs for the Bank. Rather, the aim is to maximise the impact on poverty reduction through greater selectivity within countries, across countries and in global programmes. The main focus of planning and activity will remain at country level, but with strong corporate guidance on principles and practice. As the Strategic Framework observed, “given the tension between ‘bottom-up’ country driven needs and more ‘top-down’ imperatives, this is inevitably a difficult and iterative process".33

4.20 This demonstrates the main difference between the UN agencies and the multilateral development banks (MDBs). All the UN agencies have, to a greater or lesser extent, defined global goals and outcomes. The challenge for all of them will be to show that these are monitorable and attributable. The MDBs have (so far) avoided global results frameworks, and have instead concentrated on strategy in broad support of the MDGs. This is now changing as all institutions feel the pressure to deliver and demonstrate results at the country and global level.

32 Strategic Framework. WBG, January 2001.


5. Monitoring and reporting

5.1 This section should be as much about monitoring as about reporting. Not everything that is monitored is reported, or needs to be. However, the limited duration of this study meant that little information could be collected on monitoring per se. Time constraints also meant that no country-level reports were examined.

5.2 The distinctions between monitoring and reporting, and between internal and external reporting, are important. Many institutions are under pressure to report externally on results. While this is important for accountability, internal reporting to management, and monitoring more generally, are arguably at least as important. RBM is intended to improve both management effectiveness and accountability.

5.3 It is also important to stress that accountability for results implies more than just reporting results. Many results (eg. outcomes) will not be attributable to a single institution. Because of this, reporting needs to demonstrate several things:

i. that the agency is managing for outcomes, not just activities and outputs;
ii. that improved outcomes are being achieved;
iii. that the agency has contributed to these outcomes;
iv. that the design and implementation of the results strategy is sound and effective;
v. that the results over which the MDI has a significant degree of control, and is aiming for, are being achieved.

(34)

Annual reporting should concentrate on short-term results that show meaningful change over the reporting period; are attributable to the interventions being supported; and bear a significant relationship to longer-term objectives.34

5.4 None of the reports reviewed yet approaches this standard. Most concentrate on the second and last tasks – reporting on outputs and outcomes – but analyse neither the strength of the link between the two nor the effectiveness of the management strategy.

5.5 The UNDP Results Orientated Annual Report (ROAR) represents the most ambitious and comprehensive corporate results report. The third ROAR (2001) presents key findings for each of the six SRF goals, together with in-depth analysis of three selected sub-goals. Aggregated global figures for the percentage of annual outputs fully or partially achieved, and the percentage of outcomes where there was positive progress, are presented in the text, together with the number or percentage of country offices active in each area. Comparative figures are sometimes given for achievements in the previous year. One of the general observations made is that “there is still a sizeable gap ... between impressive results at the output levels achieved within each goal and their contribution to realising outcomes”.35

5.6 The ROAR process includes an independent assessment of the extent to which the self-reported results from the country offices are accurate and complete. 71% of progress statements were fully verified, and a further 9% were partially verified. The verification did not extend to the degree to which UNDP outputs contributed to progress at the outcome level. The ROAR does not claim that UNDP is solely responsible for such progress, but simply reports changes in outcomes that are “clearly linked” to UNDP support.

34 Results-Based Management and Accountability for Enhanced Aid Effectiveness. A Reference Paper. CIDA Policy Branch. July 2002.


5.7 While the ROAR is clearly a great advance on previous reporting, it lacks transparency in two respects. First, there is no single table showing coverage and achievements by goal and sub-goal for 1999, 2000, and 2001. It would be possible largely to create such a table by extracting the figures for 2000 and 2001 from the 67 pages of text, but the fact that the data is not presented in an accessible format is strange. The ROAR badly needs a straightforward summary. Second, although activities by goal and country are tabulated in an annex, there is no presentation of the achievement of outputs and outcomes for each country office. This is a deliberate decision36, and may reflect a judgement that country-specific results would be misleading given the wide variation in results, projects and countries.

5.8 For the last two years the Evaluation Office of UNDP has prepared a Development Effectiveness Report (DER). This is largely based on independent evaluation studies, and complements the ROAR by providing summary findings on the impact and sustainability of UNDP interventions at project and country level. The relative paucity of empirical data on the development impact of UNDP’s assistance was noted in both of the last DERs.

5.9 UNICEF has used a results matrix to report on its Medium-Term Plan (MTP) since 1999. The results cited are a mix of global outcomes to which UNICEF made some contribution, or a description of what UNICEF has supported (ie. activities). There is no assessment of what UNICEF has directly achieved in terms of outputs, either against what was planned for the year in question or over the four years of the plan. A comparison of the results matrix for 1999 and 2001 does not allow any conclusion to be drawn as to whether UNICEF is more or less effective than it was, or whether its contribution is growing or shrinking. The better definition of intended results in the MTSP for 2002-05 is likely to lead to improved reporting.

5.10 UNIFEM’s Strategy and Business Plan (SBP) for 1997-99 included a detailed list of activities under each objective. The new SBP for 2000-03 included a report on the previous plan that lists the specific and general results achieved. However, it is not possible to match the results with the activities originally listed in the SBP for 1997-99.

5.11 The annual Report of the Executive Director mentions that “implementation of each of [the SBP] objectives is measured and driven by a results-based framework”, but does not report against the intended outcomes listed in the SBP for 2000-03.37 This is done in the Consultative Committee Report, which is an annual report of results against the objectives of the SBP, based on data from the 6-month and annual reports submitted from each Sub-Regional Office.

5.12 The IDB prepares an Annual Report on Projects in Execution (ARPE) for the Board of Executive Directors. This provides detailed information on the status and performance of the Bank’s portfolio, including an assessment of the extent to which ongoing projects in each country are likely to achieve their development objectives. In addition, the ARPE provides an assessment of trends and challenges, the issues affecting portfolio performance and notes the Bank’s response to these challenges. In the last two years the report has contained an analysis of the quality and compliance rate of Project Completion Reports (PCRs), has provided information on good practices noted, and highlighted lessons learned from both the Bank and Borrowers.


5.13 The Bank is in the process of revamping the PCR and the Project Performance Monitoring Report (PPMR). The PPMR has been modified to include historical project ratings, as well as greater attention to financial and sustainability issues and lessons learned, and will be linked to other relevant reports and monitoring systems. The last PPMR will also serve as a key input for the preparation of the PCR, which will focus more on results and comply with OECD/DAC guidelines for MDBs. It will include an evaluation of both Bank and Borrower performance, an assessment of the project’s contribution to institutional development, and an outlook on expectations regarding the project’s ability to deliver benefits in the medium and long-term.

5.14 Like UNDP, IDB’s Office of Evaluation and Oversight (OVE) has also produced a Development Effectiveness Report. However, unlike in UNDP, OVE is independent of management. One of the findings reported in the IDB DER was that, for completed projects rated as highly likely to achieve their development objectives, the majority of PCRs only discuss project outputs. Although it is quite likely that all the projects made some contribution to outcomes and impacts, this was very rarely documented in the PCR38.

5.15 According to Operations Evaluation Department (OED), the monitoring and evaluation (M&E) in World Bank operations has been “chronically deficient”. The Annual Report on Operations Evaluation (AROE) for 2000-01 went on to say that, “despite indications of increasing operational quality and project performance, the Bank does not have a solid foundation to convincingly demonstrate results on the ground”.39 According to one source, the Bank is still ‘years away from the systematic measurement of results’.

38 Development Effectiveness Report. IDB (2002) pp.29-31


5.16 Part of the problem lies in the lack of monitorable outcomes in Country Assistance Strategies (CAS), Sector Strategy Papers, and Project Appraisal Documents. These and other problems have been the subject of a comprehensive M&E action plan since 1999. Further improvements in M&E – such as a CAS completion report – are underway. The methodological challenges associated with measuring and attributing results are also very real.

5.17 The Annual Review of Portfolio Performance (ARPP) produced by the Quality Assurance Group (QAG) is the Bank’s primary operational monitoring tool. At present this focuses on design and supervision quality, rather than results. However, there are plans to broaden the ARPP into an Annual Report on Portfolio Performance and Results. Subject to a satisfactory solution to the problem of aggregation, there are also plans for units to report annually on “outputs and outcomes related to real-time actions”, but not “program and country outcomes that will be realised only after long and variable lags”40.

5.18 The OED Annual Review of Development Effectiveness (ARDE) reports on the ‘outcomes’, sustainability and institutional development impact of completed projects, as well as providing a summary of country and sector evaluations. It should be noted that the term ‘outcome’ in this context refers to the extent to which the project’s relevant development objectives have been (or are expected to be) achieved. These will be a mix of outcomes (intermediate objectives such as skills and organisational capacity) and impacts (long-term goals such as human and social development).

5.19 The ARDE provides a reliable measure of the extent to which completed Bank projects are producing relevant results. Because OED has used a consistent methodology over the past few years, it also allows trends in project performance to be monitored. What it does not attempt to do is to quantify the specific results achieved or assess the contribution towards higher goals, such as the MDGs. In this sense it is more an aggregation of results ratings than of results. This is a practical solution to the problem of aggregating across diverse results.

40 Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.11

5.20 The World Bank acknowledges that there is scope to improve its reporting on its results. The assessment of the Strategic Compact found that the corporate scorecard was still incomplete, in part because of the lack of an agreed methodology. Limited progress has been made on agreeing ways of measuring and monitoring the impact of World Bank actions at country and sector level. This missing ‘second tier’ of the corporate scorecard is intended to link internal bank measures (such as product quantity and quality) with the International Development Goals (IDGs). The Strategic Framework produced in 2001 also highlighted the need to link country and sector work with the IDGs, but was not able to say how this would be done.


6. Managing

“...there is sufficient evidence that the key elements are well known to donors and carried out to some extent. But in so many instances they have failed owing to weaknesses in how the systems are used rather than what its components are. They reflect the missing link between the measurement procedures and the way in which the information is used – the management process”41

6.1 There are two primary uses, and motivations, for results information. The first is for accountability: to demonstrate effectiveness to others. This aspect was covered in the previous section. The second is to provide continuous feedback and learning for management. To what extent are these institutions really managing for outcomes? To what extent are they using information on outputs and outcomes, and from evaluation, in decision making?

6.2 These are difficult questions to answer, particularly in this type of study. The potential uses of results information extend throughout the institution, from planning and budgeting to staff appraisal. The observations below are drawn from a small number of interviews, and from the few reports that address this issue.

Planning

6.3 Section 4 looked at the extent to which these institutions were planning to achieve outcomes: managing for results. In an ideal world strategic planning should also be about managing by results. Institutions should be amending their plans on the basis of results and experience. In practice this is something that few institutions are able or prepared to do. No development institution has been implementing RBM long enough for the results of one strategic planning cycle to inform the next.42

6.4 More fundamentally, given the time lags between programmes and outcomes, let alone between programmes and data on outcomes, it is doubtful whether management by outcomes or impacts will ever be a practical proposition for development agencies. The best that can be hoped for is for periodic reviews to examine the alignment of the programme in respect of outcome trends. UNICEF have done this to some extent in their Medium Term Strategic Plan by concentrating, for example, on countries with particularly high child mortality rates.

6.5 Management by intermediate outcomes and outputs is more feasible on an annual basis. This is the approach being adopted by UNDP, and being investigated by the World Bank. There are two drawbacks with this approach. First, even outputs are a poor measure of the agency’s recent efforts because of the time lags involved. As observed in an IDA-13 paper, short-term measures of outputs are likely to reflect the result of resources provided many years earlier43. Second, early indicators of output or intermediate outcome performance need to have good linkages with ultimate outcome objectives. Unless they do, progress towards outputs and intermediate outcomes will not necessarily be the same as, nor any guarantee of, progress towards improved development outcomes. Institutions need to keep track of outcome and impact trends, and ensure through evaluation that performance in terms of outputs and intermediate outcomes is linked to these. This is the challenge for UNDP.

42 This is not to say that the experience of one planning cycle has not informed the next. For example, the UNIFEM SBP for 1997-99 certainly informed the formulation of SBP 2000-03.

Resource allocation

6.6 As with planning, resource allocation can mean allocating for results or by results (or conceivably both). The World Bank is probably the strongest exponent of budgeting for results. There is good evidence that aid has a larger impact on growth and poverty reduction in the context of good policies and institutions. In line with this thinking, the World Bank has for some time used assessments of country policy and institutional (CPI) performance as a basis for allocating IDA funding44,45. Since the Strategic Framework, the Bank has sought still greater selectivity and focus in its work.

6.7 Budgeting by results is altogether more controversial, and difficult to apply. UNDP is particularly reluctant to contemplate results-based budgeting (RBB). There is concern over using results information to reward countries that do well, and penalise countries that do badly. As with the decision not to publish country-level data, this may reflect an internal political judgement. Getting staff to commit to results-orientated planning and reporting has been hard enough. Adding a budget implication would have made the process still more difficult. This implies, paradoxically, that RBM is more acceptable if it doesn’t actually change anything. This is clearly contrary to the spirit of RBM. If RBM is to mean anything, it has to mean using office/unit performance as one criterion for allocating resources. As far as could be ascertained, none of these institutions yet does this. This may, in part, be due to the lack of a reliable results-based indicator of office/unit performance.

6.8 This is not to deny that there is a real question about how best to balance ‘need’ and ‘results’ in resource allocation, particularly for country allocations.

44 Better Measuring, Monitoring and Managing for Development Results. Development Committee Paper. World Bank. (September 2002) p.7
45 It can be argued that CPI scores are themselves results of previous actions by governments and donors. The World Bank is in effect budgeting on the basis of past results in order to increase the likelihood of future results.


But this is not an either/or choice. Aid should be directed at countries in need with good policy and institutional environments, and therefore the best prospects for achieving results. It would appear that the World Bank does this rather better than do the UN agencies.46

6.9 Finally, results-based budgeting should have implications for how resources are allocated. The 2001 ARDE included an analysis of which objectives World Bank projects have been the most effective at achieving. This showed, for example, much greater success with physical infrastructure than for public sector institutional change.47

6.10 Other things being equal, results performance should inform sectoral allocations, both within countries and globally. As with country allocations, there is a question about whether poorly performing sectors should be penalised. For example, the DER showed that UNDP is performing relatively poorly in relation to gender and institution building. This should mean that UNDP attempts to understand and address the causes of this under-performance, not that it immediately reduces its allocation to gender and institutional activities. In the longer run, however, continued poor performance should imply some reallocation of resources towards outcomes where the institution’s contribution will be greatest.

6.11 As with office/unit performance, results-based sectoral allocations are dependent on reliable and acceptable indicators. This is a real challenge for all institutions, requiring as it does comparability in the definition and measurement of outputs and outcomes across sectors.

46 UNDP’s allocations to ‘good policy’/’bad policy’ countries (as measured by CPIA scores) became less favourable over the 1990s.

47


Staff appraisal

6.12 It was not possible to ascertain the extent to which the assessment of staff performance now includes a results component. According to UNDP, ROAR results are now used in the assessment of Country Resident Representatives. The World Bank is also making some progress at evaluating managers on the basis of tangible results.

6.13 One obstacle to more results-based appraisal – and to the application of RBM more generally – is the time-lag between inputs and outcomes. Annual appraisals can only hold staff accountable for very short-term results. Any higher outputs or outcomes will be the product of resources and actions provided years before. Equally, given the predominance of short postings, most staff will be long gone by the time the outputs and outcomes of their work become apparent. Making staff more accountable for planning for results, and for reporting on results, would be a step in the right direction.

Managerial response

6.14 UNDP is aware that the real challenge for RBM is, and remains, to realise a management value beyond external reporting. As the ROAR itself points out, the “unique benefits of the ROAR lie in the extent to which it can generate managerial responses at all levels”. The key question is the extent to which a management response has been forthcoming. This is probably the most critical, but difficult, question to answer. To what extent is RBM really making a difference to the way the institution is managed? How much is rhetoric, and how much is reality?

6.15 According to UNDP staff, RBM is beginning to transform the way UNDP does business. Examples of this include:


• Restructuring in some country offices in line with outcomes.

• More outcome-focussed discussions with partners, and at Board level.

• Improvements in country-level planning as a result of the SRF.

6.16 On the other hand, there has been some criticism of the limited response of management to some of the key ROAR findings, such as the relatively poor performance of UNDP in respect of gender. There is also reported to be more commitment to RBM at headquarters than in the country offices, and more at middle than at senior management level.

6.17 UNDP is well aware of the challenges involved in implementing RBM. According to UNDP these include defining results consistently and in a measurable way; building partnerships and assessing results together with partners; convincing donors and local partners of the virtues of RBM; and changing hearts, minds and capacities within UNDP.

6.18 None of the other institutions has attempted as rapid a transition to RBM as UNDP. Any managerial changes are therefore both more incremental and more difficult for an outsider to detect. The World Bank experience is a case in point. Ten years of management reform, intended in part to increase the results-focus of the organisation, have made some difference. The design, outcomes, sustainability and institutional development impact of World Bank projects have improved. However, as the Bank itself acknowledges, there is much more that needs to be done to increase its results-orientation, particularly at country, sector and corporate level. The Bank’s own assessment of performance measurement under the Strategic Compact concluded as follows:

“... while the measurement of performance as well as several of its uses have improved during the Compact period, performance measurement has not yet been used systematically and consistently to make strategic decisions on selectivity, mobilize resources, align staff motivation, and hold managers accountable for the performance of their units.”48

6.19 The lack of senior management support for a stronger focus on results measurement and management – as evidenced by the failure to implement the OED recommendation on RBM in 1997, and the weak support for the corporate scorecard to date – partly explains the slow progress. However, two other factors have contributed:

• the long period of time needed to implement fundamental change within an institution

• the difficulties associated with applying results-based management to a development institution.

6.20 The other three institutions – UNICEF, UNIFEM and IDB – will face the same challenges. It is noteworthy that IDB is in the process of recruiting a Chief Development Effectiveness Officer to spearhead the process of culture change within that institution.

48

References
