Guide to Project Performance Reporting


 


Guide to Project Performance Reporting: For Canadian Partners and Executing Agencies

Chapter 1 - About this Guide
Chapter 2 - Why Performance Reporting?
Chapter 3 - What Were Your Expected Results?
Chapter 4 - What Have You Accomplished?
Chapter 5 - What Have You Learned?
Chapter 6 - What Do You Recommend?
Chapter 7 - What Can You Tell Us About Applying RBM?


Chapter 1 - About this Guide

Welcome to this Guide to Project Performance Reporting; we hope you will find it helpful. It has been prepared for CIDA's Canadian partners, executing agencies and their developing country partners who are new to performance reporting within the context of CIDA's Results-Based Management (RBM) approach. This first chapter begins by describing our approach to making this a useful RBM tool for you, the user. The rest of the chapter summarizes the organization of the guide, followed by an explanation of our use of icons to help you identify materials of interest throughout the text.

This guide is intended to be concise, user-friendly and flexible enough to respond to the wide variety of project contexts, funding mechanisms and Branch requirements within CIDA. At the same time, a number of standard RBM concepts and principles must be applied to performance reporting regardless of the unique circumstances of any particular development initiative. The challenge, then, is to be as inclusive as possible while delivering a high-quality product to each of you. Our approach is therefore modular: depending on your circumstances, you may decide that certain sections of this guide do not apply to your development initiative. Nevertheless, since many of the topics outlined here are predicated on an ongoing dialogue between CIDA and its partners, we encourage you to use the guide as a reference in your discussions when finalizing the terms and conditions for performance reporting.

To help you use this guide in the way that best suits your needs, we adopted several strategies. First, the Annotated Table of Contents presents an overview of the topics covered in each chapter for easy reference, helping you quickly grasp the guide's overall organization and the logical flow of information from one chapter to the next. Second, we have tried to eliminate unnecessary wordiness in favour of sample tables and formats that illustrate an appropriate presentation of performance information.

We hope that this will be viewed as something more than simply filling in another set of boxes, but rather a concise and useful way to organize a complex set of performance information.


Chapter 2 - Why Performance Reporting?

This chapter begins by introducing the topic of performance reporting in government and continues by explaining CIDA’s approach to this very important management function. Over the past decade, there has been increasing pressure by Canadian citizens on the Government of Canada to demonstrate the efficient and effective use of public resources. A new philosophy of public management has subsequently taken root that focuses on improving results measurement and accountability. In February 1994, the Government introduced changes to give Parliamentarians an expanded role in examining departmental priorities and expenditures. Government departments, like CIDA, are now required to provide long-term information on their commitments, spending, and performance accomplishments.

A key objective of the Reform of the Estimates project is to improve expenditure management information through a focus on results at both the program and business lines.
-
Treasury Board Secretariat, 1996


Along with other government departments, CIDA has introduced two annual planning and reporting documents: a Report on Plans and Priorities and a Departmental Performance Report. The Report on Plans and Priorities, tabled in Parliament every March, provides information on expected developmental results and proposes strategies for reaching program objectives. Tabled every Fall, the Performance Report will report against the commitments made in the Report on Plans and Priorities. Over time, the Performance Report will become the key instrument for external reporting by the Agency to Parliament and the Canadian public.

Consistent with the RBM principles of partnership, accountability and transparency, we will need to continue working together to achieve developmental outcomes while improving the quality of our performance reporting. This will require ongoing analysis of CIDA's resource investments vis-à-vis intended beneficiaries and outcome-level results achieved, based on access to a full range of valid and reliable performance information from our development programs and projects. Although there is a great need for better performance reporting, we recognize that it cannot become an end in itself.

In an RBM context, good performance reporting is simply a byproduct of good management. It should be viewed first and foremost as an opportunity to analyze and collectively reflect on past accomplishments, or failures, with a view to learning and improving management decision-making. This holds true as much for CIDA as it does for its Canadian partners, executing agencies and developing country partners.

Many of you, however, have questions about your level of accountability for the achievement of, and reporting on, developmental results. What are the reasonable limits of accountability that Canadian partners and executing agencies should assume for the achievement of results? The Agency Accountability Framework (July 1998) addresses the need for clarity in this area. It is the Agency's view that Canadian partners and executing agencies, by virtue of their commitment to development, cost-sharing, and their actions and decisions aimed at producing developmental results, share accountability for the achievement of program and project outputs and outcomes with CIDA and our developing country partners. This shared accountability makes it especially important that CIDA be in a position to make decisions affecting its programs based on reliable performance information generated from your reports. Timely, high-quality performance reporting at the program and project level will make an important contribution to CIDA becoming a results-oriented organization. At the same time, we will be in a better position to report to Parliament, the Canadian public and the international community on whether Canada's development assistance is producing the results we expected.

Criteria for Evaluating Performance Reports

- Clear description of operating environment and strategies employed;
- Meaningful performance expectations identified;
- Performance accomplishments reported against expectations;
- Valid and reliable performance information presented;
- Demonstrated capacity to learn and adapt.

- Office of the Auditor General of Canada, 1997

 


Chapter 3 - What Were Your Expected Results?

This chapter begins with some general background on defining expected results before discussing the importance of including your performance expectations in your report. Let's start with the question: what is an expected result? According to the RBM Policy, a result is a describable or measurable change in state that is derived from a cause and effect relationship. This change in state is described as something having been increased, decreased, improved, raised, etc. An expected result, then, is a change in state that you expect to achieve in the future. The kind of expected results we are most interested in are those that reflect actual changes in human development. These are called developmental results, and we use three terms to describe them: outputs, outcomes and impacts. They are linked by their cause and effect relationships to form a results chain.

A SMART Result is:

- Specific
- Measurable
- Achievable
- Relevant
- Time bound


Can you identify the results chain in this passage?

Improved national water resource management policies is a long-term impact that can be achieved only when there is improved water quality/quantity data, better analysis of supply and demand variables and increased consultation among water users and distributors. These outcomes are in turn dependent on the achievement of several short-term human resource development outputs, such as data collectors and technical staff (men and women) skilled in applying standard operating procedures, database managers skilled in information systems management and increased analytic capacity among current staff to support the public policy-making process. In addition, several other organizational capacity-building outputs will have to be achieved early in the project, e.g., a strengthened data collection infrastructure, new laboratory networks created and improved database systems operationalized. All of these developmental results must be achieved if this seven-year, $20-million water quality management project is to be successful.

As you can see, the concept of causality is important in RBM: outputs are a logical consequence of project activities, while outcomes are a logical consequence of the achievement of a combination of outputs. Impacts, in turn, are a logical consequence of the achievement of a combination of outcomes. Keeping the time frame in mind can also help in articulating expected results. Outputs are short-term developmental results, while outcomes must be realistically achievable within the lifetime of the project.

Impacts are the long-term developmental results that usually don’t manifest themselves until after project termination. In this way, if your expected results are time bound, it will help ensure they are also realistic. For a more detailed discussion of these RBM concepts and others, we suggest that you obtain a copy of Results-Based Management in CIDA: An Introductory Guide to the Concepts and Principles published by CIDA’s Performance Review Branch (1999).

Your performance report should, at minimum, present the program/project's expected developmental results at the output, outcome and impact levels. The intended beneficiaries of the expected results should also be identified (i.e. population, organizational affiliation, profession, sex, age, etc., depending on the nature of the project). This information is best captured graphically in a Performance Framework (PF) and inserted near the beginning of the report, so that the reader is made immediately aware of what is to be achieved and for whose benefit. Modifications may be made to the PF during project implementation, so the most current approved version should be used. There is no real need to elaborate on the expected results in narrative text, since this will presumably have been done in the Annual Workplan.

RBM is an iterative management approach, so we expect to make changes in what we want to achieve and especially how we achieve it. However, any modifications to the original PF should be documented and justified in the current year's Annual Workplan. Approval of the Annual Workplan then signifies the approval of any changes made to the project's expected developmental results, intended beneficiaries or activity sets which constitute the Performance Framework. It is a good idea to do this throughout the project life-cycle, so that you have a paper trail of these changes.

The title page should situate the report in the context of the project life-cycle e.g. first report, report sequence, project midpoint, second to last report, etc. The introduction should make reference to any noteworthy issues or problems in producing the report e.g. late, incomplete, etc., and provide a summary description of the content and structure of the report noting any changes to the Performance Framework.

 

Performance Framework
Water Quality Management Project

Activity Sets

Reach

Outputs
(0 - 5 yrs)

Outcomes
(4 - 7 yrs)

Impacts
(7+ yrs)

WBS 1100 - Water sampling and analysis

National Water Research Centre
(NWRC)
Technical staff

Central Lab
Personnel

NWRC,
Ministry of Health &
Ministry of Agriculture
Database Managers


Ministry of
Public Works
Policy-makers & Professionals


NWRC
Management & Human Resource Professionals

1. NWRC data collectors and technical staff skilled in applying procedures for SOPs, QC, etc.
2. National water quality baseline data established.

1. Improved relevance, timeliness, accuracy and reliability of water quality/quantity data generated by NWRC.
2. Improved water data storage, retrieval, analysis and information system management by NWRC.
3. Improved access and dissemination of water resource management information by/to stakeholders in forms suitable for influencing policy dialogue and formulation.

1. Improved national water resource management policies.

WBS 1200 - Rationalized monitoring program

3. An operational national water quality monitoring infrastructure/system established.

WBS 1300 - Laboratory analysis

4. A rationalized national laboratory network established with redistributed workload and functions.

WBS 1400 - National water quality and information management

5. NWRC database managers and professionals skilled in information systems management.
6. An established national water quality database that meets the information needs of water resource managers and user groups.
7. Improved data access among NWRC institutes i.e. NRI, DRI, RIGW.

WBS 1500 - National water quality information dissemination

8. # of information products i.e. maps, graphs, reports generated for policy-makers and other stakeholders.
9. MPWWR professionals skilled in policy analysis and decision-maker support.

WBS 1600 - Human resources development

10. NWRC managers and professionals skilled in program planning, participatory management and personnel management.

WBS 1700 - HRD capacity building

11. NWRC staff skilled in conducting training needs assessments, professional development seminars and implementing an HRD plan.

 


Chapter 4 - What Have You Accomplished?

Without good performance information, you will not learn effectively about what works and what doesn’t work during implementation. This chapter discusses the presentation of accomplishments which relate directly to the stated performance expectations described in the previous chapter. It represents the heart of the performance report.

Whether semiannual or annual, the report should provide the reader with current information reflecting the program/project performance to date, that is, progress made toward the achievement of expected results. This is an important departure from the traditional practice of quarterly reporting on activities, completed versus planned, over a three month period.

In keeping with the RBM principle of "simplicity", a program/project should have only one set of developmental results that it is expected to achieve within its life-cycle. It would be unnecessarily complex to have a different set of results, at the output level or otherwise, for each annual planning cycle. The agreed-upon set of developmental results is monitored over the life of the development initiative and reported on using a format like that provided in Figure 1. Present one result at a time, beginning with the outcomes and ending with the outputs, each on a separate page in landscape mode, in a manner that shows the reader the links between planned results, indicators used, actual results, intended beneficiaries and the problems encountered due to critical assumptions not holding true. Normally, you are not required to report on impacts, since they generally manifest themselves only after program/project termination.

Performance Indicator Selection Criteria:

- Validity
- Reliability
- Sensitivity
- Simplicity
- Utility
- Affordability


The use of performance indicators to measure achievement is critical to the RBM approach. So, what is a performance indicator? Performance indicators are qualitative or quantitative measures of resource use, extent of reach and developmental results. When carefully selected, they can be used to measure changes in human development. Annual targeting of performance indicators is also an effective way to closely monitor long-term, complex or risky development initiatives. See Results-Based Management in CIDA: An Introductory Guide to the Concepts and Principles for more detailed information on performance indicators.

Figure 1: Illustrated Sample Results Reporting Format

Expected
Results
(The result statements presented in this column should come from the current year’s Annual Workplan.)

Performance Indicators Used
(The indicators used to measure progress toward the achievement of the adjacent result should come from the current year’s Annual Workplan.
They represent the performance targets for the year.)

Actual Results
(The actual results achieved reflect the information collected on each of the adjacent performance indicators and allow you to compare performance against the fiscal year targets set in the Workplan.)

Outcome # 1

Improved reliability, timeliness, accuracy, and relevance of water quality/quantity data generated by NWRC.

1.1 Twelve (12) weeks of lag time between data collection and data registry.
1.2 Complete congruence of data collection/analysis practices with established standard operating procedures (SOP).
1.3 50% coverage, density and frequency of data collection as compared to international standards.
1.4 Key stakeholders (men and women) highly satisfied or satisfied with the timeliness, accuracy and relevance of water quality/quantity data.

A representative sample of 100 data points was tested over the past six months. Timeliness has improved, as indicated by a reduction in lag time from the baseline of 20 weeks to 12 weeks since project inception. Congruence with SOPs is now full at all data points sampled. However, coverage, frequency and density of data collection for surface and in-ground water bodies has attained only 40% of the international standard, up from 30%. All key government stakeholders, with the exception of the Ministry of Agriculture, were satisfied with these improvements. The Ministry of Agriculture was dissatisfied, while the majority of farmers in rural areas, especially women, were highly dissatisfied with the lack of relevant water quality data related to health issues in rural and remote areas.

Reach: (The people and the organizations who have or will benefit from the achievement of the actual result should be identified. They are generally the direct beneficiaries, e.g., newly skilled technical staff or the end-users of the actual result such as in our case study.) Policy and decision-makers in the Ministry of Public Works, Ministry of Health, Ministry of Industry, Ministry of Agriculture and 5,000 farming households.

Assumptions & Risk: (Problems encountered due to critical assumptions not holding true and assessment of identified risks to performance.)
Policy and decision-makers continue to be interested in improved water quality/quantity information for urban areas and commercial users, but have yet to allocate sufficient counterpart funding to increase the number of data gathering points in rural farming areas. Part of the problem is that data collectors and technicians demand higher salaries to work in these remote areas and have resigned early when their requests were not approved. There is an increasing risk, from medium to high, that this result may not be achieved for this key group of intended beneficiaries, rural farmers, if the problem is not addressed at the highest level.
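The comparison of actuals against the fiscal-year targets set in the Workplan, as illustrated in the sample above, can be sketched as a simple check. This is a minimal illustration only; the indicator names and figures below are hypothetical, loosely modelled on the water quality example.

```python
# Hypothetical sketch of comparing indicator actuals against annual targets.
# All indicator names and numbers are illustrative, not from any real report.

def target_status(target, actual, higher_is_better=True):
    """Return 'met' or 'shortfall' for an indicator, given its annual target.

    For indicators like lag time, where a lower value is better,
    pass higher_is_better=False.
    """
    met = actual >= target if higher_is_better else actual <= target
    return "met" if met else "shortfall"

indicators = [
    # (description, target, actual, higher_is_better)
    ("Lag time in weeks, data collection to registry", 12, 12, False),
    ("Coverage vs. international standard (%)", 50, 40, True),
]

for desc, target, actual, higher in indicators:
    status = target_status(target, actual, higher_is_better=higher)
    print(f"{desc}: target={target}, actual={actual} -> {status}")
```

Shortfalls flagged this way are the ones the narrative portion of the report should explain, as discussed in the next section.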


Beware of Unintended Negative Consequences!

Every project will have both positive and negative unintended consequences for the targeted beneficiaries or other population groups. The performance report should document those consequences which are attributable to your development initiative.

How can this be avoided in the future?
The project developed agricultural extension packages for farmers working on lands irrigated with low quality water. The training courses offered in the use of these packages were attended by the male heads of households in 80% of cases. This undermined the decision-making control of women in small-scale household farming operations, with a consequent loss of control over the additional income generated.


Don’t Forget to Link Activities to Disbursements

Many of our Canadian partners and executing agencies have asked whether they still have to report on activities. This is a legitimate concern, because your reporting burden could double if you had to report on both activities and results. There is a two-part answer to this question that should go a long way toward reducing your reporting burden, while ensuring that CIDA can make the link between activities completed and expenditures reported.

First, if the program/project is performing satisfactorily there would be little value in providing a detailed narrative description of all the activities completed since the last report. The exception, of course, is when there are significant delays in the attainment of fiscal year targets established for performance indicators. Explanations for such shortfalls would require an examination of the timeliness and effectiveness of the completed activities, or the quality and mobilization of inputs. It is very important to diagnose these problems in order to be in a position to take corrective action in the future. Consequently, the report should refer to specific activities identified by Work Breakdown Structure (WBS) number when describing the problems encountered in the delivery of activities.

Second, the report should allow for a comparison of cumulative budget versus actual disbursements to date by WBS Activity Set. Commentary on any significant disbursement anomalies or variance should be explained and linked to the implementation, or not, of activities. To facilitate the presentation of this information, a list of activities undertaken should be cross-referenced with disbursement information taken from the Financial Statement for the reporting period as illustrated in Figure 2 on the next page. A request for the next cash advance could accompany the Performance Report when supported by this type of financial analysis and a signed copy of the Financial Statement attached as an annex.
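As a minimal sketch of the budget-versus-actuals comparison described above, the variance calculation for each WBS Activity Set might look like the following. The WBS numbers and dollar figures here are hypothetical, and the 10% flagging threshold is an illustrative assumption, not a CIDA requirement.

```python
# Hypothetical sketch: cumulative budget vs. actual disbursements by
# WBS Activity Set. All figures and the 10% threshold are illustrative.

def percent_variance(budget, actual):
    """Variance of actuals against budget, as a percentage of budget."""
    return round((actual - budget) / budget * 100, 1)

activity_sets = {
    # WBS Activity Set: (budget to date, actuals to date)
    "WBS 2100": (500_000, 455_000),
    "WBS 2200": (250_000, 15_000),
    "WBS 2300": (100_000, 109_400),
}

for wbs, (budget, actual) in activity_sets.items():
    v = percent_variance(budget, actual)
    # Flag anomalies that warrant explanatory commentary in the report.
    flag = "  <- explain in report" if abs(v) > 10 else ""
    print(f"{wbs}: budget ${budget:,}, actual ${actual:,}, {v:+.1f}%{flag}")
```

A large negative variance, like the second set above, typically signals postponed or blocked activities whose budget may need to be reprofiled, as in the WBS 1300 row of Figure 2.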

Figure 2: Illustrated Sample Reporting Format

WBS
Activity Sets

Budget
to Date

Actuals
to Date

% Variance

Explanation/
Comments

WBS 1100 - Water sampling and analysis
1110 - Baseline water quality sampling and analysis completed
1120 - In-country training for 200 data collectors in SOPs delivered

$955,000

$1,110,000

+15%

Planned activities completed on schedule. Cost overrun due to in-country travel expenses to collect baseline data

WBS 1200 - Rationalized Monitoring Prog.
1210 - Implementation plan prepared for a rationalized water quality monitoring system

$120,000

$118,000

-1.6%

Planned activities completed on schedule

WBS 1300 - Laboratory analysis
1310 - Agreement with Central Lab for services has not yet been signed.
1320 - Establishing a quality control system and inspection process is postponed.

$250,000

$15,000

-94%

Until an agreement is signed the other activities in this set cannot proceed as planned. The budget must be reprofiled.

WBS 1400 - Information management
1410 - Designed and developed national water quality database
1420 - Establishing standards and quality control processes

$750,000

$745,000

-0.6%

Planned activities completed on schedule

WBS 1500 - Information Dissemination
1510 - 100 Policy-oriented information packages disseminated.
1520 - Interpreted data disseminated to NGOs, universities, media, etc.

$85,000

$93,000

+9.4%

Planned activities completed on schedule

WBS 1600 - Human Resources Development
1610 - In-Canada attachments for 25 NWRC managers and professionals cancelled.
1620 - Local workshops/seminars for 75 NWRC managers and 150 professionals completed.

$275,000

$255,000

-7.3%

WBS Activity 1610 was canceled because it was more cost-effective to provide the training in-country.

WBS 1700 - HRD Capacity Building
1710 - In-Canada attachments for 3 HRD specialists postponed
1720 - "Train-the trainer" course design and delivered for 15 men and 5 women trainers.

$58,000

$23,500

-60%

WBS Activity 1710 was postponed because the best candidates were not available

TOTALS:

$2,493,000

$2,349,500

-5.7%


 


Chapter 5 - What Have You Learned?

While the previous chapter discussed ways to assess progress toward the achievement of expected results at the output and outcome levels, many of you have asked: how does this help me manage my project? This chapter addresses performance reporting from a learning perspective. It presents some ideas on how you can learn from the collection and analysis of performance and risk information in order to improve management decision-making.

It is important to remember that you never assess performance in a vacuum without first taking into consideration the assumptions you’ve made about your project design and the implementation environment. Your risk analysis may have indicated that some assumptions did not hold true during the reporting period, thus affecting your ability to achieve the expected results. You might want to examine and discuss in your report which internal or external risks had the most significant effect on your overall performance.

Some Internal Risks:

- Large number of stakeholders;
- Stakeholder commitment;
- Complex technologies;
- Innovative methodology;
- Short time frame.

Some External Risks:

- Political;
- Cultural;
- Social;
- Environmental;
- Economic.


Once that is complete, you will want to assess results achievement with a view to identifying areas for improvement. To assist you in this task, CIDA has identified a number of Key Success Factors to help you learn from experience, manage for results and report on achievements. These success factors are categorized as either development or management focused.

Key Success Factors


Development Factors identify elements of development effectiveness and put in perspective the difference the achievement of results has made, and will make, in the lives of intended beneficiaries. Did the results achieved address their priority needs? Was there sufficient stakeholder participation and partnership? Will the results be sustainable? The Management Factors, on the other hand, are the more process-oriented delivery elements associated with management efficiency, which might also help to explain program/project performance. Is there shared responsibility and accountability for results? Is the project design still appropriate? Do we anticipate and respond to change based on adequate information? These key success factors should be taken into account when analyzing and explaining the performance of your development initiative.

A lesson learned about shared accountability:
It is clear from the above analysis of performance and risk information that a key weakness in the project has been shared accountability for the achievement of developmental outcomes. The project has completed most of the planned activities and achieved most of the expected human resource development and organizational capacity building outputs. Political and financial commitments by the government to provide sufficient counterpart funding to establish a water monitoring program in rural and remote farming communities will however be necessary before this project will benefit the majority of farmers as intended. It will be difficult at this juncture in the project to obtain this commitment. Clearly establishing shared accountability for the achievement of developmental outcomes should be a prerequisite to project implementation.

 


Chapter 6 - What Do You Recommend?

This chapter discusses when to make recommendations for action to either CIDA or a broader stakeholder group. For a performance report to be useful, not only to you but to CIDA and other stakeholders, it should identify matters requiring management attention. You should draw on the identified problems, issues or themes from the previous chapter, e.g. lack of shared accountability, inadequate counterpart funding, over-spending/under-spending, high staff turnover, inattention to gender equity, etc., when formulating your recommendations. It is especially important to bring to CIDA’s attention matters where development effectiveness may be directly or indirectly jeopardized. Remember that problem formulation should be clear, concise and followed immediately by your recommendation for action.

Is this the recommendation that you would make?
Based on our project’s performance to date and risk analysis we feel that it is important to bring to CIDA’s attention our difficulty in securing the counterpart funding needed to meet the water resource information needs of rural and remote farming communities. Recommendation: That this issue be raised by CIDA at the next Project Coordination Committee meeting with a view to obtaining the necessary political and financial commitments.


Canadian partners and executing agencies have in the past expressed concern about their level of decision-making authority. How does one distinguish adjustments that require CIDA's or a management committee's approval from those that can simply be made during the course of implementation? Although CIDA's main concern is development effectiveness, it does have an interest in management efficiency. The latter, however, is more about how the project is implemented, which should normally be within the decision-making scope of the project delivery partners. For instance: where results are being achieved, actions can be taken to strengthen them; where progress is difficult, different approaches can be tried or activities added; and where activities or outputs are considered obsolete, they can be abandoned. To the extent that these adjustments imply no change to the total annual budget allocation or to the expected outcomes, they are best presented as points of information.


Chapter 7 - What Can You Tell Us About Applying RBM?

In April 1996, CIDA adopted RBM as its main management tool, along with a set of six principles. Since then, RBM has found broad application in the geographic programs, the multilateral program, countries in transition and the partnership programs. It is a flexible approach that can be adapted to each unique circumstance while ensuring a measure of consistency across the Agency. Working collaboratively on participatory planning, managing for results, monitoring performance and reporting on results is part of our collective commitment to the principle of partnership in development. The biggest challenge facing us all is to keep it as simple as possible, but no simpler. As we each take accountability for achieving developmental results and try to be more transparent in performance reporting, the job can be made easier by sharing what we have learned with one another.

RBM Principles:

- Broad Application
- Partnership
- Simplicity
- Accountability
- Transparency
- Learning by Doing


CIDA has recently published a first report, Lessons Learned From Implementing RBM in CIDA (June 1998), which may interest you. As part of our commitment to learning by doing, we would like to help you share what you have learned about applying RBM to your development initiatives.

If you have any thoughts or ideas about your experience in applying RBM that you would like to share, feel free to include them in a separate chapter of your performance report, so that we can all benefit.