
TEMPLATE SHEET

Instructions for Completing the Performance Indicator Reference Sheet

Project Output: Enter the title of the Project Output.

Indicator: Enter the full title of the Indicator. Be as precise as possible, providing a description that is clear and points transparently to the particular information which your data and calculations will provide.

ASARECA Result to which this indicator responds: Refer to ASARECA's results (S.G., Goal, Purpose, R.1, R.2, R.3, R.4, R.5).

Indicator Level: Mark the level of the indicator as appropriate: Input, Output, Outcome, or Impact.

Date of PMP Development: When did relevant parties agree on the reporting of this indicator?

Date of Last PMP Review: When did relevant parties last review/discuss/alter the indicator?

DESCRIPTION

Precise Definition: Define the indicator more precisely, if necessary. Define specific words or elements within the indicator as necessary. The definition must be detailed enough to ensure that different people at different times, given the task of collecting data for a given indicator, would collect identical types of data. Every significant term from the exact wording of the indicator must be clearly defined in this section. It is not enough merely to restate the indicator, nor is it sufficient to list the particular items you plan to include in or exclude from your data calculations. The definition must delimit the categories so that anyone not familiar with your program could nonetheless apply the criteria and know exactly which categories of data should be included in indicator calculations and which should not.

Unit of Measure: Enter the unit of measure (e.g. number, percent, hectares, local currency, etc.).

Disaggregated by: List the planned data disaggregation to help improve the breadth of understanding of the results reported (e.g. geographic location and gender).

Justification/Management Utility: Briefly describe why this particular indicator was selected and how it will be useful for managing the performance of the Project. If the value of this indicator changes, what does this indicate about the program? What activities show that this specific indicator is an especially appropriate measure of your project's impacts and results? Why are these incremental results significant in or for the ASARECA programs? In what ways will monitoring of these results contribute toward program success? Toward what results at a higher level, or which overarching goals, will these indicators ultimately contribute?
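To make the Unit of Measure and Disaggregated by fields concrete, the sketch below shows how a precisely defined indicator might be tallied from raw records and disaggregated by gender. All record contents, field names, and district names are invented for illustration and are not part of the template or of any ASARECA data.

```python
# Hypothetical illustration of computing an indicator value with the
# disaggregation planned in the reference sheet. All data are invented.
from collections import Counter

# Each record represents one farmer trained (raw data as collected).
records = [
    {"gender": "F", "district": "Mbale"},
    {"gender": "M", "district": "Mbale"},
    {"gender": "F", "district": "Tororo"},
    {"gender": "F", "district": "Tororo"},
]

# Indicator: "Number of farmers trained" (unit of measure: number).
total = len(records)

# Disaggregation by gender, as listed under "Disaggregated by".
by_gender = Counter(r["gender"] for r in records)

print(total)            # 4
print(dict(by_gender))  # {'F': 3, 'M': 1}
```

The point of the precise definition is that anyone applying these counting rules to the same records would arrive at the same totals.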
PLAN FOR DATA COLLECTION/ACQUISITION

Data Collection Method: Describe the tools and methods through which the data will be collected. Identify what methods and instruments you will use. Note any equipment required to collect the data. Attach data forms when necessary. List the source(s) of the raw data and the levels of collection (is a third party aggregating data or calculating intermediate indicators that may affect your indicator values?), and describe the steps involved in collecting all information needed to construct the indicator's value for a given reporting period. Too much detail is better than too little detail here. Examples of different data collection methods are given below.

1. Focus Group Discussion: a facilitated discussion in which a small group shares views and experiences on a set topic.
2. Direct Observation: the process of observing individuals, events, processes, etc. in controlled situations.
3. Behavior Observation Checklist: a list of behaviors or actions among the participants being observed.
4. Knowledge Tests: information about what a person already knows or has learned.
5. Opinion Surveys: an assessment of how a person or group feels about a particular issue.
6. Performance Tests: testing the ability to perform or master a particular skill.
7. Delphi Technique: surveying the same group of respondents repeatedly on the same issue in order to reach a consensus.
8. Self-Ratings: a method used by participants to rank their own performance, knowledge, or attitudes.
9. Questionnaire: a group of questions that people respond to verbally or in writing.
10. Time Series: measuring a single variable consistently over time, e.g. daily, weekly, monthly, or annually.
11. Case Studies: experiences and characteristics of selected persons involved with a project.
12. Individual Interviews: individuals' responses, opinions, and views.
13. Group Interviews: small groups' responses, opinions, and views.
14. Physical Evidence: residues or other physical by-products are observed.
15. Panels, Hearings: opinions and ideas.
16. Records: information from records, files, or receipts.
17. Logs, Journals: a person's behavior and reactions recorded as a narrative.
18. Simulations: a person's behavior in simulated settings.
19. Advisory, Advocate Teams: ideas and viewpoints of selected persons.
20. Judicial Review: evidence about activities is weighed and assessed by a jury of professionals.

Method of Data Acquisition (by ASARECA, etc.): Describe the form in which Program Managers will receive the data (e.g. periodic monitoring report, compiled survey analysis report, etc.).

Data Source(s): As specifically as possible, identify the documents, databases, organizations, and/or individuals that/who will provide the raw information or final figures that will be reported through this indicator.

Documentary/Paper Sources:

1. Published Sources:
1.1. Federal government/agency publications (statistical abstracts, USAID, WB, CIDA, etc.)
1.2. State and local government agencies; universities; think tanks and policy organizations
1.3. International publications (e.g. USAID, WB, CIDA, etc.)
1.4. Private publications, newspapers and journals (The Development Analyst, etc.)
1.5. Directories: lists of people, organizations, and places that could be sources of further information
1.6. Articles in trade journals, etc.

2. Unpublished Sources:
2.1. Studies: observational, experimental, etc.
2.2. Articles, workshop proceedings, minutes of meetings, etc.
2.3. Museums, commercial companies, etc.
2.4. Farmers, traders, exporters, students, etc.

Electronic Sources:

3. Electronic Media:
3.1. Internet (e.g. google.com, yahoo.com, etc.)
3.2. CDs, DVDs, and online journals

Frequency and Timing of Data Acquisition: Describe how often data will be collected and when. If data are collected every month but the indicator will be calculated or reported only annually, the frequency listed here should be "Annually".

Estimated Cost of Data Acquisition: Provide a rough estimate of what it will cost (in US dollars and/or level of effort) to collect and analyze this data. Unless this is a special survey or other new M&E activity outside of current or ongoing plans, it will often be appropriate to note here that the cost will fall within the contract budget, or other similar language. This section helps ASARECA keep track of new budget items or any not previously included in standard or routine obligations.

Responsible Individual(s) at ASARECA/Donor: Identify who will take the lead/be the primary person directly responsible for acquiring the data on this indicator. With as much clarity as possible, identify the person and position within each relevant organization who will have responsibility either for providing relevant data or for otherwise contributing to the indicator calculation. In most cases, there will be at least one ASARECA/Donor representative (and position) identified here AND at least one Implementing Partner person and position.

Location of Data Storage: Describe how data will be stored over time and in what formats. In cases where raw data and calculated indicators will be stored by separate organizations, it is a good idea to note each location where portions of the information necessary to reconstruct the indicator value will be stored.

DATA QUALITY ISSUES

This section reports only on issues related to data quality. Issues of indicator definition, performance, relevance, data availability, or alternative standards should be explained or explored in other sections.
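The distinction between collection frequency and reporting frequency can be illustrated with a small sketch: data gathered monthly are rolled up into a single annually reported indicator value. The months and counts below are invented for illustration.

```python
# Hypothetical sketch: data collected monthly but the indicator reported
# annually, per "Frequency and Timing of Data Acquisition". Counts invented.
monthly_counts = {  # month -> new beneficiaries recorded that month
    "2011-01": 12,
    "2011-02": 8,
    "2011-03": 15,
    # ... remaining months of the reporting year would follow ...
}

# For a cumulative count indicator, the annually reported value is the
# sum of the monthly collections; the frequency recorded in the sheet
# is still "Annually", since that is when the indicator is calculated.
annual_value = sum(monthly_counts.values())
print(annual_value)  # 35 for the three months shown
```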
Date of Initial Data Quality Assessment: Enter the date of the initial DQA and the responsible party.

Validity Concerns: Given what you know at this point in time, how do you feel about the potential for problems with the quality of the data that you will eventually collect and use to calculate this indicator? Do you think your data validly measure the result targeted by this indicator? Do you think your measurements are valid metrics for the (conceptual) result you are trying to track here? Do you expect institutional or other challenges to arise that may affect the degree of measurement error or other systematic errors in your data set?

Known Data Limitations and Significance (if any): Describe any data limitations discovered during the initial DQA. Discuss the significance of any data weakness that may affect conclusions about the extent to which performance goals have been achieved.

Reliability Concerns: Even if your indicator is valid, are your data reliable? Do you foresee any gaps or inconsistencies in the data that might affect the soundness of the Indicator's calculated value, or your ability to interpret/understand the meaning of the Indicator? If limitations arise, do you judge them likely to be highly significant, trivial/unimportant, or somewhere in between?

Actions Taken or Planned to Address Data Limitations: Describe how you have taken, or will take, corrective action, if possible, to address data limitations. Think of all of the things that could go wrong with your planned indicator when you start trying to gather information about the real results of your program activities. How will you try to mitigate or correct for any gaps or measurement errors that may be due to difficulties with the data as noted in the previous two sections?

Date of Future DQAs: Enter the planned date for subsequent DQAs.

Procedures for Future DQAs: Describe how the data will be assessed in the future (e.g. spot checks of projects, financial audits, site visits, etc.).

PLAN FOR DATA ANALYSIS, REVIEW & REPORTING

Data Analysis: Present a concise description of how data for individual indicators or groups of related indicators will be analyzed to determine progress on results. Concisely describe how and when data results will be chronicled. Identify audiences that have a particular interest in this indicator and note how information will be presented to them.

Presentation of Data: Describe how tables, charts, graphs, or other devices will be used to present the data. Qualitative indicators may require more narrative explication.

Review of Data: Describe when and how the Project will review the data and analysis (e.g. quarterly review meetings, portfolio reviews, work planning sessions, etc.).

Reporting of Data: List any internal or external reports that will feature data for this indicator (e.g. Quarterly Report, Bi-Annual and Annual Reports, etc.).

Notes on Baselines/Targets: Explain how the baselines and targets were set and identify any critical assumptions made or potential issues. If baselines and targets have not been set, identify when and how this will be done. Write in the year the baseline was taken, and place the baseline value in the ACTUAL value column.

Baselines: How exactly have you determined the baseline for your indicator value(s)? If no exact baseline was available, what information did you use for a proxy measure, and how did you adjust or otherwise interpret the data in order to arrive at what you consider to be a reasonable approximation of a baseline?

Targets: How exactly have you determined a target (or targets) for your indicator values? If you have extrapolated from existing partial data or estimated based on data from another geographical area, explain your reasoning.

Other: Provide any other information relevant to the data collection and reporting of this indicator.
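One way to make the Targets guidance concrete is a simple extrapolation from the baseline, which is the kind of reasoning the sheet asks you to document. The baseline value and the 10% annual growth assumption below are invented purely for illustration; a real target-setting exercise would justify its own rate.

```python
# Hypothetical sketch of target-setting by extrapolation from a baseline.
# The baseline value (200) and growth rate (10%/year) are assumptions
# for illustration only; document your actual reasoning in the sheet.
baseline_year, baseline_value = 2008, 200
growth_rate = 0.10  # assumed annual increase

targets = {
    year: round(baseline_value * (1 + growth_rate) ** (year - baseline_year))
    for year in range(2009, 2015)
}
print(targets)
# {2009: 220, 2010: 242, 2011: 266, 2012: 293, 2013: 322, 2014: 354}
```

Whatever method is used, the sheet's point stands: the extrapolation rule and its assumptions must be stated explicitly so reviewers can judge whether the targets are reasonable.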
BASELINE, TARGETS & ACTUALS:

Year                    Target Value    Actual Value    Notes
Baseline Year
2009
2010
2011
2012
2013
2014

End of Project Target: ___________________

Comments: After calculating indicator values for one or more periods, note here any adjustments you may have had to make. Adjustments may be needed, for example, according to information provided in any of the sections above (e.g. data that were expected to be available turned out not to be available, perhaps for certain disaggregations; data whose quality was already suspect was in the end judged to be of insufficient validity or reliability; data collection that depended on cooperating government or NGO entities did not occur or was incomplete). In addition, further (unanticipated) issues may have arisen in defining, collecting, calculating, or otherwise arriving at sound and transparently interpretable indicator values. Any such additional information that would be helpful for people interpreting the meaning or significance of the indicator values should be discussed here.

This Sheet Last Updated On: To avoid version control problems, enter the date of the most recent revision to the reference sheet (mm/dd/yyyy).
