Monitoring and Evaluation


Global Forum for Rural Advisory Services, GFRAS





The Global Forum for Rural Advisory Services (GFRAS) has supported the development of this draft Extension Evaluation Guide. To enhance the contribution rural advisory services can make to rural livelihoods, it is essential to monitor and evaluate achievements within complex rural settings and to engage in a lesson-learning process with a wide range of stakeholders and organisations. The Guide is intended primarily for six sets of evaluation stakeholders:

  • Those commissioning and managing evaluations
  • Professional evaluators and staff responsible for monitoring systems
  • Those involved in knowledge and results-based management within a range of organisations involved with extension
  • Staff of public extension agencies, farmers associations, and other organisations directly or indirectly engaged in providing extension services
  • Professionals involved in training and educating evaluators
  • Researchers looking for ways to synergise their efforts with evaluation initiatives

The process of preparing this Guide began in 2010 with the production of a Review of Literature on Evaluation Methods Relevant to Extension and a Meta-evaluation of Extension Case Studies. These materials, combined with extensive consultation with a range of stakeholders, were then used as background for the development of this Guide. If you are interested in field-testing this Guide, please contact info@g-fras.org, as GFRAS would very much appreciate receiving feedback on its usefulness.



The Global Forum for Rural Advisory Services (GFRAS) has commissioned the Natural Resources Institute to develop a toolkit for the evaluation of extension (projects, programmes, tools and initiatives). This commission has a number of components:

  • A meta-evaluation of 15-20 evaluation case studies (presented here)
  • A meta-review of the literature relevant to extension evaluation methods
  • A workshop with practitioners and experienced evaluators to discuss the findings of the meta-evaluation and meta-review and to identify an initial set of tools
  • A proposal for testing the proposed tools in a second phase of the project
  • A brief of the toolkit for policymakers.

The overall purpose of this project is to identify methods for better evaluation of extension through the development of a toolkit for extension evaluation. The meta-evaluation and meta-review will also provide an in-depth basis for the selection of the approaches, methods and tools in the toolkit.


Evaluation Experts in MEAS 

Murari Suvedi 
Professor and Senior Associate to the Dean 
Dept of Community, Agriculture, Recreation and Resource Studies 
135 Natural Resources 
Michigan State University 
East Lansing, MI 48824 
suvedi@anr.msu.edu






Training Material

MEAS Module on Evaluating Extension Programs

The purpose of this module is to expose national level policy makers, project managers and funding agency personnel to program evaluation. Specifically, participants in this evaluation workshop/training module will be able to:
  1. Describe evaluation principles and frequently used models of program evaluation.
  2. Identify indicators of program success of a given agricultural extension project/policy.
  3. Select appropriate methods/techniques of data collection for conducting process and impact evaluations.
  4. Use statistical software to analyze data, interpret results, write evaluation reports, and share findings with stakeholders.
  5. Develop evaluation plans to document impacts of extension programs.


Publications

Module 7 (p. 539 ff) in: The World Bank (2012). Agricultural Innovation Systems: An Investment Sourcebook.

http://siteresources.worldbank.org/INTARD/Resources/335807-1330620492317/9780821386842.pdf

Fitzpatrick, Jody L.; Sanders, James R.; and Blaine R. Worthen. (1997). Program Evaluation: Alternative Approaches and Practical Guidelines, Second Edition. New York: Longman.

Frechtling, Joy (2002). The 2002 User-Friendly Handbook for Project Evaluation. Division of Research Evaluation and Communication, National Science Foundation. www.nsf.gov/pubs/2002/nsf02057/nsf02057_1.pdf 
www.nsf.gov/pubs/2002/nsf02057/start.htm

Patton, Michael Quinn. (1997). Utilization-Focused Evaluation: The New Century Text. Thousand Oaks: Sage Publications.

Suvedi, M. and S. Morford. (2003). Conducting Program and Project Evaluations: A Primer for Natural Resource Program Managers in British Columbia. Forrex-Forest Research Extension Partnership, Kamloops, B.C. Forrex Series 6. www.msu.edu/~suvedi/Resources/Documents/4_1_FS6.pdf

USAID (2011). Evaluation Policy. Bureau for Policy, Planning, and Learning. January 19th. www.usaid.gov/evaluation/USAID_EVALUATION_POLICY.pdf?020911

Weiss, Carol H. (1998). Evaluation: Methods for Studying Programs and Policies (2nd Edition). New Jersey: Prentice Hall.

Wholey, Joseph S.; Hatry, Harry P.; and Kathryn E. Newcomer. (eds.) (1994). Handbook of Practical Program Evaluation. San Francisco: Jossey-Bass Publishers.

W.K. Kellogg Foundation (2000). Evaluation Handbook. Battle Creek, MI. www.wkkf.org/documents/wkkf/evaluationhandbook/

Online Resources

Online Evaluation Resource Library
http://oerl.sri.com/

Michigan State University - Extension Evaluation Resources
www.msu.edu/~suvedi/Resources/Evaluation%20Resources.htm

Pennsylvania State University - Extension
http://extension.psu.edu/evaluation

University of Wisconsin - Extension 
Program Development and Evaluation. www.uwex.edu/ces/pdande/evaluation/evaldocs.html

BMZ/GIZ-conference “Systemic Approaches in Evaluation” (25-26 January 2011, Eschborn/Germany).
www.evaluation-conference.de


CRS Publications Pertaining to Evaluation (general)