
Evaluating Extension Programs

by Murari Suvedi, Michigan State University


We are in an era of accountability, and the demand for program evaluation information is growing. Because the need to support development programs is increasing while resources are limited, competition among agencies for funds has intensified. This has raised expectations for efficiency and accountability in organizations. Policy makers and funding agencies increasingly demand information on how program funds are used and what those programs produced. The following are examples of questions frequently asked by funding agencies, government departments and other stakeholders:

  • We gave you $500,000 over the last three years--what did your agency do with it?
  • We have supported your agency for the last 10 years. Why should we continue to support you?
  • How do we know that your programs are effective?
  • What are your plans to improve or terminate ineffective programs?
  • What new programs need to be developed to meet the needs and problems of the people who are your ultimate beneficiaries?

Evaluation helps answer these questions. The main purpose of evaluation is to improve the quality of a program or a project by identifying its strengths and weaknesses. Extension programs, no matter how large or small, need to be reviewed or assessed to see if they accomplished the stated objectives. Through evaluation processes, we find out what impact the program had on the audience. Evaluation answers whether a program, project or policy should be continued, expanded or terminated. The information is also useful to fine-tune the program or policy and it communicates important results to key individuals or groups who are concerned about the service.

Evaluation is an emerging discipline. Despite the recognition that evaluation is important to build programs on solid ground, there has been no systematic effort to build capacity in this area. Few universities in developing countries offer programs focusing on evaluation, so agricultural development programs have relied on expatriates for program evaluation. As a result, local and indigenous perspectives are not taken into full consideration in programming efforts. In many instances, national policy makers and program managers lack an understanding of the theory and practice of program development and evaluation. Similarly, many extension systems have recognized a shortage of professionals who understand and can guide a systematic evaluation of agricultural extension initiatives and programs. We are not aware of any professional development short courses on outcome-based evaluation, available to busy professionals, that emphasize the social, economic, environmental and quality-of-life impacts of extension programs and services.


The purpose of this module is to expose national level policy makers, project managers and funding agency personnel to program evaluation. Specifically, participants in this evaluation workshop/training module will be able to:
  1. Describe evaluation principles and frequently used models of program evaluation.
  2. Identify indicators of program success of a given agricultural extension project/policy.
  3. Select appropriate methods/techniques of data collection for conducting process and impact evaluations.
  4. Use statistical software to analyze data, interpret results, write evaluation reports, and share findings with stakeholders.
  5. Develop evaluation plans to document impacts of extension programs.

Picture credit: Brent Simpson


The audience for this workshop/training includes policy makers, upper level administrators or program managers in public sector institutions that have an extension mandate (including university faculty with responsibility in extension, rural development, food and nutrition security programs) and administrators in organizations (e.g., USAID staff, NGOs, donors) that fund and manage projects with extension components.

Content and Teaching Material

The content of the workshop/training module consists of a brief introduction to the module and a list of evaluation topics, each accompanied by PowerPoint presentations and readings/references. Samples of evaluation plans, data collection instruments, evaluation reports, and evaluation competency self-assessment instruments are included in the module as appendices.

Delivery Methods

The Training Manual, handouts, and presentations are attached below and can be easily downloaded.

We invite users to widely share the material. We would appreciate feedback on how it is being used and welcome suggestions for improvements.

To request a training conducted by MEAS, please contact us using the contact information provided.

The method of delivery for this module is somewhat flexible. Dr. Suvedi and his team at Michigan State University will offer the evaluation training on a demand-driven basis. Through the MEAS LWA mechanism, USAID missions worldwide can buy into this module as face-to-face, in-country training (we recommend 10-15 participants), or fund training for participants from multiple countries at a regional location.

MSU will be able to offer approximately 9 in-country training sessions throughout the year on a first-come first-served basis.

If demand is higher, the module could be offered as an MSU online course for worldwide participants. Alternatively, we will offer it as an intensive summer course at Michigan State University.


Parts of this course were taught at the "Program Evaluation Workshop for Agricultural Extension Professionals" in Nepal, December 18, 2011.
Attachments:
  • Training Manual, Microsoft Word version (1232k, v. 1, Oct 27, 2011, Andrea Bohn). Best for those who wish to use parts of the manual or make modifications to the course.
  • Training Manual, PDF version (2453k, v. 2, Oct 27, 2011, Andrea Bohn). Best for printing and disseminating to course participants.