A Toolkit for Volunteer Leaders
Program Evaluation E-10

If you want life to be easy, you must pay for it.

Evaluation is the last step in the program planning process. Yes, evaluation should be considered during the planning process; it should not wait until the program has been completed. To understand evaluation's place in the program planning process, review module E-4. A correctly planned evaluation follows the other steps in program planning (described in modules E-5 through E-9 and M-2).

LEARNING OBJECTIVES

When you finish this module you should be able to:

1. Define "evaluation."

EVALUATION DEFINED

Evaluation is simply "a process to determine if objectives have been met." Well-written objectives are essential to planning an evaluation. When planning an evaluation, start by reviewing the objectives. If you lack a set of clear, appropriate objectives, you will find it difficult to get agreement on any plan for evaluation.

LEVELS OF EVALUATION

Dr. Claude Bennett, Evaluation Specialist, Extension Service, U.S. Department of Agriculture, has outlined seven levels of evaluation that help us understand the benefits and limitations of particular evaluation techniques and instruments.

1. Resources
How many dollars, how many handouts and exhibits, what resource people, and what materials did we use in the educational program that we are evaluating?

2. Activities
How many planning meetings, demonstrations, workshops, field trips, contests, telephone calls, and newspaper articles did we complete?

3. Participants
Who participated in the activities that we organized? What were their ages, ethnic backgrounds, gender, socio-economic status, areas of residence, and organizations represented (and how many of each category)?

4. Reactions
Did the participants like the activities? What comments or suggestions did they make regarding the activities?

5. Changes in Knowledge, Attitudes, Skills and Aspirations (KASA)
As a result of your program's activities, how did the participants improve (or regress) in KASA? Was there really any difference in the participants as a result of the program?

6. Adoption of New Practices
Did the participants retain the KASA changes over time? Were these changes of short duration, or did they become part of the participants' lifestyles?

7. End Results
What were the long-term effects of your program on the local community or society in general? Did the number of teen pregnancies decrease? Did the production of wheat (bushels per acre) increase? Were new organizations created? Was the lifestyle of community members improved? Can the improvements be attributed to the educational program that you provided?

These levels of evaluation represent an increasing time requirement and increasing difficulty for the evaluator. They also represent an increasing value of evaluation results. Determining resources, activities and participants is relatively easy; it can be done by observation and counting. To determine reactions you will need an organized technique (a trained observer/recorder or a post-meeting questionnaire). To determine changes in KASA and adoption of new practices you will need to gather evidence before and after the program using a reliable and valid procedure. End results take time and great sophistication of measurement by evaluators. On the other hand, if you can gather evidence about changes in KASA or adoption of new practices, you will have stronger evidence for continuing your program or for obtaining funding for similar programs.
EVALUATION ALTERNATIVES

To most of us, evaluation means completing an evaluation form at the end of a meeting or training program. Other, often more useful, forms of evaluation are described below.

Round Robin

The round robin evaluation simply involves calling on each person in turn to share reactions to a given activity. For example, "What was your evaluation of last week's meeting (4-H horse show, training session)? What did you particularly like, and where might improvements be made?" Appoint a secretary to record responses. An important ground rule is no discussion, except for questions of clarification. This keeps the group from getting bogged down in defending, disputing and discussing individual comments until everyone has been heard. The round robin ensures input from everyone, without debate. Once everyone has been called upon, the composite listing can then be discussed, if necessary, and conclusions drawn.

One of the most commonly used forms of evaluation is a call for public input. The chair (or other official) poses a general question to the group as a whole, e.g., "Any comments on last week's horse show?" Experience shows that only a few people respond to such calls for input. The round robin, in contrast, will ensure greater input and involvement. If members find that their input is respected and used, they will be more willing to speak openly in the future, and the quality of evaluation will increase.

Buzz Groups

Buzz groups, often called process groups, involve quickly breaking into small groups of two to five people. Each buzz group is asked to assess the meeting (workshop, etc.). Each person is asked to share his or her frustrations, concerns, suggestions and satisfaction with how things are going. Where there is consensus, the buzz group should initiate action to improve the meeting or workshop, e.g., propose to the group that..., or talk to the chair about..., or suggest.... Set a time limit of 15 to 30 minutes for this exercise. One option is to incorporate the process group assignment into an extended break. The advantage of this exercise is that people tend to share more openly in smaller, more private groups than they do in larger public groups. Also, the focus of the exercise is on action rather than complaints. Periodic process breaks are invaluable in developing a cohesive team spirit along with an added sense of personal responsibility.

Observer

Arrange for one or two individuals to serve as process observers. They do not enter into the discussion but instead sit back and observe how the meeting is progressing (e.g., what are the group dynamics, who is and isn't talking, how are decisions made, what team-building roles are exhibited). The observer is to identify strengths as well as weaknesses. The checklist provided earlier in this chapter can serve to guide the observer in analyzing a group. The observers are then called upon at the end of the meeting to share their observations and suggestions with the group, and to pinpoint team-building skills the group could work on. The advantage of designated observers is that they are removed from the operation of the group and can be more systematic and objective in their assessment. It is difficult for individuals who are directly involved in a meeting to step back and critique what is happening.

Review Goals

Every organization should plan time at least quarterly to review its goals and objectives. What has been accomplished? What has not? What adjustments (additions or deletions) need to be made?
How do individuals in the group feel about the group's accomplishments? How well is the group (or designated committees) working together to accomplish those goals? Do these goals still reflect the priority interests of group members? What are the group's shortcomings? Such evaluation has the advantage of focusing the group's attention on accomplishments. Changes in the group's goals, committee organization, or how meetings are conducted can be made now rather than put off until next year.

One-on-One Consultation

One of the most useful forms of evaluation is face-to-face consultation. Good managers are in frequent contact with workers (group members). They seek feedback and advice. Most people are reluctant to give advice unless it is asked for. To get useful feedback, ask specific questions. Quickly get any areas of potential awkwardness out in the open, e.g., "I heard you were upset with me about... Let's talk about that first." Be an active listener. Don't respond defensively. One-on-one communication also works best when giving feedback that has not been requested. No one likes to be criticized in public. (Review module V-6 for guidelines on giving constructive criticism.)

Written Evaluation Survey

The traditional evaluation form asks participants to rate the meeting (workshop or event) according to certain listed criteria. Several open-ended questions should also be included for a more detailed and personal response, e.g., "Where might improvements be made?" Two examples of survey evaluation forms are attached ("End of Meeting Suggestion Slip" and "Workshop Evaluation"). Such surveys have the advantage of being anonymous. Also, the results lend themselves to statistical analysis (see the tallying sketch at the end of this module). Take care in preparing the evaluation form to ensure that the questions asked are easily understood and that they provide information that can be used.

WRITTEN EVALUATION INSTRUMENTS

A. End of Meeting Suggestion Slip

1. Please rate today's meeting on the basis of the following criteria.
2. What were the strong points of this meeting?
3. What were the weak points?
4. What improvement would you suggest?
(You need not sign your name.)

* * * * * * * * * * * * * * * * * * * *

B. Workshop Evaluation

Title of workshop: _____________________________________________________________
1. How would you rate this workshop on each of the following items? Please circle appropriate number:
2. Specifically how has this workshop helped you?
3. In what areas would you like further information or help?
C. Module ____ Evaluation

1. How long did it take you to read this module and complete the prescribed exercises? (Check one)
_______ less than 30 minutes
_______ 30-60 minutes
_______ more than 60 minutes

2. How helpful was the information provided in meeting your leadership needs? (Circle one of the seven numbers on the following continuum.)
3. Was the module well written and easy to follow? (Circle one of the seven numbers on the following continuum.)
4. I found this module (Check one or more)
_____ too elementary for my needs.
_____ too advanced for my needs.
_____ overly academic.
_____ very practical.
_____ well matched to my needs and leadership experience.
_____ provided little I didn't already know.
_____ stimulated my thinking.
_____ provided me with lots of good information.
_____ difficult to put into practice.
_____ __________________________________________

5. Specifically, the information in this module has helped me by .....
6. How could this chapter be improved?
7. Considering the content of this module and comparing my knowledge and confidence before and after I completed this module
D. Overall Program Evaluation
Getting Results: A Guide to Effective Leadership

1. How did you undertake this program? (Check one)
_____ independent self-study
_____ as part of a study group

2. Did you find the program motivating? (Circle one of the four numbers on the following continuum.)
3. How many weeks did you dedicate to this program?
4. Which of the modules did you find most beneficial?
5. a. During the past five years, in what other leader training programs have you participated? (List them in the space below.)
b. How would you rank this program in comparison to those?
6. How have you used the information you've gained through this program?
7. In what areas would you like further information or help?
8. Would you recommend this program to others? _____ Yes _____ No
E. Evaluation of Classroom Teacher

EXERCISE: Use (or adapt) one of the evaluation instruments with your group or organization. Discuss the findings and the success of your evaluation with a partner who is also interested in evaluation.
EXERCISE: Start a notebook (or a file) of evaluation instruments. Ask different groups, organizations, agencies, etc. for copies of evaluations that they use, especially of educational programs for community groups.
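If your group collects written evaluation survey responses electronically, the statistical analysis mentioned under "Written Evaluation Survey" can be as simple as averaging the ratings for each item. The Python sketch below is only illustrative and rests on assumptions not found in this toolkit: responses saved in a CSV file named "survey_responses.csv" (a hypothetical file name), one row per participant, one column per survey item, numeric ratings for rated items, and text for open-ended answers.

# Illustrative sketch: average the numeric ratings for each survey item.
# Assumptions (not part of the toolkit): responses live in a CSV file,
# one row per participant and one column per survey item; rated items
# contain numbers, open-ended items contain text.

import csv
from statistics import mean


def tally_ratings(path):
    """Return the average numeric rating for each survey item."""
    ratings = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for item, answer in row.items():
                try:
                    value = float(answer)
                except (TypeError, ValueError):
                    continue  # skip blank or open-ended (text) answers
                ratings.setdefault(item, []).append(value)
    return {item: round(mean(values), 2) for item, values in ratings.items()}


if __name__ == "__main__":
    for item, average in tally_ratings("survey_responses.csv").items():
        print(f"{item}: average rating {average}")

The averaging is kept deliberately simple; a count of responses per item, or a breakdown by meeting, could be added in the same way if your group wants more detail.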