Before you plan your project, determine what you hope will occur as a result of the project – that is, what you expect your intended audience to know or do. With that foundation, you can design your initiative with those behavior and/or knowledge changes in mind, then evaluate to determine whether they occurred.
Note: Some of the tools available on this site use the term "outcomes," though "behavior/knowledge changes" is the term REAP CEP is now using. The tools using the word "outcomes" are relevant and useful, and the REAP CEP Board encourages you to use the tools to assist you with the evaluation components of your project.
In addition to the links below, you may contact the REAP CEP Coordinator, firstname.lastname@example.org or 515-313-8909 for assistance.
The Progress Markers and Chain of Outcomes tools can help you identify what you expect your intended audience to know or do at the conclusion of your project. These exercises may help you express the behavior/knowledge changes you desire.
Of all the behavior/knowledge changes you can imagine from your project:
- Which behavior/knowledge changes are most important?
- Which might help you to improve your program, adapt or expand your program into new areas, or attract future funding for your work?
- Do you think your project will meet one or two conservation education needs in Iowa? If so, could evaluating for those behavior/knowledge changes demonstrate that you are meeting those needs?
- Which behavior/knowledge changes do you feel pretty certain will occur, but you'd really love to know for sure?
- Which behavior/knowledge changes could you measure -- using your existing time and experience, and perhaps a little more money?
Choose at least one behavior/knowledge change you intend to measure. Determine how and when you can measure it.
In your application under "Audience and Behavior/Knowledge Changes":
- Explain what you expect the intended audience to know or do at the conclusion of your project and why your intended behavior/knowledge changes are achievable.
In your application under "Evaluation":
- Explain how you will show if your desired behavior/knowledge changes occurred, and how your evaluation results will be useful to the applicant and/or to the project.
- Identify evaluation tool(s) you plan to use, such as but not limited to surveys, focus groups, pre- and post-tests and/or retrospective pre-/post-questionnaires.
- Include at least one specific question or observation you will include in your evaluation. Applicants are encouraged to attach a copy of their entire evaluation, or a draft evaluation, to their proposal.
- If you are requesting funding for a project of yours that received past CEP support, explain how your evaluation of that project demonstrates the need for continuation. Also, explain any changes you will make in conducting or evaluating the project this time.
- If the environmental education community has conducted a similar type of project, explain what you have learned that may be applied to your project.
Example of how the "Evaluation" section in a grant application might look.
If you need funding to conduct your evaluation, you may request it within your grant application.
Please send a brief summary of your results to REAP CEP when your evaluation is complete, along with your final report.
Progress Markers
The Progress Markers form helps you think about the behavior/knowledge changes of a program in a systematic way and identify good points for evaluation to occur.
Download the Progress Markers to see the one-page form and the directions for its use.
Read page 4 of the pdf to see an example of how the Progress Markers form could be used to determine some behavior/knowledge changes of a sample workshop.
Chain of Outcomes
The Chain of Outcomes form helps you "think ahead" to the stages or flow of a conservation education project, so that you can describe the benefits you believe your project will provide into the future. The steps you describe on the form can give you ideas of behavior/knowledge changes worth evaluating.
Download this Chain of Outcomes to see the one-page form and the directions for its use.
Read page 4 of the pdf to see an example of how the Chain of Outcomes form could be used to determine some behavior/knowledge changes of a sample workshop.
The heart of evaluation is its questions. Good questions elicit high-quality responses from program participants and can help you summarize your results effectively. However, it is not easy to develop good questions.
Templates are provided for four kinds of evaluation questions.
- Two templates measure knowledge and/or attitudes.
- Two additional templates assess behavioral change.
Each question template is accompanied by a scale or rating system. You must use the scale for the question to be effective. Yes-no or true-false questions generate weaker data.
These templates may be especially helpful if you have little knowledge or experience with evaluation. They are written in a standard format, but program-oriented details are missing. There are blanks inside brackets where you should place the details that fit your program and your evaluation goals.
All of the question templates are suited to a written survey and are intended to be easy to tabulate and analyze. However, you may use the questions as part of individual or group interviews--face-to-face or by telephone.
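As a rough illustration of how easy such scaled questions are to tabulate, the sketch below counts hypothetical responses to a single survey question rated on a generic 1–5 agreement scale (the data and scale are invented for this example, not taken from the templates):

```python
from collections import Counter

# Hypothetical responses to one scaled survey question
# (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 5, 2, 4, 5, 4, 3]

counts = Counter(responses)                 # tally how many of each rating
average = sum(responses) / len(responses)   # overall mean rating

for rating in range(1, 6):
    print(f"Rating {rating}: {counts.get(rating, 0)} responses")
print(f"Average rating: {average:.1f}")
```

Even a simple tally like this makes it easy to report, for example, what share of participants agreed or strongly agreed with a statement, which is far harder to do with open-ended answers.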
The timing of the questions can be very important. Some questions should be asked of participants immediately after the program. Other questions need to be asked after a period of time.
It is essential to "try out" questions on yourself and others before you finalize your survey. You may learn that you need to provide more possible answers, because none of those you offered "fit" your or your testers' situations. Testing your questions can help clarify language and ensure you are getting the type of information that will be valuable to you. Make the survey as short as possible (without scrimping on data collection), and try to make it fun for participants.
ISU Extension provides information and guidance regarding outcomes evaluation. Contact Nancy Grudens-Schuck at 515-294-0894 for more information.
Selected resources for further inquiry into evaluation:
Grudens-Schuck, N., Lundy-Allen, B., & Larson, K. (2004, May). Focus group fundamentals. Ames, IA: Iowa State University Extension. Available at: http://www.extension.iastate.edu/Publications/PM1969B.pdf
Larson, K., Grudens-Schuck, N., & Lundy-Allen, B. (2004, May). Ames, IA: Iowa State University Extension. Available at: http://www.extension.iastate.edu/Publications/PM1969A.pdf
Western Michigan University - The Evaluation Center
Contains many useful evaluation checklists and a glossary of evaluation terminology, plus Standards for Program Evaluation, Standards for Personnel Evaluation, and Standards for Student Evaluation.
Online Evaluation Resource Library
Teacher professional development evaluation resources that may serve as helpful models for developing survey questions. The site is easy to use and contains other useful pages.
PennState Extension: Evaluating an Event, When and How
Increasing Your Survey Response Rate
Brown, R. E. (2002). An integral approach to evaluating outcome evaluation training. American Journal of Evaluation, 23(1), 1-17.
- This article contains a diagram that project staff and Board Members felt was a helpful visual representation of the process of learning about outcome evaluation. It may be reassuring for project staff to see that the process of learning evaluation is not a simple linear formula leading to easy implementation. Feelings of frustration depicted in the graph ("You stop explaining it and now I don't get it") are not uncommon, but they don't have to signal inevitable failure.
Posnanski, T. J. (2002). Professional development programs for elementary science teachers: An analysis of teacher self-efficacy beliefs and a professional development model. Journal of Science Teacher Education, 13(2), 189-220.