Using a Logic Model to Measure Program Impact


This webinar discusses the importance of using a logic model to measure the impact of a program. It covers the components of a logic model, including inputs, outputs, activities, outcomes, and impacts, and provides tips for creating an effective logic model. The presenters also share strategies for gathering evidence of impact and interpreting findings.

About Using a Logic Model to Measure Program Impact

PowerPoint presentation about 'Using a Logic Model to Measure Program Impact'. The slides cover the components of a logic model (inputs, outputs, activities, outcomes, and impacts), tips for creating an effective logic model, and strategies for gathering evidence of impact and interpreting findings. Key topics: logic model, program impact, evidence, inputs, outputs, outcomes, impact measurement, program evaluation.

Presentation Transcript


1. Gathering Evidence of Impact: A Continued Conversation. Jan Middendorf and Cindy Shuman, Office of Educational Innovation and Evaluation

2. Using a logic model to describe your program. INPUTS: program investments. OUTPUTS: activities and participation (what you do, and with whom). OUTCOMES and IMPACT: short-, medium-, and long-term results (what your clients do as a result): learning, behavior change, and condition change.
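
The logic model columns above can also be written down as a simple data structure when you want to track a program's inputs, outputs, and outcome chain alongside your evaluation data. The following is a minimal, illustrative Python sketch; the class and field names are ours, not part of the webinar, and the example entries are hypothetical, loosely based on the Money Quest program named later in the slides.

```python
# Minimal sketch of the logic model columns as a Python data structure.
# Field names are illustrative, not taken from the presentation.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)         # program investments
    activities: list = field(default_factory=list)     # outputs: what you do
    participation: list = field(default_factory=list)  # outputs: with whom
    short_term: list = field(default_factory=list)     # outcomes: learning
    medium_term: list = field(default_factory=list)    # outcomes: behavior change
    long_term: list = field(default_factory=list)      # impact: condition change

# Hypothetical example for a financial-education program
money_quest = LogicModel(
    inputs=["staff time", "curriculum", "funding"],
    activities=["workshops", "one-on-one coaching"],
    participation=["youth participants"],
    short_term=["increased financial knowledge"],
    medium_term=["participants reduce debt"],
    long_term=["financial self-sufficiency"],
)
```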

3. Four Levels of Evaluation (Kirkpatrick Model of Evaluation). Level 1: Reaction. Level 2: Learning/Skill Building (short-term outcome; ask at the end of the meeting, lesson, workshop, etc.). Level 3: Transfer/Behavior Change (medium-term outcome; ask sometime after, maybe 6 months later). Level 4: Results/Impact (long-term outcome).
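
For readers who record evaluation data in code, the four levels and their timing notes can be captured in a small enumeration. This is only a sketch; the class and member names are ours, and the comments simply restate the slide.

```python
# Sketch of the four Kirkpatrick levels; names and comments restate the slide.
from enum import Enum

class KirkpatrickLevel(Enum):
    REACTION = 1  # Level 1: participant reaction
    LEARNING = 2  # Level 2: learning / skill building (short-term; ask at end of session)
    BEHAVIOR = 3  # Level 3: transfer / behavior change (medium-term; ask ~6 months later)
    RESULTS = 4   # Level 4: results / impact (long-term outcome)

print(KirkpatrickLevel.BEHAVIOR.value)  # 3
```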

4. Why Evaluate? Evaluation of outcomes: how will you know what your clientele did as a result of your educational program? What is the evidence that they used the information?

5. Four main purposes of evaluation. Improvement: to improve the program; to enhance quality; to manage more effectively and efficiently. The effort to enhance programs. Accountability: to assess merit or worth; to assess effects; to assess costs and benefits. The effort to make judgments about the value of a policy or program. Knowledge development: to gain new insights. The effort to add to the knowledge base about effective practice or to add to policy debate. Oversight and compliance: to assess the extent to which a program follows rules, regulations, mandates, or other formal expectations. Source: Mark, M. M., Henry, G. T., & Julnes, G. (2000). Evaluation: An integrated framework for understanding, guiding and improving policies and programs. San Francisco: Jossey-Bass.

6. Indicators: Evidence of Achieving Outcomes. What would it look like? How would I know it? If I were a visitor, what would I see, hear, read, and/or smell that would tell me this thing exists? If the outcome is achieved, how will you know it? What will it look like? What is the evidence?

7. Indicators - Evidence: the information needed to answer your evaluation questions. Example: Did participant landowners or managers improve their land management practices? Evidence: # or % of acres managed according to guidelines; # or quality of conservation plans implemented.

8. Indicators - Evidence: the information needed to answer your evaluation questions. Example: Did participants increase their ability to achieve financial self-sufficiency? Evidence: # and % who increased financial knowledge; # and % who reduced debt; # and % who established an emergency fund.
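
Because evidence like this is usually reported as counts and percentages, a small calculation over participant records can make the indicators concrete. The data, field names, and function below are hypothetical, not from the webinar; this is only an illustrative sketch of one way to tally "# and % who ..." evidence.

```python
# Illustrative calculation of "# and % who ..." indicators from hypothetical records.
participants = [
    {"id": 1, "increased_knowledge": True,  "reduced_debt": True,  "emergency_fund": False},
    {"id": 2, "increased_knowledge": True,  "reduced_debt": False, "emergency_fund": True},
    {"id": 3, "increased_knowledge": False, "reduced_debt": True,  "emergency_fund": True},
]

def indicator(records, outcome):
    """Return (count, percent) of records where the named outcome is True."""
    count = sum(1 for r in records if r[outcome])
    return count, 100.0 * count / len(records)

for outcome in ("increased_knowledge", "reduced_debt", "emergency_fund"):
    n, pct = indicator(participants, outcome)
    print(f"{outcome}: {n} participants ({pct:.0f}%)")
```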

9. Have the pets been fed today? How would you know that the animals have been fed? What is the evidence?

10. Let's practice. What is the evidence of: high blood pressure? A clean neighborhood? A popular movie? A good carpenter? Learning at the workshop? Would the evidence be different for young people vs. seniors, high- vs. low-income neighborhoods, rural vs. urban residents, or by ethnicity?

11. Evidence is often expressed as numbers or percentages (number of, percent of, ratio of, incidence of, proportion of). However, not all evidence is numbers; qualitative evidence may be important. Remember, "Not everything that counts can be counted."

12. Work on Evaluation Plans. Develop a question or two to understand what clientele did differently as a result of your educational program. Develop a plan for how you will gather the evidence: get names and phone numbers and call participants 6 months later; ask a third party what they see that is different; observe differences yourself; other?

13. How good is the indicator? It should be tangible: you should be able to touch or know the information in some way, whether you see it (observation), read it (surveys, records, etc.), or hear it (from individuals or others). Tips: a good indicator is direct, specific, useful, practical, culturally appropriate, adequate, and clearly defined.

14. What is an indicator? An indicator is the specific information, or evidence, that represents the phenomenon you are asking about. Indicator of fire = smoke; indicator of academic achievement = grades.

15. How good are your questions? Can the questions be answered given the program? Are the questions truly important? Will the questions provide new insights? Can the questions be answered given your resources and timeline? Have the concerns of key users been included?

16. Identify key evaluation questions Who wants to know what about this program?

17. Evaluation Questions. Clarify your evaluation questions and make them specific. Separate what you need to know from what you would like to know, and prioritize. Check: will answers to these questions provide important and useful information?

18. Components of a program: situation; resources (inputs); outputs (activities, participants); outcomes (a chain of outcomes from short- to long-term); external factors and assumptions.

19. What is your purpose for evaluating? We are conducting an evaluation of _____ (program name) because ______ in order to __________. Example: We are conducting an evaluation of the Money Quest Program because we want to know to what extent youth who participate learn and use recommended money management skills, in order to report program outcomes to our funder.
