3 steps to effective program evaluation: Start with a logic model

Robin LaBarbera • January 10, 2022

Align everything to your program's outcomes.

If you’re a nonprofit leader and you’ve ever applied for a grant, you know the grant-writing process includes answering this question: “How will you evaluate your project?” Funders want to know what difference your organization makes, and you are expected to provide data that demonstrates your organization’s impact.


That, in essence, is the point of program evaluation: the process of collecting, analyzing, and using information about a program’s activities to assess its effectiveness and efficiency. Funders look for evidence that your program is achieving its intended outcomes, confirmation that their contribution was money well spent.


How will you evaluate your program?

How do you know that your program is effective? This is perhaps the most important question a nonprofit can answer for itself, the people its programs serve, and the donors who support its work. The best way to answer it is to understand, measure, and communicate the value of your program by evaluating your program’s outcomes.


It’s not enough to simply report the number of people served (e.g., “16 students enrolled in our after-school reading program”). These are outputs. Outputs measure simple metrics such as the number of participants, service hours delivered, or modules completed. Tracking outputs enables you to state what services you provided and for how many people. Outputs are necessary data in your evaluation, but you shouldn’t stop there if you want to satisfy grant makers.


Outcomes measure the changes you expect to see. What has changed for participants as a result of your program’s activities? Outcomes are generally expressed in terms of changes in knowledge, skills, attitudes, and/or behavior. The question “What impact is our program having?” should guide your program evaluation. You can then align the difference you expect to make with what you measure and report.


RELATED: Download our free Guide to Program Evaluation.


Three steps for program evaluation.

In a program evaluation, you (1) identify the changes you want to make, (2) gather data to measure those changes, and (3) report what the data say about the expected change. Through this process, you can provide professional-level evidence to funders that you’ve achieved what you said you would when you wrote the grant request. Your program made a difference, and you want to provide the data that clearly demonstrates that to stakeholders.


Now let’s examine the steps in a program evaluation for a hypothetical nonprofit program that provides reading instruction to high-school students with mild learning disabilities in an after-school setting. We’ll call it “Read2Achieve” or R2A. 


Step 1: Develop a logic model. 

An effective program evaluation should start with a logic model. A logic model is a visual roadmap that connects your program’s activities to the outputs they produce and the outcomes you expect those activities to achieve.


R2A stated their outputs as: (1) number of students enrolled, classrooms utilized, and teacher/facilitator staff employed; (2) number of modules completed; and (3) number of assessments completed and student scores on those assessments.


R2A’s outcomes included: (1) increased student reading comprehension and language development, and (2) increased student academic self-efficacy. R2A determined that academic self-efficacy, or students’ attitudes toward their own ability to achieve academic success, matters for overall school success even outside the R2A program, and it is therefore a key outcome for their program.
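To make this concrete, here is a minimal sketch of R2A’s logic model captured as a simple data structure. The Python format and field names are illustrative assumptions, not a description of R2A’s actual tools; many organizations record the same information in a one-page table or spreadsheet, and that works just as well.

```python
# Illustrative sketch of a logic model as a simple data structure.
# The field names are hypothetical; a one-page table or spreadsheet
# carrying the same information serves the same purpose.
logic_model = {
    "activities": [
        "after-school reading instruction for students with mild learning disabilities",
    ],
    "outputs": [
        "number of students enrolled, classrooms utilized, and teacher/facilitator staff employed",
        "number of modules completed",
        "number of assessments completed and student scores on those assessments",
    ],
    "outcomes": [
        "increased reading comprehension and language development",
        "increased student academic self-efficacy",
    ],
}

# Every data collection tool and every report section should trace back to an entry here.
for outcome in logic_model["outcomes"]:
    print("Outcome to measure:", outcome)
```

The point is alignment: each survey item you write and each line of your final report should map back to one of these outputs or outcomes.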

 

Step 2: Collect your data. 

Once you’re clear on your program’s intended outcomes, create data collection tools (e.g., surveys, focus groups) that align with the outputs and outcomes you identified in the logic model. Output data are typically tracked in spreadsheets or databases (e.g., what services you provided and to how many people), while outcome data require gathering information from program participants. Participants include people who receive services directly (e.g., students) and those who can observe changes in participants (e.g., teachers/facilitators).


R2A decided to collect data through pre- and post-surveys with all students in their program (they currently operate programs in five schools in the county). One survey item, for example, asks students to respond to this question:


Because of this program, do you feel better able to be successful in school, less able, or about the same? Circle one:

a.   Better able

b.   Less able

c.    About the same, still able to be successful

d.   About the same, still not able to be successful

Any comments about your answer (optional):
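To illustrate how responses to an item like this can become an outcome figure, the sketch below tallies the share of students who circled “Better able.” The code, the file name r2a_post_survey.csv, and the column label are hypothetical, one simple way to do the tally rather than a description of R2A’s actual tools.

```python
import csv
from collections import Counter

# Hypothetical post-survey export: one row per student, with a column named
# "self_efficacy" that records the circled answer (a, b, c, or d).
with open("r2a_post_survey.csv", newline="") as f:
    answers = [row["self_efficacy"].strip().lower() for row in csv.DictReader(f)]

counts = Counter(answers)
total = len(answers)

# Answer "a" corresponds to "Better able" on the item above.
better_able_pct = 100 * counts["a"] / total if total else 0.0
print(f"{better_able_pct:.1f}% of {total} respondents feel better able to succeed in school")
```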


Step 3: Generate your reports.

The report for R2A highlighted the following outputs:

  • 50 students, 10 teacher/facilitators, 5 classrooms
  • 3 hours of tutoring per week (1 hour per day, 3 days per week)
  • An average of 1.5 modules completed per student per week over the 10-week program session (50 students × 1.5 modules × 10 weeks = 750 modules)


Outcome data showed an average increase of one grade level in reading over the 10-week program, and that over 87% of participants reported feeling better able to achieve overall academic success.
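For readers curious about how a figure like the grade-level gain is derived, here is a minimal sketch that averages the difference between pre- and post-program reading assessments. The student IDs and scores are made up for illustration; real data would come from the assessments R2A tracks as outputs.

```python
# Hypothetical pre/post reading scores in grade-level equivalents.
pre_scores = {"student_01": 6.2, "student_02": 7.0, "student_03": 5.8}
post_scores = {"student_01": 7.3, "student_02": 7.9, "student_03": 6.9}

gains = [post_scores[s] - pre_scores[s] for s in pre_scores]
average_gain = sum(gains) / len(gains)
print(f"Average reading gain: {average_gain:.1f} grade levels over the 10-week session")
```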


Together, the outputs (who participated and what services they received) and the outcomes (the program’s impacts) provide a comprehensive picture of the degree to which R2A is creating a pathway to overall academic success.


This R2A example illustrates how to define outputs and outcomes and then align data collection and reporting to them in a program evaluation. Defining and measuring those outputs and outcomes took thoughtful collaboration, but the resulting anchor points served as a springboard for creating data collection tools. The student pre- and post-surveys and the teacher/facilitator surveys gathered data aligned to the outcomes, while output tracking documented activities, giving R2A a holistic picture of what went well, what needed improvement, and the overall impact. The data then fed into a compelling report, also aligned to the original outcomes and outputs, empowering R2A to share with confidence the difference the program makes, as well as opportunities to improve.


Program evaluation next steps.

If you are a nonprofit leader faced with evaluating your program’s effectiveness, we hope this brief article has provided the foundational knowledge you need to get started. If you’d like help planning and implementing a full-scale program evaluation, consider working with the consultants at LaBarbera Learning Solutions. We’re an experienced team of researchers, evaluators, and educators with the expertise needed to demonstrate your program’s impact to stakeholders. See our cost-effective solutions at www.labarberalearning.com.

