A Workbook for Designing a Process Evaluation
Produced for the
Georgia Department of Human Resources
Division of Public Health
Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.
Department of Psychology, Georgia State University
Evaluation Expert Session July 16, 2002 Page 1
What is process evaluation?
Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks “what,” and outcome evaluation asks, “so what?”
When conducting a process evaluation, keep in mind these three questions:
1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?
This workbook will serve as a guide for designing your own process evaluation for a program of your choosing. Implementing a process evaluation involves many steps, and this workbook will direct you through the main stages. It will be helpful to think of a service delivery program that you can use as your example as you complete these activities.

Why is process evaluation important?

1. To determine the extent to which the program is being implemented according to plan
2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention and the outcomes
5. To provide information on what components of the intervention are responsible for outcomes
6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
7. To provide managers feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public, clients, and funders
10. To improve the quality of the program, as the act of evaluating is itself an intervention
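The core comparison behind points 1-3 (what was planned versus what was delivered, and how sites differ in fidelity) can be sketched in code. The following is a minimal, illustrative Python sketch; the component names, site names, and the simple "fraction of planned components delivered" fidelity measure are hypothetical, not part of this workbook's method:

```python
# Illustrative sketch: quantifying the gap between program design and
# delivery across sites. All component and site names are hypothetical.

# Components the program is intended to deliver (the program "on paper").
PLANNED = {
    "intake_interview",
    "weekly_home_visit",
    "parent_training",
    "referral_follow_up",
}

# Components each site actually delivered, as documented by the evaluation.
delivered_by_site = {
    "Site A": {"intake_interview", "weekly_home_visit", "parent_training"},
    "Site B": {"intake_interview", "referral_follow_up"},
}

def fidelity(delivered, planned=PLANNED):
    """Fraction of planned components actually delivered at a site."""
    return len(delivered & planned) / len(planned)

for site, delivered in delivered_by_site.items():
    gaps = sorted(PLANNED - delivered)  # planned but not delivered
    print(f"{site}: fidelity = {fidelity(delivered):.2f}, gaps = {gaps}")
```

In a real evaluation, fidelity measures are usually richer than a simple component checklist (e.g., dosage, quality of delivery), but the same design-versus-delivery comparison underlies them.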
Stages of Process Evaluation                        Page

1. Form Collaborative Relationships                  3
2. Determine Program Components                      4
3. Develop Logic Model*
4. Determine Evaluation Questions                    6
5. Determine Methodology                             11
6. Consider a Management Information System          25
7. Implement Data Collection and Analysis            28
8. Write Report**

Also included in this workbook:

a. Logic Model Template                              30
b. Pitfalls to avoid                                 30
c. References                                        31
Evaluation can be an exciting, challenging, and fun experience
* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to the Evaluation Framework and Evaluation Module of the FHB Best Practice Manual for more details.
Forming collaborative relationships
A strong, collaborative relationship with program delivery staff and management will likely result in the following:

- Feedback regarding evaluation design and implementation
- Ease in conducting the evaluation due to increased cooperation
- Participation in interviews, panel discussions, meetings, etc.
- Increased utilization of findings
Seek to establish a mutually respectful relationship characterized by trust, commitment, and flexibility.
Key points in establishing a collaborative relationship:
- Start early. Introduce yourself and the evaluation team to as many delivery staff and management personnel as early as possible.
- Emphasize that THEY are the experts, and that you will be using their knowledge and information to inform your evaluation development and implementation.
- Be respectful of their time, both in person and on the telephone.
- Set up meeting places that are geographically accessible to all parties involved in the evaluation process.
- Remain aware that, even if they have requested the evaluation, it may appear as an intrusion upon their daily activities. Be as unobtrusive as possible, and request their feedback regarding appropriate times for on-site data collection.