Figures, Tables, and Exhibits ix
Preface xv
The Editors xxi
The Contributors xxv
Part One: Evaluation Planning and Design 1
1. Planning and Designing Useful Evaluations 7
Kathryn E. Newcomer, Harry P. Hatry, Joseph S. Wholey
2. Analyzing and Engaging Stakeholders 36
John M. Bryson, Michael Quinn Patton
3. Using Logic Models 62
John A. McLaughlin, Gretchen B. Jordan
4. Exploratory Evaluation 88
Joseph S. Wholey
5. Performance Measurement 108
Theodore H. Poister
6. Comparison Group Designs 137
Gary T. Henry
7. Randomized Controlled Trials 158
Carole J. Torgerson, David J. Torgerson, Celia A. Taylor
8. Conducting Case Studies 177
Karin Martinson, Carolyn O'Brien
9. Recruitment and Retention of Study Participants 197
Scott C. Cook, Shara Godiwalla, Keeshawna S. Brooks, Christopher V. Powers, Priya John
10. Designing, Managing, and Analyzing Multisite Evaluations 225
Debra J. Rog
11. Evaluating Community Change Programs 259
Brett Theodos, Joseph Firschein
12. Culturally Responsive Evaluation 281
Stafford Hood, Rodney K. Hopson, Karen E. Kirkhart
Part Two: Practical Data Collection Procedures 319
13. Using Agency Records 325
Harry P. Hatry
14. Using Surveys 344
Kathryn E. Newcomer, Timothy Triplett
15. Role Playing 383
Claudia L. Aranda, Diane K. Levy, Sierra Stoney
16. Using Ratings by Trained Observers 412
Barbara J. Cohn Berman, Verna Vasquez
17. Collecting Data in the Field 445
Demetra Smith Nightingale, Shelli Balter Rossman
18. Using the Internet 474
William C. Adams
19. Conducting Semi-Structured Interviews 492
William C. Adams
20. Focus Group Interviewing 506
Richard A. Krueger, Mary Anne Casey
21. Using Stories in Evaluation 535
Richard A. Krueger
Part Three: Data Analysis 557
22. Qualitative Data Analysis 561
Delwyn Goodrick, Patricia J. Rogers
23. Using Statistics in Evaluation 596
Kathryn E. Newcomer, Dylan Conger
24. Cost-Effectiveness and Cost-Benefit Analysis 636
Stephanie Riegg Cellini, James Edwin Kee
25. Meta-Analyses, Systematic Reviews, and Evaluation Syntheses 673
Robert Boruch, Anthony Petrosino, Claire Morgan
Part Four: Use of Evaluation 699
26. Pitfalls in Evaluations 701
Harry P. Hatry, Kathryn E. Newcomer
27. Providing Recommendations, Suggestions, and Options for Improvement 725
George F. Grob
28. Writing for Impact 739
George F. Grob
29. Contracting for Evaluation Products and Services 765
James B. Bell
30. Use of Evaluation in Government 798
Joseph S. Wholey
31. Evaluation Challenges, Issues, and Trends 816
Harry P. Hatry, Kathryn E. Newcomer, Joseph S. Wholey
Name Index 833
Subject Index 841
The leading program evaluation reference, updated with the latest tools and techniques
The Handbook of Practical Program Evaluation equips managers and evaluators to address questions about the performance of public and nonprofit programs. Neatly integrating authoritative, high-level information with practicality and readability, this guide gives you the tools and processes you need to analyze your program's operations and outcomes more accurately. This new fourth edition has been thoroughly updated and revised, with new coverage of the latest evaluation methods, including:
* Culturally responsive evaluation
* Adopting designs and tools to evaluate multi-service community change programs
* Using role playing to collect data
* Using cognitive interviewing to pre-test surveys
* Coding qualitative data
You'll discover robust analysis methods that produce a more accurate picture of program results, and learn how to trace causality back to the source to see how much of an outcome can be directly attributed to the program. Written by award-winning experts, with contributions from leading evaluation authorities among academics and practitioners, this book provides the most comprehensive, up-to-date reference on the topic.
Valid and reliable data constitute the bedrock of accurate analysis, and since funding decisions rely more heavily on program analysis than ever before, you cannot afford weak or outdated methods. This book gives you expert insight and leading-edge tools that help you paint a more accurate picture of your program's processes and results, including:
* Obtaining valid, reliable, and credible performance data
* Engaging and working with stakeholders to design valuable evaluations and performance monitoring systems
* Assessing program outcomes and tracing desired outcomes to program activities
* Providing robust analyses of both quantitative and qualitative data
Governmental bodies, foundations, individual donors, and other funding bodies are increasingly demanding information on the use of program funds and program results. The Handbook of Practical Program Evaluation shows you how to collect and present valid and reliable data about programs.