Public programs are designed to reach certain goals and beneficiaries. Methods to understand whether such programs actually work, as well as the level and nature of their impacts on intended beneficiaries, are the main themes of this book.
The second edition of the Impact Evaluation in Practice handbook is a comprehensive and accessible introduction to impact evaluation for policy makers and development practitioners. First published in 2011, it has been used widely across the development and academic communities. The book incorporates real-world examples to present practical guidelines for designing and implementing impact evaluations. Readers will gain an understanding of impact evaluations and the best ways to use them to design evidence-based policies and programs. The updated version covers the newest techniques for evaluating programs and includes state-of-the-art implementation advice, as well as an expanded set of examples and case studies that draw on recent development challenges. It also includes new material on research ethics and partnerships to conduct impact evaluation. The handbook is divided into four sections: Part One discusses what to evaluate and why; Part Two presents the main impact evaluation methods; Part Three addresses how to manage impact evaluations; Part Four reviews impact evaluation sampling and data collection. Case studies illustrate different applications of impact evaluations. The book links to complementary instructional material available online, including an applied case as well as questions and answers. The updated second edition will be a valuable resource for the international development community, universities, and policy makers looking to build better evidence around what works in development.
Impact evaluation is an empirical approach to estimating the causal effects of interventions, in terms of both magnitude and statistical significance. Expanded use of impact evaluation techniques is critical to rigorously derive knowledge from development operations and for development investments and policies to become more evidence-based and effective. To support wider use of impact evaluation approaches, this book introduces core concepts, methods, and considerations for planning, designing, managing, and implementing impact evaluation, supplemented by examples. The topics covered range from impact evaluation purposes to basic principles, specific methodologies, and guidance on field implementation. It offers material for a range of audiences, from those who are interested in understanding evidence on “what works” in development to those who will contribute to expanding the evidence base as applied researchers.
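As a minimal illustration of the kind of estimate such an evaluation produces, the sketch below computes an impact as the difference in mean outcomes between a randomly assigned treatment group and a comparison group and tests its statistical significance. The data are simulated and all names are illustrative only; it is not a method prescribed by any of the books described here.

```python
# Illustrative sketch: estimating a treatment effect as the difference in mean
# outcomes between randomly assigned treatment and comparison groups, and
# testing its statistical significance. Data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated outcomes: comparison group mean 10, treated group mean 12
# (so the true effect built into the simulation is 2.0)
control = rng.normal(loc=10.0, scale=3.0, size=500)
treated = rng.normal(loc=12.0, scale=3.0, size=500)

# Estimated impact (magnitude) and its statistical significance
impact = treated.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)

print(f"Estimated impact: {impact:.2f}")
print(f"t-statistic: {t_stat:.2f}, p-value: {p_value:.4f}")
```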
This textbook encompasses the main concepts and approaches of quantitative impact evaluation, used to assess the effectiveness of programmes, policies, projects or interventions. Written for economics graduate courses, it can also serve as a manual for professionals in research institutes, governments, and international organizations.
Development Research in Practice leads the reader through a complete empirical research project, providing links to continuously updated resources on the DIME Wiki as well as illustrative examples from the Demand for Safe Spaces study. The handbook is intended to train users of development data to handle data effectively, efficiently, and ethically. “In the DIME Analytics Data Handbook, the DIME team has produced an extraordinary public good: a detailed, comprehensive, yet easy-to-read manual for how to manage a data-oriented research project from beginning to end. It offers everything from big-picture guidance on the determinants of high-quality empirical research, to specific practical guidance on how to implement specific workflows—and includes computer code! I think it will prove durably useful to a broad range of researchers in international development and beyond, and I learned new practices that I plan on adopting in my own research group.” —Marshall Burke, Associate Professor, Department of Earth System Science, and Deputy Director, Center on Food Security and the Environment, Stanford University “Data are the essential ingredient in any research or evaluation project, yet there has been too little attention to standardized practices to ensure high-quality data collection, handling, documentation, and exchange. Development Research in Practice: The DIME Analytics Data Handbook seeks to fill that gap with practical guidance and tools, grounded in ethics and efficiency, for data management at every stage in a research project. This excellent resource sets a new standard for the field and is an essential reference for all empirical researchers.” —Ruth E. Levine, PhD, CEO, IDinsight “Development Research in Practice: The DIME Analytics Data Handbook is an important resource and a must-read for all development economists, empirical social scientists, and public policy analysts. Based on decades of pioneering work at the World Bank on data collection, measurement, and analysis, the handbook provides valuable tools to allow research teams to more efficiently and transparently manage their work flows—yielding more credible analytical conclusions as a result.” —Edward Miguel, Oxfam Professor in Environmental and Resource Economics and Faculty Director of the Center for Effective Global Action, University of California, Berkeley “The DIME Analytics Data Handbook is a must-read for any data-driven researcher looking to create credible research outcomes and policy advice. By meticulously describing detailed steps, from project planning via ethical and responsible code and data practices to the publication of research papers and associated replication packages, the DIME handbook makes the complexities of transparent and credible research easier.” —Lars Vilhuber, Data Editor, American Economic Association, and Executive Director, Labor Dynamics Institute, Cornell University
This Handbook presents state-of-the-art methodological guidance and discussion of international practice related to the integration of biodiversity and ecosystem services in impact assessment, featuring contributions from leading researchers and practitioners the world over. Its multidisciplinary approach covers contributions across five continents to broaden the scope of the field both thematically and geographically.
The second edition of Handbook of Practical Program Evaluation offers managers, analysts, consultants, and educators in government, nonprofit, and private institutions a valuable resource that outlines efficient and economical methods for assessing program results and identifying ways to improve program performance. The Handbook has been thoroughly revised. Many new chapters have been prepared for this edition, including chapters on logic modeling and on evaluation applications for small nonprofit organizations. The Handbook of Practical Program Evaluation is a comprehensive resource on evaluation, covering both in-depth program evaluations and performance monitoring. It presents evaluation methods that will be useful at all levels of government and in nonprofit organizations.
The Human Resources Program-Evaluation Handbook is the first book to present state-of-the-art procedures for evaluating and improving human resources programs. Editors Jack E. Edwards, John C. Scott, and Nambury S. Raju provide a user-friendly yet scientifically rigorous "how to" guide to organizational program evaluation. Integrating perspectives from a variety of human resources and organizational behavior programs, a wide array of contributing professors, consultants, and governmental personnel successfully link scientific information to practical application. Designed for academics and graduate students in industrial-organizational psychology, human resources management, and business, the handbook is also an essential resource for human resources professionals, consultants, and policy makers.
Evaluation is crucial for determining the effectiveness of social programs and interventions. In this nuts-and-bolts handbook, social work and health care professionals are shown how evaluations should be done, taking the intimidation and guesswork out of this essential task. Current perspectives in social work and health practice, such as the strengths perspective, consumer empowerment, empowerment evaluation, and evidence-based practice, are linked to evaluation concepts throughout the book to emphasize their importance. This book makes evaluation come alive with comprehensive examples of each different type of evaluation, such as a strengths-based needs assessment in a local community, a needs assessment for Child Health Plus programs, comprehensive program descriptions of HIV services and community services for the aged, a model for goals and objectives in programs for people with mental illness, a monitoring study of private practice social work, and process evaluations of a Medicare advocacy program and a health advocacy program to explain advance directives. Equal emphasis is given to both quantitative and qualitative data analysis, with real examples that make statistics and concepts in qualitative analysis un-intimidating. By integrating both evaluation and research methods and assuming no previous knowledge of research, this book makes an excellent reference for professionals working in social work and health settings who are now being called upon to conduct or supervise program evaluation and may need a refresher on research methods. With a pragmatic approach that includes survey design, data collection methods, sampling, analysis, and report writing, it is also an excellent text or classroom resource for students new to the field of program evaluation.
This new, third edition of Jack Phillips's classic Handbook of Training Evaluation and Measurement Methods shows the reader not only how to design, implement, and assess the effectiveness of HRD programs, but also how to measure their return on investment (ROI). Each chapter has been revised and updated to include additional research, expanded coverage, and new examples of Dr. Phillips's case studies. Seven entirely new chapters have also been added, focusing largely on ROI.