Evaluation Tools and Resources

The following tools, aids, tip sheets, and other resources are available to support the planning, management, implementation, and analysis of evaluations and evaluation data:

  • Building Capacity in Evaluating Outcomes: A Teaching and Facilitating Resource for Community-based Programs and Organizations at the University of Wisconsin-Extension – 2008 (PDF | 10 MB) provides 93 activities and materials that practitioners working in and with community-based programs can use to build the capacity of individuals, groups, and organizations to evaluate outcomes. Included are eight units that cover core evaluation topics: getting ready, planning, engaging stakeholders, focusing the evaluation, collecting data, analyzing data, using data, and managing an evaluation. Each unit contains hands-on activities, handouts, and a slide presentation. A facilitator’s guide is also provided.
  • The Bureau of Justice Assistance (BJA) Center for Program Evaluation and Performance Measurement provides evaluation tools that state and local agencies can use to plan and implement program evaluations. The tools are also designed to help agencies develop and collect the program performance measures that BJA requires.
  • The Centers for Disease Control and Prevention (CDC) Program Performance and Evaluation Office produces many evaluation resources and guides, but this webpage also provides links to a wide range of resources from government, nonprofit, and academic sources. The resources are organized into categories: general step-by-step evaluation manuals; manuals on logic models and on data collection methods and sources; resources for specific types of programs or interventions; websites offering comprehensive evaluation resources and assistance; key professional associations; and key journals.
  • Collaborative, Participatory, and Empowerment Evaluation is a webpage maintained by David Fetterman for the American Evaluation Association Topical Interest Group. It links to books and publications, guides, tech tools, and videos on these approaches.
  • Community Tool Box at the University of Kansas offers more than 7,000 pages of practical, step-by-step, skill-building guidance for building healthy communities. Its 46 chapters link to nearly 300 sections, including basic methods and tools for effective program evaluation. Each section typically includes a description of the task, the advantages of performing it, step-by-step guidelines, examples, checklists of points to review, and, in some cases, training materials and summary slides.
  • The Electronic Statistics Textbook at StatSoft.com is an online textbook that begins with an overview of the relevant elementary concepts and continues with a more in-depth discussion of specific areas of statistics. Topics are organized into “modules” and accessible through buttons representing classes of analytic techniques. A glossary of statistical terms and a list of references for further study are included.
  • Evaluation Center at Western Michigan University was founded by Daniel Stufflebeam at The Ohio State University in 1963 and moved to Western Michigan University in 1973. The Center’s webpage contains resources intended to further its mission to advance the theory, practice, and utilization of evaluation, including links to publications, presentations, and video lectures on a wide range of evaluation topics. Of particular interest are the evaluation checklists for designing, budgeting, contracting, staffing, managing, and assessing evaluations of programs; collecting, analyzing, and reporting evaluation information; and determining merit, worth, and significance. Each checklist distills valuable lessons learned from practice.
  • Program Development and Evaluation at the University of Wisconsin-Extension includes many valuable resources. Two key resources are the “Planning a Program Evaluation” worksheet, which can help evaluators identify stakeholders, the type of evaluation needed, the information needed, the methodology, and how results will be interpreted and communicated; and the Enhancing Program Performance with Logic Models online course, which explains what a logic model is and how to use one for planning, implementing, evaluating, or communicating a program. Also provided are links to more than 40 “Quick Tips,” brief documents on planning an evaluation, data collection, analyzing and interpreting information, communicating results, improving evaluation quality, and retrospective post-then-pre designs.
  • The Handbook of Research Design and Social Measurement, Sixth Edition at Sage Research Methods provides procedures and guidance for three major types of research: basic, applied, and evaluation. It addresses topics such as research design, qualitative research, data collection, statistical analysis, and scales and indexes, and includes a guide to federal and private funding and to the publication of research reports. Extensive bibliographies accompany each major section of the handbook.
  • Harvard Family Research Project (HFRP) provides links to resources that help users develop and evaluate strategies to promote the well-being of children, youth, families, and their communities. HFRP produces a free quarterly periodical, The Evaluation Exchange, that contains lessons and emerging strategies for evaluating programs and policies focused on children, families, and communities. Articles are written by prominent evaluators and practitioners and address current issues facing evaluators at all levels, in a format built around take-away ideas that readers can apply in their current work.
  • Point K: Practical Tools for Planning, Evaluation, and Action at the Innovation Network is a portal with a searchable database of more than 300 reports, articles, tools, and tip sheets for evaluation and capacity building, including resources on evaluation planning, data collection, and analysis. It also provides access to a step-by-step Logic Model Builder for articulating and connecting goals, resources, activities, outputs, and outcomes. The companion Evaluation Plan Builder carries key data over from the Logic Model Builder and guides users through identifying evaluation questions, indicators, and data collection strategies for evaluating implementation and outcomes.
  • Program Evaluation at Penn State Extension provides information on how to design and implement a program evaluation to improve a program, compare delivery methods, respond to stakeholders, advocate, or prepare for promotion. Included are links to almost 100 tip sheets on a variety of topics, including question wording and ordering, types and sources of data and information, and program evaluation techniques. The site also offers a series of webinars and PowerPoint presentations on evaluating statewide programs, covering what to evaluate, data collection methods, creating questions and items for measurement, paper surveys, and analysis and use of results.
  • SRI International, Online Evaluation Resource Library (OERL) is supported by the Division of Research, Evaluation and Communication of the National Science Foundation (NSF) and was developed for professionals seeking to design, conduct, document, or review project evaluations. The library provides a large collection of plans, reports, and instruments from past and current evaluations that have proven sound and representative of current evaluation practice; guidelines for improving evaluation design and practice; and a discussion forum for ongoing dialogue in the evaluation community. Although the materials pertain primarily to NSF projects, the library is also intended to be useful to evaluators outside the NSF community.
  • Web Center for Social Research Methods is a website by Bill Trochim designed for people engaged in applied social research and evaluation. Its Knowledge Base is an online hypertext textbook that covers the entire research process, including formulating research questions, sampling, measurement, research design, data analysis, and reporting. Also included is a link to a section on selecting statistics, an interactive online statistical advisor: the user answers a series of questions about the characteristics of the data and the intent of the analysis, and an appropriate statistical approach is suggested (a sketch of this kind of question-and-answer selection logic appears after this list).
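To illustrate the kind of question-and-answer logic such a statistical advisor applies, the sketch below maps a few common answers about a dataset to a conventional test choice. This is an illustrative example only, not the Web Center’s actual decision tree; the question set, branch structure, and test names are simplified assumptions.

```python
# Illustrative sketch only -- not the Web Center's actual advisor logic.
# It maps a few common answers about the data to a conventional test choice.

def suggest_statistic(outcome: str, groups: int, paired: bool) -> str:
    """Suggest a conventional statistical test.

    outcome: "continuous" or "categorical" (type of the dependent variable)
    groups:  number of groups being compared
    paired:  True if the same subjects are measured repeatedly
    """
    if outcome == "continuous":
        if groups == 1:
            return "one-sample t-test"
        if groups == 2:
            return "paired t-test" if paired else "independent-samples t-test"
        return "repeated-measures ANOVA" if paired else "one-way ANOVA"
    if outcome == "categorical":
        if groups == 2 and paired:
            return "McNemar's test"
        return "chi-square test of independence"
    return "consult a fuller decision guide, such as the advisor itself"


# Example: comparing pre- and post-program scores for the same participants
# leads down the continuous, two-group, paired branch.
print(suggest_statistic(outcome="continuous", groups=2, paired=True))
# -> paired t-test
```

A real advisor asks many more questions (number of variables, measurement level, distributional assumptions, and so on), but the branching structure is the same idea.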
Last Updated: 09/24/2015