
Building Evidence for Educator Effectiveness

Evaluation life cycle: Plan and Design; Identify and Follow Participants; Collect and Store Data; Analyze Data; Report and Use Findings.

Welcome to the 5-Minute Evaluation Resource Series, designed to provide you with clear, easy-to-access information on how to build evidence on programs that support educator effectiveness. The series includes videos and resources written in plain language and designed to be watched or read in 5 minutes or less.

To learn more, watch the first video, Evaluation for educators.

To view all videos in the series, visit our YouTube channel.

  • Evaluation for educators

    Watch the Evaluation for educators video or read the illustrated video script [PDF 1 MB]. This video is designed for people who are not evaluators but want to understand important evaluation issues as they consider evaluating an education program or policy.
    Review the user’s guide [PDF 1 MB] to get the most out of the 5-minute evaluation resource series.
  • Life cycle of an evaluation

Watch the Life cycle of an evaluation video or read the illustrated video script [PDF 1 MB]. This video provides an overview of the evaluation life cycle and What Works Clearinghouse (WWC) standards for rigorous education evaluation.
View the one-page infographic Life cycle of an evaluation [PDF 1 MB], which provides definitions for each stage.
  • What Works Clearinghouse (WWC)

What is the What Works Clearinghouse? [PDF 1 MB] This WWC resource provides the why, who, what, how, and where of the U.S. Department of Education’s trusted source for scientific evidence for what works in education.
    WWC e-brochure [PDF 1 MB] This WWC publication describes what the WWC does and why it is important.
  • Introductory tools for understanding rigorous evaluations

    Life cycle of an evaluation [PDF 1 MB] is a one-page infographic that provides definitions for each stage of an evaluation.
    Group evaluation comparison chart [PDF 1 MB] This infographic provides a side-by-side presentation of three group evaluation designs.
    Evaluation terms for a non-technical audience [PDF 1 MB] This glossary uses non-technical language to define terms often used in evaluation literature.
  • More resources

    Database for Evaluation Resources This database, designed for both non-technical and technical audiences, can be filtered by each step and stage of the evaluation life cycle. It can also be filtered by focus on educator effectiveness, length, audience, and type of resource.
    User’s guide for the database [PDF 1 MB] A guide to getting the most out of the Database for Evaluation Resources.
    Definitions of fields for the database [PDF 1 MB] A list of definitions for the filter fields of the database.
  • Pathway to Effective Evaluation

    Pathway to Evaluation [PDF 2 MB] Guidance on the four steps of the plan and design stage of the evaluation life cycle. Includes resources for more information about each step.
  • Step 1. Determine your research question

    How to Develop the Right Research Question for Program Evaluation [PDF 1 MB] A PowerPoint slide deck created by the Corporation for National and Community Service that provides detailed steps for developing appropriate research questions.
    Evaluation Programs for Strengthening Teaching and Leadership [PDF 5 MB] In the context of educator effectiveness, pages 18-30 of this resource provide an in-depth explanation of how to develop your research question, including types of outcomes and example research questions.
  • Step 2. Develop your logic model

Theory of Change [PDF 1 MB] This two-page excerpt from Resch, Berk, and Akers explains the importance of developing a theory of change at the beginning of a study and provides an example.
Evaluation Programs for Strengthening Teaching and Leadership [PDF 5 MB] Pages 3-17 of this resource provide an in-depth explanation of logic model development, including descriptions of each logic model component and an example logic model from the Teacher Incentive Fund (TIF) program.
Step 3. Carefully describe the people in and context of your program

    Video: The case for comparison groups and its illustrated video script [PDF 1 MB] This video discusses the importance of choosing the right comparison groups and outlines the most important factors in these decisions.
    Identifying a valid comparison group [PDF 1 MB] This resource discusses how to choose appropriate comparison groups for your study and why this is so important.
Step 4. Select your research design

    Group evaluation comparison chart [PDF 1 MB] This infographic provides a side-by-side presentation of three group evaluation designs.
    Randomized Controlled Trials (RCTs) [PDF 1 MB] This resource describes the steps for conducting randomized controlled trials and what types of studies they are appropriate for.
    Regression Discontinuity Design (RDD) [PDF 1 MB] An infographic that explains regression discontinuity design using an example study.
    Matched-Comparison Group Design [PDF 1 MB] This resource describes which studies are appropriate for a matched-comparison group design and how to effectively design such a study.
Difference-in-Differences Design [PDF 1 MB] An infographic that explains difference-in-differences design using an example study; a brief hypothetical sketch of the calculation appears at the end of this page.
  • More resources

    Database for Evaluation Resources This database, designed for both non-technical and technical audiences, can be filtered by each step and stage of the evaluation life cycle. It can also be filtered by focus on educator effectiveness, length, audience, and type of resource.
Don’t forget the Suite 1 resources in the previous section, which provide an overview of the 5-Minute Evaluation Resource Series and of the entire evaluation life cycle.
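
For readers who want a feel for how the difference-in-differences design works before opening the infographic in Step 4, the short sketch below walks through the arithmetic with made-up numbers. The scores, groups, and estimated effect are all hypothetical and are meant only to illustrate the calculation, not to describe any real study; the linked resources remain the authoritative guidance on the design.

    # Hypothetical difference-in-differences calculation (all numbers are made up).
    # Average outcome (for example, a teacher practice score) measured before
    # and after a program, for program schools and for comparison schools.
    program_before, program_after = 70.0, 78.0
    comparison_before, comparison_after = 71.0, 74.0

    # Change over time within each group.
    program_change = program_after - program_before            # 8.0
    comparison_change = comparison_after - comparison_before   # 3.0

    # The difference-in-differences estimate is how much more the program group
    # changed than the comparison group; under the design's assumptions, this
    # extra change is attributed to the program.
    estimated_effect = program_change - comparison_change      # 5.0

    print("Program group change:", program_change)
    print("Comparison group change:", comparison_change)
    print("Estimated program effect (difference-in-differences):", estimated_effect)

In this hypothetical example, the program group improved by 8 points while the comparison group improved by 3, so the design credits the remaining 5-point gain to the program, assuming both groups would otherwise have changed in parallel.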