Questions, Methods, and Indicators for Implementation Evaluation

Practice:

Evaluate implementation to document what you are doing

Key Action:

Document implementation based on your logic model

TOOL: Questions, Methods, and Indicators for Implementation Evaluation

Purpose:

The following table can help you and your evaluator determine appropriate evaluation questions, methods for answering those questions, and some good indicators of the extent and quality of implementation.

Instructions:

1. Review the sample evaluation questions and consider the value of the data collection method used to answer each question.
2. Review the sample indicators and generate ideas for other indicators you might use to document your magnet program's implementation.


Questions, Methods, and Indicators for Implementation Evaluation

1. Evaluation research question: To what extent was the magnet program implemented as designed?

Method and value of evaluation: Compare the amount and range of activities done in the magnet program with those prescribed by the program developers. The comparison gives an indication of the fidelity of the implementation to the planned program and of the frequency and intensity of the magnet program activities.

Sample indicators: Magnet program implementation plans; school staff interviews; school-site observations; review of lesson plans and curriculum documents

Ideas for your indicators: _______________

2. Evaluation research question: To what degree has the magnet program helped reduce minority group isolation?

Method and value of evaluation: Compare student demographics before the magnet program is implemented to those after program implementation for any indication of changes in racial and ethnic composition (a minimal analysis sketch follows this table).

Sample indicators: Outreach and recruitment logs (recruitment fairs and parent nights); annual analysis of the student application database and class rosters

Ideas for your indicators: _______________

3. Evaluation research question: What adaptations, additions, and omissions were made when the magnet program was implemented?

Method and value of evaluation: Record, describe, and count any modifications, since adaptations, additions, and omissions affect analyses of outcome evaluation data.

Sample indicators: School-sponsored activities logs; stakeholder surveys; teacher interviews

Ideas for your indicators: _______________

4. Evaluation research question: To what extent were staff trained to implement the magnet curriculum at a professional level?

Method and value of evaluation: Compare the current professional development activities against the standards for optimal training as planned by the program developers. This gives an indication of the potential strengths and weaknesses of the magnet program.

Sample indicators: Professional development schedules, activity logs, and feedback forms; classroom observation visits; staff surveys in the spring

Ideas for your indicators: _______________

5. Evaluation research question: To what extent are stakeholders (parents, students, staff, community members) informed and knowledgeable about the magnet program?

Method and value of evaluation: Maintain records of meetings and presentations to stakeholders, as well as their questionnaire responses, to gauge the range of their knowledge.

Sample indicators: Stakeholder surveys; interviews and focus groups with stakeholders; agendas and logs of events where stakeholders were present

Ideas for your indicators: _______________
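The before-and-after demographic comparison in question 2 can be run directly from enrollment records. The following is a minimal sketch in Python, assuming student-level snapshots with a race/ethnicity field; the column name and the data are illustrative assumptions, not part of the tool itself.

```python
# Minimal sketch (hypothetical data): compare racial/ethnic composition
# before and after magnet program implementation.
import pandas as pd

# Illustrative enrollment snapshots, one row per student.
before = pd.DataFrame(
    {"race_ethnicity": ["White", "White", "White", "Black", "Hispanic", "Asian"]}
)
after = pd.DataFrame(
    {"race_ethnicity": ["White", "White", "Black", "Black", "Hispanic", "Asian"]}
)

# Share of enrollment by group in each period.
composition = pd.DataFrame(
    {
        "before": before["race_ethnicity"].value_counts(normalize=True),
        "after": after["race_ethnicity"].value_counts(normalize=True),
    }
).fillna(0.0)

# Percentage-point change per group; movement toward a more balanced
# composition is one indication of reduced minority group isolation.
composition["change_pp"] = (composition["after"] - composition["before"]) * 100
print(composition.round(3))
```

In practice, the two snapshots would come from the annual analysis of the student application database and class rosters listed among the sample indicators.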

Adapted from U.S. Department of Education, Office of Safe and Drug-Free Schools. (2007). Mobilizing for evidence-based character education (p. 18). Washington, DC: Author. The entire guide can be downloaded at www.ed.gov/programs/charactered/mobilizing.pdf (last accessed December 10, 2008).
