Charles Harding
Marzano Research

In January 2018, Marzano Research visited the Michigan Department of Education (MDE) offices in Lansing to deliver the final presentation on a recently completed project. MDE had contracted us to gather research and feedback that would inform the department’s support for the implementation of educator evaluation systems in Michigan districts and public school academies (PSAs). We had wrapped up the yearlong, five-phase project in December 2017.

“Working with the Marzano Research team has been a real pleasure for us here at [MDE],” said Jared Robinson, assistant director of the Office of Educator Talent. “From day one, Marzano Research came to the table with a clear and ambitious plan to research local implementation of Michigan’s educator evaluation law. We have been so impressed with what our colleagues at Marzano Research produced in such a quick turnaround.”

Phase 1: Literature Review

The first phase of the project involved reviewing rigorous research on best practices in the implementation of teacher and administrator evaluation systems. Marzano Research located nearly 60 resources and identified 112 recommendations for best practices, which we arranged into six topic areas: student growth; cultural competency and equity; feedback; administrator evaluation; training; and professional development. This literature review laid the groundwork for developing surveys and collecting feedback in later phases.

Phase 2: District Implementation Survey

Next, Marzano Research designed a survey to gauge the extent to which local implementation of educator evaluation systems in Michigan reflected the best practices identified in phase 1. After distributing the survey to over 600 districts and PSAs, we used the collected data to categorize the 175 responding sites as low, medium, or high implementers of their respective evaluation systems, relative to the six topic areas above. Arranging levels of implementation by topic area allowed MDE to see which best practices were reflected most often and which required further supports. While not all districts and PSAs completed the survey, the results provided a solid basis to help MDE strengthen and expand implementation of educator evaluation systems in Michigan.

Phase 3: Comparable States Interviews

For the third phase, MDE requested that Marzano Research create a snapshot of specific strategies that other state education agencies used to support the implementation of educator evaluation systems. We interviewed agency personnel in five states with systems comparable to those in Michigan and identified the most commonly used supports for administrators and teachers—training modules and videos, stakeholder feedback groups, and regional support staff, for example. We compiled details on these supports in a report that MDE could reference when determining the best ways to enhance educator evaluation in Michigan.

Phase 4: Case Studies

In the next phase of the project, Marzano Research conducted site visits to gather more specific, localized feedback about the implementation of educator evaluation systems. We selected a subset of districts and PSAs that had responded to the implementation survey and that also represented a range of low to high implementers in rural to urban locales. Through focus groups and interviews with over 200 teachers and administrators, we documented catalysts and barriers to implementation as well as available resources and supports at the school, district, and state levels. Although this phase was necessarily limited to a small subset of districts and PSAs, the resulting report provided MDE with actionable, firsthand feedback from teachers and administrators to guide the department’s efforts to support their implementation of evaluation systems.

Phase 5: Teacher Survey

The final phase consisted of surveying teachers’ perceptions of evaluator feedback. More than 1,000 teachers from over 800 schools responded to an online survey with questions related to the accuracy and usefulness of evaluation feedback, credibility of evaluators, accessibility of resources, and responsiveness to feedback. From the survey results, MDE gained insight into the areas in which teachers were using evaluation feedback and those areas in which teachers might need additional support.

While Marzano Research’s contribution to the project has concluded, MDE’s work has only begun. According to Robinson, the Office of Educator Talent “has already begun using the information in these reports to enhance our supports for Michigan districts and educators. The survey data has been included in our webinar presentations and in-person professional development around the state. We have also been able to more clearly communicate to colleagues some of the opportunities and barriers our schools are facing regarding educator evaluations. Critically, our office has used these reports to plan how we approach policy and technical guidance going forward.”

To learn more about the project and read the reports, visit MDE’s Educator Evaluation Research Reports webpage.