Technology Evaluation -- A Three-Stage Process

A Sun Associates formative evaluation of technology's impact on teaching and learning is a multi-stage process. To make this process more manageable for district decision-makers, we have outlined each stage according to the time required, the oversight necessary, and an approximate cost. We have found that in most cases it makes sense to package all three stages into a full evaluation effort. In such a full-scale effort, there are economies that can be realized only when we facilitate all parts of the process. In some cases, Sun Associates will work with school districts to approach Stage 1 or Stage 2 as separate project components.

Stage 1

Committee orientation, evaluation framing, and training

District Time Commitment -- 3 - 4 working days for each committee member

Oversight Required -- District Technology Coordinator or equivalent district-level administrator

Cost Estimate -- $5,000 to $7,500 depending on the scope of the evaluation effort (i.e., number of schools and the depth of inquiry)

This stage comprises the initial steps for every district's technology evaluation.

Appoint a Committee

Sun Associates will work with district leaders to select and form a district-wide evaluation committee. This committee should include individuals representative of the various "stakeholder" constituencies across the district. In most cases, this means teachers, administrators, parents, board members, and students. The exact composition of a committee will vary from district to district and in all cases reflects the values and priorities of the district conducting the evaluation. Effective committees generally have between 10 and 20 members (perhaps fewer in a very small district).

Committee Member Orientation and Training

Once the committee is selected, Sun Associates will conduct a full-day training designed to provide committee members with the essentials of technology program evaluation. During this orientation training, the entire evaluation process is covered in detail, milestones are set, and initial responsibilities are assigned. The goal for this session is for committee members to have a shared understanding of the purpose and intended outcomes of the evaluation process. Further, each member should leave with a working understanding of his or her role in the overall evaluation process.

Formulate and Review Evaluation Questions

After its initial day of training, the committee meets for another two days to develop the district's key evaluation questions and to create indicators for those questions. This work can take place over the course of several partial-day sessions and need not occur in a single two-day block. In most cases, the evaluation committee breaks into subcommittees to develop indicators for individual questions. Sun Associates will facilitate each subcommittee (or the committee of the whole, if that route is taken) and provide expert guidance in the development of questions and indicators.

Develop Indicator Rubrics

Taking the questions and indicators developed by the committee (or its subcommittees), Sun Associates will edit and refine them to the satisfaction of the overseeing administrator. Additional input from the committee will be solicited as required. The product of this step is a finalized set of rubrics which will be used to conduct the district's formative evaluation.
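
To give a concrete sense of what a finalized indicator rubric might look like in structured form, here is a minimal Python sketch. The question wording, indicator name, and performance-level descriptions are hypothetical placeholders invented for illustration, not actual Sun Associates rubric content; a district's real rubrics emerge from the committee process described above.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Indicator:
    """One observable indicator tied to an evaluation question."""
    name: str
    # Descriptions of performance at each level, ordered from lowest to highest.
    level_descriptions: List[str] = field(default_factory=list)

@dataclass
class EvaluationQuestion:
    """A key evaluation question and the indicators used to answer it."""
    text: str
    indicators: List[Indicator] = field(default_factory=list)

# Hypothetical example -- the wording below is illustrative only.
question = EvaluationQuestion(
    text="To what extent does technology support student-centered learning?",
    indicators=[
        Indicator(
            name="Classroom technology use",
            level_descriptions=[
                "Technology is rarely used during instruction.",
                "Technology is used mainly for teacher presentation.",
                "Students regularly use technology for inquiry and creation.",
            ],
        ),
    ],
)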


Stage 2

Data Collection and Analysis

District Time Commitment -- Sufficient time for teachers, students, parents, and administrators to be interviewed, surveyed, or observed.

Oversight Required -- District Technology Coordinator or equivalent district-level administrator.

Cost Estimate -- Dependent upon the size of the district. A small district with several elementary schools, a middle school, and a high school would be in the range of $6,000 to $7,500. Costs for larger districts will be higher.

Data Collection

Data collection is designed in response to the evaluation rubrics developed by a district. The point of data collection is to gather information which will enable the district to "answer" the evaluation questions and "score" their performance on their evaluation rubrics. Therefore, it is not possible to predict a data collection strategy without knowing something about a district's evaluation rubrics. Nevertheless, a typical Sun Associates data collection effort will include the following elements:

Other data collection strategies and mechanisms can be deployed. For example, some districts have found success with a "public meeting" format in which Sun Associates facilitates (and documents) a discussion about a community's goals, aspirations, and concerns as related to technology implementation. As a cost-saving strategy -- particularly in larger districts -- we can work with in-district evaluators to adapt and deploy existing data collection mechanisms and/or to identify statistically relevant sample populations so as to create a more manageable data collection effort.
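
As one illustration of how a statistically relevant sample population might be sized, the following Python sketch applies the standard sample-size formula for estimating a proportion, with a finite-population correction for a small district. The 95% confidence level, 5% margin of error, and population figure are assumptions chosen for the example, not parameters prescribed by Sun Associates.

import math

def survey_sample_size(population: int,
                       margin_of_error: float = 0.05,
                       z: float = 1.96,        # z-score for ~95% confidence
                       p: float = 0.5) -> int:  # 0.5 gives the most conservative estimate
    """Return the number of respondents needed to estimate a proportion."""
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Finite-population correction keeps the sample realistic for small districts.
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Hypothetical district with 400 teachers:
print(survey_sample_size(400))  # roughly 197 respondents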

A Sun Associates evaluation never relies on a single data source (e.g., surveys). Rather, we design a data collection strategy which has the optimum chance of capturing the big picture of technology's use and impact within the client district.

Data Analysis

All data, regardless of source, is summarized and reported in tabular and text format. We provide a summary report for each data source. Where tabular data exists, we will conduct a range of statistical analyses on the data (as necessary) and also supply raw data to the client for their own use/analysis.
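
As a rough illustration of the kind of tabular summary and statistical analysis described above, the following Python sketch aggregates a handful of hypothetical survey responses with pandas. The column names, roles, and rating scale are invented for the example; the actual analyses depend on the instruments a district deploys.

import pandas as pd

# Hypothetical survey data: one row per respondent, Likert-style 1-5 ratings.
responses = pd.DataFrame({
    "school": ["Elementary A", "Elementary A", "Middle School", "High School"],
    "role": ["teacher", "teacher", "teacher", "administrator"],
    "tech_access": [4, 3, 5, 4],        # "I have adequate access to technology"
    "tech_integration": [3, 2, 4, 5],   # "Technology is integrated into my teaching"
})

# Descriptive statistics for each survey item.
print(responses[["tech_access", "tech_integration"]].describe())

# Mean rating per school -- one typical way to summarize by building.
print(responses.groupby("school")[["tech_access", "tech_integration"]].mean())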


Stage 3

Findings, recommendations, and reporting

District Time Commitment -- Approximately 1 day from each committee member (to review data and suggested rubric scores). Additional 2 or 3 days from the overseeing administrator.

Oversight Required -- District Technology Coordinator or equivalent district-level administrator.

Cost Estimate -- A full report, based on two or three evaluation questions, will cost approximately $5,000. Reports for very large districts or for districts with many evaluation questions will be more.

Reporting is the most critical stage of a formative evaluation effort as it establishes a common knowledge base for reflection. An evaluation that never gets shared with the community it evaluates never results in reflection; and of course, no reflection means no positive change.

Scoring the Rubrics -- Findings and Recommendations

A Sun Associates report starts with data and then uses that data to "score" a district's performance against its own rubrics. Our reports provide a detailed explanation of how scores were given and the rationale for choosing one score -- or level of performance -- versus another. All of this information is contained in the report's detailed, documented, and fully supported findings section.
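
For readers who want a concrete picture of the mechanics, here is a minimal Python sketch of one way summarized data could be mapped onto rubric performance levels via thresholds. The level names and cutoffs are assumptions made for illustration; in an actual Sun Associates evaluation, scoring is the documented, judgment-based process described above, not an automatic cutoff.

# Hypothetical rubric levels, ordered lowest to highest, with the minimum
# summarized score (e.g., a 1-5 survey mean) needed to reach each level.
RUBRIC_LEVELS = [
    ("Beginning", 0.0),
    ("Developing", 2.5),
    ("Proficient", 3.5),
    ("Exemplary", 4.5),
]

def score_indicator(summary_score: float) -> str:
    """Map a summarized data value onto the highest rubric level it reaches."""
    level = RUBRIC_LEVELS[0][0]
    for name, threshold in RUBRIC_LEVELS:
        if summary_score >= threshold:
            level = name
    return level

print(score_indicator(3.8))  # "Proficient"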

Just as important as the rubric scores/findings are the recommendations we make for how the district can adapt or change current practices to achieve higher levels of performance in succeeding years. Our recommendations are always grounded in research-based knowledge of best and proven practices related to teaching, learning, and technology. Furthermore, our recommendations are always tied to our findings. In other words, we make only recommendations that are in sync with a district's desired outcomes as documented in its indicator rubrics.

Prior to creating a final report, Sun Associates will review its findings and recommendations with the district evaluation committee and the overseeing administrator. These key individuals will be given the opportunity to edit, modify, or suggest additional reported information. After this input is received, a final report will be issued.

Disseminating the Final Report

Traditionally, our evaluation projects end with a formal presentation to the district committee and other audiences identified by the overseeing administrator. Many clients request, for an additional charge, that we prepare a PowerPoint presentation which summarizes key report elements (e.g., process, findings, and recommendations).


Fees

$15,000 is a typical price for a small district to implement a full Sun Associates formative technology evaluation.
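
To make the arithmetic behind that figure easier to follow, the short Python sketch below totals the stage-by-stage estimates quoted earlier for a small district. The bundled figure of roughly $15,000 sits below the sum of the separately priced stages, which reflects the economies of a full-package engagement noted at the start of this page; the figures remain estimates, not a quote.

# Stage cost estimates for a small district, taken from the figures above (low, high).
stage_estimates = {
    "Stage 1: orientation, framing, and training": (5_000, 7_500),
    "Stage 2: data collection and analysis": (6_000, 7_500),
    "Stage 3: findings, recommendations, and reporting": (5_000, 5_000),
}

low = sum(lo for lo, _ in stage_estimates.values())
high = sum(hi for _, hi in stage_estimates.values())

print(f"Stages priced separately: ${low:,} to ${high:,}")  # $16,000 to $20,000
print("Typical bundled price for a small district: $15,000")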

This is only an average fee and can vary up or down according to the following:

Sun Associates is committed to working with every school district to maximize the use of technology as a tool for teaching and learning. Contact us. We will make every effort possible to tailor an evaluation project which delivers maximum benefit for the minimum budget.

