Performing an effective technology evaluation may seem daunting, but with some guidance and a step-by-step approach, it's a task most school systems can manage. Further, when you consider that the dollars you invest in determining whether your technology plan "is working" are very small compared to the vast sums you might continue to invest in non-performing systems and ideas...well, the time and money spent on evaluation is a pretty wise investment.
Sun Associates specializes in helping school districts develop effective evaluation and assessment procedures for their educational technology plans. The following links provide some starting information, sample tools, and sample reports. For more information, please contact us.
This page aggregates our most basic tools and resources for technology auditing and evaluation. Further detail and more tools are available below.
This collection of processes, tools, examples, and supportive resources helps educators design and conduct their own internal program evaluations. While a self-evaluation is not exactly the same as an external, independent evaluation, there is still value in learning more about how to conduct an evaluation. These tools will help you explore how evaluation works.
Logic models, also known as logic maps, are an excellent way to graphically show how your project or initiative connects its participants, goals, actions, and intended outcomes. Funders very much appreciate seeing the "logic" behind your initiative, and you will find it much easier to evaluate the outcomes of your work if you have established your initiative's logic. We have created a short tutorial on developing logic models.
Background and Research Resources for Technology Evaluation
An essential part of developing a technology evaluation is the creation of performance indicators. In a technology program evaluation or audit, indicators are rooted in a district's or project's vision for how technology is intended to support teaching and learning. It is therefore essential to have a good working knowledge of the research on the link between technology and learning when setting out to create indicators. Here's our annotated bibliography that hits the high points of this research (with particular emphasis on current issues such as 21st century learning, 1:1 technology access, and the flipped classroom). We also have a research summary that pulls many of these ideas together.
Information on the three basic steps of technology evaluation and assessment.
Evaluation is built around data. In terms of technology evaluation, how do you collect meaningful qualitative and quantitative data showing technology impact and use? This article covers data collection basics and puts data collection strategies such as surveys into a broader assessment context. Also see our blog posts on evaluation data collection.
Sample Focus Group and Interview Questions
Sun Associates offers a variety of sample data collection and evaluation tools for districts seeking to assess the impact of technology on teaching and learning. For your reference, we have sample questions sorted by subject as well as a specific set of questions and guidance for teacher focus groups.
Online surveys can be an efficient way of collecting data from a large number of teachers...particularly if your district has a well-developed network that gives all teachers web access. Here are examples of online surveys developed for our evaluation clients. We've disabled the "submit" button on these samples, but in all other ways they behave like fully functioning surveys.
A short professional development session evaluation survey (participant evaluation)
A short survey for professional development leaders
A technology usage survey for teachers
An alternative teacher technology survey
A parent survey on technology use
Building and classroom observations are the third -- and often most detailed -- leg of the "triangle" of data collection. These are sample templates which we have used in several of our evaluation and data-collection projects.
The "final report" -- even if it's just a summary of the first year of an ongoing formative evaluation -- is often a watershed event for the school or district conducting a technology evaluation. The report serves to focus community attention on specific aspects of your technology integration work as well as to showcase the fact that you care enough about technology to critically assess what is and is not working. Most of our clients find that the final report serves to heighten and enhance awareness of technology in their schools and its impact on teaching and learning.
This is a sample of an evaluation project final report. The project used our standard formative evaluation methodology and centered on assessing the effectiveness of a district-wide technology staff development effort.
Sun Associates' projects always start with a detailed project proposal. Here is a sample proposal showing how we generally outline a project to potential clients. Naturally, each proposal is unique to the client for which it is created. If you would like more information on how we could help you with a technology evaluation project, please contact us!