Schools are criticized for adopting educational programs, techniques, and materials without testing them out first. What sort of evidence should educators expect to see before deciding to adopt a program or technique?
Educators can use the expertise within the profession to assess whether a program or technique is appropriate for their particular school situation. We are our own best resource, and we need opportunities both to acquire evidence from others and to develop evidence as researchers and practitioners.

Each individual's experience and sense of history can provide one form of evidence when discussing innovations. Having been an educator for over 30 years, I know that what is called new or innovative is often simply a renamed program from the past. Hindsight can lend perspective.

Schools themselves can do research to secure their own evidence. At my school, for example, we have worked to gather evidence for the larger school community through our peer responder program in the Writing Center and our development of exit outcomes and authentic-assessment rubrics.

Other schools using a particular program or technique can also provide helpful information. School visits can reveal how effective a program is at given sites. Information banks on education topics can also disseminate evidence based on scholarly research, school assessment, conferences, consultants' market research, and reflection.

Of course, a school must commit resources to acquire this kind of evidence; time and money are needed to support fact-finding and communication. Today, technology can support some of this communication, if schools have the equipment and vision on which such discourse depends.

—Diane Isaacs is district coordinator of English for the Manhasset (N.Y.) Public Schools.
Educators should expect evidence that an approach is effective. Unfortunately, few programs or techniques are documented by research that evaluates the effectiveness of that approach versus others aimed at achieving similar outcomes. If research is available, it should be easy to understand, believable, and replicable under conditions found in most schools. Obviously, if the research shows only a slight gain in student learning in response to a large expenditure of money and effort, the value of the approach is questionable.

We must be cautious about how the word "research" is used. Much education research is not actually research but simply the opinion of people in the field. A strong signal that the research base for an approach may be less than ample is when the list of "endorsers" far exceeds the list of research references. This type of list is often an attempt to secure acceptance on the basis of endorsements by experts and should not be confused with high-quality research.

Before adopting a program or technique, educators must consider whether the approach and its outcomes are clearly defined. A clear description is easy to understand, paints a picture of what teachers and students will be doing, and indicates the amount of time devoted to instruction. Clearly defined approaches convey what students will be able to do at the end of a given period and describe measures to evaluate whether those outcomes are reached.

—Sam Miller is a research associate with the National Center to Improve the Tools of Educators (NCITE) and a middle school science and mathematics teacher.
Before deciding to adopt a program or technique, look at the evidence of the program's effectiveness. Ask what claims are made regarding the program's impact on student performance. Are students supposed to gain global knowledge; become better able to analyze, synthesize, or generalize; or score higher on tests of knowledge or skills? What is the predicted impact, and what is the evidence to support the claim?

When examining supporting evidence, consider the sample of students used in the pilot program. Were those students like the ones in your school? If not, it may not be reasonable to expect the results to be the same. Also consider whether the evidence is objective. In appraising a reading program or cooperative learning program, for example, ask whether the evaluations were conducted by a disinterested third party or by the program's originator.

Further, consider whether the evidence is consistent. The results of one study tell us little. The accumulation of evidence over time, in a variety of settings, with a variety of students, is the real "gold standard." If one study shows good results, check whether other studies have produced similar findings.

Finally, all educators possess skills and experience that can complement a careful examination of the research evidence. Using both provides a powerful multidimensional approach for making decisions about program adoptions.

—Jane Stallings is dean of the College of Education at Texas A&M University and president of the American Educational Research Association (AERA).
Selecting a new approach to teaching is a complicated decision in which research evidence should be one of several factors considered. Educators should begin with a thoughtful analysis of their concerns to ensure that an innovation will serve a genuine need. They should consider whether a potential program would help meet their school's goals for its children, and whether its philosophy is compatible with their approach to pedagogy. They should also consider whether it is based on sound research and theory, matches pertinent national or state standards, and meshes with what experience has taught them works best.

In examining effectiveness, educators should be open to many different types of evidence. Few programs undergo randomized trials, but a number have taken the critical step of evaluating their results in comparison to some objective standard (e.g., norming samples, national statistics, or research-based standards). Educators should also examine the conditions under which success was obtained to learn how the innovation might work for them.

Finally, schools should ask questions about implementation. Are stakeholders in the school and community likely to become committed to the innovation? Will enough support be available on an ongoing basis to do the program well? (Such support is particularly important for a complex program.) How much training and follow-up assistance will be needed, and are such services part of the package or available locally?

—Elizabeth Farquhar is staff director for the National Diffusion Network, a U.S. Department of Education program that disseminates exemplary programs nationwide.