August 2003 | Number 34
Researchers, practitioners, and policymakers have long endorsed the idea of grounding teaching practice in evidence from current research; until the passage of the No Child Left Behind Act (NCLB) in 2002, however, there were few mandates for putting research into practice. Now NCLB, the nation's principal legislative tool for federal support of K-12 education, includes more than 100 references to “scientifically based research” (SBR) and ties funding for a wide variety of programs, from reading to professional development, to scientifically based evidence of effectiveness. This emphasis on SBR has raised a number of critical issues.
With the passage of No Child Left Behind, the concept of scientifically based research has taken on immediate importance. To access much of the federal funding allocated through NCLB, states and districts will be required to adopt programs and policies that are supported by scientifically based research, and teachers will need to adapt their practice to reflect the competencies necessary to implement the new programs. States have already begun to struggle with creating Reading First grants that are grounded in scientifically based research, and teachers will ultimately feel the effect of those grants through embedded professional development activities and curricular materials. A clearer understanding of the history and context of these issues will help stakeholders make more informed decisions that hold the potential for improved long-term effects on teaching and learning.
As the ultimate consumers of research-based products and policies, state and local policymakers, as well as building-level educators, need a firm grounding in the methodological, ethical, and funding issues associated with the definition of SBR. Additionally, researchers need to be aware of the pressures that a legislated definition of SBR may have on funding streams, as well as the demand for and relevance of their research efforts.
In general, the scientific process requires a set of defined steps that occur across research designs. This process can result in research that establishes a direct causal relationship (event A causes event B) or simple associations (events A and B are related). Both types of research involve the same basic steps: observation and identification of a problem or question, development of a hypothesis that addresses the question, application of a test to determine the effect of the hypothesis on the research question, analysis of the test results, and replication and verification of the research. By repeating and refining this process, researchers can devise and test more accurate hypotheses, leading to a better understanding of our world.
Researchers can take one of three approaches as they develop their method: a quantitative, qualitative, or mixed approach (both quantitative and qualitative). As its name suggests, quantitative research focuses on gathering data that can be quantified and statistically analyzed. Qualitative research, primarily descriptive in nature, focuses on collecting open-ended observations and data. While quantitative research is exclusive (focusing on specific variables within an event), qualitative research is inclusive (focusing on the larger context of the event). Finally, quantitative research has been called deductive (deducing the relationship between variables), while qualitative research has been described as inductive (gathering meaning from patterns seen during repeated observations). Adherents of both approaches have long debated the worth of the two methodologies, and NCLB's definition of research falls into this debate.
According to NCLB, scientifically based research involves the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge about education activities and programs. Such research employs empirical methods that draw on observation or experiment; tests hypotheses and justifies conclusions; is replicable; uses an experimental or quasi-experimental design with a preference for random assignment; provides sufficient detail to allow other researchers to build on findings; and has been accepted by a peer-reviewed journal or approved by a panel of independent experts (see Figure 1).
Figure 1. Three definitions of scientifically based research.
Scientifically Based Research According to the No Child Left Behind Act (Source: PL 107-110, Title IX, Sec. 9101 (37)).
Six Scientific Principles According to the National Research Council (Source: Shavelson & Towne, 2002).
Scientifically Based Research Standards According to the Institute of Education Sciences (Source: PL 107-279, Title I, Sec. 102 (18)).
The NCLB definition thus favors a quantitative methodology, thereby excluding almost all qualitative research. Additionally, labeling one type of research as “scientifically based” creates a false dichotomy, implying that all other research is not based in science.
A more inclusive definition, offered by the National Research Council (NRC), an arm of the National Academy of Sciences, is based on six principles of scientific inquiry that speak to the importance of a scientific culture and community within the field of research. Similar to this definition is the description of scientifically based research standards in the recent reauthorization of the U.S. Department of Education's Office of Educational Research and Improvement (OERI), now the Institute of Education Sciences (IES). (See Figure 1.) Principle 3 of the NRC definition and Standard 7 of the IES definition reflect a respect for a range of methodologies that is missing from the NCLB definition.
The American Educational Research Association (AERA, 2003) has also called for a more inclusive definition and expressed “. . . dismay that the Department of Education through its public statements and programs of funding is devoting singular attention to this one tool of science [randomized trials], jeopardizing a broader range of problems best addressed through other scientific methods.”
Much of the concern about the definition can be traced to issues of control. Items four and six of the NRC definition highlight this point, emphasizing the position that good research is shared and held up to the larger body of researchers for critique. NRC also emphasizes the sharing of data and methodology. These aspects of professional control, also reflected in the AERA statement, are largely missing from the NCLB and IES definitions. Indeed, a key question left unanswered in NCLB is how SBR will be adjudicated.
The question of adjudication was answered, at least in part, in the legislation reauthorizing OERI as the Institute of Education Sciences. Congress charged IES with reviewing not only all grant applications in excess of $100,000 but also the research products of grant recipients. The legislation also defined three additional types of scientific standards, with three different levels of flexibility: scientifically based research standards (see Figure 1), scientifically valid education evaluation, and scientifically valid research. The most restrictive of the definitions relates to scientifically valid education evaluation; the Institute is charged with adhering to “the highest possible standards of quality with respect to research design and statistical analysis,” as well as employing “experimental designs using random assignment, when feasible, and other research methodologies that allow for the strongest possible causal inferences when random assignment is not feasible.” Through this language IES is positioned to be a principal arbiter of federally funded scientifically based research.
With the charge of providing a “trusted source of scientific evidence of what works in education” (U.S. Department of Education, 2003), IES also established the What Works Clearinghouse (WWC) in early 2003. The standards the WWC plans to use in gathering the research reflect the emphasis on experimental design and random assignment found in both NCLB and the IES legislation. Only studies involving random assignment of participants will be included in the gathering of studies for use in WWC meta-analyses, excluding all correlational and nonquantitative research from the start (WWC, 2003). While the WWC will not necessarily evaluate programs that claim to meet SBR requirements, its reports will inform stakeholders of programs and strategies deemed to have scientific support, much as the National Reading Panel (NRP) report has done for reading research and access to Reading First grants.
Scientifically based research requirements hold the potential to affect both what teachers do and how they learn. School improvement plans will need to use programs that are scientifically based, and practitioners will need to be familiar with these programs and trained in how to implement them. Reading First grants and other curriculum-focused efforts will require teachers to align their practice with approved practices and programs. Because these programs and practices are just now being implemented, it is difficult to determine the ultimate effect; however, it is likely that teacher practice will be affected both in the implementation of new curricula and in professional development. A variety of current professional development opportunities, including attendance at conferences and participation in higher education courses, may not meet SBR requirements and so may need to be funded with resources separate from those provided by NCLB.
Because the body of education research that meets NCLB standards is still being identified, the extent of activities falling within the SBR standards will be relatively small. This could present significant opportunities for teachers interested in participating in research projects. Additionally, increased use of strategies and materials that have been proven effective could strengthen teaching and improve learning.
District-level policymakers are faced with a variety of issues stemming from the new emphasis on experimental research. New York City provides perhaps the most cogent example. In applying for a Reading First grant, city school leaders selected a program they felt was strongly grounded in research; however, Reid Lyon of the National Institute of Child Health and Human Development, an ardent supporter of the NRP report, publicly stated that the program is not based on sound science. At this time, the U.S. Department of Education has neither accepted nor rejected New York's application; however, it could clearly be a test case for how the Administration plans to interpret the SBR requirements (Manzo & Hoff, 2003).
Another concern is that the federal funding process may become overtly politicized. If that were to happen, programs accepted as scientifically based by one administration might suddenly be labeled “unscientific” under a different administration. Policymakers should create the capacity for holding research to a very high standard of neutrality to avoid such swings in policy.
Through NCLB, the new Institute of Education Sciences, and the What Works Clearinghouse, the Administration (as represented by the Department of Education) is positioned to be the primary evaluator of education research quality, bypassing the traditional quality control role of independent researchers and journals. Because the Department of Education is also a major source of funding for research, the emphasis on experimental methodologies could significantly alter the design of future research as grant applicants align their studies with Department interests. Because good design stems from the research question, a narrowing of research designs may also significantly alter the types of questions being investigated.
Within all of these issues are also the unspoken but cogent issues of research integrity and ethics. The relationship between policymakers, researchers, and practitioners is a complicated one. In an ideal world, policymakers would design policy that research suggests would improve teaching and learning; practitioners would seek to implement such policy and improve their practice as new information became available; and researchers would seek input from policymakers, practitioners, and learners in deciding what questions are most relevant. All groups would work together in the research process by designing research, running studies, evaluating outcomes, and refining conclusions. If this cooperation is truly to occur, there must be trust among all stakeholders and an ethical approach to conducting research involving children.
Some researchers have expressed the concern that research will be used to promote, rather than inform, education policy. Two recent federally supported studies have come under fire regarding this concern: the NRP report Teaching Children to Read and the first-year evaluation of the 21st Century Community Learning Centers (Jacobson, 2003; Garan, 2001).
In Teaching Children to Read, the NRP determined that systematic instruction in phonics was the most effective approach to use in reading instruction (National Institute of Child Health and Human Development, 2000). In Congressional testimony related to the NRP report, Under Secretary of Education Eugene Hickok noted, “The findings of years of scientific research on reading are now available, and application of this research to the classroom is now possible for all schools in America, including preschool environments.” In reviewing the current body of reading research, the NRP limited its analysis to research that was experimentally designed with random assignment of participants; as a result, the NRP determined that 38 studies were of sufficient quality to be included in a meta-analysis of reading instruction. The narrow focus of the meta-analysis concerned one of the panel members strongly enough that she felt compelled to write a minority report disputing the findings (Yatvin, 2000). Reanalysis of the 38 studies by researchers from the National Institute for Early Education Research and Rutgers University also contradicted the NRP's conclusion, finding instead that while phonics is important, a more effective approach to teaching reading also includes language activities and individual tutoring (Camilli, Vargas, & Yurecko, 2003).
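A meta-analysis of the kind described above pools the effect sizes reported by separate studies into one weighted estimate. A minimal sketch of the standard inverse-variance (fixed-effect) pooling step is shown below; the effect sizes and variances are invented for illustration and are not taken from the NRP's 38 studies.

```python
# Fixed-effect meta-analysis: pool per-study effect sizes by
# inverse-variance weighting. All numbers below are illustrative
# only, not data from the National Reading Panel report.
import math

def pool_fixed_effect(effects, variances):
    """Return the pooled effect size and its standard error."""
    weights = [1.0 / v for v in variances]          # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))              # SE of pooled estimate
    return pooled, se

# Three hypothetical studies: (standardized effect size, variance)
effects = [0.45, 0.30, 0.60]
variances = [0.04, 0.02, 0.09]

pooled, se = pool_fixed_effect(effects, variances)
print(f"pooled effect = {pooled:.3f}, 95% CI = "
      f"({pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f})")
```

Because larger (lower-variance) studies dominate the weighted average, which studies clear the inclusion bar, here, the NRP's random-assignment requirement, can change the pooled conclusion, which is one reason reanalyses with different inclusion or coding decisions, such as Camilli, Vargas, and Yurecko (2003), can reach different findings from the same literature.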
The first-year evaluation of the 21st Century Community Learning Centers, conducted by Mathematica Policy Research (2003) for the Department of Education, has been criticized for its finding that the program has had little effect on student achievement or behavior. Seven members of the study's technical working group issued a statement criticizing the report's methodology and the way it has been used to justify proposed cuts to funding. The critique touches on an important point: the political use of research.
To protect the integrity of the research process, as well as the independence of researchers, Congress included language in the IES reauthorization bill requiring that the Institute's research activities be “objective, secular, neutral, and nonideological and be free of partisan political influence and racial, cultural, gender, or regional bias” (PL 107-279, Part A, Sec. 111, 2, B). Similar language designed to protect the Institute's objectivity is used throughout the legislation. In signing the bill into law, however, President Bush reserved the right to construe such provisions as only advisory (Bush, 2002), and effectively “to establish a research agenda and suppress the publication of any finding to which it [the Bush Administration] might object” (“Bush Reserves Right to Intervene in ED Research,” 2002). Because the Department of Education is a part of the Executive Office, such a stance is not necessarily unusual; however, if the standard for objectivity is compromised, the trust necessary for collaboration between stakeholders will be eroded.
Another complicating factor in conducting any education research relates to the use of children as subjects in an experiment. Although medical research is frequently cited as the standard to which education researchers should aspire, very few medical studies actually focus on the needs of children. One important reason is ethical considerations. Adults can give “informed consent” to participate in experimental research; however, such consent is much more difficult to obtain for children (Meadows, 2003). In most cases, consent must first be obtained from the parents and then again from the child. Additionally, parents may be reluctant to submit their children to experimentation. Although such strict standards do not necessarily apply to the evaluation of educational interventions, education researchers have generally subjected their research plans to the appropriate institutional review boards. They also must seek permission from the state or school district, individual schools, teachers, and, in some cases, parents for research projects.
There are also practical difficulties involved in designing school-based experiments. In experimental research, a group of subjects is offered a treatment and is then compared to a control group of subjects that did not receive the treatment, so that the effect of the treatment can be measured. In education research, it would not be feasible, for example, to refuse to teach a control group of children how to read so that the effects of a reading program could be measured. Instead, different treatments must be compared and their educational value estimated. For example, two groups of children would be taught to read using two different programs. The result might be that program A would help children learn to read better than program B. In both cases, however, the subjects would be learning to read, so the programmatic effects would be more difficult to measure.
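The two-program comparison described above can be sketched as a simple simulation. Every number below is invented for illustration: pupils are randomly assigned to program A or program B, both groups learn to read (scores cluster around a common baseline), and the quantity actually estimated is the modest difference between programs.

```python
# Sketch of a randomized two-treatment comparison: pupils are randomly
# assigned to program A or program B, and the difference in mean
# post-test scores estimates the relative effect of the two programs.
# All scores are simulated for illustration; no real program data.
import random

random.seed(42)  # reproducible illustration

pupils = list(range(200))
random.shuffle(pupils)                        # random assignment
group_a, group_b = pupils[:100], pupils[100:]

# Simulated post-test: baseline ability ~ N(70, 10) plus an assumed
# program effect (A is given a larger boost than B for illustration).
scores_a = [random.gauss(70, 10) + 5.0 for _ in group_a]
scores_b = [random.gauss(70, 10) + 2.0 for _ in group_b]

mean_a = sum(scores_a) / len(scores_a)
mean_b = sum(scores_b) / len(scores_b)
print(f"mean A = {mean_a:.1f}, mean B = {mean_b:.1f}, "
      f"estimated difference = {mean_a - mean_b:.1f}")
```

Because both groups receive instruction, the estimated difference is small relative to the spread of individual scores, which is why such contrasts demand larger samples and more careful outcome measures than a treatment-versus-no-treatment design would.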
Another practical difficulty is controlling for extraneous learning. Much of a student's life outside of school is spent learning—from parents, peers, and the community. This extraneous learning can make it difficult to identify the effect of a specific intervention conducted in school.
Still another difficulty is defining the outcomes to be measured. In the example about reading instruction, how the researcher defines reading (e.g., reading individual words, vocabulary, reading for understanding) can affect the outcome of the study. Subjects could be taught the skills to read specific words—and score well on a word-reading test—but still have difficulty understanding content.
The practical and ethical difficulties of conducting experimental research in educational arenas do not preclude such study; rather, they add complexities to ensure the integrity of such research. These complexities can make it much more expensive to run studies, and in an already underfunded field the expense can quickly become prohibitive. Less expensive methodologies can work around these issues and still generate accurate proxy measures or correlations between interventions and outcomes, but these methodologies might not be supported by NCLB's definition of scientifically based research.
Funding mechanisms—both public and private—can also be used to drive the direction of research. Compared to other public research endeavors, education research has long been underfunded (see Figure 2). This history of underfunding creates a competitive market for research dollars, which, in turn, can cause researchers to align their research with the questions and methodologies they think the funding organizations want them to use (Gowri, 2000) and consequently limit the range of research questions available for consideration.
Figure 2. Federal support for research, FY 2002 (est.), comparing education with other agencies, including the Department of Health and Human Services and the National Science Foundation. (Source: Hoffman, 2002, p. 10.)
The potential effect of strengthening research and targeting outcomes to student learning holds great promise. If research can be improved and more directly focused on causation, with a concomitant focus on student learning, then policymakers, researchers, teachers, and students could reap great rewards. However, the risk of unintended consequences inherent in implementation of any new policy could prove equally damaging. All stakeholders should approach the issue with thoughtfulness and care.
As Pellegrino and Goldman (2002) observe, “issues of scientific method and quality are far too complex, contextualized, and nuanced” to be specifically legislated. While the quality of education research has been widely criticized, the complaints may have more to do with the complexities of research design and the effects of severe underfunding than with a lack of rigor or methodological weakness. Policymakers clearly hope to strengthen research and help educators determine the best methods and materials to use in the classroom, but legislating or mandating specific methodologies may not improve the body of scientific knowledge related to teaching and learning. In fact, the current definitions of research found in NCLB and the IES legislation may simply confuse an already complex issue as stakeholders attempt to divine which definition applies to their specific situation. Even if one workable definition were agreed upon, however, issues of inadequate funding and methodological complexity would still remain.
Historical underfunding, as well as the practical and ethical complications associated with education research, suggests that a more flexible approach may be necessary. Mandating a methodological approach may limit the type of research that can be conducted as well as the research questions that can be asked. Instead, policymakers may wish to consider policies that will recognize the range of methodologies but require the profession to rigorously police itself, either through strengthened review of research by the Department of Education or by increasing the capacity of higher education institutions to ensure research quality. Establishment of policies that help expand the capacity for participation by all stakeholders in the research process—as well as broadening the community to include researchers, policymakers, practitioners, and learners—could also have an effect on making research relevant to practitioners and funders, ultimately increasing quality. Each population offers valuable input, and the struggle by any one group to control the community risks damaging the level of trust among stakeholders that is necessary for success. Policies that help teachers take the initiative to seek out or conduct research through professional development experiences and classroom support can also strengthen research use and application.
Addressing issues of underfunding while strengthening peer and institutional review processes may bring education research a new respectability. If federal support for education research and application were raised to match that of the Department of Energy, the budget would be quadrupled; if it matched the research investment of the Department of Health and Human Services, the budget for education research would be expanded 17-fold. The investment would be comparatively small, but the return potentially great.
ASCD is committed to providing the education profession with research-based resources designed to promote evidence-based practice. As part of that commitment, ASCD supports the use of high-quality research that is rooted in theory and based on rigorous qualitative or quantitative methods that are generally accepted by the broader community of scientists and researchers. Selection of the appropriate inquiry methodology must stem from the research questions being asked. It is ultimately the responsibility of the larger community of scientists, researchers, and educators to evaluate and interpret the meaning and results of educational research efforts, and protect the integrity of the scientific process.
—Gene R. Carter, ASCD Executive Director
American Educational Research Association. (2003, January 26). Resolution on the essential elements of scientifically based research. Retrieved May 19, 2003, from http://www.aera.net/meeting/councilresolution03.htm
American Educational Research Association. (n.d.). ED Web information disappearing. Action Alert. Retrieved May 21, 2003, from http://www.aera.net/communications/news/federal.htm
American Public Health Association. (2002). Ensuring the scientific credibility of government public health advisory committees. Association News. Policy Statement LB02-2. Washington, DC: Author. Retrieved May 21, 2003, from http://www.apha.org/legislative/policy/policysearch/index.cfm?fuseaction=view&id=291
Benson, E. (2003). Political science: Allegations of politicization are threatening the credibility of the federal government's scientific advisory committees. Monitor on Psychology, 34(3). Retrieved April 1, 2003, from http://www.apa.org/monitor/mar03/political.html
Bush, G. (2002, November 5). Statement by the President. Washington, DC: Office of the Press Secretary. Retrieved May 21, 2003, from http://www.whitehouse.gov/news/releases/2002/11/20021105-4.html
Bush reserves right to intervene in ED research. (2002, December). School Law News. Washington, DC: Aspen Publishers, Inc.
Camilli, G., Vargas, S., & Yurecko, M. (2003, May 8). Teaching children to read: The fragile link between science and federal education policy. Education Policy Analysis Archives, 11(15). Retrieved May 16, 2003, from http://epaa.asu.edu/epaa/v11n15/
Curley, B. (2003, March 28). Federal advisory-panel members question vetting process. Join Together Online. Retrieved April 1, 2003, from http://www.jointogether.org/sa/news/features/print/0,1856,562428,00.html
Faucher, A. (2003). Bush shows economists the door—literally. The Dismal Scientist. West Chester, PA: Economy.com. Retrieved May 22, 2003, from http://www.economy.com/home/article.asp?aid=2231 (subscription required)
Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific culture and educational research. Educational Researcher, 31(8), 4–14. Retrieved May 23, 2003, from http://www.aera.net/pubs/er/pdf/vol31_08/AERA310803.pdf
Garan, E. M. (2001). Beyond the smoke and mirrors: A critique of the national reading panel report on phonics. Phi Delta Kappan, 82(7), 500–506. Retrieved May 16, 2003, from http://www.pdkintl.org/kappan/k0103gar.htm
Gowri, A. (2000). A market approach to research integrity. Proceedings of the first ORI research conference on research integrity. Washington, DC: Office of Research Integrity, U.S. Department of Health and Human Services (pp. 315–319). Retrieved May 21, 2003, from http://ori.dhhs.gov/multimedia/acrobat/papers/gowri.pdf
Hardin, B. (2003, May 18). Bush shows no fear in grizzly territory: White House takes on popular icon with mining plan. The Washington Post, p. A1. Retrieved May 22, 2003, from http://www.washingtonpost.com/ac2/wp-dyn/A4449-2003May17
Hoffman, C. (2002). Federal support for education: Fiscal years 1980–2002. Washington, DC: National Center for Education Statistics, U.S. Department of Education (p. 10). Retrieved May 22, 2003, from http://nces.ed.gov/pubs2003/2003006.pdf
Jacobson, L. (2003, May 21). After-school report called into question. Education Week. Retrieved May 21, 2003, from http://www.edweek.org/ew/ewstory.cfm?slug=37century.h22
Mathematica Policy Research. (2003). When schools stay open late: The national evaluation of the 21st-Century Community Learning Centers Program first year findings. New York: Author. Retrieved May 21, 2003, from http://www.mathematica-mpr.com/PDFs/whenschools.pdf
Meadows, M. (2003). Drug research and children. FDA Consumer Magazine, 37(1). Retrieved May 22, 2003, from http://www.fda.gov/fdac/features/2003/103_drugs.html
National Institute of Child Health and Human Development. (2000). Report of the National Reading Panel. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups (NIH Publication No. 00-4754). Washington, DC: U.S. Government Printing Office. Retrieved May 16, 2003, from http://www.nichd.nih.gov/publications/nrp/report.htm
Pellegrino, J. W., & Goldman, S. R. (2002). Be careful what you wish for—you may get it: Educational research in the spotlight. Educational Researcher, 31(8), 15–17. Retrieved May 19, 2003, from http://www.aera.net
Shavelson, R. J., & Towne, L. (2002). Scientific research in education. Washington, DC: National Academy Press.
U.S. Department of Education. (2003). What Works Clearinghouse. Retrieved May 19, 2003, from http://www.ed.gov/offices/IES/NCEE/wwc.html
What Works Clearinghouse. (2003, March 5). Study design and implementation assessment device, Version 0.6. Washington, DC: Author. Retrieved May 19, 2003, from http://www.w-w-c.org/standards06.doc
Yatvin, J. (2000). Minority Report. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction. [Online]. Retrieved May 23, 2003, from http://www.nichd.nih.gov/publications/nrp/minorityview.pdf
Copyright © 2003 by Association for Supervision and Curriculum Development