Rigor, Relevance, and Utilization
The Education Sciences Reform Act of 2002 established a new organization within the U.S. Department of Education, the Institute of Education Sciences. The statutory mission of IES is to expand knowledge and provide information on: a) the condition of education (through the National Center for Education Statistics); b) practices that improve academic achievement (through the National Center for Education Research); and c) the effectiveness of federal and other education programs (through the National Center for Education Evaluation and Regional Assistance). The Institute is led by a director and overseen by the National Board of Education Sciences. IES has a full-time staff of about 220 and is responsible for roughly half a billion dollars annually in external grants and contracts.
IES was born out of a shared sense among policy makers that education practice and research are badly in need of reform. Capturing this view, the National Research Council concluded that “the complex world of education – unlike defense, health care, or industrial production – does not rest on a strong research base. In no other field are personal experience and ideology so frequently relied on to make policy choices, and in no other field is the research base so inadequate and little used” (NRC, Hauser, and Heubert, 1999).
There are three challenges that will have to be met before education is transformed into an evidence-based field: the rigor, relevance, and utilization of education research. First, the rigor of education research will have to be enhanced. Far too much education research is based on methodologies that cannot support the questions that are addressed or the conclusions that are drawn. Over the past decade, 38 percent of the primary research reports in the American Educational Research Association’s two premier journals involved qualitative methods, the results of which were often used to support causal conclusions. Qualitative and interpretive methods have their place in the array of methodologies that can be deployed in sophisticated programs of education research, but they do not answer the practical issues of program efficacy that are critical for most practitioners.
A similar problem exists within the quantitative realm, in which correlational analyses are frequently used to argue for or against education policies. The telling practical problem is that correlational datasets permit multiple forms of analysis and differing policy conclusions. For instance, a correlational analysis of the effects of high-stakes testing recently sparked criticism that led one of the authors of the original study to defend it this way: “I’ve had a lot of people reanalyze our data and each and every one of them have come up with different results” (Viadero, 2003). That is not encouraging news for a policy maker. The shifting sands of correlational and qualitative analyses cannot be the principal elements in the foundation of empirically driven education policy. For that we will need randomized trials and experimental methods, which are currently rare.
The second challenge is to increase the relevance of education research. IES recently surveyed education practitioners on the usefulness of research. A typical response was, “There may be less than 1 percent of the existing research that’s really meaningful to teachers. … Teachers need strategies, practices. Give them things that can help teaching and learning, things that can help kids.” While we may quibble about the percentage, there is no doubt in my mind that there is a dearth of education research that addresses practical problems in powerful ways.
The third challenge is to translate the results of education research into practice. Producing new education research that is both rigorous and relevant will help. However, the history of other fields suggests that more is involved in the utilization of good research than its mere presence. In medicine, for instance, the FDA requires evidence of efficacy before approving new pharmaceuticals. This has created a thriving market for high-quality medical research. Education will adopt research-based approaches much more rapidly if there are differential consequences for education decision-makers whose choices are or are not grounded by evidence.
IES is moving aggressively to advance the rigor, relevance, and utilization of education research. From funding announcements through peer review to final funding decisions, IES processes are designed to assure that successful grantees and contractors employ rigorous methodologies. Regardless of what we have budgeted for a grant program, we fund only those grant applications in which methods are rigorous and matched to the questions and hypotheses that will be addressed.
We are assuring the relevance of our research investments by tuning our priorities to the needs of education practitioners and policy makers. Examples are our research programs in preschool curriculum, and socialization and character development. In both cases, a number of university-based research teams implement interventions in schools, while a national contractor collects shared measures of implementation and outcomes across research sites. Within a funding cycle of a few years, we hope to identify preschool curricula and school-wide character development programs that work. Because not all areas in which practitioners need good research are ready to support full-blown school-based interventions, we also fund research programs that encourage the first stages of translating laboratory research into educational settings. For instance, our program of research in cognition and student learning funds cognitive psychologists and behavioral neuroscientists to transfer their research into schools.
The What Works Clearinghouse is critical to our effort to promote the utilization of research in education decisions. It is fundamentally different from prior national efforts to synthesize evidence on education programs, in that it relies on a set of specific and clearly described protocols for judging the scientific quality of studies. For the first time, an administrator faced with the choice of mathematics curricula for elementary school will be able to determine which have rigorous evidence of effectiveness associated with them. As the results from the What Works Clearinghouse come to play a role in state and federal decisions on discretionary grants and the flow of program funds to schools, we expect that practitioners will want to consider evidence on what works and that program developers will want to produce it.
For over 100 years psychology has focused on topics such as learning, memory, intelligence, motivation, socialization, and development that are still front and center in education. Doctoral programs in psychology have long inculcated the methodological and logical skills that are the intellectual capital for the rigorous science that education so badly needs. Yet psychology awards only about 200 doctorates a year in the sub-fields most relevant to education, and in 2001, only 16 doctoral recipients in educational psychology were involved in research within one year of the receipt of their degrees. Transforming education into an evidence-based field is very important work for the nation. It will require training new researchers in sufficient numbers to address the many tasks at hand, and the best efforts of current research psychologists, including many who have not previously focused on education. IES will nurture the growth and influence of the education sciences, including, most certainly, psychology.
References
National Research Council, Hauser, R.M., & Heubert, J. P. (Eds.). (1999). Improving student learning: A strategic plan for education research and its utilization. National Academy Press.
Viadero, D. (2003, April 16). Study finds higher gains in states with high-stakes tests. Education Week, 22(31), 10.