Your research on the Hedges correction suggests that many education researchers mis-specify their analyses for clustered designs. What advice would you give researchers on selecting the right analyses for clustered designs?
My advice is to focus on the design of the study. If the design is wrong, then the analysis that matches the design will fail, and it is likely that no re-analysis of the collected data will be able to recover from the initial mistake. For example, a common design error is randomizing teachers to experimental conditions, but then assuming that how the school registrar assigned students to classes was equivalent to the experimenter randomizing students to classes. This assumption is false. Registrar-based student assignment is a kind of group-based, or clustered, random assignment. If this error is not caught at the design stage, the study will necessarily be underpowered because the sample size calculations will be off. If the error is not caught at the publication stage, the hypothesis test for the treatment effect will be anti-conservative, i.e., even if the treatment effect is truly zero, the test statistic is still likely to be (incorrectly!) statistically significant. The error will be caught, however, if the What Works Clearinghouse decides to review the study, but their application of the Hedges correction will not fix the design problem. The corrected test statistic will, at best, have low power, just like a re-analysis of the data would. At worst, the corrected test statistic can have nearly zero power. There is no escape from a design error.
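The anti-conservative behavior described above is easy to see in a small simulation. The sketch below (not from the interview; all parameters, such as the intraclass correlation and cluster sizes, are illustrative assumptions) randomizes whole classrooms to conditions but then analyzes the data with a naive student-level t-test, as if students had been individually randomized. Even with a true treatment effect of zero, the naive test rejects far more than 5% of the time; the standard design-effect formula, 1 + (m - 1) × ICC, quantifies the variance inflation being ignored.

```python
# Minimal sketch: why ignoring cluster randomization makes a naive
# student-level t-test anti-conservative. Parameters are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_clusters = 20   # e.g., 20 classrooms, half assigned to each condition
m = 25            # students per classroom
icc = 0.2         # assumed intraclass correlation
n_sims = 2000

# Design effect: variance inflation from treating clustered data
# as if it came from individual-level randomization.
deff = 1 + (m - 1) * icc   # = 5.8 under these assumptions

rejections = 0
for _ in range(n_sims):
    # Treatment effect is truly zero; only shared classroom noise exists.
    cluster_effects = rng.normal(0.0, np.sqrt(icc), n_clusters)
    y = (np.repeat(cluster_effects, m)
         + rng.normal(0.0, np.sqrt(1 - icc), n_clusters * m))
    treated = np.repeat(np.arange(n_clusters) < n_clusters // 2, m)
    # Naive analysis: student-level t-test that ignores the clustering.
    _, p = stats.ttest_ind(y[treated], y[~treated])
    rejections += p < 0.05

rejection_rate = rejections / n_sims
print(f"design effect: {deff:.1f}")
print(f"naive Type I error rate: {rejection_rate:.2f} (nominal level 0.05)")
```

With these assumed values, the naive test's Type I error rate lands far above the nominal 0.05, which is exactly the sense in which a significant result from such an analysis cannot be trusted.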
To give a bit of further, perhaps self-serving advice, I would also suggest engaging your local statistician as a collaborator. People like me are always looking to get involved in substantively interesting projects, especially if we can get involved at the planning stage of the project. Additionally, this division of labor is often better for everyone: the statistician gets to focus on interesting methodological challenges and the education researcher gets to focus on the substantive portion of the research.
How has being an IES predoc and now an IES postdoc helped your development as a researcher?
This is a bit like the joke where one fish asks another "How is the water today?" The other fish responds "What's water?"
I came to Carnegie Mellon for the joint Ph.D. in Statistics and Public Policy, in part, because the IES predoc program there, the Program for Interdisciplinary Education Research (PIER), would both fund and train me to become an education researcher. The PIER program shaped my entire graduate career. David Klahr (PIER Director) gave me grounding in the education sciences. Brian Junker (PIER Steering Committee) taught me how to be both methodologically rigorous and yet still accessible to applied researchers. Sharon Carver (PIER co-Director), who runs the CMU lab school, built a formal reflection process into the "Field-Based Experience" portion of our PIER training. That essay was, perhaps, the most cathartic thing I have ever written, in that it helped to set me on my career path as a statistician who aims to focus on education research. Joel Greenhouse (affiliated PIER faculty), who is himself a biostatistician, chaired my thesis committee. It was his example that refined the direction of my career: I wish to be the education sciences analogue of a biostatistician.
The IES postdoc program at Northwestern University, where I am advised by Larry Hedges, has been very different. Postdoctoral training is necessarily quite different from graduate school. One thread is common, however: the methodology I develop must be useful to applied education researchers. Larry is, as one might suppose, quite good at focusing my attention both on where I need to make technical improvements to my work and on how I might better communicate my technical results and make them accessible to applied researchers. After only a year at Northwestern, I have grown considerably in both my technical and communication skills.
What career advice would you give to young researchers?
Pick good mentors and heed their advice. To the extent that I am successful, I credit the advice and training of my mentors at Carnegie Mellon and Northwestern.
Comments? Questions? Please write to us at IESResearch@ed.gov.