Thanks to Benjamin Riley, this morning I learned about the results of a big new growth mindset study that was released yesterday, Where and For Whom Can a Brief, Scalable Mindset Intervention Improve Adolescents’ Educational Trajectories? (happily, not behind a paywall).
It’s written by a zillion of the biggest names in Social Emotional Learning Research (David Yeager, Paul Hanselman, David Paunesku, Christopher Hulleman, Carol Dweck, Chandra Muller, Robert Crosnoe, Gregory Walton, Elizabeth Tipton, Angela Duckworth).
Using a representative sample of U.S. schools and their students, they found that having students complete two twenty-five-minute online lessons about a growth mindset resulted in a small but important academic gain (measured by GPAs), with larger improvements among students who had a track record of academic and socioeconomic challenges.
They also found greater gains in schools they say “support greater challenge-seeking or academic effort.” That makes sense to me, though their measure of that climate seems a little odd (whether students chose to do more challenging math problems on a test).
Though they don’t really describe the content of the online lessons in detail, they fortunately point to a previous paper that does – Using Design Thinking to Improve Psychological Interventions: The Case of the Growth Mindset During the Transition to High School (and that paper also is not behind a paywall!). The content is described pretty well there on pages 377-379.
I’m going to review that paper and the most recent version of the growth mindset lesson I teach (My Growth Mindset Lessons Usually Go Well, But What I Did Today Was The Best Yet (Student Hand-Outs Included)) to see what changes, if any, I might want to make to it.
I suspect that most teachers and schools teaching about the growth mindset don’t need a study to know that it’s effective, but it’s really nice to know that top-notch research like this paper supports our beliefs. And, with luck, it will bring even more people on board.
I’m adding this info to The Best Resources On Helping Our Students Develop A “Growth Mindset.”
It would be really helpful to us all if you also reviewed pp. 388-89, where the authors discuss the “limitations” of the study. There are many problems with self-report measures and Likert scales like these, and K-12 educators in CA deserve to know. I point this out in a related piece, “Got Grit: Maybe…” (Phi Delta Kappan, 2017). The issues are analogous for most noncognitive studies in education.
Thanks for your feedback. As you may know, the paper was recently taken offline because the authors “have received feedback and are now making improvements.” Unfortunately, I didn’t save a copy of the draft, so I am unable to review the pages you suggest. I’ll look forward to seeing their revision once it’s made available.
However, at the same time I think it’s fair to say that every paper has “limitations.” As Dylan Wiliam writes:
Moreover, in education, “What works?” is rarely the right question, because everything works somewhere and nothing works everywhere, which is why, in education, the right question is, “Under what conditions does this work?”
I do my best to provide the best short summary I can of important education research. One of the reasons I developed “The Best Resources For Understanding How To Interpret Education Research” (http://larryferlazzo.edublogs.org/2011/11/11/the-best-resources-for-understanding-how-to-interpret-education-research/) was to give educators tools to apply Wiliam’s caution.
Also, though I don’t have perfect recall of the paper, I do note that the outcome they used to measure success was grade point averages. Unless I’m missing something (which I may be), I don’t believe those were based on “self-report.”