All IB graduate students should have received an email announcing the new batch of programs offered by the university to help graduate students with one of the most difficult academic skills: writing.
“Dear Graduate Faculty, Coordinators, and Graduate Students: Our analytics indicate that those students who receive writing instruction and consultation early in their graduate programs complete their degree requirements sooner. Our programs are designed for graduate student writers at all levels and stages. Individual Writing Consultations: (starts January 15).”

The claim made at the start of the email (“Our analytics indicate […] complete their degree requirements sooner.”) is contradicted by a recent publication that received some exposure: “Null effects of boot camps and short-format training for PhD students in life sciences” by Feldon et al. (2017). The title already gives the results away, but here is the main conclusion:
“Here we show that participation in such programs is not associated with detectable benefits related to skill development, socialization into the academic community, or scholarly productivity for students in our sample.”
This study was no slouch: 295 participating students were followed for several years, with performance evaluated using standard rubrics, and so on. As a result, the pushback has had a hard time attacking the quality of the study itself. For instance, the Software Carpentry organization, whose reason for existence is offering these workshops, responded to this almost existential threat in quite some detail. While I perceived an undertone of defensiveness in their response, I think the most plausible explanation for these null effects is the one the Software Carpentry people themselves identify:
“Benefits of a short course are easily lost in a sea of positive outcomes resulting from graduate training, but that has little bearing on the impact such courses may have when they stand alone.”

But that is of course also a sort of cop out. Then again, given that “only” 16% of the participants attended such a workshop, failing to detect the influence of a single workshop amid the wide range of personal outcomes is not that surprising. This is one of the cases where a post-hoc power analysis might have been informative.
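To get a feel for what such a power analysis would say, here is a back-of-the-envelope sketch (my own, not the authors'): with roughly 295 students of whom about 16% attended a workshop, what is the smallest standardized effect (Cohen's d) a simple two-group comparison could detect at the usual 80% power? The group sizes and the normal-approximation formula are assumptions for illustration only.

```python
# Hypothetical minimum-detectable-effect calculation, not the study's analysis.
# Normal approximation for a two-sample comparison of means.
from statistics import NormalDist

n_total = 295
n_treated = round(0.16 * n_total)   # ~47 workshop participants (assumed)
n_control = n_total - n_treated     # ~248 non-participants

alpha, power = 0.05, 0.80
z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
z_power = NormalDist().inv_cdf(power)

# Smallest Cohen's d detectable with 80% power at alpha = 0.05
min_d = (z_alpha + z_power) * (1 / n_treated + 1 / n_control) ** 0.5
print(f"minimum detectable Cohen's d ~ {min_d:.2f}")  # roughly 0.45
```

In other words, under these assumed numbers only a medium-sized effect (d of about 0.45 or larger) would have been reliably detectable, which makes a null result for a single two-day workshop rather unsurprising.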
The second, standard, scientific cop out is that the authors did not measure the correct outcome. The endpoint identified by the original notification email (time to completion) is very different from the endpoints measured in the long-term study (productivity, academic skills development, socialization into the academic community). However, the appendix of the PNAS article does include a variable, “Time to Degree (T2)”, which is not significant either. It would be interesting to take a closer look at the Guelph data in that respect.
So while the effectiveness of these workshops at a global level is not exactly clear, I think that one thing is clear from a graduate student perspective: you have to start somewhere. You will not immediately become the best writer in your cohort because you took a 2-day writing tune-up workshop, but you will become a better writer compared to the old you.