Researchers should write about what matters in ways that can be understood

Ex-Secretary of State for Education Michael Gove appears to be roundly despised by teachers, and no longer even trusted by his own party to handle the education portfolio. However, if for nothing else, he should be congratulated and remembered for his role in the palpable increase in awareness of and engagement with educational research that we have seen over the past couple of years. It was he who commissioned the Cabinet Office's Behavioural Insights Team to produce the briefing Test, Learn, Adapt [1] and Ben Goldacre's monograph Building Evidence into Education [2], which in turn helped to catalyse Tom Bennett into creating the researchED movement. This movement is responsible in no small measure for bringing research to the forefront of teachers' collective consciousness.

One small element of this increased visibility is the regular section of Schools Week newspaper entitled Research Review, which critiques recently published articles on educational research.

In Issue 19 of Schools Week [3] the editor, Laura McInerney, reviewed two articles from Open Review of Educational Research. The first, Para-quantitative Methodology: Reclaiming experimentalism in educational research [4], was a sublime example of how easily researchers can undo the good work of Gove, Goldacre, Bennett, et al. on encouraging teachers to engage with research, by making it a turgid mix of impenetrable prose and philosophical navel-gazing. In her review, McInerney was right to complain about the opaqueness of the article, though she sold herself short when she suggested that it was she who was at fault because she "wasn't up to understanding it."

I take issue with the reviewed article on two fronts. First, the subject matter serves to re-hash the ridiculous quantitative vs qualitative stand-off, in which some people choose to align themselves with a paradigmatic tribe and then beat their chests about what they see as the failings of the other side. Two scientists who, on the face of it, might be considered to occupy opposing ends of this false binary have articulated the fallacy underpinning this kind of behaviour: Ann Oakley, in her book Experiments in Knowing [5], and Ben Goldacre, at the first researchED conference in 2013 [6]. Both emphasised the importance of the research question in determining the assumptions we make about the nature of reality and the way in which we approach finding out about it. There is no inconsistency in researchers adopting an 'interpretivist' stance if the questions they wish to address require interpretation (say, trying to understand how teachers feel about Ofsted inspections) and adopting a 'positivist' position if they are interested in addressing questions involving substantive empirical outcomes (say, the number of teachers who leave the profession within one year of being inspected by Ofsted).

By the same token, there is nothing wrong with mixing these methods and assumptions if, for example, you want to know the rate at which teachers leave the profession and what teachers say prompted them to leave. One just has to keep in mind that these are two questions, the answers to which must be thought of as separate but related, rather than combined to present us with the "10+happiness" equation used by McInerney to articulate the authors' problem with mixed methods, the substance of which informs their baroque proposed solution. Indeed, mixing of methods can be achieved very satisfactorily, with clearly described implications for practitioners, as Sandy Oliver of the IOE has shown in her article Advantages of concurrent preparation and reporting of systematic reviews of quantitative and qualitative evidence [7].

To their credit, the authors of the para-quantitative methodology article view what they describe as the "long-standing challenge between quantitative and qualitative methods" as being "more theoretical than practical or technical." However, this does not appear to have been sufficient to stop them producing sixteen pages of complex philosophical theory of questionable value to their intended readership.

Which brings me to my second issue with the article: the barely penetrable academese in which it was written. The nature of the content (my criticisms of it notwithstanding) is complex. However, this does not mean that it should be opaque to the lay reader, and certainly not to the informed peer. A short Twitter exchange about the article, and McInerney's coverage of it, over the weekend that followed prompted one tweeting academic to suggest that the criticism of the article constituted part of an "anti-academic meme" among educationalists, and to imply that problems of understanding were the fault of the reader for not being sufficiently attuned to the authors' vernacular. This is to get it completely backwards. If researchers want their work to be understood, the obligation for clarity rests with them. Just because something is complicated and complex does not mean it cannot be presented clearly and transparently. One need only look at Steven Pinker's writing on experimental psychology, or Richard Dawkins' explanations of evolutionary biology, to see extremely complex ideas communicated clearly and effectively.

The Open Review of Educational Research, in which the article was published, claims to be "committed to the principles of openness in education and research" [8]. It would be a good start, in my opinion, if the journal editors began by demanding that the doors to this openness were not so firmly closed by their authors' writing style that they provoked a Fulbright scholar to throw her hands up in despair.

References
[1] Haynes L, Service O, Goldacre B & Torgerson D (2012) Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials. London: Cabinet Office Behavioural Insights Team. Available online at https://www.gov.uk/government/publications/test-learn-adapt-developing-public-policy-with-randomised-controlled-trials [Accessed 13.03.2015]

[2] Goldacre B (2013) Building Evidence into Education. London: DfE. Available online at http://dera.ioe.ac.uk/17530/1/ben%20goldacre%20paper.pdf [Accessed 13.03.2015]

[3] Schools Week, Friday 13 February 2015, Edition 19, Research Review, p. 13

[4] Varaki BS, Floden RE & Kalatehjafarabadi TJ (2015) Para-quantitative Methodology: Reclaiming experimentalism in educational research. Open Review of Educational Research 2(1): 26–41

[5] Oakley A (2000) Experiments in Knowing: Gender and Method in the Social Sciences. UK: Polity Press

[6] Goldacre B (2013) ResearchED 2013: Ben Goldacre speaks at ResearchED Sept 7th 2013 – Part 2 of 2 (YouTube video clip). Available online at http://www.youtube.com/watch?v=TUpkumj0eE8 [Accessed 16.1.2014]

[7] Oliver S (2014) Advantages of concurrent preparation and reporting of systematic reviews of quantitative and qualitative evidence. JLL Bulletin: Commentaries on the history of treatment evaluation. Available online at http://www.jameslindlibrary.org/illustrating/articles/advantages-of-concurrent-preparation-and-reporting-of-systematic [Accessed 21.2.2015]

[8] Taylor & Francis Online: Open Review of Educational Research – Aims and Scope (2014). Available online at http://www.tandfonline.com/action/journalInformation?show=aimsScope&journalCode=rrer20#.VOssAEKi20Y [Accessed 21.2.2015]
