As I mentioned previously, one of the outcomes of my last supervisory meeting was that I need to produce an overview or summary of how my analysis will be conducted; an extension, if you will, to the doc in which I summarised my data.
Obviously this was something I'd been giving thought to for some while now, although early ideas were very much part of an evolving mix. By now, however, things have firmed up much more and I was in a position to construct the summary my supervisors had asked for. My analysis would be interpretive and thematic, following Saldaña's twin-cycle coding strategy. During the first cycle, a combination of descriptive, in vivo and process coding methods, possibly accompanied by emotion coding, would be deployed. The second cycle would involve identifying emergent themes through pattern coding, thereby surfacing the more significant features within the data. To help manage the data, support the coding and assist my analysis, NVivo would be used – I've been on a training course for this, and got some practice during my pilot study and on the Research Masters modules I've completed. Now, if I were going to be taking this forward, I'd want to put a little more flesh on those bones and justify those choices, but here's the thing: it all went up in the air over the last couple of weeks.
Some while ago, during another supervision meeting, one of my supervisors raised a metaphorical eyebrow when I mentioned my early thinking on the analysis strategy I was likely to pursue. It was a warning signal I should have paid more heed to: a prompt to reflect on my epistemological approach in the context of the rest of my study. As I've been reading more deeply into the literature contributing to sociomaterial approaches, it's become increasingly apparent that an interpretive epistemology would sit uncomfortably within that. Were I to take that route, then I, the researcher, would be attempting to interpret and explain the processes I have observed; knowledge becomes a social construction in which I have attempted to make sense of the world. Whilst it is not impossible for ANT to act as a lens to allow me to do that, Cordella and Shaikh (2006) remind me that ANT and a sociomaterial sensibility propose a distinct and different ontology. Reality is produced or enacted by things relating to or associating with one another. It is not anterior to the activity in which it becomes manifest, but is produced by it, and I, as researcher, am part of that assemblage. Entities, and reality itself, are enacted into being through their interactions and material-discursive practices, and therefore my focus should be on what they do, not what they represent or mean (Fenwick and Edwards, 2013). Suitably chastened, I began considering how I might actually conduct an analysis which remained true to that endeavour.
As I cast around for a more appropriate route forward, I soon came across “Qualitative Data Analysis After Coding” (St. Pierre and Jackson, 2014), an introduction to a journal ‘special issue’ on the eponymous theme. The authors argue “…that coding data in that way [the way I was thinking of using!] is thinkable and doable only in a Cartesian ontological realism that assumes data exist out there somewhere in the real world to be found, collected, and coded.” They posit that breaking data apart into chunks, decontextualising them using codes, then sorting and organising them into categories and identifying emerging themes smacks of (realist?) scientific discovery. In the ‘post-’ ontologies (postpositivist, posthuman, postmodern, poststructural), where realities are ‘becoming,’ a coding approach makes little sense. Here then, a possible route forward opened up: the post-qualitative, drawing from new materialist and feminist literature and the work of Karen Barad, Donna Haraway and others.
During the couple of weeks since I began reading in this field, I can only claim to have scratched the surface, but I now have a much better sense of how I might make my analysis more coherent with my chosen ontology. Until now, I've resisted the pull towards Deleuze and Guattari, though having chatted with Chris about the way he draws [literally!] on their work, and with Martina about the way her work is underpinned by them, I can at least claim peripheral familiarity. That might matter, since most of the authors working in this field seem to turn to D&G for their inspiration. Although I can't hope to fully embrace all that Barad and Haraway have to say (they're not easy reads!), I am able to lean on the work of others who have already done some of the heavy lifting and applied the principles in their own research. For what follows I am grateful to Lisa Mazzei, Alecia Youngblood Jackson, Karin Hultman, Hillevi Lenz Taguchi and Claire Colebrook for helping to provide a sense of direction.
Once you turn away from the notion of a single reality ‘out there’ which can be adequately represented through text, what alternative strategies are available? The aforementioned authors build upon a technique proposed by Barad and Haraway called diffractive analysis. Taken from the world of physics, and therefore within my comfort zone, diffraction is the overlapping and interference (superposition) of waves with one another as they pass through apertures and spread around obstacles.
As they become entangled, their regularity is disturbed and displaced as new patterns form and dissolve. Like the different waves produced by diffraction, a diffractive analysis disperses and disrupts thought, enabling fresh insights where we might not otherwise have expected them. Diffractive thinking is offered as an alternative to reflective thinking, in which the researcher tries to make sense of what their respondents say by reflecting their meaning. “Diffractive analysis requires us to engage in an event of reading and becoming with (Haraway, 2008) the data, rather than reading it from a distance and as separate or apart from it” (Lenz Taguchi, 2012). We’re attempting to uncover a reality within the data that has already been enacted, but not previously disclosed. We need to become installed in the data, or, as Jackson and Mazzei (2012), borrowing from Deleuze, call it, ‘plugging in.’ Their technique involves reading texts with and through other texts; reading theory with data with theory.
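Since diffraction comes straight from physics, the metaphor is easy to make concrete. Here's a toy sketch (the function and the wavenumbers are just my illustrative assumptions, not anything from the analysis literature) of the superposition at the heart of it: two overlapping waves alternately reinforce and cancel, producing the shifting interference pattern that diffractive reading borrows.

```python
import math

def superpose(x, k1=1.0, k2=1.2):
    """Combined amplitude at position x of two overlapping waves
    (simple superposition: the individual amplitudes just add)."""
    return math.sin(k1 * x) + math.sin(k2 * x)

# Where the waves are in phase they reinforce one another...
constructive = superpose(math.pi / 2)       # ~1.95, close to the 2.0 maximum
# ...and where they are out of phase they cancel, displacing the pattern.
destructive = superpose(10 * math.pi / 11)  # ~0.0, equal and opposite components
```

The point of the metaphor is in that last line: neither wave is zero there, yet together they produce something neither contains alone, just as reading data through theory is meant to do.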
Thinking with Theory
Jackson and Mazzei see plugging in as a constant process of making and unmaking an assemblage constituted from the multiple texts with which they had worked: interview and other data, ‘tomes of theory,’ previous writings, comments from others and so on. They suggest that “plugging in creates a different relationship among texts: they constitute one another and in doing so create something new.” It is that newness that I am attracted to; after all, the whole enterprise of the PhD rests on making a contribution to new knowledge. Jackson and Mazzei propose that plugging in involves at least three manoeuvres:
- putting philosophical concepts to work via disrupting the theory/practice binary by decentering each and instead showing how they constitute or make one another;
- being deliberate and transparent in what analytical questions are made possible by a specific theoretical concept; and
- working the same “data chunks” repeatedly to “deform [them], to make [them] groan and protest” (Foucault) with an overabundance of meaning.
Reading the data in turn with and through concepts taken from Derrida, Spivak, Foucault, Butler, Barad and Deleuze, they pursued a set of analytical questions. The concepts and questions were not intended as definitive, but as ones which they hoped would extend their thinking beyond an ‘easy sense.’
For a bear of little brain like me, this sounds like a considerable challenge, but then I’m not really in a position to ‘plug in’ the theorists outlined above; I simply don’t have the familiarity that Jackson and Mazzei do. What I do have though is a developing understanding of actor-network theory and sociomaterial concepts. I can see how multiplicity, fluidity, mutability, agency and so on might be assembled to help me plug in. I’ve also recently become interested in the notion of the chronotope and timescapes; might they help to reveal new insights through plugging in? But what to choose (or not) and why?
Data Walking
I became aware of this technique when Martina mentioned she was walking her data. Proposed by Eakle (2007), this involves exploring “as if you were an open and receptive traveler in a new and unknown territory that you want to make familiar before designing an itinerary.” It’s a technique I often use when visiting new places; why not with data too? There also seemed to be a link here with flânerie, introduced to me by Deborah, although certainly not the aimless, idling version!
The idea would be to wander through the data – interview transcripts, observational notes, tweet corpora, blog posts, Voxer transcripts – highlighting and making notes on aspects which draw my interest. In one sense, I've already begun this process, since I began taking notes as the data emerged. The flânerie, or data walking, would provide a sense of which theories or concepts I could then bring with me whilst plugging in for a more detailed and intimate analysis. Although I knew how many interviews I would collect and could apportion my time accordingly, I never knew precisely how much data would arise from my observation, nor how many blog posts I would encounter which might lend themselves to a more detailed analysis. Data walking might help reveal points of interest to which I should attend more closely; moreover, as a preliminary to thinking with theory, it offers a meaningful and workable two-stage approach to analysing my data. Let the games begin.
Cordella, A., & Shaikh, M. (2006). From epistemology to ontology: Challenging the constructed “truth” of ANT (Working Paper Series: 143). London: London School of Economics.
Eakle, A. J. (2007). Literacy spaces of a Christian faith-based school. Reading Research Quarterly, 42(4), 472-510.
Fenwick, T., & Edwards, R. (2013). Performative ontologies: Sociomaterial approaches to researching adult education and lifelong learning. European Journal for Research on the Education and Learning of Adults, 4(1), 49-63.
Jackson, A. Y., & Mazzei, L. A. (2012). Thinking with theory in qualitative research: Viewing data across multiple perspectives. Routledge.
St. Pierre, E. A., & Jackson, A. Y. (2014). Qualitative data analysis after coding. Qualitative Inquiry, 20(6), 715-719.
Lenz Taguchi, H. (2012). A diffractive and Deleuzian approach to analysing interview data. Feminist Theory, 13(3), 265-281.