Chapter 4: Assembling Methods #2

Chapter 4 introductory graphic
Chapter 4 proved to be quite a long chapter, so I split it across two posts; this is the second and discusses ethics, data management, data analysis, and integrity.


Although the formal stages of making ethics applications first for a pilot and then the main study are where my ethical thinking began, it continued through the data collection phase, on through analysis and presentation, and remains as a duty of care through the publishing of the thesis. By maintaining an ongoing ethical sensibility, I became increasingly attuned to ethical issues as they arose. I’ve discussed the thinking underpinning my ethical sensibility over a series of posts, so here I’ll simply summarise what was carried forward into the thesis.

First, I explored how others had dealt with ethics across the thirty published papers I mentioned in Hinterlands. I was surprised to find that very few even mention ethics, let alone discuss it, and in only one did the authors state their ethical stance. Perhaps unsurprisingly for such an emerging field, ethics are addressed in different ways; depending on the methods used, some authors anonymise the tweets they present verbatim, whilst others provide only aggregated data.

I then set out the range of sources I turned to for guidance, including, but not limited to, the American Educational Research Association, the American Sociological Association, BERA and the Social Research Association. In many cases, ethical guidelines from learned societies make no specific reference to Internet research and the unique issues it might raise. Fortunately, some bodies have produced more targeted guidance: the British Sociological Association provides a specific annexe, the ESRC Framework includes a section specifically addressing Internet-mediated research, and the British Psychological Society has a separate publication solely devoted to this topic. The Association of Internet Researchers’ (AoIR) guidance, produced in 2002 and updated in 2012, was particularly helpful.

The factors which raised the most significant ethical questions for me included:

  • Private–public distinction: there are three intertwined contextual factors here: how the space is perceived; the nature of the information being shared; and the intended audience.
  • Informed consent: advice from professional bodies generally begins from the premise that informed consent should always be sought; however, in ethnographic research, that is not always possible. The factors which help in deciding whether informed consent is required include the sensitivity of the topic being researched, and the distance and level of interaction between researcher and participant.
  • Human subjects versus textual artefacts: although some claim that inscriptions on the Internet can be considered published texts, the AoIR (2012) guidance argues that, especially with social media, it becomes difficult to detach the text from the person.
  • Unobtrusiveness: the Internet and social media enable researchers to observe behaviour without influencing it, since their presence may not be apparent in the same way it would be in offline spaces. Research conducted in this way can be considered less impactful on participants and is akin to ‘lurking.’ I discuss the differences between deceptively covert and unobtrusive research, and how to declare one’s status as a researcher, in more detail here.

In the table presented in this post, I reflect on each of these factors in the context of each of my proposed methods.

Conventionally, research participants are afforded both anonymity and confidentiality: the data they provide will only be available to those specified, and all features which might identify them will be removed before the findings are made public. However, my experiences as a participant on Twitter suggested it was necessary to revisit that stance, given the degree to which Twitter, as used by teachers, is a highly participatory space. Furthermore, it is common practice there to thank and cite the work of others where it has been helpful or interesting. This influenced my decision to credit and cite participants’ contributions whenever they granted me permission to do so. Declining to anonymise participants generates certain risks and benefits, which I summarise in this table:

| Risks | Benefits |
| --- | --- |
| Loss of privacy, which could lead to exposure to ridicule and/or embarrassment. | Direct: an increase in participant agency, moving beyond the notion of participants merely as sources from which researchers abstract data. |
| A change in future circumstances which causes what participants originally said to be viewed in a less positive light. | Direct: makes provision for participants to amend or extend what they said in the original interview. |
| Increased attention through increased exposure (this could be perceived as either a risk or a benefit, depending on the participant’s preferred online behaviours). | Indirect: increasing the awareness and understanding of the wider community of issues associated with professional learning and social media. |

Data management

The interview transcripts, field notes, Twitter chats and exchanges, and blog posts and comments together produced a ‘mountain of words’ (Johnson, Dunlap, & Benoit, 2010). To help prepare for this, and as part of our doctoral preparation, I had produced a data management plan. This sets out how data will be collected, stored and, where necessary, shared or preserved; it both contributes to and is influenced by one’s ethical sensibility. The data in the main study were collected principally between October 2016 and June 2017; the nature and sources of data are summarised in this visualisation:

“My Data_Overview” flickr photo by IaninSheffield shared under a Creative Commons (BY-NC-SA) license

As each set of data was produced, it was stored in a secure location on University servers, then also imported into NVivo. Although NVivo was not used for coding and categorising, it nevertheless provided a useful way of keeping the different data sets within a click or two of one another. The capability to move seamlessly across and through the data, connecting various points of interest, proved crucial for exploring as flâneur.

Tweets encountered during participant observation, whether as single tweets, brief exchanges or extended hashtag chats, were collected using MindView, Treeverse and TAGS respectively. Each not only stored data, but was capable of visualising them in particular ways.

Audio interviews were stored as MP3 files and, once transcribed, as Word documents. Where participants granted permission, these were also posted as podcasts on Radio Edutalk, as discussed earlier.

In the same way that tweets have a unique URL, so too do blog posts, so these URLs were also recorded. The posts needed for analysis were converted into PDFs and downloaded, then uploaded into NVivo. It is easier and safer (for privacy and confidentiality) to annotate and comment on PDFs than directly on the Web.
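To illustrate the kind of record-keeping this involved, here is a minimal, hypothetical Python sketch of archiving a post locally whilst preserving its source URL. The function names and the HTML-snapshot format are my own assumptions, standing in for the actual download-to-PDF workflow described above:

```python
# Hypothetical sketch: keep a local, annotatable snapshot of each blog post
# together with a record of the URL it came from. Names are illustrative.
from urllib.parse import urlparse
from pathlib import Path

def snapshot_name(url: str) -> str:
    """Derive a filesystem-safe filename from a blog post URL."""
    parts = urlparse(url)
    slug = parts.path.strip("/").replace("/", "_") or "index"
    return f"{parts.netloc}_{slug}.html"

def archive_post(url: str, html: str, folder: str = "archive") -> Path:
    """Write the fetched HTML to disk, embedding its source URL."""
    Path(folder).mkdir(exist_ok=True)
    target = Path(folder) / snapshot_name(url)
    target.write_text(f"<!-- source: {url} -->\n{html}", encoding="utf-8")
    return target

# Usage (the fetch itself, e.g. via urllib.request, is omitted here):
print(snapshot_name("https://example.blog/2017/03/twitter-cpd/"))
# prints example.blog_2017_03_twitter-cpd.html
```

Embedding the source URL in the snapshot keeps the provenance of each post attached to the copy that is annotated, rather than relying on a separate list of links.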

Data analysis

As I wrote my PhD project proposal, I fully intended to undertake a conventional analysis. The process would be largely inductive and involve close reading of the data with the intention of developing themes and concepts. However, the more I read the literature, the better I appreciated what adopting a sociomaterial sensibility entailed, and the clearer the tension between an ANT-suffused flânography and the proposed analytical approach became. The principle of gathering insights through wandering freely was in danger of being lost.

As presented above, there is an assumption of linearity, of a sequence in which interpretation of the data falls between gathering them and presenting findings. It assumes a primary, discoverable reality waiting to be uncovered, then accurately represented through language. The subject/object and knower/known binaries are brought into question by actor-network theory, so for me to approach analysis by attempting to interpret what an interviewee really meant would be inconsistent, to say the least.

I chose to turn to Jackson and Mazzei’s (2012; 2013) notions of ‘Thinking with Theory’ and ‘Plugging In’, which I described in more detail in this post. They offer these as an antidote to the reductionism that coding and thematic analysis produce, which consequently precludes dense, multilayered treatments of data. Their intention is to ‘decenter some of the traps in humanistic qualitative inquiry: for example: data, voice, narrative, and meaning-making’ and to accept that the data will be partial, incomplete and undergoing a continual process of being re-formed. This desire to move beyond solely human tellings seemed coherent with ANT which, through its symmetrical approach, also seeks to destabilise the anthropocentric view.

I elected to borrow the principle of ‘plugging in’ from Jackson and Mazzei as a way to help understand and analyse the data. The ‘theories’ upon which I drew were the three travelling companions of assemblage, multiplicity and fluidity introduced in Chapter 3. The following visualisation helps to illustrate how these concepts were plugged into the different data streams.

‘Plugging-in’ theory with data

It is like a tartan (plaid) in which the threads are interwoven, but different, because here the warp and the weft intra-act, rather than interweave. The different strands of data are read with and through the three sociomaterial concepts and, at the points of intersection, what emerges is neither one thing nor the other, but something new.

When the data are revisited, a process of ‘data walking’ (Eakle, 2007, p. 483) is employed, in which you explore the data ‘as if you were an open and receptive traveller in a new and unknown territory that you want to make familiar before designing an itinerary.’ Reading and rereading the transcripts, tweets, blog posts, notes and memos allows me as a researcher to begin to use these data to think with the concepts, in addition to using the concepts to think with the data.

Part of the analytical process involved composing visualisations, but not as products from which insights may be gleaned. For Miles and Huberman (1994), for example, visualisation constituted ‘an organized, compressed assembly of information that permits conclusion drawing and action.’ This suggests data reduction or, as they later termed it, data condensation; I tend, on the other hand, to follow Madden (2010), who offers data analysis as a ‘broadening’ process. As the flâneur might approach a city district from different directions, or arrive there via different modes of transport, I elected to use different applications to visualise the data. The different features and restrictions that each offered obliged me to assemble the data, to manipulate them, and to read them differently, and consequently to draw fresh insights, or as Hendrickson (2008) terms it, to engage in ‘the visual processes of coming-to-know.’

Integrity, fidelity and honesty

Research quality is often judged on whether the claims to knowledge are valid, reliable, generalisable, and objective. My research project is neither built on a realist ontology, nor a positivist epistemology, nor does it employ quantitative methods. It will come as no surprise therefore that the aforementioned criteria hardly seem appropriate. Nor indeed do some of the alternatives offered such as confirmability, dependability and transferability, since these too assume a reality ‘out there’ which can somehow be known ‘in here.’ With an actor-network theory sensibility, reality can be out there, independent of the knower, but only if it is made that way.

Another alternative is offered through Tracy’s (2010) ‘Eight “Big-Tent” Criteria.’ She contends that high-quality qualitative research can be judged by the presence of (a) a worthy topic, (b) rich rigor, (c) sincerity, (d) credibility, (e) resonance, (f) a significant contribution, (g) ethics, and (h) meaningful coherence. All the aforementioned are examples of what have been termed ‘criteriological’ approaches, in which:

criteria for judging qualitative research need to be, and can be, predetermined, permanent and applied to any form of inquiry regardless of its intents and purposes

B. Smith & McGannon, 2017

Smith and McGannon see the universal application of criteria as problematic in that it requires any and all research to be judged in ‘preordained and set ways.’ An alternative to criteriology is offered through the ‘relativist approach’ proposed by Sparkes and Smith (2009). For them, criteria are a socially constructed list of characterising traits, applied in a manner that is contextually situated and flexible. These lists are not fixed in advance, but are open-ended, responding to what unfolds by adding or subtracting characteristics as required. Given the ontological views I expressed earlier, a relativist, rather than a criteriological, approach seems more coherent with my study. I also feel that some of the original terminology (trustworthiness, rigour, etc.) carries baggage, so I propose to use terms which are less well-worn and which, for me, better describe the characteristics on which this project should be judged. It is my hope that the reader, for it is they who will be the arbiter of the quality of this research, will feel that it has been conducted with integrity, fidelity and honesty. To that end, I offer the following characteristics, assembled from different studies to suit the circumstances within which this research was conducted: a worthwhile topic for study, rich rigour, sincerity, transparency, ethics, authenticity and coherence.

This study, in seeking to shed more light not only on one aspect of Twitter PD, but also on an important aspect of teachers’ lives (and one which, given the tweeting behaviour of the current President of the USA, is particularly timely and relevant), makes a worthwhile topic for study.

My prior experience, my time spent ‘in the field,’ and the range of appropriate methods (informed by relevant theory) have all contributed to the production of abundant data, from which a thorough analysis was conducted. This was not about verification through different perspectives, but more about adding richness through an attempt to access the different realities enacted by different people and things. As such, I argue this constitutes a richly rigorous study.

In chapter 1 I outlined the hinterland I brought to this study and that consequently I offer this account as partial. It is also partial as a result of the particular method assemblage through which data were enacted, and partial since the analysis highlighted some data whilst backgrounding others. I have nevertheless set out the methods used through which data were gathered, how those data were managed and analysed, and in this research blog reflected openly about those matters. These factors contribute to undertaking a more sincere study and being as transparent as ethics allows.

In an earlier part of this chapter I made plain how developing an ethical sensibility was an ongoing process throughout the planning, preparing, researching, analysing and writing of this study.

The different elements of any story should be woven together in a coherent fashion. The same is true of quality research. I sought to do this both internally, by achieving a natural flow through the research design, execution, analysis and interpretation, and externally by drawing upon other theories and previous research into this area. Maintaining a clear and consistent line of argumentation and proposition can help to contribute to a coherent account.

For Guba and Lincoln (1989), authenticity arises when a study has an effect on those who participated in the research. The responses to questions I posed on blog posts, to questions in interviews, comments made on my research blog, and blog posts some respondents wrote subsequent to their participation, suggest both ontological authenticity (the study provides participants with new insights into their own situations) and educative authenticity (the study helps participants to better understand the position of other interest groups).


Eakle, A. J. (2007). Literacy spaces of a Christian faith-based school. Reading Research Quarterly, 42(4), 472-510. doi:10.1598/RRQ.42.4.3
Guba, E. G., & Lincoln, Y. S. (1989). Fourth generation evaluation. California: Sage.
Hendrickson, C. (2008). Visual field notes: Drawing insights in the Yucatan. Visual Anthropology Review, 24(2), 117-132. doi:10.1111/j.1548-7458.2008.00009.x
Jackson, A. Y., & Mazzei, L. A. (2012). Thinking with theory in qualitative research: Viewing data across multiple perspectives. Oxford, UK: Routledge.
Jackson, A. Y., & Mazzei, L. A. (2013). Plugging one text into another: Thinking with theory in qualitative research. Qualitative Inquiry, 19(4), 261-271. doi:10.1177/1077800412471510
Johnson, B. D., Dunlap, E., & Benoit, E. (2010). Organizing “mountains of words” for data analysis, both qualitative and quantitative. Substance use & Misuse, 45(5), 648-670. doi:10.3109/10826081003594757
Madden, R. (2010). Being ethnographic: A guide to the theory and practice of ethnography. Los Angeles, CA; London: Sage.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Sage.
Smith, B., & McGannon, K. R. (2017). Developing rigor in qualitative research: Problems and opportunities within sport and exercise psychology. International Review of Sport and Exercise Psychology, 1-21. doi:10.1080/1750984X.2017.1317357
Sparkes, A. C., & Smith, B. (2009). Judging the quality of qualitative inquiry: Criteriology and relativism in action. Psychology of Sport & Exercise, 10(5), 491-497. doi:10.1016/j.psychsport.2009.02.006
Tracy, S. J. (2010). Qualitative quality: Eight “Big-tent” criteria for excellent qualitative research. Qualitative Inquiry, 16(10), 837-851. doi:10.1177/1077800410383121
