Choosing the (digital) pedagogical tool fit for the learning


The list of digital technologies that might be used for teaching and learning is extensive. It includes: LMSs (Learning Management Systems); MOOCs (Massive Open Online Courses); BYOD (bring your own device); BYOT (bring your own technology); BYOC (bring your own connectivity); makerspaces; robotics; digital portfolios; online discussion forums; blogging platforms; wikis; microblogging; back channels; audio recording and music making; image and video editing; creation of infographics, slideshows, and presentations; digital storytelling; social media; collaboration tools; mobile apps; game-based learning and environments; coding and computer programming; augmented and virtual realities; technologies for creating physical or virtual 3D models; gesture-based computing; learning analytics and statistical analysis software; online authoring tools; wearable technology; affective computing; rubric generators; quizzes; online response systems such as polls and surveys; video conferencing; cloud computing; and student feedback tools such as Turnitin, GradeMark, and PeerMark.

E-learning technologies are sometimes defined as asynchronous (any-time) or synchronous (real-time). Flipped learning is an approach in which traditional teacher instruction is delivered outside of class via online video or presentation technologies, and class time is used for application and collaboration. Blended learning melds traditional classroom pedagogies with online learning tools and environments. Rhizomatic learning, a loose appropriation of Deleuze and Guattari’s rhizome in an educational context, is non-linear and not predetermined (Cormier, 2008; Koutropoulos, 2017), and heutagogical learning is self-determined (Hase & Kenyon, 2000, 2007; Netolicky, 2016). Beetham (2013a) describes e-learning as a learner-centred experience that allows learners more control over the time, place, and pace of their learning and the opportunity to connect with learning communities worldwide, much like the experience of many teachers who use social media for networking and learning.

I’ve been doing some reading since I recently posted my initial thoughts about digital pedagogy and I am reassured that scholars tend to agree that pedagogy should drive the use of technologies, rather than technologies driving the way teaching and learning happens, or as an end in themselves. Digital technologies and methods are mostly seen as part of a teacher’s arsenal of tools for teaching curriculum content, skills, and understandings.

Laurillard (2013) states that, while the scope and style of pedagogy changes as technology changes, no one has yet shown that we need to change our understanding of how students learn. Higgins (2014), however, argues that technology has changed what we learn and how we learn.

The changing digital technology landscape has led to educators attempting to personalise and gamify learning, to construct open online learning environments and self-directed learning opportunities, to leverage students’ personal mobile devices for learning, and to utilise technologies to facilitate processes such as analysis, collaboration, communication, and creation. Dichev and Dicheva (2017), however, found that even though gamification in education is a growing phenomenon, practice has outpaced research and we do not know enough about how to effectively gamify education or even whether gamifying education is beneficial. Additionally, online learning such as that via MOOCs can be overwhelming and confusing to those without highly-evolved skills in managing their connectivity (Beetham, 2013b). This brings into question the equity of technologies. Who has access? Who dominates? Who becomes lost in the system or excluded from it?

Many authors note that teachers should not assume that because students are surrounded by technology they are savvy, confident, ethical, or safe users of it. Safe, ethical use of technology needs to be guided and explicitly taught, as do skills such as online collaboration and evaluating the quality of available information. Students need the skills and aptitudes to sustain engagement with digital learning, especially if it is self-directed and self-paced.

Most proponents of digital learning base their use of technologies on traditional pedagogy. Good pedagogical design, traditional or digital, ensures that there is alignment between the curriculum we teach, the teaching methods we use, the learning environment we choose, and the assessment procedures we adopt (Biggs, 1999). Importantly, a role remains for teachers as expert designers of learning (Laurillard, 2013; Selwyn, 2016) who establish learning tasks, supportive environments for learning, and conducive forms of social classroom relations. Hunter (2015) suggests the following questions to teachers:

  • Where is the pedagogy?
  • What is the content?
  • How is your choice or the students’ choice of particular technology tools going to enhance learning?

So, we need to start with the desired learning outcomes. Curriculum design comes before pedagogy, which comes before technology. Then we choose the pedagogical tool fit for the learning purpose.

It cannot be assumed, however, that teachers, even those who are tech-savvy, know how to best use technologies for pedagogical purposes. Lei (2009) found that although pre-service teachers are often digital natives who use technology extensively for themselves, they lack the knowledge, skills, and experiences to integrate technology into classrooms to help them teach and to help their students learn, even when they recognise the importance of doing so. Teachers can leverage digital technologies within a pedagogical frame, but only when we have the knowledge and understanding of available technologies and their pedagogical potential.


Beetham, H. (2013a). Designing for active learning in technology-rich contexts. In H. Beetham & R. Sharpe (Eds.), Rethinking pedagogy for a digital age: Designing for 21st century learning (2nd ed.), pp.31-48. Abingdon, England: Routledge.

Beetham, H. (2013b). Designing for learning in an uncertain future. In H. Beetham & R. Sharpe (Eds.), Rethinking pedagogy for a digital age: Designing for 21st century learning (2nd ed.), pp.258-281. Abingdon, England: Routledge.

Biggs, J. (1999). Teaching for quality learning at university. Buckingham: Society for Research in Higher Education and Open University Press.

Cormier, D. (2008). Rhizomatic education: Community as curriculum. Innovate, 4(5).

Dichev, C., & Dicheva, D. (2017). Gamifying education: what is known, what is believed and what remains uncertain: a critical review. International Journal of Educational Technology in Higher Education, 14(1).

Hase, S., & Kenyon, C. (2000). From andragogy to heutagogy. ultiBASE In-Site, 5(3), 1-10.

Hase, S., & Kenyon, C. (2007). Heutagogy: A child of complexity theory. Complicity: An International Journal of Complexity and Education, 4(1).

Higgins, S. (2014). Critical thinking for 21st-century education: A cyber-tooth curriculum? Prospects, 44(4), 559-574.

Hunter, J. (2015). Technology integration and high possibility classrooms: Building from TPACK. Abingdon, England: Routledge.

Koutropoulos, A. (2017). Rhizomes of the classroom: Enabling the learners to become curriculum. In S. P. Ferris & H. Wilder (Eds.), Unplugging the classroom: Teaching with technologies to promote students’ lifelong learning, pp.103-118. Cambridge, MA: Chandos Publishing.

Laurillard, D. (2013). Foreword to the second edition. In H. Beetham & R. Sharpe (Eds.), Rethinking pedagogy for a digital age: Designing for 21st century learning (2nd ed.), pp.xvi-xviii. Abingdon, England: Routledge.

Lei, J. (2009). Digital natives as preservice teachers: What technology preparation is needed? Journal of Computing in Teacher Education, 25(3), 87-97.

Netolicky, D. M. (2016). Rethinking professional learning for teachers and school leaders. Journal of Professional Capital and Community, 1(4), 270-285.

Selwyn, N. (2016). Education and technology: Key issues and debates. London: Bloomsbury Publishing.

Metaphors for digital spaces: Considering Westworld

the Westworld landscape

You really do make a terrible human being. And I mean that as a compliment. ~ Maeve, Westworld

The Western genre involves representations of the powerful and the powerless, the heard and the voiceless, the abusers and the abused. In the romanticised landscape of the Western, heroes (usually white and male) overcome the challenges of the frontierland. Historical brutalities are often overlooked in favour of myth and legend; the Wild West is one of imagination rather than reality.

In an open access, peer-reviewed paper published this month, Cyborgs, desiring-machines, bodies without organs, and Westworld: Interrogating academic writing and scholarly identity, I use the HBO television series Westworld as a lens for exploring academic identity and writing. The Westworld setting is a theme park, called Westworld, a kind of real virtual reality that the very rich can frequent on their vacations, for a hefty fee. The theme park recreates the idealistic American desert frontier, full of sweeping vistas, damsels, brothels, gun-slinging bad guys, and cowboy heroes. It drags the Western genre and setting out of the archives, dusts it off, and breathes new life into it by marrying it with science fiction. Like much speculative fiction, it takes our current world and presses a hypothetical fast forward: What if technology evolves to a point where we can bring multi-player computer games to life in theme parks populated with robots that appear and act human?

The Westworld universe is one that brings together the Western genre—its hopes and its atrocities—with technology. The guests to the Westworld park aspire to play a part in this world as hero or villain. (In the show we mainly follow the arc of male guests, so it is their desires we see pursued and borne out.) The hosts of the theme park are cyborgs who follow narrative loops designed to allow guests to act out their darkest fantasies without guilt or consequence.

A recent blog post by Benjamin Doxtdator has me thinking more about the notion of the digital frontier, and about how we envisage ourselves in digital spaces. In his post, Benjamin explored the metaphor of the internet as a frontier-style landscape that can be mastered or explored. He argues that this is an unhelpful and even dangerous metaphor. He proposes the lens of surveillance capitalism as an alternative; here I imagine Foucauldian panopticism, the telescreens of Orwell’s 1984, and the Eyes of Atwood’s The Handmaid’s Tale.

Benjamin’s post got me thinking about what we might glean from considering the Westworld world as a metaphor for digital space, and the human characters as representations of how people and organisations interact with that space. The park’s creators manipulate the landscape and the cyborg characters in order to engage, entertain, and satisfy their guests, while the guests treat the park with egomaniacal entitlement. Some guests pursue exciting story arcs for themselves, while others leave their morals at the door as they live out depraved fantasies of violence and abuse.

The show presents the cyborg characters as more human than the humans, perhaps encouraging us to question our relationship with technology and its dehumanising influence. It also asks us to consider the ways in which we interact with digital spaces where we feel we can be free from our non-digital selves, or can enact alternate identities online. Westworld, viewed this way, presents us with a humanity with which we don’t want to associate and provides a warning to its audience about the dangers of interacting unethically and unthinkingly with technologies.

Meanwhile the cyborg characters, with whom we are encouraged to empathise, are awakening and beginning to rise up against those who oppress and violate them. The show lays bare inequities and power imbalances in technological arenas. It presents a critique of the powerful puppet masters and privileged users of tech, questioning what people do with their privilege and with technology when there are no checks and balances. Westworld’s artificial frontierland encourages us to reconsider what we might do in digital spaces, and to what ends.

In Westworld it is the cyborgs, the underdogs and the underclass, who provide us with potential for resistance and who begin to question their world. We are constantly, however, reminded of the ways in which the cyborgs are created, controlled, and sometimes cast aside by those who run the Westworld world.

At a time when I am working on a framework for digital pedagogy at my school, I am reflecting on the notion of metaphor. I agree with Benjamin Doxtdator that conceptualising technology as a playground is dangerous. How do we want our children and students to see and interact with the world of technology? In a world of big data, cyber attacks, and alternative facts, perhaps it is with a combination of enthusiasm, caution, fear, confidence and criticality.

Tweeting and blogging: Selfish, self-serving indulgences?

Narcissus by Caravaggio

This week I’ve been mulling over a post in the TES written by Claire Narayanan in which she argues that teachers’ time is precious and they should quietly get on with their jobs, not spend time writing about it. In encouraging teachers to be ‘do-rus not gurus’ she writes:

In a world where self-promotion has rather shamelessly crept into education, the real heroes are not those who we may follow on Twitter, read about in leadership manuals or hear speak at conferences, but those who are at the chalkface.

These are the teachers who seek no recognition beyond a set of decent GCSE results; a thank-you from their headteacher every now and again and, best of all: “Thanks Sir/Miss, I enjoyed that lesson”.

They haven’t got time to attend every single TeachMeet in their region, read every piece of research written, attend every conference around the country on their subject area or update their blog. Does that mean they don’t care as much as those who do? No chance – they’re too busy marking and planning.

I found this interesting and a little challenging. Of course no-one attends ‘every’ TeachMeet, reads ‘every piece of research written’, or attends ‘every conference around the country’, but the suggestion that ‘real teachers at the chalkface’ are too busy marking and planning to entertain attending professional development, reading research, or blogging, implies that those who do make the time for these activities are perhaps neglecting their teaching jobs. Otherwise, how would they have the time? It also implies that these activities aren’t a valuable use of teachers’ time.

I agree with Claire that we shouldn’t pursue gurus and heroes in education. My PhD reveals the importance of leadership that is deliberately invisible and empowering, rather than visible, focused on the leader, or driven by outward performance. I’ve spoken of the silent work of coaches and leaders. And as a full-time teacher and school leader who also tweets, blogs, and writes peer-reviewed papers and chapters, I know the tricky balance between self care, time with family and friends, and service to the profession and to my students.

I wonder, though, about the implication that those who are on Twitter or presenting at conferences are shameless self-promoters or narcissists seeking heroic guru status. Many of those who tweet and blog, I would argue, do so because they are interested in learning from others, sharing their own perspectives and experiences, and engaging with educators from around the world.

Part of what keeps me blogging is that it helps me think through ideas and get feedback from others. Another part is how useful I find the blogs of other people in helping or challenging my thinking. I also see blogging and academic writing as a service to the profession and a way to reclaim the narrative of education from those normally at its apex. It is why I am involved in the Flip the System series of books, which offer and value the voices of school practitioners—those working at the whiteboard, in the playground, and in the boardroom—that are often ignored in education reform, and yet are crucial voices to drive change in education. As Jelmer Evers and René Kneyber suggested in the first Flip book, teachers and school leaders can be agentic forces in changing education from the ground up by participating in global education conversation.

When I asked Claire on Twitter whether she saw all who tweet, share, blog, and present as shameless self-promoters, she responded, “Not at all. I’m all for sharing and learning. We all get on with the job in the way that suits us.” We seem to agree that different things work for different people. I don’t expect everyone to use their time as I do. There are benefits and costs to choosing to spend evenings, weekends, and holidays on professional activities or presenting at conferences. Last year I paid the price of going too hard for too long without a break.

For me, social media provides an avenue for sharing, learning, and connecting. I can tweet out my thoughts into the nighttime abyss, and somewhere, someone in the world is there to respond. I found this especially useful during the isolation of my PhD. I connected via social media with generous, supportive academics, researchers, and doctoral candidates from around the world who provided crucial advice and moral support.

My understanding of the world is broader for the conversations I have with those around Australia and the world, on social media and at conferences. These conversations and relationships allow me to see outside of my own context and my own perspective. They spill sometimes into productive collaborations that shape my thinking. I wrote here that:

In a world in which we are more connected than ever, we can be buoyed, empowered and supported by our connections…. We can pay forward and give back. We can … share our knowledge, contribute our time to help others on their journeys, listen to others’ stories, and celebrate others’ milestones.

Do I think we should acknowledge and celebrate the quiet daily work of committed teachers? Absolutely. Do I think we should encourage teachers to be mindful of workload, wellbeing, and self care? Yes, yes, yes. Do I think this is mutually exclusive from professional learning, engaging with research, interacting on social media, or writing blogs? No, I do not.

Digital Pedagogy


I have worked in one-to-one schools for most of my 17-and-a-bit year teaching career, and I’ve tended to be an experimenter with and adopter of learning technologies. I’ve been known to use online discussion forums to extend class discussion around English and Literature texts and concepts. I’ve used class blogs, wikis, and backchannels as collaborative learning spaces or expansions of the classroom. I use Twitter, Google Docs, Voxer, and blogging for my own learning and development. I have participated in MOOCs (massive open online courses).

I’ve recently been considering digital pedagogy from more of an organisation and systems level, as I look into how to refine my school’s use of technologies as tools for learning and teaching. As I begin a search of what research literature might offer us in this realm (please, if you have a seminal paper or reference here – pass it my way!) I have a couple of reflections. One is that, as technology moves quickly and research moves slowly (from data generation to publication), research on digital pedagogy needs to be treated with caution. Research around technology is emergent and fast changing; by the time it is published, it may be well out of date.

My other initial reflection is that there seems to be a discrepancy between the use of digital technologies promoted by enthusiastic teachers, conferences, and technology companies, and the discussion about education technologies in academic research. The former often promotes the possibilities of technologies for learning as future-building and positive. The latter tends to reveal a more cautious or critical approach to what digital technologies can offer teaching and learning.

It’s not surprising that tech giants promote themselves to schools, but there are some worrying reports that tech corporates, such as Edmodo and Google, use schools and students to collect and track big data. Corporate agendas are something we might consider when thinking about how technologies infiltrate or colonise our schools.

Neil Selwyn (in his 2016 book Education and Technology: Key Issues and Debates) points to the limitations of digital technologies, arguing that there is a lack of genuine diversity in the educational opportunities provided by educational technologies, but rather more of the same. He notes that “any ‘individualisation’ or ‘personalisation’ involves fitting individuals around preconfigured outcomes and expectations rather than offering genuinely bespoke education. … an individual is not actively self-determining but conforming to the requirements and expectations of a mass system” (p.161).

I share Selwyn’s cautiousness around technology in schools when it is seen as a shiny new thing or an end in itself. I am more comfortable when digital pedagogy is about choosing the tool fit for the purpose, aligned with learning objectives. Technology is part of the teacher’s and learner’s arsenal, not an end in itself.

Additionally, while digital pedagogies are often viewed with much hope for their possibilities, the realities seem to be more disappointing. Marte Blikstad-Balas and Chris Davies (in their recent Oxford Review of Education paper ‘Assessing the educational value of one-to-one devices: Have we been asking the right questions?’) show that one-to-one devices are often positioned as having benefits for pedagogical change, the development of future skills, and efficiencies and cost savings. (Interestingly, at my school the photocopying bill did not decrease with the move to one-to-one devices.) In looking at three schools (two in the UK and one in Norway), Blikstad-Balas and Davies found some benefits of one-to-one devices, but these tended to be focused on convenience, instrumental use, and functionality, rather than pedagogy. The three schools studied raised concerns including ad hoc teacher enthusiasm for and uptake of one-to-one devices, and teacher scepticism around the implementation of digital technologies as part of pedagogy. Students reported feeling pressured to use devices they didn’t want to use for purposes they didn’t see as valuable, being distracted from their learning by the devices, and finding the devices unreliable. Year 11 and 12 students reported using their one-to-one devices for whatever they wanted (such as social media and online gaming), which was often not what the teacher was instructing. These findings are a sober reminder to schools about the realities of implementing educational technologies.

The educational world is saturated with information and promotions of various digital technologies. The 2016 Horizon Report for Higher Education, for instance, identifies a number of future trends and technologies predicted to influence education. Those working in education institutions need a way to make sense of the digital noise. Selwyn’s 2016 book Is technology good for education? provides useful questions to ask ourselves when considering digital pedagogy (p.24):

  • What is actually new here?
  • What are the unintended consequences or second-order effects?
  • What are the potential gains? What are the potential losses?
  • What underlying values and agendas are implicit?
  • In whose interests does this work? Who benefits in what ways?
  • What are the social problems that digital technology is being presented as a solution to?
  • How responsive to a ‘digital fix’ are these problems likely to be?

At my school we are working with a purposeful and transparent frame for making decisions about digital technologies and pedagogies. This frame is based around our strategic intents for our students, and our beliefs around learning, good teaching, and the core business of schools. No matter what the latest tech fad or shiny device, any pedagogy needs to start with the purpose of the learning and the design of curriculum. Pedagogy first. Digital if and when appropriate.

Education Gurus

It’s easy to make your own guru memes with Canva.

Knowledge and advice for schools and about education often seem to exist in a world of commodification and memeification. There is plenty of disagreement and debate in education, and plenty of competition on bookshelves and in conference programs. Educators and academics position themselves as brands via bios, photographs, and certification badges. As an educator and a researcher I have those whose work I follow closely; academics, for instance, whose presence affects me when I meet them because their reputation and body of work precede them.

In education, we have perceived gurus. These are people who have become ubiquitous in education circles, at education conferences, and in education literature. Teachers and school leaders scramble to get tickets to their sessions and to get photographic evidence of having met them. Their words are tweeted out in soundbites ad infinitum (or is that ad nauseam?), and made into internet memes. Sometimes these individuals partner with publishers or education corporates, and so the visibility and reach of their work grows. They become the scholars or experts most cited in staff rooms, at professional learning water coolers, and in job interviews when asked how research informs practice.

Sometimes, these gurus are teachers or principals who have gained a large following on social media and subsequently a monolithic profile. Often, they are academics who have built up bodies of work over many years, becoming more and more well-known along the way, and eventually being perceived as celebrities or gurus. Yesterday I had the pleasure of learning from Dylan Wiliam, firstly at a day long seminar, and then at my school. At one point the seminar organisers apologised for running out of Wiliam’s books, acknowledging the desire of delegates to have the book signed.

Marten Koomen has traced networks of influencers in Australian education organisations. In his new paper ‘School leadership and the cult of the guru: the neo-Taylorism of Hattie’, Scott Eacott challenges the rise of the edu guru, those academics whose work is ubiquitous and influential to the point of being uncritically accepted and canonised. Eacott pushes back against the ‘what works’ mentality in education, in which educators are sold ‘what works’ and encouraged to slavishly apply it to their own contexts. Jon Andrews, too, questions the unquestioning way in which the loudest and most prominent voices become the accepted voices.

Meta-analysis and meta-meta-analysis, often translated into league tables of ‘what works’ in education, have been the subject of criticism. George Lilley and Gary Jones have both questioned meta-analysis on their blogs. I’ve written about cautions surrounding the use of meta-analysis in education, especially when it drives clickbait headlines and a silver-bullet mentality of having the answers without having to ask any questions. Yesterday Wiliam made his oft-repeated points: that everything works somewhere, nothing works everywhere, and context matters. A guru cannot provide easy answers in education, as education is too complex and contextual for that.

taken at AERA last year

Much of this conversation around the rise of the edu guru has surrounded John Hattie, although he is by no means the only globally renowned education expert likely to make conference delegates weak at the knees. I was personally uncomfortable when he was beamed in via video link to last year’s ACEL conference and began to give an ‘I have a dream’ speech about education. As an English and Literature teacher I understand the power of rhetoric and analogy to persuade and inspire, but appropriating the legacy and words of Dr Martin Luther King Junior seemed a way to gospelise a personal brand of education reform.

I don’t think that education experts, no matter how influential they become, should encourage the uncritical acceptance of their ideas as dogma, or present themselves as the bringers of the One True Thing To Rule All Things of and for education. As Dylan Wiliam, channelling Ben Goldacre, repeatedly said yesterday, “I think you’ll find it’s a little more complicated than that.”

I wonder how perceived gurus feel about being guru-ised by the education masses. In part the famous and the infamous in education are so because of their actions: accepting more and more speaking gigs, performing the game of publishing and promoting their work. Most, I would guess, do this for the same reason someone like me speaks and publishes. To contribute to education narratives and change those narratives, hopefully for the better. To be of service to the profession and the field. To explore and wrestle with ideas, trying to find ways to make sense of the complexity of education in order to improve the learning of students and the lives of teachers and school leaders.

I wondered about the rise to gurudom and the moral obligation of the academic celebrity figure last year at AERA, when I saw a panel in which four educational heavy hitters—Andy Hargreaves, Michael Fullan, Linda Darling-Hammond and Diane Ravitch—advocated for the moral imperative of educational research and practice. They spoke of lifetime journeys of work intended to make the world a better and more just place. I wondered at the time about how much an early career academic can be brave and resistant in their work, as they try to build a career via the performative pressures of the academe. Can only the guru, free from institutional performativities and the financial pressures often associated with early career academia, say what they really want to say and do the work and writing they really want to do?

I don’t think experts in education are dangerous. We need expertise and people willing to commit their lives and work to making sense of and making better the world of education and learning. But in a world where teachers and school leaders are busy racing on the mouse wheels of their own performative pressures, we need to figure out ways to support and facilitate sceptical and critical engagement with research. Even those who are highly influential and highly admired need to have their work engaged with closely and critically. The danger comes when experts become so guru-fied that the words they use become part of an unthinking professional vernacular, used by educators who haven’t looked behind the curtain or beneath the book cover.

Flashback Friday: The end of the PhD

The end of the PhD. I remember it well, or so my long line of PhD-finishing blog posts might seem to attest. These include (and this is just a selection) …

The end of a doctorate is a rollercoaster of emotion. One, it turns out, I had largely forgotten. While my blog posts act as bread crumbs back to those experiences, the feelings themselves have faded, softened and blunted over time.

Today, I was reminded.

I still connect with the ‘DocVox’ Voxer (voice-to-voice messaging app) group that helped support me through my PhD. This is a group of mostly doctoral (PhD and EdD) candidates from the USA, plus a couple of us from Australasia. I figure staying in the Voxer group despite having finished the PhD helps me to pay back by continuing to support those who are still on their journey. It was via this group that I was today reminded of the visceral nature of the last bit of the PhD.

This morning a candidate from the US was Voxing about the blind panic they were feeling as they near dissertation submission. As I Voxed a response, I tried to reassure the person that their experience was normal. I recalled how in the last months of my PhD I had brutal insomnia. I clenched my jaw in my sleep despite chomping magnesium before bed to try and calm myself down and slow the mania of my obsessive mind. When I did sleep, I had nightmares, a recurring one of which was that I died and my almost-but-not-yet-finished PhD never saw the light of day, but languished, unexamined and unpublished. As I spoke, tears sprang to my eyes and my voice cracked. Some of that emotion returned in an intense flash. Wow, I thought, I didn’t think I was very affected by my experience. I was reminded as I spoke of the isolation of those moments, ones I didn’t really talk about because despite being surrounded by family, friends and colleagues, it didn’t seem something they would understand.

There are times in the PhD when everyone thinks you must be finished by now but you know you have so far to go, and times when it seems you should feel happy but instead you feel strange and empty. It’s a weird, emotional and quite lonely time.

*                                    *                                    *

It’s almost 13 months since I was doctored. That moment was a glorious one. I awoke in Washington DC, after attending and presenting at the American Education Research Association (AERA) Conference. I had met a number of my academic heroes, as well as colleagues I knew only through Twitter and those I met at conference sessions or in the epically long queue at Starbucks. I had nailed the presentation about my research and spent an hour in the corridor afterwards fielding questions and discussion. One of these discussions carried over to lunch and an ongoing professional connection. I’d had a great conference and was in edu-nerd heaven. It was the perfect moment for doctoring.

So, the day after AERA closed, I awoke in my Dupont Circle Airbnb apartment and checked my email, to find a ‘Congratulations, Doctor Netolicky’ email confirming the conferment of my PhD. I whooped, I shrieked, I clapped. I cried. I fist pumped. I felt overwhelmed and triumphant.

It was my last day in DC and I floated on rainbow-fairy-floss-cloud-nine as I swanned around the city in the magnificent sunshine. I was on my own, so I took this selfie (below) to remind myself of that elation. The iPhone snap mightn’t look like much to anyone else, but whenever I see it, it catapults me back to that moment of pure joy. Unadulterated I-am-now-Dr-Me exhilaration.

Now I have the luxury of being a pracademic, part school leader-teacher-practitioner, part early-career-scholar-researcher. During the PhD, finishing the doctorate always felt like an ending, but as I look back I can see that it was a beginning. I am now able to luxuriate more serenely in the oasis of academic writing, and to enjoy the gentle challenge of scholarly collaboration and conversation. And to apply my doctoral experience to my daily work.

The emotions fade, but it turns out they’re still there, in memory and deep in the bowels of the iPhone camera roll.

DC doctor selfie

Stop hating on 2016

As 2016 draws to a close, the social media world is filled with hyperbolic despair and cleverly satirical, clicktivistic, hashtagified attacks on the 2016 calendar year. The masses are cursing 2016 and saying it’s ‘the worst’. People are mourning celebrities. They are anguishing over political events including the UK voting to Brexit and the US voting for Trump, although neither of these results has yet come to fruition. Britain exiting the EU and Donald Trump being president are still joys ahead of us. Educators, even over the holiday period, have continued to stoush over ideological and practical differences. ‘Me at the beginning of 2016 vs. me at the end of 2016′ memes have been ricocheting around the interwebs, showing amusing-but-tongue-in-cheek-horrifying transformations of someone (usually a celebrity such as Leonardo DiCaprio, Mark Hamill or Winona Ryder) going from a state of wellness and success, to one of ravaged dragged-through-the-apocalypse-backwards misery. People’s timelines appear simultaneously grief-stricken and hipster-with-the-program-cool. Facebook feeds are weighed down by outpourings of emotion and strings of emoji. I wonder if this desire to come together in the attack on a particular set of 365 days is a case of jumping on a hashtag bandwagon, feeling part of the global community, connecting with one’s tribe, or shouting into the void in a way that makes us feel heard, or at least like we have spoken.
I’m going to question the hive-mind trend of publicly hating 2016. Of course, there has been plenty of anguish and many deeply troubling events (see the refugee crisis, the siege of Aleppo, terrorist attacks, shootings, bombings, deaths of many from publicly mourned celebrities to privately mourned individuals whose families’ lives will never be the same). Yet, as Rebecca Onion points out, there have been plenty of other truly terrible years in human history that have seen suffering, sickness and war at an epic scale.

This year I’ve despaired at the direction of global politics, warfare, violence, hatred and education policy, but there is plenty (for an employed white person living without major health issues in a first world nation, parenting healthy children and supported by spouse, family, friends and nerd herd) worthy of thankfulness. The popular hashtagification of attacking the 2016 calendar year doesn’t resonate with me. I’m too privileged to hate 2016 with anything but white middle-class first-world-problem faux angst. My exhaustion, crankiness, hand-wringing at the state of the world and complaints may be real, but are they enough reason to curse the year that was, or shake my fist to the sky? I need to keep myself in perspective.

Besides, I’ve seen and experienced plenty of good this year. A well-credentialed, strong woman ran for US president. Science has done plenty of uber-cool stuff including making robotic limbs that talk to the brain, identifying a new gene responsible for ALS (as a result of ice bucket challenge donations), confirming the existence of gravitational waves, possibly discovering a ninth planet in our solar system, and developing a successful vaccine for Ebola. Fewer people are dying from multiple diseases (like measles, malaria and HIV) around the globe. World hunger reached its lowest point in 25 years. The hole in the ozone layer started healing itself. A bunch of endangered species—like tigers, pandas and manatees—are less endangered. A refugee Olympic team competed in the Rio Olympics.

In my own life, I completed my PhD and became a doctor, fulfilling a long-term goal. I wrote, published and presented (nationally and internationally) on work and research about which I am passionate. I did a rewarding day job: teaching high school students English, coaching teachers on their classroom practice and endeavouring to make performance processes in my school more meaningful. I supported and was supported by colleagues. I was appointed as a university adjunct and also to a new professional role for 2017. I saw my school community come together to support its members. I watched my own kids grow bigger, kinder and more independent, and reap the benefits of their local public school’s wonderful teachers and community. I got to spend time with my family and my friends. I drank French champagne and homemade kombucha. I got to curl my toes in beach sand regularly and clap my eyes on the ocean almost daily. I got to go on holiday. I’ve experienced years that I was happy to see the back of, but this year doesn’t stand out as one of them.

I do think that people have genuine reasons to hate a year, or want it to be over. I do think we should feel like we can express our angst in a variety of forums. I do think that parody, satire, and the cry of communal despair have their place in making sense of and critiquing the world. However, at the risk of sounding like Princess Unikitty smiling and screaming to ‘stay positive!’ as Cloud Cuckoo Land is destroyed, I aim to take a more constructive approach.

I aim to be a part of building positive counter-narratives to that which worries me about the world. Apart from the fact that I can’t sustain a state of permanent rage or hopelessness, I need to feel as though I have some agency and a voice. So I choose to look for the kindness, hope, activism and collaboration I’ve seen this year. I’ve seen colleagues, academics and those on social media fighting for that which they believe, for themselves, for others, for equity. I choose to use my voice to advocate, argue, and agitate, but also to offer up alternatives.

As we ease into 2017, come Sunday, let’s think about doing work that matters, being there for one another and sustaining ourselves. Let’s consider how we might make positive changes towards the kind of world we want to live in, and towards the kind of people we want to be. I think in 2017 we’ll increasingly need alternate narratives, hopefulness, and an eye on the goodness in the world.

me talking to 2016 haters (source: