On Ed Tech and the Epistemic Autonomy of Teachers
Cognitive barriers to consciously steering the adoption of tech in classrooms and the role of the humanities
Epistemic autonomy
Despite our noble dreams, humans are not omniscient gods. We are cognitively limited, with often feeble powers and without the time and resources to ever achieve full epistemic autonomy. As a result, to expand our knowledge base, we are obliged at times to defer to the testimony of more knowledgeable others. Think medical experts. Accountants. Scientists during a pandemic. Yet this deferral to belief sets that are not our own should never be automatic, otherwise we risk blind faith and suffer the consequences of gullibility. Think scammers and hoaxes.
Instead, to retain epistemic self-governance, we need to ensure that we have rational grounds to accept what others say as true. The philosopher Elizabeth Fricker provides a criterion for such rational epistemic trust: we are required to consciously recognise
a) the expertise and sincerity of the speaker, and
b) the lack of contrary evidence
before accepting what they posit and integrating it into our own belief set.
The epistemic self-governance of teachers
Teaching is an interesting occupation in that we play a dual role as both speaker and recipient of testimonies. Within the classroom, we act as an epistemic authority, whereby students learn from our judgments and explanations. Yet, our knowledge base of learning, pedagogy and human development is finite, and to expand it we are compelled at times, on the right grounds, to defer to experts in specialised disciplines, such as cognitive psychology, speech pathology, trauma-informed pedagogy or ICT. In fact, a highly collegial culture of epistemic trust and rational deference is what enables our profession to grow.
At the same time, the obsession with scaling out educational solutions and marketing educational products has made teacher deference a highly lucrative business.
Therefore, there is a constant need to keep our wits about us in the face of ed gurus, fads and 'solutions': there are stories to be sold and money to be made from soliciting our trust. As I will argue below, in the case of Ed Tech, the aim is often to circumvent this process of rational deference entirely.
Ed Tech and the Mobilisation of Teacher Bias
It is not dystopian fiction to posit universality as the aim of the Ed Tech industry. Doing things at scale is the holy grail of education, and the companies are quite transparent in their aspirations. As Leah Belsky, OpenAI's Vice President of Education, was quoted in a recent article for the New York Times: 'Our vision is that, over time, A.I. would become part of the core infrastructure of higher education', with the company embedding its artificial intelligence tools in 'every facet of campus life, from orientation to graduation'.
Their success is of course dependent on eroding the epistemic autonomy of teachers, which at present is part of the ‘core infrastructure’. But how?
Through the construction of a range of psycho-cultural barriers to critical thinking, outlined below. Many teachers will recognise these as prevalent over the past two decades.
1. Normative Determinism
Schools have always been sites for the compelling power of social imaginaries as a dubious form of normative determinism. This is where descriptions of technological change as inevitable are reframed as prescriptions, insisting that schools (in the name of progress) ought to conform to the demands of new technology, with resistance viewed as both obstructive and irrational. Schools are thus normatively obliged to adapt to learning platforms or AI because 'that is the future', regardless of questions of equity or pedagogical value. This is premised on the false insistence that the ubiquity of a technology somehow renders its adoption in schools obligatory, and that we should just get on with the 'how'.
2. Permissionless Innovation
Normative determinism is further enabled by a culture of 'permissionless innovation', whereby decisions regarding what, when and how to innovate and bring digital products to schools have been left to tech companies to decide. Think back to the mid-noughties, when email became the default form of communication, or to the COVID onslaught, which led to the mass platformisation of pedagogical labour, when the only choice was between Google, Apple or Microsoft platforms.
Carlo Perrotta describes the unthinking uptake and promotion of tech in schools (usually with the blessing of politicians) as a 'plug and play' mentality, whereby 'we grant trust and relinquish understanding, as the process (and often the purpose) of the tools we use remain largely obscured to the user'. Another fitting metaphor is Langdon Winner's 'technological somnambulism', whereby 'we repeatedly enter into a series of social contracts, the terms of which are only revealed after the signing'. In fact, we only seem to demand a rigorous impact analysis after the fact, as has been the devastating case with smartphones.
3. Automated Thinking
Then there is the nature of the technology itself. While digitalisation, the internet and now AI have significant potential for productivity, efficiency and creativity, they also encourage (under particular conditions) the fragmentation of attention, the automatisation of thought and cognitive off-loading. We know this is disastrous for students, particularly those with limited background knowledge, who fall into a cycle of dependency, unable to build the fluency required for true understanding. Yet it also has a detrimental effect on teacher thinking through the prosaic effects of normalising systems, which, by 'virtue of their detachment and abstraction, are constitutively thoughtless'. They actively encourage de-skilling by co-opting the reading and writing that is essential to building agentic people and subjecting it to machinic conditions and a bland kind of levelling out.
Secondly, they focus on the parts of teaching that appear most ripe for automation, efficiency and scaling, such as explicit instruction. An example is a description from Google explaining changes made to their online tutor Gemini to 'align it more closely with the principles of the learning sciences':
‘With AI, we can do this at a speed and scale never before possible, and make the process of learning more active, engaging and effective’.
Such a view encourages teachers to accept the notion that education should resemble a Formula 1 race car: frictionless, seamless and fast. This is despite the actual reality of applying science of learning (SOL) principles, which requires careful responsiveness and attention, and the friction of hard thinking. It also encourages acquiescence by appropriating the shop talk of pedagogy: we'll adopt your language of learning so you'll adopt our product.
Carelessness
The mentalities listed above help to mobilise teacher bias and non-decision-making, playing on our fears of being labelled a Luddite to nudge us towards blanket adoption rather than conscious steering of the process. We are in a tricky situation, faced with what Bernard Stiegler calls the 'deep opacity of contemporary technics', where we remain epistemically locked out, lacking understanding of 'what is being played, and what is being transformed therein', yet having to 'unceasingly make decisions'. It seems easier to go with the cultural flow.
Yet to abdicate the traditional guardianship of adults over academic and transitional spaces for youth is carelessness. Jonathan Haidt laid this out in detail in 'The Anxious Generation' in his arguments concerning social media.
Indeed, our epistemic autonomy and responsibility require vigilance and care for both our students and our technical creations. Like it or not, we are both protectors and guides of what is increasingly an entangled (rather than dichotomous) relationship between humans and technics. In 'Love Your Monsters', Bruno Latour illustrates this dual obligation with the striking analogy of the Monster in Frankenstein, who became a pernicious threat not of his own volition, but largely through abandonment by his creator. Latour suggests that Shelley's insight in her novel was to 'foresee that… our iniquity is not that we have created our technologies, but that we have failed to love and care for them'.
As he quips of our lack of care for our own inventions: 'it's as if we have abandoned the education of our own children'. This quote, of course, works both ways. By attending to and caring for the technology itself, we can ensure the same for our students.
Granting Trust on the Right Grounds: The Role of Humanities
So when should we grant trust?
Fricker concludes that ‘when I know another to be epistemically expert relative to me, it is rationally mandatory for me to accept her judgement in preference to my own, as long as I have grounds to trust her sincerity’.
Yet do we really have grounds to trust the expertise and sincerity of Ed Tech companies whose motivation is universality, productivity and automation? Or a technology that cannot express sincerity?
No, and epistemic self-governance requires us to withhold our trust until non-testimonial, empirical evidence provides us with stronger reasons. At this point, our best bet is to exercise caution and canniness and trust ourselves.
After all, as Lauren Berlant says in arguing for pedagogies of thinking as care, ‘the pressure is on critical makers, and on beings, not to reproduce what doesn’t work by dissolving what blocks flourishing’.
And thanks to the work of 'critical friends'¹ like those listed below, I feel that educators are better prepared for this Ed Tech wave. Indeed, the careful work of the critical literacy community (often trained English scholars or literacy experts, as opposed to tech bros) is helping both to constrain moral panic and refute the 'adopt or die' mentality, and to ensure those of us in schools have access to testimonies we can rationally defer to, or at the very least think alongside. Their ability to clarify the interplay between tech, learning, literacy and relationality is what teachers in schools require, instead of banal efficiency and futurist discourses. Furthermore, the sincerity requirement is easier to discern in critical tech folk who are not bound to corporate interests.
In fact, if there is a silver lining in all of this, it's the resurgence of a vibrant humanities in educational discussions: a presence that has always been there, but is often sidelined as irrelevant to classroom practice.
Ironically, in moments of socio-technical transition, educational technicity alone does not suffice to ensure our epistemic autonomy. The humanities are now integral to our capacity to consciously steer things in a careful manner for the needs of our context and students.
Thanks to the following thinkers for their public scholarly work:
Leon Furze on refusing the ubiquity argument.
Carl Hendrick and Natalie Wexler on the impact of AI on memory and cognition, and how to harness evidence-based practice alongside AI.
On the care case for retaining epistemic autonomy in academia
Neil Selwyn on care, communality and conviviality.
Rachel Horst on delineating AI impact from AI instruction.
Owen Matson on the re-emergence of philosophy.
Carlo Perrotta for providing an accessible copy of 'Plug and Play'.
¹ A term that is slightly less cringey now that the need for them has dramatically increased!