Apr 19, 2025
So let’s get one thing out of the way: I think “AI literacy” is a dangerous device of neoliberal education and it deserves to be dismissed out of hand.
I don’t like that declaring this will immediately turn off half my audience, but I think it’s only fair to say it up front.
This has a lot to do with my feelings about generative AI technologies, their developers, their blood diamond genesis, and their ugly consequences for those who use them and those who are impacted by their use.
But it has a lot more to do with what literacy is.
Literacy: a potted history
Up to the mid-twentieth century, when people spoke about literacy they were talking about being able to use letters. It was an entirely mechanical concept that had everything to do with making and interpreting the marks of language:
“A person is literate who can with understanding both read and write a short simple statement on his everyday life.” —UNESCO, 1978, p. 18
Then, around the middle of the century, the concept of functional literacy was defined:
“A person is functionally literate who can engage in all those activities in which literacy is required for effective functioning of his group and community and also for enabling him to continue to use reading, writing and calculation for his own and the community’s development.” —UNESCO, 1978, p. 18
These definitions were developed for statistical purposes, in order to determine and track the scale of illiteracy in the world.
But although functional literacy brought with it the socio-cultural concepts of community participation and development, it was still itself defined by the narrow view of literacy as the skills of “reading, writing and calculation”.
In the fifty or so years since, we have begun to think of literacies in the plural, as a wide array of functionings essential to engaging in communication in the present world. Financial literacy, digital literacy and critical literacy are just a few of many examples.
The term multiliteracies was introduced in the 1990s by the New London Group, who argued that contemporary literacy education needed to account for:
“understanding and competent control of representational forms that are becoming increasingly significant in the overall communications environment”
—New London Group, 1996, p. 61
Their point was that these forms were growing increasingly fragmented. In a global and multicultural present, reading and writing English words on a printed page simply wasn’t enough to claim literacy any more.
One of my favourite definitions of plural literacies is from James Paul Gee, who conceptualised literacy as discourse fluency. He wrote that:
“I define literacy as the mastery of or fluent control over a secondary Discourse”
—Gee, 1989, p. 9
Gee defined discourses as “saying (writing)—doing—being—valuing—believing combinations” (p. 6). In his view, a primary discourse is, simply, the one we’re born into: “All humans, barring serious disorder, get one form of discourse free” (Gee, 1987, p. 5). For him, that’s the oral communication between parent and child. It’s the first form of linguistic communication we learn. (For Helen Keller, who couldn’t hear or see, you could say that was hand-speaking.)
Every other discourse is somehow encrypted and must be unlocked to be read. Reading is an act of translation. Writing is the ability to synthesise that translation into new representations of the discourse.
Can you see how, in this view of literacy, "reading and writing" is a metaphor for all the ways in which communication can be culturally encoded?
- Academic literacy is the capacity to interpret and perform scholarly cultural signals to access and transmit academic knowledge.
- Financial literacy is the capacity to comprehend and engage with accounting terms, conventions and practices to manage and acquire money.
- Media literacy is the capacity to decrypt and encode messages through culturally-structured modes like news journalism, television, websites, this blog.
To underline this clearly, literacy is about communication.
So?
I’m giving this lecture because I am so, so, so utterly sick of the calls for embedding “AI literacy” in education and workforce development on the grounds that generative AI capabilities are somehow now essential for participation in the world.
First of all, they aren’t essential.
Second, they aren’t literacy.
Using AI is not about communicating. It’s about avoiding communicating. It’s about not reading, not writing, not drawing, not seeing. It’s about ceding our powers of expression and comprehension to digital apps that will cushion us from fully participating in our own lives.
Generative AI use is degenerative to literacy.
You could argue that what’s needed is “critical AI literacy”. We need to be able to recognise when this is happening, how, and why.
And I support what you’re trying to say, but pull the middle bit, please.
That’s critical literacy. That’s not new. And if it’s just occurred to you, I’m sorry to say you’re about half a century behind.
I’m happy to chat about how using generative AI doesn’t have to mean ceding all our powers of understanding and expression. Look, I’ve had students who have effectively used it as a scaffold to engage with a meaningful secondary discourse (academic literacy). And that’s wonderful. But it’s a scaffold, not the thing itself. My aim is for the student to achieve the actual skill. I’m genuinely ok with training wheels, but I ultimately want to see you ride without them. Because the irony is this: if you can’t ride without them, you can’t actually ride with them either.
Far more frequently, what I see is students attempting to integrate generative AI[1] into their workflows, and achieving regressive results. Demonstrating poorer critical literacy than before. Using poor judgement. In some cases, bordering on academic misconduct. And let me be clear, these uses are in good faith.
The only thing that can remedy this is communication. Actual, non-outsourced communication. Banning AI is a way of communicating that this is bad and we don’t support it (although it’s entirely symbolic and unenforceable). Assigning poor grades is a way of communicating that the integration was unsuccessful (but punishes the student for earnestly trying to participate in a practice increasingly touted as essential).
There are all sorts of ways we can communicate with our students about generative AI.
But trying to sell them "AI literacy" is a way that will actively hurt them.
[1] On "generative AI": I'm going to keep spelling this out. No more abbreviations. We need to get used to recognising that LLMs and image-generation models are a very, very narrow subset of machine learning technologies, with a minuscule set of viable use cases.