The inevitable doesn’t have to be


In a fleeting discussion, a fellow animal-lover said it was a shame that domestic animals are routinely sterilized. ‘It’s not natural,’ he said.

My first reaction was, okay, how about you take on the care of the countless haggard stray cats that eke out miserable lives in every Spanish village and town?

My second was that invoking ‘nature’ as an arbiter while we strolled along in petroleum-based garments, with machines in our pockets that didn’t exist 20 years ago, was, well, anachronistic.

‘Nature’ in this exchange served the purpose that ‘technology’ or (more vaguely) ‘progress’ does in other conversations: to give human choices and their consequences a veneer of inevitability and thus irrefutability.


This has been much on my mind with regard to education and technology, specifically generative AI.

Lately, a rash of my students have submitted machine-spawned assignments. Some have been hilariously inept (mentioning a contemporary academic expert on European urbanization in a question about 19th-century American Realist fiction); others have been dully, improbably functional, betrayed by the lack of even a flicker of imagination.

My students have reacted to being called out in a variety of ways: some with a good-natured admission that they were lazy; others with denial; others with concern that their own work wasn’t ‘good enough’, which led them to lean on the student’s little helper.

These individual and understandable errors of teenage judgment are less worrying than the indifference of many fellow educators (and, I suspect, parents).

There is a worrying trend to see AI as inevitable, irresistible and therefore beyond discussion: variations on ‘they’re going to use it anyway’ are trotted out to avoid a conversation about the potential consequences.

Not the immediate consequences — a grade here, an exam there will change nobody’s life — but the long-term consequences. Which I fear are these:

Incapacity for reason

The biggest problem with using generative AI is that students aren’t using their own brains. This isn’t a ‘they won’t memorize when the Tudor era was’ quibble; they really won’t learn to think.

The brain is plastic and capable of tremendous feats — if used. Otherwise, it’s so much junk circuitry.

In the short term, student writing that’s been polished by Grammarly or its ilk looks better: fewer egregious grammar errors, superficially logical organization. But ask them why these changes are improvements, and they have no clue.

All that’s happening is that teachers are losing the opportunity to identify areas where students need more support.

Indifference to error

Another trend that can lead to no good end is students not caring whether their work is correct, incorrect or irrelevant. If the machine spat it out, some feel, that’s good enough — after all, young people are constantly told how huge and brilliant AI is: how can they compete?

The potential feedback loop here: students submit AI-generated work, harried teachers use AI tools to grade it, and the whole system becomes an ouroboros of wasted time and potential.


Inability to argue

The ability to argue — to form, express and defend an opinion — is as essential to adult life as oxygen. And it isn’t looking good for kids who are propping themselves up with AI. Not only are they unnerved by a question that requires them to form an opinion, they can’t even come up with snappy adolescent ripostes when called out on their machine-enabled cheating.

Teens at a loss for a smart remark is a harbinger of no good thing. Navigating the transition from child to adult is, or should be, prime time to argue. But how can they learn, if they outsource critical thinking, research and expression to an outside entity?

Incomprehension of self

The far end of this continuum is, I fear, an incomprehension of who they are, outside of what they’re told. Thinking for oneself is a non-negotiable element of forming cogent thoughts about oneself.

It isn’t what students learn, it’s how they learn that matters.

Education can and should be a painstaking process of discovery. Try things, like things, dislike things, fail at things, discover better things, discover better ways to do some things, give up others altogether.

That students are fixated on results and outcomes, rather than the process, is a fault of education and educators. That is on us to amend, fast.

Here are three things that can help:

Foreground process

If teaching students to write an essay, assess all the steps in the process, except the finished draft.

If teaching them to write a lab report, ditto.

Whatever the subject or discipline, teachers can find ways to make the process matter and de-emphasize the results. Any twit can type a prompt and get an answer. Answers are no longer special. Let’s own that and move on.

Encourage collaboration

Online teaching has facilitated an atomization of learning. It is great for a lot of students, but it is also fundamentally isolating. And if a student’s primary interaction is with the computer in front of them, why go anywhere else for advice or answers?

In a classroom, this is relatively straightforward: design activities that require collaboration and make space for varied pair/small group/whole group work.

Fostering collaboration in non-traditional settings is a challenge, but many home school families, for example, form groups, gather for educational outings, or otherwise find creative ways to get students together.

For remote classes, break-out rooms and other such tech tools allow students to work together without distraction, and let the teacher keep an eye on things.

Remind students they have a choice

In the old saw about death and taxes, one of those is a human invention, which leaves not much that is truly inevitable. But if kids see teachers and parents throwing up their hands and saying the machines won, that’s the narrative they’ll run with.

Society is a construct. Education is a construct. Culture is a construct. The economy is a construct. Our beliefs are a construct.

We built it. We can change it.

Invoking either ‘nature’ or ‘technology’ as an implacable monolith is nonsense. Appealing nonsense, because it lets us evade responsibility that is ours.

In Bad Stories, written in the wake of the 2016 US presidential election, Steve Almond quotes Neil Postman on dystopias:

In Huxley’s vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think… Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism.

Adore the technologies that undo their capacities to think — sound familiar?


Writing constructs our world. It has done so for millennia, and it will continue to do so. As teachers, we must care enough to urge our students to take control of their narratives, not yield to the inevitable.