Saturday, January 31, 2026 | Sha'ban 11, 1447 H

Students are skipping the hardest part of growing up

Just as overreliance on calculators can weaken our arithmetic abilities and overreliance on GPS can weaken our sense of direction, overreliance on AI may weaken our ability to deal with the give and take of ordinary human interaction.

Back in 2023, when ChatGPT was still new, a professor friend had a colleague observe her class. Afterward, he complimented her on her teaching but asked if she knew her students were typing her questions into ChatGPT and reading its output aloud as their replies.


At the time, I chalked this up to cognitive offloading, the use of artificial intelligence to reduce the amount of thinking required to complete a task. Looking back, though, I think it was an early case of emotional offloading, the use of AI to reduce the energy required to navigate human interaction.


You’ve probably heard of extreme cases in which people treat bots as therapists or friends. But many more have them intervene in their social lives in subtler ways. On dating apps, people are leaning on AI to help them seem more educated or confident; one app, Hinge, reports that many younger users “vibe check” messages with AI before sending them. (Young men, especially, lean on it to help them initiate conversations.)


In the classroom, the domain I know best, some students are using the tools not just to reduce effort on homework but also to avoid the stress of an unscripted conversation with a professor — the possibility of making a mistake, drawing a blank or looking dumb — even when their interactions are not graded.


Last fall, The Times reported on students at the University of Illinois Urbana-Champaign who cheated in their course, then wrote their apologies using AI. In a situation where unforged communication to their professors might have made a difference, they still wouldn’t (or couldn’t) forgo AI as a social prosthetic.


As an academic administrator, I’m paid to worry about students’ use of AI to do their critical thinking. Universities have whole frameworks and apparatuses for academic integrity. AI has been a meteor strike on those frameworks, for obvious reasons.


But as educators, we have to do more than ensure that students learn things; we have to help them become new people, too. From that perspective, emotional offloading worries me more than the cognitive kind, because farming out your social intuitions could hurt young people more than opting out of writing their own history papers.


A generation gap has formed around AI use. One study found that 18-to-25-year-olds alone accounted for 46 per cent of ChatGPT use. And this analysis didn’t even include users 17 and under.


Teenagers and young adults, stuck in the gradual transition from managed childhoods to adult freedoms, are both eager to make human connection and exquisitely alert to the possibility of embarrassment. (You remember.) AI offers them a way to manage some of that anxiety of presenting themselves in new roles when they don’t have a lot of experience to go on. In 2022, 41 per cent of young adults reported feelings of anxiety most days.


Even informal social settings require participants to develop and then act within appropriate roles, a phenomenon best described by the sociologist Erving Goffman. There are ways people are expected to behave on a date or in a grocery store or at a restaurant and different ways in different kinds of restaurants. But in certain situations, like starting at a new job, rules aren’t immediately clear. In his book “The Presentation of Self in Everyday Life,” Dr Goffman writes: "When the individual does move into a new position in society and obtains a new part to perform, he is not likely to be told in full detail how to conduct himself, nor will the facts of his new situation press sufficiently on him from the start to determine his conduct without his further giving thought to it."


When we take on new roles — which we do all our lives, but especially as we figure out how to become adults — we learn by doing and often by doing badly: being too formal or informal with new colleagues, too strait-laced or casual in new situations. I still remember the shock on learning, years later, that because of my odd dress and awkward demeanour, my friends’ nickname for me freshman year was “the horror child.”


Dr Goffman was writing in the mid-1950s, when more socialising happened face-to-face. At the time, writing was relatively formal, whether for public consumption, as with literature or journalism, or for particular audiences, as with memos and contracts. Even letters and telegrams often involved real compositional thought; the postcard was as informal as it got.


That started to change in the 1990s, when the inrush of digital communications — emails, instant messages, texting, Facebook, WhatsApp — made writing essential to much of human interaction and socialising much easier to script. The words you send other people are a big part of your presentation of self in everyday life. And every place where writing has become a social interface is now ripe for an injection of AI, adding an automated editor into every conversation, draining some of the personal from interpersonal interaction.


At a recent panel about student AI use hosted by high school educators, I heard several teens describe using AI to puzzle through past human interactions and rehearse upcoming ones. One talked about needing to have a tough conversation — “I want to say this to my friend, but I don’t want to sound rude” — so she asked AI to help her rehearse it. Another said she had grown up hating to make phone calls, a common dislike among young people, which meant that most of her interaction at a distance was done via text, with time to compose and edit replies — time that could now include instant vibe checks.


These teens were adamant that they did not want to go directly to their parents or friends with these issues and that the steady availability of AI was a relief to them. They also rejected the idea of AI therapists; they weren’t treating AI as a replacement for another person but instead were using it to second-guess their developing sense of how to treat other people.


AI has been trained to give us answers we like, rather than the ones we may need to hear. The resulting stream of praise — constantly hearing some version of “You’re absolutely right!” — risks eroding our ability to deal with the messiness of human relationships. Sociologists call this social deskilling.


Even casual AI use exposes users to a level of praise humans rarely experience from one another, which is not great for any of us but is especially risky for young people still working on their social skills.


AI misuse cannot be addressed solely through individuals opting out. Although some young people have started intentionally avoiding AI use, this is more likely to create a counterculture than to affect broad adoption. There are already signs that “I don’t use AI” is becoming this century’s “I don’t even own a TV” — a sanctimonious signal that had no appreciable effect on TV watching.


We do have a contemporary example of taking social dilemmas caused by technology seriously: the smartphone. Smartphones have good uses and have been widely adopted by choice, like AI. But after almost two decades of treating overuse as a question of individual willpower, we are finally experimenting with collective action, as with bans on phones in the classroom and real age limits on social media.


It took us nearly two decades from the arrival of the smartphone to start instituting collective responses. If we move at the same rate here, we will start treating AI’s threat to human relationships as a collective issue in the late 2030s. We can already see the outlines of emotional offloading; it would be good if we didn’t wait that long to react.

Clay Shirky


The writer is a vice-provost at New York University and has been helping faculty members and students adapt to digital tools since 2015.

