Tuesday, December 16, 2025 | Jumada al-akhirah 24, 1447 H
EDITOR IN CHIEF- ABDULLAH BIN SALIM AL SHUEILI

Junk food for the mind

It would be nice if there were more stigma and more shame attached to the many ways it’s possible to use AI to think less.

I’m generally optimistic about all the ways artificial intelligence is going to make life better — scientific research, medical diagnoses, tutoring and my favourite current use, vacation planning. But it also offers a malevolent seduction: excellence without effort. It gives people the illusion that they can be good at thinking without hard work, and I’m sorry, that’s not possible.


There’s a recent study that exposes this seduction. It has a really small sample size, and it hasn’t even been peer reviewed yet — so put in all your caveats — but it suggests something that seems intuitively true.


A group of researchers led by Massachusetts Institute of Technology’s Nataliya Kosmyna recruited 54 participants to write essays. Some of them used AI to write the essays, some wrote with the assistance of search engines, and some wrote the old-fashioned way, using their brains. The essays people used AI to write contained a lot more references to specific names, places, years and definitions. The people who relied solely on their brains had 60 per cent fewer references to these things. So far so good.


But the essays written with AI were more homogeneous, while those written by people relying on their brains created a wider variety of arguments and points. Later, the researchers asked the participants to quote from their own papers. Roughly 83 per cent of the large language model, or LLM, users had difficulty quoting from their own paper. They hadn’t really internalised their own “writing” and little of it sank in. People who used search engines were better at quoting their own points, and people who used just their brains were a lot better.


Almost all the people who wrote their own papers felt they owned their work, whereas fewer of the AI users claimed full ownership of their work. Here’s how the authors summarise this part of their research: “The brain-only group, though under greater cognitive load, demonstrated deeper learning outcomes and stronger identity with their output. The search engine group displayed moderate internalisation, likely balancing effort with outcome. The LLM group, while benefiting from tool efficiency, showed weaker memory traces, reduced self-monitoring and fragmented authorship.” In other words, more effort, more reward. More efficiency, less thinking.


But here’s where things get scary. The researchers used an EEG headset to look at the inner workings of their subjects’ brains. The subjects who relied only on their own brains showed higher connectivity across a bunch of brain regions. Search engine users experienced less brain connectivity and AI users least of all.


Researchers have a method called dynamic directed transfer function, or DDTF, that measures the coherence and directionality of the neural networks and can be interpreted in the context of executive function, attention regulation and other related cognitive processes. The brain-only writers had the highest DDTF connectivity. The search engine group demonstrated 34 to 48 per cent lower total connectivity, and the AI group demonstrated up to 55 per cent lower DDTF connectivity.


The neuroscience cliché is that neurons that fire together wire together. That’s the key implication here. Thinking hard strengthens your mental capacity. Using a bot to think for you, or even just massaging what the bot gives you, is empty calories for the mind. You’re robbing yourself of an education and diminishing your intellectual potential.


It’s not clear how many students use AI to write their papers. OpenAI says 1 in 3 students use its products. I think that’s a vast underestimate. About a year ago, I asked a roomful of college students how many of them used AI, and almost every hand went up. There’s a seductiveness to the process. You start by using AI as a research tool, but then you’re harried and time pressured, and before long, AI is doing most of the work.

I was at a conference of academics last month in Utah, and one of the professors said something that haunted me: “We’re all focused on the threat posed by Trump, but it’s AI that’s going to kill us.” Hua Hsu recently published a piece in The New Yorker titled “What Happens After AI Destroys College Writing?” that captures the dynamic. Hsu interviewed a student named Alex who initially insisted that he used AI only to organise his notes. When they met in person, he admitted that wasn’t remotely true. “Any type of writing in life, I use AI,” Alex said. Then he joked, “I need AI to text girls.”

In 1960, college students were assigned about 25 hours a week of homework; by 2015, that number was closer to 15. Yet most students I encounter are frantically busy, much busier than I remember my friends and me being, often with student activities overshadowing academic work. So, of course, they are going to use a timesaving technology to take care of what they consider to be the trivial stuff that gets assigned in the classroom.


AI isn’t going anywhere, so the crucial question is one of motivation. What do students, and all of us, really care about — clearing the schedule or becoming educated? If you want to be strong, you have to go to the gym. If you want to possess good judgement, you have to read and write on your own. Some people use AI to think more — to learn new things, to explore new realms, to cogitate on new subjects. It would be nice if there were more stigma and more shame attached to the many ways it’s possible to use AI to think less. — The New York Times


David Brooks


A book author and political and cultural commentator

