Opinion

The comfort of AI and the cost to human curiosity

For many, artificial intelligence is just a way to get things done faster and more easily. For me, it has become an unexpected mirror, always ready with quick and confident answers, even when no one asked.
At first, it was helpful. It wrote quickly, summarised well and organised ideas easily.
But after a while, I noticed something different. The words sounded correct, but they often felt too familiar: smooth and convincing, yet quietly repeating what had already been said.
AI became part of writing and knowledge creation quietly, without any big announcement. It didn’t ask for permission; it just made things simpler.
Articles were written faster, research was easier to scan, language barriers faded and ideas spread more widely. Students got answers in seconds. Writers found it easier to get started.
Researchers saved months of effort. AI worked like a tireless assistant, always helping and never complaining.
AI helped organise thoughts, suggest structure and take away the fear of a blank page. More people joined discussions that used to feel closed off or hidden behind long reading lists.
But making things easier can change our habits. I started to see how often speed took the place of effort, how quickly answers showed up and how rarely we let ourselves feel uncomfortable.
Writing got smoother, but sometimes thinking became less deep. It wasn’t that people stopped caring; the tool simply took away the struggle, and real understanding often grows from that struggle.
AI doesn’t think like people do. It makes predictions. It rearranges what already exists. It learns from past patterns and gives them back with confidence.
When many people use the same systems trained on the same material, ideas start to go in circles. The language may get better, but real insight doesn’t always follow and repetition can start to look like something new.
AI is good at summarising what we already know, but it struggles with the quiet pause before something new is discovered. That pause is still a very human thing.
Over time, I noticed something else. When answers come right away, people ask fewer tough questions. When writing feels too easy, patience can fade.
The real risk isn’t that AI writes for us, but that it quietly thinks for us, and that we accept smooth words as real understanding before moving on.
AI learns from what’s already been written and as more content is created by AI, future systems might start learning from their own work. The mirror begins to reflect itself, losing a bit more each time, but still sounding just as sure as before.
Economic pressure adds to the problem. When human work is used to train systems that then compete with people, the incentives shift. Original work becomes costly, while copying gets cheaper.
I’ve learned that AI isn’t a mind. It’s a mirror, showing back what people have already put in front of it. If we stop adding new experiences, risks and questions, the reflection never changes.
But if we keep thinking for ourselves, AI becomes what it should be — a tool that helps carry knowledge forward, not one that replaces the work of creating it.
The future of knowledge isn’t about whether machines can write. It depends on whether people are still willing to pause, ask questions and think slowly in a world that values speed.