By now, you must have heard of ChatGPT, the latest software that writes on command, on practically any topic and at any level.
A company report, an analysis of medical research data, a college essay: ChatGPT can generate all of these, free of charge. This worries every sector, from content creators and lyricists to, of course, teachers.
But is ChatGPT just a tool that lets students download untraceable essays without doing the work? Will it simply make students (and anyone else who writes or needs ideas) lazy?
ChatGPT introduces itself as an AI chatbot that ‘interacts in a conversational way. The dialogue format makes it possible...to answer follow up questions, admit its mistakes, challenge incorrect premises...’. Touted as an alternative to Google, it does much more: it can produce complete, organised sentences and build an argument with supporting evidence.
The reason why everyone is excited (and nervous) about this chatbot is that it is at present the best available way to create somewhat meaningful content.
The more information you provide in your prompt, the more precise the writing will be.
Are the concerns of teachers and others valid? Is this the end of individual thinking? Will everybody simply download an answer generated by this form of AI (Artificial Intelligence)? And if so, where does that leave those whose job it is to draw conclusions from data, to think through implications, or even to write school and college essays?
To begin with, ChatGPT is useful and does save time. This is especially true of, for instance, radiology reports, where an automated draft can guide a physician's follow-up action. Lawyers, too, can use it to retrieve previous rulings quickly, leaving more time for the arguments of a new case.
So, is it too early to panic? The short answer is yes. Professionals across many fields have used this technology to generate text and found that, for now, it has serious limitations.
Firstly, ChatGPT cannot predict anything. Because it draws on data already in its vast databank, it cannot look into the future.
Secondly, it is too literal and tends to answer inappropriate questions in equally inappropriate ways. It is also verbose, and its language is clearly mechanical, with no cultural inflection: the unconscious mannerisms of a human writer are simply absent.
At the end of the day, all writing reflects an individual voice, a specific culture or a context.
AI, so far, hides behind a universal voice bereft of all individuality. That voice belongs to a world of robots, the stuff of science fiction. Luckily, we are not there yet.
For teachers, ChatGPT is simply another opportunity to rethink how teaching and learning are conducted. By drawing on context, voice, the application of ideas, and local examples, writing can remain original and learning productive.