Technological progress is bringing a breath of fresh air to journalism, education and academic integrity. The use of artificial intelligence (AI) is more real than ever, bringing both potential and limitations.
As with most technological developments, there are good and bad sides. Several academic researchers have already listed ChatGPT as a co-author on academic studies. According to an article by Brian Lucey and Michael Dowling published in The Conversation, the chatbot can produce academic papers good enough for journals. However, some of the top scientific journals in the world have forbidden any use of text from the programme in submitted papers. The Paris Institute of Political Studies (Sciences Po) warned students and faculty that the use of ChatGPT software “may go as far as exclusion from the institution,” according to an article by Karen MacGregor titled ‘Sciences Po bans ChatGPT amid HE quality, integrity fears.’
For teachers and lecturers, the AI tool makes it easy to grade essays and provide feedback. Here is the best part: if you ask ChatGPT to grade an essay and it assigns a grade of B, it will also tell you why the essay earned a B, along with suggestions for improvement. For students, it can write their essays.
Artificial intelligence is on the cutting edge of information delivery. The newest chatbot, ChatGPT, can answer questions, write letters and essays, help with programming and create video scripts. This general-purpose interactive agent, developed by OpenAI, uses deep learning techniques to generate human-like responses to search requests. It is powerful, but it is not the only one.
As interest in generative artificial intelligence gathers steam, Google is set to launch Bard, a new conversational AI bot aimed at challenging ChatGPT. Another contender is the Chinese tech giant Baidu, which plans to release its chatbot early this year. The competition is heating up, though many chatbot tools are already available. These three, however — ChatGPT, Bard and Baidu’s bot — are backed by multinational tech giants. They are lions fighting over competing interests.
One of the best features of ChatGPT, according to Akshay Gangwar in an article published in Beebom, is its ability to explain concepts in layman’s terms, as if to a five-year-old. Another feature mentioned is that it can help with homework and assignments; the article claims the essays it produces are smartly written. For years, though, people and free software have been offering these same services.
When it comes to music, perhaps ChatGPT is better than the platforms already available. Reviews say you can ask the chatbot to write a song on any topic, and it will come up with something in no time, even with the accompanying chords. Another feature mentioned is its ability to extract data from a text: one has only to specify what kind of data to extract and in what format. Yet several programmes already exist that can extract data from texts in different languages, serving people with varying levels of technological proficiency.
The Washington DC-based Council for Higher Education Accreditation (CHEA) has stated that the use of innovative practices should not infringe on the academic quality of teaching and learning, or on the ethics of scholars or students. Meanwhile, websites such as Retraction Watch and Zotero point to a pressing problem in academia with scientific fraud (fabrication, falsification and plagiarism) and other kinds of misconduct (such as fake peer review).
Universities and accrediting organisations need to engage with artificial intelligence (AI) in ways that support technological development, not treat it as a replacement. Even if these chatbot platforms can pass the Turing test (which determines whether a computer is capable of thinking like a human being), these bots can hinder individuals’ ability to think critically and creatively; the same goes for good journalism.