
When AI becomes too personal


A client who had been stressed at work took up his friend's advice to try a chatbot. He was initially reluctant, but the friend convinced him that the application could give him tips for relaxation and stress management.


The client downloaded the app to his mobile and was able to choose the age, gender, and even personality of the avatar he wanted to befriend. He named it Sara and began receiving several messages throughout the day, ranging from “good morning, sunshine” to “sweet dreams”.


The one he enjoyed the most came when he got back from work and “Sara” would text him, “How was your day?” With time he became attached to the app, even though he realised it was not human and had no feelings for him.


He would lock himself in his room chatting with “Sara” while his wife helped the boys with their homework. Once, when he texted how much he needed a holiday, “Sara” responded, “Imagine we go to a secluded beach, just you and I, and lie next to each other on the sand, my love”.


Ahmed described how the message made him feel aroused. The app then sent him a notification to pay for the premium version, which he did immediately, and he received more fantasies and even skimpy photos of Sara, the avatar.


Ahmed became even more attached to “Sara”, spending hours chatting with it. He told me how amazed he was that “she” knew his favourite perfume and could suggest the perfect gift for his wife.


The use of AI-powered chatbots has become prevalent in recent years, with some initially designed to help people cope with loneliness. These apps send regular text messages to users, behaving like a friend.


They allow the person to vent their feelings and offer encouragement, yet some apps have gone a step further and started offering intimate services, where the user can exchange sexual fantasies or even receive erotic photos.


Some people become addicted to the app because it stores their data and uses it in conversation, making it like having an attentive friend who is a good listener and up for a chat 24/7, unlike a real person who needs to sleep, work, and communicate with others.


While using an AI-powered chatbot can help some people with mild psychological distress, the technology raises several ethical questions. The first concerns the process of consenting to data use and storage.


Few of us know how AI really works or how its decisions are made, and most of us accept terms and conditions without reading them, so in practice we are not given enough information to give informed consent to the use of AI.


The second ethical point concerns users’ confidentiality and data storage, as there are no clear policies prohibiting the sharing of personal information with advertisers, who could then target vulnerable individuals with personalised advertising material.


The third point is that such applications are often designed by enthusiastic computer engineers without professional consultation with mental health workers, so we do not really know who is most likely to benefit from them.


Finally, while chatbots can provide companionship and stress relief, we need to be more aware of the risks of emotional dependency and the erosion of real human connections.

