

Each time we open our phones to read the news, we are not just browsing headlines—we are stepping into a world that has been organised for us. The news stories we see are not selected by a human editor sitting at a desk, but by a digital system working quietly in the background. This system, based on artificial intelligence, is not a journalist or a news producer, but it plays a major role in shaping what we know and what we don’t. Journalism today is being transformed by algorithms that decide which stories to highlight and which ones to hide.
In earlier times, it was the responsibility of editors and journalists to select the main stories of the day. They used their experience and understanding of what matters to the public. They followed journalistic values such as accuracy, fairness, and public interest. But now, the power to decide what news reaches us is mostly in the hands of platforms like Google, Facebook, TikTok, and X. These companies use AI tools that do not focus on the value of the information, but on how much time users will spend looking at it.
These tools study our online behaviour to make predictions. They learn from the articles we click on, the posts we like, the videos we watch, and even how fast we scroll. Based on this information, they create a profile of each user and show them content they are most likely to engage with. The goal is to keep users active on the platform for as long as possible. But this personalisation means that each person sees a different version of the news, filtered by what the system thinks is interesting.
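The mechanism described above can be illustrated with a deliberately simplified sketch. This is a toy illustration only, not any platform's actual system: the user profile, topic weights, and story data below are all hypothetical. The point is just that when a feed is ordered purely by predicted engagement, stories matching a user's past clicks rise to the top regardless of their civic importance.

```python
# Toy sketch of engagement-based ranking (hypothetical, not a real platform's code).
# Each story is scored against a profile of the user's past behaviour, and the
# feed is simply the stories sorted by that predicted engagement.

def predicted_engagement(story_topics, user_profile):
    """Sum the user's historical interest weight for each of the story's topics."""
    return sum(user_profile.get(topic, 0.0) for topic in story_topics)

def rank_feed(stories, user_profile):
    """Order stories by predicted engagement, highest first."""
    return sorted(
        stories,
        key=lambda s: predicted_engagement(s["topics"], user_profile),
        reverse=True,
    )

# Hypothetical user who mostly clicks on sport and celebrity items.
user_profile = {"sport": 0.9, "celebrity": 0.7, "politics": 0.1}

stories = [
    {"title": "Budget committee report", "topics": ["politics"]},
    {"title": "Star striker transfer rumour", "topics": ["sport", "celebrity"]},
    {"title": "Local council election", "topics": ["politics", "local"]},
]

for story in rank_feed(stories, user_profile):
    print(story["title"])
```

Even in this toy version, the celebrity sport item outranks both civic stories, because the score measures only likely attention, never public value. Real systems use far richer signals (watch time, scroll speed, shares), but the ordering principle is the same.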
This change affects what kind of journalism becomes popular. Stories that are emotional or dramatic tend to do well—those that make people feel shocked, angry, happy, or afraid. Reports that are short, sensational, or easy to share often perform better than long, in-depth investigations. As a result, the serious journalism that helps people understand society and holds those in power to account is becoming harder to find.
Newsrooms are also adapting to these changes. Many journalists now write with algorithms in mind. They use keywords to improve search rankings, pick images that stop users from scrolling, and schedule publishing times to match the platform’s peak traffic hours. It’s as if the algorithm is now a silent editor, influencing what stories are produced and how they are presented.
However, these algorithms are not neutral. They are created by people and companies with specific goals, primarily commercial ones. They are designed to promote content that generates attention, not necessarily what is true or important. As a result, some topics, such as local news, minority voices, or stories that challenge authority, may be shown less often because they do not generate high engagement. These decisions are not made openly, which compounds the problem.
Most users do not know why they see some stories and not others. The process behind what gets shown is hidden in complex code and company policies. Unlike traditional media, which can be challenged by the public or held to professional standards, the decisions made by AI systems are difficult to question. This lack of transparency makes it harder for people to trust the information they receive.
There is also the risk of echo chambers. When the system shows users more content that supports their existing beliefs, people can become less open to other opinions. This can make society more divided and reduce the chance of genuine discussions between different groups. Journalism is supposed to help people see the full picture, but personalisation can make that more difficult.
Some efforts are now being made to address these issues. The European Union has introduced the Digital Services Act to compel tech platforms to be more transparent and responsible.
Experts and researchers are calling for independent checks on how algorithms operate. There is growing awareness that we need to balance technology with ethics—especially regarding how news is delivered.
At the same time, media organisations need to examine their own role. If they focus solely on what the algorithm favours, they risk neglecting their duty to inform the public with honesty and depth. Some journalists are choosing to resist this pressure and continue doing important work, even if it isn’t always popular online. They focus on investigations, covering overlooked topics, and telling stories that matter—regardless of their performance on social media.
The audience also plays a crucial part. As people become more aware of how algorithms influence what they see, many are beginning to seek different sources. Some subscribe to newsletters, support independent journalism, and look for trustworthy content that offers real understanding instead of just attention-grabbing stories.
Journalism is not a formula. It is not just a line of code. It is more like a lighthouse: steady, sometimes unseen in the fog, but always there to guide those willing to look up. Algorithms can show us what is popular. Only journalists can show us what truly matters.