

“Attention is not neutral,” Antón Barba-Kay, a philosopher at the University of California, San Diego, writes in “A Web of Our Own Making: The Nature of Digital Formation.” “It is the act by which we confer meaning on things and by which we discover that they are meaningful, the act through which we bind facts into cares.” When we cede control of our attention, we cede more than what we are looking at now. We cede, to some degree, control over what we will care about tomorrow.
The politics of attention are on my mind because a recent court case has sharpened the need to describe what, exactly, has gone wrong in our digital lives. In 2020, the Federal Trade Commission sued Meta for creating an illegal monopoly in the personal social networking market. Last month, a US District Court in Washington ruled in Meta’s favour.
If Meta wanted to know what I want to see, it could ask me. The technology has long existed for users to shape their own recommendations. These companies do not offer us control over what we see because they do not want us to have it. They do not want to be bound by who we seek to be tomorrow.
Attention is sometimes an act. But it is first an instinct. This is why even the most basic attempt at mindfulness — watching 10 breaths go by, without your attention wandering — requires such concentration. Algorithmic media companies exploit the difference between our attentional instincts and aspirations. In so doing, they make it harder for us to become who we might wish to be.
Seeing these companies as seeking a form of control over our attention reveals, I think, the inadequacy of antitrust law for this particular task. The point of antitrust policy is typically to increase competition in a market, unlocking entrepreneurial ferocity and genius by lowering the barrier to entry. But is fiercer competition for my attention, or my children’s attention, desirable?
Max Read, a technology critic, wrote an insightful essay in his Substack newsletter arguing that the ideas I’m circling here are best understood as a modern temperance movement, one “positioning the rise of social media and the platform giants as something between a public-health scare and a spiritual threat, rather than (solely) a problem of political economy or market design.” This approach, he goes on to say, is “distinctly not ‘populist’... so much as progressive in the original sense, a reform ideology rooted in middle-class concerns for general social welfare in the wake of sweeping technological change.”
I think there’s truth in all of that. TikTok’s effects on our wallets matter less to me than its effect on our souls. But I don’t see the division here as between populists and progressives — groups that substantially overlap anyway. The FTC lost the Meta case because it is limited in its mission and its tools, but at least it was trying to do something about the power these platforms exert over our society. Where was everyone else?
The division I see here is between progressivism and liberalism as we now understand it. Modern liberalism is built around the idea that the government should make it possible for people to pursue their happiness as they see fit, so long as they are not harming others. It has much to say about individual rights and little to say about the common — or even the individual — good.
Liberalism carries, at its core, a trust that social experimentation will lead to better forms of social organisation. That has freed it — and freed us — from the shackles of repressive traditions. But it can be confounded when adults are freely making decisions that don’t harm others but perhaps harm themselves. And it has created a loophole that algorithmic media companies have driven a truck through: We’re just giving people what they want, they say. Who are you to judge what they want? It’s not an easy question to answer.
But it feels to me like the outlines of an agenda — or at least ideas worth debating and trying — are coming more clearly into focus. Much of it revolves around two ideas: First, children should be more insulated from the ubiquity of digital temptations. Second, companies that want to shape so much human attention need to take on more responsibility and liability for what might go wrong.
None of us knows how it will change adults to fall into intimate relationships with AIs, to say nothing of what it will mean for children to grow up in a world where AI companionship is omnipresent. It could be better than today’s opaque algorithms, offering us the ability to ask for what we want and actually get it. And what happens when corporations find it is more profitable to have the AIs we treat as friends manipulate what we want to better serve their bottom lines?
Which is why, in the end, I don’t believe it will be possible for society to remain neutral on what it means to live our digital lives well. Absent some view of what human flourishing is, we will have no way to judge whether it is being helped or harmed. This line from Barba-Kay might be corny, but it has the virtue of being true: “If the present technological age has a lasting gift for us, it is to urge as decisive the question of what human beings are for.” — The New York Times
Ezra Klein
The author is the host of the podcast “The Ezra Klein Show” and the author of “Why We’re Polarized” and, with Derek Thompson, “Abundance.”