
As artificial intelligence systems generate increasingly convincing content and identities online, the internet is becoming saturated with all things fake: fake users, fake videos, fake engagement. World ID promises a solution by developing a biometric system that allows you to prove your humanity, once and for all, without revealing anything else about your identity. It’s anonymity with accountability, the company claims.
Developed by Tools for Humanity and backed by OpenAI CEO Sam Altman, World ID is pitched as a human passport. “World ID allows you to anonymously and securely verify that you are a real and unique human,” its website promises, as if this were a normal thing to need to prove in daily life.
The website itself is designed like a concept store: sleek, minimal, Apple-esque. It’s filled with vague, breathy slogans like “the real human network” and “identity, finance and community for every human.” Marketing materials feature glowing orbs and glowing skin, with models gazing toward clean, hopeful horizons. It’s deeply aesthetically pleasing, warm, and inviting.
But beneath the polish lies an eerie suggestion: we may no longer be able to prove our humanity with words, actions, or thoughts; instead, we may need to offer our bodies. Or more precisely, our irises.
For years, we’ve relied on Turing-style tests to distinguish humans from non-sentient agents. Think of François Chollet’s ARC puzzles, or the humble CAPTCHA: “select all images with bicycles.” While occasionally frustrating (I’ve failed more than a few), at least these tests were based on cognitive cues. They assumed that to be human was to think, to perceive, and to respond in ways machines could not.
But the Orb, World ID’s signature biometric scanner, flips this logic. Now, humanness is proven through physical uniqueness — not how you think, but how your eye is structured. We are shifting from cerebral to corporeal verification. And it isn’t a government agency rolling out a new form of passport; it’s a private company.
Tiago Sada, a spokesperson for Tools for Humanity, insists the public should not worry: “The only thing World ID knows about a person is that they’re a real and unique human being.” He emphasized that the system doesn’t collect personal identifiers like names or email addresses, just proof of humanness.
This is where James C. Scott’s theory of legibility becomes uncannily relevant. In his book “Seeing Like a State”, Scott argues that modern states seek to make their populations “legible” — understandable and classifiable through things like census data, surnames, and borders. But here, the process of making people legible isn’t being run by a government. It’s being done by a startup with a celestial-looking orb and a venture-capital war chest.
In other words, we’re not just asking the state to recognize us anymore. We’re asking a corporation.
A few months ago, 23andMe, the company known for at-home genetic testing kits, filed for bankruptcy. Now, the genetic data of millions may be sold off to the highest bidder and reduced to a corporate asset. What happens if something similar happens to World ID? Who owns the infrastructure that says whether you’re a person? Who decides whether you’re visible or valid?
The rise of the Orb raises similarly troubling questions: What happens to those who refuse to participate, or who can’t? The unhoused. The undocumented. The privacy-conscious. Will they be considered less “real”? Less “human”? Already, many services and platforms demand ever more rigorous identity verification. Will World ID become the new standard for voting, banking, or even existing online?
The logic behind World ID is seductive: in a digital age flooded with bots, scams, and deepfakes, a reliable way to prove we’re really human could be revolutionary. But we should also be asking who sets the terms for “realness,” and what happens when you fail the test, not cognitively, but biologically.
After all, personhood isn’t just about unique flesh. It’s about rights, dignity, and democratic accountability. So before you scan your iris for a smoother digital experience, ask yourself: Who are you proving your humanity to, and why do they get to decide?
The Zeitgeist aims to publish ideas worth discussing. The views presented are solely those of the writer and do not necessarily reflect the views of the editorial board.