Is a silver orb scanning our eyeballs in exchange for digital coins a cause for concern or chance to get rich?
Visit one of the pilot programme's designated sites around the world, stand in line for your turn, finally be called forward and gaze into a futuristic-looking silver orb that scans your unique iris. In return, you are tossed a digital coin that you presumably hope will one day be worth a considerable amount of money. Otherwise, what would be the incentive for queuing up to surrender a unique personal identifier?
Sam Altman, the co-founder and CEO of OpenAI, the company behind ChatGPT, says there is plenty of wider-ranging incentive. The iris scan is stage one of a master plan to protect humanity from the kind of AI technology his company has developed. It is presented as a vast improvement on having to click the boxes with buses in them to prove we are humans and not robots.
Worldcoin is also touted as being able to “drastically increase economic opportunity” and pave the way for AI-funded universal basic income. Presumably to stop the masses starving when they no longer have jobs. Or, more optimistically put, no longer have to work.
A future where humanity is freed from the daily grind of having to earn a living to feed, clothe and shelter itself has a certain appeal. But for it to become a reality, at least in the way proposed by Altman, hundreds of millions of humans around the world first need to have their eyeballs scanned.
The resulting scan becomes the scanee's "World ID", or "proof of personhood". It will allow us to conveniently prove we are not robots. What it won't do, says Altman, is compromise privacy, even though the iris scans are not anonymised.
But they don't have to be, at least not at this stage of the World ID initiative, because personal identity is never connected to individual scans.
Scans enter a database, and when websites and other software start to integrate the accompanying security technology, they will simply cross-check an iris scanned in real time by a smartphone or laptop camera against that database. If there's a match, the application confirms that a human, not an iris-less robot, is requesting entry.
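As a loose illustration of the cross-check described above — not Worldcoin's actual protocol, which reduces scans to a biometric template rather than an exact hash — the idea amounts to a membership test against a registry that never stores names:

```python
import hashlib

# Hypothetical registry of hashed iris codes. Storing only one-way
# hashes is a simplification: real iris matching is fuzzy, so exact
# hashing would not work in practice.
registered_hashes: set = set()

def enroll(iris_code: bytes) -> None:
    """Store a one-way hash of the iris code; no identity is attached."""
    registered_hashes.add(hashlib.sha256(iris_code).hexdigest())

def verify(iris_code: bytes) -> bool:
    """A live scan proves personhood if its hash matches a registered one."""
    return hashlib.sha256(iris_code).hexdigest() in registered_hashes

enroll(b"alice-iris-template")
print(verify(b"alice-iris-template"))  # True: a registered human
print(verify(b"unknown-robot"))        # False: no match in the registry
```

The point of the sketch is that the registry answers only "has this iris been seen before?" — it holds no link back to who the iris belongs to.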
Although it only recently attracted extensive mainstream media coverage, testing of Worldcoin's iris scanners actually began around two years ago. More than two million volunteers from 33 countries, mainly in Europe, India and southern Africa, have already come forward to have their eyeballs scanned. For regulatory reasons, there are no U.S. citizens' irises in the database.
The scanning is, however, ramping up, with plans announced to install 1,500 iris-scanning orb stations around the world. That announcement, made alongside the release of photographs showing long queues of volunteers in Japan, has piqued public and media interest.
Predictably, the project’s increased media exposure has also led to worries and headlines announcing the arrival of a coming dystopia devoid of personal privacy.
How does Worldcoin’s World ID work?
Volunteers have their faces and irises scanned, which involves staring into the Worldcoin orb for about 10 seconds. A beep confirms that the necessary biometrics have been banked. A unique number is then attached to each individual iris scan, and a cross-checking process prevents duplications resulting from volunteers going through the process twice – presumably in an attempt to double up on the cryptocurrency handout.
Once it has been confirmed an iris has been scanned for the first time, volunteers are issued with 25 Worldcoin tokens, currently worth about $2, or roughly £1.56.
Why is the Worldcoin project so controversial?
Historically, any attempt to introduce a new form of personal identification provokes concerns. They typically range from a healthy questioning of the trade-off between practicality and increased state control to violent outrage at what is seen as an attempt to curtail personal freedoms.
Because biometrics are a relatively new development, made possible by advances in technology, they tend to create a greater degree of suspicion and stronger backlash.
When they are introduced by democratically elected governments to reduce friction within an existing system, as with the biometric data added to international passports some years ago, the pushback is typically less vociferous and confined to the more passionate libertarians.
When authoritarian governments use biometrics, the technology is usually seen as a tool for increased suppression and individual control.
Worldcoin's initiative is further complicated by the fact that it is, at this stage, a privately funded enterprise. It has been developed in partnership with Tools for Humanity, a Silicon Valley tech company whose mission statement is to help create a "more just economic system".
The initial aim of creating a more advanced system to verify humanity in the dawning age of AI is relatively uncontroversial. Nobody likes ticking boxes with streetlights on a screen, and sometimes getting it wrong, to prove they are not a bot.
The fact that the initiative’s specialised hardware is designed to protect privacy and is generally regarded as doing a good job of it is also comforting.
The worry is that the nature of the technology carries inherent privacy risks: those in control of it could become less benign, or it could fall into the wrong hands or be compromised.
There is also the more general concern that increasingly sophisticated biometric technology will one day give governments and law enforcement agencies a degree of control over compliance beyond anything seen before, potentially jeopardising the ability of societies to protest laws or push back against governmental overreach.
Were Worldcoin to indeed one day become a global cryptocurrency enabling systems such as a universal income, the worry would be that the company behind it would have too much power.
Altman himself projects a blasé attitude to his latest endeavour. He shrugs off those who would accuse him of masterminding the future enslavement of humanity with tweets like:
“If I were trying to be a supervillain I would come up with a much better plan than iris scanning”.
“Maybe it works out and maybe it doesn’t, but trying stuff like this is how progress happens”.
Most of those who have so far signed up to have their irises scanned by the Worldcoin orb seem most incentivised by the hope of getting in on the ground floor of a new cryptocurrency. They aren't likely to be people who spend much time worrying about their privacy being compromised by big tech for commercial purposes, or by governments looking for greater control.
Altman is right: maybe Worldcoin's initiative works and maybe it doesn't, at every level of its stated ambitions, from making it easier to prove our humanity online to building a more just economic system.
But if it does look like it might work and gains the traction needed, with millions or even billions of eyeballs scanned, there should be safeguards. Seemingly noble visions that entail inherent risks don't always lead to the intended consequences, or can become corrupted along the way. History teaches us that much.