Worldcoin is a dystopian Silicon Valley fever dream peddled by venture capitalist Sam Altman, and despite fraudulent claims made on social media, this crypto scheme is not egalitarian.
Altman — former Y Combinator president and co-founder of AI moonshot OpenAI alongside Elon Musk — wants to buy scans of your eyeballs for a small sum of cryptocurrency.
Worldcoin pretends this transaction is a novel way of distributing “sovereign money” (an Ethereum-bound ERC-20 token) to folks everywhere.
“A new, collectively owned global currency that will be distributed fairly to as many people as possible,” reads the website.
Think Silicon Valley-issued Universal Basic Income (UBI), but one that enables crypto’s richest investors to milk cash from every eyeball that signs up.
Worldcoin has raised $25 million from a raft of backers and is now supposedly valued at $1 billion.
Token projects often reward crypto users for doxing themselves via Know-Your-Customer measures, which collect identity information. These systems are regularly gamed by bots.
So, Worldcoin’s Theranos-esque eye scanner, “the Orb,” converts photos of irises into unique identifiers.
Worldcoin’s pledge for a fair launch is tied to the Orb — you can only scan your eyeballs once, so there’s meant to be no hope of Sybil attacks or repeatedly selling biometric data for more crypto.
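In other words, the whole fair-launch pitch reduces to deduplication: derive an identifier from each iris scan and reject any identifier that has already claimed tokens. Worldcoin hasn't published how the Orb actually does this, so the hashing and registry below are assumptions — a minimal sketch of the mechanic, not the real pipeline:

```python
import hashlib


class IrisRegistry:
    """Toy Sybil-resistance check: one token claim per unique iris code."""

    def __init__(self):
        self._seen: set[str] = set()

    def register(self, iris_code: bytes) -> bool:
        """Return True on a first-time signup, False for a repeat scan."""
        digest = hashlib.sha256(iris_code).hexdigest()
        if digest in self._seen:
            return False  # this iris already claimed its allocation
        self._seen.add(digest)
        return True


registry = IrisRegistry()
assert registry.register(b"alice-iris-template") is True
assert registry.register(b"alice-iris-template") is False  # second scan rejected
assert registry.register(b"bob-iris-template") is True
```

The catch, of course, is that real iris scans never produce byte-identical templates twice — which is exactly why Worldcoin says it needs mountains of biometric training data to make the matching work.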
Speaking to CoinDesk, co-founder Alex Blania claimed the firm plans to produce 4,000 Orbs per month starting in November, to distribute to the budding Scientology-style crew it's recruiting.
A manufacturer in Germany (which Blania wouldn’t name) will purportedly make 50,000 per year, intended for Worldcoin’s “Orb Operators.” The firm isn’t Chinese, he said.
Extract value from biometric data, then delete it
Worldcoin promises that its soul-stealer eyeball scanner will delete the scans once it assigns private keys and cryptocurrency to each person.
This is meant to instil trust that Altman’s Worldcoin isn’t just one big VC-funded, neo-colonialist ruse to mine biometric data for profit.
“It has become standard procedure for many tech companies today to collect as much personal data as possible. This is largely because they rely on advertising revenue to generate profit for their investors, and that ad revenue hinges on using customer data to more effectively target customers with ads,” says Worldcoin.
“While many products and services on the internet are ‘free,’ the collection and monetization of personal customer data is rarely transparent.”
Worldcoin then explains its ringleaders designed the project to require “as little personal data as possible,” preserving the privacy and anonymity of its users.
It doesn’t collect names, email addresses, physical addresses, or phone numbers — so you should feel safe.
Taking a look at Worldcoin’s website, the project’s privacy-protecting claims quickly unravel.
“During our field testing phase, we are collecting and securely storing a lot more data than we will upon its completion,” says Worldcoin. “We use this data to train and optimize our fraud-detection algorithms.”
As Worldcoin hones its eyeball-identification tech, the company will collect “images of users’ body, face, and eyes, including users’ irises (visible, near infrared and far infrared spectrum).”
“Without it, we wouldn’t be able to fairly and inclusively give a share of Worldcoin to everyone on Earth.”
Worldcoin says it “can’t wait to stop collecting” personal data, and wants to emphasize “it will never be our business to sell your personal data.”
The firm also collects a 3D mapping of participants’ bodies and faces.
“The data collected from field tests is encrypted and temporarily stored on the Orb. It is then uploaded through secure, encrypted communication channels and saved to globally distributed secure data stores, where it is encrypted at rest,” says Worldcoin.
“Upon upload, the data is permanently deleted from the Orb.”
Worldcoin needs your eyeballs (and the rest of you)
The project is leveraging data to train neural networks to recognize human irises, remove biases, and build a “fraud-detection system” that can sort real human beings from fake ones.
This brings Worldcoin’s apparent plot into focus:
- offer newcomers in developing nations a small amount of crypto for their biometric data,
- leverage it to produce a facial recognition system to rival Apple’s Face ID,
- delete the data and profit from their technology elsewhere.
Any participant can revoke their consent at any time by sending an email to Worldcoin or a letter to “Tools For Humanity Corporation” in San Francisco.
Worldcoin also promises to delete all the biometric data collected in field tests "once the algorithm is trained," although it gives no timeline for when that may be.
It adds that users will be able to opt in to feeding their biometric data to the company after the field-testing phase.
Indeed, Altman may pretend that he isn’t leveraging biometric data (far more personal than any email or postal address) for profit.
But Worldcoin abstracts that dynamic away in a style that has become synonymous with our Silicon Valley overlords' convenience: we'll just keep your data until we can profit from it without storing it on our servers.
“We can’t wait to finish field testing and stop collecting personal data. We really can’t. But now you understand why we have to do it right now: it’s the only way that the Worldcoin project will work,” says Worldcoin.
While its secret plan to monetize is surprisingly clear on its website, the tokenomics of its ERC-20 cryptocurrency are not.
Opaque token sales aren’t egalitarian
Unlike most reputable crypto projects, Worldcoin has no formal white paper and no precise guide to how the token will be distributed — aside from a 10 billion supply limit, with 2 billion (20%) kept by the project to fund its "development."
There is, however, mention of a bonding curve-style distribution mechanic in which early adopters receive more cryptocurrency, as do Orb Operators who manage to lure in participants — with rewards reportedly ranging from $10 to $200.
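A bonding curve-style reward just means the payout per signup shrinks as more people join. Worldcoin has published none of its parameters, so the curve shape and dollar figures below are purely illustrative, chosen only to match the reported $10–$200 range:

```python
def signup_reward(n_signups: int, base: float = 200.0, floor: float = 10.0,
                  decay: float = 0.9999) -> float:
    """Illustrative bonding-curve payout: exponential decay from a $200-equivalent
    reward toward a $10 floor as the count of prior signups grows."""
    return max(floor, base * decay ** n_signups)


# Early adopters get more than latecomers
assert signup_reward(0) == 200.0
assert signup_reward(10_000) < signup_reward(100)
assert signup_reward(10_000_000) == 10.0  # long since decayed to the floor
```

Whatever the real curve looks like, the structural point stands: the earlier you scan, the more you get — a dynamic that rewards insiders and operators over the global population the project claims to serve.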
Worldcoin will be bound to the Ethereum network, which is notorious for high transaction fees that could swallow the value of the tokens almost completely.
Not to mention, Worldcoin weirdly won't pay for biometric data in one lump sum. Instead, it will drip-feed the amount over two years — starting with 10% when the Orb generates a wallet address tied to the user's irises.
This means if Worldcoin fails shortly after it hypothetically finds an exchange on which to trade (crashing its price), participants could receive practically nothing for supporting Altman’s strange gambit.
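To see why the drip-feed matters, work the arithmetic. Worldcoin hasn't detailed the release schedule beyond the 10% upfront figure, so assume (purely for illustration) that the remaining 90% vests in equal monthly installments over 24 months, each valued at that month's token price:

```python
def value_received(allocation: float, prices: list[float]) -> float:
    """USD value of a hypothetical two-year payout: 10% at month 0, the
    remaining 90% in 24 equal monthly installments, each valued at that
    month's price. prices[0..24] are hypothetical token prices; the linear
    schedule is an assumption, not Worldcoin's published terms."""
    upfront = 0.10 * allocation
    monthly = 0.90 * allocation / 24
    total = upfront * prices[0]
    for month in range(1, 25):
        total += monthly * prices[month]
    return total


# Token launches at $1, then crashes to $0.01 after month 2
prices = [1.0, 1.0, 1.0] + [0.01] * 22
print(value_received(100.0, prices))  # roughly $18 of a nominal $100
```

Under these assumptions, a participant promised $100 worth of tokens walks away with roughly $18 — most of the allocation pays out at the crashed price.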
Still, Worldcoin's cap table is a who's-who of tech and crypto venture capitalists.
There are no details of how much of Worldcoin's supply they'll receive, or of any vesting schedules.
So, it may appear that 80% of the supply is intended for the general population, but it could very well end up far less — again invalidating the project’s “egalitarian” fair launch.
All this renders Worldcoin an opaque, disingenuous, and shameless profit-seeking exercise that targets disadvantaged global populations, falsely touted through a second-rate crypto dream of solving inequality with underdeveloped facial recognition tech.