Meta whistleblower warns Zuck’s metaverse could harm mental health


Facebook whistleblower Frances Haugen says her former employer (now going by the name Meta) hasn’t learned from its previous mistakes as it continues to develop its metaverse. 

In an interview with Politico, Haugen said: “[Meta has] made very grandiose promises about how there’s safety-by-design in the metaverse.”

“But if they don’t commit to transparency and access and other accountability measures, I can imagine just seeing a repeat of all the harms you currently see on Facebook,” she added.

Last year Haugen leaked thousands of internal documents claiming (among other things) that:

  • Facebook was aware of its platforms’ negative effect on the mental health of teenage users,
  • its products incited ethnic violence,
  • and it was failing miserably to battle misinformation.

For the record, the Menlo Park-headquartered firm claims it’s working hard to address these concerns. It says it is spending $50 million on input from external groups to consider privacy, security, and responsible design.

However, to put that figure into perspective, Meta spent $10 billion on metaverse development last year alone.

Metaverse v.1 is a bit shit 

In their current state, the many different metaverses (metaversi?) offer very little in the way of functionality.

On a visit to the largely deserted play-to-earn Decentraland, Protos discovered there was actually very little to do.

There is a belief, at least at Meta HQ, that the virtual world will become as ubiquitous as social media. Zuckerberg has big plans to improve on Meta’s current offering of Oculus goggles and legless avatars.

However, Meta’s virtual world will require the collection of huge amounts of sensitive data and full-body tracking.

Read more: OPINION: The Metaverse and ‘Web 3’ aren’t even here and they’re already cringe

“I’m super concerned about how many sensors are involved,” Haugen told Politico.

“When we do the metaverse, we have to put lots more microphones from Facebook; lots more other kinds of sensors into our homes.”

Politico notes an Electronic Frontier Foundation report which warns that the metaverse “poses substantial risks to human rights.”

In September’s “Building the Metaverse Responsibly” blog post, Meta execs Andrew Bosworth and Nick Clegg claim human and civil rights are at the top of the firm’s agenda.

Facebook whistleblower lost a friend to harmful online content

Haugen became instantly famous after leaking tens of thousands of internal Facebook documents, which left the social media behemoth red-faced.

She told the Wall Street Journal (WSJ) that she decided to expose her former employer because she had lost a friend to conspiracy theories fuelled by online misinformation.

It’s worth noting that chief exec Mark Zuckerberg responded to the WSJ’s Facebook Files investigation, arguing in a Facebook post that many of the claims don’t make sense.

In an interview last year Haugen claimed she blew the whistle because she loves Facebook and wants to save it.

Read more: Citi says metaverse could be worth up to $13T, predicts 5B users

“If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us?” Zuckerberg wrote.

Facebook hired Haugen in 2019 as part of its Civic Integrity team.

The department was tasked with guarding the platform against misinformation, particularly concerning elections around the world.
