Privacy is going to suck in the Metaverse
In 2012 an article in New Scientist described Ingress, an AR game from Google spin-off Niantic Labs, as "a data gold mine." Ingress is a mobile app tied to the real world through GPS. In the game, "exotic matter" (XM) is collected as you explore your town and lets you take control of a "portal", which you can then link with two other portals to form a triangle. Your side now "owns" that territory.
A detailed history of where all the players wander and the establishments they visit en route is collected. This location data is a gold mine for Google, which knows who you are (you created an account to play), where you go (and when), who your friends are, and more. That was way back in 2012 - imagine the possibilities today.
Meta, arguably the world's largest and most successful data monetization firm, has gone all in. In late 2021 Facebook rebranded itself "Meta". The name change heralds the reorganization of the Facebook empire around a single concept: the metaverse.
Over the course of 2021, Meta's Reality Labs division posted a $10.2 billion loss, earning "only" $2.3 billion in revenue against $12.5 billion in spending. Let that sink in - Meta spent $12.5 billion building the metaverse in a single year. Note, too, that the metaverse is already a multi-billion-dollar business.
How much of that $12.5 billion was spent on your privacy? Today, "experiences" in the metaverse already replace advertising by letting users wear virtual clothes or test-drive virtual cars. All the data - who wore what, who saw what, who drove what, and for how long - is recorded and will be shared or sold by default.
To give you a sense of how valuable your data is to Meta, consider what happened when Apple let iPhone users turn off tracking, reducing Meta's ability to profit from you:
- Revenue fell for the second consecutive quarter, and quarterly profits fell with it; Meta had never before posted a down quarter.
- Meta's market cap dropped from $1.1 trillion to $450 billion in 10 months.
- Metaverse real estate prices are down 80% in six months.
In addition, regulators and governments are taking an increasing interest in Meta's business practices, drawn by the persistent odor of its problems with privacy, toxic content, and misinformation.
As a reminder, in 2019 Meta (then still Facebook) was fined a historic $5 billion for privacy violations related to the Cambridge Analytica scandal. That penalty was the largest ever imposed on any company for violating consumers' privacy, almost 20 times greater than the largest privacy or data-security penalty ever imposed worldwide, and one of the largest penalties ever assessed by the U.S. government for any violation. Yet that figure is less than half of what Meta is willing to spend in a single year building the metaverse.
Invasive data collection versus consumer trust
Is Meta's metaverse a place where you will be safe, secure, and private? Or will Meta collect all the data it can, correlate it with everything else it knows about you, and sell it to anyone willing to pay? Will users trust Meta to do the right thing, or will they shun Meta's version of the metaverse for more privacy-friendly options (for example, if Apple created a similar but more private metaverse)?
"Advanced technologies, especially in VR headsets and smart glasses, will track behavioral and biometric information at a record scale."
— The Everest Group: "Taming the Hydra: Trust and Safety in the Metaverse".
It's not a stretch to expect that you will be tracked everywhere you go and that all your interactions will be logged in the Metaverse. Beyond that, given the immersive nature of VR and AR headsets, Meta could probe deeply personal, and invasive, parts of its customers' lives. Think of everything your smartwatch can measure, but more in-depth. It could be possible to track your breathing rate, for example, which might quicken when you see something interesting.
There might even be "people" (AI-driven non-player characters - AI "influencers") who approach players to talk about a product in a casual way when eye tracking or breathing rate implies interest. Think product placement and influencers on a meta-scale. Metaverse advertising could blur the line between data collection about you, advertising, and real or fake content.
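To make that concrete, here is a deliberately simplified, hypothetical sketch of how such a trigger might work. Everything in it - the field names, the thresholds, the idea of dispatching an NPC - is invented for illustration and does not describe any real Meta system.

```python
from dataclasses import dataclass

# Hypothetical biometric sample a headset could report each second.
@dataclass
class BiometricSample:
    gaze_target: str          # object id the eyes are fixated on
    gaze_dwell_s: float       # seconds the gaze has lingered on that object
    breaths_per_min: float    # estimated current breathing rate
    baseline_breaths: float   # user's resting breathing rate

def interest_detected(sample: BiometricSample,
                      dwell_threshold_s: float = 3.0,
                      breath_increase: float = 1.15) -> bool:
    """Flag 'interest' when the user stares at a product and their
    breathing rate rises noticeably above their own baseline."""
    stared = sample.gaze_dwell_s >= dwell_threshold_s
    excited = sample.breaths_per_min >= sample.baseline_breaths * breath_increase
    return stared and excited

def maybe_dispatch_influencer(sample: BiometricSample):
    """If interest is detected, return the product an AI 'influencer'
    NPC should casually bring up in conversation; otherwise None."""
    return sample.gaze_target if interest_detected(sample) else None

# Example: a user lingers on a virtual sports car and breathes faster.
sample = BiometricSample("sports_car_ad_42", gaze_dwell_s=4.2,
                         breaths_per_min=19.0, baseline_breaths=15.0)
print(maybe_dispatch_influencer(sample))  # -> "sports_car_ad_42"
```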
Or, a more frightening example - imagine a political candidate is giving a speech to millions of people in the Metaverse. While each viewer thinks they are seeing the same version of the candidate, in virtual reality they are actually each seeing a slightly different version. For each and every viewer, the candidate's face has been subtly modified to resemble the viewer.
This is done by blending features of each viewer's face into the candidate's face. The viewers are unaware of any manipulation of the image. Yet they are strongly influenced by it: Each member of the audience is more favorably disposed to the candidate.
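At its core, the manipulation described above is a weighted blend of two face images. Here is a minimal sketch, assuming the candidate's and each viewer's face images have already been landmark-aligned to the same size (the genuinely hard part in practice), with the blend weight kept small so the change stays subtle:

```python
import numpy as np

def blend_faces(candidate: np.ndarray, viewer: np.ndarray,
                weight: float = 0.2) -> np.ndarray:
    """Blend a small fraction of the viewer's (aligned) face into the
    candidate's face. 'weight' controls how much of the viewer is mixed
    in; small values keep the manipulation below conscious notice."""
    assert candidate.shape == viewer.shape, "images must be pre-aligned"
    blended = (1.0 - weight) * candidate.astype(float) + weight * viewer.astype(float)
    return blended.clip(0, 255).astype(np.uint8)

# Toy demo with random 64x64 RGB "faces"; a real system would align facial
# landmarks first and render a different blend for every single viewer.
rng = np.random.default_rng(0)
candidate = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
viewer = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
personalized = blend_faces(candidate, viewer)
```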
"In the metaverse, you may not be able to distinguish between authentic content you came upon in the world, augmented or virtual, and paid content that was injected specifically for you, personally targeted by an AI algorithm."
— Louis Rosenberg, CEO of Unanimous AI
AI is certainly going to be the backbone of the metaverse. However, current AI - the kind that lets Alexa predict behavior from environmental factors and customer preferences - will be far too simple for the metaverse. Imagine an AI in the Metaverse that befriends you simply to build out your psychological profile.
The potential data captured using VR:
- Geospatial Telemetry (Height, Arm Length, Interpupillary Distance, and Room Dimensions);
- Device Identification (Refresh Rate, Tracking Rate, Resolution, Device Field-of-View, GPU, and CPU);
- Network (Bandwidth, Proximity);
- Behavioral Observations (Languages, Handedness, Voice, Reaction Time, Close Vision, Distance Vision, Color Vision, Cognitive Acuity, Eye Tracking, Breathing Rate, and Fitness).
From these metrics, various inferences can also be made about a VR participant's gender, sexual orientation, wealth, ethnicity, age, and disabilities. In addition, from the data collected about you in the virtual world it is highly likely that Meta will be able to identify who you are in the real, physical world. In fact, Meta will know who you are at a deeper and more fundamental level than your friends and family do. That's ironic, as most people today have the impression that in virtual reality they will be free to "be whatever and whoever" they wish, separate from their real identities.
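To illustrate why re-identification is so plausible, here is a hypothetical sketch. The record fields and the idea of hashing the slow-changing measurements into a "body fingerprint" are assumptions made for illustration, not a description of any real system; the point is simply that these values barely change between sessions, no matter which avatar or account you use.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class VRTelemetry:
    # Geospatial telemetry
    height_cm: float
    arm_length_cm: float
    interpupillary_distance_mm: float
    room_dims_m: tuple          # (width, depth) of the play space
    # Device identification
    refresh_rate_hz: int
    resolution: str
    gpu: str
    # Behavioral observations
    handedness: str
    mean_reaction_time_ms: float

def body_fingerprint(t: VRTelemetry) -> str:
    """Hash the slow-changing physical and hardware traits. The same
    person tends to produce the same fingerprint in every session,
    regardless of the avatar or account they log in with."""
    stable_traits = (
        round(t.height_cm), round(t.arm_length_cm),
        round(t.interpupillary_distance_mm), t.refresh_rate_hz,
        t.resolution, t.gpu, t.handedness,
    )
    return hashlib.sha256(repr(stable_traits).encode()).hexdigest()[:16]

session = VRTelemetry(178.0, 74.0, 63.5, (3.0, 4.0),
                      120, "2064x2208", "RTX 3070", "right", 240.0)
print(body_fingerprint(session))  # stable across avatars and sessions
```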
Current data protection laws
Now is the time to start a dialog about the challenges of privacy and security in the metaverse. What data will you be able to ask Meta not to collect? How will Meta deal with new trust and safety challenges, such as the abuse of virtual avatars and the safety of virtual assets? Some people even wonder which legal system(s), if any, apply to the metaverse.
If Meta is collecting biometric information, do laws like HIPAA apply? In the UK, the Data Protection Act of 1998 was only superseded when GDPR took effect in 2018 - it took 20 years to upgrade to a more comprehensive set of regulations. Can regulators keep up?
Inside the US Communications Decency Act (CDA) of 1996 is one of the most valuable tools for protecting freedom of expression and innovation on the Internet: Section 230.
Section 230 shields internet platforms from liability for content that people post - user-generated content. In other words, online intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do. Does this protection extend from speech to "actions" a user might take in the Metaverse?
"If the Metaverse fell outside the scope of Section 230, it would force Congress to enact a whole new law better tailored to the realities of emerging online interactions."
In the metaverse, even establishing and tracking which entity is responsible for determining how and why personal data will be processed (in GDPR terms, the data controller), and who will process personal data on behalf of another (a data processor), will not be straightforward. It will likely involve untangling a web of relationships and connections, and there may not be an obvious answer.
So, there are a number of open questions:
- How should these entities each display their own privacy notice to users?
- Should this be done collaboratively?
- How and when should users' consent be collected?
- Who is responsible for data being lost or stolen?
- What data-sharing processes need to be put in place, and how will they be implemented?
Additional trust and security issues
Privacy is only one of many issues in the metaverse. Online harassment, trolling, and toxic behavior have been a problem on the Internet since the days of modems and bulletin boards. With new levels of realism and interaction in the metaverse, new controls and limits are required. Meta has implemented a "Personal Boundary" for avatars in its Horizon Worlds and Horizon Venues: avatars are limited in how closely they can interact, to prevent virtual molestation.
"Personal Boundary prevents avatars from coming within a set distance of each other, creating more personal space for people and making it easier to avoid unwanted interactions..."
— Vivek Sharma, VP of Meta's Horizon group
Personal Boundaries prevent other avatars from invading your avatar's personal space; there is no haptic feedback to simulate a collision - an encroaching avatar is simply stopped. Additional anti-harassment measures have been implemented, such as having an avatar's hands "disappear" if they encroach upon someone's personal space, and a "safe zone". What else will be needed?
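Mechanically, a personal boundary is little more than a per-frame distance check applied before movement is committed. A minimal sketch, with invented names and an illustrative radius (Meta described roughly a four-foot distance between avatars), might look like this:

```python
import math

BOUNDARY_RADIUS_M = 1.2  # illustrative value, roughly "four feet" between avatars

def enforce_personal_boundary(my_pos, other_pos, proposed_pos,
                              radius=BOUNDARY_RADIUS_M):
    """Halt another avatar's movement at the edge of my personal boundary.
    There is no haptic 'collision'; the approaching avatar simply stops."""
    dx, dy = proposed_pos[0] - my_pos[0], proposed_pos[1] - my_pos[1]
    dist = math.hypot(dx, dy)
    if dist >= radius:
        return proposed_pos            # movement stays outside the boundary
    if dist == 0:                      # degenerate case: keep them where they were
        return other_pos
    scale = radius / dist              # project the move back onto the boundary circle
    return (my_pos[0] + dx * scale, my_pos[1] + dy * scale)

# Another avatar tries to step within 0.5 m of me; they are held at 1.2 m.
print(enforce_personal_boundary((0.0, 0.0), (2.0, 0.0), (0.5, 0.0)))  # -> (1.2, 0.0)
```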
Meta will need to implement privacy options and risk-mitigation policies to make the metaverse a private and safe space for users. Do you trust them to do so?
References
- New Scientist: Why Google's Ingress game is a data gold mine
- Taming the Hydra: Trust and Safety (T&S) in the Metaverse
- Zuckerberg wants to create a make-believe world in which you can hide from all the damage Facebook has done
- Through the metaverse, and what can be found there
- The metaverse has a groping problem already
- FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook
- The Metaverse Real Estate Boom Turns Into a Bust
- Facebook Misinformation Is Bad Enough. The Metaverse Will Be Worse