Study outlines privacy risks in metaverse virtual worlds • The Register

More thought – or at least some thought – needs to be given to privacy in the promised metaverse of connected 3D virtual-reality worlds, experts have concluded.

In a paper distributed via ArXiv, entitled "Exploring the Unprecedented Privacy Risks of the Metaverse", boffins from UC Berkeley in the United States and the Technical University of Munich in Germany built a virtual reality (VR) "escape room" game to better understand how much data a potential attacker could access.

Through a study of 30 people using the VR game, the researchers – Vivek Nair (UCB), Gonzalo Munilla Garrido (TUM), and Dawn Song (UCB) – created a framework for assessing and analyzing potential privacy threats. They identified more than 25 examples of private data attributes available to would-be attackers, some of which would be difficult or impossible to obtain from traditional mobile or web applications.

The wealth of information available through augmented reality (AR) and VR hardware and software has been known for years. For example, a 2012 article in New Scientist described Ingress, an AR game from Google spin-off Niantic Labs, as "a goldmine of data." That's why data-monetization companies like Meta are willing to invest billions to make the market for AR/VR hardware and applications more than just a sideshow for tech enthusiasts.

Similarly, issues of trust and safety in online social interaction have plagued online services since the days of dial-up modems and message boards, before web browsers even existed. And now that Apple, Google, Microsoft, Meta, and other players see a chance to remake Second Life under their own access controls, consulting firms are once again reminding clients that privacy will be an issue.

"Advanced technologies, especially in VR headsets and smart glasses, will track behavioral and biometric information at a record scale," Everest Group said in its recent report, "Taming the Hydra: Trust and Safety in the Metaverse".

“Currently, digital technologies can capture data regarding facial expressions, hand movements, and gestures. Therefore, personal and sensitive information that will traverse the metaverse in the future will include real-world information about user habits and physiological characteristics.”

Not only is privacy an unresolved metaverse issue, but hardware security also leaves something to be desired. A recent related study of AR/VR hardware, "Security and Privacy Evaluation of Popular Augmented and Virtual Reality Technologies," found vendor websites riddled with potential security vulnerabilities, hardware and software lacking multi-factor authentication, and privacy policies that were obtuse.

The escape room study lists specific data points available to attackers of various types – hardware adversaries, clients, servers, and users. It should be noted that "attacker," as defined by the researchers, encompasses not only external threat actors but also other participants and the companies that run the show.

Potential data points identified by the researchers include: geospatial telemetry (height, arm length, interpupillary distance, and room dimensions); device specifications (refresh rate, tracking rate, resolution, field of view, GPU, and CPU); network characteristics (bandwidth, proximity); and behavioral observations (language, handedness, voice, reaction time, near vision, distance vision, color vision, cognitive acuity, and physical fitness).

From these measurements, various inferences can be made about a VR participant’s gender, wealth, ethnicity, age, and disabilities.
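As a hedged illustration of how easily such inferences follow from raw telemetry – the thresholds and function below are invented for this sketch, not taken from the paper – combining just two measurements already yields a coarse demographic guess:

```python
# Illustrative sketch only: the heuristic and thresholds here are
# hypothetical, not the researchers' actual inference models.

def guess_adult(height_cm: float, interpupillary_mm: float) -> bool:
    """Crude, hypothetical heuristic: tall users with an adult-range
    interpupillary distance (roughly 55-75 mm) are guessed to be adults."""
    return height_cm > 150 and 55 <= interpupillary_mm <= 75

print(guess_adult(178.0, 63.0))  # a plausible adult profile
print(guess_adult(120.0, 50.0))  # a plausible child profile
```

Real attacks described in the paper use statistical models rather than fixed thresholds, but the principle is the same: each telemetry value narrows the set of people a user could plausibly be.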

"The alarming precision and stealth of these attacks, and the push by data-hungry companies towards metaverse technologies, indicate that data collection and inference practices in VR environments will soon become more ubiquitous in our daily lives," the paper concludes.

"We want to start by saying that these 'attacks' are theoretical and we have no evidence that anyone is actually using them currently, although it would be quite difficult to know if they were," Nair and Munilla Garrido wrote in an email to The Register. "Also, we use 'attacks' as a technical term, but in reality, if this data collection were to be deployed, the consent would likely be buried in an agreement somewhere and would in theory be entirely above board."

However, the two researchers say there's reason to believe that companies investing in the metaverse do so at least in part in the hope that advertising will later offset losses like the $12.5 billion spent by Meta's Reality Labs group last year to earn a mere $2.3 billion in revenue.

"Now, assuming a company of this size knows how to calculate a BOM, this loss-making approach must be a strategic move that they believe will eventually pay for itself," Nair and Munilla Garrido explained. "And if we look at who these companies are and what revenue methods they have already perfected, we suspect it will be at least somewhat tempting to deploy those same methods to recoup material losses. But again, this is speculative.

"All of our research shows that if a company wanted to engage in data collection, it could get a lot more information about users in VR than it could from mobile apps, for example, and that pivoting to VR would make perfect sense in this context."

When asked whether existing privacy rules adequately address metaverse data collection, the two eggheads said they think so – unless those rules are written to apply only to mobile apps.

"But we have a unique challenge when it comes to metaverse apps, in that there's a plausible reason to send that data to central servers," they explained. "Basically, metaverse apps work by tracking all of your bodily movements and streaming all of that data to a server so that a representation of yourself can be rendered for other users around the world.

"So, for example, while a company would be hard-pressed to claim that tracking your movements is necessary for its mobile app, it's actually an integral part of the metaverse experience! And at that point, it's a lot easier to argue that logs of this data should be stored for troubleshooting and so on. So in theory, even if the same privacy laws apply, they could be interpreted in radically different ways because the platforms' basic data needs are so different."

Nair and Munilla Garrido acknowledged that some of the approximately 25 collectible attributes they identified in their research can be obtained through cellphones or other online interactions. But metaverse apps are a one-stop shop for data.

“We have a situation where all of these categories of information can be collected at the same time, within minutes,” they explained.

“And because you have to combine multiple attributes to make inferences (e.g. height and voice to infer gender), having all of these data collection methods in the same place at the same time is what makes virtual reality a unique risk in terms of being able to infer attributes from user data with high accuracy.”

The sheer volume of information available through the metaverse is enough to de-anonymize any VR user, they claimed – something they argue is not true of conventional apps or websites.
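A minimal sketch of why combining attributes de-anonymizes users – the profile data is made up, and nearest-neighbor matching stands in for the statistical identification models the researchers actually describe:

```python
import math

# Hypothetical enrolled profiles: (height_cm, arm_length_cm, reaction_ms).
# Each attribute alone is shared by many people; the combination is
# close to unique, so a fresh session can be matched back to a person.
known_users = {
    "alice": (165.0, 70.0, 210.0),
    "bob":   (183.0, 79.0, 250.0),
    "carol": (165.0, 71.0, 300.0),
}

def identify(sample):
    """Return the known user whose attribute vector is nearest the sample."""
    return min(known_users, key=lambda u: math.dist(known_users[u], sample))

# Slightly noisy telemetry from a new session still matches the same person,
# even though carol shares alice's height almost exactly.
print(identify((164.2, 70.4, 215.0)))  # → alice
```

The more attributes a platform can observe in one place, the smaller the anonymity set each combination leaves.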

The purpose of their paper, they told The Register, is to shed light on the widespread privacy risks of augmented and virtual reality, and to encourage other researchers to seek solutions.

Screenshot of MetaGuard in the VR world

They already have one in mind: a plugin for the Unity game engine called MetaGuard. The name clearly indicates the source of the privacy threat.

“Think of it as ‘incognito mode for VR,'” wrote Nair and Munilla Garrido. “It works by adding noise, using a statistical technique known as differential privacy, to certain VR tracking metrics so that they are no longer accurate enough to identify users, but without significant impact on the user experience. Like incognito mode in browsers, it’s something users could turn on and off and adjust to their liking depending on the environment and their level of trust.”
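As a rough sketch of the differential-privacy idea the researchers describe – Laplace noise added to a tracking metric – with parameters that are illustrative only, not MetaGuard's actual settings:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def privatize_height(true_height_cm: float, epsilon: float = 1.0,
                     sensitivity_cm: float = 50.0) -> float:
    """Report height with Laplace noise scaled to sensitivity/epsilon.
    Smaller epsilon means more noise and stronger privacy; the epsilon
    and sensitivity values here are hypothetical."""
    return true_height_cm + laplace_noise(sensitivity_cm / epsilon)

# The reported height is plausible but no longer a reliable identifier.
reported = privatize_height(175.0)
```

Tuning epsilon is exactly the "adjust to their liking" knob the researchers mention: users trade tracking fidelity against identifiability per environment.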

Hopefully, metaverse privacy will be as simple as that. ®