Exploring the Privacy Risks of Adversarial VR Game Design

Authors: Vivek Nair (UC Berkeley), Gonzalo Munilla Garrido (TU Munich), Dawn Song (UC Berkeley), James O'Brien (UC Berkeley)

Volume: 2023
Issue: 4
Pages: 238–256
DOI: https://doi.org/10.56553/popets-2023-0108

Abstract: Fifty study participants playtested an innocent-looking "escape room" game in virtual reality (VR). Within just a few minutes, an adversarial program had accurately inferred over 25 of their personal data attributes, from anthropometrics like height and wingspan to demographics like age and gender. As notoriously data-hungry companies become increasingly involved in VR development, this experimental scenario may soon represent a typical VR user experience. Since the Cambridge Analytica scandal of 2018, adversarially designed gamified elements have been known to constitute a significant privacy threat in conventional social platforms. In this work, we present a case study of how metaverse environments can similarly be adversarially constructed to covertly infer dozens of personal data attributes from seemingly anonymous users. While existing VR privacy research largely focuses on passive observation, we argue that because individuals subconsciously reveal personal information via their motion in response to specific stimuli, active attacks pose an outsized risk in VR environments.

Keywords: virtual reality, metaverse, data harvesting, privacy, anonymity

Copyright in PoPETs articles is held by their authors. This article is published under a Creative Commons Attribution 4.0 license.