A report from the Future of Privacy Forum (“FPF”) sets out recommendations to tackle the privacy risks associated with the use of augmented reality (“AR”) and virtual reality (“VR”) technologies, which are increasingly used in education and training, gaming, multimedia, navigation and communication.
Newly emerging use cases mean that individuals will be able to explore a shared digital overlay of the physical world in real time. However, these technologies, collectively referred to as extended reality (“XR”), also accumulate and process vast amounts of sensitive personal data, including biometric data, unique device identifiers, location data and information about homes and businesses. As with AI and 5G, this creates a risk to data subjects that could undermine the adoption of AR and VR platforms; yet without this data, XR technologies cannot function.
The FPF is a think-tank that brings together academics, consumer advocates and industry to explore the challenges posed by technology and to develop privacy protections, ethical norms and workable business practices.
The purpose of the FPF report is to consider current and future use cases for XR technologies and to provide recommendations for industry through the adoption of privacy guidelines. This includes advice for policymakers on how data protection law obligations should apply to the collection of personal data by XR technologies, for example how hardware manufacturers can maintain transparency in their data collection, use and sharing, and how developers can process data locally.
The report recommends that policymakers should carefully consider how existing or proposed data protection laws can provide consumers with meaningful rights and companies with clear obligations regarding XR data, while hardware makers should consider how XR data collection, use and sharing can be performed in ways that are transparent to users, bystanders and other stakeholders.
XR developers should also consider the extent to which sensitive personal data can be processed locally and kept on-device, and should ensure that such data is encrypted in transit and at rest.
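By way of illustration, the following is a minimal sketch of encrypting sensitive XR telemetry at rest on-device, written in Python using the Fernet recipe from the widely used cryptography library. The gaze-tracking example and the function names are hypothetical and are not drawn from the FPF report.

```python
# Minimal sketch: encrypt sensitive XR telemetry before it is
# persisted ("at rest"). Names such as save_gaze_sample are
# hypothetical, for illustration only.
import json
from cryptography.fernet import Fernet

# In practice the key would come from a secure, hardware-backed
# key store on the device, never generated and held in code.
key = Fernet.generate_key()
cipher = Fernet(key)

def save_gaze_sample(path: str, sample: dict) -> None:
    """Serialise a gaze-tracking sample and write it encrypted."""
    plaintext = json.dumps(sample).encode("utf-8")
    with open(path, "wb") as f:
        f.write(cipher.encrypt(plaintext))

def load_gaze_sample(path: str) -> dict:
    """Read and decrypt a previously stored sample."""
    with open(path, "rb") as f:
        return json.loads(cipher.decrypt(f.read()))

# Usage: the raw sample never touches disk unencrypted.
save_gaze_sample("gaze.bin", {"t": 0.016, "x": 0.41, "y": 0.57})
print(load_gaze_sample("gaze.bin"))
```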
The report urges platforms and XR experience providers to implement rules about virtual identity and property that mitigate, rather than increase, online harassment, digital vandalism and fraud, and to establish clear guidelines that mitigate physical risks to XR users and bystanders.
Researchers should obtain informed consent before conducting research via XR technologies and, where obtaining consent is impractical, should consider seeking review by an institutional or ethical review board.
Providers should offer a wide range of customisable avatar features that reflect the broader community, encouraging representation and inclusion. They should also consult the larger community of stakeholders and integrate their feedback into decisions about software and hardware design and about data collection, use and sharing.
Many of these recommendations are aimed at industry, and the considerations are best reflected in XR technology privacy policies and in the corresponding end-user licence agreements (“EULAs”).
The difficulty with attempting to retrofit existing policies and agreements for this industry is that they risk failing to reflect how these technologies actually operate. The usual box-ticking method of agreeing to EULAs and privacy policies, common with software and websites, may not translate so easily to XR.
The extensive personal data collected by XR technologies can create more immersive experiences, but it can also exacerbate privacy risks. The unique nature of these technologies makes it difficult to mitigate those risks by applying existing privacy policies and practices from other digital media sectors, and new, innovative approaches to choice, security and transparency are required. For example, VR headsets can capture large amounts of personal data such as dexterity, ease of movement and reaction times, from which a health profile could gradually be built; eye-tracking technology is far more intrusive than cookies.
VR manufacturers need to ensure that privacy is protected and that data is processed and stored securely, and only where they have express consent to do so. They will also need to consider, where a device’s processing of medical data assists with diagnosis or treatment, whether medical device authorisation is necessary and whether the privacy risks arising from research could be mitigated by data anonymisation. The FPF’s recommendations aim to provide a way for industry and policymakers to tackle these issues without compromising the benefits provided by XR technologies. The report appears to recognise that XR technologies are ever-evolving and seeks to address this by focussing on actual harms tied to user data.
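On the anonymisation point, the sketch below shows one common de-identification approach in Python: dropping direct identifiers, replacing the user ID with a salted hash, and coarsening quasi-identifiers. Note that salted hashing is strictly pseudonymisation under the GDPR rather than full anonymisation, and the field names here are hypothetical.

```python
# Minimal sketch of de-identifying XR research records before
# sharing. Field names are hypothetical, for illustration only.
import hashlib
import secrets

SALT = secrets.token_bytes(16)  # kept secret, never shared with the data

def pseudonymise_id(user_id: str) -> str:
    """Replace a direct identifier with a salted hash (pseudonymisation)."""
    return hashlib.sha256(SALT + user_id.encode("utf-8")).hexdigest()[:16]

def de_identify(record: dict) -> dict:
    """Strip direct identifiers and coarsen quasi-identifiers."""
    return {
        "pid": pseudonymise_id(record["user_id"]),
        "age_band": f"{(record['age'] // 10) * 10}s",    # e.g. 34 -> "30s"
        "reaction_ms": round(record["reaction_ms"], -1), # reduce precision
    }

raw = {"user_id": "alice@example.com", "age": 34, "reaction_ms": 412.7}
print(de_identify(raw))  # no direct identifiers remain in the output
```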
The desired outcome is clearly for policymakers to foster an innovation-friendly regulatory environment by clarifying and harmonising existing rules and by introducing industry-standard recommendations specifically tailored to the XR industry. The best way to achieve this is for industry and policymakers to get behind recommendations such as those provided by the FPF and to consider how best to implement them within existing practices.
Author details: Rayyan Mughal, Associate with Marks & Clerk, the largest firm of intellectual property advisers in the UK