A new report from academics at the University of Glasgow examines the profound ethical risks and challenges posed by unchecked augmented and mixed reality technology, along with strategies for mitigation.

Augmented and mixed reality (AR/MR) technology could, in the coming years, revolutionise how we perceive and interact with the world around us by overlaying digital content onto our view of the physical environment.

But the tech also raises profound ethical risks and challenges concerning privacy, information accuracy, identity, accessibility, individual autonomy, and well-being.

Without a proactive effort to shape AR/MR’s development, there is a risk of covert surveillance, misinformation bubbles, identity manipulation, and the erosion of user autonomy.

That’s the possibility outlined in a new policy report from academics at the University of Glasgow, launched by its Centre for the Study of Perceptual Experience.

The new Policy and Practice Recommendations for Augmented and Mixed Reality report therefore provides analysis and guidance for developers, industry, policymakers, and researchers looking to maximise AR/MR’s benefits while mitigating its dangers.

Professor Fiona Macpherson, the report’s lead investigator, said over 50 experts across academia, industry, and public policy contributed to it.

“Augmented reality and mixed reality are fast-moving domains. The use of these technologies will be increasingly widespread in coming years,” said Macpherson.

“Our policy report makes specific recommendations for developers, industry, policymakers and research bodies, to guide early intervention and shape the technological trajectory in a way that upholds the key values of privacy, accessibility, autonomy and well-being.”

The report identifies six central risk domains for AR/MR: privacy, information accuracy, identity representation, accessibility, autonomy, and well-being.

The recommendations for these domains include designing AR/MR interfaces so that users know when virtual content is being displayed and where it comes from (privacy), and educating users about the risks of identity manipulation (identity).

Speaking on the recommendations, Professor Ben Colburn, a political philosopher at the University of Glasgow, said: “We recommend design standards which mark out virtual from real objects, and control for users and third parties over the gathering and use of personal data and their digital identities.

“We also show that education is central to AR’s positive individual, social and economic potential: information about benefits and risks should be integrated into critical thinking curricula in schools, and into a campaign of digital literacy for adults, focusing on the novel privacy risks involved in familiar activities.”

The investigators for the report include Professor Fiona Macpherson, Professor Ben Colburn, Dr Derek Brown, and Professor Neil McDonnell. Laura Fearnley and Calum Hodgson served as research assistants.