Augmented Reality Must Have Augmented Privacy – EFF

Imagine walking down the street, looking for a good cup of coffee. In the distance, a storefront glows in green through your smart glasses, indicating a well-reviewed cafe with a sterling public health score. You follow the holographic arrows to the crosswalk, as your wearables silently signal the self-driving cars to be sure they stop for your right of way. In the crowd ahead you recognize someone, but can't quite place them. A query and response later, "Cameron" pops above their head, along with the context needed to remember they were a classmate from university. You greet them, each of you glad to avoid the awkwardness of not recalling an acquaintance.

This is the stuff of science fiction: sometimes utopian, but more often a warning against dystopia. Lurking in every gadget that can enhance your life is a danger to privacy and security. In either case, augmented reality is coming closer to being an everyday reality.

In 2013, Google Glass stirred a backlash, but the promise of augmented reality bringing 3D models and computer interfaces into the physical world (while recording everything in the process) is re-emerging, and so is the public outcry over privacy and always-on recording. In the seven years since, companies have continued to push for augmented reality glasses, which will display digital images and data that people can view through their lenses. The Chinese company Nreal, as well as Facebook and Apple, are experimenting with similar technology.

Digitizing the World in 3D

Several technologies are converging to create a live map of different parts of our world, from Augmented and Virtual Reality to autonomous vehicles. They are creating machine-readable, 1:1 scale models of the world that are continuously updated in real time. Some implement such models through point clouds, datasets of points captured by a scanner to recreate the surfaces (not the interior) of objects or of a space. Each point has three coordinates that position it in space. To make sense of the millions (or billions) of points, software with Machine Learning can help recognize the objects in the point clouds, which end up looking exactly like a digital replica of the world, or a map of your house and everything inside it.
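
To make the idea concrete, here is a minimal sketch, in Python with NumPy, of how a point cloud is commonly represented: an array of x, y, z coordinates sampled from surfaces, from which software can begin to recover the geometry of a scanned space. The values are invented for illustration and do not reflect any particular vendor's format.

```python
import numpy as np

# A point cloud is a set of surface samples, each with x, y, z coordinates
# (illustrative values, in meters).
points = np.array([
    [0.10, 0.02, 1.50],   # a point on a tabletop
    [0.12, 0.02, 1.52],
    [0.95, 0.88, 1.49],   # a point on a chair back
    [0.97, 0.85, 1.51],
])

# Even before any machine learning is applied, simple geometry starts to
# recover structure: the bounding box and centroid of the scanned surfaces.
print("bounding box:", points.min(axis=0), "to", points.max(axis=0))
print("centroid:", points.mean(axis=0))
```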

The promise of creating a persistent 3D digital clone of the world, aligned with real-world coordinates, goes by many names: the world's "digital twin," a "parallel digital universe," the "Mirrorworld," "The Spatial Web," the "Magic Verse," or a "Metaverse." Whatever you call it, this new parallel digital world will introduce a new world of privacy concerns, even for those who choose never to wear it. For instance, Facebook Live Maps will seek to create a shared virtual map. LiveMaps will rely on users' crowd-sourced maps collected by future AR devices with client-mapping functionality. Open AR, an interoperable AR Cloud, and Microsoft's Azure Digital Twins are likewise seeking to model and create a digital representation of an environment.

Facebook's Project Aria continues that trend and will aid Facebook in recording live 3D maps and developing AI models for Facebook's first generation of wearable augmented reality devices. Aria's uniqueness, in contrast to autonomous cars, is the egocentric data collection of the environment: the recorded data will come from the wearer's perspective, a more intimate type of data. Project Aria is also a 3D live-mapping tool and software with an AI development tool, not a prototype of a product, nor an AR device, due to its lack of a display. According to Facebook, Aria's research glasses, which are not for sale, will be worn only by trained Facebook staffers and contractors to collect data from the wearer's point of view. For example, if an AR wearer records a building and the building later burns down, the next time any AR wearer walks by, the device can detect the change and update the 3D map in real time.

A Portal to Augmented Privacy Threats

In terms of sensors, Aria's glasses will include, among others, a magnetometer, a barometer, a GPS chip, and two inertial measurement units (IMUs). Together, these sensors will track where the wearer is (location), how the wearer is moving (motion), and what the wearer is looking at (orientation), a much more precise way of pinpointing the wearer's position. While GPS often doesn't work inside a building, for example, a sophisticated IMU can let the device keep estimating its position indoors when GPS signals are unavailable.
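
To illustrate why IMUs matter for indoor positioning, here is a toy dead-reckoning sketch in Python: starting from the last known GPS fix, it integrates accelerometer samples to keep estimating position once GPS drops out. Real systems fuse many sensors and correct for drift and gravity; this is only an assumption-laden illustration, not Aria's actual pipeline.

```python
import numpy as np

def dead_reckon(last_fix, velocity, accel_samples, dt):
    """Estimate position indoors by integrating IMU acceleration from the
    last known GPS fix. A toy sketch: real systems fuse magnetometer,
    barometer, and camera data and correct for drift and gravity."""
    position = np.array(last_fix, dtype=float)
    velocity = np.array(velocity, dtype=float)
    for accel in accel_samples:
        velocity += np.array(accel, dtype=float) * dt  # integrate acceleration
        position += velocity * dt                      # integrate velocity
    return position

# Last outdoor fix (local x, y, z in meters), walking at 1.2 m/s, then two
# seconds of accelerometer samples at 10 Hz after entering a building.
samples = [[0.0, 0.1, 0.0]] * 20
print(dead_reckon([0.0, 0.0, 0.0], [1.2, 0.0, 0.0], samples, dt=0.1))
```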

A machine learning algorithm will build a model of the environment, based on all the input data collected by the hardware, to recognize specific objects, 3D map your space, and catalog the things in it. It can estimate distances, for instance, how far the wearer is from an object. It can also identify the wearer's context and activities: Are you reading a book? Your device might then offer you a reading recommendation.
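
As a rough illustration of that distance estimation, the sketch below assumes a hypothetical recognizer has already labeled clusters of points with the objects they belong to; from there, how far the wearer is from each object is simple geometry. The labels and coordinates are invented for the example.

```python
import numpy as np

# Hypothetical output of an object recognizer: points grouped by the object
# they belong to, with coordinates relative to the wearer.
labeled_points = {
    "bookshelf": np.array([[2.1, 0.3, 0.0], [2.2, 0.5, 0.1]]),
    "coffee mug": np.array([[0.4, -0.1, -0.3]]),
}

def distance_to(label):
    """Distance from the wearer (the origin) to the nearest point of an object."""
    return float(np.linalg.norm(labeled_points[label], axis=1).min())

print(f"bookshelf is {distance_to('bookshelf'):.2f} m away")
print(f"coffee mug is {distance_to('coffee mug'):.2f} m away")
```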

The Bystander's Right to Private Life

Imagine a future where anyone you see wearing glasses could be recording your conversations with always-on microphones and cameras, updating the map of where you are in precise detail and in real time. In this dystopia, the possibility of being recorded looms over every walk in the park, every conversation in a bar, and indeed, everything you do near other people.

During Aria's research phase, Facebook will be recording its own contractors' interactions with the world. It is taking certain precautions. It asks for the owner's consent before recording in privately owned venues such as a bar or restaurant. It avoids sensitive areas, like restrooms and protests. It blurs people's faces and license plates. Yet there are still many other ways to identify individuals, from tattoos to people's gaits, and these should be obfuscated, too.

These blurring protections mirror those used by other public mapping mechanisms like Google Street View. They have proven reasonable, but far from infallible, in safeguarding bystanders' privacy. Google Street View also benefits from focusing on objects, which only need occasional recording. It's unclear whether these protections remain adequate for perpetual crowd-sourced recordings, which focus on human interactions. Once Facebook and other AR companies release their first generation of AR devices, it will likely take concerted efforts by civil society to keep obfuscation techniques like blurring in commercial products. We hope those products do not layer robust identification technologies, such as facial recognition, on top of the existing AR interface.
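
For a sense of what blurring looks like in practice, here is a minimal sketch using OpenCV's bundled Haar cascade face detector, with invented file names. It illustrates the general technique, not Facebook's or Google's actual pipeline, and like any detector it will miss some faces, which is part of why blurring alone is far from infallible.

```python
import cv2

def blur_faces(input_path, output_path):
    """Detect faces with OpenCV's bundled Haar cascade and blur each one,
    a common (and fallible) obfuscation step for street-level imagery."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    image = cv2.imread(input_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Replace each detected face region with a heavily blurred copy.
        image[y:y + h, x:x + w] = cv2.GaussianBlur(
            image[y:y + h, x:x + w], (51, 51), 0)
    cv2.imwrite(output_path, image)

blur_faces("street_scene.jpg", "street_scene_blurred.jpg")
```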

The AR Panopticon

If AR glasses with always-on microphones, cameras, and powerful 3D-mapping sensors become widely adopted, the scope and scale of the problem change as well. The company behind any AR system could then have a live audio/visual window into all corners of the world, with the ability to locate and identify anyone at any time, especially if facial or other recognition technologies are included in the package. The result? A global panopticon society of constant surveillance in public or semi-public spaces.

In modern times, the panopticon has become a metaphor for a dystopian surveillance state, where the government has cameras observing your every action. Worse, you never know if you are a target, as law enforcement looks to new technology to deepen its already rich ability to surveil our lives.

Legal Protection Against the Panopticon

To fight back against this dystopia, and especially government access to this panopticon, our first line of defense in the United States is the Constitution. Around the world, we all enjoy the protection of international human rights law. Last week, we explained how police need to come back with a warrant before conducting a search of virtual representations of your private spaces. While AR measuring and modeling in public and semi-public spaces is different from private spaces, key Constitutional and international human rights principles still provide significant legal protection against police access.

In Carpenter v. United States, the U.S. Supreme Court recognized the privacy challenges of understanding the risks of new technologies, warning courts to "tread carefully" to ensure that we do not "embarrass the future."

To not embarrass the future, we must recognize that throughout history people have enjoyed effective anonymity and privacy when conducting activities in public or semi-public spaces. As the United Nations' Free Speech Rapporteur made clear, anonymity reflects "a common human desire to protect one's identity from the crowd...." Likewise, the Council of Europe has recognized that while any person moving in public areas may expect a lesser degree of privacy, they do not and should not expect to be deprived of their rights and freedoms, including those related to their own private sphere. Similarly, the European Court of Human Rights has recognized that a "zone of interaction of a person with others, even in a public context," may fall within the scope of private life, and that even in public places, the systematic or permanent recording and subsequent processing of images could raise questions affecting the private life of individuals. Over forty years ago, in Katz v. United States, the U.S. Supreme Court also recognized that "what [one] seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected."

This makes sense because the natural limits of human memory make it difficult to remember details about people we encounter in the street, which effectively offers us some level of privacy and anonymity in public spaces. Electronic devices, however, can remember perfectly, and collect these memories in a centralized database to be potentially used by corporate and state actors. Already this sense of privacy has been eroded by public camera networks, ubiquitous cellphone cameras, license plate readers, and RFID trackers, requiring legal protections. Indeed, the European Court of Human Rights requires "clear, detailed rules...," especially as "the technology available for use [is] continually becoming more sophisticated."

If smart glasses become as common as smartphones, we risk losing even more of the privacy of crowds. Far more thorough records of our sensitive public actions, including going to a political rally or protest, or even to a church or a doctor's office, could end up on our permanent records.

This technological problem was brought into the modern era in United States v. Jones, where the Supreme Court held that GPS tracking of a vehicle was a search, subject to the protection of the Fourth Amendment. Jones was a convoluted decision, with three separate opinions supporting this result. But among the three were five Justices, a majority, who ruled that prolonged GPS tracking violated Jones' reasonable expectation of privacy, despite Jones driving on public roads where a police officer could have followed him in a car. Justice Alito explained the difference in his concurring opinion (joined by Justices Ginsburg, Breyer, and Kagan):

In the pre-computer age, the greatest protections of privacy were neither constitutional nor statutory, but practical. Traditional surveillance for any extended period of time was difficult and costly and therefore rarely undertaken. Only an investigation of unusual importance could have justified such an expenditure of law enforcement resources. Devices like the one used in the present case, however, make long-term monitoring relatively easy and cheap.

The Jones analysis recognizes that police use of automated surveillance technology to systematically track our movements in public places upsets the balance of power protected by the Constitution and violates the societal norms of privacy that are fundamental to human society.

In Carpenter, the Supreme Court extended Jones to the tracking of people's movements through cell-site location information (CSLI). Carpenter recognized that when the Government tracks the location of a cell phone it achieves "near perfect surveillance, as if it had attached an ankle monitor to the phone's user." The Court rejected the government's argument that, under the troubling third-party doctrine, Mr. Carpenter had no reasonable expectation of privacy in his CSLI because he had already disclosed it to a third party, namely, his phone service provider.

AR is Even More Privacy Invasive Than GPS and CSLI

Like GPS devices and CSLI, AR devices are an automated technology that systematically documents what we are doing. So AR triggers strong Fourth Amendment protection. Of course, ubiquitous AR devices will provide even more perfect surveillance than GPS and CSLI, not only tracking the user's information, but gaining a telling window into the lives of all the bystanders around the user.

With enough smart glasses in a location, one could create a virtual time machine to revisit that exact moment in time and space. This is the very thing that concerned the Carpenter court:

the Government can now travel back in time to retrace a person's whereabouts, subject only to the retention policies of the wireless carriers, which currently maintain records for up to five years. Critically, because location information is continually logged for all of the 400 million devices in the United States, not just those belonging to persons who might happen to come under investigation, this newfound tracking capacity runs against everyone.

Likewise, the Special Rapporteur on the Protection of Human Rights explained that a collect-it-all approach is incompatible with the right to privacy:

Shortly put, it is incompatible with existing concepts of privacy for States to collect all communications or metadata all the time indiscriminately. The very essence of the right to the privacy of communication is that infringements must be exceptional, and justified on a case-by-case basis.

AR is location tracking on steroids. AR can be enhanced by overlays such as facial recognition, transforming smartglasses into a powerful identification tool capable of providing a rich and instantaneous profile of any random person on the street, to the wearer, to a massive database, and to any corporate or government agent (or data thief) who can access that database. With additional emerging and unproven visual analytics (everything from aggression analysis to lie detection based on facial expressions is being proposed), this technology poses a truly staggering threat of surveillance and bias.

Thus, the need for such legal safeguards, as required in Canada v. European Union, is all the greater where personal data is subject to automated processing. Those considerations apply particularly where the protection of the particular category of personal data that is sensitive data is at stake.

Augmented reality will expose our public, social, and inner lives in a way that may be even more invasive than the smartphone's "revealing montage of the user's life" that the Supreme Court protected in Riley v. California. Thus, it is critical for courts, legislators, and executive officers to recognize that the government cannot access the records generated by AR without a warrant.

Corporations Can Invade AR Privacy, Too

Even more must be done to protect against a descent into AR dystopia. Manufacturers and service providers must resist the urge, all too common in Silicon Valley, to "collect it all" in case the data may be useful later. Instead, the less data companies collect and store now, the less data the government can seize later.

This is why tech companies should not only protect their users' right to privacy against government surveillance but also their users' right to data protection. Companies must, therefore, collect, use, and share their users' AR data only as minimally necessary to provide the specific service their users asked for. Companies should also limit the amount of data transmitted to the cloud, and the period for which it is retained, while investing in robust security and strong encryption, with user-held keys, to give users control over the information collected about them. Moreover, we need strong transparency policies, explicitly stating the purposes for and means of data processing, and allowing users to securely access and port their data.
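
One concrete form "strong encryption, with user-held keys" can take is encrypting captures on the device before anything leaves it, so the service only ever stores ciphertext. The sketch below uses the Python cryptography library's Fernet API purely as an illustration of the idea, not as a prescription for any particular company's implementation.

```python
from cryptography.fernet import Fernet

# The key is generated and kept on the user's device; the service never sees it.
user_key = Fernet.generate_key()
cipher = Fernet(user_key)

# Encrypt an AR capture locally before it is transmitted or backed up.
capture = b"point cloud segment + audio recorded at 14:02"
ciphertext = cipher.encrypt(capture)

# Only the user (or someone they deliberately share the key with) can decrypt it.
assert cipher.decrypt(ciphertext) == capture
```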

Likewise, legislatures should look to the augmented reality future and augment our protections against government and corporate overreach. Congress passed the Wiretap Act in 1968 to give extra protection to phone calls, and expanded statutory protections to email and subscriber records in 1986 with the Electronic Communications Privacy Act. Many jurisdictions have eavesdropping laws that require all-party consent before recording a conversation. Likewise, hidden camera and paparazzi laws can limit taking photographs and recording videos, even in places open to the public, though they are generally silent on the advanced surveillance possible with technologies like spatial mapping. Modernization of these statutory privacy safeguards, with new laws like CalECPA, has taken a long time and remains incomplete.

Through strong policy, robust transparency, wise courts, modernized statutes, and privacy-by-design engineering, we can and must have augmented reality with augmented privacy. The future is tomorrow, so let's make it a future we would want to live in.
