Exceptional Access to Encrypted Communications

Bob Barr has recently added his voice to law enforcement’s ongoing call for exceptional access to encrypted communications. Here’s why that’s not going to work.

“Exceptional access — as governments propose — is the problem of making a system selectively secure. I can tell you, it’s hard enough to make a secure system. It’s vastly harder to make a system secure except for governments, and only available to governments that consist of ‘democratically elected representatives and [a] judiciary’ as the GCHQ authors imagine.”

—Jon Callas, “The ‘Ghost User’ Ploy to Break Encryption Won’t Work.” DavisVanguard.org. July 24, 2019.
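To make the “ghost user” idea concrete: in many end-to-end designs, a message is encrypted once with a content key, and that key is then wrapped for each recipient. A government “ghost” is just a silently added recipient. Here is a minimal sketch of why that is hard to hide, using PyNaCl; the group-messaging design is simplified and the names are illustrative, not any vendor’s actual protocol.

```python
# A minimal sketch, assuming a simplified group-messaging design:
# encrypt the message once with a random content key, then wrap that
# key for each recipient. Illustrative only, not a real protocol.
import nacl.utils
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox

def encrypt_to_group(message, recipient_public_keys):
    """Encrypt `message` once; wrap the content key per recipient."""
    content_key = nacl.utils.random(SecretBox.KEY_SIZE)
    ciphertext = SecretBox(content_key).encrypt(message)
    wrapped = [SealedBox(pk).encrypt(content_key)
               for pk in recipient_public_keys]
    return ciphertext, wrapped

alice, bob = PrivateKey.generate(), PrivateKey.generate()
ghost = PrivateKey.generate()  # the silently added government recipient

_, honest = encrypt_to_group(b"hi", [alice.public_key, bob.public_key])
_, ghosted = encrypt_to_group(b"hi", [alice.public_key, bob.public_key,
                                      ghost.public_key])

# The ghost shows up as an extra wrapped key on the wire. A client that
# displays and cryptographically verifies its recipient list will notice;
# hiding it means lying to the user in the client software itself.
assert len(ghosted) == len(honest) + 1
```

That last point is Callas’s: the change cannot be confined to “only governments,” because it has to be built into the software everyone runs.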

Is being able to access everyone’s encrypted communications enough? Between Gorgon Stare drones overhead, Ring cameras on every other front door for police to access, televisions tracking every show being watched, phones and digital assistants listening in on conversations, fitness trackers serving as evidence in court cases, Stingray and other phone-tracking technology, license plate readers tracking vehicle movements over time, surveillance balloons, and so on, it feels to me like the police and military are a little under-powered these days.

I was promised a camera in my television watching my every move, a Room 101 for not sufficiently toeing the line, and a boot stamping on a human face forever. Was Uncle Orwell lying to me?

Who Watches the [Artificial Intelligence] Watchmen?

Open Question: What are the unintended consequences of artificial intelligence in the surveillance domain?

Recently, I came across a good basic guide for finding hidden cameras and bugs with a level of detail I’ve never seen online before. As with everything related to security, the first question to ask is: what is your threat model?

Most people don’t need to look for hidden cameras and bugs, or at least they don’t think they do. But there are situations where the kind of operational security used by spies can be useful to anyone. The most obvious example is something like Airbnb, where there may be good reason to suspect the risk of hidden cameras and bugs is higher than in similar circumstances, such as a reserved room at a reputable hotel chain or our own homes.

So, it is useful information to know. If you use a service like Airbnb regularly, it may be worth investigating these techniques in greater detail, or at least keeping the guide handily bookmarked.

The problem of recording devices in an Airbnb strikes me as similar to traveling a lot and regularly using open wifi networks. The increased risk might warrant the services of a virtual private network provider, if you don’t already use one. Again, it depends on your threat model.

This issue got me thinking about the larger pattern of surveillance, not just of online spaces but of physical space. Police departments are using overhead drones to monitor areas indefinitely (coverage that can be stepped up for public spaces during large gatherings), registering private security cameras, and deploying automated license plate scanners, body cameras, squad car cameras, and so forth. These are being combined with online surveillance technologies to map social media onto physical spaces. All of these technologies are being combined together:

“By combining drone, body-camera, police-car-camera, and closed-circuit-TV footage, Axon is clearly hoping to create a central hub for police to cross-reference and access surveillance data—a treasure chest of information that, according to Elizabeth Joh, a law professor at the University of California–Davis who studies civil liberties and police surveillance technology, police departments could find difficult to stop using once they start. ‘Not only is there no real competition from other vendors,’ said Joh, ‘but once a police department has bought into a certain contract with a company, it’s very hard to drop it and move on. There’s a lot of investment in training the agency and the officers how to use it.’”

—April Glaser, “The Next Frontier of Police Surveillance Is Drones.” Slate.com. June 7, 2018.

Companies like Palantir cut their teeth developing anti-terrorism surveillance and big-data analytics products that are now being rolled out to local police departments. All of this is happening with relatively little oversight.

Of course, big data means that artificial intelligence is being trained on all of this surveillance data. One task is to train algorithms to recognize facial, gait, voice, and other identifying characteristics of individuals. Another is to build a time series so those individuals can be tracked through time and space. This will change the way police interact with the population: the software will have a good idea of who was in an area and can offer up a list of possible perpetrators and witnesses, without necessarily any good indication of which is which.
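As a toy illustration of those two tasks, here is a sketch under invented assumptions: the embedding vectors would come from trained face, gait, or voice models, and the threshold, identities, and camera IDs below are made up.

```python
# A toy sketch of identification plus time-series tracking, numpy only.
# Real systems would produce the embedding vectors with trained models.
import numpy as np

def identify(sighting_vec, gallery, threshold=0.7):
    """Match one embedding against a gallery of known identities by
    cosine similarity; return the best match above threshold, or None."""
    best_id, best_sim = None, threshold
    for person_id, ref_vec in gallery.items():
        sim = (ref_vec @ sighting_vec) / (
            np.linalg.norm(ref_vec) * np.linalg.norm(sighting_vec))
        if sim > best_sim:
            best_id, best_sim = person_id, sim
    return best_id

def build_tracks(sightings, gallery):
    """Turn (timestamp, camera_id, embedding) sightings into a per-person
    time series: who was where, and when."""
    tracks = {}
    for ts, camera, vec in sorted(sightings, key=lambda s: s[0]):
        person = identify(vec, gallery)
        if person is not None:
            tracks.setdefault(person, []).append((ts, camera))
    return tracks

# Asking "who was in the area?" becomes a lookup over these tracks, with
# nothing to say whether a match is a perpetrator, a witness, or a false
# positive manufactured by the threshold chosen above.
```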

It reminds me of a quote:

“One of the major purposes of state simplifications, collectivization, assembly lines, plantations, and planned communities alike is to strip down reality to the bare bones so that the rules will in fact explain more of the situation and provide a better guide to behavior. To the extent that this simplification can be imposed, those who make the rules can actually supply crucial guidance and instruction. This, at any rate, is what I take to be the inner logic of social, economic, and productive de-skilling. If the environment can be simplified down to the point where the rules do explain a great deal, those who formulate the rules and techniques have also greatly expanded their power. They have, correspondingly, diminished the power of those who do not. To the degree that they do succeed, cultivators with a high degree of autonomy, skills, experience, self-confidence, and adaptability are replaced by cultivators following instructions. Such reduction in diversity, movement, and life, to recall Jacobs’s term, represents a kind of social ‘taxidermy’.”

― James C. Scott, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed

So, it isn’t hard to imagine a situation evolving where police work reduces to following up leads generated by artificial intelligence. The amount of data, the kinds of data, the assumptions employed, and so forth will all be a black box to the officer on the street. The simplified map will become the territory. The beat cop will become the instrument of the algorithm designers, who may or may not be getting feedback on the system’s failures. Many of these tools’ problems will be subtle, such as how their use changes the culture of a police department. People won’t know what to look for, and by the time the problems are identified, they may already be baked into the culture. It will certainly be too late for the individuals affected by software bugs, with errors translating into miscarriages of justice and wrongful prison sentences.

It isn’t hard to imagine a mature industry progressing to Philip K. Dick’s concept of “pre-crime.” Artificial intelligence systems will be expanded to look for larger patterns in the data that tend to precede crime, and there will be compelling arguments to use this information to stage interventions.
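To see how seductive, and how opaque, that logic is, consider a deliberately crude sketch. The features, weights, and names below are all invented, which is rather the point: the output reads like a lead while the assumptions hide inside the weights.

```python
# A toy sketch of "pre-crime" scoring: score people on pattern features
# and flag the top of the list for "intervention." Everything here is
# invented; the weights are opaque policy choices, not ground truth.
import numpy as np

feature_names = ["prior_contacts", "area_crime_rate", "social_graph_risk"]
weights = np.array([0.5, 0.3, 0.2])  # who chose these, and why?

people = {
    "person_a": np.array([0.1, 0.9, 0.2]),
    "person_b": np.array([0.7, 0.4, 0.8]),
}

scores = {name: float(weights @ feats) for name, feats in people.items()}
flagged = sorted(scores, key=scores.get, reverse=True)[:1]
print(flagged)  # a "lead" with no visible reasoning attached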

Who will watch these artificial intelligence watchmen? Is it even possible? In the mad dash to implement these systems, what kind of oversight is there? Sadly, the answer is: none.