“Increasingly powerful surveillance tools have shifted the power dynamics between people and institutions. To address this new dynamic, we’ve been creating a toolkit, in collaboration with the ACLU of Washington, that demystifies surveillance technologies in Seattle in the historical context of structural inequities in the United States.” —https://coveillance.org/
Open Question: Are smart phones primarily an information technology or a control technology?
“What the phone promises you psychologically is not content as such, but a space on the screen that is totally obedient to you. This translates into the illusion that the world, seen through the screen, will be equally obedient. I think any effort to try to understand smartphone addiction needs to grapple with the fact that it is much closer to a control technology than an information technology. Of course, it tells you useful things but what it offers you is navigation and control, the ability to make a fast-moving and confusing world obey you. One of the main contrasts in the book is between a view of the world that tries to represent it—the classically modern one of the seventeenth century for which the map would be a classic example—and a view of the world which brings it under control, which is a military ideal. Today, we often have no idea where we are going until we put a destination into our phone and follow the instructions. This navigation-based approach to the world originates from military technology and the need to bring the world under control.” —William Davies, interviewed by Tobias Haberkorn, “Control Groups.” The Point Magazine. December 7, 2019.
“At a time when the American system of government is already being sorely tested by a demagogue and would-be autocrat in the White House, it would be disastrous to grant more power to the Justice Department and the nation’s security services.” —James Risen, “To Fight White Supremacist Violence, Let’s Not Repeat the Mistakes of the War on Terror.” The Intercept. August 17, 2019.
Anytime you think the solution to a problem takes the form of “War on X” or “X War”, you probably need to think a little harder about it.
The War on Terror treats a symptom while acting as a catalyst for the underlying disease. The same goes for the “War on Drugs”: the moment marijuana started getting legalized, the criminal elements supplying it moved to opiates. Further, one has to wonder how much longer the Cold War enabled communism to last by providing a facade behind which its underlying structural problems could hide.
Also, this framing of “at a time when…” is bogus. This kind of testing could happen at any time. If there is some capability you think the next Hitler shouldn’t have as head of government, then you have a good sense of what powers your government shouldn’t have, and you should use that line of thinking to have a principled discussion of the powers of the state. Should one person be able to start a nuclear war? Should one person be able to start any war, via the War Powers Act? These are conversations that are overdue.
Open Question: What are the unintended consequences of artificial intelligence in the surveillance domain?
Recently, I came across a good basic guide for finding hidden cameras and bugs with a level of detail I’ve never seen online before. As with everything related to security, the first question to ask is: what is your threat model?
For most people, looking for hidden cameras and bugs is not something they need to do, or at least they don’t think it is. But there are situations where the kind of operational security used by spies could be useful to anyone. The most obvious example is an Airbnb rental, where there may be good reason to suspect the risk of hidden cameras and bugs is higher than in similar circumstances, such as a reserved room at a reputable hotel chain or our own homes.
So, it is useful information to know. If you use a service like Airbnb regularly, it may be worth investigating these techniques in greater detail, or at least keeping the guide handily bookmarked.
The problem of recording devices in an Airbnb strikes me as similar to the case where you travel a lot and regularly use open wifi networks. The increased risk in your threat model might warrant the services of a virtual private network provider, if you don’t already use one. Again, it depends on your threat model.
This issue got me thinking about the larger pattern of surveillance, not just of online spaces but of physical space. Police departments are using overhead drones to monitor areas indefinitely (with coverage that can be intensified over public spaces during large gatherings), registering private security cameras, and deploying automated license plate scanners, body cameras, squad car cameras, and so forth. These feeds are being combined with online surveillance technologies that map social media activity to physical spaces. All of these technologies are being merged into single platforms:
“By combining drone, body-camera, police-car-camera, and closed-circuit-TV footage, Axon is clearly hoping to create a central hub for police to cross-reference and access surveillance data—a treasure chest of information that, according to Elizabeth Joh, a law professor at the University of California–Davis who studies civil liberties and police surveillance technology, police departments could find difficult to stop using once they start. “Not only is there no real competition from other vendors,” said Joh, “but once a police department has bought into a certain contract with a company, it’s very hard to drop it and move on. There’s a lot of investment in training the agency and the officers how to use it.” —April Glaser, “The Next Frontier of Police Surveillance Is Drones.” Slate.com. June 7, 2018.
Companies like Palantir cut their teeth developing anti-terrorism surveillance and big-data analytics products that are now being rolled out to local police departments. All of this is happening with relatively little oversight.
Of course, big data means that artificial intelligence is being trained on all of this surveillance data. One task is to train algorithms to recognize facial, gait, voice, and other identifying characteristics of individuals. Another is to build a time series to track those individuals through time and space. This will change the way police interact with their population: they will have a good idea of who was in an area, and the software will offer them a list of possible perpetrators and witnesses, without necessarily any good indication of which is which.
It reminds me of a quote:
“One of the major purposes of state simplifications, collectivization, assembly lines, plantations, and planned communities alike is to strip down reality to the bare bones so that the rules will in fact explain more of the situation and provide a better guide to behavior. To the extent that this simplification can be imposed, those who make the rules can actually supply crucial guidance and instruction. This, at any rate, is what I take to be the inner logic of social, economic, and productive de-skilling. If the environment can be simplified down to the point where the rules do explain a great deal, those who formulate the rules and techniques have also greatly expanded their power. They have, correspondingly, diminished the power of those who do not. To the degree that they do succeed, cultivators with a high degree of autonomy, skills, experience, self-confidence, and adaptability are replaced by cultivators following instructions. Such reduction in diversity, movement, and life, to recall Jacobs’s term, represents a kind of social ‘taxidermy’.”― James C. Scott, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed
So, it isn’t hard to imagine a situation evolving where police work reduces to following up leads generated by artificial intelligence. The amount of data, the kind of data, the assumptions being employed, and so forth will all be a black box to the officer on the street. The simplified map will become the territory. The beat cop will become the instrument of the algorithm designers, who may or may not be getting feedback on failures of the system. Many of the problems with these tools will be subtle, such as how their use changes the culture of the police department. People won’t know what to look for, and by the time the problems are identified, they may already be baked into the culture. It will certainly be too late for the individuals affected by software bugs, with errors turning into miscarriages of justice and prison sentences.
It isn’t hard to imagine a mature industry progressing to Philip K. Dick’s concept of “pre-crime.” Artificial intelligence systems will be expanded to look for larger patterns in the data that tend to lead to crime, and there will be compelling arguments to use this information to stage interventions.
Who will watch these artificial intelligence watchmen? Is it even possible? In the mad dash to implement these systems, what kind of oversight is there? Sadly, the answer is: none.
“When whole communities like East L.A. are algorithmically scraped for pre-crime suspects, data is destiny, says Saba. ‘These are systemic processes. When people are constantly harassed in a gang context, it pushes them to join.’…
…’These cases are perfect examples of how databases filled with unverified information that is often false can destroy people’s lives,’ says his attorney, Vanessa del Valle of Northwestern University’s MacArthur Justice Center.”
—Peter Waldman, Lizette Chapman, and Jordan Robertson. “Palantir Knows Everything About You.” Bloomberg. April 19, 2018.
Nothing says small-government conservatism like a surveillance state powered by dirty, unverified data, black-box algorithms, and an artificial intelligence that works for someone who believes death is optional and who spent ten years destroying a media organization for reporting on his sexual orientation. It’s hard to articulate how safe it makes me feel knowing this is going on in the background with next to no transparency or oversight.
Find that depressing? Try a Pre-Crime Comic.
“Should we all just leave Facebook? That may sound attractive but it is not a viable solution. In many countries, Facebook and its products simply are the internet. Some employers and landlords demand to see Facebook profiles, and there are increasingly vast swaths of public and civic life — from volunteer groups to political campaigns to marches and protests — that are accessible or organized only via Facebook.”
—Zeynep Tufekci, “Facebook’s Surveillance Machine.” The New York Times. March 19, 2018.
It’s a Catch-22. You have to be willing to tell Facebook, as well as the employers and landlords who demand access to your social media accounts should you choose to have them, to fuck off in order to get those “vast swaths of public and civic life” off of the Facebook platform. Regulation isn’t going to solve the problem of Facebook and the feudal Internet. Thinking that regulation can solve every problem is one of the central contradictions of U.S. liberal political thought. But then, U.S. conservatives have similar notions about deregulation. You can’t have small government and a global war on Communism, terrorism, and drugs.
Sometimes there is no reform that will square the circle, and you have to make a choice. It’s perfectly reasonable to choose not to use Facebook. It takes two to four weeks to shake off the desire to check it, and then, most likely, you’ll spend more time with those closest to you rather than cultivating all the weak ties out beyond your Dunbar number of acquaintances that Facebook facilitates. Not everyone can do it, but many people could (and should).
“Manipulation campaigns can plug into the commercial surveillance infrastructure and draw on lessons of behavioral science. They can use testing to refine strategies that take account of the personal traits of targets and identify interventions that may be most potent. This might mean identifying marginal participants, let’s say for joining a march or boycott, and zeroing in on interventions to dissuade them from taking action. Even more worrisomely, such targeting could try to push potential allies in different directions. Targets predicted to have more radical inklings could be pushed toward radical tactics and fed stories deriding compromise with liberal allies. Simultaneously, those predicted to have more liberal sympathies may be fed stories that hype fears about radical takeover of the resistance. Such campaigns would likely play off divisions along race, gender, issue-specific priorities, and other lines of identity and affinity.”
—Matthew Crain and Anthony Nadler, “Commercial Surveillance State.” N+1. September 27, 2017.