“…This paper will focus on corporate “third-party” tracking: the collection of personal information by companies that users don’t intend to interact with. It will shed light on the technical methods and business practices behind third-party tracking…
The first step is to break the one-way mirror. We need to shed light on the tangled network of trackers that lurk in the shadows behind the glass. In the sunlight, these systems of commercial surveillance are exposed for what they are: Orwellian, but not omniscient; entrenched, but not inevitable. Once we, the users, understand what we’re up against, we can fight back.”—Bennett Cyphers, “Behind the One-Way Mirror: A Deep Dive Into the Technology of Corporate Surveillance.” Electronic Frontier Foundation. December 2, 2019.
“Throughout its history, the Federal Bureau of Investigation (FBI) has used its expansive powers to investigate, monitor, and surveil First Amendment-protected activity. As early as 1924, public concern about the FBI’s violation of First Amendment rights and other civil liberties spurred official attempts to check the FBI’s power. This report covers FBI surveillance of political activity over roughly the past decade. We find that the FBI has repeatedly monitored civil society groups, including racial justice movements, Occupy Wall Street, environmentalists, Palestinian solidarity activists, Abolish ICE protesters, and Cuba and Iran normalization proponents…
…Read the report to understand how pervasive and persistent FBI surveillance is, then take action!”—Chip Gibbons, “Still Spying on Dissent: The Enduring Problem of FBI First Amendment Abuse.” RightsAndDissent.org. October 2019.
“But to an automatic license plate reader (ALPR) system, the shirt is a collection of license plates, and they will get added to the license plate reader’s database just like any others it sees. The intention is to make deploying that sort of surveillance less effective, more expensive, and harder to use without human oversight, in order to slow down the transition to what Rose calls ‘visual personally identifying data collection’.
‘It’s a highly invasive mass surveillance system that invades every part of our lives, collecting thousands of plates a minute. But if it’s able to be fooled by fabric, then maybe we shouldn’t have a system that hangs things of great importance on it,’ she said.”
—Alex Hern, “The fashion line designed to trick surveillance cameras.” The Guardian. August 14, 2019.
“Talkwalker’s state-of-the-art social media analytics platform uses AI-powered technology to monitor and analyze online conversations in real-time across social networks, news websites, blogs and forums in 187 languages.”
- Posted about Sharon Lerner’s “The Plastic Industry’s Fight to Keep Polluting the World.”
- Two views with two likes.
- Talkwalker.com’s artificial intelligence decides a conversation critical of plastic production might be happening, or could potentially happen, and quickly publishes it to the dashboard on their portal.
- *Bing* A desktop notification sounds.
- Bored assistant account executive at a public relations agency looks up. Yet another desktop notification about that damn The Intercept article? She decides it’s time to stop scrolling through Slack and Facebook, stop complaining to her friend in Messenger about the cost of Matouk towels (so expensive, even on Amazon!), and finally look into what all the fuss is about, in case their client, a ginormous chemical company producing plastic, calls to ask the account director what she is doing to minimize the damage from that small-time publication The Intercept. Time to get some intel! Take me away, Talkwalker.com.
- Clicks on a promising link. Sees a few quotes and a link to The Intercept article. Thinks to herself, “What is this shit?”
- Decides to go back to shopping on Amazon, her home page, and accidentally hits the refresh button instead.
- Woohoo, four views! Another exciting day here at cafebedouin.org, the veritable tip of the tongue of the “global conversation.”
- Meanwhile, Big Plastic has dumped another few tons of plastic in the ocean, and I’m going to have a McDonald’s fish filet and wait for the microplastic to reduce the sperm counts of men to the point that every birth will require artificial fertilization. Nothing to see here folks! The problem of plastic and patriarchy is solving itself! Try the Matouk Factory Store!
“Virtually all internet users tend to be Google search engine users, by default. The main strategy for Google is to try to hold on to the users it has by implementing better security and privacy protection measures. This is something definitely on their agenda, but the issue still remains that user data is tracked. Therefore, Google is leaking some users who are leaving its boat in order to climb aboard that of Duckduckgo.”—Miriam Cihodariu, “Duckduckgo vs Google: A Security Comparison and How to Maximize Your Privacy.” Heimdal Security. May 16, 2019.
I left the Google boat two years ago and have been using Duckduckgo.com consistently since. It’s not as good as Google, but it is adequate for most searches you need to do. I typically only need Google if I am looking for answers to a difficult question, if the search requires Google Maps functionality (such as finding restaurants meeting certain criteria near a specific location), or if I am looking for recent news on a specific topic. Duckduckgo.com can limit results to news items, but the number of sources it draws on is limited compared to Google.
In short, Duckduckgo is a decent Google replacement, if you are willing to exchange a little functionality for a little more privacy. I think it is worth doing.
“(Compare that with, for example, the statement by David Lange, the former prime minister of New Zealand, who remarked that “it was not until I read [the] book [“Secret Power” by Nicky Hager, which details the history of New Zealand’s Government Communications Security Bureau] that I had any idea that we had been committed to an international integrated electronic network.” He continued that “it is an outrage that I and other ministers were told so little, and this raises the question of to whom those concerned saw themselves ultimately answerable.”)”—Scarlet Kim and Paulina Perrin, “Newly Disclosed NSA Documents Shed Further Light on Five Eyes Alliance.” Lawfare. March 25, 2018.
The reason that ideas like “the deep state” resonate with some people is that there is truth to them. While I’m sure incoming Presidents and Prime Ministers are given access to a great deal of “secret” information, it would take years to cover every important agreement or established way of doing things, like the Five Eyes agreement. Who decides? And what kind of accountability is there? This case illustrates that the likely answer is: none.
“Walgreens is piloting a new line of “smart coolers”—fridges equipped with cameras that scan shoppers’ faces and make inferences on their age and gender. On January 14, the company announced its first trial at a store in Chicago in January, and plans to equip stores in New York and San Francisco with the tech.
Demographic information is key to retail shopping. Retailers want to know what people are buying, segmenting shoppers by gender, age, and income (to name a few characteristics) and then targeting them precisely.”
—Sidney Fussell, “Now Your Groceries See You, Too.” The Atlantic. January 25, 2019.
Another technology that sounds creepy but will be everywhere in ten years, and no one will think twice about it. It reminds me of the good old days when movie theaters started on time and didn’t show 20 minutes of ads first.
Open Question: What are the unintended consequences of artificial intelligence in the surveillance domain?
Recently, I came across a good basic guide for finding hidden cameras and bugs with a level of detail I’ve never seen online before. As with everything related to security, the first question to ask is: what is your threat model?
For most people, looking for hidden cameras and bugs is not something they need to do, or at least they don’t think they need to. But there are situations where the kind of operational security used by spies could be useful to everyone. The most obvious example is something like AirBnB, where there may be good reason to suspect the risk of hidden cameras and bugs is higher than in other, similar circumstances, such as a reserved room at a reputable hotel chain or our own homes.
So, it is useful information to know. If you use a service like AirBnB regularly, it may be worth investigating these techniques in greater detail, or at least keeping the guide handily bookmarked.
The problem of recording devices in an AirBnB strikes me as similar to the case where you travel a lot and regularly use open wifi networks. The increased risk to your threat model might warrant the services of a virtual private network provider, if you don’t already use one. Again, it depends on your threat model.
This issue got me thinking about the larger pattern of surveillance, not just of online spaces but of physical space. Police departments are using overhead surveillance drones that can monitor areas indefinitely (and that can be stepped up to watch public spaces during large gatherings), registries of private security cameras, automated license plate scanners, body and squad-car cameras, and so forth. These are being combined with online surveillance technologies that map social media to physical spaces, and all of it is being tied together:
“By combining drone, body-camera, police-car-camera, and closed-circuit-TV footage, Axon is clearly hoping to create a central hub for police to cross-reference and access surveillance data—a treasure chest of information that, according to Elizabeth Joh, a law professor at the University of California–Davis who studies civil liberties and police surveillance technology, police departments could find difficult to stop using once they start. “Not only is there no real competition from other vendors,” said Joh, “but once a police department has bought into a certain contract with a company, it’s very hard to drop it and move on. There’s a lot of investment in training the agency and the officers how to use it.”
—April Glaser, “The Next Frontier of Police Surveillance Is Drones.” Slate.com. June 7, 2018.
Companies like Palantir cut their teeth developing anti-terrorism surveillance and big-data analytics products, which are now being rolled out to local police departments. All of this is happening with relatively little oversight.
Of course, big data means that artificial intelligence is being trained on all of this surveillance data. One task is to train algorithms to recognize face, gait, voice, and other identifying characteristics of individuals. Another is to build a time series that tracks those individuals through time and space. This will change the way police interact with the population, because they will have a good idea of who was in an area: the software will offer them a list of possible perpetrators and witnesses, without necessarily any good indication of which is which.
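To make the shape of that concrete, here is a toy sketch, not any vendor’s actual system: the detection records, names, and locations are all hypothetical. The point is how mechanically a list of “possible perpetrators and witnesses” falls out of tracked detections, with nothing in the output distinguishing one from the other.

```python
# Toy illustration of tracking individuals in time and space from
# hypothetical detection records of the form (person_id, location, timestamp).
from collections import defaultdict
from datetime import datetime

detections = [
    ("person_a", "5th & Main", datetime(2019, 6, 1, 21, 2)),
    ("person_b", "5th & Main", datetime(2019, 6, 1, 21, 5)),
    ("person_a", "7th & Oak", datetime(2019, 6, 1, 21, 40)),
    ("person_c", "5th & Main", datetime(2019, 6, 2, 9, 0)),
]

def timelines(detections):
    """Group detections into a time-ordered track per individual."""
    tracks = defaultdict(list)
    for person, location, ts in detections:
        tracks[person].append((ts, location))
    for track in tracks.values():
        track.sort()
    return dict(tracks)

def present_near(detections, location, start, end):
    """Everyone detected at a location during a window -- suspects and
    bystanders alike, returned as one undifferentiated list."""
    return sorted({person for person, loc, ts in detections
                   if loc == location and start <= ts <= end})

print(present_near(detections, "5th & Main",
                   datetime(2019, 6, 1, 20, 0), datetime(2019, 6, 1, 22, 0)))
# -> ['person_a', 'person_b']
```

A few dozen lines suffice for the query; everything that matters, how the detections were produced, how reliable they are, who gets flagged, lives upstream, in the black box.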
It reminds me of a quote:
“One of the major purposes of state simplifications, collectivization, assembly lines, plantations, and planned communities alike is to strip down reality to the bare bones so that the rules will in fact explain more of the situation and provide a better guide to behavior. To the extent that this simplification can be imposed, those who make the rules can actually supply crucial guidance and instruction. This, at any rate, is what I take to be the inner logic of social, economic, and productive de-skilling. If the environment can be simplified down to the point where the rules do explain a great deal, those who formulate the rules and techniques have also greatly expanded their power. They have, correspondingly, diminished the power of those who do not. To the degree that they do succeed, cultivators with a high degree of autonomy, skills, experience, self-confidence, and adaptability are replaced by cultivators following instructions. Such reduction in diversity, movement, and life, to recall Jacobs’s term, represents a kind of social ‘taxidermy’.”― James C. Scott, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed
So, it isn’t hard to imagine a situation evolving where police work reduces to following up leads generated by artificial intelligence. The amount of data, the kinds of data, the assumptions being employed, and so forth will all be a black box to the officer on the street. The simplified map will become the territory. The beat cop will become the instrument of the algorithm designers, who may or may not be getting feedback on failures of the system. Many of the problems with these tools will be subtle, such as how their use changes the culture of the police department. People won’t know what to look for, and by the time the problems are identified, they may already be baked into the culture. It will certainly be too late for the individuals affected by software bugs, where errors mean miscarriages of justice and prison sentences.
It isn’t hard to imagine a mature industry progressing to Philip K. Dick’s concept of “pre-crime.” Artificial intelligence systems will be expanded to look for larger patterns in the data that tend to lead to crime, and there will be compelling arguments to use this information to stage interventions.
Who will watch these artificial intelligence watchmen? Is it even possible? In the mad dash to implement these systems, what kind of oversight is there? Sadly, the answer is: none.
“The simple strategies that we are going to show you will effectively clear most rooms for hidden cameras and bugs without having to use super expensive countersurveillance gear or an outside company.
Most of the processes and steps that we are going to show you are adopted from some of our best government agencies, where countersurveillance is of a grave concern to them, so these techniques have been tried and tested.”
—”How to Find Hidden Cameras and Spy Bugs (The Professional Way).” SentelTechSecurity.com. January 17, 2019.
“The first all-nonfiction McSweeney’s issue is a collection of essays and interviews focusing on issues related to technology, privacy, and surveillance.
The collection features writing by EFF’s team, including Executive Director Cindy Cohn, Education and Design Lead Soraya Okuda, Senior Investigative Researcher Dave Maass, Special Advisor Cory Doctorow, and board member Bruce Schneier.
We also recruited some of our favorite thinkers on digital rights to contribute to the collection: anthropologist Gabriella Coleman contemplates anonymity; Edward Snowden explains blockchain; journalist Julia Angwin and Pioneer Award-winning artist Trevor Paglen discuss the intersections of their work; Pioneer Award winner Malkia Cyril discusses the historical surveillance of black bodies; and Ken Montenegro and Hamid Khan of Stop LAPD Spying debate author and intelligence contractor Myke Cole on the question of whether there’s a way law enforcement can use surveillance responsibly.”