“Even Thomas Merton — a mid-twentieth century activist monk who lived at the Abbey of Gethsemani in Kentucky, and whom Odell holds up as a model of informed, participatory refusal — might have difficulty following the same path today. My mom lives a couple hours away from the Abbey of Gethsemani and when we went there a few months ago there were barely any monks left. Reading a flyer I realized that not only do you have to renounce all worldly belongings to join the Abbey, you can’t bring any debt with you. Charities can alleviate some of the burden, but even becoming a monk won’t necessarily help you escape student loans.”
—Rebecca McCarthy, “Against Hustle: Jenny Odell Is Taking Her Time at the End of the World.” Longreads.com. April 2019.
“Walgreens is piloting a new line of “smart coolers”—fridges equipped with cameras that scan shoppers’ faces and make inferences on their age and gender. On January 14, the company announced its first trial at a store in Chicago, and plans to equip stores in New York and San Francisco with the tech.
Demographic information is key to retail shopping. Retailers want to know what people are buying, segmenting shoppers by gender, age, and income (to name a few characteristics) and then targeting them precisely.”
—Sidney Fussell, “Now Your Groceries See You, Too.” The Atlantic. January 25, 2019.
Another technology that sounds creepy but will be everywhere in 10 years, and no one will think twice about it. It reminds me of the good old days when movie theaters started on time and didn’t show 20 minutes of ads first.
“We audit their practices to ensure they are complying with industry codes of conduct,” said Bowden. “No Google data is used. This extensive audit process includes regular reporting, interviews, and evaluation to ensure vendors meet specified requirements around consent, opt-out, and privacy protections.”
—Ava Kofman, “Google’s Sidewalk Labs Plans to Package and Sell Location Data on Millions of Cellphones.” The Intercept. January 28, 2019.
As these ideas go, this is a good use of the kind of data phones are collecting. For urban planning, it’s great to be able to look at real-time usage of roads, sidewalks, public transit, buildings, parks, and other infrastructure.

But it always starts with good ideas, and then the incentives encourage implementations and extensions that are a net negative, such as using real-time location data and artificial intelligence to look for anomalous movement patterns for policing. That’s only the tip of the iceberg of ways this information, packaged in aggregate, could go horribly wrong.

Also, no Google data is being used? Even if true, the key word missing is “…yet.” They are seeing how it is received first, starting with telephone service providers’ data, before they add in Google data. A Google service of this type will eventually use Google data.
“Algorithms make decisions based on statistical correlations. If you happen to not be a typical individual, showing unusual characteristics, there is a chance that an algorithm will misinterpret your behavior. It may make a mistake regarding your employment, your loan, or your right to cross the border. As long as those statistical correlations remain true, nobody will care to revise this particular judgement. You’re an anomaly.”
—Katarzyna Szymielewicz, “Protecting your online privacy is tough—but here’s a start.” Quartz. January 25, 2019.
So, algorithms are just like people then?
There is a need to regulate data aggregators. But you have more technical options for avoiding surveillance than this article suggests. Here’s a good start.
For one, instead of trying to control the information you put into social media and limiting your “likes,” opt out of social media entirely. Data aggregators don’t need many real data points to make an accurate profile, and these will be correlated against real records, such as credit card purchases, to complete the picture.
You can also decline to log into a Google account on your Android device. It may still phone home your location data, but at least that data isn’t associated with your account. You can also control whether to share location data with social media apps, which again makes the job harder.
Also, using a VPN or Tor Browser can create some distance between your digital and real identity. It certainly makes the job of creating marketing profiles of individuals harder.
Like security, privacy is a process. The more layers you put between you and the surveillance apparatus, the more difficult you make it to profile and surveil you.
“If you believe in something, you have to be willing to stand for something or you don’t really believe in it at all. There’s always going to be consequences for opposing people in power and there’s no doubt that I have faced retaliation, as has every public interest whistleblower coming out of the intelligence community in the last several decades, going back to Daniel Ellsberg. But that doesn’t mean it’s not worth doing. These are risks worth taking…So many people look at the world today, they look at how broken and ruined things are, and they are just disempowered and lost. But what I want people to focus on is the fact that things changed, right. And if they can change for the worse, they can change for the better. And the only reason the world is changing for the worse is because bad people are working to make it happen that way. And if more good people are organizing, if we’re talking about this stuff, if we’re willing to draw lines that we will not allow people to cross without moving us out of the way, the pendulum will swing, and I’ll be home sooner than you think.”
—Edward Snowden. “Edward Snowden on Privacy in the Age of Facebook and Trump.” The Intercept. May 25, 2018.
“A third-party firm leaking customer location information data [from all U.S. mobile telephone service providers in real-time] poses serious privacy and security risks for virtually all U.S. mobile customers (and perhaps beyond, although all my willing subjects were inside the United States).”
—Brian Krebs, “Tracking Firm LocationSmart Leaked Location Data for Customers of All Major U.S. Mobile Carriers in Real Time Via Its Web Site.” KrebsonSecurity.com. May 17, 2018.
What could possibly go wrong?
“When whole communities like East L.A. are algorithmically scraped for pre-crime suspects, data is destiny, says Saba. ‘These are systemic processes. When people are constantly harassed in a gang context, it pushes them to join.’…
…’These cases are perfect examples of how databases filled with unverified information that is often false can destroy people’s lives,’ says his attorney, Vanessa del Valle of Northwestern University’s MacArthur Justice Center.”
—Peter Waldman, Lizette Chapman, and Jordan Robertson. “Palantir Knows Everything About You.” Bloomberg. April 19, 2018.
Nothing says small-government conservatism like a surveillance state powered by dirty, unverified data, black-box algorithms, and an artificial intelligence that works for someone who believes death is optional and who spent ten years destroying an organization for reporting on his sexual orientation. It’s hard to articulate how safe it makes me feel knowing this is going on in the background with next to no transparency or oversight.
Find that depressing? Try a Pre-Crime Comic.
“Should we all just leave Facebook? That may sound attractive but it is not a viable solution. In many countries, Facebook and its products simply are the internet. Some employers and landlords demand to see Facebook profiles, and there are increasingly vast swaths of public and civic life — from volunteer groups to political campaigns to marches and protests — that are accessible or organized only via Facebook.”
—Zeynep Tufekci, “Facebook’s Surveillance Machine.” The New York Times. March 19, 2018.
It’s a Catch-22. To get “vast swaths of public and civic life” off of the Facebook platform, you have to be willing to tell Facebook, as well as the employers and landlords who demand access to your social media accounts should you choose to have them, to fuck off. Regulation isn’t going to solve the problem of Facebook and the feudal Internet. Thinking that regulation can solve every problem is one of the central contradictions of U.S. liberal political thought. But then, U.S. conservatives have similar notions about deregulation. You can’t have small government and a global war on Communism, terrorism, and drugs.
Sometimes there is no reform that will square the circle, and you have to make a choice. It’s perfectly reasonable to choose not to use Facebook. It takes two to four weeks to shake off the desire to check it, and then, most likely, you’ll spend more time with those closest to you rather than cultivating all the weak ties beyond your Dunbar number that Facebook facilitates. Not everyone can do it, but many people could (and should).
“Manipulation campaigns can plug into the commercial surveillance infrastructure and draw on lessons of behavioral science. They can use testing to refine strategies that take account of the personal traits of targets and identify interventions that may be most potent. This might mean identifying marginal participants, let’s say for joining a march or boycott, and zeroing in on interventions to dissuade them from taking action. Even more worrisomely, such targeting could try to push potential allies in different directions. Targets predicted to have more radical inklings could be pushed toward radical tactics and fed stories deriding compromise with liberal allies. Simultaneously, those predicted to have more liberal sympathies may be fed stories that hype fears about radical takeover of the resistance. Such campaigns would likely play off divisions along race, gender, issue-specific priorities, and other lines of identity and affinity.”
—Matthew Crain and Anthony Nadler, “Commercial Surveillance State.” N+1. September 27, 2017.