The Inbox: A Scattered, Ad-Ridden Archive of Our Lives

“To examine our inboxes is to examine our lives: our desires and dreams, our families and careers, our status, our networks and our social groupings, our projects, our commerce, our politics, our secrets/lies/fetishes. Inboxes are anthropological goldmines, textual archives, psychological case studies, waiting to be plumbed and probed for the expansive cultural, ethical, epistemological, and ontological insights lurking therein.
On second thought: they are probably not waiting to be probed, but actually being probed, scanned and algorithmatized, by Google, Amazon, the National Security Agency, the Russians, Julian Assange, employers, ex-lovers who remember your password, current lovers who install surveillance software on your laptop to monitor emails to your ex-lover/next lover, hackers who create fake networks on any public wifi you log onto, and/or anyone else who cares to discover whatever “secrets” you are secreting into the tubes.

It makes more sense to assume your email is a public document than to cling to improbable expectations of privacy. The Post Office made a point of delivering our letters sealed, intact. But the email overseers can read through our inboxes at will without us being any the wiser, and they let others look too…”

—Randy Malamud, “The Inbox: A Scattered, Ad-Ridden Archive of Our Lives.” Literary Hub. October 9, 2019.

Every time I see something like this, I can’t help wondering: does this person not realize that you can pay for email and, in doing so, eliminate the advertising and get a reasonably secure email archive? Off the top of my head, Protonmail, Posteo, Tutanota, and Lavabit are all reasonable choices for an email provider.

The Fashion Line Designed to Trick Surveillance Cameras

“But to an automatic license plate reader (ALPR) system, the shirt is a collection of license plates, and they will get added to the license plate reader’s database just like any others it sees. The intention is to make deploying that sort of surveillance less effective, more expensive, and harder to use without human oversight, in order to slow down the transition to what Rose calls ‘visual personally identifying data collection’.

‘It’s a highly invasive mass surveillance system that invades every part of our lives, collecting thousands of plates a minute. But if it’s able to be fooled by fabric, then maybe we shouldn’t have a system that hangs things of great importance on it,’ she said.”

—Alex Hern, “The fashion line designed to trick surveillance cameras.” The Guardian. August 14, 2019.

Against Hustle

“Even Thomas Merton — a mid-twentieth century activist monk who lived at the Abbey of Gethsemani in Kentucky, and whom Odell holds up as a model of informed, participatory refusal — might have difficulty following the same path today. My mom lives a couple hours away from the Abbey of Gethsemani and when we went there a few months ago there were barely any monks left. Reading a flyer I realized that not only do you have to renounce all worldly belongings to join the Abbey, you can’t bring any debt with you. Charities can alleviate some of the burden, but even becoming a monk won’t necessarily help you escape student loans.”

—Rebecca McCarthy, “Against Hustle: Jenny Odell Is Taking Her Time at the End of the World.” Longreads.com. April 2019.

Walgreens Tests New Smart Coolers

“Walgreens is piloting a new line of ‘smart coolers’—fridges equipped with cameras that scan shoppers’ faces and make inferences on their age and gender. On January 14, the company announced its first trial at a store in Chicago, and plans to equip stores in New York and San Francisco with the tech.

Demographic information is key to retail shopping. Retailers want to know what people are buying, segmenting shoppers by gender, age, and income (to name a few characteristics) and then targeting them precisely.”

—Sidney Fussell, “Now Your Groceries See You, Too.” The Atlantic. January 25, 2019.

Another technology that sounds creepy now but will be everywhere in ten years, and no one will think twice about it. It reminds me of the good old days when movie theaters started on time and didn’t show 20 minutes of ads first.

Google’s Sidewalk Labs Plans to Package and Sell Location Data on Millions of Cellphones

“We audit their practices to ensure they are complying with industry codes of conduct,” said Bowden. “No Google data is used. This extensive audit process includes regular reporting, interviews, and evaluation to ensure vendors meet specified requirements around consent, opt-out, and privacy protections.”

—Ava Kofman, “Google’s Sidewalk Labs Plans to Package and Sell Location Data on Millions of Cellphones.” The Intercept. January 28, 2019.

As these ideas go, this is a good use of the kind of data phones are collecting. For urban planning, it’s great to be able to look at real-time usage of roads, sidewalks, public transit, buildings, parks, and other infrastructure.

But it always starts with good ideas, and then the incentives encourage implementations and extensions that are a net negative, such as using real-time location data and artificial intelligence to flag anomalous movement patterns for policing. And that’s only the tip of the iceberg of ways this information, packaged in aggregate, could go horribly wrong.

Also, no Google data is being used? Even if that’s true, the key word missing is “yet.” They are testing how the idea is received first, using data from telephone service providers, before they add in Google data. A Google service of this type will eventually use Google data.

Protecting Your Online Privacy is Tough—But Here’s a Start

“Algorithms make decisions based on statistical correlations. If you happen to not be a typical individual, showing unusual characteristics, there is a chance that an algorithm will misinterpret your behavior. It may make a mistake regarding your employment, your loan, or your right to cross the border. As long as those statistical correlations remain true, nobody will care to revise this particular judgement. You’re an anomaly.”

—Katarzyna Szymielewicz, “Protecting your online privacy is tough—but here’s a start.” Quartz. January 25, 2019.

So, algorithms are just like people then?

There is a need to regulate data aggregators, but you have more technical options to avoid surveillance than this article suggests. Here’s a good start.

For one, instead of trying to control the information you put into social media and limiting your “likes,” opt out of social media entirely. Data aggregators don’t need many real data points to make an accurate profile, and these will be correlated against real records, such as credit card purchases, to complete the picture.
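The point that a few coarse data points go a long way can be sketched with a toy example. Everything below (the tiny population, the attribute values) is made up for illustration; real aggregators work the same way at scale:

```python
# Toy illustration (entirely synthetic data): a handful of coarse
# attributes is often enough to single one person out of a crowd,
# which is why aggregators need so few real data points to profile you.

# A tiny synthetic "population": (zip code, birth year, gender)
people = [
    ("30301", 1980, "F"), ("30301", 1980, "M"), ("30301", 1985, "F"),
    ("30302", 1980, "F"), ("30302", 1990, "M"), ("30303", 1985, "F"),
]

KEYS = ("zip", "year", "gender")

def anonymity_set(population, **attrs):
    """Return everyone who shares this combination of attributes."""
    return [p for p in population
            if all(p[KEYS.index(k)] == v for k, v in attrs.items())]

# One attribute still leaves several candidates...
print(len(anonymity_set(people, zip="30301")))                          # 3
# ...but three coarse attributes pin down exactly one person.
print(len(anonymity_set(people, zip="30301", year=1980, gender="F")))   # 1
```

Each extra attribute shrinks the "anonymity set" you hide in, and in a population of millions it takes remarkably few attributes to get that set down to one.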

You can also decline to log into a Google account on your Android device. The device may still phone home your location data, but at least it isn’t associated with your account. Likewise, you can control whether to share location data with social media apps, which again makes the profilers’ job harder.

Also, using a VPN or the Tor Browser can create some distance between your digital and real identities. It certainly makes the job of building marketing profiles of individuals harder.

Like security, privacy is a process. The more layers you put between you and the surveillance apparatus, the more difficult you make it to profile and surveil you.

The Pendulum Will Swing

“If you believe in something, you have to be willing to stand for something or you don’t really believe in it at all. There’s always going to be consequences for opposing people in power and there’s no doubt that I have faced retaliation, as has every public interest whistleblower coming out of the intelligence community in the last several decades, going back to Daniel Ellsberg. But that doesn’t mean it’s not worth doing. These are risks worth taking…So many people look at the world today, they look at how broken and ruined things are, and they are just disempowered and lost. But what I want people to focus on is the fact that things changed, right. And if they can change for the worse, they can change for the better. And the only reason the world is changing for the worse is because bad people are working to make it happen that way. And if more good people are organizing, if we’re talking about this stuff, if we’re willing to draw lines that we will not allow people to cross without moving us out of the way, the pendulum will swing, and I’ll be home sooner than you think.”

—Edward Snowden. “Edward Snowden on Privacy in the Age of Facebook and Trump.” The Intercept. May 25, 2018.

Tracking Firm LocationSmart Leaked Location Data for Customers of All Major U.S. Mobile Carriers in Real Time Via Its Web Site

“A third-party firm leaking customer location information data [from all U.S. mobile telephone service providers in real-time] poses serious privacy and security risks for virtually all U.S. mobile customers (and perhaps beyond, although all my willing subjects were inside the United States).”

—Brian Krebs, “Tracking Firm LocationSmart Leaked Location Data for Customers of All Major U.S. Mobile Carriers in Real Time Via Its Web Site.” KrebsonSecurity.com. May 17, 2018.

What could possibly go wrong?

Pre-Crime

“When whole communities like East L.A. are algorithmically scraped for pre-crime suspects, data is destiny, says Saba. ‘These are systemic processes. When people are constantly harassed in a gang context, it pushes them to join.’…

…’These cases are perfect examples of how databases filled with unverified information that is often false can destroy people’s lives,’ says his attorney, Vanessa del Valle of Northwestern University’s MacArthur Justice Center.”

—Peter Waldman, Lizette Chapman, and Jordan Robertson. “Palantir Knows Everything About You.” Bloomberg. April 19, 2018.

Nothing says small-government conservatism like a surveillance state powered by dirty, unverified data, black-box algorithms, and artificial intelligence that works for someone who believes death is optional and who spent ten years destroying an organization for reporting on his sexual orientation. It’s hard to articulate how safe it makes me feel knowing this is going on in the background with next to no transparency or oversight.

Find that depressing? Try a Pre-Crime Comic.

Facebook’s Surveillance Machine

“Should we all just leave Facebook? That may sound attractive but it is not a viable solution. In many countries, Facebook and its products simply are the internet. Some employers and landlords demand to see Facebook profiles, and there are increasingly vast swaths of public and civic life — from volunteer groups to political campaigns to marches and protests — that are accessible or organized only via Facebook.”

—Zeynep Tufekci, “Facebook’s Surveillance Machine.” The New York Times. March 19, 2018.

It’s a Catch-22. To get “vast swaths of public and civic life” off of the Facebook platform, you have to be willing to tell Facebook, as well as the employers and landlords who demand access to your social media accounts should you choose to have them, to fuck off. Regulation isn’t going to solve the problem of Facebook and the feudal Internet. Thinking that regulation can solve every problem is one of the central contradictions of U.S. liberal political thought. But then, U.S. conservatives have similar notions about deregulation. You can’t have small government and a global war on Communism, terrorism, and drugs.

Sometimes there is no reform that will square the circle, and you have to make a choice. It’s perfectly reasonable to choose not to use Facebook. It takes two to four weeks to shake off the urge to check it, and then, most likely, you’ll spend more time with those closest to you rather than cultivating all the weak ties beyond your Dunbar number of acquaintances that Facebook facilitates. Not everyone can do it, but many people could (and should).