“Without communication, connection, and empathy, it becomes easy for actors to take on the ‘gardener’s vision’: to treat those they are acting upon as less human or not human at all and to see the process of interacting with them as one of grooming, of control, of organization. This organization, far from being a laudable form of efficiency, is inseparable from dehumanization.”
—Os Keyes, “The Gardener’s Vision of Data.” Real Life. May 6, 2019.
“Computational thinking assumes that perfect information about the past can and should be collected and synthesized to inform decisions about the future.”
—John Thomason, “Is It Easier to Imagine the End of the World Than the End of the Internet?” The Intercept, November 24, 2018.
Thomason’s piece is a review of the book New Dark Age: Technology and the End of the Future by James Bridle. It is interesting throughout.
Bridle’s central point concerns our mental models: technology is not value-neutral. Thomason adds that technology isn’t just ideas but tangible capital, from which the people investing in it expect a return.
Consider artificial intelligence. Once you introduce a technology that fundamentally changes the landscape, e.g., autonomous vehicles on the roads, the models those vehicles use to make decisions must also change, because the vehicles themselves change the environment.
Easily said, but some of those changes will inevitably be unknown factors influencing the model, unaccounted for in its decision making, and so forth. One current example is how human biases get baked into training data and influence the decisions of the model. The problem can be very subtle, and there may be no obvious solution to it, assuming people are aware of the problem at all and that it can be fixed.
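A minimal sketch of the bias-in-training-data point, using entirely made-up hiring records: a “model” that simply learns per-group acceptance rates from historical labels will faithfully replay whatever bias those labels contain, even when the candidates are equally qualified.

```python
# Hypothetical illustration: biased training data produces a biased model.
# All names and numbers here are invented for the sketch.
from collections import defaultdict

# Historical records: (group, qualified, hired).
# Candidates are equally qualified, but group "B" was hired less often.
records = [
    ("A", True, True), ("A", True, True), ("A", True, True), ("A", True, False),
    ("B", True, True), ("B", True, False), ("B", True, False), ("B", True, False),
]

# "Training": learn each group's historical hire rate from the labels.
hired = defaultdict(int)
total = defaultdict(int)
for group, _qualified, was_hired in records:
    total[group] += 1
    hired[group] += was_hired

def predict_hire(group):
    # The model replays the historical rate -- and, with it, the bias.
    return hired[group] / total[group] >= 0.5

print(predict_hire("A"))  # True: group A's historical rate was 3/4
print(predict_hire("B"))  # False: group B's rate was 1/4, so the bias persists
```

Nothing in the model “knows” about groups being treated unfairly; the skew lives entirely in the data, which is what makes the problem subtle and hard to detect from the model alone.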