The Understructure of Thought

Language imposes limitations. When we reason, we use language, whether symbolic or natural. But our understanding, or perhaps it is better to call it intuition, runs deeper than our reason.

A common example can be found in terms like “creepy” and “janky.” We use these words when there is uncertainty, when something is unreliable or unpredictable. The “creepy” guy on the bus is one who might do something unexpected and unwanted. The “janky” piece of equipment will fail just when it is needed. But if we were certain, if we could reason that this person or piece of equipment were bad in some definite way, we would move toward judgment: this person is a bad person and must be avoided; this equipment is faulty and must be replaced. “Creepy” and “janky” imply that we aren’t certain, yet we know more than our reason can articulate.

Of course, some of what makes up our intuition is a worldview, and worldviews can be faulty. For example, people look for information that confirms their biases, such as invoking the “precautionary principle” against vaccines on some rationale, like an untested vaccine platform or antibody-enhanced infection. But the precautionary principle carries its own bias: a bias against the new.

There are other principles. You could instead use a decision-making model that weighs risk against benefit. But this, too, has a bias. Assessing risks and benefits requires relevant experience to draw on, which makes the model useless precisely where we have no experience.

Another approach is focusing on signal-to-noise ratio when processing information. A high-signal filter gives you precision in what you hear, but it also implies you may be discarding signal. When you’ve attenuated what you are listening to down to a level that screens out most noise, you are likely screening out signal as well. Perhaps that lost signal would have made a difference in judgment? High signal implies a value judgment based on prior experience, and with it a degree of confirmation bias.

You could probably think of many other frameworks for processing information and making decisions, and most of them would favor the status quo. So perhaps one way to break that tendency is to look for ways of deciding that favor options with more unknowns, where it is difficult to make an assessment based on prior experience. Experience forms the understructure of our thought. Broadening our experience helps us change our thinking from the ground up: more experience enables more variability in our intuitions, which in turn changes our more formal, “rational” thoughts.

Standardized Thought

“From this [advertising] expert he learned that the key tool of the ad trade was to ‘standard[ize] thought by supplying the spectator with a ready-made visual image before he has time to conjure up an interpretation of his own.’ In that instant before the process of making sense was completed, a presupplied image and, subsequently, a thought (not quite your own) could take hold. Thought was being standardized.”

—Rebecca Lemov, “Into the Whirlpool.” The Hedgehog Review. Summer 2020.

A discussion of legibility and mass manipulation, from print media through the YouTube and Facebook algorithms. Nothing new here for people familiar with James C. Scott’s Seeing Like a State or Edward S. Herman’s Manufacturing Consent. However, I did like this idea of standardizing thought, which is clearly what the 24-hour news networks, YouTube, Twitter, etc. are doing.

My Affair With the Intellectual Dark Web

“If the idea is that I piss people off by being disloyal to my likely tribes, well, I don’t think that makes me unusual. I think it just makes me a good intellectual.”

—Alice Dreger quoted in Meghan Daum. “My Affair With the Intellectual Dark Web.” Medium. August 24, 2018.

An easy test of whether you (or others) are thinking for yourself: do your ideas conform too neatly to a political orthodoxy?

The Daum article is interesting throughout.