The Understructure of Thought

Language imposes limitations. When we reason, we use language, whether symbolic or natural. But our understanding, or perhaps it is better to call it an intuition, runs deeper than our reason.

A common example can be found in terms like “creepy” and “janky”. We use these terms when there is uncertainty, when something is unreliable or unpredictable. The “creepy” guy on the bus is one who might do something unexpected and unwanted. The “janky” piece of equipment will fail when it is needed. But if we were certain, if we were able to reason that this person or piece of equipment were bad in some way, we would move toward judgment: this person is a bad person and must be avoided; this equipment is faulty and must be replaced. “Creepy” and “janky” imply that we aren’t certain, but that we know more than our reason can tell us.

Of course, some of what makes up our intuition is a worldview, and worldviews can be faulty. For example, people look for information that confirms their biases, such as invoking the “precautionary principle” against vaccines on some rationale, such as an untested vaccine platform or antibody-enhanced infection. However, the precautionary principle carries its own bias: a bias against the new.

There are other principles. You could instead use a decision-making model that weighs a decision in terms of risk and benefit. But this, too, has a bias. Assessing risks and benefits presupposes relevant experience to assess them with, so the model is useless where we have no experience.

Another would be focusing on the signal-to-noise ratio when processing information. A high ratio means precision in what you hear, but it also implies that you may be missing signal. When you’ve attenuated what you are listening to down to a level that screens out most noise, you are likely screening out signal as well. Perhaps that lost signal makes a difference in judgment? Privileging high signal implies a value judgment based on prior experience. It implies a degree of confirmation bias.

You could probably think of many different ways of processing information and making decisions, and most of them would favor the status quo. So, perhaps, one way to break the tendency is to look for ways of making decisions that favor options with more unknowns, where it is difficult to make an assessment based on our prior experience. Experience forms the understructure of our thought. Broadening our experience helps us change our thinking from the ground up. More experience enables more variability in our intuitions, which in turn changes our more formal, “rational” thoughts.

Meditation Without Meditating

“Over the past several decades, studies examining the potential for meditation to curb mental anguish and increase wellbeing have yielded promising, if complicated, results. For patients, complications can arise when meditation is marketed as a ‘happy pill, with no side effects’. This commodification and oversimplification is at the root of a conundrum for Jay Sanguinetti and Shinzen Young, the co-directors of SEMA Lab (Sonication Enhanced Mindful Awareness) at the University of Arizona. In the early stages of developing a technology that they believe could lead to meditative states without the need to meditate – a Silicon Valley-ready concept if there ever was one – the duo now must navigate the intricate ethics of introducing such a powerful product to the world. This short film from The Guardian follows Sanguinetti and Shinzen in their quest to ‘democratise enlightenment’ via ultrasound technology, while also attempting to ensure that, when the time comes, it will be properly implemented as a therapeutic tool.”

—Lina Lyte Plioplyte, “‘Meditation without meditating’ might be possible. Can it also be made ethical?” Aeon.com via TheGuardian.com. August 16, 2021.

Preferring Pain to High Cognitive Effort

“Cognitive effort is described as aversive, and people will generally avoid it when possible. This aversion to effort is believed to arise from a cost–benefit analysis of the actions available. The comparison of cognitive effort against other primary aversive experiences, however, remains relatively unexplored. Here, we offered participants choices between performing a cognitively demanding task or experiencing thermal pain. We found that cognitive effort can be traded off for physical pain and that people generally avoid exerting high levels of cognitive effort. We also used computational modelling to examine the aversive subjective value of effort and its effects on response behaviours. Applying this model to decision times revealed asymmetric effects of effort and pain, suggesting that cognitive effort may not share the same basic influences on avoidance behaviour as more primary aversive stimuli such as physical pain.”

—Todd A. Vogel, et al. “Forced choices reveal a trade-off between cognitive effort and physical pain.” eLife: Neurosciences. November 17, 2020. doi: 10.7554/eLife.59410

Of course, it’s a little more complicated than outlined in this abstract.

The Dunning-Kruger Effect Is Probably Not Real

“For an effect of human psychology to be real, it cannot be rigorously replicated using random noise. If the human brain was predisposed to choose heads when a coin is flipped, you could compare this to random predictions (heads or tails) made by a computer and see the bias. A human would call more heads than the computer would because the computer is making random bets whereas the human is biased toward heads. With the Dunning-Kruger effect, this is not the case. Random data actually mimics the effect really well…

…Measuring someone’s perception of anything, including their own skills, is fraught with difficulties. How well I think I did on my test today could change if the whole thing was done tomorrow, when my mood might differ and my self-confidence may waver. This measurement of self-assessment is thus, to a degree, unreliable. This unreliability–sometimes massive, sometimes not–means that any true psychological effect that does exist will be measured as smaller in the context of an experiment. This is called attenuation due to unreliability. ‘Scores of books, articles, and chapters highlight the problem with measurement error and attenuated effects,’ Patrick McKnight wrote to me. In his simulation with random measurements, the so-called Dunning-Kruger effect actually becomes more visible as the measurement error increases. ‘We have no instance in the history of scientific discovery,’ he continued, ‘where a finding improves by increasing measurement error. None.'”

—Jonathan Jarry, “The Dunning-Kruger Effect Is Probably Not Real.” McGill: Office for Science and Society. December 17, 2020.
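Jarry’s point about random noise can be sketched in a few lines. The simulation below is my own illustration, not code from the article: actual skill and self-assessed skill are drawn as independent random percentiles, so by construction there is no Dunning-Kruger effect at all. Grouping people into quartiles by actual skill still produces the familiar plot, because every quartile’s average self-assessment hovers near the 50th percentile, making the bottom quartile appear to overestimate itself and the top quartile to underestimate itself.

```python
import random

random.seed(42)
N = 10_000

# Pure noise: actual skill and self-assessment are independent,
# uniform percentile scores with no real relationship between them.
actual = [random.random() for _ in range(N)]
perceived = [random.random() for _ in range(N)]

# Sort people into quartiles by ACTUAL score, then compare each
# quartile's mean actual percentile with its mean perceived percentile.
people = sorted(zip(actual, perceived))
quartiles = [people[i * N // 4:(i + 1) * N // 4] for i in range(4)]

for q, group in enumerate(quartiles, start=1):
    mean_actual = 100 * sum(a for a, _ in group) / len(group)
    mean_perceived = 100 * sum(p for _, p in group) / len(group)
    print(f"Quartile {q}: actual ~{mean_actual:.0f}th percentile, "
          f"perceived ~{mean_perceived:.0f}th percentile")
```

The bottom quartile’s perceived percentile comes out far above its actual one, and the top quartile’s far below, replicating the classic Dunning-Kruger chart from nothing but randomness and regression to the mean.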

[Question] Is Stupidity Expanding? Some Hypotheses.

“To be explained: It feels to me that in recent years, people have gotten stupider, or that stupid has gotten bigger, or that the parts of people that were always stupid have gotten louder, or something like that.

I’ve come up with a suite of hypotheses to explain this (with a little help from my friends). I thought I’d throw them out here to see which ones the wise crowd here think are most likely. Bonus points if you come up with some new ones. Gold stars if you can rule some out based on existing data or can propose tests by which they might be rendered more or less plausible.”

—David Gross, “[Question] Is Stupidity Expanding? Some Hypotheses.” greaterwrong.com. October 15, 2020.

George Carlin kind of nails it for me: stupid, full of shit, and fuckin’ nuts. While the Venn diagram has overlap, you really cannot think about stupidity without the other two.

Prima facie evidence? See hypotheses in Section A, Hypothesis 11:

“There is no truth, only power. What I’ve been interpreting as truth and rationality has been my own attempt to align my thinking with the political clique that was in power when I was being educated. What I’m interpreting as rising stupidity has been the collapse in power and status of that clique and the political obsolescence of the variety of ‘truth’ and ‘rationality’ I internalized as a child. Those pomo philosophers were right all along.”

Or Section B, Hypothesis 10:

“Stupid choices used to reliably have undesirable results; now there is more of a disconnect where people are shielded from the results of their stupid choices, or even rewarded for them (man lights himself on fire in an easily-forseeable misadventure, becomes YouTube legend). So people may be appearing stupid not as a result of being stupid but as the result of a perverse cost-benefit analysis. People are no dumber than they used to be, but for [reasons] it has become advantageous to display stupidity and so smart people sometimes mimic idiocy so as to reap such advantages. The smarter they are, the quicker they caught on to this and the better mimics they are, so this makes it look as though the smart people are being replaced by morons, when really it’s more a matter of camouflage.”

Both are clearly in the full of shit category. Much of crazy is indistinguishable from stupid. Section B, Hypothesis 8, for instance:

“Back in the day, when a person had a stupid idea, they would be reluctant to put it forward as their own. Rather, they would wait to see if someone else would voice the idea so they could just agree with it. This used to be relatively rare, but now you just have to google “[my stupid idea]” to find that someone or other has said it first, and then you’re off to the races.”

Replace stupid with crazy in that sentence, and it is every bit as valid.

Cults: Dissociation, Group Psychology, and Cognitive Dissonance

“How does cult psychology work? How is it possible to persuade human adults to enter a weird cognitive landscape with no basis in reality? To enter a fantasy realm so profound that they’ll willingly die for whoever has been selected as the local Messiah?”

—Matthew J. Sharps, Ph.D., “Cults and Cognition: Programming the True Believer.” Psychology Today. October 2, 2020.

Partial answer: through dissociation, group psychology, and cognitive dissonance.

“…cognitive dissonance (e.g., Festinger et al. 1956), which manifests itself in the tendency to overvalue anything in which we’ve invested too much—money, time, emotional energy, whatever. Cognitive dissonance essentially means that the more you’ve paid, the better you like. Whether it makes any sense or not.”

—Ibid.

Dunning-Kruger Effect = Satisfaction

“…people with the biggest gap between their abilities and their view of [themselves] say they have the highest levels of satisfaction with their life, career and relationships. “People who report being more adjusted are those who have a combination of relatively lower true abilities and actual higher views of themselves,” says Stéphane Côté, a social psychologist at the Rotman School of Management at the University of Toronto and an author of the paper.”

—Lydia Denworth, “New Insights into Self-Insight: More Might Not Be Better.” Scientific American. August 27, 2019.

Simpson’s Paradox

“Simpson’s paradox (or Simpson’s reversal, Yule–Simpson effect, amalgamation paradox, or reversal paradox), is a phenomenon in probability and statistics, in which a trend appears in several different groups of data but disappears or reverses when these groups are combined.”

—s.v. “Simpson’s Paradox,” Wikipedia.

An example using arithmetic from the Stanford Encyclopedia of Philosophy. Each subgroup comparison favors the right-hand fraction, yet pooling the numerators and denominators, (1 + 6)/(5 + 8) versus (2 + 4)/(8 + 5), reverses the inequality:

1/5 < 2/8

6/8 < 4/5

7/13 > 6/13
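The reversal is easy to check directly. A minimal sketch in Python using the standard library’s `fractions` module (the variable names are my own, not from the encyclopedia entry):

```python
from fractions import Fraction

# The two subgroup comparisons both favor the right-hand side...
a_left, a_right = Fraction(1, 5), Fraction(2, 8)   # 1/5 < 2/8
b_left, b_right = Fraction(6, 8), Fraction(4, 5)   # 6/8 < 4/5

# ...but pooling numerators and denominators reverses the trend.
combined_left = Fraction(1 + 6, 5 + 8)             # 7/13
combined_right = Fraction(2 + 4, 8 + 5)            # 6/13

print(a_left < a_right)                  # True
print(b_left < b_right)                  # True
print(combined_left > combined_right)    # True: the reversal
```

The trick is that pooling adds numerators and denominators rather than averaging the ratios, so unevenly sized subgroups can weight the combined fractions in opposite directions.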