Why Fish Don’t Know They’re Wet

You know that David Foster Wallace speech about fish? Two young fish swimming along, older fish passes and says “Morning boys, how’s the water?” The young fish swim on, then one turns to the other: “What the hell is water?”

That’s the point. We don’t notice what we’re swimming in.

The Furniture We Sit In

Think about chairs. If you grew up sitting in chairs, you probably can’t comfortably squat all the way down with your feet flat on the ground. Try it right now. Most Americans can’t do it—our hips and ankles don’t have that range anymore.

But people in many Asian countries can squat like that easily. They didn’t sit in chairs as much growing up, so their bodies kept that mobility.

The chair didn’t reveal “the natural way to sit.” It created a way to sit, and then our bodies adapted to it. We lost other ways of sitting without noticing.

Stories and language work the same way. They’re like furniture for our minds.

Mental Furniture

The stories you grow up hearing shape what thoughts seem natural and what thoughts seem strange or even impossible.

If you grow up hearing stories where the hero goes on a journey, faces challenges, and comes back changed—you’ll expect your own life to work that way. When something bad happens, you might think “this is my challenge, I’ll grow from this.” That’s not wrong, but it’s not the only way to think.

Other cultures tell different stories:

  • Some stories teach “be clever and survive” instead of “face your fears and grow”
  • Some teach “keep the group happy” instead of “discover who you really are”
  • Some teach “things go in cycles” instead of “you’re on a journey forward”

None of these is more true than the others. They’re just different furniture. They each let you sit in some positions comfortably while making other positions hard or impossible.

Reality Tunnels

Writer Robert Anton Wilson called this your “reality tunnel”—the lens made of your beliefs, language, and experiences that shapes what you can see. He was right that we’re all looking through tunnels, not at raw reality.

Wilson believed you could learn to switch between different reality tunnels—adopt a completely different way of seeing for a while, then switch to another one. Try thinking like a conspiracy theorist for a week, then like a scientist, then like a mystic.

He wasn’t completely wrong. But switching tunnels isn’t as easy as Wilson sometimes made it sound. It’s more like switching languages—you need immersion, practice, and maintenance, or you just end up back in your native tunnel when things get difficult.

Why This Matters

When you only have one kind of mental furniture, you think that’s just how thinking works. Like those fish who don’t know they’re in water.

But when you realize stories and language are furniture—not reality—you get some important abilities:

First: You notice when your furniture isn’t working. Sometimes you face a problem where thinking “I need to grow from this challenge” actually makes things worse. Maybe you just need to be clever and get through it. Or maybe you need to stop focusing on yourself and think about the group. Your usual way of thinking might be the wrong tool for this specific situation.

Second: You can learn to use different tools. Not perfectly—that takes years of practice, like learning a new language. But you can borrow techniques.

Want to think more tactically? Read trickster stories—the wise fool who outsmarts powerful people through wit rather than strength.

Want to notice how groups work? Pay attention to stories that focus on harmony and relationships instead of individual heroes.

Want to see patterns instead of progress? Look at stories where things cycle and repeat instead of moving forward to an ending.

Third: No framework gets to be the boss. This is where it gets interesting. Once you see that all frameworks are furniture, none of them can claim to be “reality itself.” They’re all tools.

Think about how cleanliness norms work in Japan. There are no cleanliness police enforcing the rules. People maintain incredibly high standards because they value the outcome. The structure is real and binding, but not coercive.

Your mental frameworks can work the same way. You choose which ones to use based on what you value and what works, not because any of them is “the truth.” That’s a kind of mental anarchism—no imposed authority telling you how you must think, but still having structure because you value what it enables.

The Hard Part

Here’s what most people don’t want to hear: different frameworks sometimes genuinely conflict. There’s no way to make them all fit together nicely.

An anthropologist once read Shakespeare’s Hamlet to a tribe. The tribesmen thought Hamlet’s uncle marrying his mother was perfectly reasonable, and Hamlet’s reaction seemed childish. They weren’t offering “an alternative interpretation.” From their framework, the Western reading was simply wrong.

This creates real tension. You can’t be “in” two incompatible frameworks at once. You have to actually pick, at least for that moment. And when you’re stressed or in crisis, you’ll probably default back to your native framework—the one you grew up with.

The question is whether you can recover perspective afterward: “That framework felt like reality in the moment, but it doesn’t own reality.”

The Practical Part

You probably can’t completely change your mental furniture. That would be like growing up again in a different culture. It would take years of immersion in situations where a different framework actually matters—where there are real consequences for not using it.

But you can do three things:

Stay aware that you’re sitting in furniture, not on the ground. Notice when your usual way of thinking is just one option, not the truth.

Borrow strategically from other frameworks for specific situations. Use a different mental model, tell yourself a different kind of story about what’s happening, ask different questions. Not because the new furniture is better, but because sometimes it gives you a view you couldn’t see from your regular chair.

Accept the tension when frameworks conflict. Don’t try to force them into a neat synthesis. Real anarchism isn’t chaos—it’s having structure without letting any structure claim ultimate authority. You maintain your primary way of thinking because you value what it enables, not because it’s “true.” And you accept that other frameworks might be genuinely incompatible with yours, with no neutral way to resolve it.

The Bottom Line

We all swim in water—language, stories, ways of thinking that feel natural but are actually learned. The point isn’t to get out of the water. You can’t.

The point is to notice it’s there. To see that your framework is a way, not the way. To choose which furniture to sit in based on what you value and what the situation demands, not because someone told you that’s reality.

That’s harder than it sounds. When things get tough, your native framework will reassert itself and feel like the only truth. But if you can recover perspective afterward—if you can remember that you were sitting in furniture, not touching the ground—you’ve gained something real.

It’s a kind of freedom. Not the easy freedom of “believe whatever you want.” The harder freedom of “no framework owns you, but you still need frameworks to function.”

That’s not much. But it’s something. And it beats being the fish who never even knew there was water.

Finding Your Best Starting Point: A Simple Guide to Personal Growth

The Big Idea

Instead of asking “What’s wrong with me?” ask “Where should I start today?”

This guide helps you pick the best place to focus your energy so you can grow and feel better.

Step 1: Look at Four Areas of Your Life

Think about these four parts of yourself:

Your Body

  • How tense or relaxed do you feel?
  • How is your breathing?
  • Do you have energy or feel tired?
  • How does your body want to move?

Your Emotions

  • Can you feel your emotions clearly?
  • Are your feelings strong or weak right now?
  • Can you share your feelings with others?

Your Mind

  • Are your thoughts clear or foggy?
  • Can you focus on what matters?
  • What patterns do you notice in your thinking?

Your World Around You

  • How are your relationships with family and friends?
  • How do you feel about work or school?
  • Does your home feel good to you?

Step 2: Pick Your Approach

Choose one of these three ways to decide where to start:

The Smart Move
Ask: “What one change could help everything else get better?”
This is about being efficient and making progress.

The Fun Way
Ask: “What sounds interesting or exciting to explore right now?”
This is about enjoying the process and discovering new things.

The Hard Thing
Ask: “What do I keep avoiding that keeps asking for my attention?”
This is about facing the stuff you don’t want to deal with.

Note: Sometimes the thing you’re avoiding IS the smart move to make.

Step 3: Try Something Small

  1. Pick one area (body, emotions, mind, or world around you)
  2. Choose something small to try – don’t go big right away
  3. Think of it as a test, not something that has to fix everything
  4. After you try it, ask yourself:
  • Did this help me feel more open?
  • Did anything shift in how I feel?
  • What should I try next?

Examples to Get You Started

If you picked your body and want to try the smart move:
Take 5 deep breaths when you feel stressed

If you picked emotions and want to try the fun way:
Write down 3 things that made you smile today

If you picked your mind and want to try the hard thing:
Spend 5 minutes thinking about something you’ve been avoiding

If you picked your world and want to try the smart move:
Send one text to someone you care about

Remember This

  • Small changes can lead to big results
  • You can’t always guess what will help the most
  • Sometimes the fun, easy thing works better than the serious, hard thing
  • You’re not broken – you’re just choosing where to put your attention
  • If something doesn’t work, try a different area or approach

The goal isn’t to fix yourself. The goal is to find the best place to put your energy so good things can happen naturally.

Seven Varieties of Stupidity

“1. Pure Stupidity…

2. Ignorant stupidity…

3. Fish-out-of-water stupidity…

4. Rule-based stupidity…

5. Overthinking stupidity…

6. Emergent stupidity…

7. Ego-driven stupidity…”

-Ian Leslie, “Seven Varieties of Stupidity.” ianleslie.substack.com. May 21, 2022

It’s a fun classification exercise. I’d say that 3 is a subset of 2; being in an unfamiliar environment is a variety of ignorance.

However, if you think about the kind of stupidity we are most prone to in contemporary times, it’s rule-based stupidity. Everyone is being turned into an algorithm: they are handed a set of rules, a checklist, and they go through the checklist whether it makes sense or not.

For example, if you take out a home equity loan of $20,000 on a home worth $200,000, does the bank really need your credit report and income? But, by God, they’ll get through their checklist before they’ll lend anyone money.

It’s also interesting to think about the connections. Rule-based stupidity is a variety of emergent stupidity, the kind of stupidity you get when people come together and are afraid of conflict and of sharing their ideas. Rule-based stupidity tries to strip initiative away from people because you are afraid of the first three forms of stupidity.

Ego-driven stupidity reminds me of barkers in the Church of Interruption piece. People who are too busy thinking they are the only ones with anything interesting to say aren’t learning anything. There’s only so much we can learn from our own experience. To be smart in any meaningful sense, we have to learn from the experiences of others. If we stop doing that, we slowly become more stupid.

The Big Here Quiz

30 questions to elevate your awareness (and literacy) of the greater place in which you live:

1) Point north.
2) What time is sunset today?
3) Trace the water you drink from rainfall to your tap.
4) When you flush, where do the solids go? What happens to the waste water?
5) How many feet (meters) above sea level are you?

30) How many days till the moon is full?

-Kevin Kelly, “The Big Here Quiz.” The Technium. February 15, 2022

A concrete reminder of how divorced many of us are from nature. In cognitive evaluation, you sometimes hear people talk about being situated in time and space, i.e., knowing where you are and what time it is. But, this quiz asks the question in a more fundamental way, one that the vast majority of us would fail.

The Understructure of Thought

Language imposes limitations. When we reason, we use language, whether symbolic or natural. But, our understanding, or perhaps it is better to call it intuition, runs deeper than our reason.

A common example can be found in terms like “creepy” and “janky”. We use these terms when there is uncertainty, when something is unreliable or unpredictable. The “creepy” guy on the bus is one who might do something unexpected and unwanted. The “janky” piece of equipment will fail when it is needed. But, if we were certain, if we were able to reason that this person or piece of equipment were bad in some way, we would move toward judgment. This person is a bad person and must be avoided. This equipment is faulty; it must be replaced. “Creepy” and “janky” imply that we aren’t certain, but that we know more than our reason can tell.

Of course, some of what makes up our intuition is a worldview, which may be faulty. For example, people will look for information that confirms their bias, such as invoking the “precautionary principle” with respect to vaccines on some rationale, such as an untested vaccine platform or antibody-enhanced infection. However, the precautionary principle has its own bias: against the new.

There are other principles. You could also use a decision-making model that weighs a decision in terms of risk and benefit. But, this also has a bias. Being able to assess risks and benefits presupposes relevant experience, and it is useless where we have none.

Another would be focusing on the signal-to-noise ratio when processing information. High signal means a lot of precision in what you hear, but it also implies that you may be missing signal. When you’ve attenuated what you are listening to down to a level that screens out most noise, you are likely screening out signal as well. Perhaps that lost signal makes a difference in judgment? High signal implies a value judgment based on prior experience, and with it a level of confirmation bias.

You could probably think of many different ways of thinking about information and making decisions, and most of them would favor the status quo. So, perhaps, one way to break the tendency is to look for ways of making decisions that favor options with more unknowns, where it is difficult to make an assessment based on our prior experience. Experience forms the understructure of our thought. Broadening our experience helps us change our thinking from the ground up. More experience enables more variability in our intuitions, which in turn changes our more formal, “rational” thoughts.

Meditation Without Meditating

“Over the past several decades, studies examining the potential for meditation to curb mental anguish and increase wellbeing have yielded promising, if complicated, results. For patients, complications can arise when meditation is marketed as a ‘happy pill, with no side effects’. This commodification and oversimplification is at the root of a conundrum for Jay Sanguinetti and Shinzen Young, the co-directors of SEMA Lab (Sonication Enhanced Mindful Awareness) at the University of Arizona. In the early stages of developing a technology that they believe could lead to meditative states without the need to meditate – a Silicon Valley-ready concept if there ever was one – the duo now must navigate the intricate ethics of introducing such a powerful product to the world. This short film from The Guardian follows Sanguinetti and Shinzen in their quest to ‘democratise enlightenment’ via ultrasound technology, while also attempting to ensure that, when the time comes, it will be properly implemented as a therapeutic tool.”

-Lina Lyte Plioplyte, “‘Meditation without meditating’ might be possible. Can it also be made ethical?” Aeon.com via TheGuardian.com. August 16, 2021.

Preferring Pain to High Cognitive Effort

“Cognitive effort is described as aversive, and people will generally avoid it when possible. This aversion to effort is believed to arise from a cost–benefit analysis of the actions available. The comparison of cognitive effort against other primary aversive experiences, however, remains relatively unexplored. Here, we offered participants choices between performing a cognitively demanding task or experiencing thermal pain. We found that cognitive effort can be traded off for physical pain and that people generally avoid exerting high levels of cognitive effort. We also used computational modelling to examine the aversive subjective value of effort and its effects on response behaviours. Applying this model to decision times revealed asymmetric effects of effort and pain, suggesting that cognitive effort may not share the same basic influences on avoidance behaviour as more primary aversive stimuli such as physical pain.”

Todd A Vogel, et al. “Forced choices reveal a trade-off between cognitive effort and physical pain.” eLife: Neurosciences. November 17, 2020. doi: 10.7554/eLife.59410

Of course, it’s a little more complicated than outlined in this abstract.
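For intuition, here is a toy sketch of what a cost-benefit choice model of this general kind can look like. This is not the authors’ actual model; the cost values and temperature parameter are invented for illustration. Each option gets a subjective (negative) value, and choice probabilities follow a softmax over those values.

```python
import math

def p_choose_effort(effort_cost, pain_cost, temperature=1.0):
    """Toy cost-benefit rule: probability of choosing the cognitively
    demanding task over the painful one, given subjective costs."""
    v_effort = -effort_cost   # subjective value of exerting effort
    v_pain = -pain_cost       # subjective value of enduring pain
    num = math.exp(v_effort / temperature)
    den = num + math.exp(v_pain / temperature)
    return num / den

# When the task feels subjectively costlier than a brief burst of heat,
# this kind of model predicts people will often opt for the pain instead.
print(p_choose_effort(effort_cost=3.0, pain_cost=2.0))  # ~0.27: pain preferred
```

The interesting part of the paper is what happens when a model like this is applied to decision times, which is where the asymmetry between effort and pain shows up.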

The Dunning-Kruger Effect Is Probably Not Real

“For an effect of human psychology to be real, it cannot be rigorously replicated using random noise. If the human brain was predisposed to choose heads when a coin is flipped, you could compare this to random predictions (heads or tails) made by a computer and see the bias. A human would call more heads than the computer would because the computer is making random bets whereas the human is biased toward heads. With the Dunning-Kruger effect, this is not the case. Random data actually mimics the effect really well…

…Measuring someone’s perception of anything, including their own skills, is fraught with difficulties. How well I think I did on my test today could change if the whole thing was done tomorrow, when my mood might differ and my self-confidence may waver. This measurement of self-assessment is thus, to a degree, unreliable. This unreliability–sometimes massive, sometimes not–means that any true psychological effect that does exist will be measured as smaller in the context of an experiment. This is called attenuation due to unreliability. ‘Scores of books, articles, and chapters highlight the problem with measurement error and attenuated effects,’ Patrick McKnight wrote to me. In his simulation with random measurements, the so-called Dunning-Kruger effect actually becomes more visible as the measurement error increases. ‘We have no instance in the history of scientific discovery,’ he continued, ‘where a finding improves by increasing measurement error. None.'”

—Jonathan Jarry, “The Dunning-Kruger Effect Is Probably Not Real.” McGill: Office for Science and Society. December 17, 2020.
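To make the “random noise” point concrete, here is a minimal simulation sketch (a hypothetical illustration in Python with numpy, not McKnight’s actual code): draw test scores and self-assessments independently, group people by actual-score quartile, and the familiar Dunning-Kruger plot appears anyway.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Actual scores and self-assessments drawn independently: pure, uncorrelated noise.
actual = rng.uniform(0, 100, n)
perceived = rng.uniform(0, 100, n)

# Bin participants into quartiles by their actual score.
cuts = np.percentile(actual, [25, 50, 75])
quartile = np.digitize(actual, cuts)

for q in range(4):
    mask = quartile == q
    print(f"Quartile {q + 1}: actual mean = {actual[mask].mean():5.1f}, "
          f"perceived mean = {perceived[mask].mean():5.1f}")

# Every quartile's perceived mean sits near 50, so the bottom quartile appears
# to overestimate itself and the top quartile to underestimate: the classic
# plot, produced with no psychology at all.
```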

[Question] Is Stupidity Expanding? Some Hypotheses.

“To be explained: It feels to me that in recent years, people have gotten stupider, or that stupid has gotten bigger, or that the parts of people that were always stupid have gotten louder, or something like that.

I’ve come up with a suite of hypotheses to explain this (with a little help from my friends). I thought I’d throw them out here to see which ones the wise crowd here think are most likely. Bonus points if you come up with some new ones. Gold stars if you can rule some out based on existing data or can propose tests by which they might be rendered more or less plausible.”

-David Gross, “[Question] Is Stupidity Expanding? Some Hypotheses.” greaterwrong.com. October 15, 2020.

George Carlin kind of nails it for me: stupid, full of shit and fuckin’ nuts. While the Venn diagram has overlap, you really cannot think about this issue without the other two.

Prima facie evidence? See, for instance, Section A, Hypothesis 11:

“There is no truth, only power. What I’ve been interpreting as truth and rationality has been my own attempt to align my thinking with the political clique that was in power when I was being educated. What I’m interpreting as rising stupidity has been the collapse in power and status of that clique and the political obsolescence of the variety of ‘truth’ and ‘rationality’ I internalized as a child. Those pomo philosophers were right all along.”

Or Section B, Hypothesis 10:

“Stupid choices used to reliably have undesirable results; now there is more of a disconnect where people are shielded from the results of their stupid choices, or even rewarded for them (man lights himself on fire in an easily-foreseeable misadventure, becomes YouTube legend). So people may be appearing stupid not as a result of being stupid but as the result of a perverse cost-benefit analysis. People are no dumber than they used to be, but for [reasons] it has become advantageous to display stupidity and so smart people sometimes mimic idiocy so as to reap such advantages. The smarter they are, the quicker they caught on to this and the better mimics they are, so this makes it look as though the smart people are being replaced by morons, when really it’s more a matter of camouflage.”

Both are clearly in the full of shit category. Much of crazy is indistinguishable from stupid. Section B, Hypothesis 8, for instance:

“Back in the day, when a person had a stupid idea, they would be reluctant to put it forward as their own. Rather, they would wait to see if someone else would voice the idea so they could just agree with it. This used to be relatively rare, but now you just have to google “[my stupid idea]” to find that someone or other has said it first, and then you’re off to the races.”

Replace stupid with crazy in that sentence, and it is every bit as valid.

Cults: Dissociation, Group Psychology, and Cognitive Dissonance

“How does cult psychology work? How is it possible to persuade human adults to enter a weird cognitive landscape with no basis in reality? To enter a fantasy realm so profound that they’ll willingly die for whomever has been selected as the local Messiah?”

–Matthew J. Sharps, Ph.D., “Cults and Cognition: Programming the True Believer.” Psychology Today. October 2, 2020.

Partial answer: Through dissociation, group psychology, and cognitive dissonance.

“…cognitive dissonance (e.g., Festinger et al. 1956), which manifests itself in the tendency to overvalue anything in which we’ve invested too much—money, time, emotional energy, whatever. Cognitive dissonance essentially means that the more you’ve paid, the better you like. Whether it makes any sense or not.”

-ibid.