Turtles all the way down!


Cognition (https://en.wikipedia.org/wiki/Cognition) refers to “the mental action or process of acquiring knowledge and understanding through thought, experience, and the senses”. Fair enough, but when you start thinking about those actions and processes themselves, that’s referred to as metacognition (https://en.wikipedia.org/wiki/Metacognition), or “thinking about thinking”. And when I thought about writing this post about metacognition, I guess that becomes meta-metacognition...


#TIL that Wikipedia actually has an article dedicated to the phrase “turtles all the way down” (https://en.wikipedia.org/wiki/Turtles_all_the_way_down), which makes me happy. It discusses the phrase and how it is used to illustrate the problem of “infinite regress”, and refers to various creation myths in which the world rests on the back of a “World Turtle” – what, then, does the turtle rest on? Another turtle, of course!


As I have previously noted (https://www.til-technology.com/post/the-answer), critical thinking is an essential tool for navigating the complex world we live in. But how do we think critically? It’s one thing to learn about plausibility and assessing the credibility of sources, but that’s just the beginning. Learning about logic and logical fallacies is another aspect of critical thinking, and provides tools by which we can judge plausibility and credibility, but how do we know that we’re not just fooling ourselves into thinking we have a better approach?


Metacognition comes in when we start considering why and how we believe, and when we develop objective processes for understanding and assessing the beliefs of others and of ourselves.


When we start to learn about almost any field of study related to human thought, we quickly realize that humans are different from other animals not because we are rational and other animals are not, but rather because humans (and a few other animals) have the ABILITY to think rationally, when we take the time and effort to do so. Practice is another essential part: like anything else, it’s hard when we begin, and it improves over time if we are disciplined about it.


Many humans think of themselves as rational actors, but much of what we do is based on our psychological makeup and on heuristics (https://en.wikipedia.org/wiki/Heuristic), which are essentially mental short-cuts that allow for faster decision-making.


As with most things, these can be good or bad, depending on the context. It’s important to note that these features appear to be at least partly “hard-wired” into our brains, which raises a lot of interesting questions...

A simple example of a heuristic might be to assume that a rustling in the bushes at night is a predator. If you follow that assumption and there’s no predator, no problem. If you don’t follow that assumption and there is a predator, you are dead. But can’t we just use our reason? Can’t we just rationally decide how to determine whether there’s a predator?


In theory, yes, but that takes time. In practice, if there is a predator there and you try to figure it out rationally, it will take too long, and you will end up dead. There is a definite survival value in the ability to make quick decisions under certain circumstances, even if they are sometimes wrong. In general, heuristics are present where a quick, survival-based response is needed.


Now, since evolution moves very slowly, with change usually measured in millennia rather than years, there is no reasonable expectation that our current hard-wiring would be well-suited to present-day human existence. Instead, it seems fair to assume that human responses are optimized for very small groups, with very limited technology, living in a world where life-threatening situations are a daily occurrence – i.e., the way things were in our distant past.

This is where rational thought comes in. As humans developed, our ability to think rationally became important to our survival – when it was used effectively. Rational thought gave us things like cooperative plans for dealing with threats, the development of technology... and the ability to indulge in metacognition.

Over time, the threats we faced from our environment became more and more manageable, and our need to face threats to our survival on a day-to-day basis was reduced, leaving more time for reflection, specialization, and eventually our modern society. As we progress, we think about what we learn, how we learn, and why we learn, and start developing theories and systems for better understanding of all of those things.


However... (There’s always a “however”, isn’t there?)

That “hard-wiring” we’ve been discussing has a downside. It’s not rational, so there are limits to our ability to control it, but it’s optimized for our survival, so...

Oh. Right.

... optimized for our survival, in very small groups, with very limited technology, with near-constant threats to our individual survival.

How could that possibly go wrong in a densely-populated, multi-cultural world where physical threats are rare, but where our beliefs are challenged every day?

I think I’ve previously mentioned the podcast You Are Not So Smart, in which David McRaney interviews a variety of researchers across all aspects of human psychology. Great stuff that I highly recommend! In episodes such as https://youarenotsosmart.com/2017/01/13/yanss-093-the-neuroscience-of-changing-your-mind/ and https://youarenotsosmart.com/2018/02/26/yanss-122-how-our-unchecked-tribal-psychology-pollutes-politics-science-and-just-about-everything-else/, he discusses many of these points, including the fact that people have been shown to respond to perceived challenges to their world-view in exactly the same way they respond to physical threats.


This, I think, explains a lot. We have a lot of hard-wired responses that most people don’t think about, and rational minds that can be used to come up with explanations to support just about any position we decide we want to argue.

Not necessarily valid or correct ones, but that’s part of the point, and exactly the reason why metacognition is so important. Learning about cognitive biases, logical fallacies, and critical thinking is part of it, but the greatest value is when we start thinking about how and why people think, and what we can do to help ourselves (and others) think better.


Many of the heuristics that were so valuable in our distant past are arguably counter-survival in our current environment, so it seems inevitable that our hard-wired traits will eventually change to ones better suited to the world we now live in. At our current rate of societal change, though, it seems hard to imagine that they will ever really catch up... (Maybe a point for future consideration...)


But if we understand how we think, and why we think in certain ways, we have at least the potential for rational consideration prior to responding. I often think of the book The World of Null-A (https://en.wikipedia.org/wiki/The_World_of_Null-A), by A. E. Van Vogt, in which there appears the concept of a “cortico-thalamic pause”.


The idea is essentially that we pause for rational thought before responding to a situation, but of course a term like “cortico-thalamic” sounds far more impressive. As a side-note, the Null-A books were influenced by the ideas of General semantics (https://en.wikipedia.org/wiki/General_semantics), which included some interesting ideas, but sadly also included a fair bit of pseudo-science...


In the end, I think we simply need to recognize that we can be rational, but that it takes time and practice to do so consistently. The more we think about how and why people believe, the more willing we will be to change our own beliefs, and the more empathetic and charitable we can be toward others. Our hard-wiring will shift over time, but that will take centuries. For our society to survive and thrive, we need to learn ways to manage our non-rational responses, and metacognition is the place to start.


Cheers!
