Cognitive Abstention

“All the talk in the world doesn’t get it unless you listen to the talk in a new way.”

— Alan Watts

 

Conversation is a fast-paced endeavor. By adulthood we learn to listen, process, and respond fluidly, and to decide in real time whether what was said is true or false, helpful or unhelpful, serious or facetious. We render judgement quickly, or become lost in the sea of chatter.

But there is perfidy here, in our evaluations. By adulthood we have also lived enough to have constructed a filter for new information, built out of our experiences, knowledge, and interpretations. The flaw in this mental structure is of course also its benefit — the familiar, almost invisible mental toggle for accepting or rejecting new input. A statement that does not correspond to our understanding of reality is rejected as misleading — too easily. In this process we get nowhere. From one statement to the next, one encounter to the next, we apply the same filter. We become immutable through an action that seems eminently responsible. This subtle treachery undermines successful, productive living.

To evaluate what we see and hear, we must have a filter. This appears to be common sense. But it also appears that a filter limits our growth, and all our flaws, foibles, setbacks, and errors will persist, most likely all life long, causing predictable downfall after predictable downfall, since any failing of ours that is blind to a stumbling block will trip over it every time.

Some mistakes are inevitable, especially those arising from shortcomings that sit at the center of our persona, the result of childhood experiences, trauma, maladjustment, among other causes. If we are fortunate, these are limited in scope. More drastically, we see friends or family who represent the extreme, who never change an iota, never learn from a mistake, unable or unwilling to reconcile themselves to anything new. It is immediately and pitifully evident how tragic this can be.

The alternative seems worse: accept new thoughts as they are offered; do not evaluate. That can’t be very good. Compared to this, remaining permanently unchanged seems less dangerous. The third option that offers itself up naturally is to strike a balance — accept some ideas as inherently valid without scrutiny, and evaluate others. This seems no better. By choosing to allow some things through the filter, and reject others, we merely create a new problem of deciding which things to choose. How do we categorize, or rank, input to determine its validity? This is just one step removed from the same problem as before: the choice of what to accept can either be arbitrary, or it can be based on what we already know — which again means this filter will be effectively unchanged.

These are our entrenched positions, unhealthy and unwise, damaging to mind, body, soul.

Out of options, then. Either we reject all new modes of thought, usually leading to a terrible end, or in some form or another accept whatever new information comes our way, and end up a jumbled mix of other people’s notions (if this latter option seems unrealistic, consider someone you know who takes news broadcasts at face value).

Obviously there is a flaw in this logic; it doesn’t represent a common case. Many (most?) reasonably well-adjusted adults do not accept new information naively, nor do they go through their lives completely rigid, unwilling to learn new things (even if most of us do that a bit more than we like to admit).

The argument I am making is that even in very well-adjusted individuals, willingness to accept new things is harshly constrained by the need to stay within the “safe” parameters of the existing viewpoint. The best we can often do is edge bit by bit toward a direction that seems more accurate, more truthful. Then much of the time we change direction again. This amounts to practically no progress at all, and it is why an epiphany, when it does hit, tends to feel so dramatic, and why we tend to fight it (often with success, unfortunately). A point of view so distant from our current sense of “truth” seems as if it simply can’t be right, until and unless it hits us in the face so hard and so dramatically that we cannot fight it any longer. All too often this reversal of opinion results from some desperate event which might have been avoided had we come to the realization sooner.

Of course not all such problems can be avoided or foreseen. But the slow crawl toward truth, which so often seems the fastest pace we can manage, is no help. And the dramatic upheaval of our point of view brings new problems of its own, not least the so-called “rubber-band effect,” which takes us from one incorrect notion straight into the worst extreme of another — no intermediate step, and no thoughtful consideration. This is of course the result of the dramatic events that tend to precede and cause the re-examination of a closely held idea.

This is all a bit dramatic for the day-to-day evaluations we tend to make in most facets of life — evaluations that include what job is beneficial, what foods are healthy, the best way to manage a budget, the optimal choice for a vote, and so on, including more minor but still relevant questions like what kind of entertainment I will enjoy today, what author I want to try next, and what color shirt actually looks good on me instead of what my spouse/parent/friend tells me looks good.

It’s easy for minor decisions to go wrong, too, and of course a bad night’s sleep or a poor choice of breakfast can lead to a bad day, which might include another error brought about by tiredness, stress, or discomfort, and which then impacts a career, a relationship, and so on. To cut to the chase: the closer we can come to usefully accurate information, the better off we are in life.

This argument has become a bit circular at this point. New information is a challenge to assess appropriately. Old information may be out of date or inaccurate. Poor decisions based on a mishmash of the two often lead to unwanted outcomes. Where we get information and what we decide about its accuracy present a deceptively difficult challenge.

 

*  *  *

 

Either a thing is true or it isn’t…and if you can’t find out whether it’s true or whether it isn’t, you should suspend judgement.

— Bertrand Russell (emphasis added)

In my own life this relatively simple advice has proven to be among the most useful and practical applications of philosophy. It is, I firmly believe, one of the most immense powers of the human mind: the ability to hold an idea in abeyance, to consider it while abstaining from any conclusion, so that you neither believe it nor disbelieve it, find yourself neither convinced nor skeptical. The idea simply is, and while you are in the process, however short or long, of determining its value or correctness or usefulness, it holds no sway. You are simply aware of it.

From this detached awareness you can build fact on fact, connect idea to idea, and at some point decide if this nascent structure you hold in your head is correct. Or you can determine, all your life, that you have not gotten enough information, and never close the matter off with a conclusive judgement at all.

The retort to this mechanism is that it seems to restrict the user from effective action. If you won’t commit to a judgement of ideas, how can you respond with a practical decision? This retort relies on what I believe is an erroneous assumption. Not only are we fully capable of making bad decisions even when we are sure of ourselves; I claim that we make poor decisions with poor outcomes more often when we are sure of ourselves and the correctness of our beliefs than when we are not. I will give examples of this claim shortly.

It would not, I think, generally be contested that the impetus to act does not depend on certainty about a statement or point of view. Action arises from a need for a practical outcome. Given a need to reach, or avoid, a certain outcome, we assess the collection of potential outcomes for the situation. This assessment is not based on one idea or one body of knowledge, nor on two, or a hundred, but on everything in our knowledge and experience that is relevant enough to shape the decision. It is typical that we make decisions with only partial information in this way, and act in doubt, sometimes with concern for the outcome — but we act anyway, because failing to act is likely to incur a worse one. We may prove incorrect in our estimate of the situation, which isn’t great, but we must try again next time anyway. In other words, when a problem arises, we take our best shot — this is no great revelation. I mention it to re-establish the independence of action from certainty.

One of the most vivid examples of this action-in-doubt manifests in religion. Many believe zealously in the existence of an all-knowing deity, and many disbelieve equally vociferously. Others fall in between. Many practicing believers choose to exercise their religion even with some doubt, in the absence of certainty. The reasoning is to act in accordance with a set of religious principles without feeling certain that they accurately reflect truth, on the notion that they seem likely to be correct, or that they contain inherent morality or utility regardless of their religious accuracy, or simply in accordance with some form or derivation of Pascal’s Wager.

Others disregard religious tenets even while allowing that something in the religion may possess a degree of legitimacy. Perhaps the total body of tenets remains unconvincing, or ways of conduct outside the religion seem equally moral.

An individual in one of these categories often has not yet decided on the total truth of the matter — there is no inherent disingenuousness in their actions; quite the opposite: they strive to reconcile their actions with what they suspect they know as of now.

Still others, comfortable in not knowing, take the agnostic approach and may remain open to a more general sense of spirituality without committing to any specific acts or body of belief; effectively, they recuse themselves from the quest.

The point of all this categorization is to establish that it is entirely common and effective to choose directed action in absence of belief, across the full spectrum of a very divisive topic.

When we do possess a strong belief, of any kind, it seems evident to us that action in any other direction would be erroneous. This easy and obvious conclusion is one of the worst we make. How much damage has been done by zealots, religious and political? Of course the great challenge in this evaluation is that damage is often a subjective claim, with one generation reviling the actions of the one before it, and one culture condemning the most sacred acts of its opposite. Here also lies the trouble with certainty. We cannot all be right in holding ourselves sacred and each other villains. I in no way intend to allege that there is no objective truth, only that this truth is very difficult to find, even harder to be sure of, and that with regard to words like good and moral, the span of ideas they objectively cover may be narrower than we’d like to think — that is, the ideas that fall under objective good or bad compose a narrow scope, with much of the rest left for history to decide. Ideas which are not good or bad, but simply are, may prove even more difficult to certify. This, again, is the difficulty of being certain. Accuracy is the work of millennia.

 

I have the advantage of having found out how hard it is to get to really know something.

— Richard Feynman

Though speaking of the social sciences in particular here, Feynman held a general conviction that assumptions should never be made, nor any conclusion reached, without a full and convincing investigation where the topic allows for one, and that where it does not, the conclusion should not be held as certain. This approach pervades Feynman’s research and lectures. It is scattered throughout his recordings and his writings.

The quote from above continues:

I know what it means to know something. And therefore…I see how [the social sciences] get their information, and I can’t believe that they know it; they haven’t done the work necessary, haven’t done the checks necessary, haven’t done the care necessary.

As it relates to a topic like religion, this approach nearly requires a lack of conclusive decision. The cautious religion of a believer who allows himself to doubt, and the cautious doubt of a doubter who allows himself to believe, enable a flexibility not only to consider other possibilities but to adapt to them as needed, to new facts and to new understanding.

It’s worth noting that Feynman was mostly atheistic; he saw no particular support for organized religion. But equally poignant is that Feynman considered this view, like all of his views, not final, not closed, not certain. This is not to say that given time or exposure to an expert pulpit he would have “seen the light,” but rather that he remained aware of his, and everyone’s, inability to fully encompass the totality of understanding.

In fact, while I won’t go through the exercise here, perusing the chronology of Feynman’s expressions of his views on God, religion, and epistemology shows a flexible, evolving viewpoint (I will list a few quotes and links at the bottom as a starting reference, but there is more).

 

*  *  *

 

Religion is complicated. Here’s a more down-to-earth example: political elections often create divisions not only between individuals but within our own sense of what is correct. One candidate professes a number of policies we agree with, but also one or two we vehemently disagree with. Which outcome is best? We don’t want to yield on our critical issues, but we also don’t want to throw out the baby with the bathwater.

At the risk of sounding trite, this uncertainty is an opportunity. In most cases like this we can, and should, take action. The uncertainty is not a call to inaction but rather the identification of a lack of knowledge; in this case, the lack concerns which outcome would be preferable. This feels like prognostication, impossible to verify — but most of the variables that affect the case are at our disposal: how the candidates speak about their positions — slight changes of wording can imply different leanings and eventual courses of action; the situation on the ground, which might affect how a policy is implemented; the history of similar policies employed in other places and times, which might guide a sense of whether the policy will succeed in the here and now (by our estimation of success). In these factors and countless others sit patterns waiting to be recognized and leveraged. A growing awareness of their interrelationships, with thought and reasoning applied, gives a good chance of moving our estimation closer to an objective truth (which, in this case, can happily be borne out and verified through predictions and outcomes). Lingering uncertainty simply acknowledges the genuine complexity of the situation — the ability to decide has not been compromised; we have simply become somewhat more correct. And now it becomes dismaying to contemplate making a choice with any less knowledge (though we will sometimes do it anyway).

In the process of learning, we also discover something of ourselves that we may not previously have considered, through our reactions to the new information.

There’s one guarantee in all this, and it seems to me that every great thinker reiterates the lesson:

Ignorance more frequently begets confidence than does knowledge.

— Charles Darwin

 

The Dunning-Kruger effect is now relatively internet-famous (or infamous). The principle, and its corollaries, note a tendency for a lack of skill to lead to an overestimation of skill. It is a fairly narrow observation, easy to overgeneralize, but it identifies a certain way of thinking that is also borne out in other domains. Other research has indicated that, at least in the U.S., the more highly educated population tends toward greater political partisanship (see here for some further discussion of the reasoning). The straightforward expectation would be that greater education leads to greater openness. Exposure to new and variant knowledge, a growing understanding of how much there is to learn — everything that an awareness of ourselves, our humanity, and our limits tells us — ought to point to growing humility, and therefore growing moderation.

As the saying goes, “a little knowledge is a dangerous thing.” Knowledge, especially in the form of accredited education, offers an easy route to believing we know more than we do. A preponderance of awareness or capability in area A seems to imply we know, or ought inherently to know, something about area B. Often there is no reason for this but ego. If we are intelligent, we have been proven right about enough things in life that it becomes easier and easier to excuse or dismiss the times we are wrong.

Political affiliation skews our minds too. Interest in an issue can lead to a wholesale moral alignment we never intended. A general concern with the reasons a position exists leads us to adopt the issue, which takes root as a cause of its own, in a way that inhibits thought. We now align with the issue, not with the reason, and if the position shifts, often we shift too, becoming more and more entrenched. Opponents on the other side of an issue seem to malign our core beliefs; they are the enemy, and they lead us to even firmer certainty that our initial belief was correct (after all, what kind of monster would oppose position X?). We become more and more extreme, yet view ourselves as moderate and the others as extreme, where years or months before we ourselves were ambivalent, and we now hold a position which, had another held it then, we would have considered ignorant, untenable, or even cruel.

And yet this process is a result of learning. Without very careful scrutiny and the application of doubt, self-education becomes narrow and selective. This returns to the start of this argument, which hopefully some of the points have helped to bolster: Everything that is not in accordance with what is already known and accepted is ignored — everything but what we wanted to see. The process is fast and silent. The intellect does most of the lifting, until we’re willing to stick a skeptical wrench in the works. You know this process, both sides of it, unless you’ve never changed your mind on a single thing in your life. You know the feeling of being right, and you know it doesn’t guarantee you are right, and that knowledge is also what makes you aware and conscious and capable of growth.

In order to keep our psyches safe and our lives manageable, mental change is slow, often grudging, arises only from exposure to novelty, and may be initially repulsive because a new input is so at odds with what we already believe. But if we can be wrong about one thing, we can be wrong about another. Of course we can be right about many things too, and we probably are. Just because an idea is foreign doesn’t mean it’s wrong; but it also doesn’t mean it’s right.

Some matters are dependent on judgement and perspective; that is to say, they are relative to the context, the culture, and the needs of the situation. Healthy food for a toddler is not generally equally healthy for an aged diabetic. A mannerism that wins friends in Canada loses them in Mexico. A stellar candidate for town alderman in Santa Fe, New Mexico might fit poorly in Boise, Idaho. A novel that one friend loves, another friend may hate. These are poor examples — you can think of better ones, more specific to your experience, but you know the point well. Accuracy of understanding is not only difficult, it is also commonly contextual, and sometimes the idea of accuracy itself is nearly irrelevant (this is especially difficult for methodical thinkers, but equally valuable, I believe). All the more reason to remain skeptical of our own points of view, especially any that we have not thoroughly ridden into the ground through investigation.

 

*  *  *

 

If this all sounds political, egotistical, and more than a little judgmental, that is not my intent. I am trying to make the case as strongly as I can that certainty is all but guaranteed to be a worse enemy than even a total lack of confidence. Lack of confidence can paralyze, preventing important decisions at key moments. But certainty almost always guarantees the wrong choice, which is very often far worse than no choice. If you have not found my arguments so far convincing, it is easy to think of your own examples (of course, if you already have such examples, you may already agree with me at least in part, and if you don’t, then I am not likely to convince you, so why do I go on? Only in the hope that my writing should be complete and coherent to the extent I can manage).

The study of memory is broad and ongoing. I believe it is fairly well established, first, that memories alter with each recollection, but also that we are able to hold stable long-term memories all our lives (the latter claim is more experiential, but both have research behind them). In one practical application, the assessment of memory in relation to crime has produced conflicting results, with some field research indicating that the certainty of eyewitnesses in what they say correlates with accuracy, while other research points to the susceptibility of such memories to alteration. Subtlety certainly has an impact here. Regardless, it seems that the reliability of eyewitness accounts and testimony — really any matter of memory as critical as determining innocence or guilt — cannot be considered faultless. It is far too easy to be incorrect or simply misunderstood — see here and here for starting points, but there is more research available. I am not familiar with the subject matter in any depth, so I do not vouch for the veracity of a given study or journal; the studies are offered here as insights for consideration, a starting point for further thought. However non-authoritative a given study, it seems likely that memory, despite being the core underpinning of a coherent, continuous sense of reality, cannot be assumed to be entirely reliable. This is intuitive at the personal level, of course, as when we misplace our sunglasses or forget a birthday, but we often ignore this limitation when considering our own certainties.

Memory is powerful, too, and we should not baselessly doubt our abilities. But certainty should never be total or final, and memory (which includes memory of our own interpretations of facts and events) ought always to be supplemented by additional cognition, applying a regular verification process, to the best of our abilities at a given time. This is a kind of manual error correction on top of the error correction performed automatically by the brain. To apply a questionable technical analogy, it is an integration test after all the unit tests have passed.
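
For readers who don’t write software, here is a minimal sketch in Python of that analogy, offered only as an illustration. The Recollection class and its checks are invented for this example; they are not a model of memory or cognition, just a picture of the shape of the idea: a local check on each piece first, then a deliberate cross-check over the whole before acting on it.

```python
# Illustration only: invented names, not a model of how memory actually works.
from dataclasses import dataclass


@dataclass
class Recollection:
    claim: str          # what we remember
    confidence: float   # how sure we feel, from 0.0 to 1.0
    corroborated: bool  # whether anything outside our own memory supports it


def unit_check(memory: Recollection) -> bool:
    """The automatic, local screening: each memory is checked on its own,
    roughly as the brain quietly smooths over gaps and inconsistencies."""
    return bool(memory.claim) and 0.0 <= memory.confidence <= 1.0


def integration_check(memories: list) -> list:
    """The manual verification on top: even when every memory passes its own
    check, hold back the ones resting on confidence alone, with no outside
    support, before treating them as grounds for action."""
    passing = [m for m in memories if unit_check(m)]
    return [m for m in passing if m.corroborated]


if __name__ == "__main__":
    recollections = [
        Recollection("I left the sunglasses in the car", 0.9, corroborated=False),
        Recollection("The meeting was moved to Thursday", 0.6, corroborated=True),
    ]
    # High confidence alone is not enough; only the corroborated memory survives.
    for m in integration_check(recollections):
        print("Worth acting on:", m.claim)
```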

Unfortunately, instituting a verification process sounds like a call to question every action, which would be a poor choice, I think. This is again the dichotomy between thinking and acting. It is optimal in most cases to act based on all known information at the point when a decision is called for. The metacognition applied manually then provides a means to assess whether to act and how to act. Once a determination is made, it should be carried out wholeheartedly (half-hearted acts mostly fail, or produce negative repercussions alongside the primary outcome). Having acted, there is now new information in the form of an outcome, and we study this, too, of course. We do that intuitively, but it is useful to do so deliberately, by intellect, as well.

This process, and the growing, adapting body of knowledge that results, provides a means to lean closer toward actions which have a higher likelihood of achieving a subjectively good outcome (i.e. you get what you want). Perhaps equally practically, it tends to minimize regret.

We all know someone who regularly makes poor decisions and still lives without “ragret.” An inherent pride sustained in ignorance is very tempting! To maintain constant certainty without reflection, without doubt, to remain uniformly sure of yourself, and your rightness, and your own abilities — it feels good! We know that — we all were teenagers once. But this archetype has historically initiated pointless wars, committed genocides, destroyed, ruined, hurt, suppressed, and obliterated, due to this inherent and unquestioning certainty of what the truth is and what the correct moral or practical path ought to be (or is). I make this claim with only history itself as evidence, so I suppose you might interpret it differently.

Not everyone of this character will take such drastic actions or make such terrible choices, of course. But the majority of these plaguing outcomes originate from a lack of doubt. The kind of confidence which rejects any reconsideration of how it was formed tends to become our undoing — individually, and societally.

 

*  *  *

 

Enduring time and effort are required to master any skill (and trying to skip the queue can lead to some pretty rough results). Proof of legitimacy is often required as well, in order to be considered proficient by the field at large.

This is what you’d want. Hurried mastery is no mastery at all, and that generally bears itself out in practice. Yet many acknowledged experts, long in their fields, embarrass themselves with errors, the drama of the consequences proportional to the status of the expert. Do we assume our evaluation of the expert was incorrect? We could track an expert’s proportion of correctness over time, but experts in the public eye are often positioned to let circumstances act as a shield. The sheer busy pace of society also distracts. So experts may remain experts despite judging or acting wrongly more often than they are right. There is always room for human error anyway, including in our evaluation of the failures of others.

We are the most crucial judges of our own actions. Not the sole judges — how would we know whether our own opinions are right if we did not consider the opinions of others? But the bottom line for our considerations and actions falls to ourselves as final evaluators. Confidence comes from knowledge — knowledge not necessarily of which specific facts are true or untrue, good or bad, but a cohesive sense of the approximation of correctness, a scrutinizing answer to the question “how sure am I,” a gestalt built out of every observation, every fact, every bit of understanding we obtain, constructing an integrated picture of the world as we best understand it.

That is power. The act of considering and the awareness of the unknown create the power of potential, cocking the hammer. It is the power to wonder, the power to think (and to want to think), the power not to conclude, and to know when you shouldn’t. It is a power commensurate with the observed difference between the known and the expected unknown, laying the track to tread, paver by paver. Consider your footsteps or they will lead you astray.

We spend our lives in insufficiency. Circumstances overwhelm us, information buries us, and time eludes us. With care and thought and luck we can hope to overcome these to a degree, and in practice that often must suffice. Ars longae, vita brevis — our lives are short, but the practice is long*. You study all your life and know so little of all there is to know, and in the end only history can vindicate or vilify you, if it even recalls you. Therefore we act for ourselves, for our families, for our communities, for our country, for our world, by our own reckoning. Only we can decide whether what happens is good or bad, and our best possible shot at the outcomes we want requires thinking, and waiting, and reflecting.

The power of abstention avoids the blindness of certainty, and the confusion of uncertainty. It is control over thinking, over knowing, over effective use of knowledge and intellect. It betters the society we operate within and ideally try to build, and sets our working minds outside of our established framework of thought entirely, creating a lifelong opportunity to change ourselves.

 


 

References:

  1. From the Out of Your Mind series of collected recordings of Alan Watts (alternate link).
  2. Fragment from an interview with Bertrand Russell (1959)
  3. I’m feeling lazy.

 

Feynman quotes on religion / God / belief:

  1. I agree that science cannot disprove the existence of God. I absolutely agree. I also agree that a belief in science and religion is consistent. I know many scientists who believe in God. It is not my purpose to disprove anything. There are very many scientists who do believe in God, in a conventional way too, perhaps, I do not know exactly how they believe in God. But their belief in God and their action in science is thoroughly consistent. It is consistent, but it is difficult. (The Meaning of It All, pp. 36–37)
  2. If they are consistent with their science, I think that they say something like this to themselves: “I am almost certain there is a God. The doubt is very small.” That is quite different from saying, “I know that there is a God.” I do not believe that a scientist can ever obtain that view – that really religious understanding, that real knowledge that there is a God – that absolute certainty which religious people have. (The Relation of Science and Religion)
  3. God was always invented to explain mystery. God is always invented to explain those things that you do not understand. Now when you finally discover how something works, you get some laws which you’re taking away from God; you don’t need him anymore. But you need him for the other mysteries. (Superstrings: A Theory of Everything?, pp. 208–209)
  4. Once you start doubting — which for me is a very fundamental part of my “soul”, to doubt, to ask — when you doubt and ask, it gets a little harder to believe. … I have approximate answers, and possible beliefs, and different degrees of certainty about different things, but I’m not absolutely sure of anything, and there are many things I don’t know anything about, such as whether it means anything to ask why we’re here.  … I don’t feel frightened by not knowing things, by being lost in a mysterious universe without having any purpose — which is the way it really is, as far as I can tell, possibly. (Extract from interview).

 


 

* “Ars longa, vita brevis” is, to my limited understanding of Latin, better translated as “art/skill is long, life is short”; I chose “practice” to get the specific point across. As to “longae” but “brevis”: we each have our own life to live, but the skill is not ours alone. I hope I do that at least partial justice.

To go further and manifest understanding and skill that encompass the true state of the world-that-is — this is the mind of the hero, and those who approach these heights are generally either lionized or vilified, depending on their actions and on what we know of them. A single step can prove transformational.

To the extent any human can, but here I am in danger of tangenting onto epistemology, cosmology, theology, and the notion of “perfection,” which ___.

 


 

N.B.: I wrote this piece, and it took a long time, but I don’t like it. I agree with what I’ve written, so far, but it came out much more lecturing in tone than I have any right to be for so speculative a train of thought. That being said, I’ve found it a useful perspective to apply, and if you are reading this then I stand by that, at least in the here and now (and if you are me later on, the same goes).

If you want to contact me about this article for any reason, please email me at martin at mberlove dot com.