Updated: Sep 17
Not making a decision is a decision.
“I suppose it is human nature to get anchored in our beliefs. The key is knowing when to drop anchor and sail away.”
I’m really curious about how we change our minds, especially as it impacts our personal and professional growth. This article describes approaches and considerations for changing our minds, provides context from science and business, and offers real-life examples of changing our minds and reaching our goals.
This article is presented with the following sections:
Background and Building An Understanding
Active Listening and Living In It
Pulling the trigger
Conclusion (sort of)
Background and Building An Understanding
I change my mind over time. It is usually an evolving process, not a thunderbolt. I tend to be slow to build an understanding and equally slow to change my mind. Changing my mind generally results from a weight of evidence effect.
The following describes my approach to changing my mind. Imagine all the evidence you know about a belief could fill a bucket. The individual pieces of evidence would have different sizes and colors:
The color would represent whether the evidence supports your belief or not (to build our mental picture -> pro is green, con is red), and
the size would represent the weight (or importance) of the evidence (big or small), like different-sized rocks.
The portion of the bucket representing the pro evidence acts like a belief probability.
So, if 70% of the evidence in the bucket supports my belief, then my belief may be sound, though there is some contrary evidence (30%).
So, what if new evidence is added to the bucket? If the new information is net pro (supporting) evidence, then my belief probability increases. Alternatively, if the new information is net con (contrary) evidence, then my belief probability decreases. If the supporting belief probability approaches the 50% threshold, I seriously consider changing my mind. This approach is meant to provide some structure and flexibility to update as I learn. At a minimum, I subjectively rank order and compare the existing and new weighted evidence. Sometimes, when a decision is more complex, I will more formally model it. See my notes for a narrative about modeling the college decision. (1)
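The bucket approach above can be sketched in a few lines of code. This is a minimal illustration, not the author's actual model: the evidence weights, the pro/con labels, and the 50% threshold check are assumptions chosen to mirror the narrative.

```python
# A minimal sketch of the "evidence bucket" belief-probability idea.
# Weights and pro/con labels below are illustrative assumptions.

def belief_probability(evidence):
    """evidence is a list of (weight, is_pro) tuples; weight > 0."""
    total = sum(w for w, _ in evidence)
    pro = sum(w for w, is_pro in evidence if is_pro)
    return pro / total if total else 0.5  # empty bucket -> no information

# Existing bucket: 70% of the weight supports the belief.
bucket = [(4, True), (3, True), (3, False)]
print(round(belief_probability(bucket), 2))  # 0.7

# New contrary evidence arrives; the probability drops toward the 50% line.
bucket.append((2, False))
print(round(belief_probability(bucket), 2))  # 0.58

if belief_probability(bucket) < 0.5:
    print("Seriously consider changing your mind.")
```

Note the design choice: adding evidence never erases what was there before, so a bigger bucket naturally requires more new weight to move the probability, which is the "belief inertia" discussed next.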
A higher quantity of existing evidence (a bigger bucket) may create a kind of “belief inertia.” This occurs because a bigger evidence bucket requires more new evidence to move the needle on my perspective. Importantly, “new evidence” may include error correcting the existing weight of evidence.
Also, our own brains tend to work against us when it comes to changing our minds. This may manifest as the false dichotomy logical fallacy or familiarity bias. (2) For example, I do my best not to fall into the false dichotomy trap (i.e., the black-or-white beliefs our brain favors for convenience). So, if the existing belief probability is closer to 50%, I tend to be skeptical and seek new evidence to challenge a held belief. I am less skeptical of beliefs that are more probabilistically certain (closer to either 0% or 100%). That said, I do like to challenge my substantive beliefs, no matter the probability. I especially appreciate new and curated (or “believable”) information. Even if it doesn’t change my mind, I will certainly add it to the bucket, evaluate it as a pro or con, and reformulate the overall belief probability. Actually, my favorite outcome is when new information does change my mind. I find it quite liberating!
Another way to look at this approach: I am regularly on the lookout for either a) errors of commission (my misunderstanding) or b) errors of omission (something I’m missing). As a card-carrying introvert, I find this sort of internal processing gives me personal energy! (3) How quickly I challenge a belief is usually more of a time-prioritization issue. That is, if I’m pressed for time on another topic, I may not be as available when new information presents itself. In this case, I endeavor to “tuck it away” for future evaluation. (4)
I worked at KPMG, a “Big 4” Audit, Tax, and Advisory firm, for 10 years, from 2009 to 2019, ultimately as a Managing Director. My decision to leave KPMG went something like this:
My overall mission (or “why” statement) for being at KPMG was twofold.
To help financial services clients, and,
to help junior team members develop and progress in their career.
We were incredibly successful. I am proud of my teams for 1) helping many banks and millions of their loan customers through the financial crisis, 2) standing up an industry Artificial Intelligence enabled technology and operating platform to improve loan quality and operational productivity, and 3) growing a strong university recruiting and development linkage.
I would periodically compare my ability to execute against these mission statements. When it came time to leave, the decision was based on my analysis that KPMG and I were prospectively no longer aligned enough with MY ability to execute against the mission. (think probabilistically - forward alignment < 50%) Keep in mind, KPMG is an excellent firm. But ultimately, I had evolved to need a different environment to meet mission goals.
At this point in the article, it would seem all my decisions are made from evidence that has known information value or “language.” This is certainly NOT true!
I also tend to listen to my gut, which has no language and manifests itself as a feeling or intuition. That is, I’m not exactly sure why my gut believes something….It just does and I just do. Per our brain model, I’m referring to the difference between the high emotion tag / low language processing path, as opposed to the low emotion tag / high language processing path. To be clear, these paths are highly dynamic and interactive. People process along both these mental pathways all the time. I’m presenting these pathway segments as a convenient way to frame our brain’s processing approach as related to changing our minds.
First, you may already anticipate the problem with this approach: if some of the evidence is held in our low language processing centers, how do we evaluate it to build the probability? Good question, as it is tough to evaluate something that provides little language! This is where listening to your gut and seeking to build language comes into play (I call it “living in it”). (5)
It may mean listening to believable (6) others that may have already successfully “languagized” a perspective. Or,
sometimes, it may mean just going with it and doing your best.
The next section discusses integrating our gut into the decision-making process and changing our minds.
Active Listening and Living In It
Related to changing our minds, economist Steve Levitt interviewed scientist Moncef Slaoui. Dr. Slaoui led the U.S. Government’s Operation Warp Speed, started in 2020 to develop COVID-19 vaccines. I thought the following interview exchange was particularly relevant.
“I'm really curious if you have any advice as someone who obviously thinks logically and scientifically about how to change the minds of people who don't think logically and scientifically. When you run into people who are ideological or anti science or conspiracy believers, do you have strategies for being persuasive?”
“That's a very good question. I would say my first advice, which is slightly tangential to what you said, I always say I'm not going to spend a lot of my time describing the problem. I quite quickly, once I have somewhat described the problem, I need to start thinking about solutions because that's how you move forward. Otherwise, if you continue describing the problem, you stay still. I use somewhat of a similar strategy of building up an argument and engagement where people understand that I'm genuine and authentic and I'm not at any cost trying to change their mind.
I'm actually mostly trying to understand how they think and start from there, because in order to really convince somebody of something, you need to truly exchange views, which means active listening and understanding why they say something. Some people it's impossible, but otherwise that would be my starting point. And my advice is:
A - active listening, and
B - once you have some grasp of the problem, think of solutions that also create energy and momentum to move forward.”
Specific to the weight of evidence model, Dr. Slaoui’s comments suggest our perspective is a combination of right and left hemisphere-based pathways, as mentioned in the last section. That is, some weight of evidence has language and some doesn’t; some of it applies a high emotion limbic tag (more emotion) and some applies a low emotion limbic tag (little emotion). Active listening is an approach to help the other person process and reveal the basis for their thinking. It will also help the listener understand what drives the other’s view and the degree to which they may change their mind in the short term. My hypothesis: those with a high proportion of high emotion tag / low language evidence in their evidence bucket will take longer to change their mind. This occurs because they may lack the internal understanding to evaluate the “how they feel” evidence with language and critical thought. Said another way, people with beliefs generated more from emotion (and less from objective information) are likely to be slower to change their minds. (7)
To some degree, those with higher education as it relates to the topic will be more immediately open to adapt to a quickly changing environment. In the interview, Dr. Slaoui called this “educated intuition.” They have the developed language to understand the problem. Those with lower education on a topic may be slower to adapt as a higher proportion of understanding is found in lower language brain centers.
To be clear, “education” as I use it here is in the broad sense. That is, education may arrive from formal schooling, experience, practice, self-teaching, or some combination thereof. It is not necessarily related to formal or credentialed education. Regardless of the source of education, building language by accessing the left hemisphere is helpful when building your total evidence bucket.
Also, living in it, or building solutions, is a great way to clarify decisions and add to your total evidence bucket. That is, do not let yourself get bogged down by inaction because you want more information. After gathering initial evidence, often “doing” is the best way to collect more information, error-correcting along the way.
In 2019, I took a flyer by agreeing to be the COO of an investment management firm. They had a very interesting blockchain technology I thought would be a good fit for a number of banking contexts. I certainly researched the industry and the firm’s principals, but ultimately I decided the only way to really learn was by doing. I worked hard to build firm capability, taught myself a different business, worked client opportunities, networked, attended conferences, etc. Ultimately, the owner was unable to make payroll, and I immediately left the firm.
Was I disappointed? A little. But mostly, other than my time, I took little risk and learned a ton about something new and interesting. I’m grateful for the experience, and grateful it only cost a little time.
As an aside, to some degree, our laws and religions codify evidence to provide certainty. Also, many people have certain personal standards to which they adhere.
For example, most religions support the golden rule - “Treat others as you want to be treated.” Laws generally make it illegal to assault or kill someone.
The point being, when it comes to changing your mind, there are certain understandings bounded by cultural or personal weight of evidence certainty. This generally makes decision-making easier as it reduces the need to weigh certain evidence. (8) For personal standards, I do try to be thoughtful and flexible, recognizing that increasing entropy is the only real certainty in life. Certainly, some have very structured personal standards.
Pulling the trigger
We have discussed some finer points for building and maintaining our belief bucket, including how to use objective information and your gut. So, let’s say the scale tips, your supporting belief probability drops below the 50% threshold, and you are faced with a change opportunity. Then what? It’s time to change your mind and take action! The reality is, most people hold on longer than they should. Changing your mind includes messy emotions and habits that intrude on the decision-making process but do not always add value to the decision. In my KPMG example, my decision to leave was tough. We had been very successful, I had built many great working relationships, I had comfort in knowing how to work the organizational system, and I had some fear of going from the known to the unknown. Basically, I had many excuses not to make a change. I do not regret the change for a moment. My only advice: consider these sorts of decisions like an economist. Consider marginal value (that is, the “bucket” value created incrementally and likely in the future) and do not consider sunk costs or sunk value. For example, in my KPMG example, my three “pride points” are sunk value.
What if you just can’t decide? The weighted evidence probability is about 50% and the decision could fall either way? My advice, when in doubt, make the HARDER decision. For example, if you are deciding between staying and leaving a job, and the weighted evidence is about a tie, I would leave the job. Why? While the evidence may seem at 50%, complete objectivity is almost impossible. That is, your natural familiarity bias is an invisible weight inappropriately biasing toward the easier decision. Steve Levitt, in a related coin toss experiment, came to a similar conclusion. (9)
Please remember, not making a decision is a decision! This is certainly easier said than done, especially since our own brain is sometimes working against us. However, I did find this advice helpful when pulling the trigger and after the bucket analysis showed it made sense.
Conclusion (sort of)
At this point, I am not ready to conclude this article. I’d like to live in it more, check other believable sources, and allow my introversion processes to turn. I am very interested in what others think.
(1) A close cousin to “change my mind” is “make up my mind.” Here is an example of how I used objective decision-making to create a belief probability assignment program to help my kids make up their minds about their college choice. The program ranks and weights key decision categories and colleges, producing a single weighted score to compare colleges. It includes both objective and emotion-based criteria, and it ultimately informed their college decisions. (Please see our article The College Decision for the decision model.) From a parenting standpoint, this was a great way to teach a little decision science and to help build my kids’ decision-making confidence. By the way, 3 of my kids went to James Madison University, and one went to Christopher Newport University. Both are excellent state schools, and our kids have thrived. For more information on the college decision, please see our articles The College Decision - Framework and tools for investing in your future, The Stoic’s Arbitrage, and How to make money in Student Lending.
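A rank-and-weight program like the one described can be sketched briefly. This is a hypothetical illustration, not the author's actual decision model: the category names, weights, and 1-10 scores below are invented for the example.

```python
# Hypothetical weighted-score college comparison, in the spirit of the
# decision program described in this note. All weights/scores are invented.

WEIGHTS = {"cost": 0.30, "academics": 0.30, "campus_feel": 0.25, "distance": 0.15}

# Scores are 1-10 per category, mixing objective and emotion-based criteria.
colleges = {
    "College A": {"cost": 8, "academics": 7, "campus_feel": 9, "distance": 6},
    "College B": {"cost": 6, "academics": 9, "campus_feel": 7, "distance": 8},
}

def weighted_score(scores):
    """Collapse the category scores into a single comparable number."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Rank the colleges by their single weighted score, highest first.
for name in sorted(colleges, key=lambda n: -weighted_score(colleges[n])):
    print(f"{name}: {weighted_score(colleges[name]):.2f}")
```

The single weighted score is what makes otherwise apples-to-oranges criteria (cost vs. campus feel) directly comparable across schools.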
(2) In addition to the false dichotomy logical fallacy, familiarity bias is also likely in play. If you have a choice between two options in your life or work — a safe one and a risky one — which one will you take? In a series of experiments, psychologists Chip Heath and Amos Tversky showed that when people are faced with a choice between two gambles, they will pick the one that is more familiar to them, even when the odds of winning are lower. Our brains are designed to be wary of the unfamiliar. Familiarity bias is our tendency to overvalue things we already know. When making choices, we often revert to previous behaviors, knowledge, or mindsets. By the way, the literature on cognitive bias has grown substantially in recent decades. One of my favorite sources is Daniel Kahneman’s behavioral economics classic, Thinking, Fast and Slow.
(3) See our article for more information on introversion and extroversion thinking processes.
(4) "Tuck it away" runs some risk. What if you do not remember to write it down? What if you think, "That is so far away from my understanding, I'm not going to bother with it...I'm too busy!" The risk here is known as "Confirmation Bias." Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. It is tough to be disciplined, especially when busy. I manage this by keeping ongoing notes and periodically referring to them. I use an informal but relatively structured method that is described in our article Curiosity Exploration - An evolutionary approach to lifelong learning.
(5) My mentor and CEO of Rockland Trust Company, Chris Oddleifson, was famous for saying “Live in it.” He would say it in the context of getting a working team unstuck and on to the business of creating solutions.
(6) I first heard “believability” used in this context by Ray Dalio in his book, Principles.
(7) This relates to the Dunning–Kruger effect. People may overestimate their ability until they increase their bucket to a certain size AND with a higher proportion of language-informed evidence. This also reminds me of Bertrand Russell’s famous quote: “The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts.”
(8) NN Taleb, in his book Skin In The Game, considers long-term religious principles as part of tail risk management. They exist because they work and have stood the test of time. Like the Golden Rule. “....religion exists to enforce tail risk management across generations, as its binary and unconditional rules are easy to teach and enforce. We have survived in spite of tail risks; our survival cannot be that random.”
(9) Heads or Tails: The Impact of a Coin Toss on Major Life Decisions and Subsequent Happiness, published by Oxford University Press: “The two-month survey found that participants favoured the status quo, making a change less frequently than they predicted they would before the coin toss. At the six-month survey, this bias toward the status quo was gone. Furthermore, those who were instructed by the coin toss to switch their current position were more likely to actually make the change, reported that they were substantially happier, and said that they were more likely to make the same decision if they were to choose again.”
Hofstadter, in Gödel, Escher, Bach, describes a similar mental process for how beliefs are evaluated. He calls the RH interaction a “brain state” that impacts which LH-rendered beliefs and symbols are activated (a symbol is a building block for a belief). His use of a “probabilistic catalogue” nicely aligns with the “weight of evidence” presented here.
“In summary, then, a chunked description of a brain state will consist of a probabilistic catalogue, in which are listed those beliefs which are most likely to be induced (and those symbols which are most likely to be activated) by various sets of "reasonably likely" circumstances, themselves described on a chunked level.”
I just heard Steve Levitt interview Daniel Kahneman to discuss his new book, Noise. It was just released, and I look forward to reading it. I like the premise of separating noise from bias, though, to some degree, the two are related. System noise tends to occur randomly as a variance from the mean, whereas bias is associated with unfairness from human intervention. Either way, focusing on the decision-making process to remove noise or bias is critical, especially if the decision-making process involves uncertainty. In my article Credit Analytics - When a decade of data suddenly loses predictability!, I wrote of utilizing a multiple, independent teams approach to create new pandemic-affected loss estimates. The idea is to use a panel of independent expert views and a structured multi-input Monte Carlo simulation process to develop the new estimates. 8/8/21
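The panel-plus-simulation idea can be sketched in miniature. This is a loose, hypothetical illustration of combining independent expert views via Monte Carlo, not the structured process from the cited article: the expert mean/standard-deviation estimates and the equal-weight mixture are invented assumptions.

```python
# Hypothetical sketch: combine independent expert loss-rate estimates with a
# simple Monte Carlo mixture. All numbers below are invented for illustration.
import random

random.seed(42)  # reproducible draws

# Each independent expert team provides a (mean, std dev) loss-rate estimate.
experts = [(0.03, 0.01), (0.05, 0.02), (0.04, 0.015)]

def simulate(n_trials=100_000):
    draws = []
    for _ in range(n_trials):
        mean, sd = random.choice(experts)     # sample an expert view...
        draws.append(random.gauss(mean, sd))  # ...then a loss rate from it
    return draws

draws = sorted(simulate())
print(f"mean loss estimate: {sum(draws) / len(draws):.3%}")
print(f"95th percentile:    {draws[int(0.95 * len(draws))]:.3%}")
```

Sampling across the panel (rather than averaging the experts first) preserves the disagreement between teams, so the tail percentiles reflect genuine uncertainty about which expert view is right.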
The book Subtract, The Untapped Science of Less by Leidy Klotz is well aligned with many of this article’s concepts. Above, I discussed that I’m on the lookout for errors of commission, which is based on incorrect evidence I used to support some belief. In effect, I will edit or subtract the evidence interpretation and update my belief. Klotz considers this editing or subtraction to be a critical and often overlooked part of the creative process.
“The pinnacle of mental subtraction is when we remove ideas that are no longer correct, or that never were in the first place.”
I’m guessing Dr. Klotz would encourage me to use subtraction for errors of commission and other evidence issues that may lack relevance. In other words, don’t just adjust the wrong evidence; weed out the evidence that lacks relevance. As a slight difference from Dr. Klotz’s pinnacle recommendation, I do not necessarily remove beliefs that are incorrect; rather, I change my mind as to the belief. In my article The case for removing Civil War Statues, I use an analogy to demonstrate how I changed my mind on Civil War statues.