Changing Our Mind


Not making a decision is a decision.


“I suppose it is human nature to get anchored in our beliefs. The hard part is knowing when to drop anchor and sail away.”

Jeff Hulett is the Executive Vice President of the Definitive Companies, a decision sciences firm. His background spans data science and decision science, with leadership roles at consulting firms including KPMG and IBM, and at financial services organizations such as Wells Fargo and Citibank. He is a board member and rotating chair at James Madison University.

 

Have you ever noticed how difficult it is for people to change their minds? It seems even more challenging to convince someone else to change their mind. This article explores how we change our minds, especially as it impacts our personal and professional growth. We use a multi-disciplinary approach, considering research from Decision Science, Economics, Psychology, Neuroscience, and Political Science. The article provides practical business and science contexts, with real-life examples of changing our minds and reaching our goals.


This article is presented in the following sections:

  1. Background and Building An Understanding

  1a. Changing our mind - approach

  1b. Changing our mind - information curation

  1c. Changing our mind - listening to our gut

  2. Active Listening and Living In It

  3. Pulling the Trigger

  4. Conclusion (sort of)

  5. Notes and Musings


1. Background and Building An Understanding


I change my mind over time. It is usually an evolving process, not a thunderbolt. I tend to deliberately build an understanding and actively consider changing my mind. A decision to change my mind may result after effectively weighing the evidence about a held belief. Generally, beliefs are formed from our experiences, whether from childhood, family, education, professional, or others. In this article, we accept our initial belief set as a given. However, please see our notes for resources to further explore our initial belief formation. [i] This article provides methods and context for how to update your beliefs and take action based on those updates. Eliezer Yudkowsky, a Decision Scientist and Artificial Intelligence theorist [ii], suggests:

"Every question of belief should flow from a question of anticipation."

To this end, we will explore changing our minds by first identifying our beliefs, then evaluating anticipatory evidence supporting, or not supporting, a belief. Finally, we will provide suggestions for acting upon our updated beliefs.

1a. Changing our mind - approach

The following is an approach to changing our mind. Imagine all the evidence you know about a belief could fill a bucket. The individual pieces of evidence would have different sizes and colors.

  • The color would represent whether the evidence supports your belief or not (to build our mental picture: pro is green, con is red), and

  • the size would represent the weight (or importance) of the evidence (big or small), like different size rocks.

The portion of the bucket filled with pro evidence acts like a belief probability.

So, if 70% of the evidence in the bucket supports our belief, then the belief may be sound, though there is some contrary evidence (30%).

“The best choices are the ones that have more pros than cons, not those that don’t have any cons at all”

- Ray Dalio, Chairman, Bridgewater Associates


So, what if new evidence is added to the bucket? If the new information is net pro (supporting) evidence, then the belief probability increases. Alternatively, if the new information is net con (contrary) evidence, then the belief probability decreases. If the supporting belief probability approaches the 50% threshold, we should seriously consider changing our mind. This approach is meant to provide some structure and flexibility to update as we learn. At a minimum, we should subjectively rank order and compare the existing and newly weighted evidence. Please see our notes for a definition and examples of "evidence." We also discuss how evidence relates to our rock size and color. [iii]
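To make the bucket arithmetic concrete, here is a minimal Python sketch of the weight-of-evidence idea. It is an illustration only - the BeliefBucket name and structure are my assumptions, not a published algorithm. Each piece of evidence is a "rock" with a weight (size) and a pro/con flag (color), and the belief probability is simply the pro share of the total weight:

    from dataclasses import dataclass, field

    @dataclass
    class Rock:
        """One piece of evidence: a weight (size) and a color (pro or con)."""
        weight: float   # importance of the evidence
        pro: bool       # True = green (supports the belief), False = red (contrary)

    @dataclass
    class BeliefBucket:
        """Illustrative weight-of-evidence model for a single belief."""
        rocks: list = field(default_factory=list)

        def add(self, weight: float, pro: bool) -> None:
            self.rocks.append(Rock(weight, pro))

        def probability(self) -> float:
            """Belief probability = pro weight as a share of total weight."""
            total = sum(r.weight for r in self.rocks)
            if total == 0:
                return 0.5  # no evidence yet: stay agnostic
            return sum(r.weight for r in self.rocks if r.pro) / total

    # 70% of the evidence weight supports the belief...
    bucket = BeliefBucket()
    bucket.add(7, pro=True)
    bucket.add(3, pro=False)
    print(f"{bucket.probability():.0%}")  # 70%

    # ...then new contrary evidence arrives and the probability falls.
    bucket.add(4, pro=False)
    print(f"{bucket.probability():.0%}")  # 50% -> seriously consider changing your mind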

 

Side Note: As decision complexity increases

In the main, this article is based on the simpler “changing our mind” decisions found in our day-to-day life. These are when we are considering a single belief. In the decision science world, a “belief” is more generally known as a decision outcome - like “I should change my current job.” The different colors and sizes of rocks relate to decision criteria. The decision criteria are evaluated by assessing the related evidence. In a simpler decision, the outcome has just two decision alternatives: to either change or not change our mind.

Sometimes, decisions are more complex. These are decisions with multiple criteria-based evidence and multiple decision alternatives. When a decision is more complex, it is helpful to create a decision model using readily available decision science-enabled tools. Please see the notes section for a decision process narrative example and related decision tools to make the best college decision. [iv]
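For the more complex, multi-criteria case, a decision model can be sketched in a few lines. This is a generic weighted-sum sketch under assumed criteria, weights, and scores - not the specific decision tools referenced in the notes:

    # Illustrative multi-criteria decision model: weighted sum over alternatives.
    # The criteria, weights, and scores below are hypothetical.
    criteria_weights = {"salary": 0.40, "location": 0.35, "growth": 0.25}

    # Score each alternative on each criterion (0 = worst, 1 = best).
    alternatives = {
        "stay in current job": {"salary": 0.6, "location": 0.9, "growth": 0.3},
        "take the new offer":  {"salary": 0.8, "location": 0.5, "growth": 0.9},
    }

    def weighted_score(scores: dict) -> float:
        return sum(criteria_weights[c] * s for c, s in scores.items())

    for name, scores in alternatives.items():
        print(f"{name}: {weighted_score(scores):.2f}")
    # The highest weighted score identifies the leading alternative;
    # re-running with updated scores is how the model "changes its mind."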

 

We now return to our simpler decisions. A higher quantity of existing evidence (a bigger bucket) may create a kind of “belief inertia”: a bigger evidence bucket requires more new evidence to move the needle on the existing belief. Belief inertia may be a sign you have “more to lose.” Importantly, “new evidence” may include error-correcting the existing weight of evidence - that is, updating evidence you may have previously accepted. Rationally, having more to lose should not change our ability to change our mind. If anything, a larger belief bucket should increase the evidence available for updating, because a higher volume of evidence means more candidates for error correction. However, there is significant research showing that belief inertia may distort or inappropriately impact how we change our mind. In effect, certain evidence tends to “harden” as a result of belief inertia.
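The belief inertia effect falls directly out of the bucket arithmetic. Using the illustrative BeliefBucket sketch from earlier, the same new red rock moves a small bucket much further than a large one:

    # Two buckets, both 70% pro, different sizes (units are illustrative).
    small = BeliefBucket()
    small.add(70, pro=True)
    small.add(30, pro=False)     # 100 units of evidence

    large = BeliefBucket()
    large.add(700, pro=True)
    large.add(300, pro=False)    # 1,000 units of evidence

    # The same contrary evidence (weight 10) lands in both buckets.
    small.add(10, pro=False)
    large.add(10, pro=False)

    print(f"{small.probability():.1%}")  # 63.6% -- the needle moves
    print(f"{large.probability():.1%}")  # 69.3% -- belief inertia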

Hardened evidence makes it more challenging to appropriately evaluate evidence (i.e., to update the rock color). For example:

  • Belief inertia and fear: In my life, wealth relates to overcoming belief inertia. As I have aged and increased my wealth base, it has become easier to make change decisions. I feel less anxious knowing I have a financial buffer in the event things do not work out as planned. I have found it easier to overcome belief inertia with a buffer. My wife and I started with very little financial wealth. I was the primary breadwinner and we had four young children. The belief inertia resulting from fear of change was more powerful then. If there is anything I learned, it is to let go of the fear, because things usually do work out - not always as planned, but they do tend to work out. To dig deeper into how willingness to change jobs impacts personal finance, please see our article They kept asking about what I wanted to do with my life, but what if I don't know?

  • Belief inertia and overconfidence: Julia Shvets is an economist at Christ's College in Cambridge, England. In a study, Shvets and colleagues found that only about 35% of managers accurately self-assessed their future performance as compared to their actual current work performance, even though these managers had recently received evidence (i.e., a performance review) about their most recent performance. These managers were also biased toward overconfidence about their future performance. [v]

  • Belief inertia and expertise: Francis Fukuyama is an author and political scientist at Stanford University. In a 2022 interview, Fukuyama recalled a situation from his time as an official in the U.S. State Department. It is an example of how expertise-driven belief inertia can lead people not to change their minds when they should:

"In May of 1989, after there had been this turmoil in Hungary and Poland, I drafted a memo to my boss, Dennis Ross, who was the director of the office that sent it on to Jim Baker, who was the Secretary of State, saying we ought to start thinking about German unification, because it didn’t make sense to me that you could have all this turmoil right around East Germany and East Germany not being affected. The German experts in the State Department went ballistic at this. You know, they said, “This is never going to happen.” And this was said at the end of October. The Berlin wall fell on November 11th. And so I think that the people that were the closest to this situation — you know, I was not a German expert at all, but it just seemed to me logical. But I think it’s true that if you are an expert, you really do have a big investment in seeing the world in a certain way, whereas if you’re an amateur like me you can say whatever you think." [vi]

When it comes to handling belief inertia, the great British mathematician and philosopher Bertrand Russell’s timeless aphorism is on point. Russell provides an appropriate reminder that good decision-making is not necessarily easy:

“The whole problem with the world is that fools and fanatics are always so certain of themselves, and the wiser people so full of doubts.”

- Bertrand Russell, 1872-1970

Russell is suggesting wise people appreciate the dynamic nuances of the size and color of our “rock” evidence. He suggests it is only fools and fanatics who simplify decisions as “all green!” or “all red!”

Our own brains tend to work against us when it comes to changing our minds. This may be expressed as the false dichotomy logical fallacy, familiarity bias, [vii] or confirmation bias. For example, I do my best not to fall into the false dichotomy trap - the black-or-white “thinking fast” beliefs our brain favors for processing convenience. So, if the existing belief probability is closer to 50%, I tend to be skeptical and seek new or updated evidence to challenge a held belief. I am less skeptical of beliefs that are more probabilistically certain (closer to 0% or 100%). But I do like to challenge my substantive beliefs, no matter the existing probability. I especially appreciate new and curated (or “believable”) information. Even if it does not change my mind, I will add it to the bucket, evaluate it as a pro or con, and reformulate the overall belief probability. My favorite outcome is when new information does change my mind. That feeling may start as apprehension, but I have learned to appreciate it as liberation!

Another way to look at this approach is... we should regularly be on the lookout for 2 types of errors:

  • errors of commission -> misunderstanding of evaluated evidence (false positive)

- or -

  • errors of omission -> evidence overlooked and not evaluated (false negative)

This evaluation approach is very helpful for challenging, updating, or confirming held beliefs. How quickly beliefs are challenged is usually more of a time prioritization issue. [viii] That is, if we are time-pressed, we may not be as available when new information presents itself. In this case, we should endeavor to “tuck it away” for future evaluation. There is a body of research suggesting that time-pressed people, especially chronically time-pressed people, struggle with decision-making. This is beyond the scope of this article, but please see the notes for an excellent resource on scarcity, including time scarcity. [ix] The important takeaway is to beware of time scarcity and its impact on your ability to both collect and evaluate evidence for your decision bucket.


1b. Changing our mind - information curation

In recent decades, changing sources of curated information have negatively impacted decision-making. There was a time when we received balanced information via broadcast media networks. In the current age, social media-based narrowcasting has segmented our media information to deliver potentially biased and incorrect data. Changing our mind is particularly challenging if we lose confidence in our data sources or only consume narrowly presented viewpoints. Our evidence bucket, unattended, may be incomplete, biased, or wrong. In the words of sociologist Zeynep Tufekci:

"The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself."

The good news is that we can all be good curators of information to fill our evidence bucket. [x]


Think of your data sources as being part of your very fertile information garden. It is a garden holding your unbiased, high-quality information sources, but it may also harbor unwanted data weeds or invasive data plants. Your job as chief gardener is to constantly weed your information garden and introduce new, high-quality information plants. Also, every good garden has variety. Being open to many curated information species is the key to a healthy garden and to appropriately updating your evidence bucket. Keep in mind that unbiased, high-quality information that causes you to change your mind may initially feel weedy.

Please see our article Information curation in a world drowning in data noise for an effective approach to successfully curating your information. Matthew Jackson, an economist at Stanford University, provides an aligned perspective. In a recent interview [xi], he said:

“People tend to associate with other people who are very similar to themselves. So we end up talking to people most of the time who have very similar past experiences and similar views of the world, and we tend to underestimate that. People don’t realize how isolated their world is.”
 

A consulting firm example

I worked for KPMG, a “Big 4” Audit, Tax, and Advisory firm, for 10 years, from 2009 to 2019. I was a Managing Director. My decision to leave KPMG went something like this:

First, to understand why I left, we must start with my reason for being there in the first place. My overall mission (or “why” statement) for being at KPMG was twofold.

  1. To help financial services clients, and,

  2. to help junior team members develop and progress in their careers.

We were incredibly successful. I am proud of my teams for:

  1. Helping many banks and millions of their loan customers through the financial crisis,

  2. Standing up an industry Artificial Intelligence enabled technology and operating platform to improve loan quality and operational productivity, and

  3. Growing a strong university recruiting and development linkage.

I would periodically compare my ability to execute against these mission statements. When it came time to leave, the decision was based on my analysis that KPMG and I were prospectively no longer aligned enough with MY ability to execute against the mission. (think probabilistically - anticipated forward alignment < 50%) Keep in mind that KPMG is an excellent firm. But ultimately, I had evolved and anticipated needing a different environment to meet mission goals.

 

1c. Changing our mind - listening to our gut


At this point in the article, it would seem typical decisions are made only from evidence that has known information value or “language.” This is certainly NOT true!


We also tend to listen to our gut. Our gut has no left hemisphere-generated language and manifests itself as right hemisphere-based feeling or intuition. That is, we are not exactly sure why our gut believes something - it just does, and we just do. Per our brain model, we are referring to the difference between the high emotion & low language pathway and the low emotion & high language pathway. As the model suggests, it is our left hemisphere that houses our language. As Daniel Kahneman discusses in Thinking, Fast and Slow, our “fast,” emotion-based thinking will bypass our left hemisphere, which sits on the low emotion & high language pathway. Without left hemisphere processing, it is difficult to add language to evidence. To be clear, these mental pathways are highly dynamic and interactive. People process along both pathways all the time. They are presented as convenient brain signaling segments to frame our brain’s processing approach as related to changing our minds.


You may already anticipate the problem with this gut-based approach: if some of the evidence is held in our low language processing centers, how do we evaluate it to build our probability understanding? Good question! It is tough to evaluate a problem if your experience and learning provide little language. This is where listening to your gut and seeking to build language comes into play. I call it “living in it.” [xii]

  1. It may mean listening to and learning from believable [xiii] others who have already successfully “languagized” a perspective. Or,

  2. sometimes, it may mean just going with it, doing your best, and adapting as you learn.

The next section discusses integrating our gut into the decision-making process and changing our minds.


2. Active Listening and Living In It


Related to changing our minds, University of Chicago economist Steve Levitt interviewed scientist Moncef Slaoui. Dr. Slaoui ran the U.S. Government’s Operation Warp Speed, which started in 2020 to develop COVID-19 vaccines. Extraordinary speed was required to build the new vaccine, and the result was incredible: Operation Warp Speed was a tremendous success. Future historians will likely consider it one of mankind’s outstanding achievements. This environment was a great example of quickly building a knowledge base with gut, curated information, and objectively informed judgment. From our brain model framework, the COVID-19 vaccine developers had to bring and build evidence from both their high emotion & low language pathways and their low emotion & high language pathways. Dr. Slaoui discusses Operation Warp Speed’s environment as necessitating a combination of pathways - he calls this “educated intuition.” The following interview exchange is particularly relevant:

 

A COVID-19 vaccine development example

Levitt:

I'm really curious if you have any advice as someone who obviously thinks logically and scientifically about how to change the minds of people who don't think logically and scientifically. When you run into people who are ideological or anti-science or conspiracy believers, do you have strategies for being persuasive?


Slaoui:

That's a very good question. I would say my first advice, which is slightly tangential to what you said, I always say I'm not going to spend a lot of my time describing the problem. I quite quickly, once I have somewhat described the problem, I need to start thinking about solutions because that's how you move forward. Otherwise, if you continue describing the problem, you stay still. I use somewhat of a similar strategy of building up an argument and engagement where people understand that I'm genuine and authentic and I'm not at any cost trying to change their mind.


I'm actually mostly trying to understand how they think and start from there, because in order to really convince somebody of something, you need to truly exchange views, which means active listening and understanding why they say something. Some people it's impossible, but otherwise that would be my starting point. And my advice is:

  • Active listening, and

  • Once you have some grasp of the problem, think of solutions that also create energy and momentum to move forward.

 

Specific to the weight of evidence model, Dr. Slaoui's comments suggest our perspective is a combination of the right and left hemisphere-based pathways mentioned in the last section. That is, some weight of evidence stimulates the high emotion & low language pathway and some stimulates the low emotion & high language pathway. Active listening is an approach to help the other person process and reveal the basis for their thinking. It also helps the listener understand what drives the other’s view and the degree to which they may change their mind in the short term. Please see our notes for a deeper dive into career decisions, including our brain's interactions when learning to like skills, tasks, and missions. [xiv]

Similarly, Steven Sloman, a Brown University psychology professor, suggests an effective “changing our mind” strategy: have someone OBJECTIVELY explain how something works, instead of providing potentially FEELING-based reasoning for why they should or should not do it. Sloman’s approach is a good way to help people both a) reveal their true (often lacking) understanding of a problem and b) anchor themselves in a productive mindset open to changing their minds. Sloman also suggests that we are part of a community-based hive mind - that is, we do not think alone. [xv] As such, understanding how others impact our thinking is critical to understanding our own thinking. Back to our brain model, it is the right hemisphere that actively seeks to connect with the outside world. We are genetically wired to be part of a wider network of thinking.


The hypothesis is that those with a high proportion of high emotion & low language evidence in their evidence bucket will take longer to change their mind. This is because they may lack the internal understanding to evaluate the “how they feel” evidence with language and critical thought. Said another way, people whose beliefs are generated more from emotion (and less from objective information) are likely to be slower to change their minds. It generally requires language to enable change-supporting reflection and evaluation. [xvi]


To some degree, those with higher education as it relates to the topic will be more immediately open to adapting to a quickly changing environment. Even if the topical education is related but not always specific, they are likely able to frame and structure the problem, ask good questions, and fill holes in understanding. This enables moving forward and creating solution momentum. Per the vaccine development example, Dr. Slaoui called this “educated intuition.” For Operation Warp Speed to be successful, he needed teammates possessing the relevant language to quickly understand the problem and move to a solution. In other words, Dr. Slaoui needed senior teammates that could quickly contextualize the problem and differentiate between “what they know” and “what they don’t know.” Clearly, there was much they did not know. Notably, the COVID-19 virus was new and the required speed-to-market was new. Those with lower contextual-based education may be slower to adapt when a higher proportion of understanding is found in lower language brain centers.


To be clear, “education” as used here is in the broad sense. That is, education may come from formal education, experience, practice, self-directed learning (autodidacticism), or some combination thereof. It is not necessarily related to formal or credentialed education. Regardless of the source of education, building language by accessing the left hemisphere is helpful when building your total evidence bucket.


Also, living in it or building solutions is a great way to clarify decisions and add to your total evidence bucket. That is, do not let yourself get bogged down by not acting because you want more information. After gathering initial evidence, “doing” is often the best way to collect more information, error-correcting along the way. When possible, attempt to road test your initial evidence-based decision. Please see the article Curiosity Exploration - An evolutionary approach to lifelong learning for the natural interaction between our induction-based reasoning (trying new stuff) and deduction-based reasoning (learning about new stuff). Next is an example of a low-risk road test that provided high-value information.

 

An investment firm example

In 2019, I took a flyer by agreeing to be the Chief Operating Officer of an investment management firm. They had a very interesting blockchain technology I thought would be a good fit for a number of banking contexts. I certainly researched the industry and the firm’s leadership team, but ultimately I decided the only way to really learn was by doing. I persistently worked to build firm capability, learned a different business, developed client opportunities, networked, attended conferences, etc. Ultimately, the owner was unable to make payroll. I immediately left the firm.


Was I disappointed? A little. But mostly, other than my time, I took little risk and learned a ton about something new and interesting. I’m grateful for the experience and also grateful the experience only cost a little time.

“I know every gain must have a loss. So pray that our loss is nothing but time.”

- The Mills Brothers, the song "Till Then"

 

Our laws and religions may codify evidence to provide certainty. Also, many people have personal standards they adhere to.


For example, most religions support the golden rule - “Treat others as you want to be treated.” Laws generally make it illegal to assault or kill someone.


The point is, when it comes to changing your mind, there are certain understandings bounded by cultural or personal weight-of-evidence certainty. This generally makes decision-making easier, as it reduces the need to weigh certain evidence. [xvii] For personal standards, it is helpful to be thoughtful and reflective, recognizing that increasing entropy is the only physics-based certainty in life. [xviii] Certainly, some have very structured personal standards. As such, understanding your and others' “non-negotiables” is important to building and weighing your evidence bucket. It is also enlightening to periodically test your non-negotiables; some may evolve to become part of your probabilistic-based beliefs. It is interesting that some people, particularly younger folks, respond to evidence-related questions with an enthusiastic "Definitely!" or "100%!" As this article suggests, wisdom comes from realizing a proper answer may take more evaluation and will likely be more nuanced, weighing the evidence.


3. Pulling the Trigger


We have discussed some finer points for building and evaluating our belief bucket, including the importance of curating information and how to use objective information and your gut. So, let’s say the scale tips: your supporting belief probability drops below the 50% threshold and you are faced with a change opportunity. Then what? It’s time to change your mind and take action! The reality is that most people hold on longer than they should. Changing your mind involves messy emotions and habits that intrude upon the decision-making process but do not always add value to the decision.


In the consulting firm example, the decision to leave was challenging. There had been much success. Many great working relationships had been developed. There was comfort in knowing how to work the organizational system. Existing sales pipelines and business relationships had been built. There was some fear of going from the known to the unknown. In summary, there were many excuses not to make a change. I do not regret the change for a moment.


As advice, please consider these sorts of decisions like an economist. Consider anticipatory marginal value - that is, the "bucket" value likely to be created incrementally in the future. In the case of a job change, the value is based on your or your organization’s comparative advantages. The following questions help evaluate comparative advantage:

  • How do your skills or capabilities compare to others?

  • What is your accomplished combination of skills and capabilities compared to others in your market?

  • Then, how do you anticipate applying those capabilities in the future? [xix]

In the consulting firm example, I realized my ability to build and scale technology-enabled solutions would be more challenging in an accounting firm than in a technology company. As such, I aligned my range of skills-based comparative advantages by leaving a firm with technology-based comparative disadvantages. [xx]

Finally, do not consider sunk costs or sunk value. Certainly, there may be some credentialing benefits to past accomplishments. However, when it comes to evaluating your bucket’s pros and cons, only consider the value or costs anticipated in the future. For example, in the consulting firm example, the three "pride points" are past sunk value. As the KPMG change was considered, those points should be sunk and not considered as part of the belief bucket’s value comparison. While sunk in a decision-making context, I will always be incredibly proud of my teammates for the service we provided our clients and their customers. I call it “sunk with pride!”


What if you just can’t decide? The weighted evidence probability is about 50% and the decision could fall either way. The advice, when in doubt, is to make the HARDER decision. For example, if you are deciding between staying in and leaving a job, and the weighted evidence is about a tie, I would leave the job. Why? While the evidence may seem at 50%, complete objectivity is almost impossible. That is, your natural familiarity bias and confirmation bias are an invisible “hand on the scale,” inappropriately biasing you toward the emotionally easier decision. Steve Levitt, in a related coin toss experiment, came to a similar conclusion. Also, the neurological mechanisms of confirmation bias naturally decrease the perceived weight of evidence contrary to past beliefs. [xxi]


Please remember, not making a decision is a decision! Belief inertia is a powerful force. This is certainly easier said than done, especially since our own brain sometimes works against us. However, I have found this advice helpful for pulling the trigger once the anticipatory bucket analysis shows a change makes sense.


4. Conclusion


At this point, I am not ready to conclude this article. I’d like to live in it more, check other believable sources, and allow my left hemisphere-based language and right hemisphere-based gut to process. I am very interested to learn about others’ “changing our mind” lived experiences.

 

5. Notes and Musings


[i] Individual belief sources are out of the scope of this article. We accept the existence of individual beliefs as a given. The following are resources providing relevant context and additional research on how belief sources may be formed or impacted:

  • The first, by Gneezy and List, describes the cultural impact of matriarchal vs. patriarchal societies. Probably not surprisingly, cultural upbringing and cultural standards make a big difference in individual beliefs.

  • The second provides a risk management perspective and experience from Saujani. This provides reasoning for why appropriate play-based risk-taking as a child may improve life outcomes.

  • The third provides an explanation of how unseen neurodiversity, defined as the diversity of thinking types across all people, may impact our work environment as we emerge from the pandemic. Sources include Pew Research, Cain, and Kahneman. "How we think" is certainly a basis for our beliefs.

Hulett, How braver people, especially our daughters, achieve more career success, The Curiosity Vine, 2022

Hulett, An investor’s view: Teaching our daughters to be brave, The Curiosity Vine, 2022

Hulett, Sustainable Diversity in the post-pandemic world, The Curiosity Vine, 2021


[ii] Yudkowsky, Map and Territory, 2018: See chapter 11.


[iii] Simply defined, "evidence" is the link between cause and effect. Or, in more decision science-like language, evidence is the link between criteria and alternatives. Next are a couple of examples. The form of the evidence could be as simple as your judgmental answer to the cause and effect-linking evidentiary question. When possible, it is best practice to obtain objective support for the evidentiary question. For example: When considering the salary criterion for a good job, determine the market salary for your role and industry from an independent source.

Beyond collecting the evidence, our evaluation includes weighing the criteria. That is, we need to assess the question: "How important is the causally linked evidence in determining the effect outcome?" This is akin to the size and color of the rock in our evidence bucket. For example, your location may be the most important criterion, so it is a big rock. If your current job wants to transfer you to Alaska, and you would prefer to live in New York, then the location rock's color would be red.
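Continuing the illustrative BeliefBucket sketch from the main text, the location example might look like this (the weights are hypothetical):

    # Belief: "my current job is a good job."
    job_belief = BeliefBucket()
    job_belief.add(weight=9, pro=False)  # location: the biggest rock, and red (Alaska vs. New York)
    job_belief.add(weight=5, pro=True)   # salary at market: a green rock
    print(f"{job_belief.probability():.0%}")  # 36% -- well below 50%: consider changing your mind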


We also include a “bankable loan” example. Banks generally have credit policies that define the criteria and evidentiary requirements to make loans. While this example is basic, it does provide insight into how banks make loan decisions. The loan decision is usually the responsibility of a loan underwriter. Much of the assessment has become automated via loan assessment algorithms like the FICO score.

Yudkowsky, Map and Territory, 2018: See Chapter 20.


[iv] A close cousin to the simpler “change my mind” decision is the more complex “make up my mind” decision. Here is an example of how I used objective and judgmental decision-making information to create a belief probability assignment program. This was to help my kids make up their minds about their college choice. The college decision is an incredibly complex decision. One may have 5-10 criteria and 5-10 college alternatives. This could create almost 100,000 decision combinations! Our brains are simply not wired to effectively handle such a complex decision without support.

When my kids were in the process of making their college decision, I created a decision program that ranks and weights key decision criteria and college alternatives. The outcome was a single weighted score to help them compare colleges. It includes both objective and emotion-based criteria. It includes costs. This ultimately informed their college decision. It provided confidence they were making the best college decision.
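As a hedged sketch of what such a program might do - the actual model is in the referenced article, and the rank-to-weight method here (rank-order centroid) is my assumption - ranked criteria can be converted to weights and rolled into a single comparable score per college:

    # Illustrative only: the criteria, ranks, and scores are hypothetical.
    ranked_criteria = ["fit", "cost", "academics", "distance"]  # rank 1 first
    n = len(ranked_criteria)

    # Rank-order centroid weights: weight_k = (1/n) * sum(1/i for i = k..n)
    weights = {
        c: sum(1.0 / i for i in range(k, n + 1)) / n
        for k, c in enumerate(ranked_criteria, start=1)
    }

    # Score each college on each criterion (0 = worst, 1 = best). Cost is a
    # criterion scored so that lower cost earns a higher score.
    colleges = {
        "College A": {"fit": 0.9, "cost": 0.4, "academics": 0.8, "distance": 0.6},
        "College B": {"fit": 0.7, "cost": 0.8, "academics": 0.7, "distance": 0.9},
    }

    for name, scores in colleges.items():
        total = sum(weights[c] * scores[c] for c in ranked_criteria)
        print(f"{name}: {total:.2f}")  # one weighted score per college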


(Please see our article The College Decision for the decision model. Since this article was originally published, I initiated a non-profit organization with a mission to “Help people make a better life with better decisions.” Our non-profit recently stood up a college decision app.)


From a parenting standpoint, this was a great way to teach a little decision science and to help build my kids' decision-making confidence. By the way, three of my kids went to James Madison University and one went to Christopher Newport University. Both are excellent state schools and our kids have thrived. For more information on the college decision, please see our articles The College Decision - Framework and tools for investing in your future, The Stoic’s Arbitrage, and How to make money in Student Lending.


[v] Huffman, Raymond, Shvets, Persistent Overconfidence and Biased Memory: Evidence from Managers, American Economic Review (conditionally accepted), 2021


[vi] An interview, Freakonomics Radio, Stephen Dubner with Francis Fukuyama, "How to Change Your Mind", episode 379, 2022


[vii] In addition to the false dichotomy logical fallacy, familiarity bias is also likely in play. If you have a choice between two options in your life or work — a safe one and a risky one — which one will you take? In a series of experiments, psychologists Chip Heath and Amos Tversky showed that when people are faced with a choice between two gambles, they will pick the one that is more familiar to them even when the odds of winning are lower. Our brains are designed to be wary of the unfamiliar. Familiarity bias is our tendency to overvalue things we already know. When making choices, we often revert to previous behaviors, knowledge, or mindsets. By the way, the literature for cognitive bias has grown substantially in recent decades. One of my favorite sources is Daniel Kahneman’s Behavioral Economics classic, Thinking, Fast and Slow.


[viii] Understanding how our personality interacts with our thinking process is generally helpful when addressing held beliefs. Time prioritization and personality preferences may interact. That is, we are generally more likely to perform certain thinking tasks when those tasks are a net energy provider v. a net energy drag. See the following article for more information on introversion and extroversion thinking processes.

Hulett, Creativity - For both introverts and extroverts, The Curiosity Vine, 2022


[ix] "Tuck it away" runs some risk. What if you do not remember to write it down? What if you think, "That is so far away from my understanding, I'm not going to bother with it...I'm too busy!" The risk here is known as "Confirmation Bias." Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. It is challenging to be disciplined, especially when busy. I manage this by keeping ongoing notes and periodically referring to them. I use an informal but relatively structured method that is described in our article Curiosity Exploration - An evolutionary approach to lifelong learning.

For more information on the scarcity-based impacts of being time-pressed, please see:


Mullainathan, Shafir, Scarcity: Why Having Too Little Means So Much, 2013


[x] We discuss the social challenges related to data curation and good decision-making in the following article. We recognize that inherent social challenges may make good decision-making and related outcomes more challenging for individuals in some social groups. We also provide tool suggestions to help counteract social challenges.


Hulett, The Great Social Equalizers: Data and Decision-Making, The Curiosity Vine, 2021


[xi] An interview, Freakonomics Radio, Stephen Dubner with Matthew Jackson, "How to Change Your Mind", episode 379, 2022


[xii] My mentor and CEO of Rockland Trust Company, Chris Oddleifson, was famous for saying “Live in it.” He would say it in the context of getting a working team unstuck and on to the business of creating solutions.


[xiii] The term “believability” is used in this context by Ray Dalio.

Dalio, Principles, 2017


[xiv] In the following skills, tasks, and missions reference, please see the first section of our article called "Background - our attitudes, behaviors, and career segments."


Hulett, They kept asking about what I wanted to do with my life, but what if I don't know?, The Curiosity Vine, 2020


[xv] Sloman, Fernbach, The Knowledge Illusion, Why We Never Think Alone, 2017


[xvi] This relates to the Dunning–Kruger effect. People may overestimate their ability until they increase their bucket to a certain size AND with a higher proportion of language-informed evidence. This also reminds me of Bertrand Russell’s quote mentioned earlier.


[xvii] NN Taleb, as described in his book Skin In The Game, considers long-term religious principles part of tail risk management. They exist because they work and have stood the test of time... like the Golden Rule.

“....religion exists to enforce tail risk management across generations, as its binary and unconditional rules are easy to teach and enforce. We have survived in spite of tail risks; our survival cannot be that random.”

NN Taleb is an amazing thinker. However, I still endeavor to challenge my beliefs and the supporting evidence, even if that evidence includes religious doctrine. While I may not ultimately change a religion-grounded belief, it is a best practice to challenge.


Taleb, Skin In The Game, 2018

[xviii] In the following article, we present the physics-based argument for why the goal of life is to "Fight Entropy." The article also connects the dots between the perception of time and entropy. The article presents research from Neuroscientists and Physicists such as Genova, Taylor, Price, Buonomano, and Dyson.


Hulett, Fight Entropy: The practical physics of time, The Curiosity Vine, 2021


[xix] We discuss the power of developing and leveraging a combination or range of unique skills and capabilities. See the following article:


Hulett, The case for range: Why “polymathic” people are so valuable, The Curiosity Vine, 2021


[xx] John List identifies several elements to successfully scaling an idea, including comparative advantages such as alignment of incentives and managing margins.


List, The Voltage Effect: How to Make Good Ideas Great and Great Ideas Scale, 2022


[xxi] Levitt, Heads or Tails: The Impact of a Coin Toss on Major Life Decisions and Subsequent Happiness. National Bureau of Economic Research, WORKING PAPER 22487,

DOI 10.3386/w22487, 2016


Oxford University Press said about Levitt's conclusion:

“The two-month survey found that participants favoured the status quo, making a change less frequently than they predicted they would before the coin toss. At the six-month survey, this bias toward the status quo was gone. Furthermore, those who were instructed by the coin toss to switch their current position were more likely to actually make the change, reported that they were substantially happier, and said that they were more likely to make the same decision if they were to choose again.”

Kappes, Harvey, et al, Confirmation bias in the utilization of others’ opinion strength, Nature Neuroscience, 2019

“Our results demonstrate that existing judgments alter the neural representation of information strength, leaving the individual less likely to alter opinions in the face of disagreement.”

Additional musings:

4/29/21

Hofstadter, in Gödel, Escher, Bach, describes a similar mental process for how beliefs are evaluated. He calls the RH interaction a “brain state” that impacts which LH-rendered beliefs and symbols are activated (a symbol is a building block for a belief). His use of a “probabilistic catalogue” nicely aligns with the “weight of evidence” presented here.

“In summary, then, a chunked description of a brain state will consist of a probabilistic catalogue, in which are listed those beliefs which are most likely to be induced (and those symbols which are most likely to be activated) by various sets of "reasonably likely" circumstances, themselves described chunked level.”

5/17/21

I just heard Steve Levitt interview Daniel Kahneman to discuss his new book Noise. It was just released and I look forward to reading it. I like the premise of separating noise from bias, though, to some degree, the two are related. System noise tends to occur randomly as a variance from the mean, whereas bias is associated with unfairness from human intervention. Either way, focusing on the decision-making process to remove noise or bias is critical, especially if the decision-making process involves uncertainty. In my article Credit Analytics - When a decade of data suddenly loses predictability! I wrote of utilizing a multiple, independent teams approach to create new pandemic-affected loss estimates. The idea is to use a panel of independent expert views and a structured multi-input Monte Carlo simulation process to develop the new estimates.

8/8/21

The book Subtract: The Untapped Science of Less by Leidy Klotz is well aligned with many of this article’s concepts. Above, I discussed being on the lookout for errors of commission - incorrect evidence I used to support some belief. In effect, I will edit or subtract the evidence interpretation and update my belief. Klotz considers this editing or subtraction to be a critical and often overlooked part of the creative process.

“The pinnacle of mental subtraction is when we remove ideas that are no longer correct, or that never were in the first place.”

I’m guessing Dr. Klotz would encourage me to use subtraction for errors of commission and other evidence issues that may lack relevance. In other words, don’t just adjust the wrong evidence; weed out the evidence that lacks relevance. As a slight difference from Dr. Klotz’s pinnacle recommendation, I do not necessarily look to remove evidence that is incorrect; rather, I change my mind as to the belief. In my article "The case for removing Civil War Statues," (https://www.thecuriosityvine.com/post/the-case-for-removing-civil-war-statues-a-wonderful-community-and-an-analogy-close-to-home) I use an analogy to demonstrate how I changed my mind on Civil War statues. In this case, my original "white history-centric" cultural evidence about Civil War statues was augmented by evidence related to a more fulsome "inclusive race history." The weight of evidence ultimately changed my belief about Civil War statues. Truly a very liberating experience.


Further Reading:



Foundation

1. Our Brain Model

2. Changing Our Mind

3. The Education of a Sport’s Parent - Top Ten suggestions for developing the athlete’s mindset


High School Students

4. The benefits and risks of college – An employer's and risk manager's perspective

5. The College Decision: Framework and tools for investing in your future

6. The College Stoic: The Stoic's Arbitrage and making a great college decision

7. Be like Rudy: Community College as a smart, lower-cost path for Higher Ed


College Students

8. Shattering the Teacup Label: Honing Adulting Competencies and Building Resilience

9. How to make money in Student Lending

10. College Success!

11. Diamonds In The Rough – A perspective on making high impact college hires

12. Higher Education Reimagined


Career and Beyond

13. Do I need to be a data scientist in an AI-enabled world?

14. Success Pillars - a life journey foundation

15. They kept asking about what I wanted to do with my life, but what if I don't know? - Part 1

16. They kept asking about what I wanted to do with my life, but what if I don't know? - Part 2


 

The Stoic's Arbitrage: Your Personal Finance Journey Guide


Core Concepts

1. Our Brain Model

2. Curiosity Exploration - An evolutionary approach to lifelong learning

3. Changing Our Mind

4. Information curation in a world drowning in data noise


Making the money!

5. Career choices - They kept asking about what I wanted to do with my life, but what if I don't know? - Part 1

6. Career choices - They kept asking about what I wanted to do with my life, but what if I don't know? - Part 2

7. Career success - Success Pillars - A Life Journey Foundation

8. Career choices - Do I need to be a Data Scientist in an AI-enabled world?

9. Career choices - Diamonds In The Rough - A perspective on making high impact college hires


Spending the money!

10. Budgeting - Budgeting like a stoic

11. Home Buying - Homeownership is an important wealth-building platform

12. Car Buying - Cutting through complexity: A car buying approach

13. College choice - The College Decision - Framework and tools for investing in your future

14. College choice - College Success!

15. College choice - How to make money in Student Lending

16. Event spending - Wedding and event planning guiding principle


Investing the money!

17. Investment thoughts for my children

18. Our Investment Barbell Strategy

19. Using the Stoic's Arbitrage to choose a great investment advisor

20. Anatomy of a "pump and dump" scheme

21. The Time Value of Money Benefits the Young

22. How Would You Short The Internet?


Pulling it together!

23. Capstone - The Stoic’s Arbitrage: A survival guide for modern consumer finance products
