When Maps Melt: The Limits of Knowledge in Decision-Making
- Jeff Hulett
- Oct 1
Updated: Nov 4

At Personal Finance Reimagined (PFR) we often remind students and entrepreneurs of an essential point: Good decisions depend on clear thinking, yet one recurring confusion is between probability and frequency. The error arises because historical frequencies—what has happened—are often incorrectly treated as certain probabilities for the future. As the physicist E.T. Jaynes suggested, frequency is merely an observation of the past terrain, while probability is the map we use to navigate the future. Our past knowledge of the territory is always incomplete, meaning the future-focused map we rely on is inherently flawed. By understanding this crucial distinction, we can refine our decision-making process and ensure our map is effective for guiding long-term success.
About the author: Jeff Hulett leads Personal Finance Reimagined, a decision-making and financial education platform. He teaches personal finance at James Madison University and provides personal finance seminars. Check out his book -- Making Choices, Making Money: Your Guide to Making Confident Financial Decisions.
Jeff is a career banker, data scientist, behavioral economist, and choice architect. Jeff has held banking and consulting leadership roles at Wells Fargo, Citibank, KPMG, and IBM.
What It Is
Facing the future: Probability is the logic of uncertainty — a forward-looking framework helping us reason about what might happen next. It reflects how plausible an outcome is given what we know today. Probability is not a physical property of the world but a tool for making informed decisions about the future. Errors in probability show up as a lack of accuracy.
Facing the past: Frequency, by contrast, is data that helps us interpret the past. It is backward-looking — an empirical record of how often something has happened in repeated trials or observations. Errors in frequency show up as a lack of precision.
Frequency informs us about what has been, while probability guides us toward what could be. The difference between the two lies in the direction of our gaze: frequency looks behind us to measure what was; probability looks ahead to navigate what will be.
Statisticians—and I am one!—are famous for using p-values or other measures of precision to validate their findings, often saying things like, "The p-value < .05 is the evidence against the assumption that nothing is happening." This is the confusing language of rejecting the null hypothesis in order to back into the thing we are trying to understand with a reasonable amount of precision. Once we get past the backward-sounding language, the deeper challenge remains: it mistakes a precise view of the past terrain for an accurate map of the future. In other words, statisticians can help us be incredibly precise about not getting where we want to go!
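To make the "precise about the past" point concrete, here is a minimal sketch of a simulation-based p-value in Python. The coin-flip setup and the 60-heads-in-100 result are illustrative assumptions, not data from any study.

```python
# A minimal sketch of the "precision about the past" idea, using a
# simulation-based p-value. The coin-flip numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=42)

observed_heads = 60      # what we saw in 100 past flips
n_flips = 100
null_p = 0.5             # the "nothing is happening" assumption: a fair coin

# Simulate many 100-flip experiments under the null hypothesis.
simulated_heads = rng.binomial(n=n_flips, p=null_p, size=100_000)

# Two-sided p-value: how often would a fair coin look at least this extreme?
deviation = abs(observed_heads - n_flips * null_p)
p_value = np.mean(np.abs(simulated_heads - n_flips * null_p) >= deviation)

print(f"p-value: {p_value:.3f}")
# A small p-value is a precise statement about past data under an assumption.
# It says nothing, by itself, about how the coin (or the world) behaves next.
```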
Think about the weather. When a meteorologist says there is a 40% chance of rain, they are suggesting a forward probability—a forecast grounded in knowledge and models. This forecast is constantly updated because new and different information is always available. That is why the best forecast of all—the weather in the next ten seconds—is achieved simply by looking outside. If we check all the days where that initial 40% forecast was made and find it rained on 38 out of 100, we are observing a historical frequency. One describes belief, the other records experience. One predicts the variable future, the other records the fixed past.
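Here is a minimal sketch of that calibration check in Python, assuming a small, made-up record of forecasts and outcomes. It separates the forward-looking 40% belief from the backward-looking frequency observed on those days.

```python
# A minimal sketch of checking a forecaster's 40% calls against what actually
# happened. The forecast records below are made-up illustrations.
forecasts = [0.4, 0.4, 0.7, 0.4, 0.1, 0.4]            # stated forward probabilities
rained =    [True, False, True, False, False, True]    # what the past recorded

# Collect only the days where the forward probability was 40%...
days_at_40 = [(p, r) for p, r in zip(forecasts, rained) if p == 0.4]

# ...and compute the backward-looking frequency of rain on those days.
observed_frequency = sum(r for _, r in days_at_40) / len(days_at_40)

print(f"Stated probability: 0.40, observed frequency: {observed_frequency:.2f}")
# The forecast is a belief about the future; the frequency is a record of the
# past. Calibration compares them, but it cannot guarantee tomorrow.
```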
Why It Matters: The Three Nevers and The Limits to Knowledge in Human Affairs
This distinction is not just philosophical — it shapes how we interpret evidence and make choices. The Three Nevers define the core informational challenges in decision-making and prediction within complex, real-world systems. They explain why simply extrapolating from historical frequency or current data is inherently flawed, necessitating a framework built on humility, adaptability, and continuous inquiry. As we will see, 1) the bias that impairs decision-making often results from the Three Nevers, and 2) bias is less the intended result of a few bad actors and more the emergent result of how all actors respond to the environment's incentives and constraints.
1. Knowledge is Never Complete
Rumsfeld's Never: This principle asserts we always operate with partial information, meaning there are inherent and often unknowable blind spots in our understanding of any given situation. A dataset, a historical record, or a current observation can never capture all the relevant variables, underlying mechanisms, hidden intentions, or rare, unpredictable events capable of fundamentally influencing an outcome. There are always unobserved factors, measurement limitations, and future contingencies outside the scope of our current knowledge. This reality is famously encapsulated by former U.S. Defense Secretary Donald Rumsfeld's observation: "There are known knowns... but there are also unknown unknowns—the ones we don't know we don't know." It is these crucial unknown unknowns that render current knowledge inherently incomplete.
Example: Imagine a financial analyst trying to predict the future price of TechCorp stock. The analyst has access to all public data: earnings reports, historical price movements, analyst ratings, and macro-economic trends. This is the complete knowledge available to the public. However, the true outcome will be shaped by information the analyst doesn't know: the CEO is secretly negotiating a major, market-moving acquisition; a key competitor is about to announce a revolutionary new product; or a confidential regulatory investigation is about to go public. The stock's future is determined by these crucial, unobserved factors, rendering the current, supposedly complete dataset inherently incomplete and the prediction flawed.
2. Knowledge is Never Static
Goodhart's Never: This principle highlights: 1) The world is constantly changing, and critically, 2) People are not independent agents like simple statistical data points. Unlike physical processes, human agents observe, interpret, and react to changes in their environment, policies, or measurements. This makes knowledge ephemeral and often self-defeating because the very act of measuring or setting a goal for a system changes behavior within it. Humans exhibit unknowable, non-linear responses to new information or incentives. This dynamism is precisely where Goodhart's Law manifests: "When a measure becomes a target, it ceases to be a good measure." Goodhart's Law has a powerful analogue in physics: Heisenberg's uncertainty principle, in which measuring a particle's position limits what can be known about its momentum. Both concepts define inherent limits to knowledge: the act of observation fundamentally changes the system.
Example: Consider a major financial institution trying to boost performance by setting a strict Key Performance Indicator (KPI): "Every customer must have eight banking products." (The infamous Wells Fargo scandal offers a sharp historical record of this approach.) The executive committee assumes the historical frequency of product adoption reflects stable customer demand. Then the reactive human dynamism kicks in: bank employees are not independent agents; they react to the new, aggressive target. To meet the quota, employees strategically open unauthorized accounts, sometimes transfer funds without customer consent, and prioritize quantity over ethical sales. The initial "knowledge" that a high product-per-customer frequency equals sales efficiency becomes obsolete and misleading once it is targeted. The underlying goal (true customer service and profitable growth) is subverted by the strategic, interdependent behavior of the people attempting to satisfy the metric.
3. Knowledge is Never Centralized
Hayek's Never: This principle establishes that no single person or institution holds all the relevant information necessary to truly understand a complex system, whether it's a global market or a local supply chain. The crucial knowledge is distributed across countless individuals, organizations, sensors, and locations.
This concept is central to the work of Nobel laureate F.A. Hayek, who showed how the localized, specific knowledge held by millions of individuals (e.g., a foreman knowing the current state of a machine, or a shopper knowing their personal preference) cannot be centrally gathered or processed. This decentralized information is aggregated most effectively through price signals—an action often described by Adam Smith's "Invisible Hand." Smith's metaphor demonstrates how individuals, pursuing their own interests, unintentionally drive resource allocation based on supply and demand, creating outcomes that no single planner could have foreseen or directed. The resulting market equilibrium is "invisible" in the sense that while we observe the price and quantity, we do not—and cannot—know the exact source of the information aggregated from every individual decision maker. True insight requires understanding this powerful, distributed reality.
Example: An executive committee at a retail chain decides to aggressively expand its inventory of a new high-tech gadget, "The Zapper." The decision is based on centralized knowledge: favorable reports from the marketing department and high-level sales projections. However, the decentralized knowledge tells a different story: a store manager in a key region knows a local competitor just launched a superior, cheaper alternative; a junior warehouse clerk knows the current racking system can't safely handle the increased volume of The Zapper's large boxes; and the delivery truck driver knows the main shipping route is about to close for two months of roadwork. Because this vital, distributed knowledge isn't efficiently gathered and weighted, the company orders millions of units, only to face immediate, costly problems with collapsing inventory, low sales, and shipping bottlenecks.

When the Nevers converge: The Broken Window
The systemic challenge of the Three Nevers is robustly captured by Frédéric Bastiat's famous "Broken Window Fallacy." Bastiat, a 19th-century French economist and legislator, used this parable to critique superficial thinking in policy by forcing the reader to account for unseen consequences.
In the parable, a vandal breaks a baker's window. The immediate observers conclude that this destruction is a net gain because the baker must pay the glazier, stimulating the economy. This simple observation fails due to the Three Nevers:
Never Complete: The observers only see the immediate transaction (the seen). They fail to see the opportunity cost (the unseen)—what the baker would have purchased instead (e.g., shoes from the cobbler). The frequency of economic activity is rendered incomplete because it ignores this crucial negative counterfactual.
Never Static: The destruction of the window is a permanent loss of capital that changes the system itself. Because the baker must divert capital to repair the window, their costs increase. To restore their position, the baker must either raise prices or reduce supply, creating a net economic negative for the broader market. This reactive, non-linear behavior demonstrates that the system has shifted and total wealth has been reduced.
Never Centralized: The flawed conclusion arises because the glazier's gain is visible and centralized in the transaction, while the cobbler's loss is distributed and invisible across the broader market. No single observer holds all the local knowledge required to calculate the true aggregate economic impact.
This scenario demonstrates how a frequency analysis based only on what is seen fails because the data is simultaneously incomplete, dynamically unstable, and decentralized.
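A small worked sketch, using made-up six-franc amounts rather than figures from Bastiat, shows why the "stimulus" disappears once the unseen side of the ledger is counted.

```python
# A minimal sketch of Bastiat's seen-versus-unseen accounting.
# The six-franc figures are made-up illustrations, not numbers from Bastiat.
repair_cost = 6      # the seen: the baker pays the glazier to replace the window
foregone_shoes = 6   # the unseen: what the baker would have bought instead

# The casual observer tallies only visible spending.
seen_activity = repair_cost            # "+6 francs of business for the glazier"

# Compare what society owns in each world (the baker spends 6 francs either way).
world_with_vandal = ["repaired window"]            # the glazier holds the 6 francs
world_without_vandal = ["intact window", "shoes"]  # the cobbler holds the 6 francs

# Relative to the unbroken world, society is one pair of shoes (one window's
# worth of capital) poorer, even though the visible spending is identical.
print(seen_activity, world_with_vandal, world_without_vandal)
```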
Map or Territory?
These realities mean frequency is inherently limited. It looks backward, describing what has happened, but cannot fully anticipate how knowledge and behavior evolve. To make sound decisions, we must use frequency as a starting point but go further — applying probabilistic reasoning to think forward and adapt to change.
Healthcare: A doctor's estimate that you have a 10% chance of a condition is not the same as saying 10% of patients have it. The first depends on your symptoms, history, and test results; the second is population data (a minimal Bayes' rule sketch follows this list).
Markets: Investors often mistake past market returns (frequencies) for future probabilities, ignoring shifting incentives and hidden risks.
Policy: Treating probability as nothing more than frequency overlooks F.A. Hayek’s key insight — knowledge is dispersed. Decision-makers must reason with incomplete information, not just tally historical counts.
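As promised above, here is a minimal Bayes' rule sketch of the healthcare example, using hypothetical prevalence and test characteristics rather than medical data. It shows how the patient-specific probability can differ sharply from the population frequency.

```python
# A minimal sketch of why "your 10% chance" differs from "10% of patients
# have it." Prevalence and test characteristics are hypothetical numbers.
prevalence = 0.10    # backward-looking frequency: 10% of the population has it
sensitivity = 0.80   # P(positive test | condition)
specificity = 0.90   # P(negative test | no condition)

# Bayes' rule: update the population base rate with this patient's test result.
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_condition_given_positive = (sensitivity * prevalence) / p_positive

print(f"Population frequency: {prevalence:.0%}")
print(f"This patient's probability, given a positive test: "
      f"{p_condition_given_positive:.0%}")
# About 47%: the forward-looking probability for this patient moves with the
# evidence, while the historical frequency of the condition stays at 10%.
```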
In each case, confusing the map with the territory — frequency with probability — leads to costly mistakes. Recognizing the Three Nevers keeps us honest about what data can tell us and disciplined about using probability to bridge the gap between what was and what will be.
How to Best Think About It
At PFR, we teach a decision-first process:
Frame with probability: Use probability as your reasoning map. Ask, given what I know (past), how plausible is this outcome (future)?
Inform with frequency: Let real-world data refine your map. Observed outcomes provide feedback, not the whole story.
Respect their difference: Probability is epistemic — about what we know and do not know. Frequency is empirical — about what we can observe, but subject to the Three Nevers.
Use a consistent, repeatable decision process: We arm our students and clients with Definitive Choice or similar choice architecture tools. This helps build their decision system (a simple weighted-scoring sketch follows this list).
Maintain a belief-updating mindset: Because of the Three Nevers, at any point in time our knowledge is incomplete. Treat beliefs the way cognitive researcher and "superforecaster" Philip Tetlock does: "...beliefs are hypotheses to be tested, not treasures to be guarded."
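As a minimal sketch, and not a representation of the Definitive Choice tool itself, here is what a repeatable weighted-criteria scoring step might look like. The criteria, weights, and scores are hypothetical.

```python
# A minimal sketch of a repeatable, weighted-criteria decision process in the
# spirit of choice architecture tools. Criteria, weights, and scores are
# hypothetical and exist only for illustration.
criteria_weights = {"cost": 0.4, "quality": 0.35, "flexibility": 0.25}

alternatives = {
    # scores on a 0-10 scale for each criterion
    "Option A": {"cost": 7, "quality": 6, "flexibility": 8},
    "Option B": {"cost": 5, "quality": 9, "flexibility": 6},
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine criterion scores into one number using the decision weights."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank the alternatives with the same transparent rule every time.
ranked = sorted(alternatives.items(), key=lambda kv: weighted_score(kv[1]),
                reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores):.2f}")
# A consistent scoring rule makes the process repeatable and easy to revisit
# as beliefs (the weights and scores) are updated over time.
```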
As with a map and terrain, both are indispensable. The map without the territory is speculation; the territory without the map is noise. The challenge today is that advanced observation tools like AI make the past-focused terrain incredibly vivid and detailed, tempting us to let this data overshadow our judgment. The availability of highly detailed, past-focused data makes it easy to mistake hubris for justified confidence. This precision about what was ultimately blinds us to the great incompleteness of our knowledge for predicting what will be, a failure explicitly demonstrated by the Three Nevers.
Bayesian inference is the best approach for turning past frequencies into forward-looking probabilities. For a deeper dive, please check out the resources listed below.
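As one minimal illustration, a Beta-Binomial model updates a prior belief with observed counts. The prior and the 38-of-100 rainy-day record below are illustrative assumptions.

```python
# A minimal sketch of Bayesian updating with a Beta-Binomial model: past
# frequencies inform, but do not equal, the forward-looking probability.
# The prior and the observed counts are illustrative assumptions.
prior_alpha, prior_beta = 2, 2   # a mildly informative prior belief

rain_days, dry_days = 38, 62     # the historical frequency: 38 rainy days in 100

# Updating: the posterior Beta parameters simply add the observed counts.
post_alpha = prior_alpha + rain_days
post_beta = prior_beta + dry_days

# The forward-looking probability of rain tomorrow is the posterior mean.
p_rain_tomorrow = post_alpha / (post_alpha + post_beta)

print(f"Observed frequency:    {rain_days / (rain_days + dry_days):.3f}")
print(f"Posterior probability: {p_rain_tomorrow:.3f}")
# The two are close but not identical, and the posterior keeps moving as new
# evidence arrives, which is the point of treating beliefs as hypotheses.
```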
Takeaway: Probability guides future expectations. Frequency records past outcomes. The Three Nevers suggest they are never the same. Confusing them blinds us. Keeping them distinct helps us make decisions using past records, adapting to uncertainty, honoring dispersed knowledge, and, in the spirit of Hayek, letting reality guide our choices without pretending we can centrally control it.
Resources for the Curious
Bastiat, Frédéric. That Which Is Seen, and That Which Is Not Seen. Originally published 1850.
Goodhart, Charles A.E. "Problems of Monetary Management: The UK Experience." In Papers in Monetary Economics, Volume I. Reserve Bank of Australia, 1975.
Hayek, F. A. “The Pretence of Knowledge.” Nobel Prize Lecture, 1974.
Heisenberg, Werner. Physics and Philosophy: The Revolution in Modern Science. Harper & Brothers, 1958.
Hulett, Jeff. Making Choices, Making Money. Personal Finance Reimagined, 2022.
Jaynes, E. T. Probability Theory: The Logic of Science. Cambridge University Press, 2003.
Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
Rumsfeld, Donald. Department of Defense News Briefing. February 12, 2002.
Smith, Adam. An Inquiry into the Nature and Causes of the Wealth of Nations. Originally published 1776.
Tetlock, Philip E. and Dan Gardner. Superforecasting: The Art and Science of Prediction. Crown, 2015.

