Everyone has heard of Occam’s razor (sometimes written as Ockham, because it is attributed to the friar William of Ockham, c. 1285-1347). This razor, also known as the principle of parsimony, suggests that “all other things being equal, the simplest explanation is usually the most probable”, to which is added that “one should not postulate the existence of entities unnecessary to the explanation”. In other words, one should not complicate things unnecessarily: if you have two competing ideas to explain the same phenomenon, you should prefer the simpler one. For example, if you call someone and he doesn’t answer the phone, the most reasonable assumption is that he is busy, not that he has been kidnapped by a criminal gang. It is another way of saying “Keep it simple, stupid”.
In philosophy, a rule, law, effect or principle that helps to simplify the analysis of a phenomenon by reducing the number of hypotheses and eliminating improbable explanations is called a razor. Beyond Occam’s razor, there are many others that can help us understand how a situation came about, why someone reasons the way they do, or what to expect in the face of certain events.
Let’s look at some of them:
1 Sunk Cost Fallacy
It occurs when we keep working on something because of the time (or money) that has already been invested in it.
We can think of a company that decides to develop a new product and for this purpose invests a large amount of money in research and development, as well as in hiring specialized personnel. After several months of work, the company’s team concludes that the product is not in sufficient demand in the market and that it will probably not be profitable. Despite this evidence, the company’s management decides to continue with the project because a lot of money has already been invested and they do not want to “lose” that investment. In this case, the company is falling into the sunk cost fallacy by making a decision based on resources already spent in the past, rather than objectively evaluating the future costs and benefits of the project.

2 Fixed Cost Fallacy
Unlike sunk costs, fixed costs are expenses that remain constant regardless of the level of production or activity. Examples are rent or wages. Although fixed costs do not change in the short run, they can change in the long run. Therefore, the fixed cost fallacy would consist of assuming that these costs are immovable and cannot be reduced or eliminated in the future.
In this case we could think of a company that has a production plant that requires a high level of investment in machinery and equipment. When the demand for its products decreases due to an economic recession, the company’s management decides not to reduce production capacity or close some lines, arguing that machinery costs are fixed and cannot be modified in the short term. In this case, the company is falling into the fixed cost fallacy by assuming that machinery costs are completely immovable, when in fact there are options such as selling or leasing idle equipment, renegotiating maintenance contracts, or even considering the temporary closure of some production lines.
3 Hanlon’s Principle
This principle can help us avoid confrontations and be more understanding with our colleagues. It goes like this: “Never attribute to malice that which can be adequately explained by stupidity.” Although the principle is phrased rather comically, the cause need not be stupidity: it can also be clumsiness, ignorance, misinterpretation, and so on.
Let’s imagine that we receive an email from a co-worker who seems to be aggressively criticizing or questioning our work. Our first reaction might be to think that this colleague has a grudge against us and has done it on purpose to annoy us. However, if we apply Hanlon’s principle, we should consider that a misunderstanding or a poorly worded message is more likely than bad intentions. Perhaps he simply did not express himself well, or did not realize the tone the email took on.
4 Hume’s Law
This law is also known as the is–ought problem. It denies that normative statements (about what ought to be) can be deduced from descriptive statements (about what is). In other words, it points to the gap between what is (facts) and what ought to be (values).
If in an organization we see that all project managers are micromanaging, we may end up stating in the job description that project managers have to micromanage. We are translating what is into what ought to be.
5 Sayre’s Law
This law is an observation about the intensity of conversations and discussions. According to this law, in any dispute, the intensity of feelings is inversely proportional to the value of the issues at stake. In other words, the less important the issue, the more heated the discussion may become.
For example, a meeting to discuss a process to be implemented in the organization may drift into a heated debate about the typeface of the presentation, because certain words do not read well.
6 Streisand Effect
This effect occurs when you try to make something go unnoticed, and what you get is more attention.
Let’s imagine we are going to present a new idea at a meeting but we don’t invite a person who we know is critical of that idea. When people start asking why that person has not been invited, that person will gain prominence and so will his ideas, and the message against the idea to be presented will be amplified.
7 Hitchens’ Razor
This razor establishes that the burden of proof in a debate falls on the person making the assertion. In other words, what is asserted without evidence can be rejected without evidence, which is a translation of a Latin proverb “Quod gratis asseritur, gratis negatur”.
How many times have we attended meetings where someone says that something is better than something else without providing a single piece of evidence, only their perception. These occasions always remind me of a phrase attributed to W. Edwards Deming: “In God we trust; all others must bring data”.
8 Sagan’s Razor
This is an extension of the previous razor, stating that “extraordinary claims require extraordinary evidence”. Again, the person making the claim has to provide the evidence.
It is not the same to say “this coffee is bad, we must change the supplier” as it is to say “the products of this company’s factory are not of quality, we are going to close the factory”. Clearly, for the second statement we will need much stronger evidence than for the first.

9 The Chesterton Fence
Chesterton’s fence is a simple rule that invites us to think twice before altering, destroying or modifying a tradition, rule or structure. According to this premise, we should never change something without understanding its original purpose. Before removing a fence that was put up by previous generations, we should make sure we understand why it was erected in the first place. Don’t change something until you know why it’s there.
I remember working at a bank where one of the servers held a file called “run.txt” that was empty. The file was very old, but since it bothered no one, it stayed there. One day someone decided to delete it because it was empty. It turned out that some scheduled batch processes only ran if that file was present (they checked for its existence). The result was two days of interrupted processes (one of them order processing) and a lot of people working nights to find out why nothing worked.
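The guard-file pattern from this anecdote can be sketched in a few lines. This is a hypothetical reconstruction, not the bank’s actual scripts; the file name and return values are assumptions for illustration:

```python
import os

def nightly_batch(flag_path="run.txt"):
    """Run the scheduled job only if the flag file exists.

    The empty file acts as an on/off switch: delete it and every
    dependent job silently stops -- the fence that was torn down.
    Returns True if the batch ran, False if it was skipped.
    """
    if not os.path.exists(flag_path):
        return False
    # ... process orders and other batch work here ...
    return True
```

Before deleting a file like this, a safer first step is to search the scheduled jobs for references to its name.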
10 Gall’s Law
This law dictates that “complex systems that work evolve from simple systems that work”. It is another way of saying that we can’t run before we learn to walk.
An example might be complex digital solutions: they start with small functions that are tested and added incrementally until the complex system works as a whole.
11 Alder’s Razor
This razor allows you to end discussions that might otherwise drag on forever: “If a discussion cannot be resolved by experimentation, it does not deserve to be debated.” In other words, if we cannot demonstrate that one option is better than another, we cannot say with certainty which is best.
Suppose that in a construction project one team maintains that material A is better than material B to make one part of the construction, while another team maintains that B is better than A. If there is no objective data through experimentation (which may be previous experiments), we will not have enough information to decide in favor of A or B. A simple test will remove doubts and save time and money.
12 Popper’s Falsifiability Principle
Enunciated by Karl Popper, this principle helps to discern what is science from what is not. Popper argues that a theory should be formulated in such a way that the conditions under which it would be false can be specified. This implies that scientific theories must make predictions that can be tested and disproved.
For example, we can claim that a new production method will reduce costs and increase product quality. According to this principle, we must make specific predictions that can be tested. Suppose the company predicts that, by implementing the new method, production costs will decrease by 20% and quality will improve by 15%. Now the claim has to be put to the test. To do this, the company implements the new method for a trial period and, at the end of it, collects data on costs and quality. If the data show that costs did not decrease by 20% or that quality did not improve by 15%, the theory is considered falsified, and the company must revise its approach or discard it.
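The trial-period check in this example boils down to simple arithmetic. A minimal sketch, where the 20% and 15% thresholds come from the example above and the function name is my own:

```python
def falsified(cost_before, cost_after, quality_before, quality_after):
    """Return True if the trial data falsify the predictions:
    costs should drop by at least 20% and quality rise by at least 15%."""
    cost_drop = (cost_before - cost_after) / cost_before
    quality_gain = (quality_after - quality_before) / quality_before
    return cost_drop < 0.20 or quality_gain < 0.15

# Costs fell only 15%: the claim is falsified even though quality improved.
print(falsified(100.0, 85.0, 50.0, 60.0))   # True
# Costs fell 25% and quality rose 20%: the theory survives this test.
print(falsified(100.0, 75.0, 50.0, 60.0))   # False
```

Surviving one test does not prove the theory; it only means it has not been falsified yet, which is exactly Popper’s point.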
13 Pareto Principle
Well known and applied in organizations, it dictates that 80% of the results come from 20% of the actions.
In sales, the Pareto principle suggests that 80% of profits come from 20% of customers. The company can focus on serving those key customers to maximize profits.
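With invented profit figures (illustrative only, skewed the way Pareto describes), the 80/20 shape is easy to verify:

```python
# Invented profit per customer, in no particular currency.
profits = [400, 380, 40, 35, 30, 30, 30, 20, 20, 15]

ranked = sorted(profits, reverse=True)
top_20pct = ranked[: len(ranked) // 5]   # the top 20% of customers (2 of 10)
share = sum(top_20pct) / sum(ranked)
print(f"Top 20% of customers generate {share:.0%} of profit")  # 78%
```

Real customer data rarely lands on exactly 80/20, but the skew is usually of this order.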
14 Parkinson’s Law
Work expands to fill the time available for its completion.
If you give a team a task and tell them it has to be done in three weeks, even if it could be done in two, it will usually take three weeks (or more, if they have not estimated well, as we will see next). The tighter the deadlines, the more efficiently the tasks are performed (the difference between carbon and diamond lies in the pressure).
15 Hofstadter’s Law
Enunciated in the book Gödel, Escher, Bach: An Eternal Golden Braid (highly recommended), this law extends Parkinson’s law: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.”
We see this in project estimates: even when we are generous with them, they sometimes fall short. Remembering Pareto’s 20%, in one company I worked for the project planning tool automatically added 20% in time and cost to every estimate.

16 Duck test
The duck test is a humorous expression that refers to a form of inductive reasoning. The best known expression is:
If it looks like a duck, swims like a duck, and quacks like a duck, then it is probably a duck.
This is a way of saying that a person can identify an unfamiliar object by paying attention to its characteristics.
For example, a company notes that a new technology uses machine learning algorithms to analyze user behavior and deliver personalized recommendations, so it “looks” similar to other solutions that have improved the user experience on websites, such as Netflix’s or Amazon’s recommendation engines. If it “looks” and “swims” like an improvement in personalization, then it probably is a valuable investment for the company in that area.
17 Dunning-Kruger Principle
The Dunning-Kruger Principle is a psychological phenomenon that describes how people with low ability at a specific task tend to overestimate their capabilities in that area, while people with a high skill level often underestimate theirs.
Let’s imagine a person who manages a call center department in a company. Despite limited knowledge of data analytics and digital data strategies, out of relevance or opportunity she volunteers to lead data projects, offering shorter timeframes than a true expert would and downplaying the complexity of the subject. This manager overestimates her competence: she believes she can lead complex projects without additional training. As a result, she makes poor decisions and does not seek expert advice.
18 Anchoring Bias
This principle dictates that the first thing you read, see or learn acts as an anchor for the answers that follow.
If someone asks a group whether something is pretty and the first person says it is, everyone who is undecided will tend to agree, following the anchor created by that first answer. This bias can even be triggered by external factors. Will Smith’s movie Focus offers a good example of the anchoring bias.
19 Halo Effect
The Halo Effect is a cognitive bias that affects our perception of people or situations. It occurs when a positive characteristic of someone or something influences how we evaluate other related qualities, even if there is no data to support it.
An employee who excels in a single area (e.g., presentation skills) might receive more positive overall evaluations, even if her performance in other tasks is mediocre. The Halo effect also manifests itself in negative perceptions.
20 Godwin’s Law
Back in the 1990s, Mike Godwin stated that as an online discussion grows longer, the probability of a comparison involving Hitler or the Nazis approaches one. This law is used to end discussions in some online groups.
Hitler or Nazis may not be directly discussed in business settings, but depending on the company, there are some recurring themes (perhaps a past project gone wrong or a bad hire) that take discussions completely off focus.
21 Hawthorne Effect
The Hawthorne Effect is a psychological phenomenon that refers to how observing workers affects their job performance: people who know they are being watched tend to work better.
For example, in an office, managers may decide to conduct more frequent and visible performance assessments. Employees, knowing they are being watched, may feel more motivated and engaged in their work. This could lead to an increase in productivity and the quality of their work.

22 The Russell Teapot
Russell’s Teapot is an analogy made by Bertrand Russell about the existence of God: he asks us to imagine the claim that a porcelain teapot orbits the Sun somewhere between Earth and Mars. The teapot serves to refute the idea that it is also up to the atheist to justify his position or to disprove the unprovable claims of those who do believe in God.
Russell’s intention is not to place the burden of proof on the unbeliever, but to show that a claim without proof is not tenable. Yet if that unproven claim is supported by “holy books,” “prophets,” and cultural traditions, the unbeliever who dares to doubt runs the risk of being cast aside (or, in other times, burned).
Let’s imagine that someone is reading a renowned book with some proven claims and propositions. This person can take part of this book and indicate that a certain idea is best for the company, without providing any more data than simply because a well-known book says so. Although the burden of proof should always be on the person making the assertion, sometimes this responsibility shifts to the doubter if the assertion is supported by tradition, books or cultural beliefs.
23 The “not invented here” syndrome
This syndrome occurs when, despite being a great idea or a great product, some group or person does not want it because they have not thought of it or created it themselves. Although they do not say so explicitly and look for strange reasons to justify their opinion, the real reason is that others thought of it or made it.
This can happen when, for example, department A uses a document template to perform a task. Department B has to start doing that same task, and it would be logical to leverage the knowledge of department A, which already operates in a certain way. Instead, we may find that department B decides to make its own template, because the one offered by the other department “wasn’t their idea.”
24 “Father of the child” syndrome
This other syndrome is caused by the attachment we have to the things we have created ourselves and it is the opposite effect to the previous one. It happens when someone sees exceptional qualities in something, just because that person has created it, thought about it or worked on it.
An example is when applications or systems are decommissioned in companies: someone may be reluctant to replace a system because he has worked on it a lot, or even because he was the one who devised it. If someone sees virtues in a thing that others do not, it is usually because of an emotional attachment to it. No one likes to be told that their child is ugly or that it no longer works.
25 Murphy’s Law
Murphy’s Law is a well-known empirical principle that states that “if something can go wrong, it will go wrong”. It was formulated by aerospace engineer Edward A. Murphy in the 1940s, after a series of failed experiments in which he observed that problems and mishaps are inevitable in any process or situation. Despite efforts to plan and mitigate risks, there is always the possibility that something will go wrong, even in the most carefully designed processes.
Suppose a company launches a new product on the market. During the development process, the team has worked hard to ensure that everything goes smoothly. However, on the day of the launch, something unexpected happens: there has been a water leak in the street where the product warehouse is located and the entire street is cut off so that the first shipments cannot be sent out on time.

These are some of the many principles and laws that can help us analyze situations more easily, be prepared for certain events, and simplify decision making.