Insight for Business Leaders Building Great Values-Driven Companies

Good Decisions Require Good Information

Written by Eli Makus | August 23, 2023

Making good decisions is a business leader’s most important responsibility. But good decisions require good information, which can be elusive. As Robert K. Greenleaf noted,

“On an important decision, one rarely has 100% of the information needed for a good decision no matter how much one spends or how long one waits. And, if one waits too long, [they have] a different problem and [have] to start all over. This is the terrible dilemma of the hesitant decision maker.” Robert K. Greenleaf, The Servant as Leader

Actually, the dilemma for business leaders is even more dire. Not only do leaders rarely have all the information, but they also cannot be sure the information they do have is accurate or reliable. How can decision makers heed Mr. Greenleaf’s warning and act quickly and decisively when the available information is incomplete and unreliable?
Employing specific critical thinking skills and analytical tools allows leaders to make the best of limited and uncertain information. 

Our firm specializes in making factual findings amid contested employment disputes. Employers hire our lawyers to conduct impartial investigations in response to employee complaints, usually involving harassment, discrimination, or retaliation. Our investigators are trained to use critical thinking skills and analytical tools to evaluate the information they gather and reach findings. We find that many of these same skills and tools also help us evaluate information in our own business, leading to better-informed, more thoughtful decisions.

We all know “facts” are elusive, especially when relying on people and their memories to report them.

Further, the more we critically analyze a fact, the more elusive and uncertain it becomes. Why? The information business leaders receive daily is inevitably subject to varying perspectives, unreliable memories, and incomplete data. Opinion too often masquerades as fact. Imagine a fact as a prism: what it reveals depends entirely on which side the person describing it is looking at. Everyone will see the prism, and the fact, a little differently, given their perspective, interests, and experience. Social psychologists also tell us that good stories can increase the persuasiveness of weak facts but decrease the persuasiveness of strong facts.¹


If we cannot have all the facts and guarantee their accuracy, we must focus on the reliability of the information we gather. Leaders cannot make good decisions without reliable information.


Some facts are more reliable than others. Hopefully, your financial statements are accurate (although even they can be subject to the prism effect, depending on how information is categorized). Facts become harder to assess when they come filtered through someone’s description and interpretation.


Let’s say you have to evaluate the success of a recent social media marketing campaign. The Marketing Director, Maria, claims the campaign was successful but would have been more impactful if Sales had done more to support it. The Controller, Carl, notes that the campaign exceeded its budget and says he never believed it would succeed. The Sales Manager, Suri, believes social media is important but claims the campaign was never going to directly impact sales this quarter and did little more than briefly elevate the brand in a crowded marketplace. Suri insists only direct contact by the sales team can truly move sales.

How do you evaluate whether Maria’s campaign was a success, and who is responsible for its lackluster performance? The following framework applies critical thinking skills and cognitive tools to analyzing incomplete and conflicting information from multiple sources.


Step One: Prepare Yourself

Prepare yourself by recognizing that the moment calls for a thoughtful evaluation of information and by reflecting on how you will approach that analysis. Consider three key critical thinking skills:

  1. Commit to curiosity. The ambiguity and elusiveness of facts can lead to impatience and frustration, especially when time is limited. But curiosity drives more analysis and information gathering, which leads to better understanding. Asking questions from a place of curiosity instead of judgment will elicit more accurate information. When faced with reactions that seem disproportionate, out of touch, or uninformed, lead with curiosity and try to understand why.
  2. Be open-minded. Being open to multiple outcomes and solutions will lead to better decisions. Committing to this approach can limit the impact of confirmation bias, the cognitive tendency to seek out only information consistent with a desired outcome. For example, you may bring your own biases to the state of the marketing campaign – you may have fully endorsed it or questioned it from the beginning. Staying open to a variety of explanations helps you avoid absorbing only the information that supports your original position.
  3. Evaluate your own thinking. Metacognition is, put simply, thinking about your own thinking. The ability to reflect on your own analysis and decision-making helps you correct course when biases and other threats to clear thinking interfere.


Step Two: Use Cognitive Tools to Evaluate the Reliability of Information

If we cannot achieve certainty, we must evaluate reliability. The more reliable the information, the more confidently we can rely on it when making decisions. These tools can quickly illuminate reliable – and unreliable – information.

  1. Is there corroboration, or a lack thereof? Nothing establishes the reliability of a fact more strongly than multiple sources confirming the same information. But always ask how they know. What if five people corroborate that something happened, but four of them heard it from the first and have no independent knowledge? In our example, Carl seems to corroborate Suri’s account because he never believed the campaign would work. But is Carl a reliable source on the success of a marketing campaign? Suri claims the campaign was never going to increase sales but acknowledges it “elevated the brand.” Maybe Suri’s account corroborates Maria’s view that the campaign was initially successful but needed more support from Sales to truly drive sales. More information may be needed to evaluate how Sales could have affected the campaign’s success.
  2. Did the source of this information have an opportunity to observe what they are sharing? When presented with compelling information, ask how the source knows it. This applies not only to people telling you information but also to documents, spreadsheets, and data – there is a source behind all information. Can you trust that source? Carl may be reliable on the budget, but what information does he have to support his view that the campaign would never be successful? Maria, on the other hand, can point to specific analytics showing increased brand exposure on social media platforms. Suri can point to specific sales data for the relevant period.
  3. Evaluate the specificity of the information. Vague or ambiguous information suggests one of two possibilities. One, the source is well-intentioned but does not really know whether the information is reliable. Two, the source is not being forthcoming and is shaping the information to suit their purposes. Ask enough questions to learn whether the source has specific examples or just generalities.
  4. Are biases distorting your analysis or theirs? We all have biases. We have a bias in favor of easier, faster outcomes. We have biases we cannot see because they are the product of our lived experiences. Biases are natural, but they can lead to unreliable, uninformed, and misleading conclusions.² When someone shares information with you, always consider how their biases may affect the reliability of what they tell you. Maria, for example, has a bias in favor of demonstrating the campaign’s success. She may be right, but consider her statement with that side of the prism in mind.
  5. Look for consistent, or inconsistent, statements. Past statements that are consistent with current statements strengthen the reliability of the information. The opposite is also true: if the story has changed, seek to understand why. What if you learned that Suri originally supported the social media campaign, but Sales team members did not? Now Suri’s changed opinion may reflect an effort to protect the sales team rather than a truly objective assessment.
  6. What motives are influencing how information is presented to you? As with bias, we all have motives that influence how we characterize information. Evaluate motives with caution, however. Just because someone has a motive to exaggerate or fabricate information does not mean they are doing so.
  7. Avoid relying on your assessment of demeanor. Do you believe you can look someone in the eye and tell whether they are lying? Research shows people detect lies at rates scarcely better than chance. We are not reliable evaluators of others’ demeanor. Look for objective, logical ways to analyze the reliability of what people tell you.


After evaluating the reliability of the facts and information you have received, it is time to turn to decision-making. In Maria’s case, you may still need more information. But by using these tools, you have made headway in evaluating the reliability of Maria’s, Carl’s, and Suri’s assessments of the campaign. While Carl’s account is the least reliable on the campaign itself, you now know you need more information to evaluate whether Sales could have or should have done more to support the campaign.


Ultimately, this framework advocates two concepts that sit uneasily together. On one hand, be methodical, objective, analytical, and unbiased in how you evaluate information. This practice will clear the way to better decisions. On the other hand, recognize that the facts upon which you must rely are inherently incomplete, elusive, and imperfect. And remember Mr. Greenleaf’s warning – hesitant decision makers who wait for all the facts to become certain risk having to start all over.


¹ https://www.sciencedaily.com/releases/2019/08/190819082446.htm
² To learn more about how biases and heuristics impact human thinking and decision making, see Thinking, Fast and Slow, by Daniel Kahneman.