Access to better data has not made organisations better at making decisions. Most leadership teams now have more information than at any previous point, processed faster, often with AI doing much of the analytical work. Yet the research is consistent: decision quality in most organisations has not improved correspondingly.
The reason is structural. Poor decisions are almost never caused by lack of information. They are caused by the conditions under which leadership teams operate and the habitual patterns those conditions reinforce.
McKinsey's 2026 research on team decision-making identifies six specific traps that consistently undermine decision quality across high-performing organisations. Recognising them is the first step to building the processes that counteract them.
The Six Traps
1. Skipping the Problem Definition
The most common trap is also the most consequential. Teams rush into debate and solution-generation without establishing shared clarity on the problem they are actually solving.
When a leadership team has not rigorously defined the decision, individuals are frequently debating different problems without realising it. One person is solving for short-term revenue. Another is solving for operational risk. A third is solving for team morale. The conversation feels like disagreement when the underlying cause is a failure of shared framing.
The structural fix is simple but requires consistent discipline: before any discussion of options, the group must align explicitly on the problem statement, the constraints, the criteria for success, and who owns the final call. This scoping conversation often surfaces the actual disagreement, which is frequently about values or priorities rather than facts.
2. One-Size-Fits-All Process
Not all decisions are alike. Choosing a strategic market to enter, approving a budget reallocation, and setting the date of a quarterly offsite require fundamentally different levels of deliberation, information, and governance.
Organisations that apply identical processes to every decision consistently make two errors: they over-process low-stakes decisions, creating delay and frustration; and they under-process high-stakes ones, moving too quickly through choices that warrant more rigour.
A useful approach distinguishes four categories: decisions that are high-stakes and hard to reverse, which warrant maximum deliberation; decisions that are high-stakes and reversible, which warrant careful analysis but faster action; decisions that are low-stakes and routine, which should be delegated without escalation; and decisions that are genuinely exploratory, which should be treated as experiments with defined review points rather than commitments.
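The routing logic above can be sketched in code. This is a minimal illustration, not anything prescribed by the source: the function name, attribute flags, and process labels are all hypothetical, chosen only to show how the two axes (stakes and reversibility, plus an exploratory flag) map onto four distinct processes.

```python
from enum import Enum

class Process(Enum):
    # Hypothetical labels for the four categories described above
    FULL_DELIBERATION = "high stakes, hard to reverse: maximum deliberation"
    ANALYSE_THEN_ACT = "high stakes, reversible: careful analysis, faster action"
    DELEGATE = "low stakes, routine: delegate without escalation"
    EXPERIMENT = "exploratory: run as an experiment with defined review points"

def route(high_stakes: bool, reversible: bool, exploratory: bool = False) -> Process:
    """Map a decision's attributes to the level of process it warrants."""
    if exploratory:
        return Process.EXPERIMENT
    if high_stakes and not reversible:
        return Process.FULL_DELIBERATION
    if high_stakes:
        return Process.ANALYSE_THEN_ACT
    return Process.DELEGATE
```

The point of making the routing explicit is that the classification happens once, up front, rather than being re-litigated inside each meeting.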
3. Anchoring to the First Viable Option
Research on anchoring bias is extensive and consistent. The initial option presented to a group exerts disproportionate influence over the final decision. Once a concrete proposal is on the table, discussion evaluates everything against it rather than generating genuinely independent alternatives.
Leadership teams that want better decisions need structural mechanisms to generate multiple options before any are formally evaluated. This means separating the option-generation phase from the evaluation phase, and enforcing that separation even when it creates friction. An option chosen from three carefully developed alternatives is almost always better than an option chosen from a field of one.
4. Mistaking Agreement for Quality
High-performing teams often have strong social cohesion. That cohesion is an asset in many contexts but a liability in decision-making when it produces premature consensus. The absence of visible disagreement is routinely mistaken for thorough analysis.
The most effective structural counter is to assign a dissenting role before discussion begins, with explicit responsibility for surfacing what the group might be missing or what the weaknesses in the emerging consensus are. This works better than inviting disagreement after a consensus has already formed, because by that point the social pressure to align is strong and the cost of dissenting feels personal rather than professional.
5. Treating Data as a Substitute for Judgment
The availability of sophisticated AI analysis can create an impression that a decision has been made by the data. Data describes what has happened. It does not determine what should happen next. This distinction becomes more consequential as AI systems generate increasingly authoritative-looking outputs.
Data is a critical input to judgment, not a replacement for it. Leadership teams that delegate decision authority to models, or that accept AI-generated recommendations without applying contextual knowledge, are offloading the very thing they contribute most: the capacity to understand what the data cannot capture.
Effective decision-making in the AI era requires being explicit about where data ends and judgment begins, and documenting that boundary so it can be examined and, where appropriate, challenged.
6. Sunk Cost Anchoring
Investments already made exert powerful psychological pull on subsequent decisions. Organisations continue with failing initiatives not because the forward-looking case is strong, but because stopping means acknowledging that earlier decisions were wrong.
The discipline required is to evaluate every continuing investment as though it were a new decision: given where we are now, with the information we have now, is this the best allocation of attention and resource going forward? The answer to that question should not change based on how much has already been committed.
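The discipline can be made concrete with a toy calculation. The figures and option names below are invented for illustration, not drawn from the source: the only point is that sunk costs never appear in the inputs, so they cannot influence the answer.

```python
def best_forward_allocation(options: dict[str, float]) -> str:
    """Pick the option with the highest expected forward value.
    Sunk costs are deliberately absent from the inputs: each value
    reflects only future benefits minus remaining future costs."""
    return max(options, key=options.get)

# Illustrative (assumed) figures: continuing the existing initiative
# is expected to return 1.2m for 1.0m of *remaining* spend; the
# alternative returns 0.9m for 0.3m. Money already spent on the
# initiative appears nowhere in this comparison.
forward_value = {
    "continue_initiative": 1.2 - 1.0,
    "reallocate": 0.9 - 0.3,
}
best = best_forward_allocation(forward_value)
```

Here the forward-looking comparison favours reallocation, regardless of how much has already been invested in the existing initiative.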
The Common Thread
Each of these six traps shares an underlying mechanism: a social or cognitive pressure that drives the group to converge on a decision before thoroughly examining whether it is the right one.
The most effective response is not to remove human judgment from the process. It is to build conditions in which human judgment can operate at its best: clear problem framing, structured deliberation, explicit dissent, and a cultural norm in which revisiting a decision is treated as rigour rather than weakness.
The organisations making better decisions are not those with better data. They are those with better conditions for thinking together.
References
McKinsey & Company (2026) The State of Organizations 2026. New York: McKinsey Global Institute.
Kahneman, D. (2011) Thinking, Fast and Slow. London: Allen Lane.
Lovallo, D. and Sibony, O. (2010) 'The case for behavioral strategy', McKinsey Quarterly, March 2010.
Edmondson, A.C. (2018) The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth. Hoboken, NJ: Wiley.