Reason: Ubiquitous Cherry-Picking, The Essence of Thought
Cherry-picking is not just a slur to hurl at those you disagree with; it is the very essence of cognition
Author’s Preface
Cherry-picking lines of evidence is the ubiquitous, default mode of human reasoning. It takes a huge effort not to cherry-pick, and few succeed.
So, one could argue that it's a feature, not a bug.
So, any evaluation of other people, of societies, of artworks, of music—all of that involves cherry-picking. And it becomes a rationalization of our biases, just a way of rationalizing “boo” and “hooray.”
It is very funny that so few people realize this, work to counteract the tendency, or notice the many ways in which cherry-picking stays hidden. It doesn't occur to them that that's what they're doing. They think they're being logical.
I haven't encountered any self-reflective discussions on the issue of cherry-picking and how we reach conclusions, even though it's endemic. It's the default mode, as I said earlier—the default mode for reaching conclusions.
Psychologically, I don't think we're capable of reasoning without bias. That's just a feature of cognition. I don't even think it makes sense to talk about reasoning without bias. We can only reason within the constraints of our current worldview, and some of us will realize that our worldviews will change over time. But nevertheless, they both enable and constrain how we reason. And it could not be otherwise.
So, we'll never have all the evidence. It's not even clear what evidence means, but just assume we know pragmatically what evidence is—information, concepts that have some bearing on our conclusions. So, let's pretend that the omniscient one gives us all the evidence at a uniform level of conceptual granularity, as in Michael Polanyi's work on The Tacit Dimension—a uniform level of description.
In even a relatively complex area of discourse, the cognitive challenge would be far beyond human capacity; thirty independent pieces of evidence already admit over a billion possible combinations to weigh. We simply couldn't make sense of it at all.
So, we'll look for and choose our evidence based upon our confirmation-disconfirmation bias and interpret it through the lens of a large number of other biases.
I don't see how it could be otherwise, given the nature of the world and the nature of our minds. We're going to have to filter the evidence, and that filtering is going to be done by unconscious processes. Even if we're trying to be systematic and use lines of evidence and cumulative arguments, we're still going to be subject to cherry-picking in the lines of evidence we use, unless, in simple cases, we're very, very disciplined and give equal weight to every counterargument we can think of, to the confirming and the disconfirming variants of each line of evidence. In any case, we couldn't handle the complexity of the world if we didn't filter the possible evidence. You can call it heuristics, but that's just an admission of ignorance. We don't know how it's done. It's a mystery.
Introduction
Cherry-picking is often used as an accusation, a rhetorical club to dismiss opposing viewpoints. It is considered a failure of reasoning—a dishonest attempt to manipulate facts while ignoring contradictory evidence. But this critique misunderstands the fundamental nature of human cognition. Cherry-picking is not just something people do when they are being deceptive or careless. It is not a behavior that can be trained out of people through better education or stricter logic. Rather, cherry-picking is the unavoidable essence of thought itself.
Every act of reasoning, every conclusion we reach, every evaluation we make: all of it depends on selection, filtering, and interpretation. The human mind cannot process all available evidence, and even if it could, doing so would be neither practical nor meaningful. Instead, we extract, highlight, and organize information according to biases, heuristics, and unconscious mental structures that shape what we notice, what we remember, and what we consider relevant.
This essay argues that cherry-picking is not a bug but a feature, an inescapable cognitive function that enables thought while simultaneously constraining it. Even in the most disciplined and systematic reasoning, we are still engaged in selection—choosing which lines of evidence to pursue, what counterarguments to consider, and how to weigh the significance of competing claims. There is no alternative.
The question, then, is not whether we cherry-pick—because we always do—but rather how we navigate this necessary selectivity to form the best possible understanding of the world.
Discussion: The Inescapability of Cherry-Picking
1. Cherry-Picking as the Default Mode of Human Reasoning
Cherry-picking is not a rare failure of reasoning; it is the default mode by which humans make sense of the world. The mind must filter, simplify, and categorize in order to function. If we attempted to evaluate every piece of evidence with equal weight, we would be unable to draw conclusions at all. Instead, we gravitate toward confirmatory evidence, discard conflicting data, and rely on pre-existing frameworks to interpret information.
This applies to all areas of human judgment:
Personal Evaluations – We emphasize aspects of a person that support our impression of them while ignoring contrary evidence.
Art and Music Criticism – We judge works based on cultural norms, personal taste, and selective exposure.
Historical and Political Narratives – Societies construct self-serving stories that highlight certain facts while omitting inconvenient truths.
Scientific Inquiry – Even in structured research, scientists select which hypotheses to test, which variables to measure, and which interpretations seem most reasonable.
The illusion of objectivity comes from the false belief that our selections are neutral or purely evidence-driven.
2. The Myth of Bias-Free Reasoning
Psychologically, we cannot reason without bias. Bias is not an accident of cognition—it is cognition. We do not reason in a vacuum; we reason within a worldview, a structured set of beliefs, experiences, and assumptions that define how we see the world.
Even those who are aware of their biases cannot escape them completely. Some may acknowledge that their worldview will change over time, but this does not remove the fact that all reasoning is shaped by what we currently believe. Our frameworks both enable and limit how we think.
3. The Limits of Evidence: Even If We Had It All
Even if we had access to all the evidence—an omniscient view of reality—it would not eliminate the problem. Why?
Cognitive Overload – The mind cannot process unlimited information. Even perfect data must be filtered to be useful.
Conceptual Frameworks Remain – We would still interpret evidence through mental structures shaped by experience and prior knowledge.
Selection Is Inevitable – Even with full access to all facts, we would still need to choose what seems relevant, what to compare, and what to weigh more heavily.
Tacit Knowledge Is Not Explicit Knowledge – As Polanyi noted, much of what we “know” is implicit, intuitive, and embedded in practice. Even if we had an explicit description of everything, it would not guarantee understanding.
Thus, more evidence does not necessarily mean better reasoning. It often just creates more complexity to filter through, reinforcing the necessity of selective cognition.
4. The Hidden Nature of Cherry-Picking: Why People Don’t Notice It
Most people are unaware that they are constantly engaged in cherry-picking because:
They mistake their reasoning for objectivity. The selection process happens unconsciously, so it feels like a neutral evaluation rather than a biased one.
They assume bias is something that affects others, not themselves. People accuse opponents of cherry-picking while failing to see their own selective reasoning.
They equate feeling logical with being logical. If an argument “makes sense” within their framework, they believe it is objective.
They don’t see what they don’t see. Because filtering happens at a pre-conscious level, they never even consider the evidence they are ignoring.
Even in debates, people do not actually engage with the full spectrum of available information. Instead, they selectively construct counterarguments that reinforce their existing position, a process often mistaken for rationality.
5. Heuristics: A Fancy Word for "We Don’t Know How Thinking Works"
The fact that we cherry-pick evidence is sometimes justified by saying we use heuristics—mental shortcuts that help us make decisions quickly. But this is just an admission of ignorance. We don’t actually understand how we filter information, why certain things seem relevant, or what drives the unconscious weighting of evidence.
We can describe biases, but we cannot fully explain:
Why some arguments feel stronger than others, even if they are logically equivalent.
Why some pieces of evidence seem more relevant, even when they aren’t.
How our minds unconsciously decide what to focus on before we even engage in conscious reasoning.
In the end, thought itself remains a mystery—a process we engage in constantly but do not fully understand.
Summary: Thought as Selective Construction
Cherry-picking is not an anomaly in human reasoning—it is human reasoning. The mind does not operate as an impartial processor of truth but as a highly selective filter, navigating a world of overwhelming complexity by constructing meaning from selected evidence.
Even in the most structured forms of reasoning—science, philosophy, historical analysis—cherry-picking is not eliminated, only formalized and constrained within explicit methodologies. But it remains present.
The challenge, then, is not how to eliminate cherry-picking, because that is impossible. The real question is: How do we reason well, despite the inescapable selectivity of thought?
The best we can do is:
1. Acknowledge that selection is inevitable.
2. Be aware that we are always filtering and interpreting.
3. Recognize that accusations of cherry-picking apply to ourselves as much as others.
4. Refine our methods of selection—not to eliminate bias, but to make it more systematic, structured, and self-aware.
We do not reason in spite of cherry-picking. We reason because of it. Thought is not the neutral evaluation of truth but the selective construction of meaning.
Bibliography
Fiedler, K., & Unkelbach, C. (2011). Reinforcement effects in the hidden profile paradigm: The impact of confirmation bias on group decision-making. Psychology of Learning and Motivation, 54, 45-80. https://doi.org/10.1016/B978-0-12-385527-5.00002-5
Friedrich, J. (1993). Primary error detection and minimization (PEDMIN) strategies in social cognition: A reinterpretation of confirmation bias phenomena. Psychological Review, 100(2), 298-319. https://doi.org/10.1037/0033-295X.100.2.298
Hart, W., Albarracín, D., Eagly, A. H., Brechan, I., Lindberg, M. J., & Merrill, L. (2009). Feeling validated versus being correct: A meta-analysis of selective exposure to information. Psychological Bulletin, 135(4), 555-588. https://doi.org/10.1037/a0015701
Klayman, J., & Ha, Y. W. (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94(2), 211-228. https://doi.org/10.1037/0033-295X.94.2.211
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220. https://doi.org/10.1037/1089-2680.2.2.175
Oswald, M. E., & Grosjean, S. (2004). Confirmation bias. In R. F. Pohl (Ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory (pp. 79-96). Psychology Press.
Plous, S. (1993). The psychology of judgment and decision making. McGraw-Hill.
Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140. https://doi.org/10.1080/17470216008416717
Wason, P. C., & Johnson-Laird, P. N. (1972). Psychology of reasoning: Structure and content. Harvard University Press.
Wason, P. C., & Shapiro, D. (1971). Natural and contrived experience in a reasoning problem. Quarterly Journal of Experimental Psychology, 23(1), 63-71. https://doi.org/10.1080/00335557143000068
These works provide a comprehensive foundation for understanding how cognitive biases influence selective attention to evidence.
Appendix A - The Psychology of Bias
These observations align with extensive research in cognitive psychology and neuroscience, which has delved into how biases influence our selective attention to evidence. This body of work examines how our perceptions, interpretations, and recall of information are systematically filtered through various cognitive biases.
Confirmation Bias and Selective Attention
Confirmation bias is a well-documented phenomenon where individuals favor information that confirms their pre-existing beliefs while disregarding contradictory evidence. This bias affects how we gather, interpret, and remember information. Studies have shown that people tend to recall evidence that supports their expectations more readily than evidence that challenges them. This selective recall reinforces existing beliefs and contributes to phenomena such as attitude polarization and belief perseverance.
Attentional Bias
Attentional bias refers to the tendency of our perception to be influenced by recurring thoughts and emotions, leading us to focus on certain stimuli while ignoring others. For instance, individuals with specific concerns or emotional states may disproportionately focus on related information, affecting their judgments and decisions. This bias demonstrates how our internal states can direct our attention selectively, shaping the evidence we consider in reasoning processes.
Selective Perception
Selective perception is the process by which individuals perceive what they expect in observations, leading to an unconscious filtering of information. This bias causes people to overlook or misinterpret evidence that contradicts their expectations, further entrenching existing beliefs. Such perceptual filtering underscores the challenges in achieving objective analysis, as our expectations can subtly shape the information we acknowledge.
Forensic Confirmation Bias
In applied settings, such as forensic science, confirmation bias can have profound implications. Research has demonstrated that experts' judgments can be swayed by contextual information, leading them to interpret evidence in a manner consistent with initial beliefs or external suggestions. For example, knowledge of a suspect's confession can influence fingerprint analysts' interpretations, highlighting how selective attention to evidence can be affected by external cues and internalized expectations.
Conclusion
The interplay between cognitive biases and selective attention to evidence is a critical area of study, revealing that our reasoning processes are inherently influenced by unconscious filtering mechanisms. Recognizing these biases is essential for developing strategies to mitigate their impact, striving for more balanced and objective decision-making.

