Understanding the World: Fact-Checking. It Seemed Like a Good Idea at the Time
The Fact-Checking Ship Runs Aground on the Rocky Epistemological Shores
Note: This essay was prepared with the research assistance and ghostwriting of ChatGPT 4.0.
1. Introduction
Fact-checking has long been heralded as a vital tool in the modern age of information, offering a seemingly objective process for sorting truth from falsehood. It promises clarity in a world overwhelmed by conflicting claims, misinformation, and the ever-present forces of propaganda. At its inception, fact-checking seemed like an idea with noble intent—striving to uncover and uphold the truth. However, as with many good ideas, its practice has become far more complex and fraught with issues. Today, fact-checking is less a neutral pursuit of truth and more a battleground where bias, corporate interests, and political pressures collide.
2. The History of Fact-Checking
Fact-checking initially emerged as a tool for journalism, ensuring that reporters and publications maintained accuracy in their work. Traditionally, fact-checkers were responsible for verifying the details of an article before it went to print—cross-referencing sources, confirming dates, and clarifying quotes. The goal was clear: ensure that the information being presented was accurate, objective, and unbiased. This practice helped establish credibility, particularly in investigative journalism, where factual accuracy was crucial.
As media evolved, so too did the role of fact-checkers. In the internet age, where misinformation spreads rapidly, fact-checking expanded from internal editorial practice to public-facing organizations. Platforms like Snopes and FactCheck.org aimed to combat false claims circulating online. Initially, this move was seen as a necessary safeguard against misinformation, but over time, fact-checking became subject to the same pressures as the media outlets they were intended to police—corporate funding, political leanings, and tribal loyalties have all contributed to the erosion of fact-checking’s original intent.
3. The Epistemology of Fact-Checking
One of the foundational issues with fact-checking lies in the nature of facts themselves. Friedrich Nietzsche famously claimed that “there are no facts, only interpretations” (Nietzsche, 1887). While this assertion may seem extreme, it highlights an important philosophical question: can we ever truly access objective truth, or are all facts filtered through human interpretation?
In the realm of fact-checking, this becomes particularly relevant. Some facts—such as the speed of sound or the date of an event—are verifiable and measurable, but much of what fact-checkers assess exists in the grey area between fact and interpretation. This epistemological problem complicates the fact-checking process. What one fact-checker may assert as a fact could be, in reality, a subjective interpretation shaped by their own biases and the context in which the information is presented.
Nietzsche’s view, though limited, reminds us of the inherent difficulty in establishing universal truths. Fact-checking, while necessary, must account for the biases and interpretive lenses through which “facts” are often seen.
4. Psychology of Understanding and Bias
Understanding is not simply a cognitive process; it is deeply intertwined with our beliefs and emotional investments. Bias, though often used as shorthand for misunderstanding, is a distinct but related phenomenon: it emerges from emotional attachments to beliefs, creating blind spots and distortions in our perception of reality.
In the context of fact-checking, biases can significantly impact the conclusions reached by fact-checkers. Confirmation bias, for example, is the tendency to favor information that supports pre-existing beliefs, while ignoring evidence to the contrary (Nickerson, 1998). Similarly, political and ideological biases often influence how facts are selected, interpreted, and presented.
Fact-checkers, despite their role as arbiters of truth, are not immune to these biases. Their emotional investment in certain beliefs, whether consciously or unconsciously, can lead them to assert interpretations as facts. Understanding, then, must be recognized as an emotional and cognitive process shaped by our worldview, and fact-checkers are subject to the same cognitive distortions that affect everyone else.
5. The Role of Objective Reality
While Nietzsche’s assertion challenges the idea of objective facts, there remain aspects of reality that we can measure, record, and verify. These objective facts—such as scientific constants or documented historical events—offer a degree of certainty that transcends personal bias or interpretation. For example, the speed of sound in dry air at 20°C is approximately 343 meters per second, and this can be measured reliably under controlled conditions (Rossing, Moore, & Wheeler, 2002).
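The speed-of-sound figure is a good illustration of a verifiable fact precisely because it follows from a measurable relationship: the value depends on air temperature. A minimal sketch, using the common textbook linear approximation v ≈ 331.3 + 0.606·T (the coefficients are a standard approximation, not taken from this essay):

```python
def speed_of_sound(celsius: float) -> float:
    """Approximate speed of sound in dry air, in m/s.

    Uses the common linear fit v = 331.3 + 0.606 * T, valid near
    everyday temperatures; the coefficients are a standard textbook
    approximation, not drawn from this essay's sources.
    """
    return 331.3 + 0.606 * celsius

print(round(speed_of_sound(20.0), 1))  # roughly 343.4 m/s at 20 °C
```

Anyone with a thermometer and suitable equipment can check such a claim independently, which is what separates it from the interpretive claims discussed below.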
Fact-checking, at its core, is meant to deal with these kinds of verifiable facts. However, the problem arises when fact-checkers are tasked with addressing more speculative or interpretive claims. In these cases, fact-checkers often find themselves not verifying facts, but rather making subjective judgments based on their understanding, biases, and the pressures they face. The distinction between verifiable facts and interpretive statements is crucial, but it is often blurred in practice, leading to errors and overreach in fact-checking efforts.
6. The Problem of Malinformation
While misinformation and disinformation have distinct roles in the world of falsehoods, malinformation is an entirely different beast. Misinformation refers to false information spread without intent to deceive, while disinformation refers to deliberately false information spread with the goal of misleading others (Wardle & Derakhshan, 2017). Malinformation, however, is not about falsity at all; rather, it concerns true information that is inconvenient or undesirable for those in power.
Malinformation is not an epistemological issue but a political one. It is information that, while true, is deemed harmful or inappropriate to share—often because it challenges the narratives upheld by governments or corporate interests. In practice, the term is a tool used to suppress truths that powerful entities do not want to become widely known. This makes malinformation a fundamentally cynical concept, as it suggests that certain truths are dangerous simply because they undermine existing power structures.
The inclusion of malinformation in the fact-checking framework is problematic because it blurs the lines between truth and falsehood. Fact-checking should be about verifying the accuracy of information, not suppressing inconvenient truths.
7. Misinformation, Disinformation, and Propaganda
To navigate the complexities of fact-checking, it is essential to clearly distinguish between information, misinformation, disinformation, and propaganda. Information refers to content that is true, though, as discussed earlier, determining truth is not always straightforward. Misinformation, by contrast, is false information that is shared without intent to deceive—essentially errors or mistakes in communication.
Disinformation, however, is a deliberate act of deception. It is the intentional spreading of false information with the goal of misleading people, whether for political gain, financial profit, or ideological control (Wardle & Derakhshan, 2017). This distinction between misinformation and disinformation is critical, as the latter is a tool of manipulation.
Propaganda, on the other hand, operates at a larger scale. It is not simply about spreading false information but about shaping narratives and controlling public perception. Propaganda is typically employed by groups—governments, corporations, political movements—and it is motivated by ideology or money. Unlike disinformation, which can be spread by individuals, propaganda is a coordinated effort, backed by significant resources, to influence entire populations.
In today’s world, propaganda has become highly sophisticated. It is often subtle, wrapped in the language of objectivity, but motivated by the interests of powerful groups. Whether through advertising, corporate influence, or political messaging, propaganda works to shape public understanding in ways that benefit those in power.
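The distinctions drawn in the last two sections can be summarized along the two axes of the Wardle and Derakhshan (2017) framework: whether the content is false, and whether it is shared with intent to harm. A minimal sketch (the field and function names are illustrative, not from the report):

```python
from dataclasses import dataclass


@dataclass
class Claim:
    """Illustrative model of a claim along the framework's two axes."""
    is_false: bool        # is the content factually false?
    intent_to_harm: bool  # is it shared with intent to harm or mislead?


def classify(claim: Claim) -> str:
    """Classify a claim per the Wardle & Derakhshan (2017) two-axis scheme."""
    if claim.is_false and claim.intent_to_harm:
        return "disinformation"   # false, and deliberately deceptive
    if claim.is_false:
        return "misinformation"   # false, but shared without intent to deceive
    if claim.intent_to_harm:
        return "malinformation"   # true, but shared to cause harm
    return "information"

print(classify(Claim(is_false=True, intent_to_harm=False)))  # misinformation
```

Note that malinformation falls out of this scheme as true content flagged only for its presumed harm—which is exactly why, as argued above, it sits uneasily inside a framework meant to verify accuracy.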
8. Corporate and Political Pressures on Fact-Checking
One of the most concerning aspects of modern fact-checking is the influence of corporate and political funding on the process. Fact-checking organizations, like many media outlets, rely on financial support to operate, and the sources of this funding are often far from neutral. As the saying goes, “follow the money,” and when we do, we frequently find that those who fund fact-checking organizations have a vested interest in shaping public perception.
Corporate sponsors, political actors, and large-scale donors all contribute to fact-checking organizations, and their financial support inevitably influences the scope and nature of the fact-checking process. Fact-checkers may shy away from scrutinizing claims that could harm the interests of their backers or may frame their findings in ways that align with the political leanings of those who fund them. This creates a troubling dynamic where the fact-checking process is not about truth but about reinforcing the narratives that serve those in power.
The integrity of many fact-checkers is, therefore, in question. While we must avoid directly accusing individuals of dishonesty without proof, the financial entanglements and incentives at play cast doubt on the neutrality of their conclusions. Moreover, we know that certain organizations and individuals have actively sought to shape public discourse through more covert means, infiltrating media outlets and fact-checking bodies.
9. Cass Sunstein, Cognitive Infiltration, and Propaganda
Cass Sunstein's views on conspiracy theories and his promotion of "cognitive infiltration" deserve critical attention. Sunstein’s work on the subject has been horrendous in its bias toward dismissing all but the most mainstream explanations. His views have allowed legitimate concerns about government and corporate collusion to be branded as "conspiracy theories," a term weaponized to stifle dissent. There are well-documented conspiracies—historical facts that show government and corporate entities working together in covert and unethical ways. To dismiss these out of hand by calling them conspiracy theories is not fact-checking; it is propaganda.
Sunstein’s theory of cognitive infiltration—the idea that governments should subtly insert themselves into groups and public discourse to discredit conspiracy theories—has played a role in biasing fact-checking organizations and shaping the broader propaganda landscape (Sunstein & Vermeule, 2009). The result is a situation where fact-checkers and media outlets alike no longer function as neutral entities seeking objective truth but as tools of power, reinforcing dominant narratives while silencing alternative viewpoints.
Sunstein’s approach is deeply flawed because it dismisses valid skepticism and critique. Not all conspiracy theories are baseless, and to categorically lump them together as untrustworthy without investigation is intellectually dishonest. Some conspiracy theories have been proven true over time, but the fact-checking apparatus, influenced by cognitive infiltration, tends to dismiss them outright if they challenge the dominant power structure.
By labeling valid inquiries as conspiracy theories, Sunstein's framework perpetuates a system where dissent is marginalized, and the fact-checking process becomes complicit in the spread of propaganda. In many cases, this has led to the suppression of legitimate concerns, questions, and evidence that do not align with the prevailing political or corporate agenda.
10. Examples of Fact-Checking Gone Astray
A prominent example of fact-checking failure occurred during the coverage of former President Donald Trump’s remarks about bleach and COVID-19. Many fact-checkers, relying on out-of-context clips, reported that Trump had suggested injecting bleach as a treatment. However, a careful review of the full transcript and video reveals that his statement was far more ambiguous. Trump’s remarks were speculative and unclear, but he did not explicitly recommend injecting bleach. This distortion became a widely accepted "fact" due to biased interpretations, not factual accuracy.
This case highlights how fact-checkers can allow their biases and political leanings to cloud their judgment. Instead of providing a nuanced analysis, fact-checkers reinforced a narrative that aligned with their own political views, distorting the facts in the process. There are numerous similar examples where fact-checkers, under pressure from corporate or political forces, have ventured into interpretive territory, presenting opinion as fact.
11. Making Fact-Checking Better
To restore trust in fact-checking, we must first acknowledge its limitations. Fact-checkers need to be transparent about the difference between objective facts and interpretive views. When a claim can be objectively verified—such as a scientific measurement or a historical date—it should be presented as fact. However, when fact-checkers delve into speculative territory, they must make clear that they are offering an interpretation, not an indisputable truth.
One way to improve fact-checking is by creating clearer guidelines for distinguishing fact from interpretation. Fact-checkers must also recognize their own biases and work to minimize their impact on the conclusions they reach. This can be done by adopting a more rigorous methodology, using multiple independent sources, and avoiding reliance on corporate or political funding.
The public, too, must be educated about the limitations of fact-checking. People should be encouraged to think critically about the information they consume and to recognize when fact-checkers are veering into the realm of opinion.
12. Conclusion
Fact-checking, once a tool for promoting truth and accountability, has become deeply compromised by corporate and political pressures. The distinction between objective facts and interpretive claims is often blurred, and many fact-checkers, influenced by financial backers and subject to cognitive infiltration, are no longer neutral arbiters of truth. The role of individuals like Cass Sunstein and the increasing sophistication of propaganda have further eroded the integrity of the fact-checking process.
In a world where financial interests and political agendas drive much of the media and fact-checking landscape, we must remain skeptical. It is essential to recognize the limitations of fact-checking and to demand transparency and independence from those who claim to be the arbiters of truth. Only by understanding these dynamics can we hope to restore trust in the pursuit of objective facts and challenge the propaganda that increasingly defines our information landscape.
References
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220. https://doi.org/10.1037/1089-2680.2.2.175
Commentary: Nickerson’s article provides a thorough exploration of confirmation bias, detailing how it affects both individual thinking and larger decision-making processes. It is foundational in understanding how fact-checkers may unconsciously align facts with their pre-existing beliefs, making it highly relevant to discussions of bias in fact-checking.
Nietzsche, F. (1887). On the Genealogy of Morals (W. Kaufmann & R. J. Hollingdale, Trans.). New York: Random House. https://philosophy.ucsc.edu/news-events/colloquia-conferences/GeneologyofMorals.pdf
Commentary: Nietzsche’s work, particularly his assertion that “there are no facts, only interpretations,” raises important epistemological questions that are central to the challenges fact-checking faces today. This philosophical perspective underscores the tension between objective reality and human interpretation, which is a recurring theme in critiques of modern fact-checking.
Rossing, T. D., Moore, R. F., & Wheeler, P. A. (2002). The Science of Sound (3rd ed.). San Francisco: Addison Wesley. https://www.amazon.ca/Science-Sound-3rd-Thomas-Rossing/dp/0805385657
Commentary: This textbook is a useful example of how certain facts, such as physical measurements like the speed of sound, can be objectively verified. Its inclusion highlights the distinction between measurable scientific constants and more subjective claims that fact-checkers often navigate.
Sunstein, C. R., & Vermeule, A. (2009). Conspiracy Theories: Causes and Cures. Journal of Political Philosophy, 17(2), 202–227. https://doi.org/10.1111/j.1467-9760.2008.00325.x https://philpapers.org/rec/SUNCTC
Commentary: Sunstein and Vermeule’s paper on conspiracy theories is deeply controversial, particularly because of Sunstein’s promotion of “cognitive infiltration.” This paper is often cited to justify governmental interference in public discourse under the guise of discrediting conspiracies, but it is also criticized for suppressing legitimate dissent and undermining trust in fact-checking processes.
A piece of pseudo-scholarly trash.
Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe report. https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html
Commentary: This report provides a comprehensive framework for understanding misinformation, disinformation, and malinformation. It offers valuable insights into how these phenomena interact with fact-checking, although its treatment of malinformation as a distinct category has been critiqued as politically motivated.
Another piece of pseudo-scholarly trash.
Additional References
These references provide students and journalists with structured approaches for integrating thorough and effective fact-checking into their reporting processes. I cannot vouch for the quality of these references.
KSJ Handbook for Science Journalism. (n.d.). Fact-checking in science journalism. Knight Science Journalism at MIT. Retrieved from https://ksjhandbook.org/fact-checking
Commentary: This handbook is designed for science journalists and provides specific strategies for fact-checking complex and technical subjects. It advises cross-checking information against multiple authoritative sources and emphasizes the role of expert consultation in verifying technical claims.
Yellowbrick Fact-Checking Guide. (n.d.). Practical steps for fact-checking in journalism. Yellowbrick. Retrieved from https://www.yellowbrick.co/blog/journalism/mastering-fact-checking-strategies-in-journalism
Commentary: This guide outlines essential steps for accurate journalism, including verifying quotes and statistics and consulting subject-matter experts. It underscores the importance of transparency and credibility in news reporting, making it a useful resource for journalism students.
Appendix A: Verifiable Facts vs. Speculative Assertions
Verifiable Facts (Objective and Easily Verified)
The speed of light in a vacuum is exactly 299,792,458 meters per second (fixed by the definition of the meter).
Water boils at 100°C (212°F) at sea level.
The Earth revolves around the Sun once every 365.25 days.
The adult human body contains 206 bones.
The freezing point of water is 0°C (32°F) under standard atmospheric conditions.
World War II ended in 1945 with the surrender of Japan.
The chemical formula for water is H₂O.
The atomic number of hydrogen is 1.
Mount Everest is the tallest mountain on Earth, standing at 8,848 meters (29,029 feet).
The human body temperature averages around 37°C (98.6°F).
The circumference of the Earth at the equator is about 40,075 kilometers (24,901 miles).
Pi (π) is approximately 3.14159.
Blood types include A, B, AB, and O.
Neil Armstrong was the first human to walk on the Moon.
Oxygen makes up about 21% of the Earth's atmosphere.
Photosynthesis is the process by which plants convert sunlight into energy.
The Eiffel Tower is located in Paris, France.
Venus is the second planet from the Sun.
Speculative or Interpretive Assertions (Often Claimed as Facts)
Political policy outcomes, such as "Tax cuts for the wealthy increase economic growth."
Climate change projections, such as "The world will warm by 3°C by 2100."
The interpretation of historical events, like "The sole cause of World War I was the assassination of Archduke Franz Ferdinand."
Health claims, such as "Consuming red meat increases cancer risk."
Economic predictions, like "Bitcoin will replace all forms of currency by 2050."
Political motivations, such as "The government’s new law was passed solely to suppress dissent."
Psychological theories, like "Childhood trauma inevitably leads to depression in adulthood."
Social interpretations, such as "Social media addiction is the primary cause of mental health issues in teenagers."
International relations assertions, like "Country X’s military buildup is a direct threat to global stability."
Cultural trends, such as "The decline of family values is the leading cause of crime in urban areas."
Economic cause-effect relationships, like "Raising the minimum wage always leads to job loss."
Medical projections, such as "By 2030, most cancers will be curable."
Political narratives, like "Candidate Y's victory in the election proves the country is more progressive."
Claims about the media, such as "All mainstream media outlets are controlled by corporate interests."
Global population projections, like "The world population will exceed 10 billion by 2050."
Technological predictions, such as "Artificial intelligence will surpass human intelligence by 2040."
Dietary claims, like "A vegetarian diet is always healthier than a meat-based diet."
Educational assertions, such as "Remote learning is more effective than in-person learning."
Gender and identity statements, like "Gender is purely a social construct with no biological basis."
Climate policy interpretations, such as "Switching to renewable energy will end fossil fuel reliance within the next decade."
Appendix B: The Gell-Mann Amnesia Effect and the Limits of Fact-Checking
The Gell-Mann amnesia effect, a term coined by the author Michael Crichton and named after the Nobel Prize-winning physicist Murray Gell-Mann, refers to a cognitive phenomenon in which individuals notice factual errors in articles on topics they know well but continue to trust the accuracy of articles on topics outside their expertise. This effect highlights a significant flaw in how we perceive and evaluate information.
The key implication of the Gell-Mann amnesia effect is that if we can spot factual inaccuracies in areas where we are knowledgeable, it stands to reason that similar errors exist in articles covering unfamiliar topics. Yet we often grant these articles the benefit of the doubt, assuming their accuracy despite the likelihood of similar mistakes.
Fact-checking, intended to serve as a safeguard against misinformation, has never fully addressed this issue. While fact-checkers may correct blatant or well-documented errors, the subtle nuances of specialized knowledge often go unchecked. This leads to a breakdown in trust, particularly when readers with domain-specific knowledge consistently encounter factual misrepresentations.
For instance, a medical professional may read an article about healthcare and immediately notice glaring errors or misinterpretations of medical studies. Yet the same individual might read an article about economics and, being less familiar with the subject, assume the facts are correct. This misplaced trust perpetuates the problem the Gell-Mann amnesia effect warns us about: fact-checking can only go so far, and in areas of specialized knowledge, the chances for error are magnified.
This raises a crucial question about the reliability of fact-checking as a whole. If fact-checkers cannot reliably identify errors in topics outside their own expertise, how can they be trusted to vet the accuracy of information across every domain they cover? The Gell-Mann amnesia effect demonstrates that fact-checking has inherent limitations, and it reminds us to approach all information with a healthy degree of skepticism—especially in areas outside our expertise.