A Blindspot in Cognitive Immunology Research?
Recently, I was considering purchasing the book "Mental Immunity: Infectious Ideas, Mind Parasites, and the Search for a Better Way to Think" when I came across a comment in a review of the book that I found extremely indicative of the type of reasoning plaguing the world. The short answer to the title of this post is "no," but I raise the question because it is not at all clear, given the methods presented by this research, how to handle this type of backlash.

Being interested in critical thinking, I was naturally intrigued by the idea of Cognitive Immunology. The idea is that there is a system in the mind, analogous to the body's immune system, that can be trained, strengthened, and fortified to help the individual avoid adopting parasitic beliefs. The term refers to the mind's ability to resist harmful influences, including misinformation, biases, and maladaptive thinking patterns. According to this research, critical thinking by itself is insufficient to prevent us from adopting problematic beliefs.

I don't want to spend much time on what I find wrong with this research program, so I'll be brief. Generally, I am on board with the program but have some reservations about the analogy. I don't think the community has established that there is an analogous structure in the mind, and if the term is used only metaphorically, it could be misappropriated. I also don't really see a distinction between "learning" and "strengthening your cognitive immunity"; a comprehensive critical thinking course would cover the majority of the content recommended by this community, so I am not entirely sure where the value-add lies beyond rebranding "critical thinking" in scientific terminology. Nevertheless, there is sufficient overlap with my own goals when it comes to disseminating critical thinking skills that help people avoid being exploited.
I'll first describe some of the basic recommendations from this research before I analyze the comment.
Let's start by having a look at their 12-step recommendation for achieving mental immunity:
I am entirely in favor of every recommendation with the exception of Step 8, which has in no way been established philosophically. Nevertheless, I understand the sentiment: they are trying to prevent someone from becoming a moral relativist who is passive about discussions of morality. I wholeheartedly agree with that concern. We need to engage in moral analysis and reasoning to gain a broader understanding of the moral landscape. However, insistence on moral objectivity can obfuscate the perspectivalism embedded within this complex discussion. Many people come to some moral conclusion and insist it is objective at the expense of furthering the dialogue. Perhaps this can be avoided by adhering to the other steps.
They also have a "New Socratic Method" resource that I find useful, although I don't entirely see how it differs from the original method seen in the Socratic dialogues:
I've got no problem with this outline. I'm constantly trying to figure out ways to improve dialogue, so this seems like a useful resource. I am not sure how someone could disagree with it unless they severely lacked intellectual virtue. CI researchers also have resources distinguishing rational justification from dogmatism:
One trap novice critical thinkers fall into is some sort of radical skepticism when they realize many of their beliefs are held with dogmatic certainty. It's important to note that you can have firm beliefs provided they are arrived at through rational mechanisms. You just have to be willing to revise your beliefs if they cannot withstand scrutiny. CIRCE (Cognitive Immunology Research Collaborative) suggests several methods to strengthen cognitive immunity by enhancing critical thinking and reducing susceptibility to misinformation and cognitive biases. Here are some key strategies:
- Mental Inoculation: Just as vaccines expose the body to a weakened form of a virus to build resistance, mental inoculation involves exposing people to small doses of misinformation along with explanations of why it's false. For example, if you know that deepfakes are becoming common, learning how they are made and identifying their flaws helps build resistance to their deceptive effects.
- Cognitive Reflection: Engaging in slow, reflective thinking instead of relying on gut reactions can help counter cognitive biases. Ask yourself, “Is this claim supported by evidence?” or “Am I being influenced by emotions or cognitive shortcuts?”
- Ideological Flexibility: Being aware of ideological immune systems (resistance to changing beliefs) helps individuals become more open to revising their views when faced with strong evidence.
- Epistemic Vigilance: Be critical of information sources and assess their credibility. Cross-check facts from multiple reputable sources before forming conclusions.
- Debiasing and Cognitive Hygiene: Identify and correct cognitive biases like confirmation bias (favoring information that supports existing beliefs) or the availability heuristic (basing judgments on easily recalled examples).
- Socratic Questioning: Use systematic questioning to analyze claims, such as: What do you mean by that? (clarification); What evidence supports this? (evidence; more on this later); Could there be another explanation? (alternatives); If this belief were wrong, what would the implications be? (consequences)
We generally ought to strive for an ideological flexibility that strengthens cognitive immunity. The more rigid an ideology, the more susceptible it is to pathogens. Beliefs should not be treated as sacred or immune to questioning. This requires epistemic humility: acknowledging that you might be mistaken while remaining open to belief revision based on new evidence. The evidence itself must be critically assessed: Is the idea based on reliable evidence? Does it come from a trustworthy source? Are there fallacies in the source's argument? In this process, we should ask whether the proposed belief encourages dogmatic thinking or open inquiry; dogmatic thinking lacks epistemic virtue.

Well-functioning minds thrive on intellectual diversity and open dialogue, so you ought to avoid echo chambers, but in a healthily skeptical way. Curiosity is a powerful antidote to dogmatism. Therefore, we ought to cultivate a spirit of inquiry, a questioning mindset in which you constantly ask "what else might be true?" and "how do we know this?". This involves separating truth-seeking from group loyalty, committing to reason and evidence over identity and tribalism. Another important habit of mind is intellectual charity: instead of dismissing opposing views, try to understand them in their strongest form. Equally important is actively open-minded thinking: engage in intellectual exploration rather than ideological entrenchment, and be curious rather than defensive when encountering new information. Lastly, and probably most importantly, practice metacognition. Become aware of how you form beliefs and the potential flaws in your reasoning. Regularly reflect on why you believe what you believe; interrogate the methods by which you came to your conclusions.
So far, the methods and dispositions mentioned operate on a more localized, individual scale. But what about immunity to systematic, coordinated, institutional campaigns designed to obfuscate truth? Cognitive immunology provides methods to address these scenarios as well. Some of them include:
- strategies to recognize how institutions can create the illusion of scientific debate when there is actually broad consensus.
- strategies to enhance greater epistemic vigilance and media literacy needed to detect conflicts of interest.
- strategies to break ideological bubbles by exposing people to diverse, reliable perspectives, key to counteracting group-based misinformation and echo-chambers.
- strategies to avoid the "marketplace of ideas" fallacy; cognitive immunology critiques the idea that all perspectives deserve equal weight. Instead, credibility must be determined by epistemic rigor.
- Cognitive immunologists advocate for algorithmic interventions (e.g., limiting misinformation virality) and teaching digital literacy. Bad ideas spread like viruses and are frequently constructed using memetic engineering.
- strategies to enhance cognitive inoculation—exposing people to authoritarian misinformation tactics beforehand to build resistance to extremist groups that use identity-based persuasion to cast doubt on objective truth.
- strategies of debiasing—training individuals to recognize financial and psychological manipulation in narratives. This keeps us aware of how grifters and misinformation peddlers (e.g., influencers, politicians, media figures) exploit cognitive biases for financial or ideological gain.
- Cognitive Hygiene ("Clean" vs. "Contaminated" Information): Treat unvetted claims like potential mental pathogens. Before accepting an idea, ask: Has this claim been vetted by independent sources? Is this media outlet known for accuracy, or does it have a history of spreading falsehoods? Am I being emotionally manipulated (fear, outrage, tribalism)?
- Source Reliability Check ("Trust, but Verify"): Don't just look at the headline or source name; dig deeper. Who owns or funds the media outlet (corporate, government, activist)? Is it transparent about its sources? What's the outlet's track record for factual accuracy? (Use fact-checking sites.)
- Cross-Checking with Independent Sources: If a claim is important, verify it across multiple reputable sources. Example: if one news site reports a major scandal, check whether other reliable sources confirm it. Be wary of single-source stories or "exclusive" reports that lack outside verification.
- Recognizing Manipulative Framing and Bias: Media can distort facts through framing, selective omission, and emotional language. Is this article aiming to inform or to persuade? Does it use emotionally charged words to trigger outrage? What alternative viewpoints or missing facts might change the story?
- The "Illusion of Depth" Test: Misinformation often feigns intellectual depth but lacks substance. When an article or speaker makes complex claims, test them by asking: Could I explain this claim in detail to someone else? What specific evidence supports it? Does the argument rely on vague buzzwords or conspiratorial thinking?
- Epistemic Humility (Questioning Your Own Biases): Cognitive immunity weakens when we too easily accept information that aligns with our preconceptions. Before accepting a story, ask: "Would I believe this if it contradicted my political or ideological views?" Check whether you are falling into confirmation bias.
- Checking for "False Balance" or Manufactured Controversy: The "both sides" fallacy can make fringe views seem more legitimate. For example, giving equal weight to climate scientists and climate deniers creates the illusion of an ongoing debate. Rely on expert consensus, not just debate presence.
- The "Misinformation Hallmarks" Test: Watch for classic misinformation patterns: conspiratorial thinking (suggesting a secret plot without clear evidence), appeals to emotion over facts, cherry-picking evidence (using selective data while ignoring the bigger picture), and moving the goalposts (demanding more and more "proof" even when evidence is strong).
- Algorithm Awareness (The Role of Social Media): Platforms amplify outrage and misinformation because it drives engagement. Actively follow diverse sources to counteract filter bubbles. Don't rely on headlines or viral posts; read full articles. Use fact-checking tools for viral claims (e.g., Snopes, Media Bias/Fact Check).
- Applying the "Socratic Mindset": Instead of passively consuming media, interrogate claims by asking: What’s the strongest counterargument to this claim? If this were false, how would I know? What assumptions does this media source take for granted?
Here, then, is the comment that prompted this post:

From the examples given throughout this book, Andy Norman appears to be a left-leaning philosophical naturalist who gives the impression (at times) that conservatives and religious folk don't have a shred of evidence on their side, but instead blindly believe what they know ain't so and stubbornly refuse to bow before the facts. They are also morally culpable for this, like the ship owner in Clifford's story; such warrantless beliefs are deadly not only to the individual but to society as a whole. His hope is that people could adopt a method leading to mental immunity, so as to be protected from delusional conservative and religious beliefs. I take issue with his concept of what belief is. Beliefs happen to us. I believe what I think is true, and what I think is true I believe; the assumption of truth is intricately connected with belief. Though I have never been to Egypt, I believe it exists, and I could not, by an act of will, decide to believe that Egypt does not exist. My belief in the existence of Egypt is based on trust in a variety of authorities; it is grounded in good evidence. There are lots of things I couldn't believe: the Hindu religion, for example, lies far outside my plausibility structure, and philosophical naturalism seems utterly irrational and absurd. Sure, in a sense I could "will to believe," but this wouldn't result in belief. If I decided I wanted Hinduism or atheism to seem more plausible to me and wanted to belong, I could read their books, surround myself with their adherents, immerse myself in their worlds, and maybe someday I would wake up and find "belief" happening to me, meaning materialistic or Hindu assumptions would no longer seem false, but true. Andy is simply mistaken in thinking faith is believing what we know is not true, for if we know it is not true, we don't believe it.
Those things which I believe, I rightly or wrongly think are aligned with reality, and unless I have a good reason, it wouldn't be right to change my mind when I am given what seem to be weak or fallacious counterarguments.
So now, let's pick up one of Andy's favorite examples of irrationality: the "climate deniers," those people who supposedly know global warming is wholly caused by humanity and will result in human extinction, and yet foolishly suppress the obvious evidence and willfully choose to believe what they know is not true. Andy seems to think that if only they would see they are morally obligated to affirm what is true and bow before the evidence, they would repent of their evil denial, agree with the global warming alarmists, and rid their minds of a parasite. But this is the problem: we who are labeled "climate deniers" (an attempt to associate us with Holocaust deniers) actually believe that the science is not settled and that there is good reason to question whether humanity's CO2 production is the primary driver of climate change. My belief on the matter is grounded in what I (rightly or wrongly) consider good evidence. Both Andy and I are in the position of basing our conclusions on faith in authorities and research, for neither of us is capable of personally proving and verifying these scientific claims. Of course, both of us are also biased by our political tribes, personalities, and experiences, which influence who seems trustworthy and what evidence seems weak or weighty.
Let's analyze this reasoning using Schum's framework for evidential reasoning, applied to a standard skeptical move: treating a single case of fabricated data by one climate scientist as grounds for doubting climate science as a whole.

- Relevance: Is the evidence actually linked to the hypothesis? Evidence is only meaningful when it has an inferential connection to the claim. In the climate change example, the fabrication of data by one scientist does not logically disprove the existence of climate change; the falsified data only affects claims that specifically relied on it. Schum would highlight that just because a piece of evidence relates to a topic does not mean it has strong inferential force.
- Inferential Strength: How much does the evidence actually support or contradict the hypothesis? Even if evidence is relevant, it might not be strong enough to outweigh a body of other evidence. The case of the falsified climate data is an example of weakly relevant evidence—it may call into question one particular study but does not refute the overwhelming body of independent evidence supporting climate change. Schum would say that in evidential reasoning, we must look at the net effect of all evidence rather than overemphasizing isolated pieces.
- Competing Evidence: How does this evidence compare to the total body of evidence? One of Schum’s central ideas is that evidence exists within webs or chains, not in isolation. If the evidence supporting a claim is extensive and comes from multiple independent sources, a single counterexample (like one case of fraud) does little to weaken the overall hypothesis. He would argue that strong counter-evidence would need to attack multiple independent sources of confirmation, not just a single flawed study. Climate change research is also interdisciplinary; someone would have to refute these independent lines of inquiry.
- Dependence and Overweighting: Are people overvaluing certain evidence due to cognitive biases? Schum warns against confirmation bias and availability bias, where people selectively focus on evidence that supports their beliefs while ignoring contrary evidence. In the climate change case, someone rejecting the consensus based on a single case of fraud is likely committing an evidential fallacy, overweighting one piece of counter-evidence while disregarding the broader pattern. This seems to be standard practice among climate change denialists.
- Causal vs. Associational Evidence: Does the evidence address causality or just association? Some evidence may be indirectly related but not actually bear on the causal question at hand. A scientist faking data does not cause climate change to be false; it only affects confidence in that one study. Schum would say that evidential reasoning must distinguish between undermining a source (which weakens an argument) and undermining an entire hypothesis (which requires much stronger evidence).
Applying these criteria to the fraudulent-study example:

- Relevance: Yes, the evidence is somewhat relevant, but only in relation to trust in that particular study, not to the broader hypothesis.
- Inferential Strength: Very weak—it does not actually challenge the mechanisms or data underlying climate change theory.
- Competing Evidence: The overwhelming scientific consensus, independent lines of data (temperature records, ice core samples, atmospheric CO₂ levels, etc.), and multiple replications outweigh the effect of one fraudulent study.
- Cognitive Biases: The person making the argument might be overweighting one counterexample and ignoring the larger evidential picture.
More generally, evidential reasoning goes wrong in a few recurring ways:

- Some people focus on individual pieces rather than net evidential weight.
- Some may misinterpret relevance, treating tangentially related evidence as if it were directly refutational.
- Some may rely on evidential shortcuts, like assuming that a flaw in one study discredits an entire field.
- Others may disregard dependence structures (e.g., multiple studies drawing from different data sources are stronger than those all relying on the same flawed dataset).
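The structural point above can be made concrete with a toy Bayesian sketch. The numbers below are purely illustrative assumptions, not real climate data: each independent line of evidence is modeled as a log-likelihood ratio, and independent evidence combines by summing those ratios. Discrediting one of many independent lines barely dents the posterior odds.

```python
import math

def posterior_odds(prior_odds, log_lrs):
    """Combine independent evidence streams by summing their
    log-likelihood ratios (standard Bayesian bookkeeping)."""
    return prior_odds * math.exp(sum(log_lrs))

# Ten hypothetical independent lines of evidence (temperature records,
# ice cores, atmospheric CO2, etc.), each with a modest log-LR of 2.0,
# i.e., the data are roughly 7x likelier under the hypothesis than not.
lines = [2.0] * 10

intact = posterior_odds(1.0, lines)            # all ten lines intact
discredited = posterior_odds(1.0, lines[:-1])  # one study thrown out entirely

print(f"odds with all lines:   {intact:.3g}")
print(f"odds minus one line:   {discredited:.3g}")
```

Even after dropping one line completely, the posterior odds remain overwhelmingly in favor of the hypothesis; the web of evidence, not any single strand, carries the weight. To overturn the conclusion, counter-evidence would have to attack many independent strands at once, which is exactly Schum's point about webs of evidence.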
When I spent some time trying to find the best evidence that global warming is caused by human activity, I was at first humbled, for I learned that climate is a remarkably complex system, and I was also convinced that human activity could play a role. I understand Mill's methods of agreement and difference that lead to the conclusion that human activity sticks out as the primary difference. However, this is induction, and it is also clear that the precise relationships and factors involved in climate are not entirely understood. As with the complexities of the human body, so it is with climate.

Anyhow, in my research I learned that the greenhouse gases humanity produces are tiny compared to what nature produces; if nature's contribution is like a tidal wave, ours is like a teaspoon. According to the National Center for Policy Analysis, about 98% of our atmosphere consists of nitrogen, oxygen, argon, and other gases, and about 1 to 2% consists of greenhouse gases. Of these greenhouse gases, 95% is water vapor, 3.62% is CO2, and 1.38% is other things like methane. Of the CO2, 96.6% is natural and 3.4% is from human activity. Compared to water vapor, the human contribution is 0.28%, and for all the alarmism, 0.28% seemed to me a rather small amount. Now, when forced to, global warming alarmists acknowledge these figures, and of course they have a quick response: that the human contribution is causing a positive water vapor feedback loop. But when I went to investigate whether humans are causing a positive water vapor feedback loop, I learned that little is determined or known; there is as yet no solid evidence that human activity is causing this, and it isn't even certain that there is a runaway feedback loop. The whole bloody thing seems to rest upon a weak premise that hardly anyone ever addresses.
Returning to the institutional campaigns discussed earlier, consider the think tanks that have promoted climate skepticism:

1. The Heartland Institute (USA): One of the most prominent climate denial think tanks. It has received funding from fossil fuel companies and politically conservative donors. Organizes annual conferences promoting climate skepticism. It published a “Climate Change Reconsidered” series, opposing IPCC reports.
2. Competitive Enterprise Institute (CEI) (USA): Opposes regulations on carbon emissions. It has previously received funding from ExxonMobil. It has promoted lawsuits against climate policies.
3. Cato Institute (USA): Libertarian think tank co-founded by Charles Koch. Previously housed the "Cato Center for the Study of Science," which downplayed climate risks. It has argued against government action on climate change.
4. The Heritage Foundation (USA): Conservative think tank with a history of challenging climate science. Supports fossil fuel expansion and deregulation. Opposes policies like the Green New Deal.
5. The George C. Marshall Institute (USA) (Now rebranded as CO2 Coalition): Previously promoted doubt about climate change before rebranding. CO2 Coalition argues that carbon dioxide is beneficial rather than harmful. Has received funding from fossil fuel interests.
6. The Manhattan Institute (USA): A conservative think tank that downplays the urgency of climate action. Publishes reports minimizing the economic impact of climate change.
7. The American Enterprise Institute (AEI) (USA): Free-market think tank that has funded work criticizing climate science. Once offered scientists money to critique IPCC reports.
8. The Global Warming Policy Foundation (GWPF) (UK): UK-based climate skeptic group founded by Nigel Lawson. Opposes climate policies and promotes fossil fuel interests. Criticized for lack of transparency regarding funding sources.
9. Institute of Public Affairs (IPA) (Australia): Australian free-market think tank opposing climate action. Ties to coal and mining industries. Has promoted misleading climate information in media.
10. Fraser Institute (Canada): Canadian think tank with ties to the fossil fuel industry. Has downplayed the risks of climate change and criticized emissions regulations.
11. The Center for the Study of Carbon Dioxide and Global Change (USA): Promotes the idea that higher CO2 levels are beneficial for plant growth. Downplays climate risks.
12. Science and Public Policy Institute (SPPI) (USA): Run by Christopher Monckton, a known climate change denier. Distributes reports that contradict mainstream climate science.
13. The Cornwall Alliance (USA): Religious group that frames climate action as contrary to Christian teachings. Argues that environmental regulations harm economic freedom.
14. EIKE (European Institute for Climate and Energy) (Germany): A German think tank that spreads climate misinformation. Tied to far-right political groups.
Against these stand the scientific organizations that affirm human-caused climate change:

- Intergovernmental Panel on Climate Change (IPCC) – The leading international body on climate science, producing comprehensive assessment reports confirming human-induced climate change.
- United Nations Framework Convention on Climate Change (UNFCCC) – Oversees international climate agreements like the Paris Agreement, based on scientific consensus.
- World Meteorological Organization (WMO) – Tracks global climate trends and supports the IPCC's findings.
- United Nations Environment Programme (UNEP) – Provides global research on environmental changes, including climate change.
- International Energy Agency (IEA) – Analyzes the impact of fossil fuel consumption on climate change.
- World Health Organization (WHO) – Studies the health impacts of climate change and confirms human influence.
- National Aeronautics and Space Administration (NASA) – Conducts satellite monitoring of climate change and confirms human-caused warming.
- National Oceanic and Atmospheric Administration (NOAA) – Tracks temperature, sea level rise, and extreme weather linked to human activity.
- United States Geological Survey (USGS) – Studies climate-related changes in Earth's systems.
- American Association for the Advancement of Science (AAAS) – One of the largest scientific societies affirming anthropogenic climate change.
- American Meteorological Society (AMS) – Concludes that "climate change is real" and human activity is the dominant cause.
- National Academy of Sciences (NAS) – The leading U.S. scientific advisory body that strongly supports the climate consensus.
- American Geophysical Union (AGU) – Studies Earth's climate systems and supports the conclusion that humans are driving global warming.
- American Chemical Society (ACS) – Affirms the role of human-produced greenhouse gases in climate change.
- American Physical Society (APS) – Recognizes human-caused climate change as a significant scientific issue.
- Royal Society – The UK's leading scientific academy, strongly affirming human-induced climate change.
- Met Office Hadley Centre for Climate Science and Services – A major climate research institution.
- European Space Agency (ESA) – Monitors climate data from satellites, confirming human-driven trends.
- European Academies' Science Advisory Council (EASAC) – Represents EU scientific organizations and supports climate action.
- European Geosciences Union (EGU) – Confirms the scientific basis of anthropogenic climate change.
- Environment and Climate Change Canada (ECCC) – Tracks climate trends and attributes warming to human activities.
- Canadian Meteorological and Oceanographic Society (CMOS) – Recognizes human-induced climate change.
- Commonwealth Scientific and Industrial Research Organisation (CSIRO) – Australia’s premier research organization confirming human-driven climate change.
- Australian Academy of Science – Affirms the scientific consensus on global warming.
- New Zealand Climate Change Research Institute (NZCCRI) – Researches human-caused climate change impacts in the region.
- Chinese Academy of Sciences (CAS) – Supports the global scientific consensus on climate change.
- Japan Meteorological Agency (JMA) – Studies climate change and attributes warming to human activities.
- Indian Institute of Tropical Meteorology (IITM) – Conducts research on monsoon patterns and global warming.
- South African Weather Service (SAWS) – Studies regional climate impacts and human influences.
- Brazilian National Institute for Space Research (INPE) – Monitors Amazon deforestation and climate change impacts.
- The Potsdam Institute for Climate Impact Research (PIK) (Germany) – A leading climate research center.
- The Grantham Research Institute on Climate Change (UK) – Researches policy responses to climate change.
- The Scripps Institution of Oceanography (USA) – A major contributor to climate science.
- The Tyndall Centre for Climate Change Research (UK) – Studies mitigation strategies for climate change.
- The Climate Research Unit (CRU) at the University of East Anglia (UK) – One of the most influential climate data centers.
Underlying much of this dispute are genealogical debunking arguments, which take the generic form:

- S's belief that P is explained by X
- X is an off-track process
- Therefore, S's belief that P is unjustified
- Premise 1 (Genealogical Premise - Causal History): Belief B was formed due to causal process C (e.g., evolutionary adaptation, social conditioning, political influence, or cognitive bias).
- Premise 2 (Explanatory Independence): The truth or falsity of B plays no role in C—that is, C would have produced belief B regardless of whether B is true.
- Premise 3 (Epistemic Deficiency - No Truth-Tracking): If a belief is formed due to C and C is independent of the truth of B, then B lacks justification.
- Conclusion (Debunking): Therefore, belief B is unjustified, and we should either reject it or suspend judgment.
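The schema above can be compressed into a single inference pattern. Writing Formed(B, C) for "B was produced by process C" and Indep(C, B) for "C would have produced B regardless of B's truth":

```latex
% Genealogical debunking schema (sketch)
\begin{align*}
&\text{P1: } \mathrm{Formed}(B, C)\\
&\text{P2: } \mathrm{Indep}(C, B)\\
&\text{P3: } \big(\mathrm{Formed}(B, C) \land \mathrm{Indep}(C, B)\big) \rightarrow \lnot\,\mathrm{Justified}(B)\\
&\text{C: } \therefore\ \lnot\,\mathrm{Justified}(B)
\end{align*}
```

Laying the argument out this way makes clear where it can be challenged: the genetic-fallacy and third-factor objections target P3, while the self-defeat worry asks whether the argument's own premises were themselves formed by a process of the kind C.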
Several tests help evaluate whether a debunking argument succeeds:

- The Genetic Fallacy Test: Does the fact that a belief has a certain causal history automatically make it unjustified? Are we committing the genetic fallacy (wrongly assuming that the origin of a belief determines its truth or falsity)?
- The Self-Defeat Problem: If all beliefs formed through causal process C are unjustified, does that include the debunking argument itself? For example, if rational thinking evolved for survival, not truth, does that undermine all reasoning, including the debunking argument? If all beliefs shaped by social, political, and economic factors are unjustified, then skepticism about climate change is itself also socially constructed. Climate skepticism is often linked to fossil fuel industry lobbying, libertarian ideology, and anti-regulation sentiments—should we also debunk these beliefs?
- The Third-Factor Strategy: Is there a third factor that connects the belief to truth, even if its origins were non-truth-tracking? For example, evolutionary pressures may have led to true beliefs because true beliefs tend to be useful for survival (e.g., believing in predators keeps us alive). Can the belief still be rationally reconstructed, even if its origins were non-truth-tracking? Even if social forces influence scientific research, this does not mean that climate science is false. The reliability of climate science is supported by independent evidence, predictive success, and cross-disciplinary verification (e.g., physics, chemistry, oceanography). Unlike political or ideological beliefs, scientific theories make testable predictions—and climate models have successfully predicted global temperature trends, Arctic ice melt, and extreme weather patterns.
- The Selective Application Problem: If this argument undermines belief B, does it also undermine other beliefs formed in the same way? For example, if moral beliefs are debunked by evolution, are logical and mathematical beliefs also debunked (since evolution shaped human cognition)?
- The Empirical Constraint Test: Does the debunking argument apply to empirical claims where independent verification exists? For example, climate science relies on observable data—does a debunking argument really challenge its justification, or just its social history?
- The Regress Problem: If we must question the causal origins of all beliefs, does that lead to an infinite regress where no belief can ever be justified?
- The False Equivalence Problem: While all scientific fields face some institutional biases, the degree of scrutiny, peer review, and empirical testing in climate science is high. Comparing climate science to past failed scientific theories (like phrenology or eugenics) ignores the fundamental methodological differences.
Now apply the skeptic's debunking argument to climate science itself:

- Etiological Premise (Origins of Climate Science Beliefs): The acceptance of climate change is largely influenced by political, economic, and social factors, rather than objective, neutral scientific inquiry. Scientists are influenced by government funding, institutional pressures, and ideological commitments. The media and public discourse are shaped by political agendas and corporate interests.
- Explanatory Premise (No Truth-Tracking Connection): If belief in climate change is shaped primarily by external social, political, or economic pressures, rather than by truth-tracking scientific methods, then climate science is unreliable. If scientists receive funding to support climate change research, their conclusions may be driven by incentives rather than truth. If mainstream media promotes climate change alarmism, public belief might be a social construct rather than a rational conclusion.
- Generalization Premise (All or Most Scientific Research is Affected): Many scientific fields (not just climate science) are subject to institutional biases, funding incentives, and social pressures. The replication crisis in psychology and medicine suggests that even peer-reviewed research can be unreliable. Historically, "scientific consensus" has been wrong before (e.g., phrenology, early nutrition science, eugenics). Since scientific institutions operate under similar social and economic constraints, we have no reason to trust climate science more than any other potentially flawed discipline.
- Skeptical Conclusion (Climate Science is Unjustified): If climate science is shaped by political, economic, and social pressures rather than truth-tracking mechanisms, then we have no justification for believing in climate change. Therefore, we should reject, doubt, or remain agnostic about climate change claims.
- The Funding Bias Argument: Skeptics claim that climate scientists have financial incentives to exaggerate the effects of climate change. "Follow the money" reasoning suggests that grants, funding, and government policies drive research outcomes.
- The Political Ideology Argument: Climate science is often linked to left-wing politics, environmentalism, and global governance agendas. Some argue that belief in climate change is socially constructed to justify government regulations and green policies.
- The Historical Precedent Argument: Pointing to past scientific mistakes and paradigm shifts, skeptics claim that climate science could be another example of a flawed consensus.
- The Media Influence Argument: The argument claims that mainstream media selectively reports climate data to create alarmism. Skeptics argue that if reporting were truly balanced, public perception of climate change would be different.
- The Self-Defeat Problem: Debunking climate science due to institutional bias also undermines climate skepticism. If climate science is unreliable because scientists receive funding and operate within a political and social system, then the same reasoning applies to climate skeptics who receive support from fossil fuel industries, libertarian think tanks, and corporate lobbying groups. If skepticism about climate change is influenced by ideological commitments (e.g., anti-regulation policies, corporate interests), then why should we trust climate denialism any more than climate science? If social and financial influences alone disqualify beliefs, then skepticism itself must be discarded as socially constructed rather than rational. A climate skeptic claims that "scientists say what their funders want them to say", yet fails to acknowledge that many climate-denying organizations are funded by fossil fuel industries. If funding invalidates beliefs, then skepticism is equally invalid.
- The Third-Factor Strategy: Empirical Constraints on Science. Science is not purely socially constructed—it is constrained by reality. Even if social, political, or financial pressures exist, climate science is not based on subjective opinion but on empirical data, predictive models, and independent verification. Unlike moral beliefs (which are shaped by evolutionary adaptation and cultural learning), climate science produces testable predictions that have been confirmed by reality (e.g., temperature trends, glacier melt, sea level rise). If climate science were just political propaganda, its models would be inaccurate—but instead, climate predictions have consistently tracked real-world changes.
- The False Equivalence Problem: Comparing climate science to past scientific failures is misleading. Climate skeptics sometimes say, "Scientific consensus has been wrong before—why trust it now?" However, this ignores key differences between past failed theories (e.g., phrenology, geocentrism) and climate science. Empirical rigor: Climate science relies on multiple independent lines of evidence, including satellite data, ice core samples, and oceanic measurements. Predictive success: Climate models have made accurate forecasts about warming trends, ice sheet collapse, and extreme weather patterns. Cross-disciplinary validation: Physics, chemistry, and earth sciences all support climate change—this isn’t just one narrow field making unverified claims.
As I have not found convincing evidence, I would have to accept what the global warming alarmists claim based on blind faith. I guess if I really wanted to belong and be accepted by progressive democrats, I could believe what I don't think is true, and eventually, one day I might find I actually believe it. But in this case, it does not seem responsible or wise to do so. Because:
1. We have decades of failed predictions by global warming alarmists; they have a track record just as good as Christians attempting to predict the end of the world.
2. Climate is a complex system, and yet scientists can only receive funding to look for "evidence" that supports that humanity is the primary cause; it is agenda-driven.
3. There is evidence that scientists who are not in lockstep with the orthodoxy can have their careers destroyed. There is remarkable pressure not to go where the evidence leads, but instead to fit in so as not to be spotted by the modern inquisition.
4. There is hard evidence of falsification of data, propaganda, and crooked research procedures. Considering this is a "global problem" in need of global "solutions", and politicians know that the only way to "save the world" would be to obtain near absolute and global power, it is of little surprise why politicians have entangled themselves with the entire thing, which also makes me skeptical.
5. The whole thing has all the trappings of fundamentalist religion and has ironically become anti-science, with the claims that the "debate is over" and the "science is settled"; hell, talk about having no concept of Karl Popper's falsification principle!
Then the loose way that everything, I mean literally everything bad, no matter how contradictory the claims, is attributed to global warming makes me rightfully suspicious, when I know causation is remarkably hard to determine. If I can be convinced that humanity's microscopic contribution to the greenhouse effect is the primary driver of global warming, I will change my mind. I can honestly say that, outside of mere assertions (a hell of a lot of them), I've yet to be given any convincing evidence. I just don't appreciate the smug assertion throughout this book that anyone who doesn't agree with Andy on this matter is a "climate denier", is knowingly and immorally believing what they know is not true, and has absolutely no evidence or reasons at all for their position.
Additional Resources:
- Mental Immunity: Infectious Ideas, Mind-Parasites, and a Better Way to Think With Andy Norman
- #474 Andrew Norman - Mental Immunity; Reason, Critical Thinking, Beliefs, and Morality
- Nature & Nurture #61: Dr. Andy Norman - Mental Immunity & Infectious Ideas
- Pirate Television: Naomi Oreskes - Merchants of Doubt
- Merchants of Doubt: Naomi Oreskes on Climate Skepticism - LASCO-ELI conference (Part 1)
- Global Debunking Arguments