Response to CosmicSkeptic & Jordan Peterson Discussion

I originally intended to write a brief comment on the video, but within the first twenty minutes or so I realized that simply wouldn’t be possible given the sheer amount of bullshit I was hearing. Within the span of an hour or so, we get what seems to be a cycle between pseudo-profound BS and presuppositionalist apologetics. When asked for clarification, Peterson is extremely reluctant to give clear answers. When challenged on certain topics, he takes the “you can’t even think without a notion of the divine” line, and towards the end he prompts his interlocutor with endless “why” questioning in an attempt to lead him to some bedrock assumption he can slap the label “God” on. There is so much content to respond to that I’m only writing about some of the misunderstandings I find most salient. Here is the link to the video: Navigating Belief, Skepticism, and the Afterlife | Alex O'Connor @CosmicSkeptic | EP 451.

At around 44:00…

It’s quite disingenuous and ignorant to claim that the act of asking for biblical historicity presumes an “atheistic materialistic worldview” derived from Enlightenment presumptions, and that it is therefore not a “Christian metaphysics”. Enlightenment thinkers were not unilaterally atheist; many were Christian. Is he trying to say that George Berkeley, Leibniz, Pascal, Jonathan Edwards, William Paley, Samuel Clarke, Joseph Butler, Reverend Bayes, etc. were all lacking the true “Christian metaphysic”? This seems equivalent to saying “no true Christian would ask that question”. Furthermore, atheism is not unique to the Enlightenment; we can see it as far back as the ancient Indian agnostics. It’s also quite ignorant to assume atheism entails materialism, when there are plenty of practicing philosophical idealists who maintain a strong atheism. You can be an atheist and subscribe to the Platonic theory of forms while rejecting a deity. You can also be an atheist who studies the intimate relation between parts and wholes in fields such as complexity science, without subscribing to crude reductionism or the assumptions of theism, while maintaining a rigorous empiricism and naturalistic outlook. Enlightenment philosophy is much more nuanced than he makes it seem.

To imply that prior to the 17th century no one was concerned with the historicity of biblical claims, preferring a metaphorical interpretation, also seems a bit shallow. Historicity did matter; it was just never explicitly questioned, and not understood in the modern sense of the practice. Were people asking for historical evidence in the manner of someone like Herodotus? Probably not. Did they think that the gospels were actual attestations and testimonials of events that actually took place? Probably so. This actually matters to most Christians I know, whether Catholic, Protestant, or Orthodox. Factuality matters; whether something is perceived as factual impacts the interpretation of the story under discussion. I’ve recently come across Jonathan Bourgel, who has an interesting paper re-examining the relationship between ancient Samaritans and Israelites, one which calls into question canonical interpretations of the parable of the Good Samaritan. There are plenty of examples similar to this.

To imply that the historical question is a non-starter is simply ignorant of textual criticism, comparative religion, the anthropology of religion, ancient Near East studies, and the study of cultural exchange in ancient Mesopotamia through the Hellenistic period. Peterson tends to think that all perception takes place within a socio-cultural context, so why not acknowledge the embedded nature of biblical narratives in the broader context of cultural processes and religious syncretism from which they emerged? It sure seems interesting to exclude the Homeric epics, the Platonic dialogues, and other material from the Hellenic Near East from his analysis of the significance of the biblical stories. He seems to imply “who knows, no one can know”, taking a globally skeptical stance towards history, while selectively isolating the instances where history corroborates a biblical narrative. Referring back to the “Christian metaphysics”, we can see interactions between early Christian thinkers and Neoplatonists. It’s quite astonishing to observe the historical timeline and find the fusion between a pagan philosophy and early Christian theology. There’s that quote attributed to Alfred North Whitehead, that all of Western philosophy is but a series of footnotes to Plato; it seems a bit interesting to exclude this from his analysis of “persistent themes across Western culture”. Never mind all of the instances where concepts like these emerged in Asian cultures.

Later in the talk, he conflates “fiction” with abstract objects, claiming both are “more real” than concrete objects or entities. The problem is that “fiction” is a categorically different concept, making the analogy absurd. When mathematicians speak of numbers “existing”, they mean ontological existence. Fictions can exist only in a conceptual or pragmatic sense. This makes fiction a social construction, in the same sense that money is socially constructed. Fictions can be inter-subjectively real, and the content within a fiction can be phenomenologically relatable between people in a community, but it’s absurd to claim they exist in an ontological sense. Furthermore, within some versions of philosophical nominalism (the idea that only concrete objects exist, which is not to be brushed off dogmatically), it’s reasonable to suspect that none of these abstracta exist in the ontological sense. Embedded cognition seems to corroborate this philosophical view: sets and set relations (or any mathematical primitive) could be thought of as abstractions our brains impose on persistent observations, emerging through dynamic feedback mechanisms (brain, body, and environment interactions).

I think he is quite confused about the concepts of “existence” and “reality” and is butchering the meaning of “real”. To be “more real” makes about as much sense as saying something “exists more”. These are not gradable or rankable concepts; in the ontological sense of the terms, they are binary. If we mean “real in the inter-subjective sense”, then we can clearly see that “more real” just means “more important” or “more pertinent” with respect to some socio-linguistic community. A key feature of a fiction is that readers must suspend disbelief, allowing a story to resonate or make an impact, culminating in a theme or purpose that may or may not be relevant. This is quite distinct from mathematical abstracta. I think this analogy is incredibly weak, trivializing everything he’s been arguing. This can also be extended to his usage of the word “actual”. In one breath he wants to claim that something being “actual” misses the point or is irrelevant, but then he employs the word somewhat arbitrarily when he wants to establish his credentials, or to refer to something as properly grounded.

His position does not seem to be critically distinct from what I’ve read in Lyotard or Baudrillard. He laments the Enlightenment ideal of observer-neutrality and emphasizes the importance of context dependency. He claims that “the real and fiction are inextricably linked”. Well, this is equivalent to saying “fiction and non-fiction are not critically distinct”, which, in so many words, is the position postmodernists take with respect to grand narratives. I will grant that there isn’t a distinct boundary demarcating the two in all circumstances, especially when it comes to religious fiction, where there is incredible motivation to obfuscate the two, but Peterson seems to acknowledge the human role in making that distinction, and that it can be a function of human goals, which are ultimately pragmatic in his view. This seems identical to the Nietzschean views in “On the Uses and Abuses of History for Life”. History is a useful tool that can be constructed to serve various psychological requirements. Postmodernists emphasize power relations because certain narratives aren’t necessarily beneficial to everyone. When Peterson rejects fundamental binaries like “belief or disbelief”, he is literally engaging in deconstruction. When Peterson acknowledges the role of the early church in the canonization of the gospels, he is recognizing what Baudrillard recognized: that signs and symbols can be manipulated. When Peterson talks about the obscure relationship between the real and the fictional, he is referring to the concepts Baudrillard uses to describe the collapse between reality and representation. When Peterson talks about how the woke have hijacked institutions, he implicitly recognizes the role of the ideological state apparatuses discussed by Althusser. It seems the reason Peterson does not go all the way is to appease the Christians who pay his bills, as exemplified by the anti-abortion advertisement midway through the video.

This brings us to the topic of computational epistemology. It is quite a bold claim to assert that generative models are “interpreting”, or that compression is similar to interpretation. LLMs perform universal function approximation over statistical regularities in a corpus for the purpose of token prediction. When humans “interpret”, we are not doing statistical inference. The two concepts are distinct. A hermeneutic approach might utilize both inference and interpretation, but the two are distinct cognitive processes. Interpretation focuses on understanding and explaining the meaning or significance of something, incorporating subjective insights or contextual understanding. Language models simply don’t do this, and to claim they do would be an implicit argument for strong AI. The problem recognized by AI alignment researchers isn’t whether the LLM is “giving us the correct interpretation”, because the model isn’t performing that cognitive function. The alignment problem is focused on whether the generative model is producing false or nonsensical information. It’s a question of factuality, not interpretation. But since Peterson lacks a distinction between fact and fiction, combined with his misunderstanding of language models, he concludes that the “woke” are imposing an interpretive framework on the LLM. No evidence needed, by the way, for this assertion. Sure, language models can be fine-tuned for specific purposes that will orient responses towards a subset of possible sequences. But the concern is one of correspondence to reality, measured by consistency with facts. You can train your model on a subset of information that will yield “woke” responses, or you can train it on a subset containing religious paraphernalia to give you “Judeo-Christian” responses. You can construct any rationalization you want about why your dataset is better, but claiming the former is ideologically driven while the latter is not, is bullshit.
Knowingly filtering out information that runs contrary to what you believe is precisely what it means to have an ideological filter. Pointing out this fact shouldn’t invite the pejorative label “postmodernist”; that’s simply an ad hominem. Plus, there is little recognition that language models are trained on corpora that already contain the classic texts. He somewhat recognizes this by saying they’re biased towards the present. Well, can’t I simply argue that his proposal biases the model towards the past? It would require rational deliberation, not labeling your adversaries as “ideological”, to determine the optimal weighting. All he is saying is “my ideology is correct because it’s cosmically grounded”, which is another way of saying “I’m right and you’re wrong, end of story”.
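To make the point about token prediction concrete, here is a minimal sketch (a toy bigram model over a made-up corpus standing in for real training data, far simpler than an actual LLM) of what “prediction from statistical regularities” amounts to. The model only counts which tokens follow which; nothing in it resembles interpretation:

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for a training set.
corpus = "the cat sat on the mat and the cat ate the fish".split()

# Count bigram transitions: how often each token follows each other token.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent successor of `token` in the corpus --
    pure frequency counting over observed regularities, with no access
    to meaning, context of use, or interpretive judgment."""
    counts = transitions[token]
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(predict_next("the"))  # "cat" ("cat" follows "the" twice; "mat" and "fish" once each)
```

Swapping the corpus for a different subset of text changes the predictions, which is the sense in which curating the training data steers the model’s outputs; the mechanism itself stays the same either way.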
