TUESDAY, 24 AUGUST 2021

For many, humans are defined by our intelligence and capacity for abstract thought. Aristotle explained humans as similar to animals in that we have life and perception, but distinct in that we also have reason. Linnaeus boasted of our wisdom as a species when he named us Homo sapiens. However, as this article explores, we are not as wise as we would like to think.
At the most basic level, human observations are limited by the shortcomings of our minds, known as cognitive biases. Consider this story: a man new to poker has great confidence in his own abilities. His friend, a veteran player, is less confident. After winning several hands, the first man fancies his chances, betting more and winning big. He then uses his winnings to buy himself a new car, only to notice that this model is everywhere. In order, this exhibits: the Dunning-Kruger effect (overconfidence in their own abilities among the unqualified, and underestimation of their abilities among the more qualified), the hot-hand fallacy (previous successes making subsequent success seem more likely) and the Baader-Meinhof phenomenon (noticing something once makes you more likely to notice it again).
This story illustrates just a few ways in which our observations misalign in basic ways with reality. This is especially true in our modern, information-saturated age. Take, for example, confirmation bias, the tendency to pick out and retain information that conforms to existing beliefs: not a very useful trait when it comes to changing one’s mind. Or, similarly, our inability to comprehend probability: a 1996 study by Robert Hamm at the University of Oklahoma found doctors susceptible to base rate neglect, whereby the disregard of background information (how common a disease is) in favour of individuating information (a positive test result) leads to overdiagnosis of rare diseases.
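Base rate neglect can be made concrete with a quick application of Bayes’ theorem. The numbers below are illustrative assumptions, not figures from Hamm’s study: a disease affecting 1 in 1,000 people, and a test that catches 99% of true cases but falsely flags 5% of healthy people.

```python
# A minimal sketch of base rate neglect using Bayes' theorem.
# All numbers are illustrative assumptions, not data from the 1996 study.

def posterior(prevalence, sensitivity, false_positive_rate):
    """Probability of having the disease given a positive test result."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# A rare disease (1 in 1,000) and a seemingly accurate test:
p = posterior(prevalence=0.001, sensitivity=0.99, false_positive_rate=0.05)
print(f"P(disease | positive test) = {p:.1%}")  # roughly 2%, not 99%
```

Despite the test’s apparent accuracy, a positive result here implies only about a 2% chance of disease, because false positives from the vast healthy majority swamp the few true cases — the background information doctors in the study tended to disregard.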
Conceptual Frameworks, or, the Cognitive Constraints of Cultural Context
The philosopher Alfred North Whitehead coined the phrase ‘the fallacy of misplaced concreteness’ to refer to how we can sometimes mistake our interpretations of the world for reality itself. The separation of the various sciences represents an artificial vivisection of the natural world; admittedly, this compartmentalisation is useful in breaking down an infinitely complex world into manageable, researchable chunks. However, such subdivisions can prove problematic. As naturalist John Muir once said, ‘When we try to pick out anything by itself, we find it hitched to everything else in the Universe’. The intellectual root of this idea can be traced back to the revolutionary 19th-century scientist Alexander von Humboldt. Humboldt is most famous for his Naturgemälde, a diagram indicating how temperature, humidity, atmospheric pressure, flora and fauna varied with altitude in the Andes. He was the first to define global climate and vegetation zones, and to describe and identify the cause of anthropogenic climate change. While in the 21st century these observations are discussed by separate communities at separate conferences, Humboldt’s success as a scientist was partly due to his ability to see nature as one great tapestry. This gives one pause to wonder: what interactions and connections are we missing in a world where children are taught to separate the natural world into the physical, chemical and biological, with no attempt to stitch the tapestry back together?
The fallacy of misplaced concreteness is rooted in culture. Western culture has, for the past two centuries, been heavily influenced by capitalistic ideals such as competition and individualism. In his book Entangled Life, mycologist Merlin Sheldrake uses lichen to demonstrate how the lens of competition can inhibit our understanding of the natural world. Lichen are now understood to comprise multiple symbiotic partners, but, as Sheldrake notes, when botanist Simon Schwendener first proposed in 1867 that lichen comprised fungi and either algae or cyanobacteria he was ‘laughed out of the house’. However, even the iconoclastic Schwendener was subject to cultural influence; his Dual Hypothesis described a ‘master’ and a ‘slave’ species operating together. Mutual beneficence was clearly not an option in 19th-century biology.
Sheldrake then demonstrates the limitations of conceptualising nature in strictly individualistic terms with the example of fungi. Fungi cannot move to seek nutrition. Rather, they grow through their environment in order to reach food sources which they can then digest. They do this by extending their mycelium, a network of fibres, the growing ends of which are known as hyphal tips. This prompts the question: at what level do we apply the idea of the individual — to the mycelium, or to the hyphal tips? The mycelium as a whole seems like the obvious candidate, but there is a flaw in this thinking. When a hyphal tip reaches a source of nutrition, the rest of the network recedes, and the fungus’s energies are focused on exploiting this food source. The issue is that, in Sheldrake’s words, fungi have ‘no operational centres... no seats of government’; coordination happens ‘everywhere at once and nowhere in particular’.
Our notions of individuality suggest a central locus of control, yet this is not always the case. Since there is a whole kingdom of life subverting the idea of the individual it is worth wondering where else we could gain greater understanding of nature by disabusing ourselves of our preconceptions.
Even understanding our own biology is limited by the lens of the individual. Recent research suggests that human microbiomes aid digestion, influence the fine-tuning of our immune systems and impact our cognitive function and behaviour. At what point do we draw the line between us and our microbiomes? Do our gut bacteria affect our cognition, or are they simply part of our cognition, and thus extensions of ourselves? And what does this imply for the bacteria themselves? Perhaps these questions would be illuminated by a Humboldtian view of nature — that is, one that recognises the individual as ‘more of an assumption than a fact’, as Sheldrake puts it.
Europeans, and the Western cultures that succeeded them as a result of colonialism, have a long history of seeing the natural world as something to dominate. In this world view, humans are alienated from nature, rather than being just one (albeit powerful) piece in a huge and complex puzzle. The natural world is a commodity; it is there to be exploited and used as humans see fit. Conservation of endangered species is often framed in terms of the species’ use to humanity, rather than their inherent value. This mindset can also lead to feelings of disconnection, and an urge to return to nature. Yet this is not the default human view; rather, it is just one view. In an interview with Dissent magazine, Nick Estes, academic, activist, and member of the Lower Brule Sioux Tribe of South Dakota, describes his culture’s relationship with the natural world through the example of buffalo; while acknowledging that the relationship was ‘very material’, he also argues that it ‘wasn’t just one-way’, and that ‘reverence’ for the buffalo as part of cultural traditions ensured that over-exploitation, and the loss of balance in that relationship, were avoided. In another example, anthropologist Juan Barletti’s 2013 TED Talk explains how the Ashaninka tribe of the Peruvian Amazon see Earth ‘not as a commodity’ but as a ‘social agent’, to be interacted with in a way that our Western worldview does not allow space for.
In his speech This Is Water, David Foster Wallace warns against adopting the ‘default setting’ — a worldview shaped by biological and cultural hard-wiring. This default setting operates at many levels; this article has explored just a few of the cultural and cognitive defaults which affect our understanding. It is the duty of scientists, as those who seek to expand human knowledge, to know our minds and most importantly to appreciate their limitations, so that we may inch closer to the Truth.
Tom Wilkins is a fourth year medical student at Christ’s College. Artwork by Rosanna Rann.