The mind is the brain at work and culture is the creation of manifold individual minds composing a civilization where the legacy is handed on from one generation to the next.
Charles Gillespie 1998
A quick review: in Posts I and II, I present a new look at implicit bias. I argue for a link between bias and biology, suggesting that bias is a biological mechanism much like vision or upright walking. We looked at how memory develops and how memory is tied to biology and survival. Among other things, the mechanism of bias houses fear and joy and, I argue, my favorite topic – musicality.
In Post III, I turn to more obvious cultural issues in today’s climate and I look at why we are the way we are. How could we not fear others—those different ones—if our survival instincts are primed from centuries of training? Why would we not cling to those ‘like us’ or those who agree with us?
First, we look at some of our quirky traits, then at our touch of hubris, and finally at how we should pause to think carefully about complex topics and about change – if change in ourselves is possible.
And now, the imperfections
Moving to a somewhat humorous note, I now look at the human quirks, idiosyncrasies or foibles that have fallen like “rocks in the road,” blocking clarity and change. These are the imperfections that influence our biological mechanism and, in turn, have given bias its singularly negative reputation.
Shakespeare saw our foibles as comedic and tragic – our idiosyncrasies as forgivable – but he used them as vehicles on the path to change. We can learn (knowledge); we can use logic (reason); but we must face our quirks (denial and distortion). We must look at our past and work, though reluctantly, through the wisdom of change. For this reason, I recommend several books as a next step: Why We’re Polarized (Ezra Klein 2020), How to Be an Antiracist (Ibram X. Kendi 2019), White Fragility (Robin DiAngelo 2018) and Blindspot: Hidden Biases of Good People (Banaji and Greenwald 2016). Each opens a door to a new world view and will help you see how you contribute.
Biology, Culture and Choice
There is an intersection where biology, culture and choice meet and navigate safely. Or they collide. The cultural and social advances of the 21st century, enabled by technological progress and research, are where we can disentangle the knot that prevents change. But choice – human choice – may be the vehicle that causes a crash. Driven by motives outside our awareness, such as motivated reasoning or denial, we may choose not to change. Knowledge, our capacity to reason and our willingness to change have come under recent scrutiny by cognitive scientists Steven Sloman and Philip Fernbach, along with French scientists Hugo Mercier and Dan Sperber.
In their book, The Knowledge Illusion: Why We Never Think Alone, Sloman and Fernbach write that we are social creatures; knowledge is gained from one another and from other groups, and our survival depends upon those affiliations. This human trait, hypersociability, is both a design and a devil. We must cling to and validate group belief to survive. Recall that the bias mechanism is built upon memories of fear or desire – events that encode memory networks and learning. Confirmation bias is a direct and unambiguous adaptation to those hypersociability memories – again, fear and desire. Elizabeth Kolbert writes, “Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective” (2017).
Though we learn through sociability and interaction with others, another human idiosyncrasy enters to trip us – we are inclined to believe we know it all. We rank or rate ourselves by what we assume we know about a topic and not by our actual comprehension of the topic.
A study at Yale University (Rozenblit and Keil 2002) demonstrated our failure to comprehend complex issues even as we perceive (and rate) ourselves as knowledgeable. Graduate students were asked to rate their knowledge of everyday devices: zippers, cylinder locks and toilets, among others. Their self-assessments were high. In a second pass of the study, students were asked to give a detailed explanation of how each device worked and then to rate their understanding again. Self-assessments dropped significantly – simple devices are not quite so simple – showing that we often think we know more than we do, even about the physics of toilets.
“We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history” (Sloman and Fernbach, 2017). So while we think we are knowledgeable, once we must explain in more detail, most of us – but not all, according to the Yale study – grasp that we know less about even simple things than we thought.
Foible # 1: We know less than we think we do.
Delusion and deception
The sociability, or collaborative, factor highlighted by Mercier and Sperber, along with Sloman and Fernbach, works well until we get to the political or personal – the subjective and emotional domain. The tasks in the Yale study – explaining the mechanism of a toilet – involved topics unlikely to arouse emotion, unless we are plumbers. Once emotion enters our social interactions, others have their own wellbeing at heart, just as I do.
A personal example is the joy I experience from performing or listening to classical music. A few years ago, after the sudden death of their mother, I was responsible for my three granddaughters for several months. The two teenagers were heavily into rap music, so trips to and from school and sports activities involved loud and raucous sounds blaring from the car radio. During those months, their wellbeing was far more important than my feelings – I learned to listen. But did those months change my emotions or my bias? Rap is not music to a classically trained ear.
Consequently, our need for sociability is part cooperation, part competition and often manipulation. But once a group develops strong feelings about an issue – the element of the “right way” – confirmation bias inserts itself. Differing opinions are held at bay – our ability to reason rendered inert.
Reason is an evolved trait, not one developed to solve abstractions or difficult problems. Rather, Mercier and Sperber see reason as genetically linked to survival, much like three-color vision or walking upright on two feet. They write in The Enigma of Reason (2017) that other opinions, including unreliable ones, reinforce the group-think phenomenon, and they maintain that this is related to hypersociability – our dependency upon each other. Put simply, reason serves a social purpose: to be informed by others who teach us how to navigate our way. We become emotionally invested in an opinion or a privilege because it serves our deepest interest – survival. Consciously or unconsciously, among our ancestors reason was the vehicle to winning, to being on top in social standing, or, as Elizabeth Kolbert (2017) writes, “to make sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave.” Logic or reasoning alone gained little advantage; belonging to a clan or group meant surviving. Abstract thinking was not the primary consideration of our ancestors.
Foible # 2: We experience a surge of pleasure (dopamine) when someone in our clan confirms our opinion.
The hard nut to crack, or why we rarely overcome bias
By the age of four or five, children learn they can mislead others while also realizing others can mislead them. Sperber calls this “background awareness,” where a child learns that parents or teachers or other family members may not be fully truthful. “Vigilance is not the same thing as being distrustful, it’s just not automatically taking everything at face value” (2013).
In an odd way, this vigilance and how we interpret it does suggest self-deception. We learn in our formative years to watch adults carefully while simultaneously learning to go along with what they tell us. Belief in Santa Claus is a relevant instance. By age five, I knew Santa was my mom, but I kept my knowledge of Santa’s identity a secret until I was nine. Chance and Norton (2015) see this as self-deception, as adaptive behavior. They identify three discrete rewards: deceiving others, gaining social rewards and reaping psychological benefits. I kept the big secret in order to keep my parents happy – I didn’t really mean to deceive, but I knew they would be angry with me if I didn’t keep up the pretense. I sensed I would reap the benefits of lovely presents if I kept the secret intact. Some would call this collusion. Was it? I joined in the masquerade when my younger sister and brother were four or five because I instinctively knew I should.
Self-deception, according to evolutionary psychologists evolved to assist in “other-deception” – where confidence at self-deception reaps social benefits (Chance and Norton 2015). Or as Mercier and Sperber observed, self-deception allows us to hold onto preferred beliefs – the “myside” manifestation – regardless of the truth (2017).
Foible # 3: We want to win; when we don’t, we experience heightened fear brought on by a surge of norepinephrine.
I now turn to the work of Dr. Sara Gorman, a health specialist, and her father, psychiatrist Dr. Jack Gorman (2014). The Gormans ask, “How do we deal with highly complex science that most people are unprepared to tackle?”
When our personal preference is to confirm and stay with our own point of view, how do we motivate the brain to accept change? When information is complex and difficult to comprehend – universal health care, climate change or racism – how can we expect any but dedicated scientists to listen and learn? A safe and simple tactic for most of us is to rely upon the availability heuristic – the facts already at hand in our memory.
Facts we can easily recall are the mental shortcuts we use to defend our position, particularly when we evaluate complex topics. Often, when we oversimplify complex issues, we feel a powerful surge of emotion – emotion which strongly reinforces “myside” bias. Voices grow louder; statements grow more emphatic. Readily available facts build a convincing standoff (Mercier and Sperber 2017); “myside” bias is authoritative.
Another example the Gormans highlight is the power of “baseless opinion.” I may form an opinion about healthcare based upon unsubstantiated or even incorrect information. In this day of “alternative facts,” we have witnessed such scenarios repeatedly on television. I repeat my opinion to a friend, based upon facts I did not understand. She is persuaded and agrees because I am her friend. She passes on my opinion, with some enhancements of her own, to several of her friends and to her husband, who is influenced by her and is not reluctant to argue with others should they disagree. Now we have a community of like-minded believers – some forcibly vocal, their reach compounded by social media – all convinced of their solution to a complex health plan, fed from my opinion built on baseless information. “This is how a community of knowledge can become dangerous,” write Sloman and Fernbach (2017). The power of the group cements, and given what we know about our sociability needs and our limited ability to reason, the possibilities for learning new facts or changing our opinions look dim.
Foible # 4: Often we don’t know what we are talking about.
Is there a light at the end of the tunnel?
Before we move forward, recall our foibles: 1. We know less than we think we do. 2. We experience pleasure when someone agrees with our opinion. 3. We want to win. And finally, 4. Often we don’t know what we are talking about.
When asked to explain our knowledge in detail, such as the mechanisms of zippers or toilets, we discover our knowledge is quite limited – perhaps even baseless. Fernbach and Rogers (2013) observed that almost fifty percent of participants who are given factual, accurate information in a second trial will then ratchet down the intensity of their objections to a complex topic such as tax reform or police reform. Participants who absorb the facts, and the error of their knowledge, will moderate their views. Yet confidence in our knowledge cuts both ways. In an unusual examination of the role of knowledge in reasoning about false beliefs, Birch and Bloom (University of British Columbia and Yale University, 2007) concluded: “Our findings demonstrate that an adult’s own knowledge can compromise his or her ability to reason about other people’s false beliefs and to make predictions about their actions.” This is a rock in the road we may not see when the light is dim.
Gene-Culture Coevolution
But there is hope, or as Sloman and Fernbach describe it, “. . . a little candle for a dark world.” To look for solutions, I return to Wilson’s position that culture and biology meet and move forward as gene-culture coevolution: “The mind grows by absorbing parts of the culture already in existence.” In our university laboratories, difficulty arises when researchers attempt to mirror scientific methodology within the study of culture. Here, across disciplines, “hard” and “soft” scientific analyses conflict. Researchers in the social sciences lack basic analytical principles to apply, like those that underlie chemistry, biology and the cognitive sciences. Culture has no “basic atomic units equivalent to genes, cells and organisms that can form” an easily analyzed base as science does (Wilson 1996).
But cognitive and neuroscience research is expanding in a still-young 21st century. New vocabularies and new scientific methodologies will emerge from technology beyond CNiFERs. We will see how the brain functions, how the mind absorbs our cultural lives and how, over a relatively short period of 500 years, cultural attitudes become embedded in our minds and are passed on to future generations.
Recall: “The mind is the brain at work . . . .” New methodologies and measurements will emerge across the sciences. While disadvantaged by our own shortcomings, our idiosyncrasies and blindspots, we can begin with a new view of bias. If bias is a biological mechanism present in each individual at birth, we could begin by discontinuing blame and stigma, while remaining ever vigilant – a cultural bias is easily formed and almost effortlessly shielded from attempts to unravel it. As Wilson points out, once a genetic predisposition is in the culture, the mind unconsciously absorbs and engages with it (1996). The bias mechanism stores memories filed away as fear or pleasure, and the mind links those memories to future experiences. (Fig. 1)
Moving forward: we should educate our children and teens to look at bias as a mechanism – instinctively, we all have the capacity “to bias.” The mechanism is linked to survival and memory, yet we have the competency, if we so desire, to overcome its power, its seductiveness and its negativity.
We have the capacity to learn, but as we know from the studies discussed earlier, only fifty percent of us will choose to examine and move away from harmful cultural attitudes and the largely unconscious, undesirable biases that accompany them. The other fifty percent will hold tightly to confirmation bias or false beliefs. Scientists and educators should bring the language of bias awareness into public discourse. While “political correctness” – actually the basis of civility among us – has recently become a public and political target of disdain, it is also political to manipulate, exploit and harm one another through the most rudimentary forms of communication – language and body language. Political correctness, though uncomfortable to many, is the most basic way to change language. Changed language then changes film and art, which change media, the press and ultimately our laws. But without awareness and acceptance, personal resistance merely goes underground (Banaji and Greenwald 2016) until it explodes on the streets, as with the Black Lives Matter movement in 2020 (and the violent pushback from the radical right).
So, while the light to change may be dim, additional research on the factors affecting brain, mind and culture, along with studies in adaptive psychology, should be funded by government and private foundations.
And realistically, as we are a society driven by monetary gain, economic pressures upon industries and continued societal demands from various groups will be the facilitators of cultural change.
Encumbered by our culture, we will continue to carry forward negatively biased beliefs, behaviors and language – often unconsciously and without rationale – because we have a biological mechanism, a biological predisposition I call bias, with which to transport them into the future. The mind is the brain at work, and it carries the enigma and the mystery of unconscious bias.