A few years back, I was running errands when I ran into a friend. At some point the conversation turned to religion. This went about as well as you might imagine. The tone – initially chummy – shifted to debate mode. Each of us staked out a claim, stated our positions, then stood our ground.
After fifteen minutes of repeating our positions, I realized that the resolution of our debate hinged on the definition of one word. If the word meant what I thought it meant, then my understanding of the passage in question was correct. If it meant what he thought it meant, then his understanding was correct.
Fortuitously, we were standing outside a library. I suggested we go in, look at a dictionary (smartphones hadn’t been invented yet, and even now, I only upgraded last year), and find out who was right. He wouldn’t go in.
I tried to persuade him. Cajole him. Induce him. Coax him. Urge him. Challenge him. Bring him around to the idea. Sell him on the notion that he could easily win the debate. Prevail upon his thirst for knowledge.
Thirty minutes later, we were still standing outside the library.
We agreed to disagree, parting ways to finish whatever errands we had been running when we ran into each other.
What happened there, aside from the fact that I shouldn’t have brought up the subject in the first place? How did I end up spending thirty minutes trying to convince my friend to look in a dictionary rather than five minutes actually looking in one?
My first mistake was assuming my friend and I shared the same goals during our debate. My primary reason for suggesting we check the dictionary was that I had a strong accuracy motivation. (Admittedly, I was also being competitive.) When we operate under accuracy motivation, we are looking to form beliefs that reflect the true state of the world.1 To do this, we search for and evaluate information in an even-handed manner. I wanted to know if I was right. Had I learned I was wrong, I would have updated my belief and gone about my business.
I don’t know what my friend’s motivations were. It is possible that he was concerned about losing the argument. It is also possible that he had a strong directional motivation and didn’t want to face information that would challenge his personal religious beliefs. Additionally, there is a range of reasons he could have had that I never considered.
My larger mistake is what I call a hardware problem. Hardware problems are issues that are better analyzed by looking at our brains than at the filters through which we view the world. Here, I mistakenly based my ideas about my friend’s motivations on my own. In the larger world, similar missteps cause us to miss the forest for the trees – that is, to be so caught up in one or more small details that we can’t see the larger picture.
This type of incongruity between what we think and reality is common. After all, our brains work differently than we imagine. Among a variety of other findings, people who study how our brains work have learned that we do much of our thinking automatically, without conscious input2; that our memories can be influenced by cues from other people in the room3; that children as young as five can pick which candidate is more likely to win an election; and that we can be tricked into seeing motion where none exists. In my example, a few additional questions could have helped me discover my friend’s motivation. This additional information might have given me more useful options, including not trying to “win.”
Hardware problems aren’t a new idea. Indeed, the term is just my way of contrasting fields such as psychology and neuroscience, which study how we really act, with my business and legal training, both of which start from the idea that all of us are rational actors. Time and again, I have applied logic to a situation only to bump into another person’s psychology, which, as shown above, didn’t work too well. My usual reaction was, “if only they could see what I’m trying to show them….” Today, I try to keep in mind how people are likely to respond to what I’m saying, rather than spending an hour trying to convince someone to check out the other side of one tree in a forest.
Digging this deeply isn’t always worthwhile. Sure, our brains make mistakes and can be fooled, but few situations require perfection. However, spending more time understanding the hardware can help us improve the software that runs our interpersonal relationships.
1. Flynn, D. J., Nyhan, B., & Reifler, J. (2016). The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs About Politics
2. Kahneman, D. (2011). Thinking, Fast and Slow
3. Bradfield, A. L., Wells, G. L., & Olson, E. A. (2002). The Damaging Effect of Confirming Feedback on the Relation Between Eyewitness Certainty and Identification Accuracy