Evolving Mindset Series: When Both Sides Are Wrong


“…a scientist who relies wholly upon scientific observations and tests; while, indeed, empirical observations and records furnish the raw or crude material of scientific knowledge, yet the empirical method affords no way of discerning between right and wrong conclusions. Hence it is responsible for a multitude of false beliefs. The technical designation for one of the commonest fallacies is post hoc, ergo propter hoc; the belief that because one thing comes after another, it comes because of the other.”

— John Dewey, How We Think – Chapter 11, 1910

[Image: Mizzou Magazine article featuring Sheila Grant]

In 2005, less than a year after I started as Chief of General Surgery at The University of Missouri, I was scanning the contents of that quarter’s Mizzou Magazine. An article (above) caught my eye because it highlighted the work of a biological engineer named Sheila Grant. The focus of the article was her expertise in researching materials designed for implantable medical devices. Because we use synthetic materials for most adult hernia repairs, I thought she might want to work with our hernia team.

After a brief introduction in our first meeting, Sheila asked me what material we use for hernia mesh. When I told her that the most common synthetic polymer used in hernia mesh was polypropylene, her jaw dropped, and she exclaimed, "You can't put that in human beings; it's going to degrade in the body." I replied, "But we put it in about a million people a year just in the United States alone."

I was shocked to hear her response, and my initial thought was that she had to be "wrong," and I was obviously "right" because I’m a hernia expert. But we had just met, and she was nice. I didn't want to start an argument, so I kept that opinion to myself. We had a nice dialogue, which led to a collaboration and a friendship that continues to this day.

The interaction between hernia mesh and the human body was the first complex problem I investigated after I learned that systems and data science principles are needed to understand the real world's uncontrollable biological variability. Sheila and her husband Dave, a mechanical engineer, decided to work with our hernia team on this complex issue.

We started a materials characterization lab at the University of Missouri - Columbia and began to analyze mesh after removing it from patients (usually for symptoms of chronic pain, infection, and/or a recurrent hernia). The material removed from the body often looked different, and was sometimes harder and more brittle, compared with the soft and flexible material that comes out of the package during a patient’s surgery. But here's the interesting thing: this didn't occur in all mesh explants, and it occurred to different degrees in different patients. It could even differ between different parts of the same mesh.

For a while, this was difficult for me to understand. I was taught that mesh was inert (meaning it would not change after being implanted into the body). In my lectures, I said that the most important part of doing a hernia repair with mesh was the technique. Placing the mesh with good technique would lead to perfect results.

But in our analysis of explanted mesh, Sheila and I learned that we were both wrong, at least for some patients. For most patients who have a hernia repaired with mesh, the mesh performs fine. But in some patients, the mesh may have been a contributing factor that leads to an unintentional complication. When I first learned this, it made my brain hurt. It was hard for me to understand that the same mesh, placed the same way, in two different patients could undergo vastly different changes and result in different outcomes.

Before this experience, I believed that the best, most powerful way to learn in healthcare was to test a hypothesis using a prospective, randomized, controlled trial (PRCT). The PRCT is the gold standard of reductionist science and the scientific method. The goal is to prove or disprove a hypothesis. But when we use a reductionist tool like this, we create a false dichotomy of only two options: right or wrong. Our lower brain craves this façade of certainty, but the certainty comes at a harmful cost: divisiveness. Our lower brain falsely entices us to pick a side (for or against mesh, for example).

One science historian, Henry Cowles, has recently published a book and an article that explain why the reductionist scientific method is ill-equipped to deal with a complex disease such as COVID-19. In the article, he notes that the concept of a static, linear scientific method (the PRCT) was taken from a single paragraph in the book How We Think, by John Dewey, published in 1910. The "method" was simplified so it could be put into textbooks to teach children how to do science. This was never the intent of Dewey, who described many other ways of learning and discovery. Scientists know that this representation of the scientific method is not how science works in the real world.

But in healthcare, we still attempt to use an invalid scientific method to deal with complex disease. It doesn't work, and it leads to the divisiveness I described above. We feel we have to choose between two options: antibiotics or surgery to treat appendicitis; “I'm pro-mesh or anti-mesh for hernia repair”; “I'm pro-mask or anti-mask in public places during a pandemic.” This kind of false dichotomy in the face of a complex problem is tragic because it creates an "us vs. them" healthcare system and society. Both sides end up wrong, because the problem is complex and has no single right answer, yet each side believes it is right.

We need to overcome our lower-brain thinking and evolve to a higher-brain mindset where we are comfortable with uncertainty. We can then work together, using systems and data science principles, to deal with all of our complex problems, including this pandemic. A recent article in MedPage Today by Vinay Prasad does a wonderful job demonstrating higher-brain thinking applied to the COVID-19 pandemic. He notes two realities about any complex problem: no one person has all of the answers, and there is a range of reasonable viewpoints rather than one right answer.

If our reductionist scientific method and lower-brain thinking lead to divisiveness, can we create environments that foster higher-brain thinking to unify us with common shared values and goals? The Robbers Cave experiment, conducted over three weeks in the summer of 1954, can help us understand the potential.

A group of camp counselors (actually, the experimenters) divided 22 boys, aged 11 and 12, into two groups. Each group was given a name, and the two groups were initially isolated from one another. During this period, the counselors allowed group bonding, and neither group knew that the other existed. Then, they brought the two groups together for a variety of competitive activities. What a mess! The boys screamed derogatory comments at each other, and many fights broke out. Clearly, it was not difficult to divide a group of boys and foster hatred between the two sides.

Then the real experiment began. The counselors created a series of obstacles and challenges that required the two groups to work together to achieve common goals, such as fixing the broken-down bus so they could go to town to watch a movie. As the groups worked together, the hatred and fear melted away, replaced by new connections and friendships. By the end of the experiment, the two groups had become one, unified by the achievement of solving problems together. This "realistic conflict theory" has been demonstrated repeatedly, and its solution of having small teams collaborate toward a common goal has no better application than in healthcare.

Although we can create conditions and environments that lead to bad behaviors, we can also create conditions and environments that foster cooperation, love, empathy, and engagement with a passion for what we are privileged to do in healthcare – care for another person.

Lower-brain, reductionist principles lead to fragmentation and divisiveness, causing us to think we must prove we're right. We have the potential to learn higher-brain thinking and apply systems science principles that align our goals and values: working in small, diverse teams within decentralized learning health systems, and collaborating in an ensemble model of learning that applies science to our real, complex world.
