Operate with Zen

16. Mindful Decision Making - (Part 3) Avoiding Poor Decisions

October 31, 2021 | Phil Pierorazio | Season 2, Episode 4

Medicine and surgery require complex problem solving, and tough decisions are made every single day. In Part 3 of this three-part solo mini-series, the frameworks of heuristics, biases, emotion, ethics, and other challenges to effective decision making are reviewed in hopes of avoiding poor decisions. (Music Credit: Sunshine, Simon Jomphe Lepine.)

Phillip Pierorazio:

My name is Phil Pierorazio, and I'm a urologic oncologist, a surgeon. Like many of you, I absolutely love what I do, and I would not choose another profession. But I've struggled with professional identity, practice efficiency, and wellness over the years. Operate with Zen is a podcast designed to explore a mindful approach to surgery and to being a surgeon. By discussing these struggles and mindful solutions, I hope together we can create a community of strong and healthy surgeons. Enjoy.

Welcome to Operate with Zen. This season, we've been focused on effective decision making in our solo episodes, and in this last solo episode, we're going to talk about concepts that could lead to poor decision making. The hope is that by understanding these concepts and theories, we can be better decision makers, and we can empower our patients to be better decision makers.

We're going to start with a little bit of philosophy. In classic yogic philosophy, or yoga teaching, there is a concept of mental obstacles, or kleshas, otherwise known as torments. These are mind blocks. Specifically, they are attachment, aversion, an urge not to die, mistaking the self for an inner voice, and ignorance of the true self. But they really are the roots of desire, greed, delusion, anger, pride, and envy, which we are more accustomed to discussing in contemporary Western philosophy. These afflictions can determine fate, especially in the cycle of karma. As you fall victim to these mind blocks, you can move backwards in that karmic cycle; as you alleviate your suffering and eliminate these kleshas, you can move forward in your cycle of karma. And therefore, through the meditative practices of yoga, self-discipline, study, and devotion, you can train the brain to form new neural pathways: to contemplate the opposite, to get rid of negative thoughts, or to explore middle grounds. This can help you cultivate friendship toward the happy, compassion toward the suffering, joy toward the virtuous, and indifference toward the non-virtuous, which is not only yogic teaching but Buddhist teaching. And so through concentration, meditation, and absorption, the Sanskrit terms are dharana, dhyana, and samadhi, you can overcome mind blocks and achieve enlightenment.

If you're not into yogic philosophy, that's okay. There are some Western philosophies that translate very well into this. In behavioral economics, which we've talked a lot about in the last couple of episodes, there are the concepts of heuristics and biases. These were put forward by Amos Tversky and Daniel Kahneman; you can read about them in Nudge, and you can read about them in Thinking, Fast and Slow. But here are the basics, three of the simplest heuristics and biases. The first is anchoring. That means our judgments are based on an established reference point. So, what is the normal heart rate for an infant? You may not remember your pediatric rotation, so you may use the adult heart rate as the anchor from which to make your next judgment. The second is availability. This says that judgments are based on how readily an example comes to mind. In medicine, this is our N-equals-one phenomenon: not only that one patient we remember, but the most recent patient we saw, and that experience is going to affect our next judgment. And the last is representativeness. This is also known as the similarity heuristic, where we use prior experiences or stereotypes of data to make decisions.
This is very common in cancer clusters. When we think about exposures and environmentally related cancers, for instance: is this truly because there was an environmental exposure that led to a surge in cancers in this area? Or is it just the fact that, with relatively rare diseases and increasingly small populations, there is statistically a chance that you might find a really high proportion of cancers in one geographic population, just as you might find a really low proportion in another? Now, there are a number of other biases I'm not going to spend a lot of time on; I think those are three really good examples of what we see in medicine. But some of them we've talked about in other episodes. We've talked about unrealistic optimism and overestimation, right, we overestimate our time by about 50% on average. Gains and losses we're going to spend some time on later in this episode. Status quo bias is really common; that's when you stick with the norm rather than thinking about something new or thinking about something rationally. How many of us, when we go to a medical conference, like our grand rounds, sit in the same seat or on the same side of the room? Right, we all joke that you have assigned seats in some of these conferences. That's a status quo bias: you're going with the norm rather than thinking about where you want to go. Now, that's a benign example. But where it can be problematic is in the operating room. We see this specifically in older surgeons, but you see it in young surgeons too, where you do something because that's the way you always did it, or that's the way you were taught to do it, rather than being mindful, thoughtful, or present about what the correct approach actually is. In the last episode, we also talked about framing: our decisions can be influenced by how we report statistics, for instance, as either positive or negative. And there are huge influences based on social structures and peer pressure. A great illustration of this was a study by Solomon Asch in 1955, what he called a conformity of errors. If you give people a very simple test, a multiple-choice question, but you tell them what their peers answered, even though that is the wrong answer, then no matter what country or group of people you ask, 20 to 40% of people will provide the wrong answer that their peers endorsed, rather than thinking about the correct answer and providing that solution. Another social phenomenon is something called pluralistic ignorance, which is when you don't know what your colleagues are saying: you may guess and try to conform your answer to what you think they want you to say, rather than actually thinking about the correct answer. And we can use these social influences to our own advantage, recognizing that people will follow the crowd. So when we want to help colleagues or patients make a decision, informing them that other people tend to choose a given option can promote compliance with the recommendation we're offering. On the flip side, when we're struggling with a decision or a choice, we should recognize that we're going to feel pressured to think: what would my colleagues do? Or what do I think they would want me to do? And while those questions may help guide your choice, you need to be cognizant that these influences exist and that they may not be leading us to the correct answer.
When we think about these heuristics and biases, they're incredibly common in medicine. When we give a patient, or give a disease, a diagnosis, not only does that diagnosis hold all of the biological information about the disease, but with it comes a whole bunch of biases and heuristics that we need to understand. Commonly, as a cancer surgeon: cancer carries huge biases and huge heuristics, and trying to understand those as you're explaining to a patient, or trying to make decisions for a patient or with a patient, can really help you get to the right answer for that patient and that family. So we need to be aware of that, quote unquote, "baggage." What are those biases? What are those heuristics? And how do we best treat patients knowing that they exist? It's really important to understand that these heuristics and biases exist to help us answer questions we may not know the answer to. Most patients have not gone to medical school; they don't have an understanding of the physiology and pathophysiology of what's affecting them. So how do they try to get to an answer? Well, one of the simple things they do is substitute easier questions for harder ones. By substituting easier questions, you're making it easier on your brain; you're engaging system one, which makes sense of quick, easy thoughts, rather than having to try to break things down. Or patients can insert emotional responses rather than an actual answer: how do your choices make you feel, rather than how do you think about those choices? And this doesn't just happen to patients. We can think back on our own training, our medical school and residency, before we were expert in what we do. We fell victim to the same phenomena as our patients do, where we may substitute easier questions, or an answer that feels better, before we have the knowledge and understanding of what the correct choice actually is. And often the confidence that a physician or a patient projects actually reflects the coherence between system one and system two, whether your quick thinking lines up with your longer, more rational thinking, and not necessarily the factual information being presented. Interestingly, in a classic paper by Croskerry and Graber in the American Journal of Medicine in 2008, clinicians who were certain of a diagnosis were actually wrong 40% of the time. They do a beautiful job in that paper of breaking down system one and system two, and how you can go wrong with overconfidence. I'm not going to go into too many details, but it's a beautiful paper if you really want to look at it. One of the most influential factors in decision making is the concept of losses versus gains. We talked about this briefly in some earlier episodes this season, but we're going to spend a little more time focusing on losses and gains here, because humans give priority to bad news. There's probably some evolutionary advantage to having this negativity dominance: it helps us unconsciously detect predators or bad things and keeps us alive. Traditionally, when these concepts of negativity dominance are discussed or put forward in academic circles, they're in financial conversations, and it's thought of in terms of losses and gains of wealth. But I would easily make the argument that we could substitute health for wealth when we think about these things. And so losses and gains in health can certainly influence our decision making.
For example, certain people may view cancer as a sure loss no matter how it's explained to them, and they may be averse to a more conservative management strategy, willing to take on unreasonable risks of surgery or an aggressive approach because they don't want to take the risk of any loss with cancer. Along those lines, not achieving a goal is considered a loss, where exceeding that goal is a gain. There's a great paper from Baumeister in 2001 that studied successful marriages and concluded that avoiding the negative mattered much more than seeking the positive for a long-term successful marriage, which kind of flips a lot of things on their head. But if you look at the data, avoiding the negative can be much more important than seeking the positive, and that may be true in medicine as well. Really trying to avoid adverse outcomes may help propel a surgical career and may help grow a practice, and I think it's a really important notion to be aware of. One of the best studies that elucidates this human behavior of loss aversion is a study by Pope and Schweitzer, who analyzed 2.5 million putts on the PGA Tour. When they looked at these golfers, no matter how easy or hard the distance, professional golfers were much more successful when putting for par than when putting for birdie. Birdie in this circumstance would be a gain; missing par would have been a loss. And so you can see the motivation, and the strength of this analysis, in this very simple yet elegant example. So how does this apply to medicine? One of the clinical implications may be that we can change behaviors by making an outcome we want to avoid be viewed as a loss. We commonly do this, right? Complications, length of stay, and blood loss in surgery are all seen with a negative connotation, and so we want to avoid those losses. But when we're trying to achieve other outcomes, we may need to flip the conceptualization. And I know I typically talk about kidney cancer, but it's easy for me. I think one of the ways we're starting to do this in the kidney cancer world is to make surgery for benign tumors a very significant loss: it is no longer going to be acceptable to do a nephrectomy for a benign tumor for the vast majority of patients. By changing that framework, changing that thought process, we may be able to get over some of this loss aversion that exists in medicine and outside of medicine. There's also a concept in the behavioral economics literature called the fourfold pattern. This is where you look at probabilities versus gains and losses; it's basically a two-by-two table, high probability and low probability mapped against gains and losses. Once again, you could substitute a number of medical diagnoses or choices into this financially created system. And when there's a high probability of things getting worse, but a small hope of avoiding a large loss, people will often reject a favorable outcome and seek the more risky option for that small hope of avoiding the large loss. Unfortunately, for surgeons and patients alike, this kind of risk taking can turn manageable failures into complete disasters. Sometimes this means not getting the maximal outcome but achieving a good outcome, and how many of us have heard that the enemy of good is perfect? Sometimes we have to take that good outcome, and we have to tell our patients that that outcome is okay, when seeking the exceptional outcome could lead to a much more detrimental state of affairs.
In earlier episodes, we talked about how the emotional state can influence decision making. We're not really going to talk about that here, but it's important to recognize that one of the emotions that affects decision making, beyond just happiness and sadness, is fear. When patients refuse logical recommendations, really think about their state of mind. Are they fearful? Investigate that with them. Likewise, if you see a colleague who is having a tough time reaching a logical decision for patient management or surgical management, whatever that may be, ask them what they're afraid of. Sometimes that can offer a whole new insight into their decision making and help them get to the right decision. One of the other strategies to be aware of that can help you make decisions, or help your patients, is the concept of broad framing. This happens all the time in medicine: a single comprehensive decision has to be made, but it incorporates multiple choices or multiple options within it. In a single-framing construct, you make one decision at a time, A to B, B to C, C to D; in a broad-framing construct, you make all of those choices at the same time, optimizing the outcome across all of them. To use the same analogy, you make the decision of how to get from A to D, considering all of the small choices along the way. This is really tough for patients. It's really tough for physicians too, but we get better at it because we're used to looking at complex situations in their entirety. And so one of the recommendations here to reduce the pain of the occasional loss: look at the big picture, look at larger numbers. This is why we do large studies, lots of patients, lots of outcomes. Rather than allowing ourselves to be skewed by small numbers and retrospective samples that could be biased, we have to look at the big picture. And sometimes that can be calming, not only to ourselves, but to our patients and colleagues. Specifically, I'm going to spend a couple of minutes on how we can help patients make choices. System one is always going to dominate system two when people are struggling, because it's easier on their brain and it's easier on them. Recognize when the decision is tough that you're going to have to bring people out of system one and help them think about, talk about, and find the answer that makes the most sense. We can definitely make data easier to interpret, and there are lots of studies, projects, and interfaces out there to make the patient interpretation of data easier: use pie charts, use big colors, use YouTube videos; all of these things can help patients understand. We can structure our choices to facilitate learning. So, for instance, in prostate cancer treatments, we often talk about compensatory or comparative outcomes. If you have surgery, you may have a slightly higher risk of incontinence or erectile dysfunction. If you undergo radiation, you may have certain risks of secondary cancers or gross hematuria. And you help people understand that there's no ideal treatment, but there's also no terrible treatment; they have to understand the trade-offs between the two. Patrick Walsh, who was one of our gracious guests this season, when he talks to patients about prostate cancer management, will occasionally say to them: instead of thinking about what the optimal outcome is with each treatment, think about what the worst-case scenario would be for you.
Is it the worst-case scenario of surgery, the worst-case scenario of radiation, or the worst-case scenario of surveillance that scares you the most? Sometimes changing that perspective can help patients make a better choice for themselves. We can help patients by eliminating choices, right? So let's rank the options one through five, let's cross off the bottom two right away, and then we can focus on the top three that are more pleasing. We can also use a concept called collaborative filtering, and patients actually ask us for this all the time: if I were your family member, what would you tell me? The other way to put that is, people like you will often choose X, or choose Y, and this is why. Remember, we have the same biases going the other way: we can stereotype, we can lump people together, and that may not be appropriate in many circumstances. But sometimes, when somebody is struggling with a decision, it really can help. Some of the other points: recognize that the market is not pure. There are incentives at all levels. There are incentives for a patient to undergo one treatment versus another. There are incentives for you, as the physician or surgeon, to choose one treatment or another. The hospital system and the payers are all incentivized in different ways. We would hope, in an ideal world, that those incentives don't matter or don't influence decisions, but they do. And sometimes calling them out, helping people understand why incentives exist and how they could influence their decision making, will help them get to the choice that is best for them. And lastly, we should reassure our patients: anticipated regret is terrible, and it is often worse than actual regret. People are resilient. Humans are great compensators, to use the terminology in Daniel Kahneman's book, and people can be incredibly resilient if they understand and have expectations of what could happen to them. So those are some practical tips to structure choices and the questions that may be asked. But there are also other ways to influence our behavior, or to be aware of how our behaviors are influenced. Freud said the superego is happy when we comply with societal ethics, and since Freud there have been a number of neuropsychological studies showing that compliance with ethical thoughts or ethical behaviors actually stimulates our reward centers: the nucleus accumbens lights up, the caudate nucleus lights up, and we are happy when we comply. So when we're struggling with a choice, or when our colleagues or patients are struggling with a choice, we can sometimes frame it in a moral framework or an ethical framework. And it is very clear that people will often choose an outcome which enhances moral behavior. So when struggling with a decision, think about an ethical code. In medicine, this is easy. This is the Hippocratic Oath: do no harm, right? Help people, care for others. This defines who we are. And sometimes the simplest thing to do when you're struggling is to break it down to an ethical outcome. I still remember one of the best chief residents I had when I was an intern said to me: medicine is simple, you always do the right thing; and if you're not sure what the right thing is, call me and I'll help you figure it out. I think this is a really great example of how moral or ethical principles can help guide us to the correct decisions. On the flip side, it's really important to remember that we make poor decisions when we are aroused.
In high-stress or high-arousal situations, we may not behave as we normally would; we may lose sight of our generally well-accessed moral or ethical code because of all the other influences of those emotions. So calm down, step away, and try to remove the emotion from that decision. How many of you have had a heated exchange by email? You learn pretty quickly that the next time you have a heated exchange, you back away from it for 24 hours before you hit reply, because sending a heated exchange back often just escalates things. Sometimes it's best to let time settle and let the emotion pass to help you make a more sound decision. And that leads to the last point we're going to talk about today, which is experience versus memory. This is something called the peak-end rule, which is described in great detail in Daniel Kahneman's Thinking, Fast and Slow, and there are a number of medical studies that investigate it. The take-home is basically that reported pain reflects the average of the worst moment a patient experienced and the last pain they experienced, the end pain. A lot of these studies were actually done in colonoscopies, but there are a number of corollaries in other fields and other procedures. And there's something called duration neglect, where there's essentially no effect of time on the overall experience. So even if a patient was in mild pain for a long period of time, if they had really severe pain over a short period, and that was the last thing they experienced before their procedure ended, that's what they're going to remember. So what does this tell us about painful procedures or experiences we may have with our patients? Duration doesn't matter; it's that last experience, and it's the average of the worst pain they felt. So we should always be seeking to lower the peak intensity of any painful experience, whether it's physical, emotional, or psychological pain, and always try to finish with our patients in a comfortable place. It's much better to take some more time and get to a good end, a more pleasing experience (we can't always make it a happy experience, but we can make it a more pleasing one), than to finish abruptly on a bad note. That's what they will remember, and that will certainly affect their decision making and their experience. So, in summary: understanding mind blocks, heuristics, or biases, whatever you want to call them, can help you avoid pitfalls and make good decisions for you and your patients. We can structure our choices, and our patients' choices, to help them with decision making. Remember to keep it simple: our brains favor system one because it's easier. Sometimes we have to engage system two, and facilitate engagement of system two, when we're having a tough time. Remember that losses weigh more heavily than gains. We often need to look at the big picture when making complex decisions, and actual regret may not be as bad as anticipated regret. Minimize emotions when making important decisions. Promote ethics when you're struggling. And remember, the last experience will often define perception. I hope you found this series on effective decision making useful. Thank you for listening, and I look forward to talking to you again.
