Anshu Bahanda: Welcome to another episode of Wellness Curated. As you know, the aim of this podcast is to help you lead a healthier, happier, more hopeful life. And we do so by bringing you ideas, trends, techniques, and tools from all over the world. This season, we’re focusing on social well-being. And our topic today is ‘Beyond Perception: Sharpen your Intuition.’ And it’s a subject that can really transform your life. And today we’re thrilled to have with us research psychologist and pioneer of the naturalistic decision making methodology, Gary A. Klein. With his extensive expertise in decision making, Gary has developed a number of models that can enhance our decision making abilities. And his work has really influenced thought leaders and professionals across various fields. He’s also written an exciting new book that we’re going to talk about later in our chat with him. Gary, thank you for being here with us today.
Gary Klein: Thank you very much for inviting me. I’m looking forward to this conversation. Thank you.
AB: So I want to ask you, let’s start by discussing the difference between instinct and intuition. How do you define the two? And what motivated you to focus on research psychology and explore the role of intuition in decision making?
GK: Okay, so these two terms can be defined in different ways. There’s no official definition and so there’s an opportunity for confusion. Here’s the way I would distinguish them. Instinct is wired into us, and so we see it in humans and in animals. These are response tendencies that, through evolution, allow species to thrive. And so this is something that we’re born with; these kinds of instincts. Intuition is not something we’re born with. Intuition is something we develop through experience and it drives the way we build our expertise. And it has a major role in the way we make decisions.
[Getting to] What got me interested in intuition and why did I decide to study it? I didn’t. I never decided to study intuition. I was just interested in decision making. And I wanted to know, how can people make rapid decisions like pilots or physicians, nurses, firefighters? And according to the literature at the time, and this would have been in the 1980s, the only way to make a good decision was to identify all of the plausible options that you need to consider and then evaluate each option on a common set of criteria. It does this, it does that, and then see which one comes out ahead. But the researchers knew that it would take at least a half hour to set up that matrix and carry it out. Firefighters don’t have a half hour, pilots don’t have a half hour to set up a matrix like that. So what can they do? And I thought that they probably didn’t look at all the plausible options; maybe they just looked at two. I had some ideas about that. So I studied firefighters, and I picked firefighters because they do have to make life and death decisions under extreme time pressure. And they were a very cooperative community. It was wonderful working with them. And we found that they never did any comparison of options. They just responded with what they thought was the appropriate course of action. I remember the first interview I did with a firefighter. I said, I’m here to study how you make decisions. And he assumed that decisions meant comparing different courses of action. He said, I’ve been a firefighter for 16 years, I’ve been a captain for 12, and in all that time, I’ve never made a single decision. And we had just gotten funded by the army to study his decision making. And he’s telling me he’s never made decisions. I said, how do you know what to do? And he said, ‘It’s just procedures. You just follow the procedures.’ I figured our project was about to fail.
[I said], ‘Can I see the procedure?’ And he said, ‘Oh, it’s not written down, you just know.’ And that was my introduction into intuition because that’s what he was referring to.
AB: So that’s very interesting, what you’ve just told me. Tell me, what about relying on intuition as opposed to analytical thinking? How do you strike a balance between the two so that the outcome is the best possible? And also, like the example you gave about the firefighter, do you think analytical thinking sometimes gets so developed through practice that it almost becomes intuition?
GK: I think some people believe that analysis becomes so second nature that it becomes like intuition. But I don’t think that’s correct. I think intuition is really the collection of patterns and experiences that we’ve developed, learning from feedback and learning what’s worked and what hasn’t. Some people ask me, how can you trust your intuition? The answer is: you can never trust your intuition. It can mislead you. But you can’t trust analysis either, because analysis can mislead you. So in a research project that I did with my friend Danny Kahneman, who worked with Amos Tversky to develop the heuristics and biases approach, we drew on the literature on two systems: System One and System Two. System One is sort of your intuition. It’s the pattern matching. It’s your rapid reactions. And System Two is the analytical part. And we said you need both of these aspects in making decisions. If you’re under extreme time pressure, you may not have time to do the analysis, but otherwise you want to be in touch with what your intuition is telling you to do. And then you want to use your analytical ability to see if it makes sense in the situation that you find yourself in.
AB: So also in your new book, Snapshots of the Mind, you examine topics like confirmation bias, anomaly detection, and perspective taking. Now, how can we ensure that we’re not acting on deep-set biases, which are going to affect our decisions? And also talk to us a little bit about what the new book explores.
GK: Okay, so let’s start with the second question. What does the new book explore? The new book explores what I like to call the cognitive dimension, which is going beyond the simple procedures that people tell us to follow. In many situations, you can’t routinely follow the procedures because life is complicated. So you need to decide, does this procedure work in this situation? And how do I have to modify it and how do I know that it’s being effective and when should I abandon the procedure? So you have all of these cognitive issues based on your experience in order to handle situations. The cognitive dimension is the way we use our intuition, the way we use System One. It’s the way we are able to make sense of situations. It’s the way we are able to judge the plausibility of situations and build stories to account for things. So there’s just a marvellous capability that I’m calling the cognitive dimension, and many people seem unaware of it. They think that everything can be mastered by just following procedures and getting more procedures that they’ve learned that they can apply. And that’s not enough.
Now, your first question: how can you be sure that you’re not being biased? And the answer is, you can never be sure that you’re not being biased. I don’t like the idea of biases, because it suggests that there’s something wrong with our thinking. And our thinking is wonderful. I mean, our capabilities are wonderful. Look what we have created in the last several millennia. Look what we’ve created in the last five years, or even the last year, in terms of different technologies that have been developed. We wouldn’t have developed those things if we were riddled with biases and hopelessly irrational. What I don’t like about biases is it’s all about what can go wrong and how tendencies can possibly mislead you. And when Kahneman and Tversky originally studied biases, they were really studying heuristics, which are shortcuts. They’re not perfect, but they’re shortcuts for getting things done. I think their work has been tremendously misinterpreted by people who said, look, heuristics are biased, and we have to guard against them. What people aren’t looking at is how powerful these heuristics are and how lost we would be without them.
AB: So would you say that life experiences, which can cause cellular trauma in people, obviously affect their decisions; but would you say that that’s fine, that’s them?
GK: We are the sum of our experiences, yes. And that’s what makes each of us unique. Not just the experiences we’ve had, but the way we learn from those experiences, what we take away from the experiences. When I first started studying firefighters, we went to one department in a large city, and I said, I want to talk to your best firefighters, the ones who are the most skillful, the best decision makers. And the captain said, do you want to talk to the ones who have the most experience or the ones who are the most competent? Because we have some people who have ten years of experience, but it’s really just one year repeated ten times. They haven’t learned as much as they can. And others who are just with us for a year or two, who have really picked it up, who have really learned from what happened to them, from what happened to others, they talked to the others and learned from them. And I said, ‘If it was your house on fire, which firefighter would you want to be in charge of putting it out?’ Those are the ones we want to talk to.
AB: And who did they choose?
GK: They sometimes chose the younger ones, because they knew who had developed good intuition, who had developed what we call mental models: rich mental models of how things work.
AB: And Gary, in your book, you also talk about this concept of naturalistic decision making. Can you explain that to us in detail?
GK: Right. This is extremely important. For many decades, most decision research has been conducted in laboratories under controlled conditions with people who are very inexperienced. They give them tasks they’ve never seen before. And the reason for that is you want to be able to conduct a study that gives you results that will be statistically significant. And I won’t go into the details of that, except that it’s easier to get statistically significant results if there’s not a lot of variability between subjects. So if they gave people tasks they had seen before, some of them would have more experience and some would have less, and the data would be all over the place. The way to cure that problem is to give everybody a task that’s totally unfamiliar, so everybody is starting from ground zero. But that doesn’t describe the world of firefighters or pilots or nurses, who have to rely on their experience. And I remember talking to one group, and I discussed my work with firefighters, who had an average of about 20 years of experience. And one of the researchers said, ‘I give my subjects in my experiment lots of practice.’ And I said, how much practice? And he said, ‘I give them 10 hours of practice.’ And I thought, 10 hours versus 20 years? You can’t even make that comparison. So naturalistic decision making studies people in their home environments, people who have built their experience over 10, 15, and 20 years. You can’t run a study in a laboratory and wait 20 years for people to build experience. You have to get out of the laboratory and work in messy situations where there’s lots of uncertainty and where there may not be a right answer. All of those are messy conditions, and laboratory researchers recoil from them because that’s not a good way to get statistical results. But for us, it’s a perfect way to find out how people actually make decisions.
And I will just say that one of the confirmations for the approach of naturalistic decision making is that we have identified how people actually make decisions, how they make sense of events, how they engage in detecting problems. We’ve identified descriptions for a number of these phenomena that the laboratory researchers had not achieved. There is now an association, the Naturalistic Decision Making Association, that has several hundred members around the world, and it’s a very exciting development.
AB: So what factors would you say influence the accuracy and reliability of our intuitive judgments and potentially what clouds our judgments based on this study?
GK: All right, so what influences it? There’s an article that I wrote with Daniel Kahneman about the conditions for intuitive expertise. And what you want is to work in an area that’s not chaotic, because if everything is totally random, there’s no chance for building expertise or learning how to be more effective. So there’s got to be some stability in the environment in which you’re working. You want to be able to get feedback, usually rapid feedback and reasonably accurate feedback. And if you have the opportunity to get that kind of feedback and to use it to reflect back on what you did and what you could have done better, those are the conditions that allow you to build skill at making decisions. What can get in the way? One of the things that can get in the way is fixation, where your intuition is telling you, here’s what’s going on, here’s what I need to do. But it might be wrong. And people don’t reexamine their impulses. And if you get it wrong, there are going to be signals. If you’ve misunderstood a situation and you think you’re handling it appropriately, you’re going to be encountering anomalies. ‘Hmm, that shouldn’t have happened. That shouldn’t have happened.’ And as you continue, you’ll encounter more of those anomalies. Where people go wrong is they ignore those anomalies. You can’t respond to every anomaly, but we see people fixating, holding on to their original idea, even as the evidence mounts that their original idea is wrong. That’s something that clouds people’s judgement, and I think they can work on that.
I had one physician tell me he had a two-strikes rule. If he encountered an anomaly that he wasn’t expecting (that’s why it was an anomaly), okay, sometimes things happen. But the second time there was an anomaly as he was working on the patient, that was his wake-up call: let me go back to the beginning and rethink it. Maybe I got it wrong, right?
AB: Also, you know, you come across people who feel that they always make bad decisions, so they feel they cannot trust their intuitive insights. What advice can you give people like that, and how can we help?
GK: All right, so if we’re working in a group, we want people like that in our group, who will keep us from going off following the impulsive members who think they’re always right. So you want a balance of different types of people. But what can we tell the individual who is always thinking: I’m always getting it wrong? I suspect they don’t always get it wrong. I think what they’re responding to is just a few episodes where they got it wrong, and those have caused them pain, made them reluctant to act, and left them feeling that they’ve got to rely on analysis, even where analysis doesn’t apply. So what I would suggest for those people is a cognitive review of the decisions that they got wrong, to try to unpack why. Maybe you were just unlucky. Maybe you made a good decision and there were conditions or nuances you didn’t understand that worked against you. So try to unpack what was behind the problem you encountered. Maybe you were fixated; you held onto a belief too long and you ignored signals that were telling you, ‘This isn’t working the way you expected.’ So I would ask them to be more realistic: not just about what went wrong, but why it went wrong, what mistakes they were making when they were sizing up the situation, and why they made those mistakes. I would be asking them to diagnose why that happened. And I think they’ll find that a lot of the reason they got something wrong was just beyond their control, or bad luck. It wasn’t them.
AB: So, Gary, you’re saying that cognitive review and diagnosis should help. So will this help them develop a stronger sense of self trust and confidence in their intuitive abilities? And also, can you give us some sort of an exercise that people can do which will help them?
GK: I don’t want them to develop a strong trust, because our intuition can sometimes mislead us. And we’ve seen people who are very impulsive, people who just assume that they’re always right, and they go barging in one direction or another. We don’t want to turn people into that kind of impulsive mode. We want them to be reasonable about their chances for success. So I would say I want them to develop appropriate trust. I don’t want them to think they’re totally useless, but I don’t want them to be overconfident either. What exercises can people engage in? You can engage in this kind of cognitive reflection, and the importance of reflection has been known for a long time. You can talk to other people who have been in the situation and get their responses. Often people are surprised about how others were sizing up the situation; they hadn’t realised there were different ways of making sense of it. You can also take advantage of times when you’ve seen or heard about somebody making a difficult decision. Ask them about it, because people love to go back and reflect on it, and then you can ask them some of the tough questions they might not have thought about themselves: You said you saw this; what was it that you saw? Tell me, what were you noticing that made you uneasy here? And would you have noticed it five years ago? So you can work with people who have more expertise and learn from them.
AB: Okay, thank you for that. So in my head, I’m thinking of an example. Something you hear a lot from people, in the context of honing their instincts, is, ‘Oh, I always choose the wrong kind of partner,’ in their personal lives. So you’re saying that cognitive reflection, diagnosis, and getting other people’s responses prevents you from making the same mistakes again?
GK: I wouldn’t go that far with the issue you just described. If somebody is continually picking the wrong life partner, that suggests that there are deeper issues, and that’s where you want to consult with an expert, with a therapist who can help you review your decision making in selecting the partners that you did. And often, I think, you’ll find that there were warning signs, and people were aware of those warning signs. They chose to ignore them. And their friends might have also been expressing concerns, but they were gripped by an infatuation and by magical thinking that somehow all these problems are going to disappear because we have such a strong relationship. And in fact, there isn’t a strong relationship. There’s just an infatuation. And once the period of infatuation is over, the grim reality sets in. Now, the fact is, if you’re looking for a life partner, you’re not going to find anybody who is perfect in every dimension. So you have to accept that there are going to be flaws, and ask whether they are flaws that you can live with. And you have to accept that you have flaws too, and the other person is going to have to tolerate your flaws or find ways of not being sidetracked or antagonised by them. So part of the excitement of a relationship is that type of negotiation, rather than the delusional state we get into when we’re infatuated and think that the other person has no flaws at all and is just a perfect individual.
AB: Also, on your website, Gary, you’ve listed several models in the context of honing one’s instincts. Now, I know this is a short conversation, and we can’t go into every model, but could you explain a little bit about them in terms of honing one’s instincts?
GK: So, because we are naturalistic decision makers, we work in these messy environments where people perform tasks and nobody knows if they’ve handled the tasks perfectly. The goals may be vague and open for discussion. There may be other people who have to be brought into the conversation. Pressure may be very high, and usually we don’t have all the experience we would like, but that’s the hand we’re dealt. We have to make a decision based on who we are and how we’re thinking about it. So we study these different domains, and our first breakthrough was understanding how people actually make decisions. Then we wanted to know how people make sense of situations. And a lot of that is based on story building. So now we’re looking at how people build stories, how individuals build stories, how organisations can use the stories. Organisations have this wealth of stories that they can harvest, and they ignore those stories because they think everything can be boiled down to a procedure. But it can’t. We also investigated insight. That was one of my favourite projects, and it’s one of my other books, Seeing What Others Don’t, to try to understand where insights come from, because insights are new ideas. I studied 120 examples of insights to see what was going on in all of them. And I found that there wasn’t just one path to an insight; I put them in different categories and found three different paths. Some of our insights come because we put the pieces together and now we’ve created a new notion. Some of the insights come because the pieces don’t fit together, and that tells us that there’s something wrong with our beliefs, or our beliefs are going to have to change. And sometimes we use anomalies to identify flawed beliefs that we have to give up so that we can escape from fixation.
I’ll give you a story. We heard about this in our work with police officers. So you’ve got these two police officers driving in a police car, and they stop at a traffic light, and there’s a car ahead of them. And the older police officer, the more experienced one who told me this story, said, “So I was just sitting there. There was a car in front of me, and a car in front of that. We’re waiting for the light to turn. But my buddy, who was just a year out of the academy, said, ‘What just happened?’ because he noticed that the car in front of them was a brand new BMW, and the driver took a drag on a cigarette and then flicked the ashes.” Who flicks the ashes in a brand new BMW? That didn’t make any sense. And he said to his partner, who was driving, ‘Light him up and let’s pull him over. There’s something wrong going on.’ So they pulled him over, and sure enough, it was a stolen car.
AB: Whoa. Wow.
GK: And that little detail made him pick up on it. Amazing, right? So there’s no procedure for telling people what to spot. It’s just using your experience to spot an anomaly, taking it seriously, and acting on it. And in this case, it paid off. And that was an insight. So we have a triple-path model of insights: the three paths I just described.
AB: So you said when pieces fit together… The three paths are: when pieces fit together, when they don’t fit together, and anomalies. Those three?
AB: Wonderful. And also in your book The Power of Intuition, you talk about the premortem technique. Would you explain that to us?
GK: We never set out to develop this as a formal technique, and it’s become amazingly popular around the world. All sorts of people are using it, and I’m continually surprised by people telling me how helpful it’s been. So here’s what happened with the premortem. At my original research company, which I founded in 1977, about ten or so years later I noticed that most of our projects were very successful, but not all of them. And when a project wasn’t successful, we would do an after-action review. You could call it a post-mortem: what went wrong, what failed here. And I wondered, why can’t we do that post-mortem at the beginning, to make ourselves smarter as we’re getting started on the project? We have a kickoff meeting when we start a project, so let’s move the post-mortem up front. And so we called it a ‘premortem.’ Now, ‘post-mortem’ is mostly a medical term: an examination performed after a patient dies. The physicians want to find out why the patient died, so they do the post-mortem. It helps the physician, because the physician discovers the true cause of death. It can help the family, because now the family knows why their loved one died. And if it’s important and has consequences, they might publish an article, so it helps the community. A post-mortem can help everybody except the patient, because the patient is dead. So we said, let’s move the post-mortem up front, before we fail, and see what happens. And the way a premortem works is we have the team, we’ve done the kickoff meeting until we only have about 20 or 30 minutes left, and everybody knows what they’re supposed to do, what the plan is. It’s all been worked out, and then it’s time for the premortem. So I tell everybody around the table, ‘All right, now we’re going to do a premortem.
I want you to just relax, relax in your chair, and I want you to imagine that we have done this project, and it’s six months from now, maybe a year, and it’s failed. It’s been a disaster. I’m looking in a crystal ball, and the crystal ball never lies. And we know this project has failed. Now, you’re all smart people. Take two minutes and write down all the reasons why it failed, starting now.’ And then everybody starts writing down all the reasons they can think of why the project failed. When the two minutes are up, I go around the room to see what people have written down and put it on a whiteboard, so we collect it all. And it’s a marvellous technique for getting people to describe possible problems, things that never got brought up in the kickoff meeting, either because people didn’t want to say anything that would discourage others or because they weren’t thinking about it. But this crystal ball exercise liberates them and allows them to think: well, if it failed, here are the reasons. Here’s what could have gone wrong. And we have made major discoveries, and other people have made important discoveries, as they have used the premortem. Plus, the premortem sets the culture for the team, because this is a team that is not afraid to voice criticisms or concerns, rather than trying to pretend everything is going perfectly.
AB: So you’re talking about projects here, in corporate settings. Can you also apply this to individuals? We were talking about relationships earlier; coming back to a relationship, can you work backwards? Would it still work?
GK: Why would it potentially fail? I do know a friend of mine, a therapist, who has used this with patients. I don’t see any reason why you couldn’t. I think it would take courage, because now you’re talking about somebody who is gripped by infatuation. But if the person can really do it, can really try, I think it’ll be a wonderful exercise. So: you’re about to get married in three or four months, and I’m looking in a crystal ball. It’s now four years from now, and I see you in a lawyer’s office, drawing up the divorce papers. And this is very sad and very tragic for you. So you have two minutes. What went wrong? What happened that got you into that lawyer’s office? I would run it that way, make it as concrete as possible, and say the crystal ball never lies. So you know that this has failed. Write down the problems that you can identify.
AB: And Gary, you wrote this wonderful book called Sources of Power, and I think Malcolm Gladwell mentioned it in his book, Blink. In it you talk about pattern recognition skills and the recognition-primed decision model. Will you explain that to us?
GK: Usually, the way to do research is that you have a theory and you do the research to test the theory. We didn’t have any idea what we were going to come up with; this was a total surprise to us. We were studying firefighters and how they could make rapid decisions under time pressure and lots of uncertainty. And we thought they couldn’t look at lots of options; maybe they only looked at two options to compare against each other. That was the story I told you about before, where the firefighter said he’d never made a decision because he never had to compare different options. But he actually was making decisions, because there were many decision points. He just didn’t treat them as decision points, because he could use his experience, his intuition, to know what to do. So what we learned was that, over the course of five years, 10 years, 20 years, we build up a repertoire of patterns. And that is the basis for our experience, for our expertise, for our intuition. When we encounter a situation, we can draw on these various patterns, and they combine in various ways to say: here’s what I think is going on. So that’s the first part of the recognition-primed decision model: that recognition, that identification, that I’ve seen this sort of thing before and I know what to do. That’s part of the model. However, remember I told you intuition can sometimes be wrong? The firefighters said that they relied on this kind of intuitive process. But how do you evaluate a single course of action? We thought the only way you can evaluate a course of action is to compare it to others to see which is better. But if you’re only thinking of one course of action, you have nothing to compare it to. So we looked at the transcripts from our interviews, and we found that what they were doing was imagining it. They were saying: if I carry out this course of action in this situation, what can I expect?
And they were playing it over in their mind to see how it might work, and they were noticing what might go wrong. So if they didn’t see any major problems, then they were ready to carry it out. If they saw some problems, then they would modify the course of action. So it became stronger. And if they couldn’t find a way to make the course of action work, they said, what else is my intuition telling me that I can do? And they kept looking through their experience base till they came up with an idea that would be workable. It wasn’t necessarily the best because in their situations, it’s not clear what the best is. But they were looking for the first workable solution.
So this is a combination of System One, which is the pattern matching intuitive part, and System Two is this what we call mental simulation, this imagining part, which is the analytical part. It’s a blend of the two. So we’re not saying analysis doesn’t matter because this mental simulation matters a great deal, but so does the pattern matching.
AB: Gary, can you also talk us through your concept of thin slicing and its role in sharpening intuition? Malcolm Gladwell used thin slicing in his book ‘Blink’, because he was interested in how people made rapid judgements.
GK: So thin slicing refers to research in which people see just short snippets of situations, like maybe a 10 second snippet of a video of a university teacher, and immediately form judgments about whether they would want to take that class, or whether they find the teacher interesting, things like that. Thin slicing, along with the recognition-primed decision model, is an example of how we make rapid judgments, because we have to make judgments all the time.
AB: So are you saying that anyone can develop and enhance their intuitive capabilities? And how can we help our listeners to learn to make better decisions?
GK: I believe that anyone can learn to become better. I believe that it is within all of our capability, if we’re serious about it. We’re not all capable of becoming experts; it can take a fair amount of time and a lot of effort to build the experience base that will turn us into experts. But we can become better than we are. There are several ways of doing it. One way is to actively reflect after a decision: what happened, and in hindsight, what could I have done better? Because if we can’t be smart in hindsight, we can never be smart. So just go over: what could I have done? But more importantly, what should I have been noticing that I wasn’t noticing? What should I have been picking up? What connections could I have been making that I didn’t think about? So you can spend your time doing that kind of reflection about your own thinking, not just what you did, but how you thought about it. That’s one thing you can do. Another resource for listeners is to work with other people. We have friends, we have colleagues; they’re in the same situations [as] we are. Find out how they’re viewing those situations, what their hunches are and what their concerns are, especially if they have more experience than we do, and try to learn from them that way. Another thing that we have developed is a ShadowBox method for building…
AB: Yes, I’d love you to talk more about that, actually.
GK: All right. So this is something that we're very excited about: the ShadowBox method. It's a way of seeing the world through the eyes of experts without the experts being there, because we don't always have experts around, and they're expensive. So what we do with ShadowBox is we compile challenging situations, usually in text, but it could also be by video. And we have people go through the situation. Then we stop the action and we say: at this point, you have four courses of action; rank order them. Which are you most likely to do? 2nd, 3rd, and 4th? And write down your reasons. But it's not just about courses of action. Then we'll continue and we'll say: stop. At this point, you have three goals that you could be pursuing; rank order them and write down your reasons. And then we'll continue, then we'll stop. At this point, there are five pieces of information you could be pursuing; rank order them: which is the most important, 2nd, 3rd, 4th, 5th… and write down why. And so it goes like that.
We've also had a small group of experts, say just two or three, rarely more than five. They've gone through the same scenario that you have. They've done their ranking; they've written down their reasons. So when you do your ranking, we give you feedback and we say: here's what the experts would have done, here's what goals the experts would have prioritised, here's what information the experts would have gone after. And people want to match the experts. That's the game part of it. But the important part is the reasons the experts put down: here's why this is the most important goal, because these are the things I'm concerned about. And people [say], 'Wow, it never occurred to me to be concerned about that.' So this is how you are seeing the world, or at least this situation, through the eyes of an expert. And the expert is not there. You could just be doing this on your own. And this is a technique for training decision making skills.
AB: And this is both for organisations as well as for individuals?
GK: We haven't developed it for individuals, but we can. And listening to you, I'm thinking that maybe we should develop ShadowBox exercises for individuals, for life decisions about picking careers, picking life partners, child rearing. We've been working with Child Protective Services on how to handle some of the really gut-wrenching decisions they have to make. So we don't have materials like this for individuals, but we certainly could develop them. That's something I'll take away from this interview. I'll take it back to my team and we'll talk about that.
AB: Wonderful! And have the corporates found it useful, the organisations you’ve taken it to?
GK: We have a raft of data showing that in just half a day of training, people match the experts 25-30% more than when they started. And we’ve seen that kind of an effect.
AB: Okay. And what advice would you give to individuals who are seeking to cultivate and sharpen their intuition for personal growth, for professional success?
GK: The advice is what I've covered before: the kind of cognitive reflection about decisions and what you could have done better. How did you size it up? And in hindsight, what should you have been realising? What connections should you have been making? What contradictions could you have been noticing? Just give yourself the habit of being reflective. Too many people are like the firefighter who has one year of experience repeated ten times. So you'd say, wow, this person's been a firefighter for ten years, but it's not somebody you'd want putting out the fire if it was your home that was burning down. You don't want to spend all your time in reflection, but you don't want to be mindless and say, 'Well, what's past is past. I can't fix it.' You can't fix it, but you can fix yourself; you can learn from it.
AB: Very interesting. That’s a really interesting way of looking at it. So, Gary, at the end of every session, we do something called a rapid fire round to summarise the session. So very quickly, one sign that you’re being led by emotion rather than intuition.
GK: One sign is that you're reluctant to do a premortem because you don't want to imagine that something might be wrong with your choice. And a related sign is just a discomfort with other options. So that's a sign that you're being led by emotion rather than intuition. With genuine intuition, you know that it's fallible, and you wouldn't have trouble imagining alternatives.
AB: Some important barriers that commonly impact people’s ability to make good decisions.
GK: An important barrier is fixation, where you lock into your initial assessment and your initial course of action and you ignore contrary evidence or explain it away. That’s very dangerous to do, and it happens too often.
AB: Can anyone learn to sharpen their intuition and make better decisions?
GK: I believe so. But it takes time and it takes commitment. Yes, I believe so.
AB: Thank you so much, Gary. That was such a useful session. And thank you for sharing your powerful insights on intuition, decision making, and lots of [other] things. Thank you.
GK: Thank you for having me on the show. I appreciate it.
AB: You’re welcome. To my listeners, I hope you learned something new and I hope we brought you a little closer to leading a happier, healthier, more hopeful life. If you enjoyed it, please press like and please invite your friends and family to subscribe to our channel. Thank you for being here today and see you next week.