In this episode, I unpack the lessons from “Upstream” by Dan Heath.
You can watch the episode here:
Full Transcript:
There’s one reason why we tend to favor reaction: it’s more tangible. Downstream work is easier to see, easier to measure. There’s a maddening ambiguity about upstream efforts. One day there’s a family that does not get into a car accident because a police officer’s presence made them incrementally more cautious. That family has no idea what didn’t happen, and neither does the officer. How do you prove what did not happen? Your only hope as a police chief is to keep such good evidence of crashes that you can detect success when the numbers start falling.
But even if you feel confident in your efforts to accomplish something, you’ll still never know who you helped. I’m Mickey Mellen. This is Stacking Knowledge. And that was a bit from Dan Heath’s excellent book Upstream. In it, he unpacks why we tend to favor reacting to problems instead of preventing them, and what we can do about it. So let’s dig in. He starts by talking about moving upstream. There are a number of good quotes here. The main one is kind of like that initial quote and sort of explains the book. It’s a shorter one. He says, quote,
When you spend years responding to problems, you can sometimes overlook the fact that you could be preventing them. We get really good at fixing problems when maybe we should be looking at how to stop them. He gives an example from Expedia back in the day (Expedia started out as a Microsoft project). He says, quote, the effort to reduce call volume at Expedia was a successful upstream intervention. Downstream actions react to problems once they’ve occurred; upstream efforts aim to prevent those problems from happening. And he talks about what happened and why it took Expedia so long to reduce call volume, because
reducing call volume wasn’t anyone’s goal. They were trying to get better at answering: shorter calls, quicker answers, being great at handling calls, rather than fixing the product so there were fewer calls in the first place. As he explains it, quote, notice what was missing: it was no group’s job to ensure that customers didn’t need to call for support. In fact, no team really stood to gain if customers stopped calling. It wasn’t what they were measured on. That reminds me of Goodhart’s Law. Goodhart’s Law, stated simply, says: when a measure becomes a target, it ceases to be a good measure.
The first time I heard that, I had to read it like 10 times to understand what it meant. But yeah, when a measure becomes a target, it ceases to be a good measure. If you say, hey, this output you produce, I’m going to measure you strictly on that, people will change their behavior. For us, for example: our designer does a great job of balancing speed and quality, putting out great work as quickly as she’s able. If we said, no, speed is what matters, crank it out, she’d say, OK, cool. Her work quality would suffer, but she’d churn out a bunch of designs. Well, she hit her target, so she’d get the raise and the promotion, even though the work itself got worse.
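As a toy illustration of Goodhart’s Law (my own sketch, not anything from the book; the pace and quality numbers are made-up assumptions): once raw output becomes the only measured target, pushing it up can drag down the quality-weighted value you actually care about.

```python
def mission_value(designs_per_week: int) -> float:
    # Made-up assumption: quality erodes as the designer rushes
    # past a sustainable pace of about 4 designs per week.
    quality = max(0.0, 1.0 - 0.15 * (designs_per_week - 4))
    return designs_per_week * quality

# Measured target: designs shipped. Actual mission: quality-weighted value.
for rate in (4, 6, 8, 10):
    print(f"{rate} designs/week -> mission value {mission_value(rate):.1f}")
# 4.0, 4.2, 3.2, 1.0 -- the measured number keeps climbing while
# the thing you actually care about falls off a cliff.
```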
That’s not what actually happens with us, to be clear, but it shows the point: when a measure becomes a target, it ceases to be a good measure. We’ll see a lot of that throughout this book; if you measure the wrong thing, it’s problematic. As we get into it, he says: in this book, I’m defining upstream efforts as those intended to prevent problems before they happen or, alternatively, to systematically reduce the harm caused by those problems. Then he gets into the three barriers to upstream thinking. Why don’t we think more upstream?
The three barriers, just to lay them out, are problem blindness, a lack of ownership, and tunneling. On the first, he says: to succeed upstream, leaders must detect problems early, target leverage points in complex systems, find reliable ways to measure success, pioneer new ways of working together, and embed their successes into systems to give them permanence. There’s a lot in that: just seeing that problems exist, and knowing that they’re solvable. We had this at Green Mellen once, where the problem was that our team
was bugging Ali, my business partner, quite a lot. And by bugging, I mean they’d come to her with problems throughout the day when we were all in the office. We just figured that’s how it is, so we worked on making the problem easier to deal with, not realizing we could avoid it entirely. In our case, we gave everyone a standing time to meet with her: Ashley on Tuesdays at one o’clock, Brooke on Wednesdays at three, and so on. But we didn’t see it as a solvable problem. We were blind to it; or rather, we knew it was a problem, we just assumed there was no solution, so we didn’t try to solve it,
until we had a business coach who said, yes, here’s how you can solve it. It was eye-opening. So being blind to problems can be problematic. The second barrier is a lack of ownership, which I think is clear from the Expedia example: everyone was measured on different things, and no one was measured on reducing calls. If no one is responsible for preventing the problem, and people are judged only on how they handle it, they’ll do the best they can to handle it. So no ownership is a problem. And then tunneling is the third. This is one we’ve all hit at some point. He references a book called Scarcity,
where they coin the term tunneling. In that book they say, quote, when people are juggling a lot of problems, they give up trying to solve them all. They adopt tunnel vision. There’s no long-term planning. There’s no strategic prioritization of issues. And that’s why tunneling is the third barrier to upstream thinking: it confines us to short-term, reactive thinking. In the tunnel, there’s only forward. With that, he gives an example of a nursing ward at a hospital where towels were a constant problem; they had to keep borrowing towels from other units, running over and grabbing them. So he says,
and by the way, how are her colleagues going to feel about someone who’s always yammering on about fixing processes rather than simply grabbing more towels from another unit? It’s so much easier and more natural to stay in the tunnel and keep digging ahead. So yeah, the problem is, if she says, I’m not going to just grab another towel, I’m going to try to fix the underlying problem, they’ll say, no, we have things to do, just grab the towel and keep going. And so they keep grabbing towels for years, when they could have saved hours and hours of work by spending a little time fixing the process. But it’s tough to do. I think it’s similar when you’re trying to delegate to other people. You say,
I need to do this thing. It takes me 10 minutes, but training her to do it is going to take me a half hour, so I’m just going to do it myself. It’s easier to do it myself. Even though, if you spent that half hour training her, you’d save yourself those 10 minutes every single time: 10 minutes plus 10 minutes plus 10 minutes over the weeks and years. You break even the third time she does it, and after that it’s pure savings, hours and hours of them. But it’s tough, because the first time you have to invest more. Same with the towels in the hospital: it’s easier to keep tunneling than to fix whatever was happening upstream.
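Just to spell out that delegation math (the 30-minute and 10-minute figures are the ones from the episode; the rest is a trivial sketch):

```python
# One-time cost to train someone vs. recurring time saved afterward.
TRAIN_ONCE = 30   # minutes invested up front
SAVED_EACH = 10   # minutes saved each time the task recurs

for reps in range(1, 7):
    net = reps * SAVED_EACH - TRAIN_ONCE
    print(f"after {reps} repetitions: net {net:+d} minutes")
# Negative through the second repetition, break-even at the third,
# and pure savings from then on.
```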
Section two of the book covers seven questions for upstream leaders. I’ll mention the questions, though I won’t necessarily dig into all of them. The first is: how will you unite the right people? Getting the right people together. The second is: how will you change the system? I like this one; he has a great example here. He says, on sharp curves where accidents tend to happen, transportation departments have begun to install high-friction surface treatments, overlays of ultra-rough materials super-glued to existing roads. In Kentucky, where the treatments have been used widely, crashes have been reduced by almost 80%.
None of those drivers who avoided crashes they would have suffered in an alternative world will ever know that they may owe their lives to some construction workers who installed a super-gritty road. And so yeah, they’ve reduced crashes by 80%. They can see the numbers; they’ve saved all these people from accidents. But no one who drives through thinks they were the one who got saved. It’s more like, I’m glad they helped other people; I wouldn’t have gotten in an accident, I’m a safe driver. I see this with things like the COVID vaccines, self-driving cars, and nonprofits. For instance, we work with a nonprofit called Kidz2Leaders
whose goal is to help children of incarcerated parents stay out of prison when they get older. Again, some of those kids will still end up going to prison. Most don’t, but of the ones who don’t, which ones did the program actually save? Probably not all of them; some might never have gone anyway. So you never know for sure who it was. Or look at COVID vaccines: the problem with COVID vaccines and self-driving cars, as they proliferate over the years, is that you won’t know who they saved,
but you will know who they killed. With COVID vaccines, there’s undeniable evidence that they’ve saved millions and millions of lives. But there’s also evidence that they’ve killed some people. You can debate how many, but with the Johnson & Johnson vaccine, for example, I think there were eight women who died from it. You can list the names of those women; you know who was killed. You don’t know the millions it saved. I had the COVID vaccine. It very well may have saved my life; I got COVID once and clearly didn’t die. Did it save my life? Probably not, but maybe. But who are the millions it saved?
No one knows. We can only name the ones it didn’t save. The same will happen with self-driving cars, I think. Right now here in the US, we have something like 35,000 automobile fatalities every year. So say self-driving cars become remarkable and get that down to 10. Well, the next year, when those 10 people die, you’re going to know exactly who they are. It’ll be front page of the paper that John died because his Tesla lost control and did something stupid, and if it weren’t for that self-driving car, he would have lived.
But we don’t know who the 34,990 other people are who didn’t die; we only know John died because of that self-driving car. It’s going to be interesting to deal with: knowing the technology saved a lot of lives without knowing whose, just like with the vaccines, is super tricky. The next question: where can you find a point of leverage? He says upstream leaders should be wary of common sense, which can be a poor substitute for evidence. You might say, of course it’s obvious we should do it this way, we should just grab more towels; instead, use evidence to find the right approach. He says in the book,
the postmortem for a problem can be the preamble to a solution. The postmortem for a problem can be the preamble to a solution. So if you have a problem, spending time investigating it is the first step toward actually solving it. But again, recognizing there’s a problem and then taking the time to look into it is something people skip quite often. The next of the seven questions for upstream leaders: how will you get early warning of the problem? How will you know the problem is happening?
He gives an example of the Safe2Say Something system, a youth violence prevention program in Pennsylvania where you can call in reports and try to help stop things before they happen. It’s a good upstream effort. But, he says, it will no doubt be prone to overreactions and even cruel pranks. It will almost certainly surface many false positives for every genuine threat averted. To make matters worse, it’s the curse of preventing rare problems that we may never really know when we’ve succeeded.
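To get a feel for why a tip line like that drowns in false positives, here’s a toy base-rate calculation. Every number in it is an illustrative assumption of mine, not a statistic from the book or from Safe2Say Something:

```python
# When the screened-for event is rare, even an accurate tip line
# produces far more false alarms than genuine hits.
prevalence  = 1 / 10_000  # assumed share of cases that are real threats
sensitivity = 0.90        # assumed chance a real threat draws a tip
false_alarm = 0.02        # assumed chance a harmless case draws a tip

true_tips  = prevalence * sensitivity
false_tips = (1 - prevalence) * false_alarm
share = true_tips / (true_tips + false_tips)
print(f"tips pointing at a genuine threat: {share:.2%}")
# ~0.45%, i.e. roughly 220 false alarms for every real one.
```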
They say, how could you possibly conclude that the kid in Hazleton would have perpetrated a massacre? Like, yeah, you stopped the kid, but would he really have done it? Did you solve the problem? He didn’t commit a massacre, which is great, but was he going to anyhow? And there are probably others who will still slip through. So you can look at the numbers and see that violence was reduced, but who did it help and who did it stop? You’ll never know for sure. Another question he asks: how will you know you’re succeeding? He says there’s also a third kind of ghost victory that’s essentially a special case of the second.
It occurs when measures become the mission. This is the most destructive form of ghost victory, because it is possible to ace your measures while undermining your mission. We talked about that earlier with Goodhart’s Law: you can nail your measures perfectly, and if you’re focused only on them, you may actually destroy the mission. Another way he puts it: when people’s wellbeing depends on hitting certain numbers, they get very interested in tilting the odds in their favor. If people have certain numbers they need to hit, they’ll hit them and let other things fall apart. For example, with Expedia: if they said, hey, your goal is to make your calls as quick as possible, take more calls per day, help more people, well, agents would just get off the phone quicker. They wouldn’t help people as well. But man, Steve did a great job: everyone else averaged 14 calls, and he did 22. Steve is the best! Except those 22 people may have gotten far worse service. That’s not what he was measured on, though; he was measured on getting through calls. So again, when people’s wellbeing depends on hitting certain numbers, they get very interested in tilting the odds in their favor. Just be careful
of what you’re measuring, because it may have adverse effects. Next, chapter 10 asks: how will you avoid doing harm? He gets into some good examples of how to avoid doing harm and make sure you do things the right way. One funny line I pulled out of this: he talks about how cars these days are, for the most part, all very well made. Compared to any time in the past, you can buy a new car from any manufacturer and it’s a pretty darn good, pretty reliable car. So he says, from the book, quote,
it’s genuinely difficult to buy a poorly made car these days, especially now that the Pontiac Aztek is gone. I thought that was kind of funny, because the Aztek was notorious for being a very bad car. If you’re not familiar with it, it was a minivan-ish thing; it was the one Walter White drove around in Breaking Bad, just an ugly, ugly minivan-looking thing. So yeah: it’s genuinely difficult to buy a poorly made car these days, especially now that the Pontiac Aztek is gone. I thought that was great. Then the biggest problem, or one of the biggest problems I see with all of this, he frames as: who will pay for what does not happen? So
again, if you’re paying to fix a problem, that’s easy to justify. But if you’re paying to prevent a problem, that’s much harder. We’ve seen that with homelessness: there are a lot of studies showing that preventing homelessness is less expensive than dealing with it. But you’d be paying for something and not seeing visible results from it, and that’s harder to justify, which is very unfortunate. He gives an example here. He says, a few years ago, my wife and I flipped to upstream pest control. We’d had a problem with spiders, so we
called an exterminator. When he visited, he offered us a subscription service. The idea was that they’d visit on a regular basis, not requiring an appointment, just spraying outside our home periodically, using the best strategies they’d learned to keep bugs at bay. At first we were skeptical: are we getting ripped off here? But ultimately, what won us over was the beautiful vision of removing bugs from our list of concerns. So we did it. We removed one small source of drama from our lives. In his case, they were able to pay for an upstream effort: they’re paying this exterminator to keep bugs out, and they have no bugs. So what is he fixing? They have no bugs, but
they know it’s because of the service, because they had bugs for years and now that one small source of drama is gone. Section three talks about going far upstream. He talks about the Chicken Little problem, which I love. Well, I hate it, but it’s very important. He says, quote, there’s a concept called the prophet’s dilemma: a prediction that prevents what it predicts from happening. A self-defeating prediction. What if Chicken Little’s warnings actually stopped the sky from falling? He talks in the book about things like breast cancer screening: you do more mammograms,
you catch problems early, and so the predicted rates never materialize. Then people say, well, why are we doing this? Rates are dropping. The example in the book is breast cancer rates in women: studies said they were going up and up and up, so you do more screening, and the rise never happens, precisely because you did the work to stop it. You prevented the prediction from coming true by acting on it. He says the Y2K bug was an example of the prophet’s dilemma: the warnings that the sky would fall triggered the very actions that kept the sky from falling. For those old enough to remember, when the year 2000 rolled over, it was going to mess up a lot of computers.
They stored years as two-digit numbers, so after 98 and 99, the year would roll over to 00, and a lot of computers would think it was now 100 years earlier, which would cause all kinds of drama. Then the morning of January 1, 2000 arrived, and not much happened. There were a few minor issues, and people said, I guess it didn’t happen; those people were wrong; their predictions were horrible. But it was actually because of all the work they did. I was in IT at the time, and in the three or four years leading up to it, we upgraded thousands and thousands of systems and machines and servers and prevented a lot of failures.
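For anyone who never saw it firsthand, here’s a minimal sketch in Python of the bug and of the windowing fix that was one common remediation. It’s illustrative, not any real system’s code:

```python
# Years stored as two digits ("YY") break when 99 rolls over to 00.
def naive_age(birth_yy: int, current_yy: int) -> int:
    return current_yy - birth_yy

print(naive_age(70, 99))  # 29  -- fine in 1999
print(naive_age(70, 0))   # -70 -- in 2000, the math breaks

# "Windowing" picks a pivot and maps each two-digit year onto the
# century that makes sense, without widening the stored field.
def windowed_year(yy: int, pivot: int = 50) -> int:
    return (1900 if yy >= pivot else 2000) + yy

print(windowed_year(99))  # 1999
print(windowed_year(5))   # 2005
```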
So it was a self-defeating prediction: by saying all this bad stuff was going to happen, we knew to go fix it, and then the bad stuff didn’t happen. It becomes that Chicken Little problem; the prophet’s dilemma is what they call it. It’s a challenge to be aware of: if you’re able to predict a problem and then solve it, people will say, ha, you were wrong, it didn’t happen. But you fixed it, so congratulations on that; just make sure people understand what happened. Y2K still comes up in arguments today. You’ll hear people say, man, what a hoax that was, nothing bad happened, people were all worried about it.
People fell all over themselves; they were Chicken Little saying the sky was falling. But again, it was because they said it was falling that we fixed it. I’ll leave you with this last little bit about you being upstream; I’m encouraging you to find problems you could solve to make your life easier. He says: which problems have you come to accept as inevitable that are, in fact, nothing of the kind? Maybe it’s something small, say, the irritation of finding a place to park in a crowded parking lot. I met a woman who told me about an epiphany. She said, quote,
I literally have a step counter on my wrist, yet I was driving myself crazy trying to find a close space. It was madness. So now I always park in the most remote spot in the lot. I think of it as a VIP spot, away from the other cars. I get some extra steps and don’t stress about finding a spot. It’s such a wonderful sense of relief, like I purged that concern forever from my life. I love that; it’s kind of a great way to end. Just think about things a little differently: you’re trying to get more steps in your day, but you’re also stressing out trying to park close to the store. No, just undo it. Park farther out.
You don’t have to worry about finding a spot, you’re not near any other cars, and you get a few extra steps. It also only adds like 15 seconds to your day to walk a few extra spaces; it’s not a big deal. So yeah, it’s a great way to look at things. Upstream is a fantastic book, and I encourage you to check it out. It’s a good read, and fairly concise as these books go. So: Dan Heath, Upstream. Give it a read.