System of Profound Knowledge with John Willis
Adam and John Willis discuss the aptly named "System of Profound Knowledge" introduced in Deming's book "The New Economics".
Hello and welcome. I'm your host, Adam Hawkins. In each episode I present a small batch of theory and practices behind building a high velocity software organization. Topics include DevOps, lean, software architecture, continuous delivery, and conversations with industry leaders. Now let's begin today's episode.
Hello, everybody. Welcome back to Small Batches. This is the second part of my conversation with John Willis on Deming. If you don't know John, he is the co-author of The DevOps Handbook and the host of the Profound podcast. In this part of the conversation, we talk about Dr. Deming's final book, The New Economics, which introduces the System of Profound Knowledge. With that, let's get into the conversation.
So John, welcome back to Small Batches. We're here for the next episode of the Deming series, and we're going to be talking about the System of Profound Knowledge. So, want to get into it?
Yeah, yeah. You know, we talked about Out of the Crisis being his first book, and it being pretty heated and bombastic, and, you know, all those things.
The thing about The New Economics, to me, is... Like I said, I think when he wrote that first book, he was just fed up. And then what happened was, like I said, it became this sort of Deming mania. He became incredibly busy, and he winds up writing this second book, which is: okay, do you really want to know what I know?
So he floats this idea that he calls the System of Profound Knowledge, and I've had people come up to me like: "John, isn't that, like, pretty egotistical, to call his own system profound?" I'm like, let me explain it to you, and then you decide whether it should be called profound. I have no problem calling my podcast Profound.
Right. So here's the thing. He had all this work, right? Let's start with what's interesting but sort of simple, which is the theory of variation, right? That's statistical process control: really understanding this body of work that Walter Shewhart had created at Bell, basically at the Hawthorne works, and how you use statistics to take the human element out of this thing. It's just a fascinating way to use very simplistic, standard-deviation, basic stuff to look at data and really understand quality. We just spent a whole lot of time on that. So that's basically understanding variation: understanding types of variation, understanding how to react to variation, and how not to have knee-jerk reactions to the wrong type.
A sort of anomalous thing, like a black swan: don't fire people, right? So that's the first lens. Deming would say there are four lenses to understand the complexity of complex systems, right? The first one is the theory of variation. The second is basically the theory of systems, systems thinking.
Right. And that really draws from a body of work about what he learned about complexity, what Goldratt talks about, although there's no real evidence that they collaborated a lot, but there's always a shoulders-of-giants thing going on. In fact, I know for a fact that he corresponded with some of the original chaos theorists.
In his heart of hearts, anybody who pays attention knows, you have to look at things with a system view, right? Systems thinking. So that's a whole body of work. And in fact, what's interesting is that with Peter Senge's The Fifth Discipline, Deming wrote a whole letter to him when he got an early copy of it, and Senge responded.
So there was even this mix between The Fifth Discipline work and Deming's thoughts, right? So those are two. And now you're getting into two interesting ones. The third one is the theory of knowledge. You could simply say that's PDCA, or PDSA as Deming called it: plan, do, study, act, which is the scientific method.
So if we talk about what Mike Rother wrote brilliantly in Toyota Kata, we talked about the improvement kata, because it's not just doing plan-do, plan-do, plan-do over and over. In fact, one of my favorite quotes on that subject is from Dr. Steven Spear, in "Decoding the DNA of the Toyota Production System" in Harvard Business Review. He said Toyota was a community of scientists continually experimenting. Right? Imagine that being the way you work as an organization.
Interestingly enough, a lot of that is based on epistemology, actually, philosophy. Some of the stuff I'm writing about is how Deming and Shewhart were big fans of C.I. Lewis, who was one of the early pragmatists. In fact, Deming said that Shewhart told him he had to read this book called Mind and the World Order, and Deming said he had to read it six times, it was so hard to comprehend. Right. So that is the first three of the four lenses.
Now, I would say those three alone are pretty awesome, right? Like, let's look at the statistical analysis of what we're doing; let's make sure we're using scientific thinking and method in our process; and let's clearly understand that we're always going after what we call the global optima, not local optima. Where I think it becomes profound is where he pulls in the fourth lens. I posed this to Doris Quinn on my podcast, a woman who traveled with him the last two years of his life and went to Ford. And I said, my theory is that, yes, Deming taught a whole lot to the Japanese culture, but I think he also pulled in a lot from them. I think intrinsic motivation... he was a clear humanist, right? Right from the get-go, the way he thought about humans, his quotes about humans, how he wanted the worker to be treated, and how he felt that most sins were because of the leadership.
Right. And I think that culture really gave him a lot of value. He loved that culture, they loved him, and he loved the food. There's a great book that his secretary wrote (...) she documents everything, and like every other paragraph is about, we went to this restaurant, we had this amazing meal. But it's this fourth lens, psychology, right? Even in DevOps, we try to say, you know, it's about culture, it's about behavior. But imagine putting all four components into one structured understanding, where you actually understand biases and how these things play. And then you can look at that body of work, Kahneman and Chris Argyris, right? All that stuff, I think, if not directly influenced by Deming, was definitely indirect. I mean, there's a behavioral science of how we make choices, of decision-making and our biases.
So he basically ordered these things and codified it. But like I told you earlier, I don't know that reading The New Economics is the best way to understand the System of Profound Knowledge. The way I learned it was actually through the healthcare industry, which has probably taken Deming's work farther than any other industry, you know?
And so I watched a bunch of videos, and one was just simple: just getting people in a hospital to wash their hands. You can tell them all they want to know about systems thinking and how it affects the global optimum, you can do variation, you can do statistics, you can do all that stuff. But if somebody firmly believes that it's a waste of time to wash their hands, and we saw plenty of that kind of stupidity during the pandemic, you're not going to be able to attack it with stats. You're going to have to figure out how to get over and around that bias.
And I thought it was just brilliant that he included that as an equal lens among the other three. So that's, as you asked me earlier, why I got so involved in Deming. Because the thing we understood early in the DevOps rooms, or just being in large infrastructure, was trying to figure out the things that work for us and the things that don't.
There's always sort of a human element to it. There were things like, in the earliest DevOps sessions, we'd talk about, you know, the peer ops, right? It's sort of a joke, right? Everybody laughed, but why? Because you couldn't just say, well, we got this monitoring tool.
We got Chef, we got CFEngine, we got all these tools, right? There was this other thing you had to deal with, and that was the human psychology of how you get people to adopt and change. And you saw that clearly throughout his whole career. He didn't just wake up at 90 years old and say, oh, I'm going to throw in psychology.
I mean, he was gathering knowledge of that coming out of college, you know: understanding how pragmatism, the first American philosophy, was changing the way people thought, and what he learned from the Japanese culture about the intrinsic nature of work.
Yeah. So for the listener, I want to give a little bit of background on some of the content in the book, some thoughts, and then we can move on to discussing the red beads.
So one thing that I really took away from The New Economics was this first thing: a globally optimized system has unoptimized components. If you're trying to create an optimized system, you cannot over-optimize the components, because then you're creating negative outcomes in the system. He gives the example of a company that tries to save money on the flight to send an employee somewhere, just so they can arrive jet-lagged. Well, sure, the company saved money, but then the person's not able to do the job when they get there. Just an example of systems thinking.
Then the other one: this was really the first time I had ever been introduced to statistical process control and separating two different causes, common cause and special cause. To me, this was one of the biggest takeaways from the book, because I had no knowledge of this.
And I had been going about a lot of the work I had been doing wrong, because I had been making the two mistakes he talks about, which are: attributing special causes to common causes, and the reverse, attributing common causes to special causes. Which then leads into what was actually my favorite chapter of the book, the discussion of the red beads, because it covers variation, it covers systems thinking, and of course the human psychology.
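The common-cause versus special-cause distinction can be made concrete with a small sketch. This is a hypothetical illustration, not Deming's or Shewhart's exact procedure: it computes Shewhart-style three-sigma control limits from a baseline sample and classifies new points against them.

```python
import statistics

def control_limits(baseline, sigmas=3):
    """Shewhart-style control limits: process mean +/- three standard deviations."""
    mean = statistics.mean(baseline)
    sd = statistics.pstdev(baseline)
    return mean - sigmas * sd, mean + sigmas * sd

def classify(point, limits):
    """Common cause: routine noise inside the limits. Special cause: outside."""
    low, high = limits
    return "common" if low <= point <= high else "special"

# Hypothetical daily defect counts from a stable process form the baseline
baseline = [10, 12, 11, 9, 10, 11, 10, 12, 9, 11]
limits = control_limits(baseline)

print(classify(12, limits))   # within limits: ordinary fluctuation
print(classify(25, limits))   # outside limits: worth investigating
```

A point inside the limits is routine variation of a stable system, so reacting to it individually is exactly the knee-jerk mistake described above; a point outside the limits is the one worth investigating as a special cause.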
So, John, could you talk about and introduce the listener to the red beads?
Yeah. So I did want to backtrack on a couple of things. One of my favorite all-time quotes, and I don't have it in front of me, but it's an unknown author, is something like: misunderstanding variation is the root of all evil, misappropriation, and knee-jerk reactions. To your point, it's understanding the difference between common cause and special cause, and how to react to it. What you don't want to do is, when it's special cause, react like it's common, and when it's common, react like it's special. There's beauty in it. The other thing you mentioned, another great example, is the Silicon Valley sort of stuff: bigger companies will just pay for lunch.
To a classic bank or insurance company, that looks like a terrible ROI. But what you're actually doing is getting people to work through lunch, collaborating; they're not leaving the building. So it's that whole thing of understanding (...) you don't hear about this guy much anymore, but in the early cloud days Randy Bias was very famous for what he called the difference between bottom-line ROI and top-line ROI. Right. So the red beads are a great example, I think, of showing how you can fall into these traps. So, Deming hated MBOs.
I am pretty certain, and everybody grab your seat, you're going to get upset when you hear this, that he would have hated OKRs. I'm just reading Working Backwards, right, Jeff Bezos's ideas of how they build stuff at Amazon, and they didn't look anything like OKRs.
And by MBO, you mean management by objectives...
Management by objectives. I suspect he would have hated KPIs. And I would argue that he probably would hate the way most people implement OKRs today.
Coming from the SRE side, I'm not sure... One of my thoughts was, I wonder what Deming would have to say about SLOs.
Oh, I think he'd be all in on that, right? We could blow another hour on this, but I think he'd be all in on it, because the thing about SLOs and SLIs is that they're a sociotechnical contract over delivering the service. So it's totally systems thinking.
Yeah, that's kind of what I thought. Let's just assume, for the sake of discussion here, that Deming would like something like SLOs, because it's a holistic measurement of the system and he's all about systems thinking and, you know, outputs...
And it's calibrating, and feedback loops, right? I mean, the whole idea of doing SLOs, and specifically SLIs, is collaboration, right? You bring in the different parties and say, hey, let's experiment. It's like kanban, too: you don't build your first kanban board and say, this is the way it's always going to be. It emerges. And so I think about SLOs and SLIs as these ongoing conversations about how we manage the social contract between the people who deliver software and the people who manage it at scale. I think he'd be all over that.
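One way to see why an SLO reads as a contract rather than a quota is the error budget it implies: the amount of unreliability the objective still permits in a window. Here is a minimal sketch; the service numbers are hypothetical.

```python
def error_budget_remaining(slo_target, total_requests, failed_requests):
    """Fraction of the error budget left. A 99.9% SLO over N requests
    tolerates 0.1% failures; spending beyond that goes negative."""
    allowed_failures = (1 - slo_target) * total_requests
    return (allowed_failures - failed_requests) / allowed_failures

# Hypothetical window: 99.9% availability SLO, one million requests, 400 failures
remaining = error_budget_remaining(0.999, 1_000_000, 400)
print(f"{remaining:.0%} of the error budget remains")
```

As long as budget remains, the teams who build and the teams who run the service have an agreed, numeric basis for negotiating risk, which is the collaborative, feedback-driven quality described above.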
I think OKRs might be okay if you ran them the way they were originally defined, but when you start doing personal OKRs and quarterly OKRs, you're just not... Here's the thing about Deming: I think he understood that these are all complex systems, and that's why he was so into statistics.
Yeah, right. In fact, there's a great story: somebody asked him once, how does a mathematical physicist become a statistician? And his answer was, I've always been interested in the theory of errors. Well, that's probability, right? That's understanding that it's non-deterministic at its core. And he said, I've also been trained in least squares by the best. So he understood that the way you solve problems is you have to approximate and you have to calibrate. And that's why I think deterministic structures, like trying to figure out exactly what you're going to look like a month from now, fall apart; even Rother puts that in the gray zone.
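The least-squares idea mentioned here, estimating a trend from noisy observations rather than assuming a deterministic answer, can be sketched in a few lines. This is a generic ordinary-least-squares fit, not anything specific to Deming's own work; the data points are made up.

```python
def least_squares(xs, ys):
    """Ordinary least-squares fit of y = a + b*x, minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Noisy observations of an underlying trend: estimate it, don't assume it
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = least_squares(xs, ys)
print(f"y is roughly {a:.2f} + {b:.2f}x")
```

The fit approximates; the residual error is the part you acknowledge rather than explain away, which is the "theory of errors" mindset.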
All right, so back to the red beads, which is a perfect example of what happens when you build these arbitrary constructs: Joe, I expect this many red beads this quarter. If the things you're measuring are out of the control of the people, then it's folly, right? So the whole exercise is that somebody, Joe, will scoop up beads, and there's a disproportionate number of white beads versus red beads, all mixed up. Joe scoops up hardly any red beads: good job, Joe, you're awesome. Then Mary comes up, they mix it up, she scoops, and she gets a bunch of red beads: Mary, you've really got to work on your red bead count. In a lot of ways, when we look at how we structure our organizations, we don't really give people the ability to control their environments and we don't give them the freedom. Take this right to software: if you're in a massive waterfall shop and you're a developer contributing a part of this monster monolith that gets delivered every six months, you might have written the most amazing payments code, whatever, right? The code might have (...) greatest stuff ever, but the whole system, every six months, goes down for like a month and a half before they get it rolling again.
As opposed to another team where somebody had their own atomic service, right? But poor Joe or Sally is getting dinged on their bonus. And then the structure of...
I've got to tell one of my favorite all-time stories. So when I was at Chef, and this is public now: almost everybody at Chef in the early days was ex-Amazon. One of the guys, Jesse Robbins, was the CEO at Chef, and at Amazon he was self-titled the Master of Disaster. He was there from the early days; he ran infrastructure. This is pre-Amazon-cloud, right? You couldn't get anything onto his systems without his approval.
So there's this great story, and I'm paraphrasing here, right? The Kindle is coming out. I don't have the exact details; I'm going by memory of how Jesse would tell the story periodically, and he's told it publicly a couple of times now. So the Kindle's coming out, and they had the press releases, the stock price, everything company-wide in motion for this to hit, let's just say, on a Monday, I don't know if (...). So they went to get approval to put it in production, and he's like, no, that has not been tested; it's probably going to bring down the system. And it was a battle royale, right? The battle supposedly went to Bezos; I'm hearing this third, maybe fourth hand.
But I did hear from Jesse, and I suppose from the guys who were in that meeting, that the argument was: stock price goes down, or the system goes down. The product owner and Jesse both leveled their arguments, and Jesse lost. The product went in. It was the early days of Amazon e-commerce, and the site went down, big time, took down Amazon. Then they have an after-party for the glorious success of the Kindle software and product hitting the market on time, Wall Street happy, and they invite Jesse's team to the party.
They don't come.
They can't go.
That's exactly it. They're busy.
Yeah. I mean, that's it, right? That's the red beads.
Yeah. So for the listener, just to recap this experiment: the idea is that there's a (...) jar of red and white beads, beans, whatever, all mixed up. You're given this paddle that can pick up 50 beads at a time, and you're supposed to get a certain percentage of red beads.
And your goal is to remove all of the red beads from this jar, right, picking them up with this scoop. And he gives out the requirements, which are like: each willing worker is supposed to scoop this many times per minute, pick up 50 beads at a time, and do it in this amount of time. I'm reading this and just thinking, man, there's no way this could ever work.
I mean, I know how this is going to play out, because the worker has no input into the process. What they're picking up each time is totally random, and yet each worker, each person scooping out of this jar, is being compared to the others. Even though there's no logical way to compare them.
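To see just how random the workers' "performance" is, here is a small simulation of the red bead experiment. The jar proportions (800 red beads mixed into 3,200 white) and the 50-bead paddle follow Deming's usual setup; the worker labels are, of course, made up.

```python
import random

def paddle_scoop(jar, paddle_size=50):
    """One worker's turn: 50 beads drawn at random; count the red 'defects'."""
    return sum(1 for bead in random.sample(jar, paddle_size) if bead == "red")

random.seed(42)  # reproducible run
# Deming's usual jar: 800 red beads mixed into 3,200 white ones
jar = ["red"] * 800 + ["white"] * 3200

# Six "willing workers" follow the identical procedure; only chance differs
scores = {f"worker_{i}": paddle_scoop(jar) for i in range(1, 7)}
print(scores)
```

Every worker runs the identical procedure, so any spread in their red bead counts is pure sampling variation: ranking, rewarding, or firing on those numbers is ranking the paddle, not the person.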
And it's so simple that when you see it in that form, you think, well, that's stupid. But then you step back and say, wait a minute, how do we do (...)? Oh, have you ever heard that somebody created a game called Kata in the Classroom? Have you ever heard about this?
No, I have not.
So it's brilliant. Oh, it's brilliant. They based it on (...). You get groups of five that have to build a three-year-old's puzzle, like 15 pieces. And in these teams of five, if you're lucky, you get four or five different groups going at the same time. So what you say first is that the goal, the true north, is to build the puzzle in 15 seconds. And then you go through these katas: you baseline, and normally everybody takes like 40, maybe 60, 90 seconds the first time, right? Then they go through like five katas: you plan, do, check, act, and figure out your strategies.
And typically, I don't think anybody, I've never seen it, I've asked, and nobody's ever gotten to 15 seconds. I've seen teams break 20, but it has to be this specific puzzle; it says for three-year-olds and above. So I've actually done this with a leadership group. I did it with a company, with a whole bunch of people, and they said, oh, you know, John, you want us to believe you can change this place? Get our C-level team to play this game for 90 minutes. And I did. And it was just amazing. You sit there and you just watch: these are people running a multi-billion dollar corporation, struggling to get reasonable efficiency. Five people, probably a million dollars of salary per group, struggling with what is basically a three-year-old's puzzle. You learn as you go, as a group; you have to focus on flow, and you learn a lot about how you solve problems in a group. But it's the same thing, right? If I told you, hey, your company will go out of business if your five executives can't build a puzzle, and you put them in a room, five katas later they're scratching their heads about why they can't break 30 seconds, you know?
Yeah. Okay, so earlier you mentioned Deming's writing style, and I tend to agree with you that he has kind of a weird style, a little bit hard to read. But I like how sometimes he's just writing, and then all of a sudden he'll drop one sentence that's just like a hammer. Like he knows he's dropping it. At the end of the red beads chapter, he writes, "you may observe this in your own work," and just ends the chapter. When I read it, I thought: yeah, you can feel it. There's no way this could work, yet we're all still participating in it. And he tells the story of the woman who participated in the experiment, who approached him and said, look, Dr. Deming, I still feel obligated to participate and do all these things, because I'm a member of the system. She talks about how she was emotionally invested, psychologically invested, even though her brain was telling her there's no way you can do this. There's still this human, emotional connection: I want to do what I need to do, and I want to do it well, whatever that is. Even when you know it's futile, you're still connected to it. So, thinking in systems and thinking about people: how can you construct systems and ways of working that don't put people in that state? It's so important to think about the humans involved.
We do it every day, right? You've seen people who work in companies; I do a lot of assessments, I'll go into a company and interview all these people, and you'll find people who know it's broken. Especially when you get into stuff like, we're moving to cloud, but you can't do this and this, and you can't have that, because, you know. And so I ask them, well, what do you do? You're told you have to move this out because (...), but then another group tells you, you can't do this, you can't do that. And they're like, well, I basically create crappy software. You know what I mean? I'm going to do my job; I get paid, I've got to create it. So people either figure out workarounds that still do the right thing, right? Like a lot of people do. And in the end, that's where you get this terrible tech debt, another form of debt, where when you actually try to get things aligned, you have to go back and start finding all the workarounds on the workarounds, because people are just like, I'm going to do my job and I'm going to do it great, even though they tell me I can't. There's a great story in The DevOps Handbook about Heather Mickman at Target, right? She decided, that first year Target was going to go all in on e-commerce and web-based commerce, that she needed Kafka and Cassandra.
So she went to this board, the LARB (...), and they were like: absolutely not, no way, you could never do that in retail. She did it anyway. And basically, everybody realized they could have never survived that first year without it; she was in charge of the whole commerce API. And then they actually got rid of the (...), this architecture review board, and she had a plaque that just said, you know, congratulations for getting rid of the (...).
Right. So there are those types of people, and then there are the people who become apathetic or somewhat cynical: all right, they say they want X. I told them they shouldn't do X, but they told me, shut up and do X. So I'll do X in a crappy way.
And then there are the people who just sort of flounder, right? They're in constant chaos with themselves, probably like that woman, you know. So it's tough. I was just interviewing Courtney Kissler, and I think she's one of the best enterprise managers and leaders I know. And she tied it back to Deming's quote about joy in work: she said, I want to make sure the people who work for me have joy in their work. There are other things she looks for, but she looks for that constantly. And if they don't have it, then we've got to figure out what in the system is not right. It may be the person, but, you know, she didn't say that; you look at the system first.
Well, I think that's a good segue into probably the last conversation that we have. So let's pause here and we'll pick it up again in the next episode.
You've just finished another episode of Small Batches, the podcast on building a high-performance software delivery organization. For more information, and to subscribe to this podcast, go to smallbatches.fm.
I hope to have you back again for the next episode. Until then, happy shipping.
Like the sound of Small Batches? This episode was produced by Pods Worth Media. That's podsworth.com.
Subscribe to Small Batches
Show notes and other extras sent to your inbox