Nov. 2, 2023

Unpacking Amazon’s unique ways of working | Bill Carr (author of Working Backwards)

Bill Carr is the co-author of Working Backwards: Insights, Stories, and Secrets from Inside Amazon. Over more than 15 years at Amazon, Bill played a pivotal role in shaping the company's global digital music and video businesses, including Amazon Music, Prime Video, and Amazon Studios. After Amazon, Bill was an Executive in Residence with Maveron, an early-stage, consumer-only venture capital firm. He later served as the chief operating officer of OfferUp, the largest mobile marketplace for local buyers and sellers in the U.S. Today he's the co-founder of Working Backwards LLC, where he helps companies implement Amazon's time-tested management strategies. In this episode, we discuss:

• What exactly “working backwards” is, and how you do it

• Why having “single-threaded leaders” is so effective

• Inside Amazon’s intense product review process

• How to actually follow the “disagree and commit” principle

• The thinking behind the principle “Leaders are right, a lot”

• Input vs. output metrics

• Fostering a culture of risk-taking and innovation

• The role and responsibilities of a “bar raiser” in your hiring, and how it significantly improves the success rate of new hires

Brought to you by AssemblyAI—Production-ready AI models to transcribe and understand speech | Coda—Meet the evolution of docs | Wix Studio—The web creation platform built for agencies

Where to find Bill Carr:

• X: https://twitter.com/BillCarr89

• LinkedIn: https://www.linkedin.com/in/bill-carr/

• Website: https://www.workingbackwards.com/

Where to find Lenny:

• Newsletter: https://www.lennysnewsletter.com

• X: https://twitter.com/lennysan

• LinkedIn: https://www.linkedin.com/in/lennyrachitsky/

In this episode, we cover:

(00:00) Bill’s background

(04:26) Amazon’s workplace evolution

(09:54) Amazon’s “fitness function”

(11:44) Single-threaded leadership

(18:07) Implementing a program orientation with single-threaded leadership

(20:16) The GM model vs. single-threaded leadership

(21:31) Functional countermeasures needed for single-threaded leadership

(25:22) Embracing the “disagree and commit” principle

(30:22) Understanding disagreements

(32:41) Deciphering Amazon’s “Leaders are right, a lot” principle

(35:25) An explanation of the working backwards framework

(41:16) PR FAQ process: Amazon’s innovation engine

(43:49) The concentric circle model for sharing PR FAQs

(44:47) Deconstructing the PR FAQ structure

(44:55) The customer problem-solution statement

(47:52) Create a product funnel, not a product tunnel

(51:19) How Amazon promotes action vs. talk

(54:35) Amazon’s flywheel and input metrics

(1:00:51) Signs you’ve got a good input metric

(1:04:23) How mistakes can still be made with working backwards

(1:06:54) Why disagreements aren’t necessarily signs products will fail

(1:08:02) Examples of failed Amazon projects

(1:09:55) Cultivating risk-taking and accepting failure

(1:13:57) Amazon’s “bar-raiser” practice for hiring

(1:18:21) Selecting Amazon’s bar raisers

(1:20:41) Advice on implementing practices from Working Backwards

(1:23:10) Bill’s work as an advisor

(1:26:05) Lightning round

Referenced:

• Working Backwards: Insights, Stories, and Secrets from Inside Amazon: https://www.amazon.com/Working-Backwards-Insights-Stories-Secrets/dp/1250267595

• Jeff Bezos on X: https://twitter.com/jeffbezos

• D.E. Shaw: https://www.deshaw.com/

• Eric Ries’s website: https://theleanstartup.com/

• GM business model: https://fourweekmba.com/general-motors-business-model/

• Rick Dalzell on LinkedIn: https://www.linkedin.com/in/richarddalzell/

• The Effective Decision by Peter F. Drucker: https://hbr.org/1967/01/the-effective-decision

• Template: Working Backwards PR FAQ: https://www.workingbackwards.com/resources/working-backwards-pr-faq

• Good to Great: Why Some Companies Make the Leap and Others Don't: https://www.amazon.com/Good-Great-Some-Companies-Others/dp/0066620996

• The Amazon flywheel: https://feedvisor.com/resources/amazon-trends/amazon-flywheel-explained/

• Six Sigma: https://www.6sigma.us/

• Loonshots: How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries: https://www.amazon.com/Loonshots-Nurture-Diseases-Transform-Industries/dp/1250185963

• Andy Jassy on LinkedIn: https://www.linkedin.com/in/andy-jassy-8b1615/

• Implementing Amazon’s Bar Raiser Process in Hiring: A Quick Guide: https://www.barraiser.com/blogs/implementing-amazons-bar-raiser-process-in-hiring

• Microspeak: The As-Appropriate (AA) interviewer: https://devblogs.microsoft.com/oldnewthing/20231017-00/?p=108897

• The Practice of Management: https://www.amazon.com/Practice-Management-Peter-F-Drucker/dp/0060878975

• The Effective Executive: The Definitive Guide to Getting the Right Things Done: https://www.amazon.com/Effective-Executive-Definitive-Harperbusiness-Essentials/dp/0060833459

• Steve Jobs: https://www.amazon.com/Steve-Jobs-Walter-Isaacson/dp/1451648537

• Seveneves: https://www.amazon.com/Seveneves-Neal-Stephenson/dp/0062334514

• A Gentleman in Moscow: https://www.amazon.com/A-Gentleman-in-Moscow/dp/0143110438

• Dune on Prime Video: https://www.amazon.com/Dune-Timoth%C3%A9e-Chalamet/dp/B09LJXY4PH

• A Spy Among Friends: https://www.imdb.com/title/tt15565872/

• Zipp 303 Firecrest tubeless disc brake: https://www.sram.com/en/zipp/models/wh-303-ftld-a1

• The Fifth Discipline: The Art & Practice of the Learning Organization: https://www.amazon.com/Fifth-Discipline-Practice-Learning-Organization/dp/0385517254

Production and marketing by https://penname.co/. For inquiries about sponsoring the podcast, email podcast@lennyrachitsky.com.

Lenny may be an investor in the companies discussed.



Get full access to Lenny's Newsletter at www.lennysnewsletter.com/subscribe

Transcript

Bill Carr (00:00:00):
... Jeff would say, we took it as an article of faith. If we served customers well, if we prioritized customers and delivered for them, things like sales, things like revenue and active customers and things like the share price and free cash flow would follow. So therefore, when we're making a decision thinking about a problem, we're going to start with what's best for the customer and then come backward from there. That informs what's the work you have to do to then create this new solution for customers.

Lenny (00:00:33):
Today my guest is Bill Carr. Bill is the co-author of the book Working Backwards, which is a synthesis of the biggest lessons that Bill and his co-author learned from their many years at Amazon. Bill joined Amazon just five years after it was founded, stayed there for 15 years where he worked on the books business, and then as VP of Digital Media, launched and managed the company's global digital music and video businesses, including Amazon Music, Prime Video, and Amazon Studios. After Amazon, Bill was an executive in residence at Maveron, an early-stage VC firm, then chief operating officer at OfferUp. And these days, Bill runs a consulting firm called Working Backwards, LLC, where he and his co-author, Colin Bryar, help growth-stage and public companies implement the many practices developed at Amazon.

(00:01:20):
In our conversation, we go many levels deep on how to actually implement a number of the practices and ways of working that helped Amazon become the success that it is today, including the process of how to actually work backwards, how to organize your team with a single-threaded leader, how to divide up your metrics into input and output metrics, how to practice disagreeing and committing, how to implement the Bar Raiser program in your hiring process and so much more.

(00:01:47):
Huge thank you to Ethan Evans for making this episode possible and introducing me to Bill. With that, I bring you Bill Carr, after a short word from our sponsors.

(00:01:58):
Today's episode is brought to you by Assembly AI. If you're looking to build AI-powered features in your audio and video products, then you need to know about Assembly AI, which makes it easy to transcribe and understand speech at scale. What I love about Assembly AI is you can use their simple API to access the latest AI breakthroughs from top-tier research labs. Product teams at startups and enterprises are using Assembly AI to automatically transcribe and summarize phone calls and virtual meetings, detect topics in podcasts, pinpoint when sensitive content is spoken, and lots more. All of Assembly AI's models, which are accessed through their API, are production-ready. So many PMs I know are considering or already building with AI, and Assembly AI is the fastest way to build with AI for audio use cases.

(00:02:44):
Now's the time to check out Assembly AI, which makes it easy to bring the highest-accuracy transcription plus valuable insights to your customers, just like Spotify, CallRail, and Writer do for theirs. Visit assemblyai.com/lenny to try their API for free and start testing their models with their no-code playground. That's assemblyai.com/lenny.

(00:03:07):
This episode is brought to you by Coda. You've heard me talk about how Coda is the doc that brings it all together and how it can help your team run smoother and be more efficient. I know this firsthand because Coda does that for me. I use Coda every day to wrangle my newsletter content calendar, my interview notes for podcasts, and to coordinate my sponsors. More recently, I actually wrote a whole post on how Coda's product team operates, and within that post, they shared a dozen templates that they use internally to run their product team, including managing the roadmap, their OKR process, getting internal feedback, and essentially their whole product development process is done within Coda.

(00:03:45):
If your team's work is spread out across different documents and spreadsheets and a stack of workflow tools, that's why you need Coda. Coda puts data in one centralized location regardless of format, eliminating roadblocks that can slow your team down. Coda allows your team to operate on the same information and collaborate in one place. Take advantage of this special limited time offer just for startups. Sign up today at coda.io/lenny and get a thousand dollars starter credit on your first statement. That's C-O-D-A.io/lenny to sign up, and get a startup credit of $1,000, coda.io/lenny.

(00:04:26):
Bill, thank you so much for being here and welcome to the podcast.

Bill Carr (00:04:30):
Thanks, Lenny. Thanks so much for having me. Pleasure to be here.

Lenny (00:04:32):
It's my pleasure. So, I was reading your book, and something that I recognized as I was going through it is just how many new ways of working Amazon contributed to the way tech and business runs. And I made this little list, and I'm curious if there's anything I'm forgetting that's obvious. So, obviously the idea of working backwards, the idea of one-way and two-way door decisions, the concept of disagreeing and committing, input and output metrics, using memos versus decks, the idea of two-pizza teams, and then I know that evolved into single-threaded leaders. Is there anything else that's just like an obvious core thing that's maybe almost too obvious that I don't even think about that Amazon contributed?

Bill Carr (00:05:10):
The one that's non-obvious is really the way in which Amazon created a set of leadership principles that were very real, and the way in which it created a set of processes to reinforce them. I think I certainly haven't encountered anything quite like that. It was very intentional. So, that is also a distinctive element that we try to point out in our book.

Lenny (00:05:42):
Awesome. Okay. So, maybe we will come back to that, because that is also a really powerful mechanism. So, the question I wanted to ask about this is: there are companies that are bigger than Amazon, that are more successful than Amazon, that have been around longer than Amazon, but I don't think any other company has contributed so many unique, new ways of working and also been able to coin them in such shareable ways. What would you say it is about Amazon that enables this sort of way of working and also makes these things proliferate through the culture?

Bill Carr (00:06:13):
That's actually one of the reasons why Colin and I set out to write our book, because everyone knows about Amazon as an innovative product company, at least certainly during the time I was there, which was from 1999 through the end of 2014. The company rolled out all kinds of innovative products. The Kindle, AWS, Alexa, Echo, the Prime subscription itself is innovative and...

Lenny (00:06:41):
And it's all those things, by the way.

Bill Carr (00:06:43):
Yes, a lot of people around the world use all those things. And obviously, Jeff was a huge driver of those things. But what people don't realize is that Amazon was actually, to some degree, equally focused on process innovation. In many cases, by the way, we stood on other people's shoulders; we cannot take credit for having... For most of these, there were other inspirations, or we built on work that others had done, which, by the way, is what I think all great companies should do. And again, that's also why we wrote the book: we would like to allow people to stand on Amazon's shoulders, to learn what we learned, and then take all or part of these things and build from there.

(00:07:28):
But to more directly answer your question, how or why did this happen. So, this period of both product and process innovation actually occurred in this one narrow window of 2003 to 2007. During that window of time, all of the products I just mentioned and all of the processes except for one were all developed in this one four year period.

Lenny (00:07:53):
Wow.

Bill Carr (00:07:54):
And this is the period actually where we were going from hypergrowth stage, zero-to-one company, to what I would call one to whatever, a thousand, infinity. That next step that companies have to make, where what happens is things become very complex. We're no longer just a bookstore; we sell a lot of things. We actually branched out beyond just a retail business; we had a third-party marketplace business. We were experimenting in those days with running websites for third-party retailers too. We were developing new things. We were in many countries around the world. So, we'd become very complex. And what happens at that point is that you reach this point where the CEO can no longer be in every important meeting, can no longer be involved with hiring every person. And you need a system, a method to run the company effectively. And Jeff Bezos is fundamentally, he's a very scientific and analytical thinker.

(00:09:07):
His undergraduate degree was in computer science, I'm pretty sure. Although I think he actually started off wanting to get a physics degree, he ended up moving over to computer science. He spent his early days at DE Shaw as a quant on Wall Street. Very quantitative mind. So, he applied this... When he thought about this problem, he said, "Well, I need to be scientific about this. There needs to be some system or some approach, some mechanism for me to be able to manage such a company. So, I'm going to experiment, like a scientist would, with different ideas, different hypotheses, implement them and see what works, and iteratively improve." So that was the mindset which we took... Which by the way, we applied both to process innovation, but also product innovation.

Lenny (00:09:55):
Awesome. I had Eric Ries on, and he also happened... I thought about this at the same time: he contributed a lot of core concepts to the way tech worked, and he actually brought up a couple of concepts that were on the cutting room floor, basically things that he thought would be things people adopt everywhere. And I'm curious, is there an example of that at Amazon, where you built a process and had this clever term for it and it just never spread or never actually worked at Amazon? Anything come to mind?

Bill Carr (00:10:20):
The dev team, the design team, the product team, they're all in one group, and they'll go operate autonomously, but not completely autonomously because we, the senior leadership team, Jeff and the S-Team want to know that they're on the right track. So, we're going to create something called a fitness function, which was let's figure out what are the four or five or six metrics that matter most for your particular area. Let's give a weighting to all of them and then let's create an index for those, and we'll measure that index up and down. And that's the fitness function.

Lenny (00:10:52):
That is a very nerdy way of organizing teams. I love it.

Bill Carr (00:10:55):
Yeah, super nerdy. But we realized after, I don't know how long, several months or a year of doing this, that the fitness function was not a good idea. This is what I would describe as a compound metric, where you try to take several important metrics and munge them into one. The problem is it actually becomes totally meaningless. When you're measuring things, you're trying to understand what actions or reactions are creating the good outputs that you want, revenue, customer growth. But by putting them all together, you basically obfuscate that. And what we really realized is we need to just break each one of these out individually and manage them each in its own way. So today, I discourage teams and companies from creating any sort of compound metrics.
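
To make the compound-metric problem concrete, here is a minimal, hypothetical sketch; the metric names, weights, and values are illustrative assumptions, not Amazon's actual fitness function. It shows how a weighted index can "improve" while hiding the fact that one of its inputs got worse:

```python
# Hypothetical "fitness function": a weighted index that collapses several team
# metrics into one number. Names, weights, and values are illustrative only.

WEIGHTS = {
    "conversion_rate": 0.4,        # higher is better
    "page_load_seconds": -0.3,     # lower is better, hence the negative weight
    "repeat_purchase_rate": 0.3,   # higher is better
}

def fitness(metrics: dict) -> float:
    """Collapse several metrics into a single index (the pattern Bill advises against)."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

last_week = {"conversion_rate": 0.12, "page_load_seconds": 0.80, "repeat_purchase_rate": 0.25}
this_week = {"conversion_rate": 0.10, "page_load_seconds": 0.55, "repeat_purchase_rate": 0.25}

# The index rises from -0.117 to -0.05 even though conversion rate fell from
# 12% to 10%: the compound number hides which input actually moved.
print(fitness(last_week), fitness(this_week))
```

Tracking each metric separately, with its own owner and target, keeps exactly the signal the index throws away, which is the point Bill makes above about breaking each one out and managing it in its own way.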

Lenny (00:11:44):
I've done that once, and it was a terrible idea as well, where we had six different metrics and every quarter, we were going to move a different metric that contributed to a higher-level metric. And what we realized is we just never learned how to get good at one thing, and then it turns out there's always one thing that actually impacts the bigger goal most, so you just end up working on that thing anyway.

(00:12:03):
Let's actually go deeper into the single-threaded leader piece since you mentioned it. It's actually come up a lot on this podcast, of people working this way where they have a single-threaded leader. And so clearly, it's worked. And I guess we'll just help people understand what a single-threaded leader actually means, and then why it's such an effective way of working.

Bill Carr (00:12:23):
So, the concept of single-threaded leadership was first... It was born from this time of complexity at Amazon, and where again, large... Once you get to a certain scale, you get to a point where there are competing departments, competing interests, and they're competing for some centralized pool of resources. For all of you who are working for a tech company, this is the pool of engineering resources, or today, data science and AI resources. There may be other constrained resources; often design is a constrained resource. But the point is now all these teams want that pool of resources to go build stuff for them, but they're in competition with each other. So, most companies solve this by having an intense, centralized, highly collaborative process. We decided to go in the other direction for the reasons I mentioned, which were that we were finding that we were spending all our time in these meetings, planning, and a lot of the work we were doing, the artifacts we created, the documents, the projections, were actually not very useful either; they were bureaucratic time wasters, largely because a lot of the assumptions built into them were deeply flawed.

(00:13:35):
So, you're debating numbers in these documents that are based on flawed assumptions, which is a waste of time. So, what we realized instead was how do we get... The three things we really wanted were ownership, speed, and agility. And so, we experimented with that and said, "Let's create teams that can stand alone, where there's a single leader and the cross-functional resources that they need either directly report to them or are dedicated to them." So they don't necessarily have to be a straight-line direct report. In Amazon's case, for the most part it was. There were some dotted line, but it could be all straight line, it could be all dotted line, it could be a mix of the two. But fundamentally, we moved from what we called a project orientation to a program orientation.

(00:14:24):
So, a project orientation means, oh, we are going to do this project to change our search result page and algorithm, and the project is defined in this way and it's going to take six months. The resources will come and swarm on that, and then they'll move off to some other thing in some other part of the company. The program-based orientation says, let's stick with the search example. There's a team that works on search, and they always work on search. And instead of thinking about things on a project-by-project basis, they think holistically about what they need to do to improve search. They have a set of metrics that they're looking to drive, largely ones that they can control. Things like what percent of the time is a customer clicking on one of the top three results on my search page, or how many milliseconds does the page take to load in this browser type, on this device type, et cetera, et cetera.

(00:15:21):
And they then are running their own roadmap. They're deciding what are the most important things for us to go work on, and having a prioritized list of those things and be able to start at the top of the list and work their way down with the pool of resources that they have. Sometimes, and most times, they may want more resources to be able to tackle more, but they spend less time in resource contention, resource fighting, and instead, focus on building what they can build with the resources that they've got.

(00:15:54):
And so, the benefit of this is if there are successes or failures, they're really dependent on themselves now. The only thing they could maybe argue about is how they could do better if they had more resources, which they can petition management for. But this way, it also solves a big management problem, which is instead of senior management refereeing every item on a roadmap, they're refereeing which teams have how many resources, which is more of a once or twice or three times a year decision versus refereeing everything on the product roadmap. And then all the resource contention issues, that's a daily issue. And so, it frees teams up to actually go and sprint ahead. There's a lot of work you have to do to get ready for this. For example, in a software environment, when we first started and we had a monolithic code base that was not pretty, we weren't ready to do this because you have all those interdependencies.

(00:16:56):
Once we moved to a service-based architecture, and then teams could own their code with defined endpoints, APIs that other teams could understand that are well-documented, then we could move in that direction. And the other thing is we had to create, what I would call, countermeasures because there's no free lunch in org structures, any org structure, you're trading off one thing for another thing. In this case, you're trading off potentially functional excellence. So, in other words, if you no longer have every single engineer or every single marketing person or every product person or every biz dev person reporting into a C-level leader of that particular function, and instead, they're spread out in small teams across the company reporting into some generalist who is probably not going to have functional expertise in several of the functions that they're leading, you risk the problem of then the people in those teams not gaining functional competency. That's the downside. And we can talk more about this, but we created a lot of countermeasures to still enable us to have functional excellence while creating these single-threaded teams.

Lenny (00:18:07):
To drill into this a little bit further, is the origin of this the recognition at Amazon that the best stuff comes from one person's vision, one person driving, and one person's ass being on the line, versus the decision-by-committee approach?

Bill Carr (00:18:24):
It is less about that. I do want to be clear, it's one leader and their team who are accountable and responsible. So, with respect to what are we going to go build, how are we going to go measure success, all those things, this team and that leader are responsible for documenting that, writing their plan. Now, they don't just get to go off and do that. There was an intense review process at Amazon where either at some level, whether it be the vice president, senior vice president, or all the way up to the Jeff level and his direct reports called the S-Team, this plan would be reviewed and scrutinized deeply as well, and there'd be a discussion, an interchange, and basically getting alignment between the senior leadership team and each one of these single-threaded teams on that plan before the team could go off and run.

(00:19:17):
The beauty of that though is that once we'd had those discussions, those interchanges, then the teams were free to sprint hard after their plan. They didn't have to worry about, "Am I aligned with my CEO? Am I aligned with my senior vice president?" They could know that they were. But yes, this creates then clear... If they're going to deliver it or not, it's up to that owner and that team. Whereas when you have this highly cross-functional approach and there's not one clear person who's responsible for this one project that's on this roadmap, I've seen many CEOs pull their hair out saying, "I have no ownership and accountability here. How do I have that?" They're pushing on a string. They can't, because different people and leaders are part-owning, half-owning a long list of things instead of fully owning a short list of things.

Lenny (00:20:15):
I like that. I like the metaphor of pushing on a string. Is this approach similar to the GM model, or is there a big difference when someone's thinking about going with the GM model versus the single-threaded leader approach?

Bill Carr (00:20:27):
Yeah. Obviously, there are probably different definitions of what people consider the GM model, but I would consider that being this person is a P&L owner, and you can, of course, create mini P&Ls within a P&L. Like for example, in the book business, we could, and I don't know most of the time we didn't do this, but we could have created a P&L owner just for fiction books, or just for professional and technical books, which is a very large category with big differences between the others. And then you say, "Great." Then that team, they have their own dedicated team. They're fully responsible for the revenue numbers and other numbers. But you have to be thoughtful about how you do this because one of the three questions you have to ask when you establish one of these teams is, does the team have the resources within their control to effectively manage this part of this department, this product, this P&L?

(00:21:23):
And sometimes then if you narrow things down too much in some cases, then the answer is no. In other cases, the answer can be yes very easily. A great example, this was in Prime Video, one of the businesses that I managed, we could create a single-threaded team who just was working on applications for TV sets, like Samsung, Sony. We could create another team that's working on game consoles, and another team that's working on mobile phones and tablets. And then within each one of those, we could further break it down. We could have one team working on Xbox and another one on PlayStation, another one just on iOS. In those cases, then it's very clear how you can break the teams down and they can have very clear ownership.

Lenny (00:22:06):
Awesome. Let's go back to the countermeasures topic, and then even just a little more broadly. You talked about one thing that was important to put in place before you moved to the single threaded leader model, which is creating APIs, and basically breaking apart this monolith. What are some other things that you think you need to put in place to be successful in trying to shift to this model?

Bill Carr (00:22:26):
The other thing was these functional countermeasures. So, let's stick with the engineering, for an example. So, in 2004, 2005-ish, I started managing a single-threaded team. Actually managed two different ones, one for music and one for video, which are now Amazon Music and Prime Video. They weren't called that in those days. But I started managing a small team of software engineers at that point. Well, I have never... Well, I have written lines of code, but that would be back in high school, and we're talking about Basic and Pascal. I have a master's in business, a background in marketing. I'm a generalist, okay? So, I'm not equipped to coach. I couldn't possibly conduct a code review. I couldn't possibly conduct an architectural review. I couldn't possibly coach or mentor an engineer on how to improve their craft. But I was one of many of these examples.

(00:23:28):
And there could be reverse examples where instead of me being a business leader, I was purely an engineer, and now I'm managing a team that does marketing and business development. I wouldn't know anything about those things if that had been my background. So, what we did, and I'll stick with the engineering examples, we came up with various countermeasures. One example was that we still had a C-level leader of engineering in Rick Dalzell, and most of the core infrastructure and core services still reported into Rick.

Bill Carr (00:24:01):
So it was things like payments or infrastructure, search, and Rick still could be a technical leader for the whole company, and he and his team could create things like: what are the standard ways that we're going to do code reviews? What are the standard ways across the company that we will interview and screen engineers? What does the promotion process look like? What are the defined steps getting from an SDE1 to an SDE2, SDE3? How do we document and describe what the requirements are? There are many things like this.

(00:24:41):
Effectively, what it also meant is that anyone who is an engineering vice president, or in many cases a director, they would often have something else beyond their day job of some sort of subject matter expertise area where they would also contribute to the company. A good example of this would be that they might sit on a panel for promotion from a certain level to another level in the engineering world, or they might be available to do code review outside of their organization for another organization. So people had other jobs in addition to their day job to build and maintain functional excellence. There are a lot of examples like this across the company.

Lenny (00:25:23):
Let's go in a different direction and talk about one of my favorite principles of Amazon, which is disagree and commit. I think even the way I describe it is wrong. I think people hear this term and they often use this principle incorrectly. For example, it actually starts with "have backbone" and then "disagree and commit." So I'd love to just hear how you've seen this actually implemented well and what people should do and think about when they're trying to implement something like this at their company.

Bill Carr (00:25:49):
So when I was at Amazon, there were 10 leadership principles, and they've since expanded them. But of those 10, this was always the least well understood when I was at Amazon too, partly because it is actually the most nuanced and difficult to actually use. So here's what it means. What it means is have backbone and disagree, meaning when we are making any kind of important decision, if you are part of that team, part of that unit, it is your obligation to voice your point of view if you disagree with the approach that's been taken. The point of that disagreement, by the way, is to provide usually additional information or a new point of view that people have not considered.

(00:26:40):
So I like to geek out a bit on the process of decision-making and have read more and more about this. I think that Peter Drucker probably has the best writing on this topic. But as he would describe it, good decisions are made by first understanding all the different points of view and pros and cons of the potential issue at hand or the potential direction, and great leaders, what they do is they solicit these different points of view. They have a team that they work with to debate and discuss things. So another way to think about this is a king and their court. In an ideal world, if you assume that there are no political motivations, the court is there to advise the king and help them think through different problems and provide different and opposing points of view to allow the king to arrive at the right decision.

(00:27:41):
This is sort of no different than that. The disagree part is about bringing forth new information, new data, a new point of view that would be contrary to the current direction. So that's the disagree part, and you're obligated to do it, as we would describe it, all the way up the chain if necessary, if it's an important issue and people are not hearing or understanding your point of view. Now, the important point is, first of all, about them hearing and understanding your point of view.

(00:28:11):
What would often happen, I can tell you as someone in a leadership role, is someone would come to me with a disagreement, and many times I'd appreciate it, by the way, because they'd bring some point of view that was useful. But sometimes they'd bring the disagreement and cite the reasoning behind it, and I already knew that reasoning. We'd already thought of that, in which case I would say, "I hear your disagreement. We have already considered that factor. But even though that factor is there, here are these other factors that outweigh that."

(00:28:42):
Now, as long as the disagreer is hearing back from the leader that they understand their point of view, understand why they are pushing back, seem to fully understand it, and have taken that into consideration, that is the point for them to commit. Because the point is you provided your information, they've processed that information, and they've decided to go this way with the knowledge of that. Where people get confused is they maybe don't understand when they're supposed to stop disagreeing, so hopefully that explanation made it clear when you're supposed to stop. And then the commit part done well means that it's not just, "I'm going to commit; I don't really agree with what we're going to do, but I'm going to get behind this."

(00:29:39):
Ideally it's, oh, now I've heard the argument, I've actually now thought about the argument, and hopefully that person has now understood why we're taking that direction. So their commitment is based on that understanding, because then they can reflect that understanding back to their organization too. Because the worst thing to do is to say, "Yeah, we're committed to this. I don't really agree and I still think it's wrong, but I'm committed to it." That's not actually commitment. This is really about decision-making and understanding the facts and information that people are going to use to make a decision and then being able to reflect that back.

Lenny (00:30:22):
I imagine there are many times I've gone through this where I still don't agree. What's your advice to a manager or to a report of just like, okay, when you actually still don't agree, how do you behave? Do you just behave like, yes, I agree with this and don't really voice your concerns or something else?

Bill Carr (00:30:37):
I worked with Jeff on all kinds of different new ideas. Jeff doesn't think like a normal person. His level of creativity, the way he thinks, the timescale on which he thinks: there are many things about the way he thinks where there was no one else in Amazon who thought that way. So there'd be times when even after we'd had that discussion, I would maybe still disagree, but then what I would do is I'd focus on, okay, well, what is the kernel or the core of why Jeff thinks that we should do this, and I would focus on that kernel. I got great advice actually from one of my managers at one point, Steve Kessel, who said, "You have to look for what that is, and then your job is to take that kernel and try to run with it and expand it," and try to see how I could take that idea, that concept, and then make it into something viable.

(00:31:41):
It doesn't always work, but it's about then having that understanding of what it is, not just sort of going through the motions of stomp, stomp, stomp through it. That's not going to work. Also, I've seen people who try that, and their career doesn't go very far. You have to have some degree of faith that there's something there, and I'm going to try to do the best I can to make that work: How would I productize that idea? How would I make that viable from a business point of view, or whatever the different constraints are?

Lenny (00:32:18):
Awesome. So the advice there is focus on the parts you agree with and think about how you can find out if it's actually right or not.

Bill Carr (00:32:25):
Agree with, or even just you may not even agree, but what is the core of what that person is thinking is the big benefit or good guy or thinking vector that they're on that's causing them to want to go in this direction.

Lenny (00:32:42):
Thinking vector, love that term. Along the same lines, another principle that I love is "leaders are right, a lot." I feel like this is a term that almost goes unsaid; you almost can say this in a lot of companies. I'm curious about the origin of why that became an important principle and then how it's implemented at Amazon.

Bill Carr (00:33:03):
Yeah. So going back to this last discussion, one fallacy we should all acknowledge is that when you're making these decisions and you're trying to use data to make decisions, you can make the data look however you want it to look to try to fit your decision. If I'm looking at some issue and I've got some big dataset, I can come up with ways of looking at that dataset to support this idea and ways of looking at that dataset to not support it. So the data rarely makes the decision for you. What is happening is a lot of judgment and interpretation of the data, weighing that, weighing various factors, to then come to a decision.

(00:33:49):
That is sort of the "right a lot" part. The "right a lot" part comes from having what we call sound judgment, which generally comes... Some people maybe are born with this, not a lot of them; mostly they get it through experience. A lot of that experience is actually about being wrong, by the way, about making mistakes and about having looked at a lot of problems, made decisions or observed others making decisions, being a student of that, and then using that to understand how to weight different information when making a decision.

(00:34:27):
So "right a lot" is that you're good at that, and that it proves out, and that, generally speaking, people want to follow someone who ends up by and large going in the right direction, right? You're the leader of a team. The team is petitioning you on multiple sides. If you keep going off in some direction where most of the team is scratching their heads saying, "I don't think that that was the right decision," they're not going to want to follow you very far, and you're probably not going to go very far. So this is something that you develop through experience, and I'd say from having the opportunities to observe and work for others that are good at this.

Lenny (00:35:12):
I love that it's a lot. I like that it's not just leaders are right. It's right a lot.

Bill Carr (00:35:18):
Yeah, yeah. No one is right every time. That is totally unrealistic. Yeah.

Lenny (00:35:26):
Let's talk about the titular concept of your book, and that's a word I've never used, but I think it's appropriate, which is working backwards. First of all, just what does it actually mean to work backwards versus working forwards?

Bill Carr (00:35:37):
The title of the book comes from two things. One is one of the leadership principles, customer obsession, and the principle states something along the lines of: great leaders start with the customer's needs and work backwards from there to meet those needs or solve them. Then also because we created a process in this window I was talking about earlier, the 2004 to 2007 window; we created this process for new product innovation called the Working Backwards PR/FAQ process. They both refer to the same idea, which is that your guiding star, or the point from which you're going to start, is: what are the customer's problems, or what are the customer's needs? And then figure out, okay, well, what would be the solution to that, what are potential solutions to that?

(00:36:40):
To do those things, starting without the constraints of my financial constraints, my resource constraints, my legal constraints, my engineering constraints, whatever all those constraints may be. Because the problem is what most of us do is we start with those constraints and work forward from there, or we start with things like: I've got to increase revenue. How do I increase revenue? I need to increase active customers. How do I increase active customers? Instead of customer-oriented behavior, we tend to start with those things, which may often lead you in the wrong direction.

(00:37:22):
Whereas we had, as Jeff would say, we took it as an article of faith. If we served customers well, if we prioritized customers and delivered for them, we took it as an article of faith that then things like sales, things like revenue and active customers and things like the share price and free cash flow would follow. So this is important because I still can't give you objective proof that that is true, I don't know who could, and so it was saying this is an article of faith that if we do that we think those other things will work out. 

(00:37:57):
So therefore, when we're making a decision thinking about a problem, we're going to start with what's best for the customer and then come backward from there. Then in that coming backward process, we're going to have to figure out, well, to do that, gee, I'm going to have to solve this engineering problem, or I'm going to have to figure out how to make this thing cost less or make this thing faster or solve one or more problems. That's the backwards, that informs what's the work you have to do to then create this new solution for customers.

Lenny (00:38:36):
Awesome. So just to summarize, you start with what are the customer's needs and problems, and I think a big part of Amazon's approach is what are the lasting problems they'll always have, which is I think it's lower prices, faster shipping and all those things, and then think with no constraints. When you work with companies to implement this idea of working backwards, is it always what is the customer problem and need versus revenue or growth or something like that? Or is there other examples of where you work backwards from at different sorts of companies?

Bill Carr (00:39:07):
Well, the working backwards part is strictly about the customer's needs. Yeah, we don't want to work backwards from revenue. I guess we didn't really use this term for sort of other things like cost structure. Cost structure was actually a part of working backwards from the customer that if we had a low cost structure, we could afford to give customers lower prices, therefore let's figure out how to have a low cost structure. Because in itself, driving down costs, doing things more efficiently doesn't inherently benefit customers because you could just choose to take more profit. It only does if you decide that in doing so I'm going to lower my prices to customers or provide some other benefit. So no, we used it in this method of I'm starting from the customer, and then very specifically, we used it in this method of new products and features that I'm going to go build on behalf of customers. 

Lenny (00:40:10):
Awesome. This episode is brought to you by Wix Studio. Your agency has just landed a dream client and you already have big ideas for the website, but do you have the tools to bring your ambitious vision to life? Let me tell you about Wix Studio, the new platform that lets agencies deliver exceptional client sites with maximum efficiency. How? First, let's talk about advanced design capabilities. With Wix Studio, you can build unique layouts with a revolutionary grid experience and watch as elements scale proportionally by default. No-code animations add sparks of delight while adding custom CSS gives total design control.

(00:40:46):
Bring ambitious client projects to life in any industry with a fully integrated suite of business solutions, from eCommerce to events, bookings, and more, and extend the capabilities even further with hundreds of APIs and integrations. You know what else? The workflows just make sense. There are the built-in AI tools, the on-canvas collaboration, a centralized workspace, the reuse of assets across sites, the seamless client handover, and that's not all. Find out more at wix.com/studio. Okay. So then when you go work with a company to implement this idea of working backwards, what are the very tactical things that you do to help them here? I know the PR/FAQ is a part of that, so let's chat about how to actually implement that. What are the steps to shift to working backwards?

Bill Carr (00:41:31):
Yeah. So the first shift is to take this, because that's just a concept, right? Working backwards. Well, how do I turn that concept into a scalable, repeatable process? That's exactly where Jeff's mind went. Eventually, without getting into the origin story, we came up with this process called the PR/FAQ process. What it means is that whenever we're devising a new product or feature, we're going to start by writing a press release describing the feature, and describing it in a way that speaks to the customer and, to some degree, the external press and world, where the idea is, in my description of this, it better jump off the page: wow, as a customer, I will really need this.

(00:42:17):
So where I work first is to say, okay, for your product development process, let's start by using this method as the method to decide what am I going to go build. And, oh, by the way, to use it as a method to sort between a lot of different choices of what you might build. In summary, the way that process works is that in the PR, you're going to describe very carefully and clearly who's the customer, what's their problem, and what's the solution that you're planning to build. That sounds really simple and easy, but it's actually very hard to do that well, to crisply and clearly define those. The first two things are the hardest to define, like who's the customer? Like, anyone will say, "All restaurants are my customer."

(00:43:08):
Okay, well, that's a mistake. Now, I mean, which kinds of restaurants are your customers? In what kinds of cities? In what kinds of formats, et cetera, et cetera? Then, what is the specific problem you are solving? Ideally, you would have in some way quantified that problem, or there's some data or customer insights that have led you to understand that problem, to know that it is a meaningful and big problem. Ideally, a problem that people would pay money for if you could solve it, because you can just look at the economics of that problem, and if instead they use your solution, this would be beneficial to them.

(00:43:50):
So having them first implement this PR/FAQ process is the first step. Then the next step really is to go from there to say, "Okay, writing PR/FAQs is one thing. Well, how do I actually use them? How do we actually develop them?" Because there's this iterative nature to writing PR/FAQs, where it's sort of a concentric circle review. You start off small, with one author and with low fidelity, writing these things, and then you start to share them with a small group and get feedback and improve it, then a wider group, get feedback and improve it, and onward and onward until, depending on the size and scale of your company, you get up to the CEO, as a way to strengthen, improve, and really codify this idea and determine whether it's a great idea or not. So I help them understand how does that work? How do you do this iterative process? Then once you've done that, then what do I do with these PR/FAQs once I've got them? How do I then think about that with respect to my roadmap?

Lenny (00:44:55):
Awesome. Okay. That was an awesome overview. I'm going to fire off a couple of questions around the first part. Do you still suggest people do it as a press release? It feels like press releases aren't a thing anymore. Do you ever suggest people do it as a tweet or as a TikTok video or a blog post?

Bill Carr (00:45:09):
Good question. So the first thing is it's not a real press release, okay? We could change the nature of it, and if instead we wanted to call it the customer problem-solution statement, we could just change it to that, because there really are three money paragraphs in this. First of all, it's not meant to be a real press release, so don't use the language you would use if you were sending an actual press release. This is like an internal document. Okay. So that's the first thing.

(00:45:42):
The second thing is the heart of it really is that first paragraph, it's a short description; that second paragraph, that's the problem statement; and that third paragraph, that's the solution statement. If you wanted to ditch the rest of it and the artifacts of the press release, you could. I think there are other benefits to it, like the headline: is this headline long and drawn out, and can I even tell what the heck this thing is from reading it? If you used a tweet, that wouldn't work very well.

(00:46:11):
The date is also a meaningful thing when you write the press release. The date is meant to be a hypothetical timing on which you're envisioning launching this thing, which tells the reader something. Are you thinking that this is something that's so simple and easy we're going to launch it next month, or so complex that we're going to launch it a year from now? So there are some other directional cues within it. Like I said, with everything, these are tools that people can use, and I'm sure that companies will find other ways to improve upon these tools, but if you don't use those parts of them correctly, you're kind of missing out on the main benefit that you're getting out of this.

Lenny (00:46:50):
Do you try to write it in a way that would be announced, like a press release feel? Or is it mostly just who is the customer? Do you try to pitch it as a part of this experience?

Bill Carr (00:47:01):
So you try to write it in that way, but the one thing is you don't want to use hyperbole. It would be very factual, with numbers; a data-rich document too. So again, not like a real press release. A lot of internal confidential data would be in this press release.

Lenny (00:47:25):
Got it.

Bill Carr (00:47:25):
So it's a tool that has a very specific use to it.

Lenny (00:47:34):
Is there a template that we can point people to in the show notes to help them craft this? I think there's a version in your book maybe, but is there some online that we could point people to?

Bill Carr (00:47:43):
Yeah, so we have a website related to the book, which is www.workingbackwards.com, and there's a resources section within there and you'll find a template.

Lenny (00:47:53):
Amazing. Okay. Then the concentric circle piece. So the idea there is basically get feedback from an increasingly larger swath of the company, and it sounds like a big part of that is also to get buy-in as you go along the way.

Bill Carr (00:48:05):
Yes and no. So first of all, there are some things where you may write it and you, the author, if we were in the old world, would take the piece of paper, crumple it up, and throw it in the trash can, because, on your own, you've realized, "Now that I put this down on paper and read it, this is not actually that good of an idea. I'm going to try something else." By the same token, you may then have written one you think is a pretty good idea, and you show it to a peer or your manager and they give you feedback that makes you want to ball it up and throw it into the trash can.

(00:48:38):
So part of this concentric circle thing is not that every one you write lives on and gets all the way to the CEO. There are no stats on this, but let's just say in some imaginary world where, yes, all these things... You're a product manager and you've got a director of product management you report to, who reports to some senior vice president of a division, who reports to a CEO. Well, if you truly run this out and you write 100 PR/FAQs in a year, maybe 20 of those make their way to the CEO. The point is not every single one of them is destined to go that far. The numbers get narrower. And this leads me to the concept of what you're really trying to create: a product funnel, not a product tunnel. With a funnel, meaning lots of things at the top, fewer things at the bottom. A tunnel means that everything that comes in is also going to come out the other side.

(00:49:40):
And the problem with that method is that it means you're not actually having a method of consideration, comparing it against other things that you might build, or thinking about how you deploy what is, frankly, at most companies, your most precious resource, which is your engineering team. You should be looking at various choices. You should think of yourself honestly as a venture capitalist. They don't fund every company that they meet with. They actually fund a very, very low percentage of them. And at Amazon, we had lots and lots of PR/FAQs that were a great idea, but we didn't ship them because we had other ones that were just a better idea, which had a bigger potential impact. So you want that. You want to create this corpus of ideas that are well-thought-out and select the best ones.

Lenny (00:50:27):
It feels like a lot of these processes are basically just ways to stop stupid shit from happening. I think the narrative is a good example where you have to expose your thinking deeply. This is a great example of that.

Bill Carr (00:50:39):
Yeah. And it's also, I would say, an example of where this is a process to prevent the other process, which is the product development process, from becoming the thing where you just get locked in on, "What are we doing in this sprint, what are we trying to get done," and focused on shipping stuff. What I recommend is you try to break that into two different processes. One is the process of deciding what you should go build, and that's what the PR/FAQ is designed for. And then once you've decided that, then, yes, by all means, use all that good thinking, right? "Now how can I ship it efficiently and effectively with few to no bugs?"

Lenny (00:51:19):
I was just reading this Harvard Business Review article, I think it's called the thinking-to-doing gap, where a lot of companies just spend a lot of time talking about ideas and solutions and not actually doing anything. And so I'm curious how you try to avoid that at Amazon, considering there's this period of just like, "Let's explore, explore, explore, and we're fine."

Bill Carr (00:51:39):
There are a couple of ways, and of course I'm somewhat having to imagine what the problems are in companies where that's going on. So one version of this problem is what I'd call the big-idea-that's-not-fleshed-out problem. I'm sure that every single person listening to this podcast has either done this themselves or has witnessed others in their company who come up with a concept like, "Oh, I think if we built this, boy, that would really solve things, or that would really work well, or that would really grow things." And it may sound good to you, to everyone, and then maybe you start working on building it. But the reality is that once you've spent some time looking at that idea more deeply, you then start to identify several roadblocks or maybe a fatal flaw with this idea. And in fact, no, you shouldn't waste any of your time building that thing because it has a fatal flaw.

(00:52:46):
So one problem is that companies get stuck, I think, where they never actually do that documentation. And so it's a debate and discussion about concepts that aren't really well fleshed out, and people don't have a good way to evaluate them realistically. In those situations, what gets done is probably more a function of politics, or will, or a completely top-down culture. I think the other failure mode is where they're debating and discussing things but don't have good methods for taking an idea and then going to build it, meaning they probably don't have the right org structure or processes in place to take the good idea, assign it to someone who will own it, and go look at it. And after they have owned it and gone and looked at it, if it works, then they and their team can actually go build it.

(00:53:50):
What I always found, as I became more senior in the company and my role became bigger and bigger, is that when something came up, some idea that didn't neatly fit within my org structure, I couldn't necessarily delegate it to someone. There were only two things I could possibly do: set it aside altogether, because otherwise it would just be a real distraction to people, or decide this was a compelling enough idea that we were going to take a resource, could be one person, could be a whole team, depending on the idea, and assign that resource to actually go look at this and work on it. Otherwise, it would never happen.

Lenny (00:54:31):
I've been through those many times. Okay. So there are two more concepts I want to try to touch on before we wrap up. The next one is the idea of input and output metrics. This is something that we really implemented at Airbnb; it became a very effective way of thinking. And actually there are a lot of Amazonians that ended up at Airbnb, a lot of leadership, so there's a lot of this stuff that we ended up doing, like the memos. So on input and output metrics, could you just describe what that is, why it's so important, and why people often think about metrics in the wrong way?

Bill Carr (00:55:03):
Yeah, so the origin of this one really was, again, in our early years at Amazon, '99, 2000, 2001. We were a public company then, we were growing. But then growth started to... It wasn't just all up and to the right and like, "Woo-hoo." Every company is going to hit a wall eventually. If you're lucky enough to have been at a company where it's just going up and to the right with no gravity, good for you, because millions of people never experience that.

(00:55:36):
What most people experience is the reality that there's a lot of gravity pulling against your revenue numbers. You've put a plan out there, you wanted to grow 15% or 20% or 75% or whatever it was, and now it looks like you're not going to hit that number this quarter. And so what ensues is, "We're not going to hit our number. What should we do about that to hit our number?" And this often happens when there's a month or a month and a half left in the quarter, and then we would run around like chickens with our heads cut off and come up with a bunch of ideas that tended to be promotional in nature, or price reductions, or we'll send this extra email or extra ad or whatever it might be-

Lenny (00:56:19):
Another Prime Day.

Bill Carr (00:56:19):
Right. And the reality is we did that, we went through that enough times, several quarters, and we started to realize, "Huh, these fire drills don't really work." We didn't really get meaningful progress against the number with these last-minute things we decided to go do. And oh, by the way, they were a big distraction. If they did work at all, they pulled revenue into this quarter that would have just come in the next month or next quarter anyway. So it was really a zero-sum game there. And we realized we weren't actually working on things that matter to customers, things that are going to move the needle over the long term.

(00:56:58):
And this was about the same time Jeff and the S-Team were reading the book Good to Great. You'd have to ask Jeff, but if you ask me, I think this was the single most influential and effective management book for our company. Most of you probably know what it is; if you don't, go read Good to Great. It is, in my opinion, the best, most important management book you'll ever read. What it did is help us codify our growth flywheel, meaning what are the inputs that, if we improve them, drive the business. In our case: how do we have broad selection? How do we have a great customer experience in retail? Things like how easy was it to find what you wanted to buy, how easy was it to buy it, and how fast did it get to you. Were the prices low? Do we have lots of merchants on our platform? And by the way, could we drive out costs?

(00:58:06):
So we identified these things on our flywheel. And this identification was such a critical moment for the company because then we realized, "Okay. What we need to do is spend our time focusing on how do I measure each one of those things, and then how do I improve each one of those things?" So it shifted our focus away from the short-term thinking of pushing the revenue number up to the longer-term thinking that if we just improve these things, we win. There's no day when people will wake up 10 years, 20 years, 30 years from now and say, "All else equal, I'd rather shop at a store with fewer items than more items, or a store with higher prices than low prices, or a store where things get to me more slowly versus more quickly." So if we can just improve these things, this is our path to winning. Those were all inputs to the customer experience. And we then figured out ways to measure them, creating a set of input metrics.

(00:59:03):
And so then when we would develop our operating plans, review our business each week, and set our goals, we were hyperfocused on those inputs and the input metrics. As a simple example, there was one tool that Jeff and the leadership team, the S-Team, used called S-Team goals, which was effectively a list harvested from all of our operating plans: "Here are the most important goals for the company." And I can't remember exactly what year, something around 2007 or 2008, they looked at that list, which was about 500 items long by the way, and they counted it up. Of that list, only 10 of them actually had a financial metric in them, like revenue or free cash flow or gross profit. The other ones were, generally speaking, all about one of those inputs I mentioned, like low prices, and selection, and the speed of the customer experience.

(01:00:03):
So, yes, the point was, we took it as an article of faith that if we can just improve these inputs, the outputs will take care of themselves. The inputs are the things that drive the outputs, which are revenue, customer activity, free cash flow. And so one of Amazon's... It's not really a secret, but one of Amazon's great strengths is [inaudible 01:00:28] focus on those things, make continuous improvement on each one of them, and measure them rigorously.

Lenny (01:00:37):
The flywheel, you reminded me. It feels like that's another concept Amazon proliferated through all kinds of companies: everyone's trying to create their own little flywheel, and I imagine everyone has that image of the Amazon flywheel in their head, with the little orange circle in the center and the black arrows. On the topic of input metrics, just briefly, what is an example of a good input metric? Because I imagine people listening are like, "Oh, shit. I've got to think about my metrics as inputs and outputs now." What's a sign that something is a good input metric?

Bill Carr (01:01:02):
A sign that it's a good input metric... first of all, map your end-to-end customer experience. I never worked at Airbnb, but, okay, step one is that they clicked on some ad somewhere and showed up on the website or in the app. Now you're in the app, looking at the first screen. Well, the first thing they're doing is browsing and/or searching. Okay. How are we measuring the speed, quality, and ease of that browsing and searching? Now they've gotten onto a detail page for an individual property. How are we measuring the speed, ease, and quality of the different actions they may take, like reserve... Forgive me if I get any of my terminology wrong. I'm not an Airbnb-

Lenny (01:01:47):
You are, but it doesn't matter. It's close enough.

Bill Carr (01:01:51):
So then you've reserved. Now you have interactions with a property owner. How do I measure the quality of those? How many messages go back and forth? Is a lot of messages a good thing? Is it a bad thing? At first, you may not know the answer to that question. Same thing at every step of the way. Then there's the actual rental experience. How do I instrument and measure every part of the customer experience? So you know it's an input metric if it is measuring something with respect to the customer experience. Which ones are the right metrics, which ones are the most causal to the outputs, I couldn't begin to tell you; that's actually what you're getting paid for. You work at Airbnb to figure that out, basically through an iterative process of measuring, observing, improving, and looking at what the effect is on your outputs.
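To make that step-by-step mapping concrete, here is a minimal sketch of what instrumenting input metrics per journey step might look like. The step names, metric names, and values are hypothetical illustrations, not anything Bill describes Amazon or Airbnb actually using.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InputMetric:
    # One measurable, controllable aspect of the customer experience
    name: str
    value: float

@dataclass
class FunnelStep:
    # One step of the mapped end-to-end customer experience
    name: str
    input_metrics: List[InputMetric] = field(default_factory=list)

# Hypothetical journey steps, metric names, and values, for illustration only
journey = [
    FunnelStep("landing", [InputMetric("page_load_p90_ms", 1350.0)]),
    FunnelStep("search",  [InputMetric("search_latency_p90_ms", 420.0),
                           InputMetric("zero_result_rate", 0.06)]),
    FunnelStep("detail",  [InputMetric("detail_page_load_p90_ms", 980.0)]),
    FunnelStep("reserve", [InputMetric("booking_error_rate", 0.012)]),
]

# Output metrics are the downstream results you cannot move directly
outputs = {"weekly_bookings": 18_200, "revenue_usd": 2_450_000}

for step in journey:
    for metric in step.input_metrics:
        print(f"{step.name:>8} | {metric.name:<24} = {metric.value}")
```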

(01:02:49):
So, again, we didn't really create this concept. This is a concept from Six Sigma, which uses DMAIC: I have a process, there's an output of this process, but the inputs are a black box to me. So how do I understand those inputs? Well, DMAIC stands for define... Oh, boy. Define, measure... The A is going to come back to me in a minute. Improve and control. And I'm going to have to... Oh, gosh. The A is lost. I've lost it for a second here. But-

Lenny (01:03:26):
Oh, here it is. I'm looking at... Define, measure, analyze, improve-

Bill Carr (01:03:30):
And analyze. Thank you. Yeah, duh, analyze. So we just used that process. And by the way, the way we thought about it at first was, "Well, you need to throw a lot of things at the wall. You don't really know which of these things are going to be the most causal." So how do you know you're looking at an input metric? First, do you control it? Meaning, can you apply resources to make this thing better or worse? Second, does it touch customers? It doesn't always have to touch customers, but if it is affecting the customer experience, it almost certainly is an input. And then, in which ways are you going to measure that input? You need to try more than one way, because, again, we tell a story in the book about one of our most important input metrics, which was how much selection we have, and we were actually not measuring it right for several years. We had to refine that measurement.
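Here is a rough sketch of the "analyze" part of that loop, assuming you already log weekly values for a few candidate input metrics and one output metric. The metric names and numbers are made up for illustration, and correlation over time is only a starting point for judging which inputs are most causal, not proof of causation.

```python
from statistics import mean

def pearson(xs, ys):
    # Pearson correlation between two equal-length weekly series
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly history for a few candidate input metrics (made-up numbers)
candidate_inputs = {
    "in_stock_selection_pct": [91, 92, 93, 95, 96, 97],
    "search_latency_p90_ms":  [520, 510, 495, 470, 455, 440],
    "emails_sent_per_user":   [2, 4, 3, 5, 2, 4],
}
# One output metric over the same weeks
weekly_orders = [100, 104, 109, 116, 121, 127]

# The "analyze" step: rank candidates by how strongly they track the output.
# "Improve" and "control" come later, once you pick a metric, move it
# deliberately, and watch whether the output follows.
ranked = sorted(
    ((name, pearson(series, weekly_orders)) for name, series in candidate_inputs.items()),
    key=lambda pair: abs(pair[1]),
    reverse=True,
)
for name, r in ranked:
    print(f"{name:<24} r = {r:+.2f}")
```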

Lenny (01:04:23):
So I don't know if you saw this, but I asked on Twitter what questions I should ask you and told people you were coming on. And something that came up a bunch is that with working backwards, obviously some products Amazon has launched have not worked out. The Fire Phone is a classic example. What have you learned from that process of just like, "Okay, [inaudible 01:04:42] won't work out"? Also knowing many things are not going to work out, there's no way to really [inaudible 01:04:46].

Bill Carr (01:04:47):
Yes. So the one important thing to share is that none of these tools described in the book that Amazon uses, whether it's documents and meetings or the PRFAQ process or input metrics, gives you the answer. They are tools to help you make decisions. So sometimes you're going to make the wrong decision. The Fire Phone is a great example that comes up often; people ask, "Well, if you've got this great PRFAQ process, how did you get the Fire Phone?" I was tangential to the Fire Phone team, though I worked with it closely, and different people have different opinions, so I'll just share my opinion. Think about, again, how does the PRFAQ process work? Well, there's a customer problem.

(01:05:36):
Well, what was the problem that the Fire Phone was seeking to solve for customers? I would argue this is a case where we made the mistake of having a technology solution in mind, which was the 3D effects, and then we took that solution and went in search of a problem. I don't think it solved any meaningful problems for customers. And candidly, we had to build a version of the music application and the Prime Video application for this phone, and I couldn't figure out how this 3D part would make it better for customers to discover, watch, or play back any of this media. Maybe there were games where it could have been a great solution, I don't know. But I think the simplest place to go when you see a failed product is to ask yourself, what problem did it solve? I could get into all kinds of other examples outside of Amazon too, but 9 times out of 10, I think that's where it is: if it wasn't poor execution, if the product was executed correctly, then what was wrong with the concept of the product?

Lenny (01:06:48):
I imagine there was a lot of disagreeing and committing on that concentric circle process. Is there anything you've found, like the number of disagree-and-commits as a PRFAQ gets filtered, that tells you maybe this is not a good idea?

Bill Carr (01:07:03):
Not necessarily. So I'll tell you, part of why the Fire Phone happened was, from my point of view, that we had had a number of successful products where, in some cases, there were a lot of people who doubted whether they would work. A lot of people inside Amazon doubted that the Kindle was going to be a good idea. I remember contentious board meetings on this topic. So even within a company that was considered innovative, you would have a lot of people who doubted things. I can tell you that for years, working on Prime Video, I would tell people about our vision of you watching on your TV set and that we were going to have our own motion picture studio. We would make our own movies and TV shows. And they would laugh at me. They thought that was crazy. So that's not necessarily a sign of whether the product is right or wrong, and that's actually a problem, because it makes it harder to know.

Lenny (01:08:02):
Yeah. And I think something Amazon's incredibly good at is being okay with a lot of failures, and I think that's part of the reason there's been so much innovation. Is that true?

Bill Carr (01:08:11):
I'd say it's partially true. I mean, again, it's hard for me to do a compare and contrast with other companies. But I can tell you, did we have a lot of things that we launched that failed? Yes. Some of them are very public and obvious. I'll give you one that people don't really realize. We had a feature in the early 2000s called Slots, and what it was was basically that third parties could bid on different search terms and put a little ad in there.

Lenny (01:08:11):
Sounds familiar.

Bill Carr (01:08:42):
Well, obviously, that works now on Amazon, but it didn't work then because we simply didn't have the scale that Amazon has today. So a lot of times with a product idea, a perfectly good idea, you just have the wrong time or the technology isn't there. I mean, Jeff wrote about a product that was a puck that sat in your kitchen that you would talk to and ask for things and could shop from. He wrote about that in 2004. Well, the technology wasn't there to be able to create that little puck, which one day would become Echo. It was a decade away. But we had a lot of things we launched that failed. We were not afraid to take what we considered a well-calculated risk. I think many, many companies are less willing to do so, less committed to product innovation. They do fear failure, and they're really focused on their near-term financial goals. It's not their fault. It's the way a public company and Wall Street interact with each other that creates this dynamic.

Lenny (01:09:55):
Just to pull on that thread a little bit more. It feels like a lot of companies talk about, "We're okay failing. We're okay launching things that don't work," but then in practice, people's performance reviews are impacted, teams get shut down, budgets get pulled. Is there something you recommend to companies that want to actually get better at this? What could they actually change to do this well?

Bill Carr (01:10:16):
Yeah, I just spoke with a senior executive at a well-known Silicon Valley company about this topic the other day, and the question was, "Well, what is it we had structurally at Amazon, especially from a people point of view, that would enable or encourage people to take these risks?" Because, yes, in a lot of companies, if you go work on the project that fails, then your career is in the garbage can, and/or, in your compensation system, you're going to lose out on that bonus. So there were two things. One was our compensation system. There were no performance bonuses. So if I was running the book business and I had a killer year from a financial point of view, there was no extra kicker for me. And if I ran the book business and it had a bad year, there was no financial penalty for me either, because our compensation was based on the stock price.

(01:11:12):
So we all had an incentive to do what was right for the company over the long term, frankly, because trying to win off of short-term fluctuations on Wall Street is a losing proposition. Which meant that, because I had that situation, I could move off of working on our largest P&L, the book and music and video business, to go work on digital media, where there was zero business yet and it might not work, and my compensation didn't change as a result of that, one way or the other. We also tended to have a performance management system that would change compensation based on evaluating what you actually delivered, more in an input fashion. We cared about the outputs too, but-

Bill Carr (01:12:00):
There are plenty of people that could be in a business that's up and to the right but has nothing to do with them. And so we tried to focus more on, well, what did you actually build and contribute, the ways you improved selection or lowered prices, or whatever that might be. So the compensation piece mattered a lot. And then the second thing was having a CEO who was really committed to it, and it wasn't something that they delegated to someone else.

(01:12:31):
So Safi Bahcall, I think, writes about this in his book Loonshots, where part of the conditions that are necessary for innovation to occur are that you actually create different structures of decision-making, of approvals, of all kinds of things, if you create some team that's going to go build something new and innovative. Because most of the structures inside a big company are designed to crush and impede a small innovative team that's trying to go build something new. They need speed, but approval here, approval there, it's going to get in their way.

(01:13:10):
We solved that in two ways. One was that when we went to go build digital media and AWS, we put two of our smartest leaders in the company on those things, Steve Kessel and Andy Jassy. And number two, they were meeting with Jeff regularly. Jeff was deeply engaged with them, reviewing what we were going to go build and being part of the decision about what we were going to go build. And so, between their seniority and of course him being the CEO, they could run interference on these sorts of things too. So even if you want to have innovation, even if you really do crave it and you're willing to take the risk, if you don't set up the organization in the right way, you're just not going to get it.

Lenny (01:13:52):
Amazing. I'm glad we got into that, I wasn't planning to talk about that and I'm glad we did. Final topic, this concept of Bar Raisers, it feels like it's been such a core way of allowing Amazon to scale successfully, and I think that's something a lot of people can implement, it's a very one-off thing you could just implement at your company. Can you just talk about what this idea of a Bar Raiser is in the hiring process and then what people can do if they wanted to add this to their hiring process?

Bill Carr (01:14:19):
The Bar Raiser hiring process was actually one of the first processes that was established and published, pretty early in the company's history, back in 1999. And we created it for a simple reason. To quote one senior leader at Amazon, "We had new people hiring new people hiring new people." We were in our hyper-growth phase, okay? The company was only, what, three, four years old, and we were growing like a weed at that point.

(01:14:47):
So this started off actually in our tech org, and what our senior leaders in tech realized is, my gosh, we hire some new engineering leader, and then the next thing is that their job is to go hire the senior managers, and they'll go hire managers. And all these people have been here for a week, so they don't really even know our company yet, they don't know our culture yet, they don't know our standards yet. So what information are they using to make these hires? Obviously, they were just using their own personal judgment, combined with whatever criteria they used at the prior companies they worked for. So let's say they came over from Microsoft; if Microsoft had some methodology or criteria, they probably would just apply that.

(01:15:38):
Well, is that methodology or criteria relevant to our company? Because every company has a different culture, and I'm here to tell you that if someone has been a super successful vice president at Microsoft, it does not mean they'll be as successful at Amazon or at Google or Facebook. Sometimes they can be, but these companies are very different; they all work very differently. The way leadership happens and decisions are made is very different. So how do we fix this problem, other than letting it run rampant and basically hiring a bunch of people where we don't know if they fit our culture and we don't know if they meet the high standards we have for what we expect of engineering leaders or engineers?

(01:16:21):
So they created this Bar Raiser process, which by the way, they borrowed from Microsoft, which had a process called As Appropriate. And the concept was that on every interview loop there's one person, who is not the hiring manager, who doesn't report to the hiring manager, who's not the recruiting manager, they're in the business, they're a software development manager, or they're a marketing manager, and they are on the interview loop and they're a Bar Raiser, which means when we get to the debrief meeting, they will run that meeting, not the hiring manager, not the recruiter, they will run the meeting. And it also means that they technically have veto power over the hiring manager, which, by the way, a good Bar Raiser never uses, or I never saw a Bar Raiser use. I was a Bar Raiser, and in my 15 years at Amazon I never used it, never saw it used.

(01:17:14):
And then finally, which actually was not true in 1999 but later became true, once we established our leadership principles, we created a set of objective criteria and an interview methodology that would be used in every interview: the objective criteria were our leadership principles, and the methodology was behavioral-based interviewing.

(01:17:33):
So this Bar Raiser basically would be a subject matter expert on how this process worked. They'd conduct the debrief to make sure that we were actually adhering to the process, that people were sticking to the objective criteria rather than saying, "I don't think we should hire this person because, I don't know, they don't seem to want to work here enough." Maybe that's a valid reason, but it's actually not part of our objective criteria. And so the Bar Raiser was also there to act as a balance on the urgency bias that every hiring manager has, which is, "I've got to fill these roles." Rather than filling them with the next warm body they find, the Bar Raiser makes sure they fill them with people who actually fit our culture and meet our standards for functional excellence too.

Lenny (01:18:21):
Such a cool process. Two questions along these lines, one is who has the final decision in hiring, is it the hiring manager?

Bill Carr (01:18:27):
Yes.

Lenny (01:18:28):
And they're just acting on advice from the Bar Raiser?

Bill Carr (01:18:31):
Yeah, so this often gets confused. The decision maker is the hiring manager, the whole interview loop and the Bar Raiser are actually just there to help the hiring manager make the right decision. Now oftentimes the hiring manager could feel like this is actually a bureaucratic process and a group of people that I have to sell and they're just in my way between me and hiring this person, which is kind of a natural feeling to have.

(01:18:54):
But one piece of feedback I would always give managers who were new to this is: no, no, no, that's not the way to think about it. Think of these people as helping you, because the amount of time you're going to put into the hiring process may seem like a lot, but if you hire the wrong person, boy, the amount of time you're going to have to spend managing that person is going to be a lot more, plus the impact on the team and the impact on you. So making a great decision here is important; they're here to help you.

(01:19:20):
So yes, the final decision is with the hiring manager. Technically speaking, the Bar Raiser could block a decision to hire someone, but done well, they would instead help the hiring manager see the reasons not to hire the person through a Socratic method and the way they guide the discussion.

Lenny (01:19:40):
And then when you're choosing a Bar Raiser, are there any suggestions you have for who to choose and how often to pull them into these things? Because it could also be a huge time suck.

Bill Carr (01:19:49):
It is a huge time suck; sometimes it could be up to 10 hours of my week spent as a Bar Raiser. As for the selection process, as a company, I would recommend, if you wanted to do this, that you pick a department to pilot it with. Pick people who, A, care a lot about your hiring process, B, appear to be good interviewers, and C, seem to have high standards. It's also a great role for people who are earlier in their career, because this added responsibility gives them a leadership opportunity and is a great way to grow and develop leaders. You have to train them properly and you have to have dedication to the process, but I generally would try to pilot it within one group at first.

Lenny (01:20:41):
One last question before we get to our very exciting lightning round. Many people are listening to this, they're considering implementing some of these things, trying to figure out how to actually make these real. If someone were trying to move along the path of becoming more Amazonian, which of these elements and processes do you think often has the most impact? And/or is there something fundamental that needs to change to allow for some change like this to happen at a company, in your experience?

Bill Carr (01:21:10):
Yeah, good question. The first thing I'd say is that one thing to be careful of is, a lot of times when I'm talking to a company about these processes, they say, "Well, does this mean we need to turn into Amazon?" And the first thing I tell them is, well, I couldn't turn you into Amazon if I wanted to, because you have your own culture. And secondly, no, the idea is not for you to try to become Amazon. The purpose is to look at these processes and best practices and consider adopting parts or all of them into your organization. Every company of a certain scale has these same processes, so this is just a different way to do them. You should have scalable, repeatable processes for each of these areas, and this is one choice of how to do them.

(01:21:58):
The other piece of advice I give is that a lot of these changes are relatively profound, they really require buy-in all the way up to the CEO, if you're really going to change the way you do product development, or if you're really going to change the way you do hiring, that probably requires buy-in of the CEO, and so I would seek to get that probably before I would move too fast. Some of these things, though, can be piloted in your own little group, like your one little product development group. You want to decide you want to start writing PR FAQs, you probably can decide to do that. But again, try to check with your leadership.

(01:22:32):
The other thing I would just tell you is that for any of these processes, these in our book, or any book, implementing a new process is not easy. And if you go into it lightly and dip your toes into it and try it out, it's probably not going to work for you, because it'll be hard at first, and it requires some level of commitment to actually work through that hard part and say, I'm really committed to doing this, and it will take a few months for you to get good at it. So you have to have commitment and discipline to get through it. Anyone can really do these things, it just requires commitment and discipline.

Lenny (01:23:10):
And in our chat we've basically just scratched the surface of a lot of these things, if people want to dig deeper there's obviously your book Working Backwards, which we'll link to in the show notes. I know you also work with companies to implement a lot of these practices. Could you just talk about what it is you can help folks with and then how to potentially engage if they're interested?

Bill Carr (01:23:27):
Sure, great. Yeah, Colin and I, one of the reasons we wrote this book was to pass on what we learned to the next generation of business leaders at scale with a book, but also because we had a passion to work with companies directly, one-to-one. And so we are advisors, consultants, call it what you will, but non-traditional; we don't have a team of people working for us. Each of us works directly with the companies who engage us.

(01:23:55):
And generally speaking, the right kind of company for us to work with, first of all, has to have achieved a certain scale. Companies that are in the product-market fit phase need to focus on getting product-market fit; they probably don't need to focus much on putting in scalable, durable processes. Sure, some of these could definitely be helpful even in that phase, but really these are designed for, "My company has become complex now, I've got multiple product lines, it's well over $100 million in annual run rate, growing fast, complex." So most of our clients are either large private companies well past Series C, or public companies. And in most cases, a C-level leader, or the CEO themselves, has read our book, recognizes that they have a lot of the same problems that we had at Amazon, looks at these as useful solutions, and wants us to help them implement them.

(01:24:56):
So we tend to first go in and do an assessment of how they do things today, because to help people move from one place to another we have to understand where they are. Then we come up with a prioritized list, along with the CEO and C-level leaders, of what would be most useful: what are the symptoms and problems you're having, and what are the root-cause solutions that could be found in these processes? Then we prioritize those and come up with a plan to work within the organization to help them implement them. And what's also different is that we're very hands-on, working at all levels of the company, and as we do it, we will be there in the meetings with the teams to help coach and teach them along the way, so that we make sure it actually gets implemented properly and to spec, and they get the outcome they want.

Lenny (01:25:45):
Sounds amazing. How would people engage with you if they wanted to explore this?

Bill Carr (01:25:51):
Simple way is you can just send an email, I'm bill@workingbackwards.com, and Colin is colin@workingbackwards.com. You can also just check out our website, www.workingbackwards.com. We have some information there, we have a contact us form, those would be the best ways.

Lenny (01:26:05):
Okay. Well with that we've reached our very exciting lightning round. I've got six questions for you, are you ready?

Bill Carr (01:26:11):
I'll try.

Lenny (01:26:12):
Interestingly, as I look through the list, many of them relate to using Amazon, which is pretty funny. The first is, what are two or three books that you've recommended most to other people?

Bill Carr (01:26:23):
So I'd say in the management world, not surprisingly, Good to Great. I'd say Drucker on management, or Drucker's The Effective Executive. And then the other one I'd say that's a little bit different is the Steve Jobs biography. I never worked at Apple, but looking at that arc, a lot of the way those things worked was not that different from what I experienced at Amazon, so it's a good window into what it's like to be inside a tech company that goes through product innovation and big growth. On a personal basis, recent favorites would be Seveneves by Neal Stephenson and A Gentleman in Moscow.

Lenny (01:27:05):
Amazing. Can you get them all on Amazon?

Bill Carr (01:27:07):
Yes.

Lenny (01:27:07):
Another Amazon related question potentially is do you have a favorite recent movie or TV show? Might be on Prime, might not be.

Bill Carr (01:27:15):
Yeah, my favorite recent movie is the latest Dune movie, and I can't wait for the new one to come out.

Lenny (01:27:20):
When is that coming out? It seems like I've been waiting a long time.

Bill Carr (01:27:22):
I think it's supposed to come out next month. I used to have to know the answer to this question, but I don't anymore. But I anxiously await the next one; I thought the last one was awesome. I even liked the original Dune movie, so I'm probably unusual that way. And my wife and I just enjoyed watching the TV series A Spy Among Friends, which was on MGM+.

Lenny (01:27:50):
MGM+, I have not even heard of that.

Bill Carr (01:27:52):
I had not actually heard of it either.

Lenny (01:27:54):
Another one to subscribe to.

Bill Carr (01:27:55):
But you can basically go onto Prime Video and you can find this show and you can subscribe to MGM+ through that.

Lenny (01:28:01):
Thank you, Prime. What is a favorite product you've recently discovered that you really like, maybe that you bought on Amazon, maybe not?

Bill Carr (01:28:08):
This one I did not buy on Amazon, and this one is, most of you may not understand this one, but I'm an avid cyclist and I got myself a new set of wheels for my road bike this year, actually my road bike and my gravel bike. It's the Zipp 303 Firecrest, the latest model, and boy, these are just fantastic wheels. They're light, they're sturdy, they absorb all the bumps well, I can use them on a road bike, I can use them on a gravel bike, awesome wheels.

Lenny (01:28:35):
Wow, that might be the most obscure random product that we've had yet. Recently we had a humidifier, so I like this collection of products we're building here. Create a wishlist on Amazon maybe.

Bill Carr (01:28:47):
Nice.

Lenny (01:28:47):
Do you have a favorite interview question that you really like to ask?

Bill Carr (01:28:52):
Yeah, it's actually quite basic: tell me about your most significant professional accomplishment. And I always have to clarify this, by this I mean not some award you won or some promotion you got, I mean something you built, some product, some process, some organization, something like that. Once they get into that example, I can ask a lot of probing and follow-up questions, and I could fill an entire hour-long interview just sticking with this one example, using the STAR method, to really understand everything from the situation to the result and everything they did in between: who they influenced, how they influenced them, what decisions they had to make, what roadblocks they encountered. If I just pull on that one string, I can learn a whole lot about a candidate.

Lenny (01:29:46):
Next question, what's a favorite life motto that you often find yourself coming back to, sharing with friends, that you find useful?

Bill Carr (01:29:55):
Well, one that I end up coming back to a lot professionally, and somewhat personally, is "slow is smooth and smooth is fast." I believe the origins of this one are actually from the Marine Corps Scout Snipers, so I'm not trying to promote that particular craft, but the point of it, and we did this a lot at Amazon, is that oftentimes, to really go fast, you actually need to go slow first and be very clear on what you're doing and where you want to go. Most people confuse speed with velocity, and the difference between the two is that velocity has both speed and a vector to it, meaning there's some specific destination. I see a lot of people who are going very, very fast, but the destination isn't very clear; they haven't really thought that out well. So slow is smooth, smooth is fast.

Lenny (01:30:57):
There's a similar quote that I've always thought of: you've got to go slow to go fast. And I always thought it was Stephen Covey, but I just Googled as you were chatting, and it's someone named Peter Senge in the book The Fifth Discipline.

Bill Carr (01:31:11):
It took me a while; I always wanted to go fast first rather than go slow first. So in terms of my personal development and growth, it's a big thing that I learned from Jeff at Amazon.

Lenny (01:31:25):
Final question, I don't know if you'll have an answer to this, but is there a pro-tip that you could suggest for using Amazon? Something that people may not know about how to get the most out of using amazon.com?

Bill Carr (01:31:36):
Sorry, I have no secret insights. There's not something like if you go on Monday mornings at this time the prices are lower, or something, or in stock is better. No, I know of no such thing.

Lenny (01:31:49):
Which I think is great, because it's built exactly as it should be for customers.

Bill Carr (01:31:54):
I guess so. But it used to be, maybe in the days of the slower internet, that I would tell you to go at non-peak hours, but this isn't really an issue anymore.

Lenny (01:32:06):
That'd be wild if that was still an issue. Bill, this was everything I hoped it would be, thank you so much for being here. You already shared where folks can find you online, so I'll skip that question. So final question is, how can listeners be useful to you?

Bill Carr (01:32:19):
We're always looking for feedback. You can post a review on Amazon for our book, that's probably the best way. You can fill out the contact us form or send us an email telling us what you found most useful in the book, or what you actually found is missing, what would you like to learn more about? If we were to write another book, or write more, what would you like us to tell you about?

Lenny (01:32:46):
Amazing, and that's bill@workingbackwards.com if they have that feedback. Go buy the book Working Backwards on Amazon and other places, and workingbackwards.com to learn more. Bill, thank you again so much for being here.

Bill Carr (01:32:59):
Thanks so much, Lenny, really enjoyed it.

Lenny (01:33:00):
Me too, bye everyone.

(01:33:04):
Thank you so much for listening. If you found this valuable, you can subscribe to the show on Apple Podcasts, Spotify, or your favorite podcast app. Also, please consider giving us a rating or leaving a review as that really helps other listeners find the podcast. You can find all past episodes or learn more about the show at Lennyspodcast.com. See you in the next episode.