Here's Your Handbook to Growth Engineering (w/ Ram Navan, Airtable)
Matt Bilotti: Hello, and welcome to another episode of the Growth podcast. I am your host, Matt Bilotti, and I am super excited today to dig into a topic we haven't touched in a while, which is all around growth engineering. And we have Ram Navan, who was previously head of growth engineering at Opendoor, and is currently leading the pricing and monetization growth teams at Airtable. Ram, thank you so much for joining.
Ram Navan: Well, thanks for having me, Matt. Excited to be here, and looking forward to talking about growth eng.
Matt Bilotti: Absolutely. So as I mentioned, we haven't covered growth engineering on the podcast for a while. We are going to dig into all the things that you need to know about building a growth engineering team, working with growth engineers, and everything related. Metrics, tooling, all that fun stuff. So Ram, why don't you give a quick background on yourself, and then we'll go ahead and dive into the topic.
Ram Navan: Yeah, absolutely. As Matt said, my name is Ram. I currently work at Airtable, which is productivity software that enables people to build their own software and workflows for any of their use cases, without having to code or purchase another tool. And I recently joined Airtable, actually. I'm in my sixth week there, and I support the pricing and monetization engineering team. Before this current gig, I was leading the growth engineering org at Opendoor, which is a real estate tech company that enables people to sell and buy houses with a click of a button. I helped restart the growth engineering org there, and grew it from a few engineers to three teams. And during that time, the growth org managed to crush the growth targets set by the business, though there were tailwinds that helped with that. The team put in a fantastic effort as well, and that's one of the highlights of my career. Before Opendoor, I held engineering leadership roles at various property tech companies, and before that, IC roles at various startups. That's a bit about myself.
Matt Bilotti: Love it. So you've covered the whole board here, from IC to leader, classic engineer to growth engineer. How about we start with some context setting, from your perspective, around the difference between classic, quote-unquote "regular engineering," and growth engineering? What are some of the key differences?
Ram Navan: Yeah. I've done both, and I've spent a good amount of time doing both. When I first started in my growth role, I learned very quickly that there is a fundamental distinction between how you think about traditional product engineering teams and growth engineering teams. And that distinction comes from what product teams and growth teams tend to do. Product teams are responsible for creating value for customers. They build new features to meet a customer need or use case, to reach feature parity with the competition, or to bring completely new capabilities to customers. That's essentially what product teams do: create value for customers. Growth teams, on the other hand, have a primary responsibility of connecting the customer to value. Not necessarily creating net new value, but efficiently connecting that value to customers is what growth teams fundamentally do. And I like this framing from Andrew Chen, who's the author of The Cold Start Problem, was previously head of growth at Uber, and is now an a16z partner. His framing is that growth teams are responsible for getting the largest percentage of the target audience to experience core product value as quickly as possible. I think that's a very elegant framing. So these distinctions between product teams and growth teams exist, and they fundamentally call for a different way of thinking about the skillset needed for the roles. Who to hire for, and what technical and non-technical skillsets to look for. What is the incentive structure for the team? What are the process variations for the team? It calls for carefully thinking through the distinctions in those areas as well, but fundamentally, that's how I see these two teams, and the nature of those two teams.
Matt Bilotti: Awesome. So you mentioned a couple of things there towards the end that I would love to dig in on. One is around who to hire, what to look for, when you're thinking about what makes someone a good fit for growth engineering versus product engineering. What is the line there, skillset-wise?
Ram Navan: Yeah. I think generally the skillsets are broadly applicable across all engineering functions, but what to really index on and optimize for is actually a little bit different. For example, start with an engineer's motivation. An engineer can be motivated by a lot of different factors. They could be motivated by the type of work they do, the complexity of the work, the technology choices involved in the work. They could be motivated by a variety of factors. The one distinction when it comes to growth engineering is what's ideal. And I know I'm generalizing to a large extent; there are obviously going to be edge cases and exceptions to the norm. But in general, ideally you would want engineers to be primarily motivated by business impact. That is what is most beneficial for growth. The reason is that there is a variety of things you could do that make business impact, but not all of them necessarily involve the technology you want to work in. They may not have the complexity you want to work on. Sometimes it is a few lines of code, or tweaking an email platform integration, or having an idea and executing it in two weeks that brings big business impact. So a good growth engineer's primary motivation is impact, beyond everything else. That is something I would certainly encourage people to index on. Then secondly, having full stack capability is really beneficial for a growth engineer, to be able to drive impact. The reason is that it really expands the scope of influence for the engineer, as opposed to the ability to only work in front end or backend. The ability to span front end, middleware, and backend, the ability to be proficient in extracting data, analyzing data, and essentially operating across the layers.
It really increases the scope of impact, because generally you won't know which lever you want to pull to be able to make a business impact. So having all those options available as an engineer is great. And maybe the third thing, in terms of general things to index on, is their ability to take initiative. Are they motivated to take initiative? It is commonly applicable to all engineering roles, but within growth engineering, there are so many different things you can try. We really cannot afford to wait for instructions or direction to come in. Many times, you would want that leadership presence to be with every engineer on the team, where they help set the direction and priorities for the team. So taking initiative, and being able to quickly adapt to situations, is important. And then finally, in terms of hiring, if you really want to bring in growth expertise, depending on the business and your existing skillset, you ought to ask, "Okay, what other growth-specific skills should I be looking for?" Things like: what is their knowledge of experimentation, and experimentation platforms? What is their knowledge of growth models? What is their knowledge of the tools you use for performance marketing, and integrating with third-party providers like email and customer support tools? All of that plays a part, but it's a case-by-case basis, depending on what skills you already have, what you need, and what stage your team is in.
Matt Bilotti: Yeah. So much good stuff in there. Like, the skillset should be broader, because you could wind up touching any part of the code. The motivation piece, I think, is so critical. Because there are some engineers that are just driven by writing the most code, or getting the thing done.
Ram Navan: Right.
Matt Bilotti: Or removing lines of code, or working on specific scaling problems. It's very, very much not that, in a lot of the cases, in the world of growth. So then how... You touched on this a little bit. Outside of the individuals themselves, from your perspective leading teams, you've led a lot of product teams in the past, and now you're leading more growth engineering teams. Are there explicit differences that you have learned, that you take? Your philosophy is a bit changed in how you run the teams, or how you goal them, or anything of the sort?
Ram Navan: Yeah. In the area of how you run the teams, and what processes you set for them, there are some distinctions. Again, it's not so much that you introduce something radically new; it's about what processes you put in place that make the team successful, that enable the team to make impact, or maximize impact. One clear distinction, or a distinction in emphasis, is the ideation process itself. Growth teams, and again this is a generalized statement with obvious exceptions and edge cases, are generally about testing various ideas and hypotheses to find which ones have the desired impact. And when you find them, you iterate further, so the velocity of this testing and iteration becomes very, very important. That's where the diversity of ideas and hypotheses becomes very crucial. We really cannot scale if we just rely on a product manager to be the source of ideas. You want the source of ideas to be everyone on the team. And this is not even just engineers. It is everyone who is part of the team composition, like design and data. They all should feel like they're responsible for the ideation process. And that doesn't happen by accident, so you have to facilitate it. You have to facilitate the question of, "Okay, how do I maximize the generation of high quality ideas within a team?" And the first way you facilitate it is the democratization of information, which is crucial. We make sure the team has a fundamental and common understanding of the metrics they're responsible for, prior user research studies that have been done, prior data science experiments that have been done, and the results of those experiments. They should be aware of, and have a baseline on: what are the opportunities we have? And what are the levers we can pull to make the metrics move?
All that knowledge should be fundamentally available to everyone, so you create processes that facilitate that. That generally increases, over time, the quality and diversity of the ideas that get generated. Along the same lines: prioritization. There are so many different ideas. How do you decide which one to start first, or prioritize first? There are many frameworks available that are broadly applicable. The one I have found to be very effective is ICE prioritization. It stands for Impact, Confidence, Ease. Again, a pretty popular framework. You assign a score for each of these categories, and then you calculate the score for an idea. But an important thing to notice is that there is no perfect process for prioritization. All of these tools exist to help mostly with relative prioritization, not total absolute prioritization; it would be a resource-intensive exercise to get the prioritization perfect. So I think we should be happy with good enough prioritization, and then move on. Even within ICE prioritization, I like a variation from the product manager Itamar Gilad, who has this thing called the Confidence Meter. It basically enables you to give a confidence score based on the quality of the evidence you have that supports the idea. You can say, "I want to improve the conversion on a particular step from X to Y," and the confidence you put in place for that is essentially, "Okay, what quality of evidence do you have?" Is it, say, intuition? In which case your confidence score should be very low. Or is there data supporting your hypothesis? Then the confidence is high. I particularly like that variation, because it removes more subjectivity from the prioritization, and adds more objectivity. Other growth-related processes include things like execution autonomy: enabling teams to make decisions faster.
Sometimes, when you go through a design iteration or migration, there are so many people with opinions and ideas, with the best intention of improving it. But when you have a lot of people involved, it just prolongs the decision making process. So having a process and expectations where it is very clear who the decision maker is, and keeping that group lean to be able to iterate faster, is another process-related thing you would index a little more heavily on, on the growth side of things.
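The ICE scoring with a Confidence Meter that Ram describes can be sketched in a few lines. This is a minimal illustration, assuming a 1-10 scale for impact and ease and illustrative evidence-quality weights in the spirit of Itamar Gilad's Confidence Meter; the exact scales, weights, and idea names below are assumptions, not a standard, and each team would tune its own.

```python
# ICE prioritization: score = impact * confidence * ease, where confidence
# is derived from the quality of the supporting evidence rather than gut feel.
# Scales and evidence weights here are illustrative assumptions.

EVIDENCE_CONFIDENCE = {
    "intuition": 0.1,    # gut feeling only -> very low confidence
    "anecdotes": 0.3,    # a few customer conversations
    "survey": 0.5,       # broader but self-reported evidence
    "usage_data": 0.8,   # observed behavior in product analytics
    "experiment": 1.0,   # a prior A/B test supports the hypothesis
}

def ice_score(impact: float, ease: float, evidence: str) -> float:
    """Impact and ease on a 1-10 scale; confidence from evidence quality."""
    return impact * EVIDENCE_CONFIDENCE[evidence] * ease

ideas = [
    ("Rewrite onboarding copy", ice_score(6, 9, "usage_data")),
    ("New pricing page layout", ice_score(8, 4, "intuition")),
    ("Resend activation email", ice_score(5, 8, "experiment")),
]

# Relative prioritization: sort descending and work from the top.
for name, score in sorted(ideas, key=lambda x: x[1], reverse=True):
    print(f"{score:5.1f}  {name}")
```

Note how the high-impact but intuition-only idea sinks to the bottom: weighting confidence by evidence quality is exactly how this variation trades subjectivity for objectivity.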
Matt Bilotti: Covering a lot of ground here. This is such a good episode, because I think it's like the handbook for growth engineering stuff. This is great. So you touched on it a little bit earlier, when we were talking about the motivations of engineers, that make people a good growth engineer. Let's talk a little bit about the measurement of the teams. Like, a product engineering group is going to be measured a little bit differently, than a growth engineering group.
Ram Navan: Yeah. With measurement, given the growth engineering team... We talked about the motivation, right? The engineers need to be impact driven. So that translates a bit here into how you measure: you have to incentivize impact. The first aspect of how you measure performance is, what is the impact that a particular team is bringing to the table? Given the mandate that growth teams must prioritize impact over everything else, it makes sense to incentivize impact. And one of the fundamental responsibilities of an engineering leader within a growth team is, how do you streamline the path from effort to impact? An engineer has a fantastic idea, and they have great execution skills. How do you streamline the path where they can just take that and make impact? That is one of the responsibilities. So, yeah. Impact is the name of the game. And in addition to that [crosstalk]...
Matt Bilotti: I'm just going to jump in here real quick. When you say impact, are you talking about, "I want to see these engineers move these numbers with the work that they're doing"? Is it a "let's look at the metrics" type of impact?
Ram Navan: Yeah. It depends on the team's goal, and the individual's goal as well. This is a good topic, because we could talk about what a team-level goal should ideally look like. I like a portfolio of work that a team can achieve in, let's say, an arbitrary quarter timeframe. The portfolio involves three components. One is metrics optimization. Let's say there's a team called acquisition. What is the metric this team is responsible for? What metrics do we prioritize for this quarter, and what is the optimization we expect in that metric? That forms one part of the goal. Along with it, we also think about customer experience. Some customer experience goals are not directly correlated with metrics, but indirectly correlated. Think about things like: what is the billing experience the customer is having? What is the invoice experience? Are we losing trust there? Can we improve clarity there? Those things are not very easy to connect directly to a metric, but they're important for the long term. And the third category I would put in the portfolio is big bets: high risk, high reward initiatives. So in any given quarter, we would like a portfolio mix of these three: metrics optimization, customer experience, and big bets. The percentage of allocation varies team to team, depending on the current business climate, where the team is at, and so on. In terms of measuring impact, you ask, "Okay, does the individual have the autonomy to be able to impact this particular metrics optimization goal that we have?" It will be a team effort also. It's not always the case that one engineer moves the metric. But at the team level, we want to grade the team based on the impact they made, because they moved these metrics.
So setting goals for the team is also very critical, because that's the mechanism through which you align the business needs and business goals to team-level goals. So impact, in many cases, translates to, "What is the business impact that we can deliver?" Sometimes the effort is not all realized in the same quarter, but you realize it in the next quarter, or the one after. But the eye on impact is crucial, because that guides the prioritization as well.
Matt Bilotti: Yeah. Makes a ton of sense. Any other notes on measurement, before we change topic, here?
Ram Navan: Oh, the one thing I would also include: we talked about impact, but the other thing is learnings. The truth of the matter is that we try many things to optimize the metric, but many of those efforts actually fail. Out of 10, one or two or three will work. Then you iterate and double down on those, but many other things fail. What is important there is that failure is okay. That is expected. It is by design; it is not a bug. But what is important is, are we learning from those failures? So one thing we also want to measure is learnings. In a given quarter, what do we now know about our customer behavior? When you propose a project, one goal is about improving a metric, but there should always be some goal about what we are going to learn by doing this project. So incentivizing learnings is another area you want to use to measure the team's performance as well.
Matt Bilotti: Cool. So I want to jump into something that I feel like doesn't get covered; I've never covered it on this podcast, and I haven't seen a lot of content around it. I think it's because the volume of people in this world is very low, and they haven't necessarily put anything out there. Let's talk about managing growth engineers' careers. What should people look out for? I think there's probably a mix of people listening here: the current IC who maybe wants to think about how they can grow their own career, or what they should be thinking about in order to get to the management level. And then there are the growth engineering leaders listening to this saying, "What can I take away to help manage my team?" Let's talk through some of that.
Ram Navan: Yeah. A common pattern I have noticed with respect to managing a growth engineer's career is a recurring theme of, "Hey, there is smallish, repetitive work that doesn't give me good scope to continuously challenge myself and push my boundaries." This is actually very true and very real feedback that leaders on the team need to index on and take corrective action about. And if you think about it, part of growth engineering involves minor design revisions, copy changes, changing onboarding flows. These can be repetitive for engineers, and may not present a learning opportunity every time they do the work. To be frank, sometimes it gets boring too, because a lot of the experiments also fail, so it affects the motivation aspect as well. So this is a real problem for many growth teams. And the fundamental way to solve this problem is to first give people space, and make it part of the work portfolio we talked about, to actually build systems to solve this problem. It's essentially systems thinking: "What systems can we build, so all these smallish, repetitive tasks are handled by the system, as opposed to one of the engineers doing them all the time?" An example is copy changes, hero changes, doing A/B testing on copy changes. What is the overhead involved in setting up experimentation? The overhead involved in changing questions in an onboarding flow, or changing assets in an onboarding flow? "Can we build systems, or buy systems, that solve these problems?" For example, there are headless CMSes you could integrate into your product that give you the self-serve ability to change copy, and even do A/B testing. It offloads all that to a product manager or a marketer, as opposed to an engineer.
So the actual effort of finding those systems and integrating them, or making infrastructure changes to make experimentation simpler, is what systems thinking is. And that's usually not a boring, repetitive thing. It actually really motivates people to dig into it. So creating space for that, and encouraging people to think in systems, is really crucial to tackling that common challenge. Another aspect is to have the right balance and mix of engineers: things that are repetitive and smallish for someone may be a good learning opportunity for someone else. Finding the right allocation of tasks so that you keep it interesting for people, and there's a learning opportunity for everyone involved, is also crucial. An example: let's say a backend engineer is in the transition to becoming a full stack engineer, and wants to do a lot of front end. For them, work like changing onboarding questions may be interesting, because it lets them get started on their front end skillset. So yeah, that is another lever we can use to mitigate this particular challenge.
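The self-serve copy idea above can be made concrete with a small sketch. Everything here is hypothetical: the client class, key names, and copy strings stand in for whatever headless CMS SDK a team actually integrates. The pattern is simply that UI copy is resolved from the CMS at render time, per variant, with a shipped default as fallback, so a marketer can change text or run a copy test without an engineer.

```python
# Sketch: resolve UI copy from a headless CMS at render time, with a
# hard-coded fallback. The CMS client, keys, and strings are hypothetical;
# any headless CMS with an API would fill this role.

DEFAULT_COPY = {
    "onboarding.hero.title": "Build your workflow in minutes",
    "onboarding.hero.cta": "Get started free",
}

class FakeCMSClient:
    """Stand-in for a real headless CMS SDK, holding marketer-managed copy."""
    def __init__(self, entries):
        self._entries = entries

    def get(self, key, variant="control"):
        return self._entries.get((key, variant))  # None if unpublished

def resolve_copy(cms, key, variant="control"):
    # Prefer the CMS value (marketer-editable, testable per variant);
    # fall back to the shipped default so a CMS outage never blanks the UI.
    value = cms.get(key, variant)
    return value if value is not None else DEFAULT_COPY[key]

cms = FakeCMSClient({
    ("onboarding.hero.title", "treatment"): "Your team's software, no code",
})

print(resolve_copy(cms, "onboarding.hero.title", "treatment"))  # CMS copy
print(resolve_copy(cms, "onboarding.hero.cta"))                 # fallback
```

The fallback is the design choice that makes this safe to hand to non-engineers: a missing or unpublished entry degrades to the shipped default instead of breaking the page.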
Matt Bilotti: Awesome. Love it. There's a whole bunch more there, I'm sure.
Ram Navan: Oh...
Matt Bilotti: But we got to... Yeah?
Ram Navan: [crosstalk] Now that I think about it, engineering leaders have another responsibility here. Most companies have a career ladder, and they'll have one engineering career ladder. And typically, across my last three or four companies, I see a pattern: the career ladder calls out impact, it calls out the craft aspect, the complexity of the work and how large the project is, and it calls out collaboration. It touches on those areas and sets expectations on what we expect from a junior engineer, a senior engineer, a staff engineer, and so on. I see that many of these career ladders have a bias towards the complexity of the work and how large the project is. This may be largely applicable for many engineers, but it is important for a growth engineering leader to emphasize that we have to prioritize impact over everything else, for some of the reasons we talked about earlier in the podcast, and that is not default knowledge across the company. So it is the responsibility of the growth leader to educate the rest of engineering on this distinction, and place a higher weight on impact, as opposed to, let's say, complexity of the project. Because we are saying the primary responsibility of a growth engineer is to make impact, and it may not always involve complexity, or a huge project.
Matt Bilotti: Yeah. Such a good point, and it makes me think about velocity as a main thing, as well. Right? Maybe it's not necessarily complexity of the growth things that they're doing, but it's the velocity at which they can test and learn. Right? And generate learnings for the team.
Ram Navan: It's a great point. Yeah. Yep. Exactly. Yeah.
Matt Bilotti: Cool. All right. Let's hop to some team structure, and strategy. So we talked about this a little bit, would love to just dig in there a little bit more, if you have other thoughts around how you approach building your teams, and setting them up for success.
Ram Navan: Yeah. For structuring a growth org, there are two popular models, actually, and there are companies that have successfully adopted both: an independent growth model, and a functional growth model. Independent is essentially having the entire growth org outside the product and engineering orgs. You have a head of growth or VP of growth, often reporting to the CEO, and then you have engineers, product managers, designers, everyone as part of that particular org, completely outside of product and engineering. So there's a product org, an engineering org, and then a growth org that also has engineers and product managers. In the functional model, growth is one of the functions within product and engineering. There will be a product org with a head of growth product, and an engineering org with a head of growth engineering, and the engineers and product managers sit within the product vertical and the engineering vertical. I think there are companies that have been successful with both, but I have developed a point of view here: I firmly believe in the functional model. And there are a couple of reasons for that. I think the biggest reason is that being in a functional model, with the engineers and product managers sitting alongside the product and engineering org, streamlines a lot of crosscutting concerns: how do you hire? How do you calibrate? How do you do career planning? How do you do performance reviews? Everything falls under the umbrella of one org, and is commonly applicable to everyone. When growth is completely separate, you can facilitate that, but it is not very natural. And for me, that distinction in itself is enough to bias towards a functional model.
And within this model, depending on the business and the customer acquisition strategy, you can structure the different growth teams in different ways. Commonly, people structure them either around metrics, like an acquisition team, activation team, revenue team, or retention team, or around product areas, like, "Okay, this team takes care of all signups. This team takes care of all onboarding." I have a bias here too, towards structuring around metrics. The reason is that when you structure a team around metrics, it just allows the team a lot more surface area, because their goal is to move the metrics the team is responsible for. What tactics do they want to use? Which area do they want to operate in? They have more freedom, as opposed to being stuck in one product area. And I always bias towards having more surface area to operate in, to be able to move the needle on something. So that's my bias there. Besides the metrics organization or the product surface areas, there is also a platform aspect to growth as the team evolves, driven primarily by the type of business, type of customers, and the acquisition strategy. For example, let's say the business relies on performance marketing for acquiring 90% of its customers. You want to make sure you have a really good customer targeting mechanism, and the tools and technologies for that. You want dedicated focus, because a lot of your spend is going toward performance marketing, so you make sure your tools and processes are efficient, and you have dedicated engineers and product managers thinking about how to improve them. So that informs a little bit how you structure the team as well.
Matt Bilotti: Yeah. Again, a whole lot there. I have seen both the independent model and the functional model at Drift, and I would say the functional model, where the growth team is in the product org, generally worked a bit better. And having clear alignment, like," This team does acquisition. This team does activation." For sure.
Ram Navan: Right.
Matt Bilotti: Okay. So you just talked, zoomed out, about teams. Let's zoom in on a team. What is the structure of the teams, themselves, that you see work well in certain scenarios, versus other scenarios? And the makeup, and all that.
Ram Navan: Yeah. I think, by and large, most growth teams are a mix of engineers and a product manager, design, data, a user researcher, and a marketing partner, either a PMM or a marketing representative. And in some cases, if applicable, QA resources as well. These are all the different functions that make up a growth team. Not all growth teams have all these types of resources, but it is ideal to have this mix. Depending on the stage the company is in and their evolution in growth engineering, not all teams may have representation from all these functions. Besides engineering and product, the representation for the other functions primarily falls into two models: an embedded model, where everyone is dedicated to and part of the team, or an agency model, where there's a centralized design or data team you go to whenever there is a need. Ideally there should always be a need on growth teams, but basically you give them a request, and they take the request and execute it. I have a bias here too, because my fundamental principle is that all these functions should act as leaders in the team. A design partner in the team should be one of the leaders of that growth team. It is not always just the product manager or the engineering manager; a data scientist who is part of the team should act as one of the leaders of the team. I'm saying "should," because it's not the default in many cases, so you have to set this culture explicitly. What having all these functions act as leaders enables is that, rather than defaulting into the mode of "Okay, what work needs to be done? The product manager gives me the work and I do the work," it shifts to, "Hey, we as a team, this is our goal. This is what we should be doing." That mindset shift is very, very important. And I believe the agency model doesn't facilitate that mindset, because they have competing priorities.
Different tasks come in, and they start to index on, "Okay, let me accomplish this task," as opposed to thinking like an owner. So that's essentially the makeup of growth teams: design, data, user research, a marketing representative. Again, there are going to be edge cases, but generally, many growth teams have this mix of functions.
Matt Bilotti: Amazing. Again, I feel like there's so much to like dive into on all these, but we only have so much time here.
Ram Navan: Okay.
Matt Bilotti: One thing that we were talking about before we hopped on the recording here, that I wanted to give you a chance to touch on, even if we don't go too deep, is tooling. What types of tools can growth engineers expect to use? You were talking a little bit to me about building versus buying a tool or platform to assist the teams. Can you touch on that for a bit?
Ram Navan: Yeah. We usually talk about full stack, but generally the theme is the ability to go across multiple surface areas. Most companies typically have a front end stack and a backend stack that is pretty standardized now, so knowledge of that is crucial. But in addition, I think what really makes a growth engineer more impactful is understanding all the peripheral tools involved in growth engineering. Events tracking: how do we track events? Where do events end up? How do I effectively self-serve on understanding customer events? How do we do email? What are the email workflows and campaigns? What tools do we use? What is our customer data platform? How do we segment audiences? How do we target those audiences? Things like the experimentation platform and feature flagging systems, because a lot of the things you roll out will be experimental in nature. Not all product teams have to have knowledge of all these things, but for a growth engineer to be really effective, having an understanding of all of this makes a huge impact. And you asked about build versus buy. That is such an interesting topic, but I won't go too deep into it. All companies, all teams face this trade-off: "Okay, we need this capability. Do we build or buy?" And there is a certain pride aspect involved in building. I have fallen into this trap many, many times, and I have learned a lot from it. To an extent, and this is going to be a gross generalization: for most companies, building anything that is outside of their core competency, or their core business model, is always a bad idea. Let's say there's a company that needs to move money for transactions, from X to Y. Is moving money your core competency, or not? Likely not. So you should really not invest resources in building a money movement system.
Same thing with events tracking. "Yeah, we want to track user events, and we want to have a system." But is that our core competency? If the answer is no, then in most cases, buying a license is the right answer. What we often fail to see is that building is not just a one-time fixed cost. The real cost is maintaining that system: upgrading it for regulation changes, improving security, improving performance, and fixing bugs over time. That is where the real cost is, and for most companies, that is not the right cost to pay. There are exceptions here, of course. If you're an Amazon and you want a talent management system, you would probably build it, because you are hiring hundreds of thousands of people every year, and paying usage-based pricing per person would actually cost more than building and maintaining such software. So there are exceptions, but for most companies, I believe if it is not your core competency, buy is the right answer.
Matt Bilotti: Yeah. I've been in that trap as well. It's tricky. All right, as we come towards the close here, there's one more big topic I want to dig into, which is prioritization. We loosely talked about prioritization a little bit. There are probably some people listening who aren't necessarily on the product team or engineers. Maybe they're folks on the marketing team, or designers, or PMs. How do you think about the growth team, and the growth engineers, prioritizing asks from different stakeholders and working with those stakeholders?
Ram Navan: Yeah. The fact of the matter is that there are always more things to do than available resources. One of the common pitfalls is not being ruthless in prioritization, and instead trying to peanut-butter-spread resources to satisfy everyone and make them happy. So ruthless prioritization is very, very crucial. That means saying no to a lot of things, and that is okay. But at the same time, you want to make sure that people feel heard. For the people you are saying no to, you need to carve a path: set some timelines and guidelines for what conditions need to be present to be able to prioritize those asks. A few things you could do. One, be very transparent. A team or product line should clearly call out what is the focus area and what is not, and, if you had more resources, what you would additionally focus on. Clearly call that out, write it down, and circulate it not just within your team, but across all your stakeholders. Two, involve all the stakeholders. Even though you know that you cannot prioritize everything, involve them in the planning process. Have them see what you see in terms of what is important, and why you are prioritizing certain things versus not. That transparency and inclusive brainstorming and planning process is very, very important. And the third thing is, sometimes it is also about enabling people to self-serve. Let's say you are on one of the product teams, not the growth team, and you want to change the onboarding flow, so you're taking a dependency on the growth team to go and change it. Can you enable others to self-serve? Can you build the system, and have documentation, in a way that others can serve themselves? We touched upon other examples. A marketer wants to change images. Why should they rely on an engineer to do that?
Can you build a system so they can self-serve? That solves the tension over time, obviously. But the transparency and inclusiveness in planning and brainstorming, those are very key.
Matt Bilotti: Yeah. I think all of those are super, super important points. I love the core philosophy that, over time, you want to make sure you empower the other teams to have access to stuff, so you don't become a bottleneck. Other teams can take some of their ideas and put them into reality without necessarily having to go through you. Cool. Ram, it feels like every bucket of topics that we talked through could have been a podcast on its own. Is there anything else that is burning on your mind that you want to touch on or close with, before we go ahead and wrap here?
Ram Navan: Maybe one thing. I think a lot of your audience is outside of engineering and needs engineering help to be more efficient, or to meet their goals and targets. The one thing that I find helpful, and this is also not a quick win, but over time, is this: rather than going to engineering or product and asking, "Hey, these are the four things that need to get done," form a story and theme, a long-term charter: "This is our vision, and this is how technology can help solve that problem over time." Taking that long-term view really helps engineering in planning: "Okay, maybe we need to spin up a new initiative, because there is value in investing in it. There is good ROI. It seems like we should invest over a long time." It helps them in planning and resource allocation over time. When there are one-off asks like, "This is the P0 thing, can you do this one and two?" they'll just optimize locally. And it's also hard for engineering teams, because they want a story behind it, a theme behind it, for long-term prioritization and planning. So the one thing that will be helpful in working with engineering is to collaborate with them to form a long-term view and a long-term plan, with a vision and theme around it, as opposed to one-off asks.
Matt Bilotti: Awesome. Ram, thank you again so much, for joining here on the podcast.
Ram Navan: Oh, my pleasure. Thank you so much. Appreciate the time.
Matt Bilotti: Absolutely. Yeah, learned a ton. We covered so much. I think this is one of those pillar episodes of the podcast, where if someone asks, "What should I know about growth engineering?" this is the one. So thank you again, really, really appreciate it. If you're listening and you like this episode, there are almost 90 other episodes with amazing guests and experts on all sorts of topics around growth. So check those out, and hit the subscribe button to catch all the next episodes. If you're spending your time listening here, I super appreciate it. I know there are so many things you could work on, listen to, watch, or do, and you're spending it listening to this, and I am appreciative. If you're a fan, a five-star review and a written review go a long way on the podcast apps. Otherwise, I think we'll call that a wrap, and we'll catch you on the next episode. Thanks.
What is "growth" engineering, and how does it different from "classic" engineering?
If you ask Ram Navan, who leads the pricing and monetization growth teams at Airtable, "classic" engineers create value for their customers, while "growth" engineers connect customers to the value of their product.
Whew, that's a lot.
Which is why, in this episode of Growth, Ram explains to Matt all the details that make up a successful growth engineering team -- from finding the team's purpose, to hiring growth engineers, to measurement and prioritization, and more.
Think of it as your handbook to growth engineering.
- (0:56) Who is Ram Navan?
- (2:19) The key differences between “classic engineering” and growth engineering
- (4:30) What characteristics make up a good growth engineer?
- (8:55) How Ram thinks about a growth engineering team’s philosophy and goals
- (13:52) How Ram measures his team
- (18:37) What a growth engineer career path looks like
- (24:14) How to build and set up a growth engineering team for success
- (28:13) How to set up growth engineering teams to conquer different scenarios
- (34:42) How a growth engineering team should prioritize asks from different stakeholders
- (37:52) Why engineering teams should take a long-term view when planning
Like this episode? Leave a review!