Airbnb runs about 700 experiments in a given week. Here, I chat with data scientist Lindsay Pettingill about the culture and principles that allow her team to move so quickly and efficiently, and how others can transpose these ideas to their own work.


00:00:00 One great thing about Airbnb is that it sort of disaggregates the volume of travel traffic. Traditionally you go to New York, and where do you stay? In Manhattan, because that's where the hotels are. But one of the things that Airbnb enables, and I think this is a great thing about the product, is that it breaks that sort of monopoly that certain locations or districts have on travel and lodging, and it brings economic effects outside of that core. That really drew me to the company, and then of course I read about their data team, and I was lucky enough to find a place on it.
00:00:47 Growth hacking: it's one of those terms that is either inspiring or incredibly maddening for marketing professionals. Cross-functional teams building, testing, and iterating on acquisition initiatives at a rapid tempo was made famous by the usual suspects: Facebook, Dropbox, Pinterest, Airbnb, and now many others. One of the earliest and most talked-about growth hacks: in 2010, Airbnb users could syndicate their listings to Craigslist, piggybacking off of that website's massive user base, even though at the time there was no sanctioned way from Craigslist to do so. This was huge for Airbnb, an app still vying for sustainable user growth at the time. But make no mistake, growth isn't sustained off the back of one initiative like that.
00:01:46 Airbnb's growth comes not only from its product but also from its rigorous approach to experimentation. I sat down with Lindsay Pettingill, a data scientist at Airbnb, to learn more about this culture of experimentation at Airbnb and how other people can transpose these ideas at their own companies. This is Louder Than Words. Enjoy.
00:02:09 Today is a red-letter day for me, because last night I booked my first-ever Airbnb vacation, so it's going to be an awesome experience. I wonder if I went through any of the 700 experiments that you guys are running hypotheses on. But the first thing I wanted to ask you, just for some background: how did you find your way, I guess, into the growth industry? Growth is one of those functions that everybody talks about so much, and Airbnb is one of the organizations people point to,
00:03:09 with the Craigslist integration being one of the early growth hacks. But yeah, how did you find your way, I guess, into the growth team at Airbnb? It's been a wild ride, and very circuitous. So, my background: I went to a little liberal arts school in the Northeast, and when I got there I really felt like I had found my home. I didn't identify as a social scientist then, but it was very clear that I was intrigued by the way that social scientists understood the world. I put that on hold for a little bit after graduating, went to Germany for a bit on a Fulbright, and ended up at Harvard, working with a cognitive psychologist on some of the early research on social media, on how social media shapes kids' identities and senses of self, and I was really intrigued
00:04:09 by the possibility of being involved in more social research. I wasn't quite into data at that point. I went to Georgetown, got my Ph.D. over many, too many, years, and I got really into data when I was in grad school. In fact, I would have to say I was just distracted by data. I would ostensibly be working on a survey project, but I would get really into the nuance of it, and actually the technical aspects of the data, like: I'm doing this thing, but how can I do it better? How can I take a spin on the data and look at it a different way? And at some point I just realized, wait a minute, if I'm being distracted by data and not actually finishing my dissertation in a reasonable amount of time, maybe I should take that as a sign. And, you know, life happens. Life is just wild, and at least for me it's been really unpredictable, and
00:05:09 I wanted to do a startup, and I had heard the name of some fellowship that could help me get funding. So I looked for funding and ended up at this page for a program called Insight Data Science, and it was one of those moments where I was like, someone's playing with me here, this can't be for real. What Insight Data Science is, is a program to bridge the gap between academia and industry. In particular, it takes PhDs through a seven-week fellowship where you build a data product and then demo it at companies in the Valley. So I applied once and I didn't get in, and then all of a sudden my life changed, and I was writing Python, which I hadn't done before, building a web app, and interviewing at all these companies. Airbnb was one of them, and it was one that was particularly appealing
00:06:09 because of its mission. At that point I had used the product and I loved it, and I had seen some of their research at the time. One great thing about Airbnb is that it sort of disaggregates the volume of travel traffic. Traditionally you go to New York, and where do you stay? In Manhattan, because that's where the hotels are. But one of the things that Airbnb enables, and I think this is a great thing about the product, is that it breaks that sort of monopoly that certain locations or districts have on travel and lodging, and it brings economic effects outside of that core. That really drew me to the company, and then of course I read about their data team, and I was lucky enough to find a place on it. So that's the, I guess, longer short story of how I ended up at Airbnb. The growth team was just one of the teams that I chose at the time. I mean, Airbnb is still
00:07:09 growing, and growing pretty expeditiously, pretty rapidly, and so it was really appealing to me to have a seat at the table and really understand that growth, and more so, influence it. Which brings me to your article on making experimentation count. Just search for that on Medium, for all the listeners; it's part of Airbnb's publication on Medium. It's an outstanding article, especially if you're trying to establish a culture of experimentation at your company. It's a really great framework, obviously from an organization that seems to have it down pretty well. And you mentioned in the article that when you first got there, they were running something like a hundred experiments a week, and now, two years later, you're up around seven hundred a week, which is just hard for me to wrap my head around. I know how hard it is to
00:08:09 create a culture throughout the organization around experimentation. Building things on the engineering side that might only be used for two weeks and then never again, that's a hard sell to some engineers. Or if you're a finance team or a support team trying to keep track of all these different experiences being executed, and which ones users are supposed to see, that could be a headache. So it's really hard to cultivate that culture of experimentation, especially at that level. You talk about this in the article, about shaping a robust culture around experimentation across all functions. That's really hard work. From what you've seen in your time there, how did your team go about doing that?
00:08:56 Yeah, this is a good question, and I appreciate that you picked up on it. I mean, it's always a work in progress. When I started over two years ago, we had really just scaled out our experimentation framework, which allows experimentation to be run at scale. Of course, there was an adoption curve. It was like, hey, we have this great tool, use it, trust us, as a framework for your experiments. And there were definitely some hurdles that we confronted, in the sense that things don't always work, you uncover bugs, just like any product. An experimentation framework is an internal product, and there were bugs in the product, and people would say, you're telling me to do these things, but what's coming out the other end is crap. And so a lot of the early work was on, okay,
00:09:52 how do we get people to use this, how do we get people to trust it, just like any other product? So we paid a lot of attention to the pain points that people had. Was it in setting up an experiment? Was it in defining the metrics with which they could measure their experiment? We have a very devoted small team that builds this framework, and they did a great job of really listening to our internal users, to get a sense of why the power users are using the product and why others aren't. But in addition to that, I think one of the biggest, smartest things that we did, that we have going for us, is that across the company we really try to create an environment that is collaborative, that's open, that's really conducive to questions. Because you can't just unleash something as powerful as an experimentation framework. The engineering behind it is quite complicated, but even the
00:10:51 understanding of what happens within the framework, and what is produced for you, in terms of the UIs where your experiments are presented, in terms of metrics and things, you can't just hand it over to people and expect them to know what to do with it. That's something that we realized very early on. We hire very smart people, but taking an advanced course in statistics is not a part of our interview process, right, for business partners, for PMs. And so we really have tried to double down on creating the right environment, so that if people don't know how to use the tool, they know how to ask how to use it. There's been a lot of effort put into documentation around it: here is the tool, here's how you use it. Not just for engineers, but for data scientists who are supporting experiments, and product managers, so they know how to support their engineers when they're setting up experiments. We have done a lot of things like that;
00:11:51 more recently, we started basically a rotation system whereby data scientists partner with engineers for a week on a Slack channel, and they're just there to answer questions. A question could be: I want to run an experiment and I'm not quite sure how, can you walk me through it? Or: I set up this experiment and it looks really wacky, what are the things that could have gone wrong? And that takes a lot of effort, servicing users in that way. It's like CX for experimentation, customer support for experimentation. But it has really, really helped us, both in ramping people up on the tools to run experiments and in really supporting them as they use those tools. So we've gotten to a place now, which is really exciting, where it's basically radical to want to launch something without testing,
00:12:51 which is amazing. We didn't imagine that years ago. And also, it's really hard in this environment to get away with BS, and that's basically due to the way that we educate around the tool. You can't just say, hey, I ran this experiment and I moved this metric. You're going to get really hard questions, like: you might have moved the metric, but is there any real impact? And I think all of that, again, comes back to us really creating an environment where we're
00:13:23 providing people with lots of education and support around experimentation. We're not just saying, do this. We're saying, we want you to do this, and we're going to help you do it, and hopefully you'll learn a lot in the process. It sounds like, especially with the experiment reporting framework, that internal tool you guys have, it enables anybody, really, to run an experiment. How does that work? Who owns, I guess, the experimentation process? Is that your team? The tool itself is owned by the engineers who support it, but any data scientist, any engineer who wants to run an experiment, any product manager has full access to the tool and all of its capabilities, and they have access to change it if they're running an experiment and need more functionality, depending on the complexity of it.
00:14:23 So, for instance, a common thing is my team will run an experiment. I work on internationalization, so we'll run an experiment, and midway through I'll realize there could be something fascinating going on here for, say, hosts who have multiple listings. Maybe where this feature is really working is on that segment, or dimension, of people. And mid-experiment, I can then create that dimension. Our pipeline is very conducive to that. It's very open, it's democratic. Anyone can use it, and anyone can break it, and that happens all the time, but that's okay, because that's how you learn. So yeah, by giving anyone who wants it full access, again, we're enabling its use. So it's pretty cool. It also assumes a level of discipline, right, to be able to know what's worth testing. Again, this is a big part of your article: being hypothesis-driven rather than feature-driven. It's really easy,
00:15:23 right, to say, I think this feature would really increase engagement in this area, or would increase adoption, or whatever it is. And obviously, at an organization like Airbnb, you guys have the engineering resources to realize just about any idea somebody has. But just because there is an idea, or there is a feature, doesn't mean it should be built, right? So how do you guys navigate that? Yeah, it's definitely a hard one. So, as a social scientist: in the social sciences, at least, experimentation is really recent. It was not until maybe the past ten years that you got social scientists who were starting to run experiments, and a lot of this, of course, is due to the non-trivial issue of the ethics around experimentation in a lot of the work that social scientists do. So it's really welcome, but
00:16:22 it's been a long time coming. Before that, just to give a little bit of context, social scientists were running quasi-experiments at best, but mostly using observational data. And I think what that means is that there's a lot of focus in the social sciences on definitions, on concepts, on hypotheses. Those are really central; a lot of the education around becoming a social scientist is in exactly that. So even though that's not what I call myself professionally, I'm really a social scientist at heart, and that statement, or belief, on my part about hypotheses really comes from that background. And I want to flesh it out a little bit for product people. As you mentioned, those of us in Silicon Valley are familiar with this: engineers are really valued out here, and they
00:17:22 mostly get to do the work that they want, because we really value makers. But I think this is part of what you touched on: we can get distracted by what people can make, as opposed to what should be built. And, you know, it's a little cynical, but being successful out here can sometimes be more about luck than anything else. And I don't want to work somewhere that builds for luck. You need to build on a very solid foundation, and I think part of that success
00:17:59 metric, or whatever, needs to be: is this impactful? Is this feature going to impact something that we think it can change? Depending on the team you're on, that thing will differ; you may be on the same general team, but the area of focus varies. I think hypotheses can really be helpful there, because they force you to ask: is X related to Y, and by changing X, can we move Y? So yeah, it's hard to get into that mindset, but you do it by creating an environment where people are asking questions rather than immediately going to the answer. Because, in a way, a feature, a product, is an answer. The challenge is: are we asking the right question about that answer? And,
00:18:53 you know, for me, it's very sad when you have an environment where people are making things and then
00:18:59 you can't actually ever measure them. Then they feel like they wasted their time, and I never want to be in an environment where people feel like they wasted their time. So process can be bad, but we've tried to create a lot of process around making that easier, making it so that we're all in this together. We want to build a good product, we want to build an informed product, and providing a little bit of structure around building product, a big part of which is hypothesis-driven development, should help. Tell me the story about the new translation service that your team hypothesized about and ran an experiment on, across the web and native apps for Airbnb, and how that worked out. Yeah. Part of this is, as you mentioned, that when you work at a big company, and a fast-moving company, people can build things very quickly. It also means
00:19:59 people can build things that you don't know about, or that you just overlooked, because you're so familiar that you don't notice the things you should be paying attention to. So, one thing we did was we were basically changing the backend service that we were using to power our translations, and we were running this test on both the web and native listings platforms, as you would at most companies with most products. And we saw this really interesting conversion effect, and we were really surprised by it. And again, I keep harping on this, but Airbnb is a place where, when people are surprised by things, we talk about them. It's not, I don't know what's going on, let's just ignore that. Instead it's, hey, did you guys notice this effect that we saw? Is this real? Because it looks really good.
00:20:59 And then we were like, oh, wait a minute, this looks really good because a product change that some other team made made it look really good. On the native apps, there's more structured data in use, and you can translate structured data pretty easily and effectively. So we had been translating structured data, which meant that more people were using a certain feature. It was just a really interesting learning process that would not have been possible had we just said, oh, we're moving this metric and it looks great. Instead it was, we're moving this metric; should we be concerned about anything? And, I don't know, it's always fun to rain on each other's parades; there's always another question to be asked
00:21:59 about lift and things like that. Which kind of takes me into your next point in the article, which was basically ensuring the team has the ability to detect a lift, or a decrease, if there is any. Talk about the different ways you suggested that you guys go about that. I mean, it sounds obvious, but there's a multitude of factors, I would imagine, especially for an organization like Airbnb, with so many different screens, so many different data points, a multitude of factors that can muddy the learning. So how does your team approach level-setting an experiment, so you know what the hell happened?
00:22:51 Yeah, again, these are great questions. I'm glad, for your audience, that they get the answers. Level-setting is hard. As I mentioned a couple of times: process, right? Like,
00:23:10 as an organization grows, at least from my understanding, you never want too much process, but there needs to be a certain amount of process
00:23:21 that just helps structure things. So, I hope that I'm answering this correctly, but a big part of the process we have around this is: if we're going to build something and launch it, there are a couple of things that we always share with each other and with other teams. We literally just have this document with a couple of bullet points. Describe to me the feature. Declare your hypothesis. What platform are you running this on, just because there are lots of gotchas depending on the platform. Define your key metrics and your sanity metrics, you know, the ones that keep us sane. Think about the estimated number of users that you need to make a decision, and how much time you need to achieve the power. Those are the things we try to force people to go through, and those last two are
00:24:21 what you were bringing up, in terms of detecting an effect. And I think that in the maker-driven culture, the feature-driven culture, that a lot of Silicon Valley companies fall into, a lot of people ignore those last two points. You need to be able to tell me how many people are going to see this feature and, as a result, what potential effect you can detect. I really think it's a good counter to, again, the bells and whistles. When you have really great engineers, it's really easy to get distracted by some idea without it necessarily being grounded in research or a hypothesis. So yeah, these points I think are super important, particularly for PMs to hold their engineers to, because, again, everyone loses if you've invested weeks or months
00:25:21 of time into building something that either isn't going to give you enough lift, or that not enough people are going to see. I mean, the math around it is pretty simple, but it's a question I think we often overlook. To parallel that: are you guys ever running more than one experiment or idea in a certain area that's looking to influence one specific metric? What kind of prioritization scale, or basically process, do you guys run through to say, this is the one that we think could have the biggest impact?
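The "simple math" referred to above isn't spelled out in the conversation, but for a typical conversion-rate experiment the last two checklist items (users needed, and time to reach power) reduce to a standard two-proportion power calculation. A minimal sketch, with invented baseline, lift, and traffic numbers for illustration:

```python
import math
from statistics import NormalDist

def users_per_variant(p_base, rel_lift, alpha=0.05, power=0.80):
    """Users needed in each variant to detect a relative lift of
    `rel_lift` on baseline conversion rate `p_base`, using the normal
    approximation for a two-sided test of two proportions."""
    p_test = p_base * (1 + rel_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_power = NormalDist().inv_cdf(power)          # desired power
    variance = p_base * (1 - p_base) + p_test * (1 - p_test)
    n = (z_alpha + z_power) ** 2 * variance / (p_test - p_base) ** 2
    return math.ceil(n)

# Hypothetical: 4% baseline booking rate, hoping to detect a 5% relative lift.
n = users_per_variant(0.04, 0.05)

# If the feature is seen by (a hypothetical) 20,000 eligible users per day,
# split 50/50 across control and treatment, the runtime estimate is:
days_needed = math.ceil(2 * n / 20_000)
```

Small baseline rates and small lifts make `n` explode quickly, which is exactly why a feature that too few people will see may not be worth testing at all.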
00:26:02 Yeah, I mean, the process around that isn't always so consistent; it really varies. At its most simple, it's just two axes, effort and impact, and thinking deeply about where the competing ideas fall on that scale. I mean, as big as we are, we're still resource-constrained. It's always in everyone's best interest to say that they're resource-constrained, to get more resources, but even though we're a big company, we certainly encounter it. And so sometimes it's as simple as, these two ideas are really compelling, but somebody's taking a vacation, so we're going to go with this one. It's true.
00:26:54 And of course, we're assuming here that the ideas have equal merit and that they would take the same amount of time. If you take two feature ideas and one's going to take twice the amount of time, then you'd better convince me that we're going to get twice as much out of it. And I think these questions are hard. No one wants to be told, this is a great idea, but it's not that impactful. But again, that's not a failure; that's a data point. The next time you think about a feature, you take that into consideration, and hopefully it goes into your process. So say you've sized the potential solutions, maybe for five or ten of them, depending how big your team is. At some point it starts to creep into subjective territory, right?
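The effort-versus-impact framing just described can be made concrete with a trivial ranking: score each idea, then sort by impact per unit of effort, which automatically encodes the "twice the time had better mean twice the payoff" rule. A sketch with invented idea names and scores:

```python
# Hypothetical experiment ideas: (name, estimated impact score, effort in weeks).
# All numbers here are invented for illustration.
ideas = [
    ("search ranking tweak", 8, 3.0),
    ("listing copy change",  2, 0.2),
    ("checkout redesign",    9, 8.0),
]

# Rank by impact per week of effort.
ranked = sorted(ideas, key=lambda i: i[1] / i[2], reverse=True)
for name, impact, effort in ranked:
    print(f"{name}: {impact / effort:.2f} impact points per week")
```

The hard part, as the conversation notes, is estimating the two numbers in the first place; the arithmetic only minimizes the subjectivity once those estimates exist.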
00:27:54 All of these are smart hypotheses, they're all based in some sort of reality, and then it's just a matter of who thinks the effort is huge: yours is going to take three weeks, this one's going to take five hours, more or less. So I think just being able to ensure that level of velocity is probably a great way to prioritize, because then you kind of minimize the subjectivity of it. I want to jump in, because I think something that's creeping up a little bit here, and it's certainly something that we confront, is this notion of, oh my god, maybe we chose wrong, we made the incorrect choice, I should have chosen that thing that took three times as long. And I always have to remind myself and my teammates that these are
00:28:54 educated guesses. They're hypothesis-driven, but a hypothesis is still a guess. We have an informed hypothesis, but it might not work out, and that's okay; maybe we made the wrong choice. And I try to bring this up in my blog post: you're always going to fail, if failure means something didn't work out the way that you intended. But we also shouldn't get into a position where we're not doing things because we're afraid we're going to fail.
00:29:25 You're always going to fail, so the question is exactly that: what do you do with it? Sometimes it's because your hypothesis was wrong, and that's a huge learning. If we walk out of an experiment and we say, we made this assumption about users that we weren't even declarative about before, but now we are, now it's on the table; we're going to put it up there and own it. You can hire good data scientists, and you can build an experimentation framework, and you can get it running fine, and it'll be great. But what's much harder is
00:30:09 this attitude, or this culture, where it's like, we're just testing things, and sometimes you're going to win, and that's great, but don't count on it, and that's totally okay.
00:30:23 Yeah, and I think packaging those failures, and you talk about this in your article, is super important, because especially for organizations that may be trying to foster this culture of experimentation, it helps lend credibility to what you're doing. Nobody, especially at the executive or leadership level, wants to hear that everything's going great, that you're only reporting on the good experiments, or only running ones where you're going to see really minimal lifts. Packaging those failures and the subsequent learnings lends credibility to the entire function, and I think it actually increases buy-in across your organization, because you're learning things; then there are successes. So how do you guys package those analyses so that a wider audience internally can learn from them? How do you approach that?
00:31:21 Yeah, well, great question. First, a lot of this, at least at Airbnb, from my perspective, comes from the sort of culture that we create. And something that I'm really passionate about, really a fierce advocate of, is: don't be embarrassed. Again, I talked about process earlier, and too much process can be bad, but I do think that you need to create process around humility. I think, unfortunately, we live in a world where humility isn't always fostered, and I'll just leave it at that. But it is really important to me to work on teams, and create teams, where you're humble about your successes and you're very reflective about your non-successes, or failures, and you're not embarrassed by them; you turn them into learning.
00:32:21 If you're not doing that, you're missing out, and I think this applies to life in general. So what does that mean in terms of actual tactical things? On the growth team that I'm on, we have a couple of things. We have experiment review, a biweekly thing where we write to all of our product teams and say, experiment review is coming up, who has experiments they want to share? And we've been doing this for almost the full time that I've been here. At the beginning, those were all about wins. It was, hey, my team did this, look how good it is. And that's great; you certainly want to know the things that people did that worked, so that you can maybe apply them to your own product areas. But about a year in, we were like, wait a minute, this is great, but are we learning anything from this? Are we learning anything by creating an environment where people are encouraged to share
00:33:21 just their successes? So we did a revamp of it, really a rebranding of it, and we said, we're going to keep doing this, but we're going to change it up a little bit. What we're particularly looking for is lessons: tell us what you did wrong. Because how do you become a great engineer? You scale your work; you make it so that other engineers are working more efficiently. It's the same thing with knowledge. I want to work at a place where knowledge is scaled and shared, not siloed. And so these experiment reviews became that place. That said, it's something you still have to actively foster: if you just say, come to experiment review and tell us a story, everyone will tell you their successes. So we really had to actively say, we need someone to talk about a failure here. And it's amazing, because when people talk about the failures, everyone's taking notes, because they're thinking, well, I'm glad it happened to you and not to me,
00:34:21 because I can learn from your mistakes. So experiment review has been really good. Another side of that is post-mortems. You know, I come from academia, where we don't talk about our failures at all, so I was really struck, coming into an engineering culture, by how public it is when something goes wrong at Airbnb. It's public in that, again, you take the opportunity to share with other people what happened, how it happened, and how you and others can avoid it in the future. Those things have been very powerful. Within the data science team in particular, we started this thing called the Failure Awards, where every quarter or so we give awards to people who have messed up. And it usually becomes something that you laugh about, right? Sometimes you do things and they're more painful, and you need a little distance from them before
00:35:21 you can share them with other people. And again, what people are really listening for is what you did wrong; they're listening so they don't do it themselves. And of course, we have to avoid the perverse incentives; I don't want someone to do something really bad just so they can get an award. But this is one way that we create that culture: just by reminding people that, hey, this is funny, and we're going to laugh, and we're also going to praise this person for being brave enough to teach us all. And I think that's
00:36:01 really important, always going back to what an experiment can uncover in the process. Yeah, you know, whether they're executives or whoever the figureheads are, they may say, we don't celebrate failure, but when you don't do that, what happens is people hide failures, and then nobody learns. And I think, as you said, especially in tech, in an engineering culture, there's much more openness, like, I want you all to understand why this happened, which I think is great. So thanks for sharing those. And I keep citing you, because it's all based on your article, so sorry,
00:37:01 but it's a great article, and it's really easy to put it in question form. There's a comment from the article that I want to end with, because I think it's a common question and very actionable for listeners to take with them. It's very basic: coming up with experiments. This particular reader asked, can you expand on how you come up with your experiments, and the role that user research plays? So you can answer that question specifically, but for any listener out there who has either done a few, whether they're just A/B tests, or wants to get into more experimentation and start creating that culture, because that's a continuous process: what are some things people can do to start coming up with ideas? Where do they look? Where can they find the inspiration to make smart decisions on things to test?
00:38:01 Yeah, I mean, I think it's really hard. Coming from a data scientist, maybe this is predictable, but the first thing that comes to mind is: just look at your data. If you're running experiments without data, that makes me a little uncomfortable. As a data scientist, I'm looking into the data, and I'm always approaching data skeptically. I have colleagues who are great, and they're very skeptical, and that means that whenever I come to them and say, check this out, this is what's happening, they'll put a twist on an assumption that I had. So I think starting with data is really important, but also being in conversation with others about your data, because,
00:39:00 again, as you were saying, in Silicon Valley we talk about things, and in engineering cultures you talk about failure so that you can scale your learnings. We should also be evangelizing our thoughts. The degree to which you do this varies, but I came from academia, where we don't do enough of that. Part of the way you succeed in, or advance, an academic career is
00:39:35 sitting alone and thinking, and if that works for some people, fine; I wasn't that great at it. I think one of the things that I really value here is that I get to have really great conversations, and so many of the conversations I have about our product, about our data, are hypothesis-generating. So yeah, I'd say that's super important: understand your data, or try to understand your data, and talk to other people about your data. And again, data can take different forms. I alluded to data on your user behavior, but depending on the product that you're working on, depending on the stage of your growth, depending on the company that you work at, user research is an amazing way to derive hypotheses, and
00:40:31 we have a great researcher on our team whom I work really closely with, and I wouldn't say either one of us has a monopoly on hypotheses. We're always bouncing ideas off of each other, and they're given equal weight. So, I don't know, it seems simple, but: talk about your product, think about your product, talk about it, and
00:41:01 I think the sort of creative juices start to flow.

Disclaimer: The podcast and artwork embedded on this page are from John Bonini, which is the property of its owner and not affiliated with or endorsed by Listen Notes, Inc.

