How Atlassian mastered its land-and-expand strategy with experimentation

Patrick Thompson


Co-founder and CEO



Episode Summary

Today on the show we have Patrick Thompson, co-founder and CEO of Iteratively.

In this episode, we talked about why Patrick made the leap to start his own business, how 200 interviews over six months of customer discovery helped them find the biggest data-related pain point software teams have, and how Iteratively helps solve it.

We also discussed what it was like to work on the growth team at Atlassian in the early days, the differences Patrick sees in picking and prioritising quick wins vs long-term bets at a later-stage company vs an early-stage startup, and how partnerships and tool integrations can help increase long-term retention.

Mentioned Resources



Why Patrick made the leap to start his own business 00:01:60
People don't trust data: six months of customer discovery and over 200 interviews to find the biggest data-related pain point software teams have 00:03:14
How Iteratively solves this problem 00:07:58
What it was like to work on the growth team at Atlassian in the early days 00:11:34
Picking and prioritising quick wins vs long-term bets at a later-stage company vs an early-stage startup 00:14:06
How partnerships and tool integrations can help increase long-term retention 00:18:58
How Atlassian tripled the number of invitees creating an account via experimentation 00:24:19
One thing Patrick thought he learned at Atlassian that has proven not to be true since starting his own business 00:33:46
How Patrick would try to turn things around on churn and retention in the first 90 days at a new company 00:34:55
One thing Patrick knows today that he wishes he knew when he started tackling churn and retention 00:37:55


Andrew Michael: [00:00:00] Hey, Patrick. Welcome to the show.

Patrick Thompson: [00:00:03] Hey, Andrew. Thanks for having me.

Andrew Michael: [00:00:05] For the listeners: Patrick is the CEO and co-founder of Iteratively, a smart tracking plan that allows you to capture customer data you trust. Prior to founding Iteratively, Patrick was a design manager at Atlassian, where he led a team of designers and the relaunch of Jira Software, and also worked on the growth team focused on improving acquisition, retention, and virality.

So my first question for you, Patrick, is: what made you take the leap to start your own business? And why did you pick the problem you've chosen to solve?

Patrick Thompson: [00:00:33] Yeah, Andrew, great question. To give a bit of background, I've always wanted to go work on my own startup, and I had the opportunity to go start a company with a really good friend who I'd worked with previously, at the end of 2018 going into 2019.

So it was just a good time for me. I was living in Sydney and decided to pull the trigger and move back to Seattle. To give you a bit of background on how we actually started: we didn't know what problem we wanted [00:01:00] to solve off the bat. We knew who we wanted to solve problems for, which was other software teams.

So Ondrej and I decided to spend as long as it took to understand the pain points and problems that these teams face. We spent about six months doing customer discovery, and the number one problem, across over 200 interviews at over a hundred companies, was companies not trusting the customer data they had captured.

So that's what we set out to go solve.

Andrew Michael: [00:01:27] Very cool. And so, 200 interviews, that's quite intense. What did that process look like, to get to the point where you found the problem you're working on? Did you have any core thesis behind what you were looking for, and did you try to hone in on that with any specific personas you were speaking to? What was the methodology that got you to this point?

Patrick Thompson: [00:01:46] Yeah, definitely. We followed a couple of different frameworks. One of them is the Focus framework by Justin Wilcox, who is one of our advisors at Iteratively, and the other is Steve Blank's typical lean customer development methodology.

We decided we wanted to [00:02:00] interview product managers, engineers, data analysts, and growth people to understand the pain points that really kept them from achieving success. Each interview had a template specifically trying to understand what success looked like for the person within their company, what was keeping them from achieving that success, and whether they were looking for existing solutions in the market, or what duct-tape solutions they had in place to help them succeed. Overall, we ended up seeing a few themes emerge from this relatively quickly. Throughout this entire process, we were a hundred percent dedicated to interviews and synthesis of both primary and secondary research.

So we didn't code, we didn't do any design work; our full-time job was just talking to people, which, generally speaking, was extremely enjoyable, and that's pretty much how I spent most of my time. And this process isn't over for us: we're continually doing customer development and research, and it's core to how we build product at Iteratively.

Andrew Michael: [00:02:55] That's awesome. I think that's actually what I'm doing right now here as well, a bit of [00:03:00] customer research; it's one of the purposes of the podcast. But interesting as well. So you landed on a problem: people don't trust the data, and that's one of their biggest pain points. What would you say, in your experience, have been some of the biggest issues you've uncovered? Obviously, I think one of them would be the product you're building, but what are some of the common themes you noticed in the lack of trust in data?

Patrick Thompson: [00:03:23] Yeah. Generally, when trust came up as a concern, it came up relatively broadly in the context of different types of data. We had a lot of folks talking about trust in their Salesforce data, or trust in the data inside their data warehouse, or things breaking, or not being able to access the data, or privacy concerns with the data they had and who had access to it.

Data silos were a big issue. And we learned a lot about data observability and data cataloging, and how a lot of tools and teams are trying to apply good engineering practices, [00:04:00] SDLC practices, to data today. Tools like dbt and Snowplow Analytics are emerging and helping spearhead a lot of this, and tools like Iteratively as well.

When we looked at the broad market at what was really keeping folks up at night, we thought, generally speaking, that the problems were really broad. And for us, building a company, you want to make sure you have a really narrow beachhead that you can actually go effectively address.

So our first primary focus was: what was the type of data that we actually wanted to help teams manage, help them wrangle and take some sort of control over? For us, that was really clickstream data, so user telemetry. And the reason we decided on this is we saw a lot of companies using tools like Segment or mParticle, or the Heaps or Mixpanels or Amplitudes of the world, and regardless of what tool they ended up using, all these teams really [00:05:00] struggled with defining, instrumenting, and actually verifying the data that they had.

And because this was such a common problem in the interviews that we did, and the level of pain was so deep, and there wasn't really anybody solving it out there today (a lot of companies were building homegrown solutions, using spreadsheets or Confluence pages), we thought of this as a relatively greenfield space that we could go after, one that would give us a foothold into the broader analytics market.

Andrew Michael: [00:05:28] Yeah, absolutely. I can attest that was similarly a challenge at Hotjar, where I work now. One of the things we actually did, about 18 months ago now, was just overhaul our whole analytics stack, start fresh, and put a good tracking plan in place,

just because it got to the point where nobody had trust in the data. We were adding events left and right, with no practices put in place and no data governance set up from the beginning. So you do tend to get that big mess where, at the end of the day, everybody's seeing all [00:06:00] these different event names, and some events are duplicated or just spelled incorrectly. At some point it just became a big mess, and I can definitely see the pain that you're trying to solve here.

We ultimately went with Segment, and Segment themselves, I think, have Protocols, which, if I understand your product correctly, is part of what you offer. But maybe you want to let the audience know a little bit more about what Iteratively really is and how you help customers then get trust out of their data.

Patrick Thompson: [00:06:30] Yeah, definitely. To touch back on that, I think Brian Balfour says it best when he calls it the data death spiral: once you have one paper cut around trust within your organization, these things compound, and eventually no one ends up using or looking at the data, and for all decisions folks are falling back on their gut, which, as we know, is not a great thing for an organization.

To talk a little bit more about how Iteratively as a product works: we actually integrate with your tools. We're a schema registry, so [00:07:00] think about Iteratively as GitHub for your analytics. You can go into our tool, instead of using a spreadsheet, to define all of your analytics. You use Iteratively to define the metrics and events and properties that you care about as a business. And then we have a developer toolkit that the engineers on your team use, which really helps do two things. One, it helps ensure that the data you're capturing is correct: all of the schemas defined inside our tool become strongly typed

events in an SDK that we code-generate for your development team. Two, we help make analytics testable: we integrate into your existing unit tests and integration tests to take what was traditionally untested code and make it tested, and we integrate into CI/CD.

It really helps avoid a lot of the problems where you refactor some code and accidentally drop an event, or things change over time. There's an entire workflow behind it, from PMs and analysts and data scientists defining the events the business cares about, to the engineers instrumenting them. And lastly, we really close the loop and report back on all of [00:08:00] the inconsistencies and errors detected, to give you a health report on your overall tracking. We integrate with tools like Mixpanel, Amplitude, and Segment to sync the schemas back and forth, so the schemas that are defined inside of Iteratively get uploaded to tools like Segment and Mixpanel and Amplitude, and all that context is shared across teams.
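The workflow described here (schemas defined centrally, strongly typed events for engineers, analytics asserted in ordinary unit tests) can be sketched roughly as follows. This is a hypothetical illustration, not Iteratively's actual SDK: the event name, fields, and `track` function are invented for the example.

```python
from dataclasses import dataclass

# Stand-in for an analytics destination (Segment, Mixpanel, etc.).
captured_events = []

@dataclass
class SignupCompleted:
    """Schema "Signup Completed" -- field names and types enforced in code.
    In a schema-registry workflow this class would be code-generated."""
    user_id: str
    plan: str

    ALLOWED_PLANS = ("free", "pro", "enterprise")  # not a dataclass field

    def __post_init__(self):
        # Validate at capture time: bad data fails loudly in tests/CI
        # instead of silently polluting the warehouse.
        if not self.user_id:
            raise ValueError("user_id is required")
        if self.plan not in self.ALLOWED_PLANS:
            raise ValueError(f"plan must be one of {self.ALLOWED_PLANS}")

def track(event):
    """Send a validated event to the analytics pipeline (stubbed here)."""
    captured_events.append({"event": type(event).__name__, **event.__dict__})

# Because events are plain typed objects, an ordinary unit test can assert
# that a code path emitted exactly the analytics it was supposed to.
track(SignupCompleted(user_id="u_123", plan="pro"))
```

A test suite would then assert on `captured_events`, turning analytics regressions (a dropped event, a renamed property) into ordinary test failures.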

In the context of Segment Protocols, they're solving a very similar problem to Iteratively, and you have the same sort of thing with Amplitude Taxonomy and Mixpanel Lexicon and mParticle Data Master; this is a big problem that all organizations are facing. From our perspective, we're very much an agnostic tool. We're very much focused on the collaboration and the workflow and the versioning behind schemas, whereas for most of these tools it's very much an add-on. For us, the companies we're trying to help are very much focused on data quality as a thing outside of any one particular tool, agnostic of whether they're sending the data to their data [00:09:00] warehouse or to Segment or to Mixpanel directly.

Andrew Michael: [00:09:04] Yep, it makes a lot of sense. And you can definitely see that it's not an add-on product. Just going through the site and looking at what you've been building, I can see it's quite a lot more progressed than some of the other solutions I've seen on the market specifically solving this problem, which is a big pain point when growing and scaling. And also not being a blocker: I think for engineering teams, the worst thing you want is having people wait to get tracking approved so they can actually run experiments and build new products. Very interesting.

Then let's talk a little bit about this. So, and correct me if I'm wrong, you were at Atlassian, and then you decided to leave, not sure what you were going to work on yet, but you really dug into the customer interviews and figured it out. Now let's rewind a little bit, going back to the Atlassian days. I think we were chatting just before the show: you were also working with the previous guest, Shaun Clowes, at the time, [00:10:00] at Atlassian in the growth team. What was that like in the early days, working in the growth team, and at that time, what was your role?

Patrick Thompson: [00:10:06] Yeah, definitely. I joined Atlassian as a designer, and shortly thereafter I ended up getting promoted to design manager on growth. I think the best way to recap my experience at Atlassian: it was probably the fastest and most impactful learning opportunity I've had outside of being a co-founder. It was probably the best company I've worked for, and I learned a ton from working with Shaun and, generally, the broader product growth teams there. There are some folks there who I highly respect, and Atlassian is a great company that has done some amazing things.

And I'm glad to have had a small part along that product journey. My time on growth was great. I am very analytically focused, and I'm very happy to be working at an analytics startup at this point.

Andrew Michael: [00:10:51] So you're starting an analytics startup.

Patrick Thompson: [00:10:52] Yeah, it's a good fit, good founder fit.

And when working on growth, I think it was very much aligned to my [00:11:00] skills as well. I really was focusing on: what is the customer value that we're delivering, and how are we not just optimizing the core experience, specifically related to improving virality and retention and reducing churn, but what are some of the big rocks, strategically, that we want to be working on, things that will have long-term impact that you can't necessarily measure in the form of a simple A/B experiment? I really liked the process behind working on growth at Atlassian, and some of the big bets that we took are now starting to have a compounding impact for the business.

Andrew Michael: [00:11:39] Yeah, this is a very interesting topic. We discussed it recently with Angel Steger, who's a director of design at Facebook currently and worked previously at Pinterest and Dropbox. One of the things that came up was this concept of balancing between quick wins and long-term bets, and what's [00:12:00] measurable and what's not,

and also making sure you're always protecting the end-user experience as a whole and not optimizing for short-term gains. I think this typically comes when you get to the scale and size of these types of companies, like Atlassian. But maybe take us through this a little bit in the context of a growth team,

because the initiatives you work on at a stage like Atlassian's versus at an early-stage, Series A startup, and your experiments and growth programs, are going to look slightly different, if not completely different. What are some of the types of work you're talking about now in terms of these bigger bets, and what would be some of the shorter-term wins you would focus on?

Patrick Thompson: [00:12:38] Yeah, I can give some context out of Atlassian specifically, and how we thought about the short-term bets and the big rocks. When we think about Jira as a product, we spent a lot of time in the early days of growth trying to optimize the experience for new users, to get them not just activated but retained on the product.

And over [00:13:00] time, obviously, you get some sort of diminishing returns, where it comes down to local maxima. So we decided that, in order to have an outsized impact, we should double down on improving the overall product experience, not just focusing on optimizations but rethinking Jira from the ground up, which is what led us to spend years and years actually investing in redoing Jira Software, now known as next-gen Jira. It's a complete step change in the overall experience, which allowed us to build a new foundation on which we could then start to optimize the core experience. And there were other initiatives we had at Atlassian, specifically around virality: making the product more easily shareable and trying to figure out what the entities are that can be shared.

And how do you get folks easily onboarded into the organization? Instead of having everything be an admin permission, changing it so you're actually pushing power down to the folks who need that control, which is typically the team members within [00:14:00] a project or within an organization.

So all these initiatives really ended up layering on top of each other to get an outsized impact on the overall growth of the business. And when you think about how that compares, a very late-stage, post-IPO company versus Series A, it's very similar, but different in context. Even now, admittedly, we're making a lot of small bets to improve the overall experience, we're listening to customer feedback, we're running formalized hypothesis validation,

and we're making a lot of big long-term bets that will hopefully make it easier for us to go from seed-stage company to Series A and Series B. Traditionally, when I talk to other founders at our stage, or one or two stages ahead of us, the hardest problem for them is really trying to understand the growth model and figure out all the inputs and outputs that are going to drive long-term growth. You're very resource-constrained at an early-stage startup versus at a later-stage company; for sure, it's easier to get access to resources, depending on [00:15:00] the company. Atlassian has a favorable culture of being able to pitch initiatives and get funding; at a startup, you're constantly pitching initiatives and trying to get funding.

Andrew Michael: [00:15:09] Yeah.

Patrick Thompson: [00:15:09] The risk-reward scenario is much different, and the appetite for failure is much different as well.

So it's one of those things where my recommendation to folks who want to move fast is: the opportunity for growth is to go work at that seed-stage, Series A, Series B company. Your opportunity for learning is much more aligned with the long-term interests of the company. And then, when you think about companies like Atlassian, that was my MBA, so to speak; that was where I went to fine-tune my skills so I could then go prepare to do a company coming out of it. And I've been very happy with the overall lessons learned coming out of that organization.

Andrew Michael: [00:15:48] Very cool. You touched on one thing I definitely want to go a little bit deeper on, because for me, Atlassian is one of those companies that has one of the most [00:16:00] amazing land-and-expand strategies, where they're able to start off with a single individual within a team in a company and then expand that account to get the entire organization using it at some point. How much of a focus was this in your work, and what are some things you learnt along the way? Were there any key insights you picked up while experimenting your way towards increasing adoption across an organization?

Patrick Thompson: [00:16:27] Yeah, there are definitely a lot of insights and lessons learned.

So just to give you some context, I spent about six months of my time at Atlassian specifically focused on virality, and that was a relatively big bet within the organization.

Andrew Michael: [00:16:41] Big time.

Patrick Thompson: [00:16:42] Yeah. When I was there, I think it was a team of six or seven, and now they have, I think, multiple full-time squads focusing on virality.

And obviously Atlassian is the typical example used when companies talk about product-led growth and the flywheel: how do you get it spinning, how do you start with one user and move to [00:17:00] ten and a hundred and then a thousand? Generally speaking, a lot of folks have talked about the flywheel, so I won't spend a lot of time on that. But when you think about virality as an initiative within organizations, you see a lot of companies, the Notions and Airtables of today, bake it in from the start: it's really easy to invite your team, to start with one person and a shared link, and get folks involved as early on as possible.

We ran a lot of experiments focusing on virality as part of the onboarding flow, and I think we optimized it relatively well. What we saw is that, generally speaking, starting as a team had a much stronger correlation with retention than anything else. When you start as an individual, you haven't really bought in; but when you start as a team, you've already made the decision: okay, great, this is the tool that we have. And you're actually putting yourself out there, expending social capital, to get folks on board. When I look at the work that we [00:18:00] did at Atlassian around being able to easily invite external folks, not just folks within your own domain, and being able to share links to Jira, share links to Confluence, all these things

generally helped not just increase virality for one product, but also got you exposed to other products within the Atlassian portfolio. I think that's one of the biggest differentiators: most teams, when they think about virality, are really focusing on their core product. But take Jira here in context: if you're integrated with a tool like GitHub, which is even outside of the Atlassian ecosystem, and then integrated with a tool like Slack, you're really building and compounding the overall value the tool is providing. You're increasing value, but you're also increasing overall exposure to other folks who might not be exposed to the work the team is doing, and then they'll get brought into the organization. All of these things really do help increase [00:19:00] long-term retention for your product. So, generally speaking, to recap it: think about partnerships and potential integrations. That is very much a one-plus-one-equals-three situation for organizations, and I don't think teams spend enough time really trying to develop good, strong integrations with other tools from the get-go.

Andrew Michael: [00:19:23] Interesting. And so, in those six months yourself, when it came to the virality component, you mentioned integrations being a way to get in front of more people and get more people invited. But was there anything specific when it came to, say, the personas you were working with? Any research you tried to look into to figure out, okay, what does a first-time experience look like? What could we do to encourage new users to be invited? Was there any specific time in a user's lifecycle that made sense to get people to [00:20:00] invite others to the account? Maybe talk us through some of the ideas you were throwing around to increase invites being sent.

Patrick Thompson: [00:20:08] Definitely. One of the earlier experiments we did was optimizing the invite email, specifically the email you would receive when you were invited into an organization. That could be from somebody assigning an issue to you while you were outside of the organization, or from somebody inviting you into a project.

Originally, the email template was very generic. It had no notion of personalization: who was the person inviting you, what was the context? There was no ability for them to even include a message about why they were inviting you to the project. It converted, but it didn't convert well enough.

Over time we ended up adding things like the inviter's name, then the inviter's avatar, then allowing them to personalize or add a message to the invite email, and all of these things over time compounded into, if I remember correctly, a doubling or tripling of the rate of recipients actually [00:21:00] coming back and creating an account.

So those are all easy ways and easy wins. The other thing we did, domain-restricted signup, was another huge one for us: being able to allow you, as an admin, to configure your organization to allow new users from certain email domains. That had a huge impact, and it's basically on by default now for most organizations.
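The core of domain-restricted signup can be sketched as a simple allowlist check on the email's domain. This is an illustrative toy, not Atlassian's actual implementation; the domain list and function name are invented for the example.

```python
# Hypothetical workspace allowlist: anyone with a verified email on an
# approved domain may auto-join without a manual invite.
APPROVED_DOMAINS = {"example.com", "example.org"}

def can_auto_join(email: str, approved_domains=APPROVED_DOMAINS) -> bool:
    """Return True if the email's domain is on the workspace allowlist."""
    if "@" not in email:
        return False  # not a plausible email address
    domain = email.rsplit("@", 1)[1].lower()
    return domain in approved_domains
```

A signup flow would branch on this check: auto-join teammates on the company domain, and route everyone else through an explicit invite.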

Other things related to virality that worked really well for us: we did spend a lot of time doing research and actually journey-mapping the entire experience from end to end. We had massive, 30-meter-long whiteboards that had every single screen and touchpoint for a customer, from literally day zero, when they land on the Atlassian site, to day seven, day 14, when they're hopefully activated into the product. Our goal was to go through each specific scenario and outline when the most ideal time was for them to potentially invite a user. And what we saw is, [00:22:00] we added it into the onboarding flow.

So you would create your organization and then immediately get prompted to invite three users. This is a very similar onboarding sequence to what Slack and other companies have at this point as well. And that increased invites. I don't think it had much of an impact on seven-day retention for us,

but it did increase the number of invites sent. And subsequently, when you started creating issues, our goal was to get you to actually assign that work to your teammates, and that actually increased activation. As a growth team, we looked at a couple of different metrics; we typically looked at a suite of metrics, not one particular golden metric, so to speak. The things we were typically interested in were long-term retention and then activation metrics (we had different types of activation metrics), obviously the number of invites being sent, and the number of issues created as a proxy for value in the organization.

And so for virality, our goal was to catch when folks took actions that would likely lead them to wanting to invite their team. [00:23:00] Obviously: a project has been created, issues have been created, work has been transitioned. You can think of a developer transitioning work from a to-do column to a QA column; hey, should you invite the QA team? Things like that. These are all the scenarios we tried to analyze for when the right time was to actually prompt the user to invite their teammates. And so when we think about, generally speaking, a user onboarding or a feature onboarding: what was the right time? What was the right message? What was the specific trigger to actually prompt them to take an action? Because all of these messages, over time, are always competing with each other. So we wanted to make sure we experimented with them not just one-on-one, but as part of a holistic suite of messages that we were sending the user, because we wanted to be respectful of the user's attention and time.

I'll pause there.
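The action-triggered prompting described above can be sketched as a small rule table that pairs a user action with an invite suggestion, shown at most once so competing messages don't pile up. The action names and prompt text are illustrative, not Atlassian's actual system.

```python
from typing import Optional

# Hypothetical trigger rules: each user action maps to the invite prompt
# it should suggest (e.g. moving work to QA suggests inviting the QA team).
INVITE_TRIGGERS = {
    "project_created": "Invite your team to the new project",
    "issue_assigned_to_nonmember": "Invite the assignee by email",
    "work_moved_to_qa_column": "Invite the QA team",
}

def invite_prompt_for(action: str, prompts_shown: set) -> Optional[str]:
    """Return the invite prompt for this action, at most once per prompt,
    so the user isn't hit repeatedly by competing lifecycle messages."""
    prompt = INVITE_TRIGGERS.get(action)
    if prompt is None or prompt in prompts_shown:
        return None  # no rule for this action, or already prompted
    prompts_shown.add(prompt)
    return prompt
```

The design choice mirrors the transcript: prompts fire off concrete signals of team value (a project created, work handed off) rather than on a fixed schedule.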

Andrew Michael: [00:23:54] Yeah, and obviously keep going, because it's all very interesting what you're saying. Let me quickly ask you a question on that [00:24:00] last point, though. So you were running experiments with things like email notifications, trying to improve activation metrics or invites, but you mentioned suites of emails. So you weren't really testing an email in isolation; you would test one email sequence against another email sequence. Is that correct?

Patrick Thompson: [00:24:17] Actually, I don't know how much Shaun touched on this in your last interview, but we had a few different teams at Atlassian. One of them was the engagement platform team, specifically focused on building out capabilities within the organization to test not just individual messages, but sequences of messages to specific segments of users. The goal, obviously, is that you don't want to speak with a megaphone; you want to speak with a whisper.

You want to make sure that your message is so personal that you can almost guarantee a person who receives it is going to take the action you're trying to get them to convert to. So our goal was to be able to send these rightly timed messages that were so personal to the [00:25:00] user that they felt very natural. Typically, what we see a lot of teams doing today is they speak very loudly, either within product or within email. That wasn't what we wanted. Some of the design principles we wanted were to be a lot more friendly and respectful of users' time and attention, and to make sure that

when we actually sent someone a message, it was really helpful for them, not just for us. So we built an entire system. We explored things like multi-armed bandits, really trying to figure out: hey, when we have a catalog of all these messages that a potential user could see, what is the right message for them to see? It could be in product, it could be an email, or it could be teaching them about a new feature that is correlated with retention, based off the two or three features they'd already engaged with. Or it could be that they ended up looking at a landing page on agile software development, so maybe we should teach them about things like Scrum or Kanban inside the product. And so all these [00:26:00] inputs would basically feed into a system.

And then, based off past look-alike audience targeting, that system would determine what message the user should see in real time. I think that was super important for us. And one of the things folks talk about quite a lot is that, even having those capabilities in-house, it's really hard to find an off-the-shelf solution that can do that.
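A minimal epsilon-greedy sketch of the multi-armed-bandit idea mentioned above: given a catalog of candidate messages, mostly show the one that has converted best so far, while occasionally exploring the others. This is an illustrative toy, not Atlassian's engagement platform; a production system would also condition on user context and look-alike audiences, as described in the conversation.

```python
import random

class MessageBandit:
    """Toy epsilon-greedy chooser over a catalog of lifecycle messages."""

    def __init__(self, messages, epsilon=0.1, seed=None):
        self.epsilon = epsilon          # probability of exploring
        self.rng = random.Random(seed)  # seeded RNG for reproducibility
        self.stats = {m: {"shown": 0, "converted": 0} for m in messages}

    def _rate(self, message):
        s = self.stats[message]
        return s["converted"] / s["shown"] if s["shown"] else 0.0

    def choose(self):
        # Explore a random message with probability epsilon...
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.stats))
        # ...otherwise exploit the best-converting message seen so far.
        return max(self.stats, key=self._rate)

    def record(self, message, converted):
        # Feed back whether the user took the action the message asked for.
        self.stats[message]["shown"] += 1
        self.stats[message]["converted"] += int(converted)
```

With `epsilon=0` the chooser is purely greedy, which makes the behavior easy to verify; in practice a small positive epsilon keeps rarely shown messages from being starved of data.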

So we had to spend years building out not just this platform, but things like A/B testing software and analytics software. We had to spend a lot of time really just building a tool chest that we had as a growth team, to be able to move the levers we wanted to move.

When you think about companies like Atlassian, their needs are often so unique because they have 12-plus products. For a lot of the work we wanted to do around cross-product flows, there aren't many off-the-shelf solutions that can do that.

And so when we thought [00:27:00] specifically about virality, it wasn't just virality within one existing product; it was virality across our product suite and discovering cross-sell. How do you discover new products that would fit within your organization? All these things really compound once you actually have the systems in place.

Every message we sent was a long-lived message. Typically, when you work with marketing teams, they send a message out and it's very one-time. All of our messages were thought of as: hey, we don't want to send the traditional drip campaigns.

We don't want to send the traditional one-off onboarding messages to customers. We want to make sure that all of this stuff is long-lived and has compounding value over time.
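Patrick describes selecting the right message from a catalog based on observed outcomes. One common way to implement this is an epsilon-greedy multi-armed bandit. The sketch below is a hedged illustration, not Atlassian's actual system; the class name, message names, and the binary conversion reward are all hypothetical.

```python
import random
from collections import defaultdict

class MessageBandit:
    """Epsilon-greedy multi-armed bandit over a catalog of lifecycle messages.

    Each "arm" is a message; the reward is 1 if the user took the desired
    action after seeing it (e.g. engaged a retention-correlated feature), else 0.
    """

    def __init__(self, messages, epsilon=0.1):
        self.messages = list(messages)
        self.epsilon = epsilon
        self.pulls = defaultdict(int)      # times each message was shown
        self.rewards = defaultdict(float)  # total conversions per message

    def choose(self):
        # Explore: occasionally show a random message from the catalog.
        if random.random() < self.epsilon:
            return random.choice(self.messages)
        # Exploit: show the message with the best observed conversion rate.
        # Unshown messages score infinity, so every arm gets tried at least once.
        return max(
            self.messages,
            key=lambda m: self.rewards[m] / self.pulls[m]
            if self.pulls[m] else float("inf"),
        )

    def record(self, message, converted):
        self.pulls[message] += 1
        self.rewards[message] += 1.0 if converted else 0.0

# Hypothetical catalog of three lifecycle messages:
bandit = MessageBandit(["invite_teammates", "try_kanban_board", "connect_integration"])
msg = bandit.choose()              # message to show this user
bandit.record(msg, converted=True)  # log the observed outcome
```

A real system would layer context on top (which features the user already engaged, which landing page they arrived from), effectively running one bandit per audience segment.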

Andrew Michael: [00:27:43] I love the analogy as well of the whisper: not shouting, just whispering and hitting the right message at the right time.

I think it's also an interesting point that you raise. I imagine tools like Optimizely, and any A/B testing service out there, are probably [00:28:00] sitting on a double-edged sword, in the sense that they see churn on one end from a lot of earlier-stage companies dropping off.

And then likewise from later-stage companies who've maybe used their service in the early days: as the team has grown in sophistication, as the needs have grown, and as the product has become more complicated over time, they start moving away and building their own in-house solutions to serve their needs.

And this is definitely a consistent theme. I've been speaking to more and more companies lately around this topic, and it seems there's this progression: you start out by finding a ready-made solution to get you up and running. Then, as your team gets into a good rhythm, you usually start realizing the shortcomings of those systems.

And then slowly, like you say, you start building that chest of tools that you're going to use to grow your business the way you want.

Patrick Thompson: [00:28:53] Yeah, you see a lot of companies spun out from bigger organizations. You have a lot of tooling coming out of companies like [00:29:00] Uber and Airbnb and LinkedIn and the Netflixes of the world, especially recently,

because a lot of these companies have to solve very interesting challenges themselves, since there isn't tooling available for them. We see very similar things when we talk to organizations, specifically within the data space, about either building their own in-house solutions or trying to find something off the shelf that works correctly for them.

Oftentimes our typical response is: use what is available to you, but think about the long-term cost of ownership. For any company looking at adopting tools, one of the things I always recommend is to do some sort of modeling and forecasting, not just on the growth of their business, but on how the tool will evolve with them.

I think that's one of the things companies typically don't spend enough time thinking about. Obviously we're building a company for other software teams, and we tried to make it as easy [00:30:00] as possible to navigate that purchasing decision. When you think about other companies like Optimizely and Segment and Mixpanel and Amplitude and Split, there are a lot of companies out there today who are focusing on solving enterprise features or enterprise problems, and a lot of companies focusing on startup problems. Then there's this no man's land in the mid-market that nobody really wants to play in, but that's really where a lot of companies are building up their own homegrown solutions.

Generally speaking, what I find very interesting is the companies who are building out their homegrown solutions: what are the lessons learned that you can get from them? A lot of companies have unique challenges, and I find it the most fascinating thing when talking to a 500-person company: hey, what are the challenges that you have within growth?

What are the challenges that you have within data discovery in your organization? Because you'll learn a ton.

Andrew Michael: [00:30:58] Absolutely. I think [00:31:00] for me personally as well, I always have a notion of buy over build: focus on your core competency. When you're looking for solutions out in the market, if you can find something that meets 80 to 90% of your needs, you're always going to be better off, because at least your engineers and your team stay focused on what matters: the product that you build for your customers.

But I think when you get to the scale and size of the Atlassians or the Booking.coms or the Ubers and the Airbnbs, that's when the opposite conversation maybe starts to happen, and the ROI is going to be a lot bigger for building your own custom in-house solutions.

Patrick Thompson: [00:31:36] Yeah, I a hundred percent agree with that sentiment.

I think the one thing that is universal is that everybody who tries to estimate the cost of building their own solution is off by a factor of 10. Generally speaking, it's always much harder. Speaking from experience, obviously, having been on teams that built tooling at Atlassian:

you're always off by many months, if not years, on the total cost of [00:32:00] not just building the tooling, but then maintaining it in house.
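The modeling Patrick recommends can be a back-of-envelope comparison like the one below. This is purely an illustrative sketch: the function, its parameters, and the dollar figures are hypothetical, with Patrick's factor-of-10 underestimate observation baked in as a default.

```python
def build_vs_buy(vendor_annual_cost, build_eng_months, eng_month_cost,
                 annual_maintenance_frac=0.2, years=3, underestimate_factor=10):
    """Compare the cumulative cost of buying a tool vs building it in house.

    Naive build estimates tend to be off by ~10x, so the initial build
    effort is scaled by underestimate_factor; ongoing maintenance is
    modeled as a fraction of the (corrected) build cost per year.
    """
    buy = vendor_annual_cost * years
    initial_build = build_eng_months * eng_month_cost * underestimate_factor
    maintenance = initial_build * annual_maintenance_frac * years
    build = initial_build + maintenance
    return {"buy": buy, "build": build}

# E.g. a $50k/yr vendor vs a "3 engineer-month" internal estimate at
# $20k per engineer-month, evaluated over 3 years:
costs = build_vs_buy(50_000, 3, 20_000)
```

Even toy numbers like these make the point: the purchase price is visible up front, while the corrected build-plus-maintain cost dwarfs it over a multi-year horizon.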

Andrew Michael: [00:32:04] Absolutely. Cool. I have three more questions, because I see we're running up on time. Three sounds like a lot, but we can get through them pretty quickly, I'm sure.

One question is: what is one thing that you thought you learned at Atlassian that has proven to not be true since starting your own business?

Patrick Thompson: [00:32:24] I think the one thing that I learned: generally, we talk a lot about product-led growth in the context of Atlassian and not needing a sales team.

There will always be times where you need to get on a call and help educate a customer, help communicate the value that your solution provides, and there are ways to do that, obviously, with a sales team. But I think there are a lot of businesses out there today who are looking to adopt the product-led growth flywheel.

And my summation at this point is that there are certain businesses where that doesn't align. So make sure that you're choosing [00:33:00] the go-to-market and sales model that aligns best with not just your buyer, but your economic buyer and your end user as well.

Andrew Michael: [00:33:09] And your channel, for sure. Cool, next question.

Let's imagine a hypothetical scenario. You are joining a new company. Churn and retention is not doing great at this company, and the CEO comes to you and says: hey, Patrick, we need to turn things around, and we need to do it fast. We have 90 days. Please, can you do something? What would you want to be doing in those first 90 days to try and turn things around for the company?

Patrick Thompson: [00:33:33] Yeah, good question. I'd probably spend the first 30 to 60 days just talking to every single customer I can, both the customers who are getting value from the product and those who have recently churned, just trying to understand the lay of the land. The other thing I'd be doing is probably just trying to go

build and establish relationships with my counterparts within the organization to understand: hey, what do you think is working well? What's not working well? Why do you think we're having a problem with churn? Then, in the last [00:34:00] 60 days of that time, I would tactically go and make sure that we have enough resources in place that we can build a roadmap and run high-velocity experiments to actually get some meaningful results.

Typically teams think that they can have a quick win, but I think growth and retention is one of those areas where you're not going to see progress overnight. Like anything else, it's going to take dedicated time and resources. So if a CEO came to me and said, hey, we want to lower our churn by a factor of two in 90 days, I would not take the job.

I would say no: it's going to take me 30 to 60 days just to get a better lay of the land and diagnose the issue, it's going to take me another 30 to 90 days to build a team around this, and it's going to take me another 60 days to get experiments out that we can then analyze the results from.

It's probably going to be six to nine [00:35:00] months before we start seeing any sort of traction in this area. I'm very much more about building long-term, sustainable teams dedicated to solving these problems. I think that's the only way within an organization to have the right level of accountability

and enough folks really digging in on what's exactly happening within the business. I'm always the one pushing for more, faster, with fewer resources, and a lot of the time you'll shoot for the moon and land among the stars. But a lot of the time there are things where you can't make an impact within 30, 60, 90 days, generally speaking.

So I'd say be realistic with what your expectations are.

Andrew Michael: [00:35:48] I think that is an excellent message as well. It's definitely something that comes up a bit from different guests, but often the number one thing everybody talks about is: talk to customers, understand and diagnose the [00:36:00] problem.

And then the guests who have really focused and spent a lot of their time thinking about the challenge of churn and retention always come back with the same message: it's not something that you're going to impact in 90 days. Yes, sure, there are some quick wins.

Dunning is probably the first, very common place where people look, and you might get a small percentage move on the metric. But ultimately, if you want to make a meaningful change in churn and retention, it takes time. It takes consistent effort and iteration to get to a point where you actually start to see results.
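Dunning, recovering failed payments by retrying the charge on a schedule and notifying the customer, is the quick win referenced above. A minimal sketch, assuming a hypothetical retry schedule; the day offsets and function names are illustrative, not from the episode.

```python
from datetime import date, timedelta

# Hypothetical dunning schedule: retry a failed card on days 1, 3, 7, and 14
# after the failure, emailing the customer ahead of each attempt, and cancel
# the subscription only if every retry fails.
RETRY_OFFSETS = [1, 3, 7, 14]

def dunning_schedule(failed_on: date) -> list[date]:
    """Return the dates on which to retry a failed payment."""
    return [failed_on + timedelta(days=d) for d in RETRY_OFFSETS]

def next_action(failed_on: date, today: date, recovered: bool) -> str:
    """Decide what to do with a delinquent subscription today."""
    if recovered:
        return "done"
    remaining = [d for d in dunning_schedule(failed_on) if d >= today]
    return f"retry on {remaining[0]}" if remaining else "cancel subscription"
```

Because involuntary churn like this needs no product change, it is usually the first percentage-point move a retention team can bank.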


Patrick Thompson: [00:36:31] 100%

Andrew Michael: [00:36:33] Great message. And then the last question I had for you today: what's one thing that you know today that you wish you knew when you got started tackling churn and retention?

Patrick Thompson: [00:36:47] I think the one thing that I wish I knew is to just go work with people that are better than me.

I had the good fortune of working with a really talented [00:37:00] PM at Atlassian directly, one on one, and I learned a ton from them. I had the good fortune of working with Shaun Clowes. I had the good fortune of working with some amazing people, and that really helped accelerate my own learning velocity. Generally speaking,

I feel like what I wish I had done earlier is actually sought out folks that I consider mentors. I think that would be the advice I'd give anybody specifically interested in tackling churn and retention: go find and identify the people who are really good at this and go work for them, come hell or high water.

Just go figure out a way to learn as much as you can. A great example is the program that I took, which I think you took as well, Andrew, the Reforge retention track. That was very much compounded learning, and I'm very grateful for the opportunity to participate in that program.

And I learned a [00:38:00] ton. If you're able to surround yourself with folks who are experts at what they do, that will rub off; you'll be able to take some of that away with you. I think that's the one lesson I wish I could have given myself earlier.

Andrew Michael: Absolutely. The learning, and that program you're speaking about specifically as well:

I think Reforge, they have a number of different programs, and one of them is retention and engagement as well. The experience is excellent, because week in, week out, you get to spend time with some really top professionals, talking about the challenge, talking through things. And for me personally, I developed some good relationships afterwards from those calls that have led to some really good and powerful learnings over time.

So highly recommended as well, for anybody looking for extra training on the topic: definitely check them out. Brian Balfour, Shaun Clowes, who is one of them, and, yeah, the three people behind the program.

Patrick Thompson: And Andrew Chen as well.

Andrew Michael: [00:38:59] And Andrew Chen. [00:39:00] Yes. So some of the best names in growth, responsible for building this program, coming from companies like Uber, Pinterest, Atlassian, and so forth.

Cool. So we're up on time now, Patrick. It's been a pleasure chatting today, a really interesting conversation. Are there any final thoughts you want to leave with the audience, anything they should be aware of from your side, anything to keep an eye on at Iteratively?

Patrick Thompson: [00:39:24] Definitely: we are hiring.

So if anybody out there is interested in joining a high-growth, early-stage startup, we're rapidly growing and looking for really talented folks. Feel free to get in touch with me at  or find us at

Andrew Michael: [00:39:43] Cool. Thanks so much for joining the show. I'm looking forward to seeing what's next on the journey, and I wish you the best of luck.

Patrick Thompson: [00:39:50] Thanks Andrew. Cheers.


Patrick Thompson

The show

My name is Andrew Michael and I started CHURN.FM, as I was tired of hearing stories about some magical silver bullet that solved churn for company X.

In this podcast, you will hear from founders and subscription economy pros working in product, marketing, customer success, support, and operations roles across different stages of company growth, who are taking a systematic approach to increase retention and engagement within their organizations.

