Navigating the Challenges of Product Management in Horizontal Products

Austin Yang | Lead Product Manager of Softr

EP 242

Episode Summary

Today on the show we have Austin Yang, the Lead Product Manager of Softr.

In this episode, Austin shares his experiences in the no-code space, particularly the unique challenges and opportunities it presents for product management.

We then discussed the impact of horizontal products that serve multiple industries, and wrapped up with how Softr is navigating product development and user needs in this diverse landscape.

As usual, I'm excited to hear what you think of this episode, and if you have any feedback, I would love to hear from you. You can email me directly at Andrew@churn.fm. Don't forget to follow us on Twitter.

Mentioned Resources

Highlights

00:01:12 | Introduction to Austin Yang and No-Code Platforms
00:02:20 | Innovative Uses of No-Code Tools at Landbot
00:03:15 | Day-to-Day Roles and Challenges at Softr
00:05:00 | Prioritizing Features Across Diverse Industries
00:08:04 | Feature Prioritization and User Feedback
00:11:16 | Onboarding and User Activation Strategies
00:17:11 | The Concept of 'Productive Friction' in User Onboarding
00:30:09 | Retention Insights and Final Thoughts

Transcription

[00:00:00] Austin Yang: A lot of companies work on retention by looking at at-risk customers, or at someone who requests to cancel. You try to talk to them and talk them out of it, but that's just not how it works. They've already made up their mind. They probably reached that point because they don't see value in your product, or a lot of times they're just not the right customer to begin with. So you almost always have to start at the very beginning.

[00:00:35] Andrew Michael: This is Churn.FM, the podcast for subscription economy pros. Each week we hear how the world's fastest growing companies are tackling churn and using retention to fuel their growth.

[00:00:48] VO: How do you build a habit-forming product? We crossed over that magic threshold to negative churn. You need to invest in customer success. It always comes down to retention and engagement. Completely bootstrapped, profitable and growing.

[00:01:01] Andrew Michael: Strategies, tactics, and ideas brought together to help your business thrive in the subscription economy. I'm your host, Andrew Michael, and here's today's episode.

[00:01:12] Andrew Michael: Hey, Austin, welcome to the show.

[00:01:15] Austin Yang: Hey, Andrew.

[00:01:16] Andrew Michael: It's great to have you. For the listeners, Austin is the lead product manager at Softr, the easiest way to build customer portals and internal tools powered by your data. Prior to Softr, Austin was the lead product manager at Landbot. And our first question for you today, Austin, is what is it about the no-code space that has attracted you?

[00:01:36] Austin Yang: I think it's the ability to see what our users build. Traditionally, when you work on a SaaS product, say a project management tool, you know what your users are going to use it for: to manage projects. But with no-code specifically, people are really building things I could never have imagined, like using our software to build an AI product, or tools for very niche industries I'd never heard of. That's what attracts me. It's not just a horizontal platform; it's horizontal in its possibilities.

[00:02:12] Andrew Michael: That's very interesting. What's one of the most interesting things you've come across, where you thought, oh wow, that's cool, or that's weird?

[00:02:20] Austin Yang: Oh, I've definitely seen a lot, but one example from my Landbot days: one of our customers used our WhatsApp solution to build a chatbot for farmers. It took a picture of a plant and immediately told the farmer what plant it was. I thought that was super cool. Yeah.

[00:02:47] Andrew Michael: Super cool. Yeah. I've seen apps like that as well now, but taking that into the chat format is also interesting. I've actually been meaning to get one, because my plants at the house don't look so healthy at the moment. I have this one in my office, and I'm lucky it's actually grown insanely big now, but everything else in the house, I feel sorry for them. So I need to get my green fingers out and start gardening again.

[00:03:10] Austin Yang: All the plants in my house died. So yeah, we have no plants in our house.

[00:03:15] Andrew Michael: Yeah. When I got this plant, I was talking to a colleague and she said she'd literally Googled "what plants can't you kill?" and this one came up; I think that's why it's been doing so well. I don't know the name off the top of my head, but if anyone's listening, it's definitely worth checking out. Nice, man. So tell us a little bit about what you're doing now at Softr as lead product manager. What does the day-to-day look like for you, and what's keeping you up at night?

[00:03:40] Austin Yang: Yeah. Every day is very different. At Softr, we move really fast, and we really prioritize user understanding. We use data, of course, but at the core we try to understand who we're serving and what the pain points are. We ship really fast; we're not trying to validate every single decision. We're also no-code users ourselves, and we use our own product to build a lot of our internal operations. For example, our feature request form and feedback portal are all built with Softr internally.

[00:04:15] Austin Yang: So that's the day to day: scoping out and designing features that our users want, for use cases we know we can be very good at, such as internal tools. We're also trying to understand, for different industries, what features they need, and then generalize them. So we're not launching a feature just for, say, real estate; we're launching a feature for anyone who might need a user management system, or who might need to list out a certain type of item they have in their Airtable base, regardless of whether it's a list of people, a list of houses, or a list of items.

[00:05:00] Austin Yang: So that's what I'm trying to do. At the same time, the challenge is that we're serving so many industries and so many use cases that it's sometimes hard to know who to prioritize. It's hard to know whether our focus is too narrow. Should we go a little broader? Should we target all tech companies, or a certain type of tech company, a certain department, a certain vertical? I think that's always been the challenge, and we're still figuring it out a lot of the time.

[00:05:32] Andrew Michael: Yeah, it's interesting, because I think this is one of those cases where maybe there's an exception. Most of the time, the common advice for startups is to niche down, find your target audience, and be as specific as possible in solving their pain point. But from your perspective, building a no-code platform that enables people to do whatever they want with it is one of the main value propositions at the end of the day, so it's very difficult, I would say, to decide. Maybe talk us through that. You have a lot of different companies in different industries: what does use case prioritization look like? How are you understanding which use cases to prioritize?

[00:06:11] Austin Yang: Yeah, for sure. I think one thing we did really well and identified early on comes from the nature of the product. Especially two or three years ago, when the no-code movement was starting, a lot of people were building side projects or new startups on no-code. We knew that what we call passion projects can't really sustain the growth of our company, but we still wanted to enable them, because those users are still good allies of our platform.

[00:06:42] Austin Yang: So we knew very early on that they're not going to be the ICP in a traditional sense. Instead, we prioritize business use cases: people using Softr for things like portals. Maybe they're a professional services agency whose clients need to access data, but in a very restricted way, and we allow them to do that.

[00:07:07] Austin Yang: Another good use case we've identified, one that's kind of always on and always going to be there, is internal tools: companies that maybe don't have the budget to buy an off-the-shelf SaaS tool, or the off-the-shelf tools just don't do what they want, so they use Softr to build their own version. Those are the two key groups of use cases we've identified. But at the same time, there are always other people building websites, building a side project, or some sort of new SaaS.

[00:07:45] Austin Yang: We want to make sure we don't intentionally alienate those people, because we think Softr is about empowering people, and they also bring a lot of good word of mouth. But at the same time, we know we're not going to prioritize them for most product decisions. Yeah.

[00:08:04] Andrew Michael: And even within that, it's still pretty broad. Even within client portals, or communities, or internal tools, there are many different types of internal tools you could go after, with many different use cases and requirements. I can imagine it's almost endless. What does feature prioritization look like at the end of the day? How are you actually deciding, testing, or validating what needs to be built next?

[00:08:30] Austin Yang: Yeah, that's a good question. I can't say we have a good answer on that either. So far, our approach has been kind of broad to start with. Internally, we categorize the use cases. For internal tools, say, there's displaying data, which is generally some sort of list or table; there's letting users edit or interact with data through some sort of form or action buttons; and then there are dashboard use cases, where people want to display their data as different charts.

[00:09:09] Austin Yang: So these are the three big groups we have. We know we do very well on the first one, but we're also trying to increase our capabilities in the latter two, and we'll see which one we should invest in further. There's an unlimited number of paths we could go down, and we just have to try and see.

[00:09:33] Andrew Michael: Yeah, see from there. So with all this, obviously there are many different groups you're serving and many different use cases. I'm keen to talk a little bit about onboarding and what that looks like here. What are you prioritizing for, and how are you activating users effectively in an environment where there's a plethora of different ways they can use the product?

[00:09:56] Austin Yang: Yeah, for sure. One of the biggest things is the onboarding survey at the beginning. We ask questions to first understand, all right, what are you using Softr for? For the very first question we have a few options: are you using it for the company you work for? Are you using it just for fun, or to build a personal project? Are you using it for a nonprofit? We do think there are some key differences between, say, a nonprofit and a for-profit company, so we provide those additional options.

[00:10:32] Austin Yang: Then throughout the survey we also ask questions like, how familiar are you with no-code development, to get a sense of their skill level, and about the data source: do you already have data that you want to build a front end for? All of that information helps us personalize the experience a little once they get into the product. Right now we don't have very advanced personalization yet, so to begin with, the first thing is template recommendations. We know which use case they selected, so we display some templates relevant to that specific use case. That's what we have so far.
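
To make that concrete, here is a minimal sketch of how survey answers could drive template recommendations, assuming a simple lookup by use case. The field names, use case labels, and template IDs are hypothetical; the episode doesn't describe Softr's actual implementation.

```python
# Hypothetical sketch: mapping onboarding-survey answers to template suggestions.
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    purpose: str         # e.g. "work", "personal", "nonprofit"
    use_case: str        # e.g. "client_portal", "internal_tool", "dashboard"
    skill_level: str     # e.g. "new_to_nocode", "experienced"
    has_data_source: bool

# Templates grouped by the use case a new user selected (invented names).
TEMPLATES_BY_USE_CASE = {
    "client_portal": ["client-portal-basic", "agency-deliverables"],
    "internal_tool": ["inventory-tracker", "applicant-tracker"],
    "dashboard": ["kpi-dashboard", "sales-overview"],
}

def recommend_templates(response: SurveyResponse, limit: int = 3) -> list[str]:
    """Return templates for the selected use case, with a generic fallback."""
    templates = TEMPLATES_BY_USE_CASE.get(response.use_case, ["blank-app"])
    # Beginners without data might be nudged toward a template with sample data.
    if response.skill_level == "new_to_nocode" and not response.has_data_source:
        templates = ["starter-with-sample-data"] + templates
    return templates[:limit]

print(recommend_templates(SurveyResponse("work", "internal_tool", "new_to_nocode", False)))
```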

[00:11:16] Austin Yang: A lot of this data really helps us understand, all right, who are we actually attracting? And then when we run an A/B test or an activation experiment, it helps us identify whether it's actually a win or not, because if we just look at everybody, it's very easy to get misguided. Maybe an experiment worked for people who really want to build a dynamic web app, which is our core value prop, but because we have more people who just want to build a website, the experiment didn't win overall. We don't want to get misled by the aggregate data. Yeah.
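
As a rough illustration of that point, the sketch below breaks an experiment's conversion rate down by the use case captured in the onboarding survey, showing how an aggregate number can hide a win in one segment. The segment names and observations are fabricated for the example; this isn't Softr's actual analysis.

```python
# Hypothetical sketch: aggregate experiment results can hide a segment-level win.
from collections import defaultdict

# (segment, variant, converted): fabricated example observations.
observations = [
    ("web_app", "control", False), ("web_app", "control", False),
    ("web_app", "treatment", True), ("web_app", "treatment", True),
    ("website", "control", True),  ("website", "control", False),
    ("website", "control", False), ("website", "treatment", False),
    ("website", "treatment", False), ("website", "treatment", False),
]

def conversion_rates(rows):
    """Return the conversion rate per variant for the given observations."""
    counts = defaultdict(lambda: [0, 0])  # variant -> [conversions, total]
    for _, variant, converted in rows:
        counts[variant][0] += int(converted)
        counts[variant][1] += 1
    return {variant: conv / total for variant, (conv, total) in counts.items()}

print("overall:", conversion_rates(observations))
for segment in ("web_app", "website"):
    segment_rows = [row for row in observations if row[0] == segment]
    print(segment + ":", conversion_rates(segment_rows))
```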

[00:11:58] Andrew Michael: Yeah. So in this case you've collected some good information about the user, what the use cases are, and where they're getting started, and then you're customizing the onboarding experience based on those questions. I think a lot of the time people will ask, why are you asking so much from the user without giving back? But in this case you really are giving back, if I'm understanding correctly, and you're tailoring the experience to that. Is that the case?

[00:12:25] Austin Yang: Yeah, so far it's not super advanced; it's simply recommending different templates, but in the future we might do more. It's also not just the in-product experience but the lifecycle emails, where we might send more targeted content for a specific use case, like a client portal versus a data dashboard. We're planning on providing different information for each use case.

[00:12:53] Andrew Michael: Yeah. And has it always been this way, or is this something that you've recently added to the product?

[00:12:58] Austin Yang: The onboarding survey has always been there, since the day I joined. We've definitely spent some time optimizing it so we get more relevant information that we can actually act on. A lot of times you ask questions like, okay, what is your industry? That's something we still have, and we have to think a little about whether it's really relevant for us: if we know this information, what would we actually do differently? If the answer is nothing, then maybe you should not include it in your onboarding survey. That's how we look at it.

[00:13:35] Andrew Michael: Yeah, I can't remember where I first saw this, but I started implementing it in a previous product I worked on: any time you ask for a piece of information, you state to the end user how you're going to use it and how it's going to benefit them. That really forces and challenges you to think, okay, do we actually need this information? Is it adding value to our users, or is it just one of those selfish vanity metrics we're collecting for ourselves but never really doing anything with?

[00:14:00] Austin Yang: Yeah. Generally speaking, I think when a lot of people start optimizing onboarding, the first thing they say is, oh, let's remove all these questions so people get to value first. At a high level that sounds good, but almost every product you see still has those questions, because understanding who your users are and what they're trying to do is still essential at the end of the day. Yeah.

[00:14:30] Andrew Michael: Yeah. That removing-friction argument has definitely been played out over and over again, and there are definitely areas, products, and use cases where positive friction is required. I'm assuming that's the case for you too: even though it's a no-code tool, it's probably not the most straightforward tool to get started with. So how are you using friction in your onboarding process to activate users?

[00:14:58] Austin Yang: Yeah, so the way I look at it, when we talk about good friction, I divide it into two categories. The onboarding questions we were talking about I consider necessary friction: you kind of need to get that information. But there's also another group, what I'd call productive friction. Maybe some users will drop off or find it annoying, but those who actually go through it are more likely to convert down the funnel, whether that's activating or upgrading or whatever.

[00:15:36] Austin Yang: The way I look at it is kind of like school: the easiest way for everyone to get an A is, if you're the teacher, to make the test super easy so everyone passes. But that doesn't necessarily help them get into a better university or get a good job, so sometimes you have to challenge them a little bit. One of the things we did at Softr, maybe a year ago when we were optimizing the activation and onboarding journey, was requiring users to connect their data source when they create their very first app.

[00:16:14] Austin Yang: Before that, we let them do it naturally: once they opened the application, they could connect it whenever they wanted to. But we changed it to require the connection at the start of app creation, because basically everything you do when you're building an app requires the data source to be connected, and we thought it was very important to set it up first. And indeed, we saw some drop-off, because maybe people don't have a data source, they don't have the data yet, or they find it a bit annoying.

[00:16:50] Austin Yang: But for those who did connect, because we surfaced that action way earlier, everything else within the application became so much easier: defining who the users are, defining the access level, defining what data to display. So I think that's a good example of what I call productive friction.
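
As a loose sketch of that kind of required step, the snippet below gates app creation on having a connected data source. The function names, data model, and error handling are invented for illustration; this isn't Softr's actual code.

```python
# Hypothetical sketch of a "productive friction" gate: require a connected
# data source before the app builder opens. All names are illustrative.

class DataSourceRequired(Exception):
    """Raised when a user tries to create an app without connecting data."""

def create_app(user: dict, app_name: str) -> dict:
    sources = user.get("connected_data_sources", [])
    if not sources:
        # Intentional friction: block creation and send the user to connect data first.
        raise DataSourceRequired("Connect a data source (e.g. a spreadsheet) to continue.")
    return {"name": app_name, "owner": user["id"], "data_source": sources[0]}

new_user = {"id": "u_1", "connected_data_sources": []}
try:
    create_app(new_user, "Client portal")
except DataSourceRequired as err:
    print("Onboarding step required:", err)
```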

[00:17:11] Andrew Michael: Productive friction. And in this case, obviously prior, you allowed them just to get on with things. Then you started saying, okay, actually you need to connect the data source to do anything. What metrics were you looking at to understand, okay, was this a successful move for us or not? How are you tracking that?

[00:17:29] Austin Yang: Yeah. We defined our activation metrics using the Reforge method: you have the setup moment, the aha moment, and then the habit moment. In our case, because it was still quite early on, we didn't have enough data to correlate which actions predict long-term retention, so we went with something a little more leading. For us, setup is pretty straightforward: you connect the data source, because that's the absolute minimum before you'll see value if you want to build a web application.

[00:18:06] Austin Yang: But by doing so, we pretty intentionally said that we don't care too much about people building plain websites; we don't want to include them in our activation metric. If people want to come and do that, that's perfectly fine, but we don't judge whether activation or onboarding is performing well based on them at all. In that sense, we only serve people who want to build a web app.

[00:18:33] Austin Yang: So that's the first part. For the aha moment, we defined a few actions within the application, for example adding a few blocks, then previewing and publishing the app within, I think, the first three days. That means they've at least seen the app live for the first time, so they've experienced the value. To keep things simple, those were the metrics we were tracking at the time.
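
For illustration, here's a minimal sketch of computing setup and aha flags from a per-user event log, following the setup/aha/habit framing Austin describes. The event names, the two-block threshold, and the schema are assumptions; only the "connect data source", "add blocks", and "publish within roughly three days" criteria come from his description.

```python
# Hypothetical sketch: deriving setup and aha moments from a user's event log.
from datetime import datetime, timedelta

events = [  # fabricated events for one user
    {"type": "signed_up", "at": datetime(2024, 1, 1, 9, 0)},
    {"type": "connected_data_source", "at": datetime(2024, 1, 1, 9, 5)},
    {"type": "added_block", "at": datetime(2024, 1, 1, 10, 0)},
    {"type": "added_block", "at": datetime(2024, 1, 2, 11, 0)},
    {"type": "published_app", "at": datetime(2024, 1, 3, 15, 0)},
]

def first_time(events, event_type):
    """Timestamp of the first occurrence of an event type, or None."""
    times = [e["at"] for e in events if e["type"] == event_type]
    return min(times) if times else None

signup = first_time(events, "signed_up")
window_end = signup + timedelta(days=3)

setup = first_time(events, "connected_data_source") is not None

blocks_added = sum(1 for e in events if e["type"] == "added_block" and e["at"] <= window_end)
published_at = first_time(events, "published_app")
aha = blocks_added >= 2 and published_at is not None and published_at <= window_end

print({"setup": setup, "aha": aha})
```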

[00:18:57] Andrew Michael: At the time. And yeah, as you mentioned, the people who aren't ready with data aren't really the customers you want at that point in time. I think this is one of the biggest things: people think, oh, if we add friction we'll lose customers; if we add an extra field there'll be drop-off. The other side of it is, if people aren't motivated enough to fill in one extra field in your app, how motivated do you think they'll be to use the actual thing itself? So in some ways you're just clearing out the weeds and making sure you're serving potential customers.

[00:19:35] Austin Yang: Yeah, that's true. I think that's something you almost need a bit of experience to really learn; it's hard to prove right away to people who have just started working on growth and product. The obvious thing is, oh, you remove one step, so your conversion to the next step increases a little bit. But for the people who would have dropped off if you'd added the extra step, you're probably just delaying it; they'll still drop off at the next step. Yeah.

[00:20:09] Andrew Michael: Yeah. So you made this change where you require them to connect their data, and you've used, as you mentioned, the Reforge-style setup, aha, and habit moments. What are some of the next steps you send your users on after this data integration? Where do they go from here?

[00:20:31] Austin Yang: Yeah, so we also added a pretty typical checklist to guide them through building the application. The challenge in our case, and in a lot of products, is that there's no single linear order in which you should do things, and that's why we provide a checklist. There are still some dependencies between steps that we enforce: for example, if you want to define user groups for your application, you first have to connect your user table to your data source.

[00:21:05] Austin Yang: So we have those rules defined, but generally we try to keep an open-ended approach and let users explore the application. We haven't invested too much time optimizing this, because I don't think it's the highest-leverage work at this point. We haven't tested whether forcing users to do things in a very strict order works better; that's something we haven't tried yet. But I would also say a lot of our effort on activation is not about adding additional product tours or sometimes slightly gimmicky stuff on top of the product.

[00:21:46] Austin Yang: What we're really trying to do is make the product itself, when you're setting up a specific feature, provide enough guidance for you to know why you should use it, what the next step is, and how that connects to the overall goal you're trying to achieve, like the app you're trying to build. I think some early-stage startups can lose sight of that. They think, all right, I just have to keep adding these product tours and hacks so my activation will improve.

[00:22:19] Austin Yang: But what they should really do most of the time is improve the core product. A lot of times users don't convert not because they don't understand, but because they don't see that you have a particular feature they need, and that's not easy to capture from your product analytics alone. Yeah.

[00:22:41] Andrew Michael: Yeah, for sure. I think onboarding is often treated as a moment in time where people fill in the signup form and then go through an onboarding experience. But as you start to evolve and understand, it's a continuous process: you're really always onboarding your users, whether to new feature releases, new use cases, or better and newer ways to use the product. And as you say, thinking through those interactions, sometimes we end up building a thousand features before we think about how to increase activation for what we've already built. A lot of the time we could save a lot of pain and effort by focusing on what we already have and giving a little love and attention to those parts of the product to increase engagement.

[00:23:27] Austin Yang: Yeah. One interesting thing we're currently working on, which I think I can share a little about, is a concept we call smart defaults and smart constraints. Essentially it's not part of the onboarding everyone sees; it's just part of how the product works. We noticed one big problem: even though our product is pretty user friendly, to build an application you still need some basic understanding of how software development works. You need to know that if you want your users to be able to reset passwords through some link, you first need a reset password page. But for a lot of people who have never worked on software, that has never crossed their mind.

[00:24:15] Austin Yang: So what we're currently doing is defining more specific constraints: okay, if you want your users to sign in with a password, then we specify that you need to have these reset and forgot password pages. I consider this a form of what people call invisible design: we know what they're trying to do, we predict it, and we put those constraints in, so they'll always remember, okay, I need to set this up so my application functions properly. Or sometimes we can read the data source, so we know what the data looks like and how the tables relate to each other.

[00:25:01] Austin Yang: So we make some predictions. Maybe you want your users to be able to control which tasks they can work on; then we make that suggestion within the product. I think that's something we'll keep adding to our activation process. Not necessarily onboarding, but we display these suggestions when they're necessary.
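
As a rough sketch of what a smart constraint rule could look like, the snippet below checks an app's configuration against simple dependency rules (for example, password sign-in implies reset and forgot password pages) and surfaces the missing setup steps. The rule set and data model are hypothetical, not Softr's actual implementation.

```python
# Hypothetical sketch of "smart constraints": rules that infer required setup
# steps from what the builder has already configured. All names are illustrative.

app_config = {
    "auth": "password",           # how end users sign in
    "pages": ["home", "sign-in"],
    "user_groups": ["admin"],
    "data_sources": [],
}

# Each rule: (applies-to-config?, suggestion text, what is still missing)
RULES = [
    (lambda c: c["auth"] == "password",
     "Add 'reset password' and 'forgot password' pages",
     lambda c: {"reset-password", "forgot-password"} - set(c["pages"])),
    (lambda c: bool(c["user_groups"]),
     "Connect a user table so user groups can be resolved",
     lambda c: set() if c["data_sources"] else {"user table"}),
]

def missing_setup(config: dict) -> list[str]:
    """Return the suggestions the builder still needs to act on."""
    suggestions = []
    for applies, message, missing in RULES:
        if applies(config) and missing(config):
            suggestions.append(message)
    return suggestions

for suggestion in missing_setup(app_config):
    print("Suggested setup step:", suggestion)
```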

[00:25:23] Andrew Michael: Yeah, that sounds very interesting as well. Like you mentioned, it's hidden design in the sense of, how can you predict what users are going to need to do and then guide the product in that direction? The thing I'm also wondering, going back to the beginning around prioritization, is how you figure out what to prioritize, because now you have two really meaningful streams of work: one being activation and onboarding, and the other being actually delivering new features to users. So I'm keen to hear what prioritization looks like on that front. How much are you prioritizing activation and onboarding over building new features for users?

[00:26:00] Austin Yang: I would say it's almost always case by case. Very intentionally, we do set some kind of percentage up front: say 30% of the time goes to improving the building experience, which we consider part of activation, and say 70% of the time we execute what we call the big bets. Those could be things like connecting new data sources or revamping our blocks. That's how we do it. We're not trying to use ICE scores or anything like that; I think they're sometimes too complex and not complex enough at the same time. There are so many factors that are unique to the situation at hand.

[00:26:52] Austin Yang: It's not simply impact versus effort. Sometimes there are dependencies, who can work on what, and in what sequence it makes sense to introduce a feature to maximize the value for our users. So we don't really use any standard prioritization framework up front; it's almost always case by case.

[00:27:15] Andrew Michael: Case by case. And do you have an indication of what percentage of what you ship goes to activation and onboarding versus new product features?

[00:27:25] Austin Yang: Oh, yes. It depends on the quarter, but at one point I think it was around 50/50. Right now it's probably around 70% new features, 30% building experience. Yeah.

[00:27:40] Andrew Michael: Interesting. It's good to hear that it's been as much as 50/50, because I think that's where most of the value comes in. This is something I've also learned in experimentation over the years: when you come up with an original hypothesis, launch an experiment, and it fails, it doesn't necessarily mean the hypothesis has failed. It just means you need to iterate on the experiment. I've literally seen this firsthand, where I ran two almost identical experiments, and one got a 25% conversion rate on a landing page while the other returned 7%, with only very small differences in the overall experience.

[00:28:18] Andrew Michael: But just like that iteration and like going back at it a second and third time has like orders of magnitude in difference in terms of the results that are achieved. So the same thing I think implies in activation is like you've built a feature and then you're coming back to it and you're revisiting and you're improving it and you end up seeing like the results then. Is this something you've seen as well in your work?

[00:28:38] Austin Yang: Yeah. But I would say for a lot of B2B products, even B2SMB and consumer products, it's sometimes really hard to attribute exactly which change increased or decreased certain metrics. You almost always have to use A/B tests for that, but the issue is that for A/B testing there might not be enough sample size, so you really have to prioritize the most impactful things to test. In our case it's the same. Even when I was at Landbot previously, where we had a lot more sample size compared to an enterprise product, we still had to prioritize the moonshot experiments, not just testing, all right, change the color of a button or the placement of a certain CTA. Those are way too small, and it's hard to see results.

[00:29:40] Andrew Michael: Yeah, absolutely. When we talk about experimentation, it's really about coming up with a hypothesis, and the hypothesis is not changing red buttons to blue. It's more about changing the positioning of the product, introducing a new step in the onboarding funnel, or re-engineering the whole onboarding funnel and testing it against something new. You definitely need to be much bolder when you don't have the luxury of data on your side to effectively track and measure those things.

[00:30:08] Austin Yang: Yeah.

[00:30:09] Andrew Michael: I see we're almost up on time today, Austin. So I'll ask you a question I ask every guest: what's one thing that you know today about churn and retention that you wish you knew when you got started with your career?

[00:30:21] Austin Yang: I would say it's that you can't work on retention backwards. A lot of companies work on retention by looking at at-risk customers, or at someone who requests to cancel; you try to talk to them and talk them out of it, but that's just not how it works. They've already made up their mind. They probably reached that point because they don't see value in your product, or a lot of times they're just not the right customer to begin with. You almost always have to start at the very beginning, including acquiring the right customers.

[00:31:01] Austin Yang: If you don't acquire the right customers, a lot of times there's not much you can do. Everything you do might keep them around a little longer, but it's not going to be a significant change. So you have to acquire the right customers who will actually get value from your product, and also activate them in the very first day, first week, first month, because once they put it off, they're not really thinking about your product anymore and they might just leave forever.

[00:31:34] Austin Yang: And lastly, especially for early-stage startups, a lot of long-term retention comes down to how good your product is. So definitely don't overcorrect by investing only in optimization, because there's only so much you can do. You can help your users see one hundred percent of the value of your product, but if that value is limited and not improving, they still won't retain long term.

[00:32:09] Andrew Michael: Yeah, absolutely. I love that. Are there any final thoughts you want to leave the listeners with before we wrap up today? And how can they keep up to speed with your work?

[00:32:18] Austin Yang: Well, yeah. I sometimes write blog posts on my blog, called Product Levers. You can find it at austinyang.co. I'll leave the link with you, Andrew.

[00:32:30] Andrew Michael: Cool, very nice. For the listeners, everything we discussed today will be in the show notes, including Austin's blog, so you'll be able to find it there. Once again, Austin, thank you so much for joining today, and I wish you the best of luck going forward.

[00:32:45] Austin Yang: Yeah. Thank you for having me, Andrew.

[00:32:47] Andrew Michael: Cheers.

[00:32:49] Andrew Michael: And that's a wrap for the show today with me, Andrew Michael. I really hope you enjoyed it and you were able to pull out something valuable for your business. To keep up to date with Churn.FM and be notified about new episodes, blog posts and more, subscribe to our mailing list by visiting Churn.FM. Also don't forget to subscribe to our show on iTunes, Google Play or wherever you listen to your podcasts. If you have any feedback, good or bad, I would love to hear from you. And you can provide your blunt, direct feedback by sending it to Andrew@Churn.FM. Lastly, but most importantly, if you enjoyed this episode, please share it and leave a review as it really helps get the word out and grow the community. Thanks again for listening. See you again next week.

