Video: How Sprout Social Accelerates Innovation with Mixpanel: Closing the Gap Between AI & ROI | Duration: 3488s | Summary: How Sprout Social Accelerates Innovation with Mixpanel: Closing the Gap Between AI & ROI | Chapters: Welcome and Introductions (30.91s), Welcome and Introduction (115.56s), Introducing Mixpanel Insights (307.845s), Mixpanel's Customer Insights (433.62s), Sprout Social Overview (821.29s), Data Inconsistency Challenges (962.86s), Intentional Implementation Strategy (1108.925s), Operationalizing Mixpanel (1313.865s), AI Efficiency Insights (1447.245s), Demonstrating Mixpanel Features (1707.52s), Optimizing Posting Funnel (2008.625s), Flow Analysis Insights (2174.76s), Measuring AI Impact (2299.945s), Measuring AI Impact (2466.35s), Mixpanel Board (2710.23s), Q&A Session (2818.11s), Playbook Essentials (3130.04s), Implementing Data Governance (3242.595s), Closing and Resources (3331.555s)
Transcript for "How Sprout Social Accelerates Innovation with Mixpanel: Closing the Gap Between AI & ROI":
Alright. In service of time and also now that we've broken the ice, welcome. Thank you all so much for being here. I know how hard it can be to carve out time, especially at this time of year, and we really appreciate you joining us. We're excited to spend this time together and have packed this session with as much value as possible. But before we dive in, let's start with a quick poll. This is gonna help me tailor the conversation to where you are today and make sure that the session is as relevant as possible. So to start, I wanna know what's the biggest barrier to using data more effectively at your organization. Is it hard to find or understand reports? Do you find instrumentation slow or incomplete? Is there a lack of shared dashboards or team alignment? Are you experiencing difficulty identifying behavior patterns or not sure where to start? Let's give a few moments for folks to answer this poll so we get a sense for where everyone's at in the room. Alright. Let's take a look at the results. It looks like the majority of folks find instrumentation slow or incomplete in the way that they're tracking things. I think you're gonna be pleasantly surprised with your ability to answer those questions through the material that we're covering today. So here's what to expect over the next forty five to sixty minutes. We're gonna ground this discussion with an introduction to why this conversation matters right now as product teams move faster and AI features roll out rapidly. Then you'll hear from our featured guest on Sprout Social's analytics journey: where they started, what changed with Mixpanel, and the impact that they've seen. Next, we'll jump into live workflows to show how they identify friction, measure AI impact, and get from questions to insights in minutes. We'll wrap up with a live Q&A and clear next steps so that you can leave with a practical playbook that you can use with your teams right away.
We do have a few Mixpanelers here joining us, so they'll do their best to answer questions in the chat. But for the more technical or support oriented questions that we're not gonna be able to assist with today, we'll be passing those on to our support team, so they will get answered within the coming week. It's the holidays, so give us a little bit of time. I do wanna note that we're gonna be sharing this recording, our slide deck, and resources after our event, so there's no need to take notes. Just stay present and stay engaged. My name is Michael Armstrong. I'm the customer engagement manager for Americas here at Mixpanel, and I'll be your host for today. We're excited to be joined by Blake Kurinsky, director of product management at Sprout Social, who'll share how his teams use Mixpanel to make faster, more confident product decisions. Then our customer success architect, Patrick Mackle, will take us through proven Mixpanel workflows showing how the impact you'll hear about from Sprout Social can translate into real practical wins for you and your team. One last poll to help us set a baseline for where we're all at: how confident do you feel that your team can clearly demonstrate the impact of your product features today, whether they're AI driven or not? I'm gonna open up this poll. There's no wrong answer. And while this session might be oriented towards measuring the impact of AI features, I wanna open this up to get a sense of whether or not people know how to measure the impact of their product, whether it's AI or not. Alright. We've got lots of folks answering at this point. Let's see if we can share it. At this point, it looks like the majority of folks are somewhat confident, followed by those who aren't measuring impact yet, AI or otherwise. Thanks for sharing. This is really helpful context for us as we dive into the materials we've got today.
So I wanna begin our time together by answering a core question around why we're here. Why do so many product and data teams still struggle to move quickly and confidently? And more importantly, how do teams like Sprout Social break out of that cycle and turn data into true competitive advantage? That's what today's session is all about. In today's world, every interaction matters. Every click, every swipe, every purchase, it's all being captured. We have more data than ever before, and yet at the same time, customer expectations have exploded. Hyper personalization isn't just a nice to have anymore. It's the baseline, and competition for attention is relentless. What we consistently hear from product and data teams is this. They wanna move faster. They wanna experiment more. They wanna be truly customer centric, but they're slowed down not because they lack the data, but because data is hard to access, data is hard to trust, or data is just locked behind really complicated tools. Most teams already have what they need. The real challenge is turning that data into insight quickly enough to be able to act on it when it matters the most. And this is exactly the problem Mixpanel was built to solve. Mixpanel gives you a complete view of the customer journey from acquisition to activation to long term retention so that teams can see what's actually driving meaningful growth. Instead of just knowing how many users showed up, you can understand which channels bring in your most valuable users and what keeps them coming back. The goal isn't more dashboards. The goal is clarity so that teams can move with confidence. At its core, Mixpanel powers a continuous innovation loop, one that high performing product teams run on every day. It starts with observing real user behavior, analyzing that data, deciding what to do next, and then acting on insights with speed and confidence. This innovation loop is how teams are able to move from guessing to knowing.
And what makes Mixpanel powerful is that this isn't just a theory. It's built directly into the platform. Our full suite of products supports teams across the entire decision making cycle. So it all starts with observing customer behavior. We've got tools like session replay along with reports and heat maps that allow you to spot patterns at scale and then zoom all the way in, going from big picture trends to reliving individual user experiences in seconds. This is how teams move beyond what is happening to truly understanding the why. From there, Mixpanel helps teams analyze what they're seeing using funnels, flows, and cohorts. These tools turn raw behavioral data into clear insights, showing you what's working, where users are getting stuck, and how different segments behave over time. Next is deciding what to do. With experiments and metric trees, teams can connect insights to action by identifying what's most worth testing and which levers are likely to move their North Star metric. This is what helps teams prioritize with confidence and not guesswork. And finally, you can act. Using feature flags, you can validate your hypotheses and roll out changes with precision, controlling who sees which feature and when, all while minimizing risk. And so what makes this actually work comes down to our platform's ability to fuel and refine the cycle. First, with data governance and our integrations, we provide self serve access to both product and back end data at enterprise scale so that teams can make strategic decisions grounded in trusted, well governed data. Second is collaboration across teams. Through access controls, shared workspaces, standardized metrics, and dashboards, teams can discover insights together and trust that everyone is actually speaking the same data language. And coming soon is AI driven automation. We're introducing capabilities in the near future that will accelerate onboarding, support, and data analysis.
In fact, using our MCP server, teams will be able to analyze product and session replay data using conversational AI, making insights even more accessible. And because this workflow is repeatable, teams don't have to start from scratch every time. So whether you're a product manager, designer, engineer, or customer support, everyone can follow the same process and access the same data to arrive at insights they can trust, which brings us to why we're here today and why Sprout Social's story is really so powerful: because they faced the same challenges that many of you are dealing with right now, massive amounts of data and insights that took too long and required too many resources to uncover. Instead of adding more tools or relying on centralized analysts, they built repeatable Mixpanel workflows that anyone across their teams could use. And what's important here is that this isn't just a story about dashboards or reports. Those are important. But it's about a shift in the culture. It's about empowering every team member to ask better questions, test ideas faster, and validate their decisions without waiting in line for the data they need to do their jobs. That shift helped Sprout Social optimize user journeys, accelerate product development, and measure the ROI of AI features, all while aligning product, engineering, and design teams around shared metrics that they can trust. And today, we're gonna show you exactly how they made that shift and how you can replicate that impact with your teams today. So without further ado, it's my pleasure to welcome Blake Kurinsky. Why don't you take a moment to introduce yourself, tell us about your role and your path to Sprout Social. Perfect. Thanks, Michael, and hello, everyone. My name is Blake Kurinsky, and I'm the director of product management at Sprout Social. Prior to joining Sprout, I was a software engineer at Motorola and a few smaller digital agencies here in Chicago.
I've been a product manager at Sprout now for eight years, and it's been really incredible to see our growth over time from a really small start up to a publicly traded company. I've led the implementation of Mixpanel within our product itself, and I'm excited to share how we use it to make faster product decisions. Perfect. So tell us, what did product analytics look like at Sprout Social when you first arrived? Sure. So for those unfamiliar with our products, we have a suite of SaaS products. Our namesake product, Sprout Social, is a social media management solution tailored primarily for large brands or organizations or really anyone that has to manage multiple social profiles across multiple social networks. Think 10 plus Facebook pages, five plus Instagram accounts, three X profiles, etcetera. Without oversimplifying too much, Sprout Social contains our smart inbox and other sections like reviews and cases that help our customers triage and reply to incoming social messages. Our publishing and compose sections enable customers to plan social campaigns and schedule outbound posts. And then our reporting and listening solutions allow customers to understand how their posts are performing and how their brand is perceived across those social networks. So as you can see, Sprout Social is a pretty robust product with a lot of functionality. As a result, it's really important that we understand how the product is used so that we invest in the right areas. When I first started at Sprout, product analytics were captured and stored directly in a database. They were only surfaced in business intelligence tools with very limited access. So to understand product usage, you either needed to work directly with our data team or be one of the lucky few who understood enough SQL to do this yourself. It sounds painful. And, honestly, I think it's pretty relatable for a lot of folks here no matter where your company falls on the maturity spectrum.
Before we dig into this more, I wanna check in with everyone. If this feels familiar to you, drop an emoji in the chat that captures how this friction feels. While folks are sharing, let me just say that from my own personal experience working at an organization that was scaling really quickly in a hyper growth phase, this is a pain point and friction that I've experienced myself. I see from some of the responses in the chat that we are not alone in feeling this pain. Blake, you've painted a really clear picture of what things looked like and felt like overall. Were there any specific workflows or decisions that felt especially slow or bottlenecked? Yeah. I would say performing cross product queries on really common behaviors in the application was always a challenge or a pain point for us. So here on the screen, you'll see an example of a social message. These are the building blocks of our product. They're found in every section of our application. You can also see here an example of an action that a user can take on that social message. This one highlighted is marking a message as complete in the Sprout Social product. Usually, this means there's no follow-up needed or no reply needed. The message is done. We see a lot of people use this to get to inbox zero or, you know, complete their triage. As a product organization, we may ask the question, how many times do customers mark messages as complete? Well, even if that little complete action exists on every social message across the product, the way that it's tracked may change. For instance, the way that this specific action is tracked in the inbox may be different than it is in cases or reporting or listening. What that means is to answer that question, someone would need to know exactly how those events are structured and write a SQL statement that might look something like this, combining a bunch of different events in order to get a result.
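To make that concrete, here is a minimal Python sketch of the problem Blake is describing. The event names here are hypothetical, not Sprout Social's actual schema; the point is that answering one simple question requires enumerating every per-section variant, much like the SQL statement he shows on screen.

```python
# Hypothetical illustration: the same "mark complete" action tracked
# under a different event name in each section of the product.
events = [
    {"name": "inbox_message_completed"},
    {"name": "case_marked_complete"},
    {"name": "listening_complete_action"},
    {"name": "inbox_message_completed"},
]

# Answering "how many completes?" means knowing every variant up front,
# much like a SQL query that UNIONs differently named events together.
COMPLETE_VARIANTS = {
    "inbox_message_completed",
    "case_marked_complete",
    "listening_complete_action",
}
total_completes = sum(1 for e in events if e["name"] in COMPLETE_VARIANTS)
```

If a team adds a fourth section with yet another name, every query like this silently undercounts until someone notices, which is exactly the trust problem described here.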
What was the moment where you knew that something had to change? Yeah. These challenges became most apparent for us during our product road mapping process. It's where we discussed what new features or enhancements we should build. Our data team would be swamped with requests for data and insights. But without the data, we were simply debating opinions. Mhmm. So what I'm trying to show here is when you have inconsistent data, it requires specialty knowledge to uncover, meaning that you greatly increase your time to insights. So it was at this point I knew we needed a better way to scale our data practices and provide a more robust self-service analysis tool for our product team. Yeah. That makes a lot of sense. Now let's dive into implementation. This is really where the rubber meets the road. How teams approach this process often determines whether they're gonna see real success or it just ends up becoming another tool in their tool stack. So why don't you talk us through your change management strategy? What did Sprout Social need to get right before rolling Mixpanel out to your teams? Yeah. I know people are gonna grumble when I say this, but building a prioritized implementation plan was really the key. At the time we kicked off the project, Sprout Social was already a highly mature software product. There were thousands of things we could track or migrate into Mixpanel. But instead of sending every existing event into Mixpanel, we purposely took an intentional approach. And what I mean when I say that is we looked at common actions across our product and asked ourselves what questions do we want to answer. We wrote those questions down and worked backwards to understand what data would be necessary to provide answers. So in this example, you see again that same social message and all of the actions a user can take on it. We previously talked about that complete action, but, really, all the actions here are similar and related.
So it became very obvious that we would need to create a message action event or something similar. From there, we asked ourselves questions like, what's the name of the action? In which section does that action take place? And these questions provided us with hints about the data that we would need in the event itself. They guided us to the Mixpanel properties that we would eventually outline, things like a message action name, a section name, a page name, and much more. And it was because of this very intentional approach that we can now analyze and break down common events like these across our entire product. Beautiful. I love the intentionality around building and driving that strategy. Now it's really tough to drive an org wide implementation strategy all at once. I wanna know, how did you approach sequencing adoption of Mixpanel across teams and through different use cases? Yeah. It can be pretty tough. And I'll be honest, when we started, I was really worried about being able to make progress quickly. I'm sure all of us know firsthand the challenge of asking teams to fit work into their already established road map. Yeah. However, once we started, and once the first event was implemented, other teams quickly took notice. They also wanted to be able to query their product features in Mixpanel. It caught on much quicker than I anticipated, and teams began adding new events every week. In this way, I feel really lucky. Mixpanel almost advocated for itself. Once people saw it, everybody wanted to use it. I love that. Alright. So walk us through how you approached ensuring data trust and consistency as more people self serve their data for the various needs of their teams. Yeah. We quickly became strong advocates of Lexicon, which is the data dictionary feature within Mixpanel. We populated it with event names, event descriptions, and images and GIFs that described where the event was tracked.
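As a rough illustration of that working-backwards approach, here is a hypothetical Python helper that builds the kind of single, consistent message action event Blake describes. The event and property names are assumptions modeled on the ones he mentions (message action name, section name, page name), not Sprout Social's exact implementation.

```python
def message_action_event(action_name, section, page):
    """Build one consistent event payload. This schema is a sketch modeled
    on the properties mentioned above, not Sprout Social's exact tracking."""
    return {
        "event": "Message Action",
        "properties": {
            "message_action_name": action_name,
            "section_name": section,
            "page_name": page,
        },
    }

# Every section emits the same event, so analysis becomes a filter
# or breakdown on properties instead of a union of distinct events.
inbox_complete = message_action_event("complete", "inbox", "smart_inbox")
cases_complete = message_action_event("complete", "cases", "case_detail")
```

The design choice is the payoff: "how many completes across the whole product?" is now a single query on one event name, broken down by `section_name` when you want the detail.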
We used tags to mark the responsible teams and even outlined event owners who could help answer questions if there were any. Even more recently, Mixpanel released data governance features that help keep the data dictionary up to date. For those that haven't yet, you can configure notifications for when new events and properties are added. This helped our teams quickly review and approve those changes, making sure that our events were always accurate and up to date. Love it. This part is essential. Alright. Let's discuss how you operationalized Mixpanel. So not just what changed, but how did teams actually start using Mixpanel? Once we had a healthy event volume in place, we began to roll out Mixpanel to the wider product organization. We started with the product management team. These are the team members that are most likely to perform that analysis, build reports, and create dashboards. Mhmm. Our overall goal was to provide wider access to product data. So the first step was training and enablement. I'm gonna applaud Mixpanel here because the resources that you all provided, the documentation, the Slack community, and the help center, did 90% of the work and will apply to any organization that wants to track data. But for the remaining 10%, we at Sprout Social used a combination of Slack, Confluence, and Loom. We created Slack channels for people to explore the data, ask questions, and get help. We built some onboarding resources in Confluence that people could use to quickly get up to speed, and we created instructional video recordings in Loom to help guide new team members on what they could do with the product itself. Beautiful. I think one of my favorite parts about Mixpanel is how the product features build on one another. Once you know how to select a metric, apply a filter, or do a breakdown, you can build all of the report types in the product.
So on this slide, you can see three very different report types and just how similar the user interface is across them. It's this consistency across the entire product experience that helps reduce the barrier to entry for anyone first coming into the product. As a result of all this work, we now have 450 plus team members at Sprout Social in our product organization that are very comfortable using Mixpanel during our software development life cycle. That's really impressive. Alright. I'd love to now learn a little bit more about how you approached measuring the success of your AI features. I know these are recently launched, but how did you approach measuring the impact there? Yeah. Like many, if not every single company today, Sprout Social has been adding a lot of AI related capabilities across our product suite. We all know that the big goal for these AI features is to create workflow efficiencies and pass those value savings along to customers and users of the product. We should all be able to work better, faster, etcetera. But how do we prove these efficiencies? How do we know that the AI features we've launched achieve this goal? Yeah. And this is where we've really leveraged Mixpanel, and it's proved invaluable for us. Yeah. Can you share an example of an insight from Mixpanel that changed your AI road map or impacted your feature design? Yeah. I'm gonna come back to this social message concept and walk us through a common social workflow, which is responding to an incoming social message. Many of our customers receive over 500 of these social messages per day across all of the social channels they connect to our product. As you can imagine, the process of sorting through and replying to those messages can take a good amount of time. Mhmm. However, Sprout Social offers features like saved replies and AI assist for quickly selecting and editing social replies with a goal of reducing the time to response for our customers.
So you can see that workflow here where a social or support agent is replying to Arletta Brown about her positive experience with Sprout Social Coffee, one of our favorite test groups here. You can see the agent working in Sprout Social click the reply button. They leverage a saved reply to quickly get some content into that message box. Mhmm. They can use AI assist to make some slight adjustments to that message, maybe personalize it, make it more friendly, more professional, and then they can send and deliver that message to the social network. So using this workflow as an example, we can build two funnel reports in Mixpanel. One of those funnel reports will feature customers replying to a social message and leveraging AI related features. Another funnel report will be customers using the same workflow but without leveraging those features. And then with these two funnel behaviors, we can quickly compare the average time to response for both. And when we did this at Sprout Social, we saw that customers using our AI capabilities during the reply workflow have an average time to reply of one point one minutes compared to one point four minutes for those users who do not leverage our AI features. And now that eighteen seconds may seem small, but it means that our customers using AI capabilities in Sprout Social gained efficiencies of about 21% when responding to social messages. So to put that in perspective, for a team handling roughly a thousand social messages per month, this translates to five hours saved monthly or sixty hours saved annually. And I think those are pretty impactful findings to reach in just a matter of minutes, and it was all made possible because of the implementation and intentional event structure we had with Mixpanel. So this is really exciting for me, and I'm excited to share more about this with you. And in the next section, Patrick will be showing how to build reports exactly like this to uncover insights like we did here.
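The savings Blake quotes can be reproduced with simple arithmetic. Nothing here is new data; it is just the numbers from his example, expressed in seconds:

```python
# Reproducing the arithmetic behind the reply-time comparison above.
avg_with_ai_s = 66      # 1.1 minutes, in seconds
avg_without_ai_s = 84   # 1.4 minutes, in seconds

saved_per_message_s = avg_without_ai_s - avg_with_ai_s    # 18 seconds
efficiency_gain = saved_per_message_s / avg_without_ai_s  # ~0.214, about 21%

messages_per_month = 1000
hours_saved_monthly = saved_per_message_s * messages_per_month / 3600  # 5.0
hours_saved_annually = hours_saved_monthly * 12                        # 60.0
```

At 18 seconds saved per message and roughly a thousand messages a month, the five hours monthly and sixty hours annually follow directly.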
Amazing. It's an achievement just to be able to track this. I think the whole industry here is pretty fresh. So, seeing how new this feature is and how long you've been tracking it, can you share a little bit about when this was launched, just to give an idea of the dataset that you've been working with to track this impact? Yeah. We've had this feature in beta for a little while now, but I believe we've launched and had this feature live for folks on a certain plan level for over six months. So as we see this get adopted more and more, or as people become more interested in using this feature, we only expect these efficiencies to go up. Beautiful. Well, we can't wait to come back to you in a year to see how those numbers change. Blake, before we move on, I just want to thank you for sharing your story with us. I think a lot of folks here have been able to see themselves at some point of your journey, and like me, are inspired by what you and your team have been capable of. Alright. I'm gonna stop sharing my screen. It is a pleasure to welcome Patrick Mackle, a member of our customer success team. Patrick, come on down to the stage. We're happy to have you. Thanks for joining us. Hi, everybody. Happy to be here. Go ahead and share my screen here really quick. You see it okay? Perfect. Great. So for this part of the webinar, I'm gonna walk through five bite sized demos that we have. In each one, I'll show you a specific way that Sprout Social uses Mixpanel. Right? What questions they started with, how they explored the data, and how those insights translated into action. My goal for you all is not just to see what they did, but how you could apply the same approach in your own work. So with that, we can go ahead and get started. Alright. Let's start from the beginning. A lot of teams still struggle to answer simple product questions quickly because they're not sure which events to start with.
Reporting feels time consuming, and monitoring signals isn't really automated. And I think this is really what slows down a lot of decision making all across the board. So what we wanna show today is how Mixpanel can take you from a question to an insight in minutes. No analysts needed. And to show how fast this can be, I'm gonna hand it over to Patrick Mackle to walk us through our first workflow. So for this first quick demo, I'm going to show you how you can answer a really simple high value question in minutes. Right? Which areas of my product are driving the most message activity? So this here is an insights report, one of the most commonly used reports within Mixpanel. In the query builder on the right hand side here, I'm gonna look for our message action event. Oops. And I'm gonna count it by total events. In the context of our application, right, message action is the action a user takes when they're on a page. This can mean giving a thumbs up, responding to it, copying the link to it. In this instance, I'm just curious about the general message action. But maybe I wanna know where this is happening. So from here, I might break down by different attributes, something like section name, which are the big product areas in our application, and then I might also break down by page name for the more granular data on which specific page it is happening. So the value here is twofold. Right? Trends tell you when activity changed, and the breakdown tells you where it's actually coming from. So first, we're gonna use this line chart to spot signal versus noise. Right? Are volumes stable? Are they growing? Are there any sudden spikes? Did we ship a feature, run a campaign, or change the UI? Or maybe a drop in a section is an early warning that needs investigation. But this isn't the only visual that's available to us. Because from here, I can also switch over to a bar chart for clarity.
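Under the hood, a breakdown like the one Patrick is describing amounts to a group-and-count over events. Here is a toy Python sketch; the event and property names (`message_action`, `section_name`, `page_name`) are illustrative stand-ins, not Mixpanel's actual API:

```python
from collections import Counter

# Toy sketch of what an Insights breakdown computes: total "message action"
# events, grouped by section and then page. Names are illustrative.
events = [
    {"event": "message_action", "section_name": "inbox", "page_name": "smart_inbox"},
    {"event": "message_action", "section_name": "inbox", "page_name": "smart_inbox"},
    {"event": "message_action", "section_name": "reporting", "page_name": "post_performance"},
    {"event": "page_view", "section_name": "inbox", "page_name": "smart_inbox"},
]

# Select the metric (total message_action events), then break down.
message_actions = [e for e in events if e["event"] == "message_action"]
by_section_page = Counter(
    (e["section_name"], e["page_name"]) for e in message_actions
)
```

The line chart adds a time dimension on top of this count; the bar chart is this aggregate view directly, which is why it makes the dominant sections and pages obvious.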
So the bar chart collapses the time dimension and gives us a clearer aggregate view. It tells you the small number of pages, or a single section, right, that are driving the majority of traffic. So for PMs, this is gold. It gives you a prioritized list of where small product changes are gonna move the biggest needle. So here, there are three practical things to do from this next view. Right? You could prioritize a Mixpanel experiment against the top two to three pages. You know, small changes here are gonna yield large impact. You could drill into anomalies. If a page shows unexpectedly high volume, we could ask whether it's intended behavior or a UX issue prompting extra actions. Or you could set alerts on the report. Alerts can help notify you of certain percentage changes so you're notified in real time. But the point is, with a couple of quick visualizations, you can move from a question to a prioritized action in minutes. You shouldn't have to sift through complicated SQL queries just to get answers to what feels like a simple question. From here, I'm gonna hand it off back to Michael. Perfect. So far, you've seen how to create your first insights report to centralize the data signals you wanna keep an eye on. The next step is building a clear line of sight into user journeys, especially as behavior becomes more complex. One of the biggest challenges teams face is understanding where users drop off in multistep workflows. Without clear visibility into friction points or how different segments move through an experience, optimizing these flows quickly turns into guesswork. So next, Patrick Mackle is gonna show us how funnels make it easy to pinpoint drop offs, compare journeys over time, and surface the same insights that Sprout Social used to improve their most important user experiences. Great. So for this demo, I'm going to show a simple funnel that answers a critical product question.
Where are users dropping off between composing a post, adding media, and then actually submitting it? And which steps should we optimize first? So when you're working with a funnels report, you likely already know the journey that a user is gonna take. You just wanna know where they're dropping off. So similar to the insights report, I'm gonna add in the steps for my funnel from the query builder. I'm gonna start with compose open. The next step is gonna be media attached. And then the last step is gonna be post submitted. I'm gonna set the conversion window here to around thirty minutes because that's the reasonable length of time it's gonna take for a user to typically do all of this. For this example, I'm also gonna hold the is existing post Boolean constant, and this just makes sure we're consistent with the posts that we're looking at. The last piece of this is I'm gonna break down by this is existing post property to see the difference between the posts that are new and the ones that already exist. But here, we can read the funnel from left to right as conversion. So for each step, I can look at the conversion rate into the next step. The first conversion, from compose open to media attached, tells you how many people who start composing actually attach media. If that conversion is low, it usually means one of three things. Right? Users don't need media to finish the post, the media picker is hard to use, or maybe the media upload is too slow or failing. So that's where you would start troubleshooting. You know, check upload errors, inspect the attachment UX, validate the property that records media type and size. The second conversion, from media attached to post submitted, is the one that actually gates submission. So a big drop here usually means users attach media but then stop before hitting submit.
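For intuition, here is a simplified Python sketch of the ordered-funnel logic Patrick is describing: each user advances a step only when the next funnel event occurs, in order, within the conversion window. Event names are illustrative, and real Mixpanel funnels handle more nuance (re-entries, holding properties constant) than this toy version:

```python
from datetime import datetime, timedelta

# The three funnel steps and the 30-minute conversion window from the demo.
STEPS = ["compose_open", "media_attached", "post_submitted"]
WINDOW = timedelta(minutes=30)

def furthest_step(user_events):
    """Return how many funnel steps this user completed, in order,
    counting from the first compose_open and within the window."""
    user_events = sorted(user_events, key=lambda e: e["time"])
    step, start = 0, None
    for e in user_events:
        if step < len(STEPS) and e["event"] == STEPS[step]:
            if step == 0:
                start = e["time"]
            if e["time"] - start <= WINDOW:
                step += 1
    return step

t0 = datetime(2025, 1, 1, 9, 0)
users = {
    "a": [{"event": "compose_open", "time": t0},
          {"event": "media_attached", "time": t0 + timedelta(minutes=2)},
          {"event": "post_submitted", "time": t0 + timedelta(minutes=5)}],
    # User b never attaches media, so they stall at step 1 even though
    # they eventually submit: ordered funnels require the middle step.
    "b": [{"event": "compose_open", "time": t0},
          {"event": "post_submitted", "time": t0 + timedelta(minutes=1)}],
}
completed = sum(1 for ev in users.values() if furthest_step(ev) == len(STEPS))
```

Reading the per-step drop-offs out of counts like these is what the funnels report visualizes as the left-to-right conversion bars.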
So common causes could be validation errors, slow network during upload, or confusing confirmation flows. Because we're looking at a thirty minute window, these drops are usually tied to immediate UX or performance issues that you can reproduce and test quickly. So one thing this funnel reveals is how to prioritize which of these steps to work through. Here, right, I'm seeing the biggest drop between compose open and media attached. So for existing posts, I'd prioritize maybe simplifying the attachment flow or surfacing a faster default option like drag and drop or a camera. If the biggest drop was instead from media attached to post submitted, I'd prioritize diagnosing upload failures and measuring time to upload so I can optimize that process. The takeaway here is that in minutes, you can turn a vague "people aren't posting" problem into a prioritized list of testable fixes. So pick the top one or two funnel steps that combine big drops and big volume, investigate any errors that might be going on, and deploy a fix. With that, we'll give it back to Michael. Perfect. Thanks, Patrick. Alright. So far, we've created our first insights report, and we built a funnel to better understand user behavior across key customer journeys. Next, we're gonna focus on flow analysis to see how users actually move through the product, including paths that might otherwise go unnoticed. Often, the most valuable product insights come from behavior that you didn't really anticipate. Traditional reporting usually assumes linear journeys, but real users rarely follow a straight line. So when this happens, hidden friction and unexpected paths can easily go undiscovered. Next, Patrick's gonna show us how flow analysis surfaces these patterns, helping teams identify and address friction before it turns into churn. Alright. So now I'm gonna open up a flows report here. This flows analysis is gonna quickly reveal some top paths that, for me, could lead into different value moments.
In our case here, we might look at something like the notification action event. I'm gonna pull that up here, and I'm gonna filter for where the notification action name is the user navigating to specific content, because maybe that's an event or action a user performs that I really care about. I'm gonna change the flow so that instead of looking three steps after, I look at the three steps before users perform that event. And then the last piece I'm gonna do here is expand by a couple of event properties to help give me more context. So I might look at notification drawer action and open it up by the name, and then I'm also gonna look at notification action and open it up by the action name. So while funnels tell you how many convert for a known user journey, flows tell you where they came from. By restricting to navigate to content and looking backwards, we can see the exact events that are actually feeding this value event, and the percent share tells us which sources to prioritize. Right? So we can read the flow from left to right toward the target node. And each incoming node shows the percentage of users that took that specific path to content. So you can start by scanning for the top one or two incoming nodes. Those are your highest priority levers because they contribute the largest share of traffic. And if one event accounts for most of that inbound share, that's the single thing you can optimize to move the needle the fastest. Then this is also the place to look for fragmentation. Right? If the inbound share is spread across a lot of small sources, it means there's no single dominant driver, and your strategy might wanna look different. If you found a user journey of interest and you quickly wanna dive into the nitty gritty of that funnel, you can actually switch from this users flows view to the top path that's here and go straight into a funnel.
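The "look backwards" idea in flows boils down to counting which events precede a target per user path. Here's a minimal sketch under that assumption; the session paths and event names are invented for illustration and are not Sprout Social's real schema:

```python
from collections import Counter

def incoming_shares(sessions, target, steps_back=1):
    """Count which event sits `steps_back` positions before each
    occurrence of `target` across per-user event paths.

    `sessions` maps user_id -> ordered list of event names.
    """
    counts = Counter()
    for path in sessions.values():
        for i, name in enumerate(path):
            if name == target and i - steps_back >= 0:
                counts[path[i - steps_back]] += 1
    return counts

sessions = {
    "u1": ["inbox_open", "notification_click", "navigate_to_content"],
    "u2": ["search",     "notification_click", "navigate_to_content"],
    "u3": ["home",       "navigate_to_content"],
}
shares = incoming_shares(sessions, "navigate_to_content")
total = sum(shares.values())
for event, n in shares.most_common():
    print(f"{event}: {n / total:.0%}")
# notification_click dominates the inbound share here, so in the
# flows framing it would be the highest-priority lever.
```

This is the same prioritization logic as reading the incoming nodes in the report: one dominant source means a single lever; a flat spread means fragmentation.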
In the top right, that's how you change it, and you can go right back over here to the users flows view. So this flows report is basically a great way to explore how users are navigating your application when you may not know the exact path they take. It can show you the actions that precede a value moment, the moments after events of interest, or where people are getting blocked in a specific user journey. So use these paths to figure out which ones need your attention rather than making blind guesses at where users are actually coming from. Back to you, Michael. Perfect. Alright. This next section is probably why many of you are here. While we're focused on AI because it's the newest problem that teams are trying to solve, the principles for measuring impact remain the same. The challenge that we see again and again is that teams are struggling to understand the real impact of AI features, which makes it hard to justify investment or to prioritize what to build next. First, a lot of AI triggered actions simply aren't tracked. So when instrumentation is missing, AI usage becomes invisible. Teams can't clearly see who's using these features or how they're being used. Second, measuring efficiency gains is difficult. It's hard to tell whether AI is actually reducing time to complete a task when there's no baseline to compare behavior before and after AI is introduced. And finally, AI ROI often remains ambiguous because adoption data isn't always connected to business outcomes, which leaves leadership without clear evidence of impact. So the solution here is really applying familiar, proven approaches: instrumenting AI actions, analyzing adoption through cohorts, and then comparing time to action to measure efficiency. This is what allows teams to connect AI usage directly to measurable business impact.
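One way to make AI usage visible, per the first point above, is to tag the relevant events with an explicit AI-assistance property at track time, so adoption can be segmented later without new instrumentation. A hedged sketch of the payload shape only; the event name, property names, and helper are invented for illustration, not Sprout Social's actual schema or the Mixpanel SDK:

```python
def build_event(name, user_id, ai_assisted, **props):
    """Assemble an analytics event payload with an explicit AI flag.

    Keeping `ai_assisted` as a first-class property means downstream
    reports (funnels, cohorts, time-to-action comparisons) can break
    down by it directly.
    """
    payload = {"event": name, "distinct_id": user_id, "properties": dict(props)}
    payload["properties"]["ai_assisted"] = bool(ai_assisted)
    return payload

# An AI-triggered reply and a manual one share the same event name,
# distinguished only by the property:
ai_reply = build_event("reply_submitted", "u42", True, channel="inbox")
manual_reply = build_event("reply_submitted", "u43", False, channel="inbox")
print(ai_reply["properties"]["ai_assisted"])      # → True
print(manual_reply["properties"]["ai_assisted"])  # → False
```

The design choice here is one event name plus a boolean property, rather than two separate event names, so the with/without comparison stays apples-to-apples.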
And so to bring this to life, Patrick's gonna show us how Sprout Social measures AI driven behavior, like AI assist and replies, and ties adoption to efficiency gains and real business outcomes. Awesome. So here is the exciting part that I know a lot of people are looking forward to. Right? In a world where everybody's excited to implement AI, it's even more important to make sure that what you've shipped is not just shiny and neat, but is actually having a measurable impact on users. So for this example, we're gonna show how Sprout Social shipped a new AI assisted inbox action and was able to use Mixpanel to measure how it lowered the time it took to craft and submit a response. So I'm gonna take a bit of a shortcut here and use something that's unique to Mixpanel, which is this concept of a saved metric. Saved metrics allow you to define a specific behavior or flow to reuse across your different reports. So in this instance, I've got two saved metrics, which are actually two saved funnels, and I'm gonna plot them in a single line chart in an insights report and look at the last three months. So I look for the saved metric average time to conversion with AI and then the average time to conversion for a message reply without AI, and I'm gonna pop them side by side here. Funnel A is gonna include the AI related inbox action step, while funnel B is actually going to exclude that step, meaning everything else is the same. Both of these funnels use a thirty minute conversion window. We hold the message ID constant, meaning it's the same message thread. And the last piece, to make sure we're comparing apples to apples, I'm gonna filter for customers whose current plan is a professional plan. This way, we're actually measuring users who have access to the same feature. But here, we can take a look at the two lines.
Here, we see the top line, the one that does not include AI, and the bottom line, which does include AI. So whenever the top line stays consistently over the bottom line, that means messages that route through an AI action convert faster on median than those that don't. Because we're using median time to convert, this is robust to outliers and tells you the typical user experience. And so here you can see that the AI response is generally below the non AI response, meaning it's faster. You can imagine the ideas that pop through your head as you look at this. Right? If the AI line is consistently lower, that's evidence of durable value. If the trend is changing, right, if the gap widens after an AI UX change, that signals your change worked. If it narrows, you may have regressed. Then there's volatility: spikes can point to rollout timing, regional issues, or telemetry gaps, and you'd investigate by drilling into the time frame. But we can make this even more clear and switch to a metric visual. And here, you can see the big difference that we were talking about earlier. Right? One point one minutes with AI versus one point four minutes without. And that gap right there is the concrete story that you can bring to stakeholders. It's not abstract. In Sprout Social's case, this was a 21% improvement in median speed, which translates to meaningful time saved at scale. The takeaway here is that Mixpanel empowers you to analyze quickly and make these decisions faster. Right? And these saved metrics enable the team to be on the same page about how they're measuring this conversion, making for an easy, repeatable process to collaborate and make decisions on these new AI features. I'll let Michael take it back from here. Perfect. Thank you, Patrick. Alright. We've seen how teams can measure impact and adoption at a granular level, but those insights only matter if they're shared. And even with strong data, teams often operate in silos.
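As an aside, the median time-to-convert comparison Patrick walked through is a short computation once you have per-message durations. A minimal sketch with invented durations, chosen only so the result lands on the same 1.1 versus 1.4 minute shape as the demo; they are not real Sprout Social data:

```python
from statistics import median

# Invented per-message completion times, in minutes, for illustration.
with_ai    = [0.8, 1.0, 1.1, 1.2, 1.4]
without_ai = [1.1, 1.3, 1.4, 1.6, 2.0]

# Median rather than mean: robust to outliers, per the demo's reasoning.
m_ai = median(with_ai)
m_no = median(without_ai)
improvement = (m_no - m_ai) / m_no

print(f"median with AI: {m_ai} min, without: {m_no} min")
print(f"improvement: {improvement:.0%}")  # → improvement: 21%
```

This is the same arithmetic behind the headline number: 1.1 versus 1.4 minutes works out to roughly a 21% improvement in median speed.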
Too often, product, engineering, and design teams operate with different dashboards and have different definitions of success. So alignment breaks down not because teams don't care, but because they lack a shared real time view of what's happening. So in our final demo, we're gonna see how teams are able to solve that. Patrick is gonna walk us through building a shared Mixpanel board with funnels, flows, and alerts so teams can stay aligned, spot issues earlier, and move faster together. Alright. I'm going to keep this one short and sweet, everybody, but here's the Mixpanel board of all of the reports that we just made. Right? This one board view, you can share with product, engineering, design, CS. Literally, everybody sees the same priorities and can act fast. The point of the board is simple. Right? Bring your most important KPIs, funnels, flows, and AI signals onto the same screen so stakeholders don't have to hunt for context. When teams open this specific board, they should immediately know where the product is healthy, where the biggest friction is, and what actions will move the needle. We can organize this board however we want. Right? We can put reports right next to one another. We can add text here. We can also add media, photos, and videos to really flesh out this board. It's also easy to filter and break down on the board itself. So rather than having to go into every single individual report, I can just go up here and break down and filter by the specific properties that I care about most. So then I can, for example, filter for a specific country, like where the country is United States. That helps me target specific demographics. And if anyone on the team wants recurring context, you can also subscribe to the board up there, and then you get updates in email or Slack whenever the board is refreshed. And that's how you keep stakeholders synchronized without extra meetings. But yep.
So Mixpanel boards allow everybody to be on the same page on what is happening with your metrics. And that kinda wraps it up for the demos here. I'll go ahead and hand it back to Michael for some last Q&A. Perfect. Thank you, Patrick. I love how you broke down each demo to mirror the exact steps that made Sprout Social successful. And while you made it look easy, being able to follow your steps in each demo is gonna make the impact Sprout Social has seen accessible for everyone here. Next, we're gonna open things up for questions. I wanna be mindful of time. So based off of what folks shared when registering, let's welcome Blake back to the stage. The first question came from a good number of folks who sound like they're currently in the process of picking a product analytics tool. So, Blake, I'm gonna pose this question to you, not as a member of our sales team, but as someone who's been in this position before: what do you feel Mixpanel can do that other analytics tools simply can't? Yeah. It's a good question, and I'll do my best to remain unbiased, but no guarantees. I mean, there's a number of analytics tools on the market today, but I think where Mixpanel kinda drew me in was that it's truly a product analytics solution, and it's really built to answer questions like we just walked through. Like, what features drive retention? What is the ROI of a feature we just launched? Answering key questions like this is harder to do in standard web analytics tools that are designed primarily for marketing. And so I would ask yourself, you know, what am I looking to answer, and which tool is gonna allow me to do that as quickly as possible? Couldn't have said it better myself. For folks who are in the process of procuring a product analytics tool, I encourage you to book a demo with a member of our sales team. And I mean this when I say it. They're here to help.
So let them know the problem that you're trying to solve for, the functionalities you absolutely need to have access to, and the options you're considering, and they're gonna be more than happy to help you find a solution and a plan that speaks directly to your team's needs. Alright. Our second question is from Blake at the Inspire Leadership Network, who wants to know: what are the top metrics reported out to cross functional stakeholders from Mixpanel that you couldn't live without? Oh, this is another really good one. I'll pick a few of my favorites and their use cases here at Sprout. We have a fully functional partnership team, you know, the folks that talk with our social network partners like Meta and teams like that. And that team reviews a dashboard that shows social network usage and adoption across all sections of our product. They love that one and check it often. Our growth team consistently reports against the trial sign-up and conversion funnel. So dashboards like what Patrick showed, with notifications for steep increases in trial conversion or trial sign-up. And then our leadership team uses Mixpanel more for, like, at a glance views. You can imagine these kinds of high level stakeholders looking primarily at monthly active users and daily active users, by unique customers or unique users. So very high level product health over time. I'd say those are my three favorite examples. Perfect. The next question is from Salim at Sprout Social, who wants to know: how are you getting nonproduct folks like customer success to start using Mixpanel to get leading indicators on customer health? Patrick, customer success is your bread and butter, so I'm gonna pass this one over to you. Yes. As the nonproduct rep here, I will go ahead and take this. Yeah. We use Mixpanel all the time internally. It helps us see customer usage in the product, right, how frequently people are coming back.
It's incredibly helpful for seeing, like, product adoption. Right? So what are our weekly active users? What is adoption looking like? It's also helpful for understanding their data health too. So how many events are they bringing into the project? How many of them are actively being used? Are there any suggestions we can make to help clean it up? And then lastly, it just helps to see if customers are performing the actions that we consider valuable moments. And if they're not, like, how can we get them there? Yeah. Great. Very straightforward. Thanks for that. Last but not least, I've got a perfect closing question for Blake. This one comes from Hannah at Sprout Social. I wanna pass it your way, not just because she's familiar, but because I think her question really speaks to the foundation of everything we've been discussing around creating intentional events and data governance. So let us know: how do you think about maintaining data quality? Yeah. Thank you. Hi, Hannah. I'll see you back at work. I strongly believe that data quality is linked to data maintenance. So what we try to do at Sprout is treat analytics events like any other software product that we build. I feel like organizations should make a practice of revisiting those events, making sure they still achieve the original goal, and making adjustments when they don't. You know, adding properties, removing properties, changing the structure, keeping events and descriptions up to date. Keeping up a healthy maintenance practice on your analytics events, in my opinion, is one of the best ways to ensure quality. Perfect. Alright. We're officially winding this down. So to close our time together, I wanna share some final recommendations and next steps for those who are interested, because our goal here is to have you leave the session with learnings you can take to level up your Mixpanel experience.
I wanna close things out with a special section called playbook essentials, where we share five practices to accelerate impact. Blake and Patrick got together to compile this list, so I'm gonna pass it on to them. Yep. So the first and foremost practice, for teams just starting out: please build a tracking plan. I've found that for so many of my customers, creating a tracking plan before any of the coding begins makes their developers' lives so much easier. It makes your data significantly cleaner, and it really reduces the time to insight. Yeah. The time you spend planning here will pay off 10x later on. Mixpanel even provides a tracking plan template that you can use, which we followed right along ourselves. Yeah. So I think Michael is actually gonna drop a link to that tracking template in the chat. You can click the link, bookmark it, make it your own. We'll also include this in our post event email just to make sure everybody has access to it. But the next step I would say is to prioritize tracking. Right? When you first get started, the idea of tracking everything is exciting. This is kinda connected to the last idea, but take it from me, it's best to be really intentional. Start with, like, your most important value moments in your funnels first, whether that's three to five events or seven events, and then you just keep iterating on that as you go along the way. Yeah. There's no need to implement everything all at once. Look across your product and try to figure out what gets you to 75 or 80% coverage. Start there, and you can add more after. The third thing is to create intentional events. Right? The most successful folks that I work with are the ones that can back up why they're tracking what they track. Knowing a user's subscription type tells me if they're a paying user. If they're a paying user, maybe that's a subset of users I want to increase the retention of, which maybe leads to an increase in revenue.
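The tracking plan and intentional events practices above can be enforced mechanically: keep the plan as data, record why each event exists, and check payloads against it before they ship. A minimal sketch; the plan contents, event, and property names are hypothetical:

```python
# A tracking plan kept as data: each event declares the properties it
# must carry and the reason it is tracked at all.
TRACKING_PLAN = {
    "post_submitted": {
        "required": {"subscription_type", "section"},
        "purpose": "Ties posting behavior to plan tier for retention work.",
    },
}

def validate(event_name, properties):
    """Return the set of required properties missing from a payload.

    Raises KeyError for events that were never planned, which is the
    point: unplanned events should fail loudly before they ship.
    """
    spec = TRACKING_PLAN.get(event_name)
    if spec is None:
        raise KeyError(f"{event_name!r} is not in the tracking plan")
    return spec["required"] - set(properties)

missing = validate("post_submitted", {"subscription_type": "professional"})
print(missing)  # → {'section'}: the payload forgot a required property
```

A check like this could run in CI or in a thin wrapper around the tracking call, so the plan stays the single source of truth rather than drifting from the code.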
So Mixpanel metric trees are one intentional way to think about this. Each event or property ladders up into a broader metric that impacts a North Star that we care about most. Yeah. And like we talked about earlier, think about the common actions across your product. Outline data properties that can be used to describe those, and then consider how you may wanna break that data down in the future. Is it by industry, subscription type, section, page? Those will all be inputs into your events. Yep. And fourth, as I kinda showed before in the screenshot: please leverage Lexicon. This is where it comes in really handy. Lexicon is your one stop shop for all things data governance, whether it's adding descriptions to your events, adding images, assigning owners, or verifying your events. There are lots of different options there that help you clean up your data. Yeah. Last but not least is hands on keyboard. Blake, you wanna speak to this a little more? Yeah. This is a term we use at Sprout Social quite often, but Mixpanel has built an incredible, easy to use product. So get people in it. That way, you don't have to advocate for it; the product does it itself. Love it. I'm an experiential learner too, so I learn best by doing. I think it's really important to just get in there, test, build, experiment. That said, our customer education team has been very busy building out a range of resources I know you're gonna find helpful. Many of you have seen our documentation guide, but we're in the process of rolling out certification courses and educational modules, which are fresh off the press and available for everyone this month. So with that, it brings me to our enablement resources section. We're gonna be including links in our post event email, but I wanna make sure that everyone has access to the courses, learning paths, and templates that we referenced here today that we know are gonna help you be successful.
The slide deck is gonna have these hyperlinks to make it super easy to access, but I just wanna make sure I flag them here. Before I hand this off to Blake to close us out: for those of you who are active Mixpanel users, we'd love it if you could take just five minutes to leave us a review on G2. Our team actively reviews your responses, which helps us evolve, improve, and grow as an organization. So take a moment to let us know what we're doing right so that we can double down and make Mixpanel everything you need it to be. Blake, is there anything fresh from Sprout Social's product team you wanna share? Sure. Absolutely. There are two product releases that I feel really proud of that our team's been working on for a while. They're listed here. One is Guardian, which is an add-on product for Sprout Social to help brands manage their compliance and brand safety needs. And then we also just announced and launched Trellis, which is our AI social listening agent, which helps customers understand how they're performing on social and what their brand performance is in the social sphere. If you already have Sprout Social, please reach out to your rep and ask about those two new offerings. If you're not a Sprout Social customer, we have a thirty day free trial. Not many of our competitors offer that, so click the link here and try us out. Thank you all. I love it. Alright. Thank you everyone for sharing your time with us. Thank you, Blake and Patrick, for sharing your experience and areas of genius. This was such a pleasure. For everyone else, we look forward to seeing you in future webinars. Coming up will be one dedicated to session replay, so stay tuned. Till then. Thanks, everyone. Thanks, all.