Crystal Widjaja, EIR @ Reforge, ex Chief of Staff @ Gojek

13 Feb 24

The Analytics mistakes startups are making

Everyone says you need to track your numbers from day one like it's a big deal. But when you're heads-down building your product and getting people to notice it, who has time for analytics?

I know, you think you've got everything under control. Your gut tells you you're doing the right thing because you care a lot and you're always thinking about how to make your product better.

But here's the thing: sometimes, you need to stop and look at the numbers. It's important, even if it doesn't seem fun.

I've seen startups that don't pay enough attention to their data and then wonder why things aren't working out. Some founders don't think their product is ready for analytics, while others track so many numbers they don't know what to do with them.

Don't worry, though. I'm going to walk through the ten big analytics mistakes startups make. I'll help you understand things like who really loves your product and how to improve it without overcomplicating things.

Ready to find out the ten big mistakes that can stop your business from doing really well? Hold on tight. This isn't just a list of dos and don'ts. It's a straight-up, honest guide that might save your startup from some serious trouble.

Pitfall 1: “Our product isn’t ready for analytics” 

The first mistake startups make is not doing any analysis until they feel they've found product-market fit. I've worked with founders who say things like: "But there are only a handful of users, that's not going to be 'significant,'" or, "What we need to do right now is just build the product." The most common resistance to early-stage analytics, however, is "the product's not ready yet."

When it's all hands on deck to generate revenue and build something the world loves, analytics can feel like something you won't need until a later series or growth phase. The problem with this is: how do you know you have product-market fit if you're not doing analysis? And how will you properly find it?

While thought leaders can be hand-wavy about what product-market fit is, calling it "a feeling," "when everything's breaking," or "when there are more sales than you can handle," there are very specific empirical signs.

How do you know if you have product-market fit if you aren't doing analysis: if you aren't measuring cohort retention rates, if you aren't finding out who is churning? If you rely only on customer calls or your own experience, there's a hard bias. Your company has 50 people who all happened to want to join a startup; that's a very particular type of person, and one who doesn't represent the majority of consumers.
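If you want a concrete starting point, here's a minimal sketch of a weekly cohort retention calculation with pandas. The events table and its user_id/timestamp columns are illustrative assumptions, not any particular tool's schema:

```python
import pandas as pd

# Assumed input: one row per usage event, with a user id and a timestamp.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3],
    "timestamp": pd.to_datetime(
        ["2024-01-01", "2024-01-09", "2024-01-02", "2024-01-16", "2024-01-03"]
    ),
})

# Assign each event to a week, and each user to the week of their first event.
events["week"] = events["timestamp"].dt.to_period("W")
events["cohort"] = events.groupby("user_id")["week"].transform("min")
events["weeks_since_first"] = (events["week"] - events["cohort"]).apply(lambda d: d.n)

# Distinct active users per cohort per week, divided by the cohort's size.
active = (
    events.groupby(["cohort", "weeks_since_first"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)
retention = active.div(active[0], axis=0)
print(retention)  # rows: cohorts; columns: weeks since first use; values: retention rate
```

If retention flattens into a healthy plateau for cohorts that look like your target market, that's an empirical signal; a plateau that only holds for users who resemble your own team is exactly the bias above.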


That bias is huge. It has the potential to make you say: yes, I've found product-market fit, when what you've actually found is a small audience of retained customers who are just like you. This can float you through Series A, maybe even B, but it won't take you to the hyper-growth you'll need to succeed. You have to check your own biases.

Conversely, some founders assume there isn’t any usage and keep pivoting when there are actually sticky aspects of their product. In this case, you might be thinking your product isn’t ready yet, but for a very specific segment of users, it’s already quite valuable. 

I've run into this before and had to point out a segment of users who were religiously using the product and loved it. You don't want to pivot away from early adoption and retention. These indicators are a great path to follow as you hunt for the proverbial product-market fit.

In summary, the number one mistake startups make is not doing analysis at all. It's a cognitive bias that says: I know my product, my customers, and my plan so well that I don't need to check with the data. Which, I'll remind you, is a quantitative reflection of people. To me, that feels like running blind and ignoring the signals around you. This is why, of course, I'm sharing this piece on June's blog: those folks built an analytics tool specifically for young startups that don't have time for heavy tracking plans or money for data engineers.

Pitfall 2: Cohort analyses that include internal usage

The second most common mistake is a bit of a face-palm moment, and, luckily, an easy one to avoid. You need to filter out internal usage. 

If you are using your product, you’re likely creating new accounts and testing new user experience flows as they’re built. Let’s say you launch a new feature, and all 50 employees create an account or try it out. At the same time, you’re probably heavily monitoring new cohorts to see how the feature performs. Unless you filter out internal usage, you’re going to think retention is terrible because 50 people signed up and then dropped off on days two and three. 

Filters in june.so

Conflating this data means you have no idea what's going on in the product. Think you’d never do this? I’ve seen it enough to need to flag it. Check with your PMs and team to make sure your analysis excludes internal usage.
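The fix is mechanical once you remember to do it. As a sketch, assuming your raw events carry an email field (adjust to whatever identifier you actually store):

```python
import pandas as pd

# Hypothetical internal domains; replace with your company's own.
INTERNAL_DOMAINS = ("@yourcompany.com", "@yourcompany.dev")

def exclude_internal(events: pd.DataFrame) -> pd.DataFrame:
    """Drop teammates' rows so test accounts don't drag down cohort metrics."""
    is_internal = events["email"].str.lower().str.endswith(INTERNAL_DOMAINS, na=False)
    return events[~is_internal]
```

Better still, bake the same filter into your dashboards (as in the June filter shown above) so no one has to remember it analysis by analysis.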

I'd also add that an analogous problem happens when teams forget to separate their "signup" button from their "login" button. In this case, all existing customers who log in through the homepage get mixed into what looks like your new-user signup flow. This can make it impossible for marketing to understand what's happening on your website and in your signup flow. I also see this consistently in Google Analytics, where homepage traffic is impossible to separate from new-user flows.

Pitfall 3: You track a lot of events but haven’t learned anything 

You're a data-driven founder. You're launching new features. You want to understand, empirically, what customers love about your product. So you track usage on every new feature, every upgrade, and every change. I love the intention, but don't forget to zoom out and think about your "job-to-be-done," if you will, and test different ways of slicing and dicing your data.

Here’s what I mean:

Say you're a transportation company. There are feature bundles within "the ride," like the "motorcycle," the "car," and the "luxury car," and you just released the "extra large car." Maybe "the ride" shouldn't include the "extra large car," because this is a different use case. The "motorcycle," the "car," and the "luxury car" are built for people riding from A to B, while the "extra large car" is built for people moving items. So you wouldn't want to bundle all four of these, more broadly, "transit" features together.

If you look at the data feature by feature as you launch, you're going to see strange usage patterns that don't add up; the "extra large car" might look like it's underperforming because its patterns vary wildly from the other transit options. Bucket all of these transit options together and you'll miss the emerging use case for the extra large car and wonder why retention dipped. You might think it's an unwanted feature because it isn't being chosen, when in fact you need a better matching tool to pair customers with large-vehicle drivers, since those drivers aren't constantly roaming like smaller vehicles.
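One lightweight way to test a bundling hypothesis is to map raw feature events onto job-to-be-done bundles before computing any metric. A sketch with made-up event names:

```python
import pandas as pd

# Hypothetical mapping from raw feature events to job-to-be-done bundles.
BUNDLES = {
    "motorcycle_ride": "ride",   # moving people from A to B
    "car_ride": "ride",
    "luxury_car_ride": "ride",
    "extra_large_car": "move",   # a different job: moving items
}

events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3],
    "event_name": ["car_ride", "motorcycle_ride", "extra_large_car",
                   "car_ride", "extra_large_car"],
})

# Analyze each job separately instead of one aggregate "transit" number.
events["bundle"] = events["event_name"].map(BUNDLES).fillna("other")
print(events.groupby("bundle")["user_id"].nunique())  # distinct users per job
```

Because the mapping is just a dictionary, you can re-slice the same raw events under a different bundling hypothesis in minutes.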

This is an easy mistake to make because, as a startup, you’re Frankenstein-ing a product together as you build new features and functionality. You’re doing anything you can to create multiple paths to that one experience. You’re trying to solve a problem in a few different ways until you find the market’s preferred way. 

I often find teams doing analysis at an aggregate level, looking at retention or activation overall instead of segmenting by these mental models of feature bundles, so to speak.

Maybe a cohort performed twice as well? Check who those users are and what they did.

I've found it helpful to think of your "feature bundles" as quantitative reflections of your jobs-to-be-done. The jobs of "ride" and "move" are very different. If slicing and dicing isn't fruitful, consider interviewing customers and asking how they're using features to get tips on how to bundle them.

It does take some time and clarity to think through the different approaches to solving a problem: how they're segmented and bucketed, and which features are complementary versus cannibalizing.

Pitfall 4: You always take the average and forget the median 

Our fourth tip is quick and easy, as long as you actually do it: whenever you take an average, take the median as well.

Feel like 7th-grade math? It is, but many people forget to take both. 

If the difference between the two is insignificant, then you can trust your averages and maybe even use a weighted average. But if there's a large discrepancy, then you know something about the distribution of usage. If you've got strong outliers distorting your average usage, you're going to want to find out why.
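In code it's genuinely two extra lines; a sketch with made-up session lengths:

```python
from statistics import mean, median

# Hypothetical session lengths in minutes; one power user skews the average.
session_minutes = [3, 4, 5, 4, 6, 5, 240]

print(mean(session_minutes))    # ~38.1: looks like healthy engagement
print(median(session_minutes))  # 5: most users barely touch the product
```

A mean of roughly 38 against a median of 5 is exactly the kind of discrepancy that should send you hunting for the outlier.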

Pitfall 5: You believe documentation will slow you down

So you're taking notes, you're running analytics, you're looking at medians, you're grouping your product features by functionality (jobs-to-be-done); now it's time to make it easy for everyone on the team to contribute to and access data through documentation.

Ok, before you groan at the word “documentation” and picture a long Google Doc or Notion page, hear me out. 

All you need is a Google Sheet with a few tabs that work as a home base for your data frameworks and to build a practice of quickly dropping data conversations into this home base.

If someone says, "Oh yeah, that table and that column…it has values A and I, and A means active and I means inactive," write that down in the Google Sheet and just keep adding to it. This is literally the first thing I do on the first day at every startup I've ever worked at. In 90% of those startups, they are still using that Google Sheet today, because it becomes a wealth of information and knowledge that people can't imagine working without.

While that might sound disorganized, it’s easy peasy to Control+F and search for key terms, charts, and definitions as individuals on your team review and create new data. 

Want a bit of organization? Create just a few tabs in Google Sheets. Perhaps something like the tabs below: 

  • Event Tracking: List your event names, when they trigger, and what properties are associated with each event.
  • Databases: Perhaps you use BigQuery or MongoDB. Track table names, column names, and values, and what those mean.
  • Important Dashboards: As your teams grow, so will your number of important dashboards. Make a home base for quick links to common analytics or dashboards.
  • Data Definitions: This is where you document the taxonomy of how you talk about user actions at the company. For example: what does "signup" mean, what is a "lead," and how does this relate to something like "account created"?
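The same shape works in a sheet, a CSV in the repo, or a quick script. A sketch of the Event Tracking tab with hypothetical entries:

```python
# Hypothetical rows for the "Event Tracking" tab of the data dictionary.
EVENT_TRACKING = [
    {"event": "account_created",
     "trigger": "user completes the signup form",
     "properties": ["plan", "referral_source"]},
    {"event": "service_selected",
     "trigger": "user picks a service on the home screen",
     "properties": ["service_type"]},
]

def lookup(term: str) -> list[dict]:
    """The Control+F equivalent: return every entry that mentions the term."""
    return [row for row in EVENT_TRACKING if term.lower() in str(row).lower()]

print(lookup("signup"))  # -> the account_created entry
```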

Every time I hear an acronym or a term that I don't know, I just write it down, and I list the definition in that same data dictionary document. I think it's part of the culture building.

To be honest, I actually encourage it. Some of the best startups I've seen were so passionate, felt so much like a team, and were so united, and they had these weird shared languages. They have services named after inside jokes. At Gojek, we had a service we named Kenny, because it couldn't die but also because it kept dying. All of these things make everyone feel like they're in it together. The shared language is going to happen either way, so get it documented so that new people can come on board and feel like part of the team.

Pitfall 6: You’re running analyses that won’t change anything

If you've ever been on the receiving end of a weeks-long experiment or analysis, only to find out you don't have the resources to implement its results, you know just how frustrating it can be.

That’s why I’ll offer up a shortcut: only run an analysis if you know it’s going to change a decision you make or how your teams run the business. 

I'll often ask someone who is asking for some analysis or some chart: okay, let's say that the answer is a hundred percent. You are a hundred percent right. Are we going to do anything different? That's not me saying no; that's me reprioritizing and trying to figure out whether this analysis will actually translate to a change in our business or is just curiosity and entertainment.

The latter, at the end of the day, is just busy work. If that sounds a bit harsh, consider why you're running so many reports. Oftentimes it's because teams are trying to report on their success to keep their jobs or lines of work open, or because it feels like big changes need data behind them. Fight this impulse. You can move faster, with more trust, if you get clear on what you're strategically doing as a team.

Pitfall 7: Your data is so complex that no one bothers to use it

It can be tempting to create incredibly nuanced events for every nuanced action in-app. The more you know, the more you grow, right? Well, the problem comes when your event-tracking strategy is so complex that no one interacts with it day-to-day.

It's unlikely that every engineer trying to ship quickly is going to name every column intuitively, document every return value, and label things as active or inactive. In fact, it's inefficient to do this.

Start by reducing the complexity of documenting and sorting data. The Google Sheet I described above makes it easy to slap in some bare-bones documentation and Control+F your way to what you need. This can work, especially for startups. And it means the new kid on the team doesn't have to resort to Slack and pestering more senior developers to get the job done.

Creating and Using a Data Tracking Plan

And then, be smart about what you label as events versus properties. If your product does only one thing and the event name can be super clear, go with the clear event name. This allows you to collapse in-product actions into big-picture events to analyze the funnel, and to tie smaller in-app actions to properties.

Before I took over the analytics tracking at Gojek, they would do things like "select a ride," "select a car," "select food," "select go." The problem is that the premise of Gojek is that you can do many things in one app, so what are those many things? I ended up collapsing these because otherwise you couldn't actually analyze the funnels in aggregate. Instead, I changed it to "select a service," with properties underneath: car, ride, transport, etc. Whoever designs the data taxonomy needs to be aligned with the strategy of the company. They need to understand the basic mental models, the fog of war in which we organize.
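In instrumentation terms, the refactor looks something like this; a sketch using a Segment-style track(user_id, event, properties) call, with names invented for illustration:

```python
import analytics  # e.g. Segment's analytics-python; any SDK with a track() call works

# Before: one event per service, so funnels can't be analyzed in aggregate.
#   analytics.track(user_id, "select_ride")
#   analytics.track(user_id, "select_car")
#   analytics.track(user_id, "select_food")

# After: one big-picture event, with the specific service as a property.
def select_service(user_id: str, service_type: str) -> None:
    analytics.track(user_id, "service_selected", {"service_type": service_type})

select_service("user_123", "car")
select_service("user_123", "food")
```

Now the "select a service" funnel can be analyzed in aggregate, and any single service can still be pulled out by filtering on the service_type property.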

That said, many startups are still iterating on their jobs-to-be-done and figuring them out. In that instance, I suggest using explicit event names, because one of them might end up exploding. Make it easy to differentiate one bet from another in your analysis, but know that this means you have to commit to killing event tracking later.

Pitfall 8: You’ve never removed event tracking…like ever

The mistake most companies make is that they never remove tracking. They let it go out of date, with historical tracking left in some places and not others. And if I'm being honest, I'm trying to think of the last time I heard a team prioritize removing events.

Every time teams rewrite their instrumentation, they should also boil their events down to just twenty while cleaning up and restructuring their data. Many people will say that's impossible or unnecessary. They'll give examples of marketing teams needing all kinds of events on the website and product teams needing events on new actions. But at a layer of abstraction where there are just 20 events, you limit the organizational debt of maintaining and analyzing massive amounts of data. It forces teams to really think about what they need to track, why, and what it would change.

Pitfall 9: You have a data team, but they’re not looped into product strategy 

The wider problem within most teams is simply having a strategy at all: an actual, company-wide strategy.

If I boiled this down, it might look like one day where everyone sits down and asks: What is our business? What are our product groups? How do we think about our users and the way we organize?

Once the product strategy is defined, you can easily point to the use cases in the product that need to be tracked, and then product can have the data team, or even engineers, come back with a proposal for how to do this effectively. That looks like a team logically structuring different groups of user behaviors or features into organized use cases.

Analysts were distributed and reported up to the head of each business unit (source)

What I've seen, on the other hand, is a divergence between the business teams and the engineers. The business starts to dream up different product-market fit tests, and the engineering team is just throwing things on top of one another, because that's the only license they're given: they're told to just ship faster. At some point, there needs to be a convergence where everyone comes together and looks at technical debt and business debt. That conversation needs to happen, and it needs to trickle into your data instrumentation.

“I actually find that when I look at a company’s event tracking, it very much reflects how aligned they are with the executive team and the organizational strategy. Companies that are really tightly aligned actually have very clear data instrumentation; it’s really well structured in terms of the level of abstraction. But for companies that are just a Frankenstein of a product, you end up with a Frankenstein of events.”

If you've ever heard the saying that software takes the shape of the teams that build it (Conway's law), then it's no surprise that data and event tracking follow the same lines.

Pitfall 10: You don’t have the resources for a robust data team

Now, I've talked quite a bit about ways to avoid common analytics mistakes, but how do you do this with limited resources? One: use that Google Sheet for easy documentation. Two: with a smart, aligned company strategy, all engineers, marketers, and product people will know that there are just a few "jobs" to test and only about 20 events that need to be tracked.

Conclusion

Wrapping this up: handling the numbers in your new business can seem tough, but it's super important. I've seen all sorts of mistakes, from not using numbers early enough to making things too complicated. The key is to learn from these mistakes, not be afraid of the numbers, and use them to really understand your product and your customers.

Remember, your numbers tell the story of your business. They help you see what's working and what's not. And while it might seem like a big task, tools like June are here to make it easier, especially when you're just starting out.

So keep it simple, stay curious, and let your numbers guide you. They're more than just figures; they're the signposts on your journey to making your business awesome. Use them well and watch your business grow!

_________

Crystal Widjaja is a product and growth practitioner with experience leading some of the fastest-growing companies in Asia. Crystal was SVP of Growth and Data at Gojek, a "super app" for multiple consumer service businesses. Crystal is an angel investor through the Monk's Hill and Sequoia Scout programs, a Subject Matter Expert at Reforge, and an advisor to companies including AB InBev, Maze, Carousell, CRED, and Eppo. She also cofounded Generation Girl, a non-profit focused on increasing STEM access for young women in Indonesia.

