Developing an Effective Long-Term Data Strategy for Your Company
Analytics is considered critical at many companies, but few get it right. Most companies quickly realize that putting a strategy in place is actually quite hard. It is easy to drown in the hundreds of available tools, the hype, and other distractions.
But it is possible to develop an effective short- and long-term data strategy. These ideas come from my conversations with our clients and from learning which aspects of data they find most confusing.
Start With a Good Analytics Foundation
Just like with houses, a good foundation makes a huge difference for your data. I have seen companies forced to spend more resources than they should have because their initial foundation was weak.
Before you start writing any code or setting up any tools, there are some challenges you should expect when it comes to analytics for your company:
- Your team will encounter hundreds of tool options, so you need a way to experiment with new tools easily.
- You will need to collect data from different sources, e.g. websites, mobile apps, servers/backends, and other tools.
- Data governance will be important from day one and will continue to increase in importance as you have more data.
- You will need to keep track of all the data that you collect to ensure compliance with regulations like GDPR and any future equivalents.
A good foundation needs to account for these things from day one. If you’re just starting out, an off-the-shelf solution will be good enough for most of these items.
Tools that help you solve most of these challenges are typically called CDPs (Customer Data Platforms). It is a relatively new category, but one that is quickly growing. The most popular options are Segment.com, mParticle and Treasure Data.
All of these options are meant to simplify your data collection efforts. Instead of implementing multiple tools (Google Analytics, Mixpanel, Intercom, etc.) separately, you send the data to a CDP and it translates and forwards it to each of those tools for you. The CDP becomes the single place where all data ingestion happens before being distributed to different tools and destinations.
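The fan-out pattern a CDP implements can be sketched in a few lines of Python. This is purely an illustration of the concept, not any vendor’s actual API; the class names, `track` signature, and destination names below are all hypothetical.

```python
# Minimal sketch of the CDP fan-out pattern: one track() call,
# many downstream destinations. All names here are hypothetical.

class CustomerDataPlatform:
    def __init__(self):
        self.destinations = []  # e.g. Google Analytics, Mixpanel, Intercom

    def add_destination(self, destination):
        self.destinations.append(destination)

    def track(self, user_id, event, properties=None):
        """Ingest one event and forward it to every destination."""
        payload = {
            "user_id": user_id,
            "event": event,
            "properties": properties or {},
        }
        for destination in self.destinations:
            destination.send(payload)
        return payload


class LoggingDestination:
    """Stand-in for a real tool integration; it just records events."""
    def __init__(self, name):
        self.name = name
        self.received = []

    def send(self, payload):
        self.received.append(payload)


cdp = CustomerDataPlatform()
mixpanel = LoggingDestination("Mixpanel")
ga = LoggingDestination("Google Analytics")
cdp.add_destination(mixpanel)
cdp.add_destination(ga)

# One instrumentation call; every destination gets the same event.
cdp.track("user-42", "Signed Up", {"plan": "free"})
```

The point of the pattern is that your product code only ever calls `track()` once per event; adding or removing a downstream tool is a configuration change, not a re-instrumentation project.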
If you don’t want to use an off-the-shelf solution, you can also build your own, which I have seen companies (mostly larger ones) do. Either way, you want something that solves the challenges I listed above.
Finally, make sure all of your data is stored in a data warehouse. You’ll have plenty of options, like Amazon Redshift, Google BigQuery and Snowflake, and most of them will likely be supported by the CDPs I listed above. This data can be analyzed using SQL and other tools, which means it might not be that helpful in the short term (especially if you don’t have data analysts), but it will become relevant in the future.
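To make the warehouse point concrete, here is a minimal sketch of the kind of SQL analysis a warehouse enables, using Python’s built-in sqlite3 as a stand-in for Redshift, BigQuery or Snowflake. The events table and its columns are made up for illustration; they are not any particular CDP’s schema.

```python
import sqlite3

# In-memory stand-in for a real warehouse (Redshift, BigQuery, Snowflake).
# The events table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        user_id TEXT,
        event   TEXT,
        source  TEXT   -- website, mobile app, backend, etc.
    )
""")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("u1", "Signed Up", "website"),
        ("u1", "Upgraded Plan", "website"),
        ("u2", "Signed Up", "mobile"),
    ],
)

# A typical analyst question: how many unique users fired each event?
rows = conn.execute("""
    SELECT event, COUNT(DISTINCT user_id) AS users
    FROM events
    GROUP BY event
    ORDER BY event
""").fetchall()

for event, users in rows:
    print(event, users)
```

Because the raw events sit in one queryable store, any question that spans tools or sources (here, website and mobile) becomes a single SQL query instead of a manual export-and-merge exercise.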
Avoid the common analytics pitfalls
Even if you set up a good analytics foundation, you can still run into pitfalls. There are three common pitfalls you should be particularly aware of.
1. Vendor Lock-in
It doesn’t matter how excited you are about any new vendor and how much this vendor will “grow with you.” You should always be wary of any vendor lock-in, especially when it comes to data.
If you set up the right foundation, your data should flow freely through all of your different tools while also being stored in raw format in your data warehouse. You shouldn’t have any data that is available “only” through a specific tool.
In reality, you might not achieve this 100% of the time. Some data will be vendor-specific (email campaign data, CRM data, etc.), but you should keep track of this data and find ways to export it as soon as possible.
2. Overpaying for Tools
The second pitfall I see is companies paying for more than they actually need. This is usually a matter of overestimating how much your company can actually do with data. I constantly see companies buy enterprise analytics tools without the resources to actually use them. The tools then go unused until it is time to renew those contracts.
When choosing new tools, be conservative in estimating how you will use them. Don’t assume that you will go from nothing to advanced overnight.
3. Too Many Tools
Our last pitfall is having too many tools. I meet companies that have 10 different tools in their stack but aren’t able to use any of them with a high level of proficiency. This is one downside of a good analytics foundation: because your team finds it easy to set up new tools, they will constantly add new things to the website or product.
Experimenting with new tools is fine, but you need to make sure you’re getting value out of your existing tools first. You can do a lot with seemingly “poor” tools, as long as you learn how to use them.
How Your Analytics Strategy Changes Over Time
The foundation covered at the beginning is a great starting point, but it will change as your team and product change.
Development Team Size
The first area is the size of your development team. Most of the technical details behind data collection will be managed by someone on your development team. This can be quite a burden for small teams that are already overcommitted.
If your development team has fewer than five people, keep data implementation and maintenance to a minimum. This means using tools that auto-collect data for you, such as Google Analytics, Heap Analytics or Hotjar.
If your development team is bigger than five people and you can afford to dedicate 10-15 hours a month to analytics, then you can look at setting up more advanced tools like Mixpanel and Amplitude. These tools require upfront setup and regular maintenance, which can make them quite expensive in terms of development time.
If you have a dedicated data engineer, then you can focus on setting up even more advanced and flexible systems like data warehouses and custom ETLs.
Data Analysts In-House
The second area that will affect your analytics needs is how many data analysts you have in-house. Data analysts are technical and can analyze data directly with SQL, which gives them flexibility in how they work with data.
However, if you don’t have data analysts in-house, it likely means that your business-side team isn’t very technical, and you will need tools with more user-friendly interfaces for analyzing the data. Keep this in mind as you choose the tools in your analytics stack.
Enterprise Support
The third and final area is enterprise support. Once you have enough data volume, you will need enterprise-level support and packages. Apart from the change in cost, you also need to be mindful of the long-term commitments you’re making to specific tools or strategies. Signing multi-year contracts may turn out to be a mistake as your needs and capabilities change.
Data can be a competitive advantage, but you need to be smart about how you tackle it. Remember to start with a good foundation you can build upon; it will determine how easy or hard certain things will be in the future.
You should also watch for the common pitfalls (vendor lock-in, overpaying for tools and too many tools) while balancing your needs as they change with your company and product. At the end of the day, the goal is to avoid over-optimizing your analytics stack and instead focus on being “good enough” for your current company stage.