Thoughts on a business-first approach to digital analytics
For more than a decade, digital analytics has been portrayed as a discipline defined by toolsets like Google Analytics, Adobe Analytics, Mixpanel, Amplitude, and Snowplow, along with all the “new” players promising to “finally unify your data.”
But despite the growing number of platforms and the exponential growth in data collection capabilities, businesses continue to struggle with one persistent fact: no single digital analytics tool can solve all their needs.
Hang in there for a second.
This is not a failure of the technology.
It is a reflection of the complexity of modern business itself, and, more crucially, a sign that organizations are finally approaching a business-first, not tool-first, approach to data gathering.
If we want a return on our analytics, we need to focus on the return first, not the tool.
Fragmentation is not a bug, it’s a feature of maturity
The analytics landscape has fractured over the last decade, slowly at first, but now faster and faster. Ten years ago, most digital teams could get by with one or two tools providing clickstream data and simple attribution models.
Today, businesses assemble stacks of tools spanning behavioral analytics, CDPs, data warehouses, experimentation platforms, journey orchestration, and more.
Many complain about this fragmentation, dreaming of the good old days with a “single source of truth” or “one tool to rule them all.”
But this is not a regression, it is an evolution. Fragmentation exists because no single vendor can simultaneously offer best-in-class capabilities across every analytical function that one could wish for.
From real-time behavioral tracking and advanced segmentation to predictive modeling, experimentation, and activation, it is practically impossible to lead everywhere while remaining fully compliant with an ever-shifting set of legal requirements.
The market has fragmented not due to problems with the individual tools, but because digital analytics has matured into a multidimensional discipline.
We should no longer ask, “Which tool should we use?” but rather, “What is the right combination of capabilities that fits our internal resources, business goals, data maturity and customer strategy?”
Different businesses, different stacks
One of the key reasons there is no universal analytics solution is that different types of businesses have inherently different data needs.
A stack that serves a high-volume international B2C ecommerce business perfectly will likely frustrate a low-volume, high-complexity B2B SaaS company with a long sales cycle, and vice versa.
And the reason is simple if we look at the business itself.
Ecommerce businesses typically care deeply about funnel analytics, conversion rate optimization, real-time performance tracking, and multi-touch attribution.
They benefit from tools that offer event-level tracking, product affinity modeling, and integration with advertising platforms. A common stack might include Piwik Pro or GA4 for general reporting, a lightweight product analytics tool for behavioral insights, and a CDP, whether Piwik Pro's or a composable one built on Google Cloud, to unify profiles across channels.
Subscription businesses focus more on lifecycle metrics: retention, churn, customer lifetime value, feature adoption, and cohort analysis.
They need event-based tracking tied to individual user journeys over time. Amplitude is often a strong fit here, along with data warehouse-backed reporting in platforms like Looker or Tableau for financial and product metrics.
B2B companies introduce even more nuance. Sales cycles are longer, buying groups are more complex, and intent signals can be subtle.
Here, web analytics is only one part of the puzzle. Integrating CRM data, marketing automation, content engagement, and lead scoring often requires a strong data engineering layer. Reverse ETL tools, identity resolution, and offline attribution are more critical than pixel-perfect clickstream dashboards.
Marketplaces and platforms need dual-lens analytics: supply and demand.
They track both customer experience and vendor activity, needing granular behavioral data, reputation systems, and liquidity metrics. These models often outgrow off-the-shelf tools quickly and benefit from a robust, custom data warehouse model.
Each of these business types operates with different questions, success metrics, and operational rhythms. Trying to squash them into the same analytics tool is a recipe for compromise. The fragmentation of the tool landscape reflects this natural diversity.
Just imagine you sell shoes, and think about what data you need.
If you are the factory or wholesaler, your data needs are long term; for a shoe store they are different again; and even if you sell shoes online, there is a huge difference in which data serves your business best depending on whether you are a marketplace or an ecommerce store.
Having all the data will not add value.
There is no one-size-fits-all.
That is why the smartest companies are now beginning to recognize that their data strategy must reflect their business model, not just some abstract notion of “best practice.”
Tool-centric thinking is obsolete
Many organizations still fall into the trap of selecting analytics tools based on market popularity, internal familiarity, or vendor hype, rather than starting with a rigorous articulation of their business questions.
The result?
Massive implementation projects that fail to deliver actionable insight, dashboards that no one uses, and siloed datasets that do more to confuse than clarify.
A tool-centric approach assumes the tool knows best.
But tools are built on opinions: about what to track, how to store events, what data is important, and how insights should be visualized. When you let a tool dictate your data model, you inherit its assumptions, whether or not they align with your business.
A business-first approach reverses this logic.
It starts with the questions that matter: What decisions are we trying to support? What customer behaviors do we need to understand? What KPIs truly reflect our value creation? Only then do we architect a data strategy, and select the tools that best enable that strategy.
The illusion of completeness
Many analytics platforms promise end-to-end solutions. But while these platforms are often strong in one area, say, web analytics or customer journey visualization, they tend to be weaker or more rigid in others.
Attempting to force every question through a single tool’s paradigm leads to a distorted view of the business.
For example, a marketing team might rely on a web analytics tool for campaign attribution, while a product team needs detailed feature engagement metrics that require event-based behavioral tracking. Forcing both teams to work within the same tool will result in compromise, tension, and, ultimately, suboptimal decision-making.
Moreover, the concept of a “single source of truth” is often misunderstood.
Truth in analytics is not about a monolithic dataset or one interface; it is about consistency in definitions, traceability in data lineage, and clarity in business logic. These qualities are organizational capabilities, not tool features.
Starting at home
One of the core reasons businesses need multiple tools is that different teams have different analytical needs, levels of data literacy, and temporal horizons.
Executives need high-level KPIs and forecasts. Marketing teams want real-time campaign performance. Product managers seek granular behavioral patterns. Data science teams require raw, structured data for modeling.
Trying to serve all these needs through a single tool creates cognitive overload and usability friction. Worse, it discourages meaningful adoption and leads to shadow analytics: unofficial tools, spreadsheets, and unvalidated metrics proliferating outside central governance.
The solution is not centralization, but orchestration. Let each team use tools suited to their function, but ensure those tools are connected through shared data definitions, a common data infrastructure, and cross-functional alignment on business goals. This enables distributed autonomy without sacrificing integrity.
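To make "shared data definitions" concrete, here is a minimal, hypothetical sketch in Python. The metric name, the 30-day window, and the function names are all assumptions for illustration, not taken from any specific tool; the point is that one agreed-upon definition exists exactly once and every downstream report reuses it.

```python
# Hypothetical sketch: a single shared business definition, reused by
# every downstream report instead of being re-implemented per tool.

ACTIVE_WINDOW_DAYS = 30  # assumed business rule: "active" = ordered in the last 30 days

def is_active(days_since_last_order: int) -> bool:
    # The one agreed-upon definition of an active customer.
    return days_since_last_order <= ACTIVE_WINDOW_DAYS

def active_rate(days_since_last_order_per_customer: list[int]) -> float:
    # Executive KPI built on the shared definition, so the dashboard,
    # the marketing report, and the data science model all agree.
    customers = days_since_last_order_per_customer
    return sum(is_active(d) for d in customers) / len(customers)

# Example: 2 of 4 customers ordered within the window.
print(active_rate([5, 40, 10, 100]))  # 0.5
```

Whether such a definition lives in a shared library, a warehouse model, or a semantic layer matters less than the fact that it is defined once and consumed everywhere.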
Complexity as a competitive advantage
Many organizations resist the idea of a fragmented analytics ecosystem because complexity feels inefficient. But in a rapidly changing business environment, simplicity is often an illusion.
The real goal is managed complexity, designing systems that are flexible enough to evolve, yet structured enough to govern.
This requires new capabilities in data product management, analytics enablement, and data quality assurance. It demands closer collaboration between business stakeholders, data engineers, and analytics professionals. But the return is enormous: faster insights, more relevant metrics, better alignment between strategy and execution, and better decision making.
Businesses that embrace a business-first, modular approach to analytics don’t just get better reports. They build institutional resilience, the ability to sense, respond, and adapt to change using data as a strategic asset.
Strategy before stack
The fragmentation of the analytics market is not a sign of chaos, it’s a sign that the industry is finally maturing. Businesses are recognizing that tools are not the solution.
Strategy is, business is.
The most successful organizations in the data-driven era are not those who bet on the “best” analytics tool. They are the ones who start with clarity of purpose, build a data architecture that reflects their strategic priorities, and select tools that serve, not define, their questions.
There is no silver bullet.
There are only alignments between data and decision, between people and process, between insight and impact. That alignment does not come from choosing the right tool. It comes from asking the right questions, and building a system designed to answer them.