As the quantity of customer data has grown exponentially over the last decade, the promise of a unified customer database in the form of a customer data platform (CDP) has become a top priority for brands. However, not every business has the data expertise and resources to build its own CDP from the ground up, or to incorporate the capabilities essential for managing that data and translating it into actionable business intelligence.
In the early era of data storage, organizations kept their analytics data in traditional relational databases, such as MySQL, which lacked separation between the compute and storage layers. When a business outgrew its warehouse, it had to shut that engine down and migrate all of its existing data to a larger data warehouse before any new data could be uploaded and processed. Depending on the volume of data involved, that migration could mean significant downtime.
The second evolution of data storage offered that much-needed separation with systems like Hadoop, which consists of a storage component — Hadoop Distributed File System (HDFS) — and a compute infrastructure — Yet Another Resource Negotiator (YARN). The main benefit of separating storage and compute in a data warehouse was the ability to accommodate data growth over time without having to switch to larger hardware. This made it easier and more affordable for businesses to scale up and scale down their analytics architecture.
However, while Hadoop offered the separate layers, the intricacies of configuring Hadoop for optimal performance were a burden for IT teams. Setting up, configuring and managing Hadoop meant investing in hardware and on-premise servers, and managing updates and changes in Hadoop generally required a high level of engineering expertise. HDFS is also best suited to sharing and processing large data sets rather than ad hoc queries, because Hadoop does not accommodate updates to a data file without that file being completely rewritten, a process that is both time-consuming and difficult to operationalize.
Enabling Scalability and Ease of Use with Snowflake Data Warehouses
To achieve the best of both worlds in terms of scale and performance, organizations required a query engine that offered total separation between their compute and storage layers, while also providing the flexibility and ease of use of legacy data warehouses. This is particularly important for a CDP, which promises to give marketers direct access to data and intelligence, and democratize data across their organizations. This is why Acquia chose to leverage Snowflake, the industry-leading data warehouse in the cloud, to power Acquia CDP’s analytics applications.
Unlike other data warehouse platforms on the market, Snowflake is built on a multi-cluster, shared data architecture with three distinct layers:
- Database Storage Layer — This is the storage layer where data is stored after it is uploaded through the Acquia CDP data pipeline.
- Query Processing Layer — This is the computing layer where uploaded data is processed for specific use cases.
- Cloud Services Layer — This layer deals with infrastructure management, access control and query optimization.
Because every virtual warehouse in this multi-layered architecture reads from the same shared storage layer, running additional warehouses does not duplicate data, so users do not incur additional data storage charges. The partnership between Snowflake and Acquia enables Acquia CDP users to run heavily concurrent query loads with near-instant scaling up and down. This eliminates the need for marketers to think about scaling up or scaling down to prepare for fluctuations in volume (e.g., a 20x spike in holiday traffic), and leaves the worry of data storage to the experts.
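Because compute is decoupled from storage, resizing a Snowflake virtual warehouse is a single SQL statement rather than a data migration. The sketch below simply builds that statement in Python; the warehouse name and the helper function are hypothetical illustrations, and in practice the statement would be issued through a Snowflake connector rather than printed:

```python
# Sketch: constructing the Snowflake SQL that resizes a virtual warehouse.
# The warehouse name and this helper are invented for illustration. No data
# moves when compute is resized, because storage is a separate layer.

VALID_SIZES = ["XSMALL", "SMALL", "MEDIUM", "LARGE", "XLARGE", "XXLARGE"]

def resize_warehouse_sql(warehouse: str, size: str) -> str:
    """Return the ALTER WAREHOUSE statement for a new compute size."""
    size = size.upper()
    if size not in VALID_SIZES:
        raise ValueError(f"unknown warehouse size: {size}")
    return f"ALTER WAREHOUSE {warehouse} SET WAREHOUSE_SIZE = {size};"

# Scale up ahead of a 20x holiday spike, then back down afterwards:
print(resize_warehouse_sql("ANALYTICS_WH", "xlarge"))
# ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = XLARGE;
print(resize_warehouse_sql("ANALYTICS_WH", "xsmall"))
# ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = XSMALL;
```

The point of the sketch is that scaling is metadata-only: the statement changes which compute cluster runs queries, while the stored data never moves.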
Another benefit of Acquia Analytics using Snowflake as its data warehouse is that BI teams can take advantage of Snowflake Data Sharing, which makes an Acquia CDP customer’s data available directly in their own Snowflake instance without physically copying it. This gives other teams within their organization the ability to easily slice and dice the data, join CDP data with other data sets, and gain deeper insights into the business using their own BI tools.
Through Snowflake Data Sharing, a company’s BI teams can immediately access clean, unified, trusted CDP data without needing to log into a separate system or manage a custom extract, transform, load (ETL) process. This means users can execute queries faster and can always pull reports on reliable, up-to-date customer data, with no downtime as new data is uploaded and refreshed.
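Conceptually, Data Sharing works by having the data provider create a share and grant a consumer account read access to selected objects; the consumer then queries the data in place. The sketch below lists the provider-side statements, with hypothetical database, schema, table and account names:

```python
# Sketch: provider-side SQL behind Snowflake Data Sharing. All object and
# account names here are invented for illustration; the consumer queries the
# shared objects in place, so no ETL pipeline or data copy is involved.

share_setup = [
    "CREATE SHARE cdp_share;",
    "GRANT USAGE ON DATABASE cdp_db TO SHARE cdp_share;",
    "GRANT USAGE ON SCHEMA cdp_db.unified TO SHARE cdp_share;",
    "GRANT SELECT ON TABLE cdp_db.unified.customers TO SHARE cdp_share;",
    "ALTER SHARE cdp_share ADD ACCOUNTS = partner_account;",
]

# In practice these would be executed through a Snowflake connector:
for stmt in share_setup:
    print(stmt)
```

Once the consumer account creates a database from the share, the CDP tables appear alongside its own data and can be joined with any other data set it holds.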
Integrating with Looker for Seamless Business Intelligence Reporting
Many CDPs fail to offer reporting or analytics at all, making it extremely challenging for marketers to understand and interpret the data they’re collecting and translate it into informed campaigns. The CDPs that do offer reporting dashboards typically present data only within a specific set of pre-built parameters that cannot be customized to reflect unique criteria or use cases.
However, the metrics capabilities in Acquia CDP are built with Looker, a leading business intelligence and data analytics tool that offers real-time dashboards for more in-depth analysis. The CDP’s open and flexible architecture combined with Looker’s visual data interface empowers marketers to build customizable dashboards and access customer insights instantly.
Acquia integrated with Looker to ensure the most seamless experience possible for the customer. Rather than hastily stitching together a bunch of one-off solutions that limit what data marketers can access, our product team evaluated which business intelligence tools provided robust API coverage. Looker offered 100% API coverage, which meant our engineers could embed all the necessary analytics and intelligence tools into the application without disrupting the user experience.
Built with Looker, the Metrics UI in Acquia CDP gives marketers the flexibility to configure their data into many views by filtering on fields like dates, numeric values and customer lookups. Additionally, the analytics functionality can analyze high-cardinality dimensions, such as product name and customer zip code. So, for instance, a marketing team could easily look up how many customers live within 50 miles of a certain store location and use that report to better target their next direct mail campaign, or see which customers name that location as their favorite store. The analytics platform is built to flex to virtually any use case.
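To make the store-radius example concrete, that filter amounts to a great-circle distance calculation over customer coordinates. A minimal sketch in Python, with invented customer records and store location (Acquia CDP performs this kind of lookup inside the platform; nothing here is its actual implementation):

```python
import math

# Sketch: filtering hypothetical customer records to those within 50 miles
# of a store, the kind of lookup the Metrics UI supports. All names and
# coordinates are invented for illustration.

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in miles (haversine formula)."""
    r = 3958.8  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

store = (42.3601, -71.0589)  # hypothetical store in Boston
customers = [
    {"name": "Ada", "lat": 42.2626, "lon": -71.8023},  # Worcester, ~39 mi
    {"name": "Ben", "lat": 41.8240, "lon": -71.4128},  # Providence, ~41 mi
    {"name": "Cai", "lat": 40.7128, "lon": -74.0060},  # New York, well over 50 mi
]

nearby = [c["name"] for c in customers
          if miles_between(c["lat"], c["lon"], *store) <= 50]
print(nearby)  # → ['Ada', 'Ben']
```

A direct mail campaign targeted at this store would then be scoped to the `nearby` segment rather than the full customer list.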
Having the right technology in place that can adapt and solve for unique business use cases is the true value in an enterprise platform like Acquia CDP. A decade ago, companies were interested in building all of their data tools in house because they were seeking the intellectual property (IP) of creating their own custom solution. However, now that the CDP space has evolved and matured, the value comes not from the software or the code you write, but how you are able to apply that technology to serve your customers and drive revenue for your business. When brands invest in a CDP that can harness the power of the best data and analytics tools currently available, they can spend less time focusing on the tool itself and spend more time on innovation and using that technology to solve new problems.
See more of how an enterprise CDP can serve your business needs in our e-book, A New Kind of Marketing Cloud: Where Data Takes Center Stage.