With numerous new technology options available for data analytics, more companies are recognizing the need to upgrade outdated and restrictive platforms, both hardware and software. They see the opportunity to create state-of-the-art capabilities using increasingly viable options in cloud technology, big data analytics, data storage and reporting tools.
For example, a ZS client recently encountered a common problem: The firm’s existing data and analytics platform was hitting the upper limit of its capacity. Diminished performance limited the firm’s agility and responsiveness, its ability to serve current customers, and its ability to add new customers quickly and easily. Even worse, the existing platform offered a limited range of capabilities. Built to accommodate only structured data, it gave the company no way to analyze unstructured data from its multitude of sources, from online clicks and social media to email and documents.
A Cloud-Based Analytics Platform
Moving the company’s data analytics capabilities from an on-premises platform to the cloud-based Amazon Web Services (AWS) platform opened up considerable new capabilities. The company was immediately able to take advantage of a highly extensible infrastructure that leveraged cloud, open-source and big data technologies.
The company also was able to save money by using AWS on demand rather than paying to build and maintain an on-site system. And it could take advantage of Amazon Elastic MapReduce, a service that uses open-source Hadoop software to process data from both structured and unstructured sources, and Apache Hive, open-source “data warehouse” software that allows users to query and manage large data sets in distributed storage. That meant the company could store and archive data on Amazon S3 and move the data to HDFS for active processing.
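To make the S3-plus-Hive pattern concrete, the sketch below builds the kind of HiveQL statement that lets Hive query data archived in S3 directly, without first copying it into HDFS. This is a minimal illustration only: the table name, columns and bucket path are invented, not details from the client engagement.

```python
# Hypothetical sketch: generating a Hive external-table DDL statement that
# points at data archived in S3. All names and paths below are illustrative.

def hive_external_table_ddl(table, columns, s3_path):
    """Build a HiveQL CREATE EXTERNAL TABLE statement backed by an S3 path."""
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {table} (\n  {cols}\n)\n"
        "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n"
        f"LOCATION '{s3_path}';"
    )

ddl = hive_external_table_ddl(
    "raw_engagement",  # hypothetical table name
    [("customer_id", "STRING"), ("event_ts", "TIMESTAMP"), ("channel", "STRING")],
    "s3://example-archive-bucket/engagement/",  # hypothetical bucket
)
print(ddl)
```

Because the table is external, dropping it removes only the metadata; the archived files stay in S3, which suits an archive-then-process workflow.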
The new platform also brought two key new capabilities, first speeding up the company’s data ingestion and then improving its predictive analytics. The first uses a technique known as metadata-driven transformation, in which ingestion and transformation logic is selected from characteristics gleaned from the data itself, such as its type, source and other facets. Because this data-ingestion technique is driven by configuration rather than hand-built for each feed, it is much faster to set up and requires less manual processing.
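A minimal sketch of the metadata-driven idea: instead of hard-coding a pipeline per feed, transforms live in a configurable registry keyed by metadata on the incoming record, and the ingestion step simply dispatches. The sources, record shapes and transforms here are invented for illustration.

```python
# Illustrative metadata-driven ingestion: the transform applied to a record
# is chosen from the record's own metadata (source, type) via a registry,
# so adding a new feed means adding configuration, not new pipeline code.

TRANSFORMS = {
    ("web", "click"):   lambda p: {"event": "click", "url": p["url"].lower()},
    ("crm", "contact"): lambda p: {"event": "contact", "email": p["email"].strip()},
}

def ingest(record):
    """Dispatch on metadata gleaned from the record itself."""
    key = (record["source"], record["type"])
    transform = TRANSFORMS.get(key)
    if transform is None:
        raise ValueError(f"no transform configured for {key}")
    return transform(record["payload"])

row = ingest({"source": "web", "type": "click",
              "payload": {"url": "HTTPS://Example.com/Page"}})
print(row)  # {'event': 'click', 'url': 'https://example.com/page'}
```

Onboarding a new customer or channel then amounts to registering one more entry in the table, which is what makes this style of ingestion quick to configure.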
The second capability aids predictive analytics. By combining big data with an enhanced recommendation algorithm, the company was able to use an expanded set of attributes, including demographics and historical engagement, to improve the accuracy of its predictions.
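The shape of that improvement can be sketched as a scoring function over a wider attribute set. The weights, attribute names and candidates below are purely illustrative stand-ins, not the client’s actual model, which the write-up does not detail.

```python
# Illustrative sketch: ranking candidates with an expanded attribute set
# (demographic fit plus historical engagement). Weights and features are
# invented for illustration only.

WEIGHTS = {"demo_match": 0.4, "recent_opens": 0.35, "past_response": 0.25}

def score(candidate):
    """Weighted combination of the candidate's normalized attributes."""
    return sum(WEIGHTS[f] * candidate[f] for f in WEIGHTS)

def recommend(candidates, top_n=2):
    """Return the top-scoring candidates."""
    return sorted(candidates, key=score, reverse=True)[:top_n]

candidates = [
    {"id": "cust-1", "demo_match": 0.9, "recent_opens": 0.2, "past_response": 0.5},
    {"id": "cust-2", "demo_match": 0.4, "recent_opens": 0.9, "past_response": 0.8},
    {"id": "cust-3", "demo_match": 0.1, "recent_opens": 0.3, "past_response": 0.2},
]
print([c["id"] for c in recommend(candidates)])  # ['cust-2', 'cust-1']
```

The point of the expanded attribute set is visible even in this toy version: a candidate with modest demographic fit can still rank first on the strength of engagement history.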
And to simplify querying, ZS integrated Tableau with Hive and the Impala query engine. This helped accelerate the creation of dashboards and reports and supported ad hoc analysis.
Preparing for the Future
With these new capabilities in place, the company was able to comfortably take on more customers and handle larger and more varied data sets. Because metadata-driven transformation allows quicker onboarding of new customers and new channels with less manual intervention, the company was able to decrease overall processing time by 66% and reduce demand on its internal teams. Thanks to the improved predictive analytics achieved through the recommendation algorithms, the company was able to increase prediction accuracy by 15% to 40%.
Overall, the company was able to meld technological capabilities with business needs, creating a foundational data analytics system that allowed the company to better serve current customers, more easily accommodate new customers and grow its business.