When looking to expand an organisation’s analytics capabilities, the default technology decision is often “use more of the same.” But organisations are finding that this doesn’t always work, especially when they pursue digital transformation strategies that entail new types and new sources of data. Many enterprises now go beyond the default decision and add data lakes to their analytics environments. That is a great step towards modern analytics, but they then cap the potential value of the data lake by continuing to use their existing, traditional BI tools. The rationale behind this configuration is understandable: leverage existing investments as much as you can, and adopt new technologies only where absolutely necessary. Enterprises see the need for a new data platform (i.e., the data lake) but not the need for a new BI technology, so they try to use their existing BI tools to get insights from the data lake. And that is the problem: they fail to recognise that adopting a new BI platform specifically for the data lake is in fact necessary.
Once organisations realise that a new data platform requires a new BI solution, one more proof point is required: cost justification. Certainly the value a new BI solution creates is the most critical factor, but that is a meaningful discussion only for decision-makers in IT and in the line of business. Other teams, like the procurement department, don’t think in terms of features, capabilities, and processes; they think in terms of financial impact. They may see “just another BI platform,” which in their minds is unnecessary expenditure because they don’t understand the added advantage. It will seem counterintuitive to them that procuring a completely separate BI solution can result in a much lower total cost of ownership (TCO). By contrast, they are likely to agree to data lake procurement because of the obvious up-front economic advantages in software and hardware licensing, but proving the cost advantage of a new BI tool requires a little more effort. Fortunately, the cost analysis is straightforward when considering BI technologies that are native to data lakes, like Arcadia Data, since the associated workflow is significantly more efficient than the workflow required when traditional BI tools are used on data lakes.
One model for analysing the costs of a new BI solution designed for data lakes, versus a traditional BI solution designed for data warehouses, is to assess the steps in the analytic lifecycle. Each step requires human resources, measured in full-time equivalents (FTEs). With a BI solution native to data lakes, you can eliminate several steps that are common in the data warehouse world but unnecessary in a data lake world. One clear advantage of data lake BI solutions is the ability to analyse data lakes without any data movement. There’s no need to move data to a dedicated BI server, which traditional BI tools require in order to deliver the performance end users need in a production environment. With no data movement, there’s no redundant administrative effort around security and data modelling, which frees up FTEs for other business-critical activities.
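This per-step comparison lends itself to a simple back-of-the-envelope calculation. The sketch below is purely illustrative: the step names, hour figures, and hourly rate are hypothetical placeholders I’ve chosen for demonstration, not numbers from the white paper. The point it shows is the mechanics of the model, i.e. steps tied to data movement drop to zero effort with a data-lake-native tool, and the remaining steps drive the cost ratio.

```python
# Illustrative sketch of the per-step FTE cost model described above.
# All step names, hours, and the hourly rate are hypothetical
# placeholders, not figures from the white paper.

HOURLY_RATE = 75  # assumed fully loaded cost per FTE hour

# (step, hours with traditional BI, hours with data-lake-native BI)
# Steps at 0 hours for native BI are the ones eliminated by avoiding
# data movement to a dedicated BI server.
steps = [
    ("extract and move data to BI server", 40, 0),
    ("rebuild security rules on BI server", 20, 0),
    ("remodel data for the BI tool",        30, 10),
    ("build dashboards and reports",        25, 25),
]

def lifecycle_cost(column):
    """Total cost of one pass through the analytic lifecycle."""
    return sum(step[column] for step in steps) * HOURLY_RATE

traditional = lifecycle_cost(1)
native = lifecycle_cost(2)
print(f"traditional BI: ${traditional}")
print(f"native BI:      ${native}")
print(f"advantage:      {traditional / native:.1f}x")
```

With these made-up inputs the eliminated steps alone account for most of the gap; plugging in your own step list and hours makes the model specific to your environment.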
To share a point from my white paper, A Cost Analysis of Business Intelligence Solutions on Data Lakes, the diagram below shows the shortened analytic lifecycle you get when using BI tools that are native to data lakes. The lifecycle is shorter because of quicker feedback loops that don’t require heavy IT intervention, typically for data modelling.
Contrast that process with the one shown below. If you’re a data warehouse veteran, you know that data modelling and remodelling tend to be very time-consuming. The effort may be intellectually easy, but it still takes a lot of time, especially because of the significant back-and-forth collaboration between business users and IT.
The white paper quantifies how much more effort the traditional BI analytic process requires, which underscores the need for a new BI standard for your data lake. On TCO alone, native BI gives you a 2.5x advantage. Factor in how quickly you can build BI artefacts, and the gains in productivity and efficiency rise to a 10x advantage per artefact with native BI.
Download the white paper if you want to see how much better you can do with your data lake. And if you want to get hands-on with the visualisations in Arcadia Data, download Arcadia Instant, a free desktop BI tool that lets you get started right away. Contact us if you’d like to learn more.