
EXCLUSIVE: Data Vault 2.0 methodology key to unlocking big data's promise

By Rudi Dreyer, Chief Business Environment Officer at 4Sight Holdings Limited
Africa, 15 Feb 2022
Rudi Dreyer, Chief Business Environment Officer, at 4Sight Holdings Limited.

Companies are experiencing an explosion in data in the wake of widespread digitalisation and cloud migration as organisations respond to powerful trends around remote work, mobility and business modernisation.

Business success in the modern digital economy increasingly hinges on a company's capacity to manage and analyse this data, and the ability to securely access mission-critical information from anywhere, at any time.

The powerful insights that big data can provide empower business leaders to improve decision-making, identify market trends and opportunities, predict and respond to evolving consumer behaviours and preferences, and drive operational efficiencies and cost savings.

Ultimately, strategically leveraging big data will ensure that businesses remain relevant, viable and competitive. As such, data has become the driving force behind every successful modern enterprise, which makes it the most valuable asset in any business.

However, effectively managing and utilising data within the organisation transcends the technology stack. Beyond the database and data analytics tools, modern, data-driven businesses require a sustainable strategy that applies agile methodologies and robust data management architectures to realise the full potential of their data.

For instance, many enterprises rely on business intelligence reports from ERP systems to inform decision-making and drive the business forward. However, this reliance on the reporting function often ties companies to legacy ERP systems, which reduces their ability to adapt to change or compete with more agile and nimble competitors.

More importantly, the organisation becomes reliant on new system implementations to maintain business continuity. Any system failure or breakdown will cut the business off from a mission-critical data source system, which is the antithesis of a sustainable big data strategy.

In response, more businesses are designing, implementing and managing data warehouse strategies across their operations, using various methodologies to model information flows and maintain business continuity.

The Kimball methodology is an increasingly popular option among enterprises in South Africa. This data warehousing methodology applies a bottom-up approach, enabling businesses to create pre-defined processes and practices that draw data straight from each source system.

However, the Kimball approach has limitations. Specifically, companies must apply these processes and practices on a per-system basis. This methodology necessitates a complete rework when systems change, which is unsustainable given how dynamic and complex the modern enterprise IT environment has become.

In response to this challenge, Data Vault 2.0 has emerged as a far more relevant and effective methodology. It provides a refined set of rules, best practices, standards, and process designs that guide how businesses should engineer a data warehouse strategy and classify, catalogue and standardise data across the operation to create new ways of working.

This methodology combines agile delivery with operational optimisation across people, processes, systems, data, and technology to better handle unstructured data and integrate big data to create a massive enterprise data vault, regardless of how many systems are in place.

The resultant source system independence offers an agile, adaptable and sustainable data strategy that does not require an overhaul every time an operational or IT system changes. Any new system simply adds more data to the data vault's history.
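This source-system independence can be illustrated with a minimal sketch of the hub-and-satellite pattern at the heart of a data vault: a hub holds one row per business key, while satellites append time-stamped, source-tagged attribute rows. The table names, attributes and MD5 key convention below are illustrative assumptions, not the methodology's full specification.

```python
import hashlib
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    """Deterministic surrogate key derived from a business key."""
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

# Hub: one row per unique business key, regardless of which system supplied it.
hub_customer: dict[str, dict] = {}

# Satellite: descriptive attributes, time-stamped and tagged with their source.
sat_customer: list[dict] = []

def load_customer(business_key: str, attributes: dict, record_source: str) -> None:
    hk = hash_key(business_key)
    # A new source system simply adds rows; the existing model never changes.
    hub_customer.setdefault(hk, {"business_key": business_key,
                                 "record_source": record_source})
    sat_customer.append({"hub_key": hk,
                         "load_ts": datetime.now(timezone.utc),
                         "record_source": record_source,
                         **attributes})

# Two systems describe the same customer; the vault keeps both histories.
load_customer("CUST-001", {"name": "Acme Ltd"}, record_source="legacy_erp")
load_customer("CUST-001", {"name": "Acme Limited"}, record_source="new_crm")
```

Swapping out the legacy ERP would simply stop new satellite rows arriving from that source; nothing already loaded needs reworking.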

Furthermore, consolidating data within a data vault architecture creates opportunities to implement data hygiene practices, as only source data from the system that offers the most accurate and highest quality data point is used when multiple instances exist.

This ultimately creates a single instance of the best-quality data for use within the enterprise, coupled with auditable lineage that enables data warehousing teams to track changes and history.
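The selection step described above can be sketched as a simple ranking rule: when multiple systems hold the same data point, take the value from the most trusted source while recording where every candidate came from. The source names and ranking table here are hypothetical assumptions for illustration.

```python
# Hypothetical trust ranking: lower number = more authoritative source.
SOURCE_RANK = {"master_data_hub": 0, "new_crm": 1, "legacy_erp": 2}

def best_record(candidates: list[dict]) -> dict:
    """Pick the instance from the most trusted source, keeping lineage intact."""
    chosen = min(candidates, key=lambda r: SOURCE_RANK[r["record_source"]])
    # Retain every contributing source so the choice stays auditable.
    return {**chosen, "lineage": [r["record_source"] for r in candidates]}

instances = [
    {"customer": "CUST-001", "email": "old@acme.example",
     "record_source": "legacy_erp"},
    {"customer": "CUST-001", "email": "sales@acme.example",
     "record_source": "new_crm"},
]
golden = best_record(instances)  # the new_crm value wins on rank
```

In practice the ranking would be driven by data-quality metadata rather than a hard-coded table, but the principle is the same: one golden record, full lineage.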

Importantly, this 'sanitised', consolidated data provides the foundation on which to build powerful data visualisation and business intelligence capabilities.

The ability to instantly access relevant, accurate information in their preferred format empowers business leaders, transforming data's role in the enterprise from retrospective hindsight into deep, meaningful insight that can positively impact the operation in real-time.

Layering intelligence over high-quality real-time and historical data can unlock forecasting capabilities that create organisational foresight to inform future strategic decisions.

The Data Vault 2.0 methodology also supports scalability within the enterprise, with the ability to take a piecemeal approach to implementation to meet the unique needs of the enterprise. Rather than invest in an end-to-end data warehousing project, which can take up to three years to complete, businesses can implement the project in phases, focusing on high-priority, high-impact source systems first to immediately start realising value.

The company can then allocate the resultant return on investment to fund the next phase in its data warehousing strategy. Over time, the business will eventually achieve its end-to-end big data vision via a self-funded model, rather than the traditional upfront capital expenditure model.

And once implemented, the Data Vault 2.0 methodology will unlock additional opportunities to create operational efficiencies and drive cost savings within the enterprise.

For instance, businesses can leverage the methodology to automate the data extraction, capture and conversion functions to reduce the risk of human error.
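A minimal sketch of such an automated extract-and-convert step might parse incoming records, coerce them to proper types, and checksum each row so nothing is re-keyed by hand; the field names and CSV sample are illustrative assumptions, not a prescribed Data Vault interface.

```python
import csv
import hashlib
import io

def extract_and_convert(raw_csv: str) -> list[dict]:
    """Automate capture: parse, normalise types and checksum each row,
    removing the manual re-typing where human error creeps in."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        row["amount"] = float(row["amount"])  # typed value, not free text
        # Checksum computed over the converted values; duplicates and
        # silent edits become detectable downstream.
        row["checksum"] = hashlib.sha256(
            "|".join(str(v) for v in row.values()).encode()).hexdigest()
        rows.append(row)
    return rows

sample = "invoice,amount\nINV-1,199.50\nINV-2,80.00\n"
loaded = extract_and_convert(sample)
```

The checksum column is one common way to detect changed or duplicated records on subsequent loads without comparing every field.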

It can also identify additional areas for potential cost savings. For example, the architecture may identify potential efficiencies from database consolidation, which can reduce software and support costs.

Businesses can also automate the data digitisation function to shift the onerous manual data capture process away from staff, which frees their capacity to focus on more value-adding tasks within the organisation.

Ultimately, the Data Vault 2.0 methodology is an extremely effective and efficient way to engineer data warehousing projects, the success of which has become a strategic imperative in an era characterised by rising data volumes across multiple heterogeneous source systems and end points.
