Digitization has been a boon for financial services, bringing the industry into a new age. However, it has also brought massive and growing data volumes that can be either a goldmine or an indecipherable pool, depending on each organization’s ability to gain valuable insights from them to make data-driven decisions. There is a Darwinian aspect to this — in the highly competitive arena of financial services, only the agile and adaptable organizations will prevail; the survival of the fittest underlines the need for flexibility, versatility, and rapid adaptation to new scenarios. Data mesh, an architectural design proposed by Zhamak Dehghani of ThoughtWorks, offers one solution to this challenge.
Weaving the Data Mesh
Data mesh is a concept designed to overcome the shortcomings of traditional, centralized architectures, which were built around a central repository such as a data warehouse or a data lake. Centralized architecture was found to encourage logjams, as no single repository was truly capable of storing all of the data in an enterprise, and business teams would forever be at the mercy of IT to furnish data requests. In contrast, data mesh is a distributed architecture with federated ownership of the data, in which those with greatest understanding of the data are also responsible for its management and consumption.
In a data mesh, different departments within an organization are responsible for managing their own “data domains,” and they package their data as “data products” for delivery throughout the company. Coupled to this concept is the need for centralized oversight, to ensure interoperability and consistency across domains; this could cover areas such as standards, naming conventions, and a variety of corporate governance features. This enables companies to share data across the wider organization when needed, for example, to combine domain data for assessing corporate risk or for taking cashflow/treasury positions across an entire business.
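To make the idea of a domain-owned data product concrete, here is a minimal sketch. The class and function names are hypothetical illustrations, not part of any data mesh standard: a domain team publishes its data as a small, self-describing unit carrying ownership and naming metadata, which a central governance function can check against corporate conventions.

```python
from dataclasses import dataclass, field

# Hypothetical descriptor for a domain-owned "data product".
# All field names here are illustrative, not a standard schema.
@dataclass
class DataProduct:
    name: str          # e.g. "retail_banking.loan_balances"
    owner_domain: str  # the department responsible for this data
    schema: dict       # column name -> type
    tags: list = field(default_factory=list)

def conforms_to_naming(product: DataProduct) -> bool:
    """Central governance check (illustrative rule): product names
    must follow '<domain>.<dataset>', lowercase, no spaces."""
    parts = product.name.split(".")
    return (
        len(parts) == 2
        and all(p and p == p.lower() and " " not in p for p in parts)
    )

loans = DataProduct(
    name="retail_banking.loan_balances",
    owner_domain="Retail Banking",
    schema={"account_id": "string", "balance": "decimal"},
)
print(conforms_to_naming(loans))  # True
```

The point of the sketch is the split of responsibilities: the domain team owns the product and its contents, while the naming check stands in for the thin layer of centralized oversight the architecture still requires.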
Data mesh has enormous potential for financial services organizations, whose teams could gain significant agility by managing their own data. However, data mesh alone will not be able to provide this agility without an overarching provisioning layer above the different data domains, to ensure their interoperability. For that capability, organizations need look no further than data virtualization.
The Data Virtualization Factor
Data virtualization is a modern data integration and data management technology that avoids expensive, slow “move-and-copy” architectures by establishing a logical data-access layer between siloed data sources and domain-specific data consumers. Unlike traditional extract, transform, and load (ETL) processes, which physically replicate data and deliver it via batch-oriented scripts (typically overnight), data virtualization enables real-time access to data across myriad different types of data sources, without having to physically move or copy any data. This brings enormous agility to not just the IT function but also the business, enabling stakeholders to gain real-time insights and more easily create new products and services ahead of the competition.
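The contrast with move-and-copy ETL can be sketched in a few lines. This is a hypothetical toy, not a real virtualization engine: the virtual layer registers *how* to reach each source (the source names and functions below are invented for illustration) and fetches live rows only when a query runs, so every query reflects the source's current state rather than last night's copy.

```python
# Hypothetical sketch of a virtual access layer: it stores callables
# that reach each source, never the data itself.
sources = {}

def register(name, fetch_fn):
    """Record how to fetch from a source; no rows are copied."""
    sources[name] = fetch_fn

def query(name, predicate):
    """Pull current rows from the source at query time."""
    return [row for row in sources[name]() if predicate(row)]

# Stand-in for a live system (in reality: a database, REST API, etc.).
trades = [{"desk": "fx", "notional": 5_000_000}]
register("trading.trades", lambda: trades)

big = query("trading.trades", lambda r: r["notional"] > 1_000_000)
print(len(big))  # 1

# The source changes...
trades.append({"desk": "rates", "notional": 2_000_000})
# ...and the very next query sees it, with no batch reload.
big = query("trading.trades", lambda r: r["notional"] > 1_000_000)
print(len(big))  # 2
```

In an overnight ETL pipeline, the second query would still return the stale count until the next batch run; here the result tracks the source immediately.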
The data virtualization layer stores all of the metadata required to connect to the various sources but not the actual data itself, which remains at the original source location. Because the metadata is centrally stored, it also provides a straightforward architecture for implementing centralized data governance, compliance, and security across all of the data domains in an organization. Again, this brings simplicity and agility to the data management function.
Because data virtualization establishes a data-access layer above the data sources, it enables stakeholders to build any number of semantic layers within that layer — an arrangement perfectly suited to the establishment of individual data domains.
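As a hedged sketch of that idea (the domain names, view names, and data below are invented for illustration), each domain can define its own semantic layer as a set of named views over the same underlying records, with nothing duplicated underneath.

```python
# Hypothetical: per-domain semantic layers defined as named views over
# one shared virtual access layer; no data is copied per domain.
customers = [
    {"id": 1, "segment": "retail", "exposure": 250_000},
    {"id": 2, "segment": "corporate", "exposure": 4_000_000},
]

semantic_layers = {
    "risk": {
        # The risk domain's view exposes exposure figures only.
        "exposures": lambda: [
            {"id": c["id"], "exposure": c["exposure"]} for c in customers
        ],
    },
    "marketing": {
        # The marketing domain's view exposes segmentation only.
        "segments": lambda: [
            {"id": c["id"], "segment": c["segment"]} for c in customers
        ],
    },
}

risk_view = semantic_layers["risk"]["exposures"]()
print(risk_view[1]["exposure"])  # 4000000
```

Each domain sees data shaped to its own vocabulary, yet a change to the underlying records is visible to every view at once — which is what makes the virtual layer a natural home for data mesh domains.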
The Future of Financial Services
Data mesh, powered by a data virtualization strategy, is already playing a pivotal role in fostering untold agility in the financial services industry, enabling some of the world’s largest organizations to turn pools of accumulated data into goldmines of actionable intelligence. Every organization looking to thrive in today’s highly competitive climate should make it a priority to leverage this ground-breaking technology.