Just before lockdown I attended an excellent event in Holland where the presentations from Dutch speakers were translated and broadcast through headsets in real time (by some very adept linguists) into several other languages for the benefit of those, like me, who don’t speak Dutch. (This system is similar to the ones used by the United Nations.) I marvel at their talent; to be able to listen, translate, and speak another language all at the same time is quite a skill! Communication between different languages is perhaps one of mankind’s longest-running challenges. If we go as far back as the Old Testament, the Book of Genesis references the Tower of Babel, which looks like an attempt to explain the multiplicity of world languages.
Languages also evolve. Today we regularly encounter new words or terms that can quickly become commonplace; certainly 12 months ago I wasn’t familiar with the terms “furlough,” “COVID-19,” or the “proroguing” of UK parliament, all of which have been widely used in the news since then, at least here in the UK.
In business, new terms and acronyms are introduced daily, and they may be particular to an industry, a business, or even a line of business in an individual company. These new terms result in problems similar to those at the Tower of Babel; in the modern world, it is not uncommon for four managers at a meeting to have four (or more) answers to the same business question. This can cause delays, poor decisions, and conflict in the workplace. The issue stems from data. Data comes from multiple sources, stored in multiple formats and protocols, and created at different places and times, and it has frequently undergone a variety of different transformations and curation processes before the results finally land on the meeting room table. Where is the proverbial Single View of the Truth? The ever-increasing velocity and volume of today’s data sets are not helping.
A Common Foundation
Organisations turn to data virtualisation to address this need. Rather than moving, translating, and copying the data multiple times around the business, data virtualisation enables a single consistent data model to be created and connected directly to the data sources. First, this means that organisations can avoid the cost, delays, and complexities of replication. It also means that the data is up to date and not dependent on a scheduled overnight batch process. Perhaps most importantly, it means that all data consumers operate from the same data model, regardless of their chosen technology for consumption.
Take, for example, an organisation that wants to launch a new omni-channel service. This organisation might have a variety of data consumers: a mobile app, a website, a call centre, a management report, and a real-time executive dashboard. Without data virtualisation, the developers would have to use different protocols, different formats, and perhaps even different languages and developer tools to orchestrate the data feeds going to each of these consuming technologies. This would result in slow development and deployment times as well as complexity, errors, and inconsistencies between each of the different data models. It would also make subsequent changes very challenging to implement. With data virtualisation, the different sources are all connected through a single virtual layer that contains a common data model that is exposed to all the data-consuming technologies, providing them with the same results. By using data virtualisation, Denodo customers have reduced their development times and time-to-market by 65% to 95%. The work is done once, not four or five times, and the answers will be consistent, regardless of the consuming technology. The world’s largest semiconductor manufacturer uses Denodo data virtualisation in this way, saving 90% on development and time-to-market.
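To make the idea concrete, here is a minimal, hypothetical sketch in Python of the virtual-layer concept described above. It is not the Denodo product or its API; the source names (a CRM and a billing system) and mapping functions are invented for illustration. Two sources hold customer data in different shapes, and a single virtual layer maps both into one common model at query time, so every consumer sees the same rows without the data ever being copied.

```python
def from_crm(record):
    # The (hypothetical) CRM stores full names and e-mail addresses.
    return {"name": record["full_name"], "email": record["email"]}

def from_billing(record):
    # The (hypothetical) billing system stores first/last names and a contact field.
    return {"name": f'{record["first"]} {record["last"]}', "email": record["contact"]}

class VirtualLayer:
    """Connects sources to one common data model without replicating the data."""

    def __init__(self):
        self.sources = []  # list of (fetch_rows, mapper) pairs

    def register(self, fetch_rows, mapper):
        self.sources.append((fetch_rows, mapper))

    def query(self):
        # Rows are read from the sources at query time (no overnight batch)
        # and normalised into the common model before being returned.
        for fetch_rows, mapper in self.sources:
            for row in fetch_rows():
                yield mapper(row)

# Stand-in sources; in reality these would be live connections.
crm_rows = lambda: [{"full_name": "Ada Lovelace", "email": "ada@example.com"}]
billing_rows = lambda: [{"first": "Alan", "last": "Turing", "contact": "alan@example.com"}]

layer = VirtualLayer()
layer.register(crm_rows, from_crm)
layer.register(billing_rows, from_billing)

customers = list(layer.query())
print(customers)  # every consumer queries the same common model
```

Whether the consumer is a mobile app, a dashboard, or a report, each one calls the same `query()` and receives identically shaped records; the source-specific mapping logic lives in one place instead of being rebuilt four or five times.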
The Single View of the Truth is out there, waiting to be leveraged; organisations just need to seize the opportunity. Or, as some might say, carpe diem!