February 21, 2017

Why data virtualisation is the key to simple information management

So much data, so little time.

By Ellie Burns

In 2013, Norwegian research company SINTEF reported the startling fact that 90% of the world’s data had been created in the previous two years. This assertion was followed by IDC’s prediction that the digital universe would more than double every two years. Corporations and governments alike are dealing with amounts of information that were inconceivable thirty years ago.


More data means more challenges

Drawing meaning from this data, which is stored across a multitude of platforms and formats, from mainframes to mobile phones and all points in between, presents a huge challenge. Much of it is unstructured and difficult to consolidate into a consumable format. Not only are human time and attention limited, but the pace of business is accelerating and with it the need for immediate insight. Traditional data management methods, such as extracting, transforming and loading data from various sources into a single database (known by the acronym ETL), have therefore proved cumbersome in the face of today's volumes. Business-critical data needs to be analysed in real time, not in minutes or days, otherwise the information may no longer be valid.
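The traditional pipeline described above can be sketched in a few lines; this is a minimal illustration, and the source names and transform rule are hypothetical, not any particular product's pipeline:

```python
# Minimal sketch of a traditional ETL pipeline (illustrative only;
# the sources and the transform rule are hypothetical).

def extract(sources):
    # Pull raw records from every source into one list -- each record
    # is physically copied before any analysis can begin.
    return [record for source in sources for record in source]

def transform(records):
    # Normalise heterogeneous records into a single flat format.
    return [{"id": r["id"], "value": float(r.get("value", r.get("val", 0)))}
            for r in records]

def load(records, warehouse):
    # Write the consolidated records into the target store.
    warehouse.extend(records)
    return warehouse

# Two hypothetical sources holding the same information in different shapes.
crm = [{"id": 1, "value": "10.5"}]
legacy = [{"id": 2, "val": "7"}]

warehouse = load(transform(extract([crm, legacy])), [])
# Only once the whole batch completes can the warehouse be queried --
# the built-in delay the article criticises.
```

The point of the sketch is the sequencing: every record must be moved and reshaped before the first query can run, which is why the approach struggles at today's volumes.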

Common integration approaches of recent years have relied on manually coded integration, data connectors, or ETL processes, all of which delay analytics and inhibit business insight. For applications requiring super-fast decision-making and number-crunching, moving data is not an option. Organisations must be able to blend all manner of data, relational and non-relational, mainframe and non-mainframe, and to retrieve this combined information for analytics via a single, simple query or request.


Go beyond ETL

Fortunately, better and better tools exist to help decision-makers ask meaningful questions of the information they possess. Data architectures are evolving to be more agile in response to the unique real-time needs of mobile, cloud, and self-service analytics. Data integration solutions can bring together the information that, combined with real-time analytics, produces instantaneous insight that can be shared across the enterprise. Users can have visibility of customer expectations, purchasing habits, competitive threats and emerging revenue opportunities.



View data virtually

One of the most exciting developments in data integration technology of recent years is data virtualisation. As its name suggests, this solution integrates data “virtually”, eliminating the need to physically move data, along with all the effort (and risk) that entails. Data virtualisation provides a virtual means for bringing together heterogeneous data sources, regardless of data format or location. It delivers data integration without the complexity and cost associated with extracting, transforming, and loading the data.
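The contrast with ETL can be sketched as a virtual view that leaves each source where it is and resolves a query on demand; this is a simplified illustration, and the class and source names are hypothetical rather than any vendor's API:

```python
# Minimal sketch of a data-virtualisation layer (illustrative only).
# Data stays in its source systems; a query is federated on demand.

class VirtualView:
    def __init__(self, sources):
        # Hold references to the sources -- no data is copied up front.
        self.sources = sources

    def query(self, predicate):
        # Federate one request across every source at query time,
        # normalising each record as it streams through.
        for name, records in self.sources.items():
            for r in records:
                row = {"source": name, **r}
                if predicate(row):
                    yield row

# Hypothetical heterogeneous sources: a relational-style table and a
# document-style store, left exactly where they are.
view = VirtualView({
    "orders_db": [{"id": 1, "amount": 250}],
    "clickstream": [{"id": 2, "amount": 90}],
})

# One simple request spanning both sources, answered without any load step.
large = list(view.query(lambda row: row["amount"] > 100))
```

Because nothing is extracted or loaded in advance, the first answer arrives as soon as the sources can be read, which is the real-time property the article attributes to virtualisation.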


Recently, data virtualisation has been implemented on the mainframe. This has provided a strategic bridge between the mainframes that house so much data for large corporations, and the current generation of young developers whose expertise lies more in modern web and mobile application development. Data virtualisation can provide familiar application programming interfaces (APIs), eliminating the need for mainframe experience and allowing developers to access and use data more easily. For businesses that rely on mainframe systems, this can translate into greater developer productivity and the faster delivery of new business services.
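The API idea can be illustrated with a small sketch: a web-style function hides a fixed-width record layout of the kind mainframe data often takes, so the caller needs no knowledge of it. The record layout and function name here are hypothetical, not a real mainframe interface:

```python
# Illustrative sketch only: the record layout and function name are
# hypothetical stand-ins, not a real mainframe interface.
import json

# Fixed-width records standing in for a mainframe-style data layout:
# 4-digit id, 8-char name, 7-digit balance.
MAINFRAME_RECORDS = ["0001ACME    0001250", "0002GLOBEX  0000090"]

def get_customers(min_balance=0):
    # A familiar, web-style call: the caller passes ordinary parameters
    # and receives JSON, never touching the raw record format.
    rows = []
    for rec in MAINFRAME_RECORDS:
        rows.append({
            "id": int(rec[0:4]),
            "name": rec[4:12].strip(),
            "balance": int(rec[12:19]),
        })
    return json.dumps([r for r in rows if r["balance"] >= min_balance])
```

A developer with no mainframe background can call `get_customers(min_balance=100)` and work with plain JSON, which is the productivity gain the paragraph describes.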

Recognition has been a long time coming for data virtualisation. As a technology, it has been something of a pre-adolescent, the child actor who couldn't make the transition to adult roles. While revolutionary in its approach, early forms of data virtualisation were simply not ready for prime time, suffering from performance and scalability issues. Now, however, adoption is on the increase. In a recent report, analyst firm Gartner highlights that "through 2020, 35% of enterprises will implement some form of data virtualisation as one enterprise production option for data integration."


What does the future hold?

As data virtualisation comes of age, it is going a long way toward solving the problem of today's proliferation of data. By providing organisations with the ability to combine data from a variety of disparate sources into a common format, it not only addresses problems of data compatibility and volume but also eliminates issues relating to expertise in specific programming languages. Moreover, it achieves all of this in real time. There is simply too much data today for old-style integration techniques to handle. With data virtualisation, business owners can be prepared to meet the fast-paced requirements of a digital age in which all businesses demand instantaneous answers.
