This blog is from the IT Connection series, posted on CurrentAnalysis.com. Current Analysis, a wholly owned subsidiary of Progressive Digital Media Plc, is a leading provider of timely, practical market intelligence and advice that helps global IT and telecom professionals compete, innovate and improve performance.


There is a great book by Douglas Hofstadter titled Gödel, Escher, Bach, in which the author describes a fictional machine capable of using Fermat’s Last Theorem and something called the "mathematics of acoustico-retrieval" to recreate an original musical performance by J.S. Bach as played hundreds of years earlier. Do the math right and you can literally hear Bach himself playing on his harpsichord! Obviously, this is complete fiction, but don’t we attempt the same feat when we use analytics tools to pluck business insights out of the boundless data that permeates and surrounds each enterprise?

With tools such as Tableau Desktop, Qlik Sense, or IBM Watson Analytics, we look to reproduce the "music of business" made sometime in the recent past. The key is to hear that music clearly, in high fidelity as it were. The reason Mr. Hofstadter’s thought experiment could never work is that over a few hundred years, the vibrations made by Bach’s harpsichord have traveled quite far and gotten lost in the background noise generated by Mozart, Lady Gaga and my Sony Walkman back in 1988.

The same thing happens with meaningful patterns found within business data. To hear the music of business clearly, we need to get as close to the data in time as possible. We certainly have to remove any noise in the data through various data preparation, consolidation and mixing practices, but that takes time. We need to narrow the gap between the moment the data is generated and the moment it is analyzed. Unfortunately, our ability to do that depends heavily upon how, where and when the music of an enterprise is playing.

Think for a moment about the amount of operational and transactional data generated on a given day by a thousand servers spread across two hundred retail outlets. We can certainly ensure that this data is clean and optimized for analysis, but how do we analyze it in a timely manner? That is not easy either, since the data from those 200 locations must typically first travel to a central repository before it can be wrestled into shape. The result can be a pretty hefty lag between a business-altering event and the realization that it has happened. Now think about the Internet of Things (IoT), with mobile devices rather than servers, and multiply those 200 locations by a factor of 10!
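
For a rough sense of the scale of that lag, here is a back-of-envelope sketch in Python of a hub-and-spoke pipeline like the one above. Every figure in it (per-site data volume, uplink speed, preparation and analysis rates) is an illustrative assumption rather than a measurement; the point is only the shape of the arithmetic, which shows the central preparation window, not the network, dominating as the fleet grows.

# Back-of-envelope estimate of event-to-insight lag in a hub-and-spoke pipeline.
# Every figure below is an illustrative assumption, not a measurement.

SITES = 200                   # retail outlets feeding one central repository
GB_PER_SITE_PER_DAY = 50      # assumed operational + transactional data per site
WAN_MBPS_PER_SITE = 100       # assumed uplink from each site to the data center
PREP_HOURS_PER_TB = 1.0       # assumed central cleansing/consolidation rate
ANALYSIS_HOURS = 1            # assumed time to run the analytical workload

def transfer_hours(gigabytes: float, mbps: float) -> float:
    """Hours to move `gigabytes` over an `mbps` link, ignoring protocol overhead."""
    return gigabytes * 8 * 1000 / mbps / 3600

def event_to_insight_lag(sites: int) -> float:
    """Worst-case hours from an event at a store to its appearance in analysis."""
    transfer = transfer_hours(GB_PER_SITE_PER_DAY, WAN_MBPS_PER_SITE)  # sites upload in parallel
    prep = sites * GB_PER_SITE_PER_DAY / 1000 * PREP_HOURS_PER_TB      # prep scales with total volume
    return transfer + prep + ANALYSIS_HOURS

print(f"200 retail sites: {event_to_insight_lag(SITES):.1f} hours of lag")
print(f"IoT scale (x10):  {event_to_insight_lag(SITES * 10):.1f} hours of lag")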

The solution? Bring data processing to the data itself to leverage the "gravity of data." This past week, I had the opportunity to spend a few minutes chatting with Qlik’s Donald Farmer, who brought up this phrase in the context of how to determine the value of a data discovery and visualization solution. For Donald, going where the data is carries the greatest value. However, vendors such as Qlik can only do so much there. This creates a huge opportunity for infrastructure vendors such as HP and Cisco. For starters, these vendors can marshal data more efficiently by processing it at the source: on the server or within the switch closest to mobile devices, for example. Anything that helps enterprise users tune into the hum of their business quickly and with the highest fidelity.
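
To make that idea concrete, here is a minimal sketch of what processing at the source might look like: each outlet collapses its own raw transactions into compact hourly aggregates before anything crosses the WAN. The record fields and the send_to_central() placeholder are hypothetical illustrations, not any vendor’s actual API.

# A minimal sketch of bringing processing to the data: each site collapses its own
# raw transactions into hourly aggregates and ships only those upstream. The record
# fields and send_to_central() are hypothetical placeholders, not a vendor API.
from collections import defaultdict
from datetime import datetime

def summarize_site(raw_transactions):
    """Reduce raw per-transaction records to hourly totals at the edge."""
    hourly = defaultdict(lambda: {"count": 0, "revenue": 0.0})
    for txn in raw_transactions:
        hour = txn["timestamp"].replace(minute=0, second=0, microsecond=0)
        bucket = hourly[(txn["store_id"], hour)]
        bucket["count"] += 1
        bucket["revenue"] += txn["amount"]
    return [
        {"store_id": store, "hour": hour.isoformat(), **totals}
        for (store, hour), totals in sorted(hourly.items())
    ]

def send_to_central(aggregates):
    # Stand-in for the upload to the central repository (HTTP, message queue, etc.).
    print(f"Shipping {len(aggregates)} aggregate rows instead of every raw record")

if __name__ == "__main__":
    sample = [
        {"store_id": "store-042", "timestamp": datetime(2015, 6, 1, 9, 15), "amount": 19.99},
        {"store_id": "store-042", "timestamp": datetime(2015, 6, 1, 9, 40), "amount": 5.49},
        {"store_id": "store-042", "timestamp": datetime(2015, 6, 1, 10, 5), "amount": 42.00},
    ]
    send_to_central(summarize_site(sample))

Whether something like this runs on the store’s own server, a gateway appliance or the switch itself, the effect is the same: only the distilled signal has to travel, so the central analysis can begin that much sooner.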