Sometimes you just have to laugh. In January last year I wrote a news story in which Datanomic claimed it had an integrated suite of data quality tools, and one of its closest rivals, Trillium Software, quickly rejected the claim.

Tom Scampion, then VP EMEA for Trillium Software, told me: "[Datanomic] do not have an enterprise-level offering. They only run on a PC, which does not mean it is not good, but it is suited to a departmental level and below, not the enterprise. There is an inverse relationship in this space between the amount of noise a vendor makes in the market and their proven ability."

Scampion said that software running only on a Windows platform is unlikely to be up to enterprise-level data quality tasks, whereas Trillium’s software also runs on mainframe and Unix platforms. "For data quality tasks like fraud detection and other heavy lifting, users are unlikely to go for software that runs only on a PC; it just won’t have the horsepower," he said.

Well, last week Scampion jumped ship and joined – you guessed it – Datanomic, as VP of sales. Commenting on the move, he said: "Organisations are now investing in data quality software to gain a competitive edge and meet regulatory requirements. As such, conventional name and address products no longer suffice. The current market demands solutions that go far beyond traditional data quality management, which is why Datanomic is pulling ahead with its powerful and unique approach."

It is not surprising that competition in this space is becoming increasingly aggressive, as data quality rises up the corporate agenda. Inaccurate and inconsistent data is known to cause serious errors in software applications and data analysis, and recent research from the Data Warehousing Institute found that up to 75% of organizations have identified costs associated with so-called "dirty" data. I wonder what the cost of inaccurate or inconsistent marketing messages amounts to.