In what is now an annual upgrade event for its data quality suite, Trillium Software has launched version 13 of its Trillium Software System.
Speaking to CBR, Ed Wrazen, the firm’s VP of product marketing, said that the biggest drivers for data quality technology today are regulatory compliance and the desire to get better value from customer data.
Key improvements to the technology include new role-based navigation bars that cater to users of different levels of technical ability; reusable business rules that help to ensure data cleansing options comply with existing business policies; and the ability to quickly run user acceptance testing on any new data quality processes prior to deployment.
Wrazen also called out the suite’s new geocoding capabilities: "We can enrich data with latitude and longitude for 238 countries, helping users identify locations which is very valuable for risk management, for example identifying flood plains in insurance applications, and also for logistics and telco customers."
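Trillium has not detailed its geocoding interface, but the sketch below shows what latitude/longitude enrichment of customer records looks like in principle, using the open-source geopy library and an assumed record layout rather than anything Trillium-specific.

```python
from geopy.geocoders import Nominatim

def enrich_with_coordinates(records: list[dict]) -> list[dict]:
    """Append latitude/longitude to each record based on its address field."""
    geolocator = Nominatim(user_agent="data-quality-demo")
    for record in records:
        location = geolocator.geocode(record["address"])
        if location is not None:
            record["latitude"] = location.latitude
            record["longitude"] = location.longitude
        else:
            # Flag unresolvable addresses for review rather than guessing coordinates.
            record["geocode_status"] = "unresolved"
    return records

customers = [{"address": "10 Downing Street, London, UK"}]
print(enrich_with_coordinates(customers))
```

Once coordinates are attached, a downstream risk check can be as simple as testing whether each point falls inside a flood plain polygon, which is the insurance use case Wrazen cites.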
There have also been enhancements to the suite’s reporting and charting, Trillium – a Harte Hanks company – said. These are said to make it easier for business users and data stewards to understand and quantify the business impact of the condition of their data.
With regard to enterprises understanding the dangers of ‘dirty data’ and realising the benefits of a more strategic approach to data quality, Wrazen said: "There are pockets of acceptance and pockets of resistance. There are organisations treating their data as a much more strategic asset, especially in financial services, government and other public sector organisations where compliance with legislation and other regulation is high on the agenda."
"Data governance is taking more of a hold and we’re seeing more job titles like Group Data Officer, instead of this being handled by the CFO," Wrazen said. While he said that many companies still do data quality projects as one-off, or annual projects, there are clear benefits from integrating data quality techniques into operational systems and doing data quality in real-time, for example checking for quality errors or metadata inconsistencies when a customer is on-boarded.
Trillium competes in the data quality space with the likes of SAS company DataFlux, Datanomic and Informatica.