Businesses are increasingly realising the value of their data; data, after all, is the new oil.
CBR highlights three of the best advanced analytics solutions in the market.
SAS is widely considered the market leader in this area: it ranks highest in the Gartner Magic Quadrant and has built a reputation over decades for delivering high-quality analytics software.
Its products span data mining, statistical analysis, forecasting, text analytics, and optimisation and simulation.
In the data mining category the company offers solutions such as SAS Enterprise Miner, Factory Miner, Scoring Accelerator, and Visual Analytics, among others.
Enterprise Miner offers features such as data preparation, summarisation and exploration, open source integration with R, automated scoring, and scalable processing.
In the statistical analytics field SAS offers Analytics Pro, ETS, In-Memory Statistics, Visual Data Discovery and more.
Analytics Pro is designed to allow companies to integrate it into virtually any computing environment so that users can unify computing efforts and get a single view of data, the company says.
It also offers a consolidated vendor portfolio that is designed to reduce the costs of licensing, maintenance, training and support.
Key features include an intuitive fourth-generation programming language (4GL) with support for SQL, a web-based development environment, a prebuilt library of programs, geolocation analysis, cross-platform support, and advanced statistical analysis.
One of the benefits of SAS is that it offers advanced data analysis tools and techniques; however, it also has a reputation for being heavyweight to install, meaning it may be better suited to enterprise workloads than to, say, a start-up.
That said, it does offer tools for analysts of differing skill levels, right up to data scientists.
Just behind SAS in the analyst ranking, IBM has an extremely strong portfolio when it comes to analytics.
The company has been on a drive with tools like Watson Analytics, a smart data discovery service available on the cloud. It is a cognitive system capable of guiding data exploration, automating predictive analytics, and enabling dashboard and infographic creation.
One of the best things about Watson Analytics is that users can get a limited version of it for free. This allows them to upload spreadsheets, get visualisations, discover insights and build dashboards, and it also provides access to Twitter data.
The Plus version gives a little more but is still restricted to one user and costs $30 per month per user; the Professional version comes in at $80 per month per user and supports multiple users.
The Professional version allows access to relational databases both on-premises and in the cloud, access to 19 data connectors including IBM Cognos reports, and full access to IBM Analytics Exchange data and offerings.
Big Blue offers a lot more than just Watson; it also has solutions such as stream computing and prescriptive analytics.
Stream computing is designed to allow organisations to process always-on data streams. It works by continuously analysing data as it arrives and connects to all data sources. The solution offers a development environment, a runtime, and analytics toolkits such as natural language processing, image/voice recognition, and spatial-temporal analysis.
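IBM's product provides its own toolkits and runtime, but the core always-on pattern it describes — analysing each reading as it arrives rather than waiting for a batch — can be sketched in a few lines of Python. The rolling-average analysis and the simulated sensor feed below are illustrative assumptions, not IBM's API:

```python
from collections import deque

def rolling_average(stream, window=3):
    """Continuously analyse an always-on stream: emit the mean of the
    last `window` readings each time a new reading arrives."""
    buf = deque(maxlen=window)  # bounded buffer: old readings fall out
    for reading in stream:
        buf.append(reading)
        yield sum(buf) / len(buf)

# Simulated sensor feed; in production this would be a socket or message
# queue that never terminates.
readings = [10.0, 12.0, 14.0, 20.0]
averages = list(rolling_average(readings, window=3))
```

Because the analysis is a generator, it processes each reading as it arrives and never holds the full stream in memory, which is the essence of the always-on model.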
Prescriptive analytics is software that aims to help businesses make strategic decisions based not only on what has occurred or is likely to occur in the future, but on targeted recommendations that explain why things happen, at least according to the company.
Essentially, it works by taking the output of predictive analytics and suggesting the optimal way to handle the anticipated situation.
Prescriptive analytics allows for the automation of complex decisions and trade-offs to better manage limited resources, and the ability to proactively update recommendations based on changing events.
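To make the predictive-to-prescriptive step concrete, here is a minimal sketch, not IBM's software: a forecast (the predictive step) says what demand will be, and a small optimisation over a limited resource recommends what to do about it. The demand figure, prices, and stock options are all invented for illustration:

```python
def best_stock_level(forecast_demand, levels, price=10.0, unit_cost=4.0):
    """Recommend the stock level with the highest expected profit,
    trading off lost sales against unsold inventory."""
    def profit(level):
        sold = min(level, forecast_demand)  # cannot sell more than demand
        return sold * price - level * unit_cost
    return max(levels, key=profit)

# Predictive step (assumed): a model forecasts demand of 70 units.
# Prescriptive step: pick the best of the feasible stock levels.
recommended = best_stock_level(70, levels=[50, 70, 90, 120])
```

Changing the forecast changes the recommendation automatically, which mirrors the "proactively update recommendations based on changing events" claim.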
With the decision to align its analytics portfolio around Apache Spark, IBM has connected to an extremely popular technology that will surely boost its market standing among developers.
The company is close on the heels of SAS and perhaps offers a better portfolio when it comes to appealing to companies of all sizes.
Dell is currently in the process of buying EMC for $67bn; to help finance the deal it has decided to sell off parts of its business, one of which is the software division that Statistica belongs to.
This means that Dell Statistica is likely to become just Statistica, but that isn’t confirmed yet.
In Statistica, Dell will be losing one of the newcomers to the advanced analytics leadership ranks, albeit one with over 30 years of history behind it.
Statistica may not be as well known a brand as SAS or IBM but it has certainly impressed analysts.
The latest release of Statistica, version 13.1, came out in April this year with a host of capabilities that the company says are designed to empower citizen data scientists, help organisations better address growing IoT analytics requirements, and better leverage heterogeneous data environments.
Features include edge scoring for IoT analytics, native distributed analytics architecture (NDAA), and collective intelligence.
On the big data analytics front a user can perform in-database analytics on Apache Hive (on Spark) and other big data platforms, while there are also templates for data mining, predictive analytics, machine learning, forecasting, and text mining on big data.
Users can also combine technologies such as Hadoop for scalability and performance, with Lucene/SOLR search, Mahout machine learning and advanced natural language processing.
The platform also offers network analytics so that users can visualise entity relationships and graphical association maps and combine predictive analytics with human expertise in order to better understand relationships within networks.
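The underlying idea of surfacing relationships within a network can be illustrated with a small graph sketch. This is a generic two-hop association traversal, assumed for illustration rather than taken from Statistica, and the entities are made up:

```python
from collections import defaultdict

def build_graph(edges):
    """Undirected adjacency sets built from a list of entity pairs."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    return graph

def two_hop_associates(graph, entity):
    """Entities linked through exactly one intermediary -- the indirect
    connections an association map would highlight."""
    direct = graph[entity]
    via = set()
    for middle in direct:
        via |= graph[middle]
    return via - direct - {entity}

# Hypothetical entities: two people connected only through a company.
g = build_graph([("Alice", "AcmeCorp"), ("Bob", "AcmeCorp"), ("Bob", "Carol")])
assoc = two_hop_associates(g, "Alice")
```

Here the analysis surfaces that Alice and Bob are related via AcmeCorp even though no direct edge links them, which is the kind of hidden relationship network analytics is meant to expose.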
Statistica’s collective intelligence feature is designed to help businesses embrace the app marketplace for models: users can monetise models and data, or import models written by others.
This connectivity means the platform can extend its capabilities according to what the user requires.
Models can be imported from Algorithmia, Azure ML, ExpertModels, and more.
The NDAA serves the purpose of trying to break down data silos and “take your math to the data wherever it lives,” the company says.
Users can create and score predictive models in databases such as SQL Server, MySQL, Oracle, and Teradata, and within Hadoop on Apache Hive (on Spark).
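The "take your math to the data" idea can be sketched with an in-memory SQLite database standing in for the warehouse: the model is expressed as a SQL expression, so the scoring arithmetic runs inside the database rather than in the client. The table, columns, and coefficients below are invented for illustration, not Statistica's interface:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for SQL Server/Oracle/etc.
conn.execute("CREATE TABLE customers (id INTEGER, age REAL, spend REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, 30.0, 200.0), (2, 55.0, 50.0)])

# A linear model trained elsewhere; only its coefficients travel to the
# data, and the per-row arithmetic happens inside the database engine.
w_age, w_spend, bias = 0.02, 0.005, -0.5
scores = conn.execute(
    "SELECT id, ? * age + ? * spend + ? AS score FROM customers",
    (w_age, w_spend, bias),
).fetchall()
```

Only the coefficients go in and the scored rows come out; the bulk data never leaves the database, which is the point of in-database scoring.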
The company has offered these features for a while but it has made big strides with its visualisation software to improve the overall feel and experience for the user.
As with all technology choices, it comes down to what best suits the needs of the business, and this is something that needs to be figured out before an assessment of these technologies takes place.
This article is from the CBROnline archive: some formatting and images may not be present.