Earlier this month, tech behemoth IBM announced a partnership with Palantir, a secretive big data supplier from Silicon Valley. The partnership will see the creation of 'Palantir for IBM Cloud Pak for Data', a new, user-friendly product that will enable enterprise clients to quickly build AI-based applications for a variety of use cases, the companies say. However, Palantir's controversial past should raise alarm bells for the customers of these new enterprise clients about how their data may be used going forward.

Palantir has a reputation that could cause alarm for customers. (Photo by Ascannio/Shutterstock)

Palantir for IBM Cloud Pak for Data will combine the data processing power of Palantir's Foundry software with the AI capabilities of IBM's Watson and Red Hat's OpenShift container platform to integrate and analyse vast volumes of disparate data sets across hybrid clouds. The new product will be marketed to data-heavy industries such as retail and healthcare, as well as financial services and telecommunications, which tie in with IBM's recent industry-specific cloud stack announcements.

While this new partnership and product may not raise any eyebrows in isolation, the real-world applications of Palantir's traditional product suite certainly have. These include supporting the Los Angeles Police Department in identifying and deterring those deemed likely to commit crimes, and enabling Immigration and Customs Enforcement (ICE) to track the personal and criminal records of legal and illegal immigrants.

When claims emerged that the latter had contributed to deportations, Palantir issued a statement in 2018 saying that its product had been used only in support of a separate division tasked with investigating migrants. The controversy has failed to halt its momentum, however: just last year, it was awarded a contract to oversee the UK's post-Brexit border and customs data.

More recently, the company has shifted its attention to supporting national responses to the global pandemic via lucrative contracts with the US Department of Health and Human Services and the NHS, providing capacity management tools to deploy personnel, equipment and vaccines based on levels of demand.

Details of its initial Covid-19 Data Store contract with the NHS (alongside Google, Amazon, Microsoft, and UK-based AI firm Faculty) were only released under threat of legal action, despite the supposedly transparent nature of the UK public sector procurement system. What came to light was that the original terms granted the suppliers intellectual property rights over NHS data, which prompted an amendment to the contract.

While decisions about how the technology is used lie with the client, Palantir may struggle to shake off its nefarious reputation, earned through episodes such as the 2011 leak of documents it had prepared on how to deal with the threat of WikiLeaks. Proposals included submitting fake documents and then launching an accompanying PR campaign to discredit the organisation, an episode that led to a public apology from co-founder and CEO Dr Alex Karp.

These examples should serve as a warning for customers to hold organisations accountable. Despite the aforementioned UK border contract, it is hoped that the mistakes of the past will not be repeated, but will instead help shape and improve future practice. As individuals become more digitally savvy and engaged, demand for control and transparency over how their data is used will likely grow.