1.5 million. That’s the number of news stories and press releases that reach Bloomberg every day – information that’s extracted and scrutinised by the firm’s small army of analysts before being offered to the software giant’s vast network of financial clients. But while the global market for financial market data is now worth $42bn, that informational avalanche means that actually getting from input to insight is far from straightforward.
That’s doubly true given the rising power of machine learning. LLMs are immensely useful for condensing information and answering questions. That promise is doubtless reflected in Bloomberg’s own investments in generative AI for summarising earnings calls, as well as in research suggesting the technology could unlock $1tn in global banking revenue by 2030.
But as Anju Kambadur explains, designing an LLM is pointless if it can’t then be deployed efficiently. “Once you build these models,” says Kambadur, Bloomberg’s head of AI engineering, “you still have to convert into a scalable service.”
Unfortunately, LLM service deployment has traditionally been beset by challenges. With teams juggling a range of different ML frameworks and inference protocols, development has long been fragmented, even as AI engineers were forced to become quasi-experts in everything from GPU scheduling to disaster recovery.
Not only did that waste time and money better spent on core competencies – like building new models – it also increased the risk that mistakes would creep in and LLMs would produce faulty results. Especially in a sector like finance, one predicated on accuracy and speed, Kambadur stresses that even minor errors could be “catastrophic”.
Harnessing AI at Bloomberg
Enter KServe. Co-developed by Bloomberg – alongside Nvidia, IBM Cloud, AWS and other tech titans – this open-source model serving platform simplifies and accelerates the deployment of ML models. Sitting between LLM inference on the one hand and client-facing outputs on the other, the technology helps “facilitate the interaction” between the two, explains Peter Krensky of Gartner – unifying the model-deployment process in a single place while cutting the need for specialised skills along the way.
And if that means Bloomberg can get new AI products to market faster, that’s not KServe’s only strength. Built on Kubernetes, the open-source container orchestration platform, it isn’t tied to particular physical servers, instead allowing models built with distinct ML frameworks to sit side by side in the cloud.
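In practice, that abstraction is declarative: engineers describe the model they want served and KServe turns the description into a running endpoint. The snippet below is a minimal sketch using KServe’s Python SDK; the service name, namespace, runtime and storage URI are invented for illustration rather than drawn from Bloomberg’s actual setup.

```python
# A hypothetical sketch of deploying a model behind KServe via its Python SDK.
# All names, namespaces and URIs below are illustrative placeholders.
from kubernetes import client
from kserve import (
    KServeClient,
    constants,
    V1beta1InferenceService,
    V1beta1InferenceServiceSpec,
    V1beta1PredictorSpec,
    V1beta1ModelSpec,
    V1beta1ModelFormat,
)

isvc = V1beta1InferenceService(
    api_version=constants.KSERVE_V1BETA1,
    kind=constants.KSERVE_KIND,
    metadata=client.V1ObjectMeta(name="news-summariser", namespace="ai-serving"),
    spec=V1beta1InferenceServiceSpec(
        predictor=V1beta1PredictorSpec(
            model=V1beta1ModelSpec(
                # Runtime and model location are stand-ins for whatever a team actually uses.
                model_format=V1beta1ModelFormat(name="huggingface"),
                storage_uri="s3://example-models/summariser/v1",
            ),
        ),
    ),
)

# KServe converts this declarative spec into a load-balanced, autoscaling
# inference endpoint, so engineers never provision servers by hand.
KServeClient().create(isvc)
```

The same definition could equally be written as a Kubernetes YAML manifest and applied with kubectl, which is part of what makes the approach portable across teams and clouds.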
With Kambadur alone working alongside over 350 engineers, that level of abstraction inevitably simplifies workflows. Managers, meanwhile, can more easily trace compute resources, vital for ensuring budgets stay in the black.
That focus on visibility is clear elsewhere too. Because it manages GPU, CPU and memory scheduling, for instance, KServe can prioritise between LLMs, useful for dealing with busy periods like earnings season. At the same time, users can monitor a sample of outputs from LLMs in real time. When a result is flagged as potentially problematic, it’s removed from client view before a retraining and redeployment workflow improves the model behind it.
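The scheduling side of that picture ultimately rests on per-service declarations: each deployed service states its own compute appetite and scaling bounds. The sketch below extends the earlier hypothetical example; the replica counts and resource figures are invented for illustration, not Bloomberg’s real numbers.

```python
# Hypothetical per-service resource and scaling declarations for KServe.
from kubernetes import client
from kserve import V1beta1ModelFormat, V1beta1ModelSpec, V1beta1PredictorSpec

predictor = V1beta1PredictorSpec(
    # Autoscaling bounds: idle periods fall back to a single replica,
    # while an earnings-season spike can fan out to eight.
    min_replicas=1,
    max_replicas=8,
    model=V1beta1ModelSpec(
        model_format=V1beta1ModelFormat(name="huggingface"),
        storage_uri="s3://example-models/summariser/v1",
        # Explicit GPU, CPU and memory requests and limits give the scheduler
        # the information it needs to prioritise between competing LLM
        # workloads on shared hardware.
        resources=client.V1ResourceRequirements(
            requests={"cpu": "4", "memory": "16Gi", "nvidia.com/gpu": "1"},
            limits={"cpu": "8", "memory": "32Gi", "nvidia.com/gpu": "1"},
        ),
    ),
)
```

Because every service states its requests and limits up front, tracing which models are consuming which resources, and charging them to the right budget, becomes a matter of reading the cluster rather than reverse-engineering it.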
Kambadur especially highlights how helpful all this has been in bolstering Bloomberg’s transcript summarisation service – but it’s equally clear that KServe has improved the company’s AI modelling across thousands of pipelines. “Pretty much everything that runs on our data science platform uses KServe,” Kambadur says. “It’s pervasive that way.”
KServe a testament to the growing popularity of Kubernetes
Bloomberg isn’t the only firm to benefit here. KServe is an open-source effort, and IBM and Google are just two of the other companies leveraging it for their own projects. Kambadur, for his part, suggests that collaboration is important in principle, arguing that “we fundamentally have an obligation to contribute back” to the tech community.
More than that, though, the Bloomberg executive explains that schemes like KServe offer practical advantages too. As he points out, relying on open-source solutions means that when Kubernetes updates its software, Kambadur and his colleagues don’t need to painstakingly adapt to new standards themselves.
Krensky, for his part, is keen to highlight the remarkable ubiquity of KServe’s underlying technology. “Kubernetes,” he says, “is one of those incredible things, where it came along, and the entire technical community – academic, corporate, everything in between – agreed that it should be the standard for how we do cluster-based computing and containerisation.”
More to the point, Krensky sees KServe as “just one of the many” software variants to emerge from that same consensus. The evidence certainly suggests Kubernetes is transforming AI far beyond Bloomberg. In March, for example, Microsoft unveiled an add-on for its Azure Kubernetes Service, which automates LLM deployment based on available GPU and CPU resources. In April, Nvidia acquired Run:ai, an Israeli startup that built its own Kubernetes-based orchestration software.
No wonder Kambadur thinks that open-source solutions will “continue to be pervasive” across the AI space and beyond – even as he adds that firms must equally be sure to protect their own value propositions. Considering how much money there is to be made from those 1.5 million daily articles, that caveat is hardly surprising.