In late November, three interlocking themes dominated a night of discussion among senior IT decision makers – artificial intelligence use cases, the challenges of deploying data, and the merits or otherwise of cloud computing. The Tech Monitor executive roundtable – convened in association with Lenovo’s advisory division, the Solutions and Services Group (SSG) – also covered topics as diverse as the case for green IT and how to overcome the current talent shortage.
Here are five takeaways from an evening of lively discussion and shared experiences.
1. Proliferation of AI use cases suggests growing maturity
Attendees offered a wide range of AI use cases that are currently in, or nearing, production. These included a risk model for reinsurers, tools for fuel optimisation and predictive maintenance, accelerated delivery of clinical trials in life sciences, data consolidation, and a market intelligence and price-prediction tool for the energy sector. AI is even being applied to cricket, where a bowling machine enhanced by artificial intelligence assimilates a batsman’s strengths and weaknesses in real time and adapts the balls it bowls accordingly.
What do all these use cases have in common?
Most likely, all fit into one of the four categories devised by one attendee in an effort to categorise her own company’s AI efforts. The matrix, she explained, crosses enhancement against replacement: a project is either human-enhancement or human-replacement, or systems-enhancement or systems-replacement.
2. Cloud computing. Part 1. Workloads where?
During the evening a consensus of sorts appeared to emerge: for any high-performance computing (HPC) environment, on-premises is always preferable to placing that workload in the cloud, as the latter will almost always cost more than the former. By contrast, if you are hosting an application to run a monthly report, for example, with little or no activity for the remaining 29 or 30 days of the month, then the cloud makes absolute sense.
Many organisations are reconsidering their public cloud strategy in light of their ongoing experience. Cost is driving many to repatriate workloads. The need for low latency, or to meet data sovereignty and residency requirements, are other factors pushing some workloads back on-premises, sometimes into a hybrid cloud environment. Hybrid cloud offers a cloud-like experience and a pay-as-you-go model on privately held infrastructure.
3. Cloud computing. Part 2. One public cloud provider or two?
“Never operate with just a single public cloud provider,” urged an attendee during the evening. Why use two or more providers? “It keeps them honest on pricing,” he offered. By a show of hands, around 80 percent of the companies represented around the table used two or more public cloud providers. Soon the Digital Operational Resilience Act (DORA) may effectively enshrine the use of multiple cloud providers in law – for some organisations, at least.
In addition to the cost argument, a second provider is often deployed to provide resilience in the form of disaster recovery. Other reasons for multiple cloud providers include uncertainty around the availability of GPUs, the workhorse processors needed to power AI projects.
Asked to identify which hyperscalers they used, two thirds said Microsoft Azure, 58 percent AWS, and 17 percent Google Cloud Platform. Another 17 percent relied on IBM for public cloud.
4. Environmental sustainability struggles for oxygen
ESG may be moving further down the corporate agenda, if a straw poll of the night’s attendee organisations is any guide. Sustainability matters to individuals, a number of whom are lobbying to increase the weight of environmental criteria in their organisations’ decision making and requests for proposals (RFPs).
However, most organisations are putting cost above all else. This might always have been the case, but fewer companies are shy of saying so today than 18 months ago. One attendee put it brutally, arguing that if it was a choice between “oxygen for the business or reducing CO2”, he’d choose business oxygen every time.
Others were more optimistic, suggesting that if organisations could better measure their own carbon footprint, and those of their suppliers, they could make more informed – and more responsible – decisions. Some suggested that as technology solutions become more efficient – water cooling of data centres, for example, rather than enormously expensive and emissions-heavy heating, ventilation and air conditioning systems – the emissions crisis would be tackled.
5. The skills crisis requires creative solutions
For most companies represented around the table, there is a clear deficit of skilled IT practitioners in the market; demand is outstripping supply. This is particularly acute among security professionals (think SecOps) and AI specialists. Some organisations are upskilling existing staff, topping off their existing specialisms with a “layer” of intensive AI training. Others, who want to be leaders in artificial intelligence within the next two years, know they probably need to hire “ready-made” AI expertise, which is neither easy nor cheap in this market.
There’s a challenge even before the recruitment process starts – and that’s defining the role you need. Given that most of these AI roles are brand new, companies must create job specs from scratch.
One creative approach to the talent gap is to hire interns, graduates and apprentices, all of whom come with three advantages. First, they are enthusiastic. Second, they are quick learners. Third, they have yet to be “spoiled by the corporate IT mindset”.
‘Enterprise IT in the Age of AI’ – a Tech Monitor executive roundtable in association with Lenovo – took place at Searcy’s at The Gherkin, London on Thursday 28 November 2024.