In Gartner’s recent Hype Cycle for Emerging Technologies diagram, cloud computing was found to be at the very top of the so-called ‘Peak of Inflated Expectations’. That’s hardly surprising given the hype, but I’d argue that it won’t be long before it plunges into the ‘Trough of Disillusionment’.
There are still some grave problems with cloud computing today – a style of computing that treats IT as a service delivered remotely, usually in a highly virtualised form, with the flexibility to scale up and down as the business requires.
Cloud is said to offer efficiency and agility, apart from anything else by enabling the IT department to concentrate on more strategic projects instead of worrying about the infrastructure – the cloud provider looks after that for you. It’s argued that in some cases cloud could also save a company money, because the cloud provider is able to rely on economies of scale to bring the cost down for a number of customers.
So what are the drawbacks? William Fellows, an analyst at The 451 Group, wrote of the potential problems with cloud: "Security, red tape, fate sharing, legacy infrastructure, connectivity, offline access, performance, volume, SLA management, job security and territorialism, to name a few."
Meanwhile last year, Ovum’s Tim Jennings wrote: "Service levels, security, and performance are all areas where customers will need reassurance and clear evidence of adequate resilience."
Speaking to CBR, Zohar Gilad, executive vice president, products, marketing, and channels at transaction performance management player Precise Software, noted: "All the old problems remain when you move to cloud computing, whether public clouds or private clouds. There’s the risk of a degradation of service as applications compete for resources – we call this inter-application resource contention."
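The contention Gilad describes can be illustrated with a minimal sketch – not any vendor’s method, just a toy proportional-share model in which two applications compete for one host’s finite CPU capacity:

```python
# A minimal sketch of inter-application resource contention: two applications
# share one host's CPU capacity, and each one's service time stretches as
# total demand exceeds what the host can supply. All figures are illustrative.

def service_time(own_demand, other_demand, capacity=100.0, base_time=1.0):
    """Return the effective service time for an app given contention.

    With total demand below capacity, the app runs at its base speed.
    Above capacity, a proportional-share scheduler slows everyone down
    by the overcommit ratio (total demand / capacity).
    """
    total = own_demand + other_demand
    if total <= capacity:
        return base_time
    return base_time * (total / capacity)

# Alone on the host, the app meets its baseline...
print(service_time(own_demand=40, other_demand=0))    # 1.0
# ...but a noisy neighbour pushes total demand to 160% of capacity,
# stretching the same transaction by the overcommit ratio.
print(service_time(own_demand=40, other_demand=120))  # 1.6
```

The point of the toy model is that the degradation comes from the neighbour, not from the application itself – which is exactly why it is hard to diagnose without monitoring that can see across VMs.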
Indeed, Gilad believes that so-called private cloud computing – similar to the public cloud concept, but using virtualisation, automation and service-oriented architecture so that an internal IT function starts to look like a cloud provider to its own business – actually introduces a number of new challenges that are harder to overcome than some of the older problems of an entirely physical estate.
"These virtual machines can now be created, moved and deleted with the click of a mouse," Gilad said. "That gives IT a lot of flexibility but introduces new challenges as to how you monitor service levels, for instance. Also, if a virtual machine has been decommissioned, how do you know later when a problem with that service is reported, just what was wrong with it?"
Precise recently launched Precise for Cloud, said to enable companies to ensure quality of service for applications running on private cloud platforms. "It does three main things," Gilad told us. "We co-provision our agents with the application so as soon as a virtual machine is created, a Precise agent is automatically provisioned to monitor transaction performance. The same if a virtual machine (VM) is moved with VMware VMotion: our agent automatically follows that VM."
"But Precise for Cloud is also persistent, so even if a VM is no longer in service we can analyse what it was doing and diagnose any problems that might affect future VM configurations," Gilad said. "And the third thing is integration with VMware so we can fully connect virtual machine events with application performance."
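The three behaviours Gilad describes – auto-provisioning on VM creation, following a migrated VM, and keeping history after decommissioning – can be sketched as a simple event-driven registry. Names and event shapes below are invented for illustration; this is not Precise’s actual implementation:

```python
# A hypothetical sketch of agent co-provisioning: a monitoring agent is
# attached whenever a VM is created, follows the VM when it migrates, and
# its collected history persists after the VM is deleted.

class AgentRegistry:
    def __init__(self):
        self.agents = {}    # vm_id -> host currently being monitored
        self.history = {}   # vm_id -> lifecycle events, kept even after
                            # decommissioning (the "persistence" property)

    def on_event(self, event):
        vm, kind = event["vm_id"], event["kind"]
        self.history.setdefault(vm, []).append(event)
        if kind == "created":
            self.agents[vm] = event["host"]    # auto-provision an agent
        elif kind == "migrated":
            self.agents[vm] = event["host"]    # the agent follows the VM
        elif kind == "deleted":
            self.agents.pop(vm, None)          # agent retires; history stays

registry = AgentRegistry()
registry.on_event({"vm_id": "vm-1", "kind": "created", "host": "esx-a"})
registry.on_event({"vm_id": "vm-1", "kind": "migrated", "host": "esx-b"})
registry.on_event({"vm_id": "vm-1", "kind": "deleted", "host": "esx-b"})
print(registry.agents)                 # {} - no live agent remains
print(len(registry.history["vm-1"]))   # 3 - events survive for diagnosis
```

The retained history is what answers Gilad’s earlier question: when a problem with a decommissioned VM is reported later, there is still a record of what it was doing.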
Service levels under scrutiny
If further evidence were needed that management of service levels, latency and performance remain an issue in private or public clouds, witness CA Technologies’ move earlier this week to snap up cloud management firm Hyperformix.
CA, which already has a number of products for the management of virtual or cloud environments, said Hyperformix’s capacity management enables customers to discover how physical, virtual, hardware, software, storage, and network resources are being used, and to determine what resources will be needed in the future.
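The forecasting half of capacity management can be illustrated with a generic least-squares sketch – fit a trend to historical utilisation and project when a resource hits its ceiling. This is an assumption-laden illustration, not Hyperformix’s model:

```python
# A minimal capacity-planning sketch: fit a straight line to weekly
# utilisation samples and project when the trend crosses 100%.

def weeks_until_full(samples, capacity=100.0):
    """samples: utilisation (%) per week, oldest first. Returns the number
    of weeks until the linear trend reaches capacity, or None if usage is
    flat or falling."""
    n = len(samples)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    denom = sum((x - mean_x) ** 2 for x in xs)
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, samples)) / denom
    if slope <= 0:
        return None                       # no growth; no exhaustion date
    current = mean_y + slope * ((n - 1) - mean_x)  # fitted level this week
    return (capacity - current) / slope

# A storage pool growing roughly 5 points a week from 60%:
print(weeks_until_full([60, 65, 70, 75]))  # 5.0 weeks until 100%
```

Real capacity-management products layer workload modelling and what-if simulation on top of this, but the underlying question – when does demand meet supply? – is the same.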
"Virtualisation capacity management is among customers’ most critical IT management needs," said Roger Pilc, general manager of CA Technologies’ virtualisation and automation customer solutions business. "An important aspect of our virtualisation and cloud strategies is to help customers overcome the ‘VM stall’ challenges they encounter as they roll out virtualisation and progress to a dynamic, cloud-based data centre architecture."
The firm said the acquisition will help companies avoid the pitfalls of virtualisation that include VM Sprawl (uncontrolled VM deployments) and VM Stall (the inability to move beyond initial virtualisation of 20-30 percent of servers).
The technology is also said to help companies ensure application performance and service level agreements, and also help achieve faster and more reliable data centre consolidations, virtualisation rollouts, platform refreshes and application migrations.
"Capacity management is an important discipline for customers attempting to expand their virtualisation and private cloud deployments," said Peter Klante, president and CEO, Hyperformix. "As a virtualisation rollout progresses through the enterprise, insight and comprehensive planning are required to realise its potential. CA Technologies and Hyperformix will deliver on that critical need."
In a related announcement earlier in the week, Gartner said it expects 25% of workloads to be virtualised by the end of the year. It said that as virtualisation matures, the next "big thing" will be automating the composition and management of virtualised resources.
Precise, too, believes it has found itself in the right place, portfolio-wise, at the right time. As well as performance management for cloud environments, it offers application performance management and transaction performance management for a range of packaged applications, Java and .Net apps, databases and storage arrays. It has been ramping up direct and channel sales around Precise for Cloud, opened new offices in France and Germany, and moved its UK HQ from Basingstoke to the City of London.
In mid-September, meanwhile, it was Novell that launched its new Cloud Manager product, saying it enables customers to create and manage a private cloud computing environment as an extension of existing data centre resources. It played up the fact it supports all the leading hypervisors – today Precise Software only supports VMware though Gilad said the firm will be adding more.
Novell said that in a recent survey of around 200 IT professionals at large enterprises, conducted by Harris Interactive and sponsored by Novell, 89% said they see private clouds as the next logical step for organisations already using virtualisation, and 93% feel private cloud platforms should offer a management framework that can span a heterogeneous infrastructure.
Yet one of the obvious problems is keeping track of all of these new virtual resources. "The phenomenal adoption of virtualisation technologies within the data centre has solved many IT problems, but has also made it easy for rapid proliferation of individual virtual machines that consume the finite available physical resources at a very rapid pace," said Fred Broussard, research director, PC, device and IT service management software at IDC. "What today’s changing IT environments demand is a practical solution that will help deliver on the promise of utility computing and attain the ROI benefits of virtualisation – products that create and manage cloud environments by provisioning whole IT services, comprised of one or more workloads, in a controlled, secure and compliant way."
Novell found at least one customer happy to endorse its new Cloud Manager approach: "Novell Cloud Manager will simplify our IT service provisioning by enabling us to treat our physical resources and virtual infrastructure as a seamless private cloud and gain a much clearer and more granular understanding of our IT use and costs," said Ryan Klose, chief information officer of Premium Wine Brands for Pernod Ricard.
Of course, as well as monitoring and management of performance and service levels, another key concern about both public and private cloud computing is security. While many more servers are now virtual, they still need a range of protective measures and approaches if they are not to pose an even greater risk to the business than physical servers ever did.
In the recent survey sponsored by Novell, 91% of respondents noted concern about the inherent security risks public clouds present. But even private clouds carry risks inherent in the virtualisation of resources.
As CBR has explained before, the ability to create new virtual servers in seconds means that companies need to guard not just against server sprawl, but also check that those servers are configured according to company policy, carry adequate security measures and are patched up to date. The ability to ‘roll back’ a virtual machine to a previous state means it might inadvertently be returned to a state from before a vital patch was applied.
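The rollback hazard reduces to a simple compliance check: compare each VM’s installed patch set against the current security baseline, and flag machines that have been rolled back past a vital fix. A minimal sketch, with patch IDs invented for illustration:

```python
# A minimal patch-compliance sketch. The baseline and patch IDs below are
# hypothetical; a real tool would pull these from a patch-management system.

REQUIRED_PATCHES = {"KB-101", "KB-205", "KB-310"}   # illustrative baseline

def missing_patches(vm_patches):
    """Return the set of required patches this VM lacks."""
    return REQUIRED_PATCHES - set(vm_patches)

# A VM rolled back to a pre-patch snapshot silently loses KB-310:
snapshot_state = ["KB-101", "KB-205"]
print(missing_patches(snapshot_state))                  # {'KB-310'}
print(missing_patches(["KB-101", "KB-205", "KB-310"]))  # set() - compliant
```

The check itself is trivial; the operational difficulty is running it continuously against VMs that can be created, rolled back or deleted in seconds.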
Ovum principal analyst Graham Titterington insists that the potential threat of attack from within or outside an organisation is very real. "The thing about virtual machines is that the data on them is nearly always in use, and data that is in use is never encrypted, because once you encrypt it, it’s not much use to you," he said.
Meanwhile Neil MacDonald, VP and Gartner Fellow, has said: "In the rush to virtualise for cost savings, security and management issues are often afterthoughts, resulting in a reduction in overall security levels from physical environments. To avoid unexpected costs or increased and unexpected risks, engage proactively in a discussion of the security and management issues associated with a virtual environment before widespread virtualisation initiatives are undertaken."
Andrew Yeomans of the Jericho Forum – a group of security professionals who offer advice, guidance, best practice and the like – notes that many organisations forget about security best practices when it comes to virtualised environments. "People are doing things in virtualised environments that they would never have done in the physical world," he says.
Companies are still going to need the likes of antivirus and intrusion prevention/detection systems (IPS/IDS) in the brave new world of private clouds. Indeed last year, Sourcefire announced its first VMware-based virtual appliances to extend IPS protection to virtualised systems and remote office locations.
Its first VMware-based virtual appliances include the Sourcefire Virtual 3D Sensor and Sourcefire Virtual Defence Centre. Compatible with VMware’s ESX and ESXi platforms, the Virtual 3D Sensor offers IPS protection from 5 to 500Mbps and can be monitored and managed by physical or virtual Defence Centre management consoles. Then in March this year the company added support for the Xen hypervisor.
Sourcefire’s Dominic Storey told CBR at the launch that the virtual intrusion detection capability is "out of band" – in other words, it runs parallel to any traffic between virtual machines rather than sitting in the data stream. The use of additional security technology like Sourcefire’s will always have some impact on the performance of the underlying server hardware, since it must use a few CPU cycles of its own – "That’s just maths," Storey notes – but it shouldn’t add noticeable latency to the virtual machines or any traffic between them.
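Storey’s distinction can be shown with a toy model: an inline IPS sits in the data path, so its inspection time adds directly to packet latency, whereas an out-of-band sensor inspects a mirrored copy, burning CPU but leaving delivery latency untouched. All figures below are illustrative:

```python
# A toy model of inline vs out-of-band inspection. Delay and cost figures
# are invented for illustration only.

NETWORK_DELAY = 2.0   # ms per packet on the wire
INSPECT_COST = 0.5    # ms of CPU per packet inspected

def inline_latency(packets):
    # inline: every packet waits for inspection before delivery
    return [NETWORK_DELAY + INSPECT_COST for _ in packets]

def out_of_band_latency(packets):
    # out of band: packets are delivered immediately; a mirrored copy is
    # inspected in parallel, costing CPU cycles but not delivery time
    mirror = list(packets)                  # span/mirror-port copy
    cpu_spent = INSPECT_COST * len(mirror)
    return [NETWORK_DELAY for _ in packets], cpu_spent

pkts = ["p1", "p2", "p3"]
print(inline_latency(pkts))        # [2.5, 2.5, 2.5]
delays, cpu = out_of_band_latency(pkts)
print(delays, cpu)                 # [2.0, 2.0, 2.0] 1.5
```

This is the trade Storey alludes to: the CPU cost of inspection does not disappear, it simply moves out of the latency-critical path.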
Which brings us back to one of Precise Software’s Precise for Cloud capabilities: to spot performance problems in communication between virtual machines, as well as between the VMs and applications, storage or indeed users.
But with all the additional complexity that private cloud computing is liable to usher in, will companies not be put off even attempting the switch, particularly for more mission critical applications? "From the customers we have talked to, the number moving mission critical applications into private clouds is in the low single digits today," says Precise’s Gilad. "But they know they will get there, the only question is when."
Time to do more?
Gartner is certainly urging companies to do more virtualisation. "Virtualisation now drives efficient IT from all angles, including data centre design, platform updates, and application and infrastructure modernisation, as well as traditional and new delivery models, such as infrastructure utility and cloud computing," research vice president Philip Dawson said.
But the move to virtualisation, and beyond that to private cloud computing, is not necessarily a simple one. "IT professionals are between a rock and a hard place," says Precise’s Gilad. "On the one hand private cloud has the potential for agility, flexibility, cost savings. But the question, especially when you start talking about mission critical applications, is how you manage the transition before, during and after migration."
While it seems likely that more and more companies will start to move towards private cloud infrastructures as the next logical step after virtualising resources, many argue it’s really just virtualisation, a bit of SOA and a bit more automation given a snazzy new name. Either way, companies would be wise to think about the performance, management and security implications of the rush to get on the cloud bandwagon.