June 30, 2014 (updated 22 Sep 2016, 1:27pm)

The latest cloud products and services

A roundup of the best cloud computing news this month.

By Duncan Macrae

British Airways to roll out in-flight internet within three years

British Airways is in advanced talks with satellite firm Inmarsat to provide super-fast, reliable internet access across European airspace.

The airline said the service, which combines satellite and air-to-ground 4G connectivity, would initially be available on its UK domestic routes within three years.

London-based Inmarsat said it had teamed up with European peer Hellas-Sat in an effort to reduce initial programme costs.

Called Europasat, the $250m satellite will be constructed by French-Italian firm Thales Alenia Space and is expected to be delivered in 2016.

Inmarsat CEO Rupert Pearce said: "We believe that… with the support of EU telecoms regulators, Inmarsat can rapidly bring to market unique, high-speed aviation passenger connectivity services to meet this market demand on an EU-wide basis.

"A number of European airlines are aligned with this vision and we are absolutely delighted to announce advanced discussions with British Airways to be a launch customer on our new aviation network."


Kate Thornton, head of product and service at BA, said the airline could soon be "leading Europe in a new era of broadband in the air".

"Starting with UK domestic routes, Inmarsat intends to deploy Europe’s first ground-based 4G broadband network, giving our customers the internet access they expect on the ground while in the air," she said.

 

Amazon Web Services awarded Bitcoin cloud computing patent

Amazon has been granted a Bitcoin-related cloud computing patent that could soon allow the e-commerce giant to accept digital currencies as payment for its cloud computing services on Amazon Web Services (AWS).

The news counters recent claims by Amazon that it is not currently interested in accepting digital currencies, unlike e-commerce rivals such as eBay.

Amazon payments boss Tom Taylor said the online retailer had considered Bitcoin, but ultimately decided that there was too little interest in the virtual currency to benefit from accepting it.

Taylor told Re/Code: "Obviously it gets a lot of press and we have considered it, but we’re not hearing from customers that it’s right for them, and don’t have any plans within Amazon to engage Bitcoin."

The US Patent and Trademark Office granted Amazon the patent. It was filed on 29 March 2012.

An abstract from the patent read: "A resource can be allocated and available as long as payment has been provided. If a user wants the resource to be available for additional processing, for example, the user can submit another request with additional funding.

"The funding can come in the form of donations from any user, or in the form of investments where the investor expects some return on the investment in the form of revenue, visibility or other such compensation.

"One or more management components can track funding for various resources, can accept and select bids for period of sponsorship, and can manage various donation models."

The patent goes on to describe how different types of digital money can be accepted for cloud computing.

"Various types of digital cash, electronic money or crypto-currency can be used, such as bitcoins provided by the Bitcoin P2P currency system."

 

Acer announces further Build Your Own Cloud plans

Acer has revealed a plan to push into cloud computing amid figures indicating a shrinking PC market.

The Taiwanese firm is currently the fourth-largest manufacturer of PCs in the world, and now wants to start developing software and providing online services under the name Build Your Own Cloud (BYOC), which it sees as the future of personal cloud computing.

The global PC market shrank 10% at the end of 2013, which was when Acer announced its BYOC plans to let users store their data in the cloud and run applications entirely online.

Stan Shih, Acer founder and chairman, said: "The computer is still our foundation, but BYOC is a new platform for integration, cross-compatibility and convenience."

Acer is entering a competitive market, however, with Google and Amazon both dropping their cloud computing prices in March. Cisco and HP also recently revealed $1bn cloud investments.

Data from research firm IDC showed that Acer’s PC shipments fell by 20.2% in the first quarter of 2014, alongside an overall market drop of 4.4%.

The company has partnered with California-based software maker NTI to help push BYOC. Both companies said it is in their best interests to collaborate on launching the BYOC service.

NTI president Bill Yao said: "NTI has been a software supplier to Acer since 1999. It’s both an honour and our responsibility to be a member of Acer’s BYOC alliance. We believe in Acer’s strong vision and we are very confident that our software expertise will help realise that vision."

Shih is to retire this June, but said he will continue to help Acer find BYOC collaborators beyond that date.

Microsoft set to preview predictive analytics in the cloud

Microsoft is set to bolster its cloud portfolio with a predictive analytics tool hosted on Azure, boasting data-driven algorithms used in Bing and Xbox Live.

Azure Machine Learning will be available for a public preview in July, the tech firm announced as it promised the tool would cut analytics app development from weeks and days to hours and minutes.

The tech giant’s Data Platform general manager, Eron Kelly, told CBR the service was created to solve a problem for companies wanting to make accurate data-based predictions but who lacked the data scientists to do it.

He said: "It allows you to shift from analysing the past to anticipating the future. What’s really cool about this service is I literally push a button and I publish the model as a service. It’s running in the cloud and you don’t have to worry about hardware, health monitoring, or failover – this is all done for you.

"Because of the simple graphical interface, there’s a broader group of people who can use this service."

One of those is the company’s own Microsoft Stores, which have been using Azure Machine Learning to predict and cut future credit card fraud, using the service to build a model that tags real-time transactions as fraudulent or not based on historical data.

Kelly said the stores have seen up to a 20% reduction in fraud thanks to the model.
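To illustrate how such a model might be consumed once published as a web service, the sketch below posts one transaction’s features to a scoring endpoint from Python. The URL, key and payload layout are hypothetical placeholders for this example; the real values and schema come from whatever service is actually published in Azure Machine Learning.

```python
import json
import urllib.request

# Hypothetical endpoint and key for a published scoring service.
SCORING_URL = "https://example.azureml.net/workspaces/<ws>/services/<svc>/execute"
API_KEY = "<your-api-key>"

def score_transaction(amount, merchant_category, country, hour_of_day):
    """Send one transaction's features to the published model and return
    its prediction (e.g. a fraud flag and score)."""
    payload = {
        "Inputs": {
            "transaction": [{
                "amount": amount,
                "merchant_category": merchant_category,
                "country": country,
                "hour_of_day": hour_of_day,
            }]
        }
    }
    request = urllib.request.Request(
        SCORING_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

# A store could call score_transaction() for each incoming payment and
# route high-scoring transactions to a fraud review queue.
```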

The tool lets customers build their own algorithms, but also includes ‘out-of-the-box’ algorithms used in Bing and Xbox Live. Microsoft is using the latter – which works out which games and films to promote on a gamer’s log-in screen based on their previous interests – to let retailers push products to customers when they visit their websites.

"We’ve been using machine learning within Microsoft for a long time and as we build these algorithms we’ve made them available out of the box. There’s a huge number of different models available," confirmed Kelly.

Azure Machine Learning plays into Microsoft’s mobile-first, cloud-first strategy by allowing companies to turn models into cloud-based apps that anyone can use; Kelly said models can even be worked on from a Surface tablet.

While he declined to share the number of "partners" Microsoft has been trialling the tool with for a year, he did say the firm is committed to "actively building a pretty robust partner network".

He named one as Carnegie Mellon University in Pennsylvania, US, which is using the tool to measure campus buildings’ energy outputs, and is expecting to save 30% on its bills just by adjusting the air-conditioning and heating based on the predictions.

The news comes after a string of big data-oriented announcements from Microsoft in April, including its "big data in a box" Analytics Platform Service.

 

Google plans global cloud-based Wi-Fi

Google is collaborating with Wi-Fi hardware firm Ruckus Wireless to develop a cloud-based Wi-Fi network.

The network would enable businesses to improve their Wi-Fi services by removing the physical Wi-Fi controller and placing it in the cloud, allowing thousands of individual access points anywhere in the world to hang off the same network.

The news was first reported by The Information, which cited an anonymous source familiar with the matter. Both Google and Ruckus Wireless have declined to issue an official statement.

The network would be subsidised by Google, and be part of a general push to get more people using Google’s Web services, the company’s primary revenue generator.

Facebook recently drew battle lines with Google over the mobile web, announcing ‘deep-link’ plans to capitalise on the rising tide of mobile web revenue as Google falls behind.

According to the news report, the project could be unveiled this summer, and is targeting businesses such as restaurants, offices, gyms, and perhaps even public services such as libraries.

The new system would use software-based wireless controllers to virtualise Wi-Fi management functions in the cloud, resulting in a global network that a business could join.

Once the service goes live, users will see the Wi-Fi network much like a home network, connecting and disconnecting as they enter and leave places that use the service. The plans could deliver a significant stream of web revenue that bypasses the traditional mobile carrier industry.

Cisco Systems and Ericsson are both working on similar projects.

 

Dozens of cloud providers flock to achieve trust ‘seal of approval’

More than 50 cloud service providers have joined a cloud trust programme that gives them a seal of approval for meeting data protection and legal requirements.

The CloudTrust Program was launched in January 2014 by Skyhigh Networks, and evaluates cloud service providers on data control, service security, business practices and legal protection. Providers who pass the requirements are awarded with the Skyhigh Enterprise-Ready seal of approval.

Providers who have joined the programme include Accellion, Backupify, Egnyte, TargetX and WatchDox.

Brian Lillie, CIO of Equinix, an IBX data centre and colocation provider, said he used the seal of approval to help select a service provider for his firm.

Lillie said: "We found that our employees were using multiple file-sharing services and that many of these services posed a risk to the organisation. We selected the most popular service rated Skyhigh Enterprise-Ready, which was the obvious choice from both a productivity and security standpoint. The CloudTrust rating reduced a process that takes months down to a few hours."

Skyhigh said the initiative is open to all cloud service providers, including IaaS, PaaS and SaaS providers, and certification is free.

More than 30 categories are represented in the programme, covering areas such as collaboration, customer relationship management, networking, ecommerce and cloud infrastructure.

Skyhigh Networks CEO Rajiv Gupta said the use of cloud services often goes unapproved because of a perception that cloud service security is universally insufficient.

Gupta said: "While security is lacking for many cloud services, 7% of cloud service providers have invested time and resources in developing services that adhere to rigorous enterprise security requirements. We started the Skyhigh CloudTrust Program to highlight these services and ease the cloud adoption lifecycle for CIOs and IT departments."

VMware: public clouds are ‘modern silos’

Public cloud providers create "modern silos" and add complexity to businesses, VMware claimed as it showed off its hybrid cloud solution at its vForum event in London this month.

The virtualisation specialist said companies could move applications seamlessly between their on-premise infrastructure and hybrid cloud environment with its vCloud Hybrid Service.

The firm’s chief technology officer Joe Baguley told the event’s audience: "If you’re using the word migrating, you’re in the wrong mindset."

VMware’s hybrid cloud chief technologist Richard Munro told CBR the service means all the tools companies use on-premise to manage their applications still apply in the cloud – typically not the case with public cloud.

He said enterprises don’t migrate everything to one public provider because each service meets only some of their needs, leading to a multitude of providers.

"The way you operate each public cloud is completely different – the tools you have won’t work with them," he said. "Effectively, what you’re doing is creating modern silos like you had in the past. If you do that too many times you’re going to have too many silos and a very complex IT estate."

Munro added: "Our design decision has been very different from public cloud. Public cloud has built something and said ‘come and do it our way and we’ll try and reach in to what you have already [with plug-ins]’.

"We add this service as an extension of what you already use."

The company claims its vCloud Hybrid Service provides the flexibility of cloud with the security of on-premise, and could help cut the amount of time the business waits on IT to roll out new services.

"A five-month wait is not good when you’re dealing with a line-of-business request," said Munro. "Maybe they could go to a public cloud, but then they can’t bring that back into the organisation.

"With this, what you can do is decide to run this on-premise, but first provision this in VMware hybrid cloud on your build and manage it with your tools. Then when I’ve got the on-premise service stood up I’ll just copy it off the hybrid cloud component onto on-premise."

The comments come after Sage told CBR last month that companies are not interested in enterprise resource planning (ERP) cloud software because the cloud would render their different ERP systems identical.

Sage ERP CEO Christophe Letellier said: "Cloud by definition is standard. [We must make] sure our product is as configurable as possible. Through configuration you have the opportunity to adapt."

VMware’s service comes with a Disaster Recovery as a Service option that Munro conceded doesn’t compete with the best enterprise tools on the market, but is aimed at those without existing disaster recovery.

Using it, you can right-click to replicate any virtual machine on your estate in the cloud, with recovery time ranging from 15 minutes to 24 hours.

 

IBM promises speed to cloud customers

IBM is working on integrating its cloud computing platform with the open source Docker community to enable companies to deploy applications faster and with greater flexibility.

The company has simultaneously launched the Direct Link service so that customers can connect directly to the SoftLayer cloud computing platform from their own dedicated IT network.

Integration with Docker will allow IBM to host Docker Hub on SoftLayer, provide customers access to all relevant content from the Docker data repository, and offer better performance and flexibility through a certified Docker image of the cloud-optimised WebSphere Liberty Profile Application Server.

Customers can expect faster start-up times and a 50% reduction in memory usage with a unique combination of SoftLayer bare metal servers, IBM Java, Docker and the IBM WebSphere Liberty Profile.
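As a rough sketch of what the Docker integration makes possible, the example below pulls the WebSphere Liberty image from Docker Hub and starts a container using the Docker SDK for Python. The image name refers to the websphere-liberty repository on Docker Hub; the port and container settings are assumptions made for this example rather than details from IBM’s announcement.

```python
import docker  # Docker SDK for Python (pip install docker)

client = docker.from_env()

# Pull the WebSphere Liberty image from Docker Hub.
client.images.pull("websphere-liberty")

# Run it, exposing Liberty's default HTTP port (9080) on the host.
container = client.containers.run(
    "websphere-liberty",
    detach=True,
    ports={"9080/tcp": 9080},
    name="liberty-demo",
)

print(container.id)

# An application would typically be added by copying it into the image's
# dropins directory in a derived image, then deploying that image to
# SoftLayer (or any other Docker host).
```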

IBM Open Technologies and Cloud Performance vice-president Angel Diaz said: "The IT industry continues to rally behind a cloud built on open and community-driven technologies. IBM continues to firmly believe that the best means to truly drive widespread adoption and innovation of cloud computing is to ensure open technologies are at its core."

Meanwhile, the Direct Link service enables customers to create hybrid computing systems that merge private infrastructure with the SoftLayer platform. It gives them complete control over infrastructure, connection speed and the route used to connect to SoftLayer, and improves security by removing the need to use the public internet.

SoftLayer CTO Matt Chilek said: "Direct Link helps them [the companies] optimise their workloads and get more value out of their data. They can move both to and from SoftLayer as easily as if our bare metal and virtual servers and storage were part of their local area network."

 

G-Cloud ‘lacks the transparency’ to succeed

Cloud computing trade body the Cloud Industry Forum says the UK Government’s G-Cloud initiative has failed due to a lack of transparency.

G-Cloud, launched in 2012, was intended to make it easier for UK Government departments and other public sector bodies to procure commodity IT services that use cloud computing. Its aim was to cut £120m a year from the public sector IT bill by encouraging the public sector to purchase IT products and services through the Government’s CloudStore digital marketplace.

But the results of a Freedom of Information (FoI) request by IT services company Bull Information Systems suggest that local councils have been ignoring G-Cloud, costing taxpayers millions of pounds per year.

A total of 26 out of 27 UK county councils responded to the FoI request, which showed that in the 2012-13 financial year the councils spent almost £440m on IT services (including staffing costs), but only £385,000 of that (less than 0.1%) was spent through the Government’s G-Cloud framework.

The Government’s Major Projects Authority (MPA), which works to improve project performance for the taxpayer, has since given G-Cloud an ‘amber/red’ status due to its flagging performance.

Now, the Cloud Industry Forum (CIF) has questioned exactly what G-Cloud has really achieved since its launch. According to Alex Hilton, CIF’s CEO, G-Cloud lacks the single ingredient that must underpin every procurement service: transparency.

He said: "Our latest research on the UK market indicates that cloud adoption rates in the public sector match those in the private sector, both standing at 69%, but this enthusiasm does not seem to have spread to local government, which simply hasn’t taken to G-Cloud as was predicted.

"From its inception, G-Cloud held a great deal of promise and we fundamentally support a consistent approach to cloud procurement by the Government. The Government’s stated aspiration is for 25% of central procurement to be through SMEs, but this does not seem to be following through to local authorities."

CIF’s Code of Practice is a certification model for cloud procurement services. The forum encourages G-Cloud providers to promote it to their local authority customers to further assure their cloud credentials.

Hilton added: "Whilst the European Commission is driving its Digital Agenda for Europe, the UK Government doesn’t itself subscribe today to any certification schemes. We believe it should be offering more assistance and guidance in the selection of suitable and trustworthy cloud providers."

CIF was established in 2009 to provide transparency through certification to a Code of Practice for credible online cloud service providers, and to help end users obtain the core information they need to adopt these services.

 

 

 
