Cloud Computing Price-Performance Could Vary By 1000%!

Yes, you read that right. The price-performance of your cloud computing infrastructure could vary by as much as 1000 percent depending on time and location. High levels of variability have been observed even within a single cloud service provider (CSP) processing the exact same job, which means the cost of running the exact same job in the cloud could vary by just as much.

This surprising result was discovered by a Rice University group, headed by Dr. T. S. Eugene Ng, that has been focusing on cloud computing. The group recently published its joint work with Purdue University, “Application-Specific Configuration Selection in the Cloud: Impact of Provider Policy and Potential of Systematic Testing,” in the IEEE INFOCOM 2015 conference proceedings. That paper took a first step toward understanding the impact of cloud service provider policy and toward tackling the complexity of selecting configurations that can best meet the price and performance requirements of applications. The work also led to a collaboration between Rice University and Burstorm, a developer of computer-aided design (CAD) software built specifically to support cloud computing architects.
The Burstorm platform contains a product catalog of over 36,000 products across 900 CSP product sets. Working with Dr. Ng’s group, the study looked at seven suppliers across three continents (Asia, North America and Europe), covering a total of 266 compute products spread over three locations per vendor, where available. Raw data was collected every day for 15 days, and the results were then normalized to a 720-hour monthly pricing model. The final output was a set of price-performance graphs used to examine variance in both performance and price, across CSPs as well as geographic regions.
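To make that normalization concrete, here is a minimal sketch of the arithmetic, assuming price-performance is expressed as benchmark score per dollar of monthly spend. The instance labels, hourly prices, and benchmark scores are invented for illustration and are not figures from the study.

```python
# Minimal sketch: normalize hourly prices to a 720-hour month and compute
# a price-performance metric. All offerings below are hypothetical.

HOURS_PER_MONTH = 720  # the normalization interval used in the study

def monthly_price(hourly_price: float) -> float:
    """Convert an hourly rate (USD) to a normalized monthly price."""
    return hourly_price * HOURS_PER_MONTH

def price_performance(benchmark_score: float, hourly_price: float) -> float:
    """Benchmark units delivered per dollar of monthly spend (higher is better)."""
    return benchmark_score / monthly_price(hourly_price)

offerings = [
    # (label, benchmark score, observed hourly price in USD)
    ("vendorA-small-us",   410.0, 0.09),
    ("vendorA-small-eu",   265.0, 0.11),
    ("vendorB-small-asia", 520.0, 0.07),
]

for label, score, price in offerings:
    print(f"{label}: ${monthly_price(price):7.2f}/mo, "
          f"{price_performance(score, price):5.2f} score per dollar")
```

Comparing the score-per-dollar column across vendors and regions is exactly the kind of side-by-side view the study’s graphs provide.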
Analysis of the final output showed a 622 percent variation in performance within the same instance type and a price-performance variance of 1000 percent. Performance of the exact same virtual machine instance can also vary by as much as 60 percent over time. The best-performing instance did not show the best price-performance, and the availability and behavior of instances were highly dependent on location, even when the instances were provisioned by the same CSP. Dave Hansen, Vice President and General Manager of sales, marketing and services for Dell Software, summed up the importance of these results:


“…[This] report is incredibly valuable. I’ve looked at this problem many times over the years and it is very difficult to make buying decisions on cloud services without this context.”
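For readers who want to compute this kind of spread figure from their own benchmark data, the sketch below assumes variation is expressed as (max − min) / min × 100; the sample scores are invented, and the paper may define its metric differently.

```python
# Hypothetical sketch: expressing spread as a percentage,
# variation = (max - min) / min * 100. Sample scores are invented;
# the published 622% and 60% figures come from the study's own data.

def percent_variation(samples: list[float]) -> float:
    lo, hi = min(samples), max(samples)
    return (hi - lo) / lo * 100.0

# Benchmark scores for the "same" instance type bought at different
# times and locations:
same_type_scores = [120.0, 310.0, 455.0, 866.4]
print(f"{percent_variation(same_type_scores):.0f}% variation")  # 622% variation
```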

These results also show that today’s enterprise desperately needs active metering and monitoring when procuring cloud-based services. Changes in instance types, pricing, performance over time, and availability of services by location highlight the inadequacy of traditional benchmarking philosophies and processes. Another hidden gem in the report is the use of “performance quotas” by some service providers: when a customer hits this CSP-managed quota, the performance of the relevant instance is reduced. Because a throttled instance completes the same work more slowly, the same job occupies, and bills for, more hours; in other words, exceeding the limit will drive up your usage bill. These findings drive home the need for enterprises to ramp up their due diligence when selecting CSPs, and to investigate the use of third-party brokers and automated solution-design tools when developing their cloud migration strategies.
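A back-of-the-envelope sketch makes the quota effect plain; the hourly rate, job size, and throttle factor below are assumptions for illustration only.

```python
# Hypothetical sketch: cost of a fixed-size job before and after a
# performance quota kicks in. All numbers are illustrative assumptions.

hourly_rate = 0.50           # USD per instance-hour
job_hours_full_speed = 10.0  # hours to finish at unthrottled performance
throttle_factor = 0.40       # instance runs at 40% speed once quota is hit

cost_full = job_hours_full_speed * hourly_rate
cost_throttled = (job_hours_full_speed / throttle_factor) * hourly_rate

print(f"unthrottled: ${cost_full:.2f}, throttled: ${cost_throttled:.2f}")
# unthrottled: $5.00, throttled: $12.50 -- same job, 2.5x the bill
```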
As the use of cloud computing advances, consumers must take active steps toward being more sophisticated, automated and dynamic in their use of cloud service providers. At a minimum, these steps should include:
  • The use of computer-aided design (CAD) tools when conducting due diligence on cloud service providers;
  • Use of organic or independent third parties to meter, monitor and report on the performance of cloud-based resources (a minimal sketch of such a probe follows this list);
  • Clear understanding of the use and associated limits of “performance quotas”; and
  • The identification of one or more alternative sources for the provisioning of all cloud-based resources.
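As a starting point for the metering and monitoring item above, here is a minimal sketch of an independent probe, assuming a toy CPU benchmark and a simple moving-baseline rule for flagging sustained performance drops, one possible signature of quota throttling. It is a sketch under those assumptions, not a production monitoring tool.

```python
import statistics
import time

def run_benchmark() -> float:
    """Toy CPU probe: time a fixed amount of work, return ops/second.
    Replace with a workload representative of your own application."""
    n = 2_000_000
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    return n / (time.perf_counter() - start)

def monitor(interval_s: float = 3600.0, window: int = 24,
            drop_threshold: float = 0.6) -> None:
    """Benchmark periodically; alert when the latest score falls below
    drop_threshold times the median of the trailing window."""
    history: list[float] = []
    while True:  # run until interrupted
        score = run_benchmark()
        if len(history) >= window:
            baseline = statistics.median(history[-window:])
            if score < drop_threshold * baseline:
                print(f"ALERT: {score:,.0f} ops/s is only "
                      f"{score / baseline:.0%} of baseline {baseline:,.0f}")
        history.append(score)
        time.sleep(interval_s)

# Example: probe every hour and compare against the last 24 measurements.
# monitor(interval_s=3600.0, window=24)
```

Feeding the recorded scores back into a price-performance calculation like the one sketched earlier turns raw monitoring data into procurement evidence.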
This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. For more on these topics, visit Dell’s thought leadership site Power More. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies.

Cloud Musings

(Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2015)

Follow me at https://Twitter.com/Kevin_Jackson