Cloud Computing Price-Performance Could Vary By 1000%!


Yes, you read that right. The price-performance of your cloud computing infrastructure could vary by as much as 1,000 percent depending on time and location. This level of variability has actually been observed within a single cloud service provider (CSP) processing the exact same job, which means the cost to you of running that job in the cloud could vary just as widely.

This surprising result was discovered by a Rice University group, headed by Dr. T. S. Eugene Ng, that has been focusing on cloud computing. The group recently published its joint work with Purdue University, "Application-Specific Configuration Selection in the Cloud: Impact of Provider Policy and Potential of Systematic Testing," in the IEEE INFOCOM 2015 conference proceedings. That paper took a first step toward understanding the impact of cloud service provider policy and tackling the complexity of selecting configurations that can best meet the price and performance requirements of applications. The work also led to a collaboration between Rice University and Burstorm, a developer of computer-aided design (CAD) software built specifically to support cloud computing architects.
The Burstorm platform contains a product catalog of over 36,000 products across 900 CSP product sets. Working with Dr. Ng's group, the study looked at seven suppliers across three continents (Asia, North America and Europe), covering a total of 266 compute products spread over three locations per vendor, where available. Raw data was collected every day for 15 days. The results were then normalized to reflect a 720-hour monthly pricing model. The final output was a set of price-performance graphs used to examine performance and price variance both across CSPs and across geographic regions.
Analysis of the final output showed a 622 percent variation in performance within the same instance type and a price-performance variance of 1,000 percent. Performance of the exact same virtual machine instance can also vary by as much as 60 percent over time. The best performing instance, moreover, did not show the best price-performance. Availability and behavior of instances were also highly dependent on location, even when the instance was provisioned by the same CSP. Dave Hansen, Vice President and General Manager of sales, marketing and services for Dell Software, sums up the importance of these results, saying:
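To make the normalization and variance math concrete, here is a minimal sketch in Python. The instance names, benchmark scores, and hourly rates below are invented for illustration; they are not the study's data. The shape of the calculation follows the article: scale hourly prices to a 720-hour month, divide a benchmark score by the monthly price to get a price-performance metric, then express the spread between the best and worst results as a percentage.

```python
# Hypothetical data: (benchmark score, hourly price in USD) per instance.
# These numbers are made up for illustration only.
HOURS_PER_MONTH = 720  # the 720-hour monthly pricing model

observations = {
    "provider-a.small": (410.0, 0.045),
    "provider-b.small": (655.0, 0.052),
    "provider-c.small": (380.0, 0.030),
}

def monthly_price(hourly_rate: float) -> float:
    """Normalize an hourly rate to a 720-hour month."""
    return hourly_rate * HOURS_PER_MONTH

def price_performance(score: float, hourly_rate: float) -> float:
    """Benchmark units delivered per dollar per month (higher is better)."""
    return score / monthly_price(hourly_rate)

pp = {name: price_performance(score, rate)
      for name, (score, rate) in observations.items()}

# Spread between best and worst price-performance, as a percentage.
spread = (max(pp.values()) / min(pp.values()) - 1) * 100
print(f"price-performance spread: {spread:.0f}%")
```

Even these three invented instances show a meaningful spread; the study's real data, sampled daily across seven providers and multiple regions, is where the 1,000 percent figure emerges.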


“…[This] report is incredibly valuable. I’ve looked at this problem many times over the years and it is very difficult to make buying decisions on cloud services without this context.”

These results also show that today's enterprise desperately needs active metering and monitoring when procuring cloud-based services. Changes in instance types, pricing, performance over time, and availability of services by location highlight the inadequacy of traditional benchmarking philosophies and processes. Another hidden gem in this report is the use of "performance quotas" by some service providers. When a customer's usage reaches this CSP-managed quota, the performance of the relevant instance is reduced. Since a throttled instance takes longer to complete the same job, exceeding the limit can drive up your usage bill. These findings also drive home the need for enterprises to ramp up their due diligence when selecting CSPs. They should also investigate the use of third-party brokers and automated solution design tools when developing their cloud migration strategy.
As the use of cloud computing advances, consumers must take active steps toward being more sophisticated, automated and dynamic in their use of cloud service providers. At a minimum, these steps should include:
  • The use of computer-aided design (CAD) tools when conducting due diligence on cloud service providers;
  • Use of organic or independent third parties to meter, monitor and report on the performance of cloud-based resources;
  • Clear understanding of the use and associated limits of “performance quotas”; and
  • The identification of one or more alternative sources for the provisioning of all cloud-based resources.
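The second step above, independent metering and monitoring, can be sketched very simply: run the same small benchmark on each provisioned instance at regular intervals and keep a history, so that performance drift like the 60 percent variation noted earlier becomes visible. The `run_benchmark` function below is a placeholder, not any vendor's API; in practice it would be replaced by a workload-representative test, and samples would be taken hourly or daily rather than back to back.

```python
import statistics
import time

def run_benchmark() -> float:
    """Placeholder benchmark: time a fixed CPU-bound task and
    return a score where higher means faster."""
    start = time.perf_counter()
    sum(i * i for i in range(200_000))
    return 1.0 / (time.perf_counter() - start)

# Collect repeated samples (in practice: one per hour or per day,
# per instance, per provider).
history = [run_benchmark() for _ in range(3)]

# Drift between the best and worst observed runs, as a percentage.
drift = (max(history) / min(history) - 1) * 100
print(f"observed performance drift: {drift:.1f}% "
      f"(mean score {statistics.mean(history):.1f})")
```

Feeding this kind of history into procurement decisions, rather than relying on a one-time benchmark, is precisely what the report's findings argue for.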
This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. For more on these topics, visit Dell’s thought leadership site Power More. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies.

Cloud Musings

( Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2015)

Follow me at https://Twitter.com/Kevin_Jackson