Wednesday, 13 July 2011 11:55

Swinburne models cloud computing costs


Researchers at Melbourne's Swinburne University of Technology have been examining the tradeoff between processing and storage costs in cloud environments.

One of the big attractions of cloud computing is that you only pay for what you use. The downside is that there's no upper limit, so it isn't difficult to end up with a much bigger bill than you expected.

Another feature of cloud computing is that providers typically charge separately for storage and processing. So researchers at Swinburne University of Technology have been exploring the management of raw and intermediate data.

Is it better to keep the raw data and recreate intermediate datasets as required, or should you keep both? "The trade-off is going to be between storage cost and computation cost," said John Grundy, who works in the University's Centre for Computing and Engineering Software Systems (SUCCESS). "Finding this balance is complex, and there are currently no decision-making tools to advise on whether to store or delete intermediate datasets, and if so, which ones."

Funded by the Australian Research Council, Prof Grundy, Yun Yang and Jinjun Chen (who is now with the University of Technology, Sydney) have developed a mathematical model that takes into account the size of the original dataset, the amount of intermediate data stored, and the rates charged by service providers.
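The core of such a model is a break-even comparison between the recurring cost of storing an intermediate dataset and the expected cost of regenerating it on demand. The sketch below is illustrative only: the rates, dataset sizes, and the `should_store` rule are hypothetical stand-ins, not the researchers' actual model.

```python
# Illustrative storage-vs-computation trade-off. All rates and sizes
# are made-up values, not Amazon's real prices.

STORAGE_RATE = 0.15   # $ per GB per month (hypothetical)
COMPUTE_RATE = 0.10   # $ per CPU-hour (hypothetical)

def monthly_storage_cost(size_gb: float) -> float:
    """Cost of keeping an intermediate dataset for one month."""
    return size_gb * STORAGE_RATE

def regeneration_cost(cpu_hours: float) -> float:
    """One-off cost of recomputing the dataset from its predecessor."""
    return cpu_hours * COMPUTE_RATE

def should_store(size_gb: float, cpu_hours: float,
                 uses_per_month: float) -> bool:
    """Store if expected regeneration spend exceeds the storage bill."""
    return regeneration_cost(cpu_hours) * uses_per_month > monthly_storage_cost(size_gb)

# A large dataset that is cheap to recompute: better to delete.
print(should_store(size_gb=500, cpu_hours=2, uses_per_month=1))   # False
# A small dataset that is expensive to recompute: better to keep.
print(should_store(size_gb=5, cpu_hours=40, uses_per_month=1))    # True
```

In practice the decision also depends on how often a dataset is reused, which is why the frequency term appears above.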

What adds to the complexity is that intermediate datasets are not necessarily generated directly from the original data, but from other intermediate results. So the team also developed an intermediate data dependency graph (IDG) to help users decide whether they are better off spending money on storage or computation for intermediate datasets.


"IDG records how each intermediate dataset is generated from the one before it and shows the generation relationship between them. This means if a deleted intermediate dataset needs to be regenerated, the IDG could find the nearest predecessor of the dataset. This can save computation cost, time and electricity consumption," Prof Grundy said.

Prof Yang pointed out that data sets can be huge. Astronomers may log as much as 1GB per second. The researchers produced six intermediate datasets from a particular astronomical dataset, and determined the costs of regenerating or storing them based on Amazon's published prices.

For one hour of observation data held for 30 days, the minimum-cost strategy came to $200; storing no intermediate data and regenerating on demand cost $1,000; and storing all intermediate data cost $390.

"We could delete the intermediate datasets that were large in size but with lower generation expenses, and save the ones that were costly to generate, even though small in size," Prof Yang said.

The researchers are working on models that will allow these decisions to be made on the fly.

The research is not only applicable to public cloud services such as Amazon. Such models also could be employed by users of internal IT services that are charged on a utility basis.


Stephen Withers


Stephen Withers is one of Australia's most experienced IT journalists, having begun his career in the days of 8-bit 'microcomputers'. He covers the gamut from gadgets to enterprise systems. In previous lives he has been an academic, a systems programmer, an IT support manager, and an online services manager. Stephen holds an honours degree in Management Sciences and a PhD in Industrial and Business Studies.
