IBM wasn't content with the traditional ribbon cutting, or even the button-pushing we've come to expect from hi-tech openings. Instead, each of the dignitaries present - including Victorian Premier Ted Baillieu, Federal Innovation Minister Senator Kim Carr, and Victorian Minister for Technology Gordon Rich-Phillips - turned a key that illuminated a segment of the company's logo.
"R&D plays a vital role in IBM's strategy for growth," said John Kelly III, IBM senior vice president and director of IBM Research. "The lab will work with Australia's top scientists and engineers from academia, government and industry to extend IBM's global R&D footprint and increase its impact on our clients and make the world work better. We look forward to working with the Australian technical and scientific community on some of the most pressing problems and greatest opportunities of our time."
"Today's opening and this investment by IBM proves once again that Melbourne is a vital regional hub for global innovation and technology R&D," said Mr Baillieu. "This confirms our place in the world as a first-class destination for investment and revolutionary research and a great location for rewarding collaboration between Government, industry and the research community. We want Victoria to grow well and IBM's new R&D laboratory is part of a new technologically-driven age which will help deliver increased productivity, the key to sustainable ongoing economic prosperity."
Senator Carr noted that the NBN rollout was one of the factors that attracted IBM to Australia, and said "The lab sets a new model for public-private collaborative research which is central to Australia's innovation agenda."
What areas of research will be the focus of the new facility? See page 2.
The lab will focus on three key areas: natural resource management, natural disaster management, and healthcare and life sciences.
University of Melbourne vice-chancellor Professor Glyn Davis said "In recent years we have experienced a range of natural disasters, throughout the world, from floods and bushfires in our own backyard, to cyclones, tsunamis, severe storms, typhoons, earthquakes and landslides."
"No single organisation can tackle the impact these disasters have on the world," he added. "Instead we need to combine our technologies, our resources and capabilities and bring together leading minds from around the world to research solutions that reduce the risk and impact of these devastating events.
"Universities, industry, government and research institutes need to work together to find solutions to these complex global challenges."
iTWire spoke to Matthias Reumann, an IBM systems biology and computational medicine researcher with the Victorian Life Sciences Computational Initiative (VLSCI).
Dr Reumann's research might save your life, so please read on.
While research into heart disease has reduced deaths, the ageing population means it remains the leading cause of death. Cardiac modelling allows the exploration of mechanisms that cannot be studied in vivo (in patients) or in vitro (in cells in a laboratory).
In 2008, a 768-processor model was estimated to require around two weeks of number-crunching to simulate a single heartbeat. At that rate, a one-minute simulation, covering dozens of heartbeats, would have taken years.
But in Melbourne, Dr Reumann has access to an IBM Blue Gene supercomputer with 2048 four-core processors and 4GB of RAM per chip, which can simulate a heartbeat within minutes. This makes it feasible to simulate hours of heart action.
In collaboration with other researchers, he has submitted a scientific paper showing for the first time a correlation between in silico results (those generated by a computer simulation) and observations of a large group of patients (more than 300).
He is discussing possible research in collaboration with cardiologists at the Royal Children's Hospital with the aim of improving the quality of life of children with abnormal heart anatomy. This would involve expanding current simulations of the electrical activity in the heart to include its physical movement.
Dr Reumann is also engaged in another line of research that has the potential to make a major impact on our health - see page 4.
Genetic causes of cancer generally involve the interaction of two or more genes, and processing the massive data sets involved is "computationally challenging," he said. Highly distributed computer systems such as Blue Gene are not widely used in this area of research, so he expects the laboratory will be able to make a contribution.
"The algorithms are very sophisticated," Dr Reumann said, but he noted that simpler algorithms are sometimes more parallelisable and therefore give faster results when a massively parallel supercomputer like Blue Gene is used.
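Dr Reumann's point can be illustrated with Amdahl's law, which relates a program's serial (non-parallelisable) fraction to its speedup on many cores. The sketch below is not from the article and all of its numbers are hypothetical: it simply shows how a "simple" algorithm that does twice the work, but with almost no serial portion, can still finish sooner than a "sophisticated" one on thousands of cores.

```python
# Hypothetical illustration of Amdahl's law: why a simpler but more
# parallelisable algorithm can beat a cleverer one on a machine like
# Blue Gene. All workload figures below are made up for illustration.

def run_time(total_work, serial_fraction, cores):
    """Wall-clock time under Amdahl's law: the serial part runs on one
    core; the remainder is divided perfectly across all cores."""
    serial = total_work * serial_fraction
    parallel = total_work * (1 - serial_fraction)
    return serial + parallel / cores

# "Sophisticated": half the total work, but 5% of it is serial.
# "Simple": twice the work, but only 0.1% of it is serial.
for cores in (1, 64, 8192):  # 8192 ~ cores in 2048 quad-core chips
    clever = run_time(total_work=1.0, serial_fraction=0.05, cores=cores)
    simple = run_time(total_work=2.0, serial_fraction=0.001, cores=cores)
    winner = "simple" if simple < clever else "sophisticated"
    print(f"{cores:>5} cores: sophisticated={clever:.4f} "
          f"simple={simple:.4f} -> {winner} wins")
```

On a single core the sophisticated algorithm wins easily, but by 64 cores the simple one has overtaken it, and at Blue Gene scale the gap is dramatic, because the sophisticated algorithm's serial 5% puts a hard floor under its run time no matter how many cores are added.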
Dr Reumann is already working with NICTA and the University of Melbourne in this area, but considerable effort is going into the validation of the results.