According to the company, Magnum IO delivers up to 20x faster data processing for multi-server, multi-GPU computing nodes working with massive datasets in complex financial analysis, climate modelling and other HPC workloads.
"Processing large amounts of collected or simulated data is at the heart of data-driven sciences like AI," said Nvidia founder and CEO Jensen Huang.
"As the scale and velocity of data grow exponentially, processing it has become one of data centres' great challenges and costs.
"Extreme compute needs extreme I/O. Magnum IO delivers this by bringing Nvidia GPU acceleration, which has revolutionised computing, to I/O and storage. Now, AI researchers and data scientists can stop waiting on data and focus on doing their life's work," he said.
The software was developed in collaboration with companies including DataDirect Networks, Excelero, IBM, Mellanox and WekaIO.
IBM vice president of storage offering management Sam Werner said: "The amount of data that leading HPC and AI researchers now need to access continues to grow by leaps and bounds, making I/O a complex challenge for many to manage. IBM Spectrum Scale is designed to address the needs of any organisation looking to accelerate AI and run data-intensive workloads. The use of IBM Spectrum Scale and Nvidia GPU acceleration can help customers alleviate I/O bottlenecks and get the insights needed from their data faster."
A key part of Magnum IO is GPUDirect, which provides a path for data to bypass CPUs and travel between GPUs, storage and networking devices.
Its newest element is GPUDirect Storage, which enables researchers to bypass CPUs when accessing storage and quickly access data files for simulation, analysis or visualisation.
Nvidia Magnum IO software is available now, with the exception of GPUDirect Storage, which is currently limited to select early-access customers and is expected to become more widely available in the first half of 2020.