DDN Works with NVIDIA to Ease Deployment and Set New Boundaries in Performance for Data-Intensive AI and HPC
Collaboration Makes Deploying AI and HPC Infrastructure Easier and Increases Speed of Data Science
SANTA CLARA, Calif., Nov. 19, 2019 – DDN®, the global leader in artificial intelligence (AI) and multi-cloud data management, today announced that it has worked with NVIDIA to combine the power of NVIDIA DGX SuperPOD systems with DDN’s A3I data management system so customers can deploy HPC infrastructure with minimal complexity and reduced timelines. Additionally, by leveraging the new NVIDIA Magnum IO software stack to optimize IO and DDN’s parallel file system, customers can speed up data science workflows by up to 20 times.
The NVIDIA Magnum IO software suite increases performance and reduces latency, moving massive amounts of data from storage to GPUs in minutes rather than hours. DDN testing has confirmed that, using the software’s NVIDIA GPUDirect Storage feature, the most intensive workflows see significant improvements that directly benefit AI and HPC application output. DDN expects to support the full NVIDIA Magnum IO suite, including GPUDirect Storage, in an EXAScaler EXA5 release in mid-2020.
“We value the deep engineering interlock between NVIDIA and DDN because of the direct benefits to our mutual customers,” said Sven Oehme, chief research officer at DDN. “Our companies share the desire to push the boundaries of I/O performance while simultaneously making deployment of these very large systems much easier.”
During testing with DGX SuperPOD, which itself is designed to deploy supercomputing-class compute very quickly, DDN demonstrated that its data management appliance, the DDN AI400, could be deployed within hours, and that a single appliance could support the data-hungry DGX SuperPOD as it scaled to 80 GPU nodes. Benchmarks across a variety of deep learning models with I/O requirements representative of deep learning workloads showed that the DDN system could keep the DGX SuperPOD system fully saturated.
“DGX SuperPOD was built to deliver the world’s fastest performance on the most complex AI workloads,” said Charlie Boyle, vice president and general manager of DGX Systems at NVIDIA. “With DDN and NVIDIA, customers now have a systemized solution that any organization can deploy in weeks.”
While the testing described above with DGX SuperPOD was performed with DDN’s AI400, DDN has since announced the AI400X. The appliance has been updated to provide better IOPS and throughput and will ship with Mellanox HDR100 InfiniBand connections to support next-generation HDR fabrics. With these enhancements, the AI400X appliance could provide even better performance for AI and HPC applications.
DataDirect Networks (DDN) is the world’s leading big data storage supplier to data-intensive, global organizations. For more than 20 years, DDN has designed, developed, deployed and optimized systems, software and storage solutions that enable enterprises, service providers, universities and government agencies to generate more value and accelerate time to insight from their data and information, on premises and in the cloud. Organizations leverage the power of DDN storage technology and the deep technical expertise of its team to capture, store, process, analyze, collaborate and distribute data, information and content at the largest scale in the most efficient, reliable and cost-effective manner. DDN customers include many of the world’s leading financial services firms and banks, healthcare and life science organizations, manufacturing and energy companies, government and research facilities, and web and cloud service providers.
©2019 All rights reserved. DDN, EXAScaler, GRIDScaler and SFA are trademarks owned by DataDirect Networks. All other trademarks are the property of their respective owners.