COMPUTATION CENTER ACCELERATES TIME TO LINK VIRUSES & GASTROINTESTINAL CANCERS

UNIVERSITY OF MIAMI

“Where DDN really stood out is in the ability to adapt to whatever we would need. We have both IOPS-centric storage and the deep, slower I/O pool at full bandwidth. No one else could do that.”

Joel P. Zysman

Director of High Performance Computing

Center for Computational Science at the University of Miami

The University of Miami maintains one of the largest centralized academic cyberinfrastructures in the US, which is integral to addressing and solving major scientific challenges. At its Center for Computational Science (CCS), more than 2,000 researchers, faculty, staff and students across multiple disciplines collaborate on diverse and interdisciplinary projects requiring HPC resources.

With 50% of the center’s users coming from the University of Miami’s Miller School of Medicine and ongoing projects at the Hussman Institute for Human Genomics, the explosion of next-generation sequencing has had a major impact on compute and storage demands. At CCS, the heavy I/O required to create four billion reads from one genome in a couple of days only intensifies when the data from those reads needs to be managed and analyzed.
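
As a rough illustration only, the back-of-envelope estimate below shows why a single genome at this read count already strains storage; the 150-base read length and two-bytes-per-base FASTQ cost are assumptions for the sketch, not figures reported by CCS.

```python
# Illustrative estimate only: read length and FASTQ overhead are assumptions,
# not numbers reported by CCS.
reads = 4_000_000_000      # ~four billion reads from one genome (from the case study)
read_length_bp = 150       # assumed short-read length, in bases
bytes_per_base = 2         # assumed FASTQ cost: ~1 byte base call + ~1 byte quality score

raw_bytes = reads * read_length_bp * bytes_per_base
print(f"~{raw_bytes / 1e12:.1f} TB of raw FASTQ per genome")   # ~1.2 TB

# Streaming that once over "a couple of days" is a modest per-genome rate, but
# repeated alignment and analysis passes, plus many concurrent projects,
# multiply the sustained I/O the storage system must absorb.
seconds = 2 * 24 * 3600
print(f"~{raw_bytes / seconds / 1e6:.0f} MB/s to write it once in two days")   # ~7 MB/s
```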

Aside from providing sufficient storage power to meet both high-I/O and interactive processing demands, CCS needed a powerful file system that was flexible enough to handle very large parallel jobs as well as smaller, shorter serial jobs. CCS also needed to absorb storage spikes of as much as 10X, so it was critical to scale to petabytes of machine-generated data without adding a layer of complexity or creating inefficiencies.

Read their success story to learn how high-performance DDN® Storage I/O has helped the University of Miami:

  • Establish links between certain viruses and gastrointestinal cancers through computation that was not possible before
  • Reduce genomics compute and analysis time from 72 to 17 hours

CHALLENGES

  • Diverse, interdisciplinary research projects required massive compute and storage power as well as integrated data lifecycle movement and management
  • Highly demanding I/O and heavy interactivity requirements from next-gen sequencing intensified data generation, analysis and management
  • The file system had to handle both very large parallel jobs and smaller, shorter serial jobs
  • Data surges during analysis created “data-in-flight” challenges

SOLUTION

An end-to-end, high performance DDN GRIDScaler® solution featuring a GS12K™ scale-out appliance with an embedded IBM® GPFS™ parallel file system


BUSINESS BENEFITS

  • The ability to meet varied research workflow demands enables CCS to accelerate data analysis and speed scientific discoveries
  • Best-in-class performance for genomics assembly, alignment and mapping has proven invaluable in supporting major medical research into Alzheimer’s, Parkinson’s and gastrointestinal cancers
  • High-performance storage and transparent data movement let CCS scale storage without adding complexity

TECHNICAL BENEFITS

  • Centralized storage with an embedded file system makes it easy to add storage where needed—in the high-performance, high-transaction or slower storage pools—and then manage it all through a single pane of glass
  • DDN’s transparent data movement enables using one platform for data capture, download, analysis and retention (a conceptual sketch of this kind of policy-driven tiering follows this list)
  • The ability to maintain an active archive lets the center accommodate different types of analytics with varied I/O needs
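
The pool and data-movement bullets above describe policy-driven tiering that GPFS (IBM Spectrum Scale) performs inside the file system. As a conceptual, language-neutral sketch of that idea only (not DDN’s or GPFS’s actual mechanism), the snippet below assigns files to a hypothetical “fast” or “capacity” pool by last-access age; the pool names and the 30-day threshold are assumptions for illustration.

```python
import os
import time
from pathlib import Path

# Conceptual sketch only: in GPFS/Spectrum Scale, placement and migration rules
# are ILM policies evaluated by the file system itself, not user scripts.
FAST_POOL = "fast"          # hypothetical IOPS-centric pool (e.g. a flash tier)
CAPACITY_POOL = "capacity"  # hypothetical deep, slower bandwidth pool
AGE_THRESHOLD_DAYS = 30     # assumed cutoff; a real policy would be tuned per workflow

def target_pool(path: Path, now: float | None = None) -> str:
    """Pick the pool a file would live in under this illustrative policy:
    recently accessed files stay on the fast tier, cold files move to capacity."""
    now = time.time() if now is None else now
    age_days = (now - path.stat().st_atime) / 86400
    return FAST_POOL if age_days < AGE_THRESHOLD_DAYS else CAPACITY_POOL

def scan(root: str) -> dict[str, int]:
    """Walk a directory tree and count how many files each pool would hold."""
    counts = {FAST_POOL: 0, CAPACITY_POOL: 0}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            counts[target_pool(Path(dirpath) / name)] += 1
    return counts

if __name__ == "__main__":
    print(scan("."))
```

In a GPFS deployment, rules like this are expressed as ILM policies and evaluated by the file system itself, so files can migrate between the IOPS-centric and capacity pools without their paths changing.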

Sensors, AI, and Storage: Accelerating Smart City Development

In this presentation from the DDN User Group at SC17, Joel Zysman from the University of Miami Center for Computational Science discusses smart city development.
