Webinar Replay: Hadoop for HPC

Learn how to reduce Hadoop® storage integration cost and complexity

It's no secret that Hadoop is not an entirely plug-and-play ecosystem. With data being wrangled at so many stages to gain critical insights, one of the main challenges is storing that data on an easy-to-use, cost-efficient storage layer that supports a successful Big Data strategy.

Join Abdulrahman Alkhamees, technical marketing manager for emerging markets at DDN, and Douglas O’Flaherty of IBM, two of the industry’s foremost Hadoop specialists, as they share best practices on how you can successfully deploy an optimized and ultra-efficient Hadoop infrastructure. In this webinar, they discuss:

  • Why Hadoop for HPC?
  • How Spectrum Scale adds value to the Hadoop ecosystem
  • Spectrum Scale vs. other shared storage systems
  • Hadoop for HPC use cases

Presenters

Abdulrahman Alkhamees, DDN

Douglas O’Flaherty, IBM

Hosted By:

DDN Storage and IBM