DDN BLOG

Hi, I’m Jean-Luc Chatelain, executive vice president of Strategy and Technology for DDN. My passion and life’s work is creating new and better technologies for Big Data and analytics. I am launching this blog to share my thoughts and observations about accessing, processing, storing and distributing Big Data; you can also follow me on Twitter at @informationcto. I hope to get a dialogue going, and to hear from others who share my enthusiasm for this space.

I’d like to start things off with a mention of today’s news: DDN launched hScaler, the first enterprise-class Hadoop appliance, and I could wax poetic that it’s a huge leap of faith or that our incredible team developed the idea for it in a raucous, all-night brainstorming session. That story would be consistent with our history of industry-shaking innovation, and a fun one to tell, but it would not be reality.

The truth is that hScaler is simply a case of listening to our customers’ needs and experiences, and of the market being ready for what our company does best and has done since 1998: store, protect and process Big Data.

Hadoop is a data analytics architecture born out of the really, really Big Data problems faced by the likes of Google and Yahoo. It quickly drew the attention of enterprises everywhere for its enormous potential to provide answers more quickly and from more data than ever before. Even more important, Hadoop gives enterprises the freedom to choose and change the questions they want to ask of their data. As the amount of data collected by enterprises everywhere explodes, Hadoop offers the potential to provide incredible value extraction from that data — and to do so at the speed of business.
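To make that “questions of your data” point concrete, here is the canonical word-count job written against Hadoop’s standard org.apache.hadoop.mapreduce API. It is a generic illustration of the programming model, not anything specific to hScaler: change the mapper and reducer logic, and the same cluster answers a different question over the same data.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in every input line.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the per-word counts emitted by the mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged as a jar, a job like this is typically launched with something along the lines of hadoop jar wordcount.jar WordCount /input /output, where the two paths are HDFS directories.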

Let’s face it, though: many enterprises have encountered significant issues implementing Hadoop, and most efforts to date, outside of some early adopters or corner cases, have been science projects. Rollout times have been much longer than expected (cf. The Wall Street Journal). Finding enough staff to keep Hadoop’s entire hardware and software infrastructure running and stable has pushed the total cost of ownership (TCO) to prohibitive levels, and for some users it has stretched the time to results to frustrating lengths. Hadoop is a truly novel system for data analytics, which is both its blessing (the potential for amazing results) and its curse (high TCO and scarce expertise).

Enter DDN! As I mentioned, we were in the Big Data business before the term “Big Data” was all the buzz (and hype), and our client list ranges from the majority of the world’s largest high performance computing (HPC) environments to the intelligence community to major web and cloud providers. Our DNA is one of performance, scalability, reliability, low-latency I/O processing and data protection. As Big Data is “democratized” and moves into the classic enterprise, more companies than ever before are wrestling with it, and we are growing by leaps and bounds.

If there’s anyone equipped to help enterprises take full advantage of Hadoop, it’s us. We took the best of Hadoop architecture and the best of our technology and designed something that an IT department can get up and running in 8 hours or less — with HPC-quality performance to boot. It’s all about minimizing the time to first actionable results.

We leveraged our existing, single-pane-of-glass Big Data management console to simplify management and monitoring of the Hadoop platform. We included our RAID 6 data protection capabilities, immediately reducing the HDFS footprint by about 60 percent. We added performance features such as RDMA I/O to the Map/Reduce infrastructure to accelerate the process of extracting value from the information. And finally, because we offer very large capacity (multiple petabytes in two racks), customers can scale very efficiently with our solution; just as importantly, they can independently add storage or compute capabilities to hScaler without having to buy anything they don’t need.
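For a sense of where that footprint reduction comes from: stock HDFS protects data by keeping three full copies of every block (the default dfs.replication of 3), whereas RAID 6 protects a stripe with two parity drives. The sketch below, using the standard Hadoop FileSystem API, shows how a client could lower the replication factor when the underlying array already provides protection; the replication factor of 1 and the file path are illustrative assumptions, not a description of how hScaler itself is configured.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();

    // Assumption for illustration only: the underlying storage (for example, a
    // RAID 6 protected array) already guards against drive failures, so
    // HDFS-level replication can be reduced from the default of 3.
    conf.setInt("dfs.replication", 1);

    FileSystem fs = FileSystem.get(conf);

    // Hypothetical path; new files written under this configuration use the
    // lower replication factor, and existing files can be adjusted explicitly.
    Path existing = new Path("/data/events/2013-02-19.log");
    fs.setReplication(existing, (short) 1);

    fs.close();
  }
}
```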

In short, we’ve leveraged our core strengths to remove the obstacles to Hadoop nirvana: we’ve made it easy to implement and maintain, cut down the TCO, and enabled number crunching at breakneck speeds.

So as much fun as it would be to tell a more colorful story about today’s announcement, the fact is that if you know DDN’s background and are aware of the complexities of Hadoop, it probably wasn’t hard to see hScaler coming.

Is your business considering Hadoop? What are you doing with the technology? Let us know what you think in the comments section below.
