There’s a lot of hype around autonomous vehicles these days. Powered by artificial intelligence (AI), autonomous vehicles can navigate our cities unassisted; however, these vehicles also run some of the toughest AI workloads at unprecedented scale. They require the ingest, handling, and delivery of a broad range of dataset types and sizes, generated from many different sources such as video cameras, lidar, radar, and other sensors. Very large datasets captured over millions of miles undergo many cycles of processing, labeling, subsampling, and categorization, requiring immense I/O, data storage, and computational resources.
We are helping make AI-powered innovation easy, with faster performance, effortless scale, and simplified operations through our reliable data storage framework that can scale to terabytes per second of throughput and hundreds of petabytes of capacity. In fact, we worked with a world-leading peer-to-peer ridesharing company to create, deploy, and optimize a global AI-enabled IT infrastructure for its autonomous vehicle program.
For this customer, a massive dataset for training neural networks was developed: data was collected from experimental vehicles and ridesharing engagements, and an extensive, complex deep learning (DL) framework was trained, tested, and refined for the autonomous driving capability. The resulting software was loaded onto experimental vehicles for evaluation in the field, and operational data from those rides fed back into the loop to further enhance the DL process.
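The collect–train–deploy cycle described above can be sketched as a simple feedback loop. This is a minimal, hypothetical illustration of the data flywheel; the function names and fleet contents are illustrative assumptions, not DDN or customer APIs.

```python
# Hypothetical sketch of the autonomous-driving data flywheel described above.
# All names (collect_fleet_data, train_model, deploy) are illustrative only.

def collect_fleet_data(fleet):
    """Gather sensor logs (camera, lidar, radar) from each vehicle in the fleet."""
    return [f"{vehicle}-sensor-log" for vehicle in fleet]

def train_model(model_version, dataset):
    """Train and refine the DL model on the accumulated dataset; bump the version."""
    return model_version + 1

def deploy(model_version, fleet):
    """Load the refined model onto the experimental vehicles for field evaluation."""
    return {vehicle: model_version for vehicle in fleet}

fleet = ["car-01", "car-02", "car-03"]
dataset, model_version = [], 0

# Each iteration mirrors one cycle: collect -> train -> deploy -> collect again.
for _ in range(3):
    dataset.extend(collect_fleet_data(fleet))
    model_version = train_model(model_version, dataset)
    deployed = deploy(model_version, fleet)

print(model_version)  # 3 refinement cycles completed
print(len(dataset))   # 9 sensor logs accumulated across the cycles
```

The key point the loop captures is that operational data from deployed vehicles grows the training dataset on every pass, which is what drives the storage requirements discussed next.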
The customer’s requirements called for the creation of a very large-scale, parallelized data storage system to feed an extremely large-scale, GPU-based computing platform. The storage solution had to ingest, store, and deliver massive amounts of data rapidly and reliably, scaling linearly to extreme levels of performance and capacity. With initial increments set at nearly one hundred petabytes of capacity, the highest datacenter density and efficiency, with low management and support overhead, were additional must-haves.
We designed an easy-to-scale, pod-based architecture, with each pod providing an optimal amount of storage capacity and performance, matched to the customer’s ingest and training needs. These pods were then deployed in the customer’s datacenters and successfully scaled in capacity and performance as those needs expanded, all within a single namespace. We also provided extensive expert design engineering, deployment, optimization, and support services. The DDN system has been in continuous operation, with no unplanned downtime and no data loss, since the initial installation.
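The sizing logic behind a pod-based deployment can be sketched with simple arithmetic: if each pod contributes a fixed slice of capacity and throughput, and both scale linearly within the namespace, the pod count is driven by whichever target is harder to meet. The per-pod figures below are illustrative assumptions, not published DDN specifications.

```python
import math

# Hypothetical per-pod figures, for illustration only (not DDN specs).
POD_CAPACITY_PB = 10.0      # assumed usable capacity per pod, in petabytes
POD_THROUGHPUT_TBPS = 0.1   # assumed sustained throughput per pod, in TB/s

def pods_required(capacity_pb, throughput_tbps):
    """Pods needed to satisfy both the capacity and throughput targets,
    assuming both dimensions scale linearly as pods join the namespace."""
    by_capacity = math.ceil(capacity_pb / POD_CAPACITY_PB)
    by_throughput = math.ceil(throughput_tbps / POD_THROUGHPUT_TBPS)
    return max(by_capacity, by_throughput)

# Under these assumptions, a 100 PB, 1 TB/s target needs 10 pods;
# doubling the throughput target to 2 TB/s makes throughput the
# binding constraint and doubles the pod count.
print(pods_required(100, 1.0))  # 10
print(pods_required(100, 2.0))  # 20
```

Taking the maximum of the two quotients is what keeps scaling linear in practice: adding a pod grows capacity and throughput together, so neither dimension becomes a bottleneck as the deployment expands.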
Relying on DDN storage, the customer’s program has been underway for several years. Today, fleets of cars operate autonomously in a number of cities, continuously gathering positional and contextual information from a wide set of sensors and feeding it back to the central storage system. The DDN storage platform effortlessly handles the concurrent ingest of these massive data streams, organizing and structuring the underlying datasets.
Millions of GPU cores continuously access the DDN storage system, executing extensive and complex training processes, continuously refining the self-driving capabilities of the fleet of vehicles. DDN storage has enabled this customer to harness data at immense scale, successfully and reliably building an advanced AI framework that is revolutionizing the transportation industry.
We are continuously engaged in developing technologies which enable breakthrough innovation and help our customers maximize the value of their data and easily and reliably accelerate time to insight. For more information on our solutions for AI and DL, visit our A³I solutions page.
Attending GTC19 in San Jose this week?
Stop by booth #1211 to learn more about the next generation of A³I reference architectures, which include NVIDIA’s DGX POD™, DGX-2™, and DDN’s AI400™ parallel storage appliance. We’ll also be demonstrating a live frictionless-retail experience in our booth with partner XXII, and we have a lot of other exciting news to share with you. We hope to see you at the conference!