Benefits of GPUDirect Storage

Accelerating the Data Path to the GPU for AI and Beyond


As workflows shift away from the CPU in GPU-centric systems, the data path from storage to GPUs increasingly becomes the bottleneck.

NVIDIA and its partners are relieving that bottleneck with a new technology, GPUDirect Storage, which introduces a new set of interfaces. When a partner's hardware is enabled with GPUDirect Storage, the direct memory access (DMA) engine in a NIC or local storage device can move data directly to and from GPU memory, rather than routing it through a bounce buffer in CPU system memory.

This can improve bandwidth, reduce latency, cut CPU-side memory-management overhead, and reduce interference with CPU utilization.
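Applications reach this direct path through NVIDIA's cuFile API (libcufile). The sketch below shows the typical sequence for reading a file straight into GPU memory; it assumes a CUDA-capable system with GPUDirect Storage installed, the file path is a placeholder, and error handling is abbreviated.

```c
/* Hedged sketch: a GPUDirect Storage read via the cuFile API.
 * Requires NVIDIA hardware with cuFile/GDS support to actually run;
 * "/data/input.bin" is a placeholder path. */
#include <fcntl.h>
#include <unistd.h>
#include <cuda_runtime.h>
#include <cufile.h>

int main(void) {
    const size_t size = 1 << 20;          /* 1 MiB */
    void *dev_buf = NULL;

    cuFileDriverOpen();                   /* initialize the cuFile driver */

    int fd = open("/data/input.bin", O_RDONLY | O_DIRECT);

    CUfileDescr_t descr = {0};
    descr.handle.fd = fd;
    descr.type = CU_FILE_HANDLE_TYPE_OPAQUE_FD;

    CUfileHandle_t handle;
    cuFileHandleRegister(&handle, &descr);

    cudaMalloc(&dev_buf, size);
    cuFileBufRegister(dev_buf, size, 0);  /* register GPU buffer for DMA */

    /* The DMA engine moves data from storage directly into GPU memory,
       bypassing the CPU bounce buffer. */
    cuFileRead(handle, dev_buf, size, 0 /* file offset */, 0 /* buf offset */);

    cuFileBufDeregister(dev_buf);
    cudaFree(dev_buf);
    cuFileHandleDeregister(handle);
    close(fd);
    cuFileDriverClose();
    return 0;
}
```

Registering the GPU buffer up front lets the driver pin it once and reuse the mapping across many I/O calls, which is where much of the CPU-side overhead savings comes from.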

In this presentation, discover the benefits of GPUDirect Storage with recent results from demos and proof points in AI, data analytics, and visualization. 

Vendor: Flash Memory Summit
Posted: Dec 17, 2020
Published: Dec 17, 2020
Format: PDF
Type: Presentation