AI: Pushing Infra Boundaries - Memory is a Key Factor


Author(s)/Presenter(s):

Manoj Wadekar

Meta

Library Content Type

Presentation

Library Release Date

Focus Areas

Abstract

In recent years, hyperscale data centers have been optimized for scale-out stateless applications and zettabyte-scale storage, with a focus on CPU-centric platforms. As the infrastructure shifts toward next-generation AI applications, however, the center of gravity is moving to GPUs and other accelerators. This transition from "millions of small stateless applications" to "large AI applications running across clusters of GPUs" is pushing the limits of accelerators, networking, memory, topologies, rack power, and other components. Keeping up with this dramatic change requires innovation to ensure that hyperscale data centers can continue to support the growing demands of AI applications. This keynote will explore the impact of this evolution on memory use cases and highlight the key areas where innovation is needed to enable the future of hyperscale data centers.