This resource is no longer available


Large language models (LLMs) are very large deep learning models capable of general-purpose language understanding and generation. But as useful as they may be, many organizations hesitate to adopt them, assuming that finding the right hardware to support LLMs is a daunting task.

Fortunately, AMD EPYC processors, designed and optimized for machine learning workloads, make this process a snap.

Discover in this short video how the AMD EPYC 9654 delivers industry-leading AI inferencing performance with some of today’s top LLM families.

Vendor: AMD
Premiered: Mar 16, 2024
