AMD Instinct MI300X

The AMD Instinct MI300X is an HPC GPU from AMD designed for the AI market.


Is a: Product

Product attributes

Industry: Graphics processing unit; Generative AI; Artificial Intelligence (AI)
Launch Date: June 13, 2023
Product Parent Company: Advanced Micro Devices
Competitors: NVIDIA H100 Tensor Core GPU; NVIDIA H200 Tensor Core GPU; Microsoft Azure Maia 100 AI Accelerator

The AMD Instinct MI300X is a high-performance computing (HPC) graphics processing unit (GPU) from AMD designed for the AI market. The MI300X is a pure GPU, using CDNA 3 GPU tiles paired with 192GB of HBM3 memory. The chip targets large language model (LLM) developers who require significant memory capacity to run cutting-edge models.

Part of the AMD Instinct family, the MI300X is a follow-on from the MI300A chip. Where the MI300A combines three Zen 4 CPU chiplets with multiple GPU chiplets, the MI300X replaces the CPUs with two additional CDNA 3 chiplets. This makes the MI300X design simpler than the MI300A, with 12 chiplets in total: eight GPU chiplets and four I/O memory chiplets. The MI300X increases the transistor count from 146 billion to 153 billion, and the shared DRAM memory grows from 128GB in the MI300A to 192GB. Memory bandwidth increases from 800 gigabytes per second to 5.2 terabytes per second.

AMD first announced its MI300 series of chips in June 2022, and further details were released at CES in January 2023. The MI300X chip was introduced by CEO Lisa Su on June 13, 2023, and AMD stated it would begin sampling the MI300X GPU to customers in Q3 2023. As part of the presentation, Su stated that the MI300X is powerful enough to run the Falcon-40B LLM (one of the most popular LLMs at the time, with 40 billion parameters) entirely in memory, rather than moving data back and forth to external memory. Su said the MI300X could run models of up to approximately 80 billion parameters in memory, helping to reduce:

the number of GPUs you need, significantly speeding up the performance, especially for inference, as well as reducing the total cost of ownership.
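The memory-capacity claim can be sanity-checked with a back-of-the-envelope calculation. The sketch below is an illustrative assumption, not from AMD's materials: it counts only model weights at 16-bit (2-byte) precision and ignores activation and KV-cache overhead. On that basis, an ~80-billion-parameter model needs roughly 160GB for its weights, which fits within the MI300X's 192GB of HBM3.

    # Illustrative assumption, not AMD's methodology: weight memory for an LLM
    # at 16-bit precision compared against the MI300X's 192GB of HBM3.

    HBM3_CAPACITY_GB = 192  # MI300X on-package memory

    def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
        """Approximate memory for model weights alone (no activations or KV cache)."""
        return params_billion * 1e9 * bytes_per_param / 1e9

    for params in (40, 80):  # Falcon-40B and the ~80B figure cited by Lisa Su
        need = weights_gb(params)
        verdict = "fits" if need <= HBM3_CAPACITY_GB else "does not fit"
        print(f"{params}B parameters ~ {need:.0f} GB of weights -> {verdict} in {HBM3_CAPACITY_GB} GB")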

On June 13, 2023, AMD also announced the AMD Infinity Architecture Platform with an 8-way interlinked MI300X design for larger workloads.

Further Resources

Title: AMD Reveals MI300X AI Chip (Watch It Here)
Link: https://www.youtube.com/watch?v=rYVPDQfRcL0
Type: Web
Date: June 13, 2023
