AMD has improved its Instinct accelerator

The Instinct MI325X aims to win customers over with its memory, and the MI350 series is also in the works.

At the end of last year, we wrote about the launch of the AMD Instinct MI300 series, and this lineup has just received a minor boost: AMD has announced the Instinct MI325X accelerator. In terms of the basics, the design does not change; the new solution uses the same silicon as the Instinct MI300X, and the differences lie in the power limit and the memory configuration. The former has increased from 750 watts to 1,000 watts, and while the current model uses 8-Hi HBM3 stacks, the new design is built for 12-Hi stacks and also supports the HBM3E standard.

The exact specifications of the Instinct MI325X are detailed in the table below:

AMD Instinct MI300 series (CDNA 3 architecture)
Type: MI325X
Number of I/O chiplets: 4
Number of XCD chiplets: 8
Number of CCD chiplets: –
XCD architecture: CDNA 3
CCD architecture: –
Interconnect between XCDs and CCDs: –
Maximum XCD core clock: 2100 MHz
Number of shader units: 19,456
Number of CPU cores: –
INT8 compute: 2600 TOPS
bfloat16 compute: 1300 TFLOPS
FP16 compute: 1300 TFLOPS
FP32 matrix compute: 163.4 TFLOPS
FP64 matrix compute: 163.4 TFLOPS
FP32 compute: 163.4 TFLOPS
FP64 compute: 81.7 TFLOPS
Effective memory clock: 6000 MHz
Memory type: HBM3E
Memory bus: 8192-bit
VRAM capacity: 256 GB
Memory bandwidth: 6 TB/s
ECC support: yes
TDP / maximum power limit: – / 1000 W
Form factor / packaging: OAM
PCI Express controller: x16 PCI Express 5.0
Number of Infinity Fabric links: 8
RAS and Page Retirement: yes
Memory coherence with the host CPU: yes

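The headline bandwidth and capacity figures follow directly from the numbers in the table above. The short sketch below reproduces them as a sanity check; the 32 GB-per-stack figure for a 12-Hi HBM3E stack is our assumption, inferred from the listed 256 GB total over eight stacks.

```python
# Back-of-the-envelope check of the table figures (illustrative only).

effective_clock_hz = 6000e6      # 6000 MHz effective memory clock
bus_width_bits = 8192            # 8192-bit memory bus (8 HBM stacks x 1024 bits)

# Bandwidth = effective clock x bus width, converted from bits to bytes.
bandwidth_bytes_per_s = effective_clock_hz * bus_width_bits / 8
print(f"Bandwidth: {bandwidth_bytes_per_s / 1e12:.1f} TB/s")   # ~6.1 TB/s

# Capacity: 8 stacks; 32 GB per 12-Hi HBM3E stack is assumed from the 256 GB total.
stacks = 8
gb_per_stack = 32
print(f"Capacity: {stacks * gb_per_stack} GB")                  # 256 GB
```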
The Instinct MI325X ships in the OAM form factor and uses passive cooling, meaning that the host server has to provide the airflow. A UBB platform is also built from it, combining eight Instinct MI325X OAM accelerators.


The primary advantage of the Instinct MI325X is its on-package memory capacity, which is large enough to accommodate the training of particularly large neural network models.
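To put that capacity into perspective, the rough sketch below estimates how many parameters fit into 256 GB at common training precisions. The per-parameter byte counts are standard figures rather than AMD's numbers, and real training needs additional room for optimizer state, gradients, and activations.

```python
# Rough estimate of parameter counts that fit into 256 GB of HBM (illustrative).

capacity_bytes = 256 * 1024**3   # 256 GB of on-package memory

bytes_per_param = {
    "FP32": 4,
    "FP16/bfloat16": 2,
    "FP8": 1,
}

for fmt, nbytes in bytes_per_param.items():
    params = capacity_bytes / nbytes
    print(f"{fmt}: ~{params / 1e9:.0f} billion parameters (weights only)")
```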


Meanwhile, AMD is also working on the CDNA 4 architecture, which will form the basis of the upcoming Instinct MI350 series. The company revealed that it will support new FP4 and FP6 data types, and that its compute capacity for AI workloads will increase significantly. The main chiplet will be fabricated on a 3 nm node, while the HBM3E memory will offer a maximum capacity of 288 GB.
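FP4 and FP6 are not native Python types, so the sketch below only illustrates the memory-saving idea behind such narrow formats: each weight is rounded to the nearest value representable on a tiny grid and can then be stored with far fewer bits. The E2M1-style value set used here is an assumption for illustration, not AMD's published encoding.

```python
# Illustrative 4-bit quantization: round each weight to the nearest value in a
# small, FP4-like grid (E2M1-style magnitudes are assumed here for illustration).

FP4_GRID = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]          # positive magnitudes
FP4_VALUES = sorted({-v for v in FP4_GRID} | set(FP4_GRID))  # 15 distinct values

def quantize_fp4(x: float) -> float:
    """Return the closest representable value in the assumed 4-bit grid."""
    return min(FP4_VALUES, key=lambda v: abs(v - x))

weights = [0.27, -1.8, 3.3, 5.1]
print([quantize_fp4(w) for w in weights])   # [0.5, -2.0, 3.0, 6.0]
```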

Source: prohardver.hu