Compare MI300X prices across 2 cloud providers. On-demand from $1.99/hr.
The AMD Instinct MI300X is the strongest alternative to NVIDIA's datacenter GPU lineup. Built on the CDNA 3 architecture with 192 GB of HBM3 memory, it offers 2.4x the memory capacity of the 80 GB H100 at competitive performance levels. This makes it particularly attractive for large-model inference, where the entire model must fit in GPU memory.
| Specification | Value |
|---|---|
| Architecture | CDNA 3 (AMD) |
| GPU Memory | 192 GB HBM3 |
| Memory Bandwidth | 5.3 TB/s |
| FP16 Performance | 1,307 TFLOPS |
| TDP | 750W |
| Interconnect | Infinity Fabric (896 GB/s) |
| Release Year | 2024 |
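The 192 GB capacity is the headline number for inference. A quick back-of-the-envelope check shows why: a minimal sketch below estimates whether a model's weights fit on a single GPU, assuming FP16 weights (2 bytes per parameter) plus a ~20% overhead allowance for KV cache and activations (that overhead factor is an assumption and varies with batch size and context length).

```python
def fits_in_memory(params_billions, bytes_per_param=2, overhead=1.2, gpu_mem_gb=192):
    """Rough fit check: weights at FP16 (2 bytes/param) times a ~20%
    overhead factor for KV cache/activations (assumed, workload-dependent)."""
    needed_gb = params_billions * bytes_per_param * overhead
    return needed_gb, needed_gb <= gpu_mem_gb

# A 70B-parameter model: 70 * 2 * 1.2 = 168 GB -> fits on one MI300X
print(fits_in_memory(70))   # (168.0, True)
# The same weights alone (140 GB) already exceed a single 80 GB H100.
```

By this estimate, models up to roughly 80B parameters can serve from a single MI300X, while an 8-GPU node offers 1.5 TB of aggregate HBM.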
MI300X pricing ranges from roughly $1.71/hr to $6/hr, with most neoclouds at $2-3/hr. It undercuts H100 pricing by 10-30% on a per-GPU-hour basis and offers substantially more memory per dollar. ROCm software maturity has improved significantly through 2025.
| Provider | Instance Type | GPUs | On-Demand ($/hr) | Spot ($/hr) |
|---|---|---|---|---|
| RunPod | 1x_MI300X_SECURE | 1 | $1.99 | $1.49 |
| Hot Aisle | vm-1x-mi300x | 1 | $1.99 | N/A |
| Hot Aisle | vm-2x-mi300x | 2 | $3.98 | N/A |
| RunPod | 2x_MI300X_SECURE | 2 | $3.98 | $2.98 |
| RunPod | 3x_MI300X_SECURE | 3 | $5.97 | $4.47 |
| RunPod | 4x_MI300X_SECURE | 4 | $7.96 | $5.96 |
| Hot Aisle | vm-4x-mi300x | 4 | $7.97 | N/A |
| RunPod | 5x_MI300X_SECURE | 5 | $9.95 | $7.45 |
| RunPod | 6x_MI300X_SECURE | 6 | $11.94 | $8.94 |
| RunPod | 7x_MI300X_SECURE | 7 | $13.93 | $10.43 |
| RunPod | 8x_MI300X_SECURE | 8 | $15.92 | $11.92 |
| Hot Aisle | bm-8x-mi300x | 8 | $15.94 | N/A |
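Spot capacity on RunPod runs roughly 25% below on-demand. Using the 8x rows above, a sketch of the monthly cost difference (assuming ~730 hours per month and continuous utilization):

```python
# Monthly cost for an 8x MI300X instance at the RunPod rates listed above
on_demand = 15.92   # $/hr, 8x on-demand
spot = 11.92        # $/hr, 8x spot (preemptible)
hours = 730         # approximate hours per month

print(f"On-demand: ${on_demand * hours:,.0f}/mo")      # $11,622/mo
print(f"Spot:      ${spot * hours:,.0f}/mo")           # $8,702/mo
print(f"Savings:   {1 - spot / on_demand:.0%}")        # 25%
```

Spot instances can be preempted, so the discount is best suited to checkpointable or stateless workloads rather than latency-sensitive serving.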