| GPU ID | Display Name | Memory (GB) |
|---|---|---|
| AMD Instinct MI300X OAM | MI300X (coming soon) | 192 |
| NVIDIA A100 40GB | A100 40GB | 40 |
| NVIDIA A100 80GB PCIe | A100 PCIe | 80 |
| NVIDIA A100-SXM4-80GB | A100 SXM | 80 |
| NVIDIA B200 | B200 | 180 |
| NVIDIA H100 80GB HBM3 | H100 SXM | 80 |
| NVIDIA H100 PCIe | H100 PCIe | 80 |
| NVIDIA H200 | H200 SXM | 141 |
| NVIDIA L4 | L4 | 24 |
| NVIDIA RTX PRO 6000 Blackwell Server Edition | RTX PRO 6000 Server | 96 |
| NVIDIA RTX PRO 6000 Blackwell Workstation Edition | RTX PRO 6000 Workstation | 96 |
| NVIDIA Tesla P4 | P4 | 8 |
| NVIDIA Tesla T4 | T4 | 16 |
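For programmatic use, the table above is effectively a mapping from GPU ID to display name and memory. Below is a minimal Python sketch of that mapping; the `GPU_SPECS` dictionary and `memory_gb` helper are illustrative names only, not part of any official API, and the values are copied directly from the table.

```python
# Illustrative lookup table built from the rows above.
# GPU ID -> (display name, memory in GB)
GPU_SPECS: dict[str, tuple[str, int]] = {
    "AMD Instinct MI300X OAM": ("MI300X", 192),
    "NVIDIA A100 40GB": ("A100 40GB", 40),
    "NVIDIA A100 80GB PCIe": ("A100 PCIe", 80),
    "NVIDIA A100-SXM4-80GB": ("A100 SXM", 80),
    "NVIDIA B200": ("B200", 180),
    "NVIDIA H100 80GB HBM3": ("H100 SXM", 80),
    "NVIDIA H100 PCIe": ("H100 PCIe", 80),
    "NVIDIA H200": ("H200 SXM", 141),
    "NVIDIA L4": ("L4", 24),
    "NVIDIA RTX PRO 6000 Blackwell Server Edition": ("RTX PRO 6000 Server", 96),
    "NVIDIA RTX PRO 6000 Blackwell Workstation Edition": ("RTX PRO 6000 Workstation", 96),
    "NVIDIA Tesla P4": ("P4", 8),
    "NVIDIA Tesla T4": ("T4", 16),
}


def memory_gb(gpu_id: str) -> int:
    """Return the memory (in GB) for a GPU ID from the table; raises KeyError if unknown."""
    return GPU_SPECS[gpu_id][1]


print(memory_gb("NVIDIA H200"))  # 141
```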

