pricegpu

NVIDIA H100 80GB SXM Cloud GPU Pricing

80GB VRAM · 1979 TFLOPS FP16 (with sparsity) · Hopper · Released 2022

Live Pricing — 0 offers

No pricing data available yet. Check back after the next scrape.

Specifications

VRAM: 80 GB
Memory Bandwidth: 3350 GB/s
FP16 TFLOPS: 1979 (Tensor Core, with sparsity)
FP8 TFLOPS: 3958 (Tensor Core, with sparsity)
Architecture: Hopper
Released: 2022
TDP: 700 W
MSRP: $30,000
Good For: llm-training, llm-inference-large, fine-tuning, hpc
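The specs above fix the card's compute-to-bandwidth balance and its sticker-price efficiency. A minimal sketch (plain Python, numbers taken directly from the table; the derived ratios are not part of the listing) of what they imply:

```python
# Specs from the table above (NVIDIA H100 80GB SXM).
FP16_TFLOPS = 1979       # Tensor Core throughput, with sparsity
BANDWIDTH_GBPS = 3350    # HBM3 memory bandwidth
MSRP_USD = 30_000

# Arithmetic intensity at which the card stops being memory-bound:
# FLOPs available per byte of HBM traffic.
flops_per_byte = (FP16_TFLOPS * 1e12) / (BANDWIDTH_GBPS * 1e9)

# MSRP cost per FP16 TFLOPS.
usd_per_tflops = MSRP_USD / FP16_TFLOPS

print(f"{flops_per_byte:.0f} FLOP/byte, ${usd_per_tflops:.2f} per TFLOPS")
```

Workloads with lower arithmetic intensity than the ~590 FLOP/byte ratio (e.g. small-batch LLM inference) are limited by the 3350 GB/s of bandwidth rather than the Tensor Cores.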

FAQ

What is the cheapest NVIDIA H100 80GB SXM cloud provider?
No pricing data is currently available for the NVIDIA H100 80GB SXM.
How much VRAM does the NVIDIA H100 80GB SXM have?
The NVIDIA H100 80GB SXM has 80GB of VRAM and 3350 GB/s memory bandwidth.
What workloads is the NVIDIA H100 80GB SXM good for?
The NVIDIA H100 80GB SXM is well suited for llm-training, llm-inference-large, fine-tuning, and hpc workloads.
What is the FP16 performance of the NVIDIA H100 80GB SXM?
The NVIDIA H100 80GB SXM delivers 1979 TFLOPS of FP16 performance and 3958 TFLOPS of FP8 (both Tensor Core figures with sparsity).
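Once offers are scraped, the "cheapest provider" question in the FAQ reduces to taking the minimum hourly rate across offers. A hypothetical sketch, assuming offers arrive as (provider, USD-per-GPU-hour) pairs; the provider names and prices below are illustrative, not data from this page:

```python
# Hypothetical offer data; names and prices are illustrative only.
offers = [
    ("provider-a", 2.49),  # USD per GPU-hour
    ("provider-b", 1.99),
    ("provider-c", 3.10),
]

def cheapest(offers):
    """Return the (provider, price) pair with the lowest hourly rate,
    or None when no offers have been scraped yet (the current state)."""
    return min(offers, key=lambda o: o[1]) if offers else None

print(cheapest(offers))  # lowest-priced offer
print(cheapest([]))      # no pricing data yet -> None
```

The empty-list branch mirrors the "0 offers" state shown above: with no scraped data, there is no cheapest provider to report.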

Related GPUs & Providers

NVIDIA H100 80GB PCIe
80GB VRAM · Hopper
NVIDIA H200 141GB SXM
141GB VRAM · Hopper
NVIDIA A100 80GB SXM
80GB VRAM · Ampere
NVIDIA A100 80GB PCIe
80GB VRAM · Ampere
NVIDIA B100 192GB HBM3e
192GB VRAM · Blackwell
Intel Gaudi 3
128GB VRAM · Gaudi 3