pricegpu

Cheapest NVIDIA H200 141GB SXM for 70B LLM inference

141GB VRAM · 1,979 TFLOPS FP16 · 80GB minimum for 70B LLM inference

NVIDIA H200 141GB SXM Prices — 0 offers

No pricing data available yet. Check back after the next scrape.

Compatibility

GPU VRAM: 141 GB ✓
Minimum Required: 80 GB
Recommended: 160 GB (minimum only)
Typical Runtime: tokens-per-second
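The minimum/recommended figures above follow from simple arithmetic on model size. As a rough sketch (the precisions and the 20% headroom factor for KV cache and activations are assumptions, not figures from this page): a 70B-parameter model needs about 2 bytes per parameter at FP16 and 1 byte at INT8, which lines up with the 160 GB recommended and 80 GB minimum values.

```python
# Rough VRAM estimate for serving an LLM: weight footprint plus headroom.
# The 1.2 overhead factor (KV cache + activations) is an assumed rule of thumb.

def estimate_vram_gb(params_b: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Estimated VRAM in GB for params_b billion parameters at a given precision."""
    return params_b * bytes_per_param * overhead

# A 70B model at common inference precisions:
for name, bpp in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
    print(f"{name}: ~{estimate_vram_gb(70, bpp):.0f} GB")
```

On this estimate, a single 141 GB H200 fits a 70B model at INT8 or INT4 with room for batching, while full FP16 would spill past one card.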

FAQ

Is the NVIDIA H200 141GB SXM good for 70B LLM inference?
Yes. 70B LLM inference requires 80GB of VRAM; the NVIDIA H200 141GB SXM has 141GB.
What is the cheapest NVIDIA H200 141GB SXM for 70B LLM inference?
No offers are listed yet; pricing data is being refreshed.

Related pages

NVIDIA H200 141GB SXM pricing
All providers
Best GPU for 70B LLM inference
NVIDIA H100 80GB SXM for 70B LLM inference
80GB VRAM
NVIDIA H100 80GB PCIe for 70B LLM inference
80GB VRAM
NVIDIA A100 80GB SXM for 70B LLM inference
80GB VRAM