Cheapest NVIDIA H200 141GB SXM for 70B LLM inference
141GB VRAM · 1979 TFLOPS FP16 (Tensor Core, with sparsity) · 80GB minimum for 70B LLM inference
NVIDIA H200 141GB SXM Prices — 0 offers
No pricing data is available yet; check back after the next scrape.
Compatibility
FAQ
Is the NVIDIA H200 141GB SXM good for 70B LLM inference?
Yes. Running a 70B-parameter model requires at least 80GB of VRAM (roughly 70GB for 8-bit weights plus room for the KV cache); the NVIDIA H200 141GB SXM offers 141GB, leaving ample headroom.
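The 80GB figure comes from simple arithmetic: weight memory is parameter count times bytes per parameter, plus an allowance for the KV cache and runtime overhead. A minimal sketch of that estimate (the function name, the flat 10GB overhead allowance, and the quantization choices are illustrative assumptions, not part of this page's data):

```python
def vram_needed_gb(params_b: float, bytes_per_param: float,
                   overhead_gb: float = 10.0) -> float:
    """Rough VRAM estimate: weights plus a flat allowance for
    KV cache, activations, and runtime overhead (assumed 10 GB)."""
    weights_gb = params_b * bytes_per_param  # 1 GB per billion params per byte
    return weights_gb + overhead_gb

# 70B at INT8 (1 byte/param): ~80 GB  -> fits on a single 141GB H200
# 70B at FP16 (2 bytes/param): ~150 GB -> exceeds even 141GB, needs multi-GPU
print(f"INT8: ~{vram_needed_gb(70, 1.0):.0f} GB")
print(f"FP16: ~{vram_needed_gb(70, 2.0):.0f} GB")
```

Under these assumptions, an 8-bit 70B model lands at about 80GB, which is why a single H200 141GB clears the requirement while an unquantized FP16 copy would not.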
What is the cheapest NVIDIA H200 141GB SXM for 70B LLM inference?
Pricing data is being refreshed; no offers have been scraped yet.