pricegpu

NVIDIA B100 192GB HBM3e Cloud GPU Pricing

192GB VRAM · 3500 TFLOPS FP16 · Blackwell · Released 2025

Live Pricing — 0 offers

No pricing data available yet. Check back after the next scrape.

Specifications

VRAM: 192 GB
Memory Bandwidth: 8000 GB/s
FP16 TFLOPS: 3500
FP8 TFLOPS: 7000
Architecture: Blackwell
Released: 2025
TDP: 700 W
MSRP: $35,000
Good For: llm-training, llm-inference-large, hpc, fine-tuning
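With no cloud offers listed yet, the MSRP is the only price point available. A quick back-of-the-envelope sketch using only the figures from the table above (hardware cost, not cloud $/hr):

```python
# Rough value metrics from the listed specs.
# All numbers come from the specification table; cloud pricing
# is not yet available, so the $35,000 MSRP is used instead.

MSRP_USD = 35_000
FP16_TFLOPS = 3500
VRAM_GB = 192

usd_per_fp16_tflop = MSRP_USD / FP16_TFLOPS   # hardware cost per FP16 TFLOP
usd_per_gb_vram = MSRP_USD / VRAM_GB          # hardware cost per GB of VRAM

print(f"${usd_per_fp16_tflop:.2f} per FP16 TFLOP")  # $10.00
print(f"${usd_per_gb_vram:.2f} per GB VRAM")        # $182.29
```

Once live offers appear, the same ratios can be recomputed with $/hr in place of MSRP to compare providers.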

FAQ

What is the cheapest NVIDIA B100 192GB HBM3e cloud provider?
No pricing data is currently available for the NVIDIA B100 192GB HBM3e.
How much VRAM does the NVIDIA B100 192GB HBM3e have?
The NVIDIA B100 192GB HBM3e has 192GB of VRAM and 8000 GB/s memory bandwidth.
What workloads is the NVIDIA B100 192GB HBM3e good for?
The NVIDIA B100 192GB HBM3e is well suited to LLM training, large-scale LLM inference, HPC workloads, and fine-tuning.
What is the FP16 performance of the NVIDIA B100 192GB HBM3e?
The NVIDIA B100 192GB HBM3e delivers 3500 TFLOPS of FP16 performance and 7000 TFLOPS of FP8.
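The listed compute and bandwidth figures also give a rough compute-to-bandwidth break-even point: a kernel needs at least this many FLOPs per byte moved to be compute-bound rather than memory-bound. A minimal sketch from the specs above:

```python
# Break-even arithmetic intensity from the listed specs:
# 3500 TFLOPS FP16 and 8000 GB/s memory bandwidth.

FP16_FLOPS = 3500e12      # 3500 TFLOPS, in FLOP/s
BANDWIDTH_BPS = 8000e9    # 8000 GB/s, in bytes/s

flops_per_byte = FP16_FLOPS / BANDWIDTH_BPS
print(f"{flops_per_byte:.1f} FLOP/byte")  # 437.5
```

Workloads below roughly 437 FLOP/byte (e.g. memory-bound inference decoding) will be limited by the 8000 GB/s bandwidth rather than the FP16 compute.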

Related GPUs & Providers

NVIDIA B200 192GB HBM3e
192GB VRAM · Blackwell
AMD Instinct MI300X 192GB
192GB VRAM · CDNA3
NVIDIA H100 80GB SXM
80GB VRAM · Hopper
NVIDIA A100 80GB SXM
80GB VRAM · Ampere
NVIDIA H100 80GB PCIe
80GB VRAM · Hopper
NVIDIA H200 141GB SXM
141GB VRAM · Hopper