pricegpu

NVIDIA A100 80GB SXM Cloud GPU Pricing

from $1.85/hr

80GB VRAM · 312 TFLOPS FP16 · Ampere · Released 2020

Cheapest NVIDIA A100 80GB SXM: FluidStack at $1.85/hr
on-demand · us-east · verify on provider site

Live Pricing — 12 offers

12 offers found, from $1.85/hr as of Apr 29, 2026; verify on provider site.
| Provider | Configuration | Region | Billing | Availability | Price/hr |
| --- | --- | --- | --- | --- | --- |
| FluidStack (cheapest) | 1x A100 SXM 80GB | us-east | per-minute | on-demand | $1.85/hr |
| DataCrunch | 1x A100 SXM4 80GB | eu-north | per-minute | on-demand | $1.89/hr |
| RunPod | 1x A100 SXM | us-east | per-second | on-demand | $1.89/hr |
| Hyperstack | 1x A100 SXM4 80GB | uk-london | per-minute | on-demand | $2.06/hr |
| CoreWeave | 1x A100 SXM4 80GB | us-east | per-second | on-demand | $2.21/hr |
| lambda | 1x A100 SXM | us-west-2 | per-minute | on-demand | $2.21/hr |
| Paperspace | 1x A100 SXM | us-east | per-minute | on-demand | $2.30/hr |
| together | 1x A100 SXM 80GB | us-east | per-second | on-demand | $2.49/hr |
| fal | 1x A100 SXM 80GB | us-east | per-millisecond | on-demand | $2.99/hr |
| Replicate | Nvidia A100 (80GB, SXM) | us-east | per-second | on-demand | $3.24/hr |
| Modal | 1x A100 SXM | us-east | per-second | on-demand | $3.72/hr |
| lambda | 8x A100 SXM | us-west-2 | per-minute | on-demand | $17.68/hr |
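The offers above mix per-minute, per-second, and per-millisecond billing, which matters for short jobs. A minimal sketch of how billing granularity affects total cost, assuming providers round usage up to the nearest billing unit (actual rounding rules vary by provider; verify on the provider site):

```python
import math

# Billing granularities seen in the table, in seconds per billing unit.
BILLING_UNIT_SECONDS = {
    "per-second": 1,
    "per-minute": 60,
    "per-millisecond": 0.001,
}

def job_cost(hourly_rate: float, runtime_seconds: float, billing: str) -> float:
    """Cost in dollars, assuming runtime is rounded UP to the billing unit."""
    unit = BILLING_UNIT_SECONDS[billing]
    billed_seconds = math.ceil(runtime_seconds / unit) * unit
    return billed_seconds / 3600 * hourly_rate

# A 90-second job on FluidStack ($1.85/hr, per-minute) bills 2 full minutes,
# while the same job on RunPod ($1.89/hr, per-second) bills exactly 90 seconds.
fluidstack = job_cost(1.85, 90, "per-minute")
runpod = job_cost(1.89, 90, "per-second")
```

Under this rounding assumption, RunPod's per-second billing comes out cheaper for the 90-second job despite its higher hourly rate; for long-running jobs the hourly rate dominates and the granularity difference washes out.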

Specifications

VRAM: 80 GB
Memory Bandwidth: 2000 GB/s
FP16 TFLOPS: 312
Architecture: Ampere
Released: 2020
TDP: 400 W
MSRP: $15,000
Good for: llm-training, fine-tuning, llm-inference-large, hpc
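One rough way to relate the spec sheet to the pricing table is dollars per FP16 TFLOP-hour, plus the break-even point against the MSRP. A small sketch using the 312 TFLOPS and $15,000 figures above (a naive ratio that ignores real-world utilization, networking, and storage costs):

```python
FP16_TFLOPS = 312       # from the Specifications section
MSRP_DOLLARS = 15_000   # from the Specifications section

def dollars_per_tflop_hour(hourly_rate: float) -> float:
    """Naive compute-cost ratio: rental price per FP16 TFLOP-hour."""
    return hourly_rate / FP16_TFLOPS

def break_even_hours(hourly_rate: float) -> float:
    """Hours of continuous rental at which spend equals the card's MSRP."""
    return MSRP_DOLLARS / hourly_rate

# Cheapest offer in the table (FluidStack, $1.85/hr):
# roughly $0.0059 per FP16 TFLOP-hour, and about 8,100 hours
# (~11 months of continuous use) to match the MSRP.
ratio = dollars_per_tflop_hour(1.85)
hours = break_even_hours(1.85)
```

The break-even figure is only indicative, since it omits power, hosting, and depreciation on the ownership side.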

FAQ

What is the cheapest NVIDIA A100 80GB SXM cloud provider?
As of the last data refresh, FluidStack offers the NVIDIA A100 80GB SXM at $1.85/hr (on-demand, us-east).
How much VRAM does the NVIDIA A100 80GB SXM have?
The NVIDIA A100 80GB SXM has 80GB of VRAM and 2000 GB/s memory bandwidth.
What workloads is the NVIDIA A100 80GB SXM good for?
The NVIDIA A100 80GB SXM is well-suited for LLM training, fine-tuning, large-model LLM inference, and HPC workloads.
What is the FP16 performance of the NVIDIA A100 80GB SXM?
The NVIDIA A100 80GB SXM delivers 312 TFLOPS of FP16 performance.

Last data refresh: April 29, 2026. Verify on provider site.

Related GPUs & Providers

NVIDIA A100 80GB PCIe
80GB VRAM · Ampere
NVIDIA A100 40GB SXM
40GB VRAM · Ampere
NVIDIA A100 40GB PCIe
40GB VRAM · Ampere
NVIDIA H100 80GB SXM
80GB VRAM · Hopper
NVIDIA RTX A6000 48GB
48GB VRAM · Ampere
NVIDIA H100 80GB PCIe
80GB VRAM · Hopper