pricegpu

Cheapest NVIDIA A100 80GB SXM for SDXL inference

80GB VRAM · 312 TFLOPS FP16 · 12GB min for SDXL inference

Best price: $1.85/hr on FluidStack
on-demand · us-east · verify before purchasing

NVIDIA A100 80GB SXM Prices — 12 offers

12 offers found, from $1.85/hr as of Apr 29, 2026. Verify on provider site.
Provider · Configuration · Region · Billing · Availability · Price/hr
FluidStack (cheapest) · 1x A100 SXM 80GB · us-east · per-minute · on-demand · $1.85/hr
DataCrunch · 1x A100 SXM4 80GB · eu-north · per-minute · on-demand · $1.89/hr
RunPod · 1x A100 SXM · us-east · per-second · on-demand · $1.89/hr
Hyperstack · 1x A100 SXM4 80GB · uk-london · per-minute · on-demand · $2.06/hr
CoreWeave · 1x A100 SXM4 80GB · us-east · per-second · on-demand · $2.21/hr
lambda · 1x A100 SXM · us-west-2 · per-minute · on-demand · $2.21/hr
Paperspace · 1x A100 SXM · us-east · per-minute · on-demand · $2.30/hr
together · 1x A100 SXM 80GB · us-east · per-second · on-demand · $2.49/hr
fal · 1x A100 SXM 80GB · us-east · per-millisecond · on-demand · $2.99/hr
Replicate · NVIDIA A100 (80GB, SXM) · us-east · per-second · on-demand · $3.24/hr
Modal · 1x A100 SXM · us-east · per-second · on-demand · $3.72/hr
lambda · 8x A100 SXM · us-west-2 · per-minute · on-demand · $17.68/hr
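The cheapest offer in the table above can be found programmatically. A minimal sketch, using the single-GPU on-demand prices listed as of Apr 29, 2026 (verify on each provider's site before relying on them):

```python
# Single-GPU on-demand offers copied from the table above: (provider, $/hr).
offers = [
    ("FluidStack", 1.85),
    ("DataCrunch", 1.89),
    ("RunPod", 1.89),
    ("Hyperstack", 2.06),
    ("CoreWeave", 2.21),
    ("lambda", 2.21),
    ("Paperspace", 2.30),
    ("together", 2.49),
    ("fal", 2.99),
    ("Replicate", 3.24),
    ("Modal", 3.72),
]

# Pick the lowest hourly price.
cheapest = min(offers, key=lambda o: o[1])
print(f"Cheapest: {cheapest[0]} at ${cheapest[1]:.2f}/hr")  # → FluidStack at $1.85/hr
```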

Compatibility

GPU VRAM: 80 GB ✓
Minimum required: 12 GB
Recommended: 24 GB ✓
Typical runtime: seconds per image
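Since runtime is measured in seconds per image, hourly prices convert directly into per-image costs. A minimal sketch; the 3-second SDXL latency is an assumed example figure, not a benchmark from this page:

```python
def cost_per_image(price_per_hr: float, seconds_per_image: float) -> float:
    """Convert an hourly GPU price into a per-image cost."""
    return price_per_hr * seconds_per_image / 3600.0

# Assumed example: $1.85/hr (the FluidStack price above) and a
# hypothetical 3 s/image SDXL generation time.
print(f"${cost_per_image(1.85, 3.0):.4f}/image")  # → $0.0015/image
```

At these rates the hourly price matters far less than generation latency, so it can be worth paying more per hour for a provider whose stack generates images faster.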

FAQ

Is the NVIDIA A100 80GB SXM good for SDXL inference?
Yes. SDXL inference requires about 12GB of VRAM; the NVIDIA A100 80GB SXM provides 80GB, well above both the 12GB minimum and the 24GB recommended.
What is the cheapest NVIDIA A100 80GB SXM for SDXL inference?
FluidStack at $1.85/hr (on-demand, us-east).

Last refresh: April 29, 2026. Verify on provider site.

Related pages

NVIDIA A100 80GB SXM pricing · all providers
Best GPU for SDXL inference
NVIDIA H100 80GB SXM for SDXL inference · 80GB VRAM
NVIDIA H100 80GB PCIe for SDXL inference · 80GB VRAM
NVIDIA H200 141GB SXM for SDXL inference · 141GB VRAM