pricegpu

Cheapest NVIDIA GeForce RTX 4090 24GB for SDXL inference

24GB VRAM · 330 TFLOPS FP16 · 12GB min for SDXL inference

Best price: $0.240/hr on Salad
on-demand · distributed · verify before purchasing

NVIDIA GeForce RTX 4090 24GB Prices — 13 offers

13 offers found, starting at $0.240/hr as of Apr 29, 2026. Verify on the provider site before purchasing.
Provider         | Configuration          | Region      | Billing         | Availability | Price/hr
Salad (cheapest) | 1x RTX 4090            | distributed | per-minute      | on-demand    | $0.240/hr
vast             | 1x RTX 4090            | eu-west     | per-second      | on-demand    | $0.350/hr
FluidStack       | 1x RTX 4090            | eu-central  | per-minute      | on-demand    | $0.440/hr
RunPod           | 1x RTX 4090            | us-east     | per-second      | on-demand    | $0.440/hr
DataCrunch       | 1x RTX 4090            | eu-north    | per-minute      | on-demand    | $0.490/hr
TensorDock       | 1x RTX 4090            | us-east     | per-minute      | on-demand    | $0.490/hr
Hyperstack       | 1x RTX 4090            | eu-central  | per-minute      | on-demand    | $0.550/hr
genesis          | 1x RTX 4090            | eu-central  | per-minute      | on-demand    | $0.590/hr
lambda           | 1x RTX 4090            | us-west-1   | per-minute      | on-demand    | $0.790/hr
together         | 1x RTX 4090            | us-east     | per-second      | on-demand    | $0.790/hr
fal              | 1x RTX 4090            | us-east     | per-millisecond | on-demand    | $0.890/hr
Replicate        | Nvidia RTX 4090 (24GB) | us-east     | per-second      | on-demand    | $1.00/hr
Modal            | 1x RTX 4090            | us-east     | per-second      | on-demand    | $1.10/hr
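Picking the best offer from a snapshot like the table above is a one-liner. A minimal sketch, assuming a hypothetical hard-coded list of (provider, $/hr) pairs copied from this page; live prices change, so verify on the provider site:

```python
# Hypothetical snapshot of the offer table above (provider, $/hr).
# Prices drift; always confirm on the provider's own pricing page.
OFFERS = [
    ("Salad", 0.240),
    ("vast", 0.350),
    ("FluidStack", 0.440),
    ("RunPod", 0.440),
    ("Replicate", 1.00),
    ("Modal", 1.10),
]

def cheapest(offers):
    """Return the (provider, price) pair with the lowest hourly price."""
    return min(offers, key=lambda offer: offer[1])

provider, price = cheapest(OFFERS)
print(f"{provider}: ${price:.3f}/hr")  # Salad: $0.240/hr
```

Note that ties (here RunPod and FluidStack at $0.440/hr) resolve to whichever appears first in the list; add a secondary key (e.g. billing granularity) if that matters for your workload.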

Compatibility

GPU VRAM         | 24 GB ✓
Minimum required | 12 GB
Recommended      | 24 GB ✓
Typical runtime  | seconds per image

FAQ

Is the NVIDIA GeForce RTX 4090 24GB good for SDXL inference?
Yes. SDXL inference requires 12GB VRAM; the NVIDIA GeForce RTX 4090 24GB has 24GB.
What is the cheapest NVIDIA GeForce RTX 4090 24GB for SDXL inference?
Salad at $0.240/hr (on-demand, distributed).

Last refresh: April 29, 2026. Verify on provider site.

Related pages

NVIDIA GeForce RTX 4090 24GB pricing
All providers
Best GPU for SDXL inference
NVIDIA H100 80GB SXM for SDXL inference
80GB VRAM
NVIDIA H100 80GB PCIe for SDXL inference
80GB VRAM
NVIDIA H200 141GB SXM for SDXL inference
141GB VRAM