Best Cloud GPU for SDXL inference
Minimum 12GB VRAM · Recommended 24GB+ · Runtime: seconds-per-image
Cheapest for SDXL inference: NVIDIA GeForce RTX 3090 24GB on Salad
$0.130/hr · verify on provider site
Cheapest GPU Options — 25 eligible GPUs
GPU Requirements
FAQ
What GPU do I need for SDXL inference?
SDXL inference requires at least 12GB VRAM; 24GB+ is recommended. Ideal cards: NVIDIA GeForce RTX 4090 24GB or NVIDIA A10 24GB.
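The VRAM tiers above can be sketched as a simple check. This is a minimal illustration using the thresholds from this page (12GB minimum, 24GB recommended); how you actually query VRAM (e.g. via `torch.cuda` or `nvidia-smi`) depends on your stack and is not shown here.

```python
# Thresholds taken from this page's SDXL requirements.
MIN_VRAM_GB = 12
RECOMMENDED_VRAM_GB = 24

def vram_status(total_gb: float) -> str:
    """Classify a card's VRAM against the SDXL inference requirements."""
    if total_gb >= RECOMMENDED_VRAM_GB:
        return "recommended"
    if total_gb >= MIN_VRAM_GB:
        return "minimum"
    return "insufficient"

print(vram_status(24))  # RTX 3090 / RTX 4090 class -> "recommended"
print(vram_status(12))  # -> "minimum"
print(vram_status(8))   # -> "insufficient"
```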
What is the cheapest GPU for SDXL inference?
NVIDIA GeForce RTX 3090 24GB at $0.130/hr on Salad.
How much does SDXL inference cost per hour?
From $0.130/hr, with runtimes measured in seconds per image.
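Since pricing is hourly but SDXL runtime is measured per image, per-image cost is simply rate × seconds ÷ 3600. A minimal sketch, using the $0.130/hr Salad RTX 3090 rate quoted above; the 5-second runtime is an illustrative assumption, not a benchmark.

```python
def cost_per_image(hourly_rate_usd: float, seconds_per_image: float) -> float:
    """Per-image cost: hourly rate prorated to the per-image runtime."""
    return hourly_rate_usd * seconds_per_image / 3600

# Assumed 5 s/image for illustration only.
print(f"${cost_per_image(0.130, 5):.6f} per image at $0.130/hr and 5 s/image")
```

At these numbers an image costs well under a tenth of a cent, which is why hourly rate rather than per-image price dominates provider comparisons.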