GPUs for Running Stable Diffusion
Note that this list is aimed at cloud GPUs, where renting a more expensive GPU is comparatively cheap compared with buying the hardware outright.
You can run Stable Diffusion on smaller/cheaper GPUs!
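As a rough illustration, here is a minimal sketch using the Hugging Face diffusers library, assuming an SD 1.5 checkpoint and a CUDA card with around 8–10 GB of VRAM; half precision and attention slicing are just one common way to trim memory use, not the only setup.

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the pipeline in half precision to roughly halve VRAM usage.
# The model ID below is one commonly used SD 1.5 checkpoint.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Trade a bit of speed for a lower peak memory footprint,
# which helps on 8-10 GB cards like the RTX 3080.
pipe.enable_attention_slicing()

image = pipe("a watercolor painting of a mountain lake").images[0]
image.save("out.png")
```

This trades some throughput for a smaller peak memory footprint, which is usually the binding constraint on consumer cards.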
GPU | VRAM (GB) | Speed relative to H100 for SD | Speed / ($/hr) | Lowest cost per hour | Cost at Runpod | Cost at FluidStack | Cost at Lambda Labs |
---|---|---|---|---|---|---|---|
RTX 4090 | 24 | 50% | 👌 0.72 | $0.69 | ✅ $0.69 | None | None |
H100 PCIe | 80 | 🏆 100% | 0.50 | $1.99 | None | ✅ $1.99 | ✅ $1.99 |
RTX 3090 | 24 | 21% | 0.49 | 🪙 $0.44 | ✅ $0.44 | $0.59 | None |
RTX 3080 | 10 | 21% | 0.43 | $0.50 | None | $0.50 | None |
6000 Ada | 48 | 48% | 0.40 | $1.19 | $1.19 | None | None |
A100 40GB | 40 | 43% | 0.39 | $1.10 | None | $1.20 | $1.10 |
L40 | 48 | 43% | 0.36 | $1.19 | $1.19 | None | None |
V100 | 16 | 24% | 0.27 | $0.87 | None | $0.87 | None |
A6000 | 48 | 19% | 0.24 | $0.79 | $0.79 | $0.80 | $0.80 |
A40 | 48 | 19% | 0.24 | $0.79 | $0.79 | $1.57 | None |
A100 80GB | 80 | 43% | 0.24 | $1.79 | $1.79 | $2.91 | None |
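For context on the Speed / ($/hr) column: it appears to be the relative speed (H100 = 1.0) divided by the lowest hourly price. A quick sketch of that arithmetic, with a few rows from the table hard-coded as examples:

```python
# Apparent derivation of the Speed / ($/hr) column:
# relative speed vs. the H100 divided by the lowest hourly price.
# Values below are copied from the table; small rounding differences
# in other rows suggest the underlying speeds have more precision.
gpus = {
    # name: (speed relative to H100, lowest cost per hour in $)
    "RTX 4090": (0.50, 0.69),
    "H100 PCIe": (1.00, 1.99),
    "A100 40GB": (0.43, 1.10),
}

for name, (rel_speed, dollars_per_hr) in gpus.items():
    print(f"{name}: {rel_speed / dollars_per_hr:.2f}")
# RTX 4090: 0.72, H100 PCIe: 0.50, A100 40GB: 0.39
```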