
FluidStack vs Lambda Labs vs Runpod vs Tensordock

Which GPU cloud should you use? #

  • H100s or A100s in large quantities
    • Talk to Oracle, FluidStack, Lambda Labs. Maybe talk to CoreWeave, Crusoe, Runpod, AWS, Azure, GCP. Capacity is low.
  • 1x H100
    • FluidStack or Lambda Labs
  • A few A100s
    • FluidStack or Runpod
  • Cheap 3090s, 4090s, or A6000s
    • Tensordock
  • Stable Diffusion inference only
    • Salad.com
  • For accessing a wide variety of GPUs
    • Runpod or FluidStack
  • If you’re a hobbyist and want an easy start
    • Runpod
  • If you’re tied to an existing large cloud
    • Stick with them, I suppose!
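
If it helps to have this as something you can drop into a script, here is a rough sketch of the list above as a Python lookup. The workload labels are my own shorthand, not any provider's API:

```python
# A rough sketch of the recommendations above as a plain lookup table.
# The workload labels are shorthand for this post, not provider terminology.
RECOMMENDATIONS = {
    "many_h100s_or_a100s": ["Oracle", "FluidStack", "Lambda Labs"],
    "single_h100": ["FluidStack", "Lambda Labs"],
    "few_a100s": ["FluidStack", "Runpod"],
    "cheap_3090_4090_a6000": ["Tensordock"],
    "stable_diffusion_inference": ["Salad.com"],
    "wide_gpu_variety": ["Runpod", "FluidStack"],
    "hobbyist_easy_start": ["Runpod"],
}

def recommend(workload: str) -> list[str]:
    """Return the suggested GPU clouds for a workload label, or an empty list."""
    return RECOMMENDATIONS.get(workload, [])

if __name__ == "__main__":
    print(recommend("single_h100"))  # ['FluidStack', 'Lambda Labs']
```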

Runpod vs Lambda Labs vs FluidStack vs Tensordock #

Runpod is kind of a jack of all trades: lots of GPU types, solid pricing on most of them, and easy deployment templates for beginners.

Tensordock is the pick if you need 3090s, 4090s, or A6000s - its prices on those cards are the lowest.

Lambda Labs and FluidStack are fairly similar, with similar pricing. Lambda has a simpler interface, though you'll get used to FluidStack's, and FluidStack often has better availability.

Runpod #

  • Pros:
    • Lots of GPU types
    • Good pricing
    • Cool templates
  • Cons:
    • What Runpod provides is a Docker container on a host machine, not a VM with an OS of your choosing installed. SSH and scp access are enabled, but it's still a container rather than a full VM (see the sketch after this section for pulling files off a pod).
    • Tensordock pricing is better for 3090s, 4090s, A6000s
    • FluidStack and Lambda have better pricing on H100s

Best for: Beginners, A100s.
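
Since a Runpod pod is a Docker container you reach over SSH with scp enabled (see the cons above), here is a minimal sketch of pulling a file back to your machine. The host, port, user, key path, and remote path are placeholders; substitute the connection details shown for your pod:

```python
import os
import subprocess

# Placeholder connection details -- copy the real host, port, user, and key
# path from your pod's connection info. These values are not a real pod.
POD_HOST = "203.0.113.10"                          # placeholder IP
POD_PORT = "22"                                    # SSH is often exposed on a mapped port
SSH_KEY = os.path.expanduser("~/.ssh/id_ed25519")  # the key you registered

def pull_file(remote_path: str, local_path: str) -> None:
    """Copy a file off the pod with scp over the pod's SSH access."""
    subprocess.run(
        ["scp", "-i", SSH_KEY, "-P", POD_PORT,
         f"root@{POD_HOST}:{remote_path}", local_path],
        check=True,
    )

if __name__ == "__main__":
    # Hypothetical paths for illustration.
    pull_file("/workspace/checkpoints/model.ckpt", "./model.ckpt")
```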

FluidStack #

  • Pros:
    • Good pricing on H100s
    • Generally the best option for A100 availability (along with Runpod), with good pricing
    • Good option for large quantities of H100s
  • Cons:
    • Interface can be confusing at first
    • Prices on ‘preconfigured machines’ are good, but non-preconfigured machines are expensive

Best for: A100s, H100s.

Lambda Labs #

  • Pros:
    • Nice interface
    • Good pricing on H100s
    • Good option for large quantities of H100s
  • Cons:
    • Poor availability
    • Had driver issues with their H100 instances

Best for: H100s.

Tensordock #

  • Pros:
    • Marketplace pricing is great
    • Cheapest options for 3090s, 4090s, A6000s
  • Cons:
    • Non-marketplace pricing isn’t great
    • Minimal availability on A100s and no H100s

Best for: 3090s, 4090s, A6000s.

Even more GPU cloud options #

In general, the ones above will be best for most people. But for more info: see here, here, here, here and here.