We have decided to change the AI service we use to run the Stable Diffusion webui. We started with the AWS-based "Stable Diffusion – Create Stunning Images on Your Cloud GPU Server" offering, and have since switched to Runpod.io. We now use their RTX A4000 instances, which come with the Stable Diffusion webui out of the box via an instance template.
Runpod gives more flexibility
One of the best parts of Runpod is how much easier it is to spin up a more powerful graphics card: just a few clicks, with no worrying about keys, permission groups, or ports. This has been great for some of the more demanding tests I’ve been running on AI services.
I’ve been playing around with training and textual inversion on their RTX A5000 instances. My first attempt at running textual inversion used Dreambooth, but it kept failing after a few hundred generations. The second attempt used the Stable Diffusion webui, but that didn’t seem to want to run on Runpod, even after jumping up to the RTX A5000. We will try again soon, as these repositories are being updated every day. Even though neither attempt worked, Runpod made it really fast to get set up and test each one.
Getting textual inversion working will be key to generating consistent characters, and its absence is one of the main reasons we’ve avoided having a recurring main character across the stories so far. It is one of the things holding us back from the kind of stories we think make the most sense.
Breakdown between the two AI Services
The breakdown between the services is as follows:
AWS
- Costs 78c / hour
- 71c / hour for AWS g4dn.xlarge
- 7c / hour for the software package
- Downtime doesn’t incur a cost
- Speed
- We got about 6s / image
- Instance Reserving
- You can turn it on and off at any time, no restrictions
- Interface
- The full desktop gave you complete control, and being able to open a browser to upload images directly from the machine was convenient
- Managing the instance was harder because it ran a full desktop PC environment
Runpod.io
- Costs 32c / hour
- Downtime costs $4/month if you want to keep your data
- Can be less if you don’t mind being interrupted, which is fine for just running images on the AI service
- Speed
- We’re now getting 2s / image
- Instance Reserving
- The specific instance you’ve been using may not always be available to turn back on
- There is usually availability on some instance, even if it isn’t the same one you used last time
- Interface
- The Jupyter notebook and terminal-based interactions made management easier than the full desktop environment
- Without a browser on the machine, uploading images requires a download and then a re-upload, but that isn’t a big deal for single-image generation tasks like the stories
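Putting the rates and speeds above together, a rough per-image and monthly cost comparison can be sketched in Python. The 100 hours per month is a hypothetical usage figure for illustration, not from our actual billing:

```python
# Rough cost comparison based on the rates and speeds listed above.
# Assumptions: hourly rates in USD, seconds-per-image as measured,
# and a hypothetical 100 hours of generation per month.

def cost_per_image(hourly_rate: float, seconds_per_image: float) -> float:
    """Cost in USD to generate a single image at a given rate and speed."""
    images_per_hour = 3600 / seconds_per_image
    return hourly_rate / images_per_hour

aws_rate, aws_speed = 0.78, 6        # $0.78/hr, ~6 s per image
runpod_rate, runpod_speed = 0.32, 2  # $0.32/hr, ~2 s per image
runpod_storage = 4.00                # $4/month to keep data while stopped

hours = 100  # hypothetical monthly usage
aws_monthly = aws_rate * hours                      # AWS: downtime is free
runpod_monthly = runpod_rate * hours + runpod_storage

print(f"AWS:    ${cost_per_image(aws_rate, aws_speed):.4f}/image, ${aws_monthly:.2f}/month")
print(f"Runpod: ${cost_per_image(runpod_rate, runpod_speed):.4f}/image, ${runpod_monthly:.2f}/month")
```

At these rates, Runpod is cheaper per image by roughly 7x, and even with the $4/month storage fee its monthly bill is lower for anything beyond about 9 hours of use per month (the break-even point of the two rates).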
Conclusion for AI Services
There are some downsides to switching to Runpod, but the benefits will outweigh them in the long run!
Related Articles
Using in-painting in Stable Diffusion to make detailed images for One Drop in the Ocean.
Using img2img in Stable Diffusion to make the header image for Little Frog, Big Dragon.
Using Stable Diffusion to make composited images for Sarah in the Secret Garden.