Self Hosting an AI Junior Developer
Lucas Jaggernauth - Aug 30th
We moved Sweep to an open core model! We also changed Sweep’s license from CC-NC to ELv2. This will allow you to host Sweep in your own VPC and use either open source models like WizardCoder or models from large providers like ChatGPT Enterprise. All of our work until now will be Elastic Licensed, and a subset of future work around integrations, code validation, and improved performance will be closed source.
CC-NC allows you to use Sweep, but not for commercial purposes. This was preventing many of our users from self-hosting Sweep, so we changed the license to Elastic License v2, which allows you to use Sweep commercially. Previously, we ran entirely on Modal (which is great), but it tied every self-hosted deployment to Modal. We moved everything to Docker for simplicity.
We had a couple of concerns which led us to our previous license, so we wanted to share our thoughts on why we changed it.
Our main worry was that someone would copy Sweep and outcompete us. We’re not concerned about this anymore; we’ve learned how much difficulty there is in doing this well. If someone is able to copy Sweep and build faster than us, they likely don’t need Sweep to do that.
With the commoditization of open source LLMs that can program well, it’s now viable to get above-GPT-3.5-level quality with a 32k-token context window.
It took us about 24 hours to set up a working version on FastAPI + Docker. We’re also going to provide instructions for using Sweep with long-context OSS models like CodeLlama/WizardCoder.
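For readers curious what a FastAPI + Docker setup looks like in general, here is a minimal sketch. The file names, module path, port, and image tag below are hypothetical examples, not Sweep’s actual layout:

```dockerfile
# Hypothetical Dockerfile for a FastAPI service (illustrative only)
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker caches this layer across rebuilds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code (assumes a FastAPI app object in api.py)
COPY . .

# Serve the app with uvicorn, listening on all interfaces
CMD ["uvicorn", "api:app", "--host", "0.0.0.0", "--port", "8080"]
```

You would then build and run it with something like `docker build -t sweep-selfhost .` followed by `docker run -p 8080:8080 sweep-selfhost` (image name again hypothetical). Keeping everything behind a single container image is what lets the same setup run in any VPC rather than on one specific platform.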
We’ll be closed-sourcing a minority of our features in the future. They will remain available through our hosted solution (which won’t be changing).
Thanks again for all of your support. If you have any questions, please click the Intercom button on the bottom right of the page to chat with us directly. If it’s between 9 am and 12 pm PST, we’ll respond within 5 minutes.
Alternatively, we're available on Discord at https://discord.gg/sweep.