Getting bolt.diy running on a Coolify-managed server

Date: 02/14/2025

Watch the Video

Okay, this video is about using Bolt.diy, an open-source project from StackBlitz, combined with Coolify, to self-host an AI coding solution, specifically running GPT-4o (and its mini variant). It’s a practical exploration of how you can ditch relying solely on hosted AI services (like Bolt.new) and instead roll your own solution on a VPS. The author even provides a `docker-compose` file to make deployment on Coolify super easy – a big win for automation!
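For orientation, here’s a minimal sketch of what a Coolify-friendly `docker-compose` file for bolt.diy might look like. It is not the author’s actual file from the video; the build target, port, and environment variable name are assumptions based on the upstream bolt.diy repo, so check them against your clone before deploying.

```yaml
services:
  bolt-diy:
    # Build from a local clone of the bolt.diy repo, which ships its own
    # multi-stage Dockerfile. The target name is an assumption; verify it
    # against the Dockerfile in your checkout.
    build:
      context: .
      target: bolt-ai-production
    ports:
      # 5173 is the port the upstream setup exposes; adjust if your build
      # serves on a different one.
      - "5173:5173"
    environment:
      # Provider key for the OpenAI models (gpt-4o / gpt-4o-mini).
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    restart: unless-stopped
```

Pointing Coolify at a repo containing a compose file like this is what makes the deployment largely hands-off: Coolify handles the build, the proxying, and restarts for you.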

For a developer like me, knee-deep in AI-assisted development, this is gold. We’re constantly balancing the power of LLMs against cost and control. The video provides a concrete example, complete with price comparisons, showing where self-hosting can save you a ton of money, especially when using a smaller model like `gpt-4o-mini`. Even with the full `gpt-4o` model, the savings can be significant. But it’s also honest about the challenges, mentioning potential issues like “esbuild errors” that can arise. It highlights the pragmatic nature of AI integration: it’s not perfect, but it’s iterative.
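To make the cost comparison concrete: when you self-host bolt.diy, you pay the provider’s raw per-token API rates instead of a hosted product’s pricing, so the cost of a request is just input tokens times the input rate plus output tokens times the output rate. The sketch below shows that arithmetic; the rate values are placeholders to replace with the provider’s current pricing, not figures taken from the video.

```typescript
// Per-million-token rates in USD. Placeholder values: look up the current
// prices for gpt-4o and gpt-4o-mini before relying on these numbers.
interface Rates {
  inputPerMTok: number;
  outputPerMTok: number;
}

// Cost of a single request given its token counts and a model's rate card.
function requestCost(inputTokens: number, outputTokens: number, rates: Rates): number {
  return (
    (inputTokens / 1_000_000) * rates.inputPerMTok +
    (outputTokens / 1_000_000) * rates.outputPerMTok
  );
}

// Example: a 3,000-token prompt with a 1,000-token completion, compared
// across two hypothetical rate cards.
const fullModel: Rates = { inputPerMTok: 2.5, outputPerMTok: 10 };   // placeholder
const miniModel: Rates = { inputPerMTok: 0.15, outputPerMTok: 0.6 }; // placeholder

console.log("gpt-4o:      $" + requestCost(3000, 1000, fullModel).toFixed(4));
console.log("gpt-4o-mini: $" + requestCost(3000, 1000, miniModel).toFixed(4));
```

Multiply that per-request figure by how many prompts your team actually sends in a day and the gap between the full model and the mini variant becomes obvious, which is exactly the trade-off the video walks through.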

Imagine using this setup to power an internal code generation tool for your team or automating repetitive tasks in your CI/CD pipeline. This isn’t just about saving money; it’s about having more control over your data and model access. The fact that it’s open-source means you can tweak and optimize it for your specific needs. Honestly, the potential to create customized, cost-effective AI workflows makes it absolutely worth experimenting with. I’m already thinking about how to integrate this with my Laravel projects!