Bret and Nirmal are joined by Continue.dev co-founder, Nate Sesti, to walk through an open source replacement for GitHub Copilot.

Continue lets you use a mix of open source and closed source LLMs in the JetBrains and VS Code IDEs, adding AI to your coding workflow without leaving the editor.

You've probably heard about GitHub Copilot and other AI code assistants. The Continue team has created a completely open source alternative, or maybe a superset, of these existing tools: it's highly configurable and lets you choose multiple models for code completion and chat in VS Code and JetBrains, with support for more editors on the way.

This show builds on our recent Ollama episode. Continue can use Ollama in the background to run a local LLM for you, if you'd rather keep your code on your own machine than send it to hosted LLMs.
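If you want to try the local setup described above, here's a minimal sketch of what a Continue configuration might look like. The file location (typically ~/.continue/config.json) and the model names (llama3, starcoder2:3b) are assumptions based on a common Ollama install, so adjust them to whatever models you've pulled.

```json
{
  "models": [
    {
      "title": "Llama 3 (local chat via Ollama)",
      "provider": "ollama",
      "model": "llama3"
    }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2 3B (local autocomplete)",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }
}
```

With something like this, both chat and tab autocompletion stay on your machine through Ollama; pointing at a hosted model is just another entry in the models list.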

Be sure to check out the live recording of the complete show from May 16, 2024 on YouTube (Ep. 266), which includes the demos.

★Topics★
Continue.dev Website

Creators & Guests


Cristi Cotovan - Editor
Beth Fisher - Producer
Bret Fisher - Host
Nirmal Mehta - Host
Nate Sesti - Guest

(00:00) - Introduction
(01:52) - Meet Nate Sesti, CTO of Continue
(02:40) - Birth and Evolution of Continue
(03:56) - Continue's Features and Benefits
(22:24) - Running Multiple Models in Parallel
(26:38) - Best Hardware for Continue
(32:45) - Other Advantages of Continue
(36:08) - Getting Started with Continue


You can also support my free material by subscribing to my YouTube channel and my weekly newsletter at bret.news!

Grab the best coupons for my Docker and Kubernetes courses.
Join my cloud native DevOps community on Discord.
Grab some merch at Bret's Loot Box
Homepage bretfisher.com