Add ability to use local LLMs in Copilot #190067
Hi moffatted 👋 +1, this is a really thoughtful suggestion! Rate limits during heavy agentic/refactor sessions are a pain (especially mid-flow), and offloading to a local LLM for overflow or lighter tasks makes total sense for power users. Staying in the same slick Copilot UI while switching to local models would be huge: no context-switching to Continue.dev, Cline, or Ollama directly.

**Current state in 2026 (from recent updates and community feedback):** GitHub Copilot does support "bring your own model" (BYOM) / custom LLMs in VS Code, and that support is expanding to other IDEs.
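If you want to sanity-check the local side before pointing anything at it, here's a minimal Python sketch that asks a local Ollama server which models it's serving. It assumes a stock Ollama install on its default port; the model names in the comments are just examples:

```python
import requests

# List the models the local Ollama server has pulled.
# Assumes a stock Ollama install on its default port (11434).
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()

for model in resp.json().get("models", []):
    # Each entry's "name" is the tag a model picker would show,
    # e.g. "llama3.1:latest" or "deepseek-coder:6.7b".
    print(model["name"])
```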
So you're halfway there: local models are supported, but there's no seamless "switch to local when rate-limited" toggle in the UI, and it's not 100% offline/no-login. That gap is exactly why this feature request is spot-on.
**Workarounds while we wait**
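Until a built-in fallback exists, one stopgap is keeping a local Ollama model warm and sending overflow or lighter prompts to it directly. A minimal sketch, assuming Ollama's default local endpoint and a model you've already pulled (llama3.1 here is just a placeholder):

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's standard one-shot endpoint

def ask_local(prompt: str, model: str = "llama3.1") -> str:
    """Send a single prompt to a local Ollama model and return its reply."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

# Offload a lighter task while the cloud model is rate-limited.
print(ask_local("Write a docstring for a function that merges two sorted lists."))
```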
GitHub/Microsoft team: this would be a killer addition for Pro users. A simple "Fallback to local when limited" toggle, or deeper local integration (no login for local mode, full agent support), would align with privacy trends and help heavy users stay productive. Thanks for posting; upvoted and following! What local model/setup are you thinking of using (Ollama + Llama 3.1 70B? DeepSeek?)? Happy to share setup tips if needed. Keep refactoring! 🚀
👋 Welcome to the GitHub Community, @moffatted! Congratulations on your first post! 🎉 You are more likely to get a useful response if you post your question(s) in the applicable category. Check out this Guide on how to post in the right category; it'll ensure your questions and discussions reach the right people faster! Happy to have you here! 🚀
**Select Topic Area**
Question

**Body**
I'm enjoying the new agentic features of Copilot. I was in the middle of a major refactor yesterday and got rate-limited. I completely understand that this is necessary. It would be nice if Copilot would let me switch to a local LLM from within Copilot itself. That seems reasonable if you are trying to ensure fair and equitable access for everyone. I know I could switch to Cline or Continue to use my local LLMs, but I'd prefer to stay within the same user interface. I think what you guys offer for $10.00 a month is very reasonable, but offloading to local LLMs might solve your equitable-use issue when developers are getting rate-limited.
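To sketch roughly what I have in mind (purely illustrative, not Copilot's actual internals: the cloud endpoint and response shape below are made-up placeholders, while the local half uses Ollama's standard /api/chat endpoint on its default port):

```python
import requests

LOCAL_CHAT_URL = "http://localhost:11434/api/chat"       # Ollama's real chat endpoint
CLOUD_CHAT_URL = "https://example.invalid/copilot/chat"  # made-up placeholder, NOT a real Copilot API

def chat(messages: list[dict], local_model: str = "llama3.1") -> str:
    """Prefer the cloud model; fall back to a local Ollama model when rate-limited."""
    try:
        resp = requests.post(CLOUD_CHAT_URL, json={"messages": messages}, timeout=30)
        if resp.status_code == 429:
            # HTTP 429 Too Many Requests: the overflow case this request is about.
            raise RuntimeError("rate limited")
        resp.raise_for_status()
        # Response shape here is OpenAI-style and purely illustrative.
        return resp.json()["choices"][0]["message"]["content"]
    except Exception:
        resp = requests.post(
            LOCAL_CHAT_URL,
            json={"model": local_model, "messages": messages, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["message"]["content"]

print(chat([{"role": "user", "content": "Suggest refactoring steps for a 500-line module."}]))
```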