Can I point it at a local endpoint, or do they want to force me to send all my code to their servers?
Run Copilot’s proprietary model locally? You’re dreaming. But you can do this with Ollama, and they aren’t forcing you. There are many local models that work pretty well.
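For anyone wondering what "pointing at a local endpoint" looks like in practice: Ollama serves an HTTP API on localhost (port 11434 by default), so completions can be requested without any code leaving the machine. A minimal sketch, assuming Ollama is running and a code model such as `codellama` has already been pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing leaves the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def complete(prompt: str, model: str = "codellama") -> str:
    """Request a single (non-streaming) completion from a local Ollama model."""
    payload = json.dumps({
        "model": model,   # assumes the model was fetched via `ollama pull codellama`
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(complete("# Python function that reverses a string\n"))
```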
No, I mean I assume they’re shipping a VS Code extension by default. I was wondering if said extension lets me point it at said locally run model.
They aren't. Copilot is not a built-in extension. Can't say much about future plans, though.
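Even without a built-in extension, most editor plugins that speak the OpenAI API can be aimed at Ollama instead, since it also exposes an OpenAI-compatible endpoint under `/v1`. A sketch using the `openai` Python client; the base URL and model name here are assumptions about a default local setup:

```python
from openai import OpenAI

# Point an OpenAI-compatible client at the local Ollama server instead of a remote API.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # required by the client but ignored by Ollama
)

resp = client.chat.completions.create(
    model="codellama",  # assumed local model; any pulled model name works
    messages=[{"role": "user", "content": "Write a function that reverses a string."}],
)
print(resp.choices[0].message.content)
```

The same trick applies to any extension that lets you override the API base URL in its settings.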
I used Ollama locally and it worked decently well. Code suggestions were fast and relatively accurate (as far as an LLM goes). The real issue was the battery hit. Oh man, it HALVED my battery life, which is already short enough when running a server locally.