@flamdragparadiddle from what I can figure out, it'll run just like any other ggml LLM; it's just trained to make correctly formatted API calls. Are you running any models currently? If not, I'd recommend oobabooga's text-generation-webui (mostly because that's what I got up and running when I started): https://github.com/oobabooga/text-generation-webui. Let me know if you have any trouble! Once it's running you can enable an API to use it programmatically, and if you don't need to call it programmatically there's a web UI.
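
For reference, here's a minimal sketch of hitting the API from Python once it's enabled (launch the webui with the `--api` flag). The port and endpoint below are assumptions based on the OpenAI-compatible server recent builds expose on 127.0.0.1:5000; adjust them if your version differs.

```python
# Minimal sketch: call text-generation-webui's API from Python.
# Assumes the webui was launched with --api and serves an
# OpenAI-compatible endpoint at http://127.0.0.1:5000 (adjust if yours differs).
import requests

url = "http://127.0.0.1:5000/v1/chat/completions"
payload = {
    "messages": [
        {"role": "user", "content": "Get the current weather for San Francisco as an API call."}
    ],
    "max_tokens": 200,
    "temperature": 0.7,
}

response = requests.post(url, json=payload, timeout=120)
response.raise_for_status()

# The generated text lives in the first choice's message content.
print(response.json()["choices"][0]["message"]["content"])
```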