this post was submitted on 17 Jun 2023

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.


Gorilla is an LLM that can learn to use APIs, and I'd like to try getting it to use some of the APIs I work with.

There's a GGML version here and the original repo is here. They have instructions for adding an API, but I don't understand them well enough to add a generic one.

It looks really good, which is why I'm excited about it! If I understand it correctly, it should be possible to point it at generic APIs like this — a rough sketch of what I mean is below.
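Here's roughly how I imagine running it locally — a minimal sketch assuming the GGML build loads with llama-cpp-python. The model filename and the prompt wording are placeholders of mine; the Gorilla repo documents the exact prompt template it expects, and newer llama.cpp builds want GGUF rather than GGML, so adjust accordingly.

```python
# Minimal sketch: load a local GGML build of Gorilla with llama-cpp-python and
# ask it to produce an API call. The model filename and the prompt wording are
# placeholders -- check the upstream Gorilla repo for the real prompt template.
from llama_cpp import Llama

llm = Llama(
    model_path="./gorilla-7b.ggmlv3.q4_0.bin",  # hypothetical local GGML file
    n_ctx=2048,                                 # context window
)

prompt = "I want to translate a sentence from English to German. Which API call should I use?"

result = llm(
    prompt,
    max_tokens=256,
    temperature=0.1,  # keep the output close to deterministic for structured calls
    stop=["</s>"],
)

print(result["choices"][0]["text"])
```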

top 1 comment
npetecca@mastodon.social · 1 point · 1 year ago

@flamdragparadiddle From what I can figure out, it'll run just like any other GGML LLM; it's just trained to produce correctly formatted API calls. Are you running any models currently? If not, I'd recommend oobabooga's text-generation-webui (mostly because that's what I got up and running to get started): https://github.com/oobabooga/text-generation-webui. Let me know if you have any trouble! Once you have it running you can enable an API to call the model programmatically, and if you don't need an API there's a UI.
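For example, once it's started with the --api flag, calling the model from a script looks roughly like this — I'm assuming the OpenAI-compatible completions endpoint on port 5000 here, which is what recent versions expose; older builds used a different /api/v1/generate route, so adjust for whatever your version provides.

```python
# Rough sketch of calling text-generation-webui programmatically, assuming it
# was started with the --api flag and exposes the OpenAI-compatible endpoint
# on port 5000 (older builds used /api/v1/generate instead).
import json
import urllib.request

url = "http://127.0.0.1:5000/v1/completions"
payload = {
    "prompt": "Which API should I call to translate English text to German?",
    "max_tokens": 256,
    "temperature": 0.1,
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# The OpenAI-style response puts the generated text under choices[0].text
print(body["choices"][0]["text"])
```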