Honestly, all of the generative AI subscriptions are pretty fucking steep at this point compared to just running a model locally.
I agree with this. I'm using a 1070 Ti for image gen and it would be more than capable of handling some LLM stuff. An AMD 7700 XT, I've found, does well with 7B models on my main rig, but I'm sure you could get away with something cheaper or less powerful.
That said, the amount of text you can generate, or the context length of its answers, will depend on the model you use, and the larger the model, the more power it takes.
If you're just messing around with it or want it to review or answer small questions, I'd say a 1070 Ti like I'm using would be just fine. Some folks use even more budget-friendly options. If you've got a gaming machine with any semi-recent GPU, I'd say go for it. Worst case, you can pay for a subscription later if you really want.
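A rough way to see why 7B models fit on mid-range cards like these is the usual back-of-the-envelope VRAM estimate. This is just a sketch: the 15% overhead figure is an assumption, and real usage varies with context length and runtime.

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Rule of thumb (approximate): weights take
# (parameters * bits_per_weight / 8) bytes, plus some overhead
# for the KV cache and activations at modest context lengths.

def vram_estimate_gb(params_billion: float,
                     bits_per_weight: int,
                     overhead: float = 0.15) -> float:
    """Estimated VRAM in GB; `overhead` is an assumed fudge factor."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 7B model at 4-bit quantization fits comfortably in 8 GB of VRAM:
print(round(vram_estimate_gb(7, 4), 1))   # ~4.0 GB
# The same model at full 16-bit would blow past a 1070 Ti's 8 GB:
print(round(vram_estimate_gb(7, 16), 1))  # ~16.1 GB
```

That's why quantized 7B models are the sweet spot for 8 GB cards, while bigger models need either heavier quantization or more VRAM.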
Download gpt4all and you can get an open-source model that performs basically as well as any of the paid ones.
Thanks, I've done just that and installed it too! What's the best gpt4all LLM model or the model you'd recommend?
Llama is a solid choice. Or mistral. I use moistral which was made for porn but it’s pretty uncensored in general. Doesn’t have qualms about ethics or illegalities.
Does it mean Llama does have that? And how does that affect the performance? I mean the thing about "no qualms about ethics"
I’m sure there’s an uncensored llama somewhere but the ones I’ve tried weren’t truly uncensored.
In terms of performance what it just means is that if I ask it something mildly sexual or inappropriate, it will answer it without giving an “as an ai language model, I can’t do…” speech.
Don't get ChatGPT Plus; just get an API token and use one of the desktop apps/CLIs. It's pay-as-you-go and way cheaper, unless you're using GPT-4 all day every day or something.
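The "way cheaper" claim is easy to sanity-check with some quick arithmetic. The per-1k-token prices below are illustrative placeholders, not actual OpenAI pricing (check the official pricing page before relying on this):

```python
# Sketch of the pay-as-you-go vs. subscription math.
# Prices here are hypothetical, NOT current OpenAI rates.

def monthly_api_cost(prompts_per_day: int,
                     tokens_in: int, tokens_out: int,
                     price_in_per_1k: float, price_out_per_1k: float,
                     days: int = 30) -> float:
    """Estimated monthly cost in dollars for pay-as-you-go API use."""
    per_prompt = (tokens_in / 1000) * price_in_per_1k \
               + (tokens_out / 1000) * price_out_per_1k
    return per_prompt * prompts_per_day * days

# Light use: 10 prompts a day, ~500 tokens in / 500 out, at an
# assumed $0.01 in / $0.03 out per 1k tokens:
cost = monthly_api_cost(10, 500, 500, 0.01, 0.03)
print(f"${cost:.2f}/month")  # $6.00/month, well under a $20 subscription
```

At light-to-moderate usage the API comes out well under a flat monthly subscription; the break-even only flips if you're hammering the biggest models constantly.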
Do you have an example for a desktop app that would use these tokens?
I don't, you'd have to have a Google
I use gpt-cli which is pretty good if you're ok with using a terminal https://github.com/kharvd/gpt-cli
Claude.ai is quite a bit superior to GPT in my experience. That one, I pay for, and it seems like it's worth it.
Thanks, but why would you say it's superior to GPT o1?
I haven't played around with GPT o1; I just checked, and I don't have access. I'm not saying it's necessarily bad without having experienced it. But OpenAI has been getting steadily worse for a while, so I'm assuming that the stuff I've interacted with is indicative of the quality of the new stuff. It's all of a piece.
I canceled my ChatGPT subscription a month or two ago. It just got completely unreliable. Like someone else said, Claude is way better but they’re both disappointing at this point. I only subscribed to Claude like last week to help solve an incredibly last minute thing. Not sure I’m going to stay subscribed.
I've run some local LLMs (3060, 12 GB VRAM), and I generate images locally daily (wouldn't pay for that), but I do pay for a ChatGPT subscription. I think it's worth it for my purposes. Responses are way faster and higher quality than any local model I've tried, plus web search integration, image recognition, and a seamless mobile app; I use all of those features regularly. Unfortunately I've never used Poe, so I can't compare, sorry.
no
Maybe check out Kagi's Ultimate tier. They let you swap between some of the different options to see which you might find useful. As a bonus, you also get Kagi search, which can be useful.
FWIW I only ever used those services if they accepted a prepaid credit card. OpenAI didn't accept prepaid cards when I tried, not sure about Poe. Just something to think about.