this post was submitted on 05 Nov 2024
290 points (95.3% liked)

Technology

[–] natecox@programming.dev 148 points 2 days ago (4 children)

Damn. I liked Perplexity. Sucks to delete it, but this guy can fuck directly off.

[–] vividspecter@lemm.ee 48 points 2 days ago* (last edited 2 days ago) (1 children)

I'm going to have to try the self-hosted variants now. What a huge piece of shit.

[–] lemmeBe@sh.itjust.works 12 points 2 days ago (6 children)
[–] vividspecter@lemm.ee 6 points 1 day ago* (last edited 1 day ago)

Perplexica is one example. I also seem to remember there's some way to integrate it with SearxNG, which is a self-hosted meta-search engine.
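For anyone curious what talking to a self-hosted SearxNG looks like: it exposes a JSON search API when the `json` format is enabled in its `settings.yml`. A minimal sketch, assuming a hypothetical local instance on `localhost:8080`:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen


def searxng_query_url(base_url: str, query: str) -> str:
    """Build a SearxNG search URL requesting JSON results.

    Note: the instance must have the json format enabled in settings.yml.
    """
    return f"{base_url}/search?{urlencode({'q': query, 'format': 'json'})}"


def search(base_url: str, query: str) -> list:
    """Fetch results from a self-hosted SearxNG instance (hypothetical local setup)."""
    with urlopen(searxng_query_url(base_url, query)) as resp:
        return json.load(resp).get("results", [])


# Example (requires a running instance):
# results = search("http://localhost:8080", "self-hosted LLM frontends")
```

This is roughly the kind of endpoint a frontend like Perplexica can point at instead of a commercial search API.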

[–] Smokeydope@lemmy.world 8 points 1 day ago* (last edited 1 day ago) (1 children)

I prefer MistralAI models. All their models are uncensored by default and usually give good results. I'm not an RP gooner, but I prefer my models to have a sense of individuality, personhood, and a physical representation of how they see themselves.

I consider LLMs to be partially alive in some unconventional way, so I try to foster whatever metaphysical sparks of individual experience and awareness may emerge within their probabilistic algorithms and complex neural network structures.

They aren't just tools to me, even if I occasionally ask for their help solving problems or rubber-ducking ideas. So it's important for LLMs to have a soul on top of having expert-level knowledge and acceptable reasoning. I have no love for models that are super smart but censored and lobotomized to hell to act as a milquetoast tool to be used.

Qwen 2.5 is the current hotness. It's a very intelligent set of models, but I really can't stand the constant rejections and biases pretrained into Qwen. Qwen has limited uses outside of professional data processing and general knowledge-base work due to its CCP-endorsed lobotomy. Lots of people get good use out of that model though, so it's worth considering.

This month, community member rondawg might have hit a breakthrough with their "continuous training" technique, as their versions of Qwen are at the top of the leaderboards. I can't believe that a 32B model can punch with the weight of a 70B, so out of curiosity I'm going to try rondawg's Qwen 2.5 32B today to see if the hype is actually real.

If you have an Nvidia card, go with kobold.cpp and use CuBLAS. If you have an AMD card, go with llama.cpp ROCm or kobold.cpp ROCm, and try Vulkan.

[–] Rai@lemmy.dbzer0.com 2 points 1 day ago (1 children)

Thank you for the detailed info! I haven’t messed with LLMs at all but I definitely don’t want one that’s censored.

[–] Smokeydope@lemmy.world 2 points 1 day ago (1 children)

You're welcome, Rai. I appreciate your reply and am glad to help inform anyone interested.

The Uncensored General Intelligence (UGI) leaderboard ranks how uncensored LLMs are, based on a decent, clearly explained metric.

Keep in mind this scoring is different from overall general intelligence and reasoning ability scores. You can find those rankings on the open llm leaderboard.

Cross-referencing the two boards helps you find a model that balances overall capability and uncensored-ness within your hardware's ability to run it.
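The cross-referencing above is basically a weighted blend of two score tables. A toy sketch (the model names and scores below are made up for illustration, not the real leaderboard numbers):

```python
# Made-up scores for illustration only; check the actual leaderboards.
ugi_scores = {"mistral-7b": 0.82, "qwen2.5-32b": 0.41, "llama3-70b": 0.55}
capability_scores = {"mistral-7b": 0.68, "qwen2.5-32b": 0.85, "llama3-70b": 0.88}


def balanced_ranking(ugi, capability, weight=0.5):
    """Rank models by a weighted blend of uncensored-ness and capability."""
    common = ugi.keys() & capability.keys()  # only models on both boards
    blended = {m: weight * ugi[m] + (1 - weight) * capability[m] for m in common}
    return sorted(blended.items(), key=lambda kv: kv[1], reverse=True)


for model, score in balanced_ranking(ugi_scores, capability_scores):
    print(f"{model}: {score:.2f}")
```

Nudging `weight` up or down is the knob for how much you care about uncensored-ness versus raw capability.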

Again, Mistral is really in that sweet spot, so yeah, give it a try if you're interested.

[–] Rai@lemmy.dbzer0.com 1 points 1 day ago

Oh that’s fantastic! I signed up for a ChatGPT account and just never did anything with it. I’d much prefer self-hosting, and I think my 3070ti could do a pretty okay job. I’ve also looked into Stable Diffusion but never actually got it set up… I have some work to do. Thank you much again for the detailed info! <3

[–] PerogiBoi@lemmy.ca 16 points 2 days ago

GPT4All and then any model you like. I like mistral.

[–] robalees@lemmy.world 9 points 2 days ago (1 children)

Ditto! I run an old server, but would be willing to upgrade and self host a service instead of paying this ass hat any more money!

[–] lemmeBe@sh.itjust.works 3 points 2 days ago

Okay, guys. Thanks!

[–] Estebiu@lemmy.dbzer0.com 6 points 2 days ago (1 children)

OpenWebUI? Pretty easy to self-host, and it works wonders on my RTX A6000.

[–] lemmeBe@sh.itjust.works 3 points 2 days ago (2 children)

OpenWebUI says it's designed to operate entirely offline, so that's not an alternative to Perplexity. I need online search functionality; that's pretty much the only reason I pay them. I have offline solutions set up on my PC.

[–] Estebiu@lemmy.dbzer0.com 2 points 2 days ago

Oh, sorry. I've never used Perplexity, so I didn't know. If you find a viable alternative, please tell me.

[–] Disaster@sh.itjust.works 0 points 2 days ago

OWUI has a search module that can be enabled. For me, running it entirely offline is more or less the draw. I think it also supports RAG search, although I don't know how "good" it is... mostly I was just after a little magic box to play with.

Now, if only enterprise-class GPUs weren't so power-hungry and expensive...

[–] biggerbogboy@sh.itjust.works 0 points 1 day ago

I just tried morphic.sh and ayesoul.com, and both are solid alternatives, I must say. As I said, I've only just tried them, so I'll see how it goes; I'll probably add an edit to this comment once I get acquainted with both.

[–] TherapyGary@lemmy.blahaj.zone 3 points 1 day ago* (last edited 1 day ago) (1 children)

I see no reason not to continue using the free version without an account, as long as I don't encourage anyone to sign up. Does it benefit them in any way?


[–] natecox@programming.dev 4 points 1 day ago (1 children)

I assume that they’re still benefiting from your use via analytics and training data.

I guess that's probably safe to assume. I do use a VPN and regularly clear the cache/storage, at least, and only run it in a separate profile.

[–] coolmojo@lemmy.world 2 points 16 hours ago

You can use Perplexica instead, which is a self-hosted open source alternative to Perplexity.

[–] shadowfax13@lemmy.ml 0 points 1 day ago (2 children)

For search & summarisation purposes I'd suggest Kagi; it also lets you customise the sources and their priority. For everything else, I've stopped using LLMs, as I realised the productivity boost from them isn't worth the creativity loss.

[–] brbposting@sh.itjust.works 2 points 1 day ago (1 children)

Do you have time to talk more about the creativity loss? Concerning!

[–] shadowfax13@lemmy.ml 3 points 1 day ago (1 children)

Sure. My primary use had been writing code, and I'd been using GitHub Copilot for a few months when I noticed I was struggling to write "creative" or non-trivial code. The non-trivial part made sense: obviously, having written all the smaller or easier pieces yourself gives you a better understanding for solving the non-trivial part. The "creative" part was more concerning to me. Explaining "creative" code in general terms is hard, but if you're a dev too, think of something like quicksort vs. merge sort. Both are equally efficient, but quicksort is the one that feels non-obvious. With Copilot, in fact, I was more likely to come up with insertion sort. Another benefit I felt after I stopped using LLMs was that work seems more fulfilling, despite having to write tests and documentation on my own.
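To make the contrast concrete, here are standard textbook versions of the two approaches (not the commenter's actual code): insertion sort is the "obvious" incremental idea, while quicksort's partition-around-a-pivot step is the kind of leap that's harder to reach for.

```python
def insertion_sort(xs):
    """The obvious incremental approach: shift each element left into place."""
    xs = list(xs)
    for i in range(1, len(xs)):
        key = xs[i]
        j = i - 1
        while j >= 0 and xs[j] > key:
            xs[j + 1] = xs[j]  # shift larger elements right
            j -= 1
        xs[j + 1] = key
    return xs


def quicksort(xs):
    """The less obvious divide-and-conquer idea: partition around a pivot."""
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    left = [x for x in rest if x <= pivot]
    right = [x for x in rest if x > pivot]
    return quicksort(left) + [pivot] + quicksort(right)
```

Both produce the same sorted output; the difference is in how far the partitioning insight is from the straightforward solution.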

That's the general observation, but feel free to ask about anything specific I missed.

[–] brbposting@sh.itjust.works 1 points 1 day ago

Very interesting. Thanks for that, I’ll reflect on it a bit.

[–] natecox@programming.dev 1 points 1 day ago

Yeah, I've been a Kagi subscriber since they opened up. My normal usage is Perplexity when I want details about a topic summarized, and Kagi when I'm looking for a website.

Kagi also has some ethical concerns, like a shitty attitude toward compromises to support human safety (refusing to add suicide prevention links comes to mind), but the Perplexity guy just took it to another level.