this post was submitted on 26 Feb 2024
55 points (85.7% liked)


AI ‘dream girls’ are coming for porn stars’ jobs

top 30 comments
[–] Vanth@reddthat.com 49 points 7 months ago (6 children)

It’s unclear who will benefit.

Lol, companies that start running the girlfriend AIs, duh. And then the Russians/others using the AIs to scrape user data.

Can't wait for the subset of self-host hobbyists to start posting about their self-hosted AI partners.

[–] Unforeseen@sh.itjust.works 9 points 6 months ago (2 children)

Can't wait for the subset of self-host hobbyists to start posting about their self-hosted AI partners.

They could even arrange meetups like double dates and parties and such. The future is gonna be so chaotic. I love it.

[–] Vanth@reddthat.com 7 points 6 months ago (3 children)

Aye. And for every person who'd use it to avoid IRL interactions, just as many could add a few therapy skills to the AI and tell it to nudge the user towards IRL interaction.

A gf AI that encourages you to go out and join a social bowling league or something? That could be really good for some people. No VR boobies unless you got some sunlight today, babe.

[–] rottingleaf@lemmy.zip 3 points 6 months ago

Some would do that. Some wouldn't. Some would nudge in the direction of becoming even more fscked up.

Some moral limitations we take for granted in real people (often imagined even there) don't exist for computer programs at all.

[–] retrieval4558@mander.xyz 2 points 6 months ago

If the algorithms of social media are any indication, it's more likely they'd be programmed to manipulate the user into spending as much time with the AI as possible, while the AI serves them ads.

[–] BearOfaTime@lemm.ee 1 points 6 months ago

No VR boobies unless you got some sunlight today, babe

Hahahaha

I like how you think!

[–] rottingleaf@lemmy.zip 2 points 6 months ago (2 children)

Until those partners have bodies ...

[–] foggy@lemmy.world 1 points 6 months ago* (last edited 6 months ago)

A lot of top minds are saying 2024 will be a big year for robotics ಠ⁠_⁠ಠ

[–] Kusimulkku@lemm.ee 1 points 6 months ago

What would be the issue with that?

[–] MyPornViewingAccount@lemmy.world 7 points 6 months ago (2 children)

Big privacy advocate, so I was curious what it takes to self-host something like that; really I just want a very flexible personal assistant for product and weather alerts, all in one.

Takes a lot of RAM and GPU power, more than I have sitting around.

[–] Vanth@reddthat.com 8 points 6 months ago

Which means the push for optimization will be super interesting. Once again, porn drives technological advancements.

[–] Womble@lemmy.world 2 points 6 months ago

Have you been looking at quantised models? You can get pretty good ones at the 20 gig RAM+VRAM level, which is very reasonable if you have a gaming PC and are OK with responses not being instant.
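As a rough sanity check on that figure: a quantised model's weight footprint is about parameters × bits-per-weight ÷ 8, plus overhead for the KV cache and buffers. A minimal back-of-the-envelope sketch (the bits-per-weight values here only approximate common GGUF quant levels; exact file sizes vary by format):

```python
# Rough memory-footprint estimate for quantised LLM weights.
# Bits-per-weight figures approximate common GGUF quant levels;
# real files add overhead (quant scales, KV cache, buffers).

def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight storage in GiB, ignoring cache/overhead."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

for params, label in [(7, "7B"), (13, "13B"), (33, "33B"), (70, "70B")]:
    q4 = model_size_gb(params, 4.5)   # roughly Q4_K_M
    q8 = model_size_gb(params, 8.5)   # roughly Q8_0
    print(f"{label}: ~{q4:.1f} GiB at Q4, ~{q8:.1f} GiB at Q8")
```

A ~33B model at ~4.5 bits/weight comes out around 17 GiB, which lines up with "pretty good ones at the 20 gig RAM+VRAM level" once you add cache overhead.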

[–] Kusimulkku@lemm.ee 5 points 6 months ago

Already a thing on /g/

[–] tiredofsametab@kbin.run 3 points 6 months ago

Not just scraping data, but spreading disinfo and radicalizing people. There was one case I saw about that recently. Not just AI lovers, but potentially a number of AI applications.

[–] hansl@lemmy.world 3 points 6 months ago (1 children)

Neural engines are coming to basically all CPUs. It won't be long before you can run your own girlfriend offline on your phone. Training the model is the expensive part, after all. I can already run a basic Llama 2B on my iPad, though by sideloading the software instead of just downloading it off the App Store.

I’m fairly sure anyone with a good GPU can also run these, but I haven’t tried.

[–] SniffDoctor@lemmy.ml 1 points 6 months ago* (last edited 6 months ago)

Yes. The Llama 70B derived models, as well as Mixtral 8x7B and the new Mistral Medium 70B, are competitive with ChatGPT 3.5. Most of them can handle a 16,000-token context, similar to ChatGPT, as well.

You only NEED 40GB of free RAM to run them at decent quality, but it's slow.

With a 24GB GPU like a 3090 or 4090 you can run them at a reasonable speed with partial GPU offload. About 1-2 words per second. I run 70Bs in this manner on my computer.

With two 24GB GPUs you can run them very fast, like ChatGPT.
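Partial offload works by putting as many transformer layers as will fit into VRAM and running the remainder on the CPU. A toy sketch of that split (the per-layer size and layer count are illustrative round numbers, not exact figures for any particular quant):

```python
# Toy sketch of partial GPU offload: assign whole layers to VRAM
# until the budget runs out; the rest execute on the CPU.
# Numbers are illustrative; real layer sizes depend on the quant format.

def split_layers(n_layers: int, layer_gb: float, vram_gb: float) -> tuple[int, int]:
    """Return (gpu_layers, cpu_layers) for a given VRAM budget."""
    gpu = min(n_layers, int(vram_gb // layer_gb))
    return gpu, n_layers - gpu

# A 70B model has 80 layers; at ~0.5 GiB per layer (Q4-ish), a single
# 24 GiB card with ~1 GiB of headroom holds a bit over half of them.
gpu, cpu = split_layers(80, 0.5, 23.0)
print(f"GPU layers: {gpu}, CPU layers: {cpu}")
```

Every layer left on the CPU slows generation, which is why one 24 GB card gives you 1-2 words per second while two cards (the whole model in VRAM) run at ChatGPT-like speed.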


There's of course a whole world in between as well, but those are the rough hardware requirements to match ChatGPT in a self-hosted sort of way. There's also a new thing people are doing where they add layers from one model onto another one, like a merge but keeping >50% of the original layers from each model. "Goliath 120B" and the like. They're even better but it's a bit beyond reasonable consumer hardware now.
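The layer-stacking trick described above can be sketched as interleaving layer ranges from two same-architecture donors, so the parameter count scales with the total layers kept. A rough illustration (the 80-layer/70B figures match Llama-2-70B, but the per-layer arithmetic is deliberately simplified and ignores embeddings):

```python
# Rough illustration of a "frankenmerge": stacking layer ranges from
# two same-architecture models into one deeper model. Simplified:
# treats all parameters as living in the repeated layers.

def merged_params_billions(total_layers: int, per_layer_b: float) -> float:
    """Approximate parameter count of a layer-stacked merge."""
    return total_layers * per_layer_b

PER_LAYER_B = 70 / 80  # ~0.875B params per Llama-2-70B layer (approx.)

# Goliath-120B-style merges keep ~137 layers across the two 80-layer
# donors, which lands right around the advertised 120B:
print(f"~{merged_params_billions(137, PER_LAYER_B):.0f}B parameters")
```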

[–] jqubed@lemmy.world 1 points 6 months ago (1 children)
[–] HelloHotel@lemm.ee 0 points 6 months ago* (last edited 6 months ago)

Pasterama. Don't date Fumos that are also Tulpas, and if you're cold, they're cold: put them behind 3 secret walls that are on fire, under the sea.