this post was submitted on 09 Oct 2023
411 points (96.0% liked)

Technology


AI Industry Struggles to Curb Misuse as Users Exploit Generative AI for Chaos::Artificial intelligence just can't keep up with the human desire to see boobs and 9/11 memes, no matter how strong the guardrails are.

[–] bitsplease@lemmy.ml 87 points 1 year ago (3 children)

Serious question - why should anyone care if people use AI to make 9/11 memes? With boobs I can at least see the potential argument against (deepfakes and whatnot), but bad-taste jokes?

Are these image generation companies actually concerned they'll be sued because someone used their platform to make an image in bad taste? Even if such a thing were possible, wouldn't the responsibility be on the person who made it? Or at worst the platform that distributed the images, as opposed to the one that privately made it?

[–] Fyurion@lemmy.world 75 points 1 year ago (2 children)

I don't see Adobe trying to stop people from making 9/11 memes in Photoshop, nor have they been sued over anything like that. I don't get why AI should be different. It's just a tool.

[–] bitsplease@lemmy.ml 20 points 1 year ago

That's a great analogy, wish I'd thought of it

I guess it comes down to whether the courts decide to view AI as a tool like Photoshop, or a service - like an art commission. I think it should be the former, but I wouldn't be at all surprised if the dinosaurs in the US gov think it's the latter.

[–] makyo@lemmy.world 5 points 1 year ago

The problem for Adobe is that the AI work is being done on their servers, not your computer, so it could be argued that they're liable for the generated content. 'Could' because it's far from established, but you can imagine how nervous this all must make their lawyers.

[–] kromem@lemmy.world 16 points 1 year ago (1 children)

Protect the brand. That's it.

Microsoft doesn't want non-PC stuff being associated with the Bing brand.

It's what a ton of the 'safety' alignment work is about.

This generation of models doesn't pose any actual threat of hostile actions. The "GPT-4 lied and said it was human to try to buy chemical weapons" example in the safety paper at release was comical if you read the full transcript.

But they pose a great deal of risk to brand control.

Yet apparently still not enough to justify running results through additional passes, which fixes 99% of these issues, just at 2-3x the cost.

It's part of why articles like these are ridiculous. It's broadly a solved problem; it's just that the cost/benefit of the solution isn't enough to justify it, because (a) these issues are low impact and don't really matter for 98% of the audience, and (b) the robust fix is way more costly than the low-hanging-fruit chatbot applications can justify.
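To make the "additional passes" point concrete, here's a minimal sketch of what that kind of second-pass filter could look like. The `generate()` and `violates_policy()` helpers are hypothetical stand-ins, not any vendor's actual API:

```python
# Hypothetical sketch of a "generate, then review in a second pass" loop.
# generate() and violates_policy() are stand-ins for whatever model calls
# a vendor actually makes; they are assumptions, not a real API.

def generate(prompt: str) -> str:
    """Placeholder for a call to a text/image generation model."""
    raise NotImplementedError

def violates_policy(output: str) -> bool:
    """Placeholder for a second model pass that reviews the output."""
    raise NotImplementedError

def guarded_generate(prompt: str, max_attempts: int = 3) -> str | None:
    """Generate, check the result in a second pass, retry if it gets flagged."""
    for _ in range(max_attempts):
        candidate = generate(prompt)
        if not violates_policy(candidate):  # second pass over the output
            return candidate
    return None  # each retry multiplies the cost, hence the 2-3x figure
```

Every retry is another full model call, which is exactly the cost/benefit trade-off described above.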

[–] theterrasque@infosec.pub 1 points 1 year ago

Microsoft doesn't want non-PC stuff being associated with the Bing brand.

You mean Bing, the porn Google? Yeah, that might be a tad too late.

[–] M500@lemmy.ml 2 points 1 year ago (1 children)

I’d guess that they are worried the IP owners will sue them for using their IP.

So Sonic's creators will say, your profiting by using Sonic and not paying us for the right to use him.

But I agree that deep fakes can be pretty bad.

[–] elbarto777@lemmy.world 11 points 1 year ago

your profiting

You are profiting = you're profiting.