this post was submitted on 22 Jan 2024
237 points (97.6% liked)

Technology


Alternative link: https://archive.is/qgEzK

[–] Jaysyn@kbin.social 75 points 9 months ago (6 children)

Surprise, that's completely unenforceable.

Yet more out-of-touch legislators meddling with things they can't even begin to understand.

(And I'm not shilling for fucking AI here, but let's call a spade a spade.)

[–] Max_P@lemmy.max-p.me 18 points 9 months ago (3 children)

What baffles me is that those lawmakers think they can just legislate any problem away.

So okay, California requires it. None of the other states do. None of the rest of the Internet does. It doesn't fix anything.

They act like the Internet is like cable, and it's all American companies that "provide" services to end users.

[–] PM_Your_Nudes_Please@lemmy.world 7 points 9 months ago (1 children)

Inb4 AI devs just slap a generic “click this box to confirm you are not in California” verification on their shit.

[–] sorghum@sh.itjust.works 1 points 9 months ago

If the server isn't even in California, would it even apply to them or be enforceable?

[–] 50gp@kbin.social 6 points 9 months ago (1 children)

so you're saying nothing should be done? great idea

[–] gsfraley@lemmy.world 2 points 9 months ago* (last edited 9 months ago) (1 children)

Sure, but this is less than nothing. It literally applies zero friction against AI and is completely and totally unenforceable. AND it's a laughing stock for everyone and sucks the oxygen out of better AI regulation groups and think tanks.

[–] Imgonnatrythis@sh.itjust.works 9 points 9 months ago

Why? If a California corporation is pumping out AI content and it doesn't have watermarks, why can't this be enforced? It's not an all-purpose solution, but I fail to see how it fails completely.

[–] tyler@programming.dev 3 points 9 months ago

They call it the California effect for a reason.

http://eprints.lse.ac.uk/42097/1/__Libfile_repository_Content_Neumayer, E_Neumayer_Does _California_effect_2012_Neumayer_Does _California_effect_2012.pdf

[–] assassin_aragorn@lemmy.world 7 points 9 months ago

I'm not so sure. A lot of environmental laws require companies to self-report when they exceed limits, and they actually do. It was a common thing for my contact engineer colleagues to be called up at night to calculate release amounts because their unit had an upset.

A law like this would force companies to at least pretend to comply. None can really say "we're not going to because you can't catch us".

[–] tsonfeir@lemm.ee 6 points 9 months ago

Watermarks? Super important. Helping the unhoused though, nooooo.

[–] RobotToaster@mander.xyz 4 points 9 months ago

Even if it were enforceable, there are watermark-removal AI tools.

[–] Brkdncr@lemmy.world 3 points 9 months ago (1 children)

Hmm, technically speaking we could require images to be digitally signed, tie it to a CA, and then browsers could display a "this image is not trusted" warning like we do for HTTPS issues.

People who don't source their images properly would get their cert revoked.

Would be a win for photo attribution too.
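
A minimal sketch of the sign-and-verify flow that comment describes, using Python's cryptography package, might look like the following; the key handling and file name are placeholder assumptions, and a real scheme would distribute the public key in an X.509 certificate chained to a trusted CA, the same way browsers validate HTTPS certificates.

```python
# Hypothetical sketch: sign image bytes, then verify them before display.
# Key names and the file path are placeholders, not part of any real scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Publisher side: sign the raw image bytes with the publisher's private key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
image_bytes = open("photo.png", "rb").read()
signature = private_key.sign(
    image_bytes,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Browser side: verify the signature with the public key taken from the
# publisher's certificate. A failure would trigger the "not trusted" warning.
public_key = private_key.public_key()
try:
    public_key.verify(
        signature,
        image_bytes,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("image signature OK")
except InvalidSignature:
    print("this image is not trusted")
```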

[–] Gutless2615@ttrpg.network -1 points 9 months ago (1 children)

This comment shows all of the thirty seconds of thought your "Hmm" implies.

[–] Brkdncr@lemmy.world 3 points 9 months ago

You also had 30 seconds but chose to insult instead of contribute. See you at the next comment section.

[–] bluGill@kbin.social 0 points 9 months ago

It is enforceable. Not in all cases, probably not even in the majority, but it only takes a few examples getting hit with large fines for everyone doing legal things to take notice. Often you can find enough evidence to get someone to confess to using AI, and that is all the courts need.

Scammers of course will not put this in, but they are already breaking the law, so this might be, like tax evasion, a way to get scammers you can't get for something else.