this post was submitted on 23 Dec 2024

Technology


Archived link

Opinionated article by Alexander Hanff, a computer scientist and privacy technologist who helped develop Europe's GDPR (General Data Protection Regulation) and ePrivacy rules.

We cannot allow Big Tech to continue to ignore our fundamental human rights. Had such an approach been taken 25 years ago in relation to privacy and data protection, arguably we would not have the situation we have today, where some platforms routinely ignore their legal obligations to the detriment of society.

Legislators did not understand the impact of weak laws or weak enforcement 25 years ago, but we have enough hindsight now to ensure we don't make the same mistakes moving forward. The time to regulate unlawful AI training is now, and we must learn from past mistakes to ensure that we provide effective deterrents and consequences for such ubiquitous lawbreaking in the future.

[–] Bronzebeard@lemm.ee 29 points 2 days ago (22 children)

That's stupid. The damage is still done to the owner of that data used illegally. Make them destroy it.

But when you levy minuscule fines that are less than what they stand to make from it, it's just a cost of doing business. Fines could work if they were proportional to the value derived.

[–] teawrecks@sopuli.xyz 3 points 1 day ago* (last edited 1 day ago) (10 children)

Destroying it is both not an option, and an objectively regressive suggestion to even make.

Destruction isn't possible because even if you deleted every bit of information from every hard drive in the world, now that we know it's possible, someone would recreate it all in a matter of months.

Regressive because you're literally suggesting that we destroy a new technology because we're afraid of what it will do to the technology it replaces. Meanwhile, there's a very decent chance that AI is our best shot at solving the energy/climate crises through advancing nuclear tech, as well as surviving the next pandemic via groundbreaking protein-folding tech.

I realize AI tech makes people uncomfortable (for... so many reasons), but becoming old-fashioned conservatives in response is not a solution.

I would take it a step further than public domain, though. I would also require any profits from illegally trained AI to be licensed from the public. If you're going to use an AI to replace workers, then you need to pay taxes to the people proportional to what you would have been paying the workers it replaces.

[–] Bronzebeard@lemm.ee 6 points 1 day ago (1 children)

I never suggested destroying the technology that is "AI". I'm not uncomfortable about AI, I've even considered pivoting my career in that direction.

I suggested destroying the particular implementation that was trained on the illegitimate data. If someone can recreate it using legitimate data, GREAT. That's what we want to happen. The tool isn't the problem. It's the method they're using to train it.

Please don't make up random-ass narratives I never even hinted at, and then argue against them.

[–] teawrecks@sopuli.xyz 1 points 1 day ago (1 children)

I didn't misinterpret what you were saying; everything I said applies to the specific case you laid out. If illegally trained networks were somehow entirely destroyed, someone would just make them again. That's my point: there's no way around that, there's just holding people accountable when they do it. IMO that takes the form of restitution to the people proportional to profits.

[–] Bronzebeard@lemm.ee 1 points 23 hours ago (1 children)

This is the dumb kind of "best to do nothing, because no solution is perfect" approach: making sure no disincentives are ever applied because someone somewhere else might also try to do the illegal thing that they'll lose access to the moment they're caught...

[–] teawrecks@sopuli.xyz 1 points 22 hours ago

What the? I'm literally saying what action to take, what is happening? Is there maybe a bug where you only see the first few characters of my post? Are you able to read these characters I'm typing? Testing testing testing. Let me know how far you get. Maybe there's just too many words for you? Test test. Say "elephant" if you can read this.
