AI companies are violating a basic social contract of the web and ignoring robots.txt
(www.theverge.com)
I hope not; laws tend to get outdated real fast. Who knows, robots.txt might not even be used in the future, and then it's just sitting there taking up space for legal reasons.
You can draft the law the way you'd write a specification, and make it as broad as needed. Something like the file name shouldn't ever come up as an issue.
The law can be broad with allowances to define specifics by decree, executive order or the equivalent.
robots.txt is a 30 year old standard. If we can write common sense laws around things like email and VoIP, we can do it for web standards too.
robots.txt has been an unofficial standard for 30 years, and it's augmented with sitemap.xml to help index uncrawlable pages and Schema.org to expose content for the Semantic Web. I'm not saying it shouldn't be a law, but suggesting that norms might change is a pretty weak counterargument, man.
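For anyone unfamiliar, here's roughly what that combination looks like in practice. A minimal robots.txt for a hypothetical site (the domain and paths are made up), with the Sitemap directive that ties it to sitemap.xml:

```
# robots.txt for a hypothetical example.com
User-agent: *
Disallow: /admin/

# Point crawlers at pages they might not discover by following links
Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is how the two standards interlock: robots.txt says what not to crawl, while the sitemap lists what exists even if nothing links to it.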
We don't need new laws, we just need enforcement of existing laws. It's already illegal to copy copyrighted content; the AI companies just do it anyway and no one does anything about it.
Enforcing respect for robots.txt doesn't matter because the AI companies are already breaking the law.
I think the issue is that existing laws don't clearly draw a line that AI can cross. New laws may very well be necessary if you want any chance at enforcement.
And without a law that defines documents like robots.txt as binding, enforcing respect for it isn't "unnecessary", it is impossible.
I see no logic in complaining about lack of enforcement while actively opposing the ability to meaningfully enforce.
Copyright law in general needs changing, though; that's the real problem. I don't see the advantage of turning a hacky workaround into a legally mandated requirement.
Especially because there are many, many legitimate reasons to ignore robots.txt, including it being misconfigured, or it being set up for search engines when your bot isn't a search engine crawler.
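That "set up for search engines" case is easy to demonstrate: a site can give different rules to different user agents, so whether your bot is "allowed" depends entirely on which group it matches. A minimal sketch using Python's stdlib urllib.robotparser, with made-up bot names and rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks one named AI crawler entirely,
# and gives everyone else (e.g. search engines) a narrower rule.
robots_txt = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The named AI bot is blocked from everything:
print(rp.can_fetch("ExampleAIBot", "https://example.com/article"))    # False
# An unnamed crawler falls under the * group instead:
print(rp.can_fetch("SomeSearchBot", "https://example.com/article"))   # True
print(rp.can_fetch("SomeSearchBot", "https://example.com/private/x")) # False
```

So "respect robots.txt" isn't one rule, it's whatever the site operator happened to write for whichever user-agent group your bot falls into, which is exactly why mandating it legally gets messy.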