this post was submitted on 07 Oct 2023
-5 points (46.4% liked)

Lemmy Support


The problem:

The web has obviously reached a high level of #enshittification. Paywalls, exclusive walled gardens, #Cloudflare, popups, CAPTCHAs, Tor blockades, dark patterns (especially with cookies), JavaScript that turns a website into an app (not a document), etc.

Status quo solution (failure):

#Lemmy & the #threadiverse were designed to inherently trust humans to only post links to non-shit websites, and to only upvote content that either has no links or links only to non-shit venues.

It’s not working. The social approach is a systemic failure.

The fix:

  • stage 1 (metrics collection): There need to be enshittification metrics for every link. Readers should be able to click a “this link is shit” button on a per-link basis, and there should be tick boxes to indicate the particular variety of shit it is.

  • stage 2 (metrics usage): If many links with the same hostname show a pattern of matching enshittification factors, the Lemmy server should automatically tag all those links with a warning of some kind (e.g. ⚠, 💩, 🌩).

  • stage 3 (inclusive alternative): A replacement link to a mirror is offered, e.g. youtube → (non-CF’d Invidious instance), cloudflare → archive.org, medium.com → (random scribe.rip instance), etc. (a rough sketch of stages 2, 3 & 5 follows this list).

  • stage 4 (onsite archive): Good Samaritans and over-achievers should have the option to provide the full text for a given link so others can read the article without even fighting the site.

  • stage 5 (search reranking): whenever a human posts a link and talks about it, search crawlers notice and give that site a high ranking. This is why search results have gotten lousy: the social approach has failed, and humans will post bad links. So links with a high enshittification score need to be obfuscated in some way (e.g. dots become asterisks) so search crawlers don’t overrate them going forward.
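
To make that a bit more concrete, here is a minimal TypeScript sketch of stages 2, 3 & 5. Everything in it (the `LinkReport` shape, the report threshold, the mirror table) is invented for illustration and is not part of Lemmy’s actual code or API:

```typescript
// Hypothetical sketch of stages 2, 3 & 5; none of these names exist in Lemmy.
type AntiFeature = "paywall" | "cloudflare" | "popup" | "captcha" | "tor-block";

// Stage 1 output: per-link reports collected from readers.
interface LinkReport {
  url: string;
  antiFeatures: AntiFeature[];
}

// Stage 2: aggregate reports per hostname and decide whether to tag.
const REPORT_THRESHOLD = 10; // arbitrary cut-off for this sketch

function hostScore(reports: LinkReport[], host: string): number {
  return reports.filter(r => new URL(r.url).hostname === host).length;
}

function warningTag(reports: LinkReport[], url: string): string | null {
  const host = new URL(url).hostname;
  return hostScore(reports, host) >= REPORT_THRESHOLD ? "⚠" : null;
}

// Stage 3: offer a mirror for known-hostile hosts (placeholder mirror hosts).
const MIRRORS: Record<string, (u: URL) => string> = {
  "www.youtube.com": u => `https://some-invidious-instance.example${u.pathname}${u.search}`,
  "medium.com": u => `https://scribe.rip${u.pathname}`,
};

function mirrorFor(url: string): string | null {
  const u = new URL(url);
  const rewrite = MIRRORS[u.hostname];
  return rewrite ? rewrite(u) : null;
}

// Stage 5: obfuscate high-scoring links (dots become asterisks) so crawlers
// don’t keep overrating them.
function obfuscate(url: string): string {
  return url.replace(/\./g, "*");
}
```

A real implementation would presumably keep the aggregates server-side and expose them per hostname; the sketch only shows the shape of the logic.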

This needs to be recognized as a #LemmyBug.

top 28 comments
[–] dandroid@dandroid.app 31 points 1 year ago (1 children)

As a developer, if a tester posted something like this as a bug instead of a change request, it would get thrown right into the trash bin. This isn't a bug. You are asking for an enhancement.

Side note, do the hashtags do anything on Lemmy, or are they just posted here for emphasis?

[–] mateomaui@reddthat.com 16 points 1 year ago (1 children)

The inherent fallacy in your argument is that a link is a “bad” link simply because it goes to an original source instead of always being redirected to you via a third party that circumvents what you don’t like.

If someone posts a link to an original, non-misinformation news article and it gets marked as a “bad” link, that’s actually a bug.

[–] fubo@lemmy.world 13 points 1 year ago

For context, check this poster's other recent works. They have a mistaken belief that they stand in a position of power & authority over the developers of free software they use.

[–] Nemo@midwest.social 5 points 1 year ago

I'm with you all the way.

[–] TootSweet@lemmy.world 4 points 1 year ago (1 children)

So, first off, I love everything you have here.

The only thing is the onsite archive. I'd love it, but I wouldn't want copyright law used to punish the Lemmy community. I don't think I'm quite qualified to answer this question, so I'll ask it here: how worried should we be about that?

[–] jet@hackertalks.com 3 points 1 year ago

You're trying to solve a social problem with technology. That's going to be very difficult.

[–] rglullis@communick.news 3 points 1 year ago (1 children)

The solution for that can be a whole lot simpler: add these features to the browser so that it works in favor of the user. I have extensions to redirect away from YouTube/Medium/Twitter, so these issues do not affect me regardless of which website I am visiting.
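
For what it’s worth, the core of that kind of extension is just a hostname lookup applied before the page loads. A minimal userscript-style TypeScript sketch, with placeholder mirror hostnames (only scribe.rip comes from the thread above):

```typescript
// Hypothetical client-side redirect: rewrite known hosts to mirrors before
// the page loads. The mirror hostnames below are placeholders.
const REDIRECTS: Record<string, string> = {
  "www.youtube.com": "some-invidious-instance.example",
  "twitter.com": "some-nitter-instance.example",
  "medium.com": "scribe.rip",
};

const mirror = REDIRECTS[location.hostname];
if (mirror) {
  location.replace(`https://${mirror}${location.pathname}${location.search}`);
}
```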

[–] activistPnk@slrpnk.net -5 points 1 year ago* (last edited 1 year ago) (1 children)

The browser (more appropriately named: the client) indeed needs some of the logic here, but it cannot do the full job I’ve outlined. The metrics need to be centralized. And the browser specifically imposes an inefficient amount of effort & expertise on the end user. A dedicated client can make it easy on the user, but it’s an incomplete solution nonetheless.

[–] rglullis@communick.news 5 points 1 year ago (1 children)

The metrics need to be centralized.

Why? And how would you guarantee the integrity of the ones holding the metrics?

this imposes an inefficient amount of effort & expertise on the end-user.

A lot less effort than having to deal with the different "features" that each website admin decides to run on their own.

[–] activistPnk@slrpnk.net -4 points 1 year ago* (last edited 1 year ago)

Why?

  1. It’s a big database. It would be poor design to replicate a DB of all links in every single client.
  2. Synchronization of the DB would not be cheap. When Bob says link X has anti-feature Y, that information must then be shared with tens of thousands of other users.

Perhaps you have a more absolute idea of centralized. Mastodon votes, for example, are centralized on each node, but overall that’s actually decentralized. My bad; I probably shouldn’t have said centralized. I meant more centralized than a client-by-client basis. It’d be premature to pin those details down at this point, other than to say it’s crazy for each client to maintain a separate copy of that DB.
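
Purely as an illustration of “centralized per instance”, the aggregate each Lemmy server would hold could be as small as one record per hostname; the field names here are made up:

```typescript
// Hypothetical per-instance aggregate: one record per hostname, instead of
// every client replicating every individual link report.
interface HostMetrics {
  hostname: string;
  reportCount: number;                       // total “this link is shit” clicks
  antiFeatureCounts: Record<string, number>; // e.g. { paywall: 12, cloudflare: 40 }
  updatedAt: string;                         // ISO timestamp of last report
}

// A client would only fetch the HostMetrics for the hostnames visible on the
// current page, rather than syncing the whole database.
```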

And how would you guarantee the integrity of the ones holding the metrics?

The server is much better equipped than the user for that. The guarantee would be the same guarantee that you have with Mastodon votes. Good enough to be fit for purpose. For any given Mastodon poll everyone sees a subset of votes. But that’s fine. Perfection is not critical here. You wouldn’t want it to decide a general election, but you don’t need that level of integrity.

A lot less effort than having to deal with the different “features” that each website admin decides to run on their own.

That doesn’t make sense. Either one person upgrades their Lemmy server, or thousands of people have to install, configure, and maintain a dozen different browser plugins ported to a variety of different browsers (close enough to impossible to just call it impossible). Then every Lemmy client also has to replicate that complexity.

[–] antlion@lemmy.dbzer0.com 1 points 1 year ago

This, but I think equally important is de-duplication of links. Ideally these alternative links to the same content could also be de-duped, and all comments should end up in one thread. I know what I’m describing is complicated because communities span servers, but it would really improve Lemmy for me.
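
Hypothetically, de-duplication could key posts on a canonicalized URL so that mirror links and tracking-parameter variants collapse into one thread. A rough TypeScript sketch (the mirror mapping is illustrative, not exhaustive):

```typescript
// Hypothetical canonicalization for de-duplicating cross-posted links: map
// known mirrors back to the original host and strip tracking parameters.
const MIRROR_TO_CANONICAL: Record<string, string> = {
  "scribe.rip": "medium.com", // example mapping only
};

function canonicalKey(raw: string): string {
  const u = new URL(raw);
  u.hostname = MIRROR_TO_CANONICAL[u.hostname] ?? u.hostname;
  ["utm_source", "utm_medium", "utm_campaign", "ref"].forEach(p =>
    u.searchParams.delete(p)
  );
  u.hash = "";
  return u.toString();
}

// Posts whose links share the same canonicalKey could be merged into one thread.
```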

[–] Spzi@lemm.ee 1 points 1 year ago

Not sure if social media in general has failed. That particular point can be solved at the community level.

Create or join a community whose guidelines restrict posting paywalled or otherwise bad content, and which explicitly encourages posting "liberated" content. Have moderation. Problem solved. Moderators will remove everything you dislike, and all that remains is the solution you want.