this post was submitted on 24 Aug 2024
394 points (98.5% liked)

[–] General_Effort@lemmy.world 101 points 2 months ago (2 children)

[French media] said the investigation was focused on a lack of moderators on Telegram, and that police considered that this situation allowed criminal activity to go on undeterred on the messaging app.

Europe defending its citizens against the tech giants, I'm sure.

[–] RedditWanderer@lemmy.world 68 points 2 months ago* (last edited 2 months ago) (1 children)

There's a lot of really, really dark shit on Telegram, that's for sure, and it's not like Signal, where they're just a provider. Telegram does have control over the content.

[–] sunzu2@thebrainbin.org 15 points 2 months ago (3 children)
[–] RedditWanderer@lemmy.world 22 points 2 months ago (2 children)

I don't recall CP/gore being readily available on those platforms; it gets reported/removed pretty quickly.

[–] sunzu2@thebrainbin.org 16 points 2 months ago (1 children)
[–] southsamurai@sh.itjust.works 7 points 2 months ago (2 children)

You're young. It really was a thing. It never stayed up long, and they eventually found ways to make removal essentially instantaneous, but there was a time when it was easy to find very unpleasant things on Facebook, whether you wanted to or not. Gore in particular was easy to run across at one point. With CP, it was more offers to sell it.

They fixed it, and it isn't like that now, but it was a problem in the first year or two.

[–] sunzu2@thebrainbin.org 7 points 2 months ago

And there are still informal networks of pedos and other pests operating on these platforms to this day.

[–] Kecessa@sh.itjust.works 6 points 2 months ago

So you don't see the difference between platforms that actually have measures in place to try to prevent it and platforms that intentionally don't?

Man, Lemmings must be even dumber than Redditors or something

[–] Kusimulkku@lemm.ee 3 points 2 months ago

If they similarly go unmoderated, then action should be taken.

[–] chiisana@lemmy.chiisana.net 26 points 2 months ago (2 children)

Safe-harbour-equivalent rules should apply, no? That is, the platform should not be held liable as long as it does not permit illegal activity, offers a proper reporting mechanism, and has documented workflows to investigate and act on reported activity.

It feels like a slippery slope to arrest people on suspicion (until proven otherwise) of a lack of moderation.

[–] rottingleaf@lemmy.world 5 points 2 months ago (1 children)

Telegram does moderate political content they don't like.

Telegram also has the means to control whatever they want.

And sometimes they also hide certain content from select regions.

Thus, if they make such decisions, then apparently CP and the like are in their interest. Maybe to collect information for blackmail by some special services (Durov went to France from Baku, Azerbaijan is friendly with Israel, and Mossad is even suspected of being connected to the Epstein operation), maybe just for profit.

[–] RandomlyRight@sh.itjust.works 3 points 2 months ago (2 children)

Do you have any links/sources about this? I'm not saying you're wrong; I'm just interested.

[–] rottingleaf@lemmy.world 2 points 2 months ago (2 children)

No, but they do sometimes delete channels for gore and such. I remember a few Azeri channels being banned for this during/after the 2020 war.

As for having the means: with everything stored unencrypted server-side, it's not even a question.

As for hiding channels per region at governments' request, I've heard about that on Lemmy.

[–] Petter1@lemm.ee 2 points 2 months ago (1 children)

Where did you get that the data on the servers are not encrypted?

[–] rottingleaf@lemmy.world 1 points 2 months ago

Your client is, ahem, not decrypting anything when it fetches history, and not encrypting files when it uploads them. That should be sufficient.

Anyway, look at TG Desktop sources. They are crap, but in general it's clear what happens there. At least that's how I remember it.
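To illustrate the point (a rough sketch, not Telegram's own code): with the third-party Telethon client library, a logged-in account just reads its cloud-chat history back as plain text. The api_id/api_hash below are placeholders you'd register at my.telegram.org; the only cryptography involved is the MTProto transport layer the library handles internally, with no end-to-end decryption step for regular chats.

```python
# Rough sketch using the third-party Telethon library (pip install telethon).
# api_id / api_hash are placeholders registered at my.telegram.org.
import asyncio
from telethon import TelegramClient

api_id = 12345            # placeholder
api_hash = "0123abcdef"   # placeholder

async def dump_history():
    # "session" is a local file Telethon uses to cache login state;
    # the first run prompts for your phone number and login code.
    async with TelegramClient("session", api_id, api_hash) as client:
        # Regular ("cloud") chats come back as readable text: there is no
        # client-side decryption step here, only the MTProto transport
        # encryption that the library handles under the hood.
        async for msg in client.iter_messages("me", limit=10):
            print(msg.date, msg.text)

asyncio.run(dump_history())
```

Secret chats are the exception; those are end-to-end encrypted and wouldn't come back as readable text like this.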

[–] RandomlyRight@sh.itjust.works 1 points 2 months ago

Thank you, really appreciate it!

[–] Kecessa@sh.itjust.works 1 points 2 months ago (1 children)

Thing is, Telegram doesn't do shit about it.

[–] chiisana@lemmy.chiisana.net 4 points 2 months ago

I don’t know how they manage their platform — I don’t use it, so it’s irrelevant for me personally — was this proven anywhere in a court of law?