this post was submitted on 30 Aug 2023
317 points (95.2% liked)

Lemmy

12506 readers
11 users here now

Everything about Lemmy; bugs, gripes, praises, and advocacy.

For discussion about the lemmy.ml instance, go to !meta@lemmy.ml.

founded 4 years ago

EDIT

TO EVERYONE ASKING TO OPEN AN ISSUE ON GITHUB, IT HAS BEEN OPEN SINCE JULY 6: https://github.com/LemmyNet/lemmy/issues/3504

AND A RELATED ISSUE HAS BEEN OPEN SINCE JUNE 24: https://github.com/LemmyNet/lemmy/issues/3236

TO EVERYONE SAYING THAT THIS IS NOT A CONCERN: Laws differ from country to country (in other words, not everyone is American), and whether or not an admin is liable for such content sitting on their servers without their knowledge, don't you think it's still an issue? Are you not bothered by the fact that somebody could be sharing illegal images from your server without you ever knowing? Is that okay with you? Or are you only saying this because you're NOT an admin? Several admins have already responded in the comments and suggested ways to solve the problem, because they are as genuinely concerned about it as I am. Thank you to all the hard-working admins. I appreciate and love you all.


ORIGINAL POST

You can upload images to a Lemmy instance without anyone knowing that the image is there if the admins are not regularly checking their pictrs database.

To do this, you create a post on any Lemmy instance, upload an image, and never click the "Create" button. The post is never created but the image is uploaded. Because the post isn't created, nobody knows that the image is uploaded.

You can also go to any post, upload a picture in the comment, copy the URL and never post the comment. You can also upload an image as your avatar or banner and just close the tab. The image will still reside in the server.

You can (possibly) do the same with community icons and banners.
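The abuse path above boils down to hitting the image-upload endpoint directly and simply never creating the post. A minimal sketch of what the browser effectively does, assuming the usual Lemmy setup where pict-rs is proxied at `/pictrs/image` with a multipart `images[]` field and a JWT cookie (endpoint details are my assumption; check your instance). The request is only built here, not sent:

```python
# Hypothetical sketch of the upload request a browser sends when you
# attach an image to a draft post that is never submitted.
# Endpoint path and field name are assumptions about a typical Lemmy proxy.

def build_upload_request(instance: str, jwt: str) -> dict:
    """Return the pieces of the pict-rs image-upload request."""
    return {
        "method": "POST",
        "url": f"https://{instance}/pictrs/image",  # pict-rs behind the Lemmy proxy
        "cookies": {"jwt": jwt},                    # any logged-in account works
        "files_field": "images[]",                  # multipart form field name
    }

req = build_upload_request("example-instance.tld", "<token>")
print(req["url"])  # → https://example-instance.tld/pictrs/image
```

Nothing in this flow requires the post, comment, or avatar change to ever be saved, which is exactly why the image ends up orphaned.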

Why does this matter?

Because anyone can upload illegal images without the admin knowing, and the admin may be held liable for them. With everything that has been going on lately, I wanted to remind all of you about this. Don't think that disabling caching is enough. Bad actors can secretly stash illegal images on your Lemmy instance if you aren't checking!

These bad actors can then share those links around and you would never know! They can even report the images to the FBI, and if you haven't taken them down within a certain period (because you never knew they were there), say goodbye to your instance and see you in court.

Only your backend admins who have access to the database (or object storage or whatever) can check this, meaning non-backend admins and moderators WILL NOT BE ABLE TO MONITOR THESE, and regular users WILL NOT BE ABLE TO REPORT THESE.

Aren't these images deleted if they aren't used for the post/comment/banner/avatar/icon?

NOPE! The image actually stays uploaded! Lemmy doesn't check whether images are used! Try it out yourself. Just make sure to save the URL first, either by copying the link text or by clicking the image and choosing "copy image link".

How come this hasn't been addressed before?

I don't know. I am fairly certain that this has been brought up before. Nobody paid attention but I'm bringing it up again after all the shit that happened in the past week. I can't even find it on the GitHub issue tracker.

I'm an instance administrator, what the fuck do I do?

Check your pictrs images (good luck) or nuke it. Disable pictrs, restrict sign ups, or watch your database like a hawk. You can also delete your instance.

Good luck.

[–] sabreW4K3@lemmy.tf 99 points 1 year ago (3 children)

Perhaps someone should create a script to purge orphan images

[–] Danc4498@lemmy.ml 36 points 1 year ago (1 children)

Seems like the logical fix

[–] Matriks404@lemmy.world 43 points 1 year ago (2 children)

The logical fix would be to delete them automatically when they've been unused for longer than, say, 24 hours. That should be in the Lemmy code itself; we shouldn't depend on 3rd-party utilities for it.
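The cleanup rule being proposed here is simple to state: purge any upload older than the grace period that nothing references. A hedged sketch of that logic, using illustrative data structures rather than Lemmy's actual schema (table and column names are invented):

```python
# Sketch of the proposed cleanup: purge uploads older than 24h that are
# not referenced by any post, comment, avatar, banner, or icon.
# Data shapes are illustrative, not Lemmy's real schema.
import datetime as dt

GRACE = dt.timedelta(hours=24)

def find_purgeable(uploads, referenced_urls, now):
    """uploads: iterable of (url, uploaded_at); return the URLs to purge."""
    return [
        url for url, uploaded_at in uploads
        if url not in referenced_urls and now - uploaded_at > GRACE
    ]

now = dt.datetime(2023, 8, 30, 12, 0)
uploads = [
    ("https://x/pictrs/image/aaa.png", now - dt.timedelta(hours=30)),  # orphan, old
    ("https://x/pictrs/image/bbb.png", now - dt.timedelta(hours=1)),   # orphan, recent
    ("https://x/pictrs/image/ccc.png", now - dt.timedelta(days=7)),    # in use
]
in_use = {"https://x/pictrs/image/ccc.png"}
print(find_purgeable(uploads, in_use, now))  # only aaa.png qualifies
```

Run on a schedule (cron or a background task), this is the "3rd-party script" version; the argument above is that the same check belongs inside Lemmy itself.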

[–] sabreW4K3@lemmy.tf 10 points 1 year ago (1 children)

You can submit a patch upstream

[–] gumball4933@lemm.ee 25 points 1 year ago (3 children)

Not everyone is a developer. Users are allowed to point out issues without working on a fix themselves.

[–] gravitas_deficiency@sh.itjust.works 10 points 1 year ago* (last edited 1 year ago) (1 children)

Or, just tighten up the api such that uploaded pictures have a relatively short TTL unless they become attached to a post or otherwise linked somewhere.

A script is a fine stopgap measure, but we should try to treat the cause wherever possible, instead of simply addressing the symptom.
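The TTL idea above can be sketched as a small ledger: every upload gets an expiry, and attaching the image to a post cancels it. This is a hypothetical design sketch (class and method names are invented), shown mainly so the trade-off debated in the replies below is concrete:

```python
# Hedged sketch of a TTL-on-upload design: each upload expires unless it
# is "claimed" by a post/comment/avatar within the TTL. All names invented.
import datetime as dt

class UploadLedger:
    def __init__(self, ttl=dt.timedelta(hours=1)):
        self.ttl = ttl
        self.pending = {}  # image alias -> expiry time

    def record_upload(self, alias, now):
        self.pending[alias] = now + self.ttl

    def mark_attached(self, alias):
        # Image is now in use; drop its timer so it is kept.
        self.pending.pop(alias, None)

    def expired(self, now):
        return [a for a, t in self.pending.items() if t <= now]

now = dt.datetime(2023, 8, 30)
ledger = UploadLedger()
ledger.record_upload("aaa", now)
ledger.record_upload("bbb", now)
ledger.mark_attached("bbb")
print(ledger.expired(now + dt.timedelta(hours=2)))  # → ['aaa']
```

Note this only covers never-used images; images orphaned later (post deleted, avatar replaced) still need their own trigger, which is exactly the edge case raised in the reply below.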

[–] chaorace@lemmy.sdf.org 5 points 1 year ago* (last edited 1 year ago) (2 children)

What's the practical difference? In both cases you're culling images based on whether they're orphaned or not.

If you're suggesting that the implementation be based on setting individual timers instead of simply validating the whole database at regular intervals, consider whether or not the complexity of such a system is actually worth the tradeoff.

"Complexity comshmexity", you might say. "Surely it's not a big deal!". Well... what about an image that used to belong to a valid post that later got deleted? Guess you have to take that edge case into account and add a deletion trigger there as well! But what if there were other comments/posts on the same instance hotlinking the same image? Guess you have to scan the whole DB every time before running the deletion trigger to be safe! Wait... wasn't the whole purpose of setting this up with individual jobs to avoid doing a scripted DB scan?

[–] bmygsbvur@lemmy.ca 7 points 1 year ago

Very much needed.

[–] sunaurus@lemm.ee 63 points 1 year ago* (last edited 1 year ago)

FYI to all admins: with the next release of pict-rs, it should be much easier to detect orphaned images, as the pict-rs database will be moved to postgresql. I am planning to build a hashtable of "in-use" images by iterating through all posts and comments by lemm.ee users (+ avatars and banners of course), and then I will iterate through all images in the pict-rs database, and if they are not in the "in-use" hash table, I will purge them.

Of course, Lemmy can be improved to handle this case better as well!
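The "in-use hash table" plan described above amounts to extracting every pict-rs filename referenced anywhere, then purging stored images not in that set. A rough sketch under stated assumptions (the regex and the idea of scanning body text are mine; sunaurus's actual implementation may differ):

```python
# Sketch of the in-use-set approach: scan post/comment bodies (plus
# avatar/banner URLs) for pict-rs filenames, then diff against storage.
# The URL pattern is an assumption about typical pict-rs links.
import re

PICTRS_RE = re.compile(r"/pictrs/image/([A-Za-z0-9-]+\.\w+)")

def collect_in_use(texts):
    """Return the set of referenced pict-rs filenames for O(1) lookups."""
    in_use = set()
    for text in texts:
        in_use.update(PICTRS_RE.findall(text or ""))
    return in_use

rows = [
    "Look at this: https://lemm.ee/pictrs/image/abc123.png",
    "plain text, no image",
    None,  # e.g. a deleted body
]
stored = {"abc123.png", "orphan99.jpg"}
print(stored - collect_in_use(rows))  # the orphans to purge
```

With pict-rs metadata in PostgreSQL, the `stored` side becomes a simple query instead of a filesystem walk, which is why the migration makes this much easier.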

[–] Nerd02@lemmy.basedcount.com 30 points 1 year ago (3 children)

I’m an instance administrator, what the fuck do I do?

There's one more option. The awesome @db0@lemmy.dbzer0.com has made this tool to detect and automatically remove CSAM content from a pict-rs object storage.

https://github.com/db0/lemmy-safety

[–] bmygsbvur@lemmy.ca 19 points 1 year ago (1 children)

This is a nice tool but orphaned images still need to be purged. Mentioned on the other thread that bad actors can upload spam to fill up object storage space.

[–] Nerd02@lemmy.basedcount.com 7 points 1 year ago

That is also very true. I think better tooling for that might come with the next pict-rs version, which will move its metadata to a proper database (right now it's in an internal key-value store). Hopefully that will make it easier to identify orphaned images.

[–] dandroid@dandroid.app 4 points 1 year ago (8 children)

I tried getting this to run in a container, but I was unable to access my GPU in the container. Does anyone have any tips on doing that?

[–] Xylight@lemmy.xylight.dev 3 points 1 year ago (5 children)

You need a GPU for that. Most $5 VPSs don't have that.

[–] dandroid@dandroid.app 19 points 1 year ago (1 children)

I'm an instance administrator, what the fuck do I do?

Check your pictrs images (good luck) or nuke it. Disable pictrs, restrict sign ups, or watch your database like a hawk. You can also delete your instance.

How? I have checked, and there doesn't seem to be any way to see the photos on my server.

I actually shut down pictrs entirely on my instance. Running pictrs in its current state is criminally negligent imo.

[–] bmygsbvur@lemmy.ca 14 points 1 year ago (1 children)

They are stored in the pictrs folder. They don't have file extensions but are viewable with many image programs.
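Since the stored files have no extensions, you can identify them from the command line by their magic bytes. A small stdlib-only sketch (the folder path is whatever your pict-rs volume is; the magic-byte table covers the common formats):

```python
# Identify extensionless pict-rs files by their magic bytes.
# Folder path is an example; adjust to your pict-rs data volume.
from pathlib import Path

MAGIC = {
    b"\x89PNG\r\n\x1a\n": "png",
    b"\xff\xd8\xff": "jpeg",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
    b"RIFF": "webp-or-riff",  # WEBP also has "WEBP" at byte offset 8
}

def sniff(data: bytes) -> str:
    """Return a best-guess image type from the first bytes of a file."""
    for magic, kind in MAGIC.items():
        if data.startswith(magic):
            return kind
    return "unknown"

def scan(folder: str):
    """Print a type guess for every file in the pict-rs folder."""
    for f in Path(folder).iterdir():
        if f.is_file():
            print(f.name, sniff(f.read_bytes()[:16]))

print(sniff(b"\x89PNG\r\n\x1a\n" + b"\x00" * 8))  # → png
```

This at least lets a command-line-only admin enumerate what's actually sitting in storage without opening each file in an image viewer.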

[–] dandroid@dandroid.app 11 points 1 year ago

Oh, I see. I only use command line on my server, so I didn't realize they were actual photos. Thanks!

[–] Kecessa@sh.itjust.works 17 points 1 year ago (5 children)

Pedo trolls will be the death of Lemmy, you heard it here first!

[–] bmygsbvur@lemmy.ca 12 points 1 year ago

Which is why we need to act now.

[–] newhoa@lemmy.ml 14 points 1 year ago (2 children)

A lot of web software does this (Github and Gmail for example). I like it but always thought it could be abused.

[–] Send_me_nude_girls@feddit.de 10 points 1 year ago

You mean Gmail drafts? I know of at least one case where criminals used this: they shared a Gmail account password and messaged each other only via the drafts function, so technically no mail was ever sent.

[–] Xylight@lemmy.xylight.dev 13 points 1 year ago (2 children)

FYI this requires a JWT so if registrations are closed on your instance you don't have to worry

[–] bmygsbvur@lemmy.ca 7 points 1 year ago

This is for public instances.

[–] rektifier@sh.itjust.works 13 points 1 year ago (2 children)

Wasn't Facebook also found to store images that were uploaded but not posted? This is just a resource leak. I can't believe no one has mentioned that phrase yet. I'm more concerned about DoS attacks that fill up the instance's storage with unused images. I think the issue of illegal content is being blown out of proportion: as long as it's removed promptly (I believe the standard is 1 hour) once the mods/admins learn about it, there should be no liability. Otherwise every site that allows users to post media would be dead by now.

[–] newline@feddit.nl 9 points 1 year ago

I'm a pentester and security consultant. From my point of view, this vulnerability has more impact than just a resource leak or DoS. We all know how often CSAM or other illegal material is uploaded to communities here as actual posts, where hundreds of viewers run into it and report it. Now imagine it being uploaded and spread like this, where only an admin can catch it, and only if they go out of their way to check.

I wouldn't call this a high-risk issue, for sure. But it's a significant security risk regardless.

[–] bmygsbvur@lemmy.ca 5 points 1 year ago

Whether it's illegal content or storage-filling DoS attacks, the issue needs to be addressed.

[–] WtfEvenIsExistence@lemmy.ca 11 points 1 year ago (1 children)

Oh wow. I always assumed the images are deleted if you don't submit the post.

😬

[–] homesnatch@lemmy.one 4 points 1 year ago (1 children)

Because anyone can upload illegal images without the admin knowing and the admin will be liable for it.

The admin/company isn't liable until it's reported to them and they fail to act on it. That's how all social media sites work; Google isn't immediately liable if you upload illegal material to GDrive and share it anonymously.
