this post was submitted on 23 Dec 2024
531 points (94.9% liked)
Microblog Memes
Lossy compression is antiquated. Jpg should no longer be used as it's not 1999. I will die on this mole hill.
JPEG XL (JXL) seems promising: it achieves a fair amount of compression while keeping image quality high.
The showcase webpage for JXL.
Lossless compression doesn't really do well for pictures of real life. For screenshots it's ideal, but for complex images PNGs are just wayyyy too big for the virtually imperceptible difference.
A high-quality JPG is going to look good. What doesn't look good is when it gets resized, recompressed, screenshotted, and recompressed again 50 times.
PNG is the wrong approach for lossless web images. The correct answer is WebP: https://siipo.la/blog/whats-the-best-lossless-image-format-comparing-png-webp-avif-and-jpeg-xl
I found quite a lot of AVIF encoders lied about their lossless encoding modes, and instead used the normal lossy mode at a very high quality setting. I eventually found one that did true lossless and I don't think it ever managed to produce a file smaller than the input.
Turns out, that's a well-known issue with the format. It's just another case where Google's marketing makes AVIF out to be fantastic, but in reality it's quite mediocre.
They lied about the lossiness?! I can’t begin to exclaim loudly enough about how anxious this makes me.
The funny thing is, I knew something was off because Windows was generating correct thumbnails for the output files, and at that time the OS-provided thumbnailer was incapable of generating correct thumbnails for anything but the simplest baseline files.
(Might be better now, idk, not running Windows now)
That's how I knew the last encoder was producing something different, even before checking the output file size, the thumbnail was bogus.
This story is a nightmare and I’m not sure if it’s better or worse now knowing that it was ancient ICO files that tipped you off.
Open question to you or the world: for every lossless compression I ever perform, is the only way to verify it to generate before and after bitmaps (or XCFs) and check that they are identical files? If the before-bitmap and after-bitmap differ, has lossy compression necessarily occurred?
Pretty much. You can use something like ImageMagick's compare tool to quickly check whether the round trip produced any differences.
It can be a bit muddled, because even if the encoding is lossless, the decoding might not be (e.g. subtle differences between non-SIMD and SIMD decoding). And it's not like you can just check the file hashes, since e.g. PNG has about four interchangeable ways to specify a colour space. So I'd say it's lossless if the resulting images differ by no more than ±1 bit of error per pixel (e.g. 127 becoming 128 is probably fine; becoming 130 isn't).
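The tolerance check described above is easy to sketch in a few lines of plain Python, assuming you've already decoded both the original and the round-tripped file to flat sequences of 8-bit sample values with whatever decoder you like (the function names and toy pixel data here are made up for illustration):

```python
def max_pixel_error(before, after):
    """Largest absolute per-sample difference between two decoded images.

    `before` and `after` are flat sequences of 8-bit sample values
    (same layout, same length) from decoding the original and the
    round-tripped file.
    """
    if len(before) != len(after):
        raise ValueError("images differ in size; round trip is not lossless")
    return max(abs(a - b) for a, b in zip(before, after))

def effectively_lossless(before, after, tolerance=1):
    # Per the rule of thumb above: +/-1 per sample may just be decoder
    # rounding; anything larger means real loss crept in.
    return max_pixel_error(before, after) <= tolerance

# Toy data: one sample drifted by 1 (acceptable), none by more.
original      = [127, 64, 200, 33]
round_tripped = [128, 64, 200, 33]
print(effectively_lossless(original, round_tripped))            # True
print(effectively_lossless(original, [130, 64, 200, 33]))       # False
```

In practice this per-pixel comparison is what tools like ImageMagick's compare automate for you; the sketch just makes the ±1 tolerance explicit.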
Hey wow! Thank you!!
This explains a lot—including, likely, your username. Cheers!
JXL is a much better format, for a multitude of reasons beyond the article, but it doesn't have much adoption yet. On the Chromium team (the most important platform, unfortunately), someone seems to be actively power-tripping and blocking it.
Yeah, Google is trying to keep control of their image format, and they're abusing their monopoly to do so.
Webp, yo!
.tif or nothing, yo.
A high quality jpg looks good. The 100th compression into a jpg looks bad.
I know compression has a lot of upsides, but I've genuinely hated it ever since broadband became a thing. Quality over quantity all the way. My websites have always used dynamic resizing, with the resolution passed as a parameter, resulting in lightning-fast load times and full quality when you need it.
The way things are shared on the internet is with screenshots and social media; it's been like that for at least 15 years. JPG is just slowly deep-frying the internet.
I disagree, but I do agree that there are better options available than JPEG. Lossy compression is actually what allows much of the modern internet to function. 4K HDR content on Netflix wouldn’t be a thing without it. And lossy compression can be perceptually lossless for a broader range of use cases. Many film productions use high quality lossy formats in their production pipelines in order to be able to handle the vast amounts of data.
Of course it all depends on the use case. If someone shares some photos or videos with me to keep, I’d like them to send the originals, whatever format they might be in.
I understand the need for compression and re-encoding, but I stand by the claim that we should not use a format that eats itself alive a little bit every time it's edited.
How often does a JPEG get edited in practice, though? Maybe 2-3 times at most?
Yeah, let's all post RAW 40MB photos right from the phone on ... The Internet!
What a good idea.
Is there a specific reason? And as a follow-up: do you only listen to 96 kHz FLAC too? Should video not be compressed either?
I mean, I'm all in with you when it comes to storing my holiday photos, but sharing them? Not so much.
That said, I grew up with 35 KB JPGs, so I'm kind of used to it; maybe I'm biased.
Files should be at reasonable resolutions and sizes for their purpose, but not in file formats that slowly deteriorate in an internet of remixed ideas.
So not analog?
Who taught you JPGs deteriorate over time lol
https://uploadcare.com/blog/jpeg-quality-loss/
It happens when the image is edited and re-encoded on save. Who taught you it didn't?
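The mechanism is easy to demonstrate with a toy sketch. Real JPEG quantizes DCT coefficients per 8x8 block; here plain scalar quantization stands in for that, and a trivial brighten stands in for "editing" (all numbers and function names are made up for illustration). The point is that edit-then-re-save drifts far further from the intended result than a single save would:

```python
def save_lossy(pixels, q=10):
    """Crude stand-in for a lossy save: snap every sample to the
    nearest multiple of q. (Real JPEG quantizes DCT coefficients
    per block, but the compounding effect is the same.)"""
    return [round(p / q) * q for p in pixels]

def edit(pixels):
    """A trivial 'edit': brighten by 6. Any edit that moves values
    off the quantization grid will do."""
    return [p + 6 for p in pixels]

original = [101, 57, 123, 88, 49]

ideal = list(original)        # edited losslessly, never re-encoded
lossy = save_lossy(original)  # saved lossily once up front
for _ in range(10):
    ideal = edit(ideal)
    lossy = save_lossy(edit(lossy))  # re-encoded after every edit

# Error from one lossy save of the finished edit vs. ten re-saves:
one_save = max(abs(a - b) for a, b in zip(ideal, save_lossy(ideal)))
drift    = max(abs(a - b) for a, b in zip(ideal, lossy))
print(one_save, drift)  # 3 43
```

A single save of the finished image is off by at most half a quantization step, but re-saving after every edit lets the rounding errors compound, generation after generation.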
Oh, so it happens ... Shuffles papers ... When someone degrades the quality intentionally.
That happens if you reduce the DPI of your raw image too, btw.
Not "over time" !
Don't tell the kids over on Dormi.zone that.