this post was submitted on 24 Sep 2024
39 points (100.0% liked)

Linux

Hello. I know this isn't completely related to Linux, but I was still curious about it.

I've been looking at Linux laptops and one that caught my eye from Tuxedo had 13 hours of battery life on idle, or 9 hours of browsing the web. The thing is, that device had a 3k display.

My question is: as someone used to 1080p who always tries to maximise the battery life of a laptop, would lowering the display resolution be helpful? And if so, is it even worth it, or are the benefits too small to notice?

top 27 comments
[–] Max_P@lemmy.max-p.me 44 points 1 month ago

No, the majority of the energy consumption is in the backlight.

[–] MachineFab812@discuss.tchncs.de 10 points 1 month ago* (last edited 1 month ago) (2 children)

Maybe if it allowed you to switch to integrated graphics versus discrete, putting the GPU to sleep.

For just browsing, even integrated graphics has been plenty since the beginning of the internet, maybe with some exceptions when Flash gaming reached its pinnacle.

[–] merthyr1831@lemmy.ml 3 points 1 month ago (1 children)

That might save a bit of power, but your dedicated GPU is usually in an idle/powered-down state until your compositor gives it specific applications to accelerate. For Nvidia laptops, this is what the PRIME/Optimus feature does.
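For illustration, a minimal Python sketch of pushing a single program onto the otherwise-idle dGPU via PRIME render offload. It assumes a standard Mesa or NVIDIA proprietary-driver setup where these environment variables are honoured; the app choices are just examples.

```python
# Illustrative sketch: launch one app on the dGPU via PRIME render offload
# while everything else keeps running on the iGPU.
import os
import subprocess

def run_on_dgpu(cmd, nvidia=False):
    env = os.environ.copy()
    if nvidia:
        # NVIDIA proprietary driver: PRIME render offload variables
        env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
        env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"
    else:
        # Mesa (iGPU + secondary GPU): ask for the non-default GPU
        env["DRI_PRIME"] = "1"
    return subprocess.run(cmd, env=env)

# Example: compare which GPU renders by default vs. with offload enabled
subprocess.run(["glxinfo", "-B"])             # usually reports the iGPU
run_on_dgpu(["glxinfo", "-B"], nvidia=True)   # should report the dGPU
```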

[–] MachineFab812@discuss.tchncs.de 1 points 1 month ago

Even back when I was in the laptop-repair game, this was the kinda stuff people would expect me to know about their machines, and I hated it. I saw too many features come and go over the years to keep track of even half of it on behalf of others.

[–] Fisch@discuss.tchncs.de 2 points 1 month ago (1 children)

Using the iGPU might save power, but the resolution doesn't need to be turned down for that.

[–] MachineFab812@discuss.tchncs.de 1 points 1 month ago

Depends on the iGPU, but this being a damn near brand-new laptop, I'm sure you're right.

[–] vole@lemmy.world 8 points 1 month ago (2 children)

I'd think so. 3k is so many pixels to compute and send 60 times a second.

But this video says the effect on battery life in their test was like 6%, going from 4k to 800x600. I can imagine that some screens are better at saving power when running at lower resolutions... but what screen manufacturer would optimize energy consumption for anything but maximum resolution? 🤔 I guess the computation of the pixels isn't much compared to the expense of having those physical dots. But maybe if your web browser was ray-traced? ... ?!

Also, if you take a 2880x1800 screen and divide by 2 (to avoid fractional scaling), you get 1440x900 (this is not 1440p), which is a little closer to 720p than 1080p.
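Rough pixel counts for the resolutions mentioned above, for comparison:

```python
# Back-of-the-envelope pixel counts relative to the 2880x1800 native panel.
resolutions = {
    "native (2880x1800)": (2880, 1800),
    "half   (1440x900)":  (1440, 900),
    "1080p  (1920x1080)": (1920, 1080),
    "720p   (1280x720)":  (1280, 720),
}

native = 2880 * 1800
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / native:.0%} of native)")

# native: 5,184,000 px; half: 1,296,000 px (exactly 25%, since both axes halve)
# 1080p:  2,073,600 px (40%); 720p: 921,600 px (~18%)
```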

[–] Strit@lemmy.linuxuserspace.show 8 points 1 month ago

But you don't lower the number of pixels you use. You just up the number of physical pixels used to display each logical "pixel" when lowering the resolution. So the same amount of power is going to be used to turn those pixels on.

[–] merthyr1831@lemmy.ml 2 points 1 month ago

Your GPU doesn't need to re-render your entire screen every frame. Your compositor will only send regions of the screen that change for rendering, and most application stacks are very efficient with laying out elements to limit the work needed.

At higher resolutions those regions will obviously be larger, but they'll still take up roughly the same % of the screen space.
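To put rough numbers on that (purely illustrative; the 5% damaged-area figure is made up):

```python
# A damaged region covering the same fraction of the screen touches more
# absolute pixels at 3K than at 1080p (about 2.5x here), but the untouched
# majority of the framebuffer is skipped either way.
def damaged_pixels(width, height, fraction):
    return int(width * height * fraction)

for name, (w, h) in {"1080p": (1920, 1080), "3K": (2880, 1800)}.items():
    total = w * h
    dirty = damaged_pixels(w, h, 0.05)  # e.g. a blinking cursor + a small widget
    print(f"{name}: {dirty:,} of {total:,} pixels redrawn ({dirty / total:.0%})")
```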

[–] merthyr1831@lemmy.ml 6 points 1 month ago (1 children)

Unless you're running games or 3D-intensive apps, no. Resolution is cheap on power under normal circumstances.

[–] bruce965@lemmy.ml 2 points 1 month ago (1 children)

As a web developer, I noticed that some elements such as very big tables struggle to render at 4K but are absolutely fine at 1080p. I would assume that means the CPU and/or GPU are more taxed to draw at higher resolution, and therefore I assume they would draw more power. I might be mistaken. Do you speak from experience?

[–] merthyr1831@lemmy.ml 2 points 1 month ago (1 children)

I'm a Flutter dev, and I've seen testimony from a former Windows 98 dev about limiting the number of redraws in the shell.

There's deffo extra overhead, but it's not linear: 4K being 4 times as many pixels as 1080p doesn't mean 4x the work to render after the first frame, as the browser/framework will cache certain layout elements.

The initial layout is still expensive, though, so big tables will take longer, but that big table at high res will probably be less chuggy when scrolling once loaded.

[–] bruce965@lemmy.ml 1 points 1 month ago

I am not sure... in the case I'm referring to, they were also lagging when scrolling. But it was React, so native browser rendering. And they were actually very large tables, so we had to do some funny things like viewport culling (see react-window).

For what it's worth, I've never had any similar performance issues with tables in Flutter (web with the canvas-based render engine, not Android) when applying the same culling technique; they just ran fine at any resolution. Different hardware, though, so it's not an apples-to-apples comparison.
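For anyone curious, here's a rough, framework-agnostic sketch of the windowing idea behind react-window; the function and numbers are just illustrative, not any library's actual API:

```python
# Viewport culling ("windowing"): only the rows that intersect the visible
# viewport get rendered, no matter how long the table actually is.
def visible_range(scroll_offset, viewport_height, row_height, total_rows, overscan=2):
    first = max(0, scroll_offset // row_height - overscan)
    last = min(total_rows, (scroll_offset + viewport_height) // row_height + overscan + 1)
    return first, last

rows = [f"row {i}" for i in range(100_000)]   # a huge table
first, last = visible_range(scroll_offset=12_000, viewport_height=900,
                            row_height=30, total_rows=len(rows))
rendered = rows[first:last]                   # ~35 rows rendered instead of 100,000
print(first, last, len(rendered))
```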

In any case, just to be safe, I would personally assume fewer pixels = less work = less power = more battery life. My opinion is very unscientific, though.

[–] PetteriPano@lemmy.world 6 points 1 month ago (1 children)

My PowerBook G4 might be a bit dated, but running resolutions other than native is quite heavy on that thing. Your built-in display can handle one resolution only; anything else will require upscaling.

Your GPU can probably do that upscaling for cheap. But cheaper than rendering your desktop applications? 🤷‍♂️

You'll have to benchmark your particular device with powertop.
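A rough sketch of what such an A/B run could look like with powertop's CSV report. It assumes powertop is installed (it typically needs root), and the output name "eDP-1" and the mode names are placeholders you'd have to check against your own `xrandr` output:

```python
# Measure power at two resolutions and compare the reported discharge rates.
import subprocess

def measure(label, mode, seconds=60):
    # Switch the internal panel to the given mode (output/mode names are
    # device-specific assumptions), then record a powertop CSV report.
    subprocess.run(["xrandr", "--output", "eDP-1", "--mode", mode], check=True)
    subprocess.run(["powertop", f"--csv=powertop-{label}.csv", f"--time={seconds}"],
                   check=True)

measure("native", "2880x1800")
measure("scaled", "1920x1080")
# Compare the battery discharge rate reported in the two CSV files.
```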

[–] bruce965@lemmy.ml 1 points 1 month ago* (last edited 1 month ago) (1 children)

Isn't rescaling usually done by the display driver? I am fairly certain this is the case for external displays. Are laptop displays any different?

Edit: with "display driver" I mean the hardware chip behind the display panel, dedicated to converting a video signal to the electrical signals necessary to turn on the individual pixels.

[–] PetteriPano@lemmy.world 2 points 1 month ago

For an external display, I'd bet it's the hardware driver for the panel.

At least my 17" Powerbook G4 with a massive 2560x1440 display does it in the software display driver. I'm sure some laptop panels do it in hardware as well, but seems there's some very janky shit going on at least with laptops that have both integrated and discrete GPUs.

[–] jhdeval@lemmy.world 5 points 1 month ago

The display on my laptop is 4K, and I can tell you I tried downscaling; it was not as big a difference as simply turning the brightness down as low as was comfortable.

[–] InvertedParallax@lemm.ee 5 points 1 month ago

Yes, but by very little.

You're saving on GPU processing, but that's unlikely to be that much for browsing.

[–] Omega_Jimes@lemmy.ca 3 points 1 month ago

I don't think it would matter that much since a desktop at 3k is very similar on modern hardware to a desktop at 1080.

But I'd be interested in someone who has the hardware to test this. Right now I use my laptop for school work, and in trying to squeeze out every ounce of battery life I was running my display at 45 Hz instead of 60 Hz. I had a free day during the summer, so I charged it up, ran a YouTube video on repeat and timed the battery life, then changed the display frequency; it was like a 2-minute difference. I also tried it while running a second 1080p monitor through HDMI, and the difference was something like 10 minutes. Like, so small a difference it didn't matter.

I don't have the data sheet anymore so these numbers are anecdotal etc etc YMMV. The biggest change for me was buying a 65w PD battery bank and keeping that charged in my bag.

[–] CMDR_Horn@lemmy.world 2 points 1 month ago (1 children)

You'll have to downscale the resolution unless you have superhuman vision. I suspect that the laptop is configured to ~150% scaling out of the box, which would mean those battery estimates are based on that as well.

[–] floofloof@lemmy.ca 10 points 1 month ago (1 children)

I don't think upscaling the text/UI and downscaling the whole screen are the same thing.
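To make that distinction concrete with the 2880x1800 panel from earlier (a rough sketch; the 150% figure comes from the comment above):

```python
# 150% UI scaling changes the *logical* desktop size, but the panel still
# lights up and refreshes every physical pixel.
physical = (2880, 1800)
scale = 1.5                      # the "~150% out of the box" guess

logical = (physical[0] / scale, physical[1] / scale)
print("logical desktop:", logical)                # (1920.0, 1200.0)
print("physical pixels driven:", physical[0] * physical[1])  # 5,184,000 either way

# Lowering the *resolution* to 1920x1080 instead means the GPU renders fewer
# pixels, but something then has to stretch that image back over 2880x1800.
```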

[–] MachineFab812@discuss.tchncs.de 2 points 1 month ago* (last edited 1 month ago)

The one usually works best with the other, though.

EDIT: nm, I see what you were getting at in their comment now. They also meant downscaling the text/UI, not upscaling.

[–] bloodfart@lemmy.ml 2 points 1 month ago* (last edited 1 month ago) (1 children)

Short answer: no.

Long answer: also no, but in some specific circumstances yes.

Your display uses energy to do two things: change the colors you see and make them brighter or dimmer. Honestly speaking, it has a little processor in it, but that sucker is so tiny and energy-efficient that it's not affecting things much, and you can't affect it anyway.

There are two ways to do the things your display does. One way is to have a layer of tiny shutters that open up when energized and allow light through their red, blue or green tinted windows in front of a light source. In this case you can use two techniques to reduce the energy consumption: open fewer shutters or reduce the intensity of the light source. Opening fewer shutters seems like it would be part of lowering the resolution, but when you lower the resolution you just get more shutters open for one logical "pixel" in the framebuffer (more on that later).

Another way to do what your display does is to have a variable light source behind each tinted window and send more or less luminance through each one. In this case there is really only one technique you can use to reduce the energy consumption of the display, and that's turning down the brightness. This technique has the same effect as before when you lower the resolution. It's worth noting that a "darker" displayed image will consume less energy in this case, so if you have an OLED display, consider using a dark theme!

So the display itself shouldn’t save energy with a lowered resolution.

Your GPU has a framebuffer, which is some memory that corresponds to the display frame. If that display is running at a lower resolution, the framebuffer will be smaller, and if it's running at a higher resolution, it'll be bigger. Memory is pretty energy-efficient nowadays, so the effect of a larger framebuffer on energy consumption is negligible.
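For a rough sense of scale, assuming a typical 4-bytes-per-pixel format:

```python
# Framebuffer size and raw 60 Hz scanout bandwidth at two resolutions.
def framebuffer_bytes(w, h, bytes_per_px=4):
    return w * h * bytes_per_px

for name, (w, h) in {"1080p": (1920, 1080), "3K (2880x1800)": (2880, 1800)}.items():
    fb = framebuffer_bytes(w, h)
    print(f"{name}: {fb / 2**20:.1f} MiB per frame, "
          f"{fb * 60 / 2**30:.2f} GiB/s scanout at 60 Hz")

# 1080p: ~7.9 MiB/frame, ~0.46 GiB/s; 3K: ~19.8 MiB/frame, ~1.16 GiB/s.
# Small change next to a backlight that can draw a few watts at full brightness.
```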

Depending on your refresh rate, the framebuffer gets updated some number of times a second. But the GPU doesn't just completely wipe, rewrite and resend the framebuffer; it just changes the stuff that needs it. So when you move your mouse at superhuman speed exactly one cursor width to the left in one sixtieth of a second, only two cursor-sized areas of the framebuffer get updated: the place the cursor was gets updated to reflect whatever was underneath, and the place the cursor is gets updated with a cursor on it.
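A toy sketch of that cursor example (the sizes are made up):

```python
# Only two small "damage" rectangles get rewritten, not the whole framebuffer.
CURSOR_W, CURSOR_H = 24, 24
SCREEN_W, SCREEN_H = 2880, 1800

old_pos = (1000, 500)
new_pos = (1000 - CURSOR_W, 500)      # moved one cursor width to the left

damage = [
    (*old_pos, CURSOR_W, CURSOR_H),   # restore what was underneath
    (*new_pos, CURSOR_W, CURSOR_H),   # draw the cursor at its new spot
]
updated = sum(w * h for (_, _, w, h) in damage)
print(f"{updated} of {SCREEN_W * SCREEN_H} pixels touched "
      f"({updated / (SCREEN_W * SCREEN_H):.4%})")
# 1152 of 5,184,000 pixels, roughly 0.02% of the framebuffer
```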

Okay but what if I’m doing something that changes the whole screen at my refresh rate? In that case the whole framebuffer gets updated!

But that doesn’t often happen…

Let's say you're watching a movie. It's 60 fps source material, so wouldn't the framebuffer be updating 60 times a second? No! Not only is the video itself encoded to reflect which colors don't change from frame to frame, so the thing decoding it doesn't need to worry about those parts, but the decoder is also actively looking for even more ways to avoid doing the work of changing parts of the framebuffer.

So the effect of a larger framebuffer on battery is minimized while playing movies, even when the frame buffer is huge!

But actually decoding a 3K movie is much more CPU-intensive than 1080p. So maybe watch in 1080p, but that's not about your display or its resolution; it's the resolution of the source material.

Okay, but what about games? Games use the framebuffer too, but because they aren't pre-encoded, they can't take advantage of someone having already done the work of figuring out what parts are gonna change and what parts aren't. So you pop into E1M1 and the only way the computer can avoid updating the whole framebuffer is when the stuff Chocolate Doom sends it doesn't change the whole framebuffer, like those imps marching in place.

But Chocolate Doom still renders the whole scene, using computer resources to calculate and draw the frame and send it to the framebuffer, which looks up and says, "you did all this work to show me imp arms swinging over a one-inch-square portion of screen area?"

But once again, Chocolate Doom takes more computer resources to render E1M1 at 3K than at 1080p, so maybe turn down your game resolution to save that energy.

Hold on, what about that little processor on the display? Well, it can do lots of stuff, but most of the time it's doing scaling calculations so that when you run Chocolate Doom full screen at 1080p, the image is scaled across the whole screen as accurately and as nicely as possible instead of being stuck at the top left or in the middle or something. So in that case you could actually make that little sucker do less work and take up less energy by running at the display's "native" resolution than if you were at 1080p.

So when Jigsaw traps you in his airport-terminal-shaped funhouse and you wake up with the exploder on your neck and a note in front of you that says "kill Carmack" and no charger in your bag, yes, you will save energy running at a lower resolution.

E: running Chocolate Doom at a lower resolution, not the display.

[–] averyminya@beehaw.org 1 points 1 month ago

Color change, eh? Sounds like B+W makes displays more energy efficient, that should be significant!

(/s)

[–] MangoPenguin@lemmy.blahaj.zone 2 points 1 month ago

It would, but it would be a very small difference. Maybe 2-3% at most.

[–] Xiisadaddy@lemmygrad.ml 1 points 1 month ago (1 children)

You're not gonna get much from that. You're much better off looking for more efficient processors. If you're looking at a brand-new Tuxedo, you have a pretty high budget already, so I'd suggest waiting a bit and looking at the new mobile CPUs coming out from Intel and AMD, which seem to have really good efficiency. Linux support for them should roll out pretty quickly since, being x86, they don't have the same challenges as ARM chips.

[–] theshatterstone54@feddit.uk 1 points 1 month ago

Good to hear that they're better at power efficiency. What's potentially concerning, however, is whether that would lead to manufacturers just using smaller batteries. I want my 80 or 99 Wh battery for the longest battery life! I'll heed your advice and wait to see where things go.