this post was submitted on 01 Sep 2023
91 points (85.8% liked)
Linux
X11's core design is very bad for normal desktop use, though. You've got all of these pluggable resources that can run on different computers, attached through pipes or network connections, with tons of extensions and trickery to get things like hardware acceleration and touchpad support working.
There are downsides to Wayland, but X11's design plain sucks. It was designed for timesharing mainframes, not for a desktop computer. Running X11 on a desktop is like running the Mars rover firmware in your car: with plenty of extensions and modifications it'll work wonderfully, but you're not getting the most out of it because the core design principles don't match up with the hardware you're running.
This is not an insult to the people behind X11. The fact they've got all of these modern features to work at all is an impressive accomplishment.
The people behind X11 agree, and that's why they started Wayland.
I'm aware of the overlap, but some people take any criticism of X11 as some kind of insult to the X project and the people behind it.
Sure, but the people behind X11 are the same ones behind Wayland. The developers didn't think it was worth the time to fix X11 and decided it would be better to start a new project to fix the issues. So how does it make any sense for end users to think we should just fix X11? I think their biggest mistake is that they should have called Wayland X12 or something like that.
[This comment has been deleted by an automated system]
Wayland 1.0 was released in 2012, though.
I feel that the biggest mistake of X11's protocol design is the idea of a "root window" that is supposed to cover the whole screen.
Perhaps that worked well in the 1990s, but it's just completely incompatible with the multi-display setups we commonly see today. Hacks upon hacks were involved to make multiple displays a possibility on X11. The root window no longer corresponded to a single display, and in heterogeneous display setups, part of the root window is actually invisible.
Later on we decided to stack compositing on top of the already-hacky mess, and it was so bad that many opted to disable the compositor (no Martha, compositors are more than wobbly windows!).
And then there's the problem of sandboxing programs... Which is completely unmappable to X11 even with hacks.
Multiple displays work fine. The only thing that needs to be drawn in the root window is attractive backgrounds sized to your displays. I'm not sure why you think that is hacky or complicated.
Multiple displays only work as long as you have identical resolutions and refresh rates. Good luck mixing monitors with different scaling factors and refresh rates on X11.
I run multiple refresh rates without any trouble, one 165hz monitor alongside my other 60hz ones. Is that supposed to be broken somehow?
This wasn't true in 2003 when I started using Linux; in fact, the feature is so old I'm not sure exactly when it was implemented. You have always been able to have different resolutions, and in fact different scaling factors. It works like this:
You scale your lower-DPI display or displays UP to match your highest-DPI one and let X scale down to the physical size. HIGHER / LOWER = SCALE FACTOR. So with two 27" monitors where one is 4K and the other is 1080p, the factor is 2; a 27" 4K with a 24" 1080p is roughly 1.75.
Configured like so everything is sharp and UI elements are the same size on every screen. If your monitors are vertically aligned you could put a window between monitors and see the damn characters lined up correctly.
If you use the soooo unfriendly Nvidia GPU, you can actually configure this in its monitor-configuration GUI. If not, you can set it with xrandr; the argument is --scale, shockingly enough.
Different refresh rates also work, of course, but you ARE limited to the lowest refresh rate. This is about the only meaningful limitation.
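The HIGHER / LOWER arithmetic above can be sketched as a quick shell calculation. The output names (DP-0, HDMI-0) and monitor sizes are hypothetical; check your own with `xrandr --query`:

```shell
#!/bin/sh
# Worked example of the HIGHER / LOWER = SCALE FACTOR rule.
# Hypothetical setup: a 27" 4K primary (DP-0) next to a 24" 1080p secondary (HDMI-0).

# Horizontal PPI = horizontal pixels / panel width in inches.
# 0.8716 = 16 / sqrt(16^2 + 9^2), the width fraction of a 16:9 diagonal.
ppi() { awk -v px="$1" -v diag="$2" 'BEGIN { printf "%.1f", px / (diag * 0.8716) }'; }

HIGH=$(ppi 3840 27)   # ~163 PPI on the 27" 4K panel
LOW=$(ppi 1920 24)    # ~92 PPI on the 24" 1080p panel
FACTOR=$(awk -v h="$HIGH" -v l="$LOW" 'BEGIN { printf "%.2f", h / l }')
echo "scale factor: $FACTOR"

# Upscale the lower-DPI output so UI elements are the same size on both screens.
# (Printed rather than executed here; run the xrandr line on a live X session.)
echo "xrandr --output HDMI-0 --scale ${FACTOR}x${FACTOR}"
```

Note the result lands near the "roughly 1.75" figure quoted above; the exact value depends on the true panel dimensions.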
It's the fact that the root window is a lie.