Wayland. It comes up a lot: “Bug X fixed in the Plasma Wayland session.” “The Plasma Wayland session has now gained support for feature Y.” And it’s in the news quite …
Then the problem is that it’s abandoned, not that it has stagnated (which can also be phrased as “stabilized,” depending entirely on context and the speaker’s/author’s personal feelings about the project). Once again, I’m not saying that Xorg is good, but that particular critique needs to stop; its major flaw is that even the “maintainers” are sick of it and want it to die, not that it has ceased major development.
Even the article acknowledges this:
But it also falls into the “Bells and whistles” side of the critique immediately after:
and it even starts off explaining the problems with X by saying it’s in “maintenance mode.” I couldn’t care less about new features; the Pareto principle suggests 80% of users don’t need new features, regardless of how much dopamine they get from seeing the marketing hype. “Maintenance mode” isn’t a bad thing, it’s a good thing. An abandoned project that most GUIs still rely on is a disaster waiting to happen.
That’s an extremely bold and vague claim, with no actual examples. Do I take it on faith that changing code can break things with X? Yes, but having worked with code, I just assume that’s what happens to all software. Do I believe that Wayland has found a way to do away with that problem of software architecture (and not necessarily protocol architecture)? Not unless they’ve somehow found a way to compartmentalize every single module such that every aspect is fully isolated and yet has interfaces for every potential use case that could ever be dreamed up. Any devs in the comments want to pipe up and let me know how that endeavor has worked for them in past projects?
The problem is not the code per se, but that we can’t add anything anymore that doesn’t somehow break the core protocol. The plain fact is that for decades we’ve been tacking things onto X11 that it was never designed to do, and we reached a breaking point a while ago.
Things like multi-DPI setups are impossible to implement in X11’s single-framebuffer model; security on X11 is non-existent, but we can’t retroactively fit any kind of permission model onto the protocol, as that would break X11 applications that (rightfully) assumed they could get a pixmap from the root window. There’s so much more, just take a look at https://www.youtube.com/watch?v=RIctzAQOe44
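To make the root-window point concrete, here’s a minimal sketch (my own illustration, not something from the linked talk) of what “no permissions” means in plain Xlib: any client connected to the display can snapshot the entire root window, and the protocol has no hook where a server could ask for permission without breaking every screenshot and screen-recording tool that relies on this behavior.

```c
/* Illustrative sketch: any X11 client can read the whole screen.
 * Build with: cc dump_root.c -lX11   (file name is just an example) */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);   /* connect to the X server */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    Window root = DefaultRootWindow(dpy);
    XWindowAttributes attrs;
    XGetWindowAttributes(dpy, root, &attrs);

    /* Grab a full-screen snapshot of the root window. Every client on
     * this display can do this; there is no permission check to fail. */
    XImage *img = XGetImage(dpy, root, 0, 0, attrs.width, attrs.height,
                            AllPlanes, ZPixmap);
    if (img) {
        printf("captured %dx%d screenshot, %d bits per pixel\n",
               img->width, img->height, img->bits_per_pixel);
        XDestroyImage(img);
    }

    XCloseDisplay(dpy);
    return 0;
}
```

Retrofitting a permission prompt onto XGetImage would break existing, legitimate clients that assume it always succeeds, which is the compatibility trap described above.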
By all means, feel free to start working on it!
All the people who developed Xorg for 20+ years decided that creating and working on Wayland was a better use of their time. But I’m sure you know better…
The problem isn’t that Xorg is spaghetti code (it’s pretty good for a large C project, imho). The problem is that the X11 protocol was designed to expose the capabilities of 1980s display hardware.
Stop putting words in my mouth. I never mentioned spaghetti code, and I said nothing about being better or smarter than either the Xorg or Wayland devs.
You said that Xorg being abandoned is the problem. How should we interpret that, other than a criticism of the decision-making process of the devs?
Wayland will become spaghetti too, unless you “compositor-hop” because one compositor is incomplete and you need to switch to another; idk if that would be a good idea