
  • feral_hedgehog@pawb.social · 1 year ago

    There’s a very nice (albeit somewhat outdated) talk here.

    In a nutshell, both X11 and Wayland are protocols that define how software should communicate in order to (hopefully) display stuff on your screen.
    Protocols as in: there’s a bunch of documentation somewhere that says which functions a program must call to create a window, without specifying how either the program or those functions should be implemented.
    This is great because it allows for independently written software to be magically compatible.
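
    To make “a program calling the functions the protocol defines” a bit more concrete, here’s a minimal sketch in C using Xlib, the classic X11 client library (the window size and the sleep are just arbitrary choices for the example):

        /* Minimal X11 client: connect to the X server, create a window, show it.
           Build with: cc x11_demo.c -o x11_demo -lX11 */
        #include <X11/Xlib.h>
        #include <unistd.h>

        int main(void) {
            Display *dpy = XOpenDisplay(NULL);          /* connect to the X server */
            if (!dpy)
                return 1;

            int screen = DefaultScreen(dpy);
            Window win = XCreateSimpleWindow(
                dpy, RootWindow(dpy, screen),           /* parent window */
                0, 0, 320, 240, 1,                      /* x, y, width, height, border */
                BlackPixel(dpy, screen),                /* border colour */
                WhitePixel(dpy, screen));               /* background colour */

            XMapWindow(dpy, win);                       /* ask the server to display it */
            XFlush(dpy);                                /* push the requests over the wire */

            sleep(3);                                   /* keep it on screen briefly */
            XCloseDisplay(dpy);
            return 0;
        }

    The point is that the code only speaks the protocol (via Xlib); whether the thing on the other end is Xorg or something else doesn’t matter to it.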

    X11 is the older protocol; it worked well enough for many years, but has issues handling a bunch of modern, in-demand technologies - issues which can’t be fixed without changing the protocol in a way that would make it incompatible with existing software (and compatibility is the entire point of having a protocol).
    Plus its most used implementation, Xorg, consists of a huge and complex codebase that fewer and fewer people are willing to deal with.

    Wayland is the newer protocol: it mostly does the exact same thing, but better, in a way that allows for newer tech - and it completely breaks compatibility in order to do so.
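
    For comparison, here’s a similarly minimal sketch of speaking the Wayland protocol with libwayland-client; it only connects to the compositor and prints the interfaces (“globals”) it advertises, since even creating a bare window takes noticeably more boilerplate:

        /* Minimal Wayland client: connect to the compositor and list its globals.
           Build with: cc wl_demo.c -o wl_demo -lwayland-client */
        #include <stdio.h>
        #include <wayland-client.h>

        static void on_global(void *data, struct wl_registry *registry,
                              uint32_t name, const char *interface, uint32_t version) {
            printf("global: %s (version %u)\n", interface, version);
        }

        static void on_global_remove(void *data, struct wl_registry *registry,
                                     uint32_t name) {
            /* nothing to do for this example */
        }

        static const struct wl_registry_listener registry_listener = {
            .global = on_global,
            .global_remove = on_global_remove,
        };

        int main(void) {
            struct wl_display *display = wl_display_connect(NULL);  /* uses $WAYLAND_DISPLAY */
            if (!display)
                return 1;

            struct wl_registry *registry = wl_display_get_registry(display);
            wl_registry_add_listener(registry, &registry_listener, NULL);
            wl_display_roundtrip(display);      /* block until the compositor has answered */

            wl_display_disconnect(display);
            return 0;
        }

    Different functions, different wire format - but the same idea: the client and the compositor only have to agree on the protocol, not on each other’s implementation.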

    The trouble with the whole situation was that, in order to replace X with Wayland, basically the entire Linux graphics stack had to be rewritten - and it was, complete with raging debates, flame wars, and Nvidia being lame.
    They also wrote a compatibility layer called Xwayland that lets you keep using older X-only apps - and which somehow manages to outperform Xorg.

    Now we’re at the point where major distributions are not only switching to Wayland by default, but also dropping support for Xorg completely, and announcing that they’ll no longer maintain it, which is why posts about it keep popping up.