One big difference that I’ve noticed between Windows and Linux is that Windows does a much better job ensuring that the system stays responsive even under heavy load.

For instance, I often need to compile Rust code. Anyone who writes Rust knows that the Rust compiler is very good at using all your cores and all the CPU time it can get its hands on (which is good, you want it to compile as fast as possible after all). But that means that while my Rust code is compiling, all my CPU cores are maxed out at 100% usage.
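
(I know I could cap the build's parallelism, as in the sketch below; the job count is just an example and cargo is simply what I happen to use. But I'd rather the OS share the CPU sensibly than slow every build down on purpose.)

    # Rough sketch: leave a couple of cores free for the desktop.
    # The "-2" is arbitrary; cargo's -j flag takes whatever count you give it.
    cargo build --release -j $(( $(nproc) - 2 ))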

When this happens on Windows, I've never really noticed: I can use my web browser and my code editor just fine while the code compiles, so I never gave it much thought.

However, on Linux, when all my cores hit 100% I start to notice it. Every window I have open starts to lag and stutter as the programs fight over what little CPU is left. My web browser goes unresponsive for whole seconds at a time, my editor behaves the same, and even my KDE Plasma desktop environment starts lagging.

I suppose Windows must be doing something clever to somehow prioritize user-facing GUI applications even in the face of extreme CPU starvation, while Linux doesn’t seem to do a similar thing (or doesn’t do it as well).

Is this an inherent problem of Linux at the moment, or can I do something to improve it? I'm on Kubuntu 24.04 if it matters. Also, I don't believe it's a memory or I/O problem: when it happens, my memory sits at around 60% usage with 0% swap used, while my CPU sits at basically 100% on all cores. I've also tried disabling swap entirely and it doesn't seem to make a difference.
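
If it helps to diagnose, I can grab the kernel's pressure-stall numbers during a compile; as far as I know PSI is enabled on stock Ubuntu kernels, but that's an assumption. Roughly how I'd check:

    # Show which resource tasks are actually stalling on (CPU vs memory vs I/O).
    # head prints a "==> file <==" header for each file, so the three are labelled.
    watch -n 1 head /proc/pressure/cpu /proc/pressure/memory /proc/pressure/io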

EDIT: Tried nice -n +19, still lags my other programs.
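
For reference, I ran it roughly like the first line below (cargo build just stands in for my usual compile). From what I've read, per-process nice can be largely neutralised when the kernel's session autogrouping is enabled, so maybe nice on its own isn't a fair test; the other lines are sketches of what I understand the alternatives to be, not things I've verified:

    # What I tried, more or less:
    nice -n 19 cargo build --release

    # Check whether autogrouping is active (1 = enabled). With autogrouping
    # on, nice mostly only matters relative to processes in the same session,
    # not to the rest of the desktop:
    cat /proc/sys/kernel/sched_autogroup_enabled

    # Untested alternative: run the build in its own cgroup with a low CPU
    # weight (needs cgroup v2 and the cpu controller delegated to the user
    # slice; the weight value 20 is arbitrary):
    systemd-run --user --scope -p CPUWeight=20 cargo build --release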

EDIT 2: Tried installing the Liquorix kernel, which is supposedly better for this kind of thing. It might be placebo, but things do feel a bit snappier now and my mouse feels more responsive. Either way, I tried compiling again and it still lags my other programs.

  • Baldur Nil · 5 months ago

    You’re right that garbage collection makes Go simpler, and maybe other patterns do help keep complexity from piling up. I’ve never worked with Go outside of silly examples to try it out, so I’m no authority on it.

    What I meant was more of a general rule: the simpler a language is, the more code is needed to express the same thing, and then the intent can become nebulous or the person reading might miss something. Besides, when the language doesn’t offer feature X, it becomes the programmer’s job to manage it by hand, which adds extra mental load and can introduce pesky bugs (e.g. managing null safety with extra checks, tracking pointers and bounds checking in C, and so on…).

    Also, there are studies showing that the number of bugs in a piece of software correlates with its lines of code. That can simply mean the software is doing more, but it also means the more characters you have to read and write, the higher the chance of something going wrong.

    But yeah, this subject depends on too many variables and some may outweigh others.