• CatsGoMOW@lemmy.world · 6 days ago

    I’ve had so many different laptops over the past ~10 years, ranging from Dell, ThinkPad, System76, and Asus to a newer MacBook. I’ve used Windows, Linux, and macOS as daily-driver OSes. The only Arm chip I’ve had is my current MacBook, but to answer your question, its power efficiency is unmatched compared to anything else I’ve ever owned… resulting in crazy battery life, as well as a device that doesn’t try to melt a hole through my lap whenever I do something even remotely taxing on it.

    • the_riviera_kid@lemmy.world · 6 days ago

      The efficiency had better be good; that’s specifically what the RISC architecture (ARM stands for Advanced RISC Machine) was designed for, but it’s also why it’s terrible for general-purpose use. ARM is also proprietary, which always stifles progress because of licensing (greed). It’s one of the reasons RISC-V is becoming popular despite being less efficient.

      • brucethemoose@lemmy.world · 6 days ago

        This is kinda a myth. ARM is fine for HPC or desktop use (hence very high-power ARM designs like the Fujitsu A64FX, Ampere Altra, or the European Rhea), and x86 is fine for low power; it’s more about how the specific chip is tuned for power, raw performance, and price.

        Apple seems very good partly because they pay top dollar for power efficiency and a cutting-edge low-power process. Most x86 laptop chips make more significant cost tradeoffs: cheaper dies, higher clocks, more aggressive power curves, and so on.

        • the_riviera_kid@lemmy.world · 6 days ago

          It’s not a myth, though; ARM themselves admit it: https://www.arm.com/glossary/risc

          “With RISC, a central processing unit (CPU) implements the processor design principle of simplified instructions that can do less but can execute more rapidly.”

          None of this is to say RISC (or by extension ARM) is bad, just that, as things currently stand, it’s not a good choice for everyday computing. By design it’s as lightweight and simple as possible so that it can perform its specific function faster and more efficiently, with less overhead, than a more general-purpose processor.

          GeeksforGeeks has a good writeup on it.

          https://www.geeksforgeeks.org/computer-organization-risc-and-cisc/
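
          To make the distinction concrete, the textbook example is a read-modify-write on memory: a CISC ISA like x86 can encode it as a single instruction, while a load/store RISC ISA like ARM splits it into three. A simplified sketch (illustrative mnemonics, not exact encodings):

          ```asm
          ; CISC (x86-style): one instruction loads, adds, and stores
          add dword [counter], 1

          ; RISC (ARM64-style): separate load, register add, and store
          ldr w0, [x1]        ; load the value from memory
          add w0, w0, #1      ; add in a register
          str w0, [x1]        ; store the result back
          ```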

          • lka1988@lemmy.dbzer0.com · 5 days ago

            Just to add: a rather large reason the technology we have today even exists is thanks in no small part to the x86 architecture and its immense backwards compatibility.

          • oo1@lemmings.world · 4 days ago

            Laptops run off batteries a lot of the time, so trading outright performance (and the full instruction set) for battery life will be attractive to many laptop users who work on the go.

            I’m no Apple fanatic, and I’d never get one, but I do see the appeal of those Apple laptops.

            I’m sure x86 could get closer on the performance-to-battery tradeoff if they wanted to, but I bet they’d be looking to price up at the Apple level for that.

          • brucethemoose@lemmy.world · 6 days ago

            I mean, that doesn’t mean much.

            The Fujitsu A64FX had full 512-bit SVE, with 2x 512-bit units per core and HBM memory, which is about as CISC as it gets. IIRC it was the “widest” CPU at the time, able to get the most done per clock, and the US Department of Energy seemed to love them.

            And then you have tiny x86 cores, like Intel’s in-order ones, that are way thinner than many ARM designs.

            The reality is that decoding doesn’t take up much die space these days, and everything gets decoded into micro-ops anyway. The ISA has an effect, but efficiency and suitability for different platforms come down to design and business decisions more than to the ISA.
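
            To illustrate the micro-op point: a modern x86 front end cracks a complex memory-operand instruction into RISC-like micro-ops internally, conceptually something like this (a rough sketch, not any real machine’s actual µop format):

            ```asm
            ; architectural x86 instruction as fetched:
            add dword [rbx], eax

            ; decoded internally into micro-ops, conceptually:
            load   tmp, [rbx]       ; read memory into a temp register
            add    tmp, tmp, eax    ; plain register-register add
            store  [rbx], tmp       ; write the result back
            ```

            So past the decoder, an x86 core is executing load/store-style operations much like an ARM core would.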