• SturgiesYrFase@lemmy.ml · 1 year ago

    While this will speed up loads of things, it also feels like it will end up being another way to remove upgradability from devices. Want more RAM in your desktop? Buy a new CPU.

    • Patapon Enjoyer@lemmy.world · 1 year ago

      The article says the whole CPU has like 200MB of memory, so it’s not really replacing the RAM already in PCs. Plus this seems focused on AI applications, not general computing.

      • SturgiesYrFase@lemmy.ml · 1 year ago

        *hits blunt* That’s just your opinion, man.

        And that’s fair. At the same time, it’s still quite new; once it’s matured a bit I could definitely see this being how things go until… idk, hardlight computing or whatever.

      • DaPorkchop_@lemmy.ml · 1 year ago

        So… they’ll probably add some slower, larger-capacity memory chips on the side, and then they’ll need to copy data back and forth between the slow off-chip memory and the fast on-chip memory… I’m pretty sure they’ve just invented cache.
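
        For illustration, here is that copy-back-and-forth written out as a toy software-managed cache: a handful of fast "on-chip" blocks sitting in front of a big slow store, with the least recently used block written back when space runs out. The sizes and names below are made up, and real hardware does this in silicon rather than code; it is just the general pattern being described, not anything specific from the article.

        ```python
        # Toy software-managed cache: a few fast "on-chip" blocks in front of a
        # large slow "off-chip" memory. Sizes and names are illustrative only.
        from collections import OrderedDict

        BLOCK = 64                     # bytes per cached block
        FAST_BLOCKS = 4                # pretend on-chip capacity: 4 blocks

        slow_mem = bytearray(1 << 16)  # big, slow off-chip memory
        fast_mem = OrderedDict()       # block base address -> bytearray, in LRU order

        def _load_block(base: int) -> None:
            """Copy a block on-chip, writing back the least recently used one if full."""
            if len(fast_mem) >= FAST_BLOCKS:
                old_base, old_block = fast_mem.popitem(last=False)   # evict LRU block
                slow_mem[old_base:old_base + BLOCK] = old_block      # write it back
            fast_mem[base] = bytearray(slow_mem[base:base + BLOCK])  # copy block in

        def read(addr: int) -> int:
            base = addr - addr % BLOCK
            if base not in fast_mem:          # miss: go out to slow memory
                _load_block(base)
            fast_mem.move_to_end(base)        # mark block as recently used
            return fast_mem[base][addr - base]

        def write(addr: int, value: int) -> None:
            read(addr)                        # make sure the block is resident
            base = addr - addr % BLOCK
            fast_mem[base][addr - base] = value   # gets written back on eviction
        ```

        Which is, of course, exactly what the existing L1/L2/L3 hierarchy already does in hardware.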

    • glimse@lemmy.world · 1 year ago

      I mean, the physical distance between the RAM and CPU will eventually be the limiting factor, right? It’s inevitable for more reasons than profit.
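
      Some rough numbers behind that point (every figure below is a ballpark assumption, not a measurement from the article): at a few GHz, a signal only covers a few centimetres of trace per clock cycle, so a DIMM sitting ~10 cm from the socket costs several cycles of round-trip flight time on top of the DRAM access itself.

      ```python
      # Back-of-envelope: how far a signal travels in one clock cycle.
      # Every number here is a ballpark assumption, not a measurement.
      C = 3.0e8                # speed of light, m/s
      TRACE_SPEED = 0.5 * C    # rough signal speed on a PCB trace (~0.5c)
      CLOCK_HZ = 3.0e9         # a 3 GHz core clock

      cm_per_cycle = TRACE_SPEED / CLOCK_HZ * 100
      print(f"~{cm_per_cycle:.0f} cm of trace per cycle")        # ~5 cm

      dimm_distance_cm = 10    # hypothetical CPU-to-DIMM trace length
      flight_cycles = 2 * dimm_distance_cm / cm_per_cycle
      print(f"~{flight_cycles:.0f} cycles round trip in flight time alone")
      ```

      The flight time itself is small next to DRAM’s own access latency, but it is the one part that never shrinks with a better process node, which is the usual argument for pulling memory on-package.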