• @zygo_histo_morpheus
    17
    5 months ago

    Peak dishwasher is a great concept and I think it highlights something important in the way we think of technology. There’s often this underlying assumption of technological progress, but if we look at a particular area (e.g. dishwashers) we can see that after a burst of initial innovation the progress has basically halted. Many things are like this and I would in fact wager that a large portion of technologies that we use haven’t actually meaningfully developed since the 80s. Computers are obviously a massive exception to this - and there are several more - but I think that we tend to overstate the inevitability of technological progress. One day we might even exhaust the well of smaller and faster computers each year and I wonder how we will continue to view technological progress after that.

    • @[email protected]
      4
      5 months ago

      There is a lot of fake progress. In computer technology some things have been refined, but the only true technological novelty of the last 20 years was containerization. And maybe AI. The Internet was the previous jump, but it’s not really a computer technology, and it affects much, much more than that.

      And Moore’s law already ended some years ago.
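      (For reference, Moore’s law is the observation that transistor counts double roughly every two years. A minimal sketch of what that compounding implies; the starting count and the 20-year horizon are illustrative assumptions, not real chip data:)

      ```python
      # Moore's law as compound doubling: N(t) = N0 * 2**(t / period)
      # n0 and the 20-year horizon below are illustrative, not measured data.
      def projected_transistors(n0: float, years: float, period: float = 2.0) -> float:
          """Project a transistor count forward assuming a fixed doubling period."""
          return n0 * 2 ** (years / period)

      # 20 years of strict two-year doubling would mean a 2**10 = 1024x increase,
      # which real chips have not delivered in recent years.
      print(projected_transistors(1.0, 20.0))  # 1024.0
      ```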

      • @pkill
        8
        5 months ago

        20 years ago, 32-bit systems, CRT monitors, dial-up modems, single-core processors, and HDDs were common, with most people having 160 GB of storage at most. And laptop battery life and thermal performance were just ridiculous in most cases.

        Moore’s law is mostly dead for commercial crap, i.e. JS-heavy, third-party-spyware-filled websites with comparably slow and costly backends, and Electron/React Native bloat on desktop/mobile, because a shorter time to market, and thus paying the devs less, is often much cheaper for a lot of companies.

        Luckily, I’d argue free software proves this trend wrong. There are still a lot of actively maintained, popular programs in C and C++, and a lot of newer ones written in Rust, Dart, or Go.

        • @[email protected]
          1
          5 months ago

          Moore’s law has been dead for a few years now. It’s a fact. That doesn’t mean performance stopped increasing, but it no longer follows the old law. That’s why the industry is shifting to distributed computing.

      • @[email protected]
        3
        5 months ago

        Clock speed and some other areas have stagnated, I’d agree, but graphics cards, wireless communication standards, cheap fast SSDs, and power-efficient CPUs have massively impacted end-user performance in the last 10 years. RISC-V is also a major development that is just getting started.

        • @[email protected]
          1
          5 months ago

          None of those are major breakthroughs. They’re just more computing power; it’s still the same technology.

          Today, LLMs are the prime candidate for a breakthrough. They still have to prove themselves, though: prove that they’re not just a fancy, expensive, useless toy like the blockchain.

          RISC-V is not meant to be a breakthrough. It’s an evolution.

          The Internet was a breakthrough. The invention of the mouse was a breakthrough.

          Increases in power or disk space, new languages or OSes: none of those are breakthroughs. None of them changed how computer programs are made or used.

          The smartphone is a significant thing. Wi-Fi is not really important, though, because you can’t do anything with Wi-Fi that you couldn’t do with Ethernet. But the smartphone and its network, that is a big thing.

          • @[email protected]
            3
            5 months ago

            Sure, not a breakthrough, but they are “real” progress, not fake progress (which is what I was responding to in your earlier comment).