• @[email protected]
    22 · 2 months ago

    Alexa and LLMs are fundamentally not that different from each other. It’s just a slightly different architecture and, most importantly, a much larger network.

    The problem with LLMs is that they require immense compute power.

    I don’t see how LLMs will get into households any time soon. It’s not economical.

    • just another dev
      15 · 2 months ago

      The problem with LLMs is that they require immense compute power.

      To train. But you can run a relatively simple one like phi-3 on quite modest hardware.
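
      For example, something along these lines runs on a single consumer GPU, or even CPU-only if you're patient (a rough sketch, assuming the Hugging Face transformers and accelerate packages and the public microsoft/Phi-3-mini-4k-instruct checkpoint):

      ```python
      # Rough sketch: run Phi-3-mini locally with Hugging Face transformers.
      # Assumes the microsoft/Phi-3-mini-4k-instruct checkpoint (~3.8B parameters).
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "microsoft/Phi-3-mini-4k-instruct"
      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(
          model_id,
          torch_dtype="auto",   # fp16/bf16 on a GPU, fp32 on CPU
          device_map="auto",    # needs accelerate; falls back to CPU without a GPU
      )

      prompt = "Turn off the living room lights."
      inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
      outputs = model.generate(**inputs, max_new_tokens=64)
      print(tokenizer.decode(outputs[0], skip_special_tokens=True))
      ```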

    • @[email protected]
      1 · 2 months ago

      The immense computing power for AI is needed for training LLMs; it takes far less to run a pre-trained model on a local machine.
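
      And if memory is tight, quantization shrinks the footprint further. A sketch of the same idea, using the Phi-3-mini checkpoint mentioned above as an example and assuming the bitsandbytes and accelerate packages plus a CUDA-capable GPU:

      ```python
      # Sketch: load a small pre-trained model in 4-bit to cut memory use further
      # (assumes bitsandbytes + accelerate and a CUDA GPU).
      from transformers import AutoModelForCausalLM, BitsAndBytesConfig

      model = AutoModelForCausalLM.from_pretrained(
          "microsoft/Phi-3-mini-4k-instruct",
          quantization_config=BitsAndBytesConfig(load_in_4bit=True),
          device_map="auto",
      )
      # Training such a model takes a GPU cluster; running it like this fits in a few GB.
      ```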

    • @[email protected]
      -1 · 2 months ago

      The problem with LLMs is that they require immense compute power. I don’t see how LLMs will get into the households any time soon. It’s not economical.

      You realize the current systems run in the cloud?

      • @[email protected]
        1 · 2 months ago

        Well yeah. You could slap Gemini onto Google Home today; you probably wouldn’t even need a new device for that. The reason they don’t do that is economic.
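
        The cloud case really is just an API call from whatever device you already have. A sketch, assuming the google-generativeai Python client and an API key:

        ```python
        # Sketch of what a "Gemini on Google Home" style request boils down to on
        # the device side: the speaker only has to ship the request over the network.
        # Assumes the google-generativeai client; the key is a placeholder.
        import google.generativeai as genai

        genai.configure(api_key="YOUR_API_KEY")
        model = genai.GenerativeModel("gemini-1.5-flash")
        response = model.generate_content("Set a timer for ten minutes.")
        print(response.text)
        ```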

        My point is that LLMs aren’t replacing those devices. They’re essentially the same thing; one is just a trimmed-down version of the other for economic reasons.