Windows 12 and the coming AI chip war
A Windows release planned for next year may be the catalyst for a new wave of desktop chips with AI processing capabilities.

  • @parpol
    6 months ago

    This is a surprising development. I was almost certain big companies wanted nothing more than to lock AI behind a cloud subscription service. Letting us run AI locally is a big win for the consumer.

    I’m still sticking with Linux, though.

    • 𝕽𝖔𝖔𝖙𝖎𝖊𝖘𝖙
      6 months ago

      Cloud AI services will always be more capable. You can do more with near-limitless RAM and processing power than you’ll ever be able to do locally on your mobile/desktop system.

      But locally-run AI’s usefulness was never in question either.

      An example of this is Google, which made the Coral TPU and then the Pixel TPU chips specifically to bring AI to edge and mobile devices. And now they're developing a version of their Bard AI built to run on mobile systems.

      There is a huge market for it, but a lot of challenges to running AI models on lower-power hardware still have to be addressed. As it stands, most AI on these platforms performs only specific operations using very condensed models.
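      To put "condensed models" in perspective, here's a back-of-the-envelope sketch of why quantization matters on low-power hardware. The figures are illustrative only; real quantized formats carry per-block scales and other overhead, so actual files come out somewhat larger:

```python
# Rough memory footprint of a 7B-parameter model at different weight
# precisions. Illustrative arithmetic only -- real quantization formats
# add per-block metadata on top of these numbers.

def model_size_gb(n_params, bits_per_param):
    """Raw weight storage in GiB for a model of n_params parameters."""
    return n_params * bits_per_param / 8 / 1024**3

params = 7_000_000_000
for bits in (32, 16, 8, 4):
    print(f"{bits:>2}-bit: {model_size_gb(params, bits):5.1f} GiB")
```

      Dropping from 16-bit to 4-bit weights shrinks a 7B-parameter model from roughly 13 GiB to about 3.3 GiB of raw weights, which can be the difference between not fitting and fitting in a phone's or edge device's memory.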

    • @[email protected]
      6 months ago

      Not sure if you already know, but - sophisticated large language models can be run locally on basic consumer-grade hardware right now. LM Studio, which I’ve been playing with a bit, is a host for insanely powerful, free, and unrestricted LLMs.
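      For anyone curious what that looks like in practice: LM Studio can expose a local OpenAI-compatible HTTP server for whatever model you've loaded (by default at http://localhost:1234/v1 in recent versions). A minimal sketch assuming that default endpoint; the model name below is a placeholder, not a real identifier:

```python
import json
import urllib.request

def build_chat_request(prompt, url="http://localhost:1234/v1/chat/completions"):
    """Build an OpenAI-style chat completion request for a local server.

    The URL is LM Studio's default local endpoint (an assumption you may
    need to adjust); "local-model" is a placeholder, since the server
    answers with whichever model is currently loaded.
    """
    payload = {
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Usage (only works with the local server actually running):
# with urllib.request.urlopen(build_chat_request("Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

      Since the request never leaves localhost, nothing is sent to a third-party cloud.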

    • @[email protected]
      6 months ago

      I’m sure lots of big companies do, but there are lots of big companies, and they all have their own goals. The article goes into it a bit: many companies aren’t exactly leaping to send their most sensitive data off to a third-party cloud. They want AI to work with their data, but they want it local and on-prem.

      Plus you’ve also got Intel, AMD, Nvidia, Qualcomm, etc., who want to sell as much as they can regardless of whether it’s to a cloud customer or a customer looking to train AI locally.

      Being good for the consumer is just a side effect of the blend of money, business paranoia, competition, sensitive data, and the rapid expansion of an industry.