• @[email protected]
    6 months ago

    They’ve been doing ML locally on devices for like a decade, since way before all the AI hype. They’ve also had dedicated ML inference cores in their chips for a long time, which helps the battery life situation.

    • @[email protected]
      6 months ago

      It couldn’t quite be a decade; a decade ago we’d only just gotten VGG. But sure, broad strokes, they’ve been doing local stuff, cool.