• chicken@lemmy.dbzer0.com · 6 hours ago

    The use of local AI does not imply doing that, especially not the centralizing part. Even if some software does collect and store info locally (which is not inherent to the technology, and anything with autosave already qualifies here), that is not nearly as bad privacy-wise as filtering everything through a remote server, especially if there is some guarantee it won’t just randomly start exfiltrating your data, such as the software being open source.

    • Umbrias@beehaw.org · 5 hours ago

      I don’t care if your language model is “local-only” and runs on the user’s device. If it can build a profile of the user (regardless of accuracy) through their smartphone usage, that can and will be used against people.

      emphasis mine from the text you quoted…

      • chicken@lemmy.dbzer0.com · 4 hours ago

        I don’t see how the possibility that it’s connected to some software system for profile building is a reason not to care whether a language model is local-only. The way things are worded here makes it sound like this is just an intrinsic part of how LLMs work, but it isn’t. The model itself still just does text prediction; any “memory” features are bolted on (rough sketch below).
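
        To make that concrete, here is a minimal sketch (all names are assumed for illustration; generate() stands in for whatever local inference call an app might use, not any real product’s API): the “memory” is ordinary application code that stores text and prepends it to the prompt, while the model itself keeps no state between calls.

            # Minimal sketch: a stateless local model plus a bolted-on "memory" layer.
            # generate() is a placeholder, not a real library call.

            def generate(prompt: str) -> str:
                # Stand-in for local text prediction; nothing persists past this call.
                return "echo: " + prompt[-40:]

            class MemoryWrapper:
                """Any profile building lives here, in app code, not in the model."""

                def __init__(self) -> None:
                    self.notes: list[str] = []  # the app decides whether this ever hits disk

                def chat(self, user_msg: str) -> str:
                    context = "\n".join(self.notes)              # bolt the "memory" on
                    reply = generate(context + "\n" + user_msg)  # plain text prediction
                    self.notes.append(user_msg)                  # optional logging, easily left out
                    return reply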

        • Umbrias@beehaw.org · 3 hours ago

          Because these are often sold with profile-building features, for example Recall. Recall is sold as “local only” and builds a profile of your activity, so it is still centralized PII that is a point of failure. As the quote says, and as I said.

          • chicken@lemmy.dbzer0.com · 2 hours ago

            Even with Recall, a hypothetical non-local equivalent would be significantly worse. Whether Microsoft actually has your data or not obviously matters. Most conceivable software that uses local AI wouldn’t need any kind of profile building anyway, for instance Firefox’s translation feature.

            The thing that’s frustrating to me here is the lack of acknowledgement that the main privacy problem with AI services is sending all queries to some company’s server where they can do whatever they want with them.

                  • chicken@lemmy.dbzer0.com · 55 minutes ago

                    Software that is designed not to send your data over the internet doesn’t collect your data. That’s what local-only means. If it does send your data over the internet, then it isn’t local-only. How is it still happening?