• Captain Janeway
    88 days ago

    Well, I’m guessing they actually tested local AI on 4GB and 8GB RAM laptops and realized it would be an awful user experience. It’s just too slow.

    I wish they’d rolled it in as an option, though.

      • suoko
        18 days ago

        Llamafile with the TinyLlama model is 640 MB. It could be a flag to enable, or an extension.
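
        A minimal sketch of what talking to that local model could look like, assuming the llamafile is already running and serving llama.cpp's OpenAI-compatible API on its default port 8080 (the port, endpoint, and model name here are assumptions about a generic setup, not Firefox's actual integration):

            # Query a locally running tinyllama llamafile over HTTP.
            # Assumes the server is up at localhost:8080 and exposes the
            # OpenAI-compatible chat completions endpoint.
            import requests

            resp = requests.post(
                "http://localhost:8080/v1/chat/completions",
                json={
                    "model": "tinyllama",  # illustrative; the server serves whatever model it was started with
                    "messages": [
                        {"role": "user", "content": "Summarize this page in one sentence."}
                    ],
                    "max_tokens": 64,
                },
                timeout=60,
            )
            print(resp.json()["choices"][0]["message"]["content"])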