If you just want to run LLMs quickly on your computer from the command line, this is about as simple as it gets. Ollama provides an easy CLI for generating text, and there’s also a Raycast extension for more powerful usage.
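A minimal sketch of that CLI workflow, assuming Ollama is already installed and using `llama3` as an example model tag (substitute any model available in the Ollama registry):

```shell
# Download a model's weights from the Ollama registry
ollama pull llama3

# Start an interactive chat session with the model
ollama run llama3

# Or generate a one-off completion directly from the shell
ollama run llama3 "Explain what a transformer is in one sentence."

# List the models downloaded locally
ollama list
```

Re-running `ollama pull` on the same tag fetches updated weights if the model has been republished, which is also how locally installed models pick up new versions.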

  • varsock · 6 months ago

    When running models locally, I presume the models are trained and the weights and such are exported as a “model,” for example Meta’s Llama model.

    Do these models get updated, with new versions released? I don’t quite understand.