• @[email protected]
      6 points · 6 months ago

      ??? The top-level commenter wants an LLM with a big context window, and the other commenter responded with an LLM that has a 200k-token context window, which is waaaaaay more than “100 lines of code”.

    • @Echostorm
      2 points · 6 months ago

      Yeah, sorry, I thought that was clear. Tokens are how context is measured.
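
      A rough back-of-the-envelope sketch of the point being made (my own numbers, not from the thread: the characters-per-line and characters-per-token figures below are assumptions, and real tokenizers vary by model):

      ```python
      # Back-of-the-envelope sketch: roughly how many tokens do ~100 lines of
      # code take up, and what fraction of a 200k-token context window is that?
      # The per-line and per-token figures are assumptions, not measurements.

      AVG_CHARS_PER_LINE = 60      # assumed average length of a line of source code
      CHARS_PER_TOKEN = 3.5        # rough heuristic; actual tokenizers differ

      lines_of_code = 100
      approx_tokens = round(lines_of_code * AVG_CHARS_PER_LINE / CHARS_PER_TOKEN)

      context_window = 200_000     # the 200k-token window mentioned above
      share = approx_tokens / context_window

      print(f"~{approx_tokens} tokens for {lines_of_code} lines of code")
      print(f"roughly {share:.1%} of a 200k-token context window")
      ```

      Under those assumptions, 100 lines of code comes out to only a couple of thousand tokens, i.e. around one percent of a 200k-token window.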