In my case, there are 95 packages that depend on zlib, so removing it is absolutely the last thing you want to do. Fortunately though, GPT also suggested refreshing the gpg keys, which did solve the update problem I was having.

You gotta be careful with that psycho!

  • JPSound@lemmy.world · 10 months ago

    I recently asked ChatGPT, "What's a 5-letter word for a purple flower?" It confidently responded "Violet." It's no surprise it gets far more complex questions wrong.

    • Akisamb · 10 months ago

      These models do not see letters, only tokens. To the model, "violet" is probably two symbols, "viol" and "et". Short of memorizing the number of letters in each token, the model has no way to know how many letters a word contains.
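A minimal sketch of the point above, using a made-up vocabulary and a greedy longest-match tokenizer (not any real model's tokenizer): the model only ever receives token IDs, so the letter count of the underlying word is invisible to it.

```python
# Hypothetical vocabulary -- real BPE vocabularies have ~50k-100k entries.
vocab = {"viol": 101, "et": 102, "tu": 103, "lip": 104}

def tokenize(word):
    """Greedy longest-prefix-match tokenization over the toy vocab."""
    tokens = []
    while word:
        for length in range(len(word), 0, -1):
            piece = word[:length]
            if piece in vocab:
                tokens.append(piece)
                word = word[length:]
                break
        else:
            raise ValueError(f"no token for: {word!r}")
    return tokens

pieces = tokenize("violet")
print(pieces)                      # ['viol', 'et']
print([vocab[p] for p in pieces])  # [101, 102] -- all the model actually sees
print(len("violet"))               # 6 letters, not recoverable from the IDs
```

From the ID sequence [101, 102] alone there is no way to compute "6" without memorized per-token letter counts, which is the comment's point.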

      This is also why the GPT family sucks at addition: its tokenizer has single symbols for common numbers like 14. That means that to compute 14 + 1 it cannot reuse the knowledge that 4 + 1 = 5, because it cannot see the link between the token "4" and the token "14". The Llama tokenizer fixes this by splitting numbers into individual digits, and is thus much better at basic arithmetic even with much smaller models.
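The number-tokenization difference can be sketched with two made-up vocabularies (neither is the real GPT or Llama vocabulary): one where "14" is a single opaque token, and one where every digit is its own token.

```python
# Hypothetical vocabularies for illustration only.
gpt_style_vocab = {"14", "1", "4", "+", " "}   # "14" is one opaque token
llama_style_vocab = {"1", "4", "+", " "}       # digits are always split

def tokenize(text, vocab):
    """Greedy longest-prefix match over the given vocabulary."""
    tokens = []
    while text:
        for length in range(len(text), 0, -1):
            if text[:length] in vocab:
                tokens.append(text[:length])
                text = text[length:]
                break
        else:
            raise ValueError(f"cannot tokenize: {text!r}")
    return tokens

print(tokenize("14 + 1", gpt_style_vocab))    # ['14', ' ', '+', ' ', '1']
print(tokenize("14 + 1", llama_style_vocab))  # ['1', '4', ' ', '+', ' ', '1']
```

In the first tokenization, the token "14" shares nothing with the tokens "1" and "4", so digit-level addition facts do not transfer; in the second, the digit structure is explicit in the input.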