It doesn’t search the internet for cats; it is pre-trained on a large set of labelled images and learns how to predict images from labels. The fact that there are lots of cats (most of which have tails) and not many examples of things “with no tail” is pretty much why it doesn’t work, though.
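To make the imbalance point concrete, here’s a toy sketch (made-up counts, no real dataset or model): if the training captions almost never say “no tail,” the learned prior for “cat” overwhelmingly includes a tail, so a prompt asking for a tailless cat is fighting the statistics the model was trained on.

```python
# Toy illustration of training-data imbalance (invented numbers, purely illustrative).
# A label-conditioned model can only reproduce what its labelled examples show it.
training_captions = (
    ["a cat"] * 9_000                    # typical cat photos: tail visible or implied
    + ["a cat with a fluffy tail"] * 950
    + ["a cat with no tail"] * 50        # tailless cats are rare in photo datasets
)

cat_examples = [c for c in training_captions if "cat" in c]
no_tail_examples = [c for c in cat_examples if "no tail" in c]

# The conditional frequency the model effectively learns for "cat":
p_no_tail_given_cat = len(no_tail_examples) / len(cat_examples)
print(f"P(caption mentions 'no tail' | cat) ≈ {p_no_tail_given_cat:.3f}")  # ≈ 0.005

# Prompting "a cat with no tail" asks for something seen ~0.5% of the time,
# so the learned prior (tail present) usually wins at generation time.
```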
Unrelated to the convo but for those who’d like a visual on how LLMs work: https://bbycroft.net/llm
And where did it happen to find all those pictures of cats?
It’s not the “where” specifically I’m correcting; it’s the “when.” The model is trained, then the query is run against the trained model. The query doesn’t involve any kind of internet search.
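Rough sketch of the “train once, query later” lifecycle, if it helps (a scikit-learn toy on its built-in digits dataset, not how any production image or text model is actually built): the dataset is only touched during training; the later query just runs against the saved weights, with no network call involved.

```python
# Toy "train once, query later" sketch using scikit-learn (illustrative only;
# real image/text models are vastly bigger, but the lifecycle is the same).
import pickle

from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

# --- Training time: the only phase that ever touches the dataset ---
X, y = load_digits(return_X_y=True)
model = LogisticRegression(max_iter=2000)
model.fit(X, y)

with open("model.pkl", "wb") as f:
    pickle.dump(model, f)                # weights frozen to disk

# --- Query time: can run offline, days later, on another machine ---
with open("model.pkl", "rb") as f:
    trained = pickle.load(f)

prediction = trained.predict(X[:1])      # no dataset lookup, no internet search
print(prediction)
```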
And I care about “how” it works and “what” data it uses because I don’t have to walk on eggshells to preserve the sanctity of autocomplete software.
You need to curb your pathetic ego and really think hard about whether feeding the open internet to an ML program with an LLM slapped onto it is actually any more useful than the sum of its parts.
Dawg you’re unhinged