That’s how it works. It has a fairly profound psychological effect on people: they can easily be convinced that it’s beneficial, when in reality we have no solid evidence of that being the case. On the contrary, we have a growing body of evidence of significant negative effects, like decreased productivity, cognitive decline, widespread social problems arising from its use, and more.
As to your point, these models haven’t actually changed in any fundamental way since the original transformer paper was published in 2017. The only things that have changed are the parameter counts and the datasets (that is, vacuuming up and stealing all content on the internet). But the model does the same thing it has always done: generate the next token by taking the token with the greatest probability from a distribution conditioned on the tokens it has seen before. Note that this is simply a maximum, so even if every token in the distribution has a low probability, it will still take the max, resulting in hallucinations, fabrications, illogical conclusions, and so forth. That has not changed, and quite simply cannot change. You would need a fundamentally different technology for that, and quite frankly, one that exists purely in the realm of science fiction.
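To make that concrete, here’s a rough sketch in Python of what greedy next-token selection looks like. The vocabulary and scores below are entirely made up for illustration (this isn’t any particular library’s API), and real systems often sample with temperature or top-k rather than taking a strict maximum, but the point stands: an argmax always returns *some* token, even when nothing in the distribution is a confident fit.

```python
import math

def softmax(logits):
    """Turn raw scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores a model might produce for four candidate tokens.
vocab = ["cat", "dog", "xylophone", "the"]
logits = [0.20, 0.10, 0.05, 0.15]  # all weak; no clear winner

probs = softmax(logits)

# Greedy decoding: always pick the highest-probability token,
# even if that probability is barely above the others.
best = max(range(len(vocab)), key=lambda i: probs[i])
print(vocab[best], round(probs[best], 3))  # a token is emitted regardless
```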