Interesting discussion on HN.

  • @DoubleEndedIterator · 12 points · 1 year ago

    I was talking to a friend of my mother's and she was genuinely shocked when I told her that ChatGPT sometimes just makes things up. I was a little taken aback by how deeply she thanked me for telling her that she needs to double-check anything ChatGPT suggests. She had been completely believing the AI output.

    A friend of mine sent me some code and asked me if I could help him get it working. Turns out he doesn't know any programming and just had ChatGPT generate it for him. He wanted it to do something for which there was no Python library available, so ChatGPT just hallucinated the API. It took me a while to explain to him that what he was trying to do was simply not going to work, but I was able to point him to an existing software solution.

    So yeah, I think average people are not aware of this, and many of the people who are aware overestimate the technology.

    • @kraegar · 4 points · 1 year ago

      This has been my experience too. A junior dev at my last company kept trying to use ChatGPT to generate docker-compose files and wondered why they generally didn't work.

      My research has been on time series forecasting, which is tangentially related to NLP. People are shocked when I point out to them that all these models do is predict the next token. Weather forecasting has been a good analogy for why long AI-generated texts are extra bad: weather forecasts get worse as the horizon increases.

      Despite all my gripes about LLMs, I must say that Copilot has saved me from writing TONS of boilerplate code and unit tests.

    • @kraegar · 1 point · 1 year ago

      deleted by creator
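
To make @kraegar's "predict the next token" point concrete, here is a minimal toy sketch of my own (nothing from the thread, and a bigram lookup table standing in for a real model): generation is just a loop where each sampled token is appended to the context and fed back in. One unlucky sample changes the conditioning for every later step, which is the same reason long generations drift the way weather forecasts degrade with horizon.

```python
import random

random.seed(0)

# Toy stand-in for a language model: for each token, a distribution over
# possible next tokens. A real LLM conditions on the whole context, not
# just the previous token, but the feedback loop is the same.
BIGRAMS = {
    "the":    [("cat", 0.6), ("dog", 0.3), ("ate", 0.1)],
    "cat":    [("sat", 0.7), ("ate", 0.3)],
    "dog":    [("barked", 0.8), ("sat", 0.2)],
    "sat":    [("down", 0.9), ("the", 0.1)],
    "ate":    [("the", 1.0)],
    "barked": [("the", 1.0)],
    "down":   [("the", 1.0)],
}

def next_token(context):
    """Sample the next token given the context (here only the last token matters)."""
    tokens, weights = zip(*BIGRAMS[context[-1]])
    return random.choices(tokens, weights=weights, k=1)[0]

def generate(prompt, n_steps):
    out = list(prompt)
    for _ in range(n_steps):
        # The sample is appended to the context, so an unlikely early pick
        # shapes the distribution at every later step -- errors compound.
        out.append(next_token(out))
    return out

print(" ".join(generate(["the"], 12)))
```

Run it a few times with different seeds and the outputs stay locally plausible while wandering further from anything you'd plan end to end, which is roughly the forecasting-horizon analogy in miniature.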