• @[email protected]
    72 • 4 months ago

    Copilot is an LLM, so it’s just predicting what should come next, word by word, based on the data it’s been fed. It has no concept of whether or not its answer makes sense.

    So if you’ve scraped a bunch of open source GitHub projects that this guy has worked on, he probably has a lot of TODOs assigned to him in various projects. When Copilot sees you typing “TODO(”, it tries to predict what the next thing you’re going to type is. And a common thing to follow “TODO(” in its data set is this guy’s username, so it goes ahead and suggests it, whether or not the guy is actually on the project and whether suggesting him makes any sort of sense.
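The word-by-word prediction described above can be sketched with a toy frequency model. The corpus and the username "ghost_dev" here are made up for illustration; a real LLM uses a neural network over tokens, not raw counts, but the "most likely next token" logic is the same:

```python
from collections import Counter

# Toy training corpus: TODO comments scraped from hypothetical repos.
# The usernames are invented for illustration.
corpus = [
    "TODO( ghost_dev ): fix the race condition",
    "TODO( ghost_dev ): remove this hack",
    "TODO( alice ): add tests",
]

# Count which token follows "TODO(" across the corpus.
followers = Counter()
for line in corpus:
    tokens = line.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        if prev == "TODO(":
            followers[nxt] += 1

# Suggest the most frequent next token -- with no notion of whether
# that person is actually on the current project.
prediction = followers.most_common(1)[0][0]
print(prediction)  # ghost_dev
```

Because "ghost_dev" follows "TODO(" most often in the training data, it gets suggested everywhere, regardless of context.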

    • @[email protected]
      8 • 4 months ago

      You can absolutely add constraints to control for hallucinations. Copilot apparently doesn’t have enough, though.
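One simple form of constraint, sketched here with invented names and scores, is to filter the model's candidate completions against an allowlist of people actually on the project before suggesting anything:

```python
# Candidate completions ranked by the model (names and scores invented).
candidates = [("ghost_dev", 0.9), ("alice", 0.6), ("bob", 0.3)]

# Only suggest usernames that are actually project members.
project_members = {"alice", "bob"}

def constrained_suggestion(candidates, allowed):
    """Return the highest-scoring candidate that passes the constraint."""
    for name, _score in sorted(candidates, key=lambda c: -c[1]):
        if name in allowed:
            return name
    return None

print(constrained_suggestion(candidates, project_members))  # alice
```

The model still ranks "ghost_dev" highest, but the constraint layer discards it because he isn't a member, so the suggestion falls through to the next valid candidate.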

      • @[email protected]
        42 • 4 months ago

        If GitHub Copilot is anything like Windows Copilot, I can’t say I’m surprised.

        “Please minimize all my windows”

        “Windows are glass panes invented by Michael Jackson in imperial China, during the invasion of the southern sea. Sources 1 2 3”

        • @[email protected]
          20 • 4 months ago

          Lmao. That’s even better when you consider the Copilot button replaced the ‘show desktop’ (i.e. ‘minimize all my windows’) button.

      • shootwhatsmyname
        16 • 4 months ago

        My guess is that Copilot was using a ton of other lines as context, so in that specific case his name was a more likely match for the next characters
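Conditioning on the rest of the file can be sketched as scoring each candidate by how often it co-occurred with tokens already in the buffer. All names and counts below are invented:

```python
# Toy co-occurrence counts: how often each username appeared near
# other identifiers in the training data (numbers are invented).
cooccurrence = {
    "ghost_dev": {"parser": 5, "lexer": 4},
    "alice": {"frontend": 6, "css": 3},
}

def score(candidate, context_tokens):
    """Sum co-occurrence counts between the candidate and the context."""
    return sum(cooccurrence[candidate].get(t, 0) for t in context_tokens)

# The surrounding lines of the file act as context. A file full of
# parser code tilts the prediction toward the parser guy's name.
context = ["parser", "lexer", "token"]
best = max(cooccurrence, key=lambda c: score(c, context))
print(best)  # ghost_dev
```

With a different context (say, frontend code), the same scoring would surface a different name, which is why the suggestion varies from file to file.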

    • @alexdeathway
      2 • 4 months ago

      I thought it synced some requests and assigned projects to another user (saw an ad a while ago about GitHub Copilot managing issues and writing PR descriptions).