Copilot purposely stops working on code that contains hardcoded banned words from GitHub, such as "gender" or "sex". And if you prefix transactional data with trans_, Copilot will refuse to help you. 😑
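For illustration, here's the kind of perfectly ordinary transactional-data code that reportedly trips the substring filter - the field names and values are hypothetical, the point is just that trans_ is a natural prefix for this domain:

```python
# Hypothetical transaction records; the trans_ prefix on these field
# names is the sort of thing that reportedly makes Copilot bail.
trans_records = [
    {"trans_id": 1001, "trans_amount": 49.95, "trans_currency": "USD"},
    {"trans_id": 1002, "trans_amount": 12.50, "trans_currency": "EUR"},
]

# Total the amounts per currency -- exactly the boilerplate you would
# normally expect autocomplete to finish for you.
def total_by_currency(records):
    totals = {}
    for r in records:
        cur = r["trans_currency"]
        totals[cur] = totals.get(cur, 0.0) + r["trans_amount"]
    return totals

print(total_by_currency(trans_records))
```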

    • Optional@lemmy.world · 3 days ago

      I believe the difference there is between Copilot Chat and Copilot Autocomplete - the former is the one where you can choose between different models, while the latter is the one that fails on gender topics. Here's me coaxing the autocomplete into writing a PowerShell script involving gender, and it failing.

      Oh, I see - I thought it was Copilot Chat. Thanks.