Copilot purposely stops working on code that contains words from GitHub's hardcoded blocklist, such as "gender" or "sex". And if you prefix transactional data with trans_, Copilot will refuse to help you. 😑
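
For illustration, here's a minimal sketch of the kind of code that claim describes (the identifiers are made up for the example, not taken from any documented filter list):

```powershell
# Hypothetical example of the claim above: autocomplete reportedly stalls
# once a flagged word or the trans_ prefix shows up in the buffer.
$trans_amount = 42.50            # transactional data with a "trans_" prefix

$customer = @{
    Name   = "Alice"
    Gender = "female"            # "gender" is one of the reportedly blocked words
}

Write-Output "Charged $($customer.Name) $trans_amount"
```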

  • Optional

    This doesn’t appear to be true (anymore?).

      • Optional

        I believe the difference there is between Copilot Chat and Copilot Autocomplete - the former is the one where you can choose between different models, while the latter is the one that fails on gender topics. Here's me coaxing the autocomplete to try to write a PowerShell script with gender, and it failing.
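
        A minimal sketch of the kind of stub involved (the names here are hypothetical, not taken from the original screenshot):

        ```powershell
        # Hypothetical stub: autocomplete reportedly stops offering
        # suggestions once an identifier like $Gender is in the buffer.
        param(
            [string]$Name,
            [string]$Gender
        )

        Write-Output "User: $Name, gender: $Gender"
        ```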

        Oh I see - I thought it was Copilot Chat. Thanks.