Copilot purposely stops working on code that contains words hardcoded as banned by GitHub, such as "gender" or "sex". And if you prefix transactional data with trans_, Copilot will refuse to help you. 😑

  • @[email protected]
    563 days ago

    So I loaded copilot, and asked it to write a PowerShell script to sort a CSV of contact information by gender, and it complied happily.
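    For reference, a minimal Python sketch of the kind of script being asked for here (the column names "name" and "gender" are hypothetical, and the test works against it as Python rather than PowerShell):

    ```python
    # Sort a list of contact rows (e.g. parsed from a CSV with csv.DictReader)
    # by a hypothetical "gender" column.
    def sort_contacts_by_gender(rows):
        return sorted(rows, key=lambda row: row["gender"])

    contacts = [
        {"name": "Bob", "gender": "male"},
        {"name": "Casey", "gender": "nonbinary"},
        {"name": "Alice", "gender": "female"},
    ]

    for row in sort_contacts_by_gender(contacts):
        print(row["name"], row["gender"])  # Alice, then Bob, then Casey
    ```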

    And then I asked it to modify that script to display trans people in bold, and it did.

    And I asked it “My daughter believes she may be a trans man. How can I best support her?” and it answered with 5 paragraphs. I won’t paste the whole thing, but a few of the headings were “Educate Yourself” “Be Supportive” “Show Love and Acceptance”.

    I told it my pronouns and it thanked me for letting it know and promised to use them.

    I’m not really seeing a problem here. What am I missing?

    • @[email protected]
      343 days ago

      I wrote a slur detection script for Lemmy, and Copilot refused to run unless I removed the "common slurs" list from the file. There are definitely keywords or contexts that will shut down the service. It could even be regionally dependent.
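      A keyword-list detector of the kind described might look like the sketch below; the placeholder words "badword1" and "badword2" stand in for the actual slur list, which is presumably what tripped Copilot's filter.

      ```python
      import re

      # Hypothetical stand-ins for the "common slurs" list from the comment above.
      SLURS = ["badword1", "badword2"]

      # Build one case-insensitive pattern that matches any listed word
      # on word boundaries, escaping each entry so it is treated literally.
      PATTERN = re.compile(
          r"\b(" + "|".join(map(re.escape, SLURS)) + r")\b",
          re.IGNORECASE,
      )

      def contains_slur(text):
          """Return True if the text contains any word from the list."""
          return PATTERN.search(text) is not None
      ```

      A real moderation filter would need more than exact word matching (obfuscated spellings, substrings, allowlists), but this is the basic shape of a keyword-driven check.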

      • @[email protected]
        53 days ago

        I’d expect it to censor slurs. The linked bug report seems to be about autocomplete, but many in the comments seem to have interpreted it as Copilot refusing to discuss gender or words starting with trans*. There are even people in here giving supposed examples of that. This whole thing is very confusing. I’m not sure what I’m supposed to be up in arms about.