• Celediel
      23 • 11 months ago

      People can be concerned about more than one thing.

        • knightly the Sneptaur
          9 • 11 months ago

          So, what? You think women need their own LLMs or something?

          You go ahead and get started on that, the rest of us can work on making the existing ones less sexist.

            • knightly the Sneptaur
              13 • 11 months ago

              They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.

                • @[email protected]
                  English • 17 • 11 months ago (edited)

                  Programmers don’t program sexism into machine learning models. What happens is that people who may or may not be programmers provide them with biased training data, because getting unbiased data is really, really hard.

                  • @drislands
                    4 • 11 months ago

                    They’re nuts. Easy block, IMO.

        • @[email protected]
          2 • 11 months ago

          Why?

          Doesn’t it make sense to address these issues now rather than letting them fester?

            • @hikaru755
              1 • 11 months ago

              “inanimate objects”? Where are you getting that from? The article doesn’t state explicitly what the test sentences were, but I highly doubt that LLMs have trouble grammatically gendering inanimate objects correctly, since their gender usually doesn’t vary depending on anything other than the base noun used. I’m pretty sure this is about gendering people.