• @DevCat
      18 points · 10 months ago

      Usually known as GIGO - Garbage in, Garbage out.

    • Ooops
      9 points · 9 months ago

      Would be really cool if people finally understood the basic principles of what they call AI, and realized that this is not a criticism of AI but of society’s output that was sourced for training.

      • @mods_are_assholes
        2 points · 9 months ago

        Yeah, that’ll never happen. There are people threatening suicide if their AI girlfriends get wiped.

        Most of humanity has no fucking clue and just thinks it’s a genie in a box.

  • @[email protected]
    -7 points · 9 months ago

    It exists even on DeepL and Google Translate. You put in “I am a student” in English and you get out “I am a male student” in German. No pop-up asking which you mean, because women don’t matter.

    • Turun
      15 points · 9 months ago

      The male form of “student” in German is generic and includes both male and female students.

        • @FooBarrington
          5 points · 9 months ago

          According to grammatical rules, it can be. That’s not saying these rules are good, but that is what they are right now.

        • Turun
          2 points · 9 months ago (edited)

          Yes, it can be. In the DDR it was common for women in the workforce to be called “Ingenieur” or whatever. They do the same thing as the guys, so why would they need a new job description?
          Words are what we make them. But they are a woefully inadequate method of communication, because the listener may take them to mean something different from what the speaker intended. So this may very well be a woman speaking, even though you don’t believe it.

          The thing is, if you operate under the assumption that the generic male form is not generic, you will see miscommunication all over the place, even where none was intended.

          Btw, what’s your preferred form of gendering?

          Edit: I realize we may be talking about completely different things. The generic male form only applies to the plural. I still stand by my comment when talking about groups of students, but I realize now that that may have been completely orthogonal to the initial comment. You are correct when complaining about translation software incorrectly assuming the gender of a single individual.

          • @hikaru755
            2 points · 9 months ago

            Even in the plural, you’re not really correct here. Yes, the plural “Studenten” has been and is still often used to include both genders, but it’s still the male form, and nobody would consider using the female plural equivalent, “Studentinnen”, in the same way to include both genders. The easy way out in this case, of course, is the participle “Studierende”. That one is truly free of gender connotations in the plural form.

    • @[email protected]
      7 points · 9 months ago

      That’s not as easy as it sounds, because machine translation systems don’t typically understand the grammar or vocabulary of the languages they work with. All such a system “knows” is that in the examples of English and German texts that say the same thing, the word “student” corresponds to „Student“ more often than it does to „Studentin“. Depending on the word you’re translating, you’ll also see the opposite problem, where a word like “nurse” is translated as feminine because that’s what occurs more often in the training data, and the training data comes from a particular culture with its own gender biases. It’s not an indication that anyone thinks women don’t matter, or that men don’t matter, just that the technology has some pretty serious shortcomings.
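
      A rough toy sketch of what “corresponds more often” means in practice (made-up counts and a lookup table, not how any real engine is built):

      ```python
      from collections import Counter

      # Toy "parallel corpus" statistics: how often each German word was aligned
      # with the English word in training examples. Counts are invented for illustration.
      observed_translations = {
          "student": Counter({"Student": 830, "Studentin": 170}),
          "nurse": Counter({"Krankenschwester": 740, "Krankenpfleger": 260}),
      }

      def translate_word(english_word: str) -> str:
          """Pick whichever translation was seen most often in the training data."""
          return observed_translations[english_word].most_common(1)[0][0]

      print(translate_word("student"))  # Student (masculine form wins on raw frequency)
      print(translate_word("nurse"))    # Krankenschwester (feminine form wins on raw frequency)
      ```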

      Looking at the Google Translate app for Android, I see it lists both German words as translations of “student” by itself, because someone has already done the work to handle simple cases correctly. It gets tricky when you try to translate whole phrases or sentences, because to do it correctly in general you have to make sure articles, adjectives, pronouns, etc. are changed to match the gender of the noun you’d prefer to be translated with a particular gender. When you’re translating a sentence with nouns of multiple genders, it’s likely to get confused and change the genders of the wrong words. The longer the text you’re trying to translate, the worse the problems get.
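
      A minimal sketch of why whole sentences are harder than single words: every gendered token has to be switched together, in agreement (hypothetical toy lexicon, not real morphology code):

      ```python
      # Toy lexicon: lemma -> (masculine form, feminine form). Entries invented for illustration.
      GENDERED_FORMS = {
          "Student": ("Student", "Studentin"),
          "ein": ("ein", "eine"),
          "guter": ("guter", "gute"),
      }

      def render(tokens: list[str], gender: str) -> str:
          """Force every gendered token in the translated sentence to one gender."""
          index = 0 if gender == "m" else 1
          return " ".join(GENDERED_FORMS.get(tok, (tok, tok))[index] for tok in tokens)

      tokens = ["Ich", "bin", "ein", "guter", "Student"]  # "I am a good student"
      print(render(tokens, "m"))  # Ich bin ein guter Student
      print(render(tokens, "f"))  # Ich bin eine gute Studentin
      ```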

      I’m sure getting gender right most of the time can be done. Having worked with machine learning systems, natural language processing systems, and the kind of people who build them, I’m certain the developers are aware of the shortcomings and would love to fix them. My intuition is that getting gender right most of the time for even one language is probably as complex as the whole existing translation system, and that work would need to be repeated for every language that’s supported. It’s the kind of feature I might expect to see in a high-end translation system, maybe one designed as an assistant for human translators, but not in one that’s offered for free. In other words, the real problem is that companies offering free machine translation don’t have any financial incentive to spend the money that would be needed, and companies that do spend the money don’t have an incentive to share their technology.

      Maybe LLMs will offer a cheaper path to better translations because they’re basically a much more sophisticated version of the same technology. OTOH a lot of people seem to hate LLMs with a fiery passion, so I don’t know if using an LLM would actually make people happy even if it works a lot better.

      I’d also like to add that the same kind of problem occurs with grammatical concepts other than gender. If you translate Chinese into English, you’ll mostly get singular nouns and present-tense verbs for exactly the same reasons you get more masculine words in German. Machine translation is just generally crappy when one language lacks information that’s mandatory in another language. The only thing that’s different about gender is that people are more apt to be offended when the system fills in missing information with naive assumptions.
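
      The gap-filling itself is conceptually just defaults. A toy illustration (hypothetical feature names and defaults, not any real engine’s behaviour):

      ```python
      # If the source language doesn't mark a feature the target language requires,
      # something has to be assumed. Naive defaults quietly fill the blanks.
      DEFAULTS = {"number": "singular", "tense": "present", "gender": "masculine"}

      def fill_missing(source_features: dict) -> dict:
          """Keep whatever the source actually marked; fall back to defaults otherwise."""
          return {key: source_features.get(key, default) for key, default in DEFAULTS.items()}

      print(fill_missing({}))                 # everything defaulted: singular, present, masculine
      print(fill_missing({"tense": "past"}))  # tense kept, the rest still defaulted
      ```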

      • @[email protected]
        -5 points · 9 months ago

        I think having a checkbox at the top of the screen where you could choose your gender would not be so hard to implement. Or an alternate translation could be provided, like DeepL already does with formal and informal address.
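
        Something like DeepL’s formal/informal alternatives, but for gender. A purely hypothetical sketch of the idea (made-up function name and canned outputs, not DeepL’s or Google’s actual API):

        ```python
        # Hypothetical: return every gendered reading and let the user pick,
        # the way DeepL already offers a formal and an informal alternative.
        def translate_with_gender_options(text: str) -> dict[str, str]:
            canned = {  # stand-in for a real translation model
                "I am a student": {
                    "masculine": "Ich bin Student",
                    "feminine": "Ich bin Studentin",
                },
            }
            return canned.get(text, {"default": text})

        for label, variant in translate_with_gender_options("I am a student").items():
            print(f"{label}: {variant}")
        ```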

        • @FooBarrington
          2 points · 9 months ago

          And what if I have a sentence with multiple words with unclear genders? Should they have a dropdown with every possible combination? If we have a sentence with 8 unclear words we’d have a list like:

          • “Männlich, männlich, männlich, männlich, männlich, männlich, männlich, männlich”
          • “Männlich, männlich, männlich, männlich, männlich, männlich, männlich, weiblich”
          • “Männlich, männlich, männlich, männlich, männlich, männlich, weiblich, männlich”

          with 256 entries. For 16 unclear words you’d have 65,536 list entries.

          Yeah, “that wouldn’t be so hard to implement”, because it’s the worst way of approaching the problem and completely unusable.
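
          The blow-up is just 2^n over the ambiguous words; a quick check of the numbers above:

          ```python
          from itertools import product

          # Every additional ambiguous word doubles the number of complete
          # gender assignments such a dropdown would have to list.
          for n_unclear_words in (8, 16):
              entries = list(product(("männlich", "weiblich"), repeat=n_unclear_words))
              print(f"{n_unclear_words} unclear words -> {len(entries)} dropdown entries")
          # 8 unclear words -> 256 dropdown entries
          # 16 unclear words -> 65536 dropdown entries
          ```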

          • Turun
            2 points · 9 months ago

            Oh come on, you can argue against their point without being completely brain dead.

            Do you need 256 memory addresses to store a single byte? Obviously not; eight bits are enough. So you can just make a three-way checkbox for every character in the translation.
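
            I.e. one independent selector per ambiguous word, so the UI grows linearly instead of exponentially (hypothetical sketch):

            ```python
            from dataclasses import dataclass

            # One three-way choice per ambiguous word: n words need n selectors,
            # not 2**n dropdown entries.
            @dataclass
            class GenderChoice:
                word: str
                gender: str = "unspecified"  # "masculine" | "feminine" | "unspecified"

            def build_selectors(ambiguous_words: list[str]) -> list[GenderChoice]:
                return [GenderChoice(word) for word in ambiguous_words]

            selectors = build_selectors(["Student", "Lehrer", "Arzt"])
            selectors[1].gender = "feminine"  # user flips one selector; nothing else changes
            print(len(selectors), "selectors for", len(selectors), "ambiguous words")
            ```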

    • Celediel
      23 points · 9 months ago

      People can be concerned about more than one thing.

        • knightly the Sneptaur
          9 points · 9 months ago

          So, what? You think women need their own LLMs or something?

          You go ahead and get started on that, the rest of us can work on making the existing ones less sexist.

            • knightly the Sneptaur
              13 points · 9 months ago

              They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.

                • @[email protected]
                  17 points · 9 months ago (edited)

                  Programmers don’t program sexism into machine learning models. What happens is that people who may or may not be programmers provide them with biased training data, because getting unbiased data is really, really hard.

        • @[email protected]
          2 points · 9 months ago

          Why?

          Doesn’t it make sense to fix and address these issues now, rather than waiting for them to fester?

            • @hikaru755
              1 point · 9 months ago

              “inanimate objects”? Where are you getting that from? The article doesn’t state explicitly what the test sentences were, but I highly doubt that LLMs have trouble grammatically gendering inanimate objects correctly, since their gender usually doesn’t vary depending on anything other than the base noun used. I’m pretty sure this is about gendering people.