• @Red_October
    17
    1 day ago

    Even when one of their own strongholds, Texas, was dying because the power grid failed because the weather was too cold, or when the power grid failed because the weather was too hot, the Right gave no shits about actually helping people. They absolutely aren’t about to discover some shred of empathy when they think it’s just “The Libs” who are burning.

  • @secundnature
    76
    2 days ago

    Pieces of shit acting like pieces of shit. News at 11!

    • @[email protected]
      14
      2 days ago

      Cue some video of Abbott or Desantis next to an ASL interpreter. Hasn’t this stuff been standard for decades?

    • @Archer
      12
      4 hours ago

      American Sign Language is now apparently un-American

  • 𝚐𝚕𝚘𝚠𝚒𝚎
    26
    edit-2
    2 days ago

    So he postulated: why not just have closed captioning, as before, instead of half the screen being ASL?

    I guess I would be curious to know why, actually. Is there a better reason?

    Edit: I read the quote, but curious if by emotional nuance they mean in the interpreter’s facial expressions? Wouldn’t they also gain this from the primary person speaking too?

    • @[email protected]
      40
      2 days ago

      You can’t read text and view a person’s expression at the same time, but you can watch signing and see the expression (sometimes because it’s part of the sign itself).

      For some Deaf people it’s also an issue that they aren’t as fluent in written English as they are in American Sign Language. Text isn’t going to help them if they can’t figure out what it means, or it’s going by too fast to comprehend.

    • @BrianTheeBiscuiteer
      26
      2 days ago

      My wife is an ASL interpreter, and illiteracy or reduced reading proficiency are common issues in the deaf community. ASL borrows from English but it’s not a 1-to-1 mapping of words to signs. Also, written language is usually based on spoken language, and since they can’t hear the language it’s a big disadvantage. Imagine learning written Chinese without ever hearing it spoken.

      • @[email protected]
        9
        2 days ago

        It’s hard to imagine how you would even begin to learn to read. You see text and you have to translate that directly to meaning without imagining the sound in your head? Witchcraft I say.

    • celeste
      23
      2 days ago

      I learned pretty recently that facial expressions are part of ASL and can change a sign’s meaning.

      https://www.yahoo.com/news/opinion-criticizing-sign-language-interpreters-195855277.html here’s an opinion article that gets into this current controversy more.

      But, on the facial expression question, these links are more about that:

      The past 30 years of linguistic research on sign languages have revealed that there are facial expressions which are used together with manual signs and function as phonological features, morphemes, and syntactic/prosodic markers, for example brow raising marking conditional clauses (Liddell, 1980; Dachkovsky and Sandler, 2009). These facial expressions are clearly communicative in nature and they are used in combination with other meaningful movements (those of the hands).

      from: https://pmc.ncbi.nlm.nih.gov/articles/PMC3593340/

      https://www.lifeprint.com/asl101/pages-layout/facialexpressions.htm

      Basically, facial expressions are grammar in ASL. There are specific meanings assigned to them, which is different than the more subtle nuance that would’ve been my first guess too, a while ago.

      • @AliasVortex
        English
        10
        2 days ago

        Correct. ASL is fascinating because of how visual it is and just how much you can convey by taking the same sign and moving it differently (for example, you can describe a rough flight by making the sign for airplane and then bouncing it up and down).

        I might also add that in addition to facial expressions forming grammar structures, body language (of which facial expressions are a part) also conveys tone/emphasis. For some concrete examples of how this provides context: the sign for thin becomes anorexic if you suck in your cheeks/stomach while you make it. Similarly, fat can become obese if you puff out your cheeks and slouch a bit while you make it. Or, on a more topical note, the sign for fire is made by wiggling your fingers in an upward motion in front of your chest, and the size of your sign sort of describes the size of the fire you’re talking about: small, slow movements might describe the dying embers of a campfire, while larger (pushing towards or out of the area you normally sign in), more frantic movements would be used to describe a miles-high inferno.

    • aramis87
      20
      2 days ago

      Because live captions are frequently shit, especially for live broadcasts of local or regional issues. They’re behind the images that are shown on the screen, so you get things like someone saying “you see here where the fire is coming from” and they switch to a second map just as the captions appear on the screen. The captioners are rushed and can’t always keep up and they make typos. Auto-generated captions lose words and nuance and sometimes just output pure gibberish.

      • @[email protected]
        1
        2 days ago

        That seems like a problem that would be well addressed by investing more resources into the live captioning process, though.

          • @[email protected]
            0
            2 days ago

            To be clear, I’m not necessarily against sign language interpreters.

            They do miss out on what I consider to be an important part of accessibility though - they are not an example of universal design. High-quality captions are an example of universal design, which gives them higher staying power (what right-winger would move to kill captions?) and makes it easier to justify investments.

            • @[email protected]
              English
              5
              1 day ago

              ASL is not English, though. It has its own grammar structure and words (obviously) so it is functionally a different language than written English. People who speak ASL as a first language are essentially learning a second language with written English, one that is based on spoken language they can’t hear. As a result, many in the Deaf community struggle to read and write. Add to that the stress of it being an emergency and having to process the text in real-time before it disappears? I just don’t see captions being the answer for this already vulnerable community, especially in life-or-death situations.

    • @Today
      3
      2 days ago

      Helps provide tone and emotion to the conversation - the things you get when listening to someone speak and the reason text messages can be so difficult sometimes.