• @[email protected]
    link
    fedilink
    21
    1 day ago

    Not against this feature, but this quote made me laugh:

    … once this is in place, people won’t have to scour the internet for sourcing subtitles to their favorite movies, shows, or even anime.

    As if MTL will get anywhere near the nuance of a properly made human translation.

    • @[email protected]
      link
      fedilink
      3
      edit-2
      15 hours ago

      Personally, I would be happy even if it didn’t translate it but were able to give some half decent transcription of, at least, English voice into English text. I prefer having subtitles, even when I speak the language, because it helps in noisy environments and/or when the characters mumble / have weird accents.

      However, even that would likely be difficult with a lightweight model. Even big companies like Google often struggle with their autogenerated subtitles. When there’s some very context-specific terminology, or uncommon names, it fumbles. And adding translation to an already incorrect transcript multiplies the nonsense, even if the translation were technically correct.

  • Fonzie!
    link
    fedilink
    23
    1 day ago

    Oh so that wasn’t a joke from their booth.

    This seems really out of place, but locally run auto subtitles from ethically sourced AI would be great.

    It’s just that there are two very big conditions in that sentence.

      • Fonzie!
        link
        fedilink
        5
        edit-2
        23 hours ago

        JetBrains’ AI code suggestions were only trained on code whose authors gave explicit permission for it, but that’s the only one I know off the top of my head. Most chat-oriented LLMs (ChatGPT, Claude, Gemini…) were almost certainly trained using corporate piracy.

      • @smayonak
        link
        14
        1 day ago

        There are a number of open weight open source models out there with all their data sourced from the public domain. Look up BLOOM and Falcon. There are others.

  • @[email protected]
    link
    fedilink
    English
    5
    24 hours ago

    It won’t be better than human-translated ones, but better than no subtitles. I don’t think even humans can make subtitles correctly without knowing the context.

    • @[email protected]
      link
      fedilink
      10
      23 hours ago

      Honestly, if it can generate subtitle files it’ll be a huge benefit to people creating subtitles. It’s way easier to start with bad subs and fix them than it is to write from scratch.
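That fix-up workflow is easy to bootstrap: most speech-to-text tools emit timestamped segments, and turning those into an editable SRT file only takes a few lines. A minimal sketch in Python (the segment data below is invented for illustration; plug in whatever transcriber you actually use):

```python
# Sketch: turn (start, end, text) segments from any speech-to-text
# tool into an SRT file a human can then correct.
# The example segments are made up, not real transcriber output.

def srt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(segments) -> str:
    """segments: iterable of (start_sec, end_sec, text) tuples."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, 1):
        blocks.append(
            f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"
        )
    return "\n".join(blocks)

segments = [
    (0.0, 2.5, "Hello there."),
    (2.5, 5.0, "This line probably needs fixing."),
]
print(to_srt(segments), end="")
```

Dump that to a `.srt` file next to the video and any player (VLC included) will pick it up, so correcting the bad lines is just text editing.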

  • @[email protected]
    link
    fedilink
    38
    2 days ago

    This is not a bad thing by default, if it’s something you only use when you decide to, when you don’t have other subtitles available, tbh. I hate AI slop too, but people just go into monkey-brain rage mode when they read “AI” and stop processing any further information.

    I’d still always prefer human-translated subtitles if possible. However, right now I’m looking into translating an entire book via LLM, because it would be the only way to read that book, as it is not published in any language I speak. I speak English well enough, so I don’t really need subtitles; I just like to have them on so I won’t miss anything.

    For English-language movies, I’d probably just watch them without subtitles if those were AI, as I don’t really need them; they’re more of a nice-to-have in case I miss something. For languages I don’t understand, it might be good, although I wager it will be quite bad for less common languages.

  • @[email protected]
    link
    fedilink
    50
    2 days ago

    It is probably good that the open-source community is exploring this; however, I’m not sure the technology is ready (or maybe ever will be), and it potentially undermines the labour-intensive activity of producing high-quality subtitling for accessibility.

    I use them quite a lot and I’ve noticed they really struggle with key things like regional/national dialects, subject-specific words, and situations where context would allow improvement (e.g. a word invented solely in the universe of the media). So it’s probably managing 95% accuracy, which is that danger zone where it’s good enough that no one checks it but bad enough that it can be really confusing if you are reliant on them. If we care about accessibility, we need to care about it being high quality.

  • @Evotech
    link
    45
    2 days ago

    If YouTube transcriptions are anything to go by, this won’t be great. But I’m optimistic.

    • @lefixxx
      link
      34
      2 days ago

      YouTube transcriptions are surprisingly abysmal considering what technology Google already has at hand.

      • @Matriks404
        link
        10
        2 days ago

        I find them pretty good for English spoken by native speakers. For anything else it’s horrible.

        • ffhein
          link
          1
          17 hours ago

          As long as they are talking about normal things and not playing D&D 😃

      • @[email protected]
        link
        fedilink
        2
        2 days ago

        I actually disagree.

        I’m consistently impressed whenever I have auto-subtitles turned on on Youtube.

        • @YourMomsTrashman
          link
          5
          edit-2
          1 day ago

          I’m not impressed by the subtitles themselves (they’re just ok) but rather by how accessible it is. Like it being an option rather than it being a “tool for creators” or limited to premium or something

          Or maybe YouTube has added so many dogshit features recently (like AI overviews, automatically adding info cards for anyone mentioned, and highlighting seemingly random words in comments to search them out of context) that it makes me appreciate these things more lol

    • Fonzie!
      link
      fedilink
      7
      1 day ago

      They’re helpful to my deaf ears; even when they’re wrong (50% of the words), they do give me a solid idea of what is being said, together with what the audio sounds like.

      With it, I get almost everything correct. Without it, I understand near to nothing.

      This only goes for English spoken by Americans and sometimes London Britons, sadly; nothing else gets detected nearly well enough, so I can’t enjoy YouTube in my native language (Dutch), but being able to consume English YouTube already helps a lot!

      • @Evotech
        link
        4
        1 day ago

        That is very true. It’s hard to find local subtitles for a lot of stuff. And the whole deaf angle :)

    • @[email protected]
      link
      fedilink
      3
      edit-2
      2 days ago

      I’ve been messing with more recent open-source AI subtitling models via Subtitle Editor, which has a nice GUI for it. Quality is much better these days, at least for English. It still makes mistakes, but they’re on the level of “I misheard what they said and had little context for the conversation” or “the speaker has an accent that makes it hard to understand what they’re saying”, which is way better than most YouTube auto-transcriptions I’ve seen.

  • metaStatic
    link
    fedilink
    15
    2 days ago

    I’ve seen some pretty piss-poor implementations on streaming apps, but if anyone can get it right, it’s VLC.

  • @[email protected]
    link
    fedilink
    English
    5
    1 day ago

    I’m curious: what makes what VLC is doing qualify as artificial intelligence instead of just an automated transcription plugin?

    Automated transcription software has been around for decades. I totally understand getting in on the AI hype train, but I guess I’m confused: is software from years past like Dragon NaturallySpeaking or Shazam also LLMs that predate OpenAI, or is how those services identify things different from how modern LLMs work?

    • セリャスト
      link
      fedilink
      2
      1 day ago

      LLMs are a very specific subset of generative AI. Not everything AI is an LLM; stuff like Shazam in particular is pretty traditional AI. It’s been around for a while already, and studied for even longer (even back in the 1960s there was already a field of study in this domain).

    • @[email protected]
      link
      fedilink
      8
      2 days ago

      What do you mean by active component? Is processing the audio being played back to add subtitles active?

      • Despotic Machine
        link
        fedilink
        -1
        edit-2
        2 days ago

        Is processing the audio being played back to add subtitles active?

        Not sure where you are confused. If any part of this feature is active by default I will disable it.

        • @[email protected]
          link
          fedilink
          2
          1 day ago

          The way you wrote this, I thought you meant that if it required a cloud service you would turn it off. But now I think you’re just saying you wouldn’t use this feature.

          I share the confusion over your definition of “active”. You got all defensive when someone asked, so now no one really knows what you meant.

      • z3rOR0ne
        link
        fedilink
        4
        edit-2
        2 days ago

        It’s a command-line multimedia player. Its implementation is ideal for minimalists, and easily understood by reading the man pages.

        It works very well imo.

  • @coolmojo
    link
    3
    2 days ago

    What would be actually cool is if it could translate foreign movies based on the audio and add English subtitles to them.

  • Juntti
    link
    fedilink
    English
    3
    2 days ago

    I wonder how good it is.

    Does it translate from audio or from text?

    Does it translate multiple languages? If a video has languages a, b, and c, does it translate them all to x?

    Does the user need to set the input language?