cross-posted from: https://lemmy.ca/post/37011397

[email protected]

The popular open-source VLC video player was demonstrated on the floor of CES 2025 with automatic AI subtitling and translation, generated locally and offline in real time. Parent organization VideoLAN shared a video on Tuesday in which president Jean-Baptiste Kempf shows off the new feature, which uses open-source AI models to generate subtitles for videos in several languages.
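The pipeline described — transcribe audio locally, then emit timed subtitle cues — can be sketched in miniature. VideoLAN hasn't published its exact implementation here, so the segment data below is a hand-made stand-in for what a local speech-to-text model would return; only the SRT serialization step is shown:

```python
def srt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def segments_to_srt(segments) -> str:
    """Turn (start, end, text) triples from a speech model into SRT cues."""
    cues = []
    for i, (start, end, text) in enumerate(segments, start=1):
        cues.append(f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text.strip()}\n")
    return "\n".join(cues)

# Stand-in segments, shaped like a local recognizer's output.
segments = [(0.0, 2.5, "Hello, and welcome."), (2.5, 5.0, "Today we look at VLC.")]
print(segments_to_srt(segments))
```

In a real-time player the cues would be displayed as they are produced rather than written to a file, but the timestamp math is the same.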

  • snooggums
    link
    English
    42
    10 hours ago

    Yup, and if it isn’t perfect that is ok as long as it is close enough.

    Like getting name spellings wrong or mixing homophones is fine because it isn’t trying to be factually accurate.

    • @[email protected]
      link
      fedilink
      English
      6
      4 hours ago

      I’d like to see this fix the most annoying part about subtitles: timing. Find a transcript or any subs on the Internet and have the AI align them with the audio properly.
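The alignment idea above can be approximated without a full forced-alignment model: if the transcription gives you the true first and last speech times, a shift-and-stretch correction snaps existing cues onto them. A minimal sketch — the function name and the linear-correction approach are illustrative, not anything VLC ships:

```python
def retime(cues, first_speech: float, last_speech: float):
    """Linearly re-time subtitle cues so the first cue starts at first_speech
    and the last cue ends at last_speech (a shift + stretch correction).

    cues: list of (start, end, text) tuples, times in seconds.
    """
    old_first = cues[0][0]
    old_last = cues[-1][1]
    scale = (last_speech - first_speech) / (old_last - old_first)
    return [
        (first_speech + (start - old_first) * scale,
         first_speech + (end - old_first) * scale,
         text)
        for start, end, text in cues
    ]
```

This fixes the common constant-offset and frame-rate-mismatch cases; cues that drift non-linearly would still need per-cue alignment against the transcription's word timestamps.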

    • TJA!
      link
      fedilink
      English
      18
      9 hours ago

      Problem is that now people will say that they don’t have to create accurate subtitles because VLC is doing the job for them.

      Accessibility might suffer from that, because all subtitles are now just “good enough”

      • snooggums
        link
        English
        6
        4 hours ago

        Regular old live broadcast closed captioning is pretty much ‘good enough’, and that is the standard I’m comparing it to.

        Actual subtitles created ahead of time should be perfect because they have the time to double check.

      • @[email protected]
        link
        fedilink
        English
        5
        8 hours ago

        Honestly though? If your audio is even half decent you’ll get like 95% accuracy. Considering a lot of media just wouldn’t have anything, that is a pretty fair trade-off to me.

        • @[email protected]
          link
          fedilink
          English
          2
          edit-2
          4 hours ago

          From experience, AI translation is still garbage, especially for languages like Chinese, Japanese, and Korean, but if it only subtitles in the original language, such as creating English subtitles for English audio, then it is probably fine.

          • @[email protected]
            link
            fedilink
            English
            1
            1 hour ago

            That’s probably more due to lack of training than anything else. Existing models are mostly made by American companies and trained on English-language material. Naturally, the further a language is from that training data, the worse the result.

            • @[email protected]
              link
              fedilink
              English
              1
              3 minutes ago

              It is not just the lack of training material that is the issue; it doesn’t understand context and cultural references. Someone commented here that Crunchyroll’s AI subtitles translated Asura Hall, a name, as “asshole”.

      • @[email protected]
        link
        fedilink
        English
        6
        9 hours ago

        I have a feeling that if you care enough about subtitles you’re going to look for good ones, instead of using “ok” AI subs.

      • @shyguyblue
        link
        English
        2
        edit-2
        3 hours ago

        I imagine it would be not-exactly-simple-but-not-complicated to add a “threshold” feature. If the AI is less than X% certain, it can request human clarification.

        Edit: Derp. I forgot about the “real time” part. Still, as others have said, even a single botched word would still work well enough with context.
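The threshold idea above is easy to sketch for the non-real-time case: flag low-confidence segments for later human review instead of blocking display. All names and the 0.85 cutoff are made up for illustration; a real recognizer would supply the per-segment confidences:

```python
def review_flags(segments, threshold: float = 0.85):
    """Collect segments whose recognizer confidence falls below a cutoff,
    so a human can review them after the fact.

    segments: list of (text, confidence) pairs from the recognizer.
    Returns a list of (index, text, confidence) for flagged segments.
    """
    flagged = []
    for i, (text, confidence) in enumerate(segments):
        if confidence < threshold:
            flagged.append((i, text, confidence))
    return flagged

# Example: the second segment would be queued for human review.
print(review_flags([("Hello there.", 0.97), ("Asura Hall?", 0.41)]))
```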

        • snooggums
          link
          English
          1
          edit-2
          4 hours ago

          That defeats the purpose of doing it in real time as it would introduce a delay.

          • @shyguyblue
            link
            English
            1
            3 hours ago

            Derp. You’re right, I’ve added an edit to my comment.