• @deathmetal27 · 6 months ago

    Non-clickbait title:

    UC Berkeley Professor Founds a Deepfake Forensics Company, GetReal Labs.

  • Ð Greıt Þu̇mpkin · 6 months ago

    I’ve said before that what’s needed here is an independent org, or maybe even multiple, that can certify that pieces of media were created without the use of generative tech, and then open up companies that publish anything missing the cert to lawsuits from the public.

    Verify-then-trust publishing would go a long way toward containing misinformation in general, not just deepfakes and other generated content.

    • @CheeseNoodle · 6 months ago

      Theoretically good, but it’d also be yet another way to raise the barrier to entry for all creative industries, giving established players an even bigger advantage.

      • Ð Greıt Þu̇mpkin · 6 months ago

        Nah, just get the work certified. It’d be available to anyone; the work just needs to pass the “no AI was used at all” test, and that can be done by giving creative software tools the ability to sign files whenever AI or screenshotting tools get used in their making.

        I think letting the software tools themselves grant the cert when no AI features are used will be the way to really spread the practice. Vendors could even offer a cert mode that just doesn’t present those tools to the user in the first place.
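        A rough sketch of what that signing step could look like, using Python and the third-party cryptography package purely for illustration; the names sign_export and ai_features_used are hypothetical, standing in for whatever a real tool and cert scheme would actually use:

        ```python
        import hashlib
        import json

        from cryptography.hazmat.primitives.asymmetric import ed25519

        # In practice this would be the tool vendor's long-lived signing key,
        # not one generated on the fly.
        signing_key = ed25519.Ed25519PrivateKey.generate()

        def sign_export(file_bytes: bytes, ai_features_used: bool) -> dict:
            """Attach a signed claim about AI usage to an exported file."""
            claim = {
                "sha256": hashlib.sha256(file_bytes).hexdigest(),
                "no_ai_used": not ai_features_used,
            }
            payload = json.dumps(claim, sort_keys=True).encode()
            return {"claim": claim, "signature": signing_key.sign(payload).hex()}

        # The editor tracked that no generative features were invoked this session.
        attestation = sign_export(b"...exported file bytes...", ai_features_used=False)
        ```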

        The next step would be getting publishing sites to use the cert or signature as an upload check that either blocks the upload or, more leniently, adds a visual marker indicating that the work was not made with generative tools.
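        And a matching sketch of the upload-time check, assuming the site already trusts the tool vendor’s public key and receives the attestation from the sketch above alongside the file (check_upload is again a made-up name):

        ```python
        import hashlib
        import json

        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric import ed25519

        def check_upload(file_bytes: bytes, attestation: dict,
                         vendor_key: ed25519.Ed25519PublicKey) -> str:
            """Return "verified" if the file carries a valid no-AI attestation."""
            payload = json.dumps(attestation["claim"], sort_keys=True).encode()
            try:
                vendor_key.verify(bytes.fromhex(attestation["signature"]), payload)
            except InvalidSignature:
                return "unverified"  # attestation forged or tampered with
            if attestation["claim"]["sha256"] != hashlib.sha256(file_bytes).hexdigest():
                return "unverified"  # file is not the one that was signed
            return "verified" if attestation["claim"]["no_ai_used"] else "unverified"
        ```

        A strict site would block anything that comes back “unverified”; a lenient one would accept everything and only show the marker on verified uploads.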

        Plus, checking for the cert would also eliminate the crisis of teachers having to outsmart GPT to catch students cheating on take-home essays.