Text for readability:

So far, Americans using RedNote have said they don’t care if China has access to their data. Viral videos on TikTok in recent days have shown Americans jokingly saying they will miss their personal “Chinese spy,” while others say they are purposefully giving RedNote access to their data in a show of protest against the wishes of the U.S. government.

“This also highlights the fact that people are thirsty for platforms that aren’t controlled by the same few oligarchs,” Quintin said. “People will happily jump to another platform even if it presents new, unknown risks.”

  • @Korne127
    11
    1 month ago

    I don’t see why the concept should be unethical.

    In practice, of course, it is insanely unethical, as the algorithms are designed to maximize view time, which leads to algorithmic radicalization and lets hate spread more quickly. But the concept itself, of an algorithm knowing and learning what you like and selecting for you, isn’t unethical.

    • @brucethemoose
      2
      1 month ago

      It’s expensive for video though.

      In other words, I have a hard time seeing Pixelfed with a high quality “benign” TikTok algorithm. It’s already possible for music, but video data/analysis is just so voluminous that, without the profitable exploitation backing it, I don’t see how they’d pay for it.

      • @[email protected]
        1
        1 month ago

        We also have to consider moderation. If suddenly everyone just jumped to the fediverse all at once…hoo boi let’s just say I bet the FBI would have quite a field day.

        But then again there’s PeerTube instances that seem to be doing pretty well so…I dunno…?

    • Amon
      1
      1 month ago

      concept of an algorithm knowing and learning what you like and selecting for you itself isn’t unethical.

      Unless you host it yourself, you have practically given away your soul to an instance operator.

    • @[email protected]
      -1
      edited
      1 month ago

      I don’t see why the concept should be unethical

      It’s like engineering drugs specifically targeting reward systems of the brain associated with human emotional development and socialization.

      Edit: more explicit

      • Amon
        3
        1 month ago

        I’m sorry, say that again?

        • @[email protected]
          2
          edited
          1 month ago

          Besides the fact that it’s quite difficult to do this non-invasively, giving anyone instant access to any amount of exactly what they want most is dangerous (Edit: likely irresponsible, potentially dangerous, like designing escapist drugs, fine line between helping and hurting, and you must consider both).

          I definitely find the lack of care on the part of fellow computer scientists irresponsible. I’ve rejected grant followups for thinly veiled weapons research for the same reason; i.e., potential misuse.

          • Amon
            1
            1 month ago

            I get it now

          • @[email protected]
            1
            edited
            1 month ago

            giving anyone instant access to any amount of exactly what they want most is dangerous (Edit: likely irresponsible, potentially dangerous, like designing escapist drugs

            Oh wow, how you so perfectly, succinctly described all the empty promises of AI hype in one elegant line. 😬
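The “algorithms designed to maximize view time” that the thread debates can be reduced to a very small core: rank content by predicted engagement and nothing else. A minimal sketch of that idea, where all item names and watch-time scores are hypothetical illustrations (no real platform’s model is shown):

```python
# Toy sketch of an engagement-maximizing feed ranker.
# All item names and predicted watch times are made up for illustration.

def rank_feed(items):
    """Order items by predicted watch time, descending."""
    return sorted(items, key=lambda item: item["predicted_watch_seconds"], reverse=True)

feed = [
    {"id": "calm-tutorial", "predicted_watch_seconds": 40},
    {"id": "outrage-clip", "predicted_watch_seconds": 95},
    {"id": "cat-video", "predicted_watch_seconds": 60},
]

# Whatever is predicted to hold attention longest always surfaces first,
# regardless of what it is -- the bias the commenters are arguing over.
print([item["id"] for item in rank_feed(feed)])
# -> ['outrage-clip', 'cat-video', 'calm-tutorial']
```

The concept @Korne127 defends would swap the `predicted_watch_seconds` key for a measure of what the user actually values; the objective function, not the ranking mechanism, is where the ethical question lives.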