A report by the Network Contagion Research Institute (NCRI) at Rutgers University says topics important to the Chinese government are given more or less weight on TikTok depending on whether they are favorable to its interests.

In its conclusion in the report released this month, the NCRI said: “Whether content is promoted or muted on TikTok appears to depend on whether it is aligned or opposed to the interests of the Chinese Government.”

  • Flying Squid · 25 points · 5 months ago

    The account ‘northkoreanlife’ shows one video that says the country’s capital of Pyongyang “has the best nightlife.” The video shows a number of people ambling along what appears to be a main road lit at night. ‘Northkoreanlife’ has over 286,000 followers.

    Another video shows people walking down a street before a narrator says, “Approaching us is a man riding a motorcycle; clear sign of wealth in North Korea.” Said motorcycle is riding down a road covered in potholes.

    Maybe people are following because that’s pretty amusing. “Let’s see what bullshit propaganda North Korea is failing at today.”

    • @Buddahriffic · 9 points · 5 months ago

      That is hilarious. The wealth disparity is so high that they can’t even understand how to impress Westerners with wealth. Our contexts barely even overlap. Is that propaganda even meant for the west?

      • Flying Squid · 2 points · 5 months ago

        Right? If I was on TikTok, I would consider following that for the laughs.

  • @gedaliyahOPM · 24 points · 5 months ago

    The full report from NCRI and Rutgers is a chilling read. TikTok is also suppressing pro-Ukraine, Uyghur, Tibet, and Hong Kong videos, while magnifying topics like “Free Kashmir” that are harmful to China’s regional rival, India.

    • BobVersionFour · 10 points · 5 months ago

      What do people expect from a Chinese state-owned mass surveillance and propaganda app?

      • dtc · 14 points · 5 months ago

        I wish China would stop claiming ownership of large swaths of the ocean. They won’t.

          • @psmgx · 6 points · 5 months ago

            They sure as shit have tried, e.g. sending a fleet to harass Filipino fishermen

  • @Mango · 6 points · 5 months ago

    Tencent-owned Riot Games also changed all their splash art and skins in League of Legends to match the Chinese version, so groovy Zilean stopped being a magnificent hippie grandpa and became a cranked-out meth head.

  • @[email protected] · 2 points · 5 months ago

    Reposting my comment on this same report from 10 days ago:

    I read the NCRI-Rutgers report in question. You can, too.

    The report’s conclusion states…

    Given the research above, we assess a strong possibility that content on TikTok is either amplified or suppressed based on its alignment with the interests of the Chinese Government.

    …but the data they present doesn’t prove that statement at all.

    The report authors describe their data collection methodology at the top of Page 5 of the report. They state that they’re using each platform’s advertising management system to count the total number of posts/entries that feature a given hash tag, and comparing the counts on one platform to the counts on the other.
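
    To make that concrete, the whole comparison boils down to something like the sketch below (my own illustration; the dict inputs just stand in for the totals you’d read off each ad dashboard, nothing here is code or data from the report):

        from typing import Dict

        def compare_hashtag_counts(instagram_counts: Dict[str, int],
                                   tiktok_counts: Dict[str, int]) -> Dict[str, float]:
            """For each hashtag present on both platforms, return the
            Instagram-to-TikTok ratio of total tagged posts."""
            ratios = {}
            for tag, ig_count in instagram_counts.items():
                tt_count = tiktok_counts.get(tag, 0)
                if tt_count > 0:  # skip tags missing or zero on TikTok
                    ratios[tag] = ig_count / tt_count
            return ratios

        # The inputs are nothing more than the aggregate totals the ad dashboards
        # display; nothing in them says whether any individual post was hidden,
        # demoted, or deleted.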

    Think about that for a second. Those numbers are just aggregates of tagged user posts. To assert that ByteDance is “amplifying” or “suppressing” a given topic, the data would need to show evidence of raw posts in a given category being edited or deleted en masse, or that the content feeds and searches that each platform provides to its users are being modified to hide or promote posts aligned with specific subjects. The data doesn’t address any of that.

    What the data DOES show is how many posts on each platform align with given topics that advertisers have access to. Taken at face value, this data can tell us a lot of interesting things about the users of these particular platforms. For example, TikTok seems to be a lot more into Shakira than Harry Styles. That’s interesting, I guess. Also, Instagram users are making more posts about Uyghurs than TikTok users. That’s also interesting, but that’s not necessarily evidence that ByteDance is suppressing content. What seems more likely is that people who give enough of a shit about Uyghurs to write posts about it aren’t using TikTok.

    So ok, fine, let’s get into some deep-data-fuckery hypotheticals:

    Could TikTok posts pertaining to topics that the Chinese government has expressed opinions about be being edited or deleted? Maybe. That should be easy enough to collect data on and test.
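
    For instance, something like the sketch below would do it (my own placeholder code; fetch_post stands in for whatever scraper or API access a researcher actually has, I’m not claiming any particular TikTok client exists):

        from typing import Callable, Iterable, List, Optional

        def find_missing_posts(post_ids: Iterable[str],
                               fetch_post: Callable[[str], Optional[dict]]) -> List[str]:
            """Re-check a previously sampled set of post IDs and return the ones
            that no longer resolve (fetch_post returning None is treated as gone)."""
            return [pid for pid in post_ids if fetch_post(pid) is None]

        # If samples of sensitive-topic posts vanish at a much higher rate than
        # control-topic samples over the same window, that would be actual
        # evidence of deletion -- the kind of measurement this report never makes.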

    Could the aggregation of TikTok posts for the advertising/marketing systems be deliberately fudging the numbers by under-counting posts for some topics and/or over-counting for others? Maybe. The data doesn’t prove it. But… why? The function of those advertising systems is to allow marketers to buy ads and figure out costs. Lying about those numbers would mean ByteDance was scamming advertisers. Admittedly, that would be quite a scandal if it were happening, but that’s nowhere near the same thing as the report’s conclusion.

    The report’s conclusion is a full-throated statement that ByteDance is tipping the scales in terms of what content is being served to TikTok’s users. This might actually be happening, and it’s absolutely worth investigating, but the evidence in this report does not back up that claim.

    Finally, a pro-tip: if you’re skimming a research report and spot the authors misusing the phrase “begging the question”, it’s time to crank up your bullshit detector to maximum.

    • @gedaliyahOPM · 4 points · 5 months ago (edited)

      Fine, but I don’t think that there is any claim that they are deleting or editing posts. That is not the only way (or even the best way) to manipulate messaging on a social media platform.

      Although that might also be happening through stricter moderation of posts dealing with certain topics, it’s more likely that they are using the algorithm to make sure that fewer people see posts about certain topics that are sensitive or critical of Chinese government interests.

      Although the methodology is imperfect, I haven’t heard any critique that really accounts for their findings. Neutral topics like pop culture and US politics show up with similar frequency* on TikTok and Instagram, while topics that are negative for China appear dozens or hundreds of times less frequently. You claim that people posting about Uyghurs are just not using TikTok as much as Instagram. Why would they prefer Instagram so strongly? Why are neutral political topics posted with near parity? Why are people who care about Kashmir (a topic beneficial to China) posting an order of magnitude more on TikTok than on Instagram?
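
      Here is roughly the arithmetic behind that asterisk (the user-base figures and post counts are invented placeholders just to show the normalization, not numbers from the report):

          # Illustrative only: normalize each hashtag count by a rough user-base
          # size before comparing platforms. All numbers are made-up placeholders.
          TIKTOK_USERS = 1.0e9
          INSTAGRAM_USERS = 2.0e9

          def normalized_skew(ig_posts: int, tt_posts: int) -> float:
              """How many times more often a tag appears on Instagram than on
              TikTok, once each raw count is divided by its platform's user base."""
              return (ig_posts / INSTAGRAM_USERS) / (tt_posts / TIKTOK_USERS)

          print(normalized_skew(100_000, 50_000))  # ~1.0: the "near parity" case
          print(normalized_skew(100_000, 1_000))   # ~50x: the kind of skew the report flags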

      I’ve heard other critiques suggest that it has to do with the age of the platform. The claim is that Instagram is an older platform and so would have higher occurrence of older topics like Tiananmen Square. That theory fails to explain why there’s also a huge difference on recent topics like Ukraine, Hong Kong, or Israel/Palestine.

      I’ve heard other critiques claim that there is a difference in the age range of the user base. Although the two platforms are pretty similar, TikTok does skew younger, and there’s not a good control for that in the study. Again, however, why would there be such a huge difference on the topics in the study? Who are these people who have drastically different views on Hong Kong or Tibet but share the same beliefs about BLM and Trump?

      It doesn’t make sense.

      The simplest explanation that accounts for these findings is that information on TikTok is being manipulated based on the interests of the Chinese government.

      *relative to the size of each platform’s user base