I know MediaBiasFactCheck is not the be-all and end-all of truth/bias in media, but I find it to be a useful resource.

It makes sense to downvote it in posts that have great discussion – let the content rise up so people can have discussions with humans, sure.

But sometimes I see it getting downvoted when it’s the only comment there, which does nothing unless a reader has rules that automatically hide downvoted comments (and even then the reader could just expand the comment anyway, so really no difference).

What’s the point of downvoting? My only guess is that there are people who are salty about something it said about a source they like. Yet I don’t see anyone offering an alternative to MediaBiasFactCheck…

  • @Eutent · 9 · 4 months ago

    Bias can be subtle and take work to suss out, especially if you’re not familiar with the source.

    After getting a credibility read of mediabiasfactcheck itself (which I’ve done only superficially for myself), it seems to be a potentially useful shortcut. And easy to block if it gets annoying.

    • @Rottcodd · -1 · 4 months ago

      The main problem that I see with MBFC - aside from the simple fact that it’s a third party rather than one’s own judgment (which is not infallible, but should still certainly be exercised, in both senses of the term) - is that it appears to only measure factuality, which is just a tiny part of bias.

      In spite of all of the noise about “fake news,” very little news is actually fake. The vast majority of bias resides not in the nominal facts of a story, but in which stories are run and how they’re reported - how those nominal facts are presented.

      As an example, admittedly exaggerated for effect, compare:

      Tom walked his dog Rex.

      with

      Rex the mangy cur was only barely restrained by Tom’s limp hold on his thin leash.

      Both relay the same basic facts, and it’s likely that by MBFC’s standards, both would be rated the same for that reason alone. But it’s plain to see that the two are not even vaguely similar.

      Again, exaggerated for effect.

      • @[email protected] · 3 · 4 months ago

        MBFC doesn’t only count how factual something is. They very much look at inflammatory language like that, and grade a media outlet accordingly. It’s just not in the factual portion, it is in the bias portion. Which makes sense since, like you said, both stories can be factually accurate.

        • @Rottcodd · 4 · 4 months ago

          I haven’t seen any evidence that it does that - quite the contrary, I’ve seen evidence that it does not: examples from publications ranging from Israel Times to the New York Times to Slate, in which it paired articles full of clearly loaded language with an assessment of high credibility.

          It’s possible that it’s improved of late - I don’t know, since I blocked it weeks ago, after a particularly egregious example of that accompanied a technically factually accurate but brazenly biased Israel Times article.

          • @[email protected] · -1 · 4 months ago

            The bot wasn’t assessing the individual articles. It was just pulling the rating from their website. If you look at the full reports on the website they have a section that discusses bias, and gives examples of things like loaded language found in the articles they assessed.

            • @Rottcodd · 3 · 4 months ago

              Right, nor did I expect a rating based on an individual article - sorry if that’s the way I made it sound.

              It’s simply that the rating of high credibility accompanying an article that was so obviously little more than a barrage of loaded language cast the problem into such sharp relief that I went from being unimpressed by MBFC to actively not wanting to see it.

              • @[email protected] · 2 · 4 months ago

                Totally get that. And I’ve not been trying to push people to accept the bot, or saying that MBFC isn’t flawed. Mostly just trying to highlight the irony of some people having wildly biased views, and pushing factually incorrect info about a site aimed at scoring bias and factual accuracy.

      • @[email protected] · -4 · edited · 4 months ago

        “Both relay the same basic facts”

        NO, THEY DO NOT.

        “Rex has mange” is a factual statement that can be investigated and either confirmed or rejected.

        The same goes for “Rex’s leash was inadequate” and “Tom’s hold on the dog was weak.”

        There are a lot more facts in your second example compared to the first one.

        “it’s likely that by MBFC’s standards, both would be rated the same for that reason alone”

        No, they would not, and it’s pretty easy to find out: https://mediabiasfactcheck.com/methodology/

        Your powers of “paying attention, weighing, analyzing, reviewing and questioning” are not as strong as you think.

        Be careful not to hurt yourself when you fall down from this mountain.

        • @Rottcodd · 3 · 4 months ago

          So are you saying that you wouldn’t be able to recognize my second example as a biased statement without the MBFC bot’s guidance?

          Or did you just entirely miss the point?

          • @[email protected] · -3 · 4 months ago

            I am saying you are lying about your two examples containing the same facts, and I am saying you are lying about how those two statements would be rated by MBFC, because you either didn’t exercise your imaginary analytical skills, or you are intentionally obfuscating.

            You can read that. It is just above your last comment.

            • @Rottcodd · 4 · 4 months ago

              All I see here is someone whose ego relies on a steady diet of derision hurled in the general direction of strangers on the internet.