Would it be correct to say that a 3.5/5 movie rating on a 5-point scale with 0.5 increments isn’t exactly the same as a 7/10 rating on a 10-point scale with 0.5 increments, even though they look identical mathematically? The reason is that half a star on the 5-point scale visually represents less than a full point on the 10-point scale. So while a great scene might earn a movie a half-star bump (say from 3/5 to 3.5/5), it wouldn’t necessarily add a full point on the 10-point scale: rated directly out of 10, that same movie would probably land closer to a 6.5, which converts to 3.25/5, or roughly 3/5 on the half-star scale. This shows that converting a rating to a different scale doesn’t always line up with the rating you actually intended.
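
Here’s a rough sketch of what I mean in Python (the `convert` helper is just something I made up for this post): if you convert exactly and then snap to the target scale’s allowed increment, you lose information whenever the exact value falls between two steps.

```python
def convert(rating, from_max, to_max, increment=0.5):
    """Convert a rating to another scale, then snap it to that scale's allowed increment."""
    raw = rating * to_max / from_max              # exact mathematical conversion
    snapped = round(raw / increment) * increment  # snap to the nearest allowed step
    return raw, snapped

print(convert(3.5, 5, 10))  # (7.0, 7.0)  -> converts cleanly
print(convert(6.5, 10, 5))  # (3.25, 3.0) -> 3.25 isn't a valid half star; Python's
                            #    round-half-to-even pushes it down to 3.0 here
```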

Would I be right claiming this?

  • @lazyViking

    Well, apparently for some reason you are starting both scales at 0.5, which means they are not “the same”. If you start the 5-point scale at 0.5, you have to start the 10-point scale at 1 for them to be similar. So it’s not weird that they feel different.

    • @Johnmilash8OP

      True, but 3.5/5 would still be 7/10 regardless. This post came from an argument with my friend while we were rating movies: I called a movie mid by giving it 3/5, and he said “bro, that would be 6/10, you should score lower then.” I said that you can’t just convert to a 10-point scale, since 3.5/5 and 6/10 have different visual representations, and, as you mentioned, the 10-point system would also have its own 0.5 increments. But he just said math is math…