Would it be correct to say that a 3.5/5 movie rating on a half-star scale isn't exactly the same as a 7/10 rating on a 10-point scale with half-point steps, even though the two look mathematically equivalent? The reason is that half a star on the 5-point scale feels like a smaller increment than a full point on the 10-point scale. So while a great scene might earn a half-star bump, it wouldn't necessarily add a full point on the 10-point scale. Rated out of 10, the same film would probably land closer to a 6.5, which converts to 3.25/5, or simply 3/5 once it's snapped to a half star. This shows that converting ratings between scales doesn't always align perfectly with your intended rating.
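
To put numbers on it, here's a rough Python sketch of what I mean (just a toy illustration; it assumes a straight linear conversion and that anything finer than a half star simply gets snapped down, as in my 6.5 example):

    import math

    def to_half_stars(score_out_of_10):
        # Convert linearly to the 5-point scale, then snap down to the nearest half star.
        raw = score_out_of_10 / 2
        return math.floor(raw * 2) / 2

    print(to_half_stars(7.0))  # 3.5 -- converts cleanly, nothing lost
    print(to_half_stars(6.5))  # 3.0 -- the leftover 0.25 just disappears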

Would I be right claiming this?

  • @edgemaster72 · 11 hours ago

    I think part of the problem is that most times that we use a 10 point scale for rating things, the low end tends to get ignored and unused, whereas on a 5 point scale there’s not as much room for throwing out the low end so ratings there tend to utilize more of the available spectrum. So where a 2/5 might be “below average” for example, a 4/10 would tend to be treated more harshly, more as a clear failure (but not without perhaps 1 or 2 redeeming qualities)