Would it be correct to say that a 3.5/5 movie rating on a five-point scale with half-star increments isn’t exactly the same as a 7/10 rating on a ten-point scale with half-point increments, even though they seem mathematically equivalent? The reason is that half a star on the 5-point scale visually represents less than a full point on the 10-point scale. So while a great scene might earn you a half-star bump, it wouldn’t necessarily add a full point on the 10-point scale. If I rated the same film on a 10-point scale, it’d probably land closer to a 6.5, which converts to 3.25/5 and rounds down to simply 3/5 on the half-star scale. This shows that converting a rating from one scale to another doesn’t always align with the rating you actually intended.
Would I be right in claiming this?
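To make the rounding step concrete, here’s a rough Python sketch of the arithmetic I have in mind (the `to_half_stars` helper and the round-down rule are just my own illustration, not how any rating site actually converts):

```python
import math

def to_half_stars(score_out_of_10, round_down=True):
    """Convert a 0-10 score to a 0-5 star rating in half-star steps."""
    stars = score_out_of_10 / 2              # e.g. 6.5/10 -> 3.25 "raw" stars
    steps = stars / 0.5                      # how many half-star steps that is
    steps = math.floor(steps) if round_down else round(steps)
    return steps * 0.5

print(to_half_stars(7.0))   # 3.5 -- the clean, textbook equivalence
print(to_half_stars(6.5))   # 3.0 -- the 6.5 gets squashed down to 3 stars
```

So 7/10 survives the trip intact, but the 6.5/10 I’d actually want to give collapses to 3 stars, which is exactly the mismatch I mean.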
I’m not sure about your visual interpretation, but I completely agree that the two scales don’t translate directly, and that if something is rated 7/10 I’d assume it’s better than something rated 3.5 stars out of 5.
As to the reason? I wonder if the two scales give different senses of the middle value. In a five-star system, a 3/5 film is the middle value, neither especially good nor bad, but I’d probably give the same “totally average, not good not bad” film 5/10. Similarly, it seems weird to translate “Awful, 1/5” into “Awful, 2/10”. So maybe the difference comes from a lack of clarity about the ends of the scale and half stars: it’s okay to give 0.5/5, but not 0? Or 5.5?
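To put a number on that “different middles” hunch: if you assume the five-star scale effectively runs from 1 to 5 and the ten-point one from 1 to 10 (my assumption, not anything official), a straight linear remap of the ranges disagrees with just doubling the stars:

```python
def remap(x, lo_from=1.0, hi_from=5.0, lo_to=1.0, hi_to=10.0):
    """Linearly map x from [lo_from, hi_from] onto [lo_to, hi_to]."""
    return lo_to + (x - lo_from) * (hi_to - lo_to) / (hi_from - lo_from)

for stars in (1, 3, 3.5, 5):
    print(f"{stars}/5 stars -> doubled: {stars * 2:.2f}/10, range remap: {remap(stars):.2f}/10")

# 1/5   -> doubled 2.00, remapped 1.00  ("Awful" stays at the bottom, not 2/10)
# 3/5   -> doubled 6.00, remapped 5.50  (the middle lands near 5/10, not 6/10)
# 3.5/5 -> doubled 7.00, remapped 6.63  (close to the 6.5 in the original post)
```

Neither mapping is “correct”, of course, but it does suggest the two scales put their middles and their floors in different places.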
And that doesn’t even start to address the modern “if it’s rated less than 4.6* it’s probably awful” issue…
I think I probably would too. And yet, I would tend to instinctively think of 70% as worse than 7/10, even though that makes no sense.