I’m curious about how software is created and how it evolves over time. I’m afraid that at some point we’ll realize there are issues with the software we’re using that can only be remedied by massive changes or a complete rewrite.
Are there any instances of this happening? Where something is designed with a flaw that doesn’t get realized until much later, necessitating scrapping the whole thing and starting from scratch?
Which - in my considered opinion - makes them so much worse.
Is it because writing native UI on every current system I’m aware of is still worse than it was in the days of NeXTStep, with Interface Builder, Objective-C, and their class libraries?
And/or is it because it allows (perceived) lower-cost “web developers” to be tasked with building “native” client UIs?
Probably it’s mainly a matter of saving costs: you get a web interface and a standalone app from one codebase.
And sometimes a mobile app, too.
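To make the “one codebase” point concrete, here is a minimal sketch of what that usually looks like in practice, assuming an Electron-style wrapper around an existing web UI (the file paths and window settings are illustrative, not from any particular project):

```typescript
// main.ts — Electron entry point (illustrative sketch).
// Assumes the web UI is already bundled to dist/index.html,
// i.e. the same files a web server would serve to browsers.
import { app, BrowserWindow } from 'electron';

function createWindow(): void {
  const win = new BrowserWindow({ width: 1024, height: 768 });
  // Load the existing web bundle; no platform-specific UI code is written here.
  win.loadFile('dist/index.html');
}

app.whenReady().then(createWindow);

// Quit when all windows are closed, except on macOS where apps conventionally stay active.
app.on('window-all-closed', () => {
  if (process.platform !== 'darwin') app.quit();
});
```

That handful of lines is essentially the entire “desktop port”; the cost saving comes from the fact that everything else is the same HTML/JS the web version already ships.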
Are you aware of macOS? Because it is still built with the same UI tools that you mention.
Aware, yes. Interested, no - the closed-source philosophy, and the way Apple implements it specifically, turn me off hard.
Most UI frameworks are hot garbage. HTML is just much easier and cross-platform.