Nowadays, we come across many websites whose pages are too slow. They take a while to load simple content like thumbnails, text, and pictures. Most of the time, these pages are slow not because of your internet speed, but because of the technology the websites use to serve content.
Oh, the author is here. Thank you for the article.
Here are some of my thoughts on some of your points, as a professional web developer who has worked everywhere from startups to some of the biggest companies in the world.
Your first point, about developers getting dependent on a framework or library, doesn’t really matter in practice. Look at the average developer’s resume with 6 years of experience: it lists a lot of things, because that is the culture. We have to keep learning or miss the opportunity to work for a company that uses React rather than Angular.
Professional settings aside, learning the new hip thing and then talking about why it sucks is programming culture and human nature.
I think this topic could be its own article, especially since any contribution to open source, even if it comes from corpos, isn’t a bad thing to me. But moving on.
What I am going to write next is what good web architects do. I am not saying they all do this.
Web architects look at stats from their current or target users. They have to strike a balance between server performance and user experience. Some even keep a list of old devices just to test against, so they can make an informed decision about which devices should be excluded.
You showed a video of old tech using the internet today. Some of those devices are less powerful than a Raspberry Pi is now; they aren’t even meant to handle a flat high-res image. And this case is rare. Should a company spend thousands of dollars for every unique User-Agent it sees? No, especially since those are easily spoofed.
You are right that some sites slow down because of JS, but a lot of that code, like on your own website, which uses a lot of Google code, exists to stream video in tiny chunks. It makes the user experience a lot better. It takes a lot of logic to predict how many chunks of video a user will need at a given moment. I definitely wouldn’t want to wait for a 50 MB video to stutter and load on a website.
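To give a feel for what that chunking logic looks like, here is a minimal sketch of how a player might decide which byte ranges to request next. The chunk size, buffer target, and function name are all made up for illustration; real players (e.g. ones built on Media Source Extensions) also adapt chunk size to measured bandwidth.

```javascript
const CHUNK_SIZE = 1 * 1024 * 1024; // 1 MiB per request (assumed)

// Given the byte offset playback has reached, an estimate of the
// video's bitrate in bytes per second, and how many seconds of
// buffer we want ahead, return the HTTP Range headers to fetch next.
function nextChunkRanges(currentByte, bytesPerSecond, bufferAheadSeconds, totalBytes) {
  const targetByte = Math.min(
    currentByte + bytesPerSecond * bufferAheadSeconds,
    totalBytes
  );
  const ranges = [];
  for (let start = currentByte; start < targetByte; start += CHUNK_SIZE) {
    const end = Math.min(start + CHUNK_SIZE - 1, totalBytes - 1);
    ranges.push(`bytes=${start}-${end}`);
  }
  return ranges;
}
```

Each range would then go out as a `Range` header on a `fetch`, so the browser downloads a megabyte at a time instead of the whole 50 MB file up front.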
Another example is infinite scrolling, aka doom scrolling. Social media sites use lazy loading for images and videos. Try shoving hundreds of videos and high-res images with sound onto one page: it would be horrendous for the user, and people would log off. Say what you want about doom scrolling, but it does its job of keeping users on the page.
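The core of that lazy loading is a windowing calculation: out of hundreds of posts, only render the ones near the viewport. Here is a rough sketch; the fixed item height, overscan count, and function name are invented for illustration (real feeds measure variable heights, and browsers offer IntersectionObserver and the native `loading="lazy"` attribute for the same purpose).

```javascript
const ITEM_HEIGHT = 600; // assumed fixed post height in px
const OVERSCAN = 2;      // extra items kept rendered above/below the viewport

// Return the [first, last] item indices worth rendering right now,
// given how far the user has scrolled and how tall the viewport is.
function visibleRange(scrollTop, viewportHeight, totalItems) {
  const first = Math.max(0, Math.floor(scrollTop / ITEM_HEIGHT) - OVERSCAN);
  const last = Math.min(
    totalItems - 1,
    Math.ceil((scrollTop + viewportHeight) / ITEM_HEIGHT) + OVERSCAN
  );
  return [first, last];
}
```

With 500 posts in the feed, only a handful are ever in the DOM at once; everything else stays as a cheap placeholder until the user scrolls near it.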
If you want to blame JavaScript, you should also blame browsers, since they run the code. Firefox is at version 135 now. Version 1 was less than 9 MB; version 25 was 45 MB; version 50 was 82 MB; version 100 was 121 MB; version 135 is 151 MB. If you want to look at slowdowns, make sure you have enough RAM to run the apps you are using.
I enjoy this topic and would love to hear what you think about what I wrote. Again, thank you for posting.