Tech's most contentious debates end with people talking past each other, but they make far more sense viewed through the lens of inter-generational conflict.
I left university in the late '90s and got my first job based on the things I'd been messing about with in my spare time, using the university's facilities and at home (Unix, Internet protocols, client/server architecture, distributed computing, etc.), rather than anything I'd been taught. I learnt more in my first three months at work than in three years of education.
Then the dot-com boom hit, and the number of applicants for any position surged - everyone was going into software development for the money. The whole team became involved in selecting candidates and interviewing, and it was a nightmare trying to give every person a fair chance. We had some good hires and some bad hires, but the bad hires were especially costly because each one sent us back through the recruitment mill again.
But we realised that the number one factor in whether someone would be a good hire was not education, but their own personal projects. That's what mattered. Doing this for fun was the key indicator of being good, and it became the ONLY thing we looked for on CVs in the first pass. It didn't matter if you had a 1st from Cambridge: if you didn't demonstrate a passion for the subject, you didn't get an interview. It was a huge success - we built an amazing team and saved ourselves a ton of time during recruitment.
Those people still exist though, I see it all the time! But I think the "industry" has grown so much that any given field now attracts relatively fewer of them. For example, back in the '80s I was drawn to the personal computer, and in the '90s to the internet - those things are staples of everyday life now. But I can see young people today being drawn to things like AI, drones, quantum computing, 3D printing, and so on instead.