It is “virtually impossible to capture the quality of an entire university with all the different courses and disciplines in one figure,” the university said.
Utrecht University said it did not provide any data to the makers of the British trade magazine Times Higher Education’s World University Rankings, so they couldn’t determine a score. According to Utrecht University, the rankings place “too much emphasis on scoring and competition,” while the university says it considers collaboration and openness of scientific research to be more important.
University rankings are prime examples of Goodhart’s Law: “When a measure becomes a target, it ceases to be a good measure.”
The number of worthless papers, international exchanges, etc. I’ve seen at my university is astounding.
The MedUni near me used to be like that. The quality of the doctors it produced deteriorated so much that the hospitals in the country that had historically hired its graduates on the spot started recruiting primarily elsewhere.
They have since shifted away from a “churn out papers” focus and put a lot of time and effort into applicable research and hands-on training, and that turned everything around again.
Their ranking on that shitlist Times puts out dropped (considerably) as a result, yet demand from foreign students to study there surged.
Your reputation in your region and sector is still more important than some fictitious score some media outlet gives you.
Yes, in the business world this often comes with the Icarus paradox, a phenomenon in which the very elements that brought a business its earlier success eventually lead to its failure.
Nothing is perfect, but it seems reasonable to rank universities based on their performance in educational attainment and research?
How do you rank research output consistently? Every university is expected to create its own exam content, so how do you effectively measure educational attainment across universities?
The push for research output can also create perverse incentives for rapid publishing in whatever are considered quality journals, which are often themselves associated with universities, rather than for the pursuit of quality research and academic integrity.
It leads to increased strain on researchers and an unfruitful obsession with any kind of academic output that can be easily counted. Of course research outputs are inherently important, but the whole academic publishing industry has just gotten weird over the last decade or so.
A university I know of brought in a policy of only two terms for postdocs, so regardless of where they were in their research, they had to either become a lecturer or move on.
The reason behind this was to bring in new postdocs: not to increase the quality of the research, but because it was a very effective way of opening up access to new funding streams.
These funding streams are of course very time-limited and commercially driven, so what normally happens is that some half-assed piece of work is produced, possibly with an attempt to monetise it, and then more often than not it is discarded. Actually producing work that furthers an academic field seems to be very far down the list of priorities…
https://www.timeshighereducation.com/world-university-rankings/world-university-rankings-2024-methodology