Announcement from THE
Times Higher Education have just announced that they will rank only 200 universities this year. Another 200 will be listed alphabetically but not ranked.
Let us be clear: the Times Higher Education World University Rankings list only the world’s top 200 research-led global universities.
We stop our annual list at the 200th place for two reasons. First, it helps us to make sure that we compare like with like. Although those ranked have different histories, cultures, structures and sizes, they all share some common characteristics: they recruit from the same global pool of students and staff; they push the boundaries of knowledge with research published in the world’s leading journals; and they teach at both the undergraduate and doctoral level in a research-led environment.
We unashamedly rank only around 1 per cent of the world’s universities – all of a similar type – because we recognise that the sector’s diversity is one of its great strengths, and not every university should aspire to be one of the global research elite.
But we also stop the ranking list at 200 in the interests of fairness. It is clear that the lower down the tables you go, the more the data bunch up and the less meaningful the differentials between institutions become. The difference between the institutions in the 10th and 20th places, for example, is much greater than the difference between number 310 and number 320. In fact, ranking differentials at this level become almost meaningless, which is why we limit it to 200.

If THE are going to provide sufficient detail about the component indicators to enable analysts to work out how universities compare with each other, this would be a good idea. It would avoid raucous demands that university heads resign whenever the top national university slips 20 places in the rankings, but would still allow analysts to figure out exactly where schools stand.
It is true, as Phil Baty says, that there is not much difference between being 310 and 320, but there is, or there would be if the methodology were valid, a difference between 310 and 210. If THE are just going to present us with a list of 200 universities that did not (quite?) make it into the top 200, a lot of usable information will be lost.
The argument that THE is interested only in ranking the leading research-led institutions seems to run counter to THE's emphasis on its bundle of teaching indicators and to the claim that normalization of citations data can uncover hidden pockets of excellence. If we are concerned only with universities that have a research-led environment, then a few pockets, or even a single pocket, should be of little concern.
One also wonders what would happen if disgruntled universities decided that it was not worth the effort of collecting masses of data for TR and THE if the only reward is to be lumped in with 200 also-rans.