Thursday, October 23, 2008
This year 21 universities got a score of 100 on the academic survey section of the THE-QS rankings. A look at the subject rankings, which are combined with equal weighting to form the total score for this indicator, shows that such a perfect score can mean many different things.
Harvard scored 100 in Arts and Humanities, 100 in Life Sciences and Biomedicine, 96.1 in Natural Sciences, 100 in Social Sciences and 59.6 in Engineering and IT.
The Australian National University scored 74 in Arts and Humanities, 46.9 in Life Sciences and Biomedicine, 66.1 in Natural Sciences, 71.4 in Social Sciences and 49.9 in Engineering and IT.
Peking University scored 56.4 in Arts and Humanities, 56.9 in Life Sciences and Biomedicine, 73 in Natural Sciences, 57.8 in Social Sciences and 39.2 in Engineering and IT.
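Taking the equal weighting at face value, a rough check of the raw subject-score averages looks like this. The figures are the ones quoted above; note that QS presumably rescales the combined indicator so that the top scorer shows 100, so these raw means are only illustrative:

```python
# Equal-weighted mean of the five subject scores quoted above.
# QS presumably rescales the combined score so the top university
# shows 100; these raw averages are only a rough illustration.
scores = {
    "Harvard": [100, 100, 96.1, 100, 59.6],
    "Australian National University": [74, 46.9, 66.1, 71.4, 49.9],
    "Peking University": [56.4, 56.9, 73, 57.8, 39.2],
}

for name, subject_scores in scores.items():
    mean = sum(subject_scores) / len(subject_scores)
    print(f"{name}: {mean:.2f}")
```

Even before rescaling, the gap between a "perfect" 100 built on four strong subjects and one weak one, and a uniformly middling profile, is obvious from the raw means.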
Friday, October 17, 2008
In 2007 the University of Alabama was listed by THE-QS as the fifth best research university in the world, as measured by citations per faculty. This year it is not even in the top 100. What happened?
Did all those researchers go on strike?
Or is it just that last year the number of faculty was underestimated and this year the mistake was corrected?
Tuesday, October 14, 2008
QS have helpfully provided the means and standard deviations for their ranking indicators. For the student-faculty indicator the mean is 0.09, which works out as a mean of 11.11 students per faculty.
But looking at the data for 2006, the mean number of students per faculty (using the scores provided by QS and cross-checking with the data on their website) was 16.44 students per faculty (N = 531). There has, it would seem, been a very substantial improvement on this indicator in only two years.
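The conversion is just the reciprocal of the reported mean. A minimal sketch, using only the 0.09 and 16.44 figures from the text:

```python
# QS reports the student-faculty indicator as faculty per student,
# so students per faculty is the reciprocal of the reported mean.
mean_2008 = 0.09                       # mean reported by QS for 2008
students_per_faculty_2008 = 1 / mean_2008
print(round(students_per_faculty_2008, 2))   # 11.11

# Mean derived from the 2006 scores (N = 531), as noted above.
students_per_faculty_2006 = 16.44
improvement = students_per_faculty_2006 - students_per_faculty_2008
print(round(improvement, 2))                 # 5.33
```

An apparent drop of more than five students per faculty member in two years, across hundreds of institutions, is the size of the change that needs explaining.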
Is this a real improvement?
Saturday, October 11, 2008
Times Higher Education has published an editorial on the 2008 rankings that deserves comment. It says:
This is the fifth year we have published the rankings and the methodology
has remained unchanged for the past two. Along with our partners in the venture,
Quacquarelli Symonds, we make enormous efforts to ensure that our
quality-control processes and anti-cheating mechanisms are as robust as
I accept that so far this year the rankings have not been disfigured by the sort of spectacular errors that have occurred in the past. But I would feel more confident about those enormous efforts if THE and QS owned up to their past errors, such as putting Duke in top place for student-faculty ratio by counting undergraduate students as faculty, and indicated exactly what they are doing to stop similar mistakes from recurring.
The editorial continues:
We try to ensure that the results are produced with a large amount of data. For example, there were more than 6,000 participants in the academic survey alone, producing an average of 20 responses per head. That is a staggering 120,000 data points, making it the largest known survey of university quality.
This is fallacious. The validity of any survey depends on how representative the respondents are of the larger population, not on how many there are. If QS continue to report a response rate of around 3 percent, they must expect continued criticism.
After proclaiming its lack of bias THE concludes:
For 2008, we congratulate Harvard University for its success
in topping the rankings yet again. However, it is worth remembering that its
endowment now totals more than $35 billion (£19 billion), roughly equivalent to
the total income received by the entire UK sector last year. By that measure,
the UK, with its 29 institutions in the Top 200 (and four in the Top 10), can
stand proud on the world stage.
According to THE's citations per faculty indicator the research impact of Cambridge academics is less than that of 35 other institutions including Rensselaer Polytechnic Institute, Tufts, McMaster, Tel Aviv, UC Davis, Minnesota, Leiden, Emory, Toronto, Kyoto, Brown and ETH Zurich. Cambridge probably is not quite that bad -- there may be problems with the faculty side of the equation that are causing distortions -- but it does look as though the academic survey is in part an attempt to cover up the steady decline of British higher education and research.
There is a comment on the editorial by bgc:
The Times uses a non-transparent, undefined opinion
survey for most of their weightings - presumably this is what leads to such ...
Anyone who knows anything about international HE would
realize that the Times ranking lack basic validity.
For goodness sake,
sort-out the ranking methodology before next year. Or please stop inflicting
this annual embarrassment on those of us who practice scientometrics and try to
use objective methods of educational evaluation.
This is perhaps over-dramatic. The THE-QS rankings do seem to be improving in some respects. However, a reply by Martin Ince, editor of the rankings, is rather unfortunate:
By contrast, we have measures relating to teaching,
globalisation and employability, and our research indicators cover the full
range of subjects. We set out exactly who and where our respondents are - there
is a nice pie chart in today's paper. These expert academics provide us with
about 126,000 data points (20 per person for 6,300 people) and make up the
biggest and best survey of university quality.
Whether the ability to sign on to a mailing list makes one an academic expert is debatable. And telling experts in scientometrics about your nice pie charts does you no good at all.
Friday, October 10, 2008
The THE-QS rankings seem to be getting better. So far this year, no outrageous errors have surfaced, such as ranking a non-existent university (Beijing University), turning Malaysian ethnic minorities into foreigners, or giving Washington University in St Louis a near-zero score for research.
But comparing the scores for the academic survey with those for citations per faculty, both of which are supposed to measure research quality, suggests that the former has a large and systematic bias.
Here is a list of universities whose score on the academic survey exceeds their score for citations per faculty by forty points or more.
New York University
Trinity College Dublin
Seoul National University
London School of Economics
Nanyang Technological University Singapore
University College Dublin
Humboldt University Berlin
Shanghai Jiao Tong
National Autonomous University of Mexico
Chulalongkorn University Thailand
Lomonosov State University Moscow
Maybe LSE and NYU can be explained by excellence in subjects that produce few publications or citations. But is it not possible that there is a pronounced geographical bias in the survey?
Thursday, October 09, 2008
This year there has been only one methodological change, namely the separation of the lists in the academic survey section into international and domestic sections and then their recombination. This would probably work against universities that receive a lot of votes from their own countries and might explain why Hong Kong, Peking and several Australian universities have fallen quite a bit.
Also, it is likely that the geographical spread of the academic and employer surveys has expanded and that this has benefitted universities in Latin America, Africa and India.
The biggest change in the top 100 is that Washington University in St Louis has risen to 60th place from 161st in 2007. This, presumably, is because it is now getting a realistic score for citations per faculty instead of the 1 it got in 2007, when QS seem to have confused it with the University of Washington. I am a little suspicious, though, about these two universities appearing next to each other in this year's ranking.
The University of Hong Kong has fallen from 18th to 26th, Peking from 36th to 50th, Nanyang from 69th to 77th, Melbourne from 21st to 38th and Macquarie from 168th to 183rd (I wonder what Dr. Schwartz will say about that.)
On the other hand, the National Autonomous University of Mexico has risen from 192nd to 150th, the Indian Institute of Technology Delhi from 307th to 154th and Chulalongkorn from 223rd to 166th.
One oddity that I've noticed is that Stony Brook University, which is an autonomous university centre of the State University of New York, has risen dramatically to 127th place from 224th, while the other three centres, at Binghamton, Buffalo and Albany, which are of equal or better quality, do not even get into QS's initial list.
Wednesday, October 08, 2008
One of the more distasteful aspects of the US presidential election is the obsession in some quarters with Sarah Palin's IQ. See here for example. Some have suggested that she is not particularly bright because she graduated from the University of Idaho and not from Columbia or Harvard.
May I point out that the University of Idaho, although it is not Harvard or Columbia, is ranked in the 400s in the Shanghai Jiao Tong rankings, which puts it way above most institutions in the world and in the United States.
May I also point out that Palin, unlike Joe Biden, has not been accused of plagiarism. And if we are going to be snobby about universities, one wonders why Biden, if he were so clever, would plagiarise from Neil Kinnock, who had to take his final exams twice at a university that also wasn't Harvard or Columbia.