Monday, December 21, 2009

Does Size Really Matter?
Times Higher Education (THE) are keeping the "peer review", but possibly with new questions. According to a recent article, they will be using the British polling company Ipsos MORI to collect the data.

"So we are delighted to confirm that for the 2010 Times Higher Education World University Rankings, our new rankings partner Thomson Reuters has commissioned one of the world's leading polling companies, Ipsos Mori, to carry out research to support the peer-review element of the tables. Using a professional polling company means that we can inject proper targeting and transparency into the process while ensuring that we get a much larger response rate than in the past - the aim is for at least 25,000 responses in 2010. It also means that the questions in the opinion survey can be carefully crafted to elicit meaningful and consistent responses while ensuring that every respondent knows what is being asked of them."

THE seems to be overly concerned with the number of respondents, claiming that the 9,000-plus of the 2009 THE-QS rankings was an inadequate number to represent the millions of academics of one sort or another around the world. They are right to be concerned, but the number of respondents is not the main determinant of the validity of any survey. What matters more is the extent to which the sample is representative of the population about which data is sought. If THE and Ipsos MORI are going to do no more than get a lot of people to fill out online forms, then their new survey will be little better than the old one.

If the rankings industry is going to descend into a squabble about who's got the biggest survey then QS might be able to trump THE. They could revive their retired respondents from 2004-06, purchase a large stash of email addresses from Mardev, make the survey more user-friendly (tick boxes instead of typing names) and they might well be able to get above the 25,000 mark.

The choice of Ipsos MORI, whose offices are in London, Harrow, Manchester, Edinburgh, Belfast and Dublin, might be an indicator of a narrowing of vision. THE's editorial board, which seems to have become more active of late, is predominantly British, with a heavy bias towards officialdom. Discussion about rankings in THE seems rather anglocentric. A subtle slip was Phil Baty's recent reference to "overseas" universities. They may be overseas to you, but you are overseas to them and to everybody else.

Saturday, December 12, 2009

Whither the QS Rankings?

While Times Higher Education is looking around for a new methodology, QS, judging from a recent conversation with Ben Sowter and Tony Martin and comments on its website, appears set on continuing with the old system, perhaps with a bit of tweaking.

The need to maintain some sort of continuity is understandable, especially after the yo-yoing of some universities in recent editions of the THE-QS rankings. However, criticism of the rankings is such that it would seem a good idea to seize the opportunity to make some simple changes.

The least liked element of the THE-QS rankings of 2004-09 was the "peer review". Being based on the mailing lists of a Singapore-based publishing company with links to Imperial College London, it had an obvious geographical bias. The declared response rate was too low to meet conventional standards of face validity. Its weighting was too high. And as a survey of research expertise it was largely redundant, since citations are a far better measure of research impact and quality.

Furthermore, the "peer review" added to the overemphasis on research. The THE-QS rankings gave a 20% weighting to citations, the faculty-student ratio indicator gave a big and obvious boost to universities with large numbers of non-teaching, research-only faculty, and then there was 40% for a research-based survey.
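As a back-of-the-envelope illustration of that overemphasis (the 40% and 20% figures are those given above; the remaining weights are the commonly reported THE-QS breakdown, and the composite formula is a simplification of what the rankers actually do), the research-linked share of the total score can be tallied like this:

```python
# Reported THE-QS indicator weights, as fractions of the total score.
weights = {
    "academic_survey": 0.40,        # the so-called "peer review"
    "employer_survey": 0.10,
    "faculty_student_ratio": 0.20,  # also boosts research-only faculty
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

# Indicators tied directly to research: the academic survey plus citations.
research_linked = weights["academic_survey"] + weights["citations_per_faculty"]
print(f"Directly research-linked weight: {research_linked:.0%}")  # 60%

# The overall score is simply the weighted sum of normalised indicator scores.
def composite(scores):
    return sum(weights[k] * scores[k] for k in weights)
```

Even before counting the faculty-student ratio's indirect reward for research-only staff, three-fifths of the score depends on research standing.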

I would like to suggest a simple change. Keep the survey of academic opinion (and stop calling it a peer review, because it is nothing of the sort) but use it to assess the general excellence or reputation of universities, perhaps including teaching and student satisfaction. It is not credible that someone with a functioning mouse can sign up for the World Scientific list and thereby become competent to assess the research performance of universities, but he or she might have some idea of the general reputation of institutions. This would require minimal changes to the current procedure: all that is needed is to change the questions.

A couple of other refinements might be in order. The division of the academic world into three super-regions for weighting purposes is too crude. Latin America, Africa, Southwest Asia and Southeast Asia deserve to be treated as separate regions.

Telling everybody that you have sent 180,000 e-mails is asking for trouble if you are going to get a negligible response. It would be better to use the World Scientific lists to accumulate a list of people willing to participate in the survey, combine it with names collected from various events and then send out the survey. If nothing else, the response rate would be a little more respectable.

Thursday, December 10, 2009

A Ranking from SCImago

Teknillinen Korkeakoulu - Tekniska Högskolan in the top 400
Ollscoil Luimnigh just misses top 1000
Good showing by Debreceni Egyetem

SCImago, a research group based in Spanish universities, has published SIR, the SCImago Institutions Rankings 2009 report, which includes a ranking of 2,124 institutions, research centres as well as universities.

There are five indicators, one of which, the number of publications in Scopus-indexed journals, is used for ranking.

There are some positive things about this ranking. It uses Scopus data: anything that reduces the emerging Thomson Reuters monopoly is welcome. It ranks more than two thousand institutions. It is quite transparent: I have checked a few institutions and the figures seem accurate.

The most striking thing about this index is that it shows that a vast amount of research is being done outside universities. The top three places for research output go to government research centres in France, China and Russia, lending support to French claims that current ranking systems fail to take account of their distinctive system of higher education and research.

One irritating thing about these rankings is the eccentric naming policy. Japanese universities are referred to by their Japanese names but Korean and Chinese ones are in English. Some New Zealand universities are listed with English and Maori names but the Universities of Auckland and Waikato are only in English. Dublin Institute of Technology is in Irish but Trinity College Dublin is in English. Some Saudi institutions are in English and some in Arabic. Three Israeli universities are in Catalan (or German without the umlaut!)

Wednesday, December 09, 2009

An Ancient Dinosaur Reborn?

Times Higher Education and some of its readers seem to be concerned about what they think is the unduly low position of the London School of Economics (LSE) in previous rankings. It is true that institutions that specialise in the social sciences and humanities suffer in any ranking based on citations and publications, since their researchers produce fewer, longer papers with fewer authors, write more books, and use citations more sparingly than do those in the natural sciences and medicine. However, this seems to affect universities like Yale and Princeton as much as LSE. It would be quite simple for rankers to use some sort of weighting to reduce the disadvantage of such places, and it would be an improvement if THE were to do this in any future ranking system.

But the concern with LSE is rather suspicious. Should specialist institutions be regarded as the equal of universities that excel in all disciplines? Perhaps THE should also think about the overrating of Oxford and Cambridge (take away the peer review from the THE-QS rankings of 2004-09 or the alumni and awards indicators from the Shanghai rankings and see where they are) as they discuss their new system.

It might be worth recalling a comment made by a THE reader back in October.


"It is always quite interesting to see that British institutions are still regarded as the top of the world. (I just compare it with the FT MBA rankings as well, where UK institutions dominate all rankings). As someone from the continent I only can say "Long live the British Empire!" It seems to me that the stereotype of British domination is still very alive in UK. A closer look at the British economy, engineering and scientific achievements, however, reveals the mental fraud. Travelling across UK, I often realize that UK is frozen in time. Sometimes the technology, housing and machines are like from a 3rd world. London Metro is like from 1899. Trains across the country are like in the 30s. Communication technology is like mid of last century. I would have reasoned that with all the best universities, as you have figured out yourself, only bright scientists and engineers evolve. It's an illusion. Travel across Europe, marvel at French TGV trains, drive German cars and have a look at Spanish solar power plants and you will see that others, with officially inferior schooling systems, have achieved far more. Your university ranking is an illusion, buried in century long self-perception of world dominance. I am sorry to write that, but it is true. The British dominance is long gone, same with academic institutions. Your ranking list is an ancient dinosaur."

Tuesday, December 01, 2009

Whither the Times Higher Rankings?

Times Higher Education has announced that it will be producing a new ranking system to replace the THE-QS World University Rankings.

THE does not seem to have much idea about where it is going. Its advisory committee (it would be interesting to find out who they are) is reported to have complained that the number of respondents in the peer review is too small and that the citations indicator is biased against the social sciences and the humanities.

Neither of these is very helpful. The small number of respondents is not for lack of trying by QS. They have been sending out nearly 200,000 e-mails a year. I doubt if there is very much anyone can do to get many more respondents. What could and should be done is to improve the validity of the survey by clearly identifying the group whose opinion is being sought or by using databases that are less obviously biased. The second problem could be dealt with quite easily by assigning appropriate weightings to the various discipline clusters.
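One standard way of applying such a weighting (a generic field-normalisation sketch, not anything THE or Thomson Reuters has announced; the world-average figures below are invented for illustration) is to divide an institution's citation rate in each discipline cluster by the world average for that cluster, so that the sparser citation habits of the humanities and social sciences no longer count against it:

```python
# Hypothetical world-average citations per paper, by discipline cluster.
world_average = {
    "natural_sciences": 10.0,
    "medicine": 12.0,
    "social_sciences": 4.0,
    "humanities": 1.5,
}

def field_normalised_impact(cites_per_paper, papers):
    """Average of (institution rate / world rate), weighted by paper counts.

    A score of 1.0 means the institution cites exactly at the world norm
    for its own mix of disciplines; above 1.0 means above the norm.
    """
    total = sum(papers.values())
    return sum(
        (cites_per_paper[field] / world_average[field]) * (papers[field] / total)
        for field in papers
    )

# An LSE-like profile: lightly cited by raw counts, but above its fields' norms.
lse_like = field_normalised_impact(
    cites_per_paper={"social_sciences": 5.0, "humanities": 2.0},
    papers={"social_sciences": 800, "humanities": 200},
)
```

On raw citations per paper such an institution looks weak; normalised within its own clusters it scores comfortably above the world average.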

THE has also published comments from readers about future directions for its rankings. Some of these seem unaware of the basic methods of the THE-QS rankings. One, for example, wants to see an "increased number of academics interviewed" -- QS never interviewed anyone for its survey. Others want the rankings to include criteria that are of limited global comparability, such as starting salaries or graduate job prospects.

Several readers are unhappy with what they feel is the unfairly low position of LSE. This would seem misplaced. The rankings are supposed to be of universities, not of research institutes, and offering a full range of courses ought to be a significant element in the assessment of a university.

Other readers are sceptical about the significance of internationalisation, and there appears to be division about whether citations are an adequate measure of research quality.

The response so far appears to be predominantly British. If THE are going to listen to their readers it is likely that the obvious pro-British and even pro-Oxbridge bias of the old rankings will continue.

Anyone interested in taking part in a survey by Thomson Reuters and THE can do so by going here.