Saturday, July 21, 2007

Blog on University Rankings

There is a Spanish-language blog on university rankings and other academic matters by Alejandro Pisanty that is well worth looking at.

Tuesday, July 17, 2007

Somebody Else Has Noticed

Matt Rayner has posted an interesting question on the QS topuniversities site. He has noticed that in the Guide to the World's Top Universities, published by QS, Cambridge is supposed to have a student-faculty ratio of 18.9 and a score of 64 on this component of the 2006 World Rankings, while Glasgow, with an almost identical ratio of 18.8, gets a score of 35.

As already noted, this anomaly is not confined to Cambridge and Glasgow. The student faculty ratios provided in the data about individual universities in the Guide are completely different from those given in the rankings.

There is in fact no significant relationship between the two sets of data, as a quick correlation run in SPSS will show.
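For anyone who wants to replicate the check, a few lines of Python will do the same job as SPSS. The ratios below are placeholders rather than the actual figures from the rankings and the Guide, so substitute the real numbers before drawing conclusions.

# Sketch of the check described above: correlate the student-faculty ratios
# given in the rankings with those printed in the Guide's profiles.
# The numbers here are placeholders, not the actual data.
from scipy.stats import pearsonr

rankings_ratios = [3.48, 5.48, 6.17, 8.24, 3.94]  # ratios behind the ranking scores
guide_ratios = [16.7, 12.3, 18.0, 14.5, 9.9]      # ratios printed in the profiles

r, p = pearsonr(rankings_ratios, guide_ratios)
print("Pearson r = %.2f, p = %.3f" % (r, p))
# A coefficient close to zero with a large p-value is what "no significant
# relationship" means here.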

It will be even more interesting to see when and how QS reply to Matt's question.

Sunday, May 13, 2007

The University of Santo Tomas

Varsitarian, the newspaper of the University of Santo Tomas (UST) in the Philippines, has published an article questioning the credibility of the THES-QS world university rankings.

The complaint appears to be valid although the newspaper makes several errors about the rankings.

Alberto Laurito, assistant to the rector for planning and development at UST, has claimed that QS got the number of students wrong. The consultants reported 11,764 students whereas the correct number is 32,971. The university's figure seems to be correct. An article by Guzman and Torres in the Asia Pacific Education Review reports 32,322 students in 2002-3. However, QS's deflating of student numbers, if it were the only mistake, would work to UST's advantage in a number of ways. Firstly, fewer students mean fewer students per faculty, if the number of the latter is constant, and hence a better score on the student-faculty component of the rankings. Secondly, if the number of international students is the same, fewer students overall means a higher percentage of international students.

However, this is not QS's only error. They report that UST has 524 faculty, making a student-faculty ratio of 22.45. According to the article, in 2002-3 UST had 1,500 faculty. With 32,322 students, this would mean a student-faculty ratio of 21.55. QS has made two errors and they have pretty much cancelled each other out.
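Checking the arithmetic (in Python, for convenience):

# The two ratios implied by the figures above.
qs_ratio = 11764 / 524        # QS's numbers: 11,764 students, 524 faculty
article_ratio = 32322 / 1500  # Guzman and Torres: 32,322 students, 1,500 faculty
print(round(qs_ratio, 2), round(article_ratio, 2))  # 22.45 and 21.55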

Laurito then complained:

that THES-QS research on peer review was also irregular, considering that it was worth 40 per cent of the entire survey when only 1,600 universities turned in their responses or about one per cent of the 190,000 needed

The low response rate does of course invalidate the “peer review” but it was individual academics who were surveyed, not universities.

Laurito then points out that UST got a zero for research citations:

The score is obtained through a research collation database maintained by Thomson, an information-based solutions provider, called Essential Science Indicators (ESI). For every citation given to a university researcher or professor, the university would acquire a point.

The procedure is not like this at all. Laurito continues:

Based also on the survey, UST received the lowest grade on international outlook (meaning UST has no international students or faculty) when the University actually has seven international professors and 300 international students.

Again, not quite. UST gets a score from QS of 3.6 for international faculty and 0.6 for international students, representing 12 international faculty members and 47 international students.

Laurito has got the wrong end of several sticks but the basic point still remains that QS got the data for students, faculty and international students wrong.

The newspaper then quotes Laurito as saying:

We were told by the research representative (of THES-QS) that the data they used were personally given to them by a University personnel, but they were not able to present who or from what office it came from

If Laurito is reported correctly and if this is what the “research representative” told him, there is something very strange here.

If QS have a documentary record of an e-mail or a phone call to UST, how could the record not indicate the person or office involved?

If they do not, how can QS be sure that the information came from an official university source or that there was any contact at all?

Friday, May 11, 2007

More about Student-Faculty Ratios

I have just discovered a very good site by Ben Wilbrink, Prestatie-indicatoren (performance indicators). He starts off with "Een fantastisch document voor de kick-off" ("a fantastic document for the kick-off"), referring to a monograph by Sharon L. Nichols and David C. Berliner (2005), The Inevitable Corruption of Indicators and Educators Through High-Stakes Testing, Education Policy Studies Laboratory, Arizona State University, pdf (180 pp.).

The summary of this study reports that:

"This research provides lengthy proof of a principle of social science known as Campbell's law: "The more any quantitative social indicator is used for social decisionmaking, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor." "

This insight might well be applied to current university ranking systems. We have seen, for example, some US universities making it optional for applicants to submit their SAT results. It is predictable that good scores will be submitted to admissions officers, but not bad ones. Universities will then find that the average scores of their applicants will rise and therefore so will their scores on rankings that include SAT data.

I would like to propose a new law, an inversion of Gresham's. Good scores drive out bad.

Wilbrink has some good comments on the THES-QS rankings but I would like to focus on what he says about the student-faculty ratio.

"The faculty/student score (20%)The scores in this rubric are remarkable, to say the least. I do not think the student/staff ratio is less reliable than the other indicators, yet the relation to the world rank score seems to be nil. The first place is for (13) Duke, the second for (4=) Yale, the third for (67) Eindhoven University of Technology. Watch who have not made it here in the top twenty: Cambridge is 27th, Oxford 31st, Harvard 37th, Stanford 119, Berkeley 158. This is one more illustration that universities fiercely competing for prestige (see Brewer et al.) tend to let their students pay at least part of the bill.
"We measure teaching by the classic criterion of staff-to-student ratio." Now this is asking for trouble, as Ince is well aware of. Who is a student, who is a teacher? In the medieval universities these were activities, not persons. Is it much different nowadays? How much? ...


Every administration will creatively fill out the THES/QS forms asking them for the figures on students and teachers, this much is absolutely certain. If only because they will be convinced other administrations will do so. Ince does not mention any counter-measure, hopefully the THES/QS people have a secret plan to detect fraudulent data."

It is possible to test whether Wilbrink's remarks are applicable to the student-faculty scores for the 2006 THES-QS rankings. THES have published a table of student-faculty ratios at British universities from the University and College Union that is derived from data from the Higher Education Statistics Agency (HESA). These include further education students and exclude research-only staff. These results can be compared to the data in the THES-QS rankings.


In 2006 QS reported that the top scorer for student-faculty ratio was Duke. Looking at QS's website we find that this represents a ratio of 3.48 students per faculty. Cross-checking shows that QS used the data on their site to construct the scores in the 2006 rankings. Thus, the site reports that Harvard had 3,997 faculty and 24,648 students, a ratio of 6.17 students per faculty; ICL 3,090 faculty and 12,185 students, a ratio of 3.94; Peking 5,381 faculty and 26,972 students, a ratio of 5.01; and Cambridge 3,886 faculty and 21,290 students, a ratio of 5.48. These ratios yielded scores of 56, 88, 69 and 64 on the student-faculty component of the 2006 rankings.
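QS does not publish the formula that turns ratios into scores, but the published figures are consistent with a simple normalisation against the best (lowest) ratio, Duke's 3.48. The sketch below should be read as an inference from the numbers, not as QS's documented method.

# Inferred scoring: score = 100 * (best ratio / university's ratio).
# Student and faculty numbers are those given on QS's topuniversities site.
best_ratio = 11106 / 3192  # Duke, the top scorer, roughly 3.48

universities = {
    "Harvard": (24648, 3997),
    "ICL": (12185, 3090),
    "Peking": (26972, 5381),
    "Cambridge": (21290, 3886),
}

for name, (students, faculty) in universities.items():
    ratio = students / faculty
    print(name, round(ratio, 2), round(100 * best_ratio / ratio))
# Output: roughly 56, 88, 69 and 64 -- the published scores.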


Now we can compare the QS data with those from HESA for the period 2005-06. Presumably, this represents the period covered in the rankings. If Wilbrink is correct, then we would expect the ratios in the rankings to be much lower and more favourable than those provided by HESA.

That in fact is the case. Seven British universities have lower ratios in the HESA statistics. These are Cranfield, Lancaster, Warwick, Belfast, Swansea, Strathclyde and Goldsmiths College. In 35 cases the THES-QS score was much better. The most noticeable differences were ICL, 3.94 and 9.9; Cambridge, 5.48 and 12.30; Oxford, 5.70 and 11.9; LSE, 6.57 and 13; Swansea, 8.49 and 15.1; and Edinburgh, 8.29 and 14.

It is possible that the differences are the result of different but consistent and principled conventions. Thus one set of data might specifically include people excluded by the other. The HESA data, for example, include further education students, presumably meaning non-degree students, but the THES-QS data apparently do not. This would not, however, seem to explain much of the difference between the two sets of data for places like Oxford and LSE.

Both HESA and QS claim not to count staff engaged only in research.

It is possible, then, that the data provided by universities to QS have been massaged a bit to give favourable scores. I suspect that this does not amount to deliberate lying. It is probably more a case of choosing the most beneficial option whenever there is any ambiguity.

Overall, the ratios provided by QS are much lower: an average of 11.37 compared to 14.63 for HESA.

Wednesday, May 09, 2007

Another Comment on QS and Kenan-Flagler

A blog by MBA student Shawn Snyder remarks:

"So CNN recently published its "Top 50 Business Schools to get Hired in 2007" and I was glad to see Maryland's Smith school listed, but I was confused to see the George Washington University right above Smith. After all, by their own ranking the GW grads had one less job offer and starting salary almost $10,000 lower. Umm, maybe recruiters think that George Washington is a better deal because they can snag grads for cheap, but from a business student perspective (the people reading the rankings) wouldn't Smith be the better choice? And why wouldn't it rank higher? Business rankings are crap in my opinion....and yet I still read all of them as if it matters. Maybe I have the problem."


And there is a comment by Dave:


" I too noticed some discrepancies in the ratings on CNN.com. Specifically, UNC Kenan-Flagler is not in the top 50! I dug a bit deeper and looked at the data from topmba.com - the website where the list came from - and found some startling errors. UNC KFBS average salary is listed as $76k when the actual average is $89k! I wrote a letter to TopMBA.com and found that not only did they screw up the salaries, but they did not distinguish between University of North Carolina and North Carolina State U in the recruiter rankings! It's really incredible the garbage that these people are allowed to print. What ever happened to 'trust but verify'?"
More on QS and Kenan-Flagler


There is an interesting post at Accepted Admissions Almanac about the QS-Kenan-Flagler affair. The writer remarks:


"It's safe to say that this mess is a nightmare for QS, CNNMoney, and Fortune. Providing and publishing rankings so sloppily slapped together is beneath criticism for an industry that even when the data is accurate has more than its share of critics and is deserving of skepticism. The CNNMoney/QS fiasco is about as bad as it gets for rankings."


I am afraid that it gets very much worse for QS. They have made errors as bad as this in the compilation of the THES-QS World University Rankings -- a response rate of less than 1 per cent to an online survey, counting ethnic minority students in Malaysia as international students, renaming Peking University as Beijing University, boosting Duke University's score for student-faculty ratio by counting undergraduates as faculty, and so on.

But nobody seems to mind very much when it comes to the THES rankings. Is it something about the brand name?

The post concludes with a very appropriate comment:

"When accurate, unlike the removed QS/CNNMoney version, they are sources of information. Sometimes valuable information. Databanks. I use the data, and so should you. If you want to know the average salaries of graduates from particular schools or their average entering test scores, the rankings will have that information compiled in one place. Like a library, they are sources of information. They are not an excuse for decision-making; using them mindlessly could be the equivalent of a lobotomy. And an expensive one at that."
Best Value Colleges

The Princeton Review (registration required) has published a list of the best value colleges in the US.



Here is what they say about their methodology:

"We chose the schools that appear on our Top Ten Best Value Public and Private Colleges ranking lists based on institutional data we collected from more than 650 schools during the 2005-2006 academic year and our surveys of students attending them. Broadly speaking, the factors we weighed covered undergraduate academics, costs, and financial aid.

More specifically, academic factors included the quality of students the schools attracted, as measured by admissions credentials, as well as how students rated their academic experiences. Cost considerations were tuition, room and board, and required fees.

Financial aid factors included the average gift aid (grants, scholarships, or free money) awarded to students, the average percentage of financial need met for students who demonstrated need, the percentage of students with financial need whose need was fully met by the school, the percentage of graduating students who took out loans to pay for school, and the average debt of those students. We also took into consideration how satisfied students were with the financial aid packages they received."



There are a few questions that should be asked about the methodology, especially concerning the student surveys, but this approach may be more useful for undergraduate students than that of the THES-QS and Shanghai Jiao Tong rankings.

The top 10 best value private colleges for undergraduates are:

1. Rice University
2. Williams College
3. Grinnell College
4. Swarthmore College
5. Thomas Aquinas College
6. Wabash College
7. Whitman College
8. Amherst College
9. Scripps College
10. Harvard College


The top 10 best value public colleges are:

1. New College of Florida
2. Truman State University
3. University of North Carolina at Asheville
4. University of Virginia
5. University of California at Berkeley
6. University of California at San Diego
7. University of California at Santa Cruz
8. University of Minnesota, Morris
9. University of Wisconsin-Madison
10. St. Mary's College of Maryland

Thursday, May 03, 2007

‘again!?’ Yep... Quacquarelli Symonds Ltd (QS) did it again.


Eric Beerkens has written some excellent posts on his blog on the internationalization of higher education.

A recent one concerns QS Quacquarelli Symonds Ltd (QS), who were responsible for collecting data for a ranking of business schools by Fortune magazine. It seems that QS committed a major blunder by leaving out the Kenan-Flagler School at the University of North Carolina at Chapel Hill, one of the top American business schools and one that regularly appears among the high fliers in other business school rankings. Apparently QS got it mixed up with North Carolina State University's College of Management. They also left out the Boston University School of Business. Beerkens refers to an article in the Economist (subscription required) and remarks:

“After reading the first line, I thought: 'again!?' Yep... Quacquarelli Symonds Ltd (QS) did it again.”

Beerkens then points out that this is not the first time that QS has produced flawed research, referring – for which many thanks – to this blog and others. He concludes:

“It's rather disappointing that reputable publications like THES and Forbes use the services of companies like QS. QS clearly doesn't have any clue about the global academic market and has no understanding of the impact that their rankings are having throughout the world. There has been a lot of critique about the indicators that they use, but at least we can see these indicators. It are the mistakes and the biases that are behind the indicators that make it unacceptable!”


There was a vigorous response from the University of North Carolina. They pointed out that QS had admitted to not contacting the university about the rankings, using outdated information and getting the University of North Carolina mixed up with North Carolina State University. QS did not employ any proper procedures for verification and validation, apparently failed to check with other rankings, gave wrong or outdated information about salaries and provided data from 2004 or 2005 while claiming that these referred to 2006.

Fortune has done the appropriate and honest, although probably expensive, thing and removed the rankings from its website.

What is remarkable about this is the contrast between Fortune and the THES. All of the errors committed by QS with regard to the Fortune rankings are paralleled in the World University Rankings. They have, for example, grossly inflated the scores of the Ecole Normale Superieure in Paris in 2004 and the Ecole Polytechnique in 2005 by counting part-time faculty as full-time, and done the same for Duke University – QS does seem to have bad luck in North Carolina, doesn't it? – in 2005 by counting undergraduate students as faculty and in 2006 by counting faculty twice. They have used a database from a Singapore-based academic publishing company that specialises in Asia-Pacific publications to produce a survey supposed to represent world academic opinion, conducted a survey with an apparent response rate of less than one per cent, and got the names of universities wrong – Beijing University and the Official University of California among others.

It is probably unrealistic for THES to remove the rankings from its website. Still, they could at the very least start looking around for another consultant.
Book Review

This is a draft of a review that may appear shortly in an academic journal.

Guide to the World’s Top Universities, John O’Leary, Nunzio Quacquarelli and Martin Ince. QS Quacquarelli Symonds Ltd.: London. 2006.


The THES (Times Higher Education Supplement)-QS World University Rankings have aroused massive interest throughout the world of higher education, nowhere more so than in East and Southeast Asia. Very few university teachers and administrators in the region can be unaware of the apparent dramatic collapse of quality at Universiti Malaya, which was in fact nothing of the sort. That the apparent collapse resulted from nothing more than an error by THES's consultants, belatedly corrected, has done little to diminish public fascination.

Now, QS Quacquarelli Symonds, the consultants who compiled the data for the rankings, have published a large 512-page volume. The book, written by John O'Leary and Martin Ince of THES and Nunzio Quacquarelli of QS, comes with impressive endorsements. It is published in association with IELTS, TOEFL and ETS, names that quite a few Asian students and teachers will know, and is distributed by Blackwell Publishing of Oxford. At the top of the front cover, there is a quotation from Tim Rogers, former Head of Student Recruitment and Admissions, London School of Economics: "A must-have book for anyone seeking a quality university education at home and abroad." Tim Rogers, by the way, has been a consultant for QS.

The Guide to the World’s Top Universities certainly contains a large amount of material. There are thirteen chapters as follows.

1. Welcome to the world's first top university guide
2. Ranking the world's universities
3. How to choose a university and course
4. The benefits of studying abroad
5. What career? Benefits of a top degree
6. Tips for applying to university
7. What parents need to know -- guide to study costs and more
8. Financing and scholarships
9. The world's top 200 universities. This is the ranking that was published last year in the THES.
10. The world's top universities by subject. This was also published in the THES.
11. The top 100 university profiles. This provides two pages of information about each university.
12. The top ten countries
13. Directory of over 500 top world universities.

Basically, there are two parts. The earlier chapters mostly consist of advice that is generally interesting, well written and sensible. Later, we have data about various characteristics of the universities, often ranking them in order. The latter comprise much of the book. The profiles of the top 100 universities take up 200 pages and the directory of 500 plus universities another 140.

So, is this a must-have book? At £19.99, $35.95 or €28.50 the answer has to be: not really. Maybe it would be a good idea to glance through the earlier advisory chapters, but as a source of information and evaluation it is not worth the money. First of all, there are serious problems with the information presented in the rankings, the profiles and the directory. The book's credibility is undermined by a succession of errors, indicating an unacceptable degree of carelessness. At 35 dollars or 20 pounds we surely have the right to expect something a little better, especially from the producers of what is supposed to be "the gold standard" of university rankings.

Thus we find that the Technical University of Munich appears twice in the profiles, in positions 82 (page 283) and 98 (page 313). The latter should be the University of Munich. In the directory the University of Munich is provided with an address in Dortmund (page 407). The Technical University of Helsinki is listed twice in the directory (pages 388 and 389). A number of Swiss universities are located in Sweden (pages 462 and 463). The authors cannot decide whether there is only one Indian Institute of Technology and one Indian Institute of Management (page 416) or several (pages 231 and 253). New Zealand is spelt 'New Zeland' (page 441). The profile for Harvard repeats the same information in the factfile under two different headings (page 119). There is something called the 'Official University of California, Riverside' on page 483. Kyungpook National University in Korea has a student-faculty ratio of zero (page 452). Something that is particularly irritating is that the authors or their assistants still cannot get the names of Malaysian universities right. So we find 'University Putra Malaysia' on page 435 and 'University Sains Malaysia' on page 436. After that famous blunder about Universiti Malaya's international students and faculty one would expect the authors to be a bit more careful.

Still, we must give some credit. At least the book has at last started to use the right name for China’s best or second best university – Peking University, not Beijing University -- and ‘University of Kebangsaan Malaysia’ in the 2006 rankings in the THES has now been corrected to ‘Universiti Kebangsaan Malaysia’.

The Guide really gets confusing, to put it mildly, when it comes to the number of students and faculty. A perceptive observer will note that the data for student-faculty ratio in the top 200 rankings reproduced in chapter 9 are completely different from those in the profiles in chapter 11 and the directory in chapter 13.

For example, in the rankings Duke University, in North Carolina, is given a score of 100, indicating the best student faculty ratio. Going to QS’s topuniversities website we find that Duke supposedly has 11,106 students and 3,192 faculty, representing a ratio of 3.48 students per faculty. But then we turn to the profile and see that Duke is assigned a ratio of 16.7 students per faculty (page 143). On the same page we are told that Duke has 6,301 undergraduates and 4,805 postgraduates and “just under 1,600 faculty”. That makes a ratio of about 6.94. So, Duke has 3.48 or 6.94 or 16.7 students per faculty. Not very helpful.

Looking at Yale University, the book tells us on a single page (127) both that the student-faculty ratio is 34.3 and that the university has "around 10,000 students" and 3,333 faculty, a ratio of 3 students for each faculty member.

On page 209 we are told that the University of Auckland has a student-faculty ratio of 13.5 and, in the adjacent column, that it has 2,000 academic staff and 41,209 students, a ratio of 20.6. Meanwhile, the top 200 rankings give it a faculty-student score of 38, which works out at a ratio of 9.2. So, take your pick from 9.2, 13.5 and 20.6.

The data for research expertise is also contradictory. Universities in Australia and China get excellent scores for the “peer review” of best research in the rankings of the top 200 universities in chapter 9 but get relatively poor scores for research impact. The less glamorous American universities like Boston and Pittsburgh get comparatively low scores for peer review of research but actually do excellent research.

Errors and contradictions like these seriously diminish the book’s value as a source of information.

It would not be a good idea to buy this book, although it might be worth looking at the early chapters if you can borrow it from a library. To judge the overall global status of a university, the best bet would be to look east and turn to the Shanghai Jiao Tong University Index, available on the Internet, which ranks the top 500 universities. This index focuses entirely on research, but there is usually at least a modest relationship between research activity and other variables such as the quality of the undergraduate student intake and teaching performance. Those thinking about going to the US should look at the US News and World Report's America's Best Colleges. Anyone concerned about costs – who isn't? – should look at Kiplinger's Index, which calculates the value for money of American universities. Incidentally, the fifth place here goes to the State University of New York at Binghamton, which is not even mentioned in the Guide. The Times (which is not the same as the Times Higher Education Supplement) and Guardian rankings are good for British universities.

Students who are not certain about going abroad or who are thinking about going to a less well known local institution could try doing a Google Scholar search for evidence of research proficiency and a Yahoo search for miscellaneous activity. Whatever you do, it is not a good idea to rely on any one source alone and certainly not this one.

Friday, April 27, 2007

How Rankings Lead to a Decline in Quality

Geoffrey Alderman, currently a visiting research fellow at the Institute of Historical Research, University of London, has an article in the Guardian about the decline of standards in British universities. He refers to a case at Bournemouth University where officials overrode a decision by a professor and examination board to fail thirteen students. Apparently, the officials thought it unreasonable that students were required to do any reading to pass the course. He also comments on the remarkable increase in the number of first-class degrees at the University of Liverpool. Professor Alderman is clear that part of the problem is with the current obsession with rankings:


"Part of the answer lies in the league-table culture that now permeates the sector. The more firsts and upper seconds a university awards, the higher its ranking is likely to be. So each university looks closely at the grading criteria used by its league-table near rivals, and if they are found to be using more lenient grading schemes, the argument is put about that "peer" institutions must do the same. The upholding of academic standards is thus replaced by a grotesque "bidding" game, in which standards are inevitably sacrificed on the alter of public image - as reflected in newspaper rankings."

Similarly, it seems that in the US large numbers of students are being pushed through universities for no other reason than to improve graduation rates and therefore scores on the US News and World Report rankings.

Tuesday, April 24, 2007

Comparison of the THES-QS "Peer Review" and Citations per Faculty Scores

QS Quacquarelli Symonds, the consultants responsible for the THES-QS World University Rankings, have now placed data for 540 universities, complete with scores for the various components, on their topuniversities website (registration required). This reveals more dramatically than before the disparity between scores by some universities on the “peer review” and scores for citations per faculty, a measure of research quality. Below are the top 20 universities in the world according to the THES-QS “peer review” by research-active academics who were asked to select the universities that are best for research. In curved brackets to the right is the position of the universities in the 2006 rankings according to the number of citations per faculty.

Notice that some universities, including Sydney, Melbourne, Australian National University and the National University of Singapore perform dramatically better on the peer review than on citations per faculty. Melbourne, rated the tenth best university in the world for research by the THES-QS peer reviewers, is 189th for citations per faculty while the National University of Singapore, in twelfth place for the peer review, comes in at 170th for citations per faculty. The most devastating disparity is for Peking University, 11th on the “peer review” and 352nd for citations, behind, among others, Catania, Brunel, Sao Paulo, Strathclyde and Jyväskylä. Once again, this raises the question of how universities whose research is regarded so lightly by other researchers could be voted among the best for research. Oxford, Cambridge and Imperial College London are substantially overrated by the peer review. Kyoto is somewhat overrated while the American universities, with the exception of Chicago, have roughly the same place that would be indicated by the citations per faculty position.

Of course, part of the problem could be with the citations per faculty. I am fairly confident that the data for citations, which are collected by Evidence Ltd, are accurate, but less certain about the number of faculty. I have noted already that if a university increases its score for student-faculty ratio by increasing the number of faculty, it would suffer a drop in the citations per faculty score. For most universities the trade-off would be worth it, since the gap between the very good and the good is much greater for citations than for student-faculty ratio. So, if there has been some inflating of the number of faculty, however and by whomever it was done, then this would have an adverse impact on the figures for citations per faculty.

I have therefore included the positions of these universities according to their score for articles in the Science Citation Index-expanded and Social Science Citation Index in 2005 in the Shanghai Jiao Tong rankings. This is not the same as the THES measure. It covers one year only and is based on the number of papers, not number of citations. It therefore measures overall research output of a certain minimum quality rather than the impact of that research on other researchers. The position according to this index is indicated in square brackets.

We can see that Cambridge and Oxford do not do as badly as they did on citations per faculty. Perhaps they produced research characterised by quantity more than quality, or perhaps the difference is a result of inflated faculty numbers. Similarly, the performance of Peking, the National University of Singapore, Melbourne and Sydney is not as mediocre on this measure as it is on THES's citations per faculty.

Nonetheless the disparity still persists. Oxford, Cambridge, Imperial College and universities in Asia and Australia are still overrated by the THES-QS review.

1. Cambridge (46) [15]
2. Oxford (63) [17]
3. Harvard (2) [1]
4. Berkeley (7) [9]
5. Stanford (3) [10]
6. MIT (4) [29]
7. Yale (20) [27]
8. Australian National University (83) [125]
9. Tokyo (15) [2]
10. Melbourne (189) [52]
11. Peking (352) [50]
12. National University of Singapore (170) [111]
13. Princeton (10) [96]
14. Imperial College London (95) [23]
15. Sydney (171) [46]
16. Toronto (18) [3]
17. Kyoto (42) [8]
18. Cornell (16)
19. UCLA (19) [21]
20. Chicago (47) [55]
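One way to put a number on the disparity is a rank correlation between the peer review ordering (1 to 20 above) and the citations-per-faculty positions given in curved brackets. A quick sketch in Python, using nothing but the figures listed above:

# Spearman rank correlation between the peer review order and the
# citations-per-faculty positions listed above.
from scipy.stats import spearmanr

peer_review_order = list(range(1, 21))
citations_position = [46, 63, 2, 7, 3, 4, 20, 83, 15, 189,
                      352, 170, 10, 95, 171, 18, 42, 16, 19, 47]

rho, p = spearmanr(peer_review_order, citations_position)
print("Spearman rho = %.2f, p = %.3f" % (rho, p))
# A weak or non-significant rho would be further evidence that the "peer
# review" and the citations measure are not telling the same story.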

Monday, April 23, 2007

Book Review in the THES

A new book on university rankings, The World-Class University and Ranking: Aiming Beyond Status, has appeared and has been reviewed by Martin Ince in the Times Higher Education Supplement (THES). It is edited by Liu Nian Cai of Shanghai Jiao Tong University and Jan Sadlak. You can read the review here (subscription required). I hope eventually to review the book myself.

I must admit to being rather amused by one comment by Ince. He says:

"Although one of its editors is the director of the Shanghai rankings, The World-Class University and Ranking largely reflects university concerns at being ranked. Many contributors regard ranking as an unwelcome new pressure on academics and institutions. Much is made of the "Berlin principles" for ranking, a verbose and pompous 16-point compilation that includes such tips as "choose indicators according to their relevance and validity". The Shanghai rankings themselves fall at the third principle, the need to recognise diversity, because they rank the world's universities almost exclusively on science research. But the principles are silent on the most important point they should have contained - the need for rankings to be independent and not be produced by universities or education ministries."

I would not argue about the desirability of rankings being independent of university or government bureaucracies but there is far greater danger in rankings that are dominated by the commercial interests of newspapers.

Friday, April 20, 2007

Rankings to be Investigated

The Guardian has announced that the Higher Education Funding Council for England (Hefce) will investigate university league tables. Its chief executive, David Eastwood, has announced that the council will examine the rankings produced by the Guardian, the Times and the Sunday Times, and whether university policies are influenced by attempts to improve their scores.

The report continues:

'World tables compiled by Shanghai Jiao Tong University and the Times Higher Education Supplement will also be surveyed. The University of Manchester, for example, has made it clear that its strategy is to climb the international rankings, which include factors like the number of Nobel prizewinners. The university has pledged to recruit five Nobel laureates in the next few years.

Prof Eastwood said league tables were now part of the higher education landscape "as one of a number of sources of information available to potential students".

He added: "Hefce has an interest in the availability of high quality useful information for students and the sector's other stakeholders. The league table methodologies are already the subject of debate and academic comment. We plan to commission up-to-date research to explore higher education league table and ranking methodologies and the underlying data, with the intention of stimulating informed discussion.'


Thursday, April 19, 2007

More about Internationalisation

There is a very interesting piece by Ahmad Ismail at highnetworth -- acknowledgement to Education in Malaysia -- that argues that the economic value of an overseas university education for a Malaysian student is minimal. There are, no doubt, going to be questions about the assumptions behind the financial calculations and there are of course other reasons for studying abroad.


Even so, if students themselves typically gain little or nothing economically from studying in another country, if their parents suffer a great deal, and if students or taxpayers in the host country in one way or another have to pick up the tab (at Cornell it takes USD 1,700 to recruit an international student), then one wonders what internationalisation has to do with university quality. And one wonders why THES and QS consider it so important.
How Rankings Produce Distortions

A letter to the Cornell Daily Sun from Mao Ye, a student-elected trustee, suggests increasing the recruitment of international students in order to boost the university's position in the US News and World Report rankings.

The question arises of whether the international students would add anything to the quality of an institution. If they do, then surely they would be recruited anyway. But if students are admitted for no other reason than to boost scores in the rankings, they may well contribute to a decline in the overall quality of the students.


"Two critical ways to improve Cornell’s ranking are to increase the number of applications and to increase the yield rate of admitted students. To achieve this goal, no one can overlook the fact that international applications to all U.S. institutions have recently increased at a very fast pace. For Cornell, the applications from China increased by 42.9 percent in 2005 and 47.5 percent in 2006. We also saw a 40 percent increase in applications from India last year. By my estimation, if international applications continue to grow at the current rate, in 10 years there will be more than 10,000 foreign applications received by the Cornell admissions office. Therefore, good performance in the international market will have a significant positive impact on our ranking in U.S. News and World Report.

How might we get more international students to apply? It’s actually very easy. We can have different versions of application materials, each in various students’ native languages, highlighting Cornell’s achievements in that country and addressing the specific concerns of students from that country. I checked the price and realized we do not need more than $500 to translate the whole application package into Chinese. If we focus translation on the crucial information for Chinese applicants, the cost is as low as $50. Comparatively, this is lower than the cost of recruiting one undergraduate student to a university, which costs an average of $1,700 per student, based on the calculations of Prof. Ronald Ehrenburg, industrial and labor relations. Staff, students, parents and Cornell as a whole will all benefit greatly from this plan."

Monday, March 05, 2007

Top Universities Ranked by Research Impact

The THES – QS World Universities Rankings, and their bulky offspring, the Guide to the World's Top Universities (London: QS Quacquarelli Symonds), are strange documents, full of obvious errors and repeated contradictions. Thus, we find that the Guide has data about student-faculty ratios that are completely different from those used in the top 200 rankings published in the THES, while talking about how robust such a measure is. Also, if we look at the Guide we notice that for each of the top 100 universities it provides a figure for research impact, that is, the number of citations divided by the number of papers. In other words, it indicates how interesting other researchers found the research of each institution. These figures completely undermine the credibility of the "peer review" as a measure of research expertise.

The table below is a re-ranking of the THES top 100 universities for 2006 by research impact and therefore by overall quality of research. This is not by any means a perfect measure. For a start, the natural sciences and medicine do a lot more citing than other disciplines, and this might favour some universities more than others. Nonetheless it is very suggestive, and it is so radically different from the THES-QS peer review and the overall ranking that it provides further evidence of the invalidity of the latter.

Cambridge and Oxford, ranked second and third by THES-QS, only manage to achieve thirtieth and twenty-first places for research impact.

Notice that in comparison to their research impact scores the following universities are overrated by THES-QS: Imperial College London, Ecole Normale Superieure, Ecole Polytechnique, Peking, Tsing Hua, Tokyo, Kyoto, Hong Kong, Chinese University of Hong Kong, National University of Singapore, Nanyang Technological University, Australian National University, Melbourne, Sydney, Monash, Indian Institutes of Technology, Indian Institutes of Management.

The following are underrated by THES-QS: Washington University in St Louis, Pennsylvania State University, University of Washington, Vanderbilt, Case Western Reserve, Boston, Pittsburgh, Wisconsin, Lausanne, Erasmus, Basel, Utrecht, Munich, Wageningen, Birmingham.

The number on the left is the ranking by research impact, i.e. the number of citations divided by the number of papers. The number to the right of the universities is the research impact. The number in brackets is the overall ranking in the THES-QS 2006 rankings.
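For readers who want to reproduce this kind of re-ranking, the calculation is nothing more than citations divided by papers, sorted in descending order. The counts in the sketch below are invented placeholders, since the Guide prints only the resulting impact figure for each university.

# Research impact = citations / papers, then rank in descending order.
# The (citations, papers) counts here are placeholders, not the Guide's data.
counts = {
    "University A": (412000, 9975),
    "University B": (180000, 7200),
    "University C": (95000, 6100),
}

impact = {name: c / p for name, (c, p) in counts.items()}
ranked = sorted(impact.items(), key=lambda item: item[1], reverse=True)
for position, (name, score) in enumerate(ranked, start=1):
    print(position, name, round(score, 1))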

1 Harvard 41.3 (1 )
2 Washington St Louis 35.5 (48 )
3 Yale 34.7 (4 )
4 Stanford 34.6 (6 )
5 Caltech 34 (7 )
6 Johns Hopkins 33.8 (23 )
7 UC San Diego 33 (44)
8 MIT 32.8 (4)
9= Pennsylvania State University 30.8 (99)
9= Princeton 30.8 (10)
11 Chicago 30.7 (11)
12= Emory 30.3 (56)
12= Washington 30.3 (84)
14 Duke 29.9 (13 )
15 Columbia 29.7 (12 )
16 Vanderbilt 29.4 (53)
17 Lausanne 29.2 (89 )
18 University of Pennsylvania 29 (26)
19 Erasmus 28.3 (92)
20 UC Berkeley 28 (8)
21= UC Los Angeles 27.5 (31)
21= Oxford 27.5 (3)
23 Case Western Reserve 27.4 (60)
24 Boston 27.2 (66)
25 Pittsburgh 27.1 (88 )
26 Basel 26.7 (75 )
27= New York University 26.4 (43)
27= Texas at Austin 26.4 (32 )
29 Geneva 26.2 (39 )
30= Northwestern 25.8 (42 )
30= Cambridge 25.8 (2)
32 Dartmouth College 25.6 (61)
33 Cornell 25.5 (15 )
34 Rochester 25.1 (48 )
35 Michigan 25 (29)
36 University College London 24.9 (25 )
37 Brown 24.1 (54)
38 McGill 23.6 (21)
39 Edinburgh 23.4 (33 )
40 Toronto 23 (27 )
41 Amsterdam 21.6 (69 )
42 Wisconsin 21.5 (79 )
43= Utrecht 21.4 (95)
43= Ecole Normale Superieure Lyon 21.4 (72)
45 ETH Zurich 21.2 (24 )
46 Heidelberg 20.8 (58 )
47 British Columbia 20.6 (50 )
48 Carnegie Mellon 20.5 (35 )
49= Imperial College London 20.4 (9)
49= Ecole Normale Superieure Paris 20.4 (18 )
51 King’s College London 20.1 (48 )
52 Bristol 20 (64)
53= Trinity College Dublin 19.9 (78 )
53= Copenhagen 19.9 (54 )
53= Glasgow 19.9 (81 )
56 Munich 19.8 (98)
57 Technical University Munich 19.4 (82 )
58= Birmingham 19.1 (90)
58= Catholic University of Louvain 19.1 (76 )
60 Tokyo 18.7 (19)
61 Illinois 18.6 (77 )
62 Osaka 18.4 (70)
63 Wageningen 18.1 (97 )
64 Kyoto 18 (29 )
65 Australian National University 17.9 (16 )
66 Vienna 17.9 (87)
67 Manchester 17.3 (40 )
68 Catholic University of Leuven 17 (96)
69= Melbourne 16.8 (22 )
69= New South Wales 16.8 (41 )
71 Nottingham 16.6 (85 )
72 Sydney 15.9 (35)
73= Pierre-et-Marie-Curie 15.7 (93 )
73= Monash 15.7 (38)
75 Otago 15.5 (79 )
76 Queensland 15.3 (45)
77 Auckland 14.8 (46 )
78= EPF Lausanne 14.3 (64 )
78= MacQuarie 14.3 (82 )
78= Leiden 14.3 (90 )
81 Eindhoven University of Technology 13.4 (67)
82= Warwick 13.3 (73 )
82= Delft University of Technology 13.3 (86)
84 Ecole Polytechnique 13.2 (37 )
85 Hong Kong 12.6 (33 )
86 Hong Kong Uni Science and Technology 12.2 (58)
87 Chinese University of Hong Kong 11.9 (50 )
88 Seoul National University 10.9 (63)
89 National University of Singapore 10.4 (19 )
90 National Autonomous University of Mexico 9.8 (74)
91 Peking 8 (14)
92 Lomonosov Moscow State 6 (93 )
93 Nanyang Technological University 5.6 (61)
94 Tsing Hua 5.4 (28 )
95 LSE 4.4 (17 )
96 Indian Institutes of Technology 3 (57 )
97 SOAS 2.5 (70 )
98 Indian Institutes of Management 1.9 (68)
Queen Mary London -- (99 )
Sciences Po -- (52)

Thursday, March 01, 2007

THES-QS Bias Chart

LevDau has been kind enough to reproduce my post "More Problems with Method" and to add a couple of very interesting graphs. What he has done is to calculate a bias ratio, which is the number of THES-QS reviewers reported on the topuniversities site divided by the number of highly cited researchers listed by Thomson ISI. The higher the number, the more biased the THES-QS review is towards that country; the lower the number, the more biased it is against that country. Some countries do not appear because they did not have anybody at all in the highly cited list.

If we chose a less rigorous definition of research expertise, such as the number of papers published rather than the number of highly cited researchers, then the bias might be somewhat reduced. It would certainly not, however, be removed. In any case, if we are talking about the gold standard of ranking, then the best researchers would surely be most qualified to judge the merits of their peers.
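The ratio itself is trivial to compute from the two country lists in the "More Problems with Method" post below. A short sketch, using a handful of the figures reported there:

# Bias ratio = THES-QS survey respondents / Thomson ISI highly cited researchers.
# Counts are taken from the lists in the "More Problems with Method" post.
counts = {
    "USA": (532, 3825),
    "UK": (378, 439),
    "Australia": (191, 105),
    "India": (256, 11),
    "Japan": (53, 246),
}

for country, (respondents, highly_cited) in counts.items():
    print(country, round(respondents / highly_cited, 2))
# Ratios well above 1 indicate over-representation relative to research
# strength; ratios well below 1 indicate under-representation. Countries with
# no highly cited researchers cannot be given a ratio at all.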

Bias in the THES-QS peer review (Selected Countries)

Iran 25
India 23.27
Singapore 23
Pakistan 23
China 19
Mexico 17
South Korea 9
Taiwan 3.22
Australia 1.82
Hong Kong 1.79
Finland 1.53
New Zealand 1.47
France 1
UK 0.86
Israel 0.77
Germany 0.43
Japan 0.22
USA 0.14

Monday, February 26, 2007

Is it Really a Matter of Global Opinion?

The THES-QS rankings of 2005 and 2006 are heavily weighted towards its so-called peer review, which receives 40% of the total ranking score. No other section gets more than 20%. The "peer review" is supposed to be a survey of research-active academics from around the world. One would therefore expect it to be based on a representative sample of the international research community, or "global opinion", as THES claimed in 2005. It is, however, nothing of the sort.

The review was based on e-mails sent to people included on a database purchased from World Scientific Publishing Company. This is a publishing company that was founded in 1981. It now has 200 employees at its main office in Singapore. There are also subsidiary offices in New Jersey, London, Hong Kong, Taipei, Chennai, Beijing and Singapore. It claims to be the leading publisher of scientific journals and books in the Asia-Pacific region.


World Scientific has several subsidiaries. These include Imperial College (London) Press, which publishes books and journals on engineering, medicine, information technology, environmental technology and management; Pan Stanford Publishing of Singapore, which publishes in such fields as nanoelectronics, spintronics, biomedical engineering and genetics; and KH Biotech Services Singapore, which specialises in biotechnology, pharmaceuticals, food and agriculture, as well as consultancy, training and conference organisation services. It also distributes books and journals produced for The National Academies Press (based in Washington, D.C.) in most countries in Asia (but not in Japan).

World Scientific has particularly close links with China, especially with Peking University. Their newsletter of November 2005 reports that:

”The last few years has seen the rapid growth of China's economy and academic sectors. Over the years, World Scientific has been actively establishing close links and putting down roots in rapidly growing China”

Another report describes a visit from Chinese university publishers:

”In August 2005, World Scientific Chairman Professor K. K. Phua, was proud to receive a delegation from the China University Press Association. Headed by President of Tsinghua University Press Professor Li Jiaqiang, the delegation comprised presidents from 18 top Chinese university publishing arms. The parties exchanged opinions on current trends and developments in the scientific publishing industry in China as well as Singapore. Professor Phua shared many of his experiences and expressed his interest in furthering collaboration with Chinese university presses. “

World Scientific has also established very close links with Peking University:

”World Scientific and Peking University's School of Mathematical Sciences have, for many years, enjoyed a close relationship in teaching, research and academic publishing. To further improve the close cooperation, a "World Scientific - Peking University Work Room" has been set up in the university to serve the academic communities around the world, and to provide a publishing platform to enhance global academic exchange and cooperation. World Scientific has also set up a biannual "World Scientific Scholarship" in the Peking School of Mathematical Sciences. The scholarship, totaling RMB 30,000 per annum and administered by the university, aims to reward and encourage students and academics with outstanding research contributions.”

Here are some of the titles published by the company:

China Particuology
Chinese Journal of Polymer Science
Asia-Pacific Journal of Operational Research
Singapore Economic Review
China: An International Journal
Review of Pacific Basin Financial Markets and Policies
Asian Case Research Journal

It should be clear by now that World Scientific is active mainly in the Asia-Pacific region, with an outpost in London. It seems more than likely that its database, which might be the list of subscribers or its mailing list, would be heavily biased towards the Asia-Pacific region. This goes a long way towards explaining why Chinese, Southeast Asian and Australasian universities do so dramatically better on the peer review than they do on the citations count or any other measure of quality.

I find it inconceivable that QS were unaware of the nature of World Scientific when they purchased the database and sent out the e-mails. To claim that the peer review is in any sense an international survey is absurd. QS have produced what may some day become a classic example of how bad sampling technique can destroy the validity of any survey.

Monday, February 19, 2007

Inflatable Australian Universities

Professor Simon Marginson of the University of Melbourne has made some very appropriate comments to The Age about the THES - QS rankings and Australian universities.

Professor Marginson told The Age that a lack of transparency in the rankings method means that universities could be damaged through no fault of their own.

'"Up to now, we in Australian universities have done better out of the Times rankings than our performance on other indicators would suggest," he said. "But it could all turn around and start working against us, too."
The Times rankings are volatile because surveys of employers and academics are open to manipulation, subjectivity and reward marketing over research, Professor Marginson said.'

The admitted extraordinarily low response rate to the THES-QS "peer review", combined with the overrepresentation of Australian "research-active academics" among the respondents, is sufficient to confirm Professor Marginson's remarks about the rankings.

Friday, February 16, 2007

More Problems with Method

Another problem with the peer review section of the THES-QS World University Rankings is that it is extremely biased against certain countries and biased in favour of certain others. Here is an incomplete list of countries where respondents to the peer review survey are located and the number of respondents.

USA 532
UK 378
India 256
Australia 191
Canada 153
Malaysia 112
Germany 103
Indonesia 93
Singapore 92
China 76
France 56
Japan 53
Mexico 51
Thailand 37
Israel 36
Iran 31
Taiwan 29
South Korea 27
Hong Kong 25
New Zealand 25
Pakistan 23
Finland 23
Nigeria 20


How far does the above list reflect the distribution of research expertise throughout the world? Here is a list of the same countries with the number of academics listed in Thomson ISI Highly Cited Researchers.


USA 3,825
UK 439
India 11
Australia 105
Canada 172
Malaysia 0
Germany 241
Indonesia 0
Singapore 4
China (excluding Hong Kong) 4
France 56
Japan 246
Mexico 3
Thailand 0
Israel 47
Iran 1
Taiwan 9
South Korea 3
Hong Kong 14
New Zealand 17
Pakistan 1
Finland 15
Nigeria 0


The number of highly cited scholars is not a perfect measure of research activity -- for one thing, some disciplines cite more than others -- but it does give us a broad picture of the research expertise of different countries.

The peer review is outrageously biased against the United States, extremely biased against Japan, and very biased against Canada, Israel, and European countries like France, Germany, Switzerland and the Netherlands.


On the other hand, there is a strong bias towards China (less so Taiwan and Hong Kong), India, Southeast Asia and Australia.

Now we know why Cambridge does so much better in the peer review than Harvard despite an inferior research record, why Peking University is apparently among the best in the world, why there are so many Australian universities in the top 200, and why the world's academics supposedly cite Japanese researchers copiously but cannot bring themselves to vote for them in the peer review.

Thursday, February 15, 2007

Something Needs Explaining

QS Quacquarelli Symonds have published additional information on their web site concerning the selection of the initial list of universities and the administration of the "peer review". I would like to focus on just one issue for the moment, namely the response rate to the e-mail survey. Ben Sowter of QS had already claimed to have surveyed more than 190,000 academics to produce the review. He had said:

"Peer Review: Over 190,000 academics were emailed a request to complete our online survey this year. Over 1600 responded - contributing to our response universe of 3,703 unique responses in the last three years. Previous respondents are given the opportunity to update their response." (THES-QS World University Rankings _ Methodology)

This is a response rate of about 0.8%, less than 1%. I had assumed that the figure of 190,000 was a typographical error and that it should have been 1,900. A response rate of over 80% would have been on the high side, but perhaps respondents were highly motivated by being included in the ranks of "smart people" or winning a BlackBerry organiser.
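The arithmetic behind the two readings is simple enough:

# Response rate under the two readings of QS's figures.
responses = 1600
surveyed_as_stated = 190000  # the figure QS gives
surveyed_if_typo = 1900      # my earlier assumption of a typographical error

print(round(100 * responses / surveyed_as_stated, 1))  # about 0.8 per cent
print(round(100 * responses / surveyed_if_typo, 1))    # just over 84 per cent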

However, the new information provided appears to suggest that QS did survey such a large number.

"So, each year, phase one of the peer review exercise is to invite all previous reviewers to return and update their opinion. Then we purchase two databases, one of 180,000 international academics from the World Scientific (based in Singapore) and another of around 12,000 from Mardev - focused mainly on Arts & humanities which is poorly represented in the former.
We examine the responses carefully and discard any test responses and bad responses and look for any non-academic responses that may have crept in. "
(Methodology-- The Peer Review)

There is a gap between "we purchase" and "we examine the responses", but the implication is that about 192,000 academics were sent e-mails.

If this is the case then we have an extraordinarily low response rate, probably a record in the history of survey research. Kim Sheehan, in an article in the Journal of Computer-Mediated Communication, reports that 31 studies of e-mail surveys show a mean response rate of about 37%. Response rates have been declining in recent years, but even in 2004 the mean response rate was about 24%.

Either QS did not send out so many e-mails, or there was something wrong with the database, or something else is wrong. Whatever it is, such a low response rate is in itself enough to render a survey invalid. An explanation is needed.

Wednesday, February 14, 2007

Congratulations to the Technical University Munich

The Technical University of Munich has pulled off a major feat. It has been awarded not one but two places among the world's top 100 universities in the THES-QS book, Guide to the World's Top Universities. The Guide has also managed to move a major university several hundred miles.


In 2006 the THES -- QS world university rankings placed the Technical University of Munich in 82nd place and the University of Munich at 98th.

The new THES-QS Guide has profiles of the top 100 universities. On page 283 and in 82nd place we find the Technical University Munich. Its address is given as "Germany". How very helpful. The description is clearly that of the Technical University and so is the data in the factfile.


On page 313 the Technical University Munich appears again, now in 98th place. The description is identical to that on page 283 but the information in the factfile is different and appears to refer to the (Ludwig-Maximilians) University of Munich. The university is given an address in Dortmund, in a completely different state, and the web site appears to be that of the University of Munich.

Turning to the directory, we find that "Universitat Munchen" is listed, again with an address in Dortmund, and the Technische Universität München is on page 409, without an address. This time the data for the two universities appears to be correct.

Sunday, February 11, 2007

A Robust Measure

There is something very wrong with the THES-QS Guide to the World’s Top Universities, recently published in association with Blackwell’s of London. I am referring to the book’s presentation of two completely different sets of data for student faculty ratio.

In the Guide, it is claimed that this ratio “is a robust measure and is based on data gathered by QS from universities or from national bodies such as the UK’s Higher Education Statistics Agency, on a prescribed definition of staff and students” (p 75).


Chapter 9 of the book consists of the ranking of the world's top 200 universities originally published in the THES in October 2006. The rankings consist of an overall score for each university and scores for various components, one of which is the number of students per faculty. This component accounted for 20% of the total ranking. Chapter 11 consists of profiles of the top 100 universities, which, among other things, include data for student faculty ratio. Chapter 12 is a directory of over 500 universities which, in most cases, also includes the student faculty ratio.

Table 1 below shows the top ten universities in the world according to the faculty student score in the university rankings, which is indicated in the middle column. It is possible to reconstruct the process by which the scores in the THES rankings were calculated by referring to QS's topuniversities site, which provides information, including numbers of students and faculty, about each university in the top 200, as well as more than 300 others.

There can be no doubt that the data on the web site is that from which the faculty student score has been calculated. Thus Duke has, according to QS, 11,106 students and 3,192 faculty, a ratio of 3.48 students per faculty, which was converted to a score of 100. Harvard has 24,648 students and 3,997 faculty, a ratio of 6.17, which was converted to a score of 56. MIT has 10,320 students and 1,253 faculty, a ratio of 8.24, converted to a score of 42, and so on. The scores thus appear to be scaled so that the lowest ratio receives 100 and every other university receives 100 multiplied by the best ratio divided by its own. There seems, incidentally, to have been an error in calculating the score for Princeton. The right-hand column in Table 1 shows the ratio of students per faculty, based on the data provided in the rankings, for the ten universities with the best score on this component.

Table 1

1. Duke ................................................. 100 .......... 3.48
2. Yale .................................................. 93 .......... 3.74
3. Eindhoven University of Technology ..... 92 .......... 3.78
4. Rochester .......................................... 91 .......... 3.82
5. Imperial College London ..................... 88 .......... 4.94
6. Sciences Po Paris ............................... 86 .......... 4.05
7= Tsing Hua, PRC ................................. 84 .......... 4.14
7= Emory .............................................. 84 .......... 4.14
9= Geneva ............................................. 81 .......... 4.30
9= Wake Forest ...................................... 81 .......... 4.30
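The scaling just described can be verified with a short script. This is only a sketch of what QS appear to have done; the student and faculty figures are those quoted above from the topuniversities site, and the rounding to whole numbers is my assumption:

```python
# Reconstructing the faculty-student scores from the QS student and faculty counts.
data = {
    "Duke":    (11106, 3192),
    "Harvard": (24648, 3997),
    "MIT":     (10320, 1253),
}

ratios = {name: students / faculty for name, (students, faculty) in data.items()}
best = min(ratios.values())   # the lowest ratio (Duke's 3.48) appears to be scored 100

for name, ratio in ratios.items():
    score = round(100 * best / ratio)
    print(f"{name:8s} ratio {ratio:5.2f}  score {score:3d}")
# Duke     ratio  3.48  score 100
# Harvard  ratio  6.17  score  56
# MIT      ratio  8.24  score  42
```

Running this reproduces the published scores of 100, 56 and 42 for Duke, Harvard and MIT.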

Table 2 shows the eleven best universities ranked for students per faculty according to the profile and directory in the Guide. It may need to be revised after another search. You will notice immediately that there is no overlap at all between the two lists. The student faculty ratio in the profile and directory is indicated in the right hand column.

Table 2

1. Kyongpook National University, Korea .......... 0
2. University of California at Los Angeles (UCLA) .......... 0.6
3= Pontificia Universidade Catolica do Rio de Janeiro, Brazil .......... 3.8
3= Ecole Polytechnique Paris .......... 3.8
5. Ljubljana, Slovenia .......... 3.9
6= Kanazawa, Japan .......... 4.0
6= Oulu, Finland .......... 4.0
8= Edinburgh .......... 4.1
8= Trento, Italy .......... 4.1
10= Utrecht, Netherlands .......... 4.3
10= Fudan, PRC .......... 4.3

The figures for Kyongpook and UCLA are obviously simple data entry errors. The figure for Ecole Polytechnique might not be grotesquely wrong if part-timers were included. But I remain very sceptical about such low ratios for universities in Brazil, China, Finland and Slovenia.

Someone who was looking for a university with a commitment to teaching would end up with dramatically different results depending on whether he or she checked the rankings or the profile and directory. A search of the first would produce Duke, Yale, Eindhoven and so on. A search of the second would produce (I’ll assume even the most naïve student would not believe the ratios for Kyongpook and UCLA) Ecole Polytechnique, Ljubljana, Kanazawa and so on.

Table 3 below compares the figures for student faculty ratio derived from the rankings on the left with those given in the profile and directory sections of the Guide, on the right.

Table 3.

Duke .......................................... 3.48 .......... 16.7
Yale ........................................... 3.74 .......... 34.3
Eindhoven University of Technology ...... 3.78 .......... 31.1
Rochester ..................................... 3.82 .......... 7.5
Imperial College London ................... 4.94 .......... 6.6
Sciences Po, Paris ........................... 4.05 .......... 22.5
Tsing Hua ..................................... 4.14 .......... 9.3
Emory ......................................... 4.14 .......... 9.9
Geneva ........................................ 4.30 .......... 8.4
Wake Forest .................................. 4.30 .......... 16.1
UCLA .......................................... 10.20 .......... 0.6
Ecole Polytechnique, Paris ................. 5.4 .......... 3.8
Edinburgh ..................................... 8.3 .......... 4.1
Utrecht ........................................ 13.9 .......... 4.3
Fudan ......................................... 19.3 .......... 4.3

There seems to be no relationship whatsoever between the ratios derived from the rankings and those given in the profiles and directory.
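A quick way to check is to correlate the two columns of Table 3. A minimal sketch in Python, using the fifteen pairs of figures exactly as given above (any statistics package would do the same job):

```python
# Pearson correlation between the two sets of ratios in Table 3.
from math import sqrt

rankings = [3.48, 3.74, 3.78, 3.82, 4.94, 4.05, 4.14, 4.14, 4.30, 4.30,
            10.20, 5.4, 8.3, 13.9, 19.3]   # ratios reconstructed from the rankings
guide    = [16.7, 34.3, 31.1, 7.5, 6.6, 22.5, 9.3, 9.9, 8.4, 16.1,
            0.6, 3.8, 4.1, 4.3, 4.3]       # ratios from the profiles and directory

n = len(rankings)
mx, my = sum(rankings) / n, sum(guide) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(rankings, guide))
sxx = sum((x - mx) ** 2 for x in rankings)
syy = sum((y - my) ** 2 for y in guide)

print(f"Pearson r = {sxy / sqrt(sxx * syy):.2f}")
```

For this selection the coefficient comes out at roughly -0.5: if anything a negative association, and certainly not the strong positive relationship one would expect if both columns measured the same thing.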

Logically, there are three possibilities. The ranking data is wrong. The directory data is wrong. Both are wrong. It is impossible for both to be correct.

In a little while, I shall try to figure out where QS got the data for both sets of statistics. I am beginning to wonder, though, whether they got them from anywhere.

To call the faculty student score a robust measure is ridiculous. As compiled and presented by THES and QS, it is as robust as a pile of dead jellyfish.

Friday, February 09, 2007

Guide to the World’s Top Universities

Guide to the World’s Top Universities: Exclusively Featuring the Official Times Higher Education Supplement QS World University Rankings. John O’Leary, Nunzio Quacquarelli and Martin Ince (QS Quacquarelli Symonds Limited/Blackwell Publishing 2006)

Here are some preliminary comments on the THES-QS guide. A full review will follow in a few days.

The Times Higher Education Supplement and QS Quacquarelli Symonds have now produced a book, published in association with Blackwell’s. The book incorporates the 2006 world university rankings of 200 universities and the rankings by peer review of the top 100 universities in disciplinary areas. It also contains chapters on topics such as choosing a university, the benefits of studying abroad and tips for applying to university. There are profiles of the top 100 universities in the THES-QS rankings and a directory containing data about over 500 universities.

The book is attractively produced and contains a large amount of information. A superficial glance would suggest that it would be a very valuable resource for anybody thinking about applying to university or anybody comparing universities for any reason. Unfortunately, this would be a mistake.

There are far too many basic errors. Here is a list, almost certainly incomplete. Taken individually they may be trivial but collectively they create a strong impression of general sloppiness.

“University of Gadjah Mada” (p91). Gadjah Mada was a person, not a place.

In the factfile for Harvard (p119) the section Research Impact by Subject repeats information given in the previous section on Overall Research Performance.

The factfile for Yale (p 127) reports a Student Faculty Ratio of 34.3, probably ten times too high.

The directory (p 483) provides data about something called the “Official University of California, Riverside”. No doubt someone was cutting and pasting from the official university website.

Zurich, Geneva, St Gallen and Lausanne are listed as being in Sweden (p 462-3).

Kyungpook National University, Korea, has a Student faculty Ratio of 0:1. (p 452)

New Zealand is spelt New Zeland (p441).

There is a profile for the Indian Institutes of Technology [plural] (p 231) but the directory refers to only one in New Delhi (p 416).

Similarly, there is a profile for the Indian Institutes of Management [plural] (p 253) but the directory refers to only one, in Lucknow (p416).

On p 115 we find the “University of Melbourneersity”.

On p 103 there is a reference to “SUNY” (State University of New York) that does not specify which of the four university centres of the SUNY system is referred to.

Malaysian universities are given the bahasa rojak (salad language) treatment and are referred to as University Putra Malaysia and University Sains Malaysia. (p437-438)

UCLA has a student faculty ratio of 0.6:1 (p483).

There will be further comments later.



Monday, February 05, 2007

The Rise of Seoul National University

One remarkable feature of the THES-QS world university rankings has been the rise of the Seoul National University (SNU) in the Republic of Korea from 118th place in 2004 to 93rd in 2005 and then to 63rd in 2006. This made SNU the eleventh best university in Asia in 2006 and placed it well above any other Korean university.

This was accomplished in part by a rise in the peer review score from 39 to 43. Also, SNU scored 13 on the recruiter rating compared with zero in 2005. However, the most important factor seems to be an improvement in the faculty student score from 14 in 2005 to 57 in 2006.

How did this happen? If we are to believe QS, it was because of a remarkable expansion in the number of SNU’s faculty. In 2005, according to QS’s topgraduate site, SNU had a total of 31,509 students and 3,312 faculty, or 9.51 students per faculty. In 2006, again according to QS, SNU had 30,120 students and 4,952 faculty, a ratio of 6.08. The numbers provided for students seem reasonable. SNU’s site refers to 28,074 students. It is not implausible that QS’s figures included some categories, such as non-degree, part-time or off-campus students, that were not counted by SNU.

The number of faculty is, however, another matter. The SNU site refers to 28,074 students and 1,927 full time equivalent faculty members. There are also “1,947 staff members”. It is reasonable to assume that the latter are non-teaching staff such as technicians and librarians.

Further down the SNU site, things begin to get confusing. As of 1st April 2006, according to the site, there were 3,834 “teaching faculty” and 1,947 “educational staff”. Presumably these are the same as the earlier 1,947 “staff members”.

The mystery now is how 1,927 full time equivalent faculty grew to 3,834 teaching faculty. The latter figure would seem to be completely wrong if only because one would expect teaching faculty to be fewer than total faculty.

Since 1,927 full time equivalent faculty plus 1,947 staff members adds up to 3,874, a little bit more than 3,834, it could be that “faculty” and “staff” were combined to produce a total for “teaching faculty”.

Another oddity is that SNU has announced on this site that it has a student-faculty ratio of 4.6. I am baffled as to how this particular statistic was arrived at.

QS should, I suppose, get some credit for not accepting this thoroughly implausible claim. Its ratio of 6.08 is, however, only slightly better and seems to depend on accepting a figure of 4,952 faculty. Unless somebody has been fabricating data out of very thin air, the most plausible explanation I can think of is that QS constructed the faculty statistic from a source that did something like taking the already inflated number of teaching faculty and then adding the professors. Perhaps the numbers were obtained in the course of a telephone conversation over a bad line.

And the real ratio? On the SNU site there is a "visual statistics page" that refers to 1,733 "faculty members" in 2006. This seems plausible. Also, just have a look at what the MBA Dauphine-Sorbonne-Renault programme, which has partnerships with Asian and Latin American universities, says:

"Founded in 1946, Seoul National University (SNU) marked the opening of the first national university in modern Korean history. As an indisputable leader of higher education in Korea, SNU has maintained the high standard of education in liberal arts and sciences. With outstanding cultural and recreational benefits, SNU offers a wide variety of entertainment opportunities in the college town of Kwanak and in the city of Seoul.
SNU began with one graduate school and nine colleges, and today SNU has 16 colleges, 3 specialized graduate schools , 1 graduate school, 93 research institutes, and other supporting facilities, which are distributed over 2 campuses.
Currently, SNU has a student enrollment of approximately 30,600 degree candidates, including 27,600 undergraduates and 3,000 graduates. Also SNU has approximately 700 foreign students from 70 different countries. Maintaining the faculty of student ratio of 1:20, over 1,544 faculty and around 30 foreign professors are devoted to teaching SNU students to become leaders in every sector of Korean Society.
With the ideal of liberal education and progressive visions for the 21st century, SNU will continue to take a leading position as the most prestigious, research-oriented academic university in South Korea. " (my italics)

A student-faculty ratio of around 20 seems far more realistic than the 4.6 claimed by SNU or QS's 6.08. An explanation would seem to be in order from SNU and from QS.
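To pull the arithmetic of this post together, here is a minimal sketch. The student and faculty figures are simply those quoted above; which categories of staff and students each figure covers is, of course, exactly the question at issue:

```python
# Student-faculty ratios implied by the various figures quoted in this post.
students_qs  = 30120   # students according to QS, 2006
students_snu = 28074   # students according to SNU's own site

# QS's own ratio, from its own student and faculty figures
print(f"QS: {students_qs / 4952:.2f} students per faculty")   # 6.08

# Ratios implied by the other faculty figures mentioned above,
# using SNU's own student count
faculty_figures = {
    "full-time equivalent faculty": 1927,
    "'teaching faculty'":           3834,
    "visual statistics page":       1733,
    "MBA programme page":           1544,
}
for source, faculty in faculty_figures.items():
    print(f"{source:30s} {students_snu / faculty:5.1f} students per faculty")
# The full-time equivalent and visual statistics figures give ratios of about
# 15 to 16, and the MBA page's figure about 18, all far closer to 1:20 than
# to QS's 6.08 or SNU's claimed 4.6.
```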