Tuesday, February 20, 2018

Is Erdogan Destroying Turkish Universities?


An article by Andrew Wilks in The National claims that the position of Turkish universities in the Times Higher Education (THE) world rankings, especially that of Middle East Technical University (METU), has been declining as a result of the crackdown by President Erdogan following the unsuccessful coup of July 2016.

He claims that Turkish universities are now sliding down the international rankings and that this is because of the decline of academic freedom, the dismissal or emigration of many academics, and a consequent decline in the country's academic reputation.


'Turkish universities were once seen as a benchmark of the country’s progress, steadily climbing international rankings to compete with the world’s elite.
But since the introduction of emergency powers following a failed coup against President Recep Tayyip Erdogan in July 2016, the government’s grip on academic freedom has tightened.
A slide in the nation's academic reputation is now indisputable. Three years ago, six Turkish institutions [actually five] were in the Times Higher Education’s global top 300. Ankara's Middle East Technical University was ranked 85th. Now, with Oxford and Cambridge leading the standings, no Turkish university sits in the top 300.
Experts say at least part of the reason is that since the coup attempt more than 5,800 academics have been dismissed from their jobs. Mr Erdogan has also increased his leeway in selecting university rectors.
Gulcin Ozkan, formerly of Middle East Technical University but now teaching economics at York University in Britain, said the wave of dismissals and arrests has "forced some of the best brains out of the country".'
I have no great regard for Erdogan but in this case he is entirely innocent.

There has been a massive decline in METU's position in the THE rankings since 2014 but that is entirely the fault of THE's methodology. 

In the world rankings of 2014-15, published in 2014, METU was 85th in the world, with a whopping score of 92.0 for citations, an indicator that carries an official weighting of 30%. That score was the result of METU's participation in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations. In 2014 THE credited every single contributor with all of a paper's citations. Added to this was a regional modification that boosted the scores of universities located in countries with a low citation impact.

In 2015, THE revamped its methodology, no longer counting the citations to these mega-papers and applying the regional modification to only half of the research impact score.
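A stylised sketch of why that change was so consequential for a university with a small output. All figures here are hypothetical, chosen only to show the scale of the effect:

```python
# Stylised sketch of THE's old vs new treatment of mega-paper
# citations. All figures are hypothetical, chosen only to show scale.

ordinary_citations = 3_000      # citations to a modest ordinary output
lhc_paper_citations = 4_000     # one LHC-style paper, thousands of authors

# Pre-2015: every listed contributor was credited with ALL the citations.
old_total = ordinary_citations + lhc_paper_citations

# 2015-16: citations to such mega-papers were excluded altogether.
new_total = ordinary_citations + 0

print(old_total, new_total)   # 7000 vs 3000: the mega-paper bonus vanishes
```

For a university whose ordinary output is small, a single such paper can dominate the indicator, which is why the 2015 change produced such dramatic falls.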

As a result, in the 2015-16 rankings METU crashed to the 501-600 band, with a score for citations of only 28.8. Other Turkish universities had also been involved in the LHC project and benefited from the citations bonus, and they too plummeted. There was now only one Turkish university in the THE top 300.

METU's exalted position in the THE 2014-15 rankings was the result of THE's odd methodology, and its spectacular tumble was the result of changing that methodology. In other popular rankings METU seems to be slipping a bit, but it never goes as high as it did in THE in 2014 or as low as in 2015.

In the QS world rankings METU was in the 401-410 band in 2014-15; by 2017-18 it had fallen to 471-480.

The Russian Round University Rankings had it at 375 in 2014 and 407 in 2017. The US News Best Global Universities placed it 314th last year.

Erdogan had nothing to do with it.


Friday, February 16, 2018

It's happened: China overtakes USA in scientific research

Last November I noted that the USA was barely managing to hold onto its lead over China in scientific research as measured by articles in the Scopus database. At the time, there were 346,425 articles with a Chinese affiliation and 352,275 with a US affiliation for 2017.

As of today, there are 395,597 Chinese and 406,200 US articles dated 2017.

For 2018 so far, the numbers are 53,941 Chinese and 49,428 US.

Scopus also lists other document types, and the situation may change over the course of the year.

Also, the United States still has a smaller population so it maintains its lead in per capita research production. For the moment.
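A back-of-envelope per capita comparison using the 2017 totals above; the population figures are my own rough assumptions, not taken from Scopus:

```python
# Rough per capita comparison. Article counts are the 2017 Scopus
# totals quoted above; populations (millions) are approximations.

articles_2017 = {"China": 395_597, "USA": 406_200}
population_m = {"China": 1_390, "USA": 325}

for country, articles in articles_2017.items():
    print(f"{country}: {articles / population_m[country]:.0f} "
          "articles per million people")

# USA ~1,250 per million vs China ~285: the per capita lead is still wide.
```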

Saturday, February 10, 2018

Influence of Rankings on State Policy: India

In case you are wondering why the Indian media get so excited about the THE and QS rankings and not about those that are just as good or better, such as the Leiden Ranking, RUR or the Nature Index, see this from the University Grants Commission.

Note that it says "any time" and that only the Big Three rankings count for getting Assistant Professor jobs.


"NEW DELHI:  University Grants Commission (UGC) has come up with, UGC Regulations 2018, which exempts PhD candidates from having NET qualification for direct recruitment to Assistant Professor post. This new draft regulation is known as Minimum Qualifications for Appointment of Teachers and Other Academic Staff in Universities and Colleges and Measures for the Maintenance of Standards in Higher Education. Further the Commission has also listed 'Ph.D degree from a university/ institution with a ranking in top 500 in the World University ranking (at any time) by Quacquarelli Symonds (QS), the Times Higher Education (THE) and Academic Ranking of World Universities (ARWU) of the Shanghai Jiao Tong University (Shanghai),' as one of the criteria for Assistant Professor appointment."


Thursday, February 08, 2018

Playing the Rankings Game in Pakistan

This article by Pervez Hoodbhoy from October 2016 is worth reading:

"A recently released report by Thomson-Reuters, a Canada based multinational media firm, says, “In the last decade, Pakistan’s scientific research productivity has increased by more than 4 times, from approximately 2000 articles per year in 2006 to more than 9000 articles in 2015. During this time, the number of Highly Cited Papers (HCPs) featuring Pakistan based authors increased tenfold from 9 articles in 2006 to 98 in 2015.”
This puts Pakistan well ahead of Brazil, Russia, India, and China in terms of HCPs. As the reader surely knows, every citation is an acknowledgement by other researchers of important research or useful new findings. The more citations a researcher earns, the more impact he/she is supposed to have had upon that field. Research evaluations, through multiple pathways, count for 50-70 percent of a university’s ranking (if not more).
If Thomson-Reuters has it right, then Pakistanis should be overjoyed. India has been beaten hollow. Better still, two of the world’s supposedly most advanced countries–Russia and China–are way behind. This steroid propelled growth means Pakistan will overtake America in just a decade or two.
But just a little analysis shows something is amiss. Surely a four-fold increase in scientific productivity must have some obvious manifestations. Does one see science laboratories in Pakistani universities four times busier? Are there four times as many seminars presenting new results? Does one hear animated discussions on scientific topics four times more frequently?
Nothing’s visible. Academic activity on Pakistani campuses might be unchanged or perhaps even less today, but is certainly not higher than ten years ago. So where–and why–are the authors of the HCP’s hiding? Could it be that these hugely prolific researchers are too bashful to present their results in departmental seminars or public lectures? The answer is not too difficult to guess."




Should Pakistan Celebrate the Latest THE Asian Rankings?


This is an updating and revision of a post from a few days ago


There appears to be no end to the craze for university rankings. The media in many parts of the world show almost as much interest in global university rankings as in the Olympics or the World Cup. They are now used to set requirements for immigration, to choose research collaborators, external examiners and international partners, and for marketing, public relations, and recruitment.

Pakistan has not escaped the craze although it was perhaps a bit slower than some other places. Recently, we have seen headlines announcing that ten Pakistani universities are included in the latest Times Higher Education (THE) Asian rankings and highlighting the achievement of Quaid-i-Azam University (QAU) in Islamabad reaching the top 100.

Rankings are unavoidable and sometimes they have beneficial results. The first publication of the research-based Shanghai rankings in 2003, for example, was a salutary shock to continental European universities and a clear demonstration of how far China had to go to catch up with the West in the natural sciences. But rankings do need to be treated with caution especially when ranking metrics are badly and obviously flawed.

THE note that there are now ten Pakistani universities in the Asian rankings and one, QAU, in 79th place, which would appear to be evidence of academic progress.

Unfortunately, Pakistani universities, especially QAU, do very much better in the THE rankings than in others. QAU is in the 401-500 band in the THE world rankings, which use the same indicators as the Asian rankings, but in the QS World University Rankings it is in the 650-700 band. It does not appear at all among the 800 universities ranked by Shanghai, the 903 in the Leiden Ranking or the 763 in the Russian Round University Rankings. In the University Ranking by Academic Performance, published in Ankara, it is 605th; in the Center for World University Rankings list, 870th.

How can we explain QAU's success in the THE world and Asian rankings, success so much greater than in any other ranking? It is in large part the result of a flawed methodology.

Take a look at the scores that QAU got in the THE rankings. In all cases the top scoring university gets 100.

For Teaching, which combines five indicators, it was 25.7, which is not very good. For International Outlook it was 42.1; since QAU has very few international staff or students, this mediocre score is very probably the result of a high score for international collaboration.

For research income from industry it was 31.8. This is probably an estimate, since exactly the same score is given to four other Pakistani universities.

Now we come to something very odd. QAU's research score was 1.3, the lowest of the 350 universities in the Asian rankings and very much lower than the next worst, Ibaraki University in Japan with 6.6. The research score is composed of research reputation, publications per faculty and research income per faculty. This probably means that QAU's score for research reputation was zero or close to zero.

In contrast, QAU’s score of 81.2 for research impact measured by citations is among the best in Asia. Indeed, in this respect it would appear to be truly world class with a better score than Monash University, the Chinese University of Hong Kong, the University of Bologna or the University of Nottingham.

How is it possible that QAU could be 7th in Asia for research impact but 350th for research?

The answer is that THE's research impact indicator is extremely misleading. It does not simply count citations: it benchmarks them against world averages across more than 300 fields, five years of publication and up to six years of citations. This means that a few highly cited papers in a strategic discipline at a strategic time can have a disproportionate effect on the impact score, especially if the total number of papers is low.

Added to this is THE's regional modification, under which a university's citation impact score is divided by the square root of the overall score of the country in which the university is located. The scores of universities in the top scoring country are unchanged while everyone else's go up, and the lower the country's score the bigger the boost. The effect is to give a big advantage to universities in countries like Pakistan. THE used to apply this bonus to the whole of the citations indicator but now applies it to only 50%.
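A minimal sketch of how that adjustment works, assuming country impact is expressed relative to the top scoring country; all numbers are hypothetical:

```python
from math import sqrt

# Sketch of THE's regional modification as described above. A
# university's citation impact is divided by the square root of its
# country's overall impact, and the adjustment now applies to only
# half of the indicator. All numbers are hypothetical.

def blended_impact(university_impact, country_impact):
    adjusted = university_impact / sqrt(country_impact)
    return 0.5 * university_impact + 0.5 * adjusted

# Country impact relative to the top country (= 1.0).
print(f"{blended_impact(0.60, 1.00):.2f}")   # 0.60: no boost in the top country
print(f"{blended_impact(0.60, 0.25):.2f}")   # 0.90: a big boost in a weak country
```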

Then we have to consider how THE deals with mega-papers, mainly in physics and medicine, with hundreds or even thousands of authors and hundreds or thousands of citations.

Until the world rankings of 2015-16 THE treated every single author of such papers as though he or she were the only author. Then it stopped counting citations to these papers altogether, and in 2016-17 it reintroduced them with fractional counting, crediting each participating institution with a minimum of 5% of the citations.

The effect of the citations metric has been to make a mockery of the THE Asian and world rankings. A succession of unlikely places has been propelled to the top of the indicator because of contributions to mega-papers or because of a few, or even a single, prolific author combined with a low overall number of papers. We have seen Alexandria University, Anglia Ruskin University, Moscow State Engineering Physics Institute and Tokyo Metropolitan University rise to the top. In last year's Asian rankings, Veltech University in India appeared to be first for research impact.

QAU has been involved in the Large Hadron Collider (LHC) project, which produces papers with hundreds or thousands of authors and hundreds or thousands of citations, and has provided authors for several papers. One 2012 paper derived from this project received 4,094 citations, so QAU would be credited with 205 citations for this paper alone.
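The arithmetic is straightforward, assuming the minimum 5% credit described above applies:

```python
# QAU's credit from a single mega-paper under THE's minimum 5% rule.
mega_paper_citations = 4094
print(round(mega_paper_citations * 0.05))   # 205 citations from one paper
```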

In addition to this, QAU employs an extremely productive mathematician, Tasawar Hayat, who is among the world's research elite in Clarivate Analytics' list of Highly Cited Researchers, where his primary affiliation is King Abdulaziz University in Saudi Arabia and QAU is his secondary affiliation. Professor Hayat is extremely prolific: in 2017 alone he was author or co-author of 384 scientific documents, including articles, reviews and notes.

There is nothing wrong with QAU taking part in the LHC project, and I am unable to comment on the quality of Professor Hayat's research. It should, however, be understood that if Professor Hayat left QAU, or QAU withdrew from the LHC project, or THE changed its methodology again, QAU could suffer a dramatic fall in the rankings similar to those suffered by some Japanese, Turkish and Korean universities in recent years. This is an achievement built on desperately weak foundations.

It would be very unwise to use these rankings as evidence for the excellence of QAU or any other university.

Tuesday, February 06, 2018

Rising Stars of Asian research

Times Higher Education (THE) has just announced the latest edition of its Asian rankings. Since the indicators are the same as the world rankings with adjusted weightings there was absolutely no suspense about who would be top. In case anybody still doesn't know it was the National University of Singapore.

The really interesting part of the rankings is the citations indicator, field- and year-normalised, based on Scopus, with fractional counting only for papers with more than 1,000 authors.

Here are some of the superstars of Asian research. On the left is the citations rank and the score for citations. On the right, in brackets, is the score for Research, comprising research reputation, publications per faculty and research income. To achieve a score in the seventies, eighties or nineties for citations with minimal research reputation, very few publications and limited funding is remarkable.

1st. 99.1. Babol Noshirvani University of Technology (15.3)
2nd. 92.0. King Abdulaziz University (92.3)
3rd. 93.1. Ulsan National Institute of Science and Technology (37.8)
7th. 81.2. Quaid-i-Azam University (1.3)
13th. 74.5. Fujita Health University (9.4)
16th. 72.5. Central China Normal University (11.3)






Free speech rankings from Spiked

The magazine Spiked is descended from Living Marxism although some think it is now more libertarian than socialist. It has just published the latest edition of its free speech university rankings.

These are not actually rankings but a classification or rating, since they simply divide UK universities into three groups. They have been subjected to mockery from sections of the academic blogosphere, including WONKHE, which might be justified on technical grounds. This is, however, such an important topic that any sort of publicity has to be welcomed.

Universities are divided into three categories: 

RED: "A students’ union, university or institution that is hostile to free speech and free expression, mandating explicit restrictions on speech, including, but not limited to, bans on specific ideologies, political affiliations, beliefs, books, speakers or words."

AMBER: "A students’ union, university or institution that chills free speech and free expression through restricting vague and subjective types of speech, such as ‘offensive’ or ‘insulting’ speech, or requiring burdensome vetting procedures for events, speakers, posters or publications. Many policies in this category might not explicitly limit speech, but have the potential to be used to that end, due to purposefully vague or careless wording."

GREEN: "A students’ union, university or institution that, as far as we are aware, places no significant restrictions on free speech and expression – other than where such speech or expression is unlawful."

The roll of honour in the green category includes exactly seven universities, none of them in the Russell Group: Anglia Ruskin, Buckingham, Hertfordshire, Robert Gordon, Trinity St David, West of Scotland, and Winchester.


Interesting data from Webometrics

The Webometrics rankings perform the invaluable function of ranking 27,000-plus universities, or entities claiming to be universities, around the world. Also, their Excellence indicator identifies those institutions, 5,776 this year, with any claim to involvement in research.

Consequently, Webometrics has often been used for unofficial national rankings in countries, especially in Africa, where very few places can make it into the top 500 or 1,000 universities included in the better known international rankings.

However, there seems to be a universal law that when a ranking becomes significant it will have unintended and perverse consequences. In the UK we have seen massive inflation in the number of first and upper second class degrees, partly because this is an element in popular national rankings. Sophisticated campaigns can also produce significant gains in the QS academic opinion survey, which has a 40% weighting, and a few hundred strategic citations can boost the most unlikely universities in the research impact indicator of the THE world and regional rankings.

Webometrics also has an indicator that seems to be susceptible to bad practices. This is "Presence", the number of pages in the main web domain, including subdomains and file types such as rich files, with a 5% weighting. Apparently this can be easily manipulated. Unlike other rankings, Webometrics does not attempt to ignore this but has highlighted it in several recent tweets, which is helpful since it indicates who might be manipulating the variable. It is possible that there has been a misunderstanding of the Webometrics guidelines, an error somewhere, or perhaps some totally valid and innocent explanation. If the latter is the case I will be happy to publish a statement.

Here is a selection of universities with their world rank in the Webometrics Presence indicator. The overall rank is in brackets.

4.  University of Nairobi, Kenya (874)

5.  Masaryk University in Brno, Czechia (433)

9.  Federal University of Santa Catarina, Brazil (439)

15.  Charles University in Prague (203)

17.  University of Costa Rica (885)

20.  University of the West Indies St Augustine (1792)

32.  National University of Honduras (3777)

40.  Mahidol University, Thailand (548)

55.  Universitas Muhammadiyah Surakarta, Indonesia  (6394)




Wednesday, January 24, 2018

Fake Rankings from Nigeria?

Although the Webometrics rankings, based mainly on web activity, receive little attention from the good and the great among the world's university administrators, they do serve the important function of providing some sort of assessment of over 20,000 universities or entities that claim to be universities. They get to places where the market leaders, the Shanghai Ranking, THE and QS, cannot go.

As a result, the media in several African countries have from time to time published local rankings based on Webometrics that do not appear all that different from what would be expected from a ranking based on research or reputation.

For example, the current top five Nigerian universities in Webometrics are:

1. University of Ibadan
2. Covenant University
3. Obafemi Awolowo University
4. University of Nigeria
5. University of Lagos.

The Nigerian press have in the last few years announced the results of rankings supposedly produced by the country's National Universities Commission (NUC). In 2016 Nigerian Scholars reported that the NUC had produced a ranking with the top five being:

1. University of Ibadan
2. University of Lagos
3. University of Benin
4. Obafemi Awolowo University
5. Ahmadu Bello University.

Now we have this report in The Nation: Professor Adamu Abubakar Abdulrasheed, Executive Secretary of the NUC, has announced that the rankings attributed to the NUC were fake and that the commission had not published any rankings for several years.

This is a bit strange. Does that mean that nobody at the commission noticed that fake rankings were being published in its name until now? There may be more to this story.

For the moment, it looks as though Nigeria and other countries in Africa may have to continue relying on Webometrics.





Saturday, January 20, 2018

What use is a big endowment?

Quite a lot. But not as much as you might expect.

The website THEBESTSCHOOLS has just published a list of the world's 100 wealthiest universities, as measured by the value of their endowments. As expected, it is dominated by US institutions, with Harvard in first place. There are also three universities from Canada and two each from the UK, Australia, Japan, Singapore and Saudi Arabia.

There are of course other elements in university funding, but it is worth looking at how this ranking compares with others. The top five are familiar to any rankings observer: Harvard, with an endowment of US$34.5 billion, followed by Yale, the University of Texas system, Stanford and Princeton. Then there is a surprise: King Abdullah University of Science and Technology in Saudi Arabia in sixth place with an endowment of 20 billion.

Some of the wealthy universities also do well in other rankings. Stanford, in fourth place here, is second in the overall Shanghai rankings, seventh for publications, and fifth in the Leiden Ranking default publications indicator. It does even better in the QS employer survey indicator, where it is ranked second.

There are, however, several places that are very wealthy but just don't get anywhere in the global rankings. Williams College, the University of Richmond, Pomona College, Wellesley College, Smith College, and Grinnell College are not even given a value in the QS employment indicator, or the Leiden or Shanghai publication indicators. They may of course do well in some other respects: the University of Richmond is reported by the Princeton Review to be second in the US for internships.

On the other hand, some less affluent universities do surprisingly well. Some California schools seem to be among the best performers. Caltech is 47th here but 9th in the Shanghai rankings, where it has always been first in the productivity per capita indicator. Berkeley is 65th here and fifth in Shanghai. The University of California San Francisco, a medical school, is 90th here and 21st in Shanghai.

Overall there is an association between endowment value and research output or employer reputation that is definitely positive but rather modest. The correlation between endowment and the Shanghai publication score is 0.38, between endowment and the number of publications 2012-15 (in the Leiden Ranking) 0.46, and between endowment and the QS employer survey score 0.40. The relationship would certainly be higher if we corrected for restriction of range.

Having a lot of money helps a university produce research and build up a reputation for excellence but it is certainly not the only factor involved.

Here is the top ten in a ranking of the 100 universities by papers (Leiden Ranking) per billion dollars of endowment.

1. University of Toronto
2. University of British Columbia
3. McGill University
4. University of California San Francisco
5. University of Melbourne
6. Rutgers University
7. UCLA
8. University of Florida
9. University of California Berkeley
10. University of Sydney.

When it comes to research value for money it looks as though Australian and Canadian universities and US state institutions are doing rather better than the Ivy League or Oxbridge.
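For anyone who wants to replicate the exercise, a minimal sketch of the calculation behind the list above; the figures here are placeholders, not the actual Leiden or endowment numbers:

```python
# Rank universities by papers per billion dollars of endowment.
# Placeholder data: substitute real Leiden paper counts and
# endowment values to reproduce the list above.

universities = {
    "Example State U": {"papers": 20_000, "endowment_bn": 2.0},
    "Example Ivy":     {"papers": 15_000, "endowment_bn": 30.0},
    "Example College": {"papers": 500,    "endowment_bn": 1.5},
}

ranked = sorted(universities.items(),
                key=lambda kv: kv[1]["papers"] / kv[1]["endowment_bn"],
                reverse=True)

for name, d in ranked:
    print(f"{name}: {d['papers'] / d['endowment_bn']:,.0f} papers per $bn")
```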


Ranking News: Chinese Think Tank Ranking

From the China Daily

The Global Think Tank Research Center affiliated with Zhejiang University of Technology has released a ranking of domestic university think tanks.

The first three places go to the National Academy of Development and Strategy at Renmin University of China, the National School of Development at Peking University, and the National Conditions Institute at Tsinghua University.

Wednesday, January 17, 2018

Ranking News: US State K-12 Rankings

Education Week has produced a ranking of states according to three criteria: Chance for Success, School Finance and K-12 Achievement. Overall the top state is Massachusetts, which is also first for Chance for Success and K-12 Achievement. Pennsylvania is top for School Finance. Overall the worst performing state is Nevada, while New Mexico is worst for Chance for Success, Idaho for School Finance, and Mississippi for K-12 Achievement.

California is an interesting case. Overall it is below average and gets a grade of C-. For K-12 its grade is D+. The state has some of the best universities in the world. Typically three or four of them will be found in the top ten of any global ranking. So why is the performance of primary and secondary schools so poor? Could it be that Education Week has identified the future of California's tertiary sector?






Thursday, January 11, 2018

Ranking News: US News online program rankings

U.S. News Releases 2018 Best Online Programs Rankings

Ranking news: Jordan cancels classification of universities

The Higher Education Accreditation Commission of Jordan has cancelled its proposed classification of universities. Apparently, academics were opposed because it was based on international rankings and ignored "the reality of the universities and the damage to their reputation".


Source: Jordan Times

Friday, December 29, 2017

Getting ready for next year's university rankings



More on Japan and the Rankings

The Japan Times recently published an article by Takamitsu Sawa, President and Distinguished Professor at Shiga University, discussing the apparent decline of Japan's universities in the global rankings.

He notes that in 2014 there were five Japanese universities in the top 200 of the Times Higher Education (THE) world rankings but only two in 2016. He attributes Japan's poor performance to the bias of the citations indicator towards English language publications and the inability or reluctance of Japanese academics to write in English. Professor Sawa seems to be under the impression that THE does not count research papers that are not written in English, which is incorrect. It is, however, true that the failure of Japanese scholars to write in English prevents their universities from doing better in the rankings. He also blames a lack of funding from the government and the Euro-American bias of the THE reputation survey.

The most noticeable thing about the article is that the author discusses exactly one table, the THE World University Rankings. This is unfortunately very common, especially among Asian academics. There are now over a dozen global rankings of varying quality, and some of them tell a different, and perhaps more accurate, story than THE's. For example, there are several well known international rankings with more Japanese universities in the world top 200 than THE has.

There are currently two in the THE top 200 but seven in the Shanghai Academic Ranking of World Universities (ARWU), ten in the QS World University Rankings, ten in the Russian Round University Rankings, seven in the CWTS Leiden Ranking total publications indicator and ten in the Nature Index.

Let's now take a look at the University of Tokyo (Todai), the country's best known university, and its position in these rankings. Currently it is 46th in the world in THE, but in ARWU it is 23rd, in QS 28th, in the Leiden Ranking tenth for publications, and in the Nature Index tenth. RUR puts the university in 43rd place, still a little better than THE. It is very odd that Professor Sawa should focus on the ranking that puts Japanese universities in the worst possible light and ignore the others.

As noted in an earlier post, Tokyo's tumble in the THE rankings came suddenly in 2015 when THE made some drastic changes to its methodology, including switching to Scopus as data supplier, excluding papers with large numbers of authors such as those derived from the CERN projects, and applying the country adjustment to half instead of all of the citations indicator. Then in 2016 THE made further changes for its Asian rankings that further lowered the scores of Japanese universities.

It is true that the scores of leading Japanese universities in most rankings have drifted downwards over the last few years, but this is a relative trend caused mainly by the rise of a few Chinese and Korean universities. Japan's weakest point, as indicated by the RUR and THE rankings, is internationalisation. These rankings show that the major Japanese universities still have strong reputations for postgraduate teaching and research, while the Nature Index and the Leiden Ranking point to an excellent performance in natural sciences research at the highest levels.

Nobody should rely on a single ranking and changes caused mainly by methodological tweaking should be taken with a large bucket of salt.




Tuesday, December 19, 2017

Rankings Calendar

The US News Online Program Rankings will be published on January 9th, 2018


Saturday, December 16, 2017

Rankings in Hong Kong

My previous post on the City University of Hong Kong has been republished in the Hong Kong Standard.

So far I can find no reference to anyone asking about the City University of Hong Kong's submission of student data to THE or data about faculty numbers for any Hong Kong university.

I also noticed that the Hong Kong University of Science and Technology is not on the list of 500 universities in the QS Employability Rankings although it is 12th in the one published by THE. Is there a dot to be connected here?



Measuring graduate employability; two rankings

Global university rankings are now well into their second decade. Since 2003, when the first Shanghai rankings appeared, there has been a steady growth of global and regional rankings. At the moment most global rankings are of two kinds: those that focus entirely or almost entirely on research, and those, such as the Russian Round Rankings, Times Higher Education (THE) and Quacquarelli Symonds (QS), that claim to also measure teaching, learning or graduate quality in some way, although even these are biased towards research if you scratch the surface a little.

The ranking industry has become adept at measuring research productivity and quality in various ways. But the assessment of undergraduate teaching and learning is another matter.

Several ranking organisations use faculty-student ratio as a proxy for quality of teaching, which in turn is assumed to have some connection with something that happens to students during their programmes. THE also counts institutional income, research income and income from industry, again assuming that there is a significant association with academic excellence. Indicators like these are usually based on data supplied by institutions. For examples of the problems here, see an article by Alex Usher and a reply by Phil Baty.

An attempt to get at student quality is provided by the CWUR rankings, now based in the UAE, which count alumni who win international awards or who are CEOs of major companies. But obviously this is relevant only for a very small number of universities. A new pilot ranking from Moscow also counts international awards.

The only attempt by the well known rankers to measure student quality that is relevant to most institutions is the survey of employers in the QS world and regional rankings. There are some obvious difficulties here. QS gets respondents from a variety of channels, and this may allow some universities to influence the survey. In recent years some Latin American universities have done much better on this indicator than on any other.

THE now publish a global employability ranking which is conducted by two European firms, Trendence and Emerging. This is based on two surveys of recruiters in Argentina, Australia, Austria, Brazil, Canada, China, Germany, France, India, Israel, Italy, Japan, Mexico, Netherlands, Singapore, Spain, South Africa, South Korea, Turkey, UAE, UK, and USA. There were two panels with a total of over 6,000 respondents.

A global survey that does not include Chile, Sweden, Egypt, Nigeria, Saudi Arabia, Russia, Pakistan, Indonesia, Bangladesh, Poland, Malaysia or Taiwan can hardly claim to be representative of international employers. This limited coverage may explain some oddities of the rankings, such as the high places of the American University of Dubai and the National Autonomous University of Mexico.

The first five places in these rankings are quite similar to the THE world rankings: Caltech, Harvard, Columbia, MIT, Cambridge. But there are some significant differences after that, and some substantial changes since last year. Here Columbia, 14th in the world rankings, is in third place, up from 12th last year. Boston University is 6th here but 70th in the world rankings. Tokyo Institute of Technology, in 19th place, is in the 251-300 band in the world rankings. CentraleSupelec is 41st here but in the 401-500 group in the world rankings.

These rankings are useful only for a small minority of universities, stakeholders and students. Only 150 schools are ranked and only a small proportion of the world's employers consulted.

QS have also released their global employability rankings, with 500 universities. These combine the employer reputation survey used in their world rankings with other indicators: alumni outcomes, based on lists of high achievers; partnerships with employers, that is, research collaboration noted in the Scopus database; employer-student connections, that is, employers actively present on campus; and graduate employment rate. There seems to be a close association, at least at the top, between overall scores, employer reputation and alumni outcomes. Overall the top three are Stanford, UCLA and Harvard. For employer reputation they are Cambridge, Oxford and Harvard, and for alumni outcomes Harvard, Stanford and Oxford.

The other indicators are a different matter. For employer-student connections the top three are Huazhong University of Science and Technology, Arizona State University, and New York University; in fact seven of the top ten on this measure are Chinese. For graduate employment rate they are Politecnico di Torino, Moscow State Institute of International Relations, and Sungkyunkwan University, and for partnerships with employers Stanford, Surrey and Politecnico di Milano. When the front runners on indicators are so different, one has to wonder about their validity.

There are some very substantial differences in the ranks given to various universities in these rankings. Caltech is first in the Emerging-Trendence rankings and 73rd in QS. Hong Kong University of Science and Technology is 12th in Emerging-Trendence but not ranked at all by QS. The University of Sydney is 4th in QS and 48th in Emerging-Trendence. The American University of Dubai is in QS's 301-500 band but 138th for Emerging-Trendence.

The rankings published by THE could be of some value to students contemplating careers with the leading companies in the richest countries.

The QS rankings may be more helpful for those students or stakeholders looking at universities outside the very top of the global elite. Even so QS have ranked only a fraction of the world's universities.

It still seems that the way forward in the assessment of graduate outcomes and employability is through standardised testing along the lines of AHELO or the Collegiate Learning Assessment.




Monday, December 11, 2017

Rankings Calendar

The Times Higher Education (THE) Asian Universities Summit will be held at the Southern University of Science and Technology, Shenzhen, China, 5th-7th February, 2018. The 2018 THE Asian university rankings will be announced there.



Monday, November 27, 2017

Rankings Uproar in Hong Kong


There is a controversy brewing in Hong Kong about the submission of data to the QS World University Rankings. It seems that the City University of Hong Kong (CityU) has submitted a smaller figure for the total number of its students than that presented by the SAR's University Grants Committee (UGC). The objective of this was presumably to boost the score for faculty student ratio, which accounts for 20% of the total score in the QS rankings. The complaints apparently began with two other local universities and were reported in the Chinese language Apple Daily.

There is nothing new about this sort of thing. Back in 2006 I commented on the difference between the number of students at "Beijing University" on the university web site and the number declared by QS. Ong Kian Ming has noted discrepancies between the number of students at Malaysian universities reported on web sites and the numbers published by QS, and there have been questions about the number of international students at Singapore universities.

The first thing that strikes an outside observer about the affair is that the complaint seems to be just about QS and does not mention the THE rankings although exactly the same number of students, 9,240, appears on both the QS and THE pages. The original article in Chinese apparently makes no mention of THE.

This suggests that there might be a bit of politics going on here. THE seems to have a good relationship with some of the leading universities in Hong Kong, such as the University of Hong Kong (UHK) and the Hong Kong University of Science and Technology (HKUST). In 2015 THE held a prestigious summit at HKUST where it announced, after "feedback from the region", that it was introducing methodological changes that would dethrone the University of Tokyo from the number one spot in the Asian rankings and send it down to seventh place, behind HKUST and UHK. It looks as though whoever is complaining about CityU is averting their eyes from THE.

There is certainly a noticeable difference between the number of students submitted to QS and THE by CityU and that published by the UGC. This is not, however, necessarily nefarious. There are many ways in which a university could massage or trim data in ways compliant with the rankers' guidelines: using a specific definition of Full Time Equivalent, omitting or including branch campuses, research centres, affiliated institutions, counting students at the beginning or the end of the semester, counting or not counting exchange students or those in certificate, diploma, transitional or preparatory programmes. It is also not totally impossible that the government data may not be 100% accurate.

Other Hong Kong universities have also submitted student data that differs from that available at the UGC site but to a lesser extent. 

The UGC's data refers to 13,725 full time equivalent students in 2014-15. It is possible that City University has found legitimate ways of whittling down this number. If nothing else, they could claim that they had to use data from earlier years because of uncertainty about the validity of current data.

The real problem here is that it is possible that some universities have learned that success in the rankings is sometimes as much a matter of careful reading of statistics and guidelines as it is of improved teaching or research.

Another thing that has so far gone unnoticed is that CityU has also been reducing the number of faculty. The UGC reports a total of 2,380 full time equivalent faculty while QS reports 1,349. If the university had just used the raw UGC figures it would have had a ratio of 5.77 students per faculty member; the QS figures work out at 6.85. So by modifying the UGC data, if that is where the university started, CityU actually got a worse result on this indicator. It would, however, have done a bit better on the citations per faculty indicator.

This leads on to what the Hong Kong universities did with their faculty numbers.

For the University of Hong Kong the UGC reports a total of 5,093 FTE staff but the QS site has 3,012. THE does not give a figure for the number of faculty, but it is possible to calculate one from the number of students and the student-staff ratio, which are provided. The current THE profile of UHK has 18,364 students and 18 students per staff member, which gives us 1,020 staff.
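A small sketch of the two calculations used in this post, based only on the figures quoted here: the student-staff ratio computed from reported totals, and the staff number backed out of THE's published profile:

```python
# Figures quoted in this post.

# 1. Students per staff member from reported totals (CityU):
print(f"{13_725 / 2_380:.2f}")   # UGC figures -> 5.77
print(f"{9_240 / 1_349:.2f}")    # QS figures  -> 6.85

# 2. Implied staff from THE's profile, which publishes students and a
#    students-per-staff ratio but not a staff count (UHK):
print(f"{18_364 / 18:.0f}")      # -> ~1,020 FTE staff
```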

For HKUST the UGC number of staff is 2,398. The number calculated from THE data is 442. QS has a total of 1,150. 

For the Chinese University of Hong Kong (CUHK) we have these numbers: UGC 5,070, QS 2,208, THE 1,044.

For the Polytechnic University of Hong Kong (PUHK): UGC 3,356, QS 2,447, THE 809.

The UGC gives 2,380 FTE staff for CityU, QS 1,349, and THE 825.

The UGC also provides the number of faculty wholly funded by the UGC, and this number is always much lower than the total. The QS faculty numbers are generally quite similar to these, although I do not know whether there was a decision to exclude non-funded faculty. The calculated THE faculty numbers are much lower than those provided by the UGC and lower than the QS numbers.

I suspect that what is going on is that the leading Hong Kong universities have adopted the strategy of aiming for the THE rankings, where their income, resources and international connections can yield maximum advantage. They presumably know that the weighting of the staff-student indicator, where it is better to have more faculty, is only 4.5%, but the indicators where fewer total staff are better (international faculty, research income, research productivity, industry income, doctorates awarded, institutional income) have a combined weighting of 25.25%.
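A stylised illustration of that trade-off. This is emphatically not THE's actual normalisation, just linear scaling of hypothetical numbers to show the direction of the incentive:

```python
# Hypothetical trade-off: a larger staff count raises the 4.5%
# staff-student indicator but depresses the per-staff indicators
# worth a combined 25.25%. Each indicator is scored relative to the
# better of the two reporting scenarios.

students, research_income = 18_000, 500.0   # hypothetical figures
high_staff, low_staff = 5_000, 1_000        # two reporting choices

ss_high, ss_low = high_staff / students, low_staff / students            # more is better
ps_high, ps_low = research_income / high_staff, research_income / low_staff  # more is better

score_high = (4.5 * ss_high / max(ss_high, ss_low)
              + 25.25 * ps_high / max(ps_high, ps_low))
score_low = (4.5 * ss_low / max(ss_high, ss_low)
             + 25.25 * ps_low / max(ps_high, ps_low))

print(f"report 5,000 staff: {score_high:.2f}")   # about 9.6
print(f"report 1,000 staff: {score_low:.2f}")    # about 26.2: fewer staff wins
```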

CityU in contrast has focussed on the QS rankings and looked for ways of reducing the number of students submitted.

It is possible that HKUST and UHK could justify the data they submitted to the rankers while CityU might not. It does, however, seem rather strange and unfair that City University's student data has come under such intense scrutiny while the faculty data of the other universities has so far gone unquestioned.

Ranking organisations should heed the suggestion by the International Rankings Experts Group (IREG) that indicators measure outcomes rather than inputs such as staff, facilities or income. They also should think about how much they should use data submitted by institutions. This may have been a good idea when they were ranking 200 or 300 places mainly in North America and Western Europe but now they are approaching 1,000 universities, sometimes very decentralised, and data collection is becoming more complicated and difficult.

QS used to talk about its "validation hierarchy" with central agencies such as HESA and NCES at the top, followed by direct contact with institutions, websites, and ending with "smart" averages. Perhaps this could be revived but with institutional data further down the hierarchy. The lesson of the latest arguments in Hong Kong and elsewhere is that data submitted by universities can often be problematical and unreliable.


Friday, November 17, 2017

Another global ranking?

In response to a suggestion by Hee Kim Poh of Nanyang Technological University, I have had a look at the Worldwide Professional University Rankings, which appear to be linked to "Global World Communicator" and the "International Council of Scientists" and may be based in Latvia.

There is a methodology page but it does not include essential information. One indicator is "number of publications to number of academic staff" but there is nothing about how either of these is calculated or where the data comes from. There is a reference to a survey of members of the International Council of Scientists but nothing about the wording of the survey, the date of the survey, the distribution of respondents or the response rate.

Anyway, here is the introduction to the methodology:

"The methodology of the professional ranking of universities is based on comparing universities and professional evaluation by level of proposed training programs (degrees), availability and completeness of information on activities of a university, its capacity and reputation on a national and international levels. Main task is to determine parameters and ratios needed to assess quality of the learning process and obtained specialist knowledge. Professional formalized ranking system based on a mathematical calculation of the relation of parameters of the learning process characterizing quality of education and learning environment. Professional evaluation criteria are developed and ranking is carried out by experts of the highest professional qualification in relevant fields - professors of universities, specialists of the highest level of education, who have enough experience in teaching and scientific activities. Professional rating of universities consists of three components.. "

The top five universities are 1. Caltech,  2. Harvard,  3. MIT,  4. Stanford,  5. ETH Zurich.

Without further information, I do not think that this ranking is worth further attention.

Source: http://www.cicerobook.com/en/ranks

Wednesday, November 15, 2017

Rankings Calendar: QS BRICS University Rankings

The QS BRICS (Brazil, Russia, India, China, South Africa) university rankings will be announced on November 23 at the QS-APPLE conference in Taiwan.


Tuesday, November 14, 2017

China overtakes USA in supercomputing

The website TOP500 keeps track of the world's most powerful computers. Six months ago the USA had 169 supercomputers in the top 500 and China 160. Now China has 202 and the USA 143.

They are followed by Japan with 35, Germany 20, France 18 and the UK 15.

There are four supercomputers in India, four in the Middle East (all in Saudi Arabia), one in Latin America (Mexico), and one in Africa (South Africa).



The closing gap: When will China overtake the USA in research output?

According to the Scopus database, China produced 387,475 articles in 2016 and the USA 409,364, a gap of 21,889.

To be precise, there were 387,475 articles with at least one author affiliated to a Chinese university or research center and 409,364 with at least one author affiliated to an American university or research center.

So far this year there have been 346,425 articles with Chinese affiliations and 352,275 with US affiliations.

The gap is now 5,850 articles.

I think it safe to say that at some point early next year the gap will close and that China will then pull ahead of the USA.
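As a sanity check on that claim, here is the arithmetic from the figures quoted in this post; the extrapolation is a back-of-envelope sketch, not a forecast model:

```python
# Gap arithmetic from the Scopus counts quoted above.
usa_2016, china_2016 = 409_364, 387_475
usa_now, china_now = 352_275, 346_425   # 2017 articles indexed so far

print(f"2016 full-year gap: {usa_2016 - china_2016:,}")   # 21,889
print(f"2017 gap to date:   {usa_now - china_now:,}")     # 5,850

# With the 2017 gap already about a quarter of last year's and China's
# output growing faster, a crossover early next year looks plausible.
```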

Some caveats. A lot of those articles are just routine stuff and not very significant. For a while, the US may do better in high impact research as measured by citations. Also, US universities contribute more heads of research projects.

On the other hand, I suspect that many of the researchers listed as having American affiliations did their undergraduate degrees or secondary education in China.

And if we counted Hong Kong as part of China, then the gap would already have been closed.