Showing posts with label QS World University Rankings. Show all posts

Thursday, January 18, 2024

Petition Against Silly University Ranking Systems

Philip Moriarty has published an Open Letter, "We reject the absurd QS/THE World University Rankings", which has so far been signed by 107 academics. While I agree with the sentiment, these distinguished academics have missed the point of university rankings: they aren't about scientifically measuring the scholarly and educational worth of institutions; they are a marketing tool. As a marketing tool, and a publishing business, rankings have been very successful. However flawed, the rankings respond to a real need from students for advice on the quality of institutions.

When I wanted to study internationally, I turned to ranking systems to check that the university I had selected was okay. I didn't care exactly where it rated; I just wanted to confirm it was a real university, not a scam. For this I used the Webometrics Ranking of World Universities, as it lists many more institutions than the commercial ranking schemes, and is more open.

If academics want to do away with the current silly ranking systems, they have to come up with something better, which measures what they think should be measured. They then need a marketing strategy to have it adopted, and a business model to sustain it.

Monday, December 11, 2023

Sixteen Australian Universities in the top 100 for Sustainability Worldwide

Australian universities have done very well in the QS World University Rankings for Sustainability 2024. There are 16 Australian universities in the top 100, based on environmental, social and governance (ESG) measures. This includes many large capital city universities, which I would have assumed would have more difficulty with sustainability, due to their densely built inner-city campuses. The regional universities have ranked lower. This may reflect the amount of specialist expertise needed to document sustainability, rather than the level of sustainability itself. One way around that would be for students to undertake this documentation as part of their coursework.

  1.  7 University of Sydney
  2.  9 University of Melbourne
  3. 11 University of New South Wales (UNSW Sydney)
  4. 23 Monash University
  5. 30 Australian National University (ANU)
  6. 36 University of Queensland
  7. 40 Griffith University
  8. 43 University of Technology Sydney
  9. 49 University of Adelaide
  10. 57 Macquarie University
  11. 62 RMIT University
  12. 62 University of Wollongong
  13. 66 Deakin University
  14. 74 University of Newcastle, Australia (UON)
  15. 89 Curtin University
  16. 100 Queensland University of Technology (QUT)

Tuesday, October 25, 2022

University Rankings Are a Marketing Tool Not a Student Guide

Julie Hare writes "Why university rankings don’t tell you what you need to know" (AFR, Oct 23, 2022). But ranking systems, such as that from Times Higher Education (THE), were never intended as a guide for students in selecting a university. My favourite ranking system is the non-profit Webometrics, which includes measures such as openness. There are also some awards which explicitly look at teaching quality, such as the Good Universities Guide Awards, which show that institutions which rate poorly on THE rankings do well when it comes to education.

The ranking schemes from publishers are designed as marketing tools to help promote their publications, and to help their advertisers (the universities) promote themselves. Like many industry awards, university rankings are designed to appeal to the vanity of established organisations and their executives. The reality is that research excellence has little to do with the quality of the education a university provides. If anything there is a negative correlation, as researchers are not selected for their teaching ability. Also, the quality of the teaching has little to do with students' learning outcomes, as these mostly depend on the students themselves, their background, and the support they get externally. Universities within a single national system also tend to produce similar outcomes. Australia has a strict government regulatory framework for universities, so there are no bad ones, and not that much difference between the top ones.

Sunday, September 13, 2020

More Relevant Higher Education Ranking Systems Needed

Australian education consultancy Studymove has pointed out that the higher the QS World University Ranking for an Australian university, the higher the international tuition fees charged. The correlation applies across fields and levels of study, but is weakest for education studies. The consultants speculate that high unemployment may result in students selecting institutions and programs with higher rankings for employment outcomes. However, I expect many students don't look past the overall ranking of a university to examine the individual measures. The current ranking schemes are weighted towards academic quality and research output, although these are not relevant to most students.

Like many such ranking systems, QS is heavily skewed towards academic reputation. In the case of QS, the largest component of the overall measure is based on a survey of academics. Other ranking systems use research papers published as a measure of academic quality. However, the research at a university has little effect on the quality of education a student receives. Most students are not planning to be researchers, and few who complete research degrees end up in research jobs. Researchers don't make particularly good teachers anyway, and this emphasis on research may result in students making poor education choices.

Webometrics uses measures of openness, as well as quality, to provide a more relevant ranking of universities. This relies on readily available information, rather than surveys, which has the advantage of allowing the inclusion of many more institutions. In particular, Webometrics includes thousands of vocational institutions which are excluded from most university ranking systems. Australia has only 43 universities, but Webometrics lists another 150 non-university institutions.

Despite the different measures used, Australian universities outrank the non-university institutions on the Webometrics scoring, and the ranking of universities is not very different from QS. This suggests it should be possible to create new rankings of universities at low cost, using a similar approach to Webometrics.

The Webometrics methodology, like that of QS, is heavily weighted towards academic quality. Overall, 35% of the measure is based on papers cited in research publications, which perhaps explains why universities are at the top of the list, outranking vocational institutions.

The ordering of the universities is different to QS, but the same universities feature near the top of both lists. Near the bottom of the Webometrics ranking are a few large state government vocational institutions: TAFE NSW, TAFE Queensland, Adelaide Institute of TAFE and Canberra Institute of Technology. Just one private for-profit vocational provider features: Open Colleges, which has a 100-year history in correspondence education.

I pasted the Webometrics list of Australian universities into a spreadsheet, and reweighted the scores by deleting the excellence measure. The leading universities remained at the top of the list, but a few vocational institutions crept up a little: Open Colleges from 41st to 39th place, TAFE NSW from 43 to 40, TAFE Queensland from 46 to 44, Adelaide Institute of TAFE 45, Kangan Institute 49. Canberra Institute of Technology dropped one place from 49 to 50.
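The same reweighting experiment can be done without a spreadsheet. The sketch below re-ranks institutions with and without an "excellence" component; the institution names and rank figures are made up for illustration, and are not the actual Webometrics data or weights.

```python
# Hypothetical example: re-rank institutions after dropping one rank component.
# Webometrics combines several component ranks; lower is better. The data here
# is invented to show how dropping research "excellence" can lift vocational
# providers slightly, as described above.

institutions = [
    # (name, visibility_rank, transparency_rank, excellence_rank)
    ("University A", 1, 1, 2),
    ("University B", 3, 3, 1),
    ("TAFE C",       2, 2, 5),  # weak on research excellence
    ("College D",    4, 4, 3),
    ("Institute E",  5, 5, 4),
]

def combined_rank(scores, include_excellence=True):
    """Sum the component ranks (lower total is better), optionally
    dropping the excellence component, then order institutions."""
    totals = []
    for name, vis, trans, exc in scores:
        total = vis + trans + (exc if include_excellence else 0)
        totals.append((total, name))
    return [name for total, name in sorted(totals)]

print(combined_rank(institutions))
print(combined_rank(institutions, include_excellence=False))
```

With the excellence component included, "TAFE C" sits third; without it, the TAFE moves up to second, mirroring the small shifts seen in the reweighted spreadsheet.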

Wednesday, June 10, 2020

QS Top Universities Rankings 2021

The latest QS Top Universities Rankings have seven Australian institutions in the top 100, ranked: 31 ANU, 40 Sydney, 41 Melbourne, 44 NSW, 46 Queensland, 55 Monash, and 92 UWA. These results are similar to other ranking schemes, are widely reported, and influence student choices. However, the relevance of the methodologies used is questionable.

The QS rankings are based on five indicators: Academic and Employer Reputation, Faculty/Student Ratio, Citations per Faculty, and International Faculty and Student Ratios. Reputation makes up half of the ranking, but is, in part, a self-fulfilling prophecy: academics and employers base their rating of an institution, in part, on how it has ranked in the past. The ratio of teachers to students might seem more objective, but this doesn't take into account whether those teachers are qualified to teach (most university academics do not have a degree in education). Citations provide some measure of research output, but may indicate less interest by the staff in teaching. The international staff and student ratios are used as a proxy for soft skills, but do not measure whether the international staff and students mix, or are confined to specific campuses and classes.
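To make the weighting concrete, here is a minimal sketch of how such a composite score is built. The weights follow the pre-2023 QS methodology as I understand it (the two reputation surveys together making up half), and the example university's indicator scores are invented; treat all of it as illustrative rather than authoritative.

```python
# A composite ranking score is just a weighted sum of indicator scores.
# Weights are assumed from the older QS methodology; note reputation
# (academic + employer) accounts for 50% of the total.

WEIGHTS = {
    "academic_reputation":    0.40,
    "employer_reputation":    0.10,
    "faculty_student_ratio":  0.20,
    "citations_per_faculty":  0.20,
    "international_faculty":  0.05,
    "international_students": 0.05,
}

def composite_score(indicator_scores):
    """Weighted sum of per-indicator scores (each on a 0-100 scale)."""
    return sum(WEIGHTS[k] * indicator_scores[k] for k in WEIGHTS)

# Hypothetical university: strong reputation, weaker on everything else.
example = {
    "academic_reputation":    95.0,
    "employer_reputation":    90.0,
    "faculty_student_ratio":  60.0,
    "citations_per_faculty":  50.0,
    "international_faculty":  80.0,
    "international_students": 70.0,
}

print(round(composite_score(example), 1))  # 76.5
```

Because reputation dominates the weights, a university with middling teaching-related indicators can still score well overall, which is the self-reinforcing effect described above.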

Another difficulty with QS and similar ranking systems is that they cover only about one tenth of the higher education institutions in the world. Most students can't afford to attend one of these "top" 1,000 universities. Most students would be better off with an institution not in the top 1,000, but which focuses on quality teaching for skills in demand, rather than research. The Webometrics Ranking of World Universities tries to provide a broader ranking, and covers more institutions.