Showing posts with label University Ranking.

Friday, January 30, 2026

TIME Magazine's Ranking of the World's Top Universities

Time Magazine has produced a ranking of the World's Top Universities. Note that this is from a different publisher and is a different ranking scheme to the Times Higher Education World University Rankings. Time claims its rankings are more relevant to students, but they produce much the same results as Times & other ranking schemes. The Time & Times rankings have the same eight universities in their top 10 for Australia. Also, Time ranks only 500 universities, compared to more than 5,000 by Webometrics. As with most such ranking schemes, this seems more about selling advertising than helping students.

Time moves Curtin & James Cook Universities to the top 10, dropping University of Technology Sydney, and Macquarie University.

  1. University of Queensland (6)
  2. University of Melbourne (1)
  3. University of Sydney (2)
  4. University of Western Australia (8)
  5. University of New South Wales (5)
  6. Australian National University (4)
  7. University of Adelaide (7)
  8. Monash University (3)
  9. Curtin University (13)
  10. James Cook University (24)
Time's ranking starts from a shortlist: an institution needed at least one highly cited researcher in Clarivate's list, had to be "among the most renowned and frequently mentioned institutions", or applied to be on the list. The detailed ranking appears similar to other such schemes, which tend to emphasize research output rather than education. Time appears to have tried to emphasize education more, with measures such as resource expenditure per student and faculty-to-student and staff-to-student ratios. There is no measure of what proportion of staff are qualified to teach. Essentially, these are measures of input, rather than output.

Wednesday, January 31, 2024

Open University Ranking System?

Cameron Neylon, Curtin University
Greetings from the Future Campus webinar on the CWTS Leiden Ranking Open Edition. This claims to be an open, verifiable way to rank universities. It is an advance on the ranking systems produced by publishers, in that you can look at the data and use it to come up with your own custom ranking system. Also, Leiden ranks 1,411 universities, which is more than most other systems (except Webometrics). Cameron Neylon mentioned the interesting work he is doing at Curtin University on a more detailed analysis of Australian university rankings.

The rankings on the Leiden system are not that different to others, in that the leading Australian universities still come out on top. This should not be a surprise, as it ranks research only. As Elisabeth Gadd pointed out in the webinar, the rankings are not an indication of education quality. As a result, in my view, these rankings are no better than previous ones for the purpose for which they are most used: students, and their parents, selecting a university.

Unfortunately the public assumes that research quality translates to education quality. Universities exploit this misunderstanding in their marketing. The ranking scheme I still prefer is Webometrics, as it covers more institutions. 

Thursday, January 18, 2024

Petition Against Silly University Ranking Systems

Philip Moriarty has published an Open Letter, "We reject the absurd QS/THE World University Rankings", which has so far been signed by 107 academics. While I agree with the sentiment, this list of distinguished academics has missed the point of university rankings: they aren't about scientifically measuring the scholarly & educational worth of institutions; they are a marketing tool. As a marketing tool, and a publishing business, rankings have been very successful. However flawed, the rankings are a response to a real need from students: advice on the quality of institutions.

When I wanted to study internationally, I turned to ranking systems to check the university I had selected was okay. I didn't care exactly where it rated, but more that it was actually a real university, not a scam. For this I used the Webometrics Ranking of World Universities, as it lists many more institutions than the commercial ranking schemes, and is more open.

If academics want to do away with the current silly ranking systems, they have to come up with something better, which measures what they think should be measured. They then need a marketing strategy to have it adopted, and a business model to sustain it.

Tuesday, October 25, 2022

University Rankings Are a Marketing Tool Not a Student Guide

Julie Hare writes "Why university rankings don’t tell you what you need to know" (AFR, Oct 23, 2022). But ranking systems, such as that from Times Higher Education (THE), were never intended as a guide for students in selecting a university. My favourite ranking system is the non-profit Webometrics, which includes things like openness. There are also some awards which explicitly look at teaching quality, such as the Good Universities Guide Awards, which show that institutions which rate poorly on THE rankings can do well when it comes to education.

The ranking schemes from publishers are designed as marketing tools to help promote their publications, and to help their advertisers (the universities) promote themselves. Like many industry awards, the university rankings are designed to appeal to the vanity of the established organisations and their executives. The reality is that research excellence has little to do with the quality of the education provided by a university. If anything there is a negative correlation, as researchers are not selected for their teaching ability. Also, the quality of the teaching has little to do with a student's learning outcomes, as these mostly depend on the student, their background, and the support they get externally. Universities within a particular national system also tend to have similar outcomes: Australia has a strict government regulatory framework for universities, so there are no bad ones, and not that much difference between the top ones.

Sunday, December 6, 2020

UNSW Ranking Of Universities: a Result Looking for a Rationale

The UNSW Aggregate Ranking Of Top Universities combines the Times Higher Education (THE), Quacquarelli Symonds (QS) and Academic Ranking of World Universities (ARWU) to create the ARTU. This is claimed to provide a long term comparison of top universities. But it is not clear if this combination has any theoretical underpinning, or practical purpose, except to boost UNSW's rating.

Such ranking systems are used by students in selecting an institution to study at. However, these rankings are heavily weighted to research performance, while most students are undertaking coursework. Also these ranking systems only include a few hundred universities, while there are thousands of other good educational institutions excluded. A reliable ranking of institutions would be useful for students, but ARTU is not that.

My favorite university ranking scheme is Webometrics, which relies on readily available information. As a result, about 30,000 institutions are listed, including many vocational colleges, which are excluded from other ranking systems.

Here are the institutions I have studied at most recently. All these institutions provided a useful quality education, but only the first (ANU) is in the ARTU list:

National Rank | World Rank | University | Presence Rank* | Impact Rank* | Openness Rank* | Excellence Rank*
6 | 75 | Australian National University | 159 | 77 | 70 | 140
29 | 764 | University of Southern Queensland | 1721 | 864 | 777 | 976
49 | 8269 | Canberra Institute of Technology | 13734 | 5420 | 5819 | 6626

Canada

National Rank | World Rank | University | Presence Rank* | Impact Rank* | Openness Rank* | Excellence Rank*
44 | 1424 | Athabasca University | 2454 | 959 | 1884 | 2428

Sunday, September 13, 2020

More Relevant Higher Education Ranking Systems Needed

Australian education consultancy Studymove has pointed out that the higher the QS World University Ranking for an Australian university, the higher the international tuition fees charged. The correlation applies across fields and levels of study, but is least for education studies. The consultants speculate that high unemployment may result in students selecting institutions and programs with a higher ranking for employment outcomes. However, I expect many students don't look past the overall ranking of a university to examine individual measures. The current ranking schemes are weighted towards academic quality and research output, although these are not relevant to most students.

Like many such ranking systems, QS is heavily skewed to academic reputation. In the case of QS, the largest proportion of the overall measure is based on a survey of academics. Other ranking systems use research papers published as a measure of academic quality. However, the research at a university has little effect on the quality of education a student receives. Most students are not planning to be researchers, and few who complete research degrees end up in research jobs. Researchers don't make particularly good teachers anyway, and this emphasis on research may be resulting in students making poor education choices.

Webometrics uses measures of openness, as well as quality, to provide a more relevant ranking of universities. This relies on readily available information, rather than surveys, which has the advantage of allowing inclusion of many more institutions. In particular, Webometrics includes thousands of vocational institutions which are excluded from most university ranking systems. Australia has only 43 universities, but Webometrics lists another 150 non-university institutions.

Despite the different measures used, Australian universities outrank the non-university institutions on the Webometrics scoring, and the ranking of universities is not very different from QS. This suggests it should be possible to create new rankings of universities at low cost, using a similar approach to Webometrics.

The Webometrics methodology, like that of QS, is heavily weighted towards academic quality. Overall, 35% of the measure is based on papers cited in research publications which perhaps explains why universities are at the top of the list, outranking vocational institutions. 

The ordering of the universities is different to QS, but the same universities feature near the top of both lists. Near the bottom of the universities in the Webometrics ranking are a few large state government vocational institutions: TAFE NSW, TAFE Queensland, Adelaide Institute of TAFE and the Canberra Institute of Technology. Just one private for-profit vocational provider also features: Open Colleges, which has a 100-year history in correspondence education.

I pasted the Webometrics list of Australian universities into a spreadsheet and reweighted the scores by deleting the excellence measure. The leading universities remained at the top of the list, but a few vocational institutions crept up a little: Open Colleges from 41st to 39th place, TAFE NSW from 43 to 40, TAFE Queensland 46 to 44, Adelaide Institute of TAFE 45, Kangan Institute 49. The Canberra Institute of Technology dropped one place from 49 to 50.
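The same re-weighting can be scripted rather than done in a spreadsheet. A minimal sketch follows, where the institution names and scores are invented placeholders (not actual Webometrics data, which uses ranks rather than higher-is-better scores):

```python
# Re-rank institutions after dropping one component, as described above.
# Names and scores below are illustrative placeholders only.
institutions = {
    "University A": {"presence": 90, "impact": 85, "openness": 80, "excellence": 95},
    "College C":    {"presence": 65, "impact": 70, "openness": 60, "excellence": 40},
    "TAFE B":       {"presence": 70, "impact": 60, "openness": 75, "excellence": 20},
}

def total(scores, drop=()):
    """Sum component scores, omitting any components named in `drop`."""
    return sum(v for k, v in scores.items() if k not in drop)

# Ranking using all four components, highest total first.
full = sorted(institutions, key=lambda n: total(institutions[n]), reverse=True)

# Ranking with the excellence measure deleted.
no_excellence = sorted(institutions,
                       key=lambda n: total(institutions[n], drop=("excellence",)),
                       reverse=True)

print(full)           # ['University A', 'College C', 'TAFE B']
print(no_excellence)  # ['University A', 'TAFE B', 'College C']
```

With these invented numbers, the research-weak vocational provider moves up once the excellence component is removed, which is the effect described above.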

Wednesday, June 10, 2020

QS Top Universities Rankings 2021

The latest QS Top Universities Rankings have seven Australian institutions in the top 100, ranked: 31 ANU, 40 Sydney, 41 Melbourne, 44 NSW, 46 Queensland, 55 Monash, and 92 UWA. These results are similar to other ranking schemes, are widely reported, and influence student choices. However, the relevance of the methodologies used is questionable.

The QS rankings are based on six indicators: Academic Reputation, Employer Reputation, Faculty/Student Ratio, Citations per Faculty, and International Faculty and Student Ratios. Reputation makes up half of the ranking, but is, in part, a self-fulfilling prophecy: academics and employers base their rating of an institution, in part, on how it has ranked in the past. The ratio of teachers to students might seem more objective, but this doesn't take into account whether these teachers are qualified to teach (most university academics do not have a degree in education). Citations provide some measure of research output, but this may indicate less interest by the staff in teaching. The international staff and student ratios are used as a proxy for soft skills, but do not measure whether the international staff and students mix, or are confined to specific campuses and classes.
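As a rough sketch of how such a composite is assembled, here is the weighted-sum calculation using the indicator weights QS has historically published (the example institution's scores are invented for illustration):

```python
# Weighted composite in the style of the QS methodology described above.
# Weights are QS's historically published values; scores are invented.
WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

def composite(scores):
    """Weighted sum of normalised (0-100) indicator scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical university: strong reputation, weaker teaching ratio.
example = {
    "academic_reputation": 90.0,
    "employer_reputation": 80.0,
    "faculty_student_ratio": 60.0,
    "citations_per_faculty": 70.0,
    "international_faculty": 85.0,
    "international_students": 75.0,
}
print(composite(example))  # 36 + 8 + 12 + 14 + 4.25 + 3.75 = 78.0
```

Note that the two reputation surveys together carry 0.50 of the weight, which is the "half of the ranking" referred to above.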

Another difficulty with QS and similar ranking systems is that they only cover about one tenth of the higher education institutions in the world. Most students can't afford to attend one of these "top" 1,000 universities. Most students would be better off with an institution not in the top 1,000, but which focuses on quality teaching for skills in demand, rather than research. The Webometrics Ranking of World Universities tries to provide a broader ranking, and covers more institutions.

Wednesday, June 13, 2018

Charles Darwin University in Top 4% Globally

Charles Darwin University claims to be "... ranked in the top two percent of universities worldwide", based on the THE World University Rankings. However, CDU is ranked 301–350 out of 1,000 institutions, placing it roughly in the top 30–35%, not 2%. In 2017 the UK Advertising Standards Authority issued guidelines to stop UK universities making similar misleading claims. Perhaps similar guidelines are needed in Australia.

The Webometrics Ranking of World Universities places CDU at 1,165 out of 27,000 institutions. So it would be reasonable, on that evidence, for CDU to claim to be ranked in the top four percent of universities worldwide.

Tuesday, March 20, 2018

Data Science Ranking Universities

In "Data science can fix ranking briar patch" (The Australian, 14 March 2018), Tim Dodd suggests that data science could be used to rank universities, but this has been done for years with the "Ranking Web of Universities". Its methodology emphasizes the quality and quantity of information universities provide online (which I think is a good thing). This produces slightly different rankings for Australian universities than measures emphasizing research output behind paywalls.

For its top ten Australian institutions the QS World University Rankings has: ANU, Melbourne, UNSW, Queensland, Sydney, Monash, UWA, Adelaide, UTS and Newcastle. On the Ranking Web of Universities, ANU slips from first to fifth place, while Curtin and Macquarie displace UTS and Newcastle in the bottom two places.

Web Rank (Australia) | Web Rank (World) | University | QS Australian Rank | Presence Rank* | Impact Rank* | Openness Rank* | Excellence Rank*
1 | 55 | University of Melbourne | 2 | 179 | 108 | 53 | 26
2 | 62 | University of New South Wales | 3 | 166 | 87 | 85 | 52
3 | 63 | University of Queensland | 4 | 152 | 114 | 72 | 37
4 | 74 | University of Sydney | 5 | 626 | 172 | 94 | 30
5 | 75 | Australian National University | 1 | 122 | 72 | 75 | 131
6 | 93 | Monash University | 6 | 571 | 206 | 97 | 57
7 | 129 | University of Adelaide | 8 | 531 | 143 | 128 | 149
8 | 136 | University of Western Australia | 7 | 581 | 189 | 171 | 115
9 | 235 | Curtin University of Technology | 15 | 1069 | 227 | 245 | 276
10 | 271 | Macquarie University | 12 | 553 | 266 | 196 | 341
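One way to quantify how closely the two orderings agree is Spearman's rank correlation. A minimal sketch using the ten universities above (short names substituted for readability; the QS ranks outside 1-10 are re-ranked within this set of ten before applying the standard formula):

```python
# Spearman rank correlation between the Web ranking order (1-10 above)
# and the QS Australian ranks from the same table.
web_order = ["Melbourne", "UNSW", "Queensland", "Sydney", "ANU",
             "Monash", "Adelaide", "UWA", "Curtin", "Macquarie"]
qs_rank = {"ANU": 1, "Melbourne": 2, "UNSW": 3, "Queensland": 4, "Sydney": 5,
           "Monash": 6, "UWA": 7, "Adelaide": 8, "Macquarie": 12, "Curtin": 15}

# Convert the QS ranks to ranks within this ten-university set (no ties).
qs_within = {u: r + 1 for r, u in enumerate(sorted(web_order, key=qs_rank.get))}

# rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), where d is the rank difference.
n = len(web_order)
d2 = sum((i + 1 - qs_within[u]) ** 2 for i, u in enumerate(web_order))
rho = 1 - 6 * d2 / (n * (n * n - 1))
print(round(rho, 3))  # 0.855
```

A rho of about 0.855 confirms the point in the text: the two rankings largely agree, with ANU's five-place slip the main difference.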

Monday, June 19, 2017

Webometrics Ranking of Ten Thousand Universities

The "Webometrics Ranking of World Universities" by Spanish researchers has advantages over the better known ranking systems and produces a few surprises. Webometrics ranks more than ten thousand universities around the world, which is many more than other measures provide. For Australia, 199 institutions are listed, whereas other rankings mostly cover only the 43 accredited as universities, not the colleges which also issue Higher Education qualifications.

Webometrics uses four measures: Presence, Impact, Openness and Excellence. While the weighted score of these measures gives a similar result to other schemes, there are some surprises. For example, many ranking systems give a similar result to the Times Higher Education University Rankings, with the "Group of Eight" first:
  1. University of Melbourne
  2. Australian National University
  3. University of Queensland
  4. University of Sydney
  5. Monash University
  6. University of New South Wales
  7. University of Western Australia
  8. University of Adelaide
Webometrics lists the same top Australian universities, but in a different order:
  1. University of Melbourne
  2. University of Queensland
  3. University of New South Wales
  4. Australian National University
  5. University of Sydney
  6. University of Western Australia
  7. University of Adelaide
  8. Monash University
The ANU slips two places due to low Presence and Excellence scores. Monash drops three places to eighth position, due to a very low Presence score.

The Presence measure is unusual in university ranking systems, in that it measures the quality of the university's website. Ranked by presence, the top eight Australian universities are:
  1. University of Queensland
  2. University of Melbourne
  3. University of New South Wales
  4. Australian National University
  5. RMIT University
  6. Edith Cowan University
  7. University of Western Australia
  8. Macquarie University
Sydney, Adelaide and Monash are replaced by RMIT, Edith Cowan and Macquarie. The effect of the Presence measure is more pronounced in the world rankings, where the usual US and UK prestige institutions rank first overall in Webometrics, but for Presence the University of São Paulo and Fundação Getúlio Vargas (Brazil), at 6 and 7 in the world, outrank Caltech and the University of Oxford.

Another interesting result is that universities with "open" in their names do not rate highly on the Webometrics openness scale (the Open University UK ranks highest, at 403). Similarly, those with "virtual" in their name do not rate highly for web presence (the Tamil Virtual Academy is highest, at 1172).

The institutions I have studied at most recently rank as follows on Webometrics:

Rank | Institution | Presence | Impact | Openness | Excellence
78 | ANU | 300 | 70 | 69 | 131
738 | USQ | 1438 | 658 | 718 | 1200
1242 | Athabasca University | 1256 | 1169 | 1362 | 1879
8175 | CIT | 4961 | 6352 | 8635 | 5778

This is much as I would expect, with ANU being a leading research university, USQ and Athabasca teaching universities, and CIT a vocational college. However, both USQ and Athabasca score poorly compared to ANU on presence and openness, despite their emphasis on e-learning and access to education. USQ claims a "commitment to open education" and Athabasca describes itself as "Canada's Open University", but neither rates in the top one thousand.

Tuesday, May 10, 2016

Reduce Number of Australian Universities to Improve Rankings

The latest Times Higher Education's World Reputation Rankings has only three Australian universities in the top 100:  University of Melbourne, Australian National University and University of Sydney. These rankings are based on the opinions of academics. As Asian countries increase in wealth they can afford to spend more on their universities and so their rankings will increase and those of other countries will continue to decline.

The rankings are not very meaningful for academics, as it is the team for a particular sub-discipline which is important, not the institution overall. However, the university reputation is important in terms of attracting students and funding.

One way Australia could improve its rankings is to have fewer universities. Currently Australia has 43 universities for 1.3 million students, which is about 30,000 students per university.  Research by Garrett (2016) indicates that larger institutions (100,000 to 500,000 students) were growing, whereas smaller ones were declining. This suggests Australia should have about 10 universities, each with about 130,000 students. This need not require a radical restructuring or closing of campuses, just merging of university brands.

Garrett's study was of on-line universities and it might be argued that this is not applicable to Australia's campus based institutions. However, a radical restructuring of universities is happening in Australia, brought on by adoption of e-learning. This will happen regardless of the size of Australian universities and will happen regardless of what they do. Currently this restructuring is being largely undertaken by stealth under the label of "blended learning". However, whatever it is called, if Australian universities don't restructure for e-learning and have a sufficient size to make them viable, they will be put out of business by overseas institutions offering education on-line to Australian students.

References

Garrett, R. (2016). The State of Open Universities in the Commonwealth: A perspective on performance, competition and innovation. Retrieved from: http://dspace.col.org/bitstream/handle/11599/2048/2016_Garrett_State-of-Open-Universities.pdf?sequence=1&isAllowed=y

Monday, January 11, 2016

Are Malaysian Universities Performing Poorly?

Murray Hunter asks "Why Malaysian universities are performing poorly" (OnLine Opinion, 11 January 2016). He notes that Malaysian universities do not rank well on regional or global scales, such as the QS World University Ranking. I checked the latest rankings and found the top ranking Malaysian institution was Universiti Malaya at 146, then another four in the top 400 and two in the top 1,000. This does not appear such a bad result to me. Singapore has only two, although they rate higher than Malaysia (NUS at 12 and NTU at 13).

Hunter dismissed a suggestion by Dr. Kamarudin Hussin, VC of Universiti Malaysia Perlis (Unimap), that the rankings favor established universities. I checked Malaysia's ranking on a slightly different and more progressive ranking system, the Ranking Web of Universities (RWU). This lists 118 Malaysian institutions (many are "colleges" rather than universities, and some are campuses of overseas institutions). University of Malaya comes top, as it did in QS. The RWU lists 11,898 institutions, whereas QS stops ranking at 701, so UM's ranking of 461 in RWU is a good result. The RWU places more emphasis on factors such as a university's web presence, which perhaps explains why Malaysian institutions rank better than on QS.

Hunter suggests that the poor performance of Malaysian universities is due to a lack of academic freedom and expenditure on non-academic items. However, he had already commented that universities in countries with authoritarian governments had done well in the QS rankings. Also, allegations of lavish entertaining and trips by university staff are not confined to Malaysia.

Hunter suggests Malaysian universities are "dominated by vice chancellors who are intent on micromanaging their universities". This is not a new complaint about universities world wide, nor is his proposed solution: "re-organize Malaysian public universities from the top down". However, if the problem is micromanaging from the top, then I suggest any top down approach will likely make matters worse, rather than better.

Universities are not top-down organizations, and VCs do not really run them. Universities are made up of semi-autonomous units which do the real work. A university is similar to a corporation with multiple business units, or a country which is a federation of states (as Malaysia is). There is a delicate balancing act as to which functions are administered centrally at a university and which are left to policy that the parts administer themselves. This also applies at the national level, with government administering some aspects of universities centrally and leaving other aspects to policy. An example is the way quality of research and teaching is set, either through direct setting of standards, or encouraged through grants.

Malaysian Universities QS Rankings

146, 289, 303, 312, 331, 551, 701+, 701+

Malaysia in Ranking Web of Universities

Rank | World Rank | University | Presence Rank* | Impact Rank* | Openness Rank* | Excellence Rank*
1 | 461 | University of Malaya | 601 | 1158 | 250 | 380
2 | 517 | Universiti Teknologi Malaysia | 94 | 1471 | 164 | 526
3 | 522 | Universiti Sains Malaysia | 984 | 1090 | 569 | 422
4 | 589 | Universiti Putra Malaysia | 173 | 1684 | 726 | 484
5 | 703 | Universiti Kebangsaan Malaysia / National University of Malaysia | 771 | 1863 | 677 | 527
6 | 881 | Universiti Teknologi MARA / MARA University of Technology | 120 | 2141 | 441 | 1063
7 | 1186 | Universiti Tenaga Nasional | 1750 | 1377 | 2217 | 1610
8 | 1372 | Universiti Tun Hussein Onn Malaysia | 1431 | 2283 | 1970 | 1543

Monday, March 9, 2015

Rate University Teaching Quality On-line

In "The hunt for Australia's best teaching uni" (The Age, 9 March 2015), Gary Newman asks how to rank the quality of teaching at universities. Using the available measures, he lists Australia's top universities for teaching as: Victoria, Griffith, Deakin, Central Queensland, Tasmania, Edith Cowan, Canberra, Federation, Bond, Notre Dame, Sunshine Coast, New England and Western Sydney. That seems a reasonable list, with the regional universities not known for research having to focus on teaching. Also, many of these institutions started as technical and teachers' colleges devoted to education and now have an emphasis on e-learning (which requires a more professional approach to teaching).

As he points out, the current ranking systems are based largely on research quality, which is not necessarily related to the quality of teaching. He says a 2013 study provides some data (this appears to be "Factors Associated with Job Satisfaction Amongst Australian University Academics and Future Workforce Implications" by Peter Bentley, Hamish Coates, Ian Dobson, Leo Goedegebuure and Lynn Meek).

One interesting variation in rankings is the Ranking Web of Universities (Webometrics Ranking), which bases its assessment on the university's web presence. This produces a result similar to other measures but is easier to compile. Perhaps that approach could be applied to provide a measure of teaching quality, not just for Australian institutions, but worldwide.

This, of course, assumes that students are interested in teaching quality. It may be that students select a university based on its research reputation, as that is how the quality of their qualification will be seen.

Wednesday, December 31, 2014

University Ranking

The Global Employability University Survey 2014 has ANU as the top ranking Australian university (20 in the world), followed by Monash University (33), University of Melbourne (50) and UNSW (55).

This compares with the Times Higher Education World University Rankings 2014-2015 with the top Australian institution, University of Melbourne (33), then ANU (45), Sydney University (60), University of Queensland (65), Monash University (83), UNSW (109).

The university ranking system I find most interesting is Webometrics. This is based on an analysis of the university's web pages, with measures for presence, impact, openness and excellence. On these measures, Australia has first the University of Melbourne (82), then University of New South Wales (96), University of Queensland (98), Australian National University (101) and Monash University (114).

Wednesday, August 28, 2013

Plan to Rate US Colleges and Cap Student Loans Similar to Australia

US President Obama released a "Plan to Make College More Affordable: A Better Bargain for the Middle Class" (White House, 22 August 2013). This envisages publishing performance measures for US colleges from 2015, and later tying government aid to these ratings. It is also proposed to cap student loan repayments at 10 percent of monthly income. These measures are similar to some already in place in the Australian higher education system. The USA could learn from Australia's experience.

The Australian Government provides subsidized student loans called "HELP" which only need to be repaid when income reaches a set level (currently $51,309 per year). This is a relatively low cost scheme to administer, but is not without problems (such as students who leave Australia and never repay the debt).

If government is funding education, then it seems reasonable that there is a government mandated minimum standard. This system applies in Australia to higher education, with both private and public institutions required to meet minimum standards. But a government rating one institution as better than another is more problematic, and is not done in Australia. Apart from the problem of having a reliable system, it has to be asked what the purpose of a government rating would be and whether it would be effective.

Australia has the Tertiary Education Quality and Standards Agency (TEQSA), which is a government agency, but relatively free of political interference, with academics setting the standards.

The Australian Government provides the "MyUniversity" website with details of all Australian institutions. This details the qualifications of the staff and how many have awards for teaching, which is relatively uncontroversial. More at issue are the student survey results for each subject area.


I have recently undertaken tertiary teacher training and so am comfortable with the idea of being rated by the students and have designed a course which rates highly. But some of my colleagues have difficulty with the concept and practice of designing courses which meet external standards and are popular with students.

There is the risk of a race to the bottom, with courses which just meet minimum standards and are designed to be easy and therefore popular. But I find that students value a course which challenges them. Also, meeting external standards is a useful discipline for the course designer.


The USA might want to adopt more of the Australian approach to encourage quality higher education. However, problems remain with both systems. One issue is whether the information provided about courses actually influences student behavior, and whether it is useful in making a decision on what and where to study. As an example, there are swings in different sectors of the economy. If these swings match the length of a course, then the information reported about students' success in the workforce will give the wrong signals to students.

As an example, Australia has experienced a mining boom over the last few years, with a high demand for skills in that sector. A student looking at the statistics would think mining is a good area to get into. However, that boom is now coming to an end. The student may find there are no jobs by the time they graduate. Or, if they are lucky, the end of their course will coincide with the next upswing in mining. One way to counter this is to provide predictions of future employment (which have their own problems). Another solution is to provide courses suitable for a range of jobs.

Sunday, July 21, 2013

Ranking Web of Universities

Various rankings of universities have been published. These use a combination of research publications, the status of university staff and some measures of teaching quality. This can influence a student's choice of an institution and so is taken very seriously by university administrators. One interesting variation is the Ranking Web of Universities (Webometrics Ranking), which bases its assessment on the university's web presence. Apart from making the process more automated, this more closely mimics a world where academics get information online, not from traditional publications. Interestingly, the results from Webometrics are not so different from the rankings produced by more labor-intensive and traditional methods.

In the latest Webometrics, the top ranking Australian university is the Australian National University (ANU) at 76 in the world. The overall rank is computed from four components: Presence Rank: 335, Impact Rank: 96, Openness Rank: 110, Excellence Rank: 131.
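Webometrics combines these component ranks (where lower is better) into a single weighted score. A minimal sketch using ANU's component ranks above, with purely illustrative weights (Webometrics has varied its actual weighting between editions, so these numbers are assumptions):

```python
# Combine Webometrics-style component ranks (lower is better) into one
# score. Weights here are illustrative assumptions, not the official ones.
WEIGHTS = {"presence": 0.15, "impact": 0.50, "openness": 0.10, "excellence": 0.25}

def combined_rank_score(components):
    """Weighted average of component ranks; a lower score ranks higher."""
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)

# ANU's component ranks from the text above.
anu = {"presence": 335, "impact": 96, "openness": 110, "excellence": 131}
print(combined_rank_score(anu))  # 50.25 + 48 + 11 + 32.75 = 142.0
```

Because Impact carries the largest weight in this sketch, a poor Presence rank (like ANU's 335) drags the combined score down much less than a poor Impact rank would, which is consistent with ANU's strong overall position.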

ANU is followed in the Webometrics list for Australia by: University of Melbourne, University of Sydney, Monash University, University of Queensland. The Times Higher Education World University Rankings  has University of Melbourne before ANU,  then University of Sydney, University of Queensland and University of New South Wales, with Monash University relegated to sixth place. The Academic Ranking of World Universities has the University of Western Australia displacing Monash for fifth place.

While universities would need to make large investments in research over many years to improve their ranking in traditional systems, it should be much simpler and cheaper for university administrators to improve the Webometrics rank. This is because web "presence" and "openness" are two of the four criteria used.

Administrators may not be able to get researchers to do research any quicker or to write better papers, but they can improve the university's web site and access to on-line publications. The web sites can be improved by using web accessibility guidelines, so that web pages are easy to access. Also universities can pay the additional publication fees to have journal papers made open access, so that readers don't need to pay a subscription to read them.

Often marketing and graphic design staff produce complex, hard to access web designs, in the mistaken belief these will appeal more to readers. What in fact happens is that web search engines can't index the content, and people, particularly those using mobile devices, can't read the documents. It is better to use simple web formats.

Researchers will choose the most prestigious journals to publish their papers in. However, these publications tend not to be "open access" and require a subscription to read the articles. Some journals offer to make individual papers open access if the author pays an additional fee. But academics are reluctant to pay these fees, unless there is a requirement from their institution (and a special grant) to do so. As a result, these closed-access papers tend to be read and cited less than open-access ones, simply because they are harder to get online.