Monday, 24 July 2017

What is the point of world University rankings?

The ShanghaiRanking Consultancy (SRC), which claims to be the official publisher of the Academic Ranking of World Universities, has published its rankings of mining and mineral engineering university departments for 2017.  I am sure that those listed in the top 10 will be very pleased with this publication, but I am equally sure that many departments will feel that they have been hard done by, particularly when the listing is compared with the 2017 QS World Ranking of Top Mineral & Mining Universities. I criticised the QS list in 2016 (posting of 5th April 2016), as many of the ranked universities did not have minerals departments, which makes you wonder how these lists are put together.
For the 2nd year running the Colorado School of Mines is #1 in the QS ranking, but only merits #45 by the SRC. The top ranked SRC university, the admirable University of Science and Technology Beijing, does not merit a place in the QS rankings!
Of the universities in the listings, there is only one which appears in the top 10 of both listings- the University of Queensland. The world renowned Camborne School of Mines does not appear in the top 50 of either listing.
So I ask once again- what is the point of these listings? At best they are controversial, and at worst they create much ill-feeling and despondency, and I am sure can be very damaging to the reputation of many departments.
Twitter @barrywills

18 comments:

  1. The rankings make no sense and do not show the quality of a university. Moreover, in the Latin American list, for example, the #1 university sits behind another Latin American university in the world ranking. It makes no sense at all, and they should not be published.

    ReplyDelete
  2. Yes. They increase one's self-confidence. A graduate from Columbia University will certainly start their career path with a lot of enthusiasm and vigour, and will have a higher probability of success in professional and personal life.
    Arun Dongrey, India

    ReplyDelete
    Replies
    1. I think the graduates from quality departments who do not appear in the listing would disagree strongly with you Arun. The listings probably don't do much for their self-confidence!

      Delete
  3. Dangerous to play the ranking game, but just as dangerous to ignore it. As with statistics, it can be manipulated. Strangely enough, the way university rankings are done always favours, to an extent, old established quality institutions. It takes many years for committed younger quality universities to see results in the ranking process. However, when it comes to departments, the opposite seems true.
    Leon Lorenzen, Lorenzen Consultants and Mintrex, Australia

    ReplyDelete
  4. I think they are important, but each ranking should be measured in line with the methodology that it uses. Here are the approaches for Shanghai and QS: http://www.shanghairanking.com/Shanghairanking-Subject-Rankings/Methodology-for-ShanghaiRanking-Global-Ranking-of-Academic-Subjects-2017.html https://www.topuniversities.com/subject-rankings/methodology

    Basically the big difference is that Shanghai uses bibliometric measures (citations and publications) and QS uses surveys from academics and employers. Of course these different approaches are going to give significantly different results.

    ReplyDelete
    Replies
    1. Citations and publications rearing their ugly heads again (see posting of 7th July). Obviously quality of teaching plays no part then?

      Delete
  5. This looks like a cribbed-together list based on old data. As in many rankings, the University of California - Berkeley ranks very high (#10), but it no longer has a mining and minerals program. Pennsylvania State University is also high (#16), but it is mostly coal. Then there is the University of Texas - Austin (#43), which is primarily petroleum. And no mention of the University of Nevada - Reno (Mackay School of Mines).

    ReplyDelete
  6. The Shanghai/ARWU ranking for the subject areas is dangerous given the excessive emphasis on citations. This is also gamed by some players, where citations and references are added during the proofing stage, when editors and reviewers have no more control over the edits but the copy editor allows this. Thus researchers of questionable integrity add citations and references, and the copy editor has no insight to reject the late edits. Citations also tend to be low for papers publishing important but controversial views (which are much needed), niche areas, or true innovations. Citations and h-indices can skew outcomes and can also be manipulated.
    To me the QS ranking (at least the latest 2017 one) has all the recognised mining and minerals processing schools, and is balanced by employer and academic standing, but it still includes the impact of citations and h-indices, though to a lesser extent than the Shanghai/ARWU ranking.
    Jacques Eksteen

    ReplyDelete
  7. For mining folks, the number of both graduates and institutions is small. From my perspective, the place an engineer graduates from is secondary to their willingness to work hard and learn new things. Small numbers mean that professors and individuals have a greater influence on 'quality' than which school they come from. So, I'm with Barry, I strongly disagree with Arun. Ranking schools is for people outside the industry. Once you have a degree you don't hear a lot of folks bragging about which school it came from.
    Susan (Sue) Ritz. Jack Rabbit Consulting, Utah, USA

    ReplyDelete
    Replies
    1. I'm in total agreement with you, Sue! Thanks for weighing in with a common sense approach.
      Corale Brierley, Brierley Consultancy LLC, Colorado, USA

      Delete
    2. Agree with this. In the end, performance and competency appear to be more a personal characteristic than a university-related one, particularly since few universities appear to introduce many of the areas of competency necessary for practice by mineral processing engineers in operations or design. They are certainly unable to provide the experiences necessary to develop competencies.
      We are most fortunate if students learn how to learn and how to solve problems (perhaps using a scientific method).

      Bob

      Delete
  8. Much like an h-index, these rankings are a subjective evaluation based upon the chosen metrics behind them. The devil is in the details. However, do not discount them, as university administrators routinely rely upon such manifestations.
    Corby Anderson, Colorado School of Mines, USA

    ReplyDelete
    Replies
    1. If they do, Corby, then it is more than a little worrying.

      Delete
  9. Hi Barry, yes I read that, and I agree, it can be a weak measure that promotes wrong behaviour. But I think teaching quality should at least be affected by the perception of academics and employers in the QS rankings.

    However, I should say that I am from the University of Queensland; I would probably be more likely to protest these rankings if I wasn't so proud of how well we have done.

    ReplyDelete
    Replies
    1. The only evidence of any consistency in these rankings is UQ's position in the top 5 in both lists. So we must assume that you are doing something right!

      Delete
  10. Why do we rank a system, particularly in a discipline like Mining and Minerals? Is it so that all concerned know the quality of the work done, which in turn has helped the profession reach new heights? If so, why not write a few lines on the impact of a given institute on the profession/society? Can governments and policy makers learn "doables" from this ranking so that other institutes can also be motivated to improve?
    For me these rankings are creating more confusion than contributing to any useful purpose.
    Rao,T.C.

    ReplyDelete
  11. Hello, I might be a bit biased, having worked here for 35 years, but it does not seem realistic that the Robert M. Buchan Department of Mining at Queen's does not even appear in some of the rankings, while in others it does quite well. Clearly something is wrong! I'm not sure of the purpose of these rankings! Any advice on what we should do to get on the lists we are not even on?

    Chris Pickles
    Queen's University,
    Kingston, Canada

    ReplyDelete
  12. Obviously the Shanghai Ranking Consultancy did not take into account a graduate's ability to demolish 40 beers, so for that reason alone I would contend that the SRC list is a total fallacy and has no bearing on the actual ability to produce well-rounded Mining Industry professionals.

    Dave Middleditch, ACSM 2004

    ReplyDelete

If you have difficulty posting a comment, please email the comment, and any photos that you might like to add, to bwills@min-eng.com and I will submit on your behalf