How did this happen at the University of Ottawa, and what does this mean for the school’s reputation as a whole?
This winter, high school seniors across the country are making one of the biggest decisions of their young lives: selecting a university to attend.
Since 2010, the Times Higher Education (THE) and Quacquarelli Symonds (QS) world university rankings have been go-to sources for these prospective students.
Both lists are based on formulas that pull data to determine a university’s research performance, reputation, international outlook, and employability.
But when it comes to these rankings, there’s a lot more at stake than the decision of a single high school student. In reality, the well-being of the universities can hang in the balance as well.
Fall from grace
At the University of Ottawa, these rankings are taken very seriously.
When it was revealed that the U of O’s THE ranking had fallen below the 250 mark this September, its lowest standing ever, the school’s higher-ups took notice.
During the U of O’s Sept. 26 Board of Governors (BOG) meeting, Jacques Frémont dedicated a large portion of his president’s report to this revelation, saying that the administration would investigate the ranking’s methodology.
BOG member Michelle O’Bonsawin even fielded a question about whether the university should seek damages from THE for receiving such a low score.
“We look at them very closely,” said Serge Nadeau, the associate vice-president of planning at the U of O. “Rankings are one of several indicators that we have. It can generate a lot of discussion and it could lead to changes in priorities at the university.”
According to M’hamed Aisati, vice-president of product management, funding and content analytics at Elsevier, a company that provides a large portion of research data to ranking agencies such as THE and QS, there are three types of people who are concerned about these rankings: future students, the university marketing department, and industry partners.
“Many universities do use the rankings to market themselves,” Aisati explains. “They attract students, typically students from abroad because that brings investment.”
Universities rely on enrolment to meet their budgets, and international students, who pay more than three times the tuition of a domestic student, are a big catch for an institution.
Generally the U of O performs fairly well in these university rankings, but its standing has been dropping steadily over the last couple of years.
On top of sliding in the THE rankings for five years straight, the school’s QS ranking has been falling as well, dropping from 218 in 2014 to 291 in 2016.
Although these drops seem shocking, it’s important to note that they come at a time when QS and THE have made major changes to how they calculate their rankings.
Representatives from Elsevier and the U of O both agree that this methodology change has played a significant role in the school’s 2016–17 results.
Methodology changes
To really understand why the U of O’s rank took a turn for the worse, and how it can improve in the future, it’s important to look at how THE and QS calculate their rankings in the first place.
Both ranking agencies feed very similar inputs into their calculations, but weight them differently.
One of the largest components of both rankings is research performance.
This variable is calculated using research output, or the number of publications a university produced that year divided by the number of researchers at the university. There’s also a citation score taken into account, which depends on how often the university’s publications have been cited in other studies.
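As a rough illustration, here is a minimal sketch of those two ingredients using entirely hypothetical figures; the real THE and QS calculations are more elaborate and normalize citations by field and year.

```python
# Minimal sketch of the two research-performance ingredients described above.
# All numbers are hypothetical; THE and QS use more elaborate,
# field-normalized versions of these measures.

def research_output(publications: int, researchers: int) -> float:
    """Publications per researcher for a given year."""
    return publications / researchers

def citation_score(total_citations: int, publications: int) -> float:
    """Average citations per publication."""
    return total_citations / publications

# Hypothetical example: 3,000 papers, 1,200 researchers, 45,000 citations.
print(research_output(3_000, 1_200))   # 2.5 papers per researcher
print(citation_score(45_000, 3_000))   # 15.0 citations per paper
```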
Daniel Calto, director of solutions services at Elsevier, notes that the U of O’s overall citation score in the THE rankings actually went up in comparison to previous years. In fact, it increased to a score of 67.5 in 2016–17 compared to 61.5 in 2015–16.
“It is unlikely that citations performance had a negative effect on the U of O’s overall ranking unless other institutions had even higher rises in their citations scores,” Calto explains.
Both rankings also weight a teaching and educational environment component. This portion looks at data such as student-to-teacher ratio, the institution’s income, and its reputation.
Internationalism is also highly valued in the rankings, measured by the ratio of international students and faculty members.
The last key area is knowledge transfer with industry, based on the amount of funding received from industry.
But a major change this year to the rankings is the addition of new universities.
There are certain criteria a university must meet before it will be ranked by THE or QS; for instance, it must have published a minimum number of papers. As new universities become more established and start publishing research, they are added to the rankings.
“Changes include the expansion of the THE rankings exercise from 400 institutions in 40 countries to over 800 institutions in 71 countries worldwide,” said Calto.
When these new universities are added, competition increases for every university in the rankings.
Nadeau explained that even if an institution sits lower in the rankings this year, it may still be in the same or a higher percentile than in previous years, due to the addition of more universities.
This means that ranking more universities can produce misleading results. For example, a university that ranks 100th out of a total of 400 sits in the top 25 per cent of global research universities, while a university in the top 25 per cent of a total of 800 universities would receive a rank of 200.
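To make that arithmetic concrete, here is a minimal sketch of the comparison, assuming a simple rank-over-total definition of percentile.

```python
# Minimal sketch of the percentile comparison above, assuming percentile is
# simply rank divided by the total number of ranked institutions.

def top_percentile(rank: int, total_ranked: int) -> float:
    """Percentage of ranked universities at or above this rank."""
    return 100 * rank / total_ranked

print(top_percentile(100, 400))  # 25.0 -> top 25 per cent of 400 institutions
print(top_percentile(200, 800))  # 25.0 -> the same percentile, but ranked 200th
```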
Additionally, Calto noted that universities ranked below the top 100 are often separated by only a few points. Because scores are so tightly clustered, adding new universities with similarly close scores can easily shuffle the positions of institutions that were already on the list.
Another methodology change worth noting is THE’s decision to exclude so-called “kilo papers,” publications with more than 1,000 authors, from this year’s citation calculations.
Calto mentioned that “these papers, a large majority of which are physics papers, are often very highly cited. Exclusion of these papers may affect the rankings, especially at institutions with a significant percentage of their research in physics.”
This is particularly important for the U of O, which, having opened the André E. Lalonde Accelerator Mass Spectrometry Laboratory in 2014, now dedicates significant resources to research in physics.
Additionally, due to the university’s location, many former students are now employed by the government, which can affect the student feedback portion of the rankings.
Sylvain Charbonneau, associate vice-president of research at the U of O, said this creates challenges when ranking agencies contact graduates for reputation surveys. Graduates working in the private sector have much more freedom to discuss the reputation of their alma mater than government employees, who are required to remain unbiased when discussing government-funded universities.
How can the U of O improve?
With a better understanding of how the rankings are built, the U of O can begin the steady climb back up the QS and THE lists.
Since many factors go into determining a ranking, there are also a variety of ways the U of O’s position could continue to slip.
“You could do many things,” said Aisati. “You might do fantastic work in research but you lag behind because you are not internationally seen, or you do not work with industry to get an income or somehow your reputation is not good.”
That said, there are also several key areas to focus on to improve a university ranking.
“We can say increase your infrastructure, attract the best researchers in the world, develop the best programs, attract funding as best as you can,” said Aisati, “but ultimately, you need to have a solid research foundation.”
As such, one of the best places for the U of O to start rehabilitating its reputation is research, since the school already boasts a strong citation score.
In order to increase the number of citations a publication receives, it needs to be seen by a lot of people, and that’s why international collaboration is so important. More authors in more locations means a wider audience.
According to Calto, work “that has even one international author has on average 1.6 times the impact of papers written only by authors in a single country.”
Another important stride the U of O can make is in fostering an interdisciplinary work environment.
“Some of the most highly cited papers are actually in interdisciplinary areas,” said Calto, “often dealing with grand challenges like climate change, aging, and cybersecurity.”
In the end, the key to better research for a university is to build a strong network of collaboration across countries and disciplines.
Aisati encourages universities to “build something that is very rock solid” in order to support the institution in the long term.
What the future holds
When it comes down to it, rankings are obviously important to the U of O. The dramatic drop in rank over the last couple of years is concerning for the school, since these results affect its global status, enrolment levels, and funding opportunities.
However, like most metrics, university rankings aren’t infallible. This past year saw the number of universities ranked by THE double, along with significant changes to the way both THE and QS calculate their results.
Luckily, in 2014, the university launched its Destination 2020 Strategic Plan featuring four pillars of focus: the student experience, research excellence, internationalism, and bilingualism. These pillars, according to the administration at least, will all directly impact research performance and potentially ranking performance as well.
The university has made some strides in this direction by offering a variety of educational activities such as scientific experiments, research projects, co-op, international exchanges, fieldwork, and resources such as the Michaëlle Jean Centre for Global and Community Engagement.
The U of O has also been growing its network of international partners, and this focus has already brought results.
“We have collaborators all over the world. We’ve established a joint school in China, we have a joint lab in Lyon, France, we have the Max Planck Center,” said Charbonneau.
But with the change in methodology, it still might take a few years for rankings performance to stabilize.
Despite the persistent decline in the university rankings, the U of O can still rely on its work in research. Charbonneau said that the collaboration with the Max Planck Center offers a certain amount of prestige.
“The Max Planck society would only partner with the best. And in that field of research, we’re the best.”