The Australian

by Jill Rowbotham

Dropping journal rankings for the next round of the Excellence in Research for Australia audit was the correct decision, according to a leading thinker on metrics in higher education. Ellen Hazelkorn, director of research and enterprise in the higher education policy unit at the Dublin Institute of Technology, says the Australian Research Council was right to drop the designated A* to C rankings assigned to the 22,000 journals used in the bibliometrics for ERA.

According to Professor Hazelkorn, best known for her work on global university rankings, there are serious doubts about the role of journals in academic culture.

Although complaints about wrongly assigned rankings were relatively few, they were bitter and persistent, arguably distracting from an exercise that higher education leaders in Australia had, on the whole, judged effective and worthwhile.

But Professor Hazelkorn argued for the relevance of a broader debate about journals in a world of accelerating change.

"Journals, their editors and their reviewers can be extremely conservative," Professor Hazelkorn said. "They act as gatekeepers and can often unintentionally discourage intellectual risk-taking. Indeed, it could be argued that the soaring number of journals is a response to the increasing complexity of knowledge."

This proliferation might be an acknowledgment that there are many legitimate ways of thinking, or it could be a reaction to the perception that journals are closed shops to contrary viewpoints or methodologies.

Academic status and reputation were "so tied up with particular journals, the practice can assert a hierarchy of knowledge and disciplinary values [that] endorse a traditional world order, privileging some researchers and their universities over others".

Another problem was an over-reliance on peer review to measure research impact.

"Impact is perceived simply as that which is read within the academic community rather than impact on society," she said. "Many articles are published, but how many actually have beneficial value for society?

"Assessment should go beyond simply reviewing what one academic has written and another has read.

"Today, policy-makers and the wider society want to know how the research can be used to solve major societal and global challenges."

This argument is gaining currency. US Studies Centre chief operating officer Sean Gallagher wrote in last week's HES that assessments of university quality lacked a measure of interdisciplinary research. "With such strategic importance placed on interdisciplinary research by the world's best universities, along with society's heavy expectations, the pressure will increase on global university rankings to measure a university's IDR performance alongside its disciplinary output."