After several years of development, the first round of the Excellence in Research for Australia (ERA) initiative was run in 2010, with results published by the Australian Research Council (ARC) earlier this year in the ERA National Report. The exercise has been an overwhelming success in meeting its objective of providing institutions, researchers, industry and students with a sound, evidence-based means of identifying areas of strength and potential, as well as areas where we need to do better.
So overwhelmingly successful, it seems, that it will be changed.
These assessments were made against international benchmarks using the indicators that have been developed over time – in many instances over many decades – by the disciplines themselves. This has underpinned the strong support for the ERA methodology across the higher education research sector.
The silence of the lambs is a signifier of "strong support"?
Now we need more support, more silence, and just a tweak or two of the infrastructure funding "roadmap" noted here?
I have said all along that we are keen to undertake meaningful consultation. We remain open to suggestions on enhancements to what we know to be a very good scheme.
Oops, some ungrateful lambkins have been bleating -
I have been aware for some time of concerns within the sector about certain aspects of the exercise, particularly the ranked journal lists. These concerns have been communicated to me directly, reported in the sector media, and voiced in the ARC's extensive sector consultations ahead of preparations for the second iteration of ERA in 2012.
Hardly surprising, dare I say, when a journal that purveys astrology, quantum mysticism, dowsing, remote healing and other manifestations of what hoary old sceptics such as myself would characterise - fairly or otherwise - as parapsychology (ie World Futures) is 'ranked' and therefore has a value for academic advancement. To adapt the words of Johnny Rotten, just get the DIISR points and don't worry about the bollocks.
The Minister went on to comment that -
The ARC has advised me that consultation has revealed that there is a widespread preference for limited change, to ensure that ERA 2010 and ERA 2012 outcomes can be compared. Overall, however, the ARC considers that making a small number of changes to the ERA 2010 methodology could substantially enhance the integrity and acceptance of the ERA 2010 evaluation exercise, without compromising comparability.
Steady, Sir Humphrey, steady.
As always, we are in the business of making refinements that improve the operation of ERA. I therefore commissioned the ARC to produce an options paper outlining different ways we might be able to utilise these indicators to address these concerns, and to consider any implications arising from the potential adoption of alternatives. I placed particular emphasis on the absolute need to maintain the rigour of the ERA exercise, to ensure the comparability of the results of the next iteration with ERA 2010, and to pay close attention to the detailed concerns of the sector. Within those parameters, however, I wished to explore ways in which we could improve ERA so the aspects of the exercise causing sector disquiet – especially issues around the ranked journals list – could be minimised or even overcome.
As the result of this process, I have approved a set of enhancements recommended by the ARC that deal substantially with those sector concerns while maintaining the rigour and comparability of the ERA exercise. These improvements are:
Problems?
• refinement of the journal quality indicator to remove the prescriptive A*, A, B and C ranks;
• introduction of a journal quality profile, showing the most frequently published journals for each unit of evaluation;
• increased capacity to accommodate multi-disciplinary research to allow articles with significant content from a given discipline to be assigned to that discipline, regardless of where it is published ... ;
• alignment across the board of the low volume threshold to 50 outputs (bringing peer-reviewed disciplines in line with citation disciplines, up from 30 outputs)... ;
• modification of fractional staff eligibility requirements to 0.4 FTE (up from 0.1 FTE), while maintaining the right to submit for staff below this threshold where affiliation is shown (through use of a by-line, for instance).
As with some other aspects of ERA, the rankings themselves were inherited from the discontinued Research Quality Framework (RQF) process of the previous government, and were developed on the basis of expert bibliometric advice. Patterns of their utilisation by the RECs and detailed analysis of their performance in the ERA 2010 exercise, however, have made it clear that the journal lists themselves are the key contributor to the judgements made, not the rankings within them.
Carr concluded that -
There is clear and consistent evidence that the rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes, and based on a poor understanding of the actual role of the rankings. One common example was the setting of targets for publication in A and A* journals by institutional research managers.
In light of these two factors – that ERA could work perfectly well without the rankings, and that their existence was focussing ill-informed, undesirable behaviour in the management of research – I have made the decision to remove the rankings, based on the ARC’s expert advice.
The journals lists will still be of great utility and importance, but the removal of the ranks and the provision of the publication profile will ensure they will be used descriptively rather than prescriptively.
What that means - the Government's enthusiasm for openness and transparency has apparently yet to trickle down through some parts of the education machine - few people yet know. We might thus be cautious in endorsing the Minister's confidence that -
These reforms will strengthen the role of the ERA Research Evaluation Committee (REC) members in using their own, discipline-specific expertise to make judgments about the journal publication patterns for each unit of evaluation.
these improvements will strengthen the ERA methodology and minimise the unintended consequences arising from inappropriate external use of the indicators, while maintaining the comparability of future rounds with the ERA 2010 results.