LSE’s public policy group put on an excellent conference programme on 4 December at Beveridge Hall. The conference explored four themes: 1) the economic impact of academic research; 2) impact and the new digital paradigm; 3) next steps in assessing impact; 4) impact as a driver for Open Access. Throughout the day there were break-out sessions on different types of social media for enhancing academic impact, but sadly I was unable to attend those (for more info see here).
Upon arrival on a cold London morning, I was struck by the size of the pastries on offer, but once I had secured one I bustled into the main hall for the beginning of the day’s sessions. The economic impact of academic research was a striking title, and I was unsure how the pounds were going to be counted. Patrick Dunleavy set out the work he had been doing on the impact of the social sciences, and the artificial lines in the sand he had to draw to demarcate the social sciences from other work in an increasingly interdisciplinary world. This included impressive figures such as £4.8bn annually as the total value added by the social sciences to the economy.
Nicola Dandridge from UUK argued why economic impact had to be addressed and why UUK were supporters of the REF (controversial rumblings from the audience). She argued that impact was a good way of demonstrating the value of research to the general public, since universities can be viewed purely as places of undergraduate teaching rather than as hubs of research. In her view, the public needs to understand that the value of research runs to billions, not millions, and the research community needs to understand that showing economic value is not a distortion: the information is already there, and it is a matter of articulating it alongside a wider social and cultural value.
Sir Adrian Smith (the headline act) emphasised that while it is well established that the UK has an excellent research base, we are less good at transferring that into social and cultural capital, and that it is important to value research in monetary terms and not just in citations. He highlighted the TSB, research directed towards societal challenges, and the debate surrounding industrial policy as three main factors that may help to change this. The interesting shift he noted was the change from government backing sectors, and then companies, to backing technology groupings (see George Osborne’s speech at the RS).
Session 2 was a fascinating onslaught of digital innovations in the field of academic outputs, with an excellent presentation from Mendeley’s co-founder and CEO Dr Victor Henning on the work they’re doing to track and enable a new digital landscape and the changing nature of academic impact. This was followed by Jason Priem’s presentation on Impactstory, a web service that maps research from its influences to where it has influenced other work and the connections between papers, creating interoperability of metrics with existing services such as Web of Science. Sandwiched between the two presentations was a talk by Ziyad Marar from Sage, who spoke about the changing nature of academic scholarship, insisting that peer review would remain central to measuring scholarly reputation (much to the disagreement of the other two panellists). On the flaws of altmetrics he cited Einstein: “Not everything that counts is countable and not everything that is countable counts.”
An interesting discussion ensued about what are perceived to be the two types of publishing: Publishing (i.e. in journals) and publishing (i.e. making papers publicly available). I think they may have just forgotten the term Open Access. The issues of dealing with fraud and retraction are not fully addressed by either traditional metrics or altmetrics, but the key question of whether the peer-review process may need to evolve was fascinating, couched in terms of ‘Who guards the guards?’, i.e. ‘Who peer reviews the peer reviewers?’. A really fascinating discussion centred on what we mean by impact, and two definitions were described: 1) impact in terms of countable factors such as citations or altmetrics; 2) impact in terms of whether policy, business, or other interested parties act on the work.
Session 3 saw heated debate between David Sweeney and Julia Lane, but Cameron Neylon stole the show with his discussion of the mission of research, and how the motivations of universities have become clouded by rankings and international competition. He argued that the impact of research depends on research being open, so that it is usable and reusable, and that it should not be about where a paper is published but who the paper reaches.
The final session was on surer ground: the theme of Open Access, with the RCUK, Wellcome Trust and British Library all represented on the panel. There was a general feeling of determinism about Open Access on the panel: it was going ahead, and the only things that needed to be decided were at what speed, and how to ensure academics understand the merits and processes of Green and Gold OA. Stephen Curry discussed the merits of OA: better value for public money, faster exchange of research information, greater access for all sectors of society, and a potential driver of impact. Mark Thorley set out the RCUK’s plan over the next five years for increasing Open Access and the funding they will put in place to bring it about. Robert Kiley from the Wellcome Trust explained their approach, with estimated costs of 1.25–1.5% of research costs, and how vital it was to have a CC-BY licence.
One aspect the whole community could agree on was the non-immediacy of research impact: whether it is citations a few years down the line, policy built on the work, or the accumulation of research endeavour producing breakthroughs in 20 years, rarely can research’s impact be measured on the timescales on which politics runs. Despite this, funding has to abide by shorter timescales, and the community awaited the Autumn Statement decisions with bated breath.
p.s. As with any science policy expedition to a conference, the much overused analogy of sausage machines/factories was ridden, butchered, and ground into new and mixed ways of discussing scientific research. It was either undercooked or overdone, and I for one am sick of it now. Calling all science policy people: please find a new analogy!