As co-editor to a forthcoming issue of the IDS Bulletin (autumn 2012) which will be focusing on ‘research communications’ (all facets of it!), I am currently reading about some very exciting work around this.
During our Call for Submissions, Geoff Barnard, former Head of the Information Department here at IDS and now Head of Knowledge Management at the Climate and Development Knowledge Network (CDKN), sent me a link to his blog on Seeking a Cure for Portal Proliferation Syndrome.
Geoff aptly captures a dilemma that anyone working in research communication and knowledge brokering will be familiar with – the temptation to solve some of the challenges around research communication and uptake in development policymaking and practice by gathering all the relevant research into one super, sophisticated website. The underlying assumption is that if only people could access the research (at the click of a button), then the rest will follow.
He obviously hit a nerve: there was a stream of responses to his blog, including one from Catherine Fisher, who also contributes to this blog, highlighting her piece “Ten Portal Pitfalls”. I would urge you to read Geoff's blog and contribute to the debate.
Can we 'scientifically' test for what works when it comes to research uptake?
[Image from: http://188.8.131.52/]
But does similar testing occur in this sector – the one that wants to get good research results out of the lab and into development policy and practice? Is it even feasible to conceive of a scientific test for something as amorphous as “knowledge” and “evidence”?
Going back to 'the cure', we should perhaps be asking whether portals are a syndrome or a symptom. If they are a symptom, the underlying diagnosis could be that research is not being used in policymaking and practice because people don't have access to it. Yet surely the very proliferation of portals highlights that access isn't the problem – after all, how will one more portal succeed where others have failed? What do we know about the successes and failures of portals? What do we actually know about the relationship between portals and research uptake?
With people still racking their brains over measuring the impact of research, there is room for some robust ‘scientific’ testing of what is and isn’t effective in supporting research uptake, and of the place (or otherwise) of portals within this. We recently teamed up with 3ie to carry out an experiment on the effectiveness of the ubiquitous ‘policy brief’ (even more ubiquitous than portals, I would argue). The results are just beginning to come through. Watch this space for more on this – we will of course be sharing our findings with you!