Friday, 25 July 2014

It's stakeholder mapping, Jim, but not as we know it

By James Georgalakis

What happens when you create fictitious organisations working on make-believe influencing scenarios and ask a bunch of people who have never worked together before to develop stakeholder maps for them? Quite cool stuff actually.

At a recent IDS short course designed to provide a broad overview of research and policy communications we included a section on stakeholder mapping tools. Over the past decade I have run many workshops which included some form of stakeholder mapping exercise. But whether the sessions took place with civil society organisations in Malawi, researchers in Nepal or social workers in Ukraine, context was always key. During all these capacity building events we worked on real scenarios. So what were we to do in a single day with 25 participants from a broad range of think tanks, universities, NGOs and consultancies, two thirds of whom had never undertaken any kind of stakeholder mapping before? Make it up of course!

We simply created five pretend scenarios based on very different policy contexts, inspired by the range of participants we were expecting. We surveyed them first, checking what kinds of policy actors they typically targeted. We then got pretty creative, making up an organisational profile and an influencing objective, and we even suggested some potential stakeholders they might want to kick things off with. The exercise was otherwise pretty much like any other network-mapping-style process. Participants identified further stakeholders, placed them on a large map to indicate their relationship to the made-up institution, and looked at their relationships to one another. They then scored each one in terms of their level of influence on the issue and the likelihood of them being allies, opponents or neutral in relation to the hoped-for influencing goal.
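For readers who want to capture the outputs of such an exercise in a structured form, here is a minimal sketch in Python of the kind of record each group ended up producing. The stakeholder names, the 1-5 influence scale and the three stances are illustrative assumptions, not the actual course materials.

```python
from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    relationship: str   # relationship to the made-up institution
    influence: int      # 1 (marginal) to 5 (highly influential) -- assumed scale
    stance: str         # "ally", "opponent" or "neutral"

# Made-up entries, loosely in the spirit of the immigration-debate scenario
stakeholders = [
    Stakeholder("Home Affairs Select Committee", "no existing contact", 5, "neutral"),
    Stakeholder("Migration-focused NGO coalition", "close partner", 3, "ally"),
    Stakeholder("National broadsheet journalist", "occasional contact", 4, "neutral"),
    Stakeholder("Sceptical backbench MPs", "none", 4, "opponent"),
]

# Priority audiences: influential actors who are allies or persuadable neutrals
priorities = sorted(
    (s for s in stakeholders if s.stance in ("ally", "neutral")),
    key=lambda s: s.influence,
    reverse=True,
)
for s in priorities:
    print(f"{s.name}: influence {s.influence}, {s.stance}")
```

Documenting the scores like this also preserves the discussion behind the map, which, as noted below, is almost as useful as the map itself.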

One of the main differences from a more conventional session was that it was faster. Detailed maps were produced in under an hour and a half – although the groups would all have happily taken much longer if we'd let them. With less time spent on reviewing objectives (we provided these) and dissecting deep-rooted institutional issues around identity, legitimacy, power and profile, the participants quickly explored different external stakeholders and their potential usefulness. However, the discussions still contained much of the richness that more conventional sessions have. Our approach meant that we had mixed groups learning from one another's institutional and sector perspectives. They quickly discovered they had quite different ideas about how change happens and about the impact of research, and began exploring hidden power relationships. In the group I facilitated, they challenged the narrow list of parliamentary and governmental stakeholders we had suggested and wanted to extend their map to include civil society organisations, social scientists and the media. This in turn helped them unpack what it means to try to influence the quality of a particular public policy discourse with evidence (they were pretending that they were going to engage with the controversial debate on the impact of immigration on the UK in the run-up to a General Election!).

The participants really seemed to gain an appreciation of the importance of mapping your policy environment, and of how documenting the discussion is almost as useful as the map itself. They revisited their maps in the afternoon and used them to select priority audiences, for whom a communications plan was then developed. It all felt pretty real by the end, and the fictitious scenarios appeared to deliver much the same learning, and the same tools to apply back in their own organisations, as any more realistically grounded exercise would have done.


You can download all the course materials, including the stakeholder mapping scenarios and facilitators' notes, here. IDS is currently developing an Achieving Influence and Impact Series, so do let us know if you would like to be kept informed of future courses and free resources on this topic: training@ids.ac.uk

James Georgalakis is Head of Communications at IDS
Follow James @ www.twitter.com/bloggs74

Other posts by James Georgalakis on research communications:

The Guardian
Has Twitter killed the media star?
Marketing: still the dirty word of development?

On Think Tanks
Is it wrong to herald the death of the institutional website?
How can we make research communications stickier? 

Impact and Learning 
Digital repositories – reaching the parts other websites cannot reach
Influencing and engagement: why let research programmes have all the fun?
Going for gold: why and how is IDS bringing our journal back in house and making it open access?



Monday, 14 July 2014

Going for gold: Why and how is IDS bringing our journal back in-house and making it open access?

By James Georgalakis


The recent announcement by IDS that we are not planning to renew our contract with Wiley Blackwell for the publication of our journal, the IDS Bulletin, will have delighted some and baffled others. Re-launching our flagship publication as a gold open access digital journal means the end of subscription income and the end of a large publisher’s marketing support. From January 2016 the Bulletin will be produced in-house and will be available to all for free.


Since 1968 the IDS Bulletin has been an integral part of IDS' research dissemination strategy, covering the major themes and influencing debates within international development. As we move forward we will build on its unique characteristics, including the thematic issues that mobilise scholars from multiple disciplines around key development issues. However, for the first time in its history, from 2016 there will be no paywall, no embargoes and few licensing restrictions to obstruct researchers, students, policy actors and activists from using the Bulletin to support their work.


This new open access IDS Bulletin will be supported by robust editorial and peer review processes, with an editorial steering group made up of IDS fellows from all of our key research areas plus an advisory body to provide oversight. Academic editors of issues will be drawn from across the IDS community, including our partners, and a small in-house production team will provide a high-quality publication, available for free digitally and in print for those who need it.

Who pays?
Of course, the key conundrum of the open access movement has always been: who pays? If not the subscriber, then surely the researcher (via their funder, of course). What this means in practice for conventional social science journals is paying article processing charges (APCs) of around £2,000 per article to commercial publishers. In the case of the new IDS Bulletin we are dispensing with this process altogether: we are simply bringing production and distribution in-house, charging projects and programmes a fixed sum of just under £6,000 for a whole issue of up to nine articles, and fundraising to meet the shortfall. With its long history, policy-focused thematic issues, not-for-profit financial model and full compliance with even the most stringent open access policies, we are confident that the open access IDS Bulletin will attract the financial support it needs to continue to provide fresh new thinking on key development issues.
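To make the comparison concrete, here is a back-of-the-envelope calculation in Python using the round figures quoted above. It is a sketch of the rough economics only, not IDS' actual budget.

```python
# Rough cost comparison using the round figures quoted in this post.
APC_PER_ARTICLE = 2000    # pounds: a typical social science APC
ISSUE_CHARGE = 6000       # pounds: just under this, charged per whole issue
ARTICLES_PER_ISSUE = 9    # maximum articles in a Bulletin issue

apc_route = APC_PER_ARTICLE * ARTICLES_PER_ISSUE
print(f"APC route for a full issue:      ~£{apc_route:,}")      # ~£18,000
print(f"In-house per-issue charge:       ~£{ISSUE_CHARGE:,}")   # ~£6,000
print(f"Saving relative to the APC route: ~£{apc_route - ISSUE_CHARGE:,}")
```

On these numbers, funding a full nine-article issue through conventional APCs would cost roughly three times the flat per-issue charge, which is the gap the fundraising is intended to bridge.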

Why not just publish a hybrid?
A hybrid publishing route involves placing submitted versions of your articles (post peer review, but before editing and formatting) in a repository, and making others gold open access with APCs. Many publishers, including Wiley, have developed hybrid systems. However, the Bulletin is quite different to other journals. With thematic issues that are often built around a specific research project or programme, the use of per-article APCs is almost impossible: we need to fund each whole issue, otherwise it is game over. Hence the charge we apply to projects wanting to commission an issue. Moreover, simply releasing the submitted versions of articles still fails to meet the strictest open access mandates, such as DFID's. In fact, this hybrid route was originally recommended by the UK Government as part of a five-year transition (from 2012 to 2017) towards fully gold open access. It does not work for us, and it may not represent a long-term solution for other learned societies in the social sciences.

An open access strategy fit for fifty years in development studies
Our emerging open access strategy and our desire to pursue engaged excellence demand that we open access to as much of our evidence and knowledge as possible. The schemes that open up journal access to Southern institutions, such as HINARI, AGORA and OARE under the Research4Life programme, are good but no longer go far enough. The ongoing evolution of the IDS Bulletin is in part thanks to Wiley Blackwell themselves, who have over the last six years helped build its credibility and reach. However, the expiry of our contract with Wiley at the end of 2015 offers an opportunity to take the journal into the next exciting phase of its development. It will be re-launched in our fiftieth year and form a key part of our anniversary celebrations as we release the entire back catalogue. This major publishing event will form part of the narrative around the Institute's fiftieth birthday, as we explore our future role in a changing world in which development knowledge is generated globally and we seek to share it with all those who need it.

James Georgalakis is Head of Communications at IDS
Follow James @ www.twitter.com/bloggs74

Other posts by James Georgalakis on research communications:

The Guardian
Has Twitter killed the media star?
Marketing: still the dirty word of development?

On Think Tanks
Is it wrong to herald the death of the institutional website?
How can we make research communications stickier? 

Impact and Learning 
Digital repositories – reaching the parts other websites cannot reach
Influencing and engagement: Why let research programmes have all the fun?

Friday, 14 March 2014

Hosting webinars: lessons from a recent ‘on air’ experience engaging stakeholders in real-time

By Adrian Bannister

As part of a recent e-dialogues week delivered by the Making All Voices Count programme team, colleagues from IDS' Research and Knowledge Services departments worked to convene an entirely web-based audience of invited stakeholders for two online events.

These bookended five days of asynchronous online discussion that took place on the Eldis Communities platform – you can read my top tips for facilitating online discussion here.

So what’s a webinar?
For the uninitiated, a webinar is simply defined as 'a seminar conducted over the Internet'. Like a real-world seminar, this generally involves one or more presenters speaking to, and receiving questions from, an audience.

It happens in real time and typically involves the streaming of audio and video. For the audience, it means being 'virtually there'; for the project team, it means working like a television production unit.
[Image: 'On air' by Jorge Díaz, Flickr, CC BY-SA 2.0]

One key distinctive feature, and advantage, is that nobody (from the panel of speakers or the audience) has to be in the same location. However, each participant must have a broadband Internet connection and know 'where the virtual room is', i.e. a web link. In some cases it is also necessary to download software, e.g. AT&T Connect.

Rationale and approach
We chose to open the e-dialogues week with a webinar event for three reasons:
- to present the programme's own thinking on the questions the e-dialogues week was posing (the focus was on the experience of Technology for Transparency and Accountability Initiatives - T4TAIs);
- to provide an opportunity for programme stakeholders to 'meet' the programme staff and hear them speak directly; and
- to begin to stimulate thinking and interaction among participants around aspects of the online discussion to come.

Both events were an hour long and followed a broadly similar format. A panel of IDS colleagues sat together in the same room, though they could each have been in different locations, and each shared a short presentation that was live-streamed to the audience.

The audience were able to interact with each other via a chat room, and also to queue up questions for the panel. These were fed to the speakers during an extended question and answer session.

The closing webinar provided a mechanism for the panel to verbally summarise and provide commentary on the areas of the discussion that they had facilitated. 

A note about technology
To host these webinars we set up dedicated, private virtual rooms using the ClickMeeting platform, which was provided and supported by Webgathering. Participants were able to access the space via a web browser, which required only a minimal software download: the popular Adobe Flash plug-in.

Though there were only 80 or so invitees for our events, ClickMeeting can host up to 1,000 attendees for large webinars and has useful text translation features for a variety of languages. It also allows meetings to be recorded as a video file (in MP4 format), which makes recordings accessible and shareable with most computer users afterwards.

Webgathering has an extended facility with ClickMeeting, so they can accommodate more presenters than a standard plan allows; a free trial is also available.

Key lessons
Based largely on these experiences, we've gathered our thoughts into 15 top tips for managing webinars, which may be useful for your own projects.


Adrian Bannister is the Web Innovations Convenor & Eldis Communities Coordinator at IDS

Thursday, 27 February 2014

Influencing and engagement: Why let research programmes have all the fun?

By James Georgalakis

Is it time for us to start applying some of the strategies for bridging research to policy used so effectively at a project level to our institutions as a whole? Could a more coherent approach to policy engagement and research communications applied cross-organisationally lead to greater impact?

There is a chasm to be crossed in many research organisations and think tanks: the deep gap between high-level institutional strategies and project-based impact plans. Many project funders pile pressure onto researchers to develop theories of change and detailed research communication strategies, but there is little incentive for research organisations to look at the policy and knowledge landscapes they operate in at an institutional level. Is it really only campaigning organisations, like international NGOs with their advocacy frameworks, that need to take a holistic approach to policy engagement?

The challenge for many research-producing organisations, made up of multiple centres, projects and programmes, is how they can be greater than the sum of their parts. Even academic institutions and think tanks with a clearly articulated mission of actively engaging in policy discourse risk entirely vacating key policy debates, or abandoning prime influencing opportunities, when certain projects come to an end.

Research programme vs institutional priorities

I recently co-facilitated a series of workshops in Nepal as part of the Think Tanks Initiative Policy Engagement and Communications South Asia Programme, with an inspiring group of researchers and communications professionals from fifteen South Asian think tanks. They were all interested in the development of institutional level engagement strategies and were simply not willing to restrict their planning to a specific project. Or as one participant put it, “Why should the research programmes have all the fun?”

They each developed a clear policy engagement goal, or set of goals, that reflected their vision of change. For some, these were softer changes in the nature of the policy discourse; for others, quite specific changes in the direction of policy in a particular area. They then mapped their context and gained a real understanding of how change happens in relation to their hoped-for long-term impact. The lack of a specific set of project or research goals did not seem to dilute the richness of their discussions, but it did lead to a different set of answers. They each looked at their emerging institutional-level impact strategies in relation to an earlier exercise that had assessed their capacity for policy engagement and communications. Areas they needed to invest in at an institutional level, whether social media, publications or knowledge management skills, quickly emerged, as did key relationships, networks and knowledge of policy processes they needed to grow.

See the Center for Study of Science, Technology and Policy from India take a dizzying six-second journey around this strategic planning process: https://vine.co/v/MZzY3xhLrz6

Breaking down barriers

This experience also helped me to reflect on IDS' approach to institutional-level strategic planning. We too have been on this journey in trying to identify a wider set of engagement priorities. Take the post-2015 debate, for instance. Here is a prime example of something that cuts across projects, programmes and research centres. By actively prioritising it in a cross-institutional strategy, and by mapping out our strengths and weaknesses and the key areas of potential engagement, whether in the media, UN processes or the UK Government and Parliament, we have been able to add real value to the work of our project teams and their partners. Some of these groups are explicitly focused on this debate, such as Participate. Others find this framing essential, using it to push their research up the agendas of key policy audiences. We have been able to create a more enabling environment for their work by actively identifying key influencing and engagement opportunities (and challenges), building relevant networks and alliances, and prioritising the timely profiling and intelligent framing of their outputs.

This process has also led to a great deal of cross-organisational collaboration, breaking down the barriers between research teams, projects and multi-sited research centres. So, whilst all our engagement and communications activities remain entirely based on our research (there is no retrofitting of evidence to advocacy objectives here), we are not wholly driven by the ubiquitous project logframe, which cannot always facilitate the type of policy entrepreneurship needed to engage effectively at a national or international level.

There is a wealth of academic papers, blogs, donor guides and other materials on effective research communications and the incorporation of impact strategies into projects. However, there is far less about cross-institutional approaches. Some commentators claim that cross-institutional strategies focused on policy outcomes are simply too broad, but is it time to challenge this? I would love to hear from those who have experience in this area. We need to share our learning and explore ways that researchers and communications professionals can work together to build a strategic framework at an institutional level to support those committed to making sure their research makes a difference.

James Georgalakis is Head of Communications at IDS
Follow James @ www.twitter.com/bloggs74

Other posts by James Georgalakis on research communications:

The Guardian
Has Twitter killed the media star?
Marketing: still the dirty word of development?

On Think Tanks
Is it wrong to herald the death of the institutional website?
How can we make research communications stickier? 

Impact and Learning 
Digital repositories – reaching the parts other websites cannot reach

Tuesday, 18 February 2014

Open knowledge spells murky waters for M & E

By Ruth Goodman

In mid-January I ran a session on monitoring and evaluation (M & E) at the Eldis Open Knowledge Hub Partnerships Meeting. The meeting brought together a group of individuals united by a concern with opening up access to research evidence and, particularly, with increasing the visibility of research from developing countries.

The partnerships meeting was undertaken as part of the Global Open Knowledge Hub (GOKH), a three-year DFID-funded project. The vision for GOKH is that IDS and partners will build on their existing information services to create an open data architecture for the exchange and sharing of research evidence: the so-called Hub. For insight into the issues that need to be addressed in trying to set up an open knowledge hub, see Radhika Menon's recent blog The Global Open Knowledge Hub: building a dream machine-readable world.

Our hope is that, through the open data approach, the partners and third-party users of the Hub will be in a position to extract and re-purpose information about research evidence that is relevant and contextual to their audiences. This in turn will make research content more visible, enabling otherwise unheard voices to contribute to global debate and decision making. My session on M & E, then, was concerned with how we can know whether this is being achieved.
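To give a flavour of what 'extract and re-purpose' could mean in practice, here is a minimal Python sketch of a partner consuming machine-readable records from the Hub. The endpoint URL and field names are hypothetical illustrations, not the Hub's actual API.

```python
import json
from urllib.request import urlopen

# Hypothetical open-data endpoint; the real Hub API may look quite different.
HUB_FEED = "https://example.org/okhub/api/records?theme=climate-change"

with urlopen(HUB_FEED) as response:
    records = json.load(response)

# Re-purpose the shared metadata into a local, audience-specific listing.
for record in records:
    print(f"{record['title']} ({record['country']}): {record['source_url']}")
```

The point of the open architecture is that nothing in this sketch is specific to IDS: any partner or third party could fetch the same records and present them in whatever form suits their own audiences.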

M & E is great. It allows you to track and evidence what works and what doesn't, so that you can learn, improve and grow. To reach this end, though, you need to know how to evaluate your work. When it comes to approaching M & E for the Hub, the waters are murky.
[Image: photo by Kessie-Louise Given, deviantart.com]
Open data approaches are still relatively new, and the body of evidence on M & E for open data work, let alone the specifics of evaluating and learning from this sort of Hub model, is sparse. The traditional technical methods of tracking information on the internet fall over when you make the data open. By making data open you give up most, if not all, of the control over how your data is used, implemented and displayed. There are ways to implement tracking, but these are easily circumvented, so the statistics you can obtain do not reliably represent the whole picture. Depending on how they implement the content, if organisation A is consuming data that organisation B has contributed to the Hub, then the 'hits' may register on organisation A's web statistics, not organisation B's.

Even if and when we do identify the most suitable metric for measuring impact in open knowledge, as we discussed at the workshop, numbers aren't really enough. Indeed, web metrics are unreliable at the best of times, and their value lies in spotting trends in behaviour, not in demonstrating impact. To engage with quantitative data, people need to be clear on what that data is telling them. If open knowledge data is not the most exciting thing in the world for you, or is something you don't quite understand, then numbers are likely to do little to inspire understanding or perceived value of open data initiatives such as the Hub.

However, if you can tell a story about what the Hub has allowed users to do, then people have something real to engage with. Not only will they have a better understanding of the nature and value of your work, but they are more likely to be motivated to care. At the workshop we discussed the potential of collating stories of use as one approach to M & E that might allow us to translate the value and challenges of open knowledge work to a wider audience.
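Returning to the tracking problem above: one partial workaround is for the Hub itself to log consumption server-side, so that each request can at least be attributed to the contributing partner. The Python sketch below is purely illustrative; the function, field names and organisation identifiers are hypothetical, and this approach still only counts partners fetching records, not the end users who eventually view them on a partner's site.

```python
import csv
from datetime import datetime, timezone

LOG_FILE = "hub_consumption_log.csv"

def log_record_request(record_id: str, contributor: str, consumer: str) -> None:
    """Append one consumption event: which record, whose content, who fetched it."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),  # when the record was fetched
            record_id,                               # e.g. "rec-0042"
            contributor,                             # e.g. "organisation-b"
            consumer,                                # e.g. a key registered to organisation A
        ])

# Called from the Hub's record-serving code whenever a record is requested.
log_record_request("rec-0042", "organisation-b", "org-a-key")
```

Even a simple log like this would let the Hub tell organisation B how often its content is being consumed, which feeds directly into the mutual-support ideas discussed next.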

Other possibilities we discussed were around helping and supporting each other. If partner organisation A is featuring content from organisation B, delivered by the Hub, then potentially A could tell B how many hits B's content is getting. If doing some M & E of their own, could partner A even add a couple of questions to their user survey about partner B's data? And what about the experiences and perceptions of the partners using the Hub? Partner organisations' own reflections and observations are as important as those of users in gaining a full understanding of the value and potential of the initiative.

Moving forward, our aim is to convene an M & E working group which, among other things, could serve as a community of good practice where we can be open with each other about our evaluation efforts. By sharing our experiences of different M & E approaches and their challenges, we can work towards a position where we know the influence of this work, can translate it to others in a comprehensible way, and can start to identify what we need to do to realise the potential of this exciting new arena.


Ruth Goodman is Monitoring, Evaluation and Learning Officer at the Institute of Development Studies