When it comes to developing tools and products, there are ethical issues that you should consider, say Bernd Carsten Stahl FBCS, Marina Jirotka, Grace Eden, Job Timmermans and Mark Hartswood.

You are a software engineer working on a project to create an emotionally aware robot that is meant to enable older people to live at home for longer. This seems to you like a helpful way of addressing the predicted crisis in care as the elderly population grows.

However, you also have some reservations about the project. The robot is bulky and the person cared for is frail - can the robot be engineered to always behave in ways that are safe? One feature is that a networked robot would connect to family and professional carers, who can direct some of its actions remotely, and you worry about what this might mean for the independence of the cared-for person.

You ponder the future, asking yourself whether you really want to live in a society that outsources care of the elderly to machines. While you are confident you can deal with issues that draw directly on your professional skills, such as making a robot that is safe, issues of autonomy and broader social impact make you feel uneasy.

You are uncertain as to the boundary of your responsibilities, and fully aware of your limited ability to influence how the robot is used once it leaves your hands.

The AREA framework questions, organised by the 4Ps - Process, Product, Purpose and People:

Anticipate (Opportunities)
  • Process: Is the planned research methodology acceptable?
  • Product: Will the products be socially desirable? How sustainable are the outcomes?
  • Purpose: Why should this research be undertaken?
  • People: Have we included the right stakeholders?

Reflect (Considerations)
  • Process: Which mechanisms are used to reflect on the process?
  • Product: How do you know what the consequences will be? What might be the potential uses? What don't we know about?
  • Purpose: How can we ensure societal desirability? Is the research controversial?
  • People: Who is affected?

Engage (Alternatives)
  • Process: How do we engage a wide group of stakeholders?
  • Product: What are the viewpoints of a wide group of stakeholders?
  • Purpose: Is the research agenda acceptable? Who prioritises research?
  • People: For whom is the research done?

Act (Capacities)
  • Process: How can your research structure become flexible?
  • Product: What needs to be done to ensure social desirability?
  • Purpose: How do we ensure that the implied future is desirable?
  • People: Who matters? What training is required? What infrastructure is required?

This brief example shows the complexity of social and ethical concerns that arise when research and innovation activities bring new information and communication technologies into everyday use. The ethically charged nature of modern ICT is apparent in the continual stream of media stories highlighting these sorts of issues, from the PRISM revelations, to protests by taxi drivers over apps that allow private hire cars to be ordered by smartphones, through to the myriad dangers and pitfalls of social media.

This raises important questions about what new responsibilities researchers, funders and innovators should assume, and how they might be empowered to discharge those new responsibilities effectively.

The concept of responsible research and innovation (RRI) has emerged in response to a world where many of the problems we face come from our own technological innovations and successes: global warming, population pressure, financial crises. It is also a world of increased public awareness of, and media attention to, the benefits as well as the potential darker side of our scientific and technological successes.

Science and technology are increasingly able to manipulate the very physical, biological and social fabric of our world through synthetic biology, nanotechnology and computing. These innovations may provide powerful new solutions to existing societal challenges - but they may also bring new and even more potent dangers.

Recognising that, as technology becomes more powerful, we have to exercise its use with ever greater care is the task of RRI. Fundamentally, RRI is about understanding who has what responsibility, and who has the capacity to discharge those responsibilities. But there are many challenges to making RRI work.

For example:

  • How do you make decisions about a technology without knowing beforehand what its implications will be?
  • How do you exert greater control without stifling innovation?
  • Who gets to decide what technological futures are more or less desirable?
  • How do you decide who should be accountable, and when, given that researchers often have no control over the practical applications of their work once it is taken up by industry and made into products?

RRI is a concept that is not confined to IT. It emerged from debates in nanotechnology and is now being promoted across many areas of research and technology development.

However, ICT has some features that raise particular concerns for RRI. These include the high speed of innovation, the open nature of ICT that makes it exceedingly difficult to predict its uses, its pervasiveness, the difficulty of understanding embedded algorithms, its intimacy in everyday life and the problem of distributed development.

All of these render classical views of responsibility for ICT problematic. RRI as proposed by Stilgoe et al. (2013) and promoted by the UK’s Engineering and Physical Sciences Research Council (EPSRC) requires individuals involved in research and innovation to (A) anticipate the possible impacts and outcomes of their work, to (R) reflect on its purposes and implications, to (E) engage with other stakeholders about these topics and to (A) act accordingly. These principles constitute the AREA framework.

In the course of a three-year project funded by the EPSRC we tried to find out what this might mean in practice.

We undertook a landscape study, interviewing 45 ICT researchers as well as representatives from industry, civil society and professional organisations.

We established a community of people interested in these questions and organised a number of events, workshops and meetings. One outcome of this was the finding that those involved in ICT research and innovation want more detailed guidance on how to do their work in a responsible manner. We therefore extended the AREA framework with a focus on research process, product, purpose and people (the 4Ps), all of which play an important role when considering questions of responsibility for ICT.

By combining the AREA framework with the 4Ps, and with the properties that in combination make ICT unique, such as speed of innovation and diffusion, ubiquity and pervasiveness, and logical malleability, we have developed a framework for RRI in IT. The table shows the areas that are likely to be of relevance for responsible practice.

The framework is populated by a number of questions, which highlights the fact that responsible innovation is a matter of discussion and discovery rather than a straightforward algorithm. We present the RRI framework in the form of scaffolding questions; this underlines that developing ICT responsibly requires an openness to communication and to other positions.

There are numerous possible answers to most of these questions and many ways of implementing them in practice. While the framework gives no easy answers, we believe that it can help ICT researchers and innovators to navigate the difficult social and ethical questions their work can raise.

Innovating responsibly is not something that one individual can easily achieve. It requires a community and support from companies, regulators, civil society, users, funders and others. The success of RRI in ICT depends on support for the development of such a community and access to shared resources and good practice.

We have also developed a prototype of such a resource, called the observatory for RRI in ICT, and hope that it will be a focal point for everybody who cares about the ethical and social aspects of ICT and wants to contribute to the benefits of ICT innovation while addressing its possible downsides.

We believe that RRI can constitute an important resource for creativity. By interacting with customers, civil society, regulators and other stakeholders, ICT innovators can consider the impact of their work early and proactively. This should allow them to create better technologies that successfully address social challenges.