IT used to be a fairly standard branch of engineering. Problems could be logically defined and solved using logic. Morality and ethics were not things we had to think about when working out the best algorithm or data structure. Those days are over.
For the last 10-15 years, IT has been transforming people’s personal lives - their homes, their cars, their interactions with friends and family, their knowledge of themselves. This changes the game for IT. Once something has the power to change people’s lives in ways which matter to them, it acquires an ethical dimension. Consequently, when people build systems which will be used by ordinary people in their daily lives, decisions about system architecture, data structures, input forms and user interfaces become moral decisions.
Privacy is an obvious example of ethical IT, and it illustrates the complexity of the problem. Decisions about access control, how (and where) information is processed, what type of information is gathered and how it is transmitted are all ethical privacy decisions. What makes this difficult for the development community is the lack of consensus about privacy. As with any other ethical issue, there are no correct answers.
There are merely different people’s opinions, based on factors which matter to them, weighed by each according to their personal values. You can disagree with another’s ethical opinions, but they can’t be ‘wrong’ in the sense that a calculation can be wrong. We live in a pluralistic society, which means we acknowledge a diversity of morals and the right of each person to live as they wish. Where it matters, this respect for diversity is backed by law.
Ethical problems occur in IT when developers build systems that impose their personal values or opinions on users. This can happen because the developers fail to recognise they are working within an ethical context. It can also happen when developers fail to recognise that their values are just their values, not universal truths. This is most evident with cross-cultural applications.
In 2015 Facebook blocked a picture of a bare-breasted Brazilian woman of the Aimoré tribe, in her traditional clothing, on the grounds of nudity. Brazil’s Minister of Culture, Juca Ferreira, stated: ‘For us it is a serious issue because it is an assault on our sovereignty. It is disrespect to our cultural diversity and to the indigenous peoples of Brazil.’
Facebook’s problem was not so much that it blocked these images, as that it imposed one moral position on all. It was the original design decision which caused the problem. Facebook could have built a system which allowed users to select their own system of image blocking, allowing people to impose their own morality on the system.
Alternatively, Facebook could have imposed different blocking regimes based on nationality. Then again, Facebook could have recognised that nudity has a different status in different cultures and allowed for some images to be classified as cultural costume rather than sex. Facebook’s systems simply lack the ability to handle perfectly valid differences of opinion.
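To picture the kind of flexibility being argued for here, the sketch below shows one way an image carrying several classifications could be filtered against each viewer’s own preferences rather than a single global rule. It is a minimal, hypothetical illustration: the category names and the preference model are assumptions, not a description of Facebook’s actual design.

```python
# Hypothetical sketch: per-user image preferences instead of one global rule.
# Category names and the preference model are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Image:
    url: str
    categories: set = field(default_factory=set)   # e.g. {"nudity", "cultural_costume"}

@dataclass
class UserPreferences:
    blocked_categories: set = field(default_factory=set)
    exempt_categories: set = field(default_factory=set)   # contexts the user accepts anyway

    def allows(self, image: Image) -> bool:
        # A category the user explicitly accepts (e.g. cultural costume) overrides a block.
        if image.categories & self.exempt_categories:
            return True
        return not (image.categories & self.blocked_categories)

photo = Image("tribal_portrait.jpg", {"nudity", "cultural_costume"})

strict_viewer = UserPreferences(blocked_categories={"nudity"})
tolerant_viewer = UserPreferences(blocked_categories={"nudity"},
                                  exempt_categories={"cultural_costume"})

print(strict_viewer.allows(photo))    # False - hidden for this viewer
print(tolerant_viewer.allows(photo))  # True  - shown, because the context is accepted
```

The point is not the particular categories but the design decision: the system stores a range of opinions and applies each user’s own, instead of hard-coding one.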
How to fit ethics into design
Adding ethical considerations to development requires recognising that other people may not think like you, and that you therefore need to ask their opinion. Here are three basic steps:
1. Initial ethical checklist
During the earliest stage of design, any ethical issues which may arise for the overall system should be identified. To start with, a checklist of key ethical areas should be compiled. Does the system impact on any areas of life which have ethical status (e.g. health, personal development, sexuality, social interactions, environmental impact, living standards, daily life, child-rearing, education, family life, portrayal of cultures)? If the system touches any of these areas, you need to think about how wide the range of opinion on that area is, whether you are imposing your own opinions, and how the system will handle multiple perspectives.
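To make the checklist concrete, here is a minimal sketch of how a design team might record it as structured data. The list of areas comes from the text above; the question fields and their wording are illustrative assumptions rather than a standard.

```python
# Illustrative sketch of an initial ethical checklist as structured data.
# Area names come from the article; the recorded fields are assumptions.
ETHICAL_AREAS = [
    "health", "personal development", "sexuality", "social interactions",
    "environmental impact", "living standards", "daily life",
    "child-rearing", "education", "family life", "portrayal of cultures",
]

def initial_checklist(system_name: str) -> list[dict]:
    """Build a blank checklist for the design team to fill in, one entry per area."""
    return [
        {
            "system": system_name,
            "area": area,
            "system_touches_area": None,              # yes/no, decided in design review
            "range_of_opinion": None,                 # how widely do views differ here?
            "risk_of_imposing_our_values": None,
            "how_multiple_perspectives_handled": None,
        }
        for area in ETHICAL_AREAS
    ]

checklist = initial_checklist("photo-sharing platform")
print(len(checklist), "areas to review")
```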
2. Formal ethical assessment
An informal assessment means you ask around the office. A formal assessment means you ask potential users through structured methodologies, such as focus groups or surveys. Questioning needs to be open-ended so that people can raise concerns you may not have thought of. Such a study was conducted by Germany’s Metro Future Store when developing RFID systems (for automated checkouts, inventory, etc.). While development had only considered how Metro would use RFID in its stores and warehouses, the focus groups raised concerns about how RFID in products could be used outside the store. For example, people worried that criminals could scan baggage at airports to identify valuable items worth stealing. These concerns prompted the addition of systems to disable RFID tags in products as the customer left the store.
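The design change that came out of those focus groups can be pictured in miniature. The sketch below is a hypothetical model, not a description of Metro’s real system: the Product record and the deactivation step are assumptions used to show the idea of handling the concern in the system itself rather than leaving it to policy.

```python
# Hypothetical sketch: deactivating product RFID tags at checkout.
# The Product model and the deactivation step are illustrative assumptions,
# not a description of Metro's actual system.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    price: float
    rfid_active: bool = True

def checkout(basket: list[Product]) -> float:
    """Total the basket and disable every tag before the customer leaves the store."""
    total = 0.0
    for product in basket:
        total += product.price
        product.rfid_active = False   # tag can no longer be scanned outside the store
    return total

basket = [Product("camera", 299.0), Product("headphones", 89.0)]
print(checkout(basket))                         # 388.0
print(all(not p.rfid_active for p in basket))   # True - all tags disabled
```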
3. User ethical testing
Once a system is nearing completion, user testing for ethics should begin. What you are looking for here is input or output which is ethically compromised. Simple elements like input fields can be problematic. For example, asking people to choose only between male or female may offend transgender people, while forcing women to choose between ‘Mrs’ and ‘Miss’, with no option for ‘Ms’, will offend others. Internal data processing can also have ethical failings. For example, transgender people are raising complaints about IT systems, especially in education and billing, which cannot process a change of sex or first name. Letting users test input and output allows you to fix issues like this before the press gets a chance to have fun with you.
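As a concrete illustration of both the input and the processing problems just described, the sketch below shows a profile record with a fuller set of title and gender options, and with fields that can be changed after the account is created. The option lists and field names are assumptions chosen for illustration, not a standard.

```python
# Illustrative sketch: input options that accommodate a range of users,
# and a record that can be updated rather than fixed at creation.
# Option lists and field names are assumptions, not a standard.
from dataclasses import dataclass

TITLE_OPTIONS = ["Mr", "Mrs", "Miss", "Ms", "Mx", "Dr", "Prof", "None"]
GENDER_OPTIONS = ["Female", "Male", "Non-binary",
                  "Prefer to self-describe", "Prefer not to say"]

@dataclass
class Profile:
    first_name: str
    last_name: str
    title: str
    gender: str

    def update(self, **changes) -> None:
        """Allow name, title and gender to change after account creation."""
        for field_name, value in changes.items():
            if not hasattr(self, field_name):
                raise ValueError(f"unknown field: {field_name}")
            setattr(self, field_name, value)

profile = Profile("Sam", "Jones", title="Ms", gender="Prefer not to say")
profile.update(first_name="Samuel", gender="Male")   # supported, not rejected
```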
The essence of ethical programming
The essence of ethical programming is understanding that systems have ethical actions embedded within them, that people have different opinions, and that systems need to cope with the range of values users bring to them. This requires some additional care to ensure you don’t impose your own way of thinking on everyone else.
The best way to determine where users will see issues is to ask them. The best way to cope with ethical problems is to build flexibility so the system can handle a range of opinions. The more IT systems become involved in our daily lives, the more ethical issues will arise. In a society which allows each to determine their own morality, IT systems need to do the same.