Applied ethics for responsible innovation: ethical dilemmas and key notions of individual and collective responsibility in technology development

Alicja Halbryt
May 7, 2021

Introduction

Over the last decades, increasing attention has been paid to responsibility in engineering and technology development. The debate usually revolves around questions of blameworthiness and liability. Some authors argue that the technology industry needs to shift from the traditional approach of assigning blame for alleged wrongdoing towards more socially responsible engineering. This essay presents the grounds of ethical dilemmas and of moral responsibility, both individual and collective, in order to explore the ethical notions technology developers face every day.

Thought experiments — The Trolley Dilemma

Thought experiments are devised by philosophers to explore moral and ethical nuances freely and in an abstract manner. Typically, they are carefully arranged dilemmas that provoke the reader to pick a preferred course of action and explain why it is the lesser evil. This creates an opportunity to explore the philosophical implications of the various responses to the dilemma. When speaking of "responsible innovation", it becomes important to break down and deeply understand the notion of responsibility: who should be held responsible, why, and when (Kormelink, 2019). The following section of the essay explores one of the most influential thought experiments, the Trolley Dilemma.

The Trolley Dilemma, or the Trolley Problem, was first introduced by Philippa Foot in 1967. It is a set of hypothetical scenarios set in an extreme environment which aim to test a person's ethical intuitions (Kormelink, 2019). Across the different variations of the experiment, the main form remains the same: the subject is asked to imagine a trolley speeding towards five people tied to the rails ahead. The subject can pull a lever to divert the trolley onto a side track, to which, however, one person is tied. Is it morally permissible for the subject to switch the trolley onto the other track and thereby save five people, but kill one? The majority of people think that it is permissible, or even morally obligatory, to divert the trolley and save five lives (Thomson, 1985; Kormelink, 2019). Variants of the Trolley Problem include the fat-man scenario, in which the subject can stop the trolley, and thereby save the five people, by pushing a fat man onto the track in front of it. Moreover, the Dilemma is studied both from the first-person perspective (the subject makes a choice) and the third-person perspective (the subject judges a choice made by another person). Sometimes, instead of being threatened on the tracks, the five people are seated in the trolley itself. This version is often used to explore the case of autonomous vehicles and the dangers they pose to their passengers and to pedestrians (Wolkenstein, 2018).
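To make the link with technology development concrete, the short Python sketch below shows how a trolley-style choice can surface as an ordinary branch in an autonomous vehicle's planning code. Everything in it is hypothetical, including the names (`Manoeuvre`, `choose_manoeuvre`) and the crude "minimise expected casualties" rule; the point is only that such a rule reproduces the utilitarian answer to the dilemma, and whether that rule is acceptable is precisely the ethical question the code itself cannot settle.

```python
# Hypothetical sketch: a trolley-style choice inside an autonomous
# vehicle's planner. The names and the "minimise casualties" rule are
# illustrative assumptions, not a real AV architecture.
from dataclasses import dataclass

@dataclass
class Manoeuvre:
    name: str
    expected_casualties: int   # crude stand-in for predicted harm
    harms_passengers: bool     # does the harm fall on the car's occupants?

def choose_manoeuvre(options: list[Manoeuvre]) -> Manoeuvre:
    # A purely utilitarian rule: pick whichever option is expected
    # to kill the fewest people, regardless of who they are.
    return min(options, key=lambda m: m.expected_casualties)

if __name__ == "__main__":
    stay = Manoeuvre("stay on course", expected_casualties=5, harms_passengers=False)
    swerve = Manoeuvre("swerve onto side path", expected_casualties=1, harms_passengers=True)
    print(choose_manoeuvre([stay, swerve]).name)  # prints "swerve onto side path"
```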

Individual responsibility

There are several different notions of individual responsibility. Causal responsibility is the most minimal: it means that the subject is the cause of some occurrence or outcome, which may well be the result of an accident (Stanford Encyclopedia of Philosophy, 2019). For instance, a chemical plant operator who, trying to stop a leakage, accidentally turns the wrong switch and thereby causes an explosion that kills a worker is causally responsible for that death. Then there is moral responsibility, which, simply put, means doing (or not doing) the right thing. For example, in the routine case where the operator turns the right switch and stops the chemical spill, they are morally responsible for preventing the worker's death, even though they merit neither praise nor blame. If, on the other hand, the operator had ill intentions and caused the explosion in order to kill the worker, then the operator is both causally responsible and morally culpable, and therefore also morally responsible and blameworthy.

These notions of individual responsibility are interconnected. For instance, the operator has to cause the death of the worker in order to be morally responsible for it; there is no moral responsibility without causal responsibility. Additionally, if the operator could not have avoided the worker's death, then they are neither morally responsible nor culpable (Kormelink, 2019).

In order to call a person, for example an engineer, responsible, four conditions need to be met. First, the person has to have the freedom to act and not be under external pressure (e.g. a person cannot be held responsible for their actions when a gun is held to their head and they are forced to perform illegal or immoral acts). Second, the person should have known that their actions could bring about a negative result (e.g. if someone has just painted a house's front door without putting up a notice, and another person touches the door and ruins the paint, the latter cannot be held responsible, as they had no way of knowing they would cause damage). Third, there should be a causal connection between the person's action and the negative result; a person cannot be held accountable if they did not causally contribute to the outcome. And lastly, a person can only be called responsible if they transgressed a norm, for example an ethical, social, or legal norm (Kormelink, 2019).
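Read as a checklist, these conditions are jointly necessary: if any one of them fails, the person cannot be held responsible. The small Python sketch below is only a schematic restatement of that conjunction, with made-up names, applied to the freshly painted door example.

```python
# Schematic sketch of the four conditions for holding a person
# responsible (Kormelink, 2019). All names are illustrative.
from dataclasses import dataclass

@dataclass
class Situation:
    acted_freely: bool          # no external pressure or coercion
    could_foresee_harm: bool    # knew the action could bring a negative result
    caused_the_outcome: bool    # causal connection between action and outcome
    transgressed_a_norm: bool   # an ethical, social or legal norm was violated

def is_responsible(s: Situation) -> bool:
    # All four conditions are jointly necessary: if any one fails,
    # the person cannot be held responsible.
    return (s.acted_freely and s.could_foresee_harm
            and s.caused_the_outcome and s.transgressed_a_norm)

# The freshly painted door: the passer-by could not foresee the harm.
passer_by = Situation(acted_freely=True, could_foresee_harm=False,
                      caused_the_outcome=True, transgressed_a_norm=False)
print(is_responsible(passer_by))  # False
```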

Collective responsibility

Individual moral responsibility alone is, in some cases, not enough to address all concerns, especially when the outcome is affected by numerous parties of roughly equal influence. The following section describes a problem of collective action, sometimes referred to as the tragedy of the commons, which occurs when people neglect the well-being of society in favour of their personal gain (Boyle, 2020).

The tragedy of the commons arises in the context of shared resources such as the atmosphere, rivers, and forests. In its general statement, the concept holds that a world based on human-centred moral principles and the notion of equal justice supports, and cannot prevent, growth in population and consumption. This, in turn, leads to a constant, but not inevitable, threat of the breakdown of the ecosystems that support civilization (Elliott, 1997). In other words, it describes a situation in which individuals with open access to a resource, unconstrained by social structures or formal rules, act independently according to their own self-interest and contrary to the common good of all users, and their uncoordinated actions deplete the resource. A typical example of the tragedy of the commons is the global problem of overfishing (Kormelink, 2019).
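A minimal simulation can make this mechanism visible. The Python sketch below is purely illustrative, with assumed numbers for stock size, regrowth, and catch: when each of ten fishers restrains their catch, the stock keeps regenerating, but when each independently takes more than twice as much, the shared stock collapses within a few years, even though no single fisher intends that outcome.

```python
# Minimal, illustrative simulation of the tragedy of the commons
# (overfishing). All parameters are assumptions chosen for clarity.

def simulate(stock: float, fishers: int, catch_per_fisher: float,
             growth_rate: float = 0.3, years: int = 20) -> float:
    """Return the remaining stock after `years` of harvesting."""
    for _ in range(years):
        harvest = min(stock, fishers * catch_per_fisher)
        stock -= harvest
        stock += growth_rate * stock   # the stock regenerates each year
    return stock

initial_stock = 1000.0
# Coordinated restraint: each fisher limits their catch, the stock is sustained.
print(round(simulate(initial_stock, fishers=10, catch_per_fisher=20)))
# Uncoordinated self-interest: each fisher takes far more, the stock collapses to zero.
print(round(simulate(initial_stock, fishers=10, catch_per_fisher=45)))
```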

In some cases involving many individuals, it is impossible to point to one person who can be held responsible. The phenomenon in which the actions of many stakeholders together lead to a dramatic outcome, yet none of them individually can be held responsible, is called the problem of many hands. Climate change serves as a typical example of a situation in which identifying who is responsible for a collective harm is especially difficult (van de Poel et al., 2011). The problem of many hands, and the resulting inability to identify who is responsible, can also lead to serious consequences in the development of risky technologies.

Conclusion

This essay aimed to explain the basic notions of applied ethics for responsible innovation. It touched upon ethical dilemmas and individual moral responsibility, including the conditions required to call a subject responsible. In addition, it discussed collective responsibility and the phenomena of the tragedy of the commons and the problem of many hands. The tech industry needs to acknowledge and understand these notions in order to move towards socially responsible innovation.

References

Boyle, M., 2020. Tragedy Of The Commons Definition. [online] Investopedia. Available at: <https://www.investopedia.com/terms/t/tragedy-of-the-commons.asp> [Accessed 19 April 2021].

Elliott, H., 1997. A general statement of the tragedy of the commons. Population and Environment, 18(6), pp.515–531. doi:10.1007/bf02209385

Kormelink, J., 2019. Responsible Innovation: Ethics and risks of new technologies. 2nd ed. Delft.

Thomson, J., 1985. The Trolley Problem. The Yale Law Journal, 94(6).

van de Poel, I., Nihlén Fahlquist, J., Doorn, N., Zwart, S. and Royakkers, L., 2011. The Problem of Many Hands: Climate Change as an Example. Science and Engineering Ethics, 18(1), pp.49–67.

van de Poel, I., Royakkers, L. and Zwart, S., 2015. Moral responsibility and the problem of many hands. 1st ed. Routledge.

Wolkenstein, A., 2018. What has the Trolley Dilemma ever done for us (and what will it do in the future)? On some recent debates about the ethics of self-driving cars. Ethics and Information Technology, 20(3), pp.163–173.
