How moral is your machine?

By Diane Dowling

Image credit: Lianhao Qu

Diane Dowling explores the moral and ethical dimension in computer science education

The A-level computer science specification of all English exam boards requires students to have the ability to ‘articulate the individual (moral), social (ethical), legal, and cultural opportunities and risks of digital technology’.

Ethics and morals are terms that are sometimes used interchangeably, as both refer to behaviour that can be labelled ‘right’ or ‘wrong’. Ethics may be guided or directed by codes of conduct in the school or workplace, or by faith leaders for those who practise a religion. Ethical guidance is provided to computing professionals by external bodies such as the British Computer Society (BCS), which sets the professional standards of competence, conduct, and ethical practice for computing in the United Kingdom.

On the other hand, morals are guided by our own principles and a sense of permitted behaviour. Charles Darwin maintained that “of all the differences between man and the lower animals the moral sense or conscience is by far the most important”. It is a generally accepted view that as humans we all have:

  • The ability to anticipate the consequences of our own actions

  • The ability to make value judgements

  • The ability to choose between alternative courses of action

Although we all have the capacity for moral behaviour, our individual moral code is not biologically determined, but arises as a result of human experience. Our morals will be influenced by the society in which we live. For young people, it will be formed by the views of parents and teachers, and the opinions of others with whom they interact. Increasingly for most of us, and especially for young people, this will include content consumed through the internet. The internet does not respect national borders; thoughts and ideas can be readily shared on social media, in chat rooms, and on forums. Such a wide sphere of influence can and will result in diverse views of what is right and wrong, even between members of the same household.

Some moral values are widely held by most societies, but there can be shades of grey in even the most widely held beliefs. ‘Thou shalt not kill’ is a tenet of many religions and most people, when asked, will agree that killing another human being is wrong. However, across the globe, 56 countries retain the death penalty and research shows that in these countries, the majority of the population agrees that the penalty is an appropriate punishment for those in society who commit the most serious crimes.

An interesting dilemma arises when we have to choose between two alternative courses of action, where both are morally reprehensible. An example of such a dilemma is the much-studied trolley problem. In this thought experiment, there is a runaway trolley. Ahead, on the tracks, there are five people tied up and unable to move; the trolley is headed straight for them. You are standing, some distance away, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the side track. You have two options:

  • Do nothing and allow the trolley to kill the five people on the main track

  • Pull the lever, diverting the trolley onto the side track, where it will kill one person

The dilemma can be made more challenging by adapting the alternatives to include children or animals, or by varying age, gender, intelligence, or socio-economic factors. Examples of how the trolley problem can be used in the classroom were given by Marc Scott in issue 12, page 86.
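One way to make the classroom discussion concrete is to show students how crude the dilemma looks once it is written as code. The sketch below is a deliberately naive, purely utilitarian rule (minimise the casualty count); the function name and inputs are illustrative only, not taken from any real system.

```python
# A deliberately naive, purely utilitarian decision rule for the
# trolley problem. All names here are illustrative, for discussion only.

def choose_track(main_track_casualties: int, side_track_casualties: int) -> str:
    """Return the action that minimises the number of people harmed."""
    if side_track_casualties < main_track_casualties:
        return "pull lever"   # diverting harms fewer people
    return "do nothing"       # diverting would harm at least as many

print(choose_track(5, 1))  # the classic dilemma: utilitarian rule says 'pull lever'
```

The moment students try to extend this to account for age, or for the difference between acting and failing to act, the moral choices hidden inside the code become visible — which is exactly the point of the exercise.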

Make the link

The moral conundrum is interesting, but how does any of this relate to computer science? The fact is that many of today’s computer science students will become the software engineers of the future and a large number will be faced with the task of designing and writing code for applications of artificial intelligence such as self-driving cars. These programs will have to make complex—often life or death—autonomous decisions.

The prospect of delegating moral decision-making to machines can be fascinating or frightening, depending on your outlook. The trolley problem is one that can be abstracted to encompass many moral dilemmas. Sometimes, unprecedented situations such as the current coronavirus pandemic will throw up new moral dilemmas. In hospitals around the world, medics are being forced to prioritise their patients. Who gets a ventilator when there is insufficient supply for all who need one? Would it be better if a machine were given the responsibility of making this impossible choice?

An algorithm may ensure that consistently reliable decisions are made, but whose morals will determine the way that autonomous machines are programmed? Would you rather a human could override a machine-made decision, or would you rather rules were absolute and consistently applied? Machine learning muddies the debate still further. The neural networks used for such decisions are difficult to analyse, and it is often hard to determine why a particular decision was produced, so there is a serious issue of accountability.
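The override question can also be put in front of students as code. This sketch, using entirely made-up names and a made-up threshold, contrasts an absolute machine rule with one that a human supervisor can overrule — the design choice the debate turns on:

```python
# Illustrative only: an absolute rule versus a human-overridable one.
# The threshold and function names are invented for classroom discussion.
from typing import Optional

def machine_decision(priority_score: float) -> bool:
    """Absolute rule: allocate the scarce resource above a fixed threshold."""
    return priority_score >= 0.5

def final_decision(priority_score: float,
                   human_override: Optional[bool] = None) -> bool:
    """If a human supplies an override, it wins; otherwise apply the rule."""
    if human_override is not None:
        return human_override
    return machine_decision(priority_score)

print(final_decision(0.4))                       # rule applies: False
print(final_decision(0.4, human_override=True))  # human overrides: True
```

Asking who sets the threshold, and who is accountable when a human overrides the machine (or declines to), brings the abstract debate back to concrete design decisions.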

The Massachusetts Institute of Technology has created a website—the Moral Machine—that is collecting data that will help researchers by providing a platform for ‘building a crowd-sourced picture of human opinion on how machines should make decisions when faced with moral dilemmas, and crowd-sourcing assembly and discussion of potential scenarios of moral consequence’. Projects like this will allow researchers to gain a better understanding of the choices that humans make.

In guiding young people through the moral maze, there are many topics that can be discussed in the classroom that will promote lively discussion. Facilitating a debate in which students propose arguments for both sides will allow a wide range of views to be shared and discussed. However, there are issues that will need careful consideration. For example, when discussing autonomous vehicles, a young person with a friend or family member who has been involved in a traffic accident might find this a very hard topic to engage with.

Life-and-death choices are at the extreme end of the decision dilemma. There are many less contentious areas that could be discussed. In her excellent book, Hello World: Being Human in the Age of Algorithms, mathematician Hannah Fry introduces a range of topics, from medicine to law, where algorithms are already being used to automate decision-making. For example, in some US states, an algorithm that uses data about a defendant to estimate their likelihood of committing a future crime is deployed to recommend whether someone awaiting trial should be granted bail. If you search for more information on this, you will quickly find some fascinating examples of bias in the data.

Artificial intelligence and automated decision-making are not the only topics in which morals play an important role. The use of big data and the ability of organisations and government to analyse personal information is worth discussing.

The power of the state to monitor behaviour is always contentious. Advances in facial recognition technology are enabling some regimes to monitor and track their citizens, putting in doubt the principle of informed consent. Since early December 2019, all mobile phone users in China registering new SIM cards must submit to facial recognition scans, giving rise to suspicion of mass state surveillance.

Such an initiative would have been widely vilified by more democratic societies but, since coronavirus has swept the globe, many governments are now deploying tracking apps. In the UK, the government is currently piloting an NHS app to automatically collect details of those we have been in close contact with, to help control the spread of the virus. Many will applaud such initiatives, but defenders of civil liberties and the right to privacy will be dismayed by such developments; their moral code would not condone such a measure, even for the greater good.

The use of social media and other online platforms is another area that can facilitate lively debate. How much freedom should people have to express a viewpoint? Where is the line between what is allowed and what should be banned? In the 2019 UK general election campaign, many female candidates said that they felt unsafe. In research funded by the Joseph Rowntree Reform Trust, analysis of 139,564 tweets sent on a single day in November 2019, each of which replied to or mentioned one of the 2,503 election candidates with a Twitter account, found that 23,039 (16.5%) were abusive.

Of greater relevance to teenagers might be the death of Molly Russell, a 14-year-old girl who took her own life in 2017. Her Instagram account contained distressing material about self-harm and suicide. Molly’s father claimed that the algorithms used by some online platforms push similar content towards users, based on what they have previously viewed.

However, there are also stories of social media being used for collective change. In 2017, actor Alyssa Milano encouraged women to say ‘me too’ if they had experienced sexual harassment or assault, and the hashtag #MeToo quickly swept the globe, empowering victims to speak out. Social media also gives a voice to many who live in less liberal societies. In Hong Kong, activists have been able to use social media to communicate and organise large-scale demonstrations against what they see as an attempt by the Chinese government to undermine the region’s autonomy and civil liberties.

Plan for morals and ethics

Integrating the study of morals and ethics into your scheme of work for computer science will provide the opportunity to relate real-world issues to more theoretical topics and to make them relevant to the world in which we live. Many of the topics introduced in this article are emotionally challenging and teachers may feel uncomfortable introducing them into the classroom. However, for A-level learners who are nearing adulthood, many of these issues are relevant and important. Teachers have a unique place in the lives of young people and an important role in steering and guiding their moral development.


