Affiliations: [a] School of Computer Science and Informatics, Cardiff University, Cardiff, UK | [b] School of Art, Design and Architecture, University of Huddersfield, Huddersfield, UK | [c] School of Computing and Engineering, University of Huddersfield, Huddersfield, UK
Corresponding author: Federico Cerutti, School of Computer Science and Informatics, Cardiff University, Cardiff, UK. E-mail: [email protected]
Abstract: The inability of current machines to expose biases induced by programmers and data scientists is leading towards the creation of a new religion, in which machines are mystic oracles whose pronouncements must be believed, and computer users are their servants. This has to change. In this paper we discuss the issues that can arise from biases introduced in autonomous systems, with particular attention to machine learning systems, and their impact on our society. In light of the (current and future) exploitation of autonomous systems for law enforcement and war-fighting, we emphasise the importance of issues related to discrimination and safety. We also support the bold claim that artificial intelligence can help artificial intelligence overcome those issues: by enabling artificial intelligence to record every single step that leads to a given inference, and to argue with humans, we can unveil the mystic oracle and trust its services.
Keywords: Artificial intelligence, argumentation, AI and society