Is there a need for robots with moral agency? A case study in social robotics

Date

2024-06-05

Publisher

IEEE

Type

Conference paper

ISSN

2641-0184

Citation

Raper, R. (2024) Is there a need for robots with moral agency? A case study in social robotics. In: 2024 IEEE International Conference on Industrial Technology (ICIT), 25-27 March 2024, Bristol, UK.

Abstract

There has been significant recent interest in the risks associated with Artificial Intelligence (AI), so much so that a Global AI Summit was recently hosted at Bletchley Park in the United Kingdom. One supposed risk is that an Artificial General Intelligence (AGI) might carry out acts detrimental to humanity. In the past, some researchers attempted to bestow machines with morals to mitigate this type of threat; in recent times, however, the approach has been largely dismissed, with claims that giving machines moral agency poses more of a threat in and of itself than the one it is meant to prevent. One critique of the claims about the risk associated with AGI is that they are unrealistic and that there is no grounding for any threat to humanity. The aim of this paper is to present a case study in social robotics that serves two purposes: 1) to illustrate what real-life risks associated with AI might be, and 2) to reinstate the discussion of whether there is a requirement for robots with moral agency.

Keywords

Laws of Robotics, Moral Machines, Machine Ethics, Moral Agency, Robot Ethics, Social Robotics

Rights

Attribution-NonCommercial 4.0 International
