Matthias2004ResponsibilityGap

Matthias, "The responsibility gap: Ascribing responsibility for the actions of learning automata"

Bibliographic info

Matthias, A. (2004). The responsibility gap: Ascribing responsibility for the actions of learning automata. Ethics and Information Technology, 6(3), 175-183.

Commentary

This article offers an interesting perspective for my purposes by approaching the argument from a more legal angle. It argues that when an artificial agent performs actions with consequences for society, questions arise about who carries the burden of responsibility. The author introduces the notion of a responsibility gap, noting that responsibility would be ascribed to these artificial agents without them agreeing to the terms beforehand. In a legal sense, this is an interesting point, since one could argue that an agent cannot be held accountable for things it did not agree to in advance. Moreover, even if artificial agents are ascribed a certain responsibility, they will not actually be the ones paying for it when something goes wrong. In this sense, the article is, in my opinion, not complete.

Excerpts & Key Quotes


Core argument

For a person to be rightly held responsible, that is, in accordance with our sense of justice, she must have control over her behaviour and the resulting consequences ‘‘in a suitable sense’’ (Fischer and Ravizza 1998: 13). That means that the agent can be considered responsible only if he knows the particular facts surrounding his action, and if he is able to freely form a decision to act, and to select one of a suitable set of available alternative actions based on these facts.

Comment:

This forms the core argument of the paper: according to the author, an agent can only be considered responsible if it knows the facts surrounding its actions and their consequences, and if it is free to choose a different action when it does not agree with the terms.

The responsibility gap

Now it can be shown that there is an increasing class of machine actions, where the traditional ways of responsibility ascription are not compatible with our sense of justice and the moral framework of society because nobody has enough control over the machine’s actions to be able to assume the responsibility for them. These cases constitute what we will call the responsibility gap.

Comment:

The responsibility gap is introduced here: no human has enough control over the machine's actions to assume responsibility for them. Since the machines themselves are not able to take on responsibility either, the gap arises. This part of the paper explains the concept very clearly, and I agree that this unassignable responsibility is problematic.

The process of losing control

Thus, we can identify a process in which the designer of a machine increasingly loses control over it, and gradually transfers this control to the machine itself. In a steady progression the programmer role changes from coder to creator of software organisms.

Comment:

The process of losing control is explicated here, ending with the statement that the programmer's role changes from coder to creator of software organisms. Personally, I think this is indeed how we should start thinking about artificial agents and their 'creators', since their growing autonomy, and the responsibility that comes with it, can no longer be ignored.