Article ID: 394258
Journal: Information Sciences
Published Year: 2013
Pages: 22 Pages
File Type: PDF
Abstract

One of the main goals of the agent community is to provide a trustworthy technology that allows humans to delegate specific tasks to software agents. Frequently, laws and social norms regulate these tasks. As a consequence, agents need mechanisms for reasoning about these norms in the same way as the users who delegated the tasks to them. Specifically, agents should be able to balance these norms against their internal motivations before taking action. In this paper, we propose a human-inspired model for making decisions about norm compliance based on three different factors: self-interest, enforcement mechanisms and internalized emotions. Different agent personalities can be defined according to the importance given to each factor. These personalities have been experimentally compared, and the results are shown in this article.
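The three-factor decision model described in the abstract can be sketched as a weighted utility comparison. The weights, factor names, scoring ranges, and threshold below are illustrative assumptions for exposition, not the paper's actual formalization:

```python
from dataclasses import dataclass

@dataclass
class Personality:
    # Illustrative weights (assumed, not from the paper) for the three factors.
    w_self_interest: float
    w_enforcement: float
    w_emotion: float

def comply(p: Personality, self_interest: float,
           enforcement: float, emotion: float) -> bool:
    """Decide norm compliance by weighing the three factors.

    Each factor is a score in [-1, 1]; positive values favor compliance.
    The agent complies when the personality-weighted sum is positive.
    """
    score = (p.w_self_interest * self_interest
             + p.w_enforcement * enforcement
             + p.w_emotion * emotion)
    return score > 0.0

# Two hypothetical personalities: one that weighs enforcement heavily,
# one that weighs self-interest heavily.
fearful = Personality(w_self_interest=0.2, w_enforcement=0.6, w_emotion=0.2)
selfish = Personality(w_self_interest=0.8, w_enforcement=0.1, w_emotion=0.1)

# A norm that hurts self-interest but is strongly enforced:
print(comply(fearful, self_interest=-0.5, enforcement=0.9, emotion=0.3))  # True
print(comply(selfish, self_interest=-0.5, enforcement=0.9, emotion=0.3))  # False
```

Assigning different weight vectors to the same decision rule is one simple way the "personalities" mentioned above could be compared experimentally under identical norm scenarios.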

Related Topics
Physical Sciences and Engineering Computer Science Artificial Intelligence