Abstract
Humans interact with computer systems and other humans in similar ways. Researchers have identified personality as an important requirement for believable synthetic agents: by modeling a personality system in an agent's architecture, we can convey the illusion of life to users and sustain their suspension of disbelief. Personality in synthetic agents is the main subject of this dissertation. Our focus is on the group dynamics of agents that interact with users as a team sharing a cooperative task. Rather than efficiency and performance, we are concerned with credibility and believability. We extended an existing model for building believable synthetic agents (the SGD Model) by improving its personality system. Previous experimental results with the original model indicated that highly cohesive groups induce lower levels of user identification with the group, so we added individualism as a new feature of the SGD Model by introducing a new type of motivation. This individual motivation is permanently in conflict with the revised existing (group) motivation: should the agent help the group or act for its own benefit? Our improved personality system determines whether an agent chooses the former or the latter. The model was tested in a computer game. Our evaluation shows that the extended version of the SGD Model and its implementation work as intended: agents' behaviors and reactions follow a well-defined set of rules. Additional tests are needed to verify whether the extended model offers a more believable environment of cooperative synthetic agents.
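
The abstract describes the conflict between group and individual motivation only at a high level. The following minimal sketch illustrates one plausible way such a conflict could be resolved by a personality trait; it is not the SGD Model's actual mechanism, and the names (Agent, individualism, group_need, self_need) are hypothetical.

```python
# Illustrative sketch only: a single hypothetical personality trait,
# "individualism" in [0, 1], weighs a group motivation against an
# individual one when an agent decides how to act.

from dataclasses import dataclass
import random


@dataclass
class Agent:
    name: str
    individualism: float  # 0.0 = fully group-oriented, 1.0 = fully self-oriented

    def choose_action(self, group_need: float, self_need: float) -> str:
        """Resolve the conflict between helping the group and acting for
        the agent's own benefit, biased by the personality trait."""
        group_drive = (1.0 - self.individualism) * group_need
        self_drive = self.individualism * self_need
        return "help_group" if group_drive >= self_drive else "act_for_self"


if __name__ == "__main__":
    team = [Agent("A", individualism=0.2), Agent("B", individualism=0.8)]
    for agent in team:
        choice = agent.choose_action(group_need=0.7, self_need=random.random())
        print(agent.name, choice)
```

Under this toy rule, an agent with low individualism almost always helps the group, while a highly individualistic agent does so only when the group's need clearly outweighs its own.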