Glossary

Part of The Empirical Semantics Approach to Communication Structure Learning and Usage: Individualistic vs. Systemic Views

A

• Artificial agent
Agents in the sense of this work are computer systems that exhibit (more or less) intelligent and autonomous behavior. Autonomy as a property of an agent means that the agent exhibits goal-directed behavior without being controlled by some external force. The inner state of such agents (including their goals) is usually opaque to observers.
Additionally, we consider only artificial agents which are able to interact with other agents or humans by means of communication, using an agent communication language.

• Agent communication language (ACL)
A formal language for communication among artificial agents. ACLs are located on a higher conceptual level than the means for the technical exchange of messages among computer systems or objects. Most ACLs are speech act-based, i.e., they specify assertions, requests etc. directed to other agents in the form of speech acts. Typically, a single ACL message consists of the speech act type (indicating, e.g., an assertion or a request), the names of the sender and receiver agents, and the content, which describes what is asserted or requested. Additionally, an ontology underlying the message content can be specified. Examples of ACLs are KQML and FIPA ACL. (A minimal message structure is sketched at the end of this entry.)

The ability to use an ACL for communication is a crucial property of interacting agents within a multiagent system.

The specification of a suitable semantics of ACL messages is still an open research problem.
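
The following minimal Python sketch shows one way the message fields named above could be grouped into a data structure. The class and field names are illustrative assumptions and are not tied to the concrete syntax of KQML or FIPA ACL.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ACLMessage:
    """Illustrative container for a single ACL message; the field set follows
    the description above, not any particular ACL standard."""
    performative: str              # speech act type, e.g. "inform" or "request"
    sender: str                    # name of the sending agent
    receiver: str                  # name of the receiving agent
    content: str                   # what is asserted or requested
    ontology: Optional[str] = None # optional ontology underlying the content

# Example: agent_1 requests an action from agent_2 (all names are made up).
msg = ACLMessage(performative="request",
                 sender="agent_1",
                 receiver="agent_2",
                 content="open(valve_3)",
                 ontology="plant_control")
```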

E

• Elementary communication act (ECA)
A data structure for the modeling of communication acts. An ECA consists of an agent identifier and a projection. A single utterance can consist of multiple (or even infinitely many) conjunctively or disjunctively combined ECAs.
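
As a rough illustration of this data structure, the following Python sketch models an ECA as an agent identifier plus a projection (here reduced to the identifier of the desired event; see the entry "Projections"), and an utterance as a finite conjunctive or disjunctive combination of ECAs. All identifiers are hypothetical, and the infinite case mentioned above is not covered.

```python
from dataclasses import dataclass
from typing import List, Literal

@dataclass
class ECA:
    """Elementary communication act: an agent identifier plus a projection,
    here simplified to the id of the desired event node (see "Projections")."""
    agent: str
    projection: str

@dataclass
class Utterance:
    """An utterance as a finite combination of ECAs; `mode` stands in for the
    conjunctive/disjunctive composition mentioned above."""
    ecas: List[ECA]
    mode: Literal["and", "or"] = "and"

u = Utterance(ecas=[ECA(agent="agent_1", projection="event:pay"),
                    ECA(agent="agent_1", projection="event:deliver")],
              mode="and")
```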

• Expectation
Graded belief (i.e., a belief held to a certain degree) that a certain event will or should happen.
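
As a tiny illustration of the two readings ("will happen" vs. "should happen"), an expectation could be represented as follows; the field names are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Expectation:
    """An expected event, the strength of the expectation, and whether the
    expectation is normative ("should happen") or predictive ("will happen")."""
    event: str
    strength: float          # degree of belief, assumed to lie in [0, 1]
    normative: bool = False  # True: "should happen"; False: "will happen"

e = Expectation(event="inform(delivered)", strength=0.8, normative=True)
```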

• Expectation networks (ENs)
Graphs for the computational representation of expectation structures. They consist of nodes (corresponding to expected events) and probabilistically labeled edges (denoting event correlations). ENs are obtained from observed events using machine learning techniques.
For further details on expectation networks please refer to (Nickles et al. 2004b; Nickles et al. 2005a; Lorentzen and Nickles 2002).
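
The following toy sketch illustrates only the basic idea of nodes connected by probabilistically labeled edges estimated from observed event sequences; the actual EN formalism and learning algorithms in the cited works are considerably richer.

```python
from collections import defaultdict

class ExpectationNetwork:
    """Toy expectation network: nodes are observed event types, edge labels
    are estimated as relative transition frequencies between events."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, event_sequence):
        """Update edge counts from one observed sequence of events."""
        for a, b in zip(event_sequence, event_sequence[1:]):
            self.counts[a][b] += 1

    def expectation(self, event, successor):
        """Estimated probability that `successor` directly follows `event`."""
        total = sum(self.counts[event].values())
        return self.counts[event][successor] / total if total else 0.0

en = ExpectationNetwork()
en.observe(["request(pay)", "inform(paid)", "inform(delivered)"])
en.observe(["request(pay)", "refuse(pay)"])
print(en.expectation("request(pay)", "inform(paid)"))  # 0.5
```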

• Expectation structures
Regularities and patterns of event processes (in this work especially communications), modeled in terms of interrelated probabilistic expectations.
Expectation structures which model communication processes like dialogues are called social structures or communication structures (in the Luhmannian sense).
For further details on expectations and expectation structures please refer to (Nickles et al. 2005a; Lorentzen and Nickles 2002).

M

• Multiagent system (MAS)
A set of interacting agents. Typically, these agents interact using an agent communication language.
A MAS is called open if agents can enter or leave the system at will, and if the participating agents are fully autonomous.

• Message (utterance)
A word of an agent communication language which is transmitted between agents. The act of transmission initiated by the speaker is also called "utterance".

P

• Projections
Ostensibly desired states (i.e., states publicly desired during communication). A "state" is here the state of some event (especially an action) having happened in a certain context of previous events. Projections are represented as pointers leading from EN nodes (corresponding to ECAs) to other nodes (corresponding to the desired events).
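
A minimal sketch of this pointer structure, reusing the hypothetical node identifiers from the ECA example above:

```python
from dataclasses import dataclass

@dataclass
class Projection:
    """Pointer from the EN node representing the uttered ECA to the EN node
    representing the ostensibly desired event (all ids are hypothetical)."""
    source_node: str   # node corresponding to the ECA / utterance
    target_node: str   # node corresponding to the desired event

# agent_1's request "points at" the state in which agent_2 has paid:
p = Projection(source_node="eca:agent_1:request(pay)",
               target_node="event:agent_2:inform(paid)")
```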

R

• Rational hulls
Rational hulls specify how persons (modeled as expectation structures) are expected to act rationally, in a foreseeable manner, in the succession of communication acts (e.g., by argumentation, sanctioning or negotiation in support of communicated goals, which are in our approach represented as so-called projections). Since a rational hull describes only communicated rationality, it does not need to have any direct relation to the "cognitive rationality" of an agent (steered by hidden goals). It is also important to see that, like any kind of expectation, assumptions about rational behavior can be disappointed and must then be revised. Ultimately, rational hulls form significant parts of the social expectation structure type called person in the terminology of social systems theory.
Typically, a rational hull is initially indefinite (because the agents have not yet formed enough reliable expectations about each other) and becomes increasingly definite in the course of interaction, provided that the agents work towards mutual understanding (but not necessarily cooperation) and stick to their allegations and alleged goals to some degree. The utterances themselves are modeled as pointers pointing to the desired/proposed states within the expectation network (thus denoting subjective expectations directed toward other agents, in contrast to the objective expectations maintained by the semantics observer, who may be a non-agent global entity or the human designer herself).
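
As a loose illustration only (not the formal model of this work), one could track, per agent and per projected goal, how often observed behavior supported or contradicted that goal, and read the amount and consistency of this evidence as the growing definiteness of the rational hull; all names below are assumptions.

```python
from collections import defaultdict

class RationalHullSketch:
    """Crude bookkeeping of goal-supporting vs. goal-contradicting behavior."""

    def __init__(self):
        self.support = defaultdict(int)
        self.contradiction = defaultdict(int)

    def observe(self, agent, goal, supports_goal):
        if supports_goal:
            self.support[(agent, goal)] += 1
        else:
            self.contradiction[(agent, goal)] += 1  # expectation disappointed

    def definiteness(self, agent, goal):
        """Number of observations so far: the hull starts out indefinite."""
        return self.support[(agent, goal)] + self.contradiction[(agent, goal)]

    def reliability(self, agent, goal):
        """Fraction of observations consistent with the communicated goal."""
        s, c = self.support[(agent, goal)], self.contradiction[(agent, goal)]
        return s / (s + c) if (s + c) else 0.0
```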

S

• Speech acts in relationship to empirical semantics
A concept from linguistic pragmatics and distributed AI, denoting that "in or by saying something, we do something". The underlying speech act theory (in fact a family of partially inconsistent theories) traditionally distinguishes illocutionary acts, perlocutionary acts, and locutionary acts (the "locution" of a speech act). A locutionary act is the act of saying something, in the sense of producing a meaningful utterance. Illocutionary acts, in contrast, perform an act of, e.g., stating, questioning, commanding or promising in saying something, and perlocutionary acts perform acts by saying something (e.g., persuading, threatening, or convincing). Thus, illocutionary acts emphasize the immediate effect a message has, whereas perlocutionary acts refer to effects on the hearer (such as actions or belief changes).


Empirical ACL semantics also takes an essentially pragmatic point of view, and allows for a "saying = acting"-based interpretation of messages. But in contrast to speech act theory, the meaning of an utterance lies here entirely in its expectable behavioral consequences, moving empirical semantics close to Wittgenstein's concept of language games and also close to philosophical pragmatism. This consequentialist perspective does not neglect "saying = acting": although it is basically an a posteriori approach, it allows for modeling the immediate performance of acts in saying something, because it can learn the expectable consequences of a certain type of utterance and apply what it has learned the moment a new utterance of the same type occurs. In terms of Austin's original speech act theory, we could say that empirical semantics is able to learn conventions in the form of expectations.
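
A very small sketch of this learning-and-applying step, under the simplifying assumption that "consequences" are just the events observed to follow an utterance of a given type; the cited approach learns full expectation networks instead.

```python
from collections import Counter, defaultdict

class EmpiricalMeaning:
    """Meaning of an utterance type = relative frequencies of the events
    observed to follow it (a drastic simplification of empirical semantics)."""

    def __init__(self):
        self.consequences = defaultdict(Counter)

    def observe(self, utterance_type, following_events):
        self.consequences[utterance_type].update(following_events)

    def meaning(self, utterance_type):
        """Applied immediately when a new utterance of this type occurs."""
        counts = self.consequences[utterance_type]
        total = sum(counts.values())
        return {ev: n / total for ev, n in counts.items()} if total else {}

m = EmpiricalMeaning()
m.observe("promise(deliver)", ["inform(delivered)"])
m.observe("promise(deliver)", ["refuse(deliver)"])
print(m.meaning("promise(deliver)"))  # each observed consequence with probability 0.5
```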

Empirical-rational semantics refines the very general approach of empirical semantics by explicitly considering the intentions of a rational speaker. But again, such "intentions" are considered only in the form of specific behavioral consequences, arising from observable, ostensible intentions. Instead of distinguishing illocution and perlocution as speech act theory does, it treats the alleged intentions of the speaker as a special case of the actual consequences of his respective utterances: the alleged intention causes the speaker's (expectable) behavior of supporting the alleged goal which has been expressed by means of the utterance. The notion of alleged intentionality is cleanly separated from "real" (mental) intentionality; a sincerity condition for the performance of communication acts does not exist in empirical(-rational) semantics.

• Spheres of communication
The sphere of a single communication is the longest sequence of subsequent events that is consistent with the content of this communication. Thus, the sphere of communication can be seen as a horizon of the reliability of the utterance ("reliability" in the sense that the agent sticks to his communicated intentions).
The sphere of communication is sometimes given as a time span or as the length of the mentioned sequence. A sphere of communication ends as soon as the respective agent contradicts himself or stops trying to achieve his projected goal states.
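
A toy computation of a sphere from an observed event sequence, assuming a caller-supplied (hypothetical) predicate that checks whether an event is still consistent with the original communication:

```python
def communication_sphere(subsequent_events, consistent_with_utterance):
    """Collect subsequent events until one is no longer consistent with the
    content of the original communication (e.g. the speaker contradicts
    himself or stops pursuing his projected goal states)."""
    sphere = []
    for event in subsequent_events:
        if not consistent_with_utterance(event):
            break                     # the sphere ends here
        sphere.append(event)
    return sphere                     # its length is one measure of the sphere

events = ["inform(working_on_it)", "inform(delivered)", "retract(promise)"]
print(len(communication_sphere(events, lambda e: e != "retract(promise)")))  # 2
```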

• Semantics observer
An agent or another technical facility that observes messages transmitted among other agents and learns from these observations the empirical semantics of the ongoing communication. The semantics observer can be passive (merely overhearing the MAS) or one of the interacting agents.
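
A minimal sketch of a passive semantics observer that overhears messages and hands completed dialogue traces to a learner (the ExpectationNetwork sketch above would fit the assumed observe(event_sequence) interface); all class and method names are assumptions.

```python
class TraceLearner:
    """Stand-in for an expectation-structure learner; it only stores traces."""
    def __init__(self):
        self.sequences = []
    def observe(self, event_sequence):
        self.sequences.append(list(event_sequence))

class SemanticsObserver:
    """Passive observer: it overhears every message transmitted among the
    agents of a MAS and forwards finished dialogue traces to the learner.
    A non-passive variant would be one of the interacting agents itself."""
    def __init__(self, learner):
        self.learner = learner
        self.trace = []
    def overhear(self, message):
        self.trace.append(message)
    def dialogue_finished(self):
        self.learner.observe(self.trace)
        self.trace = []

obs = SemanticsObserver(TraceLearner())
for m in ["request(pay)", "inform(paid)", "inform(delivered)"]:
    obs.overhear(m)
obs.dialogue_finished()
```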