Definition of Agent
An agent is anything that can be viewed as perceiving its environment through sensors and acting upon that environment through actuators.
- Human Sensors: eyes, ears, and other organs.
- Human Actuators: hands, legs, mouth, and other body parts.
- Robotic Sensors: microphones, cameras, and infrared range finders.
- Robotic Actuators: motors, displays, speakers, etc.
The Structure of Intelligent Agents
Agent = Architecture + Agent Program
Architecture = the machinery that the agent executes on (the hardware).
Agent Program = an implementation of the agent function (the algorithm and logic: the software).
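To make the Architecture/Agent Program split concrete, here is a minimal Python sketch of an agent program as a function from percepts to actions. The two-square vacuum world, its percepts, and its actions are illustrative assumptions, not part of the text above; the surrounding loop stands in for the architecture.

# A minimal sketch of an agent program: a pure function from the
# current percept to an action. The two-square vacuum world and its
# percept/action names are illustrative assumptions.

def simple_reflex_agent(percept):
    """Map a (location, status) percept directly to an action."""
    location, status = percept
    if status == "Dirty":
        return "Suck"
    if location == "A":
        return "Right"
    return "Left"

# The architecture's job is to feed percepts in and carry actions out:
for percept in [("A", "Dirty"), ("A", "Clean"), ("B", "Dirty")]:
    print(percept, "->", simple_reflex_agent(percept))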
Characteristics of Intelligent Agents
1. Situatedness
- The agent receives some form of sensory input from its environment, and it performs some action that changes its environment in some way.
- Examples of environments: the physical world and the Internet.
2. Autonomy
- The agent can act without direct intervention by humans or other agents, and it has control over its own actions and internal state.
3. Adaptivity
The agent is capable of:
- Reacting flexibly to changes in its environment;
- Taking goal-directed initiative (i.e., being pro-active) when appropriate; and
- Learning from its own experience, its environment, and interactions with others.
4. Sociability
- The agent is capable of interacting in a peer-to-peer manner with other agents or humans.
Properties of Environment
- An environment is everything in the world that surrounds the agent but is not part of the agent itself. It can be described as the situation in which the agent is present.
- The environment is where the agent lives and operates, and it provides the agent with something to sense and act upon.
1. Fully observable vs Partially Observable:
- If an agent's sensors can access the complete state of the environment at each point in time, the environment is fully observable; otherwise it is partially observable.
- A fully observable environment is convenient because the agent does not need to maintain internal state to keep track of the history of the world.
- If an agent has no sensors at all, the environment is called unobservable.
- Example: in chess, the board and the opponent's moves are fully observable. In driving, what is around the next bend is not observable, so the environment is partially observable (see the sketch after this list).
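To illustrate why partial observability forces an agent to maintain internal state, here is a small Python sketch. The two-square vacuum world and all names in it are illustrative assumptions: since the agent cannot see both squares at once, it must remember what it has already observed.

class ModelBasedVacuumAgent:
    """Keeps internal state because the environment is only partially observable."""

    def __init__(self):
        self.known_clean = set()  # internal state: squares observed to be clean

    def act(self, percept):
        location, status = percept
        if status == "Dirty":
            return "Suck"
        self.known_clean.add(location)
        if {"A", "B"} <= self.known_clean:
            return "NoOp"  # everything known to be clean: nothing left to do
        return "Right" if location == "A" else "Left"

# Usage: ModelBasedVacuumAgent().act(("A", "Dirty")) returns "Suck".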
2. Deterministic vs Stochastic:
- If the agent's current state and selected action completely determine the next state of the environment, the environment is called deterministic.
- A stochastic environment is random in nature and cannot be completely determined by the agent.
- In a deterministic, fully observable environment, the agent does not need to worry about uncertainty (the sketch below contrasts the two kinds of state transition).
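A small Python sketch of the contrast. The one-dimensional state, the additive action encoding, and the 0.8 success probability are all illustrative assumptions.

import random

def deterministic_step(state, action):
    # The next state is fully determined by the current state and action.
    return state + action

def stochastic_step(state, action, success_prob=0.8):
    # The same (state, action) pair can yield different next states:
    # here the action simply fails 20% of the time.
    if random.random() < success_prob:
        return state + action
    return state  # the environment did not change as intended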
3. Episodic vs Sequential:
- In an episodic environment, there is a series of one-shot actions, and only the current percept is required to choose an action.
- In a sequential environment, however, the agent requires memory of past actions to determine the next best action (see the sketch below).
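A brief Python sketch of the difference. The inspection threshold and the chess-style move names are illustrative assumptions.

# Episodic: each decision uses only the current percept; earlier
# episodes do not affect later ones.
def inspect_part(weight_grams):
    return "reject" if weight_grams > 10.0 else "accept"

# Sequential: the sensible action depends on the history of past
# actions, so the agent must remember them.
def choose_move(move_history):
    if not move_history:
        return "e4"  # opening move when nothing has happened yet
    return "reply-to:" + move_history[-1]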
4. Single-agent vs Multi-agent:
- If only one agent is involved in an environment and operates by itself, the environment is called a single-agent environment.
- If multiple agents are operating in an environment, it is called a multi-agent environment.
- The agent design problems in a multi-agent environment are different from those in a single-agent environment.
5. Static vs Dynamic:
- If the environment can change while the agent is deliberating, it is called a dynamic environment; otherwise it is static.
- Static environments are easy to deal with because the agent does not need to keep looking at the world while deciding on an action.
- In a dynamic environment, however, the agent needs to keep observing the world at each action.
- Taxi driving is an example of a dynamic environment, whereas a crossword puzzle is an example of a static environment.
6. Discrete vs Continuous:
- If there is a finite number of percepts and actions that can be performed in an environment, it is called a discrete environment; otherwise it is a continuous environment.
- A chess game is a discrete environment, since there is a finite number of moves that can be performed.
- A self-driving car is an example of a continuous environment (see the sketch after this list).
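A short Python sketch of the contrast. The listed chess moves and the steering bounds are illustrative assumptions.

# Discrete: a finite, enumerable set of actions, e.g. chess moves
# (an illustrative subset shown here).
CHESS_ACTIONS = ["e4", "d4", "Nf3", "c4"]

# Continuous: a steering angle can take any real value in a range,
# so the action space cannot be enumerated.
def clamp_steering(angle_radians):
    return max(-0.5, min(0.5, angle_radians))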
7. Known vs Unknown:
- Known and unknown are not actually features of the environment itself but of the agent's state of knowledge about it.
- In a known environment, the results of all actions are known to the agent, while in an unknown environment the agent needs to learn how it works in order to act.
- It is quite possible for a known environment to be partially observable and for an unknown environment to be fully observable.
8. Accessible vs Inaccessible:
- If the agent can obtain complete and accurate information about the environment's state, the environment is called accessible; otherwise it is inaccessible.
- An empty room whose state can be defined by its temperature alone is an example of an accessible environment.
- Information about an event anywhere on Earth is an example of an inaccessible environment.