Spring 2011 Robot Lab Assignment 4


Assignment 4: "Predator vs. Prey"

In this assignment, you will program a multi-robot team of Predators to hunt and capture Prey. You have five Predators (red), which have limited camera sensors and the ability to communicate with each other. Their job is to capture the four Prey (blue); both Predators and Prey start at random positions in the world.

Capturing the Prey

It takes four Predators to capture one Prey. They must position themselves roughly to the North, South, East and West of the Prey, in close proximity. There is a built-in fudge factor for boxing in the Prey, so the Predators do not have to be ''exactly'' NSEW of the Prey.
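
For illustration, one way to pick the four surround points around a Prey position is sketched below. It assumes positions are (x, y) tuples in block units with +y pointing north (matching the rotation convention listed under Inherited Functions); CAPTURE_DISTANCE is a hypothetical constant you would tune to fit the fudge factor.

 # Sketch: the four surround points around a Prey position.
 # CAPTURE_DISTANCE is illustrative -- tune it to the built-in fudge factor.
 CAPTURE_DISTANCE = 1.0

 def surround_positions(prey_pos):
     px, py = prey_pos
     return {
         'north': (px, py + CAPTURE_DISTANCE),
         'south': (px, py - CAPTURE_DISTANCE),
         'east':  (px + CAPTURE_DISTANCE, py),
         'west':  (px - CAPTURE_DISTANCE, py),
     }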

Once the Predators have surrounded a Prey, it is considered captured: it disappears and the camera sensors no longer detect it. Each Predator also receives a message that a Prey has been captured (along with its location). After this message is received, the Predators should move on to find the next Prey.

Once all four Prey have been captured, the Predators should stop (or perform a celebratory dance).

Configuration

For this assignment, you will use AHOY, an event-based simulation environment designed to test networked multi-agent systems. Your implementation will be entirely in Python.

Setup

AHOY requires the standard Python libraries, pygame, and a *NIX environment (including Mac OS X), which you should already be using. If you are using a Debian-based system, you can install pygame with:

 $ sudo apt-get install python-pygame

To begin, check out the 'robot_lab' branch from the AHOY SVN repository:

 $ svn checkout http://ahoy.googlecode.com/svn/branches/robot_lab ~/robot_lab_A4

Set your $PYTHONPATH to include this new path:

 $ export PYTHONPATH=~/robot_lab_A4/src/

You must also set a port for AHOY to use for communication. Pick something unique: if you run your simulator on the same multicast network with the same port as someone else, there will be collisions. Use your birthday, hometown zip code, ATM PIN, last four digits of your SSN, etc.

 $ export AHOY_PORT=12345

Executing the Simulation

'''NOTE:''' ''You'll probably want to execute each of the following as individual processes in separate terminals. Depending on your environment, this might mean that you'll have to set the environment variables for each terminal.''

To start the simulator daemon (you can pass it any ID you want -- this is for distributed simulations and will not be used):

 $ cd $PYTHONPATH/ahoy/
 $ python startupdaemon.py 0

To start the GUI:

 $ cd $PYTHONPATH/../gui/
 $ python gui.py

Finally, to start the simulation:

 $ cd $PYTHONPATH/ahoy
 $ python robotlab-a4.py

To restart the simulation at any point, you only need to re-run robotlab-a4.py -- the simulator daemon and GUI can remain running.

Implementation

For your implementation, you should modify only '''one''' class: the PredatorAgentImpl class in predatorimpl.py. It is located in the SVN repo at /svn/branches/robot_lab/src/ahoy/agents/predatorimpl.py

Currently, there is a skeleton implementation there which provides an example of some basic behavior. For this example, the robot simply weaves back and forth.
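
The provided skeleton is the authoritative example, but its behavior is roughly equivalent to a main() like the following inside PredatorAgentImpl (the constants and the blocking sleep are illustrative only; the real skeleton may be structured differently):

 import time

 def main(self):
     # Drive forward while periodically flipping the turn direction,
     # which produces a back-and-forth weaving path.
     direction = 1
     while True:
         self.set_speed(1.0, 0.5 * direction)  # linear speed, angular speed
         time.sleep(2)
         direction *= -1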

Implementation Functions

The following functions are available in the PredatorAgentImpl class for you to modify. You may also add additional functions to this class as you see fit (a minimal sketch follows the list).

  • __init__(self, uid) - The constructor. You may add any additional initialization code here.
  • main(self) - The main method, called once at startup.
  • on_message_recv(self, src, contents) - Automatically invoked when a message from another Predator is received.
  • on_prey_death(self, pos, uid) - Automatically invoked when a Prey dies.
  • on_camera(self, locations) - Automatically invoked when the camera sees other Predators/Prey.
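
The sketch below shows how these callbacks fit together. The provided predatorimpl.py already contains the correct imports and class declaration; the attribute names and method bodies here are purely illustrative.

 class PredatorAgentImpl(PredatorAgent):
     def __init__(self, uid):
         # Parent constructor call as in the provided skeleton (signature assumed).
         PredatorAgent.__init__(self, uid)
         self.known_prey = {}   # prey uid -> most recent sighting
         self.captured = set()  # uids of prey already captured

     def main(self):
         # Called once at startup: wander until a Prey is sighted,
         # then coordinate with the team to box it in.
         pass

     def on_message_recv(self, src, contents):
         # Update shared state from another Predator's broadcast.
         pass

     def on_prey_death(self, pos, uid):
         # A Prey was captured at pos; stop pursuing it.
         self.captured.add(uid)
         self.known_prey.pop(uid, None)

     def on_camera(self, locations):
         # Record any Prey the camera currently sees and tell the team.
         pass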

Inherited Functions

The following functions are available to you via the PredatorAgent class; however, you may '''not''' modify them (a short usage sketch follows the list):

  • set_speed(blocks_per_second, radians_per_second) - Moves the Predator. NOTE: The maximum linear speed is 2 blocks/sec.
  • get_position() - Returns the Predator's current position
  • get_rotation() - Returns the Predator's current rotation in radians, where 0 radians faces north (the vector [0.0, 1.0]) and rotation increases clockwise
  • get_uid() - Returns a unique ID for the Predator (to distinguish it from other Predators)
  • send_message(data) - Sends a message to all other Predators; the contents can be any serializable object, not just a string
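
As a usage sketch, a hypothetical helper such as drive_toward() below could be added to PredatorAgentImpl using only the inherited calls (the sign convention of the angular-speed argument is assumed to match get_rotation(), i.e. positive turns clockwise):

 import math

 def drive_toward(self, target):
     # Hypothetical helper: steer toward an (x, y) target point.
     # 0 radians faces north ([0.0, 1.0]) and rotation increases clockwise,
     # so the bearing to the target is atan2(dx, dy).
     x, y = self.get_position()
     tx, ty = target
     bearing = math.atan2(tx - x, ty - y)
     # Normalize the heading error to [-pi, pi].
     error = (bearing - self.get_rotation() + math.pi) % (2 * math.pi) - math.pi
     # Slow down while turning; never exceed the 2 blocks/sec maximum.
     self.set_speed(2.0 if abs(error) < 0.5 else 0.5, error)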

Implementation Breakdown

Some items which your implementation must address to ensure success are:

  • Initially discovering a Prey with a camera sensor
  • Communicating found Prey locations between Predators
  • Coordinating movement/orientation between Predators
  • Dealing with the Prey's constant movement
  • Dealing with unreliable or out-of-order message delivery
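
For the last point, one common approach is to timestamp every report and keep only the newest sighting per Prey, so stale or out-of-order messages are simply ignored. A sketch of this idea follows (the message fields and the known_prey attribute are made-up names; send_message() is given a dict, which is allowed since any serializable object may be sent):

 import time

 def broadcast_sighting(self, prey_uid, prey_pos):
     # Tag each report with the sender and a timestamp so receivers
     # can discard stale or duplicate copies.
     self.send_message({'type': 'sighting',
                        'sender': self.get_uid(),
                        'time': time.time(),
                        'prey': prey_uid,
                        'pos': prey_pos})

 def on_message_recv(self, src, contents):
     if not isinstance(contents, dict) or contents.get('type') != 'sighting':
         return
     uid = contents['prey']
     latest = self.known_prey.get(uid)
     # Keep only the most recent report for each Prey.
     if latest is None or contents['time'] > latest['time']:
         self.known_prey[uid] = contents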

Additional Requirements

  • You may not rely on anything particular to the provided map -- the Predators and the Prey will begin in different randomly selected locations every time the simulation is run
  • You cannot modify ANY code except the '''Predator Agent Implementation''' (/svn/branches/robot_lab/src/ahoy/agents/predatorimpl.py)

If you are not sure about the above requirements, feel free to email the TA.

Extra Credit

All working implementations will be given five timed trials to capture all of the Prey, and the times will be averaged for each implementation. The top five implementations will receive extra credit points as follows:

  • 1st place - 10 points
  • 2nd place - 8 points
  • 3rd place - 6 points
  • 4th place - 4 points
  • 5th place - 2 points

You do not need to complete anything extra to be eligible to receive extra credit. You should instead work to make your implementation as fast and efficient as possible.

Submission

  • Submit a tarball to the TAs, Dustin Ingram (dsi23@drexel.edu) and Aaron Rosenfeld (ar374@drexel.edu), containing (an example packaging command appears after this list):
    • Your modified '''Predator Agent Implementation''' (/svn/branches/robot_lab/src/ahoy/agents/predatorimpl.py) file
    • A '''README''' file briefly describing your work, your method, and any peculiarities of your program
    • '''Nothing else!''' You should not have to modify any other files!
  • If something does not work properly or is incomplete, you must say so in the README.
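
For example (the archive name here is only a suggestion):

 $ tar czvf lastname_a4.tar.gz predatorimpl.py README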

Grading

The assignment is worth 30 points. Grading will be as follows (based on an average of 3-5 trials):

  • 30 points : All of the Prey are captured within 5 minutes.
  • 20-29 points : Some of the Prey are captured within 5 minutes.
  • 10-19 points : None of the Prey are captured within 5 minutes, but the Predators come really, really close to capturing them.
  • 0-9 points : None of the Prey are captured within 5 minutes, and the Predators do not come close to capturing them.

The extra-credit is worth up to 10 extra points.

'''TOTAL: 30 + 10 E.C.'''

Graphics

The Predators are red, the Prey are blue. The green points signify that a camera sensor has detected a Predator or Prey:

[[File:PP CS485-511 2011 screenshot v2.png]]

Additional Resources

Although they are probably not necessary for this assignment, if you are interested in this problem, you may enjoy the following research papers: