Computer Science 315
You tell me that it's evolution
Assignment 7 - Genetic Algorithms
Well you know
We all want to change the world
Due Friday, 14 December
Part I: SGA vs. NSGA-II
A few years ago I wrote a little genetic algorithms package called SuEAP
(Suite of Evolutionary Algorithms in Parallel), which enabled Matlab users
to use genetic algorithms with fitness computation done in parallel on a
Beowulf cluster or other
multi-processor Unix platform. Mainly this was a way to run NSGA-II with
parallel fitness computation, but I factored out the fitness computation
into a parent class, to facilitate addition of future algorithms to the
package. I have translated SuEAP into Python, minus the parallelism, and
implemented NSGA-II for you. (Which means that the P really stands
for Python, not Parallel!) In this part of the assignment you will
implement the Simple Genetic Algorithm (SGA) in Python, and compare the two
algorithms on a couple of benchmark problems.
First, add the following line to your .bashrc file, which will give you
access to a nice open-source plotting package used by SuEAP:
Next, download and unzip this file into a new directory.
Then, in a file called sgap.py, implement the SGAP class. Specifically,
you will have to implement the __init__ constructor
and the update method as specified here.
Your constructor should pass the optional argument seed to
SuEAP.__init__ (the superclass constructor) and then store the
required arguments (pc,
mu, elit) in self
for later. Your update method will be where you implement
the Simple Genetic Algorithm shown in the lecture slides
(slide #18). Crossover should take place with probability pc:
you toss a biased coin, and if it comes up heads (True), you cross the parents
using the cross method, and put the resulting child into the new
population. If you get tails (False), you
randomly pick one of the parents (toss an unbiased coin) and put it into the
new population. Like crossover, mutation is already implemented in the test classes; you
just call the child's mutate method with arguments
mu, gen, and ngen.
Note that SuEAP sets up the initial population for you and
runs the generations loop (including fitness computation), so you only have to implement the
part of SGA highlighted in yellow in the lecture slide. In order to deal with multi-objective fitnesses,
SuEAP represents each member's fitness as an array of numbers, so you will have
to convert each fitness into a single scalar number before doing
fitness-proportionate selection. As we discussed in class, you can just replace each such array by
its product or sum.
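Putting these pieces together, one generation of SGA might look roughly like the sketch below. This is not SuEAP's actual API: the toy Member class, the (member, fitness-array) representation of the population, and the sga_update signature are all stand-ins for illustration. Your real code will subclass SuEAP and call the cross and mutate methods provided by the test classes.

```python
import random

class Member:
    """Toy stand-in for a SuEAP population member (a bit string).
    In the real assignment, cross() and mutate() come from the test classes."""
    def __init__(self, bits):
        self.bits = bits
    def cross(self, other):
        # hypothetical single-point crossover
        k = random.randrange(1, len(self.bits))
        return Member(self.bits[:k] + other.bits[k:])
    def mutate(self, mu, gen, ngen):
        # flip each bit with probability mu; gen/ngen unused in this toy version
        self.bits = [b ^ 1 if random.random() < mu else b for b in self.bits]

def sga_update(pop, pc, mu, gen, ngen):
    """One SGA generation. pop is assumed to be a list of
    (member, fitness-array) pairs -- an illustrative representation only."""
    fits = [sum(f) for _, f in pop]           # scalarize multi-objective fitness
    total = float(sum(fits))
    def pick():                               # fitness-proportionate (roulette) selection
        r, c = random.uniform(0, total), 0.0
        for (m, _), f in zip(pop, fits):
            c += f
            if r <= c:
                return m
        return pop[-1][0]
    newpop = []
    for _ in range(len(pop)):
        mom, dad = pick(), pick()
        if random.random() < pc:              # biased coin: heads -> crossover
            child = mom.cross(dad)
        else:                                 # tails -> clone one parent at random
            child = Member(list((mom if random.random() < 0.5 else dad).bits))
        child.mutate(mu, gen, ngen)
        newpop.append(child)
    print('gen %d: max fitness = %g, mean = %g' % (gen, max(fits), total / len(fits)))
    return newpop
```

Note how scalarization (here, the sum) happens once per generation, before selection; the crossover coin toss and the parent-cloning coin toss are separate random draws.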
To help you answer the questions below, your
update method should report the maximum and mean
fitnesses before returning the new population. You may find the
utility functions in
numutils.html useful in implementing SGA. Note
that the "pure" SGA in the lecture does not implement elitism (keeping
some fraction of the fittest individuals), so you can ignore the
elit parameter in your implementation, or implement elitism for
extra credit (see question 6 below).
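If you do choose to implement elitism, the idea is simply to let the fittest elit fraction of the old population survive unchanged into the new one. A minimal sketch, assuming each old-population entry is a (member, scalar-fitness) pair (again, an illustrative representation rather than SuEAP's actual one):

```python
def apply_elitism(pop, newpop, elit):
    """Replace the tail of newpop with the top elit fraction of pop,
    ranked by scalar fitness. pop is a list of (member, fitness) pairs;
    newpop is a list of members -- a sketch, not SuEAP's API."""
    nkeep = int(elit * len(pop))
    if nkeep == 0:
        return newpop
    best = [m for m, _ in sorted(pop, key=lambda p: p[1], reverse=True)[:nkeep]]
    return best + newpop[:len(newpop) - nkeep]
```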
I have provided two test problems: simple bit strings, with one-dimensional
fitness, and the two-dimensional multi-objective problem from
Fonseca & Fleming (1993), cited by Deb et al. (2002) as an example of the
superiority of NSGA-II. Once you've written your SGA implementation, you can
test it by running the following command from your Linux terminal:
The default settings in this test code will run your SGA on the bit-string
problem, with a high crossover probability and low mutation rate. Once you
have got your SGA working with these default values, experiment with them
to answer the following questions:
1. What is the effect on mean fitness of having little or no crossover?
2. What is the effect of a high mutation rate?
Now you are ready to explore the difference between SGA and NSGA-II. Change
your crossover and mutation back to their original values (high crossover,
low mutation), and switch from the Bits problem to the Fon problem. Running
sueap_test with this problem will launch a two-dimensional plot of fitnesses
over each generation (close the plot window to see the next generation).
With NSGA-II, each front is plotted using its rank, with the size of the
number indicating the crowding distance of that member.
Observe what happens and answer the following question:
3. What is happening to the population fitnesses when you
collapse them into one dimension for SGA? Do you get a nice, spread-out
front?
Now change sueap_test again so that you're running the NSGA-II algorithm,
and answer these questions:
4. How does the Pareto front obtained with NSGA-II on the Fon
problem compare with what you get from SGA?
5. With NSGA-II, what happens to the Pareto fronts as the solutions evolve?
If you implemented elitism in your SGA, answer the following question for
extra credit:
6 (Extra credit). Does a non-zero elitism value improve the
performance of your SGA on either or both test problems?
Here's another extra-credit problem you may want to try:
7 (Extra credit). Copy fon.py into another
file and modify it to implement one
of the other test problems from Table I on p. 187 of the Deb et al. (2002) paper.
How do NSGA-II and SGA compare on this problem?
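For example, Schaffer's SCH problem from that table minimizes two objectives of a single real variable x in [-1000, 1000]; its fitness function could be sketched like this (the function name and interface are hypothetical -- in practice you would mirror the structure of fon.py rather than use this verbatim):

```python
def sch_fitness(x):
    """Schaffer's two-objective SCH problem (Deb et al. 2002, Table I):
    minimize f1(x) = x^2 and f2(x) = (x - 2)^2 over x in [-1000, 1000].
    The Pareto-optimal solutions are x in [0, 2]."""
    return (x ** 2, (x - 2) ** 2)
```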
Part II: The NEAT algorithm and the NERO game
In this part of the assignment, you will train a team in
NEROGame, the videogame developed around the rtNEAT
(real-time) version of the NEAT
algorithm. This should be fun and will not require you to write any code.
The easiest way to get started is to download the
game onto your laptop or home computer (Windows, Mac, or Linux). On all three
platforms, you can run it by double-clicking on the NERO icon in the folder
where you installed it.
As a last resort, you can browse or cd to the shared directory where I installed it,
and run it from there:
Unfortunately, it seems to run faster on Mac and Windows than on Linux.
So you can also try installing it on your H: drive if you have enough
disk, and run it on one of the Windows machines in P405. For some reason,
the tutorial text is munged on the right-hand side on those machines, so I've
copied and posted the images for the simple and
advanced tutorials for you to follow along.
Let me know ASAP if you can't get it to run
at a usable speed on the machines available to you.
To get started, choose Single Player / Simple Tutorial, then move on to the Advanced
Tutorial. When you understand the game, select Single Player / Start Training to
launch a training mission. For your arena, choose Virtual Sandbox, which is
what you were using in the tutorial
(it shows as the default for training, but you may have to select it explicitly,
or you'll get something else). For your starting team, choose
the default <Create New Team>, and for your training team size choose 30.
After clicking BEGIN, spawn an enemy team for your team to play against; then
spawn your team and start training them. Once you're happy with the results,
hit ESC and save your team to a file via SAVE ARMY. Your file name should be your username
plus anything else you wish to call it; e.g., levy-team.rtf.
Make a note of the team you trained against.
Now you can test your team in battle: go back to the top menu, click Sandbox
Battle, load your saved team for Blue Team, and the one you trained against
for Red. This is how I am going to test your team, too. You can keep training
and testing till you're happy. For maximum fun, share your team with others
in the class (you can send each other your .rtf file as an
attachment and save it in the nero/data/saves/brains directory),
so you can play against each other offline. If we have enough time, we'll
try out a multi-player tournament too.
What to turn in
Send me an email containing your answers to the questions from the first part,
instructions for testing your NERO team in battle (opponent team name),
and your sgap.py and saved team .rtf file as
attachments. If you answered
extra-credit question #7, send me your test-problem code as well.
[1] If you would like to implement parallel fitness computation as
a project (perhaps an R.E. Lee summer scholarship), please
let me know!