Whether it’s building maple-seed-shaped drones or inventing Velcro based on burdock burrs, nature has always been a great source of inspiration for humans.
It goes without saying that optimization algorithms optimize for some kind of goal or target metric. Evolution is no different. Natural selection in the real world optimizes for survival on Earth. In fact, each and every life form on Earth is a solution generated by evolution’s algorithm, which evolves a population of individuals over generations, optimizing for survival. Give it enough time (4.5 billion years) and you get humans. Granted, we have yet to pass certain extinction tests that have wiped out other species in the past, but I’m optimistic.
In this post we’ll loosely draw inspiration from evolution to build a simple genetic algorithm.
Our two main classes will be an Individual and a Population, and populations are made up of individuals. Our individuals will be made up of 10 initially random integers between 0 and 100. The goal will be to get the sum of those 10 numbers to equal 900, and that’ll be the measure of how fit an individual is.
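Here’s a rough sketch of what an Individual could look like (the exact names and layout are just one way of doing it; the only things fixed so far are the 10 random numbers and the fitness rule):

```python
import random

class Individual:
    def __init__(self, numbers=None):
        # Either inherit numbers from parents, or start with 10 random integers in [0, 100].
        self.numbers = numbers if numbers is not None else [random.randint(0, 100) for _ in range(10)]

    @property
    def fitness(self):
        # Lower is better: a fitness of 0 means the numbers sum to exactly 900.
        return abs(900 - sum(self.numbers))
```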
We’ll initialize a population with a population size of 100 individuals, where each individual will be randomly initialized. Then we’ll evolve the population to find an optimal solution.
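A Population is then little more than a list of individuals, plus a helper to measure how fit it is on average (again, just a sketch):

```python
class Population:
    def __init__(self, size=100):
        # Start with a completely random population.
        self.individuals = [Individual() for _ in range(size)]

    @property
    def average_fitness(self):
        # Average fitness across the population; 0 means every individual's numbers sum to exactly 900.
        return sum(ind.fitness for ind in self.individuals) / len(self.individuals)
```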
Here are the basic steps for the evolving part:
Select the fittest individuals (the ones whose sum is closest to 900) to be the parents of the next generation. In this case a fitness of 0 is the best possible, because fitness is abs(900 - sum(individual.numbers)).
We’ll also randomly select some of the less-fit individuals to be parents. This keeps some diversity in the population and improves our chances of finding a global optimum rather than getting stuck in a local one.
Then we’ll breed, or crossover, the selected parents, creating new individuals that we’ll call the children. Instead of randomly initializing the children, we’ll base their numbers on the parents’. When creating a child, there’ll be a chance that it gets a random mutation of its numbers.
The parents and the children will form part of the next generation.
Calculate the average population fitness. Rinse and repeat.
When we hit an average population fitness of 0 (or close to it), we’ll stop evolving. Normally it’s more intuitive to have a higher fitness be better than a lower fitness, but in our case the convention is that lower fitness is better.
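Putting those steps together, an evolve function could look roughly like this; the parent counts, crossover scheme, and mutation chance below are all just illustrative choices:

```python
def evolve(population, elite_count=20, lucky_count=10, mutation_chance=0.1):
    # 1. Rank individuals by fitness (lower is better) and keep the fittest as parents.
    ranked = sorted(population.individuals, key=lambda ind: ind.fitness)
    parents = ranked[:elite_count]

    # 2. Also let a few randomly chosen non-elite individuals become parents.
    parents += random.sample(ranked[elite_count:], lucky_count)

    # 3. Breed children until the next generation is back to full size.
    children = []
    while len(parents) + len(children) < len(population.individuals):
        mother, father = random.sample(parents, 2)
        # Crossover: each of the 10 numbers comes from one parent or the other.
        numbers = [random.choice(pair) for pair in zip(mother.numbers, father.numbers)]
        child = Individual(numbers)
        # Mutation: occasionally replace one of the child's numbers with a fresh random value.
        if random.random() < mutation_chance:
            child.numbers[random.randrange(len(child.numbers))] = random.randint(0, 100)
        children.append(child)

    # 4. The parents and the children form the next generation.
    population.individuals = parents + children
```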
This of course is a super simplified version of a more complex program we could build. For example, we could have different species competing with each other, and the chance of mutation could be a property of each species. Having just the right mutation chance could make the difference between one species winning out over another.
Once we have the main code nailed down, it’s fun to play around with the parameters and look at how long it takes to achieve an optimal solution:
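For example, using the sketches above, a run could be as simple as the following, and you can then vary the population size, the number of elite and lucky parents, or the mutation chance and watch how the number of generations changes:

```python
population = Population(size=100)

generation = 0
# Stop once the average fitness reaches 0, or give up after 1,000 generations.
while population.average_fitness > 0 and generation < 1_000:
    evolve(population, elite_count=20, lucky_count=10, mutation_chance=0.1)
    generation += 1

print(f"Average fitness {population.average_fitness:.2f} after {generation} generations")
```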
We can use these kinds of genetic algorithms to build really cool things, such as evolving a neural network’s hyperparameters, which makes for a very efficient search over that space.
Thanks for reading and let me know if you have any feedback :)
Happy holidays!