Genetic algorithm report
Asser Aldardiri – 21065616
Conclusion
In conclusion, this paper has examined the dynamics of genetic algorithms, presenting a systematic investigation into how different GA operators and structural elements interact in the optimization of complex functions such as the Rosenbrock function and a polynomial fitness function. The findings underscore the strong influence that the selection, crossover, and mutation strategies, together with the chromosome count, exert on the performance and outcomes of the genetic algorithm. In particular, this work has shown how each of these elements, when appropriately configured, plays an instrumental role in managing the balance between exploration and exploitation in the search space. That balance is ultimately what determines whether a genetic algorithm can discover promising solutions and reach global optima in challenging optimization landscapes. The experimental results, however, also illustrate the inherent complexities and nuances of genetic algorithms: the variability in outcomes across different chromosome configurations and GA parameters underlines the diversity of behavior these algorithms can exhibit and the importance of careful parameter tuning.
Appendix
Figures 1-6: 20 chromosomes
Figures 7-11: 10 chromosomes
Figures 12-17: 30 chromosomes
Code
'''
This code implements a Genetic Algorithm (GA) framework that tests different parameter
combinations across multiple generations and tracks the performance in terms of best and
average fitness. The individuals in the population represent solutions to a given optimization
problem.
The Individual class defines an individual in the population with genes and a fitness value. The
create_initial_population function initializes the population with random gene values and
calculates the fitness of each individual.
The script then initializes some parameters, like the population size (P), number of genes per
individual (N), and number of generations (GENERATIONS). It also prepares a list of selection,
crossover, mutation, and fitness functions to be tested. It includes two types of selection
methods, three types of crossover methods, two types of mutation methods, and two fitness
functions.
Then, for a predefined number of sets (SET_OF_GENERATIONS), the GA runs for a number of
generations, selecting one method from each list at random. It also randomly selects mutation
rate (MUTRATE) and mutation step size (MUTSTEP). All these details are printed and written into
a file.
Each generation involves performing selection, crossover, and mutation operations on the
population. The fittest individual from each generation is preserved (elitism) and replaces the
least fit individual in the next generation. At the end of each generation, the best and average
fitnesses of the population are computed and stored. The process continues for a given number
of generations.
Finally, the code visualizes the best and average fitness across each set of generations, and
overall across all sets, using matplotlib. The generation and set that achieved the best overall
fitness are also recorded. The best fitness achieved over all generations and all sets is printed
out at the end.
'''
import random
import copy
import matplotlib.pyplot as plt
import math
import itertools
class Individual:
    def __init__(self, N, MIN, MAX):
        self.gene = [random.uniform(MIN, MAX) for _ in range(N)]
        self.fitness = 0

def create_initial_population(P, N, MIN, MAX, fitness_function):
    population = []
    for _ in range(P):
        newind = Individual(N, MIN, MAX)
        newind.fitness = fitness_function(newind, N)
        population.append(newind)
    return population
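
# The two fitness functions referred to in the description above are not included in
# this extract. A minimal sketch is given here, assuming the standard Rosenbrock form;
# the exact polynomial used in the experiments is not shown, so the version below is an
# assumed stand-in for illustration only.
def rosenbrock_fitness(individual, N):
    # Classic Rosenbrock function (to be minimized), summed over consecutive gene pairs
    total = 0
    for i in range(N - 1):
        x, x_next = individual.gene[i], individual.gene[i + 1]
        total += 100 * (x_next - x ** 2) ** 2 + (1 - x) ** 2
    return total

def polynomial_fitness(individual, N):
    # Assumed polynomial fitness (illustrative only): a quartic with multiple local minima
    return sum(x ** 4 - 16 * x ** 2 + 5 * x for x in individual.gene)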
# The def line for this selection operator is missing from the extract; it is
# reconstructed here as fitness-proportionate (roulette-wheel) selection, which the
# weighted random.choices call implies. The function name and the fitness-based
# weighting (lower fitness is better in this minimization setup) are assumptions.
def roulette_wheel_selection(population, P):
    max_fit = max(ind.fitness for ind in population)
    selection_probs = [max_fit - ind.fitness + 1e-9 for ind in population]
    offspring = []
    for _ in range(P):
        selected = random.choices(
            population=population,
            weights=selection_probs,
            k=1
        )[0]
        offspring.append(copy.deepcopy(selected))
    return offspring
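
# The description above mentions a second selection method, which is not shown in this
# extract. A minimal sketch, assuming it was a two-way tournament (the tournament size
# and function name are assumptions):
def tournament_selection(population, P):
    offspring = []
    for _ in range(P):
        a, b = random.choice(population), random.choice(population)
        # Lower fitness is better in this minimization setup, so keep the fitter of the pair
        winner = a if a.fitness < b.fitness else b
        offspring.append(copy.deepcopy(winner))
    return offspring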
def total_fitness(population):
    total = 0
    for individual in population:
        total += individual.fitness
    return total
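
# The crossover and mutation operators are not included in this extract, although
# crossover_function and uniform_mutation are called in the loop body below. These
# sketches match the call signatures used there; single-point crossover and per-gene
# uniform replacement are assumptions about the original implementations.
def single_point_crossover(offspring, P, N):
    for i in range(0, P - 1, 2):
        point = random.randint(1, N - 1)
        # Swap the gene tails of consecutive pairs of parents
        tail_a = offspring[i].gene[point:]
        tail_b = offspring[i + 1].gene[point:]
        offspring[i].gene[point:], offspring[i + 1].gene[point:] = tail_b, tail_a
    return offspring

def uniform_mutation(offspring, P, N, MUTRATE, MIN, MAX):
    for i in range(P):
        for j in range(N):
            # Each gene is replaced by a fresh random value with probability MUTRATE
            if random.random() < MUTRATE:
                offspring[i].gene[j] = random.uniform(MIN, MAX)
    return offspring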
N = 20
P = 5000
GENERATIONS = 100
SET_OF_GENERATIONS = 432
set_generations = []
set_best_fitness = []
set_average_fitness = []
best_of_bests = float('inf')
# Create a dictionary to store the best results of each set
best_results_per_set = {}
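
# The candidate-operator lists and the per-set random configuration described in the
# docstring are not shown in this extract. A minimal sketch using the names defined
# above; the MIN/MAX gene bounds and the MUTRATE/MUTSTEP ranges are assumptions made
# for illustration only.
MIN, MAX = -100.0, 100.0                          # assumed gene bounds
selection_functions = [roulette_wheel_selection, tournament_selection]
crossover_functions = [single_point_crossover]    # the other two crossover methods are not shown
mutation_functions = [uniform_mutation]           # the second mutation method is not shown
fitness_functions = [rosenbrock_fitness, polynomial_fitness]

# One configuration is drawn at random for each set of generations, e.g.:
selection_function = random.choice(selection_functions)
crossover_function = random.choice(crossover_functions)
fitness_function = random.choice(fitness_functions)
MUTRATE = random.uniform(0.01, 0.10)              # assumed range
MUTSTEP = random.uniform(0.1, 1.0)                # assumed range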
# --- Main loop body (excerpt) ---
# The enclosing loops over sets (set_gen) and generations (gen), the saving of the
# previous generation's best_individual, and the re-evaluation of offspring fitness
# are not included in this extract of the script.
offspring = selection_function(population, P)
offspring = crossover_function(offspring, P, N)
# Replace the worst individual in the new generation with the best individual from the
# previous generation (elitism); the worst is the one with the highest fitness, since
# fitness is minimized here
worst_individual_index = max(range(P), key=lambda index: offspring[index].fitness)
offspring[worst_individual_index] = copy.deepcopy(best_individual)
offspring = uniform_mutation(offspring, P, N, MUTRATE, MIN, MAX)
total_fitness_offspring = total_fitness(offspring)
new_best_fitness_offspring = min(ind.fitness for ind in offspring)
avg_fitness_offspring = total_fitness_offspring / P
# Update the best fitness for this set and the generation where it was achieved
if new_best_fitness_offspring < best_results_per_set[set_gen]["best_fitness"]:
    best_results_per_set[set_gen]["best_fitness"] = new_best_fitness_offspring
    best_fitness_offspring = new_best_fitness_offspring
    best_results_per_set[set_gen]["best_fitness_generation"] = gen
generations.append(gen)
best_fitness.append(best_fitness_offspring)
average_fitness.append(avg_fitness_offspring)
# Update the overall best fitness and the generation and set where it was achieved
if new_best_fitness_offspring < best_of_bests:
    best_of_bests = new_best_fitness_offspring
    best_generation = gen
    best_set = set_gen
# Subplot grid position for the per-set fitness plots (3 columns)
ax_row = j // 3
ax_col = j % 3
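
# The visualization code is only partially present above (the ax_row/ax_col lines).
# A self-contained sketch of the kind of subplot grid the description refers to,
# assuming three columns; the helper name, figure size, and layout are assumptions.
def plot_fitness_history(history, columns=3):
    # history: list of (generations, best_fitness, average_fitness) tuples, one per set
    rows = math.ceil(len(history) / columns)
    fig, axes = plt.subplots(rows, columns, figsize=(5 * columns, 4 * rows), squeeze=False)
    for j, (gens, best, avg) in enumerate(history):
        ax = axes[j // columns][j % columns]
        ax.plot(gens, best, label='Best fitness')
        ax.plot(gens, avg, label='Average fitness')
        ax.set_title(f'Set {j}')
        ax.legend()
    plt.tight_layout()
    plt.show()

# Example of the final summary print described in the docstring (variable names as used above):
# print(f"Best fitness {best_of_bests} found in generation {best_generation} of set {best_set}")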