Genetic algorithm report

This research study develops a Genetic Algorithm (GA) framework to optimize complex problems, particularly focusing on the Rosenbrock and polynomial quartic functions. The study explores various parameter combinations, selection methods, crossover techniques, and mutation strategies to analyze performance in terms of fitness across generations. Results indicate that the GA's adaptability and strategic parameter selection significantly influence optimization outcomes, highlighting the nuanced relationship between algorithm structure and problem complexity.


Evolutionary Genetic Algorithm for Optimization Problems

Asser Aldardiri – 21065616

INTRODUCTION

In this research study, we explore the development of a Genetic Algorithm (GA) framework, leveraging different parameter combinations across multiple generations to analyze performance in terms of best and average fitness. Genetic Algorithms represent a class of heuristic optimization algorithms inspired by the process of natural evolution. In our case, we use these algorithms to solve a given optimization problem, where the individuals in the population embody potential solutions. Our approach is comprehensive and methodical. The initial step includes defining an Individual class that characterizes an individual within the population, incorporating unique genes and an assigned fitness value. We then proceed to construct the initial population through the create_initial_population function, assigning random gene values and calculating fitness for each individual. Two distinct fitness functions, polynomial_quartic and rosenbrock_like_function, have been utilized to evaluate the quality of solutions, where lower fitness values signify superior solutions. The selection process for the GA has been carried out using the tournament_selection and roulette_wheel_selection functions, determining the fittest individuals in a random selection. We have introduced three types of crossover functions: uniform_crossover, single_point_crossover, and two_point_crossover. These functions are designed to combine the genes of two parent individuals to generate offspring. Moreover, to encourage variability in the population, two types of mutation functions (uniform_mutation and random_resetting_mutation) have been included, modifying the genes of individuals slightly. We then set various parameters, such as population size (P), number of genes per individual (N), and the number of generations (GENERATIONS). A list of selection, crossover, mutation, and fitness functions to be tested is prepared, allowing us to test a variety of methods for each aspect of the genetic algorithm. With the above infrastructure, we can now execute our genetic algorithm framework for a predetermined number of sets (SET_OF_GENERATIONS). For each set, we randomly select a method from each list, along with a mutation rate (MUTRATE) and mutation step size (MUTSTEP). These operations are thoroughly logged and documented for analysis. Our GA runs through generations, performing selection, crossover, and mutation operations on the population. The fittest individual from each generation is preserved (elitism) and replaces the least fit individual in the next generation. At the end of each generation, the best and average fitness of the population are computed and stored, before proceeding to the next generation. Upon completion, we visualize the best and average fitness across each set of generations, and overall, across all sets, utilizing matplotlib. We also identify the generation and set that achieved the best overall fitness. The optimal fitness achieved over all generations and all sets is printed out at the end, providing an overall summary of the algorithm's performance. This comprehensive process aims to optimize the problem-solving capacity of our Genetic Algorithm framework through rigorous testing and performance tracking.
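The workflow just described can be condensed into a minimal, runnable sketch. This is an illustrative simplification, not the report's actual framework: the toy sphere fitness, the fixed parameters, and the helper name ga_sketch are all assumptions for demonstration.

```python
import random

def sphere(genes):
    # Toy minimisation target standing in for the report's fitness functions
    return sum(g * g for g in genes)

def ga_sketch(pop_size=20, n_genes=5, generations=50, mutrate=0.1, mutstep=0.5, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = min(pop, key=sphere)  # elitism: remember the generation's best
        # Tournament selection: fitter of two random picks (lower is better)
        pop = [min(rng.choice(pop), rng.choice(pop), key=sphere)[:] for _ in range(pop_size)]
        # Single-point crossover on consecutive pairs
        for i in range(0, pop_size - 1, 2):
            cut = rng.randint(0, n_genes - 1)
            pop[i][cut:], pop[i + 1][cut:] = pop[i + 1][cut:], pop[i][cut:]
        # Uniform (creep) mutation
        for ind in pop:
            for j in range(n_genes):
                if rng.random() < mutrate:
                    ind[j] += rng.uniform(-mutstep, mutstep)
        # The preserved elite replaces the current worst individual
        worst = max(range(pop_size), key=lambda k: sphere(pop[k]))
        pop[worst] = elite[:]
    return min(sphere(ind) for ind in pop)

print(ga_sketch())
```

Because the elite is re-inserted after mutation, the best fitness recorded each generation can never worsen, mirroring the elitism described above.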
Experimentation

The exploration of genetic algorithms (GAs) and their applications in solving complex optimization problems can be a challenging yet rewarding endeavor. In my latest experimental run, a significant result was yielded when attempting to optimize the Rosenbrock function (Equation 1) – a notorious non-convex function that poses a considerable challenge for optimization techniques – as well as a polynomial quartic equation (Equation 2).

f(x) = (x_1 − 1)^2 + Σ_{i=2}^{d} i · (2x_i^2 − x_{i−1})^2

Equation 1

f(x) = (1/2) Σ_{i=1}^{d} (x_i^4 − 16x_i^2 + 5x_i)

Equation 2

At the 100th generation, I achieved a promising result of 0.8261354856096976. This was accomplished within set 426 out of a total of 432 sets, each comprising 20 chromosomes. Selection, crossover, and mutation are the cornerstones of GAs, playing instrumental roles in the discovery of potential solutions. For this experiment, I utilized the tournament selection, two-point crossover, and random resetting mutation functions. The tournament selection method was preferred for its simplicity, diversity preservation, and ability to maintain genetic variety without invoking excessive computational load. The two-point crossover and random resetting mutation furthered the diversity by introducing genetic variance and modifications, essential for the exploration of the search space. Each GA operator was supplemented by the Rosenbrock-like fitness function, a challenging non-convex function used in testing the efficiency of optimization algorithms. By implementing this function, I was able to assess the genetic algorithm's effectiveness in navigating the Rosenbrock function's rough landscape. The mutation rate was maintained at 0.03, and the mutation step was set at 0.15625 to balance exploration and exploitation in the search space. This experiment was built on a unique blend of variation, with each iteration using either 30, 20, or 10 chromosomes. I anchored the program on the 20-chromosome variant, which presented an optimal balance between complexity and computational resources. Harnessing the itertools module's power, the parameters were randomly selected for each set, maximizing the exploration of potential combinations. Interestingly, the Rosenbrock function's optimization displayed a lack of consistent convergence across the generations. This variability emphasizes the function's complexity, highlighting the unpredictability of the function's topology in the genetic algorithm's search space. Conversely, the polynomial fitness function showed higher consistency, converging swiftly and reaching a stable fitness value of approximately -58000. Further reflection on the chosen parameters reveals their competitive edge over alternatives such as roulette wheel selection, single-point or uniform crossover, and uniform mutation. For instance, the tournament selection method, chosen for this experiment, has a couple of significant advantages over roulette wheel selection. Unlike roulette wheel selection, which often tends to favor individuals with higher fitness values, tournament selection fosters diversity by creating a fair opportunity for less fit individuals. This preservation of diversity mitigates premature convergence and helps maintain a broader search space. Two-point crossover also demonstrated superior results compared to single-point or uniform crossover. While single-point crossover swaps genes at a single location, two-point crossover swaps genes in between two points, providing greater scope for exploration. It allows the algorithm to probe deeper into the search space, enhancing the possibilities of finding better solutions. Similarly, uniform crossover may disrupt the positioning of promising gene sequences by enforcing a uniform spread of genes from both
parents. Two-point crossover, in contrast, can potentially maintain beneficial genetic sequences intact, thereby aiding the optimization process. Random resetting mutation, used in this experiment, contributes more effectively to genetic diversity compared to uniform mutation. While uniform mutation changes the value of a gene to any value within the permissible range uniformly, random resetting can disrupt the predictability of the search, promoting exploration and aiding in the escape from local minima. The chosen mutation rate of 0.03 and step of 0.15625, combined with random resetting mutation, managed to maintain an optimal balance between genetic variance (exploration) and the maintenance of promising solutions (exploitation). Elitism, another important aspect of this experiment, ensured the survival of the fittest individuals from one generation to the next. This strategy not only preserves the best-found solutions but also maintains a steady upward pressure on the population's overall fitness. By ensuring that the most fit individuals are not lost due to selection, crossover, or mutation operations, elitism significantly contributes to the rapid convergence to optimal solutions, even in the face of complex optimization problems such as the Rosenbrock function. In conclusion, the chosen configuration of parameters and strategies, including tournament selection, two-point crossover, random resetting mutation, and the practice of elitism, orchestrated the successful optimization of the Rosenbrock function. Each parameter had a specific role in maintaining genetic diversity, facilitating exploration-exploitation balance, and promoting rapid convergence, which collectively led to the successful outcomes in this experiment. Observations from the same experimental run with 10 chromosomes offered intriguing insights. In this variation, the polynomial function showed a similar trend of consistent convergence, with the best result reaching a fitness value of -29000. Notably, this was achieved with a reduced chromosomal configuration, suggesting the robustness of the genetic algorithm despite the smaller gene pool. For the Rosenbrock function, the 10-chromosome variant manifested an impressive fitness result of 0.000041497596279779214, the most promising across all configurations. This result was unexpected given the reduced complexity and diversity inherent to a smaller chromosome count. Interestingly, unlike the 20-chromosome variant, this configuration seemed to favor the roulette wheel selection, uniform crossover, and uniform mutation. These methods are typically seen as more random and less disruptive to the existing gene pool, which might be beneficial when working with smaller chromosome sets. The roulette wheel selection, as compared to tournament selection, could prioritize individuals with higher fitness values more strongly in a smaller population, speeding up the convergence. The uniform crossover and mutation, on the other hand, could contribute to maintaining diversity in the population by spreading genes more uniformly, which might be crucial when dealing with a limited number of chromosomes. The comparison between these two variations reveals the nuanced relationship between the genetic algorithm's structural parameters and the optimization landscape it navigates. The more complex 20-chromosome variant found success with more disruptive techniques such as tournament selection, two-point crossover, and random resetting mutation. In contrast, the simpler 10-chromosome configuration seemed to thrive on less disruptive, more random strategies. This emphasizes the adaptability of genetic algorithms and underscores the importance of context when selecting the most suitable strategies and parameters. It also serves as a reminder of the diverse paths to optimization that genetic algorithms can take, depending on their structure and the problem at hand. In the 30-chromosome variation, the genetic algorithm exhibited another set of intriguing results. The polynomial function reached its peak performance in this variant, achieving an impressive fitness value of -89000. This suggests that the polynomial function's
optimization could benefit from the increased complexity and diversity inherent in the larger chromosome set. More chromosomes may offer the genetic algorithm more opportunities to explore and exploit the solution space, leading to better results. However, the Rosenbrock function displayed a contrasting trend. Despite the increased chromosome count, the best fitness achieved was 2.267, a less optimal outcome compared to the other variants. This outcome might suggest that the Rosenbrock function, with its complex, multi-modal landscape, could become increasingly challenging to optimize with more chromosomes. The added complexity might have increased the possibility of the genetic algorithm getting trapped in local minima, thereby degrading the performance. Interestingly, the optimal parameters for the 30-chromosome variant mirrored those of the 10-chromosome variant, with the exception of the mutation function. This again included roulette wheel selection and uniform crossover. However, uniform mutation was found to be the most suitable for this configuration, unlike in the 10-chromosome variant. This result underscores that a larger gene pool might benefit from a more uniform spread of genetic variance, which uniform mutation can provide. The differences in optimal parameters and outcomes across the 10, 20, and 30-chromosome variants highlight the dynamic interplay between the genetic algorithm's structure and the optimization landscape. They emphasize the importance of adaptability and strategic parameter selection in achieving successful results with genetic algorithms. Moreover, these results indicate that while increasing complexity (via more chromosomes) may enhance performance in some optimization problems, it could also pose new challenges in others, underscoring the nuanced nature of genetic algorithm performance across different problem domains. Reflecting on the conducted experiments and the resulting insights, certain alternative strategies or modifications may have led to different outcomes or added another layer of understanding to the functionality of genetic algorithms. One potential modification could be the introduction of different selection strategies. While the experiment involved tournament and roulette wheel selection methods, there are other methods like rank-based or truncation selection which could provide a different dynamic to the selection process. For instance, rank-based selection, which ranks individuals based on their fitness and selects them based on their rank rather than their absolute fitness, could potentially offer a more balanced approach towards selection, mitigating the risk of premature convergence. Another potential area of exploration could be population size variation. While the experiments focused on chromosome variations, manipulating the population size could offer new insights into the behavior of GAs. Larger populations might provide a more diverse gene pool to work with, potentially leading to more effective exploration of the search space. Conversely, smaller populations might require fewer resources, lead to faster computations, and perhaps put more emphasis on the exploitation of promising areas of the search space. Moreover, the implementation of alternative crossover techniques such as cycle crossover or order crossover, particularly in permutation-based problems, could have been another area for exploration. These methods respect the absolute position or order of the genes, which could offer unique advantages in specific problem domains. Additionally, the use of adaptive mutation rates could be considered. In this experiment, a fixed mutation rate was used. However, an adaptive mutation rate, which changes based on the fitness of the individuals or the generation number, could potentially provide a more dynamic balance between exploration and exploitation over the course of the evolution process. While the experiment provided considerable insights into the behavior and performance of GAs, these potential modifications and additions signify the vast possibilities that exist within the realm of genetic algorithms. They serve as a reminder that optimization strategies can always be
tailored, modified, and improved upon, highlighting the dynamic and adaptable nature of genetic algorithms. This opens up several exciting avenues for future exploration and experimentation in this field.

Comparison

In order to assess the performance of the Genetic Algorithm (GA) framework developed in this research study, it is important to compare its results with those of other well-known optimization approaches. While the implementation and evaluation of other algorithms were not conducted in this study, we can discuss the general characteristics and performance of some commonly used optimization algorithms and their applicability to the provided optimization problems.

Gradient Descent:

Gradient Descent is a popular optimization algorithm that relies on the calculation of the gradient of a function to iteratively update the solution towards the minimum. It is particularly effective for convex functions with smooth landscapes, where the gradient provides a reliable indication of the direction of improvement (Boyd & Vandenberghe, 2004). However, for non-convex and multimodal functions like the Rosenbrock function, Gradient Descent can easily get stuck in local minima and struggle to find the global optimum. The GA framework, with its exploration-exploitation balance and genetic diversity, has the potential to overcome this limitation and discover global optima by exploring different regions of the search space (Mitchell, 1998).

Simulated Annealing:

Simulated Annealing is a metaheuristic optimization algorithm that mimics the annealing process in metallurgy. It performs a stochastic search by accepting worse solutions with a certain probability to escape local optima (Kirkpatrick et al., 1983). Simulated Annealing can be effective for rugged landscapes and non-convex functions like the Rosenbrock function. However, it requires careful tuning of parameters such as the cooling schedule and acceptance probability. In comparison, the GA framework incorporates selection, crossover, and mutation operators that provide a systematic and structured exploration of the search space, potentially leading to more efficient optimization (Goldberg, 1989).

Particle Swarm Optimization (PSO):

PSO is a population-based optimization algorithm inspired by the collective behavior of bird flocking or fish schooling. It maintains a swarm of particles that explore the search space and communicate information about the best solutions found (Kennedy & Eberhart, 1995). PSO can be effective for both convex and non-convex functions. However, it heavily relies on the social interaction between particles and can suffer from premature convergence or stagnation. The GA framework, with its diverse selection, crossover, and mutation strategies, allows for a more balanced exploration and exploitation, reducing the risk of premature convergence and potentially achieving better optimization results (Bäck, 1996).

Evolution Strategies (ES):

ES is an optimization algorithm that focuses on the adaptation of a population through mutation and selection. It is particularly effective for continuous function optimization problems and can handle non-convex landscapes. ES typically employs Gaussian mutation to explore the search space (Bäck, 1996). While ES shares some similarities with the GA framework in terms of genetic diversity and adaptation, the GA framework incorporates additional selection and crossover operators that provide a more comprehensive exploration of the search space and potentially lead to better optimization performance (Mitchell, 1998).

It is important to note that the performance of
optimization algorithms can vary depending on the specific characteristics of the problem being solved. Different algorithms may excel in different scenarios, and their effectiveness can be influenced by factors such as the function's landscape, dimensionality, and constraints. Therefore, it is recommended to carefully select and tailor the optimization algorithm to the specific problem at hand. While this study focused on comparing the performance of the GA framework with different parameter combinations on the Rosenbrock function and a polynomial quartic equation, future research could include a more comprehensive evaluation of the GA framework against other optimization algorithms using a wider range of benchmark functions and performance metrics. This would provide a more rigorous and quantitative comparison of the GA framework's performance and its advantages over other approaches in various problem domains.

Conclusion

In conclusion, this paper has delved deep into the intricate dynamics of genetic algorithms, presenting an exhaustive investigation into the interactions of various GA operators and structural elements in the optimization of complex functions like the Rosenbrock function and a polynomial fitness function. The findings underscore the profound influence that the selection, crossover, and mutation strategies, along with the chromosome count, can exert on the performance and outcomes of the genetic algorithm. In particular, this work has revealed how each of these elements, when aptly configured, can play an instrumental role in managing the balance between exploration and exploitation in the search space. This critical balance is, ultimately, what drives the ability of a genetic algorithm to discover potential solutions and reach global optima in challenging optimization landscapes. The experimental results, however, also illustrate the inherent complexities and nuances associated with genetic algorithms. The variability in outcomes across different chromosome configurations and GA parameters underlines the diversity of strategies that can lead to optimization success, and the adaptability required to achieve it. Moreover, the reflection on alternative strategies and methods that were not explored in this work serves to remind us of the vast potential that genetic algorithms hold. As optimization strategies can always be tailored, modified, and enhanced, the realm of genetic algorithms opens up a multitude of exciting paths for future exploration and experimentation. This work, therefore, not only contributes valuable insights to the understanding and application of genetic algorithms but also encourages further exploration and innovation in this dynamic and versatile field of study. Ultimately, the findings attest to the robustness and versatility of genetic algorithms as a powerful tool for solving complex optimization problems, a tool that continues to hold significant promise for future research and applications.
Appendix

Figure 1: 20 chromosomes
Figure 2: 20 chromosomes
Figure 3: 20 chromosomes
Figure 4: 20 chromosomes
Figure 5: 20 chromosomes
Figure 6: 20 chromosomes
Figure 7: 10 chromosomes
Figure 8: 10 chromosomes
Figure 9: 10 chromosomes
Figure 10: 10 chromosomes
Figure 11: 10 chromosomes
Figure 12: 30 chromosomes
Figure 13: 30 chromosomes
Figure 14: 30 chromosomes
Figure 15: 30 chromosomes
Figure 16: 30 chromosomes
Figure 17: 30 chromosomes
Code
'''

This code implements a Genetic Algorithm (GA) framework that tests different parameter
combinations across multiple generations and tracks the performance in terms of best and
average fitness. The individuals in the population represent solutions to a given optimization
problem.

The Individual class defines an individual in the population with genes and a fitness value. The
create_initial_population function initializes the population with random gene values and
calculates the fitness of each individual.

The polynomial_quartic function and rosenbrock_like_function are fitness functions that
evaluate the quality of solutions. Lower fitness values represent better solutions.

tournament_selection and roulette_wheel_selection functions perform the selection operation
for the GA. In tournament selection, two individuals are chosen at random and the fitter one is
selected. In roulette wheel selection, individuals are selected based on their relative fitnesses.

Three types of crossover functions are defined: uniform_crossover, single_point_crossover, and
two_point_crossover. These functions mix the genes of two parent individuals to generate
offspring.

Two types of mutation functions are defined: uniform_mutation and
random_resetting_mutation. The mutation operation introduces variability in the population by
slightly altering the genes of the individuals.

The script then initializes some parameters, like the population size (P), number of genes per
individual (N), and number of generations (GENERATIONS). It also prepares a list of selection,
crossover, mutation, and fitness functions to be tested. It includes two types of selection
methods, three types of crossover methods, two types of mutation methods, and two fitness
functions.
Then, for a predefined number of sets (SET_OF_GENERATIONS), the GA runs for a number of
generations, selecting one method from each list at random. It also randomly selects mutation
rate (MUTRATE) and mutation step size (MUTSTEP). All these details are printed and written into
a file.

Each generation involves performing selection, crossover, and mutation operations on the
population. The fittest individual from each generation is preserved (elitism) and replaces the
least fit individual in the next generation. At the end of each generation, the best and average
fitnesses of the population are computed and stored. The process continues for a given number
of generations.

Finally, the code visualizes the best and average fitness across each set of generations, and
overall across all sets, using matplotlib. The generation and set that achieved the best overall
fitness are also recorded. The best fitness achieved over all generations and all sets is printed
out at the end.

'''

import random
import copy
import matplotlib.pyplot as plt
import math
import itertools

class Individual:
    def __init__(self, N, MIN, MAX):
        self.gene = [random.uniform(MIN, MAX) for _ in range(N)]
        self.fitness = 0


def create_initial_population(P, N, MIN, MAX, fitness_function):
    population = []
    for _ in range(P):
        newind = Individual(N, MIN, MAX)
        newind.fitness = fitness_function(newind, N)
        population.append(newind)
    return population

def rosenbrock_like_function(ind, N):
    # Equation 1: f(x) = (x_1 - 1)^2 + sum_{i=2}^{d} i * (2*x_i^2 - x_{i-1})^2
    return (ind.gene[0] - 1)**2 + sum((i+1) * (2 * ind.gene[i]**2 - ind.gene[i-1])**2 for i in range(1, N))


def polynomial_quartic(ind, N):
    # Note: (16*gene)**2 evaluates to 256*gene**2, not the 16*gene**2 written in
    # Equation 2; the fitness values reported above (e.g. around -58000) follow
    # from this form together with the (-5, 5) gene bounds.
    return 0.5 * sum((ind.gene[i])**4 - (16*ind.gene[i])**2 + (5*ind.gene[i]) for i in range(N))
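As a quick sanity check on the two equations, here they are implemented over plain Python lists rather than the framework's Individual objects (the helper names eq1 and eq2 are illustrative). Equation 1 is a Dixon–Price-style function whose minimum of 0 requires x_1 = 1 and 2*x_i^2 = x_{i-1}; eq2 follows Equation 2 as printed (the Styblinski–Tang form, per-dimension minimum of roughly -39.166 at x_i ≈ -2.9035), which, as noted above, differs from what polynomial_quartic actually computes.

```python
def eq1(x):
    # Equation 1: (x_1 - 1)^2 + sum_{i=2}^{d} i * (2*x_i^2 - x_{i-1})^2
    return (x[0] - 1)**2 + sum((i + 1) * (2 * x[i]**2 - x[i-1])**2 for i in range(1, len(x)))

def eq2(x):
    # Equation 2 as printed: 0.5 * sum(x_i^4 - 16*x_i^2 + 5*x_i)
    return 0.5 * sum(g**4 - 16 * g**2 + 5 * g for g in x)

# Equation 1 is zero when x_1 = 1 and 2*x_2^2 = x_1, i.e. x_2 = sqrt(1/2)
print(eq1([1.0, 0.5**0.5]))   # ~0.0
# Equation 2's per-dimension minimum, about -39.166
print(eq2([-2.903534]))
```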

def tournament_selection(population, P):
    offspring = []
    for i in range(P):
        parent1 = random.randint(0, P-1)
        off1 = copy.deepcopy(population[parent1])
        parent2 = random.randint(0, P-1)
        off2 = copy.deepcopy(population[parent2])
        # Lower fitness is better: keep the fitter of the two random parents
        if off1.fitness < off2.fitness:
            offspring.append(off1)
        else:
            offspring.append(off2)
    return offspring

def roulette_wheel_selection(population, P):
    # Minimisation via inverse fitness. Note the weights assume strictly
    # positive fitness values; with all-negative fitnesses (as the quartic
    # function produces) the intended selection pressure is inverted.
    total_fitness = sum(1.0 / individual.fitness for individual in population)
    selection_probs = [(1.0 / individual.fitness) / total_fitness for individual in population]

    offspring = []
    for _ in range(P):
        selected = random.choices(
            population=population,
            weights=selection_probs,
            k=1
        )[0]
        offspring.append(copy.deepcopy(selected))

    return offspring
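Rank-based selection, suggested later in the report as an alternative, weights individuals by rank rather than by raw fitness, making the pressure independent of fitness scale. A hedged sketch compatible with the lower-is-better convention (a hypothetical extension, not part of the original framework; _Toy is a stand-in for Individual):

```python
import copy
import random

def rank_based_selection(population, P):
    # Sort best-first (lower fitness is better), then weight by rank:
    # the best individual gets weight P, the worst gets weight 1.
    ranked = sorted(population, key=lambda ind: ind.fitness)
    weights = [P - rank for rank in range(P)]
    chosen = random.choices(ranked, weights=weights, k=P)
    return [copy.deepcopy(ind) for ind in chosen]

class _Toy:
    # Minimal stand-in for the framework's Individual (illustration only)
    def __init__(self, fitness):
        self.fitness = fitness

random.seed(0)
parents = [_Toy(f) for f in (5.0, 1.0, 3.0)]
selected = rank_based_selection(parents, 3)
print([ind.fitness for ind in selected])
```

Because only ranks matter, one extremely fit individual cannot dominate the draw the way it can under raw-fitness roulette selection.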

def uniform_crossover(offspring, P, N):
    for i in range(0, P, 2):
        child1 = copy.deepcopy(offspring[i])
        child2 = copy.deepcopy(offspring[i+1])
        for j in range(N):
            if random.random() < 0.5:
                child1.gene[j], child2.gene[j] = child2.gene[j], child1.gene[j]
        offspring[i] = copy.deepcopy(child1)
        offspring[i+1] = copy.deepcopy(child2)
    return offspring

def single_point_crossover(offspring, P, N):
    for i in range(0, P, 2):
        child1 = copy.deepcopy(offspring[i])
        child2 = copy.deepcopy(offspring[i+1])
        crossover_point = random.randint(0, N-1)
        child1.gene[crossover_point:], child2.gene[crossover_point:] = \
            child2.gene[crossover_point:], child1.gene[crossover_point:]
        offspring[i] = child1
        offspring[i+1] = child2
    return offspring

def two_point_crossover(offspring, P, N):
    for i in range(0, P, 2):
        child1 = copy.deepcopy(offspring[i])
        child2 = copy.deepcopy(offspring[i+1])
        crossover_point1 = random.randint(0, N-1)
        crossover_point2 = random.randint(0, N-1)
        if crossover_point2 < crossover_point1:
            crossover_point1, crossover_point2 = crossover_point2, crossover_point1
        child1.gene[crossover_point1:crossover_point2], child2.gene[crossover_point1:crossover_point2] = \
            child2.gene[crossover_point1:crossover_point2], child1.gene[crossover_point1:crossover_point2]
        offspring[i] = child1
        offspring[i+1] = child2
    return offspring
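With the cut points fixed rather than drawn at random, the effect of two-point crossover is easy to see on small gene lists (a toy slice-swap helper, separate from the Individual-based operator above):

```python
def swap_segment(g1, g2, p1, p2):
    # Exchange the slice [p1:p2) between two gene lists,
    # exactly as two_point_crossover does for one pair
    g1 = list(g1)
    g2 = list(g2)
    g1[p1:p2], g2[p1:p2] = g2[p1:p2], g1[p1:p2]
    return g1, g2

a, b = swap_segment([0, 0, 0, 0, 0], [1, 1, 1, 1, 1], 1, 3)
print(a)  # [0, 1, 1, 0, 0]
print(b)  # [1, 0, 0, 1, 1]
```

The genes outside the two cut points are preserved, which is why the report argues this operator can keep promising gene sequences intact.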

def uniform_mutation(offspring, P, N, MUTRATE, MIN, MAX):
    # Note: MUTSTEP and fitness_function are read from module-level
    # globals that the main loop assigns for each set.
    ELITISM_MUTRATE = 0.01  # mutation rate for the elite individual
    for i in range(P):
        newind = Individual(N, MIN, MAX)
        for j in range(N):
            gene = offspring[i].gene[j]
            mutprob = random.random()
            if i == 0:  # the best individual (elitism)
                if mutprob < ELITISM_MUTRATE:  # use a smaller mutation rate
                    gene += random.uniform(-MUTSTEP, MUTSTEP)
                    gene = max(min(gene, MAX), MIN)  # Ensure gene is within bounds
            else:  # the rest of the individuals
                if mutprob < MUTRATE:
                    gene += random.uniform(-MUTSTEP, MUTSTEP)
                    gene = max(min(gene, MAX), MIN)  # Ensure gene is within bounds
            newind.gene[j] = gene
        newind.fitness = fitness_function(newind, N)
        offspring[i] = copy.deepcopy(newind)
    return offspring

def random_resetting_mutation(offspring, P, N, MUTRATE, MIN, MAX):
    # fitness_function is read from the module-level global set in the main loop.
    for i in range(P):
        for j in range(N):
            mutprob = random.random()
            if mutprob < MUTRATE:
                offspring[i].gene[j] = random.uniform(MIN, MAX)  # Uniformly random new value
        offspring[i].fitness = fitness_function(offspring[i], N)  # Recalculate fitness
    return offspring
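The behavioural difference between the two mutation operators, discussed in the Experimentation section, can be checked on plain gene lists. These are simplified stand-alone versions (names creep and reset are illustrative; the framework's elitism special case is omitted):

```python
import random

def creep(genes, rate, step, lo, hi, rng):
    # Uniform (creep) mutation: nudge a gene by at most +/- step, then clamp
    return [min(max(g + rng.uniform(-step, step), lo), hi) if rng.random() < rate else g
            for g in genes]

def reset(genes, rate, lo, hi, rng):
    # Random resetting: replace the gene with a fresh uniform draw from [lo, hi]
    return [rng.uniform(lo, hi) if rng.random() < rate else g for g in genes]

rng = random.Random(42)
genes = [0.0] * 1000
crept = creep(genes, rate=1.0, step=0.15625, lo=-5, hi=5, rng=rng)
reseted = reset(genes, rate=1.0, lo=-5, hi=5, rng=rng)
# Creep moves stay within one step of the parent gene;
# resets can land anywhere in the permissible range
print(max(abs(g) for g in crept))    # <= 0.15625
print(max(abs(g) for g in reseted))  # <= 5, typically much larger than one step
```

This is the trade-off the report describes: creep mutation makes small local moves, while random resetting can jump out of a local minimum in a single mutation.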

def total_fitness(population):
    total = 0
    for individual in population:
        total += individual.fitness
    return total
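Elitism, as described above (the fittest individual survives and replaces the least fit of the next generation), can also be expressed as a small helper. This is a sketch only; the original script inlines this logic in its main loop, and _Toy is a stand-in for Individual:

```python
import copy

class _Toy:
    # Minimal stand-in for the framework's Individual (illustration only)
    def __init__(self, fitness):
        self.fitness = fitness

def apply_elitism(previous_best, offspring):
    # Replace the worst (highest-fitness) offspring with the
    # previous generation's best individual
    worst = max(range(len(offspring)), key=lambda k: offspring[k].fitness)
    offspring[worst] = copy.deepcopy(previous_best)
    return offspring

new_gen = apply_elitism(_Toy(1.0), [_Toy(2.0), _Toy(9.0), _Toy(4.0)])
print(sorted(ind.fitness for ind in new_gen))  # [1.0, 2.0, 4.0]
```

Because the previous best always survives, the best fitness recorded per generation is non-increasing, which is what drives the rapid convergence noted in the experiments.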

N = 20
P = 5000
GENERATIONS = 100
SET_OF_GENERATIONS = 432
set_generations = []
set_best_fitness = []
set_average_fitness = []
best_of_bests = float('inf')
# Create a dictionary to store the best results of each set
best_results_per_set = {}

selection_functions = [tournament_selection, roulette_wheel_selection]
crossover_functions = [uniform_crossover, single_point_crossover, two_point_crossover]
mutation_functions = [uniform_mutation, random_resetting_mutation]
fitness_functions = [polynomial_quartic, rosenbrock_like_function]
mutation_rates = [0.5, 0.25, 0.12, 0.06, 0.03, 0.015]
mutation_steps = [5, 2.5, 1.25, 0.625, 0.3125, 0.15625]
# Define the fitness function to bounds mapping
fitness_bounds = {
    polynomial_quartic: (-5, 5),
    rosenbrock_like_function: (-10, 10),
}

# Generate all combinations
combinations = list(itertools.product(
    selection_functions,
    crossover_functions,
    mutation_functions,
    fitness_functions,
    mutation_rates,
    mutation_steps
))
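A small check on the size of this search space: 2 selection × 3 crossover × 2 mutation × 2 fitness × 6 mutation rates × 6 mutation steps gives 864 combinations, so a run with SET_OF_GENERATIONS = 432 pops exactly half of them (sampled without replacement):

```python
import itertools

# Option-list sizes as defined in the script: selection, crossover,
# mutation, fitness, mutation rates, mutation steps
sizes = [2, 3, 2, 2, 6, 6]
n_combinations = len(list(itertools.product(*(range(s) for s in sizes))))
print(n_combinations)       # 864
print(n_combinations // 2)  # 432, the number of sets per run
```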

with open('Big run MAIN.txt', 'a') as f:
    for set_gen in range(SET_OF_GENERATIONS):
        generations = []
        best_fitness = []
        average_fitness = []
        best_fitness_offspring = float('inf')
        # Draw a random, not-yet-used combination
        (selection_function, crossover_function, mutation_function,
         fitness_function, MUTRATE, MUTSTEP) = combinations.pop(
            random.randint(0, len(combinations) - 1))
        MIN, MAX = fitness_bounds[fitness_function]

        # Print the selected options at the start of the run
        print(f"Run {set_gen + 1} Selections:")
        print(f"Selection Function: {selection_function.__name__}")
        print(f"Crossover Function: {crossover_function.__name__}")
        print(f"Mutation Function: {mutation_function.__name__}")
        print(f"Fitness Function: {fitness_function.__name__}")
        print(f"Mutation Rate: {MUTRATE}")
        print(f"Mutation Step: {MUTSTEP}")
        print(f"Bounds: {MIN} to {MAX}")

        # Write the same values to the log file
        f.write("\n")
        f.write(f"Run {set_gen + 1} Selections:\n")
        f.write(f"Selection Function: {selection_function.__name__}\n")
        f.write(f"Crossover Function: {crossover_function.__name__}\n")
        f.write(f"Mutation Function: {mutation_function.__name__}\n")
        f.write(f"Fitness Function: {fitness_function.__name__}\n")
        f.write(f"Mutation Rate: {MUTRATE}\n")
        f.write(f"Mutation Step: {MUTSTEP}\n")
        f.write(f"Bounds: {MIN} to {MAX}\n")

        # Store the combination and best fitness for this set
        best_results_per_set[set_gen] = {
            "selection_function": selection_function.__name__,
            "crossover_function": crossover_function.__name__,
            "mutation_function": mutation_function.__name__,
            "fitness_function": fitness_function.__name__,
            "mutation_rate": MUTRATE,
            "mutation_step": MUTSTEP,
            "best_fitness": float('inf'),     # updated during the run
            "best_fitness_generation": -1,    # updated during the run
        }

        population = create_initial_population(P, N, MIN, MAX, fitness_function)

        for gen in range(GENERATIONS):
            # Elitism: retain the best individual from the current generation
            best_individual = min(population, key=lambda ind: ind.fitness)

            offspring = selection_function(population, P)
            offspring = crossover_function(offspring, P, N)

            # Replace the worst individual in the new generation with the best
            # individual from the previous generation
            worst_individual_index = max(range(P), key=lambda index: offspring[index].fitness)
            offspring[worst_individual_index] = copy.deepcopy(best_individual)

            # Apply the mutation operator selected for this run
            offspring = mutation_function(offspring, P, N, MUTRATE, MIN, MAX)

            total_fitness_offspring = total_fitness(offspring)
            new_best_fitness_offspring = min(ind.fitness for ind in offspring)
            avg_fitness_offspring = total_fitness_offspring / P

            # Update the best fitness and the generation where it was achieved
            if new_best_fitness_offspring < best_results_per_set[set_gen]["best_fitness"]:
                best_results_per_set[set_gen]["best_fitness"] = new_best_fitness_offspring
                best_fitness_offspring = new_best_fitness_offspring
                best_results_per_set[set_gen]["best_fitness_generation"] = gen

            generations.append(gen)
            best_fitness.append(best_fitness_offspring)
            average_fitness.append(avg_fitness_offspring)

            # Track the overall best fitness and the generation and set where it was achieved
            if new_best_fitness_offspring < best_of_bests:
                best_of_bests = new_best_fitness_offspring
                best_generation = gen
                best_set = set_gen

            print(f"Generation: {gen+1} | Best Fitness: {best_fitness_offspring} | "
                  f"Average Fitness: {avg_fitness_offspring} | "
                  f"Mutation Rate: {MUTRATE} | Mutation Step Size: {MUTSTEP}")
            f.write(f"Generation: {gen+1} | Best Fitness: {best_fitness_offspring} | "
                    f"Average Fitness: {avg_fitness_offspring} | "
                    f"Mutation Rate: {MUTRATE} | Mutation Step Size: {MUTSTEP}\n")
            population = copy.deepcopy(offspring)

        # Per-set summary: average of the generation averages and best of the bests
        avg_of_avgs = sum(average_fitness) / len(average_fitness)
        best_of_gen_bests = min(best_fitness)
        print(f"Set: {set_gen+1} | Average Fitness: {avg_of_avgs} | "
              f"Best Fitness: {best_of_gen_bests}\n")
        f.write(f"Set: {set_gen+1} | Average Fitness: {avg_of_avgs} | "
                f"Best Fitness: {best_of_gen_bests}\n\n")

        # Add this set's results to the per-set result lists
        set_generations.append(generations)
        set_best_fitness.append(best_fitness)
        set_average_fitness.append(average_fitness)

    # Print the best results for each set
    for set_gen, results in best_results_per_set.items():
        print(f"Set {set_gen+1} | Best fitness: {results['best_fitness']} "
              f"(achieved in generation {results['best_fitness_generation']+1})")
        f.write(f"Set {set_gen+1} | Best fitness: {results['best_fitness']} "
                f"(achieved in generation {results['best_fitness_generation']+1})\n\n")
        print(f"  with parameters: selection function={results['selection_function']}, "
              f"crossover function={results['crossover_function']}, "
              f"mutation function={results['mutation_function']}, "
              f"fitness function={results['fitness_function']}, "
              f"mutation rate={results['mutation_rate']}, "
              f"mutation step={results['mutation_step']}")
        f.write(f'''with parameters:
selection function={results['selection_function']}
crossover function={results['crossover_function']}
mutation function={results['mutation_function']}
fitness function={results['fitness_function']}
mutation rate={results['mutation_rate']}
mutation step={results['mutation_step']}\n\n''')

n_sets_per_figure = 9  # the number of sets displayed per figure
n_figures = math.ceil(SET_OF_GENERATIONS / n_sets_per_figure)  # the number of figures

for figure_i in range(n_figures):
    fig, axs = plt.subplots(3, 3, figsize=(10, 15))
    fig.suptitle(f'Fitness per Generation (figure {figure_i + 1})')
    for j in range(n_sets_per_figure):
        set_i = figure_i * n_sets_per_figure + j  # the actual set index
        if set_i >= SET_OF_GENERATIONS:  # all sets have been plotted
            break

        ax_row = j // 3
        ax_col = j % 3

        # Plot both curves on the same subplot for this set
        axs[ax_row, ax_col].plot(set_generations[set_i], set_best_fitness[set_i],
                                 label='Best Fitness')
        axs[ax_row, ax_col].plot(set_generations[set_i], set_average_fitness[set_i],
                                 label='Average Fitness')
        axs[ax_row, ax_col].set_title(f'Set {set_i + 1} Fitness')
        axs[ax_row, ax_col].legend()

    plt.tight_layout()
    plt.savefig(f'figure_{figure_i + 1}.png')  # save each figure under a unique name
    plt.show()

