Non-Continuous Function Optimization Using Genetic Algorithm
Gradient Descent and Newton's method are popular choices for finding the minima of a continuous function. But have you ever wondered how to optimize a non-continuous (discontinuous) function? This is where the genetic algorithm helps. In this article, we will walk through its implementation in Python and provide an intuitive explanation of each step.
Introduction
Gradient Descent and Newton's method both fail on a discontinuous function. This is because Gradient Descent requires computation of the first derivative, and Newton's method requires both the first and second derivatives. At a point of discontinuity, however, neither derivative is defined, which makes both algorithms unusable there.
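To make this concrete, here is a minimal self-contained sketch (using the same discontinuous function we will define below) showing that a central finite-difference estimate of the slope diverges as the step shrinks across the jump at x = 2:

import numpy as np

def f(x):
    return abs(np.cos(x)) + abs(np.sin(x)) + (20 if x > 2 else 0)

# Central difference straddling the discontinuity at x = 2:
# the estimated slope grows without bound as the step shrinks
for h in [1e-1, 1e-3, 1e-5]:
    slope = (f(2 + h) - f(2 - h)) / (2 * h)
    print(f"h = {h:.0e}, finite-difference slope at x = 2: {slope:.1f}")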
The genetic algorithm comes to the rescue in such a scenario. It works on the principles of fitness evaluation, parent selection, crossover, mutation, and generation replacement, which run in cycles:

Let's go through and understand each step described above:
Objective Function & Input Parameters
We start by defining the objective function. The objective function represents the discontinuous function whose minimum we are trying to find.
Input parameters include:
- population size: the number of candidate solutions in each generation,
- number of generations: how many times we run this cycle,
- mutation rate: the probability of infusing a small random change into an offspring generated by crossover between parents,
- crossover rate: the probability with which a pair of parents is actually combined; otherwise they pass through unchanged,
- tournament size: the number of individuals randomly sampled from the population in each selection round.
import random
import numpy as np

# Define the non-continuous function to be optimized
def objective_function(x):
    return abs(np.cos(x)) + abs(np.sin(x)) + (20 if x > 2 else 0)

# GA parameters
population_size = 100  # Number of individuals in the population
generations = 500      # Number of generations
mutation_rate = 0.1    # Probability of mutation
crossover_rate = 0.8   # Probability of crossover
tournament_size = 5    # Tournament size for selection
x_min, x_max = -10, 10 # Search space bounds
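As a quick aside (this plotting snippet is purely illustrative and assumes matplotlib is available), we can visualize the objective over the search space; the +20 jump at x = 2 is clearly visible:

import matplotlib.pyplot as plt

xs = np.linspace(x_min, x_max, 2000)
ys = [objective_function(x) for x in xs]
plt.plot(xs, ys)
plt.axvline(2, linestyle="--", color="red")  # point of discontinuity
plt.xlabel("x")
plt.ylabel("objective_function(x)")
plt.title("Discontinuous objective: +20 jump for x > 2")
plt.show()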
1. Population (1st Generation)
In this step, we create the initial generation of random solutions. We obtain this population by sampling uniformly between the minimum and maximum bounds of the search space.

# Generate initial population of random solutions
def initialize_population(pop_size, x_min, x_max):
    return np.random.uniform(x_min, x_max, pop_size)
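As a quick sanity check (illustrative only; the printed values will vary from run to run):

sample = initialize_population(5, x_min, x_max)
print(sample)  # five random values uniformly drawn from [-10, 10]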
2. Fitness
Fitness is simply the value of the objective function evaluated for each individual in the population of random solutions defined above. Since we are minimizing, lower fitness is better.
fitness = np.array([objective_function(ind) for ind in population])
3. Selection (Choosing Parents)
In this step, we repeatedly draw a random sample of individuals, along with their fitness scores, from the population. The size of each sample equals the tournament size we defined initially.
From each tournament, we select a winner: the individual with the minimum objective value (fitness score).
This is repeated as many times as the population size, so at the end of selection we have the same number of winners as the size of the population initially defined.
def select(population, fitness):
    selected = []
    for _ in range(len(population)):
        # Randomly sample tournament_size individuals with their fitness
        tournament = random.sample(list(zip(population, fitness)), tournament_size)
        winner = min(tournament, key=lambda x: x[1])  # Minimize the objective
        selected.append(winner[0])
    return np.array(selected)
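A toy run (illustrative only) makes the selection pressure visible: because lower objective values win tournaments, the mean fitness of the selected pool should drop relative to the original population:

population = initialize_population(population_size, x_min, x_max)
fitness = np.array([objective_function(ind) for ind in population])
selected = select(population, fitness)
print(f"mean fitness before selection: {fitness.mean():.3f}")
print(f"mean fitness after selection:  {np.mean([objective_function(s) for s in selected]):.3f}")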
4. Crossover (Mating/Combining Parents)
This is the step in which we combine parents to create offspring. Note that the crossover rate is the probability with which a pair of parents is actually combined; otherwise the parents are returned unchanged.

def crossover(parent1, parent2):
    if random.random() < crossover_rate:
        # Create children by averaging the parents' solutions
        child1 = (parent1 + parent2) / 2  # Simple average crossover for real-valued variables
        child2 = (parent2 + parent1) / 2  # Identical to child1 under plain averaging
        return child1, child2
    else:
        return parent1, parent2
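Note that with plain averaging, child1 and child2 come out identical, which limits diversity. A common alternative, shown here as a sketch and not part of the code above, is a blend crossover where each child is a different convex combination of the parents:

def blend_crossover(parent1, parent2):
    if random.random() < crossover_rate:
        alpha = random.random()  # random mixing weight in [0, 1]
        child1 = alpha * parent1 + (1 - alpha) * parent2
        child2 = (1 - alpha) * parent1 + alpha * parent2
        return child1, child2
    return parent1, parent2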
5. Mutation (Random Changes)
We then, with probability equal to the mutation rate, add a small random perturbation (the mutation) drawn from a chosen range to a child. Finally, we clip the child so that it stays within the bounds we defined initially.

def mutate(child):
    if random.random() < mutation_rate:
        child += np.random.uniform(-0.5, 0.5)  # Small random perturbation
    # Ensure the child stays within bounds
    child = np.clip(child, x_min, x_max)
    return child
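A small illustrative check confirms that, thanks to the clipping step, a child stays within the search bounds no matter how many mutations it accumulates:

child = 9.9  # start near the upper bound
for _ in range(100):
    child = mutate(child)
assert x_min <= child <= x_max
print(f"child after 100 mutation steps: {child:.3f} (still within [{x_min}, {x_max}])")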
6. Genetic Algorithm Overall Process:
We repeat the above process for as many generations as initially defined. The best solution is the individual with the minimum objective value (fitness) seen across all generations.
def genetic_algorithm():
    population = initialize_population(population_size, x_min, x_max)  # Initialize the population
    best_solution = None
    best_fitness = float('inf')
    for gen in range(generations):  # Run for a number of generations
        # Evaluate fitness of the population
        fitness = np.array([objective_function(ind) for ind in population])
        # Update the best solution found so far
        min_fitness_idx = np.argmin(fitness)
        if fitness[min_fitness_idx] < best_fitness:
            best_fitness = fitness[min_fitness_idx]
            best_solution = population[min_fitness_idx]
        # Select parents based on fitness
        selected = select(population, fitness)
        # Create next generation through crossover and mutation
        next_generation = []
        for i in range(0, len(selected), 2):  # Pair up individuals for mating (assumes even population size)
            parent1 = selected[i]
            parent2 = selected[i + 1]
            child1, child2 = crossover(parent1, parent2)  # Crossover step
            next_generation.append(mutate(child1))  # Mutation step
            next_generation.append(mutate(child2))  # Mutation step
        # Replace population with the next generation
        population = np.array(next_generation)
        # Print progress every 50 generations
        if gen % 50 == 0:
            print(f"Generation {gen}, Best Solution: {best_solution}, Best Fitness: {best_fitness}")
    return best_solution, best_fitness
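Putting it all together, running the algorithm is a single call (exact numbers will vary from run to run since the process is stochastic):

best_solution, best_fitness = genetic_algorithm()
print(f"Best solution found: x = {best_solution:.4f}")
print(f"Objective value at best solution: {best_fitness:.4f}")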
Summary
We walked through a step-by-step explanation and implementation of how to perform numerical optimization of a non-continuous function using a genetic algorithm.
If you liked the explanation, follow me for more! Feel free to leave your comments if you have any queries or suggestions.
You can also check out my other articles on data science and computing on Medium. If you like my work and want to contribute to my journey, you can always buy me a coffee :)
