Python Optimization Algorithms: Genetic Algorithm Example Code


Below is a complete walkthrough of implementing genetic algorithms in Python, with example code.

1. Introduction to Genetic Algorithms

A genetic algorithm is an optimization algorithm based on natural selection and the principles of genetics. By simulating the process of biological evolution, it gradually improves candidate solutions to a problem. Its basic workflow consists of initializing a population and then repeatedly applying selection, crossover, and mutation.
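
To make this flow concrete before each step is covered in detail, here is a small self-contained toy run on the "OneMax" problem (maximize the number of 1-bits in a chromosome). This toy example is an added illustration and uses vectorized shortcuts rather than the helper functions defined in Section 2:

import numpy as np

# Toy problem (OneMax): maximize the number of 1-bits in a binary chromosome
pop_size, chrom_length, max_iter, mutation_rate = 20, 10, 50, 0.05
population = np.random.randint(2, size=(pop_size, chrom_length))      # initialize population

for generation in range(max_iter):
    fitness = population.sum(axis=1) + 1e-6                           # evaluate fitness
    probs = fitness / fitness.sum()
    parent_idx = np.random.choice(pop_size, size=pop_size, p=probs)   # roulette-style selection
    parents = population[parent_idx]
    children = []
    for k in range(pop_size // 2):                                    # single-point crossover
        p1, p2 = parents[2 * k], parents[2 * k + 1]
        point = np.random.randint(1, chrom_length)
        children.append(np.concatenate((p1[:point], p2[point:])))
        children.append(np.concatenate((p2[:point], p1[point:])))
    population = np.array(children)
    mask = np.random.rand(pop_size, chrom_length) < mutation_rate     # bit-flip mutation
    population[mask] = 1 - population[mask]

print("Best individual:", population[population.sum(axis=1).argmax()])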

2. Implementing a Genetic Algorithm in Python

2.1 Initializing the Population

In a genetic algorithm, the population is a set of individuals, each of which encodes one candidate solution to the problem. In Python, we can use the numpy library to generate random binary strings to initialize the population.

Here is an example of initializing a population:

import numpy as np

def init_population(pop_size, chrom_length):
    population = np.random.randint(2, size=(pop_size, chrom_length))
    return population

In this example, we use the numpy.random.randint() function to generate a random matrix of shape (pop_size, chrom_length) in which every element is either 0 or 1.
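
For instance, a quick check of the output (the exact values will differ from run to run) might look like this:

pop = init_population(pop_size=4, chrom_length=8)
print(pop.shape)  # (4, 8)
print(pop)        # a random 0/1 matrix, e.g. [[0 1 1 0 1 0 0 1] ...]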

2.2 Selection

Selection picks a subset of individuals from the population to serve as the parents of the next generation. In genetic algorithms, this is usually done with roulette-wheel selection or tournament selection.

Here is an example using roulette-wheel selection:

def roulette_selection(population, fitness):
    # Roulette-wheel selection: individuals are picked with probability
    # proportional to their fitness, so fitness values must be non-negative
    fitness_sum = np.sum(fitness)
    fitness_ratio = fitness / fitness_sum
    cum_sum = np.cumsum(fitness_ratio)
    selected_index = []
    for i in range(len(population)):
        # Spin the wheel: keep the first individual whose cumulative share exceeds r
        r = np.random.rand()
        for j in range(len(cum_sum)):
            if r < cum_sum[j]:
                selected_index.append(j)
                break
    selected_population = population[selected_index]
    return selected_population

In this example, we first compute each individual's share of the total fitness and then take the cumulative sum of those shares. We draw a random number with numpy.random.rand() and select the individual whose cumulative share is the first to exceed it, repeating this until enough individuals have been selected, and finally return the selected individuals. Note that this scheme assumes the fitness values are non-negative and that larger values are better.
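
Section 2.2 also mentions tournament selection, which is not implemented in this article; the following is a minimal sketch of one common form, where the tournament size k=3 is an arbitrary choice added for illustration:

def tournament_selection(population, fitness, k=3):
    # Repeatedly draw k random individuals and keep the fittest of each group
    selected_index = []
    for _ in range(len(population)):
        candidates = np.random.randint(len(population), size=k)
        winner = candidates[np.argmax(fitness[candidates])]
        selected_index.append(winner)
    return population[selected_index]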

2.3 Crossover

Crossover pairs up the chromosomes of two parent individuals and recombines them to produce new offspring. In genetic algorithms, crossover is usually implemented as single-point or multi-point crossover.

Here is an example using single-point crossover:

def single_point_crossover(parent1, parent2):
    chrom_length = len(parent1)
    # Pick a random cut point, then swap the tails of the two parents
    crossover_point = np.random.randint(1, chrom_length)
    child1 = np.concatenate((parent1[:crossover_point], parent2[crossover_point:]))
    child2 = np.concatenate((parent2[:crossover_point], parent1[crossover_point:]))
    return child1, child2

In this example, we first draw a random integer to serve as the crossover point. We then cut both parents' chromosomes at that point and swap the tails, producing two new offspring.
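
For comparison, the multi-point crossover mentioned above can be sketched with two cut points; this variant is added here for illustration and is not part of the original code:

def two_point_crossover(parent1, parent2):
    chrom_length = len(parent1)
    # Pick two distinct cut points and swap the segment between them
    point1, point2 = sorted(np.random.choice(range(1, chrom_length), size=2, replace=False))
    child1 = np.concatenate((parent1[:point1], parent2[point1:point2], parent1[point2:]))
    child2 = np.concatenate((parent2[:point1], parent1[point1:point2], parent2[point2:]))
    return child1, child2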

2.4 Mutation

Mutation randomly alters an individual's chromosome to produce a new individual. In genetic algorithms, mutation is usually implemented as bit flipping or random resetting.

Here is an example using bit-flip mutation:

def bit_flip_mutation(individual, mutation_rate):
    chrom_length = len(individual)
    for i in range(chrom_length):
        # Flip each bit independently with probability mutation_rate
        if np.random.rand() < mutation_rate:
            individual[i] = 1 - individual[i]
    return individual

In this example, we walk through every gene of the individual and flip the gene whenever a random number falls below the mutation rate.
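
Since the chromosome is a NumPy array, the same mutation can also be written without an explicit loop; the following equivalent sketch, added here for illustration, uses a random mask:

def bit_flip_mutation_vectorized(individual, mutation_rate):
    # Draw one random number per gene and flip the genes where it falls below the rate
    mask = np.random.rand(len(individual)) < mutation_rate
    individual[mask] = 1 - individual[mask]
    return individual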

3. Examples

3.1 Minimizing a function with a genetic algorithm

Here is an example that uses a genetic algorithm to search for the minimum of a function:

import numpy as np

def fitness_func(x):
    # Objective function to minimize: f(x) = sin(10*pi*x) / (2*x) + (x - 1)^4
    return np.sin(10 * np.pi * x) / (2 * x) + (x - 1) ** 4

def decode(individual, x_min=0.5, x_max=2.5):
    # Map a binary chromosome to a real number in [x_min, x_max]
    # (the interval [0.5, 2.5] is an assumed search range; it also keeps x away from 0)
    weights = 2 ** np.arange(len(individual) - 1, -1, -1)
    value = np.dot(individual, weights)
    return x_min + value * (x_max - x_min) / (2 ** len(individual) - 1)

def init_population(pop_size, chrom_length):
    population = np.random.randint(2, size=(pop_size, chrom_length))
    return population

def roulette_selection(population, fitness):
    fitness_sum = np.sum(fitness)
    fitness_ratio = fitness / fitness_sum
    cum_sum = np.cumsum(fitness_ratio)
    selected_index = []
    for i in range(len(population)):
        r = np.random.rand()
        for j in range(len(cum_sum)):
            if r < cum_sum[j]:
                selected_index.append(j)
                break
    selected_population = population[selected_index]
    return selected_population

def single_point_crossover(parent1, parent2):
    chrom_length = len(parent1)
    crossover_point = np.random.randint(1, chrom_length)
    child1 = np.concatenate((parent1[:crossover_point], parent2[crossover_point:]))
    child2 = np.concatenate((parent2[:crossover_point], parent1[crossover_point:]))
    return child1, child2

def bit_flip_mutation(individual, mutation_rate):
    chrom_length = len(individual)
    for i in range(chrom_length):
        if np.random.rand() < mutation_rate:
            individual[i] = 1 - individual[i]
    return individual

def genetic_algorithm(pop_size, chrom_length, max_iter, mutation_rate):
    population = init_population(pop_size, chrom_length)
    for i in range(max_iter):
        # Decode each chromosome and evaluate the objective function
        objective = np.array([fitness_func(decode(ind)) for ind in population])
        # Roulette-wheel selection needs non-negative values and favors larger ones,
        # so convert the minimization objective into a selection fitness
        fitness = objective.max() - objective + 1e-6
        selected_population = roulette_selection(population, fitness)
        new_population = []
        for j in range(pop_size // 2):
            parent1 = selected_population[j * 2]
            parent2 = selected_population[j * 2 + 1]
            child1, child2 = single_point_crossover(parent1, parent2)
            child1 = bit_flip_mutation(child1, mutation_rate)
            child2 = bit_flip_mutation(child2, mutation_rate)
            new_population.append(child1)
            new_population.append(child2)
        population = np.array(new_population)
    objective = np.array([fitness_func(decode(ind)) for ind in population])
    best_index = np.argmin(objective)
    best_individual = population[best_index]
    best_fitness = objective[best_index]
    return best_individual, best_fitness

best_individual, best_fitness = genetic_algorithm(pop_size=100, chrom_length=20, max_iter=100, mutation_rate=0.01)
print("Best individual:", best_individual)
print("Best x:", decode(best_individual))
print("Best fitness:", best_fitness)

In this example, fitness_func(x) is the objective function we want to minimize, and decode() maps a binary chromosome to a real number in the search interval (here assumed to be [0.5, 2.5], which also keeps x away from the division by zero at x = 0). We initialize the population with init_population(), perform selection with roulette_selection(), crossover with single_point_crossover(), and mutation with bit_flip_mutation(). Because roulette-wheel selection favors larger, non-negative values while we want to minimize the objective, genetic_algorithm() converts the objective values into selection fitness before selecting, and it finally returns the individual with the smallest objective value.
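
As a side note on the encoding, the precision of this binary representation is determined by the chromosome length. The short calculation below, using the interval assumed above, shows the smallest step a 20-bit chromosome can represent:

x_min, x_max, chrom_length = 0.5, 2.5, 20
resolution = (x_max - x_min) / (2 ** chrom_length - 1)
print(resolution)  # about 1.9e-06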

3.2 Solving the TSP with a genetic algorithm

Here is an example that uses a genetic algorithm to solve a traveling salesman problem (TSP):

import numpy as np
import matplotlib.pyplot as plt

def init_population(pop_size, city_num):
    population = np.zeros((pop_size, city_num), dtype=int)
    for i in range(pop_size):
        population[i] = np.random.permutation(city_num)
    return population

def calc_distance(city1, city2):
    return np.sqrt(np.sum((city1 - city2) ** 2))

def calc_fitness(individual, cities):
    fitness = 0
    for i in range(len(individual) - 1):
        city1 = cities[individual[i]]
        city2 = cities[individual[i + 1]]
        fitness += calc_distance(city1, city2)
    city1 = cities[individual[-1]]
    city2 = cities[individual[0]]
    fitness += calc_distance(city1, city2)
    return 1 / fitness

def roulette_selection(population, fitness):
    fitness_sum = np.sum(fitness)
    fitness_ratio = fitness / fitness_sum
    cum_sum = np.cumsum(fitness_ratio)
    selected_index = []
    for i in range(len(population)):
        r = np.random.rand()
        for j in range(len(cum_sum)):
            if r < cum_sum[j]:
                selected_index.append(j)
                break
    selected_population = population[selected_index]
    return selected_population

def order_crossover(parent1, parent2):
    # Order crossover (OX): copy a slice from one parent, then fill the remaining
    # positions, in order, with the missing cities taken from the other parent
    chrom_length = len(parent1)
    crossover_point1 = np.random.randint(1, chrom_length - 1)
    crossover_point2 = np.random.randint(crossover_point1 + 1, chrom_length)

    def make_child(p1, p2):
        child = np.full(chrom_length, -1, dtype=int)
        child[crossover_point1:crossover_point2] = p1[crossover_point1:crossover_point2]
        used = set(child[crossover_point1:crossover_point2])
        fill_cities = [city for city in p2 if city not in used]
        fill_positions = [k for k in range(chrom_length) if child[k] == -1]
        for pos, city in zip(fill_positions, fill_cities):
            child[pos] = city
        return child

    child1 = make_child(parent1, parent2)
    child2 = make_child(parent2, parent1)
    return child1, child2

def swap_mutation(individual, mutation_rate):
    chrom_length = len(individual)
    for i in range(chrom_length):
        if np.random.rand() < mutation_rate:
            j = np.random.randint(chrom_length)
            individual[i], individual[j] = individual[j], individual[i]
    return individual

def genetic_algorithm(pop_size, city_num, max_iter, mutation_rate, cities):
    population = init_population(pop_size, city_num)
    best_fitness_list = []
    for i in range(max_iter):
        fitness = np.array([calc_fitness(individual, cities) for individual in population])
        best_fitness = np.max(fitness)
        best_fitness_list.append(best_fitness)
        selected_population = roulette_selection(population, fitness)
        new_population = []
        for j in range(pop_size // 2):
            parent1 = selected_population[j * 2]
            parent2 = selected_population[j * 2 + 1]
            child1, child2 = order_crossover(parent1, parent2)
            child1 = swap_mutation(child1, mutation_rate)
            child2 = swap_mutation(child2, mutation_rate)
            new_population.append(child1)
            new_population.append(child2)
        population = np.array(new_population)
    fitness = np.array([calc_fitness(individual, cities) for individual in population])
    best_index = np.argmax(fitness)
    best_individual = population[best_index]
    best_fitness = fitness[best_index]
    return best_individual, best_fitness, best_fitness_list

def plot_tsp(best_individual, cities):
    plt.plot(cities[:, 0], cities[:, 1], 'o')
    for i in range(len(best_individual) - 1):
        city1 = cities[best_individual[i]]
        city2 = cities[best_individual[i + 1]]
        plt.plot([city1[0], city2[0]], [city1[1], city2[1]], 'k-')
    city1 = cities[best_individual[-1]]
    city2 = cities[best_individual[0]]
    plt.plot([city1[0], city2[0]], [city1[1], city2[1]], 'k-')
    plt.show()

cities = np.random.rand(20, 2)
best_individual, best_fitness, best_fitness_list = genetic_algorithm(pop_size=100, city_num=20, max_iter=1000, mutation_rate=0.01, cities=cities)
print("Best individual:", best_individual)
print("Best fitness:", best_fitness)
plot_tsp(best_individual, cities)

In this example, we first define calc_distance(city1, city2) to compute the distance between two cities, and calc_fitness() scores an individual as the reciprocal of its total tour length, so shorter tours receive higher fitness. We initialize the population with init_population(), perform selection with roulette_selection(), crossover with order_crossover(), which keeps each child a valid permutation of the cities, and mutation with swap_mutation(). Finally, genetic_algorithm() runs the search and plot_tsp() draws the best tour found.
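
genetic_algorithm() also returns best_fitness_list, which the example above does not use. If you want to inspect convergence, a small sketch like the following (added here for illustration) plots the best tour length per generation, using the fact that the stored fitness is the reciprocal of the tour length:

best_length_list = [1 / f for f in best_fitness_list]
plt.plot(best_length_list)
plt.xlabel("Generation")
plt.ylabel("Best tour length")
plt.show()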

4. Notes

Python offers implementations of many optimization algorithms, including genetic algorithms, particle swarm optimization, and ant colony optimization. These algorithms can help us optimize a problem and thereby support data analysis and prediction. When using them, we need to pick an algorithm that suits the specific problem and tune its parameters according to the characteristics of the model and the dataset.
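
As a simple illustration of such tuning, the sketch below reuses the genetic_algorithm() from Section 3.1 to compare a few mutation rates; the specific rates are arbitrary choices for demonstration:

for rate in [0.001, 0.01, 0.05, 0.1]:
    _, best = genetic_algorithm(pop_size=100, chrom_length=20, max_iter=100, mutation_rate=rate)
    print(f"mutation_rate={rate}: best objective value {best:.6f}")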