simulated annealing code vs genetic algorithm code for minimize fitness function in matlab

Here are examples of how you can implement the Simulated Annealing algorithm and a Genetic Algorithm in MATLAB to minimize a fitness function:

Simulated Annealing:

main.m
% Define the fitness function
fitnessFunction = @(x) x.^2; % Insert your own fitness function (elementwise, so it also works on vectors)

% Define the initial state (solution)
initialState = rand(); % You can change the initial state as per your requirement

% Define the temperature parameters
initialTemperature = 100; % You can change the initial temperature as per your requirement
coolingRate = 0.95; % You can change the cooling rate as per your requirement
finalTemperature = 0.1; % You can change the final temperature as per your requirement

% Initialize the current state and current fitness
currentState = initialState;
currentFitness = fitnessFunction(currentState);

% Start the simulated annealing algorithm
temperature = initialTemperature;
while temperature > finalTemperature
    % Generate a random neighboring state
    newState = currentState + randn();
    
    % Calculate the fitness of the new state
    newFitness = fitnessFunction(newState);
    
    % Calculate the acceptance probability
    acceptanceProbability = exp((currentFitness - newFitness) / temperature);
    
    % Decide whether to accept the new state or not
    if newFitness < currentFitness || rand() < acceptanceProbability
        currentState = newState;
        currentFitness = newFitness;
    end
    
    % Cool down the temperature
    temperature = temperature * coolingRate;
end

% Display the final result
disp('Simulated Annealing Results:');
disp(['Best Solution: ' num2str(currentState)]);
disp(['Best Fitness: ' num2str(currentFitness)]);
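If you have the Global Optimization Toolbox, you can also use MATLAB's built-in `simulannealbnd` function instead of the hand-written loop. A minimal sketch (the bounds of -10 and 10 are placeholder values; adjust them for your problem):

```matlab
% Requires the Global Optimization Toolbox
fitnessFunction = @(x) x.^2; % Insert your own fitness function
x0 = rand();                 % Initial point
lb = -10; ub = 10;           % Placeholder bounds; change as per your requirement

[bestSolution, bestFitness] = simulannealbnd(fitnessFunction, x0, lb, ub);
disp(['Best Solution: ' num2str(bestSolution)]);
disp(['Best Fitness: ' num2str(bestFitness)]);
```
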

Genetic Algorithm:

main.m
% Define the fitness function
fitnessFunction = @(x) x.^2; % Insert your own fitness function (use elementwise operators, since it is applied to the whole population vector)

% Define the optimization problem parameters
populationSize = 50; % You can change the population size as per your requirement
numberOfGenerations = 100; % You can change the number of generations as per your requirement

% Initialize the population
population = rand(populationSize, 1); % You can change the initialization method as per your requirement

% Start the genetic algorithm
for generation = 1:numberOfGenerations
    % Calculate the fitness of each individual in the population
    fitness = fitnessFunction(population);
    
    % Perform selection, crossover, and mutation to create the next generation
    offspring = selection(population, fitness);
    offspring = crossover(offspring);
    offspring = mutation(offspring);
    
    % Replace the current population with the offspring
    population = offspring;
end

% Find the best solution in the final population
fitness = fitnessFunction(population);
[bestFitness, bestIndex] = min(fitness);
bestSolution = population(bestIndex);

% Display the final result
disp('Genetic Algorithm Results:');
disp(['Best Solution: ' num2str(bestSolution)]);
disp(['Best Fitness: ' num2str(bestFitness)]);

% Helper functions for genetic algorithm (selection, crossover, and mutation)
function offspring = selection(population, fitness)
    % Roulette-wheel selection for a MINIMIZATION problem: lower fitness must
    % receive a larger selection weight, so invert the fitness values first.
    weights = 1 ./ (fitness - min(fitness) + eps);
    rouletteWheel = cumsum(weights / sum(weights));
    
    % Elitism: always carry the current best individual into the next generation
    [~, eliteIndex] = min(fitness);
    offspring = zeros(size(population));
    offspring(1) = population(eliteIndex);
    for i = 2:length(population)
        selectedParent = find(rouletteWheel >= rand(), 1);
        offspring(i) = population(selectedParent);
    end
end

function offspring = crossover(population)
    offspring = zeros(size(population));
    for i = 1:length(population)
        parent1 = population(i);
        parent2 = population(randi(length(population)));
        offspring(i) = (parent1 + parent2) / 2; % You can change the crossover method as per your requirement
    end
end

function offspring = mutation(population)
    % Mutate only a fraction of individuals, with a small step size; mutating
    % every individual at full strength prevents the population from converging.
    mutationRate = 0.1; % You can change the mutation rate as per your requirement
    offspring = population + 0.1 * randn(size(population)) .* (rand(size(population)) < mutationRate);
end
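The Global Optimization Toolbox likewise provides a built-in `ga` function that handles selection, crossover, and mutation internally. A minimal sketch mirroring the parameters used above:

```matlab
% Requires the Global Optimization Toolbox
fitnessFunction = @(x) x.^2; % Insert your own fitness function
nvars = 1;                   % Number of decision variables

options = optimoptions('ga', ...
    'PopulationSize', 50, ...
    'MaxGenerations', 100);

[bestSolution, bestFitness] = ga(fitnessFunction, nvars, ...
    [], [], [], [], [], [], [], options);
disp(['Best Solution: ' num2str(bestSolution)]);
disp(['Best Fitness: ' num2str(bestFitness)]);
```
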

Please note that you will need to modify the fitness function, initialization method, and crossover/mutation methods to suit your specific optimization problem. Additionally, there are several other parameters and techniques that can be adjusted to improve the performance of both algorithms.

gistlib by LogSnag