Khaled Ali
updated on 12 Dec 2019
Introduction:
The genetic algorithm is inspired by the process that drives biological evolution. Evolution starts with the selection of the fittest individuals from a population. These individuals produce offspring that inherit the characteristics of their parents and are added to the next generation. If the parents have better fitness, their offspring tend to be better than the parents and have a better chance of surviving. This process keeps iterating and, at the end, a generation with the fittest individuals is found. This is the main working principle of the algorithm: at each step, the genetic algorithm randomly selects individuals from the current population to be parents and uses them to produce the children for the next generation. Over successive generations, the population evolves toward an optimal solution. The genetic algorithm can be applied to a variety of optimization problems that are not well suited to standard optimization algorithms, including problems in which the objective function is discontinuous, non-differentiable, stochastic, or highly nonlinear.
The genetic algorithm uses three main types of rules at each step to create the next generation from the current population:
Selection rules select the individuals, called parents, that contribute to the population of the next generation.
Crossover rules combine two parents to form children for the next generation.
Mutation rules apply random changes to individual parents to form children.
A simplified sketch of one such generational update is given below.
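In the illustrative MATLAB sketch that follows, the population matrix pop (one individual per row), the vector scores and the function handle fitness are hypothetical names, and the selection, crossover and mutation shown are deliberately basic; the toolbox's ga solver applies these rules in a more sophisticated way.
scores = zeros(size(pop,1), 1);
for k = 1:size(pop,1)
    scores(k) = fitness(pop(k,:));                 % score every individual with the fitness function
end
[~, order] = sort(scores);                         % lower score = fitter (minimization convention)
parents = pop(order(1:round(end/2)), :);           % selection: keep the fitter half as parents
children = zeros(size(pop));
for k = 1:size(pop,1)
    p = parents(randi(size(parents,1), 1, 2), :);  % pick two parents at random
    cut = randi(size(pop,2));                      % crossover point
    child = [p(1,1:cut), p(2,cut+1:end)];          % crossover: splice the two parents together
    child = child + 0.01*randn(size(child));       % mutation: small random perturbation
    children(k,:) = child;
end
pop = children;                                    % the children become the next generation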
The genetic algorithm differs from a classical, derivative-based optimization algorithm in two main ways. A classical algorithm generates a single point at each iteration and selects the next point by a deterministic computation, so the sequence of points approaches an optimal solution. The genetic algorithm instead generates a whole population of points at each iteration and selects the next population by computations that use random number generators, with the best point in the population approaching an optimal solution.
Algorithm working sequence:
The algorithm then creates a sequence of new populations. At each step, it uses the individuals in the current generation to create the next population. To create the new population, it uses the fitness function, which is the function you want to optimize. For standard optimization algorithms, this is known as the objective function. The toolbox software tries to find the minimum of the fitness function.
The algorithm performs the following steps:
1. It begins by creating a random initial population.
2. It scores each member of the current population by computing its fitness value.
3. It selects members, called parents, based on their fitness.
4. It produces children from the parents: some individuals with the best fitness are passed to the next population unchanged as elite children, while the rest are produced either by crossover (combining a pair of parents) or by mutation (applying random changes to a single parent).
5. It replaces the current population with the children to form the next generation.
6. It stops when a stopping criterion, such as a maximum number of generations or a tolerance on the fitness value, is met.
A minimal example of calling the solver on a simple fitness function is sketched below.
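As a minimal usage sketch (assuming the Global Optimization Toolbox is installed), the call below asks ga to minimize a simple two-variable quadratic whose true minimum lies at (1, -2); the function handle and variable names here are purely illustrative.
quad_fitness = @(v) (v(1) - 1)^2 + (v(2) + 2)^2;   % illustrative fitness (objective) function
[best_point, best_value] = ga(quad_fitness, 2);    % ga searches for the minimum of the fitness function
fprintf('ga found (%.3f, %.3f) with value %.4f\n', best_point(1), best_point(2), best_value)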
Fig: Mechanism for producing children.
Fig: Final answer developing over the iterations.
Input function:
f(x, y) = f1,x · f2,x · f1,y · f2,y
where
f1,x = [sin(5.1·π·x + 0.5)]^6
f1,y = [sin(5.1·π·y + 0.5)]^6
f2,x = exp[-4·ln(2)·(x - 0.0667)^2 / 0.64]
f2,y = exp[-4·ln(2)·(y - 0.0667)^2 / 0.64]
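The same definition can be written as MATLAB anonymous functions as a quick cross-check against the code below; this snippet is only illustrative, and the project itself uses the stalagmite.m function shown later.
f1 = @(t) (sin(5.1*pi*t + 0.5)).^6;               % sine envelope
f2 = @(t) exp(-4*log(2)*((t - 0.0667).^2)/0.64);  % Gaussian envelope
stalag = @(x, y) f1(x).*f2(x).*f1(y).*f2(y);      % full stalagmite function
stalag(0.0667, 0.0667)                            % close to 1: both Gaussian envelopes equal 1 at this point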
Main code:
clear all
close all
clc
x = linspace(0,0.6,150);
y = linspace(0,0.6,150);
num_cases=50;
% Creating a 2D Mesh
[xx yy] = meshgrid(x,y);
% Evaluating Stalagmite Function
for i = 1:size(xx,1)
for j = 1:size(xx,2)
input_value(1) = xx(i,j);
input_value(2) = yy(i,j);
f(i,j) = stalagmite(input_value);
end
end
% Study 1: unbounded statistical behaviour (no bounds on the design variables)
tic
for i = 1:num_cases
[inputs1,fopt1(i)]=ga(@stalagmite,2);
xopt(i)=inputs1(1);
yopt(i)=inputs1(2);
end
study1_time=toc;
figure(1)
subplot(2,1,1)
hold on
surfc(xx,yy,-f)
xlabel('x-value')
ylabel('y-value')
zlabel('function value')
shading interp
plot3(xopt,yopt,-fopt1,'marker','o','markersize',5,'markerfacecolor','r')
title('Unbounded Statistical Behaviour')
subplot(2,1,2)
plot(-fopt1)
xlabel('iterations')
ylabel('function maximum')
% Study 2: statistical behaviour with lower and upper bounds on x and y
tic;
for i = 1:num_cases
[inputs2,fopt2(i)]=ga(@stalagmite,2,[],[],[],[],[0;0],[1;1]);
xopt(i)=inputs2(1);
yopt(i)=inputs2(2);
end
study2_time=toc;
figure(2)
subplot(2,1,1)
hold on
surfc(xx,yy,-f)
xlabel('x-value')
ylabel('y-value')
zlabel('function value')
shading interp
plot3(xopt,yopt,-fopt2,'marker','o','markersize',5,'markerfacecolor','r')
title('Bounded Statistical Behaviour')
subplot(2,1,2)
plot(-fopt2)
xlabel('iterations')
ylabel('function maximum')
% Study 3: increasing the GA population size
options = optimoptions('ga');
options = optimoptions(options,'PopulationSize',170);
tic
for i = 1:num_cases
[inputs3,fopt3(i)]=ga(@stalagmite,2,[],[],[],[],[0;0],[1;1],[],[],options);
xopt(i)=inputs3(1);
yopt(i)=inputs3(2);
end
study3_time=toc;
figure(3)
subplot(2,1,1)
hold on
surfc(xx,yy,-f)
xlabel('x-value')
ylabel('y-value')
zlabel('function value')
shading interp
plot3(xopt,yopt,-fopt3,'marker','o','markersize',5,'markerfacecolor','r')
title('Bounded Statistical Behaviour with Increased Population Size')
subplot(2,1,2)
plot(-fopt3)
xlabel('iterations')
ylabel('function maximum')
Total_Time = study1_time + study2_time + study3_time;
fprintf('\n The results are below:')
fprintf('\n\n Study 1: Unbounded statistical behaviour')
fprintf('\n\n Time taken for study 1: %f s', study1_time)
fprintf('\n x and y coordinates of the maximum found in the final run: %f, %f', inputs1(1), inputs1(2))
fprintf('\n\n Estimate of the global maximum (mean of the best values over all runs): %f', -mean(fopt1))
fprintf('\n\n Study 2: Bounded statistical behaviour')
fprintf('\n\n Time taken for study 2: %f s', study2_time)
fprintf('\n x and y coordinates of the maximum found in the final run: %f, %f', inputs2(1), inputs2(2))
fprintf('\n\n Estimate of the global maximum (mean of the best values over all runs): %f', -mean(fopt2))
fprintf('\n\n Study 3: Bounded statistical behaviour with increased population size')
fprintf('\n\n Time taken for study 3: %f s', study3_time)
fprintf('\n x and y coordinates of the maximum found in the final run: %f, %f', inputs3(1), inputs3(2))
fprintf('\n\n Estimate of the global maximum (mean of the best values over all runs): %f', -mean(fopt3))
fprintf('\n\n Total time taken to complete all three studies: %f s\n', Total_Time)
Main code explanation:
Study 1 uses the basic two-argument form of ga:
[inputs,fopt] = ga(fun,nvars)
inputs - the values of the design variables x and y at which ga finds the smallest value (fopt) of the fitness function
fun - the function to be minimized, in this case the stalagmite function
nvars - the number of design variables, in this case 2 (x and y)
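As an illustration, a single unbounded run of this call could look like the sketch below (assuming stalagmite.m is on the MATLAB path); because the function file returns the negated value, -fopt recovers the maximum that ga found.
[inputs, fopt] = ga(@stalagmite, 2);   % unconstrained run with two design variables
max_found = -fopt;                     % undo the negation to get the function maximum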
Study 2 adds lower and upper bounds on the design variables:
[inputs,fopt] = ga(fun,nvars,A,b,Aeq,beq,lb,ub)
inputs, fopt, fun and nvars are the same as in Study 1
A - linear inequality constraint matrix (real matrix); [] if there are none
b - linear inequality constraint vector (real vector); [] if there are none
Aeq - linear equality constraint matrix (real matrix); [] if there are none
beq - linear equality constraint vector (real vector); [] if there are none
lb - lower bounds, specified as a real vector or array
ub - upper bounds, specified as a real vector or array
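A minimal bounded call matching Study 2 might look like the sketch below; row vectors for the bounds work just as well as the column vectors used in the main code.
lb = [0 0];   % lower bounds on x and y
ub = [1 1];   % upper bounds on x and y
[inputs, fopt] = ga(@stalagmite, 2, [], [], [], [], lb, ub);   % no linear or equality constraints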
Study 3 additionally passes solver options:
[inputs,fopt] = ga(fun,nvars,A,b,Aeq,beq,lb,ub,nonlcon,IntCon,options)
inputs, fopt, fun, nvars, A, b, Aeq, beq, lb and ub are the same as in Studies 1 and 2
nonlcon - nonlinear constraints, specified as a function handle or function name; [] if there are none
IntCon - integer variables, specified as a vector of positive integers; [] if there are none
options - optimization options, specified as the output of optimoptions or as a structure
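A sketch of the Study 3 setup with options is shown below. 'PopulationSize' is the option actually varied in this project; 'MaxGenerations' is included only as an example of passing a second option (older MATLAB releases name it 'Generations').
opts = optimoptions('ga', 'PopulationSize', 170, 'MaxGenerations', 200);
[inputs, fopt] = ga(@stalagmite, 2, [], [], [], [], [0 0], [1 1], [], [], opts);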
Stalagmite function code:
function [f] = stalagmite(input_value)
% Evaluates the stalagmite function at the point (x, y) = (input_value(1), input_value(2)).
f_1_x = (sin(5.1*pi*input_value(1) + 0.5))^6;
f_1_y = (sin(5.1*pi*input_value(2) + 0.5))^6;
f_2_x = exp(-4*log(2)*(((input_value(1) - 0.0667)^2)/(0.64)));
f_2_y = exp(-4*log(2)*(((input_value(2) - 0.0667)^2)/(0.64)));
% The product is negated because ga is a minimizer: minimizing -f is equivalent
% to maximizing the original stalagmite function.
f = -(f_1_x*f_1_y*f_2_x*f_2_y);
end
Stalagmite function code explanation:
The function receives a two-element vector input_value containing the x and y coordinates, evaluates the four factors defined in the input function above, and returns their product with a negative sign. The negation is needed because ga is a minimizer, so finding the minimum of -f(x, y) is the same as finding the maximum of the stalagmite function.
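As a quick, illustrative sanity check of this sign convention: near (0.0667, 0.0667) both Gaussian envelopes equal 1, so the function file should return a value close to -1, the smallest value ga can reach.
stalagmite([0.0667 0.0667])   % approximately -1 at the peak of the original function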
Errors during programming:
An error occurred during development because a pair of square brackets [] was missing; the code was then corrected by adding the missing brackets.
OUTPUT:
Unbounded statistical behaviour plot
Bounded statistical behaviour plot
Modified population size plot
Stalagmite function output printed in the command window
References:
https://www.mathworks.com/help/gads/how-the-genetic-algorithm-works.html
https://www.mathworks.com/help/gads/ga.html