Genetic Algorithms - Quick Guide
Genetic Algorithms - Introduction
Genetic Algorithm (GA) is a search-based optimization technique based on the principles of Genetics and Natural Selection. It is frequently used to find optimal or near-optimal solutions to difficult problems which otherwise would take a lifetime to solve. It is widely used to solve optimization problems, in research, and in machine learning.
Introduction to Optimization
Optimization is the process of making something better. In any process, we have a set of inputs and a set of outputs as shown in the following figure.
Optimization refers to finding the values of the inputs in such a way that we get the "best" output values. The definition of "best" varies from problem to problem, but in mathematical terms, it refers to maximizing or minimizing one or more objective functions by varying the input parameters.
The set of all possible solutions or values which the inputs can take make up the search space. In this search space lies a point or a set of points which gives the optimal solution. The aim of optimization is to find that point or set of points in the search space.
What are genetic algorithms?
Nature has always been a great source of inspiration for all mankind. Genetic Algorithms (GAs) are search-based algorithms based on the concepts of natural selection and genetics. GAs are a subset of a much larger branch of computation known as Evolutionary Computation.
GAs were developed by John Holland and his students and colleagues at the University of Michigan, most notably David E. Goldberg, and have since been tried on various optimization problems with a high degree of success.
In GAs, we have a pool or a population of possible solutions to the given problem. These solutions then undergo recombination and mutation (like in natural genetics), producing new children, and the process is repeated over various generations. Each individual (or candidate solution) is assigned a fitness value (based on its objective function value) and the fitter individuals are given a higher chance to mate and yield more "fit" individuals. This is in line with Darwin's theory of "Survival of the Fittest".
In this way, we keep "evolving" better individuals or solutions over generations, till we reach a stopping criterion.
Genetic algorithms are random in nature, but they perform much better than random local search (in which we just try various random solutions, keeping track of the best so far), as they also exploit historical information.
Advantages of GAs
GAs have various advantages which have made them immensely popular. These include the following:

Does not require any derivative information (which may not be available for many real-world problems).

Is faster and more efficient as compared to the traditional methods.

Has very good parallel capabilities.

Optimizes both continuous and discrete functions, as well as multi-objective problems.

Provides a list of "good" solutions and not just a single solution.

Always gets an answer to the problem, which gets better over time.

Useful when the search space is very large and there are a large number of parameters involved.
Limitations of GAs
Like any technique, GAs also suffer from a few limitations. These include:

GAs are not suited for all problems, especially problems which are simple and for which derivative information is available.

The fitness value is calculated repeatedly, which might be computationally expensive for some problems.

Being stochastic, there are no guarantees on the optimality or the quality of the solution.

If not implemented properly, the GA may not converge to the optimal solution.
GA - Motivation
Genetic algorithms have the ability to deliver a "good-enough" solution "fast-enough". This makes genetic algorithms attractive for use in solving optimization problems. The reasons why GAs are needed are as follows:
Solving difficult problems
In computer science, there is a large set of problems which are NP-Hard. What this essentially means is that even the most powerful computing systems take a very long time (even years!) to solve such problems. In such a scenario, GAs prove to be an efficient tool to provide usable near-optimal solutions in a short amount of time.
Failure of gradient-based methods
Traditional calculus-based methods work by starting at a random point and moving in the direction of the gradient, till we reach the top of the hill. This technique is efficient and works very well for single-peaked objective functions, like the cost function in linear regression. However, in most real-world situations, we have a very complex problem called a landscape, which is made of many peaks and many valleys, causing such methods to fail, as they suffer from an inherent tendency of getting stuck at the local optima, as shown in the following figure.
Get a good solution quickly
Some difficult problems like the Travelling Salesman Problem (TSP) have real-world applications like path finding and VLSI design. Now imagine that you are using your GPS navigation system, and it takes a few minutes (or even a few hours) to compute the "optimal" path from source to destination. Delay in such real-world applications is not acceptable, and therefore a "good-enough" solution, which is delivered "fast", is what is required.
Genetic Algorithms - Basics
This section introduces the basic terminology required to understand GAs. Also, a generic structure of a GA is presented in both pseudo-code and graphical forms. The reader is advised to properly understand all the concepts introduced in this section and keep them in mind while reading other sections of this tutorial.
Basic terminology
Before starting a discussion of Genetic Algorithms, it is essential to familiarize yourself with the basic terminology that will be used throughout this tutorial.

Population - It is a subset of all the possible (encoded) solutions to the given problem. The population for a GA is analogous to the population for human beings, except that instead of human beings, we have candidate solutions representing human beings.

Chromosomes - A chromosome is one such solution to the given problem.

Gene - A gene is one element position of a chromosome.

Allele  This is the value that a gene takes for a particular chromosome.

Genotype - The genotype is the population in the computation space. In the computation space, the solutions are represented in a way which can be easily understood and manipulated using a computing system.

Phenotype - The phenotype is the population in the actual real-world solution space in which solutions are represented in a way they are represented in real-world situations.

Decoding and encoding - For simple problems, the phenotype and genotype spaces are the same. However, in most of the cases, the phenotype and genotype spaces are different. Decoding is a process of transforming a solution from the genotype to the phenotype space, while encoding is a process of transforming from the phenotype to the genotype space. Decoding should be fast, as it is carried out repeatedly in a GA during the fitness value calculation.
For example, consider the 0/1 Knapsack Problem. The phenotype space consists of solutions which just contain the item numbers of the items to be picked.
However, in the genotype space it can be represented as a binary string of length n (where n is the number of items). A 1 at position x represents that the x-th item is picked, while a 0 represents the opposite. This is a case where the genotype and phenotype spaces are different.

Fitness Function - A fitness function simply defined is a function which takes the solution as input and produces the suitability of the solution as the output. In some cases, the fitness function and the objective function may be the same, while in others it might be different based on the problem.

Genetic Operators - These alter the genetic composition of the offspring. These include crossover, mutation, selection, etc.
Basic structure
The basic structure of a GA is as follows:
We start with an initial population (which may be generated at random or seeded by other heuristics), and select parents from this population for mating. Crossover and mutation operators are applied on the parents to generate new offspring. Finally, these offspring replace the existing individuals in the population and the process repeats. In this way, genetic algorithms actually try to mimic human evolution to some extent.
Each of the following steps is covered in a separate chapter later in this tutorial.
A generalized pseudo-code for a GA is explained in the following program:
GA()
   initialize population
   find fitness of population
   while (termination criteria not reached) do
      parent selection
      crossover with probability pc
      mutation with probability pm
      decode and fitness calculation
      survivor selection
      find best
   return best
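The pseudo-code above can be sketched as a minimal, runnable Python program. Note that this is only an illustration, not a standard implementation: the one-max fitness (counting 1s in the string), binary tournament parent selection, one-point crossover, bit-flip mutation, and every parameter value below are assumptions chosen for the example.

```python
import random

def run_ga(fitness, n_bits=20, pop_size=30, pc=0.9, pm=0.02,
           generations=50, seed=0):
    """Minimal generational GA for bit-string chromosomes."""
    rng = random.Random(seed)
    # Initialize population with random bit strings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # Parent selection: binary tournament.
        def pick():
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = pick(), pick()
            # One-point crossover with probability pc.
            if rng.random() < pc:
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # Bit-flip mutation with probability pm per gene.
            for c in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < pm:
                        c[i] = 1 - c[i]
            children.extend([c1, c2])
        # Survivor selection: full generational replacement.
        pop = children[:pop_size]
        gen_best = max(pop, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best[:]
    return best

# One-max problem: maximize the number of 1s in the string.
best = run_ga(sum)
```

Each of the steps hard-coded here (selection, crossover, mutation, survivor selection) is treated in detail in the later chapters, where other operator choices are discussed.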
Representation of the genotype
One of the most important decisions to make while implementing a genetic algorithm is deciding the representation that we will use to represent our solutions. It has been observed that improper representation can lead to poor performance of the GA.
Therefore, choosing a correct representation, having a correct definition of the mappings between the phenotype and genotype spaces is essential for the success of a GA.
In this section, we present some of the most commonly used representations for genetic algorithms. However, representation is highly problem-specific and the reader might find that another representation or a mix of the representations mentioned here might suit their problem better.
Binary representation
This is one of the simplest and most widely used representations in GAs. In this type of representation, the genotype consists of bit strings.
For some problems, when the solution space consists of Boolean decision variables - yes or no - the binary representation is natural. Take, for example, the 0/1 Knapsack Problem. If there are n items, we can represent a solution by a binary string of n elements, where the x-th element tells whether the item x is picked (1) or not (0).
For other problems, specifically those dealing with numbers, we can represent the numbers with their binary representation. The problem with this kind of encoding is that different bits have different significance, and therefore the mutation and crossover operators can have undesired consequences. This can be resolved to some extent by using Gray coding, as a change in one bit does not have a massive effect on the solution.
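As an illustration of the Gray coding mentioned above, the following sketch converts between standard binary and Gray code; the key property is that adjacent integers differ in exactly one bit in the Gray encoding:

```python
def binary_to_gray(n):
    """Convert a non-negative integer to its Gray-code equivalent."""
    return n ^ (n >> 1)

def gray_to_binary(g):
    """Invert the Gray coding by cumulative XOR of the shifted value."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Adjacent integers (e.g. 7 and 8) differ in exactly one bit under Gray coding,
# even though in plain binary 7 (0111) and 8 (1000) differ in all four bits.
one_bit = bin(binary_to_gray(7) ^ binary_to_gray(8)).count("1")
```

This is why a single mutation on a Gray-coded chromosome moves the decoded value only slightly, unlike a flip of a high-order bit in plain binary.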
Real valued representation
For problems where we want to define the genes using continuous rather than discrete variables, the real valued representation is the most natural. The precision of these real valued or floating point numbers is, however, limited to the computer.
Integer representation
For discrete valued genes, we cannot always limit the solution space to binary 'yes' or 'no'. For example, if we want to encode the four directions - North, South, East and West - we can encode them as {0,1,2,3}. In such cases, an integer representation is desirable.
Representation by permutation
In many problems, the solution is represented by an order of elements. In such cases, the permutation representation is most appropriate.
A classic example of this representation is the Travelling Salesman Problem (TSP). In this, the salesman has to take a tour of all the cities, visiting each city exactly once, and come back to the starting city. The total distance of the tour has to be minimized. The solution to this TSP is naturally an ordering or permutation of all the cities, and therefore using a permutation representation makes sense for this problem.
Genetic Algorithms - Population
Population is a subset of solutions in the current generation. It can also be defined as a set of chromosomes. There are several things to be kept in mind when dealing with a GA population:

The diversity of the population should be maintained, otherwise it might lead to premature convergence.

The population size should not be kept very large as it can cause a GA to slow down, while a smaller population might not be enough for a good mating pool. Therefore, an optimal population size needs to be decided by trial and error.
The population is usually defined as a two-dimensional array of size (population size) x (chromosome size).
Initializing the population
There are two primary methods to initialize a population in a GA. They are:
Random Initialization - Populate the initial population with completely random solutions.
Heuristic initialization - Populate the initial population using a known heuristic for the problem.
It has been observed that the entire population should not be initialized using a heuristic, as it can result in the population having similar solutions and very little diversity.
It has also been observed that heuristic initialization in some cases only affects the initial fitness of the population, but in the end, it is the diversity of the solutions which leads the population to optimality.
Population models
There are two widely used population models:
Steady State
In a steady state GA, we generate one or two offspring in each iteration and they replace one or two individuals from the population. A steady state GA is also known as an Incremental GA.
Generational
In a generational model, we generate n offspring, where n is the population size, and the entire population is replaced by the new one at the end of the iteration.
Genetic Algorithms - Fitness Function
The fitness function simply defined is a function which takes a candidate solution to the problem as input and produces as output how "fit" or how "good" the solution is with respect to the problem in consideration.
Calculation of the fitness value is done repeatedly in a GA, and therefore it should be sufficiently fast. A slow computation of the fitness value can adversely affect a GA and make it exceptionally slow.
In most cases, the fitness function and the objective function are the same, as the objective is to either maximize or minimize the given objective function. However, for more complex problems with multiple objectives and constraints, an algorithm designer might choose to have a different fitness function.
A fitness function should possess the following characteristics:
In some cases, calculating the fitness function directly might not be possible due to the inherent complexities of the problem at hand. In such cases, we do fitness approximation to suit our needs.
The following image shows the fitness calculation for a solution of the 0/1 Knapsack. It is a simple fitness function which just sums the profit values of the items being picked (which have a 1), scanning the elements from left to right till the knapsack is full.
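That left-to-right fitness calculation for the 0/1 Knapsack can be sketched as follows. The profit and weight values in the usage example are made up for illustration:

```python
def knapsack_fitness(chromosome, profits, weights, capacity):
    """Sum the profits of the picked items (genes with a 1), scanning
    left to right and skipping any item that would overflow the knapsack."""
    total_profit, total_weight = 0, 0
    for gene, profit, weight in zip(chromosome, profits, weights):
        if gene == 1 and total_weight + weight <= capacity:
            total_profit += profit
            total_weight += weight
    return total_profit

# Illustrative data: three items, knapsack capacity 5.
# Items 1 and 2 fit (weights 2 + 3); item 3 (weight 5) is skipped.
fitness = knapsack_fitness([1, 1, 1], [10, 5, 15], [2, 3, 5], 5)
```

Skipping overflowing items rather than rejecting the whole chromosome means every bit string gets a usable fitness value, which keeps the GA simple.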
Genetic Algorithms - Parent Selection
Parent selection is the process of selecting parents which mate and recombine to create offspring for the next generation. Parent selection is very crucial to the convergence rate of the GA, as good parents drive individuals towards better and fitter solutions.
However, care should be taken to prevent one extremely fit solution from taking over the entire population in a few generations, as this leads to the solutions being close to one another in the solution space, thereby leading to a loss of diversity. Maintaining good diversity in the population is extremely crucial for the success of a GA. This taking over of the entire population by one extremely fit solution is known as premature convergence and is an undesirable condition in a GA.
Fitness proportionate selection
Fitness proportionate selection is one of the most popular ways of parent selection. In this, every individual can become a parent with a probability which is proportional to its fitness. Therefore, fitter individuals have a higher chance of mating and propagating their features to the next generation. Such a selection strategy thus applies a selection pressure to the more fit individuals in the population, evolving better individuals over time.
Consider a circular wheel. The wheel is divided into n pies, where n is the number of individuals in the population. Each individual gets a portion of the circle which is proportional to its fitness value.
Two implementations of fitness proportionate selection are possible:
Roulette wheel selection
In a roulette wheel selection, the circular wheel is divided as described before. A fixed point is chosen on the wheel circumference and the wheel is rotated. The region of the wheel which comes in front of the fixed point is chosen as the parent. For the second parent, the same process is repeated.
It is clear that a fitter individual has a greater pie on the wheel and therefore a greater chance of landing in front of the fixed point when the wheel is rotated. Therefore, the probability of choosing an individual depends directly on its fitness.
Regarding the implementation, we use the following steps:

Calculate S = the sum of the fitnesses.

Generate a random number between 0 and S.

Starting from the top of the population, keep adding the fitnesses to the partial sum P, till P exceeds the random number.

The individual at which P first exceeds the random number is the chosen individual.
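The steps above can be sketched in Python as follows; the final return is a guard against floating-point round-off, an implementation detail not mentioned in the text:

```python
import random

def roulette_select(population, fitnesses, rng=random):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)            # S = the sum of the fitnesses
    point = rng.uniform(0, total)     # random number between 0 and S
    partial = 0.0
    for individual, fit in zip(population, fitnesses):
        partial += fit                # keep adding fitnesses to the partial sum
        if partial >= point:          # first individual where P exceeds the number
            return individual
    return population[-1]             # guard against floating-point round-off

# Usage: "c" holds 80% of the wheel, so it is picked far more often.
rng = random.Random(1)
draws = [roulette_select(["a", "b", "c"], [1.0, 1.0, 8.0], rng)
         for _ in range(1000)]
```

Note that this requires all fitness values to be non-negative, which is exactly the limitation of fitness proportionate methods pointed out below.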
Stochastic Universal Sampling (SUS)
Stochastic Universal Sampling is quite similar to roulette wheel selection; however, instead of having just one fixed point, we have multiple fixed points, as shown in the following image. Therefore, all the parents are chosen in just one spin of the wheel. Also, such a setup encourages the highly fit individuals to be chosen at least once.
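A possible sketch of SUS, assuming strictly positive fitness values: a single random offset fixes n evenly spaced pointers on the wheel, and each pointer picks one parent.

```python
import random

def stochastic_universal_sampling(population, fitnesses, n, rng=random):
    """Choose n parents with evenly spaced pointers on the fitness wheel."""
    total = sum(fitnesses)
    step = total / n
    start = rng.uniform(0, step)      # one random spin fixes all n pointers
    pointers = [start + i * step for i in range(n)]
    chosen, partial, i = [], fitnesses[0], 0
    for p in pointers:
        # Advance to the wheel segment that contains this pointer.
        while partial < p:
            i += 1
            partial += fitnesses[i]
        chosen.append(population[i])
    return chosen

# "a" owns half the wheel, so with 4 pointers it is always chosen twice.
parents = stochastic_universal_sampling(["a", "b", "c"], [4.0, 2.0, 2.0],
                                        4, random.Random(3))
```

Because the pointers are evenly spaced, the number of times each individual is picked can differ from its expected value by less than one, which is the low-variance property that distinguishes SUS from repeated roulette spins.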
It is to be noted that fitness proportionate selection methods do not work for cases where the fitness can take a negative value.
Tournament selection
In K-Way tournament selection, we select K individuals from the population at random and select the best out of these to become a parent. The same process is repeated for selecting the next parent. Tournament selection is also extremely popular in literature, as it can even work with negative fitness values.
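K-Way tournament selection is short enough to sketch in a few lines; note that it only compares fitness values, so negative fitnesses work fine:

```python
import random

def tournament_select(population, fitness, k=3, rng=random):
    """K-way tournament: sample k individuals at random, keep the fittest."""
    contestants = rng.sample(population, k)
    return max(contestants, key=fitness)

# Usage with a negative fitness function (fitness = -x, so 0 is fittest).
winner = tournament_select(list(range(10)), lambda x: -x,
                           k=10, rng=random.Random(0))
```

The tournament size k controls selection pressure: larger tournaments make it harder for weak individuals to ever win.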
Rank selection
Rank selection also works with negative fitness values and is mostly used when the individuals in the population have very close fitness values (this happens usually at the end of the run). This leads to each individual having an almost equal share of the pie (like in the case of fitness proportionate selection), as shown in the following image, and hence each individual, no matter how fit relative to the others, has approximately the same probability of getting selected as a parent. This in turn leads to a loss in the selection pressure towards fitter individuals, making the GA make poor parent selections in such situations.
In this, we remove the concept of a fitness value while selecting a parent. However, every individual in the population is ranked according to its fitness. The selection of the parents depends on the rank of each individual and not the fitness. Higher ranked individuals are preferred over lower ranked ones.
Chromosome | Fitness value | Rank
A | 8.1 | 1
B | 8.0 | 4
C | 8.05 | 2
D | 7.95 | 6
E | 8.02 | 3
F | 7.99 | 5
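One common way to implement rank selection is sketched below; the linear rank-to-weight mapping used here (worst individual gets weight 1, best gets weight n) is one of several possibilities:

```python
import random

def rank_select(population, fitnesses, rng=random):
    """Select a parent with probability based on rank, not raw fitness."""
    order = sorted(range(len(population)), key=lambda i: fitnesses[i])
    weights = [0] * len(population)
    for rank, i in enumerate(order, start=1):
        weights[i] = rank             # worst -> 1, ..., best -> n
    return rng.choices(population, weights=weights, k=1)[0]

# Even with nearly identical fitness values, the ranking keeps a clear
# selection pressure towards the fitter individuals.
rng = random.Random(0)
draws = [rank_select(["worst", "mid", "best"], [7.95, 8.0, 8.1], rng)
         for _ in range(600)]
```

Because only the ordering matters, negative or tightly clustered fitness values pose no problem, which matches the motivation given above.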
Random selection
In this strategy, we randomly select parents from the existing population. There is no selection pressure towards fitter individuals, and therefore this strategy is usually avoided.
Genetic Algorithms - Crossover
In this chapter, we will discuss what a crossover operator is, along with its other modules, their uses and benefits.
Introduction to Crossover
The crossover operator is analogous to reproduction and biological crossover. In this, more than one parent is selected and one or more offspring are produced using the genetic material of the parents. Crossover is usually applied in a GA with a high probability - p_{c}.
Crossover operators
In this section, we will discuss some of the most popularly used crossover operators. It is to be noted that these crossover operators are very generic and the GA designer might choose to implement a problem-specific crossover operator as well.
One-point crossover
In this one-point crossover, a random crossover point is selected and the tails of the two parents are swapped to get new offspring.
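A sketch of one-point crossover for list-encoded chromosomes:

```python
import random

def one_point_crossover(parent1, parent2, rng=random):
    """Swap the tails of two parents after a random crossover point."""
    point = rng.randrange(1, len(parent1))   # never cut at position 0
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

# Usage: crossing an all-0 and an all-1 parent just redistributes the genes.
c1, c2 = one_point_crossover([0] * 6, [1] * 6, random.Random(0))
```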
Multi-point crossover
Multi-point crossover is a generalization of the one-point crossover wherein alternating segments are swapped to get new offspring.
Uniform crossover
In a uniform crossover, we don't divide the chromosome into segments; rather, we treat each gene separately. In this, we essentially flip a coin for each gene to decide whether or not it will be included in the offspring.
Whole arithmetic recombination
This is commonly used for integer representations and works by taking the weighted average of the two parents by using the following formulae:
Child1 = α.x + (1-α).y
Child2 = α.y + (1-α).x
Obviously, if α = 0.5, then both the children will be identical, as shown in the following image.
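The two formulae translate directly into code; x and y are the parent vectors:

```python
def arithmetic_crossover(x, y, alpha=0.5):
    """Whole arithmetic recombination: each child is a weighted
    average of the two parent vectors."""
    child1 = [alpha * a + (1 - alpha) * b for a, b in zip(x, y)]   # α.x + (1-α).y
    child2 = [alpha * b + (1 - alpha) * a for a, b in zip(x, y)]   # α.y + (1-α).x
    return child1, child2

# With α = 0.5, both children collapse to the parents' midpoint.
c1, c2 = arithmetic_crossover([0.0, 10.0], [10.0, 0.0], alpha=0.5)
```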
Davis' Order Crossover (OX1)
OX1 is used for permutation-based crossovers with the intention of transmitting information about relative ordering to the offspring. It works as follows:

Create two random crossover points in the parent and copy the segment between them from the first parent to the first offspring.

Now, starting from the second crossover point in the second parent, copy the remaining unused numbers from the second parent to the first child, wrapping around the list.

Repeat for the second child with the parents' roles reversed.
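The three steps above can be sketched as follows, producing one child:

```python
import random

def order_crossover(parent1, parent2, rng=random):
    """Davis' order crossover (OX1) for permutation chromosomes."""
    n = len(parent1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    # Step 1: copy the segment between the two cross points from parent 1.
    child[a:b + 1] = parent1[a:b + 1]
    used = set(parent1[a:b + 1])
    # Step 2: take parent 2's genes starting after the second cross point,
    # wrapping around the list, skipping genes already used.
    fill = [parent2[(b + 1 + i) % n] for i in range(n)
            if parent2[(b + 1 + i) % n] not in used]
    # Fill the open slots in the same wrapping order.
    slots = [(b + 1 + i) % n for i in range(n)
             if child[(b + 1 + i) % n] is None]
    for pos, gene in zip(slots, fill):
        child[pos] = gene
    return child

# Usage: the child is always a valid permutation of the parents' genes.
child = order_crossover([1, 2, 3, 4, 5, 6, 7, 8],
                        [8, 7, 6, 5, 4, 3, 2, 1], random.Random(5))
```

For the second child, call the function again with the parent arguments swapped.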
There are many other crossovers like Partially Mapped Crossover (PMX), Order based crossover (OX2), Shuffle Crossover, Ring Crossover, etc.
Genetic Algorithms - Mutation
Introduction to mutation
In simple terms, mutation may be defined as a small random tweak in the chromosome, to get a new solution. It is used to maintain and introduce diversity in the genetic population and is usually applied with a low probability - p_{m}. If the probability is very high, the GA gets reduced to a random search.
Mutation is the part of the GA which is related to the "exploration" of the search space. It has been observed that mutation is essential to the convergence of the GA, while crossover is not.
Mutation operators
In this section, we describe some of the most commonly used mutation operators. Like the crossover operators, this list is not exhaustive and the GA designer might find a combination of these approaches or a problemspecific mutation operator more useful.
Bit Flip Mutation
In this bit flip mutation, we select one or more random bits and flip them. This is used for binary encoded GAs.
Random resetting
Random resetting is an extension of the bit flip for the integer representation. In this, a random value from the set of permissible values is assigned to a randomly chosen gene.
Swap Mutation
In swap mutation, we select two positions on the chromosome at random, and swap the values. This is common in permutation based encodings.
Scramble mutation
Scramble mutation is also popular with permutation representations. In this, from the entire chromosome, a subset of genes is chosen and their values are scrambled or shuffled randomly.
Inversion mutation
In inversion mutation, we select a subset of genes as in scramble mutation, but instead of shuffling the subset, we merely invert the entire string in the subset.
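The four mutation operators described in this chapter can each be sketched in a few lines:

```python
import random

def bit_flip(chromosome, pm, rng=random):
    """Flip each bit independently with probability pm (binary encodings)."""
    return [1 - g if rng.random() < pm else g for g in chromosome]

def swap_mutation(perm, rng=random):
    """Exchange the values at two random positions (permutation encodings)."""
    perm = perm[:]
    i, j = rng.sample(range(len(perm)), 2)
    perm[i], perm[j] = perm[j], perm[i]
    return perm

def scramble_mutation(perm, rng=random):
    """Shuffle the genes inside a random contiguous subset."""
    perm = perm[:]
    i, j = sorted(rng.sample(range(len(perm)), 2))
    segment = perm[i:j + 1]
    rng.shuffle(segment)
    perm[i:j + 1] = segment
    return perm

def inversion_mutation(perm, rng=random):
    """Reverse the order of a random contiguous subset."""
    perm = perm[:]
    i, j = sorted(rng.sample(range(len(perm)), 2))
    perm[i:j + 1] = perm[i:j + 1][::-1]
    return perm
```

Note that swap, scramble, and inversion all preserve the multiset of gene values, which is exactly why they are the operators of choice for permutation representations like the TSP.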
Genetic Algorithms - Survivor Selection
The survivor selection policy determines which individuals are to be kicked out and which are to be kept in the next generation. It is crucial as it should ensure that the fitter individuals are not kicked out of the population, while at the same time diversity should be maintained in the population.
Some GAs employ elitism. In simple terms, this means that the current fittest member of the population is always propagated to the next generation. Therefore, under no circumstances can the fittest member of the current population be replaced.
The easiest policy is to kick random members out of the population, but such an approach frequently has convergence issues; therefore, the following strategies are widely used.
Age-based selection
In age-based selection, we don't have a notion of fitness. It is based on the premise that each individual is allowed in the population for a finite number of generations where it is allowed to reproduce, after which it is kicked out of the population no matter how good its fitness is.
For instance, in the following example, the age is the number of generations for which the individual has been in the population. The oldest members of the population, i.e. P4 and P7, are kicked out of the population and the ages of the rest of the members are incremented by one.
Fitness-based selection
In this fitness-based selection, the children tend to replace the least fit individuals in the population. The selection of the least fit individuals may be done using a variation of any of the selection policies described before - tournament selection, fitness proportionate selection, etc.
For example, in the following image, children replace the least fit individuals P1 and P10 in the population. Note that since P1 and P9 have the same fitness value, the decision to remove which individual from the population is arbitrary.
Genetic Algorithms - Termination Condition
The termination condition of a genetic algorithm is important in determining when a GA run will end. It has been observed that initially, the GA progresses very fast, with better solutions coming in every few iterations, but this tends to saturate in the later stages where the improvements are very small. We usually want a termination condition such that our solution is close to the optimal at the end of the run.
Usually, we keep one of the following termination conditions:
When there has been no improvement in the population for X iterations.
When we reach an absolute number of generations.
When the objective function value has reached a certain pre-defined value.
For example, in a genetic algorithm, we keep a counter which keeps track of the generations for which there has been no improvement in the population. Initially, we set this counter to zero. Each time we don't generate offspring which are better than the individuals in the population, we increment the counter.
However, if the fitness of any of the offspring is better, then we reset the counter to zero. The algorithm terminates when this counter reaches a predetermined value.
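The no-improvement counter, the generation cap, and the target-value condition can be combined into a single helper; this is only a sketch, and the parameter names are illustrative:

```python
def should_terminate(best_history, patience=20, target=None,
                     max_generations=500):
    """Stop when the best fitness has not improved for `patience`
    generations, when a target value is reached, or at a generation cap.
    `best_history` holds the best fitness seen at each generation."""
    gens = len(best_history)
    if gens >= max_generations:
        return True
    if target is not None and best_history and best_history[-1] >= target:
        return True
    # No improvement: the last `patience` generations never beat the
    # best value achieved before them.
    if gens > patience and \
            max(best_history[-patience:]) <= max(best_history[:-patience]):
        return True
    return False
```

The GA's main loop would append the generation's best fitness to the history and call this helper once per iteration.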
Like other parameters of a GA, the termination condition is also highly problem specific, and the GA designer should try out various options to see what suits their particular problem best.
Lifetime Adaptation Models
So far in this tutorial, whatever we have discussed corresponds to the Darwinian model of evolution - natural selection and genetic variation through recombination and mutation. In nature, only the information contained in an individual's genotype can be transmitted to the next generation. This is the approach which we have been following in the tutorial so far.
However, other models of lifetime adaptation - the Lamarckian Model and the Baldwinian Model - also exist. It is to be noted that which model is the best is open for debate, and the results obtained by researchers show that the choice of lifetime adaptation is highly problem specific.
Often, we hybridize a GA with local search - as in memetic algorithms. In such cases, one might choose either the Lamarckian or the Baldwinian model to decide what to do with the individuals generated after the local search.
Lamarckian model
The Lamarckian model essentially says that the traits which an individual acquires in its lifetime can be passed on to its offspring. It is named after the French biologist Jean-Baptiste Lamarck.
Even though natural biology has completely disregarded Lamarckism, as we all know that only the information in the genotype can be transmitted, from a computational standpoint it has been shown that adopting the Lamarckian model gives good results for some of the problems.
In the Lamarckian model, a local search operator examines the neighborhood (acquiring new traits), and if a better chromosome is found, it becomes the offspring.
Baldwinian model
The Baldwinian model is an intermediate idea named after James Mark Baldwin (1896). In the Baldwin model, chromosomes can encode a tendency to learn beneficial behaviors. This means that unlike the Lamarckian model, we do not pass on acquired traits to the next generation, nor do we completely ignore acquired traits as in the Darwinian model.
The Baldwin model is in the middle of these two extremes, where the tendency of an individual to acquire certain traits is encoded rather than the traits themselves.
In this Baldwinian model, a local search operator examines the neighborhood (acquiring new traits), and if a better chromosome is found, it only assigns the improved fitness to the chromosome and does not modify the chromosome itself. The change in fitness signifies the chromosome's capability to "acquire the trait", even though it is not passed directly to the future generations.
Effective implementation
GAs are very general in nature, and just applying them to any optimization problem wouldn't give good results. In this section, we describe a few points which would help and assist a GA designer or GA implementer in their work.
Introduce problemspecific domain knowledge
It has been observed that the more problem-specific domain knowledge we incorporate into the GA, the better objective values we get. Adding problem-specific information can be done by either using problem-specific crossover or mutation operators, custom representations, etc.
The following image shows Michalewicz's (1990) view of EA 
Reduce crowding
Crowding happens when a highly fit chromosome gets to reproduce a lot, and in a few generations the entire population is filled with similar solutions having similar fitness. This reduces diversity, which is a very crucial element to ensure the success of a GA. There are numerous ways to limit crowding. Some of them are:

Mutation to introduce diversity.

Switching to rank selection and tournament selection, which have more selection pressure than fitness proportionate selection for individuals with similar fitness.

Fitness Sharing  In this case, the fitness of an individual is reduced if the population already contains similar individuals.
Randomization helps!
It has been experimentally observed that the best solutions are driven by randomized chromosomes, as they impart diversity to the population.
Hybridize GA with local search
Local search consists of checking the solutions in the neighborhood of a given solution to find better objective values.
It might be useful at times to hybridize the GA with local search. The following image shows the various places where local search can be introduced in a GA.
Variation of parameters and techniques
In genetic algorithms, there is no "one size fits all" or a magic formula which works for all problems. Even after the initial GA is ready, it takes a lot of time and effort to play around with the parameters like population size, mutation and crossover probabilities, etc. to find the ones which suit the particular problem.
Genetic Algorithms  Advanced Topics
In this section, we present some advanced topics on genetic algorithms. A reader looking for a simple introduction to GA may choose to skip this section.
Constrained optimization problems
Constrained optimization problems are those optimization problems in which we have to maximize or minimize a given objective function value subject to certain constraints. Therefore, not all results in the solution space are feasible, and the solution space contains feasible regions as shown in the following image.
In such a scenario, crossover and mutation operators might give us solutions which are infeasible. Therefore, additional mechanisms have to be employed in the GA when dealing with constrained optimization problems.
Some of the most common methods are:

Using penalty functions which reduce the fitness of infeasible solutions, preferably so that the fitness is reduced in proportion to the number of constraints violated or the distance from the feasible region.

Using repair functions which take an infeasible solution and modify it so that the violated constraints get satisfied.

Not allowing infeasible solutions to enter the population at all.

Using a special representation or decoder functions which ensure feasibility of the solutions.
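As an illustration of the first approach, a penalty function might look like the following sketch, which assumes constraints are expressed as functions g with g(x) <= 0 meaning feasible; the weighting scheme is one simple choice among many:

```python
def penalized_fitness(objective, solution, constraints, penalty_weight=100.0):
    """Reduce the fitness of infeasible solutions in proportion to the
    total amount by which the constraints are violated."""
    # Each constraint g contributes max(0, g(x)): zero when satisfied,
    # the distance from feasibility when violated.
    violation = sum(max(0.0, g(solution)) for g in constraints)
    return objective(solution) - penalty_weight * violation

# Usage: maximize sum(x) subject to sum(x) <= 5.
cons = [lambda s: sum(s) - 5]
feasible_score = penalized_fitness(sum, [1, 2], cons)    # no penalty
infeasible_score = penalized_fitness(sum, [4, 4], cons)  # heavily penalized
```

Because infeasible solutions are ranked rather than discarded, the GA can still traverse infeasible regions on its way to better feasible ones.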
Basic theoretical background
In this section, we will discuss the Schema Theorem and the NFL theorem, along with the building block hypothesis.
Schema theorem
Researchers have been trying to figure out the mathematics behind the working of genetic algorithms, and Holland's Schema Theorem is a step in that direction.
In this section, we will not delve into the mathematics of the Schema Theorem; rather, we try to develop a basic understanding of what the Schema Theorem is. The basic terminology to know is as follows:

A Schema is a "template". Formally, it is a string over the alphabet = {0,1,*},
where * is the don't-care symbol and can take any value.
Therefore, *10*1 could mean 01001, 01011, 11001 or 11011.
Geometrically, a schema is a hyper-plane in the solution search space.

The order of a pattern is the number of fixed positions specified in a gene.
The Schema Theorem states that schemata with above-average fitness, short defining length and lower order are more likely to survive crossover and mutation.
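The schema terminology above is concrete enough to sketch in a few lines of Python. The helper names below are illustrative only:

```python
# Illustrative helpers for schema terminology over the alphabet {0, 1, *}.

def matches(schema, string):
    # A string matches a schema if it agrees at every fixed (non-*) position.
    return all(s == '*' or s == c for s, c in zip(schema, string))

def order(schema):
    # Order: the number of fixed (non-*) positions.
    return sum(1 for s in schema if s != '*')

def defining_length(schema):
    # Defining length: distance between the outermost fixed positions.
    fixed = [i for i, s in enumerate(schema) if s != '*']
    return fixed[-1] - fixed[0] if fixed else 0
```

For the schema *10*1 from the example above, 01001 and 11011 both match, the order is 3 (three fixed positions), and the defining length is 3 (fixed positions at indices 1 and 4).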
Building blocks hypothesis
Building blocks are low-order, short defining-length schemata with above-average fitness. The building block hypothesis says that such building blocks serve as the basis for the success of a GA: the GA progresses by successively identifying and recombining these "building blocks".
No Free Lunch Theorem (NFL)
Wolpert and Macready published a paper in 1997 called "No Free Lunch Theorems for Optimization." It essentially states that, averaged over the space of all possible problems, all black-box algorithms exhibit the same performance.
This means that the more we understand a problem, the more problem-specific our GA becomes and the better it performs on that problem, but it compensates for this by performing poorly on other problems.
GA-based machine learning
Genetic algorithms also find application in machine learning. Classifier systems are a form of Genetics-Based Machine Learning (GBML) system frequently used in the field of machine learning. GBML methods are a niche approach to machine learning.
There are two categories of GBML systems 

The Pittsburgh Approach − In this approach, one chromosome encodes one complete solution, and so fitness is assigned to solutions.

The Michigan Approach − In this approach, one solution is typically represented by many chromosomes, and so fitness is assigned to partial solutions.
It should be kept in mind that the standard issues such as crossover, mutation, Lamarckian or Darwinian learning, etc. are also present in GBML systems.
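To make the Pittsburgh approach concrete, here is a minimal, hypothetical sketch: a chromosome encodes a complete rule set, and fitness is the accuracy of that whole rule set on a toy dataset. The rule format, dataset and all names are invented purely for illustration.

```python
# Pittsburgh-style GBML sketch (illustrative): one chromosome = one rule set.
# A rule is (feature_index, threshold, predicted_class); the first rule whose
# condition holds fires, and a default class is used if no rule matches.

def classify(rule_set, sample, default=0):
    for feature, threshold, label in rule_set:
        if sample[feature] > threshold:
            return label
    return default

def fitness(rule_set, dataset):
    # Pittsburgh approach: fitness is assigned to the whole rule set,
    # here simply its classification accuracy on the dataset.
    correct = sum(1 for sample, label in dataset
                  if classify(rule_set, sample) == label)
    return correct / len(dataset)

# Toy dataset: one feature per sample; class 1 when the feature exceeds 0.5.
data = [((0.2,), 0), ((0.9,), 1), ((0.4,), 0), ((0.7,), 1)]
rules = [(0, 0.5, 1)]   # one rule: "if feature 0 > 0.5, predict class 1"
```

A GA would then evolve a population of such rule sets, applying crossover and mutation to whole chromosomes and selecting by this fitness; in the Michigan approach, by contrast, each individual would be a single rule and fitness would be attributed to partial solutions.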
Genetic Algorithms − Areas of Application
Genetic algorithms are primarily used in optimization problems of various kinds, but they are frequently applied in other areas as well.
In this section, we list some of the areas in which genetic algorithms are frequently used. These are −

Optimization − Genetic algorithms are most commonly used in optimization problems, in which we have to maximize or minimize a given objective function value under a given set of constraints. The approach to solving optimization problems has been highlighted throughout this tutorial.

Economics − GAs are also used to characterize various economic models, such as the cobweb model, game theory equilibrium resolution, and asset pricing.

Neural Networks − GAs are also used to train neural networks, particularly recurrent neural networks.

Parallelization − GAs also have very good parallel capabilities, prove to be very effective means of solving certain problems, and provide a good area for research.

Image Processing − GAs are also used for various digital image processing tasks, such as dense pixel matching.

Vehicle Routing Problems − With multiple soft time windows, multiple depots and a heterogeneous fleet.

Scheduling Applications − GAs are used to solve various scheduling problems as well, particularly the timetabling problem.

Machine Learning − As already discussed, Genetics-Based Machine Learning (GBML) is a niche area in machine learning.

Robot Trajectory Generation − GAs have been used to plan the path a robot arm takes while moving from one point to another.

Parametric Aircraft Design − GAs have been used to design aircraft by varying the parameters and evolving better solutions.

DNA Analysis − GAs have been used to determine the structure of DNA using spectrometric data about the sample.

Multimodal Optimization − GAs are also very good approaches for multimodal optimization, in which we have to find multiple optimum solutions.

Traveling Salesman Problem and its applications − GAs have been used to solve the TSP, a well-known combinatorial problem, using novel crossover and packing strategies.
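Applying a GA to the TSP requires crossover operators that preserve permutations, since an ordinary bitwise crossover would duplicate or drop cities. The following is a simplified Python sketch of order crossover (the classic OX1 operator fills the child starting after the copied slice and wraps around; here we fill left to right for brevity). The function name is illustrative.

```python
import random

def order_crossover(parent1, parent2, rng=random):
    # Simplified order crossover for tours represented as permutations.
    n = len(parent1)
    a, b = sorted(rng.sample(range(n), 2))   # random slice boundaries
    child = [None] * n
    child[a:b + 1] = parent1[a:b + 1]        # copy a slice from parent 1
    kept = set(child[a:b + 1])
    # Fill the remaining positions with parent 2's cities, in parent 2's order,
    # skipping cities already copied, so the child is a valid tour.
    fill = [city for city in parent2 if city not in kept]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child
```

Whatever slice is chosen, the child is always a valid permutation of the cities, which is exactly the property the TSP encoding needs.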
Genetic Algorithms − Further Reading
The following books can be consulted to further enhance the reader's knowledge of genetic algorithms and evolutionary computation in general.

Genetic Algorithms in Search, Optimization and Machine Learning by David E. Goldberg.

Genetic Algorithms + Data Structures = Evolution Programs by Zbigniew Michalewicz.

Practical Genetic Algorithms by Randy L. Haupt and Sue Ellen Haupt.

Multi-Objective Optimization using Evolutionary Algorithms by Kalyanmoy Deb.