Release History¶

PyGAD 1.0.17¶
Release Date: 15 April 2020
- The pygad.GA class accepts a new argument named fitness_func which accepts a function to be used for calculating the fitness values of the solutions. This allows the project to be customized to any problem by building the right fitness function.
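A minimal sketch of passing a custom fitness function through fitness_func. The single-argument signature matches this 1.0.x description; later releases add more arguments (the solution index in 2.0.0 and the GA instance in 3.0.0), and the problem data below are placeholders.

import pygad
import numpy

function_inputs = [4, -2, 3.5, 5]   # placeholder problem data
desired_output = 44                 # placeholder target

def fitness_func(solution):
    # Higher fitness for solutions whose weighted sum is closer to the target.
    output = numpy.sum(solution * function_inputs)
    return 1.0 / (abs(output - desired_output) + 0.000001)

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=len(function_inputs),
                       fitness_func=fitness_func)
ga_instance.run()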
PyGAD 1.0.20¶
Release Date: 4 May 2020
- The pygad.GA attributes are moved from the class scope to the instance scope.
- Raising an exception for incorrect values of the passed parameters.
- Two new parameters are added to the pygad.GA class constructor (init_range_low and init_range_high) allowing the user to customize the range from which the gene values in the initial population are selected.
- The code object __code__ of the passed fitness function is checked to ensure it has the right number of parameters.
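A minimal sketch of customizing the initial gene range with the new parameters (placeholder fitness function, single-argument signature of this era):

import pygad

def fitness_func(solution):
    return float(sum(solution))  # placeholder fitness

# Genes of the initial population are sampled from the range [-2, 5).
ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=4,
                       fitness_func=fitness_func,
                       init_range_low=-2,
                       init_range_high=5)
ga_instance.run()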
PyGAD 2.0.0¶
Release Date: 13 May 2020
- The fitness function accepts a new argument named sol_idx representing the index of the solution within the population.
- A new parameter to the pygad.GA class constructor named initial_population is supported to allow the user to use a custom initial population to be used by the genetic algorithm. If not None, then the passed population will be used. If None, then the genetic algorithm will create the initial population using the sol_per_pop and num_genes parameters.
- The parameters sol_per_pop and num_genes are optional and set to None by default.
- A new parameter named callback_generation is introduced in the pygad.GA class constructor. It accepts a function with a single parameter representing the pygad.GA class instance. This function is called after each generation. This helps the user to do post-processing or debugging operations after each generation.
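A minimal sketch combining a custom initial population with a callback_generation function, assuming the two-argument fitness signature introduced in this release (the population values and fitness math are placeholders):

import pygad
import numpy

def fitness_func(solution, sol_idx):
    # Placeholder fitness: the closer the gene sum to 10, the better.
    return 1.0 / (abs(numpy.sum(solution) - 10) + 0.000001)

def callback_generation(ga_instance):
    print("Generation completed:", ga_instance.generations_completed)

# Custom initial population: 4 solutions, 3 genes each.
initial_population = numpy.random.uniform(low=-4, high=4, size=(4, 3))

ga_instance = pygad.GA(num_generations=20,
                       num_parents_mating=2,
                       fitness_func=fitness_func,
                       initial_population=initial_population,
                       callback_generation=callback_generation)
ga_instance.run()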
PyGAD 2.1.0¶
Release Date: 14 May 2020
- The best_solution() method in the pygad.GA class returns a new output representing the index of the best solution within the population. Now, it returns a total of 3 outputs and their order is: best solution, best solution fitness, and best solution index. Here is an example:
solution, solution_fitness, solution_idx = ga_instance.best_solution()
print("Parameters of the best solution :", solution)
print("Fitness value of the best solution :", solution_fitness, "\n")
print("Index of the best solution :", solution_idx, "\n")
- A new attribute named best_solution_generation is added to the instances of the pygad.GA class. It holds the generation number at which the best solution is reached. It is only assigned the generation number after the run() method completes. Otherwise, its value is -1. Example:
print("Best solution reached after{best_solution_generation} generations.".format(best_solution_generation=ga_instance.best_solution_generation))
- The best_solution_fitness attribute is renamed to best_solutions_fitness (plural solutions).
- Mutation is applied independently for each gene.
PyGAD 2.2.1¶
Release Date: 17 May 2020
- Adding 2 extra modules (pygad.nn and pygad.gann) for building and training neural networks with the genetic algorithm.
PyGAD 2.2.2¶
Release Date: 18 May 2020
- The initial value of the generations_completed attribute of instances from the pygad.GA class is 0 rather than None.
- An optional bool parameter named mutation_by_replacement is added to the constructor of the pygad.GA class. It works only when the selected type of mutation is random (mutation_type="random"). In this case, setting mutation_by_replacement=True means replace the gene by the randomly generated value. If False, then it has no effect and random mutation works by adding the random value to the gene. This parameter should be used when the gene falls within a fixed range and its value must not go out of this range. Here are some examples:
Assume there is a gene with the value 0.5.
- If mutation_type="random" and mutation_by_replacement=False, then the generated random value (e.g. 0.1) will be added to the gene value. The new gene value is 0.5 + 0.1 = 0.6.
- If mutation_type="random" and mutation_by_replacement=True, then the generated random value (e.g. 0.1) will replace the gene value. The new gene value is 0.1.
- A None value could be assigned to the mutation_type and crossover_type parameters of the pygad.GA class constructor. When None, this means the step is bypassed and has no action.
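A minimal sketch of keeping genes inside a fixed range with mutation_by_replacement while bypassing crossover with crossover_type=None. The fitness function, ranges, and the random_mutation_min_val/random_mutation_max_val values are placeholders, and the two-argument fitness signature matches releases of this era:

import pygad
import numpy

def fitness_func(solution, sol_idx):
    return -float(numpy.sum(numpy.abs(solution)))  # placeholder fitness: genes close to 0 are better

# Keep genes inside [0, 1]: random mutation replaces the gene instead of adding to it.
ga_instance = pygad.GA(num_generations=30,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=5,
                       fitness_func=fitness_func,
                       init_range_low=0.0,
                       init_range_high=1.0,
                       mutation_type="random",
                       mutation_by_replacement=True,
                       random_mutation_min_val=0.0,
                       random_mutation_max_val=1.0,
                       crossover_type=None)   # bypass the crossover step entirely
ga_instance.run()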
PyGAD 2.3.0¶
Release Date: 1 June 2020
- A new module named pygad.cnn is supported for building convolutional neural networks.
- A new module named pygad.gacnn is supported for training convolutional neural networks using the genetic algorithm.
- The pygad.plot_result() method has 3 optional parameters named title, xlabel, and ylabel to customize the plot title, x-axis label, and y-axis label, respectively.
- The pygad.nn module supports the softmax activation function.
- The name of the pygad.nn.predict_outputs() function is changed to pygad.nn.predict().
- The name of the pygad.nn.train_network() function is changed to pygad.nn.train().
PyGAD 2.4.0¶
Release Date: 5 July 2020
- A new parameter named delay_after_gen is added which accepts a non-negative number specifying the time in seconds to wait after a generation completes and before going to the next generation. It defaults to 0.0 which means no delay after the generation.
- The function passed to the callback_generation parameter of the pygad.GA class constructor can terminate the execution of the genetic algorithm if it returns the string stop. This causes the run() method to stop.
One important use case for this feature is to stop the genetic algorithm when a condition is met before passing through all the generations. The user may assign a value of 100 to the num_generations parameter of the pygad.GA class constructor. Suppose that at generation 50, for example, a condition is met and the user wants to stop the execution without waiting for the remaining 50 generations. To do that, just make the function passed to the callback_generation parameter return the string stop.
Here is an example of a function to be passed to the callback_generation parameter which stops the execution if the fitness value 70 is reached. The value 70 might be the best possible fitness value. After it is reached, there is no need to pass through more generations because no further improvement is possible.
def func_generation(ga_instance):
    if ga_instance.best_solution()[1] >= 70:
        return "stop"
PyGAD 2.5.0¶
Release Date: 19 July 2020
- 2 new optional parameters added to the constructor of the pygad.GA class which are crossover_probability and mutation_probability. While applying the crossover operation, each parent has a random value generated between 0.0 and 1.0. If this random value is less than or equal to the value assigned to the crossover_probability parameter, then the parent is selected for the crossover operation. For the mutation operation, a random value between 0.0 and 1.0 is generated for each gene in the solution. If this value is less than or equal to the value assigned to the mutation_probability parameter, then this gene is selected for mutation.
- A new optional parameter named linewidth is added to the plot_result() method to specify the width of the curve in the plot. It defaults to 3.0.
- Previously, the indices of the genes selected for mutation were randomly generated once for all solutions within the generation. Currently, the genes' indices are randomly generated for each solution in the population. If the population has 4 solutions, the indices are randomly generated 4 times inside the single generation, 1 time for each solution.
- Previously, the position of the point(s) for the single-point and two-points crossover was(were) randomly selected once for all solutions within the generation. Currently, the position(s) is(are) randomly selected for each solution in the population. If the population has 4 solutions, the position(s) is(are) randomly generated 4 times inside the single generation, 1 time for each solution.
- A new optional parameter named gene_space is added to the pygad.GA class constructor. It is used to specify the possible values for each gene in case the user wants to restrict the gene values. It is useful if the gene space is restricted to a certain range or to discrete values. For more information, check the More about the gene_space Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-space-parameter. Thanks to Prof. Tamer A. Farrag for requesting this useful feature.
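A minimal sketch of these parameters together (the discrete gene space, probabilities, and fitness function are placeholders):

import pygad

def fitness_func(solution, sol_idx):
    return float(sum(solution))  # placeholder fitness: maximize the sum of the genes

ga_instance = pygad.GA(num_generations=30,
                       num_parents_mating=4,
                       sol_per_pop=10,
                       num_genes=3,
                       fitness_func=fitness_func,
                       gene_space=[0, 1, 2, 3, 4],    # every gene takes a value from this list
                       crossover_probability=0.6,     # parent joins crossover only if its random value <= 0.6
                       mutation_probability=0.1)      # gene mutates only if its random value <= 0.1
ga_instance.run()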
PyGAD 2.6.0¶
Release Date: 6 August 2020
- A bug fix in assigning the value to the initial_population parameter.
- A new parameter named gene_type is added to control the gene type. It can be either int or float. It has an effect only when the parameter gene_space is None.
- 7 new parameters that accept callback functions: on_start, on_fitness, on_parents, on_crossover, on_mutation, on_generation, and on_stop.
PyGAD 2.7.0¶
Release Date: 11 September 2020
- The learning_rate parameter in the pygad.nn.train() function defaults to 0.01.
- Added support of building neural networks for regression using the new parameter named problem_type. It is added as a parameter to both pygad.nn.train() and pygad.nn.predict() functions. The value of this parameter can be either classification or regression to define the problem type. It defaults to classification.
- The activation function for a layer can be set to the string "None" to indicate that there is no activation function at this layer. As a result, the supported values for the activation function are "sigmoid", "relu", "softmax", and "None".
To build a regression network using the pygad.nn module, just do the following:
- Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".
- Set the activation function for the output layer to the string "None". This sets no limits on the range of the outputs as it will be from -infinity to +infinity. If you are sure that all outputs will be nonnegative values, then use the ReLU function.
Check the documentation of the pygad.nn module for an example that builds a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NumPyANN
To build and train a regression network using the pygad.gann module, do the following:
- Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".
- Set the output_activation parameter in the constructor of the pygad.gann.GANN class to "None".
Check the documentation of the pygad.gann module for an example that builds and trains a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NeuralGenetic
To build a classification network, either ignore the problem_type parameter or set it to "classification" (the default value). In this case, the activation function of the last layer can be set to any type (e.g. softmax).
PyGAD 2.7.1¶
Release Date: 11 September 2020
- A bug fix when the problem_type argument is set to regression.
PyGAD 2.7.2¶
Release Date: 14 September 2020
- Bug fix to support building and training regression neural networks with multiple outputs.
PyGAD 2.8.0¶
Release Date: 20 September 2020
- Support of a new module named kerasga so that Keras models can be trained by the genetic algorithm using PyGAD.
PyGAD 2.8.1¶
Release Date: 3 October 2020
- Bug fix in applying the crossover operation when the crossover_probability parameter is used. Thanks to Eng. Hamada Kassem, Research and Teaching Assistant, Construction Engineering and Management, Faculty of Engineering, Alexandria University, Egypt.
PyGAD 2.9.0¶
Release Date: 06 December 2020
- The fitness values of the initial population are considered in the best_solutions_fitness attribute.
- An optional parameter named save_best_solutions is added. It defaults to False. When it is True, then the best solution after each generation is saved into an attribute named best_solutions. If False, then no solutions are saved and the best_solutions attribute will be empty.
- Scattered crossover is supported. To use it, assign the crossover_type parameter the value "scattered".
- NumPy arrays are now supported by the gene_space parameter.
- The following parameters (gene_type, crossover_probability, mutation_probability, delay_after_gen) can be assigned a numeric value of any of these data types: int, float, numpy.int, numpy.int8, numpy.int16, numpy.int32, numpy.int64, numpy.float, numpy.float16, numpy.float32, or numpy.float64.
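A minimal sketch of the scattered crossover and save_best_solutions parameters (placeholder fitness function using the two-argument signature of this era):

import pygad
import numpy

def fitness_func(solution, sol_idx):
    return 1.0 / (numpy.abs(numpy.sum(solution) - 25) + 0.000001)  # placeholder fitness

ga_instance = pygad.GA(num_generations=40,
                       num_parents_mating=3,
                       sol_per_pop=8,
                       num_genes=6,
                       fitness_func=fitness_func,
                       crossover_type="scattered",   # the newly supported scattered crossover
                       save_best_solutions=True)     # keep the best solution of every generation
ga_instance.run()

# One best solution per generation is stored in this attribute.
print(len(ga_instance.best_solutions))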
PyGAD 2.10.0¶
Release Date: 03 January 2021
- Support of a new module pygad.torchga to train PyTorch models using PyGAD. Check its documentation.
- Support of adaptive mutation where the mutation rate is determined by the fitness value of each solution. Read the Adaptive Mutation section for more details. Also, read this paper: Libelli, S. Marsili, and P. Alba. "Adaptive mutation in genetic algorithms." Soft Computing 4.2 (2000): 76-80.
- Before the run() method completes or exits, the fitness value of the best solution in the current population is appended to the best_solutions_fitness list attribute. Note that the fitness value of the best solution in the initial population is already saved at the beginning of the list. So, the fitness value of the best solution is saved before the genetic algorithm starts and after it ends.
- When the parameter parent_selection_type is set to sss (steady-state selection), then a warning message is printed if the value of the keep_parents parameter is set to 0.
- More validations to the user input parameters.
- The default value of the mutation_percent_genes parameter is set to the string "default" rather than the integer 10. This change helps to know whether the user explicitly passed a value to the mutation_percent_genes parameter or it is left to its default one. The "default" value is later translated into the integer 10.
- The mutation_percent_genes parameter is no longer accepting the value 0. It must be >0 and <=100.
- The built-in warnings module is used to show warning messages rather than just using the print() function.
- A new bool parameter called suppress_warnings is added to the constructor of the pygad.GA class. It allows the user to control whether the warning messages are printed or not. It defaults to False which means the messages are printed.
- A helper method called adaptive_mutation_population_fitness() is created to calculate the average fitness value used in adaptive mutation to filter the solutions.
- The best_solution() method accepts a new optional parameter called pop_fitness. It accepts a list of the fitness values of the solutions in the population. If None, then the cal_pop_fitness() method is called to calculate the fitness values of the population.
PyGAD 2.10.1¶
Release Date: 10 January 2021
- In the gene_space parameter, any None value (regardless of its index or axis) is replaced by a randomly generated number based on the 3 parameters init_range_low, init_range_high, and gene_type. So, the None values in [..., None, ...] or [..., [..., None, ...], ...] are replaced with random values. This gives more freedom in building the space of values for the genes.
- All the numbers passed to the gene_space parameter are cast to the type specified in the gene_type parameter.
- The numpy.uint data type is supported for the parameters that accept integer values.
- In the pygad.kerasga module, the model_weights_as_vector() function uses the trainable attribute of the model's layers to only return the trainable weights in the network. So, only the trainable layers with their trainable attribute set to True (trainable=True), which is the default value, have their weights evolved. All non-trainable layers with the trainable attribute set to False (trainable=False) will not be evolved. Thanks to Prof. Tamer A. Farrag for pointing that out at GitHub.
PyGAD 2.10.2¶
Release Date: 15 January 2021
- A bug fix when save_best_solutions=True. Refer to this issue for more information: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/25
PyGAD 2.11.0¶
Release Date: 16 February 2021
- In the gene_space argument, the user can use a dictionary to specify the lower and upper limits of the gene. This dictionary must have only 2 items with keys low and high to specify the low and high limits of the gene, respectively. This way, PyGAD takes care of not exceeding the value limits of the gene. For a problem with only 2 genes, using gene_space=[{'low': 1, 'high': 5}, {'low': 0.2, 'high': 0.81}] means the accepted values in the first gene start from 1 (inclusive) to 5 (exclusive) while the second one has values between 0.2 (inclusive) and 0.81 (exclusive). For more information, please check the Limit the Gene Value Range section of the documentation.
- The plot_result() method returns the figure so that the user can save it.
- Bug fixes in copying elements from the gene space.
- For a gene with a set of discrete values (more than 1 value) in the gene_space parameter like [0, 1], it was possible that the gene value may not change after mutation. That is, if the current value is 0, then the randomly selected value could also be 0. Now, it is verified that the new value is changed. So, if the current value is 0, then the new value after mutation will not be 0 but 1.
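A minimal sketch of per-gene ranges through gene_space dictionaries, reusing the ranges from the entry above (placeholder fitness function):

import pygad

def fitness_func(solution, sol_idx):
    return float(solution[0] - solution[1])  # placeholder fitness

ga_instance = pygad.GA(num_generations=30,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=2,
                       fitness_func=fitness_func,
                       # First gene stays in [1, 5), second gene in [0.2, 0.81).
                       gene_space=[{'low': 1, 'high': 5},
                                   {'low': 0.2, 'high': 0.81}])
ga_instance.run()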
PyGAD 2.12.0¶
Release Date: 20 February 2021
- 4 new instance attributes are added to hold temporary results after each generation: last_generation_fitness holds the fitness values of the solutions in the last generation, last_generation_parents holds the parents selected from the last generation, last_generation_offspring_crossover holds the offspring generated after applying the crossover in the last generation, and last_generation_offspring_mutation holds the offspring generated after applying the mutation in the last generation. You can access these attributes inside the on_generation() callback, for example.
- A bug fixed when the initial_population parameter is used. The bug occurred due to a mismatch between the data type of the array assigned to initial_population and the gene type in the gene_type attribute. Assume that the array assigned to the initial_population parameter is ((1, 1), (3, 3), (5, 5), (7, 7)), which has type int. When gene_type is set to float, then the genes will not be float but cast to int because the defined array has int type. The bug is fixed by forcing the array assigned to initial_population to have the data type in the gene_type attribute. Check the issue at GitHub: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/27
Thanks to Andrei Rozanski [PhD Bioinformatics Specialist, Department of Tissue Dynamics and Regeneration, Max Planck Institute for Biophysical Chemistry, Germany] for opening my eyes to the first change.
Thanks to Marios Giouvanakis, a PhD candidate in Electrical & Computer Engineering, Aristotle University of Thessaloniki (Αριστοτέλειο Πανεπιστήμιο Θεσσαλονίκης), Greece, for emailing me about the second issue.
PyGAD 2.13.0¶
Release Date: 12 March 2021
- A new bool parameter called allow_duplicate_genes is supported. If True, which is the default, then a solution/chromosome may have duplicate gene values. If False, then each gene will have a unique value in its solution. Check the Prevent Duplicates in Gene Values section for more details.
- The last_generation_fitness attribute is updated at the end of each generation, not at the beginning. This keeps the fitness values of the most up-to-date population assigned to the last_generation_fitness attribute.
PyGAD 2.14.0¶
PyGAD 2.14.0 has an issue that is solved in PyGAD 2.14.1. Please consider using 2.14.1, not 2.14.0.
Release Date: 19 May 2021
- Issue #40 is solved. Now, the None value works with the crossover_type and mutation_type parameters: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/40
- The gene_type parameter supports accepting a list/tuple/numpy.ndarray of numeric data types for the genes. This helps to control the data type of each individual gene. Previously, gene_type could be assigned only a single data type that is applied to all genes. For more information, check the More about the gene_type Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-type-parameter. Thanks to Rainer Engel for asking about this feature in this discussion: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/43
- A new bool attribute named gene_type_single is added to the pygad.GA class. It is True when there is a single data type assigned to the gene_type parameter. When the gene_type parameter is assigned a list/tuple/numpy.ndarray, then gene_type_single is set to False.
- The mutation_by_replacement flag now has no effect if gene_space exists, except for the genes with None values. For example, for gene_space=[None, [5, 6]] the mutation_by_replacement flag affects only the first gene which has None for its value space.
- When an element has a value of None in the gene_space parameter (e.g. gene_space=[None, [5, 6]]), then its value will be randomly generated for each solution rather than being generated once for all solutions. Previously, the gene with a None value in gene_space was the same across all solutions.
- Some changes in the documentation according to issue #32: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/32
PyGAD 2.14.2¶
Release Date: 27 May 2021
- Some bug fixes when the gene_type parameter is nested. Thanks to Rainer Engel for opening a discussion to report this bug: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/43#discussioncomment-763342
Rainer Engel helped a lot in suggesting new features and enhancements in the 2.14.0 to 2.14.2 releases.
PyGAD 2.14.3¶
Release Date: 6 June 2021
- Some bug fixes when setting the save_best_solutions parameter to True. Previously, the best solution for generation i was added into the best_solutions attribute at generation i+1. Now, the best_solutions attribute is updated by each best solution at its exact generation.
PyGAD 2.15.0¶
Release Date: 17 June 2021
- Control the precision of all genes or of individual genes. Thanks to Rainer for asking about this feature: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/43#discussioncomment-763452
- A new attribute named last_generation_parents_indices holds the indices of the selected parents in the last generation.
- In adaptive mutation, there is no need to recalculate the fitness values of the parents selected in the last generation as these values can be returned based on the last_generation_fitness and last_generation_parents_indices attributes. This speeds up the adaptive mutation.
- When a sublist has a value of None in the gene_space parameter (e.g. gene_space=[[1, 2, 3], [5, 6, None]]), then its value will be randomly generated for each solution rather than being generated once for all solutions. Previously, a value of None in a sublist of the gene_space parameter was identical across all solutions.
- The dictionary assigned to the gene_space parameter itself or one of its elements has a new key called "step" to specify the step of moving from the start to the end of the range specified by the 2 existing keys "low" and "high". An example is {"low": 0, "high": 30, "step": 2} to have only even values for the gene(s) starting from 0 to 30. For more information, check the More about the gene_space Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-space-parameter. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/48
- A new function called predict() is added in both the pygad.kerasga and pygad.torchga modules to make predictions. This makes it easier than using custom code each time a prediction is to be made.
- A new parameter called stop_criteria allows the user to specify one or more stop criteria to stop the evolution based on some conditions. Each criterion is passed as a str which has a stop word. The current 2 supported words are reach and saturate. reach stops the run() method if the fitness value is equal to or greater than a given fitness value. An example for reach is "reach_40" which stops the evolution if the fitness is >= 40. saturate means stop the evolution if the fitness saturates for a given number of consecutive generations. An example for saturate is "saturate_7" which means stop the run() method if the fitness does not change for 7 consecutive generations. A sketch is given after this list. Thanks to Rainer for asking about this feature: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/44
- A new bool parameter, defaulting to False, named save_solutions is added to the constructor of the pygad.GA class. If True, then all solutions in each generation are appended into an attribute called solutions which is a NumPy array.
- The plot_result() method is renamed to plot_fitness(). The users should migrate to the new name as the old name will be removed in the future.
- Four new optional parameters are added to the plot_fitness() function in the pygad.GA class which are font_size=14, save_dir=None, color="#3870FF", and plot_type="plot". Use font_size to change the font of the plot title and labels. save_dir accepts the directory to which the figure is saved. It defaults to None which means do not save the figure. color changes the color of the plot. plot_type changes the plot type which can be either "plot" (default), "scatter", or "bar". https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/47
- The default value of the title parameter in the plot_fitness() method is "PyGAD - Generation vs. Fitness" rather than "PyGAD - Iteration vs. Fitness".
- A new method named plot_new_solution_rate() creates, shows, and returns a figure showing the rate of new/unique solutions explored in each generation. It accepts the same parameters as the plot_fitness() method. This method only works when save_solutions=True in the pygad.GA class's constructor.
- A new method named plot_genes() creates, shows, and returns a figure to show how each gene changes per generation. It accepts similar parameters to the plot_fitness() method in addition to the graph_type, fill_color, and solutions parameters. The graph_type parameter can be either "plot" (default), "boxplot", or "histogram". fill_color accepts the fill color which works when graph_type is either "boxplot" or "histogram". solutions can be either "all" or "best" to decide whether all solutions or only best solutions are used.
- The gene_type parameter now supports controlling the precision of float data types. For a gene, rather than assigning just the data type like float, assign a list/tuple/numpy.ndarray with 2 elements where the first one is the type and the second one is the precision. For example, [float, 2] forces a gene with a value like 0.1234 to be 0.12. For more information, check the More about the gene_type Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-type-parameter.
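The sketch referenced above, combining stop_criteria with the new float precision of gene_type (placeholder fitness function and values; the two-argument fitness signature matches this release):

import pygad
import numpy

def fitness_func(solution, sol_idx):
    return 1.0 / (numpy.abs(numpy.sum(solution) - 42) + 0.000001)  # placeholder fitness

ga_instance = pygad.GA(num_generations=500,
                       num_parents_mating=4,
                       sol_per_pop=10,
                       num_genes=6,
                       fitness_func=fitness_func,
                       gene_type=[float, 2],                       # floats rounded to 2 decimal places
                       save_solutions=True,                        # required by plot_new_solution_rate()
                       stop_criteria=["reach_40", "saturate_7"])   # stop early on either condition
ga_instance.run()
ga_instance.plot_fitness()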
PyGAD 2.15.1¶
Release Date: 18 June 2021
- Fix a bug when keep_parents is set to a positive integer. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/49
PyGAD 2.15.2¶
Release Date: 18 June 2021
- Fix a bug when using the kerasga or torchga modules. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/51
PyGAD 2.16.0¶
Release Date: 19 June 2021
- A user-defined function can be passed to the mutation_type, crossover_type, and parent_selection_type parameters in the pygad.GA class to create custom mutation, crossover, and parent selection operators. Check the User-Defined Crossover, Mutation, and Parent Selection Operators section for more details. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/50
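A minimal sketch of a user-defined mutation operator, assuming the documented callable signature where the mutation function receives the offspring array and the pygad.GA instance and returns the mutated offspring (the noise-based mutation itself is just an illustration):

import pygad
import numpy

def fitness_func(solution, sol_idx):
    return float(numpy.sum(solution))  # placeholder fitness

# Custom mutation operator: add small Gaussian noise to one random gene of each offspring.
def mutation_func(offspring, ga_instance):
    for chromosome_idx in range(offspring.shape[0]):
        gene_idx = numpy.random.randint(low=0, high=offspring.shape[1])
        offspring[chromosome_idx, gene_idx] += numpy.random.normal(scale=0.5)
    return offspring

ga_instance = pygad.GA(num_generations=30,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=4,
                       fitness_func=fitness_func,
                       mutation_type=mutation_func)   # pass the callable instead of a string
ga_instance.run()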
PyGAD 2.16.1¶
Release Date: 28 September 2021
- The user can use the tqdm library to show a progress bar. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/50
import pygad
import numpy
import tqdm

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    fitness = 1.0 / (numpy.abs(output - desired_output) + 0.000001)
    return fitness

num_generations = 10000

with tqdm.tqdm(total=num_generations) as pbar:
    ga_instance = pygad.GA(num_generations=num_generations,
                           sol_per_pop=5,
                           num_parents_mating=2,
                           num_genes=len(equation_inputs),
                           fitness_func=fitness_func,
                           on_generation=lambda _: pbar.update(1))
    ga_instance.run()

ga_instance.plot_result()
But this does not work if the ga_instance will be pickled (i.e. the save() method will be called).
ga_instance.save("test")
To solve this issue, define a function and pass it to the on_generation parameter. In the next code, the on_generation_progress() function is defined which updates the progress bar.
import pygad
import numpy
import tqdm

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    fitness = 1.0 / (numpy.abs(output - desired_output) + 0.000001)
    return fitness

def on_generation_progress(ga):
    pbar.update(1)

num_generations = 100

with tqdm.tqdm(total=num_generations) as pbar:
    ga_instance = pygad.GA(num_generations=num_generations,
                           sol_per_pop=5,
                           num_parents_mating=2,
                           num_genes=len(equation_inputs),
                           fitness_func=fitness_func,
                           on_generation=on_generation_progress)
    ga_instance.run()

ga_instance.plot_result()
ga_instance.save("test")
- Solved the issue of unequal length between the solutions and solutions_fitness attributes when the save_solutions parameter is set to True. Now, the fitness of the last population is appended to the solutions_fitness array. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/64
- There was an issue of getting the length of these 4 variables (solutions, solutions_fitness, best_solutions, and best_solutions_fitness) doubled after each call of the run() method. This is solved by resetting these variables at the beginning of the run() method. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/62
- Bug fixes when adaptive mutation is used (mutation_type="adaptive"). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/65
PyGAD 2.16.2¶
Release Date: 2 February 2022
- A new instance attribute called previous_generation_fitness added in the pygad.GA class. It holds the fitness values of one generation before the fitness values saved in the last_generation_fitness attribute.
- Fixed an issue in the cal_pop_fitness() method in getting the correct indices of the previous parents. This is solved by using the previous generation's fitness saved in the new attribute previous_generation_fitness to return the parents' fitness values. Thanks to Tobias Tischhauser (M.Sc. - Mitarbeiter Institut EMS, Departement Technik, OST – Ostschweizer Fachhochschule, Switzerland) for detecting this bug.
PyGAD 2.16.3¶
Release Date: 2 February 2022
- Validate the fitness value returned from the fitness function. An exception is raised if something is wrong. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/67
PyGAD 2.17.0¶
Release Date: 8 July 2022
- An issue is solved when the gene_space parameter is given a fixed value, e.g. gene_space=[range(5), 4]. The second gene's value is static (4) which caused an exception.
- Fixed the issue where the allow_duplicate_genes parameter did not work when mutation is disabled (i.e. mutation_type=None). This is by checking for duplicates directly after crossover. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/39
- Solve an issue in the tournament_selection() method as the indices of the selected parents were incorrect. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/89
- Reuse the fitness values of the previously explored solutions rather than recalculating them. This feature only works if save_solutions=True.
- Parallel processing is supported. This is by the introduction of a new parameter named parallel_processing in the constructor of the pygad.GA class (see the sketch after this list). Thanks to @windowshopr for opening issue #78 at GitHub. Check the Parallel Processing in PyGAD section for more information and examples.
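The sketch referenced above for parallel_processing, assuming the documented forms of the parameter (an integer thread count, or a list such as ["thread", N] / ["process", N]); the fitness function is a placeholder:

import pygad
import numpy

def fitness_func(solution, sol_idx):
    # Imagine an expensive fitness computation here.
    return 1.0 / (numpy.abs(numpy.sum(solution) - 44) + 0.000001)

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=4,
                       sol_per_pop=20,
                       num_genes=6,
                       fitness_func=fitness_func,
                       parallel_processing=["thread", 4])   # use 4 threads for fitness calculation
ga_instance.run()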
PyGAD 2.18.0¶
Release Date: 9 September 2022
- Raise an exception if the sum of fitness values is zero while either roulette wheel or stochastic universal parent selection is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/129
- Initialize the value of the run_completed property to False. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/122
- The values of these properties are no longer reset with each call to the run() method: self.best_solutions, self.best_solutions_fitness, self.solutions, self.solutions_fitness. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/123. Now, the user has the flexibility of calling the run() method more than once while extending the data collected after each generation. Another advantage happens when the instance is loaded and the run() method is called, as the old fitness values are shown on the graph alongside the new fitness values. Read more in this section: Continue without Losing Progress.
- Thanks to Prof. Fernando Jiménez Barrionuevo (Dept. of Information and Communications Engineering, University of Murcia, Murcia, Spain) for editing this comment in the code: https://github.com/ahmedfgad/GeneticAlgorithmPython/commit/5315bbec02777df96ce1ec665c94dece81c440f4
- A bug fixed when crossover_type=None.
- Support of elitism selection through a new parameter named keep_elitism. It defaults to 1 which means for each generation keep only the best solution in the next generation. If assigned 0, then it has no effect. Read more in this section: Elitism Selection. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/74
- A new instance attribute named last_generation_elitism added to hold the elitism in the last generation.
- A new parameter called random_seed added to accept a seed for the random number generators. Credit to this issue https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/70 and Prof. Fernando Jiménez Barrionuevo. Read more in this section: Random Seed.
- Editing the pygad.TorchGA module to make sure the tensor data is moved from GPU to CPU. Thanks to Rasmus Johansson for opening this pull request: https://github.com/ahmedfgad/TorchGA/pull/2
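A minimal sketch of keep_elitism and random_seed (placeholder fitness function and values):

import pygad
import numpy

def fitness_func(solution, sol_idx):
    return -float(numpy.sum(solution ** 2))  # placeholder fitness: minimize the sum of squares

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=4,
                       sol_per_pop=10,
                       num_genes=5,
                       fitness_func=fitness_func,
                       keep_elitism=2,    # carry the 2 best solutions into the next generation
                       random_seed=42)    # reproducible runs
ga_instance.run()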
PyGAD 2.18.1¶
Release Date: 19 September 2022
- A bug fix when keep_elitism is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/132
PyGAD 2.18.2¶
Release Date: 14 February 2023
- Remove numpy.int and numpy.float from the list of supported data types. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/151 https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/152
- Call the on_crossover() callback function even if crossover_type is None. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138
- Call the on_mutation() callback function even if mutation_type is None. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138
PyGAD 2.18.3¶
Release Date: 14 February 2023
Bug fixes.
PyGAD 2.19.0¶
Release Date: 22 February 2023
- A new summary() method is supported to return a Keras-like summary of the PyGAD lifecycle.
- A new optional parameter called fitness_batch_size is supported to calculate the fitness in batches. If it is assigned the value 1 or None (default), then the normal flow is used where the fitness function is called for each individual solution. If the fitness_batch_size parameter is assigned a value satisfying this condition 1 < fitness_batch_size <= sol_per_pop, then the solutions are grouped into batches of size fitness_batch_size and the fitness function is called once for each batch. In this case, the fitness function must return a list/tuple/numpy.ndarray with a length equal to the number of solutions passed (see the sketch after this list). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/136
- The cloudpickle library (https://github.com/cloudpipe/cloudpickle) is used instead of the pickle library to pickle the pygad.GA objects. This solves the issue of having to redefine the functions (e.g. fitness function). The cloudpickle library is added as a dependency in the requirements.txt file. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/159
- Support of assigning methods to these parameters: fitness_func, crossover_type, mutation_type, parent_selection_type, on_start, on_fitness, on_parents, on_crossover, on_mutation, on_generation, and on_stop. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/92 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138
- Validating the output of the parent selection, crossover, and mutation functions.
- The built-in parent selection operators return the parents' indices as a NumPy array.
- The outputs of the parent selection, crossover, and mutation operators must be NumPy arrays.
- Fix an issue when allow_duplicate_genes=True. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/39
- Fix an issue creating scatter plots of the solutions' fitness.
- Sampling from a set() is no longer supported in Python 3.11. Instead, sampling happens from a list(). Thanks Marco Brenna for pointing to this issue.
- The lifecycle is updated to reflect that the new population's fitness is calculated at the end of the lifecycle, not at the beginning. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/154#issuecomment-1438739483
- There was an issue when save_solutions=True that caused the fitness function to be called for solutions already explored and having their fitness pre-calculated. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/160
- A new instance attribute named last_generation_elitism_indices added to hold the indices of the selected elitism. This attribute helps to re-use the fitness of the elitism instead of calling the fitness function.
- Fewer calls to the best_solution() method which in turn saves some calls to the fitness function.
- Some updates in the documentation to give more details about the cal_pop_fitness() method. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/79#issuecomment-1439605442
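The batch-fitness sketch referenced above. It assumes the batched fitness function of this release receives the group of solutions and their indices and returns one fitness value per solution (the GA instance became an extra first argument in 3.0.0); the fitness math is a placeholder:

import pygad
import numpy

equation_inputs = [4, -2, 3.5, 8]   # placeholder problem data
desired_output = 44

# Called once per batch of solutions instead of once per solution.
def fitness_batch(solutions, solutions_indices):
    outputs = numpy.sum(numpy.array(solutions) * equation_inputs, axis=1)
    return 1.0 / (numpy.abs(outputs - desired_output) + 0.000001)

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=4,
                       sol_per_pop=12,
                       num_genes=len(equation_inputs),
                       fitness_func=fitness_batch,
                       fitness_batch_size=4)   # 12 solutions -> 3 batched fitness calls per generation
ga_instance.run()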
PyGAD 2.19.1¶
Release Date: 22 February 2023
- Add the cloudpickle library as a dependency.
PyGAD 2.19.2¶
Release Date: 23 February 2023
- Fix an issue when parallel processing was used where the elitism solutions' fitness values were not re-used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/160#issuecomment-1441718184
PyGAD 3.0.0¶
Release Date: 8 April 2023
- The structure of the library is changed and some methods defined in the pygad.py module are moved to the pygad.utils, pygad.helper, and pygad.visualize submodules.
- The pygad.utils.parent_selection module has a class named ParentSelection where all the parent selection operators exist. The pygad.GA class extends this class.
- The pygad.utils.crossover module has a class named Crossover where all the crossover operators exist. The pygad.GA class extends this class.
- The pygad.utils.mutation module has a class named Mutation where all the mutation operators exist. The pygad.GA class extends this class.
- The pygad.helper.unique module has a class named Unique where some helper methods exist to solve duplicate genes and make sure every gene is unique. The pygad.GA class extends this class.
- The pygad.visualize.plot module has a class named Plot where all the methods that create plots exist. The pygad.GA class extends this class.
- Support of using the logging module to log the outputs to both the console and a text file instead of using the print() function. This is by assigning a logging.Logger to the new logger parameter. Check the Logging Outputs section for more information.
- A new instance attribute called logger to save the logger.
- The function/method passed to the fitness_func parameter accepts a new parameter that refers to the instance of the pygad.GA class (see the sketch after this list). Check this for an example: Use Functions and Methods to Build Fitness Function and Callbacks. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/163
- Update the documentation to include an example of using functions and methods to calculate the fitness and build callbacks. Check this for more details: Use Functions and Methods to Build Fitness Function and Callbacks. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/92#issuecomment-1443635003
- Validate the value passed to the initial_population parameter.
- Validate the type and length of the pop_fitness parameter of the best_solution() method.
- Some edits in the documentation. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/106
- Fix an issue when building the initial population as (some) genes have their values taken from the mutation range (defined by the parameters random_mutation_min_val and random_mutation_max_val) instead of using the parameters init_range_low and init_range_high.
- The summary() method returns the summary as a single-line string. Just log/print the returned string to see it properly.
- The callback_generation parameter is removed. Use the on_generation parameter instead.
- There was an issue when using the parallel_processing parameter with Keras and PyTorch. As Keras/PyTorch are not thread-safe, the predict() method gives incorrect and weird results when more than 1 thread is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/145 https://github.com/ahmedfgad/TorchGA/issues/5 https://github.com/ahmedfgad/KerasGA/issues/6. Thanks to this StackOverflow answer.
- Replace numpy.float by float in the 2 parent selection operators roulette wheel and stochastic universal. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/168
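The sketch referenced above of the 3.0.0 fitness-function signature, where the pygad.GA instance is passed as the first argument (problem data are placeholders):

import pygad
import numpy

equation_inputs = [4, -2, 3.5]   # placeholder problem data
desired_output = 44

# Since PyGAD 3.0.0 the fitness function also receives the pygad.GA instance.
def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    return 1.0 / (numpy.abs(output - desired_output) + 0.000001)

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=len(equation_inputs),
                       fitness_func=fitness_func)
ga_instance.run()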
PyGAD 3.0.1¶
Release Date: 20 April 2023
- Fix an issue with passing a user-defined function/method for parent selection. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/179
PyGAD 3.1.0¶
Release Date: 20 June 2023
- Fix a bug when the initial population has duplicate genes if a nested gene space is used.
- The gene_space parameter can no longer be assigned a tuple.
- Fix a bug when the gene_space parameter has a member of type tuple.
- A new instance attribute called gene_space_unpacked which has the unpacked gene_space. It is used to solve duplicates. Infinite ranges in the gene_space are unpacked to a limited number of values (e.g. 100).
- Bug fixes when creating the initial population using the gene_space attribute.
- When a dict is used with the gene_space attribute, the new gene value was calculated by summing 2 values: 1) the value sampled from the dict 2) a random value returned from the random mutation range defined by the 2 parameters random_mutation_min_val and random_mutation_max_val. This might cause the gene value to exceed the range limit defined in the gene_space. To respect the gene_space range, this release only returns the value from the dict without summing it with a random value.
- Formatting the strings using f-strings instead of the format() method. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/189
- In the __init__() of the pygad.GA class, the logged error messages are handled using a try-except block instead of repeating the logger.error() command. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/189
- A new class named CustomLogger is created in the pygad.cnn module to create a default logger using the logging module assigned to the logger attribute. This class is extended in all other classes in the module. The constructors of these classes have a new parameter named logger which defaults to None. If no logger is passed, then the default logger in the CustomLogger class is used.
- Except for the pygad.nn module, the print() function in all other modules is replaced by the logging module to log messages.
- The callback functions/methods on_fitness(), on_parents(), on_crossover(), and on_mutation() can return values. These returned values override the corresponding properties. The output of on_fitness() overrides the population fitness. The on_parents() function/method must return 2 values representing the parents and their indices. The output of on_crossover() overrides the crossover offspring. The output of on_mutation() overrides the mutation offspring.
- Fix a bug when adaptive mutation is used while fitness_batch_size > 1. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/195
- When allow_duplicate_genes=False and a user-defined gene_space is used, it sometimes happens that there is no room to solve the duplicates between 2 genes by simply replacing the value of one gene by another gene. This release tries to solve such duplicates by looking for a third gene that will help in solving the duplicates. Check this section for more information.
- Use probabilities to select parents when using the rank parent selection method. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/205
- The 2 parameters random_mutation_min_val and random_mutation_max_val can accept iterables (list/tuple/numpy.ndarray) with length equal to the number of genes. This enables customizing the mutation range for each individual gene (see the sketch after this list). https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/198
- The 2 parameters init_range_low and init_range_high can accept iterables (list/tuple/numpy.ndarray) with length equal to the number of genes. This enables customizing the initial range for each individual gene when creating the initial population.
- The data parameter in the predict() function of the pygad.kerasga module can be assigned a data generator. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/115 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/207
- The predict() function of the pygad.kerasga module accepts 3 optional parameters: batch_size=None, verbose=0, and steps=None. Check the documentation of the Keras Model.predict() method for more information. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/207
- The documentation is updated to explain how mutation works when gene_space is used with int or float data types. Check this section. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/198
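The sketch referenced above for per-gene initial and mutation ranges (placeholder fitness function and ranges; the three-argument fitness signature applies since 3.0.0):

import pygad
import numpy

def fitness_func(ga_instance, solution, solution_idx):
    return float(numpy.sum(solution))  # placeholder fitness

ga_instance = pygad.GA(num_generations=30,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=3,
                       fitness_func=fitness_func,
                       # Per-gene initial ranges and per-gene mutation ranges.
                       init_range_low=[0, 10, -5],
                       init_range_high=[1, 20, 5],
                       random_mutation_min_val=[-0.1, -1, -0.5],
                       random_mutation_max_val=[0.1, 1, 0.5])
ga_instance.run()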
PyGAD 3.2.0¶
Release Date: 7 September 2023
- A new module pygad.utils.nsga2 is created that has the NSGA2 class that includes the functionalities of NSGA-II. The class has these methods: 1) get_non_dominated_set() 2) non_dominated_sorting() 3) crowding_distance() 4) sort_solutions_nsga2(). Check this section for an example.
- Support of multi-objective optimization using Non-Dominated Sorting Genetic Algorithm II (NSGA-II) using the NSGA2 class in the pygad.utils.nsga2 module. Just return a list, tuple, or numpy.ndarray from the fitness function and the library will consider the problem as multi-objective optimization (see the sketch after this list). All the objectives are expected to be maximization. Check this section for an example.
- The parent selection methods and adaptive mutation are edited to support multi-objective optimization.
- Two new NSGA-II parent selection methods are supported in the pygad.utils.parent_selection module: 1) Tournament selection for NSGA-II 2) NSGA-II selection.
- The plot_fitness() method in the pygad.plot module has a new optional parameter named label to accept the labels of the plots. This is only used for multi-objective problems. Otherwise, it is ignored. It defaults to None and accepts a list, tuple, or numpy.ndarray. The labels are used in a legend inside the plot.
- The default color in the methods of the pygad.plot module is changed to the greenish #64f20c color.
- A new instance attribute named pareto_fronts added to the pygad.GA instances that holds the Pareto fronts when solving a multi-objective problem.
- The gene_type accepts a list, tuple, or numpy.ndarray for integer data types given that the precision is set to None (e.g. gene_type=[float, [int, None]]).
- In the cal_pop_fitness() method, the fitness value is re-used if save_best_solutions=True and the solution is found in the best_solutions attribute. These parameters can also help re-use the fitness of a solution instead of calling the fitness function: keep_elitism, keep_parents, and save_solutions.
- The value 99999999999 is replaced by float('inf') in the 2 methods wheel_cumulative_probs() and stochastic_universal_selection() inside the pygad.utils.parent_selection.ParentSelection class.
- The plot_result() method in the pygad.visualize.plot.Plot class is removed. Instead, please use plot_fitness() if you did not upgrade yet.
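The multi-objective sketch referenced above: returning an iterable of per-objective fitness values (all maximized) switches the library to NSGA-II, and "nsga2" is one of the two new parent selection types listed above (the two toy objectives are placeholders):

import pygad
import numpy

# Two objectives, both to be maximized.
def fitness_func(ga_instance, solution, solution_idx):
    obj1 = -float(numpy.abs(numpy.sum(solution) - 10))     # gene sum close to 10
    obj2 = -float(numpy.abs(solution[0] - solution[-1]))   # first and last genes close to each other
    return [obj1, obj2]

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=10,
                       sol_per_pop=20,
                       num_genes=5,
                       fitness_func=fitness_func,
                       parent_selection_type="nsga2")   # NSGA-II parent selection
ga_instance.run()
print(ga_instance.best_solution())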
PyGAD 3.3.0¶
Release Date: 29 January 2024
- Solve bugs when multi-objective optimization is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/238
- When the stop_criteria parameter is used with the reach keyword, then multiple numeric values can be passed when solving a multi-objective problem. For example, if a problem has 3 objective functions, then stop_criteria="reach_10_20_30" means the GA stops if the fitness values of the 3 objectives are at least 10, 20, and 30, respectively. The numeric values must match the number of objective functions. If a single value is found (e.g. stop_criteria="reach_5") when solving a multi-objective problem, then it is used across all the objectives. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/238
- The delay_after_gen parameter is now deprecated and will be removed in a future release. If it is necessary to have a time delay after each generation, then assign a callback function/method to the on_generation parameter to pause the evolution.
- Parallel processing now supports calculating the fitness during adaptive mutation. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/201
- The population size can be changed during runtime by changing all the parameters that would affect the size of anything used by the GA. For more information, check the Change Population Size during Runtime section. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/234
- When a dictionary exists in the gene_space parameter without a step, then mutation occurs by adding a random value to the gene value. The random value is generated based on the 2 parameters random_mutation_min_val and random_mutation_max_val. For more information, check the How Mutation Works with the gene_space Parameter? section. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/229
- Add object as a supported data type for int (GA.supported_int_types) and float (GA.supported_float_types). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/174
- Use the raise clause instead of sys.exit(-1) to terminate the execution. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/213
- Fix a bug when multi-objective optimization is used with batch fitness calculation (e.g. fitness_batch_size set to a non-zero number).
- Fix a bug in the pygad.py script when finding the index of the best solution. It did not work properly with multi-objective optimization where self.best_solutions_fitness has multiple columns.
self.best_solution_generation = numpy.where(numpy.array(self.best_solutions_fitness) == numpy.max(numpy.array(self.best_solutions_fitness)))[0][0]
PyGAD 3.3.1¶
Release Date: 17 February 2024
- After the last generation and before the run() method completes, update the 2 instance attributes: 1) last_generation_parents 2) last_generation_parents_indices. This is to keep the list of parents up-to-date with the latest population fitness last_generation_fitness. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/275
- 4 methods with names starting with run_ are added. Their purpose is to keep the main loop inside the run() method clean. Check the Other Methods section for more information.
PyGAD 3.4.0¶
Release Date: 07 January 2025
- The delay_after_gen parameter is removed from the pygad.GA class constructor. As a result, it is no longer an attribute of the pygad.GA class instances. To add a delay after each generation, apply it inside the on_generation callback. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/283
- In the single_point_crossover() method of the pygad.utils.crossover.Crossover class, all the random crossover points are returned before the for loop. This is by calling the numpy.random.randint() function only once before the loop to generate all the K points (where K is the offspring size). This is compared to calling the numpy.random.randint() function inside the for loop K times, once for each individual offspring.
- Bug fix in the examples/example_custom_operators.py script. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/285
- While making predictions using the pygad.torchga.predict() function, no gradients are calculated.
- The gene_type parameter of the pygad.helper.unique.Unique.unique_int_gene_from_range() method accepts the type of the current gene only instead of the full gene_type list.
- Created a new method called unique_float_gene_from_range() inside the pygad.helper.unique.Unique class to find a unique floating-point number from a range.
- Fix a bug in the pygad.helper.unique.Unique.unique_gene_by_space() method to return the numeric value only instead of a NumPy array.
- Refactoring the pygad/helper/unique.py script to remove duplicate code and reformatting the docstrings.
- The plot_pareto_front_curve() method added to the pygad.visualize.plot.Plot class to visualize the Pareto front for multi-objective problems. It only supports 2 objectives. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/279
- Fix a bug converting a nested NumPy array to a nested list. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/300
- The Matplotlib library is only imported when a method inside the pygad/visualize/plot.py script is used. This is more efficient than using import matplotlib.pyplot at the module level, as this causes it to be imported when pygad is imported even when it is not needed. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/292
- Fix a bug when the minus sign (-) is used inside the stop_criteria parameter (e.g. stop_criteria=["saturate_10", "reach_-0.5"]). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/296
- Make sure self.best_solutions is a list of lists inside the cal_pop_fitness method. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/293
- Fix a bug where the cal_pop_fitness() method was using the previous_generation_fitness attribute to return the parents' fitness. This instance attribute was not using the fitness of the latest population, but instead the fitness of the population before the last one. The issue is solved by updating the previous_generation_fitness attribute to the latest population fitness before the GA completes. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/291
PyGAD 3.5.0¶
Release Date: 08 July 2025
- Fix a bug when the minus sign (-) is used inside the stop_criteria parameter for multi-objective problems. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/314 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/323
- Fix a bug when the stop_criteria parameter is passed as an iterable (e.g. list) for multi-objective problems (e.g. ['reach_50_60', 'reach_20,40']). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/314
- Call the get_matplotlib() function from the plot_genes() method inside the pygad.visualize.plot.Plot class to import the matplotlib library. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/315
- Create a new helper method called select_unique_value() inside the pygad/helper/unique.py script to select a unique gene from an array of values.
- Create a new helper method called get_random_mutation_range() inside the pygad/utils/mutation.py script that returns the random mutation range (min and max) for a single gene by its index.
- Create a new helper method called change_random_mutation_value_dtype() inside the pygad/utils/mutation.py script that changes the data type of the value used to apply random mutation.
- Create a new helper method called round_random_mutation_value() inside the pygad/utils/mutation.py script that rounds the value used to apply random mutation.
pygad/helper/misc.pyscript with a class calledHelperthat has the following helper methods:change_population_dtype_and_round(): For each gene in thepopulation, round the gene value and change the data type.change_gene_dtype_and_round(): Round the change the datatype of a single gene.mutation_change_gene_dtype_and_round(): Decides whethermutation is done by replacement or not. Then it rounds andchange the data type of the new gene value.validate_gene_constraint_callable_output(): Validates theoutput of the user-defined callable/function that checks whetherthe gene constraint defined in thegene_constraintparameteris satisfied or not.get_gene_dtype(): Returns the gene data type from thegene_typeinstance attribute.get_random_mutation_range(): Returns the random mutationrange using therandom_mutation_min_valandrandom_mutation_min_valinstance attributes.get_initial_population_range(): Returns the initialpopulation values range using theinit_range_lowandinit_range_highinstance attributes.generate_gene_value_from_space(): Generates/selects a valuefor a gene using thegene_spaceinstance attribute.generate_gene_value_randomly(): Generates a random value forthe gene. Only used ifgene_spaceisNone.generate_gene_value(): Generates a value for the gene. Itchecks whethergene_spaceisNoneand calls eithergenerate_gene_value_randomly()orgenerate_gene_value_from_space().filter_gene_values_by_constraint(): Receives a list ofvalues for a gene. Then it filters such values using the geneconstraint.get_valid_gene_constraint_values(): Selects one valid genevalue that satisfy the gene constraint. It simply callsgenerate_gene_value()to generate some gene values then itfilters such values usingfilter_gene_values_by_constraint().
Create a new helper method called
mutation_process_random_value()inside thepygad/utils/mutation.pyscript that generates constrained randomvalues for mutation. It calls eithergenerate_gene_value()orget_valid_gene_constraint_values()based on whether thegene_constraintparameter is used or not.A new parameter called
gene_constraintis added. It accepts alist of callables (i.e. functions) acting as constraints for thegene values. Before selecting a value for a gene, the callable iscalled to ensure the candidate value is valid. Check theGeneConstraintsection for more information.https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/119A new parameter called
sample_sizeis added. To select a genevalue that respects a constraint, this variable defines the size ofthe sample from which a value is selected randomly. Useful if eitherallow_duplicate_genesorgene_constraintis used. Aninstance attribute of the same name is created in the instances ofthepygad.GAclass. Check thesample_sizeParametersection for more information.Use the
sample_sizeparameter instead ofnum_trialsin themethodssolve_duplicate_genes_randomly()andunique_float_gene_from_range()inside thepygad/helper/unique.pyscript. It is the maximum number ofvalues to generate as the search space when looking for a uniquefloat value out of a range.Fixed a bug in population initialization when
allow_duplicate_genes=False. Previously, gene values werechecked for duplicates before rounding, which could allownear-duplicates like 7.61 and 7.62 to pass. After rounding (e.g.,both becoming 7.6), this resulted in unintended duplicates. The fixensures gene values are now rounded before duplicate checks,preventing such cases.More tests are created.
More examples are created.
Edited the
sort_solutions_nsga2()method in thepygad/utils/nsga2.pyscript to accept an optional parametercalledfind_best_solutionwhen calling this method just to findthe best solution.Fixed a bug while applying the non-dominated sorting in the
get_non_dominated_set()method inside thepygad/utils/nsga2.pyscript. It was swapping the non-dominatedand dominated sets. In other words, it used the non-dominated set asif it is the dominated set and vice versa. All the calls to thismethod were edited accordingly.https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/320.Fix a bug retrieving in the
best_solution()method whenretrieving the best solution for multi-objective problems.https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/331
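As an illustration of the two new parameters, here is a minimal, hedged sketch (not taken from the official examples). It assumes each gene_constraint callable receives the current solution and a NumPy array of candidate values for its gene and returns only the values that satisfy the constraint, with None used for unconstrained genes; confirm the exact callable signature in the Gene Constraint section. The fitness function and numbers are made up for illustration.

import numpy
import pygad

def fitness_func(ga_instance, solution, solution_idx):
    # Toy fitness: maximize the sum of the genes.
    return numpy.sum(solution)

# Assumed constraint style: each callable filters the candidate values of one gene.
# Gene 0 must be >= 5, gene 1 must be non-negative, gene 2 is unconstrained.
gene_constraint = [lambda solution, values: values[values >= 5],
                   lambda solution, values: values[values >= 0],
                   None]

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=4,
                       sol_per_pop=10,
                       num_genes=3,
                       fitness_func=fitness_func,
                       gene_constraint=gene_constraint,
                       # sample_size controls how many candidate values are drawn when
                       # searching for a valid (or unique) gene value.
                       sample_size=100,
                       allow_duplicate_genes=False)
ga_instance.run()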
PyGAD Projects at GitHub¶
The PyGAD library is available at PyPI at this page: https://pypi.org/project/pygad. PyGAD is built out of a number of open-source GitHub projects. A brief note about these projects is given in the next subsections.
GeneticAlgorithmPython¶
GitHub Link: https://github.com/ahmedfgad/GeneticAlgorithmPython
GeneticAlgorithmPython is the first project; it is an open-source Python 3 project for implementing the genetic algorithm based on NumPy.
NumPyANN¶
GitHub Link: https://github.com/ahmedfgad/NumPyANN
NumPyANN builds artificial neural networks in Python 3 using NumPy from scratch. The purpose of this project is to implement only the forward pass of a neural network, without using a training algorithm. Currently, it only supports classification; regression will be supported later. Moreover, only one class is supported per sample.
NeuralGenetic¶
GitHub Link: https://github.com/ahmedfgad/NeuralGenetic
NeuralGenetic trains neural networks using the genetic algorithm based on the previous 2 projects GeneticAlgorithmPython and NumPyANN.
NumPyCNN¶
GitHub Link: https://github.com/ahmedfgad/NumPyCNN
NumPyCNN builds convolutional neural networks using NumPy. The purpose of this project is to implement only the forward pass of a convolutional neural network, without using a training algorithm.
CNNGenetic¶
GitHub Link: https://github.com/ahmedfgad/CNNGenetic
CNNGenetic trains convolutional neural networks using the genetic algorithm. It uses the GeneticAlgorithmPython project for building the genetic algorithm.
KerasGA¶
GitHub Link: https://github.com/ahmedfgad/KerasGA
KerasGA trains Keras models using the genetic algorithm. It uses the GeneticAlgorithmPython project for building the genetic algorithm.
TorchGA¶
GitHub Link: https://github.com/ahmedfgad/TorchGA
TorchGA trains PyTorch models using the genetic algorithm. It uses the GeneticAlgorithmPython project for building the genetic algorithm.
Stack Overflow Questions about PyGAD¶
How do I proceed to load a ga_instance as “.pkl” format in PyGad?¶
Binary Classification NN Model Weights not being Trained in PyGAD¶
How to solve TSP problem using pyGAD package?¶
How can I save a matplotlib plot that is the output of a function in jupyter?¶
How do I query the best solution of a pyGAD GA instance?¶
Multi-Input Multi-Output in Genetic algorithm (python)¶
https://www.linkedin.com/pulse/validation-short-term-parametric-trading-model-genetic-landolfi
https://itchef.ru/articles/397758
https://blog.csdn.net/sinat_38079265/article/details/108449614
Submitting Issues¶
If there is an issue using PyGAD, then use any of your preferred options to discuss that issue.
One way is submitting an issue to this GitHub project (github.com/ahmedfgad/GeneticAlgorithmPython) in case something is not working properly or to ask questions.
If this is not a suitable option for you, then check the Contact Us section for more contact details.
Ask for Feature¶
PyGAD is actively developed with the goal of building a dynamic library that supports a wide range of problems to be optimized using the genetic algorithm.
To ask for a new feature, either submit an issue to this GitHub project (github.com/ahmedfgad/GeneticAlgorithmPython) or send an e-mail to ahmed.f.gad@gmail.com.
Also check the Contact Us section for more contact details.
Projects Built using PyGAD¶
If you created a project that uses PyGAD, then we can support you by mentioning this project here in PyGAD’s documentation.
To do that, please send a message to ahmed.f.gad@gmail.com or check the Contact Us section for more contact details.
Within your message, please send the following details:
Project title
Brief description
Preferably, a link that directs the readers to your project
Tutorials about PyGAD¶
Adaptive Mutation in Genetic Algorithm with Python Examples¶
In this tutorial, we’ll see why mutation with a fixed number of genes is bad, and how to replace it with adaptive mutation. Using the PyGAD Python 3 library, we’ll discuss a few examples that use both random and adaptive mutation.
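For reference, here is a minimal sketch of enabling adaptive mutation in PyGAD, assuming the PyGAD 3.x fitness-function signature; the fitness function and percentages are illustrative only. With mutation_type="adaptive", the mutation_percent_genes (or mutation_num_genes / mutation_probability) parameter takes two values: the first is applied to low-quality solutions and the second to high-quality solutions.

import numpy
import pygad

def fitness_func(ga_instance, solution, solution_idx):
    # Toy fitness: maximize the sum of the genes.
    return numpy.sum(solution)

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=4,
                       sol_per_pop=10,
                       num_genes=6,
                       fitness_func=fitness_func,
                       mutation_type="adaptive",
                       # Two values: more aggressive mutation for low-quality solutions,
                       # milder mutation for high-quality solutions.
                       mutation_percent_genes=[25, 10])
ga_instance.run()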
Clustering Using the Genetic Algorithm in Python¶
This tutorial discusses how the genetic algorithm is used to cluster data, starting from random clusters and running until the optimal clusters are found. We’ll start by briefly revising the K-means clustering algorithm to point out its weak points, which are later solved by the genetic algorithm. The code examples in this tutorial are implemented in Python using the PyGAD library.
Working with Different Genetic Algorithm Representations in Python¶
Depending on the nature of the problem being optimized, the genetic algorithm (GA) supports two different gene representations: binary and decimal. The binary GA has only two values for its genes, which are 0 and 1. This is easier to manage as its gene values are limited compared to the decimal GA, for which we can use different formats like float or integer, and limited or unlimited ranges.
This tutorial discusses how the PyGAD library supports the two GA representations, binary and decimal.
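As a rough sketch (not taken from the tutorial), a binary representation can be obtained by restricting every gene to the values 0 and 1 through gene_space, while a decimal representation can use a float gene_type with an initialization range; the fitness function below is a placeholder.

import numpy
import pygad

def fitness_func(ga_instance, solution, solution_idx):
    # Placeholder fitness: maximize the sum of the genes.
    return numpy.sum(solution)

# Binary GA: every gene is either 0 or 1.
binary_ga = pygad.GA(num_generations=50,
                     num_parents_mating=4,
                     sol_per_pop=8,
                     num_genes=10,
                     fitness_func=fitness_func,
                     gene_space=[0, 1],
                     gene_type=int)

# Decimal (floating-point) GA: genes are floats drawn from a range.
decimal_ga = pygad.GA(num_generations=50,
                      num_parents_mating=4,
                      sol_per_pop=8,
                      num_genes=10,
                      fitness_func=fitness_func,
                      gene_type=float,
                      init_range_low=-4.0,
                      init_range_high=4.0)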
5 Genetic Algorithm Applications Using PyGAD¶
This tutorial introduces PyGAD, an open-source Python library for implementing the genetic algorithm and training machine learning algorithms. PyGAD supports 19 parameters for customizing the genetic algorithm for various applications.
Within this tutorial we’ll discuss 5 different applications of the genetic algorithm and build them using PyGAD.
Train Neural Networks Using a Genetic Algorithm in Python with PyGAD¶
The genetic algorithm (GA) is a biologically-inspired optimization algorithm. It has gained importance in recent years, as it is simple while also solving complex problems like travel route optimization, training machine learning algorithms, working with single and multi-objective problems, game playing, and more.
Deep neural networks are inspired by the idea of how the biological brain works. They are universal function approximators, capable of simulating any function, and are now used to solve the most complex problems in machine learning. What’s more, they’re able to work with all types of data (images, audio, video, and text).
Genetic algorithms (GAs) and neural networks (NNs) are similar in that both are biologically-inspired techniques. This similarity motivates us to create a hybrid of the two to see whether a GA can train NNs with high accuracy.
This tutorial uses PyGAD, a Python library that supports building and training NNs using a GA. PyGAD offers both classification and regression NNs.
Building a Game-Playing Agent for CoinTex Using the Genetic Algorithm¶
In this tutorial we’ll see how to build a game-playing agent using only the genetic algorithm to play a game called CoinTex, which is developed in the Kivy Python framework. The objective of CoinTex is to collect the randomly distributed coins while avoiding collisions with fire and monsters (which move randomly). The source code of CoinTex can be found on GitHub.
The genetic algorithm is the only AI used here; there is no other machine/deep learning model used with it. We’ll implement the genetic algorithm using PyGAD. This tutorial starts with a quick overview of CoinTex followed by a brief explanation of the genetic algorithm, and how it can be used to create the playing agent. Finally, we’ll see how to implement these ideas in Python.
The source code of the genetic algorithm agent is available here, and you can download the code used in this tutorial from here.
How To Train Keras Models Using the Genetic Algorithm with PyGAD¶
PyGAD is an open-source Python library for building the genetic algorithm and training machine learning algorithms. It offers a wide range of parameters to customize the genetic algorithm to work with different types of problems.
PyGAD has its own modules that support building and training neural networks (NNs) and convolutional neural networks (CNNs). Although these modules work well, they are implemented in Python without any additional optimization measures, which leads to comparatively high computational times even for simple problems.
The latest PyGAD version, 2.8.0 (released on 20 September 2020), supports a new module to train Keras models. Even though Keras is built in Python, it’s fast. The reason is that Keras uses TensorFlow as a backend, and TensorFlow is highly optimized.
This tutorial discusses how to train Keras models using PyGAD. The discussion includes building Keras models using either the Sequential Model or the Functional API, building an initial population of Keras model parameters, creating an appropriate fitness function, and more.
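A condensed sketch of that workflow is shown below, assuming the pygad.kerasga module (its KerasGA class and model_weights_as_matrix() helper) and the PyGAD 3.x fitness-function signature; the tiny regression model and data are invented for illustration, not taken from the tutorial.

import numpy
import tensorflow.keras
import pygad
import pygad.kerasga

# A tiny Keras model built with the Sequential API (illustrative only).
model = tensorflow.keras.Sequential([
    tensorflow.keras.layers.Dense(4, activation="relu", input_shape=(2,)),
    tensorflow.keras.layers.Dense(1, activation="linear")])

data_inputs = numpy.array([[0.1, 0.2], [0.4, 0.6], [0.5, 0.8]])
data_outputs = numpy.array([[0.3], [1.0], [1.3]])

# Build an initial population where each solution is one flattened vector of model weights.
keras_ga = pygad.kerasga.KerasGA(model=model, num_solutions=10)

def fitness_func(ga_instance, solution, solution_idx):
    # Restore the weights encoded in the solution, then score the model.
    weights_matrix = pygad.kerasga.model_weights_as_matrix(model=model, weights_vector=solution)
    model.set_weights(weights=weights_matrix)
    predictions = model.predict(data_inputs)
    mae = numpy.mean(numpy.abs(data_outputs - predictions))
    return 1.0 / (mae + 1e-8)  # higher fitness for lower error

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=5,
                       initial_population=keras_ga.population_weights,
                       fitness_func=fitness_func)
ga_instance.run()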
Train PyTorch Models Using Genetic Algorithm with PyGAD¶
PyGAD is a genetic algorithm Python 3 library for solving optimization problems. One of these problems is training machine learning algorithms.
PyGAD has a module called pygad.kerasga. It trains Keras models using the genetic algorithm. On January 3rd, 2021, the new PyGAD 2.10.0 release brought a new module called pygad.torchga to train PyTorch models. It’s very easy to use, but there are a few tricky steps.
So, in this tutorial, we’ll explore how to use PyGAD to train PyTorch models.
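Below is a comparably small sketch for the PyTorch side, assuming the pygad.torchga module (its TorchGA class and model_weights_as_dict() helper) and the PyGAD 3.x fitness-function signature; the toy model, data, and loss are invented for the example.

import numpy
import torch
import pygad
import pygad.torchga

# A toy PyTorch model (illustrative only).
model = torch.nn.Sequential(torch.nn.Linear(2, 4),
                            torch.nn.ReLU(),
                            torch.nn.Linear(4, 1))

data_inputs = torch.tensor([[0.1, 0.2], [0.4, 0.6], [0.5, 0.8]])
data_outputs = torch.tensor([[0.3], [1.0], [1.3]])
loss_function = torch.nn.L1Loss()

# Build an initial population of flattened model weights.
torch_ga = pygad.torchga.TorchGA(model=model, num_solutions=10)

def fitness_func(ga_instance, solution, solution_idx):
    # Rebuild the model weights from the flat solution vector and evaluate the model.
    weights_dict = pygad.torchga.model_weights_as_dict(model=model, weights_vector=solution)
    model.load_state_dict(weights_dict)
    predictions = model(data_inputs)
    loss = loss_function(predictions, data_outputs).detach().numpy() + 1e-8
    return 1.0 / loss  # higher fitness for lower loss

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=5,
                       initial_population=torch_ga.population_weights,
                       fitness_func=fitness_func)
ga_instance.run()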
A Guide to Genetic ‘Learning’ Algorithms for Optimization¶
PyGAD in Other Languages¶
French¶
How genetic algorithms can compete with gradient descent and backprop
Although the standard way to train neural networks is gradient descent and backpropagation, there are other players in the game. One of them is evolutionary algorithms, such as genetic algorithms.
Use a genetic algorithm to train a simple neural network to solve the OpenAI CartPole game. In this article, we will train a simple neural network to solve the OpenAI CartPole game. I will use PyTorch and PyGAD.
Spanish¶
How genetic algorithms can compete with gradient descent and backprop
Although the standard way to train neural networks is gradient descent and backpropagation, there are other players in the game; one of them is evolutionary algorithms, such as genetic algorithms.
Use a genetic algorithm to train a simple neural network to solve the OpenAI CartPole game. In this article, we will train a simple neural network to solve OpenAI CartPole. I will use PyTorch and PyGAD.
Korean¶
[PyGAD] Trying Out a Genetic Algorithm in Python¶
I have not tried every Python package for genetic algorithms, but this one looked extensible and I had something to try it on, so I took a look.
What impressed me most about this package is that hyperparameter search for a neural network can be done with a GA instead of gradient descent.
Personally, I think this can serve both as a way to find reasonably good initial values and as an alternative when the loss is hard to optimize with gradient descent.
The overall flow is as follows.
To be honest, I do not yet fully understand the flow or each parameter.
Turkish¶
How To Train Keras Models Using the Genetic Algorithm with PyGAD¶
This is a translation of an original English tutorial published at Paperspace: How To Train Keras Models Using the Genetic Algorithm with PyGAD
PyGAD is an open-source Python library used to build the genetic algorithm and train machine learning algorithms. It offers a wide range of parameters to customize the genetic algorithm to work with different types of problems.
PyGAD has its own modules that support building and training neural networks (NNs) and convolutional neural networks (CNNs). Although these modules work well, they are implemented in Python without any additional optimization measures. This leads to comparatively high computational times even for simple problems.
The latest PyGAD version, 2.8.0 (released on 20 September 2020), supports a new module to train Keras models. Even though Keras is built in Python, it is fast. The reason is that Keras uses TensorFlow as a backend, and TensorFlow is highly optimized.
This tutorial explains how to train Keras models using PyGAD. The discussion includes building Keras models using either the Sequential Model or the Functional API, building an initial population of Keras model parameters, creating an appropriate fitness function, and more.
Hungarian¶
TensorFlow Basics 10: Breeding Neural Networks with a Genetic Algorithm Using PyGAD and OpenAI Gym¶
To put genetic algorithms into context, let us briefly review how gradient descent and backpropagation work, which is the standard way to train neural networks. You can read my article about that here.
To breed the networks we use the library called PyGAD, so first of all we need to install it, along with TensorFlow and Gym, which come pre-installed in Colab.
PyGAD itself is a completely general system for running genetic algorithms. Its extension is KerasGA, which helps run the general engine on TensorFlow (Keras) neural networks. The KerasGA object created on line 47 is part of this extension and is used to create a population of the size given in the second parameter from the model passed as the first parameter. Since our network has 386 adjustable parameters, our DNA here will consist of 386 elements. The population size is 10 individuals, so our initial population will be a 10x386 matrix. We pass this in line 51 via the initial_population parameter.
Russian¶
PyGAD: a Library for Implementing the Genetic Algorithm¶
PyGAD is a library for implementing the genetic algorithm. In addition, the library provides access to optimized implementations of machine learning algorithms. PyGAD is developed in Python 3.
The PyGAD library supports different types of crossover, mutation, and parent selection. PyGAD makes it possible to optimize problems with the genetic algorithm by customizing the fitness function.
Besides the genetic algorithm, the library contains optimized implementations of machine learning algorithms. At the moment, PyGAD supports building and training neural networks for classification tasks.
The library is under active development. The authors plan to add functionality for solving binary problems and to implement new algorithms.
PyGAD was developed with Python 3.7.3. Its dependencies include NumPy for creating and manipulating arrays and Matplotlib for visualization. One of the use cases of the tool is optimizing weights that satisfy a given function.
Research Papers using PyGAD¶
A number of research papers have used PyGAD; here are some of them:
Alberto Meola, Manuel Winkler, Sören Weinrich, Metaheuristic optimization of data preparation and machine learning hyperparameters for prediction of dynamic methane production, Bioresource Technology, Volume 372, 2023, 128604, ISSN 0960-8524.
Jaros, Marta, and Jiri Jaros. “Performance-Cost Optimization of Moldable Scientific Workflows.”
Thorat, Divya. “Enhanced genetic algorithm to reduce makespan of multiple jobs in map-reduce application on serverless platform”. Diss. Dublin, National College of Ireland, 2020.
Koch, Chris, and Edgar Dobriban. “AttenGen: Generating Live Attenuated Vaccine Candidates using Machine Learning.” (2021).
Bhardwaj, Bhavya, et al. “Windfarm optimization using Nelder-Mead and Particle Swarm optimization.” 2021 7th International Conference on Electrical Energy Systems (ICEES). IEEE, 2021.
Bernardo, Reginald Christian S. and J. Said. “Towards a model-independent reconstruction approach for late-time Hubble data.” (2021).
Duong, Tri Dung, Qian Li, and Guandong Xu. “Prototype-based Counterfactual Explanation for Causal Classification.” arXiv preprint arXiv:2105.00703 (2021).
Farrag, Tamer Ahmed, and Ehab E. Elattar. “Optimized Deep Stacked Long Short-Term Memory Network for Long-Term Load Forecasting.” IEEE Access 9 (2021): 68511-68522.
Antunes, E. D. O., Caetano, M. F., Marotta, M. A., Araujo, A., Bondan, L., Meneguette, R. I., & Rocha Filho, G. P. (2021, August). Soluções Otimizadas para o Problema de Localização de Máxima Cobertura em Redes Militarizadas 4G/LTE. In Anais do XXVI Workshop de Gerência e Operação de Redes e Serviços (pp. 152-165). SBC.
M. Yani, F. Ardilla, A. A. Saputra and N. Kubota, “Gradient-Free Deep Q-Networks Reinforcement learning: Benchmark and Evaluation,” 2021 IEEE Symposium Series on Computational Intelligence (SSCI), 2021, pp. 1-5, doi: 10.1109/SSCI50451.2021.9659941.
Yani, Mohamad, and Naoyuki Kubota. “Deep Convolutional Networks with Genetic Algorithm for Reinforcement Learning Problem.”
Mahendra, Muhammad Ihza, and Isman Kurniawan. “Optimizing Convolutional Neural Network by Using Genetic Algorithm for COVID-19 Detection in Chest X-Ray Image.” 2021 International Conference on Data Science and Its Applications (ICoDSA). IEEE, 2021.
Glibota, Vjeko. Umjeravanje mikroskopskog prometnog modela primjenom genetskog algoritma. Diss. University of Zagreb. Faculty of Transport and Traffic Sciences. Division of Intelligent Transport Systems and Logistics. Department of Intelligent Transport Systems, 2021.
Zhu, Mingda. Genetic Algorithm-based Parameter Identification for Ship Manoeuvring Model under Wind Disturbance. MS thesis. NTNU, 2021.
Abdalrahman, Ahmed, and Weihua Zhuang. “Dynamic pricing for differentiated PEV charging services using deep reinforcement learning.” IEEE Transactions on Intelligent Transportation Systems (2020).
More Links¶
https://rodriguezanton.com/identifying-contact-states-for-2d-objects-using-pygad-and/
For More Information¶
There are different resources that can be used to get started with the genetic algorithm and building it in Python.
Tutorial: Implementing Genetic Algorithm in Python¶
To start with coding the genetic algorithm, you can check the tutorial titled Genetic Algorithm Implementation in Python, available at these links:
This tutorial was prepared based on a previous version of the project, but it is still a good resource to start with coding the genetic algorithm.
Tutorial: Introduction to Genetic Algorithm¶
Get started with the genetic algorithm by reading the tutorial titled Introduction to Optimization with Genetic Algorithm, which is available at these links:
Tutorial: Build Neural Networks in Python¶
Read about building neural networks in Python through the tutorial titled Artificial Neural Network Implementation using NumPy and Classification of the Fruits360 Image Dataset, available at these links:
Tutorial: Optimize Neural Networks with Genetic Algorithm¶
Read about training neural networks using the genetic algorithm through the tutorial titled Artificial Neural Networks Optimization using Genetic Algorithm with Python, available at these links:
Tutorial: Building CNN in Python¶
To start with coding convolutional neural networks (CNNs), you can check the tutorial titled Building Convolutional Neural Network using NumPy from Scratch, available at these links:
This tutorial was prepared based on a previous version of the project, but it is still a good resource to start with coding CNNs.
Tutorial: Derivation of CNN from FCNN¶
Learn how a convolutional neural network is derived from a fully connected network by reading the tutorial titled Derivation of Convolutional Neural Network from Fully Connected Network Step-By-Step, which is available at these links:
Book: Practical Computer Vision Applications Using Deep Learning with CNNs¶
You can also check my book, cited as Ahmed Fawzy Gad ‘Practical Computer Vision Applications Using Deep Learning with CNNs’. Dec. 2018, Apress, 978-1-4842-4167-7, which discusses neural networks, convolutional neural networks, deep learning, the genetic algorithm, and more.
Find the book at these links:

Contact Us¶
E-mail: ahmed.f.gad@gmail.com

Thank you for using PyGAD :)