Release History


PyGAD 1.0.17

Release Date: 15 April 2020

  1. The pygad.GA class accepts a new argument named fitness_func which accepts a function to be used for calculating the fitness values for the solutions. This allows the project to be customized to any problem by building the right fitness function.
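A minimal sketch of such a fitness function, reusing the linear-equation example that appears later in this history under the 2.16.1 notes (note that the signature evolved over time: a solution-index argument was added in 2.0.0 and a ga_instance argument in 3.0.0; the form below is the current three-argument one):

import numpy

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    # Higher fitness for solutions whose weighted sum is closer to the target.
    output = numpy.sum(solution * equation_inputs)
    return 1.0 / (numpy.abs(output - desired_output) + 0.000001)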

PyGAD 1.0.20

Release Date: 4 May 2020

  1. The pygad.GA attributes are moved from the class scope to the instance scope.

  2. Raising an exception for incorrect values of the passed parameters.

  3. Two new parameters are added to the pygad.GA class constructor (init_range_low and init_range_high) allowing the user to customize the range from which the gene values in the initial population are selected.

  4. The code object __code__ of the passed fitness function is checked to ensure it has the right number of parameters.

PyGAD 2.0.0

Release Date: 13 May 2020

  1. The fitness function accepts a new argument named sol_idx representing the index of the solution within the population.

  2. A new parameter to the pygad.GA class constructor named initial_population is supported to allow the user to provide a custom initial population for the genetic algorithm. If not None, then the passed population will be used. If None, then the genetic algorithm will create the initial population using the sol_per_pop and num_genes parameters.

  3. The parameters sol_per_pop and num_genes are optional and set to None by default.

  4. A new parameter named callback_generation is introduced in the pygad.GA class constructor. It accepts a function with a single parameter representing the pygad.GA class instance, and this function is called after each generation. This helps the user do post-processing or debugging operations after each generation.
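A minimal sketch of such a callback (the printed attributes are the generations_completed counter mentioned in the 2.2.2 notes and the best_solution() outputs described in the 2.1.0 notes; also note that callback_generation was later removed in 3.0.0 in favor of on_generation):

def callback_gen(ga_instance):
    # Receives the pygad.GA instance once after every generation.
    print("Generation :", ga_instance.generations_completed)
    print("Fitness of the best solution :", ga_instance.best_solution()[1])

# Wired into the constructor, e.g. pygad.GA(..., callback_generation=callback_gen)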

PyGAD 2.1.0

Release Date: 14 May 2020

  1. The best_solution() method in the pygad.GA class returns a new output representing the index of the best solution within the population. Now, it returns a total of 3 outputs and their order is: best solution, best solution fitness, and best solution index. Here is an example:

solution, solution_fitness, solution_idx = ga_instance.best_solution()
print("Parameters of the best solution :", solution)
print("Fitness value of the best solution :", solution_fitness, "\n")
print("Index of the best solution :", solution_idx, "\n")
  2. A new attribute named best_solution_generation is added to the instances of the pygad.GA class. It holds the generation number at which the best solution is reached. It is only assigned the generation number after the run() method completes. Otherwise, its value is -1.
    Example:

print("Best solution reached after {best_solution_generation} generations.".format(best_solution_generation=ga_instance.best_solution_generation))

  3. The best_solution_fitness attribute is renamed to best_solutions_fitness (plural solutions).

  4. Mutation is applied independently for each gene.

PyGAD 2.2.1

Release Date: 17 May 2020

  1. Adding 2 extra modules (pygad.nn and pygad.gann) for building and training neural networks with the genetic algorithm.

PyGAD 2.2.2

Release Date: 18 May 2020

  1. The initial value of the generations_completed attribute of instances from the pygad.GA class is 0 rather than None.

  2. An optional bool parameter named mutation_by_replacement is added to the constructor of the pygad.GA class. It works only when the selected type of mutation is random (mutation_type="random"). In this case, setting mutation_by_replacement=True means replace the gene by the randomly generated value. If False, then it has no effect and random mutation works by adding the random value to the gene. This parameter should be used when the gene falls within a fixed range and its value must not go out of this range. Here are some examples (a short sketch also follows this list):

Assume there is a gene with the value 0.5.

If mutation_type="random" and mutation_by_replacement=False, then the generated random value (e.g. 0.1) will be added to the gene value. The new gene value is 0.5 + 0.1 = 0.6.

If mutation_type="random" and mutation_by_replacement=True, then the generated random value (e.g. 0.1) will replace the gene value. The new gene value is 0.1.

  3. The None value can be assigned to the mutation_type and crossover_type parameters of the pygad.GA class constructor. When None, the corresponding step is bypassed and has no action.
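A short, hedged sketch of the mutation_by_replacement flag in use. The fitness function and the other constructor values are illustrative only, and the example is written against the current three-argument fitness signature (added in 3.0.0) even though the flag itself dates from this release:

import pygad
import numpy

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    return 1.0 / (numpy.abs(output - desired_output) + 0.000001)

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=2,
                       sol_per_pop=5,
                       num_genes=len(equation_inputs),
                       fitness_func=fitness_func,
                       mutation_type="random",
                       # True: the random value replaces the gene, keeping it inside the mutation range.
                       # False (default): the random value is added to the current gene value.
                       mutation_by_replacement=True,
                       random_mutation_min_val=0.0,
                       random_mutation_max_val=1.0)
ga_instance.run()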

PyGAD 2.3.0

Release date: 1 June 2020

  1. A new module named pygad.cnn is supported for building convolutional neural networks.

  2. A new module named pygad.gacnn is supported for training convolutional neural networks using the genetic algorithm.

  3. The pygad.plot_result() method has 3 optional parameters named title, xlabel, and ylabel to customize the plot title, x-axis label, and y-axis label, respectively.

  4. The pygad.nn module supports the softmax activation function.

  5. The name of the pygad.nn.predict_outputs() function is changed to pygad.nn.predict().

  6. The name of the pygad.nn.train_network() function is changed to pygad.nn.train().

PyGAD 2.4.0

Release date: 5 July 2020

  1. A new parameter named delay_after_gen is added which accepts a non-negative number specifying the time in seconds to wait after a generation completes and before going to the next generation. It defaults to 0.0 which means no delay after the generation.

  2. The passed function to the callback_generation parameter of the pygad.GA class constructor can terminate the execution of the genetic algorithm if it returns the string stop. This causes the run() method to stop.

One important use case for this feature is to stop the genetic algorithm when a condition is met before passing through all the generations. The user may assign a value of 100 to the num_generations parameter of the pygad.GA class constructor. Assume that at generation 50, for example, a condition is met and the user wants to stop the execution without waiting for the remaining 50 generations. To do that, just make the function passed to the callback_generation parameter return the string stop.

Here is an example of a function to be passed to the callback_generation parameter which stops the execution if the fitness value 70 is reached. The value 70 might be the best possible fitness value. After it is reached, there is no need to pass through more generations because no further improvement is possible.

def func_generation(ga_instance):
    if ga_instance.best_solution()[1] >= 70:
        return "stop"

PyGAD 2.5.0

Release date: 19 July 2020

  1. 2 new optional parameters added to the constructor of the pygad.GA class which are crossover_probability and mutation_probability.
    While applying the crossover operation, each parent has a random value generated between 0.0 and 1.0. If this random value is less than or equal to the value assigned to the crossover_probability parameter, then the parent is selected for the crossover operation.
    For the mutation operation, a random value between 0.0 and 1.0 is generated for each gene in the solution. If this value is less than or equal to the value assigned to the mutation_probability, then this gene is selected for mutation. (A short sketch follows this list.)
  2. A new optional parameter named linewidth is added to the plot_result() method to specify the width of the curve in the plot. It defaults to 3.0.

  3. Previously, the indices of the genes selected for mutation were randomly generated once for all solutions within the generation. Currently, the genes' indices are randomly generated for each solution in the population. If the population has 4 solutions, the indices are randomly generated 4 times inside the single generation, 1 time for each solution.

  4. Previously, the position of the point(s) for the single-point and two-points crossover was (were) randomly selected once for all solutions within the generation. Currently, the position(s) is (are) randomly selected for each solution in the population. If the population has 4 solutions, the position(s) is (are) randomly generated 4 times inside the single generation, 1 time for each solution.

  5. A new optional parameter named gene_space is added to the pygad.GA class constructor. It is used to specify the possible values for each gene in case the user wants to restrict the gene values. It is useful if the gene space is restricted to a certain range or to discrete values. For more information, check the More about the gene_space Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-space-parameter. Thanks to Prof. Tamer A. Farrag for requesting this useful feature.
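A hedged sketch combining items 1 and 5 above. The gene values, probabilities, and the reused linear-equation fitness are illustrative only, and the sketch uses the current three-argument fitness signature:

import pygad
import numpy

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    return 1.0 / (numpy.abs(output - desired_output) + 0.000001)

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=2,
                       sol_per_pop=8,
                       num_genes=len(equation_inputs),
                       fitness_func=fitness_func,
                       # Restrict each gene to its own set/range of allowed values.
                       gene_space=[[0.4, 0.5, 0.6], range(1, 10), [-2, -1, 0, 1, 2]],
                       # A parent takes part in crossover only if its random draw is <= 0.8.
                       crossover_probability=0.8,
                       # Each gene mutates only if its random draw is <= 0.2.
                       mutation_probability=0.2)
ga_instance.run()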

PyGAD 2.6.0

Release Date: 6 August 2020

  1. A bug fix in assigning the value to the initial_population parameter.

  2. A new parameter named gene_type is added to control the gene type. It can be either int or float. It has an effect only when the parameter gene_space is None.

  3. 7 new parameters that accept callback functions: on_start, on_fitness, on_parents, on_crossover, on_mutation, on_generation, and on_stop.

PyGAD 2.7.0

Release Date: 11 September 2020

  1. The learning_rate parameter in the pygad.nn.train() function defaults to 0.01.

  2. Added support of building neural networks for regression using the new parameter named problem_type. It is added as a parameter to both pygad.nn.train() and pygad.nn.predict() functions. The value of this parameter can be either classification or regression to define the problem type. It defaults to classification.

  3. The activation function for a layer can be set to the string "None" to refer that there is no activation function at this layer. As a result, the supported values for the activation function are "sigmoid", "relu", "softmax", and "None".

To build a regression network using the pygad.nn module, just do the following:

  1. Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".

  2. Set the activation function for the output layer to the string "None". This sets no limits on the range of the outputs as it will be from -infinity to +infinity. If you are sure that all outputs will be nonnegative values, then use the ReLU function.

Check the documentation of the pygad.nn module for an example that builds a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NumPyANN

To build and train a regression network using the pygad.gann module, do the following:

  1. Set the problem_type parameter in the pygad.nn.train() and pygad.nn.predict() functions to the string "regression".

  2. Set the output_activation parameter in the constructor of the pygad.gann.GANN class to "None".

Check the documentation of the pygad.gann module for an example that builds and trains a neural network for regression. The regression example is also available at this GitHub project: https://github.com/ahmedfgad/NeuralGenetic

To build a classification network, either ignore the problem_type parameter or set it to "classification" (default value). In this case, the activation function of the last layer can be set to any type (e.g. softmax).

PyGAD 2.7.1

Release Date: 11 September 2020

  1. A bug fix when the problem_type argument is set to regression.

PyGAD 2.7.2

Release Date: 14 September 2020

  1. Bug fix to support building and training regression neural networks with multiple outputs.

PyGAD 2.8.0

Release Date: 20 September 2020

  1. Support of a new module named kerasga so that the Keras models can be trained by the genetic algorithm using PyGAD.

PyGAD 2.8.1

Release Date: 3 October 2020

  1. Bug fix in applying the crossover operation when the crossover_probability parameter is used. Thanks to Eng. Hamada Kassem, Research and Teaching Assistant, Construction Engineering and Management, Faculty of Engineering, Alexandria University, Egypt.

PyGAD 2.9.0

Release Date: 06 December 2020

  1. The fitness values of the initial population are considered in the best_solutions_fitness attribute.

  2. An optional parameter named save_best_solutions is added. It defaults to False. When it is True, then the best solution after each generation is saved into an attribute named best_solutions. If False, then no solutions are saved and the best_solutions attribute will be empty.

  3. Scattered crossover is supported. To use it, assign the crossover_type parameter the value "scattered".

  4. NumPy arrays are now supported by the gene_space parameter.

  5. The following parameters (gene_type, crossover_probability, mutation_probability, delay_after_gen) can be assigned to a numeric value of any of these data types: int, float, numpy.int, numpy.int8, numpy.int16, numpy.int32, numpy.int64, numpy.float, numpy.float16, numpy.float32, or numpy.float64.

PyGAD 2.10.0

Release Date: 03 January 2021

  1. Support of a new module pygad.torchga to train PyTorch models using PyGAD. Check its documentation.

  2. Support of adaptive mutation where the mutation rate is determined by the fitness value of each solution. Read the Adaptive Mutation section for more details. Also, read this paper: Libelli, S. Marsili, and P. Alba. "Adaptive mutation in genetic algorithms." Soft Computing 4.2 (2000): 76-80.

  3. Before the run() method completes or exits, the fitness value of the best solution in the current population is appended to the best_solutions_fitness list attribute. Note that the fitness value of the best solution in the initial population is already saved at the beginning of the list. So, the fitness value of the best solution is saved before the genetic algorithm starts and after it ends.

  4. When the parameter parent_selection_type is set to sss (steady-state selection), then a warning message is printed if the value of the keep_parents parameter is set to 0.

  5. More validations to the user input parameters.

  6. The default value of the mutation_percent_genes is set to the string "default" rather than the integer 10. This change helps to know whether the user explicitly passed a value to the mutation_percent_genes parameter or left it at its default one. The "default" value is later translated into the integer 10.

  7. The mutation_percent_genes parameter no longer accepts the value 0. It must be >0 and <=100.

  8. The built-in warnings module is used to show warning messages rather than just using the print() function.

  9. A new bool parameter called suppress_warnings is added to the constructor of the pygad.GA class. It allows the user to control whether the warning messages are printed or not. It defaults to False which means the messages are printed.

  10. A helper method called adaptive_mutation_population_fitness() is created to calculate the average fitness value used in adaptive mutation to filter the solutions.

  11. The best_solution() method accepts a new optional parameter called pop_fitness. It accepts a list of the fitness values of the solutions in the population. If None, then the cal_pop_fitness() method is called to calculate the fitness values of the population.

PyGAD 2.10.1

Release Date: 10 January 2021

  1. In the gene_space parameter, any None value (regardless of its index or axis) is replaced by a randomly generated number based on the 3 parameters init_range_low, init_range_high, and gene_type. So, the None values in [..., None, ...] or [..., [..., None, ...], ...] are replaced with random values. This gives more freedom in building the space of values for the genes.

  2. All the numbers passed to the gene_space parameter are cast to the type specified in the gene_type parameter.

  3. The numpy.uint data type is supported for the parameters that accept integer values.

  4. In the pygad.kerasga module, the model_weights_as_vector() function uses the trainable attribute of the model's layers to only return the trainable weights in the network. So, only the trainable layers with their trainable attribute set to True (trainable=True), which is the default value, have their weights evolved. All non-trainable layers with the trainable attribute set to False (trainable=False) will not be evolved. Thanks to Prof. Tamer A. Farrag for pointing that out on GitHub.

PyGAD 2.10.2

Release Date: 15 January 2021

  1. A bug fix when save_best_solutions=True. Refer to this issue for more information: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/25

PyGAD 2.11.0

Release Date: 16 February 2021

  1. In the gene_space argument, the user can use a dictionary to specify the lower and upper limits of the gene. This dictionary must have only 2 items with keys low and high to specify the low and high limits of the gene, respectively. This way, PyGAD takes care of not exceeding the value limits of the gene. For a problem with only 2 genes, using gene_space=[{'low': 1, 'high': 5}, {'low': 0.2, 'high': 0.81}] means the accepted values in the first gene start from 1 (inclusive) to 5 (exclusive) while the second one has values between 0.2 (inclusive) and 0.81 (exclusive). A short sketch follows this list. For more information, please check the Limit the Gene Value Range section of the documentation.

  2. The plot_result() method returns the figure so that the user can save it.

  3. Bug fixes in copying elements from the gene space.

  4. For a gene with a set of discrete values (more than 1 value) in the gene_space parameter like [0, 1], it was possible that the gene value may not change after mutation. That is, if the current value is 0, then the randomly selected value could also be 0. Now, it is verified that the new value is changed. So, if the current value is 0, then the new value after mutation will not be 0 but 1.
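A short sketch of the dictionary form described in item 1, reusing the same illustrative limits:

# One dict per gene; PyGAD keeps each gene inside its own [low, high) range.
gene_space = [{'low': 1,   'high': 5},
              {'low': 0.2, 'high': 0.81}]
# Passed to the constructor, e.g. pygad.GA(..., num_genes=2, gene_space=gene_space)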

PyGAD 2.12.0

Release Date: 20 February 2021

  1. 4 new instance attributes are added to hold temporary results after each generation: last_generation_fitness holds the fitness values of the solutions in the last generation, last_generation_parents holds the parents selected from the last generation, last_generation_offspring_crossover holds the offspring generated after applying the crossover in the last generation, and last_generation_offspring_mutation holds the offspring generated after applying the mutation in the last generation. You can access these attributes inside the on_generation() method for example.

  2. A bug fixed when the initial_population parameter is used. The bug occurred due to a mismatch between the data type of the array assigned to initial_population and the gene type in the gene_type attribute. Assume that the array assigned to the initial_population parameter is ((1, 1), (3, 3), (5, 5), (7, 7)) which has type int. When gene_type is set to float, then the genes will not be float but cast to int because the defined array has int type. The bug is fixed by forcing the array assigned to initial_population to have the data type in the gene_type attribute. Check the issue at GitHub: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/27

Thanks to Andrei Rozanski [PhD Bioinformatics Specialist, Department of Tissue Dynamics and Regeneration, Max Planck Institute for Biophysical Chemistry, Germany] for opening my eyes to the first change.

Thanks to Marios Giouvanakis, a PhD candidate in Electrical & Computer Engineering, Aristotle University of Thessaloniki (Αριστοτέλειο Πανεπιστήμιο Θεσσαλονίκης), Greece, for emailing me about the second issue.

PyGAD 2.13.0

Release Date: 12 March 2021

  1. A new bool parameter called allow_duplicate_genes is supported. If True, which is the default, then a solution/chromosome may have duplicate gene values. If False, then each gene will have a unique value in its solution. Check the Prevent Duplicates in Gene Values section for more details.

  2. The last_generation_fitness attribute is updated at the end of each generation not at the beginning. This keeps the fitness values of the most up-to-date population assigned to the last_generation_fitness attribute.

PyGAD 2.14.0

PyGAD 2.14.0 has an issue that is solved in PyGAD 2.14.1. Please consider using 2.14.1 instead of 2.14.0.

Release Date: 19 May 2021

  1. Issue #40 is solved. Now, the None value works with the crossover_type and mutation_type parameters: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/40

  2. The gene_type parameter supports accepting a list/tuple/numpy.ndarray of numeric data types for the genes. This helps to control the data type of each individual gene. Previously, gene_type could be assigned only a single data type that is applied to all genes. For more information, check the More about the gene_type Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-type-parameter. Thanks to Rainer Engel for asking about this feature in this discussion: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/43

  3. A new bool attribute named gene_type_single is added to the pygad.GA class. It is True when there is a single data type assigned to the gene_type parameter. When the gene_type parameter is assigned a list/tuple/numpy.ndarray, then gene_type_single is set to False.

  4. The mutation_by_replacement flag now has no effect if gene_space exists except for the genes with None values. For example, for gene_space=[None, [5, 6]] the mutation_by_replacement flag affects only the first gene which has None for its value space.

  5. When an element has a value of None in the gene_space parameter (e.g. gene_space=[None, [5, 6]]), then its value will be randomly generated for each solution rather than being generated once for all solutions. Previously, the gene with a None value in gene_space was the same across all solutions.

  6. Some changes in the documentation according to issue #32: https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/32

PyGAD 2.14.2

Release Date: 27 May 2021

  1. Some bug fixes when the gene_type parameter is nested. Thanks to Rainer Engel for opening a discussion to report this bug: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/43#discussioncomment-763342

Rainer Engel helped a lot in suggesting new features and enhancements in the 2.14.0 to 2.14.2 releases.

PyGAD 2.14.3

Release Date: 6 June 2021

  1. Some bug fixes when setting the save_best_solutions parameter to True. Previously, the best solution for generation i was added into the best_solutions attribute at generation i+1. Now, the best_solutions attribute is updated by each best solution at its exact generation.

PyGAD 2.15.0

Release Date: 17 June 2021

  1. Control the precision of all genes/individual genes. Thanks to Rainer for asking about this feature: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/43#discussioncomment-763452

  2. A new attribute named last_generation_parents_indices holds the indices of the selected parents in the last generation.

  3. In adaptive mutation, there is no need to recalculate the fitness values of the parents selected in the last generation as these values can be returned based on the last_generation_fitness and last_generation_parents_indices attributes. This speeds up adaptive mutation.

  4. When a sublist has a value of None in the gene_space parameter (e.g. gene_space=[[1, 2, 3], [5, 6, None]]), then its value will be randomly generated for each solution rather than being generated once for all solutions. Previously, a value of None in a sublist of the gene_space parameter was identical across all solutions.

  5. The dictionary assigned to the gene_space parameter itself or one of its elements has a new key called "step" to specify the step of moving from the start to the end of the range specified by the 2 existing keys "low" and "high". An example is {"low": 0, "high": 30, "step": 2} to have only even values for the gene(s) starting from 0 to 30. For more information, check the More about the gene_space Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-space-parameter. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/48

  6. A new function called predict() is added in both the pygad.kerasga and pygad.torchga modules to make predictions. This makes it easier than using custom code each time a prediction is to be made.

  7. A new parameter called stop_criteria allows the user to specify one or more stop criteria to stop the evolution based on some conditions. Each criterion is passed as a str which has a stop word. The current 2 supported words are reach and saturate. reach stops the run() method if the fitness value is equal to or greater than a given fitness value. An example for reach is "reach_40" which stops the evolution if the fitness is >= 40. saturate means stop the evolution if the fitness saturates for a given number of consecutive generations. An example for saturate is "saturate_7" which means stop the run() method if the fitness does not change for 7 consecutive generations. See the sketch after this list. Thanks to Rainer for asking about this feature: https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/44

  8. A new bool parameter, defaulting to False, named save_solutions is added to the constructor of the pygad.GA class. If True, then all solutions in each generation are appended into an attribute called solutions which is a NumPy array.

  9. The plot_result() method is renamed to plot_fitness(). Users should migrate to the new name as the old name will be removed in the future.

  10. Four new optional parameters are added to the plot_fitness() function in the pygad.GA class which are font_size=14, save_dir=None, color="#3870FF", and plot_type="plot". Use font_size to change the font of the plot title and labels. save_dir accepts the directory to which the figure is saved. It defaults to None which means do not save the figure. color changes the color of the plot. plot_type changes the plot type which can be either "plot" (default), "scatter", or "bar". https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/47

  11. The default value of the title parameter in the plot_fitness() method is "PyGAD - Generation vs. Fitness" rather than "PyGAD - Iteration vs. Fitness".

  12. A new method named plot_new_solution_rate() creates, shows, and returns a figure showing the rate of new/unique solutions explored in each generation. It accepts the same parameters as the plot_fitness() method. This method only works when save_solutions=True in the pygad.GA class's constructor.

  13. A new method named plot_genes() creates, shows, and returns a figure to show how each gene changes per each generation. It accepts similar parameters to the plot_fitness() method in addition to the graph_type, fill_color, and solutions parameters. The graph_type parameter can be either "plot" (default), "boxplot", or "histogram". fill_color accepts the fill color which works when graph_type is either "boxplot" or "histogram". solutions can be either "all" or "best" to decide whether all solutions or only best solutions are used.

  14. The gene_type parameter now supports controlling the precision of float data types. For a gene, rather than assigning just the data type like float, assign a list/tuple/numpy.ndarray with 2 elements where the first one is the type and the second one is the precision. For example, [float, 2] forces a gene with a value like 0.1234 to be 0.12. For more information, check the More about the gene_type Parameter section: https://pygad.readthedocs.io/en/latest/pygad_more.html#more-about-the-gene-type-parameter.
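A hedged sketch combining the stop_criteria, save_solutions, and gene_type precision parameters from items 7, 8, and 14. All numeric values are illustrative, the fitness function is the linear-equation example reused elsewhere in this history, and the sketch uses the current three-argument fitness signature:

import pygad
import numpy

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    return 1.0 / (numpy.abs(output - desired_output) + 0.000001)

ga_instance = pygad.GA(num_generations=1000,
                       num_parents_mating=2,
                       sol_per_pop=10,
                       num_genes=len(equation_inputs),
                       fitness_func=fitness_func,
                       gene_type=[float, 2],  # floats rounded to 2 decimal places
                       save_solutions=True,   # keep every explored solution
                       # Stop early if fitness reaches 40 or does not improve for 7 generations.
                       stop_criteria=["reach_40", "saturate_7"])
ga_instance.run()
ga_instance.plot_new_solution_rate()  # works because save_solutions=True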

PyGAD 2.15.1

Release Date: 18 June 2021

  1. Fix a bug when keep_parents is set to a positive integer. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/49

PyGAD 2.15.2

Release Date: 18 June 2021

  1. Fix a bug when using the kerasga or torchga modules. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/51

PyGAD 2.16.0

Release Date: 19 June 2021

  1. A user-defined function can be passed to the mutation_type, crossover_type, and parent_selection_type parameters in the pygad.GA class to create custom mutation, crossover, and parent selection operators. Check the User-Defined Crossover, Mutation, and Parent Selection Operators section for more details. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/50

PyGAD 2.16.1

Release Date: 28 September 2021

  1. The user can use the tqdm library to show a progress bar. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/50

import pygad
import numpy
import tqdm

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    fitness = 1.0 / (numpy.abs(output - desired_output) + 0.000001)
    return fitness

num_generations = 10000
with tqdm.tqdm(total=num_generations) as pbar:
    ga_instance = pygad.GA(num_generations=num_generations,
                           sol_per_pop=5,
                           num_parents_mating=2,
                           num_genes=len(equation_inputs),
                           fitness_func=fitness_func,
                           on_generation=lambda _: pbar.update(1))
    ga_instance.run()

ga_instance.plot_result()

But this approach does not work if the ga_instance will be pickled (i.e. the save() method will be called):

ga_instance.save("test")

To solve this issue, define a function and pass it to the on_generation parameter. In the next code, the on_generation_progress() function is defined which updates the progress bar.

import pygad
import numpy
import tqdm

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    fitness = 1.0 / (numpy.abs(output - desired_output) + 0.000001)
    return fitness

def on_generation_progress(ga):
    pbar.update(1)

num_generations = 100
with tqdm.tqdm(total=num_generations) as pbar:
    ga_instance = pygad.GA(num_generations=num_generations,
                           sol_per_pop=5,
                           num_parents_mating=2,
                           num_genes=len(equation_inputs),
                           fitness_func=fitness_func,
                           on_generation=on_generation_progress)
    ga_instance.run()

ga_instance.plot_result()
ga_instance.save("test")

  2. Solved the issue of unequal length between the solutions and solutions_fitness when the save_solutions parameter is set to True. Now, the fitness of the last population is appended to the solutions_fitness array. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/64

  3. There was an issue of getting the length of these 4 variables (solutions, solutions_fitness, best_solutions, and best_solutions_fitness) doubled after each call of the run() method. This is solved by resetting these variables at the beginning of the run() method. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/62

  4. Bug fixes when adaptive mutation is used (mutation_type="adaptive"). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/65

PyGAD 2.16.2

Release Date: 2 February 2022

  1. A new instance attribute called previous_generation_fitness added in the pygad.GA class. It holds the fitness values of one generation before the fitness values saved in the last_generation_fitness.

  2. Issue in the cal_pop_fitness() method in getting the correct indices of the previous parents. This is solved by using the previous generation's fitness saved in the new attribute previous_generation_fitness to return the parents' fitness values. Thanks to Tobias Tischhauser (M.Sc. - Mitarbeiter Institut EMS, Departement Technik, OST – Ostschweizer Fachhochschule, Switzerland) for detecting this bug.

PyGAD 2.16.3

Release Date: 2 February 2022

  1. Validate the fitness value returned from the fitness function. An exception is raised if something is wrong. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/67

PyGAD 2.17.0

Release Date: 8 July 2022

  1. An issue is solved when the gene_space parameter is given a fixed value, e.g. gene_space=[range(5), 4]. The second gene's value is static (4) which caused an exception.

  2. Fixed the issue where the allow_duplicate_genes parameter did not work when mutation is disabled (i.e. mutation_type=None). This is by checking for duplicates after crossover directly. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/39

  3. Solve an issue in the tournament_selection() method as the indices of the selected parents were incorrect. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/89

  4. Reuse the fitness values of the previously explored solutions rather than recalculating them. This feature only works if save_solutions=True.

  5. Parallel processing is supported. This is by the introduction of a new parameter named parallel_processing in the constructor of the pygad.GA class. Thanks to @windowshopr for opening issue #78 at GitHub. Check the Parallel Processing in PyGAD section for more information and examples.

PyGAD 2.18.0

Release Date: 9 September 2022

  1. Raise an exception if the sum of fitness values is zero while either roulette wheel or stochastic universal parent selection is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/129

  2. Initialize the value of the run_completed property to False. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/122

  3. The values of these properties are no longer reset with each call to the run() method: self.best_solutions, self.best_solutions_fitness, self.solutions, self.solutions_fitness. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/123. Now, the user has the flexibility of calling the run() method more than once while extending the data collected after each generation. Another advantage happens when the instance is loaded and the run() method is called, as the old fitness values are shown on the graph alongside the new fitness values. Read more in this section: Continue without Losing Progress.

  4. Thanks Prof. Fernando Jiménez Barrionuevo (Dept. of Information and Communications Engineering, University of Murcia, Murcia, Spain) for editing this comment in the code. https://github.com/ahmedfgad/GeneticAlgorithmPython/commit/5315bbec02777df96ce1ec665c94dece81c440f4

  5. A bug fixed when crossover_type=None.

  6. Support of elitism selection through a new parameter named keep_elitism. It defaults to 1 which means, for each generation, keep only the best solution in the next generation. If assigned 0, then it has no effect. See the sketch after this list. Read more in this section: Elitism Selection. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/74

  7. A new instance attribute named last_generation_elitism added to hold the elitism in the last generation.

  8. A new parameter called random_seed added to accept a seed for the random function generators. Credit to this issue https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/70 and Prof. Fernando Jiménez Barrionuevo. Read more in this section: Random Seed.

  9. Editing the pygad.TorchGA module to make sure the tensor data is moved from GPU to CPU. Thanks to Rasmus Johansson for opening this pull request: https://github.com/ahmedfgad/TorchGA/pull/2
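A short, hedged sketch of the keep_elitism and random_seed parameters from items 6 and 8; the rest of the configuration reuses the illustrative linear-equation fitness and the current three-argument signature:

import pygad
import numpy

equation_inputs = [4, -2, 3.5]
desired_output = 44

def fitness_func(ga_instance, solution, solution_idx):
    output = numpy.sum(solution * equation_inputs)
    return 1.0 / (numpy.abs(output - desired_output) + 0.000001)

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=2,
                       sol_per_pop=10,
                       num_genes=len(equation_inputs),
                       fitness_func=fitness_func,
                       keep_elitism=2,  # carry the best 2 solutions into every new generation
                       random_seed=2)   # seed the random generators for reproducible runs
ga_instance.run()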

PyGAD 2.18.1

Release Date: 19 September 2022

  1. A bug fix when keep_elitism is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/132

PyGAD 2.18.2

Release Date: 14 February 2023

  1. Remove numpy.int and numpy.float from the list of supported data types. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/151 https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/152

  2. Call the on_crossover() callback function even if crossover_type is None. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138

  3. Call the on_mutation() callback function even if mutation_type is None. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138

PyGAD 2.18.3

Release Date: 14 February 2023

  1. Bug fixes.

PyGAD 2.19.0

Release Date: 22 February 2023

  1. A new summary() method is supported to return a Keras-like summary of the PyGAD lifecycle.

  2. A new optional parameter called fitness_batch_size is supported to calculate the fitness in batches. If it is assigned the value 1 or None (default), then the normal flow is used where the fitness function is called for each individual solution. If the fitness_batch_size parameter is assigned a value satisfying this condition 1 < fitness_batch_size <= sol_per_pop, then the solutions are grouped into batches of size fitness_batch_size and the fitness function is called once for each batch. In this case, the fitness function must return a list/tuple/numpy.ndarray with a length equal to the number of solutions passed. See the sketch after this list. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/136

  3. The cloudpickle library (https://github.com/cloudpipe/cloudpickle) is used instead of the pickle library to pickle the pygad.GA objects. This solves the issue of having to redefine the functions (e.g. fitness function). The cloudpickle library is added as a dependency in the requirements.txt file. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/159

  4. Support of assigning methods to these parameters: fitness_func, crossover_type, mutation_type, parent_selection_type, on_start, on_fitness, on_parents, on_crossover, on_mutation, on_generation, and on_stop. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/92 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/138

  5. Validating the output of the parent selection, crossover, and mutation functions.

  6. The built-in parent selection operators return the parents' indices as a NumPy array.

  7. The outputs of the parent selection, crossover, and mutation operators must be NumPy arrays.

  8. Fix an issue when allow_duplicate_genes=True. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/39

  9. Fix an issue creating scatter plots of the solutions' fitness.

  10. Sampling from a set() is no longer supported in Python 3.11. Instead, sampling happens from a list(). Thanks Marco Brenna for pointing to this issue.

  11. The lifecycle is updated to reflect that the new population's fitness is calculated at the end of the lifecycle not at the beginning. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/154#issuecomment-1438739483

  12. There was an issue when save_solutions=True that caused the fitness function to be called for solutions already explored and that have their fitness pre-calculated. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/160

  13. A new instance attribute named last_generation_elitism_indices added to hold the indices of the selected elitism. This attribute helps to re-use the fitness of the elitism instead of calling the fitness function.

  14. Fewer calls to the best_solution() method which in turn saves some calls to the fitness function.

  15. Some updates in the documentation to give more details about the cal_pop_fitness() method. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/79#issuecomment-1439605442
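A hedged sketch of a batch fitness function for item 2, assuming the batched call passes the whole batch as a 2-D array together with the matching solution indices (the equation values are illustrative):

import numpy

equation_inputs = [4, -2, 3.5]
desired_output = 44

def batch_fitness_func(ga_instance, solutions, solutions_indices):
    # 'solutions' holds up to fitness_batch_size solutions, one per row.
    outputs = numpy.sum(solutions * equation_inputs, axis=1)
    # One fitness value must be returned per solution in the batch.
    return 1.0 / (numpy.abs(outputs - desired_output) + 0.000001)

# Used with e.g. pygad.GA(..., fitness_func=batch_fitness_func, fitness_batch_size=4)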

PyGAD 2.19.1

Release Date: 22 February 2023

  1. Add the cloudpickle library as a dependency.

PyGAD 2.19.2

Release Date 23 February 2023

  1. Fix an issue when parallel processing was used where the elitism solutions' fitness values are not re-used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/160#issuecomment-1441718184

PyGAD 3.0.0

Release Date 8 April 2023

  1. The structure of the library is changed and some methods defined in the pygad.py module are moved to the pygad.utils, pygad.helper, and pygad.visualize submodules.

  2. The pygad.utils.parent_selection module has a class named ParentSelection where all the parent selection operators exist. The pygad.GA class extends this class.

  3. The pygad.utils.crossover module has a class named Crossover where all the crossover operators exist. The pygad.GA class extends this class.

  4. The pygad.utils.mutation module has a class named Mutation where all the mutation operators exist. The pygad.GA class extends this class.

  5. The pygad.helper.unique module has a class named Unique where some helper methods exist to solve duplicate genes and make sure every gene is unique. The pygad.GA class extends this class.

  6. The pygad.visualize.plot module has a class named Plot where all the methods that create plots exist. The pygad.GA class extends this class.

  7. Support of using the logging module to log the outputs to both the console and text file instead of using the print() function. This is by assigning the logging.Logger to the new logger parameter. Check the Logging Outputs section for more information.

  8. A new instance attribute called logger to save the logger.

  9. The function/method passed to the fitness_func parameter accepts a new parameter that refers to the instance of the pygad.GA class. Check this for an example: Use Functions and Methods to Build Fitness Function and Callbacks. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/163

  10. Update the documentation to include an example of using functions and methods to calculate the fitness and build callbacks. Check this for more details: Use Functions and Methods to Build Fitness Function and Callbacks. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/92#issuecomment-1443635003

  11. Validate the value passed to the initial_population parameter.

  12. Validate the type and length of the pop_fitness parameter of the best_solution() method.

  13. Some edits in the documentation. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/106

  14. Fix an issue when building the initial population as (some) genes have their value taken from the mutation range (defined by the parameters random_mutation_min_val and random_mutation_max_val) instead of using the parameters init_range_low and init_range_high.

  15. The summary() method returns the summary as a single-line string. Just log/print the returned string to see it properly.

  16. The callback_generation parameter is removed. Use the on_generation parameter instead.

  17. There was an issue when using the parallel_processing parameter with Keras and PyTorch. As Keras/PyTorch are not thread-safe, the predict() method gives incorrect and weird results when more than 1 thread is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/145 https://github.com/ahmedfgad/TorchGA/issues/5 https://github.com/ahmedfgad/KerasGA/issues/6. Thanks to this StackOverflow answer.

  18. Replace numpy.float by float in the 2 parent selection operators roulette wheel and stochastic universal. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/168

PyGAD 3.0.1

Release Date 20 April 2023

  1. Fix an issue with passing a user-defined function/method for parent selection. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/179

PyGAD 3.1.0

Release Date 20 June 2023

  1. Fix a bug when the initial population has duplicate genes if a nested gene space is used.

  2. The gene_space parameter can no longer be assigned a tuple.

  3. Fix a bug when the gene_space parameter has a member of type tuple.

  4. A new instance attribute called gene_space_unpacked which has the unpacked gene_space. It is used to solve duplicates. For infinite ranges in the gene_space, they are unpacked to a limited number of values (e.g. 100).

  5. Bug fixes when creating the initial population using the gene_space attribute.

  6. When a dict is used with the gene_space attribute, the new gene value was calculated by summing 2 values: 1) the value sampled from the dict 2) a random value returned from the random mutation range defined by the 2 parameters random_mutation_min_val and random_mutation_max_val. This might cause the gene value to exceed the range limit defined in the gene_space. To respect the gene_space range, this release only returns the value from the dict without summing it to a random value.

  7. Formatting the strings using f-strings instead of the format() method. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/189

  8. In the __init__() of the pygad.GA class, the logged error messages are handled using a try-except block instead of repeating the logger.error() command. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/189

  9. A new class named CustomLogger is created in the pygad.cnn module to create a default logger using the logging module assigned to the logger attribute. This class is extended in all other classes in the module. The constructors of these classes have a new parameter named logger which defaults to None. If no logger is passed, then the default logger in the CustomLogger class is used.

  10. Except for the pygad.nn module, the print() statements in all other modules are replaced by the logging module to log messages.

  11. The callback functions/methods on_fitness(), on_parents(), on_crossover(), and on_mutation() can return values. These returned values override the corresponding properties. The output of on_fitness() overrides the population fitness. The on_parents() function/method must return 2 values representing the parents and their indices. The output of on_crossover() overrides the crossover offspring. The output of on_mutation() overrides the mutation offspring.

  12. Fix a bug when adaptive mutation is used while fitness_batch_size > 1. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/195

  13. When allow_duplicate_genes=False and a user-defined gene_space is used, it sometimes happens that there is no room to solve the duplicates between 2 genes by simply replacing the value of one gene by another gene. This release tries to solve such duplicates by looking for a third gene that will help in solving the duplicates. Check this section for more information.

  14. Use probabilities to select parents using the rank parent selection method. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/205

  15. The 2 parameters random_mutation_min_val and random_mutation_max_val can accept iterables (list/tuple/numpy.ndarray) with length equal to the number of genes. This enables customizing the mutation range for each individual gene. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/198

  16. The 2 parameters init_range_low and init_range_high can accept iterables (list/tuple/numpy.ndarray) with length equal to the number of genes. This enables customizing the initial range for each individual gene when creating the initial population.

  17. The data parameter in the predict() function of the pygad.kerasga module can be assigned a data generator. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/115 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/207

  18. The predict() function of the pygad.kerasga module accepts 3 optional parameters: batch_size=None, verbose=0, and steps=None. Check the documentation of the Keras Model.predict() method for more information. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/207

  19. The documentation is updated to explain how mutation works when gene_space is used with int or float data types. Check this section. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/198

PyGAD 3.2.0

Release Date 7 September 2023

  1. A new module pygad.utils.nsga2 is created that has the NSGA2 class that includes the functionalities of NSGA-II. The class has these methods: 1) get_non_dominated_set() 2) non_dominated_sorting() 3) crowding_distance() 4) sort_solutions_nsga2(). Check this section for an example.

  2. Support of multi-objective optimization using Non-Dominated Sorting Genetic Algorithm II (NSGA-II) using the NSGA2 class in the pygad.utils.nsga2 module. Just return a list, tuple, or numpy.ndarray from the fitness function and the library will consider the problem as multi-objective optimization. All the objectives are expected to be maximization. Check this section for an example; a sketch also follows this list.

  3. The parent selection methods and adaptive mutation are edited to support multi-objective optimization.

  4. Two new NSGA-II parent selection methods are supported in the pygad.utils.parent_selection module: 1) Tournament selection for NSGA-II 2) NSGA-II selection.

  5. The plot_fitness() method in the pygad.plot module has a new optional parameter named label to accept the label of the plots. This is only used for multi-objective problems. Otherwise, it is ignored. It defaults to None and accepts a list, tuple, or numpy.ndarray. The labels are used in a legend inside the plot.

  6. The default color in the methods of the pygad.plot module is changed to the greenish #64f20c color.

  7. A new instance attribute named pareto_fronts added to the pygad.GA instances that holds the Pareto fronts when solving a multi-objective problem.

  8. The gene_type accepts a list, tuple, or numpy.ndarray for integer data types given that the precision is set to None (e.g. gene_type=[float, [int, None]]).

  9. In the cal_pop_fitness() method, the fitness value is re-used if save_best_solutions=True and the solution is found in the best_solutions attribute. These parameters can also help re-using the fitness of a solution instead of calling the fitness function: keep_elitism, keep_parents, and save_solutions.

  10. The value 99999999999 is replaced by float('inf') in the 2 methods wheel_cumulative_probs() and stochastic_universal_selection() inside the pygad.utils.parent_selection.ParentSelection class.

  11. The plot_result() method in the pygad.visualize.plot.Plot class is removed. Instead, please use the plot_fitness() method if you did not migrate yet.
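A hedged sketch of a multi-objective fitness function for item 2. The two objectives and their input data are illustrative; each returned value is treated as a maximization objective:

import numpy

# Two illustrative single-objective targets combined into one problem.
inputs_1, target_1 = [4, -2, 3.5, 5], 50
inputs_2, target_2 = [-2, 0.7, -9, 1.4], 30

def fitness_func(ga_instance, solution, solution_idx):
    output_1 = numpy.sum(solution * inputs_1)
    output_2 = numpy.sum(solution * inputs_2)
    fitness_1 = 1.0 / (numpy.abs(output_1 - target_1) + 0.000001)
    fitness_2 = 1.0 / (numpy.abs(output_2 - target_2) + 0.000001)
    # Returning an iterable makes PyGAD treat the problem as multi-objective (NSGA-II).
    return [fitness_1, fitness_2]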

PyGAD 3.3.0

Release Date 29 January 2024

  1. Solve bugs when multi-objective optimization is used. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/238

  2. When the stop_criteria parameter is used with the reach keyword, then multiple numeric values can be passed when solving a multi-objective problem. For example, if a problem has 3 objective functions, then stop_criteria="reach_10_20_30" means the GA stops if the fitness of the 3 objectives is at least 10, 20, and 30, respectively. The number of values must match the number of objective functions. If a single value is found (e.g. stop_criteria="reach_5") when solving a multi-objective problem, then it is used across all the objectives. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/238

  3. The delay_after_gen parameter is now deprecated and will be removed in a future release. If it is necessary to have a time delay after each generation, then assign a callback function/method to the on_generation parameter to pause the evolution (see the sketch after this list).

  4. Parallel processing now supports calculating the fitness during adaptive mutation. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/201

  5. The population size can be changed during runtime by changing all the parameters that would affect the size of anything used by the GA. For more information, check the Change Population Size during Runtime section. https://github.com/ahmedfgad/GeneticAlgorithmPython/discussions/234

  6. When a dictionary exists in the gene_space parameter without a step, then mutation occurs by adding a random value to the gene value. The random value is generated based on the 2 parameters random_mutation_min_val and random_mutation_max_val. For more information, check the How Mutation Works with the gene_space Parameter? section. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/229

  7. Add object as a supported data type for int (GA.supported_int_types) and float (GA.supported_float_types). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/174

  8. Use the raise clause instead of sys.exit(-1) to terminate the execution. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/213

  9. Fix a bug when multi-objective optimization is used with batch fitness calculation (e.g. fitness_batch_size set to a non-zero number).

  10. Fix a bug in the pygad.py script when finding the index of the best solution. It did not work properly with multi-objective optimization where self.best_solutions_fitness has multiple columns.

self.best_solution_generation = numpy.where(numpy.array(self.best_solutions_fitness) == numpy.max(numpy.array(self.best_solutions_fitness)))[0][0]
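As noted in item 3 above, a per-generation delay now belongs in an on_generation callback; a minimal sketch (the 0.5-second pause is illustrative):

import time

def on_gen_delay(ga_instance):
    # Replacement for the deprecated delay_after_gen: pause after every generation.
    time.sleep(0.5)

# Passed to the constructor, e.g. pygad.GA(..., on_generation=on_gen_delay)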

PyGAD 3.3.1

Release Date 17 February 2024

  1. After the last generation and before the run() method completes, update the 2 instance attributes: 1) last_generation_parents 2) last_generation_parents_indices. This is to keep the list of parents up-to-date with the latest population fitness last_generation_fitness. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/275

  2. 4 methods with names starting with run_ are added. Their purpose is to keep the main loop inside the run() method clean. Check the Other Methods section for more information.

PyGAD 3.4.0

Release Date 07 January 2025

  1. The delay_after_gen parameter is removed from the pygad.GA class constructor. As a result, it is no longer an attribute of the pygad.GA class instances. To add a delay after each generation, apply it inside the on_generation callback. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/283

  2. In the single_point_crossover() method of the pygad.utils.crossover.Crossover class, all the random crossover points are returned before the for loop. This is by calling the numpy.random.randint() function only once before the loop to generate all the K points (where K is the offspring size). This is compared to calling the numpy.random.randint() function inside the for loop K times, once for each individual offspring.

  3. Bug fix in the examples/example_custom_operators.py script. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/285

  4. While making predictions using the pygad.torchga.predict() function, no gradients are calculated.

  5. The gene_type parameter of the pygad.helper.unique.Unique.unique_int_gene_from_range() method accepts the type of the current gene only instead of the full gene_type list.

  6. Created a new method called unique_float_gene_from_range() inside the pygad.helper.unique.Unique class to find a unique floating-point number from a range.

  7. Fix a bug in the pygad.helper.unique.Unique.unique_gene_by_space() method to return the numeric value only instead of a NumPy array.

  8. Refactoring the pygad/helper/unique.py script to remove duplicate code and reformat the docstrings.

  9. The plot_pareto_front_curve() method added to the pygad.visualize.plot.Plot class to visualize the Pareto front for multi-objective problems. It only supports 2 objectives. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/279

  10. Fix a bug converting a nested NumPy array to a nested list. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/300

  11. The Matplotlib library is only imported when a method inside the pygad/visualize/plot.py script is used. This is more efficient than using import matplotlib.pyplot at the module level, as this causes it to be imported when pygad is imported even when it is not needed. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/292

  12. Fix a bug when the minus sign (-) is used inside the stop_criteria parameter (e.g. stop_criteria=["saturate_10", "reach_-0.5"]). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/296

  13. Make sure self.best_solutions is a list of lists inside the cal_pop_fitness() method. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/293

  14. Fix a bug where the cal_pop_fitness() method was using the previous_generation_fitness attribute to return the parents' fitness. This instance attribute was not using the fitness of the latest population, but rather the fitness of the population before the last one. The issue is solved by updating the previous_generation_fitness attribute to the latest population fitness before the GA completes. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/291

PyGAD 3.5.0

Release Date 08 July 2025

  1. Fix a bug when the minus sign (-) is used inside the stop_criteria parameter for multi-objective problems. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/314 https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/323

  2. Fix a bug when the stop_criteria parameter is passed as an iterable (e.g. list) for multi-objective problems (e.g. ['reach_50_60', 'reach_20,40']). https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/314

  3. Call the get_matplotlib() function from the plot_genes() method inside the pygad.visualize.plot.Plot class to import the matplotlib library. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/315

  4. Create a new helper method called select_unique_value() inside the pygad/helper/unique.py script to select a unique gene from an array of values.

  5. Create a new helper method called get_random_mutation_range() inside the pygad/utils/mutation.py script that returns the random mutation range (min and max) for a single gene by its index.

  6. Create a new helper method called change_random_mutation_value_dtype() inside the pygad/utils/mutation.py script that changes the data type of the value used to apply random mutation.

  7. Create a new helper method called round_random_mutation_value() inside the pygad/utils/mutation.py script that rounds the value used to apply random mutation.

  8. Create the pygad/helper/misc.py script with a class called Helper that has the following helper methods:

    1. change_population_dtype_and_round(): For each gene in the population, round the gene value and change the data type.

    2. change_gene_dtype_and_round(): Round the value and change the data type of a single gene.

    3. mutation_change_gene_dtype_and_round(): Decides whether mutation is done by replacement or not. Then it rounds and changes the data type of the new gene value.

    4. validate_gene_constraint_callable_output(): Validates the output of the user-defined callable/function that checks whether the gene constraint defined in the gene_constraint parameter is satisfied or not.

    5. get_gene_dtype(): Returns the gene data type from the gene_type instance attribute.

    6. get_random_mutation_range(): Returns the random mutation range using the random_mutation_min_val and random_mutation_max_val instance attributes.

    7. get_initial_population_range(): Returns the initialpopulation values range using theinit_range_low andinit_range_high instance attributes.

    8. generate_gene_value_from_space(): Generates/selects a valuefor a gene using thegene_space instance attribute.

    9. generate_gene_value_randomly(): Generates a random value forthe gene. Only used ifgene_space isNone.

    10. generate_gene_value(): Generates a value for the gene. It checks whether gene_space is None and calls either generate_gene_value_randomly() or generate_gene_value_from_space().

    11. filter_gene_values_by_constraint(): Receives a list of values for a gene. Then it filters such values using the gene constraint.

    12. get_valid_gene_constraint_values(): Selects one valid gene value that satisfies the gene constraint. It simply calls generate_gene_value() to generate some gene values, then filters them using filter_gene_values_by_constraint().

  9. Create a new helper method called mutation_process_random_value() inside the pygad/utils/mutation.py script that generates constrained random values for mutation. It calls either generate_gene_value() or get_valid_gene_constraint_values() based on whether the gene_constraint parameter is used or not.

  10. A new parameter called gene_constraint is added. It accepts a list of callables (i.e. functions) acting as constraints for the gene values. Before selecting a value for a gene, the callable is called to ensure the candidate value is valid (a usage sketch is given after this list). Check the Gene Constraint section for more information. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/119

  11. A new parameter called sample_size is added. To select a gene value that respects a constraint, this parameter defines the size of the sample from which a value is selected randomly. Useful if either allow_duplicate_genes or gene_constraint is used. An instance attribute of the same name is created in the instances of the pygad.GA class. Check the sample_size Parameter section for more information.

  12. Use the sample_size parameter instead of num_trials in the methods solve_duplicate_genes_randomly() and unique_float_gene_from_range() inside the pygad/helper/unique.py script. It is the maximum number of values to generate as the search space when looking for a unique float value out of a range.

  13. Fixed a bug in population initialization when allow_duplicate_genes=False. Previously, gene values were checked for duplicates before rounding, which could allow near-duplicates like 7.61 and 7.62 to pass. After rounding (e.g., both becoming 7.6), this resulted in unintended duplicates. The fix ensures gene values are now rounded before duplicate checks, preventing such cases.

  14. More tests are created.

  15. More examples are created.

  16. Edited the sort_solutions_nsga2() method in the pygad/utils/nsga2.py script to accept an optional parameter called find_best_solution, used when calling this method just to find the best solution.

  17. Fixed a bug while applying non-dominated sorting in the get_non_dominated_set() method inside the pygad/utils/nsga2.py script. It was swapping the non-dominated and dominated sets. In other words, it used the non-dominated set as if it were the dominated set and vice versa. All the calls to this method were edited accordingly. https://github.com/ahmedfgad/GeneticAlgorithmPython/issues/320

  18. Fix a bug in the best_solution() method when retrieving the best solution for multi-objective problems. https://github.com/ahmedfgad/GeneticAlgorithmPython/pull/331
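
The snippet below is a minimal sketch of how the new gene_constraint and sample_size parameters might be combined in the pygad.GA constructor. It assumes that each constraint callable receives the candidate solution and a NumPy array of candidate values for its gene and returns only the values that satisfy the constraint; the constraints and the fitness function are illustrative only (see the Gene Constraint and sample_size Parameter sections for the exact behavior).

import numpy
import pygad

def fitness_func(ga_instance, solution, solution_idx):
    # Toy fitness: maximize the sum of the genes.
    return numpy.sum(solution)

# One constraint callable per gene (illustrative constraints).
gene_constraint = [lambda solution, values: values[values >= 1],   # gene 0: keep candidates >= 1
                   lambda solution, values: values[values <= 5],   # gene 1: keep candidates <= 5
                   lambda solution, values: values]                # gene 2: unconstrained

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=4,
                       sol_per_pop=10,
                       num_genes=3,
                       fitness_func=fitness_func,
                       gene_constraint=gene_constraint,
                       sample_size=100)  # size of the random sample searched for a valid gene value
ga_instance.run()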

PyGAD Projects at GitHub

The PyGAD library is available at PyPI at this page: https://pypi.org/project/pygad. PyGAD is built out of a number of open-source GitHub projects. A brief note about these projects is given in the next subsections.

GeneticAlgorithmPython

GitHub Link: https://github.com/ahmedfgad/GeneticAlgorithmPython

GeneticAlgorithmPython is the first of these projects: an open-source Python 3 project implementing the genetic algorithm based on NumPy.

NumPyANN

GitHub Link: https://github.com/ahmedfgad/NumPyANN

NumPyANN builds artificial neural networks in Python 3 using NumPy from scratch. The purpose of this project is to only implement the forward pass of a neural network without using a training algorithm. Currently, it only supports classification; regression will also be supported later. Moreover, only one class is supported per sample.

NeuralGenetic

GitHub Link: https://github.com/ahmedfgad/NeuralGenetic

NeuralGenetic trains neural networks using the genetic algorithm based on the previous 2 projects GeneticAlgorithmPython and NumPyANN.

NumPyCNN

GitHub Link: https://github.com/ahmedfgad/NumPyCNN

NumPyCNN builds convolutional neural networks using NumPy. The purpose of this project is to only implement the forward pass of a convolutional neural network without using a training algorithm.

CNNGenetic

GitHub Link: https://github.com/ahmedfgad/CNNGenetic

CNNGenetic trains convolutional neural networks using the genetic algorithm. It uses the GeneticAlgorithmPython project for building the genetic algorithm.

KerasGA

GitHub Link: https://github.com/ahmedfgad/KerasGA

KerasGA trains Keras models using the genetic algorithm. It uses the GeneticAlgorithmPython project for building the genetic algorithm.

TorchGA

GitHub Link: https://github.com/ahmedfgad/TorchGA

TorchGA trains PyTorch models using the genetic algorithm. It uses the GeneticAlgorithmPython project for building the genetic algorithm.

pygad.torchga: https://github.com/ahmedfgad/TorchGA

Stackoverflow Questions about PyGAD

How do I proceed to load a ga_instance as “.pkl” format in PyGad?

Binary Classification NN Model Weights not being Trained in PyGAD

How to solve TSP problem using pyGAD package?

How can I save a matplotlib plot that is the output of a function in jupyter?

How do I query the best solution of a pyGAD GA instance?

Multi-Input Multi-Output in Genetic algorithm (python)

https://www.linkedin.com/pulse/validation-short-term-parametric-trading-model-genetic-landolfi

https://itchef.ru/articles/397758

https://audhiaprilliant.medium.com/genetic-algorithm-based-clustering-algorithm-in-searching-robust-initial-centroids-for-k-means-e3b4d892a4be

https://python.plainenglish.io/validation-of-a-short-term-parametric-trading-model-with-genetic-optimization-and-walk-forward-89708b789af6

https://ichi.pro/ko/pygadwa-hamkke-yujeon-algolijeum-eul-sayonghayeo-keras-model-eul-hunlyeonsikineun-bangbeob-173299286377169

https://ichi.pro/tr/pygad-ile-genetik-algoritmayi-kullanarak-keras-modelleri-nasil-egitilir-173299286377169

https://ichi.pro/ru/kak-obucit-modeli-keras-s-pomos-u-geneticeskogo-algoritma-s-pygad-173299286377169

https://blog.csdn.net/sinat_38079265/article/details/108449614

Submitting Issues

If there is an issue using PyGAD, then use any of your preferred options to discuss that issue.

One way is submitting an issue to this GitHub project (github.com/ahmedfgad/GeneticAlgorithmPython) in case something is not working properly or to ask questions.

If this is not a suitable option for you, then check the Contact Us section for more contact details.

Ask for Feature

PyGAD is actively developed with the goal of building a dynamic library for supporting a wide range of problems to be optimized using the genetic algorithm.

To ask for a new feature, either submit an issue to this GitHub project (github.com/ahmedfgad/GeneticAlgorithmPython) or send an e-mail to ahmed.f.gad@gmail.com.

Also check the Contact Us section for more contact details.

Projects Built using PyGAD

If you created a project that uses PyGAD, then we can support you by mentioning this project here in PyGAD's documentation.

To do that, please send a message to ahmed.f.gad@gmail.com or check the Contact Us section for more contact details.

Within your message, please send the following details:

  • Project title

  • Brief description

  • Preferably, a link that directs the readers to your project

Tutorials about PyGAD

Adaptive Mutation in Genetic Algorithm with Python Examples

In this tutorial, we'll see why mutation with a fixed number of genes is bad, and how to replace it with adaptive mutation. Using the PyGAD Python 3 library, we'll discuss a few examples that use both random and adaptive mutation.
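
As a rough, hedged illustration (not taken from the tutorial itself), the snippet below shows one way adaptive mutation is typically enabled in pygad.GA: with mutation_type="adaptive", mutation_probability is assumed to take two values, the first applied to low-quality solutions and the second to high-quality ones; the fitness function is a toy placeholder.

import numpy
import pygad

def fitness_func(ga_instance, solution, solution_idx):
    # Toy fitness: maximize the sum of the genes.
    return numpy.sum(solution)

ga_instance = pygad.GA(num_generations=100,
                       num_parents_mating=4,
                       sol_per_pop=10,
                       num_genes=5,
                       fitness_func=fitness_func,
                       mutation_type="adaptive",
                       # Two rates: the first for low-quality solutions, the second for high-quality ones.
                       mutation_probability=[0.25, 0.05])
ga_instance.run()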

Clustering Using the Genetic Algorithm in Python

This tutorial discusses how the genetic algorithm is used to cluster data, starting from random clusters and running until the optimal clusters are found. We'll start by briefly revising the K-means clustering algorithm to point out its weak points, which are later solved by the genetic algorithm. The code examples in this tutorial are implemented in Python using the PyGAD library.

Working with Different Genetic Algorithm Representations in Python

Depending on the nature of the problem being optimized, the genetic algorithm (GA) supports two different gene representations: binary and decimal. The binary GA has only two values for its genes, which are 0 and 1. This is easier to manage as its gene values are limited compared to the decimal GA, for which we can use different formats like float or integer, and limited or unlimited ranges.

This tutorial discusses how the PyGAD library supports the two GA representations, binary and decimal.
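
As a hedged sketch of how the two representations map onto pygad.GA parameters (the exact setup in the tutorial may differ), a binary-like representation can be obtained by restricting gene_space, while a decimal representation follows from the gene type and value ranges; the fitness function is a toy placeholder.

import pygad

def fitness_func(ga_instance, solution, solution_idx):
    # Toy fitness: maximize the sum of the genes.
    return sum(solution)

# Binary representation: every gene is restricted to the values 0 and 1.
binary_ga = pygad.GA(num_generations=50,
                     num_parents_mating=4,
                     sol_per_pop=8,
                     num_genes=10,
                     fitness_func=fitness_func,
                     gene_space=[0, 1],
                     gene_type=int)

# Decimal (floating-point) representation with a limited initialization range.
decimal_ga = pygad.GA(num_generations=50,
                      num_parents_mating=4,
                      sol_per_pop=8,
                      num_genes=10,
                      fitness_func=fitness_func,
                      gene_type=float,
                      init_range_low=-4,
                      init_range_high=4)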

5 Genetic Algorithm Applications Using PyGAD

This tutorial introduces PyGAD, an open-source Python library for implementing the genetic algorithm and training machine learning algorithms. PyGAD supports 19 parameters for customizing the genetic algorithm for various applications.

Within this tutorial we'll discuss 5 different applications of the genetic algorithm and build them using PyGAD.

Train Neural Networks Using a Genetic Algorithm in Python with PyGAD

The genetic algorithm (GA) is a biologically-inspired optimization algorithm. It has in recent years gained importance, as it's simple while also solving complex problems like travel route optimization, training machine learning algorithms, working with single and multi-objective problems, game playing, and more.

Deep neural networks are inspired by the idea of how the biological brain works. They are universal function approximators, capable of simulating any function, and are now used to solve the most complex problems in machine learning. What's more, they're able to work with all types of data (images, audio, video, and text).

Genetic algorithms (GAs) and neural networks (NNs) are similar in that both are biologically-inspired techniques. This similarity motivates us to create a hybrid of both to see whether a GA can train NNs with high accuracy.

This tutorial uses PyGAD, a Python library that supports building and training NNs using a GA. PyGAD offers both classification and regression NNs.

Building a Game-Playing Agent for CoinTex Using the Genetic Algorithm

In this tutorial we’ll see how to build a game-playing agent using onlythe genetic algorithm to play a game calledCoinTex,which is developed in the Kivy Python framework. The objective ofCoinTex is to collect the randomly distributed coins while avoidingcollision with fire and monsters (that move randomly). The source codeof CoinTex can be foundonGitHub.

The genetic algorithm is the only AI used here; there is no othermachine/deep learning model used with it. We’ll implement the geneticalgorithm usingPyGad.This tutorial starts with a quick overview of CoinTex followed by abrief explanation of the genetic algorithm, and how it can be used tocreate the playing agent. Finally, we’ll see how to implement theseideas in Python.

The source code of the genetic algorithm agent is availablehere,and you can download the code used in this tutorial fromhere.

How To Train Keras Models Using the Genetic Algorithm with PyGAD

PyGAD is an open-source Python library for building the genetic algorithm and training machine learning algorithms. It offers a wide range of parameters to customize the genetic algorithm to work with different types of problems.

PyGAD has its own modules that support building and training neural networks (NNs) and convolutional neural networks (CNNs). Despite these modules working well, they are implemented in Python without any additional optimization measures. This leads to comparatively high computational times for even simple problems.

The latest PyGAD version, 2.8.0 (released on 20 September 2020), supports a new module to train Keras models. Even though Keras is built in Python, it's fast. The reason is that Keras uses TensorFlow as a backend, and TensorFlow is highly optimized.

This tutorial discusses how to train Keras models using PyGAD. The discussion includes building Keras models using either the Sequential Model or the Functional API, building an initial population of Keras model parameters, creating an appropriate fitness function, and more.
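
A rough sketch of that workflow is shown below. It assumes the pygad.kerasga module (its KerasGA class and predict() helper) and the three-argument fitness signature of recent PyGAD releases; the model, data, and fitness are toy placeholders rather than the tutorial's code.

import numpy
import tensorflow.keras
import pygad
import pygad.kerasga

# Toy regression model and data (placeholders for illustration).
model = tensorflow.keras.Sequential([
    tensorflow.keras.layers.Input(shape=(3,)),
    tensorflow.keras.layers.Dense(4, activation="relu"),
    tensorflow.keras.layers.Dense(1)
])
data_inputs = numpy.random.rand(10, 3)
data_outputs = numpy.random.rand(10, 1)

# Build an initial population of flattened model weights.
keras_ga = pygad.kerasga.KerasGA(model=model, num_solutions=10)

def fitness_func(ga_instance, solution, solution_idx):
    predictions = pygad.kerasga.predict(model=model, solution=solution, data=data_inputs)
    mae = numpy.mean(numpy.abs(data_outputs - predictions))
    return 1.0 / (mae + 1e-8)  # higher fitness for lower error

ga_instance = pygad.GA(num_generations=20,
                       num_parents_mating=5,
                       initial_population=keras_ga.population_weights,
                       fitness_func=fitness_func)
ga_instance.run()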

image2

Train PyTorch Models Using Genetic Algorithm with PyGAD

PyGAD is a genetic algorithm Python 3 library for solving optimization problems. One of these problems is training machine learning algorithms.

PyGAD has a module called pygad.kerasga. It trains Keras models using the genetic algorithm. On January 3rd, 2021, a new release, PyGAD 2.10.0, brought a new module called pygad.torchga to train PyTorch models. It's very easy to use, but there are a few tricky steps.

So, in this tutorial, we'll explore how to use PyGAD to train PyTorch models.
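
The overall pattern mirrors the Keras case. The sketch below assumes the pygad.torchga module with its TorchGA class and model_weights_as_dict() helper; the model, data, and fitness function are toy placeholders rather than the tutorial's code.

import torch
import pygad
import pygad.torchga

# Toy PyTorch model and data (placeholders for illustration).
model = torch.nn.Sequential(torch.nn.Linear(3, 4), torch.nn.ReLU(), torch.nn.Linear(4, 1))
data_inputs = torch.rand(10, 3)
data_outputs = torch.rand(10, 1)

# Build an initial population from the model parameters.
torch_ga = pygad.torchga.TorchGA(model=model, num_solutions=10)

def fitness_func(ga_instance, solution, solution_idx):
    # Load the GA solution into the model as its weights.
    weights_dict = pygad.torchga.model_weights_as_dict(model=model, weights_vector=solution)
    model.load_state_dict(weights_dict)
    with torch.no_grad():
        predictions = model(data_inputs)
    mae = torch.mean(torch.abs(data_outputs - predictions)).item()
    return 1.0 / (mae + 1e-8)  # higher fitness for lower error

ga_instance = pygad.GA(num_generations=20,
                       num_parents_mating=5,
                       initial_population=torch_ga.population_weights,
                       fitness_func=fitness_func)
ga_instance.run()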

image3

A Guide to Genetic ‘Learning’ Algorithms for Optimization

PyGAD in Other Languages

French

How genetic algorithms can compete with gradient descent and backprop

Although the standard way to train neural networks is gradient descent and backpropagation, there are other players in the game. One of them is evolutionary algorithms, such as genetic algorithms.

Use a genetic algorithm to train a simple neural network to solve the OpenAI CartPole game. In this article, we will train a simple neural network to solve OpenAI CartPole. I will use PyTorch and PyGAD.

image4

Spanish

How genetic algorithms can compete with gradient descent and backprop

Although the standard way to train neural networks is gradient descent and backpropagation, there are other players in the game. One of them is evolutionary algorithms, such as genetic algorithms.

Use a genetic algorithm to train a simple neural network to solve the OpenAI CartPole game. In this article, we will train a simple neural network to solve OpenAI CartPole. I will use PyTorch and PyGAD.

image5

Korean

[PyGAD] Trying Out the Genetic Algorithm in Python

image6

I haven't tried all of the Python packages for using genetic algorithms, but this one looked extensible and I had something to try it on, so I took a look.

What impressed me most about this package is that hyperparameter search for neural networks can be done with a GA rather than only with gradient descent.

Personally, I think this can be used to find reasonably good initial values, and also as an alternative when the loss has a structure that is difficult to optimize with gradient descent.

The overall flow is as follows.

To be honest, I don't yet fully understand the flow or each parameter.

Turkish

How to Train Keras Models Using the Genetic Algorithm with PyGAD

This is a translation of an original English tutorial published at Paperspace: How To Train Keras Models Using the Genetic Algorithm with PyGAD

PyGAD is an open-source Python library used to build the genetic algorithm and train machine learning algorithms. It offers a wide range of parameters to customize the genetic algorithm to work with different problem types.

PyGAD has its own modules that support building and training neural networks (NNs) and convolutional neural networks (CNNs). Although these modules work well, they are implemented in Python without any additional optimization measures. This leads to comparatively high computation times even for simple problems.

The latest PyGAD version, 2.8.0 (released on 20 September 2020), supports a new module to train Keras models. Even though Keras is built in Python, it is fast. The reason is that Keras uses TensorFlow as a backend, and TensorFlow is highly optimized.

This tutorial explains how to train Keras models using PyGAD. The discussion includes building Keras models using either the Sequential Model or the Functional API, building an initial population of Keras model parameters, creating an appropriate fitness function, and more.

image7

Hungarian

TensorFlow Basics 10: Breeding Neural Networks with a Genetic Algorithm Using PyGAD and OpenAI Gym

To put genetic algorithms into context, let's briefly review how gradient descent and backpropagation work, which are the standard methods for training neural networks. You can read my article about this here.

For breeding the networks we use the library called PyGAD, so first of all we need to install it, along with TensorFlow and Gym, which come preinstalled in Colab.

PyGAD itself is a completely general system capable of running genetic algorithms. Its extension is KerasGA, which helps run the general engine on TensorFlow (Keras) neural networks. The KerasGA object created on line 47 is part of this extension and serves to build, from the model passed as the first parameter, a population of the size given in the second parameter. Since our network has 386 adjustable parameters, our DNA here will consist of 386 elements. The population size is 10 individuals, so our initial population will be a 10x386 matrix. We pass this in line 51 via the initial_population parameter.

image8

Russian

PyGAD: a library for implementing the genetic algorithm

PyGAD is a library for implementing the genetic algorithm. In addition, the library provides access to optimized implementations of machine learning algorithms. PyGAD is developed in Python 3.

The PyGAD library supports different types of crossover, mutation, and parent selection. PyGAD makes it possible to optimize problems using the genetic algorithm by customizing the fitness function.

Besides the genetic algorithm, the library contains optimized implementations of machine learning algorithms. At the moment, PyGAD supports building and training neural networks for classification tasks.

The library is under active development. The creators plan to add functionality for solving binary problems and to implement new algorithms.

PyGAD was developed with Python 3.7.3. Its dependencies include NumPy for creating and manipulating arrays and Matplotlib for visualization. One of the tool's use cases is optimizing weights that satisfy a given function.

image9

Research Papers using PyGAD

A number of research papers used PyGAD and here are some of them:

  • Alberto Meola, Manuel Winkler, Sören Weinrich, Metaheuristic optimization of data preparation and machine learning hyperparameters for prediction of dynamic methane production, Bioresource Technology, Volume 372, 2023, 128604, ISSN 0960-8524.

  • Jaros, Marta, and Jiri Jaros. "Performance-Cost Optimization of Moldable Scientific Workflows."

  • Thorat, Divya. "Enhanced genetic algorithm to reduce makespan of multiple jobs in map-reduce application on serverless platform". Diss. Dublin, National College of Ireland, 2020.

  • Koch, Chris, and Edgar Dobriban. "AttenGen: Generating Live Attenuated Vaccine Candidates using Machine Learning." (2021).

  • Bhardwaj, Bhavya, et al. "Windfarm optimization using Nelder-Mead and Particle Swarm optimization." 2021 7th International Conference on Electrical Energy Systems (ICEES). IEEE, 2021.

  • Bernardo, Reginald Christian S. and J. Said. "Towards a model-independent reconstruction approach for late-time Hubble data." (2021).

  • Duong, Tri Dung, Qian Li, and Guandong Xu. "Prototype-based Counterfactual Explanation for Causal Classification." arXiv preprint arXiv:2105.00703 (2021).

  • Farrag, Tamer Ahmed, and Ehab E. Elattar. "Optimized Deep Stacked Long Short-Term Memory Network for Long-Term Load Forecasting." IEEE Access 9 (2021): 68511-68522.

  • Antunes, E. D. O., Caetano, M. F., Marotta, M. A., Araujo, A., Bondan, L., Meneguette, R. I., & Rocha Filho, G. P. (2021, August). Soluções Otimizadas para o Problema de Localização de Máxima Cobertura em Redes Militarizadas 4G/LTE. In Anais do XXVI Workshop de Gerência e Operação de Redes e Serviços (pp. 152-165). SBC.

  • M. Yani, F. Ardilla, A. A. Saputra and N. Kubota, "Gradient-Free Deep Q-Networks Reinforcement learning: Benchmark and Evaluation," 2021 IEEE Symposium Series on Computational Intelligence (SSCI), 2021, pp. 1-5, doi: 10.1109/SSCI50451.2021.9659941.

  • Yani, Mohamad, and Naoyuki Kubota. "Deep Convolutional Networks with Genetic Algorithm for Reinforcement Learning Problem."

  • Mahendra, Muhammad Ihza, and Isman Kurniawan. "Optimizing Convolutional Neural Network by Using Genetic Algorithm for COVID-19 Detection in Chest X-Ray Image." 2021 International Conference on Data Science and Its Applications (ICoDSA). IEEE, 2021.

  • Glibota, Vjeko. Umjeravanje mikroskopskog prometnog modela primjenom genetskog algoritma. Diss. University of Zagreb. Faculty of Transport and Traffic Sciences. Division of Intelligent Transport Systems and Logistics. Department of Intelligent Transport Systems, 2021.

  • Zhu, Mingda. Genetic Algorithm-based Parameter Identification for Ship Manoeuvring Model under Wind Disturbance. MS thesis. NTNU, 2021.

  • Abdalrahman, Ahmed, and Weihua Zhuang. "Dynamic pricing for differentiated PEV charging services using deep reinforcement learning." IEEE Transactions on Intelligent Transportation Systems (2020).

More Links

https://rodriguezanton.com/identifying-contact-states-for-2d-objects-using-pygad-and/

https://torvaney.github.io/projects/t9-optimised

For More Information

There are different resources that can be used to get started with the genetic algorithm and building it in Python.

Tutorial: Implementing Genetic Algorithm in Python

To start with coding the genetic algorithm, you can check the tutorial titled Genetic Algorithm Implementation in Python, available at these links:

This tutorial is prepared based on a previous version of the project, but it is still a good resource to start with coding the genetic algorithm.

image10

Tutorial: Introduction to Genetic Algorithm

Get started with the genetic algorithm by reading the tutorial titled Introduction to Optimization with Genetic Algorithm, which is available at these links:

image11

Tutorial: Build Neural Networks in Python

Read about building neural networks in Python through the tutorial titled Artificial Neural Network Implementation using NumPy and Classification of the Fruits360 Image Dataset, available at these links:

image12

Tutorial: Optimize Neural Networks with Genetic Algorithm

Read about training neural networks using the genetic algorithm through the tutorial titled Artificial Neural Networks Optimization using Genetic Algorithm with Python, available at these links:

image13

Tutorial: Building CNN in Python

To start with coding convolutional neural networks (CNNs), you can check the tutorial titled Building Convolutional Neural Network using NumPy from Scratch, available at these links:

This tutorial is prepared based on a previous version of the project, but it is still a good resource to start with coding CNNs.

image14

Tutorial: Derivation of CNN from FCNN

Learn how a convolutional neural network is derived from a fully connected network by reading the tutorial titled Derivation of Convolutional Neural Network from Fully Connected Network Step-By-Step, which is available at these links:

image15

Book: Practical Computer Vision Applications Using Deep Learning with CNNs

You can also check my book, cited as Ahmed Fawzy Gad 'Practical Computer Vision Applications Using Deep Learning with CNNs'. Dec. 2018, Apress, 978-1-4842-4167-7, which discusses neural networks, convolutional neural networks, deep learning, genetic algorithm, and more.

Find the book at these links:

image16

Contact Us

image17

Thank you for using PyGAD :)