Simple unsupervised machine learning package using Go 1.18 generics.
μ8 (mu8) uses a simple genetic algorithm implementation to optimize an objective function. It allows optimizing floating point numbers, integers and anything else that can implement the three-method `Gene` interface.
The genetic algorithm implementation is currently ~150 lines long and is contained in `population.go`. It consists of the following steps (a conceptual sketch of one generation follows the list):
- Natural selection. Best individual conserved (population champion)
- Mate.
- Mutate babies.
- Rinse and repeat.
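To make these steps concrete, here is a deliberately tiny, self-contained toy sketch of one generation. It is **not** the code in `population.go` (the real implementation works on the `Genome`/`Gene` interfaces described next and is more careful about selection and crossover); it only illustrates the loop structure.

```go
package main

import (
	"fmt"
	"math"
	"math/rand"
	"sort"
)

// indiv is a toy individual: a slice of values in [0, 1] whose fitness is their sum.
type indiv []float64

func fitness(in indiv) (f float64) {
	for _, v := range in {
		f += v
	}
	return f
}

// advance performs one toy generation following the steps above.
func advance(pop []indiv, rng *rand.Rand) {
	// Natural selection: sort by fitness; the champion at index 0 is left untouched.
	sort.Slice(pop, func(i, j int) bool { return fitness(pop[i]) > fitness(pop[j]) })
	for i := 1; i < len(pop); i++ {
		// Mate: each child mixes its genes with those of a fitter parent.
		parent := pop[rng.Intn(i)]
		for g := range pop[i] {
			if rng.Float64() < 0.5 {
				pop[i][g] = parent[g]
			}
			// Mutate babies: small random perturbation, clamped back to [0, 1].
			pop[i][g] += 0.05 * (rng.Float64() - 0.5)
			pop[i][g] = math.Max(0, math.Min(1, pop[i][g]))
		}
	}
	// Rinse and repeat: call advance again on the updated population.
}

func main() {
	rng := rand.New(rand.NewSource(1))
	pop := make([]indiv, 50)
	for i := range pop {
		pop[i] = indiv{rng.Float64(), rng.Float64(), rng.Float64()}
	}
	for gen := 0; gen < 100; gen++ {
		advance(pop, rng)
	}
	sort.Slice(pop, func(i, j int) bool { return fitness(pop[i]) > fitness(pop[j]) })
	fmt.Printf("best toy fitness: %.3f\n", fitness(pop[0])) // should approach 3.0
}
```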
The file `mu8.go` contains the `Genome` and `Gene` interface definitions. Users should implement the `Genome` interface and use `Gene` implementations from the `genes` package.
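As a rough sketch, reconstructed from the examples later in this README rather than copied from `mu8.go` (consult that file for the authoritative definitions), the interfaces look approximately like this:

```go
// Approximate shape of the interfaces, inferred from the examples in this
// README; see mu8.go for the real definitions.
package mu8

import "math/rand"

// Genome is what users implement: a collection of genes plus a simulation
// that scores the whole individual with a non-negative fitness.
type Genome interface {
	// GetGene returns the i-th gene of the genome.
	GetGene(i int) Gene
	// Len returns the number of genes in the genome.
	Len() int
	// Simulate runs the individual and returns its fitness.
	// (The gradient example further down passes a context.Context here.)
	Simulate() (fitness float64)
}

// Gene is the unit of heredity the algorithm manipulates. Mutate is used in
// the examples below; the remaining methods of the real three-method
// interface handle copying/recombining genetic material (see mu8.go).
type Gene interface {
	Mutate(rng *rand.Rand)
	// ...two more methods; see mu8.go.
}
```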
There is an Islands Model Genetic Algorithm (IMGA) implementation in `islands.go` using the `Islands` type, which runs the optimization in parallel to take advantage of multi-core machines.
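For intuition only, the sketch below shows the islands idea by hand: several isolated populations evolved concurrently, with their champions compared at the end. It reuses the `mygenome` type and the `Population` calls from the example further down; the real `Islands` type additionally exchanges individuals between islands, so prefer it over hand-rolling this.

```go
const Nislands = 4
ctx := context.Background()
best := make([]float64, Nislands)
var wg sync.WaitGroup
for island := 0; island < Nislands; island++ {
	island := island
	wg.Add(1)
	go func() {
		defer wg.Done()
		src := rand.NewSource(int64(island)) // independent randomness per island
		individuals := make([]*mygenome, 100)
		for i := range individuals {
			g := newGenome(3)
			mu8.Mutate(g, src, 0.1)
			individuals[i] = g
		}
		pop := genetic.NewPopulation(individuals, src, func() *mygenome { return newGenome(3) })
		for gen := 0; gen < 100; gen++ {
			if err := pop.Advance(ctx); err != nil {
				panic(err)
			}
			if err := pop.Selection(0.5, 1); err != nil {
				panic(err)
			}
		}
		best[island] = pop.ChampionFitness()
	}()
}
wg.Wait()
fmt.Println("per-island champion fitness:", best)
```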
Everything starts with the `mu8.Genome` interface on the user side. We define a type that implements it using the helper type `genes.ConstrainedFloat` from the `genes` package. All this `genes` type does is save us the trouble of writing our own `mu8.Gene` implementation.
```go
type mygenome struct {
	genoma []genes.ConstrainedFloat
}

func (g *mygenome) GetGene(i int) mu8.Gene { return &g.genoma[i] }
func (g *mygenome) Len() int               { return len(g.genoma) }

// Simulate simply adds the genes. We'd expect the genes to reach the max values of the constraint.
func (g *mygenome) Simulate() (fitness float64) {
	for i := range g.genoma {
		fitness += g.genoma[i].Value()
	}
	// fitness must ALWAYS be greater than zero for successful simulation.
	return math.Max(0, fitness/float64(g.Len()))
}
```
We're almost ready to optimize our implementation to maximize its fitness, which is simply the sum of all its genes.
Let's write the function that initializes a blank-slate `mygenome`:
```go
func newGenome(n int) *mygenome {
	return &mygenome{genoma: make([]genes.ConstrainedFloat, n)}
}
```
The function above may be confusing... what is the constraint on the number? By default `genes.ConstrainedFloat` uses the range [0, 1].
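If a different range is needed, the `genes` package also provides constructors for bounded floats. The call below is only an assumed shape of such a constructor (a gene starting at 0, constrained to [-10, 10]); check the `genes` package documentation for the exact name and argument order:

```go
// Assumed constructor shape; verify against the genes package docs.
gene := genes.NewConstrainedFloat(0, -10, 10) // start value 0, bounds [-10, 10]
_ = gene
```

With the genome type in place, we create a population of randomized individuals and evolve it for a fixed number of generations: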
```go
const Nindividuals = 100
individuals := make([]*mygenome, Nindividuals)
for i := 0; i < Nindividuals; i++ {
	genome := newGenome(genomelen)
	// This spices up the initial population so fitnesses are not all zero.
	mu8.Mutate(genome, src, .1)
	individuals[i] = genome
}

pop := genetic.NewPopulation(individuals, rand.NewSource(1), func() *mygenome {
	return newGenome(3)
})

const Ngenerations = 100
ctx := context.Background()
for i := 0; i < Ngenerations; i++ {
	err := pop.Advance(ctx)
	if err != nil {
		panic(err.Error())
	}
	err = pop.Selection(0.5, 1)
	if err != nil {
		panic(err.Error())
	}
}
fmt.Printf("champ fitness=%.3f\n", pop.ChampionFitness())
```
The final fitness should be close to 1.0 if the algorithm did its job. For the full code see `mu8_test.go`.
See the `rocket` example for a demonstration of rocket stage optimization. Below is the output of said program:
```
champHeight:117.967km
champHeight:136.748km
champHeight:140.633km
champHeight:141.873km
champHeight:141.873km
champHeight:141.873km
champHeight:142.883km
champHeight:143.292km
champHeight:143.292km
champHeight:143.292km
champHeight:143.292km
our champion:
Stage 0: coast=281.2s, propMass=0.0kg, Δm=99.35kg/s, totalMass=200.0
Stage 1: coast=0.0s, propMass=1.6kg, Δm=0.01kg/s, totalMass=21.6
```
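μ8 also provides gradient-based optimization through `mu8.Gradient` for genomes whose genes expose gradient information (the `GenomeGrad` implementation referenced in the comments below is not shown in this README). The snippet runs a few epochs of gradient descent, nudging each gene along its gradient: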
```go
src := rand.NewSource(1)
const (
	genomelen      = 6
	gradMultiplier = 10.0
	epochs         = 6
)
// Create new individual and mutate it randomly.
individual := newGenome(genomelen)
rng := rand.New(src)
for i := 0; i < genomelen; i++ {
	individual.GetGene(i).Mutate(rng)
}

// Prepare for gradient descent.
grads := make([]float64, genomelen)
ctx := context.Background()
// Champion will harbor our best individual.
champion := newGenome(genomelen)

for epoch := 0; epoch < epochs; epoch++ {
	// We calculate the gradients of the individual passing a nil
	// newIndividual callback since the GenomeGrad type we implemented
	// does not require blank-slate initialization.
	err := mu8.Gradient(ctx, grads, individual, nil)
	if err != nil {
		panic(err)
	}
	// Apply gradients.
	for i := 0; i < individual.Len(); i++ {
		gene := individual.GetGeneGrad(i)
		grad := grads[i]
		gene.SetValue(gene.Value() + grad*gradMultiplier)
	}
	mu8.CloneGrad(champion, individual)
	fmt.Printf("fitness=%f with grads=%f\n", individual.Simulate(ctx), grads)
}
```
Contributions are very welcome! I myself have no idea what I'm doing, so I welcome issues on any matter :)
Pull requests are also welcome, but please submit an issue first on what you'd like to change. I promise I'll answer as fast as I can.
Please take a look at the TODOs in the project: Ctrl+F `TODO`.
Inspired by CodeBullet's amazing video on the subject.
Gopher rendition by Juliette Whittingslow.
The Gopher design authored by Renée French is licensed under the Creative Commons Attribution 3.0 license.