jczic/MicroMLP

A micro neural network multilayer perceptron for MicroPython (used on ESP32 and Pycom modules)

Very easy to integrate and very lightweight, with only one file:

  • "microMLP.py"

MicroMLP features:

  • Modifiable multilayer and connection structure
  • Integrated bias on neurons
  • Connection plasticity included
  • Activation functions by layer
  • Parameters Alpha, Eta and Gain
  • Management of example sets and learning
  • QLearning functions for reinforcement learning
  • Save and load of the whole structure to/from a JSON file
  • Various activation functions:
    • Heaviside binary step
    • Logistic (sigmoid or soft step)
    • Hyperbolic tangent
    • SoftPlus rectifier
    • ReLU (rectified linear unit)
    • Gaussian function

Use deep learning for:

  • Signal processing (speech processing, identification, filtering)
  • Image processing (compression, recognition, patterns)
  • Control (diagnosis, quality control, robotics)
  • Optimization (planning, traffic regulation, finance)
  • Simulation (black box simulation)
  • Classification (DNA analysis)
  • Approximation (unknown function, complex function)

Using MicroMLP static functions:

| Name | Function |
| ---- | -------- |
| Create | `mlp = MicroMLP.Create(neuronsByLayers, activationFuncName, layersAutoConnectFunction=None, useBiasValue=1.0)` |
| LoadFromFile | `mlp = MicroMLP.LoadFromFile(filename)` |
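
A minimal sketch of how these two static entry points can be combined, to create a network, save it and reload it later (the "mlp.json" file name is only an illustration):

```python
from microMLP import MicroMLP

# Create a 3-10-2 network, fully connected, with the sigmoid activation function.
mlp = MicroMLP.Create([3, 10, 2], "Sigmoid", MicroMLP.LayersFullConnect)

# Persist the whole structure to a JSON file, then reload it.
if mlp.SaveToFile("mlp.json"):
    mlp2 = MicroMLP.LoadFromFile("mlp.json")
    print(mlp2.IsNetworkComplete)
```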

Using MicroMLP for quick creation of a neural network:

```python
from microMLP import MicroMLP
mlp = MicroMLP.Create([3, 10, 2], "Sigmoid", MicroMLP.LayersFullConnect)
```

Using MicroMLP main class:

| Name | Function |
| ---- | -------- |
| Constructor | `mlp = MicroMLP()` |
| GetLayer | `layer = mlp.GetLayer(layerIndex)` |
| GetLayerIndex | `idx = mlp.GetLayerIndex(layer)` |
| RemoveLayer | `mlp.RemoveLayer(layer)` |
| GetInputLayer | `inputLayer = mlp.GetInputLayer()` |
| GetOutputLayer | `outputLayer = mlp.GetOutputLayer()` |
| Learn | `ok = mlp.Learn(inputVectorNNValues, targetVectorNNValues)` |
| Test | `ok = mlp.Test(inputVectorNNValues, targetVectorNNValues)` |
| Predict | `outputVectorNNValues = mlp.Predict(inputVectorNNValues)` |
| QLearningLearnForChosenAction | `ok = mlp.QLearningLearnForChosenAction(stateVectorNNValues, rewardNNValue, pastStateVectorNNValues, chosenActionIndex, terminalState=True, discountFactorNNValue=None)` |
| QLearningPredictBestActionIndex | `bestActionIndex = mlp.QLearningPredictBestActionIndex(stateVectorNNValues)` |
| SaveToFile | `ok = mlp.SaveToFile(filename)` |
| AddExample | `ok = mlp.AddExample(inputVectorNNValues, targetVectorNNValues)` |
| ClearExamples | `mlp.ClearExamples()` |
| LearnExamples | `learnCount = mlp.LearnExamples(maxSeconds=30, maxCount=None, stopWhenLearned=True, printMAEAverage=True)` |

| Property | Example | Read/Write |
| -------- | ------- | ---------- |
| Layers | `mlp.Layers` | get |
| LayersCount | `mlp.LayersCount` | get |
| IsNetworkComplete | `mlp.IsNetworkComplete` | get |
| MSE | `mlp.MSE` | get |
| MAE | `mlp.MAE` | get |
| MSEPercent | `mlp.MSEPercent` | get |
| MAEPercent | `mlp.MAEPercent` | get |
| ExamplesCount | `mlp.ExamplesCount` | get |
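
The QLearning helpers are only listed above; the sketch below shows one plausible way to use them, assuming one output neuron per possible action and states encoded as NNValue vectors (the network shape, state values and reward are illustrative, not taken from the library's documentation):

```python
from microMLP import MicroMLP

# Hypothetical setup: 4 state inputs, 2 possible actions (one output neuron per action).
mlp = MicroMLP.Create([4, 8, 2], "Sigmoid", MicroMLP.LayersFullConnect)

# Encode the previous state and ask the network which action currently looks best.
pastState   = [MicroMLP.NNValue.FromAnalogSignal(x) for x in (0.1, 0.5, 0.9, 0.3)]
actionIndex = mlp.QLearningPredictBestActionIndex(pastState)

# ...apply the action in the environment, observe the new state and a reward...
newState = [MicroMLP.NNValue.FromAnalogSignal(x) for x in (0.2, 0.4, 0.8, 0.3)]
reward   = MicroMLP.NNValue.FromPercent(75)

# Learn from the transition (pastState, actionIndex) -> (newState, reward).
ok = mlp.QLearningLearnForChosenAction(newState, reward, pastState, actionIndex,
                                       terminalState=False)
```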

Using MicroMLP to learn the XOr problem (with hyperbolic tangent):

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create( neuronsByLayers           = [2, 2, 1],
                       activationFuncName        = MicroMLP.ACTFUNC_TANH,
                       layersAutoConnectFunction = MicroMLP.LayersFullConnect )

nnFalse = MicroMLP.NNValue.FromBool(False)
nnTrue  = MicroMLP.NNValue.FromBool(True)

mlp.AddExample( [nnFalse, nnFalse], [nnFalse] )
mlp.AddExample( [nnFalse, nnTrue ], [nnTrue ] )
mlp.AddExample( [nnTrue , nnTrue ], [nnFalse] )
mlp.AddExample( [nnTrue , nnFalse], [nnTrue ] )

learnCount = mlp.LearnExamples()

print( "LEARNED :" )
print( "  - False xor False = %s" % mlp.Predict([nnFalse, nnFalse])[0].AsBool )
print( "  - False xor True  = %s" % mlp.Predict([nnFalse, nnTrue ])[0].AsBool )
print( "  - True  xor True  = %s" % mlp.Predict([nnTrue , nnTrue ])[0].AsBool )
print( "  - True  xor False = %s" % mlp.Predict([nnTrue , nnFalse])[0].AsBool )

if mlp.SaveToFile("mlp.json"):
    print( "MicroMLP structure saved!" )
```

| Variable | Description | Default |
| -------- | ----------- | ------- |
| `mlp.Eta` | Weighting of the error correction | 0.30 |
| `mlp.Alpha` | Strength of connection plasticity | 0.75 |
| `mlp.Gain` | Network learning gain | 0.99 |
| `mlp.CorrectLearnedMAE` | Threshold of self-learning error | 0.02 |
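
These variables can be adjusted on the instance before training; the values below are illustrative, not recommendations:

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create([2, 2, 1], "TanH", MicroMLP.LayersFullConnect)

# Adjust the learning behaviour before calling Learn()/LearnExamples().
mlp.Eta               = 0.20   # weighting of the error correction
mlp.Alpha             = 0.80   # strength of connection plasticity
mlp.Gain              = 0.99   # network learning gain
mlp.CorrectLearnedMAE = 0.01   # MAE threshold under which an example counts as learned
```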

| Activation function name | Const | Detail |
| ------------------------ | ----- | ------ |
| `"Heaviside"` | `MicroMLP.ACTFUNC_HEAVISIDE` | Heaviside binary step |
| `"Sigmoid"` | `MicroMLP.ACTFUNC_SIGMOID` | Logistic (sigmoid or soft step) |
| `"TanH"` | `MicroMLP.ACTFUNC_TANH` | Hyperbolic tangent |
| `"SoftPlus"` | `MicroMLP.ACTFUNC_SOFTPLUS` | SoftPlus rectifier |
| `"ReLU"` | `MicroMLP.ACTFUNC_RELU` | Rectified linear unit |
| `"Gaussian"` | `MicroMLP.ACTFUNC_GAUSSIAN` | Gaussian function |

| Layers auto-connect function | Detail |
| ---------------------------- | ------ |
| `MicroMLP.LayersFullConnect` | Network fully connected |
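
A small sketch, assuming (as the table above and the examples suggest) that the `ACTFUNC_*` constants simply hold the corresponding name strings, so both forms are interchangeable:

```python
from microMLP import MicroMLP

# The activation function can be passed either as its name string
# or through the matching MicroMLP.ACTFUNC_* constant.
mlp1 = MicroMLP.Create([3, 5, 1], "ReLU",                MicroMLP.LayersFullConnect)
mlp2 = MicroMLP.Create([3, 5, 1], MicroMLP.ACTFUNC_RELU, MicroMLP.LayersFullConnect)
```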

Using MicroMLP.Layer class:

| Name | Function |
| ---- | -------- |
| Constructor | `layer = MicroMLP.Layer(parentMicroMLP, activationFuncName=None, neuronsCount=0)` |
| GetLayerIndex | `idx = layer.GetLayerIndex()` |
| GetNeuron | `neuron = layer.GetNeuron(neuronIndex)` |
| GetNeuronIndex | `idx = layer.GetNeuronIndex(neuron)` |
| AddNeuron | `layer.AddNeuron(neuron)` |
| RemoveNeuron | `layer.RemoveNeuron(neuron)` |
| GetMeanSquareError | `mse = layer.GetMeanSquareError()` |
| GetMeanAbsoluteError | `mae = layer.GetMeanAbsoluteError()` |
| GetMeanSquareErrorAsPercent | `mseP = layer.GetMeanSquareErrorAsPercent()` |
| GetMeanAbsoluteErrorAsPercent | `maeP = layer.GetMeanAbsoluteErrorAsPercent()` |
| Remove | `layer.Remove()` |

| Property | Example | Read/Write |
| -------- | ------- | ---------- |
| ParentMicroMLP | `layer.ParentMicroMLP` | get |
| ActivationFuncName | `layer.ActivationFuncName` | get |
| Neurons | `layer.Neurons` | get |
| NeuronsCount | `layer.NeuronsCount` | get |
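
A short sketch walking a freshly created network layer by layer (note that the input layer's ActivationFuncName may be None):

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create([2, 3, 1], "Sigmoid", MicroMLP.LayersFullConnect)

# Inspect the structure layer by layer.
for layer in mlp.Layers:
    print("Layer %s: %s neuron(s), activation = %s"
          % (layer.GetLayerIndex(), layer.NeuronsCount, layer.ActivationFuncName))
```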

Using MicroMLP.InputLayer(Layer) class:

| Name | Function |
| ---- | -------- |
| Constructor | `inputLayer = MicroMLP.InputLayer(parentMicroMLP, neuronsCount=0)` |
| SetInputVectorNNValues | `ok = inputLayer.SetInputVectorNNValues(inputVectorNNValues)` |
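
Predict()/Learn() normally feed the input layer for you; the sketch below only illustrates doing it by hand:

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create([2, 3, 1], "Sigmoid", MicroMLP.LayersFullConnect)

# Push a 2-value input vector directly into the input layer.
inputLayer = mlp.GetInputLayer()
ok = inputLayer.SetInputVectorNNValues([ MicroMLP.NNValue.FromBool(True),
                                         MicroMLP.NNValue.FromBool(False) ])
print(ok)
```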

Using MicroMLP.OutputLayer(Layer) class:

| Name | Function |
| ---- | -------- |
| Constructor | `outputLayer = MicroMLP.OutputLayer(parentMicroMLP, activationFuncName, neuronsCount=0)` |
| GetOutputVectorNNValues | `outputVectorNNValues = outputLayer.GetOutputVectorNNValues()` |
| ComputeTargetLayerError | `ok = outputLayer.ComputeTargetLayerError(targetVectorNNValues)` |
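
A small sketch reading the computed outputs straight from the output layer after a prediction:

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create([2, 3, 1], "Sigmoid", MicroMLP.LayersFullConnect)
mlp.Predict([MicroMLP.NNValue.FromBool(True), MicroMLP.NNValue.FromBool(False)])

# The output layer exposes the last computed values as NNValues.
outputLayer = mlp.GetOutputLayer()
for nnvalue in outputLayer.GetOutputVectorNNValues():
    print(nnvalue.AsAnalogSignal)
```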

Using MicroMLP.Neuron class:

| Name | Function |
| ---- | -------- |
| Constructor | `neuron = MicroMLP.Neuron(parentLayer)` |
| GetNeuronIndex | `idx = neuron.GetNeuronIndex()` |
| GetInputConnections | `connections = neuron.GetInputConnections()` |
| GetOutputConnections | `connections = neuron.GetOutputConnections()` |
| AddInputConnection | `neuron.AddInputConnection(connection)` |
| AddOutputConnection | `neuron.AddOutputConnection(connection)` |
| RemoveInputConnection | `neuron.RemoveInputConnection(connection)` |
| RemoveOutputConnection | `neuron.RemoveOutputConnection(connection)` |
| SetBias | `neuron.SetBias(bias)` |
| GetBias | `neuron.GetBias()` |
| SetOutputNNValue | `neuron.SetOutputNNValue(nnvalue)` |
| ComputeValue | `neuron.ComputeValue()` |
| ComputeError | `neuron.ComputeError(targetNNValue=None)` |
| Remove | `neuron.Remove()` |

| Property | Example | Read/Write |
| -------- | ------- | ---------- |
| ParentLayer | `neuron.ParentLayer` | get |
| ComputedOutput | `neuron.ComputedOutput` | get |
| ComputedDeltaError | `neuron.ComputedDeltaError` | get |
| ComputedSignalError | `neuron.ComputedSignalError` | get |
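
A sketch inspecting one hidden neuron; it assumes the connection getters return ordinary lists:

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create([2, 3, 1], "Sigmoid", MicroMLP.LayersFullConnect)

# Look at the first neuron of the hidden layer (layer index 1).
neuron = mlp.GetLayer(1).GetNeuron(0)
print("Neuron index       :", neuron.GetNeuronIndex())
print("Input connections  :", len(neuron.GetInputConnections()))
print("Output connections :", len(neuron.GetOutputConnections()))
print("Computed output    :", neuron.ComputedOutput)
```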

Using MicroMLP.Connection class:

| Name | Function |
| ---- | -------- |
| Constructor | `connection = MicroMLP.Connection(neuronSrc, neuronDst, weight=None)` |
| UpdateWeight | `connection.UpdateWeight(eta, alpha)` |
| Remove | `connection.Remove()` |

| Property | Example | Read/Write |
| -------- | ------- | ---------- |
| NeuronSrc | `connection.NeuronSrc` | get |
| NeuronDst | `connection.NeuronDst` | get |
| Weight | `connection.Weight` | get |
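
For custom topologies, connections can be created one by one instead of using MicroMLP.LayersFullConnect. The sketch below assumes that the Connection constructor registers itself on both neurons (otherwise AddOutputConnection()/AddInputConnection() would have to be called explicitly) and that weight=None gives a default weight:

```python
from microMLP import MicroMLP

# Create a network without an auto-connect function, then wire two neurons by hand.
mlp = MicroMLP.Create([2, 2, 1], "Sigmoid")

srcNeuron = mlp.GetLayer(0).GetNeuron(0)
dstNeuron = mlp.GetLayer(1).GetNeuron(0)

connection = MicroMLP.Connection(srcNeuron, dstNeuron)
print(connection.Weight)
```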

Using MicroMLP.Bias class:

| Name | Function |
| ---- | -------- |
| Constructor | `bias = MicroMLP.Bias(neuronDst, value=1.0, weight=None)` |
| UpdateWeight | `bias.UpdateWeight(eta, alpha)` |
| Remove | `bias.Remove()` |

| Property | Example | Read/Write |
| -------- | ------- | ---------- |
| NeuronDst | `bias.NeuronDst` | get |
| Value | `bias.Value` | get |
| Weight | `bias.Weight` | get |
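
Create() attaches an integrated bias (useBiasValue=1.0 by default) to the neurons; the sketch below simply reads it back, assuming GetBias() returns the Bias object (or None when there is none):

```python
from microMLP import MicroMLP

mlp = MicroMLP.Create([2, 2, 1], "Sigmoid", MicroMLP.LayersFullConnect)

# Read the integrated bias of a hidden neuron.
neuron = mlp.GetLayer(1).GetNeuron(0)
bias   = neuron.GetBias()
if bias:
    print("bias value  :", bias.Value)
    print("bias weight :", bias.Weight)
```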

Using MicroMLP.NNValue static functions:

| Name | Function |
| ---- | -------- |
| FromPercent | `nnvalue = MicroMLP.NNValue.FromPercent(value)` |
| NewPercent | `nnvalue = MicroMLP.NNValue.NewPercent()` |
| FromByte | `nnvalue = MicroMLP.NNValue.FromByte(value)` |
| NewByte | `nnvalue = MicroMLP.NNValue.NewByte()` |
| FromBool | `nnvalue = MicroMLP.NNValue.FromBool(value)` |
| NewBool | `nnvalue = MicroMLP.NNValue.NewBool()` |
| FromAnalogSignal | `nnvalue = MicroMLP.NNValue.FromAnalogSignal(value)` |
| NewAnalogSignal | `nnvalue = MicroMLP.NNValue.NewAnalogSignal()` |
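
A sketch building NNValues from the different supported source types (the byte form follows the b'\x75' style shown for AsByte below; the exact expected argument type for FromByte is an assumption here):

```python
from microMLP import MicroMLP

# Four ways to build an NNValue, one per supported source type.
v1 = MicroMLP.NNValue.FromPercent(50)         # a percentage (0-100)
v2 = MicroMLP.NNValue.FromByte(b'\x80')       # a single byte
v3 = MicroMLP.NNValue.FromBool(True)          # a boolean
v4 = MicroMLP.NNValue.FromAnalogSignal(0.5)   # an analog signal (0.0-1.0)

print(v1.AsAnalogSignal, v2.AsPercent, v3.AsBool, v4.AsByte)
```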

Using MicroMLP.NNValue class:

| Name | Function |
| ---- | -------- |
| Constructor | `nnvalue = MicroMLP.NNValue(minValue, maxValue, value)` |

| Property | Example | Read/Write |
| -------- | ------- | ---------- |
| AsFloat | `nnvalue.AsFloat = 639.513` | get / set |
| AsInt | `nnvalue.AsInt = 12345` | get / set |
| AsPercent | `nnvalue.AsPercent = 65` | get / set |
| AsByte | `nnvalue.AsByte = b'\x75'` | get / set |
| AsBool | `nnvalue.AsBool = True` | get / set |
| AsAnalogSignal | `nnvalue.AsAnalogSignal = 0.39472` | get / set |
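
A sketch showing the constructor with a custom range and the converting properties (the temperature range is only an illustration):

```python
from microMLP import MicroMLP

# Map a value from a custom range (here -20..40) onto the network scale.
temperature = MicroMLP.NNValue(-20, 40, 25)

print(temperature.AsFloat)          # the value in its own range
print(temperature.AsPercent)        # the same value as a percentage of the range
print(temperature.AsAnalogSignal)   # the normalized form used as a network input/output

# The setters convert in the opposite direction.
temperature.AsPercent = 50
print(temperature.AsFloat)
```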

By JC`zic for HC² ;')

Keep it simple, stupid 👍
