🧠 A Feed-Forward Neural Network (FNN) from scratch in Rust

This repository contains a minimal but fully functioning Feed-Forward Neural Network (FNN) implemented in the Rust programming language. The network is trained with batch gradient descent via backpropagation; it is single-threaded and runs entirely on the CPU.

The implementation uses the mean squared error loss function and includes two activation functions: Sigmoid and Rectified Linear Unit (ReLU).
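
For reference, the mean squared error on a single output and its derivative look like the following minimal sketch (written as free functions here; the repository wraps loss functions in a trait, shown under Usage below, and may scale the error by 1/2 so the derivative loses its factor of 2):

```rust
// Squared error for a single scalar output, and its derivative with
// respect to the output. A reference sketch, not the repository's
// exact code.
fn mse(target: f64, output: f64) -> f64 {
    (target - output).powi(2)
}

fn mse_derivative(target: f64, output: f64) -> f64 {
    2.0 * (output - target)
}
```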

Goal

The goal of this project is to get a better understanding of neural networks by creating one from scratch.

Getting Started

To get a copy of this project up and running on your local machine, you will need Rust.

Clone this repository:

```sh
git clone https://github.com/w4g1/neural-network-rust.git
```

Go into the repository:

```sh
cd neural-network-rust
```

Run XOR example:

```sh
cargo run --example xor --release
```

Example output:

```
Epoch: 250000  Error: 0.004982472275727541
Epoch: 500000  Error: 0.0022680697570409874
Epoch: 750000  Error: 0.0014475361058490137
Epoch: 1000000 Error: 0.0010574201380490365
Epoch: 1250000 Error: 0.0008307775961258309
Training completed in 844.0897ms
0 XOR 0 = 0.0364493729489178
0 XOR 1 = 0.9629761743234105
1 XOR 0 = 0.9597184054455132
1 XOR 1 = 0.040673502104589074
```

Usage

This implementation abstracts loss and activation functions into separate traits, providing flexibility:

```rust
trait LossFunction {
    fn compute(&self, target: f64, output: f64) -> f64;
    fn derivative(&self, target: f64, output: f64) -> f64;
}

trait ActivationFunction {
    fn compute(&self, input: f64) -> f64;
    fn derivative(&self, output: f64) -> f64;
}
```
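
For illustration, the two bundled activations could implement this trait along the lines of the sketch below; the exact method bodies in the repository may differ in detail:

```rust
struct Sigmoid;

impl ActivationFunction for Sigmoid {
    fn compute(&self, input: f64) -> f64 {
        1.0 / (1.0 + (-input).exp())
    }

    // The trait hands back the neuron's already-computed output, so the
    // sigmoid derivative reduces to output * (1 - output).
    fn derivative(&self, output: f64) -> f64 {
        output * (1.0 - output)
    }
}

struct ReLU;

impl ActivationFunction for ReLU {
    fn compute(&self, input: f64) -> f64 {
        input.max(0.0)
    }

    // 1 where the unit is active, 0 where it is clamped.
    fn derivative(&self, output: f64) -> f64 {
        if output > 0.0 { 1.0 } else { 0.0 }
    }
}
```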

A `Neuron` struct supports an arbitrary number of inputs and keeps track of its output, its error, and the activation function it should use. Various other structural elements, such as `Connection`, `Layer`, `NeuralNetworkConfig`, `TrainConfig`, and `NeuralNetwork`, are defined to make up a fully working feed-forward neural network.
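
As a rough picture of how these pieces fit together, a `Neuron` along the lines described above might be declared as follows; the field names and types here are illustrative assumptions, not the repository's exact definition:

```rust
// Illustrative sketch only: field names and types are guesses based on
// the description above, not the repository's actual definition.
struct Neuron<'a> {
    // One weighted connection per input from the previous layer.
    connections: Vec<Connection>,
    // Cached output of the most recent forward pass.
    output: f64,
    // Error term accumulated during backpropagation.
    error: f64,
    // Activation function used by this neuron.
    activation: &'a Activation,
}
```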

To build a simple neural network that learns the XOR logic function, use the following:

```rust
let mut network = NeuralNetwork::new(
    vec![
        Layer::new(2, &Activation::Sigmoid(Sigmoid)),
        Layer::new(5, &Activation::Sigmoid(Sigmoid)),
        Layer::new(1, &Activation::Sigmoid(Sigmoid)),
    ],
    NeuralNetworkConfig {
        loss_function: &Loss::MeanSquaredError(MeanSquaredError),
    },
);
```
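
This stack is a 2-5-1 architecture: two input neurons, one hidden layer of five sigmoid neurons, and a single output neuron. XOR is not linearly separable, so at least one hidden layer is needed for the network to learn it.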

To train the network, use the `train` method on the instance and pass in the training data and a configuration for the training.

```rust
let config = TrainConfig {
    learning_rate: 0.5,
    steps: -1,
    epochs: -1,
    error_threshold: 0.001,
    eval_frequency: 100_000,
};

network.train(&dataset, config);
```
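
Judging by the example output above, `steps: -1` and `epochs: -1` appear to disable those limits, so training runs until the error drops below `error_threshold`. The `dataset` passed to `train` is the XOR truth table; its exact type is not shown in this README, but for a 2-input, 1-output network a plausible shape is a list of input/target pairs like the sketch below (an assumption; the `examples/xor` source has the real definition):

```rust
// Assumed shape for the XOR training data: one (inputs, targets) pair
// per truth-table row. Check examples/xor for the actual type.
let dataset: Vec<(Vec<f64>, Vec<f64>)> = vec![
    (vec![0.0, 0.0], vec![0.0]),
    (vec![0.0, 1.0], vec![1.0]),
    (vec![1.0, 0.0], vec![1.0]),
    (vec![1.0, 1.0], vec![0.0]),
];
```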

License

This project is open-source software licensed under the MIT license.

Contributions

Pull requests are always welcome. Feel free to fork this repository and contribute by submitting a pull request that improves its functionality.
