Simple MATLAB toolbox for deep learning network: Version 1.0.3

hiroyuki-kasai/SimpleDeepNetToolbox


Authors: Hiroyuki Kasai

Last page update: November 14, 2018

Latest library version: 1.0.3 (see Release notes for more info.)


Introduction

The SimpleDeepNetToolbox is a pure-MATLAB, simple toolbox for deep learning. It was originally ported from a Python library; however, major modifications have been made for the MATLAB implementation and for efficiency.

There are much better toolboxes available for deep learning, e.g., Theano, Torch, or TensorFlow, and I would definitely recommend using one of them for the problems at hand. The main purpose of this toolbox is to allow you, especially "MATLAB-lover" researchers, to understand deep learning techniques through "non-black-box", simple implementations.


Networks

  • Feedforward Backpropagation Neural Networks
  • Convolutional Neural Networks

Layers

  • Affine layer
  • Convolution layer
  • Pooling layer
  • Dropout layer
  • Batch normalization layer (under construction)
  • ReLU (Rectified Linear Unit) layer
  • Sigmoid layer
  • Softmax layer

Optimizers

  • Vanilla SGD
  • AdaGrad
  • Momentum SGD
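The three optimizers above differ only in their parameter update rule. As an illustration (written in Python, the language the toolbox was ported from, rather than MATLAB, and not the toolbox's own code), the standard textbook forms of the three updates are:

```python
import numpy as np

def sgd(param, grad, lr=0.01):
    """Vanilla SGD: step directly against the gradient."""
    return param - lr * grad

def momentum_sgd(param, grad, velocity, lr=0.01, beta=0.9):
    """Momentum SGD: accumulate a velocity, then step along it."""
    velocity = beta * velocity - lr * grad
    return param + velocity, velocity

def adagrad(param, grad, hist, lr=0.01, eps=1e-7):
    """AdaGrad: scale the step by the accumulated squared gradients."""
    hist = hist + grad * grad
    return param - lr * grad / (np.sqrt(hist) + eps), hist
```

All three keep per-parameter state (none for vanilla SGD, a velocity for momentum, a squared-gradient accumulator for AdaGrad), which is why optimizers are implemented as separate solver classes in `optimizer/`.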

Folders and files

./                      - Top directory
./README.md             - This readme file
./run_me_first.m        - The script that you need to run first
./demo.m                - Demonstration script to check and understand this package easily
./download.m            - Script to download datasets
|networks/              - Contains various network classes
|layers/                - Contains various layer classes
|optimizer/             - Contains optimization solvers
|test_samples/          - Contains test samples
|datasets/              - Contains datasets (to be downloaded)

First to do: configure path

Run run_me_first for path configurations.

%% First run the setup script
run_me_first;

Second to do: download datasets

Run download to download the datasets.

%% Run the downloading script
download;
  • If your computer is behind a proxy server, please configure your MATLAB proxy settings accordingly.

Simplest usage example: 5 steps!

Just execute demo_two_layer_neuralnet for the simplest demonstration of this package. It trains a two-layer feedforward neural network with backpropagation.

%% load dataset
[x_train, t_train, train_num, x_test, t_test, test_num, class_num, dimension, ~, ~] = ...
    load_dataset('mnist', './datasets/', inf, inf, false);

%% set network
network = two_layer_net(x_train, t_train, x_test, t_test, 784, 50, 10, []);

%% set trainer
trainer = nn_trainer(network);

%% train
info = trainer.train();

%% plot
display_graph('epoch', 'cost', {'Two layer net'}, {}, {info});

train_info = info;
test_info = info;
train_info.accuracy = info.train_acc;
test_info.accuracy = info.test_acc;
display_graph('epoch', 'accuracy', {'Train', 'Test'}, {}, {train_info, test_info});


Let's take a closer look at the code above, bit by bit. The procedure has only 5 steps!

Step 1: Load dataset

First, we load a dataset, including the train set and the test set, using the data loader function load_dataset(). The outputs include the train set, the test set, and related metadata.

[x_train, t_train, train_num, x_test, t_test, test_num, class_num, dimension, ~, ~] = ...
    load_dataset('mnist', './datasets/', inf, inf, false);
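For MNIST, x_train holds one 784-pixel image per row and t_train the corresponding labels. Assuming the loader one-hot encodes the labels for the softmax output (a common convention for this kind of loader; the sketch below is illustrative Python, not the toolbox's actual loader code), the encoding looks like this:

```python
import numpy as np

def to_one_hot(labels, class_num):
    """Convert integer class labels to one-hot rows,
    one row per sample, one column per class."""
    one_hot = np.zeros((labels.size, class_num))
    one_hot[np.arange(labels.size), labels] = 1.0
    return one_hot
```

For MNIST this gives a labels-by-10 matrix, matching the 10-class output layer used in the next step.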

Step 2: Set network

The next step defines the network architecture. This example uses a two-layer neural network with input size 784, hidden layer size 50, and output layer size 10. The datasets are also passed to this class.

%% set network
network = two_layer_net(x_train, t_train, x_test, t_test, 784, 50, 10, []);
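Conceptually, such a two-layer network chains an affine layer, a ReLU layer, a second affine layer, and a softmax layer from the layer list above. A minimal forward-pass sketch in Python (illustrative only; the exact layer composition is assumed from the sizes 784, 50, and 10 in the example, and this is not the toolbox's code):

```python
import numpy as np

def two_layer_forward(x, W1, b1, W2, b2):
    """Affine -> ReLU -> Affine -> softmax forward pass.
    x: batch of inputs, one sample per row."""
    h = np.maximum(0.0, x @ W1 + b1)             # hidden layer (affine + ReLU)
    scores = h @ W2 + b2                          # output layer (affine)
    scores -= scores.max(axis=1, keepdims=True)   # shift for numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)       # softmax class probabilities
```

With the sizes in the example, W1 would be 784-by-50 and W2 50-by-10, so each input row yields a row of 10 class probabilities summing to one.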

Step 3: Set trainer

Next, we create a trainer for the network. Training options can be configured via the second argument, which is not used in this example.

%% set trainer
trainer = nn_trainer(network);

Step 4: Perform training

Now, you start to train the network.

%% train
info = trainer.train();

It returns statistics including the histories of epoch numbers, cost values, train and test accuracies, and so on.
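Under the hood, such a trainer is typically a minibatch SGD loop that records per-epoch statistics. A sketch of that pattern in Python (the names grad_fn and history here are hypothetical, chosen for illustration, and do not reflect the toolbox's actual API):

```python
import numpy as np

def train(params, grad_fn, x, t, epochs=3, batch_size=32, lr=0.01):
    """Minimal minibatch SGD loop that records a per-epoch history,
    analogous in spirit to the info struct returned by trainer.train().
    grad_fn(params, xb, tb) must return (cost, gradients-dict)."""
    history = {'epoch': [], 'cost': []}
    n = x.shape[0]
    for epoch in range(epochs):
        perm = np.random.permutation(n)   # reshuffle samples each epoch
        costs = []
        for i in range(0, n, batch_size):
            idx = perm[i:i + batch_size]
            cost, grads = grad_fn(params, x[idx], t[idx])
            for k in params:              # vanilla SGD step on every parameter
                params[k] -= lr * grads[k]
            costs.append(cost)
        history['epoch'].append(epoch)
        history['cost'].append(float(np.mean(costs)))
    return history
```

The returned history plays the role of the statistics object: one cost entry per epoch, ready to be plotted against the epoch number in the next step.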

Step 5: Show result

Finally, display_graph() plots how the cost value decreases with the number of epochs. The train and test accuracies are plotted in the same way.

%% plot
display_graph('epoch', 'cost', {'Two layer net'}, {}, {info});

train_info = info;
test_info = info;
train_info.accuracy = info.train_acc;
test_info.accuracy = info.test_acc;
display_graph('epoch', 'accuracy', {'Train', 'Test'}, {}, {train_info, test_info});

That's it!


More plots

TBA.


License

  • The SimpleDeepNetToolbox is free and open source.
  • The code provided in SimpleDeepNetToolbox should only be used for academic/research purposes.
  • This toolbox was originally ported from a Python library.

Notes

  • As always, parameters such as the learning rate should be tuned for each dataset and network.

Problems or questions

If you have any problems or questions, please contact the author: Hiroyuki Kasai (email: kasai at is dot uec dot ac dot jp)


Release notes

  • Version 1.0.3 (Nov. 14, 2018)
    • Some class structures are re-configured.
  • Version 1.0.2 (Nov. 09, 2018)
    • Some class structures are re-configured.
  • Version 1.0.1 (Nov. 07, 2018)
    • Some class structures are re-configured.
  • Version 1.0.0 (Oct. 08, 2018)
    • Initial version.
