# neural-fortran
A parallel framework for deep learning. Read the paper here.
- Training and inference of dense (fully connected), convolutional (1-d and 2-d), and transformer neural networks
- Stochastic gradient descent optimizers: Classic, momentum, Nesterov momentum, RMSProp, Adagrad, Adam, AdamW
- More than a dozen activation functions and their derivatives
- Loss functions and metrics: Quadratic, Mean Squared Error, Pearson Correlation, etc.
- Data-based parallelism
- Loading dense and convolutional models from Keras HDF5 (.h5) files (see the nf-keras-hdf5 add-on)
| Layer type | Constructor name | Supported input layers | Rank of output array | Forward pass | Backward pass |
|---|---|---|---|---|---|
| Input | `input` | n/a | 1, 2, 3 | n/a | n/a |
| Embedding | `embedding` | n/a | 2 | ✅ | ✅ |
| Dense (fully-connected) | `dense` | `input1d`, `dense`, `dropout`, `flatten` | 1 | ✅ | ✅ |
| Dropout | `dropout` | `dense`, `flatten`, `input1d` | 1 | ✅ | ✅ |
| Locally connected (2-d) | `locally_connected` | `input`, `locally_connected`, `conv`, `maxpool`, `reshape` | 2 | ✅ | ✅ |
| Convolutional (1-d and 2-d) | `conv` | `input`, `conv`, `maxpool`, `reshape` | 2, 3 | ✅ | ✅ |
| Max-pooling (1-d and 2-d) | `maxpool` | `input`, `conv`, `maxpool`, `reshape` | 2, 3 | ✅ | ✅ |
| Linear (2-d) | `linear2d` | `input2d`, `layernorm`, `linear2d`, `self_attention` | 2 | ✅ | ✅ |
| Self-attention | `self_attention` | `input2d`, `layernorm`, `linear2d`, `self_attention` | 2 | ✅ | ✅ |
| Layer Normalization | `layernorm` | `linear2d`, `self_attention` | 2 | ✅ | ✅ |
| Flatten | `flatten` | `input2d`, `input3d`, `conv1d`, `conv2d`, `maxpool1d`, `maxpool2d`, `reshape` | 1 | ✅ | ✅ |
| Reshape (1-d to 2-d or 3-d) | `reshape` | `dense`, `dropout`, `flatten`, `input1d` | 2, 3 | ✅ | ✅ |
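As a rough illustration of how the constructors in the table above compose into a model, here is a hypothetical sketch of a small CNN. The constructor argument names (`filters`, `kernel_width`, `pool_width`, etc.) are assumptions for illustration only; the exact signatures are defined in the `nf` module, so consult the API documentation or the examples before using them.

```fortran
program layers_sketch
  ! Hypothetical sketch: composing layers from the table above into a
  ! network. Argument names below are illustrative assumptions, not
  ! the verified nf API.
  use nf, only: network, input, conv, maxpool, flatten, dense
  implicit none
  type(network) :: net

  ! A small CNN: rank-3 input, a conv + maxpool block,
  ! then flatten into a dense classifier head.
  net = network([ &
    input(1, 28, 28), &
    conv(filters=8, kernel_width=3, kernel_height=3), &
    maxpool(pool_width=2, pool_height=2, stride=2), &
    flatten(), &
    dense(10) &
  ])
end program layers_sketch
```

Note how each layer's predecessor respects the "Supported input layers" column: `conv` follows `input`, `maxpool` follows `conv`, `flatten` accepts pooled output, and `dense` accepts the rank-1 output of `flatten`.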
Get the code:
```
git clone https://github.com/modern-fortran/neural-fortran
cd neural-fortran
```

Required dependencies are:

- A Fortran compiler
- fpm or CMake for building the library

Optional dependencies are:
- OpenCoarrays (for parallel execution with GFortran)
- BLAS, MKL, or similar (for offloading `matmul` and `dot_product` calls)
- curl (for downloading testing and example datasets)
Compilers tested include:
- flang-new 20.0.0
- gfortran 13.2.0, 14.0.1
- ifort 2021.13.1
- ifx 2024.2.1
With gfortran, the following will create an optimized build of neural-fortran:
```
fpm build --profile release
```

If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers `caf` and `cafrun` to build and execute in parallel, respectively:

```
fpm build --compiler caf --profile release --flag "-cpp -DPARALLEL"
fpm test --compiler caf --profile release --flag "-cpp -DPARALLEL"
```

For the time being, you need to specify the same compiler flags to `fpm test` as you did in `fpm build` so that fpm knows it should use the same build profile.
See the Fortran Package Manager for more info on fpm.
```
mkdir build
cd build
cmake ..
make
```

Tests and examples will be built in the `bin/` directory.
If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers `caf` and `cafrun` to build and execute in parallel, respectively:
```
FC=caf cmake .. -DPARALLEL
make
cafrun -n 4 bin/mnist  # run MNIST example on 4 cores
```

If you want to build with a different compiler, such as Intel Fortran, specify `FC` when issuing `cmake`:

```
FC=ifort cmake .. -DPARALLEL
```

for a parallel build of neural-fortran, or

```
FC=ifort cmake ..
```

for a serial build.
To use an external BLAS or MKL library for `matmul` calls, run cmake like this:

```
cmake .. -DBLAS=-lblas
```

where the value of `-DBLAS` should point to the desired BLAS implementation, which has to be available in the linking path. This option is currently available only with gfortran.
To build with debugging flags enabled, type:

```
cmake .. -DCMAKE_BUILD_TYPE=debug
```

To run the tests, type:

```
ctest
```
You can use the CMake module available here to find or fetch an installation of this project while configuring your project. This module makes sure that the `neural-fortran::neural-fortran` target is always generated regardless of how neural-fortran is included in the project.

First, either copy `Findneural-fortran.cmake` to, say, your project's `cmake` directory and then include it in your `CMakeLists.txt` file:
```
list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake")
```
or use the `CMAKE_MODULE_PATH` variable to point to the directory where it is installed.

Next, you need to set `neural-fortran_ROOT_DIR` to the directory where neural-fortran is installed, such that `neural-fortran_ROOT_DIR/lib/libneural-fortran.a` exists.

The following should be added to your project's CMake file:

```
if(NOT TARGET neural-fortran::neural-fortran)
  find_package(neural-fortran REQUIRED)
endif()
```
and then to use the target in your project:

```
target_link_libraries(your_target PRIVATE neural-fortran::neural-fortran)
```
The easiest way to get a sense of how to use neural-fortran is to look at the examples, in increasing level of complexity:

- simple: Approximating a simple, constant data relationship
- sine: Approximating a sine function
- dense_mnist: Hand-written digit recognition (MNIST dataset) using a dense (fully-connected) network
- cnn_mnist: Training a CNN on the MNIST dataset
- get_set_network_params: Getting and setting hyperparameters of a network
The examples also show you the extent of the public API that's meant to be used in applications, i.e. anything from the `nf` module.
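In the spirit of the "simple" example above, here is a minimal end-to-end sketch of training and inference with the `nf` module. The `train` and `predict` method names and their argument names (`batch_size`, `epochs`, `optimizer`, `learning_rate`) are assumptions based on the feature list above; check the bundled examples for the exact signatures.

```fortran
program train_sketch
  ! Minimal training-loop sketch in the style of the "simple" example.
  ! Method and argument names here are assumptions, not the verified API.
  use nf, only: network, input, dense, sgd
  implicit none
  type(network) :: net
  real :: x(3, 1), y(2, 1)  ! one sample: 3 inputs, 2 target outputs

  ! A tiny dense network: 3 inputs -> 5 hidden -> 2 outputs.
  net = network([input(3), dense(5), dense(2)])

  call random_number(x)
  y = 0.5

  ! Train with classic stochastic gradient descent, one of the
  ! optimizers listed in the features section.
  call net % train(x, y, batch_size=1, epochs=100, &
                   optimizer=sgd(learning_rate=1.))

  print *, net % predict(x(:, 1))
end program train_sketch
```

Data here is laid out with one column per sample, which matches the data-based parallelism described above: each image (process) can train on its own subset of columns.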
Examples 3-6 rely on curl to download the needed datasets, so make sure you have it installed on your system. Most Linux OSs have it out of the box. The dataset will be downloaded only the first time you run the example in any given directory.

If you're using Windows OS or don't have curl for any other reason, download `mnist.tar.gz` directly and unpack it in the directory in which you will run the example program.
API documentation can be generated with FORD. Assuming you have FORD installed on your system, run

```
ford ford.md
```

from the neural-fortran top-level directory to generate the API documentation in `doc/html`. Point your browser to `doc/html/index.html` to read it.
This Contributing guide briefly describes the code organization. It may be useful to read if you want to contribute a new feature to neural-fortran.
Thanks to all open-source contributors to neural-fortran: awvwgk, certik, ggoyman, ivan-pi, jacobwilliams, jvdp1, jvo203, mathomp4, milancurcic, OneAdder, pirpyn, rico07, rouson, rweed, Spnetic-5, and scivision.
Development of convolutional networks and Keras HDF5 adapters in neural-fortran was funded by a contract from NASA Goddard Space Flight Center to the University of Miami. Development of optimizers is supported by the Google Summer of Code 2023 project awarded to Fortran-lang.
- Fortran Keras Bridge (FKB) by Jordan Ott provides a Python bridge between old (v0.1.0) neural-fortran-style save files and Keras's HDF5 models. As of v0.9.0, neural-fortran implements the full feature set of FKB in pure Fortran, and in addition supports training and inference of convolutional networks.
- rte-rrtmgp-nn by Peter Ukkonen is an implementation based on old (v0.1.0) neural-fortran that optimizes the memory layout and the forward and backward passes of dense layers for speed and for running on GPUs.
- Inference Engine, developed at the Berkeley Lab by the Computer Languages and Systems Software (CLaSS) group.
Neural-fortran has been used successfully in over a dozen published studies. See all papers that cite it here.