Luz is a higher level API for torch providing abstractions to allow for much less verbose training loops.
This package is still under development.
It is heavily inspired by other higher level frameworks for deep learning, to cite a few:
- FastAI: we are heavily inspired by the FastAI library, especially the `Learner` object and the callbacks API.
- Keras: We are also heavily inspired by Keras, especially callback names. The lightning module interface is similar to `compile`, too.
- PyTorch Lightning: The idea of the `luz_module` being a subclass of `nn_module` is inspired by the `LightningModule` object in lightning.
- HuggingFace Accelerate: The internal device placement API is heavily inspired by Accelerate, but is much more modest in features. Currently only CPU and single GPU are supported.
You can install the released version from CRAN with:
install.packages("luz")or the development version with:
remotes::install_github("mlverse/luz")Luz lets you take your torchnn_module definition andfit it to a dataloader, while handling the boring partslike moving data between devices, updating the weights, showing progressbars and tracking metrics.
Here’s an example defining and training an Autoencoder for the MNIST dataset. We selected parts of the code to highlight luz functionality. You can find the full example code here.
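The snippets below assume MNIST dataloaders named `train_dl` and `test_dl` already exist. Here is a minimal sketch of how they could be created with the torchvision package; the names and preprocessing are assumptions for illustration, and the full example linked above may differ:

``` r
library(torch)
library(torchvision)

dir <- tempdir()

# For an autoencoder the target is the image itself, so we wrap
# torchvision's MNIST dataset to return the image as both input and target.
mnist_autoencoder_dataset <- dataset(
  "mnist_autoencoder_dataset",
  initialize = function(train = TRUE) {
    self$mnist <- mnist_dataset(
      dir,
      train = train,
      download = TRUE,
      transform = transform_to_tensor # converts images to float tensors in [0, 1]
    )
  },
  .getitem = function(i) {
    x <- self$mnist[i][[1]] # first element of the item is the image
    list(x = x, y = x)      # input and target are the same image
  },
  .length = function() {
    length(self$mnist)
  }
)

train_dl <- dataloader(mnist_autoencoder_dataset(train = TRUE),
                       batch_size = 128, shuffle = TRUE)
test_dl  <- dataloader(mnist_autoencoder_dataset(train = FALSE),
                       batch_size = 128)
```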
net<-nn_module("Net",initialize =function() { self$encoder<-nn_sequential(nn_conv2d(1,6,kernel_size=5),nn_relu(),nn_conv2d(6,16,kernel_size=5),nn_relu() ) self$decoder<-nn_sequential(nn_conv_transpose2d(16,6,kernel_size =5),nn_relu(),nn_conv_transpose2d(6,1,kernel_size =5),nn_sigmoid() ) },forward =function(x) { x%>% self$encoder()%>% self$decoder() })Now that we have defined the Autoencoder architecture usingtorch::nn_module(), we can fit it using luz:
fitted<- net%>%setup(loss =nn_mse_loss(),optimizer = optim_adam )%>%fit(train_dl,epochs =1,valid_data = test_dl)