# QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
The Quantum ReLU ('QReLU') and its modified version (or 'm-QReLU') are Python custom activation functions available for both shallow and deep neural networks in TensorFlow, Keras, and PyTorch, for Machine Learning- and Deep Learning-based classification. They are distributed under the CC BY 4.0 license.
Details on these functions, their implementations, and their validations against gold-standard activation functions for both shallow and deep neural networks are available in the papers: Parisi, L., 2020 and Parisi, L., et al., 2022.
The dependencies are included in the `environment.yml` file. Run the following command to install the required version of Python (v3.9.16) and all dependencies in a conda virtual environment (replace `<env_name>` with your environment name):

```
conda create --name <env_name> --file environment.yml
```
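Once created, the environment can be activated before installing the package; this is the standard conda activation step, with `<env_name>` as above:

```
conda activate <env_name>
```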
Run `pip install -e .` to install the `src` package in editable mode.
You can use the `QuantumReLU` activation function as a Keras layer and set the `modified` attribute to either `False` or `True` if using the QReLU or the m-QReLU, respectively:
Example of usage in a sequential model in Keras with a `QuantumReLU` layer between a convolutional layer and a pooling layer.
Either
```python
model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), input_shape=(32, 32, 3)))
model.add(QuantumReLU(modified=False))  # True if using the m-QReLU (instead of the QReLU)
model.add(layers.MaxPooling2D((2, 2)))
```
or
```python
model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    layers.Conv2D(32, kernel_size=(3, 3)),
    QuantumReLU(modified=False),  # True if using the m-QReLU (instead of the QReLU)
    layers.MaxPooling2D(pool_size=(2, 2)),
])
```
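As a minimal, illustrative sketch (the classification head, optimiser, and loss below are assumptions for a 10-class problem, not part of the package), such a model could then be completed and compiled in the usual Keras way:

```python
model.add(layers.Flatten())
model.add(layers.Dense(64))
model.add(QuantumReLU(modified=False))  # or modified=True for the m-QReLU
model.add(layers.Dense(10, activation="softmax"))

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```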
Example of usage in a sequential model in PyTorch with a `QuantumReLU` layer between a convolutional layer and a pooling layer.
```python
self.conv1 = nn.Conv2d(1, 32, kernel_size=3)
self.relu1 = QuantumReLU(modified=False)  # True if using the m-QReLU (instead of the QReLU)
self.pool1 = nn.MaxPool2d(kernel_size=2)
```
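For context, a minimal sketch of a full `nn.Module` wrapping these layers is shown below; the class name, the classification head, and the assumed 28x28 single-channel input (e.g. MNIST) are illustrative only, not part of the package:

```python
import torch
import torch.nn as nn


class SmallConvNet(nn.Module):  # hypothetical example model
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3)
        self.relu1 = QuantumReLU(modified=False)  # True if using the m-QReLU
        self.pool1 = nn.MaxPool2d(kernel_size=2)
        # 28x28 input -> 26x26 after the 3x3 convolution -> 13x13 after pooling
        self.fc = nn.Linear(32 * 13 * 13, 10)

    def forward(self, x):
        x = self.pool1(self.relu1(self.conv1(x)))
        return self.fc(torch.flatten(x, start_dim=1))
```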
`isort` is used to ensure a consistent order of imports, whilst `autopep8` ensures adherence of the code to PEP 8, via the following two commands respectively:
```
isort <folder_name>
autopep8 --in-place --recursive .
```
Run `pytest --cov-report term-missing --cov=src tests/` to execute all unit tests and view the coverage report, including the percentage of test coverage and any missing lines.
If you use these activation functions, please cite the papers by Parisi, L., 2020 and Parisi, L., et al., 2022.