Two novel quantum activation functions

The Quantum ReLU ('QReLU') and its modified version (the 'm-QReLU') are custom Python activation functions, available for both shallow and deep neural networks in TensorFlow, Keras, and PyTorch, for Machine Learning- and Deep Learning-based classification. They are distributed under the CC BY 4.0 license.

Details on these functions, their implementations, and their validation against gold-standard activation functions for both shallow and deep neural networks are available in the papers Parisi, L., 2020 and Parisi, L., et al., 2022.

Dependencies

The dependencies are listed in the environment.yml file. Run the following command to install the required version of Python (v3.9.16) and all dependencies in a conda virtual environment (replace <env_name> with your environment name; see also the note after the command):

  • conda create --name <env_name> --file environment.yml
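
Once created, the environment can be activated with the standard conda command (not specific to this repository):

  • conda activate <env_name>

Note that conda create --file expects an explicit package-spec list; on conda versions where it does not accept YAML files, the equivalent command for an environment.yml file is conda env create --name <env_name> --file environment.yml.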

Usage

Run pip install -e . to install the src package in editable mode.

You can use the QuantumReLU activation function as a Keras layer, setting its modified attribute to either False or True depending on whether you are using the QReLU or the m-QReLU, respectively:

Example of usage in a sequential model in Keras, with a QuantumReLU layer between a convolutional layer and a pooling layer

Either

model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), input_shape=(32, 32, 3)))
model.add(QuantumReLU(modified=False))  # True if using the m-QReLU (instead of the QReLU)
model.add(layers.MaxPooling2D((2, 2)))

or

model = keras.Sequential(
    [
        keras.Input(shape=(32, 32, 3)),
        layers.Conv2D(32, kernel_size=(3, 3)),
        QuantumReLU(modified=False),  # True if using the m-QReLU (instead of the QReLU)
        layers.MaxPooling2D(pool_size=(2, 2)),
    ]
)
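
For orientation, the following minimal sketch extends the example above into a compilable model. The src.quantum_relu import path and the 10-class softmax head are assumptions for illustration, not part of this repository's documented API:

from tensorflow import keras
from tensorflow.keras import layers
from src.quantum_relu import QuantumReLU  # hypothetical import path; adjust to the actual module in src

model = keras.Sequential(
    [
        keras.Input(shape=(32, 32, 3)),
        layers.Conv2D(32, kernel_size=(3, 3)),
        QuantumReLU(modified=False),  # QReLU; set modified=True for the m-QReLU
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Flatten(),
        layers.Dense(10, activation="softmax"),  # hypothetical 10-class output head
    ]
)
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

From here, model.fit can be called as usual on image data of shape (N, 32, 32, 3).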

Example of usage in a sequential model in PyTorch, with a QuantumReLU layer between a convolutional layer and a pooling layer

self.conv1 = nn.Conv2d(1, 32, kernel_size=3)
self.relu1 = QuantumReLU(modified=False)  # True if using the m-QReLU (instead of the QReLU)
self.pool1 = nn.MaxPool2d(kernel_size=2)
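
For context, the following minimal sketch shows where those three attribute assignments sit inside a full nn.Module; the SmallConvNet class name and the src.quantum_relu import path are assumptions for illustration:

import torch
import torch.nn as nn
from src.quantum_relu import QuantumReLU  # hypothetical import path; adjust to the actual module in src

class SmallConvNet(nn.Module):  # hypothetical wrapper class for illustration
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3)
        self.relu1 = QuantumReLU(modified=False)  # True if using the m-QReLU
        self.pool1 = nn.MaxPool2d(kernel_size=2)

    def forward(self, x):
        # conv -> quantum activation -> pooling, as in the snippet above
        return self.pool1(self.relu1(self.conv1(x)))

# Example: a single-channel 28x28 input, e.g. an MNIST-sized image
y = SmallConvNet()(torch.randn(1, 1, 28, 28))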

Linting

isort is used to ensure a consistent order of imports, whilst autopep8 ensures adherence of the code to PEP 8, via the following two commands, respectively:

  • isort <folder_name>
  • autopep8 --in-place --recursive .

Unit testing

Run pytest --cov-report term-missing --cov=src tests/ to execute all unit tests and view a coverage report, including the percentage of lines covered and the line numbers of any lines missed.

Citation request

If you use these activation functions, please cite the papers by Parisi, L., 2020 and Parisi, L., et al., 2022.

