iKintosh/GaborNet
GaborNet can be installed via pip for Python 3.7 and above:
pip install GaborNet
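A minimal sketch to confirm the installation, assuming only the `GaborConv2d` constructor arguments shown in the usage example below:

```python
# Quick check that the package imports and the Gabor layer runs a forward pass.
import torch
from GaborNet import GaborConv2d

layer = GaborConv2d(in_channels=1, out_channels=8, kernel_size=(5, 5))
out = layer(torch.randn(1, 1, 28, 28))  # one dummy single-channel 28x28 image
print(out.shape)  # torch.Size([1, 8, 24, 24]) if the layer uses stride 1 and no padding
```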
Example of a network that uses `GaborConv2d` as its first layer:

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

from GaborNet import GaborConv2d

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")


class GaborNN(nn.Module):
    def __init__(self):
        super(GaborNN, self).__init__()
        # First layer: convolution constrained to the Gabor function,
        # with learnable Gabor parameters.
        self.g0 = GaborConv2d(in_channels=1, out_channels=96, kernel_size=(11, 11))
        self.c1 = nn.Conv2d(96, 384, (3, 3))
        self.fc1 = nn.Linear(384 * 3 * 3, 64)
        self.fc2 = nn.Linear(64, 2)

    def forward(self, x):
        x = F.leaky_relu(self.g0(x))
        # kernel_size added here: nn.MaxPool2d() without arguments raises a TypeError.
        x = nn.MaxPool2d(kernel_size=2)(x)
        x = F.leaky_relu(self.c1(x))
        x = nn.MaxPool2d(kernel_size=2)(x)
        x = x.view(-1, 384 * 3 * 3)
        x = F.leaky_relu(self.fc1(x))
        x = self.fc2(x)
        return x


net = GaborNN().to(device)
```
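The `384 * 3 * 3` flatten in `fc1` implies roughly 28x28 single-channel inputs (MNIST-sized); that input size is an assumption here, not something the repository states. A quick shape check under that assumption:

```python
# Sanity-check the network with a batch of dummy grayscale 28x28 images.
dummy = torch.randn(4, 1, 28, 28).to(device)
logits = net(dummy)
print(logits.shape)  # expected: torch.Size([4, 2])
```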
Original research paper (preprint): https://arxiv.org/abs/1904.13204
This research on deep convolutional neural networks proposes a modified architecture that focuses on improving convergence and reducing training complexity. The filters in the first layer of the network are constrained to fit the Gabor function. The parameters of the Gabor functions are learnable and updated by standard backpropagation techniques. The proposed architecture was tested on several datasets and outperformed the common convolutional networks.
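To illustrate the "learnable and updated by backpropagation" point, the sketch below runs one backward pass through the `GaborNN` model defined above and lists the first layer's parameters together with whether they received gradients. It reuses `net`, `nn`, `torch`, and `device` from the earlier snippet and relies only on generic `nn.Module` introspection, not on any GaborNet-specific attribute names.

```python
# One backward pass, then inspect the Gabor layer's learnable parameters.
criterion = nn.CrossEntropyLoss()
inputs = torch.randn(4, 1, 28, 28).to(device)   # assumed 28x28 grayscale inputs
targets = torch.randint(0, 2, (4,), device=device)

loss = criterion(net(inputs), targets)
loss.backward()

for name, param in net.g0.named_parameters():
    print(name, tuple(param.shape), "has grad" if param.grad is not None else "no grad")
```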
Please use this bibtex if you want to cite this repository in your publications:
```bibtex
@misc{gabornet,
  author       = {Alekseev, Andrey},
  title        = {GaborNet: Gabor filters with learnable parameters in deep convolutional neural networks},
  year         = {2019},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/iKintosh/GaborNet}},
}
```