iKintosh/GaborNet


Installation

GaborNet can be installed via pip and supports Python 3.7 and above:

pip install GaborNet

Getting started

import torch
import torch.nn as nn
from torch.nn import functional as F

from GaborNet import GaborConv2d

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")


class GaborNN(nn.Module):
    def __init__(self):
        super(GaborNN, self).__init__()
        self.g0 = GaborConv2d(in_channels=1, out_channels=96, kernel_size=(11, 11))
        self.c1 = nn.Conv2d(96, 384, (3, 3))
        self.fc1 = nn.Linear(384 * 3 * 3, 64)
        self.fc2 = nn.Linear(64, 2)

    def forward(self, x):
        x = F.leaky_relu(self.g0(x))
        x = nn.MaxPool2d(kernel_size=2)(x)  # kernel_size is required by nn.MaxPool2d
        x = F.leaky_relu(self.c1(x))
        x = nn.MaxPool2d(kernel_size=2)(x)
        x = x.view(-1, 384 * 3 * 3)  # flattened size assumes a matching input resolution
        x = F.leaky_relu(self.fc1(x))
        x = self.fc2(x)
        return x


net = GaborNN().to(device)

Original research paper (preprint): https://arxiv.org/abs/1904.13204

This research on deep convolutional neural networks proposes a modified architecture that focuses on improving convergence and reducing training complexity. The filters in the first layer of the network are constrained to fit the Gabor function, and the parameters of the Gabor functions are learnable and updated by standard backpropagation techniques. The proposed architecture was tested on several datasets and outperformed comparable conventional convolutional networks.
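The core idea — first-layer kernels that are rebuilt from a handful of learnable Gabor parameters instead of free weights — can be sketched in plain PyTorch. This is a conceptual toy, not the library's actual `GaborConv2d` implementation; all parameter names and initial ranges below are illustrative assumptions.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableGabor2d(nn.Module):
    """Toy Gabor convolution: the kernel is regenerated on every forward
    pass from learnable Gabor parameters, so backpropagation updates
    (sigma, theta, lambda, psi) rather than raw kernel weights."""

    def __init__(self, out_channels: int, kernel_size: int = 11):
        super().__init__()
        self.kernel_size = kernel_size
        # One set of Gabor parameters per output channel
        # (initial ranges are arbitrary choices for this sketch).
        self.sigma = nn.Parameter(torch.rand(out_channels) * 3 + 2)
        self.theta = nn.Parameter(torch.rand(out_channels) * math.pi)
        self.lambd = nn.Parameter(torch.rand(out_channels) * 8 + 4)
        self.psi = nn.Parameter(torch.rand(out_channels) * math.pi)
        # Fixed sampling grid covering the kernel window.
        half = kernel_size // 2
        ys, xs = torch.meshgrid(
            torch.arange(-half, half + 1, dtype=torch.float32),
            torch.arange(-half, half + 1, dtype=torch.float32),
            indexing="ij",
        )
        self.register_buffer("xs", xs)
        self.register_buffer("ys", ys)

    def forward(self, x):
        # Rotate the grid per channel; broadcasting gives shapes (C, k, k).
        theta = self.theta[:, None, None]
        x_rot = self.xs * torch.cos(theta) + self.ys * torch.sin(theta)
        y_rot = -self.xs * torch.sin(theta) + self.ys * torch.cos(theta)
        # Gabor function: Gaussian envelope times a cosine carrier.
        sigma = self.sigma[:, None, None]
        envelope = torch.exp(-(x_rot ** 2 + y_rot ** 2) / (2 * sigma ** 2))
        carrier = torch.cos(
            2 * math.pi * x_rot / self.lambd[:, None, None]
            + self.psi[:, None, None]
        )
        kernel = (envelope * carrier).unsqueeze(1)  # (C, 1, k, k): 1 input channel
        return F.conv2d(x, kernel)
```

Because the kernel is a differentiable function of the four Gabor parameters, gradients flow through the kernel construction and the optimizer adjusts the parameters directly, keeping the filters Gabor-shaped throughout training.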

Citation

Please use this bibtex if you want to cite this repository in your publications:

@misc{gabornet,
    author = {Alekseev, Andrey},
    title = {GaborNet: Gabor filters with learnable parameters in deep convolutional
             neural networks},
    year = {2019},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/iKintosh/GaborNet}},
}
