
In image processing, a Gabor filter, named after Dennis Gabor, who first proposed it as a 1D filter,[1] is a linear filter used for texture analysis, which essentially means that it analyzes whether there is any specific frequency content in the image in specific directions in a localized region around the point or region of analysis. The Gabor filter was first generalized to 2D by Gösta Granlund,[2] by adding a reference direction. Frequency and orientation representations of Gabor filters are claimed by many contemporary vision scientists to be similar to those of the human visual system.[3] They have been found to be particularly appropriate for texture representation and discrimination. In the spatial domain, a 2D Gabor filter is a Gaussian kernel function modulated by a sinusoidal plane wave (see Gabor transform).
Some authors claim that simple cells in the visual cortex of mammalian brains can be modeled by Gabor functions.[4][5] Thus, image analysis with Gabor filters is thought by some to be similar to perception in the human visual system.
Its impulse response is defined by a sinusoidal wave (a plane wave for 2D Gabor filters) multiplied by a Gaussian function.[6] Because of the multiplication-convolution property (Convolution theorem), the Fourier transform of a Gabor filter's impulse response is the convolution of the Fourier transform of the harmonic function (sinusoidal function) and the Fourier transform of the Gaussian function. The filter has a real and an imaginary component representing orthogonal directions.[7] The two components may be formed into a complex number or used individually.
Complex

g(x, y; \lambda, \theta, \psi, \sigma, \gamma) = \exp\left(-\frac{x'^2 + \gamma^2 y'^2}{2\sigma^2}\right) \exp\left(i\left(2\pi\frac{x'}{\lambda} + \psi\right)\right)

Real

g(x, y; \lambda, \theta, \psi, \sigma, \gamma) = \exp\left(-\frac{x'^2 + \gamma^2 y'^2}{2\sigma^2}\right) \cos\left(2\pi\frac{x'}{\lambda} + \psi\right)

Imaginary

g(x, y; \lambda, \theta, \psi, \sigma, \gamma) = \exp\left(-\frac{x'^2 + \gamma^2 y'^2}{2\sigma^2}\right) \sin\left(2\pi\frac{x'}{\lambda} + \psi\right)

where x' = x\cos\theta + y\sin\theta and y' = -x\sin\theta + y\cos\theta.

In this equation, λ represents the wavelength of the sinusoidal factor, θ represents the orientation of the normal to the parallel stripes of the Gabor function, ψ is the phase offset, σ is the standard deviation of the Gaussian envelope, and γ is the spatial aspect ratio, which specifies the ellipticity of the support of the Gabor function.
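As a concrete check of the definition, the complex filter can be evaluated numerically. The following minimal NumPy sketch (the parameter values and grid size are arbitrary illustrative choices) confirms that its real and imaginary parts are the cosine and sine filters above, and that its magnitude spectrum is concentrated near the carrier frequency 1/λ along the direction θ, as expected from the convolution-theorem argument:

import numpy as np

# Illustrative parameter values (not prescribed by the definition)
lam, theta, psi, sigma, gamma = 8.0, 0.0, 0.0, 4.0, 1.0

half = 16
y, x = np.mgrid[-half:half + 1, -half:half + 1]

# Rotated coordinates x', y'
x_p = x * np.cos(theta) + y * np.sin(theta)
y_p = -x * np.sin(theta) + y * np.cos(theta)

# Complex Gabor function: Gaussian envelope times complex exponential carrier
envelope = np.exp(-(x_p**2 + gamma**2 * y_p**2) / (2 * sigma**2))
carrier = np.exp(1j * (2 * np.pi * x_p / lam + psi))
g = envelope * carrier

# The real and imaginary parts are the cosine and sine filters, respectively
assert np.allclose(g.real, envelope * np.cos(2 * np.pi * x_p / lam + psi))
assert np.allclose(g.imag, envelope * np.sin(2 * np.pi * x_p / lam + psi))

# The magnitude spectrum peaks near the carrier frequency 1/lam (here along x)
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(g)))
freqs = np.fft.fftshift(np.fft.fftfreq(g.shape[0]))
peak_y, peak_x = np.unravel_index(np.argmax(spectrum), spectrum.shape)
print(f"spectral peak near (fy, fx) = ({freqs[peak_y]:.3f}, {freqs[peak_x]:.3f}); "
      f"1/lambda = {1 / lam:.3f}")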

Gabor filters are directly related toGabor wavelets, since they can be designed for a number of dilations and rotations. However, in general, expansion is not applied for Gabor wavelets, since this requires computation of bi-orthogonal wavelets, which may be very time-consuming. Therefore, usually, a filter bank consisting of Gabor filters with various scales and rotations is created. The filters are convolved with the signal, resulting in a so-called Gabor space. This process is closely related to processes in the primary visual cortex.[8]Jones and Palmer showed that the real part of the complex Gabor function is a good fit to the receptive field weight functions found in simple cells in a cat's striate cortex.[9]
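For instance, a small bank of real-valued Gabor filters spanning a few wavelengths and orientations can be convolved with an image, and the stacked responses form one concrete realisation of such a Gabor space. The sketch below is only illustrative: the wavelength and orientation values, the choice σ = λ/2, and the use of SciPy's convolve2d are assumptions, not a prescribed design.

import numpy as np
from scipy.signal import convolve2d


def gabor_kernel(lam, theta, psi, sigma, gamma):
    """Real-valued Gabor kernel sampled on a grid covering roughly +/- 3 sigma."""
    half = int(np.ceil(max(1, 3 * sigma)))
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_p = x * np.cos(theta) + y * np.sin(theta)
    y_p = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(x_p**2 + gamma**2 * y_p**2) / (2 * sigma**2)) * np.cos(
        2 * np.pi * x_p / lam + psi
    )


def gabor_space(image, wavelengths=(4, 8, 16), n_orientations=4):
    """Stack of responses of the image to each (scale, orientation) filter."""
    responses = []
    for lam in wavelengths:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kern = gabor_kernel(lam, theta, psi=0.0, sigma=0.5 * lam, gamma=1.0)
            responses.append(convolve2d(image, kern, mode="same", boundary="symm"))
    return np.stack(responses)  # shape: (n_scales * n_orientations, H, W)


# Usage on a synthetic image containing a vertical grating
image = np.sin(2 * np.pi * np.arange(128) / 8.0) * np.ones((128, 1))
space = gabor_space(image)
print(space.shape)  # (12, 128, 128)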
When processing temporal signals, data from the future cannot be accessed, which leads to problems when attempting to use Gabor functions for processing real-time signals that depend on the temporal dimension. A time-causal analogue of the Gabor filter has been developed,[10] based on replacing the Gaussian kernel in the Gabor function with a time-causal and time-recursive kernel referred to as the time-causal limit kernel. In this way, time-frequency analysis based on the resulting complex-valued extension of the time-causal limit kernel makes it possible to capture essentially similar transformations of a temporal signal as the Gabor filter can, as described by the Heisenberg group; see [10] for further details.
A set of Gabor filters with different frequencies and orientations may be helpful for extracting useful features from an image.[11] In the discrete domain, two-dimensional Gabor filters are given by

G_c[i, j] = B\, e^{-\frac{(i^2 + j^2)}{2\sigma^2}} \cos\left(2\pi f (i\cos\theta + j\sin\theta)\right)

G_s[i, j] = C\, e^{-\frac{(i^2 + j^2)}{2\sigma^2}} \sin\left(2\pi f (i\cos\theta + j\sin\theta)\right)

where B and C are normalizing factors to be determined.
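One possible convention for these normalizing factors, used in the sketch below purely for illustration, is to choose B and C so that each discrete filter has unit L2 norm:

import numpy as np


def discrete_gabor(f, theta, sigma, half_size):
    """Discrete cosine (G_c) and sine (G_s) Gabor filters.

    B and C are chosen here so that each filter has unit L2 norm;
    this is one possible convention, not the only one.
    """
    j, i = np.mgrid[-half_size:half_size + 1, -half_size:half_size + 1]
    envelope = np.exp(-(i**2 + j**2) / (2 * sigma**2))
    arg = 2 * np.pi * f * (i * np.cos(theta) + j * np.sin(theta))
    g_c = envelope * np.cos(arg)
    g_s = envelope * np.sin(arg)
    B = 1.0 / np.linalg.norm(g_c)  # normalizing factor for the cosine filter
    C = 1.0 / np.linalg.norm(g_s)  # normalizing factor for the sine filter
    return B * g_c, C * g_s


g_c, g_s = discrete_gabor(f=0.1, theta=np.pi / 4, sigma=4.0, half_size=12)
print(np.linalg.norm(g_c), np.linalg.norm(g_s))  # both 1.0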
2D Gabor filters have rich applications in image processing, especially in feature extraction for texture analysis and segmentation.[12] f defines the frequency being looked for in the texture. By varying θ, we can look for texture oriented in a particular direction. By varying σ, we change the support of the basis or the size of the image region being analyzed.
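The following sketch illustrates this tuning behaviour on synthetic stripe patterns; the frequency, orientation and envelope values are arbitrary illustrative choices. A filter pair tuned to f = 0.1 and θ = 0 responds strongly to stripes whose normal lies along that orientation and only weakly to the orthogonal pattern.

import numpy as np


def gabor_pair(f, theta, sigma, half_size=16):
    """Cosine/sine Gabor filter pair on a (2*half_size + 1)^2 grid."""
    j, i = np.mgrid[-half_size:half_size + 1, -half_size:half_size + 1]
    env = np.exp(-(i**2 + j**2) / (2 * sigma**2))
    arg = 2 * np.pi * f * (i * np.cos(theta) + j * np.sin(theta))
    return env * np.cos(arg), env * np.sin(arg)


def gabor_energy(patch, f, theta, sigma):
    """Gabor energy of a patch: squared responses of the cosine/sine pair."""
    g_c, g_s = gabor_pair(f, theta, sigma, half_size=patch.shape[0] // 2)
    return np.sum(patch * g_c) ** 2 + np.sum(patch * g_s) ** 2


j, i = np.mgrid[-16:17, -16:17]
vertical_stripes = np.cos(2 * np.pi * 0.1 * i)    # intensity varies along i
horizontal_stripes = np.cos(2 * np.pi * 0.1 * j)  # intensity varies along j

# Strong response to the matching orientation, weak response to the other
print(gabor_energy(vertical_stripes, f=0.1, theta=0.0, sigma=6.0))
print(gabor_energy(horizontal_stripes, f=0.1, theta=0.0, sigma=6.0))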
In document image processing, Gabor features are ideal for identifying the script of a word in a multilingual document.[13] Gabor filters with different frequencies and with orientations in different directions have been used to localize and extract text-only regions from complex document images (both gray and colour), since text is rich in high-frequency components, whereas pictures are relatively smooth in nature.[14][15][16] They have also been applied to facial expression recognition.[17] Gabor filters have also been widely used in pattern analysis applications. For example, they have been used to study the directionality distribution inside the porous spongy trabecular bone in the spine.[18] The Gabor space is very useful in image processing applications such as optical character recognition, iris recognition and fingerprint recognition. Relations between activations for a specific spatial location are very distinctive between objects in an image. Furthermore, important activations can be extracted from the Gabor space in order to create a sparse object representation.
This is an example implementation in Python:
import numpy as np


def gabor(sigma, theta, Lambda, psi, gamma):
    """Gabor feature extraction."""
    sigma_x = sigma
    sigma_y = float(sigma) / gamma

    # Bounding box
    nstds = 3  # Number of standard deviation sigma
    xmax = max(abs(nstds * sigma_x * np.cos(theta)), abs(nstds * sigma_y * np.sin(theta)))
    xmax = np.ceil(max(1, xmax))
    ymax = max(abs(nstds * sigma_x * np.sin(theta)), abs(nstds * sigma_y * np.cos(theta)))
    ymax = np.ceil(max(1, ymax))
    xmin = -xmax
    ymin = -ymax
    (y, x) = np.meshgrid(np.arange(ymin, ymax + 1), np.arange(xmin, xmax + 1))

    # Rotation
    x_theta = x * np.cos(theta) + y * np.sin(theta)
    y_theta = -x * np.sin(theta) + y * np.cos(theta)

    gb = np.exp(
        -0.5 * (x_theta**2 / sigma_x**2 + y_theta**2 / sigma_y**2)
    ) * np.cos(2 * np.pi / Lambda * x_theta + psi)
    return gb
For an implementation on images, see [1].
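As a minimal, self-contained usage sketch (assuming SciPy is available and using a synthetic test image rather than a real photograph), the kernel returned by gabor above can be applied to an image by 2-D convolution:

import numpy as np
from scipy.signal import convolve2d

# Usage sketch for the gabor() function defined above; parameter values are
# illustrative choices only.
kernel = gabor(sigma=4.0, theta=0.0, Lambda=8.0, psi=0.0, gamma=0.5)

# Synthetic vertical grating whose wavelength matches the filter's Lambda
image = np.sin(2 * np.pi * np.arange(128) / 8.0) * np.ones((128, 1))

# Filter the image with the kernel, keeping the same output size
response = convolve2d(image, kernel, mode="same", boundary="symm")
print(kernel.shape, response.shape)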
This is an example implementation in MATLAB/Octave:
function gb = gabor_fn(sigma, theta, lambda, psi, gamma)

sigma_x = sigma;
sigma_y = sigma / gamma;

% Bounding box
nstds = 3;
xmax = max(abs(nstds * sigma_x * cos(theta)), abs(nstds * sigma_y * sin(theta)));
xmax = ceil(max(1, xmax));
ymax = max(abs(nstds * sigma_x * sin(theta)), abs(nstds * sigma_y * cos(theta)));
ymax = ceil(max(1, ymax));
xmin = -xmax;
ymin = -ymax;
[x, y] = meshgrid(xmin:xmax, ymin:ymax);

% Rotation
x_theta = x * cos(theta) + y * sin(theta);
y_theta = -x * sin(theta) + y * cos(theta);

gb = exp(-.5 * (x_theta.^2 / sigma_x^2 + y_theta.^2 / sigma_y^2)) .* cos(2 * pi / lambda * x_theta + psi);
Code for Gabor feature extraction from images in MATLAB can be found at http://www.mathworks.com/matlabcentral/fileexchange/44630.
This is another example implementation in Haskell:
import Data.Complex

gabor λ θ ψ σ γ x y = exp (-(x'^2 + γ^2 * y'^2) / (2 * σ^2)) * exp (i * (2 * pi * x' / λ + ψ))
  where
    x' =  x * cos θ + y * sin θ
    y' = -x * sin θ + y * cos θ
    i  = 0 :+ 1