
Method of face and eye location and distance measurement

Info

Publication number
CN102184543A
CN102184543A, CN 201110125628, CN201110125628A
Authority
CN
China
Prior art keywords
projection function
circle
eyes
integral projection
integral
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201110125628
Other languages
Chinese (zh)
Other versions
CN102184543B (en)
Inventor
陈国庆
赵军庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Original Assignee
SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Priority to CN 201110125628 (granted as CN102184543B)
Publication of CN102184543A
Application granted
Publication of CN102184543B
Status: Expired - Fee Related
Anticipated expiration

Abstract

The invention relates to a method of face and eye location and distance measurement, belonging to the technical field of face detection and location. The method comprises the following steps: 1) find the vertical coordinate of the eyes using the circle difference operator projection function; 2) find the horizontal coordinates of the two eyes using the hybrid projection function; 3) display the integral projection curves in the two directions; 4) define the vertical coordinate of the eyes as y and, from the two peaks of the integral projection, the horizontal coordinates of the two eyes as x1 and x2; the distance between the two eyes is then the absolute value of (x1 - x2), and the coordinates of the two eyes are (x1, y) and (x2, y). The method improves the accuracy and speed of eye location and distance measurement in face detection.

Description

Method of face and eye location and distance measurement
Technical field
The present invention relates to a method of face and eye location and distance measurement, belonging to the technical field of face detection and location.
Background technology
Face detection is a very difficult task; to some extent its complexity even exceeds that of face recognition.
After years of research, many face detection methods have appeared. Because the eyes are among the most salient facial features, eye location has become a key step in many face detection methods. Once the positions of the left and right eyes are determined, the position of the face is essentially fixed, and from the length and direction of the line between the two eyes, the size and orientation of the face region can also be roughly estimated.
Several effective eye location methods already exist. For example, Bala et al. proposed an eye location method based on genetic algorithms and decision trees: a hybrid genetic structure continuously evolves basic eye rules and finally yields decision-tree-form eye rules that can be used for eye location. Reinders et al. proposed an eye location method based on neural networks, which feeds the pixels of a search window into the network; if the window contains an eye image, the network output is large. Wu and Zhou proposed an eye location method based on intensity contrast, which exploits the distinctive gray-level characteristics of the eye region to find the eye positions. However, these methods only give the approximate location of the eyes and cannot accurately locate the eye centers. To improve the accuracy of face detection, it is therefore necessary to study methods for precise eye location.
Projection is an effective way of extracting image features. A two-dimensional image can usually be analyzed through two orthogonal one-dimensional projection functions. The reduction in dimensionality makes the image features easier to analyze and reduces the amount of computation, so projection has become an important image analysis method. Many researchers have successfully applied projection functions to locating facial features. Kanade was the first to apply the integral projection function to face recognition: he binarized the original gray-scale image with the Laplace operator and then analyzed the binary image with the integral projection function. Brunelli and Poggio improved Kanade's algorithm by applying the integral projection function to edge maps and thereby determining the position of each facial feature. The concept of the variance projection function was first proposed by Feng and Yuen, who also proposed a simple method of locating the eyes with the variance projection function; later they proposed a multi-cue eye location method using an eye variance filter, which is itself built from the variance projection function. Projection is thus a localization technique that is frequently used in face recognition.
However, the eye detection result obtained by the traditional hybrid integral projection function over the whole face region is not satisfactory. When the circle difference operator projection function alone is applied to the whole face, the vertical coordinate of the eyes is detected fairly accurately, but the detection of the horizontal coordinates of the two eyes is disturbed by the ears and the hair at the temples, and the method is slow. There is therefore still room for improvement.
Summary of the invention
To solve the above problems, the present invention proposes a method of face and eye location and distance measurement that improves the accuracy and speed of eye location and distance measurement in face detection.
The present invention adopts the following technical scheme to solve its technical problem:
The method of face and eye location and distance measurement comprises the following steps:
1) Use the circle difference operator projection function to find the vertical coordinate of the eyes;
Definition of the circle difference operator: an xy coordinate system is set up on the image, with the origin usually at the center of the image. Let f(x, y) be the gray value of the image at the coordinate point (x, y), and let h be the difference threshold. With (x, y) as the center, draw a circle of radius r; all pixels close to the circumference form the set S on the circle, with n pixels in total. With f(x, y) as the gray value of the center pixel, let n1 be the number of pixels in S whose gray value is greater than or equal to f(x, y) + h, let n2 be the number of pixels in S whose gray value is less than or equal to f(x, y) - h, and let Favg be the average gray value of all pixels in S. Three circle difference coefficients of the center pixel are defined as follows:
Circle dark difference coefficient: b(x, y) = n1/n;
Circle bright difference coefficient: c(x, y) = n2/n;
Circle mean difference coefficient: v(x, y) = Favg - f(x, y);
Circle difference operator CDO(x, y):
if v(x, y) <= 0 or b(x, y) < 0.6, then CDO(x, y) = 0;
if v(x, y) > 0 and b(x, y) >= 0.6, then CDO(x, y) = v(x, y);
In the algorithm, 1 <= h <= 10 and 2 <= r <= 10 are generally used; the concrete values are determined in practice;
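The patent gives no reference implementation. The following Python/NumPy sketch is one possible reading of the circle difference operator defined above; the function name, the sampling of the circle pixels, and the row-major image indexing are assumptions made here for illustration, not part of the patent.

```python
import numpy as np

def circle_difference_operator(gray, x, y, r=5, h=5):
    # One reading of CDO(x, y) as defined above; gray is a 2-D array of gray
    # values indexed as gray[row, col], with x the column and y the row.
    f_center = float(gray[y, x])

    # Pixels near the circumference of the circle of radius r form the set S.
    angles = np.linspace(0.0, 2.0 * np.pi, num=max(8, int(2 * np.pi * r)), endpoint=False)
    cols = np.clip(np.round(x + r * np.cos(angles)).astype(int), 0, gray.shape[1] - 1)
    rows = np.clip(np.round(y + r * np.sin(angles)).astype(int), 0, gray.shape[0] - 1)
    S = gray[rows, cols].astype(float)
    n = S.size

    n1 = int(np.sum(S >= f_center + h))  # circle pixels at least h brighter than the center
    n2 = int(np.sum(S <= f_center - h))  # circle pixels at least h darker than the center
    f_avg = float(S.mean())

    b = n1 / n              # circle dark difference coefficient b(x, y)
    c = n2 / n              # circle bright difference coefficient c(x, y); defined above but not used by CDO
    v = f_avg - f_center    # circle mean difference coefficient v(x, y)

    # CDO responds only where the center is darker than its circular neighborhood.
    if v <= 0 or b < 0.6:
        return 0.0
    return v
```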
2) Use the hybrid projection function to find the horizontal coordinates of the two eyes;
The hybrid projection function is defined as the combination of the integral projection function and the standard deviation projection function;
Integral projection function: suppose I(x, y) denotes the gray value of the pixel at point (x, y); the vertical integral projection function on the interval [y1, y2] is denoted Sv(x);
Formula one: Sv(x) = ∫_{y1}^{y2} I(x, y) dy   (1)
The average integral projection function Mv(x) is expressed as:
Formula two: Mv(x) = Sv(x) / (y2 - y1)   (2)
Variance projection function: the vertical variance projection function on the interval [y1, y2] is denoted σv²(x);
Formula three: σv²(x) = ∫_{y1}^{y2} [I(x, y) - Mv(x)]² dy / (y2 - y1)   (3)
The vertical hybrid projection function Hv(x) is defined as:
Formula four: Hv(x) = Mv(x)/2 + σv(x)/2   (4)
First apply the horizontal integral projection of the circle difference operator to the image and take the vertical coordinate with the highest peak as the vertical coordinate of the eyes; then apply the vertical hybrid integral projection to each horizontal coordinate around this vertical coordinate, and take the left and right peaks as the horizontal coordinates of the two eyes;
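A minimal NumPy sketch of formulas one to four as reconstructed above, assuming their usual discrete forms and the equal 1/2 weighting in formula four; the function name and the row-major indexing are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def vertical_projections(gray, y1, y2):
    # Vertical projection functions of the columns over the row interval [y1, y2).
    # gray is indexed as gray[row, col]; each returned array has one value per column x.
    strip = gray[y1:y2, :].astype(float)

    Sv = strip.sum(axis=0)                                # formula one: integral projection Sv(x)
    Mv = Sv / (y2 - y1)                                   # formula two: average integral projection Mv(x)
    var_v = ((strip - Mv) ** 2).sum(axis=0) / (y2 - y1)   # formula three: variance projection σv²(x)
    Hv = Mv / 2.0 + np.sqrt(var_v) / 2.0                  # formula four: hybrid projection Hv(x)
    return Sv, Mv, var_v, Hv
```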
3) Display the integral projection curves in the two directions;
4) Define the obtained vertical coordinate of the eyes as y and, from the two peaks of the integral projection, the horizontal coordinates of the two eyes as x1 and x2; the distance between the two eyes is then |x1 - x2|, and the coordinates of the two eyes are (x1, y) and (x2, y).
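Putting steps 1) to 4) together, the sketch below shows one way the two projections could be chained on a preprocessed face image; it reuses the circle_difference_operator and vertical_projections sketches above. The strip half-height around the eye row and the left/right split of the image at its midline for peak picking are assumptions for illustration, since the patent does not fix those details, and step 3 (displaying the curves) is left to the caller.

```python
import numpy as np

def locate_eyes(gray, r=5, h=5, strip_half_height=8):
    # Illustrative chain of steps 1)-4); relies on circle_difference_operator()
    # and vertical_projections() from the sketches above.
    n_rows, n_cols = gray.shape

    # Step 1: circle difference operator response, then its horizontal integral
    # projection; the row with the highest peak is the eye row y.
    cdo = np.zeros((n_rows, n_cols), dtype=float)
    for row in range(r, n_rows - r):
        for col in range(r, n_cols - r):
            cdo[row, col] = circle_difference_operator(gray, col, row, r=r, h=h)
    y = int(np.argmax(cdo.sum(axis=1)))

    # Step 2: vertical hybrid projection in a horizontal strip around y; the two
    # peaks, one in each image half, give the eye columns x1 and x2.
    y1 = max(0, y - strip_half_height)
    y2 = min(n_rows, y + strip_half_height)
    _, _, _, Hv = vertical_projections(gray, y1, y2)
    x1 = int(np.argmax(Hv[: n_cols // 2]))
    x2 = n_cols // 2 + int(np.argmax(Hv[n_cols // 2 :]))

    # Step 4: eye coordinates (x1, y), (x2, y) and eye distance |x1 - x2|.
    return (x1, y), (x2, y), abs(x1 - x2)
```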
The beneficial effects of the present invention are as follows:
1. Using the circle difference projection function to find the vertical coordinate of the eyes and then the hybrid integral projection function to find the horizontal coordinates gives better results, in both algorithmic complexity and accuracy, than either method used alone.
2. The time complexity of the circle difference operator projection function is O(n^4), while that of the hybrid integral projection function is O(n^2). Applying the circle difference operator projection function only to the vertical coordinate and the hybrid integral projection function to the horizontal coordinates therefore takes less time than applying the circle difference operator projection function to the horizontal coordinates as well; the time taken by this method is roughly 1/2 of the time taken by the circle difference operator projection function alone. Compared with the traditional methods, this method is thus both fast and highly accurate.
Description of drawings
Fig. 1 is the processing flow chart.
Embodiment
The invention is described in further detail below in conjunction with the accompanying drawing.
The processing flow is shown in Figure 1. The prerequisite of this method is that face recognition has been completed and that image preprocessing (image denoising, image enhancement, image reconstruction) has been carried out. The implementation steps are as follows:
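The patent does not prescribe any particular preprocessing method for this prerequisite. Purely as an illustration, and as an assumption rather than part of the patented method, standard OpenCV denoising and histogram equalization could play that role:

```python
import cv2

def preprocess_face(gray):
    # Illustrative stand-in for the "image denoising, image enhancement" prerequisite
    # mentioned above; any comparable preprocessing of the 8-bit gray image would do.
    denoised = cv2.fastNlMeansDenoising(gray, None, h=10, templateWindowSize=7, searchWindowSize=21)
    enhanced = cv2.equalizeHist(denoised)
    return enhanced
```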
1. Use the circle difference operator projection function to find the vertical coordinate of the eyes. Definition of the circle difference operator: an xy coordinate system is set up on the image, with the origin usually at the center of the image. Let f(x, y) be the gray value of the image at the coordinate point (x, y), and let h be the difference threshold. With (x, y) as the center, draw a circle of radius r; all pixels close to the circumference form the set S on the circle, with n pixels in total. With f(x, y) as the gray value of the center pixel, let n1 be the number of pixels in S whose gray value is greater than or equal to f(x, y) + h, let n2 be the number of pixels in S whose gray value is less than or equal to f(x, y) - h, and let Favg be the average gray value of all pixels in S. Three circle difference coefficients of the center pixel are defined as follows:
Circle dark difference coefficient: b(x, y) = n1/n; circle bright difference coefficient: c(x, y) = n2/n;
Circle mean difference coefficient: v(x, y) = Favg - f(x, y).
Circle difference operator CDO(x, y):
if v(x, y) <= 0 or b(x, y) < 0.6, then CDO(x, y) = 0;
if v(x, y) > 0 and b(x, y) >= 0.6, then CDO(x, y) = v(x, y);
In the algorithm, 1 <= h <= 10 and 2 <= r <= 10 are generally used; the concrete values are determined in practice.
2. Then use the hybrid projection function to find the horizontal coordinates of the two eyes.
The hybrid projection function is defined as the combination of the integral projection function and the standard deviation projection function.
Integral projection function: suppose I(x, y) denotes the gray value of the pixel at point (x, y); the vertical integral projection function on the interval [y1, y2] is denoted Sv(x);
Formula one: Sv(x) = ∫_{y1}^{y2} I(x, y) dy   (1)
The average integral projection function Mv(x) is expressed as:
Formula two: Mv(x) = Sv(x) / (y2 - y1)   (2)
Variance projection function: the vertical variance projection function on the interval [y1, y2] is denoted σv²(x);
Formula three: σv²(x) = ∫_{y1}^{y2} [I(x, y) - Mv(x)]² dy / (y2 - y1)   (3)
The vertical hybrid projection function Hv(x) is defined as:
Formula four: Hv(x) = Mv(x)/2 + σv(x)/2   (4)
In this algorithm, first apply the horizontal integral projection of the circle difference operator to the image and take the vertical coordinate with the highest peak as the vertical coordinate of the eyes; then apply the vertical hybrid integral projection to each horizontal coordinate around this vertical coordinate, and take the left and right peaks as the horizontal coordinates of the two eyes.
3. Display the integral projection curves in the two directions.
4. Define the obtained vertical coordinate of the eyes as y and, from the two peaks of the integral projection, the horizontal coordinates of the two eyes as x1 and x2; the distance between the two eyes is then |x1 - x2|, and the coordinates of the two eyes are (x1, y) and (x2, y).
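As a usage illustration only, driving the sketches above on a cropped face region might look as follows; the file name, the OpenCV loader, and the parameter values are assumptions, not taken from the patent.

```python
import cv2

# Load a face region as an 8-bit gray-scale image (the file name is hypothetical).
gray = cv2.imread("face_region.png", cv2.IMREAD_GRAYSCALE)
assert gray is not None, "could not read face_region.png"

gray = preprocess_face(gray)                        # preprocessing sketch from above
eye1, eye2, eye_distance = locate_eyes(gray, r=5, h=5)
print("eye coordinates:", eye1, eye2, "eye distance:", eye_distance)
```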

Claims (1)

1. A method of face and eye location and distance measurement, characterized in that it comprises the following steps:
1) Use the circle difference operator projection function to find the vertical coordinate of the eyes;
Definition of the circle difference operator: an xy coordinate system is set up on the image, with the origin usually at the center of the image; let f(x, y) be the gray value of the image at the coordinate point (x, y), and let h be the difference threshold; with (x, y) as the center, draw a circle of radius r; all pixels close to the circumference form the set S on the circle, with n pixels in total; with f(x, y) as the gray value of the center pixel, let n1 be the number of pixels in S whose gray value is greater than or equal to f(x, y) + h, let n2 be the number of pixels in S whose gray value is less than or equal to f(x, y) - h, and let Favg be the average gray value of all pixels in S;
Three circle difference coefficients of the center pixel are defined as follows:
Circle dark difference coefficient: b(x, y) = n1/n;
Circle bright difference coefficient: c(x, y) = n2/n;
Circle mean difference coefficient: v(x, y) = Favg - f(x, y);
Circle difference operator CDO(x, y):
if v(x, y) <= 0 or b(x, y) < 0.6, then CDO(x, y) = 0;
if v(x, y) > 0 and b(x, y) >= 0.6, then CDO(x, y) = v(x, y);
In the algorithm, 1 <= h <= 10 and 2 <= r <= 10 are generally used; the concrete values are determined in practice;
2) Use the hybrid projection function to find the horizontal coordinates of the two eyes;
The hybrid projection function is defined as the combination of the integral projection function and the standard deviation projection function;
Integral projection function: suppose I(x, y) denotes the gray value of the pixel at point (x, y); the vertical integral projection function on the interval [y1, y2] is denoted Sv(x);
Formula one: Sv(x) = ∫_{y1}^{y2} I(x, y) dy   (1)
The average integral projection function Mv(x) is expressed as:
Formula two: Mv(x) = Sv(x) / (y2 - y1)   (2)
Variance projection function: the vertical variance projection function on the interval [y1, y2] is denoted σv²(x);
Formula three: σv²(x) = ∫_{y1}^{y2} [I(x, y) - Mv(x)]² dy / (y2 - y1)   (3)
The vertical hybrid projection function Hv(x) is defined as:
Formula four: Hv(x) = Mv(x)/2 + σv(x)/2   (4)
First apply the horizontal integral projection of the circle difference operator to the image and take the vertical coordinate with the highest peak as the vertical coordinate of the eyes; then apply the vertical hybrid integral projection to each horizontal coordinate around this vertical coordinate, and take the left and right peaks as the horizontal coordinates of the two eyes;
3) Display the integral projection curves in the two directions;
4) Define the obtained vertical coordinate of the eyes as y and, from the two peaks of the integral projection, the horizontal coordinates of the two eyes as x1 and x2; the distance between the two eyes is then |x1 - x2|, and the coordinates of the two eyes are (x1, y) and (x2, y).
CN 201110125628 | priority 2011-05-16 | filed 2011-05-16 | Method of face and eye location and distance measurement | Expired - Fee Related | granted as CN102184543B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN 201110125628, CN102184543B (en) | 2011-05-16 | 2011-05-16 | Method of face and eye location and distance measurement

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN 201110125628, CN102184543B (en) | 2011-05-16 | 2011-05-16 | Method of face and eye location and distance measurement

Publications (2)

Publication Number | Publication Date
CN102184543A | 2011-09-14
CN102184543B (en) | 2013-03-27

Family

ID=44570713

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN 201110125628 (CN102184543B, Expired - Fee Related) | Method of face and eye location and distance measurement | 2011-05-16 | 2011-05-16

Country Status (1)

Country | Link
CN (1) | CN102184543B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN113011393A * | 2021-04-25 | 2021-06-22 | 中国民用航空飞行学院 | Human eye positioning method based on improved hybrid projection function

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN1474357A * | 2003-06-13 | 2004-02-11 | 上海交通大学 (Shanghai Jiao Tong University) | The precise and automatic positioning method of human face eye center in digital gray scale image
US20060147094A1 * | 2003-09-08 | 2006-07-06 | Woong-Tuk Yoo | Pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhi-Hua Zhou et al., "Projection functions for eye detection", Pattern Recognition, Vol. 37, 2004 (full text) *
陈雪云 et al., "一种基于圆差异算子的眼睛垂直定位方法" (A vertical eye location method based on the circle difference operator), 广西大学学报(自然科学版) (Journal of Guangxi University, Natural Science Edition), Vol. 33, No. 2, 2008 (full text) *
陈雪云 et al., "基于Haar小波的眼睛定位方法" (An eye location method based on Haar wavelets), 计算机工程 (Computer Engineering), Vol. 36, No. 1, 2010 (full text) *

Also Published As

Publication number | Publication date
CN102184543B (en) | 2013-03-27

Similar Documents

Publication | Publication Date | Title
US20210117704A1 (en)Obstacle detection method, intelligent driving control method, electronic device, and non-transitory computer-readable storage medium
CN103208123B (en)Image partition method and system
CN108960229A (en)One kind is towards multidirectional character detecting method and device
CN101211411B (en)Human body detection process and device
CN105868745B (en)Weather recognition methods based on dynamic scene perception
CN103955949A (en)Moving target detection method based on Mean-shift algorithm
CN108734078B (en)Image processing method, image processing apparatus, electronic device, storage medium, and program
CN103839038A (en)People counting method and device
CN104615996B (en)A kind of various visual angles two-dimension human face automatic positioning method for characteristic point
CN109086724A (en)A kind of method for detecting human face and storage medium of acceleration
CN104050448A (en)Human eye positioning method and device and human eye region positioning method and device
CN106297492A (en)A kind of Educational toy external member and utilize color and the method for outline identification programming module
CN106503683A (en)A kind of video well-marked target detection method based on dynamic focal point
CN108416304B (en)Three-classification face detection method using context information
US20250209667A1 (en)Visual measurement method and system based on digital human model
CN106371614A (en)Gesture recognition optimizing method and device
CN109840905A (en)Power equipment rusty stain detection method and system
US20240420313A1 (en)Inspection method for inspecting an object and machine vision system
US9053383B2 (en)Recognizing apparatus and method, program, and recording medium
Devadethan et al.Face detection and facial feature extraction based on a fusion of knowledge based method and morphological image processing
KR20200005853A (en)Method and System for People Count based on Deep Learning
CN106780507A (en)A kind of sliding window fast target detection method based on super-pixel segmentation
CN103955925B (en)The improvement probability Hough transformation curve detection method of minimum sampling is fixed based on piecemeal
CN102184543B (en)Method of face and eye location and distance measurement
CN102073996A (en)Image-correlation-evaluation-based method for determining image segmentation threshold

Legal Events

Date | Code | Title | Description
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
C14 | Grant of patent or utility model
GR01 | Patent grant
CF01 | Termination of patent right due to non-payment of annual fee

Granted publication date: 2013-03-27

Termination date: 2016-05-16

CF01 | Termination of patent right due to non-payment of annual fee
