CN106108832A - In-vivo information acquiring apparatus - Google Patents

In-vivo information acquiring apparatus

Info

Publication number
CN106108832A
CN106108832A (application CN201610782831.8A)
Authority
CN
China
Prior art keywords
pulse
pulse count
information acquiring
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610782831.8A
Other languages
Chinese (zh)
Other versions
CN106108832B (en)
Inventor
不公告发明人 (inventor not disclosed)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xuetong Intelligent Technology Co ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201610782831.8A
Publication of CN106108832A
Application granted
Publication of CN106108832B
Legal status: Active
Anticipated expiration


Abstract

An in-vivo information acquiring apparatus comprises a cell recognition module and an information acquiring module. The cell recognition module is used to determine the biological species, and the information acquiring module includes an information acquiring section, an electric power source, a magnetic sensor section, a pulse-count section, a pulse-count judging section and a power cut-off control section. The invention has the benefit that the power supply of an in-vivo information acquiring apparatus introduced into a subject can be reliably switched on or off with a simple structure.

Description

In-vivo information acquiring apparatus
Technical field
The present invention relates to the field of information acquisition, and specifically to an in-vivo information acquiring apparatus.
Background technology
In recent years, in the field of endoscopy, a swallowable capsule endoscope has been proposed in which an image pickup section, an illumination section, a transmitting section and the like are housed in a capsule-shaped shell. The image pickup section captures image information of the interior of a subject, the illumination section illuminates the imaging site, and the transmitting section wirelessly transmits the image information obtained by the image pickup section. The capsule endoscope is introduced into the subject by being swallowed through the mouth of the patient serving as the subject. Thereafter, until it is naturally excreted, the capsule endoscope moves through the body cavity with peristalsis while successively capturing intracavity images and wirelessly transmitting the acquired image information to the outside.
Summary of the invention
To solve the above problems, the present invention provides an in-vivo information acquiring apparatus.
The purpose of the present invention is achieved through the following technical solution:
An in-vivo information acquiring apparatus comprises a cell recognition module and an information acquiring module. The cell recognition module is used to determine the biological species, and the information acquiring module includes:
an information acquiring section, which acquires in-vivo information of the organism;
an electric power source, which supplies electric power to the information acquiring section;
a magnetic sensor section, which detects a magnetic signal input from outside and outputs a control signal corresponding to the detection state of the magnetic signal;
a pulse-count section, which counts the pulses of the pulse signal from the magnetic sensor section;
a pulse-count judging section, which judges whether the number of pulses counted by the pulse-count section is equal to or greater than a predetermined number; and
a power cut-off control section, which, when the pulse-count judging section judges that pulses equal to or greater than the predetermined number have been input, switches the supply of electric power from the electric power source to the information acquiring section from the supplying state to the cut-off state.
The benefit of the invention is that the power supply of an in-vivo information acquiring apparatus introduced into a subject can be reliably switched on or off with a simple structure.
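As a rough illustration of the pulse-counted power cut-off described above, the following Python sketch models the pulse-count section, judging section and cut-off control section as one small state machine. The class and method names are illustrative, not from the patent:

```python
# Sketch of the pulse-counted power cut-off (illustrative names): the
# magnetic sensor section delivers pulses; once the count reaches the
# predetermined number, power to the information acquiring section
# switches from the supplying state to the cut-off state.
class PowerCutController:
    def __init__(self, predetermined_number):
        self.predetermined_number = predetermined_number
        self.pulse_count = 0            # pulse-count section
        self.supplying = True           # electric power source state

    def on_pulse(self):
        """Called once per pulse output by the magnetic sensor section."""
        if not self.supplying:
            return
        self.pulse_count += 1
        # pulse-count judging section: predetermined number reached?
        if self.pulse_count >= self.predetermined_number:
            self.supplying = False      # power cut-off control section acts


ctrl = PowerCutController(predetermined_number=3)
ctrl.on_pulse()
ctrl.on_pulse()
assert ctrl.supplying            # two pulses: still below the threshold
ctrl.on_pulse()
assert not ctrl.supplying        # third pulse: power is cut off
```

The threshold-then-latch behaviour is the whole mechanism: a simple external magnet waved past the capsule a predetermined number of times turns it off, with no mechanical switch.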
Accompanying drawing explanation
The invention is further described with reference to the accompanying drawings. The embodiments in the drawings do not limit the invention in any way; for those of ordinary skill in the art, other drawings can be obtained from the following drawings without creative effort.
Fig. 1 is a structural diagram of the information acquiring module of the present invention;
Fig. 2 is a structural diagram of the cell recognition module.
Reference numerals:
cell recognition module 1, cell image segmentation unit 11, feature extraction unit 12, classification and identification unit 13.
Detailed description of the invention
The invention is further described in conjunction with the following application scenarios.
Application scenario 1
Referring to Figs. 1 and 2, an in-vivo information acquiring apparatus of one embodiment of this application scenario comprises a cell recognition module and an information acquiring module. The cell recognition module is used to determine the biological species, and the information acquiring module includes:
an information acquiring section, which acquires in-vivo information of the organism;
an electric power source, which supplies electric power to the information acquiring section;
a magnetic sensor section, which detects a magnetic signal input from outside and outputs a control signal corresponding to the detection state of the magnetic signal;
a pulse-count section, which counts the pulses of the pulse signal from the magnetic sensor section;
a pulse-count judging section, which judges whether the number of pulses counted by the pulse-count section is equal to or greater than a predetermined number; and
a power cut-off control section, which, when the pulse-count judging section judges that pulses equal to or greater than the predetermined number have been input, switches the supply of electric power from the electric power source to the information acquiring section from the supplying state to the cut-off state.
Preferably, the information acquiring module further includes an interval detection section, which detects the output interval of the pulse signal from the magnetic sensor section.
The pulse-count section updates the count of the pulse signal only when the output interval detected by the interval detection section is not less than a preset reference interval.
This preferred embodiment allows the data to be updated in a timely manner.
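The interval detection behaviour can be sketched similarly. This hypothetical `IntervalFilteredCounter` (the name and the tick-based time unit are assumptions) counts a pulse only when it arrives at least a reference interval after the previous one:

```python
# Hypothetical sketch of the interval detection section: a pulse updates
# the count only when it arrives at least `reference_interval` ticks after
# the previous pulse, so closely spaced (spurious) pulses are ignored.
class IntervalFilteredCounter:
    def __init__(self, reference_interval):
        self.reference_interval = reference_interval
        self.last_time = None           # time of the previous pulse
        self.count = 0

    def on_pulse(self, t):
        """t: pulse arrival time in arbitrary counter ticks."""
        if self.last_time is None or t - self.last_time >= self.reference_interval:
            self.count += 1             # interval long enough: update count
        self.last_time = t


c = IntervalFilteredCounter(reference_interval=5)
for t in (0, 2, 7, 8, 14):
    c.on_pulse(t)
assert c.count == 3                     # pulses at t=2 and t=8 are ignored
```

This matches the counter-based realisation mentioned next: the interval detection section only needs a running tick count, not a clock.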
Preferably, the interval detection section is constituted by a counter.
This preferred embodiment makes the acquired information more accurate.
Preferably, the cell recognition module 1 includes a cell image segmentation unit 11, a feature extraction unit 12 and a classification and identification unit 13. The cell image segmentation unit 11 distinguishes the background, cell nucleus and cytoplasm in the cell image collected by a cell image acquisition module; the feature extraction unit 12 extracts the texture features of the cell image; and the classification and identification unit 13 uses a classifier to classify and identify the cell image according to its texture features.
This preferred embodiment constitutes the unit structure of the cell recognition module 1.
Preferably, the cell image segmentation unit 11 includes an image conversion subunit, a noise removal subunit, a coarse segmentation subunit, a nuclear-center calibration subunit and an accurate segmentation subunit, specifically:
(1) The image conversion subunit converts the collected cell image into a gray-level image;
(2) The noise removal subunit denoises the gray-level image, as follows:
For a pixel (x, y), choose its 3 × 3 neighborhood S_{x,y} and its (2N+1) × (2N+1) neighborhood L_{x,y}, where N is an integer greater than or equal to 2;
First judge whether the pixel is a boundary point: set a threshold T, T ∈ [13, 26], compute the gray difference between pixel (x, y) and each pixel in its neighborhood S_{x,y}, and compare the differences with T. If the number of gray differences greater than T is at least 6, pixel (x, y) is a boundary point; otherwise it is a non-boundary point;
If (x, y) is a boundary point, the following noise reduction is performed:
$$h(x,y)=\frac{1}{k}\sum_{q(i,j)\in[q(x,y)-1.5\sigma,\;q(x,y)+1.5\sigma]}q(i,j)$$
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(x, y) is its gray value before noise reduction, σ is the standard deviation of the gray values in the neighborhood L_{x,y} of pixel (x, y), the summation runs over the points of L_{x,y} whose gray values q(i, j) fall within the interval [q(x, y) − 1.5σ, q(x, y) + 1.5σ], and k is the number of such points;
If (x, y) is a non-boundary point, the following noise reduction is performed:
$$h(x,y)=\frac{\sum_{(i,j)\in L_{x,y}}w(i,j)\,q(i,j)}{\sum_{(i,j)\in L_{x,y}}w(i,j)}$$
where h(x, y) is the gray value of pixel (x, y) after noise reduction, q(i, j) is the gray value of point (i, j) in the image, and w(i, j) is the Gaussian weight of point (i, j) in the neighborhood L_{x,y};
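A minimal Python sketch of the two-branch denoising rule above. Plain nested lists stand in for the image, and the spatial scale of the Gaussian weight is an assumption, as the patent does not specify one:

```python
import math

def is_boundary(img, x, y, T=13):
    """Boundary test: at least 6 of the 8 pixels in the 3x3 neighbourhood
    S_{x,y} differ from (x, y) in gray value by more than T (T in [13, 26])."""
    count = 0
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if (dx, dy) != (0, 0) and abs(img[x + dx][y + dy] - img[x][y]) > T:
                count += 1
    return count >= 6

def denoise_pixel(img, x, y, N=2, T=13):
    """Boundary points: mean of L_{x,y} gray values within 1.5*sigma of
    q(x, y). Non-boundary points: Gaussian-weighted mean over L_{x,y}
    (the Gaussian spatial scale below is an assumption)."""
    q = img[x][y]
    nbhd = [img[x + dx][y + dy]
            for dx in range(-N, N + 1) for dy in range(-N, N + 1)]
    mean = sum(nbhd) / len(nbhd)
    sigma = math.sqrt(sum((v - mean) ** 2 for v in nbhd) / len(nbhd))
    if is_boundary(img, x, y, T):
        kept = [v for v in nbhd if q - 1.5 * sigma <= v <= q + 1.5 * sigma]
        return sum(kept) / len(kept)
    total = wsum = 0.0
    for dx in range(-N, N + 1):
        for dy in range(-N, N + 1):
            w = math.exp(-(dx * dx + dy * dy) / 2.0)  # assumed Gauss weight
            total += w * img[x + dx][y + dy]
            wsum += w
    return total / wsum
```

On a flat 5 × 5 patch the non-boundary branch simply returns the common gray value, while a pixel surrounded by strongly differing neighbours takes the boundary branch and averages only the gray values close to its own, which is what preserves edges.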
(3) The coarse segmentation subunit roughly partitions the denoised cell image into background, cytoplasm and cell nucleus, as follows:
Each pixel (x, y) is represented by a four-dimensional feature vector:
$$\vec{u}(x,y)=[\,h(x,y),\;h_{ave}(x,y),\;h_{med}(x,y),\;h_{sta}(x,y)\,]$$
where h(x, y) is the gray value of (x, y), and h_{ave}(x, y), h_{med}(x, y) and h_{sta}(x, y) are the gray mean, gray median and gray variance of its neighborhood S_{x,y}, respectively;
The K-means clustering method is then used to divide the pixels into three classes: background, cytoplasm and cell nucleus;
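The coarse segmentation step can be illustrated as follows: build the four-dimensional feature vector per pixel, then cluster with a minimal K-means. Seeding the centroids with the first k points is an assumption, since the patent does not specify initialisation:

```python
import statistics

def features(img, x, y):
    """Four-dimensional feature vector of pixel (x, y): its gray value and
    the gray mean, median and variance of its 3x3 neighbourhood S_{x,y}."""
    nb = [img[x + dx][y + dy] for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    return (img[x][y], sum(nb) / 9.0, statistics.median(nb),
            statistics.pvariance(nb))

def kmeans(points, k, iters=20):
    """Minimal K-means; seeding with the first k points is an assumption."""
    cents = [list(points[i]) for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in cents]
            groups[d.index(min(d))].append(p)
        for j, g in enumerate(groups):
            if g:  # recompute centroid of each non-empty group
                cents[j] = [sum(v) / len(g) for v in zip(*g)]
    return [min(range(k), key=lambda j: sum((a - b) ** 2
            for a, b in zip(p, cents[j]))) for p in points]
```

With k = 3 the three resulting clusters play the roles of background, cytoplasm and nucleus; which label is which is decided afterwards, e.g. by mean gray value.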
(4) The nuclear-center calibration subunit calibrates the center of the nucleus:
The approximate nucleus region is obtained from the coarse segmentation subunit. Suppose the nucleus region contains n points (x_1, y_1), …, (x_n, y_n). An intensity-weighted centroid and a geometric centroid are computed for this region, and their average is taken as the nuclear center (x_z, y_z):
$$x_z=\frac{1}{2}\left(\frac{\sum_{i=1}^{n}x_i\,h(x_i,y_i)}{\sum_{i=1}^{n}h(x_i,y_i)}+\frac{\sum_{i=1}^{n}x_i}{n}\right)$$
$$y_z=\frac{1}{2}\left(\frac{\sum_{i=1}^{n}y_i\,h(x_i,y_i)}{\sum_{i=1}^{n}h(x_i,y_i)}+\frac{\sum_{i=1}^{n}y_i}{n}\right)$$
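The centre calibration formulas translate directly into code. This sketch (the `gray` mapping is an illustrative stand-in for the denoised image h) averages the intensity-weighted and geometric centroids of a set of nucleus points:

```python
def nuclear_center(points, gray):
    """Average of the intensity-weighted centroid and the geometric centroid
    of the n points of the coarse nucleus region. `gray` maps (x, y) to the
    denoised gray value h(x, y)."""
    n = len(points)
    wsum = sum(gray[p] for p in points)
    xw = sum(x * gray[(x, y)] for x, y in points) / wsum  # weighted centroid
    yw = sum(y * gray[(x, y)] for x, y in points) / wsum
    xg = sum(x for x, _ in points) / n                    # geometric centroid
    yg = sum(y for _, y in points) / n
    return ((xw + xg) / 2, (yw + yg) / 2)


# A brighter point pulls the weighted term toward itself:
assert nuclear_center([(0, 0), (2, 0)], {(0, 0): 1, (2, 0): 3}) == (1.25, 0.0)
```

Averaging the two centroids damps the bias either estimate has alone: the geometric centroid ignores intensity, the weighted one is skewed by bright staining.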
(5) The accurate segmentation subunit accurately segments the nucleus and the cytoplasm:
Construct the directed line segment from the nuclear center (x_z, y_z) to a nucleus/cytoplasm boundary point (x_p, y_p), of length $dis_p=\lfloor\sqrt{(x_p-x_z)^2+(y_p-y_z)^2}\rfloor$, where $\lfloor\cdot\rfloor$ denotes rounding down;
Sampling along the segment at unit length yields dis_p points (x_1, y_1), …, (x_{dis_p}, y_{dis_p}); if the coordinates of a sampled point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;
The gray difference at point (x_i, y_i) along the segment direction is:
$$h_d(x_i,y_i)=h(x_{i-1},y_{i-1})-h(x_i,y_i)$$
Define the gray-difference inhibition function:
$$Y(x)=\begin{cases}x, & x\le 0\\ 0.5x, & x>0\end{cases}$$
The gradient gra(x_i, y_i) at point (x_i, y_i) along the segment direction is:
$$gra(x_i,y_i)=\frac{\left|Y(h_d(x_i,y_i))\right|+\left|Y(h_d(x_{i+1},y_{i+1}))\right|}{2}$$
The point of maximal gradient is chosen as the precise edge between the nucleus and the cytoplasm.
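A sketch of this radial edge search: given gray values sampled at unit steps along the directed segment, apply the inhibition function Y, average adjacent absolute differences, and return the index of maximal gradient. The indexing convention at the segment ends is an assumption:

```python
def radial_edge(h_along_ray):
    """Given gray values h sampled at unit steps outward from the nuclear
    centre, return the sample index of maximal gradient (the assumed edge).
    The inhibition function Y halves positive differences, so transitions
    where the gray value rises outward (dark nucleus to brighter
    cytoplasm) dominate the gradient."""
    def Y(v):
        return v if v <= 0 else 0.5 * v
    # h_d at sample i (i >= 1): previous minus current gray value
    hd = [h_along_ray[i - 1] - h_along_ray[i]
          for i in range(1, len(h_along_ray))]
    # gradient at sample i averages |Y(h_d)| at i and i+1
    gra = [(abs(Y(hd[i])) + abs(Y(hd[i + 1]))) / 2
           for i in range(len(hd) - 1)]
    return gra.index(max(gra)) + 1      # +1: gra[0] belongs to sample 1


ray = [10, 12, 11, 80, 82, 81]          # dark nucleus, bright cytoplasm
assert radial_edge(ray) == 3            # edge found at the 11 -> 80 jump
```

Because Y halves positive differences, an inward-brightening artifact (such as a dark inflammatory cell followed by a return to brightness) contributes only half as much gradient as a genuine nucleus-to-cytoplasm transition.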
This preferred embodiment provides a noise removal subunit that effectively combines the spatial closeness and gray similarity between the center pixel and its neighborhood pixels for noise reduction. In flat regions of the image, where the gray values of neighboring pixels differ little, a Gaussian filter performs weighted filtering of the gray values; in strongly varying boundary regions, edge-preserving filtering is applied, which helps preserve image edges. K-means clustering is used to extract the coarse contours of the nucleus and cytoplasm, effectively removing the interference of noise. The nuclear-center calibration subunit facilitates the subsequent accurate localization of the nucleus and cytoplasm contours. The accurate segmentation subunit makes full use of directional information, overcomes the interference of inflammatory cells on the edge map, and can accurately extract the edges of the nucleus and cytoplasm.
Preferably, the extraction of the texture features of the cell image includes:
(1) A comprehensive gray-level co-occurrence matrix of the cell image is computed based on an improved gray-level co-occurrence matrix method; this comprehensive matrix embodies the texture features of the cell in different directions:
Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), with corresponding numbers of matrix elements X_1, X_2, X_3, X_4. The comprehensive gray-level co-occurrence matrix is then computed as:
$$H(x,y,d)=w_1h(x,y,d,0°)+w_2h(x,y,d,45°)+w_3h(x,y,d,90°)+w_4h(x,y,d,135°)$$
and its number of elements is:
$$X=\sum_{i=1}^{4}w_iX_i$$
where d denotes distance, with d ∈ [2, 4], and the w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions. Let the contrast parameters of the four directional matrices be D_i with mean $\bar{D}$ (i = 1, 2, 3, 4); the weight coefficients are then:
$$w_i=\frac{1/\left(|D_i-\bar{D}|+1\right)}{\sum_{i=1}^{4}1/\left(|D_i-\bar{D}|+1\right)}$$
(2) The four required texture feature parameters are obtained from the comprehensive gray-level co-occurrence matrix and its number of elements: contrast, variance sum, energy and mean;
(3) The four texture feature parameters are normalized to obtain the final normalized texture feature values.
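The weight computation for combining the four directional matrices can be sketched directly from the weight-coefficient formula:

```python
def glcm_weights(contrasts):
    """Weights w_i for combining the four directional co-occurrence matrices:
    each weight is 1/(|D_i - mean(D)| + 1), normalised to sum to 1, so the
    directions whose contrast deviates least from the mean count most."""
    mean = sum(contrasts) / len(contrasts)
    inv = [1.0 / (abs(d - mean) + 1.0) for d in contrasts]
    s = sum(inv)
    return [v / s for v in inv]


w = glcm_weights([0.0, 0.0, 0.0, 4.0])   # one direction disturbed, e.g. glare
assert abs(sum(w) - 1.0) < 1e-9
assert w[3] < w[0]                       # the outlier direction is down-weighted
```

This is the mechanism by which directional interference (illumination angle, gas flow) is suppressed: a direction whose contrast parameter deviates strongly from the mean receives a small weight in the comprehensive matrix.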
This preferred embodiment, based on the improved gray-level co-occurrence matrix method, uses weight coefficients to compute the comprehensive gray-level co-occurrence matrix of the cell image and thereby extracts the cell texture features in the four specified directions. It solves the problem that external interference (such as the influence of the illumination angle during cell image collection, or the disturbance of flowing gas) causes the texture feature parameter values of a cell to differ greatly in different directions, and improves the precision of cell image texture feature extraction. The four selected texture features, namely contrast, variance sum, energy and mean, eliminate redundant and repeated feature parameters. Normalizing the four texture feature parameters facilitates the subsequent classification and identification of cell images.
In this application scenario, with threshold T = 13 and d = 2, the image denoising effect is relatively improved by 5%, and the extraction accuracy of cell image features is improved by 8%.
Application scenario 2
Referring to Figs. 1 and 2, an in-vivo information acquiring apparatus of one embodiment of this application scenario comprises the same cell recognition module, information acquiring module and preferred embodiments as described in application scenario 1; only the parameter values reported below differ.
In this application scenario, with threshold T = 15 and d = 2, the image denoising effect is relatively improved by 6%, and the extraction accuracy of cell image features is improved by 8%.
Application scenarios 3
Seeing Fig. 1, Fig. 2, a kind of in-vivo information acquiring apparatus of an embodiment of this application scene, including cellIdentification module and data obtaining module, described cell recognition module is used for determining that biological species, described data obtaining module include:
Information acquiring section, it obtains organism internal information;
Electric power source, it is for providing electric power to above-mentioned information acquiring section;
Magnetic Sensor portion, its detection, from the magnetic signal of outside input, exports control corresponding with the detection state of this magnetic signalSignal processed;
Umber of pulse count section, the umber of pulse of the pulse signal from above-mentioned Magnetic Sensor portion is counted by it;
Umber of pulse judging part, its judge the umber of pulse counted to get by above-mentioned umber of pulse count section be whether predetermined number withOn;
Power cut control portion, it is being judged as have input the feelings of the pulse of more than predetermined number by above-mentioned umber of pulse judging partUnder condition, above-mentioned electric power source is provided to the electric power that above-mentioned information acquiring section is carried out and switches to dissengaged positions from offer state.
Preferably, described data obtaining module also includes being spaced test section, and this interval test section detection passes from above-mentioned magneticThe output gap of the pulse signal in sensor portion,
Described umber of pulse count section at the output gap detected by above-mentioned interval test section not less than base set in advanceIn the case of quasi-interval, the output number of above-mentioned pulse signal is updated.
Data can be updated by this preferred embodiment in time.
Preferably, described interval test section is made up of enumerator.
It is more accurate that this preferred embodiment obtains information.
Preferably, described cell recognition module 1 includes that Methods of Segmentation On Cell Images unit 11, feature extraction unit 12, classification are knownOther unit 13;Described Methods of Segmentation On Cell Images unit 11 is for distinguishing the back of the body in the cell image gathered by cell image acquisition moduleScape, nucleus and Cytoplasm;Described feature extraction unit 12 is for extracting the textural characteristics of cell image;Described classificationRecognition unit 13 is for utilizing grader to realize cell image Classification and Identification according to textural characteristics.
This preferred embodiment constructs the unit structure of cell recognition module 1.
Preferably, described Methods of Segmentation On Cell Images unit 11 includes that image changes subelement, noise remove subelement, coarse segmentationSubelement, nuclear centers demarcate subelement, Accurate Segmentation subelement, particularly as follows:
(1) image conversion subelement, for being converted into gray level image by the cell image of collection;
(2) noise remove subelement, for gray level image is carried out denoising, including:
For pixel, (x y), chooses its neighborhood S of 3 × 3x,y(2N+1) the neighborhood L of × (2N+1)x,y, N is for being more thanInteger equal to 2;
First whether be that boundary point judges to pixel, set threshold value T, T ∈ [13,26], calculate pixel (x, y)With its neighborhood Sx,yIn the gray scale difference value of each pixel, and compare with threshold value T, if gray scale difference value is more than the number of threshold value TMore than or equal to 6, then (x, y) is boundary point to pixel, and otherwise, (x y) is non-boundary point to pixel;
If (x, y) is boundary point, then carry out following noise reduction process:
h(x,y)=Σq(i,j)∈[q(x,y)-1.5σ,q(x,y)+1.5σ]q(i,j)k
In formula, h (x, y) be after noise reduction pixel ((x y) is noise reduction preceding pixel point (x, ash y) to q for x, gray value y)Angle value, σ is pixel (x, y) neighborhood Lx,yInterior gray value mark is poor, q (i, j) ∈ [q (and x, y)-1.5 σ, q (x, y)+1.5 σ] representNeighborhood Lx,yInterior gray value fall within interval [q (and x, y)-1.5 σ, q (x, y)+1.5 σ] point, k represents neighborhood Lx,yInterior gray value falls withinInterval
[q (and x, y)-1.5 σ, q (x, y)+1.5 σ] the quantity of point;
If (x, y) is non-boundary point, then carry out following noise reduction process:
h(x,y)=Σ(i,j)∈Lx,yw(i,j)q(i,j)Σ(i,j)∈Lx,yw(i,j)
In formula, (x y) is pixel (x, gray value y), q (i, j) representative image midpoint (i, j) ash at place after noise reduction to hAngle value, (i j) is neighborhood L to wx,yInterior point (i, j) corresponding Gauss weight;
(3) coarse segmentation subelement, for slightly drawing the background in the cell image after denoising, Cytoplasm, nucleusPoint, particularly as follows:
By each pixel (x, y) represents with four dimensional feature vectors:
u→(x,y)=[h(x,y),have(x,y),hmed(x,y),hsta(x,y)]
In formula, (x y) represents (x, gray value y), h to have(x y) represents its neighborhood Sx,yGray average, hmed(x, y) generationTable its neighborhood Sx,yGray scale intermediate value, hsta(x y) represents its neighborhood Sx,yGray variance;
K-means clustering procedure is used to be divided into background, Cytoplasm, nucleus three class;
(4) Nuclear-center calibration subelement, for calibrating the center of the nucleus:

The approximate nucleus region is obtained from the coarse segmentation subelement. Suppose the nuclear area contains n points (x_1, y_1), …, (x_n, y_n). An intensity-weighted centroid and a geometric centroid are both computed for this region, and their average is taken as the nuclear center (x_z, y_z):

x_z = \frac{1}{2}\left(\frac{\sum_{i=1}^{n} x_i\, h(x_i,y_i)}{\sum_{i=1}^{n} h(x_i,y_i)} + \frac{\sum_{i=1}^{n} x_i}{n}\right)

y_z = \frac{1}{2}\left(\frac{\sum_{i=1}^{n} y_i\, h(x_i,y_i)}{\sum_{i=1}^{n} h(x_i,y_i)} + \frac{\sum_{i=1}^{n} y_i}{n}\right)
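The two centroid formulas above translate directly into a few lines of numpy; this is an illustrative sketch and the function name `nucleus_center` is an assumption:

```python
import numpy as np

def nucleus_center(points, gray):
    """Average of the intensity-weighted centroid and the geometric centroid.

    `points` is an (n, 2) array of nucleus coordinates (x_i, y_i); `gray`
    holds the corresponding gray values h(x_i, y_i)."""
    pts = np.asarray(points, float)
    g = np.asarray(gray, float)
    weighted = (pts * g[:, None]).sum(axis=0) / g.sum()   # intensity-weighted
    geometric = pts.mean(axis=0)                          # geometric centre
    return (weighted + geometric) / 2.0
```

With uniform gray values the two centroids coincide, so the result reduces to the plain geometric center.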
(5) Accurate-segmentation subelement, for accurately segmenting the nucleus and cytoplasm:

Construct the directed line segment from the nuclear center (x_z, y_z) to a boundary point (x_p, y_p) of the nucleus-cytoplasm boundary, of length

dis = \lfloor \sqrt{(x_p - x_z)^2 + (y_p - y_z)^2} \rfloor

where \lfloor \cdot \rfloor denotes rounding down;

Sampling along the segment at unit length yields dis points (x_1, y_1), …, (x_{dis}, y_{dis}); if the coordinates of a sampling point are not integers, its gray value is obtained by linear interpolation of the surrounding pixels;

The gray difference at point (x_i, y_i) along the segment direction is:

h_d(x_i, y_i) = h(x_{i-1}, y_{i-1}) - h(x_i, y_i)

Define the gray-difference inhibition function:

Y(x) = \begin{cases} x, & x \le 0 \\ 0.5x, & x > 0 \end{cases}

The gradient gra(x_i, y_i) at point (x_i, y_i) along the segment direction is:

gra(x_i, y_i) = \frac{|Y(h_d(x_i, y_i))| + |Y(h_d(x_{i+1}, y_{i+1}))|}{2}

The points of maximum gradient are chosen as the precise edges of the nucleus and cytoplasm.
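The ray-sampling step above can be sketched as follows; this is a hedged illustration, with `edge_along_ray` a hypothetical name and bilinear interpolation assumed for the "linear interpolation of surrounding pixels":

```python
import numpy as np

def edge_along_ray(img, center, boundary):
    """Sample the image at unit steps along the directed segment from the
    nucleus centre to a coarse boundary point, form directional gray
    differences, damp positive differences with Y(x) (x if x<=0 else 0.5x),
    and return the index of the maximum averaged gradient."""
    c = np.asarray(center, float)
    b = np.asarray(boundary, float)
    dis = int(np.floor(np.linalg.norm(b - c)))
    ts = np.arange(1, dis + 1) / max(dis, 1)
    pts = c + ts[:, None] * (b - c)          # dis sample points along the ray

    def bilinear(p):
        x, y = p
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        dx, dy = x - x0, y - y0
        return ((1 - dx) * (1 - dy) * img[x0, y0] + dx * (1 - dy) * img[x0 + 1, y0]
                + (1 - dx) * dy * img[x0, y0 + 1] + dx * dy * img[x0 + 1, y0 + 1])

    g = np.array([bilinear(p) for p in pts])
    hd = np.concatenate([[0.0], g[:-1] - g[1:]])   # h(x_{i-1}) - h(x_i)
    Y = np.where(hd <= 0, hd, 0.5 * hd)            # inhibition function
    gra = (np.abs(Y[:-1]) + np.abs(Y[1:])) / 2.0   # averaged gradient
    return int(np.argmax(gra))
```

Because rising gray differences are halved by Y, bright inliers along the ray contribute less than the dark-to-bright transition at the true edge, which is the stated purpose of the inhibition function.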
This preferred embodiment provides a noise-removal subelement that effectively integrates the spatial closeness and the gray-level similarity between the center pixel and its neighborhood pixels to perform noise reduction: in flat regions of the image, where neighboring gray values differ little, a Gaussian filter performs weighted filtering of the gray values, while in boundary regions of sharp change an edge-preserving filter is applied, which favors the preservation of image edges. K-means clustering is used to extract the coarse contours of the nucleus and cytoplasm, effectively removing the interference of noise. The nuclear-center calibration subelement facilitates the subsequent accurate localization of the nucleus and cytoplasm contours. The accurate-segmentation subelement makes full use of directional information, overcomes the interference of inflammatory cells with the edge map, and can accurately extract the edges of the nucleus and cytoplasm.
Preferably, the extraction of the textural features of the cell image includes:

(1) Computing the comprehensive gray co-occurrence matrix of the cell image based on an improved gray-level co-occurrence matrix method; the comprehensive matrix embodies the textural features of the cell in different directions:

Let the gray-level co-occurrence matrices in the four directions 0°, 45°, 90° and 135° be h(x, y, d, 0°), h(x, y, d, 45°), h(x, y, d, 90°) and h(x, y, d, 135°), with corresponding matrix element counts X_1, X_2, X_3, X_4. The comprehensive gray co-occurrence matrix is then computed as:

H(x, y, d) = w_1 h(x,y,d,0°) + w_2 h(x,y,d,45°) + w_3 h(x,y,d,90°) + w_4 h(x,y,d,135°)

and its element count is:

X = \sum_{i=1}^{4} w_i X_i

In these formulas d denotes the distance, with value range [2, 4], and the w_i (i = 1, 2, 3, 4) are weight coefficients computed from the contrast parameter of the gray-level co-occurrence matrix in each of the four directions. If the contrast parameters of the four directional matrices are D_i (i = 1, 2, 3, 4), with mean \bar{D}, the weight coefficients are:

w_i = \frac{1/(|D_i - \bar{D}| + 1)}{\sum_{i=1}^{4} 1/(|D_i - \bar{D}| + 1)}

(2) Using the comprehensive gray co-occurrence matrix and its element count to obtain the four required textural feature parameters: contrast, variance sum, energy and mean;

(3) Normalizing the four textural feature parameters to obtain the final normalized textural feature values.
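The steps above can be sketched as follows; this is an illustration under assumptions (8 gray levels, contrast as the sole weighting statistic, and hypothetical names `glcm`, `contrast`, `comprehensive_glcm`), not the patent's reference implementation:

```python
import numpy as np

# pixel offsets for the four standard GLCM directions (row, col)
OFFSETS = {0: (0, 1), 45: (-1, 1), 90: (-1, 0), 135: (-1, -1)}

def glcm(img, d, theta, levels=8):
    """Normalized gray co-occurrence matrix at distance d, angle theta (deg)."""
    q = np.clip(np.rint(img.astype(float) / max(img.max(), 1)
                        * (levels - 1)).astype(int), 0, levels - 1)
    dx, dy = OFFSETS[theta]
    dx, dy = dx * d, dy * d
    M = np.zeros((levels, levels))
    H, W = q.shape
    for x in range(H):
        for y in range(W):
            u, v = x + dx, y + dy
            if 0 <= u < H and 0 <= v < W:
                M[q[x, y], q[u, v]] += 1
    s = M.sum()
    return M / s if s else M

def contrast(M):
    """Contrast parameter D of a co-occurrence matrix."""
    i, j = np.indices(M.shape)
    return ((i - j) ** 2 * M).sum()

def comprehensive_glcm(img, d=3):
    """Weight the four directional matrices by closeness of each direction's
    contrast D_i to the mean: w_i = (1/(|D_i-Dbar|+1)) / sum_j 1/(|D_j-Dbar|+1)."""
    Ms = [glcm(img, d, t) for t in (0, 45, 90, 135)]
    D = np.array([contrast(M) for M in Ms])
    inv = 1.0 / (np.abs(D - D.mean()) + 1.0)
    w = inv / inv.sum()
    return sum(wi * Mi for wi, Mi in zip(w, Ms)), w
```

Directions whose contrast deviates strongly from the mean (i.e. those most distorted by lighting angle or gas flow) receive smaller weights, which is the stated purpose of the improved method.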
Based on the improved gray-level co-occurrence matrix method, this preferred embodiment computes the comprehensive gray co-occurrence matrix of the cell image using direction-dependent weight coefficients and then extracts the cell textural features in the four specified directions. This solves the problem that external interference (such as the lighting angle during cell-image acquisition or the flow of gas) causes the textural feature parameters of a cell to differ considerably between directions, and improves the precision of cell-image texture feature extraction. The four selected textural features (contrast, variance sum, energy and mean) eliminate redundant and repeated feature parameters. Normalizing the four parameters facilitates the subsequent classification and recognition of cell images.
In this application scenario, the threshold is set to T = 18 and d = 3; the image denoising effect improves by about 7%, and the extraction accuracy of cell-image features improves by 7%.
Application scenario 4
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario uses the same in-vivo information acquiring apparatus as described above: a cell recognition module for determining the biological species, and a data obtaining module comprising the information acquiring section, electric power source, magnetic sensor section, pulse-number counting section, pulse-number judging section and power cut-off control section, together with the same preferred embodiments of the cell-image segmentation, feature extraction and classification units; only the parameter values below differ.
In this application scenario, the threshold is set to T = 20 and d = 4; the image denoising effect improves by about 8%, and the extraction accuracy of cell-image features improves by 6%.
Application scenario 5
Referring to Fig. 1 and Fig. 2, an embodiment of this application scenario likewise uses the same in-vivo information acquiring apparatus and the same preferred embodiments as described above; only the parameter values below differ.
In this application scenario, the threshold is set to T = 26 and d = 2; the image denoising effect improves by about 7.5%, and the extraction accuracy of cell-image features improves by 8%.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the present invention, not to limit its scope of protection. Although the present invention has been explained in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently substituted without departing from the essence and scope of the technical solution of the present invention.
