CN106780561B - Color space construction method with illumination robustness for visual tracking - Google Patents

Color space construction method with illumination robustness for visual tracking

Info

Publication number
CN106780561B
Authority
CN
China
Prior art keywords
color space
components
pixel
illumination
component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611260740.4A
Other languages
Chinese (zh)
Other versions
CN106780561A (en)
Inventor
顾国华
万敏杰
钱惟贤
任侃
陈钱
张晓敏
王佳节
陈雪琦
隋修宝
何伟基
刘雯彬
姜睿妍
王雨馨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology
Priority to CN201611260740.4A
Publication of CN106780561A
Application granted
Publication of CN106780561B
Legal status: Active (current)
Anticipated expiration

Abstract

Translated from Chinese

The invention discloses a color space construction method with illumination robustness for visual tracking. According to the quantitative conversion formula between the RGB and HSI spaces, a method for keeping the H component constant between frames is proposed. Image pixels are divided into two categories according to whether they carry chromatic information, their illumination-sensitive components in HSI space are corrected and constrained respectively, and the specific operations to be carried out in RGB space are given, thereby constructing a new color space that is robust to illumination. The established new color space is then applied to classic visual tracking algorithms. The invention enables traditional tracking algorithms to maintain high stability and precision; it only applies a linear transformation to the color space and does not modify the tracking algorithm itself, which greatly reduces the overall computational complexity of the method and gives it low computational cost and good real-time performance.

Description

Color space construction method with illumination robustness for visual tracking
Technical Field
The invention belongs to the field of optical signal processing and digital image processing, and particularly relates to a color space construction method with robustness to illumination change based on a hue keeping principle.
Background
Research on tracking algorithms that are robust to illumination variation has long held an important place in optical signal processing applications such as military target striking, video surveillance, and robot vision. For visual tracking problems based on the feature distribution of the target, for example vehicle tracking ([1] royal jelly, Caojie, a moving automobile tracking algorithm based on feature points [J]. Electric Automation, 2011(06)) and face tracking ([2] Xiao Bing, Wang Zhan, a review of face recognition research [J]. Computer Application Research, 2005(08)), changes in illumination intensity caused by tree-shadow occlusion, weather, camera parameters, and similar factors often make the tracking result drift. Most existing studies use mathematical theory to correct and improve the tracking algorithm, but do not really examine or exploit the optical characteristics of the target itself ([3] Zhaoxin, Chenfeng, Wu Li Zhi, an improved mean-shift moving-target tracking algorithm [J]. Communication Technology, 2011(11); [4] Magazine, Korea, Han Showa, a particle filter tracking algorithm based on information fusion [J]. Photoelectric Engineering, 2007(04); [5] marvelin, Yinfang, a double particle filter method for joint estimation of state and parameters in a nonlinear system [J]. Electronic and Informatics Report, 2008(09)).
To overcome the influence of illumination, a large number of color-information-based visual tracking algorithms have been developed, which can be roughly divided into two categories: methods based on template updating and methods based on invariant color features. The former use the color state of the target in the current frame to predict and update a color model, which gives the next frame a certain degree of adaptability to changes in the color information, but they are limited by the update speed and cannot cope with rapid changes in color information. The latter eliminate the influence of changes in the probability density of the color distribution by computing an invariant optical flow field, and they adapt to some extent to the speed of illumination change, but the computational load is still large and real-time processing cannot be achieved.
Disclosure of Invention
The invention provides a color space construction method with illumination robustness for visual tracking, which can be applied directly to traditional feature-distribution-based visual tracking methods and alleviates the tracking drift that such methods tend to exhibit when the illumination conditions change significantly.
The technical solution for realizing the purpose of the invention is as follows: a color space construction method with illumination robustness for visual tracking comprises the following steps:
first, according to the quantitative conversion formula between the RGB and HSI spaces, a method for keeping the H component constant between frames is proposed, i.e., hue keeping;
second, image pixels are divided into two categories according to whether they carry chromatic information, namely color pixels and gray pixels; their illumination-sensitive components in HSI space are corrected and constrained respectively, and the specific operations carried out in RGB space are given, thereby constructing a new color space that is robust to illumination;
finally, the established new color space is applied to classic visual tracking algorithms, demonstrating that it improves the resistance of traditional tracking algorithms to illumination change.
Compared with the prior art, the invention has the following remarkable advantages: (1) when the intensity of the light source changes, traditional tracking algorithms can maintain high stability and precision; only a linear transformation of the color space is performed and the tracking algorithm itself is not modified, which greatly reduces the overall computational complexity and gives the method low computational cost and good real-time performance. (2) On one hand, the influence of illumination change on the target image features is explained from its optical essence; on the other hand, the illumination-sensitive components of the color space are corrected at the source, greatly improving the stability and precision of existing tracking algorithms in environments with changing illumination, so the illumination robustness is very high. (3) The method keeps the color component that reflects the true color characteristics of the target and is insensitive to illumination change (namely the hue component) constant between frames, and artificially controls and corrects the sensitive components, thereby eliminating the influence of illumination change on the color space and providing a new color space with illumination robustness.
The invention is further described below with reference to the accompanying drawings:
Drawings
FIG. 1 is a schematic diagram of the illumination robust color space construction method for visual tracking according to the present invention.
Fig. 2-1 to Fig. 2-4 compare the tracking results of each visual tracking method in the RGB color space and in the color space proposed by the method: Fig. 2-1 compares the tracking results of the MS algorithm, Fig. 2-2 of the PF algorithm, Fig. 2-3 of the CS algorithm, and Fig. 2-4 of the MSPF algorithm. In each graph, the solid-line box denotes the tracking result of the algorithm in the RGB color space, the dashed-line box denotes its tracking result in the color space provided by the invention, and the red frame in the first picture of each sequence denotes the manually selected target position in the first frame.
Fig. 3 is a CR graph.
Detailed Description
With reference to fig. 1, the method for constructing a color space with illumination robustness for visual tracking according to the present invention includes the following steps:
First, on the basis of keeping the inter-frame hue information of each pixel unchanged, a method for keeping the H component constant between frames is derived from the quantitative conversion formula between the RGB and HSI (hue-saturation-brightness) spaces, and hue keeping is performed: translation and scale transformations are used to transform the RGB color components linearly so that the hue (H) component in the corresponding HSI space remains unchanged, which at the same time yields the constraint conditions that must be followed when correcting the saturation (S) and brightness (I) components.
The conventional RGB color space is converted to the HSI color space as follows: for any 24-bit color digital image displayed in the RGB color space, the R, G, B components are first normalized to the interval [0, 1], i.e., each of the three components is divided by 255; the H, S, I components in the corresponding HSI color space can then be calculated according to the following formula:
[Equation (1): RGB-to-HSI conversion formula, shown as an image in the original document]
In the formula, R, G, B denote the red, green, and blue components of a pixel; H, S, I denote the hue, saturation, and brightness components of a pixel; arctan(·) denotes the arctangent operation; and min(·) denotes the minimum operation.
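The exact expression of formula (1) is only available as an image in the patent. A commonly used arctan-based RGB-to-HSI conversion that matches the symbols described above, given here only as an assumed reconstruction and not necessarily the patent's exact formula, is:

```latex
H=\arctan\!\left(\frac{\sqrt{3}\,(G-B)}{2R-G-B}\right),\qquad
S=1-\frac{3\,\min(R,G,B)}{R+G+B},\qquad
I=\frac{R+G+B}{3}
```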
The hue-keeping method is as follows. First, from the mathematical relationship between H, S, I and R, G, B in formula (1), if the three components R, G, B on the right-hand side of the three equations are simultaneously translated or scaled, i.e., the same constant is added to all three components or all three are multiplied by the same non-zero constant, the value of H does not change. Then, for any pixel that possesses an H component, translation and scale transformation are the only operations that can change the values of the S and I components while guaranteeing that the value of the H component stays unchanged. Finally, using the same translation factor or scale factor, R, G, B of the pixel are translated or scaled simultaneously, thereby changing the corresponding S and I component values while keeping the H component value constant.
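A minimal numerical check of the hue-keeping claim, assuming the arctan-form hue above (the function name `hsi_hue` and the sample values are illustrative, not from the patent):

```python
import math

def hsi_hue(r, g, b):
    # Arctan-form hue in degrees (assumed reconstruction; the patent's Eq. (1) is an image).
    return math.degrees(math.atan2(math.sqrt(3) * (g - b), 2 * r - g - b)) % 360

r, g, b = 0.6, 0.3, 0.1
print(hsi_hue(r, g, b))                        # original hue
print(hsi_hue(r + 0.2, g + 0.2, b + 0.2))      # translation: same constant added to R, G, B -> hue unchanged
print(hsi_hue(0.5 * r, 0.5 * g, 0.5 * b))      # scaling: same non-zero factor -> hue unchanged
```

All three calls print the same hue, which is the invariance the method relies on.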
Second, image pixels are divided into two categories according to whether they carry chromatic information, namely color pixels and gray pixels, and their illumination-sensitive components in HSI space are corrected and constrained respectively (apart from the H component, the S and I components, which change with illumination, are corrected and constrained so that the H, S, I components of every pixel keep illumination-robust characteristics across consecutive frames); the specific operations to be carried out in RGB space are given, thereby constructing a new color space that is robust to illumination. Since the hue H is robust to illumination, when the red (R), green (G), and blue (B) components are corrected, it must be ensured that the corresponding H component remains unchanged before and after the conversion. That is, R, G, B of every pixel of the whole image are only translated and scaled, forcing the corresponding S and I values to change while the H value remains unchanged.
Correcting the S component of color pixels: first, based on the fact that illumination changes alter the pixel saturation value, the S component of every color pixel is uniformly corrected to 1, i.e., the pixel saturation is constrained so that saturation interference is eliminated; then the translation operation described above is performed: a translation factor δ is set and the S component is corrected using the following formula:
[Equation (2): saturation-correction formula with translation factor δ, shown as an image in the original document]
where R, G, B denote the red, green, and blue components of a pixel, H, S, I denote the hue, saturation, and brightness components of a pixel, and min(·) denotes the minimum operation; finally, examining formula (2) together with the constraint that the three components R, G, B of a color pixel cannot all be equal yields the value of δ:
δ = -min(R, G, B)   (3)
where min(·) denotes the minimum operation; that is, the minimum of the three components is subtracted from each of the original R, G, B components, which completes the saturation correction.
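A minimal derivation sketch of formula (3), assuming the common saturation definition S = 1 - 3·min(R,G,B)/(R+G+B) (the patent's equation (2) is only available as an image): requiring the corrected saturation to equal 1 after shifting every component by δ forces the smallest shifted component to zero,

```latex
S'=1-\frac{3\,\min(R+\delta,\,G+\delta,\,B+\delta)}{(R+\delta)+(G+\delta)+(B+\delta)}=1
\;\Longleftrightarrow\;
\min(R,G,B)+\delta=0
\;\Longleftrightarrow\;
\delta=-\min(R,G,B),
```

which is valid whenever the shifted sum R+G+B+3δ is non-zero; this holds for color pixels, whose three components cannot all be equal.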
Correcting the I component of color pixels: first, based on the fact that illumination changes alter the pixel brightness value, the I component of every color pixel is uniformly corrected to a constant α, i.e., the pixel brightness is constrained so that brightness interference is eliminated; then the scaling operation is performed: a scale factor β is set and the I component is corrected using the following formula:
[Equation (4): brightness-correction formula with scale factor β, shown as an image in the original document]
where R, G, B denote the red, green, and blue components of a pixel, H, S, I denote the hue, saturation, and brightness components of a pixel, δ is the translation factor, α is a constant, and β is the scale factor; finally, the value of the scale factor is solved from formula (4):
[Equation (5): expression for the scale factor β, together with its auxiliary expressions, shown as images in the original document]
That is, the R, G, B components are each multiplied by β, which completes the brightness correction.
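A sketch of how the scale factor in formula (5) follows, assuming the common intensity definition I = (R+G+B)/3 (equations (4) and (5) are only available as images): after the translation by δ, the brightness is rescaled to the constant α,

```latex
\beta\cdot\frac{(R+\delta)+(G+\delta)+(B+\delta)}{3}=\alpha
\;\Longrightarrow\;
\beta=\frac{3\alpha}{R+G+B+3\delta}=\frac{3\alpha}{R+G+B-3\,\min(R,G,B)},
```

where the denominator is positive for any color pixel, since its three components cannot all be equal.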
Unified mapping of gray pixels: first, based on the fact that a gray pixel satisfies R = G = B, carries no H information, and has S = 0, it follows that only its I component varies with illumination, so correcting the I component alone is sufficient to remove the illumination influence; then, consistent with the step that corrects the S component of color pixels, the I component is artificially set to a constant [value shown as an image in the original document], i.e., all gray pixels on the brightness axis of the RGB space are mapped to the same point. At this point the construction of the new color space is complete.
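The complete per-pixel construction can be summarized in a short sketch. This is only an illustrative reconstruction under the assumptions above; the function and parameter names (`illumination_robust_space`, `alpha`, `gray_value`) are hypothetical, and the constants used for the brightness of color pixels and for the single gray point are given in the patent only as images, so placeholder values are used here.

```python
import numpy as np

def illumination_robust_space(img_rgb, alpha=0.5, gray_value=0.5, eps=1e-6):
    """Sketch of the hue-keeping color-space transform for an H x W x 3 uint8 image."""
    rgb = img_rgb.astype(np.float64) / 255.0        # normalize R, G, B to [0, 1]
    mn = rgb.min(axis=2, keepdims=True)              # per-pixel min(R, G, B)
    mx = rgb.max(axis=2, keepdims=True)
    is_gray = (mx - mn) < eps                        # gray pixels: R = G = B (no hue, S = 0)

    shifted = rgb - mn                               # translation by delta = -min: drives S to 1, keeps H
    total = shifted.sum(axis=2, keepdims=True)       # = R + G + B + 3*delta
    beta = 3.0 * alpha / np.maximum(total, eps)      # scale factor so that I becomes the constant alpha
    out = shifted * beta                             # scaling keeps H, fixes I = alpha

    return np.where(is_gray, gray_value, out)        # map every gray pixel to a single point
```

For example, `new_space = illumination_robust_space(frame)` would be fed to an existing MS or PF tracker in place of the RGB frame, without modifying the tracker itself.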
Finally, the established new color space replaces the RGB color space and is applied to classic feature-distribution-based visual tracking algorithms, demonstrating that the new color space significantly improves the resistance of traditional tracking algorithms to illumination change without any correction or adjustment of the algorithms themselves. The new color space is applied directly to the visual tracking algorithms as follows. First, four typical classic visual tracking algorithms based on feature distribution, namely the mean shift (MS) algorithm, the particle filter (PF) algorithm, the continuous adaptive mean shift (CS) algorithm, and the hybrid mean shift-particle filter (MSPF) algorithm, are selected as test tools. Then, in a scene with continuously changing illumination, a color CCD camera is used to capture a total of 200 frames of a moving red model car as the test video. Next, the RGB color space is used as the feature space of the four tracking algorithms to track the red car. Then the proposed color space replaces the RGB space and the same four algorithms are used to track the car; the tracking results of each algorithm in the RGB color space and in the color space of the invention are displayed on the same graph, yielding the four comparison graphs shown in Fig. 2-1 to Fig. 2-4. Finally, the accuracy of each tracking result is measured with the quantitative index CR, which is defined as:
[Equation (6): definition of the coverage ratio CR, shown as an image in the original document]
where A_C denotes the standard tracking result marked manually in advance on the original image, A_R denotes the actual tracking result obtained by the tracking algorithm, and ∩ denotes the overlapping area of the two regions. The higher the CR, the closer the tracking result is to the standard result. The CR curves of all methods over the 200 frames of the video are also plotted, giving Fig. 3.
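The CR formula itself appears only as an image in the patent; one definition consistent with the description (a normalized overlap between the manually annotated region A_C and the tracker output A_R, with higher values meaning better agreement) would be, as an assumption:

```latex
CR=\frac{\operatorname{area}(A_C\cap A_R)}{\operatorname{area}(A_C\cup A_R)}
```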

Claims (5)

Translated from Chinese

1. A color space construction method with illumination robustness for visual tracking, characterized by the following steps:
First, according to the quantitative conversion formula between the RGB and HSI spaces, a method for keeping the H component constant between frames is proposed, i.e., hue keeping is performed.
Second, image pixels are divided into two categories according to whether they carry chromatic information, namely color pixels and gray pixels; their illumination-sensitive components in HSI space are corrected and constrained respectively, and the specific operations carried out in RGB space are given, thereby constructing a new color space that is robust to illumination.
Finally, the established new color space is applied to classic visual tracking algorithms, demonstrating that it improves the resistance of traditional tracking algorithms to illumination change.
The conventional RGB color space is converted to the HSI color space as follows: for any 24-bit color digital image displayed in the RGB color space, the R, G, B components are first normalized to the interval [0, 1], i.e., each of the three components is divided by 255; the H, S, I components in the corresponding HSI color space can then be calculated according to the following formula:
[Equation (1): RGB-to-HSI conversion formula, shown as an image in the original document]
where R, G, B denote the red, green, and blue components of a pixel, H, S, I denote the hue, saturation, and brightness components of a pixel, arctan(·) denotes the arctangent operation, and min(·) denotes the minimum operation.
The hue-keeping step is as follows: first, according to the mathematical relationship between H, S, I and R, G, B in formula (1), if the three components R, G, B on the right-hand side of the three equations are simultaneously translated or scaled, i.e., the same constant is added to all three components or all three are multiplied by the same non-zero constant, the value of H does not change; then, if a pixel possesses an H component, translation and scale transformation are the only operations that can change the values of the S and I components while guaranteeing that the value of the H component remains unchanged; finally, using the same translation factor or scale factor, R, G, B of the pixel are translated or scaled simultaneously, thereby changing the corresponding S and I component values while keeping the H component value constant.

2. The color space construction method with illumination robustness for visual tracking according to claim 1, characterized in that the S component of color pixels is corrected: first, based on the fact that illumination changes alter the pixel saturation value, the S component of every color pixel is uniformly corrected to 1, i.e., the pixel saturation is constrained so that saturation interference is eliminated; then a translation operation is performed: a translation factor δ is set and the S component is corrected using the following formula:
[Equation (2): saturation-correction formula with translation factor δ, shown as an image in the original document]
where R, G, B denote the red, green, and blue components of a pixel, H, S, I denote the hue, saturation, and brightness components of a pixel, and min(·) denotes the minimum operation; finally, examining formula (2) together with the constraint that the three components R, G, B of a color pixel cannot all be equal yields the value of δ:
δ = -min(R, G, B)   (3)
where min(·) denotes the minimum operation; that is, the minimum of the three components is subtracted from each of the original R, G, B components, which completes the saturation correction.

3. The color space construction method with illumination robustness for visual tracking according to claim 1, characterized in that the I component of color pixels is corrected: first, based on the fact that illumination changes alter the pixel brightness value, the I component of every color pixel is uniformly corrected to a constant α, i.e., the pixel brightness is constrained so that brightness interference is eliminated; then a scaling operation is performed: a scale factor β is set and the I component is corrected using the following formula:
[Equation (4): brightness-correction formula with scale factor β, shown as an image in the original document]
where R, G, B denote the red, green, and blue components of a pixel, H, S, I denote the hue, saturation, and brightness components of a pixel, δ is the translation factor, α is a constant, and β is the scale factor; finally, the value of the scale factor is solved from formula (4):
[Equation (5): expression for the scale factor β, together with its auxiliary expression, shown as images in the original document]
that is, the R, G, B components are each multiplied by β, which completes the brightness correction.

4. The color space construction method with illumination robustness for visual tracking according to claim 1, characterized in that gray pixels are mapped in a unified way: first, based on the fact that a gray pixel satisfies R = G = B, carries no H information, and has S = 0, it follows that only its I component varies with illumination, so correcting the I component alone suffices to remove the illumination influence; then, consistent with the step that corrects the S component of color pixels, the I component is artificially set to a constant [value shown as an image in the original document], i.e., all gray pixels on the brightness axis of the RGB space are mapped to the same point; at this point the construction of the new color space is complete.

5. The color space construction method with illumination robustness for visual tracking according to claim 1, characterized in that the new color space is applied directly to visual tracking algorithms: first, four typical classic visual tracking algorithms based on feature distribution, namely the mean shift (MS) algorithm, the particle filter (PF) algorithm, the continuous adaptive mean shift (CS) algorithm, and the hybrid mean shift-particle filter (MSPF) algorithm, are selected as test tools; then, in a scene with continuously changing illumination, a color CCD camera is used to capture a total of 200 frames of a moving red model car as the test video; next, the RGB color space is used as the feature space of the above four tracking algorithms to track the red car; further, the proposed color space replaces the RGB space and the same four algorithms are used to track the car, yielding comparison graphs of the tracking results; finally, the quantitative index CR is used to measure the accuracy of each tracking result, where CR is defined as:
[Equation (6): definition of the coverage ratio CR, shown as an image in the original document]
where A_C denotes the standard tracking result marked manually in advance on the original image, A_R denotes the actual tracking result obtained by the tracking algorithm, and ∩ denotes the overlapping area of the two regions.
CN201611260740.4A | 2016-12-30 | 2016-12-30 | Color space construction method with illumination robustness for visual tracking | Active | CN106780561B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201611260740.4A (CN106780561B) | 2016-12-30 | 2016-12-30 | Color space construction method with illumination robustness for visual tracking

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201611260740.4A (CN106780561B) | 2016-12-30 | 2016-12-30 | Color space construction method with illumination robustness for visual tracking

Publications (2)

Publication Number | Publication Date
CN106780561A (en) | 2017-05-31
CN106780561B (en) | 2020-04-17

Family

ID=58953430

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201611260740.4A (CN106780561B, Active) | Color space construction method with illumination robustness for visual tracking | 2016-12-30 | 2016-12-30

Country Status (1)

Country | Link
CN (1) | CN106780561B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101587591A (en)* | 2009-05-27 | 2009-11-25 | Beihang University | Visual accurate tracking technique based on double parameter thresholds dividing
CN102779330A (en)* | 2012-06-13 | 2012-11-14 | BOE Technology Group Co., Ltd. | Image reinforcement method, image reinforcement device and display device
CN103324284A (en)* | 2013-05-24 | 2013-09-25 | Chongqing University | Mouse control method based on face and eye detection

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
KR100507780B1 (en)* | 2002-12-20 | 2005-08-17 | Electronics and Telecommunications Research Institute | Apparatus and method for high-speed marker-free motion capture
US20090268953A1 (en)* | 2008-04-24 | 2009-10-29 | Apteryx, Inc. | Method for the automatic adjustment of image parameter settings in an imaging system


Also Published As

Publication number | Publication date
CN106780561A (en) | 2017-05-31

Similar Documents

Publication | Title
CN102137272B (en) | Method for calibrating colors of multiple cameras in open environment
CN111429370B (en) | Underground coal mine image enhancement method, system and computer storage medium
CN108230407B (en) | Image processing method and device
CN106131526B (en) | A kind of white balancing treatment method and device based on rgb space
CN106897981A (en) | A kind of enhancement method of low-illumination image based on guiding filtering
CN103020924B (en) | Low-illumination monitored image enhancement method based on similar scenes
CN111027415B (en) | Vehicle detection method based on polarization image
CN104504722A (en) | Method for correcting image colors through gray points
CN106886985A (en) | A kind of self adaptation enhancement method of low-illumination image for reducing colour cast
CN107895350B (en) | HDR image generation method based on self-adaptive double gamma transformation
CN110378848A (en) | A kind of image defogging method based on derivative figure convergence strategy
CN104680518B (en) | A kind of blue screen image cutting method based on colourity Overflow handling
CN111970432A (en) | Image processing method and image processing device
TW201435806A (en) | Image recovery method
CN105184746B (en) | Color image enhancement processing method based on histogram equalization
CN105118032B (en) | A kind of wide method for dynamically processing of view-based access control model system
CN115665565A (en) | Online tobacco leaf image color correction method, system and device
CN115937093A (en) | Smoke concentration detection method integrating HSL space and improved dark channel technology
CN110175967A (en) | Image defogging processing method, system, computer equipment and storage medium
CN112203064B (en) | Method and device for constructing color mapping relationship of different illumination intensities
CN106780561B (en) | Color space construction method with illumination robustness for visual tracking
CN110060308B (en) | A color constancy method based on light source color distribution constraints
CN107358592A (en) | A kind of iterative global method for adaptive image enhancement
CN110891166A (en) | Image color enhancement method and storage medium
CN107316040B (en) | Image color space transformation method with unchanged illumination

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
