This application claims the benefit of priority of Japanese Patent Application JP 2012-278970 filed on December 21, 2012, the entire contents of which are incorporated herein by reference.
Embodiment
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the accompanying drawings, structural elements having substantially the same function and structure are denoted by the same reference numerals, and repeated description of these structural elements is omitted.
Fig. 1 schematically illustrates the configuration of a projection type video display device 100 according to an embodiment of the technology disclosed in this specification. The illustrated projection type video display device 100 includes a projection unit 101, an image processing unit 102, an image input unit 103, a camera unit 104, and a correction amount detection unit 105. Each of these units will be described below.
The image input unit 103 receives an image signal from a supply source of the projected image, such as a PC, a television receiver, or a Blu-ray Disc player (none of which are shown in the figure).
The image processing unit 102 processes the image to be projected and output from the projection unit 101. The image output from the image processing unit 102 is either an external image supplied from the image input unit 103 or a test pattern generated within the image processing unit 102. In the image processing unit 102, distortion correction of the projected image is performed based on a correction parameter supplied from the correction amount detection unit 105. Besides the trapezoidal distortion caused by the three-dimensional positional relation between the projection unit 101 and the projection target, the distortion to be corrected also includes optical distortion resulting from the optical systems of the projection unit 101 and the camera unit 104.
The projection unit 101 projects the image output from the image processing unit 102 onto a projection target (not shown) such as a screen. Since the projection unit 101 projects from a direction inclined with respect to the projection target (a screen or a wall), trapezoidal distortion occurs in the projected image.
The camera unit 104 captures the test pattern projected from the projection unit 101 onto the projection target. By using the test pattern image captured by the camera unit 104, the correction amount detection unit 105 calculates a correction amount for correcting the above-described trapezoidal distortion and optical distortion included in the projected image from the projection unit 101, and outputs the calculated correction amount to the image processing unit 102. By performing a projective transformation on the output image, the trapezoidal distortion and the optical distortion can be corrected. In the present embodiment, the correction amount detection unit 105 calculates projective transformation parameters as the correction amount.
In the present embodiment, the camera unit 104 is disposed at a position different from the projection position of the projection unit 101, and its optical axis is arranged so that the imaging range covers as much of the projection range of the projection unit 101 as possible. When a specific test pattern is projected from the projection unit 101, the test pattern is captured by the camera unit 104. From the captured image, the correction amount detection unit 105 obtains the distance and direction to the projection target, calculates the projective transformation parameters, and outputs the calculated projective transformation parameters to the image processing unit 102. Thereafter, the image processing unit 102 applies the projective transformation with those parameters to every image input from the image input unit 103, and an image in which the trapezoidal distortion and the optical distortion have been corrected is projected from the projection unit 101.
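As a minimal illustration of what applying a projective transformation to pixel coordinates means in practice (a sketch, not the patent's implementation; the matrix values and points are invented), a 3×3 homography acts on homogeneous coordinates as follows:

```python
import numpy as np

def apply_homography(H, points):
    """Map 2-D pixel coordinates through a 3x3 projective transformation H."""
    pts = np.asarray(points, dtype=float)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))])  # homogeneous coordinates
    mapped = homog @ H.T                                  # apply the transformation
    return mapped[:, :2] / mapped[:, 2:3]                 # perspective divide

# A homography with a small perspective term, as keystone correction produces
H = np.array([[1.0,   0.0, 0.0],
              [0.0,   1.0, 0.0],
              [0.001, 0.0, 1.0]])
corners = apply_homography(H, [[0.0, 0.0], [100.0, 0.0]])
```

Note that the perspective divide is what distinguishes a projective transformation from an affine one: points farther along the x axis are scaled by a different factor, which is exactly the keystone effect.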
Fig. 2 illustrates an internal configuration example of the projection unit 101. The illustrated projection unit 101 includes a liquid crystal panel 201, an illumination optical unit 202, a liquid crystal panel driving unit 204, and a projection optical unit 203.
The liquid crystal panel driving unit 204 drives the liquid crystal panel 201 based on the image signal input from the image processing unit 102, and renders the projection image on its display screen. The illumination optical unit 202 illuminates the liquid crystal panel 201 from behind. In the case where the projection type video display device 100 is a compact projector, for example, an LED (light-emitting diode) or a laser is used as the light source of the illumination optical unit 202. The projection optical unit 203 magnifies the light transmitted through the liquid crystal panel 201 and projects it onto the projection target (not shown). The input image from the image input unit 103 or a test pattern generated within the projection type video display device 100 is projected from the projection unit 101. The projection optical unit 203 includes one or more optical lenses. Since the projection optical unit 203 has lens distortion, optical distortion in addition to trapezoidal distortion occurs in the projected image.
Fig. 3 illustrates an internal configuration example of the image processing unit 102. The illustrated image processing unit 102 includes an image write/read control unit 301, a frame memory 302, an image correction unit 303, an image quality adjustment unit 304, a test pattern generation unit 305, and an output image switching unit 306.
The image supplied from the image input unit 103 is stored in the frame memory 302. The image write/read control unit 301 controls writing and reading of image frames to and from the frame memory 302.
The image correction unit 303 performs a projective transformation on the image read from the frame memory 302 based on the projective transformation parameters received from the correction amount detection unit 105, and corrects the image so that the trapezoidal distortion is eliminated when the image is projected from the projection unit 101 onto the projection target.
The image quality adjustment unit 304 performs image quality adjustment, for example with respect to brightness, contrast, synchronization, tracking, color density, and tint, so that the projected image after distortion correction is in the desired display condition.
The test pattern generation unit 305 generates the test pattern used when the correction amount detection unit 105 calculates the projective transformation parameters. The test pattern has a geometry from which three-dimensional information of the screen serving as the projection target can easily be obtained. The types of test pattern used will be described later.
The output image switching unit 306 switches the image to be output to the projection unit 101. For example, when the input image from an image supply source such as a PC, a television receiver, or a Blu-ray Disc player (none of which are shown in the figure) is projected onto the projection target for a presentation or the like, the output image switching unit 306 outputs the output image of the image quality adjustment unit 304 to the projection unit 101. On the other hand, when the projective transformation parameters used for correcting the trapezoidal distortion and the optical distortion of the projected image are being calculated, the output image switching unit 306 outputs the test pattern generated by the test pattern generation unit 305 to the projection unit 101.
Fig. 4 illustrates an internal configuration example of the correction amount detection unit 105. The illustrated correction amount detection unit 105 includes a captured image write/read control unit 401, a captured image memory 402, a feature point calculation unit 403, and a projective transformation parameter calculation unit 404.
The captured image memory 402 stores images captured by the camera unit 104. In the present embodiment, the captured image memory 402 has a size sufficient to store at least two frames of images captured by the camera unit 104.
The captured image write/read control unit 401 controls writing and reading of captured images to and from the captured image memory 402.
The feature point calculation unit 403 obtains the coordinates of feature points, such as the four corners of the test pattern included in the captured image, by using the captured image read from the captured image memory 402. The projective transformation parameter calculation unit 404 then obtains, based on the calculated coordinates of the feature points, the distance and direction from the projection unit 101 to the projection target, and calculates the projective transformation parameters used for correcting the trapezoidal distortion and the optical distortion of the image projected onto the projection target.
In order to calculate the correction amounts for the trapezoidal distortion and the optical distortion by using the test pattern projected onto the projection target, it may be necessary to extract distinct information of the test pattern. Therefore, unless the procedure is carried out in a dark room or on a completely white screen, there is a possibility of malfunction due to interference from external light such as natural daylight. For example, in the case where the projection type video display device 100 is a compact projector, in order to obtain the advantage of being able to carry it anywhere, the information of the test pattern must be correctly extracted even on a patterned screen. To this end, in the present embodiment, the background information is removed when the information of the test pattern is extracted from the image captured by the camera unit 104; in this way, automatic correction of the trapezoidal distortion and the optical distortion can be realized even when projection is performed in an environment subject to interference from external light and onto a projection target that is not completely white.
In order to remove the background information, in the present embodiment, the camera unit 104 performs image capture twice. For example, while the same background is maintained, a first captured image taken when no test pattern is projected and a second captured image taken when the test pattern is projected are stored in the captured image memory 402. Alternatively, a first captured image taken while a first test pattern is projected and a second captured image taken while a second test pattern different from the first test pattern is projected are stored in the captured image memory 402. The feature point calculation unit 403 then removes the background information by generating the difference between the first captured image and the second captured image read from the captured image memory 402, and makes the information of the test pattern, in which the first test pattern and the second test pattern are combined, more distinct. In this way, even when projection is performed in an environment subject to interference from external light and onto a projection target that is not completely white (one having a pattern), the coordinates of the feature points, such as the four corners of the test pattern, can be correctly calculated, and the projective transformation parameters used for correcting the trapezoidal distortion and the optical distortion can be obtained without malfunction.
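A minimal sketch of this difference-based background removal (illustrative Python, not code from the patent; the image contents are invented, and negative differences are clamped to zero as in the subtraction described later for Figs. 8A and 8B):

```python
import numpy as np

def remove_background(capture_with_pattern, capture_without_pattern):
    """Pixel-wise difference of two captures of the same scene; negative
    differences are clamped to zero, leaving only the projected pattern."""
    diff = (capture_with_pattern.astype(np.int16)
            - capture_without_pattern.astype(np.int16))
    return np.clip(diff, 0, 255).astype(np.uint8)

# Same patterned background in both captures; a bright projected line only in one
background = np.full((4, 6), 80, dtype=np.uint8)
background[:, 3] = 120                 # a stripe on the screen itself
with_pattern = background.copy()
with_pattern[:, 1] = 220               # projected vertical test-pattern line
pattern_only = remove_background(with_pattern, background)
```

Because the screen's own stripe appears identically in both captures, it cancels in the difference; only the projected line survives.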
Here, in order for the feature point calculation unit 403 to generate the coordinates of feature points such as the four corners of the test pattern, the test pattern generated by the test pattern generation unit 305 preferably includes two horizontal lines and two vertical lines.
For example, as shown in Fig. 5A, in the case of using a rectangular test pattern running roughly along the periphery of the projection range of the projection unit 101, a first captured image is taken without projecting the test pattern, a second captured image is taken while projecting the test pattern, and if the background information is removed by generating the difference between the first captured image and the second captured image, the information of the test pattern can be made more distinct.
Alternatively, as shown in the left part of Fig. 5B, a first test pattern including two horizontal lines along the upper and lower edges of the projection range of the projection unit 101 may be used, and as shown in the right part of Fig. 5B, a second test pattern including two vertical lines along the left and right edges of the projection range of the projection unit 101 may be used. In this case, a first captured image is taken while projecting the first test pattern and a second captured image is taken while projecting the second test pattern; similarly, if the background information is removed by generating the difference between the first captured image and the second captured image, the rectangular test pattern running roughly along the periphery of the projection range of the projection unit 101 can be made more distinct. The rectangular test pattern in which the first test pattern and the second test pattern are combined (that is, a pattern similar to the test pattern shown in Fig. 5A) is included in the image after background removal. When the coordinates of feature points such as the four corners of this rectangular test pattern are obtained, the projective transformation parameter calculation unit 404 obtains the distance and direction from the projection unit 101 to the projection target based on this coordinate information, and can calculate the projective transformation parameters used for correcting the trapezoidal distortion of the image projected onto the projection target.
For example, for a display image on the liquid crystal panel 201 containing a test pattern of 630 × 360 pixels, the line width of the horizontal and vertical lines arranged along the periphery is, for example, 6 pixels.
If only the coordinates of the four corners of the projection range of the projection unit 101 are extracted as feature points, the trapezoidal distortion can be corrected. Furthermore, if not only the coordinates of the four corners of the projection range of the projection unit 101 but also the coordinates of additional feature points are extracted from the test pattern, then not only the trapezoidal distortion of the projected image as a whole but also locally occurring distortion can be corrected in more detail. Therefore, the test pattern (or the combination of the first test pattern and the second test pattern) may be composed not only of two horizontal lines and two vertical lines, but also of three or more horizontal lines and three or more vertical lines.
For example, as shown in Fig. 5C, in the case of using a test pattern composed of three or more horizontal lines and three or more vertical lines, a first captured image is taken without projecting the test pattern, a second captured image is taken while projecting the test pattern, and if the background information is removed by generating the difference between the first captured image and the second captured image, the information of the mesh-like test pattern in which the three or more horizontal lines and three or more vertical lines intersect each other can be made more distinct.
Alternatively, a first test pattern composed of three or more horizontal lines as shown in Fig. 5E and a second test pattern composed of three or more vertical lines as shown in Fig. 5F may be used. In this case, a first captured image is taken while projecting the first test pattern and a second captured image is taken while projecting the second test pattern; similarly, if the background information is removed by generating the difference between the first captured image and the second captured image, the information of the test pattern can be made more distinct. The mesh-like test pattern in which the first test pattern and the second test pattern are combined (that is, a pattern similar to the test pattern shown in Fig. 5C) can be included in the image after background removal.
Here, there are cases where part of the test pattern projected from the projection unit 101 is missing, for reasons such as the area of the projection target being smaller than the projection range of the projection unit 101 (or the flat area being very narrow). Fig. 5D illustrates, by broken lines, the missing region in the mesh-like test pattern shown in Fig. 5C (or in the combination of the test patterns shown in Figs. 5E and 5F). When the entire mesh-like test pattern cannot be used for calculating the projective transformation parameters, the feature point calculation unit 403 can calculate the coordinates of the four corners of the largest rectangle that can be extracted from the part of the test pattern normally projected onto the projection target, and the projective transformation parameter calculation unit 404 can calculate the projective transformation parameters by using the coordinates of the four corners of this largest rectangle.
In Fig. 5D, the feature point calculation unit 403 calculates the four corners (the four intersection points indicated by X in the figure) of the largest rectangle (the hatched region in the figure) extracted from the normally projected part of the test pattern. The projective transformation parameter calculation unit 404 can then calculate the projective transformation parameters used for keystone correction by using these four corner points.
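Once four corner correspondences are known, the projective transformation parameters can be recovered by solving the standard eight-equation linear system of the direct linear transform. The following sketch is an illustration under the usual homography model, not the patent's implementation; the corner coordinates are invented:

```python
import numpy as np

def homography_from_corners(src, dst):
    """Solve for the 3x3 projective transformation mapping four src corner
    points onto four dst corner points (direct linear transform, h33 = 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

# Map a unit square onto a keystone-distorted quadrilateral
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(0, 0), (2, 0.1), (1.8, 1.2), (0.1, 1.0)]
H = homography_from_corners(src, dst)
```

Four non-collinear correspondences give exactly eight equations for the eight unknowns, so the solution is exact; the inverse of this transformation pre-distorts the panel image so the projection lands undistorted.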
Fig. 5G illustrates a variation of the test pattern shown in Fig. 5C. A first captured image is taken without projecting the test pattern, a second captured image is taken while projecting the test pattern, and if the background information is removed by generating the difference between the first captured image and the second captured image, the information of the finer mesh-like test pattern can be made more distinct. In addition, Fig. 5H illustrates a variation of the first test pattern shown in Fig. 5E, and Fig. 5I illustrates a variation of the second test pattern shown in Fig. 5F. In this case, a first captured image is taken while projecting the first test pattern and a second captured image is taken while projecting the second test pattern, and if the background information is removed by generating the difference between the first captured image and the second captured image, the information of the test pattern can likewise be made more distinct. The mesh-like test pattern in which the first test pattern and the second test pattern are combined (that is, a pattern similar to the test pattern shown in Fig. 5G) can be included in the image after background removal.
As shown in Figs. 5G to 5I, when the test pattern is made into a finer mesh, the largest rectangle extractable from the test pattern can be estimated accurately even when part of the test pattern projected from the projection unit 101 is missing for reasons such as the area of the projection target being smaller than the projection range of the projection unit 101 (or the flat area being very narrow). The projective transformation parameters can then be calculated by using the feature points at the four corners of the estimated largest rectangle. In addition, since it can easily be detected where on the screen the test pattern is projected from the feature points of the test pattern after background removal, the maximum image size that can be projected onto the screen can be estimated. The image correction unit 303 can perform distortion correction of the projected image by using the projective transformation parameters, and can perform size adjustment of the projected image by matching it to the maximum image size.
In addition, the test patterns shown in Figs. 5G to 5I may each include two diagonal slits that divide the test pattern, corresponding to the diagonals of the rectangle formed by the horizontal and vertical lines constituting the outline of the test pattern. The intersection point of the slits corresponds to the approximate center of the test pattern. Therefore, based on the intersection point of the slits, the position of the maximum range of the test pattern projected on the screen can easily be obtained (the number of intersections above, below, left, and right can be detected).
As described so far, in the present embodiment, the background is removed by using the difference between the first captured image and the second captured image, and a more distinct test pattern is thereby obtained. However, in a system in which the user cannot control the influence of external light or the exposure of the capture (in the case where the camera unit 104 cannot be controlled, or the camera unit 104 is not exposed to the user), there are cases where the brightness differs significantly between the first captured image and the second captured image. When such a luminance difference exists between the captured images, the background cannot be removed accurately simply by taking the difference.
Therefore, the feature point calculation unit 403 performs the background removal by taking the difference after an adjustment in which the brightness of the first captured image and that of the second captured image are matched. For example, one of the captured images is set as a reference, and a process is performed in which the pixel values of the other captured image are multiplied by the ratio of the mean luminances. Furthermore, by considering not only the uniform luminance difference occurring over the entire captured image but also locally occurring luminance differences, a local brightness adjustment is performed.
Specifically, each captured image is divided into a plurality of regions in the horizontal and vertical directions, and the mean luminance is obtained for each region. Then, for each pair of corresponding regions of the two captured images, the ratio based on the mean luminance of one captured image is calculated, and the calculated ratio is multiplied by the pixel values of the other image.
Fig. 5J illustrates a state in which the first captured image A, taken while projecting the first test pattern shown in Fig. 5H, is divided into a plurality of regions in the horizontal and vertical directions and the mean luminance A_Area(0, 1) is obtained in the region Area(0, 1) at row 0 and column 1. Similarly, Fig. 5K illustrates a state in which the mean luminance B_Area(0, 1) is obtained in the region Area(0, 1) at row 0 and column 1 of the second captured image B, taken while projecting the second test pattern shown in Fig. 5I.
Then, when the ratio Scale based on the mean luminance of the first captured image A is calculated according to the following formula (1), the second captured image B' after brightness adjustment is obtained by multiplying the pixel value of each pixel in the same region Area(0, 1) of the second captured image B by this ratio Scale, as shown in formula (2).

Scale = mean luminance A_Area(0, 1) / mean luminance B_Area(0, 1) --- (1)

Captured image B' = Scale × captured image B --- (2)
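Formulas (1) and (2) can be sketched directly as follows (illustrative values; in practice the scaling would be applied region by region as described above):

```python
import numpy as np

def match_region_brightness(region_a, region_b):
    """Scale region B so its mean luminance matches that of region A,
    following formulas (1) and (2)."""
    scale = region_a.mean() / region_b.mean()   # formula (1)
    return np.clip(region_b * scale, 0, 255)    # formula (2), clamped to 8-bit range

a = np.full((8, 8), 120.0)   # Area(0, 1) of captured image A
b = np.full((8, 8), 60.0)    # Area(0, 1) of captured image B, under-exposed
b_adjusted = match_region_brightness(a, b)
```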
The feature point calculation unit 403 normalizes the brightness of the image after background removal obtained by using the difference between the first captured image and the second captured image. However, in the case where the projection target is in a bright environment, the brightness of the test pattern that can be captured by the camera unit 104 becomes very low. If the brightness of the background-removed image is normalized as it is, a problem arises in that, for example, noise in regions unrelated to the test pattern is amplified and the detection accuracy deteriorates. Therefore, the background-removed image is divided into test pattern regions that include the test pattern and the other regions, and by normalizing only the regions that include the test pattern, the amplification of noise can be prevented.
The method of determining whether each region is a test pattern region is arbitrary. For example, since the pixel values change rapidly at the edge portions of the test pattern, the variance of the pixel values becomes high there, so whether each region is a test pattern region can be determined based on the variance value. That is, the feature point calculation unit 403 obtains the variance value for each region, and normalizes the brightness of regions whose variance value is equal to or greater than a threshold, while regions whose variance value is less than the threshold are not normalized.
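A minimal sketch of such a variance-based determination (illustrative only; the region size and threshold value are invented, not taken from the patent):

```python
import numpy as np

def is_pattern_region(region, var_threshold=100.0):
    """A region crossed by a test-pattern edge shows rapid pixel-value
    changes, i.e. high variance; flat background regions do not."""
    return float(np.var(region)) >= var_threshold

flat = np.full((8, 8), 30.0)      # background-only region: variance 0
edge = np.full((8, 8), 30.0)
edge[:, 4:] = 200.0               # region crossed by a pattern line
```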
Alternatively, by focusing on the high-brightness portions illuminated by the test pattern, whether each region is a test pattern region can be determined based on the mean luminance. That is, the feature point calculation unit 403 obtains the mean luminance for each region, and normalizes the brightness of regions whose mean luminance is equal to or greater than a threshold, while regions whose mean luminance is less than the threshold are not normalized. However, in the threshold processing of the mean luminance, when the timing of the projection of the test pattern from the projection unit 101 and the shutter of the camera unit 104 are out of synchronization, it may be necessary to consider the problem that low-brightness regions of the test pattern are erroneously determined.
In addition, in order for the feature point calculation unit 403 to obtain the coordinates of feature points, such as the points at the four corners, from the image after background removal, the following method is adopted: two intersecting line segments are detected (the equations of the line segments are obtained as straight lines or two-dimensional curves), and the intersection point of the two line segments is calculated.
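For straight-line segments, the intersection computation reduces to solving a 2×2 linear system. A minimal sketch (illustrative; the line coefficients are invented, and a fitted curve would need a root-finding step instead):

```python
import numpy as np

def line_intersection(line1, line2):
    """Intersect two lines, each given by coefficients (a, b, c) of a*x + b*y = c."""
    A = np.array([line1[:2], line2[:2]], dtype=float)
    c = np.array([line1[2], line2[2]], dtype=float)
    return np.linalg.solve(A, c)   # (x, y) of the corner feature point

# A vertical pattern line x = 3 and a horizontal pattern line y = 7
corner = line_intersection((1.0, 0.0, 3.0), (0.0, 1.0, 7.0))
```

Computing corners as intersections of fitted lines is more robust than reading a single bright pixel, since the fit averages over every pixel along each line.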
As described above, the background information is removed from the test pattern by using the first captured image and the second captured image. However, due to the influence of external light such as natural daylight, there are cases where the image after background removal is still an image in which the contrast of the test pattern is extremely low. Therefore, the feature point calculation unit 403 performs strong noise reduction processing on the image from which the background information has been removed. For example, strong noise reduction is obtained by using the auto-correlation of the line segments. That is, when obtaining the equation of a line segment, the inclination angle of the line is detected, and by averaging a plurality of pixel values along that direction, other noise components can be separated from the line segments constituting the test pattern.
Fig. 6 illustrates, in the form of a flowchart, the processing procedure by which the projection type video display device 100 corrects the distortion of the image projected onto the projection target. The illustrated processing procedure includes the following processing steps.
S601: Project the test pattern
S602: Capture and store the test pattern
S603: Calculate the coordinates of a plurality of feature points in the test pattern
S604: Calculate the position and direction of the projection target from the coordinates of the plurality of feature points in the test pattern
S605: Calculate the projective transformation parameters based on the position and direction of the projection target
S606: Correct the trapezoidal distortion of the image projected onto the projection target by the projective transformation
Each process is described below.
First, a test image including the test pattern generated by the test pattern generation unit 305 is projected from the projection unit 101 onto the projection target (step S601). Then, the camera unit 104 captures the test pattern projected onto the projection target, and the captured image is stored in the captured image memory 402 (step S602).
As described above, in the present embodiment, image capture is performed twice through steps S601 and S602 in order to remove the background information from the test pattern. The description below explains the case where a first captured image and a second captured image, taken while projecting a first test pattern and a second test pattern that differ from each other, are stored in the captured image memory 402.
Fig. 7A illustrates the first captured image A taken by the camera unit 104 when the first test pattern, composed of two vertical lines along the left and right edges of the projection range, is projected from the projection unit 101 onto the projection target. Likewise, Fig. 7B illustrates the second captured image B taken by the camera unit 104 when the second test pattern, composed of two horizontal lines along the upper and lower edges of the projection range, is projected from the projection unit 101 onto the projection target.
Next, by using the first captured image A and the second captured image B read from the captured image memory 402, the feature point calculation unit 403 obtains, as the feature points of the test pattern included in these captured images, the coordinates of the four corners of the rectangle in which the projected first test pattern and second test pattern are combined (step S603).
Here, the feature point calculation unit 403 makes the information of the test pattern distinct by generating difference information between the first captured image A and the second captured image B to remove the background. The removal of the background information is performed after the brightness adjustment (described earlier) that matches the first captured image A and the second captured image B.
Fig. 8A illustrates the background-removed image C obtained by subtracting the second captured image B from the first captured image A. In this subtraction process, the second captured image B is subtracted from the first captured image A in pixel units, and in pixels where the difference is negative, the pixel value is set to 0. As a result, as shown in Fig. 8A, the background has disappeared from the first captured image in the background-removed image C, and the first test pattern, composed of the two vertical lines along the left and right edges of the projection range, remains.
Similarly, Fig. 8B illustrates the background-removed image D obtained by subtracting the first captured image A from the second captured image B. In this subtraction process, the first captured image A is subtracted from the second captured image B in pixel units, and in pixels where the difference is negative, the pixel value is set to 0. As a result, as shown in Fig. 8B, the background has disappeared from the second captured image in the background-removed image D, and the second test pattern, composed of the two horizontal lines along the upper and lower edges of the projection range, remains.
Note that Figs. 8A and 8B show the result in which the feature point calculation unit 403 obtains the maximum brightness and normalizes the brightness (as described earlier) for each region including the four line segments contained in the background-removed images C and D.
When the captured images A and B of the projected test patterns are taken in a dark environment, distinct background-removed images C and D can be obtained. However, when the images are captured in an environment illuminated by even a small amount of external light such as natural daylight, only background-removed images C and D with low contrast are obtained.
Therefore, the feature point calculation unit 403 performs strong noise reduction processing on the image from which the background information has been removed, by using the auto-correlation of the line segments.
A vertical line with width has strong correlation in the vertical direction, even if it is slightly tilted (similarly, a horizontal line with width has strong correlation in the horizontal direction). Therefore, when, for a given pixel row, the total pixel value at the same horizontal position over a range of approximately ±5 rows above and below is taken as the autocorrelation, the pixel value near positions where the vertical line passes is amplified approximately 10 times. On the other hand, since positions where the vertical line does not pass contain only random noise, summing over the ±5-row range amplifies the pixel value there only approximately 3 times. This autocorrelation thus provides a noise reduction of approximately 3 times.
If the number of pixel rows over which the autocorrelation is taken is increased, the noise reduction improves further. For example, when the total pixel value at the same horizontal position is calculated over 100 pixel rows, the pixel value near positions where the vertical line passes is amplified approximately 100 times. However, in order to take the autocorrelation over 100 pixel rows, the error in the angle of the line segment must be within 0.5 degrees.
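Why the summation helps can be illustrated numerically: the line signal adds up coherently (linearly in the number of rows) while random noise grows only as roughly the square root of that number. The following is a minimal sketch with synthetic data standing in for a captured image; the names and the unit-signal model are assumptions made for the example:

```python
import random

def column_sum(image, col, rows):
    """Total of the pixel values at one horizontal position over `rows` rows,
    i.e. the summation used for the autocorrelation of a vertical line."""
    return sum(image[r][col] for r in range(rows))

random.seed(0)
rows = 100
# Column 0 carries a vertical line (constant signal); column 1 is noise only.
image = [[1.0, random.gauss(0.0, 1.0)] for _ in range(rows)]

line_total = column_sum(image, 0, rows)   # coherent: grows ~100x
noise_total = column_sum(image, 1, rows)  # incoherent: grows only ~sqrt(100) = 10x
```

The contrast between `line_total` and `noise_total` is the noise reduction the text describes.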
Therefore, by combining techniques such as an angle search, even stronger noise reduction can be obtained.
For example, for the vertical lines at the left and right edges of the test pattern shown in the right part of Fig. 5B, the maximum inclination is 13 degrees. Therefore, first, a rough search for the angle that maximizes the autocorrelation is carried out over the range of ±13 degrees at 1-degree intervals. Then, within the 2-degree range around the angle at which the autocorrelation was maximal in the rough search, a detailed angle search by autocorrelation is carried out. When the angle search is repeated three times in this way, each time narrowing the search range and the search interval, the error falls within 0.07 degrees.
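The three-pass coarse-to-fine angle search described above (±13 degrees at 1-degree intervals, then ±1 degree at 1/4-degree intervals, then ±1/4 degree at 1/16-degree intervals) can be sketched as follows; the `score` callable stands in for the autocorrelation total, which this sketch does not implement:

```python
def coarse_to_fine_search(score, half_range=13.0, step=1.0, passes=3, shrink=4.0):
    """Repeated angle search: each pass re-centres on the best angle so far,
    narrows the range to +/- one previous step, and divides the step by `shrink`."""
    best = 0.0
    for _ in range(passes):
        n = int(round(2 * half_range / step))
        candidates = [best - half_range + i * step for i in range(n + 1)]
        best = max(candidates, key=score)
        half_range = step   # e.g. +/-13 deg @ 1 deg, then +/-1 deg @ 1/4 deg, ...
        step /= shrink      # ... then +/-1/4 deg @ 1/16 deg
    return best

# With a peak at 3.3 degrees, three passes land within the final 1/16-degree step.
found = coarse_to_fine_search(lambda a: -(a - 3.3) ** 2)
```

Three passes evaluate 27 + 9 + 9 candidate angles instead of the roughly 420 a single 1/16-degree sweep of ±13 degrees would need.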
In this way, once the angle of the vertical line of the test pattern has been detected correctly for the noise reduction process, the autocorrelation is taken by obtaining the total pixel value over 100 pixel rows along the direction of that angle. In addition, normalization is carried out by scaling so that the maximum of the totals becomes 255. Since the total pixel value is in fact taken over 191 pixel rows, the noise reduction is approximately 14 times.
Figs. 9A to 9D illustrate the state in which the angle search and the noise reduction process are carried out on the background-removed image C shown in Fig. 8A.
The background-removed image C has two vertical lines, one in the left half and one in the right half of the image. Therefore, hereinafter, the image shown in Fig. 9A is divided into left and right halves, and the processing of the left half of the image is described. It will be appreciated that the processing procedure for the right half of the image is similar.
(1) First, the angle is searched while moving one pixel at a time in the horizontal direction along the center line v/2 of the vertical size v. As shown in Fig. 9B, first, over the angular range of ±13 degrees at 1-degree intervals (in the figure, the range between the dotted lines), the pixel values of n pixels above and below are summed.
(2) Then, as the result of the angle search at 1-degree intervals over the ±13-degree range, the angle (anmax01) and the pixel position (imax, v/2) at which the autocorrelation becomes maximal on the center line v/2 of the vertical size v are determined.
(3) Then, as shown in Fig. 9C, a second angle search is carried out at 1/4-degree intervals in the angular range from (anmax01 - 1) to (anmax01 + 1) around the determined angle (anmax01) (in the figure, the range between the dotted lines).
(4) Then, for the angle (anmax02) at which the autocorrelation becomes maximal in the second angle search, a third angle search is carried out at 1/16-degree intervals in the angular range from (anmax02 - 1/4) to (anmax02 + 1/4). By repeating such an angle search three times, the error falls within 0.07 degrees. Through the above processing, the angle (anmax) of the vertical line in the left half of the image is determined.
(5) Then, as shown in Fig. 9D, for the range y = (m/2) to v - (m/2) in the vertical direction (y direction), taking the horizontal (x direction) position x = (y - v/2) × tan(anmax) + imax as the center, the pixels within a range of ±15 pixels in the horizontal direction (in the figure, the range between the dotted lines) and m pixels in the vertical direction are averaged, and the result is set as the pixel value of the position (x, y). Fig. 9D shows the completed image, in which the noise reduction process has been carried out within the range indicated by the dotted lines.
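Step (5) above, averaging m vertical pixels within ±15 pixels of the detected line, can be sketched as follows; the function signature and the list-of-lists image representation are assumptions made for the example:

```python
import math

def average_along_line(image, anmax_deg, imax, m=11, half_width=15):
    """Replace pixels near the detected line x = (y - v/2) * tan(anmax) + imax
    with the average of the m pixels above and below them."""
    v, h = len(image), len(image[0])
    t = math.tan(math.radians(anmax_deg))
    out = [row[:] for row in image]
    for y in range(m // 2, v - m // 2):
        cx = int(round((y - v / 2) * t + imax))
        for x in range(max(cx - half_width, 0), min(cx + half_width, h - 1) + 1):
            col = [image[y + dy][x] for dy in range(-(m // 2), m // 2 + 1)]
            out[y][x] = sum(col) / len(col)
    return out
```

Vertical averaging leaves a vertical (or near-vertical) line intact while suppressing uncorrelated noise around it.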
In addition, by applying the processing of (1) to (5) above to the right half of the image as well, the powerful noise reduction process by autocorrelation is realized for the whole background-removed image C. The same powerful noise reduction process by autocorrelation can be applied similarly to the background-removed image D shown in Fig. 8B, which contains the two horizontal lines.
Fig. 10A illustrates the background-removed image C2 obtained by applying the powerful noise reduction process by autocorrelation to the background-removed image C shown in Fig. 8A. Similarly, Fig. 10B illustrates the background-removed image D2 obtained by applying the same process to the background-removed image D shown in Fig. 8B.
Characteristic point calculating part 403 obtains the coordinates of each of the four corners of the exposure range of Projection Division 101 by calculating the intersections of the two horizontal lines and the two vertical lines contained in the background-removed images C2 and D2.
Here, if the distortion occurring in the projected image were only trapezoidal distortion, the projected image of the test pattern made up of vertical lines and horizontal lines would be expected to be linear. However, since lens distortion exists in the projection optical section 203 and camera section 104, the four line segments projected onto the projection target (in other words, the line segments observed in the background-removed images C2 and D2) become slightly curved.
Therefore, characteristic point calculating part 403 obtains the intersections by detecting these four line segments contained in the background-removed images C2 and D2 as two-dimensional curves.
Specifically, for each of the background-removed images C2 and D2, characteristic point calculating part 403 examines the brightness values along rays running from the outside of the image toward the center in a direction substantially perpendicular to the line, and sets the position data of the line segment at the center of the region with the highest brightness value on each ray. Fig. 11A illustrates the state in which the position data of a line segment is extracted from the background-removed image. In the figure, the position data of the line segment is depicted as a solid line.
Then, by using the least squares method, for example, characteristic point calculating part 403 calculates from the multiple position data the coefficients a, b, and c of the two-dimensional curve y = ax^2 + bx + c approximating each of the four line segments. The least squares method is used for the detection of the two-dimensional curve, and by processing the multiple points of a line segment together, precision of a fraction of a pixel can be obtained. Fig. 11B illustrates the state in which two-dimensional curves have been detected from the two horizontal lines and two vertical lines in the background-removed images C2 and D2. In the figure, the detected two-dimensional curves are depicted as dotted lines.
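The least-squares fit of y = ax^2 + bx + c follows directly from the normal equations; the solver below is a generic sketch of that standard technique, not code from the patent:

```python
def fit_quadratic(points):
    """Least-squares coefficients (a, b, c) of y = a*x^2 + b*x + c
    through the sampled positions of one line segment."""
    s = [0.0] * 5   # sums of x^0 .. x^4
    t = [0.0] * 3   # sums of y, y*x, y*x^2
    for x, y in points:
        xp = 1.0
        for k in range(5):
            s[k] += xp
            if k < 3:
                t[k] += y * xp
            xp *= x
    # Normal equations (A^T A) w = A^T y for w = (a, b, c).
    m = [[s[4], s[3], s[2], t[2]],
         [s[3], s[2], s[1], t[1]],
         [s[2], s[1], s[0], t[0]]]
    for i in range(3):  # Gauss-Jordan elimination with partial pivoting
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(3):
            if r != i:
                f = m[r][i] / m[i][i]
                m[r] = [mr - f * mi for mr, mi in zip(m[r], m[i])]
    return [m[i][3] / m[i][i] for i in range(3)]
```

Because every sampled point contributes to the fit, the recovered curve position is more precise than any single pixel measurement, which is the sub-pixel precision the text refers to.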
In addition, characteristic point calculating part 403 obtains the intersections of the four two-dimensional curves and sets these as the coordinates of the four corners. Fig. 11C illustrates the state in which the intersections of the four two-dimensional curves are obtained. In the figure, the positions of the four obtained corners are marked with an X.
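One simple way to intersect a horizontal curve y = ax^2 + bx + c with a vertical curve x = dy^2 + ey + f is fixed-point iteration. The patent does not specify how the intersections are computed, so the following is purely a hypothetical sketch; it relies on the line segments being only slightly curved, as stated above:

```python
def curve_intersection(horiz, vert, x0=0.0, iters=50):
    """Intersection of y = a*x^2 + b*x + c (horizontal curve) with
    x = d*y^2 + e*y + f (vertical curve) by fixed-point iteration.
    Converges when both curves are nearly straight."""
    a, b, c = horiz
    d, e, f = vert
    x = x0
    for _ in range(iters):
        y = a * x * x + b * x + c   # project onto the horizontal curve
        x = d * y * y + e * y + f   # pull x back onto the vertical curve
    return x, a * x * x + b * x + c
```

Repeating this for all four curve pairs yields the four corner coordinates.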
If the distortion occurring in the projected image were only trapezoidal distortion, the projected image of the test pattern made up of vertical lines and horizontal lines would be expected to be linear (as described previously), and the line segments of the test pattern could be detected by carrying out the angle search centered on the 0-degree orientation in the vertical direction or the horizontal direction. However, since lens distortion actually exists in the projection optical section 203 and camera section 104, the test patterns projected onto the projection target have nonlinear distortion, and there are therefore cases in which no best solution exists within a search range centered on the vertical or horizontal direction. In general, it is known that lens distortion in a projected image produces pincushion distortion, in which the image shrinks at the center of the field of view and broadly expands toward the edges (see Fig. 12), and barrel distortion, in which the image expands at the center of the field of view and broadly shrinks toward the edges (see Fig. 13). Pincushion and barrel distortion are both symmetric left-right and top-bottom.
Although a best solution might be found by expanding the search range of the angle search, the amount of calculation would also increase. In particular, in the case of using the detailed mesh test pattern shown in Fig. 5G, the amount of calculation would increase significantly. In addition, in the case of using a test pattern with small intervals between adjacent line segments, as shown in Fig. 5G, there is a possibility of mistakenly detecting an adjacent line segment when the search range is expanded.
Therefore, the angle search can be carried out by adaptively changing the center of the search range, without expanding the search range itself. Specifically, the angle search is carried out with its center set to the result of the angle search of the adjacent line segment carried out immediately before. This is because the variation of lens distortion is presumed to be gradual, so that the variation of the angle between adjacent line segments is negligible.
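The adaptive-center strategy, seeding each segment's search with the angle found for the previous adjacent segment, can be sketched as follows; the `scores` callables stand in for per-segment autocorrelation totals, which the sketch does not implement:

```python
def search_adjacent_segments(scores, half_range=2.0, step=0.25):
    """Angle search over many parallel segments, centring each segment's
    search range on the angle found for the previous (adjacent) segment."""
    center, found = 0.0, []
    for score in scores:          # one autocorrelation score per line segment
        n = int(round(2 * half_range / step))
        candidates = [center - half_range + i * step for i in range(n + 1)]
        center = max(candidates, key=score)
        found.append(center)
    return found
```

Because each search window follows the slowly varying distortion, a narrow ±2-degree range suffices even when the absolute angles drift well beyond it across the pattern.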
The distortion correction processing of the projected image is described referring again to Fig. 6. Next, based on the coordinates of the four corners calculated above, projective transformation parameter calculating part 404 obtains the distance and direction to the projection target, calculates the projective transformation parameters used to correct the trapezoidal distortion of the image projected onto the projection target, and outputs the calculated projective transformation parameters to the image rectification portion (step S604).
Afterwards, based on the projective transformation parameters received from correcting value test section 105, image rectification portion 303 performs a projective transformation on the image read from frame memory 302, and carries out correction such as eliminating the trapezoidal distortion that occurs when projecting from Projection Division 101 onto the projection target (step S605).
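The projective transformation applied here amounts to mapping each pixel through a 3x3 homography matrix. A minimal sketch of the point mapping follows (the estimation of the matrix in step S604 is not shown, and the list-of-lists matrix representation is an assumption made for the example):

```python
def apply_homography(h, x, y):
    """Map an image point through a 3x3 projective transformation matrix.
    The division by w is what lets straight lines stay straight while
    a trapezoid is mapped back to a rectangle."""
    xn = h[0][0] * x + h[0][1] * y + h[0][2]
    yn = h[1][0] * x + h[1][1] * y + h[1][2]
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return xn / w, yn / w
```

With the parameters computed in step S604, applying this mapping to every pixel of the frame-memory image pre-distorts it so that the projection appears as a correct rectangle.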
In this way, according to the projection type video display device 100 of the present embodiment, the correction amount is automatically detected based on the captured image of the test pattern taken by camera section 104, which is arranged at a position different from the irradiation position of Projection Division 101, and the image irradiated from Projection Division 101 can be corrected so that it is projected onto the projection target as a correct rectangle.
In addition, according to the projection type video display device 100 of the present embodiment, since the powerful noise reduction process by autocorrelation is carried out on the captured image of the test pattern taken by camera section 104, the background influence caused by external light such as natural daylight is removed, and the coordinates of the four corners can be obtained accurately. Therefore, the projection type video display device 100 can be used not only in a dark room but also in a bright room, and even then the image irradiated from Projection Division 101 can be corrected so that it is projected onto the projection target as a correct rectangle.
Additionally, the present technology may also be configured as follows.
(1) A projection type video display device including:
a Projection Division configured to project an image onto a projection target;
a camera section, arranged at a position different from the irradiation position of the Projection Division, configured to capture the image projected onto the projection target;
a correcting value test section configured to, when a test pattern is projected from the Projection Division onto the projection target, remove the background from the captured image taken by the camera section, detect information on the coordinates of the test pattern in the background-removed captured image, and calculate, based on the information on the coordinates, a correction parameter used to correct the image projected from the Projection Division; and
an image rectification portion configured to correct the image projected from the Projection Division based on the correction parameter.
(2) The projection type video display device according to (1),
wherein the correcting value test section calculates the coordinates of a plurality of characteristic points of the test pattern from the background-removed captured image, and calculates, from the coordinates of the plurality of characteristic points, projective transformation parameters used to correct the distortion of the image projected onto the projection target, and
wherein the image rectification portion performs a projective transformation on the image projected from the Projection Division by using the projective transformation parameters.
(3) The projection type video display device according to (1),
wherein the camera section captures a first captured image when the test pattern is not irradiated from the Projection Division, and captures a second captured image when the test pattern is irradiated from the Projection Division, and
wherein the correcting value test section removes the information of the background from the second captured image by generating the difference between the first captured image and the second captured image, and obtains the background-removed captured image containing the test pattern.
(4) The projection type video display device according to (1),
wherein the camera section captures a first captured image when a first test pattern is irradiated from the Projection Division, and captures a second captured image when a second test pattern is irradiated from the Projection Division, and
wherein the correcting value test section removes the information of the background from the first captured image and the second captured image by generating the difference between them, and obtains a background-removed captured image containing a test pattern in which the first test pattern and the second test pattern are combined.
(5) The projection type video display device according to (3) or (4),
wherein the correcting value test section carries out the removal processing of the information of the background after carrying out a brightness adjustment between the first captured image and the second captured image.
(6) The projection type video display device according to (5),
wherein the correcting value test section carries out the brightness adjustment processing by taking one of the first captured image and the second captured image as a reference and multiplying the pixel values of the other image by the ratio of the average brightness.
(7) The projection type video display device according to (6),
wherein the correcting value test section carries out the brightness adjustment processing for each of the regions into which the first captured image and the second captured image are each divided in the horizontal and vertical directions.
(8) The projection type video display device according to (1),
wherein the correcting value test section normalizes the brightness of the region containing the test pattern in the background-removed captured image.
(9) The projection type video display device according to (8),
wherein the correcting value test section normalizes the brightness only for regions in which the dispersion value of the pixel values in the captured image is equal to or greater than a predetermined threshold.
(10) The projection type video display device according to (5),
wherein a mesh-shaped test pattern including a plurality of vertical lines and a plurality of horizontal lines is used,
wherein the correcting value test section calculates the projective transformation parameters used to correct the distortion, based on the coordinates of a plurality of characteristic points formed by the respective intersections of the vertical lines and the horizontal lines of the test pattern contained in the background-removed captured image, and
wherein the image rectification portion performs a projective transformation on the image projected from the Projection Division by using the projective transformation parameters.
(11) The projection type video display device according to (10),
wherein the test pattern further includes two slits corresponding to the diagonals of the rectangle forming the outline of the test pattern.
(12) The projection type video display device according to (10),
wherein the correcting value test section estimates the maximum image size that can be projected from the Projection Division onto the projection target, based on the coordinates of the characteristic points that can be detected from the background-removed captured image.
(13) The projection type video display device according to (4),
wherein a first test pattern including a plurality of vertical lines and a second test pattern including a plurality of horizontal lines are used,
wherein the correcting value test section calculates the projective transformation parameters used to correct the distortion, based on the coordinates of a plurality of characteristic points formed by the respective intersections of the vertical lines and the horizontal lines of the test pattern contained in the background-removed captured image, and
wherein the image rectification portion performs a projective transformation on the image projected from the Projection Division by using the projective transformation parameters.
(14) The projection type video display device according to (13),
wherein the first test pattern and the second test pattern each include two slits corresponding to the diagonals of the rectangle forming the outline of the test pattern.
(15) The projection type video display device according to (13),
wherein the correcting value test section estimates the maximum image size that can be projected from the Projection Division onto the projection target, based on the coordinates of the characteristic points that can be detected from the background-removed captured image.
(16) The projection type video display device according to (1),
wherein the correcting value test section detects information on the coordinates of each characteristic point of the test pattern in the captured image after additionally carrying out a noise reduction process by autocorrelation on the captured image from which the information of the background has been removed, and calculates the correction parameter based on the information on the coordinates of each characteristic point.
(17) The projection type video display device according to (16),
wherein, after carrying out the noise reduction process with an angle search in the vertical direction or the horizontal direction on the captured image of a test pattern including a plurality of vertical lines or a plurality of horizontal lines, the correcting value test section calculates the coordinates of a plurality of characteristic points formed by the respective intersections of the vertical lines and the horizontal lines of the test pattern, and calculates the projective transformation parameters used to correct the distortion based on the information on the coordinates of each intersection.
(18) The projection type video display device according to (17),
wherein the correcting value test section detects each vertical line or horizontal line of the test pattern contained in the captured image as a two-dimensional curve.
(19) The projection type video display device according to (17),
wherein the correcting value test section carries out the angle search with its center set to the result of the angle search of the adjacent line segment carried out immediately before.
(20) An image projecting method including:
projecting a test pattern onto a projection target;
obtaining a captured image by photographing the test pattern projected onto the projection target;
removing the background from the captured image;
detecting information on the coordinates of the test pattern contained in the background-removed captured image, and calculating, based on the information on the coordinates, a correction parameter used to correct the image projected from a Projection Division; and
correcting the projected image based on the correction parameter.
(21) A computer program written in a computer-readable format that causes a computer to execute:
projecting a test pattern onto a projection target;
obtaining a captured image by photographing the test pattern projected onto the projection target;
removing the background from the captured image;
detecting information on the coordinates of the test pattern contained in the background-removed captured image, and calculating, based on the information on the coordinates, a correction parameter used to correct the image projected from a Projection Division; and
correcting the projected image based on the correction parameter.
The technology disclosed in the present specification has been described in detail above with reference to specific embodiments. However, it is evident that those skilled in the art may make modifications or substitutions to these embodiments without departing from the scope of the technology disclosed herein.
Although an embodiment relating to a projection type video display device with an integrated camera has been described in the present disclosure, the technology disclosed herein can be applied similarly even in the case in which the camera can be separated from the projection type video display device or is connected externally to the main body.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims of the present invention or the equivalents thereof.