This application is a continuation application of U.S. application Ser. No. 14/425,275, filed Mar. 2, 2015, which was the National Stage of International Application No. PCT/JP2012/072591, filed Sep. 5, 2012, the disclosures of which are incorporated herein in their entireties by reference.
TECHNICAL FIELD
The present invention relates to an image capture device and an image processing method.
BACKGROUND ART
As a medium for data, paper has generally been used. Recently, however, electronic data has also become widespread as a medium for data. For this reason, there are an increasing number of occasions on which data printed on paper is stored as image data.
As a technology for storing data printed on paper as image data, for example, there is a technology disclosed in Patent Document 1. In this technology, a positioning symbol is printed on rectangular paper, and a region to be stored is cut out, on the basis of the positioning symbol, from image capture data obtained by capturing an image of the paper. Further, in Patent Document 1, trapezoid correction is performed with respect to the cut-out region.
RELATED DOCUMENT
Patent Document
[Patent Document 1] Japanese Laid-open Patent Publication No. 2012-068746
DISCLOSURE OF THE INVENTION
According to the technology disclosed in Patent Document 1, only a necessary portion of the image data is able to be cut out and stored. However, in the technology disclosed in Patent Document 1, the positioning symbol needs to be printed in advance on the object (for example, paper) to be processed. For this reason, it is not possible to computerize only a necessary portion of data printed on an object on which a positioning symbol is not printed.
An example of an object of the present invention is to cut out only a necessary portion in image data even when a positioning symbol is not printed on an object to be stored as image data.
The invention according to an example embodiment is an image capture device including a mark irradiation unit which irradiates an object with a mark; an image capture unit which captures an image of the object and generates image data; and an image capture area data generation unit which recognizes a position of the mark in the image data, and cuts out image capture area data which is a part of the image data on the basis of the position.
The invention according to another example embodiment is an image processing method including capturing an image of an object and generating image data in a state in which the object is irradiated with a mark; and recognizing a position of the mark in the image data and cutting out image capture area data which is a part of the image data on the basis of the mark by using a computer.
BRIEF DESCRIPTION OF THE DRAWINGS
The above-described object and other objects, characteristics, and advantages will be made more obvious by the preferred embodiments described below and the accompanying drawings.
FIG. 1 is a perspective view illustrating a configuration of an image capture device according to a first embodiment.
FIG. 2 is a block diagram illustrating a functional configuration of the image capture device.
FIG. 3 is a diagram illustrating a hardware configuration of the image capture device illustrated in FIG. 2.
FIG. 4 is a cross-sectional view illustrating a structure of an organic EL element provided in an organic EL panel as an illumination panel.
FIG. 5 is a diagram illustrating a first example of a mark irradiation of a mark irradiation unit.
FIGS. 6(a) and 6(b) are diagrams each illustrating another example of the mark irradiation of the mark irradiation unit.
FIG. 7 is a diagram illustrating an example of a method in which a control unit corrects an irradiation range of the mark irradiation unit, and thus the mark is formed in a desired shape.
FIG. 8 is a diagram for describing another function of the mark irradiation unit.
FIG. 9 is a flowchart illustrating a first example of an operation of the image capture device.
FIG. 10 is a flowchart illustrating a second example of the operation of the image capture device.
FIG. 11 is a flowchart illustrating a third example of the operation of the image capture device.
FIG. 12 is a diagram for describing a method of calculating an estimated position of a mark in a second embodiment.
FIG. 13 is a diagram illustrating a configuration of an image capture device according to a third embodiment.
FIG. 14 is a block diagram illustrating a functional configuration of the image processing device.
FIG. 15 is a flowchart for describing a first example of processing performed by the image processing device.
FIG. 16 is a flowchart for describing a second example of the processing performed by the image processing device.
DESCRIPTION OF EMBODIMENTS
Hereinafter, embodiments of the present invention will be described with reference to the drawings. Furthermore, in all the drawings, the same reference numerals are applied to the same constituents, and the description thereof will not be repeated.
First Embodiment
FIG. 1 is a perspective view illustrating a configuration of an image capture device 10 according to a first embodiment. FIG. 2 is a block diagram illustrating a functional configuration of the image capture device 10. The image capture device 10 includes a mark irradiation unit 130, an image capture unit 140, and an image capture area data generation unit 230 (illustrated in FIG. 2). The mark irradiation unit 130 irradiates an object with a mark. The object is a sheet-like material, for example, paper on which data is printed, but may be another material, for example, a commercial product on display. The image capture unit 140 captures an image of the object and generates image data. The image capture area data generation unit 230 recognizes the mark in the image data, and cuts out image capture area data, which is a part of the image data, on the basis of the mark. According to this embodiment, the mark irradiation unit 130 irradiates the object with the mark, and thus even when a positioning symbol is not printed on the object to be stored as the image data, only a necessary portion of the image data is cut out. Hereinafter, the details will be described.
First, a structural configuration of the image capture device 10 will be described with reference to FIG. 1. As illustrated in FIG. 1, the mark irradiation unit 130 and the image capture unit 140 are embedded in an edge portion 155 of a holding member 150. The edge portion 155 is rotatably attached to the holding member 150. A direction of the edge portion 155 when based on the holding member 150, that is, an angle of the edge portion 155 with respect to the holding member 150, is detected by a position angle detection unit 135.
The holding member 150 further holds an illumination unit 170. The illumination unit 170 illuminates the object, and brightens an image shown by image capture data. One end of the holding member 150 is attached to a guide member 120 through an attachment portion 160. Furthermore, the edge portion 155 is attached to the other end side of the holding member 150, that is, a side of the holding member 150 opposite to the attachment portion 160 across the illumination unit 170.
The illumination unit 170, for example, includes an organic electroluminescence (EL) panel as a light source. The organic EL panel includes a plurality of types of organic EL elements having spectral properties different from each other. The organic EL elements, for example, emit colors of light different from each other. A combination of colors of the light emitted by the illumination unit 170 is arbitrary, and RGB (red, green, and blue) or RYB (red, yellow, and blue) is included as an example. Intensities of the plurality of colors are controlled by a control unit 220 illustrated in FIG. 2. For this reason, a user of the image capture device 10 is able to adjust the illumination light of the object to have a desired color. In particular, when the illumination unit 170 includes the organic EL elements, it is possible to make a spectrum of the light emitted by the illumination unit 170 broad. In this case, it is possible for the light of the illumination unit 170 to feel like soft natural light to an observer. Furthermore, the light source of the illumination unit 170 is not limited to the organic EL panel, and, for example, may be a light emitting diode (LED).
The illumination unit 170 may include a plurality of organic EL panels. In this case, the plurality of organic EL panels may emit light having colors identical to each other (including a combination of a plurality of colors). In addition, at least one organic EL panel may include an organic EL element emitting light having a color different from those of the other organic EL panels. In addition, at least one organic EL panel may have a spectral width different from those of the other organic EL panels. In this case, the illumination unit 170 may switch between the organic EL panels emitting the light.
In this embodiment, a planar shape of the holding member 150 is a rectangle. The attachment portion 160 attaches a side of the holding member 150 to the guide member 120. Further, the attachment portion 160 rotatably attaches the holding member 150 to the guide member 120 with the attachment portion 160 as a pivot point. The mark irradiation unit 130 and the image capture unit 140 are attached to a side of the holding member 150 opposite to the side which is attached to the guide member 120.
The guide member 120 is attached to a pedestal 110, and extends upward from the pedestal 110. The holding member 150 is attached to the guide member 120 such that the holding member 150 is able to be moved in a vertical direction along the guide member 120. In the attachment portion 160, a stopper is embedded. The stopper is disposed to fix a position of the holding member 150 in the vertical direction and to fix an angle of the attachment portion 160. For this reason, the user of the image capture device 10 moves the holding member 150 to a desired height, and rotates the holding member 150 to a desired angle, and then is able to fix the height and the angle.
Further, in the attachment portion 160, an angle detection unit 162 and a position detection unit 164 are embedded.
The angle detection unit 162 detects a direction of the holding member 150 when based on the guide member 120, that is, an angle of the holding member 150 with respect to the guide member 120. The pedestal 110 is disposed in a place (for example, on a desk) identical to that of the object (for example, the paper on which the data is printed) of the processing. In addition, an angle of the guide member 120 with respect to the pedestal 110 is fixed (for example, 90 degrees). For this reason, the image capture device 10 is able to calculate an angle of the mark irradiation unit 130 and the image capture unit 140 with respect to a surface of the object to which the mark is applied on the basis of a detection result of the angle detection unit 162.
The position detection unit 164 detects a position of the attachment portion 160 in a direction along the guide member 120. The guide member 120 is fixed to the pedestal 110. For this reason, the image capture area data generation unit 230 (illustrated in FIG. 2) is able to calculate a distance (or a height) of the mark irradiation unit 130 and the image capture unit 140 with respect to the object on the basis of a detection result of the position detection unit 164.
In the pedestal 110, an electronic component for control is embedded. A control system of the image capture device 10 is configured by the electronic component.
Furthermore, the mark irradiation unit 130 is controlled such that the center of an image capture area shown by the mark is coincident with the center of an image capture area of the image capture unit 140. In addition, the center of an illumination region of the illumination unit 170 is coincident with these centers.
Next, a function of the image capture device 10 will be described with reference to FIG. 2. The image capture device 10 includes an input unit 210, the control unit 220, the image capture area data generation unit 230, and a data storage unit 240 in addition to the image capture unit 140, the angle detection unit 162, the position detection unit 164, and the illumination unit 170.
The input unit 210 acquires input information from the user of the image capture device 10. The input information, for example, is information indicating the image capture area shown by the mark of the mark irradiation unit 130, information indicating an intensity and a color hue of the light of the illumination unit 170, and information indicating a generation timing of the image data of the image capture unit 140. The control unit 220 controls the mark irradiation unit 130, the image capture unit 140, and the illumination unit 170 according to the information input from the input unit 210.
In this embodiment, the image capture area data generation unit 230 recognizes the mark in the image data by image processing, and cuts out the image capture area data on the basis of the mark. For this reason, it is possible to cut out the image capture area data with high accuracy. In addition, the image capture area data generation unit 230 corrects a distortion of the image capture area data on the basis of the detection results of the angle detection unit 162 and the position detection unit 164. The correction, for example, is processing referred to as trapezoid correction.
The image capture area data generation unit 230 performs the trapezoid correction with respect to the image capture area data on the basis of a position (an example illustrated in FIG. 6(b) described later) or a shape (an example illustrated in FIG. 5 or FIG. 6(a) described later) of the mark in the image data generated by the image capture unit 140.
Specifically, the mark irradiation unit 130 draws the mark such that the image capture area has a shape determined in advance, for example, a square or a rectangle. Then, the image capture area data generation unit 230 performs the trapezoid correction with respect to the image capture area data such that a shape shown by the image capture area data is the above-described shape determined in advance (for example, a square or a rectangle). Accordingly, when the image capture area data is displayed, the displayed information is easily viewed.
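As an illustration only, the following minimal sketch shows one way such trapezoid (perspective) correction could be performed with OpenCV: the four mark corners detected in the image data are mapped onto a rectangle of a predetermined size. The corner coordinates, output size, and function name are assumptions for the example and do not correspond to a specific implementation of the embodiment.

```python
import cv2
import numpy as np

def trapezoid_correct(image, mark_corners, out_w=800, out_h=600):
    """Warp the region bounded by the four mark corners to a fixed rectangle.

    mark_corners: corners of the image capture area in the image data,
    assumed to be ordered top-left, top-right, bottom-right, bottom-left.
    """
    src = np.array(mark_corners, dtype=np.float32)
    dst = np.array([[0, 0], [out_w - 1, 0],
                    [out_w - 1, out_h - 1], [0, out_h - 1]], dtype=np.float32)
    # Perspective transform mapping the (possibly trapezoidal) area to the rectangle.
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, matrix, (out_w, out_h))

# Hypothetical usage with corners recognized from the mark in the image data:
# image = cv2.imread("capture.png")
# corrected = trapezoid_correct(image, [(120, 80), (990, 95), (1010, 700), (100, 690)])
```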
Furthermore, the control unit 220 changes an irradiation angle of the mark emitted by the mark irradiation unit 130 on the basis of the detection results of the angle detection unit 162, the position detection unit 164, and the position angle detection unit 135, and thus the mark shows a desired image capture area (for example, a square or a rectangle) on the object. This processing will be described later in detail.
In addition, both of the mark irradiation unit 130 and the image capture unit 140 have optical systems such as a lens. These optical systems necessarily have individual differences. For this reason, the image capture area data has a distortion due to these individual differences.
In contrast, in this embodiment, the image capture area data generation unit 230 stores, in advance, correction parameters for correcting the distortion of the image capture area data due to an individual difference of at least one of the image capture unit 140 and the mark irradiation unit 130. Then, the image capture area data generation unit 230 corrects the image capture area data by using the correction parameters. The correction parameters, for example, are generated as follows.
First, the position and the angle of the holding member 150 and the angle of the edge portion 155 in the image capture device 10 are adjusted such that the detection values of the angle detection unit 162, the position detection unit 164, and the position angle detection unit 135 are values determined in advance. Subsequently, the image capture unit 140 captures an image of an object having a shape determined in advance (for example, a sheet on which a mark determined in advance is printed) and generates the image capture data. The image capture area data generation unit 230 sets the correction parameters such that the shape of the object shown by the image capture data (or a shape of an area defined by the mark) is the shape determined in advance. When the correction parameters obtained by this method are used, the image capture area data generation unit 230 is able to correct the distortion of the image capture area data due to the individual differences of both of the mark irradiation unit 130 and the image capture unit 140. Furthermore, the correction parameters are able to be generated at an arbitrary timing.
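One conceivable way to generate such correction parameters is sketched below, under the assumption that the correction is represented as a single fixed homography estimated once from a capture of the reference sheet and reused for later captures. The reference coordinates, scale factor, and function names are illustrative assumptions, not the embodiment's actual parameter format.

```python
import cv2
import numpy as np

def generate_correction_parameters(reference_image_corners, true_corners_mm, scale=4.0):
    """Estimate a fixed correction homography from one calibration capture.

    reference_image_corners: corners of the known mark as they appear in the
    captured image data (pixels). true_corners_mm: their real positions on the
    reference sheet (millimetres); scale converts millimetres to output pixels.
    """
    src = np.array(reference_image_corners, dtype=np.float32)
    dst = np.array(true_corners_mm, dtype=np.float32) * scale
    matrix, _ = cv2.findHomography(src, dst)
    return matrix  # stored in advance and reused for every later capture

def apply_correction(image, matrix, out_size):
    # Re-apply the stored parameters to correct a newly captured image.
    return cv2.warpPerspective(image, matrix, out_size)
```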
The data storage unit 240 stores the image capture area data after being corrected by the image capture area data generation unit 230. The data storage unit 240 may be a volatile memory or a nonvolatile memory, and may be a hard disk. The data storage unit 240 may be an external storage device (for example, an external hard disk, an external nonvolatile memory, or the like) of the image capture device 10, and may be embedded in the pedestal 110 of the image capture device 10.
In addition, when the illumination unit 170 emits light having a plurality of colors, and the intensities of the plurality of colors are able to be controlled independently of each other, the image capture area data generation unit 230 acquires illumination parameters indicating each intensity of the plurality of colors of the light emitted by the illumination unit 170 from the control unit 220. Then, the image capture area data generation unit 230 performs color correction with respect to the image capture area data by using the acquired illumination parameters. Accordingly, a color of the object shown by the image capture area data is close to the original color even when the color of the object is shifted from the original color due to a color of the illumination light.
In particular, in this embodiment, a combination between the mark irradiation unit 130 and the image capture unit 140 is fixed. For this reason, the parameters for the color correction are able to be adjusted in advance and are able to be fixed. Accordingly, it is possible to correct the color with high accuracy.
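A minimal sketch of such color correction follows, assuming that the illumination parameters are per-channel intensities of the RGB light emitted by the illumination unit 170 and that dividing the image by the normalized illumination color brings the object's color back toward its appearance under neutral light. The parameter format is an assumption made only for this example.

```python
import numpy as np

def correct_color(image_rgb, illumination_params):
    """Scale each color channel by the inverse of the illumination intensity.

    image_rgb: float array in [0, 1], shape (H, W, 3).
    illumination_params: dict of per-channel intensities, e.g.
    {"r": 0.9, "g": 1.0, "b": 0.7} (assumed format).
    """
    gains = np.array([illumination_params["r"],
                      illumination_params["g"],
                      illumination_params["b"]], dtype=np.float32)
    gains = gains / gains.max()      # normalize so the strongest channel is 1
    corrected = image_rgb / gains    # undo the tint of the illumination light
    return np.clip(corrected, 0.0, 1.0)
```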
Furthermore, in the description illustrated in FIG. 2, each constituent of the image capture device 10 is illustrated not as a hardware unit configuration, but as a functional unit block. Each constituent of the image capture device 10 is realized by an arbitrary combination of hardware and software of an arbitrary computer, based on a CPU, a memory, a program which realizes the constituents of this drawing and is loaded in the memory, a storage medium such as a hard disk storing the program, and an interface for network connection. Various modification examples of the realization method and the realization device thereof are possible.
FIG. 3 is a diagram illustrating a hardware configuration of the image capture device 10 illustrated in FIG. 2. In the example illustrated in this drawing, the mark irradiation unit 130 includes a semiconductor laser 131, a laser controller 132, a laser driver 133, a MEMS mirror 134, the position angle detection unit 135, a mirror controller 136, and a mirror driver 137. The semiconductor laser 131 emits laser light for drawing the mark. The laser light, for example, is visible light. The laser controller 132 and the laser driver 133 control the semiconductor laser 131. The MEMS mirror 134 reflects the light emitted by the semiconductor laser 131. The MEMS mirror 134 changes a reflection direction of the light, and thus the mark is drawn. The reflection direction of the light due to the MEMS mirror 134 is controlled by the mirror controller 136 and the mirror driver 137. Specifically, the position angle detection unit 135 detects a direction in which the MEMS mirror 134 is directed. Then, the mirror controller 136 controls the MEMS mirror 134 through the mirror driver 137 by using a detection result of the position angle detection unit 135.
The illumination unit 170 includes an illumination panel 171, an illumination driver 172, and an illumination controller 173. The illumination panel 171, for example, is the organic EL panel described above. The illumination controller 173 and the illumination driver 172 control the illumination panel 171.
In addition, the image capture device 10 includes a CPU 222, a memory 242, an I/O controller 250, and a wireless communication unit 252. The CPU 222 corresponds to the control unit 220 and the image capture area data generation unit 230 in FIG. 2. These and the input unit 210, the laser controller 132, the mirror controller 136, and the illumination controller 173 are connected to each other.
Furthermore, the input unit 210, the CPU 222, the memory 242, the I/O controller 250, and the wireless communication unit 252 illustrated in FIG. 3 are built into the pedestal 110. However, the input unit 210 may be disposed outside the pedestal 110. In addition, the image capture area data generation unit 230 and the data storage unit 240 illustrated in FIG. 2 may be disposed in a device external to the image capture device 10. In this case, the image capture area data generation unit 230 and the data storage unit 240 may be connected to the image capture device 10 through a communication line, for example, a data communication network such as the Internet, and may be disposed to be attachable and detachable with respect to the image capture device 10, for example, by using a cable.
FIG. 4 is a cross-sectional view illustrating a structure of the organic EL element provided in the organic EL panel as the illumination panel 171. The organic EL element has a laminated structure in which an anode 420, a hole injection layer 422, a hole transport layer 424, a light emitting layer 426, an electron transport layer 428, an electron injection layer 430, and a cathode 432 are laminated on a substrate 410 in this order. The substrate 410, for example, is quartz, glass, metal, or a resin such as plastic.
As a phosphorescent organic compound used for the light emitting layer 426, Bis(3,5-difluoro-2-(2-pyridyl) phenyl-(2-carboxypyridyl)) iridium(III), Tris(2-phenylpyridine) iridium(III), and Bis(2-phenylbenzothiazolato) (acetylacetonate) iridium(III), which are iridium complexes, Osmium(II) bis(3-trifluoromethyl-5-(2-pyridyl)-pyrazolate) dimethylphenylphosphine, which is an osmium complex, Tris(dibenzoylmethane) phenanthroline europium(III), which is a rare earth compound, 2,3,7,8,12,13,17,18-Octaethyl-21H,23H-porphine platinum(II), which is a platinum complex, and the like are able to be exemplified.
In addition, as an organic compound having electron transport properties which is a main component of the light emitting layer 426, the electron transport layer 428, and the electron injection layer 430, a polycyclic compound such as p-terphenyl or quaterphenyl, and derivatives thereof, a condensed polycyclic hydrocarbon compound such as naphthalene, tetracene, pyrene, coronene, chrysene, anthracene, diphenylanthracene, naphthacene, and phenanthrene, and derivatives thereof, a condensed heterocyclic compound such as phenanthroline, bathophenanthroline, phenanthridine, acridine, quinoline, quinoxaline, and phenazine, and derivatives thereof, fluorescein, perylene, phthaloperylene, naphthaloperylene, perynone, phthaloperynone, naphthaloperynone, diphenyl butadiene, tetraphenyl butadiene, oxadiazole, aldazine, bisbenzoxazoline, bisstyryl, pyrazine, cyclopentadiene, oxine, aminoquinoline, imine, diphenyl ethylene, vinyl anthracene, diaminocarbazole, pyran, thiopyran, polymethine, merocyanine, quinacridone, rubrene, and the like, and derivatives thereof, and the like are able to be exemplified.
Further, as the organic compound having the electron transport properties, in a metal chelate complex compound, in particular, in a metal chelated oxanoid compound, a metal complex including at least one of 8-quinolinolato such as tris(8-quinolinolato) aluminum, bis(8-quinolinolato) magnesium, bis[benzo (f)-8-quinolinolato] zinc, bis(2-methyl-8-quinolinolato) (4-phenyl-phenolato) aluminum, tris(8-quinolinolato) indium, tris(5-methyl-8-quinolinolato) aluminum, 8-quinolinolato lithium, tris(5-chloro-8-quinolinolato) gallium, and bis(5-chloro-8-quinolinolato) calcium, and a derivative thereof as a ligand is also able to be exemplified.
In addition, as the organic compound having the electron transport properties, oxadiazoles, triazines, stilbene derivatives, distyrylarylene derivatives, styryl derivatives, and diolefin derivatives are also able to be preferably used.
Further, as an organic compound which is able to be used as the organic compound having the electron transport properties, benzoxazoles such as 2,5-bis(5,7-di-t-pentyl-2-benzoxazolyl)-1,3,4-thiazole, 4,4′-bis(5,7-t-pentyl-2-benzoxazolyl) stilbene, 4,4′-bis[5,7-di-(2-methyl-2-butyl)-2-benzoxazolyl] stilbene, 2,5-bis(5,7-di-t-pentyl-2-benzoxazolyl) thiophene, 2,5-bis[5-(α,α-dimethylbenzyl)-2-benzoxazolyl] thiophene, 2,5-bis[5,7-di-(2-methyl-2-butyl)-2-benzoxazolyl]-3,4-diphenyl thiophene, 2,5-bis(5-methyl-2-benzoxazolyl) thiophene, 4,4′-bis(2-benzoxazolyl) biphenyl, 5-methyl-2-{2-[4-(5-methyl-2-benzoxazolyl) phenyl] vinyl} benzoxazole, and 2-[2-(4-chlorophenyl) vinyl] naphtho(1,2-d) oxazole, benzothiazoles such as 2,2′-(p-phenylene divinylene)-bisbenzothiazole, 2-{2-[4-(2-benzimidazolyl) phenyl] vinyl} benzimidazole, 2-[2-(4-carboxyphenyl) vinyl] benzimidazole, and the like are included.
Further, as the organic compound having the electron transport properties, 1,4-bis(2-methylstyryl) benzene, 1,4-bis(3-methylstyryl) benzene, 1,4-bis(4-methylstyryl) benzene, distyrylbenzene, 1,4-bis(2-ethylstyryl) benzene, 1,4-bis(3-ethylstyryl) benzene, 1,4-bis(2-methylstyryl)-2-methyl benzene, 1,4-bis(2-methylstyryl)-2-ethylbenzene, and the like are also included.
In addition, as the organic compound having the electron transport properties, 2,5-bis(4-methylstyryl) pyrazine, 2,5-bis(4-ethylstyryl) pyrazine, 2,5-bis[2-(1-naphthyl) vinyl] pyrazine, 2,5-bis(4-methoxystyryl) pyrazine, 2,5-bis[2-(4-biphenyl) vinyl] pyrazine, 2,5-bis[2-(1-pyrenyl) vinyl] pyrazine, and the like are included.
In addition, as the organic compound having the electron transport properties, a known material used for manufacturing an organic EL element of the related art such as 1,4-phenylene dimethylidyne, 4,4′-phenylene dimethylidyne, 2,5-xylylene dimethylidyne, 2,6-naphthylene dimethylidyne, 1,4-biphenylene dimethylidyne, 1,4-p-terephenylene dimethylidyne, 9,10-anthracenediyl dimethylidyne, 4,4′-(2,2-di-t-butylphenylvinyl) biphenyl, and 4,4′-(2,2 diphenylvinyl) biphenyl is able to be suitably used.
On the other hand, as an organic compound having hole transport properties which is used for the hole transport layer 424 or a light emitting layer having hole transport properties, N,N,N′,N′-tetraphenyl-4,4′-diaminophenyl, N,N′-diphenyl-N,N′-di(3-methylphenyl)-4,4′-diaminobiphenyl, 2,2-bis(4-di-p-tolylaminophenyl) propane, N,N,N′,N′-tetra-p-tolyl-4,4′-diaminobiphenyl, bis(4-di-p-tolylaminophenyl) phenyl methane, N,N′-diphenyl-N,N′-di(4-methoxyphenyl)-4,4′-diaminobiphenyl, N,N,N′,N′-tetraphenyl-4,4′-diaminodiphenyl ether, 4,4′-bis(diphenylamino) quadriphenyl, 4-N,N-diphenylamino-(2-diphenylvinyl) benzene, 3-methoxy-4′-N,N-diphenylaminostilbenzene, N-phenyl carbazole, 1,1-bis(4-di-p-triaminophenyl)-cyclohexane, 1,1-bis(4-di-p-triaminophenyl)-4-phenyl cyclohexane, bis(4-dimethylamino-2-methylphenyl)-phenyl methane, N,N,N-tri(p-tolyl) amine, 4-(di-p-tolylamino)-4′-[4(di-p-tolylamino) styryl] stilbene, N,N,N′,N′-tetra-p-tolyl-4,4′-diamino-biphenyl, N,N,N′,N′-tetraphenyl-4,4′-diamino-biphenyl-N-phenylcarbazole, 4,4′-bis[N-(1-naphthyl)-N-phenyl-amino] biphenyl, 4,4″-bis[N-(1-naphthyl)-N-phenyl-amino]p-terphenyl, 4,4′-bis[N-(2-naphthyl)-N-phenyl-amino] biphenyl, 4,4′-bis[N-(3-acenaphthenyl)-N-phenyl-amino] biphenyl, 1,5-bis[N-(1-naphthyl)-N-phenyl-amino] naphthalene, 4,4′-bis[N-(9-anthryl)-N-phenyl-amino] biphenyl, 4,4″-bis[N-(1-anthryl)-N-phenyl-amino] p-terphenyl, 4,4′-bis[N-(2-phenanthryl)-N-phenyl-amino] biphenyl, 4,4′-bis[N-(8-fluoranthenyl)-N-phenyl-amino] biphenyl, 4,4′-bis[N-(2-pyrenyl)-N-phenyl-amino] biphenyl, 4,4′-bis[N-(2-perylenyl)-N-phenyl-amino] biphenyl, 4,4′-bis[N-(1-coronenyl)-N-phenyl-amino] biphenyl, 2,6-bis(di-p-tolylamino) naphthalene, 2,6-bis[di-(1-naphthyl) amino] naphthalene, 2,6-bis[N-(1-naphthyl)-N-(2-naphthyl) amino] naphthalene, 4,4″-bis[N,N-di(2-naphthyl) amino] terphenyl, 4,4′-bis{N-phenyl-N-[4-(1-naphthyl) phenyl] amino} biphenyl, 4,4′-bis[N-phenyl-N-(2-pyrenyl)-amino] biphenyl, 2,6-bis[N,N-di(2-naphthyl) amino] fluorene, 4,4″-bis(N,N-di-p-tolylamino) terphenyl, bis(N-1-naphthyl) (N-2-naphthyl) amine, and the like are able to be exemplified.
Further, as the organic compound having the hole transport properties, a material in which the organic compound described above is dispersed in a polymer, or a material in which the organic compound described above is polymerized is able to be used. A so-called π-conjugated polymer such as polyparaphenylene vinylene or derivatives thereof, a non-conjugated polymer having hole transport properties which is represented by poly(N-vinylcarbazole), and a sigma-conjugated polymer such as polysilanes are also able to be used.
A material of the hole injection layer 422 is not particularly limited, and as the material of the hole injection layer 422, metal phthalocyanines such as copper phthalocyanine (CuPc), metal-free phthalocyanines, a carbon film, and a conductive polymer such as polyaniline are able to be preferably used.
Then, by adjusting a thickness or a material of each layer laminated on the substrate 410, it is possible to adjust sharpness of a spectrum of light emitted by the illumination panel 171.
FIG. 5 is a diagram illustrating a first example of irradiation of a mark M of the mark irradiation unit 130. In the example illustrated in this drawing, the mark irradiation unit 130 emits laser light to scan an entire region of an image capture area F.
FIG. 6(a) is a diagram illustrating a second example of the irradiation of the mark M of the mark irradiation unit 130. In the example illustrated in this drawing, the mark irradiation unit 130 emits the laser light to draw an edge of the image capture area F.
FIG. 6(b) is a diagram illustrating a third example of the irradiation of the mark M of the mark irradiation unit 130. In the example illustrated in this drawing, the image capture area F is rectangular. Thus, the mark irradiation unit 130 draws marks indicating the four corners of the image capture area F by the laser light.
Furthermore, a shape and a scan method of the mark M of the mark irradiation unit 130 are not limited to the above-described examples.
FIG. 7 is a diagram illustrating an example of a method in which the control unit 220 corrects an irradiation range of the light of the mark irradiation unit 130, and thus the mark is formed in a desired shape. The control unit 220 controls the irradiation angle of the light emitted by the mark irradiation unit 130 by using a direction of the object with the image capture unit 140 as a reference, and a distance from the object to the image capture unit 140. Hereinafter, the details will be described.
As described above, the angle detection unit 162 detects the angle of the holding member 150 with respect to the guide member 120, that is, the angle of the mark irradiation unit 130 with respect to the guide member 120. The control unit 220 calculates an angle θ1 of the mark irradiation unit 130 with respect to a surface of a mounting portion 40 (for example, a desk) on which the object is mounted from the detection result of the angle detection unit 162. Furthermore, the angle detection unit 162 may be configured to directly detect the angle θ1 of the mark irradiation unit 130 with respect to the surface of the mounting portion 40.
In addition, the position angle detection unit 135 detects an angle of a reference position (for example, a center) of the mark irradiation unit 130 with respect to the holding member 150. In addition, the control unit 220 stores data indicating an angle of the light emitted by the mark irradiation unit 130 with respect to a reference axis (for example, an axis vertically passing through the center) of the mark irradiation unit 130 as a part of control parameters of the mark irradiation unit 130. For this reason, the control unit 220 is able to calculate an angle θ2 of the light emitted by the mark irradiation unit 130 with respect to the holding member 150. The angle θ2 corresponds to the direction of the object when the image capture unit 140 is set as a reference.
In addition, a length of the holding member 150 is a fixed value. For this reason, the control unit 220 is able to calculate a distance (a distance in an x direction in the drawings) from the guide member 120 to the mark irradiation unit 130 in a plane which is parallel to the mounting portion 40 by using the length of the holding member 150 and the angle θ1. Further, the control unit 220 is able to calculate a height (a distance in a y direction in the drawings) of the mark irradiation unit 130 when the surface of the mounting portion 40 is set as a reference by using a detection result h of the position detection unit 164, the length of the holding member 150, and the angle θ1. The height corresponds to the distance from the image capture unit 140 to the object. Then, the control unit 220 is able to calculate a distance d1 from the guide member 120 to an irradiation position of the light by using these distances and the angles θ1 and θ2.
Then, the control unit 220 is able to recognize the irradiation position of the light of the mark irradiation unit 130 by using the distance d1 and the angles θ1 and θ2. By recognizing the irradiation position, the control unit 220 is able to control the mark irradiation unit 130 such that the image capture area shown by the mark is in a desired shape.
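The calculation described above can be summarized in a short sketch. The snippet below estimates the horizontal irradiation distance d1 from the angles θ1 and θ2, the detected height h of the attachment portion 160, and the fixed length of the holding member 150; the variable names, the sign convention of the beam angle, and the trigonometric arrangement are assumptions, since the exact relationship depends on the mechanical layout of the device.

```python
import math

def irradiation_distance(theta1_deg, theta2_deg, h, holder_length):
    """Estimate where the mark lands on the mounting surface (a sketch).

    theta1_deg: angle of the holding member with respect to the mounting surface.
    theta2_deg: angle of the emitted light with respect to the holding member.
    h: height of the attachment portion along the guide member.
    holder_length: fixed length of the holding member.
    """
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    # Position of the mark irradiation unit relative to the guide member.
    x_unit = holder_length * math.cos(t1)       # horizontal offset (x direction)
    y_unit = h + holder_length * math.sin(t1)   # height above the surface (y direction)
    # Assumed convention: the beam leaves the unit at (theta1 + theta2) below horizontal.
    beam_angle = t1 + t2
    d1 = x_unit + y_unit / math.tan(beam_angle)
    return d1

# Hypothetical values: holding member at 30 degrees, beam at 45 degrees to it,
# attachment portion 200 mm up the guide member, holding member 300 mm long.
# print(irradiation_distance(30, 45, 200, 300))
```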
FIG. 8 is a diagram for describing another function of the mark irradiation unit 130. The mark irradiation unit 130 also has a function of drawing information such as a character in addition to the function of emitting the mark indicating the image capture area. The information drawn by the mark irradiation unit 130, for example, may be input from the input unit 210 illustrated in FIG. 2, or may be information maintained in the control unit 220. The information drawn by the mark irradiation unit 130, for example, may be the illumination parameters used for controlling the illumination panel 171, or date information from a calendar or the like included in the CPU 222 (the control unit 220). As described above, the illumination parameters are information indicating each intensity of the plurality of colors of the light emitted by the illumination unit 170, but are not limited thereto. Then, the character information is included in the image capture area data, and thus it is possible to integrally store the image capture area data and the character information.
In particular, when the object is a commercial product, and the illumination unit 170 of the image capture device 10 is used as an illumination that illuminates the commercial product, it is possible to integrally store the illumination parameters of the illumination of the commercial product and the image data obtained by capturing an image of the commercial product. This makes it possible to easily reproduce the illumination conditions of the commercial product.
In addition, when the object is a commercial product, and the image capture device 10 is used in a display place of the commercial product, the information drawn by the mark irradiation unit 130 may be a description of the object. In this case, the illumination unit 170 of the image capture device 10 is used as an illumination that illuminates the commercial product.
Furthermore, a laser with which the mark irradiation unit 130 draws information such as a character may be disposed separately from the laser drawing the mark. In this case, when the laser drawing the information such as a character is an infrared laser, the image capture unit 140 may detect both visible light and infrared light. Then, even when the illumination parameters are drawn in the display place of the commercial product, the illumination parameters are not visible to the observer, and thus a design of display of the commercial product is not affected.
FIG. 9 is a flowchart illustrating a first example of an operation of the image capture device 10. First, the user of the image capture device 10 adjusts the height and the angle of the holding member 150, and the angle of the edge portion 155 with respect to the holding member 150, while causing the mark irradiation unit 130 to irradiate the object with the mark. At this time, the user may adjust the irradiation range of the light by performing input with respect to the input unit 210. Accordingly, the user is able to adjust the image capture area shown by the mark to be in a desired range (Step S10).
Subsequently, the image capture unit 140 generates the image data at a timing when an image capture command from the user is input (Step S20).
Subsequently, the image capture area data generation unit 230 recognizes a position of the mark in the image data generated by the image capture unit 140, and cuts out image capture area data from the image data on the basis of the recognized position of the mark (Step S30). Subsequently, the image capture area data generation unit 230 performs the trapezoid correction with respect to the generated image capture area data (Step S40). Then, the image capture area data generation unit 230 stores the image capture area data after being subjected to the trapezoid correction in the data storage unit 240 (Step S50).
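As an illustration of the mark recognition and cut-out in Step S30, the sketch below recognizes the mark in the image data by simple color thresholding and cuts out the bounding region as image capture area data. The assumption that the mark is drawn with a saturated red laser, and the specific threshold values, are illustrative only and are not specified in the embodiment.

```python
import cv2
import numpy as np

def cut_out_capture_area(image_bgr):
    """Recognize the mark by color and cut out the area it delimits (a sketch)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Assumed: the mark is drawn with a bright red laser; keep strongly red, bright pixels.
    lower = np.array([0, 120, 180])
    upper = np.array([10, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    points = cv2.findNonZero(mask)
    if points is None:
        return None  # no mark recognized in the image data
    x, y, w, h = cv2.boundingRect(points)
    # Image capture area data (before trapezoid correction).
    return image_bgr[y:y + h, x:x + w]
```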
FIG. 10 is a flowchart illustrating a second example of the operation of the image capture device 10. First, the user of the image capture device 10 adjusts the illumination parameters of the illumination unit 170, and adjusts a color or intensity of the illumination of the illumination unit 170 (Step S5). Subsequently, the user adjusts the height and the angle of the holding member 150, and the angle of the edge portion 155 with respect to the holding member 150, and thus the image capture area shown by the mark is brought into a desired range (Step S10).
Subsequently, the image capture unit 140 generates the image data according to the input from the user. At this time, the control unit 220 stores the illumination parameters in the data storage unit 240 (Step S22).
Subsequent processings (Step S30 to Step S50) are identical to those of the first example. However, in Step S50, the image capture area data generation unit 230 associates the image capture area data with the illumination parameters stored in Step S22.
FIG. 11 is a flowchart illustrating a third example of the operation of the image capture device 10. Processings illustrated in Step S5 and Step S10 are identical to those of the second example. After Step S10, the control unit 220 acquires character data to be drawn on the object, and draws the acquired character data on the object together with the mark (Step S12). In this state, the image capture unit 140 captures the image of the object, and generates the image capture data (Step S20). Information indicated by the character data is as described with reference to FIG. 8.
Processings illustrated in Step S30 and Step S40 are identical to those of the second example. The image capture area data generation unit 230 performs the trapezoid correction with respect to the image capture area data (Step S40), and then corrects a color of the image capture area data by using the illumination parameters (Step S42). Then, the image capture area data generation unit 230 stores the image capture area data in the data storage unit 240 (Step S50).
As described above, according to this embodiment, the mark irradiation unit 130 irradiates the object with the mark. The image capture unit 140 captures the image of the object, and generates the image data. Then, the image capture area data generation unit 230 recognizes the position of the mark in the image data, and cuts out the image capture area data which is a part of the image data on the basis of the mark. For this reason, since the mark irradiation unit 130 irradiates the object with the mark, even when the positioning symbol is not printed on the object to be stored as the image data, only a necessary portion of the image data is able to be cut out.
In addition, in this embodiment, the image capture area data generation unit 230 recognizes the position of the mark in the image data, and thus cuts out the image capture area data. For this reason, a calculation amount for generating the image capture area data is reduced.
In addition, the control unit 220 controls the angle of the mark which is emitted by the mark irradiation unit 130 by using the direction of the object when the image capture unit 140 is set as a reference, and the distance from the object to the image capture unit 140, so that an image capture area shown by the mark irradiation unit 130 becomes a desired shape (for example, a square or a rectangle). For this reason, a calculation amount of the trapezoid correction of the image capture area data generation unit 230 is reduced.
In addition, the holding member 150 is vertically movable along the guide member 120, and is rotatably attached by using the attachment portion 160 as the center. For this reason, the user of the image capture device 10 is able to easily set a range of the image capture area shown by the mark to be in a desired size by adjusting the height and the angle of the holding member 150.
Second Embodiment
An image capture device 10 according to a second embodiment has the same configuration as that of the image capture device 10 according to the first embodiment except for the following matters.
First, the mark irradiation unit 130 does not perform the irradiation of the mark when the image capture unit 140 generates the image data. For example, when an image capture command with respect to the image capture unit 140 is input from the input unit 210, the control unit 220 ends the irradiation of the mark of the mark irradiation unit 130 at this timing. Then, the control unit 220 and the image capture area data generation unit 230 calculate a position at which the mark is estimated to have existed in the image data on the basis of a field angle of the image capture unit 140, an irradiation direction of the mark of the mark irradiation unit 130, and the distance from the object to the mark irradiation unit 130 and the image capture unit 140. Specifically, the control unit 220 and the image capture area data generation unit 230 calculate the position at which the mark is estimated to have existed in the image data by using the detection results of the angle detection unit 162, the position detection unit 164, and the position angle detection unit 135, the irradiation direction of the mark of the mark irradiation unit 130, and the field angle of the image capture unit 140. Then, the image capture area data generation unit 230 generates the image capture area data by using this estimated position.
FIG. 12 is a diagram for describing a method by which the image capture area data generation unit 230 calculates the estimated position of the mark. The mark irradiation unit 130 performs control such that the center of the image capture area shown by the mark is coincident with the center of the image capture area of the image capture unit 140.
As described in the first embodiment, the control unit 220 calculates the angle θ1 of the mark irradiation unit 130 with respect to the surface of the mounting portion 40 (for example, a desk) on which the object is mounted from the detection result of the angle detection unit 162. In addition, the position angle detection unit 135 detects an angle θ3 of the center of the mark irradiation unit 130 with respect to the holding member 150. In addition, the control unit 220 is able to acquire an angle α of an irradiation range when the mark irradiation unit 130 emits the mark indicating the image capture area. For this reason, the control unit 220 is able to calculate an angle β1 of the image capture unit 140 with respect to an upper end of the image capture area shown by the mark by using the angles α, θ1, and θ3, and is able to calculate an angle β2 of the image capture unit 140 with respect to a lower end of the image capture area shown by the mark.
In addition, as described in the first embodiment, the control unit 220 is able to calculate the distance (the distance in the x direction in the drawings) from the guide member 120 to the mark irradiation unit 130 and the height of the mark irradiation unit 130 (the distance in the y direction in the drawings) when the surface of the mounting portion 40 is set as a reference. The control unit 220 is able to calculate the distance d1 from the guide member 120 to the lower end of the image capture area by using the distance, the height, and the angle β1, and is able to calculate a distance (d1+d2) from the guide member 120 to the upper end of the image capture area by using the distance, the height, and the angle β2. The distances d1 and (d1+d2) indicate the position of the mark indicating the image capture area.
In addition, the control unit 220 stores the field angle of the image capture unit 140. For this reason, the control unit 220 is able to calculate which region of the surface of the mounting portion 40 the image data generated by the image capture unit 140 indicates, by using a method which is identical to the calculation method of the position of the mark. Further, a positional relationship between the angle θ3 of the center of the mark irradiation unit 130 with respect to the holding member 150 and the field angle of the image capture unit 140 is fixed. For this reason, the image capture area data generation unit 230 is able to calculate the position at which the mark is estimated to have existed in the image data by using the data calculated by the control unit 220.
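The final mapping from these calculated distances to a position in the image data can be sketched as follows: given the range of the mounting surface covered by the field angle of the image capture unit 140, a point at distance d from the guide member is converted to a row of the image data by linear interpolation. This assumes a simple pinhole-like projection and is only an approximation of the estimation described above; the variable names are illustrative.

```python
def estimated_mark_row(d_mark, d_view_near, d_view_far, image_height):
    """Estimate the image row where the mark would have appeared (a sketch).

    d_mark: distance from the guide member to the mark on the mounting surface.
    d_view_near, d_view_far: distances bounding the region of the mounting
    surface covered by the field angle of the image capture unit (assumed
    known from the same geometric calculation). image_height: rows in the image data.
    """
    if not d_view_near < d_mark < d_view_far:
        return None  # the mark would fall outside the image data
    fraction = (d_mark - d_view_near) / (d_view_far - d_view_near)
    # Assumed: the far edge of the viewed region maps to the top row (row 0).
    return int(round((1.0 - fraction) * (image_height - 1)))
```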
According to this embodiment, the same effect as that of the first embodiment is able to be obtained. In addition, when the image capture unit 140 generates the image data, the mark irradiation unit 130 does not perform the irradiation of the mark. For this reason, it is possible to prevent the mark of the mark irradiation unit 130 from being inserted into the image area data.
Third Embodiment
FIG. 13 is a diagram illustrating a configuration of the image capture device 10 according to a third embodiment, and corresponds to FIG. 1 in the first embodiment. The image capture device 10 according to this embodiment has the same configuration as that of the image capture device 10 according to the first or the second embodiment except that an image processing device 30 is provided. The image processing device 30 performs at least a part of the image processing performed by the image capture area data generation unit 230 in the first embodiment.
FIG. 14 is a block diagram illustrating a functional configuration of the image processing device 30. The image processing device 30 includes an image acquisition unit 310, an image processing unit 320, a display unit 325, an input unit 330, and a data storage unit 340. The image acquisition unit 310 receives the image area data generated by the image capture area data generation unit 230 from the image capture device 10. The image acquisition unit 310 receives the image area data through the I/O controller 250 or the wireless communication unit 252 of the image capture device 10. The image processing unit 320 processes the image data received by the image acquisition unit 310. The display unit 325 displays the image data after being processed by the image processing unit 320. The input unit 330 receives information input from the user of the image capture device 10. The input information indicates parameters of the image processing of the image processing unit 320. The data storage unit 340 stores the image data after being processed by the image processing unit 320. The data storage unit 340 may be a nonvolatile memory, or may be a hard disk.
FIG. 15 is a flowchart for describing a first example of processing performed by the image processing device 30. Processings from Step S5 to Step S40 are identical to those of the processing described with reference to FIG. 11 in the first embodiment. When the image capture area data generation unit 230 performs the trapezoid correction with respect to the image area data (Step S40), the image capture area data generation unit 230 transmits the image capture area data after being corrected to the image processing device 30. At this time, the image capture area data generation unit 230 also transmits the illumination parameters used in the illumination unit 170 to the image processing device 30 (Step S41).
The image acquisition unit 310 of the image processing device 30 receives the image capture area data and the illumination parameters which have been transmitted from the image capture device 10. The image processing unit 320 corrects the color of the image capture area data by using the illumination parameters received by the image acquisition unit 310. The image processing unit 320 displays the image capture area data after being corrected on the display unit 325. Then, when the user of the image capture device 10 views the image displayed on the display unit 325, the user inputs an amendment command for the correction to the input unit 330 as necessary. The image processing unit 320 amends the correction of the color of the image capture area data according to the input amendment command (Step S45).
Then, the image processing unit 320 stores the image area data after being corrected in the data storage unit 340 (Step S52).
FIG. 16 is a flowchart for describing a second example of the processing performed by the image processing device 30. The example illustrated in this drawing is identical to the first example illustrated in FIG. 15 except that the trapezoid correction is also performed by the image processing device 30 (Step S43).
Processings from Step S5 to Step S30 are identical to those of the processing described with reference to FIG. 15. The image capture area data generation unit 230 transmits the image area data before the trapezoid correction to the image processing device 30 in association with the detection results of the angle detection unit 162, the position detection unit 164, and the position angle detection unit 135, and the illumination parameters (Step S31).
The image acquisition unit 310 of the image processing device 30 receives the data which has been transmitted from the image capture device 10. Then, the image processing unit 320 performs the trapezoid correction (Step S43). Processings performed herein are identical to the processings performed in Step S40 of FIG. 15.
Subsequent processings (Steps S45 and S52) are identical to those of the first example illustrated in FIG. 15.
According to this embodiment, the same effect as that of the first or the second embodiment is able to be obtained. In addition, the image processing is performed by using the image processing device 30, and thus it is possible to perform image processing having a high calculation amount with respect to the image area data. In addition, the image processing unit 320 amends the image processing according to the input of the user. For this reason, it is possible to correct the image area data according to preference of the user.
As described above, the embodiments of the present invention are described with reference to the drawings, but the embodiments are examples of the present invention, and various configurations other than the configurations described above are able to be adopted.