US8581998B2 - Image sensing apparatus and method of controlling the image sensing apparatus - Google Patents

Image sensing apparatus and method of controlling the image sensing apparatus

Info

Publication number
US8581998B2
US8581998B2 (Application US13/308,907)
Authority
US
United States
Prior art keywords
image
parallax
coordinates
unit
imaging lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US13/308,907
Other versions
US20120154651A1 (en)
Inventor
Takenori Ohno
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc
Assigned to CANON KABUSHIKI KAISHA. Assignment of assignors interest (see document for details). Assignors: OHNO, TAKENORI
Publication of US20120154651A1
Priority to US14/048,405 (US8711269B2)
Application granted
Publication of US8581998B2
Legal status: Expired - Fee Related
Adjusted expiration

Abstract

A sensed image generated from an image signal output from an image sensor for receiving light that becomes incident sequentially through an imaging lens and a microlens array that is a two-dimensional array including a plurality of microlenses is acquired. A list is created in which, for each pixel position on the image sensor, the correspondence between the coordinates of the light incident at the pixel position on the imaging lens and the coordinates of the pixel position is registered. Images obtained by rearranging pixels at the coordinate positions on the image sensor corresponding to the coordinates in accordance with the arrangement order of the coordinates on the imaging lens are generated as a parallax image group.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an image sensing technique.
2. Description of the Related Art
In a digital camera, conventionally, an object image is formed on an image sensing element through an imaging lens, and image information obtained by the image sensing element is displayed on an LCD (Liquid Crystal Display). The digital camera includes an electronic viewfinder (EVF) to observe the image information formed on the LCD (Japanese Patent Laid-Open No. 5-134294).
Also known is a digital camera using live view display in which a shutter arranged on the whole surface of an image sensing element is opened to project an object image on the image sensing element, thereby displaying the video in an image display LCD (Japanese Patent Laid-Open No. 2001-186401).
International Patent Publication No. 06/039486 and Ren Ng, et al., "Light Field Photography with a Hand-Held Plenoptic Camera", Stanford Tech Report CTSR 2005-02, propose an image sensing apparatus using a method called "Light Field Photography". This image sensing apparatus includes an imaging lens, a microlens array, an image sensing element, and an image processing unit. Sensed image data obtained from the image sensing element includes not only the light intensity distribution on the light-receiving surface but also information on the light traveling direction. The image processing unit can therefore reconstruct an image observed from a plurality of viewpoints or directions.
However, when the image sensing apparatus including the microlens array arranged in front of the image sensing element senses an object as shown in FIG. 20A using an aperture stop having a circular opening portion, the live view display image includes a non-light-receiving area, as shown in FIG. 20B. For this reason, it is difficult for the user to determine the focus position.
SUMMARY OF THE INVENTION
The present invention has been made in consideration of the above-described problem, and provides a technique for converting a live view display image into an image for which the user can easily determine the focus position.
According to the first aspect of the present invention, an image sensing apparatus comprises: a unit that acquires a sensed image generated from an image signal output from an image sensor for receiving light that becomes incident sequentially through an imaging lens and a microlens array that is a two-dimensional array including a plurality of microlenses; a creation unit that creates a list in which, for each pixel position on the image sensor, a correspondence between coordinates of the light incident at the pixel position on the imaging lens and coordinates of the pixel position is registered; a generation unit that generates, as a parallax image group, images obtained by rearranging pixels at pixel positions on the image sensor corresponding to the coordinates in accordance with an arrangement order of the coordinates on the imaging lens registered in the list; and an output unit that outputs the images generated by the generation unit.
According to the second aspect of the present invention, a method of controlling an image sensing apparatus including an image sensor for receiving light that becomes incident sequentially through an imaging lens and a microlens array that is a two-dimensional array including a plurality of microlenses, comprises: a step of acquiring a sensed image generated from an image signal output from the image sensor; a creation step of creating a list in which, for each pixel position on the image sensor, a correspondence between coordinates of the light incident at the pixel position on the imaging lens and coordinates of the pixel position is registered; a generation step of generating, as a parallax image group, images obtained by rearranging pixels at pixel positions on the image sensor corresponding to the coordinates in accordance with an arrangement order of the coordinates on the imaging lens registered in the list; and an output step of outputting the images generated in the generation step.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing an example of the arrangement of an image sensing apparatus;
FIG. 2 is a block diagram showing an example of the functional arrangement of a signal processing unit 106;
FIG. 3 is a view showing an aperture stop 102;
FIG. 4 is a view showing the positional relationship between an imaging lens 101, the aperture stop 102, a microlens array 103, and an image sensor 104;
FIG. 5 is a view showing an example of light-receiving areas for the respective microlenses on the light-receiving surface of the image sensor 104;
FIG. 6 is a flowchart of processing to be performed by the signal processing unit 106;
FIG. 7 is a view showing examples of area division and the indices of areas of the imaging lens 101 when m=3 and n=3;
FIG. 8 is a view for explaining coordinates (u,v) on the imaging lens;
FIG. 9 is a table showing an example of the arrangement of a light field;
FIG. 10 is a view for explaining a light field creation method;
FIG. 11 is a view showing an example of the arrangement of parallax images;
FIGS. 12A and 12B are views showing an example of parallax image creation;
FIG. 13 is a view showing an example of the light-receiving areas of microlenses on the light-receiving surface of the image sensor 104;
FIG. 14 is a view showing a display example on the display screen of a display unit 116;
FIG. 15 is a view showing an example in which all parallax images are arranged in accordance with extraction start coordinates when m=5 and n=5;
FIG. 16 is a view showing an example in which only parallax images each having an average pixel value equal to or larger than a threshold are arranged and displayed on the display screen of the display unit 116;
FIG. 17 is a view showing an example of the arrangement of parallax images;
FIG. 18 is a view showing a display example of a GUI;
FIG. 19 is a view showing an example of the arrangement of parallax images;
FIGS. 20A to 20C are views for explaining the effect of the first embodiment;
FIG. 21 is a schematic view of light-receiving areas on the image sensor 104 when the aperture stop 102 is stopped down;
FIG. 22 is a view showing parallax images when the parallax number is 5×5;
FIG. 23 is a view showing parallax images when the parallax number is 3×3;
FIG. 24 is a view showing parallax images when the parallax number is 7×7;
FIG. 25 is a block diagram showing an example of the functional arrangement of a signal processing unit 106;
FIG. 26 is a flowchart of refocus evaluation information generation processing;
FIG. 27 is a view for explaining processing of acquiring main object selection information;
FIG. 28 is a view for explaining central coordinates; and
FIG. 29 is a view showing a display example on a display unit 116.
DESCRIPTION OF THE EMBODIMENTS
The embodiments of the present invention will now be described with reference to the accompanying drawings. Note that the embodiments described below are merely detailed examples of practicing the present invention and are specific examples of the arrangements recited in the appended claims.
[First Embodiment]
In this embodiment, an example will be described in which an image sensing apparatus holding a microlens array converts a live view display image into an image that facilitates focusing on the display unit that displays the live view display image.
<Example of Arrangement of Image Sensing Apparatus>
An example of the arrangement of an image sensing apparatus according to this embodiment will be described with reference to the block diagram of FIG. 1. Note that FIG. 1 shows an example of major components for each processing to be described below, and the arrangement of the image sensing apparatus applicable to this embodiment is not limited to that shown in FIG. 1. That is, a component may be added to the arrangement shown in FIG. 1. Some of the units shown in FIG. 1 may be integrated. One constituent element may be decomposed into two or more constituent elements. Processing to be described later as processing to be executed by a given constituent element may be allotted to another constituent element.
An imaging lens 101 is the main lens configured to sense an object and includes, for example, a general zoom lens, focus lens, and blur correction lens used in a video camera, a still camera, or the like.
An aperture stop 102 is an optical aperture stop for the imaging lens 101. The aperture stop 102 has, for example, one circular opening portion at the center, as shown in FIG. 3. The black frame portion indicates a non-opening portion formed by stopping down the aperture stop to some degree. The diameter of the opening portion (the effective diameter of the imaging lens 101) will be referred to as D. The aperture stop 102 is arranged apart from a microlens array 103 by a distance L.
The microlens array 103 is, for example, a two-dimensional array formed by two-dimensionally arraying a plurality of microlenses in a matrix, and is arranged on the imaging plane of the imaging lens 101. Each microlens has, for example, a circular or hexagonal planar shape and is formed from a solid lens, a liquid crystal lens, a liquid lens, a diffraction lens, or the like.
The imaging lens 101, the aperture stop 102, the microlens array 103, and an image sensor (image sensing element) 104 will collectively be referred to as an image sensing unit 100.
FIG. 4 shows the positional relationship between the imaging lens 101, the aperture stop 102, the microlens array 103, and the image sensor 104. In this case, m is the number of pixels assigned to one direction (vertical direction in FIG. 4) of each microlens of the microlens array 103, s is the size (pixel size) in one side direction (vertical direction in FIG. 4) of an image sensing element (pixel) included in the image sensor 104, and L is the distance between the aperture stop 102 and the microlens array 103, as described above. Note that the distance between the imaging lens 101 and the aperture stop 102 is so short as to be negligible. For this reason, L represents the distance between the imaging lens 101 and the microlens array 103 in FIG. 4. D represents the effective diameter of the imaging lens 101, as described above, and f represents the distance between the microlens array 103 and the image sensor 104.
In other words, the number m of pixels is the number of pixels in one direction (vertical direction in FIG. 4) of an area (light-receiving area) where the image sensor 104 receives light that has become incident through one microlens. The assigned number m of pixels can be calculated by
(m×s):f=D:L
m=D×f/(L×s)  (1)
However, only a positive integer can be adopted as m, as a matter of course. Hence, if calculating equations (1) yields a non-integer value of m, it is rounded up to the next integer.
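As a rough illustration of equations (1), the following minimal Python sketch computes the assigned pixel number from the image sensing unit parameters; the function name, parameter names, and sample values are illustrative assumptions, not values taken from this embodiment.

    import math

    def assigned_pixel_count(D, f, L, s):
        # Equation (1): m = D*f/(L*s), rounded up to a positive integer.
        return max(1, math.ceil(D * f / (L * s)))

    # Hypothetical parameters (millimeters), not taken from the embodiment.
    D0, Fnum = 25.0, 2.8                  # full-aperture diameter and F-number
    D = D0 / Fnum                         # effective diameter from the F-number
    L, f, s = 50.0, 0.5, 0.005            # stop-to-array, array-to-sensor, pixel size
    m = assigned_pixel_count(D, f, L, s)  # assigned pixel number, horizontal direction
    n = assigned_pixel_count(D, f, L, s)  # assigned pixel number, vertical direction
    print(m, n)                           # the parallax number is m x n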
Referring back to FIG. 1, the image sensor 104 receives light that has become incident through the microlens array 103 (each microlens) and acquires the light amount of the object. The image sensor 104 is arranged on the focal plane of the microlens array 103 (each microlens). The image sensor 104 includes a plurality of image sensing elements two-dimensionally arrayed in a matrix. A CCD (Charge Coupled Device) or a CMOS (Complementary Metal-Oxide Semiconductor) can be employed as the image sensing element. As a matter of course, if the correspondence between a position on the light-receiving surface of the image sensor 104 and a position on an image output from the image sensor 104 can be ensured, the arrangement of the image sensor 104 is not particularly limited.
In this embodiment, M×N pixels (image sensing elements) are two-dimensionally arranged in a matrix on the light-receiving surface of the image sensor 104. Light that has passed through one microlens is received in an area (light-receiving area) formed from a plurality of pixels. The number of pixels on the light-receiving surface is, for example, M×N=5180×3450=17871000.
Let m be the number of pixels (the assigned number of pixels for the horizontal direction) calculated using equations (1) for the horizontal direction of the light-receiving area, and n be the number of pixels (the assigned number of pixels for the vertical direction) calculated using equations (1) for the vertical direction. In this case, the numbers m and n of pixels are related to the resolving power at an arbitrary viewpoint of a parallax image to be finally generated. For this reason, the resolving power at an arbitrary viewpoint of a parallax image rises as the values m and n increase. On the other hand, (M/m) and (N/n) are related to the number of pixels (resolution) of a parallax image. For this reason, the number of pixels of a parallax image increases as the values (M/m) and (N/n) increase. Hence, the number of pixels and the resolving power at an arbitrary viewpoint of a parallax image have a tradeoff relationship.
In addition, for example, color filters (not shown in FIG. 1) are two-dimensionally arranged on the pixel basis on the light-receiving surface of the image sensor 104. The color filters have a Bayer arrangement in which filters of three primary colors, that is, red (R), green (G), and blue (B), are arranged in a checkered pattern at a ratio of R:G:B=1:2:1. Since such color filters are provided on the light-receiving surface of the image sensor 104, pixels of a plurality of colors corresponding to the colors of the color filters can be obtained.
An A/D conversion unit 105 converts the analog signal output from the image sensor 104, that is, the analog signal (image signal) representing the pixel value output from each image sensing element and hence the light amount of the object, into a digital signal.
A signal processing unit 106 performs demosaicing processing, white balance processing, gamma processing, and the like for the digital signal output from the A/D conversion unit 105 to generate the data of the sensed image (sensed image data). The signal processing unit 106 then generates the data of parallax images (parallax image data) based on the pixel size s in one side direction of a pixel on the image sensor 104, the distance L between the aperture stop 102 and the microlens array 103, the effective diameter D of the imaging lens 101, and the distance f between the microlens array 103 and the image sensor 104. Next, the signal processing unit 106 generates image data (display image data) to be displayed on a display unit 116 in accordance with an instruction from an operation unit 113. The parallax image data generation processing and the display image data generation processing will be described later.
An encoder unit 107 performs processing of converting the parallax image data generated by the signal processing unit 106 into a file format such as JPEG or MPEG.
A media interface unit 108 serves as an interface to connect a PC or another medium (for example, a hard disk, a memory card, a CF card, an SSD card, or a USB memory).
A D/A conversion unit 109 converts the parallax image data generated by the signal processing unit 106 into analog data.
A CPU 110 executes processing using computer programs and data stored in a ROM 111 or a RAM 112 to control the operations of the units included in the image sensing apparatus.
The ROM 111 stores the activation program and initial setup data of the image sensing apparatus, computer programs and data used by the CPU 110 to control the operations of the units of the image sensing apparatus, and various kinds of information described in this embodiment as known information.
The RAM 112 has an area to temporarily store the computer programs and data loaded from the ROM 111 and a work area to be used by the CPU 110 and other units to execute processing. That is, the RAM 112 can provide various areas as needed.
The operation unit 113 includes buttons and a mode dial. An operation instruction input by a user operation on the operation unit 113 is sent to the CPU 110. As a matter of course, the CPU 110 may implement some of the functions of the operation unit 113, for example, the function of the mode selection button, by displaying a button image on the display screen of the display unit 116 to be described later and causing the user to point out the button image with a finger or the like.
An image sensing system control unit 114 controls the image sensing unit 100 to perform focusing, shutter opening, stop adjustment, and the like.
A character generation unit 115 generates characters, graphics, and the like, and can generate, for example, a GUI (Graphical User Interface). The generated characters, graphics, and GUI are displayed on the display screen of the display unit 116 to be described later.
In general, an LCD is widely used as the display unit 116 to display the characters and graphics generated by the character generation unit 115, the display image generated by the signal processing unit 106, or the like. The display unit 116 may have a touch screen function. In that case, some of the functions of the operation unit 113 can be implemented, as described above. Functions other than those of the operation unit 113 can also be implemented, as a matter of course.
An example of the arrangement of the signal processing unit 106 will be described next with reference to the block diagram of FIG. 2.
A parallax number calculation unit 201 calculates the number of pixels capable of receiving, on the image sensor, light that has passed through each microlens. This number of pixels will be referred to as a parallax number hereinafter.
An LF creation unit 202 creates the light field of the image sensing apparatus based on the parallax number. The light field represents the correspondence between the passage position of a light beam through the imaging lens and the receiving position of the light on an image sensing element.
An image reconstruction unit 203 rearranges the pixels of the sensed image data held in a buffer 204 based on the light field, thereby generating parallax image data observed from different viewpoints.
A parallax image extraction unit 205 selects and extracts parallax image data based on the display mode information from the operation unit 113 and transmits the parallax image data to the encoder unit 107 and the D/A conversion unit 109.
<Operation of Image Sensing Apparatus>
The operation of the image sensing apparatus according to this embodiment will be described next. Light (object image) that has become incident on the microlens array 103 through the imaging lens 101 and the aperture stop 102 forms an image on the light-receiving surface of the image sensor 104 in accordance with the shape of each microlens. That is, an area (light-receiving area) where the light that has passed through a microlens is received is formed for each microlens on the light-receiving surface of the image sensor 104. FIG. 5 shows an example of light-receiving areas for the respective microlenses on the light-receiving surface of the image sensor 104.
Since the opening portion of the aperture stop 102 is circular, as shown in FIG. 3, a circular light-receiving area is formed on the light-receiving surface of the image sensor 104 in correspondence with each microlens. At this time, light beams incident on the microlens array 103 are received at different positions on the light-receiving surface of the image sensor 104 in accordance with their incidence positions on the imaging lens 101.
Each image sensing element of the image sensor 104 outputs an analog signal corresponding to the received light amount. As a result, the image sensor 104 outputs an analog signal for each pixel. The A/D conversion unit 105 converts (A/D-converts) the analog signal of each pixel into a digital signal, thereby generating a digital signal for each pixel. This A/D conversion can be general processing. For example, the light amount of the object is photoelectrically converted into a signal and then converted into a digital signal representing a 14-bit digital value.
Next, the signal processing unit 106 performs demosaicing processing for a general Bayer arrangement, white balance processing, and gamma processing for the digital signal of each pixel converted by the A/D conversion unit 105 to generate sensed image data. The signal processing unit 106 reconstructs the sensed image data into parallax image data based on the light field, and generates display image data to be displayed on the display unit 116. The parallax image data generation processing and the display image data generation processing will be described with reference to the flowchart of FIG. 6.
In step S601, the parallax number calculation unit 201 acquires image sensing unit parameters from the image sensing unit 100. The image sensing unit parameters are the pixel size s in one side direction of a pixel on an image sensing element, the distance L between the aperture stop 102 and the microlens array 103, the effective diameter D of the imaging lens 101, and the distance f between the microlens array 103 and the image sensor 104. The case where the image sensing unit parameters are acquired from the image sensing unit 100 has been described here. However, for example, the image sensing unit parameters may be stored in the ROM 111 in advance, and the parallax number calculation unit 201 may acquire them from the ROM 111. Alternatively, a parameter input field may be displayed on the display unit 116, and the values the user inputs to the parameter input field by operating the operation unit 113 may be used as the image sensing unit parameters. Otherwise, for example, an effective diameter D0 of the aperture stop 102 in the full-aperture state may be stored in the ROM 111. When an F-number Fnum of the stop is obtained, the effective diameter D may be obtained by calculating
D=D0/Fnum  (2)
The distance L between the aperture stop 102 and the microlens array 103 or the distance f between the microlens array 103 and the image sensor 104 may be detected using a radar device provided for distance measurement. Note that the image sensing unit parameters may be reacquired when, for example, the stop or the distance between the imaging lens 101 and the microlens array 103 or the image sensor 104 has changed.
In step S602, the parallax number calculation unit 201 calculates the parallax number based on the image sensing unit parameters acquired in step S601. FIG. 4 shows the image sensing unit parameters and the positional relationship between the imaging lens 101, the aperture stop 102, the microlens array 103, and the image sensor 104. At this time, the number m of elements for receiving light from each microlens is given by
(m×s):f=D:L, m=D×f/(L×s)  (3)
where m is a positive integer.
The number m×n of elements that receive light under each microlens is calculated by applying equations (3) in the horizontal and vertical directions. The parallax number calculation unit 201 sends the number m×n of elements to the LF creation unit 202 as the parallax number of the image sensing apparatus according to this embodiment. Note that this parallax number calculation is merely an example for one form of the image sensing apparatus holding a lens array, and the present invention is not limited to this. For example, the parallax number may be acquired from the ROM 111. When the image sensing unit parameters are reacquired, the parallax number is also recalculated.
In step S603, the LF creation unit 202 divides the imaging lens 101 into areas based on the parallax number and assigns an index to each area. The number of elements for receiving light for each microlens, which is calculated by the parallax number calculation unit 201, is the parallax number. Hence, when the parallax number is m×n, the imaging lens is divided into m×n areas, and indices ML(1,1), ML(2,1), . . . , ML(m,n) are assigned to the areas, respectively. For example, FIG. 7 illustrates an example of area division and the indices of areas of the imaging lens 101 when m=3 and n=3.
In step S604, based on the image sensing unit parameters and the area division of the imaging lens, the LF creation unit 202 creates a light field, that is, a list in which the correspondence between coordinates (x,y) of each image sensing element and coordinates (u,v), on the imaging lens 101, of the light beam that becomes incident on that image sensing element is registered. As shown in FIG. 8, the coordinates (u,v) on the imaging lens are defined on a coordinate system including the imaging lens while placing the origin at the center of the imaging lens. For example, assume that the u-v coordinate system ranges from −1 to 1, as shown in FIG. 8. The created light field is a table of the coordinates (u,v) corresponding to the coordinates (x,y) of each pixel and the index of the divided area including the coordinates (u,v), as shown in FIG. 9. To create the light field of the image sensing apparatus, a line is drawn from a pixel through the center of its microlens, as shown in FIG. 10, and the coordinates (u,v) of the point where this line passes through the imaging lens are acquired as the coordinates corresponding to the pixel. This operation is performed for all pixels to create the light field. Note that the light field shown in FIG. 10 is merely an example for the positional relationship of the components in the image sensing unit 100. The present invention is not limited to this as long as the light field represents the correspondence between the coordinates (u,v) on the imaging lens 101 and the coordinates (x,y) of the pixel on the image sensor 104. The light field may be recreated when the distance L between the aperture stop 102 and the microlens array 103 or the distance f between the microlens array 103 and the image sensor 104 in the image sensing unit 100 has changed.
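The following minimal Python sketch shows one way such a list could be built, assuming that the center of the microlens covering each pixel is known, that (u,v) is found by extending the pixel-to-microlens-center line over the distance L and normalizing by the lens radius, and that the divided areas form a regular m×n grid; the helper lens_center_of and all names are illustrative assumptions rather than the embodiment's implementation.

    def create_light_field(pixel_coords, lens_center_of, L, f, lens_radius, m, n):
        # For each pixel (x, y), register the corresponding coordinates (u, v) on the
        # imaging lens and the index ML(i, j) of the divided area containing (u, v).
        light_field = []
        for (x, y) in pixel_coords:
            cx, cy = lens_center_of(x, y)                 # center of the covering microlens
            # Extend the pixel -> microlens-center line by L/f up to the lens plane,
            # then normalize so that u and v lie in the range -1 to 1.
            u = (cx + (cx - x) * L / f) / lens_radius
            v = (cy + (cy - y) * L / f) / lens_radius
            i = min(max(int((u + 1.0) / 2.0 * m) + 1, 1), m)   # column of the divided area
            j = min(max(int((v + 1.0) / 2.0 * n) + 1, 1), n)   # row of the divided area
            light_field.append({'x': x, 'y': y, 'u': u, 'v': v, 'area': (i, j)})
        return light_field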
In step S605, the digital signal input from the A/D conversion unit 105 is stored in the buffer 204 as sensed image data.
In step S606, the image reconstruction unit 203 reconstructs the sensed image data held in the buffer 204 into parallax image data based on the light field obtained from the LF creation unit 202. More specifically, the pixels in the sensed image data are rearranged so that u of the coordinates (u,v) in the light field increases from left to right, and v increases from top to bottom. That is, when the coordinates of a pixel after rearrangement are represented by (x′,y′), rearrangement is done by
(x′,y′)=(u,v)  (4)
This makes it possible to reconstruct parallax image data having as many parallaxes as there are divided areas of the imaging lens. Parallax images are an image group with parallaxes including, when an object as shown in FIG. 12A is sensed, the image of the object sensed from the upper side with respect to the center and the image of the object sensed from the right side, as shown in FIG. 12B.
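A minimal sketch of the rearrangement of equation (4), assuming the simplified layout in which each microlens covers an m×n block of pixels and the microlenses lie on a regular grid: pixels sharing the same divided-area index (i, j) are collected into one parallax (sub-aperture) image, ordered by microlens position. The function and array names are assumptions for illustration.

    import numpy as np

    def reconstruct_parallax_images(sensed, m, n):
        # sensed: 2-D array of shape (Ny * n, Nx * m), one m x n block per microlens.
        # Returns an n x m grid of parallax images, each of shape (Ny, Nx).
        H, W = sensed.shape
        Ny, Nx = H // n, W // m
        parallax = np.empty((n, m, Ny, Nx), dtype=sensed.dtype)
        for j in range(n):              # v increases from top to bottom
            for i in range(m):          # u increases from left to right
                # Take the (i, j)-th pixel under every microlens.
                parallax[j, i] = sensed[j::n, i::m]
        return parallax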
In step S607, the image reconstruction unit 203 stores the reconstructed parallax image data in the buffer 204. If parallax image data is already stored in the buffer 204, the image reconstruction unit 203 updates the already stored parallax image data. The image reconstruction unit 203 also stores the light field in the buffer 204.
In step S608, the image reconstruction unit 203 determines whether there is a change in the image sensing unit parameters used for the light field. Upon determining that there is no change, the update is regarded as complete, and the process advances to step S609. Upon determining that there is a change in the image sensing unit parameters, the process returns to step S601.
In step S609, the image reconstruction unit 203 performs overlap determination on the light field. In the overlap determination, it is determined whether two or more different coordinates on the imaging lens correspond to a single pixel. Upon determining that overlap exists, the process advances to step S610. Upon determining that no overlap exists, the process advances to step S611. For example, assume that the image sensor 104 has light-receiving areas in the state shown in FIG. 13. As shown in FIG. 13, when each light-receiving area is wide, overlap areas are generated.
In step S610, the image reconstruction unit 203 gives overlap information representing the existence of overlap to the light field entry corresponding to a pixel determined to overlap, and updates the light field in the buffer 204. In step S611, the parallax image extraction unit 205 confirms the currently set display mode. The user can set any one of the display modes by operating the operation unit 113. More specifically, when the user inputs an instruction to select a display mode using the operation unit 113, the character generation unit 115 generates a display mode selection screen (GUI) and displays it on the display screen of the display unit 116. FIG. 14 shows a display example on the display screen of the display unit 116.
The 3×3 matrix shown as a GUI corresponds to 3×3 parallax images. For example, pointing out the rectangle at the upper left corner using the operation unit 113 makes it possible to designate a parallax image ML(−1,−1) obtained by sensing the object from the upper left. In this way, pointing out a rectangle ith (1≦i≦3 in FIG. 14) rightward from the upper left and jth (1≦j≦3 in FIG. 14) downward using the operation unit 113 makes it possible to designate a parallax image ML(−2+i,−2+j). The parallax image designation method and the arrangement of the GUI therefor are not limited to those described above, as a matter of course.
When the user points out one rectangle using the operation unit 113, a parallax image corresponding to the pointed position is designated, and a 1-parallax display mode is set. When the user instructs to select all rectangles (all rectangles of the GUI) using the operation unit 113, an all-parallax display mode is set. Note that the method of setting the 1-parallax display mode or the all-parallax display mode is not limited to this. Alternatively, for example, one of the all-parallax display mode and the 1-parallax display mode may be selected using a check box generated by the character generation unit 115. In the above-described example, two display modes are selectable. However, the present invention is not limited to this. There may also be a mode to select and display several parallax images.
In any case, when the user sets a display mode using the operation unit 113, data representing the set display mode (when the 1-parallax display mode is set, the data includes data representing the designated parallax image) is written in the RAM 112. Hence, in step S611, the data written in the RAM 112 to represent the display mode is referred to, and which display mode is represented by the data is determined.
Upon determining in step S611 that the 1-parallax display mode is set, the process advances to step S612. When the all-parallax display mode is set, the parallax image extraction unit 205 directly arranges the parallax image data for each parallax, as shown in FIG. 11, and outputs it to the encoder unit 107 and the D/A conversion unit 109 as display image data.
How to arrange and display the parallax images is not limited to that described above, as a matter of course. For example, when the parallax images are arranged as shown in FIG. 11, some of the parallax images may be extracted and arranged as display images. A parallax image having an average pixel value equal to or smaller than a given threshold may be excluded from the display target because it is "too dark".
FIG. 15 illustrates an example in which all parallax images are arranged in accordance with extraction start coordinates when m=5 and n=5. At this time, when the opening portion of the aperture stop 102 is circular, as shown in FIG. 3, non-light-receiving areas where no light is received are generated on the image sensor 104, as shown in FIG. 5. Referring to FIG. 15, the parallax images (hatched portions in FIG. 15) listed below are formed from pixels extracted from the non-light-receiving areas and are therefore darker than the remaining parallax images.
    • Display image data areas ML(1,1), ML(2,1), ML(4,1), ML(5,1), ML(1,2), ML(5,2), ML(1,4), ML(5,4), ML(1,5), ML(2,5), ML(4,5), and ML(5,5)
Only parallax images each having an average pixel value equal to or larger than a threshold may be arranged, as shown in FIG. 16, and displayed on the display screen of the display unit 116. A dark parallax image may be displayed after being brightened by multiplying each pixel value by a gain. Alternatively, neighboring parallax images may be added together.
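As a sketch of these display-side options, the following assumes the parallax image grid from the previous sketch, an 8-bit pixel range, and arbitrary threshold, target, and gain values; none of these numbers come from the embodiment.

    import numpy as np

    def select_for_display(parallax, threshold=32.0, target=96.0, max_gain=8.0):
        # Exclude parallax images whose average pixel value is below the threshold,
        # and brighten dark-but-kept images by a gain toward the target level.
        kept = {}
        for j in range(parallax.shape[0]):
            for i in range(parallax.shape[1]):
                img = parallax[j, i].astype(np.float32)
                mean = float(img.mean())
                if mean < threshold:
                    continue                              # "too dark": exclude from display
                gain = min(max_gain, max(1.0, target / mean))
                kept[(i + 1, j + 1)] = np.clip(img * gain, 0.0, 255.0)
        return kept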
Alternatively, as shown in FIG. 17, the parallax images may be extracted, arranged and enlarged to 3×3, and thus displayed on the display screen of the display unit 116. A parallax image given overlap information in step S610 may be excluded from the display target. For a parallax image given overlap information, "information representing occurrence of overlap" may be displayed in place of the parallax image, as shown in FIG. 19.
Referring back to FIG. 6, in step S612, the parallax image extraction unit 205 determines whether to refer to the "data representing the designated parallax image" included in the data representing the 1-parallax display mode. Whether to refer to the data can be either preset or selected by the user. If the data is to be referred to, the process advances to step S613; if not, the process advances to step S614. When an instruction to select the 1-parallax display mode is simply input using a GUI other than the above-described matrix GUI, there exists no data representing a designated parallax image. In this case, the process advances to step S614.
In step S613, the parallax image extraction unit 205 enlarges the designated parallax image as needed, and outputs the enlarged parallax image data to the D/A conversion unit 109 as display image data. For example, in FIG. 11, when the rectangle at the upper left corner is pointed out on the 3×3 matrix GUI of the above-described example, the parallax image data ML(1,1) is extracted, enlarged as needed, and output to the D/A conversion unit 109. A known enlarging method such as a bicubic method is usable. The scaling factor of enlargement is calculated from the number of pixels of the display screen of the display unit 116. The enlargement processing is not essential, and reduction may be performed in place of enlargement, as a matter of course.
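For the enlargement itself, one possible sketch uses OpenCV's bicubic interpolation; the display resolution values and the aspect-ratio policy are assumptions, not the embodiment's method of deriving the scaling factor.

    import cv2

    def enlarge_for_display(parallax_image, display_w=640, display_h=480):
        # Scale the selected parallax image to fit the display screen, preserving
        # aspect ratio, using bicubic interpolation (one known enlarging method).
        h, w = parallax_image.shape[:2]
        scale = min(display_w / w, display_h / h)     # scaling factor from display size
        new_size = (int(w * scale), int(h * scale))   # cv2.resize expects (width, height)
        return cv2.resize(parallax_image, new_size, interpolation=cv2.INTER_CUBIC)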
The operation on the GUI (switching the display target parallax image) may be done while a display image is being displayed on the display unit 116. For example, when the user points out another parallax image using the operation unit 113 during display of a display image on the display unit 116, a display image is generated from the newly designated parallax image and output to the D/A conversion unit 109.
Note that the display configuration on the display screen of the display unit 116 is not limited to the above-described example. For example, a GUI as shown in FIG. 18 may be displayed on the display unit 116 to display a display image on the upper side of the display unit 116 and parallax images on the lower side, and cause the user to select a parallax image on the lower side. In the GUI shown in FIG. 18, each parallax image data generated in step S607 is reduced and arranged on the lower side of the display unit 116, and the selected parallax image is enlarged and displayed on the upper side. Selection of a parallax image given overlap information may be prohibited. The parallax image data may also directly be arranged for each parallax, as shown in FIG. 11, and output to the encoder unit 107 and the D/A conversion unit 109 as display image data.
In step S614, the parallax image extraction unit 205 enlarges the parallax image ML located at the center of the parallax image data, as needed, as in step S613, and outputs the enlarged parallax image to the D/A conversion unit 109 as a display image. The enlargement processing is not essential, and reduction may be performed in place of enlargement, as a matter of course.
Note that the display unit 116 displays the display image data received from the D/A conversion unit 109 as a display image on, for example, an LCD.
As described above, according to this embodiment, the user can confirm via the display unit 116 whether a parallax image acquired by the image sensing apparatus including the microlens array 103 is in focus. For example, when the object is a planar chart as shown in FIG. 20A, a display image as shown in FIG. 20B is conventionally obtained through the microlens array 103. Hence, it is impossible to confirm whether the planar chart is in focus. However, according to this embodiment, the image of the center viewpoint or the like is enlarged and displayed, as shown in FIG. 20C. This makes it possible to confirm whether the planar chart is in focus.
[Second Embodiment]
The series of processes described in the first embodiment are performed at a predetermined time interval to update the display image in real time, thereby implementing live view display. In addition, every time an assigned pixel number calculation parameter or the lens position is updated, the above-described processing is performed. This makes it possible to reflect the influence of the focal length, the aperture stop, and the movement of the lens and image sensing elements upon shooting on the display image in real time. For example, when the focal length or the aperture stop 102 changes, the light-receiving areas on the image sensor 104 change.
FIG. 21 is a schematic view of light-receiving areas on the image sensor 104 when the aperture stop 102 is stopped down. FIG. 13 is a schematic view of light-receiving areas when the aperture stop 102 is opened. At this time, since the image sensing unit parameters change, the procedure of the signal processing unit 106 is performed from step S601 again to create a new light field and thus cope with the change. For example, when the aperture stop 102 is stopped down, the light-receiving areas become smaller, as shown in FIG. 21. Hence, the parallax number m×n decreases, as can be seen from equations (2) and (3). Assume that the parallax number of the parallax image data is 5×5, and this decreases to 3×3 when the aperture stop is stopped down. At this time, the light field is recreated, and the parallax number of the parallax image data decreases from that in FIG. 22 to that in FIG. 23. As described above, even when the light-receiving areas are narrowed by, for example, stopping down the aperture stop 102 or increasing the distance L, the parallax image data can correctly be generated.
When the aperture stop 102 is opened to widen the light-receiving areas on the image sensor 104, the parallax number m×n increases, as is apparent from equations (2) and (3). For example, when the parallax number increases from 5×5 to 7×7, parallax image data as shown in FIG. 24 is obtained. As described above, even when the light-receiving areas are widened by, for example, opening the aperture stop or shortening the distance L, the parallax image data can correctly be generated.
In the above-described embodiment, the in-focus state can be confirmed by displaying a parallax image. However, the image can also be used for another purpose, as a matter of course. For example, contrast AF processing may be performed by the CPU 110 and the image sensing system control unit 114 using a display image.
The processing has been described above for an image sensing apparatus which can change the focal length and move the aperture stop 102, the imaging lens 101, and the image sensor 104. However, this does not apply to an image sensing apparatus that uses a single focal point or a single aperture stop and does not change the size of each light-receiving area on the image sensor 104. In such an image sensing apparatus, the light field may be held in the ROM 111 or the like in advance because it need not be recreated from the image sensing unit parameters. The same applies to an image sensing apparatus in which the light field does not change because of the positional relationship in the image sensing unit 100, regardless of the aperture state of the aperture stop. Processing after light field creation is the same as that from step S605.
[Third Embodiment]
In this embodiment, an example will be explained in which display is performed to allow the user to easily confirm whether focus adjustment processing (to be referred to as refocus hereinafter) after shooting, which is to be performed by an image sensing apparatus holding a lens array, is possible. Points different from the first embodiment will mainly be described below.
<Example of Arrangement of Image Sensing Apparatus>
FIG. 25 is a block diagram showing an example of the functional arrangement of the signal processing unit 106 according to this embodiment. The signal processing unit 106 includes a parallax number calculation unit 201, an LF creation unit 202, an image reconstruction unit 203, a buffer 204, a parallax image extraction unit 205, a main object extraction unit 2501, and a parallax amount calculation unit 2502.
The parallax number calculation unit 201, the LF creation unit 202, the image reconstruction unit 203, and the buffer 204 are the same as in the first embodiment, and a description thereof will not be repeated.
The parallax image extraction unit 205 extracts the parallax image of the center viewpoint from the parallax image data stored in the buffer 204, enlarges the parallax image, and sends it to a D/A conversion unit 109.
The main object extraction unit 2501 acquires the information of the main object designated by the user via an operation unit 113 or the touch panel of a display unit 116, and extracts the main object from each parallax image data.
The parallax amount calculation unit 2502 calculates the barycentric coordinates of the main object in the coordinate system of each parallax image from the main object extracted from each parallax image data, and calculates the parallax amount based on the distance between those barycentric coordinates and the barycentric coordinates in the parallax image of the center viewpoint. Next, it determines based on the parallax amount whether the main object has a parallax sufficient for refocus. The parallax amount calculation unit 2502 then generates refocus determination information from each parallax image including the main object and its parallax amount, gives the information to the parallax image, and transmits it to the D/A conversion unit 109.
<Operation of Image Sensing Apparatus>
The operation of the image sensing apparatus according to this embodiment will be described next.
Processing up to sensed image data generation by the signal processing unit 106 is the same as in the first embodiment, and a description thereof will not be repeated. Refocus evaluation information generation processing will be described below with reference to the flowchart of FIG. 26.
The processes of steps S2601 to S2610 are the same as those of steps S601 to S610 described above, and a description thereof will not be repeated.
In step S2611, the parallax image extraction unit 205 extracts the parallax image of the center viewpoint from the parallax image data, enlarges the parallax image, and transmits it to the D/A conversion unit 109. In addition, a message prompting the user to select the main object is displayed on the display unit 116.
In step S2612, the main object extraction unit 2501 acquires main object selection information from the operation unit 113. The main object selection information is a point (x″c,y″c) in the x″-y″ coordinate system of the parallax image data of the center viewpoint, which is selected by the user using the operation unit 113 from the parallax image of the center viewpoint displayed on the display unit 116 in step S2611. For example, assume that a face image is sensed in the parallax image data of the center viewpoint, as shown in FIG. 27. When the user designates the point indicated by the filled circle and thereby designates the face as the main object, the x″y″ coordinates of the filled circle are stored in the RAM 112 as the main object selection information. The main object extraction unit 2501 acquires this information.
In step S2613, the main object extraction unit 2501 acquires the parallax image data of the center viewpoint from the buffer 204, extracts the object based on the main object selection information, and calculates the center coordinates (x″0,y″0) of the extracted object. The center coordinates are obtained by surrounding the object with a rectangle and taking the center of the rectangle, as shown in FIG. 28. Note that object extraction can be done by a known method.
In step S2614, the main object extraction unit 2501 acquires the parallax image data other than that of the center viewpoint from the buffer 204, and extracts the object extracted in step S2613 from each parallax image data. Note that extraction can be done by a known method such as pattern matching. Next, the center coordinates (x″1,y″1), (x″2,y″2), (x″3,y″3), . . . of the object are calculated for the respective parallax images, as in step S2613. The main object extraction unit 2501 then sends the center coordinates of all parallax images from which the object can be extracted to the parallax amount calculation unit 2502.
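A minimal sketch of steps S2613 and S2614 under the assumption that the "known method" is normalized cross-correlation template matching with OpenCV: the template is cut from the center-viewpoint image around the user-selected point (x″c,y″c), and the object center is taken as the center of the best-matching rectangle. The template half-size and all function names are illustrative assumptions.

    import cv2

    def object_center(parallax_image, template):
        # Locate the template in one parallax image and return the center
        # coordinates of the best-matching rectangle.
        result = cv2.matchTemplate(parallax_image, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, (x, y) = cv2.minMaxLoc(result)        # top-left corner of the best match
        th, tw = template.shape[:2]
        return (x + tw / 2.0, y + th / 2.0)

    def centers_for_all_views(center_view, other_views, sel_x, sel_y, half=32):
        # Cut a template around the user-selected point and compute the object's
        # center coordinates in the center view and in every other parallax image.
        template = center_view[sel_y - half:sel_y + half, sel_x - half:sel_x + half]
        c0 = object_center(center_view, template)                   # (x''0, y''0)
        others = [object_center(v, template) for v in other_views]  # (x''1, y''1), ...
        return c0, others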
In step S2615, the parallax amount calculation unit 2502 calculates the parallax amount of each parallax image including the object designated by the user. For example, the parallax amount E1 of the parallax image having the object center coordinates (x″1,y″1) is calculated by
E1=√((x″1−x″0)²+(y″1−y″0)²)  (5)
In a similar manner, parallax amounts E2, E3, . . . are calculated for the parallax images from which the object can be extracted.
In step S2616, the parallax amount calculation unit 2502 generates refocus evaluation information based on each parallax amount calculated in step S2615, and gives the refocus evaluation information to the corresponding parallax image data. All parallax image data are transmitted to the D/A conversion unit 109 as display image data. The refocus evaluation information is 1-bit information, ◯ or x. When the object exists in the parallax image and the parallax amount of the parallax image data is larger than a threshold Th necessary and sufficient for refocus, refocus evaluation information ◯ is given to the parallax image data. When the parallax amount is smaller than the threshold Th, refocus evaluation information x is given to the parallax image data. The refocus evaluation information determination condition is not limited to that described above. The threshold Th may be increased as the parallax image data is farther apart from the parallax image data of the center viewpoint. In the above-described example, only the scalar quantity of the parallax amount is used as the determination condition. However, a direction may be combined. For example, when parallax image data is located above the parallax image data of the center viewpoint, the parallax amount vector of the object needs to point upward. Hence, whether the vector points upward may be determined.
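Putting steps S2615 and S2616 together, the following sketch computes the parallax amounts per equation (5) and assigns the 1-bit refocus evaluation information; the threshold Th is a single assumed constant here, whereas the embodiment also allows it to grow with distance from the center viewpoint.

    import math

    def refocus_evaluation(center_xy, other_centers, Th=4.0):
        # center_xy: object center (x''0, y''0) in the center-viewpoint image.
        # other_centers: object centers (x''k, y''k) in the other parallax images.
        # Returns, for each image, the parallax amount Ek and the evaluation mark.
        x0, y0 = center_xy
        results = []
        for (xk, yk) in other_centers:
            Ek = math.hypot(xk - x0, yk - y0)     # equation (5)
            mark = 'O' if Ek > Th else 'x'        # O: parallax sufficient for refocus
            results.append((Ek, mark))
        return results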
Next, the display unit 116 displays the display image data on, for example, an LCD as a display image. The images may be displayed in, for example, the all-parallax mode, as shown in FIG. 29. At this time, the refocus evaluation information ◯ or x is displayed at the upper right of each parallax image based on the refocus evaluation information given in step S2616. The display method is not limited to this. Only parallax images given the refocus evaluation information may be displayed on the display unit 116 one by one and switched as time elapses. At this time, the refocus evaluation information of each parallax image is displayed simultaneously.
A display image is displayed on the display unit 116 by performing the above-described processing. This allows the user to easily confirm via the display unit 116 whether the main object of the sensed image acquired by the image sensing apparatus including the microlens array 103 has a parallax amount sufficient for refocus to be performed later.
Other Embodiments
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (for example, computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application Nos. 2010-282397 filed Dec. 17, 2010 and 2011-251024 filed Nov. 16, 2011, which are hereby incorporated by reference herein in their entirety.

Claims (6)

What is claimed is:
1. An image sensing apparatus comprising:
a unit that acquires a sensed image generated from an image signal output from an image sensor for receiving light that becomes incident sequentially through an imaging lens and a microlens array that is a two-dimensional array including a plurality of microlenses;
a creation unit that creates a list in which, for each pixel position on the image sensor, a correspondence between coordinates of the light incident at the pixel position on the imaging lens and coordinates of the pixel position is registered;
a generation unit that generates, as a parallax image group, images obtained by rearranging pixels at pixel positions on the image sensor corresponding to the coordinates in accordance with an arrangement order of the coordinates on the imaging lens registered in the list; and
an output unit that outputs the images generated by said generation unit.
2. The apparatus according to claim 1, wherein said creation unit creates a list in which, for each pixel position on the image sensor, a correspondence between coordinates at which a line segment passing through the pixel position and a center position of the microlens intersects the imaging lens and the coordinates of the pixel position is registered.
3. The apparatus according to claim 1, wherein
upon acquiring an instruction to output one parallax image out of the parallax image group, said output unit outputs said one parallax image to a display screen so as to cause the display screen to display said one parallax image, and
upon acquiring an instruction to output the parallax image group, said output unit outputs the images generated by said generation unit to the display screen so as to cause the display screen to display the images.
4. The apparatus according to claim 3, wherein said output unit outputs only a parallax image having an average pixel value not less than a threshold to the display screen.
5. The apparatus according to claim 3, wherein when pixels collected to generate a parallax image are the same as pixels collected to generate another parallax image, said output unit outputs parallax images other than the parallax image and said other parallax image, to the display screen.
6. A method of controlling an image sensing apparatus including an image sensor for receiving light that becomes incident sequentially through an imaging lens and a microlens array that is a two-dimensional array including a plurality of microlenses, comprising:
a step of acquiring a sensed image generated from an image signal output from the image sensor;
a creation step of creating a list in which, for each pixel position on the image sensor, a correspondence between coordinates of the light incident at the pixel position on the imaging lens and coordinates of the pixel position is registered;
a generation step of generating, as a parallax image group, images obtained by rearranging pixels at pixel positions on the image sensor corresponding to the coordinates in accordance with an arrangement order of the coordinates on the imaging lens registered in the list; and
an output step of outputting the images generated in the generation step.
US13/308,907 | Priority date: 2010-12-17 | Filing date: 2011-12-01 | Image sensing apparatus and method of controlling the image sensing apparatus | Expired - Fee Related | US8581998B2 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US14/048,405 (US8711269B2 (en)) | 2010-12-17 | 2013-10-08 | Image sensing apparatus and method of controlling the image sensing apparatus

Applications Claiming Priority (4)

Application Number | Priority Date | Filing Date | Title
JP2010-282397 | 2010-12-17
JP2010282397 | 2010-12-17
JP2011251024A (JP5906062B2) | 2010-12-17 | 2011-11-16 | Imaging apparatus and control method thereof
JP2011-251024 | 2011-11-16

Related Child Applications (1)

Application Number | Priority Date | Filing Date | Title
US14/048,405 (Continuation) | 2010-12-17 | 2013-10-08 | Image sensing apparatus and method of controlling the image sensing apparatus (US8711269B2 (en))

Publications (2)

Publication Number | Publication Date
US20120154651A1 (en) | 2012-06-21
US8581998B2 (en) | 2013-11-12

Family

Family ID: 46233926

Family Applications (2)

Application Number | Priority Date | Filing Date | Title
US13/308,907 | 2010-12-17 | 2011-12-01 | Image sensing apparatus and method of controlling the image sensing apparatus (Expired - Fee Related, US8581998B2 (en))
US14/048,405 | 2010-12-17 | 2013-10-08 | Image sensing apparatus and method of controlling the image sensing apparatus (Expired - Fee Related, US8711269B2 (en))

Family Applications After (1)

Application Number | Priority Date | Filing Date | Title
US14/048,405 | 2010-12-17 | 2013-10-08 | Image sensing apparatus and method of controlling the image sensing apparatus (Expired - Fee Related, US8711269B2 (en))

Country Status (3)

Country | Link
US (2) | US8581998B2 (en)
JP (1) | JP5906062B2 (en)
CN (1) | CN102547095B (en)


Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8949078B2 (en)* | 2011-03-04 | 2015-02-03 | Ricoh Co., Ltd. | Filter modules for aperture-coded, multiplexed imaging systems
JP5310837B2 (en)* | 2011-12-28 | 2013-10-09 | Casio Computer Co., Ltd. | Image generating apparatus, digital camera, method, and program
KR20130112541A (en)* | 2012-04-04 | 2013-10-14 | Samsung Electronics Co., Ltd. | Plenoptic camera apparatus
JP6029380B2 (en) | 2012-08-14 | 2016-11-24 | Canon Kabushiki Kaisha | Image processing apparatus, imaging apparatus including image processing apparatus, image processing method, and program
JP5978082B2 (en)* | 2012-09-19 | 2016-08-24 | Japan Broadcasting Corporation (NHK) | Stereoscopic image capturing apparatus and method
JP5756572B2 (en)* | 2012-09-25 | 2015-07-29 | Fujifilm Corporation | Image processing apparatus and method, and imaging apparatus
JP5820794B2 (en)* | 2012-10-05 | 2015-11-24 | Olympus Corporation | Imaging device
EP2725781B1 (en) | 2012-10-25 | 2014-06-04 | Axis AB | Method of setting focus of a digital video camera and a digital video camera doing the same
JP2014086968A (en)* | 2012-10-25 | 2014-05-12 | Ricoh Co Ltd | Image processing device, image processing method, and program
JP6305053B2 (en)* | 2013-01-15 | 2018-04-04 | Canon Kabushiki Kaisha | Image processing apparatus, imaging apparatus, image processing method, and program
JP6162971B2 (en)* | 2013-02-06 | 2017-07-12 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, imaging apparatus, and control method thereof
US9621794B2 (en) | 2013-02-21 | 2017-04-11 | Nec Corporation | Image processing device, image processing method and permanent computer-readable medium
JP6274901B2 (en)* | 2013-03-25 | 2018-02-07 | Canon Kabushiki Kaisha | Imaging apparatus and control method thereof
JP2015008387A (en) | 2013-06-25 | 2015-01-15 | Canon Kabushiki Kaisha | Image processing apparatus, method and program for image processing and imaging apparatus
JP6188531B2 (en)* | 2013-10-22 | 2017-08-30 | Canon Kabushiki Kaisha | Imaging apparatus, control method thereof, and program
JP6452360B2 (en)* | 2013-12-19 | 2019-01-16 | Canon Kabushiki Kaisha | Image processing apparatus, imaging apparatus, image processing method, and program
EP3104604A1 (en)* | 2015-06-08 | 2016-12-14 | Thomson Licensing | Light field imaging device
JP6611531B2 (en) | 2015-09-16 | 2019-11-27 | Canon Kabushiki Kaisha | Image processing apparatus, image processing apparatus control method, and program
FR3047322B1 (en)* | 2016-01-29 | 2018-08-17 | Thales | Optical system comprising an optical detection block with depth estimation independent of the focal of the optical system
JP6870304B2 (en)* | 2016-12-05 | 2021-05-12 | Sumitomo Electric Industries, Ltd. | Manufacturing method of semiconductor devices
KR102240896B1 (en)* | 2019-12-10 | 2021-04-15 | Agency for Defense Development | Camera device using single aperture
US11250235B2 (en)* | 2020-04-17 | 2022-02-15 | Novatek Microelectronics Corp. | Under display light field sensor, device with under display light field sensor for sensing fingerprint or touch, and method for reconstructing image


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP4826152B2 (en)* | 2005-06-23 | 2011-11-30 | Nikon Corporation | Image composition method and imaging apparatus
US8103111B2 (en)* | 2006-12-26 | 2012-01-24 | Olympus Imaging Corp. | Coding method, electronic camera, recording medium storing coded program, and decoding method
JP4969474B2 (en)* | 2007-02-09 | 2012-07-04 | Olympus Imaging Corp. | Decoding method, decoding device, and decoding program
JP5294794B2 (en)* | 2007-12-14 | 2013-09-18 | Canon Kabushiki Kaisha | Imaging apparatus and image display method
JP4538766B2 (en)* | 2008-08-21 | 2010-09-08 | Sony Corporation | Imaging device, display device, and image processing device
US8558915B2 (en)* | 2009-12-22 | 2013-10-15 | Samsung Electronics Co., Ltd. | Photographing apparatus and method
JP2013179564A (en)* | 2012-02-01 | 2013-09-09 | Canon Inc | Image processing method, image processing device, and imaging device

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JPH05134294A (en) | 1991-11-14 | 1993-05-28 | Asahi Optical Co Ltd | Fienda
JP2001186401A (en) | 1999-12-24 | 2001-07-06 | Minolta Co Ltd | Digital camera
US6961089B2 (en) | 1999-12-24 | 2005-11-01 | Minolta Co., Ltd. | Digital camera that displays a previously captured image on an LCD when a half-mirror is in motion
US20020028014A1 (en)* | 2000-08-25 | 2002-03-07 | Shuji Ono | Parallax image capturing apparatus and parallax image processing apparatus
US20020126210A1 (en)* | 2001-01-19 | 2002-09-12 | Junichi Shinohara | Method of and unit for inputting an image, and computer product
US20030071905A1 (en)* | 2001-10-12 | 2003-04-17 | Ryo Yamasaki | Image processing apparatus and method, control program, and storage medium
US7652679B2 (en) | 2004-03-03 | 2010-01-26 | Canon Kabushiki Kaisha | Image display method, program, image display apparatus and image display system
JP2006039486A (en) | 2004-07-30 | 2006-02-09 | Victor Co Of Japan Ltd | Map information processor
US7936392B2 (en) | 2004-10-01 | 2011-05-03 | The Board Of Trustees Of The Leland Stanford Junior University | Imaging arrangements and methods therefor
US20070230944A1 (en)* | 2006-04-04 | 2007-10-04 | Georgiev Todor G | Plenoptic camera
US20100039501A1 (en)* | 2007-01-30 | 2010-02-18 | Satoshi Nakamura | Image recording device and image recording method
US7932941B2 (en)* | 2007-11-12 | 2011-04-26 | Sony Corporation | Image pickup apparatus
US20090128658A1 (en)* | 2007-11-12 | 2009-05-21 | Sony Corporation | Image pickup apparatus
JP2009165115A (en) | 2007-12-12 | 2009-07-23 | Sony Corp | Imaging device
US20090185801A1 (en)* | 2008-01-23 | 2009-07-23 | Georgiev Todor G | Methods and Apparatus for Full-Resolution Light-Field Capture and Rendering
US8106994B2 (en)* | 2008-01-28 | 2012-01-31 | Sony Corporation | Image pickup apparatus having a microlens array
US8325241B2 (en)* | 2009-02-05 | 2012-12-04 | Sony Corporation | Image pickup apparatus that stores adjacent and contiguous pixel data before integration of same
US20100283884A1 (en)* | 2009-05-08 | 2010-11-11 | Sony Corporation | Imaging device
US20110019067A1 (en)* | 2009-07-27 | 2011-01-27 | Panasonic Corporation | Imaging apparatus
US20110242289A1 (en)* | 2010-03-31 | 2011-10-06 | Rieko Fukushima | Display apparatus and stereoscopic image display method
US20120019625A1 (en)* | 2010-07-26 | 2012-01-26 | Nao Mishima | Parallax image generation apparatus and method
US20120081513A1 (en)* | 2010-10-01 | 2012-04-05 | Masahiro Yamada | Multiple Parallax Image Receiver Apparatus
US20120154551A1 (en)* | 2010-12-17 | 2012-06-21 | Canon Kabushiki Kaisha | Stereo image display system, stereo imaging apparatus and stereo display apparatus
US8368744B2 (en)* | 2011-02-21 | 2013-02-05 | Kabushiki Kaisha Toshiba | Image display apparatus, image processing device, and image processing method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Ren Ng, et al., "Light Field Photography with a Hand-Held Plenoptic Camera", Stanford Tech Report CTSR 2005-02.

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US10298834B2 (en) | 2006-12-01 | 2019-05-21 | Google Llc | Video refocusing
US9172853B2 (en)* | 2012-02-28 | 2015-10-27 | Lytro, Inc. | Microlens array architecture for avoiding ghosting in projected images
US9420276B2 (en) | 2012-02-28 | 2016-08-16 | Lytro, Inc. | Calibration of light-field camera geometry via robust fitting
US20150029386A1 (en)* | 2012-02-28 | 2015-01-29 | Lytro, Inc. | Microlens array architecture for avoiding ghosting in projected images
US10552947B2 (en) | 2012-06-26 | 2020-02-04 | Google Llc | Depth-based image blurring
US9681042B2 (en) | 2012-09-12 | 2017-06-13 | Canon Kabushiki Kaisha | Image pickup apparatus, image pickup system, image processing device, and method of controlling image pickup apparatus
US10334151B2 (en) | 2013-04-22 | 2019-06-25 | Google Llc | Phase detection autofocus using subaperture images
US20150116794A1 (en)* | 2013-10-24 | 2015-04-30 | Masamoto Nakazawa | Photoelectric conversion element, image reading device, image forming apparatus, image reading method, and image forming method
US9179083B2 (en)* | 2013-10-24 | 2015-11-03 | Ricoh Company, Limited | Photoelectric conversion element, image reading device, image forming apparatus, image reading method, and image forming method
US9635332B2 (en) | 2014-09-08 | 2017-04-25 | Lytro, Inc. | Saturated pixel recovery in light-field images
US10565734B2 (en) | 2015-04-15 | 2020-02-18 | Google Llc | Video capture, processing, calibration, computational fiber artifact removal, and light-field pipeline
US10275898B1 (en) | 2015-04-15 | 2019-04-30 | Google Llc | Wedge-based light-field video capture
US10341632B2 (en) | 2015-04-15 | 2019-07-02 | Google Llc. | Spatial random access enabled video system with a three-dimensional viewing volume
US11328446B2 (en) | 2015-04-15 | 2022-05-10 | Google Llc | Combining light-field data with active depth data for depth map generation
US10412373B2 (en) | 2015-04-15 | 2019-09-10 | Google Llc | Image capture for virtual reality displays
US10419737B2 (en) | 2015-04-15 | 2019-09-17 | Google Llc | Data structures and delivery methods for expediting virtual reality playback
US10469873B2 (en) | 2015-04-15 | 2019-11-05 | Google Llc | Encoding and decoding virtual reality video
US10567464B2 (en) | 2015-04-15 | 2020-02-18 | Google Llc | Video compression with adaptive view-dependent lighting removal
US10540818B2 (en) | 2015-04-15 | 2020-01-21 | Google Llc | Stereo image generation and interactive playback
US10546424B2 (en) | 2015-04-15 | 2020-01-28 | Google Llc | Layered content delivery for virtual and augmented reality experiences
US10205896B2 (en) | 2015-07-24 | 2019-02-12 | Google Llc | Automatic lens flare detection and correction for light-field images
US10275892B2 (en) | 2016-06-09 | 2019-04-30 | Google Llc | Multi-view scene segmentation and propagation
US10679361B2 (en) | 2016-12-05 | 2020-06-09 | Google Llc | Multi-view rotoscope contour propagation
US10594945B2 (en) | 2017-04-03 | 2020-03-17 | Google Llc | Generating dolly zoom effect using light field image data
US10474227B2 (en) | 2017-05-09 | 2019-11-12 | Google Llc | Generation of virtual reality with 6 degrees of freedom from limited viewer data
US10444931B2 (en) | 2017-05-09 | 2019-10-15 | Google Llc | Vantage generation and interactive playback
US10440407B2 (en) | 2017-05-09 | 2019-10-08 | Google Llc | Adaptive control for immersive experience delivery
US10354399B2 (en) | 2017-05-25 | 2019-07-16 | Google Llc | Multi-view back-projection to a light-field
US10545215B2 (en) | 2017-09-13 | 2020-01-28 | Google Llc | 4D camera tracking and optical stabilization
US10965862B2 (en) | 2018-01-18 | 2021-03-30 | Google Llc | Multi-camera navigation interface
WO2021058604A1 (en) | 2019-09-24 | 2021-04-01 | Koninklijke Philips N.V. | Hypochlorite composition and system and method for preparing a hypochlorite composition and use of the same
WO2021058611A1 (en) | 2019-09-24 | 2021-04-01 | Koninklijke Philips N.V. | Kit, system and method for preparing and use of a peroxide-containing composition
EP3811922A1 (en) | 2019-10-24 | 2021-04-28 | Koninklijke Philips N.V. | Kit, system and method for preparing and use of a peroxide-containing composition
EP3811929A1 (en) | 2019-10-24 | 2021-04-28 | Koninklijke Philips N.V. | Hypochlorite composition and system and method for preparing a hypochlorite composition and use of the same

Also Published As

Publication number | Publication date
US8711269B2 (en) | 2014-04-29
CN102547095A (en) | 2012-07-04
JP2012142918A (en) | 2012-07-26
JP5906062B2 (en) | 2016-04-20
CN102547095B (en) | 2014-10-29
US20140036130A1 (en) | 2014-02-06
US20120154651A1 (en) | 2012-06-21

Similar Documents

Publication | Title
US8581998B2 (en) | Image sensing apparatus and method of controlling the image sensing apparatus
US9866810B2 (en) | Optimization of optical systems for improved light field capture and manipulation
US9521320B2 (en) | Image processing apparatus, image capturing apparatus, image processing method, and storage medium
US8593509B2 (en) | Three-dimensional imaging device and viewpoint image restoration method
JP6029380B2 (en) | Image processing apparatus, imaging apparatus including image processing apparatus, image processing method, and program
JP6341736B2 (en) | Imaging apparatus, control method, program, storage medium
US9609208B2 (en) | Image generation method, image generation apparatus, program, and storage medium
JP6789833B2 (en) | Image processing equipment, imaging equipment, image processing methods and programs
US9113071B2 (en) | Imaging device, imaging method, and imaging program for displaying a composite image indicating focus deviation
US8860852B2 (en) | Image capturing apparatus
JP2013009274A (en) | Image processing device, image processing method, and program
KR101889932B1 (en) | Apparatus and Method for photographing image
JP5882789B2 (en) | Image processing apparatus, image processing method, and program
CN104580921B (en) | Picture pick-up device and its control method
JP2017060010A (en) | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
WO2013027507A1 (en) | Imaging device
JP6964806B2 (en) | Image sensor, image sensor, image data processing method, and program
JP5640383B2 (en) | Imaging device
JP5743769B2 (en) | Image processing apparatus and image processing method
JP6672085B2 (en) | Information processing apparatus, information processing method, and program
JP2024012828A (en) | Imaging device and its control method
JP2012124650A (en) | Imaging apparatus, and imaging method
JP6817855B2 (en) | Image processing equipment, imaging equipment, image processing methods, and programs
JP2024175256A (en) | Information processing device and imaging device
JP2019057798A (en) | Image processing device, imaging device, method of controlling imaging device, and program

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:CANON KABUSHIKI KAISHA, JAPAN

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OHNO, TAKENORI;REEL/FRAME:027930/0607

Effective date:20111125

STCF | Information on status: patent grant

Free format text:PATENTED CASE

FEPP | Fee payment procedure

Free format text:PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY | Fee payment

Year of fee payment:4

FEPP | Fee payment procedure

Free format text:MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS | Lapse for failure to pay maintenance fees

Free format text:PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH | Information on status: patent discontinuation

Free format text:PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP | Lapsed due to failure to pay maintenance fee

Effective date:20211112

