US9270875B2 - Dual image capture processing - Google Patents

Dual image capture processing

Info

Publication number
US9270875B2
Authority
US
United States
Prior art keywords
image
monoscopic
enhanced
sensor
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/335,028
Other versions
US20130021447A1 (en)
Inventor
Laurent Brisedoux
David Plowman
Ron Fridental
Benjamin Sewell
Naushir Patuck
Cressida Harding
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Broadcom Corp
Assigned to BROADCOM CORPORATION. Assignment of assignors interest (see document for details). Assignors: Plowman, David; Sewell, Benjamin; Harding, Cressida; Patuck, Naushir; Brisedoux, Laurent; Fridental, Ron
Priority to US13/335,028 (US9270875B2)
Priority to EP18188593.0A (EP3429189B1)
Priority to EP12004966.3A (EP2549763A3)
Priority to TW101124641A (TWI526068B)
Priority to KR1020120078610A (KR101428635B1)
Priority to CN201210254807.9A (CN102892008B)
Publication of US20130021447A1
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT. Patent security agreement. Assignors: BROADCOM CORPORATION
Publication of US9270875B2
Application granted
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. Assignment of assignors interest (see document for details). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION. Termination and release of security interest in patents. Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT
Assigned to AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED. Merger (see document for details). Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED. Corrective assignment to correct the effective date previously recorded on reel 047229, frame 0408. Assignor(s) hereby confirms the effective date is 09/05/2018. Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Assigned to AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED. Corrective assignment to correct the patent number 9,385,856 to 9,385,756 previously recorded at reel 47349, frame 001. Assignor(s) hereby confirms the merger. Assignors: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.
Legal status: Active
Adjusted expiration

Abstract

Embodiments of imaging devices of the present disclosure automatically utilize simultaneous image captures in an image processing pipeline. In one embodiment, control processing circuitry initiates simultaneous capture of a first image by a first image sensor and a second image by a second image sensor, and image processing circuitry generates an enhanced monoscopic image comprising at least portions of the first image and the second image.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to copending U.S. provisional application entitled, “Image Capture Device Systems and Methods,” having Ser. No. 61/509,747, filed Jul. 20, 2011, which is entirely incorporated herein by reference.
This application is related to copending U.S. utility patent application entitled “Multiple Image Processing” filed Sep. 19, 2011 and accorded Ser. No. 13/235,975, which is entirely incorporated herein by reference.
BACKGROUND
Some types of image processing, such as high dynamic range (HDR) image processing, involve combining a single camera's sequential still-image outputs (e.g., each with a different exposure) into one still image with a higher dynamic range (i.e., an image with a larger range of luminance variation between light and dark image areas). This approach is often called exposure bracketing and can be found in conventional cameras.
BRIEF DESCRIPTION OF THE DRAWINGS
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 is a block diagram of one embodiment of image processing circuitry according to the present disclosure.
FIGS. 2-5 are block diagrams of embodiments of an image signal processing pipeline implemented by the pipeline processing logic from the image processing circuitry of FIG. 1.
FIG. 6 is a block diagram illustrating an embodiment of an electronic device employing the image processing circuitry of FIG. 1.
FIGS. 7-9 are flow chart diagrams depicting various functionalities of embodiments of the image processing circuitry of FIG. 1.
DETAILED DESCRIPTION
This disclosure pertains to a device, method, computer-usable medium, and processor programmed to automatically utilize simultaneous image captures in an image processing pipeline in a digital camera or digital video camera. One of ordinary skill in the art would recognize that the techniques disclosed may also be applied to other contexts and applications.
For cameras in embedded devices, e.g., digital cameras, digital video cameras, mobile phones, personal data assistants (PDAs), tablets, portable music players, and desktop or laptop computers, to produce more visually pleasing images, techniques such as those disclosed herein can improve image quality without incurring significant computational overhead or power costs.
To acquire image data, a digital imaging device may include an image sensor that provides a number of light-detecting elements (e.g., photodetectors) configured to convert light detected by the image sensor into an electrical signal. An image sensor may also include a color filter array that filters light captured by the image sensor to capture color information. The image data captured by the image sensor may then be processed by image processing pipeline circuitry, which may apply a number of various image processing operations to the image data to generate a full color image that may be displayed for viewing on a display device, such as a monitor.
Conventional image processes, such as conventional high dynamic range (HDR) image processing, require multiple images to be captured sequentially and then combined to yield an HDR image with enhanced image characteristics. In conventional HDR image processing, multiple images are captured sequentially by a single image sensor at different exposures and are combined to produce a single image with higher dynamic range than is possible with the capture of a single image. For example, capture of an outdoor night-time shot with a neon sign might result in either over-exposure of the neon sign or under-exposure of the other portions of the scene. However, capturing both an over-exposed image and an under-exposed image and combining them can yield an HDR image with adequate exposure for both the sign and the scene. This approach is often called exposure bracketing, but a requirement is that the captured images must be substantially similar even though taken sequentially, to prevent substantial introduction of blurring or ghosting.
Embodiments of the present disclosure provide enhanced image processing by utilizing multiple images that are captured simultaneously. Referring to FIG. 1, a block diagram of one embodiment of an image processing circuitry 100 is shown for an imaging device 150. The illustrated imaging device 150 may be provided as a digital camera configured to acquire both still images and moving images (e.g., video). The device 150 may include multiple lenses 110 and multiple image sensors 101 configured to capture and convert light into electrical signals. By way of example only, an individual image sensor may include a CMOS (complementary metal-oxide-semiconductor) image sensor (e.g., a CMOS active-pixel sensor (APS)) or a CCD (charge-coupled device) sensor.
One prospective use of an imaging device 150 with multiple cameras or image sensors would be to increase the number of dimensions represented in a displayed image. An example of this type of functionality is a stereoscopic camera, which typically has two cameras (e.g., two image sensors). Embodiments of the present disclosure, however, may have more than two cameras or image sensors. Further, embodiments of an imaging device 150 may have modes of operation such that one mode may allow the imaging device 150 to capture a 2-dimensional (2D) image; a second mode may allow the imaging device to capture a multi-dimensional image (e.g., a 3D image); and a third mode may allow the imaging device to simultaneously capture multiple images and use them to produce one or more enhanced 2D images to which an image processing effect has been applied. Accordingly, some embodiments of the present disclosure encompass a configurable and adaptable multi-imager camera architecture which operates in a stereoscopic (3D) mode, a monoscopic (single-imager 2D) mode, or a combinational monoscopic (multiple-imager 2D) mode. In one embodiment, mode configuration involves user selection, while adaptation can be automatic or prompted. For example, monoscopic mode may be used in normally sufficient situations but switched to combinational monoscopic operation when the need is detected by control logic 105.
In some embodiments, the image processing circuitry 100 may include various subcomponents and/or discrete units of logic that collectively form an image processing “pipeline” for performing each of various image processing steps. These subcomponents may be implemented using hardware (e.g., digital signal processors or ASICs (application-specific integrated circuits)) or software, or via a combination of hardware and software components. The various image processing operations may be provided by the image processing circuitry 100.
The image processing circuitry 100 may include front-end processing logic 103, pipeline processing logic 104, and control logic 105, among others. The image sensor(s) 101 may include a color filter array (e.g., a Bayer filter) and may thus provide both light intensity and wavelength information captured by each imaging pixel of the image sensors 101 to provide for a set of raw image data that may be processed by the front-end processing logic 103.
The front-end processing logic 103 may receive pixel data from memory 108. For instance, the raw pixel data may be sent to memory 108 from the image sensor 101. The raw pixel data residing in the memory 108 may then be provided to the front-end processing logic 103 for processing.
Upon receiving the raw image data (from image sensor 101 or from memory 108), the front-end processing logic 103 may perform one or more image processing operations. The processed image data may then be provided to the pipeline processing logic 104 for additional processing prior to being displayed (e.g., on display device 106), or may be sent to the memory 108. The pipeline processing logic 104 receives the “front-end” processed data, either directly from the front-end processing logic 103 or from memory 108, and may provide for additional processing of the image data in the raw domain, as well as in the RGB and YCbCr color spaces, as the case may be. Image data processed by the pipeline processing logic 104 may then be output to the display 106 (or viewfinder) for viewing by a user and/or may be further processed by a graphics engine. Additionally, output from the pipeline processing logic 104 may be sent to memory 108, and the display 106 may read the image data from memory 108. Further, in some implementations, the pipeline processing logic 104 may also include an encoder 107, such as a compression engine, for encoding the image data prior to being read by the display 106.
The encoder 107 may be a JPEG (Joint Photographic Experts Group) compression engine for encoding still images, or an H.264 compression engine for encoding video images, or some combination thereof. Also, it should be noted that the pipeline processing logic 104 may also receive raw image data from the memory 108.
The control logic 105 may include a processor 620 (FIG. 6) and/or microcontroller configured to execute one or more routines (e.g., firmware) that may be configured to determine control parameters for the imaging device 150, as well as control parameters for the pipeline processing logic 104. By way of example only, the control parameters may include sensor control parameters, camera flash control parameters, lens control parameters (e.g., focal length for focusing or zoom), or a combination of such parameters for the image sensor(s) 101. The control parameters may also include image processing commands, such as auto-white balance, autofocus, autoexposure, and color adjustments, as well as lens shading correction parameters for the pipeline processing logic 104. The control parameters may further comprise multiplexing signals or commands for the pipeline processing logic 104.
Referring now to FIG. 2, one embodiment of the pipeline processing logic 104 may perform processes of an image signal processing pipeline by first sending image information to a first process element 201, which may take the raw data produced by the image sensor 101 (FIG. 1) and generate a digital image that will be viewed by a user or undergo further processing by a downstream process element. Accordingly, the processing pipeline may be considered as a series of specialized algorithms that adjust image data in real time, often implemented as an integrated component of a system-on-chip (SoC) image processor. With an image signal processing pipeline implemented in hardware, front-end image processing can be completed without placing any processing burden on the main application processor 620 (FIG. 6).
In one embodiment, the first process element 201 of an image signal processing pipeline could perform a particular image process such as noise reduction, defective pixel detection/correction, lens shading correction, lens distortion correction, demosaicing, image sharpening, color uniformity, RGB (red, green, blue) contrast, saturation boost, etc. As discussed above, the pipeline may include a second process element 202. In one embodiment, the second process element 202 could perform a particular and different image process such as noise reduction, defective pixel detection/correction, lens shading correction, demosaicing, image sharpening, color uniformity, RGB contrast, saturation boost, etc. The image data may then be sent to additional element(s) of the pipeline as the case may be, saved to memory 108, and/or input for display 106.
In one embodiment, an image process performed by a process element 201, 202 in the image signal processing pipeline is an enhanced high dynamic range process. A mode of operation for the enhanced high dynamic range process causes simultaneous images to be captured by image sensors 101. Because the multiple images are taken simultaneously, the object being photographed is captured at the same instant in each image. Under this mode of operation, multiple images are captured at different exposure levels (e.g., different gain settings) or with some other varied characteristic and are then combined to produce an image having an enhanced range for the particular characteristic. For example, an enhanced image may be produced with one portion having low exposure, another portion having medium exposure, and another portion having high exposure, depending on the number of images that have been simultaneously captured. In a different scenario, simultaneous images may be captured at different focus levels.
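The patent does not specify a merging algorithm, but the idea can be sketched. In the following illustrative Python/NumPy sketch, the `merge_exposures` function, the triangle weighting, and the `gain_ratio` parameter are all assumptions, not the patent's implementation:

```python
import numpy as np

def merge_exposures(under, over, gain_ratio):
    """Blend an under- and an over-exposed frame (float arrays in [0, 1]).

    Well-exposed pixels get higher weight; the under-exposed frame is
    scaled by gain_ratio to bring both captures to a common radiance scale.
    """
    def weight(img):
        # Triangle weighting: peaks at mid-grey, falls off at the extremes.
        return 1.0 - np.abs(img - 0.5) * 2.0

    w_u, w_o = weight(under), weight(over)
    radiance_u = under * gain_ratio   # undo the shorter exposure
    radiance_o = over
    total = w_u + w_o + 1e-6          # avoid division by zero
    return (w_u * radiance_u + w_o * radiance_o) / total

# A bright pixel clipped in the long exposure is recovered from the short one.
under = np.array([[0.25]])   # short exposure: neon sign not clipped
over = np.array([[1.0]])     # long exposure: neon sign blown out
hdr = merge_exposures(under, over, gain_ratio=4.0)
```

Here the blown-out pixel gets zero weight in the long exposure, so the merged radiance comes almost entirely from the short exposure scaled back to a common scale.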
In another embodiment, a different image process performed by a process element 201, 202 in the image signal processing pipeline is an enhanced autofocusing process that can be utilized in many contexts, including enhanced continuous autofocusing. A mode of operation for the enhanced autofocusing process causes simultaneous images to be captured by image sensors 101. One of the image sensors 101 (in an assistive role) may be caused to focus on an object and then scan an entire focusing range to find an optimum focus. The optimum focus is then used by a primary image sensor to capture an image of the object. In one scenario, the primary image sensor 101 may be capturing video of the object or a scene involving the object. Accordingly, the optimum focus determined by the second or assistive image sensor 101 may change as the scene changes, and therefore the focus used by the primary image sensor 101 may be adjusted as the video is captured.
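The focus sweep by the assistive sensor can be illustrated with a hypothetical sketch (the contrast metric, the simulated lens, and all names are assumptions; real autofocus statistics typically come from dedicated hardware):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def focus_metric(img):
    # Mean squared gradient: larger when the frame is sharper.
    gy, gx = np.gradient(img.astype(float))
    return float((gx**2 + gy**2).mean())

def scan_focus_range(capture_at, positions):
    """Sweep the assistive sensor through its focus range and return the
    position whose capture maximizes the sharpness metric."""
    scores = [focus_metric(capture_at(p)) for p in positions]
    return positions[int(np.argmax(scores))]

# Simulated lens: the frame is box-blurred more the further the focus
# position is from the (unknown to the algorithm) true position 3.
rng = np.random.default_rng(0)
scene = rng.random((32, 32))

def capture_at(pos):
    size = abs(pos - 3) + 1                     # 1 = perfectly in focus
    windows = sliding_window_view(scene, (size, size))
    return windows.mean(axis=(-1, -2))          # box blur of that size

best = scan_focus_range(capture_at, positions=[1, 2, 3, 4, 5])
```

The position found by the assistive sweep would then be handed to the primary sensor for the actual capture.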
In an additional embodiment, an image process performed by a process element in the image signal processing pipeline is an enhanced depth of field process. A mode of operation for the enhanced process causes simultaneous images to be captured by image sensors 101. Focusing of the image sensors 101 may be independently controlled by control logic 105. Accordingly, one image sensor may be focused or zoomed closely on an object in a scene, and a second image sensor may be focused at a different level on a different aspect of the scene. Image processing in the image signal processing pipeline may then take the captured images and combine them to produce an enhanced image with a greater depth of field. Accordingly, multiple images may be combined to effectively extend the depth of field. Also, some embodiments may utilize images from more than two imagers or image sensors 101.
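A minimal focus-stacking sketch along these lines, under the assumption that per-pixel sharpness can be compared with a simple Laplacian (all names are illustrative, not the patent's implementation):

```python
import numpy as np

def laplacian(img):
    # 4-neighbour Laplacian magnitude as a per-pixel sharpness measure.
    pad = np.pad(img, 1, mode="edge")
    return np.abs(4 * img - pad[:-2, 1:-1] - pad[2:, 1:-1]
                  - pad[1:-1, :-2] - pad[1:-1, 2:])

def focus_stack(near, far):
    """Per pixel, keep whichever simultaneous capture is locally sharper."""
    take_near = laplacian(near) >= laplacian(far)
    return np.where(take_near, near, far)

# Toy scene: checkerboard detail stands in for in-focus texture, a flat
# grey for out-of-focus blur. Each sensor resolves one half of the scene.
sharp = (np.indices((8, 8)).sum(axis=0) % 2).astype(float)
flat = np.full((8, 8), 0.5)
near = np.where(np.arange(8) < 4, sharp, flat)    # left half in focus
far = np.where(np.arange(8) >= 4, sharp, flat)    # right half in focus
combined = focus_stack(near, far)                 # sharp everywhere
```

The combined frame recovers the fine detail across the whole field of view, which neither capture held on its own.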
In various embodiments, multiple image sensors 101 may not be focused on the same object in a scene. For example, an order may be applied to the image sensors 101 or imagers, where a primary imager captures a scene and a secondary imager captures the scene at a different angle, exposure, gain, etc., and the second image is used to correct or enhance the primary image. Exemplary operations include, but are not limited to, HDR capture and enhanced denoising that uses one frame to help denoise the other. To illustrate, in one implementation, a scene captured in two simultaneous images may be enhanced by averaging the values of pixels across both images, which improves the signal-to-noise ratio for the captured scene. Also, with multiple images captured simultaneously at different angles, a curve of the lens shading may be calculated (using the location difference of the same object(s) in the image captures between the two or more image sensors) and used to correct affected pixels.
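The averaging operation described above can be demonstrated directly. This small sketch (illustrative, not the patent's circuitry) shows the expected roughly √2 noise reduction from averaging two simultaneous captures with independent sensor noise:

```python
import numpy as np

rng = np.random.default_rng(42)
scene = np.full((64, 64), 0.5)   # true scene radiance
sigma = 0.05                     # per-capture sensor noise

# Two simultaneous captures of the same scene with independent noise.
cap_a = scene + rng.normal(0, sigma, scene.shape)
cap_b = scene + rng.normal(0, sigma, scene.shape)

averaged = 0.5 * (cap_a + cap_b)

noise_single = (cap_a - scene).std()
noise_avg = (averaged - scene).std()
# Independent noise averages down by roughly sqrt(2).
```

Because the two captures happen at the same instant, the scene content is identical in both frames and only the noise differs, which is what makes the straight average safe.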
Accordingly, in an additional embodiment, an image process performed by a process element 201, 202 in the image signal processing pipeline is a corrective process. A mode of operation for the enhanced process causes simultaneous images to be captured by image sensors 101. The lenses of the respective imagers 101 may have different angles of view. Therefore, in the image process, images captured at the different angles of view may be compared to determine a difference between the two images. For example, defective hardware or equipment may cause a defect to be visible in a captured image. Because of the different angles of view, the defect will not appear in the same position in the images captured by the multiple image sensors 101. There will be a small difference, and the image signal processing pipeline is able to differentiate the defect from the real image content and apply some form of correction.
In an additional embodiment, an image process performed by a process element 201, 202 in the image signal processing pipeline is an enhanced image resolution process. A mode of operation for the enhanced process causes simultaneous images to be captured by image sensors 101 at a particular resolution (e.g., 10 megapixels). Image processing in the image signal processing pipeline may then take the captured images and combine them to produce an enhanced image with an increased or super resolution (e.g., 20 megapixels). Further, in some embodiments, one of the captured images may be used to improve another captured image and vice versa. Accordingly, multiple enhanced monoscopic images may be produced from the simultaneous capture of images.
In an additional embodiment, an image process performed by a process element in the image signal processing pipeline is an enhanced low-light process. A mode of operation for the enhanced process causes simultaneous video streams of images to be captured by image sensors 101 during low lighting conditions.
Consider that camera image quality often suffers during low light conditions. Ambient lighting is often low and not adequate for image sensor arrays designed for adequate lighting conditions. Thus, such sensor arrays receive insufficient photons to capture images with good exposure, leading to dark images. Attempting to correct this via analog or digital gain may help somewhat but also tends to over-amplify the underlying noise (which is more dominant in low lighting conditions). One possible solution is to extend exposure time, but this may not be feasible, as hand shake may introduce blurring. Another conventional solution is to add a larger-aperture lens or an external flash. The former is a very expensive and size-consuming proposition, while the latter may not be allowed (such as in museums) or may not be effective (such as for distance shots). Flash systems are also costly and consume a lot of power.
Select embodiments of the present disclosure utilize a combination of different image sensors 101 (e.g., infrared, RGB, panchromatic, etc.). For example, one image sensor may advantageously compensate for image information not provided by the other image sensor and vice versa. Accordingly, the image sensors may capture images simultaneously, where a majority of image information is obtained from a primary image sensor and additional image information is provided from additional image sensor(s), as needed.
In one embodiment, low-light image sensors 101 or panchromatic image sensors 101 are used in concert with a standard RGB (Bayer pattern) image sensor array. Panchromatic sensors receive up to three times the photons of a single RGB sensor due to having a smaller imager die size, but rely on the RGB neighbors for color identification. Such a sensor array design is outperformed by an ordinary RGB sensor at higher lighting levels due to the larger imager die size. One embodiment of an imaging device 150 utilizes an RGB-type CMOS or CCD sensor array for high lighting situations, and a second low-light type of sensor designed for low lighting conditions (e.g., fully panchromatic, black-and-white luma only, or interspersed panchromatic). The imaging device 150 then automatically switches between the two sensors to best capture images under current lighting conditions. Further, in one embodiment, simultaneous images may be captured during low lighting. In particular, by capturing multiple images using a panchromatic imager 101 and a normal lighting imager 101, the captured images can be correlated and combined to produce a more vivid low light image.
As an example, a panchromatic image sensor 101 may be used to capture a video stream at a higher frame rate under low lighting conditions, while the chroma data is sampled at only half that rate. This corresponds to a temporal compression approach, the counterpart to a spatial approach that treats chroma with a lesser resolution than luma. Output of the process element 201, 202 may be a single frame sequence or may actually comprise two separate streams for post-processing access.
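One plausible way to combine panchromatic luma with RGB chroma is a YCbCr-style luma swap. This is a hedged sketch: the swap itself and the use of BT.601 luma weights are illustrative choices, not taken from the patent:

```python
import numpy as np

def fuse_pan_rgb(pan, rgb):
    """Replace the RGB capture's luma with the cleaner panchromatic luma
    while keeping the RGB capture's chroma. Weights are the BT.601 luma
    coefficients; the decomposition is a simplified illustration."""
    w = np.array([0.299, 0.587, 0.114])
    luma_rgb = rgb @ w                   # (H, W) luma of the RGB capture
    chroma = rgb - luma_rgb[..., None]   # colour offsets around the luma
    return np.clip(chroma + pan[..., None], 0.0, 1.0)

# Sanity check: if the pan capture equals the RGB luma exactly, fusing
# round-trips to the original RGB image.
rgb = np.broadcast_to([0.4, 0.5, 0.6], (4, 4, 3)).copy()
pan = rgb @ np.array([0.299, 0.587, 0.114])   # ideal pan capture
fused = fuse_pan_rgb(pan, rgb)
```

In the low-light case the panchromatic luma would be less noisy than the RGB-derived luma, so the fused frame inherits the cleaner luminance while keeping the colour.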
In another scenario, motion blur can be reduced using the panchromatic imager 101 and a normal lighting imager 101. Motion blur occurs when an object is moving in front of the imaging device 150; in a low light condition, for example, the exposure chosen for the low light may capture motion of the object being shot or shake of the imaging device 150 itself. Accordingly, the panchromatic imager is used to capture an image at a shorter exposure than that of a second image captured by the normal lighting imager. The captured images can be correlated and combined to produce an image with the motion blur corrected.
Embodiments of the imaging device 150 are not limited to having two image sensors and can employ a larger number of image sensors 101. For example, a tablet device could have two imagers on the front and two imagers on the back of the device, where images (including video) from each of the imagers are simultaneously captured and combined into a resulting image.
Referring next to FIG. 3, in one embodiment, an image signal processing pipeline implemented by pipeline processing logic 104 contains parallel paths instead of a single linear path. For example, the parallel paths may provide a first path and a second path. Further, in one embodiment, the first path comprises a main processing path and the second path comprises a supplemental processing path. Therefore, while image data from a first image sensor 101 is being processed in the first path, raw image data from a second image sensor 101 may be processed in the second, parallel path. The second path may contain fewer stages or elements 321, 322 than the first path. Alternatively, the first path may contain the same number of, or fewer, stages or elements 311, 312 as compared to the second path. Further, the second path may involve resolution down-conversion of the image to reduce the number of pixels that need to be processed in the pipeline, such as for image analysis. The benefits of the parallel paths apply to still images as well as video images captured by the image sensor(s) 101. Use of parallel paths in the image signal processing pipeline may enable processing of multiple image data streams simultaneously while maximizing final image quality.
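A toy sketch of the two paths (the stage contents are placeholders; the patent does not define specific stages), illustrating how the supplemental path's 2x down-conversion cuts the pixel count by four before any analysis runs:

```python
import numpy as np

def main_path(img):
    # Full-quality path: placeholder sharpening/gain stage at full resolution.
    return np.clip(img * 1.1, 0.0, 1.0)

def supplemental_path(img):
    # Down-convert 2x first so later analysis stages touch 4x fewer pixels.
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

frame_a = np.full((8, 8), 0.5)    # first sensor's capture
frame_b = np.full((8, 8), 0.25)   # second sensor's capture

out_main = main_path(frame_a)           # full resolution preserved
out_supp = supplemental_path(frame_b)   # quarter the pixels to analyse
```

The two outputs would then feed the downstream combination stages described above.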
Referring to FIG. 4, in one embodiment of an image processing pipeline, the processing elements may be divided between elements 411, 412 that are suited for the main image and processing elements 421, 422 that are suited for the secondary image. Accordingly, a secondary image may be initially processed, such as being made smaller or scaled, for the benefit of downstream elements. As an example, the path of the secondary image may contain a noise filtering element because a downstream element needs the secondary image to have undergone noise reduction.
In some embodiments, the images generated by the first and second paths may be stored in memory 108 and made available for subsequent use by the procedures and elements that follow. Accordingly, in one embodiment, while a main image is being processed in the main path of the pipeline, a downsized or scaled version of that image, or of a previous image, may be read by the main path. This may enable more powerful processing in the pipeline, such as during noise filtering.
Also, in some embodiments, similar pixels in the multiple images may be processed once, and the disparate pixels are then processed separately. It is noted that images captured simultaneously by two image sensors in close proximity to one another will be quite similar. Therefore, pixels of a first captured image may be processed in a main path of the pipeline. Additionally, similar pixels in a second captured image may be identified with a similarity mask, where the similar pixels are also contained in the first captured image (and are already being processed). After removal of the similar pixels from the second captured image, the remaining pixels may be processed in a secondary path of the pipeline. By removing redundant processing, significant power savings in the image signal processing pipeline may be realized.
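A minimal sketch of the similarity-mask idea (the threshold value and all names are assumptions; the patent does not specify how similarity is measured):

```python
import numpy as np

def similarity_mask(primary, secondary, tol=0.02):
    """Mark pixels of the secondary capture that are close enough to the
    primary capture that they need no separate processing."""
    return np.abs(primary - secondary) <= tol

# Only the disparate pixels of the secondary image go down the second path.
primary = np.zeros((4, 4))
secondary = primary.copy()
secondary[1, 2] = 0.5                    # one pixel genuinely differs
mask = similarity_mask(primary, secondary)
to_process = np.argwhere(~mask)          # pixels left for the second path
```

Of the 16 pixels, 15 are marked similar and skipped; only the single disparate pixel is queued for the secondary path, which is where the power saving comes from.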
Further, in some embodiments, the images generated by the first and second paths may be simultaneously displayed. For example, one portion of a display 106 can be used to show a video (e.g., output from the first path) and a second portion of the display 106 can be used to show a still image or “snap-shot” from the video (e.g., output from the second path), responsive to a pause button on an interface of the imaging device 150. Alternatively, one image frame may be shown in the left section of a split-screen display and another image frame may be shown in the right section. The imaging device may be configured to allow a user to select a combination of frames (e.g., the frames being displayed in the split screen), which are then compared and combined by processing logic 103, 104, 105 to generate an enhanced image having improved image quality and resolution.
As previously mentioned, embodiments of the imaging device 150 may employ modes of operation that are selectable from interface elements of the device. Interface elements may include graphical interface elements selectable from a display 106, or mechanical buttons or switches on a housing of the imaging device 150. In one embodiment, a user may activate a stereoscopic mode of operation, in which processing logic 103, 104, 105 of the imaging device 150 produces a 3D image, using captured images, that is viewable on the display 106 or capable of being saved in memory 108. The user may also activate a 2D mode of operation, where a single image is captured and displayed or saved in memory 108. Further, the user may activate an enhanced 2D mode of operation, where multiple images are captured and used to produce a 2D image with enhanced characteristics (e.g., improved depth of field, enhanced focus, HDR, super-resolution, etc.) that may be viewed or saved in memory 108.
In processing an image, binning allows charges from adjacent pixels to be combined, which can provide improved signal-to-noise ratios, albeit at the expense of reduced spatial resolution. In various embodiments, different binning levels can be used in each of the multiple image sensors. Therefore, better resolution may be obtained from the image sensor having the lower binning level, and a better signal-to-noise ratio may be obtained from the image sensor having the higher binning level. The two versions of a captured scene or image may then be combined to produce an enhanced version of the image.
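The binning trade-off can be sketched numerically. This illustrative 2x2 example (not the patent's circuit) shows the exchange: one quarter the pixels, roughly double the signal-to-noise ratio for independent noise:

```python
import numpy as np

def bin2x2(img):
    """Sum the charge from each 2x2 block: one quarter the resolution,
    roughly double the signal-to-noise ratio for independent noise."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

rng = np.random.default_rng(7)
sigma = 0.1
frame = 1.0 + rng.normal(0, sigma, (64, 64))   # signal 1.0, noise sigma
binned = bin2x2(frame)                         # signal 4.0, noise ~2*sigma

snr_raw = 1.0 / sigma                 # per-pixel SNR before binning
snr_binned = 4.0 / (binned - 4.0).std()
```

Summing four pixels quadruples the signal while the independent noise only doubles (it grows as the square root of the number of pixels summed), hence the factor-of-two SNR gain.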
In particular, in one embodiment, multiple image sensors 101 capture multiple images, each with a different exposure level. A process element 201, 202 of the image signal processing pipeline correlates and performs high dynamic range processing on different combinations of the captured images. The resulting images from the different combinations may be displayed to the user and offered for selection as the desired final image, which may be saved and/or displayed. In some embodiments, a graphical interface slide-bar (or other user interface control element) may also be presented that allows gradual or stepwise shifting, providing differing weighting combinations between images having different exposures. For video, such a setting may be maintained across all frames.
Multiplexing of the image signal processing pipeline is also implemented in an embodiment utilizing multiple image sensors 101. For example, consider a stereoscopic imaging device (e.g., one embodiment of imaging device 150) that delivers a left image and a right image of an object to a single image signal processing pipeline, as represented in FIG. 5. The single image pipeline in pipeline processing logic 104 can therefore be multiplexed by front-end processing logic 103 between the left and right images that are being input in parallel to the pipeline. Alternatively, in enhanced 2D image processing, simultaneous image captures may also be input in parallel to the pipeline via multiplexing between the images.
Therefore, instead of processing one image in its entirety after the other has been processed in its entirety, the front-end processing logic 103 can process the images concurrently, switching between them as processing time allows. This reduces latency, since processing of one image is not delayed until completion of the other, and processing of the two images finishes more quickly.
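The interleaving idea can be sketched as follows; the `stage` callable and the row-level granularity are assumptions for illustration, not details of front-end processing logic 103:

```python
from itertools import zip_longest

_SKIP = object()  # sentinel for when one image runs out of rows first

def multiplex(stage, left_rows, right_rows):
    """Interleave two captures through one pipeline stage.

    Instead of running the left image through `stage` in its entirety
    and only then starting the right image, rows are alternated so
    both images make progress and neither waits for the other to
    finish completely.
    """
    left_out, right_out = [], []
    for l, r in zip_longest(left_rows, right_rows, fillvalue=_SKIP):
        if l is not _SKIP:
            left_out.append(stage(l))
        if r is not _SKIP:
            right_out.append(stage(r))
    return left_out, right_out
```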
Keeping the above points in mind, FIG. 6 is a block diagram illustrating an example of an electronic device 650 that may provide for the processing of image data using one or more of the image processing techniques briefly mentioned above. The electronic device 650 may be any type of electronic device, such as a laptop or desktop computer, a mobile phone, a tablet, a digital media player, or the like, that is configured to receive and process image data, such as data acquired using one or more image sensing components.
Regardless of its form (e.g., portable or non-portable), it should be understood that the electronic device 650 may provide for the processing of image data using one or more of the image processing techniques briefly discussed above, among others. In some embodiments, the electronic device 650 may apply such image processing techniques to image data stored in a memory of the electronic device 650. In further embodiments, the electronic device 650 may include multiple imaging devices, such as an integrated or external digital camera or imager 101, configured to acquire image data, which may then be processed by the electronic device 650 using one or more of the above-mentioned image processing techniques.
As shown in FIG. 6, the electronic device 605 may include various internal and/or external components which contribute to the function of the device 605. Those of ordinary skill in the art will appreciate that the various functional blocks shown in FIG. 6 may comprise hardware elements (including circuitry), software elements (including computer code stored on a computer readable medium), or a combination of both hardware and software elements. For example, in the presently illustrated embodiment, the electronic device 605 may include input/output (I/O) ports 610, one or more processors 620, memory device 630, non-volatile storage 640, networking device 650, power source 660, and display 670. Additionally, the electronic device 605 may include imaging devices 680, such as digital cameras or imagers 101, and image processing circuitry 690. As will be discussed further below, the image processing circuitry 690 may be configured to implement one or more of the above-discussed image processing techniques when processing image data. As can be appreciated, image data processed by image processing circuitry 690 may be retrieved from the memory 630 and/or the non-volatile storage device(s) 640, or may be acquired using the imaging device 680.
Before continuing, it should be understood that the system block diagram of the device 605 shown in FIG. 6 is intended to be a high-level control diagram depicting various components that may be included in such a device 605. That is, the connection lines between each individual component shown in FIG. 6 may not necessarily represent paths or directions through which data flows or is transmitted between various components of the device 605. Indeed, as discussed above, the depicted processor(s) 620 may, in some embodiments, include multiple processors, such as a main processor (e.g., CPU) and dedicated image and/or video processors. In such embodiments, the processing of image data may be primarily handled by these dedicated processors, thus effectively offloading such tasks from the main processor (CPU).
Referring next to FIG. 7, shown is a flowchart that provides one example of the operation of a portion of the image processing circuitry 100 according to various embodiments. It is understood that the flowchart of FIG. 7 provides merely an example of the many different types of functional arrangements that may be employed to implement the operation of the portion of the image processing circuitry 100 as described herein. As an alternative, the flowchart of FIG. 7 may be viewed as depicting an example of steps of a method implemented in the electronic device 605 (FIG. 6) according to one or more embodiments.
Beginning in step 702, control logic 105 triggers or initiates simultaneous capture of multiple images from image sensors 101, where the multiple images include at least a first image and a second image. The first image contains an imaging characteristic or setting that is different from an imaging characteristic of the second image. Possible imaging characteristics include exposure levels, focus levels, depth of field settings, angles of view, etc. In step 704, processing logic 103, 104 combines at least the first and second images or portions of the first and second images to produce an enhanced image having qualities of the first and second images. The enhanced image, as an example, may contain portions having depths of field from the first and second images, exposure levels from the first and second images, combined resolutions of the first and second images, etc. The enhanced image is output from an image signal processing pipeline of the processing logic and is provided for display, in step 706.
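One way the combining step could work for captures with different focus settings, sketched under the assumption of registered single-channel images and using gradient magnitude as a stand-in sharpness measure (an illustrative choice, not a technique specified by the patent):

```python
import numpy as np

def extend_depth_of_field(near: np.ndarray, far: np.ndarray) -> np.ndarray:
    """Merge two simultaneous captures focused at different depths.

    Each output pixel is taken from whichever capture is locally
    sharper, using gradient magnitude as a simple sharpness proxy,
    so the enhanced image keeps the in-focus regions of both.
    """
    def sharpness(img: np.ndarray) -> np.ndarray:
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gx, gy)

    take_near = sharpness(near) >= sharpness(far)
    return np.where(take_near, near, far)
```

A production pipeline would add alignment and per-region smoothing of the selection mask; this sketch shows only the per-pixel selection idea.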
Next, referring to FIG. 8, shown is a flowchart that provides an additional example of the operation of a portion of the image processing circuitry 100 according to various embodiments. Beginning in step 802, control logic 105 triggers simultaneous capture of multiple images from image sensors 101, where the multiple images include at least a first image and a second image. The first image contains an imaging characteristic or setting that is different from an imaging characteristic of the second image. Further, due to the different characteristic or setting, one image may contain an image degradation that does not exist in the other image. For example, if one image has a longer exposure than the other image, then the image with the longer exposure could possibly have motion blur degradation that is not captured in the other image, although the other image may have other undesired characteristics, such as low lighting levels. In step 804, processing logic 104 compares at least the first and second images or portions of the first and second images to detect an image degradation in the first image, and then in step 806, the pipeline processing logic 104 compensates for the image degradation and produces an enhanced image having qualities of the first and second images. The enhanced image is output from an image signal processing pipeline of the pipeline processing logic 104 and is provided for display, in step 808. In an alternative embodiment, multiple enhanced images may be output, where one captured image may be used to detect an image degradation or defect in a second image and the second image may also be used to detect an image degradation/defect in the first image.
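A hedged sketch of the comparison step for the motion-blur example above, assuming registered single-channel exposures; the gradient-based detail measure and the 0.8 threshold are illustrative choices, not values from the disclosure:

```python
import numpy as np

def long_exposure_is_blurred(long_exp: np.ndarray,
                             short_exp: np.ndarray,
                             ratio: float = 0.8) -> bool:
    """Flag motion blur in the longer exposure by cross-checking it
    against the simultaneously captured shorter exposure.

    If the long exposure retains noticeably less high-frequency
    detail than the short one, blur is the likely cause, and the
    pipeline could then compensate (e.g., by borrowing detail from
    the short exposure).
    """
    def detail(img: np.ndarray) -> float:
        gy, gx = np.gradient(img.astype(float))
        return float(np.hypot(gx, gy).mean())

    return detail(long_exp) < ratio * detail(short_exp)
```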
In FIG. 9, a flowchart is shown that provides an additional example of the operation of a portion of the image processing circuitry 100 according to various embodiments. Beginning in step 902, control logic 105 activates a stereoscopic mode of operation for an imaging device 150, where captured images are used to produce a 3D image that is viewable on the display 106 or capable of being saved in memory 108. In one embodiment, a user may generate a command for the control logic 105 to activate the stereoscopic mode of operation. In an alternative embodiment, the control logic 105 may be configured to automatically activate the stereoscopic mode of operation.
Correspondingly, in step 904, control logic 105 activates a 2D or monoscopic mode of operation for the imaging device 150, where a single image is captured and displayed or saved in memory 108. In one embodiment, a user may generate a command for the control logic 105 to activate the 2D mode of operation. In an alternative embodiment, the control logic 105 may be configured to automatically activate the 2D mode of operation without user prompting.
Further, in step 906, control logic 105 activates an enhanced 2D or monoscopic mode of operation for the imaging device 150, where multiple images are captured and used to produce a 2D image with enhanced characteristics (e.g., improved depth of field, enhanced focus, HDR, super-resolution, etc.) that may be viewed or saved in memory 108. Additionally, in various embodiments, one of the outputs of the image processing may not be an enhanced image but may instead be image information for the enhanced image, such as depth of field information. In one embodiment, a user may generate a command for the control logic 105 to activate the enhanced 2D mode of operation. In an alternative embodiment, the control logic 105 may be configured to automatically activate the enhanced 2D mode of operation without user prompting.
Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of embodiments of the present disclosure in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art.
In the context of this document, a “computer readable medium” can be any means that can contain, store, communicate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer readable medium would include the following: an electrical connection (electronic) having one or more wires, a portable computer diskette (magnetic), a random access memory (RAM) (electronic), a read-only memory (ROM) (electronic), an erasable programmable read-only memory (EPROM or Flash memory) (electronic), an optical fiber (optical), and a portable compact disc read-only memory (CD-ROM) (optical). In addition, the scope of certain embodiments includes embodying the functionality of the embodiments in logic embodied in hardware or software-configured mediums.
It should be emphasized that the above-described embodiments are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims (21)

Therefore, having thus described various embodiments, at least the following is claimed:
1. An image capture device, comprising:
a first image sensor for recording a first image of a scene;
a second image sensor for recording a second image of the scene;
control processing circuitry to initiate simultaneous capture of the first image of the scene by the first image sensor and the second image of the scene by the second image sensor; and
image processing circuitry to generate an enhanced monoscopic image of the scene comprising at least portions of the first image and the second image,
wherein the image processing circuitry is configured to process first pixels of the first image less than all of which are also contained in the second image in a first processing path and to process second pixels of the second image that are not contained in the first image in a second processing path.
2. The image capture device of claim 1, wherein the enhanced monoscopic image comprises a first portion obtained from the first image and a second portion obtained from the second image,
wherein the first portion comprises at least one of having an exposure level that is different from the second portion or having a depth of field that is different from the second portion.
3. The image capture device of claim 1, wherein a resolution of the enhanced monoscopic image is greater than individual resolutions of the first image and the second image.
4. The image capture device of claim 1, wherein the second image is analyzed and compared with the first image to isolate a defect in the first image, wherein the enhanced monoscopic image is a corrected version of the first image with the defect removed.
5. The image capture device of claim 1, wherein:
the control processing circuitry is configured to operate in a stereoscopic mode utilizing simultaneous operation of the first image sensor and the second image sensor to generate a stereoscopic image; a monoscopic mode utilizing singular operation of the first image sensor to generate a monoscopic image; and an enhanced monoscopic mode utilizing simultaneous operation of the first image sensor and the second image sensor to generate the enhanced monoscopic image; and
the control processing circuitry is configured to automatically switch between modes of operation comprising at least two of the monoscopic mode, the stereoscopic mode, or the enhanced monoscopic mode.
6. The image capture device of claim 1, wherein the first image is used as image data for the enhanced monoscopic image and the second image is used as enhancement data to enhance at least one image characteristic of the first image, wherein the enhanced monoscopic image comprises the first image with the at least one improved characteristic.
7. The image capture device of claim 1, wherein the first image sensor is configured to operate at a first binning level during the simultaneous capture that is different from a second binning level at which the second image sensor is configured to operate during the simultaneous capture.
8. The image capture device of claim 1, wherein at least one pixel of the enhanced monoscopic image is derived from an average of at least one pixel in the first image and at least one pixel in the second image.
9. The image capture device of claim 1, wherein the image processing circuitry is configured to:
identify the first pixels of the first image that are also contained in the second image;
remove the identified pixels in the second image from the second image; and
process remaining pixels of the second image in the second processing path, wherein the remaining pixels comprise the second pixels.
10. An image processing method, comprising:
recording a first image captured by a first image sensor;
recording a second image captured by a second image sensor, wherein the first image and the second image are simultaneously captured;
comparing at least portions of the first image and the second image, wherein the comparing comprises:
identifying, using a mask, pixels in the second image that are similar to pixels in the first image;
removing the identified pixels in the second image from the second image;
processing the first image in a first processing path;
processing remaining pixels of the second image in a second processing path; and
responsive to outputs of the first processing path and the second processing path, generating an enhanced monoscopic image.
11. The image processing method of claim 10, wherein the first image is used as image data for the enhanced monoscopic image and the second image is used as enhancement data to enhance at least one image characteristic of the first image, wherein the enhanced monoscopic image comprises the first image with the at least one improved characteristic.
12. The image processing method of claim 11, wherein the at least one improved characteristic comprises at least one of an improved depth of field, an improved resolution, or an improved exposure level.
13. The image processing method of claim 10, wherein the second image is analyzed and compared with the first image to isolate a defect in the first image, wherein the enhanced monoscopic image is a corrected version of the first image with the defect removed.
14. The image processing method of claim 13, wherein:
the defect comprises a lens shading defect, and
the first image sensor comprises an image sensor that is configured to pass red, green, or blue light to sensor pixels and the second image sensor comprises a different type of image sensor than the first image sensor.
15. The image processing method of claim 10, wherein an image capture device comprises the first image sensor and the second image sensor, the method further comprising:
switching operation of the image capture device between a stereoscopic mode utilizing simultaneous operation of the first image sensor and the second image sensor to generate a stereoscopic image; a monoscopic mode utilizing singular operation of the first image sensor to generate a monoscopic image; and an enhanced monoscopic mode utilizing simultaneous operation of the first image sensor and the second image sensor to generate the enhanced monoscopic image.
16. The image processing method of claim 10, wherein at least one pixel of the enhanced monoscopic image is derived from an average of at least one pixel in the first image and at least one pixel in the second image.
17. A non-transitory computer readable medium having an image processing program that, when executed by a hardware processor, causes the hardware processor to:
record a first image captured by a first image sensor;
record a second image captured by a second image sensor, wherein the first image and the second image are simultaneously captured;
compare at least portions of the first image and the second image, wherein the comparing comprises:
identifying, using a mask, pixels in the second image that are similar to pixels in the first image;
removing the identified pixels in the second image from the second image;
processing the first image in a first processing path;
processing remaining pixels of the second image in a second processing path; and
responsive to outputs of the first processing path and the second processing path, generate an enhanced monoscopic image.
18. The non-transitory computer readable medium of claim 17, wherein an image capture device comprises the first image sensor and the second image sensor, the image processing program further causing the hardware processor to:
switch operation of the image capture device between a stereoscopic mode utilizing simultaneous operation of the first image sensor and the second image sensor to generate a stereoscopic image; a monoscopic mode utilizing singular operation of the first image sensor to generate a monoscopic image; and an enhanced monoscopic mode utilizing simultaneous operation of the first image sensor and the second image sensor to generate the enhanced monoscopic image.
19. The non-transitory computer readable medium of claim 17, wherein the first image is used as image data for the enhanced monoscopic image and the second image is used as support data to correct a defect in the first image.
20. The non-transitory computer readable medium of claim 17, wherein the first image is used as image data for the enhanced monoscopic image and the second image is used as enhancement data to enhance at least one image characteristic of the first image, wherein the enhanced monoscopic image comprises the first image with the at least one improved characteristic.
21. The non-transitory computer readable medium of claim 17, wherein at least one pixel of the enhanced monoscopic image is derived from an average of at least one pixel in the first image and at least one pixel in the second image.
US13/335,028 | 2011-07-20 (priority) | 2011-12-22 (filed) | Dual image capture processing | Active, expires 2032-12-29 | US9270875B2 (en)

Priority Applications (6)

Application Number | Priority Date | Filing Date | Title
US13/335,028 (US9270875B2) | 2011-07-20 | 2011-12-22 | Dual image capture processing
EP18188593.0A (EP3429189B1) | 2011-07-20 | 2012-07-04 | Dual image capture processing
EP12004966.3A (EP2549763A3) | 2011-07-20 | 2012-07-04 | Dual image capture processing
TW101124641A (TWI526068B) | 2011-07-20 | 2012-07-09 | Image capturing device and image processing method
KR1020120078610A (KR101428635B1) | 2011-07-20 | 2012-07-19 | Dual image capture processing
CN201210254807.9A (CN102892008B) | 2011-07-20 | 2012-07-20 | Dual image capture processes

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US201161509747P | 2011-07-20 | 2011-07-20
US13/335,028 (US9270875B2) | 2011-07-20 | 2011-12-22 | Dual image capture processing

Publications (2)

Publication Number | Publication Date
US20130021447A1 (en) | 2013-01-24
US9270875B2 (en) | 2016-02-23

Family

ID=46514066

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
US13/335,028 (US9270875B2, Active, expires 2032-12-29) | Dual image capture processing | 2011-07-20 | 2011-12-22

Country Status (5)

Country | Link
US (1) | US9270875B2 (en)
EP (2) | EP2549763A3 (en)
KR (1) | KR101428635B1 (en)
CN (1) | CN102892008B (en)
TW (1) | TWI526068B (en)

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20170324891A1 (en)*2012-11-212017-11-09Infineon Technologies AgDynamic conservation of imaging power
US10156706B2 (en)2014-08-102018-12-18Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US10225479B2 (en)2013-06-132019-03-05Corephotonics Ltd.Dual aperture zoom digital camera
US10230898B2 (en)2015-08-132019-03-12Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US10250797B2 (en)2013-08-012019-04-02Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US10284780B2 (en)2015-09-062019-05-07Corephotonics Ltd.Auto focus and optical image stabilization with roll compensation in a compact folded camera
US10288897B2 (en)2015-04-022019-05-14Corephotonics Ltd.Dual voice coil motor structure in a dual-optical module camera
US10288840B2 (en)2015-01-032019-05-14Corephotonics LtdMiniature telephoto lens module and a camera utilizing such a lens module
US10288896B2 (en)2013-07-042019-05-14Corephotonics Ltd.Thin dual-aperture zoom digital camera
US10319079B2 (en)2017-06-302019-06-11Microsoft Technology Licensing, LlcNoise estimation using bracketed image capture
US10371928B2 (en)2015-04-162019-08-06Corephotonics LtdAuto focus and optical image stabilization in a compact folded camera
US10379371B2 (en)2015-05-282019-08-13Corephotonics LtdBi-directional stiffness for optical image stabilization in a dual-aperture digital camera
US10488631B2 (en)2016-05-302019-11-26Corephotonics Ltd.Rotational ball-guided voice coil motor
US10534153B2 (en)2017-02-232020-01-14Corephotonics Ltd.Folded camera lens designs
US10578948B2 (en)2015-12-292020-03-03Corephotonics Ltd.Dual-aperture zoom digital camera with automatic adjustable tele field of view
US10616484B2 (en)2016-06-192020-04-07Corephotonics Ltd.Frame syncrhonization in a dual-aperture camera system
US10645286B2 (en)2017-03-152020-05-05Corephotonics Ltd.Camera with panoramic scanning range
US10694168B2 (en)2018-04-222020-06-23Corephotonics Ltd.System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems
US10706518B2 (en)2016-07-072020-07-07Corephotonics Ltd.Dual camera system with improved video smooth transition by image blending
US10762708B2 (en)*2016-06-232020-09-01Intel CorporationPresentation of scenes for binocular rivalry perception
US10771684B2 (en)2015-01-192020-09-08Microsoft Technology Licensing, LlcProfiles identifying camera capabilities
US10845565B2 (en)2016-07-072020-11-24Corephotonics Ltd.Linear ball guided voice coil motor for folded optic
US10884321B2 (en)2017-01-122021-01-05Corephotonics Ltd.Compact folded camera
US10904512B2 (en)2017-09-062021-01-26Corephotonics Ltd.Combined stereoscopic and phase detection depth mapping in a dual aperture camera
USRE48444E1 (en)2012-11-282021-02-16Corephotonics Ltd.High resolution thin multi-aperture imaging systems
US10951834B2 (en)2017-10-032021-03-16Corephotonics Ltd.Synthetically enlarged camera aperture
US10976567B2 (en)2018-02-052021-04-13Corephotonics Ltd.Reduced height penalty for folded camera
US11188776B2 (en)2019-10-262021-11-30Genetec Inc.Automated license plate recognition system and related method
US11268829B2 (en)2018-04-232022-03-08Corephotonics LtdOptical-path folding-element with an extended two degree of freedom rotation range
US11287081B2 (en)2019-01-072022-03-29Corephotonics Ltd.Rotation mechanism with sliding joint
US11315276B2 (en)2019-03-092022-04-26Corephotonics Ltd.System and method for dynamic stereoscopic calibration
US11333955B2 (en)2017-11-232022-05-17Corephotonics Ltd.Compact folded camera structure
US11363180B2 (en)2018-08-042022-06-14Corephotonics Ltd.Switchable continuous display information system above camera
US11368631B1 (en)2019-07-312022-06-21Corephotonics Ltd.System and method for creating background blur in camera panning or motion
US11367267B2 (en)2018-02-082022-06-21Genetec Inc.Systems and methods for locating a retroreflective object in a digital image
US11531209B2 (en)2016-12-282022-12-20Corephotonics Ltd.Folded camera structure with an extended light-folding-element scanning range
US20220417382A1 (en)*2017-07-282022-12-29Advanced Micro Devices, Inc.Buffer management for plug-in architectures in computation graph structures
US11635596B2 (en)2018-08-222023-04-25Corephotonics Ltd.Two-state zoom folded camera
US11637977B2 (en)2020-07-152023-04-25Corephotonics Ltd.Image sensors and sensing methods to obtain time-of-flight and phase detection information
US11640047B2 (en)2018-02-122023-05-02Corephotonics Ltd.Folded camera with optical image stabilization
US11659135B2 (en)2019-10-302023-05-23Corephotonics Ltd.Slow or fast motion video using depth information
US11693064B2 (en)2020-04-262023-07-04Corephotonics Ltd.Temperature control for Hall bar sensor correction
US11770618B2 (en)2019-12-092023-09-26Corephotonics Ltd.Systems and methods for obtaining a smart panoramic image
US11770609B2 (en)2020-05-302023-09-26Corephotonics Ltd.Systems and methods for obtaining a super macro image
US11832018B2 (en)2020-05-172023-11-28Corephotonics Ltd.Image stitching in the presence of a full field of view reference image
US11910089B2 (en)2020-07-152024-02-20Corephotonics Lid.Point of view aberrations correction in a scanning folded camera
US11928799B2 (en)2020-06-292024-03-12Samsung Electronics Co., Ltd.Electronic device and controlling method of electronic device
US11949976B2 (en)2019-12-092024-04-02Corephotonics Ltd.Systems and methods for obtaining a smart panoramic image
US11946775B2 (en)2020-07-312024-04-02Corephotonics Ltd.Hall sensor—magnet geometry for large stroke linear position sensing
US11968453B2 (en)2020-08-122024-04-23Corephotonics Ltd.Optical image stabilization in a scanning folded camera
US12007668B2 (en)2020-02-222024-06-11Corephotonics Ltd.Split screen feature for macro photography
US12007671B2 (en)2021-06-082024-06-11Corephotonics Ltd.Systems and cameras for tilting a focal plane of a super-macro image
US12069399B2 (en)2022-07-072024-08-20Snap Inc.Dynamically switching between RGB and IR capture
US12081856B2 (en)2021-03-112024-09-03Corephotonics Lid.Systems for pop-out camera
US12101575B2 (en)2020-12-262024-09-24Corephotonics Ltd.Video support in a multi-aperture mobile camera with a scanning zoom camera
US12328523B2 (en)2018-07-042025-06-10Corephotonics Ltd.Cameras with scanning optical path folding elements for automotive or surveillance
US12328505B2 (en)2022-03-242025-06-10Corephotonics Ltd.Slim compact lens optical image stabilization
US12442665B2 (en)2025-02-062025-10-14Corephotonics Ltd.Hall sensor—magnet geometry for large stroke linear position sensing

Families Citing this family (82)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JP5814566B2 (en)*2011-02-282015-11-17オリンパス株式会社 IMAGING DEVICE, IMAGING METHOD, AND IMAGING DEVICE CONTROL PROGRAM
EP2865179A4 (en)*2012-06-202016-03-02Nokia Technologies OyDisplay camera operation
US20140010476A1 (en)*2012-07-042014-01-09Hui DengMethod for forming pictures
US8854362B1 (en)*2012-07-232014-10-07Google Inc.Systems and methods for collecting data
US9137455B1 (en)*2014-11-052015-09-15Duelight LlcImage sensor apparatus and method for obtaining multiple exposures with zero interframe time
US9918017B2 (en)2012-09-042018-03-13Duelight LlcImage sensor apparatus and method for obtaining multiple exposures with zero interframe time
US9531961B2 (en)2015-05-012016-12-27Duelight LlcSystems and methods for generating a digital image using separate color and intensity data
US9819849B1 (en)2016-07-012017-11-14Duelight LlcSystems and methods for capturing digital images
US9807322B2 (en)2013-03-152017-10-31Duelight LlcSystems and methods for a digital image sensor
US10558848B2 (en)2017-10-052020-02-11Duelight LlcSystem, method, and computer program for capturing an image with correct skin tone exposure
US20140267701A1 (en)*2013-03-122014-09-18Ziv AvivApparatus and techniques for determining object depth in images
EP2779629B1 (en)*2013-03-132018-12-26Samsung Electronics Co., Ltd.Electronic device and method for processing image
KR102124188B1 (en)*2013-03-132020-06-26삼성전자주식회사Electronic device and method for processing image
US11013398B2 (en)*2013-03-132021-05-25Stryker CorporationSystem for obtaining clear endoscope images
US9912929B2 (en)2013-03-212018-03-06Mediatek Inc.Video frame processing method
CN103338355A (en)*2013-06-172013-10-02广东新视野信息科技有限公司3G aviation bellyhold video monitoring method
JP6306845B2 (en)*2013-09-122018-04-04キヤノン株式会社 Imaging apparatus and control method thereof
US9443335B2 (en)2013-09-182016-09-13Blackberry LimitedUsing narrow field of view monochrome camera for producing a zoomed image
US20150103146A1 (en)*2013-10-162015-04-16Qualcomm IncorporatedConversion of at least one non-stereo camera into a stereo camera
CN105830425A (en)*2013-10-182016-08-03泽莱特科股份有限公司Methods and apparatus for capturing and/or combining images
CN105340267A (en)*2013-12-062016-02-17华为终端有限公司Method for generating picture and twin-lens device
EP3067746B1 (en)*2013-12-062019-08-21Huawei Device Co., Ltd.Photographing method for dual-camera device and dual-camera device
US9319576B2 (en)2014-01-292016-04-19Google Technology Holdings LLCMulti-processor support for array imagers
EP3186661B1 (en)2014-08-262021-04-07Massachusetts Institute of TechnologyMethods and apparatus for three-dimensional (3d) imaging
KR101991754B1 (en)*2014-08-292019-09-30후아웨이 테크놀러지 컴퍼니 리미티드 Image processing method and apparatus, and electronic device
TWI542224B (en)*2014-09-222016-07-11瑞昱半導體股份有限公司Image signal processing method and image signal processor
US9672594B2 (en)*2014-10-212017-06-06The Boeing CompanyMultiple pixel pitch super resolution
US10924688B2 (en)2014-11-062021-02-16Duelight LlcImage sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene
US12401911B2 (en)2014-11-072025-08-26Duelight LlcSystems and methods for generating a high-dynamic range (HDR) pixel stream
US11463630B2 (en)2014-11-072022-10-04Duelight LlcSystems and methods for generating a high-dynamic range (HDR) pixel stream
US12401912B2 (en)2014-11-172025-08-26Duelight LlcSystem and method for generating a digital image
CN104363391B (en)*2014-11-282018-11-27Guangdong Oppo Mobile Telecommunications Corp., Ltd.Image dead pixel compensation method, system and photographing device
CN104469164B (en)*2014-12-242018-10-12Lenovo (Beijing) Co., Ltd.Image capture device, image acquisition module and image processing method
WO2016107961A1 (en)*2014-12-292016-07-07Nokia CorporationMethod, apparatus and computer program product for motion deblurring of images
KR102347591B1 (en)*2015-08-242022-01-05Samsung Electronics Co., Ltd.Image sensing apparatus and image processing system
KR102400104B1 (en)*2015-10-282022-05-19Samsung Electronics Co., Ltd.Image processing apparatus and image processing method
KR102446442B1 (en)2015-11-242022-09-23Samsung Electronics Co., Ltd. Digital photographing apparatus and method of operation thereof
JP6603558B2 (en)2015-11-252019-11-06Canon Kabushiki Kaisha Imaging device and imaging apparatus
EP3174286B1 (en)2015-11-252021-01-06Canon Kabushiki KaishaImage sensor and image capturing apparatus
CN105872393A (en)*2015-12-082016-08-17LeTV Mobile Intelligent Information Technology (Beijing) Co., Ltd.High dynamic range image generation method and device
US9712774B1 (en)*2016-01-142017-07-18Omnivision Technologies, Inc.Method and system for implementing dynamic ground sharing in an image sensor with pipeline architecture
CN105827909B (en)*2016-01-252017-06-23Vivo Mobile Communication Co., Ltd.Dual camera quick start method and mobile terminal
US10257394B2 (en)*2016-02-122019-04-09Contrast, Inc.Combined HDR/LDR video streaming
US10264196B2 (en)2016-02-122019-04-16Contrast, Inc.Systems and methods for HDR video capture with a mobile device
WO2017139596A1 (en)2016-02-122017-08-17Contrast Optical Design & Engineering, Inc.Devices and methods for high dynamic range video
CN107102499A (en)*2016-02-222017-08-29Shenzhen Futaihong Precision Industry Co., Ltd.Multi-lens system and portable electronic device with multi-lens system
KR102603426B1 (en)2016-06-272023-11-20Samsung Electronics Co., Ltd.Apparatus and method for processing an image
US10554901B2 (en)*2016-08-092020-02-04Contrast Inc.Real-time HDR video for vehicle control
WO2018044314A1 (en)2016-09-012018-03-08Duelight LlcSystems and methods for adjusting focus based on focus target information
WO2018048838A1 (en)*2016-09-062018-03-15Apple Inc.Still image stabilization/optical image stabilization synchronization in multi-camera image capture
GB2568647B (en)2016-09-192022-04-20Tau Tech LlcMulti-camera imaging systems
US10943100B2 (en)2017-01-192021-03-09Mindmaze Holding SaSystems, methods, devices and apparatuses for detecting facial expression
EP3571627A2 (en)2017-01-192019-11-27Mindmaze Holding S.A.Systems, methods, apparatuses and devices for detecting facial expression and for tracking movement and location including for at least one of a virtual and augmented reality system
CN110892408A (en)*2017-02-072020-03-17Mindmaze Holding S.A.Systems, methods, and apparatus for stereo vision and tracking
US10630888B2 (en)2017-02-092020-04-21Samsung Electronics Co., Ltd.Method and apparatus for selecting capture configuration based on scene analysis
JP7024782B2 (en)*2017-03-272022-02-24Sony Group Corporation Image processing device and image processing method and image pickup device
CN107277348B (en)*2017-06-162019-08-16Guangdong Oppo Mobile Telecommunications Corp., Ltd.Focusing method, focusing device, computer readable storage medium and mobile terminal
WO2019014057A1 (en)2017-07-102019-01-17Contrast, Inc. Stereoscopic camera
US10721419B2 (en)*2017-11-302020-07-21International Business Machines CorporationOrtho-selfie distortion correction using multiple image sensors to synthesize a virtual image
CN108024056B (en)2017-11-302019-10-29Guangdong Oppo Mobile Telecommunications Corp., Ltd. Imaging method and device based on dual cameras
US11328533B1 (en)2018-01-092022-05-10Mindmaze Holding SaSystem, method and apparatus for detecting facial expression for motion capture
US10951888B2 (en)2018-06-042021-03-16Contrast, Inc.Compressed high dynamic range video
CN110166795B (en)*2018-07-192022-02-18Tencent Technology (Shenzhen) Co., Ltd.Video screenshot method and device
US11303932B2 (en)2018-08-142022-04-12Contrast, Inc.Image compression
US11647284B2 (en)2018-08-202023-05-09Sony Semiconductor Solutions CorporationImage processing apparatus and image processing system with image combination that implements signal level matching
US10880475B2 (en)2018-10-252020-12-29Korea Electronics Technology InstituteVideo conversion apparatus and system for generating 360-degree virtual reality video in real time
KR102012717B1 (en)*2018-10-252019-08-21Korea Electronics Technology InstituteImage conversion device and system for generating 360 VR image in real time
EP3899463A4 (en)2018-12-142022-12-21Spectral MD, Inc.System and method for high precision multi-aperture spectral imaging
EP3726459B1 (en)*2019-04-172022-02-16Leica Instruments (Singapore) Pte. Ltd.Signal to noise ratio adjustment circuit, signal to noise ratio adjustment method and signal to noise ratio adjustment program
KR102771181B1 (en)2019-07-122025-02-24Samsung Electronics Co., Ltd.Image sensor and electronic device comprising the image sensor
CN110392149A (en)*2019-07-232019-10-29Huawei Technologies Co., Ltd. Image capture display terminal
TW202110184A (en)*2019-07-302021-03-01Sony Semiconductor Solutions CorporationSending device, receiving device, and communication system
RU2725973C1 (en)*2019-12-312020-07-08Vyacheslav Mikhailovich SmelkovMethod of generating a video signal in a television-computer system for monitoring industrial articles having a circular ring shape
US20210334586A1 (en)*2020-04-282021-10-28Mediatek Inc.Edge learning display device and method
US11891075B2 (en)2020-06-232024-02-06Tusimple, Inc.Redundant hardware and software architecture for autonomous vehicles
US11853845B2 (en)*2020-09-022023-12-26Cognex CorporationMachine vision system and method with multi-aperture optics assembly
WO2022051516A1 (en)2020-09-032022-03-10Cyberdontics (USA), Inc.Method and apparatus for CNA analysis of tooth anatomy
AU2022249956A1 (en)2021-03-292023-11-02Alcon Inc.Stereoscopic imaging platform with continuous autofocusing mode
EP4314701A4 (en)*2021-03-302025-02-26Perceptive Technologies, Inc. Optical coherence tomography for intraoral scanning
US11575828B1 (en)*2021-10-142023-02-07Meta Platforms, Inc.Dynamically identifying visual media capture formats based upon conditions
WO2023141216A2 (en)*2022-01-212023-07-27Spectral Md, Inc.System and method for topological characterization of tissue
KR20250086629A (en)2022-09-082025-06-13Perceptive Technologies, Inc. Optical coherence tomography scanning system and method

Citations (12)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN2569474Y (en)2002-01-072003-08-27Zhang GuoliangStereo image shooting and broadcasting system
US20050128323A1 (en)2003-10-312005-06-16Kwang-Cheol ChoiImage photographing device and method
WO2006079963A2 (en)2005-01-282006-08-03Koninklijke Philips Electronics N.V.Device for registering images
US7086735B1 (en)*2005-05-272006-08-08Anthony Italo ProvitolaEnhancement of visual perception
US20080030592A1 (en)2006-08-012008-02-07Eastman Kodak CompanyProducing digital image with different resolution portions
US20080218611A1 (en)2007-03-092008-09-11Parulski Kenneth AMethod and apparatus for operating a dual lens camera to augment an image
CN101365071A (en)2007-09-272009-02-11OmniVision Technologies, Inc.Dual-mode camera solution, apparatus, system and method
KR20090033487A (en)2006-07-252009-04-03Qualcomm Incorporated Stereo image and video capturing device with dual digital sensors and method of using the same
KR20090088435A (en)2006-12-122009-08-19Dolby Laboratories Licensing Corporation HDR camera with multiple sensors
TW200937344A (en)2008-02-202009-09-01Ind Tech Res InstParallel processing method for synthesizing an image with multi-view images
US20100238327A1 (en)*2009-03-192010-09-23Griffith John DDual Sensor Camera
US20130335535A1 (en)*2011-03-242013-12-19Paul James KaneDigital 3d camera using periodic illumination

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8970680B2 (en)*2006-08-012015-03-03Qualcomm IncorporatedReal-time capturing and generating stereo images and videos with a monoscopic low power mobile device
JP4288623B2 (en)*2007-01-182009-07-01Sony Corporation Imaging device, noise removal device, noise removal method, noise removal method program, and recording medium recording noise removal method program
DK3876510T3 (en)*2008-05-202024-11-11Adeia Imaging LLC Capture and processing of images using monolithic camera array with heterogeneous imagers
EP2518995B1 (en)*2009-12-242018-08-22Sharp Kabushiki KaishaMultocular image pickup apparatus and multocular image pickup method

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN2569474Y (en)2002-01-072003-08-27Zhang GuoliangStereo image shooting and broadcasting system
US20050128323A1 (en)2003-10-312005-06-16Kwang-Cheol ChoiImage photographing device and method
WO2006079963A2 (en)2005-01-282006-08-03Koninklijke Philips Electronics N.V.Device for registering images
US7086735B1 (en)*2005-05-272006-08-08Anthony Italo ProvitolaEnhancement of visual perception
KR20090033487A (en)2006-07-252009-04-03Qualcomm Incorporated Stereo image and video capturing device with dual digital sensors and method of using the same
CN101496415A (en)2006-07-252009-07-29Qualcomm IncorporatedStereo image and video capturing device with dual digital sensors and methods of using the same
US20080030592A1 (en)2006-08-012008-02-07Eastman Kodak CompanyProducing digital image with different resolution portions
KR20090088435A (en)2006-12-122009-08-19Dolby Laboratories Licensing Corporation HDR camera with multiple sensors
US20080218611A1 (en)2007-03-092008-09-11Parulski Kenneth AMethod and apparatus for operating a dual lens camera to augment an image
JP2010521102A (en)2007-03-092010-06-17Eastman Kodak Company Operation of a dual lens camera to augment an image
CN101365071A (en)2007-09-272009-02-11OmniVision Technologies, Inc.Dual-mode camera solution, apparatus, system and method
TW200937344A (en)2008-02-202009-09-01Ind Tech Res InstParallel processing method for synthesizing an image with multi-view images
US20100238327A1 (en)*2009-03-192010-09-23Griffith John DDual Sensor Camera
US20130335535A1 (en)*2011-03-242013-12-19Paul James KaneDigital 3d camera using periodic illumination

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
European Search Report in co-pending related EP Application No. 12 00 4966 mailed Aug. 21, 2013.
Korean Office Action in co-pending related Korean Application No. 10-2012-0078610 mailed Aug. 20, 2013.

Cited By (189)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20170324891A1 (en)*2012-11-212017-11-09Infineon Technologies AgDynamic conservation of imaging power
US10313570B2 (en)*2012-11-212019-06-04Infineon Technologies AgDynamic conservation of imaging power
USRE48945E1 (en)2012-11-282022-02-22Corephotonics Ltd.High resolution thin multi-aperture imaging systems
USRE48477E1 (en)2012-11-282021-03-16Corephotonics LtdHigh resolution thin multi-aperture imaging systems
USRE49256E1 (en)2012-11-282022-10-18Corephotonics Ltd.High resolution thin multi-aperture imaging systems
USRE48697E1 (en)2012-11-282021-08-17Corephotonics Ltd.High resolution thin multi-aperture imaging systems
USRE48444E1 (en)2012-11-282021-02-16Corephotonics Ltd.High resolution thin multi-aperture imaging systems
US12262120B2 (en)2013-06-132025-03-25Corephotonics Ltd.Dual aperture zoom digital camera
US10841500B2 (en)2013-06-132020-11-17Corephotonics Ltd.Dual aperture zoom digital camera
US10904444B2 (en)2013-06-132021-01-26Corephotonics Ltd.Dual aperture zoom digital camera
US11838635B2 (en)2013-06-132023-12-05Corephotonics Ltd.Dual aperture zoom digital camera
US10326942B2 (en)2013-06-132019-06-18Corephotonics Ltd.Dual aperture zoom digital camera
US10225479B2 (en)2013-06-132019-03-05Corephotonics Ltd.Dual aperture zoom digital camera
US12069371B2 (en)2013-06-132024-08-20Corephotonics Ltd.Dual aperture zoom digital camera
US11470257B2 (en)2013-06-132022-10-11Corephotonics Ltd.Dual aperture zoom digital camera
US11852845B2 (en)2013-07-042023-12-26Corephotonics Ltd.Thin dual-aperture zoom digital camera
US11614635B2 (en)2013-07-042023-03-28Corephotonics Ltd.Thin dual-aperture zoom digital camera
US10620450B2 (en)2013-07-042020-04-14Corephotonics LtdThin dual-aperture zoom digital camera
US11287668B2 (en)2013-07-042022-03-29Corephotonics Ltd.Thin dual-aperture zoom digital camera
US12164115B2 (en)2013-07-042024-12-10Corephotonics Ltd.Thin dual-aperture zoom digital camera
US12265234B2 (en)2013-07-042025-04-01Corephotonics Ltd.Thin dual-aperture zoom digital camera
US10288896B2 (en)2013-07-042019-05-14Corephotonics Ltd.Thin dual-aperture zoom digital camera
US10250797B2 (en)2013-08-012019-04-02Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US12267588B2 (en)2013-08-012025-04-01Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US10694094B2 (en)2013-08-012020-06-23Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US10469735B2 (en)2013-08-012019-11-05Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US12114068B2 (en)2013-08-012024-10-08Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US11470235B2 (en)2013-08-012022-10-11Corephotonics Ltd.Thin multi-aperture imaging system with autofocus and methods for using same
US11716535B2 (en)2013-08-012023-08-01Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US11991444B2 (en)2013-08-012024-05-21Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US11856291B2 (en)2013-08-012023-12-26Corephotonics Ltd.Thin multi-aperture imaging system with auto-focus and methods for using same
US12007537B2 (en)2014-08-102024-06-11Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US10509209B2 (en)2014-08-102019-12-17Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US10976527B2 (en)2014-08-102021-04-13Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US11982796B2 (en)2014-08-102024-05-14Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US11543633B2 (en)2014-08-102023-01-03Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US11002947B2 (en)2014-08-102021-05-11Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US11262559B2 (en)2014-08-102022-03-01Corephotonics LtdZoom dual-aperture camera with folded lens
US11042011B2 (en)2014-08-102021-06-22Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US10571665B2 (en)2014-08-102020-02-25Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US10156706B2 (en)2014-08-102018-12-18Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US12105268B2 (en)2014-08-102024-10-01Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US11703668B2 (en)2014-08-102023-07-18Corephotonics Ltd.Zoom dual-aperture camera with folded lens
US12259524B2 (en)2015-01-032025-03-25Corephotonics Ltd.Miniature telephoto lens module and a camera utilizing such a lens module
US11125975B2 (en)2015-01-032021-09-21Corephotonics Ltd.Miniature telephoto lens module and a camera utilizing such a lens module
US12405448B2 (en)2015-01-032025-09-02Corephotonics Ltd.Miniature telephoto lens module and a camera utilizing such a lens module
US11994654B2 (en)2015-01-032024-05-28Corephotonics Ltd.Miniature telephoto lens module and a camera utilizing such a lens module
US10288840B2 (en)2015-01-032019-05-14Corephotonics LtdMiniature telephoto lens module and a camera utilizing such a lens module
US12216246B2 (en)2015-01-032025-02-04Corephotonics Ltd.Miniature telephoto lens module and a camera utilizing such a lens module
US10771684B2 (en)2015-01-192020-09-08Microsoft Technology Licensing, LlcProfiles identifying camera capabilities
US10288897B2 (en)2015-04-022019-05-14Corephotonics Ltd.Dual voice coil motor structure in a dual-optical module camera
US10558058B2 (en)2015-04-022020-02-11Corephotonics Ltd.Dual voice coil motor structure in a dual-optical module camera
US10962746B2 (en)2015-04-162021-03-30Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US10571666B2 (en)2015-04-162020-02-25Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US10459205B2 (en)2015-04-162019-10-29Corephotonics LtdAuto focus and optical image stabilization in a compact folded camera
US10371928B2 (en)2015-04-162019-08-06Corephotonics LtdAuto focus and optical image stabilization in a compact folded camera
US10613303B2 (en)2015-04-162020-04-07Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US10656396B1 (en)2015-04-162020-05-19Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US12105267B2 (en)2015-04-162024-10-01Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US12222474B2 (en)2015-04-162025-02-11Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US12422651B2 (en)2015-04-162025-09-23Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US11808925B2 (en)2015-04-162023-11-07Corephotonics Ltd.Auto focus and optical image stabilization in a compact folded camera
US10670879B2 (en)2015-05-282020-06-02Corephotonics Ltd.Bi-directional stiffness for optical image stabilization in a dual-aperture digital camera
US10379371B2 (en)2015-05-282019-08-13Corephotonics LtdBi-directional stiffness for optical image stabilization in a dual-aperture digital camera
US11350038B2 (en)2015-08-132022-05-31Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US11770616B2 (en)2015-08-132023-09-26Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US12231772B2 (en)2015-08-132025-02-18Corephotonics Ltd.Dual aperture zoom camera with video support and switching/non-switching dynamic control
US10356332B2 (en)2015-08-132019-07-16Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US10567666B2 (en)2015-08-132020-02-18Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US11546518B2 (en)2015-08-132023-01-03Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US12401904B2 (en)2015-08-132025-08-26Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US10917576B2 (en)2015-08-132021-02-09Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US10230898B2 (en)2015-08-132019-03-12Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US12022196B2 (en)2015-08-132024-06-25Corephotonics Ltd.Dual aperture zoom camera with video support and switching / non-switching dynamic control
US10284780B2 (en)2015-09-062019-05-07Corephotonics Ltd.Auto focus and optical image stabilization with roll compensation in a compact folded camera
US10498961B2 (en)2015-09-062019-12-03Corephotonics Ltd.Auto focus and optical image stabilization with roll compensation in a compact folded camera
US10935870B2 (en)2015-12-292021-03-02Corephotonics Ltd.Dual-aperture zoom digital camera with automatic adjustable tele field of view
US11599007B2 (en)2015-12-292023-03-07Corephotonics Ltd.Dual-aperture zoom digital camera with automatic adjustable tele field of view
US11314146B2 (en)2015-12-292022-04-26Corephotonics Ltd.Dual-aperture zoom digital camera with automatic adjustable tele field of view
US11726388B2 (en)2015-12-292023-08-15Corephotonics Ltd.Dual-aperture zoom digital camera with automatic adjustable tele field of view
US10578948B2 (en)2015-12-292020-03-03Corephotonics Ltd.Dual-aperture zoom digital camera with automatic adjustable tele field of view
US11392009B2 (en)2015-12-292022-07-19Corephotonics Ltd.Dual-aperture zoom digital camera with automatic adjustable tele field of view
US12372758B2 (en)2016-05-302025-07-29Corephotonics Ltd.Rotational ball-guided voice coil motor
US10488631B2 (en)2016-05-302019-11-26Corephotonics Ltd.Rotational ball-guided voice coil motor
US11977210B2 (en)2016-05-302024-05-07Corephotonics Ltd.Rotational ball-guided voice coil motor
US11650400B2 (en)2016-05-302023-05-16Corephotonics Ltd.Rotational ball-guided voice coil motor
US11689803B2 (en)2016-06-192023-06-27Corephotonics Ltd.Frame synchronization in a dual-aperture camera system
US12200359B2 (en)2016-06-192025-01-14Corephotonics Ltd.Frame synchronization in a dual-aperture camera system
US10616484B2 (en)2016-06-192020-04-07Corephotonics Ltd.Frame synchronization in a dual-aperture camera system
US11172127B2 (en)2016-06-192021-11-09Corephotonics Ltd.Frame synchronization in a dual-aperture camera system
US10762708B2 (en)*2016-06-232020-09-01Intel CorporationPresentation of scenes for binocular rivalry perception
US11977270B2 (en)2016-07-072024-05-07Corephotonics Ltd.Linear ball guided voice coil motor for folded optic
US11550119B2 (en)2016-07-072023-01-10Corephotonics Ltd.Linear ball guided voice coil motor for folded optic
US12124106B2 (en)2016-07-072024-10-22Corephotonics Ltd.Linear ball guided voice coil motor for folded optic
US12298590B2 (en)2016-07-072025-05-13Corephotonics Ltd.Linear ball guided voice coil motor for folded optic
US10706518B2 (en)2016-07-072020-07-07Corephotonics Ltd.Dual camera system with improved video smooth transition by image blending
US10845565B2 (en)2016-07-072020-11-24Corephotonics Ltd.Linear ball guided voice coil motor for folded optic
US11048060B2 (en)2016-07-072021-06-29Corephotonics Ltd.Linear ball guided voice coil motor for folded optic
US11531209B2 (en)2016-12-282022-12-20Corephotonics Ltd.Folded camera structure with an extended light-folding-element scanning range
US12092841B2 (en)2016-12-282024-09-17Corephotonics Ltd.Folded camera structure with an extended light-folding-element scanning range
US12366762B2 (en)2016-12-282025-07-22Corephotonics Ltd.Folded camera structure with an extended light- folding-element scanning range
US11693297B2 (en)2017-01-122023-07-04Corephotonics Ltd.Compact folded camera
US12259639B2 (en)2017-01-122025-03-25Corephotonics Ltd.Compact folded camera
US10884321B2 (en)2017-01-122021-01-05Corephotonics Ltd.Compact folded camera
US12038671B2 (en)2017-01-122024-07-16Corephotonics Ltd.Compact folded camera
US11815790B2 (en)2017-01-122023-11-14Corephotonics Ltd.Compact folded camera
US11809065B2 (en)2017-01-122023-11-07Corephotonics Ltd.Compact folded camera
US10670827B2 (en)2017-02-232020-06-02Corephotonics Ltd.Folded camera lens designs
US10534153B2 (en)2017-02-232020-01-14Corephotonics Ltd.Folded camera lens designs
US10571644B2 (en)2017-02-232020-02-25Corephotonics Ltd.Folded camera lens designs
US12309496B2 (en)2017-03-152025-05-20Corephotonics Ltd.Camera with panoramic scanning range
US11671711B2 (en)2017-03-152023-06-06Corephotonics Ltd.Imaging system with panoramic scanning range
US10645286B2 (en)2017-03-152020-05-05Corephotonics Ltd.Camera with panoramic scanning range
US10319079B2 (en)2017-06-302019-06-11Microsoft Technology Licensing, LlcNoise estimation using bracketed image capture
US12113946B2 (en)*2017-07-282024-10-08Advanced Micro Devices, Inc.Buffer management for plug-in architectures in computation graph structures
US20220417382A1 (en)*2017-07-282022-12-29Advanced Micro Devices, Inc.Buffer management for plug-in architectures in computation graph structures
US10904512B2 (en)2017-09-062021-01-26Corephotonics Ltd.Combined stereoscopic and phase detection depth mapping in a dual aperture camera
US11695896B2 (en)2017-10-032023-07-04Corephotonics Ltd.Synthetically enlarged camera aperture
US10951834B2 (en)2017-10-032021-03-16Corephotonics Ltd.Synthetically enlarged camera aperture
US12372856B2 (en)2017-11-232025-07-29Corephotonics Ltd.Compact folded camera structure
US11809066B2 (en)2017-11-232023-11-07Corephotonics Ltd.Compact folded camera structure
US11333955B2 (en)2017-11-232022-05-17Corephotonics Ltd.Compact folded camera structure
US11619864B2 (en)2017-11-232023-04-04Corephotonics Ltd.Compact folded camera structure
US12007672B2 (en)2017-11-232024-06-11Corephotonics Ltd.Compact folded camera structure
US12189274B2 (en)2017-11-232025-01-07Corephotonics Ltd.Compact folded camera structure
US12007582B2 (en)2018-02-052024-06-11Corephotonics Ltd.Reduced height penalty for folded camera
US11686952B2 (en)2018-02-052023-06-27Corephotonics Ltd.Reduced height penalty for folded camera
US10976567B2 (en)2018-02-052021-04-13Corephotonics Ltd.Reduced height penalty for folded camera
US11367267B2 (en)2018-02-082022-06-21Genetec Inc.Systems and methods for locating a retroreflective object in a digital image
US11830256B2 (en)2018-02-082023-11-28Genetec Inc.Systems and methods for locating a retroreflective object in a digital image
US12352931B2 (en)2018-02-122025-07-08Corephotonics Ltd.Folded camera with optical image stabilization
US11640047B2 (en)2018-02-122023-05-02Corephotonics Ltd.Folded camera with optical image stabilization
US10911740B2 (en)2018-04-222021-02-02Corephotonics Ltd.System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems
US10694168B2 (en)2018-04-222020-06-23Corephotonics Ltd.System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems
US11733064B1 (en)2018-04-232023-08-22Corephotonics Ltd.Optical-path folding-element with an extended two degree of freedom rotation range
US11268830B2 (en)2018-04-232022-03-08Corephotonics LtdOptical-path folding-element with an extended two degree of freedom rotation range
US11359937B2 (en)2018-04-232022-06-14Corephotonics Ltd.Optical-path folding-element with an extended two degree of freedom rotation range
US11976949B2 (en)2018-04-232024-05-07Corephotonics Ltd.Optical-path folding-element with an extended two degree of freedom rotation range
US11867535B2 (en)2018-04-232024-01-09Corephotonics Ltd.Optical-path folding-element with an extended two degree of freedom rotation range
US12085421B2 (en)2018-04-232024-09-10Corephotonics Ltd.Optical-path folding-element with an extended two degree of freedom rotation range
US12379230B2 (en)2018-04-232025-08-05Corephotonics Ltd.Optical-path folding-element with an extended two degree of freedom rotation range
US11268829B2 (en)2018-04-232022-03-08Corephotonics LtdOptical-path folding-element with an extended two degree of freedom rotation range
US12328523B2 (en)2018-07-042025-06-10Corephotonics Ltd.Cameras with scanning optical path folding elements for automotive or surveillance
US11363180B2 (en)2018-08-042022-06-14Corephotonics Ltd.Switchable continuous display information system above camera
US11852790B2 (en)2018-08-222023-12-26Corephotonics Ltd.Two-state zoom folded camera
US11635596B2 (en)2018-08-222023-04-25Corephotonics Ltd.Two-state zoom folded camera
US12025260B2 (en)2019-01-072024-07-02Corephotonics Ltd.Rotation mechanism with sliding joint
US11287081B2 (en)2019-01-072022-03-29Corephotonics Ltd.Rotation mechanism with sliding joint
US11315276B2 (en)2019-03-092022-04-26Corephotonics Ltd.System and method for dynamic stereoscopic calibration
US11527006B2 (en)2019-03-092022-12-13Corephotonics Ltd.System and method for dynamic stereoscopic calibration
US11368631B1 (en)2019-07-312022-06-21Corephotonics Ltd.System and method for creating background blur in camera panning or motion
US12177596B2 (en)2019-07-312024-12-24Corephotonics Ltd.System and method for creating background blur in camera panning or motion
US12125234B2 (en)2019-10-262024-10-22Genetec Inc.Automated license plate recognition system and related method
US12067743B2 (en)2019-10-262024-08-20Genetec Inc.Automated license plate recognition system and related method
US11188776B2 (en)2019-10-262021-11-30Genetec Inc.Automated license plate recognition system and related method
US11659135B2 (en)2019-10-302023-05-23Corephotonics Ltd.Slow or fast motion video using depth information
US11949976B2 (en)2019-12-092024-04-02Corephotonics Ltd.Systems and methods for obtaining a smart panoramic image
US11770618B2 (en)2019-12-092023-09-26Corephotonics Ltd.Systems and methods for obtaining a smart panoramic image
US12328496B2 (en)2019-12-092025-06-10Corephotonics Ltd.Systems and methods for obtaining a smart panoramic image
US12075151B2 (en)2019-12-092024-08-27Corephotonics Ltd.Systems and methods for obtaining a smart panoramic image
US12007668B2 (en)2020-02-222024-06-11Corephotonics Ltd.Split screen feature for macro photography
US11693064B2 (en)2020-04-262023-07-04Corephotonics Ltd.Temperature control for Hall bar sensor correction
US12174272B2 (en)2020-04-262024-12-24Corephotonics Ltd.Temperature control for hall bar sensor correction
US11832018B2 (en)2020-05-172023-11-28Corephotonics Ltd.Image stitching in the presence of a full field of view reference image
US12096150B2 (en)2020-05-172024-09-17Corephotonics Ltd.Image stitching in the presence of a full field of view reference image
US11962901B2 (en)2020-05-302024-04-16Corephotonics Ltd.Systems and methods for obtaining a super macro image
US11770609B2 (en)2020-05-302023-09-26Corephotonics Ltd.Systems and methods for obtaining a super macro image
US12395733B2 (en)2020-05-302025-08-19Corephotonics Ltd.Systems and methods for obtaining a super macro image
US12167130B2 (en)2020-05-302024-12-10Corephotonics Ltd.Systems and methods for obtaining a super macro image
US11928799B2 (en)2020-06-292024-03-12Samsung Electronics Co., Ltd.Electronic device and controlling method of electronic device
US12368975B2 (en)2020-07-152025-07-22Corephotonics Ltd.Image sensors and sensing methods to obtain time-of-flight and phase detection information
US11832008B2 (en)2020-07-152023-11-28Corephotonics Ltd.Image sensors and sensing methods to obtain time-of-flight and phase detection information
US12003874B2 (en)2020-07-152024-06-04Corephotonics Ltd.Image sensors and sensing methods to obtain Time-of-Flight and phase detection information
US11637977B2 (en)2020-07-152023-04-25Corephotonics Ltd.Image sensors and sensing methods to obtain time-of-flight and phase detection information
US12192654B2 (en)2020-07-152025-01-07Corephotonics Ltd.Image sensors and sensing methods to obtain time-of-flight and phase detection information
US11910089B2 (en)2020-07-152024-02-20Corephotonics Ltd.Point of view aberrations correction in a scanning folded camera
US12108151B2 (en)2020-07-152024-10-01Corephotonics Ltd.Point of view aberrations correction in a scanning folded camera
US11946775B2 (en)2020-07-312024-04-02Corephotonics Ltd.Hall sensor—magnet geometry for large stroke linear position sensing
US12247851B2 (en)2020-07-312025-03-11Corephotonics Ltd.Hall sensor—magnet geometry for large stroke linear position sensing
US11968453B2 (en)2020-08-122024-04-23Corephotonics Ltd.Optical image stabilization in a scanning folded camera
US12184980B2 (en)2020-08-122024-12-31Corephotonics Ltd.Optical image stabilization in a scanning folded camera
US12101575B2 (en)2020-12-262024-09-24Corephotonics Ltd.Video support in a multi-aperture mobile camera with a scanning zoom camera
US12081856B2 (en)2021-03-112024-09-03Corephotonics Ltd.Systems for pop-out camera
US12439142B2 (en)2021-03-112025-10-07Corephotonics Ltd.Systems for pop-out camera
US12007671B2 (en)2021-06-082024-06-11Corephotonics Ltd.Systems and cameras for tilting a focal plane of a super-macro image
US12328505B2 (en)2022-03-242025-06-10Corephotonics Ltd.Slim compact lens optical image stabilization
US12069399B2 (en)2022-07-072024-08-20Snap Inc.Dynamically switching between RGB and IR capture
US12443091B2 (en)2024-05-132025-10-14Corephotonics Ltd.Split screen feature for macro photography
US12442665B2 (en)2025-02-062025-10-14Corephotonics Ltd.Hall sensor—magnet geometry for large stroke linear position sensing

Also Published As

Publication number  Publication date
EP2549763A3 (en)  2014-03-12
EP2549763A2 (en)  2013-01-23
TWI526068B (en)  2016-03-11
KR101428635B1 (en)  2014-08-08
TW201309003A (en)  2013-02-16
CN102892008A (en)  2013-01-23
EP3429189B1 (en)  2024-04-17
CN102892008B (en)  2016-12-21
KR20130011951A (en)  2013-01-30
US20130021447A1 (en)  2013-01-24
EP3429189A1 (en)  2019-01-16

Similar Documents

Publication  Publication Date  Title
US9270875B2 (en)  Dual image capture processing
US20130021504A1 (en)  Multiple image processing
US8199222B2 (en)  Low-light video frame enhancement
US7940311B2 (en)  Multi-exposure pattern for enhancing dynamic range of images
US8698924B2 (en)  Tone mapping for low-light video frame enhancement
JP5845464B2 (en)  Image processing apparatus, image processing method, and digital camera
CN116324882A (en)  Image signal processing in a multi-camera system
US8854503B2 (en)  Image enhancements through multi-image processing
US10762600B2 (en)  Image processing apparatus, image processing method, and non-transitory computer-readable recording medium
US8982230B2 (en)  Image pickup apparatus including image adjustment processing for improving an appearance of an image, the image adjustment processing to be applied when it is determined that an imaging scene is finalized
WO2016117137A1 (en)  Image-capturing device, image-capturing method, and image display device
CN110278375B (en)  Image processing method, device, storage medium and electronic device
JP5146015B2 (en)  Imaging apparatus and imaging method
CN110266965B (en)  Image processing method, image processing device, storage medium and electronic equipment
JP5452269B2 (en)  Imaging device
JP2012134745A (en)  Image signal processing device
Corcoran et al.  Consumer imaging i–processing pipeline, focus and exposure
JP2006121165A (en)  Imaging apparatus and image forming method
JP2013074368A (en)  Imaging apparatus and imaging method
JP7352745B2 (en)  Image processing device, imaging device, image processing method, and image processing program
HK1179790A (en)  Dual image capture processing
JP2011139270A (en)  Imaging apparatus and program
KR20140145447A (en)  Image processing device and operation method thereof

Legal Events

Date  Code  Title  Description
AS  Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRISEDOUX, LAURENT;PLOWMAN, DAVID;FRIDENTAL, RON;AND OTHERS;SIGNING DATES FROM 20111215 TO 20111221;REEL/FRAME:027435/0256

STCF  Information on status: patent grant

Free format text: PATENTED CASE

AS  Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS  Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS  Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119

AS  Assignment

Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED

Free format text: MERGER;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:047229/0408

Effective date: 20180509

AS  Assignment

Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE EFFECTIVE DATE PREVIOUSLY RECORDED ON REEL 047229 FRAME 0408. ASSIGNOR(S) HEREBY CONFIRMS THE EFFECTIVE DATE IS 09/05/2018;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:047349/0001

Effective date: 20180905

AS  Assignment

Owner name: AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE. LIMITED

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE PATENT NUMBER 9,385,856 TO 9,385,756 PREVIOUSLY RECORDED AT REEL: 47349 FRAME: 001. ASSIGNOR(S) HEREBY CONFIRMS THE MERGER;ASSIGNOR:AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD.;REEL/FRAME:051144/0648

Effective date: 20180905

MAFP  Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP  Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

