CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims the benefit under 35 U.S.C. §119(e) of a U.S. Provisional application filed on Jun. 12, 2014 in the U.S. Patent and Trademark Office and assigned Ser. No. 62/011,311, the entire disclosure of which is hereby incorporated by reference.
TECHNICAL FIELD

The present disclosure relates to an apparatus and method for processing an image. More particularly, the present disclosure relates to an apparatus and method for demosaicing an image using sampled color values of a plurality of images.
BACKGROUND

Mobile terminals are developed to provide wireless communication between users. As technology has advanced, mobile terminals now provide many additional features beyond simple telephone conversation. For example, mobile terminals are now able to provide image and video capture. As a result of the ubiquity of mobile terminals, image and video capture has become increasingly popular. Consequently, various image processing techniques are used to provide a user with an accurate representation of the image intended to be captured.
In order to provide a more accurate representation of the color of the image intended to be captured, a technique commonly referred to as demosaicing may be performed. Demosaicing refers to the digital imaging process used to reconstruct a full color image from color samples output from an image detector.
Accordingly, there is a need for an apparatus and method for providing an improved representation of the image intended to be captured. Further, there is a need for an apparatus and method for demosaicing color values sampled during image capture.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
SUMMARY

Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an apparatus and method for demosaicing sampled color values.
In accordance with an aspect of the present disclosure, a method for demosaicing sampled color values is provided. The method includes storing image information for a plurality of images, selecting an intended viewpoint that is within a composite of image information across the plurality of images, and reconstructing color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint.
In accordance with another aspect of the present disclosure, an apparatus for demosaicing sampled color values is provided. The apparatus includes a storage unit, and at least one processor configured to store image information for a plurality of images, to select an intended viewpoint that is within a composite of image information across the plurality of images, and to reconstruct color information for at least one color using image information from the composite of image information across the plurality of images that corresponds to the intended viewpoint.
In accordance with another aspect of the present disclosure, a method for demosaicing sampled color values is provided. The method includes storing image information for an intended image, determining the subpixels of the intended image for which color information of a particular color is captured, generating an adaptive filter to demosaic the image information for the intended image according to the subpixels of the intended image for which color information of a particular color is captured, and demosaicing the image information for the intended image using the adaptive filter.
In accordance with another aspect of the present disclosure, an apparatus for demosaicing sampled color values is provided. The apparatus includes a storage unit, and at least one processor configured to store image information for an intended image, to determine subpixels of the intended image for which color information of a particular color is captured, to generate an adaptive filter to demosaic the image information for the intended image according to the subpixels of the intended image for which color information of a particular color is captured, and to demosaic the image information for the intended image using the adaptive filter.
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS

The above and other aspects, features, and advantages of various embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates an array of photosites of a photo detector according to the related art;
FIG. 2A illustrates an array of photosites of a photo detector according to the related art;
FIG. 2B illustrates red color values sampled at a plurality of photosites using an image detector having an array of photosites of a photo detector such as, for example, the array of photosites illustrated in FIG. 2A according to the related art;
FIG. 3 illustrates a plurality of images according to an embodiment of the present disclosure;
FIGS. 4A, 4B, 4C, and 4D illustrate a plurality of subpixels according to an embodiment of the present disclosure;
FIG. 5 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure;
FIG. 6 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure;
FIG. 7A illustrates a flowchart of a method of demosaicing according to an embodiment of the present disclosure;
FIG. 7B illustrates a plurality of subpixels according to an embodiment of the present disclosure;
FIG. 7C illustrates a filter according to an embodiment of the present disclosure;
FIG. 7D illustrates a plurality of subpixels according to an embodiment of the present disclosure;
FIG. 7E illustrates a filter according to an embodiment of the present disclosure;
FIG. 8 illustrates pseudo code for a method of demosaicing according to an embodiment of the present disclosure; and
FIG. 9 is a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
DETAILED DESCRIPTION

The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purposes only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
According to various embodiments of the present disclosure, an electronic device may include communication functionality. For example, an electronic device may be a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook PC, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an mp3 player, a mobile medical device, a camera, a wearable device (e.g., a Head-Mounted Device (HMD), electronic clothes, electronic braces, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch), and/or the like.
According to various embodiments of the present disclosure, an electronic device may be a smart home appliance with communication functionality. A smart home appliance may be, for example, a television, a Digital Video Disk (DVD) player, an audio device, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washer, a dryer, an air purifier, a set-top box, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a gaming console, an electronic dictionary, an electronic key, a camcorder, an electronic picture frame, and/or the like.
According to various embodiments of the present disclosure, an electronic device may be a medical device (e.g., a Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, a Computed Tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), an automotive infotainment device, a naval electronic device (e.g., a naval navigation device, gyroscope, or compass), an avionic electronic device, a security device, an industrial or consumer robot, and/or the like.
According to various embodiments of the present disclosure, an electronic device may be furniture, part of a building/structure, an electronic board, an electronic signature receiving device, a projector, various measuring devices (e.g., water, electricity, gas, or electro-magnetic wave measuring devices), and/or the like that include communication functionality.
According to various embodiments of the present disclosure, an electronic device may be any combination of the foregoing devices. In addition, it will be apparent to one having ordinary skill in the art that an electronic device according to various embodiments of the present disclosure is not limited to the foregoing devices.
Various embodiments of the present disclosure include an apparatus and method for demosaicing sampled color values. In addition, various embodiments of the present disclosure include an apparatus and method for determining color values.
An image is represented by a number of areas called pixels. Each pixel is associated with a color that should be substantially reproduced by a set of subpixels in a display. According to the related art, each subpixel displays a primary color. For example, each subpixel according to the related art is associated with some hue and saturation. Other colors may be obtained by mixing primary colors. Each pixel is mapped into a set of one or more subpixels which are to display the color of the pixel.
In some displays, each repeating set of subpixels includes a subpixel for each primary color. The subpixels are small, and are spaced closely together, to provide a desired resolution. This structure is not cost-effective, however, because the structure does not match the resolution of human vision. Humans are more sensitive to luminance differences than to chromatic differences. Therefore, some displays map an input pixel into a subpixel repeating set that does not include only the subpixels of each primary color. The chromatic resolution is reduced, but the luminance resolution remains high. One such display may be an RGBW display.
An image detector has a plurality of photo detectors used to sample an image. Each of the plurality of photo detectors may sample (e.g., capture) a value for a single color. For example, each of the plurality of photo detectors may be configured with a color filter. According to the related art, a Color Filter Array (CFA) or a Color Filter Mosaic (CFM) is an array or mosaic of color filters disposed above the plurality of photo detectors.
Each of the plurality of photo detectors may be located at a photosite of the image detector. The photosite refers to the spatial location at which a color may be sampled by a photo detector. The array or mosaic of color filters may be disposed above the plurality of photo detectors such that each photo detector has a single corresponding color filter. Accordingly, each photosite may have a corresponding sampled value for a single color.
Each of the photosites may be mapped or otherwise correspond to a subpixel of the image (e.g., when displayed on a display). Accordingly, each subpixel of the image may have a corresponding sampled value for a single color. Because each subpixel of the image does not have sampled values for all colors, an image represented by the sampled color values at each subpixel may appear pixelated with a disjointed color representation of the intended image. In other words, the image represented by the sampled color values at each subpixel may be an inaccurate representation of the image intended to be captured.
In order to provide a more accurate representation of the color of the image intended to be captured, a technique commonly referred to as demosaicing may be performed. Demosaicing refers to the digital imaging process used to reconstruct a full color image from sampled color values output from an image detector. Because each photo detector has a spatial footprint in the image detector, the image detector is unable to capture a color value for every color at each respective photosite thereof. The reconstruction of the full color image using color samples output from the respective photo detectors constituting the image detector may use interpolation or other numerical methods to determine values for each color (e.g., red, green, and blue) at each subpixel.
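By way of illustration only, the following Python sketch simulates the sampling constraint described above: a Bayer-style color filter array mask is applied to a full-color image so that each photosite retains a sample for exactly one color. The function names and the specific 2x2 pattern are illustrative assumptions, not part of the disclosed embodiments.

import numpy as np

def bayer_mask(height, width):
    """Build boolean masks for a 2x2 Bayer pattern (G R / B G).

    Each photosite samples exactly one color, so the three masks
    are disjoint and together cover every photosite.
    """
    masks = {c: np.zeros((height, width), dtype=bool) for c in "RGB"}
    masks["G"][0::2, 0::2] = True   # green at even rows, even cols
    masks["R"][0::2, 1::2] = True   # red at even rows, odd cols
    masks["B"][1::2, 0::2] = True   # blue at odd rows, even cols
    masks["G"][1::2, 1::2] = True   # green at odd rows, odd cols
    return masks

def mosaic(image):
    """Sample a full-color HxWx3 image through the CFA.

    Returns an HxW array of raw samples plus the per-color masks;
    the other two color values at each photosite are lost and must
    be reconstructed later by demosaicing.
    """
    h, w, _ = image.shape
    masks = bayer_mask(h, w)
    raw = np.zeros((h, w), dtype=image.dtype)
    for i, c in enumerate("RGB"):
        raw[masks[c]] = image[..., i][masks[c]]
    return raw, masks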
FIG. 1 illustrates an array of photosites of a photo detector according to the related art.
Referring to FIG. 1, an array of photosites 100 comprising red, green, blue, and white photosites is illustrated. As an example, the red, green, blue, and white photosites may be created by placing an RGBW filter over the photo detectors respectively corresponding to the photosites.
The array of photosites 100 may correspond to a plurality of pixels. For example, as illustrated in FIG. 1, the plurality of pixels are distinguished from one another by a solid line. Each of the plurality of pixels may include a plurality of subpixels. For example, each of the plurality of pixels may include four subpixels. The plurality of subpixels in each of the plurality of pixels are distinguished from one another by a dotted line in FIG. 1.
If an image detector comprises an RGBW filter, then the plurality of subpixels constituting a pixel corresponding to the respective groups of photosites of the image detector may include a white subpixel, a red subpixel, a blue subpixel, and a green subpixel. As illustrated in FIG. 1, a white subpixel is denoted by ‘W,’ a red subpixel is denoted by ‘R,’ a blue subpixel is denoted by ‘B,’ and a green subpixel is denoted by ‘G.’
The white subpixels may respectively sample a white color value in response to a request to capture an image. The white subpixels may correspond to a subpixel 105a, a subpixel 110a, a subpixel 115a, a subpixel 120a, a subpixel 125a, a subpixel 130a, a subpixel 135a, a subpixel 140a, and a subpixel 145a.
The red subpixels may respectively sample a red color value in response to a request to capture an image. The red subpixels may correspond to a subpixel 105b, a subpixel 110b, a subpixel 115b, a subpixel 120b, a subpixel 125b, a subpixel 130b, a subpixel 135b, a subpixel 140b, and a subpixel 145b.
The blue subpixels may respectively sample a blue color value in response to a request to capture an image. The blue subpixels may correspond to a subpixel 105c, a subpixel 110c, a subpixel 115c, a subpixel 120c, a subpixel 125c, a subpixel 130c, a subpixel 135c, and a subpixel 140c.
The green subpixels may respectively sample a green color value in response to a request to capture an image. The green subpixels may correspond to a subpixel 105d, a subpixel 110d, a subpixel 115d, a subpixel 120d, a subpixel 125d, a subpixel 130d, a subpixel 135d, a subpixel 140d, and a subpixel 145d.
The image detector may have an integer number of rows of pixels, and an integer number of columns of pixels. The number of rows of pixels and the number of columns of pixels may be equal or may be different.
FIG. 2A illustrates an array of photosites of a photo detector according to the related art.
Referring to FIG. 2A, an array of photosites 200 comprising red, green, and blue photosites is illustrated. As an example, the red, green, and blue photosites may be created by placing a filter (e.g., an RGB filter) over the photo detectors respectively corresponding to the photosites.
Similar to the array of photosites 100 illustrated in FIG. 1, the array of photosites 200 may correspond to a plurality of pixels. For example, as illustrated in FIG. 2A, the plurality of pixels are distinguished from one another by a solid line. Each of the plurality of pixels may include a plurality of subpixels. For example, each of the plurality of pixels may include four subpixels. The plurality of subpixels in each of the plurality of pixels are distinguished from one another by a dotted line in FIG. 2A.
If an image detector comprises such a filter, then the plurality of subpixels corresponding to the respective groups of photosites of the image detector may include a red subpixel, a blue subpixel, and two green subpixels. A pixel may include two green subpixels because an RGB filter provides only three colors, whereas there are certain benefits associated with designing an array of photo detectors, and thus photosites (respectively corresponding to subpixels of the display), in groups of a power of two (e.g., a two-by-two arrangement). The color green is selected as the color to be repeated in the pixel because the human eye has been determined to perceive luminance best through the color green.
As illustrated in FIG. 2A, a red subpixel is denoted by ‘R,’ a blue subpixel is denoted by ‘B,’ and a green subpixel is denoted by ‘G.’
The red photosites corresponding to red subpixels may respectively sample a red color value in response to a request to capture an image. The red subpixels may correspond to a subpixel 205a, a subpixel 210a, a subpixel 215a, a subpixel 220a, a subpixel 225a, a subpixel 230a, a subpixel 235a, a subpixel 240a, and a subpixel 245a.
The green photosites corresponding to green subpixels may respectively sample a green color value in response to a request to capture an image. The green subpixels may correspond to a subpixel 205b, a subpixel 205c, a subpixel 210b, a subpixel 210c, a subpixel 215b, a subpixel 215c, a subpixel 220b, a subpixel 220c, a subpixel 225b, a subpixel 225c, a subpixel 230b, a subpixel 230c, a subpixel 235b, a subpixel 235c, a subpixel 240b, a subpixel 240c, a subpixel 245b, and a subpixel 245c.
The blue photosites corresponding to blue subpixels may respectively sample a blue color value in response to a request to capture an image. The blue subpixels may correspond to a subpixel 205d, a subpixel 210d, a subpixel 215d, a subpixel 220d, a subpixel 225d, a subpixel 230d, a subpixel 235d, a subpixel 240d, and a subpixel 245d.
FIG. 2B illustrates red color values sampled at a plurality of photosites using an image detector having an array of photosites of a photo detector such as, for example, the array of photosites illustrated in FIG. 2A according to the related art.
Referring to FIG. 2B, red color values are sampled at the plurality of subpixels corresponding to the plurality of photosites of the array of photosites 200 at which a red color filter is overlaid with the corresponding photo detector. For example, if an image is captured using a photo detector corresponding to the array of pixels illustrated in FIG. 2A, then a red color value may be sampled at photosites respectively corresponding to a subpixel 205a, a subpixel 210a, a subpixel 215a, a subpixel 220a, a subpixel 225a, a subpixel 230a, a subpixel 235a, a subpixel 240a, and a subpixel 245a.
As illustrated in FIG. 2B, a red color value is sampled at only one subpixel for each pixel. Therefore, as discussed above, in order to provide a more accurate color representation of the image intended to be captured, the red color values at the subpixels for which a red color value is not sampled may be estimated. For example, the red values at the subpixels for which a red color value is not sampled may be estimated using nearby sampled color values. For instance, the red color value of the subpixel 205d may be estimated using the red color values sampled at a subpixel 205a, a subpixel 210a, a subpixel 215a, a subpixel 220a, and/or the like. As discussed above, such a process may be referred to as demosaicing.
Conversely, because the subpixel 205a has only a sampled red color value associated therewith, a blue color value and a green color value may be estimated at the subpixel 205a respectively using sampled blue color values and sampled green color values at surrounding subpixels.
According to the related art, demosaicing methods have included, for example, pixel replication, bilinear interpolation, and median interpolation. In pixel replication, each missing value is taken from the neighbor to the left, above, or diagonally above and to the left, whichever is nearest. Bilinear interpolation offers some improvement over pixel replication with a moderate increase in complexity. In the bilinear interpolation method, each missing value is calculated based on an average of the neighboring pixel values, horizontally, vertically, and/or diagonally. Median interpolation, which is a nonlinear interpolation method, offers the best results among these three algorithms (pixel replication, bilinear interpolation, and median interpolation), especially when there are defective pixels, but has the maximum complexity. Median interpolation has two steps. First, missing values having four diagonal neighbors are interpolated using the median of those four values. Second, the remaining missing pixels are interpolated by the median of their north, south, east, and west neighbors.
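By way of illustration only, the related-art bilinear and median interpolation methods described above may be sketched in Python as follows, assuming a single-channel array in which unsampled subpixels are marked with NaN; the function names are illustrative.

import numpy as np

def bilinear_fill(channel):
    """Related-art bilinear interpolation: each missing sample (NaN)
    becomes the mean of its available horizontal, vertical, and
    diagonal neighbors."""
    h, w = channel.shape
    out = channel.copy()
    for y, x in zip(*np.where(np.isnan(channel))):
        window = channel[max(0, y - 1):y + 2, max(0, x - 1):x + 2]
        vals = window[~np.isnan(window)]
        if vals.size:
            out[y, x] = vals.mean()
    return out

def median_fill(channel):
    """Related-art median interpolation, in the two steps described
    above: first the median of four diagonal neighbors, then the
    median of the north/south/east/west neighbors."""
    out = channel.copy()
    h, w = out.shape
    for offsets, need_all in (
        ([(-1, -1), (-1, 1), (1, -1), (1, 1)], True),   # step 1: diagonals
        ([(-1, 0), (1, 0), (0, -1), (0, 1)], False),    # step 2: N/S/E/W
    ):
        filled = out.copy()
        for y, x in zip(*np.where(np.isnan(out))):
            vals = [out[y + dy, x + dx] for dy, dx in offsets
                    if 0 <= y + dy < h and 0 <= x + dx < w
                    and not np.isnan(out[y + dy, x + dx])]
            if vals and (not need_all or len(vals) == len(offsets)):
                filled[y, x] = np.median(vals)
        out = filled
    return out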
The demosaicing methods according to the related art are limited to the use of sampled color values from a single image. In other words, the demosaicing methods according to the related art are limited to the use of sampled color values from a single captured image to estimate the color values at subpixels for which a color value has not been sampled in order to reconstruct an accurate color representation of the image intended to be captured.
According to various embodiments of the present disclosure, an adaptive method of demosaicing is provided. The adaptive method of demosaicing may be used when there is a variable amount of information available. For example, an adaptive filter is used to interpolate color values from available surrounding color values. In the simplest case, the adaptive method of demosaicing will produce the same results as demosaicing methods according to the related art. However, if more sampled color values are available (e.g., through a plurality of captured images), then the adaptive method of demosaicing uses the adaptive filter to take advantage of the larger number of sampled color values.
According to various embodiments of the present disclosure, a plurality of images may be used as a source for sampled color values for at least a subset of subpixels of an intended image. For example, a plurality of images that capture a same viewpoint may comprise different color values sampled at a same spatial location (e.g., respectively corresponding to an applicable subpixel of an image). According to various embodiments of the present disclosure, a particular spatial location of the viewpoint may have more than one sampled color value for a particular color (e.g., a spatial location of the viewpoint may have two sampled red color values each from a different image) and/or may have more than one sampled color value for different colors (e.g., a spatial location of the viewpoint may have a sampled red color value, a sampled blue color value, a sampled green color value, and/or the like).
According to various embodiments of the present disclosure, a color representation of an image (e.g., of a particular viewpoint) may be reconstructed (e.g., demosaiced) using sampled color values from a plurality of images that substantially capture a same viewpoint. For example, the plurality of images may have at least partially overlapping spatial coverage of the viewpoint. The plurality of images that substantially capture the same viewpoint may be contemporaneously captured by at least one image detector. Alternatively, the plurality of images that substantially capture the same viewpoint may be captured by different image detectors (e.g., image detectors disposed in different electronic devices, distinct image detectors disposed in a same electronic device, and/or the like).
According to various embodiments of the present disclosure, an electronic device may be configured to contemporaneously capture one or more images of substantially a same viewpoint. For example, an electronic device may include one or more cameras (e.g., image detectors) each of which captures an image. Each of the one or more cameras may contemporaneously capture substantially a same viewpoint, thereby capturing one or more images of the viewpoint.
According to various embodiments of the present disclosure, the electronic device may include a camera that is configured to contemporaneously capture substantially a same viewpoint. For example, a camera may be configured to capture a viewpoint using a burst image capture feature. The burst image capture feature may correspond to a feature according to which the camera rapidly captures a series of images of substantially a same viewpoint. A user may engage a burst image capture feature by holding down an image capture button on an electronic device. The number of images captured during the burst image capture feature may be related to the length of time that a user holds the image capture button. As another example, a camera may contemporaneously capture substantially a same viewpoint in response to a series of distinct commands to capture an image. The user may press (e.g., tap) an image capture button more than once to capture substantially a same viewpoint. As another example, an electronic device may capture a plurality of images that contemporaneously capture substantially a same viewpoint when a camera application is executed. For example, as the camera is operated to capture an image, the electronic device may display an image to be captured on a display screen (e.g., in a viewfinder). The electronic device may store a series of images corresponding to the image displayed on the display screen in a buffer.
According to various embodiments of the present disclosure, the electronic device may determine whether a plurality of images correspond to a plurality of images of substantially a same viewpoint. For example, the electronic device may perform an analysis to determine whether at least a portion of two or more images of the plurality of images include substantially a same viewpoint (e.g., within a preset and/or configurable threshold statistical relevancy, and/or the like). The electronic device may determine whether two or more of a plurality of images stored on the electronic device correspond to contemporaneous captures using metadata associated with the electronic files of the two or more of the plurality of images (e.g., a date created field, a date modified field, and/or the like).
According to various embodiments of the present disclosure, two or more of the plurality of images that substantially capture the same viewpoint may be spatially offset from each other. As an example, a spatial offset between any two of the plurality of images may be an integer multiple of a pixel and/or a subpixel (e.g., two of the plurality of images may be translated so as to be offset by 8 subpixels, or the like). As another example, a spatial offset between any two of the plurality of images may be a fractional multiple of a pixel and/or a subpixel (e.g., two of the plurality of images may be translated so as to be offset by 2.5 subpixels, or the like).
According to various embodiments of the present disclosure, a spatial offset between any two of the plurality of images may be caused by human error when the electronic device is used to capture the plurality of images. For example, human error may be introduced into the capture of the plurality of images in the form of a shaking or vibration of the electronic device during image capture (e.g., a user is generally unable to hold an electronic device completely motionless).
According to various embodiments of the present disclosure, a spatial offset may be generated between any two of the plurality of images by introducing a vibration and/or motion of the image detector (e.g., the electronic device) during image capture. For example, the electronic device may include a vibration unit (e.g., a motor, or the like) to introduce motion of the electronic device across the image capture of any two or more of the plurality of images. An instance in which a generated vibration across the image capture of any two or more of the plurality of images may be beneficial is when the electronic device (e.g., the camera) is positioned in a holding unit (e.g., a tripod) during image capture.
According to various embodiments of the present disclosure, motion of the camera (e.g., the electronic device) during image capture across any two or more of the plurality of images may increase the sampling of any one or more colors for at least a portion of the intended viewpoint (e.g., the viewpoint of the image to be captured). According to various embodiments of the present disclosure, motion may be introduced to the camera (e.g., the electronic device) during image capture to enhance the sampling of color values across the intended viewpoint. For example, the sampling of any single color may be enhanced by a motion of the camera during image capture (e.g., a motion across any two or more of the plurality of images) because a single color may be sampled at a greater number of spatial locations of the viewpoint. The increased sampling of color values for any one or more colors reduces the need to rely on estimated (e.g., interpolated) color values at the corresponding spatial locations of the viewpoint, and thus reduces the need for interpolation (e.g., demosaicing) at those spatial locations. A sampled color value of a color at a spatial location of the viewpoint may generally be considered a more accurate representation of the color of the viewpoint at that spatial location than an estimated (e.g., interpolated) color value of the same.
According to various embodiments of the present disclosure, the plurality of images that substantially capture the same viewpoint may be normalized. For example, image processing may be performed on at least a subset of the plurality of images that substantially capture the same viewpoint in order to align the plurality of images. The plurality of images that substantially capture the same viewpoint may be processed to align the plurality of images on a common plane (e.g., the plurality of images is processed to be co-planar). In other words, the plurality of images that substantially capture the same viewpoint may be processed to be co-planar with at least a portion of each of the plurality of images overlapping with an intended viewpoint (e.g., the viewpoint of the image to be captured).
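By way of illustration only, the alignment step described above may be sketched in Python as follows for the simplified case of a known integer translation between an image and the reference viewpoint; estimating the offset itself (image registration) and the full co-planar normalization are assumed to have been performed separately, and the names are illustrative.

import numpy as np

def align_to_reference(sample_map, offset, reference_shape):
    """Place one image's sample map (NaN = unsampled) into the
    reference viewpoint's coordinate frame.

    The images are assumed to already be co-planar; `offset` is the
    remaining integer (dy, dx) translation of this image relative to
    the reference, which in practice would come from an
    image-registration step (not shown here).
    """
    h, w = reference_shape
    aligned = np.full((h, w), np.nan)
    dy, dx = offset
    ys, xs = np.where(~np.isnan(sample_map))
    ry, rx = ys + dy, xs + dx                      # reference-frame coordinates
    keep = (ry >= 0) & (ry < h) & (rx >= 0) & (rx < w)
    aligned[ry[keep], rx[keep]] = sample_map[ys[keep], xs[keep]]
    return aligned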
FIG. 3 illustrates a plurality of images according to an embodiment of the present disclosure.
Referring to FIG. 3, a first image 310 may capture a first viewpoint 315, a second image 320 may capture a second viewpoint 325, and a third image 330 may capture a third viewpoint 335.
According to various embodiments of the present disclosure, the first viewpoint 315, the second viewpoint 325, and the third viewpoint 335 may each relate to an intended viewpoint (e.g., the viewpoint of the image to be captured). For example, each of the first viewpoint 315, the second viewpoint 325, and the third viewpoint 335 may at least partially overlap with an intended viewpoint (e.g., the viewpoint of the image to be captured). In other words, the first image 310, the second image 320, and the third image 330 may capture substantially a same viewpoint.
Any two or more of the first image 310, the second image 320, and the third image 330 may be co-planar. The first image 310, the second image 320, and the third image 330 may be normalized so as to be co-planar (e.g., to be co-planar representations of substantially the same viewpoint).
According to various embodiments of the present disclosure, the electronic device may select one of the first image 310, the second image 320, and the third image 330 to use as a reference viewpoint and/or to which the remaining ones of the first image 310, the second image 320, and the third image 330 are to be normalized so as to be co-planar with the selected one of the first image 310, the second image 320, and the third image 330.
According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to user input.
According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to a determination as to which of the first image 310, the second image 320, and the third image 330 was captured in the middle based on a time at which the images were captured. The image captured in the middle based on the time at which the images were captured may be assumed to be a most accurate image capture of the intended viewpoint based on an assumption that the plurality of images are centered around the median.
According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to a determination as to which of the first image 310, the second image 320, and the third image 330 was captured first based on a time at which the images were captured.
According to various embodiments of the present disclosure, the electronic device may select the one of the first image 310, the second image 320, and the third image 330 according to a determination as to which of the first image 310, the second image 320, and the third image 330 has a greater amount of spatial overlap with the remaining ones of the first image 310, the second image 320, and the third image 330 (e.g., based on the number of the plurality of images that at least partially overlap with a given image, or based on the amount of aggregate spatial overlap of a given image with the remainder of the images). As illustrated in FIG. 3, the first image 310 has the greatest number of overlapping images. The second image 320 and the third image 330 at least partially spatially overlap with the first image 310. In contrast, only the first image 310 at least partially spatially overlaps with the second image 320, and only the first image 310 at least partially spatially overlaps with the third image 330. Similarly, the first image 310 has the greatest amount of aggregate spatial overlap with the remaining images (e.g., the second image 320 and the third image 330). The aggregate spatial overlap of the first image 310 is the sum of (i) the spatial overlap of the second image 320 with the first image 310, and (ii) the spatial overlap of the third image 330 with the first image 310.
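By way of illustration only, the aggregate-overlap selection rule described above may be sketched as follows, modeling each viewpoint as an axis-aligned rectangle; an actual implementation would operate on the registered image footprints, and the helper names are illustrative.

def select_reference(rects):
    """Pick the reference image as the one with the greatest aggregate
    spatial overlap with the remaining images.

    Each viewpoint is modeled as a rectangle (x0, y0, x1, y1).
    """
    def overlap_area(a, b):
        w = min(a[2], b[2]) - max(a[0], b[0])
        h = min(a[3], b[3]) - max(a[1], b[1])
        return max(0, w) * max(0, h)

    scores = [sum(overlap_area(r, other)
                  for j, other in enumerate(rects) if j != i)
              for i, r in enumerate(rects)]
    return scores.index(max(scores))   # index of the selected reference image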
According to various embodiments of the present disclosure, the selected one of the first image 310, the second image 320, and the third image 330 may be used as a spatial reference. The spatial locations corresponding to the subpixels of the selected one of the first image 310, the second image 320, and the third image 330 may be used as a reference for associating sampled color values of one or more colors (or image characteristics) to a spatial location of the intended viewpoint. For example, the spatial locations corresponding to the subpixels of the selected one of the first image 310, the second image 320, and the third image 330 may be used as a reference for forming a composite of the sampled color values of the intended image.
According to various embodiments of the present disclosure, the spatial location of the viewpoints of the non-selected images (e.g., the second image 320 and the third image 330) of the plurality of images may be normalized against the selected one of the plurality of images (e.g., the first image 310). For example, the spatial location of the sampled color values of the non-selected images of the plurality of images (e.g., the second image 320 and the third image 330) may be aligned with the spatial location of the sampled color values of the selected one of the plurality of images (e.g., the first image 310).
According to various embodiments of the present disclosure, if the subpixels of the non-selected images (e.g., the second image 320 and the third image 330) do not align with the subpixels of the selected one of the plurality of images (e.g., the first image 310) when the spatial location of the viewpoints of the non-selected images are aligned with the viewpoint of the selected one of the plurality of images, then the electronic device may determine a sampled color value that is to be associated with at least one spatial location of a corresponding at least one subpixel of the selected image. The electronic device may determine the sampled color value that is to be associated with at least one spatial location of a corresponding at least one subpixel of the selected image using at least the sampled color value from a corresponding subpixel of the non-selected image for which subpixels thereof do not align with the subpixels of the selected image. For example, the electronic device may estimate the color value that is to be associated with at least one spatial location of a corresponding at least one subpixel of the selected image by interpolation (or another numerical method) using sampled color values of the non-selected image for which subpixels thereof do not align with the subpixels of the selected image.
According to various embodiments of the present disclosure, if the subpixels of the non-selected images (e.g., the second image 320 and the third image 330) do not align with the subpixels of the selected one of the plurality of images (e.g., the first image 310) when the spatial location of the viewpoints of the non-selected images (e.g., the second image 320 and the third image 330) are aligned with the viewpoint of the selected one of the plurality of images (e.g., the first image 310), then the electronic device may align the non-selected images with the selected image according to the nearest subpixel alignment (e.g., such that the spatial location of the viewpoint may be slightly unaligned).
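By way of illustration only, one reading of the nearest-subpixel alignment described above is sketched below: a fractional offset is rounded to the nearest integer subpixel, leaving a slight residual misalignment of the viewpoint. The function name is illustrative.

def nearest_subpixel_offset(offset):
    """Round a fractional (dy, dx) offset, in subpixel units, to the
    nearest integer subpixel alignment; the residual indicates how
    far the viewpoint remains slightly unaligned."""
    dy, dx = offset
    iy, ix = round(dy), round(dx)
    residual = (dy - iy, dx - ix)   # remaining sub-subpixel misalignment
    return (iy, ix), residual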
FIGS. 4A,4B,4C, and4D illustrate a plurality of subpixels according to an embodiment of the present disclosure.
Referring to FIGS. 4A, 4B, 4C, and 4D, sampling of red color values from different images is described in an example for which the subpixels of the respective images are assumed to be aligned with one another. Further, FIGS. 4A, 4B, and 4C illustrate sampled red color values from a plurality of images that are at least partially overlapped with one another such as, for example, the first image 310, the second image 320, and the third image 330 illustrated in FIG. 3. The intended viewpoint (e.g., spatial coverage of the image) is assumed to be the viewpoint of the first image 310 illustrated in FIG. 3.
Referring to FIG. 4A, an array of subpixels 400 is illustrated. In particular, the subpixels for which red is sampled in the capture of the first image 310 are illustrated. During capture of the first image 310, red is sampled at a subpixel 405a, a subpixel 410a, a subpixel 415a, a subpixel 420a, a subpixel 425a, a subpixel 430a, a subpixel 435a, a subpixel 440a, and a subpixel 445a.
Referring to FIG. 4B, an array of subpixels 400 is illustrated. In particular, the subpixels of the second image 320 that overlap with the intended viewpoint and for which red is sampled in the capture of the second image 320 are illustrated. During capture of the second image 320, red is sampled at a subpixel 425b, a subpixel 430b, a subpixel 440b, and a subpixel 445b.
Referring to FIG. 4C, an array of subpixels 400 is illustrated. In particular, the subpixels of the third image 330 that overlap with the intended viewpoint and for which red is sampled in the capture of the third image 330 are illustrated. During capture of the third image 330, red is sampled at a subpixel 420c, a subpixel 425c, a subpixel 435c, and a subpixel 440c.
Referring to FIG. 4D, an array of subpixels 400 corresponding to the intended image or viewpoint is illustrated. According to various embodiments of the present disclosure, a mapping of sampled color values for at least one color to the intended image or viewpoint is generated. According to various embodiments of the present disclosure, an image may be demosaiced using a composite of sampled color values from a plurality of images.
According to the related art, an image or viewpoint corresponding to the array of subpixels 400 would be demosaiced using the array of subpixels 400 illustrated in FIG. 4A. In other words, the image or viewpoint corresponding to the array of subpixels 400 would be demosaiced using sampled color values from a single captured image. In contrast, according to various embodiments of the present disclosure, the intended image or viewpoint corresponding to the array of subpixels 400 may be demosaiced (e.g., the color may be reconstructed) taking into account the sampled color values at applicable subpixels for a plurality of images that substantially capture a same viewpoint. As illustrated in FIG. 4D, an intended image or viewpoint may be reconstructed using the composite sampled color values from the first image 310, the second image 320, and the third image 330 that overlap with the intended image or viewpoint. The composite sampled red color values for the array of subpixels 400 include 17 sampled red color values. In contrast, as illustrated in FIG. 4A, the sampled red color values used to demosaic the array of subpixels 400 according to the related art would have merely 9 sampled red color values.
The composite sampled red color values for the array of subpixels 400 comprise the sampled red color values from the first image 310, the second image 320, and the third image 330 respectively at a subpixel 405a, a subpixel 410a, a subpixel 415a, a subpixel 420a, a subpixel 420c, a subpixel 425a, a subpixel 425b, a subpixel 425c, a subpixel 430a, a subpixel 430b, a subpixel 435a, a subpixel 435c, a subpixel 440a, a subpixel 440b, a subpixel 440c, a subpixel 445a, and a subpixel 445b.
Comparing the array of subpixels 400 illustrated in FIG. 4D to the array of subpixels 400 illustrated in FIG. 4A, the array of subpixels 400 illustrated in FIG. 4B, or the array of subpixels 400 illustrated in FIG. 4C, the composite of sampled red color values of FIG. 4D results in less of a need to estimate red color values. The composite of sampled red color values illustrated in FIG. 4D reduces the number of subpixels for which a red color value may require estimation (e.g., the subpixels which do not have a sampled red color value from the portions of the first image 310, the second image 320, and the third image 330 that overlap with the intended image or viewpoint).
According to various embodiments of the present disclosure, if two or more images of the plurality of images that substantially capture a same viewpoint have a sampled color value for a given color at a subpixel of the intended image or viewpoint, then the color value associated with the particular subpixel of the intended image or viewpoint for the composite sampled color values may be an average of the various sampled color values for the particular subpixel from the two or more images. According to various embodiments of the present disclosure, the average of the various sampled color values for the particular subpixel from the two or more images may be a weighted average. For example, the average of the various sampled color values may be weighted according to an extent to which an image associated with a respective sampled color value is processed to normalize or otherwise align the image with a reference plane or image. As another example, the average of the various sampled color values may be weighted according to the extent to which the subpixels of the corresponding image aligned with the reference image when the spatial locations of the image are aligned with the intended image or viewpoint.
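By way of illustration only, the averaging of duplicate samples described above may be sketched as follows; the optional weights, reflecting for example how heavily each source image was resampled during normalization, are an illustrative assumption, as is the function name.

import numpy as np

def fuse_duplicate_samples(values, weights=None):
    """Combine several sampled values of one color at the same
    subpixel of the intended image into a single composite value.

    With no weights this reduces to a plain average; with weights it
    is the weighted average described above (e.g., heavily-resampled
    source images trusted less).
    """
    values = np.asarray(values, dtype=float)
    if weights is None:
        return float(values.mean())
    weights = np.asarray(weights, dtype=float)
    return float((values * weights).sum() / weights.sum())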
According to various embodiments of the present disclosure, the electronic device may determine the subpixels of the intended image or viewpoint for which a particular color is sampled across the plurality of images that substantially capture the same viewpoint. For example, the electronic device may determine the subpixels of the intended image or viewpoint for which the composite sampled color value of a particular color has a sampled value. The electronic device may use a sum buffer to determine the subpixels of the intended image or viewpoint for which a particular color is sampled across the plurality of images that substantially capture the same viewpoint. The sum buffer may have a field (e.g., a bit) associated with each subpixel of the intended image.
According to various embodiments of the present disclosure, the field associated with a particular subpixel of the intended image may be incremented for each sampled color value of the particular color across the plurality of images that substantially capture the same viewpoint. For example, the field associated with a particular subpixel of the intended image may be incremented so as to reflect the aggregate number of sampled color values for the particular color that exist at the particular subpixel across the plurality of images that substantially capture the same viewpoint.
According to various embodiments of the present disclosure, the field associated with a particular subpixel of the intended image may be a binary representation indicating whether any of the plurality of images that substantially capture the same viewpoint include a sampled color value of the particular color for the particular subpixel. For example, the field associated with a particular subpixel of the intended image may be set to one if the composite sampled color values of the particular color (e.g., or mapping thereof) has a sampled color value of the particular color.
According to various embodiments of the present disclosure, the electronic device may determine the subpixels of the intended image or viewpoint for which a particular color value is missing (e.g., the subpixels for which a value may be estimated using surrounding sampled color values of the particular color from, for example, the composite sampled color values of the particular color).
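By way of illustration only, the sum buffer described above may be sketched in Python as follows, assuming per-image sample maps that have already been aligned to the intended image and that use NaN for unsampled subpixels; the function name is illustrative.

import numpy as np

def build_sum_buffer(aligned_maps, binary=False):
    """Build the sum buffer described above for one color.

    Each entry counts how many of the aligned images sampled that
    color at the corresponding subpixel of the intended image; with
    binary=True the buffer instead just flags whether any sample
    exists.  Zero entries mark the subpixels whose color value must
    be estimated.
    """
    counts = sum((~np.isnan(m)).astype(int) for m in aligned_maps)
    return (counts > 0).astype(int) if binary else counts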
FIG. 5 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure.
Referring to FIG. 5, at operation 510, the electronic device captures a plurality of images that substantially capture the same viewpoint. As an alternative, the electronic device may receive one or more of the plurality of images that substantially capture the same viewpoint from another electronic device (e.g., a counterpart electronic device). The electronic device may store the plurality of images (e.g., in a storage unit, a frame buffer, and/or the like).
At operation 520, the electronic device aligns the corresponding images. For example, the electronic device aligns the plurality of images that substantially capture the same viewpoint so as to align the spatial location of the plurality of images. The electronic device may process one or more of the plurality of images to make the plurality of images co-planar before aligning the spatial location of the plurality of images.
At operation 530, the electronic device determines a coverage of each color for subpixels in the intended image. For example, the electronic device may use a sum buffer to determine which of the subpixels of the intended image have a sampled color value of a particular color associated therewith. In other words, the electronic device may use a sum buffer to determine the subpixels of the intended image or viewpoint for which a particular color is sampled across the plurality of images used to reconstruct the color representation of the intended image. Alternatively or additionally, the electronic device may determine which of the subpixels of the intended image do not have a sampled color value of a particular color associated therewith. The electronic device may use a sum buffer to determine the subpixels of the intended image or viewpoint for which a particular color has not been sampled across the plurality of images used to reconstruct the color representation of the intended image.
At operation 540, the electronic device reconstructs a color representation of the intended image. For example, the electronic device may estimate a color value for the subpixels of the intended image or viewpoint for which a particular color has not been sampled across the plurality of images used to reconstruct the color representation of the intended image. The electronic device may estimate the color value for a particular color for such subpixels using at least one sampled color value of the particular color from at least one of the plurality of images. The electronic device may interpolate color values for the subpixels that do not have a sampled color value across the plurality of images using at least one of the sampled color values from at least one of the plurality of images.
According to various embodiments of the present disclosure, the electronic device may allocate a weighting to each of the sampled color values across the plurality of images for the estimation of the color value. For example, the electronic device may weigh the sampled color values for a particular color at subpixels spatially closer to the subpixel for which the color value is being estimated more heavily than the sampled color values for the particular color at subpixels relatively more distant from the subpixel for which the color value is being estimated.
Referring back to FIG. 4D, the subpixel 445d does not have a sampled red color value across the plurality of images used to reconstruct the color representation of the array of subpixels 400. According to various embodiments of the present disclosure, the electronic device may estimate a red color value for the subpixel 445d using at least one or more of the sampled red color values (e.g., the sampled red color values at a subpixel 405a, a subpixel 410a, a subpixel 415a, a subpixel 420a, a subpixel 420c, a subpixel 425a, a subpixel 425b, a subpixel 425c, a subpixel 430a, a subpixel 430b, a subpixel 435a, a subpixel 435c, a subpixel 440a, a subpixel 440b, a subpixel 440c, a subpixel 445a, and a subpixel 445b). As discussed above, the electronic device may weigh the sampled red color values differently in order to estimate the color value at the subpixel 445d. For example, the electronic device may associate a higher weighting to the sampled red color values at the subpixel 445a and the subpixel 445b than to the sampled red color value at the subpixel 405a.
According to various embodiments of the present disclosure, the electronic device may store a filter used for determining an estimated value at a subpixel that does not have a sampled color value of a particular color across the plurality of images used to reconstruct the color representation of the intended image. The filter may be a filter table. The electronic device may generate coefficients in the filter table. The coefficients may correspond to the respective weighting to be associated with the color values (e.g., the sampled color values) for the subpixels surrounding the subpixel for which a color value is to be estimated. The coefficients may be normalized such that the sum of all the coefficients in the filter table is equal to one. The electronic device may use the filter (e.g., the filter table) to calculate a weighted average of the nearby surrounding sampled color values of the particular color that are present in order to estimate the color value for the particular color.
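By way of illustration only, the filter table described above may be sketched as follows; renormalizing the coefficients over only the neighbors that are present is one plausible reading of the scheme, and the example coefficient values and names are illustrative.

import numpy as np

def apply_filter_table(samples, y, x, filter_table):
    """Estimate a missing color value at (y, x) as the weighted
    average of nearby sampled values, using a coefficient table
    whose entries sum to one.

    Because some neighbors may themselves be unsampled (NaN), the
    coefficients of the neighbors that are present are renormalized
    so that the effective weights still sum to one.
    """
    r = filter_table.shape[0] // 2         # table radius
    h, w = samples.shape
    acc, weight = 0.0, 0.0
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not np.isnan(samples[ny, nx]):
                c = filter_table[dy + r, dx + r]
                acc += c * samples[ny, nx]
                weight += c
    return acc / weight if weight > 0 else np.nan

# Example 3x3 table: closer neighbors weighted more heavily,
# coefficients normalized so they sum to one (illustrative values).
example_table = np.array([[1, 2, 1],
                          [2, 4, 2],
                          [1, 2, 1]], dtype=float)
example_table /= example_table.sum()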
According to various embodiments of the present disclosure, the electronic device may determine (e.g., estimate) a color value of a particular color for every subpixel of the intended image for which the particular color has not been sampled across the plurality of images used to reconstruct the color representation of the intended image. According to various embodiments of the present disclosure, the electronic device may make this determination for every color (e.g., red, green, and blue).
At operation 550, the electronic device may render the image. The electronic device may render the image using at least one or more of the sampled color values across the plurality of images used to reconstruct the image and using the estimated color values (e.g., the color values determined at operation 540).
FIG. 6 illustrates a flowchart of a method of determining color values according to an embodiment of the present disclosure.
Referring to FIG. 6, at operation 605, the electronic device may capture a plurality of images that substantially capture the same viewpoint. As an alternative, the electronic device may receive one or more of the plurality of images that substantially capture the same viewpoint from another electronic device (e.g., a counterpart electronic device). The electronic device may store the plurality of images (e.g., in a storage unit, a frame buffer, and/or the like).
At operation 610, the electronic device may determine whether a subpixel has a sampled color value for a particular color across the plurality of images used to reconstruct the intended image.
If the electronic device determines that the subpixel has a sampled color value for the particular color across the plurality of images at operation 610, then the electronic device may proceed to operation 615 at which the electronic device determines whether the subpixel has more than one sampled color value for the particular color across the plurality of images. For example, the electronic device may determine whether more than one of the plurality of images has a sampled color value for the particular color.
If the electronic device determines that the subpixel does not have more than one sampled color value for the particular color across the plurality of images at operation 615, then the electronic device may proceed to operation 620 at which the electronic device may determine the color value to be the sampled color value from one of the plurality of images. Thereafter, the electronic device may proceed to operation 625 at which the electronic device stores the corresponding sampled color value for the particular color as the color value for the particular color at the subpixel. Thereafter, the electronic device may proceed to operation 650.
In contrast, if the electronic device determines that the subpixel has more than one sampled color value for the particular color across the plurality of images at operation 615, then the electronic device may proceed to operation 630 at which the electronic device determines an average value of the sampled color values at the subpixel for the particular color across the plurality of images. Thereafter, the electronic device may proceed to operation 635 at which the electronic device may store the average value of the particular color as the color value for the particular color at the subpixel. Thereafter, the electronic device may proceed to operation 650.
In contrast, if the electronic device determines that the subpixel does not have a sampled color value of the particular color across the plurality of images at operation 610, then the electronic device may proceed to operation 640, at which the electronic device may determine the color value of the particular color at the subpixel. The electronic device may determine the color value by estimating a color value using sampled color values of the particular color at other subpixels (e.g., surrounding subpixels) across the plurality of images. The electronic device may determine the color value of the particular color by interpolation using sampled color values of the particular color at other subpixels of the intended image across the plurality of images. The electronic device may weight the sampled color values of the particular color at one or more subpixels across the plurality of images more heavily than the sampled color values of the particular color at other subpixels across the plurality of images. Thereafter, the electronic device may store the determined color value of the particular color at the subpixel. Thereafter, the electronic device may proceed to operation 650, as sketched below.
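As a non-limiting example, the selection among operations 620, 630, and 640 may be sketched as follows; the function name and the per-subpixel list representation are illustrative assumptions:

def color_value_at(subpixel_samples):
    # subpixel_samples: the sampled values of the particular color at this
    # subpixel, one entry per image that captured it (possibly empty)
    if len(subpixel_samples) == 1:
        # operations 620/625: a single image sampled this subpixel
        return subpixel_samples[0]
    if len(subpixel_samples) > 1:
        # operations 630/635: average the sampled values across the images
        return sum(subpixel_samples) / len(subpixel_samples)
    # operation 640: no sampled value exists; estimation from surrounding
    # subpixels (e.g., interpolation) is required
    return None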
At operation650, the electronic device may render the image using the corresponding color values for every color at the subpixels of the intended image. According to various embodiments of the present disclosure, the electronic device may perform further image processing on the mapping of color values to subpixels of the intended image.
FIG. 7A illustrates a flowchart of a method of demosaicing according to an embodiment of the present disclosure.
Referring to FIG. 7A, at operation 710, an electronic device generates a sum buffer. According to various embodiments of the present disclosure, the electronic device may count, for each subpixel of the intended image, the number of sampled color values for a particular color across the plurality of images. According to various embodiments of the present disclosure, the electronic device may populate the sum buffer with an indication as to which subpixels of the intended image have at least one corresponding sampled color value for a particular color across the plurality of images and/or an indication as to which subpixels of the intended image do not have at least one corresponding sampled color value for the particular color across the plurality of images.
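For illustration only, the generation of the sum buffer at operation 710 may be sketched as follows; the per-image mask representation and the function name are illustrative assumptions:

import numpy as np

def build_sum_buffer(sample_masks):
    # sample_masks: a list of 2D arrays, one per image, each holding 1 where
    # the particular color was sampled at that subpixel and 0 otherwise.
    # The element-wise sum counts, per subpixel, how many sampled color values
    # exist across the plurality of images; a zero entry marks a subpixel
    # whose color value must be reconstructed.
    return np.sum(sample_masks, axis=0)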
At operation 720, the electronic device may generate a filter. The electronic device may generate a filter in the form of a filter table. The filter may include coefficients corresponding to a weighting to be applied to at least a subset of the subpixels of the intended image. The coefficients in the filter may be normalized such that the sum of the coefficients is equal to one. According to various embodiments of the present disclosure, the electronic device may generate the filter for each subpixel of the intended image that does not have a corresponding sampled color value of a particular color across the plurality of images. According to various embodiments of the present disclosure, the electronic device may generate the filter for all subpixels of the intended image.
FIG. 7B illustrates a plurality of subpixels according to an embodiment of the present disclosure.
Referring to FIG. 7B, an array of subpixels of at least an area of an intended image is illustrated. The array of subpixels corresponds to a composite of sampled red color values across a plurality of images to be used to reconstruct a color representation of the intended image. The subpixels for which a red value has been captured (e.g., sampled) across the plurality of images are denoted by ‘R’. Conversely, the subpixels for which no red value has been captured (e.g., sampled) across the plurality of images are denoted by ‘0’.
According to various embodiments of the present disclosure, the electronic device may generate a filter using the array of subpixels illustrated in FIG. 7B. The electronic device may populate or otherwise generate a sum buffer to determine the subpixels for which no red value has been captured across the plurality of images. As an example, the sum buffer may be populated with ‘1’ for fields corresponding to the subpixels for which a red color value has been captured (e.g., the subpixels having a value ‘R’ in the array of subpixels). As another example, the sum buffer may be populated with ‘0’ for fields corresponding to the subpixels for which a red value has not been captured (e.g., the subpixels having a value ‘0’ in the array of subpixels). According to various embodiments of the present disclosure, the sum buffer may be used to identify the subpixels for which a color value needs to be reconstructed (e.g., estimated).
FIG. 7C illustrates a filter according to an embodiment of the present disclosure.
Referring to FIG. 7C, the electronic device may generate a filter using the array of subpixels, the sum buffer, or the like. Each field in the filter corresponds to a subpixel. As illustrated in FIG. 7C, the coefficients corresponding to subpixels for which no red color value has been captured are equal to zero, and the coefficients corresponding to the subpixels for which a red color value has been captured are 0.5.
According to various embodiments of the present disclosure, the filter may be generated in relation to a specific subpixel. For example, the filter illustrated in FIG. 7C may be generated in relation to subpixel 725 of FIG. 7B. As illustrated in FIG. 7C, the sampled red color values are equally weighted. The sampled red color values may be equally weighted because the sampled red color values are equidistant from the subpixel 725. The electronic device may normalize the filter illustrated in FIG. 7C. The filter may be normalized by summing the coefficients in the filter and dividing each coefficient by the sum of all the coefficients in the filter.
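As an illustrative, non-limiting sketch of the normalization just described (the function name and 3x3 window are assumptions; the mask mirrors a neighborhood such as that of FIG. 7B):

import numpy as np

def normalize_filter(raw_filter, mask):
    # raw_filter: unnormalized coefficients (e.g., equal weights for samples
    #             that are equidistant from the subpixel being estimated)
    # mask:       1 where a red color value was sampled, 0 otherwise
    coeffs = raw_filter * mask     # coefficients at unsampled subpixels are zero
    return coeffs / coeffs.sum()   # divide by the sum so the coefficients total one

# Two equidistant sampled neighbors, each initially weighted 1.0, normalize
# to 0.5 each, matching the coefficients of the illustrated filter.
mask = np.array([[0, 1, 0],
                 [0, 0, 0],
                 [0, 1, 0]])
print(normalize_filter(np.ones((3, 3)), mask))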
According to various embodiments of the present disclosure, when more sampled color values of a particular color are available than would result from a single image capture, a method of reconstructing the color representation at a subpixel (e.g., demosaicing the image) may use all of the nearby sampled color values, weighting the sampled color values closer to the subpixel to be estimated more heavily than the values that are farther away.
FIG. 7D illustrates a plurality of subpixels according to an embodiment of the present disclosure.
Referring to FIG. 7D, an array of subpixels of at least an area of an intended image is illustrated. The array of subpixels corresponds to a composite of sampled blue color values across a plurality of images to be used to reconstruct a color representation of the intended image. The subpixels for which a blue value has been captured (e.g., sampled) across the plurality of images are denoted by ‘B’. Conversely, the subpixels for which no blue value has been captured (e.g., sampled) across the plurality of images are denoted by ‘0’.
According to various embodiments of the present disclosure, the electronic device may generate a filter using the array of subpixels illustrated in FIG. 7D. The electronic device may populate or otherwise generate a sum buffer to determine the subpixels for which no blue value has been captured across the plurality of images. As an example, the sum buffer may be populated with ‘1’ for fields corresponding to the subpixels for which a blue color value has been captured (e.g., the subpixels having a value ‘B’ in the array of subpixels). As another example, the sum buffer may be populated with ‘0’ for fields corresponding to the subpixels for which a blue value has not been captured (e.g., the subpixels having a value ‘0’ in the array of subpixels). According to various embodiments of the present disclosure, the sum buffer may be used to identify the subpixels for which a color value needs to be reconstructed (e.g., estimated).
FIG. 7E illustrates a filter according to an embodiment of the present disclosure.
Referring to FIG. 7E, the electronic device may generate a filter using the array of subpixels, the sum buffer, or the like. Each field in the filter corresponds to a subpixel. As illustrated in FIG. 7E, the coefficients corresponding to subpixels for which no blue color value has been captured are equal to zero. The coefficients corresponding to the subpixels for which a blue color value has been captured may be weighted according to a distance between the corresponding subpixel having the sampled blue color value and the subpixel for which a blue color value is being determined (e.g., estimated).
According to various embodiments of the present disclosure, the filter may be generated in relation to a specific subpixel. For example, the filter illustrated in FIG. 7E may be generated in relation to a subpixel 735 of FIG. 7D. As illustrated in FIG. 7E, the subpixels within each group of subpixels having sampled blue color values that are equidistant from the subpixel for which the blue color value is being determined are equally weighted. However, a subpixel having a sampled blue color value that is relatively closer to the subpixel for which the blue color value is being determined may have a coefficient corresponding to a higher weight than another subpixel having a sampled blue color value that is relatively farther away from the subpixel for which the blue color value is being determined.
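By way of non-limiting illustration, a distance-dependent filter such as that of FIG. 7E may be sketched as follows; the inverse-distance weighting is one illustrative choice only, as the embodiments require merely that closer samples receive larger coefficients:

import numpy as np

def distance_weighted_filter(mask, center):
    # mask:   1 where a blue color value was sampled, 0 otherwise; assumed to
    #         contain at least one sampled subpixel
    # center: (row, col) of the subpixel being estimated; assumed unsampled
    rows, cols = np.indices(mask.shape)
    dist = np.hypot(rows - center[0], cols - center[1])
    dist[center] = 1.0                       # avoid division by zero; center is masked out
    weights = np.where(mask == 1, 1.0 / dist, 0.0)
    return weights / weights.sum()           # normalize so the coefficients sum to one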
Referring back to FIG. 7A, at operation 730, the electronic device may determine a corresponding color value for a subpixel of the intended image that does not have a corresponding sampled color value of a particular color across the plurality of images. The electronic device may determine the corresponding color value for each subpixel of the intended image that does not have a corresponding sampled color value of the particular color across the plurality of images.
According to various embodiments of the present disclosure, the electronic device may determine the corresponding color value for a subpixel of the intended image that does not have a corresponding sampled color value of a particular color across the plurality of images using an applicable filter table (e.g., a filter table generated for use with the particular subpixel). According to various embodiments of the present disclosure, the electronic device may determine the corresponding color value for the subpixel by multiplying the corresponding sampled color values of the other subpixels (e.g., the surrounding subpixels) for which a corresponding color value has been captured (e.g., sampled) by the corresponding coefficient for that subpixel. Thereafter, the sum of the products of the coefficients of the filter and the corresponding sampled color values may be determined to be the color value at the subpixel of the intended image that does not have a corresponding sampled color value of the particular color.
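For illustration only, the multiply-and-sum of operation 730 may be sketched as follows; the function and array names are illustrative assumptions:

import numpy as np

def apply_filter(samples, filter_table):
    # samples:      2D array of sampled color values (zero where unsampled)
    # filter_table: normalized coefficients generated for this subpixel
    # Multiply each sampled color value by its coefficient and sum the
    # products to obtain the estimated color value.
    return float((samples * filter_table).sum())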
At operation 740, the electronic device may store the determined (e.g., estimated) color value for the corresponding color at the applicable subpixel of the intended image. The electronic device may store the determined color values for each respective color at each respective subpixel that does not have a corresponding sampled color value across the plurality of images. Thereafter, the electronic device may reconstruct the image. For example, the electronic device may render the image using the sampled color values and the estimated color values. According to various embodiments of the present disclosure, the electronic device may perform image processing on the image (e.g., the electronic device may further process the mapping of color values at the subpixels of the intended image) before rendering the image.
According to various embodiments of the present disclosure, the electronic device may use a similar method for estimating other characteristics of the intended image at the subpixel level. The electronic device may use measured values of a particular characteristic across a plurality of images to determine (e.g., estimate) a corresponding value of the particular characteristic at subpixels for which no measurement exists.
FIG. 8 illustrates pseudo code for a method of demosaicing according to an embodiment of the present disclosure.
Referring to FIG. 8, the illustrated pseudo code shows how demosaicing may be performed. The function “demo(color, x, y)” (e.g., demo(R,x,y)) corresponds to a function call to perform demosaicing of a particular color at a particular subpixel denoted by the coordinates (x,y). The demosaic function may use an interpolation method or other numerical method for determining a corresponding color value at the particular subpixel. For example, the demosaic function may perform a pixel replication, a bilinear interpolation, a median interpolation, and/or the like.
As illustrated in FIG. 8, the function “demosaic” loops over all of the subpixels. The demosaic function may read the average pixel from the pipeline. The demosaic function may also read the sum buffer count from the “num” buffer for each pixel. If the sum buffer count is non-zero, then the average pixel is used as is (e.g., demosaicing is not performed on subpixels for which at least one sampled color value is captured across the plurality of images). In contrast, if the sum buffer count is zero, then demosaicing is determined to be required and the function “demo” is called to perform a demosaicing procedure.
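As a non-limiting sketch consistent with the described pseudo code (the stand-in interpolation in demo is an illustrative assumption and not the specific method of FIG. 8):

def demosaic(avg, num, width, height):
    # avg: per-subpixel average of the sampled values for one color channel,
    #      indexed avg[y][x] (the "average pixel" read from the pipeline)
    # num: the sum buffer; num[y][x] is the count of sampled values at (x, y)
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if num[y][x] != 0:
                out[y][x] = avg[y][x]        # at least one sample: use as is
            else:
                out[y][x] = demo(avg, x, y)  # no sample: demosaic
    return out

def demo(plane, x, y):
    # Stand-in for the interpolation named in the pseudo code; a simple
    # average of the in-bounds horizontal and vertical neighbors is used here.
    h, w = len(plane), len(plane[0])
    neighbors = [plane[j][i]
                 for i, j in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))
                 if 0 <= i < w and 0 <= j < h]
    return sum(neighbors) / len(neighbors) if neighbors else 0.0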
FIG. 8 further illustrates the generation of a filter.
FIG. 9 illustrates a block diagram schematically illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
Referring to FIG. 9, an electronic device 900 may include a control unit 910, a storage unit 920, an image processing unit 930, a display unit 940, an input unit 950, and a communication unit 960.
According to various embodiments of the present disclosure, the electronic device 900 comprises at least one control unit 910. The at least one control unit 910 may be configured to operatively control the electronic device 900. For example, the at least one control unit 910 may control operation of the various components or units included in the electronic device 900. The at least one control unit 910 may transmit a signal to the various components included in the electronic device 900 and control a signal flow between internal blocks of the electronic device 900. The at least one control unit 910 may be or otherwise include at least one processor. The at least one control unit 910 may include an Application Processor (AP), and/or the like.
The storage unit 920 may be configured to store user data, and the like, as well as a program which performs operating functions according to various embodiments of the present disclosure. The storage unit 920 may include a non-transitory computer-readable storage medium. As an example, the storage unit 920 may store a program for controlling general operation of the electronic device 900, an Operating System (OS) which boots the electronic device 900, and an application program for performing other optional functions such as a camera function, a sound replay function, an image or video replay function, a signal strength measurement function, a route generation function, image processing, and the like. Further, the storage unit 920 may store user data generated by a user of the electronic device 900, such as, for example, a text message, a game file, a music file, a movie file, and the like. According to various embodiments of the present disclosure, the storage unit 920 may store an application or a plurality of applications that individually or in combination operate a camera unit (not shown) to capture (e.g., contemporaneously) one or more images of substantially the same viewpoint, and/or the like. According to various embodiments of the present disclosure, the storage unit 920 may store an application or a plurality of applications that individually or in combination operate the image processing unit 930 or the control unit 910 to determine which subpixels across the plurality of images have a corresponding sampled color value of a particular color, to determine which subpixels across the plurality of images do not have a corresponding sampled color value of a particular color, to estimate the color value at a subpixel that does not have a corresponding sampled color value of the particular color across the plurality of images, to render the image, and/or the like. The storage unit 920 may store an application or a plurality of applications that individually or in combination operate the control unit 910 and the communication unit 960 to communicate with a counterpart electronic device to receive one or more images from the counterpart electronic device, and/or the like. The storage unit 920 may store an application or a plurality of applications that individually or in combination operate the display unit 940 to display a graphical user interface, an image, a video, and/or the like.
The image processing unit 930 may be configured to process image data, images, and/or the like. The image processing unit 930 may include a Sub Pixel Rendering (SPR) unit (not shown), a demosaicing unit (not shown), and/or the like. In the alternative or in addition, the image processing unit 930 may be configured to perform demosaicing of image data and/or images, SPR, and/or the like. The image processing unit 930 may be configured to determine which subpixels of an intended image have a corresponding sampled color value for a particular color, to determine which subpixels of an intended image do not have a corresponding sampled color value for a particular color, to determine (e.g., estimate) a color value of a particular color for the subpixels that do not have a corresponding sampled color value of the particular color, and/or the like.
The display unit 940 displays information input by a user or information to be provided to the user, as well as various menus of the electronic device 900. For example, the display unit 940 may provide various screens according to use of the electronic device 900, such as an idle screen, a message writing screen, a calling screen, a route planning screen, and the like. According to various embodiments of the present disclosure, the display unit 940 may display an interface via which the user may manipulate a touch screen or otherwise enter inputs to select a function of the electronic device 900. The display unit 940 can be formed as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), an Active Matrix Organic Light Emitting Diode (AMOLED), and the like. However, various embodiments of the present disclosure are not limited to these examples. Further, the display unit 940 can perform the function of the input unit 950 if the display unit 940 is formed as a touch screen.
The input unit 950 may include input keys and function keys for receiving user input. For example, the input unit 950 may include input keys and function keys for receiving an input of numbers or various sets of letter information, setting various functions, and controlling functions of the electronic device 900. For example, the input unit 950 may include a calling key for requesting a voice call, a video call request key for requesting a video call, a termination key for requesting termination of a voice call or a video call, a volume key for adjusting output volume of an audio signal, a direction key, and the like. In particular, according to various embodiments of the present disclosure, the input unit 950 may transmit to the at least one control unit 910 signals related to the operation of a camera unit (not shown), to selection of an image, to selection of a viewpoint, and/or the like. Such an input unit 950 may be formed by one or a combination of input means such as a touch pad, a touchscreen, a button-type key pad, a joystick, a wheel key, and the like.
The communication unit 960 may be configured for communicating with other electronic devices and/or networks. According to various embodiments of the present disclosure, the communication unit 960 may be configured to communicate using various communication protocols and various communication transceivers. For example, the communication unit 960 may be configured to communicate via Bluetooth technology, NFC technology, WiFi technology, 2G technology, 3G technology, LTE technology, or another wireless technology, and/or the like.
It will be appreciated that various embodiments of the present disclosure according to the claims and description in the specification can be realized in the form of hardware, software or a combination of hardware and software.
Any such software may be stored in a non-transitory computer readable storage medium. The non-transitory computer readable storage medium stores one or more programs (software modules), the one or more programs comprising instructions, which when executed by one or more processors in an electronic device, cause the electronic device to perform a method of the present disclosure.
Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a Read Only Memory (ROM), whether erasable or rewritable or not, or in the form of memory such as, for example, Random Access Memory (RAM), memory chips, devices, or integrated circuits, or on an optically or magnetically readable medium such as, for example, a Compact Disk (CD), a Digital Versatile Disc (DVD), a magnetic disk, a magnetic tape, or the like. It will be appreciated that the storage devices and storage media are various embodiments of non-transitory machine-readable storage that are suitable for storing a program or programs comprising instructions that, when executed, implement various embodiments of the present disclosure. Accordingly, various embodiments provide a program comprising code for implementing an apparatus or a method as claimed in any one of the claims of this specification and a non-transitory machine-readable storage storing such a program.
According to various embodiments of the present disclosure, one or more images may be used to reconstruct a color representation of an intended image. Because more information may be obtained from a plurality of images of an intended viewpoint than otherwise obtained from a single image of the intended viewpoint, various embodiments of the present disclosure may use such additional information to reconstruct the intended image (e.g., intended viewpoint). According to the various embodiments of the present disclosure, in the absence of a plurality of images that substantially capture the same viewpoint, a method of reconstructing the color representation of the intended image may be consistent with the results of a normal demosaicing process which uses a single image.
While the disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents.