CN111629140B - Image sensors and electronics - Google Patents


Info

Publication number
CN111629140B
Authority
CN
China
Prior art keywords
row
column
filter
units
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010724148.5A
Other languages
Chinese (zh)
Other versions
CN111629140A (en)
Inventor
程祥
王迎磊
宋锐男
张玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd
Publication of CN111629140A
Application granted
Publication of CN111629140B
Legal status: Active
Anticipated expiration

Abstract

The application provides an image sensor and an electronic device, which can make the finally generated image closer to the real effect. The image sensor comprises a filter unit array and a pixel unit array. The filter unit array comprises a plurality of filter unit groups, each of which comprises white filter units and color filter units. The pixel unit array is located below the filter unit array, and its pixel units correspond one-to-one with the filter units in the plurality of filter unit groups; the white pixel units in the pixel unit array receive the light signals passing through their corresponding white filter units, and the color pixel units receive the light signals passing through their corresponding color filter units. The light intensity information of the light signals sensed by the pixel unit array is used to determine how the pixel values of the color pixel units and the pixel values of the white pixel units generate a target image of the photographed object.

Description

Image sensor and electronic device
The present application claims priority to Chinese Patent Application No. 202010410639.2, entitled "Image sensor and electronic device", filed with the Chinese Patent Office on May 15, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present application relate to the field of images, and more particularly, to an image sensor and an electronic device.
Background
Imaging systems in electronic devices typically rely on image sensors to create an electronic representation of a visual image. Examples of such image sensors include charge-coupled device (CCD) image sensors and active pixel sensor (APS) devices; the latter are often also referred to as CMOS sensors because they can be fabricated in a complementary metal oxide semiconductor (CMOS) process.
These image sensors include a plurality of photosensitive pixels, usually arranged in a regular pattern of rows and columns. To capture a color image, light signals of specific wavelengths, i.e., signals corresponding to particular received colors, must be accumulated on different pixels, so a color filter is installed in the image sensor. For example, a filter with a Bayer array covering the colors red, green, and blue (RGB) is commonly used.
To make different pixels in the pixel array sensitive to only part of the visible spectrum, the color filters must be set to different colors so that each passes only the light signal of the corresponding color. This reduces the amount of light reaching each photosensitive pixel and thereby its photosensitivity. In addition, since an image sensor used in a mobile device is typically limited in size, the photosensitive area of its pixel array is also limited, so photographing performance in a low-light environment may be constrained.
Disclosure of Invention
The application provides an image sensor and an electronic device, which can make the finally generated image closer to the real effect.
In a first aspect, an image sensor is provided. The image sensor comprises a filter unit array and a pixel unit array. The filter unit array comprises a plurality of filter unit groups, each of which comprises white filter units and color filter units. The pixel unit array is located below the filter unit array, and the pixel units in the pixel unit array correspond one-to-one with the filter units in the plurality of filter unit groups; the white pixel units in the pixel unit array receive the light signals passing through their corresponding white filter units, and the color pixel units receive the light signals passing through their corresponding color filter units. The light intensity information of the light signals sensed by the pixel unit array is used to determine how the pixel values of the color pixel units and the pixel values of the white pixel units generate a target image of the photographed object.
Based on the above technical scheme, the image sensor of the application includes white filter units. Compared with a filter unit array containing only monochromatic filter units, the amount of incoming light is greatly increased, so the light intake of the whole image sensor is improved and its performance is not affected even in a low-illumination environment.
In addition, the image sensor can generate the target image with different fusion modes under different ambient light conditions, so that the finally generated target image is closer to the real effect.
In one possible implementation, each filter unit group includes 4×4 filter units, the ratio of white filter units to color filter units in each group is 1:1, and the white filter units and color filter units are arranged alternately.
A 50% share of white filter units ensures that the white pixel units have a high spatial sampling rate, which helps the subsequent Remosaic algorithm obtain a better high-resolution grayscale image.
In addition, the white filter units and the color filter units may be arranged alternately; that is, in any row or column, no two adjacent units are both white or both color units, so crosstalk is spread relatively evenly. The even distribution of white and color filter units also achieves a higher color spatial sampling rate, which facilitates subsequent image restoration.
In one possible implementation, the color filter units include 2 red filter units, 2 blue filter units, and 4 green filter units.
In one possible implementation, the filter units on one diagonal of each filter unit group are white filter units, and the other diagonal holds 2 red filter units and 2 blue filter units.
In a possible implementation, the 2 red filter units on the other diagonal share a vertex, and the 2 blue filter units also share a vertex.
In one possible implementation, in each filter unit group the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4), given as (row, column); the red filter units occupy (3,2) and (4,1); the blue filter units occupy (1,4) and (2,3); and the green filter units occupy (1,2), (2,1), (3,4) and (4,3).
In one possible implementation, the 2 red filter units on the other diagonal share a vertex while the 2 blue filter units are arranged separately, or the 2 blue filter units on the other diagonal share a vertex while the 2 red filter units are arranged separately.
In one possible implementation, in each filter unit group the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4); the red filter units occupy (1,4) and (4,1); the blue filter units occupy (2,3) and (3,2); and the green filter units occupy (1,2), (2,1), (3,4) and (4,3).
In one possible implementation, in each filter unit group the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4); the red filter units occupy (2,3) and (3,2); the blue filter units occupy (1,4) and (4,1); and the green filter units occupy (1,2), (2,1), (3,4) and (4,3).
In a possible implementation, the red filter units and the blue filter units on the other diagonal line are alternately arranged.
In one possible implementation, in each filter unit group the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4); the red filter units occupy (2,3) and (4,1); the blue filter units occupy (1,4) and (3,2); and the green filter units occupy (1,2), (2,1), (3,4) and (4,3).
In one possible implementation, the filter units on one diagonal of each filter unit group are white filter units, and the filter units on the other diagonal are green filter units.
In one possible implementation, in each filter unit group the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4); the red filter units occupy (2,1) and (4,3); the blue filter units occupy (1,2) and (3,4); and the green filter units occupy (1,4), (2,3), (3,2) and (4,1).
In a possible implementation, the image sensor further comprises a processor configured to determine light intensity information from the light signals sensed by the pixel unit array, and to determine from that information how the pixel values of the color pixel units and/or the pixel values of the white pixel units are used to generate the target image of the photographed object.
In one possible implementation, the image sensor further comprises a processor configured to determine the light intensity from the light signals sensed by the pixel unit array and, when the light intensity is greater than or equal to a first preset threshold, to generate the target image using the pixel values of the color pixel units.
In one possible implementation, the processor is configured, when the light intensity is less than the first preset threshold and greater than or equal to a second preset threshold, to determine texture information from the pixel values of the white pixel units, to determine the color information and pixel value at each white pixel unit position from the pixel value of that white pixel unit and the colors of the surrounding pixel units, and to generate the target image from this color information and these pixel values together with the pixel values of the color pixel units.
In one possible implementation, the processor is configured, when the light intensity is less than the second preset threshold and greater than or equal to a third preset threshold, to generate first image data from the pixel values of the white pixel units and second image data from the pixel values of the color pixel units, to correct the first image data with a correction coefficient α to obtain corrected first image data, and to fuse the corrected first image data with the second image data to obtain the target image, where α is determined by the light intensity and 0 < α < 1.
In a possible implementation, the processor is configured, when the light intensity is less than the third preset threshold, to generate first image data from the pixel values of the white pixel units and second image data from the pixel values of the color pixel units, and to fuse the first image data with the second image data to obtain the target image.
These fusion modes ensure that the generated target image stays close to the real effect in both strong-light and weak-light environments.
In one possible implementation, the image sensor further comprises a microlens array located above the filter unit array, comprising a plurality of microlenses for converging the light signals returned by the photographed object onto the filter unit array, where one microlens corresponds to at least one filter unit in the filter unit array.
In one possible implementation, the microlenses in the microlens array are in one-to-one correspondence with the filter units in the filter unit array.
In a second aspect, an electronic device is provided, comprising the image sensor of the first aspect or any of the possible implementation manners of the first aspect.
Drawings
Fig. 1 is a schematic block diagram of an image processing apparatus provided by an embodiment of the present application.
Fig. 2 is a schematic diagram of color distribution of a conventional filter unit group.
Fig. 3 is a schematic top view of another image sensor according to an embodiment of the present application.
Fig. 4 is a schematic cross-sectional view of the image sensor of fig. 3 along A-A'.
Fig. 5 is a schematic illustration of an image taken under two different filter structures.
Fig. 6 to 11 are schematic diagrams illustrating an arrangement manner of filter units in a filter unit group according to an embodiment of the present application.
Fig. 12 to 13 are schematic structural diagrams of an image sensor according to an embodiment of the present application.
Fig. 14-18 are schematic flow diagrams of fusion processes provided by embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
The image processing apparatus converts the optical image of a photographed object into electrical signals proportional to it by using the photoelectric conversion function of a pixel array, and thereby obtains an image of the object. Fig. 1 shows a schematic block diagram of an image processing apparatus 100. The image processing apparatus 100 may be any electronic device, for example a mobile phone, or it may be a part of an electronic device, for example the camera module in an electronic device; the embodiments of the present application are not limited in this respect.
As shown in fig. 1, the image processing apparatus 100 generally includes a pixel array (or may also be referred to as a photoelectric conversion unit 101 or an image sensor 101), a signal reading circuit 102, a memory 103, a controller 104, an image processor 105, an output interface 106, and a power supply 107. The electrical signal output end of the pixel array 101 is connected to the input end of the signal reading circuit 102, the control end of the pixel array 101 is connected to the input end of the controller 104, the output end of the signal reading circuit 102 is connected to the input end of the memory 103 and the input end of the controller 104, the output end of the controller 104 is connected to the input end of the image processor 105, the output end of the image processor 105 is connected to the input end of the output interface 106, and the power supply 107 is used for providing power for the above modules.
The pixel array 101 can adopt either of two semiconductor structures, CCD or CMOS, to capture light and perform photoelectric conversion; it collects the light signals returned by the photographed object, converts them into electrical signals, and reflects the optical image of the object through the intensity of those electrical signals. The signal reading circuit 102 reads the electrical signal output by each pixel and may be an A/D converter implementing analog-to-digital conversion. The memory 103 may be an internal memory that exchanges data directly, for example a random access memory (RAM) for storing required data. The controller 104 may be a complex programmable logic device (CPLD) capable of handling the logic operations and timing control of the sensor. The image processor 105 pre-processes the read-out data and may apply different algorithmic processing to different filter patterns. The output interface 106 serves as the external data interaction interface for transmitting image data to the outside. The controller 104 outputs control signals that make the pixels in the pixel array work in concert.
The core component of the image processing apparatus 100 is the pixel array 101. The photosensitive structures in the pixel array 101 are similar to one another; typically each pixel structure includes a lens (or microlens), a color filter, and a photosensitive element (or pixel). The lens sits above the filter, and the filter sits above the photosensitive element. Light returning from the photographed object is focused by the lens, exits the lens emission area, is filtered by the filter, and then reaches a photosensitive element such as a photodiode (PD), which converts the optical signal into an electrical signal. According to the type of light their filters transmit, the pixels may include red pixels (hereinafter, R pixels), green pixels (G pixels), and blue pixels (B pixels). An R pixel receives the red light signal passed by its filter; the principle of G and B pixels is the same and is not repeated here.
The principle by which the image processing apparatus 100 generates color image data is that each pixel in the pixel array can convert only one type of light signal into an electrical signal; the image color of the area covered by the current pixel is then restored by interpolation using the light signals collected by the surrounding pixels of the other types. This is called demosaicing (Demosaicing) and is usually performed in a processor. For example, if the current pixel is an R pixel, it can convert only the red light signal into an electrical signal, so the blue and green light intensities at that position are restored from the electrical signals collected by the surrounding B and G pixels, and the image color of the current pixel is thereby determined.
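This interpolation can be illustrated with a short sketch. The following is a minimal bilinear demosaicing step for a single red pixel, assuming an RGGB Bayer mosaic stored as a 2-D array; the function name and the layout assumption are illustrative and not part of the present application:

```python
import numpy as np

def demosaic_at_red(mosaic, r, c):
    """Bilinear estimate of the missing G and B values at a red pixel.

    `mosaic` holds raw samples in an RGGB Bayer layout, so a red pixel
    at (r, c) has green samples above, below, left and right, and blue
    samples on its four diagonals. Border handling is omitted."""
    g = (mosaic[r - 1, c] + mosaic[r + 1, c] +
         mosaic[r, c - 1] + mosaic[r, c + 1]) / 4.0          # axial neighbors are G
    b = (mosaic[r - 1, c - 1] + mosaic[r - 1, c + 1] +
         mosaic[r + 1, c - 1] + mosaic[r + 1, c + 1]) / 4.0  # diagonal neighbors are B
    return mosaic[r, c], g, b  # restored (R, G, B) at this pixel
```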
Therefore, in order to acquire a color image, a color filter array (CFA) with a specific color arrangement needs to be placed over the photosensitive element array. Currently, most photosensitive arrays, such as CCD and CMOS image sensors, use a CFA in the Bayer format based on the three RGB primaries. The basic unit of the Bayer pattern is a 2×2 four-pixel array containing 1 red pixel R, 1 blue pixel B, and 2 green pixels G, with the two green pixels placed adjacent at a common vertex, as shown in fig. 2. Since any single pixel can only obtain the signal of one of R, G, and B, restoration of the complete color information must be realized by a specific image processing algorithm.
This pure RGB Bayer arrangement allows only light of one specific color to pass through each pixel, i.e., it intercepts a large portion of the photons, and may therefore fail to restore the image accurately in a low-light environment.
In addition, as imaging devices are miniaturized while their pixel counts grow, high-pixel-density imaging equipment has become increasingly important for capturing high-resolution images. As the number of pixels in an image sensor keeps increasing while the sensor size shrinks, the photosensitive area of each pixel decreases, which further weakens the light signal sensed by each pixel, so that the image sensor cannot restore the image accurately.
Alternatively, the image processor 105 may include, but is not limited to, an image signal processor (ISP) that performs linearization, dead pixel removal, denoising, color correction, demosaicing, automatic exposure control (AEC), automatic gain control (AGC), automatic white balance (AWB), and the like, on the digital image.
With an image processing apparatus 100 using a Bayer-format CFA, the red pixel unit can only receive the red light signal, the green pixel unit only the green light signal, and the blue pixel unit only the blue light signal; the light signal received by each pixel unit is therefore weak, resulting in a lower image signal-to-noise ratio (SNR) and thus affecting image quality.
In addition, with an image sensor using a Bayer-format CFA, the high-frequency luminance information and the chrominance information of an image easily overlap, so color aliasing and color moiré are likely to occur.
Based on the above, the embodiment of the application provides an image sensor, which is favorable for accurately restoring images in a low-illumination environment.
Fig. 3 is a schematic top view of an image sensor 200 according to an embodiment of the present application, and fig. 4 is a schematic cross-sectional view of the image sensor 200 along A-A'.
As shown in fig. 3 and 4, the image sensor 200 includes a filter cell array 210 and a pixel cell array 220. The filter cell array 210 may include a plurality of filter cell groups 211, and each filter cell group 211 may include a white filter cell and a color filter cell. The pixel unit array 220 may be disposed below the filter unit array 210, and the white pixel units in the pixel unit array 220 are configured to receive the light signals passing through the white filter units corresponding thereto, and the color pixel units are configured to receive the light signals filtered through the color filter units corresponding thereto.
In the image sensor provided by the embodiment of the application, the white (W) filter unit is added in the CFA, the color pixel units in the pixel unit array receive the color light signals, and the white pixel units receive the white light signals, so that the intensity of the whole light signals received by the pixel unit array can be increased. For a low illumination environment, the scheme can accurately restore the image of the object.
However, if the pixel values of the white pixel units are fused directly into the color pixel units after white filter units are added to the CFA, the extra brightness information may make the final image too bright when light is sufficient, so that it deviates from the real effect.
As shown in fig. 5, when light is sufficient, the image generated with the RGGB structure, shown in diagram (a), is relatively close to the real effect, whereas the image generated with the RGBW structure, shown in diagram (b), fuses in the brightness information of the W pixel units, so the finally generated image is too bright and deviates from the real effect.
Based on the above, the embodiment of the application also provides an image sensor, which can generate target images in different fusion modes under different ambient light conditions, so that the finally generated target images are closer to the real effect.
The image sensor includes a filter unit array and a pixel unit array. The filter unit array includes a plurality of filter unit groups, each of which includes white filter units and color filter units. The pixel unit array is located below the filter unit array; the white pixel units in the pixel unit array receive the light signals passing through their corresponding white filter units, and the color pixel units receive the light signals passing through their corresponding color filter units. The light intensity information of the light signals sensed by the pixel unit array is used to determine how the pixel values of the color pixel units and/or the pixel values of the white pixel units generate the target image of the photographed object.
The white pixel unit is a pixel unit corresponding to the white filter unit, and the color pixel unit is a pixel unit corresponding to the color filter unit.
In the present application, the white filter unit refers to a filter or a filter material for transmitting white light, and in some embodiments, the white filter unit may be a transparent material or an air gap for transmitting all light signals including white light in the environment. In particular, the white light may be a mixture of colored light. For example, light of three primary colors in the spectrum, blue, red and green, may be mixed in a proportion to produce white light, or the mixture of all visible light in the spectrum may be white light.
The color filter unit in the embodiment of the application can comprise at least one red filter unit, at least one blue filter unit and at least one green filter unit, so that the color integrity of the RGB space can be ensured.
The number of the optical filter units included in the optical filter unit group is not particularly limited in the embodiment of the application. For example, one filter unit group may include 2×2 filter units, or one filter unit group may include 3×3 filter units, or one filter unit group may include 4×4 filter units, or one filter unit group may include 6×6 filter units, or one filter unit group may include 8×8 filter units, or the like.
The proportion of the white filter units in each filter unit group can be 25% -75%.
Preferably, the share of white filter units may be 50%; that is, the ratio of white filter units to color filter units in one group may be 1:1, which improves the overall photosensitivity of the sensor. The white filter units then provide sufficient brightness in a low-light environment without making the image too bright, so the final image stays close to the real effect.
Second, a 50% share of white filter units ensures that the white pixel units have a high spatial sampling rate, which helps the subsequent reconstruction-mosaic (Remosaic) algorithm obtain a better high-resolution grayscale image.
In addition, the white filter units and the color filter units may be arranged alternately; that is, in any row or column, no two adjacent units are both white or both color units, so crosstalk is spread relatively evenly. The even distribution of white and color filter units also achieves a higher color spatial sampling rate, which facilitates subsequent image restoration.
The arrangement of the filter units according to the embodiment of the present application will be described below by taking an example in which each filter unit group includes 4×4 filter units.
One filter unit group may include 2 red filter units, 2 blue filter units, and 4 green filter units. Because human eyes are more sensitive to green light than to blue or red light, making the number of green filter units larger than the number of blue and red filter units achieves better color restoration, while still keeping the RGB spatial sampling rates relatively balanced so that the subsequent Remosaic algorithm can recover the color image.
The filter units on one diagonal of each filter unit group are white filter units, while the other diagonal holds 2 red filter units and 2 blue filter units.
As an example, the 2 red filter units may share a vertex and the 2 blue filter units may also share a vertex; that is, the other diagonal contains 2 consecutive red filter units and 2 consecutive blue filter units, as shown in diagram (a) of fig. 6.
In the filter unit group shown in fig. 6 (a), the white filter units may occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4), given as (row, column); the red filter units (3,2) and (4,1); the blue filter units (1,4) and (2,3); and the green filter units (1,2), (2,1), (3,4) and (4,3).
In this structure, filter units of the same color are placed at a common vertex, which simplifies the subsequent fusion process. Taking the group of fig. 6 (a) as an example, the pixel units corresponding to the 2 green filter units in the upper left corner can be binned directly into one green pixel unit, the 2 blue filter units in the upper right corner into one blue pixel unit, the 2 red filter units in the lower left corner into one red pixel unit, and the 2 green filter units in the lower right corner into one green pixel unit; the binned pixel units then directly form a Bayer (RGGB-type) pattern.
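The group of fig. 6 (a) can be written out and checked numerically. The matrix below is reconstructed from the positions listed above (the figures themselves are not reproduced here, so this layout is an assumption); the assertions verify the 50% white share, the strict white/color alternation along rows and columns, and the single-color pair in each 2×2 quadrant that makes the direct binning possible:

```python
import numpy as np

# Fig. 6 (a) filter unit group, reconstructed from the positions in the text.
GROUP_A = np.array([["W", "G", "W", "B"],
                    ["G", "W", "B", "W"],
                    ["W", "R", "W", "G"],
                    ["R", "W", "G", "W"]])

is_w = (GROUP_A == "W")
assert is_w.sum() == 8                      # white units occupy 50% of the group
assert (is_w[:, :-1] ^ is_w[:, 1:]).all()   # white/color alternate along every row
assert (is_w[:-1, :] ^ is_w[1:, :]).all()   # ... and along every column

# Each 2x2 quadrant holds exactly one color (as a same-color pair), so
# binning each pair leaves one Bayer-type G/B/R/G sample per quadrant.
for qr in (0, 2):
    for qc in (0, 2):
        quad = GROUP_A[qr:qr + 2, qc:qc + 2]
        assert len(set(quad.ravel()) - {"W"}) == 1
```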
As another example, the 2 red filter units on the other diagonal share a vertex while the 2 blue filter units are arranged separately, or the 2 blue filter units share a vertex while the 2 red filter units are arranged separately.
Taking diagram (b) in fig. 6 as an example, the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4); the red filter units (1,4) and (4,1); the blue filter units (2,3) and (3,2); and the green filter units (1,2), (2,1), (3,4) and (4,3).
Taking diagram (c) in fig. 6 as an example, the white filter units occupy the same checkerboard positions; the red filter units (2,3) and (3,2); the blue filter units (1,4) and (4,1); and the green filter units (1,2), (2,1), (3,4) and (4,3).
Although the same-color filter units in the groups of figs. 6 (b) and (c) are not all placed at a common vertex, when a plurality of such groups are arrayed so that they tile the entire surface of the image sensor, as shown in fig. 8, the tiled result achieves the same effect as the group of fig. 6 (a): in fig. 8, filter unit group 302 corresponds to fig. 6 (a), group 304 to fig. 6 (b), and group 306 to fig. 6 (c).
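This tiling equivalence can be tested in the same way: tile one group across the sensor and look for the other group as a shifted 4×4 window. A sketch, with an illustrative helper name:

```python
import numpy as np

def tiles_contain(group_a, group_b, reps=3):
    """True if tiling `group_b` over the sensor reproduces `group_a`
    as some 4x4 window, i.e. the two tilings differ only by a shift."""
    mosaic = np.tile(group_b, (reps, reps))
    return any(np.array_equal(mosaic[i:i + 4, j:j + 4], group_a)
               for i in range(4) for j in range(4))
```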
As yet another example, red filter units and blue filter units on the other diagonal line are alternately arranged.
As shown in fig. 6 (d), the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4); the red filter units (2,3) and (4,1); the blue filter units (1,4) and (3,2); and the green filter units (1,2), (2,1), (3,4) and (4,3).
Of course, the arrangement of the red filter unit and the blue filter unit on the other diagonal is not limited to the above description, and for example, the red filter unit and the blue filter unit on the other diagonal may be arranged in the manner shown in fig. 7.
Optionally, the filter units on one diagonal of each filter unit group are white filter units, and the filter units on the other diagonal are green filter units. In this case the 2 red filter units may be arranged separately, and the 2 blue filter units may also be arranged separately.
For example, as shown in fig. 6 (e), the white filter units occupy positions (1,1), (1,3), (2,2), (2,4), (3,1), (3,3), (4,2) and (4,4); the red filter units (2,1) and (4,3); the blue filter units (1,2) and (3,4); and the green filter units (1,4), (2,3), (3,2) and (4,1).
As can be seen from fig. 8, filter unit group 308 corresponds to the group shown in fig. 6 (e); that is, the group of fig. 6 (e) can also achieve the effect of the groups shown in figs. 6 (a), (b) and (c).
The filter units may be arranged in a manner shown in fig. 9, in addition to the manner shown in fig. 6 (e).
In addition, as can be seen from fig. 8, filter unit group 310 is another valid arrangement, in which white filter units occupy one diagonal, 3 green filter units are placed consecutively vertex-to-vertex, and the remaining green filter unit is placed separately from those 3.
The filter units may be arranged in the manner shown in fig. 10, in addition to the filter unit group 310.
The filter cells in the filter cell group may be arranged in the manner shown in fig. 11, in addition to the filter cell groups shown in fig. 6 to 10.
As shown in fig. 12 and 13, the image sensor 200 in the embodiment of the present application may further include a microlens array 230, and the microlens array 230 may be disposed above the filter unit array 210 to converge the optical signal returned from the photographing object to the filter unit array 210. The microlens array 230 may include a plurality of microlenses, and one microlens of the plurality of microlenses corresponds to at least one filter unit of the filter unit array.
In an embodiment of the present application, the distribution of the microlens array 230 may be set corresponding to the filter unit array 210 located thereunder, for example, each microlens in the microlens array 230 may correspond to one or more of the filter unit arrays 210 located thereunder.
Alternatively, as an embodiment, the microlenses in the microlens array 230 and the filter units in the filter unit array 210 may be in one-to-one correspondence. Specifically, as shown in fig. 12, the microlens array 230 includes a plurality of first microlenses 231, and each first microlens 231 corresponds to one filter unit and also corresponds to one pixel unit.
Alternatively, as another embodiment, at least one microlens in the microlens array 230 may correspond to a plurality of filter units in the filter unit array 210. For example, the microlens array 230 may include a plurality of second microlenses 232, each corresponding to a plurality of filter units, for example 2×2 filter units, with each filter unit corresponding to one pixel unit.
For another example, the microlens array 230 may also include at least one first microlens 231 and at least one second microlens 232, where each first microlens 231 corresponds to one filter unit in the filter unit array 210 and each second microlens 232 corresponds to a plurality of filter units in the filter unit array 210. For example, as shown in fig. 13, the microlens array 230 includes a plurality of first microlenses 231 corresponding to the filter units one by one, and also includes at least one second microlens 232 corresponding to 2×2 filter units.
For the above-described second microlenses 232 corresponding to the plurality of filter units, the number of filter units corresponding to the second microlenses 232 may be set according to practical applications, and may be set to any value. For example, the second microlens 232 may correspond to 3×3 filter units or 1×2 filter units, and the embodiment of the present application is not limited thereto.
In addition, the plurality of filter units corresponding to the same second microlens 232 may have the same or different colors. For example, as shown in fig. 13, the 2×2 filter units corresponding to the second microlenses 232 may be white filter units, where the 2×2 white filter units may or may not belong to the same filter unit group, for example, the 2×2 filter units corresponding to the second microlenses 232 may also belong to two or more adjacent filter unit groups, and the embodiment of the present application is not limited thereto.
When a plurality of white filter units correspond to the same second microlens, the phase difference between the electrical signals converted from the light exiting different exit areas of that second microlens can be calculated, and the focal length of the image sensor can then be adjusted according to this phase difference.
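As a rough illustration of how such a phase difference could be obtained, the sketch below compares the one-dimensional signals gathered by the left-half and right-half sub-pixels under the shared microlenses; the sum-of-absolute-differences search is a common choice and is not prescribed by the present application:

```python
import numpy as np

def phase_difference(left, right, max_shift=8):
    """Estimate the lateral shift (in pixels) between the left- and
    right-sub-pixel signals (1-D numpy arrays) of shared-microlens
    white pixels by minimizing the mean absolute difference over
    candidate shifts. A result of zero means the scene is in focus."""
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = left[max(0, s):len(left) + min(0, s)]
        b = right[max(0, -s):len(right) + min(0, -s)]
        cost = float(np.abs(a - b).mean())
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift  # drives the lens toward zero phase difference
```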
The image sensor 200 in the embodiment of the present application may further include other parts. As shown in fig. 12 and 13, a dielectric layer 240 may be further included between the filter unit array 210 and the pixel array 220.
As shown in fig. 12 and 13, the filter unit array 210 may further include a dielectric 215 and a reflective grid 216 around it, and the pixel array 220 may include a semiconductor substrate 221 and a photosensitive element 222, wherein the photosensitive element 222 is located in the semiconductor substrate 221, and the photosensitive element 222 may be a PD. Optionally, the pixel array 220 may further comprise an isolation region 223 between two photosensitive elements 222.
The fusion manner provided by the embodiment of the present application is described below with reference to fig. 14 to 18.
The embodiment of the application can determine the generation mode of the generated target image according to the light intensity information of the light signals sensed by the pixel unit array, namely, if the light intensities are different, the generation modes of the generated target image are different.
Because the signal-to-noise ratio of the optical signal sensed by the white pixel unit is relatively high, the optical signal sensed by the white pixel unit can be preferentially used as the light intensity judgment basis.
The image sensor of the embodiment of the application can also comprise a processor, wherein the processor can be used for determining light intensity information according to the light signals sensed by the pixel unit array, and determining the pixel value of the color pixel unit and/or the pixel value of the white pixel unit according to the light intensity information to generate a generation mode of a target image of a shooting object.
The embodiment of the application divides the light intensity into different ranges, each corresponding to a different fusion flow, and the fused target image is sent to an image signal processor (ISP) for processing, as shown in fig. 14.
When the light intensity is greater than or equal to the first preset threshold, indicating that the light signal is saturated or near saturation, the processor may generate a target image using flow 1.
When the light intensity is too high, the white pixel units are saturated or nearly saturated; their pixel values need not be used, and the pixel values of the color pixel units can be processed directly by the reconstruction-mosaic algorithm to output RGB data. This ensures that in a strong-light environment the generated target image is not too bright and is closer to the real effect.
When the light intensity is less than the first preset threshold and greater than or equal to the second preset threshold, the processor may generate the target image using flow 2.
The processor may determine texture information and make color guesses based on the pixel values of the white pixel units, and generate the target image based on the pixel values of the color pixel units. In this process the white pixel units do not take part in the fusion itself; they only provide texture information and color-guess guidance for the target image.
For example, gray information of the target image may be generated from pixel values of the white pixel units, and the processor may be configured to determine texture information of the target image based on the gray information.
In addition, when determining the color information at a white pixel unit position, color estimation can use both the pixel value of the white pixel unit and the colors of the surrounding pixel units, improving the accuracy of color reproduction in the target image: the color information and pixel value at the white pixel unit position are determined from the white pixel value together with the colors of the neighboring pixel units.
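The exact estimation formula is not fixed here, so the following sketch is only one plausible choice: take the mean of each color channel in a small window around the white pixel and rescale it by the ratio of the local W value to the window's mean W value, so that the W sample sets the brightness while the neighbors set the chromaticity. All names are illustrative:

```python
import numpy as np

def color_at_white(w_plane, channel_planes, r, c):
    """Guess the color at white-pixel position (r, c).

    `w_plane` is the interpolated full-W (luminance) plane, and
    `channel_planes` maps 'R'/'G'/'B' to planes holding samples where
    that color was measured and NaN elsewhere. Illustrative only."""
    win = np.s_[max(0, r - 1):r + 2, max(0, c - 1):c + 2]
    scale = w_plane[r, c] / w_plane[win].mean()   # local-to-neighborhood luminance
    return {ch: float(np.nanmean(plane[win])) * scale
            for ch, plane in channel_planes.items()}
```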
When the light intensity is less than the second preset threshold and greater than or equal to the third preset threshold, the processor may generate the target image using flow 3.
The processor generates first image data from the pixel values of the white pixel units and second image data from the pixel values of the color pixel units, corrects the first image data with the correction coefficient α to obtain corrected first image data, and fuses the corrected first image data with the second image data to obtain the target image.
The correction coefficient α is determined by the light intensity, with 0 < α < 1. It will be appreciated that the greater the intensity, the smaller α, meaning fewer W pixel values are fused; the lower the intensity, the larger α, meaning more W pixel values are fused.
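The text above only requires α to lie in (0, 1) and to decrease as the light intensity grows; a linear ramp between the two thresholds is one simple mapping consistent with this, sketched below (the clamping margin is arbitrary):

```python
def correction_coefficient(intensity, th2, th3):
    """Map a light intensity in [th3, th2) to alpha in (0, 1):
    brighter scenes get a smaller alpha (fuse fewer W values),
    dimmer scenes a larger one (fuse more W values)."""
    alpha = (th2 - intensity) / (th2 - th3)   # 0 at th2, 1 at th3
    return min(max(alpha, 1e-3), 1.0 - 1e-3)  # keep strictly inside (0, 1)
```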
When the light intensity lies between the third and second preset thresholds, the processor thus attenuates the excessive brightness increase that full fusion of the white pixels would cause: only part of the white pixel contribution is fused, so the brightness of the generated target image is neither too high nor too low and is closer to the real effect.
When the light intensity is less than the third preset threshold, the processor may generate a target image using flow 4.
The processor may generate first image data according to the pixel values of the white pixel units, generate second image data according to the pixel values of the color pixel units, and fuse the first image data with the second image data to obtain the target image.
Under the condition of low light intensity, the pixel value of the white pixel unit can be completely fused into the target image, so that the brightness of the target image in a low-illumination environment can be ensured, and the image sensor can accurately restore the image in the low-illumination environment.
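Taken together, the four flows amount to a threshold ladder. The sketch below summarizes the selection logic; the four callables stand in for the processing pipelines described above and are placeholders, not real APIs:

```python
from typing import Callable
import numpy as np

def generate_target_image(intensity: float,
                          th1: float, th2: float, th3: float,
                          flow1: Callable[[], np.ndarray],
                          flow2: Callable[[], np.ndarray],
                          flow3: Callable[[float], np.ndarray],
                          flow4: Callable[[], np.ndarray]) -> np.ndarray:
    """Select the generation mode from the sensed light intensity
    (th1 > th2 > th3), mirroring flows 1-4 above; flow 3 additionally
    receives the correction coefficient alpha."""
    if intensity >= th1:                     # flow 1: saturated or nearly so
        return flow1()
    if intensity >= th2:                     # flow 2: W only guides texture/color
        return flow2()
    if intensity >= th3:                     # flow 3: partial fusion of W values
        alpha = (th2 - intensity) / (th2 - th3)   # ramp from the earlier sketch
        return flow3(min(max(alpha, 1e-3), 1.0 - 1e-3))
    return flow4()                           # flow 4: full fusion (alpha = 1)
```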
The image sensor in the embodiment of the application can further comprise a light intensity judging module, wherein the light intensity judging module can be used for determining the intensity of the light signal sensed by the pixel unit array so as to facilitate the processor to determine the generation mode of generating the target image.
The fusion process in the embodiment of the present application is described below with reference to fig. 15 to 18.
Fig. 15 shows the case of light saturation or near saturation; the fusion process of fig. 15 corresponds to flow 1. In this case the pixel values of the color pixel units can be used directly, and the target image of the photographed object is generated by the reconstruction-mosaic algorithm.
Fig. 16 shows the case of sufficient light; the fusion process of fig. 16 corresponds to flow 2. In this case the W pixel units can be separated from the pixel unit array and interpolated into a 4×4 full-W pixel array, which provides texture information and color guesses that can be referenced while the color pixel units generate the target image through the reconstruction-mosaic algorithm.
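The separation-plus-interpolation step can be sketched directly: mask out the color positions, then fill each one with the average of its axial white neighbors (a minimal scheme; the interpolation algorithm is not fixed by the present application):

```python
import numpy as np

def full_w_plane(mosaic, is_w):
    """Separate the W samples from the RGBW mosaic (`is_w` marks white
    positions) and fill each color position with the mean of its axial
    white neighbors. Because the white units sit on a checkerboard,
    every color position has white pixels on its available sides."""
    w = np.where(is_w, mosaic.astype(float), np.nan)
    p = np.pad(w, 1, constant_values=np.nan)
    neighbors = np.stack([p[:-2, 1:-1], p[2:, 1:-1],   # up, down
                          p[1:-1, :-2], p[1:-1, 2:]])  # left, right
    return np.where(is_w, w, np.nanmean(neighbors, axis=0))
```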
Fig. 17 shows the case of insufficient light; the fusion process of fig. 17 corresponds to flow 3, in which part of the W pixel contribution is fused. The white pixel units in the pixel unit array are separated to obtain the discrete W samples, which are then interpolated by an interpolation algorithm into a 4×4 full-W pixel array. The full-W pixel array provides texture information and color guesses while the color pixel units generate the second image through the reconstruction-mosaic algorithm. The light intensity determination module determines the light intensity information, which is used to set the correction coefficient α. The full-W pixel array is used to generate the first image, whose pixel values are multiplied by α to obtain the corrected first image; fusing the corrected first image with the second image yields the target image of the photographed object.
Fig. 18 shows the case of very low light; the fusion process of fig. 18 corresponds to flow 4, in which all of the W pixel contribution is fused. The white pixel units are separated to obtain the discrete W samples, which are interpolated into a 4×4 full-W pixel array that provides texture information and color guesses while the color pixel units generate the second image through the reconstruction-mosaic algorithm. The light intensity determination module sets the correction coefficient α = 1, the full-W pixel array is used to generate the first image, and fusing the first image with the second image yields the target image of the photographed object.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the part of it contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media capable of storing program code.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

The pixel unit array is located below the filter unit array, and the pixel units in the pixel unit array correspond one-to-one with the filter units in the plurality of filter unit groups; the white pixel units in the pixel unit array are configured to receive the light signals passing through their corresponding white filter units, and the color pixel units are configured to receive the light signals passing through their corresponding color filter units; the light intensity of the light signals sensed by the pixel unit array is used to determine how the pixel values of the color pixel units and the pixel values of the white pixel units generate a target image of a photographed object;
CN202010724148.5A, priority date 2020-05-15, filing date 2020-07-24: Image sensors and electronics, Active, CN111629140B (en)

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
CN202010410639 | 2020-05-15 | |
CN202010410639.2 | 2020-05-15 | |

Publications (2)

Publication Number | Publication Date
CN111629140A (en) | 2020-09-04
CN111629140B (en) | 2025-03-18

Family

ID=72202804

Family Applications (11)

Application Number | Status | Publication | Priority Date | Filing Date | Title
CN202021303789.5U | Active | CN212752379U (en) | | | Image sensor and electronic device
CN202021297709.XU | Active | CN212435794U (en) | | | Image sensor and electronic device
CN202021297708.5U | Active | CN212435793U (en) | | | Image sensor and electronic device
CN202010635332.2A | Active | CN111756972B (en) | 2020-05-15 | 2020-07-03 | Image sensors and electronics
CN202010636571.XA | Pending | CN111756973A (en) | 2020-05-15 | 2020-07-03 | Image sensors and electronics
CN202010637147.7A | Active | CN111756974B (en) | 2020-05-15 | 2020-07-03 | Image sensors and electronics
CN202010708333.5A | Active | CN111614886B (en) | 2020-05-15 | 2020-07-22 | Image sensor and electronic device
CN202021508422.7U | Active | CN212785522U (en) | | | Image sensor and electronic device
CN202010724146.6A | Pending | CN111654615A (en) | 2020-05-15 | 2020-07-24 | Image sensors and electronics
CN202010724148.5A | Active | CN111629140B (en) | 2020-05-15 | 2020-07-24 | Image sensors and electronics (this application)
CN202021510460.6U | Active | CN212752389U (en) | | | Image sensor and electronic device


Country Status (2)

Country | Link
CN (11) | CN212752379U (en)
WO (1) | WO2021227250A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
CN212752379U (en) * | 2020-05-15 | 2021-03-19 | Shenzhen Goodix Technology Co., Ltd. | Image sensor and electronic device
CN112235494B (en) * | 2020-10-15 | 2022-05-20 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, control method, imaging device, terminal, and readable storage medium
CN112312097B (en) * | 2020-10-29 | 2023-01-24 | Vivo Mobile Communication Co., Ltd. | Sensor
CN114584725A (en) * | 2020-11-30 | 2022-06-03 | Huawei Technologies Co., Ltd. | Image sensor and imaging device
CN114650343A (en) * | 2020-12-15 | 2022-06-21 | xFusion Digital Technologies Co., Ltd. | Image sensor and imaging device
CN112822466A (en) * | 2020-12-28 | 2021-05-18 | Vivo Mobile Communication Co., Ltd. | Image sensor, camera module and electronic equipment
CN113037980A (en) * | 2021-03-23 | 2021-06-25 | Beijing Lynxi Technology Co., Ltd. | Pixel sensing array and vision sensor
CN115225832A (en) * | 2021-04-21 | 2022-10-21 | Hisense Group Holdings Co., Ltd. | Image acquisition device and image encryption processing method, device and medium
CN113540138B (en) * | 2021-06-03 | 2024-03-12 | Orbbec Inc. | Multispectral image sensor and imaging module thereof
CN113676652B (en) * | 2021-08-25 | 2023-05-26 | Vivo Mobile Communication Co., Ltd. | Image sensor, control method, control device, electronic apparatus, and storage medium
CN113676651B (en) * | 2021-08-25 | 2023-05-26 | Vivo Mobile Communication Co., Ltd. | Image sensor, control method, control device, electronic apparatus, and storage medium
CN113852797A (en) * | 2021-09-24 | 2021-12-28 | Kunshan Q Technology Co., Ltd. | Color filter array, image sensor, and camera module
KR20230046816A (en) * | 2021-09-30 | 2023-04-06 | SK hynix Inc. | Image sensing device
CN114125318B (en) * | 2021-11-12 | 2024-08-02 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, camera module, electronic device, image generation method and device
CN114125240A (en) * | 2021-11-30 | 2022-03-01 | Vivo Mobile Communication Co., Ltd. | Image sensor, camera module, electronic device and shooting method
CN114363486B (en) * | 2021-12-14 | 2024-08-02 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, camera module, electronic device, image generation method and device
CN114157795B (en) * | 2021-12-14 | 2024-08-16 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, camera module, electronic device, image generation method and device
CN114494706B (en) * | 2022-01-24 | 2025-10-03 | Huizhou TVT Digital Technology Co., Ltd. | Image sensor signal-to-noise ratio evaluation method based on specific flat area
CN115022562A (en) * | 2022-05-25 | 2022-09-06 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, camera and electronic device
CN118412399A (en) * | 2022-05-31 | 2024-07-30 | Shenzhen Jufei Optoelectronics Co., Ltd. | Photoelectric sensor and packaging method thereof
CN115696078B (en) * | 2022-08-01 | 2023-09-01 | Honor Device Co., Ltd. | Color filter array, image sensor, camera module and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
JP2014161022A (en) * | 2014-03-12 | 2014-09-04 | Sony Corp | Solid-state imaging apparatus, signal processing method of the same and imaging apparatus
CN110649056A (en) * | 2019-09-30 | 2020-01-03 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, camera assembly and mobile terminal
CN212785522U (en) * | 2020-05-15 | 2021-03-23 | Shenzhen Goodix Technology Co., Ltd. | Image sensor and electronic device

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
US7978240B2 (en) * | 2005-10-03 | 2011-07-12 | Konica Minolta Photo Imaging, Inc. | Enhancing image quality imaging unit and image sensor
KR100772910B1 (en) * | 2006-06-26 | 2007-11-05 | Samsung Electro-Mechanics Co., Ltd. | Digital camera module
KR100967651B1 (en) * | 2008-03-12 | 2010-07-07 | Dongbu HiTek Co., Ltd. | CMOS image sensor device and its formation method
JP4626706B2 (en) * | 2008-12-08 | 2011-02-09 | Sony Corporation | Solid-state imaging device, signal processing method for solid-state imaging device, and imaging device
TWI422020B (en) * | 2008-12-08 | 2014-01-01 | Sony Corp | Solid-state imaging device
JP4683121B2 (en) * | 2008-12-08 | 2011-05-11 | Sony Corporation | Solid-state imaging device, signal processing method for solid-state imaging device, and imaging device
KR20110075397A (en) * | 2009-12-28 | 2011-07-06 | Dongbu HiTek Co., Ltd. | Method for improving the sensitivity of an image sensor
CN104412581B (en) * | 2012-07-06 | 2016-04-13 | Fujifilm Corporation | Color image sensor and imaging device
CN104412580B (en) * | 2012-07-06 | 2016-04-06 | Fujifilm Corporation | Color image sensor and imaging device
JP6012375B2 (en) * | 2012-09-28 | 2016-10-25 | MegaChips Corporation | Pixel interpolation processing device, imaging device, program, and integrated circuit
JP2015008343A (en) * | 2013-06-24 | 2015-01-15 | Konica Minolta, Inc. | Imaging device, and method for forming imaging image
US9692992B2 (en) * | 2013-07-01 | 2017-06-27 | Omnivision Technologies, Inc. | Color and infrared filter array patterns to reduce color aliasing
CN104241309B (en) * | 2014-09-19 | 2018-01-02 | Shanghai IC R&D Center Co., Ltd. | CMOS image pixel array for simulating random pixel effect
US9479745B2 (en) * | 2014-09-19 | 2016-10-25 | Omnivision Technologies, Inc. | Color filter array with reference pixel to reduce spectral crosstalk
TWI552594B (en) * | 2014-10-27 | 2016-10-01 | Novatek Microelectronics Corp. | Color filter array for image sensing device and manufacturing method thereof
CN104581100A (en) * | 2015-02-12 | 2015-04-29 | Zhang Lijing | Color filter array and image processing method
CN104735327B (en) * | 2015-04-08 | 2019-07-26 | Lenovo (Beijing) Co., Ltd. | Imaging device and imaging method
CN105282529B (en) * | 2015-10-22 | 2018-01-16 | Zhejiang Uniview Technologies Co., Ltd. | Digital wide dynamic method and device based on RAW space
CN105578071B (en) * | 2015-12-18 | 2018-03-20 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Imaging method of image sensor, imaging device and electronic device
CN105578078B (en) * | 2015-12-18 | 2018-01-19 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, imaging device, mobile terminal and imaging method
CN105516697B (en) * | 2015-12-18 | 2018-04-17 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, imaging device, mobile terminal and imaging method
JP6461429B2 (en) * | 2015-12-18 | 2019-01-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, control method, and electronic apparatus
CN105516700B (en) * | 2015-12-18 | 2018-01-19 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Imaging method of image sensor, imaging device and electronic device
CN105430359B (en) * | 2015-12-18 | 2018-07-10 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Imaging method, image sensor, imaging device and electronic device
JP2017175500A (en) * | 2016-03-25 | 2017-09-28 | Seikei Gakuen | Color image pickup method, color image interpolation processing method, and imaging apparatus
CN107105140B (en) * | 2017-04-28 | 2020-01-24 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Dual-core focusing image sensor, focusing control method and imaging device thereof
CN108305883A (en) * | 2018-01-30 | 2018-07-20 | Dehuai Semiconductor Co., Ltd. | Image sensor
JP7349806B2 (en) * | 2018-03-28 | 2023-09-25 | Blackmagic Design Pty Ltd | Image processing method and filter array
CN109003995A (en) * | 2018-08-10 | 2018-12-14 | Dehuai Semiconductor Co., Ltd. | Image sensor, electronic device and manufacturing method thereof
CN109905681B (en) * | 2019-02-01 | 2021-07-16 | Huawei Technologies Co., Ltd. | Image sensor, method for acquiring image data therefrom, and imaging device
CN110649057B (en) * | 2019-09-30 | 2021-03-05 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, camera assembly and mobile terminal


Also Published As

Publication Number | Publication Date
CN111629140A (en) | 2020-09-04
CN111756972A (en) | 2020-10-09
CN111756974B (en) | 2025-03-18
CN212435793U (en) | 2021-01-29
CN111654615A (en) | 2020-09-11
CN111756974A (en) | 2020-10-09
CN111756973A (en) | 2020-10-09
CN212435794U (en) | 2021-01-29
CN212785522U (en) | 2021-03-23
WO2021227250A1 (en) | 2021-11-18
CN212752379U (en) | 2021-03-19
CN212752389U (en) | 2021-03-19
CN111756972B (en) | 2025-03-18
CN111614886B (en) | 2021-10-19
CN111614886A (en) | 2020-09-01

Similar Documents

Publication | Title
CN111629140B (en) | Image sensors and electronics
CN213279832U (en) | Image sensor, camera and terminal
JP4421793B2 (en) | Digital camera
JP4971323B2 (en) | Color and panchromatic pixel processing
KR100827238B1 (en) | Method and apparatus for displaying images for high quality images
JP5345944B2 (en) | Low resolution image generation
EP1530873B1 (en) | One chip, low light level color camera
JP5462345B2 (en) | Image sensor with improved light sensitivity
US8587681B2 (en) | Extended depth of field for image sensor
EP2179591B1 (en) | Multiple component readout of image sensor
JP5330258B2 (en) | Processing images with color and panchromatic pixels
EP1977614B1 (en) | Image sensor with improved light sensitivity
US6924841B2 (en) | System and method for capturing color images that extends the dynamic range of an image sensor using first and second groups of pixels
US8405748B2 (en) | CMOS image sensor with improved photodiode area allocation
EP2415254B1 (en) | Exposing pixel groups in producing digital images
CN112118378A (en) | Image acquisition method and device, terminal and computer-readable storage medium
US9332199B2 (en) | Imaging device, image processing device, and image processing method
WO2008027327A1 (en) | Method, imager and system providing paired-bayer color filter array and interlaced readout
EP2502422A1 (en) | Sparse color pixel array with pixel substitutes
CN114679551A (en) | Solid-state imaging device, signal processing method for solid-state imaging device, and electronic apparatus
CN102948152B (en) | Imaging device and imaging method
WO2022073364A1 (en) | Image obtaining method and apparatus, terminal, and computer readable storage medium
JP2009303020A (en) | Image capturing apparatus and defective pixel correcting method

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
