Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the present application provide an image sensor and an electronic device, wherein the image sensor is applied to the electronic device, and the electronic device includes a smartphone, a tablet (Pad), a wearable device, a PC, a robot, a driving recorder, a smart door viewer, and the like. The electronic device includes a camera, the image sensor is integrated in the camera, and the camera captures images by using the integrated image sensor.
Before the image sensor in the embodiments of the present application is described in detail, the operating principle of a conventional CMOS image sensor is briefly described below to facilitate understanding of the solution in the embodiments of the present application. Fig. 1 is a schematic structural diagram of an image sensor in the prior art, and fig. 2 is a schematic diagram of a pixel arrangement in the prior art.
In fig. 1, an image sensor 100 includes a plurality of pixel units and a microlens 110 disposed over each pixel unit. Each pixel unit includes a color filter and a photodiode 120, and the photodiode 120 is located on a side of the corresponding pixel unit away from the microlens 110. The color filter may be a red filter 131, a green filter 132, or a blue filter 133; correspondingly, the pixel units include red pixel units (R), green pixel units (G), and blue pixel units (B). The plurality of pixel units may be arranged as in fig. 2, and the arrangement shown in fig. 2 is also referred to as a Bayer arrangement. A metal line 130 is also provided on the side of the photodiode away from the microlens.
After passing through the microlens, the incident light is filtered by the color filter of the corresponding color, and the photodiode then performs photoelectric conversion on the received monochromatic light of that color. In fig. 1, a single pixel unit can only receive light of one of the blue, green, or red wavelengths, and the color of the photographed object is finally restored by interpolating among the pixels corresponding to adjacent pixel units.
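As a rough illustration of such interpolation (commonly called demosaicing), the following sketch averages the four green neighbors of a red or blue site in a Bayer mosaic; the bilinear scheme, the function name, and the sample values are assumptions made for illustration only, not part of the prior-art sensor or of this application.

```python
import numpy as np

def bilinear_green_at(mosaic, row, col):
    """Estimate the green value at a red/blue site of a Bayer mosaic
    by averaging the four green neighbors (simple bilinear demosaicing).
    `mosaic` is a 2-D array of raw sensor values."""
    h, w = mosaic.shape
    neighbors = [(row - 1, col), (row + 1, col), (row, col - 1), (row, col + 1)]
    values = [mosaic[r, c] for r, c in neighbors if 0 <= r < h and 0 <= c < w]
    return sum(values) / len(values)

# Usage: a tiny 4x4 Bayer mosaic (R G / G B repeating); estimate green at a red site.
raw = np.array([[10, 200, 12, 210],
                [190, 30, 205, 28],
                [11, 198, 13, 202],
                [185, 31, 195, 29]], dtype=float)
print(bilinear_green_at(raw, 2, 2))  # green estimate at the red pixel (2, 2) -> 200.0
```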
The embodiments of the present application provide an image sensor, which includes at least one white pixel unit and a first microlens located on the white pixel unit. The image sensor further includes at least one second pixel unit and a second microlens located on the second pixel unit. The first microlens and the second microlens are disposed in the same layer, and the white pixel unit has the same size as the second pixel unit, for example, the same size in a second direction (horizontal direction). The second pixel unit may be at least one of a red pixel unit, a green pixel unit, or a blue pixel unit. The improvement in the embodiments of the present application mainly lies in the white pixel unit, which will be described in detail below; the structure of the second pixel unit is consistent with that of an existing pixel unit of the corresponding color.
It should be noted that the white pixel unit has no color filter, so that light of all bands/wavelengths in the incident light can enter.
In one embodiment, the white pixel units and the second pixel units may be disposed alternately, for example one white pixel unit, then one second pixel unit, and so on. For example, one white pixel unit (W), one red pixel unit (R), one white pixel unit (W), one green pixel unit (G), one white pixel unit (W), one blue pixel unit (B), and so on may be provided.
In one embodiment, since the human eye is sensitive to green, the second pixel unit may be set as a green pixel unit. Fig. 3 is a schematic arrangement diagram of white pixel units and second pixel units (green pixel units) provided in the embodiment of the present application, in which the green pixel units and the white pixel units are arranged alternately.
It should be noted that the image sensor in the embodiments of the present application may also adopt any other arrangement that includes at least one white pixel unit and at least one second pixel unit, where the second pixel unit is at least one of a red pixel unit, a green pixel unit, or a blue pixel unit, and the pixel units may be arranged in any order. Since the second pixel unit may be any one of a red, green, or blue pixel unit, there are many possible combinations of at least one white pixel unit and at least one second pixel unit, and each combination has many possible arrangements, which are not enumerated here.
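Purely as an illustration of such interleaved layouts, the hypothetical sketch below builds one row in which white units alternate with second pixel units of chosen colors; the function name and string labels are assumptions, and the real arrangement is fixed by the sensor hardware.

```python
def interleaved_row(second_units, width):
    """Build one row that alternates white ('W') units with the given
    second pixel units, e.g. ['R', 'G', 'B'] -> W R W G W B W R ...
    Purely illustrative; the real layout is fixed in silicon."""
    row = []
    for i in range(width):
        row.append('W' if i % 2 == 0 else second_units[(i // 2) % len(second_units)])
    return row

print(interleaved_row(['R', 'G', 'B'], 8))  # ['W','R','W','G','W','B','W','R']
print(interleaved_row(['G'], 8))            # white/green alternation as in fig. 3
```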
The white pixel unit in the image sensor of the embodiment of the present application will be described in detail below.
Fig. 4 is a schematic structural diagram of an image sensor provided in an embodiment of the present application, fig. 5 is a schematic diagram of the beam splitting prism of the white pixel unit provided in the embodiment of the present application, fig. 6 is a schematic diagram of the gap layer and the photodiode layer of the white pixel unit provided in the embodiment of the present application, fig. 7 is a schematic diagram of refraction and reflection of light in the beam splitting prism, fig. 8 is a schematic diagram of light after being emitted from the beam splitting prism, and fig. 9 is a schematic diagram of the photodiode layer receiving the corresponding monochromatic light.
Referring to fig. 4 to 6, the image sensor 200 includes at least one white pixel unit 210 and a first microlens 220 on the white pixel unit. The white pixel unit 210 includes, arranged sequentially along a first direction: a beam splitting prism 211, a gap layer 212, and a photodiode layer 213. The first direction may be the vertical direction shown in fig. 4.
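For readability, the following toy sketch merely lists the layer stack of the white pixel unit 210 described above, from the first microlens down to the photodiode layer; the class and field names are illustrative assumptions, not terms of the application.

```python
from dataclasses import dataclass

@dataclass
class WhitePixelUnit:
    """Layers of the white pixel unit, top to bottom along the first direction."""
    microlens: str = "first microlens 220"
    prism: str = "beam splitting prism 211"
    gap: str = "gap layer 212 (air, vacuum, or another uniform medium)"
    photodiodes: str = "photodiode layer 213 (several independent first photodiodes)"

    def stack(self):
        # Return the layers in the order the incident light traverses them.
        return [self.microlens, self.prism, self.gap, self.photodiodes]

for layer in WhitePixelUnit().stack():
    print(layer)
```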
The beam splitting prism 211 may be a glass prism or any other element capable of dispersing light. The beam splitting prism 211 is configured to receive incident light entering from the first microlens 220, where the incident light may be natural light; after refraction, reflection, and splitting in the beam splitting prism, monochromatic light of a plurality of different wavelengths is obtained, that is, the light is dispersed and then emitted from the beam splitting prism 211.
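To give a feel for how such a prism separates wavelengths, the sketch below applies Snell's law together with a simple Cauchy dispersion model n(λ) = A + B/λ²; the coefficients are typical crown-glass values assumed for illustration, not material parameters disclosed in this application.

```python
import math

def refractive_index(wavelength_nm, A=1.5046, B=4200.0):
    """Cauchy approximation n(lambda) = A + B/lambda^2 (lambda in nm);
    A and B are assumed crown-glass-like coefficients."""
    return A + B / (wavelength_nm ** 2)

def refraction_angle_deg(incidence_deg, n1, n2):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2)."""
    sin_t2 = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(sin_t2))

# Air -> prism at 30 degrees: longer wavelengths see a slightly lower index,
# so they bend a little less, which is what ultimately spreads the colors.
for wl in (450, 550, 650):  # blue, green, red (nm)
    theta = refraction_angle_deg(30.0, 1.0, refractive_index(wl))
    print(f"{wl} nm: n = {refractive_index(wl):.4f}, refraction angle = {theta:.3f} deg")
```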
In one embodiment, the beam splitting prism 211 includes a light incident surface 2111 and a light emitting surface 2112. The light incident surface 2111 is located at a side close to the first microlens 220, and the light emitting surface 2112 is located at a side far from the first microlens 220. The light incident surface 2111 receives incident light entering from the first microlens 220, and after the light is refracted, reflected, and split, the obtained monochromatic light of a plurality of different wavelengths is emitted from the light emitting surface 2112.
The light incident surface 2111 may be concave or flat. In one embodiment, the light incident surface 2111 is concave. Since the first microlens 220 has a light-condensing effect, the concave surface can scatter the condensed light.
In one embodiment, the curvature of the concave surface is related to the incident angle of the light entering the first microlens 220. For example, the curvature of the concave surface may be set such that, after the light condensed by the first microlens is refracted for the first time at the concave surface, the refraction angle matches the incident angle of the light entering the first microlens 220.
In one embodiment, the curvature of the concave surface may be the same as that of the first microlens 220. In this way, the concave surface can counteract the light-condensing effect of the first microlens 220.
In one embodiment, the curvatures of the concave light incident surfaces 2111 of different white pixel units may be different. It can be understood that the curvature of the concave surface of the light incident surface 2111 can be adjusted.
In one embodiment, the light incident surface 2111 is rotatably connected to the main body of the beam splitting prism. When the incident angle of the light entering the first microlens 220 changes, the light incident surface 2111 is rotated so that the angle of the light ray after its first refraction at the light incident surface 2111 remains unchanged. Please refer to fig. 7.
It can be understood that, whatever the incident angle, the light incident surface 2111 can be rotated so that the angle of the light ray after the first refraction remains constant for incident light of different incident angles. Because this first-refraction angle is kept unchanged, the angle of the monochromatic light emitted from the light emitting surface 2112 also remains unchanged, so that neither the height of the beam splitting prism nor the gap layer needs to be changed.
In an embodiment, the light emitting surface 2112 is disposed obliquely. As shown in fig. 4 and fig. 5, the light emitting surface 2112 is tilted so that the monochromatic light emitted from the light emitting surface 2112 (after the second refraction) can be incident on the photodiode layer 213 vertically or nearly vertically. In this way, the photodiode layer 213 can be disposed centered or nearly centered below the first microlens 220, which facilitates the overall design of the image sensor and avoids affecting the second pixel unit.
In an embodiment, the inclination angle of the light emitting surface 2112 is related to the wavelengths of the plurality of monochromatic lights separated by the beam splitting prism 211. For example, assuming that the incident light is natural light, the inclination angle of the light emitting surface 2112 may be set so that the monochromatic light of the central wavelength, after being refracted at the light emitting surface 2112, exits perpendicular to the plane of the first microlens 220; equivalently, the monochromatic light of the central output wavelength exits perpendicular to the photodiode layer 213 at the bottom. The monochromatic light of the central wavelength may be, for example, 550 nm light, corresponding to green light.
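Under a simplified two-dimensional geometry, the tilt that sends the central wavelength straight down can be derived from Snell's law at the tilted exit face: if the ray travels at angle φ from the vertical inside the prism and the exit-face normal is tilted by t beyond the ray (t > φ), a vertical output requires sin(t) = n·sin(t − φ). The sketch below solves this by bisection; the refractive index and internal angle are assumed example values, not parameters of the application.

```python
import math

def exit_tilt_for_vertical_output(n_prism, internal_angle_deg, tol=1e-6):
    """Find the exit-face tilt (normal measured from vertical) that makes a ray
    travelling inside the prism at `internal_angle_deg` from vertical leave the
    glass exactly vertically, using sin(t) = n*sin(t - phi) from Snell's law.
    Bisection search; angles in degrees. Index and angle are assumed values."""
    phi = math.radians(internal_angle_deg)

    def f(t):
        return math.sin(t) - n_prism * math.sin(t - phi)

    lo, hi = phi, math.radians(89.0)   # the tilt must exceed the internal angle here
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(mid) > 0:                 # not tilted far enough yet
            lo = mid
        else:
            hi = mid
    return math.degrees((lo + hi) / 2.0)

# e.g. n = 1.5185 at ~550 nm, ray 20 degrees from vertical inside the prism:
print(exit_tilt_for_vertical_output(1.5185, 20.0))  # about 50 degrees (illustrative)
```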
In one embodiment, the light emitting surface 2112 is disposed obliquely and is movably disposed on the main body of the beam splitting prism. When the incident angle of the light entering the first microlens 220 changes, the light emitting surface 2112 is moved to change its inclination angle, so that the emission angle of a given monochromatic light after the second refraction at the light emitting surface 2112 remains the same. Because the emission angle of the same monochromatic light after the second refraction at the light emitting surface 2112 remains the same, neither the height of the beam splitting prism nor the gap layer needs to be changed.
It can be understood that the embodiments of the present application provide two ways to keep the emission angle of the same monochromatic light after the second refraction at the light emitting surface 2112 unchanged: the first is to adjust the light incident surface 2111, and the second is to move the light emitting surface 2112. In an embodiment, the light incident surface 2111 and the light emitting surface 2112 may also be adjusted at the same time to achieve the same effect.
The light incident surface 2111 and the light emitting surface 2112 are disposed opposite to each other, and the height of the beam splitting prism 211 is set such that the incident light from the first microlens 220 is refracted for the first time at the light incident surface 2111, is totally reflected once at one side surface of the beam splitting prism 211, and is then refracted for the second time at the light emitting surface 2112 and emitted.
It should be noted that, in all the above adjustments, care must also be taken to avoid total reflection at the light emitting surface 2112.
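A quick way to express this constraint is the critical-angle condition sin(θc) = n_gap / n_prism: any split ray must strike the light emitting surface 2112 at less than θc from its normal in order to leave the prism. The indices used in the sketch below are assumed values for illustration only.

```python
import math

def exits_glass(incidence_deg, n_prism, n_gap=1.0):
    """Return True if a ray hitting the exit face at `incidence_deg` (measured
    from the face normal) actually leaves the prism, i.e. stays below the
    critical angle asin(n_gap / n_prism). Assumed indices, for illustration."""
    critical = math.degrees(math.asin(n_gap / n_prism))
    return incidence_deg < critical

# With n ~ 1.52 the critical angle is about 41 degrees, so the exit-face tilt
# must keep every split wavelength's incidence below that value.
for angle in (25, 35, 45):
    print(angle, exits_glass(angle, 1.52))
```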
Referring to fig. 4, the gap layer 212 may be an air gap layer, a vacuum gap layer, or a gap layer of another uniform medium through which light is transmitted. The plurality of monochromatic lights emitted from the light emitting surface 2112 propagate in the gap layer 212, and after propagating for a certain distance, the monochromatic lights of different wavelengths/bands/spectra become separated.
It can be understood that a plurality of monochromatic lights are emitted from each refraction point on the light emitting surface 2112. Since the refraction angle differs between monochromatic lights but is the same for the same monochromatic light, the monochromatic lights emitted from each refraction point separate after propagating for a certain distance; for example, the red monochromatic light separates from the green monochromatic light, and the red monochromatic light emitted from every point arrives within the same region. The gap layer 212 therefore serves to let the plurality of monochromatic lights propagate far enough that they are separated when they reach the photodiode layer 213 at the bottom, so that the plurality of photodiodes in the photodiode layer 213 can each independently receive the corresponding monochromatic light. See in particular fig. 8 and fig. 9.
The area of a first cross section of the gap layer 212 close to the beam splitting prism 211 is smaller than the area of a second cross section close to the photodiode layer 213, the first cross section and the second cross section being parallel. As can be seen from fig. 4, the gap layer 212 widens from the side near the beam splitting prism 211 to the side near the photodiode layer 213.
The height of the gap layer 212 is set such that the plurality of monochromatic lights propagating in the gap layer are separated at the side of the gap layer 212 close to the photodiode layer 213.
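As a rough numerical sketch of this height constraint, rays leaving the prism at different angles drift apart horizontally by h·tan(θ) while crossing a gap of height h, so a minimum height follows from the separation required at the photodiode layer. The exit angles and the 0.2 µm separation used below are assumptions for illustration only.

```python
import math

def lateral_offset(gap_height_um, exit_angle_deg):
    """Horizontal drift of a ray crossing a gap of the given height:
    x = h * tan(theta), with theta measured from the first (vertical) direction."""
    return gap_height_um * math.tan(math.radians(exit_angle_deg))

def min_gap_height(required_separation_um, angle_a_deg, angle_b_deg):
    """Smallest gap-layer height at which two rays leaving the prism at the
    given angles are at least `required_separation_um` apart when they reach
    the photodiode layer. All numbers used below are assumptions."""
    spread_per_um = abs(math.tan(math.radians(angle_a_deg)) -
                        math.tan(math.radians(angle_b_deg)))
    return required_separation_um / spread_per_um

# Rays assumed to exit at 8 deg (blue) and 12 deg (red), photodiodes 0.2 um apart:
h = min_gap_height(0.2, 8.0, 12.0)
print(f"gap height ~ {h:.2f} um; red-ray drift at that depth: "
      f"{lateral_offset(h, 12.0):.2f} um")
```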
The photodiode layer 213 includes a plurality of independently spaced first photodiodes arranged along a second direction, as shown in fig. 8, the first direction being different from the second direction. For example, the first direction is the vertical direction shown in fig. 4, and the second direction is the horizontal direction shown in fig. 4.
The length of the photodiode layer 213 in the second direction is greater than the length of the side of the gap layer 212 close to the photodiode layer 213. This ensures that the plurality of monochromatic lights from the gap layer 212 are all received by the plurality of first photodiodes in the photodiode layer 213 and do not escape the photodiode layer 213 and affect other devices of the image sensor.
Each of the plurality of independent first photodiodes in the photodiode layer 213 is configured to receive a corresponding monochromatic light, and the wavelength of the monochromatic light that each first photodiode receives increases or decreases monotonically along the second direction. If the second direction is horizontally rightward, the wavelength received by each successive first photodiode increases; if the second direction is horizontally leftward, it decreases. It can be understood that, from left to right in fig. 8, the plurality of independent first photodiodes may respectively receive violet, blue, cyan, green, yellow, orange, and red light; correspondingly, the number of first photodiodes is seven. In other embodiments, the light may be divided into more or fewer bands, and correspondingly the number of first photodiodes may be larger or smaller. As shown in fig. 8, the leftmost two lines are blue, the middle two lines are yellow, and the rightmost two lines are red.
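A toy lookup of this band-to-photodiode assignment might look as follows; the seven band edges are rounded textbook values assumed for illustration, not measured characteristics of the sensor.

```python
# First-photodiode index (left to right along the second direction) versus the
# spectral band it receives, matching the seven-band split described above.
BANDS = [
    ("violet", 380, 450),
    ("blue",   450, 485),
    ("cyan",   485, 500),
    ("green",  500, 565),
    ("yellow", 565, 590),
    ("orange", 590, 625),
    ("red",    625, 740),
]

def photodiode_for_wavelength(wavelength_nm):
    """Return the index of the first photodiode that would receive a given
    wavelength, assuming wavelengths increase along the second direction."""
    for index, (_, low, high) in enumerate(BANDS):
        if low <= wavelength_nm < high:
            return index
    raise ValueError("wavelength outside the sensed range")

print(photodiode_for_wavelength(550))  # 3 -> the green photodiode
```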
Thus, the incident light passes through the first microlens, enters the beam splitting prism in the white pixel unit and is refracted, then reaches the side wall of the beam splitting prism and is totally reflected, and then enters the air gap layer from the bottom of the beam splitting prism. After propagating a certain distance in the air gap layer, the monochromatic lights of different spectra/wavelengths/bands separate before reaching the photodiode layer at the bottom. The plurality of first photodiodes in the photodiode layer independently receive the corresponding separated monochromatic lights of different spectra/wavelengths/bands. In this way, the white pixel unit can sense monochromatic light over the full wave band for back-end image processing. That is, the white pixel unit used in the image sensor of the present application can sense the full wave band/full spectrum, so that, at the same pixel size, the light sensitivity of the image sensor is greatly enhanced; meanwhile, since monochromatic light of the full wave band can be sensed, color precision can be improved, the accuracy of color restoration is improved, and the precision of automatic white balance is improved.
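As one hypothetical example of how such full-band readings could feed back-end processing, the sketch below derives gray-world white-balance gains from the seven per-band signals of a white pixel unit; the grouping of bands into R/G/B channels and the gray-world rule are assumptions, not the processing pipeline of this application.

```python
def white_balance_gains(band_signals, band_to_channel):
    """Estimate simple gray-world white-balance gains from per-band signals.
    `band_signals` are the seven photodiode readings and `band_to_channel`
    maps each band index to 'R', 'G' or 'B'. Generic gray-world sketch only."""
    sums = {"R": 0.0, "G": 0.0, "B": 0.0}
    for i, value in enumerate(band_signals):
        sums[band_to_channel[i]] += value
    mean = sum(sums.values()) / 3.0
    return {ch: mean / total if total else 1.0 for ch, total in sums.items()}

# Seven readings (violet..red) grouped into coarse R/G/B channels:
signals = [0.05, 0.30, 0.20, 0.55, 0.35, 0.25, 0.40]
mapping = {0: "B", 1: "B", 2: "G", 3: "G", 4: "G", 5: "R", 6: "R"}
print(white_balance_gains(signals, mapping))
```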
As shown in fig. 4, in addition to the white pixel unit 210 and the first microlens 220 over the white pixel unit 210, the image sensor in the embodiment of the present application includes at least one second pixel unit 230 and a second microlens 240 over the second pixel unit 230. The first microlens 220 and the second microlens 240 have the same size and the same function, and are disposed in the same layer. Correspondingly, the white pixel unit 210 and the second pixel unit 230 have the same size. The second pixel unit is at least one of a red pixel unit, a green pixel unit, or a blue pixel unit.
In fig. 4, the pixel units on both the left and right sides of the white pixel unit 210 are second pixel units, for example both green pixel units.
Each second pixel unit 230 includes, arranged sequentially along the first direction, a color filter 231 of the corresponding color (for example, a green color filter if the second pixel unit is a green pixel unit) and a second photodiode 232.
In one embodiment, the distance from the second photodiode 232 to the second microlens 240 is less than the distance from the gap layer 212 to the first microlens 220, so that the white pixel unit 210 does not affect the second pixel unit; in particular, the portion of the white pixel unit 210 from the gap layer downward does not affect the second pixel unit 230.
In one embodiment, the distance between the second photodiode 232 and the second microlens 240 is less than the distance between the photodiode layer 213 and the first microlens 220, so that the white pixel unit 210 does not affect the second pixel unit; in particular, the photodiode layer 213 of the white pixel unit 210 is prevented from affecting the second pixel unit 230.
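The depth relationships of the last two embodiments can be summarized as a simple consistency check, sketched below with assumed micrometre values; the function name and the numbers are illustrative only.

```python
def layout_is_consistent(depth_second_photodiode_um,
                         depth_gap_layer_um,
                         depth_photodiode_layer_um):
    """Check the constraints described above: measured downward from the
    microlens layer, the second photodiode 232 should sit closer to the
    microlenses than both the gap layer 212 and the photodiode layer 213 of
    the white pixel unit, so the widened lower portion of the white pixel
    unit cannot intrude on the neighboring second pixel unit 230."""
    return (depth_second_photodiode_um < depth_gap_layer_um and
            depth_second_photodiode_um < depth_photodiode_layer_um)

print(layout_is_consistent(1.0, 1.6, 3.0))  # True for these assumed depths
```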
As shown in fig. 9, an embodiment of the present application further provides an electronic device. The electronic device 300 includes a camera 310, and the camera 310 includes the image sensor described in any of the embodiments above. The camera 310 uses the image sensor to sense light and capture images or videos.
In one embodiment, as shown in fig. 9, the electronic device further includes a processor 320 and a memory 330. The processor 320 is electrically connected to the memory 330 and the camera 310. The processor 320 is the control center of the electronic device 300: it connects the various parts of the electronic device using various interfaces and lines, performs the various functions of the electronic device and processes data by running or loading application programs stored in the memory 330 and calling data stored in the memory 330, thereby monitoring the electronic device as a whole. The memory 330 is also used for storing image or video data obtained by the camera 310. The memory 330 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 330 may further include memory located remotely from the processor 320, which may be connected to the electronic device 300 via a network.
In an embodiment, as shown in fig. 10, the electronic device 400 may include the following modules and/or units in addition to the camera 410 (the camera 410 including the image sensor described in any of the embodiments above), the processor 420, and the memory 430.
The RF circuit 440 is used for receiving and transmitting electromagnetic waves and for converting between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF circuit 440 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a subscriber identity module (SIM) card, memory, and so forth. The RF circuit 440 may communicate with various networks, such as the internet, an intranet, or a wireless network, or may communicate with other devices over a wireless network. The wireless network may be a cellular telephone network, a wireless local area network, or a metropolitan area network, and may use various communication standards, protocols, and technologies, including those that currently exist and even those not yet developed.
The input unit 450 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In particular, the input unit 450 may include a touch-sensitive surface 451 as well as other input devices 452. The touch-sensitive surface 451, also referred to as a touch display screen (touchscreen) or touch pad, may collect touch operations by the user on or near it (for example, operations performed on or near the touch-sensitive surface 451 using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection devices according to a preset program. In addition to the touch-sensitive surface 451, the input unit 450 may include other input devices 452, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick.
The display unit 460 may be used to display information input by or provided to the user and the various graphical user interfaces of the electronic device 400, which may be made up of graphics, text, icons, video, and any combination thereof. The display unit 460 may include a display panel 461; optionally, the display panel 461 may be configured in the form of an LCD (liquid crystal display), an OLED (organic light-emitting diode) display, or the like. Further, the touch-sensitive surface 451 may overlay the display panel 461; when a touch operation is detected on or near the touch-sensitive surface 451, it is transmitted to the processor 420 to determine the type of touch event, and the processor 420 then provides a corresponding visual output on the display panel 461 according to the type of touch event. Although in the figures the touch-sensitive surface 451 and the display panel 461 are shown as two separate components implementing the input and output functions, the touch-sensitive surface 451 may also be integrated with the display panel 461 to implement the input and output functions.
The electronic device 400 may also include at least one sensor 470, such as a light sensor, a direction sensor, a proximity sensor, and other sensors. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when the device is stationary; it can be used for applications that recognize the posture of the mobile phone (such as switching between landscape and portrait, related games, and magnetometer posture calibration) and for vibration-recognition related functions (such as a pedometer or tapping). Other sensors, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, may also be configured in the electronic device 400 and are not described in detail here.
The audio circuit 480, a speaker 481, and a microphone 482 may provide an audio interface between the user and the electronic device 400. The audio circuit 480 may convert received audio data into an electrical signal and transmit it to the speaker 481, which converts it into a sound signal for output; conversely, the microphone 482 converts a collected sound signal into an electrical signal, which is received by the audio circuit 480 and converted into audio data; the audio data is then processed by the processor 420 and transmitted via the RF circuit 440 to, for example, another electronic device, or output to the memory 430 for further processing. The audio circuit 480 may also include an earbud jack to allow peripheral headphones to communicate with the electronic device 400.
Through the transmission module 490 (for example, a Wi-Fi module), the electronic device 400 may help the user receive requests, send information, and so on, providing the user with wireless broadband internet access. Although the transmission module 490 is illustrated, it can be understood that it is not an essential part of the electronic device 400 and may be omitted entirely as needed without changing the essence of the invention.
The electronic device 400 also includes a power supply 491 (such as a battery) that supplies power to the various components; in some embodiments, the power supply may be logically coupled to the processor 420 via a power management system that manages charging, discharging, and power consumption. The power supply 491 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and similar components.
The electronic device 400 may further include a bluetooth module, an infrared module, etc., which are not described in detail herein.
The foregoing detailed description has provided an image sensor and an electronic device according to embodiments of the present application, and specific examples are used herein to explain the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core idea of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.