CN114422730A - Image sensor and electronic device - Google Patents

Image sensor and electronic device

Info

Publication number
CN114422730A
CN114422730A
Authority
CN
China
Prior art keywords
light
pixel unit
image sensor
incident
microlens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210050802.8A
Other languages
Chinese (zh)
Other versions
CN114422730B (en)
Inventor
刘义 (Liu Yi)
何硕 (He Shuo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huizhou TCL Mobile Communication Co Ltd
Original Assignee
Huizhou TCL Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huizhou TCL Mobile Communication Co Ltd
Priority to CN202210050802.8A
Publication of CN114422730A
Application granted
Publication of CN114422730B
Status: Active
Anticipated expiration

Abstract

The application discloses an image sensor and an electronic device. The image sensor includes at least one white pixel unit and a first microlens located on the white pixel unit. The white pixel unit is provided, in sequence along a first direction, with a light-splitting prism, a gap layer, and a photodiode layer. The light-splitting prism includes a light-incident surface and a light-exit surface, and the photodiode layer includes a plurality of independent first photodiodes arranged along a second direction. Light enters from the first microlens, passes into the prism through the light-incident surface, and leaves through the light-exit surface; the prism disperses the light into monochromatic components of different wavelengths, which separate spatially after crossing the gap layer and are received by the first photodiodes arranged along the second direction. This greatly improves the photosensitivity of the image sensor and the accuracy of automatic white balance and color restoration.

Description

Image sensor and electronic device
Technical Field
The present application relates to the field of image sensor technology, and in particular, to an image sensor and an electronic device.
Background
An image sensor is a device that converts an optical signal into an electrical signal. By operating principle, image sensors are classified into CCD (Charge Coupled Device) image sensors and CMOS (Complementary Metal-Oxide-Semiconductor) image sensors. Because a CMOS image sensor is manufactured with the conventional CMOS circuit process, it can be integrated together with the peripheral circuits it requires, which gives the CMOS image sensor broader application prospects.
At present, most electronic devices use CMOS image sensors. To keep improving shooting performance in dark conditions, the pixel size of CMOS image sensors has been continuously increased to raise light sensitivity; however, ever-larger pixels are unfavorable to the overall design and stacking of the electronic device. Meanwhile, the accuracy of automatic white balance and color restoration has also remained one of the problems that photography continually needs to improve.
Disclosure of Invention
The embodiment of the application provides an image sensor and an electronic device, which can greatly improve the light sensitivity and can also improve the accuracy of automatic white balance and color restoration.
An embodiment of the present application provides an image sensor, including: at least one white pixel unit and a first microlens located on the white pixel unit, wherein the white pixel unit is provided, in sequence along a first direction, with:
the light splitting prism comprises a light incident surface and a light emergent surface, the light incident surface is positioned at one side close to the first micro lens, and the light emergent surface is positioned at one side far away from the first micro lens;
a gap layer;
a photodiode layer comprising a plurality of independent first photodiodes disposed along a second direction, the first direction being different from the second direction.
The light incident surface is a concave surface.
The light-incident surface is rotatably connected to the light-splitting prism body;
when incident light enters the first microlens at different incident angles, the light-incident surface is rotated so that the direction of the light after its first refraction at the light-incident surface remains unchanged.
The light-exit surface is inclined, and its tilt angle is related to the wavelengths of the monochromatic light components that the light-splitting prism can separate.
The light-exit surface is inclined and movably mounted on the light-splitting prism body;
when incident light enters the first microlens at different incident angles, the light-exit surface is moved to adjust its tilt angle, so that the same monochromatic light exits at the same angle after its second refraction at the light-exit surface.
The area of a first cross section of the gap layer, on the side close to the light-splitting prism, is smaller than the area of a second cross section on the side close to the photodiode layer, the first cross section being parallel to the second cross section.
Each of the plurality of independent first photodiodes is configured to receive a corresponding monochromatic light, and the wavelength receivable by each first photodiode increases or decreases monotonically along the second direction.
The image sensor further includes at least one second pixel unit and a second microlens located on the second pixel unit; the first and second microlenses are arranged in the same layer, the white pixel unit is the same size as the second pixel unit, and the second pixel unit is at least one of a red, green, or blue pixel unit.
The second pixel unit includes, arranged in sequence along the first direction, a color filter of the corresponding color and a second photodiode, and the distance from the second photodiode to the second microlens is smaller than the distance from the gap layer to the first microlens.
An embodiment of the present application further provides an electronic device. The electronic device includes a camera containing any of the image sensors described above, and the camera uses the image sensor to sense light and capture images or video.
The application provides an image sensor and an electronic device. The image sensor includes at least one white pixel unit and a first microlens located on the white pixel unit; the white pixel unit is provided, in sequence along a first direction, with a light-splitting prism, a gap layer, and a photodiode layer. The prism includes a light-incident surface and a light-exit surface, and the photodiode layer includes a plurality of independent first photodiodes arranged along a second direction. Light enters from the first microlens, passes into the prism through its light-incident surface, and exits through its light-exit surface; the prism disperses the light into monochromatic components of different wavelengths, which separate after crossing the gap layer and are received by the first photodiodes arranged along the second direction. In this way, the white pixel unit of the image sensor in the embodiments of the present application can sense light across multiple spectra/wavelengths, greatly improving the photosensitivity of the image sensor and the accuracy of automatic white balance and color restoration.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a prior art image sensor.
Fig. 2 is a schematic diagram of a pixel arrangement of an image sensor in the prior art.
Fig. 3 is a schematic diagram of a pixel arrangement of an image sensor according to an embodiment of the present disclosure.
Fig. 4 is a schematic structural diagram of an image sensor according to an embodiment of the present application.
Fig. 5 is a schematic diagram of a light splitting prism of a white pixel unit according to an embodiment of the present application.
Fig. 6 is a schematic view of a gap layer and a photodiode layer provided in embodiments of the present application.
Fig. 7 is a schematic diagram of refraction and reflection of light in a beam splitter prism.
Fig. 8 is a schematic diagram of light after exiting the beam splitter prism.
Fig. 9 is a schematic diagram of the photodiode layer receiving corresponding monochromatic light.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 11 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Embodiments of the present application provide an image sensor and an electronic device. The image sensor is applied in the electronic device, which may be a smartphone, a tablet, a wearable device, a PC, a robot, a dashboard camera, a smart peephole, and so on. The electronic device includes a camera in which an image sensor is integrated, and the camera uses the integrated image sensor to take pictures.
Before the image sensor in the embodiment of the present application is described in detail, the following principles of the current CMOS image sensor will be briefly described to facilitate understanding of the solution in the embodiment of the present application. Fig. 1 is a schematic structural diagram of an image sensor in the prior art, and fig. 2 is a schematic diagram of a pixel arrangement in the prior art.
In fig. 1, an image sensor 100 includes a plurality of pixel units and a microlens 110 disposed over each pixel unit. Each pixel unit includes a color filter and a photodiode 120, the photodiode 120 being located on the side of its pixel unit away from the microlens 110. The filter may be a red filter 131, a green filter 132, or a blue filter 133; correspondingly, the pixel units include a red pixel unit (R), a green pixel unit (G), and a blue pixel unit (B). The pixel units may be arranged as in fig. 2, an arrangement also referred to as a Bayer arrangement. A metal line 130 is also provided on the side of the photodiode away from the microlens.
After passing through the microlens, incident light is filtered by the color filter of the corresponding color, and the photodiode then performs photoelectric conversion on the received monochromatic light. In fig. 1, a single pixel unit can receive light of only one wavelength band among blue, green, and red; interpolation across the pixels of adjacent pixel units is then used to restore the color of the photographed object, roughly as in the sketch below.
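As a rough illustration of that interpolation step (not part of the patent), here is a minimal bilinear-style demosaicing sketch for an RGGB Bayer mosaic; the array layout, function name, and 3x3-averaging choice are assumptions made for illustration only.

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(mosaic: np.ndarray) -> np.ndarray:
    """Demosaic an RGGB Bayer mosaic (H x W) into an H x W x 3 RGB image.

    Assumes even dimensions and the 2x2 tile:  R G
                                               G B
    """
    h, w = mosaic.shape
    mosaic = mosaic.astype(float)

    # Boolean masks marking where each color was actually sampled.
    r_mask = np.zeros((h, w), dtype=bool); r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), dtype=bool); b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    rgb = np.zeros((h, w, 3))
    kernel = np.ones((3, 3))
    for c, mask in enumerate((r_mask, g_mask, b_mask)):
        plane = np.where(mask, mosaic, 0.0)
        # Mean of this color's samples within each 3x3 neighborhood.
        total = convolve2d(plane, kernel, mode="same")
        count = convolve2d(mask.astype(float), kernel, mode="same")
        # Keep measured samples; interpolate only the missing sites.
        rgb[..., c] = np.where(mask, mosaic, total / np.maximum(count, 1e-12))
    return rgb
```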
An embodiment of the present application provides an image sensor including at least one white pixel unit and a first microlens located on the white pixel unit. The image sensor further includes at least one second pixel unit and a second microlens located on the second pixel unit. The first and second microlenses are disposed in the same layer, and the white pixel unit has the same size as the second pixel unit, for example the same size in the second direction (the horizontal direction); the second pixel unit may be at least one of a red, green, or blue pixel unit. The main improvement of the embodiments of the present application lies in the white pixel unit, which is described in detail below; the second pixel unit is consistent in structure with existing second pixel units.
It is noted that the white pixels have no filters, so that all bands/wavelengths of light in the incident light can enter.
In one embodiment, the white pixel units and the second pixel units may be arranged alternately, such as one white pixel unit, then one second pixel unit, and so on. For example, the sequence may be one white pixel unit (W), one red pixel unit (R), one white pixel unit (W), one green pixel unit (G), one white pixel unit (W), one blue pixel unit (B), and so on.
In one embodiment, since the human eye is most sensitive to green, the second pixel unit can be set as a green pixel unit. Fig. 3 is a schematic arrangement diagram of white pixel units and second pixel units (green pixel units) provided in an embodiment of the present application, with the green pixel units and white pixel units arranged alternately.
It should be noted that the image sensor in the embodiments of the present application may also use any other arrangement that includes at least one white pixel unit and at least one second pixel unit, where the second pixel unit is at least one of a red, green, or blue pixel unit, and the pixel units provided may be laid out in any order. Because the second pixel unit may be any of the three colors, many combinations and corresponding layouts exist, which are not enumerated here.
The white pixel unit in the image sensor of the embodiment of the present application will be described in detail below.
Fig. 4 is a schematic structural diagram of an image sensor provided in an embodiment of the present application, fig. 5 is a schematic diagram of the light-splitting prism of a white pixel unit provided in the embodiment of the present application, fig. 6 is a schematic diagram of the gap layer and photodiode layer of the white pixel unit provided in the embodiment of the present application, fig. 7 is a schematic diagram of refraction and reflection of light in the light-splitting prism, fig. 8 is a schematic diagram of light after exiting the light-splitting prism, and fig. 9 is a schematic diagram of the photodiode layer receiving corresponding monochromatic light.
Referring to fig. 4 to 6, the image sensor 200 includes at least one white pixel unit 210 and a first microlens 220 on the white pixel unit. The white pixel unit 210 is provided, in sequence along a first direction, with: a beam splitter prism 211, a gap layer 212, and a photodiode layer 213. The first direction may be the vertical direction shown in fig. 4.
The beam splitter prism 211 may be a glass prism or any other object that can disperse light. The beam splitter prism 211 is configured to receive incident light entering from the first microlens 220, which may be incident natural light; after refraction, reflection, and splitting in the prism, monochromatic light of a plurality of different wavelengths is obtained, that is, the light is dispersed, and it exits from the beam splitter prism 211.
In one embodiment, the beam splitter prism 211 includes a light-incident surface 2111 and a light-exit surface 2112. The light-incident surface 2111 is located on the side close to the first microlens 220, and the light-exit surface 2112 is located on the side away from the first microlens 220. The light-incident surface 2111 receives incident light entering from the first microlens 220; after the light is refracted, reflected, and split, the resulting monochromatic light of multiple wavelengths exits from the light-exit surface 2112.
The light-incident surface 2111 may be concave or flat. In one embodiment, the light-incident surface 2111 is concave. Since the first microlens 220 condenses light, the concave surface can spread the light back out.
In one embodiment, the curvature of the concave surface is related to the incident angle of the light entering the first microlens 220. For example, the curvature of the concave surface may be set in accordance with that incident angle, so that the light condensed by the first microlens leaves its first refraction at the concave surface at the intended angle.
In one embodiment, the arc of the concave surface may be the same as the arc of the first microlens 220. In this way, the concave surface can counteract the light-condensing effect of the first microlens 220.
In one embodiment, the curvatures of the concave light-incident surfaces 2111 may differ between white pixel units. Understandably, the curvature of the concave light-incident surface 2111 can be adjusted.
In one embodiment, the light-incident surface 2111 is rotatably connected to the body of the splitting prism. When light enters the first microlens 220 at different incident angles, the light-incident surface 2111 is rotated so that the direction of the light after its first refraction at the light-incident surface 2111 remains unchanged. Please refer to fig. 7.
It can be understood that the light-incident surface 2111 can be rotated for any incident angle, so that the ray direction after the first refraction stays constant for incident light of different incident angles. Since that direction is unchanged, the direction of the monochromatic light exiting the light-exit surface 2112 is correspondingly unchanged as well, so neither the height of the beam splitter prism nor the gap layer needs to be altered. A small numeric sketch of this compensation follows.
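For intuition (an illustrative numeric sketch, not taken from the patent; the refractive index value and the bisection solver are assumptions), Snell's law gives the once-refracted ray direction as a function of the incoming ray direction and the orientation of the light-incident surface, and one can solve for the surface orientation that holds the refracted direction fixed:

```python
import math

N_GLASS = 1.5  # assumed refractive index of the prism material

def refracted_lab_angle(theta_deg: float, phi_deg: float, n: float = N_GLASS) -> float:
    """Lab-frame direction of the ray after the first refraction.

    theta_deg: incoming ray direction, degrees from the vertical.
    phi_deg:   surface-normal direction, degrees from the vertical
               (rotating the light-incident surface changes phi).
    """
    i = math.radians(theta_deg - phi_deg)   # angle of incidence at the surface
    r = math.asin(math.sin(i) / n)          # Snell's law: sin(i) = n * sin(r)
    return phi_deg + math.degrees(r)        # back to the lab frame

def surface_angle_for_fixed_ray(theta_deg: float, target_deg: float) -> float:
    """Surface-normal angle phi that keeps the once-refracted ray at
    target_deg for a given incidence theta_deg (bisection; the refracted
    direction is monotonically increasing in phi for n > 1)."""
    lo, hi = -80.0, 80.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if refracted_lab_angle(theta_deg, mid) < target_deg:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: light arriving 10 deg off vertical, refracted ray held vertical.
print(round(surface_angle_for_fixed_ray(10.0, 0.0), 2))
```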
In an embodiment, the light-exit surface 2112 is inclined, as shown in fig. 4 and fig. 5, so that monochromatic light (after the second refraction) exiting the light-exit surface 2112 can strike the photodiode layer 213 vertically or nearly vertically. In this way, the photodiode layer 213 can be placed centered or nearly centered below the first microlens 220, which eases the overall design of the image sensor and avoids affecting the second pixel unit.
In an embodiment, the tilt angle of the light-exit surface 2112 is related to the wavelengths of the monochromatic components separated by the light-splitting prism 211. For example, assuming the incident light is natural light, the tilt angle of the light-exit surface 2112 may be set so that monochromatic light at the central wavelength, after its second refraction at the light-exit surface 2112, exits perpendicular to the plane of the first microlens 220, or equivalently perpendicular to the bottom photodiode layer 213. The central-wavelength monochromatic light may be, for example, 550 nm, corresponding to green light. A small dispersion sketch follows.
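To illustrate the wavelength dependence behind this choice (a sketch with assumed values, not figures from the patent), the following uses a Cauchy model for the prism glass and Snell's law at the exit surface; the Cauchy coefficients and the 30-degree internal incidence are assumptions:

```python
import math

# Cauchy dispersion model n(lambda) = A + B / lambda^2, lambda in micrometers.
# A and B are typical crown-glass values, assumed here for illustration.
A, B = 1.5046, 0.00420

def n_glass(wavelength_nm: float) -> float:
    lam_um = wavelength_nm / 1000.0
    return A + B / lam_um ** 2

def exit_angle_deg(wavelength_nm: float, internal_incidence_deg: float) -> float:
    """Second refraction (glass -> gap layer, taken as n = 1) at the tilted
    light-exit surface: n(lambda) * sin(i) = sin(t)."""
    s = n_glass(wavelength_nm) * math.sin(math.radians(internal_incidence_deg))
    if s >= 1.0:
        raise ValueError("total internal reflection at the exit surface")
    return math.degrees(math.asin(s))

# Shorter wavelengths see a larger n and bend more on exit; this is the
# dispersion that fans white light into its monochromatic components.
for lam in (450, 550, 650):  # blue, green (central), red, in nm
    print(lam, "nm ->", round(exit_angle_deg(lam, 30.0), 2), "deg")
```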
In one embodiment, the light-exit surface 2112 is inclined and is movably mounted on the body of the beam splitter prism. When incident light enters the first microlens 220 at different incident angles, the light-exit surface 2112 is moved to adjust its tilt angle, so that the same monochromatic light exits at the same angle after its second refraction at the light-exit surface 2112. Because the exit angle of each monochromatic light is then unchanged, neither the height of the beam splitter prism nor the gap layer needs to be altered.
It can be understood that the embodiments of the present application provide two ways to keep the exit angles of the same monochromatic light after the second refraction at the light-exit surface 2112 the same: the first is to adjust the light-incident surface 2111, and the second is to move the light-exit surface 2112. In an embodiment, the light-incident surface 2111 and the light-exit surface 2112 may also be adjusted simultaneously to achieve the same effect.
The light-incident surface 2111 and the light-exit surface 2112 are disposed opposite each other, and the height of the light-splitting prism 211 is set such that incident light from the first microlens 220 is refracted a first time at the light-incident surface 2111, undergoes one total internal reflection at a side surface of the light-splitting prism 211, and is then refracted a second time at the light-exit surface 2112 before exiting.
It should be noted that, in all of the above adjustments, care must also be taken to avoid total internal reflection at the light-exit surface 2112.
Referring to fig. 4, the gap layer 212 may be an air gap layer, a vacuum gap layer, or a gap layer of another uniform light-transmitting medium. The monochromatic components exiting the light-exit surface 2112 propagate through the gap layer 212, and after traveling a certain distance, the components of different wavelengths/bands/spectra become separated.
It can be understood that a plurality of monochromatic rays leave each refracting point on the light-exit surface 2112. Because each wavelength refracts at a different angle while the same wavelength always refracts at the same angle, after these rays have traveled a certain distance they separate: red monochromatic light parts from green monochromatic light, and the red rays from all points land within the same region. The gap layer 212 therefore serves to let the monochromatic components separate by the time they reach the bottom photodiode layer 213, so that the photodiodes in the photodiode layer 213 can each independently receive their corresponding monochromatic light. See fig. 8 and 9 in particular.
The area of a first cross section of the gap layer 212, on the side close to the beam splitter prism 211, is smaller than the area of a second cross section on the side close to the photodiode layer 213, the two cross sections being parallel. As can be seen in fig. 4, the gap layer 212 widens steadily from the side near the light-splitting prism 211 to the side near the photodiode layer 213.
The height of the gap layer 212 is set such that the monochromatic components propagating in the gap layer are separated by the time they reach the side of the gap layer 212 close to the photodiode layer 213, as estimated in the sketch below.
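A back-of-the-envelope version of this constraint (an illustrative sketch; the exit angles and photodiode pitch below are assumed values, not from the patent): the lateral offset of a ray after crossing a gap of height h is h * tan(theta), so the height needed to pull the extreme wavelengths one photodiode pitch apart can be estimated as:

```python
import math

def min_gap_height(theta_short_deg: float, theta_long_deg: float,
                   photodiode_pitch_um: float) -> float:
    """Gap-layer height (um) at which rays leaving one point of the exit
    surface at the two extreme exit angles have drifted apart laterally by
    one photodiode pitch (lateral offset after height h is h * tan(theta),
    angles measured from the first direction)."""
    spread_per_um = abs(math.tan(math.radians(theta_long_deg))
                        - math.tan(math.radians(theta_short_deg)))
    return photodiode_pitch_um / spread_per_um

# Assumed numbers: blue exits at 49.7 deg, red at 49.2 deg (as in the
# dispersion sketch above), photodiode pitch 0.2 um.
print(round(min_gap_height(49.7, 49.2, 0.2), 1), "um")
```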
The photodiode layer 213 includes a plurality of independently spaced first photodiodes arranged along a second direction, as shown in fig. 8, where the first direction differs from the second direction. For example, the first direction is the vertical direction in fig. 4 and the second direction is the horizontal direction in fig. 4.
The length of the photodiode layer 213 along the second direction is greater than the length of the side of the gap layer 212 close to the photodiode layer 213. This ensures that all the monochromatic components arriving from the gap layer 212 are received by the first photodiodes in the photodiode layer 213 and do not escape past the photodiode layer 213 to affect other devices of the image sensor.
Each of the independent first photodiodes in the photodiode layer 213 is configured to receive one corresponding monochromatic light, and the wavelength receivable by each first photodiode increases or decreases monotonically along the second direction. If the second direction points horizontally rightward, the receivable wavelengths increase in sequence; if it points horizontally leftward, they decrease in sequence. It can be understood that, from left to right in fig. 8, the independent first photodiodes may respectively receive violet, blue, cyan, green, yellow, orange, and red light; correspondingly, there are seven first photodiodes. In other embodiments, the light may be divided into more or fewer bands, with correspondingly more or fewer first photodiodes. As shown in fig. 8, the leftmost two rays are blue, the middle two are yellow, and the rightmost two are red. A minimal mapping sketch follows.
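As a toy illustration of this monotonic wavelength-to-photodiode mapping (band names from the seven components above; the band edges and function name are assumptions made for illustration):

```python
# Seven bands, ordered left to right along the second direction; the
# nanometre band edges are assumptions chosen for illustration.
BANDS = [
    ("violet", 380, 450), ("blue", 450, 485), ("cyan", 485, 500),
    ("green", 500, 565), ("yellow", 565, 590), ("orange", 590, 625),
    ("red", 625, 740),
]

def photodiode_index(wavelength_nm: float) -> int:
    """Index of the first photodiode (left to right) that receives this
    wavelength; the index grows monotonically with wavelength."""
    for idx, (_name, lo, hi) in enumerate(BANDS):
        if lo <= wavelength_nm < hi:
            return idx
    raise ValueError("wavelength outside the sensed range")

assert photodiode_index(550) == 3   # 550 nm green lands on the 4th photodiode
```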
Thus, incident light passes through the first microlens, enters the light-splitting prism in the white pixel unit and is refracted, reaches a side wall of the prism where it is totally reflected, and then enters the air gap layer from the bottom of the prism. After traveling a certain distance in the gap layer, the monochromatic components of different spectra/wavelengths/bands separate before reaching the photodiode layer at the bottom. The first photodiodes in the photodiode layer independently receive the corresponding separated monochromatic components. In this way, the white pixel unit can sense monochromatic light across the full band for back-end image processing. That is, the white pixel unit used by the image sensor of the present application can sense the full band/full spectrum, so that at the same pixel size the sensor's light sensitivity is greatly enhanced; and because full-band monochromatic light can be sensed, color precision is improved, raising the accuracy of color restoration and of automatic white balance, as the toy example below suggests.
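To suggest how fuller spectral samples can feed automatic white balance (a toy gray-world sketch; this channel model is an assumption, not the patent's algorithm):

```python
import numpy as np

def gray_world_gains(band_means: np.ndarray) -> np.ndarray:
    """Gray-world white balance over N spectral bands: scale each band so
    that all bands average to the same level. With seven bands from the
    white pixel units instead of three RGB channels, the illuminant
    estimate rests on finer spectral samples.

    band_means: mean response of each band over the image, shape (N,).
    """
    target = band_means.mean()
    return target / np.maximum(band_means, 1e-12)

# Example: a warm illuminant lifts the long-wavelength bands; the gains
# computed below compensate by boosting the short-wavelength bands.
means = np.array([0.5, 0.55, 0.6, 0.7, 0.8, 0.9, 1.0])  # violet .. red
print(np.round(gray_world_gains(means), 3))
```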
As shown in fig. 4, the image sensor in the embodiment of the present application includes, in addition to the white pixel unit 210 and the first microlens 220 over it, at least one second pixel unit 230 and a second microlens 240 over the second pixel unit 230. The first microlens 220 and the second microlens 240 have the same size and function and are disposed in the same layer; correspondingly, the white pixel unit 210 and the second pixel unit 230 have the same size. The second pixel unit is at least one of a red, green, or blue pixel unit.
In fig. 4, the pixel units on both the left and right sides of the white pixel unit 210 are second pixel units, for example both green pixel units.
Each second pixel unit 230 includes, arranged in sequence along the first direction, a color filter 231 of the corresponding color (a green filter if the pixel unit is green) and a second photodiode 232.
In one embodiment, the distance from the second photodiode 232 to the second microlens 240 is less than the distance from the gap layer 212 to the first microlens 220, so that the white pixel unit 210 does not affect the second pixel unit: the portion of the white pixel unit 210 below the gap layer does not affect the second pixel unit 230.
In one embodiment, the distance between the second photodiode 232 and the second microlens 240 is less than the distance between the photodiode layer 213 and the first microlens 220, so that the white pixel unit 210 does not affect the second pixel unit; this prevents the photodiode-layer 213 portion of the white pixel unit 210 from affecting the second pixel unit 230.
As shown in fig. 10, an embodiment of the present application further provides an electronic device. The electronic device 300 includes a camera 310, and the camera 310 includes the image sensor described in any of the embodiments above. The camera 310 captures images or video using the light sensed by the image sensor.
In one embodiment, as shown in fig. 10, the electronic device further includes a processor 320 and a memory 330. The processor 320 is electrically connected to the memory 330 and the camera 310. The processor 320 is the control center of the electronic device 300: it connects the various parts of the whole device through various interfaces and lines, and performs the device's functions and processes data by running or loading application programs stored in the memory 330 and calling data stored in the memory 330, thereby monitoring the electronic device as a whole. The memory 330 is also used to store images or video data obtained by the camera 310. The memory 330 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 330 may further include memory located remotely from the processor 320, connected to the electronic device 300 via a network.
In an embodiment, as shown in fig. 11, the electronic device 400 may include the following modules and/or units in addition to the camera 410 (which includes the image sensor described in any of the above embodiments), the processor 420, and the memory 430.
The RF circuit 440 is used to receive and transmit electromagnetic waves and to convert between electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices. The RF circuit 440 may include various existing circuit elements for performing these functions, such as an antenna, a radio-frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF circuit 440 may communicate with various networks, such as the internet, an intranet, or a wireless network, or with other devices over a wireless network. The wireless network may be a cellular telephone network, a wireless local area network, or a metropolitan area network, and may use various communication standards, protocols, and technologies, including existing ones and even those not yet developed.
The input unit 450 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In particular, the input unit 450 may include a touch-sensitive surface 451 as well as other input devices 452. The touch-sensitive surface 451, also referred to as a touch display screen (touchscreen) or touch pad, can collect the user's touch operations on or near it (for example, operations with a finger, stylus, or any other suitable object or attachment) and drive the corresponding connection devices according to a preset program. Besides the touch-sensitive surface 451, the other input devices 452 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 460 may be used to display information input by or provided to the user, as well as the various graphical user interfaces of the electronic device 400, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 460 may include a display panel 461, which may optionally be configured as an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or the like. Further, the touch-sensitive surface 451 may overlay the display panel 461; when a touch operation is detected on or near the touch-sensitive surface 451, it is passed to the processor 420 to determine the type of touch event, after which the processor 420 provides a corresponding visual output on the display panel 461 according to that type. Although the touch-sensitive surface 451 and the display panel 461 are shown in the figures as two separate components implementing input and output, in some embodiments the touch-sensitive surface 451 may be integrated with the display panel 461 to implement both input and output functions.
The electronic device 400 may also include at least one sensor 470, such as a light sensor, a direction sensor, a proximity sensor, and other sensors. As one kind of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when the phone is stationary, and can be used in applications that recognize the phone's posture (such as landscape/portrait switching, related games, and magnetometer posture calibration) and in vibration-recognition functions (such as a pedometer or tap detection). Other sensors such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor may also be configured in the electronic device 400 and are not described in detail here.
The audio circuit 480, speaker 481, and microphone 482 may provide an audio interface between the user and the electronic device 400. The audio circuit 480 may transmit the electrical signal converted from received audio data to the speaker 481, which converts it into a sound signal for output. Conversely, the microphone 482 converts a collected sound signal into an electrical signal, which the audio circuit 480 receives and converts into audio data; the audio data is then processed by the processor 420 and sent via the RF circuit 440 to, for example, another electronic device, or output to the memory 430 for further processing. The audio circuit 480 may also include an earbud jack to allow peripheral headphones to communicate with the electronic device 400.
Through the transmission module 490 (e.g., a Wi-Fi module), the electronic device 400 can help the user receive requests, send information, and so on, providing the user with wireless broadband internet access. Although the transmission module 490 is illustrated, it is understood that it is not an essential part of the electronic device 400 and may be omitted as needed without changing the essence of the invention.
The electronic device 400 also includes a power supply 491 (e.g., a battery) that powers the various components; in some embodiments, it may be logically coupled to the processor 420 via a power management system that manages charging, discharging, and power consumption. The power supply 491 may also include one or more DC or AC power sources, recharging systems, power failure detection circuits, power converters or inverters, power status indicators, and similar components.
The electronic device 400 may further include a Bluetooth module, an infrared module, and the like, which are not described in detail here.
The foregoing has described in detail an image sensor and an electronic device provided by embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may vary the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

CN202210050802.8A | 2022-01-17 | 2022-01-17 | Image sensor and electronic device | Active | CN114422730B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202210050802.8A (CN114422730B) | 2022-01-17 | 2022-01-17 | Image sensor and electronic device

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202210050802.8A (CN114422730B) | 2022-01-17 | 2022-01-17 | Image sensor and electronic device

Publications (2)

Publication Number | Publication Date
CN114422730A | 2022-04-29
CN114422730B | 2024-03-19

Family

ID=81274002

Family Applications (1)

Application NumberTitlePriority DateFiling Date
CN202210050802.8AActiveCN114422730B (en)2022-01-172022-01-17Image sensor and electronic device

Country Status (1)

Country | Link
CN (1) | CN114422730B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN115278127A (en) * | 2022-07-25 | 2022-11-01 | Oppo广东移动通信有限公司 | Image sensor, camera and electronic device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
JPH11313334A (en) * | 1998-04-27 | 1999-11-09 | Nippon Hoso Kyokai <NHK> | Solid-state imaging device
KR20100052838A (en) * | 2008-11-11 | 2010-05-20 | 주식회사 동부하이텍 | CMOS image sensor and method for fabricating of the same
CN102510447A (en) * | 2011-09-28 | 2012-06-20 | 上海宏力半导体制造有限公司 | Image sensor
CN107613182A (en) * | 2017-10-27 | 2018-01-19 | 北京小米移动软件有限公司 | Camera photosensitive components, cameras and camera terminals
CN111739900A (en) * | 2020-07-28 | 2020-10-02 | 深圳市汇顶科技股份有限公司 | Image sensor, image sensing method, chip and electronic device
US20210066375A1 (en) * | 2019-09-04 | 2021-03-04 | Samsung Electronics Co., Ltd. | Image sensor and imaging apparatus having the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111614878B (en) * | 2020-05-26 | 2022-04-22 | 维沃移动通信(杭州)有限公司 | Pixel unit, photoelectric sensor, camera module and electronic equipment



Also Published As

Publication number | Publication date
CN114422730B (en) | 2024-03-19

Similar Documents

Publication | Title
KR102650473B1 (en) | Display components, display screens, and electronic devices
CN103066087B (en) | Image sensor module and portable electronic device
US20170142325A1 (en) | Image sensor and electronic device having the same
WO2022143280A1 (en) | Image sensor, camera module, and electronic device
WO2020132974A1 (en) | Fingerprint recognition apparatus and electronic device
CN108600712B (en) | An image sensor, mobile terminal and image capturing method
CN108965665B (en) | An image sensor and mobile terminal
US11463641B2 (en) | Image sensor, mobile terminal, and photographing method
US12032186B2 (en) | Spectral filter, and image sensor and electronic device including spectral filter
US12372402B2 (en) | Spectral filter, and image sensor and electronic device including the spectral filter
CN108965666B (en) | A mobile terminal and image capturing method
JP2022064323A (en) | Hyperspectral elements, hyperspectral sensors including them, and hyperspectral image generators
US11996421B2 (en) | Image sensor, mobile terminal, and image capturing method
WO2020015626A1 (en) | Mobile terminal and image capturing method
CN114422730B (en) | Image sensor and electronic device
US20250130107A1 (en) | Spectral filter, and image sensor and electronic device including the spectral filter
KR102736556B1 (en) | Pixel unit, photoelectric sensor, shooting module and electronics
EP4007933B1 (en) | Lens assembly and electronic device including the same
US20230139533A1 (en) | Optical sensor including nanophotonic microlens array and electronic device including the same
US11830281B2 (en) | Electronic device including a fingerprint sensor
CN216014312U (en) | Optical fingerprint identification device and electronic equipment
CN113471226B (en) | Image sensor and electronic device
EP4194910B1 (en) | Optical filter, and image sensor and electronic device including optical filter
KR20230024565A (en) | Electronic device including a fingerprint sensor
CN120344014A (en) | Image sensors, camera modules and electronic devices

Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
