Detailed Description
In order that the application may be readily understood, a more complete description of the application will be rendered by reference to the appended drawings. Preferred embodiments of the present application are shown in the drawings. This application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
As used herein, "electronic device" refers to a device capable of receiving and/or transmitting communication signals, including, but not limited to, a device connected via any one or more of the following:
(1) Via a wireline connection, such as a public switched telephone network (PSTN), a digital subscriber line (DSL), a digital cable, or a direct cable connection;
(2) Via a wireless interface, such as a cellular network, a wireless local area network (WLAN), a digital television network such as a DVB-H network, a satellite network, or an AM-FM broadcast transmitter.
An electronic device arranged to communicate over a wireless interface may be referred to as a "mobile terminal". Examples of mobile terminals include, but are not limited to, the following electronic devices:
(1) Satellite phones or cellular phones;
(2) A personal communications system (PCS) terminal that may combine a cellular radiotelephone with data processing, facsimile, and data communications capabilities;
(3) A personal digital assistant (PDA) that may include a radiotelephone, a pager, internet/intranet access, a web browser, a notepad, a calendar, and/or a global positioning system (GPS) receiver;
(4) Conventional laptop and/or palmtop receivers;
(5) Conventional laptop and/or palmtop radiotelephone transceivers, and the like.
Referring to fig. 1, an electronic device 10 includes a housing 11, and components of the electronic device 10, such as a camera module 20, a motherboard (not shown), and a battery (not shown), are disposed in a space enclosed by the housing 11. The motherboard may integrate, among other things, the processor, power management module, memory unit, and baseband chip of the electronic device 10. It is to be understood that the electronic device 10 of the present embodiment includes, but is not limited to, a terminal device such as a mobile phone, a tablet computer, or another portable electronic device.
In some embodiments, the housing 11 is provided with a light-transmitting portion 111, and the light-transmitting portion 111 may be a light-entering hole penetrating the housing 11, or may be a structural member such as glass or light-transmitting plastic. Taking the housing 11 provided with a light-entering hole as an example, the camera module 20 is disposed in the space enclosed by the housing 11, and the lens 21 of the camera module 20 is disposed corresponding to the light-entering hole, so that light outside the electronic device 10 can enter the lens 21 to meet the imaging requirements of the camera module 20.
As shown in fig. 2, the camera module 20 includes a lens 21, an image sensor 22, and an optical conductive element 23. The image sensor 22 is disposed at a distance from the lens 21, and the lens 21 and the image sensor 22 are located on the same side of the optical conductive element 23. The optical conductive element 23 is used to deflect the light collected by the lens 21 toward the image sensor 22.
The image sensor 22 includes, but is not limited to, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor.
The optical conductive element 23 includes a first reflecting surface 235 and a second reflecting surface 236 disposed opposite to each other. The first reflecting surface 235 is configured to reflect light collected by the lens 21 to the second reflecting surface 236, and the second reflecting surface 236 is configured to reflect that light to the image sensor 22. In this way, the light is reflected in turn by the first reflecting surface 235 and the second reflecting surface 236, so that the light collected by the lens 21 is deflected by the optical conductive element 23, which thus transmits light between the lens 21 and the image sensor 22 to meet the shooting requirements of the camera module 20.
The inventors found that after the light collected by the lens 21 enters the image sensor 22 upon reflection at the first reflecting surface 235 and the second reflecting surface 236, the picture produced by the image sensor 22 exhibits a color cast; that is, the imaging color accuracy is poor and the captured picture is distorted. The inventors further found that one factor causing this poor color accuracy is crosstalk between different colors.
For ease of understanding, the basic principles of imaging with the image sensor 22 are described below.
Three-primary-color imaging based on the "trichromatic theory" generally refers to red, green, and blue (RGB) three-primary-color imaging technology and generally comprises two stages: color imaging and color image display. The color separation structure of a related-art three-primary-color imaging system mainly adopts either the three primary colors red, green, and blue (RGB) or the four primary colors cyan, magenta, yellow, and green (CMYG). Such systems are widely applied in two-dimensional staring imaging devices and one-dimensional scanning imaging devices; the former are mainly used in digital cameras or video cameras, and the latter are mainly used in digital scanners.
Based on this, the image sensor 22 has RGB three-primary-color channels for synthesizing colors: the R channel represents the red light channel, the G channel represents the green light channel, and the B channel represents the blue light channel.
As shown in fig. 3, in the related art, when reflected light enters the image sensor 22, the spectral bands of different channels overlap. Specifically, the reflectance at the intersection of the reflection curves of the B channel and the G channel (at about 484 nm) reaches 70%; that is, both blue light and green light in the spectrum around 484nm are reflected into the image sensor 22, so crosstalk occurs between the blue light and the green light, and the color accuracy of the image decreases. Likewise, as further shown in fig. 3, the reflectance at the intersection of the reflection curves of the R channel and the G channel (at about 575 nm) reaches 66%, so red light and green light in the spectrum near 575nm are reflected into the image sensor 22 at a high rate, and crosstalk occurs between the red light and the green light during imaging. By contrast, as can also be seen from fig. 3, the reflectance of the R-channel and G-channel reflection curves at about 560 nm is less than 10%; red light and green light are therefore reflected only weakly near 560nm, and crosstalk is unlikely to occur there.
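The role of the channel crossings can be sketched numerically. In the sketch below, the channel response curves are modeled as Gaussians whose center wavelengths and widths are purely illustrative assumptions (they are not values read from fig. 3); the point is only that crosstalk concentrates at the wavelength where two adjacent channel curves intersect, because incoming light there is split almost evenly between the two channels.

```python
import math

# Illustrative Gaussian approximations of the B/G/R channel response
# curves. Centers and widths are assumptions chosen for illustration,
# not values taken from fig. 3 of this application.
def channel_response(wavelength, center, width):
    return math.exp(-0.5 * ((wavelength - center) / width) ** 2)

def crossing(center_a, width_a, center_b, width_b):
    """Wavelength (nm) where two neighboring channel curves intersect,
    scanned between the two centers in 0.1 nm steps."""
    prev = None
    for i in range(int(center_a * 10), int(center_b * 10) + 1):
        wl = i / 10.0
        diff = (channel_response(wl, center_a, width_a)
                - channel_response(wl, center_b, width_b))
        if prev is not None and prev > 0 >= diff:
            return wl   # sign change: the curves are equal here
        prev = diff
    return None

# At a crossing, light is split almost evenly between two channels,
# which is exactly where inter-channel crosstalk peaks.
bg = crossing(460, 30, 535, 35)   # blue/green crossing
rg = crossing(535, 35, 600, 35)   # green/red crossing
print(f"B/G crossing ~ {bg} nm, G/R crossing ~ {rg} nm")
```

With these assumed curves the crossings fall near 495nm and 568nm; the application's fig. 3 places the real crossings at about 484nm and 575nm, which is what the two induced transmission films target.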
As shown in connection with fig. 2, in an embodiment of the present application, the optically conductive element 23 includes a first transmission inducing film 24 and a second transmission inducing film 25.
The first transmission inducing film 24 is disposed on the first reflecting surface 235, and the second transmission inducing film 25 is disposed on the second reflecting surface 236. It should be noted that, in some embodiments, the positions of the first transmission-inducing film 24 and the second transmission-inducing film 25 may be interchanged, that is, the first transmission-inducing film 24 is disposed on the second reflective surface 236, and the second transmission-inducing film 25 is disposed on the first reflective surface 235.
The first transmission-inducing film 24 is used for filtering a first spectral band, which is the spectral band where the B channel and the G channel are superimposed. The second transmission-inducing film 25 is used for filtering a second spectral band, which is the spectral band where the R channel and the G channel are superimposed.
In this embodiment, the spectral band in which the B channel and the G channel are superimposed is filtered by the first induced transmission film 24, and the spectral band in which the R channel and the G channel are superimposed is filtered by the second induced transmission film 25. The amount of light entering the image sensor 22 within these superimposed bands is thereby reduced, crosstalk among the RGB three-primary-color channels of the image sensor 22 is avoided, and the imaging color accuracy is improved.
As shown in connection with fig. 4, in some embodiments, the first and second transmission-inducing films 24 and 25 each include a first dielectric film M1, a second dielectric film M2, and a metal film M2, with the metal film M2 sandwiched between the first dielectric film M1 and the second dielectric film M2.
For the metal film M2, the formula for its reflectivity in air for normally incident light is as follows:

R = [(n - 1)² + k²] / [(n + 1)² + k²]
Wherein R is the reflectivity, n-ik is the complex refractive index of the metal film M2, n is the refractive index, and k is the extinction coefficient.
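As a numerical illustration of this formula, the sketch below evaluates it for a high-k/n metal. The silver optical constants used (n ≈ 0.05, k ≈ 3.9 in the visible range) are assumed order-of-magnitude literature values, not values stated in this application.

```python
def metal_reflectivity(n: float, k: float) -> float:
    """Reflectivity in air at normal incidence for an opaque metal
    film with complex refractive index n - ik:
    R = ((n - 1)^2 + k^2) / ((n + 1)^2 + k^2)."""
    return ((n - 1) ** 2 + k ** 2) / ((n + 1) ** 2 + k ** 2)

# Assumed order-of-magnitude constants for silver in the visible range.
r_silver = metal_reflectivity(0.05, 3.9)
print(f"R(silver) ~ {r_silver:.3f}")  # close to 0.988
```

A bare thick silver surface therefore reflects nearly 99% of normally incident light; the induced transmission structure exists precisely to suppress this reflection within a narrow band.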
In the embodiment of the present application, the first and second induced transmission films 24 and 25 serve as metal induced films, and each layer structure of the first and second induced transmission films 24 and 25 may be configured based on the metal induced film principle.
Specifically, for metal induced films, the absorption of the metal film M2 is determined not only by the optical constants (refractive index n, extinction coefficient k) and the thickness of the metal film M2 itself, but is also closely related to the admittance of the adjacent media. If a suitable antireflection film stack (for example, the first dielectric film M1) is designed on the incident side, the reflection of the whole film system is reduced to nearly zero; the transmittance of the metal film M2 for the corresponding part of the spectrum is then maximized, achieving the induced transmission effect.
Here, the dielectric film systems on both sides of the metal film M2 (i.e., the first dielectric film M1 and the second dielectric film M2) not only increase the transmittance at the center wavelength of the metal film M2 but, because each film system includes a considerable number of layers, also increase the transmittance over a limited band. Outside this band, the film system transitions rapidly from increasing transmittance to increasing reflectance; in other words, a bandpass filter is produced.
According to the metal induced film principle, a metal material whose k/n at the center wavelength is as large as possible is selected to produce the metal film M2.
For example, in the embodiment of the present application, a metal film M2 of a metal material such as silver, magnesium, or aluminum is used in the visible light range. Thus, the first induced transmission film 24 including the metal film M2 can filter the first spectral band, reducing the spectral band in which the B channel and the G channel of the image sensor 22 are superimposed, which in turn reduces color crosstalk between pixels and makes the colors of the image more accurate.
In some embodiments, the refractive index of the metal film M2 is less than or equal to 0.2 and the extinction coefficient is greater than or equal to 3.7, so that k/n is sufficiently large. This improves the light transmittance of the corresponding wavelength band, so that the first induced transmission film 24 exerts a good filtering effect on the first spectral band and the second induced transmission film 25 exerts a good filtering effect on the second spectral band. For example, the material of the metal film M2 includes silver, magnesium, or aluminum; such materials have a large k/n, which is beneficial to improving the light transmittance of the corresponding wavelength bands.
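The stated bounds can be expressed as a simple screening check. The bounds n ≤ 0.2 and k ≥ 3.7 come from this embodiment; the silver constants below are assumed illustrative values, and the failing counterexample is hypothetical.

```python
def satisfies_induced_transmission_criterion(n, k, n_max=0.2, k_min=3.7):
    """Check this embodiment's stated bounds, n <= 0.2 and k >= 3.7,
    which together guarantee a large k/n ratio."""
    return n <= n_max and k >= k_min

# Silver near 500 nm (assumed literature-order constants): passes,
# with k/n on the order of several tens.
print(satisfies_induced_transmission_criterion(0.05, 3.9), "k/n =", 3.9 / 0.05)

# A hypothetical lossier metal with n = 1.0, k = 3.0: fails the check.
print(satisfies_induced_transmission_criterion(1.0, 3.0))
```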
In the camera module 20 of the present application, the first transmission-inducing film 24 is disposed on one of the first reflecting surface 235 and the second reflecting surface 236, and the second transmission-inducing film 25 is disposed on the other, so that the camera module 20 can obtain a good long-focus shooting effect by using the optical conductive element 23. On this basis, the camera module 20 of the present application can improve the color accuracy of the imaging picture and enhance the long-focus shooting experience while retaining the long-focus shooting effect.
In some embodiments, the first dielectric film M1 and the second dielectric film M2 each include titanium dioxide layers and silicon dioxide layers arranged alternately, and the side of each dielectric film on which a silicon dioxide layer is arranged is connected to the metal film M2. In this embodiment, the alternately arranged titanium dioxide and silicon dioxide layers serving as the first dielectric film M1 and the second dielectric film M2 protect both sides of the metal film M2, while the whole film system of the first transmission-inducing film 24 achieves the effect of induced transmission for the first spectral band. Accordingly, the whole film system of the second transmission-inducing film 25 in this structural form achieves the effect of induced transmission for the second spectral band.
In some embodiments, the thickness of the first dielectric film M1 of the first induced transmission film 24 is 400nm to 500nm, and the thickness of the second dielectric film M2 of the first induced transmission film 24 is 400nm to 500nm. For example, the two thicknesses may be 400nm and 400nm, 400nm and 500nm, or 500nm and 400nm, respectively.
By the above-described arrangement of the film thickness, the entire film system of the first transmission-inducing film 24 can realize the effect of transmission-inducing the first spectral band.
The thickness of the first dielectric film M1 of the second induced transmission film 25 is 1400nm to 1600nm, and the thickness of the second dielectric film M2 of the second induced transmission film 25 is 400nm to 500nm. For example, the two thicknesses may be 1400nm and 400nm, 1400nm and 500nm, or 1600nm and 400nm, respectively.
By the above-described arrangement of the film thickness, the entire film system of the second transmission-inducing film 25 can realize the effect of transmission-inducing the second spectral band.
In some embodiments, for the first spectral band, the first induced transmission film 24 has a filtering peak at 484nm with a corresponding reflectivity of 5% or less, a half-peak bandwidth of 10nm to 13nm, and a bandwidth at 90% reflectivity of 20nm to 25nm. The first spectral band can thus cover most of the spectral band in which the B channel and the G channel are superimposed, so that the first induced transmission film 24 filters the first spectral band while still meeting the reflection requirement for light outside the first spectral band, thereby improving the imaging quality.
For the second spectral band, the second induced transmission film 25 has a filtering peak at 575nm with a corresponding reflectivity of 5% or less, a half-peak bandwidth of 11nm to 14nm, and a bandwidth at 90% reflectivity of 23nm to 27nm. The second spectral band can thus cover most of the spectral band in which the R channel and the G channel are superimposed, so that the second induced transmission film 25 filters the second spectral band while still meeting the reflection requirement for light outside the second spectral band, thereby improving the imaging quality.
For ease of understanding, the layer structure of the first induced transmission film 24 will be described below in connection with the structures of the first dielectric film M1 and the metal film M2.
In some embodiments, silver (Ag) is taken as an example of the material of the metal film M2. Table 1 below shows the film layer structure of the first induced transmission film 24 and the thickness of each layer:

Table 1
Layer  Material  Thickness
1      TiO2      46.34nm
2      SiO2      80.25nm
3      TiO2      46.34nm
4      SiO2      144.3nm
5      Ag        82nm
6      SiO2      144.25nm
7      TiO2      46.34nm
8      SiO2      80.25nm
9      TiO2      46.34nm
Here, TiO2 represents a titanium dioxide layer, SiO2 represents a silicon dioxide layer, and Ag represents the metal film M2 made of metallic silver. In the first induced transmission film 24, the alternately arranged titanium dioxide and silicon dioxide layers of the first dielectric film M1 are 46.34nm, 80.25nm, 46.34nm, and 144.3nm thick, respectively; the metal film M2 is a silver film with a thickness of 82nm; and the alternately arranged silicon dioxide and titanium dioxide layers of the second dielectric film M2 are 144.25nm, 46.34nm, 80.25nm, and 46.34nm thick, respectively.
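The reflectance of such a stack can be sketched with the standard characteristic-matrix (transfer-matrix) method for thin films at normal incidence. The dielectric and silver refractive indices below are assumed round-number values (the application does not list them); only the layer thicknesses of Table 1 and the H-K9L index of 1.52 are taken from the text, so this is an illustrative sketch rather than the application's design calculation.

```python
import cmath

def stack_reflectance(n_in, layers, n_out, wavelength_nm):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic-matrix method. `layers` is a list of
    (complex_refractive_index, thickness_nm), incidence side first,
    with metals written as n - ik."""
    B, C = 1.0 + 0j, n_out + 0j          # start from the exit medium
    for n_c, d in reversed(layers):      # fold each layer's matrix in
        delta = 2 * cmath.pi * n_c * d / wavelength_nm
        cos_d, sin_d = cmath.cos(delta), cmath.sin(delta)
        B, C = (cos_d * B + 1j * sin_d * C / n_c,
                1j * n_c * sin_d * B + cos_d * C)
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Table 1 stack, entered from the prism glass (n = 1.52) toward air.
# n(TiO2) ~ 2.4, n(SiO2) ~ 1.46, Ag ~ 0.05 - 3.9j: assumed constants.
TIO2, SIO2, AG = 2.4, 1.46, 0.05 - 3.9j
table1 = [(TIO2, 46.34), (SIO2, 80.25), (TIO2, 46.34), (SIO2, 144.3),
          (AG, 82.0),
          (SIO2, 144.25), (TIO2, 46.34), (SIO2, 80.25), (TIO2, 46.34)]

print(f"R at 484 nm: {stack_reflectance(1.52, table1, 1.0, 484.0):.3f}")
```

Two sanity checks anchor the sketch: with zero layers the function reduces to the Fresnel reflectance of a bare glass-air interface, and with a single thick silver layer it converges to the bulk-metal reflectivity formula given earlier.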
In some embodiments, the material of the optical conductive element 23 may be H-K9L glass with a refractive index of 1.52. By performing an optical test in which the light within the optical conductive element 23 is made incident on the first induced transmission film 24, a reflectance curve of the first induced transmission film 24 can be obtained.
Fig. 5 shows a graph of the reflectivity of the first induced transmission film 24, with wavelength on the abscissa and reflectivity on the ordinate. Referring to fig. 5, it can be seen from the reflectance curve that the minimum reflectance of the first transmission-inducing film 24 for visible light in the 400nm to 700nm band is 0% (at 484nm), the half-peak bandwidth is 11.8nm, and the bandwidth corresponding to R = 90% is 22nm. The reflectivity of the first induced transmission film 24 in the first spectral band is therefore low enough that light in the first spectral band is transmitted out through the first reflecting surface 235 rather than reflected. This reduces the crosstalk caused by the superimposition of the B channel and the G channel when light of the first spectral band enters the image sensor 22, thereby improving color accuracy and the long-focus shooting experience.
Table 2 below shows the film layer structure of the second induced transmission film 25 and the thickness of each layer:

Table 2
Layer  Material  Thickness
1      TiO2      58.99nm
2      SiO2      98.53nm
3      TiO2      58.99nm
4      SiO2      1177nm
5      Ag        82nm
6      SiO2      177.36nm
7      TiO2      58.99nm
8      SiO2      98.53nm
9      TiO2      58.99nm
Here, TiO2 represents a titanium dioxide layer, SiO2 represents a silicon dioxide layer, and Ag represents the metal film M2 made of metallic silver. In the second induced transmission film 25, the alternately arranged titanium dioxide and silicon dioxide layers of the first dielectric film M1 are 58.99nm, 98.53nm, 58.99nm, and 1177nm thick, respectively; the metal film M2 is a silver film with a thickness of 82nm; and the alternately arranged silicon dioxide and titanium dioxide layers of the second dielectric film M2 are 177.36nm, 58.99nm, 98.53nm, and 58.99nm thick, respectively.
In some embodiments, the material of the optical conductive element 23 may be H-K9L glass with a refractive index of 1.52. By performing an optical test in which the light within the optical conductive element 23 is made incident on the second transmission-inducing film 25, a reflectance curve of the second transmission-inducing film 25 can be obtained.
Fig. 6 shows a graph of the reflectance of the second induced transmission film 25, with wavelength on the abscissa and reflectance on the ordinate. Referring to fig. 6, it can be seen from the reflectance curve that the minimum reflectance of the second transmission-inducing film 25 for visible light in the 400nm to 700nm band is 0% (at 575nm), the half-peak bandwidth is 11.8nm, and the bandwidth corresponding to R = 90% is 22nm. The reflectivity of the second induced transmission film 25 in the second spectral band is therefore low enough that light in the second spectral band is transmitted out through the second reflecting surface 236 rather than reflected. This reduces the crosstalk caused by the superimposition of the R channel and the G channel when light of the second spectral band enters the image sensor 22, thereby improving color accuracy and the long-focus shooting experience.
Tables 1 and 2 above illustrate only the thicknesses of the respective layer structures of the first induced transmission film 24 and the second induced transmission film 25 in some embodiments. The thicknesses of the first dielectric film M1, the second dielectric film M2, and the metal film M2 of the present application are not limited thereto.
For ease of understanding, the configuration of the optical path of the optical conductive element 23 between the lens 21 and the image sensor 22 will be described below in connection with the structure of the optical conductive element 23, but the structure of the optical conductive element 23 and the optical path thereof of the present application are not intended to be limited thereto.
The optical conductive element 23 may deflect the light two times, or three or more times.
For example, as shown in fig. 2, the optical conductive element 23 deflects the light collected by the lens 21 two times and then makes the light incident on the image sensor 22.
As another example, as shown in fig. 7, the optical conductive element 23 deflects the light collected by the lens 21 three times and then makes the light incident on the image sensor 22.
As another example, as shown in fig. 8, the optical conductive element 23 deflects the light collected by the lens 21 five times and then makes the light incident on the image sensor 22.
The optical conductive element 23 may also deflect the light seven or more times, which is not described in detail herein.
It should be noted that, as the number of deflections of the light by the optical conductive element 23 increases, the propagation path of the light in the optical conductive element 23 lengthens, and thus the focal length between the lens 21 and the image sensor 22 increases, which benefits the long-focus shooting effect. Meanwhile, by deflecting the light multiple times, the size of the optical conductive element 23 in the optical axis direction of the lens 21 (i.e., the thickness of the optical conductive element 23) can also be made smaller, which facilitates reducing the height of the camera module 20 and miniaturizing the camera module 20.
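The folded-path benefit can be illustrated with a crude estimate. All dimensions below are hypothetical (only the H-K9L refractive index of 1.52 appears in this application); the model simply counts one traversal of the element per deflection plus the entry segment.

```python
# A rough, purely illustrative folded-path estimate. The 8 mm prism
# thickness and the one-traversal-per-deflection model are hypothetical
# assumptions, not dimensions from this application.
N_GLASS = 1.52            # H-K9L refractive index, per the text
PRISM_THICKNESS_MM = 8.0  # hypothetical element thickness

def folded_optical_path(deflections, thickness_mm=PRISM_THICKNESS_MM):
    """Crude model: entry segment plus one full traversal per
    deflection, all inside glass of index N_GLASS."""
    segments = deflections + 1
    return N_GLASS * segments * thickness_mm

for n_defl in (2, 3, 5, 7):
    print(f"{n_defl} deflections -> ~{folded_optical_path(n_defl):.1f} mm "
          "optical path in an 8 mm-thick element")
```

Even under this toy model, seven deflections fold roughly 97mm of optical path into an element only 8mm thick, which is the sense in which more deflections favor a longer focal length without a taller module.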
In some embodiments, the camera module 20 further includes a filter element 26, where the filter element 26 is disposed in the optical path of the light emitted from the optical conducting element 23 to the image sensor 22. The filter element 26 is used to filter the interference light emitted from the optical conducting element 23, so as to prevent the interference light from reaching the image sensor 22 and affecting normal imaging. In one embodiment, filter element 26 may be an infrared cut filter.
As shown in figs. 7 to 9, the optical conductive element 23 includes a light-transmitting surface 233, a bottom surface 234, a first side surface 231, a second side surface 232, the first reflecting surface 235, and the second reflecting surface 236. The light-transmitting surface 233 is parallel to the bottom surface 234. The first side surface 231 and the second side surface 232 are connected between the light-transmitting surface 233 and the bottom surface 234 and are perpendicular to the light-transmitting surface 233. The first reflecting surface 235 and the second reflecting surface 236 are likewise connected between the light-transmitting surface 233 and the bottom surface 234; they are perpendicular to the first side surface 231 and are arranged obliquely with respect to the light-transmitting surface 233. The optical axis of the image sensor 22 is perpendicular to the light-transmitting surface 233, the lens 21 faces the first reflecting surface 235 along the optical axis direction, and the image sensor 22 faces the second reflecting surface 236 along the optical axis direction.
It should be noted that light being reflected sequentially by the first reflecting surface 235 and the second reflecting surface 236 covers both the case where light reflected by the first reflecting surface 235 is directly incident on the second reflecting surface 236 and the case where light reflected by the first reflecting surface 235 reaches the second reflecting surface 236 after being reflected by other surfaces of the optical conductive element 23.
For example, as shown in connection with FIG. 7, in some embodiments, the optical conducting element 23 is configured such that light collected by the lens 21 is reflected to the image sensor 22 via the first reflective surface 235, the light transmissive surface 233, and the second reflective surface 236 in sequence.
For another example, in some embodiments, as shown in connection with fig. 8, the light-transmitting surface 233 includes first and second reflective portions 2331 and 2332 disposed at intervals. The optical conducting element 23 is configured such that light collected by the lens 21 is reflected to the image sensor 22 through the first reflecting surface 235, the first reflecting portion 2331, the bottom surface 234, the second reflecting portion 2332 and the second reflecting surface 236 in sequence.
In the embodiment of the present application, the optical conductive element 23 may be an integral prism or may be a combination of a plurality of prisms.
The optical conductive element 23 includes at least one trapezoidal prism 23a and/or at least one triangular prism 23b.
For example, as shown in connection with fig. 10, the optical conductive element 23 includes one trapezoidal prism 23a and one triangular prism 23b, and the cross section of the optical conductive element 23 formed by combining them is trapezoidal; it can be understood that the optical conductive element 23 is one larger trapezoidal prism formed by combining the trapezoidal prism 23a with the triangular prism 23b. As another example, as shown in connection with fig. 11, the optical conductive element 23 includes three trapezoidal prisms 23a, two of which are combined to form one larger trapezoidal prism.
The number and types of prisms in the optical conductive element 23 are not limited herein, as long as the optical conductive element 23 can deflect the light collected by the lens 21 so that the light enters the image sensor 22 and the shooting requirements of the camera module 20 are met.
The shapes of the first and second transmission-inducing films 24 and 25 may be identical to the shapes of the first and second reflecting surfaces 235 and 236 so as to completely cover the first and second reflecting surfaces 235 and 236. Of course, in some embodiments, as shown in connection with fig. 12, the first transmission-inducing film 24 need only cover the region of the first reflecting surface 235 or the second reflecting surface 236 that reflects light into the image sensor 22 for imaging. Therefore, the dimensions of the first and second transmission-inducing films 24 and 25 are not limited herein.
Referring to fig. 13, fig. 13 is a schematic structural diagram of an electronic device 10 according to an embodiment of the present application. The electronic device 10 may include radio frequency (RF) circuitry 501, a memory 502 including one or more computer-readable storage media, an input unit 503, a display unit 504, a sensor 505, audio circuitry 506, a wireless fidelity (WiFi) module 507, a processor 508 including one or more processing cores, and a power supply 509. Those skilled in the art will appreciate that the configuration shown in fig. 13 does not limit the electronic device 10, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The radio frequency circuitry 501 may be used to send and receive information, or to receive and transmit signals during a call. In particular, after receiving downlink information from a base station, it delivers the information to one or more processors 508 for processing, and it also sends uplink data to the base station. Typically, the radio frequency circuitry 501 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the radio frequency circuitry 501 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including, but not limited to, the Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 502 may be used to store applications and data. The applications stored in the memory 502 include executable code and may constitute various functional modules. The processor 508 performs various functional applications and data processing by running the applications stored in the memory 502. The memory 502 may mainly include a program storage area, which may store an operating system and the application required for at least one function (such as a sound playing function or an image playing function), and a data storage area, which may store data created according to the use of the electronic device 10 (such as audio data or a phonebook). In addition, the memory 502 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 502 may also include a memory controller to provide the processor 508 and the input unit 503 with access to the memory 502.
The input unit 503 may be used to receive input numbers, character information, or user characteristic information such as fingerprints, and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. In particular, in one embodiment, the input unit 503 may include a touch-sensitive surface, as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch-sensitive surface by the user using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection means according to a predetermined program. Alternatively, the touch-sensitive surface may comprise two parts: a touch detection device and a touch controller. The touch detection device detects a touch operation and passes the resulting signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 508, and can also receive and execute commands sent by the processor 508.
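The touch pipeline described above can be sketched in software. The following is a minimal illustrative model, not part of the disclosure: all class and function names are hypothetical, and the processor is stood in for by a simple callback.

```python
# Hypothetical sketch of the touch pipeline described above: the touch
# controller receives a raw signal from the touch detection device,
# converts it into touch-point coordinates, and forwards them to the
# processor. All names are illustrative assumptions.

def raw_signal_to_coordinates(raw_signal, panel_width, panel_height):
    """Map a normalized (0.0-1.0) raw touch reading to pixel coordinates."""
    norm_x, norm_y = raw_signal
    x = int(norm_x * (panel_width - 1))
    y = int(norm_y * (panel_height - 1))
    return x, y

class TouchController:
    def __init__(self, panel_width, panel_height, send_to_processor):
        self.panel_width = panel_width
        self.panel_height = panel_height
        # Callback standing in for delivery to the processor 508.
        self.send_to_processor = send_to_processor

    def on_touch_detected(self, raw_signal):
        """Handle a raw signal from the touch detection device."""
        coords = raw_signal_to_coordinates(
            raw_signal, self.panel_width, self.panel_height)
        self.send_to_processor(coords)
        return coords
```

In this sketch, the controller's only responsibilities are coordinate conversion and forwarding, mirroring the division of labor between the touch detection device and the touch controller described above.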
The display unit 504 may be used to display information entered by or provided to a user, as well as various graphical user interfaces of the electronic device 10, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 504 may include a display panel. Alternatively, the display panel may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, or the like. Further, the touch-sensitive surface may overlay the display panel; upon detecting a touch operation on or near it, the touch-sensitive surface passes the operation to the processor 508 to determine the type of the touch event, and the processor 508 then provides a corresponding visual output on the display panel based on the type of the touch event. Although in fig. 13 the touch-sensitive surface and the display panel are implemented as two separate components to realize the input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement the input and output functions. It is understood that the display screen may include the input unit 503 and the display unit 504.
The electronic device 10 may also include at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. In particular, the light sensor may include an ambient light sensor, which may adjust the brightness of the display panel according to the brightness of the ambient light, and a proximity sensor, which may turn off the display panel and/or the backlight when the electronic device 10 is moved to the ear. As one kind of motion sensor, a gravitational acceleration sensor may detect the magnitude of acceleration in all directions (typically, three axes), and may detect the magnitude and direction of gravity when stationary; it may be used for applications that recognize the attitude of the mobile phone (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). Other sensors that may further be configured in the electronic device 10, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, will not be described herein.
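Two of the sensor behaviors described above lend themselves to a short illustrative sketch: the ambient light sensor driving display brightness, and the gravitational acceleration sensor driving landscape/portrait switching. The mapping, thresholds, and names below are assumptions for illustration only, not taken from the disclosure.

```python
# Illustrative sketch (not from the disclosure) of two sensor behaviors:
# an ambient light reading mapped to a backlight level, and the x/y
# gravity components classified into a screen orientation.

def brightness_from_ambient(lux, min_level=10, max_level=255,
                            full_scale_lux=1000):
    """Map ambient illuminance (lux) linearly onto a backlight level."""
    lux = max(0, min(lux, full_scale_lux))
    return min_level + (max_level - min_level) * lux // full_scale_lux

def orientation_from_gravity(ax, ay):
    """Classify device orientation from the x/y gravity components (m/s^2).

    When the device is held upright, gravity dominates the y axis;
    when rotated on its side, gravity dominates the x axis.
    """
    return "landscape" if abs(ax) > abs(ay) else "portrait"
```

A real device would additionally smooth the sensor readings and apply hysteresis so that brightness and orientation do not flicker near a threshold; those refinements are omitted here for brevity.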
The audio circuit 506 may provide an audio interface between the user and the electronic device 10 through a speaker, a microphone, and so forth. On one hand, the audio circuit 506 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output. On the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 506 and converted into audio data; after the audio data is output to the processor 508 for processing, it may be sent, for example, to another electronic device 10 via the radio frequency circuit 501, or output to the memory 502 for further processing. The audio circuit 506 may also include an earphone jack to provide communication between a peripheral headset and the electronic device 10.
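The two conversion directions described above can be sketched as a quantization round trip. The following is a hypothetical illustration only: the 16-bit PCM format and the voltage scale are assumptions for the sketch, not details given in the disclosure.

```python
# Hypothetical illustration of the audio circuit's two conversion
# directions: quantizing a microphone voltage sample into 16-bit PCM
# audio data, and converting PCM audio data back into a speaker voltage.
# The scale factors are assumptions for the sketch.

FULL_SCALE = 32767  # maximum 16-bit signed PCM value

def mic_sample_to_pcm(voltage, max_voltage=1.0):
    """Quantize a signal in [-max_voltage, max_voltage] to 16-bit PCM."""
    clamped = max(-max_voltage, min(voltage, max_voltage))
    return int(round(clamped / max_voltage * FULL_SCALE))

def pcm_to_speaker_voltage(pcm, max_voltage=1.0):
    """Convert a PCM sample back to a voltage for the speaker."""
    return pcm / FULL_SCALE * max_voltage
```

The round trip is lossy only up to quantization error, which is why the audio data recovered from the microphone path can be processed, transmitted, or stored and still reproduce the original sound signal closely.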
Wireless fidelity (WiFi) belongs to a short-range wireless transmission technology. Through the wireless fidelity module 507, the electronic device 10 can help the user send and receive e-mail, browse web pages, access streaming media, and the like, thereby providing the user with wireless broadband internet access. Although fig. 13 illustrates the wireless fidelity module 507, it is understood that it is not a necessary component of the electronic device 10 and may be omitted entirely as desired without changing the essence of the invention.
The processor 508 is the control center of the electronic device 10. It connects various parts of the entire electronic device 10 using various interfaces and lines, and performs various functions of the electronic device 10 and processes data by running or executing the application programs stored in the memory 502 and invoking the data stored in the memory 502, thereby monitoring the electronic device 10 as a whole. Optionally, the processor 508 may include one or more processing cores. Preferably, the processor 508 may integrate an application processor, which primarily handles the operating system, user interfaces, application programs, and the like, with a modem processor, which primarily handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 508.
The electronic device 10 also includes a power supply 509 that provides power to the various components. Preferably, the power supply 509 may be logically connected to the processor 508 through a power management system, so that functions such as managing charging, discharging, and power consumption are performed through the power management system. The power supply 509 may also include one or more of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
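The charging, discharging, and power-failure-detection functions described above can be modeled in a few lines. This is a minimal sketch under assumed names and thresholds; the disclosure does not specify how the power management system is implemented.

```python
# A minimal sketch, under assumed names and thresholds, of the
# power-management functions described above: tracking charge level
# during charging/discharging and reporting status to the processor.

class PowerManagementSystem:
    def __init__(self, capacity_mah, charge_mah):
        self.capacity_mah = capacity_mah
        self.charge_mah = charge_mah

    def charge(self, mah):
        """Recharging: add charge, clamped at full capacity."""
        self.charge_mah = min(self.capacity_mah, self.charge_mah + mah)

    def discharge(self, mah):
        """Discharging: remove charge, clamped at zero."""
        self.charge_mah = max(0, self.charge_mah - mah)

    @property
    def percent(self):
        return 100 * self.charge_mah // self.capacity_mah

    def status(self):
        """Status report, standing in for the power status indicator."""
        if self.charge_mah == 0:
            return "power failure"
        return "low battery" if self.percent < 20 else "ok"
```

In a real device this logic would live in a dedicated power-management IC and communicate with the processor over a bus; the 20% low-battery threshold here is purely illustrative.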
Although not shown in fig. 13, the electronic device 10 may further include a Bluetooth module and the like, which are not described herein. In implementation, each of the above modules may be implemented as an independent entity, or may be combined arbitrarily and implemented as the same entity or several entities; for the implementation of each module, reference may be made to the foregoing method embodiments, which are not described herein again.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of technical features involves no contradiction, it should be considered within the scope of this description.
The foregoing examples illustrate only a few embodiments of the application, which are described in detail but are not to be construed as limiting the scope of the claims. It should be noted that several variations and modifications can be made by those of ordinary skill in the art without departing from the spirit of the application, all of which fall within the scope of protection of the application. Accordingly, the scope of protection of the present application shall be determined by the appended claims.