Detailed Description
It should be noted that in this specification, like reference numerals and letters refer to like items in the following drawings, and thus, once an item is defined in one drawing, it need not be further defined and explained in subsequent drawings.
To make the objects, technical solutions, and advantages of the present application clearer, embodiments of the present application are described in further detail below with reference to the accompanying drawings.
The present application provides an electronic device, which includes but is not limited to a mobile phone, a tablet computer, a television, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a wearable device, a virtual reality device, a camera, and other electronic devices that can be used for taking a picture.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, an ambient light sensor 180L, a touch sensor 180K, and the like.
It is to be understood that the illustrated structure of the present application does not constitute a limitation on the electronic device 100. In other embodiments, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. A memory may also be provided in the processor 110 for storing instructions and data. The processor 110 is configured to cause the electronic device 100 to execute the photographing method provided by the present application.
The charging management module 140 is used for receiving charging input from a charger and transmitting the charging input to the battery 142 for charging; the power management module 141 is used for managing the power supply of the battery 142.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The mobile communication module 150 may provide wireless communication functions applied to the electronic device 100, including 2G/3G/4G/5G and the like.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including Wireless Local Area Network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), and the like.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication techniques.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened and light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also perform algorithm-based optimization of the noise, brightness, and skin tone of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The pressure sensor 180A is used for sensing a pressure signal and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. When a touch operation is applied to the display screen 194, the electronic device 100 detects the touch operation through the pressure sensor 180A and determines the user's operation.
The distance sensor 180F is used for measuring distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a photographing scene, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The ambient light sensor 180L is used to sense the ambient light level. The electronic device 100 may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 together form a touchscreen. The touch sensor 180K is used to detect a touch operation applied to it or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
The button 190 includes a power button, a volume button, and the like. The button 190 may be a mechanical button or a touch button. The electronic device 100 may receive a button input and generate a button signal input related to user settings and function control of the electronic device 100.
The electronic device 100 can be used for photographing. Although human eyes can only receive visible light with wavelengths of about 400 nm to 700 nm, the photosensitive element in the camera 193 can receive not only electromagnetic waves in the visible band but also electromagnetic waves in invisible bands such as near infrared (NIR) and ultraviolet (UV) light.
In practical use of the electronic device 100, in order to facilitate the processing of the photographing algorithm, manufacturers of the electronic device 100 generally use hardware such as a near infrared cut filter (NIR Cut) in the camera 193 to block light in infrared bands such as near infrared from imaging, and use other corresponding filters to block light in bands such as ultraviolet.
Furthermore, an image obtained by imaging light in the near infrared band or the ultraviolet band is a single-channel image without color, whereas visible light yields an RGB image, which is in color.
In addition, crosstalk easily forms between the near infrared and ultraviolet bands and the visible band. In current approaches that separate the visible band from the infrared, ultraviolet, and other bands, color distortion in the visible band is severe, so existing methods of separating near infrared, ultraviolet, and visible light cannot achieve true restoration of the image.
For example, as shown in fig. 2, curve D shows the relative sensitivity of near infrared rays over the respective wavelength ranges, and curves B, A, and C show the relative sensitivities, over the respective wavelength ranges, of the red (R), green (G), and blue (B) color channels of a visible-light RGB image when near infrared rays are not filtered out and are superimposed on the visible light. It can be seen that the R, G, and B responses are effectively enhanced if the near infrared rays are not filtered out.
In order to improve the effect of the shot photos, the present application provides a photographing method that determines the sensitivity according to the brightness information of the current photographing environment, determines the photographing mode according to the sensitivity, and takes photos in the corresponding photographing mode, thereby improving the quality of the shot photos.
When taking a photograph, a photo with infrared rays filtered out is usually taken, so the camera 193 in the electronic device 100 may include a camera capable of filtering out infrared rays, and of course may also include a camera that does not filter out infrared rays.
In the following description, a mobile phone having a photographing function is taken as an example of the electronic device 100. As a carrier to which the embodiments of the present application are applied, the electronic device 100 may also be another device with a photographing function, such as a tablet computer.
Further, referring to fig. 3, fig. 3 is a block diagram of the structure of a camera 193 equipped on a mobile phone. In one implementation manner of the present application, the camera 193 may include only one camera 1931, and the camera 1931 includes a lens 193A, a photosensitive element 193B, a dual-filter switcher 193C, and the like.
The dual-filter switcher 193C includes an infrared cut filter 193C1, a full spectrum optical glass 193C2, and a control component 193C3. The infrared cut filter 193C1 can reflect or absorb infrared rays such as near infrared rays, filtering them out so that they do not affect the imaging color. For example, when visible light is sufficient, such as in the daytime, the control component 193C3 controls the infrared cut filter 193C1 to be in the in-place state; that is, light passing through the lens 193A passes through the infrared cut filter 193C1 before reaching the photosensitive element 193B, so that infrared rays such as near infrared rays are filtered out by the infrared cut filter 193C1 and prevented from reaching the photosensitive element 193B. When visible light is insufficient, such as at night, the control component 193C3 controls the infrared cut filter 193C1 to be removed, so that the light is no longer filtered and the full spectrum optical glass 193C2 takes effect; infrared rays can then be received by the photosensitive element 193B, which can make full use of all incoming light, greatly improving the night vision performance of the camera 193 and making the shot picture clearer.
The control component 193C3 may include a circuit control board, a motor, and a transmission mechanism, and the control component 193C3 may control the in-place and removed states of the infrared cut filter 193C1 according to control commands from the processor 110. The motor may be a drive coil or a stepper motor.
For example, in the present application, an automatic switching motor may be added to the photosensitive element (camera sensor) of an ordinary RGB camera, and the motor may be used to control the in-place and removed positions of the infrared cut filter that originally covers the photosensitive element.
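By way of illustration only, the control logic of the control component 193C3 could be modeled in software as a small state machine. The following Python sketch is a hypothetical model: the FilterSwitcher class and its method names are invented for illustration and are not part of this application.

```python
from enum import Enum

class FilterState(Enum):
    IN_PLACE = "in_place"  # infrared cut filter 193C1 in the optical path
    REMOVED = "removed"    # full spectrum optical glass 193C2 in the optical path

class FilterSwitcher:
    """Hypothetical software model of the dual-filter switcher 193C."""

    def __init__(self):
        # The filter defaults to the in-place state when the camera opens.
        self.state = FilterState.IN_PLACE

    def move_in_place(self):
        # In a real device this would command the motor or drive coil
        # through the circuit control board; here it only records the state.
        self.state = FilterState.IN_PLACE

    def remove(self):
        self.state = FilterState.REMOVED
```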
In addition, since the dual-filter switcher 193C filters near infrared rays, the dual-filter switcher 193C can also be a near infrared dual-filter switcher (NIR-CUT), and the infrared cut filter 193C1 can be a near infrared cut filter.
Based on the camera 1931 shown in fig. 3, in one implementation of the photographing method provided by the present application, when taking a picture, the infrared cut filter 193C1 can be automatically controlled to be in place or removed according to the ambient light brightness; for example, the infrared cut filter 193C1 is controlled to be in place to take a photo with infrared rays filtered out, and controlled to be removed to take a photo without infrared rays filtered out.
Further, in the present application, when the camera 1931 is turned on, the infrared cut filter 193C1 in the camera 1931 is in the in-place state by default.
For example, referring to fig. 4A, the photographing method provided by the present application includes:
S100, the mobile phone detects that the user calls the camera function, opens the camera 1931, and displays a photographing interface.
It should be noted that the call here may refer to opening a camera application, or may refer to calling a camera function through another application.
For example, referring to fig. 4B, a "camera" application is displayed on the screen of the mobile phone, and in addition, applications such as "phone", "browser", etc. are also displayed on the screen of the mobile phone, although other applications may also be displayed on the screen of the mobile phone.
The user clicks the "camera" application on the screen, the mobile phone detects the click operation of the user on the "camera" application, opens the camera 1931, and displays a photographing interface as shown in fig. 4C on the screen of the mobile phone.
Illustratively, referring to fig. 4C, the photographing interface includes an image display area 10 and a photographing button 20; the image display area 10 is used for displaying a preview image formed by the camera 1931, and the photographing button 20 is used for the user to click to take a photo.
In addition, it should be noted that after the camera 193 is turned on, the photographing interface may by default be the display interface corresponding to the "photographing" mode, as shown in fig. 4C. The display interface may further include controls such as a "night view" photographing mode, a "portrait" photographing mode, and a "video recording" mode, so that the photographing mode can be selected through these controls. Furthermore, the photographing interface may also include controls such as a flash control and a settings control, which can be selected as needed.
In an implementation manner of the present application, the mobile phone may execute the following S200 to S1200 in the "photographing" mode by default.
In addition, as shown in fig. 4D, the photographing interface may further include a "professional" photographing mode, and if the user switches to the "professional" photographing mode, the mobile phone performs the following S200 to S1200.
S200, the mobile phone acquires the ambient light brightness of the photographing environment and determines the light sensitivity (ISO value) according to the ambient light brightness. The ambient light brightness is the measured brightness value of the environment, and the ISO value reflects the level of the ambient light brightness.
Illustratively, after the camera 1931 is turned on, the camera 1931 automatically acquires frames and displays a preview image on the photographing interface. In addition, the mobile phone senses the ambient light brightness of the current environment through the ambient light sensor 180L, and the ambient light sensor 180L transmits the sensed ambient light brightness to the processor 110. The processor 110 determines, in real time, the sensitivity corresponding to the ambient light brightness of the current photographing environment.
The sensitivity corresponding to the ambient light brightness of the photographing environment may be determined by a common method. For example, a correspondence table between the ambient light brightness of the photographing environment and the sensitivity may be stored in the mobile phone, and the sensitivity may be determined according to the correspondence table.
For example, the correspondence between the ambient light brightness and the sensitivity may be as shown in table 1 below.
| Ambient light brightness (lx) | Sensitivity (ISO value) |
| 1500 | 200 |
| 1200 | 400 |
| 500 | 1000 |
| 200 | 1500 |
| 10 | 2000 |
| …… | …… |
TABLE 1
For example, when the ambient light brightness sensed by the ambient light sensor 180L is 1500 lx, the determined sensitivity is 200.
Further, it should be noted that the relationship between the ISO value of the mobile phone and the ambient light brightness may be set (calibrated) according to the photosensitivity of the different photosensitive elements used.
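As a minimal sketch of how such a correspondence table might be consulted in software, the following Python fragment mirrors table 1; the lookup rule, the function name, and the treatment of brightness values between table rows are assumptions for illustration, not details of this application.

```python
# Correspondence between ambient light brightness (lx) and ISO value,
# mirroring table 1 above; sorted from brightest to darkest.
BRIGHTNESS_TO_ISO = [
    (1500, 200),
    (1200, 400),
    (500, 1000),
    (200, 1500),
    (10, 2000),
]

def iso_for_brightness(lux: float) -> int:
    """Return the ISO value for a measured ambient light brightness.

    Assumed rule: use the first table row whose brightness the
    measurement meets or exceeds; anything darker than the last row
    falls through to the highest ISO value.
    """
    for brightness, iso in BRIGHTNESS_TO_ISO:
        if lux >= brightness:
            return iso
    return BRIGHTNESS_TO_ISO[-1][1]

# Example: 1500 lx maps to ISO 200, as in the text above.
assert iso_for_brightness(1500) == 200
```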
Further, in the present application, referring to fig. 4D, sensitivity information, such as "ISO 640", is displayed on the photographing interface in real time. It should be noted that the mobile phone may automatically adjust the displayed sensitivity information as the ambient light changes; in addition, the user may tap the sensitivity information to adjust the sensitivity manually.
S300, the mobile phone receives the photographing operation of the user and determines a photographing mode according to the current sensitivity.
For example, the mobile phone receives a trigger operation of the user on the photographing button 20, acquires the current sensitivity (that is, the sensitivity at the moment the mobile phone receives the trigger operation on the photographing button 20), and determines the photographing mode according to the current sensitivity.
If the detected sensitivity is less than 400 (an example of the first threshold), the mobile phone enters the bright scene photographing mode and executes S400, keeping the infrared cut filter 193C1 in the in-place state for photo taking.
If the sensitivity is greater than 2000 (an example of the second threshold), the mobile phone enters the dark scene photographing mode and executes S500, with the infrared cut filter 193C1 in the removed state for photo taking.
If the sensitivity is greater than 400 and less than 2000, the mobile phone enters the normal photographing mode and executes S700, taking photos with the infrared cut filter 193C1 in the in-place state and in the removed state respectively.
In the above example, the first threshold is 400, and the ambient light brightness corresponding to the first threshold is in the brightness range at the boundary between daytime and evening; the second threshold is 2000, and the ambient light brightness corresponding to the second threshold is in the brightness range at the boundary between evening and night.
In the above example, the ISO value is used as the reference basis for selecting the photographing mode. Other parameters representing the ambient light brightness may also be used as the reference basis for selecting the photographing mode.
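The threshold logic of S300 can be summarized in a short sketch. The thresholds 400 and 2000 are taken from the example above; the mode names, the select_mode function, and the handling of the boundary values themselves are illustrative assumptions.

```python
FIRST_THRESHOLD = 400    # around the boundary between daytime and evening
SECOND_THRESHOLD = 2000  # around the boundary between evening and night

def select_mode(iso: int) -> str:
    """Map the current ISO value to a photographing mode (a sketch of S300)."""
    if iso < FIRST_THRESHOLD:
        return "bright_scene"  # S400: filter in place, single photo
    if iso > SECOND_THRESHOLD:
        return "dark_scene"    # S500: filter removed, single photo
    # Boundary values (exactly 400 or 2000) are not specified in the text;
    # this sketch groups them with the normal mode.
    return "normal"            # S700: two photos, then fusion
```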
Further, in this application, as an alternative to S300 in which the mobile phone receives the photographing operation of the user and determines the photographing mode according to the current sensitivity, when the camera 1931 is turned on for photographing, if the mobile phone detects that it has not moved within a preset time, the mobile phone automatically takes a photo and determines the photographing mode according to the current sensitivity. The preset time may be 3 s, 5 s, or the like, and can be selected as needed.
S400, the mobile phone keeps the infrared cut filter 193C1 in the in-place state by default for photo taking.
S500, the mobile phone removes the infrared cut filter 193C1 for photo taking, and S600 is performed.
S600, the mobile phone takes a photo to obtain the shot photo.
It should be noted that, after S600, a single-frame image enhancement algorithm may be called to enhance the shot photo, so as to obtain a new shot photo, which is stored in the system memory. The single-frame image enhancement algorithm includes, but is not limited to, color and luminance adjustment algorithms known in the art such as AIHDR.
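As a stand-in for the kind of single-frame enhancement mentioned above (the actual AIHDR-style algorithm is not specified here), a simple gamma lift illustrates the general idea of adjusting color and luminance on a single frame:

```python
import numpy as np

def enhance_single_frame(photo: np.ndarray, gamma: float = 0.8) -> np.ndarray:
    """Toy single-frame enhancement: a gamma lift that brightens shadows.

    This stands in for the AIHDR-style color/luminance algorithms the
    text mentions; it is not the algorithm of this application.
    """
    normalized = photo.astype(np.float32) / 255.0
    return (np.power(normalized, gamma) * 255.0).astype(np.uint8)
```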
S700, the mobile phone removes the infrared cut filter 193C1 for photo taking.
S800, the mobile phone takes a first photo.
The mobile phone captures one frame of image as the first photo; the first photo, taken after the infrared cut filter 193C1 is removed, is a fused image of an RGB image and an NIR image.
S900, the mobile phone moves the infrared cut filter 193C1 back, so that the infrared cut filter 193C1 is in the in-place state for photo taking.
S1000, the mobile phone takes a second photo.
The mobile phone captures one frame of image as the second photo; the second photo is an RGB image taken without removing the infrared cut filter 193C1.
S1100, the mobile phone performs photo fusion on the first photo and the second photo to obtain the shot photo.
For example, the mobile phone may select a corresponding image fusion algorithm as needed to fuse the first photo and the second photo. The photo fusion of the first photo and the second photo includes, but is not limited to, performing color supplementation and other fusion operations on the RGB image of the second photo according to the NIR image information in the first photo, so as to obtain a more realistic photo.
Illustratively, as shown in fig. 5, the left part shows the raw data (Raw Data) of the first photo, and the right part shows the fusion processing of visible light (VIS signals) and near infrared light (NIR signals); in addition, noise reduction processing may be performed, for example for shot noise and thermal noise.
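As a loose illustration of the kind of fusion S1100 describes, the sketch below lifts the RGB photo toward an NIR frame used as a luminance guide; the blending rule and the weight are invented for illustration and are not the fusion algorithm of this application.

```python
import numpy as np

def fuse_rgb_nir(rgb: np.ndarray, nir: np.ndarray, weight: float = 0.3) -> np.ndarray:
    """Blend an RGB photo (H x W x 3, uint8) with an NIR frame (H x W, uint8).

    Illustrative rule only: lift each color channel toward the NIR
    luminance, which tends to recover detail in dim regions while
    keeping the RGB colors as the base.
    """
    rgb_f = rgb.astype(np.float32)
    nir_f = nir.astype(np.float32)[..., np.newaxis]  # broadcast over channels
    fused = (1.0 - weight) * rgb_f + weight * nir_f
    return np.clip(fused, 0, 255).astype(np.uint8)
```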
Further, after the captured picture is obtained, the method may further include:
step S1200: and storing and displaying the shot pictures.
For example, after the shot photo is obtained, the mobile phone stores it in the system memory, and then outputs the image stored in the memory as image data in a specified format, which is the photo finally saved for the user.
For example, the shot photo may be displayed directly on a post-shot photo display interface as shown in fig. 4E for the user to view. In addition, the photo display interface may further include a sharing control, an editing control, a deleting control, and the like, for the user to operate on the shot photo; these can be selected as needed.
It should be noted that steps S700, S800, S900, and S1000 may be performed in the opposite order; that is, the mobile phone may first perform S900 and capture the second photo without removing the infrared cut filter 193C1, and then perform S700 and capture the first photo with the infrared cut filter 193C1 removed.
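Putting S700 to S1100 together, and reusing the hypothetical FilterSwitcher sketch above, the normal-mode capture sequence might look as follows; camera.capture_frame and the fuse callback are assumed interfaces, not APIs of this application.

```python
def capture_normal_mode(switcher, camera, fuse):
    """Sketch of S700-S1100; either capture order is allowed, per the text."""
    switcher.remove()                       # S700: infrared cut filter removed
    first_photo = camera.capture_frame()    # S800: frame carrying NIR information
    switcher.move_in_place()                # S900: filter moved back in place
    second_photo = camera.capture_frame()   # S1000: RGB-only frame
    return fuse(first_photo, second_photo)  # S1100: photo fusion
```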
For example, in an actual photographing scene, if photographing is performed under insufficient light, such as at evening, a photo taken with a current ordinary camera may be as shown in fig. 6A, in which the picture is relatively blurred and unclear, while a photo taken in the third photographing mode of the photographing method provided by the present application may be as shown in fig. 6B.
In addition, in an actual photographing scene, if photographing is performed under insufficient light at night or the like, a photo taken with a current ordinary camera may be as shown in fig. 6C, in which the picture is very unclear, while a photo taken in the second photographing mode of the photographing method provided by the present application may be as shown in fig. 6D.
According to the photographing method provided by the present application, when photographing, the photographing mode can be determined according to the ambient light brightness, and the state of the infrared cut filter 193C1 is then automatically switched according to the determined photographing mode. When the ambient light brightness is high, for example when light is sufficient in the daytime, the infrared cut filter 193C1 is in the in-place state and a photo with infrared rays filtered out is taken as the shot photo, so that the influence of infrared rays on the photo, such as color distortion, can be avoided and a more realistic photo can be obtained. When the ambient light brightness is low, for example when light is insufficient at night, the infrared cut filter 193C1 is removed and a photo without infrared rays filtered out is taken as the shot photo, so that the light can be fully utilized to obtain a clear picture, avoiding the problem of unclear images under insufficient light and improving the night shooting effect. In addition, when the ambient light brightness is dim, such as at evening, a photo with infrared rays filtered out and a photo without infrared rays filtered out can each be taken and then fused into the final shot photo, which ensures the clarity of the shot image while avoiding the influence of infrared rays on the photo, such as color distortion, so that a more realistic and clearer photo can be obtained. In this way, by determining the photographing mode according to the ambient light brightness, realistic and clear photos can be taken under different ambient light levels, the effect of the shot photos is improved, photos that satisfy users are obtained, and the photographing experience of users is improved.
At present, methods in the prior art for improving the photographing effect in scenes with low ambient light brightness mainly rely on increasing the photosensitive area and the aperture size of the photosensitive element in the electronic device, thereby increasing the amount of incoming light in the visible band and improving the image quality of the shot photo. However, the corresponding hardware cost of such photosensitive elements is extremely high, and, limited by the manufacturing process, the photosensitive area and aperture size cannot be increased without limit, so the image quality of the shot photo is limited and cannot be improved much further. Moreover, in scenes where the ambient light brightness is extremely low, for example in dark night, some objects emit and reflect visible light only weakly, so even a photosensitive element with a large aperture cannot produce a high-quality image.
Compared with the prior art, the photographing method of the present application can obtain higher-quality images without a photosensitive element having a large photosensitive area and a large aperture, which not only improves the quality of the obtained photos but also reduces the manufacturing cost of the mobile phone.
In addition, in the present application, the shooting of the first photo and the second photo can be completed with a single camera 1931 by dynamically switching the infrared cut filter 193C1 between the in-place and removed states, and the subsequent photo fusion then yields the shot photo; this reduces the production cost of the mobile phone and can further reduce its volume.
Further, in this application, the camera 1931 may be a front camera of the mobile phone or a rear camera; of course, the mobile phone may also include two cameras 1931 serving as a front camera and a rear camera respectively, which can be selected as needed.
Further, in another implementation manner of the present application, the camera 193 may include two cameras. As shown in fig. 7, a main camera 1931 and a sub camera 1932 may be included, where the main camera 1931 may be a camera including the dual-filter switcher 193C, and after the camera 1931 is turned on, the infrared cut filter 193C1 in the dual-filter switcher 193C is in the in-place state by default; the sub camera 1932 may be a camera not provided with the dual-filter switcher 193C.
Therefore, the main camera 1931 can be used for taking photos with infrared rays filtered out, and the sub camera 1932 for taking photos without infrared rays filtered out.
Referring to fig. 8, another photographing method provided by the present application includes:
S110, the mobile phone detects that the user calls the camera function, opens the camera 1931 and the camera 1932, and displays a photographing interface.
S200, the mobile phone acquires the ambient light brightness of the photographing environment and determines the sensitivity according to the ambient light brightness.
S300, the mobile phone receives the photographing operation of the user and determines the photographing mode according to the current sensitivity. If the sensitivity is less than the first threshold, the photographing mode is the first photographing mode: S410 is executed, and a photo is taken through the camera 1931. If the sensitivity is greater than the second threshold, the photographing mode is the second photographing mode: S510 is executed, and a photo is taken through the camera 1932. If the sensitivity is greater than the first threshold and less than the second threshold, the photographing mode is the third photographing mode: S610 and S710 are executed, and photos are taken through the camera 1931 and the camera 1932 respectively.
S410, the mobile phone takes a photo through the camera 1931 to obtain the shot photo.
S510, the mobile phone takes a photo through the camera 1932 to obtain the shot photo.
S610, the mobile phone takes a photo through the camera 1931 to obtain a first photo.
S710, the mobile phone takes a photo through the camera 1932 to obtain a second photo.
S810, the mobile phone performs photo fusion on the first photo and the second photo to obtain the shot photo.
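The two-camera flow differs from the single-camera flow mainly in which camera supplies each frame. The following sketch summarizes S300 to S810 of this embodiment; capture_frame, the fuse callback, and the threshold values 400 and 2000 are assumptions carried over from the earlier example.

```python
def take_photo_two_cameras(iso: int, main_cam, sub_cam, fuse):
    """Sketch of S300-S810 in the two-camera implementation.

    main_cam stands for camera 1931 (infrared cut filter in place by
    default) and sub_cam for camera 1932 (no filter switcher, so
    infrared passes through).
    """
    if iso < 400:                          # first photographing mode
        return main_cam.capture_frame()    # S410: infrared filtered out
    if iso > 2000:                         # second photographing mode
        return sub_cam.capture_frame()     # S510: infrared not filtered
    first = main_cam.capture_frame()       # S610: infrared filtered out
    second = sub_cam.capture_frame()       # S710: infrared not filtered
    return fuse(first, second)             # S810: photo fusion
```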
Further, after the shot picture is obtained, the method further comprises the following steps:
s910: the mobile phone stores and displays the shot pictures.
Note that, in this embodiment, details of the parts that are the same as in the photographing method shown in fig. 4A are not repeated.
According to the photographing method provided by this embodiment, the photographing mode can be determined according to the ambient light brightness of the photographing environment, and photos with infrared rays filtered out and photos without infrared rays filtered out can be taken through the main camera 1931 and the sub camera 1932 respectively, so that photographing suited to different ambient light brightness levels can be conveniently achieved, the quality of the shot photos is improved, and the user experience is improved.
Further, referring to fig. 9, in the present application, the camera 193 may include the aforementioned camera 1931 and a camera 1933 for shooting wide-angle images, where the mobile phone may take a photo without infrared rays filtered out or a photo with infrared rays filtered out through the camera 1931, and take a photo with infrared rays filtered out through the camera 1933.
Alternatively, referring to fig. 10, in the present application, the camera 193 may include the aforementioned camera 1931, camera 1932, and camera 1933 for shooting wide-angle images. In actual use, the mobile phone 100 may, according to the operation of the user, select the camera 1931 to take a photo without infrared rays filtered out or a photo with infrared rays filtered out, select the camera 1932 to take a photo without infrared rays filtered out, or select the camera 1933 to take a photo with infrared rays filtered out.
Alternatively, referring to fig. 11, in the present application, the camera 193 may include the aforementioned camera 1932 and a camera 1933 for shooting wide-angle images. The mobile phone can take photos without infrared rays filtered out through the camera 1932, and take photos with infrared rays filtered out through the camera 1933.
Since the camera 1933 for shooting wide-angle images obtains photos with infrared rays filtered out, in the above photographing method the camera 1933 can be used instead of the aforementioned camera 1931 to take the photos with infrared rays filtered out.
For example, in the foregoing steps S900 to S1100, the mobile phone may take a third photo through the camera 1933, where the third photo is a wide-angle photo, and the mobile phone performs photo fusion on the first photo and the third photo to obtain the shot photo.
Illustratively, for the aforementioned S610, the mobile phone may take a third photo through the camera 1933, where the third photo is a wide-angle photo; for S810, the mobile phone performs photo fusion on the third photo and the second photo to obtain the shot photo.
Further, in the present application, the filtering function of the infrared cut filter 193C1 may also be turned on and off by an electrical signal. For example, the coating of the infrared cut filter 193C1 may be an electrochromic layer, and the filtering function is switched on and off by controlling the color change of the electrochromic layer: for example, when the electrochromic layer is colorless it does not filter infrared rays, and when it is blue it filters infrared rays. Of course, the infrared filtering function of the infrared cut filter 193C1 may also be turned on and off in other ways.
It should be noted that the mobile phone 100 provided in the present application may further include other cameras besides the aforementioned camera 1931, camera 1932, and camera 1933, which may be set as needed and are not specifically described in the present application.
It should be noted that the electronic device provided in the present application may be an electronic device provided with a multispectral sensor, so as to implement the switching of the dual-filter switcher 193C.
It should be noted that the terms "first," "second," and the like are used merely to distinguish one description from another, and are not intended to indicate or imply relative importance.
It should be noted that in the accompanying drawings, some structural or methodical features may be shown in a particular arrangement and/or order. However, such specific arrangement and/or ordering may not be required; rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodical feature in a particular figure does not imply that such a feature is required in all embodiments; in some embodiments, it may not be included, or it may be combined with other features.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that the foregoing is a detailed description of the present application and that the present application is not limited to these details. Various changes in form and detail, including simple deductions or substitutions, may be made by those skilled in the art without departing from the spirit and scope of the present application.