Detailed Description
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
Currently, an electronic device may provide a photographing function to a user through a camera module configured in the electronic device.
By way of example, the camera module may include a camera and an image sensor. In the case where a plurality of camera modules are provided in the electronic device, each camera module may correspond to one camera. Different cameras may be configured with their own image sensors, or two or more of the different cameras may share one image sensor.
After acquiring the optical signals collected by the camera, the image sensor can output original images in different modes for different scenes. The original image may be a RAW image.
By way of example, the different modes may include modeA and modeB. The frame rate of the image output in modeA is different from the frame rate of the image output in modeB. In some cases, the image size of the original image output in modeA may also differ from that of the original image output in modeB. For example, modeA may be a binning mode and modeB may be an InSensorZoom mode.
The frame rate indicates the frequency at which the image sensor is exposed and outputs an image per unit time. In the following description, the process in which the image sensor is exposed and outputs an image is simply referred to as the image sensor outputting a frame. The frame rate is generally expressed in fps (frames per second). For example, a frame rate of 30fps means that 30 frames of images are output within 1 second.
The image size is used to represent the size of the original image output by the image sensor.
As one possible implementation, the image size may include a length, a width, a resolution, etc. of the image data.
As yet another possible implementation, the image size may include parameters such as a line length and a frame length corresponding to the image data.
The line length (LINE_LENGTH) is the length of one line of image data output by the image sensor. The line length may include the horizontal blanking (HBLANK). The line length may also be referred to as HTS.
The frame length (FRAME_LENGTH) is the number of lines of one frame of image. The frame length may include the vertical blanking (VBLANK). The frame length may also be referred to as VTS.
It is understood that the line length may correspond to the length of the original image output by the image sensor, and the frame length may determine the width of the original image output by the image sensor. Together, the line length and the frame length may determine the resolution of the output image.
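The relationship between the line length, the frame length, and the frame rate can be illustrated with a short sketch. The sketch below is only an illustrative example and is not part of the embodiment; the pixel clock value and the numeric parameters are assumptions introduced for the example.

```cpp
#include <cstdint>
#include <iostream>

// Minimal sketch: how the line length (HTS), the frame length (VTS) and the
// pixel clock together determine the frame rate of the output image.
// All numeric values below are illustrative assumptions, not sensor data.
int main() {
    const double   pixel_clock_hz = 600e6; // assumed pixel clock
    const uint32_t line_length    = 5000;  // HTS: pixels per line, incl. HBLANK
    const uint32_t frame_length   = 4000;  // VTS: lines per frame, incl. VBLANK

    const double line_time_s  = line_length / pixel_clock_hz;
    const double frame_time_s = frame_length * line_time_s;
    const double frame_rate   = 1.0 / frame_time_s;   // fps

    std::cout << "frame rate = " << frame_rate << " fps\n";  // 30 fps here
    return 0;
}
```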
Thus, by controlling the image sensor to output original images in modeA or modeB, the electronic device can acquire and display the original images required in different scenes.
For example, the following description takes the case where a camera application is installed in the electronic device and the electronic device controls the image sensor to output frames through the camera application, thereby providing a photographing function.
Referring to FIG. 1, a schematic diagram of an interface interaction is shown.
As shown in fig. 1, the electronic device may display an icon 102 of a camera application on a main interface 101. Upon receiving a click operation 103 of the icon 102 by the user, the electronic device may correspondingly display an interface 104 of the camera application.
Take the case where the camera application provides a photographing function by default after being opened as an example. As shown by interface 104 in fig. 1, the electronic device may display the current shooting mode, "photo", in the center of function bar 105. A preview image 106 of the object to be photographed may also be included in the interface 104. When the user decides to take a picture, the user can click the button 107. Correspondingly, in response to the button 107, the camera application may perform processing such as photographing and saving of the current photographing object.
In some implementations, as shown in FIG. 1, other function options may also be displayed in the function bar 105. For example, the other function options may include a current focal length magnification (e.g., current magnification of 1×, optional magnification of 2×). As another example, other functional options may include other shooting modes (e.g., video mode, HDR mode), and so forth.
Referring to fig. 2, an internal interaction diagram of an electronic device is shown. Based on the interaction example shown in fig. 2, the electronic device may provide the user with the corresponding photographing function when the user inputs operation 103.
As shown in fig. 2, after receiving the user's operation 103, the camera application sends an instruction 21 to a camera abstraction (camera HAL) configured in the electronic device.
For example, the instruction 21 may include operation information corresponding to operation 103.
In some implementations, a plurality of camera modules may be configured in the electronic device. Different camera modules may be configured with different camera identifiers, so that by carrying a camera identifier in an instruction, the electronic device can correctly invoke the corresponding camera module.
For example, in some embodiments, the camera abstraction in the electronic device may itself determine, according to the operation information carried in the instruction 21, the camera identifier of the camera module that needs to be started. In other implementations, the operation information may also include the camera identifier of the camera module that needs to be started.
The camera abstraction may also determine whether the image sensor should be configured to output frames in modeA or modeB. In the present application, the frame rate of modeA may also be referred to as the second frame rate, and the frame rate of modeB may also be referred to as the first frame rate. When modeA is used, the interval between two adjacent output frames is a first time length; when modeB is used, the interval between two adjacent output frames is a second time length.
The camera abstraction can then send an instruction 22 to the camera driver corresponding to the camera module to be started. The instruction 22 may carry the configuration parameters of modeA or modeB, which instruct the image sensor to output the original image according to the indicated mode.
In the case where a plurality of camera modules are arranged in the electronic device, each camera module may be provided with a corresponding camera driver. The camera driver may be a software mapping of the corresponding camera module in the electronic device. When a certain camera module needs to be used, the electronic device may send the related instruction to the corresponding camera driver.
In this way, the camera abstraction can send the instruction 22 to the camera driver of the camera module that needs to be started. The camera driver may control the corresponding camera module to operate according to the instruction 22. For example, the camera driver may send the instruction 22 to the image sensor of the camera module to control the image sensor to output frames.
It should be noted that one or more registers may be configured in the camera driver, for storing various configuration parameters that need to be used in the image sensor mapping process.
For example, referring to fig. 3, a register 31 may be configured in the camera driver. After receiving the instruction 22, the camera driver may write the configuration parameters carried by the instruction 22 into the register 31. For example, the configuration parameters carried by the instruction 22 may include the configuration parameter A of modeA. The configuration parameter A of modeA may include a frame rate f0 and an image size S1 corresponding to modeA.
Thus, the camera driver can control the image sensor to output frames according to the configuration parameter A in the register 31, as shown in fig. 3.
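A minimal sketch of how a frame output configuration such as the configuration parameter A (frame rate f0, image size S1) might be represented and written into the register 31 of the camera driver is given below. The structure, field names, function names, and numeric values are assumptions used only for illustration.

```cpp
#include <cstdint>

// Sketch of a frame output configuration such as configuration parameter A.
// Field and function names are illustrative assumptions.
struct SensorModeConfig {
    double   frame_rate;    // e.g. f0 for modeA
    uint32_t image_width;   // part of image size S1
    uint32_t image_height;
};

// Illustrative stand-in for the register 31 held by the camera driver.
struct CameraDriver {
    SensorModeConfig register31{};

    void WriteConfig(const SensorModeConfig& cfg) {
        register31 = cfg;  // store the parameters carried by instruction 22
    }

    void OutputNextFrame() const {
        // Would control the image sensor to output one frame according to
        // register31 (frame rate f0, image size S1 in the modeA example).
    }
};

int main() {
    CameraDriver driver;
    driver.WriteConfig({30.0, 4096, 3072});  // assumed values for modeA
    driver.OutputNextFrame();
    return 0;
}
```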
For example, under the control of the camera driver, the image sensor may acquire the optical signal from the camera and output raw image data 23 according to modeA.
In the example of fig. 2, the image sensor may directly transmit the raw image data 23 to an image signal processor (Image Signal Processor, ISP) for subsequent processing.
In other embodiments, the image sensor may send raw image data 23 to the camera driver and the camera driver may transmit raw image data 23 to the ISP for processing.
Correspondingly, the ISP may process the received raw image data 23 to obtain processed image data 24. The electronic device may display on a display screen based on the image data 24. For example, the image data 24 may be displayed to present a preview image 106 as shown in FIG. 1.
Thus, the image sensor outputs one frame of image based on modeA. Thereafter, the image sensor may continue to output frames at the frame rate f0. Correspondingly, the electronic device can continuously display multiple frames of images on the interface to form a preview stream.
In some cases, the electronic device may control the frame output mode of the image sensor to switch from modeA to modeB.
By way of example, the electronic device may decide to switch from modeA to modeB when any of the following occurs:
Switching to a different focal length magnification, for example, switching from 1× to 2×, that is, the electronic device receives an instruction input by the user to adjust the focal length;
Switching to a different shooting mode, for example, switching from a photographing mode to a video mode, that is, the electronic device receives an instruction input by the user to switch the shooting mode.
As an example, fig. 4 provides an interface interaction example of focus adjustment and shooting mode switching.
As shown in fig. 4, in the case where the user inputs an operation 401 indicating to adjust the focal length magnification to 2 x, the electronic device may determine to switch to a different focal length magnification according to the operation 401.
As shown in fig. 4, in the case where the user inputs operation 402 indicating to switch the shooting mode to the video mode, the electronic device may determine to switch to a different shooting mode according to operation 402.
In other implementations, the electronic device may decide to switch from modeA to modeB based on the current environment parameters.
The current environment parameters may include the 3A data of an image that has already been acquired. The 3A data refers to the auto exposure (AE) data, auto white balance (AWB) data, and auto focus (AF) data acquired when the ISP processes the original image data. For example, the electronic device may decide whether to switch from modeA to modeB based on the AE gain and the AE exposure duration.
As one example, upon receiving operation 401 or operation 402, the electronic device determines that the AE gain of the already acquired image is less than a preset gain threshold, and/or determines that the AE exposure duration of the already acquired image is less than a preset duration threshold, and determines that a switch from modeA to modeB is required.
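The decision described above can be summarized with a small sketch. The threshold values and the function name are assumptions introduced for illustration; the actual decision conditions may differ per implementation.

```cpp
// Sketch of the decision in the paragraph above: switch from modeA to modeB
// only when a qualifying user operation (e.g. operation 401 or 402) has been
// received and the 3A data of the acquired image meets the preset conditions.
// The threshold values are assumed, not taken from the embodiment.
struct AeData {
    double gain;              // AE gain of the already acquired image
    double exposure_time_ms;  // AE exposure duration of the already acquired image
};

bool ShouldSwitchToModeB(bool zoom_or_mode_switch_received, const AeData& ae) {
    const double kGainThreshold       = 4.0;   // assumed preset gain threshold
    const double kExposureThresholdMs = 20.0;  // assumed preset duration threshold

    if (!zoom_or_mode_switch_received) {
        return false;  // no triggering user operation
    }
    // Per the description, either condition (or both) may be used.
    return ae.gain < kGainThreshold || ae.exposure_time_ms < kExposureThresholdMs;
}
```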
In the above implementations, the electronic device switches from modeA to modeB based on a user input operation and the 3A data of the acquired image. In other implementations, the electronic device may also decide to switch from modeA to modeB on its own.
For example, the electronic device may determine to switch from modeA to modeB when entering an HDR scene is triggered.
The electronic device may enter the HDR scene under the instruction of the user, or the electronic device may automatically trigger entering the HDR scene upon determining that the currently captured image includes both a highlight environment and a dark-light environment.
Existing schemes include a number of different ways to implement the switch from modeA to modeB.
In some implementations, the camera abstraction can send a stop-stream indication to the camera driver. The stop-stream indication may be used to indicate that frame output based on the current mode (e.g., modeA) is to be stopped. Correspondingly, after receiving the stop-stream indication, the camera driver controls the image sensor to stop outputting frames.
Thereafter, the camera abstraction can send a start-stream indication to the camera driver. The start-stream indication may carry the configuration parameter B of modeB. Thus, the camera driver can control the image sensor to start outputting frames based on modeB according to the configuration parameter B.
In this scheme, after the camera driver receives the stop-stream indication and before it receives the start-stream indication, the image sensor stops outputting frames. This results in an interruption of the preview image display during the switching process.
In other implementations, the camera abstraction may send the configuration parameters of modeB directly to the camera driver. This scheme may also be referred to as a seamless switching scheme. In the seamless switching scheme, the camera driver may be configured to control the image sensor to output frames based on the most recently received configuration parameters (e.g., the configuration parameter B).
Thus, after receiving the configuration parameter B, the image sensor outputs frames according to the configuration parameter B, for example at the frame rate f1 indicated by the configuration parameter B.
It can be understood that when the frame output rate of the image sensor is greater than or equal to the display frame rate of the display screen of the electronic device, the preview stream displayed on the display screen will not stutter or be interrupted.
Moreover, as the display frame rate of the electronic device increases, the user becomes more sensitive to changes in the frame output rate of the image sensor.
In the seamless switching scheme, even if the image sensor switches directly from the configuration parameter A after receiving the configuration parameter B, problems such as stuttering of the preview stream can still be caused by the frame rate difference before and after the switch.
This is caused by the mechanism that controls how the image sensor outputs frames.
For example, refer to fig. 5 in conjunction with fig. 3. Before the configuration parameter B is received, the register 31 stores the validated configuration parameter A. Correspondingly, the image sensor outputs frames at the frame rate f0 indicated by the configuration parameter A.
Upon receiving the configuration parameter B, the image sensor stops outputting frames at the frame rate f0. For example, the camera driver corresponding to the image sensor may mark the configuration parameters in the register 31 as unavailable or delete them.
The camera driver may then write the configuration parameter B into the register 31. After the configuration parameter B is written into the register, the image sensor can output frames at the frame rate f1 indicated by the configuration parameter B.
Because each step performed before the new configuration parameters in the register 31 take effect, such as modifying, deleting, and writing data, takes a certain amount of time, the image sensor cannot output frames normally during the configuration adjustment process. In addition, for the image sensor, different frame output modes involve changes to its internal configuration. When the image sensor switches from modeA to modeB, a certain amount of time is required for the configuration corresponding to modeB to take effect before frames can be output according to the configuration parameters corresponding to modeB.
As a result, the preview stream may stutter or be interrupted.
The problems of the above-described process will be specifically described with reference to the frame output timing shown in fig. 6.
As shown in fig. 6, the configuration parameter A is in effect when the image sensor outputs the (N-1)th frame image and the preceding (N-2)th and (N-3)th frame images. The frame interval between the (N-3)th frame image and the (N-2)th frame image corresponds to the frame rate f0, and the frame interval between the (N-2)th frame image and the (N-1)th frame image also corresponds to the frame rate f0.
After the (N-1)th frame is output, the camera driver of the image sensor receives the configuration parameter B. The camera driver can then update the configuration parameter B into the register 31 according to the implementation shown in fig. 5. After the configuration parameter B is updated, the image sensor outputs the Nth frame image and the subsequent frame images according to the new configuration parameter B. For example, the frame interval between the Nth frame image and the (N+1)th frame image may correspond to the frame rate f1, and the frame interval between the (N+1)th frame image and the (N+2)th frame image may correspond to the frame rate f1.
After the (N-1)th frame image is output and before the Nth frame image is output, the configuration adjustment process takes place, so the frame rate corresponding to the interval between the (N-1)th frame image and the Nth frame image is significantly lower than f0 or f1. For example, the frame interval between the Nth frame image and the (N-1)th frame image corresponds to a frame rate f2.
Thus, as shown in fig. 6, even when the seamless switching scheme is in effect, an additional frame rate f2 occurs during the configuration adjustment. Correspondingly, the image displayed on the display screen jitters from the frame rate f0 to the frame rate f2 and then to the frame rate f1, which appears as stuttering of the preview stream.
Note that in this example f0 and f1 are different. In other implementations, f0 and f1 may also be the same. Even then, because the frame rate f2 occurs, a frame rate jump occurs during the switch from modeA to modeB, which likewise appears as stuttering of the preview stream.
In the technical scheme provided by the embodiments of the present application, when the configuration parameters need to be adjusted, the electronic device controls the image sensor to continue to output at least one frame of image according to the configuration parameters already in effect before the new configuration parameters take effect.
For example, referring to fig. 7, with the scheme provided in the embodiments of the present application, after the (N-1)th frame is output, the image sensor may continue to output the Nth frame image at the frame rate f0 even though the configuration parameter B has been received. Thus, the frame interval between the (N-1)th frame image and the Nth frame image is maintained at the frame rate f0. Then, after the configuration parameter B takes effect, the (N+1)th frame image is output directly at the new frame rate f1.
It can be seen that, with the scheme provided by the present application, the frame interval between the (N-1)th frame image and the Nth frame image is maintained at f0, and no jump to the frame rate f2 as shown in fig. 6 occurs. Therefore, the frame rate of the consecutive images only changes from f0 to f1, and the display stuttering that occurs in the case shown in fig. 6 is avoided.
The present application does not limit the magnitude relationship between f0 and f1. The scheme avoids the occurrence of the frame rate f2, so the frame rate jump caused by the frame rate f2 is resolved regardless of whether f0 is the same as f1.
The following describes the scheme provided by the embodiment of the present application in detail.
It should be noted that the electronic device in the embodiments of the present application may include at least one of a mobile phone, a foldable electronic device, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, or a smart city device. The embodiments of the present application do not limit the specific type of the electronic device.
Referring to fig. 8, a schematic diagram of software composition of an electronic device is provided in an embodiment of the present application.
In this example, the software system of the electronic device may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present application take the Android system with a layered architecture as an example to illustrate the software architecture of the electronic device.
As shown in fig. 8, the layered architecture divides the software into several layers, each with a clear role and division of work. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, which are, from top to bottom, an application layer, an application framework layer, the Android runtime (ART) and native C/C++ libraries, a hardware abstraction layer (Hardware Abstraction Layer, HAL), and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 8, the application package may include applications for cameras, calendars, maps, WLANs, music, text messages, calls, navigation, bluetooth, video, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 8, the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, an activity manager, an input manager, and the like.
The window manager provides a window management service (Window Manager Service, WMS). The WMS may be used for window management, window animation management, surface management, and as a transfer station for the input system.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
The activity manager may provide an activity management service (Activity Manager Service, AMS). The AMS may be used for starting, switching, and scheduling system components (e.g., activities, services, content providers, broadcast receivers), and for managing and scheduling application processes.
The input manager may provide an input management service (Input Manager Service, IMS). The IMS may be used to manage inputs to the system, such as touch screen inputs, key inputs, and sensor inputs. The IMS retrieves events from input device nodes and distributes the events to the appropriate windows through interaction with the WMS.
The Android runtime includes a core library and the Android runtime environment. The Android runtime is responsible for converting source code into machine code, mainly by employing ahead-of-time (AOT) compilation techniques and just-in-time (JIT) compilation techniques.
The core library is mainly used for providing the functions of basic Java class libraries, such as basic data structures, mathematics, IO, tools, databases, networks and the like. The core library provides an API for the user to develop the android application.
The native C/c++ library may include a plurality of functional modules. Such as surface manager (surface manager), media Framework (Media Framework), libc, openGL ES, SQLite, webkit, etc.
The surface manager is used for managing the display subsystem and providing fusion of 2D and 3D layers for a plurality of application programs. Media frames support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. OpenGL ES provides for drawing and manipulation of 2D graphics and 3D graphics in applications. SQLite provides a lightweight relational database for applications of electronic devices.
The hardware abstraction layer runs in a user space (user space), encapsulates the kernel layer driver, and provides a call interface to the upper layer. By way of example, the hardware abstraction layer may include a display module, an audio module, a camera module, a bluetooth module, and the like. In some embodiments, the camera module may also be referred to as a camera abstraction.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver and a Bluetooth driver. In combination with the foregoing description, in the present application, in the case where a plurality of camera modules are configured in the electronic device, each camera module may be correspondingly configured with a corresponding camera driver.
It should be noted that, the electronic device composition provided in fig. 8 is only an example, and does not limit the electronic device according to the embodiment of the present application. In other embodiments, the electronic device may also have other software components.
By way of example, referring to fig. 9, a schematic diagram of the composition of yet another electronic device according to an embodiment of the present application is provided.
In the example of fig. 9, the software system running in the processor may be divided into user space (userspace) and kernel space (kernel space). Corresponding to the software architecture shown in fig. 8, the user space may include the hardware abstraction layer shown in fig. 8. In some embodiments, the application layer, the application framework layer, and the native C/C++ libraries shown in fig. 8 may also be considered to be configured in the user space. The kernel space shown in fig. 9 may then correspond to the kernel layer shown in fig. 8.
As shown in fig. 9, the user space and kernel space of the electronic device may be configured in the form of software modules in the processor.
The user space may be configured with a camera application and a camera abstraction.
The camera abstraction may include a decision module and a sensor node.
The decision module may also be referred to as an AEC decision module.
The decision module may be used to obtain operation information of a user's current input operation from the camera application. For example, the operation information may include operation information of operation 103 shown in fig. 1, operation information of operation 401 shown in fig. 4, operation information of operation 402 shown in fig. 4, and the like.
The decision module may also be used to obtain 3A data of the last frame of image that has been obtained from the ISP. For example, the decision module may acquire data such as AE exposure time, AE gain, etc. of the last frame image that has been acquired from the IFE of the ISP.
The decision module may also be configured to decide, according to the acquired information, whether to switch between frame output modes. For example, the switching between frame output modes may include switching from modeA to modeB, or switching from modeB to modeA.
The sensor node may specifically include an enabling control module and an exposure control module.
The enabling control module may also be referred to as a reshutter control module. The enabling control module may be used to determine the target image sensor corresponding to the camera module that currently needs to be started. The enabling control module may also be used to determine whether the target image sensor supports the reshutter function. In the embodiments of the present application, the reshutter function may also be referred to as the first function.
In the embodiments of the present application, supporting the reshutter function means that the image sensor can be configured to continue to output at least one frame of image according to the existing frame output configuration after a new frame output configuration is received.
The enabling control module is also used to transmit relevant control information to the camera driver of the target image sensor. Illustratively, take the case where the target image sensor supports the reshutter function as an example. The control information may include a reshutter enable flag and the configuration parameter B of modeB.
As shown in fig. 9, an exposure control module may also be included in the sensor node.
In some embodiments of the present application, the exposure control module may be configured to control, when the frame output mode is switched, the frame length of the 1st frame image after the switch.
For example, the exposure control module may control the exposure duration of the 1st frame image after switching to modeB according to the configuration parameter B of modeB and the configuration parameter A of modeA. In this way, after the mode switch, the frame length of the 1st frame image output by the image sensor is the same as or similar to the frame length of the last frame image before the switch. In some implementations, the frame lengths of two frames may be considered similar when they differ by no more than 10%. This ensures that the frame rate of the output image does not change greatly before and after the frame output mode is switched.
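The "same as or similar to" criterion mentioned above can be expressed as a small sketch. The 10% bound follows the paragraph above; the function name and types are assumptions introduced for illustration.

```cpp
#include <cmath>
#include <cstdint>

// Sketch of the similarity check described above: two frame lengths are
// treated as similar when they differ by no more than 10% of the frame
// length before the switch.
bool FrameLengthsSimilar(uint32_t frame_length_before, uint32_t frame_length_after) {
    const double diff = std::fabs(static_cast<double>(frame_length_after) -
                                  static_cast<double>(frame_length_before));
    return diff <= 0.10 * frame_length_before;
}
```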
It should be noted that in some embodiments of the present application, the exposure control module may be omitted. In that case, the enabling control module alone can ensure that no frame rate jump occurs after the frame output mode is switched.
As shown in fig. 9, a camera driver and an ISP may be disposed in the kernel space.
The camera driver may be used for instruction transmission between the relevant modules in the user space and the image sensor. In the case where a plurality of camera modules are arranged in the electronic device, a plurality of camera drivers may be correspondingly arranged in the kernel space.
It should be noted that, in the embodiment of the present application, the image sensor and the camera may together form the camera module. The camera module may include one or more camera modules.
The number of image sensors (e.g., N) configured in the electronic device may be the same as or different from the number of cameras (e.g., M).
Correspondingly, the camera drivers may be configured in one-to-one correspondence with the image sensors, or the camera drivers may be configured in one-to-one correspondence with the cameras.
In the following description, a one-to-one correspondence between camera drivers and image sensors is taken as an example.
In the present application, as shown in fig. 10, at least three registers may be configured in the camera driver corresponding to an image sensor that supports the reshutter function, such as a GPH register (also referred to as a first register), a reshutter register (also referred to as a second register), and a configuration register (also referred to as a third register).
The GPH register is used to enable multiple groups of data corresponding to the same frame to be written together. In other words, the GPH register is used to enable the configuration of the reshutter register and the configuration register.
The reshutter register is used to enable the reshutter function. The configuration register is used to store the configuration parameters of the frame output modes. For example, the storage area A of the configuration register is used to store the validated configuration parameters (e.g., the configuration parameter A of modeA), and the storage area B of the configuration register is used to store the configuration parameters to be validated (e.g., the configuration parameter B of modeB).
The camera driver is also used to control the image sensor to output frames according to the configuration of the registers. For example, when the reshutter register is configured to 1 (or the first value), the camera driver may control the image sensor to output the next frame image according to the configuration parameters in the storage area A of the configuration register, and then control the image sensor to output the subsequent images according to the configuration parameters in the storage area B of the configuration register.
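The behaviour described for the three registers can be illustrated with a minimal sketch. The structure, field names, and values below are assumptions used only to illustrate the selection logic; the actual register layout is implementation specific.

```cpp
#include <optional>

// Illustrative frame output configuration (e.g. frame rate and image size).
struct SensorModeConfig { double frame_rate; int width; int height; };

// Sketch of the three registers described above and of how the camera driver
// chooses the configuration for the next frame. Names are assumptions.
struct CameraDriverRegisters {
    int gph = 0;        // first register: enables configuring the other two
    int reshutter = 0;  // second register: 1 (first value) enables reshutter
    std::optional<SensorModeConfig> area_a;  // configuration already in effect
    std::optional<SensorModeConfig> area_b;  // configuration to take effect
};

// Pick the configuration used for the next frame to be output.
std::optional<SensorModeConfig> ConfigForNextFrame(const CameraDriverRegisters& regs) {
    if (regs.reshutter == 1 && regs.area_a) {
        // reshutter enabled: the next frame still uses the validated
        // configuration in storage area A (e.g. configuration parameter A).
        return regs.area_a;
    }
    // Otherwise use the latest configuration (storage area B if present).
    return regs.area_b ? regs.area_b : regs.area_a;
}
```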
In some implementations, the ISP may include an image front end (Image Front End, IFE) and an image processing engine (Image Processing Engine, IPE).
The IFE is mainly responsible for preprocessing the image data acquired by the camera, such as color processing, denoising, enhancement, and sharpening. The output of the IFE is the preprocessed RAW image data. In the present application, 3A data such as the AE exposure duration and the AE gain can be obtained after the IFE processing. The IFE may also be used to send the 3A data to the decision module.
The IPE is the main processing unit in the ISP and is responsible for performing various arithmetic processes, such as white balance, color correction, and noise reduction, on the image data output by the IFE. The data output after the IPE processing can be used as the basis for the electronic device to send images for display.
The technical scheme provided by the embodiment of the application can be applied to the electronic equipment shown in fig. 8 or 9.
In the following examples, the implementation of the solution provided in the embodiment of the present application is described in detail with reference to the software composition shown in fig. 9.
Referring to fig. 11, a schematic diagram of interaction between modules is provided in an embodiment of the present application. In this implementation, the currently used camera module supports the reshutter function. Before the (N-1)th frame image is output, the electronic device has determined to output frames using modeA. Thus, the GPH register in the camera driver may be configured to 0, the reshutter register may be configured to 0, and the configuration parameter A may be stored in the configuration register.
Through the flow shown in fig. 11, the (N-1)th frame image can be output with the configuration parameter A (e.g., the frame rate f0, etc.) corresponding to modeA.
As shown in fig. 11, the scheme may include:
S1101, the camera driver sends a frame output instruction 111 to the image sensor.
The frame output instruction 111 may correspond to the configuration parameter A. For example, the frame output instruction 111 may include the frame rate f0 and the image size S1 corresponding to the configuration parameter A.
S1102, the image sensor transmits the original image data 112 to the IFE according to the frame output instruction 111.
The original image data 112 may correspond to the original image of the (N-1)th frame image.
In this example, the image sensor may process the optical signal from the camera according to the image size S1 carried by the frame output instruction 111, and output the original image data 112 corresponding to the image size S1. Further, the frequency at which the image sensor transmits the original image data 112 to the IFE may correspond to the frame rate f0.
S1103, IFE sends the reference data 113 to the decision module.
The reference data 113 may include the 3A data corresponding to the (N-1)th frame image. For example, the reference data 113 may include the AE exposure duration and/or the AE gain corresponding to the (N-1)th frame image.
Illustratively, the IFE may perform preliminary processing on the received raw image data 112, thereby obtaining the 3A data of the (N-1)th frame image.
In this example, the IFE may send the 3A data of the corresponding frame image, obtained after processing each frame, to the decision module, so that the decision module can use the 3A data of the acquired frame images to determine whether to trigger an adjustment of the frame output mode for subsequent frame images.
S1104, IFE sends image data 114 to IPE. The image data 114 may be data of the (N-1)th frame image after the IFE performs the preliminary processing.
S1105, IPE generates and outputs image data 115 from image data 114.
Thus, through S1104-S1105, the ISP processing of the (N-1)th frame image is completed, and the image data 115 corresponding to the (N-1)th frame image, which can be used for display, is obtained.
In this example, after the electronic device outputs the (N-1)th frame image and before it outputs the Nth frame image, the electronic device determines, according to the user's operation and the environment parameters, to switch from modeA to modeB for frame output.
Referring to fig. 12, a schematic diagram of still another interaction between modules according to an embodiment of the present application is provided. Based on the flow shown in fig. 12, after the camera driver of the image sensor receives the mode switching instruction, it continues to control the image sensor to output at least one frame of image according to the validated modeA. That is, the Nth frame image is output at the frame rate f0 indicated by modeA, without waiting for modeB to take effect before outputting at the frame rate f1 of modeB. This avoids a wait before the Nth frame image is output.
As shown in fig. 12, the scheme may include:
S1201, the camera application sends the operation information 121 to the decision module.
For example, the camera application may generate corresponding operation information after receiving the operation 401 or operation 402 of the user, and send the operation information to the decision module.
Take the case where the camera application receives the user's operation 401 as an example. In conjunction with the description of fig. 4, operation 401 is used to instruct adjusting the focal length magnification to 2×.
Correspondingly, the camera application may generate operation information 121 and send the operation information to the decision module according to operation 401.
The operation information 121 may include information indicating that the focal length magnification is adjusted from 1× to 2×.
S1202, the decision module sends the configuration parameter B and the camera identifier C1 to the enabling control module. In some embodiments, the configuration parameter B and the camera identifier C1 may be carried in first control information. In other embodiments, the first control information may include only the frame rate f1 in the configuration parameter B, or the first control information may include the frame rate f1 and the camera identifier C1.
The camera identifier C1 may be an identifier of the currently called camera.
In some embodiments, the decision module may obtain the camera identification C1 from the camera application. In other embodiments, the decision module may obtain the camera identity C1 from other modules (e.g., a camera management module in a framework layer of the electronic device).
In this example, the decision module may determine that the frame output mode needs to be switched from modeA to modeB according to the operation information 121 and the reference data 113 acquired in S1103 of fig. 11.
Illustratively, the decision module may determine to switch from modeA to modeB when the current operation indicated by the operation information 121 includes adjusting the focal length magnification from 1× to 2×, and the 3A data corresponding to the previous frame image satisfies a preset condition.
The preset condition may include that an AE gain of a previous frame image is smaller than a preset gain threshold, and/or that an AE exposure duration of the previous frame image is smaller than a preset duration threshold.
Taking the case that the reference data 113 satisfies the preset condition as an example.
In this case, the decision module can determine that it is necessary to switch from modeA to modeB, and control the image sensor to output frames according to modeB.
It should be noted that the determination condition used by the decision module in this example to decide to switch from modeA to modeB is merely an example. In other implementations, the decision module may also determine that a switch from modeA to modeB is required based on other mechanisms. The embodiments of the present application are not limited in this regard.
In this way, when the decision module determines that it is necessary to switch from modeA to modeB for frame output, it may send the configuration parameter B of modeB and the currently used camera identifier C1 to the enabling control module.
S1203, the enabling control module sends the configuration parameter B and the reshutter enable flag to the camera driver.
For example, when the enabling control module receives the configuration parameter B and the camera identifier C1, it can determine that the frame output mode needs to be switched from modeA to modeB.
In the present application, the enabling control module can judge, according to the camera identifier C1, whether the image sensor corresponding to the currently used camera supports the reshutter function.
In some implementations, a reshutter list may be configured in the enabling control module. This reshutter list may also be referred to as a first function list.
The reshutter list may include at least one camera identification.
Take the case where the reshutter list includes the camera identifier C1 as an example. This indicates that the image sensor used by the camera corresponding to the camera identifier C1 supports the reshutter function. That is, under the control of the camera driver, the image sensor used by the camera corresponding to the camera identifier C1 can continue to output at least one frame according to the configuration parameter A.
Thus, the enabling control module may query the reshutter list for the camera identifier C1. If the camera identifier C1 is included in the reshutter list, it is determined that the currently used image sensor supports the reshutter function.
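A minimal sketch of the lookup described above is given below; the container type, function name, and identifier string are assumptions introduced for illustration.

```cpp
#include <string>
#include <unordered_set>

// Sketch of the reshutter list (first function list) query described above.
// The list holds camera identifiers whose image sensors support reshutter.
bool SupportsReshutter(const std::unordered_set<std::string>& reshutter_list,
                       const std::string& camera_id) {
    return reshutter_list.count(camera_id) > 0;
}

// Usage sketch: the enabling control module queries the list with the camera
// identifier C1 received from the decision module, e.g.
//   bool enable = SupportsReshutter(reshutter_list, "C1");
```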
It will be appreciated that during the power-on process of the electronic device, all hardware functions are traversed. Thus, in other embodiments of the present application, the reshutter list may be generated and stored after the electronic device completes the functional traversal of all cameras and the corresponding image sensors.
In other embodiments of the present application, the reshutter list may also be configured and stored when the electronic device leaves the factory.
In other embodiments of the present application, the reshutter list may also be flexibly adjusted according to the actual situation.
Thus, when the enabling control module determines that the currently used image sensor supports the reshutter function, it can send the reshutter enable flag to the camera driver, so that the camera driver controls the image sensor to continue to output at least one frame of image according to the configuration parameter A. In the following examples, the case where the camera driver controls the image sensor to continue to output one frame of image according to the configuration parameter A is taken as an example.
In addition, the enabling control module can also send the new configuration parameter B to the camera driver.
As shown in fig. 12, the camera driver may configure a corresponding register according to the received reshutter enable flag.
Illustratively, upon receiving the reshutter enable flag, the camera driver may set the GPH register to 1, set the reshutter register to 1, and store the configuration parameter B in the configuration register.
Setting the GPH register to 1 indicates that parameters need to be configured for the reshutter register and the configuration register. Setting the reshutter register to 1 indicates that the next frame image to be output, i.e., the Nth frame image, continues to be output according to the existing configuration. The existing configuration, such as the configuration parameter A, may be stored in the storage area A of the configuration register.
The configuration parameter B may be stored in the storage area B of the configuration register.
In some implementations, after completing the data storage of the reshutter register and the configuration register, the camera driver may reset the GPH register to 0.
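The register update sequence described in the preceding paragraphs can be sketched as follows, reusing the illustrative CameraDriverRegisters and SensorModeConfig types from the earlier sketch; the ordering follows the description above, and the helper name is an assumption.

```cpp
// Sketch of the register updates performed by the camera driver when it
// receives the reshutter enable flag together with configuration parameter B.
// CameraDriverRegisters / SensorModeConfig are the illustrative types from
// the earlier register sketch.
void OnReshutterEnable(CameraDriverRegisters& regs, const SensorModeConfig& config_b) {
    regs.gph = 1;            // allow the reshutter and configuration registers to be written
    regs.reshutter = 1;      // next frame keeps using the existing configuration
    regs.area_b = config_b;  // configuration parameter B waits in storage area B
    regs.gph = 0;            // configuration done: reset GPH to 0
}
```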
It should be noted that in other embodiments of the present application, if the enabling control module determines, according to the camera identifier C1, that the corresponding image sensor does not support the reshutter function, it may not generate the reshutter enable flag and may correspondingly send only the configuration parameter B to the camera driver. The camera driver then controls the image sensor to output frames according to the existing scheme shown in fig. 3 or fig. 5.
S1204, the camera driver sends a frame output instruction 122 to the image sensor.
Illustratively, based on the reshutter register being configured to 1, the camera driver may output the next frame image (e.g., the Nth frame image) according to the existing configuration parameter A, without waiting for the new configuration parameter B to take effect.
As a possible implementation, after the GPH register has been set back to 0 (i.e., the register configuration is completed), the camera driver may read the existing configuration parameter A from the storage area A of the configuration register based on the reshutter register being configured to 1.
The camera driver may generate the frame output instruction 122 according to the configuration parameter A, instructing the image sensor to output frames at the frame rate f0 and with the image size S1.
Correspondingly, the image sensor may generate and output the Nth frame image. The Nth frame image may be output at the frame rate f0, and the size of the Nth frame image may correspond to the image size S1.
S1205, the image sensor sends the raw image data 123 to the IFE.
S1206, IFE sends reference data 124 to the decision module.
Similar to S1103 in fig. 11, the IFE may send the 3A data of the Nth frame image (e.g., the AE exposure duration and/or the AE gain of the Nth frame image) to the decision module, so that the decision module determines the frame output mode of subsequent frame images accordingly.
S1207, IFE transmits image data 125 to IPE. The image data 125 may be image data obtained after the IFE processes the original image data 123.
S1208, IPE generates and outputs image data 126 from image data 125. The image data 126 may be used for displaying the Nth frame image.
In this way, even when the camera driver of the image sensor receives the frame output mode adjustment instruction before the Nth frame image is output, the Nth frame image can still be output and displayed according to the existing modeA, achieving the effect shown in fig. 7.
It will be appreciated that in some embodiments of the present application, after receiving the configuration parameter B (i.e., the new frame output mode) from the decision module, the enabling control module may, through logic similar to S1203-S1208, control the continued output of the Nth frame image to the (N+x)th frame image according to the existing modeA. In this way, it is ensured that frames can be output according to modeB immediately after the modeB configuration takes effect. The period in which the frame rate f2 would appear, as in fig. 6, is filled by continuing to output x frames of images according to modeA, thereby avoiding a jump to the frame rate f2 during the switch from f0 to f1.
In different implementations, x may be flexibly configured to be 1 or a positive integer greater than 1. In the present application, the case where x is configured as 1 is taken as an example. Thus, after the Nth frame image is output, the configuration parameter B has taken effect. That is, after the Nth frame image is output, the image sensor may immediately output frames at the frame rate f1 corresponding to the configuration parameter B under the control of the camera driver.
Therefore, after the Nth frame image is output, the electronic device can control the image sensor through the camera driver to output subsequent frame images (such as the (N+1)th frame image) according to the new configuration parameter B.
Illustratively, in some embodiments, this process may be implemented by configuring registers in the camera driver.
For example, referring to fig. 13, after the Nth frame image is output (e.g., after the camera driver sends the frame output instruction 122 to the image sensor), the camera driver may configure the GPH register to 0, configure the reshutter register to 0, and delete the configuration parameter A in the configuration register. Alternatively, the camera driver may store the configuration parameter B into the storage area A.
Thus, according to the modified register state, the camera driver can control the image sensor, in a scheme similar to that shown in fig. 11, to output the subsequent images (such as the (N+1)th frame image) according to the configuration parameter B.
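A sketch of the register state change after the Nth frame has been output is given below, again using the illustrative types from the earlier sketches; whether storage area A is cleared or overwritten with the configuration parameter B is an implementation choice, as noted above.

```cpp
// Sketch: after the Nth frame is output, the driver lets configuration
// parameter B take effect so that subsequent frames are output in modeB.
void OnNthFrameDone(CameraDriverRegisters& regs) {
    regs.gph = 0;
    regs.reshutter = 0;
    if (regs.area_b) {
        regs.area_a = regs.area_b;  // alternatively: delete A and keep reading from B
        regs.area_b.reset();
    }
}
```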
By way of example, referring to fig. 14, a schematic diagram of still another interaction between modules is provided in an embodiment of the present application. Based on the flow shown in fig. 14, the camera driver can control the image sensor to output frames based on modeB according to the newly validated configuration parameter B.
As shown in fig. 14, the scheme may include:
S1401, the camera driver sends a frame output instruction 141 to the image sensor.
The frame output instruction 141 may correspond to the configuration parameter B. For example, the frame output instruction 141 may include the frame rate f1 and the image size S2 corresponding to the configuration parameter B.
S1402, the image sensor sends the original image data 142 to the IFE according to the frame output instruction 141. The original image data 142 may correspond to the original image of the (N+1)th frame image.
S1403, IFE sends the reference data 143 to the decision module. The reference data 143 may include the 3A data corresponding to the (N+1)th frame image. For example, the reference data 143 may include the AE exposure duration and/or the AE gain corresponding to the (N+1)th frame image.
S1404, IFE sends image data 144 to IPE. The image data 144 may be data of the (N+1)th frame image after the IFE performs the preliminary processing.
S1405, IPE generates image data 145 from image data 144 and outputs the same.
In this way, the (N+1)th frame image is output according to the new configuration parameters of modeB.
It should be noted that in other embodiments of the present application, the electronic device may further control, through the exposure control module configured in the sensor node, the exposure time of the first frame image output after the new configuration parameters take effect. In this way, the frame length of the first frame image after the new configuration parameters take effect is limited, which ensures that the frame rate displayed on the interface does not change excessively before and after the frame output mode is switched.
By way of example, referring to fig. 15 in conjunction with fig. 12, a schematic diagram of still another interaction between modules according to an embodiment of the present application is provided.
In the example of fig. 15, the electronic device outputs the Nth frame image at the frame rate f0 of modeA according to S1201-S1208, and may also configure the exposure time of the (N+1)th frame image in the camera driver through S1501-S1502.
For example, as shown in fig. 15, after the enabling control module receives the configuration parameter B and the camera identifier C1 in S1202, it may determine, according to the camera identifier C1 and following the logic shown in fig. 12, that the image sensor supports the reshutter function.
In addition to executing S1203, the enabling control module may also execute S1501 below.
S1501, the enabling control module sends the reshutter enable flag to the exposure control module.
For example, by sending the reshutter enable flag to the exposure control module, the enabling control module may trigger the exposure control module to control the exposure of the 1st frame image after the mode switch (e.g., the (N+1)th frame image).
In this example, the exposure control module may determine the exposure limiting parameter of the (N+1)th frame image upon receiving the reshutter enable flag.
As an example, the exposure control module may determine the exposure limiting parameter of the 1st frame image after modeB takes effect (e.g., the (N+1)th frame image) based on the validated configuration parameters of modeA and the to-be-validated configuration parameters of modeB.
For example, the exposure control module may determine the exposure limiting parameter according to the following equation (1).
Formula (1): Ts = (FLL(A) - (M + Y_ADD_END(A) - Y_ADD_STA(A) + 65)/N + 96) × LLP(A)/LLP(B) - 96
In formula (1), Ts is the exposure limiting parameter, FLL(A) is the frame length of modeA, Y_ADD_END(A) is the end position of the image height coordinates in modeA, Y_ADD_STA(A) is the start position of the image height coordinates in modeA, LLP(A) is the line length of modeA, and LLP(B) is the line length of modeB. M and N are both fixed coefficients. For example, when modeA is full-size (fullsize), M=4 and N=1. As another example, when modeA is not full-size, M=8 and N=2.
Thus, by controlling the exposure time of the (N+1)th frame image to be smaller than the exposure limiting parameter, it can be ensured that the frame length of the first frame after switching to modeB is the same as or similar to the frame length in modeA.
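Formula (1) can be written as a short sketch. The struct fields mirror the parameters described above; the function name and any input values are assumptions introduced for illustration.

```cpp
// Sketch of formula (1) for the exposure limiting parameter Ts. The fields
// mirror the parameters described above; callers supply the modeA and modeB
// parameters (assumed to be known from the two configurations).
struct ModeParams {
    double fll;        // frame length FLL
    double y_add_end;  // end position of the image height coordinates
    double y_add_sta;  // start position of the image height coordinates
    double llp;        // line length LLP
};

double ExposureLimit(const ModeParams& a, const ModeParams& b, bool a_is_fullsize) {
    const double m = a_is_fullsize ? 4.0 : 8.0;  // fixed coefficient M
    const double n = a_is_fullsize ? 1.0 : 2.0;  // fixed coefficient N
    // Ts = (FLL(A) - (M + Y_ADD_END(A) - Y_ADD_STA(A) + 65)/N + 96) * LLP(A)/LLP(B) - 96
    return (a.fll - (m + a.y_add_end - a.y_add_sta + 65.0) / n + 96.0)
           * a.llp / b.llp - 96.0;
}
```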
For example, exposure control of the (N+1)th frame image according to the exposure limiting parameter may be implemented through S1502 and the related logic below.
S1502, the exposure control module sends exposure limiting parameters to the camera driver.
The camera driver may be a camera driver corresponding to the currently used camera identifier C1. In some implementations, the exposure control module may obtain the camera identification C1 (not shown in fig. 15) from the enabling control module.
In this way, the camera driving may end to the configuration parameter B, reshutter enabling flag according to S1203, and may also receive the exposure restriction parameter according to S1502.
In connection with the description in FIG. 12 of how the camera driver updates the registers, in the example of FIG. 15 the camera driver may update the GPH register, the reshutter register, and the configuration register according to the scheme shown in FIG. 12.
Further, in the example of FIG. 15, the camera driver may also store the acquired exposure limiting parameter in the configuration register.
For example, the camera driver may store the exposure limiting parameter together with the configuration parameter B in storage area B, so that the exposure limiting parameter and the configuration parameter B take effect together when the (N+1)th frame image is output.
Thus, when the camera driver operates according to the scheme shown in FIG. 14 and sends the map command 141 to the image sensor, the exposure limiting parameter Ts may be sent to the image sensor in addition to the frame rate f1 and the image size S2 corresponding to the configuration parameter B. Accordingly, when the image sensor outputs the (N+1)th frame image, it not only processes the image according to the configuration parameter B, but also keeps the exposure time from exceeding the exposure limiting parameter Ts.
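As a purely illustrative sketch of how the exposure limiting parameter could be carried alongside configuration parameter B and enforced when the frame is output, consider the following Python fragment; the class, field names, and clamping step are assumptions for illustration, not the actual register layout or driver interface of any real sensor.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FrameConfig:
    """Contents of one configuration storage area (e.g., storage area B)."""
    frame_rate: float                           # e.g., f1 for configuration parameter B
    image_size: Tuple[int, int]                 # e.g., S2 as (width, height)
    exposure_limit_ts: Optional[float] = None   # exposure limiting parameter Ts, if any

def clamp_exposure(requested_exposure: float, cfg: FrameConfig) -> float:
    """Exposure time actually applied to the first frame output under cfg.

    If an exposure limiting parameter was stored together with the configuration,
    the exposure time is clamped so that it does not exceed Ts.
    """
    if cfg.exposure_limit_ts is not None:
        return min(requested_exposure, cfg.exposure_limit_ts)
    return requested_exposure
```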
Therefore, through the cooperation of the enabling control module and the exposure control module, the Nth frame image can be output at the frame rate f0, and the exposure time of the (N+1)th frame image can be controlled to be smaller than or equal to the exposure limiting parameter Ts, so that the frame length of the (N+1)th frame image is equal to or close to the frame length of the Nth frame image.
In each of the embodiments of the present application, switching the drawing mode from modeA to modeB is taken as an example. In other embodiments of the present application, if the current drawing mode is modeB and the electronic device (e.g., a decision module configured in the electronic device) determines that modeB needs to be switched to modeA, the technical solution provided in any one of FIG. 11 to FIG. 15 may also be used, so that after receiving the mode switching instruction, the camera driver continues to control the image sensor to output at least one frame image according to modeB. Alternatively, by implementing the scheme provided in FIG. 15, the exposure control module may determine the corresponding exposure limiting parameter according to formula (1) above (e.g., by interchanging the modeA-related parameters and the modeB-related parameters in formula (1)). Based on the exposure limiting parameter, the exposure time of the first frame image after switching to modeA is controlled so that its frame length is the same as or similar to the frame length in modeB. For the specific implementation in this scenario, reference may be made to the detailed descriptions in the foregoing embodiments, which are not repeated herein.
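Continuing the hypothetical sketch given after formula (1), switching in the opposite direction only interchanges which mode supplies the "in effect" and "to take effect" arguments; the numeric register values below are invented purely for illustration.

```python
# Invented example values; real values come from the sensor's modeA/modeB mode tables.
mode_a = {"fll": 3000, "y_add_end": 2999, "y_add_sta": 0, "llp": 4000}
mode_b = {"fll": 1500, "y_add_end": 1499, "y_add_sta": 0, "llp": 8000}

# modeA -> modeB: modeA is the mode currently in effect.
ts_a_to_b = exposure_limit_ts(mode_a["fll"], mode_a["y_add_end"], mode_a["y_add_sta"],
                              mode_a["llp"], mode_b["llp"], a_is_fullsize=True)

# modeB -> modeA: interchange the roles, i.e., feed the in-effect modeB parameters and
# the to-be-effective modeA line length into formula (1).
ts_b_to_a = exposure_limit_ts(mode_b["fll"], mode_b["y_add_end"], mode_b["y_add_sta"],
                              mode_b["llp"], mode_a["llp"], a_is_fullsize=False)
```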
The above description mainly describes the scheme provided by the embodiment of the application from the perspective of each functional module. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The integrated modules may be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the modules in the embodiments of the present application is schematic and is merely a logical function division; other division manners may be used in actual implementation.
By way of example, fig. 16 illustrates a schematic diagram of the composition of an electronic device 1600. As shown in fig. 16, the electronic device 1600 may include a processor 1601 and a memory 1602. The memory 1602 is used to store computer-executable instructions. For example, in some embodiments, the processor 1601, when executing instructions stored in the memory 1602, can cause the electronic device 1600 to perform a method as shown in any of the above embodiments.
As shown in FIG. 16, the electronic device 1600 may further include an image sensor 1603. When the electronic device 1600 performs the method provided in the foregoing embodiments, the image sensor 1603 may be controlled to output images accordingly in line with the scheme provided in the embodiments of the present application.
It should be noted that, for all relevant contents of the steps involved in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, which are not repeated herein.
FIG. 17 shows a schematic diagram of the composition of a chip system 1700. The chip system 1700 may include a processor 1701 and a communication interface 1702 to support a related device in implementing the functions referred to in the above embodiments. In one possible design, the chip system further includes a memory for storing the program instructions and data necessary for the electronic device. The chip system may be formed of chips, or may include a chip and other discrete devices. It should be noted that, in some implementations of the present application, the communication interface 1702 may also be referred to as an interface circuit.
It should be noted that, for all relevant contents of the steps involved in the above method embodiments, reference may likewise be made to the functional descriptions of the corresponding functional modules, which are not repeated herein.
The functions, acts, operations, steps, and the like in the embodiments described above may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), etc.
Although the application has been described in connection with specific features and embodiments thereof, it will be apparent that various modifications and combinations can be made without departing from the spirit and scope of the application. Accordingly, the specification and drawings are merely exemplary illustrations of the present application as defined in the appended claims and are considered to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the application. It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.