
A method for controlling image output, electronic device and chip system

Info

Publication number
CN118488310B
Authority
CN
China
Prior art keywords
image
frame rate
image sensor
frame
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311583534.7A
Other languages
Chinese (zh)
Other versions
CN118488310A (en)
Inventor
张亮
李海
胡凯强
眭新雨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202411985401.7A (CN120034733A)
Priority to CN202311583534.7A (CN118488310B)
Publication of CN118488310A
Application granted
Publication of CN118488310B
Legal status: Active (current)
Anticipated expiration

Abstract


The embodiments of the present application provide a method for controlling image output, an electronic device, and a chip system, relating to the technical field of electronic devices. After new image output mode configuration parameters are generated, the method can continue outputting at least one frame under the previously configured parameters, avoiding frame rate jumps while the output configuration is being adjusted. The method may include: generating first control information, the first control information including a first frame rate; and controlling the first image sensor to output a first image. The time interval between outputting the first image and outputting a second image is a first duration. The first duration corresponds to a second frame rate, and the second image is the frame preceding the first image.

Description

Method for controlling image output, electronic device and chip system
Technical Field
The embodiments of the application relate to the technical field of electronic devices, and in particular to a method for controlling image output, an electronic device and a chip system.
Background
The electronic device may control the image sensor to output images at different frame rates. When the frame rate needs to be adjusted, the image sensor must wait for the new frame rate configuration to take effect before completing the switch. As a result, after the image sensor receives the new frame rate configuration, image output stops until the new configuration takes effect. The frame rate therefore jumps several times around the adjustment, degrading the display effect.
Disclosure of Invention
The application provides a method for controlling image output, an electronic device and a chip system, which can continue outputting at least one frame under the existing configuration after new output mode configuration parameters are generated, avoiding frame rate jumps during the output configuration adjustment.
In order to achieve the technical purpose, the application adopts the following technical scheme:
In a first aspect, a method for controlling image output is provided. The method is applied to an electronic device in which a first image sensor is configured, where the first image sensor corresponds to at least one camera configured in the electronic device. The method includes generating first control information, the first control information including a first frame rate; and controlling the first image sensor to output a first image. The time interval between outputting the first image and outputting a second image is a first duration. The first duration corresponds to a second frame rate, and the second image is the frame preceding the first image.
Thus, after generating the new configuration parameters (e.g., control information including the first frame rate), the electronic device may continue to control the first image sensor to output the next frame (e.g., the first image) in accordance with the existing configuration (e.g., including the second frame rate). This avoids the frame rate jump caused by waiting for new configuration parameters to take effect after they are received.
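To make the claimed timing concrete, the following is a minimal C++ sketch of this behavior. The names (ControlInfo, SensorController) and structure are illustrative assumptions, not the patent's implementation:

```cpp
#include <cstdint>

struct ControlInfo {
    uint32_t frameRateFps;  // e.g. the "first frame rate" f1
};

class SensorController {
public:
    explicit SensorController(uint32_t configuredFps) : activeFps_(configuredFps) {}

    // Generating new control information does NOT take effect immediately.
    void onControlInfo(const ControlInfo& info) { pendingFps_ = info.frameRateFps; }

    // The next frame is still paced by the previously configured frame rate,
    // so its interval to the previous frame corresponds to the second frame rate.
    uint32_t outputNextFrameFps() {
        uint32_t fpsUsed = activeFps_;
        if (pendingFps_ != 0) {  // promote the pending rate only afterwards
            activeFps_ = pendingFps_;
            pendingFps_ = 0;
        }
        return fpsUsed;
    }

private:
    uint32_t activeFps_;
    uint32_t pendingFps_ = 0;
};
```

With the configured rate set to the second frame rate f0, calling onControlInfo({f1}) and then outputNextFrameFps() paces the next frame (the first image) at f0 and only then promotes f1, matching the behavior described above.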
In some possible designs, the method further includes controlling the first image sensor to output the second image at the second frame rate before generating the first control information. This clarifies that, before the first control information is generated, the configured parameters of the electronic device include the second frame rate; that is, images are output at the second frame rate.
In some possible designs, the first control information further includes a first camera identifier. The first camera identifier identifies the first camera currently in use; the first camera is included in the at least one camera and corresponds to the first image sensor. Before the first image sensor is controlled to output the first image, the method further includes determining, according to the first camera identifier, that the first image sensor corresponding to the first camera supports a first function, where the first function corresponds to continuing to output at least one frame at the configured frame rate after receiving a new frame rate configuration. In this way, the scheme provided by the application is enabled only when the first image sensor can continue output under the existing configuration after receiving the new frame rate configuration, so the scheme can be executed effectively.
In some possible designs, the new frame rate configuration includes the first frame rate and the configured frame rate includes the second frame rate.
In some possible designs, the electronic device is configured with a first function list that includes at least one camera identifier, where the image sensor corresponding to each camera identifier in the first function list supports the first function. Determining, according to the first camera identifier, that the first image sensor corresponding to the first camera supports the first function includes determining that the first image sensor supports the first function based on the first camera identifier being included in the first function list. This provides a mechanism for judging whether an image sensor supports the first function. In some embodiments, the first function list may be actively generated by the electronic device, e.g., generated during hardware verification at power-on. In other embodiments, the first function list may be preconfigured.
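A minimal sketch of this list lookup, assuming a hard-coded list and a hypothetical helper name:

```cpp
#include <string>
#include <unordered_set>

// Camera identifiers whose image sensors support the first function
// (reshutter). In some embodiments this list could be generated during
// power-on hardware verification; here it is simply hard-coded.
static const std::unordered_set<std::string> kFirstFunctionList = {"C1", "C3"};

bool supportsFirstFunction(const std::string& cameraId) {
    // The first camera supports the first function iff its identifier
    // appears in the first function list.
    return kFirstFunctionList.count(cameraId) != 0;
}
```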
In some possible designs, the electronic device is configured with a first register, a second register, and a third register for the first image sensor. After determining that the first image sensor supports the first function, the method further includes configuring the first register to a first value indicating that configuration of the second register and the third register is enabled; configuring the second register to a first value indicating that the first function is enabled; and storing the first frame rate in the third register.
In some possible designs, the third register includes a first storage area and a second storage area. Storing the first frame rate in the third register includes storing the first frame rate in the second storage area. Before the first frame rate is stored in the second storage area, the first storage area stores the second frame rate.
In some possible designs, the method further includes configuring the first register to a second value indicating that configuration of the second register and the third register is disabled.
In some possible designs, controlling the first image sensor to output the first image includes controlling, with the second register at the first value, the first image sensor to output the first image at the second frame rate stored in the first storage area.
In some possible designs, after controlling the first image sensor to output the first image, the method further includes overwriting the second frame rate in the first storage area with the first frame rate.
Thus, through the configuration of the three registers, the first function of the first image sensor is enabled, and image output continues under the existing configuration after the new configuration is received.
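The register sequence described above can be sketched as follows; the register addresses and the writeReg() helper are hypothetical, chosen only to illustrate the grouped-write order (first register on, second register enabled, first frame rate staged, first register off):

```cpp
#include <cstdint>
#include <cstdio>

// Assumed low-level register write, stubbed here; a real camera driver would
// perform an I2C/SPI transaction instead.
static void writeReg(uint32_t reg, uint32_t value) {
    std::printf("reg 0x%04X <- %u\n", reg, value);
}

// Hypothetical addresses for the first (GPH), second (reshutter), and the
// second storage area of the third (configuration) register.
enum : uint32_t { REG_GPH = 0x0104, REG_RESHUTTER = 0x0105, REG_CFG_AREA_B = 0x0340 };

void applyNewFrameRate(uint32_t firstFrameRate) {
    writeReg(REG_GPH, 1);                      // first value: enable configuring
                                               // the second and third registers
    writeReg(REG_RESHUTTER, 1);                // first value: enable the first function
    writeReg(REG_CFG_AREA_B, firstFrameRate);  // stage f1 in storage area B;
                                               // storage area A still holds f0
    writeReg(REG_GPH, 0);                      // second value: disable configuration
}
```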
In some possible designs, the method further includes controlling the first image sensor to output a third image after the first image sensor is controlled to output the first image. The time interval between outputting the third image and outputting the first image is a second duration. The second duration corresponds to the first frame rate, and the third image is the frame following the first image. This example continues output of one frame under the existing configuration after the new configuration is received; in other implementations, the electronic device may output multiple frames under the existing configuration before the new configuration takes effect.
In some possible designs, a first frame length is the same as or similar to a second frame length, where the first frame length is the frame length of the third image and the second frame length is the frame length of the first image or the second image. The first frame length being similar to the second frame length means that the two frame lengths differ by no more than 10%.
In some possible designs, controlling the first image sensor to output the third image includes controlling the exposure time of the first image sensor to be less than or equal to an exposure limit parameter while the first image sensor outputs the third image, such that the first frame length is the same as or similar to the second frame length.
Thus, by controlling the exposure time of the first frame image after the new configuration takes effect, the frame lengths of the images before and after the new configuration takes effect remain consistent, avoiding an image jump.
In some possible designs, the method further includes determining the exposure limit parameter based on a first configuration parameter and a second configuration parameter before controlling the first image sensor to output the third image. The first configuration parameter corresponds to the first frame rate, and the second configuration parameter corresponds to the second frame rate.
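The patent does not give the formula for the exposure limit parameter, so the following sketch assumes a typical rolling-shutter constraint, where a frame's length in lines must be at least the exposure time in lines plus a sensor-specific margin; all names are illustrative:

```cpp
#include <cstdint>

struct ModeConfig {
    uint32_t frameLengthLines;     // frame length (VTS) of the mode, in lines
    uint32_t exposureMarginLines;  // assumed sensor-specific blanking margin
};

// Cap the exposure so the first post-switch frame keeps the pre-switch frame
// length: exposure limit = previous frame length minus the new mode's margin.
uint32_t exposureLimitLines(const ModeConfig& newCfg,   // first configuration parameter
                            const ModeConfig& oldCfg) { // second configuration parameter
    uint32_t targetVts = oldCfg.frameLengthLines;       // match the last pre-switch frame
    return targetVts > newCfg.exposureMarginLines
               ? targetVts - newCfg.exposureMarginLines
               : 0;
}
```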
In some possible designs, the method further includes, before generating the first control information, acquiring image parameters of a fourth image. The image parameters include an automatic exposure (AE) exposure duration and/or an AE exposure gain of the fourth image. Whether to generate the first control information is then determined according to the image parameters of the fourth image.
In some possible designs, the determining to generate the first control information based on the image parameters of the fourth image includes determining to generate the first control information based on the image parameters of the fourth image if the first operation information or the second operation information is received. The first operation information is generated by the electronic device according to the received first operation, and the first operation is used for indicating the electronic device to adjust the focal length. The second operation information is generated by the electronic device according to the received second operation, and the second operation is used for indicating the electronic device to adjust the shooting mode.
In some possible designs, the first operation is specifically to instruct the electronic device to adjust the focal length magnification from 1× to 2×.
This provides a mechanism for the electronic device to decide to adjust the image output mode (e.g., including the frame rate).
In some possible designs, the first frame rate corresponds to a frame rate in InsensorZoom (in-sensor zoom) mode, and the second frame rate corresponds to a frame rate in binning mode.
In a second aspect, an electronic device is provided that includes a memory, one or more processors, and at least one image sensor, the memory, processor, and image sensor being coupled to one another. The memory is used to store computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to control image output by the image sensor in accordance with the method provided in the first aspect and any one of its possible designs.
In a third aspect, the present application also provides a chip system for use in an electronic device. The chip system may include one or more interface circuits and one or more processors, interconnected by lines. The interface circuit is configured to receive signals from a memory of the electronic device and send them to the processor, the signals comprising the computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device performs the technical solutions provided in the first aspect and any one of its possible implementations.
In a fourth aspect, the present application also provides a computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the technical solution provided in the first aspect and any one of its possible implementations.
In a fifth aspect, the application also provides a computer program product which, when run on a computer, causes the computer to carry out the solution provided in the first aspect and any one of its possible implementations.
It will be appreciated that the solutions provided by the second aspect to the fifth aspect of the present application may correspond to the first aspect and any possible design thereof, so that the advantages achieved are similar, and are not repeated here.
Drawings
FIG. 1 is a schematic diagram of an interface interaction;
FIG. 2 is a schematic diagram illustrating interactions between internal modules of an electronic device;
FIG. 3 is a schematic diagram of a register configuration inside a camera driver;
FIG. 4 is a schematic diagram illustrating an interface interaction between focal length adjustment and shooting mode switching;
FIG. 5 is a schematic diagram of register parameter adjustment;
FIG. 6 is a schematic diagram of a frame image;
FIG. 7 is a schematic diagram of a frame image after the scheme provided by the embodiment of the application is validated;
FIG. 8 is a schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a register configuration according to an embodiment of the present application;
FIG. 11 is a schematic flow chart of interaction between modules according to an embodiment of the present application;
FIG. 12 is a schematic flow chart of interaction between modules according to an embodiment of the present application;
FIG. 13 is a schematic diagram of a register configuration according to an embodiment of the present application;
FIG. 14 is a schematic flow chart of interaction between modules according to an embodiment of the present application;
FIG. 15 is a schematic flow chart of interaction between modules according to an embodiment of the present application;
FIG. 16 is a schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 17 is a schematic diagram of a chip system according to an embodiment of the present application.
Detailed Description
The terms "first" and "second" are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
Currently, an electronic device may provide a photographing function to a user through its configured camera module(s).
By way of example, a camera module may include a camera and an image sensor. Where multiple camera modules are provided in the electronic device, each camera module may correspond to a camera. Different cameras may be configured with their own image sensors, or two or more cameras may share one image sensor.
After acquiring the optical signals collected by the camera, the image sensor can output raw images in different modes for different scenarios. The raw image may be a RAW-format image.
By way of example, the different modes may include modeA and modeB, where the frame rate of the image output in modeA differs from the frame rate of the image output in modeB. In some cases, the image size of the raw image output in modeA may also differ from that output in modeB. For example, modeA may be a binning mode and modeB may be an InsensorZoom mode.
The frame rate is measured in frames per second and indicates the frequency at which the image sensor exposes and outputs images per unit time. In the following description, the process in which the image sensor exposes and outputs an image is simply referred to as image output by the image sensor. The frame rate is generally denoted in fps (frames per second); for example, a frame rate of 30fps means 30 frames of images are output within 1 second.
The image size is used to represent the size of the original image output by the image sensor.
As one possible implementation, the image size may include a length, a width, a resolution, etc. of the image data.
As yet another possible implementation, the image size may include parameters such as a line length, a frame length, and the like, to which the image data corresponds.
The line length (line_length) is the length of one line of image data output by the image sensor. The line length may include the horizontal blanking interval (HBlank). The line length may also be referred to as HTS.
The frame length (frame_length) is the number of lines in one frame of image. The frame length may include the vertical blanking interval (VBlank). The frame length may also be referred to as VTS.
It can be understood that the line length may correspond to the length of the raw image output by the image sensor, and the frame length may determine the width of the raw image. Together, the line length and frame length may determine the resolution of the output image.
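As an illustration of this relationship, a sensor's frame time is commonly frame time = line length × frame length / pixel clock; the numbers below are assumed examples, not values from the patent:

```cpp
#include <cstdio>

int main() {
    double pixelClockHz = 600e6;  // assumed sensor pixel clock
    double hts = 5000;            // line length (HTS) in pixel-clock cycles
    double vts = 4000;            // frame length (VTS) in lines
    double frameTimeS = hts * vts / pixelClockHz;  // duration of one frame
    std::printf("frame time %.1f ms -> %.0f fps\n",
                frameTimeS * 1e3, 1.0 / frameTimeS);  // 33.3 ms -> 30 fps
    return 0;
}
```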
Thus, by controlling the image sensor to output raw images in modeA or modeB, the electronic device can acquire and display the raw images required in different scenarios.
For example, consider a camera application installed in the electronic device: the electronic device controls image output by the image sensor through the camera application, thereby providing a photographing function.
Referring to FIG. 1, a schematic diagram of an interface interaction is shown.
As shown in fig. 1, the electronic device may display an icon 102 of a camera application on a main interface 101. Upon receiving a click operation 103 of the icon 102 by the user, the electronic device may correspondingly display an interface 104 of the camera application.
Taking the example of providing a photographing function by default after the camera application is opened: as shown by interface 104 in fig. 1, the electronic device may display the current shooting mode, "photo", centered in the function bar 105. A preview image 106 of the object to be photographed may also be included in the interface 104. When the user decides to take a picture, the button 107 can be clicked. Correspondingly, in response to the click on button 107, the camera application may photograph and save the current shooting object.
In some implementations, as shown in FIG. 1, other function options may also be displayed in the function bar 105. For example, the other function options may include a current focal length magnification (e.g., current magnification of 1×, optional magnification of 2×). As another example, other functional options may include other shooting modes (e.g., video mode, HDR mode), and so forth.
Referring to fig. 2, an internal interaction diagram of an electronic device is shown. Based on the interaction example shown in fig. 2, the electronic device can provide the user with the corresponding photographing function when the user inputs operation 103.
As shown in fig. 2, after receiving the user's operation 103, the camera application sends an instruction 21 to the camera abstraction (Camera HAL) configured in the electronic device.
For example, the instruction 21 may include operation information corresponding to operation 103.
In some implementations, multiple camera modules may be configured in an electronic device. Different camera modules can be configured with different camera identifiers, so that by carrying a camera identifier in an instruction, the electronic device can correctly invoke the corresponding camera module.
For example, in some embodiments, the camera abstraction in the electronic device may itself determine the camera identifier of the camera module that needs to be started according to the operation information carried in instruction 21. In other implementations, the operation information may include the camera identifier of the camera module that needs to be started.
The camera abstraction may also determine whether the image sensor should be configured with modeA or modeB for image output. In the present application, the frame rate of modeA may also be referred to as the second frame rate, and the frame rate of modeB may also be referred to as the first frame rate. When modeA is used, two adjacent frames are output with an interval of the first duration; when modeB is used, two adjacent frames are output with an interval of the second duration.
The camera abstraction can then send an instruction 22 to the camera driver corresponding to the camera module to be started. Instruction 22 may carry the configuration parameters of modeA or modeB, instructing the image sensor to output raw images in the corresponding mode.
Where multiple camera modules are configured in the electronic device, each camera module may be provided with a corresponding camera driver. The camera driver may be the functional counterpart of the corresponding camera module within the electronic device. When a camera module needs to be used, the electronic device may send the related instructions to its camera driver.
In this way, the camera abstraction can send instruction 22 to the camera driver of the camera module that needs to be started. The camera driver may control the corresponding camera module to operate according to instruction 22. For example, the camera driver may send instruction 22 to the image sensor of the camera module to control image output by the image sensor.
It should be noted that one or more registers may be configured in the camera driver for storing the various configuration parameters used during image output by the image sensor.
For example, referring to fig. 3, a register 31 may be configured in the camera driver. After receiving instruction 22, the camera driver may write the configuration parameters carried by instruction 22 into register 31. For example, the configuration parameters carried by instruction 22 may include configuration parameter A of modeA, which may include a frame rate f0 and an image size S1 corresponding to modeA.
Thus, the camera driver can control the image sensor to output images according to configuration parameter A in register 31, as shown in fig. 3.
For example, under control of the camera driver, the image sensor may acquire optical signals from the camera and output raw image data 23 according to modeA.
In the example of fig. 2, the image sensor may directly transmit raw image data 23 to an image signal processor (Image Signal Processor, ISP) for subsequent processing.
In other embodiments, the image sensor may send raw image data 23 to the camera driver and the camera driver may transmit raw image data 23 to the ISP for processing.
Correspondingly, the ISP may process the received raw image data 23 to obtain processed image data 24. The electronic device may display on a display screen based on the image data 24. For example, the image data 24 may be displayed to present a preview image 106 as shown in FIG. 1.
In this way, the image sensor outputs one frame of image based on modeA. Thereafter, the image sensor may continue outputting at the frame rate f0. Correspondingly, the electronic device can continuously display multiple frames on the interface to form a preview stream.
In some cases, the electronic device may switch the image output mode of the image sensor from modeA to modeB.
By way of example, the electronic device may decide to switch from modeA to modeB when any of the following occurs:
Switching to a different focal length magnification, for example from 1× to 2×, i.e., the electronic device receives a user instruction to adjust the focal length;
Switching to a different shooting mode, for example from photo mode to video mode, i.e., the electronic device receives a user instruction to switch the shooting mode.
As an example, fig. 4 provides an interface interaction example of focus adjustment and shooting mode switching.
As shown in fig. 4, in the case where the user inputs operation 401 indicating to adjust the focal length magnification to 2×, the electronic device may determine to switch to a different focal length magnification according to operation 401.
As shown in fig. 4, in the case where the user inputs operation 402 indicating to switch the shooting mode to the video mode, the electronic device may determine to switch to a different shooting mode according to operation 402.
In other implementations, the electronic device may decide to switch from modeA to modeB based on current environmental parameters.
The current environmental parameters may include the 3A data of already acquired images. 3A data refers to the Auto Exposure (AE), Auto White Balance (AWB), and Auto Focus (AF) data acquired while the ISP processes raw image data. For example, the electronic device may decide whether to switch from modeA to modeB based on the AE gain and AE exposure duration.
As one example, upon receiving operation 401 or operation 402, the electronic device determines that the AE gain of the already acquired image is less than a preset gain threshold, and/or that the AE exposure duration of the already acquired image is less than a preset duration threshold, and thereby determines that a switch from modeA to modeB is required.
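A sketch of that decision, with placeholder thresholds (the real preset thresholds and exact logic are not specified here):

```cpp
struct AeData {
    double gain;        // AE gain of the already acquired image
    double exposureMs;  // AE exposure duration of the already acquired image
};

bool shouldSwitchToModeB(bool operationReceived, const AeData& ae) {
    constexpr double kGainThreshold = 4.0;         // assumed preset gain threshold
    constexpr double kExposureThresholdMs = 20.0;  // assumed preset duration threshold
    // Operation 401/402 was received AND the 3A data satisfies the preset
    // condition ("and/or" in the text; modeled here as either sufficing).
    return operationReceived &&
           (ae.gain < kGainThreshold || ae.exposureMs < kExposureThresholdMs);
}
```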
In the above implementations, the electronic device switches from modeA to modeB based on user input operations and the 3A data of acquired images. In other implementations, the electronic device may also decide to switch from modeA to modeB on its own.
For example, the electronic device may decide to switch from modeA to modeB when entry into an HDR scene is triggered.
The electronic device may enter an HDR scene at the user's instruction, or may automatically trigger entry into an HDR scene when the currently captured image contains both bright and dark regions.
Existing schemes implement the switch from modeA to modeB in several different ways.
In some implementations, the camera abstraction can send a stream-stop indication to the camera driver, indicating that output based on the current mode (e.g., modeA) should stop. Correspondingly, after receiving the stream-stop indication, the camera driver controls the image sensor to stop outputting images.
Thereafter, the camera abstraction can send a stream-start indication to the camera driver. The stream-start indication may carry configuration parameter B of modeB. The camera driver can then control the image sensor to start outputting images based on modeB according to configuration parameter B.
In this scheme, between the camera driver receiving the stream-stop indication and receiving the stream-start indication, the image sensor does not output images. This interrupts the preview image display during the switching process.
In other implementations, the camera abstraction may send the modeB configuration parameters directly to the camera driver. This may also be referred to as a seamless switching scheme. In the seamless switching scheme, the camera driver may be configured to control image output by the image sensor based on the most recently received configuration parameters (e.g., configuration parameter B).
Thus, after receiving configuration parameter B, the image sensor outputs images according to configuration parameter B, for example at the frame rate f1 indicated by configuration parameter B.
It can be understood that when the output frame rate of the image sensor is greater than or equal to the display frame rate of the electronic device's display screen, the preview stream on the display screen will not stutter or be interrupted.
Moreover, as the display frame rate of the electronic device increases, the user becomes more sensitive to changes in the image sensor's output frame rate.
Under the seamless switching scheme, even if the image sensor switches output directly from configuration parameter A upon receiving configuration parameter B, the frame rate difference before and after switching can still cause problems such as preview stream stutter.
This is caused by the mechanism that controls image output by the image sensor.
For example, refer to fig. 5 in conjunction with fig. 3. Before configuration parameter B is received, register 31 stores the currently effective configuration parameter A, and the image sensor outputs images at the frame rate f0 indicated by configuration parameter A.
Upon receiving configuration parameter B, the image sensor stops output at the frame rate f0. For example, the camera driver corresponding to the image sensor may mark the configuration parameters in register 31 as unavailable or delete them.
The camera driver may then write configuration parameter B to register 31. After configuration parameter B is written into the register, the image sensor can output images at the frame rate f1 indicated by configuration parameter B.
Because each step before the new configuration parameters in register 31 take effect, such as data modification, deletion, and writing, consumes time, the image sensor cannot output normally during the configuration adjustment. In addition, different output modes involve changes to the image sensor's internal configuration; when switching from modeA to modeB, some time is required for the configuration corresponding to modeB to take effect before images can be output according to modeB's configuration parameters.
As a result, the preview stream may stutter or be interrupted.
The problems in the above process are described in detail with reference to the frame output timing shown in fig. 6.
As shown in fig. 6, configuration parameter A is in effect while the image sensor outputs the N-1th frame and the preceding N-2th and N-3th frames. The output interval between the N-3th and N-2th frames corresponds to the frame rate f0, as does the interval between the N-2th and N-1th frames.
After the N-1th frame is output, the camera driver of the image sensor receives configuration parameter B. The camera driver then updates configuration parameter B into register 31 according to the implementation shown in fig. 5. After configuration parameter B is updated, the image sensor outputs the Nth frame and subsequent frames according to the new configuration parameter B. For example, the output interval between the Nth and N+1th frames, and between the N+1th and N+2th frames, may correspond to the frame rate f1.
However, between the output of the N-1th frame and the Nth frame, the configuration adjustment process makes the output interval noticeably longer, so the corresponding frame rate is clearly lower than f0 or f1. For example, the output interval between the N-1th and Nth frames corresponds to a frame rate f2.
Thus, as shown in fig. 6, even with the seamless switching scheme in effect, an extra frame rate f2 appears during the configuration adjustment. The displayed image correspondingly jumps from frame rate f0 to f2 and then to f1, which appears as stutter in the preview stream.
Note that in this example f0 and f1 differ. In other implementations, f0 and f1 may be the same; even then, the appearance of frame rate f2 causes a frame rate jump during the modeA-to-modeB switch, which again appears as preview stream stutter.
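A hedged numeric illustration of the dip (all values assumed): even with f0 = f1 = 30 fps, any extra reconfiguration time stretches one frame interval and lowers the instantaneous rate:

```cpp
#include <cstdio>

int main() {
    double nominalIntervalMs = 1000.0 / 30.0;  // ~33.3 ms at f0 = f1 = 30 fps
    double reconfigOverheadMs = 15.0;          // assumed register/mode-switch latency
    double stretchedMs = nominalIntervalMs + reconfigOverheadMs;
    std::printf("f2 = %.1f fps\n", 1000.0 / stretchedMs);  // ~20.7 fps: a visible dip
    return 0;
}
```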
With the technical solution provided by the embodiments of the application, when configuration parameters need to be adjusted, the electronic device can control the image sensor to continue outputting at least one frame under the currently effective configuration parameters before the new configuration parameters take effect.
For example, referring to fig. 7, with the scheme provided in the embodiments of the present application in effect, after the N-1th frame is output, the image sensor may continue to output the Nth frame at the frame rate f0 when configuration parameter B is received. The output interval between the N-1th and Nth frames is thus maintained at the frame rate f0. Then, after configuration parameter B takes effect, the N+1th frame is output directly at the new frame rate f1.
It can be seen that with the scheme of the present application in effect, the output interval between the N-1th and Nth frames is maintained at f0, without the jump to frame rate f2 shown in fig. 6. The frame rate of consecutive images therefore changes only from f0 to f1, avoiding the display stutter of the case shown in fig. 6.
The present application does not limit the magnitude relationship between f0 and f1. Because the scheme avoids the appearance of frame rate f2, it resolves the frame rate jump caused by f2 regardless of whether f0 equals f1.
The following describes the scheme provided by the embodiment of the present application in detail.
It should be noted that the electronic device in the embodiments of the present application may include at least one of a mobile phone, a foldable electronic device, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, or a smart city device. The embodiments of the application do not limit the specific type of the electronic device.
Referring to fig. 8, a schematic diagram of software composition of an electronic device is provided in an embodiment of the present application.
In this example, the software system of the electronic device may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the application take the layered-architecture Android system as an example to illustrate the software architecture of an electronic device.
As shown in fig. 8, the layered architecture divides the software into several layers, each with a clear role and division of work, and the layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, from top to bottom: an application layer, an application framework layer, the Android runtime (ART) and native C/C++ libraries, a hardware abstraction layer (HAL), and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 8, the application package may include applications for cameras, calendars, maps, WLANs, music, text messages, calls, navigation, bluetooth, video, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 8, the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, an activity manager, an input manager, and the like.
The window manager provides a Window Manager Service (WMS), which may be used for window management, window animation management, surface management, and as a transfer station for the input system.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in the status bar; it can be used to convey notification-type messages that disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, etc. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system status bar at the top of the screen, such as notifications of background-running applications, or notifications in the form of a dialog window on the screen. For example, a text message is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light blinks.
The activity manager may provide an Activity Manager Service (AMS), which may be used for starting, switching, and scheduling system components (e.g., activities, services, content providers, broadcast receivers) and for managing and scheduling application processes.
The input manager may provide an Input Manager Service (IMS), which may be used to manage system input, such as touch screen input, key input, and sensor input. The IMS retrieves events from input device nodes and distributes them to the appropriate windows through interaction with the WMS.
The Android runtime includes a core library and the Android runtime environment, and is responsible for converting source code into machine code, mainly using Ahead-Of-Time (AOT) compilation and Just-In-Time (JIT) compilation techniques.
The core library is mainly used for providing the functions of basic Java class libraries, such as basic data structures, mathematics, IO, tools, databases, networks and the like. The core library provides an API for the user to develop the android application.
The native C/C++ libraries may include multiple functional modules, such as the surface manager, Media Framework, libc, OpenGL ES, SQLite, and WebKit.
The surface manager manages the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The Media Framework supports playback and recording of many commonly used audio and video formats, as well as still image files; the media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. OpenGL ES provides drawing and manipulation of 2D and 3D graphics in applications. SQLite provides a lightweight relational database for the applications of the electronic device.
The hardware abstraction layer runs in user space, encapsulates the kernel-layer drivers, and provides a call interface to the upper layers. By way of example, the hardware abstraction layer may include a display module, an audio module, a camera module, a Bluetooth module, and the like. In some embodiments, the camera module may also be referred to as the camera abstraction.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver and a Bluetooth driver. In combination with the foregoing description, in the present application, in the case where a plurality of camera modules are configured in the electronic device, each camera module may be correspondingly configured with a corresponding camera driver.
It should be noted that, the electronic device composition provided in fig. 8 is only an example, and does not limit the electronic device according to the embodiment of the present application. In other embodiments, the electronic device may also have other software components.
Exemplary, referring to fig. 9, a schematic diagram of the composition of still another electronic device according to an embodiment of the present application is provided.
In this example, as shown in fig. 9, the software system running in the processor may be divided into user space and kernel space. Corresponding to the software architecture shown in fig. 8, the user space may include the hardware abstraction layer shown in fig. 8. In some embodiments, the application layer, application framework layer, and native C/C++ libraries shown in fig. 8 may also be considered part of user space. The kernel space shown in fig. 9 may then correspond to the kernel layer shown in fig. 8.
As shown in fig. 9, the user space and kernel space of the electronic device may be configured in the form of software modules in the processor.
Wherein the user space may be configured with a camera application, a camera abstraction.
The camera abstraction may include a decision module and a sensor node.
The decision module may also be referred to as an AEC decision module.
The decision module may be used to obtain operation information of a user's current input operation from the camera application. For example, the operation information may include operation information of operation 103 shown in fig. 1, operation information of operation 401 shown in fig. 4, operation information of operation 402 shown in fig. 4, and the like.
The decision module may also be used to obtain 3A data of the last frame of image that has been obtained from the ISP. For example, the decision module may acquire data such as AE exposure time, AE gain, etc. of the last frame image that has been acquired from the IFE of the ISP.
The decision module may also be configured to decide, according to the acquired information, whether to switch the image output mode. For example, switching the output mode may include switching from modeA to modeB, or from modeB to modeA.
The sensor node may specifically include an enabling control module and an exposure control module.
The enabling control module may also be referred to as the reshutter control module. It can be used to determine the target image sensor corresponding to the camera module that currently needs to be started, and to determine whether the target image sensor supports the reshutter function. In the embodiments of the present application, the reshutter function may also be referred to as the first function.
In the embodiments of the present application, supporting the reshutter function means that the image sensor can be configured to continue outputting at least one frame under the existing output configuration after receiving a new output configuration.
The enabling control module is also used to transmit the relevant control information to the camera driver of the target image sensor. Illustratively, take the case where the target image sensor supports the reshutter function: the control information may include a reshutter enable flag and configuration parameter B of modeB.
As shown in fig. 9, an exposure control module may also be included in the sensor node.
In some embodiments of the present application, when the output mode is switched, the exposure control module may be configured to control the frame length of the first frame image after the switch.
For example, the exposure control module may control the exposure duration of the first frame image after switching to modeB according to configuration parameter B of modeB and configuration parameter A of modeA, so that after the mode switch, the frame length of the first frame image output by the image sensor is the same as or similar to the frame length of the last frame before the switch. In some implementations, a frame length difference within 10% may be considered similar. This ensures that the frame rate of output images does not change significantly before and after the output mode switch.
It should be noted that in some embodiments of the present application, the exposure control module may be omitted; the enabling control module alone can then ensure that no frame rate jump occurs after the output mode is switched.
As shown in fig. 9, a camera driver and an ISP may be disposed in the kernel space.
The camera driver may be used for instruction transmission between the relevant modules in user space and the image sensor. Where multiple camera modules are configured in the electronic device, multiple camera drivers may be configured correspondingly in kernel space.
It should be noted that, in the embodiments of the present application, an image sensor and a camera may together form a camera module, and the electronic device may include one or more camera modules.
The number of image sensors configured in the electronic device (e.g., N) may be the same as or different from the number of cameras (e.g., M).
Correspondingly, camera drivers may be configured in one-to-one correspondence with the image sensors, or in one-to-one correspondence with the cameras.
In the following description, a one-to-one correspondence between camera drivers and image sensors is taken as an example.
In the present application, as shown in fig. 10, for an image sensor supporting the reshutter function, at least three registers may be configured in its corresponding camera driver: a GPH register (also called the first register), a reshutter register (the second register), and a configuration register (the third register).
The GPH register enables grouped writing of multiple sets of data corresponding to the same frame. In other words, the GPH register is used to enable configuration of the reshutter register and the configuration register.
The reshutter register is used to enable the reshutter function. The configuration register is used to store the configuration parameters of the output mode. For example, storage area A of the configuration register stores the currently effective configuration parameters (e.g., configuration parameter A of modeA), while storage area B stores the configuration parameters to take effect (e.g., configuration parameter B of modeB).
The camera driver also controls image output by the image sensor according to the register configuration. For example, when the reshutter register is set to 1 (the first value), the camera driver may control the image sensor to output the next frame according to the configuration parameters in storage area A of the configuration register, and to output subsequent frames according to the configuration parameters in storage area B.
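A sketch of that per-frame selection; the structure and field names are assumptions for illustration, not the patent's driver code:

```cpp
#include <cstdint>

struct DriverRegs {
    uint32_t reshutter;  // second register: 1 (first value) enables the first function
    uint32_t areaA_fps;  // storage area A: currently effective frame rate (e.g. f0)
    uint32_t areaB_fps;  // storage area B: frame rate to take effect (e.g. f1)
};

// With reshutter set to 1, the next frame is still paced by storage area A;
// area B is then promoted into area A for subsequent frames.
uint32_t frameRateForNextFrame(DriverRegs& r) {
    if (r.reshutter == 1) {
        uint32_t fps = r.areaA_fps;
        r.areaA_fps = r.areaB_fps;  // overwrite area A with area B after this frame
        r.reshutter = 0;
        return fps;
    }
    return r.areaA_fps;
}
```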
In some implementations, the ISP may include an Image Front End (IFE) processing unit and an Image Processing Engine (IPE).
The IFE is mainly responsible for preprocessing the image data acquired by the camera, such as color processing, denoising, enhancement, and sharpening. The output of the IFE is preprocessed RAW image data. In the present application, 3A data such as the AE exposure duration and AE gain can be obtained after IFE processing, and the IFE may also send the 3A data to the decision module.
The IPE is the main processing unit in the ISP and is responsible for performing various processing operations on the image data output by the IFE, such as white balance, color correction, and noise reduction. The data output after IPE processing can serve as the basis for the display data sent by the electronic device.
The technical scheme provided by the embodiment of the application can be applied to the electronic equipment shown in fig. 8 or 9.
In the following examples, the implementation of the solution provided in the embodiment of the present application is described in detail with reference to the software composition shown in fig. 9.
Referring to fig. 11, a schematic diagram of interaction between modules is provided in an embodiment of the present application. In this implementation, the currently used camera module supports the reshutter function, and before the N-1th frame is output, the electronic device has determined to use modeA for image output. Thus, the GPH register in the camera driver is configured to 0, the reshutter register is configured to 0, and configuration parameter A is stored in the configuration register.
Through the flow shown in fig. 11, the N-1th frame image is output with configuration parameter A corresponding to modeA (e.g., the frame rate f0).
As shown in fig. 11, the scheme may include:
S1101, the camera driver sends an output instruction 111 to the image sensor.
The output instruction 111 may correspond to configuration parameter A. For example, instruction 111 may include the frame rate f0 and image size S1 corresponding to configuration parameter A.
S1102, the image sensor transmits raw image data 112 to the IFE according to the output instruction 111.
The raw image data 112 may correspond to the raw image of the N-1th frame.
In this example, the image sensor may process the optical signal from the camera according to the image size S1 carried in instruction 111, and output raw image data 112 of image size S1. Further, the frequency at which the image sensor transmits raw image data 112 to the IFE may correspond to the frame rate f0.
S1103, IFE sends the reference data 113 to the decision module.
The reference data 113 may include the 3A data corresponding to the N-1th frame image, e.g., the AE exposure duration and/or AE gain of the N-1th frame.
Illustratively, the IFE may perform preliminary processing on the received raw image data 112 to obtain the 3A data of the N-1th frame.
In this example, the IFE may send the 3A data obtained after processing each frame to the decision module, so that the decision module can use the 3A data of acquired frames to decide whether to trigger an output mode adjustment during output of subsequent frames.
S1104, the IFE sends image data 114 to the IPE. Image data 114 may be the data of the N-1th frame after preliminary processing by the IFE.
S1105, the IPE generates and outputs image data 115 from image data 114.
Thus, S1104-S1105 implement the ISP processing of the N-1th frame, obtaining the displayable image data 115 corresponding to the N-1th frame.
In this example, after the N-1th frame is output and before the Nth frame is output, the electronic device decides, according to a user operation and environmental parameters, to switch from modeA to modeB for image output.
Referring to fig. 12, a schematic diagram of further interaction between modules according to an embodiment of the present application is provided. Based on the flow shown in fig. 12, after the camera driver of the image sensor receives the mode switching instruction, it continues to control the image sensor to output at least one frame according to the currently effective modeA. That is, the Nth frame is output at the frame rate f0 indicated by modeA, rather than waiting for modeB to take effect before outputting at modeB's frame rate f1. This avoids a wait before the Nth frame is output.
As shown in fig. 12, the scheme may include:
S1201, the camera application sends the operation information 121 to the decision module.
For example, the camera application may generate corresponding operation information after receiving the operation 401 or operation 402 of the user, and send the operation information to the decision module.
Take the camera application receiving the user's operation 401 as an example. In conjunction with the description of fig. 4, operation 401 is used to instruct adjusting the focal length magnification to 2×.
Correspondingly, the camera application may generate the operation information 121 according to operation 401 and send it to the decision module.
The operation information 121 may include information indicating that the focal length magnification is adjusted from 1× to 2×.
S1202, the decision module sends the configuration parameter B and the camera identifier C1 to the enabling control module. In some embodiments, the configuration parameter B and the camera identifier C1 may be carried in the first control information. In other embodiments, the first control information may include only the frame rate f1 in the configuration parameter B, or may include the frame rate f1 and the camera identifier C1.
The camera identifier C1 may be an identifier of the currently called camera.
In some embodiments, the decision module may obtain the camera identification C1 from the camera application. In other embodiments, the decision module may obtain the camera identity C1 from other modules (e.g., a camera management module in a framework layer of the electronic device).
In this example, the decision module may determine that the image output mode needs to be switched from modeA to modeB according to the operation information 121 and the reference data 113 acquired in S1103 of fig. 11.
Illustratively, the decision module may determine to switch from modeA to modeB when the operation information 121 indicates that the focal length magnification is adjusted from 1× to 2× and the 3A data corresponding to the previous frame image satisfies a preset condition.
The preset condition may include that the AE gain of the previous frame image is smaller than a preset gain threshold, and/or that the AE exposure duration of the previous frame image is smaller than a preset duration threshold.
Taking the case that the reference data 113 satisfies the preset condition as an example.
In this way, the decision module can determine that it is necessary to switch from modeA to modeB and that the image sensor should be controlled to output images according to modeB.
It should be noted that the condition by which the decision module determines to switch from modeA to modeB in this example is merely illustrative. In other implementations, the decision module may determine that a switch from modeA to modeB is required based on other mechanisms. The embodiments of the present application are not limited in this regard.
In this way, in the case where it is determined that modeA needs to be switched to modeB for image output, the decision module may send the configuration parameter B of modeB and the currently used camera identifier C1 to the enabling control module.
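A minimal sketch of this switching decision is given below, assuming hypothetical threshold values and field names (the patent does not specify concrete thresholds):

    /* Hypothetical sketch of the decision module's condition for switching
     * from modeA to modeB; thresholds are illustrative assumptions. */
    typedef struct {
        double ae_gain;         /* AE gain of the previous frame image */
        double ae_exposure_ms;  /* AE exposure duration of the previous frame */
    } stats_3a_t;

    static int should_switch_to_mode_b(int zoom_adjusted_1x_to_2x,
                                       const stats_3a_t *prev)
    {
        const double gain_threshold = 4.0;          /* assumed gain threshold */
        const double exposure_ms_threshold = 20.0;  /* assumed duration threshold */

        return zoom_adjusted_1x_to_2x
            && prev->ae_gain < gain_threshold
            && prev->ae_exposure_ms < exposure_ms_threshold;
    }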
S1203, the enabling control module sends the configuration parameter B and the reshutter enable flag to the camera driver.
For example, if the enabling control module receives the configuration parameter B and the camera identifier C1, it can determine that modeA needs to be switched to modeB for image output.
In the present application, the enabling control module can determine, according to the camera identifier C1, whether the image sensor corresponding to the currently used camera supports the reshutter function.
In some implementations, a reshutter list may be configured in the enabling control module. This reshutter list may also be referred to as a first function list.
The reshutter list may include at least one camera identifier.
As an example, suppose the reshutter list includes the camera identifier C1. This indicates that the image sensor used by the camera corresponding to the camera identifier C1 supports the reshutter function. That is, under the control of the camera driver, the image sensor of the camera corresponding to the camera identifier C1 may continue to output at least one frame according to the configuration parameter A.
Thus, the enabling control module may query the reshutter list for the camera identifier C1. If the camera identifier C1 is included in the reshutter list, it is determined that the currently used image sensor supports the reshutter function.
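As a sketch of this lookup (the list contents below are invented for illustration):

    /* Hypothetical reshutter (first function) list: identifiers of cameras
     * whose image sensors support the reshutter function. */
    static const int reshutter_list[] = { 1, 3 };

    static int supports_reshutter(int camera_id)
    {
        unsigned i;
        for (i = 0; i < sizeof(reshutter_list) / sizeof(reshutter_list[0]); i++) {
            if (reshutter_list[i] == camera_id)
                return 1;  /* e.g., the camera identifier C1 is in the list */
        }
        return 0;
    }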
It will be appreciated that all hardware functions are traversed during power-on of the electronic device. Thus, in other embodiments of the present application, the reshutter list may be generated and stored after the electronic device completes the functional traversal of all cameras and corresponding image sensors.
In other embodiments of the present application, the reshutter list may also be configured and stored when the electronic device ships from the factory.
In other embodiments of the present application, the reshutter list may also be flexibly adjusted according to the actual situation.
Thus, when the enabling control module determines that the currently used image sensor supports the reshutter function, it can send the reshutter enable flag to the camera driver, enabling the camera driver to control the image sensor to continue outputting at least one frame according to the configuration parameter A. The following examples take the case where the camera driver controls the image sensor to continue outputting one frame according to the configuration parameter A.
In addition, the enabling control module can also send the new configuration parameter B to the camera driver.
As shown in fig. 12, the camera driver may configure the corresponding registers according to the received reshutter enable flag.
Illustratively, upon receiving the reshutter enable flag, the camera driver may set the GPH register to 1, set the reshutter register to 1, and store the configuration parameter B in the configuration register.
Setting the GPH register to 1 indicates that the reshutter register and the configuration register need to be configured. Setting the reshutter register to 1 indicates that the image to be output next (the Nth frame image) continues to be output according to the existing configuration. The existing configuration, such as the configuration parameter A, may be stored in a storage area a of the configuration register.
The configuration parameter B may be stored in a storage area b of the configuration register.
In some implementations, after completing the data storage of the reshutter register and the configuration register, the camera driver may reset the GPH register to 0.
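Reusing the output_config_t sketch above, this register sequence might be sketched as follows; the register layout, types, and function name are assumptions, not the actual driver interface:

    /* Hypothetical register file kept by the camera driver for one sensor. */
    typedef struct {
        int gph;                 /* 1 = register group configuration enabled */
        int reshutter;           /* 1 = next frame keeps the existing config */
        output_config_t area_a;  /* storage area a: existing configuration A */
        output_config_t area_b;  /* storage area b: pending configuration B */
    } sensor_regs_t;

    static void stage_mode_switch(sensor_regs_t *regs, output_config_t config_b)
    {
        regs->gph = 1;            /* enable configuring the other registers */
        regs->reshutter = 1;      /* hold the existing configuration for now */
        regs->area_b = config_b;  /* stage the new configuration parameter B */
        regs->gph = 0;            /* configuration complete */
    }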
It should be noted that, in other embodiments of the present application, if the enabling control module determines according to the camera identifier C1 that the corresponding image sensor does not support the reshutter function, the reshutter enable flag may not be generated, and only the configuration parameter B is sent to the camera driver. The camera driver then controls the image sensor to output images according to the existing scheme shown in fig. 3 or fig. 5.
S1204, the camera driver sends an image output instruction 122 to the image sensor.
Illustratively, based on the reshutter register being configured to 1, the camera driver may output the next frame image (e.g., the Nth frame image) according to the existing configuration parameter A, without waiting for the new configuration parameter B to take effect.
As a possible implementation, after the GPH register is configured to 0 (i.e., register configuration is complete), the camera driver may, in response to the reshutter register being configured to 1, read the existing configuration parameter A from the storage area a of the configuration register.
The camera driver may generate the image output instruction 122 according to the configuration parameter A, instructing the image sensor to output images at the frame rate f0 and the image size S1.
Correspondingly, the image sensor may generate the Nth frame image for output. The Nth frame image may be output at the frame rate f0, and its size may correspond to the image size S1.
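A sketch of this step, reusing the hypothetical sensor_regs_t type above; issue_frame() is an assumed helper that programs the sensor for one frame and is not defined by the patent:

    extern void issue_frame(unsigned fps, unsigned width, unsigned height);

    /* Hypothetical sketch of S1204: with the reshutter register at 1, the
     * Nth frame is still driven by configuration A in storage area a. */
    static void issue_nth_frame(const sensor_regs_t *regs)
    {
        if (regs->reshutter) {
            issue_frame(regs->area_a.frame_rate_fps,
                        regs->area_a.image_width,
                        regs->area_a.image_height);
        }
    }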
S1205, the image sensor sends the raw image data 123 to the IFE.
S1206, IFE sends reference data 124 to the decision module.
Similar to S1103 in fig. 11, the IFE may send the 3A data of the Nth frame image (e.g., the AE exposure duration and/or the AE gain of the Nth frame image) to the decision module, so that the decision module determines the image output mode of subsequent frame images accordingly.
S1207, the IFE transmits image data 125 to the IPE. The image data 125 may be image data obtained after the IFE processes the raw image data 123.
S1208, the IPE generates and outputs image data 126 from the image data 125. The image data 126 may be used for display of the Nth frame image.
In this way, even when the camera driver of the image sensor receives the image output mode adjustment instruction before the Nth frame image is output, the Nth frame image can still be output according to the existing modeA, achieving the effect shown in fig. 7.
It will be appreciated that, in some embodiments of the present application, after receiving the configuration parameter B (i.e., the new image output mode) from the decision module, the enabling control module may control, through logic similar to S1203-S1208, the output of the Nth frame image through the (N+x)th frame image to continue according to the existing modeA. In this way, once the modeB configuration takes effect, images can immediately be output according to modeB. The period in which the frame rate f2 would appear, as in fig. 6, is instead filled with the x frames output according to modeA, thereby avoiding a jump to the frame rate f2 during the switch from f0 to f1.
In different implementations, x may be flexibly configured as 1 or a positive integer greater than 1. The present application takes x configured as 1 as an example. Thus, after the Nth frame image is output, the configuration parameter B has taken effect. That is, after the Nth frame image is output, the image sensor may, under the control of the camera driver, immediately output images at the frame rate f1 corresponding to the configuration parameter B.
Therefore, after the Nth frame image is completed, the electronic device can, through the camera driver, control the image sensor to output subsequent frame images (such as the (N+1)th frame image) according to the new configuration parameter B.
Illustratively, in some embodiments, this process may be implemented by configuring registers in the camera driver.
For example, referring to fig. 13, after the Nth frame image is output (e.g., after the camera driver sends the image output instruction 122 to the image sensor), the camera driver may configure the GPH register to 0, configure the reshutter register to 0, and delete the configuration parameter A from the configuration register. Alternatively, the camera driver may store the configuration parameter B into the storage area a.
Thus, based on the modified register state, the camera driver can, following a scheme similar to that shown in fig. 11, control the image sensor to output subsequent images (such as the (N+1)th frame image) according to the configuration parameter B.
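The register transition of fig. 13 might be sketched as a commit step, again on the hypothetical sensor_regs_t layout; treating "delete the configuration parameter A" as overwriting storage area a is an assumption consistent with the alternative described above:

    /* Hypothetical commit after the Nth frame: configuration B becomes the
     * active configuration, so subsequent frames follow modeB. */
    static void commit_pending_config(sensor_regs_t *regs)
    {
        regs->gph = 0;                /* no register group being configured */
        regs->reshutter = 0;          /* stop holding the old configuration */
        regs->area_a = regs->area_b;  /* configuration B replaces A in area a */
    }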
By way of example, referring to fig. 14, a schematic diagram of still another interaction between modules according to an embodiment of the present application is provided. Based on the flow shown in fig. 14, the camera driver can control the image sensor to output images based on modeB according to the newly effective configuration parameter B.
As shown in fig. 14, the scheme may include:
S1401, the camera driver transmits an image output instruction 141 to the image sensor.
The image output instruction 141 may correspond to the configuration parameter B. For example, the image output instruction 141 may include the frame rate f1 and the image size S2 corresponding to the configuration parameter B.
S1402, the image sensor sends the raw image data 142 to the IFE according to the image output instruction 141. The raw image data 142 may correspond to the raw image of the (N+1)th frame image.
S1403, the IFE sends reference data 143 to the decision module. The reference data 143 may include 3A data corresponding to the (N+1)th frame image. For example, the reference data 143 may include an AE exposure duration and/or an AE gain corresponding to the (N+1)th frame image.
S1404, the IFE sends image data 144 to the IPE. The image data 144 may be data of the (N+1)th frame image after the IFE performs the preliminary processing.
S1405, IPE generates image data 145 from image data 144 and outputs the same.
In this way, output of the (N+1)th frame image according to the new modeB configuration parameter is realized.
It should be noted that, in other embodiments of the present application, the electronic device may further control, through the exposure control module configured in the sensor node, the exposure time of the first frame image after the new configuration parameter takes effect. The frame length of that first frame image is thereby limited, ensuring that the frame rate presented on the interface does not change excessively before and after the image output mode switch.
By way of example, referring to fig. 15 in conjunction with fig. 12, a schematic diagram of still another interaction between modules according to an embodiment of the present application is provided.
In the example of fig. 15, the electronic device outputs the Nth frame image at the frame rate f0 of modeA according to S1201-S1208, and may also configure the exposure time of the (N+1)th frame image in the camera driver through S1501-S1502.
For example, as shown in fig. 15, after the enabling control module receives the configuration parameter B and the camera identifier C1 in S1202, it may determine, according to the camera identifier C1 and the logic shown in fig. 12, that the image sensor supports the reshutter function.
In addition to executing S1203, the enabling control module may also execute S1501 below.
S1501, the enabling control module sends the reshutter enable flag to the exposure control module.
For example, by sending the reshutter enable flag to the exposure control module, the enabling control module may trigger the exposure control module to control the exposure of the first frame image after the mode switch (e.g., the (N+1)th frame image).
In this example, the exposure control module may determine the exposure limit parameter for the (N+1)th frame image upon receiving the reshutter enable flag.
As an example, the exposure control module may determine the exposure limit parameter for the first image after modeB takes effect (e.g., the (N+1)th frame image) based on the already-effective configuration parameter of modeA and the to-be-effective configuration parameter of modeB.
For example, the exposure control module may determine the exposure limit parameter according to the following formula (1).
Formula (1): Ts = (FLL(A) − (M + Y_ADD_END(A) − Y_ADD_STA(A) + 65) / N + 96) × LLP(A) / LLP(B) − 96.
Here, Ts is the exposure limit parameter, FLL(A) is the frame length of modeA, Y_ADD_END(A) is the end position of the modeA image height coordinates, Y_ADD_STA(A) is the start position of the modeA image height coordinates, LLP(A) is the line length of modeA, and LLP(B) is the line length of modeB. M and N are both fixed coefficients. For example, when modeA is full-frame (fullsize), M = 4 and N = 1. As another example, when modeA is not full-frame, M = 8 and N = 2.
Thus, by controlling the exposure time of the (N+1)th frame image to be smaller than the exposure limit parameter, it is possible to ensure that the frame length of the first frame after switching to modeB is the same as or close to the frame length of modeA.
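The computation of formula (1) and the resulting exposure clamp might be sketched as follows; the types and units are assumptions (all length values are taken to be in the sensor's line/pixel-clock units):

    /* Hypothetical per-mode parameters for evaluating formula (1). */
    typedef struct {
        double fll;        /* frame length, FLL */
        double y_add_end;  /* end of the image height coordinates, Y_ADD_END */
        double y_add_sta;  /* start of the image height coordinates, Y_ADD_STA */
        double llp;        /* line length, LLP */
        int    fullsize;   /* nonzero if the mode is full-frame */
    } mode_params_t;

    /* Exposure limit parameter Ts per formula (1), for switching A -> B. */
    static double exposure_limit_ts(const mode_params_t *a, const mode_params_t *b)
    {
        const double m = a->fullsize ? 4.0 : 8.0;  /* fixed coefficient M */
        const double n = a->fullsize ? 1.0 : 2.0;  /* fixed coefficient N */

        return (a->fll - (m + a->y_add_end - a->y_add_sta + 65.0) / n + 96.0)
               * a->llp / b->llp - 96.0;
    }

    /* The (N+1)th frame's exposure time must not exceed Ts. */
    static double clamp_exposure(double requested, double ts)
    {
        return requested < ts ? requested : ts;
    }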
For example, exposure control of the (N+1)th frame image according to the exposure limit parameter may be implemented through S1502 and the related logic below.
S1502, the exposure control module sends the exposure limit parameter to the camera driver.
The camera driver may be the camera driver corresponding to the currently used camera identifier C1. In some implementations, the exposure control module may obtain the camera identifier C1 from the enabling control module (not shown in fig. 15).
In this way, the camera driver may receive the configuration parameter B and the reshutter enable flag according to S1203, and may also receive the exposure limit parameter according to S1502.
In connection with the description of the camera driver's register updates in fig. 12, in the example of fig. 15 the camera driver may update the GPH register, the reshutter register, and the configuration register according to the scheme shown in fig. 12.
Further, in the example of fig. 15, the camera driver may also store the acquired exposure limit parameter in the configuration register.
For example, the camera driver may store the exposure limit parameter together with the configuration parameter B in the storage area b, so that the exposure limit parameter and the configuration parameter B take effect together when the (N+1)th frame image is output.
Thus, when the camera driver, following the scheme shown in fig. 14, sends the image output instruction 141 to the image sensor, the exposure limit parameter Ts may be sent to the image sensor in addition to the frame rate f1 and the image size S2 corresponding to the configuration parameter B. Accordingly, when the image sensor outputs the (N+1)th frame image, it processes the image according to the configuration parameter B while keeping the exposure time from exceeding the exposure limit parameter Ts.
Therefore, through the cooperation of the enabling control module and the exposure control module, the Nth frame image can be output at the frame rate f0, and the exposure time of the (N+1)th frame image can be controlled to be less than or equal to the exposure limit parameter Ts, thereby keeping the frame length of the (N+1)th frame image equal or close to the frame length of the Nth frame image.
In each of the embodiments of the present application, switching the image output mode from modeA to modeB is taken as an example. In other embodiments of the present application, if the current image output mode is modeB and the electronic device (e.g., the decision module configured in the electronic device) determines that modeB needs to be switched to modeA, the technical solution provided in any one of fig. 11 to fig. 15 may also be used, so that after receiving the image output mode switching instruction, the camera driver continues to control the image sensor to output at least one frame according to modeB. Alternatively, by implementing the scheme provided in fig. 15, the exposure control module may determine the corresponding exposure limit parameter according to the above formula (1) (e.g., by interchanging the modeA-related parameters and the modeB-related parameters in formula (1)). Based on the exposure limit parameter, the exposure time of the first frame image after switching to modeA is controlled so that its frame length is the same as or close to the frame length of the modeB output. For specific implementations in this scenario, reference may be made to the detailed descriptions in the foregoing embodiments, which are not repeated herein.
The above description mainly describes the scheme provided by the embodiment of the application from the perspective of each functional module. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
By way of example, fig. 16 illustrates a schematic diagram of the composition of an electronic device 1600. As shown in fig. 16, the electronic device 1600 may include a processor 1601 and a memory 1602. The memory 1602 is used to store computer-executable instructions. For example, in some embodiments, the processor 1601, when executing instructions stored in the memory 1602, can cause the electronic device 1600 to perform a method as shown in any of the above embodiments.
As shown in fig. 16, the electronic device 1600 may further include an image sensor 1603. When the electronic device 1600 performs the method provided in the foregoing embodiments, the image sensor 1603 may be controlled to perform the corresponding image output according to the solutions provided in the embodiments of the present application.
It should be noted that, for all relevant content of the steps in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, and details are not repeated herein.
Fig. 17 shows a schematic diagram of the composition of a chip system 1700. The chip system 1700 may include a processor 1701 and a communication interface 1702 to support the related devices in implementing the functions referred to in the above embodiments. In one possible design, the chip system further includes a memory to hold the necessary program instructions and data for the electronic device. The chip system may be composed of chips, or may include chips and other discrete devices. It should be noted that, in some implementations of the present application, the communication interface 1702 may also be referred to as an interface circuit.
It should be noted that, for all relevant content of the steps in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, and details are not repeated herein.
The functions or actions or operations or steps in the embodiments described above may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), etc.
Although the application has been described in connection with specific features and embodiments thereof, it will be apparent that various modifications and combinations can be made without departing from the spirit and scope of the application. Accordingly, the specification and drawings are merely exemplary illustrations of the present application as defined in the appended claims and are considered to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the application. It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (20)

Translated from Chinese

1. A method for controlling image output, characterized in that the method is applied to an electronic device, the electronic device being configured with a first image sensor, the first image sensor corresponding to at least one camera configured in the electronic device; the method comprising: generating first control information, the first control information including a first frame rate; and controlling the first image sensor to output a first image; wherein a time interval between outputting the first image and outputting a second image is a first duration; the first duration corresponds to a second frame rate, and the second image is the frame image preceding the first image.
2. The method according to claim 1, characterized in that before generating the first control information, the method further comprises: controlling the first image sensor to output the second image at the second frame rate.
3. The method according to claim 1 or 2, characterized in that the first control information further includes a first camera identifier; the first camera identifier is an identifier of a currently used first camera, the first camera is included in the at least one camera, and the first camera corresponds to the first image sensor; before controlling the first image sensor to output the first image, the method further comprises: determining, according to the first camera identifier, that the first image sensor corresponding to the first camera supports a first function, the first function corresponding to continuing to output at least one frame at an already-configured frame rate after a new frame rate configuration is received.
4. The method according to claim 3, characterized in that the new frame rate configuration includes the first frame rate, and the already-configured frame rate includes the second frame rate.
5. The method according to claim 3, characterized in that a first function list is configured in the electronic device, the first function list including at least one camera identifier, and the image sensor corresponding to each camera identifier in the first function list supporting the first function; determining, according to the first camera identifier, that the first image sensor corresponding to the first camera supports the first function comprises: determining, according to the first camera identifier being included in the first function list, that the first image sensor corresponding to the first camera supports the first function.
6. The method according to claim 4 or 5, characterized in that the electronic device configures a first register, a second register, and a third register for the first image sensor; after determining that the first image sensor supports the first function, the method further comprises: configuring the first register to a first value, indicating that configuring the second register and the third register is enabled; configuring the second register to a first value, indicating that the first function is enabled; and storing the first frame rate in the third register.
7. The method according to claim 6, characterized in that the third register includes a first storage area and a second storage area; storing the first frame rate in the third register comprises: storing the first frame rate in the second storage area; wherein, before the first frame rate is stored in the second storage area, the second frame rate is stored in the first storage area.
8. The method according to claim 6, characterized in that the method further comprises: configuring the first register to a second value, indicating that configuring the second register and the third register is disabled.
9. The method according to claim 7, characterized in that controlling the first image sensor to output the first image comprises: in the case where the second register is the first value, controlling the first image sensor to output the first image at the second frame rate stored in the first storage area.
10. The method according to claim 7 or 9, characterized in that after controlling the first image sensor to output the first image, the method further comprises: storing the first frame rate in the first storage area, overwriting the second frame rate.
11. The method according to any one of claims 1, 2, 4, 5, 7, 8, or 9, characterized in that after controlling the first image sensor to output the first image, the method further comprises: controlling the first image sensor to output a third image; wherein a time interval between outputting the third image and outputting the first image is a second duration; the second duration corresponds to the first frame rate, and the third image is the frame image following the first image.
12. The method according to claim 11, characterized in that a first frame length is the same as or close to a second frame length; the first frame length is the frame length of the third image, and the second frame length is the frame length of the first image or the second image; the first frame length being close to the second frame length includes: the difference between the first frame length and the second frame length not exceeding 10%.
13. The method according to claim 12, characterized in that controlling the first image sensor to output the third image comprises: in the process of controlling the first image sensor to output the third image, the exposure time of the first image sensor is less than or equal to an exposure limit parameter, so that the first frame length is the same as or close to the second frame length.
14. The method according to claim 13, characterized in that before controlling the first image sensor to output the third image, the method further comprises: determining the exposure limit parameter according to a first configuration parameter and a second configuration parameter; the first configuration parameter being a configuration parameter corresponding to the first frame rate, and the second configuration parameter being a configuration parameter corresponding to the second frame rate.
15. The method according to any one of claims 1, 2, 4, 5, 7, 8, 9, 12, 13, or 14, characterized in that before generating the first control information, the method further comprises: acquiring image parameters of a fourth image, the image parameters including an automatic exposure (AE) exposure duration and/or an AE exposure gain of the fourth image; and determining, according to the image parameters of the fourth image, to generate the first control information.
16. The method according to claim 15, characterized in that determining, according to the image parameters of the fourth image, to generate the first control information comprises: in the case where first operation information or second operation information is received, determining, according to the image parameters of the fourth image, to generate the first control information; the first operation information being generated by the electronic device upon receiving a first operation, the first operation being used to instruct the electronic device to adjust the focal length; the second operation information being generated by the electronic device upon receiving a second operation, the second operation being used to instruct the electronic device to adjust the shooting mode.
17. The method according to claim 16, characterized in that the first operation is specifically used to instruct the electronic device to adjust the focal length magnification from 1× to 2×.
18. The method according to any one of claims 1, 2, 4, 5, 7, 8, 9, 12, 13, 14, 16, or 17, characterized in that the first frame rate corresponds to a frame rate in InsensorZoom mode, and the second frame rate corresponds to a frame rate in binning mode.
19. An electronic device, characterized in that the electronic device comprises: a memory, one or more processors, and at least one image sensor, the memory, the processor, and the image sensor being coupled to one another; wherein the memory is configured to store computer program code, the computer program code including computer instructions, and when the processor executes the computer instructions, the electronic device controls the image sensor to output images according to the method of any one of claims 1-18.
20. A chip system, characterized in that the chip system is applied to an electronic device; the chip system comprises one or more interface circuits and one or more processors, the interface circuit and the processor being interconnected by lines; the interface circuit is configured to receive a signal from a memory of the electronic device and send the signal to the processor, the signal including computer instructions stored in the memory; and when the processor executes the computer instructions, the electronic device performs the method of any one of claims 1-18.
CN202311583534.7A | Priority date: 2023-11-23 | Filing date: 2023-11-23 | A method for controlling image output, electronic device and chip system | Active | Granted as CN118488310B (en)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN202411985401.7A | 2023-11-23 | 2023-11-23 | A method for controlling image output, electronic device and chip system
CN202311583534.7A | 2023-11-23 | 2023-11-23 | A method for controlling image output, electronic device and chip system

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202311583534.7A | 2023-11-23 | 2023-11-23 | A method for controlling image output, electronic device and chip system

Related Child Applications (1)

Application Number | Relation | Priority Date | Filing Date | Title
CN202411985401.7A (Division, published as CN120034733A (en)) | Division | 2023-11-23 | 2023-11-23 | A method for controlling image output, electronic device and chip system

Publications (2)

Publication Number | Publication Date
CN118488310A (en) | 2024-08-13
CN118488310B (en) | 2025-01-10

Family

Family ID: 92197659

Family Applications (2)

Application Number | Title | Priority Date | Filing Date
CN202411985401.7A (Pending, CN120034733A (en)) | A method for controlling image output, electronic device and chip system | 2023-11-23 | 2023-11-23
CN202311583534.7A (Active, CN118488310B (en)) | A method for controlling image output, electronic device and chip system | 2023-11-23 | 2023-11-23

Family Applications Before (1)

Application Number | Title | Priority Date | Filing Date
CN202411985401.7A (Pending, CN120034733A (en)) | A method for controlling image output, electronic device and chip system | 2023-11-23 | 2023-11-23

Country Status (1)

Country
CN (2)

Family Cites Families (16)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
JP5460180B2 (en)* | 2009-08-25 | 2014-04-02 | Canon Inc. | Imaging apparatus and control method thereof
JP5566133B2 (en)* | 2010-03-05 | 2014-08-06 | Canon Inc. | Frame rate conversion processor
US10757342B1 (en)* | 2018-04-25 | 2020-08-25 | Snap Inc. | Image device auto exposure
JP7378431B2 (en)* | 2018-06-18 | 2023-11-13 | Magic Leap, Inc. | Augmented reality display with frame modulation functionality
US11039081B2 (en)* | 2019-07-17 | 2021-06-15 | Pixart Imaging Inc. | Image sensor capable of eliminating rolling flicker and adjusting frame rate
KR102860466B1 (en)* | 2020-11-13 | 2025-09-17 | Samsung Electronics Co., Ltd. | Electronic device including a plurality of image sensors and method for thereof
US11812165B2 (en)* | 2021-06-16 | 2023-11-07 | Mediatek Inc. | Method and apparatus for dynamically changing frame rate of sensor output frames according to whether motion blur condition is met
CN115706853B (en)* | 2021-08-12 | 2025-08-26 | Honor Device Co., Ltd. | Video processing method, device, electronic device and storage medium
US12231768B2 (en)* | 2021-11-23 | 2025-02-18 | Qualcomm Incorporated | Reduced latency mode switching in image capture device
CN116261041A (en)* | 2021-12-08 | 2023-06-13 | Huawei Technologies Co., Ltd. | Shooting method, device, equipment and storage medium
JP2023095508A (en)* | 2021-12-24 | 2023-07-06 | Canon Inc. | Imaging apparatus and method for controlling the same, and program
CN116506742A (en)* | 2022-01-20 | 2023-07-28 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Camera switching method, device, electronic device and computer-readable storage medium
CN115550541B (en)* | 2022-04-22 | 2024-04-09 | Honor Device Co., Ltd. | Camera parameter configuration method and electronic equipment
CN118264920A (en)* | 2022-09-16 | 2024-06-28 | Honor Device Co., Ltd. | Method and device for adjusting frame interval
CN116132713A (en)* | 2022-11-02 | 2023-05-16 | Rockchip Electronics Co., Ltd. | Method for adjusting frame rate, chip, storage medium and electronic device
CN116033275B (en)* | 2023-03-29 | 2023-08-15 | Honor Device Co., Ltd. | Automatic exposure method, electronic equipment and computer readable storage medium

Also Published As

Publication number | Publication date
CN120034733A (en) | 2025-05-23
CN118488310A (en) | 2024-08-13

Similar Documents

Publication | Title
JP7217357B2 (en) | Mini-program data binding method, apparatus, device and computer program
CN111597000B (en) | Small window management method and terminal
US8205159B2 (en) | System, method and medium organizing templates for generating moving images
CN114640798B (en) | Image processing method, electronic device, and computer storage medium
CN111225108A (en) | Communication terminal and card display method of negative screen interface
CN113329176A (en) | Image processing method and related device applied to camera of intelligent terminal
CN114374813A (en) | Multimedia resource management method, recorder and server
CN115002336A (en) | Video information generation method, electronic device and medium
CN113253905B (en) | Touch method based on multi-finger operation and intelligent terminal
CN111176766A (en) | Communication terminal and component display method
CN115460448A (en) | Media resource editing method and device, electronic equipment and storage medium
CN112825536B (en) | Electronic terminal and background card display method
CN118488310B (en) | A method for controlling image output, electronic device and chip system
CN111479075B (en) | Photographing terminal and image processing method thereof
CN113642010B (en) | Method for acquiring data of extended storage device and mobile terminal
CN113254132B (en) | Application display method and related device
CN114449171B (en) | Method for controlling camera, terminal device, storage medium and program product
CN115484399A (en) | Video processing method and electronic equipment
CN114520867A (en) | Camera control method and terminal device based on distributed control
CN118381996B (en) | Image shooting method and electronic device
CN113542711A (en) | Image display method and terminal
CN119127039B (en) | Display method and electronic device
CN113179362B (en) | Electronic device and image display method thereof
CN114143456B (en) | Photographing method and device
CN111988530B (en) | Mobile terminal and photographing method thereof

Legal Events

Code | Title | Description
PB01 | Publication | —
SE01 | Entry into force of request for substantive examination | —
GR01 | Patent grant | —
CP03 | Change of name, title or address | Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040. Patentee after: Honor Terminal Co., Ltd. (China). Address before: 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong. Patentee before: Honor Device Co., Ltd. (China).

