CN111787224A - Image acquisition method, terminal device and computer-readable storage medium - Google Patents

Image acquisition method, terminal device and computer-readable storage medium

Info

Publication number
CN111787224A
CN111787224A
Authority
CN
China
Prior art keywords
image information
camera
picture
target
preview
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010654953.5A
Other languages
Chinese (zh)
Other versions
CN111787224B (en)
Inventor
赵紫辉
代文慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Microphone Holdings Co Ltd
Original Assignee
Shenzhen Microphone Holdings Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Microphone Holdings Co Ltd
Priority to CN202010654953.5A (patent CN111787224B)
Publication of CN111787224A
Priority to PCT/CN2021/101320 (patent WO2022007622A1)
Priority to CN202180044452.8A (patent CN115812312A)
Application granted
Publication of CN111787224B
Legal status: Active
Anticipated expiration

Abstract

The application discloses an image acquisition method applied to a terminal device with cameras. The terminal device includes at least two cameras, each capturing a different focal segment, and the method includes the following steps: when a trigger operation on the cameras is detected, starting each camera, controlling each camera to pick up image information, and forming a camera preview interface; when a photographing trigger operation is detected, generating a picture according to the camera preview interface; and saving the picture. The application also discloses a terminal device and a computer-readable storage medium. Because the picture taken by the terminal device is formed by combining image information of different focal segments collected simultaneously by multiple cameras, the foreground and background clarity of the picture and the stereoscopic effect of the image in the picture are improved, yielding a good shooting and imaging effect.

Description

Image acquisition method, terminal device and computer-readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image obtaining method, a terminal device, and a computer-readable storage medium.
Background
With the development of science and technology, the shooting functions of terminal devices have become more and more advanced. For example, terminal devices can now capture images in ultra-wide-angle, wide-angle, and telephoto modes: the ultra-wide-angle and wide-angle modes enable large-viewing-angle shooting, and together with the telephoto mode they can suit different shooting scenes.
Terminal devices on the market that support ultra-wide-angle, wide-angle, and telephoto shooting are generally equipped with at least three cameras, for example an ultra-wide-angle camera, a wide-angle camera, and a telephoto camera: ultra-wide-angle images are captured with the ultra-wide-angle camera, wide-angle images with the wide-angle camera, and telephoto images with the telephoto camera. In a typical shooting process, when the user opens the camera, a camera preview interface is formed from preview data collected by a default camera; the user then sets a zoom value on the preview interface, the system identifies the camera corresponding to the set zoom value, starts that camera to pick up image data, and displays the data in the preview interface, and the user obtains an image from that camera after pressing the shutter.
However, such a terminal device still forms each image with a single camera, so the shooting and imaging effect is not good.
The above is only for the purpose of assisting understanding of the technical solutions of the present application, and does not represent an admission that the above is prior art.
Disclosure of Invention
The application mainly aims to provide an image acquisition method, a terminal device and a computer readable storage medium, and aims to solve the technical problem that the shooting and imaging effect of the terminal device is poor.
In order to achieve the above object, the present application provides an image acquisition method applied to a terminal device with cameras, where the terminal device includes at least two cameras and the focal segment captured by each camera is different, and the image acquisition method includes the following steps:
when the triggering operation of the cameras is detected, starting the cameras, controlling the cameras to pick up image information respectively, and forming a camera preview interface;
when the photographing triggering operation is detected, generating a picture according to the camera preview interface;
and saving the picture.
Optionally, the step of forming a camera preview interface includes:
determining a target camera according to a preset zoom value;
and combining main preview image information and supplementary preview image information to form the camera preview interface, wherein the main preview image information is image information picked up by the target camera, and the supplementary preview image information is image information picked up by other cameras except the target camera.
Optionally, the step of combining the main preview image information and the supplemental preview image information to form the camera preview interface includes:
acquiring an area overlapped with the main preview image information in the supplementary preview image information;
and combining the area overlapped with the main preview image information in the supplementary preview image information into the main preview image information to form the camera preview interface.
Optionally, the step of merging the region of the supplemental preview image information that overlaps with the main preview image information into the main preview image information to form the camera preview interface includes:
acquiring coordinates of the target camera and relative position parameters of the other cameras and the target camera;
calculating the coordinate position of the pixel of the region overlapped with the main preview image information in each supplementary preview image information according to the coordinate of the target camera and the relative position parameter;
and converting each pixel into a pixel plane corresponding to the main preview image information according to the coordinate position to form the camera preview interface.
Optionally, the preset zoom value is a default zoom value of the terminal device, or the preset zoom value is a set zoom value of a user.
Optionally, while executing the step of saving the picture, the method also executes:
and storing the image information picked up by each camera, and associating the picture with each image information.
Optionally, after the step of saving the image information picked up by each camera and associating the picture with each image information, the method further includes:
when the picture editing operation is detected, acquiring editing parameters corresponding to the editing operation;
acquiring target image information corresponding to the editing parameters from each image information associated with the picture;
and generating edited target picture preview data according to the target image information.
Optionally, the step of acquiring target image information corresponding to the editing parameter from each piece of image information associated with the picture includes:
determining a zoom value of the adjusted picture according to the editing parameters;
acquiring the focal segment in which the zoom value falls, and taking the camera matched with that focal segment as the target camera;
and taking the image information picked up by the target camera as the target image information.
Optionally, the step of generating edited target picture preview data according to the target image information includes:
and adjusting the target image information according to the zoom value, and generating edited target picture preview data based on the adjusted target image information.
Optionally, the step of generating edited target picture preview data according to the target image information includes:
and combining and generating edited target picture preview data according to the target image information and other image information associated with the picture.
Optionally, the editing operation comprises at least one of zooming in, zooming out, and cropping.
Optionally, when the editing operation is a zoom-in operation, a focal length in which the zoom value of the adjusted picture is located is larger than a focal length in which the current zoom value of the picture is located; and when the editing operation is a zooming-out operation, the focal length of the zoom value of the adjusted picture is smaller than the focal length of the current zoom value of the picture.
Optionally, when the editing operation is cropping, after the step of generating edited target picture preview data according to the target image information, the method further includes:
and after the cutting determining operation is detected, cutting the target picture according to the target picture preview data and the editing parameters.
The present application further provides a terminal device, including: a memory, a processor and an image acquisition program stored on the memory and executable on the processor, the image acquisition program, when executed by the processor, implementing the steps of the image acquisition method as described above.
Optionally, the processor includes at least two image processing modules, and each image processing module is connected to one camera.
Furthermore, the present application also provides a computer-readable storage medium having stored thereon an image acquisition program which, when executed by a processor, implements the steps of the image acquisition method as described above.
According to the image acquisition method, the terminal device and the computer readable storage medium, when the terminal device detects the triggering operation of the camera, the terminal device starts each camera, controls each camera to pick up image information respectively, generates a camera preview interface according to the image information picked up by each camera, and generates and stores a picture according to the camera preview interface when the triggering operation of photographing is detected. Because the picture is formed by combining the image information of different focal sections acquired by a plurality of cameras simultaneously, the front and back scene definition of the picture and the three-dimensional effect of the image in the picture are improved, and the imaging effect of the camera is good.
Drawings
Fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a first embodiment of an image acquisition method according to the present application;
FIG. 3 is a detailed flowchart of the step S10 in FIG. 2;
fig. 4 is a detailed flowchart of step S12 in the second embodiment of the image obtaining method of the present application;
FIG. 5 is a schematic flowchart of a third embodiment of an image acquisition method according to the present application;
FIG. 6 is a detailed flowchart of the step S60 in FIG. 5;
fig. 7 is a schematic flowchart of a fourth embodiment of an image obtaining method according to the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The main solution of the embodiment of the application is as follows: when the triggering operation of the cameras is detected, starting the cameras, controlling the cameras to pick up image information respectively, and forming a camera preview interface; when the photographing triggering operation is detected, generating a picture according to the camera preview interface; and saving the picture.
As shown in fig. 1, fig. 1 is a schematic structural diagram of a terminal device in a hardware operating environment according to an embodiment of the present application.
The terminal device may be a PC, or a terminal device with a shooting function such as a smart phone or a tablet computer.
As shown in fig. 1, the terminal device may include: a processor 1001, such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a Display screen (Display) and an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Further, the terminal device includes at least two cameras, and each camera captures a different focal segment. For example, when the terminal device includes three cameras, they are respectively a telephoto camera, a wide-angle camera, and an ultra-wide-angle camera, where the telephoto camera captures a focal segment of 3X-30X, the wide-angle camera captures 1X-3X, and the ultra-wide-angle camera captures 0.6X-1X. All the cameras are connected to the processor.
Optionally, the processor includes at least two image processing modules, and each image processing module is connected to one of the cameras. After a camera collects image information at its front end, it transmits the information to the image processing module connected to it; that module processes the collected information to form image information in the camera's focal segment and stores it in the memory. Because the image information collected by each camera is handled by its own image processing module, the raw image data collected by that camera can be saved in the memory. When at least two cameras collect images simultaneously, the image data from the cameras can also be processed in parallel and then merged, forming an image merged from data of different focal segments or different angles and improving the shooting effect.
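The per-camera pipeline described above can be sketched roughly as follows. This is an illustrative assumption in Python, not the patent's actual implementation: each camera feeds its own dedicated processing module, and every module keeps the raw frames it processed so that frames from different cameras can be merged later.

```python
class ImageProcessingModule:
    """One dedicated module per camera, as the description suggests.

    All names here are illustrative assumptions.
    """
    def __init__(self, camera_id, focal_segment):
        self.camera_id = camera_id
        self.focal_segment = focal_segment  # e.g. (1.0, 3.0) for wide-angle
        self.stored_frames = []             # stands in for the memory region

    def process(self, raw_frame):
        # A real module would demosaic/denoise; here we only tag and store.
        processed = {"camera": self.camera_id, "data": raw_frame}
        self.stored_frames.append(processed)
        return processed

# Three cameras with the focal segments given in the description.
modules = {
    "tele":       ImageProcessingModule("tele", (3.0, 30.0)),
    "wide":       ImageProcessingModule("wide", (1.0, 3.0)),
    "ultra_wide": ImageProcessingModule("ultra_wide", (0.6, 1.0)),
}

# All cameras capture simultaneously; each module processes independently,
# so the raw data of every camera survives for later merging.
frames = {name: m.process(f"raw-{name}") for name, m in modules.items()}
```

Because each module retains its camera's frames, a later merge step can combine data of different focal segments without re-capturing.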
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, a memory 1005, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an image acquisition program.
In the terminal shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with the backend server; the user interface 1003 is mainly used for connecting a client (user side) and performing data communication with the client; and the processor 1001 may be configured to invoke an image acquisition program stored in the memory 1005 and perform the following operations:
when the triggering operation of the cameras is detected, starting the cameras, controlling the cameras to pick up image information respectively, and forming a camera preview interface;
when the photographing triggering operation is detected, generating a picture according to the camera preview interface;
and saving the picture.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
determining a target camera according to a preset zoom value;
and combining main preview image information and supplementary preview image information to form the camera preview interface, wherein the main preview image information is image information picked up by the target camera, and the supplementary preview image information is image information picked up by other cameras except the target camera.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
acquiring an area overlapped with the main preview image information in the supplementary preview image information;
and combining the area overlapped with the main preview image information in the supplementary preview image information into the main preview image information to form the camera preview interface.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
acquiring coordinates of the target camera and relative position parameters of the other cameras and the target camera;
calculating the coordinate position of the pixel of the region overlapped with the main preview image information in each supplementary preview image information according to the coordinate of the target camera and the relative position parameter;
and converting each pixel into a pixel plane corresponding to the main preview image information according to the coordinate position to form the camera preview interface.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
and storing the image information picked up by each camera, and associating the picture with each image information.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
when the picture editing operation is detected, acquiring editing parameters corresponding to the editing operation;
acquiring target image information corresponding to the editing parameters from each image information associated with the picture;
and generating edited target picture preview data according to the target image information.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
determining a zoom value of the adjusted picture according to the editing parameters;
acquiring the focal segment in which the zoom value falls, and taking the camera matched with that focal segment as the target camera;
and taking the image information picked up by the target camera as the target image information.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
and adjusting the target image information according to the zoom value, and generating edited target picture preview data based on the adjusted target image information.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
and combining and generating edited target picture preview data according to the target image information and other image information associated with the picture.
Further, the processor 1001 may call the image acquisition program stored in the memory 1005, and also perform the following operations:
and after the cutting determining operation is detected, cutting the target picture according to the target picture preview data and the editing parameters.
Referring to fig. 2, the present application provides a first embodiment of an image acquisition method, where the image acquisition method is applied to a terminal device with a camera, the terminal device includes at least two cameras, and focal segments captured by the cameras are different, and the image acquisition method includes:
step S10, when detecting the trigger operation of the camera, opening each camera, controlling each camera to pick up image information respectively, and forming a camera preview interface;
the terminal device in this embodiment may be a mobile phone, a tablet, a camera, or the like. The terminal equipment is provided with a camera application, and a user can trigger the camera application to carry out shooting work.
When a user triggers the camera application of the terminal device, the terminal device starts each camera and controls each camera to pick up image information. Each camera transmits its acquired image information to a different processing module, and after processing, the information is stored in a different storage area. Image data acquired in response to the same trigger operation is associated together.
It should be noted that the terminal device in this embodiment includes, but is not limited to, a tele camera, a wide camera, and an ultra-wide camera, where a focal length shot by the tele camera is 3X to 30X, a focal length shot by the wide camera is 1X to 3X, and a focal length shot by the ultra-wide camera is 0.6X to 1X.
The terminal equipment is provided with a display interface, and after the camera application is triggered, image data collected by the cameras are displayed on the display interface in a preview mode.
It can be understood that, since in this embodiment at least two cameras pick up image information simultaneously and the picked-up image information is acquired at different focal segments and different coordinates, the camera preview interface may be formed by combining multiple sets of the image information.
Specifically, in an embodiment, referring to fig. 3, the camera preview interface is formed in a manner including, but not limited to, one of the following:
step S11, determining a target camera according to a preset zoom value;
step S12, combining main preview image information and supplemental preview image information to form the camera preview interface, where the main preview image information is image information picked up by the target camera, and the supplemental preview image information is image information picked up by other cameras except the target camera.
Namely, the main preview image information is determined according to the preset zoom value, and then the main preview image information is corrected by using the image information of other cameras, so that the display effect of the shot image is improved.
Specifically, the preset zoom value may be a default zoom value of the terminal device, or the preset zoom value may be a set zoom value of a user. And if the user triggers the camera application, the terminal equipment controls each camera to be started, and each camera respectively picks up image information and transmits the image information back to each processing module. At this time, if the terminal device does not detect the user-set zoom value, the default zoom value of the terminal device is adopted as the preset zoom value. And if the terminal equipment detects that the user sets the zoom value of the current camera, the set zoom value of the user is adopted as the preset zoom value. It can be understood that a zoom control is arranged on the terminal device, and a user can set a zoom value by triggering the zoom control.
When the camera application of the terminal device is triggered, the camera preview interface defaults to the default zoom value set by the system; when it is detected that the user triggers the zoom control to adjust the zoom value of the preview interface, the default zoom value is adjusted to the set zoom value. During focusing, since this embodiment uses multiple cameras to pick up image information simultaneously, when the default zoom value is adjusted to the set zoom value, the terminal device directly takes the image information picked up by the camera corresponding to the set zoom value as the main preview image information and the image information picked up by the other cameras as the supplementary preview image information, and combines them for imaging. It does not need the conventional technique of closing the camera corresponding to the default zoom value, opening the camera corresponding to the set zoom value, and only then picking up image information. This embodiment thus eliminates the image pick-up delay after the zoom value is adjusted, and to some extent avoids missing a fleeting scene because of zoom adjustment.
It should be noted that, in this embodiment, a target camera is determined according to the preset zoom value, where the target camera is a camera corresponding to the preset zoom value, and if the preset zoom value is 1.0X, the corresponding target camera is a wide-angle camera; if the preset zoom value is 0.6X, the corresponding target camera is an ultra-wide-angle camera.
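The mapping from a preset zoom value to the target camera can be sketched from the focal segments quoted earlier (telephoto 3X-30X, wide-angle 1X-3X, ultra-wide-angle 0.6X-1X). A minimal sketch follows; the boundary handling is an assumption, since the description does not specify which camera wins exactly at a segment boundary.

```python
# Focal segments quoted in the description; camera names are illustrative.
FOCAL_SEGMENTS = [
    ("ultra_wide", 0.6, 1.0),
    ("wide",       1.0, 3.0),
    ("tele",       3.0, 30.0),
]

def select_target_camera(zoom):
    """Return the camera whose focal segment contains the preset zoom value.

    Half-open intervals [lo, hi) are an assumption, chosen so that 1.0X
    selects the wide-angle camera and 0.6X the ultra-wide-angle camera,
    matching the examples in the text.
    """
    for name, lo, hi in FOCAL_SEGMENTS:
        if lo <= zoom < hi:
            return name
    if zoom >= FOCAL_SEGMENTS[-1][2]:
        return FOCAL_SEGMENTS[-1][0]   # clamp to the longest focal segment
    raise ValueError(f"zoom {zoom} outside the supported range")
```

With these assumed intervals, `select_target_camera(1.0)` yields the wide-angle camera and `select_target_camera(0.6)` the ultra-wide-angle camera, consistent with the examples in this paragraph.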
In this embodiment, after a target camera is determined according to a preset zoom value, image information picked up by the target camera is used as main preview image information, image information picked up by other cameras except the target camera is used as supplementary preview image information, and then the main preview image information is corrected by using the supplementary preview image information, so as to finally form a preview interface.
A picture is formed on the photosensitive element of a camera by proportional imaging of the object, and each photosensitive element is composed of different planar pixels; since the position coordinates of the cameras are offset relative to one another, when multiple cameras shoot the same object there are multiple different viewing angles. If the data from these different viewing angles is merged into one picture, the stereoscopic effect of the photographed object is stronger and the viewing-angle impression is restored more faithfully; at the same time, because the different focal segments image differently, the clarity of the foreground and background of the merged picture is also greatly improved.
In this embodiment, the camera preview interface is formed by combining the image information picked up by multiple cameras; since multiple cameras contribute to imaging, the camera preview interface has a good imaging effect.
Step S20, when the photographing triggering operation is detected, generating a picture according to the camera preview interface;
and step S30, saving the picture.
A photograph-confirm control is provided on the display interface of the terminal device. When the user triggers this control, the terminal device determines that a photographing trigger operation is detected, generates a picture from the image information currently displayed in the camera preview interface, saves the picture, and completes photographing.
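A minimal sketch of this trigger path, combined with the optional step described earlier of associating the saved picture with each camera's image information so that later edits can reuse the per-camera frames. All names and data structures here are illustrative assumptions, not the patent's actual implementation.

```python
def on_photograph_trigger(preview_frame, per_camera_frames, gallery):
    """Save the currently previewed frame as a picture and associate it
    with the raw frames picked up by each camera.

    preview_frame:     whatever the merged preview interface currently shows
    per_camera_frames: dict mapping camera name -> that camera's frame
    gallery:           dict acting as the saved-picture store
    """
    picture_id = len(gallery)
    gallery[picture_id] = {
        "picture": preview_frame,             # the saved picture itself
        "sources": dict(per_camera_frames),   # associated per-camera frames
    }
    return picture_id

# Usage: one trigger saves the preview and keeps every camera's raw frame.
gallery = {}
pid = on_photograph_trigger("merged-preview-frame",
                            {"wide": "w-frame", "tele": "t-frame"}, gallery)
```

Keeping the per-camera frames alongside the picture is what later allows re-editing (zooming in or out, cropping) against a different camera's data without reshooting.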
In this embodiment, when the terminal device detects a camera trigger operation, the terminal device starts each camera, controls each camera to respectively pick up image information, generates a camera preview interface according to the image information picked up by each camera, generates a picture according to the camera preview interface when the terminal device detects a photographing trigger operation, and stores the picture. Because the picture is formed by combining the image information of different focal sections acquired by a plurality of cameras simultaneously, the front and back scene definition of the picture and the three-dimensional effect of the image in the picture are improved, and the imaging effect of the camera is good.
Further, referring to fig. 4, the present application provides a second embodiment of an image obtaining method, and based on the first embodiment, the step of combining the main preview image information and the supplemental preview image information to form the camera preview interface includes:
step S121, acquiring an area overlapping with the main preview image information in the supplementary preview image information;
step S122, merging the region overlapping with the main preview image information in the supplemental preview image information into the main preview image information, and forming the camera preview interface.
Since at least two cameras pick up image information of the same object from different angles, each piece of image information necessarily has overlapping and non-overlapping areas with the others. In this embodiment, merging the overlapping areas increases the depth information at the edge positions of the formed picture, which can qualitatively improve the clarity of the picture content. Moreover, based on multi-focal-segment fusion, the edges of the picture can be supplemented and corrected by the supplementary preview image information of other focal segments, so that compared with a single-camera shot, the edges of the photographed picture are not distorted.
In this embodiment, the supplementary preview data is converted by coordinate transformation to the plane in which the main preview data lies, so that the supplementary preview data is calibrated onto one plane; thus, without changing a pixel point, the data can extend outward around that point along four axes.
The coordinate conversion mode is that the coordinate of the main preview image data is used as a central coordinate, and the supplementary preview information is converted to the central coordinate based on the relative relation between the central coordinate and the coordinate of the supplementary preview image information, so as to complete the coordinate conversion.
Specifically, the step of combining the region overlapping with the main preview image information in the supplemental preview image information into the main preview image information to form the camera preview interface includes:
acquiring coordinates of the target camera and relative position parameters of the other cameras and the target camera;
calculating the coordinate position of the pixel of the region overlapped with the main preview image information in each supplementary preview image information according to the coordinate of the target camera and the relative position parameter;
and converting each pixel into a pixel plane corresponding to the main preview image information according to the coordinate position to form the camera preview interface.
It should be noted that the position of the target camera serves as the central coordinate, and the relative position parameter refers to the position of the coordinates of each other camera relative to the coordinates of the target camera. After the coordinates of the target camera and the relative position parameters are acquired, the coordinate position, at the coordinates of the target camera, of the pixels of the area of the supplementary preview image information that overlaps the main preview image information is calculated. Specifically, the coordinates of each pixel at the corresponding camera are converted to the coordinates of the target camera based on the relative position parameter, so that each pixel point is converted onto the pixel plane of the main preview image information to complete the combination with the main preview image information, and the combined image is then displayed in the camera preview interface.
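The conversion described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the relative position parameter is modelled as a hypothetical (dx, dy) pixel offset between a supplementary camera and the target camera, whereas a real device would derive it from camera calibration.

```python
def convert_overlap_to_main_plane(supplement_pixels, offset):
    """Map pixels of the overlapping region from a supplementary
    camera's plane onto the target (main) camera's pixel plane."""
    dx, dy = offset  # hypothetical relative position parameter
    converted = {}
    for (x, y), value in supplement_pixels.items():
        # Shift each pixel coordinate into the main camera's plane,
        # whose origin is treated as the central coordinate.
        converted[(x + dx, y + dy)] = value
    return converted


def merge_into_main(main_pixels, converted_pixels):
    """Combine converted supplementary pixels into the main preview;
    where both cameras contribute, the main camera's pixel is kept."""
    merged = dict(converted_pixels)
    merged.update(main_pixels)  # main preview takes precedence
    return merged
```

In this sketch the merge simply prefers the main camera's pixels in the overlap; the specification leaves the exact blending strategy open.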
In the embodiment of the invention, multi-path fusion imaging increases the edge depth information of a single photo, so that the transparency of the photo content can be qualitatively improved; the problem of poor edge resolution is alleviated, reducing edge noise in the picture; and multi-pixel splicing and calibration indirectly turn the picture into a large-pixel image, improving the light sensitivity and color reproduction of the camera.
If, during shooting with the terminal device, the user first has to adjust the zoom value and then take the picture, the adjustment takes time, and the user may miss a fleeting scene because of it.
Specifically, in a third embodiment of the image obtaining method provided by the present application, based on the first and/or second embodiment and referring to fig. 5, in addition to the step of saving the picture, the image obtaining method also performs:
step S40, saving the image information picked up by each camera and associating the picture with each image information.
That is, after the user triggers the photographing operation, the terminal device generates a picture according to the camera preview interface, saves the picture, saves the image information picked up by each camera, and associates the picture with the image information picked up by each camera. In this way, the picture is associated with the multiple paths of image information data; when the terminal device performs zoom processing on the picture, it can call the multiple paths of image information based on this association and process the originally picked-up image information, so that during zoom processing the definition of the picture always remains the same as that of the original picture.
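The association in step S40 can be sketched as a simple mapping from the saved picture to the raw per-camera image information. All names here are hypothetical, purely for illustration; the specification does not prescribe a storage format.

```python
saved_pictures = {}  # picture id -> composed picture + raw per-camera data


def save_picture(picture_id, picture_data, per_camera_info):
    """Store the composed picture together with the original image
    information picked up by each camera (keyed by camera id), so the
    originals can be recalled during later zoom processing."""
    saved_pictures[picture_id] = {
        "picture": picture_data,
        "image_info": dict(per_camera_info),  # the association itself
    }


def image_info_for(picture_id, camera_id):
    """Recall the originally picked-up image information for one camera
    instead of rescaling the composed picture."""
    return saved_pictures[picture_id]["image_info"][camera_id]
```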
Based on the picture associated with the image information picked up by each camera, when the user edits the image, the image obtaining method of the embodiment of the application may perform the following processing on the picture:
referring to fig. 5 specifically, after the step of saving the image information picked up by each camera and associating the picture with each image information, the method further includes:
step S50, when the picture editing operation is detected, acquiring editing parameters corresponding to the editing operation;
step S60, acquiring target image information corresponding to the editing parameter from each image information associated with the picture;
step S70, generating edited preview data of the target picture according to the target image information.
The user may tap the picture to edit it. When the terminal device detects an editing operation triggered on the picture, it acquires the editing parameters corresponding to the editing operation, then acquires the target image information corresponding to the editing parameters from the pieces of image information associated with the picture, and generates the edited target picture preview data from the target image information.
The editing operation comprises at least one of zooming in, zooming out and cropping, and the editing parameters comprise one or more of a zoom-in multiple, a zoom-out multiple and a cropping size. When the user zooms in on a picture, the target image information corresponding to the zoom-in multiple is searched for among the pieces of image information associated with the picture, and that target image information is then used as the target picture preview data so that the user can preview the display effect of the enlarged picture. When the user triggers a zoom-out or cropping operation, the terminal device processes the picture in the same manner, which is not described again here.
It can be understood that in this embodiment the focal section of each camera is different, and the picked-up image information is likewise divided according to zoom value, which facilitates reasonably invoking the image information associated with the picture to generate the target picture preview data, so that the display effect of the edited picture is optimal. In an embodiment, referring to fig. 6, the step of acquiring the target image information corresponding to the editing parameters from the pieces of image information associated with the picture includes:
step S61, determining the zoom value of the adjusted picture according to the editing parameters;
step S62, acquiring a focal length corresponding to the zoom value, and taking a camera matched with the focal length as a target camera;
step S63, regarding the image information picked up by the target camera as the target image information.
That is, in this embodiment, when the editing parameters corresponding to the editing operation are acquired, they are converted into a zoom value; the camera whose focal section covers that zoom value is determined; the image information picked up by that camera is taken as the target image information; and the target picture preview data is then generated from the target image information.
Because the focal sections of the cameras differ, the image information picked up by each camera differs as well. The terminal device determines the target camera according to the zoom value to which the editing parameters adjust the picture, and generates the target picture preview data from the target image information corresponding to that camera. Since the target image information is the original information collected by the camera and belongs to the same focal section as the adjusted zoom value, its pixels do not need to be changed; the image displayed by the adjusted target picture preview data is therefore clear, avoiding the loss of definition that occurs in the exemplary technology, which achieves focusing by changing image pixels. In contrast, this embodiment improves the clarity of the edited picture.
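Steps S61 to S63 can be sketched as a lookup from zoom value to camera. The focal-section bounds below are illustrative assumptions, not values from the specification; note that, as stated later in the text, adjacent focal sections share a boundary value.

```python
# Hypothetical focal sections: camera -> (lower zoom bound, upper zoom bound).
FOCAL_RANGES = {
    "wide": (0.5, 1.0),
    "main": (1.0, 3.0),
    "tele": (3.0, 10.0),
}


def select_target_camera(zoom_value):
    """Step S62: return the camera whose focal section covers the
    zoom value derived from the editing parameters."""
    for camera, (low, high) in FOCAL_RANGES.items():
        if low <= zoom_value <= high:
            return camera
    raise ValueError(f"zoom value {zoom_value} outside supported range")


def target_image_info(zoom_value, image_info_by_camera):
    """Step S63: the image information picked up by the target camera
    becomes the target image information."""
    return image_info_by_camera[select_target_camera(zoom_value)]
```

With these ranges, a zoom value that equals a shared boundary (e.g. 1.0) resolves to the first matching camera; the specification leaves that tie-break open.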
In addition, in this embodiment, the image information that best matches the editing parameters is determined through the zoom value, and the target picture preview data is then generated from that image information, thereby optimizing the display effect of the edited picture.
It should be noted that each focal section has an upper limit value and a lower limit value, and two adjacent focal sections share one of these limit values. If the zoom value lies within a focal section but equals neither its upper limit value nor its lower limit value, the target picture preview data is generated in one of the following two ways:
firstly, generating edited target picture preview data based on the target image information;
that is, the edited target picture preview data is generated by directly adopting the target image information, so that the adjusted target picture preview data meets the focal length.
Secondly, the target image information is adjusted according to the zoom value, and edited target picture preview data is generated based on the adjusted target image information.
That is, in this embodiment, after the target image information is determined from the focal section in which the zoom value lies, the target image information is adjusted according to the zoom value so that it matches the zoom value, and the edited target picture preview data is then generated from the adjusted target image information, so that the preview data meets the requirement of the zoom value.
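The second approach can be sketched as below. This is a minimal model under stated assumptions: the target image information is reduced to a width/height pair, and `base_zoom` is a hypothetical reference zoom at which the camera picked up the image; adjusting to a larger zoom value corresponds to keeping a smaller visible region.

```python
def adjust_to_zoom(target_info, base_zoom, zoom_value):
    """Scale the target image information (modelled as a width/height
    pair) from the camera's reference zoom to the requested zoom value."""
    width, height = target_info
    factor = zoom_value / base_zoom
    return (round(width / factor), round(height / factor))


def preview_for_zoom(target_info, base_zoom, zoom_value):
    """Use the picked-up information as-is when the zoom value equals
    the reference (first approach); otherwise adjust it to the
    requested zoom value first (second approach)."""
    if zoom_value == base_zoom:
        return target_info  # use the target image information directly
    return adjust_to_zoom(target_info, base_zoom, zoom_value)
```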
When the editing operation is zooming in or zooming out, the terminal device generates the enlarged or reduced picture preview data and displays the enlarged or reduced picture. The user may then choose to capture the enlarged or reduced picture and store it in the memory, or quit picture editing, in which case the picture is restored to its state before editing.
When the editing operation is cropping, the terminal device generates the target picture preview data within the cropping area. The user may then confirm the crop; after detecting the crop-confirmation operation, the terminal device crops the target picture according to the target picture preview data and the editing parameters. That is, the terminal device determines the cropping size from the editing parameters to form a cropping area, and the data in the cropping area then forms the target picture.
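The cropping path can be sketched as follows, with the picture modelled as a 2-D list of pixel rows and the cropping area given as an origin plus a size; this is an illustration only, not the patented data layout.

```python
def crop_picture(pixels, x, y, width, height):
    """Cut the cropping area (x, y, width, height) out of the picture
    data once the user confirms the cropping operation; the data inside
    the area forms the target picture."""
    return [row[x:x + width] for row in pixels[y:y + height]]
```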
Further, the present application provides a fourth embodiment of the image obtaining method based on the third embodiment, and with reference to fig. 7, the step of generating edited target picture preview data according to the target image information includes:
and step S71, combining the target image information and other image information related to the picture to generate edited target picture preview data.
In this embodiment, when the user edits a generated picture and the target image information has been determined, the target image information is used as the main preview image information and the other image information associated with the picture is used as the supplementary preview image information, and the two are combined to form the target picture preview data. In this way, the foreground and background presented by the edited target picture preview data have high definition, the presented image achieves a stereoscopic effect, and the picture effect presented by the edited preview data is consistent with the picture effect obtained during shooting.
It should be noted that when the terminal device edits the picture, it edits based on all the image information associated with the picture, and the target picture preview data may be generated after merging all of this image information. The specific merging method is the same as the method used to merge the camera preview interface during photographing, for which reference may be made to the second embodiment; it is not repeated here.
Furthermore, an embodiment of the present application also provides a computer-readable storage medium, on which an image acquisition program is stored, and the image acquisition program, when executed by a processor, implements the steps of the image acquisition method as described above.
The present application further provides a terminal device, the terminal device includes: a memory, a processor and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, carries out the steps of the method as described above.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method as described in the above various possible embodiments.
An embodiment of the present application further provides a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method described in the above various possible embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the recitation of an element by the phrase "comprising a …" does not exclude the presence of additional like elements in the process, method, article, or apparatus that comprises the element. Further, similarly named elements, features, or items in different embodiments of the disclosure may have the same meaning or may have different meanings; the particular meaning is determined by its interpretation in, or by the context of, the specific embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "at the time of", "when", or "in response to a determination", depending on the context. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or multiple stages that are not necessarily performed at the same time; they may be performed at different times and in different orders, and may be performed in turns or alternately with other steps or with at least some of the sub-steps or stages of other steps.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (15)

CN202010654953.5A2020-07-102020-07-10Image acquisition method, terminal device and computer-readable storage mediumActiveCN111787224B (en)

Priority Applications (3)

Application NumberPriority DateFiling DateTitle
CN202010654953.5ACN111787224B (en)2020-07-102020-07-10Image acquisition method, terminal device and computer-readable storage medium
PCT/CN2021/101320WO2022007622A1 (en)2020-07-102021-06-21Image acquisition method, terminal device and computer-readable storage medium
CN202180044452.8ACN115812312A (en)2020-07-102021-06-21Image acquisition method, terminal device and computer-readable storage medium

Applications Claiming Priority (1)

Application NumberPriority DateFiling DateTitle
CN202010654953.5ACN111787224B (en)2020-07-102020-07-10Image acquisition method, terminal device and computer-readable storage medium

Publications (2)

Publication NumberPublication Date
CN111787224Atrue CN111787224A (en)2020-10-16
CN111787224B CN111787224B (en)2022-07-12

Family

ID=72758941

Family Applications (2)

Application NumberTitlePriority DateFiling Date
CN202010654953.5AActiveCN111787224B (en)2020-07-102020-07-10Image acquisition method, terminal device and computer-readable storage medium
CN202180044452.8APendingCN115812312A (en)2020-07-102021-06-21Image acquisition method, terminal device and computer-readable storage medium

Family Applications After (1)

Application NumberTitlePriority DateFiling Date
CN202180044452.8APendingCN115812312A (en)2020-07-102021-06-21Image acquisition method, terminal device and computer-readable storage medium

Country Status (2)

CountryLink
CN (2)CN111787224B (en)
WO (1)WO2022007622A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN112887603A (en)*2021-01-262021-06-01维沃移动通信有限公司Shooting preview method and device and electronic equipment
WO2022007622A1 (en)*2020-07-102022-01-13深圳传音控股股份有限公司Image acquisition method, terminal device and computer-readable storage medium

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN116051368B (en)*2022-06-292023-10-20荣耀终端有限公司 Image processing methods and related equipment

Citations (32)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN1992802A (en)*2005-12-192007-07-04卡西欧计算机株式会社Imaging apparatus and imaging method
US20130258160A1 (en)*2012-03-292013-10-03Sony Mobile Communications Inc.Portable device, photographing method, and program
CN104168414A (en)*2013-05-172014-11-26光道视觉科技股份有限公司Method for shooting and splicing object images
CN104349063A (en)*2014-10-272015-02-11东莞宇龙通信科技有限公司Method, device and terminal for controlling camera shooting
CN104767937A (en)*2015-03-272015-07-08深圳市艾优尼科技有限公司Photographing method
CN104967775A (en)*2015-06-052015-10-07深圳市星苑科技有限公司Zoom lens imaging apparatus and method
CN204721459U (en)*2015-06-052015-10-21深圳市星苑科技有限公司A kind of device of zoom lens imaging
CN105676563A (en)*2016-03-312016-06-15深圳市极酷威视科技有限公司Zoom camera focusing method and camera
WO2016119150A1 (en)*2015-01-282016-08-04宇龙计算机通信科技(深圳)有限公司Photographing method of mobile terminal having multiple cameras and mobile terminal
CN105847674A (en)*2016-03-252016-08-10维沃移动通信有限公司Preview image processing method based on mobile terminal, and mobile terminal therein
CN106131408A (en)*2016-07-112016-11-16深圳市金立通信设备有限公司A kind of image processing method and terminal
CN106254780A (en)*2016-08-312016-12-21宇龙计算机通信科技(深圳)有限公司A kind of dual camera camera control method, photographing control device and terminal
CN106664356A (en)*2014-08-142017-05-10三星电子株式会社Image photographing apparatus, image photographing system for performing photographing by using multiple image photographing apparatuses, and image photographing methods thereof
CN106791376A (en)*2016-11-292017-05-31广东欧珀移动通信有限公司 Imaging device, control method, control device and electronic device
CN106791377A (en)*2016-11-292017-05-31广东欧珀移动通信有限公司 Control method, control device and electronic device
CN107360364A (en)*2017-06-282017-11-17维沃移动通信有限公司A kind of image capturing method and master mobile terminal
CN107690649A (en)*2015-06-232018-02-13三星电子株式会社Digital filming device and its operating method
WO2018045945A1 (en)*2016-09-062018-03-15努比亚技术有限公司Focusing method and terminal, storage medium
CN108769485A (en)*2018-06-272018-11-06北京小米移动软件有限公司Electronic equipment
CN109194881A (en)*2018-11-292019-01-11珠海格力电器股份有限公司Image processing method, system and terminal
CN109361794A (en)*2018-11-192019-02-19Oppo广东移动通信有限公司Zoom control method and device of mobile terminal, storage medium and mobile terminal
CN109436344A (en)*2018-11-162019-03-08航宇救生装备有限公司Airborne photography gondola based on parachute ballistic trajectory
CN109639997A (en)*2018-12-202019-04-16Oppo广东移动通信有限公司Image processing method, electronic device, and medium
CN110072058A (en)*2019-05-282019-07-30珠海格力电器股份有限公司Image shooting device and method and terminal
CN110248101A (en)*2019-07-192019-09-17Oppo广东移动通信有限公司Focusing method and device, electronic equipment and computer readable storage medium
CN110312075A (en)*2019-06-282019-10-08Oppo广东移动通信有限公司Device imaging method and device, storage medium and electronic device
CN110536057A (en)*2019-08-302019-12-03Oppo广东移动通信有限公司Image processing method and device, electronic equipment and computer readable storage medium
CN110830756A (en)*2018-08-072020-02-21华为技术有限公司 A monitoring method and device
CN111183632A (en)*2018-10-122020-05-19华为技术有限公司 Image capturing method and electronic device
CN111292278A (en)*2019-07-302020-06-16展讯通信(上海)有限公司Image fusion method and device, storage medium and terminal
CN111373727A (en)*2018-03-262020-07-03华为技术有限公司 A shooting method, device and equipment
CN111654629A (en)*2020-06-112020-09-11展讯通信(上海)有限公司Camera switching method and device, electronic equipment and readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
KR20170020069A (en)*2015-08-132017-02-22엘지전자 주식회사Mobile terminal and image capturing method thereof
CN106131397A (en)*2016-06-212016-11-16维沃移动通信有限公司A kind of method that multi-medium data shows and electronic equipment
WO2018076460A1 (en)*2016-10-282018-05-03华为技术有限公司Photographing method for terminal, and terminal
US10834310B2 (en)*2017-08-162020-11-10Qualcomm IncorporatedMulti-camera post-capture image processing
CN111083380B (en)*2019-12-312021-06-11维沃移动通信有限公司Video processing method, electronic equipment and storage medium
CN111787224B (en)*2020-07-102022-07-12深圳传音控股股份有限公司Image acquisition method, terminal device and computer-readable storage medium

Patent Citations (32)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
CN1992802A (en)*2005-12-192007-07-04卡西欧计算机株式会社Imaging apparatus and imaging method
US20130258160A1 (en)*2012-03-292013-10-03Sony Mobile Communications Inc.Portable device, photographing method, and program
CN104168414A (en)*2013-05-172014-11-26光道视觉科技股份有限公司Method for shooting and splicing object images
CN106664356A (en)*2014-08-142017-05-10三星电子株式会社Image photographing apparatus, image photographing system for performing photographing by using multiple image photographing apparatuses, and image photographing methods thereof
CN104349063A (en)*2014-10-272015-02-11东莞宇龙通信科技有限公司Method, device and terminal for controlling camera shooting
WO2016119150A1 (en)*2015-01-282016-08-04宇龙计算机通信科技(深圳)有限公司Photographing method of mobile terminal having multiple cameras and mobile terminal
CN104767937A (en)*2015-03-272015-07-08深圳市艾优尼科技有限公司Photographing method
CN104967775A (en)*2015-06-052015-10-07深圳市星苑科技有限公司Zoom lens imaging apparatus and method
CN204721459U (en)*2015-06-052015-10-21深圳市星苑科技有限公司A kind of device of zoom lens imaging
CN107690649A (en)*2015-06-232018-02-13三星电子株式会社Digital filming device and its operating method
CN105847674A (en)*2016-03-252016-08-10维沃移动通信有限公司Preview image processing method based on mobile terminal, and mobile terminal therein
CN105676563A (en)*2016-03-312016-06-15深圳市极酷威视科技有限公司Zoom camera focusing method and camera
CN106131408A (en)*2016-07-112016-11-16深圳市金立通信设备有限公司A kind of image processing method and terminal
CN106254780A (en)*2016-08-312016-12-21宇龙计算机通信科技(深圳)有限公司A kind of dual camera camera control method, photographing control device and terminal
WO2018045945A1 (en)*2016-09-062018-03-15努比亚技术有限公司Focusing method and terminal, storage medium
CN106791376A (en)*2016-11-292017-05-31广东欧珀移动通信有限公司 Imaging device, control method, control device and electronic device
CN106791377A (en)*2016-11-292017-05-31广东欧珀移动通信有限公司 Control method, control device and electronic device
CN107360364A (en)*2017-06-282017-11-17维沃移动通信有限公司A kind of image capturing method and master mobile terminal
CN111373727A (en)*2018-03-262020-07-03华为技术有限公司 A shooting method, device and equipment
CN108769485A (en)*2018-06-272018-11-06北京小米移动软件有限公司Electronic equipment
CN110830756A (en)*2018-08-072020-02-21华为技术有限公司 A monitoring method and device
CN111183632A (en)*2018-10-122020-05-19华为技术有限公司 Image capturing method and electronic device
CN109436344A (en)*2018-11-162019-03-08航宇救生装备有限公司Airborne photography gondola based on parachute ballistic trajectory
CN109361794A (en)*2018-11-192019-02-19Oppo广东移动通信有限公司Zoom control method and device of mobile terminal, storage medium and mobile terminal
CN109194881A (en)*2018-11-292019-01-11珠海格力电器股份有限公司Image processing method, system and terminal
CN109639997A (en)*2018-12-202019-04-16Oppo广东移动通信有限公司Image processing method, electronic device, and medium
CN110072058A (en)*2019-05-282019-07-30珠海格力电器股份有限公司Image shooting device and method and terminal
CN110312075A (en)*2019-06-282019-10-08Oppo广东移动通信有限公司Device imaging method and device, storage medium and electronic device
CN110248101A (en)*2019-07-192019-09-17Oppo广东移动通信有限公司Focusing method and device, electronic equipment and computer readable storage medium
CN111292278A (en)*2019-07-302020-06-16展讯通信(上海)有限公司Image fusion method and device, storage medium and terminal
CN110536057A (en)*2019-08-302019-12-03Oppo广东移动通信有限公司Image processing method and device, electronic equipment and computer readable storage medium
CN111654629A (en)*2020-06-112020-09-11展讯通信(上海)有限公司Camera switching method and device, electronic equipment and readable storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
WO2022007622A1 (en)*2020-07-102022-01-13深圳传音控股股份有限公司Image acquisition method, terminal device and computer-readable storage medium
CN112887603A (en)*2021-01-262021-06-01维沃移动通信有限公司Shooting preview method and device and electronic equipment
CN112887603B (en)*2021-01-262023-01-24维沃移动通信有限公司 Shooting preview method, device and electronic equipment

Also Published As

Publication numberPublication date
WO2022007622A1 (en)2022-01-13
CN115812312A (en)2023-03-17
CN111787224B (en)2022-07-12

Similar Documents

PublicationPublication DateTitle
CN111294517B (en)Image processing method and mobile terminal
US10311649B2 (en)Systems and method for performing depth based image editing
CN110572581B (en) Method and device for acquiring zoom blurred image based on terminal equipment
US10009543B2 (en)Method and apparatus for displaying self-taken images
US9154687B2 (en)Imaging device and imaging method
CN111787224B (en)Image acquisition method, terminal device and computer-readable storage medium
CN105827964A (en)Image processing method and mobile terminal
KR20130112574A (en)Apparatus and method for improving quality of enlarged image
CN101841657A (en)Camera with image correction function, device and method for correcting image
CN113141450B (en)Shooting method, shooting device, electronic equipment and medium
CN112887617B (en) A shooting method, device and electronic equipment
EP4287610A1 (en)Focusing method and apparatus, electronic device, and medium
CN108810326B (en) A photographing method, device and mobile terminal
CN114500837B (en)Shooting method and device and electronic equipment
CN112529778B (en)Image stitching method and device of multi-camera equipment, storage medium and terminal
CN110602392A (en)Control method, imaging module, electronic device and computer-readable storage medium
CN112911059A (en)Photographing method and device, electronic equipment and readable storage medium
CN112532875B (en)Terminal device, image processing method and device thereof, and storage medium
CN112640430A (en)Imaging element, imaging device, image data processing method, and program
CN112887624A (en)Shooting method and device and electronic equipment
RU2792413C1 (en)Image processing method and mobile terminal
US20240346619A1 (en)Image capturing device and zooming method for depth-of-field image thereof
EP3041219B1 (en)Image processing device and method, and program
CN120640121A (en)Image processing method, device, electronic equipment, storage medium and product
CN117528250A (en)Multimedia file processing method, multimedia file processing device and electronic equipment

Legal Events

DateCodeTitleDescription
PB01Publication
PB01Publication
SE01Entry into force of request for substantive examination
SE01Entry into force of request for substantive examination
GR01Patent grant
GR01Patent grant
