技术领域Technical Field
本申请涉及终端技术领域,尤其涉及一种虚化方法和装置。The present application relates to the field of terminal technologies, and in particular, to a blurring method and apparatus.
背景技术Background Art
随着互联网的普及和发展,人们对于终端设备的功能需求也越发多样化。例如在拍摄或者视频录制过程中,终端设备可以对拍摄得到的照片、录制的视频或者预览画面进行虚化处理,以获得较好的空间感和电影感。With the popularization and development of the Internet, people's functional requirements for terminal devices are becoming increasingly diverse. For example, during photographing or video recording, a terminal device can blur captured photos, recorded videos, or preview images to obtain a better sense of space and a more cinematic look.
通常情况下,终端设备可以基于双目摄像头(包括主路摄像头以及辅路摄像头)获取主路图像数据以及辅路图像数据,利用主路图像数据以及辅路图像数据计算得到深度图像,进而利用深度图像对主路图像数据进行虚化处理,使得主路图像数据中,位于后景的图像呈现出虚化的状态。Normally, the terminal device can obtain main image data and auxiliary image data based on a binocular camera (including a main camera and an auxiliary camera), calculate a depth image using the main image data and the auxiliary image data, and then use the depth image to blur the main image data, so that the image in the background of the main image data appears blurred.
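To make the conventional pipeline described above concrete, the following is a minimal, illustrative sketch (not the method claimed in this application) of estimating a disparity/depth map from a main image and an auxiliary image and blurring the background of the main image. It assumes rectified inputs and uses OpenCV; the matcher parameters, the foreground/background threshold, and the blur kernel are arbitrary assumptions.

```python
import cv2
import numpy as np

def blur_background(main_bgr, aux_bgr, fg_disparity_threshold=32.0):
    """Illustrative only: estimate disparity from the two camera images,
    treat low-disparity (far) pixels as background, and blur them."""
    main_gray = cv2.cvtColor(main_bgr, cv2.COLOR_BGR2GRAY)
    aux_gray = cv2.cvtColor(aux_bgr, cv2.COLOR_BGR2GRAY)

    # Semi-global block matching; real devices use calibrated, rectified
    # images, and the parameters here are arbitrary.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disparity = matcher.compute(main_gray, aux_gray).astype(np.float32) / 16.0

    # Far (small-disparity) pixels are treated as background; invalid
    # matches (negative values) also end up blurred in this toy example.
    background = disparity < fg_disparity_threshold
    blurred = cv2.GaussianBlur(main_bgr, (31, 31), 0)

    out = main_bgr.copy()
    out[background] = blurred[background]
    return out
```

In practice the depth map would be filtered and the blur strength would vary with depth, but the sketch shows where the depth image enters the blurring step.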
然而,终端设备经常会出现主路图像数据虚化异常的现象,例如,出现主路图像数据中的后景没有虚化,或者主路图像数据中的前景被虚化等现象。However, the terminal device often has abnormal blurring of the main image data, for example, the background in the main image data is not blurred, or the foreground in the main image data is blurred.
发明内容Summary of the invention
本申请实施例提供一种虚化方法和装置,终端设备可以基于第一图像的曝光时间以及第二图像的曝光时间确定目标曝光时间,并使得第一图像以及第二图像在目标曝光时间处对齐,减少曝光时间不同对于第一图像以及第二图像之间相差的影响,提高深度计算的准确度,进而提高虚化处理的准确性。The embodiments of the present application provide a blurring method and apparatus. A terminal device can determine a target exposure time based on the exposure time of a first image and the exposure time of a second image, and align the first image and the second image at the target exposure time, thereby reducing the influence of different exposure times on the disparity between the first image and the second image, improving the accuracy of depth calculation, and in turn improving the accuracy of blurring processing.
第一方面,本申请实施例提供一种虚化方法,终端设备包括第一摄像头以及第二摄像头,方法包括:终端设备获取第一图像以及第二图像;其中,第一图像是基于第一摄像头得到的,第二图像是基于第二摄像头得到的;终端设备根据目标曝光时间处理第二图像,或者根据目标曝光时间处理第一图像和第二图像,其中,处理后的第一图像为按照目标曝光时间进行曝光的图像,处理后的第二图像为按照目标曝光时间进行曝光的图像;目标曝光时间为第一图像的曝光时间或第二图像的曝光时间中的任一个曝光时间;终端设备基于处理后的第二图像,或者基于处理后的第一图像以及处理后的第二图像,对第一图像进行虚化处理。这样,终端设备可以基于第一图像的曝光时间以及第二图像的曝光时间确定目标曝光时间,并使得第一图像以及第二图像在目标曝光时间处对齐,减少曝光时间不同对于第一图像以及第二图像之间相差的影响,提高深度计算的准确度,进而提高虚化处理的准确性。In a first aspect, an embodiment of the present application provides a blurring method, wherein a terminal device includes a first camera and a second camera, and the method includes: the terminal device acquires a first image and a second image, wherein the first image is obtained based on the first camera, and the second image is obtained based on the second camera; the terminal device processes the second image according to a target exposure time, or processes the first image and the second image according to the target exposure time, wherein the processed first image is an image exposed according to the target exposure time, the processed second image is an image exposed according to the target exposure time, and the target exposure time is either the exposure time of the first image or the exposure time of the second image; and the terminal device blurs the first image based on the processed second image, or based on the processed first image and the processed second image. In this way, the terminal device can determine the target exposure time based on the exposure time of the first image and the exposure time of the second image, and align the first image and the second image at the target exposure time, thereby reducing the influence of different exposure times on the disparity between the first image and the second image, improving the accuracy of depth calculation, and thus improving the accuracy of blurring processing.
其中,第一摄像头可以为本申请实施例中描述的主路摄像头,第一图像可以为本申请实施例中描述的主路图像数据;第二摄像头可以为本申请实施例中描述的辅路摄像头,第二图像可以为本申请实施例中描述的辅路图像数据;第一变换矩阵可以为本申请实施例中描述的第一warp矩阵;第二变换矩阵可以为本申请实施例中描述的第二warp矩阵。The first camera may be the main camera described in the embodiments of the present application, and the first image may be the main image data described in the embodiments of the present application; the second camera may be the auxiliary camera described in the embodiments of the present application, and the second image may be the auxiliary image data described in the embodiments of the present application; the first transformation matrix may be the first warp matrix described in the embodiments of the present application; and the second transformation matrix may be the second warp matrix described in the embodiments of the present application.
在一种可能的实现方式中,终端设备根据目标曝光时间处理第二图像,或者根据目标曝光时间处理第一图像和第二图像,包括:终端设备获取第一变换矩阵以及第二变换矩阵;其中,第一变换矩阵用于将第一图像中的任一行像素点对齐到目标曝光时间对应的位置处;第二变换矩阵用于将第二图像中的任一行像素点对齐到目标曝光时间对应的位置处;终端设备利用第一变换矩阵处理第一图像,以及利用第二变换矩阵处理第二图像。这样,由于第一图像在逐行曝光时交错曝光的时间与第二图像在逐行曝光时交错曝光的时间不一致,使得两路图像数据的相差变化较大,因此终端设备可以通过分别获取对齐到目标曝光时间时对应的第一变换矩阵对第一图像进行矫正,以及基于对齐到目标曝光时间时对应的第二变换矩阵对第二图像分别进行校正,减少不同曝光时间带来的相差影响。In a possible implementation, the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, including: the terminal device obtains a first transformation matrix and a second transformation matrix, wherein the first transformation matrix is used to align any row of pixels in the first image to a position corresponding to the target exposure time, and the second transformation matrix is used to align any row of pixels in the second image to a position corresponding to the target exposure time; and the terminal device processes the first image using the first transformation matrix, and processes the second image using the second transformation matrix. In this way, because the staggered exposure times of the rows of the first image during line-by-line (rolling-shutter) exposure are inconsistent with those of the second image, the disparity between the two channels of image data varies greatly; therefore, the terminal device can correct the first image using the first transformation matrix obtained for alignment to the target exposure time, and correct the second image using the second transformation matrix obtained for alignment to the target exposure time, thereby reducing the disparity effect caused by the different exposure times.
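A minimal sketch of this implementation is given below, assuming the two 3x3 transformation (warp) matrices have already been estimated elsewhere (how they are computed is not specified here); applying each matrix to its image aligns both images to the target exposure time. Function and parameter names are illustrative, not part of this application.

```python
import cv2

def align_pair_to_target_exposure(first_image, second_image, warp_1, warp_2):
    """warp_1 / warp_2: 3x3 matrices (assumed already estimated) that map
    each image's pixel rows to where they would be if the whole frame had
    been exposed at the target exposure time."""
    h, w = first_image.shape[:2]
    aligned_first = cv2.warpPerspective(first_image, warp_1, (w, h))
    aligned_second = cv2.warpPerspective(second_image, warp_2, (w, h))
    return aligned_first, aligned_second
```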
在一种可能的实现方式中,终端设备基于处理后的第二图像,或者基于处理后的第一图像以及处理后的第二图像,对第一图像进行虚化处理,包括:终端设备基于处理后的第一图像以及处理后的第二图像,确定第一深度图像;终端设备基于第一深度图像,对第一图像进行虚化处理。这样,终端设备可以通过处理后的第一图像以及处理后的第二图像得到相对准确的深度信息,进而提高虚化处理的效果。In a possible implementation, the terminal device performs blur processing on the first image based on the processed second image, or based on the processed first image and the processed second image, including: the terminal device determines the first depth image based on the processed first image and the processed second image; the terminal device performs blur processing on the first image based on the first depth image. In this way, the terminal device can obtain relatively accurate depth information through the processed first image and the processed second image, thereby improving the blur processing effect.
在一种可能的实现方式中,终端设备基于第一深度图像,对第一图像进行虚化处理,包括:终端设备基于第一变换矩阵对应的逆矩阵对第一深度图像进行矫正,得到第二深度图像;终端设备基于第二深度图像,对第一图像进行虚化处理。这样,由于通常情况下终端设备是基于第一摄像头对应的图像进行虚化处理的,因此可以将第一深度图像校正为与第一摄像头相关的深度图像,便于对第一摄像头对应的图像进行虚化处理。In a possible implementation, the terminal device performs blurring processing on the first image based on the first depth image, including: the terminal device corrects the first depth image based on the inverse matrix corresponding to the first transformation matrix to obtain the second depth image; the terminal device performs blurring processing on the first image based on the second depth image. In this way, since the terminal device usually performs blurring processing based on the image corresponding to the first camera, the first depth image can be corrected to a depth image related to the first camera, so as to facilitate blurring processing of the image corresponding to the first camera.
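The correction step can be sketched as follows, assuming the first depth image and the first transformation matrix from the previous step are available; inverting the first warp matrix maps the depth back onto the geometry of the original first-camera image. This is an illustrative sketch under those assumptions, not the application's implementation.

```python
import cv2
import numpy as np

def correct_depth_to_first_camera(first_depth, warp_1):
    """Map a depth map computed on the warped pair back onto the original
    (unwarped) first image by applying the inverse of warp_1."""
    h, w = first_depth.shape[:2]
    warp_1_inv = np.linalg.inv(warp_1)
    # first_depth is assumed to be a single-channel float32 map.
    second_depth = cv2.warpPerspective(first_depth, warp_1_inv, (w, h))
    return second_depth
```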
在一种可能的实现方式中,终端设备根据目标曝光时间处理第二图像,或者根据目标曝光时间处理第一图像和第二图像,包括:终端设备获取第一变换矩阵以及第二变换矩阵;其中,第一变换矩阵用于将第一图像中的任一行像素点对齐到目标曝光时间对应的位置处;第二变换矩阵用于将第二图像中的任一行像素点对齐到目标曝光时间对应的位置处;终端设备根据第一变换矩阵对应的逆矩阵以及第二变换矩阵,处理第二图像;其中,处理后的第二图像的曝光时间与第一图像的曝光时间相同。这样,使得终端设备可以利用第一变换矩阵对应的逆矩阵以及第二变换矩阵,模拟出与第一图像的曝光时间相同的图像,减少曝光时间不同对于第一图像以及处理后的第二图像之间的相差影响;并且,简化利用第一变换矩阵以及第二变换矩阵对图像数据进行矫正并确定深度图像的步骤,节省内存占用,并提高深度图像计算以及生成虚化图像的速度。In a possible implementation, the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, including: the terminal device obtains the first transformation matrix and the second transformation matrix, wherein the first transformation matrix is used to align any row of pixels in the first image to the position corresponding to the target exposure time, and the second transformation matrix is used to align any row of pixels in the second image to the position corresponding to the target exposure time; and the terminal device processes the second image according to the inverse matrix of the first transformation matrix and the second transformation matrix, wherein the exposure time of the processed second image is the same as the exposure time of the first image. In this way, the terminal device can use the inverse matrix of the first transformation matrix together with the second transformation matrix to simulate an image with the same exposure time as the first image, reducing the effect of different exposure times on the disparity between the first image and the processed second image; moreover, this simplifies the steps of correcting the image data with the first transformation matrix and the second transformation matrix and determining the depth image, saves memory, and improves the speed of depth image calculation and of generating blurred images.
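A sketch of this variant, under the same assumptions about the two warp matrices as above: composing the inverse of the first transformation matrix with the second transformation matrix yields a single warp that is applied only to the second image, so that the processed second image appears exposed at the same time as the untouched first image. The composition order below follows the convention that each matrix maps source pixel coordinates to destination coordinates.

```python
import cv2
import numpy as np

def warp_second_to_first_exposure(second_image, warp_1, warp_2):
    """Simulate a second image exposed at the same time as the first one:
    warp_2 maps the second image to the target exposure time, and the
    inverse of warp_1 then maps that result into the first image's frame."""
    h, w = second_image.shape[:2]
    combined = np.linalg.inv(warp_1) @ warp_2  # one composed warp
    return cv2.warpPerspective(second_image, combined, (w, h))
```

Because only one warp is applied and the first image is left untouched, depth can then be computed directly against the original first image, which matches the memory and speed benefit described in this implementation.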
在一种可能的实现方式中,终端设备基于处理后的第二图像,或者基于处理后的第一图像以及处理后的第二图像,对第一图像进行虚化处理,包括:终端设备基于第一图像以及处理后的第二图像,确定第三深度图像;终端设备基于第三深度图像,对第一图像进行虚化处理。这样,使得终端设备可以基于相差变化较小的第一图像以及处理后的第二图像,准确地计算深度数据,进而提高虚化处理的效果。In a possible implementation, the terminal device performs blurring processing on the first image based on the processed second image, or based on the processed first image and the processed second image, including: the terminal device determines a third depth image based on the first image and the processed second image; and the terminal device performs blurring processing on the first image based on the third depth image. In this way, the terminal device can accurately calculate depth data based on the first image and the processed second image, whose disparity varies little, thereby improving the blurring effect.
在一种可能的实现方式中,方法还包括:终端设备对第一图像以及第二图像分别进行图像前处理,得到图像前处理后的第一图像以及图像前处理后的第二图像;终端设备根据目标曝光时间处理第二图像,或者根据目标曝光时间处理第一图像和第二图像,包括:终端设备根据目标曝光时间处理图像前处理后的第二图像,或者根据目标曝光时间处理图像前处理后的第一图像和图像前处理后的第二图像。这样,终端设备可以通过图像前处理,将RAW格式的图像转化为YUV格式的图像,节省虚化过程处理时的内存占用。In a possible implementation, the method further includes: the terminal device performs image preprocessing on the first image and the second image respectively to obtain the first image after image preprocessing and the second image after image preprocessing; the terminal device processes the second image according to the target exposure time, or processes the first image and the second image according to the target exposure time, including: the terminal device processes the second image after image preprocessing according to the target exposure time, or processes the first image after image preprocessing and the second image after image preprocessing according to the target exposure time. In this way, the terminal device can convert the RAW format image into the YUV format image through image preprocessing, saving memory usage during the blurring process.
第二方面,本申请实施例提供一种虚化装置,获取单元,用于获取第一图像以及第二图像;其中,第一图像是基于第一摄像头得到的,第二图像是基于第二摄像头得到的;处理单元,用于根据目标曝光时间处理第二图像,或者根据目标曝光时间处理第一图像和第二图像,其中,处理后的第一图像为按照目标曝光时间进行曝光的图像,处理后的第二图像为按照目标曝光时间进行曝光的图像;目标曝光时间为第一图像的曝光时间或第二图像的曝光时间中的任一个曝光时间;处理单元,还用于基于处理后的第二图像,或者基于处理后的第一图像以及处理后的第二图像,对第一图像进行虚化处理。In a second aspect, an embodiment of the present application provides a blurring device, including an acquisition unit for acquiring a first image and a second image; wherein the first image is obtained based on a first camera, and the second image is obtained based on a second camera; a processing unit for processing the second image according to a target exposure time, or processing the first image and the second image according to the target exposure time, wherein the processed first image is an image exposed according to the target exposure time, and the processed second image is an image exposed according to the target exposure time; the target exposure time is either the exposure time of the first image or the exposure time of the second image; the processing unit is also used to blur the first image based on the processed second image, or based on the processed first image and the processed second image.
在一种可能的实现方式中,获取单元,具体用于获取第一变换矩阵以及第二变换矩阵;其中,第一变换矩阵用于将第一图像中的任一行像素点对齐到目标曝光时间对应的位置处;第二变换矩阵用于将第二图像中的任一行像素点对齐到目标曝光时间对应的位置处;处理单元,具体用于利用第一变换矩阵处理第一图像,以及利用第二变换矩阵处理第二图像。In one possible implementation, the acquisition unit is specifically used to acquire a first transformation matrix and a second transformation matrix; wherein the first transformation matrix is used to align any row of pixel points in the first image to a position corresponding to a target exposure time; the second transformation matrix is used to align any row of pixel points in the second image to a position corresponding to the target exposure time; and the processing unit is specifically used to process the first image using the first transformation matrix, and to process the second image using the second transformation matrix.
在一种可能的实现方式中,处理单元,具体用于:基于处理后的第一图像以及处理后的第二图像,确定第一深度图像;基于第一深度图像,对第一图像进行虚化处理。In a possible implementation, the processing unit is specifically configured to: determine a first depth image based on the processed first image and the processed second image; and blur the first image based on the first depth image.
在一种可能的实现方式中,处理单元,具体用于:基于第一变换矩阵对应的逆矩阵对第一深度图像进行矫正,得到第二深度图像;基于第二深度图像,对第一图像进行虚化处理。In a possible implementation, the processing unit is specifically configured to: correct the first depth image based on an inverse matrix corresponding to the first transformation matrix to obtain a second depth image; and blur the first image based on the second depth image.
在一种可能的实现方式中,获取单元,具体用于获取第一变换矩阵以及第二变换矩阵;其中,第一变换矩阵用于将第一图像中的任一行像素点对齐到目标曝光时间对应的位置处;第二变换矩阵用于将第二图像中的任一行像素点对齐到目标曝光时间对应的位置处;处理单元,具体用于根据第一变换矩阵对应的逆矩阵以及第二变换矩阵,处理第二图像;其中,处理后的第二图像的曝光时间与第一图像的曝光时间相同。In a possible implementation, the acquisition unit is specifically used to acquire a first transformation matrix and a second transformation matrix; wherein the first transformation matrix is used to align any row of pixel points in the first image to a position corresponding to a target exposure time; the second transformation matrix is used to align any row of pixel points in the second image to a position corresponding to the target exposure time; the processing unit is specifically used to process the second image according to an inverse matrix corresponding to the first transformation matrix and the second transformation matrix; wherein the exposure time of the processed second image is the same as the exposure time of the first image.
在一种可能的实现方式中,处理单元,具体用于:基于第一图像以及处理后的第二图像,确定第三深度图像;基于第三深度图像,对第一图像进行虚化处理。In a possible implementation manner, the processing unit is specifically configured to: determine a third depth image based on the first image and the processed second image; and blur the first image based on the third depth image.
在一种可能的实现方式中,处理单元,还用于:对第一图像以及第二图像分别进行图像前处理,得到图像前处理后的第一图像以及图像前处理后的第二图像;根据目标曝光时间处理图像前处理后的第二图像,或者根据目标曝光时间处理图像前处理后的第一图像和图像前处理后的第二图像。In one possible implementation, the processing unit is further used to: perform image pre-processing on the first image and the second image respectively to obtain the first image after image pre-processing and the second image after image pre-processing; process the second image after image pre-processing according to the target exposure time, or process the first image after image pre-processing and the second image after image pre-processing according to the target exposure time.
第三方面,本申请实施例提供一种终端设备,包括处理器和存储器,存储器用于存储代码指令;处理器用于运行代码指令,使得终端设备执行如第一方面或第一方面的任一种实现方式中描述的虚化方法。In a third aspect, an embodiment of the present application provides a terminal device, including a processor and a memory, wherein the memory is used to store code instructions, and the processor is used to run the code instructions so that the terminal device executes the blurring method described in the first aspect or any one of the implementations of the first aspect.
第四方面,本申请实施例提供一种计算机可读存储介质,计算机可读存储介质存储有指令,当指令被执行时,使得计算机执行如第一方面或第一方面的任一种实现方式中描述的虚化方法。In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium that stores instructions. When the instructions are executed, a computer is caused to execute the blurring method described in the first aspect or any one of the implementations of the first aspect.
第五方面,一种计算机程序产品,包括计算机程序,当计算机程序被运行时,使得计算机执行如第一方面或第一方面的任一种实现方式中描述的虚化方法。In a fifth aspect, a computer program product includes a computer program. When the computer program is run, a computer is caused to execute the blurring method described in the first aspect or any one of the implementations of the first aspect.
应当理解的是,本申请的第二方面至第五方面与本申请的第一方面的技术方案相对应,各方面及对应的可行实施方式所取得的有益效果相似,不再赘述。It should be understood that the second to fifth aspects of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by each aspect and the corresponding feasible implementation methods are similar and will not be repeated here.
附图说明BRIEF DESCRIPTION OF THE DRAWINGS
图1为本申请实施例提供的一种场景示意图;FIG1 is a schematic diagram of a scenario provided in an embodiment of the present application;
图2为本申请实施例提供的一种终端设备的硬件结构示意图;FIG2 is a schematic diagram of the hardware structure of a terminal device provided in an embodiment of the present application;
图3为本申请实施例提供的一种终端设备的软件结构示意图;FIG3 is a schematic diagram of a software structure of a terminal device provided in an embodiment of the present application;
图4为本申请实施例提供的一种虚化方法的流程示意图;FIG4 is a schematic diagram of a flow chart of a virtualization method provided in an embodiment of the present application;
图5为本申请实施例提供的一种获取图像数据的界面示意图;FIG5 is a schematic diagram of an interface for acquiring image data provided in an embodiment of the present application;
图6为本申请实施例提供的一种生成图像数据的原理示意图;FIG6 is a schematic diagram of a principle for generating image data provided by an embodiment of the present application;
图7为本申请实施例提供的一种图像矫正的原理示意图;FIG7 is a schematic diagram of the principle of image correction provided by an embodiment of the present application;
图8为本申请实施例提供的另一种虚化方法的流程示意图;FIG8 is a schematic diagram of a flow chart of another blurring method provided in an embodiment of the present application;
图9为本申请实施例提供的一种图像前处理的流程示意图;FIG9 is a schematic diagram of a process of image pre-processing provided in an embodiment of the present application;
图10为本申请实施例提供的一种虚化装置的结构示意图;FIG10 is a schematic diagram of the structure of a blurring device provided in an embodiment of the present application;
图11为本申请实施例提供的另一种终端设备的硬件结构示意图。FIG. 11 is a schematic diagram of the hardware structure of another terminal device provided in an embodiment of the present application.
具体实施方式Detailed Description of Embodiments
下面对本申请实施例中所描述的词汇进行说明。可以理解,该说明是为更加清楚的解释本申请实施例,并不必然构成对本申请实施例的限定。The following is an explanation of the vocabulary described in the embodiments of the present application. It is understood that the explanation is for a clearer explanation of the embodiments of the present application and does not necessarily constitute a limitation on the embodiments of the present application.
为了便于清楚描述本申请实施例的技术方案,在本申请的实施例中,采用了"第一"、"第二"等字样对功能和作用基本相同的相同项或相似项进行区分。例如,第一值和第二值仅仅是为了区分不同的值,并不对其先后顺序进行限定。本领域技术人员可以理解"第一"、"第二"等字样并不对数量和执行次序进行限定,并且"第一"、"第二"等字样也并不限定一定不同。In order to clearly describe the technical solutions of the embodiments of the present application, words such as "first" and "second" are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same functions and effects. For example, a first value and a second value are merely used to distinguish different values, and their order is not limited. Those skilled in the art can understand that the words "first" and "second" do not limit the quantity or the execution order, nor do they require that the items so described be necessarily different.
需要说明的是,本申请中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其他实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。It should be noted that, in this application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or descriptions. Any embodiment or design described as "exemplary" or "for example" in this application should not be interpreted as being more preferred or more advantageous than other embodiments or designs. Specifically, the use of words such as "exemplary" or "for example" is intended to present related concepts in a specific way.
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a和b,a和c,b和c,或a、b和c,其中a,b,c可以是单个,也可以是多个。In this application, "at least one" means one or more, and "plurality" means two or more. "And/or" describes the association relationship of associated objects, indicating that three relationships may exist. For example, A and/or B can mean: A exists alone, A and B exist at the same time, and B exists alone, where A and B can be singular or plural. The character "/" generally indicates that the previous and next associated objects are in an "or" relationship. "At least one of the following" or similar expressions refers to any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c can mean: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, c can be single or multiple.
示例性的,图1为本申请实施例提供的一种场景示意图。如图1所示,该场景中可以包括终端设备101,例如该终端设备101可以为手机等,以及利用终端设备101拍摄得到的画面102,该画面102中可以包括位于前景的用户103以及位于后景的用户104。For example, Fig. 1 is a schematic diagram of a scenario provided by an embodiment of the present application. As shown in Fig. 1, the scenario may include a terminal device 101, for example, the terminal device 101 may be a mobile phone, etc., and a picture 102 captured by the terminal device 101, and the picture 102 may include a user 103 in the foreground and a user 104 in the background.
通常情况下,当终端设备接收到用户打开拍照功能或者视频功能时,终端设备可以基于摄像头采集包含画面102的图像数据。例如,终端设备可以利用主路摄像头获取画面102对应的主路图像数据、以及利用辅路摄像头获取画面102对应的辅路图像数据;通过对主路图像数据以及辅路图像数据的深度计算获取深度图像数据;进而利用深度图像数据对主路图像数据进行虚化处理,得到虚化处理结果。例如,该虚化处理结果可以为图1中的画面102,例如该画面102中可以虚化显示用户104以及未虚化显示用户103,或者画面102中可以未虚化显示用户104以及未虚化显示用户103,或者画面102中可以虚化显示用户104以及虚化显示用户103,本申请实施例中对最终的虚化处理结果不做具体限定。Normally, when the terminal device receives a user operation to turn on the photographing function or the video function, the terminal device can collect image data including the picture 102 based on the cameras. For example, the terminal device can use the main camera to obtain the main image data corresponding to the picture 102, and use the auxiliary camera to obtain the auxiliary image data corresponding to the picture 102; obtain depth image data by performing depth calculation on the main image data and the auxiliary image data; and then use the depth image data to blur the main image data to obtain a blurring result. For example, the blurring result can be the picture 102 in Figure 1; in the picture 102, the user 104 may be displayed blurred and the user 103 displayed unblurred, or the user 104 may be displayed unblurred and the user 103 displayed unblurred, or the user 104 may be displayed blurred and the user 103 displayed blurred. The embodiments of the present application do not specifically limit the final blurring result.
然而,由于主路图像数据以及辅路图像数据的出图时间不一致、以及主路摄像头以及辅路摄像头的设置参数不一致等原因对主路图像数据以及辅路图像数据的曝光时间产生影响,使得主路图像数据以及辅路图像数据之间的相差发生变化,并对由两路图像数据得到的深度信息产生影响,进而造成虚化异常的情况。其中,该出图时间也可以理解为曝光起始时间。However, factors such as inconsistent image output times of the main image data and the auxiliary image data, and inconsistent setting parameters of the main camera and the auxiliary camera, affect the exposure times of the main image data and the auxiliary image data, so that the disparity between the main image data and the auxiliary image data changes. This affects the depth information obtained from the two channels of image data and in turn causes abnormal blurring. The image output time can also be understood as the exposure start time.
有鉴于此,本申请实施例提供一种虚化方法,终端设备包括第一摄像头以及第二摄像头,方法包括:终端设备获取第一图像以及第二图像;其中,第一图像是基于第一摄像头得到的,第二图像是基于第二摄像头得到的;终端设备根据目标曝光时间处理第二图像,或者根据目标曝光时间处理第一图像和第二图像,其中,处理后的第一图像为按照目标曝光时间进行曝光的图像,处理后的第二图像为按照目标曝光时间进行曝光的图像;目标曝光时间为第一图像的曝光时间或第二图像的曝光时间中的任一个曝光时间;终端设备基于处理后的第二图像,或者基于处理后的第一图像以及处理后的第二图像,对第一图像进行虚化处理。使得终端设备可以基于第一图像的曝光时间以及第二图像的曝光时间确定目标曝光时间,并使得第一图像以及第二图像在目标曝光时间处对齐,在相同的曝光时间下,减少曝光时间不同对于第一图像以及第二图像之间相差的影响,提高深度计算的准确度,进而提高虚化处理的准确性。In view of this, an embodiment of the present application provides a blurring method, wherein a terminal device includes a first camera and a second camera, and the method includes: the terminal device acquires a first image and a second image, wherein the first image is obtained based on the first camera, and the second image is obtained based on the second camera; the terminal device processes the second image according to a target exposure time, or processes the first image and the second image according to the target exposure time, wherein the processed first image is an image exposed according to the target exposure time, the processed second image is an image exposed according to the target exposure time, and the target exposure time is either the exposure time of the first image or the exposure time of the second image; and the terminal device blurs the first image based on the processed second image, or based on the processed first image and the processed second image. In this way, the terminal device can determine the target exposure time based on the exposure time of the first image and the exposure time of the second image, and align the first image and the second image at the target exposure time, so that, under the same exposure time, the influence of different exposure times on the disparity between the first image and the second image is reduced, the accuracy of depth calculation is improved, and the accuracy of blurring processing is improved.
其中,第一摄像头可以为本申请实施例中描述的主路摄像头,第一图像可以为本申请实施例中描述的主路图像数据;第二摄像头可以为本申请实施例中描述的辅路摄像头,第二图像可以为本申请实施例中描述的辅路图像数据;第一变换矩阵可以为本申请实施例中描述的第一warp矩阵;第二变换矩阵可以为本申请实施例中描述的第二warp矩阵。The first camera may be the main camera described in the embodiments of the present application, and the first image may be the main image data described in the embodiments of the present application; the second camera may be the auxiliary camera described in the embodiments of the present application, and the second image may be the auxiliary image data described in the embodiments of the present application; the first transformation matrix may be the first warp matrix described in the embodiments of the present application; and the second transformation matrix may be the second warp matrix described in the embodiments of the present application.
可以理解的是,上述终端设备也可以称为终端(terminal)、用户设备(user equipment,UE)、移动台(mobile station,MS)、移动终端(mobile terminal,MT)等。终端设备可以为拥有双目摄像头的手机(mobile phone)、智能电视、穿戴式设备、平板电脑(Pad)、带无线收发功能的电脑、虚拟现实(virtual reality,VR)终端设备、增强现实(augmented reality,AR)终端设备、工业控制(industrial control)中的无线终端、无人驾驶(self-driving)中的无线终端、远程手术(remote medical surgery)中的无线终端、智能电网(smart grid)中的无线终端、运输安全(transportation safety)中的无线终端、智慧城市(smart city)中的无线终端、智慧家庭(smart home)中的无线终端等等。本申请的实施例对终端设备所采用的具体技术和具体设备形态不做限定。It is understandable that the above terminal device may also be referred to as a terminal, user equipment (UE), mobile station (MS), mobile terminal (MT), etc. The terminal device may be a mobile phone with a binocular camera, a smart TV, a wearable device, a tablet computer (Pad), a computer with a wireless transceiver function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, etc. The embodiments of the present application do not limit the specific technology and specific device form used by the terminal device.
因此,为了能够更好地理解本申请实施例,下面对本申请实施例的终端设备的结构进行介绍。示例性的,图2为本申请实施例提供的一种终端设备的结构示意图。Therefore, in order to better understand the embodiment of the present application, the structure of the terminal device of the embodiment of the present application is introduced below. For example, FIG2 is a schematic diagram of the structure of a terminal device provided in the embodiment of the present application.
终端设备可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,指示器192,摄像头193,以及显示屏194等。The terminal device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, an indicator 192, a camera 193, and a display screen 194, etc.
可以理解的是,本申请实施例示意的结构并不构成对终端设备的具体限定。在本申请另一些实施例中,终端设备可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the terminal device. In other embodiments of the present application, the terminal device may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently. The components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
处理器110可以包括一个或多个处理单元。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。处理器110中还可以设置存储器,用于存储指令和数据。The processor 110 may include one or more processing units. Different processing units may be independent devices or integrated into one or more processors. The processor 110 may also be provided with a memory for storing instructions and data.
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。The processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or integrated into one or more processors.
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为终端设备充电,也可以用于终端设备与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他终端设备,例如AR设备等。The USB interface 130 is an interface that complies with the USB standard specification, and can be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc. The USB interface 130 can be used to connect a charger to charge the terminal device, and can also be used to transfer data between the terminal device and peripheral devices. It can also be used to connect headphones to play audio through the headphones. The interface can also be used to connect other terminal devices, such as AR devices, etc.
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。电源管理模块141用于连接充电管理模块140与处理器110。The charging management module 140 is used to receive charging input from a charger, which may be a wireless charger or a wired charger. The power management module 141 is used to connect the charging management module 140 to the processor 110 .
终端设备的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。The wireless communication function of the terminal device can be implemented through antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, modem processor and baseband processor.
天线1和天线2用于发射和接收电磁波信号。终端设备中的天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. The antenna in the terminal device can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of the antenna.
移动通信模块150可以提供应用在终端设备上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。The mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G, etc., applied to terminal devices. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc. The mobile communication module 150 can receive electromagnetic waves from the antenna 1, and filter, amplify, etc. the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
无线通信模块160可以提供应用在终端设备上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM)等无线通信的解决方案。The wireless communication module 160 can provide wireless communication solutions applied to terminal devices, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), etc.
终端设备通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。The terminal device realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering.
显示屏194用于显示图像,视频等。显示屏194包括显示面板。在一些实施例中,终端设备可以包括1个或N个显示屏194,N为大于1的正整数。The display screen 194 is used to display images, videos, etc. The display screen 194 includes a display panel. In some embodiments, the terminal device may include 1 or N display screens 194, where N is a positive integer greater than 1.
终端设备可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。The terminal device can realize the shooting function through ISP, camera 193, video codec, GPU, display screen 194 and application processor.
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。ISP is used to process the data fed back by camera 193. For example, when taking a photo, the shutter is opened, and the light is transmitted to the camera photosensitive element through the lens. The light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to ISP for processing and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on the noise and brightness of the image. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, ISP can be set in camera 193.
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的红绿蓝(red green blue,RGB),YUV(或理解为灰度值、色彩以及饱和度)等格式的图像信号。Camera 193 is used to capture still images or videos. The object generates an optical image through the lens and projects it onto the photosensitive element. The photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal, and then passes the electrical signal to the ISP for conversion into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signals in standard red green blue (RGB), YUV (or understood as grayscale value, color and saturation) and other formats.
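For reference, the RGB-to-YUV relationship mentioned here is often the BT.601 form sketched below; the exact coefficients and value ranges used by any particular ISP/DSP are not specified in this application and are an assumption here.

```python
import numpy as np

def rgb_to_yuv_bt601(rgb):
    """Full-range BT.601 conversion; rgb is an (..., 3) float array in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b          # luma (grayscale value)
    u = -0.14713 * r - 0.28886 * g + 0.436 * b     # blue-difference chroma
    v = 0.615 * r - 0.51499 * g - 0.10001 * b      # red-difference chroma
    return np.stack([y, u, v], axis=-1)
```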
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当终端设备在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。The digital signal processor is used to process digital signals. In addition to processing digital image signals, it can also process other digital signals. For example, when the terminal device is selecting a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
视频编解码器用于对数字视频压缩或解压缩。终端设备可以支持一种或多种视频编解码器。这样,终端设备可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。Video codecs are used to compress or decompress digital videos. Terminal devices can support one or more video codecs. In this way, terminal devices can play or record videos in multiple coding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
摄像头193用于捕获静态图像或视频。在一些实施例中,终端设备可以包括1个或N个摄像头193,N为大于1的正整数。The camera 193 is used to capture static images or videos. In some embodiments, the terminal device may include 1 or N cameras 193, where N is a positive integer greater than 1.
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展终端设备的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the terminal device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and videos can be stored in the external memory card.
内部存储器121可以用于存储计算机可执行程序代码,可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。The internal memory 121 can be used to store computer executable program codes, and the executable program codes include instructions. The internal memory 121 can include a program storage area and a data storage area.
终端设备可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。The terminal device can implement audio functions such as music playing and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。终端设备可以通过扬声器170A收听音乐,或收听免提通话。受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当终端设备接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。耳机接口170D用于连接有线耳机。The audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. The speaker 170A, also known as the "speaker", is used to convert audio electrical signals into sound signals. The terminal device can listen to music or listen to hands-free calls through the speaker 170A. The receiver 170B, also known as the "earpiece", is used to convert audio electrical signals into sound signals. When the terminal device answers a call or voice message, the voice can be answered by placing the receiver 170B close to the human ear. The headphone interface 170D is used to connect wired headphones.
麦克风170C,也称"话筒","传声器",用于将声音信号转换为电信号。The microphone 170C, also called a "mike" or "mic", is used to convert sound signals into electrical signals.
传感器模块180可以包括陀螺仪传感器180A。该陀螺仪传感器180A可以用于确定终端设备的运动姿态。在一些实施例中,可以通过陀螺仪传感器180A确定终端设备围绕三个轴(即,x,y和z轴)的角加速度。陀螺仪传感器180A可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180A检测终端设备抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消终端设备的抖动,实现防抖。The sensor module 180 may include a gyro sensor 180A. The gyro sensor 180A may be used to determine the motion posture of the terminal device. In some embodiments, the angular acceleration of the terminal device around three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 180A. The gyro sensor 180A may be used for anti-shake shooting. Exemplarily, when the shutter is pressed, the gyro sensor 180A detects the angle of the terminal device shaking, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shaking of the terminal device through reverse motion to achieve anti-shake.
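As a rough, hedged illustration of the compensation described here (not taken from this application), the image shift caused by a small shake angle can be approximated from the lens focal length, and the lens is then driven by the opposite amount; the formula and the focal length value below are assumptions.

```python
import math

def lens_compensation_mm(shake_angle_deg, focal_length_mm=26.0):
    """Rough illustration: the image shift caused by a shake angle is about
    f * tan(theta); the lens is moved by the opposite amount."""
    shift = focal_length_mm * math.tan(math.radians(shake_angle_deg))
    return -shift  # move the lens opposite to the shake
```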
可能的实现方式中,传感器模块180中还可以包括下述一种或多种传感器,例如:压力传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,或骨传导传感器等(图2中未示出)。其中,加速度传感器用于检测终端设备在各个方向上(一般为三轴)的加速度的大小,进而识别终端设备的姿态。In a possible implementation, the sensor module 180 may also include one or more of the following sensors, for example: a pressure sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, or a bone conduction sensor (not shown in FIG. 2 ). Among them, the acceleration sensor is used to detect the magnitude of the acceleration of the terminal device in each direction (generally three axes) to further identify the posture of the terminal device.
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。终端设备可以接收按键输入,产生与终端设备的用户设置以及功能控制有关的键信号输入。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。The button 190 includes a power button, a volume button, etc. The button 190 may be a mechanical button. It may also be a touch button. The terminal device may receive the button input and generate a key signal input related to the user settings and function control of the terminal device. The indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, message, missed call, notification, etc.
终端设备的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构等,在此不再赘述。The software system of the terminal device can adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture, etc., which will not be elaborated here.
图3为本申请实施例提供的一种终端设备的软件结构示意图。如图3所示,分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。FIG3 is a schematic diagram of the software structure of a terminal device provided in an embodiment of the present application. As shown in FIG3, the layered architecture divides the software into several layers, each layer has a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom, namely, the application layer, the application framework layer, the Android runtime (Android runtime) and the system library, and the kernel layer.
应用程序层可以包括一系列应用程序包。The application layer can include a series of application packages.
如图3所示,应用程序包可以包括相机,日历,电话,地图,音乐,设置,邮箱,视频,社交等应用程序。本申请实施例中,可以在相机应用中实现虚化方法。As shown in Figure 3, the application packages may include applications such as camera, calendar, phone, map, music, settings, mailbox, video, and social. In the embodiments of the present application, the blurring method can be implemented in the camera application.
应用程序框架层为应用程序层的应用程序提供应用编程接口(applicationprogramming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。The application framework layer provides an application programming interface (API) and a programming framework for the applications in the application layer. The application framework layer includes some predefined functions.
如图3所示,应用程序框架层可以包括窗口管理器,内容提供器,资源管理器,视图系统,通知管理器等。As shown in FIG. 3 , the application framework layer may include a window manager, a content provider, a resource manager, a view system, a notification manager, and the like.
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,触摸屏幕,拖拽屏幕,截取屏幕等。The window manager is used to manage window programs. The window manager can obtain the display size, determine whether there is a status bar, lock the screen, touch the screen, drag the screen, capture the screen, etc.
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。Content providers are used to store and retrieve data and make it accessible to applications. The data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。The view system includes visual controls, such as controls for displaying text, controls for displaying images, etc. The view system can be used to build applications. A display interface can be composed of one or more views. For example, a display interface including a text notification icon can include a view for displaying text and a view for displaying images.
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。The resource manager provides various resources for applications, such as localized strings, icons, images, layout files, video files, and so on.
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,终端设备振动,指示灯闪烁等。The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages and can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc. The notification manager can also be a notification that appears in the system top status bar in the form of a chart or scroll bar text, such as notifications of applications running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is displayed in the status bar, a prompt sound is emitted, the terminal device vibrates, the indicator light flashes, etc.
Android runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。Android runtime includes core libraries and virtual machines. Android runtime is responsible for scheduling and management of the Android system.
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。The core library consists of two parts: one part is the function that needs to be called by the Java language, and the other part is the Android core library.
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。The application layer and the application framework layer run in a virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。The system library may include multiple functional modules, such as surface manager, media library, 3D graphics processing library (such as OpenGL ES), 2D graphics engine (such as SGL), etc.
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。The surface manager is used to manage the display subsystem and provide the fusion of 2D and 3D layers for multiple applications.
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。The media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc. The media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。The 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
2D图形引擎是2D绘图的绘图引擎。A 2D graphics engine is a drawing engine for 2D drawings.
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。The kernel layer is the layer between hardware and software. The kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
下面以具体地实施例对本申请的技术方案以及本申请的技术方案如何解决上述技术问题进行详细说明。下面这几个具体的实施例可以独立实现,也可以相互结合,对于相同或相似的概念或过程可能在某些实施例中不再赘述。The following specific embodiments are used to describe in detail the technical solution of the present application and how the technical solution of the present application solves the above technical problems. The following specific embodiments can be implemented independently or in combination with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
示例性的,图4为本申请实施例提供的一种虚化方法的流程示意图。在图4对应的实施例中,双目摄像头可以包括主路摄像头以及辅路摄像头。Exemplarily, Fig. 4 is a flow chart of a blurring method provided in an embodiment of the present application. In the embodiment corresponding to Fig. 4, the binocular camera may include a main camera and an auxiliary camera.
可能的实现方式中,该双目摄像头可以包括:支持1x-3.5x的主摄像头(或理解为主路摄像头)以及超广角摄像头(或理解为辅路摄像头);或者,该双目摄像头也可以包括:支持3.5x以上的长焦摄像头(或理解为主路摄像头)以及主摄像头(或理解为辅路摄像头)。In a possible implementation, the binocular camera may include: a primary camera supporting 1x-3.5x zoom (which may be understood as the main camera) and an ultra-wide-angle camera (which may be understood as the auxiliary camera); or the binocular camera may include: a telephoto camera supporting more than 3.5x zoom (which may be understood as the main camera) and the primary camera (which may be understood as the auxiliary camera).
可能的实现方式中,终端设备中也可以包括多个摄像头,例如包括3个摄像头,该3个摄像头中的至少2个摄像头可以用于实现上述双目摄像头的功能,本申请实施例中对此不做限定。In a possible implementation, the terminal device may also include multiple cameras, for example, three cameras, at least two of the three cameras can be used to implement the functions of the above-mentioned binocular camera, which is not limited in the embodiments of the present application.
如图4所示,虚化方法可以包括如下步骤:As shown in FIG4 , the blurring method may include the following steps:
S401、当终端设备接收到打开视频录制功能或拍摄功能的操作时,终端设备利用主路摄像头获取主路图像数据以及利用辅路摄像头获取辅路图像数据。S401. When the terminal device receives an operation to turn on a video recording function or a shooting function, the terminal device uses a main camera to obtain main image data and uses an auxiliary camera to obtain auxiliary image data.
其中,该主路图像数据的格式以及辅路图像数据的格式均可以为原生(RAW)格式。The format of the main channel image data and the format of the auxiliary channel image data may both be in a native (RAW) format.
RAW格式的图像也可以称为原始图像数据,其可以为图像感应器将捕捉到的光源信号转化为数字信号时的原始数据。RAW文件中记录了摄像头的原始信息,同时记录了由摄像头拍摄所产生的一些元数据,如图像感光度的设置、快门速度、光圈值、白平衡数值等数据。RAW格式是未经处理、且未经压缩的格式。RAW format images can also be called raw image data, which can be the raw data when the image sensor converts the captured light source signal into a digital signal. RAW files record the original information of the camera, and also record some metadata generated by the camera shooting, such as image sensitivity settings, shutter speed, aperture value, white balance value and other data. RAW format is an unprocessed and uncompressed format.
示例性的,图5为本申请实施例提供的一种获取图像数据的界面示意图。在图5对应的实施例中,以终端设备为手机为例进行示例说明,该示例并不构成对本申请实施例的限定。For example, Fig. 5 is a schematic diagram of an interface for acquiring image data provided in an embodiment of the present application. In the embodiment corresponding to Fig. 5, a mobile phone is used as an example for illustration, and the example does not constitute a limitation on the embodiment of the present application.
当终端设备接收到用户打开相机应用中的视频录制功能的操作时,终端设备可以利用主路摄像头获取主路图像数据以及利用辅路摄像头获取辅路图像数据,并基于图4对应的实施例得到虚化处理结果,进而使得终端设备显示如图5中的a所示的界面,该界面可以为视频录制功能对应的界面,实现对于预览画面的虚化处理。When the terminal device receives an operation from the user to turn on the video recording function in the camera application, the terminal device can use the main camera to obtain the main image data and use the auxiliary camera to obtain the auxiliary image data, and obtain the blur processing result based on the embodiment corresponding to Figure 4, so that the terminal device displays the interface shown in a in Figure 5, which can be the interface corresponding to the video recording function, and realizes the blur processing of the preview screen.
可能的实现方式中,终端设备也可以在接收到用户在图5中的a所示的界面中,针对用于开启视频录制的控件501的触发操作时,基于图4对应的实施例对基于主路摄像头获取的主路图像数据以及辅路摄像头获取的辅路图像数据进行虚化处理,得到虚化处理结果,进而使得终端设备显示如图5中的b所示的界面,实现对于预览画面的虚化处理。In a possible implementation, when the terminal device receives a trigger operation from the user on the control 501 for starting video recording in the interface shown in a of Figure 5, the terminal device may blur the main image data obtained by the main camera and the auxiliary image data obtained by the auxiliary camera based on the embodiment corresponding to Figure 4 to obtain a blurring result, thereby causing the terminal device to display an interface as shown in b of Figure 5 to achieve blurring of the preview screen.
可能的实现方式中,终端设备也可以在接收到用户在图5中的b所示的界面中,针对用于结束录制的控件505的触发操作时,基于图4对应的实施例对基于主路摄像头获取的主路图像数据以及辅路摄像头获取的辅路图像数据进行虚化处理,得到虚化处理结果,并将虚化处理结果存储在终端设备中,生成包括虚化处理结果的视频文件。在此场景中,终端设备可以不对预览画面进行虚化处理,使得该图5中的a(和/或b)所示的界面中无法显示虚化处理结果;或者也可以对视频画面进行虚化处理,使得该图5中的a(和/或b)所示的界面中显示虚化处理结果,本申请实施例中对此不做限定。In a possible implementation, when the terminal device receives a trigger operation of the control 505 for ending recording from the user in the interface shown in b of FIG. 5, the terminal device may also perform blurring processing on the main image data obtained by the main camera and the auxiliary image data obtained by the auxiliary camera based on the embodiment corresponding to FIG. 4 to obtain blurring processing results, store the blurring processing results in the terminal device, and generate a video file including the blurring processing results. In this scenario, the terminal device may not perform blurring processing on the preview screen, so that the blurring processing results cannot be displayed in the interface shown in a (and/or b) of FIG. 5; or the terminal device may perform blurring processing on the video screen, so that the blurring processing results are displayed in the interface shown in a (and/or b) of FIG. 5, which is not limited in the embodiments of the present application.
如图5中的a所示的界面,该界面中可以包括相机应用的一级菜单中的一个或多个功能控件,例如:光圈控件、夜景控件、人像控件、录像控件(或也可以理解为用于开启视频录制功能的控件)、短视频控件、或用于开启相机应用中的更多功能的更多控件等。该界面中还可以包括下述的一种或多种,例如:基于摄像头实时采集到的画面,例如预览画面503,用于开启视频录制的控件501、用于打开图库的控件、用于切换摄像头的控件、用于对相机应用进行设置的设置控件、或用于调整拍摄倍数的控件、以及用于设置闪光灯开启或关闭的闪光灯控件等。其中,该预览画面503中可以包括:处于前景的用户502、以及处于后景的用户504,该处于后景的用户504可以处于虚化状态。The interface shown in a in Figure 5 may include one or more function controls in the primary menu of the camera application, such as an aperture control, a night scene control, a portrait control, a video control (which may also be understood as a control for turning on the video recording function), a short video control, or a "More" control for opening more functions of the camera application. The interface may further include one or more of the following, for example: a picture captured by the camera in real time, such as the preview picture 503; a control 501 for starting video recording; a control for opening the gallery; a control for switching cameras; a settings control for configuring the camera application; a control for adjusting the shooting magnification; and a flash control for turning the flash on or off. The preview picture 503 may include a user 502 in the foreground and a user 504 in the background, and the user 504 in the background may be in a blurred state.
可以理解的是,如图5中的a所示的界面,本申请实施例可以通过虚线表示目标的虚化状态,通过实线表示目标的非虚化状态或也可以称为正常显示状态。It can be understood that, as shown in the interface a in Figure 5, the embodiment of the present application can represent the blurred state of the target by a dotted line, and represent the non-blurred state of the target by a solid line or can also be called a normal display state.
可能的实现方式中,当终端设备接收到用户针对用于开启视频录制的控件501的触发操作时,终端设备可以显示如图5中的b所示的界面。该界面中可以包括:基于虚化处理得到的预览画面503,用于指示视频录制时间的信息、用于暂停录制的控件、用于拍摄当前画面的控件、用于结束录制的控件505、以及用于调整拍摄倍数的控件等。In a possible implementation, when the terminal device receives a trigger operation from the user on the control 501 for starting video recording, the terminal device may display an interface as shown in b in FIG5. The interface may include: a preview screen 503 obtained based on blur processing, information indicating the video recording time, a control for pausing recording, a control for shooting the current screen, a control 505 for ending recording, and a control for adjusting the shooting magnification.
可能的实现方式中,终端设备也可以在接收到打开拍摄功能时,基于图4对应的实施例对基于主路摄像头获取的主路图像数据以及辅路摄像头获取的辅路图像数据进行虚化处理,得到虚化处理结果,并在预览画面中显示虚化处理结果和/或基于用户针对拍摄控件的触发操作将虚化处理结果保存在图像文件,虚化方法的触发方式与图5对应的实施例类似,在此不再赘述。In a possible implementation, when receiving a command to turn on the shooting function, the terminal device may also perform blurring processing on the main image data obtained by the main camera and the auxiliary image data obtained by the auxiliary camera based on the embodiment corresponding to FIG4 to obtain a blurring processing result, and display the blurring processing result in the preview screen and/or save the blurring processing result in an image file based on the user's trigger operation on the shooting control. The triggering method of the blurring method is similar to the embodiment corresponding to FIG5 and will not be repeated here.
S402、终端设备分别对主路图像数据以及辅路图像数据进行图像前处理,得到图像前处理后的主路图像数据以及图像前处理后的辅路图像数据。S402: The terminal device performs image pre-processing on the main channel image data and the auxiliary channel image data respectively to obtain the main channel image data after image pre-processing and the auxiliary channel image data after image pre-processing.
其中,该图像前处理用于将RAW格式的图像数据处理为YUV格式的图像数据。The image pre-processing is used to process the image data in RAW format into image data in YUV format.
示例性的,该图像前处理可以包括下述一种或多种,例如:去坏点校正处理、RAW域降噪处理、黑电平校正处理、光学阴影校正处理/自动白平衡处理、颜色插值处理、色调映射处理、色彩校正处理、Gamma校正处理、以及图像转换处理等,本申请实施例中对该图像前处理的具体过程不做限定。Exemplarily, the image pre-processing may include one or more of the following, such as: bad pixel correction processing, RAW domain noise reduction processing, black level correction processing, optical shading correction processing/automatic white balance processing, color interpolation processing, tone mapping processing, color correction processing, gamma correction processing, and image conversion processing, etc. The specific process of the image pre-processing is not limited in the embodiments of the present application.
S403、终端设备利用主路图像数据的曝光时间或辅路图像数据的曝光时间确定目标曝光时间。S403: The terminal device determines a target exposure time by using the exposure time of the primary image data or the exposure time of the secondary image data.
其中,该目标曝光时间可以为第一图像的曝光时间或第二图像的曝光时间中的任一个曝光时间。例如,该目标曝光时间为对主路图像数据的第一行像素进行曝光的起始曝光时间与对主路图像数据的最后一行像素进行曝光的起始曝光时间之间的中间时间;或者,目标曝光时间为对辅路图像数据的第一行像素进行曝光的起始曝光时间与对辅路图像数据的最后一行像素进行曝光的起始曝光时间之间的中间时间。或者,该目标曝光时间也可以为对主路图像数据(或辅路图像数据)的第一行像素进行曝光的起始曝光时间与对主路图像数据(或辅路图像数据)的最后一行像素进行曝光的起始曝光时间之间的任一时间,本申请实施例中对此不做限定。The target exposure time may be either the exposure time of the first image or the exposure time of the second image. For example, the target exposure time is the middle time between the starting exposure time of the first row of pixels of the main image data and the starting exposure time of the last row of pixels of the main image data; or, the target exposure time is the middle time between the starting exposure time of the first row of pixels of the auxiliary image data and the starting exposure time of the last row of pixels of the auxiliary image data. Alternatively, the target exposure time may also be any time between the starting exposure time of the first row of pixels of the main image data (or auxiliary image data) and the starting exposure time of the last row of pixels of the main image data (or auxiliary image data), which is not limited in the embodiments of the present application.
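For illustration only, the following is a minimal Python sketch of one way to compute such a mid-point target exposure time from the start-of-exposure times of the first and last rows; the function name and the numeric values are hypothetical and not taken from the application.

```python
def mid_exposure_time(first_row_start: float, last_row_start: float) -> float:
    """Midpoint between the start-of-exposure times of the first and last rows."""
    return (first_row_start + last_row_start) / 2.0

# Hypothetical rolling-shutter readout: t1 = first-row start, t3 = last-row start (seconds)
t1, t3 = 0.000, 0.016
t5 = mid_exposure_time(t1, t3)  # candidate target exposure time, here 0.008 s
```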
示例性的,图6为本申请实施例提供的一种生成图像数据的原理示意图。Exemplarily, FIG6 is a schematic diagram of a principle for generating image data provided in an embodiment of the present application.
终端设备可以按照图6中的b所示的正弦函数,在t1处开始对主路图像数据进行曝光,并在t4处完成对主路图像数据的曝光,进而生成主路图像数据。如图6中的a所示的,主路图像数据按照逐行曝光原则,在t1处对主路图像数据中的第一行像素点进行曝光,并在t2处完成对第一行像素点的曝光,逐行曝光,最终在t3处对主路图像数据中的最后一行像素点进行曝光,并在t4处完成对最后一行像素点的曝光,生成该主路图像数据。其中,t1与t2的差值,与t3与t4的差值相同,或可以理解为终端设备对每行进行曝光的时间相同。The terminal device may start exposing the main image data at t1 according to the sine function shown in b of FIG. 6 , and complete the exposure of the main image data at t4, thereby generating the main image data. As shown in a of FIG. 6 , the main image data is exposed at t1 for the first row of pixels in the main image data according to the line-by-line exposure principle, and completes the exposure of the first row of pixels at t2, and exposes line by line, and finally exposes the last row of pixels in the main image data at t3, and completes the exposure of the last row of pixels at t4, thereby generating the main image data. The difference between t1 and t2 is the same as the difference between t3 and t4, or it can be understood that the terminal device has the same exposure time for each row.
终端设备可以按照图6中的d所示的正弦函数,在t6处开始对辅路图像数据进行曝光,并在t9处完成对辅路图像数据的曝光,进而生成辅路图像数据。如图6中的c所示的,辅路图像数据按照逐行曝光原则,在t6处对辅路图像数据中的第一行像素点进行曝光,并在t8处完成对第一行像素点的曝光,逐行曝光,最终在t7处对辅路图像数据中的最后一行像素点进行曝光,并在t9处完成对最后一行像素点的曝光,生成该辅路图像数据。其中,t6与t8的差值,与t7与t9的差值相同。t1可以早于t6,或理解为主路图像数据进行曝光的时间可以早于辅路图像数据进行曝光的时间。The terminal device can start exposing the auxiliary image data at t6 according to the sine function shown in d of FIG. 6, and complete the exposure of the auxiliary image data at t9, thereby generating the auxiliary image data. As shown in c of FIG. 6, the auxiliary image data is exposed at t6 for the first row of pixels in the auxiliary image data according to the principle of line-by-line exposure, and completes the exposure of the first row of pixels at t8, and exposes line by line, and finally exposes the last row of pixels in the auxiliary image data at t7, and completes the exposure of the last row of pixels at t9, thereby generating the auxiliary image data. Among them, the difference between t6 and t8 is the same as the difference between t7 and t9. t1 can be earlier than t6, or it can be understood that the time when the main image data is exposed can be earlier than the time when the auxiliary image data is exposed.
可以理解的是,由于出图时间以及传感器设置的影响,使得两路图像数据的起始曝光时间以及逐行曝光时的时间间隔(或理解为逐行曝光时每行的起始曝光时间)不同,而曝光时间不同将影响两路图像数据的相差,使得基于两路图像数据计算得到的深度信息不准确,进而影响虚化效果。It is understandable that, due to the influence of the image output time and the sensor settings, the starting exposure times of the two channels of image data and the time intervals during line-by-line exposure (which can also be understood as the starting exposure time of each row during line-by-line exposure) are different. These differences in exposure time affect the phase difference between the two channels of image data, making the depth information calculated based on the two channels of image data inaccurate, thereby degrading the blurring effect.
因此,如图6所示,终端设备可以确定目标曝光时间,该目标曝光时间可以为两路图像数据对齐的时间,该目标曝光时间可以使得两路图像数据在相同时间点处开始进行曝光。例如,该目标曝光时间可以为主路图像数据或辅路图像数据中的任一曝光时间,例如目标曝光时间可以为t5,该t5可以为t1与t3的中间值;或者,该目标曝光时间也可以为t6与t7之间的中间值,本申请实施例中对此不做限定。Therefore, as shown in FIG6 , the terminal device can determine the target exposure time, which can be the time when the two channels of image data are aligned, and the target exposure time can make the two channels of image data start to be exposed at the same time point. For example, the target exposure time can be any exposure time of the main channel image data or the auxiliary channel image data, for example, the target exposure time can be t5, and t5 can be the middle value between t1 and t3; or, the target exposure time can also be the middle value between t6 and t7, which is not limited in the embodiments of the present application.
基于此,终端设备可以利用目标曝光时间,使得两路图像数据在同一时间点处开启曝光,避免不同的曝光起始时间对于相差的影响。Based on this, the terminal device can use the target exposure time to start exposure of two channels of image data at the same time point, thereby avoiding the influence of different exposure start times on the phase difference.
S404、终端设备利用主路图像数据对应的第一warp矩阵对图像前处理后的主路图像数据进行矫正得到矫正后的主路图像数据,以及利用辅路图像数据对应的第二warp矩阵对图像前处理后的辅路图像数据进行矫正得到矫正后的辅路图像数据。S404. The terminal device uses the first warp matrix corresponding to the main image data to correct the main image data after image pre-processing to obtain corrected main image data, and uses the second warp matrix corresponding to the auxiliary image data to correct the auxiliary image data after image pre-processing to obtain corrected auxiliary image data.
本申请实施例中,第一warp矩阵(或称第一变换矩阵)用于将第一图像中的任一行像素点对齐到目标曝光时间对应的位置处,第二warp矩阵(或称为第二变换矩阵)用于将第二图像中的任一行像素点对齐到目标曝光时间对应的位置处。其中,该第一变换矩阵(或第二变换矩阵)中的每一个数值可以用于确定,每一行像素点对齐到目标曝光时间点对应的位置时需要移动的距离。In the embodiment of the present application, the first warp matrix (or the first transformation matrix) is used to align any row of pixels in the first image to the position corresponding to the target exposure time, and the second warp matrix (or the second transformation matrix) is used to align any row of pixels in the second image to the position corresponding to the target exposure time. Each value in the first transformation matrix (or the second transformation matrix) can be used to determine the distance that each row of pixels needs to move when it is aligned to the position corresponding to the target exposure time point.
示例性的,终端设备可以通过光学图像稳定器(optical image stabilizer,OIS)获取角加速度数据,并通过电子防抖(electronic image stabilization,EIS)处理获取主路图像数据在目标曝光时间下对应的第一warp矩阵,以及辅路图像数据在目标曝光时间下对应的第二warp矩阵;进一步的,终端设备可以利用第一warp矩阵,对图像前处理后的主路图像进行位置校正,得到校正处理后的主路图像数据,以及利用第二warp矩阵,对图像前处理后的辅路图像进行位置校正,得到校正处理后的辅路图像数据。Exemplarily, the terminal device can obtain angular acceleration data through an optical image stabilizer (OIS), and obtain, through electronic image stabilization (EIS) processing, a first warp matrix corresponding to the main image data at the target exposure time and a second warp matrix corresponding to the auxiliary image data at the target exposure time; further, the terminal device can use the first warp matrix to perform position correction on the main image after image pre-processing to obtain the corrected main image data, and use the second warp matrix to perform position correction on the auxiliary image after image pre-processing to obtain the corrected auxiliary image data.
可以理解的是,由于主路图像数据在逐行曝光时交错曝光的时间与辅路图像数据在逐行曝光时交错曝光的时间不一致,使得两路图像数据的相差变化较大,因此终端设备可以通过分别获取对齐到目标曝光时间时对应的warp矩阵,对图像数据进行校正,减少不同曝光时间带来的相差影响。It can be understood that since the time of interlaced exposure of the main image data when exposing line by line is inconsistent with the time of interlaced exposure of the auxiliary image data when exposing line by line, the phase difference between the two image data varies greatly. Therefore, the terminal device can correct the image data by respectively obtaining the warp matrices corresponding to the target exposure time, so as to reduce the phase difference caused by different exposure times.
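The exact per-row warp derived from OIS/EIS data is device-specific; as a hedged illustration only, the Python sketch below applies a pre-computed per-row horizontal offset (standing in for the per-row warp values) so that each row is shifted to the position it would have had at the target exposure time. The offsets themselves, the frame size, and the function name are assumptions for the example.

```python
import numpy as np

def align_rows(image: np.ndarray, row_shifts: np.ndarray) -> np.ndarray:
    """Shift each row horizontally by its own offset (in pixels), as a stand-in for the
    per-row warp that aligns the row to the target exposure time.
    Border handling (here: wrap-around via np.roll) is simplified for brevity."""
    aligned = np.empty_like(image)
    for r in range(image.shape[0]):
        aligned[r] = np.roll(image[r], int(round(row_shifts[r])), axis=0)
    return aligned

# Hypothetical usage: motion increasing linearly over the rolling-shutter readout
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
shifts = np.linspace(0.0, 5.0, frame.shape[0])
aligned_frame = align_rows(frame, shifts)
```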
示例性的,图7为本申请实施例提供的一种图像矫正的原理示意图。Exemplarily, FIG7 is a schematic diagram of the principle of image correction provided in an embodiment of the present application.
如图7中的a所示的主路图像数据,终端设备可以通过第一warp矩阵对主路图像数据进行图像位置矫正,得到如图7中的b所示的矫正后的主路图像数据,该矫正后的主路图像数据的曝光起始时间可以为t5,该矫正后的主路图像数据的曝光结束时间可以为t5+(t2-t1)。For the main image data as shown in a in Figure 7, the terminal device can perform image position correction on the main image data through the first warp matrix to obtain the corrected main image data as shown in b in Figure 7. The exposure start time of the corrected main image data can be t5, and the exposure end time of the corrected main image data can be t5+(t2-t1).
如图7中的c所示的辅路图像数据,终端设备可以通过第二warp矩阵对辅路图像数据进行图像位置矫正,得到如图7中的d所示的矫正后的辅路图像数据,该矫正后的辅路图像数据的曝光起始时间可以为t5,该矫正后的辅路图像数据的曝光结束时间可以为t5+(t8-t6)。As for the auxiliary channel image data as shown in c in Figure 7, the terminal device can perform image position correction on the auxiliary channel image data through the second warp matrix to obtain the corrected auxiliary channel image data as shown in d in Figure 7. The exposure start time of the corrected auxiliary channel image data can be t5, and the exposure end time of the corrected auxiliary channel image data can be t5+(t8-t6).
可以理解的是,该S404所示的步骤中描述的基于warp矩阵的图像矫正过程可以为终端设备图像后处理过程中的一个步骤。可能的实现方式中,终端设备也可以在S404之后,对该矫正后的主路图像数据以及矫正后的辅路图像数据进行其他图像后处理。例如,该图像后处理可以包括下述一种或多种:噪声处理、亮度和颜色处理、以及图像缩放处理等,本申请实施例中对该图像后处理的具体方式不做限定。It is understandable that the warp matrix-based image correction process described in the step shown in S404 can be a step in the image post-processing process of the terminal device. In a possible implementation, the terminal device can also perform other image post-processing on the corrected main image data and the corrected auxiliary image data after S404. For example, the image post-processing may include one or more of the following: noise processing, brightness and color processing, and image scaling processing, etc. The specific method of the image post-processing is not limited in the embodiment of the present application.
针对噪声处理,可以用于去除当前图像中的噪声影响。其中,终端设备可以通过低通滤波器、或双边滤波器等去除当前图像中的噪声。Noise processing can be used to remove the noise effect in the current image. The terminal device can remove the noise in the current image by using a low-pass filter or a bilateral filter.
针对亮度和颜色处理,可以用于调整由于光线条件等对于拍摄对象的亮度和颜色的影响。其中,该颜色处理方法可以包括:基于色彩校正矩阵的颜色处理方法等。该亮度处理方法可以包括:局部色调映射方法等。For brightness and color processing, it can be used to adjust the influence of light conditions on the brightness and color of the photographed object. Among them, the color processing method may include: a color processing method based on a color correction matrix, etc. The brightness processing method may include: a local tone mapping method, etc.
针对图像缩放处理,可以用于将当前图像从一种分辨率转换为另一种分辨率。其中,该图像缩放处理方法可以包括:最近邻插值、线性插值、区域插值、或三次样条插值等方法。Image scaling can be used to convert a current image from one resolution to another resolution. The image scaling method may include nearest neighbor interpolation, linear interpolation, regional interpolation, or cubic spline interpolation.
可以理解的是,该图像矫正和调整处理方法、噪声处理方法、亮度和颜色处理方法、以及图像缩放处理方法均可以根据实际场景包括其他内容,本申请实施例中对此不做限定。It can be understood that the image correction and adjustment processing method, the noise processing method, the brightness and color processing method, and the image scaling processing method can include other contents according to the actual scene, and this is not limited in the embodiments of the present application.
S405、终端设备基于矫正后的主路图像数据以及矫正后的辅路图像数据,确定深度图像数据。S405. The terminal device determines the depth image data based on the corrected main road image data and the corrected auxiliary road image data.
可以理解的是,终端设备可以确定矫正后的主路图像数据以及矫正后的辅路图像数据中的任一对像素点对应的相差,并利用相差以及两路摄像头之间的距离,确定该任一对像素点对应的深度,进而得到深度图像数据。It can be understood that the terminal device can determine the phase difference corresponding to any pair of pixels in the corrected main image data and the corrected auxiliary image data, and use the phase difference and the distance between the two cameras to determine the depth corresponding to any pair of pixels, thereby obtaining depth image data.
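As a hedged illustration of this step, the sketch below uses the standard rectified-stereo relation Z = f·B/d, where d is the phase difference (disparity) between matching pixels, f the focal length in pixels, and B the distance between the two cameras. It assumes a disparity map has already been computed; all numeric values are hypothetical.

```python
import numpy as np

def disparity_to_depth(disparity: np.ndarray, focal_px: float, baseline_m: float) -> np.ndarray:
    """Per-pixel depth in metres from disparity in pixels: Z = f * B / d."""
    eps = 1e-6  # guard against zero disparity
    return focal_px * baseline_m / np.maximum(disparity, eps)

# Hypothetical values: 1500 px focal length, 12 mm baseline, uniform 8 px disparity
depth_m = disparity_to_depth(np.full((480, 640), 8.0), focal_px=1500.0, baseline_m=0.012)
```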
S406、终端设备利用第一warp矩阵对应的逆矩阵,对深度图像数据进行处理,得到矫正后的深度图像数据。S406: The terminal device processes the depth image data using the inverse matrix corresponding to the first warp matrix to obtain corrected depth image data.
可以理解的是,通常情况下终端设备是基于主路摄像头对应的图像数据进行虚化处理的,因此可以将深度图像数据校正为与主路摄像头相关的深度图像数据,得到校正后的深度图像数据。It is understandable that, usually, the terminal device performs blurring processing based on the image data corresponding to the main camera, so the depth image data can be corrected to the depth image data related to the main camera to obtain the corrected depth image data.
S407、终端设备利用矫正后的深度图像数据,对图像前处理后的主路图像数据进行虚化处理,得到虚化处理结果。S407: The terminal device uses the corrected depth image data to perform blurring on the main image data after image pre-processing to obtain a blurring result.
示例性的,终端设备可以通过高斯模糊处理、以及神经网络模型等方法进行虚化处理,本申请实施例中对该虚化处理方法不做限定。Exemplarily, the terminal device may perform blurring through Gaussian blur processing, neural network model and other methods, and the blurring processing method is not limited in the embodiments of the present application.
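As one hedged example of how such blurring could be realised with Gaussian blur, the sketch below blurs only the pixels whose depth exceeds a foreground threshold, using a single kernel; a production pipeline would typically vary the blur strength with depth, which is omitted here. The threshold and kernel size are hypothetical.

```python
import cv2
import numpy as np

def blur_background(image: np.ndarray, depth_m: np.ndarray, fg_threshold_m: float = 1.5) -> np.ndarray:
    """Keep pixels closer than fg_threshold_m sharp; replace the rest with a Gaussian-blurred copy."""
    blurred = cv2.GaussianBlur(image, (21, 21), 0)
    fg_mask = (depth_m < fg_threshold_m).astype(np.float32)[..., None]  # 1.0 = foreground
    out = image.astype(np.float32) * fg_mask + blurred.astype(np.float32) * (1.0 - fg_mask)
    return out.astype(image.dtype)
```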
S408、终端设备将虚化处理结果送显至显示器,并作为预览画面,和/或,终端设备对虚化处理结果进行存储,并作为图像或视频。S408: The terminal device sends the blurring result to a display as a preview image, and/or the terminal device stores the blurring result as an image or video.
基于此,使得终端设备可以基于主路图像数据的曝光时间以及辅路图像数据的曝光时间确定目标曝光时间,并使得主路图像数据以及辅路图像数据在目标曝光时间处对齐,减少不同曝光时间对主路图像数据以及辅路图像数据之间相差的影响,提高深度计算的准确度,进而提高虚化处理的准确性。Based on this, the terminal device can determine the target exposure time based on the exposure time of the main image data and the exposure time of the auxiliary image data, and align the main image data and the auxiliary image data at the target exposure time, thereby reducing the impact of different exposure times on the phase difference between the main image data and the auxiliary image data, improving the accuracy of depth calculation, and thereby improving the accuracy of blur processing.
示例性的,图8为本申请实施例提供的另一种虚化方法的流程示意图。Exemplarily, FIG8 is a flowchart of another blurring method provided in an embodiment of the present application.
如图8所示,虚化方法可以包括如下步骤:As shown in FIG8 , the blurring method may include the following steps:
S801、当终端设备接收到打开视频录制功能或拍摄功能的操作时,终端设备利用主路摄像头获取主路图像数据以及利用辅路摄像头获取辅路图像数据。S801. When the terminal device receives an operation to turn on a video recording function or a shooting function, the terminal device uses a main camera to obtain main image data and uses an auxiliary camera to obtain auxiliary image data.
S802、终端设备分别对主路图像数据以及辅路图像数据进行图像前处理,得到图像前处理后的主路图像数据以及图像前处理后的辅路图像数据。S802: The terminal device performs image pre-processing on the main channel image data and the auxiliary channel image data respectively to obtain the main channel image data after image pre-processing and the auxiliary channel image data after image pre-processing.
S803、终端设备利用主路图像数据的曝光时间或辅路图像数据的曝光时间确定目标曝光时间。S803: The terminal device determines a target exposure time by using the exposure time of the primary image data or the exposure time of the secondary image data.
其中,该S801-S803所示的步骤中的描述可以参见S401-S403所示的步骤中的描述,在此不再赘述。The descriptions of the steps shown in S801-S803 can refer to the descriptions of the steps shown in S401-S403, and will not be repeated here.
S804、终端设备获取目标曝光时间下主路图像数据对应的第一warp矩阵、以及目标曝光时间下辅路图像数据对应的第二warp矩阵。S804. The terminal device obtains a first warp matrix corresponding to the main image data at the target exposure time, and a second warp matrix corresponding to the auxiliary image data at the target exposure time.
其中,该获取第一warp矩阵以及第二warp矩阵的过程可以参见S404所示的步骤中的描述,在此不再赘述。The process of obtaining the first warp matrix and the second warp matrix may refer to the description in step S404, which will not be described again.
S805、终端设备获取第一warp矩阵对应的逆矩阵,并计算第一warp矩阵对应的逆矩阵与第二warp矩阵的卷积,得到目标矩阵。S805. The terminal device obtains an inverse matrix corresponding to the first warp matrix, and calculates the convolution of the inverse matrix corresponding to the first warp matrix and the second warp matrix to obtain a target matrix.
S806、终端设备利用该目标矩阵对图像前处理后的辅路图像数据进行矫正,得到位置矫正后的辅路图像数据,并基于位置矫正后的辅路图像数据以及图像前处理后的主路图像数据,确定矫正后的深度图像数据。S806. The terminal device uses the target matrix to correct the auxiliary road image data after image pre-processing to obtain the auxiliary road image data after position correction, and determines the corrected depth image data based on the auxiliary road image data after position correction and the main road image data after image pre-processing.
可以理解的是,终端设备可以通过目标矩阵对图像前处理后的辅路图像数据进行矫正处理,使得位置矫正后的辅路图像数据可以与图像前处理后的主路图像数据的曝光时间相同,进而基于位置矫正后的辅路图像数据与图像前处理后的主路图像数据计算得到的深度图像较为准确。It can be understood that the terminal device can correct the auxiliary channel image data after image pre-processing through the target matrix, so that the auxiliary channel image data after position correction can have the same exposure time as the main channel image data after image pre-processing, and then the depth image calculated based on the auxiliary channel image data after position correction and the main channel image data after image pre-processing is more accurate.
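Interpreting the "convolution" of the inverse of the first warp matrix with the second warp matrix as a composition (matrix product) of 3x3 transformation matrices, a hedged sketch of S805-S806 could look as follows; the matrices W1 and W2 below are hypothetical placeholders for the matrices obtained in S804.

```python
import cv2
import numpy as np

# Hypothetical 3x3 warp matrices obtained for the target exposure time (S804)
W1 = np.eye(3)                              # main-image warp
W2 = np.array([[1.0, 0.0, 4.0],
               [0.0, 1.0, 2.0],
               [0.0, 0.0, 1.0]])            # auxiliary-image warp

target_matrix = np.linalg.inv(W1) @ W2      # combined correction matrix (S805)

def correct_auxiliary(aux_image: np.ndarray) -> np.ndarray:
    """Apply the combined matrix once to the pre-processed auxiliary image (S806)."""
    h, w = aux_image.shape[:2]
    return cv2.warpPerspective(aux_image, target_matrix, (w, h))
```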
可能的实现方式中,终端设备也可以对位置矫正后的辅路图像数据执行如S404中描述的图像后处理过程中的一种或多种,本申请实施例中对此不做限定。In a possible implementation, the terminal device may also perform one or more of the image post-processing processes described in S404 on the auxiliary road image data after position correction, which is not limited in the embodiments of the present application.
S808、终端设备利用矫正后的深度图像数据,对图像前处理后的主路图像数据进行虚化处理,得到虚化处理结果。S808. The terminal device uses the corrected depth image data to perform blurring on the main image data after image pre-processing to obtain a blurring result.
S809、终端设备将虚化处理结果送显至显示器,并作为预览画面,和/或,终端设备对虚化处理结果进行存储,并作为图像或视频。S809: The terminal device sends the blurring result to a display as a preview image, and/or the terminal device stores the blurring result as an image or video.
基于此,对比于图4对应的实施例,终端设备可以通过目标矩阵的计算,简化利用第一warp矩阵以及第二warp矩阵对图像数据进行矫正并确定深度图像的步骤,节省内存占用,并提高深度图像计算以及生成虚化图像的速度。Based on this, compared with the embodiment corresponding to Figure 4, the terminal device can simplify the steps of correcting the image data and determining the depth image using the first warp matrix and the second warp matrix through the calculation of the target matrix, save memory usage, and improve the speed of depth image calculation and generating blurred images.
可以理解的是,图4以及图8对应的实施例中虚化处理方法中各步骤之间的顺序可以不限于图4或图8对应的实施例中的描述,本申请实施例中对此不做限定。It can be understood that the order of the steps in the blurring processing method in the embodiments corresponding to FIG. 4 and FIG. 8 may not be limited to the description in the embodiments corresponding to FIG. 4 or FIG. 8 , and this is not limited in the embodiments of the present application.
在图4或图8对应的实施例的基础上,一种实现中,终端设备可以将虚化处理结果送显至显示器,并且对虚化处理结果进行实时存储,使得终端设备可以在预览界面中实时显示虚化处理结果,并且也可以在接收到用户结束拍摄或录制的操作时,将存储的虚化处理结果编码为虚化处理的视频内容或图像内容。在此场景中,终端设备不仅可以实现在预览界面中显示虚化处理结果,还可以在接收到用户回放该视频内容或者查看图像内容的操作时,显示虚化处理的视频内容或图像内容。Based on the embodiments corresponding to FIG. 4 or FIG. 8 , in one implementation, the terminal device can send the blurring processing result to the display, and store the blurring processing result in real time, so that the terminal device can display the blurring processing result in real time in the preview interface, and can also encode the stored blurring processing result into the blurred video content or image content when receiving the user's operation of ending shooting or recording. In this scenario, the terminal device can not only display the blurring processing result in the preview interface, but also display the blurred video content or image content when receiving the user's operation of playing back the video content or viewing the image content.
可以理解的是,终端设备将虚化处理结果送显至显示器,并且对虚化处理结果进行实时存储的处理方法,也可以实现在直播场景、以及通话等存在预览需求以及视频录制需求的场景中,本申请实施例对此不做限定。It is understandable that the terminal device sends the blur processing result to the display and stores the blur processing result in real time. This processing method can also be implemented in live broadcast scenes, calls and other scenes where there are preview requirements and video recording requirements. The embodiments of the present application are not limited to this.
另一种实现中,终端设备可以对虚化处理结果进行实时存储,并在接收到用户结束拍摄或录制的操作时,将存储的虚化处理结果编码为虚化处理的内容;或者,终端设备也可以对S401(或S801)中获取的主路图像数据以及辅路图像数据进行实时存储,并在接收到用户结束拍摄或录制的操作时,对存储的主路图像数据以及辅路图像数据执行如S402-S407(或S802-S807)所示的步骤,得到虚化处理结果,并编码为虚化处理的内容。在此场景中,终端设备的预览界面中可以不显示虚化处理结果,但在接收到用户回放视频内容或者查看图像内容的操作时,显示虚化处理的视频内容或者图像内容。In another implementation, the terminal device may store the blurring result in real time, and when receiving the user's operation of ending shooting or recording, encode the stored blurring result into blurred content; alternatively, the terminal device may store the main channel image data and the auxiliary channel image data obtained in S401 (or S801) in real time, and when receiving the user's operation of ending shooting or recording, perform the steps shown in S402-S407 (or S802-S807) on the stored main channel image data and auxiliary channel image data to obtain the blurring result and encode it into blurred content. In this scenario, the blurring result may not be displayed in the preview interface of the terminal device, but when receiving the user's operation of replaying the video content or viewing the image content, the blurred video content or image content is displayed.
可以理解的是,终端设备对虚化处理结果进行实时存储的方法,可以实现用户对于终端设备的视频录制需求以及图像拍摄需求。It can be understood that the method in which the terminal device stores the blur processing results in real time can meet the user's video recording needs and image shooting needs for the terminal device.
可以理解的是,本申请实施例中对该虚化处理结果的后续处理流程不做具体限定。It is understandable that the subsequent processing flow of the blurring result is not specifically limited in the embodiment of the present application.
在图4(或图8)对应的实施例的基础上,可能的实现方式中,终端设备可以在本设备中执行S401-S407(或S801-S807)所示的步骤;或者,虚化方法也可以在服务器中执行,例如终端设备在S401(或S801)中获取主路图像数据以及辅路图像数据后,可以将图像数据发送至服务器,使得服务器可以执行S402-S407(或S802-S807)所示的步骤得到虚化处理结果,服务器可以将虚化处理结果发送至终端设备,使得终端设备可以基于虚化处理结果进行后续存储处理或送显处理。Based on the embodiment corresponding to FIG. 4 (or FIG. 8), in a possible implementation, the terminal device can execute the steps shown in S401-S407 (or S801-S807) on the device itself; alternatively, the blurring method can also be executed on a server. For example, after the terminal device obtains the main channel image data and the auxiliary channel image data in S401 (or S801), it can send the image data to the server, so that the server can execute the steps shown in S402-S407 (or S802-S807) to obtain the blurring processing result; the server can then send the blurring processing result to the terminal device, so that the terminal device can perform subsequent storage processing or display processing based on the blurring processing result.
可以理解的是,本申请实施例中对虚化方法的执行设备,不做具体限定。It is understandable that the embodiments of the present application do not specifically limit the execution device of the virtualization method.
在图4或图8对应的实施例的基础上,可能的实现方式中,S402或S802所示的步骤中的图像前处理过程可以参见图9对应的实施例。Based on the embodiment corresponding to FIG. 4 or FIG. 8 , in a possible implementation, the image pre-processing process in the step shown in S402 or S802 can refer to the embodiment corresponding to FIG. 9 .
示例性的,图9为本申请实施例提供的一种图像前处理的流程示意图。Exemplarily, FIG9 is a schematic diagram of a process of image pre-processing provided in an embodiment of the present application.
如图9所示,该图像前处理可以包括下述一种或多种,例如:去坏点校正处理、RAW域降噪处理、黑电平校正处理、光学阴影校正处理/自动白平衡处理、颜色插值处理、色调映射处理、色彩校正处理、Gamma校正处理、以及图像转换处理等,本申请实施例中对该图像前处理的具体过程不做限定。As shown in Figure 9, the image pre-processing may include one or more of the following, for example: bad pixel correction processing, RAW domain noise reduction processing, black level correction processing, optical shading correction processing/automatic white balance processing, color interpolation processing, tone mapping processing, color correction processing, Gamma correction processing, and image conversion processing, etc. The specific process of the image pre-processing is not limited in the embodiment of the present application.
针对去坏点校正处理,坏点可以为亮度或者色彩与周围其他像素的点有较大的区别的点。例如,终端设备可以通过在全黑环境下检测亮点和彩点,以及在高亮环境下检测黑点和彩点的方法确定坏点。在去坏点校正处理中,终端设备可以通过在亮度域上取周围像素点的均值的方式消除坏点。For bad pixel correction processing, a bad pixel may be a pixel whose brightness or color is significantly different from other pixels around it. For example, the terminal device may determine the bad pixel by detecting bright pixels and color pixels in a completely dark environment, and by detecting black pixels and color pixels in a bright environment. In the bad pixel correction processing, the terminal device may eliminate the bad pixel by taking the average of the surrounding pixels in the brightness domain.
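A hedged sketch of the neighbour-averaging idea described above (ignoring, for simplicity, that a Bayer RAW image should average same-colour neighbours only); the bad-pixel mask is assumed to come from the calibration described in the text.

```python
import numpy as np

def correct_dead_pixels(raw: np.ndarray, bad_mask: np.ndarray) -> np.ndarray:
    """Replace each flagged pixel with the mean of its valid 3x3 neighbours (single-channel image)."""
    out = raw.astype(np.float32)
    h, w = raw.shape
    for y, x in zip(*np.nonzero(bad_mask)):
        y0, y1 = max(y - 1, 0), min(y + 2, h)
        x0, x1 = max(x - 1, 0), min(x + 2, w)
        neighbours = out[y0:y1, x0:x1]
        valid = ~bad_mask[y0:y1, x0:x1]
        if valid.any():
            out[y, x] = neighbours[valid].mean()
    return out.astype(raw.dtype)
```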
针对RAW域降噪处理,噪声可以在图像上表现为引起较强视觉效果的孤立像素点或像素块。在RAW域降噪处理中,终端设备可以通过低通滤波器(low pass filter,LPF)、或双边滤波器(bilateral filtering)等去除RAW域中的噪声。For RAW domain noise reduction processing, noise can appear on the image as isolated pixels or pixel blocks that cause strong visual effects. In RAW domain noise reduction processing, the terminal device can remove noise in the RAW domain through a low pass filter (LPF) or bilateral filtering.
针对黑电平校正处理,在调试摄像头的过程中,将摄像头放入封闭的密封箱中时,会发现画面虽然呈现黑色,但黑的程度不够。这是由于暗电流的影响,使得传感器输出的图像数据并不是所需要的黑平衡。在黑电平校正处理中,终端设备可以确定校正值,并将所有区域的像素减去该校正值,使得画面呈现纯黑色。For black level correction, when the camera is placed in a closed, light-tight box during camera debugging, the picture appears black but not black enough. This is caused by dark current, which makes the image data output by the sensor deviate from the required black balance. In black level correction, the terminal device can determine a correction value and subtract it from the pixels in all areas so that the picture appears pure black.
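A hedged sketch of the subtraction step, assuming a single calibrated black-level value for the whole sensor (real pipelines often use per-channel values); the value 64 is hypothetical.

```python
import numpy as np

def black_level_correct(raw: np.ndarray, black_level: int = 64) -> np.ndarray:
    """Subtract the calibrated black level so that a fully dark frame reads as zero."""
    corrected = raw.astype(np.int32) - black_level
    return np.clip(corrected, 0, None).astype(raw.dtype)
```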
针对光学阴影校正处理,由于镜头本身的物理性质,可能造成画面四周亮度相对中心亮度逐渐降低的情况,同时由于边缘入射角大,会造成相邻像素间串扰,以及角落偏色的情况。在光学阴影校正处理中,终端设备可以根据一定的校正方法计算每个像素对应的亮度校正值,从而补偿周边衰减的亮度。其中,该校正方法可以为二次项校正、或四次项校正等。Regarding optical shading correction, due to the physical properties of the lens itself, the brightness of the surrounding areas of the picture may gradually decrease relative to the brightness of the center. At the same time, due to the large incident angle at the edge, it will cause crosstalk between adjacent pixels and color cast in the corners. In the optical shading correction process, the terminal device can calculate the brightness correction value corresponding to each pixel according to a certain correction method to compensate for the brightness attenuation of the periphery. Among them, the correction method can be quadratic correction, quartic correction, etc.
针对自动白平衡处理,由于色温的影响,使得一张白纸在低色温下会偏黄,高色温下会偏蓝。在自动白平衡处理中,白平衡可以使得白色物体在任何色温下都可以呈现出白色,避免偏色情况。其中,自动白平衡方法可以包括:灰度世界、或完美反射法等。For automatic white balance processing, due to the influence of color temperature, a piece of white paper will appear yellowish at low color temperature and blueish at high color temperature. In automatic white balance processing, white balance can make white objects appear white at any color temperature to avoid color cast. Among them, automatic white balance methods can include: gray world, perfect reflection method, etc.
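A hedged sketch of the gray-world method mentioned above, assuming an 8-bit RGB input: the red and blue channels are scaled so that each channel mean matches the green mean.

```python
import numpy as np

def gray_world_awb(rgb: np.ndarray) -> np.ndarray:
    """Gray-world white balance: make the mean of every channel equal to the green mean."""
    img = rgb.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # (R_mean, G_mean, B_mean)
    gains = channel_means[1] / channel_means           # per-channel gain relative to green
    return np.clip(img * gains, 0, 255).astype(rgb.dtype)
```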
针对颜色插值处理,由于每个像素只感知一种颜色分量,因此可以通过颜色插值使每个像素上同时包含RGB三个分量,因此颜色插值处理可以用于将RAW格式的图像数据转化为RGB格式的图像数据。Regarding color interpolation processing, since each pixel only perceives one color component, color interpolation can be used to make each pixel contain three components of RGB at the same time. Therefore, color interpolation processing can be used to convert RAW format image data into RGB format image data.
针对色调映射处理,用于对图像的整体亮度进行调整,使得亮度调整后的画面可以更接近于真实世界中呈现的亮度。The tone mapping process is used to adjust the overall brightness of the image so that the picture after brightness adjustment can be closer to the brightness presented in the real world.
针对色彩校正处理,由于人眼对可见光的频谱响应度和半导体传感器的频谱响应度之间存在差别,以及受透镜等的影响,使得RGB颜色值会存在偏差。在色彩校正处理中,终端设备需要进行颜色校正,例如终端设备可以利用一个3x3的颜色校正矩阵进行颜色校正。For color correction, because the spectral response of the human eye to visible light differs from the spectral response of the semiconductor sensor, and because of the influence of the lens and other factors, the RGB color values will deviate. In color correction, the terminal device needs to correct the colors; for example, the terminal device can use a 3x3 color correction matrix for color correction.
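A hedged sketch of applying a 3x3 color correction matrix; the matrix values below are hypothetical and would in practice come from sensor calibration.

```python
import numpy as np

def apply_ccm(rgb: np.ndarray, ccm: np.ndarray) -> np.ndarray:
    """Map every pixel through the 3x3 color correction matrix (rows: output R, G, B)."""
    flat = rgb.reshape(-1, 3).astype(np.float32)
    corrected = flat @ ccm.T
    return np.clip(corrected, 0, 255).reshape(rgb.shape).astype(rgb.dtype)

# Hypothetical calibration result; each row sums to roughly 1 to preserve overall brightness
ccm = np.array([[ 1.6, -0.4, -0.2],
                [-0.3,  1.5, -0.2],
                [-0.1, -0.5,  1.6]], dtype=np.float32)
```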
针对Gamma校正处理,Gamma校正处理用于对输入图像的灰度值进行的非线性操作,使得输出图像的灰度值与输入图像的灰度值呈指数关系。Regarding the gamma correction processing, the gamma correction processing is used to perform a nonlinear operation on the grayscale value of the input image, so that the grayscale value of the output image is exponentially related to the grayscale value of the input image.
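A hedged sketch of the non-linear gamma mapping on normalized 8-bit values; gamma = 2.2 is a common but hypothetical choice here.

```python
import numpy as np

def gamma_correct(rgb: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Encode with output = input ** (1/gamma), computed on values normalized to [0, 1]."""
    normalized = rgb.astype(np.float32) / 255.0
    encoded = np.power(normalized, 1.0 / gamma)
    return np.clip(encoded * 255.0 + 0.5, 0, 255).astype(np.uint8)
```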
针对图像转换处理,可以用于将红绿蓝RGB格式的图像数据转化为YUV格式的图像数据。For image conversion processing, it can be used to convert image data in red, green, and blue (RGB) format into image data in YUV format.
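A hedged sketch of the final RGB-to-YUV conversion using OpenCV's built-in conversion (BT.601-style luma, Y = 0.299R + 0.587G + 0.114B); the exact YUV layout used by a given ISP may differ.

```python
import cv2
import numpy as np

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an 8-bit RGB image to YUV."""
    return cv2.cvtColor(rgb, cv2.COLOR_RGB2YUV)

yuv = rgb_to_yuv(np.zeros((4, 4, 3), dtype=np.uint8))
```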
可以理解的是,本申请实施例提供的界面仅作为一种示例,并不能构成对本申请实施例的进一步限定。It should be understood that the interface provided in the embodiment of the present application is merely an example and does not constitute a further limitation on the embodiment of the present application.
上面结合图4-图9,对本申请实施例提供的方法进行了说明,下面对本申请实施例提供的执行上述方法的装置进行描述。如图10所示,图10为本申请实施例提供的一种虚化装置的结构示意图,该虚化装置可以是本申请实施例中的终端设备,也可以是终端设备内的芯片或芯片系统。The method provided by the embodiment of the present application is described above in conjunction with Figures 4 to 9, and the device for executing the above method provided by the embodiment of the present application is described below. As shown in Figure 10, Figure 10 is a structural schematic diagram of a virtualization device provided by the embodiment of the present application, and the virtualization device can be a terminal device in the embodiment of the present application, or a chip or chip system in the terminal device.
如图10所示,虚化装置1000可以用于通信设备、电路、硬件组件或者芯片中,该虚化装置1000包括:获取单元1001以及处理单元1002。其中,获取单元1001用于支持虚化装置1000执行的数据获取的步骤;处理单元1002用于支持虚化装置1000执行信息处理的步骤。As shown in FIG10 , the virtualization device 1000 can be used in a communication device, a circuit, a hardware component or a chip, and the virtualization device 1000 includes: an acquisition unit 1001 and a processing unit 1002. The acquisition unit 1001 is used to support the virtualization device 1000 in performing data acquisition steps; the processing unit 1002 is used to support the virtualization device 1000 in performing information processing steps.
具体的,本申请实施例提供一种虚化装置1000,终端设备包括第一摄像头以及第二摄像头,方法包括:获取单元1001,用于获取第一图像以及第二图像;其中,第一图像是基于第一摄像头得到的,第二图像是基于第二摄像头得到的;处理单元1002,用于根据目标曝光时间处理第二图像,或者根据目标曝光时间处理第一图像和第二图像,其中,处理后的第一图像为按照目标曝光时间进行曝光的图像,处理后的第二图像为按照目标曝光时间进行曝光的图像;目标曝光时间为第一图像的曝光时间或第二图像的曝光时间中的任一个曝光时间;处理单元1002,还用于基于处理后的第二图像,或者基于处理后的第一图像以及处理后的第二图像,对第一图像进行虚化处理。Specifically, an embodiment of the present application provides a blurring device 1000, wherein the terminal device includes a first camera and a second camera, and the method includes: an acquisition unit 1001, used to acquire a first image and a second image; wherein the first image is obtained based on the first camera, and the second image is obtained based on the second camera; a processing unit 1002, used to process the second image according to a target exposure time, or to process the first image and the second image according to the target exposure time, wherein the processed first image is an image exposed according to the target exposure time, and the processed second image is an image exposed according to the target exposure time; the target exposure time is either the exposure time of the first image or the exposure time of the second image; the processing unit 1002 is also used to blur the first image based on the processed second image, or based on the processed first image and the processed second image.
可能的实现方式中,虚化装置1000还可以包括:通信单元1003,通信单元1003用于指示虚化装置1000执行数据的发送和接收等步骤。其中,该通信单元1003可以是输入或者输出接口、管脚或者电路等。In a possible implementation, the virtualization device 1000 may further include: a communication unit 1003, which is used to instruct the virtualization device 1000 to perform steps such as sending and receiving data. The communication unit 1003 may be an input or output interface, a pin or a circuit.
可能的实施例中,虚化装置1000还可以包括:存储单元1004。处理单元1002、存储单元1004通过线路相连。存储单元1004可以包括一个或者多个存储器,存储器可以是一个或者多个设备、电路中用于存储程序或者数据的器件。存储单元1004可以独立存在,通过通信线路与虚化装置1000具有的处理单元1002相连。存储单元1004也可以和处理单元1002集成在一起。In a possible embodiment, the virtualization device 1000 may further include: a storage unit 1004. The processing unit 1002 and the storage unit 1004 are connected via a line. The storage unit 1004 may include one or more memories, and the memory may be a device used to store programs or data in one or more devices or circuits. The storage unit 1004 may exist independently and be connected to the processing unit 1002 of the virtualization device 1000 via a communication line. The storage unit 1004 may also be integrated with the processing unit 1002.
存储单元1004可以存储终端设备中的方法的计算机执行指令,以使处理单元1002执行上述实施例中的方法。存储单元1004可以是寄存器、缓存或者RAM等,存储单元1004可以和处理单元1002集成在一起。存储单元1004可以是只读存储器(read-only memory,ROM)或者可存储静态信息和指令的其他类型的静态存储设备,存储单元1004可以与处理单元1002相独立。The storage unit 1004 can store computer-executable instructions of the method in the terminal device so that the processing unit 1002 executes the method in the above embodiment. The storage unit 1004 can be a register, a cache, or a RAM, etc. The storage unit 1004 can be integrated with the processing unit 1002. The storage unit 1004 can be a read-only memory (ROM) or other types of static storage devices that can store static information and instructions, and the storage unit 1004 can be independent of the processing unit 1002.
图11为本申请实施例提供的另一种终端设备的硬件结构示意图,如图11所示,该终端设备包括处理器1101,通信线路1104以及至少一个通信接口(图11中示例性的以通信接口1103为例进行说明)。Figure 11 is a schematic diagram of the hardware structure of another terminal device provided in an embodiment of the present application. As shown in Figure 11, the terminal device includes a processor 1101, a communication line 1104 and at least one communication interface (communication interface 1103 is used as an example in Figure 11).
处理器1101可以是一个通用中央处理器(central processing unit,CPU),微处理器,特定应用集成电路(application-specific integrated circuit,ASIC),或一个或多个用于控制本申请方案程序执行的集成电路。The processor 1101 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the program of the present application.
通信线路1104可包括在上述组件之间传送信息的电路。Communications link 1104 may include circuitry to transmit information between the above-described components.
通信接口1103,使用任何收发器一类的装置,用于与其他设备或通信网络通信,如以太网,无线局域网(wireless local area networks,WLAN)等。The communication interface 1103 uses any transceiver or other device for communicating with other devices or communication networks, such as Ethernet, wireless local area networks (WLAN), etc.
可能的,该终端设备还可以包括存储器1102。Possibly, the terminal device may further include a memory 1102 .
存储器1102可以是只读存储器(read-only memory,ROM)或可存储静态信息和指令的其他类型的静态存储设备,随机存取存储器(random access memory,RAM)或者可存储信息和指令的其他类型的动态存储设备,也可以是电可擦可编程只读存储器(electrically erasable programmable read-only memory,EEPROM)、只读光盘(compactdisc read-only memory,CD-ROM)或其他光盘存储、光碟存储(包括压缩光碟、激光碟、光碟、数字通用光碟、蓝光光碟等)、磁盘存储介质或者其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的期望的程序代码并能够由计算机存取的任何其他介质,但不限于此。存储器可以是独立存在,通过通信线路1104与处理器相连接。存储器也可以和处理器集成在一起。The memory 1102 may be a read-only memory (ROM) or other types of static storage devices that can store static information and instructions, a random access memory (RAM) or other types of dynamic storage devices that can store information and instructions, or an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compressed optical disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store the desired program code in the form of instructions or data structures and can be accessed by a computer, but is not limited thereto. The memory may exist independently and be connected to the processor via a communication line 1104. The memory may also be integrated with the processor.
其中,存储器1102用于存储执行本申请方案的计算机执行指令,并由处理器1101来控制执行。处理器1101用于执行存储器1102中存储的计算机执行指令,从而实现本申请实施例所提供的方法。The memory 1102 is used to store computer-executable instructions for executing the solution of the present application, and the execution is controlled by the processor 1101. The processor 1101 is used to execute the computer-executable instructions stored in the memory 1102, thereby implementing the method provided by the embodiment of the present application.
可能的,本申请实施例中的计算机执行指令也可以称之为应用程序代码,本申请实施例对此不作具体限定。Possibly, the computer-executable instructions in the embodiments of the present application may also be referred to as application code, and the embodiments of the present application do not specifically limit this.
在具体实现中,作为一种实施例,处理器1101可以包括一个或多个CPU,例如图11中的CPU0和CPU1。In a specific implementation, as an embodiment, the processor 1101 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 11 .
在具体实现中,作为一种实施例,终端设备可以包括多个处理器,例如图11中的处理器1101和处理器1105。这些处理器中的每一个可以是一个单核(single-CPU)处理器,也可以是一个多核(multi-CPU)处理器。这里的处理器可以指一个或多个设备、电路、和/或用于处理数据(例如计算机程序指令)的处理核。In a specific implementation, as an embodiment, the terminal device may include multiple processors, such as processor 1101 and processor 1105 in FIG. 11. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. The processor here may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包括一个或多个可用介质集成的服务器、数据中心等数据存储设备。例如,可用介质可以包括磁性介质(例如,软盘、硬盘或磁带)、光介质(例如,数字通用光盘(digital versatile disc,DVD))、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the process or function according to the embodiment of the present application is generated in whole or in part. The computer can be a general-purpose computer, a special-purpose computer, a computer network or other programmable device. The computer instructions can be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions can be transmitted from one website site, computer, server or data center to another website site, computer, server or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (digital subscriber line, DSL) or wireless (e.g., infrared, wireless, microwave, etc.) mode. The computer-readable storage medium can be any available medium that a computer can store or a data storage device such as a server or data center that includes one or more available media integrated. For example, the available medium can include a magnetic medium (e.g., a floppy disk, a hard disk or a tape), an optical medium (e.g., a digital versatile disc (digital versatile disc, DVD)), or a semiconductor medium (e.g., a solid state drive (solid state disk, SSD)), etc.
本申请实施例还提供了一种计算机可读存储介质。上述实施例中描述的方法可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。计算机可读介质可以包括计算机存储介质和通信介质,还可以包括任何可以将计算机程序从一个地方传送到另一个地方的介质。存储介质可以是可由计算机访问的任何目标介质。The present application also provides a computer-readable storage medium. The methods described in the above embodiments can be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may also include any medium that can transfer a computer program from one place to another. The storage medium may be any target medium that can be accessed by a computer.
作为一种可能的设计,计算机可读介质可以包括紧凑型光盘只读储存器(compactdisc read-only memory,CD-ROM)、RAM、ROM、EEPROM或其它光盘存储器;计算机可读介质可以包括磁盘存储器或其它磁盘存储设备。而且,任何连接线也可以被适当地称为计算机可读介质。例如,如果使用同轴电缆,光纤电缆,双绞线,DSL或无线技术(如红外,无线电和微波)从网站,服务器或其它远程源传输软件,则同轴电缆,光纤电缆,双绞线,DSL或诸如红外,无线电和微波之类的无线技术包括在介质的定义中。如本文所使用的磁盘和光盘包括光盘(CD),激光盘,光盘,数字通用光盘(digital versatile disc,DVD),软盘和蓝光盘,其中磁盘通常以磁性方式再现数据,而光盘利用激光光学地再现数据。As a possible design, the computer readable medium may include a compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM or other optical disc storage; the computer readable medium may include a magnetic disk storage or other magnetic disk storage device. Moreover, any connection line may also be appropriately referred to as a computer readable medium. For example, if the software is transmitted from a website, server or other remote source using a coaxial cable, fiber optic cable, twisted pair, DSL or wireless technology (such as infrared, radio and microwave), the coaxial cable, fiber optic cable, twisted pair, DSL or wireless technology such as infrared, radio and microwave are included in the definition of medium. Disk and disc as used herein include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while optical discs reproduce data optically using lasers.
上述的组合也应包括在计算机可读介质的范围内。以上,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以权利要求的保护范围为准。The above combinations should also be included in the scope of computer-readable media. The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any technician familiar with the technical field can easily think of changes or substitutions within the technical scope disclosed by the present invention, which should be included in the protection scope of the present invention. Therefore, the protection scope of the present invention shall be based on the protection scope of the claims.