CN111061372B - Equipment control method and related equipment - Google Patents

Equipment control method and related equipment

Info

Publication number
CN111061372B
CN111061372B
Authority
CN
China
Prior art keywords
touch signal
user interface
glasses
floating window
key
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911310496.1A
Other languages
Chinese (zh)
Other versions
CN111061372A (en)
Inventor
吴恒刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911310496.1A
Publication of CN111061372A
Application granted
Publication of CN111061372B
Status: Active
Anticipated expiration


Abstract

Translated from Chinese


Figure 201911310496

The present application discloses a device control method and related devices, applied to AR glasses. The AR glasses include a Bluetooth module, a display, and an eye-tracking lens arranged on one side of the display. The method includes: displaying a user interface on the display; determining, through the eye-tracking lens, a first gaze position where the user's eyes gaze at the user interface; receiving, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses; and performing a control operation based on the first gaze position and the first touch signal. With the embodiments of the present application, the AR glasses can be controlled conveniently.


Description

Translated from Chinese
Equipment control method and related equipment

Technical Field

The present application relates to the field of augmented reality technology, and in particular to a device control method and related devices.

Background

A wearable display device is a general term for a display device produced by applying wearable technology to the intelligent design of everyday wearables. Augmented reality (AR) glasses are one such wearable device: after putting them on, the user sees virtual information superimposed on the real environment. As AR glasses are used more and more frequently, how to control them conveniently has become a pressing technical problem.

Summary

Embodiments of the present application provide a device control method and related devices for conveniently controlling AR glasses.

In a first aspect, an embodiment of the present application provides a device control method applied to augmented reality (AR) glasses, the AR glasses including a Bluetooth module, a display, and an eye-tracking lens arranged on one side of the display. The method includes:

displaying a user interface on the display;

determining, through the eye-tracking lens, a first gaze position where the user's eyes gaze at the user interface, and receiving, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses; and

performing a control operation based on the first gaze position and the first touch signal.

In a second aspect, an embodiment of the present application provides a device control apparatus applied to augmented reality (AR) glasses, the AR glasses including a Bluetooth module, a display, and an eye-tracking lens arranged on one side of the display. The apparatus includes:

a display unit, configured to display a user interface on the display;

a gaze position determining unit, configured to determine, through the eye-tracking lens, a first gaze position where the user's eyes gaze at the user interface;

a communication unit, configured to receive, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses; and

a control unit, configured to perform a control operation based on the first gaze position and the first touch signal.

In a third aspect, an embodiment of the present application provides AR glasses including a processor, a memory, a Bluetooth module, a display, an eye-tracking lens, a wireless communication module, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and include instructions for performing the steps of the method described in the first aspect.

In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements some or all of the steps of the method described in the first aspect.

In a fifth aspect, an embodiment of the present application provides a computer program product comprising a non-transitory computer-readable storage medium that stores a computer program operable to cause a computer to perform some or all of the steps of the method described in the first aspect. The computer program product may be a software installation package.

It can be seen that, in the embodiments of the present application, the AR glasses first display a user interface on the display, then determine the gaze position where the user's eyes gaze at that user interface and receive, through the Bluetooth module, a touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. Eye tracking provides fine-grained interaction and the touch ring provides convenient input, so the AR glasses can be controlled both conveniently and precisely.

Brief Description of the Drawings

To explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those of ordinary skill in the art can derive other drawings from them without creative effort.

FIG. 1A is a schematic structural diagram of AR glasses provided by an embodiment of the present application;

FIG. 1B is a schematic structural diagram of other AR glasses provided by an embodiment of the present application;

FIG. 1C is a schematic structural diagram of other AR glasses provided by an embodiment of the present application;

FIG. 1D is a schematic diagram of a user interface provided by an embodiment of the present application;

FIG. 1E is a schematic structural diagram of a touch ring provided by an embodiment of the present application;

FIG. 2A is a schematic flowchart of a device control method provided by an embodiment of the present application;

FIGS. 2B-2W are schematic diagrams of other user interfaces provided by embodiments of the present application;

FIG. 3 is a schematic structural diagram of other AR glasses provided by an embodiment of the present application;

FIG. 4 is a schematic structural diagram of a device control apparatus provided by an embodiment of the present application.

Detailed Description

To enable those skilled in the art to better understand the solution of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the scope of protection of the present application.

Each aspect is described in detail below.

The terms "first", "second", "third", and "fourth" in the specification, claims, and drawings of the present application are used to distinguish different objects rather than to describe a specific order. Furthermore, the terms "include" and "have", and any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, product, or device comprising a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units that are not listed, or optionally also includes other steps or units inherent to such a process, method, product, or device.

Reference herein to an "embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are they separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art understand, both explicitly and implicitly, that the embodiments described herein can be combined with other embodiments.

Please refer to FIGS. 1A-1C, which are schematic structural diagrams of AR glasses provided by an embodiment of the present application. The AR glasses include an AR glasses main body 101, a main camera 102, a plurality of first infrared tracking light-emitting diodes (LEDs) 103, a first eye-tracking lens 104, a plurality of second infrared tracking LEDs 105, a second eye-tracking lens 106, a left-eye display 107, and a right-eye display 108.

Optionally, the main camera 102, the first infrared tracking LEDs 103, the first eye-tracking lens 104, the second infrared tracking LEDs 105, the second eye-tracking lens 106, the left-eye display 107, and the right-eye display 108 are all fixed on the AR glasses main body 101. The first infrared tracking LEDs 103 are arranged along the periphery of the left-eye display 107, and the first eye-tracking lens 104 is arranged on one side of the left-eye display 107. The second infrared tracking LEDs 105 are arranged along the periphery of the right-eye display 108, and the second eye-tracking lens 106 is arranged on one side of the right-eye display 108.

Optionally, the AR glasses are further provided internally with a processor, an image information processing module, a memory, a Bluetooth module, an eye-tracking processing module, a wireless communication module, and the like. The image information processing module and the eye-tracking processing module may be independent of the processor or integrated into it. The AR glasses can communicate with wireless communication devices (such as smartphones and tablet computers) through the wireless communication module.

The AR glasses can be worn on the user's head so that the user can clearly see virtual information superimposed on the real world. The main camera 102 is used for photographing and video recording: once enabled, it collects external light 109 and converts it into digital information, which the image information processing module of the AR glasses converts into visible digital image information. Taking the right eye as an example, this image information is projected as light by a micro-projector 110 onto the right-eye display 108, which changes the direction of the light and finally projects it into the field of view of the user's right eye. There are two micro-projectors 110, both fixed on the AR glasses main body 101: one is arranged on one side of the left-eye display 107 and the other on one side of the right-eye display 108.

The first eye-tracking lens 104 tracks the gaze direction and gaze position of the user's left eye in real time, and the second eye-tracking lens 106 does the same for the user's right eye. After the user puts the AR glasses on correctly, the first infrared tracking LEDs 103 and the second infrared tracking LEDs 105 project infrared light onto the user's eyeballs. Taking the right eye as an example, the infrared light forms glints on the cornea of the right eye; the second eye-tracking lens 106 captures the infrared glints reflected by the cornea in real time and converts them into digital information, from which the eye-tracking processing module derives the user's gaze direction and the gaze position within the user interface. The eye-tracking processing module can also detect the user's blinks from the order in which the infrared glints disappear and reappear.

Please refer to FIG. 1D, which is a schematic diagram of a user interface provided by an embodiment of the present application. The image information processing module displays the processed image information inside the viewfinder frame of the user interface. The cursor shown in the viewfinder frame marks the gaze position on the user interface computed by the eye-tracking module. When the cursor enters the hot zone of an object in the user interface (such as a video button, a camera button, or a virtual information button), the appearance of that hot zone changes (for example, its border size, or its border color and thickness), and the user can then complete the corresponding interaction through the touch ring paired with the AR glasses over Bluetooth.

Please refer to FIG. 1E, which is a schematic structural diagram of a touch ring provided by an embodiment of the present application. The touch ring can be worn on the second joint of the index finger of either hand, and the user interacts by touching the ring's touch area with the thumb. A Bluetooth module is arranged inside the touch ring, and the touch area unfolds into a rectangular plane. The capacitive touch sensor in the touch area records touch operations in real time and transmits the resulting touch signals through the ring's Bluetooth module to the Bluetooth module of the AR glasses. Examples of touch signals include:

(1) Tap signal: a single touch point lasting less than 500 milliseconds;

(2) Long-press signal: a single touch point lasting 500 milliseconds or more;

(3) Right-swipe signal: a single touch point with a horizontal rightward displacement;

(4) Left-swipe signal: a single touch point with a horizontal leftward displacement;

(5) Up-swipe signal: a single touch point with a vertical upward displacement;

(6) Down-swipe signal: a single touch point with a vertical downward displacement;

(7) Two-finger pinch signal: two touch points are detected whose displacement gradually brings them closer together;

(8) Two-finger spread signal: two touch points are detected whose displacement gradually moves them farther apart.
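The eight signal types above can be distinguished from raw sensor samples with a small classifier. The sketch below is a hypothetical illustration, not code from the patent: the 5-pixel movement threshold, the coordinate convention (y grows downward), and all names are assumptions; only the 500 ms tap/long-press boundary and the gesture definitions come from the list above.

```python
from math import hypot

TAP_MAX_MS = 500        # boundary between signals (1) and (2)
MOVE_THRESHOLD = 5      # pixels; below this a touch counts as stationary (assumed value)

def classify_touch(samples):
    """Classify a touch trace into one of the ring's signal types.

    samples: chronological list of (t_ms, x, y, finger_id) tuples
    reported by the capacitive touch sensor.
    """
    fingers = sorted({s[3] for s in samples})
    if len(fingers) == 2:
        # Signals (7)/(8): compare the initial and final inter-finger distance.
        def gap(ordered):
            first_seen = {}
            for _, x, y, f in ordered:
                first_seen.setdefault(f, (x, y))
            (xa, ya), (xb, yb) = first_seen[fingers[0]], first_seen[fingers[1]]
            return hypot(xb - xa, yb - ya)
        return "pinch" if gap(reversed(samples)) < gap(samples) else "spread"
    # Single-finger signals (1)-(6): look at total displacement and duration.
    t0, x0, y0, _ = samples[0]
    t1, x1, y1, _ = samples[-1]
    dx, dy = x1 - x0, y1 - y0
    if hypot(dx, dy) < MOVE_THRESHOLD:
        return "tap" if t1 - t0 < TAP_MAX_MS else "long_press"
    if abs(dx) >= abs(dy):          # horizontal displacement dominates
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_up" if dy < 0 else "swipe_down"   # screen y increases downward
```

A real ring firmware would classify incrementally as samples arrive rather than after the trace ends; the batch form is used here only to keep the mapping from gesture to signal type readable.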

Please refer to FIG. 2A, which is a schematic flowchart of a device control method provided by an embodiment of the present application and applied to the AR glasses described above. The method includes:

Step 201: Display a user interface on the display.

The user interface is the user interface of the AR glasses in a camera mode; camera modes include, for example, a photographing mode and a video recording mode.

Step 202: Determine, through the eye-tracking lens, a first gaze position where the user's eyes gaze at the user interface, and receive, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses.

In one implementation of the present application, determining, through the eye-tracking lens, the first gaze position where the user's eyes gaze at the user interface includes:

capturing, through the eye-tracking lens, infrared glints reflected by the eye, the glints being formed after the plurality of infrared tracking LEDs project infrared light onto the eye; and

determining, based on the infrared glints, the gaze position where the eye gazes at the user interface.

It should be noted that the specific way of determining the gaze position from the infrared glints is existing eye-tracking technology and is not described further here.

Step 203: Perform a control operation based on the first gaze position and the first touch signal.

In one implementation of the present application, the AR glasses are in a camera mode and the user interface includes a viewfinder frame. Performing a control operation based on the first gaze position and the first touch signal includes: if the first gaze position is within the viewfinder frame and the first touch signal is a tap signal, focusing on the first gaze position; after focusing on the first gaze position, the user interface displays a focus button and an exposure scale, as shown in FIG. 2B.
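The core of step 203 is a hit test of the gaze point against a UI region combined with the signal type. A minimal sketch of that dispatch (the `Rect` type, the function names, and the return convention are all invented for illustration; the patent does not specify data structures):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        """True when the point (px, py) lies inside this rectangle."""
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def handle_input(gaze, signal, viewfinder):
    """Dispatch one (gaze position, touch signal) pair, per step 203.

    Focus at the gaze point when the gaze is inside the viewfinder
    frame and the ring reports a tap; otherwise do nothing. After the
    focus operation the UI would also show the focus button and the
    exposure scale (FIG. 2B), which is outside this sketch.
    """
    if viewfinder.contains(*gaze) and signal == "tap":
        return ("focus", gaze)
    return ("no_op", None)
```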

It should be noted that when the AR glasses are in a camera mode and the user interface includes a viewfinder frame, if the first gaze position is within the viewfinder frame and the user blinks both eyes once, the first gaze position is likewise focused on; after focusing, the user interface displays the focus button and the exposure scale.

In addition, after the user interface displays the focus button and the exposure scale, if no operation occurs within a first duration, the focus button and the exposure scale are removed. The first duration is, for example, 2000 ms, 3000 ms, or another value.

In one implementation of the present application, after focusing on the first gaze position, the method further includes:

if a fourth touch signal sent by the touch ring is received through the Bluetooth module within the first duration and the fourth touch signal is a right-swipe signal, lowering the exposure based on the right-swipe signal, as shown in FIG. 2C. How much the exposure is lowered is determined by the slide distance of the right swipe: the larger the slide distance, the greater the decrease, and the smaller the slide distance, the smaller the decrease.

In one implementation of the present application, after focusing on the first gaze position, the method further includes:

if a fourth touch signal sent by the touch ring is received through the Bluetooth module within the first duration and the fourth touch signal is a left-swipe signal, raising the exposure based on the left-swipe signal, as shown in FIG. 2D. How much the exposure is raised is determined by the slide distance of the left swipe: the larger the slide distance, the greater the increase, and the smaller the slide distance, the smaller the increase.
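In both exposure implementations above, the swipe direction picks the sign of the change and the slide distance scales its magnitude. A minimal sketch, assuming a linear mapping (`EV_PER_PIXEL` is an invented gain; the patent only states that a larger slide distance means a larger change):

```python
EV_PER_PIXEL = 0.01  # assumed linear gain, not specified in the patent

def exposure_delta(signal, slide_px):
    """Map a horizontal swipe to an exposure change.

    A right swipe lowers exposure and a left swipe raises it; the
    magnitude grows with the slide distance (slide_px >= 0). Other
    signals leave the exposure unchanged.
    """
    if signal == "swipe_right":
        return -EV_PER_PIXEL * slide_px
    if signal == "swipe_left":
        return EV_PER_PIXEL * slide_px
    return 0.0
```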

In one implementation of the present application, the AR glasses are in a camera mode and the user interface includes a viewfinder frame. Performing a control operation based on the first gaze position and the first touch signal includes:

if the first gaze position is within the viewfinder frame and the first touch signal is a two-finger pinch signal, zooming the view out based on the pinch signal; while the view is being zoomed out, the user interface displays a zoom scale, as shown in FIG. 2E. How far the view is zoomed out is determined by the distance between the two fingers: the smaller the distance, the greater the zoom-out, and the larger the distance, the smaller the zoom-out.

In one implementation of the present application, the AR glasses are in a camera mode and the user interface includes a viewfinder frame. Performing a control operation based on the first gaze position and the first touch signal includes:

if the first gaze position is within the viewfinder frame and the first touch signal is a two-finger spread signal, zooming the view in based on the spread signal; while the view is being zoomed in, the user interface displays a zoom scale, as shown in FIG. 2F. How far the view is zoomed in is determined by the distance between the two fingers: the smaller the distance, the smaller the zoom-in, and the larger the distance, the greater the zoom-in.
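Both pinch gestures can drive a single zoom update in which the ratio of the final to the initial two-finger distance scales the current zoom: fingers moving apart (ratio greater than 1) zoom in, fingers moving together (ratio less than 1) zoom out. A sketch under assumed limits (the 1x-10x range and the function name are not from the patent):

```python
def apply_pinch_zoom(current_zoom, start_gap, end_gap,
                     min_zoom=1.0, max_zoom=10.0):
    """Scale the zoom by the change in two-finger distance.

    start_gap / end_gap: distance between the two touch points at the
    start and end of the gesture. The result is clamped to assumed
    zoom limits.
    """
    if start_gap <= 0:
        return current_zoom          # degenerate gesture; keep the zoom
    new_zoom = current_zoom * (end_gap / start_gap)
    return max(min_zoom, min(max_zoom, new_zoom))
```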

In one implementation of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder frame, and a zoom factor icon is displayed in the viewfinder frame. Performing a control operation based on the first gaze position and the first touch signal includes:

if the first gaze position is within the boundary of the zoom factor icon and the first touch signal is an up-swipe signal, zooming the view in based on the up-swipe signal; while the view is being zoomed in, the user interface displays a zoom scale, as shown in FIG. 2G. How far the view is zoomed in is determined by the slide distance of the up swipe: the smaller the slide distance, the smaller the zoom-in, and the larger the slide distance, the greater the zoom-in.

In one implementation of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder frame, and a zoom factor icon is displayed in the viewfinder frame. Performing a control operation based on the first gaze position and the first touch signal includes:

if the first gaze position is within the boundary of the zoom factor icon and the first touch signal is a down-swipe signal, zooming the view out based on the down-swipe signal; while the view is being zoomed out, the user interface displays a zoom scale, as shown in FIG. 2H. How far the view is zoomed out is determined by the slide distance of the down swipe: the smaller the slide distance, the smaller the zoom-out, and the larger the slide distance, the greater the zoom-out.

In one implementation of the present application, the AR glasses support multiple camera modes arranged in a fixed order and are currently in a first camera mode, and the user interface includes a viewfinder frame. Performing a control operation based on the first gaze position and the first touch signal includes: if the first gaze position is within the viewfinder frame and the first touch signal is a down-swipe signal, switching to a second camera mode that is adjacent to the first camera mode and arranged above it; and

if the first gaze position is within the viewfinder frame and the first touch signal is an up-swipe signal, switching to a third camera mode that is adjacent to the first camera mode and arranged below it.

For example, assuming the second camera mode is the video recording mode and the third camera mode is camera mode A, the corresponding schematic diagram is shown in FIG. 2I.
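With the modes kept in an ordered list (top to bottom), the vertical swipes above reduce to stepping through neighbouring entries. A sketch that assumes the FIG. 2I example ordering (video recording above the photographing mode, camera mode A below it); the list contents and names are illustrative:

```python
# Assumed top-to-bottom ordering, following the FIG. 2I example.
MODES_TOP_TO_BOTTOM = ["video", "photo", "camera_mode_a"]

def switch_mode(current, signal):
    """Step to the adjacent camera mode selected by a vertical swipe.

    A down swipe selects the mode arranged above the current one and
    an up swipe the mode arranged below it; at either end of the list,
    or for any other signal, the current mode is kept.
    """
    i = MODES_TOP_TO_BOTTOM.index(current)
    if signal == "swipe_down" and i > 0:
        return MODES_TOP_TO_BOTTOM[i - 1]
    if signal == "swipe_up" and i + 1 < len(MODES_TOP_TO_BOTTOM):
        return MODES_TOP_TO_BOTTOM[i + 1]
    return current
```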

在本申请的一实现方式中,所述AR眼镜处于相机模式,所述AR眼镜支持多种相机模式,所述AR眼镜当前处于第一相机模式,所述用户设备包括拍照按键,所述基于所述第一注视位置和所述第一触控信号进行控制操作,包括:In an implementation manner of the present application, the AR glasses are in camera mode, the AR glasses support multiple camera modes, the AR glasses are currently in the first camera mode, the user equipment includes a camera button, and the performing control operations on the first gaze position and the first touch signal, including:

若所述第一注视位置位于所述拍照按键边界内,且所述第一触控信号为轻触信号或下滑信号,则切换至第四相机模式,所述第四相机模式与所述第一相机模式相邻且排列在所述第一相机模式之上;If the first gaze position is located within the boundary of the camera button, and the first touch signal is a light touch signal or a slide signal, then switch to the fourth camera mode, and the fourth camera mode is the same as the first the camera modes are adjacent and arranged above the first camera mode;

若所述第一注视位置位于所述拍照按键边界内,且所述第一触控信号为轻触信号或上滑信号,则切换至第五相机模式,所述第五相机模式与所述第一相机模式相邻且排列在所述第一相机模式之下;If the first gaze position is located within the boundary of the camera button and the first touch signal is a light touch signal or an upward slide signal, switching to a fifth camera mode, where the fifth camera mode is adjacent to the first camera mode and is arranged below the first camera mode;

若所述第一注视位置位于所述拍照按键边界内,且所述第一触控信号为轻触信号或连续下滑信号,则切换至更多功能模式,在所述更多功能模式下,所述用户界面显示多种相机模式的图标,以供用户选择。If the first gaze position is located within the boundary of the camera button and the first touch signal is a light touch signal or a continuous downward slide signal, switching to a more-functions mode, in which the user interface displays icons of multiple camera modes for the user to select.

其中,假设第四相机模式为录像模式、第五相机模式为相机模式A,那么具体示意图如图2J所示。Wherein, assuming that the fourth camera mode is video recording mode and the fifth camera mode is camera mode A, the specific schematic diagram is shown in FIG. 2J .

在本申请的一实现方式中,所述AR眼镜处于拍照模式,所述用户界面包括取景框,所述基于所述第一注视位置和所述第一触控信号进行控制操作,包括:In an implementation manner of the present application, the AR glasses are in the photographing mode, the user interface includes a viewfinder frame, and the control operation based on the first gaze position and the first touch signal includes:

若所述用户界面包括拍照按键、所述第一注视位置位于所述拍照按键边界内、且所述第一触控信号为轻触信号,则进行现实拍照,具体如图2K所示;If the user interface includes a camera button, the first gaze position is located within the boundary of the camera button, and the first touch signal is a light touch signal, then take a real photo, as shown in Figure 2K;

若所述用户界面包括对焦按键、所述第一注视位置位于所述对焦按键内、且所述第一触控信号为轻触信号,则进行现实拍照,具体如图2L所示;If the user interface includes a focus button, the first gaze position is located in the focus button, and the first touch signal is a light touch signal, take a real photo, as shown in Figure 2L;

若所述用户界面包括虚拟信息按键、所述第一注视位置位于所述虚拟信息按键边界内,且所述第一触控信号为轻触信号,则开启混合现实拍照功能,在所述混合现实拍照功能下,在所述取景框内显示叠加在现实场景中的AR虚拟信息,以及在所述混合现实拍照功能下,拍照得到的图像信息为叠加所述AR虚拟信息的现实场景图像信息,具体如图2M所示;If the user interface includes a virtual information button, the first gaze position is located within the boundary of the virtual information button, and the first touch signal is a light touch signal, the mixed reality photographing function is enabled; under the mixed reality photographing function, AR virtual information superimposed on the real scene is displayed in the viewfinder frame, and the image information obtained by photographing is real-scene image information on which the AR virtual information is superimposed, as shown in FIG. 2M;

若所述用户界面包括浮窗模式按键、所述第一注视位置位于所述浮窗模式按键边界内、且所述第一触控信号为轻触信号,则开启浮窗模式拍照功能,在所述浮窗模式拍照功能下,在所述用户界面内显示第一浮窗,所述第一浮窗包括缩小后的所述取景框和拍照快门按键,具体如图2N所示。If the user interface includes a floating window mode button, the first gaze position is located within the boundary of the floating window mode button, and the first touch signal is a light touch signal, the floating window mode photographing function is enabled; under the floating window mode photographing function, a first floating window is displayed in the user interface, and the first floating window includes the reduced viewfinder frame and a photographing shutter button, as shown in FIG. 2N.

需要说明的是,现实拍照指的是通过主摄像头对AR眼镜外界环境进行拍照,得到的图像为现实场景图像。It should be noted that real-world photography refers to taking pictures of the external environment of the AR glasses through the main camera, and the obtained images are images of real scenes.
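上述拍照模式下的各分支可以归纳为"注视位置位于某按键边界内且触控信号为轻触信号,则执行对应操作"的分发逻辑,示意如下,其中按键坐标与名称均为示例性假设:The branches in photographing mode above reduce to a dispatch of "if the gaze position lies within a button boundary and the touch signal is a light touch signal, perform the corresponding operation", sketched below; button coordinates and names are illustrative assumptions:

```python
def in_bounds(pos, rect):
    """注视位置是否位于按键边界内 / whether the gaze position lies within
    a button boundary given as (left, top, right, bottom)."""
    x, y = pos
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

# 示例性按键边界及对应操作 / illustrative button boundaries and operations
BUTTONS = {
    "拍照按键": ((100, 500, 160, 560), "现实拍照"),
    "虚拟信息按键": ((200, 500, 260, 560), "开启混合现实拍照功能"),
    "浮窗模式按键": ((300, 500, 360, 560), "开启浮窗模式拍照功能"),
}

def dispatch(gaze, signal):
    """轻触信号且注视位置落入某按键边界内时,返回该按键对应的操作 /
    return the operation whose button boundary contains the gaze position."""
    if signal != "轻触":
        return None
    for rect, action in BUTTONS.values():
        if in_bounds(gaze, rect):
            return action
    return None
```

非轻触信号或注视位置不在任何按键边界内时不触发操作。No operation is triggered for other signals or when the gaze falls outside every button boundary.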

在本申请的一实现方式中,所述开启浮窗模式拍照功能之后,所述方法还包括:In an implementation manner of the present application, after the function of taking pictures in floating window mode is turned on, the method further includes:

通过所述人眼追踪镜头确定人眼注视所述用户界面的第二注视位置,以及通过所述蓝牙模块接收所述触控指环发送的第二触控信号;Using the human eye tracking lens to determine a second gaze position where the human eye gazes at the user interface, and receiving a second touch signal sent by the touch ring through the Bluetooth module;

若所述第二注视位置位于所述取景框内、且所述第二触控信号为轻触信号,则关闭所述浮窗模式拍照功能,具体如图2O所示;If the second gaze position is located in the viewfinder frame and the second touch signal is a light touch signal, then turn off the floating window mode photographing function, as shown in FIG. 2O;

若所述第二注视位置位于所述拍照快门按键边界内、且所述第二触控信号为轻触信号,则进行现实拍照,具体如图2P所示;If the second gaze position is located within the boundary of the photographing shutter button and the second touch signal is a light touch signal, a real photo is taken, as shown in FIG. 2P;

若所述第二注视位置位于所述第一浮窗内,且所述第二触控信号为长按信号,则基于人眼注视点移动所述第一浮窗,具体如图2Q所示。If the second gaze position is located in the first floating window, and the second touch signal is a long press signal, the first floating window is moved based on the gaze point of human eyes, as shown in FIG. 2Q .

其中,所述第二注视位置位于所述第一浮窗内表示所述第二注视位置位于所述拍照快门按键边界内,或所述第二注视位置位于所述取景框边界内。Here, the second gaze position being located in the first floating window means that the second gaze position is located within the boundary of the photographing shutter button, or the second gaze position is located within the boundary of the viewfinder frame.
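其中"基于人眼注视点移动浮窗"可以理解为:长按开始时记录注视点,随后浮窗随注视点的位移平移,示意如下,坐标均为示例性假设:Here, "moving the floating window based on the gaze point" may be understood as recording the gaze point when the long press begins and then translating the window by the gaze displacement; a sketch follows, with illustrative coordinates:

```python
class FloatingWindow:
    """简化的浮窗:左上角坐标与宽高 / simplified floating window with a
    top-left corner and a size."""
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, pos):
        """注视位置是否位于浮窗内 / whether the gaze lies in the window."""
        px, py = pos
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def follow_gaze(window, press_gaze, current_gaze):
    """按长按开始后注视点的位移平移浮窗 / translate the window by the
    gaze displacement since the long press began."""
    window.x += current_gaze[0] - press_gaze[0]
    window.y += current_gaze[1] - press_gaze[1]
    return window
```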

在本申请的一实现方式中,所述AR眼镜处于录像模式,所述基于所述第一注视位置和所述第一触控信号进行控制操作,包括:In an implementation manner of the present application, the AR glasses are in the recording mode, and the control operation based on the first gaze position and the first touch signal includes:

若所述用户界面包括录像按键、所述第一注视位置位于所述录像按键边界内、且所述第一触控信号为轻触信号,则进行现实录像,具体如图2R所示;If the user interface includes a video recording button, the first gaze position is located within the boundary of the video recording button, and the first touch signal is a light touch signal, real video recording is performed, as shown in Figure 2R;

若所述用户界面包括虚拟信息按键、所述第一注视位置位于所述虚拟信息按键边界内,且所述第一触控信号为轻触信号,则开启混合现实录像功能,在所述混合现实录像功能下,在所述取景框内显示叠加在现实场景中的AR虚拟信息,以及在所述混合现实录像功能下,录制得到的视频信息为叠加所述AR虚拟信息的现实场景视频信息,具体如图2S所示;If the user interface includes a virtual information button, the first gaze position is located within the boundary of the virtual information button, and the first touch signal is a light touch signal, the mixed reality recording function is enabled; under the mixed reality recording function, AR virtual information superimposed on the real scene is displayed in the viewfinder frame, and the recorded video information is real-scene video information on which the AR virtual information is superimposed, as shown in FIG. 2S;

若所述AR眼镜正在录像、所述用户界面包括浮窗模式按键、所述第一注视位置位于所述浮窗模式按键边界内、且所述第一触控信号为轻触信号,则开启浮窗模式录像功能,在所述浮窗模式录像功能下,在所述用户界面内显示第二浮窗,所述第二浮窗包括缩小的已录制时长、所述取景框和暂停录制按键,具体如图2T所示。If the AR glasses are recording, the user interface includes a floating window mode button, the first gaze position is located within the boundary of the floating window mode button, and the first touch signal is a light touch signal, the floating window mode video recording function is enabled; under the floating window mode video recording function, a second floating window is displayed in the user interface, and the second floating window includes a reduced recorded duration, the viewfinder frame and a pause recording button, as shown in FIG. 2T.

在本申请的一实现方式中,所述开启浮窗模式录像功能之后,所述方法还包括:In an implementation manner of the present application, after the floating window mode recording function is enabled, the method further includes:

通过所述人眼追踪镜头确定人眼注视所述用户界面的第三注视位置,以及通过所述蓝牙模块接收所述触控指环发送的第三触控信号;Using the human eye tracking lens to determine a third gaze position where the human eye gazes at the user interface, and receiving a third touch signal sent by the touch ring through the Bluetooth module;

若所述第三注视位置位于所述取景框内、且所述第三触控信号为轻触信号,则关闭所述浮窗模式录像功能,具体如图2U所示;If the third gaze position is located in the viewfinder frame and the third touch signal is a light touch signal, then close the floating window mode video recording function, as shown in Figure 2U;

若所述第三注视位置位于所述暂停录制按键边界内、且所述第三触控信号为轻触信号,则暂停录像,具体如图2V所示;If the third gaze position is located within the boundary of the pause recording button and the third touch signal is a light touch signal, then pause the video recording, as shown in Figure 2V;

若所述第三注视位置位于所述第二浮窗内,且所述第三触控信号为长按信号,则基于人眼注视点移动所述第二浮窗,具体如图2W所示。If the third gaze position is located in the second floating window, and the third touch signal is a long press signal, the second floating window is moved based on the gaze point of human eyes, as shown in FIG. 2W .

其中,所述第三注视位置位于所述第二浮窗内表示所述第三注视位置位于所述已录制时长边界内,或所述第三注视位置位于所述取景框边界内,或所述第三注视位置位于所述暂停录制按键边界内。Here, the third gaze position being located in the second floating window means that the third gaze position is located within the boundary of the recorded duration, or within the boundary of the viewfinder frame, or within the boundary of the pause recording button.
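浮窗模式录像下对第三注视位置与第三触控信号的上述处理可归纳如下,其中区域与信号名称均为示例性假设:The above handling of the third gaze position and the third touch signal in floating-window recording mode may be summarized as follows; region and signal names are illustrative assumptions:

```python
def handle_recording_float(region, signal):
    """region 为注视位置所在区域 / the region containing the gaze position."""
    if signal == "轻触":
        if region == "取景框":
            return "关闭浮窗模式录像功能"
        if region == "暂停录制按键":
            return "暂停录像"
    # 已录制时长、取景框、暂停录制按键均位于第二浮窗内 /
    # these regions together make up the second floating window
    if signal == "长按" and region in ("已录制时长", "取景框", "暂停录制按键"):
        return "基于注视点移动第二浮窗"
    return None
```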

可以看出,在本申请实施例中,AR眼镜先在显示器上显示用户界面,然后确定人眼注视该用户界面的注视位置,以及通过蓝牙模块接收与AR眼镜配对的触控指环发送的触控信号,最后基于注视位置和触控信号进行控制操作,其中,眼动追踪实现交互的精细度,触控指环实现交互的便利性,进而实现了便捷精准地控制AR眼镜。It can be seen that, in the embodiments of the present application, the AR glasses first display a user interface on the display, then determine the gaze position at which human eyes gaze at the user interface and receive, through the Bluetooth module, the touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. Eye tracking provides fine-grained interaction and the touch ring provides convenient interaction, so that the AR glasses can be controlled conveniently and precisely.

请参见图3,图3是本申请实施例提供的一种AR眼镜的结构示意图,如图所示,该AR眼镜包括处理器、存储器、蓝牙模块、显示器、人眼追踪镜头、无线通信模块,所述人眼追踪镜头设于所述显示器的一侧;其中,上述一个或多个程序被存储在上述存储器中,并且被配置由上述处理器执行,上述程序包括用于执行以下步骤的指令:Referring to FIG. 3, FIG. 3 is a schematic structural diagram of AR glasses provided by an embodiment of the present application. As shown in the figure, the AR glasses include a processor, a memory, a Bluetooth module, a display, a human eye tracking lens and a wireless communication module, where the human eye tracking lens is arranged on one side of the display. One or more programs are stored in the memory and are configured to be executed by the processor, and the programs include instructions for performing the following steps:

在所述显示器上显示用户界面;displaying a user interface on the display;

通过所述人眼追踪镜头确定人眼注视所述用户界面的第一注视位置,以及通过所述蓝牙模块接收与所述AR眼镜配对的触控指环发送的第一触控信号;Using the human eye tracking lens to determine the first gaze position of the human eye on the user interface, and receiving the first touch signal sent by the touch ring paired with the AR glasses through the Bluetooth module;

基于所述第一注视位置和所述第一触控信号进行控制操作。A control operation is performed based on the first gaze position and the first touch signal.

可以看出,在本申请实施例中,AR眼镜先在显示器上显示用户界面,然后确定人眼注视该用户界面的注视位置,以及通过蓝牙模块接收与AR眼镜配对的触控指环发送的触控信号,最后基于注视位置和触控信号进行控制操作,其中,眼动追踪实现交互的精细度,触控指环实现交互的便利性,进而实现了便捷精准地控制AR眼镜。It can be seen that, in the embodiments of the present application, the AR glasses first display a user interface on the display, then determine the gaze position at which human eyes gaze at the user interface and receive, through the Bluetooth module, the touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. Eye tracking provides fine-grained interaction and the touch ring provides convenient interaction, so that the AR glasses can be controlled conveniently and precisely.

在本申请的一实现方式中,所述AR眼镜还包括多个红外追踪LED,所述多个红外追踪LED沿所述显示器的周侧设置,在通过所述人眼追踪镜头确定人眼注视所述用户界面的第一注视位置方面,上述程序包括具体用于执行以下步骤的指令:In an implementation manner of the present application, the AR glasses further include a plurality of infrared tracking LEDs arranged along the circumference of the display. In terms of determining, through the human eye tracking lens, the first gaze position at which human eyes gaze at the user interface, the above program includes instructions specifically for performing the following steps:

通过所述人眼追踪镜头捕捉人眼反射的红外光斑,所述红外光斑是所述多个红外追踪LED向人眼投射红外光后形成的;The infrared spot reflected by the human eye is captured by the human eye tracking lens, and the infrared spot is formed after the plurality of infrared tracking LEDs project infrared light to the human eye;

基于所述红外光斑确定人眼注视所述用户界面的第一注视位置。A first gaze position at which human eyes gaze at the user interface is determined based on the infrared light spot.
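基于红外光斑确定注视位置的一种可能实现是瞳孔中心-角膜反射法:取多个光斑的质心,与瞳孔中心做差,再经预先标定的映射换算为界面坐标。下述代码仅为示例性假设的示意,并非本申请限定的具体算法,标定系数亦为假设值:One possible implementation of determining the gaze position from the infrared light spots is the pupil-centre/corneal-reflection approach: take the centroid of the glints, subtract it from the pupil centre, and map the offset to interface coordinates through a pre-calibrated mapping. The code below is an illustrative assumption only, not the specific algorithm of the application, and the calibration coefficients are hypothetical:

```python
def glint_centroid(glints):
    """多个红外追踪LED形成的光斑质心 / centroid of the glints formed by
    the plurality of infrared tracking LEDs."""
    xs = [g[0] for g in glints]
    ys = [g[1] for g in glints]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def gaze_position(pupil, glints, calib=(1000.0, 800.0, 960.0, 540.0)):
    """pupil 为瞳孔中心;calib=(ax, ay, bx, by) 为示例性线性标定系数 /
    pupil is the pupil centre; calib holds illustrative linear
    calibration coefficients mapping the offset to screen coordinates."""
    gx, gy = glint_centroid(glints)
    ax, ay, bx, by = calib
    return (ax * (pupil[0] - gx) + bx, ay * (pupil[1] - gy) + by)
```

实际产品中通常需通过多点标定拟合该映射。In practice such a mapping is usually fitted through a multi-point calibration procedure.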

在本申请的一实现方式中,所述AR眼镜处于相机模式,所述用户界面包括取景框,在基于所述第一注视位置和所述第一触控信号进行控制操作方面,上述程序包括具体用于执行以下步骤的指令:In an implementation manner of the present application, the AR glasses are in camera mode, the user interface includes a viewfinder frame, and in terms of performing the control operation based on the first gaze position and the first touch signal, the above program includes instructions specifically for performing the following steps:

若所述第一注视位置位于所述取景框内,且所述第一触控信号为轻触信号,则对所述第一注视位置进行对焦,其中,在对所述第一注视位置进行对焦后,所述用户界面显示对焦按键和曝光刻度。If the first gaze position is located in the viewfinder frame and the first touch signal is a light touch signal, the first gaze position is focused on, where after the first gaze position is focused on, the user interface displays a focus button and an exposure scale.

在本申请的一实现方式中,所述AR眼镜处于拍照模式,所述用户界面包括取景框,在基于所述第一注视位置和所述第一触控信号进行控制操作方面,上述程序包括具体用于执行以下步骤的指令:In an implementation manner of the present application, the AR glasses are in photographing mode, the user interface includes a viewfinder frame, and in terms of performing the control operation based on the first gaze position and the first touch signal, the above program includes instructions specifically for performing the following steps:

若所述用户界面包括拍照按键、所述第一注视位置位于所述拍照按键边界内、且所述第一触控信号为轻触信号,则进行现实拍照;If the user interface includes a camera button, the first gaze position is located within the boundary of the camera button, and the first touch signal is a light touch signal, then take a real photo;

若所述用户界面包括对焦按键、所述第一注视位置位于所述对焦按键内、且所述第一触控信号为轻触信号,则进行现实拍照;If the user interface includes a focus button, the first gaze position is located in the focus button, and the first touch signal is a light touch signal, then take a real photo;

若所述用户界面包括虚拟信息按键、所述第一注视位置位于所述虚拟信息按键边界内,且所述第一触控信号为轻触信号,则开启混合现实拍照功能,在所述混合现实拍照功能下,在所述取景框内显示叠加在现实场景中的AR虚拟信息,以及在所述混合现实拍照功能下,拍照得到的图像信息为叠加所述AR虚拟信息的现实场景图像信息;If the user interface includes a virtual information button, the first gaze position is located within the boundary of the virtual information button, and the first touch signal is a light touch signal, the mixed reality photographing function is enabled; under the mixed reality photographing function, AR virtual information superimposed on the real scene is displayed in the viewfinder frame, and the image information obtained by photographing is real-scene image information on which the AR virtual information is superimposed;

若所述用户界面包括浮窗模式按键、所述第一注视位置位于所述浮窗模式按键边界内、且所述第一触控信号为轻触信号,则开启浮窗模式拍照功能,在所述浮窗模式拍照功能下,在所述用户界面内显示第一浮窗,所述第一浮窗包括缩小后的所述取景框和拍照快门按键。If the user interface includes a floating window mode button, the first gaze position is located within the boundary of the floating window mode button, and the first touch signal is a light touch signal, the floating window mode photographing function is enabled; under the floating window mode photographing function, a first floating window is displayed in the user interface, and the first floating window includes the reduced viewfinder frame and a photographing shutter button.

在本申请的一实现方式中,在开启浮窗模式拍照功能之后,上述程序包括还用于执行以下步骤的指令:In an implementation of the present application, after the function of taking pictures in the floating window mode is turned on, the above-mentioned program includes instructions for performing the following steps:

通过所述人眼追踪镜头确定人眼注视所述用户界面的第二注视位置,以及通过所述蓝牙模块接收所述触控指环发送的第二触控信号;Using the human eye tracking lens to determine a second gaze position where the human eye gazes at the user interface, and receiving a second touch signal sent by the touch ring through the Bluetooth module;

若所述第二注视位置位于所述取景框内、且所述第二触控信号为轻触信号,则关闭所述浮窗模式拍照功能;If the second gaze position is located in the viewfinder frame and the second touch signal is a light touch signal, then close the floating window mode photographing function;

若所述第二注视位置位于所述拍照快门按键边界内、且所述第二触控信号为轻触信号,则进行现实拍照;If the second gaze position is located within the boundary of the shutter button for taking photos, and the second touch signal is a light touch signal, then take a real photo;

若所述第二注视位置位于所述第一浮窗内,且所述第二触控信号为长按信号,则基于人眼注视点移动所述第一浮窗。If the second gaze position is located in the first floating window, and the second touch signal is a long press signal, the first floating window is moved based on the gaze point of human eyes.

在本申请的一实现方式中,所述AR眼镜处于录像模式,在基于所述第一注视位置和所述第一触控信号进行控制操作方面,上述程序包括具体用于执行以下步骤的指令:In an implementation manner of the present application, the AR glasses are in video recording mode, and in terms of performing control operations based on the first gaze position and the first touch signal, the above program includes instructions specifically for performing the following steps:

若所述用户界面包括录像按键、所述第一注视位置位于所述录像按键边界内、且所述第一触控信号为轻触信号,则进行现实录像;If the user interface includes a video recording button, the first gaze position is located within the boundary of the video recording button, and the first touch signal is a light touch signal, real video recording is performed;

若所述用户界面包括虚拟信息按键、所述第一注视位置位于所述虚拟信息按键边界内,且所述第一触控信号为轻触信号,则开启混合现实录像功能,在所述混合现实录像功能下,在所述取景框内显示叠加在现实场景中的AR虚拟信息,以及在所述混合现实录像功能下,录制得到的视频信息为叠加所述AR虚拟信息的现实场景视频信息;If the user interface includes a virtual information button, the first gaze position is located within the boundary of the virtual information button, and the first touch signal is a light touch signal, the mixed reality recording function is enabled; under the mixed reality recording function, AR virtual information superimposed on the real scene is displayed in the viewfinder frame, and the recorded video information is real-scene video information on which the AR virtual information is superimposed;

若所述AR眼镜正在录像、所述用户界面包括浮窗模式按键、所述第一注视位置位于所述浮窗模式按键边界内、且所述第一触控信号为轻触信号,则开启浮窗模式录像功能,在所述浮窗模式录像功能下,在所述用户界面内显示第二浮窗,所述第二浮窗包括缩小的已录制时长、所述取景框和暂停录制按键。If the AR glasses are recording, the user interface includes a floating window mode button, the first gaze position is located within the boundary of the floating window mode button, and the first touch signal is a light touch signal, the floating window mode video recording function is enabled; under the floating window mode video recording function, a second floating window is displayed in the user interface, and the second floating window includes a reduced recorded duration, the viewfinder frame and a pause recording button.

在本申请的一实现方式中,在开启浮窗模式录像功能之后,上述程序包括还用于执行以下步骤的指令:In an implementation of the present application, after the floating window mode video recording function is turned on, the above program includes instructions for performing the following steps:

通过所述人眼追踪镜头确定人眼注视所述用户界面的第三注视位置,以及通过所述蓝牙模块接收所述触控指环发送的第三触控信号;Using the human eye tracking lens to determine a third gaze position where the human eye gazes at the user interface, and receiving a third touch signal sent by the touch ring through the Bluetooth module;

若所述第三注视位置位于所述取景框内、且所述第三触控信号为轻触信号,则关闭所述浮窗模式录像功能;If the third gaze position is located in the viewfinder frame and the third touch signal is a light touch signal, then close the floating window mode video recording function;

若所述第三注视位置位于所述暂停录制按键边界内、且所述第三触控信号为轻触信号,则暂停录像;If the third gaze position is located within the border of the pause recording button and the third touch signal is a light touch signal, then pause the video recording;

若所述第三注视位置位于所述第二浮窗内,且所述第三触控信号为长按信号,则基于人眼注视点移动所述第二浮窗。If the third gaze position is located in the second floating window, and the third touch signal is a long press signal, the second floating window is moved based on the gaze point of human eyes.

需要说明的是,本实施例的具体实现过程可参见上述方法实施例所述的具体实现过程,在此不再叙述。It should be noted that, for the specific implementation process of this embodiment, reference may be made to the specific implementation process described in the foregoing method embodiments, which will not be described here again.

请参阅图4,图4是本申请实施例提供的一种设备控制装置,应用于AR眼镜,所述AR眼镜包括蓝牙模块、显示器和人眼追踪镜头,所述人眼追踪镜头设于所述显示器的一侧,该装置包括:Referring to FIG. 4, FIG. 4 is a device control apparatus provided by an embodiment of the present application, applied to AR glasses, where the AR glasses include a Bluetooth module, a display and a human eye tracking lens, and the human eye tracking lens is arranged on one side of the display; the apparatus includes:

显示单元401,用于在所述显示器上显示用户界面;a display unit 401, configured to display a user interface on the display;

注视位置确定单元402,用于通过所述人眼追踪镜头确定人眼注视所述用户界面的第一注视位置;a gaze position determining unit 402, configured to determine, through the human eye tracking lens, a first gaze position at which human eyes gaze at the user interface;

通信单元403,用于通过所述蓝牙模块接收与所述AR眼镜配对的触控指环发送的第一触控信号;a communication unit 403, configured to receive, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses;

控制单元404,用于基于所述第一注视位置和所述第一触控信号进行控制操作。a control unit 404, configured to perform a control operation based on the first gaze position and the first touch signal.

可以看出,在本申请实施例中,AR眼镜先在显示器上显示用户界面,然后确定人眼注视该用户界面的注视位置,以及通过蓝牙模块接收与AR眼镜配对的触控指环发送的触控信号,最后基于注视位置和触控信号进行控制操作,其中,眼动追踪实现交互的精细度,触控指环实现交互的便利性,进而实现了便捷精准地控制AR眼镜。It can be seen that, in the embodiments of the present application, the AR glasses first display a user interface on the display, then determine the gaze position at which human eyes gaze at the user interface and receive, through the Bluetooth module, the touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. Eye tracking provides fine-grained interaction and the touch ring provides convenient interaction, so that the AR glasses can be controlled conveniently and precisely.

在本申请的一实现方式中,所述AR眼镜还包括多个红外追踪LED,所述多个红外追踪LED沿所述显示器的周侧设置,在通过所述人眼追踪镜头确定人眼注视所述用户界面的第一注视位置方面,上述注视位置确定单元402具体用于:In an implementation manner of the present application, the AR glasses further include a plurality of infrared tracking LEDs arranged along the circumference of the display. In terms of determining, through the human eye tracking lens, the first gaze position at which human eyes gaze at the user interface, the gaze position determining unit 402 is specifically configured to:

通过所述人眼追踪镜头捕捉人眼反射的红外光斑,所述红外光斑是所述多个红外追踪LED向人眼投射红外光后形成的;The infrared spot reflected by the human eye is captured by the human eye tracking lens, and the infrared spot is formed after the plurality of infrared tracking LEDs project infrared light to the human eye;

基于所述红外光斑确定人眼注视所述用户界面的第一注视位置。A first gaze position at which human eyes gaze at the user interface is determined based on the infrared light spot.

在本申请的一实现方式中,所述AR眼镜处于相机模式,所述用户界面包括取景框,在基于所述第一注视位置和所述第一触控信号进行控制操作方面,上述控制单元404具体用于:In an implementation manner of the present application, the AR glasses are in camera mode, the user interface includes a viewfinder frame, and in terms of performing the control operation based on the first gaze position and the first touch signal, the control unit 404 is specifically configured to:

若所述第一注视位置位于所述取景框内,且所述第一触控信号为轻触信号,则对所述第一注视位置进行对焦,其中,在对所述第一注视位置进行对焦后,所述用户界面显示对焦按键和曝光刻度。If the first gaze position is located in the viewfinder frame and the first touch signal is a light touch signal, the first gaze position is focused on, where after the first gaze position is focused on, the user interface displays a focus button and an exposure scale.

在本申请的一实现方式中,所述AR眼镜处于拍照模式,所述用户界面包括取景框,在基于所述第一注视位置和所述第一触控信号进行控制操作方面,上述控制单元404具体用于:In an implementation manner of the present application, the AR glasses are in photographing mode, the user interface includes a viewfinder frame, and in terms of performing the control operation based on the first gaze position and the first touch signal, the control unit 404 is specifically configured to:

若所述用户界面包括拍照按键、所述第一注视位置位于所述拍照按键边界内、且所述第一触控信号为轻触信号,则进行现实拍照;If the user interface includes a camera button, the first gaze position is located within the boundary of the camera button, and the first touch signal is a light touch signal, then take a real photo;

若所述用户界面包括对焦按键、所述第一注视位置位于所述对焦按键内、且所述第一触控信号为轻触信号,则进行现实拍照;If the user interface includes a focus button, the first gaze position is located in the focus button, and the first touch signal is a light touch signal, then take a real photo;

若所述用户界面包括虚拟信息按键、所述第一注视位置位于所述虚拟信息按键边界内,且所述第一触控信号为轻触信号,则开启混合现实拍照功能,在所述混合现实拍照功能下,在所述取景框内显示叠加在现实场景中的AR虚拟信息,以及在所述混合现实拍照功能下,拍照得到的图像信息为叠加所述AR虚拟信息的现实场景图像信息;If the user interface includes a virtual information button, the first gaze position is located within the boundary of the virtual information button, and the first touch signal is a light touch signal, the mixed reality photographing function is enabled; under the mixed reality photographing function, AR virtual information superimposed on the real scene is displayed in the viewfinder frame, and the image information obtained by photographing is real-scene image information on which the AR virtual information is superimposed;

若所述用户界面包括浮窗模式按键、所述第一注视位置位于所述浮窗模式按键边界内、且所述第一触控信号为轻触信号,则开启浮窗模式拍照功能,在所述浮窗模式拍照功能下,在所述用户界面内显示第一浮窗,所述第一浮窗包括缩小后的所述取景框和拍照快门按键。If the user interface includes a floating window mode button, the first gaze position is located within the boundary of the floating window mode button, and the first touch signal is a light touch signal, the floating window mode photographing function is enabled; under the floating window mode photographing function, a first floating window is displayed in the user interface, and the first floating window includes the reduced viewfinder frame and a photographing shutter button.

在本申请的一实现方式中,注视位置确定单元402,还用于在开启浮窗模式拍照功能之后,通过所述人眼追踪镜头确定人眼注视所述用户界面的第二注视位置;In an implementation manner of the present application, the gaze position determining unit 402 is further configured to: after the floating window mode photographing function is enabled, determine, through the human eye tracking lens, a second gaze position at which human eyes gaze at the user interface;

通信单元403,还用于通过所述蓝牙模块接收所述触控指环发送的第二触控信号;the communication unit 403 is further configured to receive, through the Bluetooth module, a second touch signal sent by the touch ring;

控制单元404,还用于若所述第二注视位置位于所述取景框内、且所述第二触控信号为轻触信号,则关闭所述浮窗模式拍照功能;若所述第二注视位置位于所述拍照快门按键边界内、且所述第二触控信号为轻触信号,则进行现实拍照;若所述第二注视位置位于所述第一浮窗内,且所述第二触控信号为长按信号,则基于人眼注视点移动所述第一浮窗。The control unit 404 is further configured to: if the second gaze position is located in the viewfinder frame and the second touch signal is a light touch signal, close the floating window mode photographing function; if the second gaze position is located within the boundary of the photographing shutter button and the second touch signal is a light touch signal, take a real photo; and if the second gaze position is located in the first floating window and the second touch signal is a long press signal, move the first floating window based on the gaze point of human eyes.

在本申请的一实现方式中,所述AR眼镜处于录像模式,在基于所述第一注视位置和所述第一触控信号进行控制操作方面,上述控制单元404具体用于:In an implementation manner of the present application, the AR glasses are in video recording mode, and in terms of performing the control operation based on the first gaze position and the first touch signal, the control unit 404 is specifically configured to:

若所述用户界面包括录像按键、所述第一注视位置位于所述录像按键边界内、且所述第一触控信号为轻触信号,则进行现实录像;If the user interface includes a video recording button, the first gaze position is located within the boundary of the video recording button, and the first touch signal is a light touch signal, real video recording is performed;

若所述用户界面包括虚拟信息按键、所述第一注视位置位于所述虚拟信息按键边界内,且所述第一触控信号为轻触信号,则开启混合现实录像功能,在所述混合现实录像功能下,在所述取景框内显示叠加在现实场景中的AR虚拟信息,以及在所述混合现实录像功能下,录制得到的视频信息为叠加所述AR虚拟信息的现实场景视频信息;If the user interface includes a virtual information button, the first gaze position is located within the boundary of the virtual information button, and the first touch signal is a light touch signal, the mixed reality recording function is enabled; under the mixed reality recording function, AR virtual information superimposed on the real scene is displayed in the viewfinder frame, and the recorded video information is real-scene video information on which the AR virtual information is superimposed;

若所述AR眼镜正在录像、所述用户界面包括浮窗模式按键、所述第一注视位置位于所述浮窗模式按键边界内、且所述第一触控信号为轻触信号,则开启浮窗模式录像功能,在所述浮窗模式录像功能下,在所述用户界面内显示第二浮窗,所述第二浮窗包括缩小的已录制时长、所述取景框和暂停录制按键。If the AR glasses are recording, the user interface includes a floating window mode button, the first gaze position is located within the boundary of the floating window mode button, and the first touch signal is a light touch signal, the floating window mode video recording function is enabled; under the floating window mode video recording function, a second floating window is displayed in the user interface, and the second floating window includes a reduced recorded duration, the viewfinder frame and a pause recording button.

在本申请的一实现方式中,注视位置确定单元402,还用于在开启浮窗模式录像功能之后,通过所述人眼追踪镜头确定人眼注视所述用户界面的第三注视位置;In an implementation manner of the present application, the gaze position determining unit 402 is further configured to: after the floating window mode video recording function is enabled, determine, through the human eye tracking lens, a third gaze position at which human eyes gaze at the user interface;

通信单元403,还用于通过所述蓝牙模块接收所述触控指环发送的第三触控信号;the communication unit 403 is further configured to receive, through the Bluetooth module, a third touch signal sent by the touch ring;

控制单元404,还用于若所述第三注视位置位于所述取景框内、且所述第三触控信号为轻触信号,则关闭所述浮窗模式录像功能;若所述第三注视位置位于所述暂停录制按键边界内、且所述第三触控信号为轻触信号,则暂停录像;若所述第三注视位置位于所述第二浮窗内,且所述第三触控信号为长按信号,则基于人眼注视点移动所述第二浮窗。The control unit 404 is further configured to: if the third gaze position is located in the viewfinder frame and the third touch signal is a light touch signal, close the floating window mode video recording function; if the third gaze position is located within the boundary of the pause recording button and the third touch signal is a light touch signal, pause the recording; and if the third gaze position is located in the second floating window and the third touch signal is a long press signal, move the second floating window based on the gaze point of human eyes.

本申请实施例还提供了一种计算机可读存储介质,其中,所述计算机可读存储介质存储用于电子数据交换的计算机程序,其中,所述计算机程序使得计算机执行如上述方法实施例中AR眼镜所描述的部分或全部步骤。An embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform some or all of the steps described for the AR glasses in the foregoing method embodiments.

本申请实施例还提供了一种计算机程序产品,其中,所述计算机程序产品包括存储了计算机程序的非瞬时性计算机可读存储介质,所述计算机程序可操作来使计算机执行如上述方法中AR眼镜所描述的部分或全部步骤。该计算机程序产品可以为一个软件安装包。An embodiment of the present application further provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps described for the AR glasses in the foregoing method. The computer program product may be a software installation package.

本申请实施例所描述的方法或者算法的步骤可以以硬件的方式来实现,也可以是由处理器执行软件指令的方式来实现。软件指令可以由相应的软件模块组成,软件模块可以被存放于随机存取存储器(Random Access Memory,RAM)、闪存、只读存储器(Read Only Memory,ROM)、可擦除可编程只读存储器(Erasable Programmable ROM,EPROM)、电可擦可编程只读存储器(Electrically EPROM,EEPROM)、寄存器、硬盘、移动硬盘、只读光盘(CD-ROM)或者本领域熟知的任何其它形式的存储介质中。一种示例性的存储介质耦合至处理器,从而使处理器能够从该存储介质读取信息,且可向该存储介质写入信息。当然,存储介质也可以是处理器的组成部分。处理器和存储介质可以位于ASIC中。另外,该ASIC可以位于接入网设备、目标网络设备或核心网设备中。当然,处理器和存储介质也可以作为分立组件存在于接入网设备、目标网络设备或核心网设备中。The steps of the methods or algorithms described in the embodiments of the present application may be implemented by hardware, or by a processor executing software instructions. The software instructions may consist of corresponding software modules, and the software modules may be stored in a random access memory (Random Access Memory, RAM), a flash memory, a read-only memory (Read Only Memory, ROM), an erasable programmable read-only memory (Erasable Programmable ROM, EPROM), an electrically erasable programmable read-only memory (Electrically EPROM, EEPROM), a register, a hard disk, a removable hard disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be a component of the processor. The processor and the storage medium may be located in an ASIC. In addition, the ASIC may be located in an access network device, a target network device or a core network device. Of course, the processor and the storage medium may also exist in the access network device, the target network device or the core network device as discrete components.

Those skilled in the art should be aware that, in one or more of the above examples, the functions described in the embodiments of the present application may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a digital video disc (DVD)), or a semiconductor medium (for example, a solid state disk (SSD)), among others.

The specific implementations described above further describe the purpose, technical solutions, and beneficial effects of the embodiments of the present application in detail. It should be understood that the above are merely specific implementations of the embodiments of the present application and are not intended to limit the protection scope of the embodiments of the present application. Any modification, equivalent replacement, improvement, and the like made on the basis of the technical solutions of the embodiments of the present application shall fall within the protection scope of the embodiments of the present application.

Claims (9)

CN201911310496.1A | 2019-12-18 | 2019-12-18 | Equipment control method and related equipment | Active | CN111061372B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201911310496.1A (CN111061372B) | 2019-12-18 | 2019-12-18 | Equipment control method and related equipment

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201911310496.1A (CN111061372B) | 2019-12-18 | 2019-12-18 | Equipment control method and related equipment

Publications (2)

Publication Number | Publication Date
CN111061372A (en) | 2020-04-24
CN111061372B (en) | 2023-05-02

Family

ID=70302279

Family Applications (1)

Application Number | Status | Publication | Priority Date | Filing Date | Title
CN201911310496.1A | Active | CN111061372B (en) | 2019-12-18 | 2019-12-18 | Equipment control method and related equipment

Country Status (1)

Country | Link
CN (1) | CN111061372B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS
CN114967904A (en)* | 2021-02-19 | 2022-08-30 | 北京京东方光电科技有限公司 | Sight line positioning method, head-mounted display device, computer device and storage medium
WO2024197130A1 (en)* | 2023-03-21 | 2024-09-26 | Apple Inc. | Devices, methods, and graphical user interfaces for capturing media with a camera application
US20240373121A1 | 2023-05-05 | 2024-11-07 | Apple Inc. | User interfaces for controlling media capture settings
CN117008730A (en)* | 2023-08-07 | 2023-11-07 | 中兴通讯股份有限公司 | Control method, electronic device, intelligent finger ring, control system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2015120673A1 (en)* | 2014-02-11 | 2015-08-20 | 惠州TCL移动通信有限公司 | Method, system and photographing equipment for controlling focusing in photographing by means of eyeball tracking technology
WO2015183438A1 (en)* | 2014-05-30 | 2015-12-03 | Apple Inc. | Realtime capture exposure adjust gestures
CN107003730A (en)* | 2015-03-13 | 2017-08-01 | 华为技术有限公司 | Electronic device, photographing method and photographing device
CN109600555A (en)* | 2019-02-02 | 2019-04-09 | 北京七鑫易维信息技术有限公司 | A kind of focusing control method, system and photographing device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN103747181A (en)* | 2014-01-10 | 2014-04-23 | 上海斐讯数据通信技术有限公司 | System for combining videos and camera-acquired pictures
CN105824522A (en)* | 2015-09-24 | 2016-08-03 | 维沃移动通信有限公司 | Photographing method and mobile terminal
US10466780B1 (en)* | 2015-10-26 | 2019-11-05 | Pillantas | Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor
CN106131394A (en)* | 2016-06-15 | 2016-11-16 | 青岛海信移动通信技术股份有限公司 | A kind of method and device taken pictures
CN106354253A (en)* | 2016-08-19 | 2017-01-25 | 上海理湃光晶技术有限公司 | Cursor control method and AR glasses and intelligent ring based on same
CN107729871A (en)* | 2017-11-02 | 2018-02-23 | 北方工业大学 | Infrared light-based human eye movement track tracking method and device
CN109725717A (en)* | 2018-11-30 | 2019-05-07 | 成都理想境界科技有限公司 | Image processing method and AR equipment applied to AR equipment
CN109597489A (en)* | 2018-12-27 | 2019-04-09 | 武汉市天蝎科技有限公司 | A kind of method and system of the eye movement tracking interaction of near-eye display device
CN110069101B (en)* | 2019-04-24 | 2024-04-02 | 洪浛檩 | Wearable computing device and man-machine interaction method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
WO2015120673A1 (en)* | 2014-02-11 | 2015-08-20 | 惠州TCL移动通信有限公司 | Method, system and photographing equipment for controlling focusing in photographing by means of eyeball tracking technology
WO2015183438A1 (en)* | 2014-05-30 | 2015-12-03 | Apple Inc. | Realtime capture exposure adjust gestures
CN107003730A (en)* | 2015-03-13 | 2017-08-01 | 华为技术有限公司 | Electronic device, photographing method and photographing device
CN109600555A (en)* | 2019-02-02 | 2019-04-09 | 北京七鑫易维信息技术有限公司 | A kind of focusing control method, system and photographing device

Also Published As

Publication number | Publication date
CN111061372A (en) | 2020-04-24

Similar Documents

Publication | Title
CN111061372B (en) | Equipment control method and related equipment
US20230336865A1 (en) | Device, methods, and graphical user interfaces for capturing and displaying media
US10459520B2 (en) | Systems and methods of eye tracking control
KR102494698B1 (en) | Method and apparatus for changing focus of camera
JP7459798B2 (en) | Information processing device, information processing method, and program
CN110546601B (en) | Information processing device, information processing method, and program
CN111970456B (en) | Shooting control method, device, equipment and storage medium
WO2022100712A1 (en) | Method and system for displaying virtual prop in real environment image, and storage medium
JP2017199379A (en) | Tracking display system, tracking display program, tracking display method, wearable device using the same, tracking display program for wearable device, and manipulation method for wearable device
US11650661B2 (en) | Electronic device and control method for electronic device
US20150015542A1 (en) | Control Method And Electronic Device
US20240028177A1 (en) | Devices, methods, and graphical user interfaces for interacting with media and three-dimensional environments
CN112585566A (en) | Hand-covering face input sensing for interacting with device having built-in camera
JP2016177658A (en) | Virtual input device, input method, and program
CN107688385A (en) | A kind of control method and device
US20240320930A1 (en) | Devices, methods, and graphical user interfaces for capturing media with a camera application
US10488923B1 (en) | Gaze detection, identification and control method
CN104427226B (en) | Image-pickup method and electronic equipment
JP2021018634A (en) | Electronic equipment and method of controlling the same
JP2020197976A (en) | Electronic apparatus, control method for electronic apparatus, program, and recording medium
CN111782053B (en) | Model editing method, device, equipment and storage medium
JP2023160103A (en) | Electronic apparatus
JP2023087412A (en) | ELECTRONIC DEVICE, ELECTRONIC DEVICE CONTROL METHOD, AND PROGRAM
CN116597334A (en) | Image acquisition method, electronic device and storage medium
JP2024169176A (en) | Imaging device, control method and program thereof

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
