Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
Details are described below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1A to 1C, fig. 1A to 1C are schematic structural views of AR glasses according to an embodiment of the present application. The AR glasses include an AR glasses main body 101, a main camera 102, a plurality of first infrared tracking Light Emitting Diodes (LEDs) 103, a first eye tracking lens 104, a plurality of second infrared tracking LEDs 105, a second eye tracking lens 106, a left-eye display 107, and a right-eye display 108.
Optionally, the main camera 102, the plurality of first infrared tracking LEDs 103, the first eye tracking lens 104, the plurality of second infrared tracking LEDs 105, the second eye tracking lens 106, the left-eye display 107, and the right-eye display 108 are all fixed on the AR glasses main body 101. The plurality of first infrared tracking LEDs 103 are disposed along the peripheral side of the left-eye display 107, and the first eye tracking lens 104 is disposed on one side of the left-eye display 107. The plurality of second infrared tracking LEDs 105 are disposed along the peripheral side of the right-eye display 108, and the second eye tracking lens 106 is disposed on one side of the right-eye display 108.
Optionally, the AR glasses are further provided with a processor, an image information processing module, a memory, a Bluetooth module, an eye tracking processing module, a wireless communication module, and the like. The image information processing module and the eye tracking processing module may be independent of the processor or may be integrated in the processor. The AR glasses can be communicatively connected with a wireless communication device (such as a smart phone or a tablet computer) through the wireless communication module.
The AR glasses can be worn on the head of a user so that the user can clearly see virtual information superimposed on the real world. The main camera 102 is used for photographing and video recording: after the main camera 102 is started, external light 109 is collected and converted into digital information, and the digital information is converted into visible digital image information by the image information processing module of the AR glasses. Taking the right eye as an example, the image information is projected to the right-eye display 108 by a micro-projector 110 and, after its light direction is changed by the right-eye display 108, is finally projected into the field of view of the user's right eye. The micro-projectors 110 are fixed to the AR glasses main body 101; there are two of them, one disposed on one side of the left-eye display 107 and the other on one side of the right-eye display 108.
The first eye tracking lens 104 is used to track the line-of-sight direction and gaze position of the user's left eye in real time, and the second eye tracking lens 106 is used to track the line-of-sight direction and gaze position of the user's right eye in real time. After the user correctly wears the AR glasses, the first infrared tracking LEDs 103 and the second infrared tracking LEDs 105 project infrared light onto the user's eyeballs. Taking the right eye as an example, the infrared light forms light spots on the cornea of the right eye, the second eye tracking lens 106 captures the infrared light spots reflected by the cornea in real time and converts them into digital information, and the eye tracking processing module analyzes the digital information to obtain the user's line-of-sight direction and gaze position in the user interface. The eye tracking processing module can also determine the user's blinking behavior according to the sequence in which the infrared light spots disappear and reappear.
Referring to fig. 1D, fig. 1D is a schematic diagram of a user interface provided in an embodiment of the present application. The image information processing module displays the processed image information in a viewfinder of the user interface. A cursor displayed in the viewfinder indicates the gaze position, calculated by the eye tracking processing module, at which the human eye gazes at the user interface. When the cursor is located in a hot area of an object in the user interface (such as a video recording key, a photographing key, or a virtual information key), the display effect of the hot area changes (for example, its boundary size or the color and thickness of its frame change), and the user can then complete the corresponding interactive operation through a touch ring paired with the AR glasses via Bluetooth.
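By way of a non-limiting illustration only, the hot-area behavior described above can be sketched as follows. The sketch assumes hypothetical names (HotArea, update_hot_areas) and rectangular hot areas; the embodiment does not prescribe any particular data structure or highlighting effect.

```python
from dataclasses import dataclass

@dataclass
class HotArea:
    """Axis-aligned hot area of a UI object (e.g. video key, photo key)."""
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        # True if the cursor (gaze point) falls inside this hot area.
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height


def update_hot_areas(hot_areas, cursor_x, cursor_y):
    """Return the name of the hot area under the cursor, highlighting it."""
    for area in hot_areas:
        if area.contains(cursor_x, cursor_y):
            # A real UI would enlarge the boundary / thicken the frame here.
            print(f"highlight {area.name}")
            return area.name
    return None


# Example: the cursor (gaze point) lies over the photographing key.
areas = [HotArea("video_key", 0, 0, 80, 40), HotArea("photo_key", 100, 0, 80, 40)]
update_hot_areas(areas, 120, 20)  # -> highlight photo_key
```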
Referring to fig. 1E, fig. 1E is a schematic structural diagram of a touch ring according to an embodiment of the present application. The touch ring can be worn on the second joint of the index finger of one hand (for example, the left hand or the right hand) of the user, and interaction is then performed by touching the touch area of the touch ring with the thumb. A Bluetooth module is provided inside the touch ring, and the touch area, when unfolded, can be regarded as a rectangular plane. A capacitive touch sensor in the touch area records touch operations in real time and transmits the touch signals in real time to the Bluetooth module of the AR glasses through the Bluetooth module of the touch ring. The touch signals include, for example, the following (a classification sketch is given after the list):
(1) light touch (tap) signal: the duration of a single touch point is less than 500 milliseconds;
(2) long-press signal: the duration of a single touch point is greater than or equal to 500 milliseconds;
(3) right-slide signal: a single touch point that produces a horizontal displacement to the right;
(4) left-slide signal: a single touch point that produces a horizontal displacement to the left;
(5) up-slide signal: a single touch point that produces a vertical upward displacement;
(6) down-slide signal: a single touch point that produces a vertical downward displacement;
(7) two-finger pinch signal: two touch points are detected to produce displacement, and the distance between them gradually decreases;
(8) two-finger split signal: two touch points are detected to produce displacement, and the distance between them gradually increases.
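By way of a non-limiting illustration only, one possible way to classify raw touch trajectories into the eight signals listed above is sketched below. The 500 ms boundary between a light touch and a long press comes from the list; the 10-pixel dead zone, the data structures, and the function names are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TouchTrack:
    """Trajectory of a single touch point on the rectangular touch area."""
    duration_ms: float   # contact time
    dx: float            # net horizontal displacement (+right)
    dy: float            # net vertical displacement (+up)

def classify(tracks: List[TouchTrack],
             start_gap: Optional[float] = None,
             end_gap: Optional[float] = None) -> str:
    """Map one or two touch trajectories to a touch-signal name."""
    if len(tracks) == 2 and start_gap is not None and end_gap is not None:
        # Two fingers: compare the distance between the touch points.
        return "pinch" if end_gap < start_gap else "split"
    t = tracks[0]
    moved = max(abs(t.dx), abs(t.dy)) > 10          # assumed dead zone in pixels
    if not moved:
        # "tap" corresponds to the light touch signal of item (1).
        return "tap" if t.duration_ms < 500 else "long_press"
    if abs(t.dx) >= abs(t.dy):
        return "slide_right" if t.dx > 0 else "slide_left"
    return "slide_up" if t.dy > 0 else "slide_down"

# Example: a 300 ms stationary contact is classified as a light touch (tap).
print(classify([TouchTrack(duration_ms=300, dx=2, dy=-1)]))  # -> tap
```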
Referring to fig. 2A, fig. 2A is a schematic flowchart of a device control method provided in an embodiment of the present application. The method is applied to the above AR glasses and includes the following steps:
Step 201: displaying a user interface on the display.
The user interface is a user interface of the AR glasses in a camera mode, for example, a photographing mode, a video recording mode, and the like.
Step 202: the method comprises the steps of determining that human eyes watch at a first watching position of a user interface through the human eye tracking lens, and receiving a first touch signal sent by a touch ring matched with AR glasses through the Bluetooth module.
In an implementation manner of the present application, determining, through the eye tracking lens, the first gaze position at which the human eye gazes at the user interface includes:
capturing, through the eye tracking lens, infrared light spots reflected by the human eye, where the infrared light spots are formed after the infrared tracking LEDs project infrared light onto the human eye;
and determining, based on the infrared light spots, the gaze position at which the human eye gazes at the user interface.
It should be noted that the specific implementation of determining the gaze position of the human eye on the user interface based on the infrared light spots is an existing eye tracking technique and is not described in detail here.
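Purely as an assumed example of such existing technology, a common glint-based approach maps the pupil-minus-glint vector to user-interface coordinates through a per-user calibration. The sketch below is not the method of the embodiment; the function name, the affine calibration form, and the placeholder values are hypothetical.

```python
import numpy as np

def gaze_from_glints(pupil_center, glint_centers, calib_matrix):
    """Estimate the on-screen gaze point from one eye image.

    pupil_center  : (x, y) of the detected pupil in camera coordinates
    glint_centers : list of (x, y) corneal reflections of the tracking LEDs
    calib_matrix  : 2x3 affine mapping obtained from a per-user calibration
    """
    # The pupil-minus-glint vector is largely invariant to small head movement.
    glint_ref = np.mean(np.asarray(glint_centers), axis=0)
    v = np.asarray(pupil_center) - glint_ref
    # Map the feature vector to user-interface coordinates.
    return calib_matrix @ np.append(v, 1.0)

# Example with a placeholder calibration centred on a 1280x720 interface.
calib = np.array([[1.0, 0.0, 640.0],
                  [0.0, 1.0, 360.0]])
print(gaze_from_glints((312.0, 248.0), [(300.0, 240.0), (320.0, 240.0)], calib))
```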
Step 203: performing a control operation based on the first gaze position and the first touch signal.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and performing the control operation based on the first gaze position and the first touch signal includes: if the first gaze position is located in the viewfinder and the first touch signal is a light touch signal, focusing on the first gaze position, where after the first gaze position is focused on, the user interface displays a focus key and an exposure scale, as shown in fig. 2B.
Alternatively, when the AR glasses are in the camera mode and the user interface includes the viewfinder, if the first gaze position is located in the viewfinder and both eyes blink, the first gaze position is focused on, and after focusing, the user interface displays the focus key and the exposure scale.
In addition, after the focus key and the exposure scale are displayed on the user interface, if no operation is performed within a first duration, the focus key and the exposure scale are removed. The first duration may be, for example, 2000 ms, 3000 ms, or another value.
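By way of a non-limiting illustration only, the focus overlay and its timeout can be sketched as a small state machine; the class name FocusUI, the render-loop tick, and the normalized gaze coordinates are illustrative assumptions.

```python
import time

FIRST_DURATION_S = 2.0   # e.g. 2000 ms; 3000 ms or another value is also possible

class FocusUI:
    """Minimal state machine for the focus key / exposure scale overlay."""
    def __init__(self):
        self.overlay_visible = False
        self.shown_at = 0.0

    def on_tap_in_viewfinder(self, gaze_xy):
        # Focus on the gaze position and show the overlay.
        print(f"focus at {gaze_xy}")
        self.overlay_visible = True
        self.shown_at = time.monotonic()

    def tick(self):
        # Remove the overlay if no operation occurred within the first duration.
        if self.overlay_visible and time.monotonic() - self.shown_at > FIRST_DURATION_S:
            self.overlay_visible = False
            print("focus key and exposure scale removed")

ui = FocusUI()
ui.on_tap_in_viewfinder((0.4, 0.6))   # gaze position in normalized UI coordinates
ui.tick()                             # called periodically by the render loop
```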
In an implementation manner of the present application, after focusing on the first gaze position, the method further includes:
if a fourth touch signal sent by the touch ring is received through the Bluetooth module within the first duration and the fourth touch signal is a right-slide signal, reducing the exposure based on the right-slide signal, as shown in fig. 2C. How much the exposure is reduced is determined by the sliding distance of the right slide: the larger the sliding distance, the larger the reduction in exposure; the smaller the sliding distance, the smaller the reduction.
In an implementation manner of the present application, after focusing on the first gaze position, the method further includes:
if a fourth touch signal sent by the touch ring is received through the Bluetooth module within the first duration and the fourth touch signal is a left-slide signal, increasing the exposure based on the left-slide signal, as shown in fig. 2D. How much the exposure is increased is determined by the sliding distance of the left slide: the larger the sliding distance, the larger the increase in exposure; the smaller the sliding distance, the smaller the increase.
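By way of a non-limiting illustration only, the distance-proportional exposure adjustment of the two preceding implementations can be sketched as follows; the sensitivity value and the exposure limits are illustrative assumptions.

```python
def adjust_exposure(current_ev: float, signal: str, slide_distance: float,
                    sensitivity: float = 0.01, ev_min: float = -3.0,
                    ev_max: float = 3.0) -> float:
    """Right slide lowers the exposure, left slide raises it, scaled by distance."""
    delta = sensitivity * slide_distance
    if signal == "slide_right":
        current_ev -= delta
    elif signal == "slide_left":
        current_ev += delta
    return max(ev_min, min(ev_max, current_ev))

# A 150-pixel right slide lowers the exposure more than a 50-pixel one.
print(adjust_exposure(0.0, "slide_right", 150))  # -> -1.5
print(adjust_exposure(0.0, "slide_right", 50))   # -> -0.5
```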
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and performing the control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located in the viewfinder and the first touch signal is a two-finger pinch signal, zooming out the view based on the two-finger pinch signal, and displaying a zoom scale on the user interface while zooming out, as shown in fig. 2E. How much the view is zoomed out is determined by the distance between the two fingers: the smaller the distance between the two fingers becomes, the more the view is zoomed out; the larger the distance, the less the view is zoomed out.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and performing the control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located in the viewfinder and the first touch signal is a two-finger split signal, zooming in the view based on the two-finger split signal, and displaying a zoom scale on the user interface while zooming in, as shown in fig. 2F. How much the view is zoomed in is determined by the distance between the two fingers: the smaller the distance between the two fingers, the less the view is zoomed in; the larger the distance, the more the view is zoomed in.
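By way of a non-limiting illustration only, the pinch/split zoom behavior of the two preceding implementations can be sketched by scaling the zoom factor with the ratio of the two-finger distances; the zoom limits are illustrative assumptions.

```python
def zoom_from_pinch(current_zoom: float, start_gap: float, end_gap: float,
                    zoom_min: float = 1.0, zoom_max: float = 10.0) -> float:
    """Scale the zoom factor by the ratio of the two-finger distances.

    end_gap < start_gap (pinch) -> zoom out, more so the closer the fingers get;
    end_gap > start_gap (split) -> zoom in, more so the farther apart they move.
    """
    new_zoom = current_zoom * (end_gap / start_gap)
    return max(zoom_min, min(zoom_max, new_zoom))

print(zoom_from_pinch(2.0, start_gap=200, end_gap=100))  # pinch -> 1.0x
print(zoom_from_pinch(2.0, start_gap=100, end_gap=250))  # split -> 5.0x
```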
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder in which a zoom factor icon is displayed, and performing the control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located within the boundary of the zoom factor icon and the first touch signal is an up-slide signal, zooming in the view based on the up-slide signal, and displaying a zoom scale on the user interface while zooming in, as shown in fig. 2G. How much the view is zoomed in is determined by the sliding distance of the up slide: the smaller the sliding distance, the less the view is zoomed in; the larger the sliding distance, the more the view is zoomed in.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder in which a zoom factor icon is displayed, and performing the control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located within the boundary of the zoom factor icon and the first touch signal is a down-slide signal, zooming out the view based on the down-slide signal, and displaying a zoom scale on the user interface while zooming out, as shown in fig. 2H. How much the view is zoomed out is determined by the sliding distance of the down slide: the smaller the sliding distance, the less the view is zoomed out; the larger the sliding distance, the more the view is zoomed out.
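By way of a non-limiting illustration only, the slide-based zoom on the zoom factor icon can be sketched as follows; the step size per pixel and the zoom limits are illustrative assumptions.

```python
def zoom_from_slide(current_zoom: float, signal: str, slide_distance: float,
                    step_per_pixel: float = 0.02,
                    zoom_min: float = 1.0, zoom_max: float = 10.0) -> float:
    """Up slide on the zoom factor icon zooms in, down slide zooms out,
    proportionally to the sliding distance."""
    if signal == "slide_up":
        current_zoom += step_per_pixel * slide_distance
    elif signal == "slide_down":
        current_zoom -= step_per_pixel * slide_distance
    return max(zoom_min, min(zoom_max, current_zoom))

print(zoom_from_slide(1.0, "slide_up", 100))    # -> 3.0x
print(zoom_from_slide(3.0, "slide_down", 50))   # -> 2.0x
```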
In an implementation manner of the present application, the AR glasses support multiple camera modes that are arranged in sequence, the AR glasses are currently in a first camera mode, the user interface includes a viewfinder, and performing the control operation based on the first gaze position and the first touch signal includes: if the first gaze position is located in the viewfinder and the first touch signal is a down-slide signal, switching to a second camera mode, the second camera mode being adjacent to the first camera mode and arranged above it;
if the first gaze position is located in the viewfinder and the first touch signal is an up-slide signal, switching to a third camera mode, the third camera mode being adjacent to the first camera mode and arranged below it.
For example, if the second camera mode is a video recording mode and the third camera mode is camera mode A, the switching is as shown in fig. 2I.
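By way of a non-limiting illustration only, the adjacent-mode switching can be sketched with an ordered mode list; the particular modes and their order follow the example above and are otherwise illustrative assumptions.

```python
# Modes as arranged in the user interface, top to bottom (illustrative order).
CAMERA_MODES = ["video_recording", "photographing", "camera_mode_a"]

def switch_mode(current: str, signal: str) -> str:
    """Down slide switches to the adjacent mode arranged above the current one,
    up slide to the adjacent mode arranged below it."""
    i = CAMERA_MODES.index(current)
    if signal == "slide_down" and i > 0:
        return CAMERA_MODES[i - 1]
    if signal == "slide_up" and i < len(CAMERA_MODES) - 1:
        return CAMERA_MODES[i + 1]
    return current

print(switch_mode("photographing", "slide_down"))  # -> video_recording
print(switch_mode("photographing", "slide_up"))    # -> camera_mode_a
```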
In an implementation manner of the present application, the AR glasses are in a camera mode, the AR glasses support multiple camera modes and are currently in a first camera mode, the user interface includes a photographing key, and performing the control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located within the boundary of the photographing key and the first touch signal is a light touch signal or a down-slide signal, switching to a fourth camera mode, the fourth camera mode being adjacent to the first camera mode and arranged above it;
if the first gaze position is located within the boundary of the photographing key and the first touch signal is a light touch signal or an up-slide signal, switching to a fifth camera mode, the fifth camera mode being adjacent to the first camera mode and arranged below it;
if the first gaze position is located within the boundary of the photographing key and the first touch signal is a light touch signal or a continuous down-slide signal, switching to a more-functions mode, in which icons of the various camera modes are displayed on the user interface for the user to select.
For example, if the fourth camera mode is a video recording mode and the fifth camera mode is camera mode A, the switching is as shown in fig. 2J.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a viewfinder, and performing the control operation based on the first gaze position and the first touch signal includes:
if the user interface includes a photographing key, the first gaze position is located within the boundary of the photographing key, and the first touch signal is a light touch signal, performing real photographing, as shown in fig. 2K;
if the user interface includes a focus key, the first gaze position is located within the focus key, and the first touch signal is a light touch signal, performing real photographing, as shown in fig. 2L;
if the user interface includes a virtual information key, the first gaze position is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality photographing function, in which AR virtual information superimposed on the real scene is displayed in the viewfinder and the obtained image information is real-scene image information superimposed with the AR virtual information, as shown in fig. 2M;
if the user interface includes a floating window mode key, the first gaze position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode photographing function, in which a first floating window is displayed in the user interface, the first floating window including a reduced viewfinder and a photographing shutter key, as shown in fig. 2N.
It should be noted that real photographing refers to photographing the external environment of the AR glasses through the main camera, and the obtained image is a real-scene image.
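By way of a non-limiting illustration only, the dispatch of a (gaze target, touch signal) pair in the photographing mode can be sketched as follows; the key names and return values are hypothetical, and the gaze target is assumed to come from a hot-area hit test such as the one sketched for fig. 1D.

```python
def handle_photo_mode(gaze_key: str, signal: str) -> str:
    """Dispatch a (gaze target, touch signal) pair in photographing mode."""
    if signal != "tap":                           # only light touch signals here
        return "ignored"
    if gaze_key in ("photo_key", "focus_key"):
        return "real_photographing"               # capture through the main camera
    if gaze_key == "virtual_info_key":
        return "mixed_reality_photographing_on"   # overlay AR info in the viewfinder
    if gaze_key == "floating_window_key":
        return "floating_window_mode_on"          # show the first floating window
    return "ignored"

print(handle_photo_mode("photo_key", "tap"))          # -> real_photographing
print(handle_photo_mode("virtual_info_key", "tap"))   # -> mixed_reality_photographing_on
```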
In an implementation manner of the present application, after the floating window mode photographing function is started, the method further includes:
determining, through the eye tracking lens, a second gaze position at which the human eye gazes at the user interface, and receiving, through the Bluetooth module, a second touch signal sent by the touch ring;
if the second gaze position is located in the viewfinder and the second touch signal is a light touch signal, turning off the floating window mode photographing function, as shown in fig. 2O;
if the second gaze position is located within the boundary of the photographing shutter key and the second touch signal is a light touch signal, performing real photographing, as shown in fig. 2P;
if the second gaze position is located in the first floating window and the second touch signal is a long-press signal, moving the first floating window based on the gaze point of the human eye, as shown in fig. 2Q.
That the second gaze position is located in the first floating window means that the second gaze position is located within the boundary of the photographing shutter key or within the boundary of the viewfinder.
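By way of a non-limiting illustration only, the interaction with the first floating window can be sketched as follows. The reading that a light touch on the reduced viewfinder closes the mode while a light touch on the photographing shutter key performs real photographing follows the description above; the class name, key names, and normalized coordinates are hypothetical.

```python
class FloatingWindow:
    """First floating window: reduced viewfinder plus photographing shutter key."""
    def __init__(self, x: float, y: float):
        self.x, self.y = x, y
        self.open = True

    def handle(self, gaze_key: str, signal: str, gaze_xy=None):
        if gaze_key == "viewfinder" and signal == "tap":
            self.open = False                     # turn off floating window mode
            return "closed"
        if gaze_key == "shutter_key" and signal == "tap":
            return "real_photographing"
        if gaze_key in ("viewfinder", "shutter_key") and signal == "long_press":
            # Follow the gaze point while the long press is held.
            self.x, self.y = gaze_xy
            return "moved"
        return "ignored"

win = FloatingWindow(0.7, 0.2)
print(win.handle("shutter_key", "tap"))                    # -> real_photographing
print(win.handle("viewfinder", "long_press", (0.3, 0.8)))  # -> moved
```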
In an implementation manner of the present application, the AR glasses are in a video recording mode, and performing the control operation based on the first gaze position and the first touch signal includes:
if the user interface includes a video recording key, the first gaze position is located within the boundary of the video recording key, and the first touch signal is a light touch signal, performing real video recording, as shown in fig. 2R;
if the user interface includes a virtual information key, the first gaze position is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality video recording function, in which AR virtual information superimposed on the real scene is displayed in the viewfinder and the recorded video information is real-scene video information superimposed with the AR virtual information, as shown in fig. 2S;
if the AR glasses are recording, the user interface includes a floating window mode key, the first gaze position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode video recording function, in which a second floating window is displayed in the user interface, the second floating window including a reduced recorded-duration indicator, viewfinder, and record-pause key, as shown in fig. 2T.
In an implementation manner of the present application, after the video recording function in the floating window mode is started, the method further includes:
determining, through the eye tracking lens, a third gaze position at which the human eye gazes at the user interface, and receiving, through the Bluetooth module, a third touch signal sent by the touch ring;
if the third gaze position is located in the viewfinder and the third touch signal is a light touch signal, turning off the floating window mode video recording function, as shown in fig. 2U;
if the third gaze position is located within the boundary of the record-pause key and the third touch signal is a light touch signal, pausing the video recording, as shown in fig. 2V;
if the third gaze position is located in the second floating window and the third touch signal is a long-press signal, moving the second floating window based on the gaze point of the human eye, as shown in fig. 2W.
That the third gaze position is located in the second floating window means that the third gaze position is located within the boundary of the recorded-duration indicator, within the boundary of the viewfinder, or within the boundary of the record-pause key.
It can be seen that, in the embodiment of the present application, the AR glasses first display a user interface on the display, then determine the gaze position at which the human eye gazes at the user interface and receive, through the Bluetooth module, a touch signal sent by a touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. Eye tracking provides fine-grained interaction and the touch ring provides convenient interaction, so that the AR glasses can be controlled conveniently and accurately.
Referring to fig. 3, fig. 3 is a schematic structural diagram of AR glasses according to an embodiment of the present application. As shown in the figure, the AR glasses include a processor, a memory, a Bluetooth module, a display, an eye tracking lens, and a wireless communication module, where the eye tracking lens is disposed on one side of the display, and one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing the following steps:
displaying a user interface on the display;
determining, through the eye tracking lens, a first gaze position at which the human eye gazes at the user interface, and receiving, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses;
and performing a control operation based on the first gaze position and the first touch signal.
It can be seen that, in the embodiment of the present application, the AR glasses first display a user interface on the display, then determine the gaze position at which the human eye gazes at the user interface and receive, through the Bluetooth module, a touch signal sent by a touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. Eye tracking provides fine-grained interaction and the touch ring provides convenient interaction, so that the AR glasses can be controlled conveniently and accurately.
In an implementation of the present application, the AR glasses further include a plurality of infrared tracking LEDs disposed along the periphery of the display, and in terms of determining the first gaze position at which the human eye gazes at the user interface, the program includes instructions specifically for performing the following steps:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the infrared tracking LEDs project infrared light to the human eyes;
determining a first gaze location of a human eye gazing at the user interface based on the infrared spot.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a view finder, and in terms of performing a control operation based on the first gaze location and the first touch signal, the program includes instructions specifically configured to perform the following steps:
and if the first watching position is positioned in the viewing frame and the first touch signal is a light touch signal, focusing the first watching position, wherein after the first watching position is focused, the user interface displays a focusing key and an exposure scale.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a view finder, and in terms of performing control operation based on the first gaze location and the first touch signal, the program includes instructions specifically configured to perform the following steps:
if the user interface comprises a photographing key, the first watching position is located in the boundary of the photographing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a focusing key, the first watching position is located in the focusing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a virtual information key, the first fixation position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality photographing function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality photographing function, and obtaining image information which is real scene image information superposed with the AR virtual information under the mixed reality photographing function;
if the user interface comprises a floating window mode key, the first fixation position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode photographing function, displaying a first floating window in the user interface under the floating window mode photographing function, wherein the first floating window comprises a reduced view-finding frame and a photographing shutter key.
In an implementation manner of the present application, after the floating window mode photographing function is turned on, the program includes instructions further configured to perform the following steps:
determining a second fixation position where the human eye gazes at the user interface through the human eye tracking lens, and receiving a second touch signal sent by the touch ring through the Bluetooth module;
if the second fixation position is located in the view-finding frame and the second touch signal is a light touch signal, closing the floating window mode photographing function;
if the second fixation position is located in the photographing shutter button boundary and the second touch signal is a light touch signal, performing real photographing;
and if the second fixation position is located in the first floating window and the second touch signal is a long press signal, moving the first floating window based on the fixation point of the human eye.
In an implementation manner of the present application, the AR glasses are in a video recording mode, and in terms of performing a control operation based on the first gaze location and the first touch signal, the program includes instructions specifically configured to perform the following steps:
if the user interface comprises a video recording key, the first watching position is located in the boundary of the video recording key, and the first touch signal is a light touch signal, then real video recording is carried out;
if the user interface comprises a virtual information key, the first watching position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality video recording function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality video recording function, and recording obtained video information as real scene video information superposed with the AR virtual information under the mixed reality video recording function;
if the AR glasses are recording, the user interface comprises a floating window mode key, the first watching position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode video recording function, and displaying a second floating window in the user interface under the floating window mode video recording function, wherein the second floating window comprises a reduced recorded duration, a view frame, and a record pause key.
In an implementation manner of the present application, after the floating window mode video recording function is turned on, the program includes instructions further configured to perform the following steps:
determining a third fixation position where the human eye gazes at the user interface through the human eye tracking lens, and receiving a third touch signal sent by the touch ring through the Bluetooth module;
if the third fixation position is located in the view-finding frame and the third touch signal is a light touch signal, closing the floating window mode video recording function;
if the third watching position is located in the record pause key boundary and the third touch signal is a light touch signal, pausing the video recording;
and if the third gaze location is located in the second floating window and the third touch signal is a long press signal, moving the second floating window based on the gaze point of the human eye.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
Referring to fig. 4, fig. 4 shows a device control apparatus provided in an embodiment of the present application, which is applied to AR glasses, where the AR glasses include a Bluetooth module, a display, and an eye tracking lens disposed on one side of the display. The apparatus includes:
a display unit 401, configured to display a user interface on the display;
a gaze position determination unit 402, configured to determine, through the eye tracking lens, a first gaze position at which the human eye gazes at the user interface;
a communication unit 403, configured to receive, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses;
a control unit 404, configured to perform a control operation based on the first gaze position and the first touch signal.
It can be seen that, in the embodiment of the present application, the AR glasses first display a user interface on the display, then determine the gaze position at which the human eye gazes at the user interface and receive, through the Bluetooth module, a touch signal sent by a touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. Eye tracking provides fine-grained interaction and the touch ring provides convenient interaction, so that the AR glasses can be controlled conveniently and accurately.
In an implementation manner of the present application, the AR glasses further include a plurality of infrared tracking LEDs disposed along the periphery of the display, and in terms of determining, through the eye tracking lens, the first gaze position at which the human eye gazes at the user interface, the gaze position determination unit 402 is specifically configured to:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the infrared tracking LEDs project infrared light to the human eyes;
determining a first gaze location of a human eye gazing at the user interface based on the infrared spot.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a view finder, and in terms of performing a control operation based on the first gaze location and the first touch signal, the control unit 404 is specifically configured to:
and if the first watching position is positioned in the viewing frame and the first touch signal is a light touch signal, focusing the first watching position, wherein after the first watching position is focused, the user interface displays a focusing key and an exposure scale.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a view finder, and in terms of performing control operation based on the first gaze location and the first touch signal, the control unit 404 is specifically configured to:
if the user interface comprises a photographing key, the first watching position is located in the boundary of the photographing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a focusing key, the first watching position is located in the focusing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a virtual information key, the first fixation position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality photographing function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality photographing function, and obtaining image information which is real scene image information superposed with the AR virtual information under the mixed reality photographing function;
if the user interface comprises a floating window mode key, the first fixation position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode photographing function, displaying a first floating window in the user interface under the floating window mode photographing function, wherein the first floating window comprises a reduced view-finding frame and a photographing shutter key.
In an implementation manner of the present application, the gaze position determination unit 402 is further configured to determine, through the eye tracking lens, a second gaze position at which the human eye gazes at the user interface after the floating window mode photographing function is started;
the communication unit 403 is further configured to receive, through the Bluetooth module, a second touch signal sent by the touch ring;
the control unit 404 is further configured to: close the floating window mode photographing function if the second gaze position is located in the view frame and the second touch signal is a light touch signal; perform real photographing if the second gaze position is located within the boundary of the photographing shutter key and the second touch signal is a light touch signal; and move the first floating window based on the gaze point of the human eye if the second gaze position is located in the first floating window and the second touch signal is a long-press signal.
In an implementation manner of the present application, the AR glasses are in a video recording mode, and in terms of performing a control operation based on the first gaze location and the first touch signal, the control unit 404 is specifically configured to:
if the user interface comprises a video recording key, the first watching position is located in the boundary of the video recording key, and the first touch signal is a light touch signal, then real video recording is carried out;
if the user interface comprises a virtual information key, the first watching position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality video recording function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality video recording function, and recording obtained video information as real scene video information superposed with the AR virtual information under the mixed reality video recording function;
if the AR glasses are recording, the user interface comprises a floating window mode key, the first watching position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode video recording function, and displaying a second floating window in the user interface under the floating window mode video recording function, wherein the second floating window comprises a reduced recorded duration, a view frame, and a record pause key.
In an implementation manner of the present application, the gaze position determination unit 402 is further configured to determine, through the eye tracking lens, a third gaze position at which the human eye gazes at the user interface after the floating window mode video recording function is started;
the communication unit 403 is further configured to receive, through the Bluetooth module, a third touch signal sent by the touch ring;
the control unit 404 is further configured to: close the floating window mode video recording function if the third gaze position is located in the view frame and the third touch signal is a light touch signal; pause the video recording if the third gaze position is located within the boundary of the record pause key and the third touch signal is a light touch signal; and move the second floating window based on the gaze point of the human eye if the third gaze position is located in the second floating window and the third touch signal is a long-press signal.
Embodiments of the present application also provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform some or all of the steps described in the above method embodiments for AR glasses.
Embodiments of the present application also provide a computer program product, wherein the computer program product comprises a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described for AR glasses in the above method. The computer program product may be a software installation package.
The steps of a method or algorithm described in the embodiments of the present application may be implemented in hardware, or may be implemented by a processor executing software instructions. The software instructions may consist of corresponding software modules, which may be stored in a random access memory (RAM), a flash memory, a read-only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. Additionally, the ASIC may reside in an access network device, a target network device, or a core network device. Of course, the processor and the storage medium may also reside as discrete components in an access network device, a target network device, or a core network device.
Those skilled in the art will appreciate that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented, in whole or in part, by software, hardware, firmware, or any combination thereof. When implemented in software, the functions may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the embodiments of the present application in further detail, and it should be understood that the above-mentioned embodiments are only specific embodiments of the present application, and are not intended to limit the scope of the embodiments of the present application, and any modifications, equivalent substitutions, improvements and the like made on the basis of the technical solutions of the embodiments of the present application should be included in the scope of the embodiments of the present application.