
A window control method, electronic device and computer readable storage medium

Info

Publication number
CN114529691B
Authority
CN
China
Prior art keywords
window, windows, target, area, level
Prior art date
Legal status
Active
Application number
CN202011218043.9A
Other languages
Chinese (zh)
Other versions
CN114529691A (en)
Inventor
杨婉艺 (Yang Wanyi)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202011218043.9A
Publication of CN114529691A
Application granted
Publication of CN114529691B
Status: Active

Abstract

The application relates to the field of computer technologies, and discloses a window control method, an electronic device, and a computer readable storage medium. A user selects, with a pointing component, some of the windows presented in the visible area of the electronic device as target windows to be controlled; the selected target windows may be several windows at the same level or at different levels. After the target windows are selected, the electronic device performs the operation indicated by the pointing component on each target window. Operations indicated by the pointing component include, but are not limited to, a zoom operation to reduce or enlarge the target windows, a replacement operation to exchange the selected target windows with one another, and a deletion operation to delete the selected target windows. Because the user selects a plurality of target windows with the pointing component, the electronic device can conveniently operate on the target windows in batches, which improves the user experience.

Description

Window control method, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a window control method, an electronic device, and a computer readable storage medium.
Background
Augmented reality (AR) is a new human-computer interaction technology. Through AR technology, participants can interact with virtual objects in real time, obtaining a rich visual experience and feeling experiences unavailable in the real world by transcending objective limits such as space and time. Virtual reality (VR) technology generates a simulated environment through computer technology and immerses the user in a created three-dimensional dynamic scene; it can be understood as a simulation system for the real world. The earliest VR technology was applied in the military field, and its most common product is the head-mounted display.
Interaction with AR/VR devices mainly relies on motion sensors integrated into the head-mounted display: the scene in the field of view changes as the user rotates the head. With the continued development of AR/VR technology, a range of pointing components has evolved, such as gloves, watches, cell phones, and handles. A pointing component presents the hands in the virtual scene, achieving the effect of the user moving around within it. By tracking states such as the rotation and movement of these components or of the hands, their motion states are mapped to interactions such as moving, selecting, rotating, and scaling virtual objects. At present, interaction with AR/VR devices remains simple and mainly focuses on interaction between gestures or handles and the interface. When interacting with an interface using gestures or handles, only a single window in a single plane of the interface can be manipulated independently, for example splitting, closing, or moving a single window. When there are multiple windows in the interface, this approach is not applicable, resulting in a poor user experience.
Disclosure of Invention
The invention aims to provide a window control method, an electronic device, and a computer readable storage medium, which make it convenient to operate a plurality of windows simultaneously and improve the user experience.
In a first aspect, an embodiment of the present application discloses a window control method, applied to an electronic device based on virtual reality or augmented reality technology, where the electronic device is associated with a pointing component, and the window control method is used to control a plurality of window layers displayed in a visible area of the electronic device, where the window layers are sequentially arranged along a direction away from the user side of the electronic device, and each window layer includes a plurality of windows. The window control method includes:
Detecting movement of the pointing component;
determining a selection area based on the movement track of the pointing component;
determining at least one target window from the plurality of window layers according to the selection area; and
simultaneously performing, on each of the at least one target window, the operation indicated by the pointing component.
According to the window control method disclosed by the embodiment of the application, the electronic device and the pointing component are associated with each other, and the electronic device can control a plurality of windows selected by the pointing component. When a user is using the electronic device, multiple windows at the same level or at different levels are presented within the visible area of the electronic device. The user can select, with the pointing component, some of these windows as target windows to be controlled. The selected target windows may be several windows at the same level or at different levels. After the target windows are selected, the electronic device simultaneously performs the operation indicated by the pointing component on each target window. Operations indicated by the pointing component include, but are not limited to, a zoom operation to reduce or enlarge the target windows, a replacement operation to exchange the selected target windows with one another, and a deletion operation to delete the selected target windows. Because the user selects a plurality of target windows with the pointing component, the electronic device can conveniently operate on the target windows in batches, which improves the user experience.
According to some embodiments provided in the first aspect of the application, the movement of the pointing component is a click on a window, and the selection area is determined based on the click position of the pointing component.
According to some embodiments provided in the first aspect of the application, the movement of the pointing component follows a predetermined trajectory, and the selection area is determined based on the track formed as the pointing component moves.
According to some embodiments of the first aspect of the present application, the track formed as the pointing component moves forms a closed figure, the electronic device determines the selection area based on the area covered by the closed figure, and the windows within the closed figure serving as the selection area are the selected target windows.
According to some embodiments of the first aspect of the present application, the operation indicated by the pointing component includes enlarging, reducing, and moving the position of each target window.
According to some embodiments provided in the first aspect of the present application, determining at least one target window from the plurality of window layers according to the selection area includes:
at least one target window in the same window layer is determined based on the selection region.
According to some embodiments provided in the first aspect of the present application, determining at least one target window from the plurality of window layers according to the selection area includes:
Based on the selection region, a plurality of target windows in a plurality of window layers are determined, at least one window in each window layer being selected.
According to some embodiments provided in the first aspect of the present application, the range of the closed figure determined by the selection area covers a plurality of windows in a plurality of window layers, and those windows are determined as target windows.
According to some embodiments provided in the first aspect of the present application, a window selected by the selection area on one of the plurality of window layers is mapped onto another window layer to determine at least one window there as a target window.
According to some embodiments of the present application, in a case where the at least one target window includes at least two windows, performing the operation indicated by the pointing component on the target windows includes:
interchanging the positions of the at least two windows.
According to some embodiments of the present application, the at least two windows are divided into two window groups, and performing the operation indicated by the pointing component on the target windows includes:
interchanging the positions of the two window groups.
According to some embodiments provided by the first aspect of the application, the area of a first window of the at least two windows is adjusted to fit the display area at the location of a second window, or the area of the second window is adjusted to fit the display area at the location of the first window.
In a second aspect, an embodiment of the present application discloses an electronic device based on virtual reality or augmented reality technology, where the electronic device is associated with a pointing component, and the electronic device includes:
the memory is used for storing window control instructions;
the processor implements the following steps when executing the window control instructions:
Detecting movement of the pointing component;
determining a selection area based on the movement track of the pointing component;
determining at least one target window from the plurality of window layers according to the selection area; and
simultaneously performing, on each of the at least one target window, the operation indicated by the pointing component.
According to the electronic device disclosed by the embodiment of the application, the electronic device and the pointing component are associated with each other, and the electronic device can control a plurality of windows selected by the pointing component. When a user is using the electronic device, multiple windows at the same level or at different levels are presented within the visible area of the electronic device. The user can select, with the pointing component, some of these windows as target windows to be controlled. The selected target windows may be several windows at the same level or at different levels. After the target windows are selected, the electronic device simultaneously performs the operation indicated by the pointing component on each target window. Operations indicated by the pointing component include, but are not limited to, a zoom operation to reduce or enlarge the target windows, a replacement operation to exchange the selected target windows with one another, and a deletion operation to delete the selected target windows. Because the user selects a plurality of target windows with the pointing component, the electronic device can conveniently operate on the target windows in batches, which improves the user experience.
According to some embodiments provided in the second aspect of the present application, the processor is further configured to, when executing the window control instructions, determine the selection area based on a click position of the pointing component.
According to some embodiments provided in the second aspect of the present application, the processor is further configured to, when executing the window control instructions, determine the selection area based on the track formed as the pointing component moves.
According to some embodiments of the second aspect of the present application, the processor is further configured to, when executing the window control instructions, enlarge or reduce each target window and move its position based on the indication of the pointing component.
According to some embodiments of the second aspect of the present application, the processor is further configured to, when executing the window control instructions, determine at least one target window in the same window layer based on the selection area, or determine a plurality of target windows in a plurality of window layers based on the selection area, with at least one window selected in each of those window layers.
According to some embodiments of the second aspect of the application, the memory is further configured to store the selection area, so that the processor can recall the selection area.
According to some embodiments provided in the second aspect of the present application, the processor is further configured to exchange positions of at least two windows with each other or exchange positions of two window groups with each other when executing the window control instruction.
In a third aspect, embodiments of the present application disclose a computer readable storage medium storing window control instructions which, when executed by a processor, implement the window control method as described in any of the above.
Additional features and corresponding advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the invention.
Drawings
Fig. 1A and fig. 1B are schematic diagrams of a scene formed by an electronic device and a pointing device according to an embodiment of the present application;
fig. 1C and fig. 1D are schematic structural diagrams of a pointing device according to an embodiment of the present application;
FIGS. 2a to 2c are schematic diagrams illustrating selection manners of a target window according to an embodiment of the present application;
fig. 3A to fig. 3C are schematic diagrams of selection results corresponding to a selection manner of a target window according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a window distribution of an example of an embodiment of the present application;
fig. 5A illustrates a schematic structural diagram of a VR device according to an embodiment of the present application;
fig. 5B illustrates a schematic diagram of a memory of a VR device according to an embodiment of the present application;
Fig. 6A to fig. 6B are schematic diagrams illustrating the collapsing and expanding of windows at the same level according to an embodiment of the present application;
Fig. 7A to fig. 7C are schematic diagrams illustrating the collapsing and expanding of windows at different levels according to an embodiment of the present application;
Fig. 8A to fig. 8C are schematic diagrams illustrating the collapsing and expanding of windows at different levels but in the same longitudinal space according to an embodiment of the present application;
Fig. 9A to fig. 9C are schematic diagrams illustrating another example of collapsing and expanding windows at different levels according to an embodiment of the application;
Fig. 10A to fig. 10C are schematic diagrams illustrating replacement of two windows at the same level according to an embodiment of the present application;
Fig. 11A to fig. 11B are schematic diagrams illustrating replacement of multiple windows at different levels according to embodiments of the present application;
Fig. 12A to fig. 12B are schematic diagrams illustrating replacement of multiple windows at different levels but in the same longitudinal space according to embodiments of the present application;
Fig. 13 is a schematic flow chart of a window control method applied to control the collapsing and expanding of windows at different levels or the same level according to an embodiment of the present application;
Fig. 14A to fig. 14B are schematic flow diagrams of a window control method according to an embodiment of the present application applied to control windows of different levels or the same level to be replaced;
Fig. 15 is a schematic structural diagram of a window control device according to an embodiment of the present application;
Fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 17 is a schematic structural diagram of an SOC according to an embodiment of the present application.
Detailed Description
The window control method provided by the embodiment of the application is applied to an electronic device, which includes but is not limited to an AR device or a VR device. An AR device adopts augmented reality technology, a technology that "seamlessly" integrates real-world information and virtual-world information. A VR device adopts virtual reality technology, which mainly involves a simulated environment, perception, natural skills, and sensing devices. The simulated environment is a computer-generated, dynamic three-dimensional realistic image. Perception means that an ideal VR system should provide all the kinds of perception a person has: besides the visual perception generated by computer graphics technology, there are also auditory, tactile, kinesthetic, and other perceptions. Natural skills refer to head rotation, eye movement, gestures, or other human actions; the computer processes the data corresponding to the participants' actions, responds to the user's input in real time, and feeds the responses back to the user's senses.
Referring to fig. 1A and fig. 1B, fig. 1A and fig. 1B are schematic views of an electronic device and a pointing device according to an embodiment of the present application.
In the scene diagrams shown in fig. 1A and 1B, there are, among other things, an electronic device 10 exemplified by a VR device, a pointing component 11 exemplified by a remote control handle, and a plurality of windows 1000 presented within a visible area 100 of the VR device (e.g., the area between dashed line 101 and dashed line 102 in fig. 1A). The VR device and the remote control handle are associated so that the VR device controls the windows selected by the remote control handle. After the user wears the VR device, a plurality of windows 1000 are presented within the visible area 100 of the VR device. The user can select, with the pointing component 11, some of these windows as target windows to be controlled. After the target windows are selected, the VR device simultaneously performs the operation indicated by the pointing component 11 on each target window.
It should be noted that, although the pointing component 11 is exemplified above by a remote control handle, the present application is not limited thereto: the pointing component 11 may also be a glove as shown in fig. 1B, a watch as shown in fig. 1C, a terminal such as a bracelet as shown in fig. 1D, or the user's hand. Likewise, although the pointing component 11 is described above as being used to control windows displayed by a VR device, the present application is not limited thereto, and the pointing component may also be used for other functional operations, such as controlling an object in a game to move or perform an action.
According to the embodiment of the application, the selection modes of the target window include, but are not limited to, point selection (selection by clicking: the clicked window is selected as the target window), area selection (selection by drawing a closed figure: the windows covered by the closed figure are the target windows), and line selection (selection by drawing a line: the windows the line passes through are selected as target windows).
Fig. 2 is a schematic diagram of selection manners of a target window according to an embodiment of the present application. As illustrated in the leftmost fig. 2a, a user may click one or more windows presented in the visible area of the VR device using keys provided on the remote control handle, or by holding the remote control handle and performing a tap-like or click-like action; the clicked windows become the selected target windows. As illustrated in the middle fig. 2b, the user may select one or more windows by tapping or clicking with the hand, or by sweeping the hand so that the windows in the area the sweep passes through become the selected target windows. The hand can move within the visible area of the VR device; when the movement track forms a closed figure, the windows covered by the closed figure are the target windows. When the movement track forms a curve or a straight line, the windows the curve or line passes through can be selected as target windows. It can be understood that the user may also move the remote control handle within the visible area of the VR device to establish a closed figure and thereby select target windows. As illustrated in the rightmost fig. 2c, the user may grab some windows by hand to serve as target windows; when grabbing by hand, the windows covered by the hand are the selected target windows.
Fig. 3A to fig. 3C are schematic diagrams of selection results corresponding to the selection manners of a target window according to an embodiment of the present application. As illustrated in fig. 3A, the user clicks on window A, window C, and window D among window A, window B, window C, and window D using the remote control handle or a finger. Window A, window C, and window D, as the selected target windows, may show a selected mark 1001 in the upper right corner. The selected mark 1001 may be a black dot as shown in fig. 3A; marks of other shapes and types, such as a triangle, a diamond, or a square, are also possible. The embodiments of the present application are not limited herein.
As illustrated in fig. 3B, the user moves a hand or the remote control handle within the visible area of the VR device. The closed figure formed by the movement track may be a regular rectangle 1002, a triangle 1003, or an irregular shape 1004 as shown in fig. 3B. Window A, window B, and window C covered by the regular rectangle 1002, the triangle 1003, or the irregular shape 1004 are the selected target windows. Of course, the shape of the closed figure is not limited to those shown in fig. 3B and may be any other shape; the embodiment of the present application is not limited herein. It will be appreciated that a window may be determined to be a target window when it lies entirely within the closed figure formed by the movement track, or when it lies partially within the closed figure (for example, a window may be regarded as a selected target window if at least 50% of its area is within the closed figure).
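For illustration only, the 50%-coverage rule described above can be sketched as a polygon-intersection test. The snippet below is a minimal sketch, assuming rectangular windows and using the third-party shapely library for the geometry; the function name and the example coordinates are hypothetical, not part of the embodiment.

```python
# Hedged sketch of the "at least 50% of the window's area inside the closed
# figure" rule, using shapely (pip install shapely).
from shapely.geometry import Polygon, box

def covered_fraction(window_rect, closed_track):
    """window_rect: (x, y, w, h); closed_track: (x, y) vertices of the closed
    figure drawn by the hand or the remote control handle."""
    x, y, w, h = window_rect
    window = box(x, y, x + w, y + h)
    figure = Polygon(closed_track)
    if not figure.is_valid:        # a self-intersecting track: repair it
        figure = figure.buffer(0)
    return window.intersection(figure).area / window.area

# A window counts as selected if at least half of it lies inside triangle 1003:
triangle = [(0, 0), (10, 0), (5, 8)]
print(covered_fraction((3, 1, 4, 3), triangle) >= 0.5)   # True
```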
As illustrated in fig. 3C, the user moves a hand or the remote control handle within the visible area of the VR device. The movement track forms, for example, a straight line 1005 or a curve 1006 as shown in fig. 3C. Window A, window B, and window C, through which the line 1005 or the curve 1006 passes, are the selected target windows.
The multiple windows 1000 within the visible area 100 of the VR device are located in three-dimensional space. Three-dimensional space is composed of the three dimensions of length, width, and height, corresponding to the x-axis (horizontal axis), the y-axis (vertical axis), and the z-axis (depth axis). The windows 1000 may be located in the same plane region (the same level) of the three-dimensional space, or in different plane regions (different levels). For example, as shown in fig. 4, window A, window B, window C, and window D are all in the same plane region (the first plane region); that is, they are windows at the same level (denoted Layer-1 in the embodiment of the present application). Window E and window F are in a second plane region behind the first plane region; that is, window E and window F are at a second level (denoted Layer-2), which differs from the first level of window A, window B, window C, and window D. Window G, window H, and window I are in a third plane region behind the second plane region; that is, they are at a third level (denoted Layer-3). The windows in the three plane regions are thus at different levels.
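For illustration only, the layout of fig. 4 can be modeled as a small data structure: each level is a plane at an assumed depth along the z-axis, and each window carries a rectangle within its level's plane. All names and coordinates below are hypothetical, not from the patent.

```python
# Hypothetical data model for the fig. 4 layout.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Window:
    name: str
    level: str                               # "Layer-1", "Layer-2", ...
    rect: Tuple[float, float, float, float]  # x, y, width, height in the plane

LEVEL_DEPTH = {"Layer-1": 1.0, "Layer-2": 2.0, "Layer-3": 3.0}  # assumed z

windows = [
    Window("A", "Layer-1", (0, 0, 3, 2)), Window("B", "Layer-1", (4, 0, 3, 2)),
    Window("C", "Layer-1", (0, 3, 3, 2)), Window("D", "Layer-1", (4, 3, 3, 2)),
    Window("E", "Layer-2", (0, 0, 3, 2)), Window("F", "Layer-2", (4, 0, 3, 2)),
    Window("G", "Layer-3", (0, 0, 2, 2)), Window("H", "Layer-3", (3, 0, 2, 2)),
    Window("I", "Layer-3", (6, 0, 2, 2)),
]
# Windows sharing a level lie in the same plane region; different levels sit
# at increasing depth away from the user side.
```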
Of course, the plurality of windows 1000 in the visible area 100 of the VR device may also be displayed in other manners, and embodiments of the present application are not limited in this respect.
According to some embodiments of the application, the VR device and the remote control handle are associated with each other, and the VR device can control a plurality of windows selected by the remote control handle. When a user is using the VR device, multiple windows 1000 at the same level or at different levels, as shown in fig. 1A and 1B, are presented within the visible area 100 of the VR device. The user can select, with the pointing component 11, some of these windows as target windows to be controlled. The selected target windows may be several windows at the same level or at different levels. After the target windows are selected, the VR device simultaneously performs the operation indicated by the pointing component 11 on each target window. Operations indicated by the pointing component 11 include, but are not limited to, a zoom operation to reduce or enlarge the target windows, a replacement operation to exchange the selected target windows with one another, and a deletion operation to delete the selected target windows. Because the user selects a plurality of target windows with the pointing component, the VR device can conveniently operate on the target windows in batches, which improves the user experience.
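To make the four steps of the method concrete (detect movement, determine the selection area, determine the target windows, operate in batch), the following minimal sketch approximates the selection area by the bounding box of the movement track and selects every window at least half covered by it. It is an assumed illustration under those assumptions, not the patented implementation; all names are hypothetical.

```python
# Hypothetical end-to-end sketch of the four claimed steps.
from typing import Dict, List, Tuple

Rect = Tuple[float, float, float, float]   # x, y, width, height

def selection_area(track: List[Tuple[float, float]]) -> Rect:
    """Step 2: derive the selection area from the movement track of the
    pointing component (here: its bounding box)."""
    xs, ys = zip(*track)
    return min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys)

def pick_targets(windows: Dict[str, Rect], area: Rect,
                 min_overlap: float = 0.5) -> List[str]:
    """Step 3: a window becomes a target when at least `min_overlap` of its
    area lies inside the selection area."""
    ax, ay, aw, ah = area
    targets = []
    for name, (x, y, w, h) in windows.items():
        ox = max(0.0, min(x + w, ax + aw) - max(x, ax))
        oy = max(0.0, min(y + h, ay + ah) - max(y, ay))
        if w * h > 0 and ox * oy >= min_overlap * w * h:
            targets.append(name)
    return targets

# Step 1 is the tracked movement; step 4 applies one operation to all targets:
track = [(0, 0), (9, 0), (9, 5), (0, 5)]            # rectangular hand track
layer1 = {"A": (1, 1, 3, 2), "B": (5, 1, 3, 2), "C": (1, 6, 3, 2)}
targets = pick_targets(layer1, selection_area(track))
print(targets)   # A and B are covered; C lies outside the selection area
```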
The following describes the structure of a VR device implementing the window control method exemplified in the above embodiment of the present application:
As shown in fig. 5A, the VR device 50 includes, but is not limited to, a processor 11. The processor 11 is configured to generate corresponding operation control signals for the corresponding components in the device, and to read and process data in software, in particular the data and programs in the memory, so that each functional module in the device performs its corresponding function and the corresponding components act as required by the instructions. This applies, for example, to various media processing algorithms, including human-machine interaction, motion tracking/prediction (in embodiments of the present application, e.g., tracking user hand movement and the movement and rotation of the remote control handle), rendering display, audio processing, window reduction or enlargement, window replacement, and window deletion.
The sensor system 12 is used to collect, acquire, or transmit information, including image information and distance information, such as hand information and information on where a ray from the remote control handle clicks in an embodiment of the present application. The sensor system of the embodiment of the application may include a 3-axis or 6-axis sensor for acquiring motion information of the VR device, such as angular velocity and linear acceleration, while positioning, tracking, and recognizing hand motion and acquiring static and dynamic characteristics of the hand. Static characteristic information includes, for example, fingertip pointing, palm centroid, and hand joints; such features are typically acquired from single-frame data. Dynamic characteristic information includes, for example, displacement vectors and motion speed; such characteristic information is typically obtained from multi-frame data. The sensor system may also store some specific program instructions.
The memory 13 is used for storing programs and various data, mainly software elements such as an operating system, applications, and functional instructions, or a subset or an extended set thereof. It may also include a non-volatile random access memory that provides the processor 11 with control software and applications, including managing hardware, software, and data resources in the computing processing device. The memory is also used for storing the selection ranges the user establishes over multiple windows, as well as running programs and applications. As shown in fig. 5B, at least one storage unit may be disposed in the memory 13 of the VR device 50, each storage unit having its own storage function: for example, a first storage unit stores software elements such as the operating system, applications, and function instructions; a second storage unit stores applications and running programs; and a third storage unit stores the selection ranges the user establishes over multiple windows.
The display element 14 typically includes a display screen and associated optics for content display, and typically a display interface is presented in the display screen for human-machine interaction and viewing of the window.
Acoustic elements 15, such as microphones, speakers, and headphones, are used for sound input and output.
Physical hardware 16, such as physical function keys, e.g., on-off keys, volume keys, mechanical control keys, etc.
The device may also comprise components 17 other than the above components 11-16, to make the device's functions and appearance richer and more refined.
The above hardware 11-16 and part of the hardware 17 may be electrically coupled via a bus for communication.
The following describes a window control method according to an embodiment of the present application with reference to the accompanying drawings:
According to some embodiments of the application, the plurality of windows within the visible area of the VR device are windows at the same level, and the window control method of the embodiment of the application is applied to control a plurality of windows at the same level to be collapsed and expanded.
As shown in fig. 6A, the windows at the same level (Layer-1) include, but are not limited to, window A, window B, window C, and window D. The user establishes the selection area in any one of the ways shown in fig. 2. For example, the user moves a hand to create a rectangular movement track, and the rectangle formed by the track covers window A, window B, window C, and window D. The selected window A, window B, window C, and window D serve as the target windows, and each presents a selected mark 1001 to prompt the user which windows are currently in the selected state and await subsequent operations.
After window A, window B, window C, and window D are selected, the user can simultaneously enlarge or reduce them by closing or opening the fingers of the hand. As shown in fig. 6B, when the fingers of the user's hand close, window A, window B, window C, and window D are reduced together. The scale to which they are reduced may be determined by the degree of closure of the fingers relative to their original positions. For example, the VR device tracks the movement of the fingers in real time; when the fingers are fully closed, the windows are scaled to a minimum, and when the fingers are closed halfway relative to the original state, the windows are scaled to half of their original size. The scaling of window A, window B, window C, and window D may also be determined by other means, and embodiments of the present application are not limited in this respect.
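The mapping from finger closure to window scale described above (fully closed scales to a minimum, half closed scales to half size) can be read as a linear interpolation. The sketch below assumes a normalized closure degree supplied by the hand tracker; the function names and the minimum scale are hypothetical.

```python
# Assumed linear mapping from finger-closure degree to window scale:
# closure 0.0 = fingers in their original open state, 1.0 = fully closed.
def scale_from_closure(closure: float, min_scale: float = 0.05) -> float:
    closure = min(max(closure, 0.0), 1.0)  # clamp hand-tracking noise
    return max(min_scale, 1.0 - closure)   # half closed -> half size

def rescale_targets(target_scales: dict, closure: float) -> None:
    """Would be called each frame by the (hypothetical) hand tracker:
    every selected window is rescaled together, as in fig. 6B."""
    s = scale_from_closure(closure)
    for name in target_scales:
        target_scales[name] = s

scales = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0}
rescale_targets(scales, 0.5)   # fingers closed halfway
print(scales)                  # every target is now at half size: 0.5
```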
In addition, it is understood that in a scenario using a remote control handle, the reduction or enlargement of the windows may be controlled by the keys of the remote control handle or by waving the remote control handle in a particular manner.
It should be noted that the reduced window A, window B, window C, and window D may still be displayed in the plane areas they occupied before being reduced; that is, their levels after reduction may be the same as their levels before reduction. In addition, the reduced windows may be presented in the form of icons at an off-center location within the visible area of the VR device (e.g., the lower right or lower left corner), so that the user can enlarge them for display again.
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at the same level, the windows can be operated in batches, which improves the user experience compared with operating a single window at a time.
Although the above embodiment explains that all of windows A to D in the first-level display area are selected and reduced, the present application is not limited thereto; a part of the windows may be selected instead, such as windows A and D, or windows A, B, and C.
According to some embodiments of the application, the plurality of windows within the visible area of the VR device are windows at different levels, and the window control method of the embodiment of the application is applied to control a plurality of windows at different levels to be collapsed and expanded. Windows at a first level (denoted Layer-1 in the embodiment of the present application), a second level (denoted Layer-2), and a third level (denoted Layer-3) are taken as an example. In the embodiment of the present application, the windows of different levels, i.e., the windows of the first, second, and third levels, may be located at the same position on the z-axis of the three-dimensional coordinate system. It will be appreciated that the levels of other windows in the VR device may differ from the first to third levels mentioned here, and there may be more levels in the visible area of the VR device; embodiments of the present application are not limited in this respect.
In three-dimensional space, as shown in fig. 7A, the windows of the first level, the second level, and the third level may be at the same position on the z-axis of the three-dimensional coordinate system. That is, the coordinates of the vertex Layer-10 of the plane area formed by the windows of the first level, the vertex Layer-20 of the plane area formed by the windows of the second level, and the vertex Layer-30 of the plane area formed by the windows of the third level are the same on the z-axis, i.e., the coordinate on the z-axis is z1. The windows at the first level include, but are not limited to, window A, window B, window C, and window D; the windows at the second level include, but are not limited to, window E and window F; and the windows at the third level include, but are not limited to, window G, window H, and window I.
As shown in fig. 7B, the user establishes selection areas in any of the ways shown in fig. 2. For example, by clicking or by establishing a selection area with finger movement, the user selects window A, window B, and window D of the first level as the target windows selected at the first level (a first selection area), selects window F of the second level as a selected target window (a second selection area), and selects window I of the third level as a selected target window (a third selection area). The selected window A, window B, window D, window F, and window I present a selected mark 1001 to prompt the user which windows are currently in the selected state and await subsequent operations. Notably, when the user clicks windows of different levels with a finger to select target windows, e.g., clicks window A, window B, and window D of the first level as the target windows selected at the first level, then if the overlapping area between a window of the second level (the level next after the first level) and the selected windows A, B, and D of the first level exceeds a threshold (for example, 80% of its area), that overlapping window of the second level may be directly selected as a target window as well.
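The 80% cross-level rule can be sketched as follows: after the first-level windows are selected, their footprints are compared with each window of the next level, and any window whose overlapping area reaches the threshold is selected automatically. The snippet is an assumed illustration with hypothetical names, not the embodiment itself.

```python
# Assumed sketch of the cross-level rule: a window on the next level is also
# selected when the area it shares with the windows already selected on the
# nearer level reaches the threshold (80% of its own area in the text).
from typing import Dict, List, Tuple

Rect = Tuple[float, float, float, float]   # x, y, width, height

def overlap(r1: Rect, r2: Rect) -> float:
    x1, y1, w1, h1 = r1
    x2, y2, w2, h2 = r2
    ox = max(0.0, min(x1 + w1, x2 + w2) - max(x1, x2))
    oy = max(0.0, min(y1 + h1, y2 + h2) - max(y1, y2))
    return ox * oy

def propagate(selected: List[Rect], next_level: Dict[str, Rect],
              threshold: float = 0.8) -> List[str]:
    """Assumes the selected footprints do not overlap one another, so their
    individual overlaps with a candidate window can be summed."""
    picked = []
    for name, rect in next_level.items():
        covered = sum(overlap(rect, s) for s in selected)
        if covered >= threshold * rect[2] * rect[3]:
            picked.append(name)
    return picked

# Footprint of a selected Layer-1 window versus two Layer-2 candidates:
print(propagate([(0, 0, 4, 4)], {"F": (0, 0, 4, 3), "E": (8, 8, 2, 2)}))
# prints ['F']: F is fully covered, E shares no area with the selection
```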
After window A, window B, window D, window F, and window I are selected, the user can simultaneously enlarge or reduce them by closing or opening the fingers of the hand. As shown in fig. 7C, when the fingers of the user's hand close, window A, window B, window D, window F, and window I are reduced together. The scale to which they are reduced may be determined by the degree of closure of the fingers relative to their original positions. For example, the VR device tracks the movement of the fingers in real time; when the fingers are fully closed, the windows are scaled to a minimum, and when the fingers are closed halfway relative to the original state, the windows are scaled to half of their original size. The scaling of window A, window B, window D, window F, and window I may also be determined by other means, and embodiments of the present application are not limited in this respect.
Notably, the reduced window A, window B, and window D may still be displayed in the plane area of the first level they occupied before being reduced; that is, their levels after reduction may be the same as their levels before reduction. The reduced window F may still be displayed in the plane area of the second level it occupied before being reduced, and the reduced window I may still be displayed in the plane area of the third level it occupied before being reduced. Reducing the windows of the first level thus makes it convenient to display more of the windows of the later levels. Further, the reduced windows may be presented in the form of icons at an off-center location within the visible area of the VR device (e.g., the lower right or lower left corner of the visible area), so that the user can enlarge them again.
In addition, it is understood that in a scenario using a remote control handle, the reduction or enlargement of the windows may be controlled by the keys of the remote control handle or by waving the remote control handle in a particular manner.
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at different levels, the windows can be operated in batches, which improves the user experience compared with operating a single window at a time.
Although the above embodiment explains that window A, window B, window D, window F, and window I in the first- to third-level display areas are all selected and reduced, the present application is not limited thereto; a part of the windows may be selected instead, for example windows A and B of the first level, windows E and F of the second level, and windows H and I of the third level.
According to some embodiments of the application, the plurality of windows within the visible area of the VR device are windows at different levels but in the same longitudinal space, and the window control method of the embodiment of the application is applied to control such windows to be collapsed and expanded. Windows at a first level (denoted Layer-1 in the embodiment of the present application), a second level (denoted Layer-2), and a third level (denoted Layer-3) are taken as an example. In the embodiment of the present application, the windows of different levels, i.e., the windows of the first, second, and third levels, are located at the same position on the y-axis of the three-dimensional coordinate system (the same position means that the overlapping area of the windows of the three levels reaches a threshold, which may be any value such as 80% or 85%; this is not limited in this embodiment of the application). It will be appreciated that the levels of other windows in the VR device may differ from the first to third levels mentioned here, and there may be more levels in the visible area of the VR device; embodiments of the present application are not limited in this respect.
In three-dimensional space, as shown in fig. 8A, the windows of the first level, the second level, and the third level may be at the same position on the y-axis of the three-dimensional coordinate system. That is, the coordinates of the vertex Layer-40 of the plane area formed by the windows of the first level, the vertex Layer-50 of the plane area formed by the windows of the second level, and the vertex Layer-60 of the plane area formed by the windows of the third level are the same on the y-axis, i.e., the coordinate on the y-axis is y1. The windows at the first level include, but are not limited to, window A, window B, window C, and window D; the windows at the second level include, but are not limited to, window E, window F, window G, and window H; and the windows at the third level include, but are not limited to, window I, window J, and window K.
As shown in fig. 8B, the user establishes selection areas in any of the ways shown in fig. 2. For example, by clicking or by establishing a selection area with finger movement, the user selects window A and window B of the first level, window E of the second level, and window I of the third level as the target windows of a first selection area (denoted group 1 in the embodiment of the present application). Likewise, the user selects window D of the first level, window M and window G of the second level, and window K of the third level as the target windows of a second selection area (denoted group 2 in the embodiment of the present application). The selected window A, window B, window E, window I, window D, window M, window G, and window K present a selected mark 1001 to prompt the user which windows are currently in the selected state and await subsequent operations.
After window A, window B, window E, window I, window D, window M, window G, and window K are selected, the user can enlarge or reduce them by closing or opening the fingers of the hand. As shown in fig. 8C, when the fingers of the user's hand close, all of these windows are reduced together, with window A, window B, window E, and window I reduced as a first window group, and window D, window M, window G, and window K reduced as a second window group. The scale of the reduction may be determined by the degree of closure of the fingers relative to their original positions. For example, the VR device tracks the movement of the fingers in real time; when the fingers are fully closed, the windows are scaled to a minimum, and when the fingers are closed halfway relative to the original state, the windows are scaled to half of their original size. The scaling may also be determined by other means, and embodiments of the present application are not limited in this respect.
In addition, it is understood that in a scenario using a remote control handle, the reduction or enlargement of the windows may be controlled by the keys of the remote control handle or by waving the remote control handle in a particular manner.
Notably, the reduced window A, window B, window E, window I, window D, window M, window G, and window K may still be displayed in the plane areas of the levels they occupied before being reduced; that is, their levels after reduction may be the same as their levels before reduction. Reducing the windows of the first level thus makes it convenient to display more of the windows of the later levels. Further, the reduced windows may be presented in the form of icons at an off-center location within the visible area of the VR device (e.g., the lower right or lower left corner of the visible area), so that the user can enlarge them again.
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at different levels but in the same longitudinal space, the windows can be operated in batches, which improves the user experience compared with operating a single window at a time.
According to some embodiments of the application, the plurality of windows within the visible area of the VR device are windows at different levels, and the window control method of the embodiment of the application is applied to control a plurality of windows at different levels to be collapsed and expanded. Windows at a first level (denoted Layer-1 in the embodiment of the present application), a second level (denoted Layer-2), and a third level (denoted Layer-3) are taken as an example. In the embodiment of the present application, the windows of different levels, i.e., the windows of the first, second, and third levels, may be located at the same position on the x-axis of the three-dimensional coordinate system. It will be appreciated that the levels of other windows in the VR device may differ from the first to third levels mentioned here, and there may be more levels in the visible area of the VR device; embodiments of the present application are not limited in this respect.
In three-dimensional space, as shown in fig. 9A, the windows of the first level, the second level, and the third level may overlap at the same position on the x-axis of the three-dimensional coordinate system. That is, the coordinates of the vertex Layer-70 of the plane area formed by the windows of the first level, the vertex Layer-80 of the plane area formed by the windows of the second level, and the vertex Layer-90 of the plane area formed by the windows of the third level are the same or partially coincide on the x-axis, i.e., the coordinate on the x-axis is x1. For example, when the user clicks window D of the first level with a finger, window D is selected, and the coordinates of window D coincide with the coordinates of window F of the second level and window I of the third level. It will be appreciated that when the coordinates of window F of the second level and of window I of the third level overlap with the coordinates of window D, the windows of the first, second, and third levels may be considered to overlap at the same position on the x-axis of the three-dimensional coordinate system; the embodiment of the present application is not limited herein. The windows at the first level include, but are not limited to, window A, window B, window C, and window D; the windows at the second level include, but are not limited to, window E and window F; and the windows at the third level include, but are not limited to, window G, window H, and window I.
As shown in fig. 9B, the user establishes selection areas in any of the ways shown in fig. 2. For example, by clicking or by establishing a selection area with finger movement, the user selects window D of the first level as the target window selected at the first level (a first selection area), selects window F of the second level as a selected target window (a second selection area), and selects window I of the third level as a selected target window (a third selection area). The selected window D, window F, and window I present a selected mark 1001 to prompt the user which windows are currently in the selected state and await subsequent operations. Notably, when the user clicks windows of different levels with a finger to select target windows, e.g., clicks window D of the first level as the target window selected at the first level, then if the overlapping area between a window of the second level (the level next after the first level) and the selected window of the first level exceeds a threshold (for example, 80% of its area), that overlapping window of the second level (such as window F) may be directly selected as a target window. The selection mode of the target window is not limited in the embodiment of the application.
After window D, window F, and window I are selected, the user can simultaneously enlarge or reduce them by closing or opening the fingers of the hand. As shown in fig. 9C, when the fingers of the user's hand close, window D, window F, and window I are reduced together. The scale to which they are reduced may be determined by the degree of closure of the fingers relative to their original positions. For example, the VR device tracks the movement of the fingers in real time; when the fingers are fully closed, the windows are scaled to a minimum, and when the fingers are closed halfway relative to the original state, the windows are scaled to half of their original size. The scaling of window D, window F, and window I may also be determined by other means, and embodiments of the application are not limited in this regard.
In addition, it is understood that in a scenario using a remote control handle, the reduction or enlargement of the windows may be controlled by the keys of the remote control handle or by waving the remote control handle in a particular manner.
It is noted that the reduced window D may still be displayed in the plane area of the first level it occupied before being reduced; that is, the level of window D after reduction may be the same as its level before reduction. The reduced window F may still be displayed in the plane area of the second level it occupied before being reduced, and the reduced window I may still be displayed in the plane area of the third level it occupied before being reduced. Further, the reduced windows may be presented in the form of icons at an off-center location within the visible area of the VR device (e.g., the lower right or lower left corner of the visible area), so that the user can enlarge them again.
In addition, the selection area established for the target windows can be stored in the memory, so that it can be recalled and used directly, avoiding re-establishing the selection area and improving the efficiency of operating the windows.
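Such storage and recall of a selection area could look like the sketch below, which plays the role of the third storage unit described for the memory 13 in fig. 5B; the class and method names are assumptions for illustration.

```python
# Hypothetical cache for established selection areas.
from typing import Any, Dict, Optional

class SelectionStore:
    def __init__(self) -> None:
        self._areas: Dict[str, Any] = {}

    def save(self, name: str, area: Any) -> None:
        self._areas[name] = area       # e.g. a rect or polygon vertex list

    def recall(self, name: str) -> Optional[Any]:
        """Reuse a stored area instead of drawing the track again."""
        return self._areas.get(name)

store = SelectionStore()
store.save("last", (0, 0, 9, 5))       # bounding box of an earlier track
print(store.recall("last"))            # reused without re-establishing it
```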
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at different levels, the windows can be operated in batches, which improves the user experience compared with operating a single window at a time.
Although the above embodiment explains that window D, window F, and window I in the first- to third-level display areas are all selected and reduced, the present application is not limited thereto; a part of the windows may be selected instead, for example windows A and B of the first level, windows E and F of the second level, and windows H and I of the third level.
According to some embodiments of the application, the plurality of windows within the viewable area of the VR device are windows at different levels or at the same level. The window control method of the embodiment of the application can be applied to control the replacement of two windows at the same level. Windows at the first level are taken as an example to describe how the window control method provided by the embodiment of the application controls the replacement of two windows at the same level.
In the embodiment of the present application, windows at different levels are exemplified by windows at a first level (denoted Layer-1 in the embodiment of the present application), a second level (denoted Layer-2), and a third level (denoted Layer-3). A plurality of windows at different levels means that a window of the first level, a window of the second level, and a window of the third level may be overlaid at the same position on the x-axis of the three-dimensional space coordinates. That is, the coordinates of the planar area formed by the windows of the first level, the planar area formed by the windows of the second level, and the planar area formed by the windows of the third level are the same or partially coincide on the x-axis. For example, when the user clicks window D of the first level with a finger, window D is selected. The coordinates of the position clicked on window D, mapped from the plane of the first level onto the plane of the second level and the plane of the third level, coincide with window F of the second level and with window I of the third level, respectively; thus window F of the second level and window I of the third level are also selected. In addition, it is understood that when the coordinates of window F of the second level and of window I of the third level overlap the coordinates covered by window D to a preset extent, window D of the first level, window F of the second level, and window I of the third level may be considered to overlap at the same position on the x-axis of the three-dimensional space coordinates. For example, when the user clicks window D with a finger, window D is selected; when it is determined that more than 50% of the area of window F of the second level and of window I of the third level falls within the range covered by window D mapped onto their planes, window F and window I are also selected. The embodiments of the present application are not limited herein. It will be appreciated that the levels of the remaining windows in the VR device may differ from the first to third levels mentioned in the embodiments of the present application, and there may be more levels in the visible area of the VR device; embodiments of the application are not limited in this respect.
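Mapping a clicked position through the level planes then reduces to reusing the same in-plane coordinates on each deeper plane and hit-testing that plane's windows. A sketch reusing the Window model from the earlier sketch (the function name is an assumption):

    def map_click_through_levels(px, pz, windows):
        # Project the click at in-plane coordinates (px, pz) onto every
        # level's plane and collect the window hit on each level.
        hit = []
        for win in windows:
            if win.x <= px <= win.x + win.w and win.z <= pz <= win.z + win.h:
                hit.append(win)  # same (px, pz), different level plane
        return hit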
In three-dimensional space, the positions of the windows of the first, second, and third levels may be as shown in fig. 9A, and are not described again here.
As shown in fig. 10A, the user establishes the selection area in any of the ways shown in fig. 2. For example, the user selects window A in the first level as the first target window by clicking or by establishing a selection area with the movement of a finger, and selects window B in the first level as the second target window in the same manner. The selected window A and window B each present a selected identifier 1001, which prompts the user which windows are currently in the selected state and awaiting a subsequent operation.

After window A and window B are selected, as shown in fig. 10B, the user moves window B with a finger toward the display area where window A is located, so as to move window B to the position of window A and window A to the position of window B.

As shown in fig. 10C, after the replacement of the positions of window A and window B is completed, window B is located in the display area of window A, and window A is located in the display area of window B. It is noted that when the area of the display area of window A is not identical to that of window B, after the two windows are interchanged, the window area of window B is adjusted to be the same as the display area at the position of window A, and the window area of window A is adjusted to be the same as the display area at the position of window B. That is, the area of the first target window is adapted to the display area at the position of the second target window, and the area of the second target window is adapted to the display area at the position of the first target window.
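Swapping two windows while adapting each to the region it moves into is simply an exchange of both position and size. A minimal sketch, using the same assumed rectangle model as above:

    def swap_windows(a, b):
        # Each window takes over the other's position *and* display area,
        # so both fit the regions they move into.
        (a.x, a.z, a.w, a.h), (b.x, b.z, b.w, b.h) = \
            (b.x, b.z, b.w, b.h), (a.x, a.z, a.w, a.h)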
In the above, replacement is triggered by moving one window toward another with a finger; according to an embodiment of the present application, replacement may also be triggered by a specific gesture, such as rotating the palm of the hand, after both windows have been selected as the replacement objects. In addition, it is understood that in the context of using a remote control handle, the replacement of windows may be controlled by a key of the remote control handle or by waving the remote control handle in a particular manner.
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at the same level, the windows can be operated in batches, which improves the user's experience compared with operating a single window at a time.

Although the above embodiment explains that window A and window B are both selected on the first-level display area and their positions are replaced with each other, the present application is not limited thereto; the remaining windows of the first level or windows of other levels may also be selected, for example windows A and D of the first level, windows E and F of the second level, or windows H and I of the third level.
According to some embodiments of the application, the plurality of windows within the viewable area of the VR device are windows at different levels or at the same level. The window control method of the embodiment of the application can be applied to control the replacement among a plurality of windows at different levels. A window at the first level and a window at the second level are taken as an example to describe how the window control method provided by the embodiment of the application controls the replacement among a plurality of windows at different levels.
In the embodiment of the present application, windows in different levels are exemplified by windows in a first level (which is denoted as Layer-1 in the embodiment of the present application), a second level (which is denoted as Layer-2 in the embodiment of the present application), and a third level (which is denoted as Layer-3 in the embodiment of the present application), respectively.
The windows respectively located at the different levels may be selected in a manner of selecting the windows of the different levels as explained in connection with fig. 9A to 9B above.
In three-dimensional space, the positions of the windows of the first, second, and third levels may be as shown in fig. 9A, and are not described again here.
As shown in fig. 11A, the user establishes the selection area in any of the ways shown in fig. 2. For example, the user selects window A, window B and window C in the first level as the first target window by clicking or by establishing a selection area with the movement of a finger, and selects window E in the second level as the second target window in the same manner. The selected window A, window B, window C and window E each present a selected identifier 1001, which prompts the user which windows are currently in the selected state and awaiting a subsequent operation.

After window A, window B, window C and window E are selected, the user moves window A, window B and window C with a finger toward the display area where window E is located, so as to move window A, window B and window C to the position of window E, and window E to the position of the display area formed by window A, window B and window C.

As shown in fig. 11B, after the replacement is completed, window E is located in the display area where window A, window B and window C were located, and window A, window B and window C are located in the display area where window E was located.

It is noted that when the area of the display area formed by window A, window B and window C is inconsistent with the area of the display area of window E, after their positions are interchanged, the window area of window E is adjusted to be the same as the display area at the position where window A, window B and window C were located, and the area of the planar region formed by window A, window B and window C is adjusted to be the same as the display area at the position of window E. That is, the area of the first target window is adapted to the display area at the position of the second target window, and the area of the second target window is adapted to the display area at the position of the first target window.
In the above, replacement is triggered by moving some windows toward other windows with a finger; according to an embodiment of the present application, replacement may also be triggered by a specific gesture, such as rotating the palm of the hand, after the windows have been selected as the replacement objects. In addition, it is understood that in the context of using a remote control handle, the replacement of windows may be controlled by a key of the remote control handle or by waving the remote control handle in a particular manner.
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at different levels, the windows can be operated in batches, which improves the user's experience compared with operating a single window at a time.

Although the above embodiment explains that window A, window B and window C of the first level and window E of the second level are selected and their positions are replaced with each other, the present application is not limited thereto; the remaining windows of the first level or windows of other levels may be selected for position replacement, for example windows A, B, C and D of the first level may be replaced with windows E and F of the second level, or window G of the third level may be replaced with window E of the second level.
According to some embodiments of the application, the plurality of windows within the viewable area of the VR device are windows at different levels or at the same level. The window control method of the embodiment of the application is applied to control the replacement among a plurality of windows which are in different levels and are in the same longitudinal space. The window control method provided by the embodiment of the application is applied to control the replacement among a plurality of windows in different levels by taking the window in the first level, the window in the second level and the window in the third level as an example.
In the embodiment of the present application, windows at different levels are exemplified by windows at a first level (denoted Layer-1 in the embodiment of the present application), a second level (denoted Layer-2), and a third level (denoted Layer-3). Here, a plurality of windows at different levels means that a window of the first level, a window of the second level, and a window of the third level may be overlaid at the same position on the y-axis of the three-dimensional space coordinates (i.e., the windows of the first, second, and third levels are in the same longitudinal space). That is, the coordinates of the planar area formed by the windows of the first level, the planar area formed by the windows of the second level, and the planar area formed by the windows of the third level are the same or partially coincide on the y-axis. For example, when the user clicks window D with a finger, window D is selected, and the coordinates of window D are identical to or partially coincide with the coordinates of window H of the second level and of window K of the third level. It is to be understood that when the coordinates of window H of the second level and of window K of the third level overlap the coordinates of window D to a preset extent, the window of the first level, the window of the second level and the window of the third level may be considered to be covered at the same position on the y-axis of the three-dimensional space coordinates; this is not limited herein. It will be appreciated that the levels of the remaining windows in the VR device may differ from the first to third levels mentioned in the embodiments of the present application, and there may be more levels in the visible area of the VR device; embodiments of the application are not limited in this respect.
In three-dimensional space, the positions of the windows of the first, second, and third levels may be as shown in fig. 8A, and are not described again here.
As shown in fig. 12A, the user establishes the selection area in any of the ways shown in fig. 2. For example, by clicking or by establishing a selection area with the movement of a finger, the user selects window B and window C of the first level, window F of the second level, and window J of the third level as the first target windows (denoted group1 in the figure), and selects window D of the first level, windows G and H of the second level, and window K of the third level as the second target windows (denoted group2 in the figure). The selected windows B, C, F, J, D, G, H and K each present a selected identifier 1001, which prompts the user which windows are currently in the selected state and awaiting a subsequent operation.
After windows B, C, F and J and windows D, G, H and K are selected, the user moves window B, window C, window F and window J with a finger toward the display area where window D, window G, window H and window K are located, so that windows B and C move to the display area where window D is located, window F moves to the display area where windows G and H are located, and window J moves to the display area where window K is located. Correspondingly, window D moves to the display area where windows B and C were located, windows G and H move to the display area where window F was located, and window K moves to the display area where window J was located.

As shown in fig. 12B, after the replacement is completed, window B and window C are located in the display area where window D was located, window F is located in the display area where windows G and H were located, and window J is located in the display area where window K was located; window D is located in the display area where windows B and C were located, windows G and H are located in the display area where window F was located, and window K is located in the display area where window J was located.
It is noted that after windows B and C are interchanged with window D, when the area of the display area formed by windows B and C is inconsistent with the area of the display area of window D, the area of the planar region formed by windows B and C is adjusted to be the same as the display area at the position of window D, and the window area of window D is adjusted to be the same as the display area of the planar region formed by windows B and C.

After window F is interchanged with windows G and H, when the window area of window F is inconsistent with the area of the planar region formed by windows G and H, the window area of window F is adjusted to be the same as the display area of the planar region formed by windows G and H, and the area of the planar region formed by windows G and H is adjusted to be the same as the display area at the position of window F.

After window K and window J are interchanged, when the window area of window K is inconsistent with the area of the display area at the position of window J, the window area of window K is adjusted to be the same as the display area at the position of window J, and the window area of window J is adjusted to be the same as the display area at the position of window K.
In addition, it is understood that in the context of using a remote control handle, the replacement of windows may be controlled by a key of the remote control handle or by waving the remote control handle in a particular manner. Further, after the replacement of the positions of windows B, C, F and J with windows D, G, H and K is completed, the identifier 1001 may continue to be displayed on the selected windows, to prompt the user which windows are now in the selected state and have been replaced.
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at different levels, the windows can be operated in batches, which improves the user's experience compared with operating a single window at a time.
The following describes a procedure of controlling the retraction of windows of different levels or the same level by applying the window control method provided by the embodiment of the present application.
Referring to fig. 13, fig. 13 is a schematic flow chart of a window control method applied to control the retraction of windows of different levels or the same level according to an embodiment of the present application.
The method includes steps S130-S133.
Step S130, receiving a batch operation instruction. The batch operation instruction is used to instruct a retracting (zoom) operation on a plurality of windows in the visible area of the VR device. The plurality of windows may be a plurality of windows at the same level, for example window A, window B, window C and window D at the same level shown in fig. 6A; or a plurality of windows at different levels, such as window A, window B, window C and window D, window E and window F, and window G, window H and window I of the first to third levels shown in fig. 7A; window A, window B, window C and window D of the first level, window E, window F, window G and window H of the second level, and window I, window J and window K of the third level shown in fig. 8A; or window A, window B, window C and window D of the first level, window E and window F of the second level, and window G, window H and window I of the third level shown in fig. 9A. The batch operation instruction may be issued by the user using a pointing component such as a gesture or a remote control handle; for example, the user triggers a zoom-in instruction or a zoom-out instruction by spreading or closing five fingers in the visible area of the VR device, or by operating a key on the remote control handle (the batch operation instruction includes the zoom-in instruction or the zoom-out instruction).
Step S131, identifying the batch operation instruction. The batch operation instruction includes, but is not limited to, an enlargement instruction to enlarge windows in the visible area of the VR device or a reduction instruction to reduce them. The enlargement or reduction ratio may be determined by the degree to which the user's fingers unfold or close, or by keys on the remote control handle. For example, the VR device tracks the movement of the user's fingers in real time; when the fingers are fully closed, the selected target windows are scaled to a minimum, and when the fingers are closed to half of the original state, the selected target windows are scaled to half of their original size. The scale of the selected target windows may also be determined in other manners, and embodiments of the present application are not limited in this regard.

Step S132, determining a selection range. The selection range may be the planar area where the selected target windows are located. The selection of the target windows may be determined by a selection area established by the user with the hand or a remote control handle, or by clicking single windows. For example, a user may click one or more windows presented in the viewable area of the VR device using keys provided on the remote control handle, or by holding the remote control handle and performing a tap-like or click-like action; the clicked windows are the selected target windows. For example, as shown in fig. 7B, the user selects window A, window B and window D in the first level as the target windows selected in the first level (first selection area) by clicking or by establishing a selection area with the movement of a finger, selects window F of the second level as a selected target window (second selection area), and selects window I in the third level as a selected target window (third selection area).
Alternatively, the user selects the windows in the area swept by the hand as the target windows. The hand can move within the visible area of the VR device; when the movement track forms a closed figure, the windows covered by the closed figure are the target windows. When the movement track forms a curve or a straight line, the windows crossed by the curve or straight line may be selected as the target windows. For example, as shown in fig. 6A, the user creates a rectangular movement track with the hand, and the rectangle formed by the track covers window A, window B, window C and window D, which become the selected target windows.

It will be appreciated that the user may also move the remote control handle within the viewable area of the VR device to create a closed figure that selects target windows. The closed figure may be a regular rectangle 1002 or triangle 1003 or an irregular shape 1004 as shown in fig. 3B, or the like. A window entirely within the closed figure formed by the movement track is determined to be a target window; a window only partially within the closed figure may also be treated as selected (e.g., a window may be considered a target window when 50% of its area lies within the closed figure).
Step S133, executing the operation indicated by the batch operation instruction. After the selection range is determined, the target windows in the selection range are enlarged or reduced according to the enlargement or reduction operation identified in the batch operation instruction. The zoom-in and zoom-out operations are only examples of operation instructions; the batch operation instruction may also include instructions for other window controls, such as closing a window.
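Steps S130-S133 amount to a small receive-identify-select-execute loop. The following sketch ties the pieces together (the instruction format and operation names are illustrative assumptions; select_by_closed_figure is the sketch above):

    def handle_batch_instruction(instruction, windows):
        op = instruction["op"]                      # S131: identify the instruction
        targets = select_by_closed_figure(          # S132: determine the selection range
            windows, instruction["trajectory"])
        for win in targets:                         # S133: execute on all targets at once
            if op in ("zoom_in", "zoom_out"):
                win.w *= instruction["scale"]
                win.h *= instruction["scale"]
            elif op == "close":
                windows.remove(win)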
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at different levels or at the same level, the windows can be operated in batches, which improves the user's experience compared with operating a single window at a time.

Next, the procedure of applying the window control method provided by the embodiment of the present application to control the replacement of windows at different levels or at the same level is described below.
Referring to fig. 14A, fig. 14A is a schematic flow chart of a window control method for controlling replacement of windows of different levels or the same level according to an embodiment of the present application.
The method includes steps S140-S145.
Step S140, receiving a batch operation instruction. The batch operation instruction is used to perform a replacement operation on a plurality of windows in the visible area of the VR device. The plurality of windows may be windows at the same level or at different levels, such as window A, window B, window C and window D of the first level, window E and window F of the second level, and window G, window H and window I of the third level shown in fig. 9A. The batch operation instruction may be issued by the user using a pointing component such as a gesture or a remote control handle; for example, the user may trigger a replacement instruction by moving a finger in the visible area of the VR device, or by operating a key on the remote control handle (the batch operation instruction includes the replacement instruction).

Step S141, identifying the batch operation instruction. The batch operation instruction includes, but is not limited to, an instruction to replace at least two windows within the viewable area of the VR device with each other. For example, the VR device tracks the movement of the user's finger in real time; when the finger long-presses window B and moves it toward window A as shown in fig. 10B, until it reaches the display area where window A is located, the batch operation instruction at this time is a replacement instruction between target windows.

Step S142, determining a selection range. The selection range may be the planar area where the selected first target window and second target window, at the same or different levels, are located. The selection of the target windows may be determined by a selection area established by the user with the hand or a remote control handle, or by clicking single windows. For example, a user may click one or more windows presented in the viewable area of the VR device using keys provided on the remote control handle, or by holding the remote control handle and performing a tap-like or click-like action; the clicked windows are the selected target windows. For example, as shown in fig. 10A, the user selects window A in the first level as the first target window and window B in the first level as the second target window, each by clicking or by establishing a selection area with the movement of a finger. Or, as shown in fig. 11A, the user selects window A, window B and window C in the first level as the first target window, and window E in the second level as the second target window. Or, as shown in fig. 12A, the user selects window B and window C of the first level, window F of the second level and window J of the third level as the first target windows (denoted group1 in the figure), and selects window D of the first level, windows G and H of the second level and window K of the third level as the second target windows (denoted group2 in the figure).
Step S143, identifying the coordinates of the first target window. Referring to fig. 14B, window A and window B at the same level serve as the selected first target window and second target window. In the corresponding three-dimensional space, window A and window B both lie in the region formed by the positive x-axis and the positive z-axis. For the rectangular window A, its four vertices have four coordinates in three-dimensional space: A1(x1, 0, z1), A2(x1, 0, z2), A3(x2, 0, z1) and A4(x2, 0, z2). Of course, the first target window may also be located elsewhere in the three-dimensional space, and the number of first target windows is not limited to one; the embodiment of the present application is not limited in this regard.

Step S144, identifying the coordinates of the second target window. Referring to fig. 14B, for the rectangular window B in three-dimensional space, its four vertices have four coordinates: B1(x3, 0, z3), B2(x3, 0, z4), B3(x4, 0, z3) and B4(x4, 0, z4). Of course, the second target window may also be located elsewhere in the three-dimensional space, the number of second target windows is not limited to one, and the first and second target windows may also be windows at different levels; this is not limited herein.
Step S145, executing the operation indicated by the batch operation instruction. After the coordinates of the first target window and the second target window in the selection range are determined, the positions of the two windows are replaced according to the replacement operation identified in the batch operation instruction; that is, the first target window is moved to the coordinates where the second target window is located, and the second target window is moved to the coordinates where the first target window was located. When the coordinates of the first target window do not coincide with those of the second target window, the area of the first target window is adjusted so that the coordinates of its vertices coincide with the coordinates of the vertices of the second target window; likewise, the area of the second target window is adjusted so that the coordinates of its vertices coincide with the coordinates of the vertices of the first target window.
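With the windows represented by their four vertices, as in A1(x1, 0, z1)…B4(x4, 0, z4) above, the replacement of step S145 can be expressed as an exchange of vertex sets: after the move, each window's corners coincide with the other's former corners, which is precisely the area adjustment described. A sketch with illustrative coordinates:

    # Illustrative sketch of S143-S145; the concrete numbers stand in
    # for x1..x4 and z1..z4 and are not taken from the patent.
    window_a = {"name": "A", "verts": [(1, 0, 1), (1, 0, 2), (2, 0, 1), (2, 0, 2)]}
    window_b = {"name": "B", "verts": [(3, 0, 3), (3, 0, 4), (4, 0, 3), (4, 0, 4)]}

    def swap_by_vertices(a, b):
        # Exchanging the vertex sets both moves and resizes the windows,
        # forcing each window's corners onto the other's former corners.
        a["verts"], b["verts"] = b["verts"], a["verts"]

    swap_by_vertices(window_a, window_b)
    # window_a now occupies B's former region, and window_b occupies A's.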
Therefore, when the window control method provided by the embodiment of the application is applied to control a plurality of windows at different levels or at the same level, the windows can be operated in batches, which improves the user's experience compared with operating a single window at a time.
Referring to fig. 15, fig. 15 is a schematic structural diagram of a window control device according to an embodiment of the present application.
The window control device 2 shown in fig. 15 includes a receiving module 150, an acquisition module 151, a collision detection module 152, a region identification module 153, a grouping module 154, a display module 155, and an output module 156.
The receiving module 150 is configured to receive the batch operation instructions shown in figs. 13 and 14A, or to receive the position of the selected target window in the visible area 100 as detected by the collision detection module 152. The batch operation instruction may be a window retraction instruction or a window replacement instruction issued by the user through the movement of a hand or a remote control handle, or the like.

The acquisition module 151 is configured to capture gestures, actions, and the area pointed to by the pointing component, so as to determine the movement track of the pointing component.
The collision detection module 152 is configured to detect the portion of a window 1000 in the visible area 100 that is struck by a ray emitted from the remote control handle into the visible area 100 shown in fig. 1A or 1B. Alternatively, the user may point at a window 1000 within the visible area 100 shown in fig. 1A or 1B by means of a glove or a gesture. The window struck by the ray emitted by the remote control handle, or the window in the direction pointed to by the user's finger, is the target window. Where the ray emitted by the remote control handle forms a closed figure as shown in fig. 3B, the collision detection module 152 may also identify the windows covered by the closed figure and determine them as target windows.

The region identification module 153 is configured to identify the selection area established by the user with the hand or the remote control handle. For example, when the user creates an area as shown in fig. 3B through the movement of the hand, the region identification module 153 identifies the shape and size of the closed figure forming the selection area and the windows covered by it. When the user creates a straight line or a curve as shown in fig. 3C, the region identification module 153 identifies the length of the straight line or curve and the windows it passes through.

The grouping module 154 is configured to group windows so that the user can manage them in groups.

The display module 155 is configured to display the selected target window, for example by presenting the identifier 1001 shown in fig. 3A on the selected target window; or to display the position of the target window after a retraction or replacement is completed, such as the display positions in the visible area 100 after the windows shown in figs. 6B, 7C, 8C and 9C are reduced, or the display positions after the replacement of the target windows or target window groups as shown in figs. 10C, 11B and 12B.

The output module 156 is configured to perform the actions of the window batch operation interaction. These actions include, but are not limited to, moving windows, stitching windows, closing windows, and maximizing and minimizing windows.
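The module split of fig. 15 maps naturally onto a set of small cooperating components. The skeleton below is a hypothetical rendering of that structure (method names are assumptions, and select_by_closed_figure is the earlier sketch), not the patented code:

    class WindowControlDevice:
        # Hypothetical skeleton mirroring the modules of fig. 15.
        def __init__(self, windows):
            self.windows = windows          # windows in the visible area 100
            self.groups = {}                # grouping module (154) state

        def receive(self, instruction):
            # Receiving module (150): entry point for batch instructions.
            targets = self.identify_region(instruction["trajectory"])
            self.display_selected(targets)
            self.execute(instruction["op"], targets)

        def identify_region(self, trajectory):
            # Region identification module (153): windows covered by the
            # closed figure formed by the pointing component's track.
            return select_by_closed_figure(self.windows, trajectory)

        def display_selected(self, targets):
            # Display module (155): mark each target with identifier 1001.
            for win in targets:
                print(f"window {win.name}: selected (identifier 1001)")

        def execute(self, op, targets):
            # Output module (156): perform the batch interaction, e.g.
            # moving, closing, maximizing or minimizing windows.
            if op == "close":
                for win in targets:
                    self.windows.remove(win)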
In some embodiments of the present application, an electronic device is further provided, and an electronic device in the embodiment of the present application is described below with reference to fig. 16. Fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
For at least one embodiment, the controller hub 804 communicates with the processor 801 via a multi-drop bus such as a front side bus (FSB), a point-to-point interface such as QuickPath Interconnect (QPI), or a similar connection. The processor 801 executes instructions that control data processing operations of a general type. In one embodiment, the controller hub 804 includes, but is not limited to, a graphics memory controller hub (GMCH) (not shown) and an input/output hub (IOH) (which may be on separate chips) (not shown), where the GMCH includes the memory and graphics controllers and is coupled to the IOH.

The electronic device 800 may also include a coprocessor 806 coupled to the controller hub 804, and a memory 802. Alternatively, one or both of the memory 802 and the GMCH may be integrated within the processor 801 (as described in the present application), with the memory 802 and the coprocessor 806 coupled directly to the processor 801, and the controller hub 804 combined with the IOH in a single chip.
In one embodiment, memory 802 may be, for example, dynamic Random Access Memory (DRAM), phase Change Memory (PCM), or a combination of both. Memory 802 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions therein. The computer-readable storage medium has stored therein instructions, and in particular, temporary and permanent copies of the instructions.
In one embodiment, coprocessor 806 is a special-purpose processor, such as, for example, a high-throughput MIC processor, a network or communication processor, compression engine, graphics processor, GPU, embedded processor, or the like. Optional properties of coprocessor 806 are shown in dashed lines in fig. 16.
In one embodiment, electronic device 800 may further comprise a Network Interface (NIC) 803. The network interface 803 may include a transceiver to provide a radio interface for the device 800 to communicate with any other suitable device (e.g., front end module, antenna, etc.). In various embodiments, the network interface 803 may be integrated with other components of the electronic device 800. The network interface 803 can realize the functions of the communication unit in the above-described embodiment.
In one embodiment, as shown in FIG. 16, the electronic device 800 may further include an input/output (I/O) device 805. Input/output (I/O) devices 805 may include a user interface designed to enable a user to interact with electronic device 800, a peripheral component interface designed to enable a peripheral component to also interact with electronic device 800, and/or a sensor designed to determine environmental conditions and/or location information associated with electronic device 800.
It is noted that fig. 16 is merely exemplary. That is, although the electronic device 800 is shown in fig. 16 as including a plurality of components such as the processor 801, the controller hub 804 and the memory 802, in practical applications a device using the methods of the present application may include only some of these components, for example only the processor 801 and the NIC 803. The properties of optional components are shown in dashed lines in fig. 16.

In some embodiments of the present application, the computer-readable storage medium of the electronic device 800 may store instructions that, when executed by at least one unit in the processor, cause the device to implement the window control method mentioned in the above embodiments. The instructions, when executed on a computer, cause the computer to perform the window control method mentioned in the above embodiments.
Referring now to fig. 17, fig. 17 is a schematic diagram of an SoC (System on Chip) 1000 according to an embodiment of the present application. In fig. 17, similar parts have the same reference numerals, and the dashed boxes are optional features of more advanced SoCs. The SoC may be used in an electronic device according to an embodiment of the present application and may implement the corresponding functions according to the instructions stored therein.
In fig. 17, soC 1000 includes an interconnect unit 1002 coupled to processor 1001, a system agent unit 1006, a bus controller unit 1005, an integrated memory controller unit 1003, a set of one or more coprocessors 1007 that may include integrated graphics logic, an image processor, an audio processor, and a video processor, a Static Random Access Memory (SRAM) unit 1008, and a Direct Memory Access (DMA) unit 1004. In one embodiment, coprocessor 1007 includes a special-purpose processor, such as, for example, a network or communication processor, compression engine, GPU, high-throughput MIC processor, embedded processor, or the like.
One or more computer-readable media for storing data and/or instructions may be included in a Static Random Access Memory (SRAM) unit 1008. The computer-readable storage medium may have stored therein instructions, and in particular, temporary and permanent copies of the instructions.
When the SoC 1000 is applied to an electronic device according to the present application, the instructions stored in the computer-readable storage medium, when executed by at least one unit in the processor, cause the electronic device to implement the window control method mentioned in the above embodiments. The instructions, when executed on a computer, cause the computer to perform the window control method mentioned in the above embodiments.
In addition, the embodiment of the application also discloses a computer readable storage medium, wherein a processing program is stored on the computer readable storage medium, and the processing program realizes the window control method mentioned in the above embodiment when being executed by a processor.
The computer readable storage medium may be a read-only memory, a random access memory, a hard disk, an optical disk, or the like.

Claims (20)

Translated from Chinese
1. A window control method, applied to an electronic device based on virtual reality or augmented reality technology, the electronic device being associated with a pointing component, characterized in that the window control method is used to control a plurality of window layers displayed in a visible area of the electronic device, the window layers are arranged in sequence along a direction away from a user side of the electronic device, and each window layer comprises a plurality of windows, wherein the window control method comprises: detecting movement of the pointing component; determining a selection area based on the movement track of the pointing component; determining at least one target window from the plurality of window layers according to the selection area; and executing the operation indicated by the pointing component on each of the at least one target window simultaneously; wherein the determining at least one target window from the plurality of window layers according to the selection area comprises: determining, based on the selection area, a plurality of target windows in the plurality of window layers, wherein a target window located at a first level is mapped onto other levels, and its overlapping area with target windows located at the other levels is greater than or equal to a first threshold.
2. The window control method according to claim 1, characterized in that the movement of the pointing component is a click on a window, and the selection area is determined based on the click position of the pointing component.
3. The window control method according to claim 1, characterized in that the pointing component moves along a predetermined track, and the selection area is determined based on the track formed as the pointing component moves.
4. The window control method according to claim 3, characterized in that the track formed as the pointing component moves forms a closed figure, the electronic device determines the selection area based on the region covered by the closed figure, and the windows within the region of the closed figure serving as the selection area are the selected target windows.
5. The window control method according to any one of claims 1-4, characterized in that the retracting operation indicated by the pointing component comprises reducing, enlarging, and moving the position of each of the target windows.
6. The window control method according to any one of claims 1-4, characterized in that the determining at least one target window from the plurality of window layers according to the selection area comprises: determining, based on the selection area, at least one target window in a same window layer.
7. The window control method according to any one of claims 1-4, characterized in that at least one window in each window layer is selected.
8. The window control method according to claim 6, characterized in that the range of the closed figure determined by the selection area covers a plurality of windows in a plurality of window layers, and the plurality of windows are determined as the target windows.
9. The window control method according to claim 6, characterized in that the window selected by the selection area on one of the plurality of window layers, and at least one window determined by mapping the selection area onto other window layers, serve as the target windows.
10. The window control method according to claim 1, characterized in that, when the at least one target window comprises at least two windows, executing the operation indicated by the pointing component on the plurality of target windows comprises: swapping the positions of the at least two windows with each other.
11. The window control method according to claim 10, characterized in that the at least two windows are divided into two window groups, and executing the operation indicated by the pointing component on the plurality of target windows comprises: swapping the positions of the two window groups with each other.
12. The window control method according to claim 10 or 11, characterized in that the area of a first target window of the at least two windows is adjusted to fit the display region at the position of a second target window, or the area of the second target window of the at least two windows is adjusted to fit the display region at the position of the first target window.
13. An electronic device, characterized in that the electronic device is associated with a pointing component and is an electronic device based on virtual reality or augmented reality technology, the electronic device comprising: a memory for storing window control instructions; and a processor which, when executing the window control instructions, implements the following steps: detecting movement of the pointing component; determining a selection area based on the movement track of the pointing component; determining at least one target window from a plurality of window layers according to the selection area; and executing the operation indicated by the pointing component on the at least one target window simultaneously; wherein the determining at least one target window from the plurality of window layers according to the selection area comprises: determining, based on the selection area, a plurality of target windows in the plurality of window layers, wherein a target window located at a first level is mapped onto other levels, and its overlapping area with target windows located at the other levels is greater than or equal to a first threshold.
14. The electronic device according to claim 13, characterized in that the processor, when executing the window control instructions, is further configured to determine the selection area based on the click position of the pointing component.
15. The electronic device according to claim 13, characterized in that the processor, when executing the window control instructions, is further configured to determine the selection area based on the track formed as the pointing component moves.
16. The electronic device according to any one of claims 13-15, characterized in that the processor, when executing the window control instructions, is further configured to perform the retracting operation indicated by the pointing component, including reducing, enlarging, and moving the position of each of the target windows.
17. The electronic device according to any one of claims 13-15, characterized in that at least one window in each window layer is selected.
18. The electronic device according to claim 17, characterized in that the memory is further configured to store the selection area, so that the processor can recall the selection area.
19. The electronic device according to any one of claims 13-15, characterized in that the processor, when executing the window control instructions, is further configured to swap the positions of at least two windows with each other or to swap the positions of two window groups with each other.
20. A computer-readable storage medium, characterized in that the computer-readable storage medium stores window control instructions which, when executed by a processor, implement the window control method according to any one of claims 1-12.
CN202011218043.9A | Priority date 2020-11-04 | Filing date 2020-11-04 | A window control method, electronic device and computer readable storage medium | Status: Active | Granted as CN114529691B (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202011218043.9A | 2020-11-04 | 2020-11-04 | A window control method, electronic device and computer readable storage medium

Publications (2)

Publication Number | Publication Date
CN114529691A (en) | 2022-05-24
CN114529691B (en) | 2025-06-06

Family

ID=81618878

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202011218043.9A (Active; granted as CN114529691B) | A window control method, electronic device and computer readable storage medium | 2020-11-04 | 2020-11-04

Country Status (1)

Country | Link
CN (1) | CN114529691B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN115033333B (en) * | 2022-07-19 | 2022-12-16 | 荣耀终端有限公司 | Suspended window display method, electronic equipment and storage medium
CN115421626B (en) * | 2022-11-02 | 2023-02-24 | 海看网络科技(山东)股份有限公司 | AR virtual window interaction method based on mobile terminal
CN116301482B (en) * | 2023-05-23 | 2023-09-19 | 杭州灵伴科技有限公司 | Window display method of 3D space and head-mounted display device
CN119718060A (en) | 2023-09-27 | 2025-03-28 | 珠海莫界科技有限公司 | Control method, device, computer equipment and storage medium of extended screen assembly


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN102890603B (en) * | 2011-07-20 | 2015-05-27 | 深圳市快播科技有限公司 | Video image processing method and video image processing device
US9954927B2 (en) * | 2015-01-26 | 2018-04-24 | Hong Kong Applied Science and Technology Research Institute Company Limited | Method for managing multiple windows on a screen for multiple users, and device and system using the same
CN108829314B (en) * | 2018-05-24 | 2021-01-19 | 广州视源电子科技股份有限公司 | Screenshot selecting interface selection method, device, equipment and storage medium
CN108920238A (en) | 2018-06-29 | 2018-11-30 | 上海连尚网络科技有限公司 | Method for operating an application, electronic device and computer-readable medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN110692031A (en) * | 2017-06-01 | 2020-01-14 | 三星电子株式会社 | System and method for window control in a virtual reality environment

Also Published As

Publication number | Publication date
CN114529691A (en) | 2022-05-24

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
