
A computer software module arrangement, an xr device and a method for providing an improved extended reality interface

Info

Publication number
WO2025045363A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
handedness
movement
gui
control area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2023/073843
Other languages
French (fr)
Inventor
Peter ÖKVIST
Andreas Kristensson
Tommy Arngren
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Telefonaktiebolaget LM Ericsson AB
Original Assignee
Telefonaktiebolaget LM Ericsson AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget LM Ericsson AB
Priority to PCT/EP2023/073843
Publication of WO2025045363A1
Legal status: Pending


Abstract

An extended reality, XR, device (100) comprising a controller (110), wherein said controller (110) comprises: a graphics processing circuit (111) configured to provide one or more graphic control objects (210), which one or more graphic control objects (210) are to be displayed in a control area (220) as part of a graphical user interface, GUI, a hand-movement-detection circuit (112) configured to detect a movement of a hand H of a user, a handedness-detection circuit (113) configured to detect a handedness based on the movement of the hand H, whereby the handedness indicates a main hand, and a GUI adaptation circuit (114) configured to determine a position for the control area (220) based on the detected handedness, wherein the handedness-detection circuit (113) is configured to detect the handedness based on the movement of the hand H by determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determine the handedness as being for the hand movements within the first accuracy range, and wherein the position for the control area (220) is a first position when the handedness indicates a first main hand and wherein the position for the control area (220) is a second position when the handedness indicates a second main hand.

Description

A COMPUTER SOFTWARE MODULE ARRANGEMENT, AN XR DEVICE AND A METHOD FOR PROVIDING AN
IMPROVED EXTENDED REALITY INTERFACE
TECHNICAL FIELD
The present invention relates to an arrangement, an arrangement comprising computer software modules, an extended reality, XR, device and a method for providing an improved extended reality interface, and in particular to an arrangement comprising computer software modules, an arrangement comprising circuits, a device and a method for providing an improved extended reality interface which adapts to a handedness of a user.
BACKGROUND
Contemporary devices, such as smartphones, tablet computers and extended reality devices (for example goggles), are designed for right-handed users. This often makes interaction with the device and its functionality cumbersome for a left-handed user when, for example, menus fold out in the right-hand direction and areas of interaction thus become occluded by any hand coming from the left-hand side. Similarly, hand-arm occlusion aspects also occur in XR environments and are observed for example by the Oculus™ user community.
There is thus a need for a user interface in XR devices that does not occlude the content for some of the users.
SUMMARY
For the purposes of this text, Extended Reality (XR) is a term referring to all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables. For example, XR includes representative forms such as augmented reality (AR), mixed reality (MR) and virtual reality (VR) and the areas interpolated among them. The levels of virtuality range from partial sensory inputs to immersive virtuality, also called VR.
XR is a superset that includes the entire spectrum from "the complete real" to "the complete virtual" in the concept of reality-virtuality continuum introduced by Paul Milgram. Still, its connotation lies in the extension of human experiences especially relating to the senses of existence (represented by VR) and the acquisition of cognition (represented by AR). With the continuous development in human-computer interactions, this connotation is still evolving.
Furthermore, in human biology, handedness is an individual's preferential use of one hand, known as the dominant hand, due to it being stronger, faster or better in dexterity. The other hand, comparatively often the weaker, less dexterous or simply less subjectively preferred, is called the nondominant hand.
Returning to the present invention, an object of the teachings herein is to overcome or at least reduce or mitigate the problems discussed in the background section.
According to one aspect there is provided an extended reality, XR, device comprising a controller, wherein said controller comprises: a graphics processing circuit configured to provide one or more graphic control objects, which one or more graphic control objects are to be displayed in a control area as part of a graphical user interface, GUI, a hand-movement-detection circuit configured to detect a movement of a hand (H) of a user, a handedness-detection circuit configured to detect a handedness based on the movement of the hand (H) detected by the hand-movement-detection circuit, whereby the handedness indicates a main hand, and a GUI adaptation circuit configured to determine a position for the control area based on the detected handedness, wherein the handedness-detection circuit is configured to detect the handedness based on the movement of the hand (H) detected by the hand-movement-detection circuit by determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determine the handedness as being for the hand movements within the first accuracy range, and wherein the GUI adaptation circuit is further configured to determine the position for the control area to be a first position when the handedness indicates a first main hand and to be a second position when the handedness indicates a second main hand. Alternatively or additionally, in some embodiments the position for the control area is a first position when the movement of the hand corresponds to movements indicating a first main hand and wherein the position for the control area is a second position when the movement of the hand corresponds to movements indicating a second main hand.
According to another aspect there is provided a method for use in an extended reality, XR, device, wherein said method comprises: providing one or more graphic control objects, which one or more graphic control objects are to be displayed in a control area as part of a graphical user interface, GUI, in an extended reality, detecting a movement of a hand (H) of a user, detecting a handedness based on the movement of the hand (H), whereby the handedness indicates a main hand, and determining a position for the control area based on the detected handedness, wherein the detection of the handedness is based on the movement of the hand (H) in that the method further comprises determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determining the handedness as being for the hand movements within the first accuracy range, and wherein the position for the control area is a first position when the handedness indicates a first main hand and wherein the position for the control area is a second position when the handedness indicates a second main hand. Alternatively or additionally, in some embodiments the position for the control area is a first position when the movement of the hand corresponds to movements indicating a first main hand and wherein the position for the control area is a second position when the movement of the hand corresponds to movements indicating a second main hand.
According to another aspect there is provided a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an XR device enable the XR device to perform the method according to the teachings herein.
According to another aspect there is provided an extended reality, XR, viewing device comprising a software code module arrangement, wherein the software code module arrangement comprises a software code module for providing one or more graphic control objects, which one or more graphic control objects are to be displayed in a control area as part of a graphical user interface, GUI, in an extended reality, a software code module for detecting a movement of a hand (H) of a user and for detecting a handedness based on the movement of the hand (H), whereby the handedness indicates a main hand, and a software code module for adapting the GUI by determining a position for the control area based on the detected handedness, wherein the detection of the handedness is based on the movement of the hand (H) in that the software code module further comprises software code for determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determining the handedness as being for the hand movements within the first accuracy range, and wherein the position for the control area is a first position when the handedness indicates a first main hand and wherein the position for the control area is a second position when the handedness indicates a second main hand. Alternatively or additionally, in some embodiments the position for the control area is a first position when the movement of the hand corresponds to movements indicating a first main hand and wherein the position for the control area is a second position when the movement of the hand corresponds to movements indicating a second main hand.
For the context of the teachings herein a software code module may be replaced or supplemented by a software module.
Further embodiments and advantages of the present invention will be given in the detailed description. It should be noted that the teachings herein find use in smartphones, smartwatches, tablet computers, media devices, and even in vehicular displays.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention will be described in the following, reference being made to the appended drawings which illustrate non-limiting examples of how the inventive concept can be reduced to practice.
Figure 1A shows a schematic view of an XR device according to some embodiments of the present invention.
Figure 1B shows a schematic view of an XR device according to some embodiments of the present invention.
Figure 1C shows a schematic view of an XR device according to some embodiments of the present invention.
Figures 2A to 2H each shows a schematic view of an XR device system having a user interface according to some embodiments of the teachings herein.
Figures 3A and 3B each shows a schematic view of an XR device system having a user interface according to some embodiments of the teachings herein.
Figures 4A to 4C each shows a schematic view of an XR device system having a user interface according to some embodiments of the teachings herein.
Figure 5 shows a flowchart of a general method according to some embodiments of the present invention.
Figure 6 shows a schematic view of a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an arrangement enables the arrangement to implement some embodiments of the present invention, and
Figure 7 shows a component view for a software component arrangement according to some embodiments of the teachings herein.
DETAILED DESCRIPTION
Figure 1A shows a schematic view of an XR device 100 according to some embodiments of the present invention. The XR device comprises a controller 110, a memory 120, a sensor 130 and an image presenting device 150. The sensor 130 and/or the image presenting device 150 are optional in that one or both of them may instead be external units connected to the XR device 100. In some embodiments, the XR device 100 may also comprise an XR device interface 140. The XR device interface 140 is optional.
The controller 110 is configured to control the overall operation of the XR device 100. In some embodiments, the controller 110 may be a general-purpose controller, wherein general purpose refers to hardware (and/or software) that performs a variety of tasks. As a skilled person would understand, there are many alternatives for how to implement a controller, such as using Field-Programmable Gate Array (FPGA) circuits, ASICs, GPUs, etc., in addition to or as an alternative. For the purpose of this application, all such possibilities and alternatives will be referred to simply as the controller 110.
As will be discussed in further detail below, the controller 110 comprises one or more circuits 111 to 114, each adapted (individually and/or in combination) to perform specific functions which will be discussed in further detail below.
It should also be noted that in some embodiments, parts of or all of the processing of the controller may be performed remotely, where a local controller 110 is configured to provide input data to a remote processing unit, such as to a cloud server, causing the remote processing unit to perform the processing and receive the results of such processing as output from the remote processing unit. For the purpose of this application, such possibilities and alternatives will also be referred to simply as the controller 110. The controller 110 may thus represent both the local controller and the remote processing unit. In such embodiments, one, some or all of the circuits 111-114 may be remote.
The memory 120 is configured to store graphics data, User Interface (Ul) settings and computer-readable instructions that when loaded into the controller 110 indicate how the XR device 100 is to be controlled. The memory 120 may comprise several memory units or devices, but they will be perceived as being part of the same overall memory 120. There may be one memory unit for the image presenting device storing graphics data, one memory unit for the sensor for storing settings, one memory for the communications interface (if such is present) for storing settings, and so on. As a skilled person would understand, there are many possibilities of how to select where data should be stored. In one embodiment, the XR device may comprise a general memory 120 in turn comprising any and all such memory units for the purpose of this application. As a skilled person would understand there are many alternatives of how to implement a memory, for example using non-volatile memory circuits, such as EEPROM memory circuits, or using volatile memory circuits, such as RAM memory circuits. For the purpose of this application all such alternatives will be referred to simply as the memory 120.
The image presenting device 150 may in some embodiments be a display arrangement comprising one or more displays arranged to present visual data, predominantly through images. In some such embodiments, the image presenting device 150 may be a touch screen thereby enabling user input to be provided to and received by the XR device 100. The visual data is related to the user interface of the XR device being presented by the XR viewing device 100. The XR device is thereby arranged to present image data through a (graphical) user interface in a manner controlled by the controller 110.
The sensor 130 may in some embodiments be an image sensor, such as a camera or image sensor module, arranged to provide an image (or stream of images) of the user environment when the user is utilizing the XR device 100, and specifically of the hand and/or arm of the user, wherein images of the user may be analyzed using image processing techniques known in the art in order to determine movement patterns of the user.
In some embodiments, as illustrated in figure 1A, the sensor 130 is comprised within the XR device 100. In an alternative embodiment the sensor 130 is arranged remotely in relation to the XR device 100, by being connected to the XR device 100. In some embodiments, as will be discussed in further detail below, the sensor 130 may be a motion sensor (referenced 410) connected to the XR device 100, where the motion sensor is arranged to detect a motion or movement of the user, or more precisely a motion or movement of the user's arm and/or hand.
In some embodiments, the XR device may comprise a first sensor 130, for example an image sensor, and a second sensor, for example a motion sensor (referenced 410).
As a skilled person would understand, the XR device 100 may comprise one controller 110 and the sensor 130 (410) may comprise another controller, but for the purpose of the teachings herein, they will be considered to be the same controller 110 in order to cover all possible variations of exactly where the determination of movement or motion takes place. In some embodiments, the XR device 100 may comprise an image presenting device, such as a display or screen 150 arranged to display visual content (not shown in figures 1A, 1B or 1C but referenced 210 in later figures).
It should be noted that the XR device 100 may be distributed across several devices and apparatuses or may comprise a single device.
It should be noted that the teachings herein find use in XR devices 100 in many areas of XR, such as for example smartphones, tablet computers, smart watches, media devices (such as smart TVs) or dedicated XR devices. In some embodiments, the XR device 100 may be a smartphone or tablet (as in figure 1B). In some embodiments, the XR device 100 may be a pair of XR goggles 100 (as in figure 1C).
Figure 1B shows a schematic view of an XR device 100 being a smart device 100 according to some embodiments of the present invention. In this embodiment, the XR device 100 may be a smartphone or a tablet computer. In some such embodiments, the XR device 100 may comprise one or more input elements. The input elements may be physical, such as keys or buttons 150A. Alternatively or additionally, the input elements may be virtual, such as visual icons 105 that are displayed and selectable on the touchscreen 150.
Figure 1C shows a schematic view of an XR device 100 being an optical-see-through device, such as a pair of XR goggles 100, according to some embodiments of the present invention. In some embodiments, the XR device may comprise more than one image sensor, and in the embodiments according to figure 1C (and, as the skilled person would understand, this is also possible for the embodiments of figure 1B) the XR device 100 may be arranged with a front-facing image sensor 130 as well as a rearward-facing image sensor 130, arranged to cover a forward field-of-view (FFOV) and a rear field-of-view (RFOV) respectively.
In some embodiments the image sensor covering the RFOV may be arranged to detect an eye E of the user and to detect a gaze direction or in short a gaze G of the user, i.e. a direction in which the user is looking.
In the embodiments of figure 1C, the image presenting device is a see-through image device, such as a see-through display.
The XR device 100 as exemplified in any of figures 1A, 1B or 1C may be arranged with a communication interface 140. The communication interface 140 is arranged to enable communication with other devices, such as other smart devices 100 or a server (not shown) for receiving content, instructions and/or settings or other data. The communication interface 140 may be wired and/or wireless. The communication interface may also comprise several interfaces.
In some embodiments, the communication interface 140 may comprise a USB (Universal Serial Bus) interface. In some embodiments the communication interface 140 may comprise an HDMI (High-Definition Multimedia Interface) interface. In other embodiments, the communication interface 140 may comprise a DisplayPort interface. In some embodiments the communication interface 140 may comprise an Ethernet interface. In yet other embodiments, the communication interface 140 may comprise a MIPI (Mobile Industry Processor Interface) interface. In other embodiments, the communication interface may comprise an analog interface, a CAN (Controller Area Network) bus interface, an I2C (Inter-Integrated Circuit) interface, or other interfaces.
In some embodiments, the communication interface 140 may comprise a radio frequency (RF) communications interface. In some such embodiments the communication interface 140 may comprise a Bluetooth™ interface, a WiFi™ interface, a ZigBee™ interface, an RFID (Radio Frequency IDentification) interface, a Wireless Display (WiDi) interface, a Miracast interface, and/or another RF interface commonly used for short range RF communication. In alternative or supplemental such embodiments, the communication interface 140 may comprise a cellular communications interface such as a fifth generation (5G) cellular communication interface, an LTE (Long Term Evolution) interface, a GSM (Global System for Mobile communications) interface and/or another interface commonly used for cellular communication. In some embodiments, the communication interface 140 may be configured to communicate using the UPnP (Universal Plug and Play) protocol. In other embodiments the communication interface 140 may be configured to communicate using the DLNA (Digital Living Network Alliance) protocol.
In some embodiments, the communication interface 140 may be configured to enable communication through more than one of the example technologies given above. As an example, a wired interface, such as MIPI may be used for establishing an interface between the display arrangement, the controller and the user interface, and a wireless interface, for example WiFi™ could be used to enable communication between the XR device 100 and an external host device (not shown).
The communications interface 140 may be configured to enable the XR device 100 to communicate with other devices, such as other smartphones, Internet tablets, computer tablets or other computers, media devices, such as television sets, gaming consoles, video viewers or projectors (not shown), or eyewear detectors for receiving data. In the following, simultaneous reference will be made to the XR devices 100 of figures 1A, 1B and 1C.
As mentioned in the background section, there is a need for a user interface in XR devices that does not occlude the content for some of the users. The inventors have therefore realized a method that uses user-pattern information characteristics (e.g. via an Artificial Intelligence (AI) or Machine Learning (ML) model) to deduce a user's main hand, i.e. the handedness, and adjusts (for example mirrors) the user interface of the XR environment to be optimized or at least improved for the user's handedness.
The inventors have realized that detecting handedness is not as simple as only detecting what hand is used. The reason for this is that a device may have certain applications associated with respective hands' usage, for example, holding the device with the left hand may be associated with private mode (private apps, etc.) and holding it with the right hand may be associated with work mode (work apps, etc.).
Simply detecting which hand is used may thus not be sufficient to detect handedness.
Instead, the inventors are proposing an XR device and a method that utilizes information from user-behavior characteristics, e.g. via an AI/ML model, gait recognition, etc., as input to determine a user's main hand in order to adjust (e.g. "flip") the user interface, UI, of the XR environment to be main-hand optimized. A device may use information about the user's main hand in order to, for example, select a suitable UI taking the main hand into account as well as selecting between left-/right-hand associated apps/system parts to display for a user in response to said hand in use. This provides a user of an XR device with a better experience and interaction with a device UI.
Figure 2A shows a schematic view of an XR device 100, and more particularly the content displayed on the image presenting device 150 of the XR device 100.
The XR device 100 is arranged to display various graphic control objects 210. The area where the graphic control objects 210 are located may be referred to as a control area 220. In figure 2A the control area 220 is arranged on the right-hand side and is thus referenced 220-R. Similarly, when the control area 220 is arranged on the left side, it will be referenced 220-L.
The XR device 100 further comprises a graphics processing circuit 111 configured to provide one or more graphic control objects, displayed as 210A, 210B, 210C and 210D in figures 2A and 2B and as 210A-210F in figures 2C and 2D. These graphic control objects are to be displayed in a control area 220 (such as the control area arranged on the right; 220-R) as part of a graphical user interface, GUI. In the example of figure 2A two hands are shown, a left hand H-L and a right hand H-R. Also, in the example of figure 2A the control area 220 is displayed or provided adjacent one hand (in this example the right hand H-R) at a distance d from the hand. The control area 220 is thus a control area 220-R adapted for right-hand use. In some embodiments the distance d from the hand is the distance from the edge (closest to the control area) of the hand to the edge of the control area. In some embodiments the distance d from the hand is the distance from a center of the hand (such as from the index finger as in figure 2A) to the edge of the control area.
As mentioned above, the XR device is arranged to detect the movement of a user's hand(s) and/or arm(s), detect a handedness of the user, and adapt the user interface (in this example, the placement of the control objects 210) accordingly.
In order to detect the movement of a user's hand(s), the XR device further comprises a hand-movement-detection circuit 112 configured to detect a movement of a hand H of a user. In some embodiments the XR device handedness-detection circuit 113 may be configured to detect handedness based on the movement of the hand H, whereby the handedness indicates the user's main hand, i.e. the hand that the user uses most frequently in interacting with the user interface shown in the XR device. In some embodiments the hand-movement-detection circuit 112 and the handedness-detection circuit 113 are separate circuits and in some embodiments they are a combined circuit.
It should be noted that the hand-movement-detection circuit 112 in some embodiments, may also be configured to determine arm movements as part of the detected hand movements, whereby the handedness-detection circuit 113 in some such embodiments may be configured to determine the handedness alternatively or additionally using such arm movements.
The XR device further comprises a graphic user interface, GUI, adaptation circuit 114 configured to determine a position for the control area 220 based on the detected handedness.
As mentioned above, the inventors have realized that it is simply not enough to detect that a left or right hand is being used, and the inventors are therefore providing an XR device, wherein the handedness-detection circuit 113 is configured to detect the handedness based on the movement of the hand H by determining that the movement of the hand results in operations which are within a first accuracy range associated with a main user hand and in response thereto determine the handedness as being for the hand movements within the first accuracy range.
This allows for detecting the handedness not only based on which hand is being used, but on how well the hand is being used. For example, a user operating the GUI with his/her main hand will do so at a higher accuracy than when operating with the other hand. For example, a user operating with the main hand will hit control objects more precisely, make fewer changes or cancellations of commands given, and may also or additionally have faster movements, spend less time before selecting, and exhibit shorter gaze times before acting when using the main hand.
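As a minimal illustration of how such an accuracy-range check might look in practice, consider the following Python sketch. The specific metrics (hit offset, cancellation ratio, selection time) follow the examples just given, but the names, threshold values and overall structure are illustrative assumptions and not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class OperationStats:
    """Aggregated accuracy metrics for GUI operations performed with one hand."""
    hit_offset_px: float       # mean distance from target centre on selection
    cancel_ratio: float        # fraction of commands changed or cancelled
    mean_select_time_s: float  # mean time spent before selecting

# Illustrative "first accuracy range" associated with a main hand (assumed values).
MAIN_HAND_RANGE = {
    "hit_offset_px": (0.0, 12.0),
    "cancel_ratio": (0.0, 0.05),
    "mean_select_time_s": (0.0, 0.8),
}

def within_main_hand_range(stats: OperationStats) -> bool:
    """True if the hand's operations fall within the first accuracy range."""
    values = {
        "hit_offset_px": stats.hit_offset_px,
        "cancel_ratio": stats.cancel_ratio,
        "mean_select_time_s": stats.mean_select_time_s,
    }
    return all(lo <= values[key] <= hi for key, (lo, hi) in MAIN_HAND_RANGE.items())

def detect_handedness(left: OperationStats, right: OperationStats) -> str | None:
    """Determine the handedness as being for the hand whose movements
    result in operations within the first accuracy range."""
    left_ok = within_main_hand_range(left)
    right_ok = within_main_hand_range(right)
    if right_ok and not left_ok:
        return "right"
    if left_ok and not right_ok:
        return "left"
    return None  # inconclusive; keep the current or default handedness
```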
Optionally, the XR device may also be enabled to detect that the hand being used is not the main hand, wherein the handedness-detection circuit 113 may be configured to detect the handedness based on the movement of the hand (H) by determining that the movement of the hand results in operations which are within a second accuracy range associated with the other hand (not the main hand) and in response thereto determine the handedness as being for the hand other than the one performing the movements within the second accuracy range.
As mentioned, in some embodiments the hand-detection-circuit may also be enabled to detect movements of a limb, for example an arm, and the handedness detection circuit 113 may then also be enabled to determine the handedness based on the movements of the limb.
In some embodiments, the movement of the hand may therefore be seen to include movement of an arm carrying the hand wherein the first accuracy range may include a first arm movement accuracy range.
This allows for detecting the handedness also based on the arm movement, which may also be faster or more precise when using the main hand compared to when using the other hand.
Furthermore, the inventors have realized that the biggest or most significant difference in accuracy may not be in the hand, but in how the finger(s) move. Finger movement is thus one example of the subtle actions or motor skills of the user. The hand-movement-detection circuit 112 may thus be further configured to detect movement of one or more fingers, and the handedness-detection circuit 113 may then also be enabled to determine the handedness based on the movements of the finger(s). In some such embodiments, hand movement may include finger movement wherein the first accuracy range may include a first finger movement accuracy range.
The accuracy range(s) may in some embodiments be determined over time based on detected hand movements. In some embodiments, the accuracy range(s) may be determined based on stored default settings.
The inventors have further realized that in some situations a user may not be able to use the main hand, for example due to injury, and that in such situations determining whether a hand movement is in a first (or second) accuracy range may be misguided as it may lead to incorrect or rather unwanted adaptations.
The XR device 100 may therefore, in some embodiments be adapted to alternatively or additionally determine the preferred hand and thereby the preferred handedness (even if different from the main and actual handedness) based on a number of usages or rather which hand is used the most. In some embodiments, the handedness-detection circuit 113 may be configured to detect the handedness based on the movement of the hand H by determining that the movement of the hand indicates a preferred hand and in response thereto determine the handedness as being for the preferred hand, wherein the preferred hand is the hand used the most.
In some alternative or additional embodiments, the preferred hand is indicated based on a number of movements with the hand exceeding a range value.
In some alternative or additional embodiments, the range value is the number of movements with the other hand. This will set the hand used the most as the preferred hand.
In some alternative or additional embodiments, the range value is the number of movements with the other hand, multiplied by a factor of, for example, 1.25, 1.5 or 2.
In some alternative or additional embodiments, the preferred hand is the hand first moving.
In some alternative or additional embodiments, the preferred hand is the hand used to pick up controls.
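A minimal sketch of the usage-count heuristics listed above, assuming simple movement counters per hand; the function name and its default factor are illustrative:

```python
def preferred_hand(left_moves: int, right_moves: int,
                   factor: float = 1.5) -> str | None:
    """Indicate the preferred hand when the number of movements with one
    hand exceeds the number with the other hand multiplied by a factor
    (for example 1.25, 1.5 or 2, as above); None means no clear preference."""
    if left_moves > right_moves * factor:
        return "left"
    if right_moves > left_moves * factor:
        return "right"
    return None
```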
As the handedness has been determined, the GUI adaptation circuit 114 determines the position for the control area 220 based on the detected handedness, wherein the position for the control area 220 is a first position when the handedness indicates a first main hand and wherein the position for the control area 220 is a second position when the handedness indicates a second main hand. Alternatively or additionally, the position for the control area 220 is a first position when the movement of the hand corresponds to movements indicating a first main hand and wherein the position for the control area 220 is a second position when the movement of the hand corresponds to movements indicating a second main hand.
The XR device 100, and specifically the handedness-detection circuit 113, is further configured to determine that the detected handedness is different from a current handedness (which may be a default handedness as per device settings stored in the memory 120), and the GUI adaptation circuit 114 will only move the control area 220 from the first position to the second position if the handedness is different. As mentioned before, handedness may be stored as a default handedness in the memory 120 of the XR device 100, and in some such embodiments the hand-detection circuit 112 may be further configured to store the detected handedness as a default handedness for a user account profile. The default handedness may thus be for a device, or for a specific user profile.
In some such embodiments, where the default handedness is for a user profile, the same handedness may be used by all applications (or at least the ones adapted accordingly) for a user, wherein one user of the device may have one handedness and wherein another user may have a different handedness, but where both users may be allowed to use the device according to their respective handedness.
Some user profiles may even be of a meta type, i.e. a profile that is associated with more than one device; examples of such meta profiles are Google™ accounts and Apple™ accounts. Such meta profiles may thus be relevant also to other applications and/or devices. In some embodiments, the user account profile may be a meta profile relevant to more than one application.
The XR device may also be enabled to change the default handedness, which is especially useful if it is detected that the wrong handedness is stored as the default handedness.
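A sketch of this adapt-only-on-change behavior, where the `settings` dictionary and `gui` object are hypothetical stand-ins for the memory 120 and the GUI adaptation circuit 114:

```python
def maybe_move_control_area(detected: str, settings: dict, gui) -> None:
    """Move the control area only when the detected handedness differs from
    the stored default handedness, then persist the new default.
    `settings` and `gui` are hypothetical placeholders."""
    if detected != settings.get("default_handedness"):
        # First position for a first main hand, second position for the other.
        gui.set_control_area_position("P-L" if detected == "left" else "P-R")
        settings["default_handedness"] = detected
```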
Furthermore, the XR device 100 may in some embodiments be enabled to detect that a detected handedness is different from a default handedness of a current profile but corresponds to another profile accessible to the XR device, such as one also stored in the memory 120 of the XR device 100, where the XR device may be adapted to change to the profile associated with the detected handedness. This allows the XR viewing device to adapt to a user starting to use another device even though the user has forgotten to log in. In some embodiments, the hand-detection circuit 112 may thus be further configured to determine that the detected handedness is different from a default handedness of a current account profile and in response thereto change to a second account profile associated with the detected handedness, wherein the GUI adaptation circuit 114 may be further configured to adapt the GUI based on the second account profile.
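The profile switch could be sketched as follows; the profile dictionaries and their `default_handedness` key are assumptions for illustration:

```python
def switch_profile_on_handedness(detected: str, active: dict,
                                 stored: list[dict]) -> dict:
    """Return the account profile whose default handedness matches the
    detected one; keep the active profile if it already matches or if no
    stored profile matches."""
    if active.get("default_handedness") == detected:
        return active
    for profile in stored:
        if profile.get("default_handedness") == detected:
            return profile  # the GUI is then adapted based on this profile
    return active
```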
Returning to the adaptation of the GUI, in the example of figure 2A and also of figure 2B, this adaptation is illustrated by the control area 220 being moved from the right-hand side of the display view 150 to the left-hand side or rather to a position close(r) to the left hand H-L of the user, as it is determined that the handedness of the user is not right-handed, but rather left-handed. The control area 220 is now at the second position, being a control area 220L adapted for left-hand use. The control area is again placed at a distance d from the left hand H-L. This distance d may be the same distance as from the right hand (as indicated by the same reference) or a different distance (as indicated by the example in figures 2A and 2B).
The examples of figures 2A and 2B thus illustrate examples of embodiments wherein the first position P-R, P-L is in one portion of (such as half of) the display view (150) corresponding to the detected handedness and the second position P-L, P-R is in the remaining portion (such as the other half) of the display view (150) (the halves being separated by a dashed line in figures 2A and 2B).
The examples of figures 2A and 2B thus illustrate examples of embodiments wherein the position P-R, P-L for the control area 220-R, 220-L is within a threshold distance (d) from the detected movement of the hand (H).
As an alternative to or in addition to determining a location of the dominant hand and placing the control objects 210 in relation thereto, the control objects may simply be placed in relation to a viewing side instead, or in addition thereto (i.e. the control objects are placed with regard both to the position of the hand and to the distance to the side). A viewing side, or view side, is a side or portion of the display on the side of the main hand, named so since it will most likely be the side that the user looks at the most.
The examples of figures 2A and 2B thus illustrate examples of embodiments wherein the GUI adaptation circuit 114 is further configured to determine a view side 150-L for the detected handedness H-L and to adapt the GUI by ordering the one or more control objects 210 along the view side 150-L, 150-R. In the example of figures 2A and 2B, the control objects are thus moved from the right-hand side 150-R to the left-hand side 150-L of the display 150.
As can be seen in figures 2A and 2B, the order of the control objects 210 is maintained. The examples of figures 2A and 2B thus illustrate examples of embodiments wherein the GUI adaptation circuit 114 is further configured to determine an order of the one or more control objects 210 in the control area 220-R, 220-L and to maintain the order of the one or more control objects 210 in the control area 220-L, 220-R at the second position.
As also indicated in figures 2A and 2B, at least one (in this example control object 210D) of the one or more control objects 210 is associated with an activation direction adL. In some such embodiments, the GUI adaptation circuit 114 is further configured to change the activation direction adL to an opposite direction adR.
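Expressed as code, such a direction flip is a one-liner; the string encoding of the directions is an assumption:

```python
def flip_activation_direction(direction: str) -> str:
    """Reverse an activation direction (e.g. adL of control object 210D)
    to the opposite direction when the control area changes sides."""
    return {"left": "right", "right": "left"}.get(direction, direction)
```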
Figures 2C and 2D show examples where there are more control objects 210 than in figures 2A and 2B. In embodiments or examples where the control objects are not arranged in a (straight) line, it may be beneficial to rearrange the objects as they are moved in order to allow the objects to be (equally) reachable by the user under the new handedness.
In the example of figures 2C and 2D, the control objects have been arranged at the left-hand side (figure 2D) in the left-hand position P-L in a mirrored order compared to how they are arranged at the right-hand side in the right-hand position P-R (figure 2C).
The examples of figures 2C and 2D thus illustrate examples of embodiments wherein the GUI adaptation circuit 114 is further configured to determine an order of the one or more control objects 210 in the control area 220-R, 220-L and to determine a mirrored order based on the order, wherein the graphics processing circuit 111 is further configured to provide the one or more graphic control objects 210 to be displayed in the control area 220 at the second position in the mirrored order.
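One way to compute such a mirrored order, assuming the control objects sit in a regular grid with a known column count (the grid assumption is illustrative):

```python
def mirrored_order(objects: list, columns: int) -> list:
    """Mirror a grid of control objects column-wise, so that the object
    closest to one hand ends up closest to the other hand."""
    rows = [objects[i:i + columns] for i in range(0, len(objects), columns)]
    return [obj for row in rows for obj in reversed(row)]

# Example with the objects of figures 2C and 2D laid out in two columns:
# mirrored_order(["210A", "210B", "210C", "210D", "210E", "210F"], columns=2)
# -> ["210B", "210A", "210D", "210C", "210F", "210E"]
```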
It should be noted that a control object is a graphical object (an object having a graphical representation) associated with a function such that, when the object is activated, the associated function is executed. Such control objects are well-known and need no further explanation.
Other objects may also be part of the graphical user interface, and may also be moved as part of the adaptation of the user interface. One example of some such other objects is a notification object. A notification object is an object comprising a notification (text or image). The notification object may be seen as a control object that when activated displays the notification. The notification object may also or alternatively be a control object wherein the graphical representation carries the notification. In such embodiments, the associated function may be null, i.e. a non-action. In some embodiments at least one of the one or more control objects 210 is a notification object 210.
As has been discussed above, and is shown in the figures 2A to 2H, the control area 220 is the area of the one or more graphic control objects 210. The control area 220 may thus be seen as an area encompassing the one or more graphic control objects 210.
In some embodiments the control objects, or the function of the control objects, may be associated with a priority indicating an importance of being easily reached by a user irrespective of handedness.
Figure 2E shows examples of an adaptation as an alternative to the adaptation of figure 2D. In figure 2E the control objects have not been mirrored (as in figure 2D) but have had their individual order changed. This change of order is in some embodiments based on a priority of the object (and possibly of the associated function). Figure 2E thus shows examples wherein at least one of the one or more control objects 210 is associated with a priority, and wherein the GUI adaptation circuit 114 is further configured to adapt the GUI by ordering the one or more control objects 210 according to the priority.
In the examples of figures 2A to 2E, all control objects are moved. However, this is not mandatory, and in some embodiments, only a portion of the control objects are moved.
In some such embodiments, the controller 110 is configured to move the most important control objects, i.e. the control objects associated with a higher priority (exceeding a threshold value). In some embodiments the threshold value is associated with a number of usages, and is for example the average number of uses for the control objects. Alternatively, the threshold value is 110, 120, 125, 130, 150 or a higher percentage of the average number of uses for the control objects. In some embodiments the threshold value is associated with the number of control objects, and is for example 50, 75, 80, or 90 percent (or closest thereto) of the number of control objects 210.
In the example of figure 2F only two control objects 210E, 210B have been moved, whereas the others have been maintained at the original position of the control area. As can be seen in figure 2F a reorder of the remaining objects may be done so as to not leave gaps.
Moving only a portion of the control objects 210 enables maintaining a graphical or artistic feel or view of the GUI, wherein some objects are better associated with or fitted within the background of the GUI.
Figure 2F thus shows examples wherein the GUI adaptation circuit 114 is further configured to adapt the GUI by assigning at least one control object 210E, 210B having a priority exceeding a threshold value to the control area 220-L, 220-R at the first position and assigning the remaining control objects 210A, 210F, 210C, 210D to a control area 220-R, 220-L at the second position.
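A sketch of this priority-threshold split, together with one of the example threshold choices above (a percentage of the average number of uses); the object dictionaries and their keys are an illustrative assumption:

```python
def usage_threshold(objects: list[dict], percent: float = 110.0) -> float:
    """Threshold as a percentage (e.g. 110, 120, 125, 130 or 150) of the
    average number of uses per control object."""
    average = sum(obj["uses"] for obj in objects) / len(objects)
    return average * percent / 100.0

def split_by_priority(objects: list[dict], threshold: float) -> tuple[list, list]:
    """Split the control objects into those moved to the handedness-adapted
    control area (priority exceeding the threshold) and those that keep
    their original position."""
    moved = [obj for obj in objects if obj["priority"] > threshold]
    kept = [obj for obj in objects if obj["priority"] <= threshold]
    return moved, kept
```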
In some embodiments the GUI adaptation circuit 114 is further configured to determine the priority for a control object based on a frequency of use.
In some embodiments the GUI adaptation circuit 114 is further configured to determine the priority for a control object based on a number of activations.
And, in some embodiments the GUI adaptation circuit 114 is further configured to determine the priority for a control object based on an application associated with the control object.
It should be noted that the priority may be determined based on any, some or all of these manners in combination, and different objects may have their priority determined differently or in the same manner, for any, some or all objects.
As mentioned above, a control object may be part of, or preferably arranged at, a specific position with regard to the background of the graphical user interface. In such cases, or also in other cases, it may be beneficial to keep an object at a specific position and assign it a different function as the GUI is adapted, instead of moving it.
Figure 2G shows examples where control objects 210 are each associated with a function F. However, not all objects need be associated with a function, as discussed above. In figure 2G the control objects 210 are arranged at a right position and associated with right-handedness. Furthermore, specifically, control object 210C is associated with the function F1, control object 210A is associated with the function F2, control object 210D is associated with the function F3 and control object 210B is associated with the function F4. Figure 2H shows a corresponding example where the GUI has been adapted based on a different handedness, and in this example the GUI is adapted to left-handedness.
In the example shown in figure 2H the control objects have maintained their individual positions, but have been associated with other functions; the functions have been remapped. Specifically, control object 210C is now associated with the function F2, control object 210A is now associated with the function F1, control object 210D is now associated with the function F4 and control object 210B is now associated with the function F3.
Figures 2G and 2H thus illustrate examples wherein at least one of the one or more control objects (210) is associated with a mapped function F1-F4 and wherein the GUI adaptation circuit 114 is further configured to adapt the GUI by re-associating the at least one of the one or more control objects to a different mapped function.
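The remapping of figures 2G and 2H could be sketched as a swap over mirrored position pairs; the pairing itself would come from the GUI layout and is assumed here:

```python
def remap_functions(mapping: dict[str, str],
                    pairs: list[tuple[str, str]]) -> dict[str, str]:
    """Swap the mapped functions of position-paired control objects while
    every object keeps its on-screen position."""
    remapped = dict(mapping)
    for a, b in pairs:
        remapped[a], remapped[b] = mapping[b], mapping[a]
    return remapped

# The example of figures 2G and 2H:
# remap_functions({"210C": "F1", "210A": "F2", "210D": "F3", "210B": "F4"},
#                 pairs=[("210C", "210A"), ("210D", "210B")])
# -> {"210C": "F2", "210A": "F1", "210D": "F4", "210B": "F3"}
```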
Alternatively or additionally, the graphical representation of a control object may also be changed as part of the adaptation.
As mentioned above the XR device 100 may be arranged to communicate with other devices, and as also mentioned above a user profile may be associated with or used by several devices.
The inventors are therefore also proposing to enable a first device to communicate, through a direct connection or through a user profile, with a second device and adapt the GUI in the second device based on a handedness detected by the first device. This enables a user to use a different device where the handedness is set correctly, without the second device having to spend time detecting which hand is used and how well the hand is being used.
Figure 3A shows an example situation where a first XR device 100-1 presents or displays control objects 210 in a first (right) control area at a first (right) position, thereby providing the control objects 210 as per a handedness indicating the main hand to be the right hand. As is also illustrated, a second XR device 100-2 has the same or a similar setup, where control objects 210 are provided based on a handedness indicating the main hand to be the right hand.
As indicated in figure 3A by the dotted rounded rectangle around the left hand H-L, the left hand has been detected as the main hand. The detected handedness is thus different from the handedness according to which the control objects 210 are currently provided. As per the teachings herein, the GUI is therefore adapted by the first XR device 100-1. This is illustrated in the example of figure 3B where the control objects 210 are now provided in a second (left) control area at a second (left) position, thereby providing the control objects 210 as per a handedness indicating the main hand to be the left hand.
As is indicated in the examples of figure 3A, the first XR device 100-1 communicates or indicates the detected handedness or the change in detected handedness to the second XR device as indicated by the dashed zigzag line.
As is illustrated in the examples of figure 3B, the second viewing device 100-2 also adapts its GUI according to the detected handedness. The second device may in some embodiments receive an indication that the handedness has been changed, or an indication of the detected handedness and determine that the handedness has been changed. The indication of the detected handedness is in some embodiments the detection made by the hand-movement-detection circuit.
In the display 150 of the second XR device 100-2 the objects 210 are thus also provided in a second (left) control area at a second (left) position thereby providing the control objects 210 as per a handedness indicating a dominant hand being the left hand.
The examples of figures 3A and 3B thus show examples wherein the XR device 100 further comprises a communication interface 140 for connecting to a second device 100-2 and for communicating an indication of the detected handedness to the second device 100-2 thereby enabling the second device to adapt a GUI of the second device 100-2 according to the detected handedness.
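As a sketch of the indication sent between the devices, assuming a plain TCP connection and a small JSON message; the wire format, host and port are assumptions (a real system might use Bluetooth or a cloud service, as noted below):

```python
import json
import socket

def send_handedness(detected: str, host: str, port: int = 9999) -> None:
    """Indicate the detected handedness to a second device so that it can
    adapt its own GUI without re-detecting the handedness."""
    message = json.dumps({"event": "handedness_changed",
                          "handedness": detected})
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(message.encode("utf-8"))
```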
As a skilled person would understand the second or connected device need not be a second XR device 100 but may also be other devices such as a headset. Most commonly a headset would have a direct connection (for example through Bluetooth™) and a second XR device commonly (but not necessarily) connected through a server or cloud service.
Returning to the actual detection of hand (possibly including limb and/or finger) movement discussed above, the detection is based on the hand-movement-detection circuit 112 receiving input from a sensor. The sensor 130 is exemplified as an image receiving device (for example a camera or image sensor) but may additionally or alternatively be a movement sensor arranged to detect movements, for example through the use of a gyro, an inertial measurement unit (IMU) or another movement detection device. In such embodiments, the XR device 100 is configured to connect to, or arranged to include, a motion sensor 410 as shown in figure 4A. In some embodiments more than one motion sensor 410 is utilized, for example one per hand.
The motion sensor 410 is arranged to detect the relevant movement; in other words, for an XR device that detects hand movements, the motion sensor 410 is also enabled to detect hand motion, for an XR device that detects limb movements, the motion sensor 410 is also enabled to detect limb motion, and for an XR viewing device that detects finger movements, the motion sensor 410 is also enabled to detect finger motion. Examples of all such motion sensors and of combined detections are well-known and require no further details. Naturally the motion sensor also includes some form of power source and communication capabilities.
Figure 4A shows examples of such embodiments, where an XR device 100 is connected to a motion sensor 410, the two possibly forming an XR viewing system 400.
The motion sensor 410 may be comprised in a glove G, such as an XR glove, to be worn by the user. Alternatively or additionally, the motion sensor 410 may be worn in another way by a user, for example strapped to the hand(s) or in a device carried by the user. The sensor is thus in some embodiments a motion sensor 410 carried by the user, wherein the hand-detection circuit 112 is further configured to determine the movement pattern based on motion data analysis.
Figure 4A shows examples of such a system comprising an XR device 100 and an XR glove G, wherein the glove G comprises the motion sensor 410.
Figure 4B shows examples of alternative systems 400 wherein the motion sensor device is an XR controller 410A that comprises the motion sensor 410.
The motion sensor 410 is configured to detect a motion which over time forms a movement pattern, indicated in figure 4A by a dashed line referenced MP. This movement pattern is then used by the handedness detection circuit 113 to detect the actual handedness as discussed above.
As mentioned above and as is illustrated in the examples of figure 4C, the system 400 may additionally or alternatively comprise an XR device 100 wherein the sensor 130 is a camera or image sensor 130 and wherein the hand-detection circuit 112 is further configured to determine the movement pattern based on image analysis. The examples shown in figures 4A, 4B and 4C thus show embodiments wherein the hand-detection circuit 112 is further configured to receive sensor input from a sensor, wherein the sensor may be an image sensor 130, a motion sensor 410 or a combination of the two, and to determine a movement pattern for a hand based on the sensor input, wherein the sensor input indicates the movement of the hand.
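A simple stand-in for turning raw sensor input into a movement pattern, assuming timestamped 2D hand positions from either the image sensor 130 or the motion sensor 410; real implementations would use full image or IMU processing:

```python
def movement_pattern(samples: list[tuple[float, float, float]]
                     ) -> list[tuple[float, float]]:
    """Reduce a stream of (t, x, y) hand positions to per-step velocity
    vectors, a simple proxy for the movement pattern MP of figure 4A."""
    pattern = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:
            pattern.append(((x1 - x0) / dt, (y1 - y0) / dt))
    return pattern
```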
Figure 5 shows a general flowchart for a method according to the teachings herein. The method corresponds to the operation of the XR device 100 as discussed in the above, wherein said method comprises: providing 510 one or more graphic control objects 210, which one or more graphic control objects 210 are to be displayed in a control area 220 as part of a graphical user interface, GUI, in an extended reality. The method also comprises detecting 520 a movement of a hand (H) of a user and then detecting 530 a handedness based on the movement of the hand (H), whereby the handedness indicates a main hand. The method also includes adapting 540 the GUI by determining a position for the control area 220 based on the detected handedness, wherein the detection of the handedness is based on the movement of the hand (H) in that the method further comprises determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determining the handedness as being for the hand movements within the first accuracy range, and wherein the position for the control area 220 is a first position when the handedness indicates a first main hand and wherein the position for the control area 220 is a second position when the handedness indicates a second main hand.
Alternatively or additionally, the position for the control area 220 is a first position when the movement of the hand corresponds to movements indicating a first main hand and wherein the position for the control area 220 is a second position when the movement of the hand corresponds to movements indicating a second main hand.
The method of figure 5 is a general method and also allows for implementing other features as disclosed in above as sub functionality of any part of the method as disclosed.
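Put together, one iteration of the general method could be sketched as below, reusing `maybe_move_control_area` from earlier; `sensor` and `detector` are hypothetical objects standing in for the circuits 112 and 113:

```python
def adaptation_step(sensor, detector, gui, settings: dict) -> None:
    """One pass of the method of figure 5: detect hand movement (520),
    detect handedness from it (530), adapt the control area position (540)."""
    movement = sensor.read_hand_movement()
    handedness = detector.estimate(movement)
    if handedness is not None:
        maybe_move_control_area(handedness, settings, gui)
```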
Figure 6 shows a schematic view of a computer-readable medium 120 carrying computer instructions 121 that, when loaded into and executed by a controller of an XR device 100, enable the XR device to execute the teachings herein.
The computer-readable medium 120 may be tangible such as a hard drive or a flash memory, for example a USB memory stick or a cloud server. Alternatively, the computer-readable medium 120 may be intangible such as a signal carrying the computer instructions enabling the computer instructions to be downloaded through a network connection, such as an internet connection.
In the example of figure 6, the computer-readable medium 120 is shown as being a computer disc 120 carrying computer-readable computer instructions 121, being inserted in a computer disc reader 122. The computer disc reader 122 may be part of, or connected to, a cloud server 123 or another server. The cloud server 123 may be part of the internet or at least connected to the internet. The cloud server 123 may alternatively be connected through a proprietary or dedicated connection. In one example embodiment, the computer instructions are stored at a remote server 123 and downloaded to the memory 120 of the XR device 100 for being executed by the controller 110.
The computer disc reader 122 may also or alternatively be connected to (or possibly inserted into) an XR device 100 for transferring the computer-readable computer instructions 121 to a controller of the XR device via a memory of the XR viewing device 100.
Figure 6 shows both the situation when an XR device 100 receives the computer-readable computer instructions 121 via a server connection and the situation when another XR device 100 receives the computer-readable computer instructions 121 through a wired interface. This enables the computer-readable computer instructions 121 to be downloaded into an XR device 100, thereby enabling the XR device 100 to operate according to and implement the invention as disclosed herein.
Figure 7 shows a component view of a software component or module arrangement 700 according to some embodiments of the teachings herein. The software component arrangement 700 is adapted to be used in an XR device 100 as taught herein for providing adaptation of a user interface, and corresponds to the operation of the XR device in the above. The software component arrangement 700 comprises a software component 710 for providing one or more graphic control objects 210, which one or more graphic control objects 210 are to be displayed in a control area 220 as part of a graphical user interface, GUI, in an extended reality. The software component arrangement 700 comprises a software component 720 for detecting a movement of a hand H of a user and then detecting a handedness based on the movement of the hand H, whereby the handedness indicates a main hand. The software component arrangement 700 comprises a software component 730 for adapting the GUI by determining a position for the control area 220 based on the detected handedness, wherein the detection of the handedness is based on the movement of the hand H by determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and, in response thereto, determining the handedness as being for the hand performing the movements within the first accuracy range, and wherein the position for the control area 220 is a first position when the handedness indicates a first main hand and a second position when the handedness indicates a second main hand.
Alternatively or additionally, the position for the control area 220 is a first position when the movement of the hand corresponds to movements indicating a first main hand and a second position when the movement of the hand corresponds to movements indicating a second main hand. The software component arrangement 700 comprises software component(s) 740 for further functionalities as discussed in the teachings herein.
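Purely as an illustration of the structure described above, the following sketch shows one way the software components could be wired together in one arrangement; the class and parameter names are assumptions made for the example.

```python
class SoftwareComponentArrangement:
    """A structural sketch of the arrangement 700; the constructor takes the
    components as plain callables."""
    def __init__(self, provide_controls, detect_movement, detect_handedness, adapt_gui):
        self.provide_controls = provide_controls      # component 710
        self.detect_movement = detect_movement        # component 720 (movement)
        self.detect_handedness = detect_handedness    # component 720 (handedness)
        self.adapt_gui = adapt_gui                    # component 730

    def run_once(self):
        """One pass: provide control objects, detect movement and handedness,
        then adapt the GUI by positioning the control area."""
        objects = self.provide_controls()
        movement = self.detect_movement()
        handedness = self.detect_handedness(movement)
        return self.adapt_gui(objects, handedness)
```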
For the context of the teachings herein, a software code module may be replaced or supplemented by a software component.

Claims

1. An extended reality, XR, device (100) comprising a controller (110), wherein said controller (110) comprises: a graphics processing circuit (111) configured to provide one or more graphic control objects (210), which one or more graphic control objects (210) are to be displayed in a control area (220) as part of a graphical user interface, GUI, a hand-movement-detection circuit (112) configured to detect a movement of a hand (H) of a user, a handedness-detection circuit (113) configured to estimate a handedness based on the movement of the hand (H) detected by the hand-movement-detection circuit (112), whereby the handedness indicates a main hand, and a GUI adaptation circuit (114) configured to determine a position for the control area (220) based on the detected handedness, wherein the handedness-detection circuit (113) is configured to estimate the handedness based on the movement of the hand (H) detected by the hand-movement-detection circuit (112) by determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determine an estimate of the handedness as being for the hand performing the movements within the first accuracy range, and wherein the GUI adaptation circuit (114) is further configured to determine the position for the control area (220) to be a first position when the handedness indicates a first main hand and to be a second position when the handedness indicates a second main hand.
2. The XR device (100) according to claim 1, wherein the handedness-detection circuit (113) is configured to detect the handedness based on the movement of the hand (H) by determining that the movement of the hand results in operations which are within a second accuracy range associated with a non-main hand and in response thereto determine the handedness as being for the other hand than the hand performing the movements within the second accuracy range.
3. The XR device (100) according to claim 1 or 2, wherein the hand-movement detection circuit (112) is further configured to detect movement of an arm carrying the hand and wherein the first accuracy range detected by the handedness-detection circuit (113) includes a first arm movement accuracy range.
4. The XR device (100) according to claim 1, 2 or 3, wherein the hand-movement detection circuit (112) is further configured to detect movement of a finger of the hand and wherein the first accuracy range detected by the handedness-detection circuit (113) includes a first finger movement accuracy range.
5. The XR device (100) according to any preceding claim, wherein the handedness-detection circuit (113) is configured to detect the handedness based on the movement of the hand (H) by determining that the movement of the hand indicates a preferred hand and in response thereto determine the handedness as being for the preferred hand.
6. The XR device (100) according to claim 5, wherein the preferred hand is indicated based on a number of movements with the hand exceeding a range value.
7. The XR device (100) according to claim 6, wherein the range value comprises the number of movements with the other hand.
8. The XR device (100) according to claim 7, wherein the range value further comprises the number of movements with the other hand, multiplied by a factor of 1.25, 1.5, 2 or more.
9. The XR device (100) according to any of claims 5 to 8, wherein the preferred hand detected by the handedness-detection circuit (113) is the hand first moving.
10. The XR device (100) according to any of claims 5 to 9, wherein the preferred hand detected by the handedness-detection circuit (113) is the hand used to pick up controls.
11. The XR device (100) according to any preceding claim, wherein the first position (P-R, P-L) is located in a half of a display view (150) corresponding to the detected handedness and the second position (P-L, P-R) is located in another half of the display view (150).

12. The XR device (100) according to any preceding claim, wherein the position (P-R, P-L) for the control area (220-R, 220-L) is within a threshold distance (d) from the detected movement of the hand (H).
13. The XR device (100) according to any preceding claim, wherein the handedness-detection circuit (113) is further configured to determine that the detected handedness is different from a current handedness and wherein the GUI adaptation circuit (114) is further configured to move the control area (220) from the first position to the second position in response thereto.
14. The XR device (100) according to claim 13, wherein the GUI adaptation circuit (114) is further configured to determine an order of the one or more control objects (210) in the control area (220-R, 220-L) and to maintain the order of the one or more control objects (210) in the control area (220-L, 220-R) at the second position.
15. The XR device (100) according to claim 13, wherein the GUI adaptation circuit (114) is further configured to determine an order of the one or more control objects (210) in the control area (220-R, 220-L) and to determine a mirrored order based on the order, wherein the graphics processing circuit (111) is further configured to provide the one or more graphic control objects (210) to be displayed in the control area (220) at the second position in the mirrored order.
16. The XR device (100) according to any preceding claim, wherein the GUI adaptation circuit (114) is further configured to determine a view side (150-L) for the detected handedness (H-L) and to adapt the GUI by ordering the one or more control objects (210) along the view side (150-L, 150-R).
17. The XR device (100) according to any preceding claim, wherein at least one of the one or more control objects (210) is associated with a priority, and wherein the GUI adaptation circuit (114) is further configured to adapt the GUI by ordering the one or more control objects (210) according to the priority.

18. The XR device (100) according to claim 17, wherein the GUI adaptation circuit (114) is further configured to adapt the GUI by assigning at least one control object (210E, 210B) having a priority exceeding a threshold value to the control area (220-L, 220-R) at the first position and by assigning the remaining control objects (210A, 210F, 210C, 210D) to a control area (220-R, 220-L) at the second position.
19. The XR device (100) according to claim 18, wherein the GUI adaptation circuit (114) is further configured to determine the priority for a control object based on a frequency of use.
20. The XR device (100) according to claim 18 or 19, wherein the GUI adaptation circuit (114) is further configured to determine the priority for a control object based on a number of activations.
21. The XR device (100) according to any of claims 18-20, wherein the GUI adaptation circuit (114) is further configured to determine the priority for a control object based on an application associated with the control object.
22. The XR device (100) according to any of claims 13 to 21, wherein at least one of the one or more control objects (210) is associated with a mapped function (F1-F4) and wherein the GUI adaptation circuit (114) is further configured to adapt the GUI by re-associating the at least one of the one or more control objects to a different mapped function.
23. The XR device (100) according to any of claims 13 to 22, wherein at least one (210D) of the one or more control objects (210) is associated with an activation direction (adL) and wherein the GUI adaptation circuit (114) is further configured to change the activation direction (adL) to an opposite direction (adR).
24. The XR device (100) according to any preceding claim, wherein the hand-movement-detection circuit (112) is further configured to store the detected handedness as a default handedness for a user account profile.

25. The XR device (100) according to claim 24, wherein the user account profile is a meta profile relevant to more than one application.
26. The XR device (100) according to any preceding claim, wherein the hand-movement-detection circuit (112) is further configured to determine that the detected handedness is different from a default handedness of a current account profile and in response thereto change to a second account profile associated with the detected handedness, and wherein the GUI adaptation circuit (114) is further configured to adapt the GUI based on the second account profile.
27. The XR device (100) according to any preceding claim, wherein at least one of the one or more control objects (210) is a notification object (210).
28. The XR device (100) according to any preceding claim, further comprising a communication interface (140) for connecting to a second device (100-2) and communicating an indication of the detected handedness to the second device (100-2), thereby enabling the second device to adapt a GUI of the second device (100-2) according to the detected handedness.
29. The XR device (100) according to any preceding claim, wherein the hand-movement-detection circuit (112) is further configured to receive sensor input from a sensor (130, 410) and determine a movement pattern for a hand based on the sensor input, wherein the sensor input indicates the movement of the hand.
30. The XR device (100) according to claim 29, wherein the sensor (130) is a camera or image sensor (130) and wherein the hand-movement-detection circuit (112) is further configured to determine the movement pattern based on image analysis.
31. The XR device (100) according to claim 29 or 30, wherein the sensor (410) is a motion sensor (410) to be carried by the user (U) and wherein the hand-movement-detection circuit (112) is further configured to determine the movement pattern based on motion data analysis.

32. An XR system (400) comprising an XR device (100) according to claim 31 and an XR glove (G), wherein the glove (G) comprises the motion sensor (410).
33. An XR system (400) comprising an XR device (100) according to claim 31 and an XR controller (410A), wherein the XR controller (410A) comprises the motion sensor (410).
34. A method for use in an extended reality, XR, device (100), wherein said method comprises: providing one or more graphic control objects (210), which one or more graphic control objects (210) are to be displayed in a control area (220) as part of a graphical user interface, GUI, in an extended reality, detecting a movement of a hand (H) of a user, detecting a handedness based on the movement of the hand (H), whereby the handedness indicates a main hand, and determining a position for the control area (220) based on the detected handedness, wherein the detection of the handedness is based on the movement of the hand (H) by the method further comprising determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determining the handedness as being for the hand performing the movements within the first accuracy range, and wherein the position for the control area (220) is a first position when the handedness indicates a first main hand and a second position when the handedness indicates a second main hand.
35. A computer-readable medium (120) carrying computer instructions (121) that, when loaded into and executed by a controller (110) of an XR device (100), enable the XR device (100) to implement the method according to claim 34.
36. An extended reality, XR, device (100) comprising a software code module arrangement (700), wherein the software code module arrangement (700) comprises: a software code module (710) for providing one or more graphic control objects (210), which one or more graphic control objects (210) are to be displayed in a control area (220) as part of a graphical user interface, GUI, in an extended reality, a software code module (720) for detecting a movement of a hand (H) of a user, a software code module (730) for detecting a handedness based on the movement of the hand (H), whereby the handedness indicates a main hand, and a software code module (740) for adapting the graphical user interface by determining a position for the control area (220) based on the detected handedness, wherein the detection of the handedness is based on the movement of the hand (H) in that the software code module (730) further comprises software code for determining that the movement of the hand results in operations which are within a first accuracy range associated with a main hand and in response thereto determining the handedness as being for the hand performing the movements within the first accuracy range, and wherein the position for the control area (220) is a first position when the handedness indicates a first main hand and a second position when the handedness indicates a second main hand.
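For illustration only, the following sketch shows one possible reading of the GUI adaptations of claims 15 and 18: mirroring the order of the control objects at the second position, and assigning objects whose priority exceeds a threshold value to the first position. The priority values and the threshold are assumptions made for the example, not values given in the claims.

```python
def mirrored_order(objects):
    """Claim 15: display the control objects in mirrored order at the
    second position."""
    return list(reversed(objects))

def split_by_priority(objects, priorities, threshold=0.5):
    """Claim 18: objects whose priority exceeds the threshold go to the
    control area at the first position, the rest to the second position."""
    first_area = [o for o in objects if priorities[o] > threshold]
    second_area = [o for o in objects if priorities[o] <= threshold]
    return first_area, second_area

objects = ["210A", "210B", "210E", "210F"]
priorities = {"210A": 0.2, "210B": 0.9, "210E": 0.8, "210F": 0.1}
print(split_by_priority(objects, priorities))   # (['210B', '210E'], ['210A', '210F'])
print(mirrored_order(objects))                  # ['210F', '210E', '210B', '210A']
```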
Priority application: PCT/EP2023/073843, filed 2023-08-30, titled "A computer software module arrangement, an XR device and a method for providing an improved extended reality interface" (status: pending).

Publication: WO2025045363A1, published 2025-03-06.

Family ID: 87886746.



Legal Events

Code 121 (Ep): the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 23762510; country of ref document: EP; kind code of ref document: A1.


[8]ページ先頭

©2009-2025 Movatter.jp