HK1191707B - Sensor fusion algorithm

Sensor fusion algorithm

Info

Publication number
HK1191707B
Authority
HK
Hong Kong
Prior art keywords
computing device
orientation
accessory
accessory device
host computing
Application number
HK14104848.0A
Other languages
Chinese (zh)
Other versions
HK1191707A1 (en)
Inventor
David R. Perek
Michael A. Schwager
Sharon Drasnin
Mark J. Seilstad
Original Assignee
Microsoft Technology Licensing, LLC
Priority claimed from US13/471,202 (external priority: patent US8548608B2)
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1191707A1
Publication of HK1191707B


Description

Sensor fusion algorithm
RELATED APPLICATIONS
This application claims priority under 35 U.S.C. § 119(e) to the following U.S. provisional patent applications, the entire disclosure of each of which is incorporated herein by reference:
U.S. provisional patent application No. 61/606,321, filed March 2, 2012, attorney docket No. 336082.01 and entitled "Screen Edge";
U.S. provisional patent application No. 61/606,301, filed March 2, 2012, attorney docket No. 336083.01 and entitled "Input Device Functionality";
U.S. provisional patent application No. 61/606,313, filed March 2, 2012, attorney docket No. 336084.01 and entitled "Functional Hinge";
U.S. provisional patent application No. 61/606,333, filed March 2, 2012, attorney docket No. 336086.01 and entitled "Usage and Authentication";
U.S. provisional patent application No. 61/613,745, filed March 21, 2012, attorney docket No. 336086.02 and entitled "Usage and Authentication";
U.S. provisional patent application No. 61/606,336, filed March 2, 2012, attorney docket No. 336087.01 and entitled "Kickstand and Camera"; and
U.S. provisional patent application No. 61/607,451, filed March 6, 2012, attorney docket No. 336143.01 and entitled "Spanaway Provisional".
Background
Mobile computing devices have been developed to extend the functionality that is made available to users in mobile settings. For example, a user may interact with a mobile phone, tablet computer, or other mobile computing device to check email, surf the web, compose text, interact with applications, and so forth. Some mobile computing devices may connect to and interact with various accessory devices to provide different input techniques, extend functionality, and so forth. One challenge that faces developers of mobile computing devices is managing the behavior of, and interaction with, companion accessory devices. For example, the host computing device may have limited control over how the accessory device behaves, and thus the actions of the accessory device may sometimes interfere with the operation of the host computing device. Moreover, the user experience may be adversely affected by accessory devices that do not respond in a manner consistent with the host computing device. Thus, integrated management of the behavior and interaction of accessory devices can be a challenging consideration for developers of mobile computing devices.
Disclosure of Invention
Sensor fusion algorithm techniques are described. In one or more implementations, behavior of the host device and the accessory device is controlled as a function of an orientation of the host device and the accessory device relative to each other. The combined spatial position and/or orientation of the host device may be derived from raw measurements obtained from at least two different types of sensors. Additionally, the spatial position and/or orientation of the accessory device is ascertained using one or more sensors of the accessory device. An orientation (or position) of the accessory device relative to the host computing device can then be calculated from the combined spatial position/orientation of the host computing device and the ascertained spatial position/orientation of the accessory device. The calculated relative orientation may then be used in various ways to control the behavior of the host computing device and/or the accessory device.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Drawings
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items. Entities represented in the figures may be indicative of one or more entities, and thus singular or plural forms of the entities may be referred to interchangeably in the discussion.
FIG. 1 is an illustration of an environment in an exemplary implementation that is operable to employ techniques described herein.
FIG. 2 depicts an exemplary implementation of the computing device of FIG. 1 in greater detail.
FIG. 3 depicts an exemplary implementation of the accessory device of FIG. 1 showing the flexible hinge in greater detail.
FIG. 4 depicts an exemplary orientation of an accessory device relative to a computing device in accordance with one or more embodiments.
FIG. 5 depicts an exemplary orientation of an accessory device relative to a computing device in accordance with one or more embodiments.
FIG. 6 depicts an exemplary orientation of an accessory device relative to a computing device in accordance with one or more embodiments.
FIG. 7 depicts an exemplary orientation of an accessory device relative to a computing device in accordance with one or more embodiments.
FIG. 8 depicts an exemplary orientation of an accessory device relative to a computing device in accordance with one or more embodiments.
FIG. 9 depicts an exemplary orientation of an accessory device relative to a computing device in accordance with one or more embodiments.
FIG. 10 depicts an illustration of some exemplary rotational orientations of a computing device relative to an input device in accordance with one or more embodiments.
FIG. 11 is a flow diagram that describes an exemplary process in accordance with one or more embodiments.
FIG. 12 is a flow diagram that describes an exemplary process in accordance with one or more embodiments.
FIG. 13 illustrates an exemplary system that includes various components of an exemplary device that may be implemented as any type of computing device as described with reference to FIGS. 1-12 to implement embodiments of the techniques described herein.
Detailed Description
Overview
Traditionally, a host computing device may have limited control over how an associated accessory device behaves. The action of the accessory device may therefore sometimes interfere with the operation of the host computing device, which may detract from the user experience. Thus, integrated management of the behavior and interaction of the accessory device may be a consideration for developers of mobile computing devices.
Sensor fusion algorithm techniques are described. In one or more implementations, behavior of the host device and the accessory device is controlled as a function of an orientation of the host device and the accessory device relative to each other. The combined spatial position and/or orientation of the host device may be derived from raw measurements obtained from at least two different types of sensors. Additionally, the spatial position and/or orientation of the accessory device is ascertained using one or more sensors of the accessory device. An orientation (or position) of the accessory device relative to the host computing device can then be calculated from the combined spatial position/orientation of the host computing device and the ascertained spatial position/orientation of the accessory device. The calculated relative orientation may then be used in various ways to control the behavior of the host computing device and/or the accessory device.
In the following discussion, an exemplary environment and devices that may utilize the techniques described herein are first described. Exemplary processes are then described, which may be performed in the exemplary environment and by the devices, as well as in other environments and by other devices. Consequently, performance of the exemplary processes is not limited to the exemplary environment/devices, and the exemplary environment/devices are not limited to performance of the exemplary processes.
Exemplary Operating Environment
FIG. 1 is an illustration of an environment 100 in an exemplary implementation that is operable to utilize techniques described herein. The illustrated environment 100 includes an example of a computing device 102 physically and communicatively coupled to an accessory device 104 via a flexible hinge 106. The computing device 102 may be configured in a variety of ways. For example, the computing device 102 may be configured for mobile use, such as a mobile phone, a tablet computer as shown, and so forth. Thus, the computing device 102 may range from a full-resource device with substantial memory and processor resources to a low-resource device with limited memory and/or processing resources. The computing device 102 may also involve software that causes the computing device 102 to perform one or more operations.
Computing device 102 is illustrated, for example, as including an input/output module 108. The input/output module 108 represents functionality related to input processing and output presentation for the computing device 102. A variety of different inputs may be processed by the input/output module 108, such as inputs related to the functionality of keys of an attached input device, keys of a virtual keyboard displayed by the display device 110, or gestures recognized through touchscreen functionality of the accessory device 104 and/or the display device 110 that cause corresponding operations to be performed, and so forth. Thus, the input/output module 108 may support a wide variety of different input technologies by discerning and leveraging the differences among input types, including key presses, gestures, and so on.
In the illustrated example, the accessory device 104 is a keyboard configured with a QWERTY arrangement of keys, although other arrangements of keys are also contemplated. Moreover, other unconventional configurations for the accessory device 104 are also contemplated, such as game controllers, configurations to emulate a musical instrument, power adapters, and so forth. Thus, the accessory device 104 can assume a wide variety of different configurations to support a wide variety of different functionalities. Different accessory devices may be connected to the computing device at different times. Moreover, a particular accessory device may also be functionally adapted to exhibit different configurations and capabilities, such as through different selectable modes, software/firmware updates, add-on devices/components, and so forth. This can change the manner in which keys or other controls of the accessory device are arranged, as well as the manner in which the host and applications handle input from the accessory device. For example, the accessory device may operate both as a keyboard and as a game controller by adaptively switching the types of keys/controls, the labels displayed, and the positions of controls to present different configurations at different times.
As previously mentioned, in this example, the accessory device 104 is physically and communicatively coupled to the computing device 102 using the flexible hinge 106. Flexible hinge 106 represents one illustrative example of an interface suitable for connecting and/or attaching an accessory device to host computing device 102. The flexible hinge 106 is flexible in that the rotational movement supported by the hinge is achieved by flexing (e.g., bending) of the material forming the hinge, as opposed to mechanical rotation supported by a pin, although that embodiment is also contemplated. Moreover, such flexible rotation can be configured to support movement in one direction (e.g., vertically in the figure), but to limit movement in other directions, such as lateral movement of the accessory device 104 relative to the computing device 102. This can be used to support consistent alignment of the accessory device 104 with respect to the computing device 102, such as aligning sensors used to change power states, application states, and so forth.
The flexible hinge 106 may be formed, for example, using one or more layers of fabric, and may include a conductor formed as a flexible trace (trace) to communicatively couple the accessory device 104 to the computing device 102, and vice versa. Such communication may be used, for example, to communicate the results of a key press to computing device 102, receive power from a computing device, perform authentication, provide supplemental power to computing device 102, and so forth. The flexible hinge 106 or other interface can be configured in a variety of ways to support a plurality of different accessory devices 104, further discussion of which may be found in relation to the following figures.
As further illustrated in FIG. 1, the computing device 102 may include various applications 112 that provide different functionality to the device. A wide variety of applications 112 typically associated with computing devices are contemplated including, but not limited to: an operating system, a productivity suite that integrates multiple office productivity modules, a web browser, a game, a multimedia player, a word processor, a spreadsheet program, a photo management program, and so forth. The computing device 102 also includes a plurality of host sensors 114 configured to sense corresponding inputs in response to manipulation of the computing device 102. Likewise, the accessory device 104 includes one or more accessory sensors 116 configured to sense corresponding inputs generated in response to manipulation of the accessory device 104.
In accordance with the techniques described herein, the inputs obtained from the host sensor 114 and the accessory sensor 116 may be processed and/or combined in accordance with a suitable sensor fusion algorithm to resolve the orientation of the accessory device 104 and the computing device 102 with respect to each other. Typically, inputs from a plurality of different types of sensors regarding position and/or orientation are processed in combination in order to calculate the orientation. The calculated orientation can then be used to control the behavior of the host and accessory and to perform various corresponding operations. A wide variety of different types of sensors and algorithms suitable for solving for orientation may be utilized, as discussed in more detail with respect to the following figures.
For further explanation, consider FIG. 2, which depicts the exemplary computing device 102 of FIG. 1, indicated generally at 200, in greater detail. In the depicted example, the computing device 102 is shown in an isolated configuration without the accessory device 104 attached. In addition to the components discussed with respect to FIG. 1, the exemplary computing device of FIG. 2 also includes a processing system 202 and computer-readable media 204, which represent a wide variety of different types and combinations of processing components, media, memory and storage components, and/or devices that may be associated with a computing device and used to provide a wide range of device functionality. In at least some embodiments, the processing system 202 and computer-readable media 204 represent processing power and memory/storage that can be utilized for general purpose computing operations. More generally, the computing device 102 may be configured as any suitable computing system and/or device that utilizes various processing systems and computer-readable media; additional details and examples are discussed with respect to the exemplary computing system of FIG. 13.
The computing device 102 may also implement selected device functionality through one or more microcontrollers 206. Microcontroller 206 represents a hardware device/system designed to perform a predefined set of specified tasks. Microcontroller 206 may represent various on-chip systems/circuits with self-contained resources such as processing components, I/O devices/peripherals, various types of memory (ROM, RAM, flash memory, EEPROM), programmable logic, and so forth. Different microcontrollers may be configured to provide different embedded applications/functionality, implemented at least partially in hardware, that perform corresponding tasks. The microcontrollers 206 enable certain tasks to be performed apart from the operation of the general purpose processing system and other applications/components of the computing device or accessory device. Generally, a microcontroller consumes little power compared to operating the general purpose processing system of the device.
As further described, the computing device 102 may also include a sensor fusion module 208, a behavior module 210, and a sensor fusion Application Programming Interface (API) 212 to implement aspects of the sensor fusion algorithm techniques described herein. The sensor fusion module 208 generally represents functionality to apply an appropriate sensor fusion algorithm to derive orientation from the inputs of a plurality of sensors, as described above and below. The sensor fusion module 208 is operable to collect inputs regarding position/orientation and the like supplied via the various sensors, process the inputs, and compute a corresponding orientation that describes the spatial relationship of the computing device 102 and the accessory device 104.
Behavior module 210 represents functionality to control and/or modify a wide variety of different behaviors associated with computing device 102 and/or accessory device 104 based on the calculated orientation. This may include, but is not limited to: managing power state/consumption, selecting an operating mode or device state, adjusting the sensitivity of one or more sensors, controlling interactions between a host, accessory, and/or peripheral device, modifying device functionality, enabling/disabling network connections, activating/deactivating applications, and/or setting application states, to name a few. These and other examples of behaviors that may be controlled according to the calculated orientation are described in more detail with respect to the exemplary processes discussed below.
The sensor fusion Application Programming Interface (API) 212 represents functionality to expose information about the calculated orientation for use by applications 112. For example, an application 112 may utilize the sensor fusion API to request orientation information on demand and/or to subscribe to orientation updates from the sensor fusion module 208 and/or an associated notification system. The sensor fusion API may then interact with the sensor fusion module 208 on behalf of the application 112 to cause orientation information to be conveyed to the application 112. The application 112 may use the orientation information in various ways, examples of which may be found in the discussion of the exemplary process 1200 of FIG. 12 below.
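For illustration, the following Python sketch shows one way such an API could mediate between applications and the sensor fusion module; the class and method names are assumptions of the sketch, since the description does not fix a concrete API surface.

```python
from typing import Callable, Dict, List

class SensorFusionAPI:
    """Hypothetical facade over the sensor fusion module; all names illustrative."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[Dict], None]] = []
        self._last: Dict = {"angle_degrees": 0.0, "state": "closed"}

    def get_orientation(self) -> Dict:
        # On-demand request: return the most recently calculated orientation.
        return dict(self._last)

    def subscribe(self, callback: Callable[[Dict], None]) -> None:
        # Register an application callback for orientation-change notifications.
        self._subscribers.append(callback)

    def publish(self, angle_degrees: float, state: str) -> None:
        # Called by the fusion module each time a new orientation is computed.
        self._last = {"angle_degrees": angle_degrees, "state": state}
        for notify in self._subscribers:
            notify(dict(self._last))

api = SensorFusionAPI()
api.subscribe(lambda u: print(f"{u['state']} at {u['angle_degrees']:.0f} degrees"))
api.publish(115.0, "typing")  # prints: typing at 115 degrees
```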
As previously mentioned, a variety of different types of sensors may be utilized to implement the techniques described herein. The host computing device may include a sensor array that is used to provide orientation information. By way of example and not limitation, the host sensors 114 for the exemplary computing device 102 of FIG. 2 are depicted as including a gyroscope 214, an accelerometer 216, a magnetometer 218, and a hall effect sensor 220. Various other sensors 222 suitable for deriving information about position and/or orientation may also be utilized.
FIG. 3 depicts an exemplary implementation 300 of the accessory device 104 of FIG. 1, showing the flexible hinge 106 in greater detail. In this example, the accessory device 104 is depicted as separate from the computing device. Here, a connection portion 302 of the input device is shown that is configured to provide a communicative and physical connection between the accessory device 104 and the computing device 102. In this example, the connection portion 302 has a height and cross-section configured to be received within a slot in a housing of the computing device 102, although this arrangement could be reversed without departing from the spirit and scope thereof. The connection portion 302 provides an interface through which attachment/connection of the accessory device 104 to the computing device can be detected. In at least some embodiments, this interface also enables communication for interaction with and/or control of the accessory device 104 as described herein. For example, the computing device 102, the sensor fusion module 208, and/or the behavior module 210 may communicate with the accessory device through the interface to obtain input from the various accessory sensors 116 and direct the behavior of the accessory device.
The connection portion 302 is flexibly connected to the portion of the accessory device 104 that includes the keys through use of the flexible hinge 106. Thus, when the connection portion 302 is physically connected to the computing device, the combination of the connection portion 302 and the flexible hinge 106 supports movement of the accessory device 104 relative to the computing device 102, similar to the hinge of a book. Naturally, a wide variety of orientations can be supported, some examples of which are described in the following paragraphs.
The connecting portion 302 is illustrated in this example as including magnetic coupling devices 304, 306, mechanical coupling protrusions 308, 310, and a plurality of communication contacts 312. The magnetic coupling devices 304, 306 are configured to magnetically couple to complementary magnetic coupling devices of the computing device 102 through the use of one or more magnets. In this way, the accessory device 104 may be physically secured to the computing device 102 using magnetic attraction. The connection portion 302 also includes mechanical coupling protrusions 308, 310 to form a mechanical physical connection between the accessory device 104 and the computing device 102. The communication contacts 312 are configured to contact corresponding communication contacts of the computing device 102 to form a communicative coupling between the devices to facilitate various types of communication.
Having discussed an exemplary environment in which embodiments may operate, consider now certain exemplary device orientations in accordance with one or more embodiments.
Exemplary Device Orientations
The following discussion presents certain exemplary device orientations. As detailed, different device orientations may be associated with different device power states, different application states, triggering different behaviors, and so forth. Exemplary orientations, as well as other orientations, can be determined using the sensor fusion algorithm techniques described above and below. The determined orientation can then be used to drive different behaviors with respect to the host and/or accessory.
FIG. 4 illustrates that the accessory device 104 can be rotated such that the accessory device 104 is placed in close proximity to the display device 110 of the computing device 102 to assume an orientation 400. In the orientation 400, the accessory device 104 may act as a cover such that the accessory device 104 may protect the display device 110 from damage. In an implementation, the orientation 400 may correspond to a closed position of the computing device 102.
FIG. 5 illustrates the accessory device 104 rotated away from the computing device 102 such that the computing device assumes an orientation 500. The orientation 500 includes a gap 502 introduced between the computing device 102 and the accessory device 104. In an implementation, the orientation 500 may be caused unintentionally by a user, such as by inadvertent contact with the computing device 102 and/or the accessory device 104 causing the computing device 102 to sag slightly away from the accessory device 104 such that the gap 502 is introduced.
FIG. 6 illustrates an exemplary orientation 600 of the computing device 102. In the orientation 600, the accessory device 104 lies flat against a surface and the computing device 102 is disposed at an angle, such as by using a stand 602 disposed on a rear surface of the computing device 102, to allow viewing of the display device 110. The orientation 600 may correspond to a typing arrangement whereby input may be received via the accessory device 104, such as by using keys of a keyboard, a track pad, and so forth.
FIG. 7 illustrates another exemplary orientation of the computing device 102, generally designated 700. In the orientation 700, the computing device 102 is oriented such that the display device 110 faces away from the accessory device 104. In this example, the stand 602 can support the computing device 102, such as via contact with a back surface of the accessory device 104. Although not explicitly shown here, a cover may be utilized to cover and protect the front surface of the accessory device 104. In the depicted orientation, an angle 702 is established between the accessory device and the host. Various angles corresponding to different positions/orientations may also be established, as discussed above and below.
FIG. 8 illustrates an exemplary orientation 800 in which the accessory device 104 is rotated so as to be disposed against the back of the computing device 102, e.g., against a rear housing of the computing device 102 disposed opposite the display device 110. In this example, the flexible hinge 106 is caused to "wrap around" the connection portion 302 by the orientation of the connection portion 302 relative to the computing device 102, positioning the accessory device 104 behind the computing device 102.
This wrapping causes a portion of the rear of the computing device 102 to remain exposed. This can be leveraged for a variety of functionality, such as allowing a camera 802 positioned on the rear of the computing device 102 to be used even though a substantial portion of the rear of the computing device 102 is covered by the accessory device 104 in this example orientation 800. In addition, in the example illustrated in FIG. 8, the display device 110 of the computing device 102 may be determined to be oriented at an angle 804 relative to the accessory device 104. In general, the angle 804 changes as the accessory device 104 is manipulated into different positions. For example, the angle 804 shown in FIG. 8 may be determined to be approximately 360 degrees. Other orientations may correspond to other angles, and ranges of angles may be established and associated with defined modes or states that may trigger different behaviors. Thus, behavior can be controlled according to a particular mode/state corresponding to the current angle between the host and the accessory.
FIG. 9 illustrates another exemplary orientation of the computing device 102, indicated generally at 900. In the orientation 900, the computing device 102 is rotated onto its side, e.g., placed in a portrait orientation relative to a surface 902 on which the computing device 102 is disposed. The display device 110 remains visible, with the accessory device 104 rotated away from the display device 110. In at least some implementations, the width of the accessory device 104 can be narrower than the width of the computing device 102. Additionally or alternatively, the width of the accessory device 104 can be tapered such that the edge closest to the hinge 106 is wider than the outermost edge. This may enable the display device 110 to recline in the orientation 900 to provide a proper viewing angle.
FIG. 10 illustrates that the computing device 102 can rotate within a variety of different angular ranges relative to the accessory device 104. As detailed herein, different angular ranges may be associated with different power states, different application states, and so forth.
An angular range 1000 corresponding to a closed position for the computing device 102 is illustrated. Thus, if the computing device 102 is positioned at an angle within the angular range 1000 relative to the accessory device 104, the computing device 102 may be determined to be in the closed position. The closed position may include an associated closed state, wherein various functionalities/behaviors for the computing device 102 and the accessory device 104 may be modified accordingly according to the closed state.
Also illustrated is an angular range 1002 that may correspond to a typing orientation for the computing device 102. Thus, if the computing device 102 is positioned at an angle within the angular range 1002 relative to the accessory device 104, the computing device 102 may be determined to be in a typing orientation. In this orientation, the computing device 102 and/or the accessory device 104 may be placed in a typing power state, wherein various functionalities/behaviors for the computing device 102 and the accessory device 104 may be customized accordingly based on the typing state.
FIG. 10 also illustrates an angular range 1004 corresponding to a viewing position for the computing device 102. Thus, if the computing device 102 is positioned at an angle within the angular range 1004 relative to the accessory device 104, the computing device 102 may be determined to be in a viewing orientation. In this orientation, the functionality/behavior of the computing device 102 and the accessory device 104 may be controlled accordingly based on the viewing state.
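A minimal sketch of mapping a calculated angle to one of these states follows; the boundary values are assumptions for illustration, since the description names angular ranges 1000, 1002, and 1004 without fixing numeric bounds.

```python
def state_for_angle(angle_degrees: float) -> str:
    """Map the calculated host/accessory angle to a device state.

    The boundary angles are illustrative assumptions only; the description
    defines the states but not their numeric ranges.
    """
    angle = angle_degrees % 360.0
    if angle < 10.0:
        return "closed"   # accessory acting as a cover over the display
    if angle < 135.0:
        return "typing"   # accessory flat, host angled for keyboard entry
    if angle < 200.0:
        return "viewing"  # display reclined for viewing
    return "wrapped"      # accessory rotated around toward the back (FIG. 8)

assert state_for_angle(95.0) == "typing"
assert state_for_angle(355.0) == "wrapped"
```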
The orientations, angular ranges, power states, etc. discussed above are given for illustration only. It is contemplated that a wide variety of different orientations, device states, and angular ranges may be implemented within the spirit and scope of the claimed embodiments.
Having discussed some example device orientations, consider now some example processes in accordance with one or more embodiments.
Exemplary Processes
The following discussion describes sensor fusion algorithm techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the processes may be implemented in hardware, firmware, software, or a combination thereof. The processes are shown as sets of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference may be made to the exemplary operating environment 100 of FIG. 1, the exemplary devices of FIGS. 2-3, and the exemplary orientations shown in FIGS. 4-10.
FIG. 11 depicts an exemplary process 1100 in which an orientation of an accessory relative to a host is calculated. In at least some embodiments, the process may be performed by a suitably configured computing device, such as the exemplary computing device 102 of FIG. 2 that includes or otherwise utilizes the sensor fusion module 208 and/or the behavior module 210.
The raw spatial location of the host computing device is independently computed using at least two different types of sensors (block 1102). The raw spatial locations are processed to obtain a combined spatial location of the host computing device (block 1104).
For example, the sensor fusion module 208 may be configured to implement a specified sensor fusion algorithm. In general, the sensor fusion algorithm is configured to aggregate information from the array of heterogeneous host sensors 114 utilized by the computing device 102. Aggregating multiple different sensing technologies and sensor types may provide an improved solution for location and may smooth out errors that may be introduced by individual technologies and sensors. In at least some embodiments, the sensor fusion algorithm is configured to make at least two independent calculations of the raw spatial location of the computing device 102 by using different respective sensors. The multiple independent calculations of the raw location may then be used to produce a combined spatial location. Each independent calculation may utilize one or more of the various types of host sensors 114 described above and below. At least some of the sensors used for the different independent calculations are of different types. Thus, the sensor fusion algorithm takes input from a variety of different host sensors 114 and combines this information to solve for the location of the computing device 102.
In one aspect, the computing device 102 includes a gyroscope 214 that can be used to obtain one of the independent calculations of the raw spatial location. Typically, gyroscopes use the principle of angular momentum to account for orientation and rotation. The gyroscope 214 may be used to discern motion within three-dimensional space and enable determination of position relative to a reference target/point, such as the earth. Using the input obtained from the gyroscope 214, the sensor fusion module 208 may operate to calculate the raw spatial location of the computing device. The raw spatial location may be expressed as coordinates in a three-dimensional coordinate system defined by x, y, and z axes with respect to the reference target/point (e.g., the earth).
In particular, angular velocity input obtained from the gyroscope may be processed to determine an angular position of the computing device. Initially, the input from the gyroscope may be filtered to remove the gyroscope's low-pass constant offset. Such a low-pass constant offset may be produced when the gyroscope rests at a non-zero reading, and it is removed to prevent inaccuracies in the calculation. The algorithm may then integrate over the multiple axes (e.g., the x, y, and z axes) of the gyroscope to derive a transformation describing the raw spatial location of the computing device. This processing may involve integrating the angular velocity input from the gyroscope using a Runge-Kutta integration algorithm (or other suitable algorithm) to obtain corresponding pulse data. The pulse data may be expressed as quaternions for the different axes which, when multiplied together, produce a quaternion that describes the transformation between the computing device 102 and the earth (or other selected reference target/point) with respect to their respective axes/coordinate systems. This provides one independent version of the raw spatial location of the computing device 102.
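As a hedged sketch of this step, the following Python code integrates body-frame angular velocity into an orientation quaternion. A first-order update is used for brevity; the Runge-Kutta integration named above refines this same update.

```python
import numpy as np

def quat_multiply(q: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0 * w1 - x0 * x1 - y0 * y1 - z0 * z1,
        w0 * x1 + x0 * w1 + y0 * z1 - z0 * y1,
        w0 * y1 - x0 * z1 + y0 * w1 + z0 * x1,
        w0 * z1 + x0 * y1 - y0 * x1 + z0 * w1,
    ])

def integrate_gyro(q: np.ndarray, omega, dt: float) -> np.ndarray:
    """Advance the device-to-earth quaternion q by one gyroscope sample.

    omega is the angular velocity in rad/s (already filtered to remove the
    constant offset noted above); integrates dq/dt = 0.5 * q * (0, omega).
    """
    wx, wy, wz = omega
    q = q + 0.5 * quat_multiply(q, np.array([0.0, wx, wy, wz])) * dt
    return q / np.linalg.norm(q)  # renormalize against numerical drift

q = np.array([1.0, 0.0, 0.0, 0.0])                     # identity orientation
q = integrate_gyro(q, (0.0, 0.0, np.pi / 2), dt=0.01)  # yawing at 90 deg/s
```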
Another independent calculation of the raw spatial location may be obtained by using the accelerometer 216 and magnetometer 218 in combination. Here, the accelerometer 216 is configured as a three-axis accelerometer that can be utilized to derive two of the degrees of freedom of the device (e.g., orientation relative to the x-axis and the y-axis). In the low pass, the acceleration vector is approximately 1 g pointing toward the center of the earth. The components of the acceleration measured via the accelerometer 216 may be obtained as distributed across each of the three axes. The components of acceleration in turn can be used to calculate the angles of the accelerometer/device axes relative to the low-pass vector pointing toward the earth's center. This provides two of the three degrees of freedom regarding the tilt or orientation of the device. In particular, the accelerometer processing just described solves for the tilt/orientation about the x-axis and y-axis of the computing device 102.
The magnetometer 218 can now be utilized to solve for the remaining degree of freedom regarding the tilt/orientation of the device. The magnetometer 218 may be initialized/configured to function as a compass. In this scenario, the magnetometer 218 can be used to compute a vector parallel to the ground (e.g., the surface of the earth). This vector points to magnetic north and can be used to determine the rotation of the device relative to the z-axis. Now, the tilt/orientation about the x-axis and y-axis from the accelerometer and the rotation of the device relative to the z-axis from the magnetometer 218 can be used to construct another quaternion that describes the transformation between the computing device 102 and the earth (or other selected reference target/point) relative to their respective axes/coordinate systems. This provides another independent way in which the raw spatial location of the computing device 102 may be obtained. Other examples using different sensors and combinations of sensors are also contemplated. For example, a Global Positioning System (GPS) radio may be used to provide certain positioning data, which may be used alone or in combination with other kinds of sensor data to calculate the position/orientation of the computing device 102.
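The accelerometer and magnetometer calculations just described might look like the following sketch; the axis and sign conventions are assumptions of the sketch, since the description does not fix them.

```python
import numpy as np

def tilt_from_accel(ax: float, ay: float, az: float):
    """Pitch and roll (radians) from low-passed gravity components.

    Assumes the low-pass acceleration vector is approximately 1 g toward
    the center of the earth, distributed across the three axes.
    """
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    return pitch, roll

def heading_from_mag(mx: float, my: float, mz: float,
                     pitch: float, roll: float) -> float:
    """Rotation about the z-axis from a tilt-compensated compass vector."""
    # Project the magnetic field into the plane parallel to the ground.
    xh = mx * np.cos(pitch) + mz * np.sin(pitch)
    yh = (mx * np.sin(roll) * np.sin(pitch)
          + my * np.cos(roll)
          - mz * np.sin(roll) * np.cos(pitch))
    return np.arctan2(-yh, xh)

pitch, roll = tilt_from_accel(0.0, 0.1, 0.99)
yaw = heading_from_mag(0.3, 0.0, 0.4, pitch, roll)
# pitch, roll, and yaw fix all three degrees of freedom and can be packed
# into a quaternion giving the second, independent device-to-earth estimate.
```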
Thus, at least two different results for the raw spatial location are calculated using the exemplary techniques described above or other suitable techniques. The sensor fusion algorithm may also be configured to combine the multiple independent calculations of the raw spatial location in various ways. The combining typically involves interpolating between the two or more raw spatial locations to reduce or eliminate inaccuracies and/or smooth the result. The interpolation produces a combined spatial location of the computing device that is based on the two or more independently obtained raw spatial locations.
By way of example and not limitation, the results obtained by using a gyroscope may be more accurate in the short term relative to other sensors and position determination techniques. However, small integration errors associated with gyroscope calculations may accumulate over time to produce increasingly large offsets, which may lead to inaccurate results in the long term. Thus, interpolating the gyroscope results with other independently obtained results may effectively adjust for the expected integration error of the gyroscope results. In one approach, normalized linear interpolation is utilized, which may be biased toward the gyroscope results because those results are initially more accurate and suffer less noise. Other independent results, such as results from the accelerometer/magnetometer, may be included in the interpolation to constrain the gyroscope results and to slowly tune the bias used to combine the results away from the gyroscope results and toward the other results over time. This produces a mathematically smooth transformation as the combined result.
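A hedged sketch of such a normalized linear interpolation follows; the weighting value is an assumption, as the description states only that the interpolation is biased toward the gyroscope results.

```python
import numpy as np

def fuse_nlerp(q_gyro: np.ndarray, q_accmag: np.ndarray,
               alpha: float = 0.98) -> np.ndarray:
    """Normalized linear interpolation biased toward the gyroscope result.

    An alpha near 1 trusts the short-term-accurate gyroscope estimate; the
    accelerometer/magnetometer estimate slowly pulls it back, constraining
    integration drift. The value 0.98 is an illustrative assumption.
    """
    if np.dot(q_gyro, q_accmag) < 0.0:
        q_accmag = -q_accmag          # interpolate along the shorter arc
    q = alpha * q_gyro + (1.0 - alpha) * q_accmag
    return q / np.linalg.norm(q)      # renormalize to a smooth unit quaternion
```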
Using one or more sensors of the accessory device, a spatial location of the accessory device connected to the host computing device is ascertained (block 1106). The spatial location of the accessory device 104 can be calculated in any suitable manner, including, but not limited to, the techniques described with respect to the computing device 102. The accessory sensors 116 of different accessory devices may include any of the various types of sensors described herein. Thus, different corresponding techniques can be used to ascertain the spatial location of the accessory device based on appropriate input from the one or more accessory sensors 116. Different techniques may also be utilized for different accessory devices depending on the types of sensors included within the accessory device. In general, the sensor fusion module 208 may be configured to obtain inputs from the different sensors of the accessory device through an appropriate interface with the accessory device and to calculate the corresponding spatial location from the inputs.
In one particular example, the sensor fusion module 208 can calculate the spatial location by using an accelerometer 216 associated with the accessory device 104. In this approach, the accelerometer 216 may be utilized to resolve the tilt/orientation of the accessory device 104 about the x-axis and the y-axis. This may be done in a manner similar to that described above for calculating the same kind of information for the computing device 102 using its associated accelerometer.
In some arrangements, the accessory device 104 can be configured to connect to the computing device 102 using the connection portion 302, which connects to an interface of the computing device at a known location. For example, in the hinge example described above, at least some information regarding the location of the accessory device can be established based on the known location and nature of the connection with the host device. Thus, in such a situation, it is sufficient to solve for the position of the accessory relative to the host using two degrees of freedom for the accessory device 104 (e.g., tilt about the x-axis and y-axis, or pitch and roll). It should be noted, however, that in some embodiments the rotation of the accessory device 104 relative to the z-axis may also be calculated, using the magnetometer 218 discussed above or using other sensors and techniques. This can be utilized in configurations in which the accessory device can be manipulated in three dimensions even when connected to the host device, such as by means of a ball-and-socket type connection.
From the combined spatial location of the host computing device and the ascertained spatial location of the accessory device, an orientation of the accessory device relative to the host computing device is calculated (block 1108). The calculated orientation may correspond to any of the different orientations discussed with respect to FIGS. 4-10, as well as other possible orientations. Here, a comparison can be made between the combined spatial location of the computing device 102 and the ascertained spatial location of the accessory device 104 to derive information about the orientation of the devices with respect to each other. In particular, the combined spatial location indicates a transformation describing how the axes in a coordinate system for the computing device 102 are oriented relative to the axes of a reference coordinate system for the earth or another reference. Similarly, the ascertained spatial location of the accessory device 104 indicates a transformation describing how the axes in a coordinate system for the accessory device are oriented relative to the axes of the reference coordinate system. Thus, the two locations can be used to compute a transformation of the accessory device 104 relative to the computing device 102 that is independent of the reference coordinate system.
As an example, in some cases the orientation may be defined as an angle of the accessory device 104 relative to the computing device 102, as represented in FIG. 10. As also discussed above, different angles may be associated with different interaction states, such as the closed, typing, and viewing states described above. The orientation may alternatively be expressed in another suitable way, such as using x, y, z coordinates.
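Continuing the quaternion representation used earlier, a sketch of this comparison might look as follows. The angle extraction at the end is one assumed way to express the result; distinguishing, e.g., 90 from 270 degrees would additionally require the sign of the rotation (hinge) axis.

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0 * w1 - x0 * x1 - y0 * y1 - z0 * z1,
        w0 * x1 + x0 * w1 + y0 * z1 - z0 * y1,
        w0 * y1 - x0 * z1 + y0 * w1 + z0 * x1,
        w0 * z1 + x0 * y1 - y0 * x1 + z0 * w1,
    ])

def quat_conjugate(q):
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def relative_orientation(q_host, q_accessory):
    """Accessory orientation relative to the host.

    Composing the accessory-to-earth quaternion with the inverse of the
    host-to-earth quaternion cancels the shared earth reference frame.
    """
    return quat_multiply(quat_conjugate(q_host), q_accessory)

def relative_angle_degrees(q_rel):
    # Magnitude of the relative rotation encoded by the quaternion.
    w = np.clip(abs(q_rel[0]), 0.0, 1.0)
    return np.degrees(2.0 * np.arccos(w))
```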
Optionally, the calculated orientation may be verified by using the hall effect sensor 220 of the computing device 102. The hall effect sensor 220 can be configured to utilize magnetic force to detect proximity between the computing device 102 and the accessory device 104. For example, the hall effect sensor 220 can measure proximity from one or more magnets included with the computing device 102 and/or the accessory device 104. The hall effect sensor 220 can be configured to align with and detect a magnet of the accessory device 104 when the computing device 102 is rotated to the closed position. The hall effect sensor 220 may not be able to detect the magnet when the computing device 102 is positioned away from the accessory device 104 in the open position, or the detected magnetic force may change when the computing device 102 is rotated to a different angle relative to the accessory device 104. The hall effect sensor 220 provides another way in which orientation can be determined. Thus, the hall effect sensor 220 can be used as an additional check as to whether the orientation calculated using the other sensors is accurate. This additional verification may be made before causing and/or controlling certain actions, such as powering down the device or turning off different components depending on orientation.
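A minimal sketch of this cross-check, under the assumption that the magnet is detectable only near the closed position:

```python
def orientation_agrees_with_hall(state: str, magnet_detected: bool) -> bool:
    """Cross-check the fused orientation against the hall effect sensor.

    Assumes the sensor aligns with the accessory's magnet only in the
    closed position, per the description; other magnet placements would
    need a different predicate (e.g., thresholds on field strength).
    """
    return magnet_detected == (state == "closed")

# Example: the fused sensors report "typing" but the magnet is still
# detected, so a dependent action such as power-down is deferred.
if not orientation_agrees_with_hall("typing", magnet_detected=True):
    print("orientation unverified; deferring power state change")
```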
One or more behaviors of the host computing device and the accessory device are controlled based on the calculated orientation (block 1110). Various behaviors and responsive actions may be driven according to the calculated orientation of the accessory relative to the host. The behavior module 210 may be configured to obtain orientation results from the sensor fusion module 208 and control various behaviors accordingly.
The controlled behaviors may include at least power management operations for the computing device 102 and/or the accessory device 104. Typically, power management operations are configured to control power consumption and extend battery life. For example, the behavior module 210 may cause a change in power mode/state depending on a particular orientation. This may include switching the devices and/or selected components on/off according to the determined orientation. For example, in the closed state, the host and accessory may be powered down or placed in a sleep mode. In another example, the accessory can be powered down when the orientation corresponds to a viewing state. The accessory device 104 may also wake automatically in a particular orientation, such as when a typing state is detected. A variety of other power management examples that occur in response to the calculated orientation are also contemplated.
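For illustration only, such a policy might be dispatched as in the following sketch; the Device stub and its method names are assumptions rather than an interface from the description.

```python
class Device:
    """Stub standing in for the host's or accessory's power interface."""
    def __init__(self, name: str) -> None:
        self.name = name
    def sleep(self) -> None:
        print(f"{self.name}: entering sleep mode")
    def wake(self) -> None:
        print(f"{self.name}: waking")
    def power_down(self) -> None:
        print(f"{self.name}: powering down")

def apply_power_policy(state: str, host: Device, accessory: Device) -> None:
    # Mirrors the examples above: sleep both devices when closed, power the
    # accessory down when viewing, wake the accessory when typing.
    if state == "closed":
        host.sleep()
        accessory.sleep()
    elif state == "viewing":
        accessory.power_down()
    elif state == "typing":
        accessory.wake()

apply_power_policy("typing", Device("host"), Device("keyboard"))
```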
In another example, the controlled behaviors may include selectively adjusting and/or enabling/disabling different sensors of the devices according to orientation. As an example, the accessory rotating completely around to cover the back of the host may indicate a game play state. In this arrangement, the accelerometer 216 is likely to be used for game play, while touch functionality for keyboard/typing input from the accessory is unlikely to be used. Thus, in this arrangement, the sensitivity of the accelerometer 216 may be increased/turned on, while the touch sensitivity may be decreased or disabled. In the typing state, the opposite may be true: the accelerometer 216 may be disabled or adjusted to a lower sensitivity, and the touch sensitivity may be increased or re-enabled. Thus, the sensitivity of the sensors can be adjusted, and particular sensors can be turned on/off, depending on the orientation. It should be noted that the sensors being controlled may include the sensors involved in the calculation of the orientation as well as other sensors of the host or accessory.
In yet another example, the functionality activated for the accessory and/or the host can be modified depending on orientation. For example, the accessory may be configured to act as a game controller when wrapped around the back, and transition to providing keyboard typing input when in a typing orientation. In another example, recognition of gestures to scroll or flip pages via input across the accessory device can be enabled in a viewing orientation and disabled for other states/orientations. These kinds of changes in the functionality provided by the accessory can occur by selectively revealing, enabling, configuring, or otherwise activating different controls, functions, and gestures in different orientations.
Comparable changes to gestures, touch keys, and other functionality activated for the host computing device may also occur depending on orientation. For example, gestures for manipulating media content on the display device 110 may be active in certain orientations (e.g., a viewing state or a gaming state) and inactive in other scenarios. Some additional examples of modifications that may be made to the functionality activated/available to the computing device depending on orientation include: selectively enabling/disabling network connections and/or controlling host interactions with accessory devices and/or peripheral devices (e.g., printers, streaming media devices, storage devices) according to the calculated orientation.
In addition, the behavior of the application 112 may also be controlled according to the calculated orientation. For example, the behavior module 210 may be configured to selectively activate or deactivate different applications 112 depending on orientation. This may include switching between applications operating in foreground and background processing, launching and shutting down specific applications, minimizing/maximizing, and so forth. The application 112 may also retrieve and/or subscribe to receive updates of the calculated orientation, which the application may utilize in various ways, some details of which are provided in relation to the following figures. Thus, a wide variety of behaviors can be controlled depending on the calculated orientation, the specific behaviors enumerated above being just a few illustrative examples.
FIG. 12 depicts an exemplary process 1200 in which the calculated orientation is surfaced for use by an application. In at least some embodiments, the process can be performed by a suitably configured computing device, such as the exemplary computing device 102 of FIG. 2, that includes or otherwise utilizes a sensor fusion Application Programming Interface (API) 212.
From the combined spatial location of the host computing device and the ascertained spatial location of the accessory device, an orientation of the accessory device with respect to the host computing device is calculated (block 1202). This may be done in accordance with a specified sensor fusion algorithm as discussed with respect to the exemplary process 1100 of FIG. 11 above.
An interface is exposed that may be operated upon by one or more applications to obtain the calculated orientation (block 1204). In response to receiving a request from an application via the interface, the calculated orientation is provided to the application (block 1206). In particular, the computing device 102 may include a sensor fusion Application Programming Interface (API) 212 operable to provide the calculated orientation information to applications 112. In one aspect, the sensor fusion API may provide orientation information on demand in response to individual requests. Additionally or alternatively, the sensor fusion API may be configured to facilitate registration of applications 112 to subscribe to receive orientation updates. In response to a subscription request, the API may register the application with the sensor fusion module 208 and/or an associated notification system configured to provide a notification message to registered applications when an orientation change occurs. The application 112 may then receive notification messages describing updates to the orientation sent via the notification system.
The sensor fusion API can provide orientation and/or related information to applications in a variety of formats. For example, the orientation may be in the form of a transformation of the accessory device 104 relative to the computing device 102, as calculated in the manner described above. In this case, the application may process the provided orientation information to obtain information having a format appropriate for the application, such as an orientation angle or a defined orientation state corresponding to the calculated orientation. Additionally or alternatively, the sensor fusion module 208 may be operable to compute an orientation state on behalf of an application. Thus, the information provided via the sensor fusion API may include a state name or identifier, which may be directly available to the application.
The application 112 may utilize the orientation information provided through the API in various ways. For example, the application 112 may selectively modify a user interface and/or functionality of a user interface for the application as a function of orientation. This may include activating different controls, menus, gestures, and/or input modes for different respective orientations. For example, a navigation menu appearing in one orientation (typing/keyboard entry orientation) may disappear in the viewing orientation. Also, the application 112 may be configured to include various modes and switch between modes depending on orientation. For example, the messaging application may switch from a text entry mode to a video mode in accordance with the calculated orientation. In another example, an application may modify the manner in which a particular input is interpreted in different orientations. For example, pressing of a button in a typing orientation may be used for alphanumeric entry, while the same button may be used for content control functions in a viewing orientation. Other buttons, keys, and other controls may also be selectively enabled or disabled when the orientation is changed. A variety of other examples are also contemplated.
Having considered the above exemplary processes, consider now a discussion of exemplary systems and devices that can be utilized to implement aspects of the techniques in one or more embodiments.
Exemplary System and Device
FIG. 13 illustrates an exemplary system, indicated generally at 1300, including an exemplary computing device 1302 that represents one or more computing systems and/or devices that may implement the various techniques described herein. The computing device 1302 may be configured, for example, to assume a mobile configuration through use of a housing formed and sized to be grasped and carried by one or more hands of a user; illustrative examples include mobile phones, mobile gaming and music devices, and tablet computers, although other examples are also contemplated.
The illustrated exemplary computing device 1302 includes a processing system 1304, one or more computer-readable media 1306, and one or more I/O interfaces 1308, which are communicatively coupled to each other. Although not shown, computing device 1302 may also include a system bus or other data and command transfer system that couples the various components to each other. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. Various other examples are also contemplated, such as control and data lines.
The processing system 1304 represents functionality to perform one or more operations through the use of hardware. Thus, the processing system 1304 is illustrated as including hardware units 1310 that may be configured as processors, functional blocks, and the like. This may include implementation in hardware, such as an application specific integrated circuit or other logic device formed using one or more semiconductors. Hardware units 1310 are not limited by the materials from which they are formed or the processing mechanisms utilized therein. For example, a processor may include semiconductors and/or transistors (e.g., electronic Integrated Circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.
The computer-readable media 1306 are illustrated as including memory/storage 1312. The memory/storage 1312 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1312 may include volatile media (such as Random Access Memory (RAM)) and/or nonvolatile media (such as Read Only Memory (ROM), flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1312 may include fixed media (e.g., RAM, ROM, a fixed hard drive, etc.) as well as removable media (e.g., flash memory, a removable hard drive, an optical disk, and so forth). The computer-readable media 1306 may be configured in a variety of other ways as further described below.
The input/output interfaces 1308 represent functionality to allow a user to enter commands and information to the computing device 1302, and also to allow information to be presented to the user and/or other components or devices through the use of various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors configured to detect physical touches), a camera (e.g., which may utilize visible or non-visible wavelengths, such as infrared frequencies, to discern motion as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, a haptic response device, and so forth. Thus, the computing device 1302 may be configured in a variety of ways to support user interaction.
Computing device 1302 is also illustrated as being communicatively and physically coupled to an accessory device 1314 that is physically and communicatively detachable from computing device 1302. In this way, a wide variety of different accessory devices, having a wide variety of configurations, can be coupled to the computing device 1302 to support a wide variety of functionality. In this example, the accessory device 1314 includes one or more controls 1316, which may be configured as pressure sensitive keys, mechanical switch keys, and so forth.
The accessory device 1314 is also illustrated as including one or more modules 1318 that can be configured to support a wide variety of functionality. The one or more modules 1318, for example, may be configured to process analog and/or digital signals received from the controls 1316 to determine whether a keystroke was intended, to determine whether an input indicates resting pressure, to support authentication of the accessory device 1314 for operation with the computing device 1302, and so forth.
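As a hedged illustration of how such a module might distinguish an intended keystroke from resting pressure on a pressure-sensitive key, consider the following sketch. The normalized pressure scale, the thresholds, and the classification rule are assumptions chosen for demonstration; the document does not prescribe particular values or logic.

```python
# Hypothetical sketch: classify a burst of normalized pressure samples from one
# key as an intended keystroke, a resting hand, or noise. Thresholds are
# illustrative assumptions only.

KEYSTROKE_THRESHOLD = 0.6  # peak pressure above which a press may be a keystroke
REST_THRESHOLD = 0.25      # sustained light pressure treated as a resting hand

def classify(samples: list[float]) -> str:
    """Classify a burst of pressure samples from a single key."""
    peak = max(samples)
    mean = sum(samples) / len(samples)
    if peak >= KEYSTROKE_THRESHOLD and peak - mean > 0.2:
        return "keystroke"  # sharp transient above the baseline: likely intentional
    if mean >= REST_THRESHOLD:
        return "resting"    # flat, sustained pressure: ignore as a resting hand
    return "noise"

print(classify([0.1, 0.3, 0.9, 0.4]))    # keystroke
print(classify([0.3, 0.31, 0.3, 0.29]))  # resting
```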
Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, units, components, data structures, etc. that perform particular tasks or implement particular abstract data types. As used herein, the terms "module," "functionality," and "component" generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercially available computing platforms having a variety of processors.
An implementation of the described modules and techniques may be stored on or transmitted across some form of computer readable media. Computer-readable media can include a variety of media that can be accessed by computing device 1302. By way of example, and not limitation, computer-readable media may comprise "computer-readable storage media" and "computer-readable signal media".
"computer-readable storage medium" may refer to media and/or devices capable of persistently and/or non-transiently storing information, as opposed to merely a signal transmission, carrier wave, or signal per se. Accordingly, computer-readable storage media refers to non-signal bearing media. Computer-readable storage media include hardware such as volatile and nonvolatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer-readable instructions, data structures, program modules, logic units/circuits, or other data. Examples of computer readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage devices, tangible media, or an article of manufacture suitable for storing the desired information and which can be accessed by a computer.
"computer-readable signal medium" may refer to a signal-bearing medium configured to transmit instructions to the hardware of computing device 1302, such as via a network. Signal media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave, data signal or other transport mechanism. Signal media also includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
As previously described, hardware units 1310 and the computer-readable media 1306 represent modules, programmable device logic, and/or fixed device logic implemented in a hardware form that may be utilized in certain embodiments to implement at least some aspects of the techniques described herein, such as to execute one or more instructions. The hardware may include components of an integrated circuit or system-on-chip, microcontroller devices, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), Complex Programmable Logic Devices (CPLDs), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device to perform program tasks defined by instructions and/or logic embodied by hardware, as well as hardware utilized to store instructions for execution (e.g., the computer-readable storage media described above).
Combinations of the foregoing may also be utilized to implement the various techniques described herein. Thus, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage medium and/or may be implemented by one or more hardware units 1310. Computing device 1302 may be configured to implement particular instructions and/or functions corresponding to software and/or hardware modules. Thus, implementation of modules executable by the computing device 1302 as software may be achieved at least in part in hardware, for example, through use of computer-readable storage media and/or hardware units 1310 of the processing system 1304. The instructions and/or functions may be executable/operable by one or more articles of manufacture (e.g., one or more computing devices 1302 and/or processing systems 1304) to implement the techniques, modules, and examples described herein.
Conclusion
Although exemplary implementations have been described in language specific to structural features and/or methodological acts, it is to be understood that the implementations defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the claimed features.

Claims (10)

1. A method implemented by a host computing device, comprising:
independently computing (1102) a plurality of raw spatial positions of the host computing device using at least two different types of sensors of the host computing device;
processing (1104) the plurality of raw spatial positions to obtain a combined spatial position of the host computing device;
ascertaining (1106) a spatial position of an accessory device connected to the host computing device using one or more sensors of the accessory device; and
calculating (1108) an orientation of the accessory device relative to the host computing device from the combined spatial position of the host computing device and the ascertained spatial position of the accessory device.
2. A method as described in claim 1, further comprising exposing the calculated orientation for use by one or more applications of the host computing device via an Application Programming Interface (API).
3. A method as described in claim 1, wherein computing the plurality of raw spatial positions of the host computing device comprises: computing one of the plurality of raw spatial positions using a gyroscope.
4. A method as described in claim 1, wherein computing the plurality of raw spatial positions of the host computing device comprises: computing one of the plurality of raw spatial positions using an accelerometer and a magnetometer in combination.
5. A method as described in claim 1, wherein the spatial position of the accessory device is ascertained via an accelerometer of the accessory device.
6. A method as described in claim 1, wherein the different types of sensors include a gyroscope, an accelerometer, and a magnetometer of the host computing device.
7. A method as described in claim 1, wherein processing the plurality of raw spatial positions to obtain the combined spatial position comprises: performing interpolation between the raw spatial positions to reduce inaccuracy in the combined spatial position.
8. A method as described in claim 1, wherein the computing, processing, ascertaining, and calculating are performed via one or more microcontrollers of the host computing device, the one or more microcontrollers configured to implement a sensor fusion module at least partially in hardware.
9. A method as described in claim 1, further comprising controlling one or more behaviors of the host computing device in accordance with the calculated orientation.
10. A method as described in claim 9, wherein controlling one or more behaviors of the host computing device comprises: selectively adjusting a sensitivity of one or more sensors according to the calculated orientation.
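For readers interested in what the fusion recited in claims 1, 4, and 7 might look like in code, the following minimal single-axis (pitch-only) sketch combines a gyroscope-integrated estimate with an accelerometer-derived estimate using a complementary filter, then derives the accessory's orientation relative to the host in the manner of claims 1 and 5. The filter coefficient, sensor models, and single-axis simplification are illustrative assumptions; the claims do not prescribe any particular fusion algorithm.

```python
import math

# Minimal 1-axis (pitch-only) sensor fusion sketch. ALPHA and the sample values
# are illustrative assumptions, not values specified by this document.

ALPHA = 0.98  # weight given to the gyroscope-integrated estimate

def pitch_from_accel(ax: float, ay: float, az: float) -> float:
    """Raw pitch (degrees) derived from gravity as measured by an accelerometer."""
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

def fuse(prev_pitch: float, gyro_rate: float, dt: float,
         ax: float, ay: float, az: float) -> float:
    """Combine two independently computed raw estimates into a combined pitch."""
    gyro_pitch = prev_pitch + gyro_rate * dt    # raw estimate 1: gyroscope
    accel_pitch = pitch_from_accel(ax, ay, az)  # raw estimate 2: accelerometer
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch

# Relative orientation of the accessory with respect to the host (step 1108),
# where the accessory contributes only an accelerometer reading (claim 5).
host_pitch = fuse(prev_pitch=45.0, gyro_rate=2.0, dt=0.01, ax=-0.7, ay=0.0, az=0.7)
accessory_pitch = pitch_from_accel(0.0, 0.0, 1.0)
print(round(host_pitch - accessory_pitch, 2))  # relative pitch in degrees
```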