CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority to U.S. Provisional Patent Application Ser. No. 62/274,514, entitled “MODULAR SENSING DEVICE FOR PROCESSING GESTURES AS INPUT,” filed on Jan. 4, 2016; and U.S. Provisional Application Ser. No. 62/346,216, entitled “MODULAR SENSING DEVICE FOR PROCESSING GESTURES AS INPUT,” filed on Jun. 6, 2016; both of which are hereby incorporated by reference in their respective entireties.
BACKGROUND

Remotely operated self-propelled devices are typically operable by way of analog or digital controller devices that communicate a limited amount of preconfigured commands. Such commands typically involve signaled radio frequency communications to accelerate and maneuver the self-propelled device. Furthermore, wearable device technology in consumer electronics is rapidly being integrated into routine user activities, such as sporting activities, content viewing or browsing, and task-oriented activities (e.g., gaming).
BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure herein is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements, and in which:
FIG. 1A is a block diagram illustrating an example modular sensing device, as described herein;
FIG. 1B is a block diagram illustrating an example modular sensing device in communication with a self-propelled device;
FIG. 2A illustrates an example modular sensing device in communication with a mobile computing device and a self-propelled device;
FIG. 2B illustrates an example modular sensing device for generating commands for execution on a self-propelled device;
FIG. 3A is a high level flow chart describing an example method of translating sensor data by an example modular sensing device for implementation in a self-propelled device;
FIG. 3B is a low level flow chart describing an example method for translating sensor data by an example modular sensing device for implementation in a self-propelled device;
FIG. 4 is a flow chart describing an example method of initiating a training mode on a wearable device in connection with a self-propelled device;
FIG. 5 is a schematic illustrating an example self-propelled device with which wearable devices, as described with other examples herein, can be implemented;
FIG. 6 is a block diagram of an example computer system upon which examples described herein may be implemented;
FIG. 7 is a block diagram of a mobile computing device upon which examples described herein may be implemented;
FIG. 8 is a block diagram of an example modular sensing device upon which examples described herein may be implemented;
FIG. 9 illustrates an embodiment of multiple sensing devices that concurrently provide input for a program or application which utilizes the inputs, along with inferences which can be made about a person or object that carries the devices, according to one or more examples;
FIG. 10 illustrates a system which concurrently utilizes input from multiple modular sensing devices in connection with execution of an application or program;
FIG. 11 illustrates an example of a modular sensing device that is insertable into a plurality of compatible apparatuses; and
FIG. 12 illustrates an implementation of the modularized sensing device.
DETAILED DESCRIPTION

Examples described herein relate to a multi-modal modular sensing device, worn or carried by a user (e.g., as a wrist-worn device), to enable a variety of interactions with other devices through sensed movement of the modular sensing device. Among other activities, examples provide for a modular sensing device that can individually, or in combination with another device (e.g., a controller device, such as a mobile computing device), control other devices, interact with compatible devices of other users, and/or operate in connection with task-oriented activities (e.g., gameplay). In some examples, the modular sensing device corresponds to a wearable device (e.g., a watch, a pendant for a necklace, a hat, glasses) that can be placed in a mode to control the characteristics of movement of another device. For example, the modular sensing device can control acceleration and maneuvering of a self-propelled device.
In certain aspects, a wearable device can be operated in connection with a separate mobile computing device that can execute a designated application in order to enable the wearable device to operate in a specified mode. According to some examples, the mobile computing device can be utilized to connect an example wearable device to a smart home device, a self-propelled device, and/or one or more additional wearable devices for purpose of enabling the user to interact with, control, and/or operate such connected devices via user gestures or “body part” gestures (e.g., arm gestures) that can be sensed through the wearable device. Still further, in variations and other applications and implementations, a wearable device can be operable to detect and acquire virtual resources to be utilized by the user in a virtual or network-based environment (e.g., an online gameplay environment).
According to some examples, a modular sensing device is operable to detect its own movement in three-dimensional space using an inertial measurement unit (IMU). In some implementations, the IMU can be an integrated device. Alternatively, the IMU can be implemented through a combination of sensors, such as a three-dimensional accelerometer and a gyroscope. In some examples, the modular sensing device can include a processor and memory to interpret the sensor data, and to communicate interpreted sensor data to another device (e.g., a mobile computing device) using a wireless connection (e.g., BLUETOOTH). In variations, the IMU can generate raw sensor data based on the user gestures, which can be processed locally or relayed for remote processing, depending on the processing resources of the modular sensing device and the processing load to be implemented on the portable device.
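By way of illustration only, this division of labor can be sketched in Python; the sample structure, field names, and the trivial classification rule below are hypothetical assumptions, not drawn from the specification:

```python
from dataclasses import dataclass

@dataclass
class IMUSample:
    # Raw inertial readings: acceleration in g and angular rate in deg/s, per axis.
    ax: float
    ay: float
    az: float
    gx: float
    gy: float
    gz: float

def classify(sample: IMUSample) -> str:
    # Placeholder interpretation: a strong vertical acceleration reads as an arm raise.
    return "arm_raise" if sample.az > 1.5 else "unknown"

def handle_sample(sample: IMUSample, has_local_budget: bool):
    """Interpret on-device when processing resources allow; otherwise relay raw data."""
    if has_local_budget:
        return ("interpreted", classify(sample))  # send translated sensor data
    return ("raw", sample)                        # defer translation to a paired device

print(handle_sample(IMUSample(0.0, 0.1, 1.8, 0.0, 0.0, 0.0), has_local_budget=True))
```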
As used herein, a “modular sensing device” or “portable sensing device” can include any electronic device that includes sensor resources for detecting its own movement, and of dimension and form factor suitable for being carried with one hand or worn on a human body. Numerous examples of portable sensing devices are provided in the context of a “wearable device,” such as a wrist-worn device (e.g., a watch, a watch band, a bracelet). But as noted by other examples, the type and form factor of a wearable device can vary significantly, encompassing, for example, eyeglasses, hats, pendants, armbands, and various other form factors. While many examples describe functionality in the context of a wearable device, embodiments extend such examples to other forms of modular sensing devices, such as wands, fobs, or mobile communication devices.
In many examples, the wearable device can include one or more sensors to detect the device's own movements. In particular, a wearable device can include an accelerometer and/or a gyroscopic sensor. In some examples, sensor data, corresponding to gestures performed by the user wearing the wearable device, can be translated into control commands or data packets to be transmitted and implemented based on the selected mode of the wearable device. According to many examples, the wearable device can include an inductive interface to inductively pair with other devices, which can trigger a specified mode on the wearable device. For example, an inductive pairing between the wearable device and a self-propelled device can trigger a “drive mode” in which the wearable device can be utilized by the user to operate the self-propelled device. Additionally or alternatively, the wearable device can include an input mechanism, such as an analog or digital button, that enables the user to select a particular mode and/or scroll through a series of modes for the wearable device.
Among other functionality, some examples described herein provide for alternative modes of operation, including, for example (i) a “drive mode” in which the wearable device is utilized to control a self-propelled device; (ii) a “control mode” in which the wearable device is utilized in connection with smart home devices; (iii) a “finding mode” or “finder mode” in which the wearable device is utilized to detect virtual or digital resources; (iv) a “mining mode” which can be initiated by the user to collect virtual resources when they are detected in the finder mode; (v) a “training mode” in which the wearable device is utilized in connection with a self-propelled device to assist the user in training for certain achievements or for increasing the user's abilities to perform task-oriented activities (e.g., increasing skills for a subsequent game or sporting activity); (vi) a “sword mode” in which the wearable device provides feedback (e.g., haptic, audio, and/or visual feedback) when the user performs actions while holding an object; (vii) a “default mode” in which the device monitors for and detects other proximate wearable devices (e.g., wearable devices of the same type) which enables the users to pair with each other's wearable devices; (viii) an “interactive mode” or “battle mode” selectable in response to two or more device pairings in which users are able to interact with each other with predetermined sets of actions (e.g., offensive and defensive actions learned and perfected by users practicing in the training mode); (ix) a “sharing mode” selectable in response to two or more device pairings in which users can share information stored in each other's wearable devices, or user accounts associated with the wearable devices (e.g., sharing collected virtual resources discovered and mined in the finder and mining modes to be expended or consumed in a gameplay environment); and (x) a “gaming mode” in which the wearable device can be utilized in connection with a game.
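For illustration, the ten modes enumerated above can be modeled as a simple enumeration that a mode selector cycles through; this is a minimal sketch of one possible structure, not the claimed implementation:

```python
from enum import Enum

class Mode(Enum):
    DRIVE = "drive"
    CONTROL = "control"
    FINDER = "finder"
    MINING = "mining"
    TRAINING = "training"
    SWORD = "sword"
    DEFAULT = "default"
    BATTLE = "battle"
    SHARING = "sharing"
    GAMING = "gaming"

def next_mode(current: Mode) -> Mode:
    """Advance to the next mode in the enumeration, wrapping around at the end."""
    modes = list(Mode)
    return modes[(modes.index(current) + 1) % len(modes)]

assert next_mode(Mode.DRIVE) is Mode.CONTROL
assert next_mode(Mode.GAMING) is Mode.DRIVE  # wraps back to the first mode
```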
Still further, numerous examples make reference to a “self-propelled” device. A self-propelled device can include, for example, a device that can be wirelessly and remotely controlled in its movement, whether the movement is on ground, water, or air. For example, a self-propelled device can include a wirelessly and remotely controlled drone, car, plane, helicopter, boat, etc. While conventional examples enable control of a self-propelled device, conventional approaches generally utilize a perspective of the device being controlled. While some conventional devices, for example, enable a computing device held by the user to project a perspective of the device under control, examples described herein enable control of such devices to utilize an orientation of the user. Specifically, some examples include a modular sensing device that can determine an orientation of the user, and further enable control of the self-propelled device through an environment that accommodates or is in the perspective of the user, based on the orientation of the user (as determined by the modular sensing device). By way of example, the control of a self-propelled device can be projected through an orientation or perspective of the user for purpose of a virtual environment.
Some examples include a wearable device having a wireless communication module (e.g., a BLUETOOTH low energy module) that enables communication of sensor data (e.g., raw sensor data from the accelerometer or gyroscopic sensor), or translated data (i.e., translations of the sensor data based on the selected mode of the wearable device). In some examples, the sensor data may be relayed for translation by a mobile computing device before being transmitted to another device (e.g., a paired wearable device or a paired self-propelled device). In other examples, processing resources of the wearable device can execute mode instructions, based on the selected mode, to translate the sensor data for direct transmission to one or more other devices, as described herein.
As used herein, “body part gestures” or “user gestures” include gestures performed by a user while utilizing the wearable device. For example, the wearable device may be a wrist-worn device, in which case the user gestures may comprise arm gestures, and can include any number of physical movements or actions that affect the sensors of the wearable device when it is worn on the wrist. Such movements and actions can include shaking, arm movements (e.g., raising, lowering, pointing, twisting, and any combination thereof), wrist movements, hand actions (such as grasping or grabbing), and the like. However, the wearable device is not limited to wrist-worn devices, but may be utilized as a ring (e.g., a finger-worn device), an ankle-worn device, a neck-worn device, a head-worn device, a belt (e.g., a waist-worn device), etc. Thus, user gestures performed using the wearable device can be any actions or movements in which correlated sensor data from sensors of the device can be translated into commands, instructions, feedback, etc. depending on the mode of the wearable device.
Among other benefits, examples described herein achieve a technical effect of enhancing user interactivity with other devices and other users. Such interactivity may include utilizing the wearable device to control a self-propelled device, interact with other users of wearable devices, collect and share data, control smart home devices, and the like.
One or more examples described herein provide that methods, techniques, and actions performed by a computing device are performed programmatically, or as a computer-implemented method.
Programmatically, as used herein, means through the use of code or computer-executable instructions. These instructions can be stored in one or more memory resources of the computing device. A programmatically performed step may or may not be automatic.
One or more examples described herein can be implemented using programmatic modules or components of a system. A programmatic module or component can include a program, a sub-routine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions. As used herein, a module or component can exist on a hardware component independently of other modules or components. Alternatively, a module or component can be a shared element or process of other modules, programs or machines.
Some examples described herein can generally require the use of computing devices, including processing and memory resources. For example, one or more examples described herein can be implemented, in whole or in part, on computing devices such as digital cameras, digital camcorders, desktop computers, cellular or smart phones, personal digital assistants (PDAs), virtual reality (VR), augmented reality (AR), or mixed reality (MR) headsets, laptop computers, printers, digital picture frames, and tablet devices. Memory, processing, and network resources may all be used in connection with the establishment, use, or performance of any example described herein (including with the performance of any method or with the implementation of any system).
Furthermore, one or more examples described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer-readable medium. Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing examples can be carried and/or executed. In particular, the numerous machines shown with examples include processor(s) and various forms of memory for holding data and instructions. Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers. Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on smart phones, multifunctional devices or tablets), and magnetic memory. Computers, terminals, and network-enabled devices (e.g., mobile devices, such as cell phones) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, examples may be implemented in the form of computer programs, or a non-transitory computer usable carrier medium capable of carrying such a program.
SYSTEM DESCRIPTION

FIG. 1A is a block diagram illustrating an example modular sensing device. According to examples such as shown with FIG. 1A, a modular sensing device is implemented as a multi-modal wearable device 100 which can be utilized in any of multiple possible modes. Such modes include, but are not limited to, a drive mode, a control mode, a finder mode, a mining mode, a training mode, a sword mode, a default mode, an interactive or battle mode, a sharing mode, and a gaming mode.
Furthermore, each mode may be selected by the user either by using a mode selector 103 on the wearable device 100, or by triggering a certain mode via an inductive pairing 109, using an inductive interface 105 of the wearable device 100, with one or more paired device(s) 108, or by a combination of both (e.g., detecting an inductive pairing and then a selection of the mode selector 103).
Additionally or alternatively, the user may connect the wearable device 100 with a mobile computing device, such as the user's smart phone or tablet computer. Mode selection may be performed automatically by the user initiating a designated application on the mobile computing device, such as a smart home application, a controller application (e.g., to control a self-propelled device), a gaming application, and the like. In variations, the user can execute a designated application in connection with the wearable device 100 that enables the user to scroll through the various modes. The user may scroll through the modes on the mobile computing device, or via successive selection inputs on the mode selector 103, which can trigger the mobile computing device to display a selectable mode. In other variations, multiple types of inputs can be performed on the mode selector 103, such as tap gestures and tap-and-hold gestures, which can correlate to scrolling through the modes and selecting a particular mode, respectively. As provided herein, the mode selector 103 can be an input mechanism such as an analog or digital button, a touch panel such as a track pad, or a miniature touch-sensitive display device.
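A minimal sketch of the tap versus tap-and-hold behavior follows; the hold threshold and mode list are assumptions chosen for illustration, as the specification does not fix them:

```python
HOLD_THRESHOLD_S = 0.8  # assumed press duration separating a tap from a tap-and-hold

MODES = ["drive", "control", "finder", "mining", "training",
         "sword", "default", "battle", "sharing", "gaming"]

def on_selector_input(press_duration_s: float, index: int) -> tuple[str, int]:
    """Return (action, new_index): taps scroll through modes, holds select one."""
    if press_duration_s >= HOLD_THRESHOLD_S:
        return (f"select:{MODES[index]}", index)  # tap-and-hold commits the mode
    return ("scroll", (index + 1) % len(MODES))   # tap advances to the next mode

action, i = on_selector_input(0.2, 0)  # tap: scroll from "drive" to "control"
action, i = on_selector_input(1.0, i)  # hold: select "control"
print(action)
```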
In certain aspects, the wearable device 100 can include a magnetic clasp that enables the user to fit the wearable device 100. For example, in wrist-worn implementations, the wearable device 100 can utilize a magnetic bond to initially calibrate the sizing requirement for the user. Multiple sizings may be available initially, in which the magnetic clasp can be adjusted by the user. For example, the magnetic clasp can include a slidable feature that can be adjusted to any one of multiple detents on a periphery of the wearable device 100. When the right detent is selected, the user can magnetically attach the slidable feature to the selected detent, which can set the size of the wearable device 100. The magnetic coupling between the selected detent and the slidable feature can be stronger than any additional clasp (e.g., a mechanical clasp) that may be engaged and disengaged to attach or remove the wearable device 100.
According to examples described herein, the wearable device 100 can include a memory 130 that stores mode instruction sets 132 for execution by a processing/translation engine 110 depending on the selected mode. Accordingly, when a mode is ultimately selected, the processing/translation engine 110 can generate a mode call 114 to pull a corresponding set of mode instructions 131 from the memory 130 for execution. Pairing data 107 from a paired device 108 (e.g., another wearable device, a self-propelled device, a smart home device, etc.) can include information indicating the type of device, which the processing/translation engine 110 can utilize to execute the appropriate set of mode instructions 131.
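The mode call 114 can be pictured as a lookup keyed by the device type reported in the pairing data 107. The registry and handler names below are hypothetical stand-ins, sketched under the assumption that each mode instruction set reduces to a callable:

```python
# The memory 130 of the figure is modeled here as a dict of instruction sets.
def drive_instructions(sensor_data: dict) -> dict:
    return {"command": "drive", "payload": sensor_data}

def control_instructions(sensor_data: dict) -> dict:
    return {"command": "smart_home", "payload": sensor_data}

MODE_INSTRUCTION_SETS = {
    "self_propelled_device": drive_instructions,  # pairing data names the device type
    "smart_home_device": control_instructions,
}

def mode_call(paired_device_type: str):
    """Pull the instruction set matching the paired device's reported type."""
    return MODE_INSTRUCTION_SETS[paired_device_type]

translate = mode_call("self_propelled_device")
print(translate({"az": 1.8}))  # -> {'command': 'drive', 'payload': {'az': 1.8}}
```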
Additionally or alternatively, the wearable device 100 can be operable as a standalone, unpaired device in certain modes (e.g., the finder and mining modes, the sword mode, the default mode, and/or the gaming mode). Based on the selected mode, the processing/translation engine 110 can execute the appropriate mode instructions 131 and begin monitoring the one or more sensor(s) 120 of the wearable device 100. In either case, the executed mode instructions 131 can determine the manner in which the processing/translation engine 110 translates the sensor data 122 from the sensor 120. As provided herein, the sensor(s) 120 can detect user gestures performed by the user, and can comprise one or more accelerometers, gyroscopic sensors, inertial measurement unit(s) (IMUs), and magnetometers.
In some aspects, execution of the mode instructions 131 can cause the processing/translation engine 110 to initiate a communication interface 115 to wirelessly communicate with one or more other devices, such as a paired device 108. The communication interface 115 can comprise a wireless communication module, and can be implemented as any type of wireless protocol, such as a BLUETOOTH low energy protocol. Data communications 118 between the wearable device 100 and another device can include received data 117 from the other device, which can further be processed by the processing/translation engine 110 in accordance with the executing mode instructions 131. The received data 117 can also be mode dependent (e.g., command data received from a self-propelled device in the training mode, offensive or defensive data from another wearable device in the interactive mode, or sharing data received in the sharing mode). Additionally, the processing/translation engine 110 can generate data 112 for transmission to the other device depending on the mode. Such transmitted data 112 can be generated based on translating the sensor data 122 and/or in response to the received data 117 from the other device.
In certain aspects, the received data 117 can cause the processing/translation engine 110 to store information in the memory as well (e.g., collected virtual resources to be stored in data logs). Additionally or alternatively, translation of the received data 117 and/or the sensor data 122 can cause the processing/translation engine 110 to generate feedback data 119. The feedback data 119 can be processed by a feedback mechanism 135 of the wearable device 100, which can generate a corresponding feedback output 137. The feedback output 137 can include any combination of haptic, audio, and visual feedback for the user. Thus, the feedback mechanism 135 can include a lighting system that includes one or more lights (e.g., light emitting diodes (LEDs) or a single RGB LED). The feedback mechanism 135 can also include a haptic system to provide vibratory response of varying intensities, and/or an audio system that includes at least one speaker.
The feedback mechanism 135 can also be initiated to provide feedback output 137 (e.g., a combination of haptic, audio, and visual feedback) to assist in connecting with other devices (e.g., a self-propelled device or mobile computing device). In many aspects, the wearable device 100 does not include a display screen, which can typically provide visual feedback regarding BLUETOOTH or Wi-Fi connections. Accordingly, the wearable device 100 can provide a sequence of feedback using the feedback mechanism 135 (e.g., using audio and haptic samples) to assist the user in connecting with other devices. Such feedback can include colored light sequences (e.g., red to green to blue) to indicate successive steps of establishing a connection, as well as audio and haptic feedback to instruct the user to perform a function or to indicate success or failure.
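One way to picture the screenless connection feedback is a fixed stage-to-cue sequence; the stages, colors, and timing below are assumptions for illustration, with the hardware calls injected as callbacks:

```python
import time

# Assumed feedback steps for a screenless pairing flow: one color per connection
# stage, paired with a haptic pulse, echoing the red/green/blue sequence above.
CONNECTION_SEQUENCE = [
    ("scanning", "red"),
    ("pairing", "green"),
    ("connected", "blue"),
]

def signal_connection_progress(set_led, pulse_haptic):
    """Step through the feedback sequence using injected hardware callbacks."""
    for stage, color in CONNECTION_SEQUENCE:
        set_led(color)   # visual cue for the current stage
        pulse_haptic()   # haptic confirmation that a step completed
        time.sleep(0.5)  # hold each cue briefly so the user can register it

signal_connection_progress(lambda color: print("LED:", color), lambda: print("buzz"))
```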
The wearable device 100 can further include a power source 125, such as a battery, that provides electrical power to the various components of the wearable device 100. In some examples, the power source 125 may be coupled to an inductive interface, such as the inductive interface 105 shown in FIG. 1A, that enables the wearable device 100 to charge the power source 125 via induction charging. In variations, the power source 125 can be coupled to a charging port, such as a universal serial bus (USB) port, in which a wired power input 124 can charge the power source 125. Thus, the power input 124 can include an inductive input and/or a wired input to provide charge to the power source 125.
FIG. 1B is a block diagram illustrating an implementation of a wearable device 100 in communication with a self-propelled device 150, according to some examples described herein. In the example provided, the wearable device 100 is worn on the wrist of a user 101, and the user 101 can perform various user gestures 141 that can cause the sensors of the wearable device 100 to generate sensor data 106. Furthermore, the arrangement shown in FIG. 1B can be implemented in the drive mode and the training mode of the wearable device 100. In the example shown, the wearable device 100 transmits sensor data 106 to a communication interface 155 of the self-propelled device 150. However, various examples are described herein in which sensor data 106 translation is performed by the wearable device 100 executing certain mode instructions 131 for operating the self-propelled device 150 (e.g., in the drive mode), implementing the self-propelled device 150 in the training mode, or otherwise utilizing the self-propelled device 150 in any of the other modes described herein. Furthermore, execution of a particular mode on the wearable device 100 can cause the self-propelled device 150 to execute a correlated mode itself.
In many examples, the translation module 160 and the controller 170 can execute mode instructions 166 to enable the self-propelled device 150 to operate in multiple modes, such as an autonomous mode, a normal control mode, a drive mode, and a training mode. In the normal control mode, the self-propelled device 150 can be operated by a mobile computing device executing a control application. Upon the user executing the control application on the mobile computing device, the controller 170 and the translation module 160 can operate based on user inputs on a graphical user interface (GUI) generated on a display screen of the mobile computing device.
Referring to FIG. 1B, the user can further initiate a mode on the wearable device 100 that can perform an inductive link 147 between the wearable device 100 and an inductive interface 145 of the self-propelled device 150. The inductive link 147 can provide mode information 149 to a controller 170 and a translation module 160 of the self-propelled device 150 via the inductive interface 145. Accordingly, the controller 170 and translation module 160 can pull the appropriate mode instruction set 166 from a memory 165 of the self-propelled device 150 for execution. In examples described herein, the memory 165 can store multiple mode instruction sets 166 corresponding to multiple operable modes.
Once in a specified mode (e.g., the drive mode), the communication interface 155 of the self-propelled device 150 can receive the sensor data 156, from the wearable device 100, corresponding to user gestures 141 performed by the user 101. The sensor data 156 can be translated by the translation module 160 in accordance with the executed mode instruction set 166. The translated data 162 can be transmitted to the controller 170, which can process the translated data 162 to provide control commands 173 to a drive system 175 of the self-propelled device 150. The controller 170 can further process the translated data 162 to generate output data 171, which can be utilized by an output mechanism 180 (e.g., a speaker set, a lighting system, an external accessory device, etc.), which can generate a corresponding output response (e.g., any combination of audio, visual, or motional gestures, such as an anthropomorphic gesture like a head shake or a head nod using an accessory device described herein).
In various implementations, the internal drive system 175 of the self-propelled device 150 can include a number of wheels that are actively biased against an inner wall of a spherical housing 186 of the self-propelled device 150 by a biasing mechanism 183. The biasing mechanism 183 can include a number of spring elements and/or a number of spring-loaded portal axles that provide a force against the inner surface of the spherical housing 186. The force applied against the inner surface by the spring element(s) or portal axles presses the drive system 175 against the inner surface of the spherical housing 186 such that, when power is applied to the wheels, the spherical housing 186 is caused to roll, and the self-propelled device 150 is caused to accelerate and maneuver.
FIG. 2A illustrates an example modular sensing device (shown as wearable device 200) in communication with a mobile computing device 220 and a self-propelled device 230. In the examples provided in connection with FIG. 2A, a communication link 216 can be established between a mobile computing device 220 and the wearable device 200. The communication link 216 can be established upon the user executing a control application 218 on the mobile computing device 220, or upon a user input on the wearable device 200. The communication link 216 can be a short range wireless link, such as a Classic BLUETOOTH or BLUETOOTH low energy link. Furthermore, the mobile computing device 220 can establish a communication link 227 with the self-propelled device 230. In some examples, this communication link 227 can be established according to the same protocol as the communication link 216 between the wearable device 200 and the mobile computing device 220. In such examples, the wearable device 200, the mobile computing device 220, and the self-propelled device 230 can all be wirelessly coupled in a single piconet within the short communication range of the wireless protocol (e.g., BLUETOOTH low energy).
In many examples, the user of the wearable device 200 can perform an initial calibration to indicate a forward direction of the self-propelled device 230. This initial calibration can include a user input on a calibration feature 204 of the wearable device 200. The calibration feature 204 can be any type of input mechanism, such as an analog button or a touch-sensitive panel. According to examples described herein, the calibration feature 204 can be the same as the mode selector 103 described with respect to FIG. 1A. The calibration can be performed manually by the user, or can be automated. Manual calibration can involve the user positioning the self-propelled device 230 such that the forward direction of the self-propelled device 230 is pointing away from the user. The user may then provide an input on the calibration feature 204 when the wearable device 200 is forward directionally aligned with the self-propelled device 230. Upon providing the input on the calibration feature 204, a calibration signal can be generated by the wearable device 200 and transmitted to the self-propelled device 230 to indicate the forward direction. Upon receiving the calibration signal, the self-propelled device 230 can adjust a forward operational direction of the internal drive system to be aligned with the wearable device 200.
Automated calibration can involve an exploitation of an asymmetry in a radiation pattern of a wireless signal, such as a BLUETOOTH low energy signal, generated by the self-propelled device 230. In certain implementations, the user can provide an input on the calibration feature 204, which can generate an automatic spin command for transmission to the self-propelled device 230. The spin command can be transmitted directly to the self-propelled device 230, or can be relayed through the mobile computing device 220. The control system of the self-propelled device 250 can execute the spin command to initiate a calibration spin. Variations in the communication signal due to the asymmetry in the radiation pattern can be detected by the mobile computing device 220 and/or the wearable device 200. The signal variation may also be detected by the self-propelled device 230 to identify the directional relationship between the user and the self-propelled device 230, and calibrate the forward directional alignment. Additionally or alternatively, the mobile computing device 220 or the wearable device 200 can transmit a directional command 217 to the self-propelled device 230 to cause the self-propelled device 230 to rotationally maneuver into forward alignment with the wearable device 200.
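A rough sketch of the automated routine follows: command a spin, sample signal strength as the asymmetric radiation pattern sweeps past, and treat the strongest reading as the heading facing the user. The function names and the toy signal model are hypothetical; real RSSI sampling and command transport would come from the device's radio stack:

```python
def auto_calibrate(send_command, sample_rssi, step_deg: int = 10) -> int:
    """Spin the device, find the spin angle of peak RSSI, and align to it."""
    send_command("spin")  # trigger the calibration spin
    rssi_by_angle = {
        angle: sample_rssi(angle)  # one RSSI reading per spin angle
        for angle in range(0, 360, step_deg)
    }
    heading = max(rssi_by_angle, key=rssi_by_angle.get)
    send_command(f"align:{heading}")  # rotate the drive system toward the user
    return heading

# Toy signal model: the radiation pattern peaks when the device faces 90 degrees.
heading = auto_calibrate(print, lambda angle: -abs(angle - 90))
print("estimated heading:", heading)  # -> 90
```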
Manual or automated directional calibration and/or gyro alignment can lock both the self-propelled device's 230 initial coordinate system and the wearable device's 200 initial coordinate system to Earth's inertial reference frame, which can readily indicate the spatial relationship between the wearable device 200 and the self-propelled device 230 at any given time. Locking coordinate systems can further indicate the spatial and orientation relationship between the internal drive system of the self-propelled device 230 and the wearable device 200 at any given time during operation. As such, the sensors (e.g., gyroscopic sensors and/or accelerometers) of the wearable device 200 and the internal drive system can be initially calibrated with respect to the Earth and each other in either the manual calibration (e.g., an aiming routine) or the automated calibration.
Calibration may further be assisted, and the user provided with feedback, via the wearable device's 200 feedback system in conjunction with a calibration indicator on the self-propelled device 230. In many aspects, the internal drive system of the self-propelled device 230 can include an internal LED indicating a rearward direction so as to be viewable by the user. To provide added feedback concerning calibration, the wearable device 200 can provide a combination of audio, visual, and/or haptic feedback indicating successful or unsuccessful alignment. Furthermore, if collisions are detected on the self-propelled device 230, the wearable device 200 can provide feedback (e.g., haptic feedback for each collision), and an indication of whether the initial calibration has been affected beyond predetermined tolerances or thresholds. If the alignment has been affected, the wearable device 200 can initiate realignment feedback instructing the user to “re-aim” or recalibrate the directional alignment (e.g., a red light and audio instructions).
Once calibrated, the user wearing the wearable device 200 can perform user gestures 206, and sensor data 215 corresponding to those gestures 206 can be streamed to the mobile computing device 220 over the communication link 216. The executing control application 218 on the computing device 220 can cause processing resources of the computing device 220 to translate the sensor data 215 into control commands 226, which can be transmitted to the self-propelled device 250 over the communication link 227. The control commands 226 can be received by the control system 254 of the self-propelled device 250 and implemented on the drive system 252, as described herein.
The user gestures 206 may be any movements or actions performed by the user using the wearable device 200. For example, if the wearable device 200 is a wrist-worn device, the user can perform arm actions to control movement of the self-propelled device 250. In certain implementations, the arm actions can include the user raising or lowering an arm wearing the wrist-worn device, rotating the arm, moving the arm from side-to-side, and various combinations of movements and gestures. Each of these actions can include a specific sensor profile, which can be detected by the mobile computing device 220. Accordingly, the mobile computing device 220 can receive sensor data 215 corresponding to the user actions, where each of the actions may be identifiable in the sensor data 215 as individual sensor patterns in the data 215.
The translation performed by the mobile computing device 220 can be based on an identification of each sensor pattern and a correlated translation into a specified control command. For example, when the user raises the arm, an accelerometer and/or gyroscopic sensor in the wrist-worn device can output the corresponding data 215 to the mobile computing device 220. The data 215 can indicate the arm raising motion, and the executing control application 218 can translate the data 215 into a control command 226 to accelerate the self-propelled device 250 forward. When the user's arm is lowered, the corresponding control command 226 can cause the self-propelled device 250 to brake or otherwise decelerate.
In some examples, the mobile computing device 220 can include a database mapping sensor patterns to individual control commands 226, such that when a sensor pattern is detected in the sensor data 215, the mobile computing device 220 can immediately identify a correlated control command 226 and transmit the control command 226 to the self-propelled device 230 for implementation. Accordingly, sensor data 215, corresponding to the user gestures 206 performed by the user, can be processed into control commands 226 by the mobile computing device 220, which may then be transmitted to the self-propelled device 230 for implementation.
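Such a database can be as simple as a pattern-to-command table; the entries below merely restate the raise/lower/point examples from this section and are not an exhaustive or authoritative mapping:

```python
# Hypothetical pattern-to-command table of the kind the control application
# might consult: each recognized sensor pattern maps to one drive command.
PATTERN_COMMANDS = {
    "arm_raise": "accelerate",
    "arm_lower": "brake",
    "point_right": "turn_right",
    "point_left": "turn_left",
}

def translate(pattern: str) -> str | None:
    """Look up the control command for a recognized gesture pattern, if any."""
    return PATTERN_COMMANDS.get(pattern)

assert translate("arm_raise") == "accelerate"
assert translate("shake") is None  # unrecognized patterns produce no command
```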
In certain implementations, the wearable device 200 can be space limited, and may only include a limited amount of memory and computational resources. In such implementations, the wearable device 200 can represent each gesture that can be performed by a user as a simple state machine. Thus, for each gesture, a state machine corresponding to that gesture can either positively identify its gesture or negatively identify its gesture. When a positive gesture is identified, the state machine corresponding to that gesture can report the positive identification to the mobile computing device 220 or the self-propelled device 230, which can execute a control command 226 based on the gesture associated with the state machine.
As an illustration, the wearable device 200 can include a memory 210 implementing a number of state machines (e.g., SM1 203, SM2 205, SM3 207, SM4 209, . . . , SMN 211), each being associated with a particular gesture. For example, SM1 203 can be associated with the user raising an arm, SM2 205 can be associated with the user lowering an arm, SM3 207 can be associated with the user pointing an arm to the right, and SM4 209 can be associated with the user pointing an arm to the left. Furthermore, any number of state machines may be implemented in the memory 210 representing any number of gestures. At any given time step, the state machines can be instantiated for each gesture type, and each state machine can inspect the instantaneous sensor data to determine whether to update its state. If at any time after instantiating, a respective state machine determines that its associated gesture is not being performed, it can request destruction, immediately releasing its memory resources until the gesture is complete, and/or the state machine can reinstantiate accordingly. If, however, the state machine determines that its associated gesture has been completed, the state machine reports the event (i.e., its state change) to the mobile computing device 220 or the self-propelled device 230 and releases its memory resources and/or reinstantiates.
As an example, state machine (n) (SMN 211) can be associated with a gesture that causes the self-propelled device 230 to turn left. The associated gesture may be the user having an arm raised and pointing left, and can have a correlated sensor pattern that, if detected by SMN 211, causes the state of SMN 211 to change, which in turn causes SMN 211 to report the state change in a state machine report 219 to the mobile computing device 220 or the self-propelled device 230. The state machine report 219 can ultimately cause the self-propelled device 230 to execute the control command 226 associated with the gesture (in this case, turn left).
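A minimal sketch of one such per-gesture state machine is shown below, assuming for illustration that a gesture is defined by an ordered template of discretized sensor readings; destruction and reinstantiation are modeled simply as resetting the machine's progress:

```python
class GestureStateMachine:
    """Per-gesture recognizer sketched from the description above."""

    def __init__(self, gesture: str, template: list[str]):
        self.gesture = gesture    # e.g., "turn_left"
        self.template = template  # ordered readings that define the gesture
        self.position = 0         # progress through the template

    def update(self, reading: str) -> str | None:
        """Inspect one instantaneous reading; report the gesture on completion."""
        if reading != self.template[self.position]:
            self.position = 0  # gesture not in progress: release state
            return None
        self.position += 1
        if self.position == len(self.template):
            self.position = 0  # report the state change, then reinstantiate
            return self.gesture
        return None

sm = GestureStateMachine("turn_left", ["arm_raised", "arm_left"])
event = None
for reading in ["arm_raised", "arm_left"]:
    event = sm.update(reading)
print(event)  # -> "turn_left", i.e., the state machine report for this gesture
```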
As provided herein, the memory 210 of the wearable device 200 can comprise any number of state machines that transition states when their associated gestures are identified in the sensor data. For each gesture performed by the user, each of the state machines in the memory 210 can monitor the sensor data for its associated sensor pattern. The wearable device 200 arrangement shown in FIG. 2A can be utilized for any of the modes discussed herein. Furthermore, the wearable device 200, including the state machine embodiment, can operate in conjunction with an executing application 218 on a connected mobile computing device 220. The mobile computing device 220 can receive each state machine report 219 indicating the gestures performed by the user wearing the wearable device 200, and generate a corresponding instruction or command depending on the mode, as described herein.
In the example shown in FIG. 2A, the mobile computing device 220, executing a designated application 218, can identify the mode of the wearable device 200. The mode can determine the manner in which the mobile computing device 220 translates each state machine report received, and the corresponding response.
In variations, one or more examples described herein provide for the wearable device 200 itself to handle the state machine reports 219, execute feedback based on the reports, and/or transmit data (e.g., control commands 226 to the self-propelled device 230) corresponding to each state machine report 219 in accordance with the selected mode.
FIG. 2B illustrates an example modular sensing device for generating commands for execution on a self-propelled device. More specifically, in an example provided in connection with FIG. 2B, a wearable device 260 is a wrist-worn device that includes a memory 280 comprising a number of state machines, as described with respect to FIG. 2A. The wearable computing device 260 can include a report processor 270 programmed to process state machine reports 282. In, for example, the drive mode, the report processor 270 can translate each state machine report 282 into a control command 275 for execution by the self-propelled device 290.
In examples, the report processor 270 of the wearable computing device 260 can generate control commands 275 based on the user gestures 277 performed by the user. These control commands 275 can be transmitted to a self-propelled device 290, which can implement the control commands 275 on its drive system directly. Accordingly, sensor data can be processed into the control commands 275 by the wearable computing device 260, utilizing the state machines in the memory 280, before being transmitted over the communication link 271 via the communication interface 265 to the self-propelled device 290.
Drive Mode Methodology
FIG. 3A is a high level flow chart describing an example method of translating sensor data by a modular sensing device for implementation in a self-propelled device. In the below description of FIG. 3A, reference may be made to like reference characters representing various features of FIG. 1A, and also FIG. 2A, for illustrative purposes. Furthermore, the method described in connection with FIG. 3A may be performed by an example wearable device 100 described in connection with FIG. 1A. Referring to FIG. 3A, the wearable device 100 can receive user input on the mode selector 103 to select a drive mode on the wearable device 100 (300). The wearable device 100 may then detect an inductive link or pairing 109 with a self-propelled device 230 (305), thereby initializing the drive mode on the self-propelled device 230 as well.
The wearable device 100 may then generate and transmit a calibration signal for directional calibration or alignment between the wearable device 100 and the self-propelled device 230 (310). In some examples, the wearable device 100 relays the calibration command to the self-propelled device via a connected mobile computing device 220 (311). In variations, the calibration may be automated by the wearable device 100 or the mobile computing device 220 generating and transmitting a spin command for transmission to the self-propelled device 230, determining a direction to the self-propelled device 230 by detecting variations in the radiation pattern of the self-propelled device 230 as it spins, and transmitting a command to the self-propelled device 230 to directionally align its internal drive system with the wearable device 100 accordingly (313).
Once calibrated, the wearable device 100 can monitor sensor data 122 corresponding to user gestures 206 performed by the user (315). As described herein, the sensor data 122 can be actively monitored by processing resources of the wearable device 100 (e.g., the processing/translation engine 110 shown in FIG. 1A), or by state machines in a memory resource of the wearable device 100. Based on each identified user gesture 206, the wearable device 100 can translate the sensor data 122 and/or state machine reports 219 into control commands 226 for the self-propelled device 230 (320). Accordingly, the wearable device 100 can transmit the control commands 226 to the self-propelled device 230 for execution (325).
FIG. 3B is a low level flow chart describing an example method for translating sensor data by an example modular sensing device for implementation in a self-propelled device. In the below description of FIG. 3B, reference may be made to like reference characters representing various features of FIGS. 1A and 2A for illustrative purposes. Furthermore, the method described in connection with FIG. 3B may be performed by an example wearable device 100 as illustrated in FIG. 1A or the wearable devices 200, 260 as shown and described with respect to FIGS. 2A and 2B. Referring to FIG. 3B, the wearable device 100 can detect an inductive pairing with a self-propelled device 230 (330). The wearable device 100 can further receive a user input on a mode selector 103 of the wearable device 100 (335) to select a drive mode. The wearable device 100 can be a wrist-worn device (336), and the mode selector 103 may be an analog button on an outer edge of the wearable device 100 (337).
In many aspects, the wearable device 100 can synchronize directionally with the self-propelled device 230 (340). In some cases, the user can manually synchronize the gyroscopic sensors (gyros) of both devices by manually aligning them, and performing an action, such as depressing the mode selector, to calibrate the gyros (342). In other cases, the wearable device 100 can automatically synchronize the gyros (343). For automatic synchronization, the wearable device 100 can transmit a spin command to the self-propelled device 230 (345), detect the asymmetry in the radiation pattern of the self-propelled device 230 (e.g., the BLUETOOTH low energy radiation pattern) (350), and transmit a directional calibration command to directionally align the self-propelled device 230 with the wearable device 100 (355).
Once calibrated, the wearable device 100 can monitor sensor data 122 (360), and detect user gestures 206 in the sensor data 122 (365). Such gestures can correspond to various controls for the self-propelled device 230, such as directional controls (367) and acceleration and deceleration controls (369). Thus, the wearable device 100 can translate each of the user gestures 206 into a control command 226 for the self-propelled device 230 (370), and transmit the control commands 226 to the self-propelled device 230 for execution (375).
In some aspects, hybrid gestures may be utilized to trigger one or more functions of the drive mode. For example, the drive mode may be initiated by a “trigger gesture” instead of the user actively scrolling through available modes and selecting the drive mode. A preconfigured sensor pattern for the trigger gesture can be stored, and when identified (e.g., by a corresponding state machine or via any set of executing instructions by a processor), the wearable device 200 can automatically initialize the drive mode, which can instantaneously connect the wearable device 200 to the self-propelled device 230 and cause the self-propelled device 230 to perform an action. Feedback on the wearable device 200 can provide the user with an indication that the drive mode has been initialized automatically. Furthermore, the self-propelled device 230 can also provide feedback indicating the drive mode, and can thereafter remain under the operational control of the wearable device 200. In certain implementations, the trigger gesture can be prioritized over other gestures performed by the user, which can override any other modes currently executing on the wearable device 200.
When the user wishes to end the drive mode, the wearable device 100 can detect a second user input, for example, on the mode selector 103 of the wearable device 100 (380). In response to the second user input, the wearable device 100 can transmit a power down command to the self-propelled device 230 and exit the drive mode (385).
In some aspects, the wearable device 100 can initiate the user experience based on an initial radio-frequency scan of proximate available devices. The scan may be performed by the wearable device 100 itself, or by a connected mobile computing device 220, and can identify nearby BLUETOOTH-enabled and/or Wi-Fi enabled devices, such as a self-propelled device 230, another wearable device, network devices (e.g., devices potentially associated with virtual resources), or a smart home device. The initial scan, or in some aspects, periodic scans, can identify such connectable devices and limit the modal options of the wearable device 100. For example, a scan may reveal that no self-propelled devices 230, smart home devices, other peripheral devices, or network devices are within connectable range, and can consequently limit the available mode choices to only those standalone modes that do not require a connection with another device (e.g., finder mode, sword mode, or default mode). As another example, a scan may reveal that a smart home device is within connectable range of the wearable device 100, which can trigger a notification to the user and provide the control mode option.
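This gating of mode options by scan results can be sketched as follows; the device-type names and mode groupings are assumptions that mirror the examples in this paragraph:

```python
# Standalone modes are always offered; connected modes appear only when a
# compatible device type turns up in the radio-frequency scan.
STANDALONE_MODES = {"finder", "sword", "default"}
CONNECTED_MODES = {
    "self_propelled_device": {"drive", "training"},
    "smart_home_device": {"control"},
    "wearable_device": {"battle", "sharing"},
}

def available_modes(scan_results: set[str]) -> set[str]:
    """Limit the mode menu to what the devices found in the scan can support."""
    modes = set(STANDALONE_MODES)
    for device_type in scan_results:
        modes |= CONNECTED_MODES.get(device_type, set())
    return modes

print(available_modes(set()))                  # standalone modes only
print(available_modes({"smart_home_device"}))  # adds the control mode option
```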
Training Mode
FIG. 4 is a flow chart describing an example method of initiating a training mode on a wearable device in connection with a self-propelled device. In the below discussion of FIG. 4, reference may be made to like reference characters representing various features described with respect to FIG. 2B for illustrative purposes. Referring to FIG. 4, the wearable device 260 can detect a user input placing the wearable device 260 in the training mode (400). The user input can be detected via a mode selector on the wearable device 260 (402), or via launch of a designated application on a connected mobile computing device (404). Alternatively, the training mode can be initiated via a combination of user inputs on the mode selector and an inductive link with a self-propelled device 290. In one example, the inductive link causes the self-propelled device 290 to execute an autonomous or partially autonomous mode. Accordingly, the wearable device 260 may also detect an inductive pairing with a self-propelled device 290 (405). In response to the inductive pairing, the wearable device 260 can transmit data to initiate the training mode on the self-propelled device 290 as well (410). The transmitted data can cause the self-propelled device 290 to execute instructions to aid the user in training for a series of actions. In certain variations, the training mode can cause the wearable device 260 to transmit a signal to cause the self-propelled device 290 to execute the autonomous or partially autonomous mode. For example, the series of actions can correspond to offensive and defensive actions that the user can implement when the wearable device 260 is in the interactive or battle mode. Additionally or alternatively, the series of actions can get progressively more difficult as the user successively accomplishes each action.
Initially, the wearable device 260 can synchronize directionally with the self-propelled device 290 (415). In some aspects, the user can manually synchronize the gyroscopic sensors of the devices 260, 290 by physically pointing a forward operational direction of the self-propelled device 290 away from the wearable device 260 and providing a calibration input (e.g., an input on the mode selector of the wearable device 260) (417). In other aspects, the gyro synchronization may be performed automatically (419).
Automatic synchronization can be initiated by the wearable device 260 by generating and transmitting a spin command to the self-propelled device 290, which can execute a spin accordingly (420). Using a signal detector, the wearable device 260 can detect an asymmetry in the radiation pattern of the self-propelled device 290 as it spins, indicating the direction towards the self-propelled device 290 (425). With a known direction, the wearable device 260 can transmit a directional calibration command to the self-propelled device 290 indicating the direction, which the self-propelled device 290 can process to align its internal drive system accordingly (430).
In many aspects, the wearable device 260 can track the location of the self-propelled device 290 as it traverses, spins, and/or maneuvers (440). The self-propelled device 290 can include surface features or an accessory (e.g., a magnetically coupled attachment) that indicates a forward “looking” direction of the self-propelled device 290. In certain examples, the user is instructed to walk or run around in a circle until the user is directly facing the forward looking direction of the self-propelled device 290. In many aspects, the wearable device 260 can include sensors to determine an orientation or location of the user. For example, the wearable device 260 can determine whether the user is facing an instructed direction in connection with the training mode, such as facing the self-propelled device 290. Additionally or alternatively, the wearable device 260 can generate an output, via the feedback mechanism, instructing the user to perform a set of actions (445). The output may be in the form of audio instructions, and can be based on data received from the self-propelled device 290 (447), or from the wearable device 260 utilizing a local routine set (449), which may be randomized or sequenced in accordance with the executing training mode instructions.
In certain implementations, the training mode can involve an interaction between the self-propelled device 290 in an autonomous mode and the wearable device 260 that provides the user with positive and/or negative feedback with respect to whether the user has performed a series of actions. The instructions to perform the series of actions can be outputted as audio from the wearable device 260 or the self-propelled device 290. Furthermore, an established communication link between the wearable device 260 and the self-propelled device 290 can enable either device to transmit acknowledgements and confirmations that the user has or has not performed the series of actions. In some examples, a separate training mode program may be executed on the self-propelled device 290 that causes the self-propelled device 290 to generate a set of commands to sequentially instruct the user to perform respective sets of actions using the wearable device 260. The sets of commands can be processed by the wearable device 260 to generate outputs instructing the user to perform such physical actions. As provided herein, these physical actions can comprise learned arm and/or wrist motions that, if performed correctly by the user, result in positive feedback from the wearable device 260, the self-propelled device 290, or both.
Execution of the training mode on the wearable device 260 can cause the wearable device 260 to sequentially output audio instructions to the user to perform one or more actions, either independently in accordance with the training mode, or in response to respective sets of commands from the self-propelled device 290. If the user correctly performs the action(s), the wearable device can output positive feedback (e.g., via the haptic, audio, and/or visual feedback mechanisms). However, if the user does not perform the actions correctly, then the wearable device 260 can output negative feedback and, for example, await or otherwise instruct the user to try again. Detection of whether the user has correctly performed the series of actions can be based on sensor data from one or more sensors of the wearable device 260 (e.g., an inertial measurement unit), and either via state machine reports 282 or via sensor data translation instructions executed by a processor of the wearable device 260. The processor or state machine controller (i.e., the report processor 270) can determine whether the series of actions has been completed by the user, and generate the positive or negative feedback accordingly. In certain implementations, if the user has properly performed the series of actions, the wearable device 260 or the self-propelled device 290 can generate the positive feedback, and thereafter generate an audio and/or visual output instructing the user to perform a next series of actions.
Communications between the wearable device 260 and the self-propelled device 290 can enable the wearable device 260 to communicate to the self-propelled device 290 whether a particular iteration of a series of actions has been successfully performed by the user. Based on the communication (e.g., either a positive or a negative indication), the self-propelled device can be triggered to output a response and/or perform an autonomous action. In one example, if the signal is negative, then the self-propelled device 290 can output a “negative” feedback response (e.g., blink red lights and make demerit sounds). However, if the signal is positive, then the self-propelled device 290 can perform an autonomous action that can include driving to a certain location, spinning in place, and/or outputting positive feedback. In accordance with the training mode, the user may be instructed to proceed to a location in which the user faces a forward-facing direction of the self-propelled device 290 and perform the next series of actions. Thus, prior to performing each series of actions in each sequence of the training mode, the self-propelled device 290 may first detect whether the user is located in the forward-facing direction.
In certain implementations, this dynamic between the self-propelled device 290 and the wearable device 260 can comprise a physical game in which (i) the wearable device 260 can output sequential audio instructions for the user to perform a first series of actions, and determine whether the user has performed them by analyzing sensor data, and (ii) the self-propelled device 290 can receive confirmations of whether the user has successfully performed such actions, and generate a response accordingly. According to examples, an initial requirement may be that the user proceeds (e.g., runs) to a forward-facing location of the self-propelled device 290, with a subsequent requirement of performing the series of actions. Thus, whether the user is in the correct location may be determined by the self-propelled device 290, which can communicate a confirmation to the wearable device 260 accordingly, thereby "greenlighting" the wearable device 260 to analyze the sensor data for the series of actions. The process may repeat until, for example, the user quits or a score tally reaches a threshold.
Once the instructions are outputted to the user, the wearable device 260 or the self-propelled device 290 can initiate a timer (450). The timer can be initiated for each instruction outputted to the user, and a threshold time limit can be set for each instruction. Thus, the user can be instructed to perform the set of actions within the predetermined time period. The wearable device 260 can monitor the sensor data to determine whether the user successfully performs the set of actions (455). Specifically, for each instruction output, the wearable device 260 can determine whether the user has performed the instructed set of actions within the established threshold time limit (460). If so (461), then the wearable device 260 can generate another output instructing the user to perform another set of actions (445). However, if the user fails to perform the set of actions (463), then the wearable device 260 can terminate the training session and generate a final score (465), which may be displayed on the user's mobile computing device.
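As an illustration only, the timed instruction-and-feedback loop described above can be organized as in the following minimal Python sketch. The helper functions, time limit, and scoring are assumptions for illustration, not the claimed implementation:

```python
import time

def output_instruction(action_set):
    # Placeholder for the device's audio prompt to the user.
    print(f"Perform: {action_set}")

def give_feedback(positive):
    # Placeholder for the haptic, audio, and/or visual feedback mechanisms.
    print("Positive feedback" if positive else "Negative feedback")

def detect_actions(action_set, deadline):
    # Placeholder for sensor-data analysis: would return True once the
    # instructed set of actions is recognized before the deadline expires.
    return time.monotonic() < deadline

def run_training_session(instruction_sets, time_limit_s=10.0):
    """Timed instruction/feedback loop sketched from steps 445-465."""
    score = 0
    for action_set in instruction_sets:
        output_instruction(action_set)              # output instruction (445)
        deadline = time.monotonic() + time_limit_s  # initiate timer (450)
        if detect_actions(action_set, deadline):    # monitor sensor data (455, 460)
            give_feedback(positive=True)            # success (461)
            score += 1                              # continue to the next set
        else:
            give_feedback(positive=False)           # failure (463)
            break                                   # terminate the session
    return score                                    # final score (465)
```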
As provided herein, the instructions for the user to perform respective sets of actions can be generated by the wearable device 260 executing the training mode, or by the self-propelled device 290 once the autonomous mode is initiated. The combined execution of the training mode and the autonomous mode can create a synchronized interaction between the self-propelled device 290 and the wearable device 260 in which both devices 260, 290 can provide feedback indicating whether the user has successfully performed a set of actions. In such implementations, the wearable device 260 can determine whether the actions have been performed properly, transmitting an indication of that determination to the self-propelled device 290, which can generate the appropriate output (e.g., positive or negative feedback). The timer can be included on the wearable device 260, the self-propelled device 290, or both, to determine whether the series of actions has been performed within the threshold time limit.
In one example, the wearable device 260 determines whether the user has performed a particular series of actions and generates feedback accordingly (e.g., positive feedback). The wearable device 260 may then transmit an indication of the feedback to the self-propelled device 290, which can in turn generate a similar output (e.g., a visual and/or audible feedback response) to coincide with the feedback from the wearable device 260. Depending on (i) whether the user has successfully performed the set of actions, and (ii) whether the user has performed the set of actions within the threshold time limit, the combined system of the wearable device 260 and the self-propelled device 290 can proceed to a next set of commands and instructions.
Example Robotic Device
FIG. 5 is a schematic illustrating an example self-propelled device with which example wearable devices may be implemented, as described herein. The self-propelled device 500 can be of a size and weight allowing it to be easily grasped, lifted, and carried by a user. The self-propelled device 500 can include a spherical housing 502 with an outer surface that makes contact with an external surface of a corresponding magnetically coupled accessory device 590 as the self-propelled device 500 rolls. In addition, the spherical housing 502 includes an inner surface 504. Additionally, the self-propelled device 500 includes several mechanical and electronic components enclosed by the spherical housing 502. In an example, the self-propelled device 500 includes magnetic elements 582 which are supported within the spherical housing 502 and which magnetically interact with complementary magnetic elements 592 of the accessory device 590. The magnetic interaction and coupling can occur and/or be maintained while the self-propelled device 500 moves.
The spherical housing 502 can be composed of a material that transmits signals used for wireless communication, yet is impervious to moisture and dirt. The spherical housing 502 can comprise a material that is durable, washable, and/or shatter-resistant. The spherical housing 502 may also be structured to enable transmission of light and can be textured to diffuse the light.
In one variation, the spherical housing 502 is made of sealed polycarbonate plastic. In one example, the spherical housing 502 comprises two hemispherical shells with an associated attachment mechanism, such that the spherical housing 502 can be opened to allow access to the internal electronic and mechanical components.
Several electronic and mechanical components are located inside the envelope for enabling processing, wireless communication, propulsion, and other functions. In an example, the components include a drive system 501 to enable the device to propel itself. The drive system 501 can be coupled to processing resources and other control mechanisms, as described with other examples. The carrier 514 serves as the attachment point and support for components of the drive system 501. The components of the drive system 501 are not rigidly attached to the spherical housing 502. Instead, the drive system 501 can include a pair of wheels 518, 520 that are in frictional contact with the inner surface 504 of the spherical housing 502.
The carrier 514 can be in mechanical and electrical contact with an energy storage 516. The energy storage 516 provides a reservoir of energy to power the device 500 and its electronics, and can be replenished through an inductive charge port 526. The energy storage 516, in one example, is a rechargeable battery. In one variation, the battery is composed of lithium-polymer cells. In other variations, other rechargeable battery chemistries are used.
The carrier 514 can provide the mounting location for most of the internal components, including printed circuit boards for electronic assemblies, sensor arrays, antennas, and connectors, as well as providing a mechanical attachment point for internal components.
The drive system 501 can include motors 522, 524 and wheels 518, 520. The motors 522 and 524 connect to the wheels 518 and 520, respectively, each through an associated shaft, axle, and gear drive (not shown). The perimeters of the wheels 518 and 520 are the two locations where the interior mechanism is in mechanical contact with the inner surface 504. The locations where the wheels 518 and 520 contact the inner surface 504 are an essential part of the drive mechanism of the self-propelled device 500, and so are preferably coated or covered with a material to increase friction and reduce slippage. For example, the wheels 518 and 520 can be covered with silicone rubber tires.
In some variations, a biasing assembly 515 is provided to actively force the wheels 518, 520 against the inner surface 504. In an example illustrated by FIG. 5, the biasing assembly 515 can comprise two or more separate portal axles 558, 560 to actively force the drive system wheels 518, 520 against the inner surface 504. The portal axles 558, 560 may include biasing elements 554, 556 (or springs) which include tips 555 or ends that press against the inner surface 504 with a force vector having a vertical value. The vertical force from the bias springs 554, 556 pressing against the inner surface 504 actively forces the drive system 501 and its respective wheels 518, 520 against the inner surface 504, thereby providing sufficient force for the drive system 501 to cause the self-propelled device 500 to move.
The portal axles 558, 560 comprising the independent biasing elements 554, 556 can be mounted directly onto the carrier 514. The biasing elements 554, 556 coupled to the portal axles 558, 560 may be in the form of torsion springs which exert a force against the inner surface 504. As an addition or alternative, the biasing elements 554, 556 may be comprised of one or more of a compression spring, a clock spring, or a tension spring. Alternatively, the portal axles 558, 560 can be mounted, without inclusion of springs, to maintain a force pressing the drive system 501 and wheels 518, 520 against the inner surface 504, and allow sufficient traction to cause the self-propelled device 500 to move.
According to many examples, the self-propelled device 500 can include an inductive charge port 526 to enable inductive charging of a power source 516 used to provide power to the independent motors 522, 524 that power the wheels 518, 520. The self-propelled device 500 can further include a magnet holder 580 coupled to the carrier 514. The magnet holder 580 can include a set of magnetically interactive elements 582, such as elements comprised of ferrous materials, and/or electromagnets or permanent magnets. Likewise, the external accessory 590 can also include complementary magnets 592 for enabling the magnetic coupling. Thus, the magnet holder 580 and the external accessory 590 can comprise one or more of any combination of magnetically interactive metals, ferromagnetic elements, neodymium, yttrium/cobalt, alnico, or other permanent elemental magnets, other "rare-earth" magnets, electromagnets, etc.
In variations, the magnet holder 580 can include a set of magnetic elements 582 (e.g., a magnet pair) which can be oriented to have opposing polarity. For example, as shown in FIG. 5, the magnetic elements 582 include a first magnet and a second magnet, where the first magnet can be oriented such that its north magnetic pole faces upwards and its south magnetic pole faces downwards. The second magnet can be oriented such that its south magnetic pole faces upwards and its north magnetic pole faces downwards.
In variations, the magnet holder 580 and an external accessory 590 can each house any number or combination of complementary magnets or magnetic components. For example, a single magnetic component may be housed in either the self-propelled device 500 or in the corresponding external accessory 590, and be arranged to magnetically interact with a plurality of magnetic components of the other of the external accessory 590 or the self-propelled device 500. Alternatively, for larger variations, magnetic arrays of three or more magnets may be housed within the spherical housing 502 to magnetically interact with a corresponding magnetic array of the external accessory 590.
The magnet holder 580 can be placed on an internal pivoting structure 573 that enables the magnet holder 580 to pivot within the spherical housing 502. The pivot structure 573 can include a single or multiple guide rails and can be driven by one or more pivot actuators 572. Accordingly, in examples provided herein, user gestures can be performed by the user of the wearable device that not only cause the self-propelled device 500 to accelerate and maneuver, but also drive the pivot actuator 572 to cause the magnet holder 580 to rotate within the spherical housing 502. For wrist-worn devices, acceleration and maneuver commands may be simple arm gestures, such as raising, lowering, and turning the arm on which the wearable device is worn. Pivot commands may be different arm or wrist gestures, such as pivoting or rotating the wrist or arm, or a combination of such actions.
When the pivot structure 573 is driven by the pivot actuator 572 to pivot, the magnetic interaction between the magnets 582 of the self-propelled device 500 and the magnets 592 of the external accessory 590 causes the external accessory 590 to pivot as well. Thus, gesture responses by the self-propelled device 500 can include pivoting or turning the external accessory 590, as well as maneuvering and accelerating the self-propelled device 500.
In some examples, the biasing assembly 515 is formed such that the wheels 518, 520 and the tip ends 555 of the biasing elements 554, 556 are almost constantly engaged with the inner surface 504 of the spherical housing 502. As such, much of the power from the motors 522, 524 is transferred directly to rotating the spherical housing 502, as opposed to causing the internal components (i.e., the biasing assembly 515 and internal drive system 501) to pitch. Thus, while motion of the self-propelled device 500 may be caused, at least partially, by pitching the internal components (and therefore the center of mass), motion may also be directly caused by active force of the wheels 518, 520 against the inner surface 504 of the spherical housing 502 (via the biasing assembly 515) and direct transfer of electrical power from the motors 522, 524 to the wheels 518, 520. As such, the pitch of the biasing assembly 515 may be substantially reduced, and remain substantially constant (e.g., substantially perpendicular to the external surface on which the self-propelled device 500 moves). Additionally or as an alternative, the pitch of the biasing assembly 515 may increase (e.g., to over 45 degrees) during periods of hard acceleration or deceleration. Furthermore, under normal operating conditions, the pitch of the biasing assembly 515 can remain stable or vary subtly (e.g., within 10-15 degrees).
In some variations, the magnetic elements 582 can be replaced or augmented with magnetic material, which can be included on, for example, the tip ends 555 of the biasing elements 554, 556. The tip ends 555 can be formed of a magnetic material, such as a ferrous metal. Such metals can include iron, nickel, cobalt, gadolinium, neodymium, samarium, or metal alloys containing proportions of these metals. Alternatively, the tip ends 555 can include a substantially frictionless contact portion, in contact with the inner surface 504 of the spherical housing 502, and a magnetically interactive portion, comprised of the above-referenced metals or metal alloys, in contact or non-contact with the inner surface 504. As another variation, the substantially frictionless contact portion can be comprised of an organic polymer such as a thermoplastic or thermosetting polymer.
In some examples, the tip ends 555 can be formed of magnets, such as polished neodymium permanent magnets. In such variations, the tip ends 555 can produce a magnetic field extending beyond the outer surface of the spherical housing 502 to magnetically couple with the external accessory device. Alternatively still, the tip ends 555 can include a substantially frictionless contact portion, and have a magnet included therein.
Alternatively still, a magnetic component of the self-propelled device 500 may be included on any internal component, such as the carrier 514, or an additional component coupled to the biasing assembly 515 or the carrier 514.
In further examples, one or more of the magnetic elements 582, the tip ends 555, and/or the complementary magnets of the external accessory device can comprise any number of electro- or permanent magnets. Such magnets may be irregular in shape to provide added magnetic stability upon motion of the self-propelled device 500. For example, the magnetic elements 582 of the self-propelled device 500 can be a single or multiple magnetic strips including one or more tributary strips to couple with the complementary magnet(s) of the accessory device. Additionally, or alternatively, the tip ends 555 can also include a single or multiple magnets of different shapes which couple to complementary magnets of the accessory device.
Alternatively, the magnetic coupling between the self-propelled device 500 and the accessory device can be one which creates a stable magnetically repulsive state. For example, the magnetic elements 582 can include a superconductor material to substantially eliminate dynamic instability of a repelling magnetic force in order to allow for stable magnetic levitation of the accessory device in relation to the magnetic elements 582 while the spherical housing 502 rotates on the underlying surface. In similar variations, a diamagnetic material may be included in one or more of the self-propelled device 500, the tip ends 555, or the external accessory device, to provide stability for magnetic levitation. Thus, without the use of guide rails or a magnetic track, the self-propelled device 500 may be caused to maneuver in any direction with the external accessory device remaining in a substantially constant position along a vertical axis of the self-propelled device 500 (the Cartesian or cylindrical z-axis, or the spherical r-coordinate with no polar angle θ).
Hardware Diagrams
FIG. 6 is a block diagram that illustrates a computer system upon which examples described herein may be implemented. For example, one or more components discussed with respect to the self-propelled device of FIGS. 1B, 2A, 2B, and 5, and the methods described herein, may be implemented by the system 600 of FIG. 6.
In one implementation, the computer system 600 includes processing resources 610, a main memory 620, ROM 630, a storage device 640, and a communication interface 650. The computer system 600 includes at least one processor 610 for processing information and a main memory 620, such as a random access memory (RAM) or other dynamic storage device, for storing information and instructions to be executed by the processor 610. The main memory 620 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 610. The computer system 600 may also include a read only memory (ROM) 630 or other static storage device for storing static information and instructions for the processor 610. A storage device 640, such as a magnetic disk or optical disk, is provided for storing information and instructions. For example, the storage device 640 can correspond to a computer-readable medium that stores instructions for performing the sensor data processing and translation operations discussed herein.
The communication interface 650 can enable the computer system 600 to communicate with a computing device and/or wearable device (e.g., via a cellular or Wi-Fi network) through use of a network link (wireless or wired). Using the network link, the computer system 600 can communicate with a plurality of devices, such as the wearable device, a mobile computing device, and/or other self-propelled devices. The main memory 620 of the computer system 600 can further store the drive instructions 624, which can be initiated by the processor 610. Furthermore, the computer system 600 can receive control commands 662 from the wearable device and/or mobile computing device. The processor 610 can execute the drive instructions 624 to process and/or translate the control commands 662—corresponding to user gestures performed by the user—and implement the control commands 662 on the drive system of the self-propelled device.
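Purely as an illustrative sketch of this translation step, the following Python fragment maps gesture-derived control commands onto a simple drive state. The gesture names, scaling values, and state fields are assumptions and not part of the described system:

```python
# Hypothetical mapping from gesture-derived control commands to drive
# parameter changes; names and scaling values are illustrative only.
GESTURE_TO_DRIVE = {
    "arm_raise": {"throttle": +0.5},        # accelerate
    "arm_lower": {"throttle": -0.5},        # decelerate
    "arm_left":  {"heading_delta": -15.0},  # steer left
    "arm_right": {"heading_delta": +15.0},  # steer right
}

def execute_drive_instruction(command, state):
    # Translate one received control command into updated drive state,
    # clamping throttle to [-1, 1] and wrapping heading to [0, 360).
    params = GESTURE_TO_DRIVE.get(command, {})
    throttle = state.get("throttle", 0.0) + params.get("throttle", 0.0)
    state["throttle"] = max(-1.0, min(1.0, throttle))
    heading = state.get("heading", 0.0) + params.get("heading_delta", 0.0)
    state["heading"] = heading % 360.0
    return state

# Usage: an arm raise then a right turn, starting from rest.
state = {}
execute_drive_instruction("arm_raise", state)
execute_drive_instruction("arm_right", state)
print(state)  # {'throttle': 0.5, 'heading': 15.0}
```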
Additionally, the main memory 620 can further include mode instructions 622, which the processor 610 can execute to place the self-propelled device in one or multiple modes to interact with the wearable device. In some examples, execution of the mode instructions 622 can place the self-propelled device in an operational mode that provides feedback 652 and/or instructions 654 to the wearable device over the network 680 (e.g., in training mode).
Examples described herein are related to the use of the computer system 600 for implementing the techniques described herein. According to one example, those techniques are performed by the computer system 600 in response to the processor 610 executing one or more sequences of one or more instructions contained in the main memory 620. Such instructions may be read into the main memory 620 from another machine-readable medium, such as the storage device 640. Execution of the sequences of instructions contained in the main memory 620 causes the processor 610 to perform the process steps described herein. In alternative implementations, hard-wired circuitry and/or hardware may be used in place of or in combination with software instructions to implement examples described herein. Thus, the examples described are not limited to any specific combination of hardware circuitry and software.
FIG. 7 is a block diagram that illustrates a mobile computing device upon which examples described herein may be implemented, such as the mobile computing device 220 of FIG. 2A. In one example, the computing device 700 may correspond to, for example, a cellular communication device (e.g., feature phone, smartphone, etc.) that is capable of telephony, messaging, and/or data services. In variations, the computing device 700 can correspond to, for example, a tablet or wearable computing device.
In an example of FIG. 7, the computing device 700 includes a processor 710, memory resources 720, a display device 730 (e.g., a touch-sensitive display device), one or more communication sub-systems 740 (including wireless communication sub-systems), input mechanisms 750 (e.g., an input mechanism can include or be part of the touch-sensitive display device), and one or more location detection mechanisms (e.g., GPS component) 760. In one example, at least one of the communication sub-systems 740 sends and receives cellular data over data channels and voice channels.
The memory resources 720 can store a designated control application 722, as one of multiple applications, to initiate the communication sub-system 740 to establish one or more wireless communication links with the self-propelled device and/or a wearable device. Execution of the control application 722 by the processor 710 may cause a specified graphical user interface (GUI) 735 to be generated on the display 730. Interaction with the GUI 735 can enable the user to calibrate the forward directional alignment between the self-propelled device and the computing device 700. Furthermore, the GUI 735 can allow the user to initiate a task-oriented operation (e.g., a game) to be performed by the user in conjunction with operating the self-propelled device with user gestures using the wearable device, as described herein.
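One way such a forward-alignment calibration could work, sketched here only as an assumption (the heading conventions and function names are hypothetical), is to record the angular offset between the user's forward direction and the self-propelled device's heading, then apply that offset to subsequent steering commands:

```python
def calibrate_forward_alignment(robot_heading_deg, user_heading_deg):
    # Record the offset between the user's forward direction and the
    # self-propelled device's current heading, in degrees.
    return (robot_heading_deg - user_heading_deg) % 360.0

def to_robot_frame(command_heading_deg, offset_deg):
    # Rotate a user-relative steering command into the robot's frame so
    # that "forward" matches the user's perspective after calibration.
    return (command_heading_deg + offset_deg) % 360.0

# Usage: calibrate once via the GUI, then apply to every gesture command.
offset = calibrate_forward_alignment(robot_heading_deg=90.0, user_heading_deg=0.0)
print(to_robot_frame(0.0, offset))  # user "forward" -> robot heading 90.0
```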
FIG. 8 is a block diagram of an example modular sensing device upon which examples described herein may be implemented, such as the wearable device 100 of FIG. 1A.
In an example of FIG. 8, the modular sensing device 800 includes a processor 810, memory resources 820, a feedback mechanism 830 (e.g., audio 832, haptic 833, and visual 831 devices), a communication sub-system 840 (e.g., wireless communication sub-systems such as BLUETOOTH low energy), one or more sensors 860 (e.g., a gyroscopic sensor or accelerometer), and an input mechanism 850 (e.g., an analog or digital mode selector). In one example, the communication sub-system 840 sends and receives data over one or more channels.
The memory resources 820 can store mode instructions 823 corresponding to a plurality of control modes 822, as described herein, which can be executed by the processor 810 to initiate a particular mode. Certain executing mode instructions 823 can initiate the communication sub-system 840 to establish one or more wireless communication links with the self-propelled device and/or the mobile computing device. Execution of a control mode 822 by the processor 810 may cause the processor 810 to generate distinct feedback responses using the feedback mechanism 830 based on sensor data from the sensor(s) 860 indicating user gestures performed by the user.
In some examples, the memory resources 820 can comprise a number of state machines 824 which can provide state machine reports 827 to the processor 810 when specified sensor patterns are identified by respective state machines 824. Each state machine 824 may monitor for a single sensor pattern which, if identified by that state machine 824, can cause the state machine 824 to transition states, thereby providing a state machine report 827 to the processor 810 identifying the user gesture performed. The processor 810 can translate the state machine reports 827—which indicate the user gestures—in accordance with an executing set of mode instructions 823 in order to generate a corresponding output via the feedback mechanism 830 and/or control commands 812 to be communicated to the self-propelled device via the communication sub-system 840.
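For illustration, a single-pattern state machine of this kind might look like the following Python sketch. The two-step "arm raise" pattern, thresholds, and field names are assumptions rather than the described implementation:

```python
class GestureStateMachine:
    """A state machine that watches for one sensor pattern and emits a
    report when all of its state transitions complete."""

    def __init__(self, name, steps):
        self.name = name    # gesture this machine monitors for
        self.steps = steps  # ordered predicates over a sensor sample
        self.state = 0      # index of the transition awaited next

    def feed(self, sample):
        # Advance one state when the awaited condition is met; return a
        # report once the final state is reached, otherwise None.
        if self.steps[self.state](sample):
            self.state += 1
            if self.state == len(self.steps):
                self.state = 0  # re-arm for the next occurrence
                return {"gesture": self.name}
        return None

# Example: treat an "arm raise" as strong upward acceleration followed by
# the arm settling (threshold values are illustrative only).
arm_raise = GestureStateMachine("arm_raise", [
    lambda s: s["accel_z"] > 1.5,
    lambda s: abs(s["accel_z"]) < 0.2,
])
print(arm_raise.feed({"accel_z": 2.0}))  # None (first transition taken)
print(arm_raise.feed({"accel_z": 0.1}))  # {'gesture': 'arm_raise'}
```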
While the examples of FIG. 6, FIG. 7, and FIG. 8 provide a computer system 600, a computing device 700, and a modular sensing device 800 for implementing aspects described herein, in some variations any of the three devices can be arranged to implement some or all of the functionality described with respect to the processing resources of the self-propelled device 150 of FIG. 1B, the mobile computing device 220 of FIG. 2A, or the wearable device 100 of FIG. 1A, as shown and described throughout.
With further reference to the examples of FIG. 6, FIG. 7, and FIG. 8, some examples include functionality for projecting an orientation and/or perspective of a user onto a gaming environment via the sensing output of the modular sensing device 800. For example, when the modular sensing device 800 is worn, the orientation and perspective of the user can be inferred from the sensors 860 (e.g., IMU), and this sensor information can be virtualized for the gaming environment. For example, the gaming environment can be shown on a computing device (e.g., the display screen of a computer, mobile computing device, etc.). The gaming environment can present a perspective that is based on the orientation of the user (e.g., the user is standing facing north), as determined by the modular sensing device 800. The perspective can change as the user changes orientation, moves in a particular direction, etc. In some examples, the modular sensing device 800 can be used to control a virtual or actual object (e.g., a self-propelled device), and the orientation and direction of the controlled object may be with reference to a reference frame of the user.
In variations, a reference frame of the self-propelled device may be used, and the user's orientation can be used to influence control of the virtual or actual device in motion. For example, the user's movement or motion can influence a change of direction. Alternatively, both orientations can be used concurrently. For example, if the device under control is a virtual vehicle that carries the user, the user may turn his head (e.g., when wearing a necklace carrying the modular sensing device) to see a view to a particular side while the orientation of the vehicle is used for the motion of the vehicle.
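A minimal sketch of the perspective projection described above, assuming a simple yaw/pitch convention for the IMU output (the axis conventions and function name are illustrative assumptions):

```python
import math

def view_vector_from_imu(yaw_deg, pitch_deg):
    # Convert the wearer's IMU orientation into a unit view vector that a
    # gaming environment could use as the virtual camera direction.
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))

# Usage: facing "north" (yaw 0) and level (pitch 0) -> (1.0, 0.0, 0.0).
print(view_vector_from_imu(0.0, 0.0))
```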
Multi-Device Usage
FIG. 9 illustrates an embodiment of multiple sensing devices that concurrently provide input for a program or application which utilizes the inputs, along with inferences which can be made about a person or object that carries the devices, according to one or more examples. In particular, an example such as shown enables input from multiple sensing devices to be used for the purpose of enabling inferences of movement and pose from two relevant sources of user motion. For example, in FIG. 9, a user 901 carries wearable devices in the form of a wrist-worn device 910 and a pendant 912. In other examples, one or both of the wrist-worn device 910 and pendant 912 can be in the form of an alternative form factor or device type. For example, the combination of sensing devices can include a hat, a ring, eyeglasses, or a device which the user can carry in his or her hand (e.g., FOB, mobile computing device). In variations, more than two wearable devices can be employed by one user.
FIG. 10 illustrates a system which concurrently utilizes input from multiple modular sensing devices in connection with execution of an application or program. With reference to an example of FIG. 10, a multi-device system 1000 includes a first modular sensing device 1010, a second modular sensing device 1020, and a controller 1030. Each of the first and second modular sensing devices 1010, 1020 includes a respective inertial measurement unit (IMU) 1012, 1022, a processor 1014, 1024, and memory 1016, 1026. The IMU 1012, 1022 of each modular sensing device 1010, 1020 can include sensors such as an accelerometer 1015, 1025 and a gyroscopic sensor 1017, 1027. The first and second modular sensing devices 1010, 1020 may also include additional sensing resources, such as a magnetometer and/or proximity sensor.
The controller 1030 can include a processor 1032 and a memory 1034. The processor 1032 can execute instructions 1045 for a program or application that can execute and process inputs 1011, 1021 from the respective modular sensing devices 1010, 1020. In some variations, the controller 1030 is a mobile computing device, such as a multi-purpose wireless communication device which can wirelessly communicate with each of the first and second modular sensing devices 1010, 1020.
While the example of FIG. 10 illustrates the controller 1030 as a separate device from the first and second modular sensing devices 1010, 1020, variations provide that the controller 1030 is integrated or otherwise combined with at least one of the first or second modular sensing devices 1010, 1020. For example, the controller 1030 can include a multi-purpose wireless communication device that is equipped with a gyroscopic sensor and accelerometer. Thus, for example, variations can provide for the second modular sensing device to be a local resource of the controller 1030, which communicates with the first modular sensing device 1010.
With further reference to FIG. 10, the controller 1030 can receive inputs 1011, 1021 from the respective first and second modular sensing devices 1010, 1020. The inputs 1011, 1021 can be received in connection with an application 1039 or program that is executed by the processor 1032 of the controller 1030. The processor 1032 can execute the instructions 1045 in order to implement an inference engine 1035 for determining inferences about the person or object that carries one or both of the modular sensing devices 1010, 1020. For example, the application 1039 can correspond to a game or simulation, and the inference engine 1035 can be specific to the application 1039. Among other applications, the inference engine 1035 can be used to determine when the motions of two modular sensing devices 1010, 1020 are separate and distinct from one another, or continuous and/or part of the same input motion.
According to one implementation, each input 1011, 1021 can correspond to one or more of a position input, height, orientation, velocity, and linear and/or rotational acceleration. Each of the first and second sensing devices 1010, 1020 generates a set of measured (or sensed) data corresponding to, for example, a movement (e.g., gesture) made with the respective sensing device 1010, 1020. Additionally, the controller 1030 can process the inputs 1011, 1021 corresponding to each of the respective data sets in order to determine a third data set of inferences. In this way, the inferences reflect information determined from sensed data, rather than directly measured data. The inferences output from the inference engine 1035 can be determinative or probabilistic, depending on implementation.
With reference to an example of FIG. 9, the user 901 can wear two modular sensing devices, and the inference engine 1035 can assume some inferences based on anatomical constraints and/or context (e.g., such as provided from execution of the application 1039). For example, each of the first and second modular sensing devices 1010, 1020 can correspond to a wearable wrist device. Alternatively, the second modular sensing device 1020 can correspond to the pendant 912 or another neck-worn device. By way of example, if the first modular sensing device 1010 (wrist device 910) is detected to be in motion, the inference engine 1035 can be used to determine additional position data for the movement of that device along a third axis based on the orientation, position, or context of the second modular sensing device 1020 (a second wrist device or the pendant 912). For example, if the first modular sensing device 1010 (wrist device 910) measures an arc motion, and the second modular sensing device 1020 is the pendant 912, then the orientation of the second modular sensing device 1020 can indicate whether, for example, the arc motion is in front of the user or to the user's side. Alternatively, if the second modular sensing device 1020 is a second wrist device, the information sensed from the second wrist device can identify the corresponding hand or device as being in front of the body. In such an orientation, the inference engine 1035 can determine the inference to be that the user is making the arc of motion in front of his body. Similarly, if the height of the second sensing device 1020 is determined to be belt high and the device is held by the user, the orientation of the user's torso can be inferred (along with the direction of the arc).
In examples in which the second modular sensing device 1020 is a pocket device (e.g., mobile computing device, FOB), information can be determined from, for example, the height of the device (e.g., user standing, crouching, or jumping) and the rotation of the device. For example, if the second modular sensing device 1020 is pocket-worn, a change in the orientation of the device from vertical to horizontal, in combination with a downward acceleration, can indicate the user is crouching. If the user is crouching, for example, the type of motion that is likely by the first modular sensing device 1010 may be limited (e.g., motion of the wrist device 910 is likely in front of the user when the user is moving up or down).
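To make the inference step concrete, the following Python sketch encodes two such rules. The thresholds, field names, and rule structure are assumptions offered for illustration only:

```python
def infer_context(wrist, pocket):
    """Combine samples from a wrist-worn device and a pocket-carried
    device into higher-level inferences (a hypothetical rule set)."""
    inferences = []
    # Rule 1: a pocket device rotating from vertical to horizontal while
    # accelerating downward suggests the user is crouching.
    if pocket["orientation"] == "horizontal" and pocket["accel_z"] < -0.8:
        inferences.append("user crouching")
        # Anatomical constraint: while crouching, wrist motion is most
        # likely in front of the body.
        if wrist["in_motion"]:
            inferences.append("wrist motion in front of user")
    # Rule 2: a belt-high, hand-held device constrains torso orientation.
    if pocket.get("held") and pocket.get("height_m", 2.0) < 1.2:
        inferences.append("torso orientation inferred from held device")
    return inferences

# Usage with illustrative samples:
print(infer_context(
    wrist={"in_motion": True},
    pocket={"orientation": "horizontal", "accel_z": -1.0, "held": False},
))
```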
Modular Sensing Device
FIG. 11 illustrates an example modular sensing device insertable into a plurality of compatible apparatuses. The modular sensing device 1100 shown in FIG. 11 can comprise various components and modules of the modular sensing devices and wearable devices as shown and described herein. Referring to FIG. 11, the modular sensing device 1100 can include a number of output devices, such as an LED array 1110, an audio output device 1120 (e.g., a speaker), and a haptic driver 1160 (included within the device). Furthermore, the modular sensing device 1100 can include a mode selector 1130, which can comprise an analog or digital button to enable the user to select a particular mode of the device 1100 by, for example, scrolling through a stored series of modes. The modular sensing device 1100 can further include memory and processing resources 1165 that can execute the selected mode (either in the state machine implementation (FIG. 1B) or the executed instruction set implementation (FIG. 1A) described herein).
In various aspects, the modular sensing device 1100 also includes a communications interface 1170 (e.g., a BLUETOOTH low energy, WiFi, WiGig, WiMAX, or cellular radio interface), and an IMU 1140 to provide the memory and processing resources 1165 with sensor data for detecting gestures performed by the user. As described herein, depending on the mode and sub-mode of the device 1100, the memory and processing resources 1165 interpret the sensor data to generate outputs via the output devices 1110, 1120, 1160 and/or commands or responses to be output to a connected device via the communications interface 1170 (e.g., a remotely operated device or another modular sensing device). Furthermore, in some implementations, the modular sensing device 1100 can include an input interface 1150 (e.g., a mini-USB port) to enable charging of one or more batteries and/or uploading of additional mode instructions. In variations, the modular sensing device 1100 can include an induction interface to charge one or more batteries and/or to enable inductive pairing with a second device to establish a wireless communications link.
In the various examples described herein, the modular sensing device 1100 can be insertable into or otherwise attachable to any number of compatible apparatuses, such as wearable devices 1195 (wrist devices, rings, pendants, hats, glasses, etc.), wielded devices 1185, companion toys or dolls, and the like. Furthermore, the modular sensing device 1100 can be implemented in various other form factors, can be sewn into clothing, or can be mounted, glued, or otherwise attached to various apparatuses. Such apparatuses can each include a module holder 1187, 1197 into which the modular sensing device 1100 may be inserted or otherwise mounted or attached. Thus, according to examples provided herein, the user can utilize the apparatuses into which the modular sensing device 1100 has been inserted or attached to perform various gestures in accordance with a selected mode of the modular sensing device 1100.
FIG. 12 illustrates an implementation of the modularized sensing device. As shown, the sensing device 1200 can be retained by the compatible structure 1220 (e.g., a wrist-worn strap), and then removed and placed in an opening of a wand device 1210 (e.g., a play sword). The placement of the modular sensing device 1200 in different compatible structures 1220, 1210 for retention and use can be coordinated with different functionality being enabled through the sensing device. For example, the modular sensing device 1200 in the wrist-worn strap 1220 can be used in conjunction with a first program running on a mobile computing device (controller), self-propelled device, and/or other computer system (e.g., virtual gaming system). When placed in the wand or wielded device 1210, the modular sensing device 1200 can be operated in conjunction with a mobile computing device, self-propelled device, and/or other computer system (e.g., virtual gaming system) which executes a second program or application. In each context, the orientation of the modular sensing device 1200 can be used to determine a perspective, such as a virtual field of view for gameplay. The perspective can refer to the orientation, direction, and/or position of the user, and/or of the user's body part with respect to the sensing device. With the wand, the orientation and direction of the sensing device can be used to project a corresponding virtual object in a virtual environment (e.g., a sword). The modular sensing device 1200 may also be able to read an identifier of the compatible structure 1220, 1210 in order to determine information about the structure, such as its dimensions, and whether the structure is worn or carried. Based on known information, inferences can be determined for purposes of virtualization, etc. (e.g., length of sword), as in the sketch below.
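Purely as an illustration of that identifier lookup, the following Python sketch maps a structure identifier to stored properties and selects program parameters accordingly. The identifiers, fields, and registry contents are hypothetical:

```python
# Hypothetical registry keyed by an identifier read from the retaining
# structure (e.g., wrist strap or wand) when the module is inserted.
STRUCTURE_REGISTRY = {
    "strap-01": {"kind": "wrist strap", "worn": True,  "length_m": 0.05},
    "wand-07":  {"kind": "play sword",  "worn": False, "length_m": 0.80},
}

def configure_for_structure(structure_id):
    # Look up the structure and choose which program to coordinate with,
    # plus parameters for virtualization (e.g., virtual sword length).
    info = STRUCTURE_REGISTRY.get(structure_id)
    if info is None:
        return {"program": "default"}
    program = "wearable_program" if info["worn"] else "wielded_program"
    return {"program": program, "virtual_length_m": info["length_m"]}

print(configure_for_structure("wand-07"))
# {'program': 'wielded_program', 'virtual_length_m': 0.8}
```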
CONCLUSION
It is contemplated for examples described herein to extend to individual elements and concepts described herein, independently of other concepts, ideas or systems, as well as for examples to include combinations of elements recited anywhere in this application. Although examples are described in detail herein with reference to the accompanying drawings, it is to be understood that this disclosure is not limited to those precise examples. As such, many modifications and variations will be apparent to practitioners skilled in this art. Accordingly, it is intended that the scope of this disclosure be defined by the following claims and their equivalents. Furthermore, it is contemplated that a particular feature described either individually or as part of an example can be combined with other individually described features, or parts of other examples, even if the other features and examples make no mention of the particular feature. Thus, the absence of describing combinations should not preclude the inventor from claiming rights to such combinations.