CROSS-REFERENCES TO RELATED APPLICATIONS

This application is a continuation of, and claims priority to, U.S. patent application Ser. No. 15/637,468, filed on Jun. 29, 2017, the entire contents of which are incorporated herein by reference. Applicant also incorporates herein by reference the entire disclosures of U.S. application Ser. Nos. 15/637,650 and 15/637,701.
FIELD OF THE DISCLOSURE

The present disclosure generally relates to device control and, more particularly, to device control based on movement of a user.
BACKGROUND

Conventional techniques for controlling operation of a device involve a user of the device touching a switch, knob, button, dimmer, or other component of the device in order to control operation in the desired manner. For example, the user may touch a switch that controls a light bulb in a light fixture in order to move the switch to an “ON” position to cause the light bulb to illuminate; the user may turn on a stove burner by using his or her hand to manipulate a corresponding knob for the burner; the user may press a button such as a temperature button on an oven or a volume button on a sound system to cause a temperature of the oven or a volume of the sound system to increase or decrease. The user may also press a button on a home security system or smart home device to turn on or off the system or device or various functionality of the system or device. Insurance coverage provided with respect to a residence may account for the presence and/or use of various items within the residence.
Outside of a residential context, such as in the context of driving, a driver of a vehicle may receive indications of events such as an approaching emergency vehicle or use of a crosswalk by pedestrians. The driver may change a manner of driving of the vehicle, such as by slowing down, in response to such indications. Conventional manners in which the driver receives such indications include the driver hearing an approaching emergency vehicle or noticing a flashing infrastructure component, such as a flashing light at an intersection, that indicates the emergency vehicle; or a visual indication of crossing pedestrians being provided to the driver, such as by a crossing guard.
SUMMARY

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In one embodiment, a system for movement-based device control may be provided. The system may include a target device and a control device. The control device may be configured to determine an indication of the target device. The control device may also be configured to detect at least one movement associated with the control device. The at least one movement may correspond to at least one control command associated with the target device. The control device may additionally be configured to generate a control signal indicative of the at least one control command associated with the target device based on the at least one movement associated with the control device. The control device may further be configured to transmit the control signal indicative of the at least one control command associated with the target device to the target device based on the indication of the target device so that the target device is controlled according to the at least one control command.
In another embodiment, a method for movement-based device control may be provided. The method may include receiving, using one or more processors associated with a target device, a control signal corresponding to at least one movement associated with a control device. The method may also include processing, using the one or more processors associated with the target device, the control signal to determine at least one indication of the at least one movement associated with the control device. The method may additionally include determining, using the one or more processors associated with the target device based on the at least one indication of the at least one movement associated with the control device, at least one control command associated with the target device. The method may further include causing, using the one or more processors associated with the target device, the at least one control command associated with the target device to be executed so that the target device is controlled according to the at least one control command.
In yet another embodiment, another method for movement-based device control may be provided. The method may include determining, using one or more processors associated with a control device, an indication of a target device. The method may also include detecting, using the one or more processors associated with the control device, at least one movement associated with the control device. The at least one movement may correspond to at least one control command associated with the target device. The method may additionally include generating, using the one or more processors associated with the control device, a control signal indicative of the at least one control command associated with the target device based on the at least one movement associated with the control device. The method may further include transmitting, using the one or more processors associated with the control device, the control signal indicative of the at least one control command associated with the target device to the target device based on the indication of the target device so that the target device is controlled according to the at least one control command.
BRIEF DESCRIPTION OF THE DRAWINGS

The figures described below depict various aspects of the system and methods disclosed herein. It should be understood that each figure depicts an embodiment of a particular aspect of the disclosed system and methods, and that each of the figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following figures, in which features depicted in multiple figures are designated with consistent reference numerals.
FIG. 1 illustrates a block diagram of a system for implementing movement-based device control, such as device control based on movement of a user, according to an exemplary embodiment of the present invention;
FIG. 2 illustrates a block diagram of a computing device according to an exemplary embodiment of the present invention;
FIG. 3 illustrates a flow diagram of a method for movement-based device control, according to an exemplary embodiment of the present invention;
FIG. 4 illustrates a flow diagram of another method for movement-based device control, according to an exemplary embodiment of the present invention;
FIG. 5 illustrates a flow diagram of yet another method for movement-based device control, according to an exemplary embodiment of the present invention; and
FIG. 6 illustrates a flow diagram of another method for movement-based device control, according to an exemplary embodiment of the present invention.
The figures depict various aspects of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
DETAILED DESCRIPTION

Although the following text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this patent and equivalents. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.
It should also be understood that, unless a term is expressly defined in this patent using the sentence “As used herein, the term ‘______’ is hereby defined to mean . . . ” or a similar sentence, there is no intent to limit the meaning of that term, either expressly or by implication, beyond its plain or ordinary meaning, and such term should not be interpreted to be limited in scope based on any statement made in any section of this patent (other than the language of the claims). To the extent that any term recited in the claims at the end of this patent is referred to in this patent in a manner consistent with a single meaning, that is done for the sake of clarity only so as to not confuse the reader, and it is not intended that such claim term be limited, by implication or otherwise, to that single meaning. Finally, the patent claims at the end of this patent application are not intended to be construed under 35 U.S.C. § 112(f) unless traditional means-plus-function language is expressly recited, such as “means for” or “step for” language being explicitly recited in the claim(s).
The present embodiments relate to, among other things, techniques for device control based on movement of a user. Conventionally, the operator or user of a device may control operation of the device by touching an associated switch, knob, button, dimmer, etc. and/or moving the associated switch, knob, dimmer, etc. in a particular direction and/or other manner so as to achieve the desired control of the device. For example, the user may manipulate a burner knob of a stove so as to turn on or off a corresponding burner, the user may flick a light switch up or down to turn a light on or off, the user may press a button to increase or decrease a setting such as volume, etc. The user may also press a button to move the device being controlled, such as by pressing a button on a remote control to raise or lower a window shade that is in a difficult-to-reach location within a residence.
A user may also receive indications of events when performing activities such as driving a vehicle, and may control (e.g., drive) the vehicle based on such indications. For example, a sign, infrastructure component, siren, crossing guard, etc. may warn the driver of an oncoming emergency medical services (EMS) vehicle such as an ambulance, pedestrians crossing in a crosswalk, etc. Insurance coverage provided to a user with respect to a residence, a vehicle, etc. may have terms that reward the user for the presence and/or use of various items within the residence, the vehicle, etc., such as use of home security systems, smart home devices in general, and/or Global Positioning System (GPS) functionality within the vehicle, etc.
Conventional techniques for device control, however, typically require the presence of the user at devices such as household appliances, lights, etc. to activate switches, knobs, buttons, etc., and/or typically require the user to use an additional device such as a remote control to control the intended device, such as in the case of a sound system, television, difficult-to-reach window shades, etc. Conventional techniques for device control also typically require, in the context of a vehicle, the driver to notice various indications in order to control the vehicle in response to such indications. Such indications may include, for example, a flashing infrastructure component or crossing guard, and various conditions such as heavy traffic or reduced visibility may make noticing such indications more difficult. Conventional techniques for device control (including vehicle control) fail to afford a user or driver functionality to perform such device control remotely and to be readily and effectively notified of a need to perform such device control (e.g., a need to slow the speed of a vehicle).
The techniques, systems, and methods disclosed herein, though, allow a user to control a device (e.g., a household device such as one of the example types described above or further described herein) from a distance, without the need for an additional device such as a conventional remote control that is used to control a television, sound system, window shades, etc. More particularly, in some embodiments and as further discussed below, the user may wear a control device, such as a smart watch or a necklace, and may cause at least one movement to be associated with the control device. For example, the user may perform at least one gesture, such as at least one gesture using a hand of the user. The at least one gesture may be or may correspond to the at least one movement associated with the control device, as further described below, and the at least one movement associated with the control device may be detected by the control device. The control device may generate a control signal indicative of a control command based on the at least one movement associated with the control device that has been detected by the control device. The control signal may be transmitted to a target device, such as a light switch, light fixture, fan, thermostat, television, window shade, speaker, or to a component coupled to the target device, as discussed in further detail herein, so as to cause the target device to operate in a desired manner.
By the user performing a gesture or gestures that result in generation of a control signal that causes the target device to operate in a desired manner, the user may control the operation of the target device using a wearable control device and an appropriate movement(s) (e.g., gesture(s)). Unlike conventional techniques requiring the presence of the user at the target device that is to be controlled, the techniques described herein allow the user to control the operation of the target device from a distance. In some situations, the gesture-based device control described herein may allow a user to control a device that would otherwise be controlled by voice commands when an environment of the device is too noisy or loud to process such voice commands accurately or when the user simply prefers to utilize gesture-based device control. In some examples, as further described herein, control of a device may switch between movement-based (e.g., gesture-based) control and other forms of control, such as voice control.
In some embodiments, the target device to be controlled may be a roadside sign, a vehicle, a smart vehicle controller, or may be a device (e.g., a remote server or other computing device) that communicates with a vehicle or smart vehicle controller, as further described herein. Unlike in conventional implementations of providing indications to a driver of, for example, pedestrians crossing in a crosswalk or an approaching emergency vehicle, the techniques described herein may allow a user of a control device, who may be located at any suitable remote location outside of the vehicle (including in a different vehicle), to perform a movement(s) (e.g., gesture(s)) so as to cause relevant and timely indications to be provided to the driver in an easily noticeable way(s). For example, and as discussed in further detail below, such indications may be provided by way of easily visible roadside signs near the driver, by a smart vehicle controller of the vehicle being driven, etc.
More generally, the target device that is controlled using a control device worn by the user may be any suitable device that receives (or that has a component coupled thereto that receives) a control signal directly or indirectly from the control device as described herein and that operates (e.g., is controlled) in response to the control signal, where the control signal may be generated as a result of the user of the control device performing a movement(s) (e.g., gesture(s)). Accordingly, the techniques, systems, and methods disclosed herein are able to provide a user with the ability to control a device from a distance, without the need for an additional device such as a conventional remote control, and/or to perform device control in a way that provides a more effective indication or feedback to, for example, a driver of a vehicle. In view of these and other advantages that will be recognized by one of ordinary skill in the art in light of the teaching and disclosure herein, it will be appreciated that the techniques, systems, and methods described herein are directed to an improvement to computer functionality and/or an improvement in computer-related technology, and improve the functioning of conventional computers. Moreover, additional advantageous features and embodiments of techniques, systems and methods for movement-based device control are further described below.
FIG. 1 illustrates a block diagram of an exemplary system 100 for implementing movement-based device control, such as device control based on movement of a user as described herein. The high-level architecture illustrated in FIG. 1 may include both hardware and software applications, as well as various data communication channels or links for communicating data between the various hardware and software components, as is described below. The system 100 may include a user 102 of functionality described herein, a network 104, a remote computing device 106, an insurance provider computing device 108, and a data storage device (or devices) 110, it being understood from the teaching and disclosure herein that various components and/or devices shown and/or described in the singular may have plural instances or vice versa, or may be omitted in some embodiments. The remote computing device 106 may be situated at any suitable or desired location remote from (e.g., not worn by or directly attached to) the user 102, as further described herein.
The system 100 may also include a watch 112, which in the example of FIG. 1 may be worn by the user 102 and may be a smart watch. Additionally or alternatively, the system 100 may include a necklace 114, which may also be worn by the user 102 and may in some embodiments include or be attached to a camera 116 or any suitable image sensor. For ease of explanation, the present disclosure refers at times to a “control device,” and it will be understood in light of the teaching and disclosure herein that the control device may be the watch 112, the necklace 114 including the camera 116, and/or may be different devices at different times. For example, at a first time, the watch 112 may be used as the control device to implement functionality described herein, and at a second time, the necklace 114 including the camera 116 may be used as the control device to implement functionality described herein. Furthermore, it should be appreciated in light of the teaching and disclosure herein that a suitable additional and/or alternative device or devices may be worn by the user 102, held by the user 102, attached to the user 102, etc. and may constitute the control device, such as, for example, smart glasses. As an example, the user 102 may wear clothing with a sensor (or sensors) and/or other suitable hardware built into the clothing, such as a sensor 117 built into the shirt of the user 102 as shown in FIG. 1. The user 102 may perform suitable movement as described elsewhere herein, which may be sensed by the sensor 117 and used to generate and transmit a control signal as described elsewhere herein. The sensor 117 may be or may include an image sensor and/or any other suitable type of sensor(s).
With reference to the discussion above, the user 102 may cause at least one movement to be associated with the control device in order to control operation of a target device in a desired and/or suitable manner. With further reference to the discussion above, the user 102 may cause the at least one movement to be associated with the control device by performing a movement(s) such as an arm and/or hand gesture(s). In some embodiments, the at least one movement associated with the control device may be the movement(s) performed by the user 102. In other embodiments, the at least one movement associated with the control device may be a movement(s) of the control device that results from the movement(s) performed by the user 102. For example, the user 102 may make a flicking motion with a finger of the user 102 (e.g., a motion that the user 102 would make with his or her finger to turn a light switch on or off), and the movement of the watch 112 that results from the flicking motion of the finger of the user 102 may constitute a movement associated with the control device. Stated another way, it will be understood in light of the present disclosure that the flicking motion (or any other suitable movement) made by the user 102 may not result in a movement of the watch 112 or other control device that has the same size, direction, and/or speed, etc. as the flicking motion, but may nonetheless correspond to a particular movement of the watch 112 or other control device (e.g., a movement of the watch 112 that is of a certain direction, distance, speed, etc.). Such a corresponding particular movement of the watch 112—as opposed to the actual flicking motion made by the user 102 wearing the watch 112—may constitute the movement associated with the control device as described herein. In cases where another control device (or devices) is used, such as the necklace 114 with the camera 116, the user 102 may cause at least one movement to be associated with the control device by, for example, gesturing such that the gesture is captured by the camera 116 (which may capture still images, video, etc.). An indication of the gesture captured by the camera 116 may be further processed in a similar manner as an indication of a movement associated with the watch 112 may be processed, as described elsewhere herein.
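By way of non-limiting illustration only, the following Python sketch shows one simple way a control device could detect a movement such as a flicking motion from raw motion-sensor samples. The sample structure, the threshold value, and the gesture labels are assumptions introduced solely for this sketch and do not limit the embodiments described herein.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AccelSample:
    """One accelerometer reading from the control device (e.g., the watch 112)."""
    timestamp_ms: int
    ax: float  # acceleration along x, in m/s^2
    ay: float  # acceleration along y, in m/s^2
    az: float  # acceleration along z, in m/s^2

def detect_flick(samples: List[AccelSample],
                 threshold: float = 15.0) -> Optional[str]:
    """Return 'flick_up' or 'flick_down' if a short burst of vertical
    acceleration exceeds the threshold, else None.

    This is a simple threshold heuristic; a control device could instead
    use a trained gesture classifier or camera-based recognition.
    """
    for sample in samples:
        if sample.az > threshold:
            return "flick_up"
        if sample.az < -threshold:
            return "flick_down"
    return None

# Example usage with synthetic samples:
burst = [AccelSample(0, 0.1, 0.0, 1.2), AccelSample(20, 0.3, 0.1, 18.4)]
print(detect_flick(burst))  # -> 'flick_up'
```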
As discussed in further detail herein, a control signal may be generated accordingly in response to the at least one movement associated with the control device (e.g., a movement of the control device resulting from a flicking motion performed by the user 102) so as to control operation of the target device. For example, and as also discussed in further detail herein, the control signal may be generated so as to turn on a light when the target device is a light switch or corresponding light controlled by the light switch, so as to provide a particular command(s) to a smart vehicle controller, so as to provide particular data and/or command(s) to a remote computing device such as the remote computing device 106, etc. Data provided to the remote computing device 106 may in some examples be stored in the data storage device (or devices) 110 for retrieval at a desired or suitable time, such as to provide commands to one or more vehicles and/or smart vehicle controllers. In other examples, as further discussed herein, the control signal generated in response to the at least one movement associated with the control device may increase or decrease a setting (e.g., brightness, volume, channel such as in the case of a television, etc.) of the target device; may move the target device in a particular direction (e.g., moving a window shade up or down); and/or may control the target device in any other suitable manner. An example gesture that may be performed to increase or decrease a setting includes raising or lowering a hand and/or arm of the user 102. Example gestures that may be performed to move the target device in a particular direction include movement of a hand of the user 102 in particular directions (e.g., up to raise a window shade, down to lower a window shade, to the right to move the target device in one direction, to the left to move the target device in an opposing direction, and/or any other suitable gestures).
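The following Python sketch illustrates, purely by way of example, how detected movements could be mapped to control commands and encoded into a control signal for transmission to a target device. The gesture names, command fields, and the JSON encoding are illustrative assumptions rather than a required format.

```python
import json

# Illustrative mapping from detected movements to control commands for a
# selected target device; the gesture names and command fields are
# assumptions for this sketch, not a prescribed protocol.
GESTURE_COMMANDS = {
    "flick_up": {"command": "power", "value": "on"},
    "flick_down": {"command": "power", "value": "off"},
    "raise_hand": {"command": "adjust_setting", "value": "+1"},
    "lower_hand": {"command": "adjust_setting", "value": "-1"},
}

def build_control_signal(target_device_id: str, gesture: str) -> bytes:
    """Encode a control signal for the target device based on a detected gesture."""
    command = GESTURE_COMMANDS[gesture]
    payload = {"target": target_device_id, **command}
    return json.dumps(payload).encode("utf-8")

# For example, a 'flick_up' detected while the control device is linked to a
# light fixture could produce:
print(build_control_signal("light_fixture_146", "flick_up"))
```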
In some embodiments, it may be determined (e.g., by the control device, such as the watch 112) whether a particular user, such as one user of at least one authorized user of the control device, is the user 102 wearing the control device, and various actions described herein such as generation of the control signal may not be performed when an authorized user of the control device is not wearing the control device. As described in further detail below, the determination or detection of the control device being worn by an authorized user may be performed based on analyzing biometric information of the user 102 wearing the control device (which biometric information may be determined by the control device while the user 102 wears the control device) with respect to biometric information that has been associated with the at least one authorized user of the control device. In this manner, if the user 102 is not an authorized user of the control device, the user 102 may be unable to use the control device to control operation of any target device. Additionally or alternatively, in some embodiments, it may be determined whether the user 102 is an authorized user of the control device based on additional authentication information (e.g., authentication information other than biometric information), such as a username and/or password entered by the user 102 of the control device via, for example, a user interface of the control device.
In any event, in some embodiments, the particular movements that the user 102 is to cause to be associated with the control device so as to cause generation of a control signal to control a target device, as further described herein, may be programmed into the control device (e.g., a mobile application, such as that further discussed below, may execute according to such programming) and may be known only to the user 102. In such embodiments, the control device (e.g., the watch 112) may determine the particular movements that the user 102 is to cause to be associated with the control device so as to control the target device when the user 102 is authenticated as an authorized user as described herein, and the response (or lack of response) of the control device to movements of the user 102 in causing generation and transmission of a control signal, etc., may proceed accordingly.
In accordance with the techniques, systems, and methods disclosed herein, insurance coverage may be provided to the user 102 that rewards the user for controlling one or more target devices using a control device (e.g., the watch 112). For example, an insurance premium for coverage provided to the user 102 with respect to a residence may be decreased based on a number of times (and/or an amount of time and/or any other suitable measure(s)) that the user 102 uses the control device (and/or another control device(s), such as the necklace 114) to control operation of a target device that constitutes a smart home device, a home security device or system, and/or other device(s) or system(s) that, when used, reduce a risk of an insured loss (e.g., fire, burglary, flood, etc.) associated with a residence. Additionally or alternatively, in some examples, an insurance premium for coverage with respect to a vehicle may be decreased based on a number of times (and/or an amount of time and/or any other suitable measure(s)) a smart vehicle controller of the vehicle receives indications (e.g., of oncoming EMS vehicles, etc.) generated as a result of use of a control device by, for example, a different user outside of a vehicle being driven by the user 102.
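As a purely illustrative example of the kind of usage-based adjustment described above (and not a prescribed rating formula), the following Python sketch reduces a base premium by a small amount per recorded movement-based control event, up to a cap; the specific values are assumptions for this sketch only.

```python
def adjusted_premium(base_premium: float,
                     control_uses: int,
                     discount_per_use: float = 0.10,
                     max_discount: float = 50.0) -> float:
    """Reduce a base premium by a small amount per recorded movement-based
    control event, capped at a maximum discount (all values illustrative)."""
    discount = min(control_uses * discount_per_use, max_discount)
    return round(base_premium - discount, 2)

# 240 recorded control events against a $1,200 base premium:
print(adjusted_premium(1200.00, 240))  # -> 1176.0
```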
Data indicative of control signals generated by the control device or devices used by the user 102 in, for example, a residential context, the context of a vehicle, etc. may be sent to the insurance provider computing device 108 (e.g., an insurance provider server) and/or stored in the data storage device (or devices) 110. In other examples, any suitable data, such as data from the target device or devices controlled, data indicative of control commands corresponding to the control signals, data indicative of the at least one movement associated with the control device, etc. may additionally or alternatively be sent to the insurance provider computing device 108 and/or stored in the data storage device 110. Such data indicative of control signals and/or other suitable data may be used by an insurance provider associated with (e.g., owning, operating, controlling, and/or otherwise utilizing) the insurance provider computing device 108 in determining insurance incentives, rewards, etc. as further described below.
With continued reference to FIG. 1, example target devices are shown in the system 100, though it should be appreciated from the present disclosure that any suitable target device(s) may be implemented and controlled using the functionality described herein. More particularly, FIG. 1 illustrates the system 100 as including a stove 118 with four burners 120a-120d and four corresponding burner control knobs 122a-122d. The stove 118 may also include or be coupled to a stove communication mode control component 124, which may be or include an electronic switch or other suitable component that may cause the stove 118 to switch between operating in a manually-controlled mode (e.g., operating in response to manual control of the burner control knobs 122a-122d) and operating in a remotely-controlled mode (e.g., operating in response to a control signal generated by the watch 112 or other control device). For example, in the remotely-controlled mode, the stove 118 may operate based on receipt of the control signal via a communicative connection (e.g., a connection via a wireless communication link 126, as further described below).
In some embodiments, the stove 118 may also be retrofitted with a communication control device, such as a stove communication control device 123 which may connect to the manual controls of the stove 118 (e.g., the burner control knobs 122a-122d) in any suitable manner, such as by being coupled to, for example, the burner control knob 122d as shown in FIG. 1. The stove communication control device 123 may facilitate the operation of the stove 118 in the remotely-controlled mode instead of operation of the stove 118 in the manually-controlled mode using the burner control knobs 122a-122d. Existing wiring of the stove 118, such as existing wiring connected to the burner control knobs 122a-122d, may also be rewired to facilitate operation of the stove 118 in the remotely-controlled mode. It will be appreciated in light of the teaching and disclosure herein that any other suitable target device or devices may also be retrofitted in a similar manner.
The wireless communication link 126 may be established by the control device when, for example, the user 102 orients the control device to allow a transceiver thereof (e.g., a wireless transceiver of the watch 112, which as noted above may be a smart watch, and example components of which are described with respect to FIG. 2) to communicate with the target device (e.g., the stove 118) or a component associated with (e.g., coupled to) the target device. In the example of FIG. 1, the wireless communication link 126 is established between the watch 112 and a component associated with the target device, such as the stove communication mode control component 124 that is coupled to the stove 118. The control device may establish the wireless communication link 126 between the control device and the target device (or component coupled to the target device) using Wi-Fi communication, Bluetooth communication, GPS technology, infrared communication, line-of-sight communication, radar, a received signal strength indicator (RSSI), and/or any other suitable technique(s) and/or type(s) of communication.
More particularly, in some embodiments, the control device may establish the wireless communication link 126 in accordance with the Wi-Fi Direct standard without the need for a wireless access point. In other embodiments, the control device may establish the wireless communication link 126 using an additional device such as a wireless access point (not shown). In such embodiments, the wireless communication link 126 may be illustrative of communication occurring between the watch 112 and the stove communication mode control component 124, even though such communication, while shown as direct communication in FIG. 1, may instead be indirect communication via the wireless access point.
In embodiments where the control device establishes the wireless communication link 126 using GPS technology, the control device may, when oriented so that a transceiver thereof communicates with the target device, ping the target device by sending a suitable signal to the target device. The target device may transmit a signal indicative of its GPS coordinates to the control device, and the control device may in some embodiments store an indication of the GPS coordinates of the target device for use in communicating with the target device. For example, the stove communication mode control component 124 may be near another target device, and when the control device is oriented so as to transmit a control signal to the stove communication mode control component 124, the control device may use the GPS coordinates of the stove communication mode control component 124 to establish the wireless communication link 126 between the control device and the stove communication mode control component 124. In this manner, the control device may be configured to communicate with the stove communication mode control component 124 as opposed to communicating with another nearby target device. The control device (e.g., the watch 112) may indicate to the user 102 the target device with which the control device has the wireless communication link 126. For example, a display of the watch 112 may indicate to the user 102 (e.g., via a suitable written and/or graphical message) that the watch 112 has the wireless communication link 126 with the stove communication mode control component 124.
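The following Python sketch illustrates, under the assumption that pinged target devices reply with their GPS coordinates, one simple way a control device could store those coordinates and disambiguate nearby target devices; the identifiers and coordinates are illustrative only.

```python
from typing import Dict, Tuple

# device_id -> (latitude, longitude) reported by each target device in
# response to a ping; purely illustrative data.
KNOWN_TARGETS: Dict[str, Tuple[float, float]] = {}

def record_ping_response(device_id: str, lat: float, lon: float) -> None:
    """Store the GPS coordinates a target device reports after being pinged."""
    KNOWN_TARGETS[device_id] = (lat, lon)

def nearest_target(control_lat: float, control_lon: float) -> str:
    """Pick the stored target closest to the control device's position, as one
    simple way to distinguish between nearby target devices."""
    def squared_distance(coords: Tuple[float, float]) -> float:
        return (coords[0] - control_lat) ** 2 + (coords[1] - control_lon) ** 2
    return min(KNOWN_TARGETS, key=lambda d: squared_distance(KNOWN_TARGETS[d]))

record_ping_response("stove_mode_component_124", 41.8800, -87.6300)
record_ping_response("window_shade_actuator_136", 41.8820, -87.6350)
print(nearest_target(41.8801, -87.6302))  # -> 'stove_mode_component_124'
```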
In some examples, multiple target devices may be indicated (e.g., on the display of the watch 112), such as multiple target devices that are known to be within transmission range of the watch 112 (or other control device) and that have been prepopulated or preprogrammed for display via the watch 112. Examples of additional target devices are further described below. In such embodiments, the user 102 may select from among the multiple target devices via the display and/or another suitable user interface. A different wireless communication link (e.g., other than the wireless communication link 126, as further described herein) between the control device and the selected target device may be established, such as by using the example techniques described herein.
In embodiments where the control device establishes the wireless communication link 126 using Bluetooth technology, infrared technology, and/or line-of-sight technology, the control device may establish the wireless communication link 126 with a target device or a component coupled thereto using any suitable techniques that will be recognized by one of ordinary skill in the art in light of the teaching and disclosure herein. For example, the watch 112 and/or a target device or component coupled thereto may perform any suitable actions necessary for Bluetooth pairing of the watch 112 with the target device or component, any suitable actions necessary to establish an infrared link between the watch 112 and the target device or component, any suitable actions necessary to establish a microwave or other line-of-sight communication link between the watch 112 and the target device or component, etc.
In embodiments where the control device establishes the wireless communication link 126 using radar, the control device may use techniques that will be recognized by one of ordinary skill in the art in light of the teaching and disclosure herein to transmit suitable radio frequency signals and detect radio waves reflected from the target device in order to determine an indication of a location of the target device based on the reflected radio waves. Similarly, in embodiments where the control device establishes the wireless communication link 126 using an RSSI, the control device may use techniques that will be understood by one of ordinary skill in the art in light of the teaching and disclosure herein to determine an indication of a location of the target device based on the RSSI. In any event, in various embodiments, the establishment of the wireless communication link 126 may be completed using one of the other techniques described herein, such as by using Wi-Fi.
In some embodiments, the control device (e.g., the watch 112) may transmit a signal in order to ping a target device or devices, and the target device or devices (or component(s) coupled thereto, as discussed elsewhere herein) may be configured (e.g., by way of suitable computer-executable instructions executable by a processor(s) of the target device or component(s) coupled thereto) to transmit a device identification signal in response to receiving the signal from the control device. Such a device identification signal may include, for example, data indicating the type of the target device, data indicating a location of the target device, data indicating communication capabilities of the target device (e.g., whether the target device has Bluetooth capability), and/or any other suitable data. Such data in the device identification signal may, in some embodiments, be used in conjunction with one or more of the various techniques described above in “pairing” the control device and the target device and/or in establishing the wireless communication link 126.
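By way of illustration, the following Python sketch shows one possible structure for the data carried in such a device identification signal; the field names and example values are assumptions for this sketch and are not a required signal format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DeviceIdentification:
    """Illustrative contents of a device identification signal sent by a
    target device (or a component coupled thereto) in response to a ping."""
    device_type: str                 # e.g., "stove", "window_shade_actuator"
    location: str                    # e.g., a room name or GPS string
    capabilities: List[str] = field(default_factory=list)  # e.g., ["bluetooth", "wifi"]

# Example reply a stove-side component might send:
reply = DeviceIdentification(
    device_type="stove",
    location="kitchen",
    capabilities=["wifi", "bluetooth"],
)
print("bluetooth" in reply.capabilities)  # -> True
```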
In some embodiments, the control device may additionally or alternatively use laser communication, light-emitting diode (LED) to LED communication, and/or any other suitable type(s) of communication to identify a target device, a type of the target device, a location of the target device, communication capabilities of the target device, etc. In some embodiments, the target device may include a component, such as a laser light or LED, that flashes or blinks in response to receiving a device identification signal such as that described above or in response to detecting laser or LED communication (e.g., a laser or LED illuminated at the control device). Such flashing or blinking may assist the user 102 of the control device in establishing the wireless communication link 126 with the particular target device intended by the user 102. In any event, the identified data regarding the target device, type of target device, location of target device, and/or communication capabilities of the target device, etc. may, in some embodiments, be used in conjunction with one or more of the various techniques described above in order to “pair” the control device and the target device and/or in establishing the wireless communication link 126.
In some embodiments, a user interface of the watch 112 or other control device may include an option(s) for the user 102 to verify, disable, and/or cancel the establishment of the wireless communication link 126 with the target device at any suitable time during or after the wireless communication link 126 is established. Furthermore, in some embodiments, the watch 112 or control device may execute a mobile application (“app”) in establishing the wireless communication link 126. An example implementation of the watch 112 or other control device is further described with respect to FIG. 2. Additionally or alternatively, in some embodiments, a target device or component coupled thereto may perform at least some of the actions involved in establishing a communicative connection, such as at least some of the actions described herein that are involved in establishing the wireless communication link 126.
With reference to the discussion above, it may be determined whether the user 102 wearing the control device is an authorized user of the control device (e.g., a user authorized to cause at least one movement to be associated with the control device so as to generate a control signal to control a target device as described herein), such as based on biometric information of the user 102 and/or additional authentication information of the user 102. The watch 112 or other control device may store (e.g., by previous collection of biometric information from an authorized user(s) or other suitable input of such information) biometric information of one or more authorized users as expected biometric information. In various embodiments, such expected biometric information may include a resting heart rate of the user for whom expected biometric information is collected; a body temperature of the user for whom expected biometric information is collected; a heart rate of the user for whom expected biometric information is collected while the user walks; a manner in which the user for whom expected biometric information is collected moves his or her arms while walking; a manner in which the user for whom expected biometric information is collected positions his or her arms when in a seated position; a manner in which the user for whom expected biometric information is collected positions his or her arms while walking; and/or any other suitable biometric information. Such biometric information that may be stored as expected biometric information may be sensed and/or collected by, for example, one or more sensors of the watch 112 or other control device.
Biometric authentication of the user 102 may be performed by the watch 112 or other control device sensing and/or collecting biometric information of the user 102, such as one or more of the example types of biometric information discussed above. It may be determined, such as by a processor of the watch 112, whether the biometric information of the user 102 matches the expected biometric information discussed above. It should be appreciated in light of the teaching and disclosure herein that the expected biometric information as discussed above may include biometric information of multiple users, such as all users authorized to use the watch 112 as the control device. Accordingly, it may be determined that the biometric information of the user 102 matches the expected biometric information if the biometric information of the user 102 matches a portion of the expected biometric information that corresponds to one of the authorized users of the watch 112. In some embodiments, the user 102 may provide identifying information, such as a name, username, and/or password, in order to indicate with respect to which portion of the expected biometric information (i.e., the expected biometric information for which authorized user) the biometric information of the user 102 is to be analyzed to determine whether a match exists.
In some embodiments, the sensed and/or otherwise collected or measured biometric information of the user 102 may be determined to match the expected biometric information if the biometric information of the user 102, and/or one or more individual components (e.g., resting heart rate) of the biometric information of the user 102, is/are within a particular (e.g., predetermined and/or preprogrammed) range of the expected biometric information. The particular range may be a percentage range, for example.
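The following Python sketch illustrates one way such a percentage-range comparison could be performed; the 10% tolerance and the particular biometric components shown are illustrative assumptions only.

```python
def within_percent(measured: float, expected: float, percent: float = 10.0) -> bool:
    """True if a measured biometric value is within +/- percent of the
    expected (stored) value for an authorized user."""
    return abs(measured - expected) <= expected * (percent / 100.0)

def biometrics_match(measured: dict, expected: dict, percent: float = 10.0) -> bool:
    """Require every stored biometric component (e.g., resting heart rate,
    body temperature) to fall within the allowed range."""
    return all(
        key in measured and within_percent(measured[key], value, percent)
        for key, value in expected.items()
    )

expected_profile = {"resting_heart_rate_bpm": 62.0, "body_temperature_c": 36.7}
measured_now = {"resting_heart_rate_bpm": 65.0, "body_temperature_c": 36.9}
print(biometrics_match(measured_now, expected_profile))  # -> True
```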
In some embodiments, each authorized user may wear the watch 112 (or other control device) periodically (e.g., at predetermined or random intervals) in order for the watch 112 to sense and/or collect types of biometric information that constitute the expected biometric information for the authorized user. The expected biometric information for each authorized user may thus be periodically changed and/or updated.
In some embodiments, if it is determined that the biometric information of the user 102 is outside of a “profile” of the expected biometric information (e.g., outside of the particular range with respect to the expected biometric information for the user 102 or, for example, outside all particular ranges with respect to all portions of the expected biometric information for all authorized users), additional authentication may be performed. For example, the user 102 may use functionality of the control device to take a picture of his or her face for analysis, using facial recognition techniques, with respect to stored facial image information of authorized users of the control device in order to determine if the user 102 is an authorized user of the watch 112 or other control device. Additionally or alternatively, the user 102 may provide a fingerprint input, a username and/or password, and/or any other suitable additional authentication information to allow a processor of the watch 112 or other control device to determine if the user 102 is an authorized user of the watch 112 or other control device.
In some embodiments, the user 102 may be prompted to enter additional authentication information such as that described herein after the biometric information of the user 102 is determined to match the expected biometric information. That is, both the biometric information of the user 102 and the additional authentication information of the user 102 may need to match expected biometric information and expected additional authentication information (e.g., a stored username and password combination, etc.) for the user 102 to be able to use functionality of the watch 112 or other control device.
The user 102 may perform an “unlock” operation with respect to the control device, which in various embodiments may be part of authentication of the user 102 based on biometric information (and/or based on additional authentication information) or which may be performed before or after such authentication. In some embodiments, such as those where the unlock operation is performed before or after biometric authentication, the unlock operation may be performed by way of the user 102 providing a particular input (e.g., touching a particular physical or on-screen button of the watch 112) or series of inputs, which may be tactile, spoken, and/or any form of input(s). The particular input or series of inputs that is used to perform the unlock operation may be known only to authorized users, thus providing a further layer of security against unauthorized use of the watch 112 or other control device to control a target device(s).
Where the unlock operation is performed as a part of biometric authentication of the user 102, the unlock operation may be performed and functionality of the watch 112 (or other control device), such as functionality that causes the watch 112 to be operative to generate and transmit a control signal to a target device (e.g., via the wireless communication link 126), may be enabled when the biometric information of the user 102 matches the expected biometric information. In any event, in some embodiments, the watch 112 (or other control device) may not allow performance of the unlock operation and/or may not allow selection of a target device, generation and/or transmission of the control signal, etc. when the watch 112 is not being worn by a person, such as the user 102, and/or when the watch 112 is not worn correctly (e.g., on a wrist). It may be determined whether the watch 112 is being worn using suitable sensors of the watch 112 and/or using a mobile application (such as the same mobile application used in establishing the wireless communication link 126), which sensors may be or include sensors that are the same as and/or different from sensors used to sense and/or collect the biometric information of the user 102.
Additionally or alternatively, in some embodiments, one or more authorized users (e.g., the user 102) may be able to control any target device within range of the watch 112 or other control device, while one or more other authorized users may be able to control one or more, but less than all, target devices within range of the watch 112 or other control device. Such a restriction(s) on control of target devices may be implemented by, for example, the watch 112 (e.g., using a mobile application executing thereon) and/or by one or more of the target devices. For example, the target device(s) may receive, as part of a control signal, an indication of an identity of a user (e.g., a user other than the user 102 who is able to control less than all target devices) who performs one or more movements resulting in generation of the control signal. The target device may store an indication(s) of a user(s) authorized to control the target device, and/or the control signal may further indicate whether the indicated user is authorized to control one or more particular target devices.
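As a non-limiting illustration of such a per-user restriction enforced at the target device, the following Python sketch checks the user identity carried in a received control signal against a stored authorization table; the device and user identifiers are assumptions for this sketch.

```python
# Illustrative per-target authorization table: which users may control
# which target devices. The identifiers are assumptions for this sketch.
AUTHORIZED_USERS = {
    "light_fixture_146": {"user_102", "user_guest"},
    "stove_118": {"user_102"},  # only user 102 may control the stove
}

def is_authorized(target_device_id: str, user_id: str) -> bool:
    """Check, at the target device (or a component coupled thereto), whether
    the user identified in a received control signal may control it."""
    return user_id in AUTHORIZED_USERS.get(target_device_id, set())

print(is_authorized("stove_118", "user_guest"))          # -> False
print(is_authorized("light_fixture_146", "user_guest"))  # -> True
```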
With continued reference to FIG. 1, additional example target devices may include window shades of a first window 128 and a second window 130. In particular, the first window 128 may have a first window shade 132 associated therewith and the second window 130 may have a second window shade 134 associated therewith. The first window shade 132 may have a first window shade actuator 136 coupled thereto, and the second window shade 134 may have a second window shade actuator 138 coupled thereto. Each of the first window shade actuator 136 and the second window shade actuator 138 may be any suitable type of actuator such as, for example, an actuator with suitable electronics for receiving a control signal generated by the control device as described herein and causing the respective window shade (i.e., the first window shade 132 or the second window shade 134) to operate based on the received control signal. In this manner, a device that is not equipped with, for example, an electronic switch to allow receipt of a signal to cause the device to operate in a remotely-controlled mode (as further described herein) or a device that is not otherwise “smart” and configured to operate in response to the control signal generated by a control device may be configured with “smart” functionality for movement-based (e.g., gesture-based) control using, for example, an actuator as described herein.
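The following Python sketch illustrates, in a simplified form, how a target-side component such as a window shade actuator could receive a control signal, determine the corresponding control command, and execute it; the signal format and the 25-percent movement step are assumptions for this sketch.

```python
import json

class WindowShadeActuator:
    """Minimal sketch of a target-side component (such as the first window
    shade actuator 136) that executes commands carried in a control signal."""

    def __init__(self) -> None:
        self.position = 0  # 0 = fully lowered, 100 = fully raised

    def handle_control_signal(self, raw_signal: bytes) -> int:
        """Decode the control signal, determine the control command, and
        execute it by moving the shade; return the new position."""
        payload = json.loads(raw_signal.decode("utf-8"))
        if payload.get("command") == "move_shade":
            step = 25 if payload.get("value") == "up" else -25
            self.position = max(0, min(100, self.position + step))
        return self.position

actuator = WindowShadeActuator()
signal = json.dumps({"target": "window_shade_actuator_136",
                     "command": "move_shade", "value": "up"}).encode("utf-8")
print(actuator.handle_control_signal(signal))  # -> 25
```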
It will be appreciated in light of the teaching and disclosure herein that any suitable device may be configured with “smart” functionality for movement-based control by a remotely-located control device, and that the first window shade 132 and the second window shade 134 are merely examples. Among other examples, while not shown as such in FIG. 1, the first window 128 and/or the second window 130 may themselves be target devices. Suitable components such as actuators similar to the first window shade actuator 136 and/or the second window shade actuator 138 may be coupled to the first window 128 and/or the second window 130 to facilitate movement-based control of the first window 128 and/or the second window 130.
In any event, FIG. 1 shows an additional communication link 140, such as an additional wireless communication link (though it will be understood from the present disclosure that the wireless communication link 126 need not be established in various examples) between the watch 112 and the first window shade actuator 136. The additional communication link 140 may be established in one or more of the manners similar to those described with respect to establishment of the wireless communication link 126. It will be further understood from the present disclosure that other communication links may be established between the watch 112 (or other control device, such as the necklace 114 or camera 116 attached thereto) and another target device or devices or component(s) coupled thereto. For example, another communication link (not shown) may be established between the watch 112 and the second window shade actuator 138.
Referring now to further examples of target devices, FIG. 1 shows a door 142 with a door actuator 144 coupled thereto. The door actuator 144 may be any suitable type of actuator, such as described above with respect to the first and second window shade actuators 136 and 138. Moreover, it will be appreciated in light of the teaching and disclosure herein that the door 142, and/or any other devices that may be coupled to actuators as described herein, may be a manually-operated door (or other device(s)) that is retrofitted with an actuator (e.g., the door actuator 144) or may be manufactured to connect to an actuator (e.g., the door actuator 144). Additionally or alternatively, the door actuator 144 and/or any other actuators described herein may be manufactured with, for example, one or more suitable processors and/or switching components so as to receive control signals from the watch 112 or other control device and cause corresponding control of the target device to which such an actuator(s) is coupled. In some embodiments, one or more processors and/or electronic switches, for example, that may be associated with an actuator may constitute or may be included in a communication control device instead of within or as part of the actuator. For example, such a communication control device may be a door communication control device 145, which may be coupled to the door actuator 144 as shown in FIG. 1.
As will further be appreciated from the teaching and disclosure herein, while the first and second window shade actuators 136 and 138 and the door actuator 144 are shown coupled to their respective target devices, any suitable components may be configured to receive a control signal from a control device and may be coupled to respective target devices. With respect to the door actuator 144, a communication link (not shown) may be established, such as in one or more of the manners described with respect to establishment of the wireless communication link 126, between the watch 112 (or other control device) and the door actuator 144 to facilitate the movement-based device control described herein.
FIG. 1 also illustrates a light fixture 146 that may communicate with the watch 112 (or other control device) as described herein. For example, the user 102 (e.g., after unlocking the watch 112 and/or being authenticated) may make a flicking motion up or down with his or her finger to cause a movement to be associated with the watch 112 that results in generation of a control signal which, when received by the light fixture 146, turns the light fixture on or off. In other embodiments, the user 102 may perform other suitable gestures, such as raising his or her hand to brighten or dim the lighting provided by the light fixture 146. FIG. 1 further shows a light switch panel including an on/off light switch 148 and a dimmer switch with a dimmer 150. In one embodiment, each of the on/off light switch 148 and the dimmer 150 may be configured to receive control signals from the watch 112 based on a gesture or gestures performed by the user 102. The on/off light switch 148 may, in response to a control signal or signals received, control the on/off status of a light (not shown, or in some embodiments the light fixture 146). The dimmer 150 may, in response to a control signal or signals received, control the brightness of a light (not shown, or in some embodiments the brightness of lighting provided by the light fixture 146). Additional examples of target devices include smart meters, such as thermostats, and FIG. 1 shows an example smart thermostat 151. The smart thermostat 151 may, in response to a control signal or signals received, adjust a temperature of one or more rooms and/or areas of a residence, adjust time(s) at which one or more rooms and/or areas of a residence have particular temperature settings, and/or perform any other suitable or desired actions. It should be understood in light of the teaching and disclosure herein that while FIG. 1 shows example target devices, as noted above, any suitable target device or devices may be controlled using the techniques described herein.
Furthermore, with regard to target devices capable of operating in a manually-controlled mode or a remotely-controlled mode (e.g., the stove 118), the control device may send a mode change signal to such target devices to cause such target devices to change between operating modes. The mode change signal may be any suitable signal generated by, for example, the watch 112 which, when received by a target device or a component coupled thereto, causes the target device to switch from operating in the manually-controlled mode to operating in the remotely-controlled mode or vice versa. The component coupled to a target device that may receive the mode change signal, such as the stove communication mode control component 124, may be or may include an electronic switch that is activated in response to the mode change signal in order to change the operating mode of the target device coupled thereto.
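By way of illustration, the following Python sketch models a mode control component that toggles its target device between the manually-controlled and remotely-controlled modes upon receiving a mode change signal; the class and mode names are assumptions for this sketch.

```python
class ModeControlComponent:
    """Sketch of a component (such as the stove communication mode control
    component 124) that toggles its target device between a manually-controlled
    mode and a remotely-controlled mode."""

    MANUAL = "manually_controlled"
    REMOTE = "remotely_controlled"

    def __init__(self) -> None:
        self.mode = self.MANUAL

    def handle_mode_change_signal(self) -> str:
        """Flip the operating mode when a mode change signal is received."""
        self.mode = self.REMOTE if self.mode == self.MANUAL else self.MANUAL
        return self.mode

component = ModeControlComponent()
print(component.handle_mode_change_signal())  # -> 'remotely_controlled'
print(component.handle_mode_change_signal())  # -> 'manually_controlled'
```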
FIG. 1 still further shows a traffic light 152 and a roadside sign 154. One or more of the traffic light 152 or the roadside sign 154 may be a roadside infrastructure component(s) equipped with “smart” functionality, such as functionality that allows receipt and processing of a control signal transmitted by a control device. For example, the roadside sign 154 shows an example message of “CROSSING” which may be used to indicate a road crossing. For example, a pedestrian (who may be the user 102); a crossing guard (who may alternatively be the user 102, or who may be a different user in some examples); and/or any suitable person, such as a driver of a vehicle, may perform one or more movements while wearing the watch 112 or other control device to cause display of the “CROSSING” message to warn oncoming vehicles and/or drivers of a pedestrian crossing. In some embodiments, such a movement(s) by the same or a different person may additionally or alternatively cause changing of the traffic light 152 to red to provide an indication to traffic to stop for the pedestrian crossing. In other embodiments, a movement(s) by a person such as the user 102 (e.g., as a driver) may cause the traffic light 152 to change color in another manner, such as from red to green. Still further, the roadside sign 154 may be configured to receive and process a control signal from a control device that may cause a different message to be displayed aside from “CROSSING,” such as a message regarding estimated transit times to various local destinations from the position of the roadside sign 154, or in some examples no message.
FIG. 1 further shows a first vehicle 158 and a second vehicle 160 traveling on a road 162. The first vehicle 158 may be equipped with a first on-board computer and/or smart vehicle controller 164 (at times referred to herein as a “first smart vehicle controller 164” for ease of explanation), and the second vehicle 160 may be equipped with a second on-board computer and/or smart vehicle controller 166 (at times referred to herein as a “second smart vehicle controller 166” for ease of explanation). In some embodiments, one or both of the first vehicle 158 and the second vehicle 160 may be an autonomous or semi-autonomous vehicle. In some embodiments, the control device used to, among other things, generate and transmit a control signal to the traffic light 152 and/or the roadside sign 154 may be or include one or more of (i) the first vehicle 158 (or one or more components thereof, such as one or more suitable wearable control devices or other computing devices therein); (ii) the second vehicle 160 (or one or more components thereof, such as one or more suitable wearable control devices or other computing devices therein); (iii) the first smart vehicle controller 164; and/or (iv) the second smart vehicle controller 166. As an example, FIG. 1 illustrates a communication link 168 between the first smart vehicle controller 164 and the traffic light 152, over which the first smart vehicle controller 164 (e.g., acting as a control device) may transmit a control signal to the traffic light 152.
As another example, a driver of a vehicle (e.g., the user 102 as the driver of the first vehicle 158) may perform one or more suitable movements to cause the watch 112 or other control device to transmit a control signal to the first smart vehicle controller 164 to control one or more functions of the first vehicle 158. Control of one or more functions of a vehicle such as the first vehicle 158 in this manner may provide various advantages, such as allowing the driver of the first vehicle 158 to control functions of the first vehicle 158 that would otherwise have been controlled using voice commands when use of voice commands is not desired (e.g., because a child or other passenger within the first vehicle 158 is resting or sleeping).
Additionally or alternatively, in some embodiments, the first vehicle 158 (or component(s) thereof), the second vehicle 160 (or component(s) thereof), the first smart vehicle controller 164, and/or the second smart vehicle controller 166 may act as a control device(s) that generate and transmit a control signal(s) to the remote computing device 106 (e.g., via the network 104 as shown in FIG. 1), the first vehicle 158, the second vehicle 160, the first smart vehicle controller 164, the second smart vehicle controller 166, and/or any other suitable components and/or computing devices. Such functionality may allow various components such as those shown in FIG. 1 to provide messages and/or other suitable indications to other components such as those shown in FIG. 1 in the context of vehicles, such as an indication that a pedestrian(s) is/are crossing the road 162 via a crosswalk 170.
FIG. 2 illustrates a block diagram of an exemplary computing device 200. The computing device 200 may be an implementation of one of the computing devices shown and described with respect to FIG. 1, and more than one of the computing devices shown and described with respect to FIG. 1 may be implemented in accordance with the computing device 200. For example, the remote computing device 106; the insurance provider computing device 108; the watch 112; the sensor 117; the stove communication control device 123; the stove communication mode control component 124; the first and second window shade actuators 136 and 138; the door actuator 144; the door communication control device 145; the light fixture 146; the on/off light switch 148; the dimmer 150; the smart thermostat 151; the traffic light 152; the roadside sign 154; the first smart vehicle controller 164; the second smart vehicle controller 166; and/or any other suitable components such as those shown in FIG. 1 may be implemented in accordance with, or may include components implemented in accordance with, the computing device 200.
The computing device 200 (and thus any of the aforementioned components or other suitable components that may be implemented in accordance with the computing device 200) may include a controller 202, a display 204, a communication unit 206, a GPS unit 208, one or more sensors 210 (e.g., a heart rate sensor and/or other biometric sensors, a motion sensor(s), a gyroscope(s), and/or any other suitable sensor(s)), and a camera 212. The display 204 may provide output to and/or receive input from, for example, the user 102 regarding, for example, establishment of a communication link, target devices to select from, authentication processes, etc. The communication unit 206 may communicate with other computing devices, such as other devices shown in FIG. 1 (e.g., target devices or components coupled thereto, the watch 112 or other control device, the network 104 in order to communicate with the remote computing device 106, the insurance provider computing device 108, and/or the data storage device(s) 110, etc.). In various embodiments, the communication unit 206 may communicate via Wi-Fi, WiMAX, Bluetooth, infrared, microwave, and/or other suitable communication techniques. The communication unit 206 may also or alternatively be used in radar applications, such as described above with respect to establishment of the wireless communication link 126. The communication unit 206 may in some embodiments provide input signals to the controller 202, and/or may transmit sensor data from the sensor(s) 210, GPS data from the GPS unit 208, image and/or video data, etc. from the camera 212, and/or any other suitable data to one or more other devices, such as one of the example devices shown in FIG. 1.
The controller 202 may include a program memory 214, a processor 216 such as a microcontroller or a microprocessor, a random-access memory (RAM) 218, an address/data bus 220, and an input/output (I/O) circuit 222. The address/data bus 220 may connect the program memory 214, the processor 216, the RAM 218, and the I/O circuit 222. In some embodiments, the communication unit 206, while illustrated separately from the I/O circuit 222, may be integral with the I/O circuit 222 so that the I/O circuit 222 may provide the functionality described above with respect to the communication unit 206. The program memory 214 may include an operating system 224, data storage 226, and/or one or more software applications (e.g., mobile applications), which software applications are shown in the example of FIG. 2 as software applications 228A, 228B, and 228C. The operating system 224 may include one or more general purpose and/or mobile platforms, such as the Android™, iOS®, or Windows® systems, developed by Google Inc., Apple Inc., and Microsoft Corporation, respectively. In some embodiments, the operating system 224 may be a custom operating system designed for the computing device 200. The data storage 226 may include data such as expected biometric information, expected additional authentication information, indications of particular movements for one or more particular users (e.g., movements known only to such a particular user(s) and thus resulting in control of a target device only when such a particular user(s) performs such movements), and/or any other suitable data in accordance with the functionality described herein. One or more of the software applications 228A-228C may be used in performing various operations for implementing the techniques and functionality described herein.
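As a purely illustrative sketch (with all field names assumed), the per-user data described above as residing in the data storage 226 might be organized along the following lines, associating expected authentication information and user-specific movements with control commands.

```python
# Hypothetical sketch of per-user records of the kind the data storage 226
# might hold: expected authentication values plus movements known only to a
# particular user, each mapped to a control command. Field names are assumed.
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class UserProfile:
    user_id: str
    expected_biometric_hash: str   # e.g., hash of an enrolled biometric template
    expected_password_hash: str    # additional authentication information
    movement_to_command: Dict[str, str] = field(default_factory=dict)


profiles: Dict[str, UserProfile] = {
    "user102": UserProfile(
        user_id="user102",
        expected_biometric_hash="placeholder-biometric-hash",
        expected_password_hash="placeholder-password-hash",
        movement_to_command={"double_flick": "light_fixture_146:toggle_power"},
    )
}


def lookup_command(user_id: str, movement: str) -> Optional[str]:
    """Return the command for a movement only if it is registered to this user."""
    profile = profiles.get(user_id)
    return profile.movement_to_command.get(movement) if profile else None


print(lookup_command("user102", "double_flick"))
```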
It should be appreciated that although FIG. 2 depicts only one processor 216, the controller 202 may include multiple processors 216. Additionally, although FIG. 2 depicts the I/O circuit 222 as a single block, the I/O circuit 222 may include a number of different types of I/O circuits (not depicted). The program memory 214, the RAM 218, and the data storage 226 may be implemented in any known form of non-transitory computer readable storage media, including but not limited to semiconductor memories, magnetically readable memories, and/or optically readable memories. It should also be appreciated that the example computing device 200 may include additional, fewer, or alternate components.
With reference to FIGS. 1 and 2, the network 104 may be or may include a network of the insurance provider (e.g., a network provided or used by the insurance provider, or one over which the insurance provider otherwise controls or facilitates communications). In various embodiments, processors of the devices communicatively coupled to the network 104 may execute instructions to transmit data to, receive data from, or otherwise communicate with other ones of the devices communicatively coupled to the network 104 (e.g., via a communication unit such as the communication unit 206, which as noted above may be integral with the I/O circuit 222). In various embodiments, such communication may include, but not be limited to, transmitting and/or receiving data regarding a control signal or signals generated by the watch 112 or other control device (e.g., via the communication unit 206), data regarding usage of the watch 112 or other control device to control a target device(s), etc. The network 104 may be or may include a network such as the Internet and/or any other type of suitable network (e.g., a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a mobile network, a wired or wireless network, a private network, a virtual private network, etc.). The network 104 may also or alternatively be or include one or more cellular networks such as a CDMA (code division multiple access) network, GSM (Global System for Mobile Communications) network, WiMAX (Worldwide Interoperability for Microwave Access) network, LTE (Long Term Evolution) network, etc.
FIG. 3 illustrates a flow diagram of an exemplary method 300 for movement-based device control, such as control of one of the target devices described with respect to FIG. 1 using a control device such as the watch 112 (or the necklace 114 and/or the camera 116). At least a portion of the method 300 (and at least a portion of each of the exemplary methods described herein) may be performed by and/or using components of the system 100. As such, the method 300 (and each of the exemplary methods described herein) is described herein with respect to the system 100 for ease of explanation, and not by way of limitation. It will be appreciated in view of the teaching and disclosure herein that at least some portions of the method 300 (and at least some portions of each of the exemplary methods described herein) may be performed by or using another system(s) and/or by or using one or more components not shown in FIG. 1.
The method 300 may include receiving, using a processor (or processors) associated with a target device, a control signal corresponding to at least one movement associated with a control device (e.g., the watch 112) (block 302). With reference to the discussion above, an example implementation of a processor or processors associated with a target device is further described with respect to FIG. 2. The target device may be any suitable target device, such as, but not limited to, a target device described with respect to FIG. 1. For example, the target device may be the stove 118, and the processor associated with the target device may be a processor of the stove communication mode control component 124. As another example, the target device may be the first window shade 132, and the processor associated with the target device may be a processor of the first window shade actuator 136. As yet another example, the target device may be the on/off light switch 148. The received control signal corresponding to the at least one movement associated with the watch 112 (or other control device, with the control device being referred to at times herein as the watch 112 for ease of explanation) may be a control signal generated based on the at least one movement associated with the watch 112, as further described herein.
The method 300 may also include processing, using the processor associated with the target device, the control signal to determine at least one indication of the at least one movement associated with the watch 112 (block 304). With reference to the discussion above, the at least one movement associated with the watch 112 may be, depending upon the implementation of the method 300 and/or the type(s) of the at least one movement associated with the watch 112, either the actual movement(s) made by the user 102 or movement(s) of the watch 112 that result from the actual movement(s) made by the user 102 (such as in the case of a flicking motion made by the user 102). The control signal may, for example, encode or otherwise provide the at least one indication of the at least one movement associated with the watch 112 in any suitable manner. Thus, it will be appreciated from the disclosure herein that when the user 102 performs a movement that corresponds to a particular control command for a particular manner of control of a target device (e.g., with such a correspondence being known only to the user 102 (and in some cases to the watch 112 and/or the target device, as discussed below) and not to other users), the control signal may be generated by the watch 112 and may encode the movement of the user 102. In some embodiments, the watch 112 may generate the control signal encoding the movement of the user 102 only after authentication of (and thus determination of the identity of) the user 102, and only when the authenticated user 102 performs a movement that, based on data stored in memory of the watch 112, is a movement specifically corresponding to the user 102 (e.g., known by the user 102 and not by other users) and thus corresponding to a particular control command for the user 102. The same movement may, in some examples, not correspond to a particular control command for a different user, and thus in some embodiments, if a user other than the user 102 is determined to be wearing the watch 112, the watch 112 may not generate the control signal encoding the movement of the other user when the other user performs the same movement.
The method 300 may further include determining, using the processor associated with the target device based on the at least one indication of the at least one movement associated with the watch 112, at least one control command associated with the target device (block 306). For example, the processor associated with the target device may be a processor of the first window shade actuator 136. The processor associated with the target device may perform suitable processing based on the at least one indication of the at least one movement associated with the watch 112 to determine that the at least one indication of the at least one movement corresponds to a control command for a particular action to be performed by, on, or otherwise with respect to the target device. For example, an indication of the aforementioned correspondence between a movement associated with the watch 112 when worn by the user 102 and a particular control command may be stored in a memory of the target device (or in a memory of a component coupled to the target device, such as a memory of the first window shade actuator 136). In the case of the processor associated with the target device being a processor of the first window shade actuator 136, the processor of the first window shade actuator 136 may, for example, determine that the at least one indication of the at least one movement associated with the watch 112 corresponds to a control command to raise or lower the first window shade 132.
The method 300 may further include causing, using the processor associated with the target device, the at least one control command determined as described with respect to block 306 to be executed so that the target device is controlled according to the at least one control command (block 308). For example, performance of the actions described with respect to block 306 may, as noted above, result in a determination that the at least one indication of the at least one movement associated with the watch 112 corresponds to a control command to raise or lower the first window shade 132. In this example, the processor associated with the target device (e.g., a processor of the first window shade actuator 136) may cause the control command to raise or lower the first window shade 132 to be executed. For example, the processor of the first window shade actuator 136 may cause the actuator to engage the first window shade 132 so that the first window shade 132 is raised or lowered in accordance with the control command.
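A minimal, non-limiting sketch of the target-device-side flow of blocks 302-308 follows; the message format, movement names, and the execute() stand-in for an actuator such as the first window shade actuator 136 are all assumptions made for illustration, not a required implementation.

```python
# Illustrative sketch of method 300 on the target-device side: receive a
# control signal, decode the movement indication, look up the corresponding
# control command, and cause it to be executed.
import json

# Assumed movement-to-command table held in the target device's memory.
MOVEMENT_TO_COMMAND = {
    "swipe_up": "raise_shade",
    "swipe_down": "lower_shade",
}


def execute(command: str) -> None:
    # Stand-in for driving the actuator hardware.
    print(f"executing command: {command}")


def handle_control_signal(raw_signal: bytes) -> None:
    # Block 302: receive the control signal (here, a JSON payload over any link).
    payload = json.loads(raw_signal)

    # Block 304: determine the indication of the movement associated with the watch.
    movement = payload["movement"]

    # Block 306: map the movement indication to a control command, if one is registered.
    command = MOVEMENT_TO_COMMAND.get(movement)
    if command is None:
        return  # unrecognized movement; ignore

    # Block 308: cause the command to be executed.
    execute(command)


handle_control_signal(b'{"movement": "swipe_up"}')
```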
While the description of the method 300 has included references to particular target devices, particular components coupled thereto, and particular manners of controlling such target devices (e.g., the first window shade 132, the first window shade actuator 136, and raising or lowering the first window shade 132), it should be understood that the method 300 may be performed with respect to any suitable target devices, components coupled thereto, and/or particular manners of control. For example, the method 300 may also or alternatively be performed with respect to control of the stove 118, the second window shade 134, the door 142, the light fixture 146, the on/off light switch 148, the dimmer 150, and/or any other suitable target device including a suitable target device(s) not shown in FIG. 1.
FIG. 4 illustrates a flow diagram of another exemplary method 400 for movement-based device control, such as control of one of the target devices described with respect to FIG. 1 using a control device such as the watch 112 (or the necklace 114 and/or the camera 116). As will be appreciated from the teaching and disclosure herein, aspects of the method 400 may correspond to a more detailed implementation(s) of aspects of the method 300, and may include additional actions relative to those described with respect to the method 300.
The method 400 may include authenticating a user of the control device, such as the user 102 of the watch 112 (block 402). For example, the user 102 may be authenticated based on biometric information and/or additional authentication information as described above in order to confirm that the user 102 is an authorized user of the watch 112.
The method 400 may include receiving, using the processor associated with the target device, the control signal corresponding to the at least one movement associated with the watch 112 when the user 102 executes at least one movement while the user 102 is wearing the watch 112 (block 404). As noted above, the movement(s) executed by the user 102 may be the movement(s) associated with the watch 112 in some embodiments, but in other embodiments the movement(s) associated with the watch 112 may differ from the movement(s) executed by the user 102, such as when the user executes a flicking gesture or gestures. It will be appreciated in light of the teaching and disclosure herein that the actions described with respect to block 404 may be a more particular implementation of the actions described with respect to block 302.
The method 400 may also include performing the actions described with respect to blocks 304 and 306 to, generally speaking, process the control signal to determine the indication(s) of the movement(s) associated with the watch 112 (block 304); and determine, based on the indication(s) of the movement(s) associated with the watch 112, the control command(s) associated with the target device (block 306).
The method 400 may additionally include causing the at least one control command associated with the target device to be executed so that the target device is controlled according to the at least one control command by changing an operational state of the target device, increasing and/or decreasing a setting associated with the target device, and/or moving the target device in a first direction and/or a second direction (e.g., opposite the first direction) (block 406). It will be appreciated in light of the teaching and disclosure herein that the actions described with respect to block 406 may be a more particular implementation of the actions described with respect to block 308.
In some embodiments, changing the operational state of the target device may include turning the target device on or off, such as when the target device is, for instance, the light fixture 146 or any other suitable target device that may be in an “on” or “off” state. Increasing a setting associated with the target device may include increasing a burner setting, such as on the stove 118; increasing an amount of light by way of causing the dimmer 150 to move up; and/or any other suitable increase in a setting of a target device. Decreasing the setting (or any other setting, it being understood that the control command(s) may include commands to increase one setting and decrease another setting, for example) may include decreasing the burner setting; decreasing an amount of light by causing the dimmer 150 to move down; and/or any other suitable decrease in a setting of a target device. Moving the target device in a first direction may include, for example, causing the first window shade 132 to move up (e.g., retract), and/or may include any other suitable movement of a target device in a first direction. Moving the target device in a second direction may include, for example, causing the first window shade 132 to move down (e.g., extend), and/or may include any other suitable movement of a target device in a second direction. The second direction may be a direction opposite the first direction or may be any other suitable direction.
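Purely by way of illustration, the three broad categories of control described with respect to block 406 (changing an operational state, adjusting a setting, and moving in a first or second direction) could be dispatched as in the following sketch; the command fields and state keys are invented for this example.

```python
# Illustrative dispatcher for the three command categories of block 406.
# All keys and values are hypothetical.
def apply_command(device_state: dict, command: dict) -> dict:
    kind = command["kind"]
    if kind == "set_state":            # e.g., turn a light fixture on or off
        device_state["power"] = command["value"]
    elif kind == "adjust_setting":     # e.g., dimmer level or burner setting
        name, delta = command["setting"], command["delta"]
        device_state[name] = device_state.get(name, 0) + delta
    elif kind == "move":               # e.g., raise or lower a window shade
        device_state["position"] = command["direction"]
    return device_state


state = {"power": "off"}
state = apply_command(state, {"kind": "set_state", "value": "on"})
state = apply_command(state, {"kind": "adjust_setting", "setting": "brightness", "delta": +10})
state = apply_command(state, {"kind": "move", "direction": "up"})
print(state)
```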
The method 400 may further include sending, to one or more insurance provider computing devices (e.g., the insurance provider computing device 108, which may be an insurance provider server), data indicative of the control signal so as to cause the one or more insurance provider computing devices to generate an indication of an insurance discount, an insurance reward, an insurance incentive, and/or an insurance premium adjustment (block 408). The indication of the discount, reward, incentive, and/or premium adjustment may be generated based on the data indicative of the control signal. More particularly, and with reference to the discussion above, insurance coverage may be provided to the user 102 that rewards the user 102 for controlling one or more target devices as described herein. For example, the indication described with respect to block 408 may be an indication of a discount (e.g., a one-time reduction in premium or other suitable discount), a reward (e.g., an affinity offer or other suitable reward), an incentive (e.g., an indication of a further number of times and/or amount of time the user 102 needs to use the functionality described herein to control a target device(s) in order to receive a discount, reward, and/or premium adjustment, or another suitable incentive), and/or a premium adjustment (e.g., a premium reduction) provided based on use of the watch 112 or other control device to generate control signals.
More specifically, in some embodiments, the indication of the discount, reward, incentive, and/or premium adjustment may be provided based on a number of times, amount of time, etc. that the user 102 uses the watch 112 (or other control device) to perform control of a target device as described herein. Such an indication of a discount, reward, incentive, and/or premium adjustment may be provided because, in some situations, the insurance provider may determine that increased use of the device control techniques described herein reduces a risk of fire, burglary, etc., such as by regulating when lights are on, when doors are closed and/or locked, when temperatures are increased and/or decreased, when a home security system (not shown) is engaged, etc. Accordingly, as described with respect to block 408, data indicative of a control signal or signals generated by the control device may be sent to the insurance provider computing device(s) so as to allow the insurance provider associated therewith to determine a discount(s), reward(s), incentive(s), and/or premium adjustment(s) to offer or provide to the user 102.
FIG. 5 illustrates a flow diagram of yet another exemplary method 500 for movement-based device control, such as control of one of the target devices described with respect to FIG. 1 using a control device such as the watch 112 (or the necklace 114 and/or the camera 116). The method 500 may include determining, using one or more processors associated with a control device (e.g., a processor(s) of the watch 112, the necklace 114, or the camera 116), an indication of a target device (block 502). For example, with reference to the discussion elsewhere herein, target devices with which the watch 112 (or other control device) may communicate for movement-based control purposes may be determined and indicated (e.g., selected) via a user interface of the watch 112. Additionally or alternatively, the watch 112 may determine the indication of the target device (e.g., an identity of the target device, a location of the target device, communication capabilities of the target device, control commands that may be received and processed by the target device (which may depend on the type of the target device), etc.) as part of the establishment of a communication link, such as the wireless communication link 126, with the target device.
The method 500 may also include detecting, using the processor(s) associated with the watch 112 (or other control device), at least one movement associated with the watch 112 (or other control device), with the at least one movement corresponding to at least one control command associated with the target device (block 504). For example, the processor(s) associated with the watch 112 may detect a movement(s) made by the user 102 using suitable sensors, and the detected movement(s) may be the at least one movement corresponding to the at least one control command. In other examples, the detected movement(s) may correspond to, but may not themselves be, the at least one movement corresponding to the at least one control command (e.g., in the case of the user 102 making a flicking motion and the movement associated with the watch 112 being a manner in which the watch 112 is actually moved as a result of the user 102 making the flicking motion, as discussed above). As further described above, a correspondence between various movements associated with the watch 112 and caused by a particular user (such as the user 102) and resulting control commands (and control signals) may be known to the particular user (e.g., the user 102) and not to other users.
The method 500 may further include generating, using the processor(s) associated with the watch 112, a control signal indicative of the at least one control command associated with the target device based on the at least one movement associated with the watch 112 (or other control device) (block 506). With reference to the discussion above with respect to block 504, the at least one movement associated with the watch 112 may correspond to one or more control commands, and such correspondence may be particular to the user 102 in some embodiments. The actions described with respect to block 506 may include generating the control signal indicative of the control command(s) by generating the control signal so that the control signal encodes or otherwise suitably indicates the control command(s).
The method 500 may additionally include transmitting, using the processor(s) associated with the watch 112 (or other control device), the control signal to the target device based on the indication of the target device so that the target device is controlled according to the at least one control command (block 508). For example, when the indicated target device (e.g., selected target device and/or target device in communication with the watch 112) is the stove 118, the control signal may be transmitted to the stove communication mode control component 124 via the wireless communication link 126. The control signal may control operation of one or more of the burners 120a-120d, such as via electronic processing of the control signal by the stove communication mode control component 124 and electronic control of the one or more of the burners 120a-120d by the stove communication mode control component 124, instead of conventional manual control of the one or more of the burners 120a-120d by operation of one or more of the burner control knobs 122a-122d. With reference to the discussion elsewhere herein, it will be appreciated from the present disclosure that a control signal as described herein may be transmitted to any suitable target device so that the target device is controlled according to at least one control command.
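The control-device-side flow of blocks 502-508 might be sketched as follows, purely for illustration; the sensor, transport, and mapping helpers are placeholders rather than an implementation of any particular watch or communication link.

```python
# Illustrative sketch of method 500 on the control-device side: detect a
# movement, generate a control signal indicating the corresponding command,
# and transmit it to the indicated target device. Names are hypothetical.
import json

# Assumed user-specific mapping; the target is taken as already determined
# (block 502), e.g. via a user interface selection or link establishment.
USER_MOVEMENT_TO_COMMAND = {
    "twist_wrist": {"target": "stove_118", "command": "burner_120a_off"},
}


def detect_movement() -> str:
    # Placeholder for motion-sensor/gyroscope processing on the control device.
    return "twist_wrist"


def transmit(target: str, signal: bytes) -> None:
    # Placeholder for the wireless link (e.g., Bluetooth or Wi-Fi) to the target.
    print(f"to {target}: {signal!r}")


def control_cycle() -> None:
    movement = detect_movement()                                  # block 504
    entry = USER_MOVEMENT_TO_COMMAND.get(movement)
    if entry is None:
        return                                                    # no registered command
    signal = json.dumps({"command": entry["command"]}).encode()   # block 506
    transmit(entry["target"], signal)                             # block 508


control_cycle()
```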
FIG. 6 illustrates a flow diagram of another exemplary method 600 for movement-based device control, such as control of one of the target devices described with respect to FIG. 1 using a control device such as the watch 112 (or the necklace 114 and/or the camera 116). As will be appreciated from the teaching and disclosure herein, aspects of the method 600 may correspond to a more detailed implementation(s) of aspects of the method 500, and may include additional actions relative to those described with respect to the method 500.
The method 600 may include executing, using the processor(s) associated with the watch 112 (or other control device), a mobile application (block 602). The mobile application, when executed by the processor(s) of the watch 112, may provide functionality such as that described in greater detail above. For example, the mobile application may be utilized in establishing one or more communication links between the watch 112 (or other control device) and a target device(s), such as the wireless communication link 126.
The method 600 may include establishing a communicative connection between the watch 112 (or other control device) and a target device using GPS technology, Wi-Fi communication, Bluetooth communication, infrared communication, line-of-sight communication, radar, and/or an RSSI (block 604). Establishment of such a communicative connection (e.g., the wireless communication link 126) in one or more of the foregoing example manners may be performed as described in detail above with respect to FIG. 1. It will be appreciated in light of the teaching and disclosure herein that the actions described with respect to block 602 and/or block 604 may be a more particular implementation(s) of the actions described with respect to block 502.
The method 600 may include determining, using the processor(s) associated with the watch 112 (or other control device), (i) biometric information of the user 102 of the watch 112 while the user 102 is wearing the watch 112, and/or (ii) additional authentication information of the user 102 of the watch 112 (e.g., a username and/or password, etc.) (block 606). The determination of the biometric information of the user 102 while the user 102 is wearing the watch 112, and/or the determination of additional authentication information of the user 102, may be performed in one of the example manners described with respect to FIG. 1. For example, the biometric information of the user 102 may be sensed and/or otherwise collected or determined using one or more sensors of the watch 112.
The method 600 may include determining, using the processor(s) associated with the watch 112 (or other control device), whether the biometric information of the user 102 and/or the additional authentication information of the user 102 (as determined in the manner described with respect to block 606) match(es) expected biometric information and/or expected additional authentication information (block 608). Expected biometric information and expected additional authentication information, including expected biometric information and expected additional authentication information specific to the user 102, are described in greater detail with respect to FIG. 1. The determination of whether the biometric information of the user 102 and/or the additional authentication information of the user 102 match(es) the expected biometric information and/or the expected additional authentication information may be performed in a similar manner as discussed in detail with respect to FIG. 1.
The method 600 may include detecting, using the processor(s) associated with the watch 112 (or other control device), at least one movement of the user 102 of the watch 112 while the user 102 is wearing the watch 112 (block 610). For example, one or more sensors of the watch 112 may be used to determine whether the watch 112 is being worn (e.g., by being used to determine if biometric information is being sensed or otherwise collected by the sensor(s)), and it may be determined whether the user 102 (who in this example is an authorized user of the watch 112) is the particular user wearing the watch (such as in the manner described with respect to FIG. 1). In some embodiments, the processor(s) associated with the watch 112 may not detect, or may ignore, the at least one movement of a person wearing the watch 112 when the person wearing the watch 112 is determined not to be the user 102. It will be appreciated in light of the teaching and disclosure herein that the actions described with respect to blocks 606, 608, and/or 610 may be a more particular implementation of the actions described with respect to block 504.
The method 600 may include generating, using the processor(s) associated with the watch 112 (or other control device), the control signal indicative of the at least one control command associated with the target device when the biometric information of the user 102 and/or the additional authentication information of the user 102 matches expected biometric information of the user 102 and/or expected additional authentication information of the user 102 (block 612). In this manner, and with reference to the discussion above, the control signal may be generated when the watch 112 (or other control device) is being worn by the user 102 and when the user 102 is an authorized user of the watch 112, and the control signal may not be generated when the user 102 is not wearing the watch 112 and/or when the user 102 is not an authorized user of the watch 112.
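A minimal sketch of the authentication gating described with respect to blocks 606-612 follows, assuming hashed comparisons of biometric and additional authentication information; the enrollment values and payload format are illustrative assumptions only.

```python
# Illustrative sketch: generate a control signal only when the wearer's
# biometric and additional authentication information match stored expected
# values for an authorized user. All enrollment data is a placeholder.
import hashlib
import hmac

EXPECTED = {
    "biometric_hash": hashlib.sha256(b"enrolled-biometric-template").hexdigest(),
    "password_hash": hashlib.sha256(b"enrolled-password").hexdigest(),
}


def matches(observed: bytes, expected_hex: str) -> bool:
    return hmac.compare_digest(hashlib.sha256(observed).hexdigest(), expected_hex)


def maybe_generate_signal(biometric_sample: bytes, password: bytes, movement: str):
    authenticated = (matches(biometric_sample, EXPECTED["biometric_hash"])
                     and matches(password, EXPECTED["password_hash"]))
    if not authenticated:
        return None                    # block 612: no control signal generated
    return {"movement": movement}      # placeholder control signal payload


print(maybe_generate_signal(b"enrolled-biometric-template", b"enrolled-password", "flick_up"))
print(maybe_generate_signal(b"someone-else", b"wrong-password", "flick_up"))
```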
The method 600 may include generating, using the processor(s) associated with the watch 112 (or other control device), the control signal indicative of the at least one control command associated with the target device so that the control signal changes an operational state of the target device, increases and/or decreases a setting associated with the target device, and/or moves the target device in a first direction and/or a second direction (e.g., opposite the first direction) (block 614). Similar to the discussion of the actions described with respect to block 406, changing the operational state of the target device may include turning the target device on or off; increasing or decreasing a setting of the target device may include increasing or decreasing an amount of light (e.g., by causing movement of the dimmer 150); and moving the target device in a first direction or a second direction may include raising or lowering, for example, the first window shade 132. It will be appreciated from the present disclosure that other examples of changing an operational state of a target device; increasing a setting associated with a target device; decreasing a setting associated with a target device; moving a target device in a first direction; and/or moving a target device in a second direction are envisioned, including performance of such actions with respect to other target devices shown in FIG. 1 and/or with respect to any suitable target devices not shown in FIG. 1.
It will be appreciated in light of the teaching and disclosure herein that the actions described with respect to block 612 and/or the actions described with respect to block 614 may be a more particular implementation of the actions described with respect to block 506. It will also be appreciated in light of the teaching and disclosure herein that at least some of the actions described with respect to block 612 may be performed concurrently with at least some of the actions described with respect to block 614, and that other actions described herein with respect to other blocks may in some examples also be performed concurrently.
The method 600 may include transmitting the control signal indicative of the at least one control command to the target device concurrently with a voice control command causing the target device to be controlled according to the voice control command, or distinct in time from the voice control command (block 616). For example, the target device may be the light fixture 146, and the light fixture 146 may be configured with “smart” functionality that allows the light fixture to turn on and off (and/or increase or decrease brightness, etc.) in response to a voice control command and/or in response to a control signal from a control device such as the watch 112. As noted above, movement-based (e.g., gesture-based) control of a target device such as the light fixture 146 may be preferred by the user 102, particularly in circumstances such as the presence of a significant volume of other sound in the vicinity of the user 102 that may make issuing the voice control command to the light fixture 146 difficult. The user 102 may also or alternatively desire to switch between controlling the target device (e.g., the light fixture 146) based on a voice control command(s) and based on a control signal(s) generated by the watch 112 or other control device. Thus, the light fixture 146 or other target device may be controlled according to a control signal from the watch 112 during a first time interval that is distinct from a second time interval in which a voice control command causes the target device to be controlled.
In some embodiments, the user may perform movement-based control of the target device and issue a voice control command(s) concurrently. In such embodiments, a processor(s) of the target device may interpret the resulting movement-based control signal(s) and voice control command(s) and, for example, resolve any conflicts between the control signal(s) and the voice control command(s) based on any suitable predetermined rule or rules, such as a rule giving preference to a control signal(s) generated using the movement-based device control techniques described herein.
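One possible encoding of such a predetermined conflict-resolution rule, purely as an illustration and assuming a rule that prefers the movement-based control signal, is sketched below.

```python
# Illustrative only: resolve a conflict between a concurrently received
# movement-based control signal and a voice control command, preferring the
# movement-based signal per an assumed predetermined rule.
from typing import Optional


def resolve(movement_command: Optional[str], voice_command: Optional[str]) -> Optional[str]:
    """Return the command the target device should execute."""
    if movement_command and voice_command and movement_command != voice_command:
        return movement_command  # predetermined rule: movement-based control wins
    return movement_command or voice_command


print(resolve("light_on", "light_off"))  # -> "light_on"
print(resolve(None, "light_off"))        # -> "light_off"
```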
The method 600 may also include transmitting, using the processor(s) associated with the watch 112 (or other control device), data indicative of the control signal to one or more insurance provider computing devices (e.g., the insurance provider computing device 108, which may be an insurance provider server) to cause the one or more insurance provider computing devices to generate an indication of an insurance discount, an insurance reward, an insurance incentive, and/or an insurance premium adjustment (block 618). Similar to the actions described with respect to block 408, the indication of the discount, reward, incentive, and/or premium adjustment may be generated based on utilization of the at least one movement associated with the watch 112 or other control device to control a target device according to at least one control command, with such utilization being indicated by, for example, control signal(s) transmitted to the one or more insurance provider computing devices.
The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement functions, components, operations, or structures described as a single instance. As noted above, although individual functions and instructions of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
The methods described in this application may include one or more functions or routines in the form of non-transitory computer-executable instructions that are stored in a tangible computer-readable storage medium and executed using a processor of a computing device (e.g., the watch 112, the remote computing device 106, the insurance provider computing device 108, and/or any other computing devices within the example system 100 in any suitable combination). The routines may be included as part of any of the modules described in relation to FIGS. 1 and 2 or as part of a module that is external to the system illustrated by FIGS. 1 and 2. For example, the methods or portions thereof may be part of a browser application(s) or an application(s) running on any of the devices in the example system 100 as a plug-in or other module of the browser application. Further, the methods may be employed as “software-as-a-service” to provide, for example, the watch 112, the remote computing device 106, the insurance provider computing device 108, and/or any other computing devices with access to the example system 100.
Additionally, certain aspects are described herein as including logic or a number of functions, components, modules, blocks, or mechanisms. Functions may constitute either software modules (e.g., non-transitory code stored on a tangible machine-readable storage medium) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain functions. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the term hardware should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware and software modules may provide information to, and receive information from, other hardware and/or software modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware or software modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware or software modules. In embodiments in which multiple hardware modules or software are configured or instantiated at different times, communications between such hardware or software modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware or software modules have access. For example, one hardware or software module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware or software module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware and software modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).
The various operations of example functions and methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
Similarly, the methods or functions described herein may be at least partially processor-implemented. For example, at least some of the functions of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the functions may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the functions may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Still further, the figures depict preferred embodiments of an example system 100 and methods for purposes of illustration only. One of ordinary skill in the art will readily recognize from the foregoing discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for movement-based device control. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
To the extent that any meaning or definition of a term in this document conflicts with any meaning or definition of the same term in a document incorporated by reference, the meaning or definition assigned to that term in this document shall govern. Although the text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this patent. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical, if not impossible. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims. While particular embodiments of the present invention have been illustrated and described, it would be obvious to those skilled in the art that various other changes and modifications can be made without departing from the spirit and scope of the invention. It is therefore intended to cover in the appended claims all such changes and modifications that are within the scope of this invention.