US8164461B2 - Monitoring task performance - Google Patents

Monitoring task performance

Info

Publication number
US8164461B2
Authority
US
United States
Prior art keywords: task, individual, sensors, tasks, performance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US11/788,178
Other versions
US20070192174A1 (en)
Inventor
Brian J. Bischoff
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Best Buy Health Inc
Original Assignee
Healthsense Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/323,077, patent US7589637B2 (en)
Priority to US11/788,178, patent US8164461B2 (en)
Application filed by Healthsense Inc
Assigned to RED WING TECHNOLOGIES, INC. Assignment of assignors interest (see document for details). Assignors: BISCHOFF, BRIAN J.
Publication of US20070192174A1 (en)
Assigned to HEALTHSENSE, INC. Change of name (see document for details). Assignors: RED WING TECHNOLOGIES, INC.
Assigned to HEALTHSENSE. Change of name (see document for details). Assignors: RED WING TECHNOLOGIES, INC.
Priority to PCT/US2008/004850, patent WO2008130542A2 (en)
Priority to US13/324,711, patent US8872664B2 (en)
Application granted
Publication of US8164461B2 (en)
Assigned to BRIDGE BANK, NATIONAL ASSOCIATION. Security agreement. Assignors: HEALTHSENSE, INC.
Priority to US14/524,717, patent US9396646B2 (en)
Priority to US15/212,776, patent US10115294B2 (en)
Assigned to THE NORTHWESTERN MUTUAL LIFE INSURANCE COMPANY, AS COLLATERAL AGENT. Second lien patent security agreement. Assignors: GreatCall, Inc.
Assigned to GreatCall, Inc. Release by secured party (see document for details). Assignors: THE NORTHWESTERN MUTUAL LIFE INSURANCE COMPANY, AS COLLATERAL AGENT
Priority to US16/174,741, patent US10475331B2 (en)
Priority to US16/678,144, patent US20200074840A1 (en)
Assigned to GreatCall, Inc. Assignment of assignors interest (see document for details). Assignors: HEALTHSENSE, INC.
Assigned to BEST BUY HEALTH, INC. Merger (see document for details). Assignors: GreatCall, Inc.
Legal status: Active (current)
Legal status: Adjusted expiration

Abstract

One system embodiment includes providing a number of sensors for monitoring an individual in performing a number of tasks from a list of tasks to be completed, the number of tasks each including an associated number of steps to be completed by the individual; monitoring the performance of a task from the list by using at least one of the number of sensors; providing the individual with a number of step instruction prompts associated with the steps of the task; and obtaining task performance information corresponding to the performance of the task by the individual. The task performance information includes step prompt information, including the number of step instruction prompts provided during performance of the task, as well as sensor data from the at least one of the number of sensors. The embodiment further includes adjusting the list of tasks to be completed based on the task performance information of the task.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation-in-part of U.S. application Ser. No. 11/323,077, filed Dec. 30, 2005.
BACKGROUND OF THE DISCLOSURE
Methods, devices, and systems have been developed in various fields of technology for the monitoring of the movement and/or health of an individual. With respect to the monitoring of the health of an individual, some methods, devices, and systems have been developed to aid in the diagnosis and treatment of individuals.
In the field of remote health monitoring, for instance, systems have been developed to enable an individual to contact medical professionals from their dwelling regarding a medical emergency. For example, in various systems, a system is equipped with an emergency call button on a base station that initiates a call or signal to an emergency call center from a user's home telephone. The concept of such a system is that if an individual has a health related problem, they can press the emergency call button and emergency medical providers will respond to assist them.
To aid in situations, such as where an individual has fallen and cannot reach an emergency call button on the base station, mobile devices have been developed. The mobile devices generally include an emergency call button that transmits a signal to the base station in the dwelling indicating an emergency. Once the signal is made, the base station alerts a remote assistance center that can contact emergency medical personnel or a designated third party.
Systems have also been developed that use sensors within the home to monitor an individual within a dwelling. Typically, these systems include motion sensors, for example, that are connected to a base control system that monitors areas within the dwelling for movement.
In such systems, when a lack of movement is indicated, the system indicates the lack of movement to a remote assistance center that can contact a party to aid the individual. Additionally, such sensing systems also monitor the health of the system, and its sensors, based upon the individual sensor activations.
With respect to diagnosis and treatment, some systems can be used to diagnose and/or improve brain functionality. For example, in one system, the individual uses a computing program that goes through a number of exercises on the display of the computing device.
In such systems, the individual makes a selection from one of a number of choices presented on the display and executable instructions within the computing device determine whether the answer is correct. Such systems can aid in recovery of memory that has been lost due to a traumatic brain injury or can aid in relearning such information, among other uses.
Additionally, some systems can include functionality that aids the individual in their daily routine. For example, some systems can provide a scheduling functionality that can utilize reminders directed to an individual that may have reduced brain functionality.
In this way, the individual can continue with a daily routine, through the use of prompts from a computing device with such scheduling functionality, even though the individual may not know what they are supposed to do next. Further, such systems can also aid in recovery of memory that has been lost due to a traumatic brain injury or can aid in relearning such information, among other uses.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a monitoring system embodiment.
FIG. 2 illustrates an embodiment of base station data flow.
FIG. 3 illustrates an embodiment of a task performance monitoring system.
FIG. 4 illustrates an embodiment of a task performance planning, execution, and validation framework for use with various embodiments of the present disclosure.
FIG. 5 illustrates a task prompting routine for use with various embodiments of the present disclosure.
DETAILED DESCRIPTION OF THE DISCLOSURE
Embodiments of the present disclosure can provide simple, cost effective, privacy-respecting, and/or relatively non-intrusive methods, devices, and/or systems for monitoring task performance. Embodiments of the present disclosure, for example, can be utilized with and can include systems and devices as described in U.S. application Ser. No. 11/323,077, filed Dec. 30, 2005. The present disclosure provides further detail on task performance concepts that can be used with the systems discussed in the above referenced application, the present application, and/or other systems for monitoring one or more individuals.
For instance, embodiments can include systems, methods, and devices to monitor the activity of an individual within or around a dwelling. As used herein, a “dwelling” can, for example, be a house, condominium, townhouse, apartment, or institution (e.g., hospital, assisted living facility, nursing home, prison, etc.). Embodiments, for example, can monitor the task performance of an individual within or around a dwelling.
For example, an embodiment can use a fixed or mobile device to aid an individual in performing a kitchen function, such as making lunch or opening a drawer, among other functions. Various embodiments can be designed such that, based upon a number of task performance factors, when an individual is successfully completing a task, that information can be evaluated to determine whether the difficulty of the task, when repeated, and/or the difficulty of the next task, can be adjusted to better fit the individual and/or can be adjusted to challenge the individual being monitored.
Various embodiments can include systems, methods, or devices that utilize a fixed or mobile device to monitor activity of an individual within and/or out of a dwelling, such as monitoring the task performance of an individual. For example, in some embodiments, a mobile device can be used to aid an individual in running errands, among other functions.
To monitor the activity of an individual, various embodiments can provide automated detection of changes in activity within a dwelling and automated initiation of alerts to third parties to check on and/or assist the individual where assistance is needed, thereby avoiding prolonged periods of time before assistance is provided. Some embodiments can utilize multiple sensors, multiple timers, and/or multiple rules to determine whether to initiate an action, thereby increasing the certainty that an action is necessary and should be initiated. Various embodiments also can utilize multiple sensors, multiple timers, and/or multiple rules to make statistical correlations between a number of sensors, thereby increasing certainty that the system is in satisfactory health.
In some embodiments, the logic component can be rules-based, and can initiate a timer which establishes a time period for making the determination. The system can then monitor sensors within the dwelling to determine whether a particular task has been completed. For such review, the system can include memory to store such information or send the information to a remote server (e.g., at a remote monitoring site), for example.
A system embodiment can, for example, include providing a number of sensors for monitoring an individual in performing a number of tasks from a list of tasks to be completed. The number of tasks can be large items, such as a list of errands to be accomplished in a given day, and/or can be small items, such as the steps for taking a bath.
The level of detail of the tasks provided to the individual can be determined in a number of ways. For example, in some embodiments, the level of detail can be determined by the manufacturer, a system installer, a system administrator, a care provider, and/or the individual being monitored.
Additionally, in some embodiments, the system (e.g., through use of executable instructions executed by a logic component) can set the level of detail of the tasks presented to an individual. In some embodiments, the level of detail can be adjustable based upon one or more factors including, but not limited to, the length of time that the individual has been monitored, the number of times a particular task has been performed, the success rate of task performance, and/or the percentage of correct steps versus incorrect steps in performing a task, among other factors. Embodiments can be designed such that the number of tasks or steps can be increased and/or decreased based upon such factors.
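As an illustration only (not part of the disclosed embodiments), a rule of this kind could be sketched in Python as follows; the TaskHistory fields, threshold values, and level range are assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class TaskHistory:
    """Hypothetical record of how an individual has performed a task over time."""
    times_performed: int
    times_succeeded: int
    correct_steps: int
    total_steps: int

def adjust_detail_level(current_level: int, history: TaskHistory,
                        min_level: int = 1, max_level: int = 5) -> int:
    """Return a new prompt-detail level (1 = terse, 5 = very detailed).

    Assumed rule: consistently successful performance reduces the detail
    level to keep challenging the individual; frequent errors increase it.
    """
    if history.times_performed == 0:
        return current_level  # no data yet, keep the configured level

    success_rate = history.times_succeeded / history.times_performed
    step_accuracy = history.correct_steps / max(history.total_steps, 1)

    if success_rate > 0.9 and step_accuracy > 0.9:
        return max(current_level - 1, min_level)   # individual is doing well
    if success_rate < 0.5 or step_accuracy < 0.5:
        return min(current_level + 1, max_level)   # individual is struggling
    return current_level

# Example: a task performed 10 times, always successfully, with 48/50 correct steps.
print(adjust_detail_level(3, TaskHistory(10, 10, 48, 50)))  # -> 2
```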
As discussed above, a task can include an associated number of steps to be completed by an individual. In some embodiments, each task can include multiple steps.
In various embodiments, the performance of a task from a list of tasks can be monitored by using at least one of a number of sensors. This can be accomplished, for example, by waiting for a sensor response that indicates that the task has been performed, such as pressing a button when done, or inferring that a task has been accomplished based upon the feedback provided by the sensors.
For example, regarding the task of brushing teeth, the system can be designed such that sensors sense the opening of a drawer containing the toothpaste and toothbrush, detect the water being turned on and off, and detect the drawer being closed. Such sensor feedback can be used to infer that the user has opened a drawer, removed the toothpaste and toothbrush, run the water to rinse, and returned the toothpaste and brush to the drawer. Such routines can be pre-selected and provided as executable instructions provided in the system and displayed for the individual to follow and/or can be designed based upon the individual's routine and entered before system installation, at system installation, and/or after installation.
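A minimal sketch of this kind of inference is shown below; the sensor event names and the in-order matching rule are assumptions for illustration, not the disclosed implementation.

```python
def task_inferred_complete(expected_events, observed_events):
    """Return True if the expected sensor events occur, in order, within
    the observed event stream (other events may be interleaved)."""
    it = iter(observed_events)
    return all(event in it for event in expected_events)

# Hypothetical event names for the tooth-brushing routine described above.
BRUSH_TEETH = ["drawer_open", "water_on", "water_off", "drawer_closed"]

observed = ["hall_motion", "drawer_open", "water_on",
            "bathroom_motion", "water_off", "drawer_closed"]
print(task_inferred_complete(BRUSH_TEETH, observed))                      # True
print(task_inferred_complete(BRUSH_TEETH, ["drawer_open", "water_on"]))   # False
```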
In using such routines, the individual can be provided with a number of step instruction prompts associated with the steps of the task. These prompts can be short prompts that indicate a brief instruction to the individual, such as "open drawer," or can contain more detailed information, such as, "You are now going to brush your teeth. Take your brush and paste out of the drawer, add paste to brush, place brush in mouth and start brushing motion, rinse mouth with water, and place brush and paste back in drawer."
Such level of detail of prompts can be determined and provided before system installation, at system installation, and/or after installation, in various embodiments. These prompts can be provided in various formats, such as text, image, video, and/or audio.
Task performance information corresponding to the performance of the task by the individual can be obtained by the system. Such information can be obtained via the sensors which can include direct or inferred information, as discussed above.
In various embodiments, task performance information can include step prompt information, including the number of step instruction prompts provided during performance of the task, as well as sensor data from the at least one of the number of sensors. The list of tasks to be completed can then be adjusted based on the task performance information of the task.
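Purely as an illustration of the kind of record described above (the field names and the adjustment rule are assumptions, not terms from the disclosure), such information might be represented as:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskPerformanceInfo:
    """Illustrative container for the kinds of data described above."""
    task_id: str
    prompts_given: int                  # step instruction prompts during the task
    sensor_events: List[str] = field(default_factory=list)
    elapsed_seconds: float = 0.0        # timer data (see the claims)

def adjust_task_list(task_list: List[str], info: TaskPerformanceInfo,
                     max_prompts: int = 3) -> List[str]:
    """Assumed rule: if many prompts were needed, defer the last (most
    demanding) task to a later day; otherwise keep the list unchanged."""
    if info.prompts_given > max_prompts and len(task_list) > 1:
        return task_list[:-1]
    return task_list

info = TaskPerformanceInfo("brush_teeth", prompts_given=5,
                           sensor_events=["drawer_open", "water_on"],
                           elapsed_seconds=240.0)
print(adjust_task_list(["brush_teeth", "make_lunch", "hobby"], info))
```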
In some embodiments, announcements can be provided to the individual. These announcements can provide any suitable information to the individual. For example, an announcement can be provided for accomplishment of a step or task, in various embodiments.
In some embodiments, adjusting the list of tasks can include changing the number of tasks from the list of tasks to be completed. Adjusting the list of tasks can additionally, or alternatively, include changing the number of steps associated with one or more particular tasks.
Various embodiments can include prompting the individual to perform a second task from the list when the first task has been completed. In some embodiments, prompting the individual can be accomplished via a mobile device.
Providing the individual with a number of step instruction prompts can include providing the individual with a reminder when a step of the first task remains uncompleted. This can aid in encouraging the individual to finish an unfinished task.
Some embodiments can include scheduling the number of tasks to be completed by the individual. For example, the number of tasks to be completed can be provided in a sequential order, among other ordering formats.
In some embodiments directed to a method for monitoring task performance, the method can include providing a number of sensors for monitoring the individual in performing a number of tasks from a list of tasks to be completed, where the number of tasks each include an associated number of steps to be completed by the individual. The method can also include initiating a first task having an associated number of steps based upon one or more context items, monitoring the performance of the first task by using at least one of the number of sensors, and/or providing a task completion indication based upon a determination of completion of the first task.
Initiating the first task based upon one or more context items can include initiating the first task based upon one or more context items from the group including a time of day, a day of the week, a list of completed tasks, a list of uncompleted tasks, a completion status of a particular task, and/or a determined location of the individual, among other items.
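One possible way to picture a context check of this kind (a sketch under assumed context keys such as hour windows, location, and prerequisite tasks; none of these names come from the disclosure) is:

```python
from datetime import datetime

def should_initiate_task(task, context):
    """Hypothetical context check: initiate a task only when its configured
    context conditions (hour window, location, prerequisites) are all met."""
    now = context.get("now", datetime.now())
    if "hours" in task and now.hour not in task["hours"]:
        return False
    if "location" in task and context.get("location") != task["location"]:
        return False
    if any(dep not in context.get("completed", set())
           for dep in task.get("requires", [])):
        return False
    return True

breakfast = {"name": "eat breakfast", "hours": range(6, 10),
             "location": "kitchen", "requires": ["bathing"]}
context = {"now": datetime(2007, 4, 19, 8, 30), "location": "kitchen",
           "completed": {"waking", "bathing"}}
print(should_initiate_task(breakfast, context))  # True
```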
Initiating the first task based upon one or more context items can include initiating the first task based upon localizing the individual within a residence by using at least one fixed sensor located in the residence. For example, motion sensors can be used to locate an individual.
Other types of sensors can be used. For example, an individual can be located based upon an activation of one or more sensors (e.g., activation of a number of sensors in the kitchen can be an indication of the location of the individual; in some embodiments, the certainty can be increased by evaluating a combination of both fixed and wearable sensors to verify whether the individual is in the kitchen). The combination of fixed and wearable sensors can be used to improve the accuracy of the location detection and to distinguish between multiple individuals in the dwelling.
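The following sketch illustrates one assumed fusion rule for combining fixed-sensor activations with a wearable sensor's room estimate; the agreement rule and data layout are illustrative assumptions only.

```python
def locate_individual(fixed_activations, wearable_room_estimate):
    """Hypothetical fusion rule: trust the room suggested by fixed-sensor
    activations only when the wearable sensor's estimate agrees; otherwise
    report the location as uncertain."""
    if not fixed_activations:
        return wearable_room_estimate or "unknown"
    # Room with the most fixed-sensor activations.
    candidate = max(fixed_activations, key=fixed_activations.get)
    if wearable_room_estimate in (None, candidate):
        return candidate
    return "uncertain"

print(locate_individual({"kitchen": 3, "hallway": 1}, "kitchen"))   # kitchen
print(locate_individual({"kitchen": 3, "hallway": 1}, "bedroom"))   # uncertain
```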
Some embodiments can include monitoring the performance of the first task by using at least one of a number of fixed sensors and by using at least one portable sensor worn by the individual. Monitoring the performance of the first task can also include using the at least one portable sensor to determine whether the first task is being completed.
Initiating the first task can include prompting the individual on an interface of a mobile device, in some embodiments. In various embodiments, the first task can be initiated based upon integrating sensor data provided by at least one fixed sensor and at least one portable sensor.
Some embodiments can include initiating a second task having an associated number of steps to be completed based upon one or more context items. Such context items will be discussed in more detail herein.
In various embodiments, a method can include monitoring the performance of the first task by using integrated sensor data provided by at least one fixed sensor and at least one portable sensor. For example, the position of the individual can be ascertained by the location of the portable sensor and the fixed sensor can indicate what the individual is doing (e.g., opening a drawer).
The present disclosure also provides a number of system embodiments. For example, a system embodiment for monitoring task performance can include a number of fixed sensors located throughout a residence of the individual. In some such systems, a number of portable sensors can be worn or carried by the individual.
A computing device can be used to communicate with a number of fixed sensors and the portable sensors. Computing devices can include memory having instructions storable thereon and executable by a processor or other logic component to perform a method.
An example of a method can include monitoring, by using one or more sensors, the performance of a first task from a list of tasks to be completed. Each of the tasks can include an associated number of steps to be completed by the individual.
In some embodiments the individual can be provided with a number of step instruction prompts corresponding to an associated number of steps of a first task. Task performance information can be obtained that corresponds to the performance of the first task by the individual. This task performance information can include, for example, step prompt information including the number of step instruction prompts provided during performance of the first task, and/or sensor data from the at least one of the number of sensors, indicating that a particular step has been completed based on the sensor data. Based on the task performance information corresponding to the performance of the first task, the number of step instruction prompts provided to the individual in performing the first task on a subsequent occasion can be adjusted. This can be beneficial in continuing to challenge the individual and in tailoring the system to the learning speed and/or granularity of the individual.
In various embodiments, the system can be designed to indicate whether a step or task has been completed. This can be accomplished in a variety of manners and can be presented to the individual or can be transparent to the individual. For example, indicating that a particular step has been completed can be accomplished by placing a flag in a data file and/or by an announcement to the individual.
The adjustment of the number of step instruction prompts can be accomplished in a number of ways. For example, adjusting the number of step instruction prompts provided to the individual in performing the first task on a subsequent occasion can be accomplished via a user interface usable by a third party (e.g., a health professional, system administrator, etc.) and/or via an interface usable by the individual being monitored. Adjusting the number of step instruction prompts provided to the individual in performing the first task on a subsequent occasion can be accomplished by increasing or reducing the number of step instruction prompts provided to the individual in performing the first task on the subsequent occasion.
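By way of illustration only, one assumed prompt-fading rule (not the disclosed method) might look like this:

```python
def prompts_for_next_occasion(steps, missed_steps):
    """Assumed prompt-fading rule (illustrative only):
    - a step that was missed is prompted on the next occasion,
    - a step completed without incident loses its prompt,
    - if more than half of the steps were missed, every step is prompted."""
    if len(missed_steps) * 2 > len(steps):
        return list(steps)                      # individual is struggling: full guidance
    return [s for s in steps if s in missed_steps]

steps = ["open drawer", "apply paste", "brush", "rinse", "return brush"]
print(prompts_for_next_occasion(steps, missed_steps={"rinse"}))  # ['rinse']
```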
In various embodiments, a system can include instructions to analyze the task performance information and to initiate adjusting the number of step instruction prompts provided to the individual in performing the first task on a subsequent occasion. In some embodiments, a system can include one or more instructions to schedule the number of tasks to be completed by the individual. Such systems can include a guidance module for providing guidance information for accomplishing at least one step. System embodiments can include a tracking module for tracking completion of the steps associated with at least one task.
Additionally, in some embodiments, the system can include an emotion sensing module. Such a module can be helpful, for example, in determining if the individual is getting frustrated with a particular situation and needs some assistance or a prompt.
In various embodiments, the system can include a solution module to provide information to the individual that may aid in solving a particular problem encountered by the individual. The solution module can, for example, include instructions executable to assist the individual with locating a particular item by providing the individual with a number of possible locations of an item. In such embodiments, the number of possible locations can, for example, be based on data from at least one of the number of fixed sensors and/or based on data from a database, among other data locations.
FIG. 1 illustrates a monitoring system embodiment. The figures herein follow a numbering convention in which the first digit or digits correspond to the drawing figure number and the remaining digits identify an element or component in the drawing. Similar elements or components between different figures may be identified by the use of similar digits. For example, 110 may reference element "10" in FIG. 1, and a similar element may be referenced as 210 in FIG. 2. As will be appreciated, elements shown in the various embodiments herein can be added, exchanged, and/or eliminated so as to provide a number of additional embodiments of value.
In the illustrated embodiment of FIG. 1, the system 100 utilizes the base station 110 to monitor the activities of a client (e.g., an individual) in and/or around a dwelling through use of a number of sensors 112-1, 112-2, and 112-N. The number of sensors can be a number of fixed or portable sensors.
Monitoring the performance of a particular task can include using at least one portable sensor to determine whether the task is being completed. These determinations can be responses to a prompt made by the system, or can be inferred based upon the individual's daily activities as sensed by one or more sensors.
In such embodiments, the system can prompt an individual for a specific response and/or can determine whether a task has been accomplished based upon the activity of the individual. For instance, a system can be designed to ask the individual if the individual has brushed his teeth and the individual can respond to the system with a response that indicates yes (e.g., closing the drawer holding the tooth brush, pressing a button on a mobile or fixed device, or an audible response identified by an audio sensor).
The base station 110 can also initiate a number of actions based upon a number of rules implemented by the base station 110. These rules use the information obtained from the number of sensors 112-1 through 112-N to determine whether to initiate an action or not. For example, a rule may be that if a yes response is received, go to the next task on a task list.
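A toy sketch of such a rule is given below; the event names and the single advance-on-confirmation rule are assumptions for illustration, not the claimed rule set.

```python
class TaskListRunner:
    """Illustrative rule: advance to the next task when a confirming
    sensor event or response arrives."""
    def __init__(self, tasks):
        self.tasks = list(tasks)
        self.index = 0

    @property
    def current_task(self):
        return self.tasks[self.index] if self.index < len(self.tasks) else None

    def on_event(self, event):
        # A "yes" response (button press, drawer closed, audible yes) completes
        # the current task and moves on to the next one on the list.
        if event in ("yes_button", "task_confirmed") and self.current_task:
            self.index += 1
        return self.current_task

runner = TaskListRunner(["brush teeth", "eat breakfast", "exercise"])
print(runner.on_event("yes_button"))      # 'eat breakfast'
print(runner.on_event("motion_hallway"))  # 'eat breakfast' (no rule fired)
```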
The base station 110 includes a number of components providing a number of functions, as will be discussed herein. In the embodiment of FIG. 1, the base station 110 is illustrated with respect to its various functionalities. For example, the base station 110 is capable of using rules 116 and/or timers 118 to determine whether to initiate an action 120.
As discussed herein, in some embodiments the base station 110 includes executable instructions to receive signals from sensors 112-1 through 112-N that are generated by activation of a sensor 112-1 through 112-N. Embodiments of the disclosure can include various types of sensing devices, including on/off type sensors, and/or ones whose signal strengths scale to the size of the activation parameter, such as temperature, weight, or touch.
The one or more sensors can be of many different types. For example, types of sensors can include, but are not limited to, sensors to indicate the opening and closing of a door and/or drawer; sensors to indicate the movement of objects such as shades and/or blinds; current and/or voltage sensors to monitor appliances, lights, wells, etc.; pressure or fluid flow sensors to indicate the turning on and off of water; temperature sensors to indicate that the furnace is on or off; force sensors such as strain gauge sensors to sense an individual walking over a pad, sitting in a chair, or lying in bed; motion sensors to sense the motion of objects within the dwelling; alert switches/buttons to signal an emergency or client input such as a cancellation request; and sensors to measure the signal strength between multiple sensors. Sensors may also include those carried or worn by the individual, such as vibration, temperature, audio, touch, humidity, electrocardiogram (ECG), electroencephalogram (EEG), and/or resistance (e.g., galvanic skin reaction, etc.) sensors, among other well-known sensing devices.
A sensor can also be a button on the base station 110 and/or mobile device 126-2 which senses when someone actuates the button. Sensors can be analog and/or digital type sensors and can include logic circuitry and/or executable instructions to transmit signal output to the base station 110.
In some embodiments, the base station 110 can utilize a remote assistance center device (indicated as Remote Access Interface) 114 to inform a third party 122-1 through 122-N that an alert condition exists and that aid may be needed. Aid can be a call to the individual 130, a visit by a third party 122-1 through 122-N to the location of the individual 130, or other such aid. As used herein, "third parties" 122-1 through 122-N can include hospital staff, emergency medical technicians, system technicians, doctors, neighbors, family members, friends of the individual 130, police, fire department, and/or emergency 911 operators.
In the embodiment of FIG. 1, the base station 110 and the remote assistance center device 114 can each be any type of computing device capable of managing the functionality of receiving alert requests and initiating such requests. For example, suitable devices can include personal computers, mainframe computers, system servers, devices having computer components therein, and other such devices.
As illustrated in FIG. 1, the remote assistance center device 114 and a local interface 124 are accessible by an individual 130 (e.g., client). The communication between the devices 110, 112-1 through 112-N, and 124 can be accomplished in various manners.
For example, in the embodiment shown in FIG. 1, the communications can be accomplished by wired (e.g., telephone lines) and/or wireless (e.g., radio interface) communications. Further, in some embodiments, the functionality of these devices can be provided in fewer devices than shown, or in more devices than shown.
In an additional embodiment, system devices 126-1 through 126-N (where "N" represents any number) can also communicate with the base station 110 through the local interface 124. In some embodiments, a system device can be in the form of a mobile device 126-1. The mobile device 126-1 can, in some embodiments, provide access to and/or control of at least some of the functions of the base station 110 described herein. Embodiments of the mobile device 126-1 are discussed in greater detail herein.
In some embodiments, a logic component can be used to control the functions of the base station 110. For example, the logic component can include executable instructions for providing such functions as handling received information from the sensors in the system, time-stamping received information such as sensor activation and/or system health functionality, among others. In some embodiments, the logic component can include RAM and/or ROM, a clock, an input/output, and a processor, among other things.
In some embodiments, a mobile device can be used with the base station 110. The mobile device can be carried or worn by the individual 130, as discussed herein. Mobile devices 126-2 can be any type of device that is portable and that can provide the described functionalities.
Examples can include basic devices that have the capability to provide such functionalities, up to complex devices having multiple functions. Examples of complex mobile devices 126-2 can include mobile telephones and portable computing devices, such as personal digital assistants (PDAs), and the like.
In some embodiments, the mobile device can have home/away functionality to indicate whether the individual 130 is within a certain distance of the base station 110 of the system, for instance, through use of a sensor (e.g., sensor 112-5). In some embodiments, a transceiver, transmitter, and/or receiver can be used to transmit signals to and/or receive signals from the base station 110 within the dwelling.
As used herein, a transmitter and a transceiver can be used interchangeably if a transmission functionality is desired. Additionally, a receiver and a transceiver can be used interchangeably if a reception functionality is desired.
For example, a short range type of communication can be used. Short range communication protocols can include IEEE 802.15.4 and/or IEEE 802.11, among others.
As discussed herein, the mobile device 126-2 can communicate with the base station 110 using short range communication signals. In these embodiments, the mobile device 126-2 can use a short range communication signal and the local interface 124 can be incorporated into the base station 110.
Additionally, in some embodiments, the mobile device 126-2 can utilize a long range communication signal to communicate with the base station 110. In these embodiments, the local interface 124 can be a mobile device such as a mobile telephone that can send the instructions from the mobile device 126-2 to the base station 110. In such embodiments, the mobile device 126-2 can be separate from, associated with, or included in the mobile telephone.
Additionally, the mobile device can include executable instructions to enable the mobile device 126-2 to communicate with the mobile telephone in order to instruct the mobile telephone how to forward its base station message to the base station 110. In some embodiments, the base station 110 can include executable instructions to enable a short range communication signal to be translated into a long range wireless signal.
In an additional embodiment, the number of sensors 112-1 through 112-N can include a task sensor, where the task sensor is associated with a task assigned to the individual 130. For example, in some embodiments the individual 130 is assigned the task of retrieving a beverage.
In such embodiments, the task sensor would be the sensor that is activated when the refrigerator door is opened. The logic component can thus be designed to couple the task to the task sensor and to initiate a task-complete action when the task sensor is activated. In some embodiments, the task-complete action can be to send a signal to the remote assistance center device that the task was completed successfully.
Other task-complete actions may also be taken. In some embodiments, one or more sensors can be used to identify when a task is complete and/or in progress. Such embodiments can accomplish these tasks, for example, by monitoring the actuation of one or more sensors, the time between sensor activations, and other such suitable manners.
In some situations, more than one individual 130 being monitored lives inside a single dwelling. In this situation, the home/away sensor 112-5 and/or mobile device 126-2 can be equipped with an identification tag. In some embodiments, the logic component can be designed to initiate a task-complete action when the task sensor is activated and when the signal strength between the home/away sensor 112-5 with the correct identification tag and the task sensor is larger than the signal strength between the home/away sensor 112-5 with a different identification tag and the task sensor.
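As an illustrative sketch (assuming received signal strength values such as RSSI readings; the comparison rule is an assumption, not the claimed logic), the disambiguation could look like:

```python
def task_complete_for(expected_tag, signal_strengths):
    """Assumed disambiguation rule: credit the task to the monitored individual
    only if the home/away sensor carrying the expected identification tag has
    the strongest signal at the task sensor.

    signal_strengths maps identification tag -> received signal strength.
    """
    if not signal_strengths:
        return False
    strongest_tag = max(signal_strengths, key=signal_strengths.get)
    return strongest_tag == expected_tag

# Two residents wear tags "A" and "B"; the refrigerator-door (task) sensor
# hears tag "A" more strongly, so the task is credited to individual "A".
print(task_complete_for("A", {"A": -48, "B": -70}))  # True
print(task_complete_for("B", {"A": -48, "B": -70}))  # False
```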
As discussed herein, the mobile device can be equipped with components including, but not limited to, a display, a transceiver, a transmitter and/or a receiver, an antenna, a power source, a microprocessor, memory, input devices, and/or other output devices such as lights, speakers, and/or buzzers.
In addition, in some embodiments, the mobile device can include an "awaken" mechanism, where activating the awaken mechanism transmits a wireless signal indicative of a return of the mobile device to within the base range of the base station. When the awaken mechanism is activated, the mobile device can begin to send signals to the base station for a first predetermined amount of time of five (5) seconds at a first predefined time interval of thirty (30) seconds. Other first and second predetermined times of signal length and first and second predefined time intervals are also possible.
In some embodiments, the mobile device can be constructed to periodically check-in with the base station device, such as by sending a ping signal to the base station device via radio frequency or other such manner. In such embodiments, the mobile device can be provided with energy saving executable instructions that allow the mobile device to be in “sleep mode,” where power usage is reduced, and then to “awaken” periodically to send a ping signal to the base station device.
Once the signal is sent, the mobile device can then return to “sleep mode.” When in “sleep mode” the client can awaken the mobile device manually, for instance, by pushing an emergency button.
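A simplified sketch of such a duty cycle is shown below; the send_ping placeholder and the demo scaling factor are assumptions, while the 30-second interval echoes the example values in the text.

```python
import time

def check_in_loop(send_ping, interval_seconds=30.0, cycles=3, _demo_scale=0.001):
    """Illustrative duty cycle: wake, ping the base station, return to sleep.

    send_ping stands in for the radio transmission; _demo_scale shortens the
    sleeps so the sketch runs quickly when executed as a demo.
    """
    for _ in range(cycles):            # a real device would loop until powered off
        send_ping()                    # "awaken" and transmit the check-in signal
        time.sleep(interval_seconds * _demo_scale)  # "sleep mode" until the next interval

check_in_loop(lambda: print("ping -> base station"))
```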
FIG. 2 illustrates an embodiment of base station data flow. This diagram illustrates the flow of information from various parts of the system. In the embodiment shown in FIG. 2, the system sensing information 242 can be used in a variety of functions provided by the system 200.
For example, system sensing information can be used to support emergency call functions 234, activity monitoring functions 232, and system health functions 238, among others. Each of these functions (i.e., 232, 234, and 238) utilizes information about either a sensor or an activity of an individual that activates a sensor.
The blocks 232, 234, and/or 238 can process and interpret information from system sensing block 242 in order to provide information to the alert protocol manager functionality 236, and a system diagnostic alert protocol functionality 240. These functionalities can be provided at the base station and/or at a remote location, for example. Individually, blocks 232, 234, or 238 can pass system information directly to the alert protocol manager 236, or can process the information itself to determine the need to initiate an alert request or other action request to the alert protocol manager 236.
The alert protocol manager 236 can initiate an alert upon a request from 232, 234, or 238, or can further process the information received from 232, 234, or 238 to determine whether to initiate an alert or other action.
In the embodiment illustrated in FIG. 2, the initiation of an alert by the alert protocol manager 236 can be implemented through use of functions within the system client interface 244 and/or the system remote interface 246. In various embodiments, the system remote interface 246 can be a call center computer, such as a computer at an emergency call center. For example, an alert process can include a notification of the client that an alert will be or has been initiated.
This can be brought to the client's attention to allow the client to cancel the alert if the need for an intervention does not exist or has passed. In such cases, the system client interface can be used to indicate the impending or existing alert condition and/or can be used by the client to confirm and/or cancel the alert.
The system remote interface can be used to contact a third party, such as a remote assistance center device to inform the third party that an alert condition exists and that aid may be needed. Aid can be a call to the client of the system, a visit by a third party (e.g., doctor, emergency medical personal, system technician, etc.) to the location of the client, or other such aid, as discussed herein.
Similarly, system information can be provided from the system platform services block 248 to the system health block 238. This information can be used to determine whether to issue an alert for a fault in the system, for a software/firmware update, or the like. The system diagnostic alert block 240 can be used to issue such an alert. This alert can then be effectuated through the system client interface 244 and/or the system remote interface 246.
For example, if a sensor has to have a battery changed, the alert can be sent to both the individual and a third party (e.g., via blocks 244 and 246). If the client changes the battery, the alert can be canceled and notification of the cancellation can be provided to the third party.
Although embodiments illustrated herein may indicate a flow path, this is meant to be an example of flow and should not be viewed as limiting. Further, unless explicitly stated, the embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described embodiments and elements thereof can occur or be performed at the same point in time. Embodiments can be performed by executable instructions such as software and/or firmware.
Various embodiments can include different types of activity monitoring alert protocol management. Activity monitoring alert protocol management functionality can be accomplished by a number of executable instructions and/or through use of logic circuitry.
Further, in various embodiments, the executable instructions and/or, in some embodiments, the logic of the mobile device can be updated. For example, this can be accomplished wirelessly via communication with the base station, among other updating methods.
FIG. 3 illustrates an embodiment of a task performance monitoring system. In the embodiment of FIG. 3, a number of functionalities that are provided by the system 300 are illustrated. These functionalities are grouped as user interface functions, data storage and access functions, assistive technologies functions, personal safety functions, and sensory input functions. Embodiments can utilize more or less functions and/or function types than are shown in the embodiment of FIG. 3.
With respect to user interface functions, a system and/or device can include, for example, a speech processor 350, a remote multi-user interface 352, an in home user interface 354, and/or a mobile user interface 356.
In various embodiments, such as that illustrated in FIG. 3, a system or device can include a data center 358, external health resources 360, and data sources 362.
With respect to assistive technologies functionality, embodiments can include a planning module 364, a memory aid module 366, and/or a task execution module 368.
In various embodiments, such as that illustrated in FIG. 3, a system or device can include personal safety functions such as an automated personal emergency response system (a-pers) 370 and/or activities of daily living monitoring 372.
With respect to sensory input functionality, embodiments can include in home sensor devices 374, remote sensor devices 376, and localization devices 378. The sensory input functions can be utilized to collect information about the individual being monitored and/or be used in determining whether a task has been performed and/or performed successfully. As used herein, in home sensor devices are devices that are positioned within the dwelling in which the individual is being monitored. Remote sensor devices are those located outside the dwelling of the individual. These may be portable or fixed and, if portable, may be used within the dwelling, in some situations.
Localization devices are used to determine the location of the individual. Examples of localization devices that would be suitable for use in embodiments of the present disclosure are mobile communication devices (e.g., capable of positioning based upon proximity to one or more fixed receivers) and global positioning system (GPS) devices. Mobile phones and other portable devices can have such capabilities.
FIG. 4 illustrates an embodiment of a task performance planning, execution, and validation framework for use with various embodiments of the present disclosure. The embodiment of FIG. 4 includes initiation of an activity from a day plan at 480.
The initiation of the activity (e.g., task or step) can, for example, be the monitoring of an activity that is being started by the individual being monitored, or the initiation of one or more instructions to aid in instructing the individual how to accomplish the activity. For example, in some embodiments, the instructions can be delivered in text, image, video, and/or audio information provided to the individual. This can be accomplished through the use of one or more files saved in memory and executable instructions that are executed to display the one or more files for the individual.
In some embodiments, the system can be designed such that the individual can select the format or formats in which the instructions are presented to them. For instance, the individual may select that the instructions are to be presented in text form or in video form when available, among other format selection choices. In some embodiments, the selection can be made by a user, such as an administrator or the manufacturer.
At block 482, the embodiment includes execution of the activity. The embodiment includes determination of completion of the activity, at block 484. In the embodiment of FIG. 4, the embodiment includes a verification that the activity has been successfully completed at block 486.
In some embodiments, an announcement can be made to the individual at various times during such processes as, for example, illustrated in FIG. 4. For instance, an announcement signaling the completion of a task or activity can be provided in one or more formats such as text, image, audio, or video.
The method embodiment of FIG. 4 also includes checking for the next pending activity or other item on the day plan. This can be beneficial in moving the individual to a next task/step once the current task/step has been completed.
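The flow of FIG. 4 can be pictured with the following skeleton; the callable placeholders (execute, verify, announce) are assumptions standing in for the monitoring and verification machinery described herein.

```python
def run_day_plan(day_plan, execute, verify, announce=print):
    """Sketch of the initiate -> execute -> determine completion -> verify ->
    next-activity flow pictured in FIG. 4. execute(activity) performs/monitors
    the activity; verify(activity) returns True once sensors confirm success."""
    for activity in day_plan:
        announce(f"Starting: {activity}")        # initiation / instruction prompt
        execute(activity)                        # execution of the activity
        if verify(activity):                     # determination + verification
            announce(f"Completed: {activity}")
        else:
            announce(f"Needs attention: {activity}")
        # loop continues: check for the next pending activity on the day plan

run_day_plan(["waking", "bathing", "eating breakfast"],
             execute=lambda a: None,
             verify=lambda a: a != "bathing")
```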
FIG. 5 illustrates a task prompting routine for use with various embodiments of the present disclosure. In this figure a number of tasks are identified; namely, the day plan includes waking 590-1, bathing 590-2, eating breakfast 590-3, performing a planned leisure activity (hobby) 590-4, eating lunch 590-5, exercise 590-6, and other activities of daily living 590-M.
Additionally, the bathing activity is further defined in the table to the right. In the table, an example of the steps and sensing methodology, criteria for proceeding, and actuations, if any, are discussed. This example provides a number of different steps of a task, a number of different sensor types used individually and in combination, different types of criteria for completion of the steps and/or task, among other features.
For instance, in the table, the first row includes the headings of the different information sections of the table. The headings are PROMPT/INTERVENTION, SENSOR(S) TO DETECT ACTION, CRITERIA TO PROCEED TO INITIATE SENDING TASK PROMPTS, and ACTUATORS. The first column indicates the various steps for the Bathing Activity of Daily Living (ADL). In Step 1, the prompt is a prompt to indicate that the individual is to start their bathing task. The sensor to detect action is a motion sensor in the bathroom. The criterion is to detect that the individual has successfully entered the bathroom.
In step 2, the prompt is a light emitting diode (LED) on the shaver indicating to the individual that they are to use the shaver. The sensor used is an accelerometer in the shaver. The criteria are the starting and/or stopping of the shaver.
In step 3, an audio instruction is provided to the individual that indicates that they are to undress and enter the shower. A water flow sensor and/or temperature sensor is used in this example to detect action. The criterion is an audible acknowledgement from the individual that the task has been completed, and an auto adjustment of the shower temperature can be made by an actuator during this step of the task.
In step 4, an LED on the towel bar indicates that the individual should take the towel off the towel bar. In this example, some instructions were provided at a high level and others were provided at a low level. This example indicates the versatility that can be provided to an individual based upon the implementation of embodiments discussed herein.
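For illustration, the bathing routine from the table can be represented as a simple list of step records; the dictionary layout and the next_prompt helper are assumptions, while the field values paraphrase the table.

```python
# The bathing routine from FIG. 5, expressed as a list of step records.
BATHING_ADL = [
    {"step": 1, "prompt": "Start your bathing task",
     "sensor": "bathroom motion sensor",
     "criterion": "individual detected entering the bathroom", "actuator": None},
    {"step": 2, "prompt": "LED on shaver: use the shaver",
     "sensor": "accelerometer in shaver",
     "criterion": "shaver started and/or stopped", "actuator": None},
    {"step": 3, "prompt": "Audio: undress and enter the shower",
     "sensor": "water flow and/or temperature sensor",
     "criterion": "audible acknowledgement from the individual",
     "actuator": "auto-adjust shower temperature"},
    {"step": 4, "prompt": "LED on towel bar: take the towel",
     "sensor": None, "criterion": None, "actuator": None},
]

def next_prompt(completed_steps):
    """Return the prompt for the first step not yet completed, if any."""
    for step in BATHING_ADL:
        if step["step"] not in completed_steps:
            return step["prompt"]
    return None

print(next_prompt({1}))  # 'LED on shaver: use the shaver'
```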
As discussed herein, if no response is obtained from the individual, then the executable instructions can determine which third party to contact. Other sensors can be used in combination with, or instead of, a sensor worn by the individual to determine whether the individual is within the dwelling. Examples of other sensors include, motion sensors, sensors on the interior/exterior/garage doors, sensors on the individual's automobile, and the like.
Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art will appreciate that an arrangement calculated to achieve the same techniques can be substituted for the specific embodiments shown. As one of ordinary skill in the art will appreciate upon reading this disclosure, various embodiments of the invention can be performed in one or more devices, device types, and system environments including networked environments.
Combination of the above embodiments, and other embodiments not specifically described herein will be apparent to those of skill in the art upon reviewing the above description. The scope of the various embodiments of the disclosure includes other applications in which the above structures and methods can be used. Therefore, the scope of various embodiments of the disclosure should be determined with reference to the appended claims, along with the full range of equivalents to which such claims are entitled.
In the foregoing Detailed Description, various features may have been grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the embodiments of the invention require more features than are expressly recited in each claim.
Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims (33)

1. A method for monitoring task performance, comprising:
providing a number of sensors for monitoring an individual in performing a number of tasks from a list of tasks to be completed, wherein the number of tasks each include an associated number of steps to be completed by the individual;
initiating a first task based upon integrating sensor data provided by at least one of the number of sensors upon an activation of at least one of the number of sensors by the individual, wherein performing the first task includes the activation of at least one of the number of sensors by the individual and wherein the first task is commenced by the individual via the activation of at least one of the number of sensors by the individual;
monitoring the performance of a first task from the list by using at least one of the number of sensors and a timer;
providing the individual with a number of step instruction prompts associated with the steps of the first task;
obtaining task performance information corresponding to the performance of the first task by the individual, wherein the task performance information includes:
step prompt information including the number of step instruction prompts provided during performance of the first task;
timer data relating to the amount of time elapsed during performance of the first task; and
sensor data from the at least one of the number of sensors; and
adjusting the list of tasks to be completed based on the task performance information of the first task.
10. A method for monitoring task performance, comprising:
providing a number of sensors and a timer for monitoring an individual in performing a number of tasks from a list of tasks to be completed, wherein the number of tasks each include an associated number of steps to be completed by the individual;
initiating a first task having an associated number of steps based upon integrating sensor data provided by at least one of the number of sensors upon an activation of at least one of the number of sensors by the individual and one or more context items, wherein performing the first task includes the activation of at least one of the number of sensors by the individual and wherein the first task is commenced by the individual via the activation of at least one of the number of sensors by the individual;
monitoring the performance of the first task by using at least one of the number of sensors and the timer; and
providing a task completion indication based upon a determination of completion of the first task.
19. A system for monitoring task performance, comprising:
a number of fixed sensors located throughout a residence of an individual;
at least one portable sensor worn by the individual; and
a computing device in communication with the number of fixed sensors and the at least one portable sensor, the computing device including a memory having instructions storable thereon and executable by a processor thereof to perform a method that includes:
initiating a first task based upon integrating sensor data provided by at least one of the number of sensors upon an activation of at least one of the number of sensors by the individual, wherein performing the first task includes the activation of at least one of the number of sensors by the individual and wherein the first task is commenced by the individual via the activation of at least one of the number of sensors by the individual;
monitoring, by using at least one of the number of sensors and a timer, the performance of the first task from a list of tasks to be completed, wherein the number of tasks each include an associated number of steps to be completed by the individual;
providing the individual with a number of step instruction prompts corresponding to the associated number of steps of the first task;
obtaining task performance information corresponding to the performance of the first task by the individual, wherein the task performance information includes:
step prompt information including the number of step instruction prompts provided during performance of the first task;
timer data relating to the amount of time elapsed during performance of the first task; and
sensor data from the at least one of the number of sensors;
indicating that a particular step has been completed based on the sensor data; and
adjusting, based on the task performance information corresponding to the performance of the first task, the number of step instruction prompts provided to the individual in performing the first task on a subsequent occasion.
US11/788,178 | Priority date 2005-12-30 | Filing date 2007-04-19 | Monitoring task performance | Active, expires 2026-09-29 | US8164461B2 (en)

Priority Applications (7)

Application Number | Priority Date | Filing Date | Title
US11/788,178, US8164461B2 (en) | 2005-12-30 | 2007-04-19 | Monitoring task performance
PCT/US2008/004850, WO2008130542A2 (en) | 2007-04-19 | 2008-04-15 | Monitoring task performance
US13/324,711, US8872664B2 (en) | 2005-12-30 | 2011-12-13 | Monitoring activity of an individual
US14/524,717, US9396646B2 (en) | 2005-12-30 | 2014-10-27 | Monitoring activity of an individual
US15/212,776, US10115294B2 (en) | 2005-12-30 | 2016-07-18 | Monitoring activity of an individual
US16/174,741, US10475331B2 (en) | 2005-12-30 | 2018-10-30 | Monitoring activity of an individual
US16/678,144, US20200074840A1 (en) | 2005-12-30 | 2019-11-08 | Monitoring activity of an individual

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
US11/323,077, US7589637B2 (en) | 2005-12-30 | 2005-12-30 | Monitoring activity of an individual
US11/788,178, US8164461B2 (en) | 2005-12-30 | 2007-04-19 | Monitoring task performance

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
US11/323,077 (Continuation-In-Part), US7589637B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2005-12-30

Related Child Applications (1)

Application Number | Title | Priority Date | Filing Date
US13/324,711 (Continuation), US8872664B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2011-12-13

Publications (2)

Publication Number | Publication Date
US20070192174A1 (en) | 2007-08-16
US8164461B2 (en) | 2012-04-24

Family

ID=38973664

Family Applications (6)

Application Number | Title | Priority Date | Filing Date
US11/788,178 (Active, expires 2026-09-29), US8164461B2 (en) | Monitoring task performance | 2005-12-30 | 2007-04-19
US13/324,711 (Active, expires 2026-01-17), US8872664B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2011-12-13
US14/524,717 (Active), US9396646B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2014-10-27
US15/212,776 (Active), US10115294B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2016-07-18
US16/174,741 (Active), US10475331B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2018-10-30
US16/678,144 (Abandoned), US20200074840A1 (en) | Monitoring activity of an individual | 2005-12-30 | 2019-11-08

Family Applications After (5)

Application Number | Title | Priority Date | Filing Date
US13/324,711 (Active, expires 2026-01-17), US8872664B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2011-12-13
US14/524,717 (Active), US9396646B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2014-10-27
US15/212,776 (Active), US10115294B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2016-07-18
US16/174,741 (Active), US10475331B2 (en) | Monitoring activity of an individual | 2005-12-30 | 2018-10-30
US16/678,144 (Abandoned), US20200074840A1 (en) | Monitoring activity of an individual | 2005-12-30 | 2019-11-08

Country Status (2)

Country | Link
US (6) | US8164461B2 (en)
WO (1) | WO2008130542A2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20110275042A1 (en)*2010-02-222011-11-10Warman David JHuman-motion-training system
US9044543B2 (en)2012-07-172015-06-02Elwha LlcUnmanned device utilization methods and systems
US9049168B2 (en)*2013-01-112015-06-02State Farm Mutual Automobile Insurance CompanyHome sensor data gathering for neighbor notification purposes
US9061102B2 (en)2012-07-172015-06-23Elwha LlcUnmanned device interaction methods and systems
US9361778B1 (en)2013-03-152016-06-07Gary GermanHands-free assistive and preventive remote monitoring system
US9396646B2 (en)2005-12-302016-07-19Healthsense, Inc.Monitoring activity of an individual
US9426292B1 (en)*2015-12-292016-08-23International Business Machines CorporationCall center anxiety feedback processor (CAFP) for biomarker based case assignment
US20190109947A1 (en)*2016-03-232019-04-11Koninklijke Philips N.V.Systems and methods for matching subjects with care consultants in telenursing call centers
US10311694B2 (en)2014-02-062019-06-04Empoweryu, Inc.System and method for adaptive indirect monitoring of subject for well-being in unattended setting
US20190213100A1 (en)*2016-07-222019-07-11Intel CorporationAutonomously adaptive performance monitoring
US10475141B2 (en)2014-02-062019-11-12Empoweryu, Inc.System and method for adaptive indirect monitoring of subject for well-being in unattended setting
US11169613B2 (en)*2018-05-302021-11-09Atheer, Inc.Augmented reality task flow optimization systems
US11438435B2 (en)2019-03-012022-09-06Microsoft Technology Licensing, LlcUser interaction and task management using multiple devices

Families Citing this family (72)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
KR100750999B1 (en)*2004-12-202007-08-22삼성전자주식회사Device and method for processing call/message-related event in wireless terminal
US20080077020A1 (en)2006-09-222008-03-27Bam Labs, Inc.Method and apparatus for monitoring vital signs remotely
US20080201158A1 (en)2007-02-152008-08-21Johnson Mark DSystem and method for visitation management in a controlled-access environment
US8026814B1 (en)2007-07-252011-09-27Pinpoint Technologies Inc.Wireless mesh network for an asset tracking system
US8400268B1 (en)2007-07-252013-03-19Pinpoint Technologies Inc.End to end emergency response
US7893843B2 (en)*2008-06-182011-02-22Healthsense, Inc.Activity windowing
NL1036271C2 (en)*2008-12-032010-06-07Irina Til DEVICE FOR MEMORY ACTIVATION OF DEMENTING PEOPLE FOR INDEPENDENT PERFORMANCE OF EVERYDAY SELF-CARE TREATMENTS.
US20100262403A1 (en)*2009-04-102010-10-14Bradford White CorporationSystems and methods for monitoring water heaters or boilers
US8164444B2 (en)*2009-04-292012-04-24Healthsense, Inc.Position detection
EP2454921B1 (en)*2009-07-152013-03-13Koninklijke Philips Electronics N.V.Activity adapted automation of lighting
US8917181B2 (en)*2010-05-072014-12-23Mikael EdlundMethod for monitoring an individual
US9064391B2 (en)2011-12-202015-06-23Techip International LimitedTamper-alert resistant bands for human limbs and associated monitoring systems and methods
US8736447B2 (en)2011-12-202014-05-27Techip International LimitedTamper-resistant monitoring systems and methods
WO2014100488A1 (en)*2012-12-192014-06-26Robert Bosch GmbhPersonal emergency response system by nonintrusive load monitoring
US11872053B1 (en)*2013-02-222024-01-16Cloud Dx, Inc.Systems and methods for monitoring medication effectiveness
US11612352B1 (en)*2013-02-222023-03-28Cloud Dx, Inc.Systems and methods for monitoring medication effectiveness
US20140272847A1 (en)*2013-03-142014-09-18Edulock, Inc.Method and system for integrated reward system for education related applications
US20140272894A1 (en)*2013-03-132014-09-18Edulock, Inc.System and method for multi-layered education based locking of electronic computing devices
US20140255889A1 (en)*2013-03-102014-09-11Edulock, Inc.System and method for a comprehensive integrated education system
US20140278895A1 (en)*2013-03-122014-09-18Edulock, Inc.System and method for instruction based access to electronic computing devices
US9913003B2 (en)*2013-03-142018-03-06Alchera IncorporatedProgrammable monitoring system
US9008890B1 (en)2013-03-152015-04-14Google Inc.Augmented trajectories for autonomous vehicles
US8996224B1 (en)2013-03-152015-03-31Google Inc.Detecting that an autonomous vehicle is in a stuck condition
US20140278686A1 (en)*2013-03-152014-09-18Desire2Learn IncorporatedMethod and system for automatic task time estimation and scheduling
EP2989620A1 (en)*2013-04-222016-03-02Domosafety SASystem and method for automated triggering and management of alarms
KR20140134109A (en)*2013-05-132014-11-21엘에스산전 주식회사Solitary senior people care system
WO2015054501A2 (en)*2013-10-092015-04-16Stadson TechnologySafety system utilizing personal area network communication protocols between other devices
CN103680061B (en)*2013-12-112016-06-08苏州市职业大学A kind of indoor old solitary people security control device
US9355534B2 (en)*2013-12-122016-05-31Nokia Technologies OyCausing display of a notification on a wrist worn apparatus
US20160253910A1 (en)*2014-02-262016-09-01Cynthia A. FisherSystem and Method for Computer Guided Interaction on a Cognitive Prosthetic Device for Users with Cognitive Disabilities
US9460612B2 (en)2014-05-012016-10-04Techip International LimitedTamper-alert and tamper-resistant band
US9293029B2 (en)*2014-05-222016-03-22West CorporationSystem and method for monitoring, detecting and reporting emergency conditions using sensors belonging to multiple organizations
US10356649B2 (en)*2014-09-262019-07-16Intel CorporationMultisensory change detection for internet of things domain
US9754465B2 (en)*2014-10-302017-09-05International Business Machines CorporationCognitive alerting device
US10671954B2 (en)*2015-02-232020-06-02Google LlcSelective reminders to complete interrupted tasks
US20180053397A1 (en)*2015-03-052018-02-22Ent. Services Development Corporation LpActivating an alarm if a living being is present in an enclosed space with ambient temperature outside a safe temperature range
US9805587B2 (en)2015-05-192017-10-31Ecolink Intelligent Technology, Inc.DIY monitoring apparatus and method
WO2017003764A1 (en)2015-07-022017-01-05Select Comfort CorporationAutomation for improved sleep quality
WO2017013608A1 (en)*2015-07-202017-01-26Opterna Technology LimitedCommunications system having a plurality of sensors to remotely monitor a living environment
US9953511B2 (en)*2015-09-162018-04-24Honeywell International Inc.Portable security device that communicates with home security system monitoring service
US11283877B2 (en)2015-11-042022-03-22Zoox, Inc.Software application and logic to modify configuration of an autonomous vehicle
US9606539B1 (en)2015-11-042017-03-28Zoox, Inc.Autonomous vehicle fleet service and system
US10248119B2 (en)2015-11-042019-04-02Zoox, Inc.Interactive autonomous vehicle command controller
US9630619B1 (en)2015-11-042017-04-25Zoox, Inc.Robotic vehicle active safety systems and methods
US10334050B2 (en)*2015-11-042019-06-25Zoox, Inc.Software application and logic to modify configuration of an autonomous vehicle
US10401852B2 (en)2015-11-042019-09-03Zoox, Inc.Teleoperation system and method for trajectory modification of autonomous vehicles
WO2017079341A2 (en)2015-11-042017-05-11Zoox, Inc.Automated extraction of semantic information to enhance incremental mapping modifications for robotic vehicles
US9632502B1 (en)2015-11-042017-04-25Zoox, Inc.Machine-learning systems and techniques to optimize teleoperation and/or planner decisions
US9754490B2 (en)2015-11-042017-09-05Zoox, Inc.Software application to request and control an autonomous vehicle service
US12265386B2 (en)2015-11-042025-04-01Zoox, Inc.Autonomous vehicle fleet service and system
US10572961B2 (en)2016-03-152020-02-25Global Tel*Link CorporationDetection and prevention of inmate to inmate message relay
US9609121B1 (en)2016-04-072017-03-28Global Tel*Link CorporationSystem and method for third party monitoring of voice and video calls
EP3516559A1 (en)*2016-09-202019-07-31HeartFlow, Inc.Systems and methods for monitoring and updating blood flow calculations with user-specific anatomic and physiologic sensor data
US11061416B2 (en)2016-11-222021-07-13Wint Wi LtdWater profile used to detect malfunctioning water appliances
US20180302403A1 (en)*2017-04-132018-10-18Plas.md, Inc.System and method for location-based biometric data collection and processing
US10225396B2 (en)2017-05-182019-03-05Global Tel*Link CorporationThird party monitoring of a activity within a monitoring platform
US10860786B2 (en)2017-06-012020-12-08Global Tel*Link CorporationSystem and method for analyzing and investigating communication data from a controlled environment
CN107707657B (en)*2017-09-302021-08-06苏州涟漪信息科技有限公司Safety monitoring system based on multiple sensors
US10497475B2 (en)2017-12-012019-12-03Verily Life Sciences LlcContextually grouping sensor channels for healthcare monitoring
WO2019115308A1 (en)*2017-12-132019-06-20Koninklijke Philips N.V.Personalized assistance for impaired subjects
EP3546153B1 (en)2018-03-272021-05-12Braun GmbHPersonal care device
EP3546151B1 (en)2018-03-272025-04-30Braun GmbH BODY CARE DEVICE
US11410257B2 (en)2019-01-082022-08-09Rauland-Borg CorporationMessage boards
US11626010B2 (en)*2019-02-282023-04-11Nortek Security & Control LlcDynamic partition of a security system
US12165495B2 (en)*2019-02-282024-12-10Nice North America LlcVirtual partition of a security system
US11270799B2 (en)*2019-08-202022-03-08Vinya Intelligence Inc.In-home remote monitoring systems and methods for predicting health status decline
US11393326B2 (en)*2019-09-122022-07-19Rauland-Borg CorporationEmergency response drills
US11482323B2 (en)2019-10-012022-10-25Rauland-Borg CorporationEnhancing patient care via a structured methodology for workflow stratification
DE102019128456A1 (en)*2019-10-222021-04-22Careiot GmbH Procedures for monitoring people in need
CN113781692A (en)*2020-06-102021-12-10骊住株式会社 space management device
WO2022010398A1 (en)*2020-07-102022-01-13Telefonaktiebolaget Lm Ericsson (Publ)Conditional reconfiguration based on data traffic
US12112343B2 (en)2022-02-242024-10-08Klaviyo, Inc.Detecting changes in customer (user) behavior using a normalization value

Citations (39)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5447166A (en)1991-09-261995-09-05Gevins; Alan S.Neurocognitive adaptive computer interface method and system based on on-line measurement of the user's mental effort
DE19522803A1 (en)1995-06-231997-01-02Klaus VorschmittSafety monitoring of individuals requiring care
US5724987A (en)1991-09-261998-03-10Sam Technology, Inc.Neurocognitive adaptive computer-aided training method and system
US5810747A (en)*1996-08-211998-09-22Interactive Remote Site Technology, Inc.Remote site medical intervention system
US5890905A (en)1995-01-201999-04-06Bergman; Marilyn M.Educational and life skills organizer/memory aid
US5905436A (en)1996-10-241999-05-18Gerontological Solutions, Inc.Situation-based monitoring system
US6042383A (en)1998-05-262000-03-28Herron; Lois J.Portable electronic device for assisting persons with learning disabilities and attention deficit disorders
US6108685A (en)1994-12-232000-08-22Behavioral Informatics, Inc.System for generating periodic reports generating trend analysis and intervention for monitoring daily living activity
US6281790B1 (en)*1999-09-012001-08-28Net Talon Security Systems, Inc.Method and apparatus for remotely monitoring a site
US6402520B1 (en)1997-04-302002-06-11Unique Logic And Technology, Inc.Electroencephalograph based biofeedback system for improving learning skills
US20020198473A1 (en)2001-03-282002-12-26Televital, Inc.System and method for real-time monitoring, assessment, analysis, retrieval, and storage of physiological data over a wide area network
US20030004652A1 (en)2001-05-152003-01-02Daniela BrunnerSystems and methods for monitoring behavior informatics
US6520905B1 (en)1998-02-262003-02-18Eastman Kodak CompanyManagement of physiological and psychological state of an individual using images portable biosensor device
US6524239B1 (en)1999-11-052003-02-25Wcr CompanyApparatus for non-instrusively measuring health parameters of a subject and method of use thereof
US6540674B2 (en)2000-12-292003-04-01Ibm CorporationSystem and method for supervising people with mental disorders
US6558165B1 (en)2001-09-112003-05-06Capticom, Inc.Attention-focusing device and method of use
US20030117279A1 (en)2001-12-252003-06-26Reiko UenoDevice and system for detecting abnormality
US20030130590A1 (en)1998-12-232003-07-10Tuan BuiMethod and apparatus for providing patient care
US20030185436A1 (en)*2002-03-262003-10-02Smith David R.Method and system of object classification employing dimension reduction
US20030189485A1 (en)2002-03-272003-10-09Smith Simon LawrenceSystem for monitoring an inhabited environment
US20030216670A1 (en)*2002-05-172003-11-20Beggs George R.Integral, flexible, electronic patient sensing and monitoring system
US20030229471A1 (en)2002-01-222003-12-11Honeywell International Inc.System and method for learning patterns of behavior and operating a monitoring and response system based thereon
US20030236451A1 (en)*2002-04-032003-12-25The Procter & Gamble CompanyMethod and apparatus for measuring acute stress
US20040131998A1 (en)2001-03-132004-07-08Shimon MaromCerebral programming
US20040191747A1 (en)*2003-03-262004-09-30Hitachi, Ltd.Training assistant system
US20040219498A1 (en)*2002-04-092004-11-04Davidson Lance SamuelTraining apparatus and methods
US20050024199A1 (en)*2003-06-112005-02-03Huey John H.Combined systems user interface for centralized monitoring of a screening checkpoint for passengers and baggage
US20050057357A1 (en)*2003-07-102005-03-17University Of Florida Research Foundation, Inc.Daily task and memory assistance using a mobile device
US20050065452A1 (en)*2003-09-062005-03-24Thompson James W.Interactive neural training device
US20050073391A1 (en)*2003-10-022005-04-07Koji MizobuchiData processing apparatus
US20050131736A1 (en)2003-12-162005-06-16Adventium Labs And Red Wing Technologies, Inc.Activity monitoring
US20050137465A1 (en)*2003-12-232005-06-23General Electric CompanySystem and method for remote monitoring in home activity of persons living independently
US6950026B2 (en)2003-02-172005-09-27National Institute Of Information And Communication Technology Incorporated Administrative AgencyMethod for complementing personal lost memory information with communication, and communication system, and information recording medium thereof
US20050244797A9 (en)2001-05-142005-11-03Torkel KlingbergMethod and arrangement in a computer training system
US20050264425A1 (en)*2004-06-012005-12-01Nobuo SatoCrisis monitoring system
US20060161218A1 (en)*2003-11-262006-07-20Wicab, Inc.Systems and methods for treating traumatic brain injury
US20070032738A1 (en)*2005-01-062007-02-08Flaherty J CAdaptive patient training routine for biological interface system
US20070132597A1 (en)*2005-12-092007-06-14Valence Broadband, Inc.Methods and systems for monitoring patient support exiting and initiating response
US20070152837A1 (en)*2005-12-302007-07-05Red Wing Technologies, Inc.Monitoring activity of an individual

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5400246A (en)1989-05-091995-03-21Ansan Industries, Ltd.Peripheral data acquisition, monitor, and adaptive control system via personal computer
US6542076B1 (en)1993-06-082003-04-01Raymond Anthony JoaoControl, monitoring and/or security apparatus and method
US6160481A (en)*1997-09-102000-12-12Taylor, Jr.; John EMonitoring system
US7277414B2 (en)*2001-08-032007-10-02Honeywell International Inc.Energy aware network management
JP2003157481A (en)2001-11-212003-05-30Allied Tereshisu KkAged-person support system using repeating installation and aged-person support device
US20030114763A1 (en)*2001-12-132003-06-19Reddy Shankara B.Fusion of computerized medical data
US7091865B2 (en)2004-02-042006-08-15General Electric CompanySystem and method for determining periods of interest in home of persons living independently
US8894576B2 (en)2004-03-102014-11-25University Of Virginia Patent FoundationSystem and method for the inference of activities of daily living and instrumental activities of daily living automatically
US20050231356A1 (en)*2004-04-052005-10-20Bish Danny RHands-free portable receiver assembly for use with baby monitor systems
JP3857278B2 (en)2004-04-062006-12-13Smk株式会社 Touch panel input device
US7242305B2 (en)2004-04-092007-07-10General Electric CompanyDevice and method for monitoring movement within a home
US7154399B2 (en)2004-04-092006-12-26General Electric CompanySystem and method for determining whether a resident is at home or away
US20060055543A1 (en)*2004-09-102006-03-16Meena GaneshSystem and method for detecting unusual inactivity of a resident
US20060089538A1 (en)*2004-10-222006-04-27General Electric CompanyDevice, system and method for detection activity of persons
US7420472B2 (en)*2005-10-162008-09-02Bao TranPatient monitoring apparatus
US8164461B2 (en)2005-12-302012-04-24Healthsense, Inc.Monitoring task performance

Patent Citations (42)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US5724987A (en)1991-09-261998-03-10Sam Technology, Inc.Neurocognitive adaptive computer-aided training method and system
US5447166A (en)1991-09-261995-09-05Gevins; Alan S.Neurocognitive adaptive computer interface method and system based on on-line measurement of the user's mental effort
US6108685A (en)1994-12-232000-08-22Behavioral Informatics, Inc.System for generating periodic reports generating trend analysis and intervention for monitoring daily living activity
US5890905A (en)1995-01-201999-04-06Bergman; Marilyn M.Educational and life skills organizer/memory aid
DE19522803A1 (en)1995-06-231997-01-02Klaus VorschmittSafety monitoring of individuals requiring care
US5810747A (en)*1996-08-211998-09-22Interactive Remote Site Technology, Inc.Remote site medical intervention system
US5905436A (en)1996-10-241999-05-18Gerontological Solutions, Inc.Situation-based monitoring system
US6626676B2 (en)1997-04-302003-09-30Unique Logic And Technology, Inc.Electroencephalograph based biofeedback system for improving learning skills
US6402520B1 (en)1997-04-302002-06-11Unique Logic And Technology, Inc.Electroencephalograph based biofeedback system for improving learning skills
US6520905B1 (en)1998-02-262003-02-18Eastman Kodak CompanyManagement of physiological and psychological state of an individual using images portable biosensor device
US6042383A (en)1998-05-262000-03-28Herron; Lois J.Portable electronic device for assisting persons with learning disabilities and attention deficit disorders
US20030130590A1 (en)1998-12-232003-07-10Tuan BuiMethod and apparatus for providing patient care
US6281790B1 (en)*1999-09-012001-08-28Net Talon Security Systems, Inc.Method and apparatus for remotely monitoring a site
US6821258B2 (en)1999-11-052004-11-23Wcr CompanySystem and method for monitoring frequency and intensity of movement by a recumbent subject
US6524239B1 (en)1999-11-052003-02-25Wcr CompanyApparatus for non-instrusively measuring health parameters of a subject and method of use thereof
US20030114736A1 (en)1999-11-052003-06-19Wcr CompanySystem and method for monitoring frequency and intensity of movement by a recumbent subject
US6540674B2 (en)2000-12-292003-04-01Ibm CorporationSystem and method for supervising people with mental disorders
US20040131998A1 (en)2001-03-132004-07-08Shimon MaromCerebral programming
US20020198473A1 (en)2001-03-282002-12-26Televital, Inc.System and method for real-time monitoring, assessment, analysis, retrieval, and storage of physiological data over a wide area network
US20050244797A9 (en)2001-05-142005-11-03Torkel KlingbergMethod and arrangement in a computer training system
US20030004652A1 (en)2001-05-152003-01-02Daniela BrunnerSystems and methods for monitoring behavior informatics
US6558165B1 (en)2001-09-112003-05-06Capticom, Inc.Attention-focusing device and method of use
US20030117279A1 (en)2001-12-252003-06-26Reiko UenoDevice and system for detecting abnormality
US20030229471A1 (en)2002-01-222003-12-11Honeywell International Inc.System and method for learning patterns of behavior and operating a monitoring and response system based thereon
US20030185436A1 (en)*2002-03-262003-10-02Smith David R.Method and system of object classification employing dimension reduction
US20030189485A1 (en)2002-03-272003-10-09Smith Simon LawrenceSystem for monitoring an inhabited environment
US20030236451A1 (en)*2002-04-032003-12-25The Procter & Gamble CompanyMethod and apparatus for measuring acute stress
US20040219498A1 (en)*2002-04-092004-11-04Davidson Lance SamuelTraining apparatus and methods
US20030216670A1 (en)*2002-05-172003-11-20Beggs George R.Integral, flexible, electronic patient sensing and monitoring system
US6950026B2 (en)2003-02-172005-09-27National Institute Of Information And Communication Technology Incorporated Administrative AgencyMethod for complementing personal lost memory information with communication, and communication system, and information recording medium thereof
US20040191747A1 (en)*2003-03-262004-09-30Hitachi, Ltd.Training assistant system
US20050024199A1 (en)*2003-06-112005-02-03Huey John H.Combined systems user interface for centralized monitoring of a screening checkpoint for passengers and baggage
US20050057357A1 (en)*2003-07-102005-03-17University Of Florida Research Foundation, Inc.Daily task and memory assistance using a mobile device
US20050065452A1 (en)*2003-09-062005-03-24Thompson James W.Interactive neural training device
US20050073391A1 (en)*2003-10-022005-04-07Koji MizobuchiData processing apparatus
US20060161218A1 (en)*2003-11-262006-07-20Wicab, Inc.Systems and methods for treating traumatic brain injury
US20050131736A1 (en)2003-12-162005-06-16Adventium Labs And Red Wing Technologies, Inc.Activity monitoring
US20050137465A1 (en)*2003-12-232005-06-23General Electric CompanySystem and method for remote monitoring in home activity of persons living independently
US20050264425A1 (en)*2004-06-012005-12-01Nobuo SatoCrisis monitoring system
US20070032738A1 (en)*2005-01-062007-02-08Flaherty J CAdaptive patient training routine for biological interface system
US20070132597A1 (en)*2005-12-092007-06-14Valence Broadband, Inc.Methods and systems for monitoring patient support exiting and initiating response
US20070152837A1 (en)*2005-12-302007-07-05Red Wing Technologies, Inc.Monitoring activity of an individual

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
AIST, Housing That Protects the Home-Maker-Development of Technology Capable of Detecting Abnormalities in the Ordinary Living Pattern Home-Maker for Information Signaling, Feb. 3, 2003.

Cited By (29)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US9396646B2 (en)2005-12-302016-07-19Healthsense, Inc.Monitoring activity of an individual
US10475331B2 (en)2005-12-302019-11-12GreatCall, Inc.Monitoring activity of an individual
US10115294B2 (en)2005-12-302018-10-30Healthsense, Inc.Monitoring activity of an individual
US20110275042A1 (en)*2010-02-222011-11-10Warman David JHuman-motion-training system
US10019000B2 (en)2012-07-172018-07-10Elwha LlcUnmanned device utilization methods and systems
US9254363B2 (en)2012-07-172016-02-09Elwha LlcUnmanned device interaction methods and systems
US9044543B2 (en)2012-07-172015-06-02Elwha LlcUnmanned device utilization methods and systems
US9713675B2 (en)2012-07-172017-07-25Elwha LlcUnmanned device interaction methods and systems
US9733644B2 (en)2012-07-172017-08-15Elwha LlcUnmanned device interaction methods and systems
US9798325B2 (en)2012-07-172017-10-24Elwha LlcUnmanned device interaction methods and systems
US9061102B2 (en)2012-07-172015-06-23Elwha LlcUnmanned device interaction methods and systems
US9049168B2 (en)*2013-01-112015-06-02State Farm Mutual Automobile Insurance CompanyHome sensor data gathering for neighbor notification purposes
US9344330B2 (en)2013-01-112016-05-17State Farm Mutual Automobile Insurance CompanyHome sensor data gathering for neighbor notification purposes
US9361778B1 (en)2013-03-152016-06-07Gary GermanHands-free assistive and preventive remote monitoring system
US20170154516A1 (en)*2013-03-152017-06-01Nonnatech Inc.Hands-free assistive and preventive remote monitoring system
US10311694B2 (en)2014-02-062019-06-04Empoweryu, Inc.System and method for adaptive indirect monitoring of subject for well-being in unattended setting
US10475141B2 (en)2014-02-062019-11-12Empoweryu, Inc.System and method for adaptive indirect monitoring of subject for well-being in unattended setting
US9602669B1 (en)2015-12-292017-03-21International Business Machines CorporationCall center anxiety feedback processor (CAFP) for biomarker based case assignment
US9426292B1 (en)*2015-12-292016-08-23International Business Machines CorporationCall center anxiety feedback processor (CAFP) for biomarker based case assignment
US10701210B2 (en)*2016-03-232020-06-30Koninklijke Philips N.V.Systems and methods for matching subjects with care consultants in telenursing call centers
US20190109947A1 (en)*2016-03-232019-04-11Koninklijke Philips N.V.Systems and methods for matching subjects with care consultants in telenursing call centers
US10983894B2 (en)*2016-07-222021-04-20Intel CorporationAutonomously adaptive performance monitoring
US20190213100A1 (en)*2016-07-222019-07-11Intel CorporationAutonomously adaptive performance monitoring
US11169613B2 (en)*2018-05-302021-11-09Atheer, Inc.Augmented reality task flow optimization systems
US20220028297A1 (en)*2018-05-302022-01-27Atheer, Inc.Augmented reality task flow optimization systems
US11747909B2 (en)*2018-05-302023-09-05West Texas Technology Partners, LlcAugmented reality task flow optimization systems
US20240241586A1 (en)*2018-05-302024-07-18West Texas Technology Partners, LlcAugmented reality task flow optimization systems
US12086326B2 (en)2018-05-302024-09-10West Texas Technology Partners, LlcAugmented reality head gesture recognition systems
US11438435B2 (en)2019-03-012022-09-06Microsoft Technology Licensing, LlcUser interaction and task management using multiple devices

Also Published As

Publication number | Publication date
US10475331B2 (en) | 2019-11-12
US20190130732A1 (en) | 2019-05-02
US20170011617A1 (en) | 2017-01-12
US20070192174A1 (en) | 2007-08-16
US8872664B2 (en) | 2014-10-28
US20200074840A1 (en) | 2020-03-05
WO2008130542A3 (en) | 2009-02-26
US9396646B2 (en) | 2016-07-19
WO2008130542A2 (en) | 2008-10-30
US10115294B2 (en) | 2018-10-30
US20150179048A1 (en) | 2015-06-25
US20120086573A1 (en) | 2012-04-12

Similar Documents

PublicationPublication DateTitle
US8164461B2 (en)Monitoring task performance
US12217861B2 (en)Medication adherence device and coordinated care platform
KR102207631B1 (en)Methods and systems for remotely determining levels of healthcare interventions
US12354456B2 (en)Method and system to improve accuracy of fall detection using multi-sensor fusion
US20220304577A1 (en)Method and system to reduce infrastructure costs with simplified indoor location and reliable commumications
US10602964B2 (en)Location, activity, and health compliance monitoring using multidimensional context analysis
US9990827B2 (en)Wireless patient care system and method
US9293023B2 (en)Techniques for emergency detection and emergency alert messaging
CA2658604C (en)Remote device for a monitoring system
US20130150686A1 (en)Human Care Sentry System
CN104054390A (en)Wireless relay module for remote monitoring systems having power and medical device proximity monitoring functionality
WO2015143085A1 (en)Techniques for wellness monitoring and emergency alert messaging
WO2010104825A1 (en)Delivering media as compensation for cognitive deficits using labeled objects in surroundings
US20240194316A1 (en)Facilitating adherence to tasks designed to maintain or improve health
JP3687842B2 (en) Telemeter thermometer and temperature measuring system with telemeter emergency call function using it
Kutzik et al.Technological tools of the future
EP4312228A1 (en)System for supporting a patient's health control and operating method of such system
JP2003275180A (en)Health monitoring system utilizing hemadynamometer, automatic rescue notification system utilizing hemadynamometer, hemadynamometer, and receiving device

Legal Events

DateCodeTitleDescription
ASAssignment

Owner name:RED WING TECHNOLOGIES, INC., MINNESOTA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BISCHOFF, BRIAN J.;REEL/FRAME:019326/0224

Effective date:20070418

ASAssignment

Owner name:HEALTHSENSE, INC., MINNESOTA

Free format text:CHANGE OF NAME;ASSIGNOR:RED WING TECHNOLOGIES, INC.;REEL/FRAME:020092/0032

Effective date:20071014

ASAssignment

Owner name:HEALTHSENSE, MINNESOTA

Free format text:CHANGE OF NAME;ASSIGNOR:RED WING TECHNOLOGIES, INC.;REEL/FRAME:020571/0472

Effective date:20080123

STCFInformation on status: patent grant

Free format text:PATENTED CASE

ASAssignment

Owner name:BRIDGE BANK, NATIONAL ASSOCIATION, CALIFORNIA

Free format text:SECURITY AGREEMENT;ASSIGNOR:HEALTHSENSE, INC.;REEL/FRAME:029198/0532

Effective date:20121019

FPAYFee payment

Year of fee payment:4

FEPPFee payment procedure

Free format text:PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

ASAssignment

Owner name:THE NORTHWESTERN MUTUAL LIFE INSURANCE COMPANY, AS COLLATERAL AGENT, WISCONSIN

Free format text:SECOND LIEN PATENT SECURITY AGREEMENT;ASSIGNOR:GREATCALL, INC.;REEL/FRAME:043360/0117

Effective date:20170714

ASAssignment

Owner name:GREATCALL, INC., CALIFORNIA

Free format text:RELEASE BY SECURED PARTY;ASSIGNOR:THE NORTHWESTERN MUTUAL LIFE INSURANCE COMPANY, AS COLLATERAL AGENT;REEL/FRAME:047174/0959

Effective date:20181001

MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:8

ASAssignment

Owner name:GREATCALL, INC., CALIFORNIA

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEALTHSENSE, INC.;REEL/FRAME:051651/0137

Effective date:20161216

ASAssignment

Owner name:BEST BUY HEALTH, INC., MINNESOTA

Free format text:MERGER;ASSIGNOR:GREATCALL, INC.;REEL/FRAME:052522/0693

Effective date:20200201

MAFPMaintenance fee payment

Free format text:PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment:12


[8]ページ先頭

©2009-2025 Movatter.jp