The present application is a divisional of the patent application having application number 202110592532.9, filed on 28/05/2021 and entitled "method and system for rehabilitating a subject via telemedicine".
The present application claims priority to and the benefit of U.S. patent application serial No. 17/147,428, filed on 12/1/2021; U.S. patent application serial No. 63/048,456, filed on 6/7/2020; U.S. patent application serial No. 17/021,895, filed on 15/9/2020; PCT application serial No. US2021/028655; U.S. patent application serial No. 17/147,439; U.S. patent application serial No. 63/104,716, filed on 23/10/2020; U.S. patent application serial No. 17/147,211, filed on 12/1/2021; and U.S. patent application serial No. 63/088,657, filed on 7/10/2020, the entire disclosures of which are incorporated herein by reference.
Detailed Description
The following discussion is directed to various embodiments of the disclosure. Although one or more of these embodiments may be preferred, the embodiments disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any embodiment is meant only to be exemplary of that embodiment, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that embodiment.
Determining the optimal remote examination procedure to create an optimal rehabilitation plan for a subject with certain characteristics (e.g., vital signs or other measurements, performance, demographic, psychological, geographic, diagnostic, measurement- or test-based, medical history, etiology, group-associated, differential diagnostic, surgical, physical therapy, pharmacological, and other recommended treatments, etc.) can be a technically challenging problem. For example, a large amount of information may be considered when determining a rehabilitation plan, which may lead to inefficiencies and inaccuracies in the rehabilitation plan selection process. In a rehabilitation environment, some of the vast amounts of information considered may include characteristics of the subject such as personal information, performance information, and measurement information. Personal information may include, for example, demographic, psychological, or other information, such as: age, weight, gender, height, body mass index, medical condition, family medical history, injury, medical procedure, prescribed medication, or some combination thereof. The performance information may include, for example, an elapsed time of use of the rehabilitation device, an amount of force exerted on a portion of the rehabilitation device, a range of motion achieved on the rehabilitation device, a speed of movement of a portion of the rehabilitation device, indications of various pain levels while using the rehabilitation device, or some combination thereof. The measurement information may include, for example, vital signs, respiratory rate, heart rate, temperature, blood pressure, or some combination thereof. It may be necessary to process the characteristics of a large number of subjects, the rehabilitation plans performed for those subjects, and the results of those rehabilitation plans.
Further, another technical problem may relate to remotely treating a subject, via a computing device, from a location different from the location at which the subject is located during a telemedicine or teletherapy session. Another technical problem is controlling, or being able to control, from a different location, a rehabilitation device used by a subject at the subject's location. Typically, when a subject undergoes a medical procedure (e.g., knee surgery), a healthcare provider may prescribe a rehabilitation device for the subject to use in performing a treatment regimen in their home or any mobile location or temporary dwelling. A healthcare provider may refer to a doctor, a physician's assistant, a nurse, a chiropractor, a dentist, a physical therapist, an acupuncturist, a physical trainer, a coach, a personal trainer, a neurologist, a cardiologist, and the like. A healthcare provider may refer to any person having a certificate, license, degree, etc. in the fields of medicine, physical therapy, rehabilitation, etc.
When the healthcare provider is located in a different location than the subject and the rehabilitation device, it can be technically challenging for the healthcare provider to use the rehabilitation device to monitor the subject's actual progress (rather than relying on the subject's self-reported progress), to modify the rehabilitation plan according to that progress, and to adapt the rehabilitation device to the subject's personal characteristics while the subject performs the rehabilitation plan.
Accordingly, there may be a need for systems and methods of using sensor data to modify a rehabilitation plan and/or adjust a rehabilitation device while a subject is performing the rehabilitation plan using the rehabilitation device, such as those described herein.
In some embodiments, the systems and methods described herein may be configured to receive rehabilitation data relating to a user while the user performs a rehabilitation plan using a rehabilitation device. The user may include a subject or another person performing various exercises using the rehabilitation device. The rehabilitation plan may correspond to a rehabilitation plan, a pre-rehabilitation plan, an athletic rehabilitation plan, or another suitable plan. The rehabilitation data may include various characteristics of the user, various measurement information related to the user while the user uses the rehabilitation device, various characteristics of the rehabilitation device, the rehabilitation plan, other suitable data, or combinations thereof.
In some embodiments, when the user performs a rehabilitation plan using the rehabilitation device, at least some of the rehabilitation data may correspond to sensor data of a sensor configured to sense various features of the rehabilitation device and/or measurement information of the user. Additionally or alternatively, when the user performs a rehabilitation plan using the rehabilitation device, at least some of the rehabilitation data may correspond to sensor data from sensors associated with a wearable device configured to sense measurement information of the user.
The various features of the rehabilitation device may include one or more settings of the rehabilitation device, a current number of revolutions per time period (e.g., one minute) of a rotating member (e.g., a wheel) of the rehabilitation device, a resistance setting of the rehabilitation device, other suitable features of the rehabilitation device, or a combination thereof. The measurement information may include one or more vital signs of the user, a breathing rate of the user, a heart rate of the user, a body temperature of the user, a blood pressure of the user, other suitable measurement information of the user, or a combination thereof.
In some embodiments, the systems and methods described herein may be configured to generate rehabilitation information using the rehabilitation data. The rehabilitation information may include a formatted summary of the user's performance of the rehabilitation plan while using the rehabilitation device, such that the rehabilitation data may be presented on a computing device of a healthcare provider responsible for the user's performance of the rehabilitation plan. The healthcare provider can include a medical professional (e.g., doctor, nurse, therapist, etc.), an athletic professional (e.g., coach, dietician, etc.), or a similar professional sharing at least one of medical and athletic attributes (e.g., an exercise physiologist, a physical therapist, an occupational therapist, etc.). As used herein, and without limiting the foregoing, a "healthcare provider" may be a human, a robot, a virtual assistant in virtual and/or augmented reality, or an artificial intelligence entity comprising a software program, integrated software and hardware, or separate hardware.
The systems and methods described herein may be configured to write the rehabilitation information to an associated memory for access at and/or provision to a computing device of a healthcare provider. For example, the systems and methods described herein may be configured to provide the rehabilitation information to an interface configured to present the rehabilitation information to the healthcare provider. The interface may include a graphical user interface configured to provide the rehabilitation information and receive input from the healthcare provider. The interface may include one or more input fields, such as a text input field, a drop-down selection input field, a radio button input field, a virtual switch input field, a virtual joystick input field, audio, tactile, or biometric gesture recognition, gesture control, a touchless user interface (TUI), a kinetic user interface (KUI), a tangible user interface, a wired glove, a depth-aware camera, a stereo camera, a gesture-based controller or other activated and/or actuated input field, other suitable input fields, or a combination thereof.
In some embodiments, the healthcare provider may review the rehabilitation information and determine whether to modify the rehabilitation plan and/or one or more features of the rehabilitation device. For example, the healthcare provider may view the rehabilitation information and compare the rehabilitation information to the rehabilitation plan being executed by the user.
The healthcare provider may compare (i) expected information related to the user while the user performs the rehabilitation plan using the rehabilitation device with (ii) measured information (e.g., indicated by the rehabilitation information) related to the user while the user performs the rehabilitation plan using the rehabilitation device. The expected information may include one or more vital signs of the user, a respiratory rate of the user, a heart rate of the user, a body temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof. If one or more portions of the measured information are within an acceptable range associated with one or more corresponding portions of the expected information, the healthcare provider may determine that the rehabilitation plan is having the desired effect. Conversely, if one or more portions of the measured information are outside of the range associated with one or more corresponding portions of the expected information, the healthcare provider may determine that the rehabilitation plan is not having the desired effect.
For example, the healthcare provider may determine whether a blood pressure value (e.g., systolic pressure, diastolic pressure, and/or pulse pressure) corresponding to the user (e.g., indicated by the measurement information) while the user is using the rehabilitation device is within an acceptable range (e.g., plus or minus 1%, plus or minus 5%, or any suitable range) of the expected blood pressure value indicated by the expected information. If the blood pressure value corresponding to the user is within the range of expected blood pressure values while the user is using the rehabilitation device, the healthcare provider may determine that the rehabilitation plan is having the desired effect. Conversely, if the blood pressure value corresponding to the user is outside of the range of expected blood pressure values while the user is using the rehabilitation device, the healthcare provider may determine that the rehabilitation plan does not have the desired effect.
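Purely as a non-limiting illustration, the acceptable-range comparison described above could be sketched as follows; the function names, the dictionary-based vital-sign representation, and the default 5% tolerance are hypothetical choices, not part of the disclosure:

```python
def within_tolerance(measured: float, expected: float, tolerance: float) -> bool:
    """Return True if `measured` falls within `expected` plus or minus the
    given fractional tolerance (e.g., 0.05 for plus or minus 5%)."""
    return expected * (1.0 - tolerance) <= measured <= expected * (1.0 + tolerance)


def plan_has_desired_effect(measured: dict, expected: dict, tolerance: float = 0.05) -> bool:
    """Deem the rehabilitation plan effective only if every measured value
    (e.g., systolic and diastolic pressure) is within tolerance of its
    expected counterpart."""
    return all(within_tolerance(measured[key], expected[key], tolerance) for key in expected)
```

For instance, a measured systolic pressure of 124 against an expected 120 passes a 5% check, whereas a measured 150 would not, which could prompt the provider to consider modifying the plan.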
In some embodiments, the healthcare provider may compare expected characteristics of the rehabilitation device while the user performs the rehabilitation plan using the rehabilitation device to the characteristics of the rehabilitation device indicated by the rehabilitation information. For example, the healthcare provider may compare an expected resistance setting of the rehabilitation device to the actual resistance setting of the rehabilitation device indicated by the rehabilitation information. If an actual characteristic of the rehabilitation device indicated by the rehabilitation information is within the range of the corresponding expected characteristic of the rehabilitation device, the healthcare provider may determine that the user is properly performing the rehabilitation plan. Conversely, if the actual characteristics of the rehabilitation device indicated by the rehabilitation information are outside of the range of the corresponding expected characteristics of the rehabilitation device, the healthcare provider may determine that the user is not performing the rehabilitation plan properly.
If the healthcare provider determines that the rehabilitation information indicates that the user is properly performing the rehabilitation plan and/or that the rehabilitation plan is having the desired effect, the healthcare provider may determine not to modify the rehabilitation plan or one or more features of the rehabilitation device. Conversely, if the healthcare provider determines that the rehabilitation information indicates that the user is not performing, or is not properly performing, the rehabilitation plan and/or that the rehabilitation plan is not having the desired effect, the healthcare provider may determine to modify the rehabilitation plan and/or one or more characteristics of the rehabilitation device while the user is performing the rehabilitation plan using the rehabilitation device.
In some embodiments, if the healthcare provider determines to modify the rehabilitation plan and/or one or more features of the rehabilitation device, the healthcare provider may interact with the interface to provide a rehabilitation plan input indicating one or more modifications to the rehabilitation plan and/or to the one or more features of the rehabilitation device. For example, the healthcare provider may use the interface to provide input indicative of an increase or decrease in a resistance setting of the rehabilitation device or other suitable modification to one or more features of the rehabilitation device. Additionally or alternatively, the healthcare provider may use the interface to provide input indicating a modification to the rehabilitation plan. For example, the healthcare provider may use the interface to provide input that indicates an increase or decrease in the amount of time required for the user to use the rehabilitation device in accordance with the rehabilitation plan, or other suitable modification to the rehabilitation plan.
In some embodiments, the systems and methods described herein may be configured to modify a rehabilitation plan based on one or more modifications of the rehabilitation plan input indication. Additionally or alternatively, the systems and methods described herein may be configured to modify one or more features of the rehabilitation device based on at least one aspect of the modified rehabilitation plan and/or the rehabilitation plan input. For example, the rehabilitation plan input may indicate a modification of one or more features of the rehabilitation device, and/or the rehabilitation plan may require or indicate an adjustment to the rehabilitation device to cause the user to obtain a desired result of the modified rehabilitation plan.
In some embodiments, the systems and methods described herein may be configured to receive subsequent rehabilitation data relating to the user while the user is performing a rehabilitation plan using the rehabilitation device. For example, after the healthcare provider provides input to modify the rehabilitation plan and/or control one or more features of the rehabilitation device, the user may continue to use the rehabilitation device to execute the modified rehabilitation plan. The subsequent rehabilitation data may correspond to rehabilitation data generated by the user while performing the modified rehabilitation plan using the rehabilitation device. In some embodiments, after the healthcare provider has received the rehabilitation information and determined not to modify the rehabilitation plan and/or control one or more features of the rehabilitation device, the subsequent rehabilitation data may correspond to the rehabilitation data generated while the user continues to perform the rehabilitation plan using the rehabilitation device.
Based on subsequent rehabilitation plan input received from the healthcare provider's computing device, the systems and methods described herein may be configured to further modify the rehabilitation plan and/or control one or more features of the rehabilitation device. In response to receiving and/or viewing subsequent rehabilitation information corresponding to the subsequent rehabilitation data, the subsequent rehabilitation plan input may correspond to an input provided by the healthcare provider at the interface. It should be understood that the systems and methods described herein may be configured to continuously and/or periodically provide rehabilitation information to a healthcare provider's computing device based on rehabilitation data continuously and/or periodically received from sensors or other suitable sources described herein.
The healthcare provider may continuously or periodically receive and/or view the rehabilitation information as the user performs the rehabilitation plan using the rehabilitation device. Based on one or more trends indicated by the continuously or periodically received rehabilitation information, the healthcare provider may determine whether to modify the rehabilitation plan and/or control one or more features of the rehabilitation device. For example, the one or more trends may indicate an increasing heart rate or another suitable trend indicating that the user is not properly performing the rehabilitation plan and/or that the user's performance of the rehabilitation plan is not having the desired effect.
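One minimal way to surface such a trend from periodically received readings is a least-squares slope over the recent samples. This sketch, including the 0.5 beats-per-minute-per-reading threshold and the requirement of at least three samples, is illustrative only and not part of the disclosed embodiments:

```python
def slope(values: list[float]) -> float:
    """Least-squares slope of the values over their sample index
    (units per reading)."""
    n = len(values)
    mean_x = (n - 1) / 2.0
    mean_y = sum(values) / n
    numerator = sum((i - mean_x) * (v - mean_y) for i, v in enumerate(values))
    denominator = sum((i - mean_x) ** 2 for i in range(n))
    return numerator / denominator


def heart_rate_rising(readings: list[float], threshold: float = 0.5) -> bool:
    """Flag a sustained upward heart-rate trend (more than `threshold`
    beats per minute per reading) so the provider can review the plan."""
    return len(readings) >= 3 and slope(readings) > threshold
```

A steadily climbing sequence of readings would be flagged, while readings fluctuating around a stable baseline would not.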
In some embodiments, the systems and methods described herein may be configured to use artificial intelligence and/or machine learning to assign subjects to groups (cohorts) and to dynamically control a rehabilitation device during an adaptive telemedicine session based on the assignment. In some embodiments, a number of rehabilitation devices may be provided to subjects. The subjects may use the rehabilitation devices to perform rehabilitation plans at their homes, at a gym, at a rehabilitation center, at a hospital, or at any suitable location, including a permanent or temporary residence.
In some embodiments, the rehabilitation device may be communicatively connected (coupled) to a server. Before, during, and/or after the subject performs the rehabilitation plan, characteristics of the subject, including rehabilitation data, may be collected. For example, personal information, performance information, and measurement information may be collected before, during, and/or after a subject performs a rehabilitation program. The results of performing each exercise (e.g., performance improvement or performance decline) may be collected from the rehabilitation device throughout the rehabilitation program and after the rehabilitation program is executed. Before, during, and/or after execution of the rehabilitation program, parameters, settings, configurations, etc. of the rehabilitation device (e.g., position of the pedal, amount of resistance, etc.) may be collected.
Each characteristic of the subject, each result, and each parameter, setting, configuration, etc. may be timestamped and may be associated with a particular step in the rehabilitation plan. Such techniques can determine which steps in a rehabilitation plan lead to a desired outcome (e.g., improved muscle strength, range of motion, etc.), and which steps lead to diminishing returns (e.g., continuing to exercise after 3 minutes actually delays or impairs recovery).
As the subject performs various rehabilitation plans using the rehabilitation device, data may be collected over time from the rehabilitation device and/or any suitable computing device (e.g., a computing device that inputs personal information, such as an interface of a computing device described herein, a clinician interface, a subject interface, etc.). The data that may be collected may include characteristics of the subject, a rehabilitation plan performed by the subject, results of the rehabilitation plan, any of the data described herein, any other suitable data, or a combination thereof.
In some embodiments, the data may be processed to group certain people. Persons having certain or selected similar characteristics, rehabilitation plans, and results of performing the rehabilitation plans may be grouped. For example, athletes who do not have medical conditions, who performed a rehabilitation plan (e.g., using the rehabilitation device 5 times a week for 3 weeks, 30 minutes a day), and who are fully rehabilitated may be assigned to a first group. Elderly people who are classified as obese and whose range of motion increased by 75% after performing a rehabilitation plan (e.g., using the rehabilitation device 3 times a week for 4 weeks, 10 minutes a day) may be assigned to a second group.
In some embodiments, the artificial intelligence engine can include one or more machine learning models trained using the groups. For example, the one or more machine learning models may be trained to receive as input the characteristics of a new subject and to output a rehabilitation plan that results in a desired outcome for that subject. The machine learning model may match patterns between the characteristics of the new subject and the characteristics of at least one of the subjects included in a particular group. When the patterns match, the machine learning model may assign the new subject to that particular group and select a rehabilitation plan associated with the at least one subject. The artificial intelligence engine may be configured to remotely control the rehabilitation device based on the rehabilitation plan as the new subject performs the rehabilitation plan using the rehabilitation device.
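As a simplified stand-in for the trained model, the pattern matching between a new subject and existing group members could be sketched as nearest-neighbor similarity over numeric characteristics. The data layout, the Euclidean distance, and the 1-nearest-neighbor rule are assumptions for illustration; a real model would, among other things, normalize the features and learn from many more characteristics:

```python
import math


def distance(a: dict, b: dict) -> float:
    """Euclidean distance over the numeric characteristics shared by two subjects."""
    shared = a.keys() & b.keys()
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in shared))


def assign_group(new_subject: dict, groups: dict) -> str:
    """Assign the new subject to the group containing the most similar
    existing subject (1-nearest-neighbor matching)."""
    best_name, best_dist = None, float("inf")
    for name, members in groups.items():
        for member in members:
            d = distance(new_subject, member)
            if d < best_dist:
                best_name, best_dist = name, d
    return best_name
```

Re-running such an assignment as a subject's characteristics change over time would give one simple realization of dynamic reassignment between groups.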
It will be appreciated that, as a new subject performs a rehabilitation plan using a rehabilitation device, the characteristics of the new subject (e.g., the new user) may change. For example, the performance of the subject may improve faster than the expected performance of the people in the group to which the new subject is currently assigned. Thus, the machine learning model may be trained to dynamically reassign the new subject, based on the changed characteristics, to a different group that includes people with characteristics similar to the now-changed characteristics of the new subject. For example, a clinically obese subject may lose weight and no longer meet the weight criteria of the initial group, resulting in the subject being reassigned to another group with different weight criteria.
A different rehabilitation plan may be selected for the new subject, and the rehabilitation device may be controlled distally (e.g., which may be referred to as remotely) based on the different rehabilitation plan as the new subject performs the different rehabilitation plan using the rehabilitation device. Such techniques may provide a solution for controlling a rehabilitation device at a remote location.
Further, because a rehabilitation plan most accurately tailored to their characteristics is selected and implemented in real-time at any given moment, the systems and methods described herein may lead to faster recovery times and/or better results for the subject. "real time" may also refer to near real time, which may be less than 10 seconds. As described herein, the term "result" may refer to a medical result or medical effect. Results and effects may refer to responses to medical actions.
Depending on the desired outcome, the artificial intelligence engine may be trained to output several rehabilitation plans. For example, one outcome may include recovery to a threshold level (e.g., 75% of the range of motion) in the fastest amount of time, while another outcome may include a full recovery (e.g., 100% of the range of motion) regardless of time. The data obtained from the subjects and organized into groups may indicate that a first rehabilitation plan provides the first outcome for people with characteristics similar to those of the subject and that a second rehabilitation plan provides the second outcome for people with characteristics similar to those of the subject.
Further, the artificial intelligence engine can be trained to output rehabilitation plans that are not optimal, i.e., suboptimal, non-standard, or otherwise excluded (all, without limitation, referred to as "excluded rehabilitation plans") for the subject. For example, if a subject has high blood pressure, a particular exercise may not be approved or appropriate for the subject because it may put the subject at unnecessary risk and even cause a hypertensive crisis; thus, the exercise may be flagged as an excluded rehabilitation plan for the subject. In some embodiments, the artificial intelligence engine may monitor the rehabilitation data received while a subject (e.g., a user) having, for example, relatively high blood pressure uses the rehabilitation device to perform an appropriate rehabilitation plan. If the rehabilitation data indicates that the subject is tolerating the appropriate rehabilitation plan without exacerbating, for example, the subject's hypertensive condition, the artificial intelligence engine may modify the appropriate rehabilitation plan to include features of an excluded rehabilitation plan that may provide beneficial results for the subject.
In some embodiments, the rehabilitation plans and/or the excluded rehabilitation plans may be presented to a healthcare provider during a telemedicine or teletherapy session. The healthcare provider may select a particular rehabilitation plan for the subject, causing the rehabilitation plan to be transmitted to the subject and/or the rehabilitation device to be controlled based on the rehabilitation plan. In some embodiments, to facilitate telemedicine or teletherapy applications, including remote diagnosis, determination of a rehabilitation plan, and rehabilitation and/or pharmacologic prescription, the artificial intelligence engine may receive inputs from and/or operate at a location remote from the subject and the rehabilitation device.
In such a case, the recommended rehabilitation plan and/or the excluded rehabilitation plan may be presented concurrently with the video of the subject in real-time or near real-time during a telemedicine or teletherapy session on the user interface of the healthcare provider's computing device. The video may also be accompanied by audio, text, and other multimedia information. Real time may refer to less than or equal to 2 seconds. Real time may also refer to near real time, which may be less than 10 seconds or any reasonably approximate difference between two different times. Additionally, or alternatively, near real-time may refer to any interaction of sufficiently short time to enable two people to engage in a conversation via such a user interface, and typically will be less than 10 seconds but greater than 2 seconds.
Presenting the rehabilitation plan generated by the artificial intelligence engine while presenting the video of the subject may provide an enhanced user interface, because the healthcare provider may continue to visually and/or otherwise communicate with the subject while also viewing the rehabilitation plan on the same user interface. The enhanced user interface may enhance the healthcare provider's experience of using the computing device and may encourage the healthcare provider to reuse the user interface. This technique may also reduce computational resources (e.g., processing, memory, network) because the healthcare provider does not have to switch to another user interface screen to enter a query for a rehabilitation plan to recommend based on the characteristics of the subject. The artificial intelligence engine may be configured to dynamically provide rehabilitation plans and excluded rehabilitation plans on the fly.
In some embodiments, the rehabilitation device may be adaptive and/or personalized in that its performance, configuration and location may be tailored to the needs of a particular subject. For example, the pedal may be dynamically adjusted (e.g., via a telemedicine session or a programmed-based configuration in response to certain measurements detected) on-the-fly to increase or decrease the range of motion to conform to a rehabilitation plan designed for the user. In some embodiments, the healthcare provider may remotely adapt the rehabilitation device to the needs of the subject during the telemedicine session by causing control instructions to be transmitted from the server to the rehabilitation device. This adaptability may improve the rehabilitation outcome of the subject, enhance the goals of personalized medicine, and personalize the rehabilitation program on a per-person basis.
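By way of illustration only, a control instruction transmitted from the server to the rehabilitation device might be serialized as a small message. The field names (`device_id`, `pedal_range_deg`, `resistance_level`) and the JSON encoding are hypothetical, as the disclosure does not specify a wire format:

```python
import json


def build_control_instruction(device_id: str, pedal_range_deg: int, resistance_level: int) -> str:
    """Serialize a (hypothetical) adjustment command the server could send
    to the rehabilitation device during a telemedicine session."""
    return json.dumps({
        "device_id": device_id,
        "command": "adjust",
        "settings": {
            "pedal_range_deg": pedal_range_deg,    # target pedal range of motion
            "resistance_level": resistance_level,  # target resistance setting
        },
    })
```

On receipt, the device would apply the settings, e.g., narrowing or widening the pedal range of motion to conform to the current rehabilitation plan.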
Fig. 1 generally illustrates a block diagram of a computer-implemented system 10 for managing a rehabilitation plan, the computer-implemented system 10 being referred to hereinafter as the "system". Managing the rehabilitation plan may include using an artificial intelligence engine to recommend rehabilitation plans and/or provide excluded rehabilitation plans that should not be recommended to the subject.
The system 10 also includes a server 30, the server 30 configured to store (e.g., write to an associated memory) and provide data related to managing the rehabilitation plan. The server 30 may include one or more computers and may take the form of one or more computers that are distributed and/or virtualized. The server 30 also includes a first communication interface 32, the first communication interface 32 configured to communicate with the clinician interface 20 via a first network 34. In some embodiments, the first network 34 may include wired and/or wireless network connections, such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communication (NFC), cellular data networks, and the like. The server 30 includes a first processor 36 and a first machine-readable storage memory 38, the first machine-readable storage memory 38 may be referred to simply as "memory," which holds first instructions 40 executed by the first processor 36 for performing various actions of the server 30.
The server 30 is configured to store data regarding rehabilitation plans. For example, the memory 38 includes a system data storage device 42, the system data storage device 42 configured to store system data, such as data related to rehabilitation plans for treating one or more subjects. The server 30 is also configured to store data about the performance of the subjects while following the rehabilitation plans. For example, the memory 38 includes a subject data storage device 44, the subject data storage device 44 configured to store subject data, such as data related to one or more subjects, including data representing the performance of each subject in a rehabilitation plan.
Additionally, or alternatively, the characteristics of the people (e.g., personal, performance, measurement, etc.), the rehabilitation plans followed by the people, the levels of compliance with the rehabilitation plans, and the results of the rehabilitation plans may use correlations and other statistical or probabilistic measures to enable partitioning of the data into different subject cohort-equivalent databases in the subject data storage device 44. For example, data for a first group of first subjects having a first similar injury, a first similar medical condition, a first similar medical procedure performed, a first rehabilitation plan followed by the first subjects, and a first outcome of the rehabilitation plan may be stored in a first subject database. Data for a second group of second subjects having a second similar injury, a second similar medical condition, a second similar medical procedure performed, a second rehabilitation plan followed by the second subjects, and a second outcome of the rehabilitation plan may be stored in a second subject database. Any single characteristic or any combination of characteristics may be used to partition the subjects into groups. In some embodiments, different groups of subjects may be stored in different partitions or volumes of the same database. There is no specific limit to the number of different groups of subjects allowed, other than as limited by mathematical combinatoric and/or partition theory.
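The cohort-equivalent partitioning described above can be sketched as grouping subject records by a key built from selected characteristics. The record fields and the particular choice of key (injury, condition, procedure) are illustrative assumptions; as the text notes, any single characteristic or combination could serve:

```python
from collections import defaultdict


def partition_into_cohorts(records: list[dict]) -> dict:
    """Partition subject records into cohort-equivalent groups keyed by
    injury, medical condition, and procedure performed."""
    cohorts = defaultdict(list)
    for record in records:
        key = (record["injury"], record["condition"], record["procedure"])
        cohorts[key].append(record)
    return dict(cohorts)
```

Each resulting group could then be stored in its own database, or in its own partition or volume of a shared database.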
The characteristic data, rehabilitation plan data, and result data may be obtained from numerous rehabilitation devices and/or computing devices over time and stored in the database 44. The characteristic data, rehabilitation plan data, and result data may be correlated in the cohort-equivalent databases in the subject data storage device 44. The characteristics of the subjects may include personal information, performance information, and/or measurement information.
In addition to the historical information about other people stored in the cohort-equivalent databases, real-time or near-real-time information based on the current characteristics of a subject being treated may be stored in an appropriate cohort-equivalent database. The characteristics of the subject may be determined to match or be similar to the characteristics of another person in a particular cohort (e.g., cohort A), and the subject may be assigned to that cohort.
In some embodiments, the server 30 may execute an artificial intelligence (AI) engine 11 that uses one or more machine learning models 13 to perform at least one of the embodiments disclosed herein. The server 30 may include a training engine 9 capable of generating the one or more machine learning models 13. The machine learning models 13 may be trained to assign people to certain cohorts based on their characteristics, to use real-time and historical data correlations involving cohorts of subjects to select a rehabilitation plan and to control the rehabilitation device 70, and so on.
The one or more machine learning models 13 may be generated by the training engine 9 and may be embodied as computer instructions executable by one or more processing devices of the training engine 9 and/or the server 30. To generate the one or more machine learning models 13, the training engine 9 may train the one or more machine learning models 13. The artificial intelligence engine 11 may use the one or more machine learning models 13.
The training engine 9 may be a rack server, a router computer, a personal computer, a portable digital assistant, a smartphone, a laptop, a tablet, a netbook, a desktop computer, an internet of things (IoT) device, any other suitable computing device, or a combination thereof. The training engine 9 may be a cloud-based or real-time software platform, and the training engine 9 may include privacy software or protocols and/or security software or protocols.
To train the one or more machine learning models 13, the training engine 9 may use a training data set including the characteristics of numerous people who performed rehabilitation programs using the rehabilitation device 70, the details of the rehabilitation programs performed by those people using the rehabilitation device 70 (e.g., treatment protocols including the movements performed, the amounts of time to perform the movements, the frequencies of performing the movements, the schedules for performing the movements, the parameters/configurations/settings of the rehabilitation device 70 in each step of the overall rehabilitation program, etc.), and the results of the rehabilitation programs performed by those people. The one or more machine learning models 13 may be trained to match patterns in the characteristics of a subject with the characteristics of other people assigned to a particular cohort. The term "match" may refer to an exact match, a relative match, a substantial match, and the like. The one or more machine learning models 13 may be trained to receive the characteristics of a subject as input, map those characteristics to the characteristics of people assigned to a cohort, and select a rehabilitation plan from that cohort. The one or more machine learning models 13 may also be trained to control the rehabilitation device 70 based on the rehabilitation plan.
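A minimal sketch of the matching-and-selection step just described follows. Here a simple nearest-centroid comparison stands in for whatever trained model an implementation might use; the cohort names, centroid features, and plans are all illustrative assumptions.

```python
# Illustrative sketch: match a subject's characteristics to the closest cohort,
# then select that cohort's rehabilitation plan. The Euclidean distance to a
# cohort centroid is a hypothetical stand-in for a trained model's mapping.
def match_cohort(subject_features, cohorts):
    """Return the name of the cohort whose centroid is nearest to the subject."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(cohorts, key=lambda name: distance(subject_features, cohorts[name]["centroid"]))

cohorts = {
    "cohort_A": {"centroid": (65.0, 120.0), "plan": "low-intensity cycling"},
    "cohort_B": {"centroid": (30.0, 45.0), "plan": "high-intensity intervals"},
}
# Hypothetical subject characteristics: (age, range of motion in degrees).
best = match_cohort((62.0, 110.0), cohorts)
plan = cohorts[best]["plan"]
```

A trained model would, of course, learn this mapping from the training data set rather than compare raw distances, but the input/output shape is the same: characteristics in, cohort and plan out.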
Different machine learning models 13 may be trained to recommend different rehabilitation plans for different desired results. For example, one machine learning model may be trained to recommend a rehabilitation plan for the most effective recovery, while another machine learning model may be trained to recommend a rehabilitation plan for the fastest recovery.
The one or more machine learning models 13 may refer to model artifacts created by the training engine 9 using training data that includes training inputs and corresponding target outputs. The training engine 9 may find patterns in the training data that map the training inputs to the target outputs, and may generate the machine learning models 13 that capture these patterns. In some embodiments, the artificial intelligence engine 11, the database 33, and/or the training engine 9 may reside on another component (e.g., the auxiliary interface 94, the clinician interface 20, etc.) depicted in fig. 1.
The one or more machine learning models 13 may include, for example, a single level of linear or non-linear operations (e.g., a support vector machine [SVM]), or the machine learning models 13 may be deep networks, i.e., machine learning models that include multiple levels of non-linear operations. Examples of deep networks are neural networks, including generative adversarial networks, convolutional neural networks, recurrent neural networks with one or more hidden layers, and fully connected neural networks (e.g., each neuron may transmit its output signal to the remaining neurons as well as to its own input). For example, the machine learning model may include numerous layers and/or hidden layers that perform calculations (e.g., dot products) using various neurons.
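As a toy illustration of the "multiple levels of non-linear operations" mentioned above, and not of any particular trained model 13, a forward pass through a two-layer fully connected network might look like the following; all weights and inputs are arbitrary example values.

```python
# Illustrative sketch of a deep network: two fully connected layers with a
# non-linearity between them. Each dense-layer output neuron is a dot product
# of the inputs with a weight row, plus a bias.
def relu(values):
    return [max(0.0, v) for v in values]

def dense(inputs, weights, biases):
    """One fully connected layer of neurons."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def forward(x):
    hidden = relu(dense(x, [[0.5, -0.2], [0.1, 0.4]], [0.0, 0.1]))  # hidden layer
    out = dense(hidden, [[1.0, -1.0]], [0.0])                       # output layer
    return out[0]

y = forward([1.0, 2.0])
```

A single-level model such as an SVM would apply one such operation; the deep network composes several, which is the distinction the passage above draws.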
The system 10 also includes a subject interface 50 configured to communicate information to a subject and to receive feedback from the subject. Specifically, the subject interface includes an input device 52 and an output device 54, which may be collectively referred to as a subject user interface 52, 54. The input device 52 may include one or more devices, such as a keyboard, a mouse, a touch screen input, a gesture sensor, and/or a microphone and processor configured for voice recognition. The output device 54 may take one or more of a variety of forms, including, for example, a computer monitor or a display screen on a tablet, smartphone, or smartwatch. The output device 54 may include other hardware and/or software components, such as a projector, virtual reality capabilities, augmented reality capabilities, and the like. The output device 54 may incorporate various different visual, audio, or other presentation technologies. For example, the output device 54 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, and/or melodies that may represent different conditions and/or directions. The output device 54 may include one or more different display screens and/or interfaces or controls for presenting various data for use by the subject. The output device 54 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App).
As shown generally in fig. 1, the subject interface 50 includes a second communication interface 56, which may also be referred to as a remote communication interface, configured to communicate with the server 30 and/or the clinician interface 20 via a second network 58. In some embodiments, the second network 58 may include a local area network (LAN), such as an Ethernet network. In some embodiments, the second network 58 may include the Internet, and communications between the subject interface 50 and the server 30 and/or the clinician interface 20 may be secured via encryption, for example, by using a virtual private network (VPN). In some embodiments, the second network 58 may include wired and/or wireless network connections, such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communication (NFC), cellular data networks, and the like. In some embodiments, the second network 58 may be the same as and/or operationally coupled to the first network 34.
The subject interface 50 includes a second processor 60 and a second machine-readable storage memory 62 holding second instructions 64 for execution by the second processor 60 to perform the various actions of the subject interface 50. The second machine-readable storage memory 62 also includes a local data storage device 66 configured to store data, such as data related to a rehabilitation plan and/or subject data, such as data representing the performance of a subject following a rehabilitation plan. The subject interface 50 also includes a local communication interface 68 configured to communicate with various devices for use by the subject in the vicinity of the subject interface 50. The local communication interface 68 may include wired and/or wireless communications. In some embodiments, the local communication interface 68 may include a local wireless network, such as Wi-Fi, Bluetooth, ZigBee, Near-Field Communication (NFC), a cellular data network, or the like.
The system 10 further includes a rehabilitation device 70 configured to be manipulated by the subject and/or to manipulate a body part of the subject in order to perform activities according to a rehabilitation plan. In some embodiments, the rehabilitation device 70 may take the form of an exercise and rehabilitation device configured to perform and/or assist in performing a rehabilitation regimen, which may be an orthopedic rehabilitation regimen, where the treatment includes rehabilitation of a body part of the subject, such as a joint, a bone, or a muscle group. The rehabilitation device 70 may be any suitable medical, rehabilitation, therapeutic, or similar device configured to be controlled remotely via another computing device in order to treat and/or exercise the subject. The rehabilitation device 70 may be an electromechanical machine including one or more weights, an electromechanical bicycle, an electromechanical rotating disk, a smart mirror, a treadmill, or the like. The body part may include, for example, a spine, a hand, a foot, a knee, or a shoulder. The body part may include a portion of a joint, a bone, or a muscle group, such as one or more vertebrae, tendons, or ligaments. As shown generally in fig. 1, the rehabilitation device 70 includes a controller 72, which may include one or more processors, computer memory, and/or other components. The rehabilitation device 70 also includes a fourth communication interface 74 configured to communicate with the subject interface 50 via the local communication interface 68. The rehabilitation device 70 also includes one or more internal sensors 76 and an actuator 78, such as a motor. The actuator 78 may be used, for example, to move a body part of the subject and/or to resist a force exerted by the subject.
The internal sensors 76 may measure one or more operating characteristics of the rehabilitation device 70, such as force, position, and/or speed. In some embodiments, the internal sensors 76 may include a position sensor configured to measure at least one of a linear motion or an angular motion of a body part of the subject. For example, an internal sensor 76 in the form of a position sensor may measure the distance that the subject is able to move a portion of the rehabilitation device 70, where that distance may correspond to a range of motion achievable by the body part of the subject. In some embodiments, the internal sensors 76 may include a force sensor configured to measure a force applied by the subject. For example, an internal sensor 76 in the form of a force sensor may measure the force or weight that the subject is able to apply to the rehabilitation device 70 using a particular body part.
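For example, a range-of-motion value of the kind a position sensor might yield can be derived from angle samples as in the following sketch. The units, sample values, and function name are illustrative assumptions only.

```python
# Illustrative sketch: derive a range-of-motion value from position-sensor
# angle samples taken over one movement cycle.
def range_of_motion(angle_samples_deg):
    """Range of motion is the spread between the extremes the subject reached."""
    return max(angle_samples_deg) - min(angle_samples_deg)

# Hypothetical knee-angle samples (degrees) over one pedal cycle.
samples = [35.0, 48.5, 72.0, 66.0, 41.0]
rom = range_of_motion(samples)
```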
The system 10 shown generally in fig. 1 also includes a movement sensor 82 in communication with the server 30 via the local communication interface 68 of the subject interface 50. The movement sensor 82 may track and store the number of steps taken by the subject. In some embodiments, the movement sensor 82 may take the form of a wristband, a watch, or a smartwatch. In some embodiments, the movement sensor 82 may be integrated within a phone, such as a smartphone.
The system 10 shown generally in fig. 1 also includes a goniometer 84 in communication with the server 30 via the local communication interface 68 of the subject interface 50. The goniometer 84 measures the angle of a body part of the subject. For example, the goniometer 84 may measure the angle of flexion of the subject's knee, elbow, or shoulder.
The system 10 shown generally in fig. 1 also includes a pressure sensor 86 in communication with the server 30 via the local communication interface 68 of the subject interface 50. The pressure sensor 86 measures the amount of pressure or weight applied by a body part of the subject. For example, the pressure sensor 86 may measure the amount of force applied by the subject's foot while pedaling a stationary bicycle.
The system 10 shown generally in FIG. 1 also includes a supervisory interface 90, which may be similar or identical to the clinician interface 20. In some embodiments, the supervisory interface 90 may have enhanced functionality beyond what is provided on the clinician interface 20. The supervisory interface 90 may be configured for use by a person responsible for the rehabilitation plan, such as an orthopedic surgeon.
The system 10 shown generally in fig. 1 also includes a reporting interface 92, which may be similar or identical to the clinician interface 20. In some embodiments, the reporting interface 92 may have less functionality than what is provided on the clinician interface 20. For example, the reporting interface 92 may not have the ability to modify a rehabilitation plan. Such a reporting interface 92 may be used, for example, by a billing clerk to determine the usage of the system 10 for billing purposes. In another example, the reporting interface 92 may not have the ability to display subject-identifiable information, presenting only pseudonymized and/or anonymized data for certain data fields of a data subject and/or for certain quasi-identifier data fields of that data subject. Such a reporting interface 92 may be used, for example, by researchers to determine the various effects of rehabilitation plans on different subjects.
The system 10 includes an auxiliary interface 94 for a healthcare provider, such as those described herein, to remotely communicate with the subject interface 50 and/or the rehabilitation device 70. Such remote communications may enable the healthcare provider to provide assistance or guidance to a subject using the system 10. More specifically, the auxiliary interface 94 is configured to communicate telemedicine signals 96, 97, 98a, 98b, 99a, 99b with the subject interface 50 via a network connection, such as via the first network 34 and/or the second network 58. The telemedicine signals 96, 97, 98a, 98b, 99a, 99b include one or more of the following: an audio signal 96, an audiovisual signal 97, an interface control signal 98a for controlling a function of the subject interface 50, an interface monitoring signal 98b for monitoring a status of the subject interface 50, a device control signal 99a for changing an operating parameter of the rehabilitation device 70, and/or a device monitoring signal 99b for monitoring a status of the rehabilitation device 70. In some embodiments, each of the control signals 98a, 99a may be a one-way communication command from the auxiliary interface 94 to the subject interface 50. In some embodiments, an acknowledgement message may be sent from the subject interface 50 to the auxiliary interface 94 in response to the successful receipt of a control signal 98a, 99a and/or in response to the implementation of the requested control action, in order to communicate the success and/or failure of that action. In some embodiments, each of the monitoring signals 98b, 99b may be a one-way status communication from the subject interface 50 to the auxiliary interface 94. In some embodiments, an acknowledgement message may be sent from the auxiliary interface 94 to the subject interface 50 in response to the successful receipt of one of the monitoring signals 98b, 99b.
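A hypothetical encoding of a device control signal 99a and its acknowledgement might resemble the following sketch. The JSON message shapes, field names, and parameter are assumptions for illustration only, not an actual signal format of the system.

```python
# Illustrative sketch: a one-way device control signal sent by the provider's
# interface, applied at the subject's side, and acknowledged with a success flag.
import json

def make_control_signal(parameter, value):
    """Provider side: encode a hypothetical device control message."""
    return json.dumps({"type": "device_control_99a", "parameter": parameter, "value": value})

def handle_control_signal(message, apply_fn):
    """Subject-interface side: apply the change, then return an acknowledgement."""
    msg = json.loads(message)
    ok = apply_fn(msg["parameter"], msg["value"])
    return json.dumps({"type": "ack", "for": msg["type"], "success": ok})

# A stand-in for the rehabilitation device accepting a resistance change.
device_state = {}
def apply_fn(param, value):
    device_state[param] = value
    return True

ack = handle_control_signal(make_control_signal("pedal_resistance", 5), apply_fn)
```

The acknowledgement mirrors the passage above: the control command flows one way, and a separate message reports the success or failure of the requested action.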
In some embodiments, the subject interface 50 may be configured to pass the device control signals 99a and the device monitoring signals 99b through between the rehabilitation device 70 and one or more other devices, such as the auxiliary interface 94 and/or the server 30. For example, the subject interface 50 may be configured to transmit a device control signal 99a to the rehabilitation device 70 in response to receiving a device control signal 99a within the telemedicine signals 96, 97, 98a, 98b, 99a, 99b from the auxiliary interface 94.
In some embodiments, the auxiliary interface 94 may be presented on a physical device shared with the clinician interface 20. For example, the clinician interface 20 may include one or more screens that implement the auxiliary interface 94. Alternatively or additionally, the clinician interface 20 may include additional hardware components, such as a camera, a speaker, and/or a microphone, to implement aspects of the auxiliary interface 94.
In some embodiments, one or more portions of the telemedicine signals 96, 97, 98a, 98b, 99a, 99b may be generated from a pre-recorded source (e.g., an audio recording, a video recording, or an animation) for presentation by the output device 54 of the subject interface 50. For example, an instructional video may be streamed from the server 30 and presented on the subject interface 50. The subject may request content from a pre-recorded source via the subject interface 50. Alternatively, via controls on the auxiliary interface 94, the healthcare provider may cause content from a pre-recorded source to be played on the subject interface 50.
The auxiliary interface 94 includes an auxiliary input device 22 and an auxiliary display 24, which may be collectively referred to as an auxiliary user interface 22, 24. The auxiliary input device 22 may include, for example, one or more of a telephone, a keyboard, a mouse, a touchpad, or a touch screen. Alternatively or additionally, the auxiliary input device 22 may include one or more microphones. In some embodiments, the one or more microphones may take the form of a telephone handset, a headset, or a wide-area microphone or microphone array configured for the healthcare provider to speak with a subject via the subject interface 50. In some embodiments, the auxiliary input device 22 may be configured to provide voice-based functionality, with hardware and/or software configured to interpret the spoken instructions of the healthcare provider through the use of the one or more microphones. The auxiliary input device 22 may include functionality provided by or similar to existing voice-based assistants, such as Apple Siri, Amazon Alexa, Google Assistant, or Samsung Bixby. The auxiliary input device 22 may include other hardware and/or software components. The auxiliary input device 22 may include one or more general-purpose devices and/or special-purpose devices.
The auxiliary display 24 may take one or more of a variety of forms, including, for example, a computer monitor or a display screen on a tablet, smartphone, or smartwatch. The auxiliary display 24 may include other hardware and/or software components, such as a projector, virtual reality capabilities, augmented reality capabilities, or the like. The auxiliary display 24 may incorporate various different visual, audio, or other presentation technologies. For example, the auxiliary display 24 may include a non-visual display, such as an audio signal, which may include spoken language and/or other sounds such as tones, chimes, melodies, and/or compositions that may indicate different conditions and/or directions. The auxiliary display 24 may include one or more different display screens and/or interfaces or controls for presenting various data for use by the healthcare provider. The auxiliary display 24 may include graphics, which may be presented by a web-based interface and/or by a computer program or application (App).
In some embodiments, the system 10 may provide computer translation of language from the auxiliary interface 94 to the subject interface 50 and/or vice versa. The computer translation of language may include computer translation of spoken language and/or computer translation of text. Additionally or alternatively, the system 10 may provide voice recognition and/or spoken pronunciation of text. For example, the system 10 may convert spoken words to printed text, and/or the system 10 may audibly speak language from printed text. The system 10 may be configured to recognize spoken words by any or all of the subjects, clinicians, and/or healthcare providers. In some embodiments, the system 10 may be configured to recognize and react to spoken requests or commands by the subjects. For example, the system 10 may automatically initiate a telemedicine session in response to a verbal command by a subject, which may be given in any one of a variety of different languages.
In some embodiments, the server 30 may generate aspects of the auxiliary display 24 for presentation by the auxiliary interface 94. For example, the server 30 may include a web server configured to generate the display screens for presentation on the auxiliary display 24. For example, the artificial intelligence engine 11 may generate recommended rehabilitation plans and/or excluded rehabilitation plans for a subject, and may generate a display screen including those recommended rehabilitation plans and/or excluded rehabilitation plans for presentation on the auxiliary display 24 of the auxiliary interface 94. In some embodiments, the auxiliary display 24 may be configured to present a virtualized desktop hosted by the server 30. In some embodiments, the server 30 may be configured to communicate with the auxiliary interface 94 via the first network 34. In some embodiments, the first network 34 may include a local area network (LAN), such as an Ethernet network.
In some embodiments, the first network 34 may include the Internet, and communications between the server 30 and the auxiliary interface 94 may be secured via privacy-enhancing technologies, such as by using encryption over a virtual private network (VPN). Alternatively or additionally, the server 30 may be configured to communicate with the auxiliary interface 94 via one or more networks separate from the first network 34 and/or via other communication means, such as a direct wired or wireless communication channel. In some embodiments, the subject interface 50 and the rehabilitation device 70 may each operate from a subject location geographically separate from the location of the auxiliary interface 94. For example, the subject interface 50 and the rehabilitation device 70 may be used as part of an in-home rehabilitation system that may be aided remotely by using the auxiliary interface 94 at a centralized location, such as a clinic or a call center.
In some embodiments, the auxiliary interface 94 may be one of several different terminals (e.g., computing devices) grouped together, for example, at one or more call centers or at one or more clinicians' offices. In some embodiments, a plurality of auxiliary interfaces 94 may be distributed geographically. In some embodiments, a person may work as a healthcare provider remotely from any conventional office infrastructure. Such remote work may be performed, for example, where the auxiliary interface 94 takes the form of a computer and/or telephone. Such remote work may functionally allow for work-at-home arrangements, which may include part-time and/or flexible work hours for the healthcare provider.
Figs. 2-3 illustrate an embodiment of the rehabilitation device 70. More specifically, fig. 2 generally illustrates a rehabilitation device 70 in the form of a stationary exercise bike 100, which may be referred to simply as a stationary bicycle. The stationary exercise bike 100 includes a set of pedals 102 each attached to a pedal arm 104 for rotation about an axle 106. In some embodiments, as generally shown in fig. 2, the pedals 102 are movable on the pedal arms 104 in order to adjust the range of motion used by the subject while pedaling. For example, a pedal located inwardly toward the axle 106 corresponds to a smaller range of motion than when the pedal is located outwardly away from the axle 106. A pressure sensor 86 is attached to or embedded within one of the pedals 102 for measuring the amount of force applied by the subject to the pedal 102. The pressure sensor 86 may communicate wirelessly with the rehabilitation device 70 and/or the subject interface 50.
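The geometric effect of the movable pedals 102 can be illustrated as follows: moving a pedal inward along the pedal arm shortens the radius of the circle the subject's foot traces, and hence the excursion required of the joint. The radii below are arbitrary example values, not dimensions of the disclosed device.

```python
# Illustrative sketch: the distance the foot travels per pedal revolution
# shrinks as the pedal is moved inward toward the axle.
import math

def pedal_path_circumference(radius_cm):
    """Distance the foot travels per full pedal revolution."""
    return 2 * math.pi * radius_cm

inner = pedal_path_circumference(10.0)   # pedal moved inward toward the axle
outer = pedal_path_circumference(17.5)   # pedal at the outward position
```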
Fig. 4 generally illustrates a person (a subject) using the rehabilitation device of fig. 2, and shows sensors and various data parameters connected to the subject interface 50. An example implementation of the subject interface 50 is a tablet computer, smartphone, or phablet, such as an iPad, an iPhone, an Android device, or a Surface tablet, held manually by the subject. In some other embodiments, the subject interface 50 may be embedded within or attached to the rehabilitation device 70.
Fig. 4 generally shows the subject wearing the movement sensor 82 on his wrist, with the movement sensor 82 displaying a reading of "Steps Today 1355", indicating that the movement sensor 82 has recorded that step count and transmitted it to the subject interface 50. Fig. 4 also generally shows the subject wearing the goniometer 84 on his right knee, with the goniometer 84 displaying a reading of "Knee Angle 72°", indicating that the goniometer 84 is measuring the knee angle and transmitting it to the subject interface 50. Fig. 4 also generally illustrates the right one of the pedals 102, with its pressure sensor 86 displaying "Force 12.5 pounds", indicating that the right pedal pressure sensor 86 is measuring force and transmitting that force measurement to the subject interface 50.
Fig. 4 also generally illustrates the left one of the pedals 102, with its pressure sensor 86 displaying "Force 27 pounds", indicating that the left pedal pressure sensor 86 is measuring force and transmitting that force measurement to the subject interface 50. Fig. 4 also generally shows other subject data, such as an indicator of "Session Time 0:04." The session time may be determined by the subject interface 50 based on information received from the rehabilitation device 70. Fig. 4 also generally shows a displayed indicator of "Pain Level 3." Such a pain level may be obtained from the subject in response to a prompt, such as a question, presented on the subject interface 50.
Fig. 5 generally illustrates an example embodiment of an overview display 120 of the auxiliary interface 94. Specifically, the overview display 120 presents several different controls and interfaces for a healthcare provider to remotely assist a subject with the use of the subject interface 50 and/or the rehabilitation device 70. This remote assistance functionality may also be referred to as telemedicine or teletherapy.
In particular, the overview display 120 includes a subject profile display 130 presenting biographical information regarding a subject using the rehabilitation device 70. The subject profile display 130 may take the form of a portion or region within the overview display 120, as generally shown in FIG. 5, although the subject profile display 130 may take other forms, such as a separate screen or a pop-up window.
In some embodiments, the subject profile display 130 may include a limited subset of the subject's biographical information. More specifically, the data presented on the subject profile display 130 may depend on the healthcare provider's need for that information. For example, a healthcare provider assisting the subject with a medical issue may be provided with the subject's medical history information, whereas a technician troubleshooting an issue with the rehabilitation device 70 may be provided with a more limited set of information regarding the subject. The technician, for example, may be given only the subject's name.
The subject profile display 130 may include pseudonymized data and/or anonymized data, or may use any privacy-enhancing technology to prevent confidential subject data from being communicated in a manner that could violate subject privacy requirements. Such privacy-enhancing technologies may enable compliance with laws, regulations, or other rules of governance such as, but not limited to, the Health Insurance Portability and Accountability Act (HIPAA) or the General Data Protection Regulation (GDPR), wherein the subject may be deemed a "data subject."
In some embodiments, the subject profile display 130 may present information regarding the rehabilitation plan for the subject to follow in using the rehabilitation device 70. Such rehabilitation plan information may be limited to certain healthcare providers. For example, the rehabilitation plan information may be provided to a healthcare provider assisting the subject with an issue regarding the treatment regimen, whereas a technician troubleshooting an issue with the rehabilitation device 70 may not be provided with any information regarding the subject's rehabilitation plan.
In some embodiments, one or more recommended rehabilitation plans and/or excluded rehabilitation plans may be presented to the healthcare provider in the subject profile display 130. The one or more recommended rehabilitation plans and/or excluded rehabilitation plans may be generated by the artificial intelligence engine 11 of the server 30 and received from the server 30 in real time, particularly during a telemedicine or teletherapy session. An example of the presentation of one or more recommended rehabilitation plans and/or excluded rehabilitation plans is described below with reference to fig. 7.
The example overview display 120 generally shown in fig. 5 also includes a subject status display 134 presenting status information regarding a subject using the rehabilitation device. The subject status display 134 may take the form of a portion or region within the overview display 120, as generally shown in FIG. 5, although the subject status display 134 may take other forms, such as a separate screen or a pop-up window.
The subject status display 134 includes sensor data 136 from one or more of the external sensors 82, 84, 86 and/or from one or more internal sensors 76 of the rehabilitation device 70. In some embodiments, the subject status display 134 may include sensor data from one or more sensors of one or more wearable devices worn by the subject while using the rehabilitation device 70. The one or more wearable devices may include a watch, a bracelet, a necklace, a chest strap, and the like. The one or more wearable devices may be configured to monitor a heart rate, a temperature, a blood pressure, one or more vital signs, and the like of the subject while the subject is using the rehabilitation device 70. In some embodiments, the subject status display 134 may present other data 138 regarding the subject, such as a most recently reported pain level or progress within a rehabilitation plan.
User access controls may be used to limit access, including which data is available to be viewed and/or modified, on any or all of the user interfaces 20, 50, 90, 92, 94 of the system 10. In some embodiments, user access controls may be employed to control which information is available to any given person using the system 10. For example, data presented on the auxiliary interface 94 may be controlled by user access controls, with permissions set depending on the healthcare provider's/user's need for and/or qualifications to view the information.
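A simple sketch of such user access controls follows. The roles, permitted fields, and record layout are hypothetical stand-ins for whatever access policy an implementation of the system 10 might enforce.

```python
# Illustrative sketch: each role sees only the data fields its need and
# qualification justify. Roles and field names are hypothetical.
ACCESS_POLICY = {
    "clinician": {"name", "medical_history", "rehabilitation_plan", "sensor_data"},
    "technician": {"name", "device_status"},
    "researcher": {"pseudonym", "sensor_data"},  # no directly identifying fields
}

def filter_record(record, role):
    """Return only the fields the given role is permitted to view."""
    allowed = ACCESS_POLICY.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "J. Doe", "medical_history": "ACL repair",
          "device_status": "online", "sensor_data": [72.0]}
technician_view = filter_record(record, "technician")
```

This mirrors the example above: the technician's view excludes the medical history and sensor data that a clinician would be permitted to see.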
The example overview display 120, shown generally in fig. 5, also includes a help data display 140 presenting information for the healthcare provider to use in assisting the subject. The help data display 140 may take the form of a portion or region within the overview display 120, as generally illustrated in FIG. 5. The help data display 140 may take other forms, such as a separate screen or a pop-up window. The help data display 140 may include, for example, presenting answers to frequently asked questions regarding the use of the subject interface 50 and/or the rehabilitation device 70.
The help data display 140 may also include research data or best practices. In some embodiments, the help data display 140 may present scripts for answers or explanations in response to subject questions. In some embodiments, the help data display 140 may present flowcharts or guides for the healthcare provider to use in determining a root cause of and/or a solution to a subject's problem.
In some embodiments, the assistance interface 94 may present two or more help data displays 140, which may be the same or different, to simultaneously present help data for use by the healthcare provider. For example, a first help data display may present a service flowchart for determining the source of a subject's problem, while a second help data display may present script information for the healthcare provider to read to the subject, such information optionally including instructions for the subject to perform certain actions that may help to narrow down or resolve the problem. In some embodiments, the second help data display may be automatically populated with script information based on input to the service flowchart in the first help data display.
The example overview display 120, shown generally in FIG. 5, also includes a subject interface control 150, the subject interface control 150 presenting information about the subject interface 50 and/or enabling modification of one or more settings of the subject interface 50. As generally shown in FIG. 5, the subject interface control 150 may take the form of a portion or region of the overview display 120. The subject interface control 150 may take other forms, such as a separate screen or a pop-up window. The subject interface control 150 may present information communicated to the auxiliary interface 94 via one or more of the interface monitoring signals 98b.
As shown generally in fig. 5, the subject interface control 150 includes a display feed 152 of the display presented by the subject interface 50. In some embodiments, the display feed 152 may include a real-time copy of the display screen currently being presented to the subject by the subject interface 50. In other words, the display feed 152 may present the images presented on the display screen of the subject interface 50.
In some embodiments, the display feed 152 may include abbreviated information about the display screen currently being presented by the subject interface 50, such as a screen name or screen number. The subject interface control 150 may include a subject interface settings control 154 for a healthcare provider to adjust or control one or more settings or aspects of the subject interface 50. In some embodiments, the subject interface settings control 154 may cause the auxiliary interface 94 to generate and/or transmit interface control signals 98 that control the functions or settings of the subject interface 50.
In some embodiments, the subject interface settings control 154 may include a co-browsing capability for a healthcare provider to remotely view and/or control the subject interface 50. For example, the subject interface settings control 154 may enable a healthcare provider to remotely enter text into one or more text entry fields on the subject interface 50 and/or remotely control a cursor on the subject interface 50 using a mouse or touch screen of the auxiliary interface 94.
In some embodiments, the subject interface settings control 154 may allow a healthcare provider to change settings that a subject cannot change using the subject interface 50. For example, the subject interface 50 may be blocked from accessing the language settings, to prevent a subject from inadvertently switching the language displayed on the subject interface 50, while the subject interface settings control 154 may enable a healthcare provider to change the language settings of the subject interface 50. In another example, the subject interface 50 may be unable to change the font size setting to a smaller size, to prevent the subject from inadvertently switching to a font size that would render the display illegible to the subject, while the subject interface settings control 154 may provide the healthcare provider with the ability to change the font size setting of the subject interface 50.
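For purposes of illustration only, the per-role setting restrictions described above may be sketched as a simple permission table. The role names, setting names, and permission assignments below are hypothetical examples and are not part of the disclosed system:

```python
# Minimal sketch of per-role setting permissions. Role and setting names
# are illustrative assumptions, not part of the disclosed interfaces.

SETTING_PERMISSIONS = {
    "language": {"provider"},           # only the healthcare provider may change
    "font_size": {"provider"},          # subject cannot shrink fonts to illegible sizes
    "volume": {"provider", "subject"},  # either party may adjust
}

def can_change_setting(role: str, setting: str) -> bool:
    """Return True if the given role may modify the given setting."""
    return role in SETTING_PERMISSIONS.get(setting, set())
```

Under this sketch, a request from the subject interface to change the language setting would be rejected, while the same request originating from the auxiliary interface would be permitted.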
The example overview display 120 generally shown in fig. 5 also includes an interface communication display 156 that displays the communication status between the subject interface 50 and one or more other devices 70, 82, 84, such as the rehabilitation device 70, the movement sensor 82, and/or the goniometer 84. As generally illustrated in FIG. 5, the interface communication display 156 may take the form of a portion or region of the overview display 120.
The interface communication display 156 may take other forms, such as a separate screen or a pop-up window. The interface communication display 156 may include controls for the healthcare provider to remotely modify communication with one or more of the other devices 70, 82, 84. For example, the healthcare provider may remotely command the subject interface 50 to reset communication with one of the other devices 70, 82, 84, or to establish new communication with one of the other devices 70, 82, 84. This functionality may be used, for example, in the event that one of the subject's other devices 70, 82, 84 has a problem, or in the event that the subject receives a new or replacement one of the other devices 70, 82, 84.
The example overview display 120, shown generally in fig. 5, also includes device controls 160 for a healthcare provider to view and/or control information about the rehabilitation device 70. As generally shown in fig. 5, the device controls 160 may take the form of a portion or region of the overview display 120. The device controls 160 may take other forms, such as a separate screen or pop-up window. The device controls 160 may include a device status display 162 having information about the current status of the device. The device status display 162 may present information communicated to the auxiliary interface 94 via one or more of the device monitoring signals 99b. The device status display 162 may indicate whether the rehabilitation device 70 is currently communicating with the subject interface 50. The device status display 162 may present other current and/or historical information regarding the status of the rehabilitation device 70.
The device controls 160 may include a device settings control 164 for a healthcare provider to adjust or control one or more aspects of the rehabilitation device 70. The device settings control 164 may cause the auxiliary interface 94 to generate and/or transmit device control signals 99 (e.g., the device control signals 99 may be referred to as rehabilitation plan inputs, as described) for changing an operating parameter and/or one or more characteristics of the rehabilitation device 70 (e.g., a pedal radius setting, a resistance setting, a target RPM, other suitable characteristics of the rehabilitation device 70, or a combination thereof).
The device settings control 164 may include a mode button 166 and a position control 168 that may be used in conjunction by a healthcare provider to place an actuator 78 of the rehabilitation device 70 in a manual mode, after which the position control 168 may be used to change a setting, such as the position or speed of the actuator 78. The mode button 166 may switch a setting, such as the position setting, between automatic and manual modes.
In some embodiments, one or more settings are adjustable at any time without an associated automatic/manual mode. In some embodiments, the healthcare provider may change an operating parameter of the rehabilitation device 70, such as the pedal radius setting, while the subject is actively using the rehabilitation device 70. Such "on the fly" adjustment may or may not be available to subjects using the subject interface 50.
In some embodiments, the device settings control 164 may allow a healthcare provider to change settings that cannot be changed by a subject using the subject interface 50. For example, the subject interface 50 may be prevented from changing a preconfigured setting, such as a height or tilt setting of the rehabilitation device 70, while the device settings control 164 may provide the healthcare provider with the ability to change the height or tilt setting of the rehabilitation device 70.
The example overview display 120, shown generally in fig. 5, also includes a subject communication control 170 for controlling an audio or audiovisual communication session with the subject interface 50. The communication session with the subject interface 50 may include a real-time feed from the auxiliary interface 94 to be rendered by an output device of the subject interface 50. The real-time feed may take the form of an audio feed and/or a video feed. In some embodiments, the subject interface 50 may be configured to provide two-way audio or audiovisual communication with a person using the auxiliary interface 94. In particular, a communication session with the subject interface 50 may include a two-way video or audiovisual feed in which each of the subject interface 50 and the auxiliary interface 94 presents video of the other.
In some embodiments, the subject interface 50 may present video from the auxiliary interface 94 while the auxiliary interface 94 presents only audio, or while the auxiliary interface 94 presents no real-time audio or visual signal from the subject interface 50. In some embodiments, the auxiliary interface 94 may present video from the subject interface 50 while the subject interface 50 presents only audio, or while the subject interface 50 presents no real-time audio or visual signal from the auxiliary interface 94.
In some embodiments, an audio or audiovisual communication session with the subject interface 50 may be conducted, at least in part, while the subject is performing a rehabilitation regimen on a body part. As generally shown in FIG. 5, the subject communication control 170 may take the form of a portion or region of the overview display 120. The subject communication control 170 may take other forms, such as a separate screen or a pop-up window.
When the healthcare provider uses the auxiliary interface 94, the audio and/or audiovisual communication may be processed and/or directed by the auxiliary interface 94 and/or by one or more other devices, such as a telephone system or a video conferencing system used by the healthcare provider. Alternatively or additionally, the audio and/or audiovisual communication may include communication with a third party. For example, the system 10 may enable a healthcare provider to initiate a three-way conversation with a subject and a subject matter expert, such as another healthcare provider or a specialist, regarding the use of a particular piece of hardware or software. An example subject communication control 170, shown generally in fig. 5, includes a call control 172 for use by a healthcare provider in managing various aspects of audio or audiovisual communication with a subject. The call control 172 includes a disconnect button 174 for the healthcare provider to end the audio or audiovisual communication session. The call control 172 also includes a mute button 176 that temporarily mutes audio or audiovisual signals from the auxiliary interface 94. In some embodiments, the call control 172 may include other features, such as a hold button (not shown).
The call control 172 also includes one or more record/playback controls 178, such as a record button, a play button, and a pause button, to control the recording and/or playback of audio and/or video from the teleconference session conducted via the subject interface 50. The call control 172 also includes a video feed display 180 that presents still and/or video images from the subject interface 50, and a self video display 182 that displays the current image of the healthcare provider using the auxiliary interface 94. As shown generally in fig. 5, the self video display 182 may be presented in a picture-in-picture format within a portion of the video feed display 180. Alternatively or additionally, the self video display 182 may be presented separately from and/or independently of the video feed display 180.
The example overview display 120 generally shown in fig. 5 also includes a third party communication control 190 for use in performing audio and/or audiovisual communications with a third party. As generally shown in FIG. 5, the third party communication control 190 may take the form of a portion or region of the overview display 120. The third party communication control 190 may take other forms, such as a separate screen or pop-up window.
The third party communication control 190 may include one or more controls, such as a contact list and/or buttons or controls for contacting a third party, e.g., a subject matter expert such as a healthcare provider or specialist, regarding the use of a particular piece of hardware or software. The third party communication control 190 may include teleconferencing capabilities to enable a third party to simultaneously communicate with a healthcare provider via the auxiliary interface 94 and with a subject via the subject interface 50. For example, the system 10 may provide a healthcare provider with the option to initiate a three-way conversation with the subject and a third party.
Fig. 6 generally illustrates an example block diagram of training the machine learning model 13 to output a rehabilitation plan 602 for a subject based on data 600 related to the subject, in accordance with the present disclosure. The server 30 may receive data related to other subjects. The other subjects may have performed rehabilitation plans using various rehabilitation devices.
The data may include characteristics of the other subjects, details of the rehabilitation plans performed by the other subjects, and/or results of performing the rehabilitation plans (e.g., a percentage of recovery of a portion of the subject's body, an amount of recovery of a portion of the subject's body, an increase or decrease in muscle strength of a portion of the subject's body, an increase or decrease in range of motion of a portion of the subject's body, etc.).
As depicted, the data has been assigned to different groups. Group A includes data for subjects with similar first characteristics, a similar first rehabilitation plan, and similar first results. Group B includes data for subjects with similar second characteristics, a similar second rehabilitation plan, and similar second results. For example, group A may include first characteristics of subjects twenty or more years of age, with no other medical conditions, who underwent surgery for a limb amputation; their rehabilitation plans may include a certain treatment regimen (e.g., using the rehabilitation device 70 for 3 weeks, 5 times per week, 30 minutes each time, with the values of the attributes, configurations, and/or settings of the rehabilitation device 70 set to X (where X is a numerical value) in the first two weeks and to Y (where Y is a numerical value) in the last week).
Groups A and B may be included in a training data set used to train the machine learning model 13. The machine learning model 13 may be trained to match patterns between the characteristics of each group and output a rehabilitation plan that provides a particular result. Thus, when the data 600 of a new subject is input into the trained machine learning model 13, the trained machine learning model 13 may match the characteristics included in the data 600 with the characteristics of group A or group B and output the appropriate rehabilitation plan 602. In some embodiments, the machine learning model 13 may be trained to output one or more excluded rehabilitation plans that should not be performed by the new subject.
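For purposes of illustration only, the group-matching behavior described above may be sketched as a nearest-group classifier. The feature encoding (age, comorbidity count), the group centroids, and the plan descriptions below are hypothetical examples and do not represent any particular trained model 13:

```python
# Illustrative sketch of matching a new subject's characteristics to the
# nearest trained group and returning that group's rehabilitation plan.
# Centroids, features, and plan text are hypothetical assumptions.

GROUPS = {
    "A": {"centroid": (25.0, 0.0), "plan": "3 weeks, 5x/week, 30 min, setting X then Y"},
    "B": {"centroid": (60.0, 1.0), "plan": "6 weeks, 3x/week, 20 min, setting Z"},
}

def nearest_group(features):
    """Match features (age, comorbidity count) to the closest group centroid."""
    def dist(centroid):
        return sum((f - v) ** 2 for f, v in zip(features, centroid))
    return min(GROUPS, key=lambda g: dist(GROUPS[g]["centroid"]))

def recommend_plan(features):
    """Return the rehabilitation plan of the best-matching group."""
    return GROUPS[nearest_group(features)]["plan"]
```

A production model would learn the group boundaries from the training data rather than use fixed centroids; the sketch only captures the match-then-recommend flow.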
Fig. 7 generally illustrates an embodiment of the overview display 120 of the assistance interface 94 presenting, in real-time during a telemedicine session, recommended and excluded rehabilitation plans according to the present disclosure. As depicted, the overview display 120 includes only the subject profile display 130 and a portion of the video feed display 180 including the self video display 182. Any suitable configuration of the controls and interfaces of the overview display 120 described with reference to FIG. 5 may be presented in addition to or in place of the subject profile display 130, the video feed display 180, and the self video display 182.
The healthcare provider using the auxiliary interface 94 (e.g., a computing device) during the teletherapy session may be presented in the self video display 182 in a portion of the overview display 120 (e.g., a user interface presented on the display screen 24 of the auxiliary interface 94), which also presents video from the subject in the video feed display 180 of the overview display 120. Further, the video feed display 180 may also include a graphical user interface (GUI) object 700 (e.g., a button) that enables the healthcare provider to share a recommended rehabilitation plan and/or an excluded rehabilitation plan with the subject on the subject interface 50 in real-time or near real-time during the teletherapy session. The healthcare provider may select the GUI object 700 to share the recommended rehabilitation plan and/or the excluded rehabilitation plan. As depicted, another portion of the overview display 120 includes the subject profile display 130.
The subject profile display 130 is presenting two examples of recommended rehabilitation plans 600 and one example of an excluded rehabilitation plan 602. As described herein, a rehabilitation plan may be recommended in view of the characteristics of the subject being treated. To generate a recommended rehabilitation plan 600 that the subject should follow to achieve a desired result, patterns between the characteristics of the subject being treated and those of sets of other subjects who have performed rehabilitation plans using the rehabilitation device 70 may be matched by one or more machine learning models 13 of the artificial intelligence engine 11. Each recommended rehabilitation plan may be generated based on a different desired result.
For example, as depicted, the subject profile display 130 presents: "The characteristics of the subject match the characteristics of the users in group A. Based on the characteristics of the subject and the desired result, the following rehabilitation plans are recommended for the subject." The subject profile display 130 then presents the recommended rehabilitation plans from group A, and each rehabilitation plan provides a different result.
As depicted, rehabilitation plan "A" indicates: "Subject X should use the rehabilitation device for 4 days, 30 minutes per day, to achieve a Y% increase in range of motion; subject X has type 2 diabetes; and medication Z should be prescribed for subject X during the rehabilitation plan to control pain (medication Z is approved for subjects with type 2 diabetes)." Thus, the generated rehabilitation plan targets a Y% increase in range of motion. As can be appreciated, the rehabilitation plan also includes a recommended medication (e.g., medication Z) to prescribe to the subject to control pain in view of the subject's known medical condition (e.g., type 2 diabetes). That is, the recommended medication for the subject not only does not conflict with the subject's medical condition, but thereby also increases the likelihood that the subject will obtain a better result. This particular example, and all such examples elsewhere herein, are not intended to limit the generated rehabilitation plan in any way with respect to the medications recommended or with respect to the treatment of any complication or disease to be confirmed, observed, diagnosed, and/or treated.
The recommended rehabilitation plan "B" may specify a different rehabilitation plan, including a different treatment regimen for the rehabilitation device, a different medication regimen, etc., based on a different desired result of the rehabilitation plan.
As depicted, the subject profile display 130 may also present an excluded rehabilitation plan 602. These types of rehabilitation plans are displayed to the healthcare provider using the assistance interface 94 to alert the healthcare provider not to recommend certain portions of a rehabilitation plan to the subject. For example, an excluded rehabilitation plan may specify the following: "Due to heart disease, subject X should not use the rehabilitation device for more than 30 minutes per day; subject X has type 2 diabetes; and medication M should not be prescribed to subject X to control pain during the rehabilitation plan (in this case, medication M may cause complications in subjects with type 2 diabetes)." Specifically, the excluded rehabilitation plan indicates a limitation of the treatment regimen, i.e., subject X should not exercise more than 30 minutes per day due to heart disease. The excluded rehabilitation plan also indicates that subject X should not be prescribed medication M because that medication conflicts with the medical condition type 2 diabetes.
The healthcare provider may select a rehabilitation plan for the subject on the overview display 120. For example, the healthcare provider may use an input peripheral (e.g., a mouse, touch screen, microphone, keyboard, etc.) to select one of the recommended rehabilitation plans 600 for the subject. In some embodiments, during the remote treatment session, the healthcare provider may discuss the pros and cons of the recommended rehabilitation plans 600 with the subject.
In any event, the healthcare provider may select a rehabilitation plan for the subject to follow to achieve the desired result. The selected rehabilitation plan may be transmitted to the subject interface 50 for presentation. The subject may view the selected rehabilitation plan on the subject interface 50. In some embodiments, the healthcare provider and the subject may discuss its details (e.g., a diet regimen, a medication regimen, a treatment regimen using the rehabilitation device 70, etc.) in real-time or near real-time during the remote treatment session. In some embodiments, the server 30 may control the rehabilitation device 70 based on the selected rehabilitation plan and during the teletherapy session while the user is using the rehabilitation device 70.
Fig. 8 generally illustrates an embodiment of the overview display 120 of the assistance interface 94 presenting, in real-time during a telemedicine session, a recommended rehabilitation plan that changes in response to subject data changes, according to the present disclosure. As can be appreciated, the rehabilitation device 70 and/or any computing device (e.g., the subject interface 50) may transmit data while the subject is performing a rehabilitation plan using the rehabilitation device 70. The data may include updated characteristics of the subject and/or other rehabilitation data. For example, the updated characteristics may include new performance information and/or measurement information. The performance information may include the speed of a portion of the rehabilitation device 70, the range of motion achieved by the subject, the force exerted on a portion of the rehabilitation device 70, the heart rate of the subject, the blood pressure of the subject, the respiratory rate of the subject, and the like.
In some embodiments, the data received at the server 30 may be input into the trained machine learning model 13, which may determine that the characteristics indicate that the subject is on track with the current rehabilitation plan. Determining that the subject is on track with the current rehabilitation plan may cause the trained machine learning model 13 to adjust a parameter of the rehabilitation device 70. The adjustment may be based on a next step of the rehabilitation plan to further improve the subject's performance.
In some embodiments, the data received at the server 30 may be input into the trained machine learning model 13, which may determine that the characteristics indicate that the subject is behind the current rehabilitation plan (e.g., progress is lagging, the subject is unable to maintain a speed, unable to reach a certain range of motion, is experiencing excessive pain, etc.) or is ahead of it (e.g., exceeding a certain speed, moving without pain for more than a specified time, applying more than a specified force, etc.).
The trained machine learning model 13 may determine that the characteristics of the subject no longer match the characteristics of the subjects in the group to which the subject is assigned. Accordingly, the trained machine learning model 13 may reassign the subject to another group whose characteristics match the subject's characteristics. In this manner, the trained machine learning model 13 may select a new rehabilitation plan from the new group and control the rehabilitation device 70 based on the new rehabilitation plan.
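For purposes of illustration only, the reassignment step described above may be sketched as follows. The group centroids, the feature encoding (e.g., age and a measured vital-sign value), and the plan parameters are hypothetical assumptions and do not represent any particular trained model 13:

```python
# Sketch of reassigning a subject to a new group when updated characteristics
# no longer match the current group, then selecting the new group's plan.
# Centroids and plan parameters are illustrative assumptions.

GROUPS = {
    "A": {"centroid": (30.0, 120.0), "plan": {"minutes_per_day": 30, "days": 4}},
    "B": {"centroid": (45.0, 150.0), "plan": {"minutes_per_day": 10, "days": 3}},
}

def assign_group(features):
    """Return the group whose centroid is closest to the feature vector."""
    def dist(centroid):
        return sum((f - v) ** 2 for f, v in zip(features, centroid))
    return min(GROUPS, key=lambda g: dist(GROUPS[g]["centroid"]))

def update_plan(current_group, updated_features):
    """Reassign if the updated features better match another group."""
    new_group = assign_group(updated_features)
    # The returned plan would then drive the rehabilitation device settings.
    return new_group, GROUPS[new_group]["plan"]
```

In this sketch, a subject originally in group A whose updated measurements drift toward group B's centroid would be reassigned to group B, and group B's plan parameters would then be used to control the device.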
In some embodiments, prior to controlling the rehabilitation device 70, the server 30 may provide the new rehabilitation plan 800 to the assistance interface 94 for presentation in the subject profile display 130. As depicted, the subject profile display 130 indicates: "The characteristics of the subject have changed and now match the characteristics of the users in group B. Based on the characteristics of the subject and the desired result, the following rehabilitation plan is recommended for the subject." The subject profile display 130 then presents the new rehabilitation plan 800 ("Subject X should use the rehabilitation device for 10 minutes per day for 3 days to achieve an L% increase in range of motion"). The healthcare provider may select the new rehabilitation plan 800, and the server 30 may receive the selection.
In some embodiments, the server 30 may receive rehabilitation data related to the subject while the subject is performing a rehabilitation plan using the rehabilitation device 70. As depicted, the rehabilitation plan may correspond to a rehabilitation plan, a pre-rehabilitation plan, an athletic rehabilitation plan, or another suitable plan. The rehabilitation data may include various characteristics of the subject (e.g., those described herein), various measurement information related to the subject while the subject is using the rehabilitation device 70 (e.g., that described herein), various characteristics of the rehabilitation device 70 (e.g., those described herein), the rehabilitation plan, other suitable data, or combinations thereof.
In some embodiments, at least some of the rehabilitation data may include sensor data 136 from one or more external sensors 82, 84, 86 and/or from one or more internal sensors 76 of the rehabilitation device 70. In some embodiments, at least some of the rehabilitation data may include sensor data from one or more sensors in one or more wearable devices worn by the subject while using the rehabilitation device 70. The one or more wearable devices may include a watch, bracelet, necklace, chest band, head sweatband, wrist sweatband, any other suitable wearable, or combinations thereof. While the subject is using the rehabilitation device 70, the one or more wearable devices may be configured to monitor the subject's heart rate, temperature, blood pressure, one or more vital signs, and the like.
In some embodiments, the server 30 may use the rehabilitation data to generate rehabilitation information. The rehabilitation information may include a formatted summary of the user's performance of the rehabilitation plan while using the rehabilitation device, such that the rehabilitation data may be presented on a computing device of a healthcare provider responsible for the user's performance of the rehabilitation plan. In some embodiments, the subject profile display 130 may include and/or display the rehabilitation information.
The server 30 may be configured to provide the rehabilitation information at the overview display 120. For example, the server 30 may store the rehabilitation information for access by the overview display 120 and/or communicate the rehabilitation information to the overview display 120. In some embodiments, the server 30 may provide the rehabilitation information to other suitable portions, components, or elements of the subject profile display 130 or the overview display 120, or to any other suitable display or interface.
In some embodiments, a healthcare provider assisting a subject in using the rehabilitation device 70 may review the rehabilitation information and determine whether to modify the rehabilitation plan and/or one or more features of the rehabilitation device 70. For example, the healthcare provider may view the rehabilitation information and compare the rehabilitation information to the rehabilitation plan being executed by the subject.
While the subject uses the rehabilitation device 70 to perform the rehabilitation plan, the healthcare provider may compare one or more portions of the expected information related to the subject's ability to perform the rehabilitation plan with one or more corresponding portions of the measured information related to the subject (e.g., as indicated by the rehabilitation information). The expected information may include one or more vital signs of the user, a respiratory rate of the user, a heart rate of the user, a body temperature of the user, a blood pressure of the user, other suitable information of the user, or a combination thereof. If one or more portions of the measured information are within an acceptable range of one or more corresponding portions of the expected information, the healthcare provider may determine that the rehabilitation plan is having the desired effect. Conversely, if one or more portions of the measured information are outside the acceptable range of one or more corresponding portions of the expected information, the healthcare provider may determine that the rehabilitation plan is not having the desired effect.
In some embodiments, as the subject performs the rehabilitation plan using the rehabilitation device 70, the healthcare provider may compare the various expected characteristics of the rehabilitation device 70 with the corresponding characteristics of the rehabilitation device 70 indicated by the rehabilitation information. For example, the healthcare provider may compare the expected resistance setting of the rehabilitation device 70 to the actual resistance setting of the rehabilitation device 70 indicated by the rehabilitation information.
If the actual characteristics of the rehabilitation device 70 indicated by the rehabilitation information are within the range of expected characteristics of the rehabilitation device 70, the healthcare provider may determine that the subject is properly performing the rehabilitation plan. Conversely, if the actual characteristics of the rehabilitation device 70 indicated by the rehabilitation information are outside the range of expected characteristics of the rehabilitation device 70, the healthcare provider may determine that the subject is not properly performing the rehabilitation plan.
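For purposes of illustration only, the comparison of measured information against acceptable ranges around expected information, as described above, may be sketched as follows. The field names, expected values, and tolerances are hypothetical assumptions and would in practice be set by a healthcare provider or the rehabilitation plan:

```python
# Sketch of checking each measured value against an acceptable range
# around its expected value. Names and tolerances are illustrative.

EXPECTED = {"heart_rate": 110.0, "blood_pressure": 125.0, "resistance": 8.0}
TOLERANCE = {"heart_rate": 15.0, "blood_pressure": 10.0, "resistance": 1.0}

def within_range(measured):
    """Return the names of measured fields outside their acceptable ranges."""
    return [
        name
        for name, value in measured.items()
        if abs(value - EXPECTED[name]) > TOLERANCE[name]
    ]

def plan_has_desired_effect(measured):
    """The plan is deemed on track only if no field is out of range."""
    return not within_range(measured)
```

The same pattern applies to device characteristics (e.g., comparing the expected resistance setting to the actual resistance setting reported in the rehabilitation information).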
If the healthcare provider determines that the rehabilitation information indicates that the subject is properly performing the rehabilitation plan and/or that the rehabilitation plan is having the desired effect, the healthcare provider may determine not to modify the rehabilitation plan or the one or more features of the rehabilitation device 70. Conversely, if the healthcare provider determines that the rehabilitation information indicates that the subject is not properly performing the rehabilitation plan and/or that the rehabilitation plan is not having the desired effect, the healthcare provider may determine to modify the rehabilitation plan and/or the one or more features of the rehabilitation device 70 while the user is performing the rehabilitation plan using the rehabilitation device 70.
In some embodiments, the server 30 may receive subsequent rehabilitation data related to the subject while the subject is executing the modified rehabilitation plan using the rehabilitation device 70. For example, after the healthcare provider provides input to modify the rehabilitation plan and/or control one or more features of the rehabilitation device 70, the subject may proceed to execute the modified rehabilitation plan using the rehabilitation device 70. The subsequent rehabilitation data may correspond to rehabilitation data generated while the subject executes the modified rehabilitation plan using the rehabilitation device 70. In some embodiments, after the healthcare provider has received the rehabilitation information and determined not to modify the rehabilitation plan and/or control one or more features of the rehabilitation device 70, the subsequent rehabilitation data may correspond to rehabilitation data generated as the subject continues to perform the rehabilitation plan using the rehabilitation device 70.
The server 30 may further modify the rehabilitation plan and/or control one or more features of the rehabilitation device 70 based on subsequent rehabilitation plan input received from the overview display 120. The subsequent rehabilitation plan input may correspond to input provided by the healthcare provider at the overview display 120 in response to receiving and/or viewing subsequent rehabilitation information corresponding to the subsequent rehabilitation data. It should be appreciated that the server 30 may continuously and/or periodically provide rehabilitation information to the subject profile display 130 and/or other suitable portions, components, or assemblies of the overview display 120 based on continuously and/or periodically received rehabilitation data.
The healthcare provider may continuously or periodically receive and/or view rehabilitation information as the user performs the rehabilitation plan using the rehabilitation device. The healthcare provider may determine whether to modify the rehabilitation plan and/or control one or more features of the rehabilitation device based on one or more trends indicated by the continuously and/or periodically received rehabilitation information. For example, the one or more trends may indicate an increase in heart rate or another suitable trend indicating that the user is not properly performing the rehabilitation plan and/or that the rehabilitation plan is not having the desired effect.
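The trend check described above can be sketched as follows (a simplified illustration; the window size, threshold, and function name are hypothetical assumptions, not part of the disclosure):

```python
def heart_rate_trend(samples, window=5, threshold_bpm=8):
    """Flag a sustained heart-rate increase across the most recent
    periodically received measurements: compare the mean of the latest
    window of samples against the mean of the window before it."""
    if len(samples) < 2 * window:
        return False  # not enough data to establish a trend
    earlier = sum(samples[-2 * window:-window]) / window
    recent = sum(samples[-window:]) / window
    return recent - earlier > threshold_bpm
```

A flagged trend would then prompt the healthcare provider (or an automated component) to consider modifying the plan, as described above.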
Fig. 9 is a flow diagram generally illustrating a method 900 of monitoring performance of a user using a rehabilitation device to perform a rehabilitation plan and selectively modifying the rehabilitation plan and one or more features of the rehabilitation device. In accordance with the present disclosure, the method 900 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as software run on a general-purpose computer system or a dedicated machine), or a combination of both. The method 900 and/or each of its individual functions, routines, subroutines, or operations may be performed by one or more processors of a computing device (e.g., any of the components of fig. 1, such as the server 30 running the artificial intelligence engine 11). In some embodiments, the method 900 may be performed by a single processing thread. Alternatively, the method 900 may be performed by two or more processing threads, each thread implementing one or more individual functions, routines, subroutines, or operations of the method.
For simplicity of explanation, the method 900 is depicted and described as a series of acts. However, operations in accordance with the present disclosure may occur in various orders, and/or concurrently, and/or with other operations not presented and described herein. For example, the operations depicted in the method 900 may occur in conjunction with any other operations of any other methods disclosed herein. Moreover, not all illustrated acts may be required to implement the method 900 in accordance with the disclosed subject matter. In addition, those skilled in the art will understand and appreciate that the method 900 could alternatively be represented as a series of interrelated states via a state diagram or as events.
At 902, the processing device may receive rehabilitation data relating to a user using a rehabilitation device, such as the rehabilitation device 70, to perform a rehabilitation plan. The rehabilitation data may include characteristics of the user, measurement information related to the user while the user is using the rehabilitation device 70, characteristics of the rehabilitation device 70, the rehabilitation plan, other suitable data, or a combination thereof.
At 904, the processing device may generate rehabilitation information using the rehabilitation data. The rehabilitation information may include a summary of the user's performance in performing the rehabilitation plan using the rehabilitation device 70. The rehabilitation information may be formatted such that the rehabilitation data may be presented on a computing device of a healthcare provider responsible for the user's performance of the rehabilitation plan.
At 906, the processing device may be configured to provide (e.g., store for access, make available, make accessible, transmit, etc.) the rehabilitation information to the healthcare provider's computing device. At 908, the processing device may be configured to provide the rehabilitation information at an interface of the healthcare provider's computing device. For example, the processing device may store the rehabilitation information for access by the healthcare provider's computing device and/or communicate (e.g., transmit) the rehabilitation information to the healthcare provider's computing device for display at the subject profile display 130 of the overview display 120. As described, the overview display 120 may be configured to receive input, such as rehabilitation plan input, indicating one or more modifications to the rehabilitation plan and/or one or more features of the rehabilitation device 70. The healthcare provider may interact with various controls, input fields, and other aspects of the overview display 120 to provide the rehabilitation plan input.
At 910, the processing device may modify the rehabilitation plan in response to receiving the rehabilitation plan input including at least one modification to the rehabilitation plan. For example, the processing device may modify various characteristics and features of the rehabilitation plan based on at least one modification indicated by the rehabilitation plan input.
At 912, the processing device may selectively control the rehabilitation device 70 using the modified rehabilitation plan. For example, the processing device may modify one or more features of the rehabilitation device 70 based on the modification to the rehabilitation plan. Additionally or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more features based on the rehabilitation plan input. For example, the rehabilitation plan input may indicate at least one modification to one or more features of the rehabilitation device 70. The processing device may modify the one or more features of the rehabilitation device 70 based on the at least one modification indicated by the rehabilitation plan input.
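Steps 902 through 912 can be summarized with a simplified sketch (the class and function names are hypothetical placeholders; a real implementation would involve the server 30, networking, and the overview display 120):

```python
class Device:
    """Stand-in for a controllable rehabilitation device."""
    def __init__(self):
        self.plan = {"resistance": 3}  # hypothetical plan aspect
        self.applied = None
    def apply(self, plan):
        self.applied = dict(plan)  # features updated from the plan

def run_method_900(receive_data, provider_review, device):
    rehab_data = receive_data()          # 902: receive rehabilitation data
    info = {"summary": rehab_data}       # 904: generate rehabilitation information
    plan_input = provider_review(info)   # 906/908: provide info, receive plan input
    if plan_input:                       # 910: modify the plan if input arrived
        device.plan.update(plan_input)
        device.apply(device.plan)        # 912: selectively control the device
    return device.plan
```

For example, a provider callback returning `{"resistance": 2}` would lower the hypothetical resistance setting; a callback returning `None` leaves the plan and device untouched.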
Fig. 10 is a flow diagram generally illustrating an alternative method 1000 of monitoring performance of a rehabilitation plan performed by a user using a rehabilitation device and selectively modifying one or more features of the rehabilitation plan and the rehabilitation device in accordance with the present disclosure. The method 1000 includes operations performed by a processor of a computing device (e.g., any of the components of fig. 1, such as the server 30 running the artificial intelligence engine 11). In some embodiments, one or more operations of the method 1000 are implemented by computer instructions stored on a memory device and executed by a processing device. The method 1000 may be performed in the same or a similar manner as described above with respect to the method 900. The operations of the method 1000 may be performed in part in conjunction with any of the operations of any of the methods described herein.
At 1002, the processing device may receive first rehabilitation data relating to a user performing a rehabilitation plan using a rehabilitation device (such as the rehabilitation device 70) during a telemedicine session. The first rehabilitation data includes at least measurement information related to the user while the user is performing the rehabilitation plan using the rehabilitation device 70. The first rehabilitation data may correspond to sensor data, such as the sensor data 136, from one or more external sensors, such as the external sensors 82, 84, 86, and/or from one or more internal sensors, such as the internal sensor 76, of the rehabilitation device 70.
In some embodiments, at least some of the first rehabilitation data may include sensor data from one or more sensors associated with one or more corresponding wearable devices worn by the user while using the rehabilitation device 70. The one or more wearable devices may include a watch, bracelet, necklace, chest band, head sweatband, wrist sweatband, any other suitable wearable device, or combinations thereof. The one or more wearable devices may be configured to monitor the user's heart rate, temperature, blood pressure, one or more vital signs, etc. while the user is using the rehabilitation device 70.
At 1004, the processing device may generate first rehabilitation information using the first rehabilitation data. The first rehabilitation information may include a performance summary of the user performing the rehabilitation plan while using the rehabilitation device 70. The first rehabilitation information may be formatted such that the first rehabilitation data may be presented on a computing device of a healthcare provider responsible for the user's performance of the rehabilitation plan.
At 1006, the processing device may be configured to write the first rehabilitation information to the associated memory for access by and/or provision to the healthcare provider's computing device. At 1008, the processing device may be configured to provide the first rehabilitation information at an interface of the computing device of the healthcare provider. For example, the processing device may be configured to provide the first rehabilitation information at the subject profile display 130 of the overview display 120. As described, the overview display 120 may be configured to receive input, such as rehabilitation plan input, indicating one or more modifications to the rehabilitation plan and/or one or more features of the rehabilitation device 70. The healthcare provider may interact with the various controls, input fields, and other aspects of the overview display 120 to provide the rehabilitation plan input.
At 1010, the processing device may receive a first rehabilitation plan input responsive to the first rehabilitation information. The first rehabilitation plan input may indicate at least one modification to the rehabilitation plan. In some embodiments, as described, the first rehabilitation plan input may be provided by the healthcare provider. In some embodiments, the artificial intelligence engine 11 may generate the first rehabilitation plan input based on the first rehabilitation information.
At 1012, the processing device may modify the rehabilitation plan in response to receiving the first rehabilitation plan input including at least one modification to the rehabilitation plan. For example, the processing device may modify various characteristics and features of the rehabilitation plan based on the at least one modification indicated by the first rehabilitation plan input.
At 1014, the processing device may selectively control the rehabilitation device 70 using the modified rehabilitation plan. For example, the processing device may modify one or more features of the rehabilitation device 70 based on the modification to the rehabilitation plan. Additionally or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more features based on the first rehabilitation plan input. For example, the first rehabilitation plan input may indicate at least one modification to one or more features of the rehabilitation device 70. The processing device may modify the one or more features of the rehabilitation device 70 based on the at least one modification indicated by the first rehabilitation plan input.
At 1016, the processing device may receive a second rehabilitation plan input responsive to second rehabilitation information generated using second rehabilitation data. For example, the processing device may receive the second rehabilitation data related to the user while the user is using the rehabilitation device 70. The second rehabilitation data may include rehabilitation data received after the first rehabilitation data. In some embodiments, the second rehabilitation data may relate to the user while the user is using the rehabilitation device 70 to perform the modified rehabilitation plan.
In some embodiments, the second rehabilitation data may relate to the user while the user is using the rehabilitation device 70 to perform the rehabilitation plan (e.g., without the healthcare provider having modified the rehabilitation plan, as described). The processing device may generate the second rehabilitation information based on the second rehabilitation data. The processing device may receive a second rehabilitation plan input indicative of at least one modification to the rehabilitation plan.
As described, the processing device may be configured to provide the second rehabilitation information to the subject profile display 130 and/or any other suitable portion, component, or assembly of the overview display 120, or to any other suitable display or interface. The healthcare provider (e.g., and/or the artificial intelligence engine 11) may review the second rehabilitation information and determine, based on the second rehabilitation information, whether to modify and/or further modify the rehabilitation plan.
At 1018, the processing device may modify the rehabilitation plan using the second rehabilitation plan input. For example, the processing device may further modify (e.g., in the case where the processing device has modified the rehabilitation plan) and/or modify (e.g., in the case where the processing device has not previously modified the rehabilitation plan) various features and characteristics of the rehabilitation plan based on the at least one modification indicated by the second rehabilitation plan input.
At 1020, the processing device may selectively control the rehabilitation device 70 using the modified rehabilitation plan. For example, based on the modification to the rehabilitation plan, the processing device may modify one or more features of the rehabilitation device 70. Additionally or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more features based on the second rehabilitation plan input. For example, the second rehabilitation plan input may indicate at least one modification to one or more features of the rehabilitation device 70. The processing device may modify the one or more features of the rehabilitation device 70 based on the at least one modification indicated by the second rehabilitation plan input.
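The two-round modification of the method 1000 (a first rehabilitation plan input at 1010-1012 and a second at 1016-1018) reduces to applying a sequence of plan inputs in order; a minimal sketch, with hypothetical names and plan fields:

```python
def apply_plan_inputs(plan, plan_inputs):
    """Apply a sequence of rehabilitation plan inputs (e.g. a first and
    then a second input) to a plan; an input of None means no
    modification was indicated for that round, so the plan is kept."""
    for plan_input in plan_inputs:
        if plan_input:
            plan = {**plan, **plan_input}  # later inputs further modify the plan
    return plan
```

Rounds with no indicated modification pass the plan through unchanged, matching the case where the healthcare provider reviews the information and decides not to modify.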
Fig. 11 is a flow diagram generally illustrating an alternative method 1100 of monitoring performance of a rehabilitation plan performed by a user using a rehabilitation device and selectively modifying one or more features of the rehabilitation plan and the rehabilitation device in accordance with the present disclosure. The method 1100 includes operations performed by a processor of a computing device (e.g., any of the components of fig. 1, such as the server 30 running the artificial intelligence engine 11). In some embodiments, one or more operations of the method 1100 are implemented by computer instructions stored on a memory device and executed by a processing device. The method 1100 may be performed in the same or a similar manner as described above with respect to the methods 900 and 1000. The operations of the method 1100 may be performed in part in conjunction with any of the operations of any of the methods described herein.
At 1102, a processing device may receive rehabilitation data relating to a user using a rehabilitation device (such as the rehabilitation device 70) to perform a rehabilitation plan. The rehabilitation data may include any of the data described herein. The rehabilitation data may correspond to sensor data, such as the sensor data 136, from one or more external sensors, such as the external sensors 82, 84, 86, and/or from one or more internal sensors, such as the internal sensor 76, of the rehabilitation device 70. In some embodiments, at least some of the rehabilitation data may include sensor data from one or more sensors associated with one or more corresponding wearable devices worn by the user while using the rehabilitation device 70. The one or more wearable devices may include a watch, bracelet, necklace, chest band, head sweatband, wrist sweatband, any other suitable wearable device, or combinations thereof. The one or more wearable devices may be configured to monitor the user's heart rate, temperature, blood pressure, one or more vital signs, etc. while the user is using the rehabilitation device 70.
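The combination of internal-sensor, external-sensor, and wearable data described at 1102 can be sketched as a simple merge of source records (the function name, field names, and values are hypothetical):

```python
def assemble_rehab_data(internal, external, wearable):
    """Combine readings from the device's internal sensor, external
    sensors, and a wearable into one rehabilitation data record;
    later sources overwrite earlier ones on key collisions."""
    record = {}
    for source in (internal, external, wearable):
        record.update(source)
    return record

data = assemble_rehab_data(
    {"pedal_force_n": 22},        # e.g. an internal sensor reading
    {"range_of_motion_deg": 95},  # e.g. a goniometer-style external sensor
    {"heart_rate_bpm": 88},       # e.g. a wrist-worn wearable
)
```

The merged record is what the subsequent steps would format into rehabilitation information for the healthcare provider's display.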
At 1104, the processing device may generate rehabilitation information using the rehabilitation data. The rehabilitation information may include a summary of the user's performance in performing the rehabilitation plan using the rehabilitation device 70. The rehabilitation information may be formatted such that the rehabilitation data may be presented on a computing device of a healthcare provider responsible for the user's performance of the rehabilitation plan.
At 1106, the processing device may be configured to provide the rehabilitation information to at least one of a computing device of the healthcare provider and a machine learning model executed by the artificial intelligence engine 11.
At 1108, the processing device may receive a rehabilitation plan input in response to the rehabilitation information. The rehabilitation plan input may indicate at least one modification to the rehabilitation plan. In some embodiments, as described, the rehabilitation plan input may be provided by the healthcare provider. In some embodiments, the artificial intelligence engine 11 running the machine learning model may generate the rehabilitation plan input based on the rehabilitation information.
At 1110, the processing device determines whether the rehabilitation plan input indicates at least one modification to the rehabilitation plan. If the processing device determines that the rehabilitation plan input does not indicate at least one modification to the rehabilitation plan, the processing device returns to 1102 and continues to receive rehabilitation data related to the user while the user is executing the rehabilitation plan using the rehabilitation device 70. If the processing device determines that the rehabilitation plan input indicates at least one modification to the rehabilitation plan, the processing device continues at 1112.
At 1112, the processing device may modify the rehabilitation plan using the rehabilitation plan input. For example, the processing device may modify the rehabilitation plan using at least one modification to the rehabilitation plan indicated by the rehabilitation plan input. Based on the at least one modification indicated by the rehabilitation plan input, the processing device may modify various characteristics and features of the rehabilitation plan.
At 1114, the processing device may selectively control the rehabilitation device 70 using the modified rehabilitation plan. For example, based on the at least one modification to the rehabilitation plan, the processing device may modify one or more features of the rehabilitation device 70. Additionally or alternatively, the processing device may adapt, modify, adjust, or otherwise control one or more features based on the rehabilitation plan input. For example, the rehabilitation plan input may indicate at least one modification to one or more features of the rehabilitation device 70. Based on the at least one modification indicated by the rehabilitation plan input, the processing device may modify the one or more features of the rehabilitation device 70. While the user is executing the rehabilitation plan using the rehabilitation device 70, the processing device may return to 1102 and continue to receive rehabilitation data related to the user.
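The loop formed by steps 1102 through 1114, including the decision at 1110 to return to 1102 when no modification is indicated, can be sketched as follows (the names and the bounded data stream are hypothetical simplifications):

```python
def run_method_1100(data_stream, get_plan_input, plan, apply_to_device):
    """Sketch of the 1102-1114 loop: keep receiving data; when a plan
    input indicates a modification, apply it and control the device,
    then continue monitoring."""
    for rehab_data in data_stream:          # 1102: receive rehabilitation data
        info = {"summary": rehab_data}      # 1104: generate rehabilitation information
        plan_input = get_plan_input(info)   # 1106/1108: provide info, receive input
        if plan_input:                      # 1110: modification indicated?
            plan = {**plan, **plan_input}   # 1112: modify the rehabilitation plan
            apply_to_device(plan)           # 1114: selectively control the device
        # otherwise fall through and return to 1102 for the next data batch
    return plan
```

Here the plan-input callback stands in for either the healthcare provider's interface or the machine learning model run by the artificial intelligence engine 11.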
Fig. 12 generally illustrates an example computer system 1200 that can perform any one or more of the methods described herein in accordance with one or more aspects of the present disclosure. In one example, the computer system 1200 may include a computing device and correspond to the auxiliary interface 94, reporting interface 92, monitoring interface 90, clinician interface 20, server 30 (including the AI engine 11), subject interface 50, dynamic sensor 82, goniometer 84, rehabilitation device 70, pressure sensor 86, or any suitable component in fig. 1. The computer system 1200 is capable of executing instructions that implement one or more machine learning models 13 of the artificial intelligence engine 11 of fig. 1. The computer system may be connected (e.g., networked) to other computer systems in a LAN, an intranet, an extranet, or the internet, including via a cloud or peer-to-peer network.
The computer system may operate in a client-server network environment in the capacity of a server. The computer system may be a Personal Computer (PC), a tablet computer, a wearable device (e.g., a wristband), a set-top box (STB), a Personal Digital Assistant (PDA), a mobile phone, a camera, a video camera, an internet of things (IoT) device, or any device capable of executing a set of instructions (sequential or otherwise) that specify actions to be performed by the device. Further, while only a single computer system is illustrated, the term "computer" shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The computer system 1200 includes a processing device 1202, a main memory 1204 (e.g., Read-Only Memory (ROM), flash memory, a Solid State Drive (SSD), Dynamic Random Access Memory (DRAM) such as Synchronous DRAM (SDRAM)), a static memory 1206 (e.g., flash memory, a Solid State Drive (SSD), Static Random Access Memory (SRAM)), and a data storage device 1208, which communicate with each other via a bus 1210.
Processing device 1202 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processing device 1202 may be a Complex Instruction Set Computing (CISC) microprocessor, a Reduced Instruction Set Computing (RISC) microprocessor, a Very Long Instruction Word (VLIW) microprocessor, a processor implementing other instruction sets, or a processor implementing a combination of instruction sets. The processing device 1202 may also be one or more special-purpose processing devices such as an Application Specific Integrated Circuit (ASIC), a system on a chip, a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), a network processor, or the like. The processing device 1202 is configured to execute instructions for performing any of the operations and steps discussed herein.
The computer system 1200 may further include a network interface device 1212. The computer system 1200 may also include a video display 1214 (e.g., a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, an Organic Light Emitting Diode (OLED) display, a quantum LED display, a Cathode Ray Tube (CRT), a shadow mask CRT, an aperture grille CRT, or a monochrome CRT), one or more input devices 1216 (e.g., a keyboard and/or a mouse or game-like controls), and one or more speakers 1218 (e.g., a sound box). In one illustrative example, the video display 1214 and the input devices 1216 may be combined into a single component or device (e.g., an LCD touch screen).
The data storage device 1208 may include a computer-readable medium 1220 on which are stored instructions 1222 implementing any one or more of the methods, operations, or functions described herein. The instructions 1222 may also reside, completely or at least partially, within the main memory 1204 and/or within the processing device 1202 during execution thereof by the computer system 1200. As such, the main memory 1204 and the processing device 1202 also constitute computer-readable media. The instructions 1222 may further be transmitted or received over a network via the network interface device 1212.
While the computer-readable storage medium 1220 is shown in an illustrative example to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include media capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
Clause 1. A computer-implemented system, comprising: a rehabilitation device configured to be manipulated by a user while the user is performing a rehabilitation plan; a subject interface associated with the rehabilitation device, wherein the subject interface includes an output device configured to present telemedicine information associated with a telemedicine session; and a computing device configured to: receive rehabilitation data relating to the user performing the rehabilitation plan using the rehabilitation device, wherein the rehabilitation data includes at least one of a characteristic of the user, measurement information relating to the user while the user is using the rehabilitation device, a characteristic of the rehabilitation device, and at least one aspect of the rehabilitation plan; generate rehabilitation information using the rehabilitation data; write the rehabilitation information to an associated memory for access at a computing device of a healthcare provider; communicate with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive a rehabilitation plan input comprising at least one modification to at least one of the at least one aspect and any other aspect of the rehabilitation plan; and modify at least one of the at least one aspect and any other aspect of the rehabilitation plan in response to receiving the rehabilitation plan input.
Clause 2. The computer-implemented system of any clause herein, wherein the computing device is further configured to control the rehabilitation device based on the modified at least one aspect and any other aspect of the rehabilitation plan while the user is using the rehabilitation device.
Clause 3. The computer-implemented system of any clause herein, wherein the computing device is further configured to control the rehabilitation device based on at least one of the at least one aspect and any other aspect of the rehabilitation plan as modified while the user is using the rehabilitation device during the telemedicine session.
Clause 4. A computer-implemented system according to any clause herein, wherein the measurement information comprises at least one of a vital sign of the user, a respiratory rate of the user, a heart rate of the user, a body temperature of the user, and a blood pressure of the user.
Clause 5. A computer-implemented system according to any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from a sensor associated with the rehabilitation device.
Clause 6. A computer-implemented system according to any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from sensors associated with a wearable device worn by the user while the user is using the rehabilitation device.
Clause 7. A method, comprising: receiving rehabilitation data relating to a user performing a rehabilitation plan using a rehabilitation device, wherein the rehabilitation data includes at least one of a characteristic of the user, measurement information relating to the user while the user is using the rehabilitation device, a characteristic of the rehabilitation device, and at least one aspect of the rehabilitation plan; generating rehabilitation information using the rehabilitation data; writing the rehabilitation information to an associated memory for access by a computing device of a healthcare provider; communicating with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive a rehabilitation plan input; and modifying at least one aspect of the rehabilitation plan in response to receiving the rehabilitation plan input, the rehabilitation plan input including at least one modification to the at least one aspect of the rehabilitation plan.
Clause 8. The method of any clause herein, further comprising controlling the rehabilitation device based on the modified at least one aspect of the rehabilitation plan while the user is using the rehabilitation device.
Clause 9. The method according to any clause herein, further comprising controlling the rehabilitation device based on the modified at least one aspect of the rehabilitation plan while the user is using the rehabilitation device during a telemedicine session.
Clause 10. A method according to any clause herein, wherein the measurement information comprises at least one of a vital sign of the user, a respiratory rate of the user, a heart rate of the user, a body temperature of the user, and a blood pressure of the user.
Clause 11. A method according to any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from a sensor associated with the rehabilitation device.
Clause 12. The method of any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from sensors associated with a wearable device worn by the user while the user is using the rehabilitation device.
Clause 13. The method of any clause herein, further comprising receiving subsequent rehabilitation data relating to the user while the user is executing the rehabilitation plan using the rehabilitation device.
Clause 14. The method of any clause herein, further comprising modifying the modified rehabilitation plan in response to receiving a subsequent rehabilitation plan input comprising at least one further modification to at least one aspect of the modified rehabilitation plan, wherein the subsequent rehabilitation plan input is based on at least one of the rehabilitation data and the subsequent rehabilitation data.
Clause 15. A tangible, non-transitory, computer-readable medium storing instructions that, when executed, cause a processing device to: receive rehabilitation data relating to a user performing a rehabilitation plan using a rehabilitation device, wherein the rehabilitation data includes at least one of a characteristic of the user, measurement information relating to the user while the user is using the rehabilitation device, a characteristic of the rehabilitation device, and at least one aspect of the rehabilitation plan; generate rehabilitation information using the rehabilitation data; write the rehabilitation information to an associated memory for access at a computing device of a healthcare provider; communicate with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive a rehabilitation plan input; and modify at least one aspect of the rehabilitation plan in response to receiving the rehabilitation plan input, the rehabilitation plan input including at least one modification to the rehabilitation plan.
Clause 16. A computer readable medium according to any clause herein, wherein the processing device is further configured to control the rehabilitation device based on at least one aspect of the modified rehabilitation plan while the user is using the rehabilitation device.
Clause 17. A computer-readable medium according to any clause herein, wherein the processing device is further configured to control the rehabilitation device based on at least one aspect of the modified rehabilitation plan while the user is using the rehabilitation device during a telemedicine session.
Clause 18. The computer-readable medium of any clause herein, wherein the measurement information comprises at least one of a vital sign of the user, a respiratory rate of the user, a heart rate of the user, a body temperature of the user, and a blood pressure of the user.
Clause 19. The computer-readable medium of any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from a sensor associated with the rehabilitation device.
Clause 20. The computer-readable medium of any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from a sensor associated with a wearable device worn by the user while the user is using the rehabilitation device.
Clause 21. The computer-readable medium of any clause herein, wherein the processing device is further configured to: receive subsequent rehabilitation data relating to the user while the user is performing the rehabilitation plan using the rehabilitation device.
Clause 22. The computer-readable medium of any clause herein, wherein the processing device is further configured to: modify at least one aspect of the modified rehabilitation plan in response to receiving a subsequent rehabilitation plan input comprising at least one further modification to the rehabilitation plan, wherein the subsequent rehabilitation plan input is based on at least one of the rehabilitation data and the subsequent rehabilitation data.
Clause 23. A system comprising: a memory device configured to store instructions; and a processing device communicatively coupled to the memory device, the processing device configured to execute the instructions to: receive rehabilitation data relating to a user performing a rehabilitation plan using a rehabilitation device, wherein the rehabilitation data includes at least one of a characteristic of the user, measurement information relating to the user while the user is using the rehabilitation device, a characteristic of the rehabilitation device, and at least one aspect of the rehabilitation plan; generate rehabilitation information using the rehabilitation data; write the rehabilitation information to an associated memory for access at a computing device of a healthcare provider; communicate with an interface at the computing device of the healthcare provider, wherein the interface is configured to receive rehabilitation plan input; and modify at least one aspect of the rehabilitation plan in response to receiving the rehabilitation plan input, the rehabilitation plan input including at least one modification to the rehabilitation plan.
Clause 24. The system of any clause herein, wherein the processing device is further configured to: control the rehabilitation device, while the user is using the rehabilitation device, based on at least one aspect of the modified rehabilitation plan.
Clause 25. The system of any clause herein, wherein the processing device is further configured to: control the rehabilitation device based on at least one aspect of the modified rehabilitation plan while the user is using the rehabilitation device during a telemedicine session.
Clause 26. The system of any clause herein, wherein the measurement information comprises at least one of a vital sign of the user, a respiratory rate of the user, a heart rate of the user, a body temperature of the user, and a blood pressure of the user.
Clause 27. The system of any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from a sensor associated with the rehabilitation device.
Clause 28. The system of any clause herein, wherein at least some of the rehabilitation data corresponds to at least some sensor data from a sensor associated with a wearable device worn by the user while the user is using the rehabilitation device.
Clause 29. The system of any clause herein, wherein the processing device is further configured to: receive subsequent rehabilitation data relating to the user while the user is performing the rehabilitation plan using the rehabilitation device.
Clause 30. The system of any clause herein, wherein the processing device is further configured to: modify at least one of the at least one modified aspect of the rehabilitation plan and any other aspect of the rehabilitation plan in response to receiving a subsequent rehabilitation plan input comprising a further modification to at least one aspect of the rehabilitation plan, wherein the subsequent rehabilitation plan input is based on at least one of the rehabilitation data and the subsequent rehabilitation data.
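The data flow recited in Clauses 23 through 30 — receiving rehabilitation data, generating rehabilitation information, writing it to an associated memory for provider access, and modifying the rehabilitation plan in response to provider input — can be illustrated with a minimal sketch. This is a non-limiting example only; all class, field, and method names (e.g., `RehabilitationData`, `RehabilitationSession`, `apply_plan_input`) are hypothetical and are not part of the claimed clauses, and the in-memory list merely stands in for the associated memory.

```python
from dataclasses import dataclass, field

@dataclass
class RehabilitationData:
    """Data received while a user performs a rehabilitation plan on a device."""
    user_characteristics: dict    # e.g. demographic information
    measurement_info: dict        # e.g. heart rate, blood pressure
    device_characteristics: dict  # e.g. device model, configuration
    plan_aspects: dict            # e.g. {"resistance": 3, "duration_min": 20}

@dataclass
class RehabilitationSession:
    plan: dict
    memory: list = field(default_factory=list)  # stands in for the associated memory

    def receive(self, data: RehabilitationData) -> dict:
        # Generate rehabilitation information from the received data and
        # write it to the associated memory for access by a provider device.
        info = {"measurements": data.measurement_info, "plan": dict(self.plan)}
        self.memory.append(info)
        return info

    def apply_plan_input(self, plan_input: dict) -> dict:
        # Modify at least one aspect of the plan in response to provider input.
        self.plan.update(plan_input)
        return self.plan
```

As a hypothetical usage, a session created with `{"resistance": 3, "duration_min": 20}` that receives measurement data and then a provider input of `{"resistance": 2}` would record one rehabilitation-information entry and carry a modified plan with the lowered resistance.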
The above discussion is meant to be illustrative of the principles and various embodiments of the present disclosure. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
The various aspects, embodiments, implementations, or features of the described embodiments can be used alone or in any combination. Embodiments disclosed herein are modular in nature and can be used in conjunction with or connected to other embodiments.
Consistent with the above disclosure, the set of examples recited in the clauses herein is specifically contemplated and is intended as an exemplary, non-limiting set.