RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application No. 61/792,601, filed on Mar. 15, 2013, U.S. Provisional Patent Application No. 61/873,339, filed on Sep. 3, 2013, and U.S. Provisional Patent Application No. 61/873,347, filed on Sep. 3, 2013, all of which are incorporated by reference herein in their entirety.
This application is related to U.S. patent application titled “System and Method for Identifying and Interpreting Repetitive Motions”, filed on Mar. 14, 2014, the contents of which are hereby incorporated by reference.
FIELD OF THE INVENTION
The disclosure generally relates to the field of tracking user movements, and in particular to monitoring and quantifying repetitive and non-repetitive movements made by a user.
BACKGROUND OF THE INVENTION
Motion processing and wireless communication technology allows people to track things such as their sleeping patterns and the number of steps they walk each day. However, motion capturing devices and functionality have not seen much success in the marketplace because of limits in, for example, the functions that can be performed and the movements that can be monitored.
SUMMARY OF THE INVENTION
Embodiments include a motion tracking system that monitors the motions performed by a user in real time, based on motion data received from one or more sensors. The motion tracking system may include a motion tracking device with one or more sensors, a smart device with one or more sensors and/or a server, for example. The user may wear and/or carry the motion tracking device or the smart device while performing motions. As the user interacts with the motion tracking system or smart device, the motion data generated by one or more sensors is processed by a software application. The software application may be present on the smart device, the server, and/or the motion tracking device.
The software application generates interpreted data based on the motion data and contextual data, such as the equipment being used by the user. The interpreted data may include the performance of the user as the user performs a motion and/or feedback provided to the user during or after the user performs a motion or set of motions. The software application identifies the movement being performed by the user based on features present in one or more signals of the motion data. The software application may count and/or generate motion metrics associated with the performance of the user as the user performs a motion. The interpreted data is then provided to the user during and/or after the user has performed a motion or a set of motions. The feedback provided to the user may be visual, audio or tactile, for example.
The software application monitors the user's movements, evaluating and keeping track of qualitative and quantitative metrics such as the current exercise being performed by the user, the number of repetitions performed by the user and the form of the user, all in real time and/or after the user has performed the motion, a set of motions, multiple sets of motions and/or one or more routines. Thus, the user does not have to provide input to the application by interacting with the smart device or the motion tracking device. Hence, the user has the freedom to perform the workout at his/her own pace, without the interruption of periodically providing user input to the application via the smart device or the motion tracking device.
The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
The drawings presented herein are for purposes of illustration; the embodiments are not limited to the precise arrangements and instrumentalities shown.
FIG. 1 is a perspective view of a motion tracking system, according to one embodiment.
FIG. 2 is a flowchart illustrating one implementation of the motion tracking system, according to one embodiment.
FIG. 3 is a flowchart illustrating the motion tracking system monitoring user movements, according to one embodiment.
FIG. 4 illustrates repeated and non-repeated movements present in the processed signal, according to one embodiment.
FIG. 5 is a flowchart illustrating the motion tracking system identifying user movements based on motion data, according to one embodiment.
The figures depict various embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the embodiments described herein.
DETAILED DESCRIPTION OF THE INVENTION
Embodiments are now described with reference to the figures where like reference numbers indicate identical or functionally similar elements. Also in the figures, the leftmost digit(s) of each reference number corresponds to the figure in which the reference number is first used.
FIG. 1 is a perspective view of a motion tracking system 100, according to one embodiment. In one aspect of an embodiment, as discussed in detail with reference to the figures below, a user 23 wears a motion tracking device 24 while such user 23 is performing motions such as weight training, walking and cardiovascular movements and/or lifting objects. The motion tracking system 100 monitors the motion of a user in real time. In one embodiment, the motion tracking device 24 includes a motion processing unit 5 which measures a repetitive movement 32 or a non-repetitive movement 33 performed by the user 23. The motion processing unit 5 includes one or more sensors, such as an accelerometer 6, a gyroscope 7 and/or a magnetometer 8. The motion data 25 measured by the sensors and the motion processing unit 5 may be used to monitor the movements of a user in real time.
The motion data 25 is transmitted to an auxiliary smart device 18 running a software application 19. The application 19 analyzes the motion data 25 and generates interpreted data 28 to provide to the user 23. The application 19 also provides the user 23 with feedback regarding the user's movements. For example, the application 19 may analyze motion data 25 related to a user performing an exercise and provide feedback to the user 23 in real time. The feedback may include the quality of the form of the user's motion, recommendations for other exercises or the performance of the user. Motion data 25 is also, in one aspect, analyzed by the application 19 along with contextual data 26. The contextual data 26 may be gathered from a number of sources, such as other application data on the smart device 18 (e.g., geographical location, time of day, etc.) or from capturing devices such as a camera or an RFID tag/reader 2. Associating contextual data 26 with motion data 25 allows the application 19 on the auxiliary smart device 18 to provide additional information to the user related to the health, fitness or motions being performed by the user.
In one embodiment the motion tracking device 24 houses a microcontroller 1. Microcontroller 1 may be a small computer on a single integrated circuit containing a processor core, memory, and programmable input/output peripherals which manage multiple inputs and outputs that take place within the motion tracking device 24. Microcontroller 1 may receive direct inputs from user input 11 to power the motion tracking device 24 on/off, to trigger data visualization sent to a display 10 and to turn down the volume on a speaker 13. In one embodiment, microcontroller 1 is coupled to other components via a single printed circuit board or flexible circuit board.
In one embodiment the motion processing unit 5 is connected to the microcontroller 1 and a regulated power supply 17. Motion processing unit 5 includes multiple sensors which measure user 23's repetitive movements 32 and non-repetitive movements 33. Each component within the motion processing unit 5 measures a type of motion. For example, the accelerometer 6 detects changes in orientation and acceleration of the motion tracking device 24, the gyroscope 7 measures the angular velocity and the magnetometer 8 measures the strength and direction of magnetic fields. Hence, the sensors in the motion processing unit 5 allow the motion tracking device 24 to track the movements performed by the user 23. When motion data 25 is recorded by the motion processing unit 5, it may be sent to one or more locations. In one aspect of the present disclosure, motion data 25 is sent from the motion processing unit 5 to the microcontroller 1, where motion data 25 may be temporarily stored in an onboard memory 9. In one embodiment, motion data 25, along with any contextual data 26, are sent to smart device 18 via a communications module 4.
In one aspect of the present disclosure, motion data 25 may be sent directly to smart device 18 by the communications module 4. Communications module 4 is, in one embodiment, a Bluetooth module, but could also include Wi-Fi, ZigBee, or any other form of wireless communication, either in conjunction with or instead of Bluetooth. The communications module 4 is coupled to other components such as the microcontroller 1 and a regulated power supply 17. The regulated power supply 17 regulates the power transferred to different components from a battery 16.
In one embodiment, a recharge management 15 component acquires power from a USB input 12 and delivers it to the battery 16. In another embodiment, the recharge management 15 component acquires power from other forms of input and is not limited to acquiring power from the USB input 12. Battery 16 may be, but is not limited to, a rechargeable or non-rechargeable lithium ion battery, a rechargeable or non-rechargeable nickel metal hydride battery, or a rechargeable or non-rechargeable alkaline battery. In one embodiment, the battery 16 sends the power needed to the regulated power supply 17. The regulated power supply then distributes power to all components which need it. These components include but are not limited to the microcontroller 1, communications module 4, motion processing unit 5, memory 9, display 10, speaker 13 and a vibrator 14. In one aspect of the present disclosure, the motion tracking device 24 may be powered using solar cells mounted on a surface of the motion tracking device 24.
In one embodiment the speaker 13 is connected to the microcontroller 1 and/or the regulated power supply 17. The speaker 13 receives audio cues from microcontroller 1. Sound from speaker 13 is emitted through one or more speaker ports. Speaker ports 34 may be, but are not limited to, perforations located on the surface of the motion tracking device 24. Microcontroller 1 may also use the vibrator 14 to send tactile cues to the user 23. Vibrator 14 can be an off-axis motor which, when triggered by microcontroller 1, creates a vibrating sensation for user 23. Vibrator 14 is connected to microcontroller 1 and regulated power supply 17; power is pulled from battery 16 to power the component.
In one embodiment, the motion tracking device 24 is a wearable apparatus intended to be worn by the user 23 while performing repetitive movements 32. Motion tracking device 24 may be wrapped around a limb or part of the user 23's body using a strap band and a strap connector (not shown in FIG. 1). Motion tracking device 24 has a surface which may be intended to communicate and/or display data to the user 23 via components such as display 10 and speaker ports 34. Display 10 is a visual screen that the user 23 can read. Functions pertaining to display 10 may be, but are not limited to, displaying interpreted data 28, managing interpreted data 28, displaying battery life and managing the settings installed on motion tracking device 24 such as the volume associated with speaker 13. The display 10 may be, but is not limited to, an LED display, an LCD display, an electronic ink display, a plasma display or an ELD display, and may be, but is not limited to, being mounted on the surface of the motion tracking device 24. The speaker port is a collection of perforations that emit audio cues given off by speaker 13. The speaker port may be, but is not limited to, being located on the surface of the motion tracking device 24; it may be located in other locations such as on a side wall of the motion tracking device 24. User inputs 11, for example buttons, protrude through the surface 36 of motion tracking device 24. User inputs 11 may be located on any other exterior surface of motion tracking device 24, such as a side wall. Functions of user inputs 11 may be, but are not limited to, scrolling through interpreted data 28 on display 10, turning motion tracking device 24 on/off, managing interpreted data 28 via display 10, visualizing battery life, displaying notifications regarding motion tracking device 24 and managing volume levels of speaker 13. Motion tracking device 24 is charged via a charging port. The charging port may be, but is not limited to being, located on the side wall of the motion tracking device 24. The charging port may be a micro USB input 12, a mini USB port, an audio input, or any other means of transferring power.
The motion tracking device 24 may be, but is not limited to, being manufactured out of a flexible composite, so it may naturally convert from laid out flat to wrapped around a limb. In one aspect, motion tracking device 24, including the strap bands and the surface of the motion tracking device 24, is injection-molded out of a water resistant silicone, capable of various ranges of motion without causing stress on the silicone or the internal components. According to one aspect of the present disclosure, the strap bands may be made of rugged textile, capable of various ranges of movement. The strap connectors 41 have contact surfaces which may be, but are not limited to, a Velcro™ adhesive, magnetic tape, a snapping mechanism or any combination thereof. In one aspect of the present disclosure, the strap bands are embedded with magnets which then create the resulting connection between each strap band 40.
The motion tracking device 24 may be of various lengths and sizes, depending on the part of the body from which motion data 25 is being recorded. In one aspect of the present disclosure, strap bands 40 may be capable of stretching to meet the length requirements necessary to secure the motion tracking device 24 around user 23 via strap connector 41.
In one embodiment, the motion tracking device 24 houses components for capturing contextual data 26, such as a camera or an RFID tag reader 2. The camera captures images or videos of the environment the user is in or items the user is interacting with. For example, if a user is performing a curl, the camera may capture an image of the dumbbell being used by the user as the user is performing the curl. The microcontroller 1 receives the image from the camera and sends the image to the software application 19. The software application 19 may process the captured image (contextual data 26) and generate interpreted data 28 identifying the weight of the dumbbell being used by the user. In another example, an RFID tag reader 2 may capture an RFID tag associated with the dumbbell being used by the user to perform a curl. The microcontroller 1 receives the RFID tag identifier from the RFID tag reader 2 and sends it to the software application 19. The software application 19 may process the RFID tag (contextual data 26) and generate interpreted data 28 identifying the weight of the dumbbell being used by the user, as identified by the RFID tag. In an alternate embodiment, the user may input the contextual information via the software application 19 and/or the motion tracking device 24.
In one embodiment, motion data 25 and contextual data 26 are sent to the software application 19 installed onto smart device 18. Software application 19 interprets motion data 25 and contextual data 26 into interpreted data 28. In one embodiment, the interpreted data 28 may include the user 23's movements, pauses in movement, collections of movements and any other contextual information related to the user's movements. Interpreted data 28 can also be interpretations of contextual data 26, which can also, in one aspect, include estimates of the calories burned by the user during a given exercise reflected by a given set of motion data 25 using a piece of equipment identified by a given set of contextual data 26. In another embodiment, the interpreted data 28 includes the performance of the user during a set of motions and feedback provided to the user during and/or after the user performs a set of motions.
The smart device 18 may be any device capable of accepting wireless data transfer, such as a smartphone, tablet or a laptop. In one embodiment the smart device 18 has computing power sufficient to run the software application 19. Persons having skill in the art will realize that communication is not necessarily direct between the motion tracking device 24 and the smart device 18, and could instead be indirect, via one or more intermediary devices and/or via a network such as the Internet. The software application 19 interacts with the smart device 18 through a smart device API. The software application 19 receives motion data 25 from the motion tracking device 24 by using the smart device API to interact with the communication module 4. The software application 19 may be adapted to interact with a variety of smart device APIs. This allows the software application 19 to function on a variety of smart device platforms 18, each having its own smart device API. Hence, the user is not restricted to a specific smart device 18 in order to be able to use the application 19.
In one embodiment the software application 19 is hosted or installed on the motion tracking device 24. In this embodiment, the software application 19 may be executed by the processor on the microcontroller 1. Hence, the analysis of the motion data 25 may be performed by the software application 19 on the motion tracking device 24, independent of the smart device 18 or in combination with the smart device 18 and/or a remote processing device, e.g., server 21. In another embodiment, the software application may be installed on a device with at least one sensing component, such as a smartphone. The software application 19 in this embodiment may rely on motion data 25 provided by the sensors on the device to generate interpreted data 28, rather than on the pairing of the smart device 18 and the motion tracking device 24. For example, the application 19 installed on a smartphone 18 may use the motion data generated by the accelerometer 6 on the smartphone to determine the number of steps taken by the user as the user 23 walked from his/her house to work. Hence the motion tracking system 100 is not restricted to the coupling of a smart device 18 and a motion tracking device 24, and can operate in any number of steps with any number of devices involving the transfer of motion data 25 to the software application 19. In alternate embodiments, sensor information from multiple devices, e.g., smart device 18 and motion tracking device 24, can be used by software application 19.
In one embodiment the interpreted data 28 is sent from the smart device 18 to a remote processing device (a cloud based device and system), e.g., server 21, via a wireless data transfer or a network. For ease of reference, server 21 will be used in this description, but any remote, e.g., cloud based, processing device including multiple devices such as a remote database, storage, memory and processor(s) can be used. The server 21 can be any remote processing device. The server 21 attaches/correlates/identifies the interpreted data 28 to a user profile 29. The user 23 may then review, access and/or visualize the interpreted data 28 history associated with their user profile 29 via any device capable of wireless data transfer such as, without limitation, a smart phone, a tablet, a motion tracking device, or a computer, using a dedicated software application or a web browser to display the interpreted data 28. In another embodiment, the interpreted data 28 is also relayed back to the user 23 through software application 19 installed on the smart device 18 or on the motion tracking device 24. Interpreted data 28 may be displayed by the software application 19 for the user 23 to see during and/or following the user 23 performing movements. Feedback regarding the interpreted data 28 may be provided to the user 23 in real time in a number of ways. In one example, visual feedback is provided to the user 23 either on the smart device 18 or on the display 10 of the motion tracking device 24. In another example, audio feedback is provided to the user through speakers on the smart device 18 or on the motion tracking device 24. Tactile feedback may also be provided to the user through the vibrator 14.
In one embodiment, the software application 19 may be stored on the server 21. The software application 19 on the server 21 may analyze the motion data 25 sent to the server and generate the interpreted data 28 to associate with a user profile 29. For example, in the instance that the user 23 would like to save the power consumed by the smart device 18, the smart device 18 may send motion data 25 received from the motion tracking device 24 to the server 21 for processing. Hence, the processing of the motion data 25 by the software application is not limited to taking place on the smart device 18 or on the motion tracking device 24.
In one embodiment, the software application 19 or other code stored on the smart device 18 or the motion tracking device 24 may regulate the power consumed by the sensors by turning on or off one or more sensors. In one example, the sensors are turned off when the user has not activated or moved the device 24. In another example, one or more sensors are turned on or off for particular movements performed by the user.
FIG. 2 is a flowchart illustrating one implementation of the motion tracking system 100, according to one embodiment. In this embodiment the user is using the motion tracking system 100 as an artificial aide and a monitoring unit while performing a fitness routine. The user activates 205 the motion tracking device 24 or the application 19 on the motion tracking device 24 by either pressing user input 11 or moving the motion tracking device 24. In one example the application 19 on the motion tracking device 24 identifies that the user has activated the device based on motion data 25 received from the sensors.
The user then begins the fitness routine by either following a routine suggested by the application 19 or by following a routine the user would like to perform. For example, the routine suggested by the application 19 may include 3 sets of hammer curls using 30 pound dumbbells with a rest period of 60 seconds between each set, followed by 4 sets of 20 crunches with a rest period of 30 seconds between each set. As the user performs the routine, the application 19 monitors a number of characteristics related to the movements performed by the user based on the motion data 25. For example, the application 19 determines and monitors 215 the type of exercise being performed by the user, the quality of the form of the user as the user is performing the exercise and/or the number of counts or repetitions performed by the user. In one embodiment, the application 19 suggests and monitors 215 the rest time observed by the user in between sets of exercises as the user goes through the fitness routine.
The application 19 may also provide feedback 220 to the user in real time as the user performs the fitness routine. For example, the vibrator 14 on the motion tracking device 24 may vibrate, notifying the user of bad form as the user is performing a curl. In another example, the feedback includes charts and tables displayed on the display 10 of the motion tracking device 24 describing the performance of the user through the fitness routine.
In one embodiment the application 19 sends 225 the interpreted data and performance data to the server 21. The performance data may include statistics describing the performance of the user throughout the fitness routine, or quantitative metrics (e.g., percentage of routine completed, goals reached, repetitions of each exercise, etc.) evaluating the fitness routine performed by the user. The server 21 then associates or attaches 230 the performance data and/or the interpreted data 28 to the user's user profile 29.
FIG. 3 is a flowchart illustrating the motion tracking system 100 monitoring user movements, according to one embodiment. The application 19 monitors the movements made by the user based on the raw real-time motion data 25 obtained 305 from the sensors. The sensors generate raw real-time motion data 25 based on the movements of the user. For example, the accelerometer 6 generates acceleration data and change in acceleration data based on the relative movement of the device 18, 24 on the user.
The application 19 then processes 310 the real-time motion data obtained 305 from one or more of the sensors or the motion tracking device 24. Processing 310 the raw real-time data or signal removes the noise and other irrelevant features carried by the signal. In one embodiment a low pass filter is used to filter out the noise in the raw signal obtained 305 from the sensors. In another embodiment a moving average filter is used to filter out the noise in the raw signal obtained 305 from the sensors. It is understood that other filters can be used to increase the signal-to-noise ratio of the raw signals.
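By way of illustration only, the following sketch shows one way such a moving average filter could be implemented; the window size and the synthetic accelerometer samples are hypothetical assumptions, not part of the described embodiments.

```python
import numpy as np

def moving_average(signal, window=5):
    """Smooth a raw sensor signal with a simple moving average filter."""
    kernel = np.ones(window) / window
    # mode="same" keeps the output aligned with the input samples.
    return np.convolve(signal, kernel, mode="same")

# Hypothetical raw z-axis accelerometer samples (in G) with added noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 200)
raw = 0.6 * np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)
smoothed = moving_average(raw, window=9)
```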
In one embodiment the application 19 determines 315 a classification of the movement performed by the user based on one or more processed real-time signals. Classifying the movement performed by the user is important because it helps the system identify and understand the movement being performed. For example, the application 19 first determines 315 that the user is performing a curl, prior to identifying the characteristics associated with the user performing the curl, such as the form of the user's movements with respect to that of a correct curl movement.
In one embodiment, a classification algorithm may be a machine learning algorithm, a pattern recognition algorithm, a template matching algorithm, a statistical inference algorithm, and/or an artificial intelligence algorithm that operates based on a learning model. Examples of such algorithms are k-Nearest Neighbor (kNN), Support Vector Machines (SVM), Artificial Neural Networks (ANN), and Decision Trees.
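As a non-limiting sketch of one such classifier, the example below applies k-Nearest Neighbor voting to feature vectors; the feature values (peak amplitude and peak period) and the exercise labels are invented for illustration.

```python
import numpy as np
from collections import Counter

def knn_classify(features, train_features, train_labels, k=3):
    """Label a feature vector by majority vote among its k nearest training examples."""
    distances = np.linalg.norm(train_features - features, axis=1)
    nearest = np.argsort(distances)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical training data: [peak amplitude (G), peak period (s)] per movement.
train_features = np.array([[0.60, 1.0], [0.65, 1.1], [1.20, 0.5], [1.10, 0.6]])
train_labels = ["curl", "curl", "jumping_jack", "jumping_jack"]
print(knn_classify(np.array([0.62, 1.05]), train_features, train_labels))  # -> "curl"
```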
In one embodiment, after the application classifies 315 the movement being performed by the user, the application 19 quantifies 320 characteristics of the movement, such as the number of repetitions of the movement performed by the user. For example, the application 19 determines the number of times a user has performed a curl during a given set of curls, based on the number of repetitive movements (that have been classified as a curl) performed by the user. In one embodiment, the application 19 determines the number of real peaks present in a rolling window of one or more signals. A real peak may be determined based on the amplitude of the peak relative to the whole signal and/or other contextual information such as the expected pattern of peaks or duration of peaks for the classified or identified movement being performed by the user. For example, the application 19 may have identified that the user is performing a curl. Based on this information, real peaks may be known to appear in the rolling window of the z-axis of the accelerometer 6 signal above an amplitude of 0.6 G and over a time period n as a user is performing a curl. Similarly, real peaks may be known to appear in the rolling window of the y-axis of the gyroscope 7 signal above an amplitude of 0.5 radians/sec and over a period of 2n as a user is performing a curl. Hence, the application 19 may count the number of real peaks present in the z-axis accelerometer 6 signal as 1 per time period of n, and those present in the y-axis gyroscope 7 signal as 2 per period of 2n, thereby counting the number of curls performed by the user.
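A minimal sketch of this style of peak counting follows, assuming a smoothed signal and a fixed amplitude threshold; the 0.6 G threshold echoes the curl example above, while the minimum peak spacing and the synthetic signal are assumptions added for illustration.

```python
import numpy as np

def count_real_peaks(signal, threshold=0.6, min_spacing=10):
    """Count local maxima above a threshold, ignoring peaks closer than min_spacing samples."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            if not peaks or i - peaks[-1] >= min_spacing:
                peaks.append(i)
    return len(peaks)

# Hypothetical smoothed z-axis accelerometer signal for a set of curls:
# one repetition roughly every 2 seconds at 50 samples per second.
t = np.linspace(0, 10, 500)
z_accel = 0.7 * np.abs(np.sin(np.pi * t / 2))
print(count_real_peaks(z_accel, threshold=0.6, min_spacing=25))  # -> 5 curls
```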
In another embodiment, the application 19 may quantify 320 other characteristics of the movement being performed by the user, such as the speed of the movement. The application 19 may determine the time period over which a peak, valley or morphological feature in one or more signals occurs to determine the rate at which each repetitive movement is performed by the user. Longer time periods may correspond to slower movement speeds, and shorter time periods may correspond to faster movement speeds. The application 19 may thus quantify 320 a number of characteristics associated with the movements performed by a user based on the morphological features present in one or more signals.
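The following sketch illustrates, under an assumed 50 Hz sample rate, how the spacing between detected peaks could be converted into seconds per repetition; the function name and parameters are hypothetical.

```python
def repetition_rate(peak_indices, sample_rate_hz=50.0):
    """Estimate seconds per repetition from the average gap between consecutive peaks."""
    if len(peak_indices) < 2:
        return None  # not enough peaks to estimate a rate
    gaps = [b - a for a, b in zip(peak_indices, peak_indices[1:])]
    return (sum(gaps) / len(gaps)) / sample_rate_hz

# Hypothetical peak indices at 50 Hz: one curl roughly every 2 seconds.
print(repetition_rate([50, 150, 250, 350]))  # -> 2.0 seconds per repetition
```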
FIG. 4 illustrates repeated 405 and non-repeated 410 movements present in the processed signal, according to one embodiment. Referring to FIG. 4 with respect to the method illustrated in FIG. 3, the application 19 counts the repetitive movements 405 performed by the user during a fitness routine. For example, if the repeated movements 405 were that of curls performed by the user, the application 19 would determine that the user performed 5 repeated movements or 5 curls. The application 19 differentiates between the non-repeated movements 410 represented by one portion of the processed signal and the repeated movements 405 represented by a different portion of the processed signal.
In one embodiment, the application 19 identifies groups of repeated movements 405 performed by a user. For example, the fitness routine suggested by the application may include the user receiving instructions to perform 3 sets of 5 curls with a rest time of 30 seconds between each set. The application 19, based on the processed real-time signal, first identifies and classifies the user's movements as curls. Then the application 19 is notified of the user performing the first set of curls, based on the user performing repetitive curl movements 405. After the application 19 has recorded group 1 (415), comprising 5 curls, the application 19 also monitors the transition time 1 (430), or the rest time, represented by the non-repeated movements 410 between groups 1 (415) and 2 (420). The application 19 then monitors group 2 (420), comprising 5 curls, and the transition time 2 (435) between group 2 (420) and group 3 (425). The application 19 identifies that the user has finished the 3 sets of curls once the application has finished monitoring group 3 (425), the last set of curls performed by the user. Hence, the application 19 monitors the fitness routine followed by the user based on the processed real-time signal representing the movements performed by a user.
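One possible way to segment repetitions into such sets, sketched below, is to split the repetition timestamps wherever the gap between consecutive repetitions exceeds a rest threshold; the 10-second threshold and the timestamps are assumptions made for illustration.

```python
def group_into_sets(rep_times, rest_threshold=10.0):
    """Split repetition timestamps (seconds) into sets wherever the gap exceeds rest_threshold."""
    sets, current = [], []
    for t in rep_times:
        if current and t - current[-1] > rest_threshold:
            sets.append(current)  # the gap is a rest/transition period
            current = []
        current.append(t)
    if current:
        sets.append(current)
    return sets

# Hypothetical curl timestamps: 3 sets of 5 repetitions with ~30 second rests.
reps = [2, 4, 6, 8, 10, 42, 44, 46, 48, 50, 82, 84, 86, 88, 90]
print([len(s) for s in group_into_sets(reps)])  # -> [5, 5, 5]
```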
FIG. 5 is a flowchart illustrating the motion tracking system 100 identifying user movements based on motion data 25, according to one embodiment. The application 19 determines the movement performed by the user based on one or more processed real-time signals. The application 19 extracts 505 a set of statistical or morphological features present in a rolling window of one or more processed signals. Features may include the amplitude, mean value, variance and standard deviation of the signal, as well as the number, order, amplitude, frequency and/or time period of valleys and/or peaks in one or more processed signals. For example, while performing a curl, the z-axis of the accelerometer 6 might record a repetitive pattern of a single peak over a time period n, followed by a single valley also over a time period n. The y-axis of the accelerometer 6 might record a repetitive pattern of a valley between 2 peaks during the same time period n. The extracted features are used by the classification algorithm to detect the movement being performed by the user.
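Purely as an illustration, the sketch below extracts a few of the named statistical features from one window of a processed signal; the exact feature set and window contents are hypothetical.

```python
import numpy as np

def extract_features(window):
    """Compute simple statistical features of one processed signal window."""
    peaks = [i for i in range(1, len(window) - 1)
             if window[i] > window[i - 1] and window[i] > window[i + 1]]
    return {
        "mean": float(np.mean(window)),
        "variance": float(np.var(window)),
        "std": float(np.std(window)),
        "amplitude": float(np.max(window) - np.min(window)),
        "num_peaks": len(peaks),
    }

# Hypothetical rolling window of a processed accelerometer signal.
window = np.sin(np.linspace(0, 4 * np.pi, 100))
print(extract_features(window))
```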
In one embodiment the application 19 applies a template matching algorithm to identify repetitive features (such as peaks and valleys in the signal) in a rolling window of one or more processed signals. The application 19 compares the repetitive features in the rolling window of one or more processed signals to a set of movement templates 27 stored in a movement template database 22 on the smart device 18 or the motion tracking device 24. Based on the comparison, the application 19 then selects the movement templates 27 in the database 22 having repetitive features most similar to or most closely matching those present in one or more processed signals. Based on one or more or a combination of the selected templates, the application 19 identifies and classifies 515 the movement being performed by the user. For example, the application 19 compares the repetitive features present in the z-axis of the accelerometer 6 signal and the y-axis of the gyroscope 7 as a user is performing a curl, with the movement templates 27 stored in the movement template database 22. The application 19 selects a z-axis acceleration signal movement template 27 and a y-axis gyroscope 7 signal movement template 27 similar to those of the recorded signals. The application 19 then identifies 515 that the user is performing a curl, as the two movement templates 27 selected by the application 19 are known to be associated with a curl movement. One example of a template matching algorithm is cross-correlation; another example is dynamic time warping.
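The sketch below illustrates the cross-correlation variant of such template matching, assuming each stored template and the observed window are one-dimensional arrays sampled at the same rate; the template names and waveforms are invented for the example.

```python
import numpy as np

def match_template(signal, templates):
    """Return the template name with the highest normalized cross-correlation score."""
    best_name, best_score = None, -np.inf
    for name, template in templates.items():
        # Zero-mean, unit-variance normalization so amplitude differences do not dominate.
        a = (signal - signal.mean()) / (signal.std() + 1e-9)
        b = (template - template.mean()) / (template.std() + 1e-9)
        n = min(len(a), len(b))
        score = float(np.dot(a[:n], b[:n]) / n)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Hypothetical movement templates for a z-axis accelerometer signal.
t = np.linspace(0, 2 * np.pi, 100)
templates = {"curl": np.sin(t), "shoulder_press": np.sin(2 * t)}
observed = np.sin(t) + 0.1 * np.random.default_rng(1).standard_normal(100)
print(match_template(observed, templates))  # expected: ("curl", score near 1.0)
```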
In one embodiment, the application 19 guides the user through a fitness routine. As the application 19 is guiding the user through the fitness routine, the application 19 is aware of the repetitive movement being performed by the user. Thus, the application 19 may verify the movement identified based on the recorded motion data 25 against the movement the application 19 expects the user to perform based on the fitness routine. In a second embodiment, as the application 19 is aware of the movement being performed by the user, the application 19 no longer needs to identify the movement being performed based on the motion data, and hence begins to count the repetitive features present in one or more processed signals to determine the repetitions performed by the user. In a third embodiment, as the application 19 is aware of the movement being performed by the user, the application 19 may compare the recorded motion data to a subset of movement templates 27 in the movement template database 22, wherein the subset of movement templates 27 represents templates related to the movements the application 19 expects the user to perform. For example, if the application 19 is aware that the user is currently performing a curl as part of a fitness routine, the application 19 would compare the recorded motion data 25 with movement templates 27 associated with the curl classification of movements.
In one embodiment, the application 19 determines 510 statistical characteristics, such as a mean or standard deviation, associated with the repetitive features in one or more signals. For example, the application 19 may determine 510 the mean of the amplitude of the peaks recorded in one or more signals while the user is performing curls. If the mean is found to be relatively greater than the expected threshold for characterizing real peaks, the application 19 may raise the weight for the next set of curls suggested to the user, as a relatively higher mean implies that the user was able to perform the current curl more easily (at a faster rate) than expected. In another embodiment, the application may determine the standard deviation of the amplitude and frequency of the peaks recorded in one or more signals while the user is performing curls. If the standard deviation is found to be outside of an expected range of standard deviation values for a curl action, it is possible that even though the pattern of features may have been identified to match a curl, the user may not really be performing a curl, but may be performing a different motion similar to a curl. Hence, the statistical characteristics of the features in one or more signals provide additional information toward identifying 515 the movement performed by the user.
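As one hedged illustration of this plausibility check, the sketch below flags a classification as suspect when the standard deviation of the counted peak amplitudes falls outside an expected range; the range values and amplitudes are assumptions, not values from the disclosure.

```python
import numpy as np

def plausibly_matches(peak_amplitudes, expected_std_range=(0.01, 0.15)):
    """Flag a candidate classification as implausible if the peak-amplitude spread is out of range."""
    std = float(np.std(peak_amplitudes))
    low, high = expected_std_range
    return low <= std <= high

# Hypothetical peak amplitudes (G) measured during a set of curls.
print(plausibly_matches([0.62, 0.65, 0.60, 0.63]))  # True: consistent, curl-like peaks
print(plausibly_matches([0.62, 1.40, 0.30, 0.95]))  # False: likely a different motion
```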
In another embodiment, the application 19 uses machine learning algorithms to detect movements, and/or classify or identify 505 movements performed by a user in a rolling window of one or more signals based on the repetitive features or morphological features present in one or more signals. An example of a recognition and learning algorithm that may be used is described in Ling Bao et al., “Activity Recognition from User-Annotated Acceleration Data”, which is incorporated by reference herein in its entirety.
In one example, in addition to calibrating the motion tracking system 100, the user may edit the interpreted data 28 generated by the motion tracking system 100. For example, the user may edit the number of repetitive movements or the count associated with a motion or exercise performed by the user as determined by the motion tracking system 100. The edits performed by the user, along with the motion data associated with the motion performed by the user, are analyzed by the motion tracking system 100. In one example, based on the edits performed to the interpreted data 28, the motion tracking system 100 modifies the classification algorithm, the template matching algorithm, or the machine learning algorithm used to generate the interpreted data 28. The algorithms are modified to capture the discrepancies in the interpreted data generated prior to the user editing the interpreted data, and thereby cater to the behavior and unique movements and tendencies associated with the user.
In one example, the motion tracking system 100 determines that the user performed 11 counts of a repetitive motion. The user may edit the count from 11 to 10. The motion tracking system 100 identifies the discrepancy between the features of the motion data associated with the repetitive movement and one or more algorithms used to classify and quantify the repetitive movement. For example, based on the template matching algorithm, the motion tracking system 100 may have classified a feature similar to that associated with the repetitive movement as a repetitive movement performed by the user. Based on the edit made by the user, the motion tracking system 100 modifies the applied algorithm to no longer classify and quantify the feature similar to that associated with the repetitive movement.
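A minimal sketch of one way such an algorithm modification could work is shown below: after a user corrects an over-count, the amplitude threshold is raised just enough to exclude the weakest counted peak. The update rule, function name and amplitude values are hypothetical and do not represent the disclosed method.

```python
def adjust_threshold(threshold, detected_peak_amplitudes, detected_count, corrected_count):
    """Raise the peak threshold just enough to drop over-counted repetitions."""
    if corrected_count >= detected_count:
        return threshold  # no over-count to correct
    ranked = sorted(detected_peak_amplitudes, reverse=True)
    kept, dropped = ranked[:corrected_count], ranked[corrected_count:]
    # Place the new threshold between the weakest kept peak and the strongest dropped one.
    return (min(kept) + max(dropped)) / 2

# Hypothetical: the system counted 11 repetitions, the user corrected the count to 10.
amps = [0.90, 0.88, 0.85, 0.84, 0.83, 0.82, 0.80, 0.79, 0.78, 0.75, 0.61]
print(adjust_threshold(0.6, amps, 11, 10))  # -> 0.68, excluding the 0.61 G false peak
```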
In another example, the motion tracking system 100 receives an edit from the user correcting the user's heart rate measured as the user was performing a motion. The motion tracking system 100 may retrieve the edit made by the user and the motion data associated with the edit (e.g., motion data associated with the user using a cardio fitness machine as the motion tracking system 100 measured the user's heart rate). The motion tracking system 100 may modify the algorithm applied to quantify the user's heart rate (based on the heart rate data received). The motion tracking system 100 may then apply the modified algorithm when the user performs the motion or similar motion data is received in the future.
In one embodiment, the motion tracking system 100 groups the edit information received from a number of users based on one or more characteristics associated with the users, such as data associated with the physical measurements of a user and/or the type of edits made by the user to the interpreted data 28 generated by the motion tracking system. Based on the types of edits made by a user and the grouping associated with the type of edits, the motion tracking system 100 may modify the algorithms applied to classify and quantify repetitive motions performed by the user. For example, users of a certain height range may perform particular motions prior to beginning a set of pull ups. These motions may accidentally be classified and quantified as a count or pull up performed by the user. The motion tracking system 100 may modify the algorithms classifying and quantifying pull ups performed by the user based on the height information associated with the user.
Reference in the specification to “one embodiment” or to “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment. The appearances of the phrase “in one embodiment” or “an embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
Some portions of the detailed description are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps (instructions) leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. Furthermore, it is also convenient at times, to refer to certain arrangements of steps requiring physical manipulations or transformation of physical quantities or representations of physical quantities as modules or code devices, without loss of generality.
However, all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device (such as a specific computing machine), that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Certain aspects of the embodiments include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the embodiments can be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by a variety of operating systems. The embodiments can also be embodied in a computer program product which can be executed on a computing system.
The embodiments also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the purposes, e.g., a specific computer, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus. Memory can include any of the above and/or other devices that can store information/data/programs and can be a transient or non-transient medium, where a non-transient or non-transitory medium can include memory/storage that stores information for more than a minimal duration. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the method steps. The structure for a variety of these systems will appear from the description herein. In addition, the embodiments are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the embodiments as described herein, and any references herein to specific languages are provided for disclosure of enablement and best mode.
In addition, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the embodiments, which is set forth in the claims.
While particular embodiments and applications have been illustrated and described herein, it is to be understood that the embodiments are not limited to the precise construction and components disclosed herein and that various modifications, changes, and variations may be made in the arrangement, operation, and details of the methods and apparatuses of the embodiments without departing from the spirit and scope of the embodiments as defined in the appended claims.