CN101784230A - System and method for displaying anonymously annotated physical exercise data - Google Patents


Info

Publication number
CN101784230A
Authority
CN
China
Prior art keywords
people
data
physical exercise
exercise data
practises
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200880104207A
Other languages
Chinese (zh)
Inventor
G·兰弗曼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV
Publication of CN101784230A (en)
Legal status: Pending

Abstract

The present invention relates to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises. The physical exercise data is annotated, on the basis of that data, at a physically separate annotation unit. At the person's location, visual recordings of the person undertaking exercises are displayed to the person together with synchronized annotation information. A system for performing the method comprises a physical data processing unit (1), a display device (2), at least one posture recording device (3, 3'), a visual recording device (4), a data storage unit (5), and a physically separate annotation unit (6) connected to the physical data processing unit (1) via an interconnected computer network (7).

Description

System and method for displaying anonymously annotated physical exercise data
Background of the Invention
The present invention relates to a system and method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
Home-care exercises for people affected by a health condition such as stroke, or home training exercises for people wishing to improve a body movement such as a golf swing, can be recorded via sensors. These exercises can also be assessed by a professional, such as a physiotherapist or a golf instructor, in order to provide direct feedback to the person.
If the reviewing professional is not on site, video camera recordings made during practice can be sent to him. Such recordings can be reviewed intuitively by the professional, and the recordings, once annotated with comments, can be understood intuitively by the person undertaking exercises. However, these recordings may invade the person's privacy, particularly when they are sent away to a remote professional. Moreover, fully automatic processing of such recorded images to provide meaningful feedback is a highly demanding task.
Alternatively, only the data from the sensors is transmitted, which does not invade the person's privacy. In this respect, US 6,817,979 B2 relates to a system and method for interacting with a user's virtual physiological model by using a mobile communication device. Physiological data associated with the user is obtained from the user. Preferably, this physiological data is transmitted to the mobile communication device using a wireless communication protocol. The method also involves using the mobile communication device to deliver the physiological data to a web server, where it is integrated into the user's virtual physiological model. The user can access data representations derived from this physiological data.
As an example, the user can create an avatar representing the user's current condition. The user can adjust the avatar so as to change its appearance to a more desired one; for example, the anatomical dimensions of the avatar can be changed to reflect desired waist, chest, upper arm and thigh dimensions. Given the differences between the desired and current avatar features, various training, diet and related fitness suggestions can be derived, so as to establish a training course best suited to help the user reach the desired fitness goals. Physiological data is subsequently obtained, applied to the user's avatar, and compared with the data of the desired avatar to determine whether the training course is effective in reaching the desired fitness goals.
However, front-end interpretation of the sensor signals commonly causes difficulties on the user's part. An abstract presentation in the form of an artificial on-screen figure is hard to relate to.
Despite these efforts, there therefore remains a technical need for a system and method for displaying anonymously annotated physical exercise data to the person undertaking exercises.
Summary of the Invention
To achieve this and other objects, the present invention is directed to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the steps of:
a) acquiring physical exercise data from the person undertaking exercises;
b) synchronously acquiring visual recordings of the person undertaking exercises;
c) transmitting the physical exercise data to a physically separate annotation unit;
d) annotating the physical exercise data at the physically separate annotation unit, on the basis of the physical exercise data;
e) transmitting the annotation information to a display and processing unit for review by the person undertaking exercises;
f) displaying the visual recordings of the person undertaking exercises to the person, together with the synchronized annotation information.
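The claimed steps can be sketched as a minimal end-to-end loop. All function and field names here are illustrative assumptions, not terms from the patent, and the deviation test stands in for whatever annotation logic an implementation would use:

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """One exercise session; the annotation side sees only an ID number."""
    person_id: int
    sensor_data: list = field(default_factory=list)   # step a)
    video_frames: list = field(default_factory=list)  # step b)

def acquire(session, sensor_sample, video_frame):
    # steps a) and b): acquired synchronously, so equal indices
    # refer to the same instant in both streams
    session.sensor_data.append(sensor_sample)
    session.video_frames.append(video_frame)

def annotate_remotely(sensor_data):
    # steps c) and d): only sensor data reaches the annotation unit;
    # here a sample is flagged when it deviates from a motion template
    template = 1.0
    return [(i, "deviation") for i, s in enumerate(sensor_data)
            if abs(s - template) > 0.5]

def replay(session, annotations):
    # steps e) and f): annotations return and are overlaid on the
    # locally stored video, synchronized by index
    return [(session.video_frames[i], note) for i, note in annotations]

session = Session(person_id=4711)
for i, sample in enumerate([1.0, 1.1, 2.0, 0.9]):
    acquire(session, sample, f"frame{i}")

notes = annotate_remotely(session.sensor_data)
overlay = replay(session, notes)
print(overlay)  # [('frame2', 'deviation')]
```

Note that the video frames never leave the session object; only `sensor_data` crosses to the annotation side, which is the privacy property the method claims.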
Detailed description of the invention
Before describing the present invention in detail, it is to be understood that the invention is not limited to the particular component parts of the devices described or process steps of the methods described, as such devices and methods may vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. It must be noted that, as used in this specification and the appended claims, the singular forms "a", "an" and "the" include singular and/or plural referents unless the context clearly dictates otherwise.
In the context of the present invention, the term "anonymously annotated data" denotes data for which the annotating third person does not know the identity of the person whose data he is annotating. In particular, the data does not allow the person to be identified. One way to achieve anonymity is to assign an identification number to the data. Physical exercise data relates to data about a person's movements or other exercises.
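A minimal sketch of the identification-number scheme just mentioned; the field names and the details of how the number is assigned are assumptions for illustration only:

```python
import itertools

_next_id = itertools.count(1000)
_id_table = {}   # kept only at the person's side, never transmitted

def anonymize(record):
    """Replace identifying fields by an assigned identification number.

    The annotation unit then receives only the number and the exercise
    data, so the annotator cannot identify the person. Field names are
    illustrative, not from the patent.
    """
    pid = _id_table.setdefault(record["name"], next(_next_id))
    return {"person_id": pid, "exercise_data": record["exercise_data"]}

out = anonymize({"name": "J. Doe", "exercise_data": [0.1, 0.2]})
print(out)  # {'person_id': 1000, 'exercise_data': [0.1, 0.2]}
```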
Two steps of the method describe how two different sets of information about the person's exercise are acquired. First, physical exercise data is acquired, for example by continuously monitoring sensor signals from the person. Simultaneously, visual recordings are acquired, for example by using a digital video camera. Acquiring these data synchronously ensures that a given part of the video stream can later be attributed to a given part of the sensor signal stream, and vice versa.
Because the visual recordings and the physical exercise data are separate entities, the physical exercise data can then be transmitted to a physically separate annotation unit. The physical separation of the annotation unit provides the anonymity of the data. At the annotation unit, the physical exercise data can be processed into a representation of the exercise for review by a third person. The physical exercise data can then be annotated. This includes automatic processing of the data, for example by detecting deviations from a motion template. In addition, the third person can include comments and suggestions in order to give helpful feedback to the person undertaking exercises. Subsequently, the annotation information is transmitted to a display and processing unit at the location of the person undertaking exercises, where the annotation information and the visual recordings are combined. The visual recordings of the person are then displayed to that person together with the synchronized annotation information. The synchronization ensures that the annotations are displayed at the correct time, so that the person can directly recognize what attracted the attention of the reviewer or of an automatic review system.
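The synchronous acquisition can be sketched as two timestamped streams sharing one clock, so that either stream can be attributed to the other. The sample rates below are illustrative assumptions:

```python
import bisect

# Both streams carry timestamps (seconds) from one common clock,
# because they were acquired synchronously.
sensor_ts = [0.00, 0.04, 0.08, 0.12, 0.16]   # e.g. a 25 Hz sensor stream
frame_ts  = [0.00, 0.10, 0.20]               # e.g. a 10 fps video stream

def sensor_index_for_frame(t):
    """Attribute a video frame at time t to the latest sensor sample
    not newer than t."""
    return bisect.bisect_right(sensor_ts, t) - 1

def frame_index_for_sensor(t):
    """...and vice versa: attribute a sensor sample to its video frame."""
    return bisect.bisect_right(frame_ts, t) - 1

print(sensor_index_for_frame(0.10))  # 2 -> the sample taken at 0.08 s
print(frame_index_for_sensor(0.12))  # 1 -> the frame captured at 0.10 s
```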
In summary, with the method according to the invention a person's exercises can be reviewed anonymously, and feedback can be given to that person. The anonymity allows the professionals' resources to be shared, making the review process more efficient. At the same time, when the person receives the feedback, the visual recording shows him most clearly which part of the exercise prompted the feedback.
In one embodiment of the invention, in step d) an avatar is calculated from the physical exercise data at the physically separate annotation unit. For the purposes of the present invention, the term "avatar" denotes a computer-generated, abstract representation of a person's posture or motion. In a simple case, the avatar can be a stick figure. In more elaborate cases, the avatar can represent additional information, such as pulse rate, amount of perspiration, degree of muscle fatigue, and so on. An advantage of the avatar representation is that the avatar can be rotated on the screen of the annotation unit while representing the exercise, allowing the reviewer to select the viewing angle best suited for assessing the exercise.
In another embodiment of the invention, step f) additionally comprises calculating an avatar and displaying the avatar, synchronized with the visual recordings and the annotations, to the person. In short, the person will then see the visual recording, the annotations and an avatar of his exercise. This is advantageous because the avatar can depict the person's motion more clearly if that motion is obscured in the visual recording by loosely fitting clothes, or if it was not correctly captured by the camera. Moreover, the avatar can be rotated to obtain the best viewing angle. A further option is to provide one or more avatars for multiple viewing angles.
In another embodiment of the invention, the transmission of the physical exercise data in step c) and of the annotation information in step e) is carried out via an interconnected computer network, preferably the Internet. This allows review and annotation by a remotely located person. Suitable protocols include the TCP/IP protocols.
In another embodiment of the invention, the physical exercise data from the person is selected from the group comprising: motion data, posture data, electromyographic data, pulse rate, blood pressure, oxygen content, blood sugar content, severity of perspiration and/or breathing rate. Some of these data types relate to the exercise itself, as in the case of motion and posture data. Other data types relate to the person's overall condition or fitness. Knowledge in this respect can provide valuable insight into the effectiveness of a rehabilitation or training measure; for example, it can be inferred whether the person is in the overcompensation phase following a training stimulus.
In another embodiment of the invention, the annotation information is selected from the group comprising: visual information, audio signals and/or voice recordings. The visual information can take the form of markings inserted into the images of the avatar, such as arrows pointing out a particular problem. In addition, short video clips can be inserted to show the correct execution of an exercise. Other visual information can be written comments, or graphs of statistics of the recorded data, such as electromyographic data, pulse rate, blood pressure, oxygen content, blood sugar content, severity of perspiration and/or breathing rate. This makes it possible to assess the condition of the person undertaking exercises at a glance. An audio signal can be a simple buzzer sounded when a motion is not executed correctly. Voice comments recorded by the reviewer can be added when a spoken comment is the simplest way to annotate an exercise.
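The annotation kinds listed above could be carried in a small record like the following; the type name, field names and example values are hypothetical:

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class Annotation:
    """One annotation tied to a moment of the exercise; the kinds mirror
    the groups named in the text, the field names are hypothetical."""
    time_s: float    # position on the exercise timeline, in seconds
    kind: Literal["marker", "clip", "text", "chart", "buzzer", "voice"]
    payload: str     # e.g. marker position, clip name, comment text

def render(ann):
    # a buzzer is just a signal; the other kinds carry a payload
    if ann.kind == "buzzer":
        return f"{ann.time_s:.1f}s: BEEP"
    return f"{ann.time_s:.1f}s: {ann.kind} -> {ann.payload}"

notes = [
    Annotation(260.0, "marker", "arrow at the right elbow"),
    Annotation(260.0, "buzzer", ""),
    Annotation(261.5, "voice", "keep the arm straight"),
]
for n in notes:
    print(render(n))
```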
The present invention is also directed to a system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising:
- a physical data processing unit;
- a display device in communication with the physical data processing unit;
- at least one posture recording device, assigned to the person undertaking exercises and in communication with the physical data processing unit;
- a visual recording device in communication with the physical data processing unit;
- a data storage unit for storing and retrieving data from the physical data processing unit and the visual recording device, the data storage unit being in communication with the physical data processing unit;
- a physically separate annotation unit connected to the physical data processing unit, the connection being via an interconnected computer network.
In one embodiment of the invention, the at least one posture recording device comprises motion sensors worn on the body of the person undertaking exercises, the sensors being selected from the group comprising: acceleration sensors, inertial sensors and/or gravity sensors. The motion sensors can be worn at selected locations on the body, such as the upper arm, forearm, thigh, lower leg or trunk. They can be commercially available, highly integrated solid-state sensors. The sensor signals can be transmitted to the posture evaluation unit via wires, wirelessly, or in a body area network that exploits the electrical conductivity of human skin. After the person's posture has been calculated, the result can be provided in the form of an avatar.
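As a rough illustration of how an acceleration/gravity sensor worn on a limb can contribute to the posture calculation: at rest, the sensed gravity vector gives the limb's inclination. This is only a sketch under the stated assumptions; the patent does not specify the posture algorithm:

```python
import math

def tilt_deg(ax, ay, az):
    """Angle (degrees) between the sensor's z axis and the vertical,
    from the gravity vector of a static 3-axis acceleration sensor.

    Assumes the z axis lies along the limb and the limb is at rest;
    a real posture estimator would fuse several sensors and handle
    motion, which the patent leaves open.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    # clamp to guard against rounding pushing the ratio past 1.0
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

print(round(tilt_deg(0.0, 0.0, 9.81)))  # 0  -> limb vertical
print(round(tilt_deg(9.81, 0.0, 0.0)))  # 90 -> limb horizontal
```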
In another embodiment of the invention, the at least one posture recording device comprises optical markers on the body of the person undertaking exercises. The posture recording device then follows the at least one optical marker with an optical tracking system, and a representation of the person's posture is calculated from the signals of the optical tracking system. The optical markers can be carried at selected locations on the body, such as the upper arm, forearm, thigh, lower leg or trunk. The tracking of the markers can be realized with a single camera or with numerous cameras; when stereoscopic cameras are used, three-dimensional posture and exercise data is generated. After image processing and calculation of the person's posture, the result can be provided in the form of an avatar.
It is also possible to combine several posture monitoring principles. For example, the combination of motion sensors and optical tracking can provide complementary data for a better calculation of the person's posture.
Another aspect of the present invention is the use of a system according to the claims for displaying anonymously annotated physical exercise data to a person undertaking exercises.
Brief Description of the Drawings
The present invention will become more readily understood with reference to the following drawings, in which:
Fig. 1 shows a system according to the invention;
Fig. 2 shows a visual recording synchronously overlaid with an avatar representing the physical exercise data;
Fig. 3 shows a flow chart of a method according to the invention;
Fig. 4 shows the modules used to carry out a method according to the invention.
Detailed Description
Fig. 1 shows a system according to the invention for displaying anonymously annotated physical exercise data to a person undertaking exercises. The person wears motion sensors 3 on the thigh and the ankle as posture recording devices. In addition, optical markers 3' are placed on the wrist and the trunk. The signals of the motion sensors 3 are transmitted wirelessly, as physical exercise data, to the physical data processing unit 1, where the raw sensor signals are processed into motion and posture data. A video camera 4 records the person's motion. The physical data processing unit 1 also performs an optical tracking operation on the video stream of the camera 4 to identify the position and motion of the optical markers 3'. This, too, is processed into motion and posture data and supplements the data obtained from the motion sensors 3.
The raw or processed sensor signals and the position information from the optical markers 3' are stored in a data storage unit 5, where the video stream of the person undertaking exercises is stored as well. The data in the data storage unit 5 is stored together with information about the time of recording. This makes it possible to correlate or synchronize the information, for example to know which position indicated by the posture recording devices 3, 3' corresponds to which frame of the person's video clip.
Using an interconnected computer network such as the Internet 7, the physical data processing unit 1 transmits the processed signals of the sensors 3 and the position information from the optical markers 3' to the physically separate annotation unit 6. Time information is transmitted as well. The annotation unit then calculates a visual representation, such as an avatar, from the received physical data. On his terminal 8, a physiotherapist watches the motion of this visual representation and adds comments to individual segments, thereby performing the annotation. The annotation is transmitted back, together with the time in the exercise to which it refers, to the physical data processing unit 1 at the location of the person undertaking exercises. Again, the transmission is realized via an interconnected computer network such as the Internet 7.
The physical data processing unit 1 then accesses the data storage unit 5 and retrieves the recorded data and video clip of the particular exercise that was annotated. A film sequence is generated and presented to the person on the display 2. In this case, the person's video stream and the avatar calculated from the recorded data are shown simultaneously. At the appropriate times, the physiotherapist's comments are also displayed or spoken to the person.
Fig. 2 shows a visual recording synchronously overlaid with an avatar representing the physical exercise data. A person is exercising. The physical data representing his motion is recorded and used to calculate an avatar representation. The motion of the avatar is resolved in time and divided into a stream of individual frames 20. Similarly, the person's motion is recorded by the video camera; this video image sequence is likewise resolved in time and divided into a stream of individual frames 21. Because the physical exercise data and the visual recording are acquired synchronously, a common timeline can be assigned to them. In Fig. 2, the timeline below the frame streams arbitrarily begins at minute 4:16 and ends at minute 4:21.
In the exercise of Fig. 2, the person starts with both arms stretched downwards. In the images, the left arm remains stretched and is lifted along the coronal plane until the hand is above the person's head. The arm is held in this position while the same motion is supposed to be carried out with the right arm. At 4:20, the person is unable to keep his right arm stretched out in the horizontal position; the arm bends at the elbow. This makes it easier to lift the arm, so that at this point no therapeutic benefit is obtained. A physiotherapist remotely reviewing the avatar frames 20 can thus pick out the frame at 4:20 and add a visual or spoken comment. This comment is transmitted to the person, together with the information that it is to be shown at 4:20 in the exercise, for later review. At the person's location, the annotation can be combined with the visual recording 21, so that the person can relate more directly to the exercise and observe the mistake he made while exercising.
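The overlay described here needs only a mapping from the annotation's timestamp to a frame index on the common timeline; the frame rate below is an assumption, as the patent fixes none:

```python
def frame_for_annotation(mm_ss, start="4:16", fps=25):
    """Frame index at which an annotation should appear.

    The avatar and video frame streams of Fig. 2 share one timeline
    that here starts at 4:16; the frame rate is an assumed 25 fps,
    as the patent does not specify one.
    """
    def secs(t):
        m, s = t.split(":")
        return int(m) * 60 + int(s)
    return (secs(mm_ss) - secs(start)) * fps

# the therapist's comment at 4:20 lands 4 seconds into the streams
print(frame_for_annotation("4:20"))  # 100
```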
Fig. 3 shows a flow chart of a method according to the invention. The first step 30 is to record a person's ongoing exercise, visually with a video camera and as posture data with sensors. The visual recording is stored 31, and the posture recording is transmitted to an annotation system 32. Using the annotation system, a person reviews the posture recording and adds comments and markings 33. The annotations are transmitted to the patient system 34, where "patient" denotes the person undertaking exercises. On the patient side, the stored visual recording is retrieved 35 and combined 36 with the annotations, providing the person with comprehensive feedback while leaving his anonymity intact.
Fig. 4 shows the modules used to carry out a method according to the invention, complementing the system description of Fig. 1. A sensor receiver 40 receives the signals from the motion sensors or the information from the tracking of the optical markers. The sensor receiver 40 passes its data to a motion transmission module 41. Synchronously with the sensor receiver 40, a video camera 42 captures video sequences of the person undertaking exercises. These video sequences are stored in a storage facility 43. The motion transmission module 41 transmits its data to a remotely located motion receiver 45; the remoteness is indicated by the boundary line 44 separating the two groups of modules.
The motion receiver module 45 passes the data to a motion annotator 46, where the data is converted into a comprehensible form and annotated by a reviewer. The annotation is passed, together with information about its time location within the exercise, to an annotation transmission module 47. The annotation transmission module 47 transmits this information to an annotation receiver 48 located in the subgroup of modules assigned to the person undertaking exercises. The annotation information reaches a processing and overlay module 49, which accesses the video sequences from the storage module 43 and combines sequence and annotation so that each annotation appears at the appropriate time in the video sequence. Finally, the overlaid video sequence is displayed via a presentation module 50 to the person who exercised.
In order to provide a comprehensive disclosure without unreasonably lengthening this application, the applicant hereby incorporates by reference each of the patents and patent applications referenced above.
The particular combinations of elements and features in the above-described embodiments are exemplary only; the interchanging and substitution of these teachings with other teachings in this application and in the patents/applications incorporated by reference are also expressly contemplated. As those skilled in the art will recognize, variations, modifications, and other implementations of what is described herein can occur without departing from the spirit and scope of the claimed invention. Accordingly, the foregoing description is by way of example only and is not intended to be limiting. The scope of the invention is defined in the following claims and their equivalents. Furthermore, reference numerals used in the description and claims do not limit the scope of the claimed invention.

Claims (10)

CN200880104207A · 2007-08-24 · 2008-08-22 · System and method for displaying anonymously annotated physical exercise data · Pending · CN101784230A (en)

Applications Claiming Priority (3)

Application Number · Priority Date · Filing Date · Title
EP07114912.4 · 2007-08-24
EP07114912 · 2007-08-24
PCT/IB2008/053386 (WO2009027917A1) · 2007-08-24 · 2008-08-22 · System and method for displaying anonymously annotated physical exercise data

Publications (1)

Publication Number · Publication Date
CN101784230A (en) · 2010-07-21

Family

ID=40122948

Family Applications (1)

Application Number · Title · Priority Date · Filing Date
CN200880104207A · System and method for displaying anonymously annotated physical exercise data (CN101784230A, pending) · 2007-08-24 · 2008-08-22

Country Status (5)

Country · Link
US: US20110021317A1 (en)
EP: EP2185071A1 (en)
JP: JP2010536459A (en)
CN: CN101784230A (en)
WO: WO2009027917A1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number · Priority date · Publication date · Assignee · Title
CN102440774A (en)* · 2011-09-01 · 2012-05-09 · 东南大学 (Southeast University) · Remote measurement module for related physiological information in rehabilitation training process
CN103502987A (en)* · 2011-02-17 · 2014-01-08 · 耐克国际有限公司 (Nike International Ltd.) · Select and correlate physical activity data using image data
US9297709B2 · 2013-03-15 · 2016-03-29 · Nike, Inc. · System and method for analyzing athletic activity
CN105615852A (en)* · 2016-03-17 · 2016-06-01 · 北京永数网络科技有限公司 (Beijing Yongshu Network Technology Co., Ltd.) · Blood pressure detection system and method
CN105641900A (en)* · 2015-12-28 · 2016-06-08 · 联想(北京)有限公司 (Lenovo (Beijing) Ltd.) · Respiration state reminding method, electronic equipment and system
US9381420B2 · 2011-02-17 · 2016-07-05 · Nike, Inc. · Workout user experience
US9389057B2 · 2010-11-10 · 2016-07-12 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US9462844B2 · 2008-06-13 · 2016-10-11 · Nike, Inc. · Footwear having sensor system
US9549585B2 · 2008-06-13 · 2017-01-24 · Nike, Inc. · Footwear having sensor system
US9622537B2 · 2008-06-13 · 2017-04-18 · Nike, Inc. · Footwear having sensor system
US9743861B2 · 2013-02-01 · 2017-08-29 · Nike, Inc. · System and method for analyzing athletic activity
US9756895B2 · 2012-02-22 · 2017-09-12 · Nike, Inc. · Footwear having sensor system
US9924760B2 · 2011-02-17 · 2018-03-27 · Nike, Inc. · Footwear having sensor system
US10070680B2 · 2008-06-13 · 2018-09-11 · Nike, Inc. · Footwear having sensor system
US10568381B2 · 2012-02-22 · 2020-02-25 · Nike, Inc. · Motorized shoe with gesture control
US10926133B2 · 2013-02-01 · 2021-02-23 · Nike, Inc. · System and method for analyzing athletic activity
CN112805073A (en)* · 2018-08-07 · 2021-05-14 · 交互力量公司 (Interactive Strength, Inc.) · Interactive fitness equipment system with mirror display
US11006690B2 · 2013-02-01 · 2021-05-18 · Nike, Inc. · System and method for analyzing athletic activity
US11684111B2 · 2012-02-22 · 2023-06-27 · Nike, Inc. · Motorized shoe with gesture control

Families Citing this family (18)

Publication number · Priority date · Publication date · Assignee · Title
US8500604B2 (en)* · 2009-10-17 · 2013-08-06 · Robert Bosch GmbH · Wearable system for monitoring strength training
US8706530B2 (en) · 2010-09-29 · 2014-04-22 · Dacadoo AG · Automated health data acquisition, processing and communication system
US9011293B2 (en)* · 2011-01-26 · 2015-04-21 · Flow-Motion Research And Development Ltd. · Method and system for monitoring and feed-backing on execution of physical exercise routines
US9378336B2 (en) · 2011-05-16 · 2016-06-28 · Dacadoo AG · Optical data capture of exercise data in furtherance of a health score computation
US20130178960A1 (en)* · 2012-01-10 · 2013-07-11 · University of Washington Through Its Center for Commercialization · Systems and methods for remote monitoring of exercise performance metrics
ITGE20120011A1 (en)* · 2012-01-27 · 2013-07-28 · Paybay Networks S.r.l. · Patient rehabilitation system
US9501942B2 (en) · 2012-10-09 · 2016-11-22 · Kc Holdings I · Personalized avatar responsive to user physical state and context
US9652992B2 (en)* · 2012-10-09 · 2017-05-16 · Kc Holdings I · Personalized avatar responsive to user physical state and context
JP5811360B2 (en)* · 2012-12-27 · 2015-11-11 · カシオ計算機株式会社 (Casio Computer Co., Ltd.) · Exercise information display system, exercise information display method, and exercise information display program
JP2014199613A (en)* · 2013-03-29 · 2014-10-23 · 株式会社コナミデジタルエンタテインメント (Konami Digital Entertainment Co., Ltd.) · Application control program, application control method, and application control device
US20150133820A1 (en)* · 2013-11-13 · 2015-05-14 · Motorika Limited · Virtual reality based rehabilitation apparatuses and methods
US20170000388A1 (en)* · 2014-01-24 · 2017-01-05 · Icura ApS · System and method for mapping moving body parts
US10484437B2 (en)* · 2015-01-21 · 2019-11-19 · Logmein, Inc. · Remote support service with two-way smart whiteboard
WO2016196217A1 (en)* · 2015-05-29 · 2016-12-08 · Nike Innovate C.V. · Enhancing exercise through augmented reality
WO2017055080A1 (en)* · 2015-09-28 · 2017-04-06 · Koninklijke Philips N.V. · System and method for supporting physical exercises
KR102511518B1 (en)* · 2016-01-12 · 2023-03-20 · 삼성전자주식회사 (Samsung Electronics Co., Ltd.) · Display apparatus and control method of the same
JP7009955B2 (en)* · 2017-11-24 · 2022-01-26 · トヨタ自動車株式会社 (Toyota Motor Corporation) · Medical data communication equipment, servers, medical data communication methods and medical data communication programs
US20200107750A1 (en)* · 2018-10-03 · 2020-04-09 · Surge Motion Inc. · Method and system for assessing human movements

Family Cites Families (12)

Publication number · Priority date · Publication date · Assignee · Title
US5679004A (en)* · 1995-12-07 · 1997-10-21 · Movit, Inc. · Myoelectric feedback system
DE69736622T2 (en)* · 1996-07-03 · 2007-09-13 · Hitachi, Ltd. · Motion detection system
JP3469410B2 (en)* · 1996-11-25 · 2003-11-25 · 三菱電機株式会社 (Mitsubishi Electric Corporation) · Wellness system
US20060247070A1 (en)* · 2001-06-11 · 2006-11-02 · Recognition Insight, LLC · Swing position recognition and reinforcement
US20030054327A1 (en)* · 2001-09-20 · 2003-03-20 · Evensen, Mark H. · Repetitive motion feedback system and method of practicing a repetitive motion
US6817979B2 (en)* · 2002-06-28 · 2004-11-16 · Nokia Corporation · System and method for interacting with a user's virtual physiological model via a mobile terminal
US20060025229A1 (en)* · 2003-12-19 · 2006-02-02 · Satayan Mahajan · Motion tracking and analysis apparatus and method and system implementations thereof
EP1846115A4 (en)* · 2005-01-26 · 2012-04-25 · Bentley Kinetics, Inc. · Method and system for athletic motion analysis and instruction
US20060183980A1 (en)* · 2005-02-14 · 2006-08-17 · Chang-Ming Yang · Mental and physical health status monitoring, analyze and automatic follow up methods and its application on clothing
US20080191864A1 (en)* · 2005-03-31 · 2008-08-14 · Ronen Wolfson · Interactive surface and display system
WO2008007292A1 (en)* · 2006-07-12 · 2008-01-17 · Philips Intellectual Property & Standards GmbH · Health management device
WO2009024929A1 (en)* · 2007-08-22 · 2009-02-26 · Koninklijke Philips Electronics N.V. · System and method for displaying selected information to a person undertaking exercises

Cited By (48)

Publication number · Priority date · Publication date · Assignee · Title
US9549585B2 · 2008-06-13 · 2017-01-24 · Nike, Inc. · Footwear having sensor system
US10408693B2 · 2008-06-13 · 2019-09-10 · Nike, Inc. · System and method for analyzing athletic activity
US10070680B2 · 2008-06-13 · 2018-09-11 · Nike, Inc. · Footwear having sensor system
US12225980B2 · 2008-06-13 · 2025-02-18 · Nike, Inc. · Footwear having sensor system
US10912490B2 · 2008-06-13 · 2021-02-09 · Nike, Inc. · Footwear having sensor system
US10314361B2 · 2008-06-13 · 2019-06-11 · Nike, Inc. · Footwear having sensor system
US11026469B2 · 2008-06-13 · 2021-06-08 · Nike, Inc. · Footwear having sensor system
US11707107B2 · 2008-06-13 · 2023-07-25 · Nike, Inc. · Footwear having sensor system
US9622537B2 · 2008-06-13 · 2017-04-18 · Nike, Inc. · Footwear having sensor system
US9462844B2 · 2008-06-13 · 2016-10-11 · Nike, Inc. · Footwear having sensor system
US12170138B2 · 2010-11-10 · 2024-12-17 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US11817198B2 · 2010-11-10 · 2023-11-14 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US11935640B2 · 2010-11-10 · 2024-03-19 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US11600371B2 · 2010-11-10 · 2023-03-07 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US9757619B2 · 2010-11-10 · 2017-09-12 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US11568977B2 · 2010-11-10 · 2023-01-31 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US9389057B2 · 2010-11-10 · 2016-07-12 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US12224053B2 · 2010-11-10 · 2025-02-11 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US10632343B2 · 2010-11-10 · 2020-04-28 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US12322488B2 · 2010-11-10 · 2025-06-03 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US12322489B2 · 2010-11-10 · 2025-06-03 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
US10293209B2 · 2010-11-10 · 2019-05-21 · Nike, Inc. · Systems and methods for time-based athletic activity measurement and display
CN103502987B* · 2011-02-17 · 2017-04-19 · 耐克创新有限合伙公司 (Nike Innovate C.V.) · Select and correlate physical activity data using image data
US9381420B2 · 2011-02-17 · 2016-07-05 · Nike, Inc. · Workout user experience
US10179263B2 (en)2011-02-172019-01-15Nike, Inc.Selecting and correlating physical activity data with image data
US9411940B2 (en)2011-02-172016-08-09Nike, Inc.Selecting and correlating physical activity data with image data
CN103502987A (en)*2011-02-172014-01-08耐克国际有限公司 Select and correlate physical activity data using image data
US9924760B2 (en)2011-02-172018-03-27Nike, Inc.Footwear having sensor system
CN102440774A (en)*2011-09-012012-05-09东南大学Remote measurement module for related physiological information in rehabilitation training process
US10357078B2 (en)2012-02-222019-07-23Nike, Inc.Footwear having sensor system
US11071344B2 (en)2012-02-222021-07-27Nike, Inc.Motorized shoe with gesture control
US11071345B2 (en)2012-02-222021-07-27Nike, Inc.Footwear having sensor system
US9756895B2 (en)2012-02-222017-09-12Nike, Inc.Footwear having sensor system
US11684111B2 (en)2012-02-222023-06-27Nike, Inc.Motorized shoe with gesture control
US11793264B2 (en)2012-02-222023-10-24Nike, Inc.Footwear having sensor system
US10568381B2 (en)2012-02-222020-02-25Nike, Inc.Motorized shoe with gesture control
US11006690B2 (en)2013-02-012021-05-18Nike, Inc.System and method for analyzing athletic activity
US10926133B2 (en)2013-02-012021-02-23Nike, Inc.System and method for analyzing athletic activity
US9743861B2 (en)2013-02-012017-08-29Nike, Inc.System and method for analyzing athletic activity
US12194341B2 (en)2013-02-012025-01-14Nike, Inc.System and method for analyzing athletic activity
US11918854B2 (en)2013-02-012024-03-05Nike, Inc.System and method for analyzing athletic activity
US9410857B2 (en)2013-03-152016-08-09Nike, Inc.System and method for analyzing athletic activity
US9810591B2 (en)2013-03-152017-11-07Nike, Inc.System and method of analyzing athletic activity
US9297709B2 (en)2013-03-152016-03-29Nike, Inc.System and method for analyzing athletic activity
US10024740B2 (en)2013-03-152018-07-17Nike, Inc.System and method for analyzing athletic activity
CN105641900A (en)*2015-12-282016-06-08联想(北京)有限公司Respiration state reminding method, electronic equipment and system
CN105615852A (en)*2016-03-172016-06-01北京永数网络科技有限公司Blood pressure detection system and method
CN112805073A (en)*2018-08-072021-05-14交互力量公司Interactive fitness equipment system with mirror display

Also Published As

Publication number | Publication date
WO2009027917A1 | 2009-03-05
EP2185071A1 | 2010-05-19
JP2010536459A | 2010-12-02
US20110021317A1 | 2011-01-27

Similar Documents

Publication | Publication Date | Title
CN101784230A (en) | System and method for displaying anonymously annotated physical exercise data
US20220005577A1 (en) | Systems, apparatus and methods for non-invasive motion tracking to augment patient administered physical rehabilitation
US10755466B2 (en) | Method and apparatus for comparing two motions
US10089763B2 (en) | Systems and methods for real-time data quantification, acquisition, analysis and feedback
US8758020B2 (en) | Periodic evaluation and telerehabilitation systems and methods
KR100772497B1 (en) | Golf clinic system and its operation method
US20150327794A1 (en) | System and method for detecting and visualizing live kinetic and kinematic data for the musculoskeletal system
WO2022193425A1 (en) | Exercise data display method and system
US9248361B1 (en) | Motion capture and analysis systems for use in training athletes
KR102388337B1 (en) | Service provision method of the application for temporomandibular joint disease improvement service
KR20180103280A (en) | An exercise guidance system for the elderly that performs posture recognition based on distance similarity between joints
US20130280683A1 (en) | Equestrian Performance Sensing System
CN107049324A (en) | Method and device for determining limb movement posture
WO2012061804A1 (en) | Method and system for automated personal training
US20170112418A1 (en) | Motion capture and analysis system for assessing mammalian kinetics
JP2016080752A (en) | Medical practice training suitability evaluation device
US20210265055A1 (en) | Smart Meditation and Physiological System for the Cloud
CA3152977A1 (en) | Systems and methods for wearable devices that determine balance indices
WO2022183009A1 (en) | Neurofeedback rehabilitation system
KR20230013853A (en) | System and method for management of developmental disabilities based on personal health record
US20250166834A1 (en) | System and Method for Utilizing Immersive Virtual Reality and Sensor Data in Neuromuscular Movement Coaching and Training Activities, and Physical Therapy
US20240215922A1 (en) | Patient positioning adaptive guidance system
JP7353605B2 (en) | Inhalation motion estimation device, computer program, and inhalation motion estimation method
US20210352066A1 (en) | Range of Motion Tracking System
KR20220067781A (en) | Electrical muscle stimulation training system and method

Legal Events

Date | Code | Title | Description
C06 | Publication
PB01 | Publication
C10 | Entry into substantive examination
SE01 | Entry into force of request for substantive examination
C02 | Deemed withdrawal of patent application after publication (patent law 2001)
WD01 | Invention patent application deemed withdrawn after publication

Open date: 2010-07-21
