CROSS-REFERENCES TO RELATED APPLICATIONS
This application claims priority to Japanese Patent Application No. 2016-213378, filed Oct. 31, 2016, and Japanese Patent Application No. 2016-213379, filed Oct. 31, 2016, both of which are herein incorporated by reference.
BACKGROUND
1. Technical Field
The present invention relates to an electronic device, a display method, a display system, and a recording medium.
2. Related Art
JP-A-2006-221251 (Patent Literature 1) describes a system for a triathlon in which athletes record passing times using digital pens at the switching gates between the athletic events of swimming, bicycling, and running (a start gate of the bicycle event and a start gate of the running event), information concerning the recorded times is transmitted to a server via communication terminals, and the server tabulates the passing times at the switching gates for each of the athletes, calculates times of the athletic events, and generates a print including a result sheet.
However, in the system described in Patent Literature 1, the athletes need to perform work to record the passing times with the digital pens at the switching gates of the athletic events. This work takes time, so the times required for switching (transition) between the athletic events (exercise events) increase, which in turn increases the total times of the athletic events.
SUMMARY
An advantage of some aspects of the invention is to provide an electronic device, a display method, a display system, and a recording medium that make it unnecessary for a user to perform work when an exercise event carried out by the user is switched.
The invention can be implemented as the following forms or application examples.
APPLICATION EXAMPLE 1
An electronic device according to this application example includes a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.
The electronic device according to this application example may include a position-information generating unit configured to generate the position information on the basis of the satellite signal transmitted from the position information satellite.
With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.
APPLICATION EXAMPLE 2
The electronic device according to the application example may include a notifying unit configured to notify a state discriminated by the processing unit.
The notifying unit may notify the state with at least one of display, sound, and vibration.
With the electronic device according to this application example, since the electronic device notifies the discriminated state, the user can confirm whether the notified state is correct. Alternatively, a person (e.g., a coach) different from the user can confirm the notified state of the user.
APPLICATION EXAMPLE 3
In the electronic device according to the application example, the processing unit may determine on the basis of the position information whether the user passes the first position and whether the user passes the second position, determine that the user is in the first exercise state until the user passes the first position, determine that the user is in the first transition state until the user passes the second position after passing the first position, and determine that the user is in the second exercise state after the user passes the second position.
With the electronic device according to this application example, since a state of the user is switched when the user passes the first position or the second position, it is possible to discriminate the state of the user.
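For illustration only, the following is a minimal sketch of one way such a pass determination could be coded, assuming a hypothetical rule that a registered position counts as "passed" when a position fix falls within a small radius of it; the helper names haversine_m, PASS_RADIUS_M, and discriminate are invented for this sketch and are not part of the claimed configuration.

```python
import math

PASS_RADIUS_M = 30.0  # hypothetical threshold: a fix this close counts as "passing"

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def discriminate(fix, first_pos, second_pos, state):
    # state is one of "first_exercise", "first_transition", "second_exercise".
    if state == "first_exercise" and haversine_m(*fix, *first_pos) < PASS_RADIUS_M:
        return "first_transition"   # the user has passed the first position
    if state == "first_transition" and haversine_m(*fix, *second_pos) < PASS_RADIUS_M:
        return "second_exercise"    # the user has passed the second position
    return state
```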
APPLICATION EXAMPLE 4
In the electronic device according to the application example, the processing unit may calculate times respectively required for the first exercise state, the second exercise state, and the first transition state.
With the electronic device according to this application example, it is possible to calculate a time required for carrying out the first exercise event, a time required for switching from the first exercise event to the second exercise event, and a time required for carrying out the second exercise event by the user.
APPLICATION EXAMPLE 5
In the electronic device according to the application example, the plurality of states may include a third exercise state in which the user is carrying out a third exercise event and a second transition state halfway in transition from the second exercise state to the third exercise state, and the processing unit may discriminate the plurality of states on the basis of a third position and a fourth position registered in advance.
With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, the second exercise state in which the user is carrying out the second exercise event, the second transition state halfway in transition from the second exercise state to the third exercise state, and the third exercise state in which the user is carrying out the third exercise event. Therefore, the user does not need to perform work when the exercise event carried out by the user is switched from the first exercise event to the second exercise event and when the exercise event carried out by the user is switched from the second exercise event to the third exercise event.
APPLICATION EXAMPLE 6
In the electronic device according to the application example, the processing unit may determine on the basis of the position information whether the user passes the third position and whether the user passes the fourth position, determine that the user is in the second exercise state until the user passes the third position, determine that the user is in the second transition state until the user passes the fourth position after passing the third position, and determine that the user is in the third exercise state after the user passes the fourth position.
With the electronic device according to this application example, since a state of the user is switched when the user passes the third position or the fourth position, it is possible to discriminate the state of the user.
APPLICATION EXAMPLE 7
In the electronic device according to the application example, the processing unit may calculate times respectively required for the third exercise state and the second transition state.
With the electronic device according to this application example, it is possible to calculate a time required for switching from the second exercise event to the third exercise event and a time required for carrying out the third exercise event by the user.
APPLICATION EXAMPLE 8
In the electronic device according to the application example, the processing unit may discriminate the plurality of states on the basis of at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, the position information, and the first position and the second position.
The electronic device according to this application example may include the first motion sensor or may include the pressure sensor. The electronic device according to this application example may include a position-information generating unit configured to generate the position information on the basis of the satellite signal transmitted from the position information satellite.
The first motion sensor may be an acceleration sensor.
With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when the exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.
APPLICATION EXAMPLE 9
In the electronic device according to the application example, the processing unit may calculate moving speed on the basis of the position information, determine whether the output signal of the first motion sensor has periodicity, detect a change in pressure on the basis of the output signal of the pressure sensor, and discriminate the plurality of states on the basis of the moving speed, whether the output signal of the first motion sensor has periodicity, and the change in the pressure.
With the electronic device according to this application example, it is possible to recognize a change in speed of the user, a change in exercise of the user, and a change in a peripheral environment of the user and discriminate a state of the user on the basis of the moving speed, whether a waveform of the output signal of the first motion sensor has periodicity, and the change in the pressure.
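For illustration only, the following sketch shows how moving speed, periodicity, and a pressure change could each be derived from windowed sensor data; the window handling, the autocorrelation-based periodicity test, and the thresholds are assumptions made for this sketch, not the claimed processing.

```python
import numpy as np

def moving_speed_mps(positions_m, dt_s):
    # Mean speed over a window of planar positions (meters) sampled every dt_s seconds.
    steps = np.diff(positions_m, axis=0)
    return np.linalg.norm(steps, axis=1).sum() / (dt_s * len(steps))

def has_periodicity(accel, min_lag=10, threshold=0.5):
    # Crude periodicity test: a strong autocorrelation peak away from lag 0.
    a = np.asarray(accel, dtype=float) - np.mean(accel)
    ac = np.correlate(a, a, mode="full")[len(a) - 1:]
    ac /= ac[0] + 1e-9
    return ac[min_lag:].max() > threshold

def pressure_changed(pressure_hpa, threshold_hpa=3.0):
    # True when the pressure within the window varies by more than the threshold.
    return max(pressure_hpa) - min(pressure_hpa) > threshold_hpa
```

Under such assumptions, swimming might appear as low speed with periodic stroke motion and fluctuating water pressure, whereas bicycling might appear as higher speed with little periodic arm motion; the concrete decision rules of the embodiments are described with reference to FIGS. 19 to 28.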
APPLICATION EXAMPLE 10
In the electronic device according to the application example, the processing unit may discriminate the plurality of states on the basis of an output signal of a second motion sensor.
The processing unit may determine whether a waveform of the output signal of the second motion sensor has periodicity and discriminate the plurality of states on the basis of whether the waveform has periodicity.
The second motion sensor may be an angular velocity sensor.
With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the second motion sensor.
APPLICATION EXAMPLE 11
In the electronic device according to the application example, the processing unit may discriminate the plurality of states on the basis of an output signal of a temperature sensor.
The processing unit may detect a change in temperature on the basis of the output signal of the temperature sensor and discriminate the plurality of states on the basis of the change in the temperature.
With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the temperature sensor.
APPLICATION EXAMPLE 12
In the electronic device according to the application example, the first exercise event may be swimming, the second exercise event may be bicycling, and the third exercise event may be running.
With the electronic device according to this application example, it is possible to discriminate a state of the user who carries out a triathlon.
APPLICATION EXAMPLE 13
A display method according to this application example includes: discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state; and displaying a discriminated state.
With the display method according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on a display unit and give advice or the like to the user.
APPLICATION EXAMPLE 14
In the display method according to the application example, when the discriminated state is the first transition state, an image including a flashing object may be displayed.
With the display method according to this application example, it is possible to highlight and display a state during switching from the first exercise event to the second exercise event by the user.
APPLICATION EXAMPLE 15
In the display method according to the application example, when the discriminated state is the first transition state, an elapsed time from a start of the first transition state may be displayed.
With the display method according to this application example, it is possible to cause the user and the like to recognize a time required for switching from the first exercise event to the second exercise event by the user.
APPLICATION EXAMPLE 16
In the display method according to the application example, the elapsed time may be displayed to be comparable with a target time set in advance.
With the display method according to this application example, it is possible to cause the user and the like to recognize whether a time required for switching from the first exercise event to the second exercise event by the user is long or short compared with the set target time.
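As one non-limiting illustration, the elapsed time and the target time could be rendered together with a signed difference so the comparison is visible at a glance; the helper name transition_readout and the format below are assumptions of this sketch.

```python
def transition_readout(elapsed_s, target_s):
    # Hypothetical formatting helper: elapsed time, target time, and signed difference.
    diff = elapsed_s - target_s
    sign = "+" if diff >= 0 else "-"
    fmt = lambda s: f"{int(s) // 60}:{int(s) % 60:02d}"
    return f"T1 {fmt(elapsed_s)} / target {fmt(target_s)} ({sign}{fmt(abs(diff))})"

# Example: transition_readout(95, 120) -> "T1 1:35 / target 2:00 (-0:25)"
```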
APPLICATION EXAMPLE 17
In the display method according to the application example, the plurality of states may be discriminated on the basis of at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, the position information, and the first position and the second position, and the discriminated state may be displayed.
With the display method according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on the display unit and give advice or the like to the user.
APPLICATION EXAMPLE 18
A display system according to this application example includes: a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state; and a display unit configured to display a discriminated state.
With the display system according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on the display unit and give advice or the like to the user.
APPLICATION EXAMPLE 19
A computer-readable recording medium according to this application example records therein a computer program for causing a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.
With the computer program and the recording medium according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.
APPLICATION EXAMPLE 20
The recording medium according to the application example is a computer-readable recording medium having recorded therein a computer program for causing the computer to execute steps of: determining on the basis of the position information whether the user passes the first position and whether the user passes the second position; determining that the user is in the first exercise state until the user passes the first position; determining that the user is in the first transition state until the user passes the second position after passing the first position; and determining that the user is in the second exercise state after the user passes the second position.
With the recording medium according to this application example, since a state of the user is switched when the user passes the first position or the second position, it is possible to discriminate the state of the user. Therefore, the user does not need to perform work when the exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.
APPLICATION EXAMPLE 21
In the electronic device according to Application Example 1 or 2, a plurality of the first positions and a plurality of the second positions may be registered in advance, and the processing unit may determine on the basis of the position information whether the user passes any one of the plurality of first positions and whether the user passes any one of the plurality of second positions, determine that the user is in the first exercise state until the user passes any one of the plurality of first positions, determine that the user is in the first transition state until the user passes any one of the plurality of second positions after passing any one of the plurality of first positions, and determine that the user is in the second exercise state after the user passes any one of the plurality of second positions.
With the electronic device according to this application example, a state of the user is switched when the user passes any one of the plurality of first positions or any one of the plurality of second positions. Therefore, compared with when only one first position and one second position are registered, it is possible to more accurately discriminate the state of the user.
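For illustration only, a check against a plurality of registered positions could reuse the hypothetical distance helper haversine_m from the earlier sketch, treating the position as passed when the fix is close to any registered candidate.

```python
def passed_any(fix, registered_positions, radius_m=30.0):
    # True when the current fix is within radius_m of any registered position;
    # haversine_m is the hypothetical distance helper sketched earlier.
    return any(haversine_m(*fix, *p) < radius_m for p in registered_positions)
```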
APPLICATION EXAMPLE 22
In the electronic device according to Application Example 5, a plurality of the third positions and a plurality of the fourth positions may be registered in advance, and the processing unit may determine on the basis of the position information whether the user passes any one of the plurality of third positions and whether the user passes any one of the plurality of fourth positions, determine that the user is in the second exercise state until the user passes any one of the plurality of third positions, determine that the user is in the second transition state until the user passes any one of the plurality of fourth positions after passing any one of the plurality of third positions, and determine that the user is in the third exercise state after the user passes any one of the plurality of fourth positions.
With the electronic device according to this application example, a state of the user is switched when the user passes any one of the plurality of third positions or any one of the plurality of fourth positions. Therefore, compared with when only one third position and one fourth position are registered, it is possible to more accurately discriminate the state of the user.
APPLICATION EXAMPLE 23
A computer program according to this application example causes a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and a first position and a second position registered in advance, a plurality of states including a first exercise state in which a user is carrying out a first exercise event, a second exercise state in which the user is carrying out a second exercise event, and a first transition state halfway in transition from the first exercise state to the second exercise state.
With the computer program according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event, the first transition state halfway in transition from the first exercise state to the second exercise state, and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.
APPLICATION EXAMPLE 24
An electronic device according to this application example includes a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event.
The electronic device according to this application example may include the first motion sensor or may include the pressure sensor. The electronic device according to this application example may include a position-information generating unit configured to generate the position information on the basis of the satellite signal transmitted from the position information satellite.
The first motion sensor may be an acceleration sensor.
With the electronic device according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.
APPLICATION EXAMPLE 25
The electronic device according to the application example 24 may include a notifying unit configured to notify the state discriminated by the processing unit.
The notifying unit may notify the state with at least one of display, sound, and vibration.
With the electronic device according to this application example, since the electronic device notifies the discriminated state, the user can confirm whether the notified state is correct. Alternatively, a person (e.g., a coach) different from the user can confirm the notified state of the user.
APPLICATION EXAMPLE 26
In the electronic device according to the application example 24 or 25, the processing unit may calculate moving speed on the basis of the position information, determine whether a waveform of the output signal of the first motion sensor has periodicity, detect a change in pressure on the basis of the output signal of the pressure sensor, and discriminate the plurality of states on the basis of the moving speed, whether the waveform has periodicity, and the change in the pressure.
With the electronic device according to this application example, it is possible to recognize a change in speed of the user, a change in exercise of the user, and a change in a peripheral environment of the user and discriminate a state of the user on the basis of the moving speed, whether the waveform of the output signal of the first motion sensor has periodicity, and the change in the pressure.
APPLICATION EXAMPLE 27
In the electronic device according to any one of the application examples 24 to 26, the plurality of states may include a third exercise state in which the user is carrying out a third exercise event.
With the electronic device according to this application example, it is possible to discriminate a state of the user who carries out an athletic competition including three or more exercise events.
APPLICATION EXAMPLE 28
In the electronic device according to the application example 27, the plurality of states may include a first transition state halfway in transition from the first exercise state to the second exercise state and a second transition state halfway in transition from the second exercise state to the third exercise state.
With the electronic device according to this application example, it is possible to discriminate whether a state of the user is a state in which the user is carrying out the first exercise event, a state in which the user is switching the first exercise event to the second exercise event, a state in which the user is carrying out the second exercise event, a state in which the user is switching the second exercise event to the third exercise event, or a state in which the user is carrying out the third exercise event.
APPLICATION EXAMPLE 29
In the electronic device according to the application example 28, the processing unit may calculate times respectively required for the first exercise state, the second exercise state, the third exercise state, the first transition state, and the second transition state.
With the electronic device according to this application example, it is possible to calculate a time required for carrying out the first exercise event, a time required for switching from the first exercise event to the second exercise event, a time required for carrying out the second exercise event, a time required for switching from the second exercise event to the third exercise event, and a time required for carrying out the third exercise event by the user.
APPLICATION EXAMPLE 30
In the electronic device according to the application example 28 or 29, the first exercise event may be swimming, the second exercise event may be bicycling, and the third exercise event may be running.
With the electronic device according to this application example, it is possible to discriminate a state of the user who carries out a triathlon.
APPLICATION EXAMPLE 31
In the electronic device according to any one of the application examples 24 to 30, the processing unit may discriminate the plurality of states on the basis of an output signal of a second motion sensor.
The processing unit may determine whether a waveform of the output signal of the second motion sensor has periodicity and discriminate the plurality of states on the basis of whether the waveform has periodicity.
The second motion sensor may be an angular velocity sensor.
With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the second motion sensor.
APPLICATION EXAMPLE 32
In the electronic device according to any one of the application examples 24 to 31, the processing unit may discriminate the plurality of states on the basis of an output signal of a temperature sensor.
The processing unit may detect a change in temperature on the basis of the output signal of the temperature sensor and discriminate the plurality of states on the basis of the change in the temperature.
With the electronic device according to this application example, it is possible to more accurately discriminate a state of the user on the basis of the position information and at least either one of the output signal of the first motion sensor and the output signal of the pressure sensor and on the basis of the output signal of the temperature sensor.
APPLICATION EXAMPLE 33
A display method according to this application example includes: discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event; and displaying a discriminated state.
With the display method according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on a display unit and give advice or the like to the user.
APPLICATION EXAMPLE 34
In the display method according to the application example 33, the plurality of states may include a first transition state halfway in transition from the first exercise state to the second exercise state.
With the display method according to this application example, it is possible to discriminate whether a state of the user is a state in which the user is carrying out the first exercise event, a state in which the user is switching the first exercise event to the second exercise event, or a state in which the user is carrying out the second exercise event and display the discriminated state.
APPLICATION EXAMPLE 35
In the display method according to the application example 34, when the discriminated state is the first transition state, an image including a flashing object may be displayed.
With the display method according to this application example, it is possible to highlight and display a state during switching from the first exercise event to the second exercise event by the user.
APPLICATION EXAMPLE 36
In the display method according to the application example 34 or 35, when the discriminated state is the first transition state, an elapsed time from a start of the first transition state may be displayed.
With the display method according to this application example, it is possible to cause the user and the like to recognize a time required for switching from the first exercise event to the second exercise event by the user.
APPLICATION EXAMPLE 37
In the display method according to the application example 36, the elapsed time may be displayed to be comparable with a target time set in advance.
With the display method according to this application example, it is possible to cause the user and the like to recognize whether a time required for switching from the first exercise event to the second exercise event by the user is long or short compared with the set target time.
APPLICATION EXAMPLE 38
A display system according to this application example includes: a processing unit configured to discriminate, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event; and a display unit configured to display a discriminated state.
With the display system according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event and display the discriminated state. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition. Alternatively, a person (e.g., a coach) different from the user can visually recognize a state of the user displayed on the display unit and give advice or the like to the user.
APPLICATION EXAMPLE 39
A computer program according to this application example causes a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event.
APPLICATION EXAMPLE 40
A computer-readable recording medium according to this application example records therein a computer program for causing a computer to execute a step of discriminating, on the basis of position information obtained on the basis of a satellite signal transmitted from a position information satellite and at least either one of an output signal of a first motion sensor and an output signal of a pressure sensor, a plurality of states including a first exercise state in which a user is carrying out a first exercise event and a second exercise state in which the user is carrying out a second exercise event.
With the computer program and the recording medium according to this application example, it is possible to discriminate the first exercise state in which the user is carrying out the first exercise event and the second exercise state in which the user is carrying out the second exercise event. Therefore, the user does not need to perform work when an exercise event carried out by the user is switched from the first exercise event to the second exercise event. Therefore, the user can concentrate on an athletic competition.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
FIG. 1 is a diagram showing a configuration example of an exercise information management system in a first embodiment.
FIG. 2 is an explanatory diagram concerning an overview of the exercise information management system in the first embodiment.
FIG. 3 is a diagram showing an example concerning a course used for a triathlon and registration of positions in the first embodiment.
FIG. 4 is an example of a functional block diagram of a user terminal.
FIG. 5 is a flowchart for explaining an example of a procedure of a part of processing performed by a processing unit of the user terminal.
FIG. 6 is a flowchart for explaining an example of details of state discrimination processing in the first embodiment.
FIG. 7 is a diagram showing a display example of states of a user that the processing unit causes a display unit to display.
FIG. 8 is a diagram showing an example of images at the time when a state of a user is “first transition”.
FIG. 9 is a diagram showing an example of an image displayed on a display unit of an information terminal.
FIG. 10 is a diagram showing an example of an image displayed on the display unit of the information terminal.
FIG. 11 is a diagram showing an example of an image displayed on the display unit of the information terminal.
FIG. 12 is a diagram showing an example of an image displayed on the display unit of the information terminal.
FIG. 13 is a diagram showing an example concerning registration of positions in a second embodiment.
FIG. 14 is a flowchart for explaining an example of details of state discrimination processing in the second embodiment.
FIG. 15 is a diagram showing an example concerning registration of positions in a third embodiment.
FIG. 16 is a flowchart for explaining an example of details of state discrimination processing in the third embodiment.
FIG. 17 is a diagram showing an example of a course used in a triathlon.
FIG. 18 is a flowchart for explaining an example of details of state discrimination processing in a fourth embodiment.
FIG. 19 is a flowchart for explaining swim determination processing in a fifth embodiment.
FIG. 20 is a flowchart for explaining an example of first transition determination processing in the fifth embodiment.
FIG. 21 is a flowchart for explaining an example of bike determination processing in the fifth embodiment.
FIG. 22 is a flowchart for explaining an example of second transition determination processing in the fifth embodiment.
FIG. 23 is a flowchart for explaining an example of run determination processing in the fifth embodiment.
FIG. 24 is a flowchart for explaining an example of swim determination processing in a sixth embodiment.
FIG. 25 is a flowchart for explaining an example of first transition determination processing in the sixth embodiment.
FIG. 26 is a flowchart for explaining an example of bike determination processing in the sixth embodiment.
FIG. 27 is a flowchart for explaining an example of second transition determination processing in the sixth embodiment.
FIG. 28 is a flowchart for explaining an example of run determination processing in the sixth embodiment.
FIG. 29 is a flowchart for explaining an example of details of state discrimination processing in a seventh embodiment.
FIG. 30 is a flowchart for explaining an example of running movement A determination processing.
FIG. 31 is a flowchart for explaining an example of running movement B determination processing.
FIG. 32 is a flowchart for explaining an example of running movement C determination processing.
FIG. 33 is a flowchart for explaining an example of running movement D determination processing.
FIG. 34 is a diagram showing another example of the images at the time when the state of the user is the “first transition”.
FIG. 35 is a diagram showing another example of the images at the time when the state of the user is the “first transition”.
FIG. 36 is a diagram showing another example of an image at the time when the state of the user is “bike”.
FIG. 37 is a diagram showing another example of the image at the time when the state of the user is the “first transition”.
FIG. 38 is a diagram showing a disposition example of a plurality of pressure sensors in a modification.
DESCRIPTION OF EXEMPLARY EMBODIMENTS
Preferred embodiments of the invention are explained in detail below with reference to the drawings. Note that the embodiments explained below do not unduly limit the contents of the invention described in the appended claims. Not all of the components explained below are essential constituent elements of the invention.
In the following explanation, an exercise information management system that manages a state of exercise of a user who carries out a triathlon as a competition including a plurality of athletic events (exercise events) is explained as an example.
Note that, in an exercise information management system 1 in second to seventh embodiments and modifications, the same components as the components in a first embodiment are denoted by the same reference numerals and signs, and explanation of the components is omitted or simplified. Differences from the first embodiment or the fourth embodiment are mainly explained.
1. First Embodiment
1-1. Configuration of a System
FIG. 1 is a diagram showing a configuration example of an exercise information management system 1 in a first embodiment. As shown in FIG. 1, the exercise information management system 1 in this embodiment includes a user terminal 3, a server 4, and information terminals 5. The server 4 is connected to a network 6 such as the Internet or a LAN (Local Area Network).
A user 2 carries the user terminal 3 (an example of an "electronic device") and carries out a triathlon. The user 2 may carry out the triathlon in a tournament or in practice. The triathlon consists of three athletic events (exercise events): swim (swimming), bike (bicycling), and run (running). The user 2 carries out the events in the order of the swim, the bike, and the run.
As shown in FIG. 2, in this embodiment, the user terminal 3 is a wrist-type (wristwatch-type) electronic device and is mounted on a wrist or the like of the user 2. The user terminal 3 can receive satellite signals transmitted from GPS (Global Positioning System) satellites 7 (an example of a "position information satellite") and transmit exercise information of the user 2 to the information terminal 5 (5a). Note that FIG. 2 is a diagram at the time when the user 2 is carrying out the run.
FIG. 3 is a diagram showing an example of a course used for the triathlon. A solid line C1 represents a course of the swim, a broken line C2 represents a course of the bike, and an alternate long and short dash line C3 represents a course of the run. A sign S1 represents a start point of the swim (a start point of the triathlon), a sign S2 represents a start point of the bike, and a sign S3 represents a start point of the run. A sign G1 represents a goal point of the swim, a sign G2 represents a goal point of the bike, and a sign G3 represents a goal point of the run (a goal point of the triathlon). A sign TA represents a transition area.
In the triathlon, for example, an elapsed time from the time when the user 2 starts from the start point S1 of the swim until the user 2 passes the start point S2 of the bike is regarded as a time required for the swim (a swim time). An elapsed time from the time when the user 2 passes the start point S2 of the bike until the user 2 passes the start point S3 of the run is regarded as a time required for the bike (a bike time). An elapsed time from the time when the user 2 passes the start point S3 of the run until the user 2 passes the goal point G3 of the run is regarded as a time required for the run (a run time).
In this case, the swim time includes the elapsed time (a first transition time) from when the user 2 passes the goal point G1 of the swim until the user 2 passes the start point S2 of the bike, that is, the sum of a time in which the user 2 moves from the goal point G1 of the swim to the transition area TA, a time required for a change of clothes (e.g., wearing of bicycle shoes, a helmet, sunglasses, and the like) and the like in the transition area TA, and a time in which the user 2 moves to the start point S2 of the bike.
Similarly, the bike time includes the elapsed time (a second transition time) from when the user 2 passes the goal point G2 of the bike until the user 2 passes the start point S3 of the run, that is, the sum of a time in which the user 2 moves from the goal point G2 of the bike to a clothes change place in the transition area TA, a time required for a change of clothes (e.g., removal of the helmet, the sunglasses, the bicycle shoes, and the like and wearing of run shoes and the like) and the like, and a time in which the user 2 moves to the start point S3 of the run. The sum of the swim time, the bike time, and the run time is the total time.
In this embodiment, as shown in FIG. 3, before starting the triathlon, the user 2 registers in advance, in a storing unit 140 (see FIG. 4, referred to below) of the user terminal 3, the position (the latitude and the longitude) of the goal point G1 of the swim or the vicinity of the goal point G1 as a position P1 (an example of a "first position"), the position of the start point S2 of the bike or the vicinity of the start point S2 as a position P2 (an example of a "second position"), the position of the goal point G2 of the bike or the vicinity of the goal point G2 as a position P3 (an example of a "third position"), and the position of the start point S3 of the run or the vicinity of the start point S3 as a position P4 (an example of a "fourth position").
The user 2 may actually go to the goal point G1 of the swim, the start point S2 of the bike, the goal point G2 of the bike, and the start point S3 of the run and operate an operation unit 120 (see FIG. 4) of the user terminal 3 to register the positions (the latitudes and the longitudes) of the present places in the storing unit 140 as the positions P1, P2, P3, and P4.
Alternatively, the user 2 may select, in the information terminal 5, positions respectively corresponding to the goal point G1 of the swim, the start point S2 of the bike, the goal point G2 of the bike, and the start point S3 of the run on map data of an area where the triathlon is performed. The user terminal 3 may receive information concerning the selected positions (the latitudes and the longitudes) via a communication unit 170 (see FIG. 4) and register the information in the storing unit 140 as the positions P1, P2, P3, and P4.
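For illustration only, the two registration paths just described could be sketched as follows; the dictionary standing in for the storing unit 140 and the function names are invented for this sketch.

```python
registered_positions = {}  # stands in for the storing unit 140 in this sketch

def register_current_position(key, gps_fix):
    # Called when the user operates the operation unit at the actual point
    # (e.g., key="P1" at the swim goal): store the present latitude/longitude.
    registered_positions[key] = gps_fix  # (latitude, longitude)

def register_from_map(selections):
    # Alternative path: positions picked on map data in the information terminal 5
    # and received via the communication unit, e.g. {"P1": (35.30, 139.48), ...}.
    registered_positions.update(selections)
```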
When starting the triathlon (when starting the swim at the start point S1), the user 2 performs measurement start operation on the user terminal 3.
The user terminal 3 incorporates a clocking unit 130 (see FIG. 4, referred to below). The user terminal 3 measures an elapsed time from the measurement start operation, that is, a total elapsed time Ttotal from the time when the user 2 starts the triathlon, and sequentially displays information concerning the measured total elapsed time Ttotal on a display unit 150 (see FIG. 4) or the like (on a real-time basis).
The user terminal 3 discriminates, on the basis of position information obtained on the basis of satellite signals transmitted from the GPS (Global Positioning System) satellites 7 (an example of the "position information satellite") and the positions P1, P2, P3, and P4 registered in advance, a plurality of states including a state "swim" (an example of a "first exercise state") in which the user 2 is carrying out the swim (an example of a "first exercise event"), a state "first transition" (an example of a "first transition state") halfway in transition from the "swim" to the "bike", a state "bike" (an example of a "second exercise state") in which the user 2 is carrying out the bike (an example of a "second exercise event"), a state "second transition" (an example of a "second transition state") halfway in transition from the "bike" to the "run", and a state "run" (an example of a "third exercise state") in which the user 2 is carrying out the run (an example of a "third exercise event"). That is, in this embodiment, the user terminal 3 discriminates five states of the "swim", the "first transition", the "bike", the "second transition", and the "run".
The user terminal 3 measures an elapsed time Tswim from the start to the end of the "swim", an elapsed time Ttran1 from the start to the end of the "first transition", an elapsed time Tbike from the start to the end of the "bike", an elapsed time Ttran2 from the start to the end of the "second transition", and an elapsed time Trun from the start to the end of the "run", and sequentially displays information concerning the discriminated states and the measured elapsed times of the states on the display unit 150 or the like (on a real-time basis).
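For illustration only, the five-state discrimination and per-state timing could be organized as a simple state machine keyed to the registered positions P1 to P4; the passed() predicate and the table names below are assumptions of this sketch, and the "run" segment would be closed by the measurement end operation described later.

```python
STATES = ["swim", "first_transition", "bike", "second_transition", "run"]
GATES = {"swim": "P1", "first_transition": "P2", "bike": "P3", "second_transition": "P4"}

def update_state(state, elapsed, now, last_change, passed):
    # passed(name) -> True when the latest position fix indicates the registered
    # point P1..P4 has just been passed (hypothetical predicate for this sketch).
    gate = GATES.get(state)
    if gate and passed(gate):
        # Close the elapsed time of the segment that just ended and advance.
        elapsed[state] = elapsed.get(state, 0.0) + (now - last_change)
        state = STATES[STATES.index(state) + 1]
        last_change = now
    return state, last_change
```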
The user terminal 3 generates, on the basis of output signals of various sensors, information such as speed, a pace, a distance, a track, a pulse rate, a heart rate, a pitch, a swimming stroke, and a run stride of the user 2 and causes the incorporated storing unit 140 (see FIG. 4) to sequentially store the information.
While the user 2 is carrying out the triathlon, the user terminal 3 transmits exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, a discriminated state, speed, a pace, a distance, a track, a pulse rate, a heart rate, a pitch, a swimming stroke, a run stride, and the like) of the user 2 to the information terminal 5 (5a) by short-range wireless communication.
The information terminal 5 (5a) displays the exercise information received from the user terminal 3 on a display unit. The information terminal 5 (5a) is, for example, a head mount display (HMD) worn by the user 2. The user 2 can carry out the triathlon while grasping exercise information displayed on the head mount display. Alternatively, the information terminal 5 (5a) is, for example, a smartphone or a personal computer carried by a coach of the user 2. The coach can provide information such as advice to the user 2, who is carrying out the triathlon, on the basis of the exercise information of the user 2 displayed on the smartphone or the personal computer.
In this embodiment, when ending the triathlon (passing the goal point G3), the user 2 performs measurement end operation on the user terminal 3.
When the measurement end operation is performed, the user terminal 3 ends the measurement processing of the total elapsed time Ttotal, the discrimination processing of the five states of the "swim", the "first transition", the "bike", the "second transition", and the "run", and the measurement processing of the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun in the states, and causes the incorporated storing unit 140 (see FIG. 4) to store the total elapsed time Ttotal and the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun. The total elapsed time Ttotal stored in the storing unit 140 is equivalent to the "total time" explained above. A sum of the elapsed time Tswim and the elapsed time Ttran1 stored in the storing unit 140 is equivalent to the "swim time" explained above. A sum of the elapsed time Tbike and the elapsed time Ttran2 stored in the storing unit 140 is equivalent to the "bike time" explained above. The elapsed time Trun stored in the storing unit 140 is equivalent to the "run time" explained above. The elapsed time Ttran1 stored in the storing unit 140 is equivalent to the "first transition time" explained above. The elapsed time Ttran2 stored in the storing unit 140 is equivalent to the "second transition time" explained above.
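The time relationships above amount to simple sums; as a worked illustration (the function name is invented for this sketch), each transition time is folded into the preceding leg.

```python
def official_times(t_swim, t_tran1, t_bike, t_tran2, t_run):
    # Per the convention above: swim time = Tswim + Ttran1, bike time = Tbike + Ttran2.
    swim_time = t_swim + t_tran1
    bike_time = t_bike + t_tran2
    run_time = t_run
    return swim_time, bike_time, run_time, swim_time + bike_time + run_time
```

For example, segment times of 1800 s, 180 s, 4200 s, 120 s, and 3000 s would yield a swim time of 1980 s, a bike time of 4320 s, a run time of 3000 s, and a total time of 9300 s.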
The user terminal 3 is connectable to the network 6 via the information terminal 5 (5b). After the user 2 ends the triathlon, the exercise information stored in the storing unit 140 is transferred to the server 4 via the information terminal 5 (5b) and the network 6 and saved in a storing unit (not shown in the figure) of the server 4. The information terminal 5 (5b) may be, for example, a smartphone or a personal computer.
The information terminal 5 (5c) receives the exercise information of the user 2 saved in the storing unit of the server 4 via the network 6 and displays the exercise information on the display unit. The information terminal 5 (5c) is, for example, a smartphone or a personal computer of the user 2 or a person related to the user 2. The user 2 and the person related to the user 2 can analyze, on the basis of the exercise information displayed on the smartphone or the personal computer, results of triathlons carried out by the user 2 in the past.
1-2. Configuration of the User Terminal
FIG. 4 is an example of a functional block diagram of the user terminal 3. As shown in FIG. 4, the user terminal 3 includes a processing unit 100, a GPS sensor 110, a terrestrial magnetism sensor 111, a pressure sensor 112, an acceleration sensor 113, an angular velocity sensor 114, a pulse sensor 115, a temperature sensor 116, an operation unit 120, a clocking unit 130, a storing unit 140, a display unit 150, a sound output unit 160, a communication unit 170, and a battery 180. However, the configuration of the user terminal 3 may be a configuration in which a part of the components are deleted or changed or other components are added.
The GPS sensor 110 (an example of a "position-information generating unit") generates position information on the basis of satellite signals transmitted from the GPS satellites 7. For example, the GPS sensor 110 may be a GPS receiver that receives the satellite signals transmitted from the GPS satellites 7 with a not-shown antenna, demodulates a navigation message from the satellite signals, generates positioning data (data such as latitude, longitude, altitude, and a speed vector), which is position information indicating the position and the like of the user terminal 3, on the basis of the navigation message, and outputs the positioning data.
The terrestrial magnetism sensor 111 is a sensor that detects and outputs the magnetic field (the terrestrial magnetism) of the Earth. The terrestrial magnetism sensor 111 generates and outputs a terrestrial magnetism signal indicating magnetic flux densities in three axial directions orthogonal to one another. In the terrestrial magnetism sensor 111, for example, an MR (magnetoresistive) element, an MI (magneto-impedance) element, or a Hall element is used.
The pressure sensor 112 is a sensor that detects and outputs peripheral pressure (air pressure, water pressure, wind pressure, etc.). The pressure sensor 112 includes, for example, a pressure sensitive element of a type that uses a change in the oscillation frequency of a vibration piece (a vibration type). The pressure sensitive element is, for example, a piezoelectric vibrator formed of a piezoelectric material such as quartz, lithium niobate, or lithium tantalate. For example, a tuning fork-type vibrator, a dual tuning fork-type vibrator, an AT vibrator (a thickness shear vibrator), or a SAW resonator is applied. Alternatively, the pressure sensor 112 may be a MEMS-type air pressure sensor manufactured using a semiconductor manufacturing technique. For example, the pressure sensor 112 includes a diaphragm unit deflectively deformed by received pressure and a distortion detecting element that detects deflection of the diaphragm unit. The diaphragm unit is formed of, for example, silicon. The distortion detecting element is, for example, a piezo-resistance element.
The acceleration sensor 113 detects accelerations in the respective three axial directions crossing (ideally orthogonal to) one another and outputs a signal (an acceleration signal) corresponding to the magnitudes and the directions of the detected three-axis accelerations.
The angular velocity sensor 114 detects angular velocities in the respective three axial directions crossing (ideally orthogonal to) one another and outputs a signal (an angular velocity signal) corresponding to the magnitudes and the directions of the measured three-axis angular velocities.
Note that at least one of the output signal (the pressure signal) of the pressure sensor 112, the output signal (the acceleration signal) of the acceleration sensor 113, and the output signal (the angular velocity signal) of the angular velocity sensor 114 is used for correcting the information concerning the positions included in the positioning data generated by the GPS sensor 110.
The pulse sensor 115 is a sensor that generates and outputs a signal indicating a pulse of the user 2. The pulse sensor 115 includes, for example, a light source such as an LED (Light Emitting Diode) that emits measurement light having an appropriate wavelength toward a blood vessel under the skin and a light receiving element that detects a change in the intensity of light that occurs in the blood vessel according to the measurement light. For example, it is possible to measure a pulse rate (the number of pulses per minute) by processing the intensity change waveform (a pulse wave) of the light with a publicly-known method such as frequency analysis. Note that the heart rate (the number of heartbeats per minute) and the pulse rate are said to be substantially the same unless arrhythmia or pulse deficit occurs. Therefore, the heart rate can also be measured by the pulse sensor 115. As the pulse sensor 115, instead of a photoelectric sensor including a light source and a light receiving element, an ultrasonic sensor that detects contraction of a blood vessel with an ultrasonic wave and measures a pulse rate (a heart rate) may be adopted. Alternatively, a sensor that feeds a weak current from electrodes into the body and measures a pulse rate (a heart rate) may be adopted.
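As a hedged illustration of the “frequency analysis” mentioned above, the following Python sketch estimates a pulse rate from a sampled intensity waveform by picking the dominant frequency in a plausible pulse band; the band limits and the absence of filtering are simplifying assumptions.

import numpy as np

def pulse_rate_bpm(pulse_wave: np.ndarray, fs: float) -> float:
    # Estimate the pulse rate (per minute) from a photoplethysmographic waveform
    # sampled at fs Hz by locating the dominant frequency between 0.5 Hz (30 bpm)
    # and 4 Hz (240 bpm).
    x = pulse_wave - np.mean(pulse_wave)      # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 0.5) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return float(peak_freq * 60.0)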
Thetemperature sensor116 is a sensor that outputs a signal (a temperature signal) corresponding to an ambient temperature.
Theoperation unit120 is configured by, for example, buttons, keys, a microphone, a touch panel, a sound recognizing function (which uses a not-shown microphone), and an action detecting function (which uses theacceleration sensor113 or the like). Theoperation unit120 performs processing for converting an instruction received from theuser2 into an appropriate signal and sending the signal to theprocessing unit100.
Theclocking unit130 is configured by, for example, a real time clock (RTC) IC. Theclocking unit130 generates time data such as year, month, day, time, minute, and second and sends the time data to theprocessing unit100. Note that the time data may be corrected as appropriate on the basis of time information included in the positioning data generated by theGPS sensor110.
The storingunit140 is configured by, for example, one or a plurality of IC (Integrated Circuit) memories. The storingunit140 includes a ROM (Read Only Memory) in which data such as computer programs are stored, a RAM (Random Access Memory) serving as a work area of theprocessing unit100, and a recording medium (a recording medium readable by the user terminal3 (an example of a computer)) such as a memory card for storing computer programs and data. In the ROM or the recording medium, various computer programs for theprocessing unit100 to perform various kinds of calculation processing and control processing, various computer programs and various data for realizing application functions, and the like are stored.
Note that theuser terminal3 may receive various computer programs and various data stored in a recording medium (an optical disk (a CD or a DVD), a magneto-optical disk (MO), a magnetic disk, a hard disk, a magnetic tape, etc.) included in theserver4 or a storing unit via the information terminal5 (5b) and thenetwork6 and store the received various computer programs and various data in the storing unit140 (the RAM).
Thedisplay unit150 is configured by, for example, an LCD (Liquid Crystal Display), an organic EL (Electroluminescence) display, an EPD (Electrophoretic Display), or a touch panel display. Thedisplay unit150 displays various images according to instructions from theprocessing unit100. Note that, as thedisplay unit150, a head mount display (HMD) provided separately from theuser terminal3 can also be used.
Thesound output unit160 is configured by, for example, a speaker, a buzzer, or a vibrator. Thesound output unit160 generates various kinds of sound (or vibration) according to instructions from theprocessing unit100. Note that, as thesound output unit160, a bone conduction device provided separately from theuser terminal3 can also be used.
Thecommunication unit170 performs various kinds of control for establishing data communication between theuser terminal3 and theinformation terminal5. Thecommunication unit170 includes a transceiver corresponding to a short-range wireless communication standard such as Bluetooth (registered trademark) (including BTLE: Bluetooth Low Energy), Wi-Fi (Wireless Fidelity) (registered trademark), Zigbee (registered trademark), NFC (Near Field Communication), or ANT+ (registered trademark) and a connector corresponding to a communication bus standard such as USB (Universal Serial Bus).
Thebattery180 supplies electric power to the units configuring theuser terminal3 and is, for example, a rechargeable battery. As a charging type of thebattery180, for example, contactless charging or contact charging (charging performed using a cradle or the like) can be applied. Thebattery180 may be an interchangeable battery or may be a solar generation-type battery.
The processing unit100 (a processor) is configured by, for example, an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), or an ASIC (Application Specific Integrated Circuit). Theprocessing unit100 executes various kinds of processing on the basis of computer programs stored in thestoring unit140 and signals input from theoperation unit120. The processing by theprocessing unit100 includes data processing for output signals of theGPS sensor110, theterrestrial magnetism sensor111, thepressure sensor112, theacceleration sensor113, theangular velocity sensor114, thepulse sensor115, thetemperature sensor116, and theclocking unit130, display processing for causing thedisplay unit150 to display an image, sound output processing for causing thesound output unit160 to output sound, communication processing for performing communication with theinformation terminal5 via thecommunication unit170, and power control processing for supplying electric power received from thebattery180 to the units.
In particular, in this embodiment, theprocessing unit100 performs processing for receiving a signal indicating setting of the positions P1, P2, P3, and P4 from theoperation unit120 or thecommunication unit170 and registering the positions P1, P2, P3, and P4 in thestoring unit140.
Theprocessing unit100 performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of theclocking unit130, an elapsed time (the total elapsed time Ttotal) after a signal indicating measurement start operation is received from theoperation unit120.
Theprocessing unit100 performs, as one kind of the data processing, processing for discriminating the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of theuser2 on the basis of positioning data generated and output by the GPS sensor110 (the position information obtained on the basis of satellite signals transmitted from the GPS satellites7) and the positions P1, P2, P3, and P4 registered in advance in thestoring unit140.
Specifically, theprocessing unit100 determines on the basis of the positioning data (the position information) whether theuser2 passes the position P1 and whether theuser2 passes the position P2, determines that theuser2 is in the state “swim” until theuser2 passes the position P1, determines that theuser2 is in the state “first transition” until theuser2 passes the position P2 after passing the position P1, and determines that theuser2 is in the state “bike” after theuser2 passes the position P2.
Further, theprocessing unit100 determines on the basis of the positioning data (the position information) whether theuser2 passes the position P3 and whether theuser2 passes the position P4, determines that theuser2 is in the state “bike” until theuser2 passes the position P3 after passing the position P2, determines that theuser2 is in the state “second transition” until theuser2 passes the position P4 after passing the position P3, and determines that theuser2 is in the state “run” after theuser2 passes the position P4.
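One way to organize this passing-based discrimination is to keep the registered positions and the state that each passage switches to as an ordered list. The Python sketch below (illustrative names only) encodes the sequence described in the two preceding paragraphs and is reused in the loop sketched after the flowchart of FIG. 6.

from enum import Enum

class State(Enum):
    SWIM = "swim"
    FIRST_TRANSITION = "first transition"
    BIKE = "bike"
    SECOND_TRANSITION = "second transition"
    RUN = "run"

# Passing P1 ends the swim, P2 starts the bike, P3 ends the bike, P4 starts the run.
# The actual coordinates of P1..P4 would be registered in the storing unit 140 in advance.
TRANSITIONS = [
    ("P1", State.FIRST_TRANSITION),
    ("P2", State.BIKE),
    ("P3", State.SECOND_TRANSITION),
    ("P4", State.RUN),
]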
Theprocessing unit100 performs, as one kind of the data processing, processing for calculating times respectively required for the plurality of states “swim”, “first transition”, “bike”, “second transition”, and the “run” of theuser2. That is, theprocessing unit100 performs processing for measuring, on the basis of an output signal of theclocking unit130, the elapsed time Tswim of the state “swim”, the elapsed time Ttran1 of the state “first transition”, the elapsed time Tbike of the state “bike”, the elapsed time Ttran2 of the state “second transition”, and the elapsed time Trun of the state “run”.
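A minimal sketch of the per-state time measurement follows, assuming a monotonic host clock stands in for the clocking unit 130; the class and method names are illustrative.

import time

class SplitTimer:
    # Measures the total elapsed time Ttotal and the elapsed time of each state
    # (Tswim, Ttran1, Tbike, Ttran2, Trun) as the state changes.
    def __init__(self):
        self.start = time.monotonic()
        self.state_start = self.start
        self.splits = {}

    def change_state(self, finished_state):
        now = time.monotonic()
        self.splits[finished_state] = now - self.state_start  # e.g. Tswim, Ttran1, ...
        self.state_start = now

    def total_elapsed(self):
        return time.monotonic() - self.start  # Ttotal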
Theprocessing unit100 performs, as one kind of the data processing, processing for generating, on the basis of output signals of theGPS sensor110, theterrestrial magnetism sensor111, thepressure sensor112, theacceleration sensor113, theangular velocity sensor114, thepulse sensor115, thetemperature sensor116, and theclocking unit130, information such as speed, a pace, a distance, a track, a pulse rate, a heart rate, a pitch, a swimming stroke, and a run stride of theuser2 after a signal indicating measurement start operation is received from theoperation unit120 and causing thestoring unit140 to store the information.
Theprocessing unit100 performs, as one kind of the data processing, processing for ending, when receiving a signal indicating measurement end operation from theoperation unit120, the measurement processing of the total elapsed time Ttotal, the discrimination processing of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run”, and the measurement processing of the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun of the states and causing the incorporated storingunit140 to store the total elapsed time Ttotal and the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun.
The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least one of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the display unit 150 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
Theprocessing unit100 may perform, as one kind of the display processing, processing for causing thedisplay unit150 to display at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of theuser2.
The processing unit 100 may perform, as one kind of the sound output processing, processing for causing the sound output unit 160 to output, as sound, at least one of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the sound output unit 160 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
Theprocessing unit100 may perform, as one kind of the sound output processing, processing for causing thesound output unit160 to output, as sound, at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of theuser2.
The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting at least one of the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170. In this case, the communication unit 170 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
Theprocessing unit100 may perform, as one kind of the communication processing, processing for transmitting the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of theuser2 to the information terminals5 (5aand5b) via thecommunication unit170.
1-3. Procedure of Processing of the User Terminal
FIG. 5 is a flowchart for explaining an example of a procedure of a part of processing performed by theprocessing unit100 of theuser terminal3. Theprocessing unit100 of theuser terminal3 executes a computer program stored in the storing unit140 (the storage medium, the ROM, or the RAM) to thereby execute the processing in the procedure of the flowchart ofFIG. 5.
As shown in FIG. 5, first, the processing unit 100 stays on standby until the processing unit 100 receives a signal indicating measurement start operation from the operation unit 120 (N in step S10). When receiving the signal indicating the measurement start operation (Y in step S10), the processing unit 100 starts generation processing of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of the user 2 (step S12).
Subsequently, theprocessing unit100 performs state discrimination processing for discriminating a state of the user2 (step S14). Details of the state discrimination processing are explained below.
Subsequently, theprocessing unit100 stays on standby until theprocessing unit100 receives a signal indicating measurement end operation from the operation unit120 (N in step S16). When receiving the signal indicating the measurement end operation (Y in step S16), theprocessing unit100 ends the generation processing of the exercise information of the user2 (step S18).
FIG. 6 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 inFIG. 5) in the first embodiment.
As shown inFIG. 6, first, theprocessing unit100 sets the state of theuser2 to the “swim” (step S100).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S102) and determines on the basis of the acquired position information and the registered position P1 whether the distance between the position of theuser2 and the position P1 is equal to or smaller than a threshold (step S104). The threshold only has to be decided as appropriate.
If the distance between the position of theuser2 and the position P1 is not equal to or smaller than the threshold (N in step S104), theprocessing unit100 performs the processing in steps S102 and S104 again. On the other hand, if the distance between the position of theuser2 and the position P1 is equal to or smaller than the threshold (Y in step S104), theprocessing unit100 changes the state of theuser2 from the “swim” to the “first transition” (step S106).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S108) and determines on the basis of the acquired position information and the registered position P2 whether the distance between the position of theuser2 and the position P2 is equal to or smaller than the threshold (step S110).
If the distance between the position of theuser2 and the position P2 is not equal to or smaller than the threshold (N in step S110), theprocessing unit100 performs the processing in steps S108 and S110 again. On the other hand, if the distance between the position of theuser2 and the position P2 is equal to or smaller than the threshold (Y in step S110), theprocessing unit100 changes the state of theuser2 from the “first transition” to the “bike” (step S112).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S114) and determines on the basis of the acquired position information and the registered position P3 whether the distance between the position of theuser2 and the position P3 is equal to or smaller than the threshold (step S116).
If the distance between the position of theuser2 and the position P3 is not equal to or smaller than the threshold (N in step S116), theprocessing unit100 performs the processing in steps S114 and S116 again. On the other hand, if the distance between the position of theuser2 and the position P3 is equal to or smaller than the threshold (Y in step S116), theprocessing unit100 changes the state of theuser2 from the “bike” to the “second transition” (step S118).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S120) and determines on the basis of the acquired position information and the registered position P4 whether the distance between the position of theuser2 and the position P4 is equal to or smaller than the threshold (step S122).
If the distance between the position of theuser2 and the position P4 is not equal to or smaller than the threshold (N in step S122), theprocessing unit100 performs the processing in steps S120 and S122 again. On the other hand, if the distance between the position of theuser2 and the position P4 is equal to or smaller than the threshold (Y in step S122), theprocessing unit100 changes the state of theuser2 from the “second transition” to the “run” (step S124).
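Putting the sketches above together, the flow of FIG. 6 can be summarized as the following loop; gps.get_fix() and on_state_change() are assumed interfaces standing in for the GPS sensor 110 and the notifying units, and the threshold handling follows steps S100 to S124.

def discriminate_states(gps, registered_positions, threshold_m, on_state_change):
    # registered_positions maps "P1".."P4" to PositioningData registered in advance.
    state = State.SWIM                      # step S100
    on_state_change(state)
    for point, next_state in TRANSITIONS:   # steps S102 to S124
        target = registered_positions[point]
        while True:
            fix = gps.get_fix()             # acquire positioning data
            if distance_m(fix, target) <= threshold_m:
                break                       # the user 2 has passed the point
        state = next_state
        on_state_change(state)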
1-4. Display Method of States of the User
In this embodiment, the user terminal 3 (the processing unit 100) discriminates a plurality of states of the user 2 according to the state discrimination processing (the processing in step S14 in FIG. 5) and causes the display unit 150 to display the discriminated states. FIG. 7 is a diagram showing a display example of states of the user 2 that the processing unit 100 causes the display unit 150 to display while the user 2 is carrying out the triathlon.
As shown inFIG. 7, when a state of theuser2 is the “swim”, theprocessing unit100 causes thedisplay unit150 to display an image A1 including an object OB1 for reminding that theuser2 is carrying out the swim. When the state of theuser2 is the “swim”, theprocessing unit100 may cause thedisplay unit150 to display the elapsed time Tswim from the start of the “swim”. The image A1 includes the total elapsed time Ttotal (0:15:15) serving as a total time and the elapsed time Tswim (0:15:15) of the swim.
When the state of theuser2 is the “first transition”, theprocessing unit100 causes thedisplay unit150 to display an image A2 including an object OB2 for reminding that theuser2 is transitioning from the swim to the bike. When the state of theuser2 is the “first transition”, theprocessing unit100 may cause thedisplay unit150 to display the elapsed time Ttran1 from the start of the “first transition”. The image A2 includes the total elapsed time Ttotal (0:30:12) serving as the total time and the elapsed time Ttran1 (0:01:05) of the first transition.
When the state of theuser2 is the “bike”, theprocessing unit100 causes thedisplay unit150 to display an image A3 including an object OB3 for reminding that theuser2 is carrying out the bike. When the state of theuser2 is the “bike”, theprocessing unit100 may cause thedisplay unit150 to display the elapsed time Tbike from the start of the “bike”. The image A3 includes the total elapsed time Ttotal (1:01:45) serving as the total time and the elapsed time Tbike (0:26:59) of the bike.
When the state of theuser2 is the “second transition”, theprocessing unit100 causes thedisplay unit150 to display an image A4 including an object OB4 for reminding that theuser2 is transitioning from the bike to the run. When the state of theuser2 is the “second transition”, theprocessing unit100 may cause thedisplay unit150 to display the elapsed time Ttran2 from the start of the “second transition”. The image A4 includes the total elapsed time Ttotal (1:32:38) serving as the total time and the elapsed time Ttran2 (0:00:55) of the second transition.
When the state of theuser2 is the “run”, theprocessing unit100 causes thedisplay unit150 to display an image A5 including an object OB5 for reminding that theuser2 is carrying out the run. When the state of theuser2 is the “run”, theprocessing unit100 may cause thedisplay unit150 to display the elapsed time Trun from the start of the “run”. The image A5 includes the total elapsed time Ttotal (2:12:33) serving as the total time and the elapsed time Trun (0:39:22) of the run.
Note that theprocessing unit100 may transmit information concerning the images A1 to A5 representing the states of theuser2 to the information terminal5 (5a) via thecommunication unit170 and cause the display unit of the information terminal5 (5a) to display the images A1 to A5.
In this way, in this embodiment, a display system is configured that includes the processing unit 100, which discriminates the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of the user 2, and the display unit 150 of the user terminal 3 or the display unit of the information terminal 5, which displays the discriminated states.
Note that, in order to reduce the total time, it is desirable to reduce the elapsed time Ttran1 of the first transition and the elapsed time Ttran2 of the second transition as much as possible. Therefore, when a discriminated state is the “first transition” or the “second transition”, the processing unit 100 may generate an image including a flashing object and cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the image. Consequently, in FIG. 7, the image A2 displayed when the state of the user 2 is the “first transition” and the image A4 displayed when the state of the user 2 is the “second transition” are more conspicuous (highlighted) than the other images A1, A3, and A5. For example, as shown in FIG. 8, in the image A2 displayed when the state of the user 2 is the “first transition”, an object representing the letter T, which is a part of the object OB2, may be flashed. In the example shown in FIG. 8, the object representing T is lit for one second (an image A2-1), extinguished for one second (an image A2-2), lit for one second (an image A2-3), and extinguished for one second (an image A2-4). Although not shown in the figure, similarly, in the image A4 displayed when the state of the user 2 is the “second transition”, an object representing the letter T, which is a part of the object OB4, may be flashed.
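The one-second on, one-second off flashing of FIG. 8 could be driven by the elapsed time alone, as in this small sketch; base_objects and t_object are placeholders for whatever drawing primitives the display processing uses.

def transition_frame(elapsed_s: float, base_objects: list, t_object):
    # Lit for one second, extinguished for one second, repeating.
    lit = int(elapsed_s) % 2 == 0
    return base_objects + ([t_object] if lit else [])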
1-5. Display Method of Exercise Information
In this embodiment, the user terminal 3 (the processing unit 100) transmits the exercise information of the user 2 generated by the generation processing of exercise information (the processing starting in step S12 and ending in step S18 in FIG. 5) to the information terminals 5 (5a and 5b) via the communication unit 170.
While theuser2 is carrying out the triathlon, theinformation terminal5areceives the exercise information of theuser2 from the user terminal3 (the processing unit100) and displays at least a part of the exercise information on the display unit.
After the user 2 ends the triathlon, the information terminal 5b receives the exercise information of the user 2 from the user terminal 3 (the processing unit 100) and transfers the exercise information to the server 4 via the network 6. The server 4 saves the exercise information of the user 2 received from the information terminal 5b in the storing unit. Thereafter, the information terminal 5c receives the exercise information of the user 2 saved in the storing unit of the server 4 via the network 6 and displays the exercise information on the display unit. FIGS. 9 to 12 are diagrams showing examples of images displayed on the display unit of the information terminal 5c.
The image shown inFIG. 9 includes information such as trend graphs of an average pace, an altitude, and a heart rate with time plotted on the horizontal axis, the objects indicating the states of theuser2, the elapsed time Ttran1 (62 seconds) of the first transition, and the elapsed time Ttran2 (44 seconds) of the second transition.
The image shown inFIG. 10 includes information such as trend graphs of an average pace, an altitude, and a pitch with time plotted on the horizontal axis, the objects indicating the states of theuser2, the elapsed time Ttran1 (62 seconds) of the first transition, and the elapsed time Ttran2 (44 seconds) of the second transition.
In the image shown inFIG. 11, a pulse rate is classified into five stages of 30 to 100, 101 to 130, 131 to 160, 161 to 190, and 191 to 240. The image includes information concerning times in which the pulse rate is in the stages and ratios of the times.
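A sketch of how the five pulse-rate stages of FIG. 11 and the time ratios could be tabulated from sampled pulse rates follows; the sampling interval and dictionary layout are assumptions.

def pulse_zone_ratios(pulse_samples, sample_interval_s: float):
    # Classify each sample into the five stages and accumulate the time spent in each.
    stages = [(30, 100), (101, 130), (131, 160), (161, 190), (191, 240)]
    seconds = {stage: 0.0 for stage in stages}
    for bpm in pulse_samples:
        for low, high in stages:
            if low <= bpm <= high:
                seconds[(low, high)] += sample_interval_s
                break
    total = sum(seconds.values()) or 1.0
    return {stage: (t, t / total) for stage, t in seconds.items()}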
The image shown inFIG. 12 includes information concerning a moving track of the user2 (an athlete A) and a moving track of another user (e.g., a professional athlete B).
From the images shown in FIG. 9, 10, or 11, when the time of the user 2 has improved, it is possible, for example, to determine in which of the states “swim”, “first transition”, “bike”, “second transition”, and “run” the user 2 performed well, to determine that the time efficiency in the “second transition” was better than the time efficiency in the “first transition”, or to determine that the pace in the “run” was stable because the user 2 was able to save power in the “bike”. Conversely, when the user 2 desires to improve his or her time, it is possible, for example, to determine which of the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run” is most effective to improve, to determine that it is better to reduce the pace in the “bike” in preparation for the “run”, or to determine that the user 2 should repeat practice in a pool to develop basic strength because the pace in the “swim” is unstable.
From the image shown in FIG. 12, for example, it is possible to compare the course selections of the user 2 and the other user (e.g., the professional athlete) and determine in which event and at which place (a curve or a straight section) a loss occurred. Although not shown in the figure, an image that enables trend graphs of average paces, altitudes, heart rates, and pitches to be compared between the user 2 and the other user (e.g., the professional athlete) may be displayed on the display unit of the information terminal 5c. With such an image, it is possible to determine, for example, that the pace of the other user (the professional athlete) is overwhelmingly higher than the pace of the user 2 or that the user 2 is evenly matched with the other user (the professional athlete) in the swim and the run.
1-6. Action and Effects
As explained above, with the exercise information management system 1 in the first embodiment, on the basis of the positioning data (the position information) of the GPS sensor 110 and the positions P1, P2, P3, and P4 registered in advance, the processing unit 100 of the user terminal 3 determines that the user 2 is in the state “swim” until the user 2 passes the position P1, determines that the user 2 is in the state “first transition” until the user 2 passes the position P2 after passing the position P1, determines that the user 2 is in the state “bike” until the user 2 passes the position P3 after passing the position P2, determines that the user 2 is in the state “second transition” until the user 2 passes the position P4 after passing the position P3, and determines that the user 2 is in the state “run” after the user 2 passes the position P4. That is, since the processing unit 100 automatically discriminates the state of the user 2, the user 2 does not need to perform work when the athletic event carried out by the user 2 is switched from the “swim” to the “bike” and when the athletic event is switched from the “bike” to the “run”. Therefore, the user 2 can concentrate on the triathlon.
With the exerciseinformation management system1 in the first embodiment, the states of theuser2 discriminated by theprocessing unit100 of theuser terminal3 are displayed on thedisplay unit150 of theuser terminal3 together with the elapsed times. Therefore, theuser2 is capable of recognizing the displayed elapsed times of the states and performing adjustment such as an increase and a reduction in a pace. The states of theuser2 discriminated by theprocessing unit100 of theuser terminal3 are displayed on the display unit of theinformation terminal5atogether with the elapsed times. Therefore, a person (e.g., a coach) carrying theinformation terminal5acan recognize the elapsed times of the states of theuser2 and give advice or the like such as an increase or a reduction in a pace to theuser2. In particular, theuser2 and the like can separately recognize a time required for the state “swim” and a time required for the state “first transition”. Therefore, theuser2 and the like can recognize whether a pace of the swim is faster or slower than an assumption and whether a time required for switching from the swim to the bike is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the bike. Similarly, theuser2 and the like can separately recognize a time required for the state “bike” and a time required for the state “second transition”. Therefore, theuser2 and the like can recognize whether a pace of the bike is faster or slower than an assumption and whether a time required for switching from the bike to the run is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the run.
2. Second Embodiment
The exercise information management system 1 in a second embodiment is explained below.
In the exerciseinformation management system1 in the second embodiment, as shown inFIG. 13, before starting a triathlon, theuser2 stores, in advance, in thestoring unit140 of theuser terminal3, a plurality of positions P1 in the goal point G1 of swim or the vicinity of the goal point G1, a plurality of positions P2 in the start point S2 of bike or the vicinity of the start point S2, a plurality of positions P3 in the goal point G2 of the bike or the vicinity of the goal point G2, and a plurality of positions P4 in the start point S3 of run or the vicinity of the start point S3.
Theprocessing unit100 performs processing for discriminating a plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of theuser2 on the basis of positioning data (position information) generated and output by theGPS sensor110 and the plurality of positions P1, P2, P3, and P4 registered in thestoring unit140 in advance.
Specifically, theprocessing unit100 determines on the basis of the positioning data (the position information) whether theuser2 passes any one of the plurality of positions P1 and whether theuser2 passes any one of the plurality of positions P2, determines that theuser2 is in the state “swim” until theuser2 passes any one of the plurality of positions P1, determines that theuser2 is in the state “first transition” until theuser2 passes any one of the plurality of positions P2 after passing any one of the plurality of positions P1, and determines that theuser2 is in the state “bike” after theuser2 passes any one of the plurality of positions P2.
Further, theprocessing unit100 determines on the basis of the positioning data (the position information) whether theuser2 passes any one of the plurality of positions P3 and whether theuser2 passes any one of the plurality of positions P4, determines that theuser2 is in the state “bike” until theuser2 passes any one of the plurality of positions P3 after passing any one of the plurality of positions P2, determines that theuser2 is in the state “second transition” until theuser2 passes any one of the plurality of positions P4 after passing any one of the plurality of positions P3, and determines that theuser2 is in the state “run” after theuser2 passes any one of the plurality of positions P4.
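The only change the second embodiment needs in the passing check of the earlier sketch is to test against every registered candidate for a point, for example:

def passed_any(fix, candidate_positions, threshold_m: float) -> bool:
    # True if the user 2 is within the threshold of ANY of the plurality of
    # registered positions (e.g. the several P1 points near the swim goal G1).
    return any(distance_m(fix, p) <= threshold_m for p in candidate_positions)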
As in the first embodiment, in the second embodiment, theprocessing unit100 executes the computer program stored in thestoring unit140 to thereby, for example, execute the processing in the procedure of the flowchart ofFIG. 5.
FIG. 14 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 inFIG. 5) in the second embodiment.
As shown inFIG. 14, first, theprocessing unit100 sets a state of theuser2 to the “swim” (step S200).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S202) and determines on the basis of the acquired position information and the registered plurality of positions P1 whether the distance between the position of theuser2 and any one of the plurality of positions P1 is equal to or smaller than a threshold (step S204). The threshold only has to be decided as appropriate.
If the distance between the position of theuser2 and any one of the plurality of positions P1 is not equal to or smaller than the threshold (N in step S204), theprocessing unit100 performs the processing in steps S202 and S204 again. On the other hand, if the distance between the position of theuser2 and any one of the plurality of positions P1 is equal to or smaller than the threshold (Y in step S204), theprocessing unit100 changes the state of theuser2 from the “swim” to the “first transition” (step S206).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S208) and determines on the basis of the acquired position information and the registered plurality of positions P2 whether the distance between the position of theuser2 and any one of the plurality of positions P2 is equal to or smaller than the threshold (step S210).
If the distance between the position of theuser2 and any one of the plurality of positions P2 is not equal to or smaller than the threshold (N in step S210), theprocessing unit100 performs the processing in steps S208 and S210 again. On the other hand, if the distance between the position of theuser2 and any one of the plurality of positions P2 is equal to or smaller than the threshold (Y in step S210), theprocessing unit100 changes the state of theuser2 from the “first transition” to the “bike” (step S212).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S214) and determines on the basis of the acquired position information and the registered plurality of positions P3 whether the distance between the position of theuser2 and any one of the plurality of positions P3 is equal to or smaller than the threshold (step S216).
If the distance between the position of theuser2 and any one of the plurality of positions P3 is not equal to or smaller than the threshold (N in step S216), theprocessing unit100 performs the processing in steps S214 and S216 again. On the other hand, if the distance between the position of theuser2 and any one of the plurality of positions P3 is equal to or smaller than the threshold (Y in step S216), theprocessing unit100 changes the state of theuser2 from the “bike” to the “second transition” (step S218).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S220) and determines on the basis of the acquired position information and the registered plurality of positions P4 whether the distance between the position of theuser2 and any one of the plurality of positions P4 is equal to or smaller than the threshold (step S222).
If the distance between the position of theuser2 and any one of the plurality of positions P4 is not equal to or smaller than the threshold (N in step S222), theprocessing unit100 performs the processing in steps S220 and S222 again. On the other hand, if the distance between the position of theuser2 and any one of the plurality of positions P4 is equal to or smaller than the threshold (Y in step S222), theprocessing unit100 changes the state of theuser2 from the “second transition” to the “run” (step S224).
The exerciseinformation management system1 in the second embodiment explained above can achieve the same effects as the effects in the first embodiment.
Further, with the exerciseinformation management system1 in the second embodiment, theprocessing unit100 of theuser terminal3 discriminates the states assuming that the state of theuser2 is switched when theuser2 passes any one of the plurality of positions P1, any one of the plurality of positions P2, any one of the plurality of positions P3, or any one of the plurality of positions P4. Therefore, compared with the first embodiment in which only one each of the positions P1, P2, P3, and P4 is registered, it is possible to more accurately discriminate the state of theuser2.
3. Third Embodiment
The exercise information management system 1 in a third embodiment is explained below.
In the first embodiment, theprocessing unit100 calculates the elapsed time Ttran1 from the start to the end of the state “first transition” as a time equivalent to the first transition time (the elapsed time from time when theuser2 passes the goal point G1 of the swim until theuser2 passes the start point S2 of the bike).
Actually, the goal point G1 of the swim and the transition area TA are sometimes located away from each other. In that case, the first transition time is the sum of a time in which the user 2 moves from the goal point G1 of the swim to the transition area TA and a time required for a change of clothes in the transition area TA, movement to the start point S2 (a riding line) of the bike, and the like.
Therefore, in the third embodiment, a state halfway in transition from the “swim” to the “bike” consists of two states, that is, a state “running movement” in which theuser2 is moving from the goal point G1 of the swim to the transition area TA and a state “first transition” in which theuser2 is performing a change of clothes in the transition area TA and movement to the start point S2 (the riding line) of the bike. Theprocessing unit100 of theuser terminal3 discriminates the two states “running movement” and “first transition” respectively as separate states.
In this embodiment, as shown inFIG. 15, theuser2 registers, in advance, a position P5 in a point E1 of an entrance of the transition area TA or the vicinity of the point E1 present on a moving route from the goal point G1 of the swim in addition to the positions P1, P2, P3 and P4.
Theprocessing unit100 performs processing for discriminating a plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” on the basis of positioning data (position information) generated and output by theGPS sensor110 and the positions P1, P2, P3, P4, and P5 registered in thestoring unit140 in advance. The “running movement” is a state in which theuser2 is moving from the goal point G1 of the swim to the transition area TA.
Specifically, theprocessing unit100 determines on the basis of the positioning data (the position information) whether theuser2 passes the position P1 and whether theuser2 passes the position P2, determines that theuser2 is in the state “swim” until theuser2 passes the position P1, determines that theuser2 is in the state “running movement” until theuser2 passes the position P5 after passing the position P1, determines that theuser2 is in the “first transition” until theuser2 passes the position P2 after passing the position P5, and determines that theuser2 is in the state “bike” after theuser2 passes the position P2.
Further, theprocessing unit100 determines on the basis of the positioning data (the position information) whether theuser2 passes the position P3 and whether theuser2 passes the position P4, determines that theuser2 is in the state “bike” until theuser2 passes the position P3 after passing the position P2, determines that theuser2 is in the state “second transition” until theuser2 passes the position P4 after passing the position P3, and determines that theuser2 is in the state “run” after theuser2 passes the position P4.
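In terms of the ordered-list sketch introduced for the first embodiment, the third embodiment only inserts the position P5 and the state “running movement” between P1 and P2; the strings below are illustrative labels.

# Third-embodiment ordering: P5 (entrance of the transition area TA) splits the
# old swim-to-bike interval into "running movement" and "first transition".
TRANSITIONS_3RD = [
    ("P1", "running movement"),
    ("P5", "first transition"),
    ("P2", "bike"),
    ("P3", "second transition"),
    ("P4", "run"),
]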
Theprocessing unit100 performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of theclocking unit130, the elapsed time Tswim of the state “swim”, an elapsed time Tmove of the state “running movement”, the elapsed time Ttran1 of the state “first transition”, the elapsed time Tbike of the state “bike”, the elapsed time Ttran2 of the state “second transition”, and the elapsed time Trun of the state “run”.
Theprocessing unit100 performs, as one kind of the data processing, processing for ending, when receiving a signal indicating measurement end operation from theoperation unit120, the measurement processing of the total elapsed time Ttotal, the discrimination processing of the “swim”, the “running movement”, the “first transition”, the “bike”, the “second transition”, and the “run”, and the measurement processing of the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun of the states and causing the incorporated storingunit140 to store the total elapsed time Ttotal and the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun.
The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least one of the plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the display unit 150 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
Theprocessing unit100 may perform, as one kind of the display processing, processing for causing thedisplay unit150 to display at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of theuser2.
The processing unit 100 may perform, as one kind of the sound output processing, processing for causing the sound output unit 160 to output, as sound, at least one of the plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” of the user 2. In this case, the sound output unit 160 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting at least one of the plurality of states “swim”, “running movement”, “first transition”, “bike”, “second transition”, and “run” of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170. In this case, the communication unit 170 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
Theprocessing unit100 may perform, as one kind of the communication processing, processing for transmitting the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, Tmove, Ttran1, Tbike, Ttran2, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of theuser2 to the information terminals5 (5aand5b) via thecommunication unit170.
As in the first embodiment, in the third embodiment, theprocessing unit100 executes the computer program stored in thestoring unit140 to thereby, for example, execute the processing in the procedure of the flowchart ofFIG. 5.
FIG. 16 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 inFIG. 5) in the third embodiment.
As shown inFIG. 16, first, theprocessing unit100 sets a state of theuser2 to the “swim” (step S300).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S302) and determines on the basis of the acquired position information and the registered position P1 whether the distance between the position of theuser2 and the position P1 is equal to or smaller than a threshold (step S304). The threshold only has to be decided as appropriate.
If the distance between the position of theuser2 and the position P1 is not equal to or smaller than the threshold (N in step S304), theprocessing unit100 performs the processing in steps S302 and S304 again. On the other hand, if the distance between the position of theuser2 and the position P1 is equal to or smaller than the threshold (Y in step S304), theprocessing unit100 changes the state of theuser2 from the “swim” to the “running movement (running movement A)” (step S306).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S308) and determines on the basis of the acquired position information and the registered position P5 whether the distance between the position of theuser2 and the position P5 is equal to or smaller than the threshold (step S310).
If the distance between the position of theuser2 and the position P5 is not equal to or smaller than the threshold (N in step S310), theprocessing unit100 performs the processing in steps S308 and S310 again. On the other hand, if the distance between the position of theuser2 and the position P5 is equal to or smaller than the threshold (Y in step S310), theprocessing unit100 changes the state of theuser2 from the “running movement” to the “first transition” (step S312).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S314) and determines on the basis of the acquired position information and the registered position P2 whether the distance between the position of theuser2 and the positions P2 is equal to or smaller than the threshold (step S316).
If the distance between the position of theuser2 and the position P2 is not equal to or smaller than the threshold (N in step S316), theprocessing unit100 performs the processing in steps S314 and S316 again. On the other hand, if the distance between the position of theuser2 and the position P2 is equal to or smaller than the threshold (Y in step S316), theprocessing unit100 changes the state of theuser2 from the “first transition” to the “bike” (step S318).
Subsequently, theprocessing unit100 acquires positioning data (position information) from the GPS sensor110 (step S320) and determines on the basis of the acquired position information and the registered position P3 whether the distance between the position of theuser2 and the position P3 is equal to or smaller than the threshold (step S322).
If the distance between the position of theuser2 and the position P3 is not equal to or smaller than the threshold (N in step S322), theprocessing unit100 performs the processing in steps S320 and S322 again. On the other hand, if the distance between the position of theuser2 and the position P3 is equal to or smaller than the threshold (Y in step S322), theprocessing unit100 changes the state of theuser2 from the “bike” to the “second transition” (step S324).
Subsequently, theprocessing unit100 acquires positioning data (position information) (step S326) and determines on the basis of the acquired position information and the registered position P4 whether the distance between the position of theuser2 and the position P4 is equal to or smaller than the threshold (step S328).
If the distance between the position of theuser2 and the position P4 is not equal to or smaller than the threshold (N in step S328), theprocessing unit100 performs the processing in steps S326 and S328 again. On the other hand, if the distance between the position of theuser2 and the position P4 is equal to or smaller than the threshold (Y in step S328), theprocessing unit100 changes the state of theuser2 from the “second transition” to the “run” (step S330).
The exerciseinformation management system1 in the third embodiment explained above can achieve the same effects as the effects in the first embodiment.
Further, with the exerciseinformation management system1 in the third embodiment, theprocessing unit100 of theuser terminal3 can separately recognize a time required by theuser2 for movement from the goal point G1 of the swim to the transition area TA (a time required for the state “running movement”) and a time required by theuser2 for a change of clothes in the transition area TA, movement to the start point S2 (the riding line) of the bike, and the like (a time required for the state “first transition”). Therefore, theuser2 and the like can grasp, in detail, for example, points that should be improved in switching from the swim to the bike.
Note that, when the transition area TA and the start point S2 of the bike are located away from each other, the user 2 may similarly register, in advance, a position P6 in a point of an exit of the transition area TA, or the vicinity of the point, present on the moving route to the start point S2 of the bike. The processing unit 100 may then determine on the basis of positioning data (position information) whether the user 2 passes the position P6, determine that the user 2 is in the state “first transition” until the user 2 passes the position P6, and determine that the user 2 is in a state “running movement B” until the user 2 passes the position P2 after passing the position P6. Similarly, when the goal point G2 of the bike and the transition area TA are located away from each other, the user 2 may register, in advance, a position P7 in a point of an entrance of the transition area TA, or the vicinity of the point, present on the moving route from the goal point G2 of the bike. The processing unit 100 may determine on the basis of positioning data (position information) whether the user 2 passes the position P7, determine that the user 2 is in a state “running movement C” until the user 2 passes the position P7 after passing the position P3, and determine that the user 2 is in the state “second transition” after the user 2 passes the position P7. Similarly, when the transition area TA and the start point S3 of the run are located away from each other, the user 2 may register, in advance, a position P8 in a point of an exit of the transition area TA, or the vicinity of the point, present on the moving route to the start point S3 of the run. The processing unit 100 may determine on the basis of positioning data (position information) whether the user 2 passes the position P8, determine that the user 2 is in the state “second transition” until the user 2 passes the position P8, and determine that the user 2 is in a state “running movement D” until the user 2 passes the position P4 after passing the position P8.
4. Fourth Embodiment
The exercise information management system 1 in a fourth embodiment is explained below.
In the first embodiment, theprocessing unit100 of theuser terminal3 automatically discriminates the state of theuser2 on the basis of the positioning data (the position information) of theGPS sensor110 and the positions P1, P2, P3, and P4 registered in advance. However, in the exerciseinformation management system1 in the fourth embodiment, theprocessing unit100 determines the state of theuser2 on the basis of at least either one of an output signal of the acceleration sensor113 (an example of the “first motion sensor”) and an output signal of thepressure sensor112.
4-1. Configuration of the System
In the exercise information management system 1 in this embodiment, as shown in FIG. 17, when starting a triathlon (when starting the swim at the start point S1), the user 2 performs measurement start operation on the user terminal 3.
Theuser terminal3 discriminates, on the basis of position information obtained on the basis of satellite signals transmitted from the GPS (Global Positioning System) satellites7 (an example of the “position information satellite”) and at least either one of an output signal of the acceleration sensor113 (an example of the “first motion sensor”) (seeFIG. 4) and an output signal of the pressure sensor112 (seeFIG. 4), a plurality of states including a state “swim” (an example of the “first exercise state”) in which theuser2 is carrying out swim (an example of the “first exercise event”), a state “bike” (an example of the “second exercise state”) in which theuser2 is carrying out bike (an example of the “second exercise event”), and a state “run” (an example of the “third exercise state”) in which theuser2 is carrying out run (an example of the “third exercise event”). In this embodiment, the plurality of states discriminated by theuser terminal3 include a state “first transition” (an example of the “first transition state”) halfway in transition from the “swim” to the “bike” and a state “second transition” (an example of the “second transition state”) halfway in transition from the “bike” to the “run”. That is, in this embodiment, theuser terminal3 discriminates five states of the “swim”, the “first transition”, the “bike”, the “second transition”, and the “run”.
4-2. Configuration of the User Terminal
The configuration of the user terminal 3 in this embodiment is the same as the configuration in the embodiments explained above.
In particular, in this embodiment, the processing unit100 (the processor) performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of theclocking unit130, an elapsed time (the total elapsed time Ttotal) after theprocessing unit100 receives a signal indicating measurement start operation from theoperation unit120.
Theprocessing unit100 performs, as one kind of the data processing, processing for discriminating the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of theuser2 on the basis of positioning data generated and output by the GPS sensor110 (position information obtained on the basis of satellite signals transmitted from the GPS satellites7) and at least either one of an output signal of the acceleration sensor113 (an example of the “first motion sensor”) and an output signal of thepressure sensor112.
In general, in the swim, a stroke of the arms of theuser2 is regular (has periodicity). Therefore, a waveform of an output signal of theacceleration sensor113 is regular (has periodicity). Swimming speed (moving speed) of theuser2 is in a predetermined speed range (e.g., approximately 3 km/h). Further, since a state in which the arms of theuser2 are in the air and a state in which the arms of theuser2 are in the water are alternately repeated, thepressure sensor112 detects an air pressure and a water pressure. In the first transition, since theuser2 is performing a change of clothes and the like, the position of theuser2 hardly changes and theuser2 nearly stops (moving speed is nearly zero). In the bike, running speed (moving speed) of theuser2 is predetermined speed (e.g., 20 km/h) or more. Since theuser2 receives wind, thepressure sensor112 detects a wind pressure. In the second transition, since theuser2 is performing a change of clothes and the like, the position of theuser2 hardly changes and theuser2 nearly stops (moving speed is nearly zero). In the run, an arm swing of theuser2 is regular (has periodicity). Therefore, a waveform of an output signal of theacceleration sensor113 is regular (has periodicity). Running speed (moving speed) of theuser2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h).
Therefore, theprocessing unit100 may calculate moving speed of theuser2 on the basis of positioning data (position information) generated and output by theGPS sensor110, determine whether a waveform of an output signal of theacceleration sensor113 has periodicity, detect a change in pressure on the basis of an output signal of thepressure sensor112, and discriminate the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of theuser2 on the basis of the moving speed of theuser2, whether the waveform of the output signal of theacceleration sensor113 has periodicity, and the change in the pressure.
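The quantities this discrimination relies on (moving speed from successive fixes, periodicity of the acceleration waveform, and the pressure swing) could be computed roughly as follows; the frequency band and energy ratio used for the periodicity check are assumptions, not values given in the text.

import numpy as np

def moving_speed_kmh(prev_fix, curr_fix, dt_s: float) -> float:
    # Speed obtained by differentiating successive GPS positions (distance_m above).
    return distance_m(prev_fix, curr_fix) / dt_s * 3.6

def is_periodic(accel: np.ndarray, fs: float, band=(0.3, 3.0), ratio=0.3) -> bool:
    # Treat the acceleration waveform as regular if one frequency in `band`
    # carries at least `ratio` of the in-band spectral energy.
    x = accel - np.mean(accel)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = power[(freqs >= band[0]) & (freqs <= band[1])]
    return in_band.size > 0 and in_band.max() >= ratio * in_band.sum()

def pressure_swing(pressure: np.ndarray) -> float:
    # Difference between the maximum and minimum detected pressure, used to
    # recognize the water/air alternation or a wind pressure.
    return float(np.max(pressure) - np.min(pressure))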
Theprocessing unit100 in this embodiment can perform the data processing, the display processing, the sound output processing, and the communication processing as in the embodiments explained above.
4-3. Procedure of Processing of the User Terminal
The processing unit 100 of the user terminal 3 executes a computer program stored in the storing unit 140 (the storage medium, the ROM, or the RAM) to thereby execute the processing in the procedure of the flowchart of FIG. 5.
FIG. 18 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 inFIG. 5) in the fourth embodiment.
As shown inFIG. 18, in this embodiment, theprocessing unit100 performs swim determination processing (step S400), first transition determination processing (step S500), bike determination processing (step S600), second transition determination processing (step S700), and run determination processing (step S800) in order.
As explained above, in the swim, the stroke of the arms of theuser2 is regular (has periodicity), the swimming speed of theuser2 is in the predetermined speed range (e.g., approximately 3 km/h), and the state in which the arms of theuser2 are in the air and the state in which the arms of theuser2 are in the water are alternately repeated. Therefore, in the swim determination processing (step S400), if an acceleration waveform (an output waveform of the acceleration sensor113) is regular (has periodicity) (Y in step S401), if moving speed obtained by differentiating the positions of theuser terminal3 included in positioning data of theGPS sensor110 is approximately 3 km/h (Y in step S402), and if a water pressure and an air pressure are detected on the basis of an output signal of the pressure sensor112 (Y in step S403), theprocessing unit100 determines that theuser2 is carrying out the swim and changes the state of theuser2 from an undecided state to the “swim” (step S404).
If a cycle in which a voltage of an output signal of the acceleration sensor 113 coincides with a threshold Vt1 is substantially fixed (within a predetermined range) for a predetermined time, the processing unit 100 may determine that an acceleration waveform is regular. Vt1 only has to be decided as appropriate. If moving speed of the user terminal 3 is 3 km/h−α1 or more and 3 km/h+α2 or less, the processing unit 100 may determine that the moving speed is approximately 3 km/h. α1 and α2 only have to be decided as appropriate. Since the water pressure is larger than the air pressure by a predetermined amount or more, when pressure applied to the user terminal 3 calculated using an output signal of the pressure sensor 112 periodically changes and a difference between a maximum value and a minimum value of the pressure is equal to or larger than a threshold Pt1, the processing unit 100 may determine that the water pressure and the air pressure are detected. Pt1 only has to be decided as appropriate.
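Building on these thresholds, a minimal sketch of the swim determination (steps S401 to S403) could look like the following. The text leaves α1, α2, and Pt1 to be decided as appropriate, so the numeric values here are placeholders.

```python
# Hypothetical thresholds; the patent only says alpha1, alpha2 and Pt1 "only have to be
# decided as appropriate", so these numbers are placeholders.
ALPHA1_KMH = 1.0     # lower margin below 3 km/h
ALPHA2_KMH = 1.0     # upper margin above 3 km/h
PT1_HPA = 20.0       # minimum swing between detected water pressure and air pressure

def looks_like_swim(accel_is_periodic, speed_kmh, pressure_swing_hpa):
    """Sketch of steps S401-S403: accept "swim" only if all three conditions hold."""
    speed_ok = (3.0 - ALPHA1_KMH) <= speed_kmh <= (3.0 + ALPHA2_KMH)
    water_and_air_seen = pressure_swing_hpa >= PT1_HPA
    return accel_is_periodic and speed_ok and water_and_air_seen
```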
As explained above, in the first transition, since the user 2 is performing a change of clothes and the like, the position of the user 2 hardly changes. Therefore, in the first transition determination processing (step S500), if the moving speed of the user terminal 3 is nearly zero (the user terminal 3 nearly stops) (Y in step S501), the processing unit 100 determines that the user 2 is in the state of the first transition and changes the state of the user 2 from the "swim" to the "first transition" (step S502).
When the moving speed of the user terminal 3 is equal to or lower than a threshold β1, the processing unit 100 may determine that the user terminal 3 nearly stops. β1 only has to be decided as appropriate.
As explained above, in the bike, the running speed of theuser2 is the predetermined speed (e.g., 20 km/h) or more and theuser2 receives wind. Therefore, in the bike determination processing (step S600), if the moving speed of theuser terminal3 is 20 km/h or more (Y in step S601), and if a wind pressure is detected on the basis of the output signal of the pressure sensor112 (Y in step S602), theprocessing unit100 determines that theuser2 is carrying out the bike and changes the state of theuser2 from the “first transition” to the “bike” (step S603).
As explained above, in the second transition, since theuser2 is performing a change of clothes and the like, the position of theuser2 hardly changes. Therefore, in the second transition determination processing (step S700), if the moving speed of theuser terminal3 is nearly zero (theuser terminal3 nearly stops) (Y in step S701), theprocessing unit100 determines that theuser2 is in the state of the second transition and changes the state of theuser2 from the “bike” to the “second transition” (step S702).
As explained above, in the run, the arm swing of the user 2 is regular (has periodicity) and the running speed of the user 2 is in the predetermined speed range (e.g., 8 km/h to 20 km/h). Therefore, in the run determination processing (step S800), if an acceleration waveform (an output waveform of the acceleration sensor 113) is regular (has periodicity) (Y in step S801) and the moving speed of the user terminal 3 is 8 km/h to 20 km/h (Y in step S802), the processing unit 100 determines that the user 2 is carrying out the run and changes the state of the user 2 from the "second transition" to the "run" (step S803).
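Taken together, the determinations of FIG. 18 form a simple sequential state machine: each check can only advance the state to the next one in the race order. The sketch below assumes the individual predicates have already been evaluated; their names are illustrative and not taken from the embodiment.

```python
# Sketch of the sequential discrimination in FIG. 18: each determination only fires
# when the user is in the preceding state, so the state can only advance in order.
def advance_state(current, looks_like_swim, nearly_stopped, looks_like_bike, looks_like_run):
    if current == "undecided" and looks_like_swim:
        return "swim"
    if current == "swim" and nearly_stopped:
        return "first transition"
    if current == "first transition" and looks_like_bike:
        return "bike"
    if current == "bike" and nearly_stopped:
        return "second transition"
    if current == "second transition" and looks_like_run:
        return "run"
    return current
```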
4-4. Display Method of the State of the UserIn this embodiment, as in the embodiments explained above, the user terminal3 (the processing unit100) discriminates the plurality of states of theuser2 according to the state discrimination processing (the processing in step S14 inFIG. 5) and causes thedisplay unit150 to display the discriminated states.
4-5. Display Method of Exercise InformationIn this embodiment, as in the embodiments explained above, the user terminal3 (the processing unit100) transmits the exercise information of theuser2 generated by the generation processing of exercise information (the processing starting in step S12 and ending in step S18 inFIG. 5) to the information terminals5 (5aand5b) via thecommunication unit170.
4-6. Action and EffectsAs explained above, with the exerciseinformation management system1 in the fourth embodiment, theprocessing unit100 of theuser terminal3 can automatically discriminate the plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of theuser2 on the basis of the positioning data (the position information) of theGPS sensor110 and at least either one of the output signal of theacceleration sensor113 and the output signal of thepressure sensor112. Therefore, theuser2 does not need to perform work when an athletic event carried out by theuser2 is switched from the “swim” to the “bike” or switched from the “bike” to the “run”. Therefore, theuser2 can concentrate on the triathlon.
With the exerciseinformation management system1 in the fourth embodiment, the states of theuser2 discriminated by theprocessing unit100 of theuser terminal3 are displayed on thedisplay unit150 of theuser terminal3 together with the elapsed times. Therefore, theuser2 is capable of recognizing the displayed elapsed times of the states and performing adjustment such as an increase and a reduction in a pace. The states of theuser2 discriminated by theprocessing unit100 of theuser terminal3 are displayed on the display unit of theinformation terminal5atogether with the elapsed times. Therefore, a person (e.g., a coach) carrying theinformation terminal5acan recognize the elapsed times of the states of theuser2 and give advice or the like such as an increase or a reduction in a pace to theuser2. In particular, theuser2 and the like can separately recognize a time required for the state “swim” and a time required for the state “first transition”. Therefore, theuser2 and the like can recognize whether a pace of the swim is faster or slower than an assumption and whether a time required for switching from the swim to the bike is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the bike. Similarly, theuser2 and the like can separately recognize a time required for the state “bike” and a time required for the state “second transition”. Therefore, theuser2 and the like can recognize whether a pace of the bike is faster or slower than an assumption and whether a time required for switching from the bike to the run is longer or shorter than an assumption and appropriately determine whether the pace should be increased or reduced in the run.
5. Fifth EmbodimentThe exerciseinformation management system1 in a fifth embodiment is explained below.
In the exerciseinformation management system1 in the fifth embodiment, a detailed procedure of the state discrimination processing (step S14 inFIG. 5) for discriminating the state of theuser2 by theprocessing unit100 of theuser terminal3 is different from the procedure in the fourth embodiment.
In the fifth embodiment, theprocessing unit100 performs processing for discriminating a plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of theuser2 on the basis of positioning data (position information) generated and output by theGPS sensor110 and an output signal of theacceleration sensor113 and an output signal of thepressure sensor112. As in the fourth embodiment, in the fifth embodiment, theprocessing unit100 performs the swim determination processing (step S400), the first transition determination processing (step S500), the bike determination processing (step S600), the second transition determination processing (step S700), and the run determination processing (step S800) in order.
As shown inFIG. 19, as in the fourth embodiment, in the swim determination processing (step S400), if an acceleration waveform (an output waveform of the acceleration sensor113) is regular (has periodicity) (Y in step S411), if moving speed obtained by differentiating the positions of theuser terminal3 included in positioning data of theGPS sensor110 is approximately 3 km/h (Y in step S412), and if a water pressure and an air pressure are detected on the basis of an output signal of the pressure sensor112 (Y in step S413), theprocessing unit100 determines that theuser2 is carrying out the swim and changes the state of theuser2 from an undecided state to the “swim” (step S414).
In the first transition, since theuser2 is performing a change of clothes and the like, the movement of the arms of theuser2 is irregular (does not have periodicity) and a waveform of an output signal of theacceleration sensor113 is irregular (does not have periodicity). The position of theuser2 hardly changes and theuser2 nearly stops (moving speed is nearly zero). Further, since the arms of theuser2 are always in the air, thepressure sensor112 detects only an air pressure.
Therefore, as shown inFIG. 20, in the first transition determination processing (step S500), if an acceleration waveform (an output waveform of the acceleration sensor113) is irregular (does not have periodicity) (Y in step S511), if moving speed of theuser terminal3 is nearly zero (theuser terminal3 nearly stops) (Y in S512), and if only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S513), theprocessing unit100 determines that theuser2 is in the state of the first transition and changes the state of theuser2 from the “swim” to the “first transition” (step S514).
When a cycle in which a voltage of an output signal of theacceleration sensor113 coincides with a threshold Vt2 is not substantially fixed (within a predetermined range) for a predetermined time or when a state in which the voltage is smaller than the threshold Vt2 is continued for a predetermined time, theprocessing unit100 may determine that the acceleration waveform is irregular. Vt2 only has to be decided as appropriate. When a state in which pressure applied to theuser terminal3 calculated using an output signal of thepressure sensor112 is smaller than a threshold Pt2 is continued for a predetermined time, theprocessing unit100 may determine that only an air pressure is detected. Pt2 only has to be decided as appropriate.
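A minimal sketch of these two checks is shown below; Vt2, Pt2, and the window parameters are placeholders for values the text leaves to be decided as appropriate.

```python
# Sketch of the two checks described for the first transition in the fifth embodiment.
# vt2 and pt2 stand in for the thresholds Vt2 and Pt2; their values are placeholders.
def accel_is_irregular(samples, vt2, tolerance=0.15, quiet_fraction=0.9):
    """Irregular if crossing intervals are not roughly constant, or the signal
    stays below vt2 for most of the window."""
    below = sum(1 for s in samples if s < vt2)
    if below >= quiet_fraction * len(samples):
        return True
    crossings = [i for i in range(1, len(samples))
                 if samples[i - 1] < vt2 <= samples[i]]
    if len(crossings) < 3:
        return True
    intervals = [b - a for a, b in zip(crossings, crossings[1:])]
    mean = sum(intervals) / len(intervals)
    return any(abs(iv - mean) > tolerance * mean for iv in intervals)

def only_air_pressure(pressures, pt2):
    """Only air pressure if every pressure sample stays below the water threshold pt2."""
    return all(p < pt2 for p in pressures)
```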
In the bike, the movement of the arms of theuser2 is irregular (does not have periodicity). Therefore, a waveform of an output signal of theacceleration sensor113 is irregular (does not have periodicity). Running speed (moving speed) of theuser2 is predetermined speed (e.g., 20 km/h) or more. Since theuser2 receives wind, thepressure sensor112 detects a wind pressure. Therefore, as shown inFIG. 21, in the bike determination processing (step S600), if an acceleration waveform (an output waveform of the acceleration sensor113) is irregular (does not have periodicity) (Y in step S611), if moving speed of theuser terminal3 is 20 km/h or more (Y in step S612), and if a wind pressure is detected on the basis of an output signal of the pressure sensor112 (Y in step S613), theprocessing unit100 determines that theuser2 is carrying out the bike and changes the state of theuser2 from the “first transition” to the “bike” (step S614).
In the second transition, since theuser2 is performing a change of clothes and the like, the movement of the arms of theuser2 is irregular (does not have periodicity) and a waveform of an output signal of theacceleration sensor113 is irregular (does not have periodicity). The position of theuser2 hardly changes and theuser2 nearly stops (moving speed is nearly zero). Further, since the arms of theuser2 are always in the air, thepressure sensor112 detects only an air pressure. Therefore, as shown inFIG. 22, in the second transition determination processing (step S700), if an acceleration waveform (an output waveform of the acceleration sensor113) is irregular (does not have periodicity) (Y in step S711), if moving speed of theuser terminal3 is nearly zero (theuser terminal3 nearly stops) (Y in step S712), and if only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S713), theprocessing unit100 determines that theuser2 is in the state of the second transition and changes the state of theuser2 from the “bike” to the “second transition” (step S714).
In the run, an arm swing of theuser2 is regular (has periodicity). Therefore, a waveform of an output signal of theacceleration sensor113 is regular (has periodicity). Running speed (moving speed) of theuser2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h). Further, since the arms of theuser2 are always in the air, thepressure sensor112 detects only an air pressure. Therefore, as shown inFIG. 23, in the run determination processing (step S800), if an acceleration waveform (an output waveform of the acceleration sensor113) is regular (has periodicity) (Y in step S811), if moving speed of theuser terminal3 is 8 km/h to20 km/h (Y in step S812), and if only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S813), theprocessing unit100 determines that theuser2 is carrying out the run and changes the state of theuser2 from the “second transition” to the “run” (step S814).
The exerciseinformation management system1 in the fifth embodiment explained above can achieve the same effects as the effects in the fourth embodiment.
Further, with the exerciseinformation management system1 in the fifth embodiment, theprocessing unit100 of theuser terminal3 discriminates the states assuming that the state of theuser2 is switched when all of a condition concerning positioning data (position information) of theGPS sensor110, a condition concerning an output signal of theacceleration sensor113, and a condition concerning an output signal of thepressure sensor112 are satisfied. Therefore, compared with the fourth embodiment in which the state of theuser2 is switched when only a part of the conditions are satisfied, it is possible to more accurately discriminate the state of theuser2.
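The difference between the two embodiments can be summarized in a short sketch for the bike determination; the predicate names are illustrative, and the 20 km/h figure is the example value given in the text.

```python
# Sketch contrasting the fourth and fifth embodiments for the bike determination.
def bike_detected_4th(speed_kmh, wind_pressure_seen):
    # Fourth embodiment (FIG. 18): speed condition plus the wind-pressure condition.
    return speed_kmh >= 20.0 and wind_pressure_seen

def bike_detected_5th(accel_is_irregular, speed_kmh, wind_pressure_seen):
    # Fifth embodiment (FIG. 21): acceleration, speed, and pressure must all agree.
    return accel_is_irregular and speed_kmh >= 20.0 and wind_pressure_seen
```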
6. Sixth EmbodimentIn the exerciseinformation management system1 in a sixth embodiment, a detailed procedure of the state discrimination processing (step S14 inFIG. 5) for discriminating a state of theuser2 by theprocessing unit100 of theuser terminal3 is different from the procedures in the fourth and fifth embodiments.
In the sixth embodiment, theprocessing unit100 performs processing for discriminating a plurality of states “swim”, “first transition”, “bike”, “second transition”, and “run” of theuser2 on the basis of positioning data (position information) generated and output by theGPS sensor110, at least either one of an output signal of theacceleration sensor113 and an output signal of thepressure sensor112, and at least either one of an output signal of the angular velocity sensor114 (an example of a “second motion sensor”) and an output signal of thetemperature sensor116.
In general, in the swim, a stroke of the arms of theuser2 is regular (has periodicity). Therefore, a waveform of an output signal of theangular velocity sensor114 is regular (has periodicity). A state in which the arms of theuser2 are in the air and a state in which the arms of theuser2 are in the water are alternately repeated. Therefore, thetemperature sensor116 detects a water temperature. In the first transition, since theuser2 is performing a change of clothes and the like, the movement of the arms of theuser2 is irregular (does not have periodicity) and a waveform of an output signal of theangular velocity sensor114 is irregular (does not have periodicity). Since the arms of theuser2 are always in the air, thetemperature sensor116 detects an air temperature and a body temperature of theuser2. In the bike, the movement of the arms of theuser2 is irregular (does not have periodicity). Therefore, a waveform of an output signal of theangular velocity sensor114 is irregular (does not have periodicity). Since the arms of theuser2 are always in the air, thetemperature sensor116 detects an air temperature and a body temperature of theuser2. In the second transition, since theuser2 is performing a change of clothes and the like, the movement of the arms of theuser2 is irregular (does not have periodicity) and a waveform of an output signal of theangular velocity sensor114 is irregular (does not have periodicity). Since the arms of theuser2 are always in the air, thetemperature sensor116 detects an air temperature and a body temperature of theuser2. In the run, an arm swing of theuser2 is regular (has periodicity). Therefore, a waveform of an output signal of theangular velocity sensor114 is regular (has periodicity). Since the arms of theuser2 are always in the air, thetemperature sensor116 detects an air temperature and a body temperature of theuser2.
Therefore, in addition to the same processing as the processing in the fifth embodiment, theprocessing unit100 may further determine whether a waveform of an output signal of theangular velocity sensor114 has periodicity and discriminate the plurality of states of theuser2 on the basis of the moving speed of theuser2, whether the waveform of the output signal of theacceleration sensor113 has periodicity, the change in pressure, and whether the waveform of the output signal of theangular velocity sensor114 has periodicity. Alternatively, in addition to the same processing as the processing in the fifth embodiment, theprocessing unit100 may further detect a change in temperature on the basis of the output signal of thetemperature sensor116 and discriminate the plurality of states of theuser2 on the basis of the moving speed of theuser2, whether the waveform of the output signal of theacceleration sensor113 has periodicity, the change in pressure, and the change in the temperature. Alternatively, theprocessing unit100 may discriminate the plurality of states of theuser2 on the basis of the moving speed of theuser2, whether the waveform of the output signal of theacceleration sensor113 has periodicity, the change in pressure, whether the waveform of the output signal of theangular velocity sensor114 has periodicity, and the change in temperature.
As in the fourth and fifth embodiments, in the sixth embodiment, theprocessing unit100 performs the swim determination processing (step S400), the first transition determination processing (step S500), the bike determination processing (step S600), the second transition determination processing (step S700), and the run determination processing (step S800) in order.
As explained above, in the swim, the stroke of the arms of theuser2 is regular (has periodicity), the swimming speed of theuser2 is in the predetermined speed range (e.g., approximately 3 km/h), and the state in which the arms of theuser2 are in the air and the state in which the arms of theuser2 are in the water are alternately repeated.
Therefore, as shown inFIG. 24, in the swim determination processing (step S400), first, theprocessing unit100 resets a count value of a not-shown counter to 0 (step S421).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is regular (has periodicity) (Y in step S422), theprocessing unit100 increments the count value by 1 (step S423).
If moving speed obtained by differentiating the positions of the user terminal 3 included in positioning data of the GPS sensor 110 is approximately 3 km/h (Y in step S424), the processing unit 100 increments the count value by 1 (step S425).
When a water pressure and an air pressure are detected on the basis of an output signal of the pressure sensor112 (Y in step S426), theprocessing unit100 increments the count value by 1 (step S427).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is regular (has periodicity) (Y in step S428), theprocessing unit100 increments the count value by 1 (step S429). If a cycle in which a voltage of an output signal of theangular velocity sensor114 coincides with a threshold Vt3 is substantially fixed (within a predetermined range) for a predetermined time, theprocessing unit100 may determine that the angular velocity waveform is regular. Vt3 only has to be decided as appropriate.
If a water temperature is detected on the basis of an output signal of the temperature sensor116 (Y in S430), theprocessing unit100 increments the count value by 1 (step S431).
If the count value is less than 3 (N in step S432), theprocessing unit100 performs the processing in step S421 and subsequent steps again. If the count value is 3 or more (Y in step S432), theprocessing unit100 determines that theuser2 is carrying out the swim and changes the state of theuser2 from the undecided state to the “swim” (step S433).
Note that, in the flowchart ofFIG. 24, the order of the determinations in steps S422, S424, S426, S428, and S430 may be changed as appropriate.
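A compact sketch of this counting scheme is shown below; it assumes the five individual conditions have already been evaluated as booleans, which is an illustrative simplification of the flowcharts in FIGS. 24 to 28.

```python
# Sketch of the counting scheme in FIGS. 24-28: each sensor condition that holds adds
# one to a counter, and the state changes once at least three of the five conditions agree.
def vote_and_decide(conditions, required=3):
    """conditions: iterable of booleans for the five checks (acceleration, speed,
    pressure, angular velocity, temperature). Returns True when the state should change."""
    count = sum(1 for ok in conditions if ok)
    return count >= required

# Illustrative call for the swim determination (the booleans are assumed to exist):
# vote_and_decide([accel_periodic, speed_about_3kmh, water_and_air, gyro_periodic, water_temp])
```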
As explained above, in the first transition, theuser2 is performing a change of clothes and the like. Therefore, the movement of the arms of theuser2 is irregular (does not have periodicity), the position of theuser2 hardly changes, and the arms of theuser2 are always in the air.
Therefore, as shown inFIG. 25, in the first transition determination processing (step S500), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S521).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is irregular (does not have periodicity) (Y in step S522), theprocessing unit100 increments the count value by 1 (step S523).
If moving speed of theuser terminal3 is nearly zero (theuser terminal3 nearly stops) (Y in step S524), theprocessing unit100 increments the count value by 1 (step S525).
If only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S526), theprocessing unit100 increments the count value by 1 (step S527).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is irregular (does not have periodicity) (Y in step S528), theprocessing unit100 increments the count value by 1 (step S529). If a cycle in which a voltage of an output signal of theangular velocity sensor114 coincides with a threshold Vt4 is not substantially fixed (within a predetermined range) for a predetermined time or a state in which the voltage is smaller than the threshold Vt4 is continued for a predetermined time, theprocessing unit100 may determine that the angular velocity waveform is irregular. Vt4 only has to be decided as appropriate.
If an air temperature and a body temperature of theuser2 are detected on the basis of an output signal of the temperature sensor116 (Y in S530), theprocessing unit100 increments the count value by 1 (step S531).
If the count value is less than 3 (N in step S532), theprocessing unit100 performs the processing in step S521 and subsequent steps again. If the count value is 3 or more (Y in step S532), theprocessing unit100 determines that theuser2 is in the state of the first transition and changes the state of theuser2 from the “swim” to the “first transition” (step S533).
Note that, in the flowchart ofFIG. 25, the order of the determinations in steps S522, S524, S526, S528, and S530 may be changed as appropriate.
As explained above, in the bike, the movement of the arms of theuser2 is irregular (does not have periodicity), the running speed of theuser2 is the predetermined speed (e.g., 20 km/h) or more, theuser2 receives wind, and the arms of theuser2 are always in the air.
Therefore, as shown inFIG. 26, in the bike determination processing (step S600), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S621).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is irregular (does not have periodicity) (Y in step S622), theprocessing unit100 increments the count value by 1 (step S623).
If moving speed of theuser terminal3 is 20 km/h or more (Y in step S624), theprocessing unit100 increments the count value by 1 (step S625).
If a wind pressure is detected on the basis of an output signal of the pressure sensor112 (Y in step S626), theprocessing unit100 increments the count value by 1 (step S627).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is irregular (does not have periodicity) (Y in step S628), theprocessing unit100 increments the count value by 1 (step S629).
If an air temperature and a body temperature of theuser2 are detected on the basis of an output signal of the temperature sensor116 (Y in step S630), theprocessing unit100 increments the count value by 1 (step S631).
If the count value is less than 3 (N in step S632), theprocessing unit100 performs the processing in step S621 and subsequent steps again. If the count value is 3 or more (Y in step S632), theprocessing unit100 determines that theuser2 is carrying out the bike and changes the state of theuser2 from the “first transition” to the “bike” (step S633).
Note that, in the flowchart of FIG. 26, the order of the determinations in steps S622, S624, S626, S628, and S630 may be changed as appropriate.
As explained above, in the second transition, theuser2 is performing a change of clothes and the like. Therefore, the movement of the arms of theuser2 is irregular (does not have periodicity), the position of theuser2 hardly changes, and the arms of theuser2 are always in the air.
As shown inFIG. 27, in the second transition determination processing (step S700), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S721).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is irregular (does not have periodicity) (Y in step S722), theprocessing unit100 increments the count value by 1 (step S723).
If moving speed of theuser terminal3 is nearly zero (theuser terminal3 nearly stops) (Y in step S724), theprocessing unit100 increments the count value by 1 (step S725).
If only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S726), theprocessing unit100 increments the count value by 1 (step S727).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is irregular (does not have periodicity) (Y in step S728), theprocessing unit100 increments the count value by 1 (step S729).
If an air temperature and a body temperature of theuser2 are detected on the basis of an output signal of the temperature sensor116 (Y in step S730), theprocessing unit100 increments the count value by 1 (step S731).
If the count value is less than 3 (N in step S732), theprocessing unit100 performs the processing in step S721 and subsequent steps again. If the count value is 3 or more (Y in step S732), theprocessing unit100 determines that theuser2 is in the state of the second transition and changes the state of theuser2 from the “bike” to the “second transition” (step S733).
Note that, in the flowchart of FIG. 27, the order of the determinations in steps S722, S724, S726, S728, and S730 may be changed as appropriate.
As explained above, in the run, the arm swing of the user 2 is regular (has periodicity), the running speed of the user 2 is in the predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of the user 2 are always in the air.
Therefore, as shown inFIG. 28, in the run determination processing (step S800), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S821).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is regular (has periodicity) (Y in step S822), theprocessing unit100 increments the count value by 1 (step S823).
If moving speed of theuser terminal3 is 8 km/h to 20 km/h (Y in step S824), theprocessing unit100 increments the count value by 1 (step S825).
If only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S826), theprocessing unit100 increments the count value by 1 (step S827).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is regular (has periodicity) (Y in step S828), theprocessing unit100 increments the count value by 1 (step S829).
If an air temperature and a body temperature of the user 2 are detected on the basis of an output signal of the temperature sensor 116 (Y in step S830), the processing unit 100 increments the count value by 1 (step S831).
If the count value is less than 3 (N in step S832), theprocessing unit100 performs the processing in step S821 and subsequent steps again. If the count value is 3 or more (Y in step S832), theprocessing unit100 determines that theuser2 is carrying out the run and changes the state of theuser2 from the “second transition” to the “run” (step S833).
Note that, in the flowchart ofFIG. 28, the order of the determinations in steps S822, S824, S826, S828, and S830 may be changed as appropriate.
The exerciseinformation management system1 in the sixth embodiment explained above can achieve the same effects as the effects in the fourth embodiment or the fifth embodiment.
Further, with the exercise information management system 1 in the sixth embodiment, the processing unit 100 of the user terminal 3 discriminates the states assuming that the state of the user 2 is switched when three or more conditions among a condition concerning positioning data (position information) of the GPS sensor 110, a condition concerning an output signal of the acceleration sensor 113, a condition concerning an output signal of the pressure sensor 112, a condition concerning an output signal of the angular velocity sensor 114, and a condition concerning an output signal of the temperature sensor 116 are satisfied. Therefore, it is possible to more accurately discriminate the state of the user 2.
7. Seventh EmbodimentIn the exerciseinformation management system1 in a seventh embodiment, a detailed procedure of the state discrimination processing (step S14 inFIG. 5) for discriminating the state of theuser2 by theprocessing unit100 of theuser terminal3 is different from the procedures in the fourth to sixth embodiments.
In the fourth to sixth embodiments, theprocessing unit100 calculates the elapsed time Ttran1 from the start to the end of the state “first transition” as the time equivalent to the first transition time (the elapsed time from the time when theuser2 passes the goal point G1 of the swim until theuser2 passes the start point S2 of the bike). Similarly, theprocessing unit100 calculates the elapsed time Ttran2 from the start to the end of the state “second transition” as the time equivalent to the second transition time (the elapsed time from the time when theuser2 passes the goal point G2 of the bike until theuser2 passes the start point S3 of the run).
Actually, the first transition time is a sum of a time in which the user 2 moves from the goal point G1 of the swim to the transition area TA, a time required for a change of clothes and the like in the transition area TA, and a time in which the user 2 moves to the start point S2 (the riding line) of the bike. The second transition time is a sum of a time in which the user 2 moves from the goal point G2 (an alighting line) of the bike to a clothes change place in the transition area TA, a time required for a change of clothes and the like, and a time in which the user 2 moves to the start point S3 of the run.
Therefore, in the seventh embodiment, a state halfway in transition from the “swim” to the “bike” consists of three states, that is, a state “running movement A” in which theuser2 is moving from the goal point G1 of the swim to the transition area TA, a state “first transition” in which theuser2 is performing a change of clothes and the like in the transition area TA, and a state “running movement B” in which theuser2 is moving from the transition area TA to the start point S2 of the bike. Theprocessing unit100 of theuser terminal3 discriminates the three states “running movement A”, “first transition”, and “running movement B” respectively as separate states. Similarly, a state halfway in transition from the “bike” to the “run” consists of three states, that is, a state “running movement C” in which theuser2 moves from the goal point G2 of the bike to the transition area TA, a state “second transition” in which theuser2 is performing a change of clothes and the like in the transition area TA, and a state “running movement D” in which theuser2 is moving from the transition area TA to the start point S3 of the run. Theprocessing unit100 of theuser terminal3 discriminates the three states “running movement C”, “second transition”, and “running movement D” respectively as separate states.
That is, in the seventh embodiment, the processing unit 100 performs, as one kind of the data processing, processing for discriminating a plurality of states "swim", "running movement A", "first transition", "running movement B", "bike", "running movement C", "second transition", "running movement D", and "run" of the user 2 on the basis of positioning data (position information) of the GPS sensor 110, an output signal of the acceleration sensor 113, an output signal of the pressure sensor 112, an output signal of the angular velocity sensor 114, and an output signal of the temperature sensor 116.
Theprocessing unit100 performs, as one kind of the data processing, processing for measuring, on the basis of an output signal of theclocking unit130, the elapsed time Tswim of the state “swim”, an elapsed time TmoveA of the state “running movement A”, the elapsed time Ttran1 of the state “first transition”, an elapsed time TmoveB of the state “running movement B”, the elapsed time Tbike of the state “bike”, an elapsed time TmoveC of the state “running movement C”, the elapsed time Ttran2 of the state “second transition”, an elapsed time TmoveD of the state “running movement D”, and the elapsed time Trun of the state “run”.
The processing unit 100 performs, as one kind of the data processing, processing for ending, when receiving a signal indicating measurement end operation from the operation unit 120, the measurement processing of the total elapsed time Ttotal, the discrimination processing of the "swim", the "running movement A", the "first transition", the "running movement B", the "bike", the "running movement C", the "second transition", the "running movement D", and the "run", and the measurement processing of the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun of the states and causing the incorporated storing unit 140 to store the total elapsed time Ttotal and the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun.
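A minimal sketch of this per-state elapsed-time bookkeeping is shown below, modeling the clocking unit simply as timestamps in seconds; the class and method names are illustrative and not part of the embodiment.

```python
# Sketch of elapsed-time bookkeeping for the nine states of the seventh embodiment.
SEVENTH_EMBODIMENT_STATES = [
    "swim", "running movement A", "first transition", "running movement B",
    "bike", "running movement C", "second transition", "running movement D", "run",
]

class ElapsedTimes:
    def __init__(self, start_timestamp):
        self.state = None
        self.state_started = start_timestamp
        self.total_started = start_timestamp
        self.per_state = {name: 0.0 for name in SEVENTH_EMBODIMENT_STATES}

    def change_state(self, new_state, timestamp):
        # Close out the elapsed time of the state being left, then start the new one.
        if self.state is not None:
            self.per_state[self.state] += timestamp - self.state_started
        self.state, self.state_started = new_state, timestamp

    def stop(self, timestamp):
        """Measurement end operation: close the current state and return the totals."""
        self.change_state(None, timestamp)
        return timestamp - self.total_started, dict(self.per_state)
```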
The processing unit 100 may perform, as one kind of the display processing, processing for causing the display unit 150 to display at least one of the plurality of states "swim", "running movement A", "first transition", "running movement B", "bike", "running movement C", "second transition", "running movement D", and "run" of the user 2. In this case, the display unit 150 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
Theprocessing unit100 may perform, as one kind of the display processing, processing for causing thedisplay unit150 to display at least a part of the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of theuser2.
The processing unit 100 may perform, as one kind of the sound output processing, processing for causing the sound output unit 160 to output, as sound, at least one of the plurality of states "swim", "running movement A", "first transition", "running movement B", "bike", "running movement C", "second transition", "running movement D", and "run" of the user 2. In this case, the sound output unit 160 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
The processing unit 100 may perform, as one kind of the communication processing, processing for transmitting at least one of the plurality of states "swim", "running movement A", "first transition", "running movement B", "bike", "running movement C", "second transition", "running movement D", and "run" of the user 2 to the information terminals 5 (5a and 5b) via the communication unit 170. In this case, the communication unit 170 functions as a notifying unit that notifies a state discriminated by the processing unit 100.
Theprocessing unit100 may perform, as one kind of the communication processing, processing for transmitting the exercise information (the total elapsed time Ttotal, the elapsed times Tswim, TmoveA, Ttran1, TmoveB, Tbike, TmoveC, Ttran2, TmoveD, and Trun, the discriminated state, the speed, the pace, the distance, the track, the pulse rate, the heart rate, the pitch, the swimming stroke, the run stride, etc.) of theuser2 to the information terminals5 (5aand5b) via thecommunication unit170.
As in the fourth to sixth embodiments, in the seventh embodiment, theprocessing unit100 executes the computer program stored in thestoring unit140 to thereby, for example, execute the processing in the procedure of the flowchart ofFIG. 5.
FIG. 29 is a flowchart for explaining an example of details of the state discrimination processing (the processing in step S14 inFIG. 5) in the seventh embodiment.
As shown in FIG. 29, the processing unit 100 performs the swim determination processing (step S400), running movement A determination processing (step S450), the first transition determination processing (step S500), running movement B determination processing (step S550), the bike determination processing (step S600), running movement C determination processing (step S650), the second transition determination processing (step S700), running movement D determination processing (step S750), and the run determination processing (step S800) in order. The detailed procedures of the swim determination processing (step S400), the first transition determination processing (step S500), the bike determination processing (step S600), the second transition determination processing (step S700), and the run determination processing (step S800) are the same as the procedures in the sixth embodiment (FIGS. 24 to 28). Therefore, illustration and explanation of the flowcharts for these detailed procedures are omitted.
In the state “running movement A” in which theuser2 is moving from the goal point G1 of the swim to the transition area TA, an arm swing of theuser2 is regular (has periodicity), running speed of theuser2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of theuser2 are always in the air.
Therefore, as shown inFIG. 30, in the running movement A determination processing (step S450), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S451).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is regular (has periodicity) (Y in step S452), theprocessing unit100 increments the count value by 1 (step S453).
If moving speed of theuser terminal3 is 8 km/h to 20 km/h (Y in step S454), theprocessing unit100 increments the count value by 1 (step S455).
If only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S456), theprocessing unit100 increments the count value by 1 (step S457).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is regular (has periodicity) (Y in step S458), theprocessing unit100 increments the count value by 1 (step S459).
If an air temperature and a body temperature of theuser2 are detected on the basis of an output signal of the temperature sensor116 (Y in S460), theprocessing unit100 increments the count value by 1 (step S461).
If the count value is less than 3 (N in step S462), theprocessing unit100 performs the processing in step S451 and subsequent steps again. If the count value is 3 or more (Y in step S462), theprocessing unit100 determines that theuser2 is in the state of the running movement A and changes the state of theuser2 from the “swim” to the “running movement A” (step S463).
Note that, in the flowchart ofFIG. 30, the order of the determinations in steps S452, S454, S456, S458, and S460 may be changed as appropriate.
In the state “running movement B” in which theuser2 is moving from the transition area TA to the start point S2 of the bike, since theuser2 grips a handle of a bicycle and runs, the arms of theuser2 slightly vibrate, running speed of theuser2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of theuser2 are always in the air.
Therefore, as shown inFIG. 31, in the running movement B determination processing (step S550), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S551).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is oscillatory (Y in step S552), theprocessing unit100 increments the count value by 1 (step S553). If a cycle in which a voltage of an output signal of theacceleration sensor113 coincides with a threshold Vt5 is within a predetermined range continuously for a predetermined time, theprocessing unit100 may determine that the acceleration waveform is oscillatory. Vt5 only has to be decided as appropriate.
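A minimal sketch of this "oscillatory" check is shown below; Vt5 and the interval bound are placeholders for values the text leaves open.

```python
# Sketch of the "oscillatory" check for running movement B: crossings of the threshold
# vt5 keep occurring at short, bounded intervals (small vibration from gripping the
# bicycle handle). vt5 and max_interval are placeholders.
def accel_is_oscillatory(samples, vt5, max_interval=10):
    crossings = [i for i in range(1, len(samples))
                 if samples[i - 1] < vt5 <= samples[i]]
    if len(crossings) < 3:
        return False
    return all((b - a) <= max_interval for a, b in zip(crossings, crossings[1:]))
```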
If moving speed of theuser terminal3 is 8 km/h to 20 km/h (Y in step S554), theprocessing unit100 increments the count value by 1 (step S555).
If only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S556), theprocessing unit100 increments the count value by 1 (step S557).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is irregular (does not have periodicity) (Y in step S558), theprocessing unit100 increments the count value by 1 (step S559).
If an air temperature and a body temperature of theuser2 are detected on the basis of an output signal of the temperature sensor116 (Y in S560), theprocessing unit100 increments the count value by 1 (step S561).
If the count value is less than 3 (N in step S562), theprocessing unit100 performs the processing in step S551 and subsequent steps again. If the count value is 3 or more (Y in step S562), theprocessing unit100 determines that theuser2 is in the state of the running movement B and changes the state of theuser2 from the “first transition” to the “running movement B” (step S563).
Note that, in the flowchart ofFIG. 31, the order of the determinations in steps S552, S554, S556, S558, and S560 may be changed as appropriate.
In the state “running movement C” in which theuser2 is moving from the goal point G2 of the bike to the transition area TA, since theuser2 grips the handle of the bicycle and runs, the arms of theuser2 slightly vibrate, running speed of theuser2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of theuser2 are always in the air.
Therefore, as shown inFIG. 32, in the running movement C determination processing (step S650), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S651).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is oscillatory (Y in step S652), theprocessing unit100 increments the count value by 1 (step S653).
If moving speed of theuser terminal3 is 8 km/h to 20 km/h (Y in step S654), theprocessing unit100 increments the count value by 1 (step S655).
If only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S656), theprocessing unit100 increments the count value by 1 (step S657).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is irregular (does not have periodicity) (Y in step S658), theprocessing unit100 increments the count value by 1 (step S659).
If an air temperature and a body temperature of theuser2 are detected on the basis of an output signal of the temperature sensor116 (Y in S660), theprocessing unit100 increments the count value by 1 (step S661).
If the count value is less than 3 (N in step S662), theprocessing unit100 performs the processing in step S651 and subsequent steps again. If the count value is 3 or more (Y in step S662), theprocessing unit100 determines that theuser2 is in the state of the running movement C and changes the state of theuser2 from the “bike” to the “running movement C” (step S663).
Note that, in the flowchart ofFIG. 32, the order of the determinations in steps S652, S654, S656, S658, and S660 may be changed as appropriate.
In the state “running movement D” in which theuser2 is moving from the transition area TA to the start point S3 of the run, an arm swing of theuser2 is regular (has periodicity), running speed of theuser2 is in a predetermined speed range (e.g., 8 km/h to 20 km/h), and the arms of theuser2 are always in the air.
Therefore, as shown inFIG. 33, in the running movement D determination processing (step S750), first, theprocessing unit100 resets a count value of the not-shown counter to 0 (step S751).
Subsequently, if an acceleration waveform (an output waveform of the acceleration sensor113) is regular (has periodicity) (Y in step S752), theprocessing unit100 increments the count value by 1 (step S753).
If moving speed of theuser terminal3 is 8 km/h to 20 km/h (Y in step S754), theprocessing unit100 increments the count value by 1 (step S755).
If only an air pressure is detected on the basis of an output signal of the pressure sensor112 (a water pressure is not detected) (Y in step S756), theprocessing unit100 increments the count value by 1 (step S757).
If an angular velocity waveform (an output waveform of the angular velocity sensor114) is regular (has periodicity) (Y in step S758), theprocessing unit100 increments the count value by 1 (step S759).
If an air temperature and a body temperature of theuser2 are detected on the basis of an output signal of the temperature sensor116 (Y in S760), theprocessing unit100 increments the count value by 1 (step S761).
If the count value is less than 3 (N in step S762), theprocessing unit100 performs the processing in step S751 and subsequent steps again. If the count value is 3 or more (Y in step S762), theprocessing unit100 determines that theuser2 is in the state of the running movement D and changes the state of theuser2 from the “second transition” to the “running movement D” (step S763).
Note that, in the flowchart ofFIG. 33, the order of the determinations in steps S752, S754, S756, S758, and S760 may be changed as appropriate.
The exerciseinformation management system1 in the seventh embodiment explained above can achieve the same effects as the effects in the fourth embodiment, the fifth embodiment, or the sixth embodiment.
Further, with the exerciseinformation management system1 in the seventh embodiment, theprocessing unit100 of theuser terminal3 can separately recognize a time required by theuser2 for movement from the goal point G1 of the swim to the transition area TA (a time required for the state “running movement A”), a time required by theuser2 for a change of clothes and the like in the transition area TA (a time required for the state “first transition”), and a time required by theuser2 for movement from the transition area TA to the start point S2 of the bike (a time required for the state “running movement B”). Therefore, theuser2 and the like can grasp, in detail, for example, points that should be improved in switching from the swim to the bike.
Similarly, with the exerciseinformation management system1 in the seventh embodiment, theprocessing unit100 of theuser terminal3 can separately recognize a time required by theuser2 for movement from the goal point G2 of the bike to the transition area TA (a time required for the state “running movement C”), a time required by theuser2 for a change of clothes and the like in the transition area TA (a time required for the state “second transition”), and a time required by theuser2 for movement from the transition area TA to the start point S3 of the run (a time required for the state “running movement D”). Therefore, theuser2 and the like can grasp, in detail, for example, points that should be improved in switching from the bike to the run.
8. ModificationsThe present invention is not limited to the embodiments. Various modifications are possible within the scope of the gist of the invention. Modifications are explained below. Note that the same components as the components in the embodiments are denoted by the same reference numerals and signs, and redundant explanation of the components is omitted.
For example, theprocessing unit100 of theuser terminal3 may discriminate the plurality of states of theuser2 and, when the discriminated state is the “first transition” or the “second transition”, generate an image including a plurality of flashing objects and cause thedisplay unit150 or the display unit of the information terminal5 (5a) to display the image. InFIGS. 34 and 35, examples of images at the time when the state of theuser2 is the “first transition” in this modification are shown.
In the example shown inFIG. 34, in images A6-1 to A6-5 at the time when the state of theuser2 is the “first transition”, three triangular objects, which are parts of an object OB6 for reminding that theuser2 is transitioning from the swim to the bike, are flashed at cycles different from one another. Specifically, the three triangular objects are extinguished for one second (the image A6-1), subsequently, only the triangular object at the left end is lit for one second (the image A6-2), only two triangular objects from the left end are lit for one second (the image A6-3), the three triangular objects are lit for one second (the image A6-4), and the three triangular objects are extinguished for one second again (the image A6-5). Although not shown in the figure, similarly, in images at the time when the state of theuser2 is the “second transition”, three triangular objects, which are parts of an object for reminding that theuser2 is transitioning from the bike to the run, may be flashed at cycles different from one another.
In the example shown inFIG. 35, in images A7-1 to A7-4 at the time when the state of theuser2 is the “first transition”, three triangular objects, which are parts of an object OB7 for reminding that theuser2 is transitioning from the swim to the bike, are flashed at timings different from one another and at the same cycle. Specifically, only the triangular object at the left end is lit for one second (the image A7-1), subsequently, only the triangular object in the middle is lit for one second (the image A7-2), only the triangular object at the right end is lit for one second (the image A7-3), and only the triangular object at the left end is lit for one second again (the image A7-4). Although not shown in the figure, similarly, in images at the time when the state of theuser2 is the “second transition”, three triangular objects, which are parts of an object for reminding that theuser2 is transitioning from the bike to the run, may be flashed at timings different from one another at the same cycle.
For example, the processing unit 100 of the user terminal 3 may discriminate a plurality of states (e.g., the "swim", the "first transition", the "bike", the "second transition", and the "run") of the user 2, generate an image including information indicating a degree to which a discriminated state has ended (a degree to which the discriminated state remains), and cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the image. For example, before starting the triathlon, the user 2 registers, in the storing unit 140 of the user terminal 3, respective kinds of position information of the start point S1 and the goal point G1 of the swim, the start point S2 and the goal point G2 of the bike, and the start point S3 and the goal point G3 of the run. The processing unit 100 can calculate, on the basis of the respective kinds of position information registered in the storing unit 140 and a time series of positioning data (position information) generated and output by the GPS sensor 110, a degree to which a present state of the user 2 has ended (a degree to which the present state remains).
In FIG. 36, an example of an image at the time when the state of the user 2 is the "bike" in this modification is shown. In the example shown in FIG. 36, like the image A3 shown in FIG. 7, an image A8 at the time when the state of the user 2 is the "bike" includes the object OB3, the total elapsed time Ttotal (1:01:45) serving as the total time, and the elapsed time Tbike (0:26:59) of the bike and further includes an object OB8. The object OB8 includes ten rectangular objects arranged in one row. All of the ten rectangular objects are painted in white at a point in time when the user 2 starts the bike. Every time the user 2 advances by a predetermined distance, the rectangular objects are painted in black one by one in order from the left end. When the bike approaches the end (or the bike ends), all of the rectangular objects are painted in black. In the example shown in FIG. 36, the seven rectangular objects are painted in black to indicate that the bike has ended to approximately 70% of the entire course (approximately 30% of the entire course remains). The user 2 can view the image A8 and the like displayed on the display unit 150 or receive contact from the coach or the like, who views the image A8 and the like displayed on the display unit of the information terminal 5 (5a), and adjust the subsequent pace.
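A minimal sketch of how such a progress indicator could be derived is shown below, assuming the distance covered so far and the total leg length are known; the ten-block rendering mirrors the object OB8, but the function itself is an illustrative assumption.

```python
# Sketch of the progress bar described for FIG. 36: the fraction of the bike leg
# covered so far is mapped onto ten blocks. The distance inputs are assumed to come
# from accumulating GPS positioning data, which is a simplification.
def progress_blocks(distance_covered_m, leg_length_m, blocks=10):
    fraction = max(0.0, min(1.0, distance_covered_m / leg_length_m))
    filled = int(round(fraction * blocks))
    return "#" * filled + "-" * (blocks - filled)

# progress_blocks(28_000, 40_000) -> "#######---"  (about 70 % of the bike done)
```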
For example, the processing unit 100 of the user terminal 3 may discriminate the plurality of states (e.g., the "swim", the "first transition", the "bike", the "second transition", and the "run") of the user 2 and, when the discriminated state is the "first transition", cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the elapsed time Ttran1 from the start of the "first transition" to be comparable with a target time set in advance. Similarly, when the discriminated state is the "second transition", the processing unit 100 may cause the display unit 150 or the display unit of the information terminal 5 (5a) to display the elapsed time Ttran2 from the start of the "second transition" to be comparable with a target time set in advance. For example, before starting the triathlon, the user 2 sets target times for the first transition and the second transition, or sets, as those target times, past times of other users (friends, etc.), professional athletes, or the user 2 himself/herself, and registers the target times in the storing unit 140 of the user terminal 3. The processing unit 100 may display the elapsed time Ttran1 and the elapsed time Ttran2 to be comparable with the respective target times.
In FIG. 37, an example of an image at the time when the state of the user 2 is the "first transition" is shown. In the example shown in FIG. 37, like the image A2 shown in FIG. 7, an image A9 at the time when the state of the user 2 is the "first transition" includes the total elapsed time Ttotal (0:30:12) serving as the total time and the elapsed time Ttran1 (0:01:05) of the first transition and further includes a target time (0:01:30). Although not shown in the figure, similarly, like the image A4 shown in FIG. 7, the image at the time when the state of the user 2 is the "second transition" includes the object OB4, the total elapsed time Ttotal serving as the total time, and the elapsed time Ttran2 of the second transition and further includes a target time. The user 2 can view the image A9 and the like displayed on the display unit 150 or receive contact from the coach or the like, who views the image A9 and the like displayed on the display unit of the information terminal 5 (5a), grasp whether the elapsed times of the first transition and the second transition are longer or shorter than the target times, and adjust the subsequent pace.
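A minimal sketch of such a side-by-side display is shown below; the formatting function is an illustrative assumption and simply reproduces the example values shown in the image A9.

```python
# Sketch of the comparison display in FIG. 37: the elapsed transition time is shown
# next to a pre-registered target time, with the difference as a quick cue.
def format_transition(elapsed_s, target_s):
    def hms(t):
        return "%d:%02d:%02d" % (t // 3600, (t % 3600) // 60, t % 60)
    delta = elapsed_s - target_s
    sign = "+" if delta >= 0 else "-"
    return "%s / target %s (%s%s)" % (hms(elapsed_s), hms(target_s), sign, hms(abs(delta)))

# format_transition(65, 90) -> "0:01:05 / target 0:01:30 (-0:00:25)"
```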
For example, in the first embodiment or the second embodiment, the processing unit 100 of the user terminal 3 may change the state of the user 2 from the "swim" to the "running movement A" when the user 2 passes the position P1 (or any one of the plurality of positions P1); when the state of the user 2 is the state "running movement A", change the state of the user 2 to the "first transition" when determining on the basis of positioning data (position information) of the GPS sensor 110 that the user 2 stops; and, when the state of the user 2 is the state "first transition", change the state of the user 2 to the "running movement B" when determining on the basis of positioning data (position information) of the GPS sensor 110 that the user 2 starts to move. Similarly, the processing unit 100 of the user terminal 3 may change the state of the user 2 from the "bike" to the "running movement C" when the user 2 passes the position P3 (or any one of the plurality of positions P3); when the state of the user 2 is the state "running movement C", change the state of the user 2 to the "second transition" when determining on the basis of positioning data (position information) of the GPS sensor 110 that the user 2 stops; and, when the state of the user 2 is the state "second transition", change the state of the user 2 to the "running movement D" when determining on the basis of positioning data (position information) of the GPS sensor 110 that the user 2 starts to move.
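This modification can be read as a small state machine driven by position passage and by stop/start detection. The sketch below is illustrative only; the state names follow the paragraph above, while passed(), stopped(), and moving() are hypothetical predicates that would be evaluated from the positioning data (position information) of the GPS sensor 110.

```python
def next_state(state, passed, stopped, moving):
    """Advance the discriminated state of the user 2 (illustrative sketch).

    passed(p), stopped(), and moving() are hypothetical callables backed by
    positioning data (position information) of the GPS sensor 110.
    """
    if state == "swim" and passed("P1"):
        return "running movement A"
    if state == "running movement A" and stopped():
        return "first transition"
    if state == "first transition" and moving():
        return "running movement B"
    if state == "bike" and passed("P3"):
        return "running movement C"
    if state == "running movement C" and stopped():
        return "second transition"
    if state == "second transition" and moving():
        return "running movement D"
    return state
```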
For example, the processing unit 100 of the user terminal 3 may discriminate the "swim", the "bike", and the "run" as the plurality of states of the user 2 and does not have to discriminate the "first transition" and the "second transition". In this case, for example, the user 2 may register, in advance, a position P11 at the start point S2 of the bike or in the vicinity of the start point S2 and a position P12 at the start point S3 of the run or in the vicinity of the start point S3. The processing unit 100 of the user terminal 3 may determine, on the basis of positioning data (position information) of the GPS sensor 110, whether the user 2 passes the position P11 and whether the user 2 passes the position P12, determine that the user 2 is in the state "swim" until the user 2 passes the position P11, determine that the user 2 is in the state "bike" until the user 2 passes the position P12 after passing the position P11, and determine that the user 2 is in the state "run" after the user 2 passes the position P12.
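In this simplified modification the discrimination depends only on whether the registered positions P11 and P12 have been passed, which can be sketched as follows. The boolean flags are assumed to be set by the passage determinations described above; the function itself is illustrative, not part of the embodiments.

```python
def discriminate(passed_p11: bool, passed_p12: bool) -> str:
    """Discriminate the "swim", "bike", and "run" states of the user 2 only
    from whether the registered positions P11 and P12 have been passed
    (sketch; the transitions are intentionally not discriminated here)."""
    if not passed_p11:
        return "swim"
    if not passed_p12:
        return "bike"
    return "run"
```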
For example, in the fourth to seventh embodiments, the processing unit 100 of the user terminal 3 may discriminate the "swim", the "bike", and the "run" as the plurality of states of the user 2 and does not have to discriminate the "first transition" and the "second transition". In this case, for example, the processing unit 100 of the user terminal 3 does not have to perform the first transition determination and the second transition determination.
For example, the user terminal 3 may include a plurality of pressure sensors 112a to 112d. In the bike determination processing, the processing unit 100 may more accurately detect a wind pressure on the basis of output signals of the plurality of pressure sensors 112a to 112d. FIG. 38 is a diagram showing a disposition example of the plurality of pressure sensors 112a to 112d and is a plan view of the user terminal 3. When the user 2 wears the user terminal 3 on the left hand and travels with the bike, wind hits the user terminal 3 from the right side in FIG. 38. Therefore, the pressure sensor 112b detects a positive wind pressure, the pressure sensor 112a detects a negative wind pressure, and the pressure sensors 112c and 112d hardly detect a wind pressure. When the user 2 wears the user terminal 3 on the right hand and travels with the bike, wind hits the user terminal 3 from the left side in FIG. 38. Therefore, the pressure sensor 112a detects a positive wind pressure, the pressure sensor 112b detects a negative wind pressure, and the pressure sensors 112c and 112d hardly detect a wind pressure. Therefore, the processing unit 100 can more accurately detect a wind pressure on the basis of the output signals of the pressure sensors 112a, 112b, 112c, and 112d.
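One conceivable way to combine the four output signals, given the disposition of FIG. 38, is to use the magnitude of the difference within the pair 112a/112b, which carries the wind signal regardless of which wrist the user terminal 3 is worn on, while the pair 112c/112d serves as a cross-check. The sketch below is only an assumption about how such a combination might look; the embodiments do not specify a concrete formula.

```python
def wind_pressure_estimate(p_a, p_b, p_c, p_d):
    """Rough wind-pressure estimate from pressure sensors 112a-112d (sketch).

    Per FIG. 38, wind produces a positive reading on the windward sensor of
    the pair (112a, 112b) and a negative reading on the leeward one, while
    112c and 112d hardly respond; the pair difference therefore reflects the
    wind pressure for either wrist.
    """
    side_signal = abs(p_b - p_a)   # large while traveling with the bike
    noise_floor = abs(p_d - p_c)   # expected to stay small
    return max(side_signal - noise_floor, 0.0)
```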
For example, in the sixth embodiment or the seventh embodiment, the processing unit 100 of the user terminal 3 may discriminate the plurality of states of the user 2 without using one of an output signal of the angular velocity sensor 114 and an output signal of the temperature sensor 116.
For example, in the embodiments, the processing unit 100 of the user terminal 3 may perform the state discrimination processing using the terrestrial magnetism sensor 111 as a motion sensor. That is, in the embodiments, the processing unit 100 may perform the state discrimination processing by replacing an output signal of the acceleration sensor 113 with an output signal of the terrestrial magnetism sensor 111 (an example of the "first motion sensor"). In the sixth embodiment or the seventh embodiment, the processing unit 100 may perform the state discrimination processing by replacing an output signal of the angular velocity sensor 114 with an output signal of the terrestrial magnetism sensor 111 (an example of the "second motion sensor").
For example, in the embodiments, the processing unit 100 of the user terminal 3 discriminates the plurality of states of the user 2 in the triathlon (the swim, the bike, and the run). However, the processing unit 100 may discriminate a plurality of states of the user 2 in any competition including a plurality of athletic events, such as winter triathlon (snow run, snow bike, and cross-country ski), duathlon (first run, bike, and second run), aquathlon (run and swim, or first run, swim, and second run), or biathlon (cross-country ski and rifle shooting).
For example, at least a part of the various sensors (the GPS sensor 110, the terrestrial magnetism sensor 111, the pressure sensor 112, the acceleration sensor 113, the angular velocity sensor 114, the pulse sensor 115, and the temperature sensor 116) does not have to be integrated with the user terminal 3.
For example, in the embodiments, a part of the functions of the server 4 or the information terminal 5 may be mounted on the user terminal 3, or a part of the functions of the user terminal 3 may be mounted on the server 4 or the information terminal 5.
For example, in the embodiments, functions of a publicly-known smartphone, such as a camera function, a call function, and a communication function, or other sensing functions (a humidity sensor, etc.) may be mounted on the user terminal 3 or the information terminal 5.
For example, the user terminal 3 can be configured as, besides the wrist-type electronic device, electronic devices of various types such as an earphone-type electronic device, a finger-ring-type electronic device, a pendant-type electronic device, an electronic device attached to a sports instrument and used, a smartphone, and a head-mounted display (HMD). The user terminal 3 only has to be mounted at a position where an exercise state of the user 2 can be analyzed. The user terminal 3 may be mounted on, besides the wrist, for example, an arm, a waist, a chest, or a foot.
For example, the user terminal 3 or the information terminal 5 may perform notification of information through image display, may perform the notification of information through sound output, vibration, or the like, or may perform the notification of information through a combination of at least two of the image display, the sound output, and the vibration.
For example, in the embodiments explained above, the user terminal 3 performs the various kinds of processing using the satellite signals from the GPS satellites. However, the user terminal 3 may use satellite signals from positioning satellites of a Global Navigation Satellite System (GNSS) other than the GPS or from positioning satellites other than the GNSS. For example, the user terminal 3 may use satellite signals from satellites of one, or two or more, satellite positioning systems such as the WAAS (Wide Area Augmentation System), the EGNOS (European Geostationary Navigation Overlay Service), the QZSS (Quasi-Zenith Satellite System), the GLONASS (GLObal NAvigation Satellite System), the GALILEO, and the BeiDou (BeiDou Navigation Satellite System).
The embodiments and the modifications explained above are examples. The invention is not limited to the embodiments and the modifications. For example, it is also possible to combine the embodiments and the modifications as appropriate.
The invention includes configurations substantially the same as the configurations explained in the embodiments (e.g., configurations having the same functions, methods, and results or configurations having the same purposes and effects). The invention includes configurations in which non-essential portions of the configurations explained in the embodiments are replaced. The invention includes configurations that realize the same action and effects as the action and effects of the configurations explained in the embodiments and configurations that can achieve the same objects as the objects of the embodiments. The invention includes configurations in which publicly-known techniques are added to the configurations explained in the embodiments.