CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is a continuation of U.S. application Ser. No. 17/025,694, filed Sep. 18, 2020, entitled “ENERGY EXPENSE DETERMINATION FROM SPATIOTEMPORAL DATA.” This application also claims the benefit of priority under 35 U.S.C. § 119 from U.S. Provisional Application No. 62/903,259, filed on Sep. 20, 2019, entitled “Energy Expense Determination from Spatial Temporal Data.” The contents of all of the preceding are incorporated by reference in their entirety.
SUMMARY

The present technology, roughly described, provides a mechanism for interpreting spatiotemporal data from augmented reality devices and virtual reality devices into fitness-related metrics and recommendations. Spatiotemporal data can be compared to the physical data of an end user to interpret estimated energy expense (e.g., calorie burn or joules) and quantify fitness metrics (e.g., squats). A virtual reality or augmented reality device can receive positional data from one or more position tracking sensors over time as the user engages in an activity, for example with a corresponding environment. The positional data is compared to a biometric profile of the end user to understand how their physical body is exerting itself. The spatiotemporal data is a series of coordinate data expressed over time. The biometric data is the physical profile of the end user. This data is used to understand and quantify physical movement while using a virtual reality or augmented reality device.
The present technology does not use a heart rate monitor to determine energy expense from the user. Rather, user energy expense is determined from user motions captured by motion sensors in communication with a virtual reality, augmented reality, or other device, from metadata, and from user biometric data. By not requiring a heart rate monitor, the user can enjoy activities, challenges, workouts, or other physical exertion without being inconvenienced with wearing a device that must be positioned to record heart rate data.
In some instances, a method is provided for automatically determining the calories burned by a user during a workout. The method begins with receiving spatiotemporal data by a client application stored and executing on a client device. The client application includes a display for displaying activity elements associated with a user workout, and the spatiotemporal data indicates the spatial coordinates of a plurality of points associated with a user's body while the user is performing the workout. The spatial data is received periodically to comprise spatiotemporal data. The method includes determining an estimated heart rate for the user while performing the workout. The heart rate can be determined based on the spatiotemporal data and user biometric data that does not change during the workout. The method also determines an energy expense for the user during the workout based on the spatiotemporal data and the estimated user heart rate. The method can display the energy expense by the user and the estimated heart rate for the user during the workout by the client application. The client application displays energy expense information with the activity elements within the display.
In some instances, a non-transitory computer readable storage medium has embodied thereon a program, wherein the program is executable by a processor to perform a method for automatically determining the calories burned by a user during a workout. The method begins with receiving spatiotemporal data by a client application stored and executing on a client device. The client application includes a display for displaying activity elements associated with a user workout, and the spatiotemporal data indicates the spatial coordinates of a plurality of points associated with a user's body while the user is performing the workout. The spatial data is received periodically to comprise spatiotemporal data. The method includes determining an estimated heart rate for the user while performing the workout. The heart rate can be determined based on the spatiotemporal data and user biometric data that does not change during the workout. The method also determines an energy expense for the user during the workout based on the spatiotemporal data and the estimated user heart rate. The method can display the energy expense by the user and the estimated heart rate for the user during the workout by the client application. The client application displays energy expense information with the activity elements within the display.
In some instances, a system can automatically determine the calories burned by a user during a workout. The system can include a server which includes a memory and a processor. One or more modules can be stored in the memory and executed by the processor to receive spatiotemporal data by a client application stored and executing on a client device, the client application including a display for displaying activity elements associated with a user workout, the spatiotemporal data indicating the spatial coordinates of a plurality of points associated with a user's body while the user is performing the workout, the spatial data received periodically to comprise spatiotemporal data, determine an estimated heart rate for the user while performing the workout, the heart rate determined based on the spatiotemporal data and user biometric data that does not change during the workout, determine an energy expense for the user during the workout based on the spatiotemporal data and the estimated user heart rate, and display the energy expense by the user and the estimated heart rate for the user during the workout by the client application, the client application displaying energy expense information with the activity elements within the display.
BRIEF DESCRIPTION OF FIGURES

FIG. 1A is a block diagram of an energy expense determination system within a virtual reality device.
FIG. 1B is a block diagram of an energy expense determination system within an augmented reality device.
FIG. 1C is a block diagram of an energy expense determination system within a virtual reality device that processes spatiotemporal data determined from images.
FIG. 1D is a block diagram of an energy expense determination system within an augmented reality device that processes spatiotemporal data determined from images.
FIG. 1E is a block diagram of a client health application.
FIG. 1F is a block diagram of a server health application.
FIG. 2 is a block diagram of motion tracking data capture and transmission devices.
FIG. 3 is an exemplary method for determining energy expense for a user.
FIG. 4 is an exemplary method for performing login for a user.
FIG. 5 is an exemplary method for receiving motion tracking data obtained by a client device camera.
FIG. 6 is an exemplary method for receiving motion tracking data obtained from transmission units on a user.
FIG. 7 is an exemplary method for analyzing and processing motion tracking data.
FIG. 8 is an exemplary method for reporting data to a server from a client.
FIG. 9 is an exemplary method for providing a dashboard.
FIG. 10 is an exemplary overlay for use with a virtual reality device or augmented reality device.
FIG. 11 is an exemplary overlay for use with a virtual reality device or augmented reality device.
FIG. 12 is a block diagram of a computing environment for implementing the present technology.
DETAILED DESCRIPTION

The present technology, roughly described, provides a mechanism for interpreting spatiotemporal data from augmented reality devices and virtual reality devices into fitness-related metrics and recommendations. Spatiotemporal data can be compared to the physical data of an end user to interpret energy expense (e.g., estimated calorie burn or joules). A virtual reality or augmented reality device can receive positional data from one or more position tracking sensors over time as the user engages in an activity, for example with a corresponding environment. The positional data is compared to a biometric profile of the end user to understand how their physical body is exerting itself. The spatiotemporal data is a series of coordinate data expressed over time. The biometric data is the physical profile of the end user. This data is used to understand and quantify physical movement while using a virtual reality or augmented reality device.
The present technology does not use a heart rate monitor to determine energy expense from the user. Rather, user energy expense is determined from user motions captured by motion sensors in communication with a virtual reality, augmented reality, or other device, and from user biometric data. By not requiring a heart rate monitor, the user can enjoy activities, challenges, workouts, or other physical exertion without being inconvenienced with wearing a device that must be positioned to record heart rate data.
The present technology can be used in a variety of applications. For example, the energy expense determination mechanism can be used to provide a virtual coach for activities and workouts, to quantify fitness movements (e.g., squats), to support games involving physical movement, and in other applications. In some instances, motion tracking data captured for the user can be utilized, both in raw and processed form, to provide end-user specific physical recommendations in spatial environments through visual and/or audio cues.
FIG. 1A is a block diagram of an energy expense determination system within a virtual reality device. System 100 of FIG. 1A includes virtual reality device 110, network 120, and server 130. Virtual reality device 110 may include one or more applications associated with a user performing a virtual activity in a virtual space. Virtual reality device 110 includes client health application 112 and receives data from motion tracking sensors 114, 116, and 118. Although three motion tracking sensors are illustrated, any number of motion tracking sensors may be used with the present technology.
Client health application 112 may be implemented as software and/or hardware on virtual reality device 110. In some instances, client health application 112 may be implemented as code that interfaces with the native system of virtual reality device 110. Client health application 112 may determine the energy expense of a user that utilizes virtual reality device 110 and may communicate with server 130 over network 120. In some instances, client health applications may analyze data, determine calories burned and energy expense, manage and update dashboards, generate notifications, detect events, and perform other functionality discussed herein. Client health application 112 is discussed in more detail with respect to FIG. 1E.
Network 120 may include one or more private networks, public networks, intranets, the Internet, wide-area networks, local area networks, cellular networks, radiofrequency networks, Wi-Fi networks, and any other network which may be used to transmit data. Device 110 and server 130 may communicate over network 120.
Server 130 can be implemented on one or more logical or physical servers, as a single machine or distributed machines, and may communicate with virtual reality device 110 over network 120. Server 130 includes server health application 132, which may be a single application or several applications. Server health application 132 may communicate with client health application 112 on device 110 and may analyze data, determine calories burned and energy expense, manage and update dashboards, generate notifications, detect events, and perform other functionality discussed herein. Server health application 132 is discussed in more detail with respect to FIG. 1F.
FIG. 1B is a block diagram of an energy expense determination system within an augmented reality device. The system of FIG. 1B is similar to that of FIG. 1A except that the device is an augmented reality device rather than a virtual reality device. As such, the client health application 142 may process images and video to provide graphics, text, video, and/or other content with the images and video. The images and video may be captured by a camera device local to an augmented reality device or remote from an augmented reality device. The server health application 132, client health application 142, sensors 144-148, network 120, and server 130 of FIG. 1B operate similarly to those described with respect to FIG. 1A.
Though a virtual reality device and an augmented reality device are discussed with respect to FIGS. 1A and 1B, it is understood that other devices can be used with the present technology as well. For example, client health application 142 can be contained in any device that receives motion tracking data from a number of sensors. The present technology is not intended to be limited to virtual reality and augmented reality devices.
FIG. 1C is a block diagram of an energy expense determination system within a virtual reality device that processes spatiotemporal data determined from images. The energy expense system of FIG. 1C is similar to the system of FIG. 1A except that the virtual reality device 150 includes a camera 154 and image processing engine 156. In some instances, the camera 154 may be a camera on a cellular telephone, a tablet computer, or some other computing device.
The camera may be any camera that is suitable for capturing images and/or video of a user. Image processing engine 156 may process images captured by camera 154 to calculate points of inflection on a user, the position of the points in a three-dimensional coordinate system, the velocity and angular velocity of each point over time, and other data. In some instances, positions in a two-dimensional system may be calculated and used in the present system. The image processing engine 156 may transmit the calculated data to client health application 152.
FIG. 1D is a block diagram of an energy expense determination system within an augmented reality device that processes spatiotemporal data determined from images. The energy expense system of FIG. 1D is similar to the system of FIG. 1B except that the augmented reality device 160 includes a camera 164 and image processing engine 166. In some instances, the camera 164 may be a camera on a cellular telephone, a tablet computer, or some other computing device.
The camera 164 may be any camera that is suitable for capturing images and/or video of a user. Image processing engine 166 may process images captured by camera 164 to calculate points of inflection on a user, the position of the points in a three-dimensional coordinate system, the velocity and angular velocity of each point over time, and other data. In some instances, positions in a two-dimensional system may be calculated and used in the present system. The image processing engine 166 may transmit the calculated data to client health application 162.
FIG. 1E is a block diagram of a client health application 170. The client health application 170 provides more detail for the client health applications of FIGS. 1A-1D. Client health application 170 includes user biometric data 171, user workout data 172, calorie engine 173, biometric data library 174, and dashboard engine 175. User biometric data 171 may include biometric details for the user, such as height, weight, date of birth, body mass index, lean body mass data, water percentage, and other data. User workout data 172 may include details for one or more workout segments for the user. In some instances, a workout segment includes details for a particular workout characterized by a start time, end time, and other data.
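As a minimal illustration of how the data of FIG. 1E might be organized in software, the following sketch (in Python) models user biometric data 171 and one workout segment of user workout data 172; the class and field names are assumptions made for illustration and are not part of the described system.

```python
from dataclasses import dataclass, field
from datetime import date, datetime
from typing import List, Tuple


@dataclass
class UserBiometricData:
    """Physical profile used to personalize energy expense (cf. element 171)."""
    height_cm: float
    weight_kg: float
    date_of_birth: date
    body_mass_index: float
    lean_body_mass_kg: float = 0.0
    water_percentage: float = 0.0


@dataclass
class WorkoutSegment:
    """One workout segment characterized by a start time, end time, and other data (cf. element 172)."""
    start_time: datetime
    end_time: datetime
    application_name: str = ""
    # Spatiotemporal samples as (timestamp_seconds, x, y, z) tuples for a tracked point.
    samples: List[Tuple[float, float, float, float]] = field(default_factory=list)
```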
Calorie engine 173 may analyze spatiotemporal data, determine a calculation method, determine calories burned and/or energy expense, and perform other functionality described herein.
Biometric data library 174 includes data obtained from the current user and/or other users relating to motion tracking data and a corresponding energy expense associated with the data. In some instances, client health application 170 may compare captured user motion tracking data and biometric data to the biometric data library in order to determine an energy expense for the user's corresponding motion tracking data.
Dashboard engine 175 may calculate metrics, retrieve information to be populated into a dashboard, update a dashboard, and transmit dashboard data to a remote application for display.
FIG. 1F is a block diagram of server health application 180. The server health application 180 provides more detail for the server health applications of FIGS. 1A-1D. Server health application 180 includes user account data 181, user biometric data 182, user workout data 183, calorie engine 184, biometric data library 185, and dashboard engine 186. User account data 181 includes data associated with a user's account with the system, such as but not limited to username, password, PIN, and other information.
User biometric data 182 may include biometric details for the user, such as height, weight, date of birth, body mass index, lean body mass data, water percentage, and other data. User workout data 183 may include details for one or more workout segments for the user. In some instances, a workout segment includes details for a particular workout characterized by a start time, end time, and other data.
Calorie engine 184 may analyze spatiotemporal data, determine a calculation method, determine calories burned and/or energy expense, and perform other functionality described herein.
Biometric data library 185 includes data obtained from the current user and/or other users relating to motion tracking data and a corresponding energy expense associated with the data. In some instances, server health application 180 may compare captured user motion tracking data and biometric data to the biometric data library in order to determine an energy expense for the user's corresponding motion tracking data.
Dashboard engine 186 may calculate metrics, retrieve information to be populated into a dashboard, update a dashboard, and transmit dashboard data to a remote application for display.
FIG. 2 is a block diagram of motion tracking data capture and transmission devices. In FIG. 2, a user 210 may be fitted with one or more motion tracking data capture and transmission devices 221-231. As shown in FIG. 2, exemplary devices 221-231 may be placed on the user. In some instances, the devices may be placed on a user's arm, feet, body, head, and other locations. Though eleven devices are illustrated on a user in FIG. 2, any number of devices may be implemented, for example 20, 30, 40, 50, or any other number. For purposes of discussion, the present technology may be discussed with respect to sensors 221 and 224 on the user's arms and sensor 225 on a user's head.
FIG. 3 is an exemplary method for determining energy expense for a user. First, login is performed at step 310. A user may login to one or more applications on a virtual reality or augmented reality device as well as software associated with the present technology. Login is discussed in more detail with respect to the method of FIG. 4.
An activity start indication is received at step 315. In some instances, the activity start indication may be the start of an application installed on a virtual reality or augmented reality device, such as an exercise application, physical therapy application, game application, or some other application that provides a physical workout, challenge, activity, or other physical experience. The indication may signify that a user play or experience space has been established for the user within a virtual or augmented reality space with a third-party application, and that the user is now able to engage in an activity.
In some instances, the indication may include detection of an application start which has components associated with VR or AR. For example, code may be implemented on the client device to detect when the operating system starts an application having a camera or other component that is suitable for or commonly associated with VR or AR functionality.
Spatiotemporal data is received for a user at step 320. The spatiotemporal data can include data derived or generated from one or more images or motion tracking data. Data generated from one or more images or video can be generated after performing image analysis techniques on a series of images or video in which the user is performing the workout. Image processing software may receive the images and video, process the changing position of the user in the series of images and/or video, and determine the user's joints, degree of movement, and location of the user's body in a three-dimensional (or two-dimensional) coordinate system. The image processing software may be implemented by a system operating system, client health application, server health application, or other software and/or hardware used within or in communication with the present system.
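As one hedged illustration of the kind of processing such image processing software might perform, the sketch below derives per-point speeds from the coordinates of successive frames; the function name, frame layout, and units are assumptions made for the example, not the system's actual interface.

```python
import numpy as np


def point_velocities(frames, timestamps):
    """Estimate the speed of each tracked body point between consecutive frames.

    frames: array of shape (num_frames, num_points, 3) holding x, y, z coordinates.
    timestamps: array of shape (num_frames,) in seconds.
    Returns an array of shape (num_frames - 1, num_points) of speeds in coordinate units per second.
    """
    frames = np.asarray(frames, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    displacements = np.linalg.norm(np.diff(frames, axis=0), axis=2)  # distance moved per point per frame
    dt = np.diff(timestamps)[:, None]                                # elapsed time between frames
    return displacements / dt
```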
In the case of motion tracking data, the motion tracking data is received from one or more motion tracking sensors placed on the user's body. The data may be received wirelessly or via wired connections. In some instances, the motion tracking data may be sampled periodically, for example at between 60 and 100 Hz, by the device 110 or 140.
In some instances, there are three tracked points on a user: a tracking point on the head of the user, for example a tracking point attached to a headset worn by the user, and a tracking point at each hand of the user, for example a tracking mechanism attached to a glove or other user clothing, or directly attached to the user's hand. As a result of the three tracking points, the motion tracking data can include position information for each of the three points, providing six degrees of freedom for each point. In this example, physical estimates of the tracking point locations, velocity, and/or overall body mechanics and activity can be determined from the motion tracking data received for the three tracked points in six degrees of freedom.
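A minimal sketch of how samples for the three tracked points might be represented is shown below; the record names and the 72 Hz rate are illustrative assumptions within the 60 to 100 Hz range mentioned above.

```python
from dataclasses import dataclass
from typing import Dict, Tuple


@dataclass
class TrackedPointSample:
    """One six-degree-of-freedom sample: position (x, y, z) and orientation (roll, pitch, yaw)."""
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float]


@dataclass
class SpatiotemporalSample:
    """Samples for the head and both hands at a single timestamp."""
    timestamp: float                       # seconds since the start of the workout
    points: Dict[str, TrackedPointSample]  # keys such as "head", "left_hand", "right_hand"


SAMPLE_RATE_HZ = 72  # illustrative value; the text above describes sampling between 60 and 100 Hz
```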
In some instances, the origin of the received motion tracking data is irrelevant. Hence, motion tracking data can be obtained from motion tracking devices placed on a user, a device with a camera such as a cellular phone, or some other device or series of devices. Receiving motion tracking data from a device with a camera is discussed with respect to FIG. 5. Receiving motion tracking data from one or more motion tracking devices is discussed with respect to FIG. 6.
Spatiotemporal data can be analyzed to determine an energy expense for a user at step 325. Energy expense may be determined at least in part from the spatiotemporal data captured during activity performed by the user. More details for analyzing spatiotemporal data to determine energy expense for a user are discussed with respect to the method of FIG. 7.
Energy expense information may be provided for the user at step 330. The energy expense information may include calories burned during the current activity. A determination is then made as to whether a recommendation should be provided at step 335. A recommendation may be provided based on several factors, such as for example whether an intended duration of the current activity is completed, whether a goal number of calories have been burned, or some other event, such as for example if a user is performing poorly in a current activity based on an analysis of the motion tracking data. If a recommendation should not be provided, the method of FIG. 3 continues to step 345. If a recommendation should be provided, the recommendation is provided at step 340, wherein an activity start indication for the recommended activity is received. The recommendation may include execution of a different application, such as for example an application that is more difficult or easier than the previous application, a particular activity within an application, or visual cues of activity within the presently executed application.
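The recommendation check of step 335 could be expressed, purely as an illustrative sketch, as follows; the argument names and the poor-performance threshold are assumptions rather than parameters defined by the present technology.

```python
def should_recommend(elapsed_minutes, planned_minutes, calories_burned, calorie_goal,
                     performance_score, poor_performance_threshold=0.4):
    """Return True when any of the triggering conditions described above is met."""
    workout_complete = elapsed_minutes >= planned_minutes          # intended duration completed
    goal_reached = calories_burned >= calorie_goal                 # goal number of calories burned
    performing_poorly = performance_score < poor_performance_threshold  # derived from motion analysis
    return workout_complete or goal_reached or performing_poorly
```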
A determination is made at step 345 as to whether the workout should be terminated. Workouts can end if the particular activity, challenge, or game is complete, if the executing application terminates, or if there is a pause in user activity or the game itself for a certain period of time, such as at least one minute. If the workout should not end, the method returns to step 320 where motion tracking data is received for a user. If the workout has come to an end, then data is transmitted to a server at step 350. The data transmitted may include spatiotemporal data collected by the client device, user input received by the client device, and other data. Once data is transmitted to the server, the server may create a user dashboard at step 355. The user dashboard may be configured by a user and provide information regarding the user workout. More details for creating a dashboard are discussed with respect to FIG. 9.
FIG. 4 is an exemplary method for performing login for a user. In some instances, the method of FIG. 4 provides more detail for step 310 of the method of FIG. 3. A determination is made as to whether a user has an account at step 410. If the user does have an account, the method continues to step 420. If the user does not have an account, a new account is created for the user at step 415. Creating a new account can include receiving user login information, user contact data, and other user data. A user may perform login by providing a username and password, or other credentials, at step 420. After user login has been confirmed, user account data and user biometric data are retrieved at step 425. User account data may include the user's name. User biometric data may include a user's height, weight, birth date, body mass index, and other biometric data associated with the user.
FIG. 5 is an exemplary method for receiving motion tracking data obtained by a client device camera. Images of a user during a workout are captured by a client device camera at step 510. User points are identified in the captured images and/or video at step 520. The user points may be determined as points on the user's body that move, points at body joints, and other points. Spatiotemporal coordinates of the user points may be determined at step 530. Hence, all the points are placed into a single spatiotemporal coordinate system that has three (in some instances, two) spatial dimensions. The spatiotemporal coordinates are then processed to determine the movements of the user at step 540. For example, the coordinates may be processed to determine if a user is moving their arms or legs, swinging an object in their hands, doing an exercise or action such as jumping or running, and so forth.
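As an illustrative sketch of turning processed coordinates into a recognized movement, the following counts squat repetitions from dips in a tracked head height; the threshold values are assumptions chosen only to show the idea, not parameters of the described method.

```python
def count_squats(head_heights, standing_height, dip_fraction=0.75):
    """Count squat repetitions from a sequence of head heights (same units as standing_height).

    A repetition is counted each time the head drops below dip_fraction of the
    standing height and then returns to near standing height.
    """
    threshold = standing_height * dip_fraction
    in_squat = False
    reps = 0
    for h in head_heights:
        if not in_squat and h < threshold:
            in_squat = True                          # user has descended into a squat
        elif in_squat and h >= standing_height * 0.95:
            in_squat = False                         # user has returned to standing
            reps += 1
    return reps
```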
FIG. 6 is an exemplary method for receiving motion tracking data obtained from transmission units on a user. Transmission data is received from motion tracking units on the user at step 610. Spatiotemporal coordinates of the user points may be determined based on the transmission data at step 615. The points are placed into a single spatiotemporal coordinate system that has three (in some instances, two) spatial dimensions. The spatiotemporal coordinates are then processed to determine the movements of the user at step 620. For example, the coordinates may be processed to determine if a user is moving their arms or legs, swinging an object in their hands, doing an exercise or action such as jumping or running, and so forth.
FIG. 7 is an exemplary method for analyzing and processing motion tracking data. The method of FIG. 7 provides more detail for step 325 of the method of FIG. 3. First, motion tracking data is accessed at step 710. The motion tracking data may include data captured over a set period of time, such as one second, two seconds, 0.5 seconds, or some other period of time. User motion is identified from the tracking data at step 715. The identified user motion may include swinging arms forward, lifting arms up, raising their head, jumping, or some other user motion. User energy expense is determined based on user motion at step 720. In some instances, determining a user energy expense includes first determining a base energy expense, in some instances based in part on a resting energy expense. A resting energy expense may be determined as, for example, 3.5 mL of oxygen per kilogram per minute.
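One common way to convert a resting rate into calories is the MET convention, in which one MET (roughly the 3.5 mL/kg/minute resting rate noted above) is approximated as 1 kcal per kilogram of body weight per hour. The sketch below applies that convention and is an illustrative approximation only, not the calculation method of the calorie engine.

```python
def base_energy_expense_kcal(weight_kg, duration_minutes, met=1.0):
    """Estimate energy expense in kilocalories using the MET convention.

    One MET (resting) is approximated as 1 kcal per kilogram of body weight per hour;
    the met argument scales this resting rate up for the identified motion.
    """
    return met * weight_kg * (duration_minutes / 60.0)
```

For example, base_energy_expense_kcal(70, 30, met=5.0) yields roughly 175 kcal for 30 minutes of moderate activity by a 70 kg user.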
A user energy expense is adjusted based on user biometric data at step 725. The adjustment based on user biometric data may include using a user's body mass index (BMI) to customize the energy expense for the particular user. For example, a user with a higher BMI will have a higher energy expense while a user with a lower BMI will have a lower energy expense for a particular activity.
The user's energy expense can be adjusted based on application information at step 730. In some instances, when it is known what application a user is engaged with, the application can specify a particular activity the user is engaged in. For example, an application may indicate to the present technology that the user is engaged in squats, walking, sitting, running, or some other activity. In some instances, an activity by a user may appear to require very strenuous movement, with user controllers that appear to be moving very rapidly. However, the user may be moving the controllers by flicking their wrists rather than swinging their arms, and the actual calories spent will be less. In this case, the user energy expense would be reduced slightly. The specifics of the activity received from the application can be used to better tune and forecast energy expense over a set period of time.
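The adjustments of steps 725 and 730 might be sketched as follows; the MET table, reference BMI, and the idea of capping the motion-derived estimate at the application-reported activity's ceiling are illustrative assumptions, not values or rules specified by the present technology.

```python
# Illustrative MET values per application-reported activity; the numbers are assumptions for this sketch.
ACTIVITY_METS = {"sitting": 1.3, "walking": 3.5, "squats": 5.0, "running": 8.0}


def adjust_for_bmi(kcal, bmi, reference_bmi=22.0):
    """Scale energy expense by BMI: a higher BMI yields a higher expense, per the text above."""
    return kcal * (bmi / reference_bmi)


def adjust_for_application(motion_kcal, reported_activity, weight_kg, duration_minutes):
    """Cap a motion-derived calorie estimate using the application-reported activity.

    If the controllers appear to move rapidly but the reported activity is modest
    (e.g., wrist flicks rather than arm swings), the estimate is reduced toward the
    activity's MET-based ceiling.
    """
    met = ACTIVITY_METS.get(reported_activity)
    if met is None:
        return motion_kcal  # unknown activity: keep the motion-derived estimate unchanged
    ceiling_kcal = met * weight_kg * (duration_minutes / 60.0)  # kcal ~= MET * kg * hours
    return min(motion_kcal, ceiling_kcal)
```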
In some instances, a client device collects data during a workout and repeatedly uploads the data to a server over a network. In some instances, the server or network may not be accessible. As such, the client device may store data locally until server communication is established. FIG. 8 is an exemplary method for reporting data to a server from a client. A data processing event is detected at step 810. In some instances, the event is detected at the client, such as an event triggered to transmit collected workout data from the client to the server. A determination is made as to whether the server is available over a network at step 815. In some instances, the server is available if the client device can connect with the server device over the network. If the server is available, the server may process the motion tracking and other data collected by the client device, and the motion tracking data and other data is transmitted to the server by the client device. The server receives and stores the motion tracking data received from the local device at step 820. The server analyzes the motion tracking data and determines an energy expense at step 825. The server then transmits the energy expense data to the local device at step 830. The client device (local device) receives the energy expense data, including calories burned data, and provides the energy expense information to the user at step 840. The energy expense can be displayed to the user through an overlay that is displayed during the user workout.
If, at step 815, the server is not available, the client device may perform calculations locally and store data in a local cache until a server connection can be established. At step 835, the client device caches motion tracking data, analyzes motion tracking data, and determines the energy expense associated with the user workout.
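A hedged sketch of the client-side reporting behavior of FIG. 8 follows; the endpoint URL, cache file name, and payload format are hypothetical and only illustrate the connect-or-cache pattern described above.

```python
import json
import time
import urllib.error
import urllib.request

CACHE_PATH = "pending_workout_data.json"        # hypothetical local cache file
SERVER_URL = "https://example.com/api/workout"  # hypothetical server endpoint


def report_workout_data(payload):
    """Try to send workout data to the server; cache it locally if the server is unreachable."""
    try:
        request = urllib.request.Request(
            SERVER_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request, timeout=5):
            return True  # server received the data for processing
    except (urllib.error.URLError, OSError):
        # Server not available: store the data locally until a connection can be established.
        with open(CACHE_PATH, "a", encoding="utf-8") as cache:
            cache.write(json.dumps({"cached_at": time.time(), "data": payload}) + "\n")
        return False
```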
FIG. 9 is an exemplary method for providing a dashboard. The method of FIG. 9 provides more detail for step 355 of the method of FIG. 3. First, a dashboard is initiated at step 910. The dashboard may be initiated by generating a dashboard instance.
Metrics are calculated at step 910. The metrics may include workout metrics such as the average calories burned per minute, average calories burned per song, the length of the workout, the total calories burned, and other metrics. A user energy expense is updated at step 915. User energy expense may be updated in real time as the user exercises and the updated energy expense is calculated. Workout duration may be updated at step 920. Historic workout data may be displayed and updated at step 925. Historic workout data may indicate the types of activities, duration, and calories burned for segments of workouts. Data from the dashboard can be stored to a health data store at step 930. A health data store may include a database, a library of data accessed by a health application, such as the "Google Fit" or "Apple Health" mobile applications, or some other electronic health record. The dashboard can then be updated based on the calculations between steps 910-925. Updating the dashboard includes filling components of the dashboard with the calculated metrics. The calculated data is then stored to the health data store.
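The dashboard metrics described above could be computed as in the following sketch; the segment dictionary keys are assumptions used for illustration.

```python
def dashboard_metrics(segments):
    """Compute dashboard metrics from a list of workout segments.

    Each segment is assumed to be a dict with 'duration_minutes', 'calories', and
    optionally 'songs' (the number of songs played during the segment).
    """
    total_minutes = sum(s["duration_minutes"] for s in segments)
    total_calories = sum(s["calories"] for s in segments)
    total_songs = sum(s.get("songs", 0) for s in segments)
    return {
        "workout_length_minutes": total_minutes,
        "total_calories": total_calories,
        "avg_calories_per_minute": total_calories / total_minutes if total_minutes else 0.0,
        "avg_calories_per_song": total_calories / total_songs if total_songs else 0.0,
    }
```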
FIG. 10 is an exemplary overlay 1000 for use with a virtual reality device or augmented reality device. The overlay 1000 may be displayed during a workout for a user during a virtual reality experience or augmented reality experience. The information in the overlay can be continuously updated, either from calculations made by a remote server and communicated to a local client device which provides the overlay or from calculations made by the local client which provides the overlay.
The overlay 1000 displays information such as a current activity level 1010, a levelling wheel 1015, level information 1020, estimated heart rate 1025, exercise counter 1030, exercise type graphic 1035, calories burned 1040, application name 1045, calories burned 1050, total calories burned in current day 1055, experience points 1060, and xx 1065. The current activity level may be low, moderate, intense, or some other label, and can be determined by the type of activity, the speed of the activity, and other factors. The levelling wheel identifies the level at which the activity is being presented to the user. The level 1020 is the level at which the user is performing the workout. The heart rate is determined based on the spatiotemporal data as described herein. The count of exercises performed 1030 and the exercise type graphic 1035 can be determined by processing the spatiotemporal data (recognition of the exercise type and count of times performed) and/or retrieved from the application. The application name 1045 is retrieved by detection of the application at execution. The calories counted 1040 and 1050 are determined by processing the spatiotemporal data as described herein. The total daily calories 1055 are determined by accumulation of the workouts done in a particular day, and the experience points (XP) 1060 are determined based on the number of workouts, similar and otherwise, performed by the user historically.
FIG. 11 is an exemplary overlay 1100 for use with a virtual reality device or augmented reality device. Overlay 1100 of FIG. 11 displays, during a virtual reality experience or augmented reality experience, calories burned, heart rate data, level and experience information, and other information. The overlay can be displayed within the user's view, such as above the direction currently viewed by the user, off to a side, top or bottom of the user's current view, or some other location within the virtual reality experience or augmented reality space.
Overlay 1100 displays a levelling wheel 1110, calories burned 1120, estimated heart rate 1130, current workout time 1140, exercise count (e.g., number of squats performed) 1150, and level information 1160. The levelling wheel 1110 identifies the level at which the activity is being presented to the user. The calories burned 1120 is determined by processing the spatiotemporal data as described herein. The heart rate 1130 is determined based on the spatiotemporal data as described herein. The exercise count 1150 can be determined by processing the spatiotemporal data (recognition of the exercise type and count of times performed) and/or retrieved from the application. The level 1160 is the level at which the user is performing the workout.
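The estimated heart rate shown in the overlays is described as being derived from spatiotemporal data and static biometric data. The following is a purely illustrative sketch of one such derivation, interpolating between an assumed resting heart rate and an age-predicted maximum based on movement intensity; it is not necessarily the calculation used by the present technology, and the 3 m/s intensity mapping is an assumption.

```python
def estimate_heart_rate(avg_point_speed, age_years, resting_hr=60.0):
    """Roughly estimate heart rate from movement intensity and static biometric data.

    Interpolates between an assumed resting heart rate and an age-predicted maximum
    (220 - age) using a normalized intensity score derived from the spatiotemporal data.
    """
    max_hr = 220.0 - age_years                              # common age-predicted maximum
    intensity = max(0.0, min(1.0, avg_point_speed / 3.0))   # assume ~3 m/s maps to full intensity
    return resting_hr + intensity * (max_hr - resting_hr)
```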
FIG. 12 is a block diagram of a computing environment for implementing the present technology. System 1200 of FIG. 12 may be implemented in the context of machines that implement client device 110 and server 130. The computing system 1200 of FIG. 12 includes one or more processors 1210 and memory 1220. Main memory 1220 stores, in part, instructions and data for execution by processor 1210. Main memory 1220 can store the executable code when in operation. The system 1200 of FIG. 12 further includes a mass storage device 1230, portable storage medium drive(s) 1240, output devices 1250, user input devices 1260, a graphics display 1270, and peripheral devices 1280.
The components shown in FIG. 12 are depicted as being connected via a single bus 1290. However, the components may be connected through one or more data transport means. For example, processor unit 1210 and main memory 1220 may be connected via a local microprocessor bus, and the mass storage device 1230, peripheral device(s) 1280, portable storage device 1240, and display system 1270 may be connected via one or more input/output (I/O) buses.
Mass storage device 1230, which may be implemented with a magnetic disk drive, an optical disk drive, a flash drive, or other device, is a non-volatile storage device for storing data and instructions for use by processor unit 1210. Mass storage device 1230 can store the system software for implementing embodiments of the present invention for purposes of loading that software into main memory 1220.
Portable storage device 1240 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disc or digital video disc, USB drive, memory card or stick, or other portable or removable memory, to input and output data and code to and from the computer system 1200 of FIG. 12. The system software for implementing embodiments of the present invention may be stored on such a portable medium and input to the computer system 1200 via the portable storage device 1240.
Input devices 1260 provide a portion of a user interface. Input devices 1260 may include an alpha-numeric keypad, such as a keyboard, for inputting alpha-numeric and other information, a pointing device such as a mouse, a trackball, stylus, cursor direction keys, microphone, touchscreen, accelerometer, and other input devices. Additionally, the system 1200 as shown in FIG. 12 includes output devices 1250. Examples of suitable output devices include speakers, printers, network interfaces, and monitors.
Display system 1270 may include a liquid crystal display (LCD) or other suitable display device. Display system 1270 receives textual and graphical information and processes the information for output to the display device. Display system 1270 may also receive input as a touchscreen.
Peripherals 1280 may include any type of computer support device to add additional functionality to the computer system. For example, peripheral device(s) 1280 may include a modem or a router, printer, and other device.
The system 1200 may also include, in some implementations, antennas, radio transmitters, and radio receivers 1290. The antennas and radios may be implemented in devices such as smartphones, tablets, and other devices that may communicate wirelessly. The one or more antennas may operate at one or more radio frequencies suitable to send and receive data over cellular networks, Wi-Fi networks, commercial device networks such as a Bluetooth device, and other radio frequency networks. The devices may include one or more radio transmitters and receivers for processing signals sent and received using the antennas.
The components contained in the computer system 1200 of FIG. 12 are those typically found in computer systems that may be suitable for use with embodiments of the present invention and are intended to represent a broad category of such computer components that are well known in the art. Thus, the computer system 1200 of FIG. 12 can be a personal computer, handheld computing device, smart phone, mobile computing device, workstation, server, minicomputer, mainframe computer, or any other computing device. The computer can also include different bus configurations, networked platforms, multi-processor platforms, etc. Various operating systems can be used including Unix, Linux, Windows, Macintosh OS, Android, as well as languages including Java, .NET, C, C++, Node.JS, and other suitable languages.
The foregoing detailed description of the technology herein has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen to best explain the principles of the technology and its practical application to thereby enable others skilled in the art to best utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims appended hereto.