CROSS-REFERENCE TO RELATED APPLICATION
This application is a National Stage of International Application PCT/US2010/057349 filed on Nov. 19, 2010, which claims benefit pursuant to 35 U.S.C. §120 of the filing date of the Provisional Application Ser. No. 61/262,794 filed on Nov. 19, 2009, both of which are incorporated herein by reference for all that they disclose.
BACKGROUND
1. Field
Apparatuses, methods, and computer readable mediums consistent with exemplary embodiments relate to standardizing data from various devices, and more specifically, to integrating data from various devices using an application interface and providing the data for analysis.
2. Description of the Related Art
The use of simulation training is growing rapidly. A simulation training session is a session in which training of personnel is performed through the use of a simulator device that outputs real-time data in response to interactions of the trainees.
In the medical industry, for example, medical training centers conduct simulation training that generally involves students performing simulated medical procedures and/or examinations on a mannequin simulator, which exhibits symptoms of various ailments of a patient during simulated examination sessions. Other types of medical simulators include EKG machines, blood pressure monitors, and virtual reality endoscopic, laparoscopic, and endovascular simulators. During each simulated examination session, which usually takes place in an assigned examination room, the student interacts with the patient during an appointed time period to make a diagnosis of the patient's ailment and to prescribe a proposed treatment plan or perform a procedure. Each examination room is equipped with monitoring equipment, including audio, visual, and time recording devices, so that the student's simulated encounter with the patient can be monitored in real time by an evaluator, such as a faculty member or an upperclassman. Typically, simulation training sessions are also recorded on video for subsequent analysis and teaching purposes. A similar configuration is used in other industries for other types of training sessions.
Also, actual procedures such as a surgery performed in a hospital or an assembly in a manufacturing plant may be recorded by monitoring equipment for further analysis and study.
The monitoring equipment in the examination/practice rooms may include multiple audio/video (A/V) sources, e.g., video cameras, to provide various camera angles of the training session. A typical recording session may have three video feeds, for instance, taken from different camera angles, and one of the video feeds might show a machine that displays data from a simulator, such as EKG, heart rate, or blood pressure data. Also, other monitoring equipment may be used, e.g., equipment that receives output from sensors.
To combine data from different sources (monitoring equipment) for centralized analysis such as quantifiable analysis and management, various analytical applications are developed. For example, U.S. patent application Ser. No. 11/611,792 filed Dec. 15, 2006 by Lucas Huang and Chafic Kazoun titled Synchronous Multi-Media Recording and Playback with End User Control of Time, Data, and Event Visualization for Playback Control Over a Network describes such an analytical application. The disclosure of this application, application Ser. No. 11/611,792, is incorporated herein by reference in its entirety. Accordingly, multiple different data sources and various analytical applications exist for managing the data from these data sources.
Currently within the healthcare IT environment and other environments, users must deal with the complexity of managing multiple analytical systems and data sources for medical events. As explained above, the data needs to be centrally managed, but there is no single interchangeable format for all types of devices and no single application programming interface (API) for data exchange between these different analytical systems and data sources.
Healthcare IT companies typically have proprietary APIs and sometimes make these selectively available. Furthermore, all the different device manufacturers implement a different mechanism of storing data, and few implement a way to exchange data and interact with the devices. These devices sometimes need to be the master driver of a simulation start and end. As such, the devices need to communicate with the analysis application, rather than the analysis application communicating with the device.
An approach that has been attempted is for the industry to agree on a single standard data format. This has been in the works for years and has not gone anywhere. Different companies cannot agree to use a single format.
Conventionally, there is no way for a device to communicate to an external system in a consistent manner across devices and the data for the different devices is not normalized.
SUMMARY
According to exemplary, non-limiting embodiments, communication between the data sources and the analytical application is established. The entities involved in the communication can have data in different formats. The engine in exemplary, non-limiting embodiments will take the source data from various formats and normalize the data into a universal, standard format.
An aspect of an exemplary embodiment is to provide a single flexible API for exchanging data and events. For example, data from various sources is exchanged with the analytical application or other devices using the flexible API. Furthermore, important events are indicated and data is mapped. A mechanism to flexibly add additional devices is provided.
According to an exemplary, non-limiting embodiment, a method is provided in which data is standardized. The method includes receiving data from a device, determining, by a computer, a type of information provided in the received data, converting the information into a predetermined format based on the determined type, and generating a message based on the determined type. The message includes the converted information in the predetermined format. The type of the information is different based on a stage of a process in which the received data was provided.
According to yet another exemplary, non-limiting embodiment, a system for standardizing data is provided. The system includes a receiving unit, which receives data from a device, a determining unit, executed by a processor, which determines a type of information provided in the received data, a conversion unit which converts the information into a predetermined format based on the determined type, and a generation unit which generates a message based on the determined type. The message includes the converted information in the predetermined format. The type of the information is different based on a stage of a process in which the received data was provided.
According to yet another exemplary, non-limiting embodiment, a non-transitory computer readable medium storing instructions executable by a computer for implementing a method of standardizing data is provided. The method includes receiving data from a device, determining a type of information provided in the received data, converting the information into a predetermined format based on the determined type, and generating a message based on the determined type. The message includes the converted information in the predetermined format. The type of the information is different based on a stage of a process in which the received data was provided.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this specification, exemplify the exemplary embodiments and, together with the description, serve to explain and illustrate exemplary embodiments. Specifically:
FIG. 1 is a block diagram illustrating a simulation training system according to an exemplary embodiment.
FIG. 2 is a block diagram illustrating a data integration system according to an exemplary embodiment.
FIG. 3 is a flow chart illustrating configuration of a simulation training system according to an exemplary embodiment.
FIG. 4 is a flow chart illustrating communication stages of an exemplary simulation training system according to an exemplary embodiment.
FIGS. 5A and 5B are views respectively illustrating a generated report message and a corresponding report output according to an exemplary embodiment.
FIG. 6 is a block diagram illustrating another data integration system according to an exemplary embodiment.
FIGS. 7A and 7B are a block diagram and a flow chart, respectively, illustrating a session data module according to an exemplary embodiment.
FIGS. 8A and 8B are a block diagram and a flow chart, respectively, illustrating a report module according to an exemplary embodiment.
FIG. 9 is a flow chart illustrating operations of a simulation training system according to an exemplary embodiment.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
Products from B-Line Medical® enhance simulation-based medical training by recording simulation sessions and integrating data from simulators with video recordings and capture of a patient monitor or other XGA source. The simulator data management portion is managed by an integration engine ("Integration Engine") that connects with a range of simulators from numerous manufacturers. Software Development Kit ("SDK") components allow simulator manufacturers to develop robust and reliable integration with products of B-Line Medical by working with a stable, well documented application programming interface ("API").
FIG. 1 is a block diagram illustrating a simulation training system in accordance with an exemplary embodiment. An exemplary embodiment provides a web-based simulation training system 10 for providing synchronous multimedia recording and playback of recorded training sessions. The simulation training system 10 includes a training center 12 that has equipment for communicating with an analysis application over a network 16, such as the Internet. The training center 12 conducts and records simulation training sessions in one or more training rooms equipped with multiple audio/video (A/V) sources 18, multiple encoder/recorders 20, a time sync generator 22, and a simulator data source 24.
The training sessions are recorded using the A/V sources 18 and the data is sent to respective encoders/recorders 20. The A/V sources 18 in an exemplary embodiment are video cameras, but the A/V sources 18 include any type of capture device, such as an auxiliary microphone or a still camera, and the like. The training sessions involve one or more trainees (not shown) who perform simulated procedures on, or otherwise interact with, at least one simulator data source 24 that outputs real-time data in response. In an exemplary embodiment, real-time may include actions/events occurring at the current time as well as those that occur within the context of a session. That is to say that the data is not post-processed or collected first, then manipulated after the session is complete.
The type of training conducted by the training center 12 will be described in terms of medical training that would be suitable for doctors, nurses, and emergency response personnel, but the exemplary embodiments are applicable in any type of training that involves the use of any type of simulator and any type of procedures that involve processing of data from various input devices.
Example types of simulator data sources 24 in the medical industry include full-body mannequin simulators, virtual reality simulators, EKG machines, blood pressure monitors, and so on. However, the simulator data source 24 may be a device or a machine used in an actual procedure performed in a hospital and not related to training. Also, the simulator data source 24 may be a device unrelated to the medical field, e.g., a sales training device in a corporate environment or a monitoring device in a security system.
The encoders/recorders 20 and the simulation capture tool 36 may be located remote from the training center, e.g., at the physical location of the server 14.
A server (or a number of servers) 30, which is one or more computers that has a processor such as a CPU and a memory, is provided for running the analysis application 31. The analysis application is software that may have a skills assessment tool. The skills assessment tool includes a debrief tool and an annotation and assessment tool. The analysis application may be implemented as a custom application that is installed at the training center 12, and accessed directly by clients, i.e., end users 42, over a network such as the Internet 16.
The analysis application 31 accesses various data stored in databases/archives 32 such as a session data archive, a simulation data archive, a multimedia archive, and so on.
For example, in response to a training session being conducted, the system synchronously records in real-time both simulator data from a simulator data source 24 captured by a simulator capture tool, and video of the training session captured by a plurality of the A/V sources 18. The simulator data may be metric data obtained directly from a simulator or medical equipment, e.g., blood pressure of the patient at time t1, heart rate, etc. The simulator data is captured by the simulation capture tool 36. The time sync generator 22 is coupled to the encoders/recorders 20 and to the simulator capture tool 36, to control the synchronization of the recordings.
During the recording, each of the videos captured by the A/V sources 18 is encoded as a respective digital media file in streaming media format. Streaming media is media that is consumed (heard and/or viewed) while the media is being delivered. The videos captured by the A/V sources 18 may be encoded by the encoders/recorders 20. The simulator data may be captured in its raw and/or compressed format. In another exemplary embodiment, the simulator data can be captured using one of the A/V sources 18 by recording a video of the output of the simulator itself, e.g., by capturing a video of an EKG display. The simulation data may be encoded by the simulation capture tool 36.
During recording of the training session, the simulation data and the digital media files of the video and/or audio feeds are transmitted to the analysis application 31. The simulation data is sent to the analysis application by the simulation capture tool 36, where it is stored in the simulation data archive 32 and indexed by an ID of the training session. The video media files are sent to the analysis application 31 by the encoders/recorders 20 and are stored in the multimedia archive 32.
Both the simulator data and a stream of the video media files may be transmitted to the end users 42 over the network 16, such that when the end users 42 receive the simulator data and the stream, the respective videos are synchronously played back with the simulator data on a device of the end user 42 using a browser 44 and a media player 46.
Each of the A/V sources 18 may be providing data in different formats. Also, various simulators 24 have different formats for the data. As illustrated in FIG. 2, various devices 50a, 50b, and 50c provide data in different formats to the API 51, which converts the data to the appropriate format. The data is then processed and normalized and provided to a server for storage in the databases 52.
For example, the simulator data may include physiological statistics that are sent to the database 52 every time a change occurs. The rate at which data is sent to the database is configurable, e.g., once per second, once per millisecond, etc. The change may be transmitted in the form of raw data including numeric values, for example "140, 90". The API 51 may process these numbers as being from a device 50c, which it identifies as a blood pressure measuring machine. Accordingly, it may then format the raw data into a message format that will include the following attributes: time, sbp=140, dbp=90. The message may then be sent to the storage 52.
Different attributes may be assigned to different physiological statistics. For example, raw data "80" may be received from the device 50b. The API 51 will identify the device 50b as an oxygen level measuring machine. The raw data "80" may then be converted to a message with the following attributes: time, oxylvl=80%, display="oxygen level at " [oxylvl] "%". That is, one or more additional attributes may provide information regarding how to present the statistics to the user.
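By way of a non-limiting illustration, the normalization described above might be sketched as follows. The device identifiers, parser functions, and attribute names (sbp, dbp, oxylvl, display) are assumptions for illustration only, not the actual API schema.

```python
# Hypothetical sketch: map a device's raw reading to named, standardized
# attributes and stamp the result with a time value.
import time

# Maps a device ID to a function that turns its raw string into named attributes.
DEVICE_PARSERS = {
    "device_50c": lambda raw: dict(zip(("sbp", "dbp"),
                                       (int(v) for v in raw.split(",")))),
    "device_50b": lambda raw: {"oxylvl": int(raw),
                               "display": f"oxygen level at {int(raw)}%"},
}

def normalize(device_id, raw, timestamp=None):
    """Convert a raw reading into a standardized, timestamped message."""
    attrs = DEVICE_PARSERS[device_id](raw)
    return {"time": timestamp if timestamp is not None else time.time(), **attrs}

msg = normalize("device_50c", "140, 90", timestamp=0)
# msg -> {"time": 0, "sbp": 140, "dbp": 90}
```

A real converter would also attach presentation attributes, as with the display attribute for the oxygen level machine above.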
An exemplary API provides a stable integration point and an improved user experience. Since the exemplary protocol utilized by the API is specifically designed for integration, it is protected against versioning problems: updates to the protocol are additive only and preserve backwards compatibility. Time synchronization is accomplished as part of the protocol, providing accurate alignment of data elements to video-recorded events. The user experience is easier and simpler, with the user managing only the simulator. The exemplary API is managed behind the scenes, reducing opportunities for user error and the need to train users.
The exemplary API may be used with learner-initiated simulation sessions in which there is no operator viewing and annotating the live session. For a simulator to take advantage of the API, the communication from the simulator to the analysis application is configured. This can be done during integration development. Each simulator is identified during the configuration by the API. A connection data file is then generated that includes a unique ID for the simulator and the connection information to the analysis application.
For example, in order for the simulator to communicate with the API, the following operations may be performed to configure the simulator. In operation 61, the simulator is identified to the API, e.g., as blood pressure measuring equipment, a video recorder, and so on.
In operation 62, a connection file is downloaded to the simulator that provides details on how to connect with the API. Of course, an exemplary connection file may be obtained via other means known in the art. Connection to the analysis application can occur either encrypted or unencrypted. Each simulator must be set up prior to being used. After setting up the simulator, a simulator connection file is downloaded, which is an XML file that can be copied to the simulator and provides details on how it would connect to the analysis application. Each connection file contains only a single root element.
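A hypothetical connection file and a routine to read it might look like the following. The specification above states only that the file is XML with a single root element; the element names, attribute names, and values here are illustrative assumptions.

```python
# Sketch of parsing a simulator connection file (hypothetical schema).
import xml.etree.ElementTree as ET

SAMPLE = """<connection simulatorId="sim-001" encrypted="true">
  <server host="analysis.example.local" port="8443"/>
</connection>"""

def read_connection(xml_text):
    # The connection file has a single root element, per the specification.
    root = ET.fromstring(xml_text)
    server = root.find("server")
    return {
        "simulator_id": root.get("simulatorId"),
        "encrypted": root.get("encrypted") == "true",
        "host": server.get("host"),
        "port": int(server.get("port")),
    }
```

Since a simulator can connect to multiple analysis applications, a simulator might store several such files and let the user choose among them.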
Since a simulator can be connected to multiple analysis applications, the simulator should be able to store multiple possible connections and provide a way for the user to choose which analysis application to use.
In operation 63, security is established, e.g., an encryption technique is determined, such as a hybrid cryptosystem.
In operation 64, the users in the system are synchronized such that when a session is initiated, a shared unique identifier, such as a user name, is provided by the simulator to the API that matches a user in the analysis application. The initial user synchronization occurs during the configuration stage. One mechanism to provide a synchronized user list is for the simulator to pull existing users from the analysis application via the API. The reverse is also possible; the simulator can provide a list of users to the API to synchronize in the analysis application. In both mechanisms, a query or filter can be provided to specify a subset of users that should exist in both systems. Another mechanism is through a user directory system such as Lightweight Directory Access Protocol (LDAP) or Active Directory. The API and simulator can independently synchronize with the same user store on the user directory, and thus result in the same user lists.
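The pull or push synchronization with an optional filter might be sketched as follows; the function name and the representation of users as plain strings are assumptions for illustration.

```python
# Minimal sketch of user synchronization: both systems converge on the
# union of their user lists, optionally restricted by a filter predicate.
def sync_users(analysis_users, simulator_users, user_filter=None):
    """Return the user list both systems should hold after a sync."""
    combined = set(analysis_users) | set(simulator_users)
    if user_filter is not None:
        combined = {u for u in combined if user_filter(u)}
    return sorted(combined)
```

A directory-based mechanism (LDAP or Active Directory) would instead have each side pull from the same external user store, arriving at the same list independently.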
In operation 65, content updates should be performed during the configuration stage. That is, simulators that maintain training or educational content should send updated content hierarchies to initially populate the analysis application. Subsequent updates will occur whenever that data changes. The analysis application will then use this information to associate its content, i.e., scenarios, with specific simulator content, thus providing accurately synchronized content from the simulator. For example, a message may be transmitted to the analysis application that would declare that it is an update message having content elements. Each element will have a reference identifier identifying a particular device and a text attribute that identifies the content being changed to a user.
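A content update message of the kind just described might be assembled as follows. The tag names ("update", "content") and attribute names ("refid", "text") are illustrative assumptions, not the documented message schema.

```python
# Sketch of building an update message carrying content elements, each with
# a reference identifier and a display text attribute.
import xml.etree.ElementTree as ET

def build_content_update(elements):
    """elements: iterable of (reference_id, display_text) pairs."""
    root = ET.Element("update")
    for ref_id, text in elements:
        ET.SubElement(root, "content", {"refid": ref_id, "text": text})
    return ET.tostring(root, encoding="unicode")
```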
Once the initial configuration is complete, the communication can occur at any point and progresses through the following stages as depicted in exemplary FIG. 4. A standard for identifying these stages and providing any data gathered therein to the analysis application is detailed below.
In a first stage, preparation 71, a simulation session is being initiated and necessary setup is performed. For example, the simulator sends an early notification message to the API indicating that a session is about to start. Accordingly, the API may take custom actions based on the type of simulator, the type of scenario being run, etc., which may involve gathering initial statistics of the current state of the simulator, current time, available scenarios, etc., or it may do nothing if no preparation is necessary.
Next, the session is started in stage 72. Specifically, the operator provides information on the session to be run, including user information and simulation content details. For example, user name (login) and case type (content selection) are input.
When a user logs into the simulator, a message in a predetermined format should be sent to the analysis application. The system recognizes that a user logged in and converts the login information into a predetermined format to be sent to the analysis application. For example, "login" can be the message sent to the analysis application.
When a user chooses which module and case they are going to run (content elements), this indicates the start of the session. The simulator device provides the user identifier and the content element to the analysis application, e.g., if user Joe will perform an endoscopy on a 45-year-old male, then the predetermined message may contain: Joe; endoscopy; 45 male.
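The session-start message above might be produced by a helper such as the following; the function name and the semicolon-delimited wire format are assumptions based on the example in the text.

```python
# Sketch of composing the session-start message from the user identifier
# and the chosen content element, e.g. "Joe; endoscopy; 45 male".
def session_start_message(user, procedure, patient):
    return "; ".join((user, procedure, patient))
```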
In the next stage 73, recording and Patient Start occur. For example, Patient Start indicates the moment the simulated patient's time begins, allowing precise alignment of the simulator, other data, and events. The session is recorded upon receiving a simple message from the simulator device. The recording may start in the middle of the session depending on the content element (scenario). For example, "start" can be the message sent to the analysis application to indicate the start of the recording.
This stage may occur before or after the Start Recording, which is recording of the session using a video camera for example. The Patient Start indicates the start of the user's encounter with the patient and may be provided as a date and time value or as a flag to indicate the start of a counter. That is, in an exemplary embodiment, a counter for a session may be started to synchronize various data sources.
In the next stage, Simulation in Progress 74, events and statistics are collected. This data may be sent in real time or in batches during the session. In an exemplary embodiment, two types of messages are generated, one for the statistics and one for the events.
Statistics include physiological statistics such as heart rate, blood pressure, and so on, which were briefly described above. Simulators should provide physiological statistics throughout the simulation if they track or calculate such data. These messages should be sent every time any of the tracked statistics change. The statistics may be provided with the counter attribute. For example, a third touch of the heart at twenty minutes after the beginning of the procedure may be the statistic sent, in a predetermined format such as 3; heart; 20, to the analysis application.
Events include notable actions by a user. For example, an event may include visualizing a polyp, choosing a tool, performing a cutting operation, infusing air, etc. Events may be standardized into an exemplary format that includes, for example, the time since the Patient Start stage and an identification of the event in a text format. Some additional fields may include: 1) type of event, such as a surgery event, a drug event, and a comment; 2) importance of the event, e.g., a priority assigned to the event; and 3) gradable item, i.e., whether the event is correct or not correct. For example, an event may be administering 20 cc of morphine with 200 cc of saline via IV in 20 minutes. The event will then have a time field of 20 minutes and a description such as administer pain medication (the type of event may be drug). The event will be provided to the analysis application in this predefined format.
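An event message with the fields listed above might be built as follows; the key names and the dictionary representation are assumptions for illustration, not the documented format.

```python
# Sketch of a standardized event: time since Patient Start, a text
# description, and the optional type, importance, and gradable-item fields.
def build_event(minutes, description, event_type=None, priority=None, correct=None):
    event = {"time": minutes, "description": description}
    if event_type is not None:
        event["type"] = event_type      # e.g. "surgery", "drug", "comment"
    if priority is not None:
        event["priority"] = priority    # importance of the event
    if correct is not None:
        event["correct"] = correct      # gradable item: correct or not
    return event

# The morphine example from the text: a drug event at 20 minutes.
morphine = build_event(20, "administer pain medication", event_type="drug")
```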
In the next stage 75, the recording is stopped. By way of an example, a simple message may be sent to stop recording, e.g., "stop". Typically, this message should be sent before sending the report, but it can be sent any time before the end of the session. Recording will be automatically stopped at the end of the session even if the message is not sent. However, usually, even when the recording is stopped, the session is still open to allow post-session data processing and transmission by the simulator.
Another optional stage is reporting 76. This stage usually takes place after the recording is stopped and can be omitted altogether. During the report 76 stage, transmission of summarized data and images that are not time-specific occurs. For example, the report may include images, individual results, groups, and tables. An individual result may be provided within a table or a group. The individual result may include identification of the result, a value, an image, and a grade value (correct or not, or a numeric grade). Another individual result may be a summation of a number of events that occurred during the simulation, and so on.
A simulator may provide a report message to the analysis application in a standardized format. This report, however, varies greatly between simulators and by content. Accordingly, the format of a report message is designed to be flexible to handle any data. Reports may consist of a more complex message which allows for grouped as well as tabular results as noted above. The report may include name, group name for results and result data. The result data may be text, images, grades, or some other values. The report will be structured in a predetermined format for various types such as type of a message: report, image result, value, image or report, table, value, column, rows, descriptions, etc.
For example, the format of an unencrypted report message 78a from the simulator according to an exemplary embodiment is depicted in FIG. 5A. Specifically, the report message 78a may include images 78b. Also, the report message 78a may include a number of groups 78c and a table 78d. For example, for the group, a title 78e is provided along with values 78f for each attribute in the group. Also, one or more links to various images may be provided 78g. The table 78d of the report 78a may include a title 78h, column titles 78i, row titles 78j, and values 78k. It is noted that the above-described report message 78a is provided by way of example only and is not limiting. For example, the report message may include an individual image or just an individual result:
<result text=“blood pressure”; Value “100, 50”; corrvalue=“1”>.
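The flexible grouped and tabular report structure described above might be assembled as follows. This is a sketch only: the key names ("type", "groups", "table", etc.) and the dictionary shape are assumptions, not the actual message schema.

```python
# Sketch of building a report message with optional images, titled groups
# of attribute/value pairs, and one table with columns and rows.
def build_report(images=(), groups=(), table=None):
    report = {"type": "report", "images": list(images), "groups": []}
    for title, values in groups:
        report["groups"].append({"title": title, "values": dict(values)})
    if table is not None:
        title, columns, rows = table
        report["table"] = {"title": title,
                           "columns": list(columns),
                           "rows": [list(r) for r in rows]}
    return report
```

Because the report varies greatly between simulators and content, the format deliberately accepts any mix of groups, tables, and individual results.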
In an exemplary embodiment, the API will analyze the report message and display the results, as illustrated for example in FIG. 5B. Specifically, the displayed results 79a may include an image 79b, group results 79c, and a table 79d. Although in FIG. 5B the group results appear in a table format, group results may be provided in different formats, such as a list and so on.
In the exemplary embodiment, the report messages, such as the one depicted in FIG. 5A, may then be encrypted to preserve the security and integrity of the messages being transmitted.
In addition to the reports provided directly by the simulator, the analysis application can also use the event and statistics data to generate session reports that may include a list of events that occurred, e.g., drug administration, or graphs of the statistical data, e.g., heart rate over time. As previously mentioned, the time synchronization provides accurate alignment of these data elements to video-recorded events and allows for simultaneous playback of the recorded video with data elements collected from the simulator.
A final stage is End Session 77. End Session 77 indicates that all aspects of the session are complete and all data has been transmitted. If the End Recording stage was skipped, recording is stopped at this time.
In an exemplary embodiment, unless otherwise specified, all messages sent from a simulator to the analysis application can receive a response. For example, the response may be "ok," "warning," or "error." An "ok" message is sent when the message is processed without exception or conflict and is the normal expected response. A warning is sent when the message is processed successfully, but the outcome might not be what the author intended. An error message is sent when the generated messages cannot be accepted by the analysis application or an exception occurs while processing a particular message. Besides these responses from a recognized simulator call, it is also possible that an unrecognized, non-standard response will be received in cases where an error is provided by the server directly. This could be caused by various setup problems, including an invalid address, using the wrong port based on client setup, or exceeding the server connection limit.
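The simulator's handling of these responses might be sketched as follows; the classification labels returned by the helper are assumptions for illustration.

```python
# Sketch of interpreting the three standard responses plus the
# unrecognized-response case described above.
STANDARD_RESPONSES = {"ok": "processed",                 # normal expected response
                      "warning": "processed-with-warning",  # succeeded, outcome may differ
                      "error": "rejected"}               # message not accepted

def classify_response(response):
    # Unrecognized responses come from the server directly (e.g. invalid
    # address, wrong port, connection limit exceeded) rather than from the
    # analysis application's message processing.
    return STANDARD_RESPONSES.get(response, "non-standard")
```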
In addition to sending the above messages to the analysis application during a session, in an exemplary embodiment, the simulator may store a copy of all messages in a single local file. Users would be able to upload this file to an analysis application and attach the data to a recorded session. The file format should have a <session> element as its root tag and otherwise would have identical content as the in-session messages mentioned above. In this file, only the root <session> element is required to have a simulator attribute identifying the simulator; it can be left off all of the other elements.
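Such a local message log might be produced as follows. Only the <session> root element and its simulator attribute come from the text above; the child element names are placeholders for copies of the in-session messages.

```python
# Sketch of writing the single local file: a <session> root carrying the
# simulator attribute, wrapping copies of the in-session messages.
import xml.etree.ElementTree as ET

def build_session_file(simulator_id, messages):
    """messages: iterable of (tag, attribute_dict) pairs."""
    root = ET.Element("session", {"simulator": simulator_id})
    for tag, attrs in messages:
        # Child elements omit the simulator attribute; only the root needs it.
        ET.SubElement(root, tag, attrs)
    return ET.tostring(root, encoding="unicode")
```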
As discussed above, different simulators and other devices provide data in different formats, which needs to be converted into a standardized format for use by the analysis application.
FIG. 6 is a block diagram illustrating a data integration system according to an exemplary embodiment. The API supports data integration with medical simulators through a data push mechanism initiated by the simulator, as shown in FIG. 6. For example, the data from the devices (such as simulators) 80a . . . 80z (where z is a positive integer) may be recorded without the user interacting with the analysis application 89. That is, the devices 80a . . . 80z push data to the analysis application 89 and the session is recorded and associated behind the scenes without operator interaction.
The message from the device is converted by a converter 81a . . . 81z to a standard format recognized by the analysis application 89 and transmitted to the analysis application 89 via a network such as the Internet (not shown).
In particular, various devices use different languages and provide data in various different languages and formats. An exemplary embodiment may include a device that implements its application in the C++ programming language. A single flexible API may be implemented using a .NET solution with a reusable assembly. For the .NET solution to understand exchanged data from the devices, the devices must provide a custom converter that translates its C++ application commands into C#/.NET. Another exemplary embodiment may include a device that implements its application in the C# programming language using the .NET platform. Since this language matches that used in the provided API, no further translation of the message is required and the device may exchange data directly with the API by calling the commands in the API.
For example, a device 80a may use a serial port communications protocol and provide its data to a plug-in converter 81a for translation. The plug-in converter 81a converts the data into a standardized format, as described above and detailed below. A device 80b may use .NET data tables; the data is provided as .NET data tables and is converted into a standardized format. A device 80c may use XML; accordingly, the XML data from the device 80c is provided to the converter 81c. There may be many other devices, and the devices depicted in FIG. 6 are provided by way of example and not by way of limitation. A device 80x may use an SQL database and provide data in an SQL format to the converter 81x, and a device 80y may use plain text format and provide data in a text file format to the converter 81y. Also, a device 80z may communicate in a proprietary binary data format; accordingly, converter 81z converts the binary data into a standardized format.
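The plug-in converter concept above can be sketched as a common interface with one implementation per device format. This is an illustrative sketch only: the class names, the standardized dictionary shape, and the sample plain-text payload are assumptions, not the specification's actual API.

```python
from abc import ABC, abstractmethod

# Sketch of the plug-in converter concept: each converter translates a
# device-native payload into a standardized form the analysis application
# understands. Names and record shape are illustrative assumptions.
class Converter(ABC):
    @abstractmethod
    def to_standard(self, raw):
        """Translate device-native data into the standard format."""

class PlainTextConverter(Converter):
    # e.g. a device like 80y might send comma-separated text
    def to_standard(self, raw: str) -> dict:
        kind, source, *values = [field.strip() for field in raw.split(",")]
        return {"type": kind, "source": source, "values": values}

converter = PlainTextConverter()
standard = converter.to_standard("stats, pressure measuring device, 140, 60")
print(standard)
```

A binary or SQL converter would implement the same `to_standard` interface, so the rest of the pipeline never needs to know which device produced the data.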
Once the data is converted into a standard format, the API determines whether the data is login data, content selection data, start/stop data, session data, or reports data. That is, a respective module is provided to place the standardized data in a respective message.
For example, if the raw data is identified as login information, then the login module 84 may generate a message that would instruct the analysis application 89 to prepare to receive data. In fact, no actual data may be provided, but only an instruction to the analysis application 89 to “stand by”.
Data may be identified as content selection data, which is provided to the content selection module 85. The content selection module 85 will generate a message to the analysis application 89 that may include a user name, content reference type, and so on. Since the data is provided in the standard format, the content selection module 85 recognizes the data and inserts it into the message it generates.
Data may also be identified as start/stop data, which is provided to the start/stop module 86. That is, once the data signals provided by the various devices 80a-80z are converted into a standardized format, it is easy to recognize that the signals should be provided to the start/stop module 86. When the start/stop module 86 receives these signals, it will generate a corresponding message to the analysis application 89. For example, it will generate a message that requests the analysis application to start recording the session, to start the patient counter, and so on.
Some data will be data provided during the session, i.e., simulation data. Once converted into the standard format, the data is identified as session data by the standard format module 83 and is provided to the session data module 87. The session data module 87 generates the proper message and provides the message to the analysis application 89.
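The routing of standardized data to the respective modules can be sketched as a simple dispatcher. The module roles mirror FIG. 6, but the record shape, type names, and return strings are assumptions made for this sketch, not the specification's messages.

```python
# Illustrative dispatcher, assuming each standardized record carries a "type"
# field naming its category. Handler behavior mirrors the modules of FIG. 6:
# login module 84, content selection module 85, start/stop module 86,
# session data module 87, reports module 88.
def route(record: dict) -> str:
    handlers = {
        "login": lambda r: "stand by",
        "content": lambda r: f"content selected by {r['user']}",
        "start": lambda r: "start recording",
        "stop": lambda r: "stop recording",
        "stats": lambda r: "session data (statistics)",
        "event": lambda r: "session data (event)",
        "report": lambda r: "reports data",
    }
    try:
        return handlers[record["type"]](record)
    except KeyError:
        # unrecognized data is reported back upstream rather than dropped
        return "error: unrecognized data"

print(route({"type": "login"}))
print(route({"type": "content", "user": "student1"}))
```

Because every converter emits the same standard form, this single dispatch point suffices regardless of which device pushed the data.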
For example, FIG. 7A is a block diagram illustrating the session data module 87 according to an exemplary embodiment. The session data module 87 may include an input unit 90a, a processing unit 90b, a statistics unit 90c, an event unit 90d, an output unit 90e, and a controller 90f. That is, the session data module 87 receives data via the input unit 90a from the standard format module 83, or directly from a converter if the standard format module is omitted. The received data is provided to the processing unit 90b. The processing unit 90b is configured to process the data and determine whether the data is statistics data or an event. If the data is statistics data, it is provided to the statistics unit 90c. The statistics unit 90c accesses a database (not shown) to determine attributes of the provided data, e.g., bp=blood pressure, and so on. As a possible variation, the statistics data may include an identification of the device 80a . . . 80z, and based on this device type, a respective database may be accessed. This may increase the processing speed of the statistics unit 90c. The statistics unit 90c generates a message for the analysis application 89 and places the data into this message. For example, the message may be: output “blood pressure=”; [dataA] “.” The generated message is then provided to the output unit 90e for transmission to the analysis application 89.
On the other hand, if the processing unit 90b shown in FIG. 7A determines that the data is an event, the data is provided to the events unit 90d. The events unit 90d (similar to the statistics unit 90c) will determine the type of event by accessing one or more databases. Once the type of event is determined, a predetermined message is generated. The generated message may include the retrieved identifier of the event and its value, the time from the start of the patient, the priority of the event, and/or whether it was performed correctly. The generated message is then provided to the output unit 90e for transmission to the analysis application 89.
If the data is identified as not being session data, an error message may be sent to the converter and/or the standard format module. The controller 90f controls the operations of the session data module 87, including the input unit 90a, the processing unit 90b, the statistics unit 90c, the events unit 90d, and the output unit 90e. It may instruct the input unit 90a to receive data or to temporarily stop receipt of data if a backup is detected. Similarly, it may instruct the output unit 90e to output data or to report problems with data receipt, processing, message generation, and/or output.
FIG. 7B is a flow chart illustrating operations executed by the session data module 87 according to an exemplary embodiment. In operation 91, session data is received from the standard format module 83. In operation 92, the type of session data is determined. That is, the session data module determines whether the data is an event or statistics. In operation 93, the session data module checks whether the provided data is event data. If the provided data is not event data (N in operation 93), the type of statistics data is determined.
For example, the session data module 87 may identify the type of device that provided the data and access a corresponding database to determine the type of statistics provided. That is, the statistics data may include “stats, pressure measuring device, 140, 60.” The type of device may then be identified as a pressure measuring device, which would mean that the raw data is sbp and dbp (systolic and diastolic blood pressure). Accordingly, in operation 95, the session data module 87 may generate a message [output: “blood pressure=” [140, 60] “.”]
On the other hand, if the statistics data includes “stats, pulse measuring device, 140, 60,” the type of “device” may then be identified as a human nurse measuring the pulse of a patient and manually inputting the pulse into her electronic chart, which automatically adds a time value to the input pulse. This may mean that the patient had a pulse of 140 at 60 seconds from the start, for example. Accordingly, in operation 95, the session data module 87 may generate a message [output: “pulse=” [140], at [60] “seconds from the start of the procedure”] based on the statistics data received from the electronic chart.
The generated message is output to the analysis application in operation 96.
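The two statistics examples above hinge on the same raw values ("140, 60") being interpreted differently depending on the device type. A minimal sketch of that lookup, with the device names taken from the examples and the template strings assumed for illustration:

```python
# Sketch of operations 94/95: the message template is chosen by device type,
# so identical values ("140", "60") yield different messages. Templates are
# illustrative assumptions based on the examples in the text.
TEMPLATES = {
    "pressure measuring device":
        lambda v: f"blood pressure={v[0]}, {v[1]}.",
    "pulse measuring device":
        lambda v: f"pulse={v[0]}, at {v[1]} seconds from the start of the procedure.",
}

def stats_message(source: str, values: list) -> str:
    # look up the device-specific template and fill in the raw values
    return TEMPLATES[source](values)

print(stats_message("pressure measuring device", ["140", "60"]))
print(stats_message("pulse measuring device", ["140", "60"]))
```

Keeping the templates in a per-device table is what allows a new device type to be supported by adding a database entry rather than changing the module's code.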
On the other hand, if the data is identified as an event (Y in operation 93), then the type of event is determined in operation 97. The type of event may be determined based on the identifier of the device 80a . . . 80z or based on the attributes identifying the event. Once the event is identified, a corresponding event message may be generated in operation 98. The event message may include a description of the event with or without a value, the time since the patient start, the type of event, the priority of the event, and/or whether it was performed correctly. Next, in operation 96, the message is transmitted to the analysis application 89.
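The event branch above can be sketched as a database lookup followed by assembly of the listed message fields. The event identifier, the stand-in event database, and the field names are illustrative assumptions; only the set of fields (description, value, time since start, priority, correctness) comes from the text.

```python
# Sketch of operations 97/98: look up the event type, then build the event
# message fields described in the text. EVENT_DB stands in for the one or
# more databases the events unit 90d would consult.
EVENT_DB = {
    "chest_compression": {"description": "chest compression", "priority": "high"},
}

def event_message(event_id: str, value=None, elapsed_s: int = 0,
                  correct: bool = True) -> dict:
    info = EVENT_DB[event_id]          # operation 97: determine the event type
    return {                           # operation 98: generate the message
        "description": info["description"],
        "value": value,                # may be omitted per the text
        "time_since_start": elapsed_s, # time since the patient start
        "priority": info["priority"],
        "performed_correctly": correct,
    }

msg = event_message("chest_compression", elapsed_s=35, correct=True)
print(msg)
```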
Referring back to FIG. 6, some data will be report data that may be provided after the session is completed. Once converted into the standard format, the data is identified as reports data by the standard format module 83 and is provided to the reports module 88. The reports module 88 generates a message inserting the received reports data and provides the message to the analysis application 89.
For example, FIG. 8A is a block diagram illustrating the reports module 88 according to an exemplary embodiment. The reports module 88 may include an input unit 100a, a reports unit 100b, an output unit 100c, and a controller 100d. That is, the reports module 88 receives data via the input unit 100a from the standard format module 83, or directly from a converter if the standard format module is omitted. The reports unit 100b then determines the attributes of the provided report data, e.g., whether it contains group data, table data, individual results, and/or images.
If the reports unit 100b determines that the raw data received contains the “group” attribute, then a grouped result will be inserted into the report message. A group result may include one or more individual results. If the reports unit 100b determines that the raw data received contains the “table” attribute, then a tabular result will be inserted into the report message. A table result may include one or more individual results. If the reports unit 100b determines that the raw data received contains the “image” attribute, then the image will be inserted into the report message. It is important to note that more than one instance of each attribute can be contained in the report data, in any order. The collective generated report message is then provided to the output unit 100c for transmission to the analysis application 89.
FIG. 8B is a flow chart illustrating operations executed by the reports module 88 according to an exemplary embodiment. In operation 101, report information is received from the standard format module 83 or a converter. In operation 102, the report attributes are determined. That is, the reports module 88 determines whether the data contains a result and/or an image by parsing the report. In operation 103, the reports module checks whether the provided information is a result. If the provided data is not a result (N in operation 103), the attribute “image” is determined. Accordingly, the image is inserted into the reports message in operation 104.
On the other hand, if the information is identified as a result (Y in operation 103), then the reports module must determine whether the data is tabular or grouped. In operation 105, the reports module checks whether the provided information is a grouped result. If the provided data is not a grouped result (N in operation 105), the “tabular” attribute is determined. Accordingly, the tabular result is inserted into the reports message in operation 106. If the data is determined to be a grouped result (Y in operation 105), then the grouped result is inserted into the reports message in operation 107.
The reports unit 100b iterates through the report attributes as long as the end of the report is not reached (N in operation 108). As noted above, as the attributes of the report information are determined, the respective data (image, operation 104; tabular result, operation 106; grouped result, operation 107) is inserted into the reports message. Once the end of the report is reached (Y in operation 108), the collective reports message is generated in operation 109 and output to the analysis application in operation 110.
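The iteration of FIG. 8B can be sketched as a single pass that sorts each report item into the collective message by attribute. The attribute names ("group", "table", "image") follow the text; the item and message shapes, and the sample data, are assumptions made for the sketch.

```python
# Sketch of FIG. 8B: iterate over report items in whatever order they arrive
# (operation 108 loop), sorting each into the collective reports message,
# then return the assembled message (operation 109).
def build_report_message(items: list) -> dict:
    message = {"grouped_results": [], "tabular_results": [], "images": []}
    for item in items:
        if item["attribute"] == "group":    # operation 107
            message["grouped_results"].append(item["data"])
        elif item["attribute"] == "table":  # operation 106
            message["tabular_results"].append(item["data"])
        elif item["attribute"] == "image":  # operation 104
            message["images"].append(item["data"])
    return message

report = build_report_message([
    {"attribute": "image", "data": "xray.png"},
    {"attribute": "group", "data": ["result1", "result2"]},
    {"attribute": "table", "data": [["sbp", 140], ["dbp", 60]]},
])
print(report)
```

Note that the attributes arrive in arbitrary order and may repeat, which is why the message accumulates lists rather than single values.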
FIG. 9 is a flow chart illustrating operations of a simulation training system according to an exemplary embodiment. In operation 111, data is received from a simulator device. The format of the data is determined based on the type of the simulator device in operation 112. Raw data is extracted according to the rules provided for the determined format in operation 120.
In operation 130, the type of the extracted information is determined. For example, if the extracted information is determined to be login information, in operation 140a, a predetermined login message is generated. If the extracted information is determined to be content selection information, a predetermined content selection message is generated in operation 140b, and the extracted information is converted or edited and inserted into the message. If the extracted information is determined to be start/stop information, in operation 140c, a predetermined start/stop message is generated. If the extracted information is determined to be session data, in operation 140d, the type of session data is determined.
If the determined type of session data is statistics, in operation 145a, the statistics are converted and inserted into a generated statistics message. If the determined type of session data is an event, in operation 145b, the data is converted and inserted into a generated event message.
In operation 150, the generated message is transmitted to the analysis application. In operation 160, the system may receive a response from the analysis application, where the response may indicate whether the message was successfully received and/or parsed or whether an error was encountered.
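The response handling of operation 160 can be sketched as follows. The response shape (a status field plus an optional error description) is an illustrative assumption; the text only says the response indicates success, parse failure, or an error.

```python
# Sketch of operation 160: the device-side API inspects the analysis
# application's response. The response dictionary shape is an assumption.
def handle_response(response: dict) -> str:
    if response.get("status") == "ok":
        return "message received and parsed"
    return f"error: {response.get('error', 'unknown')}"

print(handle_response({"status": "ok"}))
print(handle_response({"status": "error", "error": "malformed message"}))
```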
Although the above exemplary embodiments are described in the context of the medical industry and training, these are provided by way of example only. The above exemplary embodiments are applicable to actual medical procedures and may be applied to other fields, such as security systems and so on.
An exemplary application program interface (API), such as the one depicted in FIG. 6, may be implemented on a computer-readable medium. The term “computer-readable medium” as used herein refers to any medium that participates in providing instructions to a processor for execution. A computer readable medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable medium would include the following: an electrical connection having two or more wires, a portable computer diskette such as a floppy disk or a flexible disk, magnetic tape or any other magnetic medium, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a memory card, any other memory chip or cartridge, an optical fiber, a portable compact disc read-only memory (CD-ROM), any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, any other medium from which a computer can read, or any suitable combination of the foregoing.
In the context of this document, a computer readable medium may be any tangible, non-transitory medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Another form is a signal medium, which may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. The signal medium may include coaxial cables, copper wire, and fiber optics, including the wires that comprise a data bus. The signal medium may be any medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire line, optical fiber cable, RF, etc. or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the exemplary embodiments may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, .NET or the like, and conventional procedural programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. The remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The computer-readable medium is just one example of a machine-readable medium, which may carry instructions for implementing any of the methods and/or techniques described herein. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks. Volatile media includes dynamic memory.
Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor such as a CPU for execution. For example, the instructions may initially be carried on a magnetic disk from a remote computer. Alternatively, a remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to a computer system can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal, and appropriate circuitry can place the data on the data bus. The bus carries the data to the volatile storage, from which the processor retrieves and executes the instructions. The instructions received by the volatile memory may optionally be stored on a persistent storage device either before or after execution by the processor. The instructions may also be downloaded into the computer platform via the Internet using a variety of network data communication protocols well known in the art.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various exemplary embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or two blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagram and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology as used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The description of the exemplary embodiments has been presented for purposes of illustration and description, but is not intended to be exhaustive or limiting in any form. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. Embodiments were chosen and described in order to explain operations and the practical application, and to enable others of ordinary skill in the art to understand various embodiments with various modifications as are suited to the particular use contemplated. That is, various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles and specific examples defined herein may be applied to other embodiments without the use of inventive faculty. For example, some or all of the features of the different embodiments discussed above may be combined into a single embodiment. Conversely, some of the features of a single embodiment discussed above may be deleted from the embodiment. Therefore, the present invention is not intended to be limited to the embodiments described herein but is to be accorded the widest scope as defined by the limitations of the claims and equivalents thereof.