The present application claims priority from international application number PCT/CN2024/109055 filed on July 31, 2024, the entire contents of which are incorporated herein by reference.
Detailed Description of the Preferred Embodiments
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it will be apparent to one skilled in the art that the present description may be practiced without these specific details. In other instances, well-known methods, procedures, systems, components, and/or circuits have been described at a high-level in order to avoid unnecessarily obscuring aspects of the present description. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the disclosure. Thus, the present description is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used in the description is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used in this specification, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood that when an element, engine, module or block is referred to as being "on," "connected to" or "coupled to" another element, engine, module or block, it can be directly on or coupled to or in communication with the other element, engine, module or block, or intervening elements, engines, modules or blocks may be present unless the context clearly dictates otherwise. As used in this specification, the term "and/or" includes any and all combinations of at least one of the associated listed items.
The features and characteristics of the present description, as well as the operation and function of the related structural elements and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description. All of the accompanying drawings form a part of this specification. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the specification. It should be understood that the figures are not drawn to scale.
Fig. 1 is a block diagram of an exemplary healthcare system 100 shown according to some embodiments of the present description.
The healthcare system 100, which may also be referred to as a metahospital system, is built based on a variety of innovative technologies including metaverse technology, XR technology (e.g., AR technology, VR technology, MR technology, etc.), AI technology, digital twin technology, IoT technology, data circulation technology (e.g., blockchain technology, data privacy computing technology, etc.), spatial computing technology, image rendering technology, etc.
As shown in fig. 1, the healthcare system 100 may include a physical hospital 110, a virtual hospital 130, a user space application 120, and a hospital support platform 140. In some embodiments, the hospital support platform 140 may map data related to the physical hospital 110 into a virtual hospital 130 corresponding to the physical hospital 110 and provide user services to related users of the physical hospital 110 through the user space application 120. For example, at least a portion of the user services may be provided to the relevant users based on interactions between the relevant users and the virtual hospital 130.
The physical hospital 110 refers to a hospital that exists in the physical world and has tangible properties (e.g., has properties that are measurable in mass, volume, shape, etc., and can be perceived by human senses or instruments). In this specification, health care institutions providing medical, surgical and psychiatric care and treatment for human beings are collectively referred to as hospitals.
As shown in fig. 1, the physical hospital 110 may include a plurality of physical entities. For example, the plurality of physical entities may include departments, users, hardware devices, user services, public areas, medical service procedures, and the like, or any combination thereof.
A department refers to a specialized unit that provides a particular type of medical care, treatment, and service. Each department may be focused on a particular medical field and may be equipped with healthcare professionals having expertise in that field. For example, the departments may include an outpatient department, an inpatient department, a surgical department, a support department (e.g., a registration department, a pharmacy department), etc., or any combination thereof, according to a healthcare procedure corresponding to each department. As another example, depending on the medical field corresponding to each department, the departments may include internal medicine, surgery, specialty medicine, child care, etc., or any combination thereof.
The user may include any user associated with the physical hospital 110 (also referred to as a related user of the physical hospital 110). For example, the user may include a patient (or a portion of a patient (e.g., an organ)), a doctor, a visitor of a patient, a hospital staff member of the physical hospital 110, a supplier of the physical hospital 110, an application developer of the physical hospital 110, or the like, or any combination thereof. Hospital staff of the physical hospital 110 may include healthcare providers (e.g., doctors, nurses, technicians, etc.), hospital administrators, support staff, or the like, or any combination thereof. Exemplary hospital administrators may include department care administrators, clinical administrators, department directors, hospital administrative staff, functional management staff, or the like, or any combination thereof.
The hardware devices may include hardware devices located in the physical hospital 110 and/or hardware devices in communication with hardware devices in the physical hospital 110. Exemplary hardware devices may include terminal devices, healthcare devices, sensing devices, base devices, etc., or any combination thereof.
The terminal device may comprise a terminal device that interacts with a user of the healthcare system 100. For example, the terminal devices may include terminal devices that interact with the patient (also referred to as patient terminals), terminal devices that interact with the patient's doctor (also referred to as doctor terminals), terminal devices that interact with the nurse (also referred to as nurse terminals), terminal devices that interact with a remote care seeker (also referred to as remote terminal devices), or public terminals of the hospital (e.g., office terminals, bedside terminal devices, terminal devices in waiting areas, intelligent surgical terminals), etc., or any combination thereof. In this specification, unless clearly obtained from the context or otherwise indicated by the context, the terminal devices owned by the user and the terminal devices provided to the user by the physical hospital 110 are collectively referred to as the user's terminal devices or the terminal devices interacting with the user.
The terminal device may include a mobile terminal, an XR device, an intelligent wearable device, etc. The mobile terminal may include a smart phone, a personal digital assistant (PDA), a display, a gaming device, a navigation device, a point-of-sale (POS) terminal, a tablet computer, etc., or any combination thereof.
The XR device may comprise a device that allows a user to participate in an augmented reality experience. For example, the XR device may include VR components, AR components, MR components, and the like, or any combination thereof. In some embodiments, the XR device may include an XR helmet, XR glasses, an XR patch, a stereo headset, or the like, or any combination thereof. For example, the XR device may include Google Glass™, Oculus Rift™, Gear VR™, Apple Vision Pro™, and the like. In some embodiments, the XR device may include a display component on which virtual content may be presented and/or displayed. In some embodiments, the XR device may further comprise an input component. The input component can enable user interaction between a user and virtual content (e.g., a virtual surgical environment) displayed by the display component. For example, the input component may include a touch sensor, microphone, image sensor, etc. configured to receive user input that may be provided to the XR device and used to control the virtual world by changing visual content presented on the display component. The input components may include handles, gloves, styluses, consoles, and the like.
The intelligent wearable device may include an intelligent wristband, intelligent footwear, intelligent glasses, intelligent helmet, intelligent watch, intelligent garment, intelligent backpack, intelligent accessory, etc., or any combination thereof. In some embodiments, the smart wearable device may acquire physiological data of the user (e.g., heart rate, blood pressure, body temperature, etc.).
The healthcare device may be configured to provide healthcare to the patient. For example, the healthcare device may include an examination device, a care device, a treatment device, etc., or any combination thereof.
The examination apparatus may be configured to provide examination services to a patient, e.g., to collect examination data of the patient. Exemplary examination data may include heart rate, respiratory rate, body temperature, blood pressure, medical imaging data, body fluid test reports (e.g., blood test reports), and the like, or any combination thereof. Accordingly, the examination device may include a vital sign monitor (e.g., a blood pressure monitor, a blood glucose meter, a heart rate meter, a thermometer, a digital stethoscope, etc.), a medical imaging device (e.g., a computed tomography (CT) device, a digital subtraction angiography (DSA) device, a magnetic resonance (MR) device, etc.), a laboratory device (e.g., a blood routine examination device, etc.), etc., or any combination thereof.
The care device may be configured to provide care services to the patient and/or assist the healthcare provider in providing care services. Exemplary care devices may include hospital beds, patient care robots, intelligent care carts, intelligent kits, intelligent wheelchairs, and the like.
The treatment device may be configured to provide treatment services to the patient and/or assist the medical service provider in providing treatment services. Exemplary treatment devices may include surgical devices, radiation treatment devices, physical treatment devices, and the like, or any combination thereof.
The sensing device may be configured to collect sensing information related to the environment in which it is located. For example, the sensing device may include an image sensor, a sound sensor, or the like. The image sensor may be configured to collect image data in the physical hospital 110 and the sound sensor may be configured to collect voice signals in the physical hospital 110. In some embodiments, the sensing device may be a stand-alone device or may be integrated into another device. For example, the sound sensor may be part of a medical service device.
The base device may be configured to support data transmission, storage, and processing. For example, the base device may include a network, a machine room facility, a computing device, a computing chip, a storage device, and the like.
In some embodiments, at least a portion of the hardware devices of the physical hospital 110 are internet of things (IoT) devices. An internet of things device refers to a device with sensors, processing power, software and other technologies that connect and exchange data with other devices and systems through the internet or other communication networks. For example, one or more healthcare devices and/or sensing devices of the physical hospital 110 are internet of things devices and are configured to transmit collected data to the hospital support platform 140 for storage and/or processing.
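The IoT data flow described above can be sketched as follows. This is a minimal illustration only; the class names, device identifiers, and payload fields are assumptions for the sketch and are not part of this disclosure.

```python
import json
import time

# Illustrative sketch of an IoT-enabled sensing device reporting collected
# data to the hospital support platform for storage and/or processing.
class IoTSensingDevice:
    def __init__(self, device_id, device_type):
        self.device_id = device_id
        self.device_type = device_type

    def build_payload(self, reading):
        # Package a reading with metadata for upstream storage/processing.
        return json.dumps({
            "device_id": self.device_id,
            "device_type": self.device_type,
            "timestamp": time.time(),
            "reading": reading,
        })

class HospitalSupportPlatform:
    def __init__(self):
        self.store = []  # stands in for the platform's storage device

    def ingest(self, payload):
        # Decode and persist a device payload.
        record = json.loads(payload)
        self.store.append(record)
        return record

platform = HospitalSupportPlatform()
monitor = IoTSensingDevice("vitals-001", "heart_rate_monitor")
record = platform.ingest(monitor.build_payload({"bpm": 72}))
```

In a deployed system the payload would travel over a communication network rather than an in-process call; the sketch only shows the data-shaping step.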
The user services may include any service provided by the hospital support platform 140 to the user. For example, user services include medical services provided to patients and/or accompanying persons, support services provided to staff members of physical hospital 110 and/or suppliers of physical hospital 110, and the like. In some embodiments, user services may be provided to patients, doctors, and hospital administrators through the user space application 120, which will be described in detail in the following description.
The public area refers to a shared space accessible to users (or portions of users) in the physical hospital 110. For example, the common area may include a reception area (e.g., a foreground), a waiting area, a hallway, etc., or any combination thereof.
A healthcare procedure is a procedure through which a corresponding healthcare service is provided to a patient. A healthcare procedure typically includes several stages and/or steps through which a user may need to pass to obtain the corresponding healthcare service. Exemplary healthcare procedures may include outpatient procedures, hospitalization procedures, surgical procedures, or the like, or any combination thereof. In some embodiments, the healthcare procedures may include corresponding healthcare procedures for different departments, different diseases, and the like. In some embodiments, a preset data acquisition protocol may be set to specify the standard stages involved in a healthcare procedure and how to acquire data related to the healthcare procedure. For further description of healthcare procedures, see other relevant descriptions of the present specification, for example, fig. 5 and its related description.
The user space application 120 provides the user with access to user services provided by the hospital support platform 140. The user space application 120 may be an application, plug-in, website, applet, or any other suitable form. For example, the user space application 120 is an application installed on a user terminal device that includes a user interface for a user to initiate requests and receive corresponding services.
In some embodiments, user space application 120 may include different applications corresponding to different types of users. For example, the user space application 120 includes a patient space application corresponding to a patient, a medical space application corresponding to a doctor, a management space application corresponding to an administrator, and the like, or any combination thereof. User services provided through the patient space application, the medical space application, and the management space application are also referred to as patient space services, medical space services, and management space services, respectively. Exemplary patient space services include registration services, route guidance services, pre-consultation services, remote consultation services, hospitalization services, discharge services, and the like. Exemplary medical space services include scheduling services, surgical planning services, surgical simulation services, patient management services, remote ward services, remote outpatient services, and the like. Exemplary management space services include monitoring services, medical service assessment services, device parameter setting services, service parameter setting services, resource scheduling services, and the like.
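The correspondence between user types and their service catalogs can be sketched as a simple lookup; the service names follow the examples above, while the function and mapping names are illustrative assumptions.

```python
# Illustrative mapping from user type to the service catalog exposed by
# the corresponding user space application.
SERVICE_CATALOG = {
    "patient": ["registration", "route_guidance", "pre_consultation",
                "remote_consultation", "hospitalization", "discharge"],
    "doctor": ["scheduling", "surgical_planning", "surgical_simulation",
               "patient_management", "remote_ward", "remote_outpatient"],
    "administrator": ["monitoring", "medical_service_assessment",
                      "device_parameter_setting", "service_parameter_setting",
                      "resource_scheduling"],
}

def available_services(user_type):
    # Unknown user types receive no services rather than raising an error.
    return SERVICE_CATALOG.get(user_type, [])
```

A unified user space application (described below) could route each logged-in account through such a lookup to decide which services to expose.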
In some embodiments, the patient space application, the medical space application, and the management space application may be integrated into one user space application 120, and the user space application 120 may be configured to provide access to each type of user (e.g., patient, healthcare provider, manager, etc.). By way of example only, a particular user may have a corresponding account number that may be used to log into a user space application, view corresponding diagnostic data, and obtain corresponding user services.
According to some embodiments of the present description, by providing user space applications for different types of users, each type of user can easily obtain the various user services he or she needs through the corresponding user space application. In addition, users currently often need to install various applications to obtain different user services, which results in poor user experience and high development costs. Therefore, the user space application of the present specification can improve user experience, improve service quality and efficiency, enhance service security, and reduce development and operation costs.
In some embodiments, the user space application 120 may be configured to provide access portals for relevant users of the physical hospital 110 to interact with the virtual hospital 130. For example, through the user space application 120, a user may enter instructions for retrieving digital content of the virtual hospital 130 (e.g., digital twin models of hardware devices, patient organs, public areas), view the digital content, and interact with the digital content. As another example, through the user space application 120, a user may communicate with an avatar representing an agent. In some embodiments, a public terminal of a hospital may have a management space application installed, and an administrator account of the department to which the public terminal corresponds may be logged into the management space application. The user may receive user services through the management space application installed on the public terminal.
The virtual hospital 130 is a digital twin (i.e., virtual representation or virtual copy) of the physical hospital 110 for simulating, analyzing, predicting, and optimizing the operating state of the physical hospital 110. For example, the virtual hospital 130 may be a real-time digital copy of the physical hospital 110.
In some embodiments, the virtual hospital 130 may be presented to the user using digital technology. For example, when the relevant user interacts with the virtual hospital 130, at least a portion of the virtual hospital 130 may be presented to the relevant user using XR technology. For example only, MR technology may be used to superimpose at least a portion of the virtual hospital 130 on the real-world view of the relevant user.
In some embodiments, the virtual hospital 130 may include a digital twin of a physical entity associated with the physical hospital 110. A digital twin refers to a virtual representation (e.g., a virtual copy, mapping, or digital simulation) of a physical entity. The digital twin can reflect and predict the state, behavior, and performance of the physical entity in real time. For example, the virtual hospital 130 may include digital twins of at least a portion of the medical services, departments, users, hardware devices, user services, public areas, medical service procedures, and the like of the physical hospital 110. The digital twin of a physical entity can take a variety of forms including models, images, graphics, text, numerical values, and the like. For example, a digital twin may be a virtual hospital corresponding to a physical hospital, virtual personnel (e.g., virtual doctors, virtual nurses, and virtual patients) corresponding to personnel entities (e.g., doctors, nurses, and patients), virtual devices (e.g., virtual imaging devices and virtual scalpels) corresponding to medical service devices (e.g., imaging devices and scalpels), and the like.
In some embodiments, the digital twins may include one or more first digital twins and/or one or more second digital twins. The state of each first digital twin may be updated based on an update of the state of the corresponding physical entity. For example, one or more first digital twins may be updated during the mapping of data associated with the physical hospital 110 to the virtual hospital 130. One or more second digital twins can be updated by at least one of the user space applications 120, and the update of each second digital twins can result in a status update of the corresponding physical entity. The first digital twin may be updated accordingly when the corresponding physical entity changes its state, and the state of the corresponding physical entity changes when the second digital twin is updated. For example, the one or more first digital twins may include digital twins of a public area, a medical service, a user, a hardware device, etc., and the one or more second digital twins may include digital twins of a hardware device, a user service, a medical service procedure, etc. It should be appreciated that the digital twins may be either the first digital twins or the second digital twins.
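The two synchronization directions described above can be sketched as follows: a first digital twin mirrors state changes of its physical entity, while an update applied to a second digital twin drives a state change of the physical entity. All class, method, and entity names here are illustrative assumptions.

```python
# Minimal sketch of the first-twin / second-twin update directions.
class PhysicalEntity:
    def __init__(self, name, state):
        self.name = name
        self.state = state

class DigitalTwin:
    def __init__(self, entity):
        self.entity = entity
        self.state = dict(entity.state)

    def sync_from_entity(self):
        # First-twin direction: a physical change propagates to the twin.
        self.state = dict(self.entity.state)

    def apply_update(self, update):
        # Second-twin direction: a user-space update propagates back to
        # the physical entity (e.g., via a device control command).
        self.state.update(update)
        self.entity.state.update(update)

bed = PhysicalEntity("hospital_bed_12", {"inclination_deg": 0})
twin = DigitalTwin(bed)
twin.apply_update({"inclination_deg": 30})   # second-twin direction
bed.state["occupied"] = True                 # change in the physical world
twin.sync_from_entity()                      # first-twin direction
```

As the text notes, one twin may play both roles; in this sketch the same object supports both directions.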
According to some embodiments of the present description, the physical hospital 110 (including hardware devices, users, user services, healthcare procedures, etc.) may be simulated and tested in a secure and controllable environment by generating a virtual hospital 130 that includes digital twins of physical entities associated with the physical hospital 110. By virtual-reality linkage (e.g., real-time interaction between the physical hospital 110 and the virtual hospital 130), various medical scenarios can be more accurately predicted and responded to, thereby improving the quality and efficiency of medical services. In addition, the application of XR technology and virtual-reality integration technology makes the interactions of related users more natural and intuitive, and provides a more comfortable and efficient medical environment, thereby improving user experience.
In some embodiments, the virtual hospital 130 may further include agents that implement self-evolution based on data related to the physical hospital 110 and AI technology.
An agent refers to an entity that acts in an intelligent manner. For example, an agent may include a computing/software entity that can autonomously learn and evolve, and sense and analyze data to perform specific tasks and/or achieve specific goals (e.g., healthcare procedures). Through AI techniques (e.g., reinforcement learning, deep learning, etc.), an agent can constantly learn and self-optimize in interactions with the environment. In addition, the agent can collect and analyze mass data (e.g., related data of the physical hospital 110) through big data technology, mine patterns and learn rules from the data, and optimize decision flows, thereby identifying environmental changes in uncertain or dynamic environments, responding quickly, and making reasonable judgments. For example, agents may learn and evolve autonomously based on AI technology to accommodate changes in the physical hospital 110. By way of example only, agents may be built based on NLP technology (e.g., large language models, etc.) and may automatically learn and autonomously update through large amounts of language text (e.g., hospital business data and patient feedback information) to improve the quality of user service provided by the physical hospital 110.
In some embodiments, the agents may include different types of agents corresponding to different healthcare procedures, different user services, different departments, different diseases, different hospital positions (e.g., nurses, doctors, technicians, etc.), different stages of healthcare procedures, and the like. A particular type of agent is used to process tasks corresponding to the particular type. In some embodiments, one agent may correspond to different healthcare procedures (or different healthcare services, or different departments, or different diseases, or different hospital positions). In some embodiments, an agent may operate with reference to basic configuration data (e.g., dictionaries, knowledge graphs, templates, etc.) of the department and/or disease corresponding to the agent. In some embodiments, multiple agents may cooperate and share information through network communications to collectively accomplish complex tasks.
In some embodiments, a configuration of the agent may be set. For example, basic configuration data for use by the agent in operation may be set. The basic configuration data may include dictionaries, knowledge databases, templates, etc. As another example, usage rights of the agent may be set for different users. In some embodiments, an administrator of the physical hospital 110 may set the configuration of the agent through a management space application.
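An agent configuration like the one described above can be sketched as a record holding the basic configuration data plus per-user-type usage rights. The field names and the `may_use` helper are illustrative assumptions, not part of this disclosure.

```python
from dataclasses import dataclass, field

# Illustrative configuration record for an agent: basic configuration
# data (dictionaries, knowledge databases, templates) and usage rights.
@dataclass
class AgentConfig:
    department: str
    dictionaries: list = field(default_factory=list)
    knowledge_bases: list = field(default_factory=list)
    templates: list = field(default_factory=list)
    usage_rights: dict = field(default_factory=dict)  # user type -> allowed

    def may_use(self, user_type):
        # Default-deny: user types without an explicit grant are refused.
        return self.usage_rights.get(user_type, False)

config = AgentConfig(
    department="cardiology",
    dictionaries=["cardiology_terms"],
    usage_rights={"doctor": True, "patient": False},
)
```

An administrator-facing management space application would edit such a record; the agent would then consult it at runtime before serving a request.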
In some embodiments, the agent may be integrated into or deployed on a hardware device. For example, agents corresponding to hospitalization services may be integrated into a hospital bed or a presentation device of a hospital bed. In some embodiments, the agent may be integrated into or deployed on an embodied intelligent robot. An embodied intelligent robot refers to a robotic system that combines physical presence (embodiment) with intelligent behavior (cognition). The embodied intelligent robot may be configured to interact with the real world in a manner that mimics or complements human capabilities, utilizing physical morphology and cognitive functions to perform tasks, make decisions, and adapt to the environment. By utilizing artificial intelligence and sensor technology, the embodied intelligent robot can operate autonomously, interact with its environment, and continuously improve its performance. For example, the embodied intelligent robot may be configured as an agent corresponding to a surgical service and assist a doctor in performing a surgery.
In some embodiments, at least a portion of the user services may be provided based on the agent. For example, at least a portion of the user services may be provided to the relevant users based on the processing results, wherein the processing results are generated by at least one of the agents based on data related to the physical hospital 110. For example only, the data related to the physical hospital 110 may include data related to a healthcare procedure of the physical hospital 110, the agent may include an agent corresponding to the healthcare procedure, and the user service may be provided to an associated user of the healthcare procedure by using the agent processing data corresponding to the healthcare procedure.
The hospital support platform 140 may be configured to provide technical support to the healthcare system 100. For example, the hospital support platform 140 may include computing hardware and software to support innovative technologies including XR technology, AI technology, digital twinning technology, data flow technology, and the like. In some embodiments, the hospital support platform 140 may include at least a storage device for data storage and a processing device for data computation.
In some embodiments, the hospital support platform 140 may support interactions between the physical hospital 110 and the virtual hospital 130. For example, the processing device of the hospital support platform 140 may obtain data related to the physical hospital 110 from the hardware devices and map the data related to the physical hospital 110 into the virtual hospital 130. For example, the processing device of the hospital support platform 140 may update a portion of the digital twins (e.g., one or more first digital twins) in the virtual hospital 130 based on the obtained data such that each such digital twin in the virtual hospital 130 may reflect the updated status of the corresponding physical entity in the physical hospital 110. Based on digital twins that are continuously updated with their corresponding physical entities, a user can learn the state of a physical entity related to the physical hospital 110 in real time, thereby enabling monitoring and evaluation of the physical entity. As another example, an agent corresponding to data related to the physical hospital 110 may be trained and/or updated based on that data so as to self-evolve and self-learn.
In some embodiments, the hospital support platform 140 may support and/or provide user services to the relevant users of the physical hospital 110. For example, in response to receiving a user service request from a user, the processing device of the hospital support platform 140 may provide a user service corresponding to the service request. For another example, in response to detecting a need to provide a user service to a user, the processing device of the hospital support platform 140 may control a physical entity or virtual entity corresponding to the user service to provide the user service. For example, in response to detecting that a patient is being admitted to a hospital ward, the processing device of the hospital support platform 140 may control the intelligent care cart to direct a nurse to the hospital ward for an admission check of the patient.
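The event-driven pattern in the last example above can be sketched as follows: the platform reacts to a detected event by dispatching a physical or virtual entity. The event bus, event names, and handler are illustrative assumptions for the sketch.

```python
# Minimal sketch of event-driven user-service provision: a detected need
# (e.g., a patient admitted to a ward) triggers a control action
# (e.g., directing the intelligent care cart).
class EventBus:
    def __init__(self):
        self.handlers = {}

    def on(self, event_name, handler):
        # Register a handler for a named event.
        self.handlers.setdefault(event_name, []).append(handler)

    def emit(self, event_name, **kwargs):
        # Invoke all handlers registered for the event.
        return [h(**kwargs) for h in self.handlers.get(event_name, [])]

bus = EventBus()

def dispatch_care_cart(ward, patient_id):
    # Stand-in for controlling the intelligent care cart.
    return f"care cart routed to {ward} for admission check of {patient_id}"

bus.on("patient_admitted", dispatch_care_cart)
results = bus.emit("patient_admitted", ward="ward-3", patient_id="P-1024")
```

The same bus could equally carry explicit user service requests (the first example above), with the request handler resolving which physical or virtual entity should fulfill the service.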
In some embodiments, at least a portion of the user services may be provided to the relevant users based on interactions between the relevant users and the virtual hospital 130. Interaction refers to mutual communication or influence (e.g., conversations, behaviors, etc.) between the relevant user and the virtual hospital 130. For example, interactions between the relevant user and the virtual hospital 130 may include interactions between the relevant user and a digital twin in the virtual hospital 130, interactions between the relevant user and an agent, interactions between the relevant user and a virtual character, and the like, or any combination thereof.
In some embodiments, at least a portion of the user services may be provided to the associated user based on interactions between the associated user and at least one of the digital twins. For example, an update instruction of the second digital twin input by the relevant user may be received by the user space application 120, and the corresponding physical entity of the second digital twin may be updated according to the update instruction. As another example, a user may view a first digital twin of a physical entity (e.g., a 3D digital twin model of a patient organ or hardware device) through the user space application 120 to learn about the state of the physical entity. Alternatively, the user may change the display angle, display size, etc. of the digital twin. For further description of digital twins with respect to physical entities, reference may be made to other relevant descriptions of the present specification, for example, fig. 25 and its associated description.
In some embodiments, the processing device of the hospital support platform 140 may present virtual characters corresponding to the agents through the user space application, interact with the associated user, and provide at least a portion of the user services to the associated user based on the interactions between the associated user and the virtual characters.
In some embodiments, the hospital support platform 140 may have a five-layer structure including a hardware device layer, an interface layer, a data processing layer, an application development layer, and a service layer. In some embodiments, the hardware devices of the physical hospital 110 may be part of the hospital support platform 140. Further description regarding hospital support platforms may be found in other related descriptions of this specification, such as, for example, fig. 3 and its related description.
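The five-layer structure named above can be expressed as an ordered enumeration; the layer names follow the text, while the enum and its ordering convention (hardware lowest, service highest) are assumptions for the sketch.

```python
from enum import IntEnum

# Illustrative enumeration of the five-layer hospital support platform
# structure: hardware device, interface, data processing, application
# development, and service layers.
class PlatformLayer(IntEnum):
    HARDWARE_DEVICE = 1
    INTERFACE = 2
    DATA_PROCESSING = 3
    APPLICATION_DEVELOPMENT = 4
    SERVICE = 5

# A request conceptually traverses the layers from hardware up to service.
pipeline = [layer.name for layer in sorted(PlatformLayer)]
```

Such an enumeration makes layer ordering explicit when, for example, data collected at the hardware device layer is routed upward through the interface and data processing layers before reaching an application.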
According to some embodiments of the present application, a virtual hospital corresponding to a physical hospital may be established by integrating various internal and external resources (e.g., medical service equipment, hospital personnel, medicines and consumables, etc.) of the physical hospital. The virtual hospital may reflect the real-time status (e.g., changes, updates, etc.) of physical entities associated with the physical hospital, thereby enabling monitoring and assessment of the physical entities. Such integration may provide accurate data support for the operation of medical services and intelligent decision-making. In addition, through the virtual hospital, users related to medical services can jointly establish an open, shared ecosystem, thereby promoting innovation and advancement of medical services.
In addition, intra- and extra-hospital linkage makes it possible to provide medical care services covering the patient's whole life cycle. The perspective of medical services extends from mere disease treatment to the entire life cycle of a patient, including prevention, diagnosis, treatment, rehabilitation, health management, and the like. By establishing intra- and extra-hospital linkage, the physical hospital can better integrate online and offline resources and provide comprehensive, continuous medical and health services for patients. For example, through remote monitoring and online consultation, the health condition of the patient can be followed in real time, the treatment scheme can be adjusted in a timely manner, and the treatment effect can be improved.
Fig. 2 is a schematic diagram of an exemplary healthcare system 200 shown according to some embodiments of the present description.
As shown in fig. 2, the healthcare system 200 may include a processing device 210, a network 220, a storage device 230, one or more healthcare devices 240, one or more perception devices 250, one or more patient terminals 260 of a patient 261, and one or more doctor terminals 270 of a doctor 271 associated with the patient 261. In some embodiments, components in the healthcare system 200 may be interconnected and/or communicate by a wireless connection, a wired connection, or a combination thereof. The connections between the components of the healthcare system 200 may be variable. For example only, the healthcare device 240 may be connected to the processing device 210 through the network 220 or directly. As another example, storage device 230 may be connected to processing device 210 through network 220 or directly.
The processing device 210 may process data and/or information obtained from the storage device 230, the healthcare device 240, the sensing device 250, the patient terminal 260, and/or the doctor terminal 270. For example, the processing device 210 may map data related to a physical hospital to a virtual hospital corresponding to the physical hospital. The data of different patients participating in the same type of medical service flow are collected according to the same preset data collection protocol; the preset data collection protocol corresponds to the medical service flow and includes data collection protocols corresponding to a plurality of standard links in the medical service flow. The processing device 210 may process data related to the physical hospital and provide user services to the patient 261 and the doctor 271 via the patient terminal 260 and/or the doctor terminal 270, respectively. As another example, the processing device 210 may maintain a digital smart object and provide user services to the patient 261 and the doctor 271 through the patient terminal 260 and/or the doctor terminal 270, respectively, by engaging the digital smart object in processing data related to the physical hospital.
In some embodiments, the processing device 210 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 210 may be located locally or remotely from the healthcare system 200. In some embodiments, the processing device 210 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or a combination thereof. In some embodiments, the processing device 210 may be implemented by a computing device.
In some embodiments, processing device 210 may include one or more processors (e.g., single-core processors or multi-core processors). For illustration only, only one processing device 210 is depicted in the healthcare system 200. It should be noted, however, that the healthcare system 200 in this specification may also include a plurality of processing devices. Thus, as described herein, operations and/or method steps performed by one processing device 210 may also be performed by multiple processing devices jointly or separately. For example, if in this specification the processing device 210 of the healthcare system 200 performs both procedure A and procedure B, it should be understood that procedure A and procedure B may also be performed by two or more different processing devices in the healthcare system 200 jointly or separately (e.g., a first processing device performing procedure A and a second processing device performing procedure B, or the first and second processing devices jointly performing procedures A and B).
The network 220 may include any suitable network capable of facilitating the exchange of information and/or data by the healthcare system 200. The network 220 may be or include a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a Bluetooth™ network, a Near Field Communication (NFC) network, or the like, or any combination thereof.
Storage device 230 may store data, instructions, and/or any other information. In some embodiments, the storage device 230 may store data obtained from other components of the healthcare system 200. In some embodiments, storage device 230 may store data and/or instructions that the processing device 210 may execute or use to perform the exemplary methods described in this specification.
In some embodiments, the data stored in the storage device 230 may include multi-modal data. Multimodal data may include data in various forms (e.g., images, graphics, video, text, etc.), data of various types, data obtained from different sources, data related to different medical services (e.g., diagnosis, surgery, rehabilitation, etc.), and/or data related to different users (e.g., patients, medical personnel, management personnel, etc.). For example, the data stored in the storage device 230 may include medical data of the patient 261 reflecting the health of the patient 261, such as an electronic medical record of the patient 261. An electronic medical record refers to an electronic file that records various types of patient data (e.g., basic information, examination data, imaging data), and may include, for example, a three-dimensional model of a plurality of organs and/or tissues of the patient 261.
In some embodiments, storage device 230 may include mass storage devices, removable storage devices, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, storage device 230 may include a data lake and a data warehouse. For more on the data lake and data warehouse see the relevant description of fig. 3.
The healthcare device 240 may be used to provide or assist in healthcare. As shown in fig. 2, the healthcare device 240 may include a clinic terminal 240-1, a hospital bed 240-2, a smart surgical terminal 240-3, a smart care cart 240-4, a smart wheelchair 240-5, etc., or any combination thereof.
The clinic terminal 240-1 is a terminal device configured within a consulting room for use by doctors and patients in a medical outpatient procedure. For example, the clinic terminal 240-1 may include one or more of a screen, a sound output component, an image sensor, or a sound sensor. A doctor interface may be displayed on the screen of the clinic terminal 240-1, and data may be displayed on the doctor interface to facilitate communication between the doctor and the patient. Exemplary data may include electronic medical records (or portions thereof), pre-consultation records, medical images, 3D organ models, examination results, consultation advice, and the like.
The hospital bed 240-2 refers to a bed that is capable of supporting inpatients in a hospital ward and providing user services to the patient. The hospital bed 240-2 may include a bed body, bedside terminal equipment, bedside inspection equipment, sensors, and the like, or any combination thereof. The bedside terminal device may include an XR device, a display device, a mobile device, etc., or any combination thereof. In some embodiments, the hospital bed 240-2 may be configured as an agent corresponding to a hospitalization service, in which case the hospital bed may also be referred to as a smart hospital bed or a meta-hospital bed.
The intelligent surgical terminal 240-3 refers to a device configured with an agent for assisting surgery. The intelligent surgical terminal 240-3 may sense interactions (e.g., conversations, behaviors, etc.) between the healthcare provider, the patient, and the agent and obtain data captured by the sensing device 250 to provide surgical assistance. In some embodiments, the intelligent surgical terminal 240-3 may be configured to perform a risk alert for a surgical procedure, generate a surgical record of a surgical procedure, etc., based on the agent configured therein.
The smart care cart 240-4 refers to a care cart having an automatic driving function, which can assist in patient treatment and care. For example, the smart care cart 240-4 may be configured to guide a nurse to a hospital ward for admission checks of patients. In some embodiments, the smart care cart may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, a care agent). In some embodiments, the smart care cart 240-4 may include a cart, a presentation device, one or more examination devices and/or care tools, a sensing device (e.g., an image sensor, a GPS sensor, a sound sensor, etc.), and so on. In some embodiments, the smart care cart 240-4 may be configured to obtain relevant treatment and care information for the patient and generate physical examination data, care data, and the like. The physical examination data may include vital sign data of the patient. The care data may include detailed records of care operations, such as the care time, care operator, care measure, patient response, and the like.
The smart wheelchair 240-5 refers to a transport device for intelligently transferring a patient. In some embodiments, the smart wheelchair 240-5 may be configured to perform autonomous navigation by integrating sensors and maps (e.g., using simultaneous localization and mapping (SLAM) techniques or pre-built environmental models), locate the patient using Radio Frequency Identification (RFID), Bluetooth, or Wi-Fi signals, and identify the patient by biometric techniques. In some embodiments, the smart wheelchair 240-5 may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, an agent corresponding to a surgical service). In some embodiments, the smart wheelchair 240-5 may be configured to generate data (e.g., a record of the interaction content between the agent and the patient) by sensing interaction data through built-in cameras/sensors.
The sensing device 250 may be configured to gather sensing information related to the environment in which it is located. In some embodiments, the sensing device 250 may comprise a sensing device in a physical hospital 110. For example, the sensing device 250 may include an image sensor 250-1, a sound sensor 250-2, a temperature sensor, a humidity sensor, and the like.
The patient terminal 260 may be a terminal device that interacts with the patient 261. In some embodiments, patient terminal 260 may include a mobile terminal 260-1, an XR device 260-2, a smart wearable device 260-3, and so forth. Doctor terminal 270 may be a terminal device that interacts with doctor 271. In some embodiments, the physician terminal 270 may include a mobile terminal 270-1, an XR device 270-2, or the like. In some embodiments, patient 261 may access a user space application (e.g., a patient space application) through patient terminal 260 and doctor 271 may access a user space application (e.g., a doctor space application) through doctor terminal 270. In some embodiments, patient 261 and doctor 271 may communicate with each other remotely through patient terminal 260 and doctor terminal 270, thereby providing remote medical services, such as remote outpatient services, remote ward services, remote follow-up services, and the like.
The sensing device 250, patient terminal 260, and doctor terminal 270 may be configured as data sources to provide information to the healthcare system 200. For example, the devices may transmit the collected data to the processing device 210, and the processing device 210 may provide user services based on the received data.
It should be noted that the above description of the healthcare systems 100 and 200 is intended to be illustrative, and not limiting of the scope of the present description. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other features of the example embodiments herein may be combined in various ways to obtain additional and/or alternative example embodiments. For example, the healthcare system 200 may include one or more additional components, such as other users' terminal devices, a hospital's public terminal devices, and the like. As another example, two or more components of the healthcare system 200 may be integrated into a single component.
Fig. 3 is a schematic diagram of an exemplary hospital support platform 300 shown according to some embodiments of the present description.
As shown in fig. 3, the hospital support platform 300 may include a hardware layer 310 (also referred to as a hardware module), an interface layer 320 (also referred to as an interface module), a data processing layer 330 (also referred to as a data processing module), an application development layer 340 (also referred to as an application development module), and a service layer 350 (also referred to as a service module). It should be understood that the "layers" and "modules" in this disclosure are used only for logically dividing the components of the hospital support platform and are not intended to be limiting.
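For illustration only, the five-layer structure described above may be sketched as follows. The layer names follow the text, while the `upward_path` function is a hypothetical simplification of how data collected at a lower layer crosses the layers above it, and is not part of the disclosed embodiments:

```python
# The five layers of the hospital support platform, from bottom to top.
LAYERS = [
    "hardware",                 # devices sensing/acting in the physical hospital
    "interface",                # hardware and software interfaces (data, control)
    "data_processing",          # data center plus XR/AI/digital-twin/data-flow units
    "application_development",  # open interfaces, dev kits, application market
    "service",                  # user space applications for relevant users
]

def upward_path(from_layer: str) -> list[str]:
    """Layers that data traverses from `from_layer` up to the service layer."""
    return LAYERS[LAYERS.index(from_layer):]

# Data obtained by the interface layer crosses three more layers on its
# way to the relevant user.
path = upward_path("interface")
```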
The hardware layer 310 may be configured to provide a hardware basis for interactions between the real world and the digital world, and may include one or more hardware devices related to hospital operations. Exemplary hardware devices may include healthcare devices, sensing devices, terminal devices, and base devices. For more description of hardware devices, see other relevant descriptions of this specification. See, for example, fig. 1 and its associated description.
The interface layer 320 may be connected with the hardware layer 310 and the data processing layer 330. The interface layer 320 may be configured to obtain data collected by hardware devices of the hardware layer 310 and send the data to the data processing layer 330 for storage and/or processing. The interface layer 320 may also be configured to control at least a portion of the hardware devices of the hardware layer 310. In some embodiments, interface layer 320 may include hardware interfaces and software interfaces (e.g., data interfaces, control interfaces).
The data processing layer 330 may be configured to store and/or process data. The data processing layer 330 may include a processing device on which a plurality of data processing units may be configured. The data processing layer 330 may be configured to obtain data from the interface layer 320 and process the data by at least one data processing unit to enable user services related to hospital services.
The data processing unit may include various preset algorithms for implementing data processing, which may take the form of software, programs, computer code, and/or instructions implemented in various computer programming languages (e.g., java, C/C++). In some embodiments, data processing layer 330 may include a processing device (e.g., processing device 210 in fig. 2). The data processing unit may be configured on the processing device. In some embodiments, the data processing unit may include an XR unit configured to process data using XR technology to implement XR services, an AI unit (e.g., an agent unit) configured to process data using AI technology to implement AI services, a digital twin unit configured to process data using digital twin technology to implement digital twin services, a data flow unit configured to process data using data flow technology (e.g., blockchain technology, data privacy computing technology) to implement data flow services, and so forth.
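For illustration only, the relationship between the data processing layer and its data processing units may be sketched as a unit registry. The unit kinds mirror the text (XR, AI, digital twin, data flow), but the registry and dispatch logic are assumed simplifications, not a disclosed implementation:

```python
from typing import Callable

# Registry mapping a service kind to the data processing unit handling it.
UNITS: dict[str, Callable[[dict], dict]] = {}

def register_unit(kind: str):
    def wrap(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        UNITS[kind] = fn
        return fn
    return wrap

@register_unit("ai")
def ai_unit(data: dict) -> dict:
    # e.g., an agent unit producing feedback from user input
    return {"service": "ai", "echo": data["payload"]}

@register_unit("digital_twin")
def digital_twin_unit(data: dict) -> dict:
    # e.g., mapping an entity state update onto its digital twin
    return {"service": "digital_twin", "state": data["payload"]}

def process(data: dict) -> dict:
    # The data processing layer routes each datum to the unit whose kind
    # matches the requested service.
    return UNITS[data["kind"]](data)

result = process({"kind": "ai", "payload": "hello"})
```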
In some embodiments, data processing layer 330 may also include a data center configured to store data. In some embodiments, the data center may employ a lake-warehouse integrated architecture, which may include data lakes and data warehouses. The data lake may be used to persist large amounts of data in a tamper-proof manner. The data warehouse may be used to store index data corresponding to data in the data lake. The data stored in the data lake may include native (or raw) data collected by the hardware device, derived data generated based on the native data, and the like. In some embodiments, the data in the data lake may be processed by a processing device (e.g., processing device 210 in fig. 2).
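For illustration only, the lake-warehouse integrated architecture may be sketched as follows: the lake persists records append-only (standing in for the tamper-proof property), while the warehouse keeps only index entries pointing back into the lake. The class and method names are hypothetical:

```python
class LakeWarehouse:
    def __init__(self) -> None:
        self._lake: list[dict] = []            # native and derived data, append-only
        self._warehouse: dict[str, int] = {}   # index: record key -> lake offset

    def ingest(self, key: str, record: dict) -> None:
        # Records are only ever appended to the lake, never edited in place.
        self._warehouse[key] = len(self._lake)
        self._lake.append(record)

    def lookup(self, key: str) -> dict:
        # Queries are answered through the warehouse's index into the lake.
        return self._lake[self._warehouse[key]]

store = LakeWarehouse()
store.ingest("exam-001", {"patient": "261", "type": "CT"})
record = store.lookup("exam-001")
```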
The application development layer 340 may be configured to support application development, publishing, subscribing, and the like. The application development layer 340 is also referred to as an ecological suite layer. In some embodiments, the application development layer 340 may be configured to provide an open interface for application developers to access or invoke at least a portion of the data processing units and to utilize at least a portion of the data processing units to develop applications. In some embodiments, as shown in fig. 3, the application development layer 340 may provide development kits, application markets, multi-tenant operation platforms, cloud officials, workspaces, and other support kits to assist developers in doing work.
The service layer 350 may be configured to enable relevant users of the hospital service to access user services related to the hospital service through the user space application. For further description of user services and user space applications, reference may be made to other relevant descriptions of this specification. See, for example, fig. 1 and its associated description.
The present specification provides a hospital support platform designed for integrated management of various resources within a hospital, including hardware resources, software resources, and data resources. In some embodiments, the platform further integrates data processing units capable of supporting advanced technologies, such as artificial intelligence, XR, digital twin, and blockchain technologies. These advanced technologies are used to improve the efficiency and quality of service in the healthcare industry. For example, artificial intelligence techniques enable autonomous evolution and continuous optimization of hospital operations, while XR and digital twin techniques facilitate the creation and maintenance of virtual hospitals. The virtual hospital can interact with the user, providing an immersive, novel service experience. In addition, the platform includes an application development layer for granting third-party developers in the healthcare industry access to these advanced technologies. This access fosters an open ecosystem that encourages the development and innovation of applications, thereby advancing medical services.
Fig. 4 is a block diagram of an exemplary processing device 210 shown in accordance with some embodiments of the present description. In some embodiments, processing device 210 may communicate with and execute instructions stored in a computer-readable storage medium (e.g., storage device 230 in fig. 2). Processing device 210 may include an acquisition module 410, a mapping module 420, and a service module 430.
The acquisition module 410 may be configured to acquire data related to a physical hospital. For more description of acquiring data related to a physical hospital, see other relevant description of the present specification. See, for example, fig. 1, 5 and their associated descriptions.
The mapping module 420 may be configured to map data related to a physical hospital to a virtual hospital corresponding to the physical hospital. For more description of mapping data related to physical hospitals, see other relevant descriptions of this specification. See, for example, fig. 1, 2 and their associated descriptions.
The service module 430 may be configured to provide user services to relevant users of the physical hospital through a user space application. For further description of user services, see other relevant descriptions of this specification. See, for example, fig. 1 and 5-7 and their associated descriptions.
In some embodiments, processing device 210 may include one or more other modules. For example, the processing device 210 may include a memory module for storing data generated by the module in the processing device 210. In some embodiments, any two modules may be combined into a single module, and any one module may be divided into two or more units. For example, the service module 430 may include a plurality of units configured to support and/or implement different user services.
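For illustration only, the three modules of the processing device 210 may be sketched as a pipeline, with the output of the acquisition module feeding the mapping module and the mapped data feeding the service module. The function names and data shapes are hypothetical simplifications:

```python
def acquire(source: dict) -> dict:
    # Acquisition module 410: obtain data related to the physical hospital.
    return {"raw": source}

def map_to_virtual(data: dict) -> dict:
    # Mapping module 420: project the acquired data onto the virtual hospital.
    return {"virtual_hospital": data["raw"]}

def serve(virtual: dict, user: str) -> str:
    # Service module 430: deliver a user service through a user space application.
    return f"service for {user}: {virtual['virtual_hospital']}"

message = serve(map_to_virtual(acquire({"bed": "240-2"})), "patient 261")
```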
Fig. 5 is a schematic diagram of an exemplary flow 500 of providing user services shown in accordance with some embodiments of the present description.
As shown in fig. 5, data 530 related to a healthcare procedure may be collected according to a preset data collection protocol 520 corresponding to the healthcare procedure, and user services 540 may be provided to related users of the healthcare procedure by processing the data 530.
A healthcare procedure is a procedure that provides a corresponding healthcare to a patient. Exemplary healthcare procedures may include outpatient procedures, hospitalization procedures, surgical procedures, or the like, or any combination thereof. In some embodiments, the healthcare procedure may include several standard links that a user needs to go through in order to obtain a corresponding healthcare. For example, the outpatient procedure may include standard links such as a registration link, a waiting link, a consultation link, and a post-consultation link.
In some embodiments, the healthcare procedures corresponding to different hospital departments of a hospital (e.g., physical hospital 110) may be different. For example, an outpatient procedure corresponding to an internal medicine may be different from an outpatient procedure corresponding to a surgery. In some embodiments, the healthcare procedures corresponding to different diseases may be different. For example, a healthcare procedure corresponding to a cold may be different from a healthcare procedure corresponding to a cancer.
The preset data collection protocol 520 specifies how to collect the data 530 related to the healthcare procedure. In some embodiments, the preset data acquisition protocol 520 may include data acquisition protocols corresponding to the standard links in the healthcare procedure. For example, the preset data acquisition protocol corresponding to the outpatient procedure may include data acquisition protocols corresponding to the registration link, the waiting link, the consultation link, and the post-consultation link in the outpatient procedure.
In some embodiments, a data collection protocol corresponding to a standard link of a healthcare procedure may specify at least one or more hardware devices that collect data related to the standard link. The one or more hardware devices may include an internet of things device, a sensing device, a user terminal of a user, a public terminal of a hospital, a medical services device, etc., or any combination thereof.
In some embodiments, the data acquisition protocol corresponding to the standard link may further specify data interface criteria and/or data quality criteria for the one or more hardware devices. A data interface standard refers to a set of guidelines and specifications that define how to exchange data between different systems, devices, or software applications. The data interface standard may include protocols, data formats, communication means, encoding and decoding rules, security standards, etc., or any combination thereof. For example only, the data interface criteria may include Health Level 7 (HL7), Digital Imaging and Communications in Medicine (DICOM), and the like, or any combination thereof. Data quality criteria refer to a set of criteria and guidelines intended to ensure that the data are accurate, consistent, reliable, and useful for their intended purpose. Exemplary data quality standards may include the data quality model ISO/IEC 25012, the data quality standard ISO 8000, the data standard HL7, the Basel Committee on Banking Supervision standard BCBS 239, etc., or any combination thereof.
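For illustration only, a preset data acquisition protocol with per-link hardware devices, interface standards, and quality standards may be encoded as follows. The field names and the example device/standard assignments are assumptions for the sketch, not a disclosed data model:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LinkProtocol:
    link: str                  # standard link, e.g. "registration"
    devices: tuple[str, ...]   # hardware devices that collect the link's data
    interface_standard: str    # e.g. "HL7" or "DICOM"
    quality_standard: str      # e.g. "ISO/IEC 25012" or "ISO 8000"

# One hypothetical protocol for the outpatient procedure, keyed by link.
OUTPATIENT_PROTOCOL = {
    p.link: p
    for p in (
        LinkProtocol("registration", ("public terminal",), "HL7", "ISO/IEC 25012"),
        LinkProtocol("waiting", ("sensing device",), "HL7", "ISO/IEC 25012"),
        LinkProtocol("consultation", ("clinic terminal",), "DICOM", "ISO 8000"),
        LinkProtocol("post-consultation", ("patient terminal",), "HL7", "ISO 8000"),
    )
}

def devices_for(link: str) -> tuple[str, ...]:
    """Hardware devices the protocol assigns to a standard link."""
    return OUTPATIENT_PROTOCOL[link].devices
```

Because every patient in the same type of procedure is collected under the same protocol object, the resulting data share a uniform per-link structure.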
For example only, as shown in fig. 5, the processing device 210 may determine the preset data acquisition protocol 520 corresponding to the healthcare procedure based on at least one of the hospital department 502, the disease 504, the standard link 506, the hardware device 508, the data interface standard 510, or the data quality standard 512 corresponding to the healthcare procedure. In some embodiments, the preset data acquisition protocol 520 may be set manually by a user. In some embodiments, the agent corresponding to the healthcare procedure may learn the preset data collection protocol 520 from historical data and a knowledge database.
In some embodiments, the processing device 210 may update the preset data collection protocol 520 corresponding to the healthcare procedure based on the healthcare policy 514 associated with the healthcare procedure. For example, if the healthcare policy 514 is updated to include new standard links or to delete some standard links, the processing device 210 may update the standard links in the preset data acquisition protocol 520 corresponding to the healthcare procedure.
The data 530 related to the healthcare procedure may include various types of data related to the healthcare procedure. For example, the data 530 related to the healthcare procedure may include patient location information, patient interaction information related to interactions between the patient and the patient terminal, examination data of the patient (e.g., examination report, medical image, vital sign), physician orders, perception information related to the environment in which the patient is located, patient clinical path, patient data entered by a healthcare provider (e.g., doctor, nurse) or the like, or any combination thereof.
In some embodiments, processing device 210 may obtain data 530 from a hardware device. For example, the hardware device may collect the data 530 according to a preset data collection protocol 520, and the processing device 210 may obtain the data 530 from the hardware device. In some embodiments, processing device 210 may obtain data 530 from a storage device (e.g., storage device 230) that stores data 530. For example, data 530 collected by a hardware device may be stored in a storage device, from which processing device 210 may retrieve data 530.
In addition, processing device 210 may provide user services 540 to relevant users of the healthcare procedure by processing data 530. For example, at least a portion of the data 530 may be processed by an agent corresponding to a healthcare procedure. For further description of agents, see other relevant descriptions of this specification. See, for example, fig. 1 and its associated description. For more description of processing data using agents, see other relevant descriptions of this specification. See, for example, fig. 6 and its associated description.
In some embodiments, the relevant users may include a first type of user and a second type of user, and the user space applications may include a first user space application corresponding to the first type of user and a second user space application corresponding to the second type of user. For example, the first type of user may be a patient and the second type of user may be a doctor or manager. Accordingly, the first user space application may be a patient space application and the second user space application may be a medical space application or a management space application.
In some embodiments, after obtaining interaction information between the first type of user and the first user space application, the processing device 210 may provide a first user service to the first type of user through the first user space application based on the interaction information, and provide a second user service to the second type of user through the second user space application based on the interaction information. For example, when a patient enters a hospital ward, the processing device 210 may provide hospitalization services to the patient through the patient space application and alert a nurse in the hospital ward about the patient through the management space application.
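For illustration only, the fan-out described above, in which one interaction from a first-type user drives services in both user space applications, may be sketched as follows. The handler names are hypothetical:

```python
def patient_service(event: dict) -> str:
    # First user service, delivered via the patient space application.
    return f"hospitalization service for patient {event['patient']}"

def staff_alert(event: dict) -> str:
    # Second user service, delivered via the management space application.
    return f"alert nurse in ward {event['ward']} about patient {event['patient']}"

def on_interaction(event: dict) -> tuple[str, str]:
    # A single piece of interaction information yields both the first and
    # the second user service.
    return patient_service(event), staff_alert(event)

first, second = on_interaction({"patient": "261", "ward": "3B"})
```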
According to some embodiments of the present description, data of different patients participating in the same type of healthcare procedure are acquired through the same preset data acquisition protocol corresponding to the healthcare procedure. The preset data collection protocol specifies, for each standard link, the hardware devices that collect the associated data, the data interface standard, the data quality standard, and so on. Compared to traditional hospital systems that collect data only for a specific scenario without regard to the entire healthcare procedure, the methods disclosed herein may collect more comprehensive, finer-grained, and more real-time healthcare procedure data. The collected data may then be used to evaluate various aspects of the healthcare process, thereby facilitating performance optimization and improving the overall efficiency of the healthcare system.
Fig. 6 is a schematic diagram of an exemplary flow 600 of providing user services shown in accordance with some embodiments of the present description.
As shown in fig. 6, after collecting data 610 associated with a physical hospital, an agent 620 may process the data 610 to provide user services 640. As described above, there are different types of agents 620. Depending on the type and content of the data 610, the data 610 may be processed using an agent 620 of a target type. For example, if the data 610 relates to one link in a patient's healthcare procedure, the data 610 is processed by an agent corresponding to that healthcare procedure or that link of the healthcare procedure. As another example, if the data 610 relates to a particular user service, the data 610 is processed by an agent corresponding to that user service.
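For illustration only, the routing of data 610 to a target-type agent 620 based on the data's type and content may be sketched as follows. The agent names and the fallback choice are assumptions for the sketch:

```python
# Hypothetical registry of agents by type.
AGENTS = {
    "registration": lambda d: f"registration agent handles {d['content']}",
    "surgery": lambda d: f"surgical agent handles {d['content']}",
    "outpatient": lambda d: f"outpatient-procedure agent handles {d['content']}",
}

def route(data: dict) -> str:
    # Prefer the agent matching the link in the data, then the one matching
    # a particular user service; otherwise fall back to the agent for the
    # healthcare procedure as a whole.
    kind = data.get("link") or data.get("service") or "outpatient"
    return AGENTS[kind](data)

reply = route({"link": "registration", "content": "new patient"})
```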
In some embodiments, the processing device 210 may present the virtual character 634 corresponding to the agent 620 through a user space application 630 (e.g., a patient space application), and may provide the user service 640 based on interactions between a relevant user of the healthcare procedure (e.g., a patient, hospital staff (e.g., a doctor, a nurse, a hospital administrator)) and the virtual character 634. In some embodiments, a relevant user may interact with the virtual character 634 through a user terminal (e.g., the XR device 632 shown in fig. 6). For example, a user space application on a user terminal may present the virtual character to the relevant user via XR technology, and the relevant user may interact with the virtual character via XR technology.
A virtual character refers to a computer-generated person or entity designed to interact with a relevant user in a digital environment. The virtual character 634 may be configured to interact with the relevant user to provide the user services 640. For example, the virtual character 634 may communicate with a user by simulating human speech expressions, gestures, etc., providing a realistic communication experience for the user. During communication, the user may express his/her needs or requests using natural language in the form of speech, text, gestures, etc., and the agent 620, which may be constructed based on natural language processing algorithms, may understand and analyze the user's input, determine feedback information, and communicate the feedback information to the user through the virtual character 634.
In some embodiments, the virtual character 634 may be a digital character having certain appearance features, sound features, and the like. For example, the processing device 210 may determine the appearance features and/or sound features of the virtual character 634 based on patient data of the patient. As another example, the processing device 210 may determine the appearance features and/or sound features of the virtual character 634 from data of the patient's doctor or nurse, such that the virtual character 634 may simulate the doctor or nurse.
Exemplary user services 640 provided to the patient based on interactions with the virtual character 634 include a registration service, a pre-consultation service, an admission inquiry service, a route guidance service, a discharge service, a follow-up service, and the like, or any combination thereof. Different ones of these services may be provided by different agents. For example, a registration service may be provided by a registration agent based on interactions between a patient and a first virtual character (e.g., a nurse virtual character) that represents the registration agent. As another example, a pre-consultation service may be provided by a pre-consultation agent based on interactions between the patient and a second virtual character (e.g., a nurse virtual character) that represents the pre-consultation agent. As a further example, an admission inquiry service may be provided by an admission agent based on interactions between the patient and a third virtual character (e.g., a doctor virtual character) that represents the admission agent.
In some embodiments, agent 620 may generate processing results 650 based on data 610 and may provide user services 640 to relevant users based on processing results 650. For example, the processing results 650 may include voice recognition results, image recognition results, abnormality detection results, and the like. The agent 620 may utilize various AI techniques (e.g., machine learning models) to generate the processing results 650. For example, the relevant users may include hospital staff (e.g., doctors, nurses, hospital administrators), and the user services 640 provided to the hospital staff may include providing records, notifications, advice, and the like based on the processing results 650.
According to some embodiments of the present disclosure, user services may be provided automatically by integrating agents, so that the need for manual labor may be reduced, the operation cost may be lowered, and the efficiency of providing user services may be improved. In addition, an agent can continuously optimize its learned rules and mechanisms, improving the accuracy, efficiency, and quality of the user services. Furthermore, the present description provides different types of agents that can handle tasks of different healthcare providers, different departments, different hospital positions, etc., each of which can achieve self-optimization and evolution from specific data to better handle its corresponding task.
Fig. 7 is a flow chart of an exemplary process 700 for providing user services according to some embodiments of the present description.
At step 702, the processing device 210 may monitor for updates to patient data.
The patient data may include data related to patients participating in a healthcare procedure. A healthcare procedure refers to a procedure that provides a patient with corresponding healthcare. For further description of healthcare procedures, see other relevant descriptions of the present specification, e.g., figs. 1 and 5 and the related descriptions thereof. In some embodiments, the patient data may be multi-modal data, including multiple types of data, multi-dimensional data, and the like.
In some embodiments, the patient-related data may be acquired according to a preset data acquisition protocol corresponding to the healthcare procedure. For example, at least one hardware device may collect data related to the patient according to the preset data acquisition protocol, and the processing device 210 may obtain the data from the at least one hardware device. For further description of data acquisition, see, e.g., fig. 5 and its associated description.
In some embodiments, the processing device 210 may monitor for updates to the patient-related data by monitoring at least one hardware device. Updates may be detected when the hardware device collects update data 704 that has not been processed by the processing device 210. For example, assuming the vital sign monitor collects the heart rate of the patient every hour, the processing device 210 may detect an update when the vital sign monitor collects a new heart rate of the patient (i.e., update data).
In some embodiments, the processing device 210 may have a direct communication connection with the at least one hardware device and monitor it directly. Alternatively, the processing device 210 may have a communication connection with a storage device (e.g., the storage device 230) that stores the patient-related data collected by the at least one hardware device, and may monitor the at least one hardware device by monitoring the storage device.
Update data 704 refers to data collected by the at least one hardware device that has not yet been processed by the processing device 210. For example, if the at least one hardware device includes an Internet of Things (IoT) device, the update data 704 may include IoT data. The update data 704 may be obtained directly from the at least one hardware device or from the storage device.
If the patient-related data includes update data 704 obtained from the at least one hardware device, the processing device 210 may perform event of interest (EOI) detection 706 on the update data 704. At the same time, the processing device 210 may continue to execute step 702 to continuously monitor for updates in the patient data.
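Merely by way of illustration, the monitoring and detection logic described above may be sketched as follows. This is an illustrative sketch only, not part of any claimed embodiment; all names (HeartRateMonitor, detect_eoi, the 120 bpm threshold, etc.) are hypothetical.

```python
# Illustrative sketch: a data source is polled for readings not yet
# processed (the update data), which are then handed to EOI detection.
from dataclasses import dataclass, field

@dataclass
class HeartRateMonitor:
    """Stands in for a vital sign monitor that buffers collected readings."""
    readings: list = field(default_factory=list)
    _cursor: int = 0  # index of the first reading not yet processed

    def collect(self, bpm: int) -> None:
        self.readings.append(bpm)

    def fetch_updates(self) -> list:
        """Return readings collected since the last call (the update data)."""
        updates = self.readings[self._cursor:]
        self._cursor = len(self.readings)
        return updates

def detect_eoi(updates: list, threshold: int = 120) -> list:
    """Flag an event of interest for each abnormal reading."""
    return [f"tachycardia:{bpm}" for bpm in updates if bpm > threshold]

monitor = HeartRateMonitor()
monitor.collect(80)
monitor.collect(135)   # abnormal reading
events = detect_eoi(monitor.fetch_updates())
assert events == ["tachycardia:135"]
assert monitor.fetch_updates() == []  # already-processed data is not re-read
```

In this sketch, the cursor plays the role of distinguishing update data 704 from data already processed, so that monitoring can continue while detection runs on each batch of updates.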
In some embodiments, processing device 210 may perform EOI detection 706 on update data 704.
An EOI refers to a particular event or behavior that requires attention. EOI detection 706 refers to processing the update data 704 collected by the at least one hardware device to detect whether one or more EOIs have occurred. For example, for an inpatient procedure, exemplary EOIs may include the patient entering a hospital ward, an admission check being performed on the patient, at least one doctor making rounds in the hospital ward, a care operation or medical check being performed on the patient, the patient initiating a service request, a doctor's order for the patient being obtained or updated, an abnormality in the patient's physiological condition, the doctor issuing instructions to the patient, etc., or any combination thereof. As another example, for a surgical procedure, exemplary EOIs may include a surgical participant issuing an instruction (e.g., a primary surgeon issuing a patient transfusion instruction), detection of a surgical risk (e.g., the patient's cardiac arrest or severe hemorrhage), an abnormal physiological state of the patient (e.g., the patient's blood pressure being too low), a count of surgical tools being less than a threshold (e.g., a count of surgical knives being less than a threshold number), completion of the surgery (i.e., the end of the surgical procedure), etc., or any combination thereof.
In some embodiments, processing device 210 may perform EOI detection 706 based on EOI detection rules 712. EOI detection rules 712 refer to rules that need to be followed when performing EOI detection 706. For example, the EOI detection rules 712 may specify the EOI to be detected, the algorithm or technique used to detect a particular EOI, the algorithm or technique used to analyze data collected by a particular data source, the EOIs corresponding to different links of a hospital stay, the EOIs corresponding to different types of patients, etc., or any combination thereof.
In some embodiments, the EOI detection rules 712 may be determined based on a history of EOI detection or manually set by a user (e.g., doctor, nurse, technician, etc.). For example, processing device 210 may perform EOI detection 706 on update data 704 based on the type of update data 704.
In some embodiments, each link of the healthcare procedure may correspond to one or more EOIs, and different links of the healthcare procedure may correspond to different types of EOIs. Accordingly, processing device 210 may perform EOI detection 706 on update data 704 based on the current link of the patient in the healthcare procedure. For example, the processing device 210 may determine a current link of the patient in the healthcare procedure and determine type data of the EOI to be detected based on the current link of the patient. Further, processing device 210 may perform EOI detection 706 on update data 704 based on the type data.
By performing EOI detection based on the current link of the patient in the healthcare procedure, only specific types of EOIs need to be detected or tracked, and the amount of data to be processed can be reduced, thereby reducing the processing requirements on the processing device 210 and improving the efficiency of EOI detection.
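Merely by way of illustration, the link-based filtering described above may be sketched as follows. This is an illustrative sketch only; the link names and EOI types are hypothetical examples, not a definitive implementation.

```python
# Illustrative sketch: each link of the healthcare procedure corresponds
# to a set of EOI types, and only events of those types are tracked.
LINK_EOI_TYPES = {
    "inpatient_ward": {"abnormal_vitals", "service_request", "order_update"},
    "surgery":        {"abnormal_vitals", "tool_count_low", "surgery_complete"},
}

def detect_eois(current_link: str, candidate_events: list) -> list:
    """Keep only events whose type is tracked for the patient's current link."""
    tracked = LINK_EOI_TYPES.get(current_link, set())
    return [e for e in candidate_events if e["type"] in tracked]

events = [
    {"type": "tool_count_low", "detail": "scalpel count below threshold"},
    {"type": "order_update",   "detail": "new medication order"},
]
# In the surgery link, the order-update event is outside the tracked types.
assert [e["type"] for e in detect_eois("surgery", events)] == ["tool_count_low"]
```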
If an EOI is detected (i.e., an EOI occurrence 708 exists), the processing device 210 may perform one or more preset operations 710 corresponding to the EOI to provide user services to the relevant user. At the same time, the processing device 210 may continue to perform step 702 and the EOI detection 706 to continuously monitor data updates and detect EOIs.
In some embodiments, the processing device 210 may perform one or more preset operations 710 corresponding to the EOI to provide user services to the relevant users.
The relevant users may include any user related to the healthcare procedure. For example, the relevant users may include a patient (or a portion of a patient, e.g., an organ), a physician, a visitor of the patient, hospital staff involved in the healthcare procedure (e.g., a healthcare provider such as a doctor, a nurse, or a technician, a hospital administrator, a support person, etc.), a supplier for the healthcare procedure, an application developer for the healthcare procedure, etc., or any combination thereof.
The user services may include any service provided in a healthcare procedure. For example, the user services may include medical services corresponding to the healthcare procedure provided to patients and/or their attendants, support services corresponding to the healthcare procedure provided to hospital staff and/or suppliers, and the like.
A preset operation corresponding to an EOI refers to an operation that needs to be performed when the EOI occurs. In some embodiments, the preset operations may include general operations and/or specific operations. A general operation refers to an operation that needs to be performed whenever any EOI occurs, regardless of the type of the EOI. For example, the general operations may include generating a record related to the EOI. A specific operation refers to an operation performed when a particular type of EOI occurs. For example, updating a daily plan of the patient according to the update data may be determined as a specific operation corresponding to an EOI of updating a doctor's order for the patient.
In some embodiments, the processing device 210 may determine a correspondence 714 between the EOI and the preset operations and determine one or more preset operations 710 corresponding to the detected EOI based on the correspondence 714. Correspondence 714 may indicate that one or more preset operations 710 need to be performed when a particular type of EOI occurs. For example, the correspondence 714 may take the form of a lookup table.
In some embodiments, the correspondence 714 may be predetermined and stored in a storage device, from which the processing device 210 may obtain the correspondence 714. In some embodiments, the processing device 210 may determine the correspondence 714 between the EOIs and the preset operations from history records.
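Merely by way of illustration, the lookup-table form of the correspondence 714 and the combination of general and specific operations may be sketched as follows. This is an illustrative sketch only; the EOI types and operation names are hypothetical.

```python
# Illustrative sketch: correspondence 714 as a lookup table mapping EOI
# types to specific operations, combined with a general operation that
# is performed whenever any EOI occurs.
GENERAL_OPERATIONS = ["generate_eoi_record"]

CORRESPONDENCE = {
    "order_update":    ["update_daily_plan"],
    "abnormal_vitals": ["alert_nurse_terminal"],
}

def preset_operations(eoi_type: str) -> list:
    """Return the operations to perform when an EOI of this type occurs."""
    return GENERAL_OPERATIONS + CORRESPONDENCE.get(eoi_type, [])

assert preset_operations("order_update") == ["generate_eoi_record", "update_daily_plan"]
# EOI types without a specific entry still trigger the general operation.
assert preset_operations("other_eoi") == ["generate_eoi_record"]
```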
In some embodiments, the one or more preset operations 710 corresponding to the EOI may be further determined based on patient profile data. The patient profile data may include basic data, health data, historical data, registration data, and the like, or any combination thereof. For example, different types of patients (e.g., patients having different diseases or different ages) may correspond to different preset operations.
In some embodiments, the preset operation corresponding to the EOI may be performed immediately after the EOI is detected. For example, the processing device 210 may cause the nurse terminal to provide an alarm when the patient's physiological state is abnormal. In some embodiments, the preset operation corresponding to the EOI may be performed after the EOI is completed. For example, the records related to the EOI may be generated in response to detecting completion of the EOI.
According to some embodiments of the present description, at least one hardware device that captures information about a healthcare procedure is monitored. Such monitoring facilitates timely discovery of data updates and of the occurrence of EOIs, and timely triggering of the corresponding preset operations. Therefore, user services can be automatically and efficiently provided to the relevant users, thereby improving service efficiency and quality. In addition, the monitored hardware devices collect multi-modal data in different links of the entire healthcare procedure, thereby realizing patient-centered medical care and comprehensive hospital services.
In some embodiments, the processing device 210 may configure an agent corresponding to the healthcare procedure, and the agent may perform at least a portion of the process 700. An agent refers to a software entity that acts in an intelligent manner. For example, the agent may learn the EOI detection rules 712 from history records and perform EOI detection in accordance with the EOI detection rules. An EOI detection rule refers to a rule for performing EOI detection on the update data. For example, the agent may learn from the history records what types of EOIs need to be monitored, how to effectively detect an EOI, what data needs to be analyzed to detect an EOI, and so on. As another example, the agent may learn the correspondence between EOIs and preset operations from the history records, and determine one or more preset operations corresponding to an EOI according to the correspondence, e.g., by learning from the history records which operations need to be performed when a particular EOI occurs.
In some embodiments, the agent may further learn EOI detection rules and/or correspondence between EOIs and preset operations based on profile data for different patients. For example, the agent may determine different EOI detection rules and different preset operations for different types of patients (e.g., different diseases, different ages).
By integrating agents, the system can continuously learn EOI detection rules and/or correspondence using big data techniques, machine learning techniques, and other advanced methods. The continuous optimization of the EOI detection rules and/or the corresponding relations improves the accuracy, efficiency and service quality of hospitalization services.
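Merely by way of illustration, learning the correspondence from history records may be sketched as a simple frequency count over past EOI-operation pairs. This is an illustrative sketch only; a real agent would use richer machine learning models, and all names are hypothetical.

```python
# Illustrative sketch: "learn" which operations follow each EOI type by
# counting co-occurrences in a history record and keeping operations
# seen at least min_count times.
from collections import Counter, defaultdict

def learn_correspondence(history: list, min_count: int = 2) -> dict:
    """Map each EOI type to the operations observed at least min_count times."""
    counts = defaultdict(Counter)
    for eoi_type, operation in history:
        counts[eoi_type][operation] += 1
    return {
        eoi: [op for op, n in ops.items() if n >= min_count]
        for eoi, ops in counts.items()
    }

history = [
    ("order_update", "update_daily_plan"),
    ("order_update", "update_daily_plan"),
    ("order_update", "notify_nurse"),          # seen only once: filtered out
    ("abnormal_vitals", "alert_nurse_terminal"),
    ("abnormal_vitals", "alert_nurse_terminal"),
]
learned = learn_correspondence(history)
assert learned["order_update"] == ["update_daily_plan"]
assert learned["abnormal_vitals"] == ["alert_nurse_terminal"]
```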
Fig. 8 is a schematic diagram of an exemplary medical outpatient procedure 800 according to some embodiments of the present description. As shown in fig. 8, the medical outpatient procedure 800 may include a number of links, such as a registration link 810, a waiting link 820, an inquiry link 830, and a post-consultation link 840. In some embodiments, the processing device 210 (e.g., the service module 430, or an agent corresponding to an outpatient service/procedure configured on the processing device 210) may perform operations involved in multiple links of the medical outpatient procedure 800.
The patient may make an appointment with a doctor in the registration link 810. As shown in FIG. 8, operations associated with the registration link 810 may include obtaining/establishing an electronic medical record, providing an intelligent registration service, providing a path planning service, providing a path guidance service, and so forth. In some embodiments, the patient may initiate a registration request through the patient terminal or a registration terminal at the hospital site, or initiate a remote registration request through the patient terminal at another site outside the hospital.
In some embodiments, the processing device 210 can acquire or establish an electronic medical record of the patient upon receiving a registration request from the patient.
Electronic medical records refer to electronic files that record various types of patient data. In some embodiments, the electronic medical record can be updated as the patient outpatient procedure progresses. For example, the electronic medical record may include basic information of the patient when the patient registers, and the electronic medical record may be updated to further include patient complaints and registration records after the patient completes registering.
In some embodiments, the processing device 210 may provide intelligent registration services to the patient when a registration request is received from the patient. The intelligent registration service can be used to match a registration department, a registration doctor, and a registration time for a patient. In some embodiments, the processing device 210 may provide services related to the registration link through a registration terminal of a hospital or a patient terminal of a patient.
If the patient does not have an explicit department to register with, the processing device 210 may provide the intelligent registration service to the patient based on patient complaints, examination reports uploaded by the patient, and/or historical patient information stored in the storage device to determine a recommended department that matches the patient's need for a visit. If the patient does not have an explicit doctor to register with, the processing device 210 may recommend a registering doctor for the patient based on the patient complaints and/or doctor information, present the recommended department and doctor to the patient, and recommend a registration period with fewer existing reservations for the patient based on the reservation information of the registering doctor.
In some embodiments, the processing device 210 may determine the patient's registration department by making a first query to the patient using the patient terminal or the registration terminal. For example, the processing device 210 may obtain patient complaints via the patient terminal or the registration terminal and determine at least one candidate department based on the patient complaints. The processing device 210 may then cause the patient terminal or the registration terminal to make a first query to the patient based on the at least one candidate department. For example, the processing device 210 may determine first query content of the first query from the patient complaints and the at least one candidate department, and cause the patient terminal or the registration terminal (e.g., an XR device) to present a first virtual character that makes the first query to the patient based on the first query content. Finally, the processing device 210 may determine the registration department from first data acquired by the patient terminal or the registration terminal during the first query.
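Merely by way of illustration, determining candidate departments from a patient complaint and forming the first query content may be sketched as follows. This is an illustrative sketch only; the keyword map, department names, and query wording are hypothetical.

```python
# Illustrative sketch: match complaint text against keywords to propose
# candidate departments, then build the first query content that the
# first virtual character presents to the patient.
COMPLAINT_KEYWORDS = {
    "chest pain": ["Cardiology", "Respiratory Medicine"],
    "headache":   ["Neurology"],
}

def candidate_departments(complaint: str) -> list:
    """Return departments whose keywords appear in the complaint."""
    matches = []
    for keyword, departments in COMPLAINT_KEYWORDS.items():
        if keyword in complaint.lower():
            matches.extend(departments)
    return matches

def first_query(candidates: list) -> str:
    """Build the first query content from the candidate departments."""
    options = " or ".join(candidates)
    return f"Your symptoms may relate to {options}. Which would you prefer?"

cands = candidate_departments("I have chest pain after climbing stairs")
assert cands == ["Cardiology", "Respiratory Medicine"]
assert "Cardiology or Respiratory Medicine" in first_query(cands)
```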
The path planning service may be used to provide the patient with a planned path to the registered doctor's office. For example, the processing device 210 may generate a planned path to the consulting room based on the current location of the patient and location information of the registering doctor's consulting room. In some embodiments, the processing device 210 may generate the planned path after the patient has been registered at the hospital site. In some embodiments, the processing device 210 may generate the planned path when the patient remotely registers and arrives at the hospital.
The path guidance service may be used to guide the patient to the doctor's office according to the planned path. In some embodiments, processing device 210 may display guidance information to the patient through the patient's patient terminal (e.g., XR device). For example, through AR technology, the XR device may overlay guiding information related to the planned path over the real view of the patient.
The patient may wait and prepare for a visit in the waiting link 820. In the waiting link 820, the patient may be at the hospital site (e.g., in a waiting area in front of the doctor's office) or at another location outside the hospital (e.g., at the patient's home). As shown in fig. 8, operations associated with the waiting link 820 may include providing a pre-consultation service, providing force/temperature feedback, and the like.
The pre-consultation service may be used to collect information about the patient by making a preliminary query to the patient before the patient enters the consulting room for a formal visit. For example, while the patient is waiting for the doctor, the processing device 210 may perform a pre-consultation of the patient through the patient terminal or a waiting terminal configured in the waiting area, so as to relieve the patient's anxiety while waiting, generate a pre-consultation record, and provide the pre-consultation record to the doctor for reference, thereby improving the doctor's efficiency.
Force feedback and/or temperature feedback may be used to provide soothing feedback to the patient. For example, upon detecting a patient's emotion (e.g., tension, fear, anxiety, etc.), the processing device 210 may apply force feedback and/or temperature feedback to the patient via the patient's wearable device, causing the patient to experience actions such as a handshake or a hug, thereby soothing the patient's negative emotions.
The patient may communicate with the registering doctor in the inquiry link 830 to receive medical outpatient services (e.g., on-site outpatient service in the consulting room, remote outpatient service). As shown in FIG. 8, operations associated with the inquiry link 830 may include a current-day outpatient preview, medical data presentation, providing consultation advice, diagnostic record generation, a remote attendance service, and the like.
In the inquiry link 830, the current-day outpatient preview is used to show the doctor relevant information of the patients to be treated on the current day. For example, before the doctor begins the visits, the processing device 210 may present to the doctor the categories of the patients who have registered or are waiting on the current day. The categories of patients may include first-visit patients, return-visit patients, and the like. In some embodiments, the doctor may view the current-day outpatient preview through a common terminal device within the doctor's office or through the doctor's terminal (e.g., an XR device).
In the inquiry link 830, the medical data presentation is used to present the medical data of the patient, such as an electronic medical record, to the doctor, the patient, and/or a remote attendant. For example, the processing device 210 may synchronize the display of the patient's medical data to the doctor, the patient, and/or the remote attendant via at least one terminal device. In addition, the display mode and/or the display content of the electronic medical record may be updated according to interaction operations of the doctor, the patient, and/or the remote attendant on the medical data, so as to facilitate communication among the doctor, the patient, and/or the remote attendant.
In the inquiry link 830, providing consultation advice refers to providing the doctor with consultation advice that can be referred to during the consultation. For example, the processing device 210 may generate consultation advice for the doctor based on sensing information collected by a sensing device during the consultation process, so that the doctor can adjust the diagnosis mode, the diagnosis result, and the prescription, thereby improving the consultation efficiency and accuracy.
Diagnostic record generation is used to assist the physician in generating a diagnostic record of the patient. For example, the processing device 210 may generate a diagnostic record to record information regarding the patient's medical condition diagnosed by the doctor, the doctor's order, and the like.
The remote attendance service may be used to provide an immersive attendance experience for the patient and the remote attendant. For example, the processing device 210 may present to the remote attendant real-time views of a virtual consulting room emulating the real consulting room, the doctor's real-time view within the consulting room, and the patient's real-time view. As another example, the processing device 210 may present real-time images of the remote attendant through a common terminal device within the consulting room. As another example, the processing device 210 may present real-time images of the remote attendant to the doctor and the patient through the doctor's terminal and the patient's patient terminal, respectively.
For further description of the medical data presentation, providing consultation advice, diagnostic record generation, and the remote attendance service, see the relevant description of fig. 11, which is not repeated here.
In some embodiments, the processing device 210 may provide services related to the inquiry link to a relevant user (e.g., the doctor, the patient, a remote attendant) via at least one terminal device. The at least one terminal device may include a common terminal device in the consulting room, a patient terminal, a doctor terminal, a remote attendant's terminal device, and the like.
A common terminal device in the consulting room refers to a terminal device installed at the site of the consulting room, and may include a display screen, a sound output device, a sound sensor, an XR device, a wearable device, etc., or any combination thereof. For example, after the patient enters the consulting room, the processing device 210 may present the electronic medical record, consultation advice, pre-consultation records, diagnostic records, real-time images of the remote attendant, etc., to the doctor and/or the patient via the display screen of the common terminal device. As another example, the patient may wear a wearable device of the common terminal device, through which force feedback from the remote attendant is received. As another example, the doctor and the patient may wear XR devices of the common terminal device, and the processing device 210 may present various types of information to the doctor and/or the patient via the XR devices. In some embodiments, the XR device worn by the patient may also serve as the XR device of the registration terminal.
Fig. 12 is a schematic diagram of an exemplary clinic interface 1200 shown according to some embodiments of the present description. As shown in fig. 12, the visit interface 1200 may include a first interface element 1210 related to the patient's electronic medical record, a second interface element 1220 related to the remote follow-up service, a third interface element 1230 related to the medical document, and a fourth interface element 1240 related to the visit proposal.
The presentation content and/or presentation form of the visit interface 1200 may change as the visit procedure progresses. For example, the doctor and/or the patient may issue control instructions to adjust what is displayed in the visit interface 1200. The control instructions may take various forms, such as voice control instructions, gesture control instructions, touch control instructions, and control instructions input through an input device (e.g., a mouse, a keyboard, etc.).
The first interface element 1210 can be an icon corresponding to the electronic medical record or can be used to display content of the electronic medical record. During a visit, a doctor and/or patient can invoke at least a portion of the electronic medical record by issuing control instructions to be presented via the first interface element 1210.
The second interface element 1220 may be an icon corresponding to the remote attendance service, or may be used to display a view of the remote attendant. When the patient's remote attendance request is approved by the doctor, the patient can communicate with the remote attendant during the inquiry process and view the remote attendant's image through the second interface element 1220.
The third interface element 1230 may be an icon corresponding to a medical document, and may also be used to display a medical document, such as a pre-consultation record or a diagnostic record (a preliminary diagnostic record or a target diagnostic record). When the patient is detected to begin a visit, the pre-consultation record may be presented via the third interface element 1230. The third interface element 1230 may be updated to present the preliminary diagnostic record after the doctor has completed communicating with the patient, and may be updated to present the target diagnostic record after the doctor has confirmed or modified the preliminary diagnostic record.
In some embodiments, the processing device 210 may determine that the patient is beginning a visit based on a number-calling instruction. A number-calling instruction is an instruction by which the doctor indicates the start of the next patient's visit. The doctor may actively initiate the number-calling instruction through the common terminal or the doctor terminal. Alternatively, the processing device 210 may automatically generate a number-calling instruction after the doctor submits the target diagnostic record of the previous patient. In some embodiments, when the patient receives the on-site medical outpatient service, the processing device 210 may determine that the patient is beginning the visit based on location information of the patient terminal. Alternatively, the processing device 210 may determine that the patient is beginning the visit based on a sensing device (e.g., an image sensor) acquiring an image of the patient entering the consulting room. When the patient receives the remote medical outpatient service, the processing device 210 may obtain remote visit confirmation messages sent by the doctor and the patient through the doctor terminal and the patient terminal, respectively, to determine that the patient is beginning the visit.
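Merely by way of illustration, the visit-start conditions described above may be combined as follows. This is an illustrative sketch only; the signal names and service-mode labels are hypothetical.

```python
# Illustrative sketch: decide that a visit has started from any of the
# configured signals (number-calling instruction, terminal location or
# entry image for on-site visits, dual confirmation for remote visits).
def visit_started(service_mode: str, signals: dict) -> bool:
    """Return True when any configured visit-start condition is met."""
    if signals.get("number_calling_instruction"):
        return True
    if service_mode == "on_site":
        # Patient-terminal location, or an image of the patient entering
        # the consulting room, indicates the visit has begun.
        return signals.get("terminal_in_room", False) or signals.get("entry_image", False)
    if service_mode == "remote":
        # Both the doctor and the patient must send confirmation messages.
        return signals.get("doctor_confirmed", False) and signals.get("patient_confirmed", False)
    return False

assert visit_started("on_site", {"terminal_in_room": True})
assert not visit_started("remote", {"doctor_confirmed": True})
assert visit_started("remote", {"doctor_confirmed": True, "patient_confirmed": True})
```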
The fourth interface element 1240 may be an icon corresponding to the consultation advice, or may be used to present the content of the consultation advice. After the consultation advice is generated based on the sensing information collected during the consultation, the consultation advice may be presented through the fourth interface element 1240.
In some embodiments, the processing device 210 may provide services related to the inquiry links to the doctor and patient through the doctor's terminal and the patient's patient terminal. At least a portion of the content presented by the doctor terminal and the patient terminal may be synchronized.
In some embodiments, the processing device 210 may further provide services related to the inquiry link to a remote co-diagnostic person via the remote terminal device (e.g., an XR terminal device) of the remote co-diagnostic person. When the remote co-diagnostic person participates in the diagnosis process, the remote terminal device of the remote co-diagnostic person can synchronously present the virtual consulting room.
In some embodiments, the processing device 210 may provide a telemedicine outpatient service. For example, the processing device 210 may obtain or generate a three-dimensional patient model. The three-dimensional patient model may correspond to the patient or a portion of the patient (e.g., the upper body). For example only, the processing device 210 may obtain an initial three-dimensional patient model from the electronic medical record of the patient and update it based on real-time dynamic data and physiological data of the patient to obtain the three-dimensional patient model. In addition, the processing device 210 may present the three-dimensional patient model to the doctor and obtain examination instructions entered by the doctor through the doctor terminal. For example, the processing device 210 may present the three-dimensional patient model in the field of view of the doctor through the doctor terminal. The examination instructions entered by the doctor may include an examination site, an examination device, and an examination operation. The doctor may enter the examination instructions in various ways, such as voice, gestures, and operating input devices (e.g., a smart glove, a smart grip, etc.). For example, the doctor terminal may present virtual examination devices corresponding to a plurality of examination devices. The doctor can select a virtual examination device through the input device and perform a virtual examination operation on the three-dimensional patient model using the virtual examination device. The processing device 210 may determine the examination site, the examination device, the examination operation, etc., according to the virtual examination operation performed by the doctor, and generate an examination instruction. In addition to examination instructions, the doctor may also enter other instructions to instruct the doctor terminal to rotate, zoom in on, or zoom out of the three-dimensional patient model.
In some embodiments of the present description, a doctor and a patient may communicate remotely in a virtual consulting room space through the doctor terminal and the patient terminal, which may provide an unobstructed, immersive consulting experience for the doctor and the patient. In addition, the three-dimensional patient model of the patient can be presented to the doctor, so that a remote examination can be performed more accurately and conveniently, improving the accuracy of diagnosis.
After the patient finishes the diagnosis, the post-diagnosis link can be entered. As shown in FIG. 8, operations associated with the post-diagnosis link 840 may include providing a medication intake service, providing an examination service, providing a health monitoring service, and so forth.
The medication intake service is used to assist the patient in taking the medication prescribed by the doctor. For example, the medication intake service is used to assist the patient in paying for medications, reserving a pharmacy for medication pickup or delivery, guiding the patient to the pharmacy, and the like. The examination service is used to assist the patient in receiving the examination that the doctor requires. For example, the examination service is used to assist the patient in paying examination fees, reserving examinations with the examination department, guiding the patient to the examination department, and the like.
In some embodiments, in response to detecting the end of the inquiry process, the processing device 210 may determine a target service to be provided to the patient after the inquiry process and make a reservation for the patient with a target business unit that provides the target service. The processing device 210 may detect whether the inquiry process has finished in a number of ways. For example, the processing device 210 may determine that the patient's inquiry process is over when it detects that the doctor submitted a target diagnostic record for the patient or that the doctor called the next patient.
In some embodiments, the processing device 210 may determine the target services required by the patient based on the target treatment prescription in the target diagnostic record. For example, the target service may be a medication intake service and the target business unit may be a pharmacy. When the patient requires a medication intake service, the processing device 210 may send the patient's medication order information to the pharmacy and schedule the medication intake. As another example, the target service may be an examination service and the target business unit may be an examination department. When the patient requires the examination service, the processing device 210 may send the examination prescription information of the patient to the examination department and reserve the examination. Alternatively, the processing device 210 may send the reservation information to the patient terminal after making the reservation with the target business unit. The health monitoring service may be used to continuously monitor the health of the patient after the visit. For example, the processing device 210 may generate a health monitoring plan based on the target diagnostic record and cause one or more monitoring devices to obtain health monitoring information for the patient based on the health monitoring plan. Optionally, the processing device 210 may further update the health monitoring plan based on the health monitoring information.
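By way of a non-limiting illustration, the mapping from the target prescription to reserved business units may be sketched as follows; the record layout and field names here are assumptions, not the disclosed format:

```python
# Hypothetical sketch: derive reservations from a target diagnostic record.
def reserve_target_services(target_diagnostic_record: dict) -> list:
    """Map prescription entries to (service, business unit) reservations."""
    reservations = []
    prescription = target_diagnostic_record.get("target_prescription", {})
    if prescription.get("medications"):
        reservations.append({
            "service": "medication_intake",
            "business_unit": "pharmacy",
            "payload": prescription["medications"],   # medication order information
        })
    if prescription.get("examinations"):
        reservations.append({
            "service": "examination",
            "business_unit": "examination_department",
            "payload": prescription["examinations"],  # examination prescription information
        })
    return reservations
```

A real system would also send the resulting reservation information to the patient terminal, as described above.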
In some embodiments, one or more operations in the medical outpatient procedure 800 may be performed by an agent corresponding to a medical outpatient service/procedure. In some embodiments, different links of the medical outpatient procedure 800 may share one agent, or different links may correspond to different agents. For example, the medical outpatient procedure may include a registration service, which may correspond to a registration agent, and a pre-consultation service, which may correspond to a pre-consultation agent. For example, the registration service may be provided by the registration agent based on interactions between the patient and a first virtual character representing the registration agent, and the pre-consultation service may be provided by the pre-consultation agent based on interactions between the patient and a second virtual character representing the pre-consultation agent.
FIG. 9 is a schematic diagram of an exemplary flow 900 of providing a pre-consultation service according to some embodiments of the present description. The process 900 shown in FIG. 9 may be performed in the waiting link 820.
Step 910, determining second query content of the second query based on the department of the doctor.
The second query may also be referred to as a pre-consultation query, which is used to make a preliminary query to the patient prior to the formal consultation. The second query may include multiple rounds of queries. The second query content may include the query content for each round of queries, or may include only the query content of the first round of queries.
In some embodiments, the processing device 210 may obtain a pre-consultation record template corresponding to the doctor's department and determine the second query content based on the pre-consultation record template.
In some embodiments, the processing device 210 can obtain known information about the patient (e.g., electronic medical records, complaints, etc.) and determine the missing information, i.e., information in the pre-consultation record template that has not yet been acquired, by comparing the pre-consultation record template with the known information. For example, if the known information includes a family history of the patient, the missing information need not include the family history. As another example, if the patient's complaint includes a medical history, the missing information need not include the medical history. In some embodiments, the missing information may be determined based on basic information of the patient. For example, for male patients, the missing information need not include a menstrual history or a fertility history. Further, the processing device 210 may determine the second query content based on the missing information.
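The comparison described above amounts to a set difference between template fields and already-known fields. A minimal sketch follows; the field names and sex-specific exclusions are illustrative assumptions:

```python
# Hypothetical template fields of a pre-consultation record template.
TEMPLATE_FIELDS = ["chief_complaint", "medical_history", "family_history",
                   "menstrual_history", "fertility_history"]

# Fields that need not be collected for certain basic information (illustrative).
SEX_SPECIFIC = {"male": {"menstrual_history", "fertility_history"}}

def missing_information(known: dict, basic_info: dict) -> list:
    """Return template fields not covered by known information or exclusions."""
    skip = SEX_SPECIFIC.get(basic_info.get("sex", ""), set())
    return [f for f in TEMPLATE_FIELDS
            if f not in skip and not known.get(f)]
```

The second query content would then be built only from the returned fields.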
In some embodiments, the processing device 210 may determine the second query content based on the department of the doctor and known information about the patient using a first query model. The first query model may include a CNN model, an RNN model, an LSTM model, a BERT model, a ChatGPT model, or the like. In some embodiments, the first query model may include a missing information determination model and a first query content determination model. The missing information determination model may be configured to output the missing information by processing the department of the doctor and the known information about the patient. The first query content determination model may be configured to output the second query content based on the missing information of the patient.
Step 920, controlling the patient terminal to perform the second query on the patient based on the second query content.
In some embodiments, after the patient registers with the doctor, the processing device 210 may determine an estimated wait time before the patient receives medical services. For example, the estimated wait time may be the time difference between the current time and the patient's registration period. As another example, the estimated wait time may be determined based on the doctor's current-day outpatient record and the patient's registration record. The patient's registration record may include the registration period for which the patient is booked. The doctor's current-day outpatient record is a record reflecting the doctor's outpatient conditions for the current day.
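The simplest of the two estimates above (registration period minus current time) may be sketched as follows; the function name and the clamping to zero are assumptions:

```python
from datetime import datetime

# Hypothetical sketch: estimated wait time as the gap between the booked
# registration period and the current time, clamped to be non-negative.
def estimated_wait_minutes(registration_period: datetime, now: datetime) -> float:
    return max((registration_period - now).total_seconds() / 60.0, 0.0)
```

The second estimate (from the doctor's current-day outpatient record) would replace this difference with a queue-based prediction.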
In some embodiments, in response to determining that the estimated wait time is greater than a first preset time threshold, the processing device 210 may cause the patient terminal of the patient to initiate the second query to the patient or to present a suggestion to make the second query. This approach helps ensure that there is sufficient time for the pre-consultation and prevents the doctor from calling the patient during the pre-consultation.
In some embodiments, in response to determining that the estimated wait time is less than a second preset time threshold, the processing device 210 may cause the patient terminal of the patient to initiate the second query to the patient or to present a suggestion to make the second query. The second preset time threshold may be greater than the first preset time threshold. For example, when the current time is detected to be less than 24 hours from the registration period (i.e., the estimated wait time is less than 24 hours), the patient terminal may display a suggestion to the patient to make the second query (e.g., via a virtual character) to prompt the patient to complete the pre-consultation in time.
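Taken together, the two thresholds define a window: enough time remains for the pre-consultation (above the first threshold), but the visit is near enough for the pre-consultation to be timely (below the second). A sketch with illustrative threshold values:

```python
# Illustrative thresholds (the disclosure does not fix concrete values,
# other than the 24-hour example for the second threshold).
FIRST_THRESHOLD_MIN = 15            # minimum time needed to finish the pre-consultation
SECOND_THRESHOLD_MIN = 24 * 60      # e.g., within 24 hours of the registration period

def should_start_pre_consultation(wait_minutes: float) -> bool:
    """True when the estimated wait time falls inside the trigger window."""
    return FIRST_THRESHOLD_MIN < wait_minutes < SECOND_THRESHOLD_MIN
```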
In some embodiments, the processing device 210 may detect that the patient initiated a pre-consultation request through the patient terminal and then cause the patient terminal of the patient to make the second query to the patient.
In some embodiments, the patient terminal may present a second virtual character to make the second query based on the second query content. The second virtual character refers to a digitized avatar having specific characteristics (e.g., specific appearance characteristics, acoustic characteristics, etc.) that can communicate with the patient to conduct the pre-consultation. For example, the processing device 210 may display the second virtual character through a screen of the patient terminal (e.g., an XR device) and play the second query content through a sound output device of the patient terminal. Meanwhile, the second virtual character can simulate human speech, expressions, gestures, and the like, providing a realistic communication experience for the patient.
In some embodiments, the second virtual character may have preset appearance features. In some embodiments, the appearance features of the second virtual character may be determined based on optical image data of the doctor with whom the patient registered. In some embodiments, the appearance features of the second virtual character may be determined based on the patient's basic information. In some embodiments, the processing device 210 may select an appropriate virtual character from a character library as the second virtual character based on the doctor's profile and/or the patient's basic information.
In some embodiments, the second query comprises a plurality of rounds of queries, the second query content may comprise the query content of each round of queries in the second query, and the second query may be performed through the process 1000 shown in FIG. 10.
As shown in FIG. 10, for the first round of queries, the processing device 210 may cause the patient terminal to make the first round of queries based on the corresponding query content.
For each current round of queries (abbreviated as the current query) other than the first round, the processing device 210 may adjust the query content of the current query (abbreviated as the current query content) based on the second data collected prior to the current query, so that the query content is more consistent with the patient's condition. For example, the processing device 210 may determine semantic information and emotion information of the patient's historical answers based on the second data collected prior to the current query. The second data may be collected after sound is detected by a sound sensor of the third terminal device. The historical answers are the patient's answers to the queries of the historical rounds. The semantic information of a historical answer characterizes its content. The emotion information of a historical answer may indicate the emotion of the patient (e.g., calm, tension, anxiety, fear, suspicion, irritability, etc.) at the time the historical answer was provided. The processing device 210 may determine the semantic information by performing text transcription, voice content recognition, etc., on the second data. The processing device 210 may determine the emotion information by analyzing features of the second data such as content, tone, intonation, and pace.
As shown in FIG. 10, the processing device 210 may adjust the current query content based on the semantic information and the emotion information. For example, when the patient's emotion information is "tension" or "fear," the processing device 210 may add a soothing utterance to the current query content. As another example, when the semantic information indicates that the patient did not explicitly answer a historical query, the processing device 210 may adjust the current query content to repeat that historical query, thereby guiding the patient to answer it explicitly; the originally determined current query content can then be used as the query content of the next round. Thus, the current query content can be adjusted in time according to the patient's condition, improving the service quality of the pre-consultation.
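The two adjustment rules above can be sketched as a small function. The soothing phrase, emotion labels, and return convention (the deferred content becomes the next round's query) are illustrative assumptions:

```python
SOOTHING = "Please don't worry, take your time. "   # hypothetical soothing utterance

def adjust_current_query(current_content: str,
                         emotion: str,
                         answered_clearly: bool,
                         history_query: str):
    """Return (content to ask now, content deferred to the next round)."""
    if not answered_clearly:
        # Repeat the historical query; defer the planned content to the next round.
        return history_query, current_content
    if emotion in {"tension", "fear", "anxiety"}:
        # Prepend a soothing utterance to the planned content.
        return SOOTHING + current_content, ""
    return current_content, ""
```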
In some embodiments, in addition to adjusting the current query content, the sound features used for the query may also be adjusted in real time based on the patient's status. The sound features include speech rate features, tone features, intonation features, volume features, etc. As shown in FIG. 10, the processing device 210 may determine the sound features of the current query based on the semantic information and emotion information of the patient's historical answers, and cause the patient terminal to perform the current query based on the adjusted query content and the sound features of the current query. This approach better accommodates the patient's emotional changes, thereby enhancing the personification of the second virtual character and improving the quality of the pre-consultation service.
In some embodiments, as shown in FIG. 10, the processing device 210 may further obtain physiological state information of the patient. The physiological state information may reflect the real-time physiological state of the patient and may include physiological parameter values (e.g., heart rate, pulse rate, respiration rate, etc.). The physiological state information may also include information related to the posture, limb behavior, facial expression, muscle state, etc., of the patient. In some embodiments, the physiological state information may be obtained using a wearable device worn by the patient.
In addition, the processing device 210 may adjust the current query content based on the semantic information, the emotion information, and the physiological state information. For example, the processing device 210 may update the patient's emotion information based on the patient's physiological state information. It will be appreciated that the patient's inner emotions are not always adequately expressed in the patient's answers, and thus the emotion information may be updated or corrected based on the physiological state information. The processing device 210 may then adjust the current query content based on the semantic information and the updated emotion information.
According to some embodiments of the present description, by further considering the physiological state information of the patient, the accuracy of the patient's emotion information may be improved, so that the adjustment of the current query content is more accurate, thereby improving the quality of the pre-consultation service.
As shown in FIG. 10, in some embodiments, the processing device 210 may determine feedback parameters from at least a portion of the semantic information, the emotion information, and the physiological state information, and control the wearable device to apply feedback to the patient in accordance with the feedback parameters. The feedback may include at least one of force feedback or temperature feedback. The feedback parameters may be used to control the manner in which feedback is applied, e.g., the type of feedback, the body part to which the feedback is applied, the strength of the feedback, etc. In some embodiments, the processing device 210 may determine the patient's emotion and emotion level from at least a portion of the semantic information, the emotion information, and the physiological state information, and determine the feedback parameters from the emotion and the emotion level. This approach can soothe the patient's negative emotions in time, thereby improving the quality of the pre-consultation service.
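A minimal sketch of the mapping from emotion and emotion level to feedback parameters follows. The specific types, body parts, strengths, and level thresholds are illustrative assumptions, not values fixed by the disclosure:

```python
# Hypothetical sketch: choose wearable-feedback parameters from the patient's
# emotion and emotion level.
def feedback_parameters(emotion: str, level: int) -> dict:
    if emotion in {"tension", "anxiety", "fear"} and level >= 2:
        # Gentle warming on the wrist to soothe the patient (assumed mapping).
        return {"type": "temperature", "body_part": "wrist",
                "strength": min(level, 5)}
    if emotion == "irritability":
        # Light force feedback on the forearm (assumed mapping).
        return {"type": "force", "body_part": "forearm", "strength": 1}
    return {}   # no feedback applied
```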
In some embodiments, the processing device 210 may end the second query based on a second preset condition. The second preset condition may be that no missing information remains. The second preset condition may alternatively be that the estimated wait time remaining at the current time is less than a threshold value. The second preset condition may also be similar to the first preset condition.
In some embodiments, the second query content determined in step 910 may include only the query content of the first round of queries. The current query content of each round other than the first may be determined during the progress of the second query. For example, for a current query, the processing device 210 may input the query content of the historical queries, the patient's historical answers, the known information of the patient, etc., into a second query content determination model, which outputs the current query content.
Step 930, generating a pre-consultation record based on the second data collected by the patient terminal in the second query.
The second data may include voice data, text data, and image data that the patient inputs through the patient terminal during the second query. The pre-consultation record may be used to record the patient information collected in the second query (i.e., the pre-consultation). Optionally, some known information of the patient may also be recorded in the pre-consultation record. In some embodiments, the pre-consultation record is generated according to a preset template, which may be a template corresponding to the department where the doctor is located or a template set by the doctor.
For example, when the second data includes a voice signal, the processing device 210 may first transcribe the voice signal into text and extract second keywords from the text by a keyword extraction algorithm. Further, the processing device 210 may convert the second keywords into medical terms. The processing device 210 may then obtain a plurality of template fields in the pre-consultation record template, retrieve the content corresponding to each template field from the medical terms, and fill it into the corresponding locations of the pre-consultation record template. The conversion of the second keywords may be performed based on a term conversion model or a knowledge dictionary.
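The transcribe-extract-convert-fill pipeline above can be sketched as follows. The keyword "extraction" and the lay-phrase-to-term dictionary here are toy stand-ins for the keyword extraction algorithm and the knowledge dictionary/term conversion model:

```python
# Toy knowledge dictionary mapping lay phrases to medical terms (illustrative).
TERM_DICT = {"tummy ache": "abdominal pain", "can't sleep": "insomnia"}

def fill_pre_consultation_record(transcript: str, template_fields: list) -> dict:
    """Fill a pre-consultation record template from a transcribed answer."""
    # Toy keyword "extraction": look for known lay phrases in the transcript.
    keywords = [k for k in TERM_DICT if k in transcript.lower()]
    # Convert keywords to medical terms via the dictionary.
    terms = [TERM_DICT[k] for k in keywords]
    record = {field: "" for field in template_fields}
    if "chief_complaint" in record and terms:
        record["chief_complaint"] = "; ".join(terms)
    return record
```

In practice each template field would have its own retrieval logic rather than a single chief-complaint slot.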
In some embodiments, the second query may be performed by a terminal device other than the patient terminal (e.g., a terminal in the waiting area). In some embodiments, the process 900 may be performed by an agent corresponding to the medical outpatient service or the medical outpatient process. For example, the pre-consultation service may be provided by a pre-consultation agent.
FIG. 11 is a flow diagram illustrating an exemplary flow 1100 for providing medical outpatient services based on perception information, according to some embodiments of the present description. In some embodiments, the flow 1100 may include one or more of sub-flows 1110, 1120, 1130, and 1140.
The sub-process 1110 may be used to provide consultation advice based on the perception information. The sub-process 1110 may be performed in the inquiry link 830. As shown in FIG. 11, sub-process 1110 may include steps 1112 and 1114.
Step 1112, generating consultation advice based on the perception information and the patient data of the patient. The consultation advice refers to advice that assists the doctor in providing medical outpatient services. For example, the consultation advice may include supplemental query advice, physical examination advice, prescription advice, treatment advice, and the like.
In some embodiments, the consultation advice may be determined based on a knowledge database corresponding to the registered department, consultation specifications, and the like. For example, the processing device 210 may determine the dialogue content of the doctor and the patient based on the voice signals collected by the sound sensor, and search in the knowledge database, the consultation specifications, etc., based on the dialogue content and/or the patient data to determine the consultation advice. For example only, a search may be conducted in the consultation specifications based on the dialogue content and/or patient data to determine which information in the consultation specifications has not yet been collected, and supplemental query advice may be provided based on such information.
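The "what has not yet been collected" search can be sketched as a coverage check of the dialogue against specification items. The items and cue words below are illustrative stand-ins for a real consultation specification:

```python
# Hypothetical consultation-specification items and cue words indicating coverage.
SPEC_ITEMS = {
    "symptom duration": ["days", "weeks", "since"],
    "allergy history": ["allergy", "allergic"],
    "current medication": ["taking", "medication", "pills"],
}

def supplemental_suggestions(dialogue: str) -> list:
    """Suggest asking about specification items not covered by the dialogue."""
    text = dialogue.lower()
    return [f"Ask about the patient's {item}"
            for item, cues in SPEC_ITEMS.items()
            if not any(cue in text for cue in cues)]
```

A real implementation would use semantic matching against the specification rather than substring cues.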
In some embodiments, the consultation advice may be generated based on a diagnostic model. For example, the diagnostic module 430 may determine model inputs based on the perception information and the patient data and input them into the diagnostic model, which may output the corresponding consultation advice. For example, the model input may include the patient data, communication content determined based on voice signals, patient status information determined based on image data, and the like, or any combination thereof.
In some embodiments, the consultation advice may be generated by an agent corresponding to the medical outpatient service. The agent may learn a mechanism for generating consultation advice from various data (e.g., historical diagnostic records, knowledge databases, and consultation specifications) and process the perception information and patient data according to the mechanism to provide the consultation advice.
Step 1114, controlling at least a portion of the at least one terminal device to present the consultation advice.
For example, when the patient receives the on-site medical outpatient service at a consulting room, the processing device 210 may control the public terminal device or the doctor terminal to present the consultation advice. As another example, when the patient receives the telemedicine outpatient service, the processing device 210 may control the doctor's doctor terminal and the patient's patient terminal to present the consultation advice, respectively. The consultation advice can improve the accuracy of diagnosis and prescription and the efficiency of the medical service.
Sub-process 1120 may be used to generate a target diagnostic record based on the perception information. The sub-process 1120 may be performed at the end of the inquiry link 830. As shown in FIG. 11, sub-process 1120 may include steps 1122, 1124, and 1126.
Step 1122, generating an initial diagnostic record based on the perception information.
The initial diagnostic record may be an automatically generated diagnostic record. In some embodiments, the initial diagnostic record may include an initial patient medical record, an initial diagnostic opinion, an initial diagnostic prescription (e.g., an initial treatment prescription and an initial examination prescription), an initial medical order, and the like. In some embodiments, key content may be extracted from the perception information based on a diagnostic record template. The key content refers to content related to the template fields in the diagnostic record template. The key content may be converted into professional content according to a knowledge dictionary or a term conversion model. Further, the initial diagnostic record is generated by updating the diagnostic record template based on the professional content and the knowledge database. The knowledge database refers to a knowledge database of the registered department, including, for example, the diagnosis specifications of the department (e.g., a disease description specification, a diagnosis specification, a prescription specification, a medical order specification, etc.).
In some embodiments, screening data of the patient acquired by one or more screening devices during the outpatient procedure may be obtained, and the initial diagnostic record may be further generated based on the screening data. In some embodiments, the initial diagnostic record may be generated by an agent corresponding to the medical outpatient service. The agent may learn the mechanism for generating diagnostic records from various data (e.g., diagnostic record templates, a knowledge dictionary, a knowledge database, etc.) and process the perception information and patient data according to the learned mechanism to generate the diagnostic record.
Step 1124, presenting the initial diagnostic record to the doctor.
For example, the processing device 210 may control the public terminal to present the initial diagnostic record, e.g., when the patient has started the visit. As another example, the processing device 210 may control the doctor terminal to present the initial diagnostic record to the doctor. In some embodiments, the doctor terminal may present the initial diagnostic record to the doctor at a preset time (e.g., after the doctor has completed the day's inquiry work).
Step 1126, generating a target diagnostic record based on the initial diagnostic record and feedback information entered by the doctor for the initial diagnostic record.
The feedback information entered by the doctor may include the doctor's modifications and/or confirmations of the initial diagnostic record. The target diagnostic record is a diagnostic record that has been modified and/or confirmed by the doctor. In some embodiments, the target diagnostic record may include a target patient medical record, a target diagnostic opinion, a target diagnostic prescription (e.g., a target treatment prescription and a target examination prescription), a target medical order, and the like.
By generating the target diagnostic record in this way, manual writing errors can be reduced and the efficiency of generating the record is improved. Moreover, the paperwork of doctors can be reduced, so that doctors have more energy to care for patients, improving the service quality of the medical clinic.
Sub-process 1130 may be used to provide a remote companion service based on the perception information. The patient may request the remote companion service during the pre-consultation. Sub-process 1130 may be performed in the inquiry link 830. As shown in FIG. 11, sub-process 1130 may include steps 1132 and 1134.
Step 1132, determining, based on the perception information, whether the patient needs to communicate with a remote co-diagnostic person.
In some embodiments, the processing device 210 may detect, based on the perception information (e.g., voice data and/or image data), whether the patient has issued a request to communicate with the remote co-diagnostic person. In some embodiments, the processing device 210 may determine status information of the patient based on the perception information and determine, based on the status information, whether the patient needs to communicate with the remote co-diagnostic person. For example, when the status information indicates that the patient is in a state of high stress, fear, etc., the processing device 210 may determine that the patient needs to communicate with the remote co-diagnostic person.
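The two detection paths above (an explicit request in the patient's speech, or a distressed status inferred from perception information) can be sketched as follows; the request phrases and status labels are illustrative assumptions:

```python
# Hypothetical cues for an explicit request and for distressed status labels.
REQUEST_PHRASES = ["family", "accompany", "companion"]
DISTRESS_STATES = {"high stress", "fear"}

def needs_remote_companion(speech: str, status: str) -> bool:
    """True when the patient needs to communicate with the remote co-diagnostic person."""
    text = speech.lower()
    if any(p in text for p in REQUEST_PHRASES):
        return True                      # explicit request detected in speech
    return status in DISTRESS_STATES     # distressed status inferred from perception info
```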
Upon determining that the patient needs to communicate with the remote co-diagnostic person, the processing device 210 may execute step 1134.
Step 1134, controlling at least a portion of the at least one terminal device to enlarge the second interface element.
When the patient receives the on-site medical outpatient service at the consulting room, the processing device 210 may control the public terminal device to enlarge the second interface element. When the patient receives the telemedicine outpatient service, the processing device 210 may control the patient terminal to enlarge the second interface element. Through the enlarged second interface element, the patient can view the real-time picture of the remote co-diagnostic person and communicate with them better.
In some embodiments, when the patient receives the on-site medical outpatient service at the consulting room, the processing device 210 may, upon detecting that the patient needs to communicate with the remote co-diagnostic person, remind the patient to wear an XR device and control the XR device to present image data of the remote co-diagnostic person.
In some embodiments of the present description, the communication needs of the patient can be detected based on the perception information, and the communication needs can be timely satisfied, thereby providing more humanized care for the patient and providing a more realistic and immersive accompanying experience.
The sub-process 1140 may be used to present medical data to a target user based on the perception information. As shown in FIG. 11, sub-flow 1140 may include steps 1142 and 1144.
Step 1142, obtaining, based on the perception information, control instructions issued by at least one target user for retrieving at least a portion of the medical data.
The target user may include at least a patient and a doctor. In some embodiments, the target user may further include a remote companion of the patient. The patient's medical data may include various data reflecting the patient's health condition (e.g., electronic medical records, medical images, medical examination results, etc.).
The control instruction is an instruction for retrieving at least a portion of the medical data (e.g., an electronic medical record) for display. For example, the control instruction may be used to invoke, for display, a three-dimensional model of an organ of interest of the patient in the electronic medical record. In some embodiments, the control instruction may also be used to set display parameters (e.g., display angle, display size, display position). In some embodiments, the control instruction may also be used to annotate critical data on the medical data (e.g., the three-dimensional model of the organ of interest).
In some embodiments, the perception information may include a speech signal captured by a sound sensor, and the control instruction may be obtained by performing semantic analysis on the speech signal. In some embodiments, the target user may issue the control instruction by speaking a preset wake-up word. In some embodiments, the perception information may include optical image data of a target user (e.g., the patient and/or the doctor) acquired by an image sensor, and the control instruction may be obtained by performing gesture recognition on the target user in the optical image data. In some embodiments, the target user may issue the control instruction using a control device (e.g., a remote control, an intelligent control glove, etc.).
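For illustration, a minimal keyword-based stand-in for the wake-word check and semantic analysis described above might look as follows; the wake-up word, target names, and parsing rules are assumptions made for this sketch, not values from the present description:

```python
import re

WAKE_WORD = "assistant"  # hypothetical preset wake-up word

def parse_control_instruction(utterance: str):
    """Turn a speech transcript into a control instruction, or return
    None when the wake-up word is absent (keyword matching stands in
    for full semantic analysis)."""
    text = utterance.lower()
    if WAKE_WORD not in text:
        return None
    instruction = {"action": "retrieve", "target": None, "params": {}}
    if "medical record" in text:
        instruction["target"] = "electronic_medical_record"
    elif "3d model" in text:
        instruction["target"] = "organ_model_3d"
    # display parameters, e.g. "rotate 90" -> display angle of 90 degrees
    m = re.search(r"rotate (\d+)", text)
    if m:
        instruction["params"]["display_angle"] = int(m.group(1))
    return instruction
```

A production system would substitute a trained speech-understanding model for the keyword rules, but the output structure (target plus display parameters) would be similar.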
In some embodiments of the present description, the target user may flexibly adjust display content and/or display parameters, e.g., through voice, gestures, etc., to optimize the user experience and improve the efficiency of the medical service.
Step 1144, retrieving and presenting at least a portion of the medical data via the at least one terminal device in response to the control instruction.
For example, the processing device 210 may retrieve at least a portion of the medical data from the storage device and control the at least one terminal device to present the retrieved medical data. When display parameters are included in the control instruction, the processing device 210 may control the at least one terminal device to present at least a portion of the medical data based on the display parameters.
In some embodiments of the present disclosure, a plurality of target users may browse medical data together through at least one terminal device, and may synchronously change presentation contents and presentation manners of medical data on different terminal devices, which is helpful to improve communication efficiency between target users and improve interactivity in a treatment process.
Fig. 13 is a schematic diagram of an exemplary hospitalization procedure 1300 shown according to some embodiments of the present description. As shown in fig. 13, the hospitalization procedure 1300 includes a hospitalization admission session 1310, an admission inquiry session 1320, a hospitalization session 1330, a discharge session 1340, a follow-up session 1350, etc., or any combination thereof. When the patient is in different sessions of the hospitalization procedure 1300, different user services may be provided to the relevant users of the hospitalization procedure 1300. The relevant users of the hospitalization procedure may include the patient, healthcare providers (e.g., doctors, nurses, etc.) who provide healthcare to the patient during the hospitalization procedure, visitors of the patient, etc. In some embodiments, the processing device 210 (e.g., the service module 430, an agent corresponding to a hospitalization service/procedure configured on the processing device 210) may perform steps involved in multiple sessions of the hospitalization procedure 1300.
In the hospitalization admission session 1310, the patient may complete the relevant procedures for hospitalization. In some embodiments, the relevant user may be provided with hospitalization admission services related to the hospitalization admission session 1310. For example, the hospitalization admission services may include guiding the patient to the hospital ward corresponding to the patient, explaining hospitalization rules to the patient, conducting admission checks on the patient, generating an admission record for the patient, and the like, or any combination thereof. For further description of hospitalization admission services, see other relevant descriptions of this specification, for example, fig. 14 and its associated description.
In the admission inquiry session 1320, the patient may receive an admission inquiry used to gather basic information about the patient. In some embodiments, the relevant user may be provided with an admission inquiry service associated with the admission inquiry session 1320. For example, the admission inquiry service may include performing one or more inquiries on the patient, generating an admission record for the patient, or the like, or any combination thereof. The admission inquiry may be made in a manner similar to the second inquiry described in fig. 9 and fig. 10, and will not be described in detail herein. In some embodiments, the processing device 210 may obtain feedback information regarding the admission record entered by the doctor through the doctor terminal. The feedback information may include information that is absent from the admission record but is considered essential by the doctor. Further, the processing device 210 may determine supplementary inquiry contents according to the feedback information and cause the common terminal device in the hospital ward to perform a supplementary inquiry according to the supplementary inquiry contents. The supplementary inquiry may be performed in a manner similar to the admission inquiry. The processing device 210 may then obtain supplementary perception information collected by one or more perception devices during the supplementary inquiry and update the admission record based on the supplementary perception information. Alternatively, the doctor may go directly to the hospital ward to make the supplementary inquiry; in this case, the processing device 210 may likewise obtain supplementary perception information collected by one or more perception devices during the supplementary inquiry and update the admission record accordingly.
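The determination of supplementary inquiry contents from the doctor's feedback may be sketched as a check for essential fields that are missing from the admission record; the field names and question bank below are purely illustrative assumptions:

```python
# Hypothetical question bank for supplementary inquiries; field names and
# prompts are illustrative, not taken from the present description.
QUESTION_BANK = {
    "allergy_history": "Do you have any known drug or food allergies?",
    "family_history": "Do any chronic diseases run in your family?",
}

def determine_supplementary_queries(admission_record: dict,
                                    essential_fields: list) -> list:
    """Return inquiry prompts for fields the doctor flagged as essential
    but that are absent or empty in the admission record."""
    missing = [f for f in essential_fields if not admission_record.get(f)]
    return [QUESTION_BANK.get(f, f"Please describe your {f}.") for f in missing]
```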
In the hospitalization session 1330, the patient may stay in the hospital (e.g., a hospital ward) for a period of time to receive around-the-clock medical care. In some embodiments, hospital ward services associated with the hospitalization session 1330 may be provided to the relevant user. For example, the hospital ward services may include a care service 1332, a ward round service 1334, a visit service 1336, and the like, or any combination thereof.
The care service 1332 is used to provide direct care to the patient, including administering medication, physical examination, monitoring vital signs, and assisting in activities of daily living. For further description of the care service, see other relevant descriptions of this specification, for example, fig. 15 and its associated description.
The ward round service 1334 relates to ward rounds conducted by a medical team (e.g., at least one doctor) in the hospital ward, during which the medical team can review and discuss the status and care plan of the patient. For example, the ward round service 1334 may include presenting data for facilitating communication between a doctor and the patient, generating a ward round record, presenting a virtual ward round space to one or more remote doctors, and the like, or any combination thereof.
For example only, the processing device 210 may obtain perception information collected by one or more perception devices in the hospital ward while the at least one doctor makes a ward round, and/or patient examination data collected by one or more examination devices during the ward round, and generate a ward round record based on the perception information and/or the patient examination data. The ward round record is used for recording data related to the ward round, including the ward round time, the participants, patient data, communication content between the patient and the at least one doctor, medical advice given during the ward round, and the like. As another example, the processing device 210 may generate a virtual ward round space based on the perception information and present the virtual ward round space to one or more remote doctors via one or more XR devices of the one or more remote doctors. The virtual ward round space refers to a digitized environment for ward rounds. Optionally, a remote doctor may communicate with the at least one doctor and the patient in the hospital ward via an XR device.
The visit service 1336 allows remote visitors to communicate with the patient remotely. For example, the visit service 1336 may include generating a virtual visit space for the patient and a remote visitor, presenting the virtual visit space to the patient and the remote visitor, and the like, or any combination thereof.
For example only, in response to a visit request, the processing device 210 may obtain first current information of the patient and second current information of the remote visitor, and generate a virtual visit space for the patient and the remote visitor based on the first current information and the second current information. The first current information may indicate a current state and/or a current context of the patient. The second current information may indicate a current state and/or a current context of the remote visitor. The virtual visit space refers to a digital environment that is presented to the patient and the remote visitor during a visit. Furthermore, the processing device 210 may present the virtual visit space to the patient and the remote visitor, respectively, through the common terminal device in the hospital ward and a remote terminal device of the remote visitor.
In the discharge session 1340, the patient may complete discharge procedures. In some embodiments, discharge services associated with the discharge session 1340 may be provided to the relevant user to guide the patient's discharge from the hospital. For example only, the processing device 210 may obtain a target hospitalization record of the patient in response to a discharge instruction received from a doctor terminal of the patient's doctor. The target hospitalization record may record information about the patient's hospitalization procedure, such as medical history, treatment received, prescribed medications, examination results, and discharge summaries. Furthermore, the processing device 210 may generate discharge data from the target hospitalization record and present the discharge data to the patient through the common terminal device within the hospital ward. The discharge data may include a discharge summary, doctor's orders for discharge, instructional information regarding the discharge procedures, discharge fees, payment means, and the like, or any combination thereof. In response to determining that the patient has performed a discharge operation, the processing device 210 may generate a discharge record corresponding to the patient. The discharge record is used for recording discharge-related data, including the discharge time, the discharge summary, the doctor's orders for discharge, the discharge fees, the payment means, the status of the patient at discharge, and the like.
In the follow-up session 1350, the patient may be provided with continued care after discharge to ensure continued rehabilitation, address any remaining health issues, and prevent readmission. In some embodiments, follow-up services related to the follow-up session 1350 may be provided to the relevant user. For example only, the processing device 210 may determine a follow-up plan for the patient based on the target hospitalization record of the patient. The follow-up plan is used to indicate how the follow-up services are to be provided to the patient. In some embodiments, the follow-up plan may include one or more follow-up visits performed at one or more planned times. In addition, the processing device 210 may cause the doctor terminal of the patient's doctor and the patient terminal of the patient to remind the doctor and the patient, respectively, according to the follow-up plan. A follow-up visit may be performed off-line or remotely in a virtual follow-up space.
In some embodiments, after the follow-up is performed, the processing device 210 may generate a follow-up record corresponding to the patient. The follow-up record is used for recording data related to follow-up, including time corresponding to the follow-up, updated medical advice corresponding to the follow-up, health monitoring information corresponding to the follow-up, and the like. In some embodiments, the processing device 210 may update the follow-up plan. For example, the processing device 210 may obtain health monitoring information for the patient and update the follow-up plan based on the health monitoring information. The health monitoring information may be collected by one or more home monitoring devices in the patient's home.
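The follow-up plan update based on health monitoring information may be sketched as follows; the dictionary keys and numeric thresholds are assumed values for illustration only, not clinical guidance from the present description:

```python
def update_follow_up_plan(planned_times: list, monitoring: dict) -> list:
    """Bring the next follow-up visit forward when home monitoring
    readings fall outside an assumed normal range."""
    abnormal = (monitoring.get("systolic_bp", 120) > 140
                or monitoring.get("heart_rate", 70) > 100)
    if abnormal and planned_times:
        # replace the next scheduled visit with an urgent one
        return ["as_soon_as_possible"] + planned_times[1:]
    return planned_times
```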
In some embodiments, the processing device 210 may monitor data sources that collect data related to the patient's hospitalization procedure. In response to detecting a data update in at least one of the data sources, the processing device 210 may perform EOI detection based on the update data collected by the at least one data source. If an EOI occurs, the processing device 210 may perform one or more preset operations corresponding to the EOI to provide at least a portion of the hospitalization services.
The update data refers to data collected by the at least one data source that has not yet been processed by the processing device 210. For example, if the at least one data source includes a perception device, the update data may include perception information collected by the perception device, such as image data collected by an image sensor, sound data collected by a sound sensor, etc. As another example, if the at least one data source includes a terminal device of a doctor associated with the patient, the update data may include input data regarding the patient entered by the doctor through the terminal device. As another example, if the at least one data source includes a vital sign monitor, the update data may include vital signs of the patient. As another example, if the at least one data source includes a medical examination department, the update data may include examination results of the patient.
EOI refers to a particular event or behavior that requires attention. EOI detection refers to processing the update data collected by the at least one data source to detect whether one or more EOIs have occurred. For example, exemplary EOIs may include the patient entering a hospital ward, an admission check being performed on the patient, at least one doctor making a round of the hospital ward, a care operation or medical check being performed on the patient, the patient initiating a service request, obtaining or updating a doctor's order for the patient, an abnormality in the patient's physiological state, a doctor issuing instructions regarding the patient, and the like, or any combination thereof.
In some embodiments, the at least one data source may include multiple data sources that collect data related to the same EOI. In some embodiments, the at least one data source may include a data source that collects data related to a plurality of EOIs.
In some embodiments, the processing device 210 may perform EOI detection based on EOI detection rules. An EOI detection rule refers to a rule used for performing the EOI detection. In some embodiments, the EOI detection rules may be determined based on a history of EOI detection or manually set by a user (e.g., a doctor, a nurse, a technician, etc.).
In some embodiments, each session of the hospitalization procedure may correspond to one or more EOIs, and different sessions of the hospitalization procedure may correspond to different types of EOIs. Thus, the processing device 210 may perform EOI detection on the update data based on the patient's current session in the hospitalization procedure.
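The session-dependent EOI detection described above may be sketched as a lookup that restricts candidate EOIs to those relevant to the current session; the session names and EOI labels below are illustrative assumptions:

```python
# Hypothetical mapping from hospitalization sessions to the EOI types
# checked in each session; labels are illustrative only.
SESSION_EOIS = {
    "admission": {"patient_entered_ward", "admission_check_due"},
    "hospitalization": {"order_updated", "vital_sign_abnormal", "ward_round_started"},
    "discharge": {"discharge_instruction_received"},
}

def relevant_eois(current_session: str, detected: set) -> set:
    """Restrict detected candidate EOIs to those relevant to the
    patient's current session in the hospitalization procedure."""
    return detected & SESSION_EOIS.get(current_session, set())
```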
The preset operations may include general operations and/or specific operations. General operations refer to operations that need to be performed whenever an EOI occurs, regardless of the type of the EOI. For example, the general operations may include generating a record associated with the EOI, sending the record associated with the EOI to a relevant user for confirmation or to a storage device for storage, and the like.
A specific operation refers to an operation performed when a specific type of EOI occurs. For example, updating the daily plan of the patient according to the update data may be determined as a specific operation corresponding to the EOI of obtaining or updating a doctor's order for the patient. As another example, providing a notification related to the EOI to the healthcare provider may be determined as a specific operation corresponding to the EOI of an abnormality in the patient's physiological state.
For example, when the EOI includes the patient entering a hospital ward, the one or more preset operations may include determining, based on patient data of the patient, content of an inquiry to be made to the patient after the patient enters the hospital ward, causing a terminal device within the hospital ward to conduct the inquiry based on the content of the inquiry, acquiring perception information collected by one or more perception devices within the hospital ward during the inquiry, and generating a hospitalization record for the patient based on the perception information.
As another example, when the EOI includes obtaining a hospitalization guidance request, the one or more preset operations may include obtaining a first location of the patient terminal and a second location of the patient's hospital ward, determining a planned path from the first location to the second location according to a real-time map of the hospital, and instructing the patient terminal to present guidance information related to the planned path to the patient.
As another example, when the EOI includes meeting a condition for performing an admission check, the one or more preset operations may include controlling the intelligent care cart to direct a nurse to a hospital ward for performing the admission check on the patient.
As another example, when the EOI includes obtaining or updating a doctor's order for the patient, the one or more preset operations may include determining a daily plan for the patient from the patient data of the patient and the order or updated order, presenting the daily plan to the patient via a terminal device within the hospital ward, and presenting the daily plan to a nurse corresponding to the patient via a terminal device of the nurse. The daily plan may include at least one medical operation to be performed on the patient each day.
As another example, when the EOI includes at least one doctor making a ward round for the patient, the one or more preset operations may include obtaining perception information acquired by one or more perception devices in the hospital ward while the at least one doctor makes the ward round, and generating a ward round record based on the perception information.
As another example, when the EOI includes obtaining a visit request, the one or more preset operations may include obtaining first current information of the patient and second current information of the remote visitor, generating a virtual visit space for the patient and the remote visitor based on the first current information and the second current information, and presenting the virtual visit space to the patient and the remote visitor, respectively.
As another example, when the EOI includes receiving a discharge instruction, the one or more preset operations may include obtaining a target hospitalization record of the patient, generating discharge data based on the target hospitalization record, and presenting the discharge data to the patient through a terminal device within the hospital ward.
As another example, when the EOI includes the patient having been discharged from the hospital, the one or more preset operations may include determining a follow-up plan for the patient from a target hospitalization record of the patient. The follow-up plan may include one or more follow-up visits performed at one or more scheduled times. For each of the one or more follow-up visits, the one or more preset operations may further include causing the terminal device of the attending doctor and the terminal device of the patient to remind the attending doctor and the patient, respectively, according to the scheduled time of the follow-up visit.
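The dispatch of general and specific preset operations described in the examples above may be sketched as a lookup keyed by EOI type; all operation names below are illustrative placeholders:

```python
# General operations run for every EOI; operation names are assumed.
GENERAL_OPERATIONS = ["generate_eoi_record", "send_record_for_confirmation"]

# Specific operations depend on the EOI type (illustrative mapping).
SPECIFIC_OPERATIONS = {
    "order_updated": ["update_daily_plan", "notify_nurse"],
    "vital_sign_abnormal": ["notify_healthcare_provider"],
    "discharge_instruction_received": ["generate_discharge_data"],
}

def preset_operations(eoi_type: str) -> list:
    """Combine the always-run general operations with the specific
    operations registered for this EOI type, if any."""
    return GENERAL_OPERATIONS + SPECIFIC_OPERATIONS.get(eoi_type, [])
```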
In some embodiments, the one or more preset operations may be performed based on the data source corresponding to the EOI. For example, when the at least one data source includes a plurality of data sources that collect data related to the same EOI, the one or more preset operations may be performed based on a combination of the data related to the same EOI collected by the plurality of data sources. As another example, when the at least one data source includes a data source that collects data related to a plurality of EOIs, the one or more preset operations corresponding to at least two of the plurality of EOIs may be different.
In some embodiments, processing device 210 may be configured with an agent that may perform at least a portion of process 1300. For example, the agent may learn EOI detection rules from the history and perform EOI detection according to the EOI detection rules. For another example, the agent may learn a correspondence between the EOI and the preset operations from the history record, and determine one or more preset operations corresponding to the EOI according to the correspondence.
In some embodiments, the agent may further learn EOI detection rules and/or correspondence between EOIs and preset operations based on patient data of different patients.
Fig. 14 is a flow chart of an exemplary process 1400 for providing hospitalization services, shown in accordance with some embodiments of the present description. The process 1400 may be performed when the processing device 210 detects that the patient has been admitted (e.g., the patient has completed admission-related procedures).
At step 1410, the processing device 210 may direct the patient to a hospital ward.
For example, the processing device 210 may instruct a patient terminal of the patient to guide the patient to a hospital ward. For example only, in response to the hospitalization guidance request, the processing device 210 may obtain a first location of the patient's patient terminal and a second location of the hospital ward, and determine a planned path from the first location to the second location according to a real-time map of the hospital. The processing device 210 may then instruct the patient terminal to present guidance information related to the planned path to the patient.
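By way of illustration, the path planning on the real-time hospital map may be approximated by a breadth-first search over a corridor graph; the node names are hypothetical, and an actual implementation might weight edges by distance or congestion instead:

```python
from collections import deque

def plan_path(hospital_map: dict, start: str, goal: str):
    """Breadth-first search over a corridor graph, as a stand-in for
    planning a path from the patient's location to the hospital ward.
    Returns the list of nodes on a shortest path, or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in hospital_map.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None
```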
In step 1420, the processing device 210 may provide admission education to the patient.
The admission education may be used to introduce, to the patient, admission information (e.g., admission procedures, admission operations, pre-admission fees, payment methods, etc.), hospitalization rules, the hospital environment, the patient's doctor and/or nurse, and the like. In some embodiments, the processing device 210 may cause the patient terminal to present a third virtual character that provides the admission education.
At step 1430, the processing device 210 may assist the nurse in admission preparation.
Admission preparation refers to the nurse preparing hospital supplies for the patient. In some embodiments, the processing device 210 may present a patient admission notification through a nurse terminal 1405 within the nurse workstation or the smart care cart 240-4 to assist the nurse in the admission preparation. The admission notification may include patient data of the patient, a hospital supply list for the patient, ward information of the patient, information on the admission check to be performed on the patient, and the like.
Admission checks, which may also be referred to as hospitalization checks, are performed after the patient is taken into the hospital ward. Admission checks may be used to gather information about the current medical condition of the patient (e.g., vital signs, basic health data, etc.). Admission checks may include checking blood pressure, blood glucose, heart rate, body temperature, etc., or any combination thereof.
At step 1440, the processing device 210 may issue a reminder to perform the admission check. The reminder may include a message reminder, a sound reminder, a pop-up reminder, etc. For example, the processing device 210 may instruct the nurse terminal 1405 or the intelligent care cart 240-4 to present the reminder.
In some embodiments, the processing device 210 may determine whether the patient satisfies a condition for the admission check in the hospital ward. The condition may include that the patient has been in the hospital ward for a certain period of time. If the patient satisfies the condition, the processing device 210 may issue the reminder to proceed with the admission check.
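The admission-check condition may be sketched as a simple elapsed-time test; the 30-minute settle-in period used below is an assumed default, not a value from the present description:

```python
from datetime import datetime, timedelta

def admission_check_due(arrival_time: datetime, now: datetime,
                        settle_in: timedelta = timedelta(minutes=30)) -> bool:
    """True once the patient has been in the hospital ward for the
    settle-in period, i.e. the admission-check reminder may be issued."""
    return now - arrival_time >= settle_in
```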
At step 1450, the processing device 210 may direct the nurse to the hospital ward. In some embodiments, the processing device 210 may control the movement of the smart care cart 240-4 to guide a nurse into a hospital ward.
At step 1460, an admission check may be performed on the patient.
For example, after the nurse arrives in the hospital ward, one or more examination devices may be used to perform the admission check on the patient to collect patient data. In some embodiments, the processing device 210 may instruct the smart care cart to present information related to the admission check to the nurse during the admission check. For example, the intelligent care cart may display an admission check diagram, the electronic medical record of the patient, and so forth.
At step 1470, the processing device 210 may generate an admission record.
An admission record refers to a record indicating the condition of the patient upon admission to the hospital ward. The admission record may include admission information (e.g., admission number, clinical information, time of admission, amount of pre-admission fees, payment method, etc.), check data collected during the admission check, and the like.
In some embodiments, the processing device 210 may generate the admission record based on an admission record template and the admission check data. In some embodiments, the processing device 210 may further generate the admission record based on the electronic medical record of the patient. In some embodiments, the processing device 210 may present the admission record to the nurse via the smart care cart 240-4 or the nurse terminal 1405 and generate a target admission record based on the admission record and feedback information on the admission record entered by the nurse via the smart care cart 240-4 or the nurse terminal 1405. The feedback information may include confirmation instructions, modification instructions, etc. entered by the nurse.
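The generation of a target admission record from a template, the admission check data, and the nurse's feedback may be sketched as a sequence of dictionary merges; the field names below are illustrative assumptions:

```python
def build_admission_record(template: dict, check_data: dict, feedback: dict) -> dict:
    """Fill an admission-record template with admission check data, then
    apply the nurse's modification instructions; later merges win."""
    record = dict(template)                            # template defaults
    record.update(check_data)                          # measured check data
    record.update(feedback.get("modifications", {}))   # nurse's edits
    record["confirmed_by_nurse"] = bool(feedback.get("confirmed"))
    return record
```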
In some embodiments, the processing device 210 may be configured with an agent (e.g., an admission agent) that may participate in performing one or more steps of the process 1400. For example, the agent may guide the patient into a hospital ward, provide admission education for the patient, assist a nurse in conducting admission checks, generate admission records, and the like.
According to some embodiments of the present description, hospitalization services may be provided to patients in a semi-automated manner by means of a healthcare system (e.g., intelligent care cart 240-4) and/or agents, which may reduce labor costs and increase the efficiency of hospitalization services.
Fig. 15 is a schematic diagram of an exemplary process 1500 for providing care services shown in accordance with some embodiments of the present description. In some embodiments, the process 1500 may be performed daily during patient hospitalization to provide care services to the patient.
In step 1502, the processing device 210 may determine a daily plan for the patient based on patient data for the patient and a physician's order for the patient.
A doctor's order for a patient refers to instructions or directions that the doctor gives regarding the patient. In some embodiments, the patient's orders may be stored in a storage device and updated whenever any doctor places a new order for the patient. The processing device 210 may retrieve the latest version of the orders from the storage device. In some embodiments, the processing device 210 may monitor various hardware devices to detect whether a doctor's order for the patient is updated. For example, when the admission inquiry service and/or the ward round service is provided to the patient, the patient's doctor may issue a new order for the patient. The processing device 210 may detect the new order during the admission inquiry service and/or the ward round service based on the perception information collected by the perception devices. Once a new order is detected, the new order may be stored in the storage device. As another example, a doctor may update the patient's orders stored in the storage device through a doctor terminal. In some embodiments, the processing device 210 may determine the doctor's orders from the electronic medical record of the patient.
In some embodiments, the processing device 210 may determine the daily plan of the patient based on the patient data of the patient and the doctor's order for the patient. The daily plan may include at least one medical operation to be performed on the patient on that day. Exemplary medical operations may include care operations, examination operations, and the like.
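The construction of a daily plan from the doctor's orders may be sketched as follows; the order schema (type, drug, times, name) is an illustrative assumption, and filtering by the patient data is omitted for brevity:

```python
def build_daily_plan(orders: list) -> list:
    """Expand doctor's orders into the day's medical operations,
    sorted by scheduled time."""
    plan = []
    for order in orders:
        if order.get("type") == "medication":
            for t in order.get("times", []):
                plan.append({"time": t, "operation": f"administer {order['drug']}"})
        elif order.get("type") == "examination":
            plan.append({"time": order["time"], "operation": f"perform {order['name']}"})
    return sorted(plan, key=lambda item: item["time"])
```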
At step 1504, the processing device 210 may present the daily plan to the patient through a common terminal device in the hospital ward (e.g., bedside terminal 240-6 of hospital bed 240-2).
At step 1506, the processing device 210 may present the daily plan to the nurse corresponding to the patient through a nurse terminal (e.g., a terminal device in a nurse workstation, etc.).
In step 1508, when the daily plan includes at least one care operation, the nurse may perform the at least one care operation on the patient, and the processing device 210 may assist the nurse in performing the at least one care operation according to the daily plan.
As shown in fig. 15, for each of the at least one care operation, the processing device 210 may control the intelligent care cart to guide the nurse to the hospital ward according to the scheduled time of the care operation. For example, prior to the scheduled time of a care operation, the intelligent care cart may be controlled to move to the nurse workstation to inform the nurse that the care operation needs to be performed on the patient. The intelligent care cart may then be controlled to move and guide the nurse to the patient's hospital ward. The processing device 210 may further control the intelligent care cart to present care instructions regarding the care operation after the nurse arrives at the hospital ward.
At step 1510, the processing device 210 may generate a care record.
A care record refers to a record of the care operations applied to the patient and/or the patient's status (e.g., vital signs and other physiological measurements) before, during, or after a care operation. In some embodiments, while the at least one care operation is performed, the processing device 210 may obtain fifth perception information collected by one or more perception devices within the hospital ward and generate the care record based on the fifth perception information. In some embodiments, the care record may be displayed to the nurse for confirmation via the intelligent care cart or the nurse terminal.
In some embodiments, the processing device 210 may configure an agent (e.g., a care agent) that may participate in the execution flow 1500. For example, the agent may determine a daily plan for the patient, present the daily plan to the patient and/or the nurse, assist the nurse in performing at least one care operation, and generate a care record. In some embodiments, the processing device 210 configured with the care agent may be integrated into a hospital bed, or into a common terminal device in a hospital ward, or into a smart care cart.
According to some embodiments of the present description, automatic generation of daily plans and care records may significantly reduce the workload of nurses. This automation enables nurses to focus more on directly caring for patients rather than on administrative tasks. Furthermore, monitoring of order updates ensures that the daily plan is updated in a timely manner. This proactive approach improves care efficiency and care quality by ensuring timely adjustment of interventions and care plans based on the latest medical instructions.
Fig. 16 is an exemplary schematic diagram of a process 1600 for surgical planning and execution shown in accordance with some embodiments of the present description. The process 1600 may be performed by the processing device 210 (e.g., the service module 430, an agent corresponding to a surgical service/process configured on the processing device 210).
Step 1610, surgical planning.
Surgical planning refers to the process of developing a surgical plan (e.g., an optimal surgical plan or a plurality of viable surgical plans) for a patient. In some embodiments, the processing device 210 may generate the surgical plan based on patient data (e.g., patient personal data, historical diagnostic and therapeutic data, medical examination data, etc.), physician feedback information (e.g., second feedback information), and/or perception information (e.g., fourth perception information). In some embodiments, the processing device 210 may generate an initial surgical plan from the patient data, present the patient data to the physician through the physician terminal, and generate the surgical plan from the initial surgical plan and the second feedback information about the initial surgical plan entered by the physician through the physician terminal.
In some embodiments, the processing device 210 may determine a surgical difficulty factor from the patient data and further determine whether to hold an expert conference based on the surgical difficulty factor. In response to determining that an expert conference is required, the processing device 210 may present a virtual conference space via the doctor terminal of the doctor and a remote terminal device (e.g., an XR device) of a remote expert, respectively. During the expert conference, the processing device 210 may obtain fourth perception information collected by the doctor terminal and the remote terminal device. The processing device 210 may generate the surgical plan based on the patient data and the fourth perception information.
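The decision of whether to hold an expert conference can be illustrated with a toy difficulty score; the weighting scheme, field names, and threshold below are invented for illustration and do not reflect the disclosed model:

```python
def surgical_difficulty_factor(patient_data):
    """Toy scoring of surgical difficulty from patient data; the weights
    and fields here are illustrative assumptions, not the disclosed model."""
    score = 0.0
    score += 0.3 if patient_data.get("age", 0) >= 70 else 0.0
    score += 0.2 * len(patient_data.get("comorbidities", []))
    score += 0.4 if patient_data.get("rare_procedure") else 0.0
    return round(score, 2)

def needs_expert_conference(patient_data, threshold=0.6):
    # Hold an expert conference when the difficulty meets the threshold.
    return surgical_difficulty_factor(patient_data) >= threshold

patient = {"age": 74, "comorbidities": ["diabetes", "hypertension"],
           "rare_procedure": False}
print(surgical_difficulty_factor(patient), needs_expert_conference(patient))
```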
In some embodiments, the processing device 210 may generate a risk assessment result for the surgical plan by processing the surgical plan and at least a portion of the patient data using a risk assessment model. The processing device 210 may determine risk preventive measures from the risk assessment result and present the risk assessment result and the risk preventive measures of the surgical plan to the physician. In some embodiments, the processing device 210 may generate the surgical plan by performing steps 1710-1740 in fig. 17.
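Mapping a risk assessment result to risk preventive measures can be sketched as a lookup over assessed risk categories; the categories, probabilities, and measures here are hypothetical:

```python
def risk_preventive_measures(risk_result):
    """Map a risk assessment result to preventive measures. The risk
    categories and measures are hypothetical placeholders."""
    catalog = {
        "bleeding": "prepare cross-matched blood and hemostatic agents",
        "infection": "administer prophylactic antibiotics preoperatively",
        "anesthesia": "schedule an extended anesthesiology consultation",
    }
    # Only risks above a probability cutoff yield a preventive measure.
    return [catalog[r["type"]] for r in risk_result
            if r["probability"] >= 0.2 and r["type"] in catalog]

result = [{"type": "bleeding", "probability": 0.35},
          {"type": "infection", "probability": 0.05}]
print(risk_preventive_measures(result))
```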
Step 1620, surgical simulation.
Surgical simulation refers to the process in which a surgeon performs surgical exercises in a safe and controlled environment in order to refine the surgical plan and/or to improve the surgeon's surgical skills. For example, for complex or rare procedures, a physician may use an XR device to simulate the procedure on a virtual patient in a virtual surgical scene (e.g., an augmented reality surgical scene) according to the surgical plan, identify potential risk points in the surgical process, and formulate corresponding risk preventive measures. In addition, when there are multiple surgical plans, the doctor may simulate each surgical plan in the virtual surgical scene through an augmented reality device in order to compare the advantages and disadvantages of the different surgical plans and determine an optimal surgical plan.
In some embodiments, the processing device 210 may generate a virtual surgical scene for surgical simulation according to the surgical plan and present the virtual surgical scene to the doctor through the doctor's doctor terminal. The processing device 210 may obtain the interaction instruction related to the virtual surgical device input by the doctor through the doctor terminal or the interaction device corresponding to the virtual surgical device. The processing device 210 may update the virtual surgical site and the virtual surgical device in the virtual surgical scene according to the interactive instructions.
In some embodiments, the processing device 210 may determine a possible emergency in the virtual surgical scene from the interactive instructions and update the virtual surgical site and the virtual surgical device in the virtual surgical scene according to the possible emergency. For example only, the processing device 210 may obtain simulation data of the virtual surgical site and the virtual surgical device during the surgical simulation. The processing device 210 may determine from the simulation data whether an optimized surgical plan is needed. In response to determining that an optimized surgical plan is needed, the processing device 210 may update the surgical plan based on the simulation data. In some embodiments, step 1620 may be omitted.
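Deciding from simulation data whether an optimized surgical plan is needed can be illustrated with a simple threshold check; the metric names and limits are assumptions:

```python
def needs_optimized_plan(simulation_data, max_duration_min=180,
                         max_emergencies=1):
    """Decide from surgical-simulation metrics whether the plan should be
    updated; thresholds and metric names are illustrative assumptions."""
    too_long = simulation_data["duration_min"] > max_duration_min
    too_risky = simulation_data["emergencies"] > max_emergencies
    return too_long or too_risky

sim = {"duration_min": 210, "emergencies": 0}
print(needs_optimized_plan(sim))
```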
Step 1630, preoperative patient preparation.
Preoperative patient preparation refers to the preparation of a patient prior to an operative procedure or prior to entering an operating room for an operative procedure. As shown in fig. 16, the pre-operative patient preparation may include pre-operative education (or referred to as pre-operative care) 1632 and pre-operative instructions 1634.
Preoperative education refers to the process of explaining the patient's condition and the surgical plan, and describing the patient's expected post-operative rehabilitation, to the patient and/or the patient's family. Preoperative education may be performed through the process 1700 in fig. 17. Preoperative guidance refers to the process by which a patient completes preoperative preparation prior to surgery. Preoperative guidance may follow the flow 1800 in fig. 18.
Step 1640, surgical execution.
Surgical execution refers to surgery-related operations performed on the patient after entering the operating room. In some embodiments, the processing device 210 may obtain first sensory information acquired by one or more first sensory devices in the operating room during the patient's surgery and perform EOI detection on the first sensory information. In response to determining that an EOI has occurred, the processing device 210 may perform one or more preset operations corresponding to the EOI. For example, the EOI may include a target surgical tool instruction issued by a surgical participant, and the one or more preset operations corresponding to the EOI may include causing the intelligent robotic nurse to deliver the target surgical tool to the surgical participant. As another example, the first sensory information may include an image of surgical tools captured by an image sensor, the EOI may include a count of a surgical tool being less than a preset value, and the one or more preset operations corresponding to the EOI may include controlling the intelligent robotic nurse to replenish the surgical tool. As another example, the EOI may include detection of a surgical risk, and the one or more preset operations corresponding to the EOI may include providing a notification regarding the surgical risk. As another example, the EOI may include that the surgery has been completed, and the one or more preset operations corresponding to the EOI may include generating a surgical record based on the first sensory information.
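The EOI-to-preset-operation mapping described above can be sketched as a dispatch table; the event types and actions mirror the examples in the text, but their names and payloads are schematic:

```python
def handle_eoi(event):
    """Dispatch an event of interest (EOI) detected in the operating room
    to its preset operations. Event names and actions are schematic."""
    preset = {
        "tool_requested": lambda e: [f"deliver {e['tool']} to participant"],
        "tool_count_low": lambda e: [f"replenish {e['tool']}"],
        "surgical_risk":  lambda e: [f"notify team: {e['detail']}"],
        "procedure_done": lambda e: ["generate surgical record"],
    }
    handler = preset.get(event["type"])
    # Unknown events trigger no preset operation.
    return handler(event) if handler else []

print(handle_eoi({"type": "tool_requested", "tool": "scalpel"}))
print(handle_eoi({"type": "tool_count_low", "tool": "gauze"}))
```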
Step 1650, post-operative review.
The post-operative review may include at least one of updating the medical advice report, generating surgical results and physician operation records (to facilitate review of the surgical procedure), developing a post-operative care plan, and the like.
Fig. 17 is a schematic diagram of an exemplary flow 1700 of preoperative education shown in accordance with some embodiments of the present description.
At step 1710, an explanation material for explaining the surgical plan is generated.
The explanation material is configured to explain information related to the surgical plan, such as explanation notes of the surgical plan, an execution process of the surgical plan at the surgical site of the patient, a post-operation rehabilitation process after the patient uses the surgical plan, and the like. In some embodiments, the instructional material may comprise text, images, audio, or video. In some embodiments, there may be multiple surgical plans. The processing device 210 may generate the explanation material corresponding to each surgical plan.
In some embodiments, the explanation material is generated based on a digital twin model (e.g., a three-dimensional anatomical model) of the surgical site of the patient. For example, the processing device 210 may simulate the surgical procedure and results (e.g., post-operative incision size) of the surgical plan on the three-dimensional anatomical model of the patient's surgical site.
In some embodiments, the instructional material comprises a surgical video showing the surgical procedure at the surgical site of the patient. For example, if the surgical plan involves invasive surgery, the surgical video may show the appearance of the patient's surgical site before the incision, the process of making the incision with a scalpel, the process of removing the lesion, the suturing process, and the appearance after suturing.
In some embodiments, the instructional material may comprise a post-operative rehabilitation procedure of the patient. For example, the surgical video may further show the patient's post-operative rehabilitation procedure. The processing device 210 may predict the post-operative rehabilitation procedure of the patient from the patient data. The rehabilitation procedure reflects the patient's vital signs and the healing process of the wound after surgery. In some embodiments, the surgical video may further demonstrate the risk conditions (e.g., intra-operative risk conditions and post-operative risk conditions) to which the patient may be exposed during or after the surgical procedure. A post-operative risk condition refers to a harmful condition that the patient may face after surgery.
At step 1720, the instructional material may be presented to the patient and the physician simultaneously via at least one terminal device.
In some embodiments, the at least one terminal device includes a patient terminal of the patient, a doctor terminal of the doctor, and a home terminal device of a family member. As shown in FIG. 17, processing device 210 may present the instructional material to the patient, the physician, and the family member simultaneously via XR device 260-2 worn by the patient, XR device 270-2 worn by the physician, and XR device 1722 worn by the family member. In some embodiments, while the processing device 210 presents the instructional material simultaneously through the at least one terminal device, the material may be explained by the physician. A user may input instructions through his or her own terminal device to update the display content and display mode of the instructional material.
Step 1730, first feedback information regarding the surgical plan is determined based on second perception information collected by the one or more second perception devices during the explanation of the surgical plan.
The second perception device refers to a perception device configured to receive input information (e.g., text information, voice information, etc.) from a user (e.g., doctor or patient, etc.). For example, the second perceiving device may be a sound sensor (e.g., a microphone), an image sensor, etc., integrated into the user terminal device.
The first feedback information includes feedback of the preoperative-education participants (e.g., the doctor, the patient, the patient's family, etc.) on the surgical plan. For example, the first feedback information includes at least selection information for the surgical plan. The selection information indicates which surgical plan is selected. In some embodiments, the first feedback information further includes modification information for the selected surgical plan. The processing device 210 may analyze and process the second perception information through voice recognition, image recognition, etc., to determine the first feedback information.
Step 1740, validating or updating the surgical plan based on the first feedback information.
In some embodiments, the processing device 210 may determine a target surgical plan from the plurality of surgical plans based on the first feedback information. In some embodiments, the processing device 210 may update the surgical plan of the patient based on the first feedback information. In some embodiments, the processing device 210 may further obtain a first confirmation instruction and a second confirmation instruction after determining the target surgical plan from the plurality of surgical plans. The first confirmation instruction is an instruction regarding the surgical plan entered by the patient through the patient terminal. The second confirmation instruction is an instruction regarding the surgical plan entered by the family member through the home terminal device. In response to receiving the first confirmation instruction and the second confirmation instruction, the processing device 210 may cause the patient terminal, the doctor terminal, and the home terminal device to respectively present the surgical protocol. The processing device 210 may obtain signature information on the surgical protocol from the patient terminal, the doctor terminal, and the home terminal device, respectively.
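The multi-party confirmation and signature collection can be modeled as a completeness check over the collected signatures; the party names follow the text, while the data layout and function name are hypothetical:

```python
def protocol_ready(signatures, required=("patient", "doctor", "family")):
    """Check that every required party has signed the surgical protocol;
    the party names follow the text, the data layout is an assumption."""
    return all(signatures.get(party) for party in required)

sigs = {"patient": "sig-p1", "doctor": "sig-d1", "family": None}
print(protocol_ready(sigs))  # family signature still missing
sigs["family"] = "sig-f1"
print(protocol_ready(sigs))
```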
The input method of the confirmation instruction or the signature information may include button input, gesture input, voice input, and the like. In some embodiments, the input of the confirmation instruction or the signature information may be a fingerprint input. The processing device 210 may verify the identity of the user based on the fingerprint input by the user, for example, using blockchain verification techniques.
Step 1750, generating an explanation video of the surgical plan.
In some embodiments, as shown in fig. 17, the processing device 210 may generate explanation notes for the surgical plan based on the second perception information and create an explanation video of the surgical plan based on the surgical video and the explanation notes. The explanation notes may include the physician's explanations of one or more frames and the corresponding operations in the surgical video. For example, the explanation notes may include explanations of incision site selection, incision path, incision length, and the like. The explanation notes may be incorporated into the surgical video in text or audio form to obtain the explanation video.
In some embodiments, processing device 210 may obtain interaction instructions related to the lecture material from at least one terminal device and update the lecture material displayed by the at least one terminal device based on the interaction instructions. The interactive instruction refers to a control or modification instruction of the explanation material input by the patient through the patient terminal or the doctor through the doctor terminal. For example, if the material is a surgical video, the interactive instruction may be "rewind video for 15 seconds".
According to some embodiments of the present disclosure, in a virtual patient-education space, the patient's condition and the surgical plan are explained to the patient and the patient's family through videos, and remote doctor-patient communication can be achieved. This can help the patient and the patient's family understand quickly, improve doctor-patient communication efficiency, help the patient and the patient's family fully understand the patient's current condition and post-operative risks, reduce the patient's fear and anxiety, and improve the success rate of surgical procedures.
Fig. 18 is a schematic diagram of an exemplary preoperative instruction flow shown according to some embodiments of the present description.
The pre-operative guidance may include patient escort, patient verification, pre-operative care, pre-operative cleaning, venous access establishment, and the like. Patient escort refers to transporting the patient from his/her current location to a waiting area of the operating room.
Patient verification refers to verifying whether the patient meets the surgical criteria. For example, patient verification may include verifying that the identity information of the verification object matches the target patient of the current surgical procedure, verifying that a surgical procedure is currently scheduled for the verification object, and verifying that the current physical condition of the verification object meets the requirements of the surgical procedure. It will be appreciated that if the verification object does not meet any of the surgical criteria, the patient's surgical procedure may be delayed or canceled.
In some embodiments, the processing device 210 may collect biological information of the patient through one or more third sensing devices of the waiting area and verify the identity of the patient based on the biological information. For example, as shown in fig. 18, after the patient is transported to the waiting area 1810, the processing device 210 may acquire biological information of the patient 261 through one or more third sensing devices 1811 (e.g., image capturing devices, microphones, fingerprint sensors, etc.) in the waiting area 1810 and verify the identity of the patient 261 (e.g., perform patient identity verification) based on the biological information. In some embodiments, the processing device 210 may utilize a nurse agent to verify the identity of the patient. For example, the nurse agent may verify the collected biological information, or verify the identity of the patient through voice interaction with the patient (e.g., asking the patient for age, name, gender, etc.).
Preoperative care includes preoperative pacifying and preoperative education. Preoperative pacifying refers to preoperative preparation that helps reduce the patient's negative emotions (e.g., anxiety, stress, fear, etc.) by way of verbal communication, video, music, etc. Preoperative education refers to preoperative preparation that helps the patient understand the surgical procedure. Preoperative cleaning refers to preoperative preparation such as body cleaning, hair removal (e.g., hair, body hair, etc.), the patient putting on a surgical gown, etc. Venous access establishment refers to creating an intravenous drug-delivery path in the patient to ensure that drugs are effectively administered to the patient during surgery.
In some embodiments, the processing device 210 may determine a planned path from the current location of the patient to the waiting area and control the intelligent wheelchair to transport the patient to the waiting area along the planned path. For example, as shown in fig. 18, the processing device 210 may determine a planned path from a current location 1803 of the patient 261 (e.g., a hospital ward) to the waiting area 1810 prior to performing a pre-operative procedure on the patient according to a surgical plan. The processing device 210 may control the intelligent wheelchair 240-5 to transport the patient 261 from the hospital ward 1803 to the waiting area 1810 along the planned path.
In some embodiments, the processing device 210 may determine a planned path from the current location to the waiting area based on the hospital map. In some embodiments, processing device 210 may configure a nurse agent that performs certain tasks in place of a nurse and may present a virtual nurse persona. The processing device 210 may use a nurse agent to control the intelligent wheelchair to transport the patient from the current location to the waiting area. In some embodiments, the processing device 210 may perform patient verification after the patient is transported to the waiting area.
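The planned-path determination from a hospital map can be sketched as a breadth-first search over an adjacency map; the `blocked` parameter illustrates how segments flagged as risky could be avoided. The map below is a made-up example, not data from the disclosure:

```python
from collections import deque

def plan_path(hospital_map, start, goal, blocked=frozenset()):
    """Breadth-first search over a hospital map modeled as an adjacency
    dict; `blocked` lets risky corridor segments be avoided."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in hospital_map.get(path[-1], []):
            if nxt not in visited and nxt not in blocked:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no reachable route

hospital = {
    "ward": ["corridor_a", "corridor_b"],
    "corridor_a": ["elevator"],
    "corridor_b": ["elevator"],
    "elevator": ["waiting_area"],
}
print(plan_path(hospital, "ward", "waiting_area"))
print(plan_path(hospital, "ward", "waiting_area", blocked={"corridor_a"}))
```

Re-planning around a newly detected risk then amounts to re-running the search with the risky segment added to `blocked`.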
In some embodiments, the processing device 210 may determine the pre-operative care material for the patient from the patient data and the surgical plan. The processing device 210 may use a patient terminal to provide preoperative education to the patient based on the pre-operative care material during transport of the patient to the waiting area. The pre-operative care material may include video, music, images, text, and other materials related to surgical explanation and/or emotional relaxation.
In some embodiments, the processing device 210 may provide pre-operative education to the patient using a nurse agent. For example, as shown in FIG. 18, processing device 210 may present virtual nurse character 1823 on XR device 260-2 worn by patient 261, with virtual nurse character 1823 narrating the pre-operative care material to patient 261. In some embodiments, virtual nurse character 1823 may interact with patient 261 by voice to alleviate a negative emotion of the patient or answer the patient's questions through communication. In some embodiments, the processing device 210 may determine whether the patient's emotions need to be soothed by capturing the patient's facial expression, physiological characteristics, intonation, etc.
In some embodiments, the processing device 210 may use a nurse agent to instruct the nurse on pre-operative cleaning and/or venous access establishment.
In some embodiments, during the transportation of the patient to the waiting area, the processing device 210 may obtain third perception information related to a portion of the planned path from the current location of the intelligent wheelchair to the waiting area (e.g., a portion of the planned path that the intelligent wheelchair has not yet traveled) through one or more fourth perception devices in the hospital. Based on the third perception information, the processing device 210 may determine a potential risk of the non-traveled portion of the planned path and update the non-traveled portion based on the potential risk.
The one or more fourth perception devices may include image capture devices (e.g., infrared surveillance camera 1813), lidar, and the like. The one or more fourth perception devices may be mounted on the intelligent wheelchair, a hospital ceiling, a hospital wall, or the like.
In some embodiments of the present application, the above-mentioned preoperative guidance procedure can provide a humanized, transparent, and efficient preoperative preparation process, and preoperative preparation items can be dynamically adjusted according to patient feedback, thereby improving preoperative preparation efficiency. The above process of transporting the patient and verifying the patient's identity avoids human error and improves the safety of the entire surgical process. Many preoperative preparation tasks are completed with the aid of the virtual nurse persona, which can save labor costs.
Fig. 19 is a schematic diagram of an exemplary flow 1900 of surgical execution shown in accordance with some embodiments of the present description. The surgical execution flow may include preoperative preparation, intraoperative events, and postoperative events. As shown in fig. 19, the preoperative preparation (i.e., the step prior to the surgical execution) may include steps 1911, 1913, and 1915.
Step 1911, the operating room is activated.
Activating the operating room may include opening the operating room door, activating surgical devices and monitoring devices in the operating room, adjusting parameters in the operating room, verifying the status of the surgical devices, and the like. In some embodiments, the processing device 210 may control the intelligent robotic nurse to activate the operating room or direct a nurse to activate the operating room. For example, the processing device 210 may control the intelligent robotic nurse to automatically activate the operating room devices at the predetermined operating time and adjust the room temperature, humidity, and air quality.
Step 1913, a surgical tool is prepared.
Surgical tools may include surgical instruments and surgical consumables. In some embodiments, the processing device 210 may control the intelligent robotic nurse to prepare surgical tools in the operating room prior to surgery according to the surgical plan. In some embodiments, the processing device 210 may further control the intelligent robotic nurse to disinfect and place the surgical table (e.g., place various surgical tools on the surgical table).
Step 1915, the patient is confirmed and/or anesthetized. Patient confirmation refers to verifying the identity of the patient. Patient anesthesia refers to administering anesthesia to the patient.
Step 1920, a surgical procedure is performed. In some embodiments, as shown in fig. 19, the intraoperative events can include remote collaboration, tool transfer, image interaction, intraoperative planning and navigation, and real-time alerting.
Remote collaboration refers to remotely participating in and/or guiding a surgical procedure.
Tool delivery refers to the delivery of surgical tools to a surgical executor during a surgical procedure. In some embodiments, processing device 210 may identify instructions issued by the surgical participant for the targeted surgical tool based on first sensory information acquired by one or more first sensory devices in the operating room during the procedure. Based on these instructions, processing device 210 may control the intelligent robotic nurse to deliver the targeted surgical tool to the surgical participant.
Image interaction refers to displaying a digital body model of a patient (e.g., a three-dimensional anatomical model of a surgical site), an electronic medical record of the patient, a surgical plan of a current surgery, a real-time image of a surgical site of the patient, etc., to a surgical participant (e.g., a local surgical participant, a teleoperational participant) and/or the patient via an interaction device (e.g., a display screen of an operating room, a doctor terminal 270).
Intraoperative planning navigation refers to fusing a patient's lesion image (e.g., CT scan image of the lesion) with a digital body model of the patient during surgery, projecting the lesion image onto the patient's body, or overlaying the positioning and tracking of surgical tools to guide the surgical participants in performing the surgery.
Real-time alarms may include behavioral alarms of surgical participants, patient vital sign alarms, and device operational status alarms, among others. Behavior alert refers to the monitoring and alerting of the intraoperative operational behavior of a surgical participant. When an abnormality occurs in a patient vital sign (e.g., electrocardiogram, blood pressure, etc.), a patient vital sign alarm may be activated. The device operation state alarm refers to an alarm issued when an abnormal operation state of the surgical device occurs.
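A vital-sign alarm check of the kind described above can be sketched as a range comparison; the default limits below are illustrative textbook-style values, not clinical guidance:

```python
def vital_sign_alarms(vitals, limits=None):
    """Flag vital signs outside their normal range; the default limits
    are illustrative values, not clinical guidance."""
    limits = limits or {
        "heart_rate": (60, 100),   # beats per minute
        "systolic_bp": (90, 140),  # mmHg
        "spo2": (95, 100),         # percent
    }
    alarms = []
    for name, value in vitals.items():
        lo, hi = limits.get(name, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            alarms.append(name)
    return alarms

print(vital_sign_alarms({"heart_rate": 118, "systolic_bp": 120, "spo2": 96}))
```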
As shown in fig. 19, the post-operative procedure (i.e., the post-operative events) may include steps 1931, 1933, and 1935.
Step 1931, patient transfer. Patient transfer refers to the process of transferring the patient from the operating room to a rehabilitation area after the surgical procedure is completed. In some embodiments, the patient transfer may be performed by a healthcare professional assisted by the intelligent robotic nurse.
Step 1933, operating room cleaning. Operating room cleaning refers to the process of cleaning or disinfecting surgical equipment and surgical tools. In some embodiments, the processing device 210 may control the intelligent robotic nurse to perform room cleaning operations.
Step 1935, a surgical report is generated.
The surgical report may include surgical related information, patient related records, participant related records, and the like. In some embodiments, the processing device 210 may generate an initial surgical report based on data acquired during the surgical procedure (e.g., first sensory information acquired by one or more first sensory devices in the operating room). The processing device 210 may generate the surgical report based on the initial surgical report and feedback information entered by the physician regarding the initial surgical report.
In some embodiments, the processing device 210 may also monitor the patient's post-operative vital signs through vital sign monitoring devices (e.g., electrocardiograph monitors, blood pressure monitors, etc.) within the hospital ward to determine whether the patient's post-operative vital signs are within the normal range, whether there is an abnormality, and whether the recovery process is normal. Furthermore, the processing device 210 may update the medical advice report based on the patient's post-operative vital signs. In some embodiments, the processing device 210 may update the medical advice report according to the physician's instructions. In some embodiments, the processing device 210 may send the updated medical advice report to a display device of the nurse workstation and/or a display device of the doctor workstation.
In some embodiments, the processing device 210 may determine the post-operative care plan from the updated medical advice report. A post-operative care plan refers to the care tasks that need to be performed by a caregiver (e.g., a nurse, a care worker, etc.) during the patient's post-operative hospital stay. In some embodiments, the processing device 210 may control a smart care device (e.g., the smart care cart 240-4) to provide care to the patient according to the post-operative care plan. In some embodiments, the processing device 210 may send the post-operative care plan to a nurse so that the nurse may provide post-operative care to the patient. In some embodiments, the processing device 210 may update the post-operative care plan in real time based on the patient's condition during the care. The execution of the post-operative care plan is similar to that of the daily plan depicted in fig. 15.
In some embodiments, the processing device 210 may generate surgical results and surgical records for the doctor from the surgical report and the medical advice report, in order for the doctor to review the surgical procedure. The surgical results refer to data reflecting the outcome of the surgical procedure. In some embodiments, the surgical results further include summary data of surgical outcomes over a predetermined period of time (e.g., one month). The surgical records refer to records of the doctor's operations during the surgery, and may include action records, force records, station records, and the like.
In some embodiments, the surgical procedure may be reviewed. For example, the processing device 210 may present the surgical results and the surgical records to the surgeon, allowing the surgeon to review the surgical procedure.
Fig. 20 is a schematic diagram of an exemplary healthcare procedure 2000 shown according to some embodiments of the present description. In some embodiments, the processing device 210 (e.g., the service module 430 or an agent corresponding to a patient service configured on the processing device 210) may perform the steps involved in the healthcare procedure 2000.
In step 2010, patient data is acquired.
The patient data may be information about the patient receiving the medical service. Patient data may also be referred to as patient-related data. In some embodiments, patient-related data may include, but is not limited to, patient location information 2011, patient interaction information 2012, patient's examination data 2013, perception data 2014, clinical path 2015, and/or healthcare provider entered data 2016.
The patient location information 2011 is information for determining the patient's location at the current time.
Patient interaction information 2012 may be information generated by interactions between the patient and the terminal device, for example, personal information, health status, medical history, symptom descriptions, lifestyle, etc., entered by the patient through the patient terminal and/or hospital terminal. As another example, it may include answers entered by the patient through the patient terminal, remote accompanying requests, target links selected by the patient, and the like.
The examination data 2013 is the examination results regarding the patient's health condition, such as examination reports of the patient, medical images, vital signs, etc.
The perception data 2014 is information collected by perception devices. For example, when the patient is located in a hospital, the patient data may include patient-related perception information collected by one or more perception devices in the hospital.
Clinical pathway 2015 refers to a model and method for prescribing standardized medical services, which may be used to formulate detailed treatment steps and management plans for a particular disease or condition. For example, the clinical pathway may be a timeline listing standardized, evidence-based steps performed for a particular patient population over a particular period of time. These steps may include specific operations and goals for each point in time, corresponding to the diagnostic, therapeutic, nursing, and rehabilitation procedures.
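A clinical pathway of this kind can be represented, purely for illustration, as a timed sequence of standardized steps. The following sketch is not part of the disclosed system; the class names, the example condition, and all step values are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class PathwayStep:
    """One standardized step of a clinical pathway (all values illustrative)."""
    day: int        # point in time, e.g., day of the hospital stay
    procedure: str  # diagnostic/therapeutic/nursing/rehabilitation operation
    goal: str       # expected outcome at this point in time

@dataclass
class ClinicalPathway:
    condition: str
    steps: list = field(default_factory=list)

    def steps_for_day(self, day: int):
        """Return the standardized operations scheduled for a given day."""
        return [s for s in self.steps if s.day == day]

# A simplified, hypothetical pathway for one condition
pathway = ClinicalPathway(
    condition="appendicitis",
    steps=[
        PathwayStep(day=1, procedure="blood test", goal="confirm diagnosis"),
        PathwayStep(day=1, procedure="appendectomy", goal="remove appendix"),
        PathwayStep(day=2, procedure="wound care", goal="prevent infection"),
    ],
)
day1 = [s.procedure for s in pathway.steps_for_day(1)]
```

Such a structure lets a scheduler look up, per point in time, which operations and goals apply to the patient population in question.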
The data 2016 input by the medical service provider refers to patient-related data input by the medical service provider (e.g., doctor, nurse, care provider, etc.) through a doctor terminal or hospital terminal device, such as disease diagnosis, treatment plan, orders, document modification, etc.
In some embodiments, the acquisition module 410 may acquire patient-related data via one or more hardware devices including the patient terminal 2010-a, the doctor terminal 2010-b, the hospital terminal device 2010-c, the examination device 2010-d, and/or the perception device 2010-e of the environment in which the patient is located.
In step 2020, it is detected, according to the patient data, that the patient enters a target link in the medical service flow.
The healthcare process is a standardized series of operations and service steps that a patient undergoes from the time the patient first comes into contact with the healthcare service until the treatment is completed and the patient leaves. As shown in fig. 20, the medical service procedure may include a registration link, a waiting link, a consultation link, a hospitalization link, a follow-up link, and so on. The target link refers to the medical service link that the patient is about to enter. For example, based on the patient-related data, the processing device 210 may detect whether the patient's link in the healthcare procedure switches to a target link. A detailed description of detecting patient entry into the target link can be found in fig. 21.
In step 2030, in response to detecting that the patient enters the target link, the patient terminal is controlled to execute at least one first preset operation corresponding to the target link.
The first preset operation refers to a dynamic operation performed through an interactive interface of the patient terminal. Fig. 22 is a schematic diagram of an exemplary first preset operation shown in accordance with some embodiments of the present application. As shown in fig. 22, the first preset operation may include one or more of presenting guidance information 2210, presenting introduction information 2220 of the target link, and/or presenting a service portal 2230. The guidance information is navigation information of a planned path from the current position of the patient to a target position corresponding to the target link. The planned path may be determined by acquiring an initial 3D map of the hospital and real-time information related to the hospital, generating a real-time 3D map of the hospital from the initial 3D map and the real-time information, and determining the planned path from the real-time 3D map of the hospital.
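Determining a planned path over a real-time map can be sketched as a shortest-path search over a graph view of the hospital, where edge weights (travel times) are updated from the real-time information. This is only an illustration under assumed data; the location names, weights, and the graph representation are all hypothetical:

```python
import heapq

def plan_path(realtime_map, start, target):
    """Shortest path over a graph view of the hospital's real-time 3D map.

    `realtime_map` maps each location to {neighbor: travel_time}; real-time
    information can update weights or remove edges (e.g., a closed corridor).
    Standard Dijkstra search over non-negative travel times.
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, w in realtime_map.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if target not in dist:
        return None  # no reachable path
    path, node = [], target
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

# Hypothetical map fragment: travel times in minutes
hospital = {
    "lobby": {"elevator": 1.0, "stairs": 2.0},
    "elevator": {"radiology": 1.5},
    "stairs": {"radiology": 1.0},
    "radiology": {},
}
route = plan_path(hospital, "lobby", "radiology")
```

Here the elevator route (2.5 minutes) beats the stairs route (3.0 minutes), so the guidance information would navigate the patient via the elevator.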
The introduction information of the target link is information for introducing the medical services and procedures of the target link. In some embodiments, the processing device 210 may present an avatar that explains the introduction information. Alternatively, the processing device 210 may receive, from the patient terminal, a question about the target link after the introduction information is presented, and determine an answer to the question. The processing device 210 may cause the patient terminal to present the answer to the question to the patient.
The service portal is a portal through which a user obtains medical services at the target link and/or a link subsequent to the target link. In some embodiments, the processing device 210 may cause the patient terminal to present guidance information related to a planned path from the current location of the patient to a target location corresponding to the target link, and determine whether the patient has reached the target location based on the patient-related data. In response to determining that the patient has reached the target location, the processing device 210 may cause the patient terminal to present a service portal for accessing user services associated with the target link.
In some embodiments, the patient terminal includes an XR device, and the processing device 210 may cause the XR device to overlay information related to the first preset operation over a real-world view of the patient via AR technology or MR technology. In some embodiments, the patient terminal has a patient space application installed. The patient interaction information 2012 between the patient and the patient terminal may be obtained using the patient space application. The first preset operation may be performed using the patient space application.
In some embodiments of the present disclosure, when it is detected that a patient enters a target link, the patient terminal is controlled to perform a first preset operation according to the target link, so as to provide the patient with a medical service corresponding to the target link and assist the patient in participating in the required medical service procedure, thereby improving the efficiency and service quality of the medical service.
In some embodiments, the doctor terminal is controlled to perform at least one second preset operation in response to detecting that the patient switches from the current healthcare link to the target link.
The second preset operation is an operation performed through the interactive interface of the doctor terminal. In some embodiments, a doctor space application is installed in the doctor terminal. The second preset operation may be performed using the doctor space application.
In some embodiments, the second preset operation may include presenting a reminder that the patient has completed the current link, presenting a patient record related to the patient's current link, and/or presenting a patient schedule or patient request related to the patient's service at the target link.
In some embodiments, in response to detecting that the patient link switches from the current healthcare link to the target link, the hospital terminal device is controlled to perform at least one third preset operation.
The third preset operation is an operation performed through the interactive interface of the hospital terminal device. In some embodiments, a management space application is installed on the hospital terminal device. The third preset operation may be performed using the management space application. In some embodiments, the processing device 210 may cause a public terminal device in the hospital to perform the at least one third preset operation.
In some embodiments, the third preset operation may include presenting a patient record related to the current link of the patient, presenting a reminder indicating that the patient has completed the current link, presenting a patient schedule or patient request related to the patient's service at the target link, and/or presenting service appointment information for the patient's target link.
In some embodiments, in response to detecting that the patient's link switches from the current healthcare link to the target link, the processing device 210 may schedule service resources for the medical service of the patient's target link. For example, when the patient is detected entering an examination link, the processing device 210 may assign an examination room and an examination technician to the patient.
In some embodiments, if the target link is an outpatient registration link, the at least one preset operation may include a first query of the patient to determine a doctor to whom the patient is registered.
In some embodiments, if the target link is a waiting link for an outpatient, the at least one preset operation may include making a second query to the patient for a pre-consultation.
In some embodiments, if the target link is a follow-up link, the at least one preset operation may include alerting the patient to the follow-up at the scheduled time.
FIG. 21 is a schematic diagram of an exemplary flow 2100 for determining a target link, according to some embodiments of the present description.
In step 2110, it is detected, according to the patient data, that the current link of the medical service procedure is completed.
The current link is the medical service link in which the patient currently participates. For example, the processing device 210 may detect that the current link has been completed based on the patient location information 2011: when the medical service system 200 provides an on-site medical service and the processing device 210 determines, according to the patient location information 2011, that the patient has left the location corresponding to the current link, it may determine that the current link is completed.
As another example, the processing device 210 may detect that the current link has been completed based on the patient interaction information 2012, the data 2016 entered by the healthcare provider, the examination data 2013, the clinical pathway 2015, and/or the perception data 2014.
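Combining these heterogeneous evidence sources can be sketched as follows. This is an illustration only; the signal names are hypothetical, and the any-single-positive-signal decision rule is an assumption, not the disclosed method:

```python
def current_link_completed(signals):
    """Decide whether the current link is completed from heterogeneous signals.

    `signals` is a dict of optional evidence streams; here any single positive
    signal is treated as completion, as a simplifying assumption.
    """
    checks = [
        signals.get("left_link_location"),         # from location info 2011
        signals.get("patient_confirmed_done"),     # from interaction info 2012
        signals.get("provider_marked_done"),       # from provider input 2016
        signals.get("examination_report_issued"),  # from examination data 2013
        signals.get("pathway_step_elapsed"),       # from clinical pathway 2015
        signals.get("perceived_departure"),        # from perception data 2014
    ]
    return any(bool(c) for c in checks)
```

A production system would more likely weight or cross-check these signals; the sketch only shows how the six data sources feed one completion decision.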
In step 2120, one or more next links that the patient may perform are predicted.
In some embodiments, the processing device 210 may determine one or more next links that the patient may perform according to a standardized operational flow of the healthcare procedure. A standardized operational flow is a healthcare procedure applicable to most patients. For example, upon detecting that the patient has completed the registration link, the processing device 210 may determine, based on the standardized operational flow of the outpatient service, that the next link the patient is likely to perform is the waiting link.
In some embodiments, the processing device 210 may determine one or more next links that the patient may perform according to a patient personalization flow of the healthcare process. A patient personalization flow is a medical service flow tailored to a specific patient (e.g., a very important person (VIP) or a critical patient). In some embodiments, the processing device 210 may determine one or more next links that the patient may perform based on the patient's off-site health information.
In some embodiments, when there are multiple next links that the patient may execute, the processing device 210 may recommend a next link based on the time required for the patient to travel to the location corresponding to the next link and the patient's estimated wait time. For example, for each of the plurality of next links, the processing device 210 may determine, from the real-time 3D map of the hospital, a first estimated time from the current location to the target location corresponding to the next link and a second estimated time of waiting at the target location. Further, the processing device 210 may select at least a portion of the plurality of next links based on the first estimated time and the second estimated time of each next link.
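One plausible way to combine the two estimated times is to rank candidate links by their sum, as in the following sketch. The link names, time values, and the 60-minute cutoff are all hypothetical assumptions for illustration:

```python
def recommend_next_links(candidates, max_total_minutes=60):
    """Rank candidate next links by travel time plus estimated wait time.

    Each candidate is (link_name, first_estimated_time, second_estimated_time),
    both in minutes, as obtained from the real-time 3D map. Candidates whose
    combined time exceeds the (assumed) cutoff are dropped.
    """
    scored = [
        (name, travel + wait)
        for name, travel, wait in candidates
        if travel + wait <= max_total_minutes
    ]
    scored.sort(key=lambda item: item[1])  # shortest combined time first
    return [name for name, _ in scored]

ranked = recommend_next_links([
    ("examination link", 5, 40),
    ("payment link", 2, 3),
    ("pharmacy link", 3, 10),
])
```

With these example numbers the payment link (5 minutes total) is recommended first, ahead of the pharmacy link (13 minutes) and the examination link (45 minutes).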
In step 2130, the patient terminal is controlled to present at least a portion of the one or more next links that the patient may perform.
The patient terminal 2010-a may present one or more next links that the patient may perform, or the next link, recommended by the processing device 210, that the patient may prefer to enter. For example, after the registration link is completed, the patient terminal 2010-a may present a number of possible next links, such as a waiting link, a consultation link, and a hospitalization link.
In step 2140, a target link selected by the patient is obtained from the patient terminal.
The patient can select the target link using the patient terminal by means of text input, voice input, screen click, gesture input, and the like. In some embodiments of the present disclosure, after it is detected that the current link is completed, possible next links may be automatically presented to the patient according to the different medical procedures corresponding to different patients, so as to help the patient understand the medical procedure and give the patient freedom of selection. In addition, personalized services are provided for the patient.
Fig. 23 is a schematic flow diagram of an exemplary flow 2300 for assisting a physician's work, shown in accordance with some embodiments of the present specification. In some embodiments, flow 2300 may be implemented by processing device 210 (e.g., service module 430 or an agent corresponding to a doctor service configured on processing device 210).
In step 2310, an access request to the doctor space application may be obtained from the doctor terminal.
An access request refers to a request by a doctor to initiate access to the doctor space application. The doctor may generate the access request in various ways. For example, the doctor space application may be displayed on the display screen of the mobile terminal device 270-1, and the doctor 271 may generate the access request by clicking on an icon of the doctor space application on the display screen or by issuing a voice command (e.g., "launch the doctor space application"). As another example, the XR device 270-2 may generate a virtual reality version of the doctor space application, and the doctor 271 may generate the access request by interacting with the virtual reality doctor space application or by using voice commands.
In step 2320, in response to the access request, one or more pending tasks of the doctor may be determined based on the receipt time of the access request and the doctor's schedule information.
The receipt time of the access request refers to the point in time at which the processing device 210 receives the access request transmitted from the doctor terminal 270.
The doctor's schedule information refers to the details of the doctor's work schedule for the day. The schedule information may include the various work tasks that the doctor needs to complete on the day and the time corresponding to each work task (e.g., the planned start time and the planned end time of each work task). For example only, the doctor's schedule information may be "ward round preparation (7:45-8:00), ward rounds (8:00-9:00), pre-consultation preview (9:00-9:10), outpatient visits (9:10-11:30), preoperative preparation (13:30-14:00), surgery (14:00-17:00)".
Pending tasks refer to tasks that the doctor has not yet completed on the day, which may include tasks that the doctor is currently processing, tasks yet to begin, or a combination thereof. For example, taking the doctor's schedule information in the above example, if the doctor is currently conducting outpatient visits, the one or more pending tasks may include the outpatient visits, the preoperative preparation, and the surgery. In some embodiments, the processing device 210 may determine one or more tasks whose planned end times are later than the receipt time of the access request as pending tasks.
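The planned-end-time rule can be sketched directly against the example schedule above. The schedule values mirror the example in the text; the function and variable names are illustrative:

```python
from datetime import time

# (task, planned start, planned end) — values from the example schedule
SCHEDULE = [
    ("ward round preparation",   time(7, 45),  time(8, 0)),
    ("ward rounds",              time(8, 0),   time(9, 0)),
    ("pre-consultation preview", time(9, 0),   time(9, 10)),
    ("outpatient visits",        time(9, 10),  time(11, 30)),
    ("preoperative preparation", time(13, 30), time(14, 0)),
    ("surgery",                  time(14, 0),  time(17, 0)),
]

def pending_tasks(schedule, receipt_time):
    """Tasks whose planned end time is later than the access request's receipt time."""
    return [task for task, _start, end in schedule if end > receipt_time]

# An access request received at 9:30, during the outpatient visits
todo = pending_tasks(SCHEDULE, time(9, 30))
```

For a request received at 9:30, the rule yields the outpatient visits, the preoperative preparation, and the surgery, matching the example in the text. The complementary rule for completed tasks (planned end time earlier than the receipt time) is the same comparison reversed.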
In some embodiments, the one or more pending tasks may include making rounds in a hospital ward. In some embodiments, the doctor may conduct the rounds by participating in them remotely. For example, the processing device 210 may obtain, from the doctor terminal, a request entered by the doctor to remotely participate in ward rounds. In response to detecting that a ward round is being conducted in the hospital ward, the processing device 210 may obtain perception information collected by one or more perception devices in the hospital ward during the ward round, generate a virtual ward space based on the perception information and the patient data, and cause the doctor terminal to present the virtual ward space.
In some embodiments, the one or more pending tasks may include providing an outpatient service in a consulting room. In some embodiments, the one or more pending tasks may include providing a remote outpatient service. A remote outpatient service refers to a doctor providing medical diagnosis and services through an online platform (e.g., the doctor space application). In some embodiments, the one or more pending tasks may include performing surgery on a target patient.
In some embodiments, the one or more pending tasks may include writing a work record. The work record may record details of the doctor's daily activities, such as the content of the work completed on the day, the time of the work, emergency situations during a task, task summaries, etc. In some embodiments, the work records may include daily work records, work records for each task, or work records for a preset period of time (e.g., 3 hours, 5 hours, 24 hours, etc.). In some embodiments, the pending task of writing a work record may be displayed at all times.
In some embodiments, the one or more pending tasks may include reviewing the records of one or more tasks that have been completed. For example, the doctor 271 may access records related to ward rounds, visits, surgeries, etc. through the doctor terminal 270 and view these records. When viewing the records of one or more completed tasks, the doctor may add, modify, and/or delete content in the records.
In some embodiments, the processing device 210 may determine one or more completed tasks of the doctor based on the receipt time of the access request and the doctor's schedule information. In some embodiments, the processing device 210 may determine a task whose planned end time is earlier than the receipt time of the access request as a completed task. In some embodiments, since certain tasks may be completed ahead of schedule, the doctor may input a task completion instruction (e.g., a voice command "visit task completed") through the doctor terminal 270 according to the actual completion status of the task. The doctor terminal 270 may transmit the task completion instruction to the processing device 210, and the processing device 210 may determine the corresponding task to be a completed task according to the task completion instruction.
In some embodiments, in response to the access request, the processing device 210 may cause the doctor terminal 270 to present, through the doctor space application, an initial interactive interface including an eighth interface element for reminding the doctor to check the work schedule (i.e., the doctor's schedule information). In some embodiments, the processing device 210 may determine the one or more pending tasks in response to a request, entered by the doctor through the doctor terminal, to access the work schedule.
In step 2330, the doctor terminal may be caused to present an interactive interface through the doctor space application. For example, as shown in fig. 23, the processing device 210 may cause the doctor terminal 270 to present an interactive interface 2331. If the doctor terminal 270 is a terminal device with a display screen (e.g., the mobile terminal device 270-1 or a desktop terminal device), the interactive interface 2331 may be presented directly on the display screen; if the doctor terminal 270 is the XR device 270-2, the XR device 270-2 may present the interactive interface 2331 in the virtual reality space generated by the XR device 270-2. For more description of the interactive interface, see FIG. 24 and its associated description.
The interactive interface may include at least one interface element. The doctor may access one or more auxiliary services corresponding to the at least one interface element by interacting with it, for example, by clicking, long-pressing, voice selection, etc.
Auxiliary services refer to functionality provided by the doctor space application to assist the doctor in completing work tasks. For example, the auxiliary services may include displaying patient data, displaying a 3D map of a target location within the hospital, and providing the doctor with services for remotely participating in a visit or a ward round.
In some embodiments, the interactive interface may include a first interface element for accessing an auxiliary service associated with at least one of the one or more pending tasks. The doctor may access the auxiliary service corresponding to the first interface element by clicking on the first interface element or selecting it by voice. For more description of the interactive interface, see FIG. 24 and its associated description.
Fig. 24 is a schematic diagram of an exemplary interaction interface 2331 shown in accordance with some embodiments of the present description.
In some embodiments, as shown in fig. 24, the interactive interface 2331 may include one or more first interface elements 2410 (hereinafter referred to as the plurality of first interface elements 2410) for accessing auxiliary services related to at least one of the one or more pending tasks.
In some embodiments, when the one or more pending tasks include making rounds in a hospital ward, the first interface elements may include a first interface element for applying to remotely participate in ward rounds (hereinafter referred to as a ward round interface element). For example, as shown in fig. 24, the plurality of first interface elements 2410 may include a ward round interface element 2411.
In some embodiments, when the one or more pending tasks include providing an outpatient service in a consulting room, the first interface elements may include a first interface element (hereinafter referred to as a consultation interface element) for accessing patient data of patients for whom a consultation service has been scheduled. For example, as shown in fig. 24, the plurality of first interface elements 2410 may include a consultation interface element 2412. In some embodiments, the first interface elements further comprise a first interface element for accessing an initial diagnostic record associated with the outpatient service, the initial diagnostic record being generated based on perception information collected by one or more perception devices of the consulting room during the outpatient service. For example, the processing device 210 may obtain, from the doctor terminal, a request to access the patient data of a target patient among the patients, generate a virtual character representing the target patient from the patient data of the target patient, and cause the doctor terminal to present the virtual character to interpret the patient data of the target patient to the doctor.
In some embodiments, when the one or more pending tasks include providing a remote outpatient service, the first interface elements may include a first interface element for entering a virtual consulting room (hereinafter referred to as a consulting room interface element). For example, as shown in fig. 24, the plurality of first interface elements 2410 can include a consulting room interface element 2413. In response to an interaction between the doctor (e.g., the doctor 271) and the consulting room interface element 2413, the processing device 210 may cause the doctor terminal 270 (e.g., the XR device 270-2) to present a virtual consulting room. For example, the processing device 210 may obtain, from the doctor terminal, a request to enter the virtual consulting room to provide a remote outpatient service to a target patient, and cause the doctor terminal to present a 3D patient model of the target patient. The processing device 210 may acquire, from the doctor terminal, an examination order input by the doctor through interaction with the 3D patient model, and cause a wearable device worn by the target patient to acquire the examination data of the target patient according to the examination order.
In some embodiments, when the one or more pending tasks include performing surgery on the target patient, the first interface elements may include a first interface element (hereinafter referred to as a surgery interface element) for accessing patient data related to the target patient. For example, as shown in fig. 24, the plurality of first interface elements 2410 may include a surgery interface element 2414 for accessing patient data related to the target patient. The patient data may include data related to a target surgical plan corresponding to the target patient.
In some embodiments, the first interface elements may further include a first interface element for updating the medical orders of the target patient. In some embodiments, the first interface elements may further include a first interface element for accessing an initial surgical record of a surgery. The initial surgical record may be generated based on perception information acquired by one or more perception devices in the operating room during the surgery.
In some embodiments, as shown in fig. 24, the interactive interface 2331 may further include a second interface element 2420 for accessing a real-time 3D map associated with a target location corresponding to the at least one pending task (i.e., for browsing the target location corresponding to the at least one pending task). The real-time 3D map refers to a virtual reality (VR) model of the space corresponding to the target location. In some embodiments, the real-time 3D map of the target location may be generated based on the initial 3D map of the hospital and real-time information of the target location.
In some embodiments, as shown in fig. 24, the interactive interface 2331 may further include a third interface element 2430 for preoperative education. In response to an interaction between the doctor and the third interface element 2430, the doctor terminal 270 can generate a request for preoperative education for the target patient and send the request to the processing device 210. For example, the processing device 210 may obtain the request for preoperative education for the target patient. The request may be obtained from the XR device of the doctor and entered by the doctor through interaction with the third interface element. In response to the request, the processing device 210 may generate instructional material for interpreting the candidate surgical plan for the target patient and cause the XR device worn by the patient and the XR device of the doctor to simultaneously present the instructional material to the target patient and the doctor.
In some embodiments, as shown in fig. 24, the interactive interface 2331 may further include a fourth interface element 2440 for performing surgical simulation. In response to an interaction between the doctor and the fourth interface element 2440, the doctor terminal 270 may generate a request to simulate a target surgery and send the request to the processing device 210. The target surgery refers to the surgery corresponding to the target surgical plan. For example, the processing device 210 may obtain the request to simulate the target surgery. The request may be obtained from the XR device of the doctor and entered by the doctor through interaction with the fourth interface element. In response to the request, the processing device 210 may generate a virtual surgical scene corresponding to the target surgery. The virtual surgical scene may include a virtual surgical site and a virtual surgical device. The processing device 210 may cause the XR device of the doctor to present the virtual surgical scene to the doctor. In some embodiments, the processing device 210 may obtain, via the XR device of the doctor or an interaction device corresponding to the virtual surgical device, interaction instructions regarding the virtual surgical device entered by the doctor, and update the virtual surgical site and the virtual surgical device in the virtual surgical scene based on the interaction instructions.
In some embodiments, as shown in fig. 24, the interactive interface 2331 may include a fifth interface element 2450 for executing a surgical plan. In response to an interaction between the doctor and the fifth interface element 2450, the doctor terminal 270 may generate a request for a surgical plan for the target patient and send the request to the processing device 210. For example, the processing device 210 may obtain a request to execute the surgical plan for the target patient. The request may be obtained from the doctor terminal and entered by the doctor by interacting with the fifth interface element. The processing device 210 may determine a surgical difficulty factor based on the patient data of the target patient and determine whether an expert conference is required based on the surgical difficulty factor. In response to determining that an expert conference is required, the processing device 210 may cause the doctor terminal to present a sixth interface element 2460 for initiating the expert conference.
In some embodiments, as shown in fig. 24, the interactive interface 2331 may further include a seventh interface element 2470 for patient management. Patient management refers to the management of patient-related data (e.g., patient data, surgical records, hospitalization records, care records, post-operative recovery records, etc.). In some embodiments, in response to an interaction between the doctor and the seventh interface element 2470, the doctor terminal 270 can generate a request to access the initial hospitalization record of the target patient and send the request to the processing device 210. For example, the processing device 210 may obtain the request to access the initial hospitalization record of the target patient. The request may be obtained from the doctor terminal and entered by the doctor by interacting with the seventh interface element. In response to the request, the processing device 210 may cause the doctor terminal to present the initial hospitalization record and update it according to feedback information, regarding the initial hospitalization record, entered by the doctor through the doctor terminal.
In some embodiments, in response to the access request, the processing device 210 may cause the physician terminal to present, through the hospital space application, an initial interactive interface that includes an eighth interface element for prompting the physician to check the work schedule. In response to a request to access the work schedule entered by the physician through the physician terminal, the processing device 210 may determine one or more tasks to be processed. In some embodiments, the initial interactive interface may further present a virtual character configured to communicate with the physician.
In some embodiments, the interactive interface 2331 may further include calendar information associated with at least one task to be processed. For example, as shown in fig. 24, the interactive interface 2331 may include schedule information 2480 for ward rounds (planned start time 8:00), pre-consultation (planned start time 9:00), visits (planned start time 9:10), and the like.
In some embodiments, the interactive interface 2331 may further include one or more foldable elements associated with one or more completed tasks. A foldable element in the interactive interface may be reduced (e.g., to 1/4 of its original size) or enlarged through human-machine interaction (e.g., clicking, voice selection, etc.). Thus, a foldable element may have a folded state (the reduced state) and an unfolded state (the enlarged state). In the unfolded state, the foldable element may display completed tasks and time information for the completed tasks (e.g., a start time of one or more completed tasks). For example, as shown in fig. 24, the interactive interface 2331 may include a foldable element 2490 (in an unfolded state); the foldable element 2490 may present a ward preparation task that started at 7:45. The physician may click the foldable element 2490 on the interactive interface 2331, and the processing device 210 may cause the foldable element 2490 to enter the folded state (the area of the foldable element 2490 may be reduced and the completed task may no longer be presented) according to the click operation.
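The two states of a foldable element described above may be modeled as follows. This is an illustrative Python sketch; the class and field names are assumptions made for the example.

```python
# Illustrative model of a foldable interface element with folded/unfolded
# states: a click toggles the state, and the folded state hides the task text.
from dataclasses import dataclass

@dataclass
class FoldableElement:
    task: str          # a completed task, e.g. "ward preparation"
    start_time: str    # start time of the completed task, e.g. "7:45"
    unfolded: bool = True

    def on_click(self) -> None:
        """Toggle between the folded and unfolded states."""
        self.unfolded = not self.unfolded

    def render(self) -> str:
        # Unfolded: show the completed task and its time information.
        # Folded: show only a compact placeholder (the reduced area).
        if self.unfolded:
            return f"{self.task} (started {self.start_time})"
        return "[+]"
```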
In some embodiments, the configuration of the interactive interface 2331 may be determined based on preference information of the physician. The preference information reflects the physician's preference for the content, specifications (including shape, color, font, etc.), and/or position of elements on the interface (e.g., the first interface element, the second interface element, etc.). In some embodiments, the preference information may also reflect the physician's preference for interface style. The interface style may include a simple style and a detailed style. A simple style may contain fewer interface elements than a detailed style. In some embodiments, the preference information may include whether to display the virtual character.
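A minimal sketch of configuring the interface from such preference information follows. The style names ("simple"/"detailed"), element lists, and the `show_avatar` flag are illustrative assumptions, not values from this description.

```python
# Hypothetical sketch: select interface elements from physician preference
# information. Element names and style keys below are illustrative.

DETAILED_ELEMENTS = ["schedule", "patient_management", "surgical_plan",
                     "expert_conference", "foldable_history", "avatar"]
SIMPLE_ELEMENTS = ["schedule", "patient_management"]  # fewer elements

def configure_interface(preferences: dict) -> dict:
    """Build an interface configuration from preference information."""
    style = preferences.get("style", "detailed")
    elements = list(SIMPLE_ELEMENTS if style == "simple" else DETAILED_ELEMENTS)
    # The preference information may also state whether the virtual
    # character (avatar) is displayed.
    if not preferences.get("show_avatar", True) and "avatar" in elements:
        elements.remove("avatar")
    return {"style": style, "elements": elements}
```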
Fig. 25 is a schematic diagram of an exemplary system 2500 for hospital management (also referred to as a hospital management system 2500) according to some embodiments of the present specification. As shown in fig. 25, the hospital management system 2500 may include an administrator 2510 of a hospital, an administrative space application 2520, and hospital resources 2530.
The administrator 2510 may manage the hospital resources 2530 through the administrative space application 2520. The administrator 2510 may include an administrator of the hospital, e.g., a dean of the hospital, a supervisor of a hospital department (e.g., stomatology, internal medicine, etc.), and the like. The hospital resources 2530 may include devices, personnel, digital twins, digital intelligence resources (e.g., agents), etc., or any combination thereof. In some embodiments, different administrators 2510 may have different administrative rights for different hospital resources 2530. In some embodiments, a particular administrator 2510 may only manage a particular hospital resource 2530. The administrative space application 2520 may be presented through a display of the terminal device of the administrator 2510.
In some embodiments, the administrative space application 2520 may present an interface. The interface may be presented through a display of the terminal device of the administrator 2510. The display may include a light emitting diode (LED) display, a liquid crystal display (LCD), an electronic ink display, a touchscreen LCD, an organic LED (OLED) touchscreen, or the like, or any combination thereof. The terminal device may include a mobile device, an XR device, a smart wearable device, etc.
In some embodiments, the administrative space application 2520 (or an interface of the administrative space application 2520) may be configured to present a virtual character for assisting the administrator 2510 in managing resources. The virtual character may be configured to communicate with the administrator 2510. The administrative space application 2520 may also be configured to provide an interface element for the administrator 2510 to initiate a communication session with the virtual character. In some embodiments, the administrator 2510 may initiate a communication session with the virtual character via the interface element to express his/her resource management needs via voice, text, gestures, etc. (e.g., viewing information related to a particular type of resource, scheduling a particular type of resource, updating parameters related to a particular type of resource). A processing device (e.g., the processing device 210) may analyze the administrator's input (e.g., using AI techniques such as agents), determine feedback information, and communicate the feedback information to the administrator 2510 via the virtual character in the communication session. In some embodiments, the content displayed by the administrative space application 2520 may be updated according to the communication content of the communication session.
The interface of a conventional hospital management system is limited to providing the user with preset analysis results. The interface of the hospital management system described in the present application enables users to retrieve data across different dimensions. In addition, the virtual character allows users to express their needs in natural language, thereby enhancing the user interaction experience and improving the quality and efficiency of resource management services.
In some embodiments, as shown in fig. 25, the hospital resources 2530 may include digital twins 2531. A digital twin may map the state of a corresponding physical entity and be generated according to a preset data structure. As used herein, a digital twin refers to a virtual representation (or digital copy) of a physical entity. A physical entity refers to any object or phenomenon that exists in the physical world and can be directly or indirectly observed, measured, or interacted with.
In some embodiments, the preset data structure of a digital twin may specify a format of the digital twin, a type of information reflected by the digital twin, a storage address of the digital twin, an access entry of the digital twin, an update mode of the digital twin, modification rights of the preset data structure, and the like, or any combination thereof. In some embodiments, the preset data structure may be a default data structure provided by the hospital or a custom data structure set by the administrator 2510 through the administrative space application 2520.
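The fields enumerated above can be illustrated with a minimal data-structure sketch. All field names and default values below are assumptions made for the example; the specification does not prescribe them.

```python
# Illustrative preset data structure for a digital twin, covering the items
# the description enumerates: format, information type, storage address,
# access entry, update mode, and modification rights.
from dataclasses import dataclass, field

@dataclass
class PresetDataStructure:
    format: str                    # e.g. "json", "3d-model"
    info_type: str                 # type of information the twin reflects
    storage_address: str           # where the twin's data is stored
    access_entry: str              # how the twin is accessed
    update_mode: str               # e.g. "entity-to-twin" or "twin-to-entity"
    modify_rights: list = field(default_factory=list)  # who may change the structure

def default_structure() -> PresetDataStructure:
    """A hospital-provided default; an administrator may customize it."""
    return PresetDataStructure(
        format="json",
        info_type="device-status",
        storage_address="twin-store://default",
        access_entry="/twins/{id}",
        update_mode="entity-to-twin",
        modify_rights=["administrator"],
    )
```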
In some embodiments, the digital twins 2531 can include one or more first digital twins 25311. For each first digital twin, mapping the state of the corresponding physical entity may include updating the first digital twin based on an update of the state of the corresponding physical entity. Exemplary first digital twins may include a digital twin corresponding to a public area of the hospital, a digital twin corresponding to a medical service, a digital twin corresponding to a user (e.g., a patient, an organ of a patient, a doctor), a digital twin corresponding to a hardware device of the hospital, and the like. In some embodiments, the state update of the corresponding physical entity may be detected from real-time information of the corresponding physical entity. For example, the real-time information of the corresponding physical entity may include at least one of information collected by an in-hospital sensing apparatus or information collected by a user terminal associated with the hospital.
In some embodiments, the digital twins 2531 can include one or more second digital twins 25312. For each second digital twin, mapping the state of the corresponding physical entity may include updating the corresponding physical entity based on an update of the second digital twin. Exemplary second digital twins may include a digital twin corresponding to a hardware device, a digital twin corresponding to a user service, a digital twin corresponding to a healthcare procedure, and so on. It should be appreciated that a digital twin may be a first digital twin, a second digital twin, or both.
In some embodiments, the administrator 2510 may manage the digital twins 2531 through the administrative space application 2520. Since a digital twin maps the state of the corresponding physical entity, the physical entities of the hospital can be managed by managing the digital twins 2531. For example, the administrator 2510 may view a first digital twin 25311 of a physical entity (e.g., a 3D digital twin model of a patient organ or a hardware device, a digital twin view of a public area) through the administrative space application 2520 to understand and evaluate the state of the physical entity. Optionally, the administrator may change the display angle, display size, etc. of the displayed first digital twin 25311.
In some embodiments, the one or more first digital twins 25311 can include a digital twin corresponding to a public area of the hospital. Based on the digital twin corresponding to the public area, monitoring of the public area can be realized. In some embodiments, the digital twin corresponding to the public area may reflect a digital twin view of the public area generated based on real-time information of the public area. The digital twin view may be used to monitor, analyze, and predict the performance and operation of the public area in real time, so that insight into the current state of the public area may be provided. In some embodiments, the digital twin view of the public area may present a real-time 3D map of the public area and monitoring metrics of the public area. The monitoring metrics may include monitoring metrics related to users in the public area, monitoring metrics related to events in the public area, monitoring metrics related to devices in the public area, and the like, or any combination thereof.
In some embodiments, the one or more first digital twins 25311 can include a digital twin corresponding to a medical service. The digital twin corresponding to a medical service may be used to conduct a medical service assessment of the medical service. For example, the digital twin corresponding to a medical service may reflect operational metrics of the medical service, which are used to evaluate the quality, efficiency, profitability, etc. of the medical service.
In some embodiments, the one or more first digital twins 25311 can include other digital twins that are updated based on state updates of the corresponding physical entities. For example only, the first digital twins may include a digital twin of a patient or of an organ of the patient, which may be updated once new medical data (e.g., a new medical image, new examination data) of the patient is acquired. As another example, the first digital twins may include a digital twin of a hardware device, which may be updated once a usage state, operating parameters, etc. of the hardware device are updated.
In some embodiments, for each of at least a portion of the one or more first digital twins 25311, the first digital twin may be updated in a first manner in response to detecting that the state update of the corresponding physical entity is normal, and updated in a second manner in response to detecting that the state update of the corresponding physical entity is abnormal. The first manner may be different from the second manner. In other words, the first digital twin may be updated in different ways when the corresponding physical entity undergoes a normal state change and an abnormal state change. For example, in response to detecting that the state update of the corresponding physical entity is normal, the first digital twin may display a first marker symbol (or a first color). Therefore, the first digital twin can be utilized to monitor abnormalities of the physical entity, so that the physical entity can be adjusted conveniently and in a timely manner.
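The two update manners can be sketched minimally as follows, modeling the "manners" as two marker colors, which is one example the description gives. The anomaly check, color values, and field names are illustrative assumptions.

```python
# Minimal sketch: update a first digital twin differently for normal vs.
# abnormal state updates of the corresponding physical entity. The anomaly
# condition and colors below are illustrative assumptions.

NORMAL_COLOR = "green"    # first marker color (assumed)
ABNORMAL_COLOR = "red"    # second marker color (assumed)

def update_first_twin(twin: dict, entity_state: dict) -> dict:
    """Mirror the entity state onto the twin and mark anomalies."""
    twin = dict(twin)              # do not mutate the caller's twin
    twin["state"] = entity_state   # entity-to-twin state mapping
    # Illustrative anomaly check on one state field.
    is_abnormal = entity_state.get("temperature", 0) > 80
    twin["marker_color"] = ABNORMAL_COLOR if is_abnormal else NORMAL_COLOR
    return twin
```

In a real system the anomaly check would come from the entity's own monitoring rules rather than a single hard-coded threshold.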
In some embodiments, the one or more second digital twins 25312 can include a digital twin corresponding to a hardware device, and the digital twin corresponding to the hardware device may reflect parameters of the hardware device. The second digital twin corresponding to the hardware device may be used to update/set the parameters of the hardware device.
In some embodiments, the hardware device may include a display device that displays information related to a medical service. The digital twin corresponding to the hardware device may reflect display parameters of the display device, which may be set or updated by the administrator 2510 through the administrative space application 2520. For example, the administrator 2510 may adjust the display content of the display device by updating the digital twin corresponding to the hardware device.
In some embodiments, the one or more second digital twins 25312 can include a digital twin corresponding to a user service, and the digital twin corresponding to the user service may reflect parameters of the user service. In some embodiments, the parameters of the user service may be updated/set using the digital twin corresponding to the user service. Exemplary parameters of the user service may include the manner in which the user service is provided, the requirements for a user to use the user service, the content of the user service, etc., or any combination thereof. In some embodiments, the user service may be accessed from a patient space application installed in a patient terminal or a doctor space application installed in a doctor terminal.
In some embodiments, the one or more second digital twins 25312 can include a digital twin corresponding to a healthcare procedure, and the digital twin corresponding to the healthcare procedure may reflect parameters of the healthcare procedure. In some embodiments, the parameters of the healthcare procedure may include a standard operating procedure (SOP) specifying standard links in the healthcare procedure. In some embodiments, the SOP may further specify a preset data acquisition protocol. In some embodiments, the second digital twin corresponding to the healthcare procedure may be configured to update/set the parameters of the healthcare procedure.
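The defining property of a second digital twin (updates to the twin drive the corresponding physical entity) can be sketched as follows. The device class and its `apply` method are illustrative stand-ins, not an API from this description.

```python
# Hypothetical sketch of a second digital twin: updating the twin's
# parameters pushes the update to the corresponding physical entity
# (here, a display device used as an illustrative stand-in).

class DisplayDevice:
    """Stand-in for a physical hardware device of the hospital."""
    def __init__(self):
        self.params = {}

    def apply(self, params: dict) -> None:
        """Apply new parameters to the physical device."""
        self.params.update(params)

class SecondDigitalTwin:
    """Twin whose updates drive the corresponding physical entity."""
    def __init__(self, entity: DisplayDevice):
        self.entity = entity
        self.params = {}

    def update(self, new_params: dict) -> None:
        self.params.update(new_params)
        # Twin-to-entity mapping: the physical entity follows the twin.
        self.entity.apply(new_params)
```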
A digital twin can be instantiated with data obtained from sensors and other sources to dynamically model the corresponding real-world entity in real time. According to the hospital management system disclosed herein, digital twins can be deployed for monitoring, analysis, simulation, and control, providing valuable insights to optimize the performance of the hospital management system and to increase overall efficiency.
In some embodiments, the hospital resources 2530 may include digital intelligence resources of the hospital. Digital intelligence resources may include various types of digital assets and tools enhanced by artificial intelligence to support and improve different aspects of digital operations, learning, and management. In some embodiments, as shown in fig. 25, the digital intelligence resources may include agents 2532. In some embodiments, the administrator 2510 may manage the agents 2532 through the administrative space application 2520. For example, the administrator 2510 can view and modify information related to the agents through the administrative space application 2520. Since the agents 2532 are involved in processing data to realize user services, the user services can be managed by managing the agents 2532.
In some embodiments, the administrative space application 2520 can present basic configuration data used by at least a portion of the agents 2532. The basic configuration data may be updated by the administrator 2510 through the administrative space application 2520. The basic configuration data includes key information that an agent relies on when providing a particular service. Exemplary basic configuration data may include at least one of a dictionary (e.g., a word list, a staff directory), a knowledge database, or a template.
In some embodiments, the administrative space application 2520 can present operation metrics of at least a portion of the agents 2532. The operation metrics may reflect information about the quantity and quality of services provided by an agent. The operation metrics of an agent may include the number of users served by the agent, the number of services provided by the agent, the amount of data processed by the agent, the quality of service of the agent, etc., or any combination thereof.
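An agent's basic configuration data and operation metrics, as they might be presented to the administrator, can be sketched together in one record. All field names and the running-average quality score are illustrative assumptions.

```python
# Illustrative record of an agent's basic configuration data and operation
# metrics; field names are assumptions made for the example.
from dataclasses import dataclass, field

@dataclass
class AgentRecord:
    name: str
    # Basic configuration data the agent relies on for its service.
    dictionary: dict = field(default_factory=dict)      # e.g. a staff directory
    knowledge_base: list = field(default_factory=list)
    templates: list = field(default_factory=list)
    # Operation metrics reflecting service quantity and quality.
    users_served: int = 0
    services_provided: int = 0
    data_processed_mb: float = 0.0
    quality_score: float = 0.0  # e.g. an average user rating

    def record_service(self, data_mb: float, rating: float) -> None:
        """Update the operation metrics after one completed service."""
        self.services_provided += 1
        self.data_processed_mb += data_mb
        # Running average of the quality score over all services.
        n = self.services_provided
        self.quality_score += (rating - self.quality_score) / n
```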
In some embodiments, the agents 2532 may include agents corresponding to different types of healthcare providers, different hospital departments, different healthcare procedures, different user services, and the like. According to some embodiments of the present description, the configurations of various agents in a hospital may be customized, and the adaptability and application scope of the agents may be enhanced, thereby significantly improving the accuracy and efficiency of the user services supported by the agents while enhancing the treatment experience of patients.
In some embodiments, the hospital management system 2500 may further include a processing device. The processing device may process data and/or information obtained from the hospital management system 2500. In some embodiments, the processing device may receive information and/or instructions entered by the administrator 2510 through the administrative space application 2520 and provide corresponding feedback after processing the information and instructions. For example, the processing device 210 may receive update information related to the digital twin corresponding to a hardware device from an administrator terminal presenting the administrative space application 2520 and control the hardware device to update its configuration based on the update information. For example, the processing device 210 may store the update information in a storage device and send an update notification to the hardware device, so that the hardware device obtains the update information from the storage device to update its configuration. As another example, the processing device 210 may receive update information related to an agent through the administrative space application 2520 and control the agent to perform certain operations.
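The store-then-notify flow described above can be sketched as follows. All class and method names are illustrative stand-ins for the storage device, hardware device, and processing-device logic.

```python
# Minimal sketch of the described update flow: the processing device stores
# the update information, then notifies the hardware device, which fetches
# the update from storage and applies it. All classes are illustrative.

class StorageDevice:
    """Stand-in for the storage device holding update information."""
    def __init__(self):
        self._store = {}
    def put(self, key: str, value: dict) -> None:
        self._store[key] = value
    def get(self, key: str) -> dict:
        return self._store[key]

class HardwareDevice:
    """Stand-in for a hardware device that updates its own configuration."""
    def __init__(self, device_id: str, storage: StorageDevice):
        self.device_id = device_id
        self.storage = storage
        self.config = {}
    def on_update_notification(self) -> None:
        # Fetch the update information from storage and apply it.
        self.config.update(self.storage.get(self.device_id))

def process_twin_update(storage: StorageDevice, device: HardwareDevice,
                        update_info: dict) -> None:
    """Processing-device side: store the update, then notify the device."""
    storage.put(device.device_id, update_info)
    device.on_update_notification()
```

Decoupling the update through storage lets the device fetch the configuration on its own schedule rather than receiving it inline in the notification.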
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are intended to be suggested by this specification, and therefore fall within the spirit and scope of the exemplary embodiments of this specification.
Meanwhile, this specification uses specific words to describe its embodiments. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection therewith is included in at least one embodiment of the present description. Thus, it should be emphasized and appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, the order in which elements and sequences are processed, and the use of numbers, letters, or other designations in this description, are not intended to limit the order in which the processes and methods of this description are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation disclosed in this specification and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure does not imply that the subject matter of the present description requires more features than are set forth in the claims. Indeed, claimed subject matter may lie in less than all features of a single embodiment disclosed above.
In some embodiments, numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of the embodiments are modified in some examples by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows for a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought to be obtained by individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-retention method. Although the numerical ranges and parameters set forth herein are approximations that may be employed in some embodiments to confirm the breadth of the range, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., referred to in this specification is incorporated herein by reference in its entirety. Excluded are application history documents that are inconsistent with or conflict with the content of this specification, as well as documents (currently or later attached to this specification) that limit the broadest scope of the claims of this specification. It should be noted that, if the description, definition, and/or use of a term in material attached to this specification does not conform to or conflicts with the content of this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments in this specification are merely illustrative of the principles of the embodiments in this specification. Other variations are also within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.