The present application claims priority from international application number PCT/CN2024/109064, filed on July 31, 2024, the entire contents of which are incorporated herein by reference.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
The terms "a," "an," "the," and/or "the" are not specific to the singular, but may include the plural, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the steps and elements are explicitly identified, and they do not constitute an exclusive list, as other steps or elements may be included in a method or apparatus.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be appreciated that the preceding or following operations are not necessarily performed precisely in order. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Some embodiments of the present description provide a hospital support platform including a hardware layer (also referred to as a hardware device module), an interface layer (also referred to as an interface module), a data processing layer (also referred to as a data processing module), an application development layer (also referred to as an application development module), and a service layer (also referred to as a meta-service layer, a meta-service module, or a service module). The hardware layer is configured to collect data related to hospital services. The interface layer is configured to acquire the data from the hardware layer and send the data to the data processing layer. The data processing layer comprises a plurality of data processing units configured to acquire the data sent by the interface layer and process the data by at least one of the plurality of data processing units so as to realize user services related to the hospital services. The application development layer is configured to provide an open interface for a developer to call at least a portion of the data processing units and perform application development using the called data processing units. The service layer is configured to provide user space applications for relevant users of the hospital services to acquire the user services related to the hospital services.
In some embodiments, the hospital support platform includes hardware devices (as a hardware layer), software-hardware interfaces (as an interface layer), data centers and processing devices (as a data processing layer), open interfaces (as an application development layer), and user space applications (as a service layer). The hardware devices are configured to collect hospital-business-related data. The software-hardware interface is configured to obtain data from the hardware devices and send the data to the data center for storage. The data center includes a data lake configured to persist data in a tamper-resistant manner. The processing device is configured with data processing units, such as an augmented reality unit, an artificial intelligence unit, a digital twin unit, and a data flow unit. The processing device may process at least a portion of the data stored in the data center using at least one of the data processing units described above to effectuate a user service. The open interface is configured to allow an application developer to invoke at least a portion of the data processing units and to utilize them for application development. A further description of hardware devices, data centers, processing devices, user space applications, open interfaces, and software-hardware interfaces may be found in fig. 3.
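The layered data flow described above can be summarized in a short sketch. This is a minimal illustration only; all class, method, and unit names below are assumptions introduced for this example and are not part of the platform itself.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class HardwareLayer:
    """Collects data related to hospital services (e.g., from IoT devices)."""
    def collect(self) -> dict:
        return {"device_id": "monitor-01", "heart_rate": 72}

@dataclass
class InterfaceLayer:
    """Acquires data from the hardware layer and forwards it unchanged."""
    hardware: HardwareLayer
    def acquire(self) -> dict:
        return self.hardware.collect()

@dataclass
class DataProcessingLayer:
    """Routes data through one of several registered data processing units."""
    units: Dict[str, Callable[[dict], dict]] = field(default_factory=dict)
    def process(self, unit_name: str, data: dict) -> dict:
        return self.units[unit_name](data)

# Wiring the layers together. The service layer would expose the result to a
# user space application; the application development layer would expose
# `units` through an open interface for third-party developers.
hardware = HardwareLayer()
interface = InterfaceLayer(hardware)
processing = DataProcessingLayer(units={"ai": lambda d: {**d, "risk": "low"}})
print(processing.process("ai", interface.acquire()))
```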
Some embodiments of the present description provide a hospital support platform configured to comprehensively manage various types of hospital resources, including hardware resources, software resources, data resources, and the like. In some embodiments, the hospital support platform integrates a plurality of data processing units supporting various advanced technologies, including Artificial Intelligence (AI) technology, augmented reality (XR) technology, digital twin technology, and blockchain technology, among others, which are used to improve the efficiency and quality of service in the medical industry. For example, artificial intelligence technology can enable autonomous evolution and continuous optimization of hospital operations, while XR technology and digital twin technology can enable the creation and maintenance of virtual hospitals. The virtual hospital can interact with the user and provide the user with an immersive and novel virtual-real fusion service experience. In addition, the hospital support platform further comprises an application development layer, so that third-party developers in the medical industry can access these advanced technologies, build an open medical ecosystem, and promote the development and innovation of applications, thereby advancing the medical industry.
Fig. 1 is a block diagram of an exemplary healthcare system shown according to some embodiments of the present application.
The healthcare system 100, which may also be referred to as a metahospital system, is built based on a variety of innovative technologies including metaverse technology, XR technology (e.g., Augmented Reality (AR) technology, Virtual Reality (VR) technology, Mixed Reality (MR) technology, etc.), AI technology, digital twin technology, IoT technology, data flow technology (e.g., blockchain technology, data privacy computing technology), spatial computing technology, image rendering technology, etc.
As shown in fig. 1, the healthcare system 100 may include a physical hospital 110, a virtual hospital 130, a user space application 120, and a hospital support platform 140. In some embodiments, the hospital support platform 140 may map data related to the physical hospital 110 into a virtual hospital 130 corresponding to the physical hospital 110 and provide user services to related users of the physical hospital 110 through the user space application 120.
The physical hospital 110 refers to a hospital existing in the physical world and having a tangible attribute. Health care institutions that provide medical, surgical and psychiatric care and treatment for humans are collectively referred to herein as hospitals.
As shown in fig. 1, the physical hospital 110 may include a plurality of physical entities. For example, the plurality of physical entities may include departments, users, hardware devices, user services, public areas, medical service procedures, and the like, or any combination thereof.
A department refers to a specialized unit within a hospital that provides a particular type of medical care, treatment, or service. Each department may focus on a particular medical field and may be equipped with healthcare professionals having expertise in that field. For example, the departments may include an outpatient department, an inpatient department, a surgical department, a support department (e.g., a registration department, a pharmacy department), a medical department, a specialty medical department, a child care department, etc., or any combination thereof.
The user may include any user associated with the physical hospital 110 (also referred to as a related user of the physical hospital 110). For example, the user may include a patient (or a portion of a patient (e.g., an organ)), a physician, a visitor of a patient, a hospital staff member of the physical hospital 110, a supplier of the physical hospital 110, an application developer of the physical hospital 110, or the like, or any combination thereof. Hospital staff of the physical hospital 110 may include healthcare providers (e.g., doctors, nurses, technicians, etc.), hospital administrators, support staff, or the like, or any combination thereof. Exemplary hospital administrators may include department care administrators, clinical administrators, department-level administrators, hospital management, functional management staff, or the like, or any combination thereof.
The hardware devices may include hardware devices located in the physical hospital 110 and/or hardware devices in communication with hardware devices in the physical hospital 110. Exemplary hardware devices may include terminal devices, healthcare devices, sensing devices, base devices, etc., or any combination thereof.
The terminal device may comprise a terminal device that interacts with a user of the healthcare system 100. For example, the terminal devices may include terminal devices that interact with the patient (also referred to as patient terminals), terminal devices that interact with the patient's doctor (also referred to as doctor terminals), terminal devices that interact with a nurse (also referred to as nurse terminals), terminal devices that interact with a remote medical-care seeker (also referred to as remote terminal devices), public terminals of the hospital (e.g., clinic room terminals, bedside terminal devices, terminal devices in waiting areas, intelligent surgical terminals), etc., or any combination thereof. In the present application, unless explicitly obtained from the context or otherwise stated in the context, the terminal devices owned by the user and the terminal devices provided to the user by the physical hospital 110 are collectively referred to as the user's terminal devices or the terminal devices interacting with the user.
The terminal device may include a mobile terminal, an XR device, an intelligent wearable device, etc. The mobile terminal may include a smart phone, a Personal Digital Assistant (PDA), a display, a gaming device, a navigation device, a point-of-sale (POS) terminal, a tablet computer, etc., or any combination thereof.
The XR device may comprise a device that allows a user to participate in an augmented reality experience. For example, the XR device may include VR components, AR components, MR components, and the like, or any combination thereof. In some embodiments, the XR device may include an XR helmet, XR glasses, an XR patch, a stereo headset, or the like, or any combination thereof. For example, the XR device may include Google Glass™, Oculus Rift™, Gear VR™, Apple Vision Pro™, and the like. In particular, the XR device may include a display component on which virtual content may be presented and/or displayed. In some embodiments, the XR device may further comprise an input component. The input component can enable interaction between a user and virtual content (e.g., a virtual surgical environment) displayed by the display component. For example, the input component may include a touch sensor, microphone, image sensor, etc. configured to receive user input that may be provided to the XR device and used to control the virtual world by changing visual content presented on the display component. The input components may include handles, gloves, styluses, consoles, and the like.
The intelligent wearable device may include an intelligent wristband, intelligent footwear, intelligent glasses, intelligent helmet, intelligent watch, intelligent garment, intelligent backpack, intelligent accessory, etc., or any combination thereof. In some embodiments, the smart wearable device may acquire physiological data of the user (e.g., heart rate, blood pressure, body temperature, etc.).
The healthcare device may be configured to provide healthcare to the patient. For example, the healthcare device may include an examination device, a care device, a treatment device, etc., or any combination thereof.
The examination device may be configured to provide examination services to a patient, e.g., to collect examination data of the patient. Exemplary examination data may include heart rate, respiratory rate, body temperature, blood pressure, medical imaging data, body fluid test reports (e.g., blood test reports), and the like, or any combination thereof. Accordingly, the examination device may include a vital sign monitor (e.g., blood pressure monitor, blood glucose meter, heart rate meter, thermometer, digital stethoscope, etc.), a medical imaging device (e.g., Computed Tomography (CT) device, Digital Subtraction Angiography (DSA) device, Magnetic Resonance (MR) device, etc.), a laboratory device (e.g., blood routine examination device, etc.), etc., or any combination thereof.
The care device may be configured to provide care services to the patient and/or assist the healthcare provider in providing care services. Exemplary care devices may include hospital beds, patient care robots, intelligent care carts, intelligent kits, intelligent wheelchairs, and the like.
The treatment device may be configured to provide treatment services to the patient and/or assist the medical service provider in providing treatment services. Exemplary treatment devices may include surgical devices, radiation treatment devices, physical treatment devices, and the like, or any combination thereof.
The sensing device may be configured to gather sensing information related to the environment in which it is located. For example, the sensing device may include an image sensor, a sound sensor, or the like. The image sensor may be configured to collect image data in the physical hospital 110 and the sound sensor may be configured to collect voice signals in the physical hospital 110. In some embodiments, the sensing device may be a stand-alone device or may be integrated into another device. For example, the sound sensor may be part of a medical service device or a terminal device.
The base device may be configured to support data transmission, storage, and processing. For example, the base devices may include networks, machine room facilities, computing devices, computing chips, storage devices, and the like.
In some embodiments, at least a portion of the hardware devices of the physical hospital 110 are IoT devices. An internet of things device refers to a device with sensors, processing power, software and other technologies that connect and exchange data with other devices and systems through the internet or other communication networks. For example, one or more healthcare devices and/or sensing devices of the physical hospital 110 are internet of things devices and are configured to transmit collected data to the hospital support platform 140 for storage and/or processing.
The user services may include any service provided by the hospital support platform 140 to the user. For example, user services include medical services provided to patients and/or accompanying persons, support services provided to staff members of physical hospital 110 and/or suppliers of physical hospital 110, and the like. In some embodiments, user services may be provided to patients, doctors, and hospital administrators through the user space application 120, which will be described in detail in the following description.
The public area refers to a shared space accessible to users (or portions of users) in the physical hospital 110. For example, the public area may include a reception area (e.g., a foreground), a waiting area, hallways, etc., or any combination thereof.
A medical service procedure is a procedure through which a corresponding medical service is provided to a patient. Medical service procedures typically include several links and/or steps through which a user may need to pass to obtain a corresponding medical service. Exemplary medical service procedures may include an outpatient procedure, an inpatient procedure, a surgical procedure, or the like, or any combination thereof. In some embodiments, the medical service procedures may include corresponding procedures for different departments, different diseases, and the like. In some embodiments, a preset data acquisition protocol may be established that specifies the standard links involved in the medical service procedure and how data related to the procedure is to be acquired, as illustrated by the sketch below.
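As a hedged illustration of what such a preset data acquisition protocol might look like, the sketch below encodes the standard links of a hypothetical outpatient procedure together with the data to be acquired at each link; the link names and field names are assumptions for this example only, not a prescribed schema.

```python
# Hypothetical preset data acquisition protocol for an outpatient procedure.
# Both the link names and the data fields are illustrative assumptions.
OUTPATIENT_ACQUISITION_PROTOCOL = {
    "procedure": "outpatient",
    "links": [
        {"name": "registration",     "acquire": ["patient_id", "department", "timestamp"]},
        {"name": "pre_consultation", "acquire": ["chief_complaint", "vital_signs"]},
        {"name": "consultation",     "acquire": ["diagnosis", "prescription"]},
        {"name": "examination",      "acquire": ["exam_type", "exam_report"]},
    ],
}

def fields_for_link(protocol: dict, link_name: str) -> list[str]:
    """Return the data fields the protocol specifies for a given standard link."""
    for link in protocol["links"]:
        if link["name"] == link_name:
            return link["acquire"]
    raise KeyError(f"link not defined in protocol: {link_name}")

print(fields_for_link(OUTPATIENT_ACQUISITION_PROTOCOL, "consultation"))
```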
The user space application 120 provides the user with access to user services provided by the hospital support platform 140. The user space application 120 may be an application, plug-in, website, applet, or any other suitable form. For example, the user space application 120 is an application installed on a user terminal device that includes a user interface for a user to initiate requests and receive corresponding services.
In some embodiments, user space application 120 may include different applications corresponding to different types of users. For example, the user space application 120 includes a patient space application corresponding to a patient, a medical space application corresponding to a doctor, a management space application corresponding to an administrator, and the like, or any combination thereof. User services provided through the patient space application, the medical space application, and the management space application are also referred to as a patient space service, a medical space service, and a management space service, respectively. Exemplary patient space services include registration services, path guidance services, pre-consultation services, remote consultation services, hospitalization services, discharge services, and the like. Exemplary medical space services include scheduling services, surgical planning services, surgical simulation services, patient management services, remote ward services, remote outpatient services, and the like. Exemplary management space services include monitoring services, medical service assessment services, device parameter setting services, service parameter setting services, resource scheduling services, and the like.
In some embodiments, the patient space application, the medical space application, and the management space application may be integrated into one user space application 120, and the user space application 120 may be configured to provide access portals for each type of user (e.g., patient, healthcare provider, manager, etc.). By way of example only, a particular user may have a corresponding account number that may be used to log into a user space application, view corresponding diagnostic data, and obtain corresponding user services.
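A brief sketch of how such an integrated application might route a logged-in account to the access portal for its user type; the role strings, field names, and functions below are assumptions introduced for illustration, not the platform's actual design.

```python
from dataclasses import dataclass

@dataclass
class Account:
    account_id: str
    role: str  # "patient", "doctor", or "administrator"

# Mapping from user role to the corresponding access portal.
PORTALS = {
    "patient": "patient_space",
    "doctor": "medical_space",
    "administrator": "management_space",
}

def portal_for(account: Account) -> str:
    """Select the access portal for the account's user type."""
    try:
        return PORTALS[account.role]
    except KeyError:
        raise ValueError(f"unsupported role: {account.role}") from None

print(portal_for(Account("u-1001", "doctor")))  # -> medical_space
```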
According to some embodiments of the present description, by providing user space applications for different types of users, each type of user can easily obtain the various user services that he/she may need through his/her corresponding user space application. In addition, users currently often need to install various applications to obtain different user services, which results in poor user experience and high development costs. Therefore, the user space applications of the present application can improve the user experience, improve service quality and efficiency, enhance service security, and reduce development and operation costs.
In some embodiments, the user space application 120 may be configured to provide access portals for relevant users of the physical hospital 110 to interact with the virtual hospital 130. For example, through the user space application 120, a user may enter instructions for retrieving digital content of the virtual hospital 130 (e.g., digital twin models of hardware devices, patient organs, or public areas), view the digital content, and interact with the digital content. As another example, through the user space application 120, a user may communicate with an avatar representing an agent. In some embodiments, a public terminal of a hospital may install a management space application, and an administrator account of the department to which the public terminal corresponds may be logged into the management space application. The user may receive user services through the management space application installed on the public terminal.
The virtual hospital 130 is a digital twin (i.e., virtual representation or virtual copy) of the physical hospital 110 for simulating, analyzing, predicting, and optimizing the operating state of the physical hospital 110. For example, the virtual hospital 130 may be a real-time digital copy of the physical hospital 110.
In some embodiments, the virtual hospital 130 may be presented to the user using digital technology. For example, when the relevant user interacts with the virtual hospital 130, at least a portion of the virtual hospital 130 may be presented to the relevant user using XR technology. For example only, MR technology may be used to superimpose at least a portion of the virtual hospital 130 on the real-world view of the relevant user.
In some embodiments, the virtual hospital 130 may include a digital twin of a physical entity associated with the physical hospital 110. A digital twin refers to a virtual representation (e.g., a virtual copy, mapping, or digital simulation) of a physical entity. The digital twin can reflect and predict the state, behavior, and performance of the physical entity in real time. For example, the virtual hospital 130 may include digital twins of at least a portion of the medical services, departments, users, hardware devices, user services, public areas, medical service procedures, and the like of the physical hospital 110. The digital twin of a physical entity can take a variety of forms, including models, images, graphics, text, numerical values, and the like. For example, a digital twin may be a virtual hospital corresponding to a physical hospital, virtual personnel (e.g., virtual doctors, virtual nurses, and virtual patients) corresponding to personnel entities (e.g., doctors, nurses, and patients), virtual devices (e.g., virtual imaging devices and virtual scalpels) corresponding to medical service devices (e.g., imaging devices and scalpels), and the like.
In some embodiments, the digital twins may include one or more first digital twins and/or one or more second digital twins. The state of each first digital twin may be updated based on an update of the state of the corresponding physical entity. For example, one or more first digital twins may be updated during the mapping of data associated with the physical hospital 110 to the virtual hospital 130. One or more second digital twins can be updated by at least one of the user space applications 120, and the update of each second digital twin can result in a status update of the corresponding physical entity. In other words, the first digital twin is updated accordingly when the corresponding physical entity changes its state, and the state of the corresponding physical entity changes accordingly when the second digital twin is updated. For example, the one or more first digital twins may include digital twins of a public area, a medical service, a user, a hardware device, etc., and the one or more second digital twins may include digital twins of a hardware device, a user service, a medical service procedure, etc. It should be appreciated that a digital twin may be both a first digital twin and a second digital twin (e.g., a digital twin of a hardware device).
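The two synchronization directions can be sketched as follows. This is a minimal illustration under assumed names (none of the classes or methods below come from the platform itself); it only shows that a first digital twin is driven by physical updates while a second digital twin propagates user updates back to the physical entity.

```python
class DigitalTwin:
    """A virtual representation of a physical entity."""
    def __init__(self, entity_id: str):
        self.entity_id = entity_id
        self.state: dict = {}

class FirstDigitalTwin(DigitalTwin):
    """Physical -> virtual: updated when the physical entity's state changes."""
    def on_physical_update(self, new_state: dict) -> None:
        self.state.update(new_state)

class SecondDigitalTwin(DigitalTwin):
    """Virtual -> physical: a user update triggers a physical state change."""
    def __init__(self, entity_id: str, actuator):
        super().__init__(entity_id)
        self._actuator = actuator  # callable that drives the physical entity

    def apply_user_update(self, new_state: dict) -> None:
        self.state.update(new_state)
        self._actuator(self.entity_id, new_state)  # propagate to the physical world

# A hardware device may play both roles, so both directions may apply to it.
bed = SecondDigitalTwin("bed-12", actuator=lambda eid, s: print("actuate", eid, s))
bed.apply_user_update({"tilt_deg": 30})
```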
According to some embodiments of the present description, the physical hospital 110 (including hardware devices, users, user services, medical service procedures, etc.) may be simulated and tested in a secure and controllable environment by generating a virtual hospital 130 that includes digital twins of physical entities associated with the physical hospital 110. Through virtual-real linkage (e.g., real-time interaction between the physical hospital 110 and the virtual hospital 130), various medical scenarios can be more accurately predicted and responded to, thereby improving the quality and efficiency of medical services. In addition, the application of XR technology and virtual-real fusion technology makes the interaction of related users more natural and intuitive and provides a more comfortable and efficient medical environment, thereby improving the user experience.
In some embodiments, the virtual hospital 130 may further include agents that implement self-evolution based on data related to the physical hospital 110 and AI technology.
An agent refers to an entity that acts in an intelligent manner. For example, an agent may include a computing/software entity that can autonomously learn and evolve, and sense and analyze data to perform specific tasks and/or achieve specific goals (e.g., healthcare procedures). Through AI techniques (e.g., reinforcement learning, deep learning, etc.), an agent can constantly learn and self-optimize through interactions with the environment. In addition, the agent can collect and analyze massive data (e.g., data related to the physical hospital 110) through big data technology, mine patterns and learn rules from the data, and optimize decision flows, thereby identifying environmental changes in uncertain or dynamic environments, responding quickly, and making reasonable judgments. For example, agents may learn and evolve autonomously based on AI technology to accommodate changes in the physical hospital 110. By way of example only, agents may be built based on NLP technology (e.g., large language models, etc.) and may automatically learn and autonomously update through large amounts of language text (e.g., hospital business data and patient feedback information) to improve the quality of user services provided by the physical hospital 110.
In some embodiments, the agents may include different types of agents corresponding to different healthcare procedures, different user services, different departments, different diseases, different hospital positions (e.g., nurses, doctors, technicians, etc.), different links of healthcare procedures, and the like. A particular type of agent is used to process tasks corresponding to the particular type. In some embodiments, one agent may correspond to a different healthcare procedure (or a different healthcare, or a different department, or a different disease, or a different hospital location). In some embodiments, an agent may operate with reference to basic configuration data (e.g., dictionary, knowledge graph, template, etc.) of a department and/or disease corresponding to the agent. In some embodiments, multiple agents may cooperate and share information through network communications to collectively accomplish complex tasks.
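One possible way to organize such type-specific agents is a registry keyed by task type, as in the following sketch; the registry structure and all names are illustrative assumptions, and real agents would be far richer than the toy callables used here.

```python
from typing import Callable, Dict

# Registry mapping a task type (e.g., a healthcare procedure, department,
# or disease) to the agent responsible for it. Agents are plain callables
# here for brevity.
AGENT_REGISTRY: Dict[str, Callable[[dict], dict]] = {}

def register_agent(task_type: str, agent: Callable[[dict], dict]) -> None:
    AGENT_REGISTRY[task_type] = agent

def dispatch(task_type: str, task_data: dict) -> dict:
    """Route a task to the agent registered for its type."""
    agent = AGENT_REGISTRY.get(task_type)
    if agent is None:
        raise LookupError(f"no agent registered for task type: {task_type}")
    return agent(task_data)

# Example: a toy agent for the hospitalization procedure.
register_agent("hospitalization", lambda data: {"action": "admission_check", **data})
print(dispatch("hospitalization", {"patient_id": "p-7"}))
```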
In some embodiments, a configuration of the agent may be provided. For example, basic configuration data for use by the agent in operation may be set. The basic configuration data may include dictionaries, knowledge databases, templates, etc. As another example, usage rights of the agent may be set for different users. In some embodiments, an administrator of the physical hospital 110 may set the configuration of the agent through a managed space application.
In some embodiments, the agent may be integrated into or deployed on a hardware device. For example, agents corresponding to hospitalization services may be integrated into a hospital bed or a presentation device of a hospital bed. In some embodiments, the agent may be integrated into or deployed on an intelligent robot. An embodied intelligent robot refers to a robotic system that combines physical presence (embodiment) with intelligent behavior (cognition). The embodied intelligent robot may be configured to interact with the real world in a manner that mimics or complements human capabilities, utilizing physical morphology and cognitive functions to perform tasks, make decisions, and adapt to the environment. By utilizing artificial intelligence and sensor technology, the embodied intelligent robot can operate autonomously, interact with the environment, and continuously improve its performance. For example, the embodied intelligent robot may be configured with an agent corresponding to a surgical service and assist a doctor in performing a surgery.
In some embodiments, at least a portion of the user services may be provided based on the agent. For example, at least a portion of the user services may be provided to the relevant users based on the processing results, wherein the processing results are generated by at least one of the agents based on data related to the physical hospital 110. For example only, the data related to the physical hospital 110 may include data related to a healthcare procedure of the physical hospital 110, the agent may include an agent corresponding to the healthcare procedure, and the user service may be provided to an associated user of the healthcare procedure by using the agent processing data corresponding to the healthcare procedure.
The hospital support platform 140 may be configured to provide technical support to the healthcare system 100. For example, the hospital support platform 140 may include computing hardware and software to support innovative technologies including XR technology, AI technology, digital twinning technology, data flow technology, and the like. In some embodiments, the hospital support platform 140 may include at least a storage device for data storage and a processing device for data computation.
In some embodiments, the hospital support platform 140 may support interactions between the physical hospital 110 and the virtual hospital 130. For example, the processing device of the hospital support platform 140 may obtain data related to the physical hospital 110 from the hardware devices and map the data into the virtual hospital 130. Specifically, the processing device of the hospital support platform 140 may update a portion of the digital twins (e.g., one or more first digital twins) in the virtual hospital 130 based on the obtained data, such that each updated digital twin in the virtual hospital 130 reflects the updated status of the corresponding physical entity in the physical hospital 110. Based on digital twins that are continuously updated along with their corresponding physical entities, a user can learn the states of the physical entities related to the physical hospital 110 in real time, thereby realizing monitoring and evaluation of the physical entities. As another example, an agent corresponding to data related to the physical hospital 110 may be trained and/or updated based on the data to self-evolve and self-learn.
In some embodiments, the hospital support platform 140 may support and/or provide user services to the relevant users of the physical hospital 110. For example, in response to receiving a user service request from a user, the processing device of the hospital support platform 140 may provide a user service corresponding to the service request. As another example, in response to detecting a need to provide a user service to a user, the processing device of the hospital support platform 140 may control a physical entity or virtual entity corresponding to the user service to provide the user service. For example, in response to detecting that a patient is being admitted to a hospital ward, the processing device of the hospital support platform 140 may control the intelligent care cart to direct a nurse to the hospital ward for an admission check of the patient.
In some embodiments, at least a portion of the user services may be provided to the relevant users based on interactions between the relevant users and the virtual hospital 130. An interaction refers to a mutual action or effect (e.g., conversations, behaviors, etc.) between the relevant user and the virtual hospital 130. For example, interactions between the relevant user and the virtual hospital 130 may include interactions between the relevant user and a digital twin in the virtual hospital 130, interactions between the relevant user and an agent, interactions between the relevant user and a virtual character, and the like, or any combination thereof.
In some embodiments, at least a portion of the user services may be provided to the associated user based on interactions between the associated user and at least one of the digital twins. For example, an update instruction of the second digital twin input by the relevant user may be received by the user space application 120, and the corresponding physical entity of the second digital twin may be updated according to the update instruction. As another example, a user may view a first digital twin of a physical entity (e.g., a 3D digital twin model of a patient organ or hardware device) through the user space application 120 to learn about the state of the physical entity. Alternatively, the user may change the display angle, display size, etc. of the digital twin.
In some embodiments, the processing device of the hospital support platform 140 may present virtual characters corresponding to the agents through the user space application, interact with the associated user, and provide at least a portion of the user services to the associated user based on the interactions between the associated user and the virtual characters.
In some embodiments, the hospital support platform 140 may have a five-layer structure including a hardware device layer, an interface layer, a data processing layer, an application development layer, and a service layer, see fig. 3 and its associated description. In some embodiments, the hardware devices of the physical hospital 110 may be part of the hospital support platform 140.
According to some embodiments of the present description, a virtual hospital corresponding to a physical hospital may be established by integrating various internal and external resources (e.g., medical service equipment, hospital personnel, medicines and consumables, etc.) of the physical hospital. The virtual hospital may reflect real-time status (e.g., changes, updates, etc.) of physical entities associated with the physical hospital, thereby enabling monitoring and assessment of the physical entities. Such integration may provide accurate data support for the operation and intelligent decision-making of medical services. In addition, through the virtual hospital, users related to medical services can commonly establish an open shared ecosystem, thereby promoting innovation and promotion of medical services.
In addition, linkage between the inside and outside of the hospital enables medical care services covering the patient's whole life cycle. The perspective of medical services extends from mere disease treatment to covering the entire life cycle of a patient, including prevention, diagnosis, treatment, rehabilitation, health management, and the like. By establishing intra- and extra-hospital linkage, the physical hospital can better integrate online and offline resources and provide comprehensive and continuous medical and health services for patients. For example, through remote monitoring and online consultation, the health condition of the patient can be followed in real time, the treatment scheme can be adjusted in time, and the treatment effect can be improved.
Fig. 2 is a schematic diagram of an exemplary healthcare system shown according to some embodiments of the present application.
As shown in fig. 2, the healthcare system 200 may include a processing device 210, a network 220, a storage device 230, one or more healthcare devices 240, one or more sensing devices 250, one or more patient terminals 260 of a patient 261, and one or more doctor terminals 270 of a doctor 271 associated with the patient 261. In some embodiments, components in the healthcare system 200 may be interconnected and/or communicate by a wireless connection, a wired connection, or a combination thereof. The connections between the components of the healthcare system 200 may be variable.
The processing device 210 may process data and/or information obtained from the storage device 230, the healthcare device 240, the sensing device 250, the patient terminal 260, and/or the doctor terminal 270. For example, the processing device 210 may map data related to a physical hospital to a virtual hospital corresponding to the physical hospital and provide user services to the patient 261 and the doctor 271 through the patient terminal 260 and/or the doctor terminal 270, respectively, by processing the data related to the physical hospital. As another example, the processing device 210 may maintain an agent and provide user services to the patient 261 and the doctor 271 through the patient terminal 260 and/or the doctor terminal 270, respectively, by engaging the agent in processing data related to the physical hospital.
In some embodiments, the processing device 210 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 210 may be located locally or remotely from the healthcare system 200. In some embodiments, the processing device 210 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, etc., or any combination thereof.
In some embodiments, processing device 210 may include one or more processors (e.g., single-core processors or multi-core processors). For illustration only, only one processing device 210 is depicted in the healthcare system 200. It should be noted, however, that the healthcare system 200 of the present application may also include multiple processing devices. Thus, in the present application, operations and/or method steps performed by one processing device 210 may also be performed by multiple processing devices jointly or separately.
The network 220 may include any suitable network capable of facilitating the exchange of information and/or data for the healthcare system 200. The network 220 may be or include a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a Bluetooth™ network, a Near Field Communication (NFC) network, or the like, or any combination thereof.
Storage device 230 may store data, instructions, and/or any other information. In some embodiments, the storage device 230 may store data obtained from other components of the healthcare system 200. In some embodiments, storage device 230 may store data and/or instructions that the processing device 210 may execute or use to perform the exemplary methods described herein.
In some embodiments, the data stored in the storage device 230 may include multi-modal data. Multimodal data may include various forms of data (e.g., images, graphics, video, text, etc.), various types of data, data obtained from different sources, data related to different medical services (e.g., diagnosis, surgery, rehabilitation, etc.), and data related to different users (e.g., patients, medical personnel, management personnel, etc.). For example, the data stored in the storage device 230 may include medical data reflecting the health of the patient 261. For example, the medical data may include an electronic health record of patient 261. An electronic health record refers to an electronic file that records various types of patient data (e.g., basic information, examination data, imaging data). For example, the electronic health record may include a three-dimensional model of a plurality of organs and/or tissues of patient 261.
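As a rough illustration of how such a multimodal electronic health record might be modeled in code, consider the sketch below; every class and field name is an assumption introduced for this example rather than the system's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ExaminationRecord:
    exam_type: str                    # e.g., "blood_pressure", "CT"
    value: Optional[str]              # textual/numeric result, if any
    image_uri: Optional[str] = None   # link to imaging data, if any

@dataclass
class ElectronicHealthRecord:
    patient_id: str
    basic_info: dict                                        # demographics etc.
    examinations: List[ExaminationRecord] = field(default_factory=list)
    organ_models: List[str] = field(default_factory=list)   # URIs of 3D models

# A toy record combining text, imaging, and 3D-model references.
ehr = ElectronicHealthRecord(
    patient_id="p-261",
    basic_info={"age": 42},
    examinations=[ExaminationRecord("blood_pressure", "120/80")],
    organ_models=["s3://ehr/p-261/heart.glb"],  # hypothetical storage path
)
print(len(ehr.examinations))
```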
In some embodiments, storage device 230 may include a mass storage device, a removable storage device, volatile read-write memory, read-only memory (ROM), or the like, or any combination thereof. In some embodiments, storage device 230 may include a data lake and a data warehouse, as will be described in detail in connection with fig. 3.
The healthcare device 240 may be used to provide or assist in providing healthcare. As shown in fig. 2, the healthcare device 240 may include a clinic terminal 240-1, a hospital bed 240-2, an intelligent surgical terminal 240-3, an intelligent care cart 240-4, an intelligent wheelchair 240-5, etc., or any combination thereof.
The clinic terminal 240-1 is a terminal device deployed within a clinic room for use by doctors and patients during an outpatient procedure. For example, the clinic terminal 240-1 may include one or more of a screen, a sound output component, an image sensor, or a sound sensor. A doctor interface may be displayed on the screen of the clinic terminal 240-1, and data may be displayed on the doctor interface to facilitate communication between the doctor and patient. Exemplary data may include electronic health records (or portions thereof), pre-consultation records, medical images, 3D organ models, examination results, consultation advice, and the like.
Hospital bed 240-2 refers to a hospital bed that is capable of supporting inpatients in a hospital ward and providing user services to the patient. The hospital bed 240-2 may include beds, bedside terminal equipment, bedside inspection equipment, sensors, and the like, or any combination thereof. The bedside terminal device may include an XR device, a display device, a mobile device, etc., or any combination thereof. In some embodiments, the hospital bed 240-2 may be controlled by an agent corresponding to the hospitalization service, wherein the hospital bed may also be referred to as a smart hospital bed or a meta-hospital bed.
The intelligent surgical terminal 240-3 refers to a device configured with an agent for assisting surgery, and is controlled by the agent corresponding to a surgical service. The intelligent surgical terminal 240-3 may sense interactions (e.g., conversations, behaviors, etc.) between the healthcare provider, the patient, and the agent and obtain data captured by the sensing device 250 to provide surgical assistance. In some embodiments, the intelligent surgical terminal 240-3 may be configured to perform a risk alert for a surgical procedure, generate a surgical record of a surgical procedure, etc., based on the agent configured therein.
The intelligent care cart 240-4 refers to a care cart having an automatic driving function, which can assist in patient treatment and care. For example, the intelligent care cart 240-4 may be configured to guide a nurse to a hospital ward for an admission check of the patient. In some embodiments, the intelligent care cart may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, a care agent). In some embodiments, the intelligent care cart 240-4 may include a cart, a presentation device, one or more examination devices and/or care tools, a sensing device (e.g., an image sensor, a GPS sensor, a sound sensor, etc.), and/or the like. In some embodiments, the intelligent care cart 240-4 may be configured to obtain relevant treatment and care information for the patient and generate physical examination data, care data, and the like. The physical examination data may include vital sign data of the patient. The care data may include detailed records of care operations, such as care time, care operator, care measures, patient response, and the like.
The intelligent wheelchair 240-5 refers to a transport device for intelligently transporting patients. In some embodiments, the intelligent wheelchair 240-5 may be configured to perform autonomous navigation through integrated sensors and maps, locate the patient's position using Radio Frequency Identification (RFID), Bluetooth, or Wi-Fi signals, and identify the patient through biometric technology. In some embodiments, the intelligent wheelchair 240-5 may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, an agent corresponding to a surgical service). In some embodiments, the intelligent wheelchair 240-5 may be configured to generate data (e.g., a record of the interaction between the agent and the patient) by sensing interaction data through built-in cameras/sensors.
The sensing device 250 may be configured to gather sensing information related to the environment in which it is located. In some embodiments, the sensing device 250 may comprise a sensing device in the physical hospital 110. For example, the sensing device 250 may include an image sensor 250-1, a sound sensor 250-2, a temperature sensor, a humidity sensor, and the like.
The patient terminal 260 may be a terminal device that interacts with the patient 261. In some embodiments, patient terminal 260 may include a mobile terminal 260-1, an XR device 260-2, a smart wearable device 260-3, and so forth. Doctor terminal 270 may be a terminal device that interacts with doctor 271. In some embodiments, the doctor terminal 270 may include a mobile terminal 270-1, an XR device 270-2, or the like. In some embodiments, patient 261 may access a user space application (e.g., a patient space application) through patient terminal 260, and doctor 271 may access a user space application (e.g., a medical space application) through doctor terminal 270. In some embodiments, patient 261 and doctor 271 may communicate with each other remotely through patient terminal 260 and doctor terminal 270, thereby providing remote medical services, such as remote outpatient services, remote ward services, remote follow-up services, and the like.
The sensing device 250, patient terminal 260, and doctor terminal 270 may be configured as data sources to provide information to the healthcare system 200. For example, these devices may transmit the collected data to the processing device 210, and the processing device 210 may provide user services based on the received data.
It should be noted that the above description of the healthcare systems 100 and 200 is intended to be illustrative, and not limiting of the scope of the present application. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other features of the example embodiments herein may be combined in various ways to obtain additional and/or alternative example embodiments. For example, the healthcare system 200 may include one or more additional components, such as other users' terminal devices, public terminal devices of a hospital, and the like. As another example, two or more components of the healthcare system 200 may be integrated into a single component.
Fig. 3 is a schematic diagram of the architecture of a hospital support platform according to some embodiments of the present disclosure.
As shown in fig. 3, the hospital support platform 300 may include a hardware layer 310, an interface layer 320, a data processing layer 330, an application development layer 340, and a services layer 350. In this specification, the terms "layer" and "module" may be used interchangeably unless otherwise apparent from the context or otherwise indicated. For example, the hardware layer may also be referred to as a hardware module, the interface layer may also be referred to as an interface module, the data processing layer may also be referred to as a data processing module, the application development layer may also be referred to as an application development module, and the service layer may also be referred to as a service module. It should be understood that the "layers" and "modules" in this specification are used only for logically dividing the components in the hospital support platform and are not intended to be limiting.
A hospital support platform may refer to a platform for providing support for the operation and management of a hospital. For example, the hospital support platform may be used to support the scheduling, coordination, control and processing of resources (e.g., hardware resources, data resources, etc.), and may also be used to support services (e.g., medical services, artificial intelligence services, operational services, etc.) provided to various types of users and institutions.
The hardware layer 310 may provide a hardware basis for real world and digital world interactions, which may include one or more hardware devices related to hospital operation. Exemplary hardware devices include healthcare devices, sensing devices, terminal devices, base devices, and the like.
The healthcare devices may include a variety of devices used in medical services (e.g., diagnosis, treatment, rehabilitation, etc.). For example, the healthcare devices may include large medical instruments such as medical scanning devices (e.g., CT devices, PET/CT devices, MRI/CT devices, ultrasound imaging devices, etc.) and surgical robots. As another example, the healthcare devices may include portable or implantable small medical health devices or sensors such as bedside monitors, hearing aids, smart watches, cardiac pacemakers, and the like.
The sensing equipment is used for sensing the environment where the sensing equipment is located and collecting corresponding sensing information. Exemplary sensing devices include various types of sensors such as image sensors, sound sensors, temperature sensors, humidity sensors, and the like. The sensing device may be a stand-alone device, for example, a monitoring device installed in a clinic. The sensing device may also be integrated in other devices, for example, the sound sensor may be integrated in the terminal device.
The terminal device may interact with a user (e.g., patient, doctor, nurse, hospital administrator, etc.). Exemplary terminal devices include cell phones, tablet computers, notebook computers, wearable devices, displays, and the like. In some embodiments, the terminal device may further include an Extended Reality (XR) device for the user to experience and interact with the immersive digital environment. The XR device may comprise one or a combination of VR device, AR device, MR device, etc.
The base device is used to provide a hardware basis for the transmission, storage and processing of data. The base device may include a network, a machine room facility, a computer device (e.g., a processing device), a computing chip, a storage device, etc.
In some embodiments, hardware devices in hardware layer 310 may be used to collect data related to hospital services. For example, a medical scanning device may be used to acquire medical image data, a terminal device may be used to acquire data of its interaction with a user, and a perception device may be used to acquire perception information.
In some embodiments, one or more of the above hardware devices may be Internet of Things (IoT) devices. An internet of things device refers to a hardware device that can sense, collect, and transmit data through internet interconnection and intercommunication. For example, the healthcare devices and sensing devices described above may be internet of things devices that transmit the collected data to other layers of the hospital support platform 300 for storage and/or processing via communication techniques (e.g., wired/wireless networks, Bluetooth, ZigBee, LoRaWAN, etc.), as roughly sketched below.
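As a rough sketch of such a transmission (the endpoint URL and payload fields are hypothetical, and a real deployment might instead use Bluetooth, ZigBee, LoRaWAN, or a dedicated gateway as noted above), an internet of things device could push a collected reading to the platform over HTTP like this:

```python
import json
import urllib.request

PLATFORM_ENDPOINT = "https://hospital-platform.example/ingest"  # hypothetical URL

def push_reading(device_id: str, reading: dict) -> int:
    """Transmit one collected reading to the hospital support platform."""
    payload = json.dumps({"device_id": device_id, "reading": reading}).encode("utf-8")
    request = urllib.request.Request(
        PLATFORM_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g., 200 on success

# Example call (commented out because the endpoint above is hypothetical):
# push_reading("monitor-01", {"heart_rate": 72, "spo2": 98})
```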
In some embodiments, at least some of the hardware devices (e.g., each of the hardware devices) in the hardware layer 310 need to meet a preset hardware standard. The preset hardware standard may include, but is not limited to, a device specification (size, model number of the device), a data transfer protocol (e.g., manner of data transfer, structure of data, data type, etc.), a device manufacturer, etc.
In some embodiments, hardware layer 310 may include hardware management means that may be used to generate and update hardware configuration information for a hardware device. See fig. 4 and the description thereof for more information about hardware configuration.
The interface layer 320 is connected to the hardware layer 310 and the data processing layer 330. The interface layer 320 may be configured to obtain data collected by a hardware device in the hardware layer 310, and send the data to the data processing layer for storage and/or processing. The interface layer 320 may also be used to control at least some of the hardware devices in the hardware layer 310.
In some embodiments, interface layer 320 may include hardware interfaces and software interfaces (also referred to as software and hardware interfaces).
The hardware interface may be used to implement a physical connection or a communication connection with a hardware device of the hardware layer 310. The hardware interface may include a wired communication interface, such as a serial communication interface, a parallel communication interface, a Universal Serial Bus (USB) interface, an Ethernet interface, or the like. The hardware interface may include a wireless communication interface, such as a Wi-Fi interface, a Bluetooth interface, a Near Field Communication (NFC) interface, or the like.
The software interface defines the manner in which different software components or systems interact with each other. Unlike hardware interfaces, which involve physical and electrical connections, software interfaces are abstract mechanisms that support communication, data exchange, and function calls between software entities. The software interfaces may include application programming interfaces (APIs) and protocols (e.g., Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Object Access Protocol (SOAP), etc.).
In some embodiments, the software interface may include a data interface for enabling data interaction with the hardware device. Different hardware devices may correspond to different data interfaces. For example, the medical imaging device may transmit its acquired medical image data to interface layer 320 via a medical data interface.
The interface layer 320 obtains or receives data collected by the hardware devices through the data interface to provide a data basis for the data processing layer 330. It should be noted that the data interface does not modify the data content during data transmission, so as to ensure that the original data collected by the hardware devices is transmitted to the data processing layer 330 for storage and/or processing. In some embodiments, the data interface defines transmission standard information of the data, which indicates the protocol, format, mode, etc. of data transmission. The data transmission standard information may include standard data transmission protocols within the industry, such as the Digital Imaging and Communications in Medicine (DICOM) protocol. The data transmission standard information may also include a custom data transmission protocol, which may be set according to actual conditions (e.g., security requirements).
In some embodiments, interface layer 320 may be configured to control at least a portion of a hardware device. As shown in fig. 3, the software interface may also include a control interface that may be used to control at least a portion of the hardware device. For example, the control interface may send control instructions (such as commands and code instructions) to the hardware device, so that the hardware device performs functions or behaviors corresponding to the control instructions, and feeds back relevant information (such as status information).
In some embodiments, control instructions may be generated by data processing layer 330 (e.g., a data processing unit), i.e., data processing layer 330 may interact with hardware layer 310 through interface layer 320. The control instruction may be generated according to different hardware devices and control protocols thereof. For example, the control instructions may include, but are not limited to, hardware device identification, control parameters (e.g., operating parameters such as angle of rotation, direction of movement, etc.), and the like.
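As a hedged illustration of the control path just described, the sketch below shows what a control instruction carrying a hardware device identification and control parameters might look like, and how the control interface could forward it and return feedback; all names and the acknowledgement format are assumptions for this example.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class ControlInstruction:
    device_id: str                 # hardware device identification
    command: str                   # e.g., "rotate", "move"
    parameters: Dict[str, Any] = field(default_factory=dict)  # control parameters

def send_control(instruction: ControlInstruction, transport) -> dict:
    """Send an instruction via the control interface and return device feedback."""
    transport(instruction)
    # In a real system the device would execute the command and report back
    # status information; here we fabricate a minimal acknowledgement.
    return {"device_id": instruction.device_id, "status": "ok"}

instr = ControlInstruction("scanner-03", "rotate", {"angle_deg": 15})
print(send_control(instr, transport=lambda i: print("sending", i)))
```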
It should be noted that the data interface and/or the control interface may be implemented as software and may be stored in any type of non-transitory computer readable medium or storage device. In some embodiments, the data interface and/or control interface may be invoked by other units/modules (e.g., data processing units of data processing layer 330), or hardware devices (e.g., hardware devices of hardware layer 310), and/or may be invoked in response to detected events.
In some embodiments, interface layer 320 may also include preset algorithms (not shown) that are decoupled from the medical services and handle processing that is unrelated or only weakly related to those services. As an example, the preset algorithms decoupled from the medical services may include a data traffic analysis algorithm, a basic AI algorithm, and the like. For example, data traffic (e.g., data transmission frequency, data volume, etc.) may be analyzed by a data traffic analysis algorithm and/or a basic AI algorithm to monitor the data traffic without affecting the data content. As another example, the basic AI algorithm in interface layer 320 may be invoked by a data processing unit in data processing layer 330 to implement processing tasks related to hospital services.
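By way of illustration only, a minimal data traffic analysis routine of this kind might count transmission frequency and volume over a sliding window, as in the following Python sketch (all names hypothetical); note that it never reads the payload content.

    import time
    from collections import deque

    class TrafficMonitor:
        """Hypothetical traffic analysis: frequency and volume over a sliding window."""

        def __init__(self, window_seconds: float = 60.0):
            self.window = window_seconds
            self.events = deque()  # (timestamp, payload size in bytes)

        def observe(self, payload: bytes) -> None:
            now = time.monotonic()
            self.events.append((now, len(payload)))  # size only; content is untouched
            while self.events and now - self.events[0][0] > self.window:
                self.events.popleft()

        def stats(self) -> dict:
            return {"messages": len(self.events),
                    "bytes": sum(size for _, size in self.events)}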
In some embodiments, interface layer 320 or a portion thereof may be integrated with data processing layer 330. For example, a data interface and/or control interface may be disposed at the data processing layer 330 as part of the data processing layer 330. Illustratively, the data interface and/or the control interface may be provided at the data processing layer 330 in the form of an interface unit or module.
The data processing layer 330 is used for storing and/or processing data. For example, the data processing layer 330 may include a data processing unit. The data processing layer 330 is configured to obtain data from the interface layer 320 and process the data through at least one of the data processing units to enable user services related to hospital services.
The data processing unit may comprise various types of preset algorithms for implementing data processing, which may be in the form of software, programs, computer code and/or instructions implemented in various types of computer programming languages, such as Java or C/C++. In some embodiments, the data processing layer 330 includes a processing device (such as the processing device 220 shown in FIG. 2) on which the data processing unit may be configured.
In some embodiments, the data processing unit may include an augmented reality (XR) unit configured to implement augmented reality services using augmented reality technology. The augmented reality services may include VR services, AR services, MR services, and the like. For example, by providing MR services, digital content (e.g., three-dimensional organ models, virtual figures) in a virtual hospital can be superimposed onto the user's real field of view to enable the user to interact with the digital content.
In some embodiments of the present disclosure, through integrating the augmented reality unit, the hospital support platform can have the capability of supporting the augmented reality service, and provide a more intuitive and natural interaction mode for users (such as medical staff and patients) in medical services, so as to improve the service efficiency and quality of the users.
In some embodiments, the data processing unit may include an artificial intelligence (AI) unit configured to implement artificial intelligence services using artificial intelligence techniques. The artificial intelligence techniques include, but are not limited to, machine learning (ML), deep learning (DL), natural language processing (NLP), computer vision (CV), speech recognition, and the like.
In some embodiments, the AI unit may include a plurality of AI subunits, at least some of which may correspond to different hospital services and/or tasks for processing data related to the different hospital services and/or for performing the different tasks. By way of example, the AI unit may include a natural language processing subunit, an image recognition subunit, a speech recognition subunit, and the like.
In some embodiments, the artificial intelligence unit may include agent units constructed based on AI technology. Each agent unit maintains an agent. An agent may refer to a software entity capable of performing tasks automatically on behalf of a user and/or an application; it may perceive an environment, process information, make decisions, and perform actions to achieve a particular goal. The agent is designed to have characteristics such as adaptability, learning, autonomy, and goal-oriented behavior. In some embodiments, different agent units (i.e., different agents) may be provided for different hospital business scenarios (e.g., diagnosis, surgery, etc.), roles (e.g., doctor, nurse, etc.), user services, etc., to provide corresponding agent services. For example, the agent units may include doctor agent units, nurse agent units, technician agent units, and the like.
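By way of illustration only, the perceive-decide-act structure of such an agent might be sketched in Python as follows; the roles and the single rule shown are hypothetical stand-ins for learned behavior.

    class Agent:
        """Hypothetical agent skeleton: perceive -> decide -> act."""

        def __init__(self, role: str):
            self.role = role

        def perceive(self, observation: dict) -> dict:
            return observation  # e.g., parse sensor readings or dialogue input

        def decide(self, state: dict) -> str:
            raise NotImplementedError

        def act(self, observation: dict) -> str:
            return self.decide(self.perceive(observation))

    class NurseAgent(Agent):
        def __init__(self):
            super().__init__("nurse")

        def decide(self, state: dict) -> str:
            # Illustrative fixed rule; a real agent unit would consult learned models.
            if state.get("patient_temperature_c", 36.5) > 38.0:
                return "notify_doctor"
            return "continue_monitoring"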
In some embodiments, the agent may learn and evolve autonomously based on AI technology and the data related to hospital services collected by the hardware devices, so as to accommodate changes in the hospital services or the medical services provided thereby. By way of example only, an agent may be built based on natural language processing techniques (e.g., large predictive models, etc.) and may enhance the quality of the services it provides by automatically learning from large amounts of language text (e.g., hospital business data, patient feedback information) and autonomously updating itself.
In some embodiments of the present disclosure, by integrating the artificial intelligence unit, the hospital support platform is given the capability to support artificial intelligence services, enabling efficient data analysis and processing in medical business in conjunction with artificial intelligence algorithms. In addition, by configuring agent units to maintain multiple agents, the handling of different hospital business scenarios or medical tasks becomes more intelligent, and the hospital support platform can evolve itself to provide higher-quality services.
In some embodiments, the data processing unit may comprise a digital twin unit configured to implement a digital twin service using digital twin technology.
Digital twin technology can be used to digitize physical entities of the real world and enable virtual-real linkage between the real world and the digitized (virtualized) world. The physical entity may be any of various physical entities related to a hospital, such as physical spaces, personnel (e.g., doctors, nurses, etc.), hardware devices, user services, medical procedures, etc. A digital twin may be a digitized copy of a physical entity, which may include, but is not limited to, representations such as 3D models, images, and text. By way of example, the digital twins may be virtual hospitals corresponding to hospital entities, virtual persons (e.g., virtual doctors, virtual nurses, virtual patients) corresponding to personnel entities (e.g., doctors, nurses, patients), virtual devices (e.g., virtual imaging devices, virtual scalpels) corresponding to medical services devices (e.g., imaging devices, scalpels), and the like.
Digital twin services may refer to linkage or interaction services between physical entities of the real world and their digital twins. For example, when the state of a physical entity in the real world changes, its corresponding digital twin may be updated accordingly. By way of example only, a status update of a healthcare device, an update of a person's expression or posture, etc. may cause an update of their corresponding digital twins. Based on a digital twin that is updated along with its corresponding physical entity, a user can learn the real-time condition of the physical entity related to the hospital, thereby realizing monitoring, evaluation, etc. of the physical entity. For another example, when a digital twin in the virtual world is updated, its corresponding physical entity in the physical world may be updated accordingly. By way of example only, a hospital administrator may update parameters of the digital twins corresponding to hardware devices, medical services, medical procedures, etc. through a management space application. Accordingly, the parameters of the hardware devices, medical services, medical procedures, etc. in the real world are correspondingly updated. Based on such digital twins, which may affect the status of the corresponding physical entities, the user may implement control, updating, etc. of the physical entities related to the hospital. For another example, a digital twin (e.g., a three-dimensional virtual patient model) may be presented to a real-world user via XR technology, and the user may interact with the digital twin.
In some embodiments, the digital twin unit may process the data based on the XR unit and/or AI unit to achieve a high quality digital twin service. For example, the digital twin unit may invoke the XR unit to present the digital twin to the user using XR technology, enabling user interaction with the digital twin. As another example, the digital twinning unit may invoke the AI unit to achieve more accurate state mapping and state prediction using AI techniques.
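By way of illustration only, the bidirectional linkage described above might be sketched in Python as follows; the entity identifier and the controller callback are hypothetical.

    class DigitalTwin:
        """Hypothetical twin that stays in step with its physical entity, in both directions."""

        def __init__(self, entity_id: str):
            self.entity_id = entity_id
            self.state: dict = {}

        def on_physical_update(self, reported_state: dict) -> None:
            # Real world -> virtual world: mirror the state reported by the entity.
            self.state.update(reported_state)

        def apply_virtual_update(self, new_params: dict, device_controller) -> None:
            # Virtual world -> real world: push parameter changes back, e.g., via a control interface.
            self.state.update(new_params)
            device_controller(self.entity_id, new_params)

    # e.g., an administrator lowering a hypothetical device's scan power from the virtual side:
    twin = DigitalTwin("ct-01")
    twin.apply_virtual_update({"scan_power": 0.8}, lambda eid, p: print(f"sent {p} to {eid}"))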
In some embodiments of the present disclosure, by setting the digital twin unit, the hospital support platform can have the capability of supporting the digital twin service, and realize real-time linkage and interaction between the real world and the virtual world based on the digital twin service, thereby improving the service quality of various user services.
In some embodiments, the data processing unit may include a data flow unit configured to process data using data flow techniques to implement a data flow service.
The data flow services may include transmission services, sharing services, exchange services, and the like, for data. For example, the data flow services may include data transmission and exchange services between different systems. In some embodiments, the data flow services may include data flow services across medical institutions. For example, the data flow services may enable data sharing, transmission, and exchange between different medical institutions.
The data flow techniques may include various types of data privacy computing techniques and/or data security techniques. Data privacy computing technology can be used to process data according to various preset privacy protection rules, so as to ensure that private information (e.g., personal information of a user) is not infringed or leaked during data circulation. The preset privacy protection rules may be rules or laws common in the medical industry. Data privacy computing technology includes encryption technology, federated learning technology, differential privacy technology, multiparty secure computing technology, secure outsourcing technology, and the like. Data privacy computing can separate the ownership, management, and use rights of data and enable the circulation of data value.
Data security technology can be used to guarantee the security, integrity, and availability of data during circulation, and to prevent unauthorized access, leakage, tampering, or damage. Data security techniques may include, but are not limited to, encryption techniques, hash functions, digital signature techniques, key exchange protocols, and the like.
In a specific application, the data privacy computing technology and/or data security technology to be adopted may be determined according to actual conditions or requirements (e.g., the amount of data, the efficiency of data circulation, the degree of data privacy, the required security level, etc.).
In some embodiments, the data flow techniques may include at least blockchain techniques and data privacy computing techniques. Blockchain technology is a decentralized, distributed ledger technology that aims to securely record, verify, and share information among multiple nodes in a network without the need for a central authority.
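By way of illustration only, the tamper-evident property of such a ledger can be sketched in Python with a simple hash chain; this is a minimal sketch, not a full blockchain with consensus among nodes.

    import hashlib
    import json

    class HashChainLedger:
        """Minimal append-only record chain; any modification breaks verification."""

        def __init__(self):
            self.blocks = []  # each block: {"data": ..., "prev": ..., "hash": ...}

        @staticmethod
        def _digest(data, prev_hash: str) -> str:
            body = json.dumps(data, sort_keys=True) + prev_hash
            return hashlib.sha256(body.encode()).hexdigest()

        def append(self, data) -> None:
            prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
            self.blocks.append({"data": data, "prev": prev,
                                "hash": self._digest(data, prev)})

        def verify(self) -> bool:
            prev = "0" * 64
            for block in self.blocks:
                if block["prev"] != prev or block["hash"] != self._digest(block["data"], prev):
                    return False
                prev = block["hash"]
            return True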
In some embodiments of the present disclosure, a data flow unit is provided to enable the hospital support platform to support data flow across medical institutions and/or across systems. In addition, introducing blockchain technology and privacy computing technology can improve the security of data circulation and avoid the abuse or leakage of data.
In some embodiments, data processing layer 330 also includes a data center to store data. The data center may include various types of storage devices/media, such as mass storage devices, removable storage devices, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. By way of example, a data center may include mass storage devices (e.g., magnetic disks, optical disks, solid state drives), removable storage devices (e.g., optical disks, memory cards), random access memory (RAM), and the like. In some embodiments, the data center may be deployed on a cloud platform (e.g., public cloud, private cloud).
In some embodiments, the data center employs a lake-warehouse integrated architecture that includes a data lake and a data warehouse (also referred to as a data bin). The data lake is used for persisting mass data. The data bin is used for storing index data corresponding to the data in the data lake. The data stored in the data lake may include raw data collected by the hardware devices, data generated based on the raw data, and the like. The data stored in the data lake is multi-modal and may include data from different sources (e.g., different hardware devices), data related to different medical services (e.g., diagnosis, surgery, rehabilitation, etc.), data related to different users (e.g., patients, healthcare personnel, management personnel, etc.), and data of different data types (e.g., images, sounds, videos, files). In some embodiments, the data lake is configured to persist data related to hospital traffic collected by the hardware devices in a tamper-resistant manner. See fig. 5 and its description for further details regarding the data lake bin.
The application development layer 340 may be used to support application development, publishing, subscribing, etc., which may also be referred to as an ecology suite layer.
In some embodiments, the application development layer 340 is configured to provide an open interface for an application developer to access or invoke a data processing unit or a portion thereof and to utilize at least a portion of the data processing unit for application development. That is, the various data processing capabilities of the data processing layer may be opened to application developers through the application development layer 340, so that application developers can develop various types of applications on that basis. A developer may include a person, an organization (e.g., a hospital, an IT technology company), or any entity. An open interface may refer to an entry that enables a developer to access or invoke a data processing unit. In some embodiments, the open interface may include an application programming interface (API) or the like.
In some embodiments, as shown in fig. 3, the application development layer 340 may provide support suites such as development tools, an application market, a multi-tenant operation platform, a cloud-end network, and workspaces to assist the developer. See fig. 8 and its description for more on development tools, application markets, and operation platforms. The cloud-end network is an open portal through which a developer obtains the various support services (such as development tools) provided by the application development layer 340. The workspace is used to assemble and configure user workspaces and provide a ready-to-use working experience for doctors, patients, and nurses.
In some embodiments of the present disclosure, by providing the application development layer, access to various types of data processing units and support suites can be provided to multiple parties (such as individual developers, medical institutions, medical services device manufacturers, drug suppliers, IT technology companies, etc.), thereby promoting the development of related applications and advancing medical services and/or the medical ecosystem.
The service layer 350 is used to enable related users to access and obtain user services related to hospital services through user space applications.
User services may include various types of services related to hospital business. Exemplary user services may include services for imaging, electronic health records, medical services device management, augmented reality, disease screening, visual computing, doctor-patient communication, etc. In some embodiments, the user services may include patient space services for patients, medical space services for healthcare workers, management space services for hospitals or medical institutions, and the like. In some embodiments, the user services may be native services provided by the data processing layer 330 or third-party services provided through the application development layer 340. For example, a user may obtain a third-party service by obtaining a user space application published by a third party through the application marketplace of the application development layer 340.
The user space application is an entrance through which a user acquires various user services, and is also an interaction interface through which the user interacts with the hospital support platform. User space applications include, but are not limited to, one or a combination of application programs (e.g., apps), websites, service interfaces, and the like.
In some embodiments, as shown in fig. 3, the user space application may include a patient space application, a medical space application, and a management space application. The patient space application may refer to an application that provides patient space services for a patient, the medical space application may refer to an application that provides medical space services for medical staff, and the management space application may refer to an application that provides management space services for a manager of a hospital or medical institution. Exemplary patient space services include registration services, navigation services, pre-consultation services, remote consultation services, hospitalization services, discharge services, and the like. Exemplary medical space services include scheduling services, surgical planning services, surgical simulation services, patient management services, remote ward services, remote inquiry services, and the like. Exemplary management space services include regional monitoring services, healthcare assessment services, device parameter setting services, service parameter setting services, resource scheduling services, and the like.
In some embodiments, the user space application may be provided or developed by a hospital. In some embodiments, the user space application may be provided or developed by a third party. For example, a third party may publish a user space application to the application marketplace of the application development layer 340.
In some embodiments of the present description, by providing a separate user space application for each type of user, each type of user can conveniently obtain the services available to them within their corresponding user space application. In existing hospital service systems, users are usually required to install various applications to separately acquire different services, resulting in poor user experience and high development cost. Compared with the prior art, the scheme in the present specification can improve user experience, improve service quality, and reduce development cost.
In some embodiments, the data processing layer 330 may utilize the data processing unit to map at least a portion of the data to a virtual hospital; further, the user space application may provide a portal for a user to interact with the virtual hospital such that the user obtains at least a portion of the user services in the service layer 350. In some embodiments, the virtual hospital, or a portion thereof, may be presented to the user based on XR technology when the user interacts with the virtual hospital. In some embodiments, the virtual hospital or a portion thereof may be superimposed over the user's real-world field of view by MR technology. See fig. 1 and the description thereof for more content regarding virtual hospitals.
It should be noted that the hospital support platform 300 is provided for illustrative purposes only and is not intended to limit the scope of the present description. Many modifications and variations will be apparent to those of ordinary skill in the art in light of the present description. However, such changes and modifications do not depart from the scope of the present specification. In some embodiments, one or more layers (e.g., application development layer 340) may be omitted. In some embodiments, the hospital support platform 300 may also include one or more other layers. In some embodiments, two or more layers in the hospital support platform 300 may be combined into one layer, for example, the interface layer 320 may be integrated at the data processing layer 330. In some embodiments, one layer in the hospital support platform 300 may also be divided into multiple sub-layers, for example, the data processing layer 330 may be divided into a data storage sub-layer and a data processing sub-layer. In some embodiments, one or more layers of the hospital support platform 300 may be sequentially connected, e.g., the hardware layer 310, the interface layer 320, and the data processing layer 330 may be sequentially connected and sequentially interacted with each other. In some embodiments, the manner of connection between the different layers of the hospital support platform 300 may be varied. For example, service layer 350 may be directly connected to data processing layer 330 (shown in phantom in FIG. 3).
In some embodiments, the data interactions of the hospital support platform 300 (e.g., within or between the hardware layer 310, the interface layer 320, the data processing layer 330, the application development layer 340, and the service layer 350) need to conform to preset data privacy specifications and data security rules. In particular, any data acquirer (including software and hardware) needs to undergo identity verification and privacy verification when acquiring specific data. The identity verification is used to verify whether the data acquirer has authority to acquire data. The privacy verification is used to verify whether the specific data contains private data (such as personal data of a patient) and whether the data acquirer has authority to access the private data. By way of example only, the data processing unit of the data processing layer 330 and the API of the application development layer 340 need to pass identity verification and privacy verification before the corresponding data in the data lake can be obtained or accessed. For another example, when accessing data of other tenants, a tenant of the application development layer 340 needs to undergo identity verification first, and only when the tenant has access rights can it access the data of other tenants. In some embodiments, the preset data privacy specifications and data security rules may include data privacy and data security specifications commonly used in the medical field, such as the Health Insurance Portability and Accountability Act (HIPAA), the General Data Protection Regulation (GDPR), and the like.
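By way of illustration only, such a two-step verification gate might be sketched in Python as follows; the identifier sets and record fields are hypothetical.

    class AccessGate:
        """Hypothetical gate: identity verification, then privacy verification."""

        def __init__(self, authorized_ids: set, privacy_cleared_ids: set):
            self.authorized_ids = authorized_ids            # acquirers allowed to obtain data
            self.privacy_cleared_ids = privacy_cleared_ids  # acquirers allowed to see private data

        def fetch(self, acquirer_id: str, record: dict) -> dict:
            if acquirer_id not in self.authorized_ids:
                raise PermissionError("identity verification failed")
            if record.get("contains_private_data") and acquirer_id not in self.privacy_cleared_ids:
                raise PermissionError("privacy verification failed")
            return record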
FIG. 4 is an exemplary flow chart of a process for verifying a hardware device according to some embodiments of the present description. In some embodiments, the flow 400 may be performed by the interface layer 320. For example, the interface layer 320 includes a software and hardware interface that can be used to authenticate the hardware device to which it is connected. As shown in fig. 4, the process 400 includes the following steps.
In step 410, hardware configuration information of the hardware device is obtained.
The hardware configuration information may include basic information of hardware, for example, a name, model number, manufacturer, date of production, etc. of the hardware device.
In some embodiments, the hardware configuration information may include device identity information, which is a unique identification of the hardware. The device identity information may be generated according to a preset rule, and may be determined, for example, based on the MAC address of the device or a unique device ID or device number.
In some embodiments, the hardware configuration information may include operational state information of the hardware device, which may indicate an operating condition (e.g., the current operating condition) of the hardware device. For example, the hardware configuration information may indicate whether the hardware device is abnormal, the abnormality type, the number of abnormalities, the number of historical maintenances, the service duration, and the like.
In some embodiments, the hardware configuration information for the hardware may be generated by the hardware device itself and sent to the interface layer 320. In some embodiments, hardware layer 310 includes a hardware management device configured to generate hardware configuration information for at least a portion of the hardware devices and send the hardware configuration information to interface layer 320. For example, the hardware management device may perform status monitoring (e.g., based on a heartbeat mechanism) on at least a portion of the hardware devices and send the monitoring result as running status information to the interface layer 320.
Step 420, determining whether the hardware configuration information satisfies a preset condition.
The preset conditions may be used to evaluate whether the identity and/or the operational status of the hardware device meets desired requirements.
For example, for each of the hardware devices (or a portion thereof), the interface layer 320 may determine whether the hardware device is one of a plurality of trusted devices based on the device identity information, and in response to determining that the hardware device is one of the plurality of trusted devices, the interface layer 320 may determine that the hardware configuration information corresponding to the hardware device satisfies a preset condition. In some embodiments, the plurality of trusted devices may be preset configured by a hardware management device in the hardware layer 310. For example, a hardware management device may store a list of ids for a plurality of trusted devices.
For another example, for each of the hardware devices (or a portion thereof), the interface layer 320 may determine whether the current operating state of the hardware device is normal based on the operating state information, and in response to determining that the current operating state of the hardware device is normal, the interface layer 320 may determine that the hardware configuration information corresponding to the hardware device satisfies a preset condition.
In step 430, the data is transmitted to the data processing layer in response to the hardware configuration information satisfying the preset condition.
In some embodiments of the present description, the interface layer 320 may first verify the identity and/or the running state of the hardware device, and only when the hardware device passes the verification is the data collected by the hardware device sent to the data processing layer. In this way, the influence of malicious or abnormal devices on the hospital support platform can be prevented, and the reliability and accuracy of the data are improved, thereby ensuring the accuracy of the user services provided subsequently.
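By way of illustration only, the flow 400 might be condensed into the following Python sketch; the trusted-device list and the status field are hypothetical.

    TRUSTED_DEVICE_IDS = {"ct-01", "mri-02"}  # hypothetical list kept by the hardware management device

    def verify_and_forward(config: dict, data: bytes, send_to_data_layer) -> bool:
        """Forward device data only if identity and running-status checks both pass."""
        identity_ok = config.get("device_id") in TRUSTED_DEVICE_IDS  # step 420: trusted device?
        status_ok = config.get("running_status") == "normal"         # step 420: state normal?
        if identity_ok and status_ok:
            send_to_data_layer(data)                                 # step 430
            return True
        return False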
Fig. 5 is an exemplary schematic diagram of a data lake bin according to some embodiments of the present description.
The data lake bin 520 includes a data lake 521 and a data bin 522. The data lake 521 may persistently store the data 510. Persistent storage preserves data for a preset period of time and ensures that the stored data remains accessible and intact even when the system is powered down or restarted.
The data 510 may include various types, structures, or formats of data. As shown in fig. 5, data 510 may include structured data 511, semi-structured data 512, and unstructured data 513. Structured data 511 is highly organized data, typically having a fixed data structure. Exemplary structured data 511 includes doctor base information, hardware device base information, profiles, administration/operation data, and the like. The semi-structured data 512 has a structure, but such a structure is not strict. Exemplary semi-structured data 512 includes clinical information/documents, wearable device health data, user behavior logs, device operational status logs, service metrics/logs/link information, and the like. Unstructured data 513 has no predefined pattern or organization. Exemplary unstructured data 513 includes audio, video, images, scanned documents, three-dimensional models, machine learning models, digital twin models, electrocardiographic/pathology/cell/electron microscope data, protein/genomic data, and the like. In some embodiments, structured data 511 may be stored in a database (e.g., a relational database) in the form of a two-dimensional data table, and semi-structured data 512 or unstructured data 513 may be stored in the form of files or objects.
In some embodiments, data 510 may include raw data, which refers to data collected by a hardware device without any content modification. The raw data can be stored based on a native storage mode, so as to avoid information loss caused by file format conversion. In some embodiments, data 510 may include derivative data generated based on the raw data. For example, when data 510 is processed by a data processing unit, intermediate and final processing results may be generated, which may be stored as derivative data in the data lake 521.
In some embodiments, the data lake 521 may be configured to store the data 510, or a portion thereof, in a non-tamperable form. For example, the raw data stored in the data lake 521 may be set to a read-only mode to avoid being overwritten.
As the medical field is rapidly evolving, certain types of raw data may be used in various medical businesses or tasks. Through the data lake 521, it can be ensured that the raw data collected by the hardware devices is stored completely and safely for future use in various medical services or tasks.
The data bin 522 is used for storing a data index corresponding to the data 510, so as to improve the retrieval efficiency of the data 510. Different types of indexes may be built for different types of data 510. For example, a database index may be built for structured data 511, and a file index (e.g., directory index, full text index, metadata index, etc.) may be built for semi-structured data 512 and/or unstructured data 513. For the same data record, a plurality of data indexes can be constructed to realize multi-dimensional data retrieval. For example, for CT images, a data index of patient ID, scanning device ID, acquisition time, etc. may be constructed.
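By way of illustration only, multiple indexes over the same records might look like the following Python sketch; the record fields are hypothetical.

    from collections import defaultdict

    class CTImageIndex:
        """Hypothetical data-bin indexes enabling multi-dimensional retrieval of CT images."""

        def __init__(self):
            self.by_patient = defaultdict(list)
            self.by_device = defaultdict(list)
            self.by_date = defaultdict(list)

        def add(self, record_id: str, patient_id: str, device_id: str, acq_date: str) -> None:
            # The same record is reachable through three independent indexes.
            self.by_patient[patient_id].append(record_id)
            self.by_device[device_id].append(record_id)
            self.by_date[acq_date].append(record_id)

    index = CTImageIndex()
    index.add("img-0001", "patient-42", "ct-01", "2024-07-31")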
In some embodiments, the data lake bin 520 may also include a cache area (cache layer) for storing temporary data or hotspot data. The temporary data may be derivative data of the data 510 generated during medical business processing, which has a shorter lifecycle/storage period. Illustratively, the temporary data may be cleaned up after the medical procedure is completed. The hotspot data may be data that is accessed or read more frequently, which may be associated with one or more medical services and has a longer lifecycle/storage period. For example, after a medical procedure is completed, hotspot data may continue to be stored in the cache area for access or reading while other medical procedures are being processed.
In some embodiments, as shown in fig. 5, the data lake bin 520 may provide data support for various types of medical business processing tasks to enable various types of user services 540. User services 540 may include various types of user services, such as reporting services (e.g., report generation and querying), AI services, visualization/XR services, digital twin services, data transaction services, and the like. See fig. 3 and its description for more content regarding user services.
In some embodiments, the data in the data lake bin 520 may be processed by processing device 530. The processing device 530 may include one or more data processing units (e.g., XR units, digital twin units, etc.) in the data processing layer 330. The manner in which the data processing layer processes data is further described with reference to fig. 6 and its description.
In some embodiments of the present disclosure, data is stored using a lake-warehouse integrated architecture, which provides reliable data support for complex medical service processing tasks while meeting large-scale data storage requirements, thereby providing users with more efficient user services.
Fig. 6 is an exemplary schematic diagram of a data processing method according to some embodiments of the present description.
In some embodiments, the process 600 may be performed by a hospital support platform (e.g., the hospital support platform 300). The process 600 includes the following steps.
At step 610, data 601 is obtained from a hardware device.
As shown in fig. 6, the data 601 may include data acquired by the interface layer 320 from a hardware device (e.g., a medical service device, a sensing device, a terminal device, a base device, etc.) of the hardware layer 310 through a data interface. See fig. 3 and its description for further details regarding the interface layer, the hardware layer.
At step 620, data 601 is stored to data lake 521.
Data 601 may be persisted into data lake 521 in a structured, semi-structured, or unstructured data format. See fig. 5 and its description for further details regarding the data lake 521.
Steps 630 through 661 described below may be performed by a data processing layer (e.g., a data processing unit therein).
At step 630, the data is processed using the XR unit.
The XR unit may process the data 601 using an augmented reality technique to implement an augmented reality service. For example, the XR unit may map the data 601 into a virtual hospital using one or a combination of VR, AR, and MR techniques, enabling visual presentation of the data 601 and human-machine interaction.
At step 640, the data is processed using the AI unit.
The AI unit may utilize AI technology to analyze, predict, etc. the data 601 to implement the artificial intelligence service. For example, the AI unit may recognize the data 601 using a voice recognition technique to obtain a voice recognition result. In some embodiments, the AI unit may interact with the user using an agent unit (e.g., doctor agent unit, nurse agent unit) based on data entered by the user and generate corresponding feedback information.
In some embodiments, the process 600 may further include a step 641, where the XR unit may invoke the AI unit to process the data 601.
The XR unit may invoke the AI unit to analyze and process the data 601 in the augmented reality service using AI technology. For example, in a virtual-real fusion scenario, data 601 (e.g., user-fed information) is analyzed and predicted using AI techniques.
At step 650, the data is processed using the digital twin unit.
The digital twin unit may process the data 601 using digital twin technology to implement digital twin services. In some embodiments, the digital twin unit may invoke the XR unit and/or the AI unit to process the data 601 to implement the digital twin service.
In step 660, the data is processed using the data flow unit.
The data flow unit may process the data 601 using a data flow technique to implement a data flow service. For example, the data flow unit may perform data privacy calculations, encryption processing on the data 601 to enable security of the data 601 for transmission, sharing, and/or exchange across systems and/or across medical institutions.
In some embodiments, the process 600 may further include a step 661 of processing the derivative data with the data flow unit.
Derivative data refers to data that is dynamically generated in real time in a medical business. For example, the derivative data may include data generated during processing of the data 601 by the XR unit, AI unit, digital twin unit, etc., in accordance with medical business requirements. For example, it may be user feedback information (such as voice, text, etc.) acquired by the XR unit, a prediction result generated by the AI unit, or a processing result of the digital twin unit (such as a patient diagnosis report). In some embodiments, the derivative data may also be persisted into the data lake 521 of the data lake bin 520 for use in subsequent medical business processing tasks. In some embodiments, the derivative data may also be cached in the cache area of the data lake bin 520.
Furthermore, the data flow unit can process the derivative data according to actual requirements so as to realize data flow services.
It should be noted that different data processing units may process the same or different data in the data lake 521. In addition to the raw data 601 collected by the hardware device, the data processing unit may also process other data stored in the data lake 521, including derivative data, index data, and the like.
In some embodiments of the present disclosure, the data is processed by the XR unit, the AI unit, the digital twin unit, and the data flow unit, so that the hospital support platform may support new technologies (including XR technology, AI technology, digital twin technology, data flow technology, and the like) that traditional hospitals cannot support. Applying these new technologies to medical service processing tasks can improve data processing efficiency and accuracy, thereby improving the quality of user services.
FIG. 7 is an exemplary schematic diagram of a method of storing data according to some embodiments of the present description. The flow 700 shown in fig. 7 may be performed by a data processing layer (e.g., a processing device therein).
Step 710, determining evaluation information of the data. The data refers to data collected by the hardware devices of the hardware layer 310 and acquired via the interface layer 320.
The evaluation information may include information obtained by evaluating various characteristics of the data (such as integrity, quality, security, and size). For example, the evaluation information may include data integrity, which may be obtained by an integrity evaluation of the data. For example, data integrity may be determined based on whether data is missing, whether critical data is missing, or the proportion of missing data. Optionally, an alert may be issued to re-acquire the data when the data integrity is low (e.g., the proportion of missing data is above a threshold). For another example, for data of an image type, the evaluation information may include an evaluation result of image quality (e.g., sharpness, signal-to-noise ratio).
In some embodiments, the evaluation information may include a security level of the data. The security level may be used to indicate the extent to which the data needs to be protected. The security level may be determined based on the sensitivity and importance of the data. In some embodiments, the security level may relate to whether the data contains private information, sensitive information, or confidential information of the user. If the data contains private information, sensitive information, confidential data, etc., the data has a relatively high security level.
In some embodiments, the evaluation information may include an amount of data and/or a frequency of data acquisition. The data amount measures the amount of storage space occupied by the data during a predetermined period of time (e.g., one day, one week). The data acquisition frequency measures the number of times data is acquired in a predetermined period of time. In some embodiments, the data processing layer 330 may determine the data amount and the data acquisition frequency based on data statistics. For example, the data amount (e.g., maximum, minimum, average, etc.) and the data acquisition frequency (e.g., maximum, minimum, average, etc.) over a predetermined period of time may be counted to obtain the data amount and the data acquisition frequency.
In some embodiments, data processing layer 330 may also determine the amount of data and/or the frequency of data acquisition for a certain type of data over a future period of time based on an evaluation model. The evaluation model may be a long short-term memory (LSTM) network model or another time-series-based machine learning network model. Taking the data amount as an example, the input of the evaluation model includes time series data, and the output includes the data amount for a future period of time, where the time series data includes historical data amounts of that type of data in a plurality of historical time periods. The data amount and/or the data acquisition frequency of various types of data in a future period can be automatically predicted through the evaluation model, so that the data can be better monitored.
In some embodiments, the evaluation information may include a data importance coefficient. The importance coefficient may reflect the importance level of the data. The importance coefficient may be comprehensively determined based on multiple factors, such as the degree of association of the data with the hospital services, the likelihood that the data will be used, the frequency with which the data is used, the security level of the data, and the like.
Step 720, determining a target storage policy based on the evaluation information.
The target storage policy refers to a data storage mode, and includes an encryption storage policy, a distributed storage policy, and the like. For example, for data with a higher security level (e.g., a security level exceeding a threshold value), the encryption storage policy may be used as the target storage policy, and the data may be stored in an encrypted manner based on an encryption algorithm. For another example, when the data amount and/or data acquisition frequency corresponding to the data is greater than a threshold, the target storage policy corresponding to the data may be determined to be a distributed storage policy. The distributed storage policy can relieve pressure on the storage devices.
In some embodiments, the target storage policy may also include a storage period for the data. For example, the larger the importance coefficient, the longer the storage period.
At step 730, the data is stored in the data lake based on the target storage policy.
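By way of illustration only, the mapping from evaluation information to a target storage policy might be sketched in Python as follows; all thresholds and field names are hypothetical placeholders.

    def choose_storage_policy(evaluation: dict) -> dict:
        """Hypothetical mapping from evaluation information to a target storage policy."""
        policy = {
            # Higher security level -> encrypted storage (threshold is a placeholder).
            "encrypted": evaluation.get("security_level", 0) > 3,
            # Large volume or high acquisition frequency -> distributed storage.
            "distributed": (evaluation.get("data_volume_gb", 0) > 100
                            or evaluation.get("acquisitions_per_day", 0) > 1000),
        }
        # Larger importance coefficient -> longer storage period (here 30..365 days).
        importance = min(max(evaluation.get("importance", 0.0), 0.0), 1.0)
        policy["retention_days"] = int(30 + 335 * importance)
        return policy

    # e.g., choose_storage_policy({"security_level": 4, "importance": 0.9})
    # -> {"encrypted": True, "distributed": False, "retention_days": 331}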
FIG. 8 is an exemplary schematic diagram of an application development layer shown in accordance with some embodiments of the present description. As shown in fig. 8, the application development layer 340 may include development tools 810, an application marketplace 820, and a multi-tenant operation platform 830.
Development tools 810 may include software development tools, libraries, frameworks, etc. for developing, testing, and publishing applications. Development tools 810 can be used to simplify the development process, reduce coding effort, and ensure that application developers can efficiently build functional, user-friendly, and maintainable software applications. Development tools 810 may include APIs, example code (demonstration source code), code development tools (e.g., code editors, debuggers, packaging and publishing tools, etc.), technical documents, and the like.
In some embodiments, various types of development resources may be provided to the developer in the form of a software development kit (SDK). As shown in FIG. 8, development tools 810 may include an augmented reality (XR) SDK, an artificial intelligence (AI) SDK, a digital twin SDK, and the like. Different SDKs may include corresponding application programming interfaces and demonstration source code. A developer (e.g., a hospital developer, a third-party developer) may download a desired SDK from development tools 810 through the application development layer 340 as needed for application development. For example, a developer may develop an application related to augmented reality through the augmented reality SDK.
The application marketplace 820 is used to support online publishing, subscribing, downloading, trading, etc. of applications. As shown in FIG. 8, the application marketplace 820 may include a plurality of applications, such as application 1, application 2, application 3, and so on. Each application may be developed and/or updated by a developer using development tools 810 and published to the application marketplace 820. A user may download an application from the application marketplace 820 and use it to obtain the corresponding user service.
In some embodiments, applications in the application marketplace 820 may be used to implement one or more user services (e.g., patient space services, medical space services, management space services) in the service layer 350. As shown in fig. 8, the applications in the application marketplace 820 may include a patient space application for patients, a medical space application for healthcare workers, and a management space application for management personnel.
The multi-tenant operation platform 830 refers to a platform that provides operation services for a plurality of medical institutions and may provide services, such as cross-hospital operation and department operation, for different medical institutions. Exemplary operational services include personnel management, cross-hospital area/department resource scheduling, hospital area/department assessment, account rights management, and the like. As shown in fig. 8, multi-tenant operation platform 830 may include operation platforms corresponding to medical institution 1, medical institution 2, and the like.
In some embodiments, the application development layer 340 may be built based on a multi-tenant schema. The multi-tenant mode is a software architecture mode that allows a single instance of a software application or database to serve multiple tenants (i.e., application developers, hospitals, healthcare organizations, or groups of users).
In some embodiments, the application development layer 340 may be deployed on a cloud platform (e.g., public cloud, private cloud), each tenant may have its independent space and/or resources. For example, the corresponding operating platforms of different medical institutions (such as medical institution 1 and medical institution 2) are independent and isolated from each other.
In some embodiments, the application development layer 340 may provide open services for tenants. The open services may include access services for resources and the like. Resources may include, but are not limited to, development resources provided by the application development layer, resources and data of other tenants (e.g., workspaces of other tenants, data of an operation platform), and the like. In some embodiments, a tenant may set permissions on its own resources and/or data, e.g., select portions of the resources to share with or open to other tenants. When the application development layer 340 provides an open service for tenants, the permission information of other tenants needs to be considered.
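By way of illustration only, honoring a tenant's permission settings during an open-service access might look like the following Python sketch; the tenant structure and resource names are hypothetical.

    class Tenant:
        """Hypothetical tenant with resources and per-resource sharing permissions."""

        def __init__(self, name: str):
            self.name = name
            self.resources: dict = {}
            self.shared_with: dict = {}  # resource name -> set of tenant names allowed

        def share(self, resource: str, other: "Tenant") -> None:
            self.shared_with.setdefault(resource, set()).add(other.name)

    def open_service_fetch(owner: Tenant, resource: str, requester: Tenant):
        """Grant access only if the owner has shared the resource with the requester."""
        if requester.name in owner.shared_with.get(resource, set()):
            return owner.resources.get(resource)
        raise PermissionError(f"{requester.name} has no access to {resource}")

    hospital_a, hospital_b = Tenant("hospital_a"), Tenant("hospital_b")
    hospital_a.resources["workspace"] = {"beds": 120}
    hospital_a.share("workspace", hospital_b)
    print(open_service_fetch(hospital_a, "workspace", hospital_b))  # -> {'beds': 120}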
In some embodiments of the present disclosure, the application development layer 340 is constructed on a multi-tenant architecture, so that each tenant can operate on data sets that are logically independent of each other without mutual interference. By introducing a multi-tenant architecture, infrastructure such as servers, storage devices, and network resources is shared among multiple tenants, and each tenant can customize configurations, interfaces, and functions to meet its specific requirements without affecting other tenants.
Some embodiments of the present disclosure also provide a hospital support platform including a hardware device configured to collect data related to a hospital business, a software-hardware interface configured to obtain data from the hardware device and send it to a data center for storage, a processing device configured with a data processing unit for data processing, and a user space application configured to provide for relevant users of the hospital business to obtain user services related to the hospital business, wherein the user services are implemented by the processing device by processing at least a portion of the data stored at the data center. Optionally, the hospital support platform may further comprise an open interface configured for a developer to invoke at least a portion of the data processing unit and to utilize at least a portion of the data processing unit for application development.
Other embodiments of the present disclosure also provide a hospital support platform including a hardware device configured to collect data related to a hospital business, a software-hardware interface configured to obtain data from the hardware device and send it to a data center for storage, a processing device configured with a data processing unit for data processing, and an open interface configured for a developer to invoke at least a portion of the data processing unit and to utilize at least a portion of the data processing unit for application development. Optionally, the hospital support platform may further comprise a user space application configured for a relevant user of the hospital service to obtain user services related to the hospital service, wherein the user services are implemented by the processing device by processing at least a portion of the data stored in the data center.
Further embodiments of the present specification provide a hospital support platform comprising a hardware device configured to collect data related to a hospital service, a data lake configured to persist the data in a tamper-resistant manner, a processing device configured with data processing units for data processing, the data processing units comprising an augmented reality (XR) unit, an artificial intelligence (AI) unit, a digital twin unit, and a data circulation unit, and a user space application configured for a relevant user of the hospital service to obtain user services related to the hospital service, wherein the user services are implemented by the processing device by processing at least a portion of the data stored in the data lake. In some embodiments, processing at least a portion of the data includes mapping the at least a portion of the data to a virtual hospital using at least a portion of the digital twin unit, the AI unit, and the XR unit, and the user space application is further configured for the relevant user to interact with the virtual hospital such that the relevant user obtains at least a portion of the user services. For example, the digital twin unit may update the digital twins in the virtual hospital based on the at least a portion of the data, the AI unit may update the agents in the virtual hospital based on the at least a portion of the data, and the XR unit may update and render image data corresponding to the virtual hospital based on the at least a portion of the data. In some embodiments, upon user interaction with the virtual hospital, the XR unit is configured to present at least a portion of the virtual hospital to the relevant user using XR technology (e.g., superimposed over the relevant user's real-world field of view based on MR technology). For more content on virtual hospitals, reference can be made to the relevant description of fig. 1.
Fig. 9A is a schematic diagram illustrating an exemplary flow of providing a pre-consultation service according to some embodiments of the present description. The pre-consultation service may be used to collect information about the patient by making a preliminary inquiry to the patient before the patient enters the consulting room for a formal consultation. Specifically, while the patient is waiting for a consultation, the processing device 210 may perform a pre-consultation inquiry on the patient through the patient's terminal device or a waiting terminal configured in the waiting area, so as to relieve the patient's anxiety while waiting, generate a pre-consultation record, and provide the record to the doctor for reference, thereby improving the doctor's inquiry efficiency. In some embodiments, at least a portion of flow 900A is performed by a pre-consultation agent corresponding to the pre-consultation service and configured on the processing device 210.
In step 910, the query content of the pre-consultation query is determined based on the department of the doctor (e.g., the patient's registering doctor).
The pre-consultation query is used to make a preliminary inquiry of the patient prior to the formal consultation. The pre-consultation query may include multiple rounds of queries. The query content may include the query content for each round of queries, or only the query content of the first round of queries. In some embodiments, processing device 210 may obtain a pre-consultation record template corresponding to the doctor's department and determine the query content based on the pre-consultation record template.
In some embodiments, the processing device 210 can obtain known information about the patient (e.g., electronic medical records, complaints, etc.), and determine missing information that has not been collected in the pre-consultation record template by comparing the pre-consultation record template to the known information. Further, the processing device 210 may determine the query content based on the missing information.
In some embodiments, the processing device 210 may determine the query content based on the doctor's department and the patient's known information using a query model. The query model may include a CNN model, an RNN model, an LSTM model, a BERT model, a ChatGPT model, and the like. In some embodiments, the query model may include a missing information determination model and a first query content determination model. The missing information determination model is configured to output missing information by processing the doctor's department and the patient's known information. The first query content determination model is configured to output query content based on the patient's missing information.
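By way of illustration only, the two-stage structure of the query model might be sketched in Python with rule-based stand-ins for the learned models; all field names and questions are hypothetical.

    def determine_missing_information(template_fields: list, known_info: dict) -> list:
        """Stand-in for the missing information determination model."""
        return [field for field in template_fields if field not in known_info]

    def first_query_content(missing: list) -> str:
        """Stand-in for the first query content determination model."""
        questions = {
            "chief_complaint": "What brings you in today?",
            "symptom_duration": "How long have you had these symptoms?",
        }
        if not missing:
            return ""
        return questions.get(missing[0], f"Could you tell me about your {missing[0]}?")

    missing = determine_missing_information(
        ["chief_complaint", "symptom_duration"], {"chief_complaint": "cough"})
    print(first_query_content(missing))  # -> "How long have you had these symptoms?"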
Step 920, based on the query content, controlling the patient terminal to perform a pre-consultation query on the patient.
In some embodiments, after the patient registers with the doctor, the processing device 210 may determine an estimated wait time for the patient to receive the medical services. For example, the estimated wait time may be the time difference between the current time and the patient's registration time. For another example, the estimated wait time may be determined based on a doctor's current day inquiry record and a patient registration record. The doctor's current day inquiry records are records reflecting the doctor's current day outpatient condition.
In some embodiments, in response to determining that the estimated wait time is greater than the first preset time threshold, the processing device 210 may cause the patient's terminal to initiate a pre-consultation inquiry or display a suggestion that the patient make a pre-consultation inquiry. This approach ensures that there is sufficient time for the pre-consultation, avoiding the doctor calling the patient during the pre-consultation.
In some embodiments, in response to determining that the estimated wait time is less than the second preset time threshold, the processing device 210 may cause the patient's terminal to initiate a pre-consultation inquiry or to present a suggestion to make the pre-consultation inquiry. The second preset time threshold may be greater than the first preset time threshold. For example, when the current time is detected to be less than 24 hours from the registered time slot (i.e., the estimated wait time is less than 24 hours), the patient terminal may display a suggestion to the patient to make a pre-consultation inquiry (e.g., via a virtual character) to prompt the patient to make the pre-consultation inquiry in time.
In some embodiments, the processing device 210 may detect that the patient initiated a pre-consultation inquiry request through the patient terminal, and then cause the patient terminal to perform the pre-consultation inquiry on the patient.
In some embodiments, the patient terminal may present a virtual character to make the pre-consultation inquiry based on the query content. A virtual character refers to a digitized character having specific characteristics (e.g., specific appearance characteristics, sound characteristics, etc.) that may communicate with the patient to conduct the pre-consultation inquiry. Specifically, the processing device 210 may present the virtual character through a screen of the patient terminal (e.g., an XR device) and play the query content through a sound output device of the patient terminal. Meanwhile, the virtual character can simulate human language expression, gestures, and the like, providing the patient with a realistic communication experience. In some embodiments, the virtual character may be a visual representation of a pre-consultation agent.
In some embodiments, the virtual character may have preset appearance characteristics. In some embodiments, the appearance characteristics of the virtual character may be determined based on optical image data of the doctor with whom the patient registered, or based on the patient's basic information. In some embodiments, the processing device 210 may select an appropriate character from a character library as the virtual character based on the doctor's profile and/or the patient's basic information.
In some embodiments, the pre-consultation query may include multiple rounds of queries. The query content may include the query content of each round in the pre-consultation query, which may be performed through the process 900B shown in fig. 9B.
As shown in fig. 9B, for a first round of querying, the processing device 210 may cause the patient terminal to make the first round of querying based on the corresponding query content.
For each round of querying other than the first round (abbreviated as the current query), the processing device 210 may adjust the query content of the current query (abbreviated as the current query content) based on reference data collected prior to the current query, so that the query content better matches the patient's condition. In particular, the processing device 210 may determine semantic information and emotion information of the patient's historical answers based on the reference data collected prior to the current query. The reference data may include voice signals, image data, text data, etc., collected by the patient terminal. A historical answer is the patient's answer to the query content of a historical round.
The semantic information of a historical answer characterizes the content of the historical answer. The emotion information of a historical answer may indicate the patient's emotion (e.g., calm, tension, anxiety, fear, suspicion, agitation, etc.) at the time the historical answer was provided. The processing device 210 may determine the semantic information by performing text transcription, speech content recognition, etc., on the reference data, and may determine the emotion information by analyzing features of the reference data such as content, pitch, intonation, and pace.
With continued reference to fig. 9B, the processing device 210 may adjust the current query content based on the semantic information and the emotion information. For example, when the patient's emotion information is "tension" or "fear," the processing device 210 may add a pacifying utterance to the current query content. As another example, when the semantic information indicates that the patient did not explicitly answer a historical query, the processing device 210 may adjust the current query content to repeat the historical query, thereby guiding the patient to answer it explicitly; the originally determined current query content may then be used as the query content of the next round. In this way, the current query content can be adjusted in time according to the patient's condition, improving the service quality of the pre-consultation inquiry.
In some embodiments, in addition to adjusting the current query content, the sound characteristics used for the query may also be adjusted in real time based on the patient's status. The sound characteristics may include speech rate features, mood features, intonation features, volume features, etc. As shown in fig. 9B, the processing device 210 may determine the sound characteristics of the current query based on the semantic information and emotion information of the patient's historical answers, and cause the patient terminal to perform the current query based on the adjusted query content and the determined sound characteristics. This approach better accommodates the patient's emotional changes, enhancing the personification of the virtual character and improving the quality of the pre-consultation service.
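For illustration, the following rule-based sketch adjusts both the current query content and the sound characteristics from the semantic and emotion information of previous rounds. The emotion labels, adjustment rules, and sound-feature fields are invented stand-ins for the learned behavior described above.

```python
# Toy per-round adjustment; labels, rules, and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SoundFeatures:
    rate: float = 1.0    # speech-rate multiplier
    pitch: float = 1.0   # intonation/pitch multiplier
    volume: float = 1.0

def adjust_round(content: str, semantics: str, emotion: str):
    sound = SoundFeatures()
    if emotion in ("tension", "fear", "anxiety"):
        # Prepend a pacifying utterance and slow the delivery.
        content = "Please don't worry, take your time. " + content
        sound.rate, sound.pitch = 0.85, 0.95
    if semantics == "unclear_answer":
        # Repeat the unanswered historical question before moving on.
        content = "Let me ask the previous question again. " + content
    return content, sound

print(adjust_round("How long has the pain lasted?", "unclear_answer", "tension"))
```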
In some embodiments, as shown in fig. 9B, the processing device 210 may further obtain physiological state information of the patient. The physiological state information may reflect the patient's real-time physiological state. The physiological state information may include physiological parameter values of the patient (e.g., heart rate, pulse rate, respiration rate, etc.), as well as information related to the patient's posture, limb behavior, facial expression, muscle state, etc. In some embodiments, the physiological state information may be obtained by a wearable device worn by the patient or an image sensor in the patient's environment.
In addition, the processing device 210 may adjust the current query content based on the semantic information, the emotion information, and the physiological state information. In particular, the processing device 210 may update the patient's emotion information according to the physiological state information. It will be appreciated that the patient's internal emotions may not always be adequately expressed in the patient's answers; therefore, the emotion information may be updated or corrected based on the physiological state information. The processing device 210 may then adjust the current query content based on the semantic information and the updated emotion information.
In some embodiments of the present description, by further considering the physiological status data of the patient, the accuracy of the emotional information of the patient may be improved, thereby improving the accuracy of the adjustment of the current query content, and thus improving the quality of service of the pre-query service.
As shown in fig. 9B, in some embodiments, the processing device 210 may determine feedback parameters from at least a portion of the semantic information, the emotion information, and the physiological state information, and control the wearable device to apply feedback to the patient according to the feedback parameters. The feedback may include at least one of force feedback or temperature feedback. The feedback parameters may be used to control the manner in which feedback is applied, e.g., the type of feedback, the body part to which the feedback is applied, the strength of the feedback, etc. In some embodiments, the processing device 210 may determine the patient's emotion and emotion level from at least a portion of the semantic information, the emotion information, and the physiological state information, and determine the feedback parameters from the emotion and emotion level. This approach can soothe the patient's negative emotions in time, thereby improving the quality of the pre-consultation service.
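The mapping from emotion and emotion level to feedback parameters might look like the following sketch; the feedback types, body part, and intensity scale are assumptions, not prescriptions from this specification.

```python
# Illustrative feedback-parameter mapping; all values are assumptions.
from typing import Optional

def determine_feedback(emotion: str, level: int) -> Optional[dict]:
    """Map an emotion and its level (1 = mild .. 3 = severe) to feedback
    parameters: type, body part, and strength."""
    if emotion not in ("tension", "fear", "anxiety"):
        return None  # no soothing feedback needed
    return {
        "type": "temperature" if level < 3 else "force",  # e.g., gentle warmth vs. a pressure pulse
        "body_part": "wrist",
        "strength": min(level / 3.0, 1.0),
    }

print(determine_feedback("tension", 2))  # {'type': 'temperature', ...}
print(determine_feedback("calm", 1))     # None
```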
In some embodiments, the processing device 210 may end the pre-consultation query according to a preset condition. The preset condition may be that the amount of remaining missing information is 0. The preset condition may also be that the patient's remaining estimated wait time is less than a threshold value.
In some embodiments, the query content determined in step 910 may include only the query content of the first round of queries. The current query content of each current query except the first round of queries may be determined during the progress of the pre-consultation query. For example, in the current query, the processing device 210 may input query content of the historical query, historical answers of the patient, known information of the patient, and the like to the second query content determination model, from which the current query content is output.
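A minimal loop for this embodiment could be structured as below, where `second_query_model` is a hypothetical stand-in for the second query content determination model and `ask` represents the patient terminal performing one round.

```python
# Sketch of the round-by-round embodiment above; names are hypothetical stand-ins.
def run_pre_consultation(first_round, known_info, second_query_model, ask, max_rounds=10):
    history = []                       # (question, answer) pairs so far
    question = first_round
    for _ in range(max_rounds):
        answer = ask(question)         # the patient terminal asks and records the answer
        history.append((question, answer))
        # Next-round content is generated from historical queries/answers
        # and the patient's known information.
        question = second_query_model(history, known_info)
        if question is None:           # preset end condition met (e.g., nothing missing)
            break
    return history

# Usage with trivial stubs:
stub_model = lambda hist, info: None if len(hist) >= 2 else "Do you have any allergies?"
print(run_pre_consultation("What brings you in today?", {}, stub_model, ask=lambda q: "(answer)"))
```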
Step 930, generating a pre-consultation record based on the reference data acquired by the patient terminal in the pre-consultation query.
The reference data may include voice data, text data, image data, and the like, which are input by the patient through the patient terminal in making the pre-consultation inquiry. The pre-consultation record may be used to record patient information collected in the pre-consultation query. Optionally, some known information of the patient may also be recorded in the pre-consultation record. In some embodiments, the pre-consultation record is generated in accordance with a pre-consultation record template. The pre-consultation record template can be a template corresponding to a department where a doctor is located or a template set by the doctor.
For example, when the reference data includes a speech signal, the processing device 210 may first transcribe the speech signal into text and extract keywords from the text by a keyword extraction algorithm. Further, the processing device 210 may convert keywords into medical terms. In addition, processing device 210 may obtain a plurality of template fields in the pre-consultation record template, retrieve content corresponding to each of the template fields from the medical terms, and fill in corresponding locations of the pre-consultation record template. The conversion of keywords can be performed based on a term conversion model or based on a knowledge dictionary. The term conversion model may be configured to convert the spoken description into medical terms.
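The transcription-to-template pipeline described above can be sketched as follows; the knowledge dictionary, keyword extractor, and template fields here are toy stand-ins for the term conversion model and the department's actual templates.

```python
# Toy pipeline: transcript -> keywords -> medical terms -> template fields.
SPOKEN_TO_TERM = {"tummy ache": "abdominal pain", "can't sleep": "insomnia"}  # knowledge-dictionary stand-in

def extract_keywords(text: str) -> list:
    return [phrase for phrase in SPOKEN_TO_TERM if phrase in text.lower()]

def fill_template(transcript: str, template_fields: list) -> dict:
    terms = [SPOKEN_TO_TERM[k] for k in extract_keywords(transcript)]  # keyword -> medical term
    record = {field: None for field in template_fields}
    for term in terms:
        # Retrieve content corresponding to each template field (toy routing rule).
        if "pain" in term:
            record["chief_complaint"] = term
        else:
            record["present_illness"] = term
    return record

print(fill_template("I have a tummy ache and I can't sleep.",
                    ["chief_complaint", "present_illness"]))
# -> {'chief_complaint': 'abdominal pain', 'present_illness': 'insomnia'}
```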
In some embodiments, the pre-consultation inquiry may be performed by a terminal device other than the patient terminal (e.g., a waiting terminal).
Fig. 10 is a schematic diagram of an exemplary flow for providing medical outpatient services based on perception information, according to some embodiments of the present description. In some embodiments, flow 1000 may include one or more of sub-flows 1010, 1020, 1030, and 1040. In some embodiments, at least a portion of process 1000 is performed by an inquiry agent configured on the processing device 210 that corresponds to the medical inquiry service.
In the inquiry link, the patient may communicate with the registering doctor to receive an inquiry service (e.g., an on-site inquiry service in a consulting room, a remote inquiry service). In some embodiments, user services associated with the inquiry link may be provided to relevant users (e.g., the doctor, the patient, a remote companion) through at least one terminal device. The at least one terminal device may include a common terminal device in the consulting room, a patient terminal device, a doctor terminal device, a remote companion terminal device, and the like. A common terminal device in the consulting room refers to a terminal device installed on site in the consulting room, which may include a display device, a sound output device, a sound sensor, an XR device, a wearable device, etc., or any combination thereof. Perception information may be collected during a consultation by perception devices in the environment of the patient, the doctor, and the remote companion. A perception device may be a stand-alone device or part of the at least one terminal device.
The sub-flow 1010 may be used to provide consultation advice based on the perception information. The sub-flow 1010 may be performed in the consultation link. As shown in fig. 10, sub-flow 1010 may include step 1012 and step 1014.
Step 1012, generating consultation advice based on the perception information and the patient data of the patient. Consultation advice refers to advice that assists a doctor in providing the medical inquiry service. For example, the consultation advice may include supplemental inquiry advice, physical examination advice, prescription advice, treatment advice, and the like.
In some embodiments, the consultation advice may be determined based on a knowledge database corresponding to the registered department, consultation specifications, and the like. For example, the processing device 210 may determine the dialogue content between the doctor and the patient based on the voice signals collected by the sound sensor, and retrieve from the knowledge database, the consultation specifications, etc., based on the dialogue content and/or the patient data to determine the consultation advice. For example only, a search may be performed in the consultation specifications based on the dialogue content and/or the patient data to determine which information required by the specifications has not yet been collected, and supplemental inquiry suggestions may be provided based on that information.
In some embodiments, the consultation advice may be generated based on a diagnostic model. In particular, the processing device 210 may determine a model input from the perception information and the patient data and feed the model input to the diagnostic model, which may output the corresponding consultation advice. For example, the model input may include the patient data, dialogue content determined based on the voice signals, state information of the patient determined based on the image data, and the like, or any combination thereof.
In some embodiments, the consultation advice may be generated by a consultation agent. The consultation agent may learn a mechanism for generating consultation advice from various data (e.g., historical consultation records, knowledge databases, and consultation specifications) and process the perception information and patient data according to that mechanism to provide the consultation advice.
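As one illustration of the retrieval-based embodiment, the sketch below checks dialogue content against a small consultation specification and emits supplemental inquiry suggestions for items not yet covered; the specification entries and cue words are invented for demonstration.

```python
# Toy consultation-specification retrieval; entries and cues are assumptions.
CONSULTATION_SPEC = {            # information the specification expects to be collected
    "symptom_onset": ["when", "start", "since"],
    "pain_character": ["sharp", "dull", "burning"],
    "family_history": ["family", "mother", "father"],
}

def supplemental_query_suggestions(dialogue: str) -> list:
    """Return supplemental inquiry suggestions for spec items the dialogue
    has not yet covered."""
    text = dialogue.lower()
    return [f"Ask about {item.replace('_', ' ')}"
            for item, cues in CONSULTATION_SPEC.items()
            if not any(cue in text for cue in cues)]

print(supplemental_query_suggestions("The pain is sharp and started last week."))
# -> ['Ask about family history']
```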
Step 1014, controlling at least a portion of the at least one terminal device to present the consultation advice.
For example, when a patient receives the on-site medical inquiry service in the consulting room, the processing device 210 may control the common terminal device or the doctor terminal to present the consultation advice. As another example, when the patient receives the remote medical inquiry service, the processing device 210 may control the doctor's doctor terminal and the patient's patient terminal to present the consultation advice, respectively. The consultation advice may improve the accuracy of diagnosis and prescription and the efficiency of the medical inquiry service.
The sub-process 1020 may be used to generate a target diagnostic record based on the perception information. The sub-process 1020 may be performed at the end of the inquiry link. As shown in fig. 10, sub-process 1020 may include steps 1022, 1024, and 1026.
Step 1022 generates an initial diagnostic record based on the perceptual information.
The initial diagnostic record may be an automatically generated diagnostic record. In some embodiments, the initial diagnostic record may include an initial patient medical record, an initial diagnostic opinion, an initial diagnostic prescription (e.g., an initial treatment prescription and an initial examination prescription), an initial medical order, and the like. In some embodiments, key content may be extracted from the perception information based on a diagnostic record template. Key content refers to content related to the template fields in the diagnostic record template. The key content may be converted into professional content according to a knowledge dictionary or a term conversion model. Further, the diagnostic record template may be filled in based on the professional content and the knowledge database to generate the initial diagnostic record. The knowledge database refers to the knowledge database of the registered department, including, for example, the department's diagnostic specifications (e.g., disease description specifications, diagnosis specifications, prescription specifications, medical order specifications, etc.).
In some embodiments, physical examination data of the patient acquired by one or more examination devices during the outpatient procedure may be obtained, and the initial diagnostic record may be further generated from the physical examination data. In some embodiments, the initial diagnostic record may be generated by a consultation agent. The agent may learn the mechanism by which diagnostic records are generated from various data (e.g., diagnostic record templates, a knowledge dictionary, a knowledge database, etc.) and process the perception information and patient data according to the learned mechanism to generate the diagnostic record.
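A simplified assembly of the initial diagnostic record from key content and physical examination data might look like the following; the field names and the term dictionary are assumptions for demonstration only.

```python
# Toy assembly of an initial diagnostic record; all fields are illustrative.
TERM_DICT = {"high blood pressure": "hypertension"}  # knowledge-dictionary stand-in

def build_initial_record(key_content: dict, exam_data: dict) -> dict:
    # Convert spoken key content into professional terms.
    professional = {k: TERM_DICT.get(v, v) for k, v in key_content.items()}
    return {
        "patient_medical_record": professional,
        "diagnostic_opinion": None,        # to be filled from the knowledge database
        "examination_prescription": exam_data,
        "medical_order": None,             # confirmed by the doctor in step 1026
    }

print(build_initial_record({"complaint": "high blood pressure"}, {"bp": "150/95"}))
```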
At step 1024, the initial diagnostic record is presented to the physician.
For example, when a patient begins a consultation, the processing device 210 may control the common terminal to present an initial diagnostic record. As another example, the processing device 210 may control the doctor terminal to present the initial diagnostic record to the doctor. In some embodiments, the doctor terminal may present the initial diagnostic record to the doctor at a preset time (e.g., after the doctor has completed the current day of inquiry work).
Step 1026, generating a target diagnostic record based on the initial diagnostic record and feedback information entered by the doctor for the initial diagnostic record.
The feedback information entered by the physician may include modifications and/or confirmations of the initial diagnostic record by the physician. The target diagnostic record refers to a diagnostic record that is modified and/or validated by a physician. In some embodiments, the target diagnostic record may include a target patient medical record, a target diagnostic opinion, a target diagnostic prescription (e.g., a target treatment prescription and a target examination prescription), a target medical order, and the like.
By generating the target diagnostic record in this way, on the one hand, errors of manually written diagnostic records can be reduced and the efficiency of generating the record improved; on the other hand, doctors' paperwork can be reduced, allowing doctors to devote more energy to their patients and improving the quality of the medical inquiry service.
The sub-process 1030 may be used to provide a remote companion service based on the perception information. The patient may initiate a request for the remote companion service prior to a consultation. The sub-process 1030 may be performed in the consultation link. As shown in fig. 10, sub-process 1030 may include step 1032 and step 1034.
Step 1032, determining, based on the perception information, whether the patient needs to communicate with a remote companion.
In some embodiments, the processing device 210 may detect whether the patient has issued a request to communicate with the remote companion based on the perception information (e.g., voice data and/or image data). In some embodiments, the processing device 210 may determine state information of the patient based on the perception information and determine whether the patient needs to communicate with the remote companion based on that state information. For example, when the state information indicates that the patient is in a highly stressed or fearful state, the processing device 210 may determine that the patient needs to communicate with the remote companion.
Upon determining that the patient needs to communicate with the remote companion, the processing device 210 may execute step 1034.
Step 1034, controlling at least a portion of the at least one terminal device to enlarge an interface element (e.g., a window displaying the remote companion's real-time picture).
When the patient receives the on-site medical inquiry service in the consulting room, the processing device 210 may control the common terminal device to enlarge the interface element. When the patient receives the remote medical inquiry service, the processing device 210 may control the patient's patient terminal to enlarge the interface element. Through the enlarged interface element, the patient can view a real-time picture of the remote companion and communicate with the remote companion more easily.
In some embodiments, when the patient receives the on-site medical inquiry service in the consulting room, the processing device 210 may, upon detecting that the patient needs to communicate with the remote companion, remind the patient to wear the XR device and control the XR device to present image data of the remote companion.
In some embodiments of the present description, the communication needs of the patient can be detected based on the perception information, and the communication needs can be timely satisfied, so as to provide more humanized care for the patient and provide a more realistic and immersive companion experience.
Sub-process 1040 may be used to present medical data to a target user based on the perceptual information. As shown in fig. 10, sub-process 1040 may include step 1042 and step 1044.
Step 1042, obtaining, based on the perception information, control instructions issued by at least one target user for retrieving at least a portion of the medical data.
The target user may include at least a patient and a doctor. In some embodiments, the target user may also include a remote companion to the patient. The medical data of the patient may include various data reflecting the health condition of the patient (e.g., electronic medical records, medical images, medical physical examination results, etc.).
The control instructions refer to instructions for retrieving at least a portion of medical data (e.g., electronic medical records) for display. For example, the control instructions can be used to retrieve a three-dimensional model of an organ of interest of a patient in an electronic medical record for display. In some embodiments, the control instructions may be used to set a display parameter (e.g., display angle, display size, or display position). In some embodiments, the control instructions may also be used to annotate critical data on medical data (e.g., a three-dimensional model of an organ of interest).
In some embodiments, the perceptual information may include a speech signal captured by a sound sensor, and the control instructions may be obtained by performing semantic analysis on the speech signal. In some embodiments, the target user may issue the control instruction by speaking a preset wake-up word. In some embodiments, the perception information may include optical image data of a target user (e.g., patient and/or physician) acquired by the image sensor, and the control instructions may be acquired by gesture recognition of the target user in the optical image data. In some embodiments, the target user may issue control instructions using a control device (e.g., a remote control, intelligent control glove, etc.).
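A toy parser for the wake-word voice instructions described above is sketched below; the wake word, the command grammar, and the recognized actions are illustrative assumptions rather than the platform's actual instruction set.

```python
# Toy voice-instruction parser; wake word and grammar are assumptions.
import re

WAKE_WORD = "hello platform"

def parse_control_instruction(utterance: str):
    text = utterance.lower().strip()
    if not text.startswith(WAKE_WORD):
        return None                           # ignore speech without the wake word
    command = text[len(WAKE_WORD):].strip(" ,.")
    m = re.match(r"(show|rotate|zoom) (.+)", command)
    if not m:
        return None
    # 'rotate'/'zoom' correspond to display-parameter changes; 'show' retrieves data.
    return {"action": m.group(1), "target": m.group(2)}

print(parse_control_instruction("Hello platform, rotate the heart model"))
# -> {'action': 'rotate', 'target': 'the heart model'}
```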
In some embodiments of the present description, the target user may flexibly adjust display content and/or display parameters, e.g., through voice, gestures, etc., to optimize the user experience and improve the efficiency of the medical inquiry service.
Step 1044, retrieving and presenting at least a portion of the medical data via the at least one terminal device in response to the control instruction.
For example, the processing device 210 may retrieve at least a portion of the medical data from the storage device and control the at least one terminal device to present the at least a portion of the medical data. When the control instructions include display parameters, the processing device 210 may control the at least one terminal device to present at least a portion of the medical data based on the display parameters.
In some embodiments of the present disclosure, a plurality of target users may browse medical data together through at least one terminal device, and may synchronously change presentation contents and presentation manners of medical data on different terminal devices, which is helpful to improve communication efficiency of the target users and enhance interactivity in a treatment process.
Fig. 11 is a schematic diagram of an exemplary flow for providing services to relevant users in the admission link, according to some embodiments of the present description. In the admission link, the patient can complete the relevant procedures for admission to the hospital. In some embodiments, at least a portion of the process 1100 is performed by a hospitalization agent corresponding to a hospitalization service configured on the processing device 210. In some embodiments, at least a portion of the process 1100 (e.g., steps 1130-1150) is performed by a care agent corresponding to a care service configured on the processing device 210.
At step 1110, the processing device 210 may direct the patient to a ward.
For example, the processing device 210 may instruct the patient terminal of the patient to guide the patient to the ward. In response to a hospital guidance request, the processing device 210 may acquire a first location of the patient terminal and a second location of the ward, and determine a planned route from the first location to the second location based on a real-time map of the hospital. The processing device 210 may then instruct the patient terminal to present guidance information related to the planned route to the patient.
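The route planning step can be illustrated with a shortest-path search over a graph view of the real-time hospital map; the node names below are invented, and a production planner would also consider elevators, congestion, and accessibility.

```python
# Breadth-first shortest-path sketch over a hypothetical hospital-map graph.
from collections import deque

def plan_route(hospital_map: dict, first_location: str, second_location: str):
    """Search from the patient terminal's location (first) to the ward (second)."""
    queue = deque([[first_location]])
    visited = {first_location}
    while queue:
        path = queue.popleft()
        if path[-1] == second_location:
            return path
        for nxt in hospital_map.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no walkable route found

MAP = {"lobby": ["elevator_a"], "elevator_a": ["floor3_corridor"],
       "floor3_corridor": ["ward_301"]}
print(plan_route(MAP, "lobby", "ward_301"))
# -> ['lobby', 'elevator_a', 'floor3_corridor', 'ward_301']
```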
Step 1120, providing admission instruction information to the patient.
The admission instruction information may be used to introduce admission information (e.g., admission procedure, admission operation, pre-admission fee, payment method, etc.), admission rules, the hospital environment, the patient's doctor and/or nurse, etc., to the patient. In some embodiments, the processing device 210 may cause the patient terminal device (e.g., XR device 260-2) to present a virtual character that provides the admission instruction information.
At step 1130, the processing device 210 may assist the nurse in the admission preparation.
Admission preparation may include a nurse preparing hospital supplies for the patient. In some embodiments, the processing device 210 may present the patient's admission notification through a nurse terminal 1105 in the nurse workstation or the smart care cart 240-4 to assist the nurse in performing the admission preparation. The admission notification may include the patient's patient data, a hospital supply list for the patient, ward information for the patient, initial examination information for the patient, and the like.
The admission check may also be referred to as an inpatient examination, which may be performed after the patient arrives at the ward. The admission check may be used to gather information about the patient's current medical condition (e.g., vital signs, basic health data, etc.). The admission check may include an examination of blood pressure, blood glucose, heart rate, body temperature, or the like, or any combination thereof.
In step 1140, the processing device 210 may issue a reminder to perform the admission check. The reminder may include a message reminder, a sound reminder, a pop-up reminder, etc. For example, the processing device 210 may instruct the nurse terminal 1105 or the intelligent care cart 240-4 to present a reminder.
In some embodiments, the processing device 210 may determine whether the patient satisfies the condition for the admission check based on perception information collected by the perception devices in the ward. The condition may include that the patient has been in the ward for a certain period of time. If the patient satisfies the condition, the processing device 210 may issue a reminder to proceed with the admission check.
In step 1150, the processing device 210 may direct the nurse to the ward. In some embodiments, the processing device 210 may control movement of the smart care cart 240-4 to guide a nurse into a patient room.
At step 1160, an admission check is performed on the patient.
For example, after a nurse arrives at the ward, one or more examination devices may be used to perform the admission check on the patient and collect the patient's physical examination data. In some embodiments, after the smart care cart reaches the ward, the processing device 210 may instruct the smart care cart to present information related to the admission check to the nurse during the check. For example, the intelligent care cart can present an admission check diagram, the patient's electronic medical record, and the like.
In step 1170, the processing device 210 generates an admission record.
An admission record refers to a record reflecting the patient's status upon admission to the ward and/or the process of admitting the patient to the ward. The admission record may include admission information (e.g., admission number, clinical information, time of admission, pre-admission amount, payment method, etc.), the physical examination data collected during the admission check, and the like.
In some embodiments, the processing device 210 may generate the admission record based on an admission record template and the physical examination data. In some embodiments, the processing device 210 may further generate the admission record based on the patient's electronic medical record. In some embodiments, the processing device 210 may present the admission record to the nurse via the smart care cart 240-4 or the nurse terminal 1105 and generate a target admission record based on the admission record and feedback information on the admission record entered by the nurse through the smart care cart 240-4 or the nurse terminal 1105. The feedback information may include confirmation instructions, modification instructions, etc., entered by the nurse.
In some embodiments of the present description, the patient may be provided with the admission service in a semi-automated manner with the assistance of a medical service system (e.g., the intelligent care cart 240-4) and/or an agent, which may reduce labor costs and increase the efficiency of the admission service.
Fig. 12 is a schematic diagram of a process for providing care services, shown in accordance with some embodiments of the present description. In some embodiments, the process 1200 may be performed daily during patient hospitalization to provide care services to the patient. In some embodiments, at least a portion of the process 1200 may be performed by a hospitalization agent corresponding to a hospitalization service configured on the processing device 210. In some embodiments, at least a portion of the process 1200 may be performed by a care agent corresponding to a care service configured on the processing device 210.
In step 1202, the processing device 210 determines a daily plan for the patient based on patient data for the patient and the physician's order for the patient.
Physician orders for a patient refer to instructions or directives that the physician gives regarding the patient. In some embodiments, the patient's orders may be stored in a storage device and updated whenever any physician places a new order for the patient. The processing device 210 may retrieve the latest version of the orders from the storage device. In some embodiments, the processing device 210 may monitor various hardware devices to detect whether the patient's orders have been updated. For example, when an admission inquiry service and/or a ward round service is provided to the patient, the patient's doctor may place a new order for the patient. The processing device 210 may detect the new order based on perception information collected by the perception devices during the admission inquiry service and/or the ward round service. Once detected, the new order may be stored in the storage device. As another example, a doctor may update the orders in the storage device through the doctor terminal. In some embodiments, the processing device 210 may determine the physician orders based on the patient's electronic medical record.
In some embodiments, the processing device 210 may determine the daily plan for the patient based on the patient data and the patient's orders. The daily plan may include at least one medical operation that needs to be performed on the patient that day. By way of example, a medical operation may include a care operation, an examination operation, and the like.
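One possible shape of the daily-plan derivation is sketched below, filtering the latest orders by validity date and sorting by scheduled time; the order fields and operation names are assumptions, and a deployed planner would also schedule around ward resources and staff shifts.

```python
# Toy daily-plan derivation from physician orders; fields are assumptions.
from datetime import date

def build_daily_plan(patient_data: dict, orders: list, today: date) -> list:
    plan = []
    for order in orders:
        if order["start"] <= today <= order["end"]:   # order valid today
            plan.append({
                "operation": order["operation"],       # e.g., care or examination operation
                "scheduled_time": order.get("time", "09:00"),
                "notes": f"for patient {patient_data['name']}",
            })
    return sorted(plan, key=lambda item: item["scheduled_time"])

orders = [{"operation": "blood_pressure_check", "time": "08:00",
           "start": date(2024, 7, 30), "end": date(2024, 8, 5)}]
print(build_daily_plan({"name": "Zhang"}, orders, date(2024, 7, 31)))
```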
In step 1204, the processing device 210 may present the daily plan to the patient through a common terminal device (e.g., bedside terminal 240-6) within the patient room.
In step 1206, the processing device 210 may present the daily plan to a nurse corresponding to the patient through a nurse terminal (e.g., a terminal device in a nurse workstation, etc.).
In step 1208, when the daily plan includes at least one care operation, the nurse may perform the at least one care operation on the patient, and the processing device 210 may assist the nurse in performing the at least one care operation according to the daily plan.
As shown in fig. 12, for each of the at least one care operation, the processing device 210 may control the intelligent care cart to direct a nurse to proceed to a ward for the care operation according to the scheduled time of the care operation. For example, prior to the scheduled time of a care operation, the smart care cart may be controlled to move to a nurse's workstation to notify the nurse that a care operation needs to be performed for the patient. The intelligent care cart can then be controlled to move and guide the nurse to the patient's ward. The processing device 210 may further control the intelligent care cart to present care instructions regarding care operations after the nurse arrives at the ward.
At step 1210, the processing device 210 may generate a care record.
A care record refers to a record of the care operations applied to a patient and/or the patient's status (e.g., vital signs and other physiological measurements) before, during, or after a care operation. In some embodiments, while the at least one care operation is performed, the processing device 210 may obtain perception information collected by one or more perception devices in the ward and generate the care record based on the perception information. In some embodiments, the care record may be displayed to the nurse for confirmation via the intelligent care cart or the nurse terminal.
In some embodiments of the present description, the automatic generation of daily plans and care records may significantly reduce the workload of nurses. This automation enables nurses to focus more on directly caring for patients than on administrative tasks. Furthermore, monitoring of order updates ensures that the daily plan is updated in time. This proactive approach ensures that interventions and care plans are adjusted promptly according to the latest physician orders, thereby improving the effect and quality of care.
Fig. 13 is an exemplary schematic diagram of a preoperative guidance procedure shown in accordance with some embodiments of the present description. In some embodiments, at least a portion of process 1300 may be performed by a surgical agent corresponding to a surgical service configured on processing device 210.
The pre-operative guidance may include patient escort, patient verification, preoperative care, preoperative cleaning, venous access establishment, and the like. Patient escort refers to transporting the patient from his or her current location to a waiting area of the operating room.
Patient verification refers to verifying whether a patient meets the surgical criteria. For example, patient verification may include verifying that the identity information of the verification object matches the target patient of the current surgical procedure, verifying that the verification object's surgical procedure is currently scheduled, and verifying that the verification object's current physical condition meets the requirements of the surgical procedure. It will be appreciated that if the verification object fails to meet any of the surgical criteria, the patient's surgical procedure may be delayed or canceled.
In some embodiments, the processing device 210 may collect biometric information of the patient through one or more perception devices in the waiting area and verify the identity of the patient based on the biometric information. For example, as shown in fig. 13, after the patient is transported to the waiting area 1310, the processing device 210 may collect biometric information of the patient 261 via one or more perception devices 1311 (e.g., image acquisition devices, microphones, fingerprint sensors, etc.) in the waiting area 1310 and verify the identity of the patient 261 based on the biometric information. In some embodiments, the processing device 210 may utilize a nurse agent to verify the identity of the patient. For example, the nurse agent may verify the collected biometric information or verify the patient's identity through voice interaction with the patient (e.g., asking the patient's age, name, gender, etc.).
Preoperative care may include preoperative pacifying and preoperative education. Preoperative pacifying refers to preoperative preparation that helps patients alleviate negative emotions (e.g., anxiety, stress, fear, etc.) through verbal communication, video, music, etc. Preoperative education refers to preoperative preparation that helps patients understand the surgical procedure. Preoperative cleaning refers to preoperative preparation such as body cleaning, hair removal (e.g., hair, body hair, etc.), and dressing the patient in a surgical gown. Venous access establishment refers to establishing a venous access in the patient for drug injection, ensuring that drugs can be administered effectively during surgery.
In some embodiments, the processing device 210 may determine a planned path from the current location of the patient to the waiting area and control the intelligent wheelchair to transport the patient to the waiting area along the planned path. For example, as shown in fig. 13, the processing device 210 may determine a planned path from the current location of the patient 261 (e.g., ward 1303) to the waiting area 1310 before performing a pre-operative operation on the patient according to the surgical plan. The processing device 210 may control the intelligent wheelchair 240-5 to transport the patient 261 from the patient room 1303 to the waiting area 1310 along the planned path.
In some embodiments, the processing device 210 may determine a planned path from the current location to the waiting area based on the hospital map. In some embodiments, processing device 210 may configure a nurse agent that performs certain tasks in place of a nurse and may assume a virtual nurse role. The processing device 210 may use a nurse agent to control the intelligent wheelchair to transport the patient from the current location to the waiting area. In some embodiments, the processing device 210 may perform patient verification after the patient is transported to the waiting area.
In some embodiments, the processing device 210 may determine the pre-operative care materials of the patient from the patient data and the surgical plan. The processing device 210 may use the patient terminal to provide pre-operative education to the patient based on the pre-operative care materials while the patient is transported to the waiting area. The pre-operative care materials may include video, music, images, text, and other materials related to surgical explanation and/or emotional relaxation.
In some embodiments, the processing device 210 may provide pre-operative education to the patient using a nurse agent. For example, as shown in FIG. 13, processing device 210 may present a virtual nurse character 1323 on XR device 260-2 worn by patient 261, virtual nurse character 1323 speaking pre-operative care material to patient 261. In some embodiments, the virtual nurse character 1323 may interact with the patient 261 in voice to alleviate the patient's negative emotion or answer the patient's question through communication. In some embodiments, the processing device 210 may determine whether it is necessary to alleviate the patient's mood by collecting the patient's facial expressions, physiological signs, intonation, etc.
In some embodiments, the processing device 210 may use a nurse agent to instruct the nurse to perform pre-operative cleaning and/or establish venous access.
In some embodiments, while the patient is transported to the waiting area, the processing device 210 may acquire, through one or more perception devices in the hospital (e.g., the image sensor 1313), perception information related to the portion of the planned path from the current location of the intelligent wheelchair to the waiting area (e.g., the portion of the planned path that the intelligent wheelchair has not yet traveled). Based on the perception information, the processing device 210 may determine a potential risk in the non-traveled portion of the planned path and update the non-traveled portion based on the potential risk.
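The risk-driven re-planning of the untraveled path portion could be sketched as follows, where `replan` is a hypothetical callback into the hospital-map planner with the risky nodes excluded; node names are invented.

```python
# Sketch of updating the non-traveled portion of a planned path around risks.
def update_path(planned_path: list, current_index: int, risky_nodes: set, replan) -> list:
    """Keep the traveled prefix; re-plan the remainder around risky nodes."""
    traveled = planned_path[:current_index + 1]
    remaining = planned_path[current_index + 1:]
    if not any(node in risky_nodes for node in remaining):
        return planned_path                       # no potential risk detected
    detour = replan(traveled[-1], planned_path[-1], avoid=risky_nodes)
    return traveled + detour[1:]                  # splice the detour onto the prefix

# Usage with a trivial re-planner stub (a real system would call the
# hospital-map planner with the risky nodes removed):
stub = lambda start, goal, avoid: [start, "corridor_b", goal]
print(update_path(["ward", "corridor_a", "waiting_area"], 0, {"corridor_a"}, stub))
# -> ['ward', 'corridor_b', 'waiting_area']
```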
In some embodiments of the present disclosure, the above preoperative guidance procedure provides a humanized, transparent, and efficient preoperative preparation process in which preparation items may be dynamically adjusted according to patient feedback, improving preparation efficiency. Transporting the patient and verifying the patient's identity as described above avoids human error and improves the safety of the entire surgical process. Much of the preoperative preparation is completed with the assistance of the virtual nurse character, which can save labor costs.
Fig. 14 is a schematic illustration of a surgical execution flow shown in accordance with some embodiments of the present description. The surgical execution flow 1400 may include preoperative preparation, intraoperative events, and postoperative events. In some embodiments, at least a portion of process 1400 may be performed by a surgical agent corresponding to a surgical service configured on the processing device 210. As shown in fig. 14, the preoperative preparation may include steps 1411, 1413, and 1415.
Step 1411, the operating room is activated.
Activating the operating room may include opening the operating room door, activating the surgical and monitoring devices in the operating room, adjusting parameters in the operating room, verifying the status of the surgical devices, and the like. In some embodiments, the processing device 210 may control the intelligent robotic nurse to activate the operating room or direct a nurse to do so. For example, the processing device 210 may control the intelligent robotic nurse to automatically activate the operating room devices at a predetermined time before surgery and adjust the room temperature, humidity, and air quality.
Step 1413, preparing a surgical tool.
Surgical tools may include surgical instruments and surgical consumables. In some embodiments, the processing device 210 may control the intelligent robotic nurse to prepare surgical tools in the operating room prior to surgery according to the surgical plan. In some embodiments, the processing device 210 may further control the intelligent robotic nurse to disinfect the surgical table and to deploy the surgical table (e.g., where various surgical tools are deployed on the surgical table).
Step 1415, patient identification and/or patient anesthesia. Patient confirmation refers to confirming the identity of the patient. Patient anesthesia refers to the administration of anesthesia to a patient.
Step 1420, a surgical procedure is performed. In some embodiments, as shown in fig. 14, the intraoperative stage can include remote collaboration, tool transfer, image interaction, intraoperative planning and navigation, and real-time alerting.
Remote collaboration refers to remote participation and/or remote guidance during a surgical procedure.
Tool delivery refers to delivering surgical tools to the surgical executor during the procedure. In some embodiments, the processing device 210 may identify instructions for a target surgical tool issued by a surgical participant based on first perception information acquired by one or more first perception devices in the operating room during the procedure. Based on these instructions, the processing device 210 may control the intelligent robotic nurse to deliver the target surgical tool to the surgical participant.
Image interaction refers to displaying a digital manikin of a patient (e.g., a three-dimensional anatomical model of a surgical site), an electronic medical record of the patient, a surgical plan of a current surgery, a real-time image of a surgical site of the patient, etc., to a surgical participant (e.g., a local surgical participant, a teleoperational participant) and/or the patient through an interaction device within the operating room (e.g., a display screen within the operating room, a doctor terminal device 270).
Intraoperative planning and navigation refers to fusing the patient's lesion image (e.g., a CT scan image of a lesion) with the patient's digital manikin, projecting the lesion image onto the patient's body, or overlaying the positioning and tracking of surgical tools, so as to guide the surgical participants during the procedure.
Real-time alarms may include behavioral alarms of surgical participants, patient vital sign alarms, and device operational status alarms, among others. Behavioral alarms refer to the monitoring and alerting of surgical participants of intraoperative operational behavior. Patient vital sign alarms may be activated when abnormalities occur in a patient's vital signs (e.g., electrocardiogram, blood pressure, etc.). The device operation state alarm refers to an alarm issued when an abnormal operation state of the surgical device occurs.
As shown in fig. 14, the post-operative stage may include steps 1431, 1433, and 1435.
Step 1431, the patient is transferred. Transferring a patient refers to the process of transferring the patient from the operating room to the rehabilitation section after the surgical procedure is completed. In some embodiments, transferring the patient may be performed by a healthcare professional with the assistance of a smart robotic nurse.
At step 1433, operating room cleaning is performed. Operating room cleaning refers to the process of cleaning or disinfecting surgical equipment and tools. In some embodiments, the processing device 210 may control the intelligent robotic nurse to perform operating room cleaning.
At step 1435, a surgical report is generated.
The surgical report may include surgical related information, patient related records, participant related records, and the like. In some embodiments, processing device 210 may generate an initial surgical report from data collected during a surgical procedure (e.g., sensory information collected by one or more sensory devices in an operating room). The processing device 210 may generate the surgical report based on the initial surgical report and feedback information entered by the physician regarding the initial surgical report.
In some embodiments, the processing device 210 may also monitor the patient's post-operative vital signs through a vital sign monitoring device (e.g., an electrocardiograph, a sphygmomanometer, etc.) in the patient's room to determine whether the post-operative vital signs are within the normal range, whether there is any abnormality, and whether the recovery progress is normal. Furthermore, the processing device 210 may update the medical advice report based on the patient's post-operative vital signs. In some embodiments, the processing device 210 may update the medical advice report according to the doctor's instructions. In some embodiments, the processing device 210 may send the updated medical advice report to a display device of the nurse workstation and/or a display device of the doctor workstation.
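A minimal range check for the post-operative monitoring might look like the sketch below; the normal ranges shown are generic adult reference values used purely for illustration, and clinical thresholds must come from the medical team.

```python
# Illustrative vital-sign range check; ranges are generic reference values only.
NORMAL_RANGES = {"heart_rate": (60, 100), "body_temp": (36.1, 37.2),
                 "systolic_bp": (90, 140)}

def check_vital_signs(measurements: dict) -> dict:
    """Flag each measured sign as 'normal' or 'abnormal' against its range."""
    report = {}
    for sign, value in measurements.items():
        low, high = NORMAL_RANGES[sign]
        report[sign] = "normal" if low <= value <= high else "abnormal"
    return report

print(check_vital_signs({"heart_rate": 110, "body_temp": 36.8}))
# -> {'heart_rate': 'abnormal', 'body_temp': 'normal'}
```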
In some embodiments, the processing device 210 may determine the post-operative care plan from the updated medical advice report. A post-operative care plan refers to care tasks that need to be performed by a caregiver (e.g., nurse, care assistant, etc.) during a patient's post-operative hospital stay. In some embodiments, the processing device 210 may control a smart surgical device (e.g., the smart care cart 240-4) to provide care to the patient according to a post-operative care plan. In some embodiments, the processing device 210 may send the post-operative care plan to a nurse so that the nurse provides post-operative care to the patient. In some embodiments, the processing device 210 may update the post-operative care plan in real-time during the care session according to the patient's condition. The execution of the post-operative care plan is similar to the daily plan described in fig. 12.
In some embodiments, the processing device 210 may generate surgical results and operation records for the doctor from the surgical report and the medical advice report, for the doctor to review the surgical procedure. A surgical result refers to data reflecting the outcome of a surgical procedure. In some embodiments, the surgical results may further include summary data of the doctor's surgical results over a predetermined period of time (e.g., one month). An operation record refers to a record of the doctor's operations during the surgical procedure, and may include action records, effort records, station records, and the like.
In some embodiments, the surgical procedure may be reviewed. For example, the processing device 210 may present the doctor's surgical results and operation records to the doctor, allowing the doctor to review the surgical procedure.
In some embodiments, processing device 210 (e.g., an agent configured on processing device 210) may invoke data processing units in data processing layer 330 for data processing to implement or support at least some of the operations in processes 900A-1400 as described above.
It should be noted that the above description of related processes is only for example and illustration, and does not limit the application scope of the present disclosure. Various modifications and alterations to this process will be apparent to those skilled in the art in light of the present description.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by this specification and are therefore intended to fall within the spirit and scope of the exemplary embodiments of this specification.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, the order in which the elements and sequences are processed, the use of numerical letters, or other designations in the description are not intended to limit the order in which the processes and methods of the description are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation disclosed in this specification and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, does not imply that the subject matter of this specification requires more features than are recited in the claims. Indeed, claimed subject matter may lie in less than all features of a single embodiment disclosed above.
In some embodiments, numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of the embodiments are modified in some examples by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows for a variation of 20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending on the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general method of preserving digits. Although the numerical ranges and parameters used to confirm the breadth of ranges in some embodiments of this specification are approximations, in specific embodiments such numerical values are set as precisely as practicable.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., referred to in this specification is incorporated herein by reference in its entirety. Excluded are application history documents that are inconsistent with or conflict with the contents of this specification, as well as documents (currently or later appended to this specification) that limit the broadest scope of the claims of this specification. It is noted that, if the description, definition, and/or use of a term in material appended to this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.