CN119811606A - A system and method for providing inpatient services - Google Patents

A system and method for providing inpatient services

Info

Publication number
CN119811606A
CN119811606A
Authority
CN
China
Prior art keywords
patient
data
ward
information
record
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411743194.4A
Other languages
Chinese (zh)
Inventor
张丽
魏元康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Lianying Zhiyuan Medical Technology Co ltd
Original Assignee
Shanghai Lianying Zhiyuan Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Lianying Zhiyuan Medical Technology Co ltd
Priority to CN202510046027.2A (CN119811610A)
Priority to CN202510046811.3A (CN119626491A)
Priority to CN202510039867.6A (CN119560117A)
Priority to CN202510052424.0A (CN119541801A)
Priority to CN202510039748.0A (CN119541800A)
Publication of CN119811606A
Legal status: Pending

Abstract

Translated from Chinese


Embodiments of this specification provide a method and system for providing hospitalization services. The method includes: monitoring data sources configured to collect data related to a patient's hospitalization process; in response to detecting that at least one of the data sources has a data update, detecting an event of interest in the updated data collected by the at least one data source, based on patient data of the patient and detection rules for the event of interest, where the detection rules are learned by an intelligent agent from historical service records; and in response to detecting that the event of interest has occurred, performing one or more preset operations to provide at least part of the hospitalization service, based on the patient data and a correspondence between the event of interest and the preset operations, where the correspondence is learned by the intelligent agent from the historical service records.

Description

System and method for providing hospitalization service
Cross reference
The present application claims priority from International application PCT/CN2024/109058, entitled "SYSTEMS AND METHODS FOR PROVIDING HOSPITALIZATION SERVICES", filed on July 31, 2024, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of medical services, and more particularly to a system and method for providing hospitalization services.
Background
Hospitalization is a service provided for patients who require inpatient treatment or monitoring of their condition. The duration of hospitalization depends on the severity of the condition and the required treatment. During hospitalization, patients may receive round-the-clock medical care, including diagnostic testing, treatment, nursing, and supervision by medical personnel. However, conventional hospitalization services face a number of challenges. For example, inefficient communication and information gaps between healthcare workers (e.g., nurses, doctors) and/or different departments of a hospital often lead to treatment errors and delays, a lack of patient-centric care, patient dissatisfaction, poorer recovery outcomes, and the like. In addition, healthcare workers often spend a significant amount of time on administrative tasks, including completing various documents, a burden that may impact the efficiency of healthcare delivery and the well-being of the healthcare workers.
Accordingly, it is desirable to provide systems and methods for providing hospitalization services that improve service efficiency, service accuracy, and patient satisfaction (e.g., the patient experience and, more importantly, an immersive healthcare experience).
Disclosure of Invention
One aspect of the embodiments of the present description provides a method of providing hospitalization services. The method is performed by a computing device that includes at least one processing device and at least one storage device. The computing device is configured with an agent, at least a portion of the method is performed by the agent, and the agent implements self-evolution based on artificial intelligence techniques. The method may include monitoring data sources. The data sources may be configured to collect data related to a patient's hospitalization process. The method may include, in response to detecting that at least one of the data sources has a data update, detecting an event of interest in the updated data collected by the at least one data source, based on patient data of the patient and detection rules for the event of interest. The detection rules are learned by the agent from historical service records. The method may further include, in response to detecting the occurrence of the event of interest, performing one or more preset operations to provide at least a portion of the hospitalization service, based on the patient data and a correspondence between the event of interest and the preset operations. The correspondence is learned by the agent from the historical service records.
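The monitor-detect-operate loop described above can be sketched as a simple rule engine. This is a minimal illustration only: the names (DetectionRule, HospitalizationAgent, the heart-rate rule) are hypothetical, and in the actual method both the detection rules and the event-to-operation correspondence would be learned by the agent from historical service records rather than hand-written.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DetectionRule:
    """A learned rule mapping a data update to an event of interest."""
    event: str
    predicate: Callable[[dict, dict], bool]  # (patient_data, update) -> bool

@dataclass
class HospitalizationAgent:
    """Monitors data updates, detects events of interest, and triggers
    the preset operations associated with each detected event."""
    rules: List[DetectionRule] = field(default_factory=list)
    operations: Dict[str, List[Callable[[dict], str]]] = field(default_factory=dict)

    def on_data_update(self, patient_data: dict, update: dict) -> List[str]:
        # Check every detection rule against the updated data; when an
        # event of interest fires, run its preset operations.
        results: List[str] = []
        for rule in self.rules:
            if rule.predicate(patient_data, update):
                for op in self.operations.get(rule.event, []):
                    results.append(op(patient_data))
        return results

# Hypothetical rule: a heart rate above 120 bpm is an event of interest
# whose preset operation is to notify the patient's nurse.
agent = HospitalizationAgent(
    rules=[DetectionRule("abnormal_heart_rate",
                         lambda p, u: u.get("heart_rate", 0) > 120)],
    operations={"abnormal_heart_rate":
                [lambda p: f"notify nurse of patient {p['id']}"]},
)
print(agent.on_data_update({"id": "P001"}, {"heart_rate": 130}))
# -> ['notify nurse of patient P001']
```

An update that matches no rule simply produces no operations, which mirrors the event-driven character of the claimed method: preset operations run only when an event of interest is detected.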
One aspect of the embodiments of the present description provides a method of providing an admission inquiry. The method is implemented on a computing device having at least one processor and at least one storage device. The method may include conducting a first inquiry, based on patient data of the patient, via a first terminal device provided in the patient's ward. The first terminal device conducts the first inquiry by presenting a virtual avatar. The method may include obtaining first perception information. The first perception information is collected by a sensing device in the ward during the first inquiry. The method may further include generating an admission record for the patient based on the first perception information.
One aspect of the embodiments of the present specification provides a method of providing admission guidance. The method is implemented on a computing device having at least one processor and at least one storage device. The method may include controlling a terminal device of the patient to display route guidance information to guide the patient to the ward in response to a hospitalization guidance request. The method may include controlling the patient's terminal device to present admission instructions to the patient on the way to the ward. The content of the admission instructions is determined based on patient data of the patient. The method may further comprise determining, after the patient arrives at the ward, whether the patient meets the conditions for performing an admission check. The method may further include, in response to determining that the patient meets the conditions, controlling an intelligent care cart to issue a reminder to the nurse corresponding to the patient to conduct the admission check.
One aspect of the embodiments of the present description provides a method of providing hospitalization care. The method is implemented on a computing device having at least one processor and at least one storage device. The method may include determining a daily plan for the patient, for each day of the patient's hospitalization, based on patient data of the patient and the patient's medical orders. The daily plan includes the medical operations that need to be performed on the patient that day, including care operations. The method may include presenting the daily plan to the patient's corresponding nurse via the nurse's terminal. The method may further comprise, for each care operation, controlling an intelligent care cart to guide the nurse to the ward where the patient is located according to the planned time of the care operation, and controlling the intelligent care cart to display care instructions related to the care operation after the nurse arrives at the ward.
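As a rough illustration of how such a daily plan might be assembled, the sketch below merges a day's doctor's orders with routine care operations into a single time-sorted plan. All names and data here are hypothetical assumptions for illustration, not the patented method itself.

```python
from datetime import time

def build_daily_plan(patient_orders, care_defaults):
    """Merge the day's doctor's orders with routine care operations
    into a single time-sorted daily plan for the nurse."""
    plan = [{"time": t, "operation": op, "kind": "order"}
            for t, op in patient_orders]
    plan += [{"time": t, "operation": op, "kind": "care"}
             for t, op in care_defaults]
    # Sort by planned time so the nurse can work through the day in order.
    return sorted(plan, key=lambda item: item["time"])

# Hypothetical data: one medication order plus two routine care operations.
plan = build_daily_plan(
    patient_orders=[(time(9, 0), "IV antibiotics")],
    care_defaults=[(time(8, 0), "measure vital signs"),
                   (time(14, 0), "wound dressing change")],
)
print([item["operation"] for item in plan])
# -> ['measure vital signs', 'IV antibiotics', 'wound dressing change']
```

Each entry's planned time is what would drive the intelligent care cart's guidance to the ward at the corresponding moment.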
One aspect of the embodiments of the present description provides a method of providing ward round services. The method is implemented on a computing device having at least one processor and at least one storage device. The method may include obtaining perception information. The perception information is acquired by a sensing device in the ward where the patient is located during a ward round by the doctor. The method may include detecting, based on the perception information, an instruction from the doctor to retrieve patient data during the ward round, and controlling a terminal device arranged in the ward to present the patient data using an augmented reality technique in response to the instruction. The method may further include generating a ward round record based on the perception information and presenting the ward round record to the doctor for confirmation after the ward round.
One aspect of the embodiments of the present specification provides a method of providing follow-up management. The method is implemented on a computing device having at least one processor and at least one storage device. The method may include determining a follow-up plan for the patient based on the patient's target hospitalization record. The follow-up plan includes one or more follow-up visits performed at one or more scheduled times. The method may include, for each follow-up visit, reminding the doctor and the patient, via the doctor's terminal device and the patient's terminal device respectively, according to the scheduled time corresponding to the follow-up visit. At least one of the one or more follow-up visits is a remote follow-up visit. The method may further comprise presenting, for each remote follow-up visit, a virtual follow-up space to the doctor and the patient via their respective terminal devices for the remote follow-up. The method may further include generating a follow-up record based on perception information collected by a sensing device during each follow-up visit.
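A minimal sketch of deriving such a follow-up plan is shown below, assuming hypothetical visit intervals and the illustrative convention that visits after the first are held remotely in the virtual follow-up space. None of these names or values come from the specification itself.

```python
from datetime import date, timedelta

def follow_up_plan(discharge_date, intervals_days, remote_after=1):
    """Schedule follow-up visits at fixed day offsets from the discharge
    date; visits after the first `remote_after` visits are remote."""
    return [{"date": discharge_date + timedelta(days=days),
             "mode": "remote" if i >= remote_after else "on-site"}
            for i, days in enumerate(intervals_days)]

# Hypothetical plan: visits 7, 30, and 90 days after discharge.
plan = follow_up_plan(date(2024, 8, 1), [7, 30, 90])
print([(v["date"].isoformat(), v["mode"]) for v in plan])
# -> [('2024-08-08', 'on-site'), ('2024-08-31', 'remote'), ('2024-10-30', 'remote')]
```

The scheduled dates are what would drive the reminders sent to the doctor's and the patient's terminal devices.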
One aspect of the embodiments of the present specification provides a system comprising a processor. The processor is configured to perform any of the methods described above.
One aspect of the embodiments of the present specification provides a storage medium. The storage medium stores computing instructions that, when read by a processor, cause the processor to perform any of the methods described above.
Additional features of part of the present description will be set forth in the description that follows. These features will be readily apparent to those skilled in the art from a study of the following description and accompanying drawings, or from the manufacture or operation of the embodiments. The features of the present specification may be implemented and realized in the practice or use of the various aspects of the methods, tools, and combinations set forth in the detailed examples discussed below.
Drawings
The present specification will be further elucidated by way of example embodiments, which will be described in detail by means of the accompanying drawings. The embodiments are not limiting, in which like numerals represent like structures, wherein:
FIG. 1 is an exemplary block diagram of a healthcare system shown according to some embodiments of the present description;
FIG. 2 is a schematic illustration of a scenario of a healthcare system shown according to some embodiments of the present description;
FIG. 3 is a schematic diagram of the structure of a hospital support platform according to some embodiments of the present disclosure;
FIG. 4 is a schematic illustration of a scene of a hospitalization system according to some embodiments of the present description;
FIG. 5 is an exemplary flow chart of hospitalization procedures shown according to some embodiments of the present description;
FIG. 6 is an exemplary schematic diagram of a processing device according to some embodiments of the present description;
FIG. 7 is an exemplary flow chart for providing hospitalization services according to some embodiments of the present description;
FIG. 8 is an exemplary flow chart for providing hospitalization admission services according to some embodiments of the present description;
FIG. 9 is an exemplary flow chart showing guidance information, according to some embodiments of the present description;
FIG. 10 is an exemplary diagram illustrating the presentation of guidance information by an augmented reality device according to some embodiments of the present description;
FIG. 11 is an exemplary schematic diagram of providing hospitalization admission services according to some embodiments of the present description;
FIG. 12 is an exemplary flow chart for providing an admission inquiry service according to some embodiments of the present description;
FIG. 13 is a schematic diagram showing first query content by an augmented reality device according to some embodiments of the present description;
FIG. 14 is an exemplary flow chart for updating admission records shown in accordance with some embodiments of the present description;
FIG. 15 is an exemplary flow chart of performing a first interrogation according to some embodiments of the present description;
FIG. 16 is an exemplary flow chart for providing care services according to some embodiments of the present description;
FIG. 17 is an exemplary flow chart for providing ward round services according to some embodiments of the present description;
FIG. 18 is an exemplary flow chart for providing a visit service according to some embodiments of the present description;
FIG. 19 is an exemplary flow chart for providing discharge services according to some embodiments of the present description;
FIG. 20 is an exemplary flow chart for providing follow-up services according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings required for the description of the embodiments are briefly introduced below. The drawings in the following description are only some examples or embodiments of the present specification, and those of ordinary skill in the art may apply the present specification to other similar scenarios according to these drawings without inventive effort. Unless otherwise apparent from the context or otherwise specified, like reference numerals in the figures refer to like structures or operations.
As used in this specification and the claims, the terms "a," "an," and/or "the" are not limited to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
It will be understood that when an element, engine, module, or block is referred to as being "on," "connected to," or "coupled to" another element, engine, module, or block, it can be directly on, connected or coupled to, or in communication with the other element, engine, module, or block, or intervening elements, engines, modules, or blocks may be present, unless the context clearly dictates otherwise. In the present application, the term "and/or" may include any one or more of the associated listed items or combinations thereof.
These and other features, characteristics, and functions of related structural elements of the present application, as well as the methods of operation and combination of parts and economies of manufacture, will become more apparent upon consideration of the following description of the drawings, all of which form a part of this specification. It is to be understood, however, that the drawings are designed solely for the purposes of illustration and description and are not intended as a definition of the limits of the application. It should be understood that the figures are not drawn to scale.
The terms "pixel" and "voxel" in this specification are used interchangeably to refer to an element in an image. In the present specification, the term "image" may refer to a two-dimensional (2D) image, a three-dimensional (3D) image, or a four-dimensional (4D) image (e.g., a time series of three-dimensional images). In some embodiments, the term "image" may refer to an image of a subject region (e.g., ROI). In some embodiments, the image may be a medical image, an optical image, or the like.
The present specification relates to systems and methods for providing hospitalization services. The method may include monitoring data sources for collecting data related to a patient's hospitalization process. The method may include, in response to detecting a data update of at least one of the data sources, performing event of interest (EOI) detection based on the updated data collected by the at least one data source. The method may further include, in response to detecting the occurrence of the event of interest, performing one or more preset operations corresponding to the event of interest to provide at least a portion of the hospitalization services.
According to some embodiments of the present disclosure, data sources that collect information related to a patient's hospitalization process are monitored. The monitoring helps detect data updates and occurrences of events of interest in a timely manner and immediately triggers the corresponding preset operations, so that hospitalization services can be provided to relevant users automatically and efficiently, improving service efficiency and quality. In addition, the monitored data sources collect multi-modal data throughout the different phases of the hospitalization process, thereby enabling patient-centric healthcare and comprehensive hospital services.
In some embodiments, one or more operations of the methods illustrated herein may be performed by an agent. By introducing an agent, repetitive administrative tasks can be handled automatically, reducing the need for large amounts of manpower and lowering operating costs. Further, by minimizing human intervention, the agent can significantly reduce the possibility of errors in data entry and processing and ensure that processes are performed in a uniform manner, maintaining a high standard of consistency and reliability, thereby improving the accuracy of hospitalization services. In addition, the agent can learn and optimize itself from various data (e.g., historical service data, knowledge databases, etc.), thereby improving the quality of the hospitalization services it supports.
Fig. 1 is a block diagram of an exemplary healthcare system 100 shown according to some embodiments of the present application.
The healthcare system 100, which may also be referred to as a meta-hospital system, is built based on a variety of innovative technologies, including metaverse technology, XR technology (e.g., augmented reality (AR) technology, virtual reality (VR) technology, mixed reality (MR) technology, etc.), AI technology, digital twin technology, IoT technology, data flow technology (e.g., blockchain technology, data privacy computing technology), spatial computing technology, image rendering technology, etc.
As shown in fig. 1, the healthcare system 100 may include a physical hospital 110, a virtual hospital 130, a user space application 120, and a hospital support platform 140. In some embodiments, the hospital support platform 140 may map data related to the physical hospital 110 into a virtual hospital 130 corresponding to the physical hospital 110 and provide user services to related users of the physical hospital 110 through the user space application 120.
The physical hospital 110 refers to a hospital existing in the physical world and having a tangible attribute. Health care institutions that provide medical, surgical and psychiatric care and treatment for humans are collectively referred to herein as hospitals.
As shown in fig. 1, the physical hospital 110 may include a plurality of physical entities. For example, the plurality of physical entities may include departments, users, hardware devices, user services, public areas, medical service procedures, and the like, or any combination thereof.
A department refers to a specialized unit that is dedicated to providing a particular type of medical care, treatment, and service. Each department may be focused on a particular medical field and may be staffed with healthcare professionals having expertise in that field. For example, the departments may include an outpatient department, an inpatient department, a surgical department, a support department (e.g., a registration department, a pharmacy department), a medical department, a specialty medical department, a child care department, etc., or any combination thereof.
The user may include any user associated with the physical hospital 110 (also referred to as a related user of the physical hospital 110). For example, the user may include a patient (or a portion of a patient (e.g., an organ)), a doctor, a visitor of a patient, a hospital staff member of the physical hospital 110, a supplier of the physical hospital 110, an application developer of the physical hospital 110, or the like, or any combination thereof. Hospital staff of the physical hospital 110 may include healthcare providers (e.g., doctors, nurses, technicians, etc.), hospital administrators, support staff, or the like, or any combination thereof. Exemplary hospital administrators may include department care administrators, clinical administrators, department directors, hospital administrative staff, logistics management staff, or the like, or any combination thereof.
The hardware devices may include hardware devices located in the physical hospital 110 and/or hardware devices in communication with hardware devices in the physical hospital 110. Exemplary hardware devices may include terminal devices, healthcare devices, sensing devices, base devices, etc., or any combination thereof.
The terminal device may include a terminal device that interacts with a user of the healthcare system 100. For example, the terminal devices may include terminal devices that interact with the patient (also referred to as patient terminals), terminal devices that interact with the patient's doctor (also referred to as doctor terminals), terminal devices that interact with the nurse (also referred to as nurse terminals), terminal devices that interact with remote users (also referred to as remote terminal devices), or public terminals of the hospital (e.g., office terminals, bedside terminal devices, terminal devices in waiting areas, intelligent surgical terminals), etc., or any combination thereof. In the present application, unless clear from the context or otherwise stated, the terminal devices owned by the user and the terminal devices provided to the user by the physical hospital 110 are collectively referred to as the user's terminal devices or the terminal devices interacting with the user.
The terminal device may include a mobile terminal, an XR device, an intelligent wearable device, etc. The mobile terminal may include a smart phone, a Personal Digital Assistant (PDA), a display, a gaming device, a navigation device, a hand-held terminal (POS), a tablet computer, etc., or any combination thereof.
The XR device may comprise a device that allows a user to participate in an augmented reality experience. For example, the XR device may include VR components, AR components, MR components, and the like, or any combination thereof. In some embodiments, the XR device may include an XR helmet, XR glasses, an XR patch, a stereo headset, or the like, or any combination thereof. For example, the XR device may include Google Glass™, Oculus Rift™, Gear VR™, Apple Vision Pro™, etc. In particular, the XR device may include a display component on which virtual content may be presented and/or displayed. In some embodiments, the XR device may further comprise an input component. The input component can enable user interaction between a user and virtual content (e.g., a virtual surgical environment) displayed by the display component. For example, the input component may include a touch sensor, microphone, image sensor, etc., configured to receive user input, which may be provided to the XR device and used to control the virtual world by changing the visual content presented on the display component. The input components may include handles, gloves, styluses, consoles, and the like.
The intelligent wearable device may include an intelligent wristband, intelligent footwear, intelligent glasses, intelligent helmet, intelligent watch, intelligent garment, intelligent backpack, intelligent accessory, etc., or any combination thereof. In some embodiments, the smart wearable device may acquire physiological data of the user (e.g., heart rate, blood pressure, body temperature, etc.).
The healthcare device may be configured to provide healthcare to the patient. For example, the medical services device may include an examination device, a care device, a treatment device, etc., or any combination thereof.
The examination apparatus may be configured to provide examination services to a patient, e.g., to collect examination data of the patient. Exemplary examination data may include heart rate, respiratory rate, body temperature, blood pressure, medical imaging data, body fluid test reports (e.g., blood test reports), and the like, or any combination thereof. Accordingly, the examination device may include a vital sign monitor (e.g., a blood pressure monitor, a blood glucose meter, a heart rate meter, a thermometer, a digital stethoscope, etc.), a medical imaging device (e.g., a computed tomography (CT) device, a digital subtraction angiography (DSA) device, a magnetic resonance (MR) device, etc.), a laboratory device (e.g., a blood routine examination device, etc.), or any combination thereof.
The care device may be configured to provide care services to the patient and/or assist the healthcare provider in providing care services. Exemplary care devices may include hospital beds, patient care robots, smart care carts, smart kits, smart wheelchairs, and the like.
The treatment device may be configured to provide treatment services to the patient and/or assist the medical service provider in providing treatment services. Exemplary treatment devices may include surgical devices, radiation treatment devices, physical treatment devices, and the like, or any combination thereof.
The sensing device may be configured to gather sensing information related to the environment in which it is located. For example, the sensing device may include an image sensor, a sound sensor, or the like. The image sensor may be configured to collect image data in the physical hospital 110 and the sound sensor may be configured to collect voice signals in the physical hospital 110. In some embodiments, the sensing device may be a stand-alone device or may be integrated into another device. For example, the sound sensor may be part of a medical service device or a terminal device.
The base device may be configured to support data transmission, storage, and processing. For example, the base devices may include networks, machine room facilities, computing devices, computing chips, storage devices, and the like.
In some embodiments, at least a portion of the hardware devices of the physical hospital 110 are IoT devices. An internet of things device refers to a device with sensors, processing power, software and other technologies that connect and exchange data with other devices and systems through the internet or other communication networks. For example, one or more healthcare devices and/or sensing devices of the physical hospital 110 are internet of things devices and are configured to transmit collected data to the hospital support platform 140 for storage and/or processing.
The user services may include any service provided by the hospital support platform 140 to the user. For example, user services include medical services provided to patients and/or accompanying persons, support services provided to staff members of physical hospital 110 and/or suppliers of physical hospital 110, and the like. In some embodiments, user services may be provided to patients, doctors, and hospital administrators through the user space application 120, which will be described in detail in the following description.
The public area refers to a shared space accessible to users (or portions of users) in the physical hospital 110. For example, the common area may include a reception area (e.g., a foreground), a waiting area, a hallway, etc., or any combination thereof.
A healthcare procedure is a procedure that provides corresponding healthcare to a patient. Healthcare procedures typically include several stages and/or steps through which a user may need to obtain a corresponding medical service. Exemplary healthcare procedures may include outpatient procedures, hospitalization procedures, surgical procedures, or the like, or any combination thereof. In some embodiments, the healthcare procedures may include corresponding procedures for different departments, different diseases, and the like. In some embodiments, a preset data acquisition protocol may be set to specify the standard stages involved in a healthcare procedure and how data related to the healthcare procedure is acquired.
The user space application 120 provides the user with access to user services provided by the hospital support platform 140. The user space application 120 may be an application, plug-in, website, applet, or any other suitable form. For example, the user space application 120 is an application installed on a user terminal device that includes a user interface for a user to initiate requests and receive corresponding services.
In some embodiments, the user space application 120 may include different applications corresponding to different types of users. For example, the user space application 120 includes a patient space application corresponding to a patient, a medical space application corresponding to a doctor, a management space application corresponding to an administrator, and the like, or any combination thereof. User services provided through the patient space application, the medical space application, and the management space application are also referred to as patient space services, medical space services, and management space services, respectively. Exemplary patient space services include registration services, route guidance services, pre-consultation services, remote consultation services, hospitalization services, discharge services, and the like. Exemplary medical space services include scheduling services, surgical planning services, surgical simulation services, patient management services, remote ward round services, remote outpatient services, and the like. Exemplary management space services include monitoring services, medical service assessment services, device parameter setting services, service parameter setting services, resource scheduling services, and the like.
In some embodiments, the patient space application, the medical space application, and the management space application may be integrated into one user space application 120, and the user space application 120 may be configured to provide access portals for each type of user (e.g., patient, healthcare provider, manager, etc.). By way of example only, a particular user may have a corresponding account number that may be used to log into a user space application, view corresponding diagnostic data, and obtain corresponding user services.
According to some embodiments of the present application, by providing user space applications for different types of users, each type of user can easily obtain the various user services that he/she may need through the corresponding user space application. Currently, by contrast, users often need to install various applications to obtain different user services, which results in a poor user experience and high development costs. Therefore, the user space application of the present application can improve the user experience, improve service quality and efficiency, enhance service security, and reduce development and operation costs.
In some embodiments, the user space application 120 may be configured to provide access portals for relevant users of the physical hospital 110 to interact with the virtual hospital 130. For example, through the user space application 120, a user may enter instructions for retrieving digital content of the virtual hospital 130 (e.g., digital twin models of hardware devices, patient organs, or public areas), view the digital content, and interact with the digital content. As another example, through the user space application 120, a user may communicate with an avatar representing an agent. In some embodiments, a public terminal of a hospital may be installed with a management space application, and an administrator account of the department to which the public terminal corresponds may be logged into the management space application. The user may obtain user services through the management space application installed in the public terminal.
The virtual hospital 130 is a digital twin (i.e., virtual representation or virtual copy) of the physical hospital 110 for simulating, analyzing, predicting, and optimizing the operating state of the physical hospital 110. For example, the virtual hospital 130 may be a real-time digital copy of the physical hospital 110.
In some embodiments, the virtual hospital 130 may be presented to the user using digital technology. For example, when the relevant user interacts with the virtual hospital 130, at least a portion of the virtual hospital 130 may be presented to the relevant user using XR technology. For example only, MR technology may be used to superimpose at least a portion of the virtual hospital 130 on the real-world view of the relevant user.
In some embodiments, the virtual hospital 130 may include digital twins of physical entities associated with the physical hospital 110. A digital twin refers to a virtual representation (e.g., a virtual copy, mapping, or digital simulator) of a physical entity. The digital twin can reflect and predict the state, behavior, and performance of the physical entity in real time. For example, the virtual hospital 130 may include digital twins of at least a portion of the medical services, departments, users, hardware devices, user services, public areas, medical service procedures, and the like of the physical hospital 110. The digital twin of a physical entity can take a variety of forms, including models, images, graphics, text, numerical values, and the like. For example, the digital twin may be a virtual hospital corresponding to a physical hospital, virtual personnel (e.g., virtual doctors, virtual nurses, and virtual patients) corresponding to personnel entities (e.g., doctors, nurses, and patients), virtual devices (e.g., virtual imaging devices and virtual scalpels) corresponding to medical service devices (e.g., imaging devices and scalpels), and the like.
In some embodiments, the digital twins may include one or more first digital twins and/or one or more second digital twins. The state of each first digital twin may be updated based on an update of the state of the corresponding physical entity. For example, one or more first digital twins may be updated during the mapping of data associated with the physical hospital 110 to the virtual hospital 130. One or more second digital twins can be updated through at least one of the user space applications 120, and the update of each second digital twin can result in a status update of the corresponding physical entity. In other words, a first digital twin is updated accordingly when the corresponding physical entity changes its state, and the state of the corresponding physical entity changes accordingly when a second digital twin is updated. For example, the one or more first digital twins may include digital twins of a public area, a medical service, a user, a hardware device, etc., and the one or more second digital twins may include digital twins of a hardware device, a user service, a medical service procedure, etc. It should be appreciated that a digital twin may be a first digital twin, a second digital twin, or both (e.g., the digital twin of a hardware device).
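By way of example only, the two update directions described above may be sketched as follows. This is a minimal illustration of the concept, not an implementation of the present application; all class and attribute names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalEntity:
    """Stands in for a physical entity (e.g., a hardware device) in the physical hospital."""
    name: str
    state: dict = field(default_factory=dict)

class FirstDigitalTwin:
    """Physical-to-virtual direction: the twin's state follows the entity's state."""
    def __init__(self, entity: PhysicalEntity):
        self.entity = entity
        self.state = dict(entity.state)

    def sync_from_entity(self):
        # Called when mapping physical-hospital data into the virtual hospital.
        self.state = dict(self.entity.state)

class SecondDigitalTwin:
    """Virtual-to-physical direction: updating the twin drives the entity."""
    def __init__(self, entity: PhysicalEntity):
        self.entity = entity
        self.state = dict(entity.state)

    def update(self, new_state: dict):
        # An update issued through a user space application changes the
        # corresponding physical entity's state accordingly.
        self.state.update(new_state)
        self.entity.state.update(new_state)

# A hardware device may serve as both a first and a second digital twin target.
bed = PhysicalEntity("hospital_bed", {"angle": 0})
monitor_twin = FirstDigitalTwin(bed)
control_twin = SecondDigitalTwin(bed)

bed.state["angle"] = 30             # a physical change...
monitor_twin.sync_from_entity()     # ...is mirrored into the first twin
control_twin.update({"angle": 45})  # a virtual update drives the entity
```

In this sketch the first twin only reads from the entity, while the second twin writes back to it, which mirrors the two update directions described above.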
According to some embodiments of the present application, the physical hospital 110 (including hardware devices, users, user services, healthcare procedures, etc.) may be simulated and tested in a secure and controllable environment by generating a virtual hospital 130 that includes digital twins of physical entities associated with the physical hospital 110. Through real-time virtual-real linkage (e.g., real-time interaction between the physical hospital 110 and the virtual hospital 130), various medical scenarios can be more accurately predicted and responded to, thereby improving the quality and efficiency of medical services. In addition, the application of XR technology and virtual-real integration technology makes interactions of related users more natural and intuitive, and provides a more comfortable and efficient medical environment, thereby improving the user experience.
In some embodiments, the virtual hospital 130 may further include agents that implement self-evolution based on data related to the physical hospital 110 and AI technology.
An agent refers to an entity that acts in an intelligent manner. For example, an agent may include a computing/software entity that can autonomously learn and evolve, and sense and analyze data to perform specific tasks and/or achieve specific goals (e.g., healthcare procedures). Through AI techniques (e.g., reinforcement learning, deep learning, etc.), an agent can constantly learn and self-optimize/evolve in interactions with the environment. In addition, the agent can collect and analyze massive data (e.g., data related to the physical hospital 110) through big data technology, mine patterns and learn rules from the data, and optimize decision flows, thereby identifying environmental changes in uncertain or dynamic environments, responding quickly, and making reasonable judgments. For example, agents may learn and evolve autonomously based on AI technology to accommodate changes in the physical hospital 110. By way of example only, agents may be built based on NLP technology (e.g., large language models, etc.) and may automatically learn and autonomously update through large amounts of language text (e.g., hospital business data and patient feedback information) to improve the quality of user services provided by the physical hospital 110.
In some embodiments, the agents may include different types of agents corresponding to different healthcare procedures, different user services, different departments, different diseases, different hospital positions (e.g., nurses, doctors, technicians, etc.), different stages of healthcare procedures, and the like. A particular type of agent is used to process tasks corresponding to that particular type. In some embodiments, one agent may correspond to multiple different healthcare procedures (or different healthcare services, departments, diseases, or hospital positions). In some embodiments, an agent may operate with reference to basic configuration data (e.g., a dictionary, a knowledge graph, a template, etc.) of the department and/or disease corresponding to the agent. In some embodiments, multiple agents may cooperate and share information through network communications to collectively accomplish complex tasks.
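By way of example only, routing tasks to agents by type and sharing intermediate results may be sketched as follows. The agent types, task names, and the in-process "blackboard" are illustrative assumptions; an actual deployment would exchange information over network communications.

```python
class Agent:
    """A stand-in for an agent that processes tasks of its own type."""
    def __init__(self, agent_type: str):
        self.agent_type = agent_type

    def handle(self, task: str, blackboard: dict) -> str:
        # Record the result on a shared blackboard so cooperating agents can read it.
        result = f"{self.agent_type} handled {task}"
        blackboard[task] = result
        return result

class AgentRegistry:
    """Routes each task to the agent whose type matches the task's type."""
    def __init__(self):
        self.agents = {}

    def register(self, agent: Agent):
        self.agents[agent.agent_type] = agent

    def dispatch(self, agent_type: str, task: str, blackboard: dict) -> str:
        return self.agents[agent_type].handle(task, blackboard)

registry = AgentRegistry()
registry.register(Agent("hospitalization"))
registry.register(Agent("surgery"))

shared = {}  # information shared between cooperating agents
registry.dispatch("hospitalization", "admission_check", shared)
registry.dispatch("surgery", "surgical_planning", shared)
```

The shared dictionary plays the role of the information exchanged between cooperating agents; each agent can read what others have produced before acting.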
In some embodiments, the configuration of an agent may be set. For example, basic configuration data for use by the agent in operation may be set. The basic configuration data may include dictionaries, knowledge databases, templates, etc. As another example, usage rights of the agent may be set for different users. In some embodiments, an administrator of the physical hospital 110 may set the configuration of the agent through the management space application.
In some embodiments, the agent may be integrated into or deployed on a hardware device. For example, agents corresponding to hospitalization services may be integrated into a hospital bed or a presentation device of a hospital bed. In some embodiments, the agent may be integrated into or deployed on an embodied intelligent robot. An embodied intelligent robot refers to a robotic system that combines physical presence (embodiment) with intelligent behavior (cognition). The embodied intelligent robot may be configured to interact with the real world in a manner that mimics or complements human capabilities, utilizing physical morphology and cognitive functions to perform tasks, make decisions, and adapt to the environment. By utilizing artificial intelligence and sensor technology, the embodied intelligent robot can operate autonomously, interact with the environment, and continuously improve its performance. For example, the embodied intelligent robot may be configured with an agent corresponding to a surgical service and assist a doctor in performing a surgery.
In some embodiments, at least a portion of the user services may be provided based on the agent. For example, at least a portion of the user services may be provided to the relevant users based on the processing results, wherein the processing results are generated by at least one of the agents based on data related to the physical hospital 110. For example only, the data related to the physical hospital 110 may include data related to a healthcare procedure of the physical hospital 110, the agent may include an agent corresponding to the healthcare procedure, and the user service may be provided to an associated user of the healthcare procedure by using the agent processing data corresponding to the healthcare procedure.
The hospital support platform 140 may be configured to provide technical support to the healthcare system 100. For example, the hospital support platform 140 may include computing hardware and software to support innovative technologies including XR technology, AI technology, digital twinning technology, data flow technology, and the like. In some embodiments, the hospital support platform 140 may include at least a storage device for data storage and a processing device for data computation.
In some embodiments, the hospital support platform 140 may support interactions between the physical hospital 110 and the virtual hospital 130. For example, the processing device of the hospital support platform 140 may obtain data related to the physical hospital 110 from the hardware devices and map the data related to the physical hospital 110 into the virtual hospital 130. For example, the processing device of the hospital support platform 140 may update a portion of the digital twins (e.g., one or more first digital twins) in the virtual hospital 130 based on the obtained data such that each of the portion of the digital twins in the virtual hospital 130 may reflect the updated status of the corresponding physical entity in the physical hospital 110. Based on digital twins that are continuously updated with the corresponding physical entities, users can learn the states of the physical entities related to the physical hospital 110 in real time, thereby realizing monitoring and evaluation of the physical entities. As another example, agents corresponding to data related to the physical hospital 110 may be trained and/or updated based on the data related to the physical hospital 110 to self-evolve and self-learn.
In some embodiments, the hospital support platform 140 may support and/or provide user services to the relevant users of the physical hospital 110. For example, in response to receiving a user service request from a user, the processing device of the hospital support platform 140 may provide a user service corresponding to the service request. As another example, in response to detecting a need to provide a user service to a user, the processing device of the hospital support platform 140 may control a physical entity or virtual entity corresponding to the user service to provide the user service. For example, in response to detecting that a patient is being sent to a patient room, the processing device of the hospital support platform 140 may control the smart care cart to direct a nurse to the patient room for an admission check of the patient.
In some embodiments, at least a portion of the user services may be provided to the relevant users based on interactions between the relevant users and the virtual hospital 130. An interaction refers to a mutual action or effect (e.g., conversations, behaviors, etc.) between the relevant user and the virtual hospital 130. For example, interactions between the relevant user and the virtual hospital 130 may include interactions between the relevant user and a digital twin in the virtual hospital 130, interactions between the relevant user and an agent, interactions between the relevant user and a virtual character, and the like, or any combination thereof.
In some embodiments, at least a portion of the user services may be provided to the associated user based on interactions between the associated user and at least one of the digital twins. For example, an update instruction of the second digital twin input by the relevant user may be received by the user space application 120, and the corresponding physical entity of the second digital twin may be updated according to the update instruction. As another example, a user may view a first digital twin of a physical entity (e.g., a 3D digital twin model of a patient organ or hardware device) through the user space application 120 to learn about the state of the physical entity. Alternatively, the user may change the display angle, display size, etc. of the digital twin.
In some embodiments, the processing device of the hospital support platform 140 may present virtual characters corresponding to the agents through the user space application, interact with the associated user, and provide at least a portion of the user services to the associated user based on the interactions between the associated user and the virtual characters.
In some embodiments, the hospital support platform 140 may have a five-layer structure including a hardware device layer, an interface layer, a data processing layer, an application development layer, and a service layer, see fig. 3 and its associated description. In some embodiments, the hardware devices of the physical hospital 110 may be part of the hospital support platform 140.
According to some embodiments of the present application, a virtual hospital corresponding to a physical hospital may be established by integrating various internal and external resources (e.g., medical service equipment, hospital personnel, medicines and consumables, etc.) of the physical hospital. The virtual hospital may reflect real-time status (e.g., changes, updates, etc.) of physical entities associated with the physical hospital, thereby enabling monitoring and assessment of the physical entities. Such integration may provide accurate data support for the operation and intelligent decision-making of medical services. In addition, through the virtual hospital, users related to medical services can commonly establish an open shared ecosystem, thereby promoting innovation and promotion of medical services.
In addition, through linkage between resources inside and outside the hospital, healthcare services covering the patient's whole life cycle can be provided. The perspective of medical services extends from mere disease treatment to covering the entire life cycle of a patient, including prevention, diagnosis, treatment, rehabilitation, health management, and the like. By establishing intra- and extra-hospital linkage, the physical hospital can better integrate online and offline resources and provide comprehensive and continuous medical and health services for patients. For example, through remote monitoring and online consultation, the health condition of the patient can be followed in real time, the treatment scheme can be adjusted in time, and the treatment effect can be improved.
Fig. 2 is a schematic diagram of a scenario illustrating an exemplary healthcare system 200 according to some embodiments of the present application.
As shown in fig. 2, the healthcare system 200 may include a processing device 210, a network 220, a storage device 230, one or more healthcare devices 240, one or more perception devices 250, one or more patient terminals 260 of a patient 261, and one or more doctor terminals 270 of a doctor 271 associated with the patient 261. In some embodiments, components in the healthcare system 200 may be interconnected and/or communicate by a wireless connection, a wired connection, or a combination thereof. The connections between the components of the healthcare system 200 may be variable.
The processing device 210 may process data and/or information obtained from the storage device 230, the healthcare device 240, the sensing device 250, the patient terminal 260, and/or the doctor terminal 270. For example, the processing device 210 may map data related to a physical hospital to a virtual hospital corresponding to the physical hospital and provide user services to the patient 261 and the doctor 271 through the patient terminal 260 and/or the doctor terminal 270, respectively, by processing the data related to the physical hospital. As another example, the processing device 210 may maintain an agent and provide user services to the patient 261 and the doctor 271 through the patient terminal 260 and/or the doctor terminal 270, respectively, by engaging the agent in processing data related to the physical hospital.
In some embodiments, the processing device 210 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 210 may be located locally or remotely from the healthcare system 200. In some embodiments, the processing device 210 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or a combination thereof.
In some embodiments, the processing device 210 may include one or more processors (e.g., single-core processors or multi-core processors). For illustration only, only one processing device 210 is depicted in the healthcare system 200. It should be noted, however, that the healthcare system 200 of the present application may also include multiple processing devices. Thus, in the present application, operations and/or method steps performed by one processing device 210 may also be performed by multiple processing devices jointly or separately.
The network 220 may include any suitable network capable of facilitating the exchange of information and/or data for the healthcare system 200. The network 220 may be or include a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a Bluetooth™ network, a near field communication (NFC) network, or the like, or any combination thereof.
Storage device 230 may store data, instructions, and/or any other information. In some embodiments, the storage device 230 may store data obtained from other components of the healthcare system 200. In some embodiments, the storage device 230 may store data and/or instructions that the processing device 210 may execute or use to perform the exemplary methods described herein.
In some embodiments, the data stored in the storage device 230 may include multi-modal data. Multimodal data may include various forms of data (e.g., images, graphics, video, text, etc.), various types of data, data obtained from different sources, data related to different medical services (e.g., diagnosis, surgery, rehabilitation, etc.), data related to different users (e.g., patients, medical personnel, management personnel, etc.). For example, the data stored in the storage device 230 may include medical data of the patient 261 reflecting the health of the patient 261. For example, the medical data can include an electronic medical record (or electronic health record) of the patient 261. Electronic medical records refer to electronic files that record various types of patient data (e.g., basic information, examination data, imaging data). For example, the electronic medical record can include a three-dimensional model of a plurality of organs and/or tissues of the patient 261.
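By way of example only, the multi-modal electronic medical record described above may be sketched as a simple container that groups text, numeric, imaging, and 3D model data per patient. The field names and file references are illustrative assumptions, not a prescribed record format.

```python
from dataclasses import dataclass, field

@dataclass
class ElectronicMedicalRecord:
    """Illustrative multi-modal electronic medical record for one patient."""
    patient_id: str
    basic_info: dict = field(default_factory=dict)        # text data (name, age, ...)
    examination_data: dict = field(default_factory=dict)  # numeric examination results
    imaging_data: list = field(default_factory=list)      # references to medical images
    organ_models: dict = field(default_factory=dict)      # references to 3D organ/tissue models

record = ElectronicMedicalRecord(
    patient_id="P-0001",
    basic_info={"age": 42},
    examination_data={"blood_pressure": "120/80"},
    imaging_data=["ct_chest_001.dcm"],
    organ_models={"heart": "heart_mesh_001.obj"},
)
```

Grouping heterogeneous modalities under one patient identifier is what allows the record to carry basic information, examination data, imaging data, and three-dimensional organ models together.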
In some embodiments, storage device 230 may include mass storage devices, removable storage devices, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, storage device 230 may include a data lake and a data warehouse, as will be described in detail in connection with FIG. 3.
The healthcare device 240 may be used to provide or assist in healthcare. As shown in fig. 2, the healthcare device 240 may include a clinic terminal 240-1, a hospital bed 240-2, a smart surgical terminal 240-3, a smart care cart 240-4, a smart wheelchair 240-5, etc., or any combination thereof.
The clinic terminal 240-1 is a terminal device configured within a consulting room for use by doctors and patients in an outpatient procedure. For example, the clinic terminal 240-1 may include one or more of a screen, a sound output component, an image sensor, or a sound sensor. A doctor interface may be displayed on the screen of the clinic terminal 240-1, and data may be displayed on the doctor interface to facilitate communication between the doctor and the patient. Exemplary data may include electronic medical records (or portions thereof), pre-consultation records, medical images, 3D organ models, examination results, consultation advice, and the like.
Hospital bed 240-2 refers to a hospital bed that is capable of supporting an inpatient and providing user services to the patient in a hospital room. The hospital bed 240-2 may include a bed, bedside terminal equipment, bedside examination equipment, sensors, and the like, or any combination thereof. The bedside terminal equipment may include an XR device, a display device, a mobile device, etc., or any combination thereof. In some embodiments, the hospital bed 240-2 may be controlled by an agent corresponding to the hospitalization service, in which case the hospital bed may also be referred to as a smart hospital bed or a meta-hospital bed.
The intelligent surgical terminal 240-3 refers to a device configured with an agent for assisting surgery, and is controlled by the agent corresponding to a surgical service. The intelligent surgical terminal 240-3 may sense interactions (e.g., conversations, behaviors, etc.) between the healthcare provider, the patient, and the agent and obtain data captured by the sensing device 250 to provide surgical assistance. In some embodiments, the intelligent surgical terminal 240-3 may be configured to perform a risk alert for a surgical procedure, generate a surgical record of a surgical procedure, etc., based on the agent configured therein.
The smart care cart 240-4 is a care cart having an automatic driving function and capable of assisting patient treatment and care. For example, the smart care cart 240-4 may be configured to guide a nurse to a ward for admission checks of the patient. In some embodiments, the smart care cart may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, a care agent). In some embodiments, the smart care cart 240-4 may include a cart, a presentation device, one or more examination devices and/or care tools, a sensing device (e.g., an image sensor, a GPS sensor, a sound sensor, etc.), and so on. In some embodiments, the smart care cart 240-4 may be configured to obtain relevant treatment and care information of the patient and generate physical examination data, care data, and the like. The physical examination data may include vital sign data of the patient. The care data may include detailed records of care operations, such as care time, care operator, care measure, patient response, and the like.
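By way of example only, one entry of the care data described above may be sketched as follows, with one field per listed item (care time, care operator, care measure, patient response). The field names and values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class CareRecord:
    """One care operation recorded during a care round."""
    care_time: str        # when the care operation was performed
    care_operator: str    # who performed it
    care_measure: str     # what was done
    patient_response: str # how the patient responded

record = CareRecord(
    care_time="2024-01-01T08:30:00",
    care_operator="nurse_01",
    care_measure="vital sign check",
    patient_response="stable",
)
```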
The smart wheelchair 240-5 refers to a transport device for intelligently transporting patients. In some embodiments, the smart wheelchair 240-5 may be configured to perform autonomous navigation through integrated sensors and maps, locate the patient's position using radio frequency identification (RFID), Bluetooth, or Wi-Fi signals, and identify the patient through biometric technology. In some embodiments, the smart wheelchair 240-5 may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, an agent corresponding to a surgical service). In some embodiments, the smart wheelchair 240-5 may be configured to generate data (e.g., a record of the interaction between the agent and the patient) by sensing interaction data through built-in cameras/sensors.
The sensing device 250 may be configured to gather sensing information related to the environment in which it is located. In some embodiments, the sensing device 250 may comprise a sensing device in a physical hospital 110. For example, the sensing device 250 may include an image sensor 250-1, a sound sensor 250-2, a temperature sensor, a humidity sensor, and the like.
The patient terminal 260 may be a terminal device that interacts with the patient 261. In some embodiments, the patient terminal 260 may include a mobile terminal 260-1, an XR device 260-2, a smart wearable device 260-3, and so on. The doctor terminal 270 may be a terminal device that interacts with the doctor 271. In some embodiments, the doctor terminal 270 may include a mobile terminal 270-1, an XR device 270-2, or the like. In some embodiments, the patient 261 may access a user space application (e.g., a patient space application) through the patient terminal 260, and the doctor 271 may access a user space application (e.g., a medical space application) through the doctor terminal 270. In some embodiments, the patient 261 and the doctor 271 may communicate with each other remotely through the patient terminal 260 and the doctor terminal 270, thereby providing remote medical services, such as remote outpatient services, remote ward services, remote follow-up services, and the like.
The sensing device 250, patient terminal 260, and doctor terminal 270 may be configured as data sources to provide information to the healthcare system 200. For example, the devices may transmit the collected data to the processing device 210, and the processing device 210 may provide user services based on the received data.
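By way of example only, the flow of monitoring data sources, detecting an event of interest in updated data, and executing the preset operations mapped to that event may be sketched as follows. The rule contents, event names, and operation names are illustrative assumptions; in the present application, the detection rules and the event-to-operation correspondence would be learned by an agent from historical service records.

```python
# Detection rules: per data source, a predicate over the updated data.
detection_rules = {
    "sensing_device": lambda data: data.get("event") == "patient_sent_to_ward",
}

# Correspondence between events of interest and preset operations.
preset_operations = {
    "patient_sent_to_ward": ["dispatch_care_cart", "notify_nurse"],
}

def on_data_update(source: str, data: dict, executed: list):
    """Called when a data source reports updated data."""
    rule = detection_rules.get(source)
    if rule and rule(data):
        # Event of interest detected: run the mapped preset operations.
        for op in preset_operations[data["event"]]:
            executed.append(op)  # stand-in for controlling a physical/virtual entity

log = []
on_data_update("sensing_device", {"event": "patient_sent_to_ward"}, log)
```

Separating the rules and the event-to-operation mapping from the dispatch loop is what allows an agent to update both tables from learned records without changing the monitoring code.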
It should be noted that the above description of the healthcare systems 100 and 200 is intended to be illustrative, and not limiting of the scope of the present application. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the example embodiments herein may be combined in various ways to obtain additional and/or alternative example embodiments. For example, the healthcare system 200 may include one or more additional components, such as other users' terminal devices, a public terminal device of the hospital, and the like. As another example, two or more components of the healthcare system 200 may be integrated into a single component.
Fig. 3 is a schematic structural diagram of an exemplary hospital support platform 300, according to some embodiments of the present application.
As shown in fig. 3, the hospital support platform 300 may include a hardware layer 310 (also referred to as a hardware module), an interface layer 320 (also referred to as an interface module), a data processing layer 330 (also referred to as a data processing module), an application development layer 340 (also referred to as an application development module), and a service layer 350 (also referred to as a service module). It should be understood that the "layers" and "modules" in this disclosure are used only for logically dividing the components of the hospital support platform and are not intended to be limiting.
The hardware layer 310 may be configured to provide a hardware basis for interactions between the real world and the digital world, and may include one or more hardware devices related to hospital operations. Exemplary hardware devices may include healthcare devices, sensing devices, terminal devices, and base devices.
The interface layer 320 may be connected with the hardware layer 310 and the data processing layer 330. The interface layer 320 may be configured to obtain data collected by hardware devices of the hardware layer 310 and send the data to the data processing layer 330 for storage and/or processing. Interface layer 320 may also be configured to control at least a portion of the hardware devices of hardware layer 310. In some embodiments, interface layer 320 may include hardware interfaces and software interfaces (e.g., data interfaces, control interfaces).
The data processing layer 330 may be configured to store and/or process data. The data processing layer 330 may include a processing device on which a plurality of data processing units may be configured. The data processing layer 330 may be configured to obtain data from the interface layer 320 and process the data by at least one data processing unit to enable user services related to hospital services.
The data processing unit may comprise various preset algorithms for implementing data processing. In some embodiments, data processing layer 330 may include a processing device (e.g., processing device 210 in fig. 2). The data processing unit may be configured on the processing device. In some embodiments, the data processing unit may include an XR unit configured to process data using XR technology to implement XR services, an AI unit (e.g., an agent unit) configured to process data using AI technology to implement AI services, a digital twin unit configured to process data using digital twin technology to implement digital twin services, a data flow unit configured to process data using data flow technology (e.g., blockchain technology, data privacy computing technology) to implement data flow services, and so forth.
In some embodiments, data processing layer 330 may also include a data center configured to store data. In some embodiments, the data center may employ a lake-warehouse integrated architecture, which may include data lakes and data warehouses. The data lake may be used to persist large amounts of data in a tamper-proof manner. The data warehouse may be used to store index data corresponding to data in the data lake. The data stored in the data lake may include native (or raw) data collected by the hardware device, derived data generated based on the native data, and the like. In some embodiments, the data in the data lake may be processed by a processing device (e.g., processing device 210).
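By way of example only, the lake-warehouse division of labor may be sketched as follows: an append-only lake whose entries are hash-chained (a simple stand-in for tamper-evident persistence), and a warehouse that stores only index data pointing into the lake. The class names and the chaining scheme are illustrative assumptions, not the architecture prescribed by the present application.

```python
import hashlib

class DataLake:
    """Append-only store; each entry is chained to the previous entry's hash."""
    def __init__(self):
        self.entries = []  # list of (payload, chained_hash)

    def append(self, payload: bytes) -> int:
        prev = self.entries[-1][1] if self.entries else b""
        digest = hashlib.sha256(prev + payload).digest()
        self.entries.append((payload, digest))
        return len(self.entries) - 1  # position referenced by the warehouse index

class DataWarehouse:
    """Stores only index data corresponding to data in the lake."""
    def __init__(self, lake: DataLake):
        self.lake = lake
        self.index = {}

    def register(self, key: str, position: int):
        self.index[key] = position

    def fetch(self, key: str) -> bytes:
        return self.lake.entries[self.index[key]][0]

lake = DataLake()
warehouse = DataWarehouse(lake)
pos = lake.append(b"raw CT image bytes")   # native data collected by a hardware device
warehouse.register("patient_0001/ct/001", pos)
```

Because the warehouse holds only keys and positions, large native and derived data stay in the lake while queries resolve through the compact index.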
The application development layer 340 may be configured to support application development, publishing, subscribing, and the like. The application development layer 340 is also referred to as an ecological suite layer. In some embodiments, the application development layer 340 may be configured to provide an open interface for application developers to access or invoke at least a portion of the data processing units and to utilize at least a portion of the data processing units to develop applications. In some embodiments, as shown in fig. 3, the application development layer 340 may provide development kits, an application market, a multi-tenant operation platform, a cloud portal, workspaces, and other support kits to assist developers in their work.
The service layer 350 may be configured to enable relevant users of the hospital service to access user services related to the hospital service through the user space application.
The present application provides a hospital support platform designed for the comprehensive management of various resources in a hospital, including hardware resources, software resources, and data resources. In some embodiments, the platform further integrates data processing units capable of supporting advanced technologies, such as artificial intelligence, XR, digital twinning, and blockchain. These advanced technologies are used to improve the efficiency and quality of service in the healthcare industry. For example, artificial intelligence techniques enable autonomous evolution and continuous optimization of hospital operations, while XR and digital twin techniques facilitate the creation and maintenance of virtual hospitals. The virtual hospital can interact with the user, providing an immersive novel service experience. In addition, the platform includes an application development layer for granting third-party developers in the healthcare industry access to these advanced technologies. This access fosters an open ecosystem, which encourages the development and innovation of applications and thus advances medical services.
Fig. 4 is a schematic diagram of a scene of a hospitalization system 400 according to some embodiments of the present description. The hospitalization system 400 may be used to implement or support various hospitalization services associated with hospitalization procedures.
By way of example, fig. 5 is an exemplary flowchart of a hospitalization procedure 500 according to some embodiments of the present description. As shown in fig. 5, the hospitalization procedure 500 includes an admission phase 510, an admission inquiry phase 520, a hospitalization phase 530, a discharge phase 540, a follow-up phase 550, and the like, or any combination thereof.
When a patient (e.g., patient 261) is in different phases of the hospitalization procedure 500, the hospitalization system 400 may provide different hospitalization services to the relevant users in the hospitalization procedure. For example, hospitalization services may include an admission service, an admission inquiry service, ward services, a discharge service, a follow-up service, and the like, or any combination thereof. Relevant users in the hospitalization procedure may include patients, healthcare staff (e.g., doctors, nurses, etc.) providing medical services to the patients in the hospitalization procedure, visitors of the patients, and the like.
During the admission phase 510 of the hospitalization procedure, the patient may complete the relevant procedures to be admitted. In some embodiments, the hospitalization system 400 may provide the relevant user with an admission service related to the admission phase 510. For example, the admission service may include directing the patient to a ward corresponding to the patient, explaining hospitalization rules to the patient, conducting an admission check on the patient, generating a registration record for the patient, or the like, or any combination thereof. For more details regarding admission services, see the relevant description of fig. 8-11.
In the admission inquiry phase 520, the patient may receive an admission inquiry that is used to obtain basic information about the patient. In some embodiments, the hospitalization system 400 may provide the relevant user with an admission inquiry service related to the admission inquiry phase 520. For example, the admission inquiry service may include performing one or more inquiries (e.g., a first inquiry, a second inquiry, etc.) on the patient, generating an admission record for the patient, etc., or any combination thereof. For more details regarding the admission inquiry service, see the associated description of fig. 12-15.
In the hospitalization phase 530, the patient may stay in a hospital (e.g., a ward) for a period of time to receive round-the-clock medical care. In some embodiments, the hospitalization system 400 may provide ward services to the relevant users in connection with the hospitalization phase 530. For example, the ward services may include a care service 532, a ward-round service 534, a visit service 536, and the like, or any combination thereof.
The care service 532 is used to provide direct care to the patient, including administering medication, performing physical examinations, monitoring vital sign data, and assisting with activities of daily living. For example, the care service 532 may include determining a daily plan for the patient, displaying the daily plan, displaying a care instruction, guiding a nurse to a ward to perform a care operation, assisting the nurse in performing the care operation, generating a care record, and the like, or any combination thereof. For more details on care services, see fig. 16 and its associated description.
The ward-round service 534 is associated with a ward round conducted by a medical team (e.g., at least one doctor) on a patient in a ward, where the medical team can view and discuss the patient's status and care plan during the ward round. For example, the ward-round service 534 may include presenting data for facilitating communication between a doctor and a patient, generating ward-round records, presenting a virtual ward-round space to one or more remote doctors, and the like, or any combination thereof. For more details on ward-round services, see fig. 17 and its associated description.
The visit service 536 allows remote visitors to communicate with patients remotely. For example, the visit service 536 may include generating a virtual visit space for the patient and a remote visitor, presenting the virtual visit space to the patient and the remote visitor, and the like, or any combination thereof. For more description of the visit service, see fig. 18 and its associated description.
In the discharge phase 540, the patient may complete discharge procedures. In some embodiments, the hospitalization system 400 may provide discharge services to the relevant user in connection with the discharge phase 540 to guide the discharge of the patient. For example, the discharge service may include generating discharge data, presenting the discharge data to the patient, generating a discharge record corresponding to the patient, assisting the patient in handling discharge procedures, and the like, or any combination thereof. For further description of discharge services, see fig. 19 and the associated description.
In the follow-up phase 550, the patient may be provided with continued care after discharge to support continued recovery, manage any remaining health issues, and prevent readmission. In some embodiments, the relevant user may be provided with a follow-up service related to the follow-up phase 550. For example, the follow-up service may include determining a follow-up plan for the patient, reminding the doctor and the patient according to the follow-up plan, presenting a virtual follow-up space to the patient and the doctor, updating the follow-up plan, and the like, or any combination thereof. For more description of providing follow-up services, see fig. 20 and its associated description.
As described above, hospitalization services may be supported or implemented by the hospitalization service system 400. As shown in fig. 4, the hospitalization service system 400 may include a processing device 210, a perception device 410, an examination device 420, a terminal device 430, and a care device 440.
The sensing device 410 may be configured to gather sensing information about the environment in which it is located. In some embodiments, the sensing device 410 includes one or more sensing devices in the patient's room. For example, the sensing device 410 may include an image sensor, a sound sensor, and the like. The image sensor may be configured to collect image data in the patient's room, and the sound sensor may be configured to collect sound data in the patient's room. In some embodiments, the sensing device may be a stand-alone device or integrated into other devices. For example, the sound sensor may be part of a hospital bed or of the terminal device 430.
The examination apparatus 420 may be configured to acquire examination data of a patient. Exemplary examination data may include heart rate, respiratory rate, body temperature, blood pressure, medical imaging data, body fluid test reports (e.g., blood test reports), and the like, or any combination thereof. Accordingly, the examination device 420 may include a vital sign monitor (e.g., blood pressure monitor, blood glucose meter, heart rate meter, thermometer, digital stethoscope, etc.), a medical imaging device, a laboratory device, etc., or any combination thereof. In some embodiments, the inspection device 420 may be a stand-alone device. In some embodiments, the examination device 420 (e.g., a sphygmomanometer) may also be integrated into other devices (e.g., a hospital bed, a patient care robot 441, a smart care cart 240-4, etc.). In some embodiments, the inspection device 420 may also be referred to as an intelligent inspection device. In some embodiments, the examination device 420 (e.g., medical imaging device, laboratory device) may be affiliated with a hospital examination department, such as a medical imaging department, laboratory department, or the like.
The terminal device 430 may include a terminal that interacts with a user associated with the hospitalization procedure. For example, the terminal devices 430 may include a first terminal device in a patient's room that interacts with the patient and/or healthcare personnel, a second terminal device that interacts with the patient's doctor, a third terminal device of the patient, a fourth terminal device that interacts with a nurse, a fifth terminal device of a remote visitor, and the like, or any combination thereof. Taking the first terminal device as an example, the first terminal device may be configured to present data (e.g., notifications, avatars, image data of a remote visitor) to the patient and/or receive instructions or requests entered by the patient during the hospitalization procedure. The patient may interact with the first terminal device via an input device (e.g., a controller), voice, gestures, etc. For example, the terminal device 430 may include a display device 431, an augmented reality device 432, a mobile device 433, and the like.
The care device 440 may be configured to provide care services to a patient and/or assist healthcare personnel (e.g., doctors, nurses, etc.) in providing care services. For example, the care device 440 may include a patient care robot 441, an intelligent care cart 240-4, and the like.
The patient care robot 441 may include sensors, robotic arms, actuators, user interfaces, and the like, or any combination thereof. The patient care robot 441 may provide care services, such as assisting the patient in drinking water, eating or taking medicine. The smart care cart 240-4 may have autonomous driving capabilities and be capable of moving to different locations in a hospital. For example, the smart care cart 240-4 may be configured to guide a nurse to a ward for an examination or care operation on a patient. For another example, the smart care cart 240-4 may be configured to present guidance information to guide a nurse to perform an examination or care operation. Optionally, the intelligent care cart 240-4 may include one or more examination devices and/or care tools for use by nurses in performing examination and/or care operations. For more description of the intelligent care cart, see the previous related description (e.g., fig. 2 and its description).
The processing device 210 may be configured to process data related to the hospitalization system 400 to enable and/or support at least a portion of the hospitalization services. In some embodiments, as shown in fig. 4, the processing device 210 may be communicatively coupled to data sources that collect data related to the patient's hospitalization procedure, and may process the data received from the data sources. The data sources may include one or more of the sensing device 410, the examination device 420, the terminal device 430, and the care device 440. In other embodiments, the processing device 210 may be communicatively coupled to a storage device that stores the data collected by the data sources, retrieve the data from the storage device, and process the retrieved data.
In some embodiments, the processing device 210 may be configured with an agent. The agent may participate in processing data related to the hospitalization procedure to enable and/or support at least a portion of the hospitalization services. In some embodiments, different hospitalization services, different phases of the hospitalization procedure, and/or different operations involved in the hospitalization procedure may correspond to the same agent. Alternatively, different hospitalization services, different phases of the hospitalization procedure, and/or different operations involved in the hospitalization procedure may correspond to different agents. For example, the processing device 210 may be configured with multiple agents corresponding to different hospitalization services. An agent corresponding to a particular hospitalization service may be configured to process data related to the hospitalization service to support or effectuate the hospitalization service. By way of example only, the plurality of agents may include an admission agent corresponding to the admission service, an inquiry agent corresponding to the admission inquiry service, a care agent corresponding to the care service, a ward-round agent corresponding to the ward-round service, a discharge agent corresponding to the discharge service, and a follow-up agent corresponding to the follow-up service. In some embodiments, multiple agents may cooperate and share information over a network to enable and/or support hospitalization services.
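By way of illustration only, the correspondence between hospitalization services and dedicated agents may be sketched as a simple registry; all names are hypothetical, and a real agent would wrap far richer processing logic. The shared board stands in for the network over which agents cooperate and share information:

```python
class Agent:
    """Hypothetical agent dedicated to one hospitalization service."""
    def __init__(self, name: str):
        self.name = name

    def handle(self, data: str, board: list) -> str:
        note = f"{self.name} processed {data}"
        board.append(note)  # share the result with the other agents
        return note

# One agent per hospitalization service, as enumerated above.
AGENT_REGISTRY = {
    "admission": Agent("admission-agent"),
    "inquiry": Agent("inquiry-agent"),
    "care": Agent("care-agent"),
    "ward_round": Agent("ward-round-agent"),
    "discharge": Agent("discharge-agent"),
    "follow_up": Agent("follow-up-agent"),
}

shared_board = []
result = AGENT_REGISTRY["care"].handle("vital-sign update", shared_board)
```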
In some embodiments, the processing device 210 may set the configuration of the agent. For example, the configuration of the agent may include configuration of a hospital real-time map, configuration of a sensing device, configuration of an admission announcement, configuration of an intelligent care cart, configuration of an electronic health record (e.g., an electronic health record template, an electronic health record audit process, a registration record template, an admission record template, a care record template, a ward-round record template, a visit record template, a discharge record template, a follow-up record template, etc.), configuration of a medical knowledge map (e.g., interrogation templates for different diseases, a professional medical knowledge base, etc.), configuration of speech recognition management (e.g., custom vocabulary, speech recognition engine, emergency vocabulary, etc.), security and privacy configuration (e.g., access control, operation log, etc.), data interaction configuration, etc., or any combination thereof. For example, a manager of a hospital may set up the configuration of an agent through a management space application.
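By way of illustration only, the configuration categories listed above may be represented as a nested structure whose defaults a hospital manager can partially override; all keys and default values below are hypothetical placeholders:

```python
# Hypothetical default agent configuration, mirroring the categories above.
DEFAULT_AGENT_CONFIG = {
    "ehr": {
        "templates": ["registration", "admission", "care", "ward_round",
                      "visit", "discharge", "follow_up"],
        "audit_required": True,
    },
    "speech": {"engine": "default", "custom_vocabulary": [], "emergency_vocabulary": ["code blue"]},
    "security": {"access_control": "role-based", "operation_log": True},
}

def configure_agent(overrides: dict) -> dict:
    """Merge manager-supplied overrides onto a copy of the defaults, section by section."""
    config = {section: dict(values) for section, values in DEFAULT_AGENT_CONFIG.items()}
    for section, values in overrides.items():
        config.setdefault(section, {}).update(values)
    return config

# A manager swaps in a different speech recognition engine; other defaults stay intact.
cfg = configure_agent({"speech": {"engine": "hospital-asr-v2"}})
```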
In some embodiments of the present description, repetitive tasks and management tasks may be handled by an agent, which may reduce the need for significant manpower and reduce operating costs. For example, the agent can process repetitive tasks and management tasks faster and without fatigue, which can reduce the number of staff required for the same workload, improving the efficiency of hospitalization services. In addition, by minimizing human intervention, the agent can significantly reduce the likelihood of errors in data entry and processing and ensure that processes are performed in a uniform manner to maintain a high standard of consistency and reliability, thereby improving the accuracy of hospitalization services. For example, automatically generating records (e.g., registration records, admission records, care records, ward-round records, visit records, discharge records, follow-up records, etc.) may ensure the comprehensiveness and authenticity of the records, thereby improving the accuracy and objectivity of the records and, in turn, the accuracy of subsequent services. In addition, the agent can learn and optimize itself from various data (such as historical service data, knowledge databases, etc.), thereby improving the quality of the hospitalization services it supports.
In some embodiments, the processing device 210 may be integrated into other devices of the hospitalization system 400. For example, the processing device 210 may be configured with an agent and integrated into a hospital bed or a display device of a hospital bed; such a hospital bed may also be referred to as a smart hospital bed or a metabed. For another example, the processing device 210 may be configured with an agent and integrated into the patient care robot 441; such a patient care robot 441 may also be referred to as an embodied intelligent robot. For example, the embodied intelligent robot may be configured with an agent corresponding to the hospitalization service and assist the nurse in providing patient care services. Exemplary patient care services may include pressure sore care, bed transport, assisting a patient in drinking and/or eating, etc., or any combination thereof.
In some embodiments, the processing device 210 may be implemented on a cloud computing device. For more details regarding the processing device 210, see FIG. 2 and its associated description.
It should be noted that the description of the hospitalization system 400 above is for illustrative purposes only and is not intended to limit the scope of the present description. Various changes and modifications may be made by one of ordinary skill in the art in light of the description of the application. However, such changes and modifications do not depart from the scope of the present application.
In some embodiments, the processing device 210 may be connected to more data sources to collect more data related to patient hospitalization procedures. In some embodiments, multiple devices in fig. 4 may be integrated into one device. For example, one or more sensing devices and/or one or more terminal devices may be integrated into the patient bed or patient care robot 441. For example only, the first terminal device may be integrated into the patient bed as a bedside terminal 240-6. In some embodiments, one or more of the devices in fig. 4, e.g., sensing devices, inspection devices, etc., may be internet of things devices.
Fig. 6 is an exemplary schematic diagram of a processing device according to some embodiments of the present description. In some embodiments, the processing device 210 may communicate with a computer-readable storage medium (e.g., the storage device 230 shown in fig. 2) and execute instructions stored therein. As shown in fig. 6, the processing device 210 may include a monitoring module 610, a detection module 620, and a service module 630.
The monitoring module 610 may be configured to monitor data sources that collect data related to the patient's hospitalization procedure. A data source refers to a source (e.g., a hardware device) that provides (e.g., collects) data related to the hospitalization procedure. In some embodiments, the monitoring module 610 may monitor the data sources to detect data updates of at least one of the data sources. For more on monitoring data sources, see step 702 of fig. 7 and its associated description.
The detection module 620 may be configured to perform event detection of interest on the update data collected by the at least one data source. Update data refers to data collected by at least one data source that has not been processed by the processing device 210. An event of interest (Event of Interest, EOI) refers to a particular event or operation that requires attention. Event of interest detection refers to processing update data collected by at least one data source to detect whether one or more events of interest have occurred. In some embodiments, the detection module 620 may perform the event detection of interest based on event detection rules of interest. The event of interest detection rule refers to a rule to be followed when the event of interest detection is performed. For more on performing event detection of interest, see step 704 of FIG. 7 and its associated description.
The service module 630 may be configured to perform one or more preset operations corresponding to events of interest to provide at least a portion of hospitalization services. The preset operation corresponding to the event of interest refers to an operation that needs to be performed when the event of interest occurs. In some embodiments, the service module 630 may determine one or more preset operations corresponding to the event of interest based on a correspondence between the event of interest and the preset operations. The correspondence may indicate one or more preset operations that need to be performed when a particular type of event of interest occurs. For more details regarding performing one or more preset operations, reference may be made to step 706 of FIG. 7 and the associated description.
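By way of illustration only, the correspondence between events of interest and preset operations may be sketched as a dispatch table; the event names and operation names below are hypothetical placeholders for the operations the service module would actually perform:

```python
# Hypothetical correspondence: event of interest -> preset operations to perform.
CORRESPONDENCE = {
    "patient_entered_ward": ["show_welcome_notice", "generate_registration_record"],
    "abnormal_vital_sign": ["alert_nurse", "notify_doctor"],
}

def execute_preset_operations(event: str, patient_id: str) -> list:
    """Look up the event's preset operations and record each one as performed."""
    performed = []
    for operation in CORRESPONDENCE.get(event, []):
        performed.append(f"{operation}({patient_id})")  # stand-in for the real operation
    return performed

ops = execute_preset_operations("abnormal_vital_sign", "patient-001")
```

An event with no entry in the table simply yields no operations, so unknown events are ignored rather than mishandled.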
It should be noted that the above description of the processing device 210 is provided for illustrative purposes and is not intended to limit the scope of the present description. Various changes and modifications may be made by one of ordinary skill in the art in light of the present description. However, such changes and modifications do not depart from the scope of the present application. In some embodiments, processing device 210 may include one or more other modules. For example, the processing device 210 may include a memory module for storing data generated by the modules in the processing device 210. In some embodiments, any two modules may be combined into a single module, and any one module may be divided into two or more units. For example, the service module 630 may include multiple units configured to support and/or enable different hospitalization services.
Fig. 7 is an exemplary flow chart of a procedure 700 for providing hospitalization services according to some embodiments of the present description. In some embodiments, the process 700 may include the following steps. In some embodiments, the process 700 is performed by the processing device 210.
At step 702, the processing device 210 (e.g., the monitoring module 610) may monitor data sources that collect data related to the patient's hospitalization procedure.
As illustrated in fig. 5, the hospitalization procedure may include multiple phases. For example, the plurality of phases may include an admission phase, an admission inquiry phase, a hospitalization phase, a discharge phase, a follow-up phase, and the like, or any combination thereof. In some embodiments, each phase may correspond to one or more hospitalization services. For example, the admission phase may correspond to an admission service, the admission inquiry phase may correspond to an admission inquiry service, the hospitalization phase may correspond to ward services (e.g., a care service, a ward-round service, a visit service, etc.), the discharge phase may correspond to a discharge service, and the follow-up phase may correspond to a follow-up service. For more description of the hospitalization services, see fig. 7-20 and their associated description.
The data related to the hospitalization procedure may indicate the progress of the hospitalization procedure, the status of the patient in the hospitalization procedure, and/or various aspects of the hospitalization procedure. For example, data related to an hospitalization procedure may include data related to each stage of the hospitalization procedure. As another example, data related to an hospitalization procedure may include data related to each hospitalization service provided to the associated user in the hospitalization procedure.
A data source refers to a source (e.g., a hardware device) that provides (e.g., collects) data related to the hospitalization procedure.
For example, the data source may comprise a sensing device in a patient's ward for collecting sensing information. The sensing device may include, among other things, an image sensor, a sound sensor, a temperature sensor, a humidity sensor, an atmospheric pressure sensor, etc., or any combination thereof.
For another example, the data source may include a terminal device of a physician (e.g., a doctor) associated with the patient to obtain patient-related input data entered by the doctor. Exemplary doctor's terminal devices may include a doctor's mobile device, an augmented reality device, a notebook computer, etc., or any combination thereof. The input data may include orders, prescriptions (e.g., medication prescriptions, examination prescriptions), diagnostic reports, and the like.
As another example, the data source may include a vital sign monitor (e.g., the examination device 420) for collecting vital sign data of the patient. Exemplary vital sign data may include heart rate, respiratory rate, body temperature, blood pressure, etc., or any combination thereof.
As another example, the data source may include a hospital examination department for collecting patient examination results. The hospital examination departments may include a medical imaging department, a clinical laboratory department, and the like, or any combination thereof. The medical imaging department may be configured with medical imaging devices such as computed tomography (computed tomography, CT) devices, digital subtraction angiography (digital subtraction angiography, DSA) devices, magnetic resonance (magnetic resonance, MR) devices, ultrasound devices, positron emission tomography (positron emission tomography, PET) devices, single-photon emission computed tomography (SPECT) devices, positron emission tomography-computed tomography (positron emission tomography-computed tomography, PET-CT) devices, positron emission tomography-magnetic resonance imaging (positron emission tomography-magnetic resonance imaging, PET-MRI) devices, or the like, or any combination thereof. The clinical laboratory department may be configured to conduct a clinical examination of the subject. Exemplary clinical tests may include blood routine tests, urine routine tests, biochemical tests, immunoassay tests, instrumental tests, and the like, or any combination thereof.
It should be noted that the above description of the data sources is for illustrative purposes only and is not limiting. The processing device 210 may be configured to monitor any data source for collecting data related to the hospitalization procedure. For example, the processing device 210 may monitor a first terminal device in a patient's ward, a third terminal device of the patient, a fourth terminal device in a hospital that interacts with a nurse corresponding to the patient (e.g., a common terminal in a nurse workstation), etc., or any combination thereof.
Because there are multiple types of data sources, data related to the hospitalization procedure includes various types of data, such as sensory information, examination data, interaction data, clinical data, order data, and the like. In some embodiments, data related to hospitalization procedures may be considered multi-modal data, including multiple types of data, multi-dimensional data, and the like.
In some embodiments, the data source may include an internet of things (IoT) device that collects internet of things data. An internet of things device refers to a device having sensors, processing power, software and other technologies that connect to and exchange data with other devices and systems via the internet or other communication networks. For example, as shown in fig. 4, the sensing device 410, the intelligent care cart 240-4, the patient care robot 441, and the examination device 420 in the hospitalization system 400 may be communicatively connected via a wireless network. That is, the sensing device 410, the intelligent care cart 240-4, the patient-care robot 441, and the examination device 420 may be referred to as internet of things devices.
In some embodiments, the processing device 210 may monitor the data sources to detect data updates of at least one of the data sources. The processing device 210 may detect a data update when at least one of the data sources acquires update data that has not been processed by the processing device 210. For example, assuming that a vital sign monitor (e.g., the examination device 420) collects the heart rate of the patient every hour, the processing device 210 may detect a data update in the vital sign monitor when a new heart rate of the patient is collected.
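By way of illustration only, update detection may be sketched as follows: the monitor remembers, per data source, how many records it has already processed and reports only the newer ones. All names and the sample values are hypothetical:

```python
class UpdateMonitor:
    """Tracks, per data source, how many records have already been processed."""
    def __init__(self):
        self._processed = {}  # source id -> count of records already processed

    def check(self, source_id: str, records: list) -> list:
        """Return the records of this source that have not yet been processed."""
        seen = self._processed.get(source_id, 0)
        fresh = records[seen:]
        self._processed[source_id] = len(records)
        return fresh

monitor = UpdateMonitor()
heart_rates = [72, 75]                                     # hourly heart-rate samples
first = monitor.check("vital-sign-monitor", heart_rates)   # both samples are new
heart_rates.append(78)                                     # a new sample is collected
second = monitor.check("vital-sign-monitor", heart_rates)  # only the new sample is reported
```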
In some embodiments, the processing device 210 may be in communication with a data source and directly monitor the data source. For example, when a data source transmits update data to the processing device 210, the processing device 210 may detect a data update in the data source. Alternatively, the processing device 210 may be communicatively coupled to a storage device (e.g., the storage device 230) that stores the data related to the hospitalization procedure collected by the data sources, and may monitor the data sources by monitoring the storage device. For example, when the storage device receives update data from a data source, the processing device 210 may detect a data update in the data source.
In some embodiments, the processing device 210 may monitor different data sources while the patient is in different phases of the hospitalization procedure. For example, the processing device 210 may determine a current stage of the patient in the hospitalization procedure and monitor at least a portion of the data sources corresponding to the current stage. For example only, the processing device 210 may monitor the sensing devices in the patient room while the patient is in the admission phase.
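The stage-dependent monitoring described above may be sketched, by way of illustration only, as a mapping from the patient's current phase to the data sources polled in that phase; the phase and source names below are hypothetical:

```python
# Hypothetical mapping: hospitalization phase -> data sources monitored in that phase.
STAGE_SOURCES = {
    "admission": ["ward_sensing_devices"],
    "hospitalization": ["ward_sensing_devices", "vital_sign_monitor", "doctor_terminal"],
    "follow_up": ["patient_terminal"],
}

def sources_to_monitor(current_phase: str) -> list:
    """Return only the data sources relevant to the patient's current phase."""
    return STAGE_SOURCES.get(current_phase, [])
```

For example, while the patient is in the admission phase only the ward sensing devices are polled, whereas during the hospitalization phase the vital sign monitor and the doctor's terminal are polled as well.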
In some embodiments, the processing device 210 may monitor the data sources according to a preset data acquisition protocol corresponding to the hospitalization procedure. The preset data acquisition protocol may include data acquisition protocols corresponding to various phases of the hospitalization procedure. The data acquisition protocol corresponding to a phase may specify a particular data source (e.g., a hardware device) that collects data associated with that phase, data interface criteria for the data source, data quality criteria for the data source, etc.
According to some embodiments of the present description, personalized monitoring may be provided for a patient to enable continuous status monitoring, thereby enabling timely response to patient needs and improving efficiency and quality of hospitalization services.
If a data update is detected in at least one of the data sources, the processing device 210 may continue to step 704. At the same time, processing device 210 may continue to execute step 702 to continuously monitor the data source.
At step 704, the processing device 210 (e.g., detection module 620) may perform event detection of interest on the update data collected by the at least one data source.
Update data refers to data collected by at least one data source that has not been processed by the processing device 210. For example, if the at least one data source includes an IoT device, the update data may include IoT data. For another example, if the at least one data source comprises a sensing device, the update data may comprise sensing information acquired by the sensing device, such as image data acquired by an image sensor, acoustic data acquired by a sound sensor, etc. For another example, if the at least one data source includes a terminal device of a doctor associated with the patient, the update data may include input data associated with the patient entered by the doctor via the terminal device. For another example, if the at least one data source includes a vital sign monitor, the update data may include vital sign data of the patient. For another example, if the at least one data source comprises a hospital examination department, the updated data may comprise the patient's examination results. The update data may be obtained directly from at least one data source or from a storage device.
An event of interest refers to a particular event or operation that requires attention. Event-of-interest detection refers to processing the update data collected by the at least one data source to detect whether one or more events of interest have occurred. Exemplary events of interest may include the patient being admitted to a ward, the patient undergoing an admission check, at least one doctor making ward rounds, a care operation or medical examination being performed on the patient, the patient initiating a service request, the patient's orders being acquired or updated, the patient's physiological condition being abnormal, a doctor issuing instructions regarding the patient, etc., or any combination thereof.
In some embodiments, the at least one data source may include a plurality of data sources that collect data related to the same event of interest. For example, an image sensor and a sound sensor in a patient room may collect data related to the same event of interest occurring in the patient room (e.g., at least one doctor making ward rounds in the patient room).
In some embodiments, the at least one data source may include a data source that collects data related to a plurality of events of interest. For example, an image sensor in a patient room may collect data related to a plurality of events of interest occurring in the patient room, including the patient entering the patient room, a care operation or medical examination being performed on the patient, etc.
In some embodiments, the processing device 210 may perform event-of-interest detection based on event-of-interest detection rules. An event-of-interest detection rule refers to a rule to be followed when event-of-interest detection is performed. For example, the event-of-interest detection rules may specify the events of interest to be detected, algorithms or techniques for detecting particular events of interest, algorithms or techniques for analyzing data collected by particular data sources, events of interest corresponding to different phases of the hospitalization procedure, events of interest corresponding to different types of patients, and the like, or any combination thereof.
In some embodiments, the event-of-interest detection rules may be determined from historical records of event-of-interest detection, or manually set by a user (e.g., a doctor, a nurse, a technician, etc.). For example, the processing device 210 may perform event-of-interest detection on the update data based on the type of the update data. For example, if the update data includes image data acquired by an image sensor, the event-of-interest detection may include performing at least one of moving object detection, anomaly detection, behavior detection, or identification on the image data. For another example, if the update data includes a voice signal captured by a sound sensor, the event-of-interest detection may include performing at least one of voice content recognition, speaker recognition, or key content extraction on the voice signal. For another example, if the update data includes a value of a physiological parameter, the event-of-interest detection may include determining whether a data difference between the value of the physiological parameter and a historical value exceeds a data difference threshold, determining whether the value of the physiological parameter is outside a preset data range, and so forth. The data difference threshold and/or the data range may be preset in the event-of-interest detection rules. For example, assuming a heart rate range of 50-120 beats/min for the patient, when the heart rate collected by the vital sign monitor is 130 beats/min, the processing device 210 may determine that the heart rate of the patient is outside the heart rate range and that an event of interest (i.e., an abnormality in the physiological condition of the patient) has occurred.
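The physiological-parameter rule above (a range check plus a data-difference check) can be sketched as follows. The function name and parameters are illustrative; in practice the thresholds would be preset in the event-of-interest detection rules.

```python
def detect_physiological_anomaly(value, history, data_range, diff_threshold):
    """Return True if the physiological parameter value indicates an
    event of interest (an abnormality in the patient's condition).

    A sketch of the rule described in this section: the value is anomalous
    if it falls outside its preset data range, or if it differs from the
    most recent historical value by more than the data difference threshold.
    """
    low, high = data_range
    if not (low <= value <= high):  # value outside the preset data range
        return True
    if history and abs(value - history[-1]) > diff_threshold:
        return True  # sudden change relative to the historical value
    return False
```

With the heart rate example from the text, a measured value of 130 beats/min against a preset range of 50-120 beats/min would be flagged as an event of interest.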
In some embodiments, each phase of the hospitalization procedure may correspond to one or more events of interest, and different phases of the hospitalization procedure may correspond to different types of events of interest. Thus, the processing device 210 may perform event-of-interest detection on the update data based on the current stage of the patient in the hospitalization procedure. For example, the processing device 210 may determine the current stage of the patient in the hospitalization procedure and determine type data of the events of interest to be detected based on the current stage. Further, the processing device 210 may perform event-of-interest detection on the update data based on the type data.
For example only, when the patient is in the admission phase of the hospitalization procedure, the processing device 210 may determine that the events of interest to be detected include the patient being admitted to the ward, the patient meeting the conditions for performing the admission check, the admission check being performed on the patient, and so forth. For another example, when the patient is in the in-hospital phase, the processing device 210 may determine that the events of interest to be detected include at least one doctor making ward rounds in the ward, a care operation or medical examination being performed on the patient, the patient initiating a service request, the patient's order being acquired or updated, the patient's physiological condition being abnormal, the doctor issuing instructions regarding the patient, and so forth.
By performing the event-of-interest detection on the updated data according to the current stage of the patient in the hospitalization procedure, only specific types of events-of-interest need to be detected or tracked, which can effectively reduce the amount of data that needs to be processed, thereby reducing the processing requirements on the processing device 210 and improving the detection efficiency of the events-of-interest.
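The stage-based narrowing described above can be sketched as a mapping from the current stage to the event types to be detected; the stage and event names below are assumptions drawn from the examples in this description, not the system's actual identifiers.

```python
# Illustrative mapping from hospitalization stage to events of interest.
STAGE_EVENTS = {
    "admission": {
        "patient_admitted_to_ward",
        "admission_check_condition_met",
        "admission_check_performed",
    },
    "in_hospital": {
        "ward_round",
        "care_or_medical_examination",
        "service_request",
        "order_acquired_or_updated",
        "physiological_anomaly",
        "doctor_instruction",
    },
}

def events_to_detect(current_stage):
    """Step 704 narrowing: only the current stage's event types are tracked."""
    return STAGE_EVENTS.get(current_stage, set())
```

Because only the returned event types need to be detected at any moment, the amount of update data to analyze shrinks accordingly.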
If an event of interest occurs, processing device 210 may continue to step 706. At the same time, processing device 210 may continue to execute steps 702 and 704 to continuously monitor the data source and detect the event of interest.
In step 706, the processing device 210 (e.g., the service module 630) may perform one or more preset operations corresponding to the event of interest to provide at least a portion of the hospitalization service.
The preset operation corresponding to the event of interest refers to an operation that needs to be performed when the event of interest occurs. In some embodiments, the preset operation may include a general operation and/or a specific operation. A general operation refers to an operation that needs to be performed whenever an event of interest occurs, regardless of the type of the event of interest. For example, the general operations may include generating a record relating to the event of interest. That is, whenever an event of interest is detected, the processing device 210 may generate a record relating to the event of interest based on the updated data relating to the event of interest. Exemplary records may include registration records, admission records, care records, ward records, visit records, discharge records, follow-up records, and the like, or any combination thereof. For another example, the general operations may include transmitting records related to the event of interest to an associated user for confirmation, or to a storage device for storage.
A specific operation refers to an operation that needs to be performed when a specific type of event of interest occurs. For example, updating the daily schedule of the patient according to the update data may be determined as a specific operation corresponding to the event of interest of the patient's order being updated. For another example, providing a notification regarding the event of interest to a healthcare worker may be determined as a specific operation corresponding to the event of interest of the patient's physiological condition being abnormal.
For example, when the event of interest includes a patient being admitted to a hospital ward, the one or more preset operations may include determining, based on patient data of the patient, query content to be queried after the patient is admitted to the hospital ward (i.e., first query content of a first query), querying the patient via a terminal device in the hospital ward based on the query content, obtaining perception information (i.e., first perception information) acquired by a perception device in the ward during the query, and generating an admission record of the patient based on the perception information. For more description on the generation of admission records, see fig. 12 and its associated description.
For another example, when the event of interest includes obtaining a hospitalization guidance request, the one or more preset operations may include obtaining a first location corresponding to a terminal device of the patient (i.e., a third terminal device) and a second location corresponding to a ward of the patient, determining a planned route from the first location to the second location based on a real-time map of the hospital, and presenting guidance information related to the planned route to the patient through the terminal device of the patient. For more on the presentation of guidance information, see fig. 9 and 10 and their associated description.
For another example, when the event of interest includes meeting a condition for performing an admission check, the one or more preset operations may include controlling the intelligent care cart to guide a nurse to the ward for performing the admission check on the patient. For more on the execution of admission checks on patients, see fig. 11 and its associated description.
For another example, when the event of interest includes a patient's order being acquired or updated, the one or more preset operations may include, for each day of patient hospitalization, determining a daily plan for the patient based on the patient's patient data and the order or updated order, presenting the daily plan to the patient via a terminal device (e.g., a first terminal device) in the patient's ward, and presenting the daily plan to a nurse corresponding to the patient via a nurse's terminal device (i.e., a fourth terminal device). The daily schedule may include at least one medical operation that needs to be performed on the patient during the day. For more on a daily schedule, see fig. 16 and its associated description.
For another example, when the event of interest includes at least one doctor making ward rounds for the patient, the one or more preset operations may include obtaining perception information (i.e., fourth perception information) acquired by one or more perception devices in the ward during the ward rounds of the at least one doctor, and generating a ward record based on the perception information. For more on generating a ward record, see FIG. 17 and its associated description.
For another example, when the event of interest includes obtaining a visit request, the one or more preset operations may include obtaining first current information of the patient and second current information of a remote visitor, generating a virtual visit space for the patient and the remote visitor based on the first current information and the second current information, and presenting the virtual visit space to the patient and the remote visitor, respectively. For more on the presentation of the virtual visit space, see FIG. 18 and its associated description.
For another example, when the event of interest includes receipt of an instruction to discharge, the one or more preset operations may include obtaining a target hospitalization record of the patient, generating discharge data based on the target hospitalization record, and presenting the discharge data to the patient via a terminal device (i.e., a first terminal device) in a patient's ward. For more on presentation of discharge data, see fig. 19 and its associated description.
For another example, when the event of interest includes the patient having been discharged from the hospital, the one or more preset operations may include determining a follow-up plan for the patient based on a target hospitalization record for the patient, the follow-up plan including one or more follow-up visits performed at one or more planned times. For any one of the one or more follow-up visits, the one or more preset operations may further include alerting the doctor and the patient through the terminal device of the doctor (i.e., the second terminal device) and the terminal device of the patient (i.e., the third terminal device), respectively, based on the scheduled time corresponding to the follow-up visit. For more details of the follow-up operation, reference may be made to the relevant description of the rest of the specification, such as FIG. 20 and its associated description.
In some embodiments, the processing device 210 may determine one or more preset operations corresponding to the event of interest based on a correspondence between the event of interest and the preset operations. The correspondence may indicate one or more preset operations that need to be performed when a particular type of event of interest occurs. In some embodiments, the correspondence may be determined by way of a look-up table.
In some embodiments, the correspondence may be predetermined and stored in a storage device, from which the processing device 210 may retrieve the correspondence. In some embodiments, the processing device 210 may determine the correspondence between events of interest and preset operations based on historical records.
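The look-up-table correspondence mentioned above can be sketched as a dictionary from event types to operation lists. The event and operation names are illustrative placeholders; the general operation of generating an event record follows the description of general operations above.

```python
# Illustrative correspondence between events of interest and specific operations.
CORRESPONDENCE = {
    "patient_admitted_to_ward": [
        "determine_query_content",
        "query_patient_via_ward_terminal",
        "generate_admission_record",
    ],
    "order_acquired_or_updated": [
        "update_daily_schedule",
        "present_daily_schedule",
    ],
    "physiological_anomaly": ["notify_healthcare_worker"],
}

# A general operation is performed whenever any event of interest occurs.
GENERAL_OPERATIONS = ["generate_event_record"]

def preset_operations(event):
    """Look up the operations to perform for a detected event of interest."""
    return GENERAL_OPERATIONS + CORRESPONDENCE.get(event, [])
```

An unknown event type still triggers the general record-generation operation, matching the description that general operations apply regardless of event type.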
In some embodiments, the processing device 210 may also determine the one or more preset operations corresponding to the event of interest based on patient data of the patient. Patient data of a patient may include patient base data, health data, history data, registration data, and the like, or any combination thereof. For more content on patient data, see fig. 12 and its associated description. For example, different types of patients (e.g., suffering from different diseases, having different ages) may correspond to different preset operations. For example only, if the patient has a history of hospitalization at the hospital, the one or more preset operations corresponding to the event of interest of obtaining the hospitalization guidance request may be simplified. By further determining the one or more preset operations based on patient data of the patient, the one or more preset operations may be customized for the patient, thereby improving the accuracy of providing hospitalization services and optimizing user satisfaction.
In some embodiments, the processing device 210 may perform one or more preset operations according to the data source corresponding to the event of interest. For example, when at least one data source includes a plurality of data sources that collect data related to the same event of interest, the processing device 210 may perform one or more preset operations according to a collection of data related to the same event of interest collected by the plurality of data sources. For example only, the image sensor and the sound sensor in the patient room may simultaneously acquire data related to an event of interest of the patient entering the patient room, and one or more preset operations may be performed according to a collection of data related to the event of interest acquired by the image sensor and the sound sensor.
For another example, the at least one data source includes a data source for acquiring data related to a plurality of events of interest, and the one or more preset operations corresponding to at least two of the plurality of events of interest may be different. For example, a sound sensor in a patient room may collect acoustic data related to different events of interest occurring in the patient room, such as a patient entering the patient room, at least one doctor making a ward round in the patient's room, etc., and one or more preset operations corresponding to the different events of interest may be different.
In some embodiments, the processing device 210 may perform a preset operation corresponding to the event of interest immediately after the event of interest is detected. For example, the processing device 210 may issue an alarm through the fourth terminal device when the physiological condition of the patient is abnormal. In some embodiments, the processing device 210 may perform a preset operation corresponding to the event of interest after the event of interest is completed. For example, in response to detecting completion of an event of interest, the processing device 210 may generate a record related to the event of interest.
According to some embodiments of the present disclosure, by monitoring a data source for collecting information related to a patient hospitalization procedure, the data update and the occurrence of an interesting event can be detected in time, and a corresponding preset operation can be triggered immediately, so that hospitalization services can be automatically and efficiently provided to related users, and service efficiency and service quality can be improved. Further, the monitored data sources collect multi-modal data throughout different phases of the hospitalization procedure, thereby enabling patient-centric healthcare and comprehensive hospital services.
In some embodiments, the processing device 210 may be configured with an agent, and the agent may perform at least a portion of the operations of process 700. An agent refers to an intelligent entity that acts autonomously. For example, an agent may be a computing entity that can autonomously learn and evolve, and perceive and analyze data, in order to perform particular tasks and/or achieve particular goals (e.g., performing steps 702-706 in process 700). Through artificial intelligence techniques (e.g., reinforcement learning, deep learning, etc.), an agent can constantly learn and self-optimize during interactions with its environment. In addition, the agent can acquire and analyze massive data through big data technologies, mine patterns and learn rules from the data, and optimize its decision process, thereby identifying environmental changes, responding quickly in uncertain or dynamic environments, and making reasonable judgments.
For example, the agent may learn the event-of-interest detection rules from historical records and perform event-of-interest detection according to those rules. An event-of-interest detection rule refers to a rule specifying how event-of-interest detection is performed on the update data. For example, the agent may learn from historical records the types of events of interest that need to be monitored, how to effectively detect the events of interest, which data needs to be analyzed to detect the events of interest, and so on.
For another example, the agent may learn the correspondence between events of interest and preset operations from historical records, and determine one or more preset operations corresponding to an event of interest according to the correspondence. For example, the agent may learn from historical records which operations need to be performed when an event of interest occurs.
In some embodiments, the agent may also learn the event-of-interest detection rules and/or the correspondence between events of interest and preset operations based on the patient data of different patients. For example, the agent may determine different event-of-interest detection rules and different preset operations for different types of patients (e.g., suffering from different diseases, having different ages). For example only, the agent may determine different heart rate ranges for event-of-interest detection corresponding to different types of patients having different ages.
Through the integration of the agent, the hospitalization service system can continuously learn and optimize the event-of-interest detection rules and/or the correspondence between events of interest and preset operations using big data, machine learning, and other advanced technologies or methods, thereby improving the accuracy, efficiency, and quality of hospitalization services.
Fig. 8 is an exemplary flow chart of a procedure 800 for providing hospitalization admission services according to some embodiments of the present description. In some embodiments, the process 800 may be performed during an admission phase of the hospitalization process. In some embodiments, the process 800 may be performed when the processing device 210 detects that a patient is admitted (e.g., the patient completes the relevant hospitalization process). In some embodiments, the process 800 may include the following steps.
At step 810, the processing device 210 (e.g., the service module 630) may direct the patient to a ward.
For example, the processing device 210 may control a third terminal device of the patient to display route guidance information to guide the patient to the ward. The third terminal device may be a mobile phone, an augmented reality device worn by the patient (e.g., the augmented reality device 260-2 shown in fig. 8), etc. For example only, in response to the hospitalization guidance request, the processing device 210 may obtain a first location corresponding to a terminal device of the patient (i.e., a third terminal device) and a second location corresponding to the ward, determine a planned route from the first location to the second location based on a real-time map of the hospital, and then control the third terminal device to present guidance information related to the planned route to the patient. For more on guiding the patient to the ward, reference may be made to the relevant description of the rest of the present description, such as fig. 9 and 10 and their relevant description.
At step 820, the processing device 210 (e.g., the service module 630) may provide admission instructions to the patient.
The admission instructions may be used to introduce admission information (e.g., admission procedures, admission operations, pre-admission fees, payment methods, etc.), hospital rules, the hospital environment, the patient's doctor and/or nurse, etc. to the patient. In some embodiments, the processing device 210 may adjust the content of the admission instructions based on patient data of the patient. For example, if the patient is a child, the admission instructions may be simplified.
In some embodiments, the processing device 210 may control a third terminal device (e.g., the augmented reality device 260-2) to present a second virtual character that provides the admission instructions to the patient. For example, the display interface of the third terminal device may display a virtual nurse, and a speaker of the third terminal device may broadcast the admission instructions in natural language; the virtual nurse may simulate the speech, gestures, etc. of a person, so as to provide a realistic communication experience for the patient. In some embodiments, the second virtual character may be generated in a manner similar to the first virtual character.
In some embodiments, the third terminal device includes an augmented reality device 260-2, and the processing device 210 may control the augmented reality device 260-2 to present the second virtual character to explain the hospitalization rules via an augmented reality technique. Specifically, in some embodiments, the second virtual character may be superimposed onto the patient's real-world view by an augmented reality technique, or blended with the patient's real-world view by a mixed reality technique, to enable the patient to interact with the second virtual character.
By presenting the second virtual character and delivering the admission instructions in natural language, the patient can better understand the hospitalization rules, which helps strengthen humanized care and improves the quality and efficiency of the admission service.
In some embodiments, the processing device 210 may provide the admission instructions to the patient on the way to the ward. In some embodiments, the processing device 210 may also provide the admission instructions to the patient's accompanying person. An accompanying person refers to a person who accompanies the patient to the hospital.
In step 830, the processing device 210 (e.g., the service module 630) may assist the nurse in performing admission preparation.
Admission preparation may be performed by a nurse to prepare hospital supplies for the patient. In some embodiments, the processing device 210 may present the patient's hospitalization notification through a fourth terminal device 805 or intelligent care cart 240-4 in the nurse station to assist the nurse in performing the admission preparation. The hospitalization notification may include patient data for the patient, a hospital supply list for the patient, ward information for the patient, information for an admission check to be performed on the patient, and the like.
The admission check may also be referred to as a hospitalization check, which is performed after the patient is admitted to the ward. Admission checks may be used to gather information (e.g., vital sign data, basic health data, etc.) about the patient's current medical condition. Admission checks may include checks of blood pressure, blood glucose, heart rate, body temperature, etc., or any combination thereof.
In step 840, the processing device 210 (e.g., the service module 630) may issue a reminder to perform the admission check.
The reminder may include a message reminder, a sound reminder, a pop-up reminder, etc. For example, the processing device 210 may issue a reminder through the fourth terminal device 805 or the intelligent care cart 240-4.
In some embodiments, after the patient arrives in the patient room, the processing device 210 may determine whether the patient satisfies a condition for performing an admission check in the patient room. The condition may include that the patient has been in the patient room for a preset period of time. If the patient satisfies the condition, the processing device 210 may issue a reminder to perform the admission check.
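The admission-check condition can be sketched as a simple elapsed-time test. The 10-minute default wait below is an illustrative assumption, as this description does not specify the length of the period.

```python
def admission_check_due(arrival_timestamp, current_timestamp, wait_seconds=600):
    """Sketch of the condition checked before step 840: the patient has
    been in the patient room for at least a preset period of time.

    Timestamps are in seconds; wait_seconds defaults to an assumed
    10-minute settling period.
    """
    return current_timestamp - arrival_timestamp >= wait_seconds
```

When this returns True, the processing device would issue the admission-check reminder through the fourth terminal device or the intelligent care cart.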
At step 850, the processing device 210 (e.g., the service module 630) may direct the nurse to the ward.
For example, the processing device 210 may present a route from a third location of the fourth terminal device to a second location of the patient room through the fourth terminal device 805. As another example, the processing device 210 may control the smart care cart 240-4 to guide a nurse to a ward of a patient. In some embodiments, the processing device 210 may control movement of the smart care cart 240-4 to guide a nurse to a ward. For example, the processing device 210 may control the intelligent care cart 240-4 to accelerate, decelerate, stop, etc., based on the real-time traffic conditions of the hospital.
At step 860, an admission check is performed on the patient.
For example, after a nurse arrives in the patient room, one or more examination devices may be used to examine the patient to collect first measurement data of the patient. For example, the one or more examination devices may include a blood pressure meter, a blood glucose meter, an electrocardiograph monitor, a thermometer, and the like. The first measurement data acquired during the admission check may include blood pressure, blood glucose, heart rate, body temperature, etc.
In some embodiments, the processing device 210 may present information related to the admission check to a nurse during the admission check through the smart care cart. For example, the intelligent care cart may display admission exam instructions, an electronic health profile of the patient, and the like.
In step 870, the processing device 210 (e.g., the service module 630) may generate a registration record.
The registration record refers to a record indicating that the patient has entered the ward and/or the status of the patient when the patient entered the ward. For example, the registration record may include admission information (e.g., admission number, clinical information, time of admission, amount of pre-paid hospitalization, payment method, etc.), first measurement data collected at the time of admission check, etc.
In some embodiments, the processing device 210 may generate the registration record based on a registration template and the first measurement data. For example, the processing device 210 may acquire the first measurement data relating to the patient acquired by the one or more examination devices during the admission check and generate the registration record of the patient by populating the registration template with the first measurement data and the admission information. In some embodiments, the processing device 210 may further generate the registration record based on the electronic health record of the patient.
In some embodiments, the processing device 210 may further generate a target registration record based on feedback information entered by the nurse regarding the registration record. The target registration record refers to a registration record confirmed by a nurse. For example, the processing device 210 may present the registration record to the nurse via the smart care cart 240-4 or the fourth terminal device 805 and generate the target registration record based on the registration record and feedback information regarding the registration record entered by the nurse via the smart care cart 240-4 or the fourth terminal device 805. For more details on generating the target registration record, reference may be made to the relevant description of the rest of the specification, such as fig. 11 and its relevant description.
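The template-populating step for generating a registration record can be sketched as follows; the template fields and function signature are hypothetical placeholders, not the actual record schema.

```python
# Illustrative registration template with assumed field names.
REGISTRATION_TEMPLATE = {
    "admission_number": None,
    "time_of_admission": None,
    "first_measurement_data": {},
}

def generate_registration_record(admission_info, first_measurement_data):
    """Populate the registration template with admission information and
    the first measurement data collected during the admission check."""
    record = dict(REGISTRATION_TEMPLATE)   # shallow copy; template stays pristine
    record.update(admission_info)
    record["first_measurement_data"] = dict(first_measurement_data)
    return record
```

The resulting record could then be presented to the nurse for confirmation to produce the target registration record described above.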
In some embodiments, the processing device 210 may be configured with an agent (e.g., an admission agent) that may participate in performing one or more operations of the process 800. For example, the agent may direct the patient to a ward, provide admission instructions for the patient, assist a nurse in conducting admission checks, generate registration records, and the like.
In some embodiments of the present description, the patient may be provided with the admission service in a semi-automated manner with the assistance of a medical service system (e.g., intelligent care cart 240-4) and/or an agent to reduce labor costs and increase the efficiency of the admission service.
Fig. 9 is an exemplary flow diagram of a process 900 for presenting guidance information, shown in accordance with some embodiments of the present description. In some embodiments, the process 900 may include the following steps. In some embodiments, the process 900 may be performed by the processing device 210.
The processing device 210 (e.g., the service module 630) may direct the patient to the ward before the patient arrives at the ward.
At step 902, in response to a hospitalization guidance request, the processing device 210 (e.g., the service module 630) may acquire a first location of a third terminal device of the patient and a second location of the ward.
A hospitalization guidance request refers to a request to guide a patient (e.g., patient 261) to the patient's ward. In some embodiments, the processing device 210 may receive a patient-initiated hospitalization guidance request from a third terminal device of the patient. For example, the third terminal device may include an augmented reality device (e.g., augmented reality device 260-2). In some embodiments, the processing device 210 may monitor the data sources and automatically generate a hospitalization guidance request in response to detecting that the patient needs to be guided to the ward. For example, the processing device 210 may automatically generate a hospitalization guidance request upon detecting that the patient has arrived at the hospital and/or has completed the admission-related procedures.
The first location refers to the current location of the third terminal device at the time of acquisition of the hospitalization guidance request. For example, the first location may comprise coordinates of the third terminal device in a real-time map of the hospital. In some embodiments, the processing device 210 may obtain the first location of the third terminal device from the third terminal device. For example, the third terminal device may determine the first location using a positioning technique (e.g., global positioning system (Global Positioning System, GPS)) and send the first location to the processing device 210.
In some embodiments, the processing device 210 may obtain a second location corresponding to the ward based on the hospitalization guidance request. The second location may comprise coordinates of a ward in a real-time map of the hospital. In some embodiments, the hospitalization guidance request may include the patient's ID. The processing device 210 may determine a ward corresponding to the patient based on the patient's ID and then determine a second location corresponding to the ward.
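The ID-to-ward resolution described above can be sketched as a simple two-stage lookup; the identifiers, tables, and coordinate convention below are illustrative assumptions.

```python
# Hypothetical lookup tables: the patient ID carried in the hospitalization
# guidance request is mapped to a ward, and the ward to its coordinates in
# the real-time map of the hospital.
PATIENT_TO_WARD = {"P-1001": "W-12"}
WARD_COORDS = {"W-12": (120.5, 48.0, 3)}  # assumed (x, y, floor) convention

def second_location(guidance_request: dict):
    """Resolve the ward coordinates (second location) from the request."""
    ward = PATIENT_TO_WARD[guidance_request["patient_id"]]
    return WARD_COORDS[ward]

loc = second_location({"patient_id": "P-1001"})
```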
At step 904, the processing device 210 (e.g., the service module 630) may determine a planned route from the first location to the second location based on the real-time map of the hospital.
The real-time map of the hospital may be a three-dimensional map capable of reflecting the layout and real-time status of the hospital. For example, the real-time conditions may include congestion conditions, barrier information, etc. for each path in the real-time map. The planned route is a route that directs the patient to move from a first location to a second location. The processing device 210 may determine the optimal route as the planned route based on a real-time map of the hospital and taking into account factors such as route distance, congestion status, roadblocks, and other relevant information.
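As one possible sketch of this route selection, the real-time map can be modeled as a graph whose edge weights combine distance with a congestion factor, and the planned route chosen with a shortest-path search; the node names, distances, and congestion factors below are illustrative assumptions, not the system's actual map data.

```python
import heapq

def plan_route(graph, start, goal):
    """Dijkstra search over congestion-weighted edges; returns the node path."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt, dist, congestion in graph.get(node, []):
            if nxt not in visited:
                # Congested or blocked corridors get a higher effective cost.
                heapq.heappush(queue, (cost + dist * congestion, nxt, path + [nxt]))
    return None

# Hypothetical map: lobby -> ward W-12, with the direct corridor congested.
graph = {
    "lobby": [("corridor_A", 10, 5.0), ("elevator", 15, 1.0)],
    "corridor_A": [("ward_W12", 5, 1.0)],
    "elevator": [("ward_W12", 10, 1.0)],
}
route = plan_route(graph, "lobby", "ward_W12")
```

With the direct corridor's congestion factor at 5.0, the search detours through the elevator even though the corridor is geometrically shorter, which mirrors how congestion status and roadblocks influence the planned route.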
At step 906, the processing device 210 (e.g., the service module 630) may control the third terminal device to present guidance information (which may also be referred to as route guidance information) related to the planned route to the patient.
Guidance information refers to information that directs the patient to move along the planned route. For example, the guidance information may include a guiding virtual character (e.g., a second virtual character) that provides guidance gestures, guidance voice, and the like. As another example, the guidance information may include arrows, text, etc. for guidance.
In some embodiments, the third terminal device comprises an augmented reality device, and the processing device 210 may control the third terminal device to superimpose the guiding information related to the planned route onto the real world view of the patient via augmented reality or mixed reality technology. That is, when the patient wears the augmented reality device, the patient can see the real world environment and the guidance information at the same time. The real world view of the patient refers to the real world within the patient's field of view. The field of view of the patient may change as the patient's head/eyes move, as may the corresponding patient's real world view.
For example only, as shown in fig. 10, when the head of the patient 261 is oriented in a direction parallel to arrow a, the real world view of the patient 261 may include a real hospital scene in a dashed box 1010. The processing device 210 may superimpose the virtual guide character 1020 on the real world view of the patient 261 through the augmented reality device 260-2. The virtual guide character 1020 may guide the patient 261 to move along a planned route from a first location 1030 corresponding to the augmented reality device 260-2 to a second location 1040 corresponding to the patient room by communicating with the patient 261 using natural language and guide gestures. For another example, the processing device 210 may superimpose a virtual arrow on the real world view of the patient 261 in the dashed box 1010. As the patient 261 moves along the planned route, the real world view of the patient 261 may change in real time and the processing device 210 may update the guidance information in real time.
In some embodiments of the present description, guidance information may be provided to the patient to help the patient reach the ward more quickly, thereby improving the efficiency of the hospital admission process and improving user satisfaction. In addition, displaying the guidance information to the patient through augmented reality or mixed reality technology not only enhances the accuracy and reliability of the guidance information but also avoids interfering with the patient's perception of the physical environment.
Fig. 11 is an exemplary schematic diagram of a procedure 1100 for providing hospitalization admission services according to some embodiments of the present description.
As shown in fig. 11, when a patient 261 enters a ward, the processing device 210 (e.g., the service module 630) may control the smart care cart 240-4 to guide a nurse to the ward according to a real-time map of the hospital, and to perform an admission check 1102 on the patient 261.
After the admission check 1102 is performed, the processing device 210 may acquire first measurement data 1104 relating to the patient 261 acquired by one or more examination devices during the admission check 1102. Optionally, the smart care cart 240-4 may house the one or more examination devices. Further, the processing device 210 may generate a registration record 1106 of the patient 261 from the first measurement data 1104.
In some embodiments, the processing device 210 may present the registration record 1106 to the nurse via the smart care cart 240-4 and obtain feedback information 1108 related to the registration record 1106 entered by the nurse via the smart care cart 240-4. The feedback information may include confirmation instructions entered by the nurse, modification instructions, and the like. Further, the processing device 210 may generate a target registration record 1110 based on the registration record 1106 and the feedback information 1108. For example, the processing device 210 may update the registration record 1106 based on the feedback information 1108 to generate the target registration record 1110.
Fig. 12 is an exemplary flow diagram of a procedure 1200 for providing a hospital admission query service according to some embodiments of the present description. In some embodiments, the process 1200 may be performed during an admission interrogation phase. In some embodiments, the process 1200 may be performed when the processing device 210 detects that the patient has arrived in the ward and/or the admission check has been completed. In some embodiments, the process 1200 may include the following steps.
In step 1202, the processing device 210 (e.g., the service module 630) may determine first query content of a first query to be made after the patient enters the ward based on patient data of the patient.
The patient data may include basic data (e.g., name, age, gender, weight, address, work, etc.), health data (e.g., disease type, disease symptoms), historical data (e.g., historical clinical data, historical hospitalization data, etc.), enrollment data (e.g., goal enrollment records, etc.), etc., or any combination thereof. In some embodiments, the patient data may include an electronic health record of the patient.
The first query, also called an admission query, is used to obtain basic information of the patient so as to make a preliminary assessment of the patient's condition and/or preliminarily resolve the patient's questions. The first query content may include the questions that the patient needs to answer in the first query.
In some embodiments, the processing device 210 may determine the first query content of the first query based on the patient data of the patient. For example only, the processing device 210 may input the patient data into a first query model, which may output the first query content of the first query. In some embodiments, the first query model may include a convolutional neural network (CNN) model, a recurrent neural network (RNN) model, a long short-term memory (LSTM) model, a Bidirectional Encoder Representations from Transformers (BERT) model, a Chat Generative Pre-trained Transformer (ChatGPT) model, or the like, or any combination thereof.
In some embodiments, the processing device 210 may train the first query model based on a first training set. The first training set may include a plurality of first training samples and a first training label corresponding to each of the plurality of first training samples. Each first training sample may include sample patient data of a sample patient, and the corresponding first training label may include gold-standard query content corresponding to the sample patient data. In some embodiments, the first training samples and the first training labels may be determined from historical admission queries or manually set by a user (e.g., a doctor, a nurse, a technician, etc.). In some embodiments, the first query model may be obtained by training a first initial model using the first training set based on a first loss function. For example, a first training sample may be input into the first initial model, which outputs sample query content of the first query. The value of the first loss function may be determined based on the difference between the sample query content and the gold-standard query content, and the parameters of the first initial model may be iteratively updated according to the value of the first loss function until the value of the first loss function reaches a preset value or the iteration count reaches a preset number, thereby obtaining the trained first query model.
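The iterative update described above (update parameters until the loss reaches a preset value or the iteration count reaches a preset number) can be illustrated with a toy one-parameter least-squares model standing in for the neural-network query model; everything in the snippet, including the learning rate and stopping thresholds, is a simplified assumption.

```python
# Toy training loop: iterate until the loss falls below a preset value or a
# preset iteration count is reached, mirroring the stopping rule above.
def train(samples, labels, lr=0.01, loss_target=1e-6, max_iters=10_000):
    w = 0.0                    # single model parameter
    loss = float("inf")
    for _ in range(max_iters):
        # Mean squared error between model output (w * x) and gold labels.
        loss = sum((w * x - y) ** 2 for x, y in zip(samples, labels)) / len(samples)
        if loss <= loss_target:
            break              # preset loss value reached
        grad = sum(2 * (w * x - y) * x for x, y in zip(samples, labels)) / len(samples)
        w -= lr * grad         # iterative parameter update
    return w, loss

# The gold labels follow y = 2x, so training should converge toward w = 2.
w, final_loss = train(samples=[1.0, 2.0, 3.0], labels=[2.0, 4.0, 6.0])
```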
By determining the first query content of the first query using machine learning techniques, the efficiency and quality of the query may be improved, thereby facilitating improved accuracy of subsequent services.
In some embodiments, the processing device 210 may obtain an interrogation template based on patient data of the patient and determine the first interrogation content from the interrogation template. For example, the processing device 210 may invoke a preset query template according to a disease type of the patient or a hospital department, and determine the first query content according to the preset query template and patient data of the patient. For example only, the processing device 210 may compare the preset query template with patient data (i.e., known information of the patient) to determine missing information that has not been collected in the preset query template, and further determine the first query content based on the missing information.
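The template-comparison step described above can be sketched as follows; the field names are illustrative assumptions.

```python
# Compare a preset query template with known patient data to find the
# missing information that the first query should collect.
def missing_fields(query_template: list, patient_data: dict) -> list:
    """Return template fields not yet present (or empty) in the patient data."""
    return [f for f in query_template
            if patient_data.get(f) in (None, "", [])]

template = ["chief_complaint", "symptom_onset", "allergy_history", "medication"]
patient_data = {"chief_complaint": "chest pain", "medication": "aspirin"}
to_ask = missing_fields(template, patient_data)
```

The resulting list of uncollected fields would then drive the first query content.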
In some embodiments, the processing device 210 may determine the first query content further based on an input from a physician of the patient. For example, the processing device 210 may determine preliminary query content based on the query template and present the preliminary query content to the patient's physician via the second terminal device. Further, the processing device 210 may determine the first query content according to the preliminary query content and feedback information of the preliminary query content inputted by the doctor through the second terminal device. The second terminal device refers to a terminal device which interacts with a doctor. For example, the second terminal device may include a mobile device 270-1, an augmented reality device 270-2, a notebook 270-3, a display device, and the like.
In some embodiments, the first query may include multiple rounds of queries, and each round of query may include one question put to the patient and one answer from the patient. The first query content may include the query content of each round of query, or may include only the query content of the first round of query.
In step 1204, the processing device 210 (e.g., the service module 630) may make a first query via a first terminal device in the patient room based on the first query content.
The first terminal device may be a hospital provided terminal device for use by the patient in a ward during a hospital stay. The first terminal device can include an augmented reality device (e.g., augmented reality device 432), a mobile device, a display device, etc., or any combination thereof. In some embodiments, the first terminal device may be part of a hospital bed.
In some embodiments, the processing device 210 may make the first query by the first terminal device presenting the first query content. For example, the processing device 210 may present the first query content in text form to the patient via a screen of the first terminal device. For another example, the processing device 210 may play the first query content to the patient through a speaker of the first terminal device.
In some embodiments, the processing device 210 may present, via the first terminal device, a first virtual character that makes the first query. A virtual character refers to a computer-generated person or entity designed to interact with the patient in a digital environment. The first virtual character may be configured to make the first query by communicating with the patient. For example, the first virtual character may be a digital avatar having certain appearance characteristics, acoustic characteristics, etc. By way of example only, the processing device 210 may present the first virtual character through a screen of the first terminal device and play the first query content through a speaker of the first terminal device. Meanwhile, the first virtual character can simulate the expressions, movements, and the like of a human speaker, providing the patient with a realistic communication experience.
In some embodiments, the first virtual character may have preset appearance characteristics. Exemplary appearance characteristics may include body characteristics, skin characteristics, facial characteristics, clothing characteristics, and the like, or any combination thereof. For example, the first virtual character may be a virtual doctor. In some embodiments, the processing device 210 may acquire optical image data of the patient's doctor and determine the appearance characteristics of the first virtual character from the optical image data of the doctor.
As another example, the processing device 210 may determine the appearance characteristics of the first virtual character based on the patient data (e.g., the basic data) of the patient. For instance, for a 70-year-old female patient, the first virtual character may be a virtual female nurse with an approachable demeanor; for a patient who is a doctor, the first virtual character may be a virtual professional doctor. As yet another example, the processing device 210 may select the first virtual character from a plurality of candidate virtual characters based on the patient data.
In some embodiments, the first virtual character may have preset acoustic characteristics. Exemplary acoustic characteristics may include frequency characteristics, volume characteristics, duration characteristics, timbre characteristics, pitch characteristics, speed characteristics, intonation characteristics, etc., or any combination thereof. For example, the processing device 210 may acquire acoustic characteristics of the patient's doctor and determine the acoustic characteristics of the first virtual character from the acoustic characteristics of the doctor. As another example, the processing device 210 may determine the acoustic characteristics of the first virtual character based on the patient data of the patient.
In some embodiments, the first terminal device may include an augmented reality device (e.g., augmented reality device 432) worn by the patient, which may be configured to present the first virtual character through augmented reality technology. For example, the processing device 210 may present the first virtual character and play the first query content within the field of view of the patient via the augmented reality device. The field of view of the patient may show the real world (e.g., the patient's ward) or digital content (e.g., a virtual room). As another example, the processing device 210 may synchronously present the first query content and the first virtual character within the field of view of the patient via the augmented reality device. Referring to fig. 13, fig. 13 is a schematic diagram of an interface for presenting the first query content via an augmented reality device, according to some embodiments of the present description. As shown in fig. 13, the augmented reality device 432 may simultaneously present a first virtual character 1310 (e.g., a virtual character of the doctor of the patient 261) and first query content 1320 within the field of view of the patient 261.
In some embodiments of the present description, the interaction of the patient may be enhanced by using a first avatar capable of natural language communication with the patient to perform the first interrogation, thereby improving the quality and efficiency of the admissions interrogation service.
In some embodiments, the first query may include multiple rounds of queries and the first query content may include the query content of each round of queries, as described in step 1202. The processing device 210 may make the first query by executing the flow 1500 as shown in fig. 15.
As shown in fig. 15, the processing device 210 may make a first round of interrogation 1504 by a first terminal device in the patient room (e.g., bedside terminal 240-6 of patient bed 240-2) based on the interrogation content of the first round of interrogation 1502.
For each round of query other than the first round (i.e., a current query 1514), the processing device 210 may adjust the query content corresponding to the current query 1514 based on the first perceived information 1506 collected in the historical rounds of queries, so that the query content better matches the patient's condition. For example, the processing device 210 may determine semantic information 1508 and emotion information 1510 of the patient's historical answers based on the first perceived information 1506 collected in the historical rounds of queries, and adjust the query content (i.e., the current query content) of the current query 1514 based on the semantic information 1508 and the emotion information 1510. The historical answers refer to the answers given in the historical rounds of queries. For example, if the current query 1514 is the third round of query, the historical answers may include the answers given in the first and second rounds of queries.
The semantic information of the historical answers may represent the content of the historical answers. The affective information of the historical answer may represent the emotion (e.g., calm, tension, anxiety, fear, confusion, annoyance, etc.) of the patient given the historical answer. For example, the first perceived information 1506 may include speech signals acquired in queries of historical rounds. Processing device 210 may determine semantic information 1508 by transcribing and identifying the content of the speech signal, and determine emotion information 1510 by analyzing acoustic characteristics (e.g., frequency characteristics, volume characteristics, duration characteristics, tone characteristics, pitch characteristics, speed characteristics, intonation characteristics, etc.) of the speech signal.
In some embodiments, the processing device 210 may determine a target signal from the speech signal and determine the semantic information 1508 and the emotion information 1510 from the target signal. The target signal may include the patient's utterances that contain, for example, keywords related to the query content. By determining the target signal, the amount of data to be analyzed can be reduced, which improves the efficiency of determining the semantic information and the emotion information and thus the efficiency of the first query.
In some embodiments, the processing device 210 may adjust the query content corresponding to the current query based on the semantic information 1508 and the emotion information 1510 to generate adjusted query content 1512 of the current query 1514. Further, the processing device 210 may make the current query 1514 via the first terminal device (e.g., bedside terminal 240-6) based on the adjusted query content 1512. For example, when the patient's emotion information 1510 is "tension" or "fear," the processing device 210 may add pacifying words to the current query content. As another example, when the patient's emotion information 1510 is "confusion," the processing device 210 may add explanatory words to the current query content. As yet another example, when the semantic information 1508 shows that the patient did not explicitly answer a historical query, the processing device 210 may adjust the current query content to repeat that historical query, thereby directing the patient to answer it explicitly. In this case, the originally determined current query content may be used as the query content for the next round of query.
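A minimal rule-based sketch of the adjustment just described (pacifying words for tension or fear, explanatory words for confusion, repetition when a historical answer was not explicit); the exact wording and rule set are illustrative assumptions.

```python
# Adjust the current query content from semantic and emotion information.
def adjust_query(current_content: str, semantic_info: dict, emotion: str) -> str:
    if not semantic_info.get("answered_explicitly", True):
        # Repeat the historical query to elicit an explicit answer.
        return "Sorry, could you clarify: " + semantic_info["last_question"]
    if emotion in ("tension", "fear"):
        # Prepend pacifying words for a tense or fearful patient.
        return "Please don't worry, this is routine. " + current_content
    if emotion == "confusion":
        # Prepend explanatory words for a confused patient.
        return "To explain: this helps us assess your condition. " + current_content
    return current_content

adjusted = adjust_query(
    "Do you have any history of heart disease?",
    semantic_info={"answered_explicitly": True},
    emotion="tension",
)
```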
According to some embodiments of the present description, the current query content is adjusted based on semantic information and emotion information of one or more historical answers, so that the current query content is adjusted in time according to the current state of the patient, thereby improving accuracy of the admission query.
In some embodiments, in addition to adjusting the current query content, the processing device 210 may also adjust the acoustic features (e.g., the acoustic features of the first virtual character) for the current query 1514 in real time. Exemplary acoustic features may include frequency features, volume features, duration features, timbre features, pitch features, speed features, intonation features, etc., or any combination thereof.
As shown in fig. 15, processing device 210 may determine acoustic features 1518 of current query 1514 based on semantic information 1508 and emotion information 1510, and conduct current query 1514 via the first terminal device based on adjusted query content 1512 and acoustic features 1518 of current query 1514. For example, processing device 210 may determine acoustic features 1518 of current query 1514 based on semantic information 1508, emotion information 1510, and preset correspondence. The preset correspondence may reflect a correspondence between semantic information, emotion information, and acoustic features. By way of example only, the preset correspondence may include that when semantic information 1508 is a positive answer and emotion information 1510 is calm, the speed feature may be medium, the tone feature may be polite, the intonation feature may be calm, and the volume feature may be medium.
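The preset correspondence can be sketched as a lookup table keyed by (semantic information, emotion information); the entries below, including the fallback defaults, are illustrative assumptions.

```python
# Hypothetical preset correspondence between (semantic, emotion) pairs and
# the acoustic features used for the current query.
PRESET_CORRESPONDENCE = {
    ("positive", "calm"):    {"speed": "medium", "tone": "polite",
                              "intonation": "calm", "volume": "medium"},
    ("positive", "tension"): {"speed": "slow", "tone": "soothing",
                              "intonation": "gentle", "volume": "low"},
    ("negative", "fear"):    {"speed": "slow", "tone": "reassuring",
                              "intonation": "gentle", "volume": "low"},
}

def acoustic_features(semantic: str, emotion: str) -> dict:
    """Look up acoustic features, falling back to neutral defaults."""
    default = {"speed": "medium", "tone": "neutral",
               "intonation": "calm", "volume": "medium"}
    return PRESET_CORRESPONDENCE.get((semantic, emotion), default)

features = acoustic_features("positive", "calm")
```

The first entry mirrors the example in the text: a positive answer given calmly yields medium speed, a polite tone, calm intonation, and medium volume.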
According to some embodiments of the present disclosure, the acoustic features of the current query are adjusted in real time according to the semantic information and the emotion information of the patient's historical answer, so that the emotion change of the patient can be better cared for, thereby improving the personification effect of the first virtual character and improving the quality of service of the admission query.
In some embodiments, the processing device 210 may further obtain physiological state data 1516 of the patient corresponding to the current query 1514 and adjust the query content of the current query 1514 based on the semantic information 1508, the emotion information 1510, and the physiological state data 1516. The patient's physiological state data 1516 may reflect the patient's real-time physiological condition. The physiological state data 1516 may include values of physiological parameters of the patient (e.g., heart rate, pulse, respiratory rate, etc.). The physiological state data 1516 may further include information related to the posture, limb behavior, facial expression, muscle state, etc. of the patient. In some embodiments, the patient's physiological state data 1516 may be acquired by a wearable device worn by the patient. For example, the value of the physiological parameter of the patient may be acquired by a physiological sensor integrated on the wearable device. In some embodiments, the patient's physiological state data 1516 may be acquired by one or more image sensors in the patient's environment. For example, an image sensor within a patient room may capture a patient's posture, facial expression, and the like.
Further, the processing device 210 may adjust the query content of the current query 1514 based on the semantic information 1508, the emotion information 1510, and the physiological state data 1516. For example, the processing device 210 may update the emotion information 1510 based on the patient's physiological state data 1516. It will be appreciated that the patient's internal emotion cannot always be fully expressed through the patient's answers, and thus the patient's emotion information 1510 can be updated or corrected based on the patient's physiological state data 1516. For instance, if the patient's emotion information 1510 is determined to be calm based on the first perceived information 1506, but the patient's physiological state data 1516 indicates that the patient is tense (e.g., the heart rate exceeds a preset threshold), the processing device 210 may correct the patient's emotion information 1510 to tension. Further, the processing device 210 may adjust the query content of the current query 1514 based on the semantic information 1508 and the updated emotion information.
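The correction just described can be sketched as follows, using a heart-rate threshold as the tension indicator; the threshold value and field names are illustrative assumptions.

```python
# Hypothetical threshold above which a "calm" label is overridden.
HEART_RATE_TENSION_THRESHOLD = 100  # beats per minute

def update_emotion(emotion: str, physiological_state: dict) -> str:
    """Correct the emotion label using physiological state data."""
    heart_rate = physiological_state.get("heart_rate")
    if (emotion == "calm" and heart_rate is not None
            and heart_rate > HEART_RATE_TENSION_THRESHOLD):
        return "tension"
    return emotion

updated = update_emotion("calm", {"heart_rate": 112})
```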
According to some embodiments of the present disclosure, by further considering physiological status data of a patient, accuracy of emotion information of the patient can be improved, so that accuracy of adjusting current query content is improved, and quality of service of admission query is improved.
In some embodiments, the processing device 210 may determine feedback parameters based on at least one of the semantic information 1508, the emotion information 1510, and the physiological state data 1516, and control the wearable device to provide feedback to the patient according to the feedback parameters. The feedback may include force feedback, temperature feedback, and the like, or any combination thereof. The feedback parameters may be used to control the manner in which the feedback is applied, such as the type of feedback, the location at which the feedback is applied to the patient, the strength of the feedback, etc. In some embodiments, the processing device 210 may determine the patient's emotion and emotion level based on at least one of the semantic information 1508, the emotion information 1510, and the physiological state data 1516, and then determine the feedback parameters based on the emotion and the emotion level. For example, the patient's emotion may be used to determine the type of feedback and which part of the patient's body the feedback acts on, and the patient's emotion level may be used to determine the strength of the feedback.
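One possible sketch of mapping emotion and emotion level to feedback parameters; the mapping, body locations, and the 1..5 level range are illustrative assumptions.

```python
# Emotion selects the feedback type and where it acts on the patient's body;
# emotion level scales the feedback strength.
def feedback_parameters(emotion: str, emotion_level: int) -> dict:
    type_and_location = {
        "tension": ("temperature", "wrist"),  # e.g., gentle warmth
        "fear":    ("force", "shoulder"),     # e.g., light pressure, like a pat
    }.get(emotion, ("force", "wrist"))
    # Clamp the level to the assumed 1..5 range, then normalize to (0, 1].
    strength = min(max(emotion_level, 1), 5) / 5.0
    return {"type": type_and_location[0],
            "location": type_and_location[1],
            "strength": strength}

params = feedback_parameters("tension", 3)
```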
According to some embodiments of the present disclosure, feedback parameters are determined according to semantic information, emotion information, and physiological state data, and the wearable device is controlled to provide feedback to the patient according to the feedback parameters, so that bad emotion of the patient can be comforted in time, and thus quality of service of admission inquiry is improved.
In some alternative embodiments, the first query content determined in step 1202 may include only the query content of the first round of query. For each current query other than the first round of query, the processing device 210 may determine the query content of the current query during the first query. For example, for each current query, the processing device 210 may input the patient data and the first perceived information acquired in the historical rounds of queries into a second query model, which outputs the query content of the current query. In some embodiments, the second query model may include a CNN model, an RNN model, an LSTM model, a BERT model, a ChatGPT model, or the like, or any combination thereof.
In some embodiments, the second query model may be trained based on a second training set. The second training set may include a plurality of second training samples and a second training label corresponding to each of the plurality of second training samples. Each second training sample may include sample patient data of a sample patient and sample first perceived information, and the corresponding second training label may include gold-standard query content of the corresponding round of sample query. The second training samples and the second training labels may be determined from historical first queries or manually set by a user. In some embodiments, the second query model may be trained in a manner similar to the first query model, which is not repeated here.
In some embodiments, the processing device 210 may end the first query according to a preset condition. The preset condition may include the number of query rounds reaching a threshold (e.g., 10 rounds), the patient having given a preset answer, the query content of the current round being a preset closing statement, etc. For example, when the patient's answer is "over" or "no other symptoms," the preset condition is deemed satisfied. As another example, when the query content of the current round is "no other problem, thank you," the preset condition is considered satisfied.
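The preset end conditions can be sketched as follows; the round threshold and the preset phrases are illustrative assumptions taken from the examples above.

```python
# Hypothetical preset conditions for ending the multi-round first query.
MAX_ROUNDS = 10
ENDING_ANSWERS = {"over", "no other symptoms"}
ENDING_QUERIES = {"no other problem, thank you"}

def should_end_query(round_number: int, patient_answer: str,
                     query_content: str) -> bool:
    """True if any preset end condition for the first query is satisfied."""
    return (round_number >= MAX_ROUNDS
            or patient_answer.strip().lower() in ENDING_ANSWERS
            or query_content.strip().lower() in ENDING_QUERIES)

done = should_end_query(3, "No other symptoms", "Anything else bothering you?")
```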
In some embodiments of the present description, by making the first query through multiple rounds of queries, the patient's condition can be fully understood, thereby improving the accuracy of the admission query and of subsequent hospitalization services. In addition, the first query can be automatically executed by the first terminal device, which saves labor costs and reduces the workload of medical staff.
In step 1206, the processing device 210 (e.g., the service module 630) may obtain the first awareness information. The first sensory information is collected by one or more sensory devices in the patient room during the first interrogation.
The first sensory information may include information related to the patient's answer during the first query.
In some embodiments, the first perceptual information may include various types of information. For example, when the one or more sensing devices include one or more sound sensors, the first sensing information may include a first speech signal acquired by the one or more sound sensors. For another example, when the one or more sensing devices include one or more image sensors, the first sensing information may include a first video signal acquired by the one or more image sensors.
In step 1208, the processing device 210 (e.g., the service module 630) may generate an admission record for the patient based on the first perception information.
The admission record is used to record details of the patient's admission. In some embodiments, the admission record may include patient information (e.g., patient data), admission information, first query information, and the like. The admission information may include the admission time, the admission department (e.g., emergency, cardiology, obstetrics), the admission doctor (e.g., information on the doctor responsible for admitting the patient), etc., or any combination thereof.
In some embodiments, the processing device 210 may generate an admission record for the patient based on the first perception information and patient data of the patient. For example, the processing device 210 may obtain an admission template and generate an admission record based on the admission template, the first perception information, and the patient data.
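The template-filling step above can be sketched as follows. This is a Python illustration under stated assumptions: the field names (`name`, `admission_time`, `recognized_answers`, etc.) are hypothetical placeholders, not an actual schema from this description.

```python
def generate_admission_record(template, patient_data, perception_info):
    """Fill a copy of the admission template with patient data and with
    answers extracted from the first perception information."""
    record = dict(template)  # do not mutate the shared template
    record["patient"] = {key: patient_data.get(key)
                         for key in ("name", "age", "patient_id")}
    record["admission"] = {
        "time": patient_data.get("admission_time"),
        "department": patient_data.get("department"),
    }
    # e.g., answers recognized from the speech signal during the first query
    record["first_query"] = perception_info.get("recognized_answers", [])
    return record
```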
In some embodiments, after generating the admission record, the processing device 210 may present the admission record to the doctor for confirmation via the second terminal device. If the doctor confirms the admission record, the processing device 210 may transmit the admission record to a storage device for storage. If the doctor's feedback indicates that more information needs to be included in the admission record, the processing device 210 may proceed to step 1210.
In step 1210, the processing device 210 (e.g., the service module 630) may update the admission record.
In some embodiments, the processing device 210 may perform the flow 1400 in fig. 14 to update the admission record. As shown in fig. 14, the processing device 210 may present the admission record 1402 to a physician 271 of the patient 261 via a second terminal device (e.g., an augmented reality device 270-2).
In some embodiments, the processing device 210 may obtain feedback information 1404 related to the admission record 1402 entered by the doctor 271 via a second terminal device (e.g., the augmented reality device 270-2). The feedback information 1404 may include information that is not in the admission record 1402 but that is deemed necessary by the doctor 271. For example, the feedback information 1404 may indicate that information related to the patient's allergy history needs to be collected.
The processing device 210 may determine the second query content 1406 of the second query 1408 based on the feedback information 1404 and make the second query 1408 based on the second query content 1406 via the first terminal device (e.g., bedside terminal 240-6) in the hospital ward. The second query 1408 may also be referred to as a supplemental query for collecting supplemental information that updates the admission record. The second query 1408 may be made in a similar manner as the first query. The processing device 210 may then obtain second perception information 1410 acquired by one or more perception devices (e.g., image sensors and/or sound sensors in the patient room) during the second query 1408, and update the admission record 1402 based on the second perception information 1410.
Alternatively, doctor 271 may go directly to the ward to make a second query 1408 of the patient. The processing device 210 may obtain second perception information 1410 acquired by one or more perception devices during the second query 1408 and update the admission record 1402 based on the second perception information 1410.
In some embodiments of the present description, important information missing from an admission record may be re-acquired by collecting feedback information from the doctor related to the admission record. This approach enables a more comprehensive admission record to be created, thereby improving the accuracy of subsequent hospitalization services that rely on the record. In some embodiments, the relevant information of the second query may be used to train and optimize the first query model, the second query model, and the agent (e.g., an admission inquiry agent), which may enrich the training samples and improve the accuracy of the first query model, the second query model, and the agent.
In some embodiments, the processing device 210 may be configured with an agent (e.g., an admission inquiry agent) that may participate in performing one or more operations of the process 1200. For example, the agent may determine the first query content, control the first terminal device to perform the first query, generate the admission record, and the like. In some embodiments, the first avatar may be a visual representation of the admission inquiry agent. In some embodiments, the processing device 210 configured with the admission inquiry agent may be integrated into the hospital bed or the first terminal device.
According to some embodiments of the present disclosure, the first terminal device automatically performs the admission inquiry and automatically generates the admission record, so that the workload of medical service personnel can be reduced.
Fig. 16 is an exemplary flow diagram of a process 1600 for providing care services according to some embodiments of the present description. In some embodiments, the process 1600 may be performed during a hospital stay. In some embodiments, the processing device 210 may execute the process 1600 daily to provide care services to the patient while the patient is hospitalized. In some embodiments, the process 1600 may include the following steps.
At step 1602, the processing device 210 (e.g., the service module 630) may determine a daily plan for the patient based on patient data for the patient and the patient's order.
A patient order refers to an instruction or directive that a doctor gives for a patient. For example, an order may include instructions regarding diet, medication, laboratory tests, and the like. Exemplary orders may include long-term orders, temporary orders, standby orders, and the like, or any combination thereof.
In some embodiments, the patient's orders may be stored in a storage device and updated if a doctor issues a new order for the patient. The processing device 210 may retrieve the latest version of the orders from the storage device. In some embodiments, the processing device 210 may monitor various data sources that collect data related to the hospitalization procedure to detect whether the patient's orders are updated. For example, when an admission query service and/or a ward round service is provided to the patient, the patient's doctor may issue a new order for the patient. The processing device 210 may detect the new order based on the perception information collected by the perception devices during the admission query service and/or the doctor's visit. Once a new order is detected, it may be stored in the storage device. Further, the processing device may update the daily plan based on the latest orders. For another example, the doctor may update the patient's orders stored in the storage device through the second terminal device. In some embodiments, the processing device 210 may determine the patient's orders from the patient's electronic health record.
In some embodiments, the processing device 210 may determine the daily plan for the patient based on the patient data of the patient and the patient's orders. The daily plan includes at least one medical operation that needs to be performed on the patient on that day. Exemplary medical operations may include care operations, examination operations, and the like. Exemplary care operations may include infusion, medication administration, injection, monitoring, and the like, or any combination thereof. For example, the processing device 210 may determine information about each care operation based on the patient data of the patient and the patient's orders, and determine the daily plan for the patient based on the information about each care operation. By way of example, the information about each care operation may include an operation type, a scheduled time (e.g., start time, operation time, end time, etc.), a dose, a degree of importance, etc., or any combination thereof. For another example, the processing device 210 may obtain a historical care plan for the patient and update the historical care plan based on the patient data of the patient and the patient's orders to determine the daily plan for the patient.
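Deriving a daily plan from the orders can be sketched as below. Python is used for illustration; the order and operation field names are assumptions, and the patient data is left unused in this minimal sketch (in practice it would adjust doses, contraindications, etc.).

```python
def determine_daily_plan(patient_data, orders):
    """Derive one care operation per order and sort by scheduled time.
    `patient_data` is unused in this sketch but kept for the interface."""
    plan = []
    for order in orders:
        plan.append({
            "type": order["type"],                    # e.g. "infusion", "medication"
            "scheduled_time": order["scheduled_time"],
            "dose": order.get("dose"),
            "importance": order.get("importance", "normal"),
        })
    # ISO-style time strings sort chronologically
    plan.sort(key=lambda operation: operation["scheduled_time"])
    return plan
```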
In some embodiments, the processing device 210 may update the patient's daily plan. For example, in response to receiving the patient's examination results, the processing device 210 may determine whether the daily plan needs to be updated based on the examination results, and update the daily plan if an update is needed. For example, if the examination results indicate that the patient is improving, the daily plan may be updated by reducing a dose in the daily plan. For another example, in response to receiving a request from the patient, the processing device 210 may update the daily plan based on the patient's request. For example, the processing device 210 may identify the request from the patient's voice signal and update the daily plan based on the identified request.
After the daily schedule is updated, process 1600 may be performed based on the updated daily schedule. In some embodiments, processing device 210 may mark the updated content in the daily schedule. For example, the processing device 210 may mark updated content in the daily schedule by using different colors. For another example, the processing device 210 may mark updated content in the daily schedule via a dashed box. In some embodiments, the processing device 210 may transmit the daily schedule (or updated daily schedule, daily schedule marked with updated content) to the doctor's second terminal device for confirmation.
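The marking of updated content can be sketched as a diff against the previous plan. This Python sketch assumes each operation is identified by its type and scheduled time (an assumption, not part of this description); the `updated` flag would then drive the color or dashed-box rendering on the terminal.

```python
def mark_updated_operations(old_plan, new_plan):
    """Flag operations absent from the old plan so the terminal can render
    them differently (e.g., in another color or a dashed box)."""
    seen = {(op["type"], op["scheduled_time"]) for op in old_plan}
    for op in new_plan:
        op["updated"] = (op["type"], op["scheduled_time"]) not in seen
    return new_plan
```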
At step 1604, the processing device 210 (e.g., the service module 630) may present the daily schedule to the patient via a first terminal device (e.g., bedside terminal 240-6) in the patient room.
For example, the processing device 210 may display the daily schedule to the patient in text form via a screen of the first terminal device. For another example, the processing device 210 may play the daily schedule to the patient through a speaker of the first terminal device. For another example, the processing device 210 may present a first avatar illustrating the daily schedule via the first terminal device.
In step 1606, the processing device 210 (e.g., the service module 630) may present the daily plan to the nurse corresponding to the patient via the fourth terminal device.
The fourth terminal device refers to a terminal device for interaction with a nurse. For example, the fourth terminal device may include a terminal device in a nurse station or the like.
In step 1608, when the daily plan includes at least one care operation, the nurse may perform the at least one care operation on the patient, and the processing device 210 (e.g., the service module 630) may assist the nurse in performing the at least one care operation according to the daily plan.
As shown in fig. 16, for each of the at least one care operation, the processing device 210 may control the intelligent care cart to guide the nurse to the ward to perform the care operation according to the scheduled time of the care operation. For example, prior to the scheduled time of the care operation, the processing device 210 may control the intelligent care cart to move to the nurse's workstation to notify the nurse that the care operation needs to be performed for the patient. The intelligent care cart can then be controlled to move and guide the nurse to the patient's ward. Further, the processing device 210 may control the intelligent care cart to present care instructions regarding the care operation after the nurse arrives at the ward. The care instructions may be used to instruct the nurse to perform the care operation properly. For example, the care instructions may include video data, voice data, text data, image data, etc., that illustrate how to perform the care operation and/or what needs to be noted when performing it.
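Selecting which operation the cart should guide the nurse to next reduces to picking the earliest pending item in the plan. A minimal Python sketch, assuming ISO-style time strings so lexicographic order matches chronological order:

```python
def next_care_operation(daily_plan, now):
    """Return the earliest care operation scheduled after `now`,
    or None when nothing remains for the day."""
    pending = [op for op in daily_plan if op["scheduled_time"] > now]
    return min(pending, key=lambda op: op["scheduled_time"], default=None)
```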
At step 1610, the processing device 210 (e.g., the service module 630) may generate a care record.
A care record refers to a record of a care operation performed on a patient and/or a patient's state (e.g., vital signs and other physiological measurements) before, after, or at the time of the care operation.
In some embodiments, when the at least one care operation is performed, the processing device 210 may obtain third perception information collected by one or more perception devices in the patient's ward during performance of the at least one care operation, and generate the care record according to the third perception information. The generation of the care record may be similar to the generation of the admission record in step 1210, and will not be described here. In some embodiments, the care record may be presented to the nurse for confirmation via the intelligent care cart or the fourth terminal device.
In some embodiments, the processing device 210 may be configured with an agent (e.g., a care agent) that may participate in executing the process 1600. For example, the agent may determine the daily plan for the patient, display the daily plan to the patient and/or the nurse, assist the nurse in performing the at least one care operation, and generate the care record. In some embodiments, the processing device 210 configured with the care agent may be integrated into a hospital bed, the first terminal device, or the intelligent care cart.
According to some embodiments of the present description, the workload of nurses can be significantly reduced by automatically generating daily plans and care records, enabling nurses to devote more effort to direct patient care rather than administrative work. In addition, monitoring order updates ensures that the daily plan is updated in a timely manner; this proactive approach ensures that interventions and care plans are adjusted promptly according to the latest orders, thereby improving the effect and quality of care.
Fig. 17 is an exemplary flow diagram of a process 1700 for providing ward round services according to some embodiments of the present description. In some embodiments, the process 1700 may be performed during a hospital stay. In some embodiments, the process 1700 may be performed when the processing device 210 detects that at least one doctor is making a ward round for the patient. In some embodiments, the process 1700 may include the following steps.
At step 1702, the processing device 210 (e.g., the service module 630) may obtain fourth perception information. The perception devices in the ward may collect the fourth perception information when at least one doctor makes a ward round in the ward. For example, when at least one doctor makes a ward round in the ward, the fourth perception information may be collected by an image sensor and a sound sensor in the ward. In some embodiments, the processing device 210 may detect the ward round (i.e., an event of interest) by analyzing the perception information collected by the perception devices.
In step 1706, the processing device 210 (e.g., the service module 630) may generate a ward record.
The ward round record is used to record data about the ward round, including the time, participants, patient data, communications between the patient and the at least one doctor, orders issued during the ward round, and the like. For example, the processing device 210 may obtain a ward round record template and populate the template according to the fourth perception information to generate the ward round record. In some embodiments, the processing device 210 may present the ward round record to the at least one doctor (or a portion thereof) for confirmation.
According to some embodiments of the present disclosure, the workload of the doctor may be significantly reduced by automatically generating the ward round record, allowing the doctor to devote more effort to the direct treatment of patients rather than to administrative work. In addition, automatically generating the ward round record can ensure its integrity and authenticity, thereby improving the accuracy and objectivity of the ward round record and further improving the accuracy of subsequent services.
In some embodiments, processing device 210 may generate a ward record based on the fourth sensory information and the second measurement data. As shown in fig. 17, at step 1704, the processing device 210 may further acquire second measurement data of the patient. The second measurement data may be acquired by one or more examination devices during the ward round. For more on the inspection apparatus, reference may be made to the relevant description hereinbefore (e.g. to the relevant description of fig. 4, 8). In some embodiments, the second measurement data is similar to the first measurement data described in fig. 8, and is not described here.
In some embodiments, one or more remote doctors may remotely participate in the ward round. For example, the processing device 210 may generate the virtual ward space 1710 from the fourth perception information and present the virtual ward space 1710 to one or more remote doctors (e.g., doctor 1722, doctor 1724, and doctor 1726) via one or more augmented reality devices of the one or more remote doctors. Remote doctors refer to doctors who are not in the ward. The virtual ward space refers to a digital environment for ward rounds. For example, the virtual ward space may be a digital twin space of a ward in which a patient is located that can reflect real-time conditions of the ward and provide an immersive experience for a remote doctor. Alternatively, the remote doctor may communicate with at least one doctor and patient in the ward through an augmented reality device. For example, the augmented reality device of the remote doctor may acquire voice data and image data of the remote doctor, and the processing device may control the first terminal device to play the voice of the remote doctor according to the voice data and display a virtual character representing the remote doctor according to the image data. In some embodiments, the remote doctor may remotely initiate the request to participate in the ward round through a hospital space application installed in the remote doctor's second terminal device.
In some embodiments of the present disclosure, one or more remote doctors may access the virtual ward space through augmented reality devices and participate in ward rounds remotely, which can effectively eliminate the limitation of physical separation and provide ward round services in a timely manner.
In some embodiments, at least one doctor may interact with the first terminal device in the patient's ward. For example, when the processing device 210 detects, based on the fourth perception information, that a doctor issues an instruction to retrieve patient data (e.g., an electronic health record), the first terminal device may be controlled to display the patient data to the at least one doctor and the patient using augmented reality technology to facilitate disease analysis. For another example, the first terminal device may be controlled to present the second measurement data to the at least one doctor and the patient when the processing device 210 detects an instruction for presenting the second measurement data.
In some embodiments, the process 1700 may further include step 1712 before providing the ward round service. In step 1712, the processing device 210 may also present patient data of the patient to the at least one doctor (or a portion thereof) via a second terminal device of the at least one doctor (or a portion thereof). For example, the second terminal device may simultaneously present the virtual character representing the patient and the patient data (e.g., electronic health record) of the patient in the field of view of the doctor, wherein the virtual character may indicate the patient's status to the doctor and provide the doctor with a ward-round advice. By displaying patient data to the doctor in advance, the doctor can better understand the state of the patient, thereby improving the accuracy and efficiency of ward round service.
In some embodiments, the processing device 210 may notify at least one physician (or a portion thereof) of a patient that requires attention during a ward round. For example, if a particular patient needs to be focused on during a ward round, the processing device 210 may tag patient data for the particular patient, e.g., using a different color, a dashed box, etc., when presenting the patient data.
In some embodiments, after the ward record is generated, the processing device 210 may update the patient's daily schedule based on the ward record. For example, when an updated order for the patient is detected to be included in the ward round record, the processing device 210 may update the daily plan for the patient based on the updated order.
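Updating the daily plan from a ward round record can be sketched as a merge of the updated orders into the plan. This Python sketch uses hypothetical field names (`updated_orders`, `scheduled_time`) for illustration only.

```python
def apply_ward_round_record(daily_plan, ward_round_record):
    """Append a care operation for each updated order found in the record,
    then re-sort the plan by scheduled time (ISO-style strings assumed)."""
    for order in ward_round_record.get("updated_orders", []):
        daily_plan.append({
            "type": order["type"],
            "scheduled_time": order["scheduled_time"],
            "dose": order.get("dose"),
        })
    daily_plan.sort(key=lambda op: op["scheduled_time"])
    return daily_plan
```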
In some embodiments, after the ward, at least one doctor (or a portion thereof) may view the ward record through the hospital space application. For example, the processing device 210 may present the ward record in a hospital space application, and at least one doctor (or a portion thereof) may interact with the hospital space application to view/confirm the ward record. For another example, at least one doctor (or a portion thereof) may interact with the second terminal device through an input device (e.g., a controller), voice, gestures, etc., to confirm the ward round record and/or daily plan.
In some embodiments, the processing device 210 may further generate a ward round summary for each of the at least one doctor based on the ward round records over a preset period of time (e.g., one week, one month, one year, etc.). A doctor's ward round summary may include the doctor's personal ward round information, the time spent on each patient, the order update rate, etc. For example, on the first day of each month, the processing device 210 may generate the doctor's ward round summary from the ward round records of the past month, and send a notification regarding the summary to the doctor's second terminal device. The doctor may interact with the second terminal device to read the ward round summary based on the notification.
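The per-doctor aggregation can be sketched as follows. This is a Python illustration under assumed record fields (`doctor_id`, `orders_updated`); the real summary would also cover the other fields mentioned above.

```python
def summarize_ward_rounds(ward_round_records, doctor_id):
    """Aggregate one doctor's ward round records over a period into
    a round count and an order update rate."""
    own = [r for r in ward_round_records if r["doctor_id"] == doctor_id]
    rounds = len(own)
    updates = sum(1 for r in own if r.get("orders_updated"))
    return {
        "rounds": rounds,
        "order_update_rate": updates / rounds if rounds else 0.0,
    }
```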
In some embodiments, the processing device 210 may be configured with an agent (e.g., a ward round agent) that may participate in executing the process 1700. For example, the agent may control the one or more perception devices to collect the fourth perception information, control the one or more examination devices to collect the second measurement data, provide the virtual ward space, and generate the ward round record. In some embodiments, the processing device 210 configured with the ward round agent may be integrated into a hospital bed, the first terminal device, or a patient care robot.
Fig. 18 is an exemplary flow diagram of a process 1800 for providing a visit service, according to some embodiments of the present disclosure. In some embodiments, the process 1800 may be performed during a hospital stay.
As shown in fig. 18, in response to a visit request 1802, the processing device 210 (e.g., the service module 630) may obtain first current information 1804 of a patient (e.g., the patient 261) and second current information 1805 of a remote visitor 1808. The visit request 1802 refers to a request by the remote visitor 1808 to communicate with the patient 261 remotely. For example, the processing device 210 may receive the visit request 1802 from a first terminal device (e.g., the augmented reality device 432) in the patient's ward or a fifth terminal device (e.g., the augmented reality device 1810) of the remote visitor 1808. The first current information 1804 may characterize a current state and/or a current environment of the patient 261. The second current information 1805 may characterize a current state and/or a current environment of the remote visitor 1808.
The processing device 210 may then generate a virtual visit space 1806 for the patient 261 and the remote visitor 1808 based on the first current information 1804 and the second current information 1805. The virtual visit space 1806 refers to a digital environment that is presented to the patient 261 and the remote visitor 1808 during a visit. For example, the virtual visit space 1806 may be a digital twin space of the ward of the patient 261 or a digital twin space of the location of the remote visitor 1808. In some embodiments, the processing device 210 may obtain fifth perception information of the ward of the patient 261 and/or the location of the remote visitor 1808, and generate the virtual visit space 1806 based on the fifth perception information, the first current information 1804, and the second current information 1805. For another example, the virtual visit space 1806 may be a computer-generated digital space.
Further, the processing device 210 may present the virtual visit space 1806 to the patient 261 and the remote visitor 1808 via the first terminal device and the fifth terminal device of the remote visitor 1808, respectively.
In some embodiments, the processing device 210 may further obtain visit information of the patient 261 and schedule the visit of the remote visitor 1808 according to the visit information. The visit information may include whether the patient 261 is allowed to be visited, a visit time limit, etc., or any combination thereof. For example, the processing device 210 may approve the visit request 1802 and generate the virtual visit space 1806 only if visiting the patient 261 is allowed. For another example, the processing device 210 may determine a visit time of the remote visitor 1808 based on the visit time limit and provide a reminder based on the visit time via the first terminal device.
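The approval-and-limit logic described above can be sketched as follows. Python is used for illustration; the `visiting_allowed` and `time_limit_minutes` fields are hypothetical names, and the sketch assumes a time limit is always configured for the patient.

```python
def schedule_visit(visit_request, visit_info):
    """Approve the request only when visiting is allowed, and bound the
    visit duration by the patient's visit time limit."""
    if not visit_info.get("visiting_allowed"):
        return {"approved": False}
    limit = visit_info["time_limit_minutes"]
    requested = visit_request.get("duration_minutes", limit)
    return {"approved": True, "duration_minutes": min(requested, limit)}
```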
In some embodiments, if the visit request indicates that the patient 261 and the remote visitor 1808 wish to communicate with a doctor, the processing device 210 may further obtain doctor information 1812 of the doctor 271 of the patient 261. For example, the doctor information may include basic information of the doctor, an indication about the visit, the idle time of the doctor, and the like. The processing device 210 may determine a visit time 1814 based on the doctor information 1812 and, at the visit time 1814, present the virtual visit space 1806 to the patient 261, the remote visitor 1808, and the doctor 271 via the first terminal device, the fifth terminal device, and the second terminal device (e.g., the augmented reality device 270-2) of the doctor 271, respectively.
In some embodiments, the processing device 210 may further generate a visit record. The visit record is used to record data about the visit, such as the patient, the remote visitor, the doctor, the visit time, etc. The visit record is generated in a manner similar to that of the ward round record and will not be described in detail herein.
In some embodiments, the processing device 210 may be configured with an agent (e.g., a visit agent) that may participate in executing the process 1800. For example, the agent may generate the virtual visit space 1806, present the virtual visit space 1806, and generate the visit record. In some embodiments, the processing device 210 configured with the visit agent may be integrated into a hospital bed, the first terminal device, or a patient care robot.
In some embodiments of the present description, the remote visitor and the patient may receive a visit service by accessing the virtual visit space, which enables communication free of time and geographic restrictions and provides immediate emotional support and care to the patient, thereby helping to ease the remote visitor's concern for the patient.
Fig. 19 is an exemplary flow diagram of a process 1900 of providing discharge services according to some embodiments of the present description. In some embodiments, the process 1900 may be performed during the discharge phase. In some embodiments, the process 1900 may be performed when the processing device 210 detects that the patient needs to be discharged (e.g., the condition of the patient satisfies discharge conditions).
As shown in fig. 19, the processing device 210 (e.g., the service module 630) may obtain a target hospitalization record 1904 of the patient in response to an discharge instruction 1902 received from a second terminal device (e.g., the augmented reality device 270-2) of a doctor of the patient (e.g., the patient 261). Optionally, the processing device 210 may also obtain physician information 1906 for the physician.
The target hospitalization record 1904 may record information about the patient's hospitalization procedure, such as medical history, treatments received, prescribed medications, test results, and a discharge summary. In some embodiments, the target hospitalization record 1904 may be generated based on an electronic health record. For example, the processing device 210 may update the electronic health record during the patient's hospitalization procedure to record information related to the hospitalization procedure. For example only, when one or more documentation records such as the check-in record, the admission record, the care record, the ward round record, or the visit record are generated, the processing device 210 may update the electronic health record based on key information in these documentation records. In some embodiments, the target hospitalization record 1904 may be generated directly based on the records generated during the hospitalization procedure.
The doctor information 1906 may include basic information of the doctor, the discharge conditions for the patient as determined by the patient's doctor, and the like. The discharge conditions may include that each vital sign of the patient is within a corresponding vital sign data range.
The processing device 210 may then generate discharge data 1908 from the target hospitalization record 1904 and the optional doctor information 1906. The discharge data 1908 may include a discharge summary, discharge-related orders, discharge-related guidance information, guidance information related to the discharge procedure, the discharge fee, the payment means, etc., or any combination thereof.
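Assembling the discharge data from the hospitalization record can be sketched as below. Python is used for illustration only; the field names (`discharge_summary`, `total_fee`, `discharge_conditions`, etc.) are hypothetical placeholders.

```python
def generate_discharge_data(target_record, doctor_info=None):
    """Collect the discharge-related fields from the target hospitalization
    record; doctor information is folded in when available."""
    data = {
        "discharge_summary": target_record.get("discharge_summary"),
        "discharge_orders": target_record.get("discharge_orders", []),
        "discharge_fee": target_record.get("total_fee", 0.0),
        "payment_means": target_record.get("payment_means"),
    }
    if doctor_info is not None:
        data["discharge_conditions"] = doctor_info.get("discharge_conditions")
    return data
```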
In some embodiments, the processing device 210 may present the discharge data 1908 to the patient 261 via a first terminal device (e.g., the augmented reality device 440). For example, the first terminal device may present a first virtual character to illustrate the discharge data 1908 and/or direct the patient 261 to perform the discharge operation 1910.
In response to determining that the discharge operation 1910 of the patient 261 is complete, the processing device 210 may generate a discharge record 1912 corresponding to the patient. The discharge record 1912 is used to record discharge-related data including discharge time, discharge summary, discharge-related order, discharge fee, payment means, status of the patient at discharge, etc. The discharge record 1912 is generated in a similar manner to that of the ward round record, and will not be described in detail herein.
In some embodiments, the processing device 210 may be configured with an agent (e.g., a discharge agent) that may participate in executing the process 1900. For example, the agent may generate the discharge data, display the discharge data to the patient, generate the discharge record, and the like. In some embodiments, the processing device 210 configured with the discharge agent may be integrated into a hospital bed, the first terminal device, or a patient care robot.
In some embodiments of the present disclosure, the discharge data may be automatically generated and displayed to the patient, so that the patient can complete the discharge procedure conveniently and quickly, thereby improving the efficiency and service quality of the discharge service.
Fig. 20 is an exemplary flowchart of a process 2000 of providing a follow-up service according to some embodiments of the present description. In some embodiments, the process 2000 may be performed during a follow-up phase. In some embodiments, the process 2000 may be performed when the processing device 210 detects that the patient has been discharged from the hospital.
As shown in fig. 20, the processing device 210 (e.g., the service module 630) may determine a follow-up plan 2004 for the patient 261 based on the target hospitalization record 1904 for the patient 261.
The follow-up plan indicates how follow-up services are to be provided to the patient. In some embodiments, the follow-up plan 2004 may include one or more follow-up visits performed at one or more planned times. For example, the processing device 210 may obtain a follow-up level of the patient determined by the doctor 271 from the target hospitalization record 1904, and determine the follow-up plan 2004 based on the follow-up level. The follow-up level may characterize the frequency with which the patient needs to receive follow-up visits. As another example, the processing device 210 may determine the follow-up plan 2004 based on the target hospitalization record 1904 and one or more historical follow-up plans. The historical follow-up plans may include historical follow-up plans of the patient 261 and/or of other patients having symptoms similar to those of the patient 261.
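The follow-up-level approach above can be sketched as a simple scheduler: the level selects a visit interval, and planned times are laid out from the discharge date. The level names, intervals, and default visit count are hypothetical choices for illustration; the disclosure does not specify them.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Hypothetical mapping from follow-up level to visit interval (days);
# the actual levels and intervals are not specified in the disclosure.
LEVEL_INTERVAL_DAYS = {"high": 7, "medium": 30, "low": 90}

@dataclass
class FollowUpPlan:
    patient_id: str
    visit_dates: list = field(default_factory=list)

def build_follow_up_plan(patient_id, level, discharge_date, num_visits=3):
    """Schedule `num_visits` follow-up visits at the interval implied by `level`."""
    interval = timedelta(days=LEVEL_INTERVAL_DAYS[level])
    dates = [discharge_date + interval * (i + 1) for i in range(num_visits)]
    return FollowUpPlan(patient_id, dates)

plan = build_follow_up_plan("patient-261", "medium", date(2025, 1, 10))
print([d.isoformat() for d in plan.visit_dates])
# ['2025-02-09', '2025-03-11', '2025-04-10']
```

An agent trained on historical follow-up plans, as described above, would replace the fixed interval table with learned scheduling behavior.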
Further, the processing device 210 may remind the doctor 271 and the patient 261, respectively, according to the follow-up plan 2004 via a second terminal device of the doctor 271 (e.g., the augmented reality device 270-2) and a third terminal device of the patient 261 (e.g., the augmented reality device 260-2). For example, for each of the one or more follow-up visits, the processing device 210 may remind the doctor 271 and the patient 261 to attend the follow-up visit via the augmented reality device 270-2 and the augmented reality device 260-2, respectively, according to the planned time of the follow-up visit.
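The dual-reminder step above can be sketched as a check that, for each visit falling within a reminder lead time, issues one reminder to the doctor's terminal and one to the patient's terminal. The 24-hour lead time and the terminal identifiers are illustrative assumptions, not values from the disclosure.

```python
from datetime import datetime, timedelta

def due_reminders(visit_times, now, lead=timedelta(hours=24)):
    """Return (visit_time, recipient) pairs for visits starting within `lead` of `now`.

    Each due visit produces two reminders, mirroring the dual reminders to
    the doctor's and the patient's terminal devices described above.
    """
    reminders = []
    for visit_time in visit_times:
        if now <= visit_time <= now + lead:
            reminders.append((visit_time, "doctor-terminal"))
            reminders.append((visit_time, "patient-terminal"))
    return reminders

visits = [datetime(2025, 2, 9, 10, 0), datetime(2025, 3, 11, 10, 0)]
now = datetime(2025, 2, 8, 12, 0)
print(due_reminders(visits, now))
```

Only the first visit falls within the 24-hour window here, so two reminders are produced; a production system would dispatch them over whatever notification channel the terminal devices expose.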
In some embodiments, one or more of the follow-up visits may be performed offline, such as at a hospital.
In some embodiments, at least one of the one or more follow-up visits is a remote follow-up visit, which may be performed remotely in a virtual follow-up space 2008. The virtual follow-up space refers to a digital environment used for follow-up. For example, the virtual follow-up space 2008 may be a digital twin space of the home of the patient 261 or a digital twin space of the office of the doctor 271. By way of example, the processing device 210 may present the virtual follow-up space 2008 to the doctor 271 and the patient 261 via the augmented reality device 270-2 and the augmented reality device 260-2, respectively, and the follow-up 2006 may be performed between the doctor 271 and the patient 261 in the virtual follow-up space 2008.
In some embodiments, after the follow-up 2006 is performed, the processing device 210 may generate a follow-up record corresponding to the patient. The follow-up record is used to record data about the follow-up, including a time corresponding to the follow-up, updated orders corresponding to the follow-up, health monitoring information corresponding to the follow-up, and the like. The follow-up record may be generated in a manner similar to that of the ward round record, which is not described in detail here.
In some embodiments, the processing device 210 may update the follow-up plan 2004. For example, the processing device 210 may obtain health monitoring information 2010 of the patient and update the follow-up plan 2004 based on the health monitoring information 2010. In some embodiments, the health monitoring information 2010 may be collected by one or more home monitoring devices 2012 in the home of the patient 261. Exemplary home monitoring devices may include a smart watch, a blood pressure monitor, a blood glucose meter, a heart rate meter, a thermometer, etc., or any combination thereof. For example only, if the health monitoring information 2010 indicates that the patient's condition is deteriorating, the processing device 210 may update the follow-up plan 2004 by adding one or more follow-up visits.
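The plan-update logic above can be sketched as follows: if home monitoring readings cross a deterioration threshold, an extra near-term visit is inserted into the plan. The use of systolic blood pressure, the 160 mmHg threshold, and the 3-day lead time are all illustrative assumptions; the disclosure does not define deterioration criteria.

```python
from datetime import date, timedelta

DETERIORATION_SBP = 160  # hypothetical systolic blood-pressure threshold (mmHg)

def update_plan(visit_dates, sbp_readings, today):
    """Insert an extra near-term visit if home monitoring suggests deterioration.

    `sbp_readings` is a list of systolic blood-pressure values collected by
    home monitoring devices; the threshold and 3-day lead are illustrative.
    Returns a new sorted list of visit dates, leaving the input unchanged.
    """
    if sbp_readings and max(sbp_readings) >= DETERIORATION_SBP:
        extra = today + timedelta(days=3)
        if extra not in visit_dates:
            return sorted(visit_dates + [extra])
    return list(visit_dates)

dates = [date(2025, 2, 9), date(2025, 3, 11)]
print(update_plan(dates, [138, 165], date(2025, 1, 20)))
```

With a reading of 165 mmHg the plan gains a visit three days out; with all readings below the threshold the plan is returned unchanged.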
In some embodiments, processing device 210 may be configured with an agent (e.g., a follow-up agent) that may participate in execution flow 2000. For example, the agent may determine a follow-up plan for the patient, remind the doctor and the patient according to the planned follow-up time, assist the patient in obtaining health monitoring information, generate a follow-up record, and the like.
In some embodiments of the present description, the patient and the doctor may access the virtual follow-up space through augmented reality devices to participate in the follow-up remotely. This approach can effectively eliminate the limitation of physical separation, so that follow-up services can be provided to the patient in a timely manner. In addition, when the virtual follow-up space is a digital twin space of the patient's home, the doctor can fully understand the living environment of the patient and its influence on the patient's condition, thereby improving the accuracy of the follow-up service.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated herein, various modifications, improvements, and adaptations to the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by this specification and therefore fall within the spirit and scope of the exemplary embodiments of this specification.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is included in at least one embodiment of the present description. Thus, it should be emphasized and appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" at various places in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as appropriate.
Furthermore, the order in which the elements and sequences are processed, the use of numbers or letters, or the use of other designations in the description is not intended to limit the order of the processes and methods of the description unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as by installing the described system on an existing server or mobile device.
Likewise, it should be noted that, in order to simplify the presentation disclosed in this specification and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure does not imply that the subject matter of the present description requires more features than are set forth in the claims. Indeed, claimed subject matter may lie in less than all of the features of a single embodiment disclosed above.
In some embodiments, numbers describing quantities of components or attributes are used. It should be understood that such numbers used in the description of the embodiments are, in some examples, modified by the terms "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the number allows for a variation of ±20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties sought by the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ ordinary rounding. Although the numerical ranges and parameters set forth in some embodiments of the specification are approximations, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as an article, book, specification, publication, or document, referred to in this specification is incorporated herein by reference in its entirety, except for any application history document that is inconsistent with or conflicts with the content of this specification, and except for any document (currently or later appended to this specification) that limits the broadest scope of the claims of this specification. It is noted that, if the description, definition, and/or use of a term in material appended to this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (67)

CN202411743194.4A | 2024-07-31 | 2024-11-29 | A system and method for providing inpatient services | Pending | CN119811606A (en)

Priority Applications (5)

Application Number | Priority Date | Filing Date | Title
CN202510046027.2A | 2024-07-31 | 2024-11-29 | A system and method for providing inpatient nursing services
CN202510046811.3A | 2024-07-31 | 2024-11-29 | A system and method for providing ward rounds service
CN202510039867.6A | 2024-07-31 | 2024-11-29 | A system and method for providing admission inquiry service
CN202510052424.0A | 2024-07-31 | 2024-11-29 | A system and method for providing follow-up services
CN202510039748.0A | 2024-07-31 | 2024-11-29 | A system and method for providing hospital admission guidance services

Applications Claiming Priority (2)

Application Number | Priority Date
CN PCT/CN2024/109058 | 2024-07-31
CN2024109058 | 2024-07-31

Related Child Applications (5)

Application Number | Title | Priority Date | Filing Date
CN202510052424.0A | Division | CN119541801A (en) A system and method for providing follow-up services | 2024-07-31 | 2024-11-29
CN202510046027.2A | Division | CN119811610A (en) A system and method for providing inpatient nursing services | 2024-07-31 | 2024-11-29
CN202510046811.3A | Division | CN119626491A (en) A system and method for providing ward rounds service | 2024-07-31 | 2024-11-29
CN202510039748.0A | Division | CN119541800A (en) A system and method for providing hospital admission guidance services | 2024-07-31 | 2024-11-29
CN202510039867.6A | Division | CN119560117A (en) A system and method for providing admission inquiry service | 2024-07-31 | 2024-11-29

Publications (1)

Publication Number | Publication Date
CN119811606A | 2025-04-11

Family

ID=95268456

Family Applications (1)

Application NumberTitlePriority DateFiling Date
CN202411743194.4APendingCN119811606A (en)2024-07-312024-11-29 A system and method for providing inpatient services

Country Status (1)

Country | Link
CN (1) | CN119811606A (en)


Legal Events

Date | Code | Title | Description
PB01 | Publication
SE01 | Entry into force of request for substantive examination
