Disclosure of Invention
In view of the foregoing, it is desirable to provide an operation table, a doctor-side control device, and a master-slave ultrasonic detection system that address at least one of the above-mentioned problems.
In one aspect, an operation table for use in a master-slave ultrasonic detection system is provided. The operation table has an operation surface that cooperates with a probe, so that the probe can be moved over the operation surface and pressed against it to carry out ultrasonic detection. The operation table comprises an elastic material layer and a flexible sensor layer. The elastic material layer is elastically deformable in its thickness direction at every position, so that it deforms when the probe acts on the operation surface and the operation table thereby simulates the skin elasticity of a human or animal body. The flexible sensor layer is configured to sense the position of the probe on the operation surface.
Further, the operation table further comprises a pressure detection device configured to sense the force applied by the probe to the operation surface.
Further, the elastic material layer is attached to the flexible sensor layer, and the operation surface is formed on the elastic material layer; or the flexible sensor layer is attached to the elastic material layer, and the operation surface is formed on the flexible sensor layer.
Further, the operation table comprises a flexible display layer configured to display image information of the examined region of the subject at which the ultrasonic probe of the master-slave ultrasonic detection system is currently located.
Further, the flexible sensor layer is attached to the flexible display layer, the elastic material layer is attached to the flexible sensor layer, the elastic material layer and the flexible sensor layer are made of transparent or semitransparent materials, and the operation surface is formed on the elastic material layer; or the flexible sensor layer is attached to the flexible display layer, the flexible display layer is attached to the elastic material layer, the flexible sensor layer is made of a transparent or semitransparent material, and the operation surface is formed on the flexible sensor layer.
Further, the operation surface is an arc-shaped (curved) surface.
In another aspect, a doctor-side control device is provided. The doctor-side control device includes the aforementioned operation table and a profiling probe.
Further, a pose sensor and a conductive layer are provided on the profiling probe. The pose sensor senses and outputs pose information of the profiling probe in real time, and the conductive layer is electrically coupled with the flexible sensor layer so that the flexible sensor layer can sense the position of the profiling probe on the operation surface.
In still another aspect, a master-slave ultrasonic detection system is provided. The master-slave ultrasonic detection system includes a doctor-side device and a subject-side device. The doctor-side device includes the aforementioned doctor-side control device, a communication device, and a display device; the subject-side device includes an imaging device, a mechanical arm, an ultrasonic probe, a control device, and a communication device. The imaging device acquires three-dimensional image data of the examined region of each subject, and the control device of the doctor-side device or the control device of the subject-side device processes the three-dimensional image data into three-dimensional data and matches it against the three-dimensional data of the effective area of the operation surface, so that each position of the examined region of each subject corresponds uniquely to one position of the effective area of the operation surface.
Further, the control device of the doctor-side device or the control device of the subject-side device is further configured to generate, from the result of matching the three-dimensional data of the examined region of each subject with the three-dimensional data of the effective area of the operation surface and from the position information of the profiling probe acquired in real time, a position command for controlling the position of the mechanical arm, so that the mechanical arm moves the ultrasonic probe to the corresponding position on the examined region of the subject.
According to the operation table, the doctor-side control device, and the master-slave ultrasonic detection system provided by the embodiments of the invention: in the first aspect, the elastic material layer provided on the operation table simulates the skin elasticity of a human (or animal) body, so that the scene in which an ultrasonic probe contacts the skin can be reproduced as closely as possible and the doctor's imaging experience is enhanced; in the second aspect, the operation surface of the operation table simulates the shape of the examined region, which makes it convenient for the doctor to scan different regions, for example, an arc-shaped operation surface simulates the human abdomen and facilitates scanning of the subject's abdomen; in the third aspect, by matching the three-dimensional data of the examined regions of different subjects with the three-dimensional data of the operation surface, the operation table can be adapted to different subjects without changing the doctor's imaging habits; in the fourth aspect, an image of the subject's current examination position is displayed where the profiling probe contacts the operation surface, which, combined with the skin-like configuration of the operation table, gives the doctor the feeling of pressing against real skin.
Detailed Description
The following description of the embodiments of the present invention is made clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
It will be understood that when an element is referred to as being "mounted" on another element, it can be directly mounted on the other element or intervening elements may also be present. When an element is referred to as being "disposed on" another element, it can be directly disposed on the other element or intervening elements may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to limit the invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Referring to fig. 1, the doctor-side control device 1 includes a profiling probe 10 and an operation table 20. The profiling probe 10 reproduces, on the operation table 20, the movement and pressure of a real ultrasonic probe, thereby simulating the whole process of ultrasonic detection performed by a real ultrasonic probe on a body part of the subject. The part of the profiling probe 10 that contacts the operation table 20 is provided with a soft rubber material to reduce the rigidity of the contact area, reduce the frictional resistance between the profiling probe 10 and the operation table 20, and prolong the service life of both. The operation table 20 is a skin-like operation table and has an operation surface 21 that cooperates with the profiling probe 10. The operation surface 21 is a contoured operation surface: as a whole, it undulates so as to follow the surface shape of the body part of the subject to be examined. In this embodiment, the operation surface 21 is contoured to the abdominal region of the subject and is generally arc-shaped, and it includes a first region 211 corresponding to the front abdomen and second regions 212 corresponding to the lateral abdomen. The two second regions 212 corresponding to the lateral abdomen are located on either side of the first region 211, and each second region 212 bends rearward from the first region 211, so that the operation surface 21 presents a three-dimensional shape corresponding to the human abdomen.
Referring to fig. 2, in the embodiment shown in fig. 2, the operation table 20 further includes an operation table body 22, a pressure detecting device 23 disposed below the operation table body 22, and a flexible display layer 24, a flexible sensor layer 25, and an elastic material layer 26 disposed above the operation table body 22. The operation table body 22, the pressure detecting device 23, the flexible display layer 24, the flexible sensor layer 25, and the elastic material layer 26 have matching dimensions in the directions perpendicular to the stacking direction (i.e., the thickness direction). The pressure detecting device 23 is attached to the underside of the operation table body 22, and the flexible display layer 24 is attached to the upper side of the operation table body 22. The flexible sensor layer 25 is attached to the upper side of the flexible display layer 24, and the elastic material layer 26 is attached to the upper side of the flexible sensor layer 25. The operation surface 21 is formed on the elastic material layer 26. The upper surface of the operation table body 22 is formed as an arc that follows the abdominal region of the subject, so that the operation surface 21 formed on the elastic material layer 26 likewise has a surface shape that follows the abdominal region. The elastic material layer 26 is made of a transparent or semitransparent material that allows light to pass through, is elastically deformable in the thickness direction, and can deform in the thickness direction under an external force. Preferably, the elastic material layer 26 has uniform elasticity throughout; more preferably, the elastic material layer 26 is a skin-like elastic layer. The layers are fully bonded to one another with hydrogel, optical adhesive, or the like. The flexible sensor layer 25 is made of a transparent or semitransparent material.
Referring to fig. 3, in the embodiment shown in fig. 3, the elastic material layer 26 is attached to the upper side of the operation table body 22, the flexible display layer 24 is attached to the upper side of the elastic material layer 26, the flexible sensor layer 25 is attached to the upper side of the flexible display layer 24, and the operation surface 21 is formed on the flexible sensor layer 25. The flexible sensor layer 25 is made of a transparent or semitransparent material, while the elastic material layer 26 may be made of a transparent, semitransparent, or opaque material.
Referring to fig. 4, in the embodiment shown in fig. 4, the flexible display layer is omitted, the flexible sensor layer 25 is attached directly to the upper side of the operation table body 22, the elastic material layer 26 is attached to the flexible sensor layer 25, and the flexible sensor layer 25 and the elastic material layer 26 may be made of transparent, semitransparent, or opaque materials.
Referring to fig. 5, in the embodiment shown in fig. 5, the flexible display layer is omitted, the elastic material layer 26 is attached to the upper side of the operation table body 22, and the flexible sensor layer 25 is attached directly to the elastic material layer 26. The flexible sensor layer 25 and the elastic material layer 26 may be made of transparent, semitransparent, or opaque materials.
In each of the above embodiments, the elastic material layer 26 enables the operation table 20 to obtain a skin elasticity similar to that of a human or animal body, so as to improve the doctor's experience and acceptance; the flexible sensor layer 25 acquires the position and/or movement track of the profiling probe 10 on the operation surface 21; the flexible display layer 24 displays image information of the examined region of the subject; and the pressure detecting device 23 senses the force applied by the profiling probe 10 to the operation surface 21. For better sensitivity, when the elastic material layer 26 is attached to the flexible sensor layer 25, the thickness of the elastic material layer 26 is preferably less than 12 mm and the thickness of the flexible sensor layer 25 is preferably less than 5 mm. Preferably, when the flexible sensor layer 25 is the uppermost layer and contacts the profiling probe 10, a protective film about 1 mm thick may be applied to its surface to prolong its service life.
Referring to fig. 6, a schematic diagram of the hardware connections of the master-slave ultrasonic detection system is shown. The master-slave ultrasonic detection system 3 comprises a doctor-side device 4 and a subject-side device 5. The doctor-side device 4 mainly comprises the profiling probe 10, the operation table 20, a control device 41, a communication device 42, and a display device 43; the subject-side device 5 mainly comprises an imaging device 51, a mechanical arm 52, an ultrasonic probe 53, a control device 54, and a communication device 55. The doctor-side device 4 and the subject-side device 5 communicate through their respective communication devices 42, 55. For example, when the physical distance between the two devices is short, they may communicate through a data line or a short-range link such as Bluetooth; alternatively, regardless of distance, they may communicate through their respective communication devices 42, 55 over a wired and/or wireless network.
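The embodiments do not specify a transport or message format for the communication devices 42, 55. The following is only a minimal sketch of how a control command might be exchanged between the doctor-side device 4 and the subject-side device 5, assuming length-prefixed JSON messages over a TCP socket; all function names and field names are hypothetical and not taken from the patent.

```python
import json
import socket
import struct

def send_message(sock: socket.socket, payload: dict) -> None:
    """Send one length-prefixed JSON message (hypothetical wire format)."""
    body = json.dumps(payload).encode("utf-8")
    sock.sendall(struct.pack(">I", len(body)) + body)

def recv_message(sock: socket.socket) -> dict:
    """Receive one length-prefixed JSON message."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return json.loads(_recv_exact(sock, length).decode("utf-8"))

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed connection")
        buf += chunk
    return buf

# Example: the doctor-side device forwarding one control command to the subject side.
# command = {"pose": [0.0, 0.0, 0.0, 1.0], "position": [0.42, 0.13], "force_n": 3.5}
# send_message(doctor_to_subject_sock, {"type": "control_command", "data": command})
```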
Specifically, the profiling probe 10 includes a pose sensor 11, and the pose sensor 11 senses and outputs pose information of the profiling probe 10 to the control device 41 in real time. The operation table 20 includes a flexible sensor 251, a flexible display 241, and the pressure detecting device 23, all connected to the control device 41. The flexible sensor 251 forms part or all of the flexible sensor layer 25 and is used to acquire and output the position information of the profiling probe 10 to the control device 41 in real time. Specifically, in one embodiment, a conductive layer is disposed on the profiling probe 10, and capacitive touch sensing, inductive touch sensing, or a similar sensing technology is used to establish electrical coupling between the profiling probe 10 and the flexible sensor 251, so that the flexible sensor 251 can acquire the specific position of the profiling probe 10 on the operation surface 21 in real time; in another embodiment, resistive touch sensing or a similar sensing technology is used between the profiling probe 10 and the flexible sensor 251 to achieve the same real-time acquisition. The flexible display 241 forms part or all of the flexible display layer 24 and displays image information of the subject's current examination position in real time. In other embodiments, the flexible display 241 displays image information of the entire examined region of the subject, for example, the entire abdominal surface. Presenting images of the examined region in this way helps give the doctor the feeling of pressing against real skin. The pressure detecting device 23 senses in real time the force applied by the profiling probe 10 to the operation surface 21 of the operation table 20 and outputs the force information to the control device 41.
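The text does not describe how the flexible sensor 251, the pose sensor 11, and the pressure detecting device 23 expose their readings to the control device 41. As a rough sketch only, the snippet below assumes hypothetical driver interfaces for the three sensors and bundles one reading from each into a single probe-state record; none of these names come from the patent.

```python
from dataclasses import dataclass
from typing import Protocol, Tuple

class TouchSensor(Protocol):
    """Hypothetical driver for the flexible sensor 251."""
    def read_position(self) -> Tuple[float, float]: ...   # (u, v) on the operation surface 21

class PoseSensor(Protocol):
    """Hypothetical driver for the pose sensor 11."""
    def read_quaternion(self) -> Tuple[float, float, float, float]: ...

class PressureSensor(Protocol):
    """Hypothetical driver for the pressure detecting device 23."""
    def read_force(self) -> float: ...                     # newtons

@dataclass
class ProbeState:
    position_uv: Tuple[float, float]               # touch point on the operation surface 21
    pose_quat: Tuple[float, float, float, float]   # spatial orientation of the profiling probe 10
    force_n: float                                 # force pressed into the surface, in newtons

def sample_probe_state(touch: TouchSensor, pose: PoseSensor,
                       pressure: PressureSensor) -> ProbeState:
    """Poll the three sensors once and return one combined snapshot."""
    return ProbeState(touch.read_position(), pose.read_quaternion(), pressure.read_force())
```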
The control device 41 generates, from the pose information and position information of the profiling probe 10 and the force information applied by the profiling probe 10 to the operation surface 21, all acquired in real time, a control command for controlling the mechanical arm 52, and the mechanical arm 52 controls the ultrasonic probe 53 according to the control command. The ultrasonic probe 53 is typically in direct contact with the skin of the subject's body surface. During ultrasonic imaging, the ultrasonic probe 53 converts electrical signals into ultrasonic waves that propagate into the target area (such as organs, tissues, and blood vessels in the subject's body), then receives the ultrasonic echoes returned from the target area, which carry information about the region to be examined, and converts them back into electrical signals.
Referring to fig. 7, the communication devices 42 and 55 are omitted from fig. 7, but it should be understood that the control devices 41 and 54 communicate with each other either directly or via the communication devices 42 and 55 over a wired and/or wireless network.
The control device 41 further includes a control command generation module 411, a position information display module 412, and an ultrasound image display module 413. The control command generation module 411 generates a control command for controlling the mechanical arm 52 from the pose information and position information of the profiling probe 10 and the force information applied by the profiling probe 10 to the operation surface 21, all acquired in real time. Specifically, the control command generation module 411 generates, from the pose information of the profiling probe 10, a pose command for controlling the pose of the mechanical arm 52, so that the mechanical arm 52 controls the ultrasonic probe 53 to replicate the pose of the profiling probe 10; here, the pose is the spatial orientation of the profiling probe 10. The control command generation module 411 generates, from the position information of the profiling probe 10, a position command for controlling the position of the mechanical arm 52, so that the mechanical arm 52 moves the ultrasonic probe 53 to the corresponding position on the subject's body. The control command generation module 411 generates, from the force information applied to the operation surface 21 by the profiling probe 10, a force command for controlling the force of the mechanical arm 52, so that the mechanical arm controls the force with which the ultrasonic probe 53 acts on the subject's body.
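The internal format of these commands is not given in the text. Reusing the hypothetical ProbeState sketch above, the snippet below illustrates one plausible way the control command generation module 411 might split a probe snapshot into pose, position, and force sub-commands; the dictionary layout and the surface_to_body mapping are assumptions, not the patent's actual protocol.

```python
from typing import Callable, Tuple

def generate_control_command(
        state: ProbeState,
        surface_to_body: Callable[[Tuple[float, float]], Tuple[float, float]]) -> dict:
    """Split one probe snapshot into pose / position / force sub-commands
    (illustrative structure only)."""
    return {
        # pose command: the arm should reproduce the probe's spatial orientation
        "pose": {"quaternion": list(state.pose_quat)},
        # position command: the touch point on the operation surface 21 is first
        # mapped to the corresponding point on the subject's examined region
        "position": {"body_point": list(surface_to_body(state.position_uv))},
        # force command: the arm should press the ultrasonic probe 53 with this force
        "force": {"newtons": state.force_n},
    }
```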
The control device 41 further includes a position matching module 414. The position matching module 414 matches the three-dimensional data of the examined region of each subject with the three-dimensional data of the effective area of the operation surface 21, so that each position of the examined region of each subject corresponds uniquely to one position of the effective area of the operation surface 21. In this way, every subject can be mapped onto the effective area of the operation surface 21 regardless of the size and shape of the examined region, and the doctor operating the ultrasonic detection does not have to adjust the scanning position or manner to account for differences in the shape of each subject's examined region, which facilitates the doctor's scanning operation. The effective area of the operation surface 21 is the set of all position points at which the operation table 20 can detect the position of the profiling probe 10 and the force it applies when the profiling probe 10 acts on the operation surface 21. In this embodiment, the operation surface 21 is contoured to the abdominal region of the subject and has a first region 211 corresponding to the front abdomen and two second regions 212 corresponding to the two sides of the abdomen, so the three-dimensional data of each subject's abdominal region is likewise divided into three parts: the front abdomen and the two lateral abdomens. The position matching module 414 matches the three-dimensional data of the front abdomen with the three-dimensional data of the effective area of the first region 211 of the operation surface 21 to obtain a one-to-one correspondence between each position point of the subject's front abdomen and each position point of that effective area; preferably, the effective area of the first region 211 is the entire first region 211. Likewise, the position matching module 414 matches the three-dimensional data of the left lateral abdomen with the three-dimensional data of the effective area of the left second region 212 of the operation surface 21, and the three-dimensional data of the right lateral abdomen with the three-dimensional data of the effective area of the right second region 212, to obtain the corresponding one-to-one correspondences; preferably, the effective area of each second region 212 is that entire second region 212. In matching the examined region of the subject to the effective area of the operation surface 21, according to one embodiment, the position matching module 414 compares the size of each examined sub-region of the subject with the size of the corresponding region of the operation surface 21, maps each examined sub-region onto the corresponding region of the operation surface 21 by proportional enlargement or reduction, and records the corresponding scale data.
For example, when the width of a subject's front abdomen is smaller than the width of the first region 211 of the operation surface 21 and its height is larger than the height of the first region 211, the position matching module 414 obtains the ratio of the width of the subject's front abdomen to the width of the first region 211 and the ratio of the height of the front abdomen to the height of the first region 211, and, using these two ratios, maps each position across the width and height of the subject's front abdomen to a position across the width and height of the first region 211. In this way, the examined regions of subjects with different shapes are all fitted onto the effective area of the operation surface 21, so that the operation table 20 is adapted to every subject.
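As a minimal sketch only, the matching described above can be reduced to two scale ratios applied to a rectangular parameterization of each region. The function names, the (u, v) coordinates, and the example dimensions below are assumptions for illustration; the same ratios can be applied in either direction, and the surface-to-body direction is shown because that is what the downstream position command needs.

```python
from typing import Tuple

def match_region(surface_size: Tuple[float, float],
                 body_size: Tuple[float, float]) -> Tuple[float, float]:
    """Return (width_ratio, height_ratio) between the examined body region and
    the corresponding operation-surface region (the scale data of the matching)."""
    return (body_size[0] / surface_size[0], body_size[1] / surface_size[1])

def surface_to_body_point(point_uv: Tuple[float, float],
                          ratios: Tuple[float, float]) -> Tuple[float, float]:
    """Map a touch point on the operation-surface region onto the subject's body
    region by proportional scaling; the inverse mapping uses the reciprocal ratios."""
    return (point_uv[0] * ratios[0], point_uv[1] * ratios[1])

# Hypothetical numbers: a front abdomen 0.30 m wide x 0.40 m high matched against a
# first region 211 measuring 0.36 m x 0.24 m (wider but shorter, as in the example).
# ratios = match_region(surface_size=(0.36, 0.24), body_size=(0.30, 0.40))
# surface_to_body_point((0.18, 0.12), ratios)   # centre of the region maps to centre
```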
The position matching module 414 starts the matching operation in response to an action by the doctor and returns a matching result when the matching is complete. For example, the position matching module 414 is started when the doctor operates a specific button; after the matching is complete, the position matching module 414 returns a matching-complete message to the doctor-side device 4, and the matching result may also be announced on the doctor-side device 4 and/or the subject-side device 5 by display and/or voice. In other embodiments, the position matching module 414 may also start the matching operation automatically whenever an ultrasonic examination task is started.
After the position matching module 414 completes the matching, the control command generation module 411 can generate a position command for controlling the position of the mechanical arm 52 from the matching data provided by the position matching module 414, such as the scale data between the subject's examined region and the effective area of the operation surface 21, together with the position information of the profiling probe 10 acquired in real time. The position information display module 412 can display the position information of the ultrasonic probe 53 at the corresponding position of the operation surface 21 according to the matching data provided by the position matching module 414 and the position information of the ultrasonic probe 53. In one embodiment, this position information is image information of the location of the ultrasonic probe 53 captured by the imaging device 51.
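Purely to illustrate how the sketches above might fit together (stub sensors, hypothetical names, and not the patent's actual data flow), a position command can be produced by composing the matching ratios with a fresh probe snapshot:

```python
# Illustrative composition of the earlier sketches with stub sensor drivers.
class _StubTouch:
    def read_position(self): return (0.18, 0.12)
class _StubPose:
    def read_quaternion(self): return (0.0, 0.0, 0.0, 1.0)
class _StubPressure:
    def read_force(self): return 3.5

ratios = match_region(surface_size=(0.36, 0.24), body_size=(0.30, 0.40))
state = sample_probe_state(_StubTouch(), _StubPose(), _StubPressure())
command = generate_control_command(
    state, surface_to_body=lambda uv: surface_to_body_point(uv, ratios))
# send_message(doctor_to_subject_sock, {"type": "control_command", "data": command})
```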
The ultrasound image display module 413 is configured to display the ultrasound image information returned from the subject-side device 5 on the display device 43 for viewing by the doctor.
The control device 54 further includes a three-dimensional data processing module 540, a control command analysis module 541, an ultrasonic signal processing module 542, and a position information providing module 543. The three-dimensional data processing module 540 is connected to the imaging device 51. The imaging device 51 may include a single camera that moves under the control of the control device 54 or another control component to capture three-dimensional image data of the subject's examined region, or it may include a plurality of cameras, each acquiring three-dimensional image data of one area of the examined region, in which case the three-dimensional image data of the whole examined region is obtained by stitching the data from all the cameras. The three-dimensional data processing module 540 obtains the three-dimensional image data of the examined region captured by the imaging device 51 and processes it into three-dimensional data; in one embodiment, this processing includes stitching (in the multi-camera case), removing color data, and the like. The three-dimensional data processing module 540 transmits the three-dimensional data to the control device 41 through the communication devices 55, 42 so that the position matching module 414 can perform the matching operation. In one embodiment, after the position matching module 414 is started, it sends a request signal to the three-dimensional data processing module 540, and the three-dimensional data processing module 540 starts transmitting the three-dimensional data in response; in another embodiment, the three-dimensional data processing module 540 starts both the data processing and the transmission of the three-dimensional data in response to the request signal from the position matching module 414.
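The concrete representation of the three-dimensional data is not described. As a rough sketch, assuming each camera yields a point cloud already registered in a common coordinate frame, the "stitching" and "removing color data" steps could look like the following NumPy-based snippet (illustrative only):

```python
import numpy as np

def strip_color(points: np.ndarray) -> np.ndarray:
    """Keep only the (x, y, z) geometry of an (N, 6) array of
    [x, y, z, r, g, b] points, i.e. 'remove color data'."""
    return points[:, :3]

def stitch_point_clouds(clouds: list) -> np.ndarray:
    """Concatenate per-camera point clouds into one cloud of the whole examined
    region. Assumes each cloud is already registered in a common coordinate
    frame; camera calibration/registration is outside the scope of this sketch."""
    return np.vstack([strip_color(c) if c.shape[1] == 6 else c for c in clouds])
```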
The control command analysis module 541 analyzes the control command generated by the control command generation module 411 and generates controls for each degree of freedom of the mechanical arm 52, so that the mechanical arm 52 controls the ultrasonic probe 53 to replicate the pose and position of the profiling probe 10 and the force applied to the subject. The ultrasonic signal processing module 542 receives the electrical signals returned from the ultrasonic probe 53 and performs a series of processing steps on them to obtain ultrasound image data. The ultrasound image data includes, but is not limited to, two-dimensional image data such as B-mode, C-mode, and D-mode data, three-dimensional ultrasound image data, and four-dimensional image data that includes a time dimension. The ultrasound image data may be one or more image frames or a video file formed from a sequence of image frames. Further, the ultrasound image data may also include related information such as the detection mode and detection parameters used to generate the ultrasonic echoes, and subject data (e.g., age and medical history). The ultrasound image data is transmitted to the control device 41 in real time and displayed on the display device 43 in real time by the ultrasound image display module 413. In another embodiment, the subject-side device 5 may also include a display device, on which the ultrasound image data is displayed at the same time. The position information providing module 543 is connected to the imaging device 51 in a wired or wireless manner to obtain image information of the location of the ultrasonic probe 53 captured by the imaging device 51 and provide it to the position information display module 412.
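How the per-degree-of-freedom control is derived (e.g., the arm's inverse kinematics and force control) is not detailed in the text. The sketch below merely unpacks a received command of the hypothetical format shown earlier and hands the pose, position, and force targets to an assumed arm interface; joint-level resolution is left to that interface.

```python
from typing import Sequence

class ArmInterface:
    """Hypothetical wrapper around the real controller of the mechanical arm 52."""
    def move_to(self, body_point: Sequence[float], quaternion: Sequence[float]) -> None:
        raise NotImplementedError

    def set_contact_force(self, newtons: float) -> None:
        raise NotImplementedError

def analyse_control_command(command: dict, arm: ArmInterface) -> None:
    """Forward the pose, position, and force targets of one control command to
    the arm; per-joint (per-degree-of-freedom) resolution is delegated to the
    arm's own motion and force controller, which the patent does not describe."""
    arm.move_to(command["position"]["body_point"], command["pose"]["quaternion"])
    arm.set_contact_force(command["force"]["newtons"])
```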
In summary, with the operation table 20 and the master-slave ultrasonic detection system 3 provided by the embodiments of the present invention: in the first aspect, the elastic material layer 26 provided on the operation table 20 simulates the skin elasticity of a human (or animal) body, so that the scene in which an ultrasonic probe contacts the skin can be reproduced as closely as possible and the doctor's imaging experience is enhanced; in the second aspect, the operation surface 21 of the operation table 20 simulates the shape of the examined region, which makes it convenient for the doctor to scan different regions, for example, an arc-shaped operation surface 21 simulates the human abdomen and facilitates scanning of the subject's abdomen; in the third aspect, by matching the three-dimensional data of the examined regions of different subjects with the three-dimensional data of the operation surface 21 of the operation table 20, the operation table 20 can be adapted to different subjects without changing the doctor's imaging habits; in the fourth aspect, an image of the subject's current examination position is displayed where the profiling probe 10 contacts the operation surface 21, which, combined with the skin-like configuration of the operation table 20, gives the doctor the feeling of pressing against real skin.
It will be appreciated that, in the above embodiments, the profiling probe 10 is designed to mimic the structural shape of a real ultrasonic probe; in other embodiments, the profiling probe 10 may be replaced by another probe that is not a profiling probe.
It is understood that when the flexible display layer 24 is omitted from the operation table 20, the position information display module 412 in the control device 41 and the position information providing module 543 in the control device 54 may also be omitted.
It will be appreciated that, in other embodiments, the control command generation module 411 and the position matching module 414 may instead be disposed in the control device 54, in which case the pose information, position information, force information, and so on obtained by the doctor-side device 4 are transmitted to the control device 54.
It should be understood that, in other embodiments, the three-dimensional data processing module 540, the ultrasonic signal processing module 542, and the position information providing module 543 may instead be disposed in the control device 41, in which case the three-dimensional image data obtained at the subject-side device 5, the electrical signals obtained by the ultrasonic probe, the image information obtained by the imaging device, and so on are transmitted to the control device 41.
It will be appreciated that the control devices 41, 54 may each comprise a suitable processing unit and storage unit to perform the functions required of them.
It will be appreciated that the processing units in the control devices 41, 54 may execute a series of software programs, and that the storage units store the software programs executed by the processing units and may also store some of the result data produced when the processing units execute those programs, so as to implement the functions described above. The modules referred to in this specification are summaries and descriptions of the respective functions of the software programs executed by the control devices 41, 54.
It will be appreciated by persons skilled in the art that the above embodiments have been provided for the purpose of illustrating the invention and are not to be construed as limiting the invention, and that suitable modifications and variations of the above embodiments are within the scope of the invention as claimed.