CROSS-REFERENCE TO RELATED APPLICATION
The present application claims priority under 35 U.S.C. § 119(e) to Japanese Patent Application No. 2017-222355, filed on Nov. 20, 2017, the entire contents of which are incorporated herein by reference.
BACKGROUND
Technological Field
The present invention relates to a dynamic state imaging system.
Description of the Related Art
In place of conventional still image capturing and diagnosis of a subject by radiation using a film/screen or a stimulable phosphor plate, attempts are being made to image a dynamic state of the subject by using a semiconductor image sensor such as a flat panel detector (FPD) and to apply the resulting dynamic state image to diagnosis. Specifically, utilizing the quick responsiveness of the semiconductor image sensor in reading and erasing image data, pulsed radiation is continuously radiated from a radiation source in synchronization with the reading and erasing timings of the semiconductor image sensor, and imaging is performed a plurality of times per second, whereby a dynamic state of a subject is imaged (for example, see JP 5672147 B2 and JP 5125750 B2).
However, in conventional dynamic state imaging, imaging is always performed at constant time intervals from the imaging start without regard to the period of the dynamic state of the subject. For that reason, as illustrated in FIG. 6, the phases of the frame images acquired in the respective periods of the dynamic state imaging generally do not coincide with each other, and it has been impossible to effectively utilize dynamic state images of a plurality of periods.
SUMMARY
An object of the present invention is to enable dynamic state images of a plurality of periods to be effectively utilized by acquiring dynamic state images having the same phases at each period of the plurality of periods.
To achieve the abovementioned object, according to an aspect of the present invention, a dynamic state imaging system reflecting one aspect of the present invention comprises a hardware processor that acquires period information on a period of a subject's dynamic state having periodicity, and controls a radiation imaging apparatus to acquire dynamic state images of a plurality of periods by assigning a same phase as an imaging start timing for each period of the subject's dynamic state, based on the period information, and performing imaging at predetermined time intervals from the imaging start timing for each period.
BRIEF DESCRIPTION OF THE DRAWINGS
The advantages and features provided by one or more embodiments of the invention will become more fully understood from the detailed description given hereinbelow and the appended drawings which are given by way of illustration only, and thus are not intended as a definition of the limits of the present invention:
FIG. 1 is a diagram illustrating an overall configuration of a dynamic state imaging system according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating imaging control processing executed by a control device of an imaging console of FIG. 1;
FIG. 3 is a diagram illustrating a temporal change in dynamic state detection information of a subject and imaging timings of frame images in the present embodiment;
FIG. 4 is a diagram for explaining frame images at the same phase of a plurality of periods;
FIG. 5 is a diagram schematically illustrating a modification in which imaging phases are shifted by a predetermined interval for each period, and frame images are rearranged in accordance with the phases; and
FIG. 6 is a diagram illustrating a temporal change in dynamic state detection information of a subject and imaging timings of frame images in a conventional technique.
DETAILED DESCRIPTION OF EMBODIMENTS
Hereinafter, one or more embodiments of the present invention will be described with reference to the drawings. However, the scope of the invention is not limited to the disclosed embodiments.
[Configuration of Dynamic State Imaging System 100]
First, a configuration of the present embodiment will be described.
FIG. 1 illustrates an overall configuration example of a dynamic state imaging system 100 in the present embodiment.
As illustrated in FIG. 1, the dynamic state imaging system 100 includes an imaging apparatus 1 and an imaging console 2 connected via a communication cable or the like, and a diagnostic console 3 connected to the imaging console 2 via a communication network NT such as a local area network (LAN). The apparatuses constituting the dynamic state imaging system 100 conform to the Digital Imaging and Communications in Medicine (DICOM) standard, and communication between the apparatuses is performed in accordance with DICOM.
[Configuration of Imaging Apparatus 1]
The imaging apparatus 1 is a radiation imaging apparatus that images a subject's dynamic state having periodicity, such as a form change of expansion and contraction of a lung accompanying respiratory movement, and a beat of a heart. Dynamic state imaging means to acquire a plurality of images indicating a dynamic state of a subject by repeatedly irradiating the subject with radiation such as a pulsed X-ray at predetermined time intervals (pulse irradiation) or continuously irradiating the subject with the radiation at a low dose rate (continuous irradiation), and reading a radiation image at predetermined time intervals. A series of images obtained by the dynamic state imaging is called a dynamic state image. In addition, each of the plurality of images constituting the dynamic state image is referred to as a frame image. In the following embodiment, an example will be described in which dynamic state imaging of a chest is performed by pulse irradiation.
A radiation source 11 is arranged at a position facing a radiation detecting device 13 across a subject M, and irradiates the subject M with radiation (X-ray) in accordance with control of a radiation irradiation control apparatus 12.
The radiation irradiation control apparatus 12 is connected to the imaging console 2, and performs radiation imaging by controlling the radiation source 11 on the basis of radiation irradiation conditions input from the imaging console 2. The radiation irradiation conditions input from the imaging console 2 include, for example, a pulse rate, a pulse width, a pulse interval, the number of imaging periods per dynamic state imaging, the number of imaging frames per period (the number of times of imaging), a value of an X-ray tube current, a value of an X-ray tube voltage, and an additional filter type. The pulse rate is the number of times of radiation irradiation per second, and coincides with a frame rate described later. The pulse width is a radiation irradiation time per radiation irradiation. The pulse interval is a time from a radiation irradiation start to the next radiation irradiation start, and coincides with a frame interval described later.
The radiation detecting device 13 includes a semiconductor image sensor such as a flat panel detector (FPD). The FPD includes a glass substrate, for example, and a plurality of detecting elements (pixels) is arranged in a matrix at a predetermined position on the substrate, the detecting elements each detecting radiation emitted from the radiation source 11 and transmitted through at least the subject M depending on its intensity, and converting the radiation detected into an electric signal to accumulate the signal. Each pixel includes a switching device such as a thin film transistor (TFT). Examples of the FPD include an indirect conversion type that converts an X-ray into an electric signal by a photoelectric conversion element via a scintillator, and a direct conversion type that directly converts an X-ray into an electric signal, and any of them may be used.
The radiation detecting device 13 is provided to face the radiation source 11 across the subject M.
A reading control apparatus 14 is connected to the imaging console 2. The reading control apparatus 14 controls the switching device of each pixel of the radiation detecting device 13 on the basis of image reading conditions input from the imaging console 2 to switch reading of the electric signal accumulated in each pixel, and reads the electric signal accumulated in the radiation detecting device 13 to acquire image data. The image data is a frame image. A pixel signal value of the frame image represents a density value. Then, the reading control apparatus 14 outputs the frame image acquired to the imaging console 2. The image reading conditions include, for example, a frame rate, a frame interval, a pixel size, and an image size (matrix size). The frame rate is the number of frame images to be acquired per second, and coincides with the pulse rate. The frame interval is a time from a frame image acquiring operation start to the next frame image acquiring operation start, and coincides with the pulse interval.
Here, the radiation irradiation control apparatus 12 and the reading control apparatus 14 are connected to each other, and mutually exchange synchronization signals to synchronize radiation irradiation operation with image reading operation.
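As an illustrative sketch only (the data structure and field names below are assumptions, not part of the disclosed apparatus), the radiation irradiation conditions and the image reading conditions, together with the constraint that the pulse rate coincides with the frame rate and the pulse interval with the frame interval, can be summarized as follows.

```python
from dataclasses import dataclass

@dataclass
class IrradiationConditions:
    pulse_rate_hz: float        # radiation pulses per second
    pulse_width_s: float        # irradiation time per pulse
    pulse_interval_s: float     # time between consecutive pulse starts
    frames_per_period: int      # imaging frames per dynamic state period
    num_periods: int            # imaging periods per dynamic state imaging
    tube_current_ma: float
    tube_voltage_kv: float

@dataclass
class ReadingConditions:
    frame_rate_hz: float        # frame images acquired per second
    frame_interval_s: float     # time between consecutive frame acquisitions
    pixel_size_um: float
    matrix_size: tuple          # image size, e.g. (rows, columns)

def consistent(irr: IrradiationConditions, rd: ReadingConditions) -> bool:
    # The pulse rate must coincide with the frame rate, and the pulse interval
    # with the frame interval, so that irradiation and reading stay in step.
    return (abs(irr.pulse_rate_hz - rd.frame_rate_hz) < 1e-9 and
            abs(irr.pulse_interval_s - rd.frame_interval_s) < 1e-9)
```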
[Configuration of Imaging Console 2]
The imaging console 2 outputs the radiation irradiation conditions and the image reading conditions to the imaging apparatus 1 to control the radiation imaging and the radiation image reading operation by the imaging apparatus 1, and displays the dynamic state image acquired by the imaging apparatus 1 so that an imaging technician or an imaging practitioner can confirm positioning and confirm whether or not the image is suitable for diagnosis.
As illustrated in FIG. 1, the imaging console 2 includes a control device 21, a storage device 22, an operation device 23, a display device 24, and a communication device 25, and the devices are connected to each other by a bus 26.
The control device 21 includes a central processing unit (CPU) and random access memory (RAM). The CPU of the control device 21 reads a system program and various processing programs stored in the storage device 22 depending on operation of the operation device 23 to deploy the programs in the RAM, and executes various types of processing such as imaging control processing described later in accordance with the programs deployed, to centrally control operation of each device of the imaging console 2 and the radiation irradiation operation and reading operation of the imaging apparatus 1. The control device 21 functions as a period information acquiring device and a control device.
The storage device 22 includes a nonvolatile semiconductor memory or a hard disk. The storage device 22 stores data such as various programs executed by the control device 21, parameters necessary for execution of processing by the programs, or processing results. For example, the storage device 22 stores a program for executing the imaging control processing illustrated in FIG. 2. In addition, the storage device 22 stores predetermined conditions among the radiation irradiation conditions and the image reading conditions. The various programs are each stored in the form of readable program code, and the control device 21 sequentially executes operation according to the program code.
The operation device 23 includes a keyboard including a cursor key, numeral input keys, and various function keys, and a pointing device such as a mouse, and outputs an instruction signal input by key operation on the keyboard or by mouse operation to the control device 21. In addition, the operation device 23 may include a touch panel on a display screen of the display device 24, and in this case, outputs an instruction signal input via the touch panel to the control device 21.
The display device 24 includes a monitor such as a liquid crystal display (LCD) or a cathode ray tube (CRT), and displays an input instruction, data, and the like from the operation device 23 in accordance with an instruction of a display signal input from the control device 21.
The communication device 25 includes a LAN adapter, a modem, and a terminal adapter (TA), and controls data transmission/reception with each apparatus connected to the communication network NT.
A dynamic state detecting device 27 detects information indicating a dynamic state of the subject M and outputs a temporal change in the information (dynamic state detection information) to the control device 21. When the subject M is a lung field, the lung field moves with the respiratory movement, so that, for example, a respiration measuring instrument such as a respiration sensor, a spirometer, or a respiration monitor belt, and a sound collecting apparatus such as a microphone that collects a sound (for example, a breath sound) generated by an examinee, can be used as the dynamic state detecting device 27. The sound generated by the examinee may be a voice emitted by the examinee while breathing. A period in which a voice comes out can be regarded as exhalation (contraction phase of the lung field), and a period in which no voice comes out can be regarded as inhalation (expansion phase of the lung field). When the subject M is a heart, for example, a heart rate measuring instrument such as an electrocardiograph can be used as the dynamic state detecting device 27. Note that, the dynamic state detecting device 27 is preferably attached at a position that does not overlap the lung field during dynamic state imaging. For example, in the case of an electrocardiograph, the electrocardiograph is preferably attached to an arm or the like.
[Configuration of Diagnostic Console 3]
The diagnostic console 3 is a dynamic state image processing apparatus that acquires a dynamic state image from the imaging console 2, applies image processing to the acquired dynamic state image, and displays the processed image.
As illustrated in FIG. 1, the diagnostic console 3 includes a control device 31, a storage device 32, an operation device 33, a display device 34, and a communication device 35, and the devices are connected to each other by a bus 36.
The control device 31 includes a CPU and RAM. The CPU of the control device 31 reads a system program and various processing programs stored in the storage device 32 depending on operation of the operation device 33 to deploy the programs in the RAM, and centrally controls operation of each device of the diagnostic console 3 in accordance with the programs deployed. The control device 31 functions as an extracting device, a dynamic state image creating device, and a comparing device.
The storage device 32 includes a nonvolatile semiconductor memory or a hard disk. The storage device 32 stores data such as various programs executed by the control device 31, parameters necessary for execution of processing by the programs, or processing results. The various programs are each stored in a form of readable program code, and the control device 31 sequentially executes operation according to the program code.
The storage device 32 stores the dynamic state image imaged in the past in association with an identification ID, patient information (examinee information: for example, a patient ID and the name, height, weight, age, and gender of the patient (examinee)), examination information (for example, an examination ID, an examination date, and a subject part (subject M; lung or heart)), and the like.
The operation device 33 includes a keyboard including a cursor key, numeral input keys, and various function keys, and a pointing device such as a mouse, and outputs an instruction signal input by key operation on the keyboard or by mouse operation by a user to the control device 31. In addition, the operation device 33 may include a touch panel on a display screen of the display device 34, and in this case, outputs an instruction signal input via the touch panel to the control device 31.
The display device 34 includes a monitor such as an LCD or a CRT, and performs various displays in accordance with an instruction of a display signal input from the control device 31.
The communication device 35 includes a LAN adapter, a modem, and a TA, and controls data transmission/reception with each apparatus connected to the communication network NT.
[Operation of Dynamic State Imaging System 100]
Next, operation of the dynamic state imaging system 100 in the present embodiment will be described.
(Operation of Imaging Apparatus 1 and Imaging Console 2)
First, imaging operation by the imaging apparatus 1 and the imaging console 2 will be described.
The imaging practitioner first operates the operation device 23 of the imaging console 2 to input the patient information and the examination information of the examinee. Next, positioning is performed by arranging the subject part (subject M) between the radiation source 11 and the radiation detecting device 13. In addition, an instruction about a respiratory state (for example, normal respiration) is given to the examinee. When imaging preparation is completed, a radiation irradiation instruction is input by operation of the operation device 23.
When the radiation irradiation instruction is input by the operation device 23, the imaging control processing is executed in the imaging console 2. Hereinafter, the imaging control processing will be described.
FIG. 2 illustrates imaging control processing executed by the control device 21 of the imaging console 2. The imaging control processing is executed by cooperation between the control device 21 and the programs stored in the storage device 22.
First, the control device 21 acquires period information on the dynamic state of the subject M on the basis of the dynamic state detection information output from the dynamic state detecting device 27 (step S1).
For example, on the basis of the information (waveform) indicating the temporal change in the dynamic state detection information of the subject M output from the dynamic state detecting device 27, a timing of a predetermined feature point of the dynamic state (for example, a transition point from inhalation to exhalation, a midpoint between exhalation and inhalation, or a transition point from exhalation to inhalation, when the subject M is a lung field) is set as a base point of one dynamic state period, and the time from that timing until the feature point is detected next is acquired as the time taken for one period (in other words, the period).
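A minimal sketch of such period estimation, assuming a regularly sampled respiration waveform and using a local maximum (the transition from inhalation to exhalation) as the feature point; the function and variable names are illustrative only, not the specific implementation of the embodiment.

```python
import numpy as np

def estimate_period(signal: np.ndarray, sample_interval_s: float) -> float:
    """Estimate one dynamic state period from a sampled respiration waveform.

    A transition from inhalation to exhalation is approximated here as a
    local maximum of the waveform; the period is the time between two
    successive maxima. Other feature points could be used instead.
    """
    d = np.diff(signal)
    # indices where the signal changes from rising to non-rising (local maxima)
    maxima = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1
    if len(maxima) < 2:
        raise ValueError("not enough feature points to estimate a period")
    # time from one feature point until the same feature point is detected next
    return float((maxima[1] - maxima[0]) * sample_interval_s)
```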
Next, the control device 21 assigns the same phase of each period to an imaging start timing, determines an imaging time interval of the frame images (pulse interval, frame interval) by dividing the period by the number of imaging frames (number of times of imaging) N per period (step S2), and sets the radiation irradiation conditions and image reading conditions including the determined imaging time interval of the frame images in the radiation irradiation control apparatus 12 and the reading control apparatus 14 (step S3). The radiation irradiation conditions and the image reading conditions include the number of imaging frames N per dynamic state period, the number of imaging periods, the imaging time interval of the frame images, and the phase of the dynamic state assigned to the imaging start timing (for example, a maximum exhalation position or a maximum inhalation position, when the subject M is a lung field).
Next, on the basis of the dynamic state detection information and the period information output from the dynamic state detecting device 27, the control device 21 waits until the dynamic state of the subject M reaches the phase of the imaging start timing (step S4).
When it is determined that the dynamic state of the subject M has reached the phase of the imaging start timing (step S4; YES), the control device 21 controls the radiation irradiation control apparatus 12 and the reading control apparatus 14 of the imaging apparatus 1 to image N frame images at the time interval determined in step S2 (step S5).
In the imaging apparatus 1, radiation irradiation is performed N times by the radiation source 11 at the pulse interval set in the radiation irradiation control apparatus 12, and image data (frame images) are acquired by the radiation detecting device 13. The acquired frame images are transmitted to the imaging console 2, and sequentially stored in the storage device 22.
Next, the control device 21 determines whether or not imaging for a predetermined period (for a plurality of periods) has ended (step S6). When it is determined that the imaging for the predetermined period has not ended (step S6; NO), the control device 21 returns to step S4.
When it is determined that the imaging for the predetermined period has ended (step S6; YES), the control device 21 outputs an instruction to end the dynamic state imaging to the radiation irradiation control apparatus 12 and the reading control apparatus 14, to end the dynamic state imaging (step S7).
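The overall control flow of steps S2 to S7 can be sketched as follows; this is a minimal illustration in which wait_for_phase and expose_and_read are hypothetical callbacks standing in for the dynamic state detecting device 27 and the imaging apparatus 1, and in the disclosed system the pulse timing is enforced by the radiation irradiation control apparatus 12 rather than by sleeping in software.

```python
import time

def run_dynamic_imaging(period_s, frames_per_period, num_periods,
                        wait_for_phase, expose_and_read):
    """Minimal sketch of the imaging control flow (steps S2 to S7)."""
    # Step S2: the imaging time interval is one period divided by the number
    # of imaging frames N per period.
    frame_interval_s = period_s / frames_per_period

    frames = []
    for _ in range(num_periods):            # step S6: repeat for the plurality of periods
        wait_for_phase()                    # step S4: wait for the imaging start phase
        for _ in range(frames_per_period):  # step S5: image N frames per period
            frames.append(expose_and_read())
            time.sleep(frame_interval_s)    # stand-in for the pulse/frame interval
    return frames                           # step S7: dynamic state imaging ends
```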
The frame images acquired by the imaging are sequentially input to the imaging console 2, and the control device 21 stores the input frame images in the storage device 22 in association with respective numbers (frame numbers) each indicating the imaging order (step S8), and causes the display device 24 to display the frame images (step S9). The imaging practitioner confirms positioning and the like with the displayed dynamic state image, and determines whether an image suitable for diagnosis has been acquired by the imaging (imaging OK) or re-imaging is necessary (imaging NG). Then, a determination result is input by operation of the operation device 23.
When a determination result indicating imaging OK is input by predetermined operation of the operation device 23 (step S10; YES), information such as the identification ID for identifying the dynamic state image, the patient information, the examination information, the radiation irradiation conditions, the image reading conditions, and the number (frame number) indicating the imaging order is added to each of the series of frame images acquired by the dynamic state imaging (for example, written in the header region of the image data in the DICOM format), and the frame images are transmitted to the diagnostic console 3 via the communication device 25 (step S11). Then, the processing ends. On the other hand, when a determination result indicating imaging NG is input by the predetermined operation of the operation device 23 (step S10; NO), the series of frame images stored in the storage device 22 is deleted (step S12), and the processing ends. In this case, re-imaging is necessary.
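As a purely illustrative sketch (the use of the pydicom library and the particular DICOM attributes chosen here are assumptions, not part of the embodiment), writing such information into the header region of a frame image stored in the DICOM format could look like the following.

```python
from pydicom import dcmread

def tag_frame(in_path, out_path, patient_id, examination_id, frame_number):
    # Read one frame image stored as a DICOM file and write identifying
    # information into its header region before sending it on.
    ds = dcmread(in_path)
    ds.PatientID = patient_id          # patient (examinee) information
    ds.StudyID = examination_id        # examination information
    ds.InstanceNumber = frame_number   # number indicating the imaging order
    ds.save_as(out_path)
```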
FIG. 3 is a graph illustrating the temporal change in the dynamic state detection information of the subject M and the timings at which the frame images are imaged. As illustrated in FIG. 3, in the imaging control processing, imaging is performed at timings at the same phase for each period of the dynamic state of the subject M. For example, for each period, imaging is performed at predetermined time intervals from a predetermined phase. Therefore, dynamic state images having the same phases at each period of the plurality of periods can be acquired.
(Operation of Diagnostic Console 3)
Next, operation in the diagnostic console 3 will be described.
When a series of frame images of dynamic state images obtained by imaging the dynamic state of the subject M for a plurality of periods is received from the imaging console 2 via the communication device 35, the control device 31 of the diagnostic console 3 performs processing of at least one of the following (1) to (3) for each frame image at the same phase of the plurality of periods, as illustrated in FIG. 4.
(1) Aggregate Frame Images of the Plurality of Periods in One Period
A representative value (average value, added value, maximum value, minimum value, or median value) of the pixel values (density values) of corresponding pixels (pixels at the same position) of the frame images at the same phase is calculated, and a dynamic state image for one period is created.
For a dynamic state image, radiation irradiation is performed a plurality of times to image a plurality of frame images. For that reason, each frame image is imaged at a low dose to suppress exposure of the examinee, which results in an image with a lot of noise. Therefore, as described above, the density values of each pixel of the frame images at the same phase of the plurality of periods are aggregated to make a dynamic state image for one period, whereby a dynamic state image with reduced noise can be acquired.
The above work is repeatedly performed so that the dynamic state image for one period is created a plurality of times, whereby dynamic state images of a plurality of periods with reduced noise can also be acquired.
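A minimal sketch of this aggregation, assuming the frame images of the plurality of periods are held in a NumPy array of shape (number of periods, frames per period, rows, columns); the function and variable names are illustrative only.

```python
import numpy as np

def aggregate_same_phase(frames_by_period, method="mean"):
    """Aggregate frame images at the same phase across periods into one period.

    The representative value is taken over the period axis for each pixel of
    each phase, yielding a reduced-noise dynamic state image for one period.
    """
    stack = np.asarray(frames_by_period, dtype=np.float64)
    reducers = {"mean": np.mean, "sum": np.sum, "max": np.max,
                "min": np.min, "median": np.median}
    # result shape: (frames_per_period, rows, columns)
    return reducers[method](stack, axis=0)
```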
(2) Compare the Density Values of a Subject Region Between the Frame Images at the Same Phase
A subject region (subject M's region) is extracted from each frame image, a representative value (average value, added value, maximum value, minimum value, or median value) of the density values is calculated for the entire subject region, or for each small region obtained by dividing the subject region into a plurality of small regions, and the calculated representative values are compared between the frame images at the same phase of the plurality of periods. Alternatively, the density values of each pixel are compared between the frame images at the same phase of the plurality of periods.
Here, if there is an abnormality in the subject M, the states of the subject M do not coincide with each other even at the same phase in the dynamic state period. For example, there is a case in which a difference in the density value of the subject M becomes large between the frame images at the same phase. If there is no abnormality, the states of the subject M substantially coincide with each other at the same phase, and the difference in the density value is small between the frame images at the same phase. Therefore, as a method of comparing the density values, for example, a difference value is calculated for each region (for each pixel) of the frame images to be compared, and a group of the frame images of a phase in which the difference value is greater than or equal to a predetermined threshold (the difference becomes large) is displayed side-by-side. Alternatively, an image in which each region (each pixel) of the frame image is colored depending on the difference value may be displayed on the display device 34 (for example, red for greater than or equal to a predetermined threshold, and blue for less than the threshold).
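One possible sketch of such a density comparison, using the per-pixel range across periods as the difference value and an assumed threshold; the statistic, threshold, and names below are illustrative assumptions rather than the specific method of the embodiment.

```python
import numpy as np

def flag_abnormal_phases(frames_by_period, threshold):
    """Flag phases whose same-phase frames differ strongly between periods.

    `frames_by_period` has shape (num_periods, frames_per_period, rows, cols).
    For each phase, the per-pixel range (max - min across periods) serves as a
    simple difference value; a phase is flagged when its mean difference
    reaches the threshold.
    """
    stack = np.asarray(frames_by_period, dtype=np.float64)
    diff = stack.max(axis=0) - stack.min(axis=0)   # (frames_per_period, rows, cols)
    per_phase = diff.mean(axis=(1, 2))
    return np.where(per_phase >= threshold)[0]     # indices of flagged phases
```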
As a result, it becomes possible to provide a doctor with a basis for determining normality/abnormality of the subject M.
(3) Compare Shapes of the Subject Region Between the Frame Images at the Same Phase
An index value representing the shape of the subject M, such as the area of the subject region of each frame image, a length (for example, the distance between the apex of a lung and the diaphragm, or the width of a heart), the curvature of the contour, or the concordance rate of the contour shape, is calculated, and the calculated index values are compared between the frame images at the same phase of the plurality of periods.
As described above, if there is an abnormality in the subject M, the states of the subject M do not coincide with each other even at the same phase in the dynamic state period. For example, there is a case in which a difference in shape becomes large. If there is no abnormality, the states of the subject M substantially coincide with each other at the same phase, and the difference in shape is small. Therefore, as a method of comparing the shapes, for example, a difference value of the index values of the frame images to be compared is calculated, and a group of the frame images of a phase in which the difference value is greater than or equal to a predetermined threshold (the difference becomes large) is displayed side-by-side.
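As an illustrative sketch only, the area of the extracted subject region can serve as the shape index value; the array layout, names, and threshold below are assumptions, and length or contour-based indices could be substituted.

```python
import numpy as np

def compare_region_areas(masks_by_period, threshold):
    """Compare the area of the subject region between same-phase frames.

    `masks_by_period` is assumed to be a boolean array of shape
    (num_periods, frames_per_period, rows, cols) holding the extracted subject
    regions. A phase is flagged when the spread of its areas across periods
    reaches the threshold.
    """
    masks = np.asarray(masks_by_period, dtype=bool)
    areas = masks.sum(axis=(2, 3))                  # (num_periods, frames_per_period)
    spread = areas.max(axis=0) - areas.min(axis=0)  # per-phase difference value
    return np.where(spread >= threshold)[0]         # indices of flagged phases
```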
As a result, it becomes possible to provide a doctor with a basis for determining normality/abnormality of the subject M.
As described above, according to the dynamic state imaging system 100, the control device 21 of the imaging console 2 acquires the period information on the period of the subject M's dynamic state having the periodicity, and controls the imaging apparatus 1 to acquire the dynamic state image of the plurality of periods by assigning the same phase as the imaging start timing for each period of the subject M's dynamic state on the basis of the period information acquired, and performing imaging at predetermined time intervals from the imaging start timing for each period.
Therefore, dynamic state images having the same phases at each period of the plurality of periods can be acquired, and the dynamic state images of the plurality of periods can be effectively utilized.
For example, the control device 21 calculates the time interval of the dynamic state imaging by dividing the time of one period of the dynamic state of the subject M by the predetermined number of times of imaging per period, whereby imaging of each period can be performed at the same phase.
In addition, the control device 31 of the diagnostic console 3 creates the dynamic state image in which the average values, added values, maximum values, minimum values, or median values of the density values of each pixel of the frame images at the same phase in the dynamic state images of the plurality of periods acquired by the imaging apparatus 1 are calculated and aggregated in one period. Therefore, the dynamic state image can be acquired with reduced noise.
The control device 31 of the diagnostic console 3 extracts the subject M's region from respective frame images of the dynamic state images of the plurality of periods acquired by the imaging apparatus 1, and compares the subject M's regions extracted from the frame images at the same phase of the dynamic state images of the plurality of periods. For example, the control device 31 compares the average values, added values, maximum values, minimum values, or median values of the density values of individual pixels in the subject M's regions in the frame images at the same phase of the plurality of periods, or of the density values in individual small regions obtained by dividing the subject M's regions. Alternatively, the control device 31 compares the shapes of the subject M's regions in the frame images at the same phase of the plurality of periods. Therefore, it becomes possible to provide a doctor with information serving as a basis for determining normality/abnormality of the dynamic state of the subject M.
Note that the above description of the embodiment is a preferable example of the present invention and does not limit the present invention.
For example, in the embodiment, the description has been made assuming that the control device 21 of the imaging console 2 controls the imaging apparatus 1 to image the frame images at the same phase in the plurality of periods; however, as illustrated in FIG. 5, the imaging apparatus 1 may be controlled to image the frame images by shifting the imaging phases by a predetermined interval for each period. For example, when the subject M is a heart, the frame images are imaged at 15 fps from the end diastole in the first period, the frame images are imaged at 15 fps from 0.02 second after the end diastole in the second period, the frame images are imaged at 15 fps from 0.04 second after the end diastole in the third period, and so on. Then, the frame images of the dynamic state images imaged by shifting the imaging timing are rearranged on the basis of the phases. As a result, it is possible to virtually acquire a dynamic state image with a high frame rate. Note that, the number of imaging periods is determined so that the imaging time interval between the frame images becomes constant when the frame images are rearranged.
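A minimal sketch of this rearrangement, assuming each frame can be tagged with its phase offset within the period; the list-of-lists layout and the names are illustrative assumptions.

```python
def interleave_shifted_periods(frames_by_period, shift_per_period_s,
                               frame_interval_s):
    """Rearrange frames imaged with a per-period phase shift by phase.

    `frames_by_period[p][k]` is the k-th frame of period p, imaged at phase
    offset p * shift_per_period_s + k * frame_interval_s within the period.
    Sorting all frames by that offset yields a virtual dynamic state image of
    one period with a higher frame rate.
    """
    tagged = []
    for p, frames in enumerate(frames_by_period):
        for k, frame in enumerate(frames):
            phase = p * shift_per_period_s + k * frame_interval_s
            tagged.append((phase, frame))
    tagged.sort(key=lambda item: item[0])
    return [frame for _, frame in tagged]
```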
In the embodiment, an example has been described in which the period information acquiring device that acquires the period information on the dynamic state of the subject M and the control device that controls the imaging apparatus 1 are included in the imaging console 2 separate from the imaging apparatus 1; however, the devices may be included in the imaging apparatus 1. In addition, the description has been made assuming that the dynamic state image creating device, the extracting device, and the comparing device are included in the diagnostic console 3; however, the devices may be included in the imaging apparatus 1 integrated with the imaging console 2, or in the imaging console 2. In other words, the dynamic state imaging system of the present invention may include a single apparatus, or may include a plurality of apparatuses.
In the embodiment, an example has been described in which the subject M is a lung field or a heart; however, the subject M may be, for example, a joint of limbs, and is not limited to the embodiment.
In the embodiment, an example has been described in which the control device 21 acquires the period information on the basis of the information indicating the temporal change in the dynamic state of the subject M acquired from the respiration measuring instrument, the heart rate measuring instrument, or the sound collecting apparatus that collects the sound generated by the examinee. However, when imaging is performed while causing the examinee to breathe on the basis of a breathing instruction such as "inhale, exhale, . . . ", the lung of the examinee moves in accordance with the breathing instruction, so the information of the breathing instruction may be acquired from a breathing instruction generating apparatus as information indicating the temporal change in the dynamic state of the subject M, and the period information may be acquired on the basis of the information of the breathing instruction.
In the above description, an example has been disclosed in which a hard disk or a semiconductor nonvolatile memory is used as the computer readable medium for the programs according to the present invention, but the medium is not limited to this example. As another computer readable medium, a portable recording medium such as a CD-ROM can be applied. In addition, a carrier wave can also be applied as a medium for providing data of the programs according to the present invention via a communication line.
Besides, the detailed configuration of each apparatus constituting the dynamic state imaging system and detailed operation of each apparatus can be appropriately changed without departing from the spirit of the present invention.
Although embodiments of the present invention have been described and illustrated in detail, the disclosed embodiments are made for purposes of illustration and example only and not limitation. The scope of the present invention should be interpreted by terms of the appended claims.