FIELD OF THE ART
The present invention relates to an editing system for editing source video data imaged by a video camera or the like, and more particularly is suitably applicable to an editing system for editing material that must be reported promptly, such as sports and news broadcasting.[0001]
BACKGROUND ART
Heretofore, as this type of editing system, a system using a video tape recorder (hereinafter referred to as a VTR) as recording means for recording edit material has been provided. In this editing system, live images such as sports and news are sequentially recorded by the VTR, and the recorded images are read out as edit material to edit programs.[0002]
In the case of handling live images such as sports and news, an editing system is required that can edit rapidly so as to provide the audience with exciting images having presence. In the conventional editing system, however, since a VTR is used as the recording medium, time is needed for head-search, fast-forwarding, and rewinding of the VTR, and the VTR must be controlled until immediately before going on the air; this causes a problem that the editing operation cannot be performed speedily. Moreover, a considerable time is required merely for dubbing video data, in which, for example, recorded video data is temporarily recorded to a material video tape as editing material, or the final video program completed as the result of editing is recorded to a broadcasting video tape; in addition, each piece of equipment such as a VTR must be operated for dubbing, which causes a problem that the operation becomes complicated.[0003]
Furthermore, in the conventional editing system, various devices are needed for editing in addition to the VTR: for example, a plurality of monitors to confirm recorded images and edited images. This causes a problem that the system structure becomes large. Additionally, since these various devices must be operated, there is a problem that operation becomes complicated.[0004]
As described above, the conventional editing system is not designed to perform editing work efficiently in the restricted environment at the scene, nor to handle materials requiring a real-time characteristic, such as sports broadcasting and news reports; it has therefore been insufficient in usability.[0005]
DISCLOSURE OF INVENTION
Considering the above points, the present invention provides an editing system capable of easily recording video data serving as material, the final video program generated as the result of editing, and the like on a predetermined recording medium, and having further improved usability.[0006]
To solve the above problems, in the present invention, an editing system for editing source video data is equipped with: a recording/reproducing device which has first recording/reproducing means for recording video data on a random-access first recording medium and simultaneously reproducing the video data recorded on the first recording medium, second recording/reproducing means for recording video data on a predetermined second recording medium and simultaneously reproducing the video data recorded on the second recording medium, selecting means for supplying the video data reproduced by one of said first and second recording/reproducing means to the other recording/reproducing means, and first control means for controlling the recording and reproducing operations of said first and second recording/reproducing means and controlling the selecting operation of said selecting means; and a computer which has user interface means for entering instruction information to record, by the other recording/reproducing means, said video data reproduced by one of said first and second recording/reproducing means, and second control means for making said recording/reproducing means perform the reproducing and recording operations corresponding to the instruction information by notifying said first control means of said recording/reproducing device of control information corresponding to said instruction information entered via said user interface means.[0007]
In this manner, by having the computer control the recording and reproducing operations of the first and second recording/reproducing means provided in the recording/reproducing device, video data recorded in one recording/reproducing means can be recorded by the other recording/reproducing means. Accordingly, merely by entering instruction information through the user interface means of the computer, source video data recorded on a second recording medium brought from the outside can easily be dubbed to a first recording medium as editing material, and the video data of the final video program generated as the result of editing can easily be dubbed from the first recording medium to, for example, a second recording medium used in broadcasting. Thus, dubbing of editing material and dubbing of the final video program can be performed efficiently compared with the conventional case, so that an editing system having further improved usability can be realized.[0008]
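As an illustrative sketch (not part of the patent text), the claimed arrangement can be modeled in a few lines: a computer-issued instruction makes the selecting means route data reproduced by one recording/reproducing means into the other. All class and method names below are hypothetical.

```python
class Recorder:
    """Models one recording/reproducing means with its own medium."""
    def __init__(self, name):
        self.name = name
        self.medium = []            # stands in for the recording medium

    def reproduce(self):
        return list(self.medium)    # reproduce everything recorded so far

    def record(self, frames):
        self.medium.extend(frames)

def dub(source, destination):
    """Selecting means: feed data reproduced by `source` into `destination`."""
    destination.record(source.reproduce())

# Dubbing editing material from a brought-in tape (VTR) to the disk array:
vtr = Recorder("VTR")
disk = Recorder("hard-disk array")
vtr.record(["frame0", "frame1"])
dub(vtr, disk)                      # disk.medium now holds both frames
```

The same `dub` call with the arguments swapped models dubbing the finished program from the disk array back to a broadcast tape.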
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a block diagram showing the general constitution of an editing system according to an embodiment of the present invention.[0009]
FIG. 2 is a block diagram showing the inner constitution of a computer forming an editing system.[0010]
FIG. 3 is a schematic diagram showing a graphical user interface (GUI) in a picture mode.[0011]
FIG. 4 is a schematic diagram showing a dubbing setting dialog displayed when a dubbing setting button is operated.[0012]
FIG. 5 is a schematic diagram showing graphic data displayed when in dubbing processing.[0013]
FIG. 6 is a schematic diagram showing a GUI in a time line mode.[0014]
FIG. 7 is a schematic diagram showing arrangement in a time-line displaying area.[0015]
FIG. 8 is a table illustrating first management record data.[0016]
FIG. 9 is a table illustrating second management record data for clip data.[0017]
FIG. 10 is a table illustrating the second management record data for event data and program data.[0018]
FIG. 11 is a table for explaining an index number, a clip number and an event number.[0019]
FIG. 12 is a schematic diagram showing an example of each displaying area.[0020]
FIGS. 13A to 13C are schematic diagrams illustrating a managing method by the first and the second management record data.[0021]
FIG. 14 is a block diagram showing the configuration of a hybrid recorder.[0022]
FIG. 15 is a schematic diagram showing arrangement in a reproducing speed setting area.[0023]
FIG. 16 is an exterior view of a dedicated controller.[0024]
FIG. 17 is a table for explaining a storing format of speed data.[0025]
FIG. 18 is a schematic diagram showing a queue-up setting picture.[0026]
FIG. 19 is a schematic diagram illustrating a preroll mode.[0027]
FIG. 20 is a schematic diagram illustrating a hierarchy structure for storing work data.[0028]
FIG. 21 is a flowchart illustrating the initial operation of a computer.[0029]
FIG. 22 is a flowchart illustrating a marking operation on the recording side.[0030]
FIGS. 23 and 24 are flowcharts illustrating a marking operation on the reproducing side.[0031]
FIG. 25 is a flowchart for explaining an event trimming operation.[0032]
FIGS. 26 and 27 are flowcharts for explaining the event trimming operation with a preroll function.[0033]
FIG. 28 is a flowchart for explaining the operation when arbitrary reproducing speed is set to an event.[0034]
FIG. 29 is a flowchart for explaining the operation when a video program is produced.[0035]
FIG. 30 is a flowchart for explaining the operation when the contents of dubbing processing are set.[0036]
FIG. 31 is a flowchart for explaining the operation when the dubbing processing is executed.[0037]
FIG. 32 is a flowchart for explaining a dubbing preparation performed in executing the dubbing processing.[0038]
BEST MODE FOR CARRYING OUT THE INVENTION
(1) General Constitution of Editing System[0039]
Referring to FIG. 1, reference numeral 1 generally shows an editing system according to the present invention. The editing system 1 includes an editing computer 2 and a hybrid recorder 3 for recording/reproducing source video data. The computer 2 is composed of the following devices: a main body 2a having a CPU, various processing circuits, a floppy-disk drive, and a hard-disk drive; a monitor 2b connected to the main body 2a; a keyboard 2c; a mouse 2d; and a dedicated controller 2e. In this computer 2, an application program for editing video data has been installed in advance on the hard-disk drive, so that the computer 2 works as an editing system by running the application program under the operating system.[0040]
The application program includes a graphical user interface (GUI) for generating control commands used in editing work. When the application program is running, a graphic for the GUI is displayed on the monitor 2b.[0041]
On the other hand, the hybrid recorder 3 is composed of a hard-disk array in which plural hard disks are connected in an array, and a VTR capable of recording and reproducing desired video data and audio data. In this embodiment, the VTR supports both an analog mode and a digital mode; therefore, the VTR can reproduce a video tape recorded in the analog mode as well as a video tape recorded in the digital mode, and furthermore can record desired video data and audio data in either the analog method or the digital method.[0042]
In this connection, in the hybrid recorder 3, the video data and the audio data reproduced by the VTR can also be dubbed to the hard-disk array, and the video data and the audio data reproduced by the hard-disk array can be dubbed to a video tape in the VTR. Furthermore, in the hybrid recorder 3, video signal V1 and audio signal A1 supplied from the outside can also be recorded to the hard-disk array or the VTR. In this case, since the hard-disk array can perform recording and reproducing at the same time, if the hard-disk array is selected as the recording/reproducing means, video data and audio data can be recorded in real time while the recorded video data and audio data are reproduced in real time.[0043]
Note that, of the video signal V3 and the audio signal A2 reproduced in real time from the hard-disk array of this hybrid recorder 3, the video signal V3 is supplied to the main body 2a of the computer 2. Furthermore, when recording the source video signal V1 inputted from the outside, the hybrid recorder 3 outputs the video signal V1 almost as it is, and the video signal V2 thus outputted (which is similar to the video signal V1 as a signal) is also supplied to the main body 2a of the computer 2.[0044]
In this connection, the video signal V1 inputted from the outside is a composite video signal obtained by shooting with a video camera or the like, and is used as editing material.[0045]
The video data reproduced by the VTR in the hybrid recorder 3 is video data obtained by reproducing, with the VTR, a video tape recorded at the scene, for example, and is also used as editing material.[0046]
Note that, if this editing system 1 is used in sports broadcasting or the like, the source video signal V1 is inputted to this editing system 1 mainly from the outside; the video signal V2, which is identical with the video signal V1, is displayed on the monitor 2b of the computer 2 while the source video signal V1 is recorded to the hybrid recorder 3; and if needed, the video signal V3 is reproduced from the hybrid recorder 3 to perform editing work.[0047]
The computer 2 and the hybrid recorder 3 are connected to each other by a communication cable 4 based on the RS-422 interface communication format, and a control command and its response command can be transmitted therethrough. In the RS-422 interface communication format, the control command and its response command can be transmitted/received simultaneously.[0048]
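The patent does not specify the command layout, but RS-422 remote-control protocols commonly used with broadcast equipment frame each command with a simple modulo-256 checksum so the receiver can detect corruption. The sketch below shows this general shape; the command bytes are hypothetical, not taken from the text.

```python
def frame_command(cmd_bytes):
    """Append a modulo-256 checksum over the command bytes."""
    checksum = sum(cmd_bytes) & 0xFF
    return bytes(cmd_bytes) + bytes([checksum])

# A hypothetical two-byte "play" command:
packet = frame_command([0x20, 0x01])
# packet == b'\x20\x01\x21' (0x20 + 0x01 = 0x21 checksum)
```

The framed packet would then be written to the serial link, and the recorder's response command checked the same way.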
Here, the operation of this editing system 1 will be described briefly. For example, when the video signal V1 supplied from the outside is used as editing material, the editing system 1 records the source video signal V1 to the hard-disk array in the hybrid recorder 3 in real time and outputs the source video signal V1 almost as it is as the video signal V2, which is supplied to the computer 2. The computer 2, receiving the video signal V2, displays a reduced picture corresponding to the video signal V2 on the monitor 2b to show the operator the contents of the editing material.[0049]
The operator who operates the computer 2 operates a pointing device such as the mouse 2d connected to the computer 2 while monitoring the video signal V2 displayed on the monitor 2b serving as display means, specifies editing points such as an in point (edit starting point) and an out point (edit stopping point), and produces a control command for editing using the GUI displayed on the monitor 2b. This control command is transmitted to the hybrid recorder 3 as an RS-422-based control command. Thereby, the reproducing operation of the hybrid recorder 3 is controlled, and the reproduced video signal V3 is displayed on the monitor 2b of the computer 2. Thus, in this editing system 1, editing work can be performed efficiently by reproducing the editing material while recording it.[0050]
On the other hand, when source video data recorded on a video tape brought from the outside is used as editing material, first the video tape is set in the VTR in the hybrid recorder 3, the video tape is reproduced, and the reproduced source video data is recorded to the hard-disk array in the hybrid recorder 3.[0051]
Thereafter, the operator reproduces the dubbed source video data from the hard-disk array, generates a desired editing command while monitoring the monitor 2b of the computer 2, and performs editing work in a manner similar to the above-mentioned editing in which the video signal V1 is the material.[0052]
As in the above, in the editing system 1, since the pointing device such as the mouse 2d is operated while monitoring the monitor 2b, editing work can be performed easily. Furthermore, since the hybrid recorder 3, capable of simultaneously performing a recording operation and a reproducing operation, is used, editing work can be performed in real time; thus, materials such as sports broadcasting and news reports can be edited without losing their real-time characteristic.[0053]
(2) Inner Constitution of Computer[0054]
In this chapter, the inner constitution of the computer 2 is described concretely. As shown in FIG. 2, the computer 2 has the following blocks: a system bus 5 for transmitting command data and video data; a CPU 10 for controlling the whole computer; first and second video processors 11 and 12 for conducting image processing on input video signals; a display controller 13 for managing the video data displayed on the monitor 2b and the GUI graphic display; an HDD interface 15 for controlling a local hard-disk drive (local HDD) 15a; an FDD interface 16 for controlling a floppy-disk drive (FDD) 16a; a pointing-device interface 17 for generating control commands based on commands supplied from pointing devices such as the mouse (cursor controlling device) 2d, the dedicated controller 2e, and the keyboard 2c; and an external interface 18 including a software driver for data communication with the hybrid recorder 3 according to the RS-422 communication format.[0055]
The system bus 5 is a bus for the communication of video data, command data, and address data within the computer 2, and is composed of a video data bus 5a for transmitting video data and a command data bus 5b for transmitting command data and the like.[0056]
The video data bus 5a is connected to the CPU 10, the first and second video processors 11 and 12, the display controller 13, the HDD interface 15, and the FDD interface 16, respectively. The first and second video processors 11 and 12, the display controller 13, the HDD interface 15, and the FDD interface 16 transmit video data via the video data bus 5a.[0057]
The command data bus 5b is connected to the CPU 10, the first and second video processors 11 and 12, the display controller 13, the HDD interface 15, the FDD interface 16, the pointing-device interface 17, and the external interface 18, respectively (that is, all blocks in the computer 2 are connected thereto). Command data and address data are transmitted via the command data bus 5b.[0058]
The CPU 10 is a block to control the whole computer 2. The CPU 10 includes a ROM 10a in which an operating system for the computer 2 has been stored, and a RAM 10b in which an uploaded application program or the like will be stored. When starting the computer 2, the CPU 10 executes a software program according to the operating system stored in the ROM 10a.[0059]
When executing an application program under the started operating system, the CPU 10 first reads the application program recorded on a hard disk in the hard-disk drive 15a and uploads it into the RAM 10b before executing it.[0060]
The first video processor 11 is a block for receiving the first video signal V2 supplied to the computer 2, conducting data conversion on it, and temporarily buffering the converted video data. The first video processor 11 is composed of the following blocks: a processor controller 11a for controlling the whole video processor 11; a data converting unit 11b for converting the received analog composite video signal V2 into digital component video data; and a frame memory 11c for temporarily storing several frames of video data supplied from the data converting unit 11b.[0061]
The processor controller 11a supplies a control signal to the data converting unit 11b to control the operation of the data converting unit 11b and to make it extract the time code from the composite video signal V2. Moreover, the processor controller 11a supplies a control signal to the frame memory 11c to control the read/write timing and the read/write address of the frame memory 11c. As to the read timing, the processor controller 11a controls the read timing of the frame memory 11c so that the time code sent to the display controller 13 corresponds to the video data (frame data).[0062]
The data converting unit 11b converts the analog composite video signal V2 into a component video signal based on the control signal from the processor controller 11a, and converts the analog component video signal into digital video data. Note that the time code is extracted when the analog component video signal is converted into digital video data. The digital-converted video data is supplied to the frame memory 11c, and the extracted time code is supplied to the processor controller 11a.[0063]
The time code is coded into two lines, 14H and 16H, or 12H and 14H, and inserted into the vertical blanking period of the composite video signal V2; this is called vertical interval time code (VITC). Therefore, in the case of extracting the time code from the composite video signal V2, the time code can be easily extracted simply by decoding the digital-converted time code in the vertical synchronizing period when the analog signal is converted into digital data. In this connection, this time code is added in the hybrid recorder 3 when the video signal V2 is outputted.[0064]
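VITC conventionally carries the hours, minutes, seconds, and frames fields as binary-coded decimal (BCD) digits, so once the bits have been recovered from the blanking lines, decoding reduces to BCD conversion. The sketch below illustrates only that last step, under the stated assumption; the real signal also interleaves sync and flag bits, which are omitted here.

```python
def bcd_to_int(b):
    """Convert one binary-coded-decimal byte (two digits) to an integer."""
    return (b >> 4) * 10 + (b & 0x0F)

def decode_timecode(hours_bcd, minutes_bcd, seconds_bcd, frames_bcd):
    """Format the four recovered time-code fields as HH:MM:SS:FF."""
    return "%02d:%02d:%02d:%02d" % (
        bcd_to_int(hours_bcd), bcd_to_int(minutes_bcd),
        bcd_to_int(seconds_bcd), bcd_to_int(frames_bcd))

# 0x12 hours, 0x34 minutes, 0x56 seconds, 0x14 frames:
print(decode_timecode(0x12, 0x34, 0x56, 0x14))  # 12:34:56:14
```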
The frame memory 11c temporarily stores the video data supplied from the data converting unit 11b. The read/write timing of the frame memory 11c is controlled by the processor controller 11a as described above. This frame memory 11c is composed of two 4-Mbyte frame memories. The video data to be stored in the frame memory 11c is 1520×960-pixel data, and such video data can be stored for two frames.[0065]
The 1520×960-pixel video data stored in the frame memory 11c is read out according to the control of the processor controller 11a. The video data read out is not the full 1520×960 pixels; its data quantity has been thinned out to 380×240 pixels. Here, "thinning out of data quantity" simply means reducing the sampling rate to one quarter when the video data is read out from the frame memory 11c, thereby reducing the data quantity of the video data to be read out. The 380×240-pixel video data thus read out is supplied to the display controller 13 via the video data bus 5a.[0066]
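The quarter-rate read-out described above can be sketched as taking every fourth sample in each direction, which reduces a 1520×960 frame to 380×240. Plain nested lists stand in for the frame memory here; this is an illustration of the arithmetic, not the hardware read-out itself.

```python
def thin_out(frame, factor=4):
    """Keep every `factor`-th pixel of every `factor`-th line."""
    return [row[::factor] for row in frame[::factor]]

# A dummy 1520-pixel-by-960-line frame (all zeros):
frame = [[0] * 1520 for _ in range(960)]
small = thin_out(frame)
# len(small) == 240 lines, each with len(small[0]) == 380 pixels
```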
The second video processor 12 is completely the same as the first video processor in structure. That is, the video processor 12 provides the following units: a processor controller 12a for controlling the whole video processor 12; a data converting unit 12b for converting the received analog composite video signal V3 into digital component video data; and a frame memory 12c for temporarily storing several frames of video data supplied from the data converting unit 12b. The first video processor 11 and the second video processor 12 differ in that the composite video signal V2 is supplied to the first video processor 11, while the composite video signal V3 is supplied to the second video processor 12.[0067]
Here, since the composite video signal V2 is a video signal in which a time code is superimposed on the vertical synchronizing period of the source video signal recorded to the hard-disk array in the hybrid recorder 3, it is temporally the same video signal as the source video signal recorded in real time. That is, the video data stored in the frame memory 11c is the same video data as the digitized source video signal to be recorded.[0068]
On the contrary, the composite video signal V3 is the source video signal reproduced from the hard-disk array of the hybrid recorder 3 by a command from the computer 2. Therefore, this composite video signal V3 is asynchronous with the source video signal to be recorded to the hard-disk array, and the two have no temporal relation to each other.[0069]
This respect is described in detail hereinafter. If the operator specifies the reproduction of source video data, the computer 2 outputs a reproducing command for the source video data to the hybrid recorder 3. The hybrid recorder 3 reproduces the source video data specified by the operator in response to the reproducing command from the computer 2. Furthermore, the hybrid recorder 3 stores a time code corresponding to the source video data in frame units, and reproduces the time code of the reproduced source video data on the basis of this correspondence relation.[0070]
Then the hybrid recorder 3 superimposes the reproduced time code on the vertical synchronizing period of the reproduced source video data, converts the source video data thus obtained into the analog composite video signal V3, and transmits it to the computer 2. As in the above, since the composite video signal V3 is the source video signal reproduced by a command from the operator, it is a temporally asynchronous signal with respect to the source video signal to be recorded in the hybrid recorder 3.[0071]
The composite video signal V3 supplied to the second video processor 12 is subjected to the specified signal processing via the data converting unit 12b and the frame memory 12c, similarly to the composite video signal V2 supplied to the first video processor 11, and is thereafter transmitted to the display controller 13 as 380×240-pixel digital video data.[0072]
The display controller 13 is a control block for controlling the data to be displayed on the monitor 2b. The display controller 13 has a memory controller 13a and a video random access memory (VRAM) 13b. The memory controller 13a controls the read/write timing of the VRAM 13b according to the inner synchronization of the computer 2. In the VRAM 13b, video data from the frame memory 11c of the first video processor 11, video data from the frame memory 12c of the second video processor 12, and image data from the CPU 10 are stored, each based on a timing control signal from the memory controller 13a. The image data stored in the VRAM 13b is read out based on a timing control signal from the memory controller 13a according to the inner synchronization of the computer, and is used for the GUI display. The image data sent from the CPU 10 to the VRAM 13b is image data of windows, cursors, scroll bars, etc. The graphic display for the GUI is obtained by displaying these plural kinds of image data on the monitor 2b.[0073]
The hard-disk interface 15 is an interface block for communicating with the local hard-disk drive (HDD) 15a provided in the computer 2. Communication between the hard-disk interface 15 and the hard-disk drive 15a is performed according to the small computer system interface (SCSI) transmission format.[0074]
In the hard-disk drive 15a, the application program to be started on the computer 2 has been installed. When the application program is executed, it is read out from the hard-disk drive 15a and uploaded into the RAM 10b of the CPU 10. When this application program is closed, a work data file formed by editing operation and stored in the RAM 10b is downloaded onto the hard disk via the hard-disk drive 15a.[0075]
The floppy-disk interface 16 is an interface block for communicating with the floppy-disk drive (FDD) 16a provided in the computer 2. Communication between the floppy-disk interface 16 and the floppy-disk drive 16a is performed according to the SCSI transmission format. Note that an edit decision list (EDL) showing the result of the editing operation, or the like, is stored to a floppy disk via the floppy-disk drive 16a.[0076]
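An EDL of the kind mentioned here is conventionally a plain-text list of events, each giving the source in/out and record in/out time codes of one cut. The sketch below writes one such line in a simplified layout; the exact column format the system uses is not stated in the text, so this layout is illustrative only.

```python
def edl_event(number, source_in, source_out, record_in, record_out):
    """One simplified EDL line: event number, reel, track, cut, time codes."""
    return "%03d  AX  V  C  %s %s %s %s" % (
        number, source_in, source_out, record_in, record_out)

line = edl_event(1, "00:00:10:00", "00:00:20:00",
                 "01:00:00:00", "01:00:10:00")
# "001  AX  V  C  00:00:10:00 00:00:20:00 01:00:00:00 01:00:10:00"
```

A whole program would simply be a sequence of such lines, one per registered event, written to the floppy disk as text.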
The pointing-device interface 17 is an interface block for receiving information from the mouse 2d, the dedicated controller 2e, and the keyboard 2c connected to the computer 2. The pointing-device interface 17 receives, e.g., detection information from a two-dimensional rotary encoder provided in the mouse 2d and click information from the left and right buttons provided on the mouse 2d, decodes this received information, and supplies the decoded data to the CPU 10. Similarly, the pointing-device interface 17 receives information from the dedicated controller 2e and the keyboard 2c, decodes it, and supplies it to the CPU 10.[0077]
The external interface 18 is a block for communicating with the hybrid recorder 3 externally connected to the computer 2. The external interface 18 has an RS-422 driver for transforming command data generated in the CPU 10 into the RS-422 communication protocol, and outputs control commands such as a reproducing command to the hybrid recorder 3 via the RS-422 driver.[0078]
(3) Graphic Display for GUI[0079]
(3-1) Picture Mode[0080]
In the editing system 1, two types of modes are prepared as graphic displays for the GUI: one is a picture mode in which a program is edited by rearranging events while monitoring the screens of the in point and out point of each registered event; the other is a time-line mode in which the length of the program can be adjusted while monitoring the temporal length of the registered events. These two modes can easily be switched by clicking a mode button, which will be described later. Thereby, the operator can select whichever GUI is better suited to the purpose of editing, and thus the usability in editing work can be improved.[0081]
In this chapter, the picture mode will be described first. In the case of the picture mode, the graphic display shown in FIG. 3 is displayed on the monitor 2b. As shown in FIG. 3, the graphic display in the picture mode is divided into the following ten areas: a recording video displaying area 21; a timing displaying area 22; a reproducing video displaying area 23; a recording video marking area 24; a reproducing speed setting area 25; a recycle-box area 26; a reproducing video marking area 27; a clip displaying area 28; an event displaying area 29; and a program displaying area 30.[0082]
The recording video displaying area 21 has a recording video screen 21a, a recording-start-point displaying area 21b, a residual-time-of-memory-capacity displaying area 21c, and an on-recording displaying area 21d.[0083]
The video signal displayed on the recording video screen 21a is a signal obtained from the composite video signal V2 sent from the hybrid recorder 3, and its image size has been changed to 380×240 pixels by thinning out when supplied from the frame memory 11c to the VRAM 13b.[0084]
In the recording-start-point displaying area 21b, the time code showing when the recording of the video signal displayed on the recording video screen 21a was started by the hybrid recorder 3 will be displayed.[0085]
In the residual-time-of-memory-capacity displaying area 21c, the residual time of the memory capacity of the hybrid recorder 3 will be displayed. Because the total memory capacity of the hybrid recorder 3 is known in advance, the residual time displayed here can easily be obtained by subtracting the elapsed recording time (the present time minus the time at the beginning of recording) from the recordable time of the hybrid recorder 3.[0086]
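The residual-time computation described above can be sketched in one line: subtract the elapsed recording time from the recorder's total recordable time. All values are in seconds and the function name is hypothetical.

```python
def residual_time(recordable_total, record_start, now):
    """Remaining recordable time = total - (now - start of recording)."""
    return recordable_total - (now - record_start)

# 2 hours recordable, recording started 25 minutes ago:
remaining = residual_time(2 * 3600, record_start=0, now=25 * 60)
# remaining == 5700 seconds, i.e. 1 hour 35 minutes
```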
In the on-recording displaying area 21d, the letters "REC" will be displayed. This shows that the video signal displayed on the recording video screen 21a is being recorded.[0087]
The timing displaying area 22 has the following areas: a one-minute-clock displaying area 22a; a time displaying area 22b; an input-video-signal's time-code displaying area 22c; a reproducing video signal's time-code displaying area 22d; an on-air displaying area 22e; a mode button 22f; a preroll button 22g; and a reproducing speed setting button 22h, namely a dynamic motion controller (DMC) button 22h.[0088]
The one-minute-clock displaying area 22a is an area used to count one minute (or three minutes, depending on the menu setting) in units of seconds and to display the count visually. As the counting proceeds, the color of the display part provided along the circumference of this one-minute-clock displaying area 22a changes sequentially every second, so that the operator can easily and visually grasp the passage of time. One minute is counted using the one-minute-clock displaying area 22a, for example, from the specifying of an in point until an out point is specified on the input video side or the reproducing video side, or, in the case of previewing a produced program, from the beginning of the preview.[0089]
In the time displaying area 22b, the present time will be displayed. In the recording video signal's time-code displaying area 22c, the time code of the video signal displayed in the recording video displaying area 21 will be displayed. This time code is the time code extracted from the vertical synchronizing period of the composite video signal V2 by the processor controller 11a in the first video processor 11.[0090]
In the reproducing video signal's time-code displaying area 22d, the time code of the video signal displayed in the reproducing video displaying area 23 will be displayed. This time code is the time code extracted from the vertical synchronizing period of the composite video signal V3 by the processor controller 12a in the second video processor 12.[0091]
The on-air displaying area 22e will be used to show whether the system is on the air or not. If a tally signal showing the on-air state is supplied from an external instrument, the color of the displaying area changes to red. This tally signal showing the on-air state is supplied while the composite video signal V3 supplied from the hybrid recorder 3 is being put on the air. Since the display color of the on-air displaying area 22e changes in synchronization with the on-air state, the operator can easily and visually grasp that the system is on the air.[0092]
The mode button 22f will be used to switch between the picture mode shown in FIG. 3 and the time-line mode which will be described later. By clicking the mode button 22f with the mouse 2d, the switching of the mode can be specified, and thus the display mode can be switched between the picture mode and the time-line mode.[0093]
The preroll button 22g will be used to set a preroll mode. The reproducing speed setting button (DMC) 22h will be used to set the reproducing speed of the selected event. The details of these two buttons will be described later.[0094]
Furthermore, a dubbing button 22i is a button used when video data is dubbed from the VTR in the hybrid recorder 3 to the hard-disk array, or when video data is dubbed from the hard-disk array in the hybrid recorder 3 to the VTR. If, after this dubbing button 22i is clicked, a recording starting button 31a which will be described later is clicked, dubbing processing is executed.[0095]
The reproducing video displaying area 23 has a reproducing video screen 23a, a shuttle button 23b, a jog button 23c, and a reproducing state displaying area 23d.[0096]
The video signal to be displayed on the reproducing[0097]video screen23ais a signal obtained from the composite video signal V3 reproduced by thehybrid recorder3, and its image size has changed to 380×240 pixels by thinning-out process when the signal was supplied from theframe memory12cto theVRAM13b.
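The thinning-out process described above can be sketched as a simple nearest-neighbor pixel-skipping reduction. This is only an illustrative sketch: the function name `thin_out` and the row-major pixel representation are assumptions, not the actual implementation of the frame memory 12c.

```python
def thin_out(frame, dst_w=380, dst_h=240):
    """Reduce a frame (a list of pixel rows) to dst_w x dst_h by skipping pixels."""
    src_h, src_w = len(frame), len(frame[0])
    xs = [x * src_w // dst_w for x in range(dst_w)]   # source columns to keep
    ys = [y * src_h // dst_h for y in range(dst_h)]   # source rows to keep
    return [[frame[y][x] for x in xs] for y in ys]
```

The same routine, with a 95×60 target size, would cover the clipped-image reduction described later.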
The shuttle button 23b is used if the operator wants to fast-forward (i.e., shuttle-forward) the video data reproduced from the hybrid recorder 3 and displayed on the reproducing video screen 23a. If the operator specifies the shuttle button 23b by operating the mouse 2d and drags it in the direction in which he wants to fast-forward the video data, the reproducing operation of the hybrid recorder 3 can be controlled by the dragging operation.[0098]
The jog button 23c is used if the operator wants to forward, frame by frame, the video data reproduced from the hybrid recorder 3 and displayed on the reproducing video screen 23a. In that case, the operator clicks, with the mouse 2d, the button of the jog button 23c showing the direction in which he wants to advance. Thereby, the reproducing video data is forwarded frame by frame according to the click operation.[0099]
In the reproducing state displaying area 23d, the letters "PLAY" or "STILL" are displayed according to the state of the video data displayed on the reproducing video screen 23a. More precisely, if the video data displayed on the reproducing video screen 23a is a dynamic image reproduced from the hybrid recorder 3, the letters "PLAY" are displayed, while if it is a still image, the letters "STILL" are displayed.[0100]
The recording video marking area 24 is used to mark the clipped image data of an in point or an out point from the video data displayed on the recording video screen 21a. Here "marking" means specifying or setting the in point or the out point, and "clipped image" means a still image. The recording video marking area 24 is divided into the following areas: an in-clip displaying area 24a; an in-point's time-code displaying area 24b; a mark-in button 24c; an out-clip displaying area 24d; an out-point's time-code displaying area 24e; and a mark-out button 24f.[0101]
The in-clip displaying area 24a is used to display the clipped image data which has been marked as the in point by the operator by clicking the mark-in button 24c. The clipped image data to be displayed in the in-clip displaying area 24a is image data obtained from the composite video signal V2 which is supplied from the hybrid recorder 3, and its image size has been thinned out to 95×60 pixels.[0102]
In the time-code displaying area 24b, the time code of the clipped image data displayed in the in-clip displaying area 24a is displayed. This time code was extracted from the composite video signal V2 by the processor controller 11a in the first video processor 11 when the operator marked the in point by clicking the mark-in button 24c.[0103]
The mark-in button 24c is used to mark an in point. The operator clicks the mark-in button 24c while monitoring the video data displayed on the recording video screen 21a. If the mark-in button 24c is clicked, (95×60-pixel) clipped image data which corresponds to the video data displayed on the recording video screen 21a is generated and displayed in the in-clip displaying area 24a.[0104]
The out-clip displaying area 24d is used to display the out-point clipped image data which has been marked by the operator by clicking the mark-out button 24f. The clipped image data to be displayed in the out-clip displaying area 24d is image data obtained from the composite video signal V2 which is supplied from the hybrid recorder 3, and its image size has been thinned out to 95×60 pixels.[0105]
In the time-code displaying area 24e, the time code of the clipped image data displayed in the out-clip displaying area 24d is displayed. This time code was extracted from the composite video signal V2 by the processor controller 11a in the first video processor 11 when the operator marked the out point by clicking the mark-out button 24f.[0106]
The mark-out button 24f is used to mark an out point. The operator clicks the mark-out button 24f while monitoring the video data displayed on the recording video screen 21a. If the mark-out button 24f is clicked, (95×60-pixel) clipped image data which corresponds to the video data displayed on the recording video screen 21a is generated and displayed in the out-clip displaying area 24d.[0107]
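The marking behavior described above can be modeled in a few lines. This is a hedged sketch, assuming a simple in-memory representation; the class `MarkingArea` and its fields are illustrative names, not part of the described system.

```python
class MarkingArea:
    """Illustrative model of the recording video marking area 24."""
    CLIP_SIZE = (95, 60)  # thinned-out clipped-image size given in the text

    def __init__(self):
        self.in_clip = None   # shown in the in-clip displaying area 24a
        self.out_clip = None  # shown in the out-clip displaying area 24d

    def _make_clip(self, frame, time_code):
        # a clipped image is a still image plus the time code extracted at marking
        return {"frame": frame, "time_code": time_code, "size": self.CLIP_SIZE}

    def mark_in(self, frame, time_code):
        self.in_clip = self._make_clip(frame, time_code)

    def mark_out(self, frame, time_code):
        self.out_clip = self._make_clip(frame, time_code)
```

The reproducing video marking area 27 behaves identically, with the composite video signal V3 as its source.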
The reproducing speed setting area 25 is used to set the reproducing speed of the selected event. The operator sets the reproducing speed while monitoring the data displayed therein. The details of the reproducing speed setting area 25 will be described later.[0108]
The recycle box 26 is used to clear generated clipped image data. If clipped image data is specified with the mouse 2d and dragged to the recycle box 26, it is cleared. To restore clipped image data cleared in this way, the recycle box 26 is clicked, whereupon all of the clipped image data stored in the recycle box 26 is displayed; if the clipped image data to be restored is clicked among them, the specified clipped image data is restored.[0109]
The reproducing video marking area 27 is used to mark in-point clipped image data or out-point clipped image data from the video data displayed on the reproducing video screen 23a. The reproducing video marking area 27 is divided into the following areas: an in-clip displaying area 27a; an in-point's time-code displaying area 27b; a mark-in button 27c; an out-clip displaying area 27d; an out-point's time-code displaying area 27e; and a mark-out button 27f.[0110]
The in-clip displaying area 27a is used to display the clipped image data which has been marked as the in point by the operator by clicking the mark-in button 27c. The clipped image data to be displayed in the in-clip displaying area 27a is obtained from the composite video signal V3 which is supplied from the hybrid recorder 3, and its image size has been thinned out to 95×60 pixels.[0111]
In the time-code displaying area 27b, the time code of the clipped image data displayed in the in-clip displaying area 27a is displayed. This time code was extracted from the composite video signal V3 by the processor controller 12a in the second video processor 12 when the operator marked the in point by clicking the mark-in button 27c.[0112]
The mark-in button 27c is used to mark an in point. The operator clicks the mark-in button 27c while monitoring the video data displayed on the reproducing video screen 23a. If the mark-in button 27c is clicked, (95×60-pixel) clipped image data which corresponds to the video data displayed on the reproducing video screen 23a is generated and displayed in the in-clip displaying area 27a.[0113]
The out-clip displaying area 27d is used to display the out-point clipped image data which has been marked by the operator by clicking the mark-out button 27f. The clipped image data to be displayed in the out-clip displaying area 27d is image data obtained from the composite video signal V3 which is supplied from the hybrid recorder 3, and its image size has been thinned out to 95×60 pixels.[0114]
In the time-code displaying area 27e, the time code of the clipped image data displayed in the out-clip displaying area 27d is displayed. This time code was extracted from the composite video signal V3 by the processor controller 12a in the second video processor 12 when the operator marked the out point by clicking the mark-out button 27f.[0115]
The mark-out button 27f is used to mark an out point. The operator clicks the mark-out button 27f while monitoring the video data displayed on the reproducing video screen 23a. If the mark-out button 27f is clicked, (95×60-pixel) clipped image data which corresponds to the video data displayed on the reproducing video screen 23a is generated and displayed in the out-clip displaying area 27d.[0116]
The clip displaying area 28 is used to display the clipped image data which has been marked by clicking the mark-in button 24c or the mark-out button 24f provided in the recording video marking area 24, and also the clipped image data which has been marked by clicking the mark-in button 27c or the mark-out button 27f provided in the reproducing video marking area 27. Note that the clipped image data to be displayed in the clip displaying area 28 is clipped image data which has not been used as the in point or the out point of an event; the clipped image data used as the in point or the out point of an event is displayed in the event displaying area 29. The clip displaying area 28 has the following areas: a clipped-image-data displaying area 28a; a time-code displaying area 28b; a clip-type displaying area 28c; a clip-number displaying area 28d; a forwarding button 28e; and a reverse button 28f.[0117]
The clipped-image-data displaying area 28a displays the clipped image data moved from any one of the in-clip displaying area 24a or the out-clip displaying area 24d on the recording side, and the in-clip displaying area 27a or the out-clip displaying area 27d on the reproducing side; the image size is 95×60 pixels.[0118]
In the time-code displaying area 28b, the time code of the clipped image data displayed in the clipped-image-data displaying area 28a is displayed. If the clipped image data is moved from any one of the following areas: the in-clip displaying area 24a; the out-clip displaying area 24d; the in-clip displaying area 27a; or the out-clip displaying area 27d, its time code is moved together with the clipped image data to the clipped-image-data displaying area 28a.[0119]
In the clip-type displaying area 28c, data showing whether the clipped image data displayed in the clipped-image-data displaying area 28a is in-point or out-point clipped image data is displayed. For instance, if the clipped image data displayed in the clipped-image-data displaying area 28a is the clipped image data obtained from the in-clip displaying area 24a, the red letters "IN" are displayed, while if it is the clipped image data from the out-clip displaying area 24d, the red letters "OUT" are displayed. Besides, if it is the clipped image data from the in-clip displaying area 27a, the blue letters "IN" are displayed, and if it is the clipped image data from the out-clip displaying area 27d, the blue letters "OUT" are displayed.[0120]
In the clip-number displaying area 28d, the clip number which has been added to the clipped image data displayed in the clipped-image-data displaying area 28a is displayed. A clip number is automatically added to each piece of clipped image data in the marking order.[0121]
The forwarding button 28e is used to advance the display of the clipped image data in the clip displaying area 28, and the reverse button 28f is used to reverse it. Since the size of the clip displaying area is limited, if a large amount of clipped image data has been generated, all of the clipped image data cannot be displayed at once. In such a case, by operating the forwarding button 28e or the reverse button 28f to advance or reverse the clipped image data, all of the clipped image data can be displayed on the monitor.[0122]
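The forwarding and reverse buttons amount to paging a list of clips through a fixed-size window. A minimal sketch, assuming a one-clip scroll step and a hypothetical window size (the number of clips visible at once is not stated in the text):

```python
class ClipDisplay:
    """Illustrative paging model for the clip displaying area 28."""
    def __init__(self, visible=8):        # window size is an assumption
        self.clips, self.offset, self.visible = [], 0, visible

    def add(self, clip):
        self.clips.append(clip)           # clip numbers follow the marking order

    def forward(self):                    # forwarding button 28e
        if self.offset + self.visible < len(self.clips):
            self.offset += 1

    def reverse(self):                    # reverse button 28f
        if self.offset > 0:
            self.offset -= 1

    def shown(self):
        return self.clips[self.offset:self.offset + self.visible]
```

The forwarding and reverse buttons 29e/29f of the event displaying area 29 and 42c/42d of the program-view area work on the same principle.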
The event displaying area 29 is used to display the clipped image data of an event which has been generated by sequentially clicking the mark-in button 24c and the mark-out button 24f provided in the recording video marking area 24, and also the clipped image data of an event which has been generated by sequentially clicking the mark-in button 27c and the mark-out button 27f provided in the reproducing video marking area 27. Either the in-point clipped image data or the out-point clipped image data is displayed per event. The event displaying area 29 has the following units, similarly to the clip displaying area 28: a clipped-image-data displaying area 29a; a time-code displaying area 29b; a clip-type displaying area 29c; an event-number displaying area 29d; a forwarding button 29e; and a reverse button 29f, and in addition has an event-title displaying area 29g.[0123]
In the clip-type displaying area 29c, data showing whether the clipped image data of the event displayed in the clipped-image-data displaying area 29a is that of an in point or an out point is displayed. If the clipped image data of the in point is displayed as the clipped image data of the event, the letters "IN" are displayed in the clip-type displaying area 29c. If the out-point clipped image data is to be displayed instead of the in-point clipped image data, the operator may click this clip-type displaying area 29c, whereupon the out-point clipped image data is displayed. Thereafter, the display is switched between the in-point clipped image data and the out-point clipped image data on every click of the clip-type displaying area 29c.[0124]
In the event-number displaying area 29d, the event number added to each generated event is displayed. This event number is automatically added to each event in the generation order and has no relation to the clip number.[0125]
In the event-title displaying area 29g, the title added to the event is displayed in letters. Note that this title can be registered for every event by means of a title menu.[0126]
The program displaying area 30 is used to produce a program by copying the events displayed in the event displaying area 29, and a copy of the clipped image data of the events displayed in the event displaying area 29 is displayed in it. In the case of producing a program by rearranging events, the clipped image data of an event displayed in the event displaying area 29 is first dragged and copied to the program displaying area 30. Thereby, a program can be produced by freely rearranging the events displayed in the event displaying area 29. At this time, if the clipped image data of an event displayed in the program displaying area 30 is dragged and moved to another position within the program displaying area 30, the event can be freely rearranged within the program displaying area 30; in this case the event is not copied but moved.[0127]
This program displaying area 30 has the following units, similarly to the event displaying area 29: a clipped-image-data displaying area 30a; a time-code displaying area 30b; a clip-type displaying area 30c; an event-number displaying area 30d; a forwarding button 30e; a reverse button 30f; and an event-title displaying area 30g. Note that the description of these units is omitted since they are the same as the units in the event displaying area 29.[0128]
A VTR remote button 36 is a button for opening a panel for VTR control, for operating the VTR provided in the hybrid recorder 3. If this VTR remote button 36 is clicked, the CPU 10 detects that the VTR remote button 36 has been pushed, and displays the panel for VTR control at a predetermined position on the GUI displayed on the monitor 2b. Therefore, the operator can perform a desired operation on the VTR by clicking various command buttons displayed in this panel for VTR control.[0129]
A hard-disk remote button 37 is a button for opening a panel for hard-disk control, for operating the hard-disk array provided in the hybrid recorder 3. If this hard-disk remote button 37 is clicked, the CPU 10 detects that the hard-disk remote button 37 has been pushed, and displays the panel for hard-disk control at a predetermined position on the GUI displayed on the monitor 2b. Therefore, the operator can perform a desired operation on the hard-disk array by clicking various command buttons displayed in this panel for hard-disk control.[0130]
A dubbing setting button 38 is a button for opening a dialog (window) for dubbing setting, for setting dubbing processing from the VTR in the hybrid recorder 3 to the hard-disk array or dubbing processing from the hard-disk array to the VTR. If this dubbing setting button 38 is clicked, the CPU 10 detects that the dubbing setting button 38 has been pushed, and displays the dialog for dubbing setting such as shown in FIG. 4 at a predetermined position on the GUI displayed on the monitor 2b.[0131]
In the dubbing setting dialog 38A, as shown in FIG. 4, the following are displayed: a tape-to-disk button 38B for setting an object to be dubbed from the VTR to the hard-disk array; a disk-to-tape button 38C for setting an object to be dubbed from the hard-disk array to the VTR; a high-speed dubbing button 38D for setting high-speed dubbing; a program dubbing button 38E for dubbing a produced program; a determination button 38F for determining the contents of the setting; and a cancel button 38G for canceling the contents of the setting.[0132]
The tape-to-disk button 38B is used to set a dubbing mode in which video data is read out from a video tape loaded in the VTR and recorded to the hard-disk array. If the cursor 38H is moved onto the tape-to-disk button 38B and a clicking operation is performed, the CPU 10 detects that the button 38B has been pushed, sets the VTR as the readout device of the video data, and sets the hard-disk array as the recording device of the video data.[0133]
The high-speed dubbing button 38D is used when the dubbing processing from a video tape to the hard disk is to be performed by high-speed (e.g., quadruple-speed) dubbing. To set high-speed dubbing, the cursor 38H is moved onto the high-speed dubbing button 38D and a clicking operation is performed. Note that, to perform high-speed dubbing, the video data recorded on the video tape must be compressively coded: at the time of high-speed dubbing, the high speed is realized by recording the compressively coded video data to the hard-disk array while it remains in its compressively coded state. Therefore, if the video data recorded on the video tape is analog video data, high-speed dubbing cannot be performed, and naturally only the tape-to-disk button 38B is pushed.[0134]
The disk-to-tape button 38C is a button for setting the dubbing mode in which the video data recorded in the hard-disk array is read out and recorded to a video tape loaded in the VTR. If the cursor 38H is moved onto the disk-to-tape button 38C and a clicking operation is performed, the CPU 10 detects that the button 38C has been pushed, sets the hard-disk array as the readout device of the video data, and sets the VTR as the recording device of the video data.[0135]
The program dubbing button 38E is used when a program produced in the program displaying area 30 is to be dubbed. If, after the disk-to-tape button 38C is clicked, this program dubbing button 38E is clicked, the CPU 10 sets the video data to be dubbed to the program produced in the program displaying area 30.[0136]
Note that if the tape-to-disk button 38B, the disk-to-tape button 38C, the high-speed dubbing button 38D or the program dubbing button 38E is clicked once, its contents are set, and if it is clicked again, its contents are canceled.[0137]
The determination button 38F is operated when the dubbing conditions set by the tape-to-disk button 38B, the disk-to-tape button 38C, the high-speed dubbing button 38D and the program dubbing button 38E are to be determined. If this determination button 38F is clicked, the CPU 10 detects this, determines all of the setting contents, and closes this dubbing setting dialog 38A (that is, makes the display disappear).[0138]
The cancel button 38G is operated when all of the dubbing conditions set by the tape-to-disk button 38B, the disk-to-tape button 38C, the high-speed dubbing button 38D and the program dubbing button 38E are to be canceled. If this cancel button 38G is clicked, the CPU 10 detects this, cancels all of the setting contents, and closes this dubbing setting dialog 38A.[0139]
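The toggle-and-determine behavior of the dialog can be summarized in code. This is a sketch under stated assumptions: the class and method names are invented for illustration, and only the constraints named in the text (click-again-to-cancel, high-speed dubbing requiring a compressively coded source) are modeled.

```python
class DubbingDialog:
    """Illustrative state of the dubbing setting dialog 38A."""
    def __init__(self):
        self.direction = None     # "tape_to_disk" (38B) or "disk_to_tape" (38C)
        self.high_speed = False   # high-speed dubbing button 38D

    def click_tape_to_disk(self):
        # one click sets the mode, a second click cancels it
        self.direction = None if self.direction == "tape_to_disk" else "tape_to_disk"

    def click_disk_to_tape(self):
        self.direction = None if self.direction == "disk_to_tape" else "disk_to_tape"

    def click_high_speed(self, source_compressed):
        # high-speed dubbing requires compressively coded source video
        if source_compressed:
            self.high_speed = not self.high_speed

    def determine(self):
        # determination button 38F: freeze and return the settings
        return {"direction": self.direction, "high_speed": self.high_speed}
```

With an analog tape (an uncompressed source), the high-speed setting is simply refused, matching the constraint stated for button 38D.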
A recording-start button 31a and a recording-stop button 31b are used to supply the control commands of recording start and recording stop to the hybrid recorder 3. If the recording-start button 31a is clicked, the CPU 10 detects that the recording-start button 31a was pushed, and instructs the external interface 18 to supply the recording-start command. The external interface 18 receives this instruction and sends a recording-start command (REC START command), which has been defined by RS-422, to the hybrid recorder 3. The hybrid recorder 3 starts the recording operation in response to the received recording-start command.[0140]
Note that if this recording-start button 31a is clicked in the state where the various setting contents of dubbing processing have been registered and the dubbing button 22i has been pushed, dubbing processing based on the dubbing setting contents is executed. In this connection, while the dubbing processing is executed, the video data being dubbed is displayed on the recording video screen 21a, and a graphical display showing that dubbing processing is under way is displayed on the reproducing video screen 23a as shown in FIG. 5. That is, as shown in FIG. 5, the characters "DUBBING" and the characters showing the contents of the dubbing, "Disk to Tape" and "Normal Dubbing", are displayed.[0141]
On the contrary, if the recording-stop button 31b is clicked, the CPU 10 detects that the recording-stop button 31b was pushed, and instructs the external interface 18 to supply the recording-stop command. The external interface 18 receives this instruction and sends a recording-stop command (REC STOP command), which has been defined by RS-422, to the hybrid recorder 3. The hybrid recorder 3 stops the recording operation in response to the received recording-stop command.[0142]
Note that if this recording-stop button 31b is clicked during the dubbing processing from the VTR to the hard-disk array or from the hard-disk array to the VTR, the dubbing processing is stopped.[0143]
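The button-to-command flow (CPU 10 → external interface 18 → hybrid recorder 3) can be sketched as a small dispatcher. The RS-422 wire format is not given in the text, so the commands are represented only by the names the text uses; `send_command` and the list-based interface are illustrative assumptions.

```python
def send_command(interface, button):
    """Translate a GUI button press into the RS-422 command named in the text."""
    commands = {
        "recording_start": "REC START",   # recording-start button 31a
        "recording_stop": "REC STOP",     # recording-stop button 31b
        "preview": "PLAY START",          # preview button 32, described below
    }
    cmd = commands[button]
    interface.append(cmd)  # stand-in for transmission to the hybrid recorder 3
    return cmd
```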
The preview button 32 is used to preview the selected event or program, i.e., to confirm its contents. If an event or a program is specified, the clipped image data of the specified event or program is displayed on the reproducing video screen 23a as a still image. At this time, if the preview button 32 is clicked, the CPU 10 detects that the preview button 32 was pushed, and instructs the external interface 18 to supply a reproducing-start command. The external interface 18 receives this instruction and sends a reproducing-start command (PLAY START command), which has been defined by RS-422, to the hybrid recorder 3. In response to the received reproducing-start command, the hybrid recorder 3 starts the reproduction of the specified event or program in the composite video signal V3 from the hard-disk array (or the VTR).[0144]
A new-event button 33 is used to newly generate an event. In the case of registering an event whose in point and out point have been changed from those of the event specified by the operator as another, new event, the new-event button 33 is clicked.[0145]
A replace button 34 is used if the operator wants to change the in point and the out point of the selected event. In the case of replacing the specified event itself, rather than registering another new event, with an event whose in point and out point have been changed, the replace button 34 is clicked.[0146]
A delete button 35 is used to clear the selected event or program. The cleared event or program is stored in the recycle box 26.[0147]
(3-2) Time-Line Mode[0148]
Secondly in this chapter, the time-line mode will be described. In the time-line mode, the graphic display shown in FIG. 6 is displayed on the monitor 2b. As shown in FIG. 6, the graphic display of the time-line mode is divided into eleven areas: a recording video displaying area 21; a timing displaying area 22; a reproducing video displaying area 23; a recording video marking area 24; a reproducing speed setting area 25; a recycle-box area 26; a reproducing video marking area 27; an event displaying area 29; a time-line displaying area 40; an edit-tool displaying area 41; and a program-view area 42.[0149]
Note that the recording video displaying area 21, timing displaying area 22, reproducing video displaying area 23, recording video marking area 24, reproducing speed setting area 25, recycle-box area 26, reproducing video marking area 27, event displaying area 29 and the various control command buttons in the lower stage are the same as the areas in the picture mode shown in FIG. 3.[0150]
The time-line displaying area 40 can be used to edit a program while confirming the temporal length of each event. As shown in FIG. 7, the time-line displaying area has the following areas: a time-scale displaying area 40a; an action displaying area 40b; a GPI area 40c; a video-edit area 40d; first and second audio-edit areas 40e and 40f; scroll buttons 40g and 40h; and an edit bar 40i.[0151]
In the time-scale displaying area 40a, a time scale is displayed. The temporal length of each event is shown based on this time scale. The time scale is shown in frame units, and the minimum unit of the scale can be set to any number of frames by the user's setting.[0152]
The action displaying area 40b is used to specify the position at which the operation stops when a program or event is previewed or sent out. The stop position can be set to an arbitrary position regardless of the in point and the out point. Furthermore, a stop flag 40ba is displayed at the specified position as shown in FIG. 7, so that the operator can easily confirm the position he specified. By specifying the stop position in the action displaying area 40b as described above, a program or event of an arbitrary period can be previewed or sent out. Note that, to make the stop position specified in the action displaying area 40b effective, the action button 40bc should be clicked.[0153]
The GPI area 40c is used to specify the output point of the control commands of a general purpose interface (GPI: a general purpose interface for controlling an external instrument by sending out control commands from an editing system). The GPI output point can be set to any position regardless of the in point and the out point. A mark 40ca is displayed at the position of the GPI output point, so that the operator can easily confirm the position he specified. Since the GPI output point is specified in the GPI area 40c as described above, the external instrument can be controlled by supplying the control command at the specified output point. Note that, to make the GPI output point specified in the GPI area 40c effective, the GPI button 40cb should be clicked.[0154]
The video-edit area 40d is used to edit a program by rearranging events dragged from the event displaying area 29, etc. An event displayed in the video-edit area 40d is an event dragged from the event displaying area 29, or called by a program calling button in the program-view area 42 which will be described later and displayed in the program displaying area 30 in the picture mode. In this video-edit area 40d, the clipped image data of the in point and the out point of the event is not displayed as in the picture mode; instead, the event number and the title added to the event are displayed. However, the size of the display area of each event differs according to the length of the event, so that the length of the event can be visually confirmed by comparing it with the time scale in the time-scale displaying area 40a. Moreover, since the length of each event can be visually confirmed, the length of the entire edited program can also be visually confirmed; consequently, whether the edited program is within the desired length or not can be easily confirmed.[0155]
Besides, in the video-edit area 40d, each event can be moved to an arbitrary position and any event can be cut into another event, so that a program in the desired order can be produced by rearranging events into any order. In this connection, when an event is moved or cut in, the events are linked without generating a space.[0156]
The moving position or cutting-in position of an event is specified by the edit bar 40i, which is the reference position mark. The edit bar 40i is fixedly displayed at almost the center of the screen. In the case of specifying the moving position or cutting-in position, the proposed position is adjusted to the edit bar 40i by scrolling the event display, and the position is thereby specified as the moving position or cutting-in position.[0157]
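The move and cut-in operations keep the events contiguous, which amounts to removing an event from the sequence and reinserting it at the position marked by the edit bar. A minimal sketch; the function name and the list representation of the event sequence are assumptions:

```python
def cut_into(events, src, dst):
    """Move events[src] so it lands at index dst; no gap is left behind."""
    ev = events.pop(src)      # removing the event closes the gap automatically
    events.insert(dst, ev)    # inserting pushes the later events back
    return events
```

Because the sequence is always rebuilt by pop and insert, events stay linked without generating a space, as the text requires.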
Note that, to operate the video-edit area 40d, the video button 40db is clicked; the video-edit area 40d then becomes operable.[0158]
The first and second audio-edit areas 40e and 40f are used to edit the audio data of each event. To fetch audio data into the first and second audio-edit areas 40e and 40f, the audio buttons 40ea and 40fa are clicked and an event is dragged from the event displaying area 29; the audio data of the event is thereby fetched.[0159]
Note that, for the fetched audio data, the event number and the title added to the event are displayed.[0160]
Also in the first and second audio-edit areas 40e and 40f, the audio data of each event can be moved to any position, and the audio data of an arbitrary event can be cut into the audio data of another event, similarly to the video-edit area 40d. At this time, the position can be specified by adjusting the proposed moving or cutting-in position to the edit bar 40i by scrolling the audio data in a similar manner.[0161]
The first audio-edit area 40e and the second audio-edit area 40f differ only in their stereo output: the former corresponds to the right side and the latter to the left side.[0162]
The scroll buttons 40g and 40h are operated in the case of scrolling the range from the action displaying area 40b to the second audio-edit area 40f, as a whole, rightward or leftward. If the one of the scroll buttons 40g and 40h corresponding to the desired direction is clicked, the display is scrolled in that direction.[0163]
These scroll buttons 40g and 40h are divided into the following buttons: 40ga and 40ha, used to execute the scroll in the scale unit of the time scale displayed in the time-scale displaying area 40a; 40gb and 40hb, used to execute the scroll in frame units; 40gc and 40hc, used to execute the scroll in second units; 40gd and 40hd, used to execute the scroll in time units; and 40ge and 40he, used to execute the scroll in in-point units.[0164]
Hereinafter, the description of the time-line mode is resumed, referring to FIG. 6. The edit-tool displaying area 41, displayed below the time-line displaying area 40, contains command buttons used to instruct commands for program editing in the time-line displaying area 40. The edit-tool displaying area 41 is composed of the following buttons: an action tool button 41a; a sole-event-moving tool button 41b; a track tool button 41c; a ripple-edit tool button 41d; an overlay tool button 41e; and a clear tool button 41f.[0165]
The action tool button 41a is operated in the case of setting the stop flag 40ba in the action displaying area 40b in the time-line displaying area 40 described above. In the case of setting the stop flag 40ba, the events are scrolled by operating the scroll button 40g or 40h so that the position where the stop flag 40ba is to be set is adjusted to the edit bar 40i; the action tool button 41a is then clicked, whereby the stop flag 40ba is set at the edit bar 40i and displayed there.[0166]
The sole-event-moving-tool button[0167]41bwill be used to select and move one of the event in the video-edit area40dand the audio data in the audio-edit areas40eand40f.For example, in the case of moving an event in the video-edit area40d,the operator first clicks avideo button40dbin the video-edit area40dand scrolls the event, and adjusting the moving position to theedit bar40i.Then the operator clicks the sole-event-moving tool button41band then clicks the event that he wants to move. Thereby, the clicked event is moved to the position of theedit bar40i.
The track-[0168]tool button41cwill be used to select an event in the video-edit area40dor audio data in the audio-edit areas40eand40f, and move all of the data after the selected data together. Also when moving an event using the track-tool button41c,the operator clicks thevideo button40dbor theaudio buttons40eaand40faand adjusts the moving position to theedit bar40i,thereafter, sequentially clicks the track-tool button41cand the event to be moved basically.
The ripple-edition-tool button 41d will be used to select one of the events in the video-edit area 40d and the audio data in the audio-edit areas 40e and 40f, and move the selected one to the desired position in the other event by cutting into it.[0169]
The over-lay-tool button 41e will be used to select an event in the video-edit area 40d or audio data in the audio-edit areas 40e and 40f, and move the selected one to the other event, overwriting thereto.[0170]
These operational procedures are also basically similar to that of the sole-event-moving-tool button 41b, etc.[0171]
The clear-tool button 41f will be used to select an event in the video-edit area 40d or audio data in the audio-edit areas 40e and 40f and clear it, and also to release the setting of the stop flag 40ba, etc. In the case of clearing or releasing a setting by the clear-tool button 41f, the operator clicks the clear-tool button 41f and then clicks the object to be cleared or the object whose setting is to be released.[0172]
The program-view area 42 displayed at the lower of the time-line displaying area 40 is described hereinafter. In the time-line displaying area 40, the length of each event displaying area will be basically changed according to the length of each event, so that the length of each event can be seen easily and visually. However, since no clipped image data of each event is displayed, it might become difficult to grasp what images each event has. Then, in the editing system 1, the program-view area 42 is provided so that what images each event has can be seen easily even in the time-line mode.[0173]
The program-view area 42 has a view area 42a, a program-call button 42b, a forwarding button 42c and a reverse button 42d.[0174]
The view area 42a will be used to display the in-point clipped image data or the out-point clipped image data of each event. The order of the clipped image data to be displayed in the view area 42a agrees with the order of events in the program produced by the time-line displaying area 40. Therefore, the order of events in the program produced by the time-line displaying area 40 can be easily confirmed by the clipped image data, and in which order the images have been placed in the program can be confirmed easily. In this connection, each clipped image data to be displayed in the view area 42a is image data generated by thinning out the clipped image data in the event displaying area 29, and its image size is almost half that of the clipped image data displayed in the event displaying area 29.[0175]
The program-call button 42b will be used to enter a program-calling instruction to call the events displayed in the program displaying area 30 in the picture mode to the time-line displaying area 40 and the view area 42a. If the program-call button 42b is clicked, calling the program is instructed, so that the events displayed in the program displaying area 30 in the picture mode can be called to the time-line displaying area 40 in order as they are. Similarly, also in the view area 42a, the clipped image data having the same order as the program displaying area 30 is called and displayed. In this manner, calling the program can be instructed by providing the program-call button 42b, thereby the program produced in the other mode can be called easily to the time-line mode; and thus time adjustment can be easily edited even as to the program produced in the other mode.[0176]
The forwarding button 42c and the reverse button 42d will be used to forward or reverse the display of the clipped image data in the view area 42a. In the case where the produced program is composed of a plurality of events, all of the clipped image data cannot be displayed in the view area 42a at once. In such a case, the forwarding button 42c or the reverse button 42d is operated to forward or reverse the clipped image data, and thus all of the clipped image data can be displayed.[0177]
(4) Clipped Image Data Managing Method[0178]
A method of storing clip data, event data and program data is described hereinafter. Here the “clip data” includes the data used to display the clipped image data in the clip displaying area 28 and the data used to store the clipped image data. This respect is similar as to the event data and program data.[0179]
First, the first management record data for clip data, event data and program data will be described referring to FIG. 8.[0180]
The first management record data will be provided one by one for clip data, event data and program data. That is, the first management record data for clip data is data used to manage all of the clipped image data displayed in the clip displaying area 28. The first management record data for event data is data used to manage all of the clipped image data displayed in the event displaying area 29. And the first management record data for program data is data used to manage all of the clipped image data displayed in the program displaying area 30. In this embodiment, the sole first management record data is provided for clip data, event data and program data, respectively.[0181]
The first management record data has data regarding the following items: the pointer to the data linked before; the pointer to the data linked after; the horizontal size of display for one page; the vertical size of display for one page; the display position on the screen; the head position of display; and the total number of links.[0182]
The “pointer to the data linked before” is data showing the pointer of the management record data which is linked before the first management record data. If no management record data is linked before, the pointer of the first management record data is stored.[0183]
The “pointer to the data linked after” is data showing the pointer of the management record data which is linked after the first management record data. If no management record data is linked after, the pointer of the first management record data is stored.[0184]
The “horizontal size of display for one page” is data showing the maximum number of clipped image data displayable horizontally in each display area of the clip displaying area 28, the event displaying area 29 and the program displaying area 30. In this embodiment, the clip displaying area 28, the event displaying area 29 and the program displaying area 30 can each display eleven clipped image data, therefore the data showing “eleven” has been stored to the respective first management record data as the horizontal size of display for one page.[0185]
The “vertical size of display for one page” is data showing the maximum number of clipped image data displayable vertically in each display area of the clip displaying area 28, the event displaying area 29 and the program displaying area 30. In this embodiment, these areas each display the clipped image data in a single row, so that the data showing “one” has been stored to the respective first management record data as the vertical size of display for one page.[0186]
The “display position on the screen” is data used to show in which display area the clipped image data will be displayed. In this embodiment, the clip displaying area 28 is provided at the lower stage on the screen; the event displaying area 29 at the middle stage; and the program displaying area 30 at the upper stage. Then if it is the first management record data for clip data, the data showing the “lower stage” is stored as the display position on the screen; if being for event data, the “middle stage” is stored; and if being for program data, the “upper stage” is stored.[0187]
The “head position of display” is data showing from which position the clipped image data is to be displayed in the clip displaying area 28, the event displaying area 29 and the program displaying area 30, respectively. In this embodiment, eleven clipped image data will be displayed in the clip displaying area 28; eleven in the event displaying area 29; and eleven in the program displaying area 30, i.e., thirty-three clipped image data can be displayed. These thirty-three display positions will be managed by numbering them from the upper stage of the screen. For example, the following is predetermined: the display positions in the program displaying area 30 are “1”-“11”; those in the event displaying area 29 are “12”-“22”; and those in the clip displaying area 28 are “23”-“33”. Accordingly, if it is the first management record data for clip data, the data showing “23” is stored as the head position of display; if being for event data, “12” is stored; and if being for program data, “1” is stored.[0188]
The “total number of links” is data showing the total number of management record data linked after the first management record data.[0189]
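As an illustrative sketch only, the fields of the first management record data listed above can be written as a simple data structure. The embodiment does not specify a concrete encoding, so all field names, types and the `None` pointer convention below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class FirstManagementRecord:
    # Hypothetical sketch of the first management record data; the field
    # names and types are illustrative, not the embodiment's actual layout.
    pointer_before: object  # pointer to the data linked before (itself if none)
    pointer_after: object   # pointer to the data linked after (itself if none)
    horizontal_size: int    # clipped image data displayable per page, horizontal
    vertical_size: int      # clipped image data displayable per page, vertical
    display_position: str   # "upper", "middle" or "lower" stage of the screen
    head_position: int      # first display position number (1, 12 or 23)
    total_links: int        # second management record data linked after

# Example: the record for program data in this embodiment (eleven horizontal,
# one vertical, upper stage of the screen, head position "1").
program_record = FirstManagementRecord(
    pointer_before=None, pointer_after=None,
    horizontal_size=11, vertical_size=1,
    display_position="upper", head_position=1, total_links=0)
```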
Secondly, the second management record data for clip data is described referring to FIG. 9. The second management record data for clip data is data used to manage, for every clipped image data, the clipped image data being displayed in the clip displaying area 28. Therefore, there are as many second management record data for clip data as the clipped image data being displayed in the clip displaying area 28.[0190]
The second management record data for clip data has the pointer to the data linked before, pointer to the data linked after, property, clipped-image-data handle, clip type, time-code data and index number of the clipped image data.[0191]
The “pointer to the data linked before” is data showing the pointer of the management record data which is linked before the second management record data. Because the second management record data invariably has the first management record data or another second management record data before it, the pointer of the data linked before is necessarily stored.[0192]
The “pointer to the data linked after” is data showing the pointer of the management record data which is linked after the second management record data. If there is no management record data linked after, the pointer of the second management record data is stored.[0193]
The “property” is data showing whether the second management record data is for clip data, event data or program data.[0194]
The “clipped-image-data handle” is data showing an address to which the clipped image data has been stored. Thereby, referring to the clipped-image-data handle in the second management record data which corresponds to the desired clipped image data, the address in which the clipped image data has been stored can be obtained.[0195]
The “clip type” is data showing whether the clipped image data managed by the second management record data is of an in point or an out point.[0196]
The “time-code data” is data showing the time code of the clipped image data managed by the second management record data.[0197]
The “index number of the clipped image data” is an index number added to the clipped image data. This index number is a number added to all of the marked clipped image data regardless of the generation of an in point, out point or event. That is, the index number is the same number as the clip number displayed in the clip-number displaying area 28d. All of the clipped image data will be managed by the index number.[0198]
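Under the same caveat, the second management record data for clip data can be sketched as a node in a doubly linked chain; the names below are hypothetical and the handle is shown as a plain integer standing in for a storage address:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClipRecord:
    # Hypothetical sketch of the second management record data for clip data.
    prev: Optional["ClipRecord"]  # pointer to the data linked before
    next: Optional["ClipRecord"]  # pointer to the data linked after
    prop: str                     # property: "clip", "event" or "program"
    image_handle: int             # address where the clipped image data is stored
    clip_type: str                # "IN" or "OUT"
    time_code: str                # time code of the clipped image data
    index_number: int             # same number as the displayed clip number

# Example: the record managing the clipped image data of the index number "1".
rec1 = ClipRecord(prev=None, next=None, prop="clip", image_handle=0x1000,
                  clip_type="IN", time_code="00:01:23:15", index_number=1)
```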
Thirdly, the second management record data for event data and program data will be described referring to FIG. 10. The second management record data for event data is data used to manage, for every clipped image data, the clipped image data displayed in the event displaying area 29. Therefore, there are as many second management record data for event data as the clipped image data being displayed in the event displaying area 29. Similarly, the second management record data for program data is data used to manage, for every clipped image data, the clipped image data displayed in the program displaying area 30. Therefore, there are as many second management record data for program data as the clipped image data being displayed in the program displaying area 30.[0199]
The second management record data for event data and program data has the following items: the pointer to the data linked before; the pointer to the data linked after; the property; the event number; the title; the subtitle; the in-point clipped-image-data handle; the clip type of the in point; the time-code data of the in point; the index number of the in-point clipped image data; the out-point clipped-image-data handle; the clip type of the out point; the time-code data of the out point; the index number of the out-point clipped image data; the slow type; the symbol type; and the time-code data of the symbol.[0200]
Since the “pointer to the data linked before”, the “pointer to the data linked after” and the “property” are similar to those of the second management record data for clip data, their descriptions are omitted here.[0201]
The “event number” is a number added to each event in generation order. The event number will be displayed in the event-number displaying area 29d.[0202]
The “title” and the “subtitle” are a title and a subtitle which have been previously added to the registered event; these have been stored as actual characters. The title will be displayed in the title displaying area 29g.[0203]
The “in-point clipped-image-data handle” is data showing an address to which the in-point clipped image data has been stored. Thereby, referring to the in-point clipped-image-data handle in the second management record data which corresponds to the desired in-point clipped image data, the address to which the in-point clipped image data has been stored can be obtained.[0204]
The “clip type of the in point” is data showing whether the in-point clipped image data managed by the second management record data is of an in point or an out point. Since all of them are the in-point clipped image data in this case, the data showing “in point” will be stored.[0205]
The “time-code data of in point” is data showing the time code of the in-point clipped image data managed by the second management record data.[0206]
The “index number of the in-point clipped image data” is an index number to be added to the in-point clipped image data. Similarly to the index number in the second management data for clip data described above, the index number of the in-point clipped image data will be added to all of the marked clipped image data regardless of the generation of an in point, out point and event.[0207]
The “out-point clipped-image-data handle” is data showing an address to which the out-point clipped image data has been stored. Therefore, referring to the out-point clipped-image-data handle in the second management record data which corresponds to the desired out-point clipped image data, the address to which the out-point clipped image data has been stored can be obtained.[0208]
The “clip type of the out point” is data showing whether the out-point clipped image data managed by the second management record data is of an in point or an out point. Since all of them are the out-point clipped image data in this case, the data showing “out point” will be stored.[0209]
The “out-point time-code data” is the data showing the time code of the out-point clipped image data managed by the second management record data.[0210]
The “index number of the out-point clipped image data” is an index number to be added to the out-point clipped image data. Similarly to the index number in the second management record data for clip data described above, the index number of the out-point clipped image data is a number to be added to all of the marked clipped image data regardless of the generation of an in point, out point and event.[0211]
The “slow type” is data showing whether or not an optional-times reproducing speed different from the normal reproducing speed is set to the event managed by the second management record data. If the 1.0-time normal reproducing speed is set to the event, the data “00000000” is recorded, while if an optional reproducing speed other than the 1.0-time speed is set to the event, the data “00000001” is recorded as the slow type.[0212]
The “symbol type” is data showing whether the clipped image data which has been defined as a symbol is found or not in the period between the in point and the out point of the event managed by the second management record data. Here, “symbol” means the typical clipped image data used to show the event.[0213]
The “time-code data of symbol” is the time code of the above clipped image data that has been set as a symbol.[0214]
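The items listed above for event data and program data can likewise be sketched as a record; all names, types and example values below are illustrative assumptions rather than the embodiment's actual encoding:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventRecord:
    # Hypothetical sketch of the second management record data for
    # event data and program data.
    prev: Optional["EventRecord"]    # pointer to the data linked before
    next: Optional["EventRecord"]    # pointer to the data linked after
    prop: str                        # "event" or "program"
    event_number: int                # added to each event in generation order
    title: str
    subtitle: str
    in_handle: int                   # address of the in-point clipped image data
    in_clip_type: str                # always "IN" here
    in_time_code: str
    in_index: int                    # index number of the in-point clipped image data
    out_handle: int                  # address of the out-point clipped image data
    out_clip_type: str               # always "OUT" here
    out_time_code: str
    out_index: int                   # index number of the out-point clipped image data
    slow_type: int                   # 0x00: normal 1.0-time speed, 0x01: optional speed
    symbol_type: bool                # whether a symbol clip exists between in and out
    symbol_time_code: Optional[str]  # time code of the symbol clip, if any

# Example: the event of the event number "1" (in point index "2",
# out point index "3"); the handles and time codes are made up.
event1 = EventRecord(prev=None, next=None, prop="event", event_number=1,
                     title="", subtitle="", in_handle=0x2000, in_clip_type="IN",
                     in_time_code="00:00:10:00", in_index=2, out_handle=0x3000,
                     out_clip_type="OUT", out_time_code="00:00:12:00", out_index=3,
                     slow_type=0x00, symbol_type=False, symbol_time_code=None)
```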
How to manage the clipped image data using the first and the second management record data described above is described hereinafter with a concrete example shown in FIG. 11 through FIGS. 13A to 13C.[0215]
The line of “marking” shown in FIG. 11 shows whether each marking is of an in point or an out point. This example means that marking has been performed fifteen times in the following order from the left end: IN, IN, OUT, IN, OUT, IN, IN, IN, OUT, IN, OUT, IN, IN, IN and IN. In the line of “index number”, the index number added to the in-point and the out-point clipped image data that have been marked is shown. The index number is a number added to all of the marked clipped image data regardless of being of an in point or an out point. Therefore, as shown in FIG. 11, the index numbers “1”-“15” will be sequentially added to each clipped image data marked.[0216]
In the line of “clip number (clip NO.)”, the clip number to be displayed in the clip-number displaying area 28d of the clip displaying area 28 is shown. Note that the clip number to be displayed in the clip-number displaying area 28d is the same number as the index number.[0217]
In the line of “event number (event NO.)”, an event number to be displayed in the event-number displaying area 29d of the event displaying area 29 is shown. This event number will be automatically added to each event in the generation order of the event, regardless of its index number and clip number.[0218]
FIG. 12 is a diagram that shows what clipped image data is displayed in the clip displaying area 28, the event displaying area 29 and the program displaying area 30 in the case of the marking shown in FIG. 11.[0219]
In the clip displaying area 28, the clipped image data of the index numbers “1”, “6”, “7”, “12”, “13”, and “14” will be sequentially displayed.[0220]
In the event displaying area 29, the four generated events will be displayed. That is, the clipped image data of the index number “2” is displayed as the event of the event number “1”, the clipped image data of the index number “4” as the event of the event number “2”, the clipped image data of the index number “8” as the event of the event number “3”, and the clipped image data of the index number “10” as the event of the event number “4”, respectively.[0221]
In the program displaying area 30, no clipped image data is displayed if only in points and out points have been specified. In this example it is assumed that the program shown in FIG. 12 has been produced by rearranging the four events displayed in the event displaying area 29. The program is a program in which the event of the event number “2”, the event of the event number “4”, and the event of the event number “1” are continuously aligned in this order. Therefore, in the program displaying area 30, the clipped image data of the index number “4” which has been registered as the event of the event number “2”, the clipped image data of the index number “10” registered as the event number “4”, and the clipped image data of the index number “2” registered as the event number “1”, are displayed.[0222]
FIGS. 13A to 13C show how to manage the clipped image data by the first and the second management record data.[0223]
FIG. 13C shows the state of managing the clipped image data to be displayed in the clip displaying area 28. Management record data 101 is the first management record data for clip data. As shown in FIG. 8, this first management record data 101 for clip data has data used to manage the whole clip displaying area 28 and the positions of the clipped image data displayed in the clip displaying area 28.[0224]
Management record data 201, which is linked after the first management record data 101, is the second management record data for clip data. This second management record data 201 is data used to manage the clipped image data of the index number “1”. As shown in FIG. 9, the second management record data 201 has the clipped-image-data handle showing an address to which the clipped image data of the index number “1” has been stored.[0225]
Management record data 206, linked after the second management record data 201, is the second management record data for clip data. This second management record data 206 is data used to manage the clipped image data of the index number “6”, and has the clipped-image-data handle showing an address to which the clipped image data of the index number “6” has been stored.[0226]
Similarly, the second management record data 207 used to manage the clipped image data of the index number “7” is linked after the second management record data 206. The second management record data 212 used to manage the clipped image data of the index number “12” is linked after the second management record data 207. The second management record data 213 used to manage the clipped image data of the index number “13” is linked after the second management record data 212. And the second management record data 214 used to manage the clipped image data of the index number “14” is linked after the second management record data 213.[0227]
FIG. 13B shows the state of managing the clipped image data to be displayed in the event displaying area 29. Management record data 102 is the first management record data for event data. As shown in FIG. 8, this first management record data 102 has data used to manage the whole event displaying area 29 and the positions of the clipped image data to be displayed in the event displaying area 29.[0228]
Management record data 202, linked after the first management record data 102, is the second management record data for event data. As shown in FIG. 10, the second management record data 202 has data used to manage the in-point clipped image data shown by the index number “2” and the out-point clipped image data shown by the index number “3”. More precisely, the second management record data 202 has the in-point clipped-image-data handle showing an address to which the in-point clipped image data shown by the index number “2” has been stored, and the out-point clipped-image-data handle showing an address to which the out-point clipped image data shown by the index number “3” has been stored.[0229]
Similarly, the second management record data 204 used to manage the in-point clipped image data of the index number “4” and the out-point clipped image data of the index number “5” is linked after the second management record data 202. The second management record data 208 used to manage the in-point clipped image data of the index number “8” and the out-point clipped image data of the index number “9” is linked after the second management record data 204. And the second management record data 210 used to manage the in-point clipped image data of the index number “10” and the out-point clipped image data of the index number “11” is linked after the second management record data 208.[0230]
FIG. 13A shows the state of managing the clipped image data to be displayed in the program displaying area 30. Management record data 103 is the first management record data for program data. As shown in FIG. 8, the first management record data 103 has data used to manage the whole program displaying area 30 and the positions of the clipped image data to be displayed in the program displaying area 30.[0231]
The second management record data 204 used to manage the in-point clipped image data of the index number “4” and the out-point clipped image data of the index number “5” is linked after the first management record data 103 for program data. The second management record data 210 used to manage the in-point clipped image data of the index number “10” and the out-point clipped image data of the index number “11” is linked after the second management record data 204. And the second management record data 202 used to manage the in-point clipped image data of the index number “2” and the out-point clipped image data of the index number “3” is linked after the second management record data 210.[0232]
Here, comparing FIG. 13B showing the management of event data with FIG. 13A showing the management of program data, the storing order of the clipped image data of the index numbers “2”, “4” and “10” is exactly the same between FIG. 13B and FIG. 13A. This means that the storing positions of the clipped image data have not changed at all. FIG. 13B is different from FIG. 13A in that the link order of the second management record data has changed. More specifically, in the editing system 1, if the display order of the events is changed, the storing positions of the clipped image data that show the events are not changed, but the link order of the second management record data used to directly manage the clipped image data is changed. Thereby, in the editing system 1, the special effect can be obtained that the display order of the events can be changed at high speed.[0233]
Furthermore, this is not limited to the change of the display order of the events; the same applies to the change of the display order of the clipped image data being displayed in the clip displaying area 28. For instance, even if the display order of the clipped image data is changed by erasing clipped image data or newly supplementing clipped image data, the display order can be easily changed only by correcting the link data of the second management record data (i.e., the part of the pointer to the data linked before/after) without actually moving the storing positions of the clipped image data.[0234]
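The reordering principle above, in which only the links of the second management record data are rewritten while the stored clipped image data never moves, can be sketched as follows. The `Record` class and `chain` helper are illustrative inventions, not the embodiment's actual code, and record 208 is omitted for brevity:

```python
class Record:
    # Minimal stand-in for one second management record data entry.
    def __init__(self, index):
        self.index = index  # index number of the managed clipped image data
        self.next = None    # "pointer to the data linked after"

def chain(head):
    # Follow the links from the head and report the display order.
    order, node = [], head
    while node is not None:
        order.append(node.index)
        node = node.next
    return order

# Event-data link order of FIG. 13B: 2 -> 4 -> 10.
r2, r4, r10 = Record(2), Record(4), Record(10)
r2.next, r4.next = r4, r10
assert chain(r2) == [2, 4, 10]

# Rearranging into the program order of FIG. 13A (4 -> 10 -> 2) rewrites
# only the links; the clipped image data in storage is never moved.
r4.next, r10.next, r2.next = r10, r2, None
assert chain(r4) == [4, 10, 2]
```

Because only a few pointers change, the cost of a reorder is independent of the size of the clipped image data, which is why the display order can be changed at high speed.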
The marking operation from the first marking to the fifteenth marking is concretely described hereinafter, including the operation of each circuit block.[0235]
Before starting the marking, the first management record data 101 for clip data, the first management record data 102 for event data, and the first management record data 103 for program data have been generated at the head address of a memory area for work data that has been kept in the RAM 10b. However, since no first management record data has second management record data linked thereto yet, its own address has been stored in the “pointer to the data linked after”.[0236]
[First Marking (in point)][0237]
After the first marking, 95×60-pixel clipped image data is formed by controlling the read from the frame memory 11c. The formed clipped image data is stored to a vacant area in the RAM 10b as the clipped image data of the index number “1”. As well as being stored, the formed clipped image data is displayed in the in-clip displaying area 24a. At this time, the second management record data 201 managing this clipped image data has been temporarily stored in the register in the CPU 10 and has not been stored in the RAM 10b. The reason is that it is unknown at this time to which management record data the second management record data 201 should be linked.[0238]
[Second Marking (in point)][0239]
Similarly, after the second marking, the clipped image data of the index number “2” is formed and stored to a vacant area in the RAM 10b. At this time, since an in point has been successively marked twice, the clipped image data of the index number “1” being displayed in the in-clip displaying area 24a will not be used as an event. Then the clipped image data of the index number “1” being displayed in the in-clip displaying area 24a is moved to the clip displaying area 28. Also, by the second marking, the second management record data 201 managing the clipped image data of the index number “1” is determined to be linked to the first management record data 101 for clip data. Thus, as shown in FIG. 13C, the second management record data 201 which has been temporarily stored in the register in the CPU 10 is stored to the RAM 10b so as to link to the first management record data 101.[0240]
On the other hand, the clipped image data of the index number “2” generated by the second marking is newly displayed in the in-clip displaying area 24a instead of the clipped image data of the index number “1”. Similarly to the first marking, the second management record data 202 managing the clipped image data of this index number “2” is newly and temporarily stored to the register in the CPU 10.[0241]
[Third Marking (out point)][0242]
Similarly, after the third marking, the clipped image data of the index number “3” is formed and stored to a vacant area in the RAM 10b. Since the third marking is of an out point, an event is formed in which the clipped image data of the index number “2” is set as the in point and the clipped image data of the index number “3” is set as the out point. Then the clipped image data of the index number “2” being displayed in the in-clip displaying area 24a is copied to the event displaying area 29 while remaining displayed in the in-clip displaying area 24a. By the third marking, the second management record data 202 managing the clipped image data of the index number “2” that has been stored in the register is determined to link to the first management record data 102 for event data. Thus, as shown in FIG. 13B, the second management record data 202 which has been temporarily stored in the register in the CPU 10 is stored to the RAM 10b so as to link to the first management record data 102.[0243]
On the other hand, the clipped image data of the index number “3” which has been generated by the third marking is newly displayed in the out-clip displaying area 24d. Note that, since the second management record data 202 managing the clipped image data of the index number “3” has been determined to link to the first management record data 102, it is not stored to the register in the CPU 10.[0244]
[Fourth Marking (in point)][0245]
Similarly, after the fourth marking, the clipped image data of the index number “4” is formed and stored to a vacant area in the RAM 10b. As well as being stored, the formed clipped image data is displayed in the in-clip displaying area 24a. Furthermore, similarly to the first marking, the second management record data 204 managing the clipped image data of this index number “4” is temporarily stored to the register in the CPU 10. Note that, since the clipped image data of the index number “3” which has been displayed in the out-clip displaying area 24d has been stored already, it is cleared from the out-clip displaying area 24d.[0246]
[Fifth Marking (out point)][0247]
Similarly, after the fifth marking, the clipped image data of the index number “5” is formed and stored to a vacant area in the RAM 10b. Since the fifth marking is of an out point similarly to the third marking, an event is formed in which the clipped image data of the index number “4” is set as the in point and the clipped image data of the index number “5” is set as the out point. Then, the clipped image data of the index number “4” being displayed in the in-clip displaying area 24a is copied to the event displaying area 29 while remaining displayed in the in-clip displaying area 24a. By the fifth marking, the second management record data 204 managing the clipped image data of the index number “4” that has been stored in the register is determined to link to the second management record data 202 stored before. Thus, as shown in FIG. 13B, the second management record data 204 which has been temporarily stored in the register of the CPU 10 is stored to the RAM 10b so as to link to the second management record data 202.[0248]
On the other hand, the clipped image data of the index number “5” which has been generated by the fifth marking is newly displayed in the out-clip displaying area 24d. Note that, since the second management record data 204 managing the clipped image data of the index number “5” has been determined to link to the second management record data 202, it is not stored to the register in the CPU 10.[0249]
[Sixth Marking (in point)][0250]
Similarly, after the sixth marking, the clipped image data of the index number “6” is formed and stored to a vacant area in the RAM 10b. As well as being stored, the formed clipped image data of the index number “6” is displayed in the in-clip displaying area 24a. Furthermore, similarly to the fourth marking, the second management record data 206 managing the clipped image data of this index number “6” is temporarily stored to the register in the CPU 10. Note that, since the clipped image data of the index number “5” which has been displayed in the out-clip displaying area 24d has been stored already, it is cleared from the out-clip displaying area 24d.[0251]
[Seventh Marking (in point)][0252]
Similarly, after the seventh marking, clipped image data of the index number “7” is formed and stored to a vacant area in the RAM 10b. In this case, since an in point has been marked twice in succession, the clipped image data of the index number “6” being displayed in the in-clip displaying area 24a is moved to the clip displaying area 28. By the seventh marking, the second management record data 206 which has been stored in the register in the CPU 10 is stored to the RAM 10b so as to link to the second management record data 201 as shown in FIG. 13C.[0253]
On the other hand, the clipped image data of the index number “7” is displayed in the in-clip displaying area 24a. Moreover, similarly to the sixth marking, the second management record data 207 for managing the clipped image data of this index number “7” is temporarily stored to the register in the CPU 10.[0254]
Here, the description of the following markings, the ninth to the fifteenth, is omitted because they are conducted similarly to the first to the seventh markings.[0255]
(5) Configuration of Hybrid Recorder[0256]
The hybrid recorder 3 will be described hereinafter, referring to FIG. 14. As shown in FIG. 14, the hybrid recorder 3 has a hard-disk drive (HDD) 300 capable of recording and reproducing a video signal apparently at the same time, and a VTR 301 which is equipped with both a VTR of an analog mode and a VTR of a digital mode.[0257]
The structure of this hybrid recorder 3 is concretely described hereinafter.[0258]
The hybrid recorder 3 is provided with an interface unit 302 conforming to the RS-422 communication protocol, and receives at the interface unit 302 control commands, such as recording start, recording stop, reproducing start, and reproducing stop, which are sent from the external interface 18 of the computer 2. The interface unit 302 supplies the received control command to the CPU 303.[0259]
The CPU 303 is a control means for controlling the entire operation of the hybrid recorder 3, and controls the operation of each unit in response to the control command received from the interface unit 302. Thereby, the hybrid recorder 3 performs the recording operation and the reproducing operation.[0260]
First, the video signal V1 successively supplied from an external source such as a video camera is sent to the first switch 304; in addition, a video signal sent from the decoder 305 is also supplied to the first switch 304. The first switch 304 is a switch for selecting, based on the control signal from the CPU 303, the video signal to be supplied to the hard-disk drive 300 and the video tape recorder 301: it selects either the input video signal V1 or the reproduced video signal from the decoder 305 and supplies it to the encoder 306.[0261]
The encoder 306 converts the analog video signal supplied from the first switch 304 to a digital video signal, compression-codes the digitalized video signal in frame units according to the Moving Picture Experts Group (MPEG) standard, and supplies the thus encoded video signal to the second switch 307 and the video tape recorder 301.[0262]
To the second switch 307, the reproduced video signal outputted from the VTR 301 is supplied in addition to the coded video signal outputted from the encoder 306. This second switch 307 is a switch for selecting, based on the control signal from the CPU 303, the video signal to be supplied to the hard-disk drive 300: either the coded video signal outputted from the encoder 306 or the reproduced video signal outputted from the VTR 301 is selected and outputted.[0263]
The video signal selected by the second switch 307 is supplied to a serial digital data interface (SDDI) interface unit 314. The SDDI interface unit 314 supplies this input video signal to an input buffer memory 308. At this time, in preparation for recording to the hard-disk drive 300, the SDDI interface unit 314 converts the video signal supplied in the serial digital interface (SDI) format into the SDDI format, and supplies the video signal converted into the SDDI format to the input buffer memory 308.[0264]
The input buffer memory 308 has a storage capacity capable of storing, for example, 15 frames of video signal, and temporarily stores the input video signal.[0265]
The hard-disk drive 300 is equipped with a hard-disk array in which plural hard disks are connected in an array, and has a storage capacity sufficient for recording the video signal. If the recording operation is instructed by a control signal from the CPU 303, this hard-disk drive 300 sequentially reads out the video signal stored in the input buffer memory 308 and stores it in the hard-disk array. Furthermore, if the reproducing operation is instructed by a control signal from the CPU 303, the hard-disk drive 300 reads out and reproduces the video signal at the part instructed by the CPU 303. The reproduced video signal is supplied to an output buffer memory 309 of the SDDI interface unit 314, which has a storage capacity of, for example, 15 frames, and is temporarily stored therein. This output buffer memory 309 sequentially reads out the temporarily stored video signal and supplies it to the third switch 310. Note that, at the time of supplying the video signal stored in the output buffer memory 309, the SDDI interface unit 314 converts the video signal in the SDDI format into the SDI format, and supplies the video signal converted into the SDI format.[0266]
Here, the recording and reproducing operation of this hard-disk drive 300 will be described concretely hereinafter. In this hybrid recorder 3, the recording and reproducing operation of the hard-disk drive 300 is completely managed by the CPU 303.[0267]
The CPU 303 assigns a time code to each video frame of the video signal to be recorded based on the time code supplied from the time code generating unit 313, and at the same time, assigns a recording address to each video frame of the video signal. Then, the CPU 303 memorizes the thus assigned time codes and recording addresses in the form of a correspondence table.[0268]
At the time of the recording operation, the CPU 303 instructs the hard-disk drive 300 of a recording address and a recording command. Thereby, the hard-disk drive 300 records the video signal to the instructed recording address. On the other hand, at the time of the reproducing operation, if the video signal to be read out is instructed by its time code from the computer 2, the CPU 303 refers to the aforementioned correspondence table and determines where the video frame of the instructed time code is recorded (that is, determines its recording address). Then, the CPU 303 instructs the hard-disk drive 300 of the determined recording address along with a reproducing command. Thus, the hard-disk drive 300 reproduces the video signal from the instructed address, so that the video signal required by the computer 2 is reproduced. As in the above, since the CPU 303 memorizes the correspondence relation between time code and recording address as a correspondence table, even if a reproducing position is instructed by its time code from the computer 2, the instructed reproducing position can be reproduced promptly.[0269]
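The correspondence-table scheme described above can be sketched as follows; the class and method names are illustrative assumptions, not taken from the specification:

```python
class CorrespondenceTable:
    """Sketch of the table by which the CPU 303 maps time codes to
    recording addresses (names are hypothetical)."""

    def __init__(self):
        self._table = {}  # time code -> recording address

    def assign(self, time_code, address):
        # At recording time: remember where this video frame was stored.
        self._table[time_code] = address

    def address_of(self, time_code):
        # At reproducing time: resolve the recording address of the frame
        # whose time code was instructed by the computer 2.
        return self._table[time_code]


table = CorrespondenceTable()
table.assign("00:01:23:45", 0x2F00)      # recording: frame stored at this address
addr = table.address_of("00:01:23:45")   # reproducing: address resolved directly
```

Because the lookup is a direct mapping, a reproducing position instructed by time code can be resolved without scanning the recorded material, which corresponds to the prompt reproduction described above.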
Note that, if an external time code (EXT. TC) is supplied from the outside, the aforementioned time code generating unit 313 supplies the external time code to the CPU 303 as the time code, while if the external time code is not supplied, the time code generating unit 313 generates a time code by itself and supplies it to the CPU 303.[0270]
Next, the roles of the input buffer memory 308 and the output buffer memory 309, which are provided on the input side and the output side of the hard-disk drive 300 respectively, are described. These two buffer memories 308 and 309 are buffers for making the recording operation and the reproducing operation of the hard-disk drive 300 apparently proceed in parallel. This hard-disk drive 300 can perform the recording operation at a speed at least twice the speed at which the input buffer memory 308 fetches the video signal, and can perform the reproducing operation at a speed at least twice the speed at which the output buffer memory 309 reads out the video signal. Thus, if the buffer memories 308 and 309 are provided on the input side and the output side respectively, the hard-disk drive 300 can store the video signal in the output buffer memory 309 by performing the reproducing operation while the input buffer memory 308 is fetching the video signal; furthermore, the hard-disk drive 300 can perform the recording operation by reading the video signal out from the input buffer memory 308 while the output buffer memory 309 is reading the video signal out. Therefore, if the buffer memories 308 and 309 are provided on the input side and the output side of the hard-disk drive 300 respectively, the recording operation and the reproducing operation of the hard-disk drive 300 can apparently be performed simultaneously.[0271]
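The role of the two buffers can be illustrated with a toy model. The frame-period scheduling below is an assumption made only for illustration; the specification states only that the disk operates at at least twice the buffer transfer speed:

```python
def playback_underruns(periods, disk_ops_per_period):
    """Toy model: in each frame period one frame arrives for recording and
    one frame must be supplied for reproducing; the disk can service
    disk_ops_per_period frame transfers per period (illustrative model)."""
    in_buf, out_buf, underruns = 0, 0, 0
    for _ in range(periods):
        in_buf += 1                       # input buffer 308 fetches one frame
        ops = disk_ops_per_period
        written = min(ops, in_buf)        # disk drains the input buffer first
        in_buf -= written
        out_buf += ops - written          # leftover bandwidth pre-reads frames
        if out_buf > 0:
            out_buf -= 1                  # output buffer 309 supplies one frame
        else:
            underruns += 1                # reproduction would stall this period
    return underruns
```

In this model a disk running at twice the stream rate never stalls the reproduction, while a disk at exactly the stream rate cannot record and reproduce at the same time, which is the point of the double-speed requirement above.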
Here, returning to FIG. 14, the description of each part is continued. As described above, the coded video signal supplied from the encoder 306 is also supplied to the VTR 301. The VTR 301 is a device for reproducing a video tape brought from the outside, recording the video data stored in the hard-disk drive 300, or recording the input video signal V1 as a backup of the hard-disk drive 300, and performs its recording operation or reproducing operation based on the control signal from the CPU 303.[0272]
Note that, the recording and reproducing operation of this VTR 301 is also managed by the CPU 303, similarly to the hard-disk drive 300; however, in the case of a VTR, a position cannot be specified by an address as in the hard-disk drive, so that the CPU 303 instructs a time code itself in place of the address information. That is, at the time of recording, the VTR 301 adds and records the time code given from the CPU 303, while at the time of reproducing, the VTR 301 performs the reproducing operation by determining a readout position based on the time code instructed by the CPU 303.[0273]
The video signal reproduced from the VTR 301 is supplied to the third switch 310 or the aforementioned second switch 307. The third switch 310 has two output terminals, selects either of the two reproduced video signals to be inputted, and supplies it to either the decoder 305 or the first switch 304.[0274]
The decoder 305 is a device for decoding the video signal compressively coded in frame units, and decodes the video signal supplied from the third switch 310 according to the MPEG standard. Furthermore, the decoder 305 converts the decoded digital video signal into an analog video signal and supplies this to the first time code adding unit 311 and the first switch 304.[0275]
The first time code adding unit 311 adds a time code to the vertical synchronizing period of the video signal supplied from the decoder 305 based on the time code supplied from the CPU 303. However, in the case where the video signal supplied from the decoder 305 is the video signal reproduced by the video tape recorder 301, the time code is not added, since a time code has already been added thereto. The time code is added only in the case of the video signal reproduced by the hard-disk drive 300. Note that, the time code to be added to the video signal coincides with the time code assigned at the time of recording.[0276]
The video signal to which the time code has been added by this first time code adding unit 311 is supplied to the outside as the reproduced video signal V3 as well as being supplied to the computer 2.[0277]
Note that, from this hybrid recorder 3, the video signal V2, which is similar to the input video signal V1, is supplied in addition to the reproduced video signal V3. This video signal V2 is the video signal obtained by adding a time code to the input video signal V1 by the second time code adding unit 312.[0278]
In this case, the second time code adding unit 312 adds a time code to the vertical synchronizing period of the input video signal V1 based on the time code supplied from the CPU 303, and supplies this as the video signal V2. At that time, the second time code adding unit 312 adds the time code to the video signal V1 so that the correspondence relation between the time code and the video frame to which the time code is added becomes identical to that of the video signal V3. For example, if in the first time code adding unit 311 the time code “00:01:23:45” is added to a video frame, the same time code “00:01:23:45” is added to the video frame of the video signal V1 coinciding with that video frame.[0279]
Here, the flow of the video signal in this hybrid recorder 3 will be described for each operation mode. First, in the case where the video signal V1 supplied from the outside is recorded to the hard-disk drive 300 as editing material, the video signal V1 is selected by the first switch 304 and supplied to the encoder 306. Then, the video signal compressively coded by the encoder 306 is selected by the second switch 307 and supplied to the hard-disk drive 300 via the input buffer memory 308. According to this flow of the video signal, the input video signal V1 can be supplied to and recorded in the hard-disk drive 300.[0280]
On the other hand, in the case where the video signal V1 supplied from the outside is recorded to a video tape by the VTR 301, the video signal V1 is selected by the first switch 304 in a similar manner, and the video signal supplied from the encoder 306 is supplied to the VTR 301. Thus, the video signal V1 supplied from the outside can be recorded to the video tape in the digital mode.[0281]
In the case where a video tape recorded in the analog mode and brought from the outside is reproduced and its video signal is recorded to the hard-disk drive 300 as editing material, at first, the brought video tape is loaded in the VTR 301 and reproduced. The reproduced video signal is selected by the third switch 310 and sent to the first switch 304. The first switch 304 selects this reproduced video signal and sends it to the encoder 306. The video signal compressively coded by the encoder 306 is selected by the second switch 307 and supplied to the hard-disk drive 300 via the input buffer memory 308. According to this flow of the video signal, an analog signal brought in the form of a video tape from the outside can be recorded to the hard-disk drive 300 as editing material.[0282]
In the case where a video tape recorded in the digital mode and brought from the outside is reproduced and its video signal is recorded to the hard-disk drive 300 as editing material, at first, the brought video tape is loaded in the VTR 301 and reproduced. Then, the reproduced video signal, still compressively coded, is selected by the second switch 307 and supplied to the hard-disk drive 300 via the input buffer memory 308. According to this flow of the video signal, a digital video signal brought in the form of a video tape from the outside can be recorded to the hard-disk drive 300 as editing material.[0283]
On the other hand, in the case where the video signal of the final video program produced as the result of editing is read out from the hard-disk drive 300 and recorded to a broadcasting video tape, the video signal reproduced by the hard-disk drive 300 is sent to the third switch 310 via the output buffer memory 309. The third switch 310 selects the reproduced video signal and supplies it to the decoder 305. The decoder 305 decodes the compressively coded reproduced video signal and sends the decoded video signal to the first switch 304. The first switch 304 selects this reproduced video signal and supplies it to the encoder 306. Then, the video signal encoded again by the encoder 306 is supplied to the VTR 301, so that the video signal of the final video program recorded in the hard-disk drive 300 can be dubbed to the broadcasting video tape.[0284]
(6) Setting of Reproducing Speed[0285]
(6-1) Reproducing Speed Setting Area[0286]
The setting of the reproducing speed for an event is described hereinafter. In the editing system 1, the reproducing speed of an event can be set in frame units by using a reproducing speed setting area 25, in both the picture mode and the time-line mode. Thus, for example, in a baseball broadcast, slow playback can be set for an event at the moment of hitting a home run. By slowly reproducing the event of the home-run scene, an image in which the motion of the batter and the motion of the ball are represented more realistically can be presented to the audience. Furthermore, since the reproducing speed can be set in frame units, for example, relatively fast slow playback (e.g., 0.5-time speed) can be set for the scene where the pitcher is throwing the ball, and relatively slow slow playback (e.g., 0.01-time speed) can be set for the scene where the batter is hitting the ball; thereby, a more powerful image can be presented to the audience by setting various slow-playback speeds within one event.[0287]
Here, this respect is concretely described with reference to FIG. 15. First, the reproducing speed setting area 25 shown in FIG. 15 is brought into an operable state by clicking a reproducing speed setting button 22h in a timing displaying area 22. The reproducing speed setting area 25 has a learn button 25a, a speed fit button 25b, a normal reproducing speed setting button 25c, an event-number displaying part 25d, an event-duration displaying part 25e, a time-line-scale displaying part 25f, a time-runner displaying part 25g, a point displaying part 25h, an in-point time-code displaying part 25i, an out-point time-code displaying part 25j and a memory-residual indicator part 25k.[0288]
The learn button 25a is used when a reproducing speed is set using a dedicated controller 2e which will be described later. If, after the learn button 25a is clicked, the reproducing speed data is entered using the dedicated controller 2e, the speed data is stored and the reproducing speed of the event is set.[0289]
The speed fit button 25b is used when the reproducing speed is automatically set by entering the length from an in point to an out point (i.e., the duration) as a numerical value from the keyboard 2c. If, after the speed fit button 25b is clicked, the duration value is entered from the keyboard 2c, an optimum reproducing speed is automatically set based on the duration value.[0290]
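The speed-fit computation itself is not detailed in the text; one plausible sketch, assuming the entered duration and the event length are both counted in frames, is:

```python
def fit_speed(event_frames, target_frames):
    """Hypothetical speed-fit rule: choose the reproducing speed so that an
    event of event_frames frames fills exactly target_frames frames on air.
    (The actual rule used by the system is not specified.)"""
    if target_frames <= 0:
        raise ValueError("duration must be a positive number of frames")
    return event_frames / target_frames


# A 150-frame event that must fill 300 frames on air plays at half speed.
speed = fit_speed(150, 300)
```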
The normal reproducing speed setting button 25c is used when the setting of the reproducing speed is canceled. If, after an event for which the reproducing speed has been set is specified, the normal reproducing speed setting button 25c is clicked, the set reproducing speed is canceled and the normal reproducing speed, i.e., one-time speed, is set.[0291]
The event-number displaying part 25d is the area for displaying the event number of the specified event. The displayed event number coincides with the event number displayed in the event-number displaying part 29d of the event displaying area 29.[0292]
The event-duration displaying part 25e is the area for displaying the length from the in point to the out point of the specified event, i.e., the duration. In the event-duration displaying part 25e, the duration is displayed in frame units.[0293]
The time-line-scale displaying part 25f is the area for displaying a scale visually showing the duration of the specified event. In this time-line-scale displaying part 25f, the scale is displayed in frame units.[0294]
The time-runner displaying part 25g is the position displaying part for displaying which position in the event is now being set or reproduced when the reproducing speed is set with the dedicated controller 2e which will be described later, or when the event for which the reproducing speed has been set is previewed. In the time-runner displaying part 25g, an icon 25ga having the form of a running human being is displayed, and the position in the event now being set or reproduced can be exactly shown by the display position of the icon 25ga with reference to the scale of the time-line-scale displaying part 25f. Accordingly, the operator can easily grasp visually, by the position of the icon 25ga, which position is now being set or reproduced. Furthermore, in this case, the icon 25ga is sequentially moved from the in point toward the out point along the scale in accordance with the progress of setting or reproducing. At this time, however, the moving speed of the icon 25ga changes depending on the set reproducing speed, so that the operator can easily confirm visually at which part in the event the reproducing speed is fast and at which part it is slow.[0295]
The point displaying part 25h is the area showing whether any other in point or out point set in the editing operation exists between the in point and the out point of the specified event. In the point displaying part 25h, if such another in point or out point exists, a pointer 25ha is displayed at its position. By the presence of the pointer 25ha, the operator can easily grasp the presence of the other edit point.[0296]
The in-point time-code displaying part 25i and the out-point time-code displaying part 25j are the areas for displaying the time codes of the in point and the out point of the selected event, respectively.[0297]
The memory-residual indicator part 25k is the area showing the residual of the maximum learn continue time when the learn button 25a is clicked, the reproducing speed is set using the dedicated controller 2e, and the reproducing speed is stored to the RAM 10b of the CPU 10. The memory area allocated for setting the reproducing speed of one event has been previously determined; therefore, by checking the residual capacity of the memory area, the residual can be easily computed. By providing such a memory-residual indicator part 25k, the memory residual can be confirmed visually; thereby, a setting of the reproducing speed such that the maximum learn continue time is exceeded can be prevented.[0298]
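The residual computation can be sketched as follows. The assumption that one speed entry is stored per video frame follows from the data format of FIG. 17 described later, while the entry capacity and frame rate used here are illustrative values, not taken from the specification:

```python
def learn_time_remaining(capacity_entries, used_entries, frames_per_second=30):
    """Remaining learn time in seconds, assuming one reproducing speed entry
    is stored per video frame (capacity and frame rate are example values)."""
    free_entries = max(capacity_entries - used_entries, 0)
    return free_entries / frames_per_second


# With room for 900 entries and none used yet, 30 seconds of learning remain.
remaining = learn_time_remaining(900, 0)
```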
(6-2) Dedicated Controller[0299]
The dedicated controller 2e used in setting the reproducing speed is described hereinafter, referring to FIG. 16. As shown in FIG. 16, the dedicated controller 2e has, in addition to a plurality of operation buttons, a search dial 400 which is a rotary encoder and a motion control lever 401 which is a slide encoder. The reproducing speed can be freely entered by manual operation using these two operation means.[0300]
First, the arrangement of the operation buttons provided on the operation panel of the dedicated controller 2e is described. On the upper part of the operation panel, a learn button 402, a start button 403, a stop button 404, a recording-side select button 405, a reproducing-side select button 406, a play button 407, a still button 408, a mark-in button 409 and a mark-out button 410 are provided. Below these operation buttons, the search dial 400 described above, a shuttle button 411, a jog button 412, a variable button 413 and a variable indicator 414 are provided.[0301]
On the right side of the operation panel, a preview button 415, a cursor button 416 and an enter button 417 are provided from the top. On the other hand, on the left side of the operation panel, the motion control lever 401 described above is provided so as to slide up and down with respect to the operation panel.[0302]
Among these operation buttons, the learn button 402 is used when the reproducing speed is set by the motion control lever 401 or the search dial 400 and stored. The storing of the reproducing speed is performed from when the learn button 402 is pushed and the motion control lever 401 or the search dial 400 is operated until the mark-out button 410 is pushed. Note that, the learn button 402 has almost the same function as the learn button 25a displayed in the reproducing speed setting area 25.[0303]
The start button 403 is operated when a recording-start command is sent to the hybrid recorder 3 to record the video signal displayed in the recording video displaying area 21. The stop button 404 is operated when a recording-stop command is sent to the hybrid recorder 3 to stop the recording operation of the video signal displayed in the recording video displaying area 21. Note that, these buttons 403 and 404 have the same functions as the recording-start button 31a and the recording-stop button 31b displayed on the monitor 2b.[0304]
The recording-side select button 405 and the reproducing-side select button 406 are used when the object to be controlled by the dedicated controller 2e is selected. When the recording side is controlled by the dedicated controller 2e, the recording-side select button 405 is pushed; when the reproducing side is controlled, the reproducing-side select button 406 is pushed.[0305]
The play button 407 is used when a reproducing-start command is sent to the hybrid recorder 3 to display the video signal in the reproducing video displaying area 23. The still button 408 is used when a reproducing-stop command is sent to the hybrid recorder 3 to stop the reproducing operation of the video signal displayed in the reproducing video displaying area 23. If the still button 408 is pushed, a still image is displayed on the reproducing video screen 23a.[0306]
The mark-in button 409 and the mark-out button 410 are used when an in point and an out point are set, respectively. Note that, if the recording-side select button 405 has been pushed, these buttons 409 and 410 operate in the same manner as the mark-in button 24c and the mark-out button 24f of the recording video marking area 24. On the contrary, if the reproducing-side select button 406 has been pushed, these buttons 409 and 410 operate in the same manner as the mark-in button 27c and the mark-out button 27f of the reproducing video marking area 27.[0307]
The shuttle button 411 is pushed when the search dial 400 is to be operated in the shuttle mode, and the jog button 412 is pushed when the search dial 400 is to be operated in the jog mode. The variable button 413 is pushed when the search dial 400 is to be operated in the variable mode or the motion control lever 401 is to be operated. Note that, when the variable button 413 is pushed once, the variable indicator 414 on the right side is lit and the search dial 400 is set to the variable mode. If it is pushed once again, the variable indicator 414 on the left side is lit and the motion control lever 401 becomes usable. If it is pushed once more, the variable indicators 414 on both sides are turned off, and the search dial 400 and the motion control lever 401 become unusable.[0308]
The preview button 415 is used when the selected event or program is to be previewed. If the preview button 415 is pushed in the state where an event or program has been selected, a reproducing-start command for the event or program is supplied to the hybrid recorder 3 and the video signal of the event or program is displayed on the reproducing video screen 23a.[0309]
The cursor button 416 is composed of the following four buttons: an upward button, a downward button, a leftward button and a rightward button. The cursor button 416 is the button for moving the cursor when the clipped image data is selected in the clip displaying area 28, the event displaying area 29 or the program displaying area 30.[0310]
The enter button 417 is allocated two kinds of functions: one is the function for entering a registration instruction by which the section from the in point to the out point set in the reproducing video marking area 27 is newly registered as an event (this is the same as the new-event button 33 displayed on the monitor 2b); the other is the function for entering a send-out instruction when the selected event or program is sent out.[0311]
The search dial 400 is a rotary encoder for entering the reproducing speed data corresponding to the turning operation by the operator. This search dial 400 operates in one of three modes, the shuttle mode, the jog mode or the variable mode, according to which of the shuttle button 411, the jog button 412 or the variable button 413 is pushed. First, in the shuttle mode, reproducing speed data from −100 times-speed to +100 times-speed can be entered depending on the turned position of the search dial 400. Note that, in this mode, the search dial 400 clicks into place at the positions of a still image, +10 times-speed and −10 times-speed.[0312]
In the jog mode, reproducing speed data from −1 times-speed to +1 times-speed can be entered depending on the turned position of the search dial 400.[0313]
In the variable mode, reproducing speed data from −1 times-speed to +3 times-speed can be entered depending on the turned position of the search dial 400. Note that, in this mode, the search dial 400 clicks into place at the positions of a still image and +1 times-speed.[0314]
In this manner, a selection can be made among the jog mode, in which the reproducing speed can be set precisely owing to the reduced control range, the shuttle mode, in which a wide range can be covered by a rough setting of the reproducing speed, and the variable mode, in which the setting range on the plus side is widened. Thereby, the operator can set the reproducing speed freely by switching the mode depending on the desired reproducing speed.[0315]
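The three modes can be summarized as range clamps on the entered speed data. The ranges come from the description above, while the linear mapping from dial position to speed is an assumption made for illustration:

```python
# Speed ranges of the three search-dial modes (from the description above).
MODE_RANGES = {
    "shuttle":  (-100.0, 100.0),
    "jog":      (-1.0, 1.0),
    "variable": (-1.0, 3.0),
}

def dial_speed(mode, position):
    """Map a normalized dial position in [-1.0, +1.0] to a reproducing speed,
    scaled and clamped to the selected mode's range (mapping is illustrative)."""
    low, high = MODE_RANGES[mode]
    # Scale positive turns toward the upper bound, negative turns toward the
    # lower bound, then clamp to the mode's range.
    speed = position * (high if position >= 0 else -low)
    return max(low, min(high, speed))
```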
The motion control lever 401 is the slide encoder for entering the reproducing speed data in response to the slide operation by the operator. By sliding the motion control lever 401 up and down, reproducing speed data from a still image to +1 times-speed can be entered. Note that, on both sides of the motion control lever 401, a range widening button 401a is provided. If the range widening button 401a is pushed, the range of enterable reproducing speed data can be widened to the range from −1 times-speed to +3 times-speed.[0316]
In this manner, reproducing speed data from a still image to +1 times-speed can be entered by the motion control lever 401, so that the operator can freely set the reproducing speed within that range.[0317]
Moreover, as the mechanisms for entering the reproducing speed data, the search dial 400 having a rotating operation system and the motion control lever 401 having a slide operation system are provided, so that the operator can enter the reproducing speed data by selecting whichever of these is easier to use; thus the usability can be improved.[0318]
Note that, the instruction data entered from the various operation buttons of the dedicated controller 2e described in this section, and the reproducing speed data entered from the search dial 400 and the motion control lever 401, are supplied to the CPU 10 via the pointing-device interface 17. Thereby, the CPU 10 performs operation control corresponding to the instruction data and reproduction control corresponding to the reproducing speed data for the specified event. If the learn button has been pushed, the CPU 10 stores the reproducing speed data to the RAM 10b as the reproducing speed of the specified event.[0319]
The reproducing speed data set by the edit operator is stored in the RAM 10b in the data format shown in FIG. 17. To be concrete, the reproducing speed data is stored for each video frame from the in point to the out point of the specified event. Note that, since this reproducing speed data is a control command according to the RS-422 communication protocol, the CPU 10 transmits this reproducing speed data itself to the hybrid recorder 3 through the RS-422 cable. Then the CPU 303 of the hybrid recorder 3, given the reproducing speed data, performs the computation v=10^(N/32−2), where N is the reproducing speed data and v is the reproducing speed of the video data outputted from the hybrid recorder 3. Accordingly, if the reproducing speed data N supplied from the computer 2 is “64”, 1.0-time speed video data is reproduced from the hybrid recorder 3, and if the speed data N supplied from the computer 2 is “32”, 0.1-time speed video data is reproduced from the hybrid recorder 3.[0320]
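The speed computation performed by the CPU 303 can be written directly from the formula above:

```python
def reproducing_speed(n):
    """Reproducing speed v for RS-422 speed data N: v = 10 ** (N / 32 - 2).
    N = 64 gives 1.0-time speed; N = 32 gives 0.1-time speed."""
    return 10 ** (n / 32 - 2)
```

The exponential mapping gives fine control near still image and coarse control at high speeds, which matches the slow-playback use described earlier.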
(6-3) Setting Method of Reproducing Speed[0321]
The setting procedure when the reproducing speed is set using the reproducing speed setting area 25 is described hereinafter.[0322]
In the most typical setting method, at first a desired event to be set is specified by clicking in the event displaying area 29. Then the reproducing speed setting button 22h is clicked in the timing displaying area 22. As a result, the number of the specified event and its duration are displayed in the reproducing speed setting area 25. Then the learn button 25a of the reproducing speed setting area 25 is clicked. Thereby, the reproducing speed becomes settable, and reproducing speed data is inputted by operating the motion control lever 401 or the search dial 400. The thus inputted reproducing speed is sequentially stored in the RAM 10b of the CPU 10. If the setting of the reproducing speed is to be stopped here, the mark-out button 27f in the reproducing video marking area 27 or the mark-out button 410 of the dedicated controller 2e is pushed at the position where the setting is to be stopped, stopping the setting of the reproducing speed. If the thus set reproducing speed is to be memorized, the new-event button 33 or the replace button 34 may be clicked.[0323]
In another setting method, a learn button 402 of the dedicated controller 2e is pushed at a desired position while monitoring the reproducing video screen 23a in the reproducing video displaying area 23. As a result, an in point is set and the reproducing speed becomes settable; reproducing speed data may then be inputted by operating the motion control lever 401 or the search dial 400 of the dedicated controller 2e in a similar manner. The inputted reproducing speed is sequentially stored in the RAM 10b of the CPU 10. To stop setting the reproducing speed, the mark-out button 27f in the reproducing video marking area 27 or the mark-out button 410 of the dedicated controller 2e is pushed at the position where setting is to be stopped. To memorize the reproducing speed thus set, the new event button 33 or the replace button 34 may be clicked.[0324]
In a further method, a desired event for which a reproducing speed is to be set is specified from the event displaying area 29 by clicking, and the reproducing speed setting button 22h in the timing displaying area 22 is clicked. The number of the specified event and its duration are then displayed in the reproducing speed setting area 25. Next, the speed fitting button 25b in the reproducing speed setting area 25 is clicked, whereby reproducing speed data can be inputted from the keyboard. In this case, the operator inputs not the reproducing speed data (speed data) itself but a duration value. By this operation, the optimal reproducing speed corresponding to the duration value is automatically set to the event.[0325]
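One plausible reading of this "speed fit" operation can be sketched as follows, assuming the event is to play at a constant speed: the required speed is the event's natural length divided by the desired duration, and inverting v = 10^((N/32)−2) recovers the speed data N. The function and its details are illustrative assumptions, not taken from the original system:

```python
import math

# Hypothetical sketch of a "speed fit" computation: given an event's natural
# length and the operator's desired duration (both in frames), derive the
# constant speed v = natural / desired, then invert v = 10 ** (N / 32 - 2)
# to obtain the per-frame speed data N. Names are illustrative only.

def fit_speed_data(natural_frames: int, desired_frames: int) -> int:
    v = natural_frames / desired_frames      # required playback speed
    n = 32 * (math.log10(v) + 2)             # invert v = 10 ** (N / 32 - 2)
    return round(n)

# A 300-frame event played over 300 frames needs 1.0x speed -> N = 64.
print(fit_speed_data(300, 300))   # -> 64
# The same event stretched to 3000 frames needs 0.1x speed -> N = 32.
print(fit_speed_data(300, 3000))  # -> 32
```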
Note that, to preview the event, the preview button 32 may be clicked; to memorize the reproducing speed thus set, the new event button 33 or the replace button 34 may be clicked.[0326]
(7) Preroll Mode[0327]
The preroll mode provided in the editing system 1 is described hereinafter. Normally, when an event is produced, the operator clicks the mark-in button 24c and the mark-out button 24f of the recording video marking area 24 to specify an in point and an out point while viewing the video data displayed on the recording video screen 21a. Thereby, in the editing system 1, the video data from the specified in point to the out point is registered as an event. To confirm the registered event, the operator clicks the event displayed in the event displaying area 29 to specify it, and then clicks the preview button 32. Thereby, the reproducing operation of the event is started and the video data from the in point to the out point of the event is displayed on the reproducing video screen 23a.[0328]
When specifying the in point of an event, the operator operates the mark-in button 24c while viewing the video data displayed on the recording video screen 21a, so that the in point is sometimes specified after the scene desired to be registered as the event, owing to the delay in operating the mark-in button 24c. For instance, when the scene of a home run in a baseball broadcast is to be registered as an event, it is generally desired to register the scene from the moment the pitcher threw the ball to the moment the batted ball entered the stands. However, whether it is a home-run scene cannot be known until the batted ball enters the stands, so specifying the in point is inevitably delayed. Since such an event with a delayed in point does not include the important scene, the event must be corrected.[0329]
Therefore, the editing system 1 provides the preroll mode, in which the reproducing operation is automatically started from a position a predetermined time before the in-point position specified by the operator, so that the marking point can be easily corrected. This preroll mode is concretely described hereinafter.[0330]
First, the time used in the preroll mode, that is, the time by which the reproducing start point is shifted before the in point specified by the operator (hereinafter referred to as a queue-up time), can be set freely in the environment setting of the menu. To set the queue-up time, the environment setting prepared as a menu is called up and the queue-up item is selected. By this selection, the queue-up setting screen is displayed on the screen as shown in FIG. 18. In this queue-up setting screen, by clicking the set-time display area 500 and entering the time to be set as the queue-up time in units of seconds, the time is displayed in the set-time display area 500 and temporarily set.[0331]
Note that, by clicking the button for the desired direction among the jog buttons 501 adjacent to the set-time display area 500, the time is shifted in that direction in units of seconds. The queue-up time may also be entered using the jog buttons 501.[0332]
After inputting the queue-up time in this manner, by clicking a set button 502, the time displayed in the set-time display area 500 is formally registered as the queue-up time. More specifically, the entered queue-up time is stored in the memory area for environment setting data of the RAM 10b. Note that, by clicking a cancel button 503, the time displayed in the set-time display area 500 is reset and a queue-up time can be newly entered. In this connection, when the set button 502 is clicked, the queue-up setting screen is automatically cleared from the screen.[0333]
In the state where the queue-up time has been set in this manner, clicking the preroll button 22g of the timing displaying area 22 instructs the start of the preroll mode; the preroll button 22g then lights up and the preroll mode starts. Note that, to cancel the preroll mode, the preroll button 22g is clicked again to instruct the stop of the preroll mode; the preroll button 22g then goes out and the preroll mode is canceled.[0334]
If the operator clicks the mark-in button 24c of the recording video marking area 24 in the state where the preroll mode has been started, an in point is specified and the clipped image data specified as the in point is displayed in the in-clip displaying area 24a. At the same time, the set queue-up time is read out and the time code at the position shifted by the queue-up time from the time code of the position specified as the in point is computed. A reproducing command is then sent to the hybrid recorder 3 so that the position of the computed time code is set as the reproducing start point. Thereby, the reproducing operation is automatically started from the reproducing start point in this editing system 1. The reproduced video signal V3 is displayed on the reproducing video screen 23a, so that the operator can easily correct the in point by clicking the mark-in button 27c of the reproducing video marking area 27 while viewing the reproducing video screen 23a. Note that, by specifying an out point by clicking the mark-out button 27f and then clicking the new-event button 33, the video data in the period from the in point to the out point is registered as an event.[0335]
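The time-code shift described above can be sketched as follows. A non-drop-frame 30 fps time code and the helper names are illustrative assumptions for this sketch, not details taken from the system itself:

```python
# Sketch of the preroll computation: the reproducing start point is the
# in-point time code shifted back by the queue-up time. A non-drop-frame
# 30 fps time code ("HH:MM:SS:FF") is assumed for illustration.

FPS = 30

def timecode_to_frames(tc: str) -> int:
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_timecode(total: int) -> str:
    f = total % FPS
    s = total // FPS
    return f"{s // 3600:02d}:{s // 60 % 60:02d}:{s % 60:02d}:{f:02d}"

def preroll_start(in_point: str, queue_up_seconds: int) -> str:
    """Shift the in-point time code back by the queue-up time."""
    return frames_to_timecode(timecode_to_frames(in_point) - queue_up_seconds * FPS)

# With a 5-second queue-up time, reproducing starts 5 s before the in point.
print(preroll_start("00:12:34:10", 5))  # -> 00:12:29:10
```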
By starting the preroll mode in advance in this manner, even in the case where, for example, in a baseball broadcast, the in point is specified by clicking the mark-in button 24c at the moment the batted ball has entered the stands, the reproducing operation is automatically performed from a position a predetermined time before the in point. The in point can thus be easily corrected simply by clicking the mark-in button 27c while viewing the reproducing screen. For example, if the in point is corrected to the time point at which the pitcher threw the ball, an event including the desired scene, such as the moment the batter hit the ball, can be easily produced in real time.[0336]
(8) Work Data Folder[0337]
In this chapter, the work data folder will be described. In the editing system 1, the work data regarding the events and programs generated by edit operations is generally stored in the RAM 10b. However, when the edit operation is stopped by closing the application program, the work data is downloaded to the hard-disk drive 15a provided in the computer 2 and stored on a hard disk in the hard-disk drive 15a. At this time, the work data is stored in a hierarchic structure called a folder.[0338]
This respect is described concretely hereinafter, referring to FIG. 20. As shown in FIG. 20, the work data regarding events and programs is stored in a hierarchic structure called a folder. A folder is almost the same as a directory in MS-DOS or the like: a work data folder 600 is set as the uppermost hierarchy, folders 601-603 are formed as its lower hierarchy, and each data file is stored and managed in this hierarchic structure. Note that, the work data folder 600 is formed in the hard-disk drive 15a by the computer 2 when the editing system 1 is started.[0339]
First, the clipped image data to be displayed in the clip displaying area 28, the event displaying area 29, and the program displaying area 30 is stored, for each piece of clipped image data, as a clip-image file in the lower hierarchy of the clip folder 601 formed in the lower hierarchy of the work data folder 600. The contents of a clip-image file are the clipped image data itself, to which video data showing the clipped image is written. As the file name of a clip-image file, a name in which the extension ".pic" is added to the index number assigned to each piece of clipped image data is used.[0340]
Furthermore, the symbol image data registered as the typical clipped image of an event is stored, for each piece of symbol image data, as a symbol-image file in the lower hierarchy of a symbol-image folder 602 formed in the lower hierarchy of the work data folder 600. In a symbol-image file, video data showing the symbol image is written. As the file name of a symbol-image file, a name in which the extension ".pic" is added to the event number of the symbol image is used.[0341]
The work data regarding a program is directly stored in the lower hierarchy of the work data folder 600 as a program file, without forming a lower folder. In this program file, the event numbers of the events composing the program are sequentially written, so the events forming the program can be seen by referring to the program file. As the file name of the program file, a name in which the extension ".dat" is added to "PROG", which shows that it is a program file, is used.[0342]
The work data regarding events is likewise directly stored in the lower hierarchy of the work data folder 600 as an event file, without forming a lower folder. In this event file, the clip numbers of the in point and out point are sequentially written for each event number, so the clip numbers of the in point and out point forming each event can be seen by referring to the event file. As the file name of this event file, a name in which the extension ".dat" is added to "EVNT", which shows that it is an event file, is used.[0343]
Also, the work data regarding clipped image data is directly stored in the lower hierarchy of the work data folder 600 as a clip file, without forming a lower folder. In this clip file, the index number and the time code of the clipped image data are sequentially written for each clip number, so the index number of the image data forming each piece of clipped image data can be seen by referring to the clip file. As the file name of this clip file, a name in which the extension ".dat" is added to "CLIP", which shows that it is a clip file, is used.[0344]
Furthermore, the speed data showing the reproducing speed of an event set using the reproducing speed setting area 25 (see FIG. 17) is stored, for each event, as a slow data file in the lower hierarchy of the slow data folder 603 formed in the lower hierarchy of the work data folder 600. In this slow data file, the speed data shown in FIG. 17 is written for each frame, so the reproducing speed that has been set for the event can be seen by referring to the slow data file. As the file name of this slow data file, a name in which the extension ".dat" is added to the event number assigned to each event is used.[0345]
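The naming conventions described in the preceding paragraphs can be summarized in a short sketch; the helper names are hypothetical, and only the file-name patterns themselves are taken from the text:

```python
# Illustrative sketch of the file-naming conventions of the work data folder.
# Clip images use "<index number>.pic", symbol images "<event number>.pic",
# and the program/event/clip work files use fixed names with ".dat".

def clip_image_file(index_number: int) -> str:
    return f"{index_number}.pic"      # stored under the clip folder 601

def symbol_image_file(event_number: int) -> str:
    return f"{event_number}.pic"      # stored under the symbol-image folder 602

WORK_FILES = {
    "program": "PROG.dat",   # event numbers composing the program
    "event": "EVNT.dat",     # in/out clip numbers for each event number
    "clip": "CLIP.dat",      # index number and time code per clip number
}

print(clip_image_file(7))     # -> 7.pic
print(WORK_FILES["program"])  # -> PROG.dat
```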
In this manner, in the editing system 1, when the application program is closed, the work data regarding the events and programs generated by edit operations is stored on the hard disk in the hard-disk drive 15a in a hierarchic structure. Thereby, when the application program is started again, the same clipped image data as was displayed before closing can be displayed in the program displaying area 30 and the event displaying area 29 by reading out the work data stored on the hard disk, returning the system to the state before the application program was closed. Moreover, by storing the work data in this manner, the work data can be read out later, and thus an edit list, such as an edit decision list (EDL), can be sent out.[0346]
(9) Description of Computer's Operation[0347]
In this chapter, the operation of the computer 2 in each processing will be described referring to flowcharts. Note that, the flowcharts used in the following description generally describe the operation of the CPU 10. Furthermore, in the description in clauses (9-1) through (9-7), the recording operation or the reproducing operation of the hybrid recorder 3 means the recording operation or the reproducing operation of the hard-disk drive 300.[0348]
(9-1) Initial Operation[0349]
First, the initial operation of the computer 2 is described referring to FIG. 21. If the execution of the application program is specified by the operator in step SP1, the CPU 10 starts the operation. In the next step SP2, since the application program has been stored on the hard disk of the hard-disk drive 15a, the CPU 10 loads the application program into the operational RAM 10b provided in the CPU 10.[0350]
In the next step SP3, the CPU 10 executes the application program loaded into the RAM 10b. In the next step SP4, the CPU 10 secures, in the RAM 10b, a memory area for storing a plurality of pieces of clipped image data and the edit data that will be generated by future edit operations. At this time, the first management records for clip data, event data and program data, such as shown in FIGS. 13A to 13C, are generated in the RAM 10b.[0351]
In the next step SP5, the CPU 10 generates, on the hard disk in the hard-disk drive 15a, a work data folder for storing the work data of the programs and events that will be generated by the subsequent edit work.[0352]
In the next step SP6, to display a graphic display for the GUI on the monitor 2b, the CPU 10 transmits graphic data to the VRAM 13b in real time, synchronizing with the internal clock of the computer 2. Thereby, in the next step SP7, a graphic display identical to the graphic data stored in the VRAM 13b is displayed on the monitor 2b.[0353]
In the next step SP8, the CPU 10 determines whether or not to display the source video signal V2 on the recording video screen 21a. This determination is conducted based on whether video display has been specified by the operator. If no video display has been specified, the CPU 10 determines that no edit operation will be performed, and goes to step SP16 to stop the processing. In the normal case, video display has been specified to perform the edit operation, so the CPU 10 goes to step SP9 and enters the display processing of the source video signal V2.[0354]
In step SP9, the CPU 10 supplies an RS-422 control command to the hybrid recorder 3 to instruct the hybrid recorder 3 to send out the source video signal V2 as editing material. Having received this, the hybrid recorder 3 adds a time code to the input source video signal V1 to generate the video signal V2, and supplies it to the computer 2.[0355]
In the next step SP10, the data converting unit 11b extracts the time code from the input composite video signal V2 and converts the composite video signal V2 into digital component video data. The converted video data is supplied to the frame memory 11c and temporarily stored in frame units. The extracted time code data is supplied to the processor controller 11a and sent out to the CPU 10 via the processor controller 11a.[0356]
In the next step SP11, the video data stored in the frame memory 11c is transmitted to the VRAM 13b. In the video data to be transmitted, the number of samples read out from the frame memory 11c is reduced, i.e., the video data is reduced to 380×240 pixels. Note that, at this time adjustment of the image data bus 5a is conducted, so that image data for the GUI is transmitted from the CPU 10 to the VRAM 13b in addition to the video data. Furthermore, the video data stored in the VRAM 13b is updated in real time, so the video data can be displayed in real time on the monitor 2b.[0357]
In the next step SP12, the image data and video data stored in the VRAM 13b are displayed on the monitor 2b in real time. In the next step SP13, the CPU 10 determines whether or not to record the video data displayed on the recording video screen 21a to the hybrid recorder 3. This determination is conducted based on the click operation of the recording start button 31a by the operator. That is, if the recording start button 31a has been clicked, the CPU 10 determines that the video data is to be recorded and goes to the next step SP14. If not, the CPU 10 determines that the video data is not to be recorded and goes to step SP16 to stop the processing.[0358]
In step SP14, the CPU 10 sends out a recording start command to the external interface 18. Having received this command, the external interface 18 converts the recording start command into the RS-422 communication format and sends it out to the hybrid recorder 3. Then the hybrid recorder 3 starts the recording operation on the input source video signal V1.[0359]
Since the recording operation has been started in the hybrid recorder 3, in the next step SP15 the CPU 10 judges that all of the initial setting has finished, and stops the procedure of the initial operation shown in this flowchart.[0360]
(9-2) Marking on Recording Side[0361]
Marking using the recording video marking area 24 is described referring to the flowchart of FIG. 22. This marking can be understood more easily by referring to the description of FIGS. 11A to 11C.[0362]
After the procedure of the initial operation shown in FIG. 21 is finished, the system enters the state where this marking operation can be begun, and the processing is started in step SP20.[0363]
In step SP21, the CPU 10 judges whether a mark has been newly made or not. The judgment of the presence of marking is performed based on whether the mouse 2d was clicked while the cursor was positioned within the area of the mark-in button 24c or the mark-out button 24f in the recording video marking area 24. Since an interrupt command is generated by clicking the mouse 2d, the CPU 10 conducts the judgment of marking based on the generation of this interrupt command. As a result, if the mark-in button 24c was clicked, the CPU 10 judges that an in point has been specified and goes to step SP22; if the mark-out button 24f was clicked, the CPU 10 judges that an out point has been specified and goes to step SP30.[0364]
In step SP22, the clipped image data of the in point is generated. The in-point clipped image data is generated by reading the video data stored in the frame memory 11c out to the VRAM 13b. At this time, since the data quantity is thinned out to 1/16 by reducing the number of samples to be read out, clipped image data of 95×60 pixels in image size is generated.[0365]
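The 1/16 figure follows from sampling every fourth pixel in each dimension, which reduces the 380×240 monitor image to the 95×60 clipped image; the helper below is an illustrative check of this arithmetic, not part of the system:

```python
# The 1/16 thinning above: keeping every 4th sample in each dimension reduces
# a 380 x 240 frame to 95 x 60, i.e. 1/16 of the original data quantity.

def thin(width: int, height: int, step: int = 4) -> tuple:
    return width // step, height // step

w, h = thin(380, 240)
print(w, h)                                   # -> 95 60
print((w * h) * 16 == 380 * 240)              # -> True
```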
In step SP23, the in-point clipped image data stored in the memory area for the in-clip displaying area of the VRAM 13b is read out and displayed in the in-clip displaying area 24a.[0366]
In step SP24, the CPU 10 judges whether the marking in step SP21 is the first in-point marking or not. If it is the first marking, the CPU 10 returns to step SP21, while if it is the second marking or later, the CPU 10 goes to step SP25.[0367]
In step SP25, the CPU 10 judges whether the clipped image data marked before is in-point clipped image data or not. If the clipped image data marked before is in-point clipped image data, the CPU 10 goes to step SP26, while if it is out-point clipped image data, the CPU 10 goes to step SP27.[0368]
In step SP26, the in-point clipped image data marked before is moved to the clip displaying area 28. That is, since an in point was marked twice in succession, the clipped image data marked before will not be used as an event and is moved to the clip displaying area 28. Note that, at this time the second management record data of the clipped image data moved to the clip displaying area 28 is generated as shown in FIGS. 11 through 13C.[0369]
On the other hand, in step SP27, the CPU 10 judges whether or not an event was generated by the out-point clipped image data marked before. If an event was generated by the previous marking, the CPU 10 goes to step SP29, while if not, the CPU 10 goes to step SP28.[0370]
In step SP28, the out-point clipped image data which was displayed in the out-clip displaying area 24d by the previous marking is moved to the clip displaying area 28. Although the out-point clipped image data generated by the previous marking was not used as an event, it might be used later, so it is retained as marking history.[0371]
On the contrary, in step SP29, the out-point clipped image data being displayed in the out-clip displaying area 24d is cleared. In this case, the clipped image data displayed in the out-clip displaying area 24d has already been used as the out point of an event, so there is no need to display it any more.[0372]
On the other hand, if the CPU 10 goes to step SP30 because out-point marking was detected by the determination in step SP21, out-point clipped image data is generated here. This out-point clipped image data is also generated by reading the video data stored in the frame memory 11c out to the VRAM 13b. Also in this case, 95×60-pixel clipped image data is generated by thinning out the data quantity to 1/16.[0373]
In step SP31, the out-point clipped image data stored in the memory area for the out-clip displaying area of the VRAM 13b is read out and displayed in the out-clip displaying area 24d.[0374]
In the next step SP32, the CPU 10 judges whether the marking in step SP21 is the first out-point marking or not. If it is the first marking, the CPU 10 returns to step SP21, while if it is the second marking or later, the CPU 10 goes to step SP33.[0375]
In step SP33, the CPU 10 judges whether the clipped image data marked before is in-point clipped image data or not. If the clipped image data marked before is in-point clipped image data, the CPU 10 goes to step SP34, while if it is out-point clipped image data, the CPU 10 goes to step SP36.[0376]
In step SP34, the CPU 10 registers the period from the in point marked before to the out point marked this time as an event. In this manner, in the editing system 1, if an out point is marked after an in point, the period is automatically registered as an event. Note that, at this time the second management record data regarding the event is generated as shown in FIGS. 11 through 13C.[0377]
In the next step SP35, the in-point clipped image data of the generated event is copied to the event displaying area 29 and the clipped image data is displayed in the event displaying area 29.[0378]
On the other hand, in step SP36, the CPU 10 judges whether or not an event was generated by the out-point clipped image data marked before. If an event was generated by the previous marking, the CPU 10 goes to step SP38, while if no event was generated by the previous marking, the CPU 10 goes to step SP37.[0379]
In step SP37, the out-point clipped image data generated by the previous marking is moved to the clip displaying area 28. Although the out-point clipped image data generated by the previous marking was not used as an event, it might be used later, so it is retained as marking history.[0380]
On the contrary, in step SP38, the in-point clipped image data being displayed in the in-clip displaying area 24a is cleared. Since an event was generated from the clipped image data being displayed in the in-clip displaying area 24a and the out-point clipped image data marked before, the clipped image data being displayed in the in-clip displaying area 24a will not be used again and there is no need to display it any more.[0381]
After the processing of step SP26, SP28, SP29, SP35, SP37 or SP38 is finished, the CPU 10 goes to step SP39 to determine whether or not to stop the marking operation. In the case of continuing the marking operation, the CPU 10 returns to step SP20 and repeats the processing, while in the case of stopping the marking operation, the CPU 10 goes to step SP40 to stop the processing.[0382]
(9-3) Marking on Reproducing Side[0383]
Hereinafter, the case of marking using the reproducing video marking area 27 while monitoring the video signal V3 reproduced from the hybrid recorder 3 is described, referring to the flowcharts of FIGS. 23 and 24.[0384]
This marking is started in the state where clipped image data has already been stored. In step SP51 following the start step SP50, the CPU 10 judges whether or not clipped image data in the clip displaying area 28 has been specified. The CPU 10 judges that the clipped image data has been specified when the mouse 2d was double-clicked (clicked twice in succession) while the cursor was positioned at the display position of the clipped image data (28a).[0385]
If the clipped image data has been specified, in the next step SP52 the specified clipped image data is displayed in the reproducing video marking area 27. That is, if in-point clipped image data was specified, it is displayed in the in-clip displaying area 27a, while if out-point clipped image data was specified, it is displayed in the out-clip displaying area 27d.[0386]
In the next step SP53, the CPU 10 refers to the time code of the specified clipped image data and sends out, to the external interface 18, a control command to reproduce the video data of that time code as a still image. Having received this command, the external interface 18 converts the still-reproducing command into the RS-422 communication format and supplies it to the hybrid recorder 3. The hybrid recorder 3 determines the recording address based on the supplied time code by referring to a correspondence table of time codes and recording addresses, and reads the video data from the position of that recording address to reproduce the specified video data. This video data is supplied to the second video processor 12 in the computer 2 as the video signal V3.[0387]
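The time-code-to-address lookup described above can be sketched as a simple table; the dict-based table, its contents, and the function name are illustrative assumptions and say nothing about the recorder's actual data structure:

```python
# Sketch of the correspondence table between time codes and recording
# addresses that the hybrid recorder consults before reading a still frame.
# The entries below are invented sample data.

ADDRESS_TABLE = {
    "00:00:10:00": 0x1000,   # time code -> recording address (hypothetical)
    "00:00:11:00": 0x1400,
}

def recording_address(time_code: str) -> int:
    """Resolve a clipped image's time code to its recording address."""
    return ADDRESS_TABLE[time_code]

print(hex(recording_address("00:00:10:00")))  # -> 0x1000
```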
In the next step SP54, the time code is extracted from the video signal V3 in the second video processor 12, and image processing converting the video signal V3 into digital component video data is conducted. The converted video data is temporarily stored in the frame memory 12c in the second video processor 12.[0388]
In the next step SP55, the still-reproduced video data stored in the frame memory 12c is reduced to 380×240-pixel data and transmitted to the VRAM 13b.[0389]
In the next step SP56, the reproduced video data stored in the VRAM 13b is displayed on the reproducing video screen 23a. In this case the hybrid recorder 3 does not supply real-time video data but supplies only the still video data corresponding to the specified clipped image data, so a still image is displayed on the reproducing video screen 23a.[0390]
In the next step SP57, the CPU 10 judges whether or not reproducing of the still video data displayed on the reproducing video screen 23a has been instructed. The CPU 10 determines that reproducing has been instructed when the preview button 32 was clicked while the still video data was displayed on the reproducing video screen 23a.[0391]
If reproducing has been instructed, in the next step SP58 the CPU 10 supplies a reproducing start command to the external interface 18. Having received this command, the external interface 18 converts the reproducing command into the RS-422 communication format and supplies it to the hybrid recorder 3. The hybrid recorder 3 sequentially reads out video data from the recording address corresponding to the video data being displayed on the reproducing video screen 23a, so that normal reproduced video data following the video data being displayed on the reproducing video screen 23a is generated. This reproduced video data is supplied to the second video processor 12 in the computer 2 as the video signal V3.[0392]
In the next step SP59, the CPU 10 judges whether marking has been performed or not. The presence of marking is judged based on whether the mouse 2d was clicked while the cursor was positioned within the area of the mark-in button 27c or the mark-out button 27f in the reproducing video marking area 27. Since an interrupt command is generated by clicking the mouse 2d, the CPU 10 judges the presence of marking by the generation of this interrupt command. If the mark-in button 27c was clicked, the CPU 10 judges that an in point has been specified and goes to step SP60; if the mark-out button 27f was clicked, the CPU 10 judges that an out point has been specified and goes to step SP63.[0393]
In step SP60, the clipped image data of the in point is generated. The in-point clipped image data is generated by reading the video data stored in the frame memory 12c out to the VRAM 13b. At this time the data quantity is thinned out to 1/16 by reducing the number of samples to be read out, so clipped image data of 95×60 pixels in image size is generated.[0394]
In the next step SP61, the in-point clipped image data stored in the memory area for the in-clip displaying area of the VRAM 13b is read out and displayed in the in-clip displaying area 27a.[0395]
In the next step SP62, the in-point clipped image data which was marked before and displayed in the in-clip displaying area 27a is moved to the clip displaying area 28. Note that, if no marking has been performed and no clipped image data has been displayed in the in-clip displaying area 27a, this processing is not performed. After finishing this step SP62, the CPU 10 goes to step SP70.[0396]
On the other hand, if the CPU 10 goes to step SP63 to mark an out point, out-point clipped image data is generated here. This out-point clipped image data is also generated by reading the video data stored in the frame memory 12c out to the VRAM 13b. Also in this case, 95×60-pixel clipped image data is generated by thinning out the data quantity to 1/16 when it is read out.[0397]
In step SP64, the out-point clipped image data stored in the memory area for the out-clip displaying area of the VRAM 13b is read out and displayed in the out-clip displaying area 27d.[0398]
In step SP65, the CPU 10 judges whether the clipped image data marked before is in-point clipped image data or not. If the clipped image data marked before is in-point clipped image data, the CPU 10 goes to step SP66, while if it is out-point clipped image data, the CPU 10 goes to step SP67.[0399]
In step SP66, the CPU 10 determines whether or not to newly register the data as an event. This determination is performed based on the click operation of the new-event button 33 by the operator. If the new-event button 33 was clicked to instruct event registration, the CPU 10 goes to step SP68, while if the new-event button 33 was not clicked and event registration was not instructed, the CPU 10 goes to step SP67.[0400]
In step SP68, the CPU 10 registers the period from the in point to the out point as an event. In this manner, in the editing system 1, if an out point is marked after an in point, the period from the in point to the out point is registered as a new event. Note that, at this time the second management record data regarding the event is generated as shown in FIGS. 11 through 13C.[0401]
In the next step SP69, the in-point clipped image data of the generated event is copied to the event displaying area 29, and the clipped image data is displayed in the event displaying area 29. After finishing this processing, the CPU 10 goes to the next step SP70.[0402]
On the other hand, if the clipped image data generated by the latest marking is out-point clipped image data and the CPU 10 goes to step SP67, the out-point clipped image data generated by the latest marking is moved to the clip displaying area 28. If marking has not been performed and no clipped image data is displayed in the out-clip displaying area 27d, this processing is not performed. When this processing is finished, the CPU 10 goes to step SP70. [0403]
In step SP70, the CPU 10 judges whether or not a stop of the reproduction of the video data being displayed on the reproducing video screen 23a has been instructed. This judgment is made based on whether the still button 408 of the dedicated controller 2e has been pushed or not. If a reproducing stop has not been instructed, the CPU 10 returns to step SP59 and repeats the processing, while if it has been instructed, the CPU 10 goes to the next step SP71. [0404]
In step SP71, the CPU 10 sends out a reproducing stop command to the external interface 18. The external interface 18, receiving it, converts the reproducing stop command into the RS-422 communication format and sends it out to the hybrid recorder 3. Then the hybrid recorder 3 stops the read-out operation of the video data. After the processing of step SP71 is finished, the CPU 10 goes to step SP72 to stop the marking processing. [0405]
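The in/out marking flow described above can be summarized in a minimal sketch. This is not the patent's implementation; the class and method names are hypothetical, time codes are simplified to frame numbers, and only the branch structure of steps SP62, SP66 to SP68 is modeled.

```python
class MarkingSession:
    """Hypothetical sketch of the marking logic: an out point marked after
    an in point (with the new-event button clicked) registers an event;
    otherwise the previously marked clip is moved to the clip area."""

    def __init__(self):
        self.last_mark = None   # (kind, timecode) of the previous mark
        self.clip_area = []     # clips moved to the clip displaying area
        self.events = []        # registered (in_tc, out_tc) events

    def mark(self, kind, timecode, new_event_requested=False):
        # kind is "in" or "out"; timecode is a plain frame number here.
        if (kind == "out" and self.last_mark
                and self.last_mark[0] == "in" and new_event_requested):
            # Steps SP66/SP68: register the in-to-out period as an event.
            self.events.append((self.last_mark[1], timecode))
        elif self.last_mark is not None:
            # Steps SP62/SP67: move the previously marked clip aside.
            self.clip_area.append(self.last_mark)
        self.last_mark = (kind, timecode)
```

For example, marking an in point at frame 100 and then an out point at frame 250 with a new event requested would register the event (100, 250).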
(9-4) Trimming of Event[0406]
Hereinafter, the processing of specifying a generated event and changing its in point or out point, i.e., trimming, will be described referring to the flowchart shown in FIG. 25. Note that it is assumed that this flowchart starts from a state where events have already been generated. [0407]
First, in step SP81 following the start step SP80, the CPU 10 judges whether or not clipped image data in the event displaying area 29 has been specified. At this time, if the mouse 2d has been double-clicked (the operation of successively clicking twice) while the cursor was positioned at the display position of the clipped image data (29a), the CPU 10 judges that the clipped image data has been specified. [0408]
As a result, if the clipped image data has been specified, in the next step SP82 the CPU 10 refers to the time code of the specified clipped image data and supplies a reproducing command, to reproduce the video data of that time code as a still image, to the hybrid recorder 3 via the external interface 18. Then the hybrid recorder 3 reproduces the specified video data in response to the reproducing command and generates reproducing video data; thus the reproducing video data corresponding to the specified clipped image data is displayed on the reproducing video screen 23a. [0409]
In the next step SP83, the CPU 10 judges whether or not the shuttle button 23b of the reproducing video displaying area 23 has been pushed. This judgment is made based on whether the mouse 2d was clicked or not while the cursor was displayed on the shuttle button 23b. [0410]
As a result, if the shuttle button 23b has been pushed, the CPU 10 goes to the next step SP84 to judge whether or not the shuttle button 23b has been dragged. This judgment is made based on whether the shuttle button 23b has been moved or not by moving the cursor while the shuttle button 23b is clicked. [0411]
As a result, if the shuttle button 23b has been dragged, the CPU 10 goes to the next step SP85 to compute the moving direction and distance of the cursor. Then the CPU 10 computes the time code of the specified video data based on the obtained direction and distance and the time code of the video data displayed on the reproducing video screen 23a. More precisely, if the movement is in the right direction, the time code of the specified video data is computed by adding the time code corresponding to the moving distance of the cursor to the time code of the video data being displayed, while if it is in the left direction, the time code of the specified video data is computed by subtracting the time code corresponding to the moving distance of the cursor from the time code of the video data being displayed. [0412]
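The time code arithmetic of step SP85 can be sketched as follows. The patent does not state the pixel-to-frame scale of the shuttle drag, so the frames_per_pixel parameter and the function name are assumptions; time codes are treated as plain frame counts.

```python
def shuttle_timecode(current_tc, dx, frames_per_pixel=1):
    """Hypothetical sketch of step SP85: a drag to the right (dx > 0)
    advances the time code by the dragged distance, a drag to the left
    (dx < 0) rewinds it by the same amount."""
    return current_tc + dx * frames_per_pixel
```

So a 30-pixel drag to the right from frame 1000 would target frame 1030, and the same drag to the left would target frame 970.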
In the next step SP86, the CPU 10 supplies a reproducing command, to reproduce the video data having the obtained time code, to the hybrid recorder 3 via the external interface 18. [0413]
In the next step SP87, the hybrid recorder 3 reproduces the video data having the specified time code in response to the reproducing command, whereby the reproducing video data of the specified time code is displayed on the reproducing video screen 23a. [0414]
In the next step SP88, the CPU 10 judges whether or not marking has been performed. The presence of marking is judged based on whether the mouse 2d was clicked or not while the cursor was positioned in the area of the mark-in button 27c or the mark-out button 27f in the reproducing video marking area 27. As a result, if either the mark-in button 27c or the mark-out button 27f has been clicked, the CPU 10 goes to step SP89, while if neither has been clicked, the CPU 10 returns to step SP83 to repeat the processing. [0415]
In step SP89, the marked clipped image data is generated. This clipped image data is generated by reading out the video data stored in the frame memory 12c to the VRAM 13b. At this time, its data quantity is thinned out to 1/16 by reducing the number of samples to be read, so that clipped image data of 95×60 pixels in image size is generated. [0416]
In the next step SP90, the clipped image data stored in the VRAM 13b is read out and displayed in the in-clip displaying area 27a or the out-clip displaying area 27d in the reproducing video marking area 27. More precisely, if it has been marked as an in point, the clipped image data is displayed in the in-clip displaying area 27a, while if it has been marked as an out point, the clipped image data is displayed in the out-clip displaying area 27d. [0417]
In the next step SP91, the CPU 10 judges whether or not the new-event button 33 has been pushed. This judgment is made based on whether the mouse 2d was clicked or not while the cursor was displayed on the new-event button 33. As a result, if the new-event button 33 has been pushed, the CPU 10 goes to step SP92, while if it has not been pushed, the CPU 10 goes to step SP94. [0418]
In step SP92, the in point or out point is replaced with the clipped image data marked in step SP88 and registered as a new event. For example, if an in point was marked in step SP88, the period from the new in point to the already registered out point is registered as a new event, while if an out point was marked in step SP88, the period from the already registered in point to the new out point is registered as a new event. Note that at this time the second management record data regarding the event is newly generated as shown in FIGS. 11 through 13A to 13C. [0419]
In the next step SP93, the in-point clipped image data of the new event is displayed in the event displaying area 29. After finishing this processing, the CPU 10 goes to step SP97 and stops the trimming processing. [0420]
On the contrary, if the new-event button 33 has not been pushed and the CPU 10 goes to step SP94, the CPU 10 judges whether or not the replace button 34 has been pushed. This judgment is made based on whether the mouse 2d was clicked or not while the cursor was displayed on the replace button 34. As a result, if the replace button 34 has been pushed, the CPU 10 goes to step SP95, while if the replace button 34 has not been pushed, the CPU 10 returns to step SP83 and repeats the processing. [0421]
In step SP95, the CPU 10 replaces the in point or out point with the clipped image data marked in step SP88. That is, in this case the contents of the second management record data regarding the event are simply replaced with the clipped image data of the marked in point or out point, i.e., the contents of the original event are simply updated without newly registering an event. [0422]
In the next step SP96, the in-point clipped image data of the updated event is displayed at the position of the original event in the event displaying area 29. After finishing this processing, the CPU 10 goes to the next step SP97 and stops the trimming processing. [0423]
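The two outcomes of trimming, registering a new event versus updating the original in place, can be sketched as below. The function name and the representation of an event as an (in_tc, out_tc) tuple are illustrative assumptions, not the patent's data structures.

```python
def apply_trim(event, marked_point, kind, new_event=False):
    """Hypothetical sketch of steps SP91 to SP96: a newly marked in or out
    point either yields a new event (new-event button, SP92) or simply
    updates the original event (replace button, SP95)."""
    in_tc, out_tc = event
    trimmed = (marked_point, out_tc) if kind == "in" else (in_tc, marked_point)
    if new_event:
        # SP92: the original event is kept and a new event is registered.
        return event, trimmed
    # SP95: the original event's contents are updated in place.
    return trimmed, None
```

With an event (100, 200), marking a new in point at 120 with the new-event button yields the additional event (120, 200), while marking a new out point at 180 with the replace button rewrites the event to (100, 180).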
(9-5) Trimming of Event with Preroll Function[0424]
Trimming of an event with a preroll function, in which reproduction is started from a position a fixed time before the specified marking point, will be described referring to the flowcharts shown in FIGS. 26 and 27. Note that it is assumed that this flowchart starts from a state where the hybrid recorder 3 is recording the video signal V1 and the video signal V2 is displayed on the recording video screen 21a. [0425]
The processing starts from step SP120, and in step SP121 the CPU 10 judges whether or not the preroll mode has been set. This judgment is based on whether the preroll button 22g in the timing displaying area 22 has already been clicked to specify starting the preroll mode. [0426]
In the next step SP122, the CPU 10 judges whether the above-mentioned queue-up time has been set as the preroll time in the environmental setting. This judgment is based on whether the queue-up time has been stored or not in the environmental setting data memory area in the RAM 10b. As a result of the judgment, if the preroll mode has been specified and the preroll time has been set, the CPU 10 goes to the next step SP123. [0427]
In step SP123, the CPU 10 judges whether the mark-in button 24c in the recording video marking area 24 has been clicked to mark an in point. If in-point marking has been conducted, it goes to the next step SP124 to generate clipped image data of the in point. This clipped image data is generated by reading out video data stored in the frame memory 11c to the VRAM 13b. In this process, the data quantity is thinned out to 1/16 by reducing the number of readout samples, so that clipped image data of 95×60 pixels in picture size is generated. [0428]
In the next step SP125, the clipped image data stored in the VRAM 13b is read out and displayed in the in-clip displaying area 24a in the recording video marking area 24. [0429]
In the next step SP126, the CPU 10 calculates a time code for queue-up. More specifically, the CPU 10 refers to the time code of the specified in-point clipped image data as well as the set queue-up time, and calculates a position shifted from the specified in point to the queue-up time before it (i.e., the reproducing starting point). [0430]
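The queue-up calculation of step SP126 can be sketched as below. The patent gives no units, so treating time codes as frame counts, the frame rate, and clamping at the start of the recording are all assumptions.

```python
def preroll_start(in_point_tc, queue_up_seconds, fps=30):
    """Hypothetical sketch of step SP126: shift the reproducing starting
    point back from the marked in point by the set queue-up time.
    Time codes are frame counts; fps and the clamp at zero are assumed."""
    start = in_point_tc - queue_up_seconds * fps
    return max(start, 0)
```

For example, with a 5-second queue-up time at 30 frames per second, an in point at frame 900 would start reproduction at frame 750.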
In the next step SP127, the CPU 10 outputs a reproducing command, to reproduce video data from the calculated time code position in real time, to the external interface 18. Receiving it, the external interface 18 converts the reproducing command into the RS-422 standard communication format and outputs it to the hybrid recorder 3. By sequentially reading out video data from the recording address corresponding to the specified time code, the hybrid recorder 3 generates reproducing video data which begins from the specified time code position. This video data is outputted to the second video processor 12 in the computer 2 as a video signal V3. [0431]
In the next step SP128, in the second video processor 12, the time code is extracted from the video signal V3 and image processing for converting the video signal V3 into digital component video data is performed. The converted video data is temporarily stored in the frame memory 12c in the second video processor 12. [0432]
In the next step SP129, the reproducing video data stored in the frame memory 12c is transmitted to the VRAM 13b after being reduced to 380×240 pixels. [0433]
In the next step SP130, the reproducing video data stored in the VRAM 13b is displayed on the reproducing video screen 23a. In this manner, real-time video data which begins from a position the queue-up time before the in point specified by the operator is displayed on the reproducing video screen 23a. [0434]
In the next step SP131, the CPU 10 judges whether or not marking has been performed. The judgment of marking is based on whether the mouse 2d was clicked or not while the cursor was in the area of the mark-in button 27c or the mark-out button 27f in the reproducing video marking area 27. As a result, if the mark-in button 27c was clicked, the CPU 10 judges that an in point was specified and goes to step SP132, while if the mark-out button 27f was clicked, it judges that an out point was specified and goes to step SP135. [0435]
In step SP132, in-point clipped image data is generated by reading out video data stored in the frame memory 12c to the VRAM 13b. In this process, the data quantity is thinned out to 1/16 by reducing the number of readout samples, so that clipped image data of 95×60 pixels in size is generated. [0436]
In the next step SP133, the in-point clipped image data stored in the VRAM 13b is read out and displayed in the in-clip displaying area 27a. [0437]
In the next step SP134, the in-point clipped image data being displayed in the in-clip displaying area 27a is moved to the clip displaying area 28. If marking has not been performed and clipped image data is not displayed in the in-clip displaying area 27a, this processing is not performed. When the processing of this step SP134 is finished, the CPU 10 goes to step SP142. [0438]
On the other hand, if the CPU 10 has gone to step SP135 to mark an out point, out-point clipped image data is generated here. This out-point clipped image data is also generated by reading out video data stored in the frame memory 12c to the VRAM 13b. In this case as well, 95×60-pixel clipped image data is generated by thinning out the data quantity to 1/16. [0439]
In step SP136, the out-point clipped image data stored in the VRAM 13b is read out and displayed in the out-clip displaying area 27d. [0440]
In the next step SP137, the CPU 10 judges whether or not the marked clipped image data is in-point clipped image data. As a result, if it is in-point clipped image data, the CPU 10 goes to step SP138, while if it is out-point clipped image data, it goes to step SP139. [0441]
In step SP138, the CPU 10 determines whether or not to newly register it as an event. This determination is based on a clicking operation of the new-event button 33 by the operator. If the new-event button 33 was clicked and registration as an event was specified, the CPU 10 goes to step SP140, but if the new-event button 33 was not clicked and event registration was not specified, the CPU 10 goes to step SP139. [0442]
In step SP140, the CPU 10 registers the period from the in point to the out point as an event. In this case, the second management record data of the event is generated as shown in FIGS. 11 through 13A to 13C. [0443]
In the next step SP141, the in-point clipped image data of the event thus generated is copied to the event displaying area 29 to be displayed in the event displaying area 29. When this processing is completed, the CPU 10 goes to the next step SP142. [0444]
On the other hand, if the clipped image data generated by the latest marking is out-point clipped image data and the processing goes to step SP139, the out-point clipped image data generated by the latest marking is moved to the clip displaying area 28. If marking was not performed and no clipped image data is displayed in the out-clip displaying area 27d, the CPU 10 goes to step SP142. [0445]
In step SP142, the CPU 10 judges whether or not stopping the reproduction of the video data displayed on the reproducing video screen 23a has been specified. As a result, if a reproducing stop has not been specified, the CPU 10 returns to step SP131 to repeat the processing, while if specified, the CPU 10 goes to the next step SP143. [0446]
In step SP143, the CPU 10 outputs a reproducing stop command to the hybrid recorder 3 through the external interface 18. Then the hybrid recorder 3 stops the readout operation of video data to stop the reproducing operation. When the processing of this step SP143 is completed, the CPU 10 goes to step SP144 and stops the trimming processing using the preroll function. [0447]
(9-6) Setting Operation of Reproducing Speed[0448]
The typical operation when an optional reproducing speed is set to a desired event using the reproducing speed setting area 25 will be described with reference to the flowchart of FIG. 28. [0449]
In step SP151, the CPU 10 judges whether or not a desired event has been specified by the edit operator. For example, if the edit operator specifies desired clipped image data from among the clipped image data showing the plural events displayed in the event displaying area 29 by operating a pointing device such as the mouse, the CPU 10 judges that the event has been specified. [0450]
In step SP152, the CPU 10 judges whether or not the reproducing speed setting button 22h in the timing displaying area 22 was clicked by the edit operator. If it was clicked, the CPU 10 refers to the data stored in the second management record which manages the specified event, and automatically displays the number of the specified event and its duration in the reproducing speed setting area 25. [0451]
In step SP154, the CPU 10 judges whether or not the learn button 25a in the reproducing speed setting area 25 was clicked by the edit operator. By clicking this learn button 25a, the speed setting operation for setting an optional reproducing speed to the specified event is actually started. [0452]
Hereafter, the speed setting operation will be described in order.[0453]
First, in step SP156, the value of the time code data TC is reset so that the time code data TC becomes the in-point time code data of the specified event. [0454]
In step SP157, the CPU 10 detects the sliding state of the motion control lever 401 of the dedicated controller 2e at the timing when the video data of the time code data TC is displayed in the reproducing video displaying area 23. [0455]
In step SP158, the CPU 10 supplies reproducing speed data corresponding to the slide state of the motion control lever 401 to the hybrid recorder 3 as a reproducing command, and stores it in the RAM 10b in relation to the time code data TC. For example, if the motion control lever 401 has not yet been operated by the edit operator at the time when the speed setting operation is started by clicking the learn button 25a in the reproducing speed setting area 25, this reproducing speed data will be the data "64", which indicates normal 1.0-time speed reproduction. [0456]
In step SP159, the time code data TC is updated to the time code data of the following frame. [0457]
In step SP160, the CPU 10 judges whether or not the updated time code data TC is the out-point time code data. If the time code TC updated in step SP159 is still not identical with the out-point time code data, the CPU 10 determines that the video data of this event is being reproduced and returns to step SP157. [0458]
For example, if the edit operator slides the motion control lever 401 to the position of 0.1-time speed at his desired timing while monitoring the reproducing video image displayed in the reproducing video displaying area 23, in step SP157 the CPU 10 detects the motion of the motion control lever 401. Then in step SP158, the CPU 10 immediately follows the motion of the motion control lever 401 and transmits the reproducing speed data "32", which indicates 0.1-time speed, to the hybrid recorder 3 as a reproducing control command to make the reproducing speed of the hybrid recorder 3 0.1-time speed. At the same time, the CPU 10 stores, linked together as a slow data file, the time code data of the reproducing video data which was displayed in the reproducing video displaying area when the motion control lever 401 was slid to the position of 0.1-time speed by the edit operator, and the reproducing speed data "32" indicating 0.1-time speed. [0459]
As can be understood from the loop from step SP157 to step SP160, the CPU 10 continues to detect the operated state of the motion control lever 401 while the video data from the in point to the out point of the specified event is being reproduced, and stores the detected state in the RAM 10b as reproducing speed data. [0460]
Accordingly, while the video data of the specified event is being reproduced from the hybrid recorder 3, the CPU 10 constantly detects the operation state of the motion control lever 401 and controls the reproducing speed of the hybrid recorder 3 so as to be the reproducing speed corresponding to the slid state of the motion control lever 401; furthermore, for each frame, the CPU 10 repeats the control of storing the speed data indicating the reproducing speed corresponding to the slid state of the motion control lever 401, linked with the time code of the reproducing video data displayed in the reproducing video displaying area 23, in the RAM 10b as a slow data file. Thereby, the optional reproducing speed set to the specified event is stored in a slow data file linked with the time code. [0461]
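The per-frame learning loop of steps SP156 to SP160 can be sketched as follows. The function name is hypothetical, time codes are simplified to frame counts, and lever_position_at stands in for sampling the motion control lever; the speed value 64 for 1.0-time speed follows the patent's example.

```python
def record_slow_data(in_tc, out_tc, lever_position_at):
    """Hypothetical sketch of steps SP156 to SP160: for each frame from
    the in point to the out point, sample the motion control lever and
    store (timecode, speed_data) pairs as a slow data file."""
    slow_data_file = []
    tc = in_tc                            # SP156: reset TC to the in point
    while tc != out_tc:                   # SP160: loop until the out point
        speed = lever_position_at(tc)     # SP157: detect the lever state
        slow_data_file.append((tc, speed))  # SP158: store speed with TC
        tc += 1                           # SP159: advance one frame
    return slow_data_file
```

For instance, if the lever reports 1.0-time speed ("64") for the first two frames and 0.1-time speed ("32") afterwards, the resulting slow data file pairs each frame's time code with the speed sampled at that frame.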
In the next step SP161, the CPU 10 judges whether or not the event has been updated. If the edit operator approves the slow data file formed in the loop from step SP156 to step SP160 and operates the replace button 34 or the new-event button 33, the CPU 10 judges that the reproducing speed of this event has been newly set. However, if the edit operator does not approve the formed slow data file, the CPU 10 returns to step SP154 to form a slow data file again. [0462]
Since the above-mentioned speed setting operation is repeated until the edit operator is satisfied that the optimal reproducing speed has been set, the effective reproducing speed he/she desires can be set to the specified event. [0463]
Note that in the speed setting steps from step SP156 to step SP160, while the motion control lever 401 is operated by the edit operator with his/her hand to set an optional reproducing speed to an event, the video data reproduced at a reproducing speed corresponding to the motion of the motion control lever 401 is never put on the air. This is because the optimal and most effective reproducing speed cannot always be set by a single speed setting operation by the edit operator. Therefore, the editing system 1 according to the present invention is programmed so that this speed setting operation can be repeated any number of times until the edit operator is satisfied that the most effective and optimal speed has been set to the event. [0464]
In step SP162, the CPU 10 rewrites the 2-byte slow type data of the second management record data for managing the event in which the speed information has been newly set, from the data "00000000", which indicates normal 1.0-time speed reproduction, to the data "00000001", which indicates that an optional reproducing speed has been set. [0465]
Here the operation of setting the reproducing speed is completed. [0466]
Next, the operation when the event to which an optional speed has been set in the above manner is outputted to the hybrid recorder to be put on the air will be described. [0467]
If the edit operator specifies the reproduction of the event to which an optional reproducing speed has been set, by clicking the preview button 32 or the like, the CPU 10 refers to the slow type data of the second management record data for managing the specified event to judge whether or not slow information has been set for that event. If slow data has been set, the CPU 10 reads out the slow data file linked with the specified event from the RAM 10b. [0468]
The CPU 10 controls the reproducing speed of the hybrid recorder 3 for every frame using the speed data linked with the time code data recorded in this slow data file. [0469]
By automatically controlling the reproduction of the hybrid recorder 3 using the slow data file formed by the speed setting operation, the motion of the motion control lever 401 representing the reproducing speed which the edit operator decided was best for the event during the speed setting operation can be automatically reproduced. [0470]
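The frame-by-frame lookup during on-air playback can be sketched as below. The function name is hypothetical, and the fallback to normal speed for frames without an entry is an assumption; the patent only states that the recorder is controlled per frame using the speed data linked with each time code.

```python
def speed_for_frame(slow_data_file, tc, normal_speed=64):
    """Hypothetical sketch of automated playback: for each frame's time
    code, look up the speed command recorded in the slow data file.
    Frames without an entry fall back to normal speed (assumed)."""
    table = dict(slow_data_file)
    return table.get(tc, normal_speed)
```

For example, given a slow data file [(0, 64), (1, 32)], frame 1 would be reproduced with the 0.1-time speed command "32".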
(9-7) Forming Processing of Video Program[0471]
The processing of producing a video program using the generated events will be described referring to the flowchart of FIG. 29. Note that it is assumed that this flowchart starts from a state where events have already been generated. [0472]
After the processing is started at step SP200, in step SP201 the CPU 10 judges whether or not an event has been specified. At this time, the CPU 10 determines that an event has been specified when the mouse 2d has been double-clicked (the operation of successively clicking twice) in the state where the cursor was positioned at the display position of the clipped image data in the event displaying area (29a). [0473]
As a result, if an event has been specified, in the next step SP202 the CPU 10 sets the specified event into an active state, i.e., a movable state. [0474]
In the next step SP203, the CPU 10 judges whether or not the cursor has been moved while the mouse 2d is clicked, i.e., dragged. If it has been dragged, in the next step SP204 the direction and the distance the cursor moved are computed. In the next step SP205, the CPU 10 changes the display position of the clipped image data of the specified event based on the computed direction and distance. Note that since the processing from step SP203 to step SP205 is executed promptly, on the screen of the monitor 2b the clipped image data of the event appears to move together with the cursor. [0475]
In the next step SP206, the CPU 10 judges whether or not the click button of the mouse 2d has been released in the program displaying area 30, i.e., whether the clicking has been released. As a result of the judgment, if the clicking has not been released, the CPU 10 returns to step SP203 to repeat the processing, while if it has been released, the CPU 10 goes to the next step SP207 to compute the cursor position at the time the clicking was released. [0476]
In the next step SP208, based on the computed cursor position, the CPU 10 judges whether or not another event is being displayed on the right side of the display position of the event specified by the cursor position in the program displaying area 30. As a result, if another event is being displayed on the right side, the CPU 10 goes to step SP209, while if no event is being displayed on the right side, the CPU 10 goes to step SP210. [0477]
In step SP209, the CPU 10 further moves the display position of the other event displayed on the right side toward the right to insert the specified event. After that, the CPU 10 goes to step SP210. [0478]
In step SP210, the CPU 10 displays the clipped image data of the specified event at the position specified by the cursor in the program displaying area 30. [0479]
In the next step SP211, the CPU 10 updates the data contents of the second management record data regarding the program according to the insertion of the event in step SP210. More precisely, the CPU 10 corrects the part of the pointer to the data linked before or after in the second management record data for program data shown in FIG. 10. Note that since the newly inserted event has no second management record data, this is newly generated. [0480]
In step SP212, an edit list showing the final video program is generated in the order of the plural clipped image data shown in the program displaying area. Concretely, this edit list consists of the in-point and out-point time code data of the second management record data linked with the clipped image data displayed in this program displaying area, and the speed data of the slow data file which has been set to the event shown by the clipped image data. For example, if a first event having no slow data file and a second event having a slow data file are displayed first and second in this program displaying area, then in the edit list, listed data for reproducing the video data of the first event throughout at 1.0-time speed, from the in-point time code data to the out-point time code data of the first event, is registered first; next, listed data for reproducing video data based on the reproducing speed data recorded in the slow data file set to the second event is registered. [0481]
If reproduction of this video program is instructed by the edit operator, the CPU 10 only has to control the reproduction by the hybrid recorder according to the edit list thus formed. [0482]
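The edit list construction of step SP212 can be sketched as follows. The function name and the representation of events as dicts with 'in_tc', 'out_tc' and an optional 'slow_data_file' key are illustrative assumptions; only the ordering rule and the two kinds of listed data come from the description above.

```python
def build_edit_list(program_events):
    """Hypothetical sketch of step SP212: walk the events in the order
    they appear in the program displaying area and emit one playback
    instruction per event. Events without a slow data file are played
    throughout at 1.0-time speed; events with one are played using the
    per-frame speed data recorded in that file."""
    edit_list = []
    for ev in program_events:
        entry = {"in": ev["in_tc"], "out": ev["out_tc"]}
        if ev.get("slow_data_file"):
            entry["speed"] = ev["slow_data_file"]   # per-frame speed data
        else:
            entry["speed"] = "1.0x"                 # normal speed throughout
        edit_list.append(entry)
    return edit_list
```

A controller for the hybrid recorder could then simply iterate over this list when reproduction of the program is instructed.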
When this processing is completed, the CPU 10 goes to step SP213 to determine whether or not to continue the program forming processing. If continuing, the CPU 10 returns to step SP201 to repeat the processing, while if stopping the program forming processing, the CPU 10 goes to step SP214 and stops the processing. [0483]
(9-8) Dubbing Processing in Hybrid Recorder[0484]
The hybrid recorder 3 can perform mainly two types of dubbing processing. One is the dubbing processing of recording the video data of a video tape to the hard-disk drive 300, in order to use video data recorded on a video tape brought from the outside as editing material. The other is the dubbing processing of recording the video data of the final video program, generated as the result of editing in this editing system 1, from the hard-disk drive 300 to a broadcasting video tape. [0485]
In executing these dubbing processings, at first the dubbing setting button 38 is clicked among the various command buttons displayed on the GUI of the monitor 2b to display the dubbing setting dialog 38A on the GUI. In the case where the dubbing processing from a video tape to the hard-disk drive 300 is to be performed, the dubbing processing from video tape to hard disk is selected by clicking the tape-to-disk button 38B in the dubbing setting dialog 38A. Provided that, since high-speed dubbing can be performed if the video data recorded on the video tape is compressively coded digital video data, in this case the high-speed dubbing button 38D is further clicked to select the high-speed dubbing mode. [0486]
On the other hand, in the case where the dubbing processing from the hard-disk drive 300 to a video tape is to be performed, the dubbing processing from hard disk to video tape is selected by clicking the disk-to-tape button 38C in the dubbing setting dialog 38A. Provided that, in the case where not simply an event or the like recorded in the hard-disk drive 300 but the video data of the final video program recorded in the hard-disk drive 300 is to be dubbed, the program dubbing button 38E is further clicked to select the program dubbing mode. [0487]
After entering the various settings of the dubbing processing is completed, the determination button 38F is clicked to determine these setting contents, and the dubbing setting dialog 38A is closed. [0488]
In the case where the dubbing processing is actually executed, at first the dubbing button 22i is clicked to start up the dubbing processing mode, and then the recording start button 31a is clicked to send a recording start command out to the hybrid recorder 3. Thus, the dubbing processing according to the set contents is executed in the hybrid recorder 3. Note that if the dubbing processing is to be stopped in the middle of its execution, the recording stop button 31b is clicked to send a recording stop command out to the hybrid recorder 3, so that the hybrid recorder 3 stops the dubbing processing. [0489]
Here, the procedure of the CPU 10 in the dubbing processing will be described hereinafter referring to flowcharts. [0490]
(9-8-1) Setting of Dubbing Contents[0491]
As shown in FIG. 30, the procedure starts from step SP220, and at step SP221 the CPU 10 judges whether or not the dubbing setting button 38 has been pushed. In this case, if the mouse 2d is clicked in the state where the cursor is on the dubbing setting button 38, the CPU 10 judges that the dubbing setting button 38 has been pushed. Note that the judgment of whether or not the other buttons described below have been pushed is made similarly. [0492]
If the dubbing setting button 38 has been pushed, at the next step SP222 the CPU 10 generates the graphical data of the dubbing setting dialog 38A and transmits it to the VRAM 13b of the display controller 13, so that the dubbing setting dialog 38A is displayed on the GUI. [0493]
Thereafter, the CPU 10 judges which button among the buttons in the dubbing setting dialog 38A has been pushed in the processing of the following step SP223, step SP224, step SP227 and step SP228. Specifically, at step SP223 the CPU 10 judges whether or not the tape-to-disk button 38B has been pushed; if the button 38B has been pushed, it goes to step SP224, while if not, it goes to step SP227. [0494]
At step SP227, the CPU 10 judges whether or not the disk-to-tape button 38C has been pushed; if the button 38C has been pushed, it goes to step SP228, while if not, it returns to step SP223 and repeats the judgment of whether or not the tape-to-disk button 38B has been pushed. [0495]
If the processing goes to step SP224 because the tape-to-disk button 38B has been pushed, the CPU 10 judges whether or not the high-speed dubbing button 38D has been pushed; if the button 38D has been pushed, it goes to step SP225, while if not, it goes to step SP226. At step SP225, the CPU 10 determines that the operator desires high-speed dubbing from video tape to hard disk. On the other hand, at step SP226, the CPU 10 determines that the operator desires normal dubbing processing from video tape to hard disk. [0496]
If it goes to step SP228 because the disk-to-tape button 38C has been pushed, the CPU 10 judges whether or not the program dubbing button 38E has been pushed; if the button 38E has been pushed, it goes to step SP229, while if not, it goes to step SP230. At step SP229, the CPU 10 determines that the operator desires program dubbing processing from hard disk to video tape (i.e., dubbing of the final video program) as the dubbing processing. On the other hand, at step SP230, the CPU 10 determines that the operator desires normal dubbing processing from hard disk to video tape (i.e., dubbing of desired video data such as an event) as the dubbing processing.[0497]
At the next step SP231, the CPU 10 judges whether or not the determination button 38F has been pushed. If the determination button 38F has been pushed, it stores the set contents judged at step SP225, step SP226, step SP229 or step SP230 in the dubbing contents storing area of the RAM 10b and goes to step SP232, while if the determination button 38F has not been pushed, it returns to step SP223 and repeats the judgement of whether or not each button has been pushed.[0498]
At step SP232, the CPU 10 determines that the setting of the dubbing contents has been completed, closes the dubbing setting dialog 38A, and proceeds to the next step SP233 to stop the processing.[0499]
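The branch logic of FIG. 30 (steps SP223 through SP230) reduces to a simple mapping from the operator's button choices to the stored dubbing contents. The following Python sketch illustrates that mapping only; the function name, parameter names and returned tuples are illustrative assumptions and not part of the disclosed embodiment.

```python
def select_dubbing_contents(tape_to_disk, high_speed=False, program=False):
    """Return the dubbing contents implied by the dialog 38A button choices.

    tape_to_disk -- True if the tape-to-disk button 38B was pushed,
                    False if the disk-to-tape button 38C was pushed.
    high_speed   -- state of the high-speed dubbing button 38D (tape-to-disk only).
    program      -- state of the program dubbing button 38E (disk-to-tape only).
    """
    if tape_to_disk:
        # Steps SP224-SP226: high-speed or normal dubbing from tape to disk.
        return ("tape-to-disk", "high-speed" if high_speed else "normal")
    # Steps SP228-SP230: program or normal dubbing from disk to tape.
    return ("disk-to-tape", "program" if program else "normal")
```

Pushing the determination button 38F (step SP231) would then correspond to storing the returned tuple in the dubbing contents storing area of the RAM 10b.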
(9-8-2) Execution of Dubbing Processing[0500]
In this clause, the procedure for executing the dubbing contents set by the processing shown in FIG. 30 will be described.[0501]
As shown in FIG. 31, the processing is started from step SP240. At step SP241, the CPU 10 first judges whether or not the dubbing button 22i has been pushed, and if the dubbing button 22i has been pushed, it goes to the next step SP242. At the next step SP242, the CPU 10 judges whether or not the hard-disk drive 300 and the VTR 301 are in the middle of recording operation; if they are, it returns to step SP241 since the dubbing processing cannot be performed, while if not, it goes to the next step SP243.[0502]
At step SP243, the CPU 10 judges whether or not the dubbing setting button 38 has been pushed. If it has been pushed, the CPU 10 determines that the dubbing contents are to be changed, goes to step SP244, and executes the processing for the setting of the dubbing contents shown in FIG. 30; if not, it goes to step SP245 by determining that the dubbing contents are unchanged.[0503]
At step SP245, the CPU 10 obtains the dubbing contents set by the processing shown in FIG. 30 from the dubbing contents storing area of the RAM 10b, and goes to the next step SP246 to perform the preparation for the dubbing processing.[0504]
FIG. 32 concretely shows the preparation for the dubbing processing at step SP246. The processing starts from step SP260. At step SP261, in order to display the video signal to be recorded on the recording video screen 21a of the recording video displaying area 21, the CPU 10 supplies to the display controller 13 a control command to change the display position of the reproduced video signal V3, which is reproduced and supplied by the hard-disk drive 300 or the VTR 301, from the reproducing video displaying area 23 to the recording video displaying area 21 on the GUI, so that the display position of the reproduced video signal V3 is changed.[0505]
At the next step SP262, the CPU 10 generates graphic display data showing the state of dubbing as shown in FIG. 5, based on the dubbing contents obtained at the last step SP245. At the next step SP263, the CPU 10 displays the graphic display data generated at step SP262 on the reproducing video screen 23a of the reproducing video displaying area 23. In this case, since the graphic display data showing the state of dubbing includes information showing the set contents such as "Disk to Tape" and "Normal Dubbing", the operator can easily confirm the set contents of the dubbing processing without specially opening the dubbing setting dialog 38A. After the processing of step SP263 is completed, the CPU 10 goes to the next step SP264 to return to the processing shown in FIG. 31.[0506]
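The preparation of FIG. 32 (steps SP261 through SP263) amounts to two display updates: moving the reproduced video into the recording video displaying area 21, and showing a label for the set dubbing contents in the reproducing video displaying area 23. The following sketch models the display as a plain dictionary; the function, the dictionary keys and the label strings beyond "Disk to Tape" / "Normal Dubbing" are illustrative assumptions, not part of the disclosed embodiment.

```python
def prepare_dubbing(display, contents):
    """Model steps SP261-SP263: display is a dict of GUI areas, contents is
    a (direction, mode) pair such as ("disk-to-tape", "normal")."""
    # SP261: move the reproduced video signal V3 from the reproducing video
    # displaying area 23 to the recording video displaying area 21.
    display["recording_area_21"] = display.pop("reproducing_area_23", None)
    # SP262-SP263: build and show graphic display data naming the set contents.
    direction, mode = contents
    direction_label = {"tape-to-disk": "Tape to Disk",
                       "disk-to-tape": "Disk to Tape"}[direction]
    mode_label = {"normal": "Normal Dubbing",
                  "high-speed": "High-Speed Dubbing",
                  "program": "Program Dubbing"}[mode]
    display["reproducing_area_23"] = f"{direction_label} / {mode_label}"
    return display
```

This makes concrete why the operator can read the set contents directly from the reproducing video screen 23a without reopening the dubbing setting dialog 38A.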
If the processing of the dubbing preparation is completed, the CPU 10 goes to step SP247. At this step SP247, the CPU 10 judges whether or not the recording start button 31a has been pushed, and if the recording start button has been pushed, it goes to the next step SP248.[0507]
At step SP248, the CPU 10 sends a control command indicating recording start and a control command indicating the dubbing contents to the CPU 303 of the hybrid recorder 3 via the external interface 18. Note that, in this case, the control command indicating the dubbing contents carries information about the direction of dubbing (i.e., whether from video tape to hard disk or from hard disk to video tape), the dubbing speed, and the like.[0508]
In the hybrid recorder 3 which has received these control commands, the CPU 303 controls the first to third switches in accordance with the dubbing contents as well as controlling the operation modes of the hard-disk drive 300 and the VTR 301, and executes the dubbing processing. For example, in the case where normal dubbing from a video tape loaded in the VTR 301 to the hard-disk drive 300 is executed, the VTR 301 is set into the reproducing operation mode and the hard-disk drive 300 is set into the recording operation mode; furthermore, the first to third switches are controlled so that the analog video signal reproduced by the VTR 301 is supplied to the hard-disk drive 300 sequentially through the third switch 310, the first switch 304, the encoder 306 and the second switch 307, and the normal dubbing processing from video tape to hard disk is executed according to that control.[0509]
On the other hand, in the case where high-speed dubbing from a video tape loaded in the VTR 301 to the hard-disk drive 300 is executed, the VTR 301 is set into the reproducing operation mode and the hard-disk drive 300 is set into the recording operation mode; furthermore, the second switch 307 is controlled so that the compressively coded video signal reproduced by the VTR 301 is supplied to the hard-disk drive 300 via the second switch 307, and the high-speed dubbing from video tape to hard disk is executed according to that control.[0510]
In the case where normal dubbing or program dubbing from the hard-disk drive 300 to a video tape is executed, the VTR 301 is set into the recording operation mode and the hard-disk drive 300 is set into the reproducing operation mode; furthermore, the first and third switches 304 and 310 are controlled so that the video signal reproduced from the hard-disk drive 300 is supplied to the VTR 301 sequentially through the third switch 310, the decoder 305, the first switch 304 and the encoder 306, and the dubbing processing from hard disk to video tape is executed according to that control.[0511]
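The three routing cases the CPU 303 selects can be summarised as a lookup from dubbing contents to signal path. The following sketch lists the path through the switches for each case, exactly as described above; the function name and the list representation are illustrative assumptions, not part of the disclosed embodiment.

```python
def signal_path(direction, mode):
    """Return the signal path through the hybrid recorder 3 for the given
    dubbing direction ("tape-to-disk" / "disk-to-tape") and mode."""
    if direction == "tape-to-disk":
        if mode == "high-speed":
            # Compressively coded signal passes only through the second switch 307.
            return ["VTR 301", "second switch 307", "hard-disk drive 300"]
        # Normal dubbing: the analog signal is encoded on the way to the disk.
        return ["VTR 301", "third switch 310", "first switch 304",
                "encoder 306", "second switch 307", "hard-disk drive 300"]
    # Disk to tape (normal or program dubbing): decode, then re-encode.
    return ["hard-disk drive 300", "third switch 310", "decoder 305",
            "first switch 304", "encoder 306", "VTR 301"]
```

Note how the high-speed case skips the decoder 305 and encoder 306 entirely, which is what allows dubbing faster than real time.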
In the above manner, the CPU 10 of the computer 2 executes the dubbing processing based on the set contents by controlling the hybrid recorder 3 via the external interface 18.[0512]
Note that, while this dubbing processing is being executed, the reproduced video signal V3 reproduced and supplied from the VTR 301 or the hard-disk drive 300 is displayed in the recording video displaying area 21 as described above. Thus, the operator can easily confirm the contents of the video signal actually being dubbed.[0513]
After executing the dubbing processing, the CPU 10 judges at the next step SP249 whether or not the recording stop button 31b has been pushed or the transmission of the specified video signal has been completed. As a result, if the CPU 10 determines that the dubbing processing has been completed, it goes to the next step SP250 to stop the dubbing processing.[0514]
(10) Operation and Effects of the Embodiment[0515]
According to the above structure, in the case of this editing system 1, the GUI for editing operation is displayed on the monitor 2b so that editing work can be performed by operating the various command buttons displayed on the GUI. Moreover, the source video signal V1 supplied from the outside as a live image is recorded to the hard-disk drive 300 of the hybrid recorder 3 as editing material, and at the same time the recorded video signal is displayed in the recording video displaying area 21; furthermore, a desired video signal is read out as occasion demands from the hard-disk drive 300 in the middle of the recording operation and displayed in the reproducing video displaying area 23. Thereby, in this editing system 1, editing work can be easily performed by selecting a command button on the screen while monitoring the GUI and the video data displayed on the monitor 2b of the computer 2; thus, editing work can be performed efficiently with less trouble in editing, compared with the conventional editing work performed in a restricted environment while operating various devices.[0516]
Furthermore, in the case of this editing system 1, also in the case where a video tape brought from the outside is used as editing material, the video data recorded on the video tape can be easily dubbed to the hard-disk drive 300 as editing material. Concretely, after the video tape is loaded into the VTR 301 of the hybrid recorder 3, a dubbing mode is designated by operating the dubbing setting button 38 on the GUI, and the dubbing button 22i and the recording start button 31a are also operated on the GUI, so that the video data recorded on the video tape is reproduced and the reproduced video data is recorded to the hard-disk drive 300. Thus, dubbing from the video tape can be easily performed with simple operation only by selecting the command buttons displayed on the screen, and the trouble of the operator at the time of dubbing can be reduced compared with the conventional case where devices on the reproducing side and the recording side are separately operated to perform dubbing.[0517]
Also in the case where the video data of the final video program generated as the result of the editing processing is dubbed to a video tape for broadcasting, after the dubbing mode is designated by operating the dubbing setting button 38 displayed on the GUI, the dubbing button 22i and the recording start button 31a are operated, so that the final video program can be dubbed to the video tape for broadcasting; therefore, the dubbing processing can be performed with simple operation compared with the conventional case where each device is separately operated.[0518]
According to the above structure, since the hard-disk drive 300 and the VTR 301 in the hybrid recorder 3 can be operated with the command buttons displayed on the GUI, when video data recorded on a video tape is recorded to the hard-disk drive 300 as editing material, or when the video data of the final video program generated as the result of editing is recorded to a video tape for broadcasting, the dubbing processing can be performed with simple operation; thus, the dubbing operation can be performed efficiently with less trouble for the operator at the time of dubbing, compared with the conventional case.[0519]
Note that the aforementioned embodiment has dealt with the case where the hard-disk drive 300, in which a magnetic disk is used, is applied as the random-access recording/reproducing means. However, the present invention is not limited to this, and random-access recording/reproducing means using another recording medium such as a magneto-optical disk (MO), a semiconductor memory, etc., is also applicable.[0520]
The aforementioned embodiments have dealt with the case where the hybrid recorder 3 is constituted of the hard-disk drive 300 and the VTR 301. However, the present invention is not limited to this, and another structure is allowed as the constitution of the recording/reproducing device. The same effects as in the aforementioned case can be obtained provided that the recording/reproducing device is constituted of at least first random-access recording/reproducing means and second recording/reproducing means capable of performing recording and reproducing operations on a predetermined recording medium.[0521]
The aforementioned embodiments have dealt with the case where the first to third switches 304, 307 and 310 are provided, the video signal reproduced from the VTR 301 is supplied to the hard-disk drive 300, and at the same time the video signal reproduced from the hard-disk drive 300 is supplied to the VTR 301. However, the present invention is not limited to this, and the same effects as in the aforementioned cases can be obtained by providing selecting means for supplying video data reproduced by one of the first and second recording/reproducing means to the other.[0522]
Furthermore, the embodiments described above have dealt with the case of entering various instructions and data to the editing system 1 by using the keyboard 2c, the mouse 2d or the dedicated controller 2e. The present invention, however, is not limited to this; the various instructions and data may be entered using other input devices, provided that user interface means for entering the various instructions and data from the operator to the editing system 1 is provided.[0523]
Industrial Applicability[0524]
The present invention can be used when source video data serving as material is dubbed, or when the video data of the final video program generated as the result of editing is dubbed to a video tape for broadcasting, at a broadcasting station or the like.[0525]