This application is a divisional application of the Chinese invention patent application entitled "Image Processing Apparatus and Method" filed by the same applicant, with a filing date of June 23, 2010 and application No. 201080028028.6 (PCT/JP2010/060605).
Specific embodiment
Embodiments of the present invention will be described below with reference to the accompanying drawings. Note that the description will proceed in the following order.
1. First embodiment (neighborhood pixels interpolation filtering switching: example of intra prediction)
2. Second embodiment (neighborhood pixels interpolation filtering switching: example of secondary prediction)
3. Third embodiment (neighborhood pixels interpolation filtering on/off control: example of intra prediction)
4. Fourth embodiment (neighborhood pixels interpolation filtering on/off control: example of secondary prediction)
<1. first embodiment>
[configuration example of picture coding device]
Fig. 1 illustrates the configuration of an embodiment of a picture coding device serving as an image processing apparatus to which the present invention has been applied.
This picture coding device 51 subjects images to compression encoding using, for example, the H.264 and MPEG-4 Part 10 (Advanced Video Coding) (hereinafter described as H.264/AVC) format.
In the example in Fig. 1, the picture coding device 51 is configured of an A/D converting unit 61, a picture rearranging buffer 62, a computing unit 63, an orthogonal transform unit 64, a quantifying unit 65, a lossless coding unit 66, a storage buffer 67, an inverse quantization unit 68, an inverse orthogonal transformation unit 69, a computing unit 70, a de-blocking filter 71, frame memory 72, a switch 73, an intraprediction unit 74, a neighborhood pixels interpolation filtering switch unit 75, a motion prediction/compensating unit 76, a forecast image selecting unit 77, and a rate control unit 78.
The A/D converting unit 61 performs analog/digital conversion of an input image, and outputs it to the picture rearranging buffer 62 for storage. The picture rearranging buffer 62 rearranges the stored images of frames, which are in display order, into the order of frames for encoding in accordance with the GOP (Group of Pictures).
The computing unit 63 subtracts, from the image read out from the picture rearranging buffer 62, the forecast image selected by the forecast image selecting unit 77, i.e., the forecast image from the intraprediction unit 74 or the forecast image from the motion prediction/compensating unit 76, and outputs the difference information thereof to the orthogonal transform unit 64. The orthogonal transform unit 64 subjects the difference information from the computing unit 63 to orthogonal transform such as discrete cosine transform or Karhunen-Loève transform, and outputs the transform coefficients thereof. The quantifying unit 65 quantizes the transform coefficients output by the orthogonal transform unit 64.
The quantized transform coefficients that are the output of the quantifying unit 65 are input to the lossless coding unit 66, where they are subjected to lossless coding such as variable-length coding or arithmetic coding, and compressed.
The lossless coding unit 66 obtains information indicating intra prediction and so forth from the intraprediction unit 74, and obtains information indicating an inter prediction mode and so forth from the motion prediction/compensating unit 76. Note that, hereinafter, the information indicating intra prediction will be referred to as intra prediction mode information. Similarly, the information indicating inter prediction will be referred to as inter prediction mode information.
The lossless coding unit 66 encodes the quantized transform coefficients, and also encodes the information indicating intra prediction, the information indicating the inter prediction mode, the quantization parameter, and so forth, and takes these as part of the header information of the compressed image. The lossless coding unit 66 supplies the encoded data to the storage buffer 67 for storage.
For example, lossless coding processing such as variable-length coding or arithmetic coding is performed at the lossless coding unit 66. An example of variable-length coding is CAVLC (Context-Adaptive Variable Length Coding) determined by the H.264/AVC format. An example of arithmetic coding is CABAC (Context-Adaptive Binary Arithmetic Coding).
The storage buffer 67 outputs the data supplied from the lossless coding unit 66 to, for example, a downstream storage device or transmission path not shown in the drawing, as a compressed image encoded by the H.264/AVC format.
Also, the quantized transform coefficients output from the quantifying unit 65 are also input to the inverse quantization unit 68, subjected to inverse quantization, and then further subjected to inverse orthogonal transform at the inverse orthogonal transformation unit 69. The output subjected to inverse orthogonal transform is added by the computing unit 70 to the forecast image supplied from the forecast image selecting unit 77, and becomes a locally decoded image. The de-blocking filter 71 removes block distortion from the decoded image, and then supplies it to the frame memory 72 for storage. The image before being subjected to the deblocking filtering processing by the de-blocking filter 71 is also supplied to the frame memory 72 for storage.
The switch 73 outputs the reference image stored in the frame memory 72 to the motion prediction/compensating unit 76 or the intraprediction unit 74.
With this picture coding device 51, for example, the I pictures, B pictures, and P pictures from the picture rearranging buffer 62 are supplied to the intraprediction unit 74 as images to be subjected to intra prediction (also referred to as intra processing). Also, the B pictures and P pictures read out from the picture rearranging buffer 62 are supplied to the motion prediction/compensating unit 76 as images to be subjected to inter prediction (also referred to as inter processing).
The intraprediction unit 74 performs intra prediction processing in all candidate intra prediction modes based on the image to be subjected to intra prediction read out from the picture rearranging buffer 62 and the reference image supplied from the frame memory 72, to generate forecast images.
Prior to the intra prediction processing, the intraprediction unit 74 subjects neighborhood pixels to filtering processing, the neighborhood pixels being pixels neighboring the current block in a predetermined positional relationship which are used for the intra prediction of each current block. This filtering processing uses a filter factor set by the neighborhood pixels interpolation filtering switch unit 75 in accordance with the intra prediction mode and so forth supplied from the intraprediction unit 74. That is to say, for the intra prediction processing in all candidate intra prediction modes, the intraprediction unit 74 uses neighborhood pixels which have been subjected to filtering processing with the filter factor set by the neighborhood pixels interpolation filtering switch unit 75.
The intraprediction unit 74 calculates a cost function value for each intra prediction mode for which a forecast image has been generated, and selects the intra prediction mode in which the calculated cost function value gives the minimum value as the best intra prediction mode. The intraprediction unit 74 supplies the forecast image generated in the best intra prediction mode, and the cost function value calculated for the corresponding best intra prediction mode, to the forecast image selecting unit 77.
In the case where the forecast image selecting unit 77 has selected the forecast image generated in the best intra prediction mode, the intraprediction unit 74 supplies the information indicating the best intra prediction mode to the lossless coding unit 66. In the case where the information has been transmitted from the intraprediction unit 74, the lossless coding unit 66 encodes this information and takes it as part of the header information in the compressed image.
The neighborhood pixels interpolation filtering switch unit 75 stores filter factors corresponding to quantization parameters and intra prediction modes, obtained by performing learning using training images at the learning device 251 in Fig. 28 described later.
The quantization parameter from the rate control unit 78 and the intra prediction mode information from the intraprediction unit 74 are supplied to the neighborhood pixels interpolation filtering switch unit 75. The neighborhood pixels interpolation filtering switch unit 75 sets the filter factor corresponding to the quantization parameter from the rate control unit 78 and the intra prediction mode from the intraprediction unit 74. The neighborhood pixels interpolation filtering switch unit 75 supplies the set filter factor to the intraprediction unit 74.
Note that the neighborhood pixels interpolation filtering switch unit 75 may learn and store filter factors corresponding to only one of the quantization parameter and the intra prediction mode, rather than both.
Also, although the neighborhood pixels interpolation filtering switch unit 75 stores filter factors learned offline beforehand, the filter factors may instead be calculated online. In this case, the filter factor set by the neighborhood pixels interpolation filtering switch unit 75 is output to the lossless coding unit 66 so as to be transmitted to the decoding side, as indicated by the dotted arrow.
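The following is a minimal illustrative sketch, not part of the original specification, of how the neighborhood pixels interpolation filtering switch unit 75 might select a filter factor from a table learned offline, keyed by quantization parameter and intra prediction mode; the table contents, key granularity, and function name are assumptions, and the fallback is the H.264/AVC {1, 2, 1} filter with zero offset.

# filter_factor_table[(quantization_parameter, intra_prediction_mode)] = (c0, c1, c2, c3)
filter_factor_table = {
    (28, 0): (1, 2, 1, 0),   # placeholder entries; real factors come from the offline learning
    (28, 2): (1, 6, 1, 0),
}

def set_filter_factor(quantization_parameter, intra_prediction_mode):
    """Return the 3-tap filter factor plus offset {c0, c1, c2, c3} for the given
    quantization parameter and intra prediction mode, falling back to the
    H.264/AVC default {1, 2, 1} with zero offset when no learned entry exists."""
    return filter_factor_table.get(
        (quantization_parameter, intra_prediction_mode), (1, 2, 1, 0))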
The motion prediction/compensating unit 76 performs motion prediction and compensation processing for all candidate inter prediction modes. Specifically, the image to be subjected to inter processing read out from the picture rearranging buffer 62 and the reference image from the frame memory 72 are supplied to the motion prediction/compensating unit 76 via the switch 73. Based on the image to be subjected to inter processing and the reference image, the motion prediction/compensating unit 76 detects the motion vectors of all candidate inter prediction modes, subjects the reference image to compensation processing based on the motion vectors, and generates forecast images.
Also, the motion prediction/compensating unit 76 calculates cost function values for all candidate inter prediction modes. The motion prediction/compensating unit 76 determines, of the calculated cost function values, the prediction mode giving the minimum value to be the best inter prediction mode.
The motion prediction/compensating unit 76 supplies the forecast image generated in the best inter prediction mode and the cost function value thereof to the forecast image selecting unit 77. In the case where the forecast image selecting unit 77 has selected the forecast image generated in the best inter prediction mode, the motion prediction/compensating unit 76 outputs the information indicating the best inter prediction mode (inter prediction mode information) to the lossless coding unit 66.
Note that motion vector information, flag information, reference frame information, and so forth are output to the lossless coding unit 66 as needed. The lossless coding unit 66 also subjects the information from the motion prediction/compensating unit 76 to lossless coding processing such as variable-length coding or arithmetic coding, and inserts it into the header of the compressed image.
Based on the cost function values output from the intraprediction unit 74 or the motion prediction/compensating unit 76, the forecast image selecting unit 77 determines the optimum prediction mode from the best intra prediction mode and the best inter prediction mode. The forecast image selecting unit 77 then selects the forecast image in the determined optimum prediction mode, and supplies it to the computing units 63 and 70. At this time, the forecast image selecting unit 77 supplies the selection information of the forecast image to the intraprediction unit 74 or the motion prediction/compensating unit 76.
The rate control unit 78 controls the rate of the quantization operation of the quantifying unit 65 with the quantization parameter, based on the compressed images stored in the storage buffer 67, so as not to cause overflow or underflow.
The quantization parameter used for rate control at the quantifying unit 65 is supplied to the lossless coding unit 66, subjected to lossless coding processing, and inserted into the header of the compressed image. This quantization parameter is also supplied to the neighborhood pixels interpolation filtering switch unit 75, and is used for setting the filter factor of the filtering processing applied to the neighborhood pixels.
[Description of intra prediction processing according to the H.264/AVC format]
First, the intra prediction modes determined by the H.264/AVC format will be described.
First, the intra prediction modes for luminance signals will be described. Three systems are determined for the luminance signal intra prediction modes: the intra 4×4 prediction mode, the intra 8×8 prediction mode, and the intra 16×16 prediction mode. These are modes determining block units, and are set for each macroblock. Also, intra prediction modes can be set for color difference signals independently from the luminance signals, for each macroblock.
Further, in the case of the intra 4×4 prediction mode, one prediction mode out of nine kinds of prediction modes can be set for each 4×4 pixel current block. In the case of the intra 8×8 prediction mode, one prediction mode out of nine kinds of prediction modes can be set for each 8×8 pixel current block. Also, in the case of the intra 16×16 prediction mode, one prediction mode out of four kinds of prediction modes can be set for a 16×16 pixel current macroblock.
Note that, hereinafter, the intra 4×4 prediction mode, the intra 8×8 prediction mode, and the intra 16×16 prediction mode will also be referred to as 4×4 pixel intra prediction mode, 8×8 pixel intra prediction mode, and 16×16 pixel intra prediction mode, respectively, as appropriate.
In the example in Fig. 2, the numbers -1 through 25 appended to the blocks represent the bit stream order (the processing order on the decoding side) of the blocks. Note that, with regard to luminance signals, a macroblock is divided into 4×4 pixels, and DCT of 4×4 pixels is performed. Only in the case of the intra 16×16 prediction mode, as shown in the -1 block, the DC components of the blocks are gathered to generate a 4×4 matrix, and this matrix is further subjected to orthogonal transform.
On the other hand, with regard to color difference signals, after the macroblock is divided into 4×4 pixels and DCT of 4×4 pixels is performed, the DC components of the blocks are gathered as shown in the blocks 16 and 17, to generate a 2×2 matrix, and this matrix is further subjected to orthogonal transform.
Note that, with regard to the intra 8×8 prediction mode, this is applicable only to the case where the current macroblock is subjected to 8×8 orthogonal transform with a high profile or a profile beyond this.
Fig. 3 and Fig. 4 are diagrams showing the nine kinds of 4×4 pixel intra prediction modes (Intra_4×4_pred_mode) for luminance signals. The eight kinds of modes other than mode 2, which indicates average value (DC) prediction, correspond respectively to the directions indicated by the numbers 0, 1, and 3 through 8 in Fig. 5.
The nine kinds of Intra_4×4_pred_mode will be described with reference to Fig. 6. In the example in Fig. 6, the pixels a through p represent the pixels of the current block to be subjected to intra processing, and the pixel values A through M represent the pixel values of pixels belonging to the neighboring blocks. Specifically, the pixels a through p are of the image to be processed that has been read out from the picture rearranging buffer 62, and the pixel values A through M are the pixel values of the decoded image to be read out from the frame memory 72 and referenced.
In the case of the intra prediction modes shown in Fig. 3 and Fig. 4, the predicted pixel values of the pixels a through p are generated as follows, using the pixel values A through M of the pixels belonging to the neighboring blocks. Here, a pixel value being "available" represents that the pixel value is usable, with no reason such as the pixel being at the edge of the picture frame or being not yet encoded. On the other hand, a pixel value being "unavailable" represents that the pixel value is unusable due to a reason such as the pixel being at the edge of the picture frame or being not yet encoded.
Mode 0 is the Vertical prediction mode, and is applicable only in the case where the pixel values A through D are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (1):
Predicted pixel values of pixels a, e, i, and m = A
Predicted pixel values of pixels b, f, j, and n = B
Predicted pixel values of pixels c, g, k, and o = C
Predicted pixel values of pixels d, h, l, and p = D ...(1)
Mode 1 is the Horizontal prediction mode, and is applicable only in the case where the pixel values I through L are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (2):
Predicted pixel values of pixels a, b, c, and d = I
Predicted pixel values of pixels e, f, g, and h = J
Predicted pixel values of pixels i, j, k, and l = K
Predicted pixel values of pixels m, n, o, and p = L ...(2)
Mode 2 is the DC prediction mode, and when the pixel values A, B, C, D, I, J, K, and L are all "available", the predicted pixel values are generated as in expression (3).
(A+B+C+D+I+J+K+L+4)>>3...(3)
Also, when the pixel values A, B, C, and D are all "unavailable", the predicted pixel values are generated as in expression (4).
(I+J+K+L+2)>>2...(4)
Also, when the pixel values I, J, K, and L are all "unavailable", the predicted pixel values are generated as in expression (5).
(A+B+C+D+2)>>2...(5)
Note that when the pixel values A, B, C, D, I, J, K, and L are all "unavailable", 128 is used as the predicted pixel value.
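As an informal illustration, not part of the original specification, the following sketch shows how the Vertical, Horizontal, and DC prediction of expressions (1) through (5) could be realized; the function name, argument layout, and availability flags are assumptions.

def predict_4x4_modes_0_to_2(mode, A_to_D, I_to_L, available_top, available_left):
    """Return a 4x4 array (list of rows) of predicted pixel values for modes 0
    (Vertical), 1 (Horizontal), and 2 (DC), following expressions (1) through (5).
    A_to_D are the pixel values A..D above the block, I_to_L the values I..L to its left."""
    if mode == 0:                                    # Vertical: expression (1)
        assert available_top
        return [list(A_to_D) for _ in range(4)]
    if mode == 1:                                    # Horizontal: expression (2)
        assert available_left
        return [[I_to_L[row]] * 4 for row in range(4)]
    if mode == 2:                                    # DC: expressions (3)-(5) and the 128 fallback
        if available_top and available_left:
            dc = (sum(A_to_D) + sum(I_to_L) + 4) >> 3
        elif available_left:
            dc = (sum(I_to_L) + 2) >> 2
        elif available_top:
            dc = (sum(A_to_D) + 2) >> 2
        else:
            dc = 128
        return [[dc] * 4 for _ in range(4)]
    raise ValueError("modes 3 through 8 are directional and are handled separately")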
Mode 3 is the Diagonal_Down_Left prediction mode, and is applicable only in the case where the pixel values A, B, C, D, I, J, K, L, and M are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (6):
The predicted pixel values of pixel a=(A+2B+C+2) > > 2
The predicted pixel values of pixel b and e=(B+2C+D+2) > > 2
The predicted pixel values of pixel c, f and i=(C+2D+E+2) > > 2
The predicted pixel values of pixel d, g, j and m=(D+2E+F+2) > > 2
The predicted pixel values of pixel h, k and n=(E+2F+G+2) > > 2
The predicted pixel values of pixel l and o=(F+2G+H+2) > > 2
The predicted pixel values of pixel p=(G+3H+2) > > 2...(6)
Mode 4 is the Diagonal_Down_Right prediction mode, and is applicable only in the case where the pixel values A, B, C, D, I, J, K, L, and M are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (7):
The predicted pixel values of pixel m=(J+2K+L+2) > > 2
The predicted pixel values of pixel i and n=(I+2J+K+2) > > 2
The predicted pixel values of pixel e, j and o=(M+2I+J+2) > > 2
The predicted pixel values of pixel a, f, k and p=(A+2M+I+2) > > 2
The predicted pixel values of pixel b, g and l=(M+2A+B+2) > > 2
The predicted pixel values of pixel c and h=(A+2B+C+2) > > 2
The predicted pixel values of pixel d=(B+2C+D+2) > > 2...(7)
Mode 5 is the Diagonal_Vertical_Right prediction mode, and is applicable only in the case where the pixel values A, B, C, D, I, J, K, L, and M are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (8):
The predicted pixel values of pixel a and j=(M+A+1) > > 1
The predicted pixel values of pixel b and k=(A+B+1) > > 1
The predicted pixel values of pixel c and l=(B+C+1) > > 1
The predicted pixel values of pixel d=(C+D+1) > > 1
The predicted pixel values of pixel e and n=(I+2M+A+2) > > 2
The predicted pixel values of pixel f and o=(M+2A+B+2) > > 2
The predicted pixel values of pixel g and p=(A+2B+C+2) > > 2
The predicted pixel values of pixel h=(B+2C+D+2) > > 2
The predicted pixel values of pixel i=(M+2I+J+2) > > 2
The predicted pixel values of pixel m=(I+2J+K+2) > > 2.. (8)
Mode 6 is the Horizontal_Down prediction mode, and is applicable only in the case where the pixel values A, B, C, D, I, J, K, L, and M are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (9):
The predicted pixel values of pixel a and g=(M+I+1) > > 1
The predicted pixel values of pixel b and h=(I+2M+A+2) > > 2
The predicted pixel values of pixel c=(M+2A+B+2) > > 2
The predicted pixel values of pixel d=(A+2B+C+2) > > 2
The predicted pixel values of pixel e and k=(I+J+1) > > 1
The predicted pixel values of pixel f and l=(M+2I+J+2) > > 2
The predicted pixel values of pixel i and o=(J+K+1) > > 1
The predicted pixel values of pixel j and p=(I+2J+K+2) > > 2
The predicted pixel values of pixel m=(K+L+1) > > 1
The predicted pixel values of pixel n=(J+2K+L+2) > > 2...(9)
Mode 7 is the Vertical_Left prediction mode, and is applicable only in the case where the pixel values A, B, C, D, I, J, K, L, and M are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (10):
The predicted pixel values of pixel a=(A+B+1) > > 1
The predicted pixel values of pixel b and i=(B+C+1) > > 1
The predicted pixel values of pixel c and j=(C+D+1) > > 1
The predicted pixel values of pixel d and k=(D+E+1) > > 1
The predicted pixel values of pixel l=(E+F+1) > > 1
The predicted pixel values of pixel e=(A+2B+C+2) > > 2
The predicted pixel values of pixel f and m=(B+2C+D+2) > > 2
The predicted pixel values of pixel g and n=(C+2D+E+2) > > 2
The predicted pixel values of pixel h and o=(D+2E+F+2) > > 2
The predicted pixel values of pixel p=(E+2F+G+2) > > 2...(10)
Mode 8 is the Horizontal_Up prediction mode, and is applicable only in the case where the pixel values A, B, C, D, I, J, K, L, and M are "available". In this case, the predicted pixel values of the pixels a through p are generated as in the following expression (11):
The predicted pixel values of pixel a=(I+J+1) > > 1
The predicted pixel values of pixel b=(I+2J+K+2) > > 2
The predicted pixel values of pixel c and e=(J+K+1) > > 1
The predicted pixel values of pixel d and f=(J+2K+L+2) > > 2
The predicted pixel values of pixel g and i=(K+L+1) > > 1
The predicted pixel values of pixel h and j=(K+3L+2) > > 2
Predicted pixel values of pixels k, l, m, n, o, and p = L ...(11)
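As one further illustration, not part of the original specification, the directional modes can be implemented compactly by noting that all pixels lying on the same diagonal share one predicted value. The following sketch shows this for the Diagonal_Down_Right mode of expression (7); the function name and argument layout are assumptions.

def predict_4x4_diagonal_down_right(top, left, top_left):
    """Mode 4 (Diagonal_Down_Right) of expression (7).  top = [A, B, C, D],
    left = [I, J, K, L], top_left = M.  Returns a 4x4 list of rows."""
    A, B, C, D = top
    I, J, K, L = left
    M = top_left
    # Pixels with the same (x - y) lie on one diagonal and share a predicted value.
    diag = {
        -3: (J + 2 * K + L + 2) >> 2,   # pixel m
        -2: (I + 2 * J + K + 2) >> 2,   # pixels i, n
        -1: (M + 2 * I + J + 2) >> 2,   # pixels e, j, o
         0: (A + 2 * M + I + 2) >> 2,   # pixels a, f, k, p
         1: (M + 2 * A + B + 2) >> 2,   # pixels b, g, l
         2: (A + 2 * B + C + 2) >> 2,   # pixels c, h
         3: (B + 2 * C + D + 2) >> 2,   # pixel d
    }
    return [[diag[x - y] for x in range(4)] for y in range(4)]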
Next, the coding format of the 4×4 pixel intra prediction modes (Intra_4×4_pred_mode) for luminance signals will be described with reference to Fig. 7. In the example in Fig. 7, a current block C to be encoded, made up of 4×4 pixels, is shown, along with a block A and a block B which are made up of 4×4 pixels and neighbor the current block C.
In this case, it is conceivable that the Intra_4×4_pred_mode in the current block C and the Intra_4×4_pred_mode in the block A and the block B have high correlation. Performing encoding processing using this correlation as follows enables higher coding efficiency to be realized.
Specifically, in the example in Fig. 7, with the Intra_4×4_pred_mode in the block A and the block B taken as Intra_4×4_pred_modeA and Intra_4×4_pred_modeB respectively, MostProbableMode is defined as in the following expression (12):
MostProbableMode = Min(Intra_4×4_pred_modeA, Intra_4×4_pred_modeB) ...(12)
That is to say, of the block A and the block B, the one assigned the smaller mode_number is taken as MostProbableMode.
Two values called prev_intra4×4_pred_mode_flag[luma4×4BlkIdx] and rem_intra4×4_pred_mode[luma4×4BlkIdx] are defined in the bit stream as parameters for the current block C, and decoding processing is performed by processing based on the pseudo-code shown in the following expression (13), whereby the values of Intra_4×4_pred_mode and Intra4×4PredMode[luma4×4BlkIdx] for the block C can be obtained.
if (prev_intra4×4_pred_mode_flag[luma4×4BlkIdx])
    Intra4×4PredMode[luma4×4BlkIdx] = MostProbableMode
else
    if (rem_intra4×4_pred_mode[luma4×4BlkIdx] < MostProbableMode)
        Intra4×4PredMode[luma4×4BlkIdx] = rem_intra4×4_pred_mode[luma4×4BlkIdx]
    else
        Intra4×4PredMode[luma4×4BlkIdx] = rem_intra4×4_pred_mode[luma4×4BlkIdx] + 1 ...(13)
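A minimal runnable sketch, not part of the original text, of the decoding logic of expression (13) is given below; the function and argument names simply mirror the syntax elements above.

def decode_intra4x4_pred_mode(most_probable_mode,
                              prev_intra4x4_pred_mode_flag,
                              rem_intra4x4_pred_mode):
    """Recover Intra4x4PredMode for one 4x4 block from the two coded values,
    following the pseudo-code of expression (13)."""
    if prev_intra4x4_pred_mode_flag:
        return most_probable_mode
    if rem_intra4x4_pred_mode < most_probable_mode:
        return rem_intra4x4_pred_mode
    return rem_intra4x4_pred_mode + 1

# For example, with MostProbableMode = 2 and rem_intra4x4_pred_mode = 2, the
# decoded mode is 3, since coded values of MostProbableMode or above are shifted up by one.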
Next, the 8×8 pixel intra prediction modes will be described. Fig. 8 and Fig. 9 are diagrams showing the nine kinds of 8×8 pixel intra prediction modes (intra_8×8_pred_mode) for luminance signals.
Let us say that the pixel values in the current 8×8 block are taken as p[x, y] (0 ≤ x ≤ 7; 0 ≤ y ≤ 7), and the pixel values of the neighboring blocks are represented as p[-1, -1], ..., p[15, -1], p[-1, 0], ..., p[-1, 7].
With regard to the 8×8 pixel intra prediction modes, the neighborhood pixels are subjected to low-pass filtering processing before the prediction values are generated. Now, let us say that the pixel values before the low-pass filtering processing are represented as p[-1, -1], ..., p[15, -1], p[-1, 0], ..., p[-1, 7], and the pixel values after this processing are represented as p'[-1, -1], ..., p'[15, -1], p'[-1, 0], ..., p'[-1, 7].
First, in the case where p[-1, -1] is "available", p'[0, -1] is calculated as in the following expression (14), and in the case where it is "unavailable", it is calculated as in the following expression (15).
p'[0, -1] = (p[-1, -1] + 2*p[0, -1] + p[1, -1] + 2) >> 2 ...(14)
p'[0, -1] = (3*p[0, -1] + p[1, -1] + 2) >> 2 ...(15)
p'[x, -1] (x = 0, ..., 7) is calculated as in the following expression (16).
p'[x, -1] = (p[x-1, -1] + 2*p[x, -1] + p[x+1, -1] + 2) >> 2 ...(16)
In the case where p[x, -1] (x = 8, ..., 15) is "available", p'[x, -1] (x = 8, ..., 15) is calculated as in the following expression (17).
p'[x, -1] = (p[x-1, -1] + 2*p[x, -1] + p[x+1, -1] + 2) >> 2
p'[15, -1] = (p[14, -1] + 3*p[15, -1] + 2) >> 2 ...(17)
In the case where p[-1, -1] is "available", p'[-1, -1] is calculated as follows. Specifically, in the case where both p[0, -1] and p[-1, 0] are "available", p'[-1, -1] is calculated as in expression (18); in the case where p[-1, 0] is "unavailable", it is calculated as in expression (19); and in the case where p[0, -1] is "unavailable", it is calculated as in expression (20).
p'[-1, -1] = (p[0, -1] + 2*p[-1, -1] + p[-1, 0] + 2) >> 2 ...(18)
p'[-1, -1] = (3*p[-1, -1] + p[0, -1] + 2) >> 2 ...(19)
p'[-1, -1] = (3*p[-1, -1] + p[-1, 0] + 2) >> 2 ...(20)
When p[-1, y] (y = 0, ..., 7) is "available", p'[-1, y] (y = 0, ..., 7) is calculated as follows. Specifically, first, in the case where p[-1, -1] is "available", p'[-1, 0] is calculated as in expression (21), and in the case where p[-1, -1] is "unavailable", it is calculated as in expression (22).
p'[-1, 0] = (p[-1, -1] + 2*p[-1, 0] + p[-1, 1] + 2) >> 2 ...(21)
p'[-1, 0] = (3*p[-1, 0] + p[-1, 1] + 2) >> 2 ...(22)
Also, p'[-1, y] (y = 1, ..., 6) is calculated as in the following expression (23), and p'[-1, 7] is calculated as in expression (24).
p'[-1, y] = (p[-1, y-1] + 2*p[-1, y] + p[-1, y+1] + 2) >> 2 ...(23)
p'[-1, 7] = (p[-1, 6] + 3*p[-1, 7] + 2) >> 2 ...(24)
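To make the above availability cases concrete, the following is a sketch, not part of the original specification, of the neighborhood-pixel smoothing of expressions (14) through (24); the dict-based representation of availability is an assumption, and unavailable right-extension samples above the block are handled by the usual substitution with p[7, -1].

def smooth_8x8_neighbors(p):
    """Low-pass filtering of the neighbors of an 8x8 block, expressions (14)-(24).
    `p` maps (x, y) to pixel values of the neighbors: (x, -1) for x = -1, ..., 15
    above the block and (-1, y) for y = 0, ..., 7 to its left; a missing key means
    "unavailable".  Whenever a position is present, the positions its filter taps
    read from are assumed present as well.  Returns the filtered values p'."""
    q = {}
    # Above row: expressions (14)/(15) for p'[0, -1], (16)/(17) for the rest.
    if (0, -1) in p:
        if (-1, -1) in p:
            q[(0, -1)] = (p[(-1, -1)] + 2 * p[(0, -1)] + p[(1, -1)] + 2) >> 2        # (14)
        else:
            q[(0, -1)] = (3 * p[(0, -1)] + p[(1, -1)] + 2) >> 2                      # (15)
        last = 15 if (8, -1) in p else 7
        for x in range(1, last):
            q[(x, -1)] = (p[(x - 1, -1)] + 2 * p[(x, -1)] + p[(x + 1, -1)] + 2) >> 2 # (16)/(17)
        q[(last, -1)] = (p[(last - 1, -1)] + 3 * p[(last, -1)] + 2) >> 2             # (17)
    # Top-left corner: expressions (18) through (20).
    if (-1, -1) in p:
        if (0, -1) in p and (-1, 0) in p:
            q[(-1, -1)] = (p[(0, -1)] + 2 * p[(-1, -1)] + p[(-1, 0)] + 2) >> 2       # (18)
        elif (0, -1) in p:
            q[(-1, -1)] = (3 * p[(-1, -1)] + p[(0, -1)] + 2) >> 2                    # (19)
        elif (-1, 0) in p:
            q[(-1, -1)] = (3 * p[(-1, -1)] + p[(-1, 0)] + 2) >> 2                    # (20)
    # Left column: expressions (21)/(22) for p'[-1, 0], (23)/(24) for the rest.
    if (-1, 0) in p:
        if (-1, -1) in p:
            q[(-1, 0)] = (p[(-1, -1)] + 2 * p[(-1, 0)] + p[(-1, 1)] + 2) >> 2        # (21)
        else:
            q[(-1, 0)] = (3 * p[(-1, 0)] + p[(-1, 1)] + 2) >> 2                      # (22)
        for y in range(1, 7):
            q[(-1, y)] = (p[(-1, y - 1)] + 2 * p[(-1, y)] + p[(-1, y + 1)] + 2) >> 2 # (23)
        q[(-1, 7)] = (p[(-1, 6)] + 3 * p[(-1, 7)] + 2) >> 2                          # (24)
    return q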
Using the p' calculated in this way, the prediction values in the intra prediction modes shown in Fig. 8 and Fig. 9 are generated as follows.
Mode 0 is the Vertical prediction mode, and is applied only when p[x, -1] (x = 0, ..., 7) is "available". The prediction values pred8×8_L[x, y] are generated as in the following expression (25).
pred8×8_L[x, y] = p'[x, -1]   x, y = 0, ..., 7 ...(25)
Mode 1 is the Horizontal prediction mode, and is applied only when p[-1, y] (y = 0, ..., 7) is "available". The prediction values pred8×8_L[x, y] are generated as in the following expression (26).
pred8×8_L[x, y] = p'[-1, y]   x, y = 0, ..., 7 ...(26)
Mode 2 is the DC prediction mode, and the prediction values pred8×8_L[x, y] are generated as follows. Specifically, in the case where both p[x, -1] (x = 0, ..., 7) and p[-1, y] (y = 0, ..., 7) are "available", the prediction values pred8×8_L[x, y] are generated as in expression (27).
[mathematical expression 1]
pred8×8_L[x, y] = (Σ_{x'=0..7} p'[x', -1] + Σ_{y'=0..7} p'[-1, y'] + 8) >> 4 ...(27)
In the case where p[x, -1] (x = 0, ..., 7) is "available" but p[-1, y] (y = 0, ..., 7) is "unavailable", the prediction values pred8×8_L[x, y] are generated as in expression (28).
[mathematical expression 2]
pred8×8_L[x, y] = (Σ_{x'=0..7} p'[x', -1] + 4) >> 3 ...(28)
In the case where p[x, -1] (x = 0, ..., 7) is "unavailable" but p[-1, y] (y = 0, ..., 7) is "available", the prediction values pred8×8_L[x, y] are generated as in expression (29).
[mathematical expression 3]
pred8×8_L[x, y] = (Σ_{y'=0..7} p'[-1, y'] + 4) >> 3 ...(29)
In the case where both p[x, -1] (x = 0, ..., 7) and p[-1, y] (y = 0, ..., 7) are "unavailable", the prediction values pred8×8_L[x, y] are generated as in expression (30).
pred8×8_L[x, y] = 128 ...(30)
Here, expression (30) represents the case of 8-bit input.
Mode 3 is the Diagonal_Down_Left_prediction mode, and the prediction values pred8×8_L[x, y] are generated as follows. Specifically, the Diagonal_Down_Left_prediction mode is applied only when p[x, -1] (x = 0, ..., 15) is "available"; the predicted pixel value in the case where x = 7 and y = 7 is generated as in the following expression (31), and the other predicted pixel values are generated as in the following expression (32).
pred8×8_L[x, y] = (p'[14, -1] + 3*p'[15, -1] + 2) >> 2 ...(31)
pred8×8_L[x, y] = (p'[x+y, -1] + 2*p'[x+y+1, -1] + p'[x+y+2, -1] + 2) >> 2 ...(32)
Mode 4 is the Diagonal_Down_Right_prediction mode, and the prediction values pred8×8_L[x, y] are generated as follows. Specifically, the Diagonal_Down_Right_prediction mode is applied only when p[x, -1] (x = 0, ..., 7) and p[-1, y] (y = 0, ..., 7) are "available"; the predicted pixel values in the case where x > y are generated as in the following expression (33), and the predicted pixel values in the case where x < y are generated as in the following expression (34). Also, the predicted pixel values in the case where x = y are generated as in the following expression (35).
pred8×8_L[x, y] = (p'[x-y-2, -1] + 2*p'[x-y-1, -1] + p'[x-y, -1] + 2) >> 2 ...(33)
pred8×8_L[x, y] = (p'[-1, y-x-2] + 2*p'[-1, y-x-1] + p'[-1, y-x] + 2) >> 2 ...(34)
pred8×8_L[x, y] = (p'[0, -1] + 2*p'[-1, -1] + p'[-1, 0] + 2) >> 2 ...(35)
Mode 5 is the Vertical_Right_prediction mode, and the prediction values pred8×8_L[x, y] are generated as follows. Specifically, the Vertical_Right_prediction mode is applied only when p[x, -1] (x = 0, ..., 7) and p[-1, y] (y = -1, ..., 7) are "available". Now, zVR is defined as in the following expression (36).
zVR = 2*x - y ...(36)
At this time, in the case where zVR is 0, 2, 4, 6, 8, 10, 12, or 14, the predicted pixel values are generated as in the following expression (37), and in the case where zVR is 1, 3, 5, 7, 9, 11, or 13, the predicted pixel values are generated as in the following expression (38).
pred8×8_L[x, y] = (p'[x-(y>>1)-1, -1] + p'[x-(y>>1), -1] + 1) >> 1 ...(37)
pred8×8_L[x, y] = (p'[x-(y>>1)-2, -1] + 2*p'[x-(y>>1)-1, -1] + p'[x-(y>>1), -1] + 2) >> 2 ...(38)
Also, in the case where zVR is -1, the predicted pixel values are generated as in the following expression (39), and in cases other than this, specifically in the case where zVR is -2, -3, -4, -5, -6, or -7, the predicted pixel values are generated as in the following expression (40).
pred8×8_L[x, y] = (p'[-1, 0] + 2*p'[-1, -1] + p'[0, -1] + 2) >> 2 ...(39)
pred8×8_L[x, y] = (p'[-1, y-2*x-1] + 2*p'[-1, y-2*x-2] + p'[-1, y-2*x-3] + 2) >> 2 ...(40)
Mode 6 is the Horizontal_Down_prediction mode, and the prediction values pred8×8_L[x, y] are generated as follows. Specifically, the Horizontal_Down_prediction mode is applied only when p[x, -1] (x = 0, ..., 7) and p[-1, y] (y = -1, ..., 7) are "available". Now, zHD is defined as in the following expression (41).
zHD = 2*y - x ...(41)
At this time, in the case where zHD is 0, 2, 4, 6, 8, 10, 12, or 14, the predicted pixel values are generated as in the following expression (42), and in the case where zHD is 1, 3, 5, 7, 9, 11, or 13, the predicted pixel values are generated as in the following expression (43).
pred8×8_L[x, y] = (p'[-1, y-(x>>1)-1] + p'[-1, y-(x>>1)] + 1) >> 1 ...(42)
pred8×8_L[x, y] = (p'[-1, y-(x>>1)-2] + 2*p'[-1, y-(x>>1)-1] + p'[-1, y-(x>>1)] + 2) >> 2 ...(43)
Also, in the case where zHD is -1, the predicted pixel values are generated as in the following expression (44), and in cases other than this, specifically in the case where zHD is -2, -3, -4, -5, -6, or -7, the predicted pixel values are generated as in the following expression (45).
pred8×8_L[x, y] = (p'[-1, 0] + 2*p'[-1, -1] + p'[0, -1] + 2) >> 2 ...(44)
pred8×8_L[x, y] = (p'[x-2*y-1, -1] + 2*p'[x-2*y-2, -1] + p'[x-2*y-3, -1] + 2) >> 2 ...(45)
Mode 7 is the Vertical_Left_prediction mode, and the prediction values pred8×8_L[x, y] are generated as follows. Specifically, the Vertical_Left_prediction mode is applied only when p[x, -1] (x = 0, ..., 15) is "available"; in the case where y = 0, 2, 4, or 6, the predicted pixel values are generated as in the following expression (46), and in cases other than this, that is, in the case where y = 1, 3, 5, or 7, the predicted pixel values are generated as in the following expression (47).
pred8×8_L[x, y] = (p'[x+(y>>1), -1] + p'[x+(y>>1)+1, -1] + 1) >> 1 ...(46)
pred8×8_L[x, y] = (p'[x+(y>>1), -1] + 2*p'[x+(y>>1)+1, -1] + p'[x+(y>>1)+2, -1] + 2) >> 2 ...(47)
Mode 8 is the Horizontal_Up_prediction mode, and the prediction values pred8×8_L[x, y] are generated as follows. Specifically, the Horizontal_Up_prediction mode is applied only when p[-1, y] (y = 0, ..., 7) is "available". Now, zHU is defined as in the following expression (48).
zHU = x + 2*y ...(48)
At this time, in the case where the value of zHU is 0, 2, 4, 6, 8, 10, or 12, the predicted pixel values are generated as in the following expression (49), and in the case where the value of zHU is 1, 3, 5, 7, 9, or 11, the predicted pixel values are generated as in the following expression (50).
pred8×8_L[x, y] = (p'[-1, y+(x>>1)] + p'[-1, y+(x>>1)+1] + 1) >> 1 ...(49)
pred8×8_L[x, y] = (p'[-1, y+(x>>1)] + 2*p'[-1, y+(x>>1)+1] + p'[-1, y+(x>>1)+2] + 2) >> 2 ...(50)
Also, in the case where the value of zHU is 13, the predicted pixel values are generated as in the following expression (51), and in cases other than this, that is, in the case where the value of zHU is greater than 13, the predicted pixel values are generated as in the following expression (52).
pred8×8_L[x, y] = (p'[-1, 6] + 3*p'[-1, 7] + 2) >> 2 ...(51)
pred8×8_L[x, y] = p'[-1, 7] ...(52)
Next, the 16×16 pixel intra prediction modes will be described. Fig. 10 and Fig. 11 are diagrams showing the four kinds of 16×16 pixel intra prediction modes (Intra_16×16_pred_mode) for luminance signals.
The four kinds of intra prediction modes will be described with reference to Fig. 12. In the example in Fig. 12, a current macroblock A to be subjected to intra processing is shown, and P(x, y); x, y = -1, 0, ..., 15 represents the pixel values of the pixels neighboring the current macroblock A.
Mode 0 is the Vertical prediction mode, and is applicable only when P(x, -1); x, y = -1, 0, ..., 15 is "available". In this case, the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (53).
Pred(x, y) = P(x, -1); x, y = 0, ..., 15 ...(53)
Mode 1 is the Horizontal prediction mode, and is applicable only when P(-1, y); x, y = -1, 0, ..., 15 is "available". In this case, the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (54).
Pred(x, y) = P(-1, y); x, y = 0, ..., 15 ...(54)
Mode 2 is the DC prediction mode, and in the case where P(x, -1) and P(-1, y); x, y = -1, 0, ..., 15 are all "available", the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (55).
[mathematical expression 4]
Pred(x, y) = (Σ_{x'=0..15} P(x', -1) + Σ_{y'=0..15} P(-1, y') + 16) >> 5;  x, y = 0, ..., 15 ...(55)
Also, in the case where P(x, -1); x, y = -1, 0, ..., 15 is "unavailable", the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (56).
[mathematical expression 5]
Pred(x, y) = (Σ_{y'=0..15} P(-1, y') + 8) >> 4;  x, y = 0, ..., 15 ...(56)
Further, in the case where P(-1, y); x, y = -1, 0, ..., 15 is "unavailable", the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (57).
[mathematical expression 6]
Pred(x, y) = (Σ_{x'=0..15} P(x', -1) + 8) >> 4;  x, y = 0, ..., 15 ...(57)
In the case where P(x, -1) and P(-1, y); x, y = -1, 0, ..., 15 are all "unavailable", 128 is used as the predicted pixel value.
Mode 3 is the Plane prediction mode, and is applied only in the case where P(x, -1) and P(-1, y); x, y = -1, 0, ..., 15 are all "available". In this case, the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (58).
[mathematical expression 7]
Next, the intra prediction modes for color difference signals will be described. Fig. 13 is a diagram showing the four kinds of intra prediction modes (Intra_chroma_pred_mode) for color difference signals. The intra prediction modes for color difference signals can be set independently of the intra prediction modes for luminance signals. The intra prediction modes for color difference signals conform to the above-described 16×16 pixel intra prediction modes for luminance signals.
However, while the 16×16 pixel intra prediction modes for luminance signals take a 16×16 pixel block as the target, the intra prediction modes for color difference signals take an 8×8 pixel block as the target. Further, as shown in Fig. 10 and Fig. 13 described above, the mode numbers of the two do not correspond.
Now, we conform to the definitions of the pixel values of the current macroblock A and the neighboring pixel values in the 16×16 pixel intra prediction modes for luminance signals described above with reference to Fig. 12. For example, let us say that the pixel values of the pixels neighboring the current macroblock A to be subjected to intra processing (8×8 pixels in the case of a color difference signal) are taken as P(x, y); x, y = -1, 0, ..., 7.
Mode 0 is the DC prediction mode, and in the case where P(x, -1) and P(-1, y); x, y = -1, 0, ..., 7 are all "available", the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (59).
[mathematical expression 8]
Also, in the case where P(-1, y); x, y = -1, 0, ..., 7 is "unavailable", the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (60).
[mathematical expression 9]
Also, in the case where P(x, -1); x, y = -1, 0, ..., 7 is "unavailable", the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (61).
[mathematical expression 10]
Mode 1 is the Horizontal prediction mode, and is applied only in the case where P(-1, y); x, y = -1, 0, ..., 7 is "available". In this case, the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (62).
Pred(x, y) = P(-1, y); x, y = 0, ..., 7 ...(62)
Mode 2 is the Vertical prediction mode, and is applied only in the case where P(x, -1); x, y = -1, 0, ..., 7 is "available". In this case, the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (63).
Pred(x, y) = P(x, -1); x, y = 0, ..., 7 ...(63)
Mode 3 is the Plane prediction mode, and is applied only in the case where P(x, -1) and P(-1, y); x, y = -1, 0, ..., 7 are "available". In this case, the predicted pixel value Pred(x, y) of each pixel of the current macroblock A is generated as in the following expression (64).
[mathematical expression 11]
As described above, the intra prediction modes for luminance signals include nine kinds of prediction modes in block units of 4×4 pixels and 8×8 pixels, and four kinds of prediction modes in macroblock units of 16×16 pixels. These block-unit modes are set for each macroblock unit. The intra prediction modes for color difference signals include four kinds of prediction modes in macroblock units of 8×8 pixels. These intra prediction modes for color difference signals can be set independently of the intra prediction modes for luminance signals.
Further, with regard to the 4×4 pixel intra prediction modes (intra 4×4 prediction modes) and the 8×8 pixel intra prediction modes (intra 8×8 prediction modes) for luminance signals, one intra prediction mode is set for each 4×4 pixel and 8×8 pixel luminance block. With regard to the 16×16 pixel intra prediction modes (intra 16×16 prediction modes) for luminance signals and the intra prediction modes for color difference signals, one prediction mode is set for one macroblock.
Note that these kinds of prediction modes correspond to the directions indicated by the above-described numbers 0, 1, and 3 through 8 in Fig. 5. Prediction mode 2 is average value prediction.
As described above, with intra prediction according to the H.264/AVC format, filtering processing of the pixel values of the neighborhood pixels with fixed filter factors is performed only before carrying out intra prediction in increments of 8×8 pixel blocks, using the above-described expression (14) through expression (24). In contrast, with the picture coding device 51, before carrying out intra prediction in all intra prediction modes, the pixel values of the neighborhood pixels are subjected to filtering processing with filter factors set in accordance with the block to be predicted.
[configuration example of intraprediction unit and neighborhood pixels interpolation filtering switch unit]
Fig. 14 is a block diagram illustrating a detailed configuration example of the neighborhood pixels interpolation filtering switch unit 75 and the intraprediction unit 74 shown in Fig. 1.
In the case of the example in Fig. 14, the intraprediction unit 74 is configured of an adjacent image setting unit 81, a forecast image generating unit 82, and an optimum prediction mode determination unit 83.
The neighborhood pixels interpolation filtering switch unit 75 is configured of a prediction mode buffer 91, a quantization parameter buffer 92, and a low-pass filter setting unit 93. Note that the low-pass filter setting unit 93 has a built-in filter factor memory 94.
The neighborhood pixel values of the current block to be subjected to intra prediction are supplied from the frame memory 72 to the adjacent image setting unit 81. Although illustration of the switch 73 is omitted in Fig. 14, in actual practice, the supply is performed from the frame memory 72 to the adjacent image setting unit 81 via the switch 73. Note that, in the case of intra prediction, pixel values not subjected to the deblocking filtering of the de-blocking filter 71 are used as the neighborhood pixel values.
The adjacent image setting unit 81 subjects the neighborhood pixel values of the current block from the frame memory 72 to filtering processing using the filter factor set by the low-pass filter setting unit 93, and supplies the neighborhood pixel values subjected to the filtering processing to the forecast image generating unit 82.
The forecast image generating unit 82 supplies information indicating which intra prediction mode is currently being processed to the prediction mode buffer 91. The forecast image generating unit 82 performs intra prediction on the current block in the intra prediction mode supplied to the prediction mode buffer 91, using the neighborhood pixel values subjected to the filtering processing from the adjacent image setting unit 81, and generates a forecast image. The generated forecast image is supplied to the optimum prediction mode determination unit 83 together with the intra prediction mode information.
The image to be subjected to intra prediction read out from the picture rearranging buffer 62, and the forecast images and intra prediction mode information generated by the forecast image generating unit 82, are supplied to the optimum prediction mode determination unit 83.
The optimum prediction mode determination unit 83 uses the supplied information to calculate cost function values for the intra prediction modes for which forecast images have been generated, and determines the intra prediction mode giving the minimum value of the calculated cost function values to be the best intra prediction mode. The optimum prediction mode determination unit 83 outputs the forecast image of the best intra prediction mode and the corresponding cost function value to the forecast image selecting unit 77.
Also, in the case where the forecast image selecting unit 77 has selected the forecast image generated in the best intra prediction mode, the optimum prediction mode determination unit 83 supplies the information indicating the best intra prediction mode to the lossless coding unit 66.
The prediction mode buffer 91 stores the intra prediction mode information from the forecast image generating unit 82. The quantization parameter buffer 92 stores the quantization parameter from the rate control unit 78.
The low-pass filter setting unit 93 reads out the intra prediction mode information of the current block from the prediction mode buffer 91, and reads out the quantization parameter corresponding to the current block from the quantization parameter buffer 92. The low-pass filter setting unit 93 sets, from among the filter factors stored in the built-in filter factor memory 94, the filter factor corresponding to this information, and supplies the set filter factor to the adjacent image setting unit 81.
The filter factor memory 94 stores filter factors corresponding to quantization parameters and intra prediction modes, obtained by learning using training images at the learning device 251 in Fig. 28 described later. For example, the filter factors are calculated and stored for each slice, as described below.
[Description of calculation of the optimum filter factor]
Next, a method for calculating the optimum filter factors used for the filtering processing of the neighborhood pixels will be described with reference to Fig. 15. Note that, while the example in Fig. 15 shows an example of performing vertical prediction on a current block of 4×4 pixels, the following description is applicable to the case of any intra prediction mode.
For the intra prediction in increments of 8×8 pixel blocks described with the above expression (14) through expression (24), the 3-tap filter factor {1, 2, 1} is defined as the low-pass filter for the neighborhood pixels, but here we consider {c0, c1, c2} as the general form of the 3 taps. Furthermore, with the present invention, a fourth parameter c3 is also introduced as an offset value.
Note that, although this 3-tap filter is described in the following description as being set for each slice, the filter is not restricted to this, and may be set, for example, for the entire sequence or for each GOP.
In the example in Fig. 15, a_km (0 ≤ k, m ≤ 3) are the pixel values of the pixels included in the current block, and b_m (-1 ≤ m ≤ 4) are the pixel values of the neighborhood pixels used for vertical prediction.
First, the 3-tap filtering processing is performed on the neighborhood pixel values b_m, to generate b'_m (0 ≤ m ≤ 3) shown in the following expression (65).
[mathematical expression 12]
b'_m = c0*b_{m-1} + c1*b_m + c2*b_{m+1} + c3   (0 ≤ m ≤ 3) ...(65)
That is to say, when we say that filter factors are used in performing filtering processing, the corresponding offset value is also used, as shown in expression (65), even without specific mention hereinafter. In other words, the filter factors and the offset value are the coefficients used for the filtering processing. In the same way, in the case where the filter factors are transmitted in encoded form to the decoding side, the corresponding offset value is also transmitted in encoded form.
Now, with the predicted pixel values when the intra prediction mode is n taken as p_ij(b'_m, n); 0 ≤ i, j ≤ 3, the following expression (66) holds for the intra predicted pixel values, since the prediction pixels are generated by the linear expressions described above with reference to Fig. 2 through Fig. 14.
[mathematical expression 13]
p_ij(b'_m, n)
  = p_ij(c0*b_{m-1} + c1*b_m + c2*b_{m+1} + c3, n)
  = c0*p_ij(b_{m-1}, n) + c1*p_ij(b_m, n) + c2*p_ij(b_{m+1}, n) + c3 ...(66)
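As an informal illustration, not part of the original text, the filtering with offset of expression (65) applied to the neighborhood pixels used for vertical prediction might look like the following; the function name and list indexing are assumptions, and rounding for an integer implementation is not treated here.

def filter_neighbors(b, c0, c1, c2, c3):
    """Apply the generalized 3-tap filter plus offset of expression (65) to the
    neighborhood pixel values b_{-1}, ..., b_4 (passed here as a list b[0..5], so
    that b[i] corresponds to b_{i-1}), returning [b'_0, ..., b'_3]."""
    return [c0 * b[m] + c1 * b[m + 1] + c2 * b[m + 2] + c3 for m in range(4)]

# With {c0, c1, c2} = {0.25, 0.5, 0.25} and c3 = 0 this reduces, apart from
# rounding, to the fixed {1, 2, 1} low-pass filter of expressions (14) through (24).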
At this time, with the current block Ω having a_ij as its original image pixel values, the prediction square error is as shown in the following expression (67).
[mathematical expression 14]
Err(Ω) = Σ_{i=0..3} Σ_{j=0..3} (a_ij − p_ij(b'_m, n))² ...(67)
Now, with Φ representing the set of intra blocks encoded with the intra prediction mode n in the current slice, the sum of the prediction square errors over the blocks belonging to Φ is represented by the following expression (68).
[mathematical expression 15]
Err(Ω ∈ Φ) = Σ_{Ω∈Φ} Σ_{i=0..3} Σ_{j=0..3} (a_ij − p_ij(b'_m, n))² ...(68)
In the above expression (68), we regard Err(Ω ∈ Φ) as a function of c0, c1, c2, and c3, that is, Err(Ω ∈ Φ; c0, c1, c2, c3), so the c0, c1, c2, c3 which minimize Err(Ω ∈ Φ; c0, c1, c2, c3) are the optimum filter factor values in the current slice. That is to say, it is sufficient to obtain c0, c1, c2, c3 such that the following expression (69) holds.
[mathematical expression 16]
∂Err(Ω ∈ Φ)/∂c0 = ∂Err(Ω ∈ Φ)/∂c1 = ∂Err(Ω ∈ Φ)/∂c2 = ∂Err(Ω ∈ Φ)/∂c3 = 0 ...(69)
That is to say, the simultaneous equations shown in the following expression (70) are obtained from expression (69).
[mathematical expression 17]
Rewriting this expression (70) using a matrix yields expression (71).
[mathematical expression 18]
Solving this expression (71) enables the optimum filter factors and offset value {c0, c1, c2, c3} to be obtained for the current slice.
Note that the optimum filter factors and offset value {c0, c1, c2, c3} obtained by solving the simultaneous equations of expression (70) are obtained as floating point values, but in the case of, for example, the picture coding device 51 in Fig. 1 and the corresponding picture decoding device 151 in Fig. 22, these are rounded to 8-bit coefficients.
That is to say, even if the filter factors are floating point, the filter factor memory 94 holds them as n-bit values (where n is an integer), in accordance with, for example, the register length of the processor.
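The minimization of expressions (67) through (71) is an ordinary linear least-squares problem. The following is an illustrative sketch, not part of the original specification, of how the optimum {c0, c1, c2, c3} for one slice and one prediction mode might be obtained numerically; it assumes, for simplicity, the vertical-prediction case of Fig. 15 in which each predicted pixel equals one filtered neighbor value, and the use of NumPy and the sample-collection interface are likewise assumptions.

import numpy as np

def solve_optimum_filter_factors(samples):
    """Each sample is (b_prev, b_cur, b_next, original_pixel), collected over all
    intra blocks of one prediction mode in the current slice; the predicted pixel
    is c0*b_prev + c1*b_cur + c2*b_next + c3, as in expressions (65) and (66).
    Solves the normal equations of expression (71) in the least-squares sense."""
    A = np.array([[b_prev, b_cur, b_next, 1.0]
                  for (b_prev, b_cur, b_next, _) in samples])
    t = np.array([orig for (_, _, _, orig) in samples], dtype=float)
    c, *_ = np.linalg.lstsq(A, t, rcond=None)   # minimizes ||A @ c - t||^2
    return c                                    # floating-point {c0, c1, c2, c3}

def round_to_fixed_point(c, bits=8):
    """Round the floating-point factors to fixed-point integers of the given bit
    depth, as with the 8-bit coefficients mentioned above (the scaling is an assumption)."""
    scale = 1 << (bits - 1)
    return [int(round(v * scale)) for v in c], scale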
By applying the same method as above to the other intra prediction methods as well, the optimum filter factors can also be obtained for those other intra prediction methods. Also, by the same method, the optimum filter factors can be obtained not only for the intra 4×4 prediction mode, but also for the intra 8×8 prediction mode, the intra 16×16 prediction mode, and the intra prediction modes for color difference signals.
Although one filter factor is obtained for each intra prediction mode in the above description, the filter factors are not restricted to this, and an arrangement may be made where only one filter factor is obtained for all the intra prediction modes. In particular, with the intra prediction modes described above with reference to Fig. 2 through Fig. 14, the predicted pixel values are used as they are for the Vertical and Horizontal modes, whereas some sort of averaging processing or weighted averaging processing is performed to generate the prediction pixels for the other modes, so their characteristics differ. Accordingly, performing classification into the two classes of the Vertical/Horizontal modes and the other modes, and calculating a filter factor for each class, can realize a further improvement in coding efficiency. Also, for example, there may be one filter factor for the intra 4×4 prediction mode, one filter factor for the intra 8×8 prediction mode, and one filter factor for the intra 16×16 prediction mode, with regard to luminance signals. With regard to color difference signals, for example, filter factors may be obtained separately for Cb and Cr.
Also, in the above description, the three taps {c0, c1, c2} are used as the filter factors for the low-pass filtering processing, but this is not restricted to 3 taps, and a filter of any number of taps may be used. That is to say, the filter factors and offset value for that number of taps are obtained. However, as the number of taps increases, the number of simultaneous equations to be solved also increases.
Furthermore, an arrangement may be made where different filter factors are prepared and applied according to the picture frame, such as CIF (Common Intermediate Format), QCIF (Quarter CIF), SD (Standard Definition), HD (High Definition), and so forth.
Also, with the above method, the filter factors are calculated by minimizing the intra prediction residual (prediction square error). However, the filter factor calculation method is not restricted to this, and in the case where the filter factors need to be transmitted to the decoding side, optimization including the bits for transmitting the filter factors may also be performed.
Further, for the above filter factors, we assume symmetry of the coefficients, as shown in the following expression (72).
c0 = c2 ...(72)
That is to say, the filter factors are calculated so as to have symmetry about the center coefficient corresponding to zero phase, as with {c0, c1, c0}. Accordingly, the three simultaneous equations for c0, c1, and c2 shown in the above expression (70) can be reduced to two. As a result, the amount of calculation can be reduced.
By setting filter factors suitable for the input image and performing low-pass filtering processing on the neighborhood pixels in an adaptive manner using the above method, encoding can be performed using forecast images suited to the image, quantization parameter, and prediction mode, so coding efficiency can be improved.
With regard to the calculation of the above optimum filter factors, two methods can be conceived. One method is offline processing, in which, before the encoding processing is performed, filter factors are calculated beforehand using image signals for training, so as to optimize all image signals. The learning processing serving as this offline processing will be described later with reference to Fig. 28, and the filter factors and offset values calculated by this learning processing are stored in the filter factor memory 94 in Fig. 14.
The second method is online processing, in which the optimum filter factors are successively calculated for each slice. In this case, the calculated filter factors and offset values are transmitted to the decoding side. An example of the case of performing the online processing serving as the second method will be described later with reference to Fig. 20.
[Description of the encoding processing of the picture coding device]
Next, the encoding processing of the picture coding device 51 in Fig. 1 will be described with reference to the flowchart in Fig. 16.
In step S11, the A/D converting unit 61 performs analog/digital conversion of the input image. In step S12, the picture rearranging buffer 62 stores the images supplied from the A/D converting unit 61, and performs rearranging from the order for displaying the pictures to the order for encoding.
In step S13, the computing unit 63 computes the difference between the image rearranged in step S12 and the forecast image. The forecast image is supplied to the computing unit 63 from the motion prediction/compensating unit 76 via the forecast image selecting unit 77 in the case of performing inter prediction, and from the intraprediction unit 74 via the forecast image selecting unit 77 in the case of performing intra prediction.
The data amount of the difference data is smaller in comparison with that of the original image data. Accordingly, the data amount can be compressed as compared with the case of encoding the original image as it is, without change.
In step S14, the orthogonal transform unit 64 subjects the difference information supplied from the computing unit 63 to orthogonal transform. Specifically, orthogonal transform such as discrete cosine transform or Karhunen-Loève transform is performed, and the transform coefficients are output. In step S15, the quantifying unit 65 quantizes the transform coefficients. At the time of this quantization, the rate is controlled, as will be described with the processing in step S25 described later.
The difference information quantized in this way is locally decoded as follows. Specifically, in step S16, the inverse quantization unit 68 subjects the transform coefficients quantized by the quantifying unit 65 to inverse quantization using a property corresponding to the property of the quantifying unit 65. In step S17, the inverse orthogonal transformation unit 69 subjects the transform coefficients subjected to inverse quantization at the inverse quantization unit 68 to inverse orthogonal transform using a property corresponding to the property of the orthogonal transform unit 64.
In step S18, the computing unit 70 adds the forecast image input via the forecast image selecting unit 77 to the locally decoded difference information, and generates a locally decoded image (an image corresponding to the input to the computing unit 63). In step S19, the de-blocking filter 71 subjects the image output from the computing unit 70 to filtering, whereby block distortion is removed. In step S20, the frame memory 72 stores the filtered image. Note that the image not subjected to the filtering processing of the de-blocking filter 71 is also supplied from the computing unit 70 to the frame memory 72 for storage.
In step S21, the intra prediction unit 74 and the motion prediction/compensation unit 76 each perform image prediction processing. Specifically, in step S21, the intra prediction unit 74 performs intra prediction processing in the intra prediction modes, and the motion prediction/compensation unit 76 performs motion prediction and compensation processing in the inter prediction modes.
Details of the prediction processing in step S21 will be described later with reference to Figure 17. By this processing, prediction processing is performed in all of the candidate prediction modes, and cost function values are calculated for all of the candidate prediction modes. The optimal intra prediction mode is then selected based on the calculated cost function values, and the prediction image generated by intra prediction in the optimal intra prediction mode and its cost function value are supplied to the prediction image selecting unit 77.
Note that, at this time, prior to the intra prediction processing, the intra prediction unit 74 uses the filter coefficients set by the neighboring pixel interpolation filter switching unit 75 to perform filtering processing on the neighboring pixels used for intra prediction of the current block. The intra prediction unit 74 then performs intra prediction using the filtered neighboring pixels, and generates a prediction image.
In step S22, the prediction image selecting unit 77 determines one of the optimal intra prediction mode and the optimal inter prediction mode to be the optimal prediction mode, based on the cost function values output from the intra prediction unit 74 and the motion prediction/compensation unit 76. The prediction image selecting unit 77 then selects the prediction image in the determined optimal prediction mode and supplies it to the computing units 63 and 70. As described above, this prediction image is used for the calculations in steps S13 and S18.
Note that selection information of this prediction image is supplied to the intra prediction unit 74 or the motion prediction/compensation unit 76. When the prediction image in the optimal intra prediction mode has been selected, the intra prediction unit 74 supplies information indicating the optimal intra prediction mode (that is, the intra prediction mode information) to the lossless coding unit 66.
When the prediction image in the optimal inter prediction mode has been selected, the motion prediction/compensation unit 76 outputs information indicating the optimal inter prediction mode to the lossless coding unit 66, and also outputs information according to the optimal inter prediction mode to the lossless coding unit 66 as necessary. Examples of the information according to the optimal inter prediction mode include motion vector information, flag information and reference frame information. That is, when a prediction image in an inter prediction mode serving as the optimal inter prediction mode has been selected, the motion prediction/compensation unit 76 outputs the inter prediction mode information, motion vector information and reference frame information to the lossless coding unit 66.
In step S23, the lossless coding unit 66 encodes the quantized transform coefficients output from the quantization unit 65. That is, the difference image is subjected to lossless coding such as variable-length coding or arithmetic coding, and compressed. At this time, the optimal intra prediction mode information from the intra prediction unit 74, or the information according to the optimal inter prediction mode from the motion prediction/compensation unit 76, input to the lossless coding unit 66 in step S22 described above, as well as the quantization parameter from the rate control unit 78 and so forth, are also encoded and added to the header information.
In step S24, the storage buffer 67 stores the difference image as a compressed image. The compressed images stored in the storage buffer 67 are read out as appropriate and transmitted to the decoding side via the transmission path.
In step S25, the rate control unit 78 controls the rate of the quantization operation of the quantization unit 65 by means of the quantization parameter, based on the compressed images stored in the storage buffer 67, so that neither overflow nor underflow occurs.
The quantization parameter used for rate control at the quantization unit 65 is supplied to the lossless coding unit 66, subjected to the lossless coding processing in step S23 described above, and inserted into the header of the compressed image. The quantization parameter is also supplied to the neighboring pixel interpolation filter switching unit 75 and used for setting the filter coefficients of the filtering processing to be performed on the neighboring pixels, the filtering processing being executed prior to the intra prediction.
[Description of prediction processing]
Next, the prediction processing in step S21 in Figure 16 will be described with reference to the flowchart in Figure 17.
When the image to be processed, supplied from the picture rearranging buffer 62, is an image of a block to be intra processed, a decoded image to be referenced is read out from the frame memory 72 and supplied to the intra prediction unit 74 via the switch 73.
In step S31, the intra prediction unit 74 performs intra prediction on the pixels of the block to be processed in all of the candidate intra prediction modes, using the supplied image. Note that pixels not subjected to the deblocking filtering of the deblocking filter 71 are used as the decoded pixels to be referenced.
Details of the intra prediction processing in step S31 will be described with reference to Figure 18. By this processing, optimal filter coefficients are set, and filtering processing is performed on the neighboring pixels using the set filter coefficients. Intra prediction is then performed using the neighboring pixels that have been subjected to the filtering processing, and a prediction image is generated.
The above processing is performed in all of the candidate intra prediction modes, cost function values are calculated for all of the candidate intra prediction modes, and the optimal intra prediction mode is determined based on the calculated cost function values. The generated prediction image and the cost function value of the optimal intra prediction mode are supplied to the prediction image selecting unit 77.
When the image to be processed, supplied from the picture rearranging buffer 62, is an image to be inter processed, images to be referenced are read out from the frame memory 72 and supplied to the motion prediction/compensation unit 76 via the switch 73. In step S32, the motion prediction/compensation unit 76 performs inter motion prediction processing based on these images. That is, the motion prediction/compensation unit 76 references the images supplied from the frame memory 72 and performs motion prediction processing in all of the candidate inter prediction modes.
Details of the inter motion prediction processing in step S32 will be described later with reference to Figure 19. By this processing, motion prediction processing is performed in all of the candidate inter prediction modes, and cost function values are calculated for all of the candidate inter prediction modes.
In step S33, the motion prediction/compensation unit 76 compares the cost function values for the inter prediction modes calculated in step S32, and determines the prediction mode giving the minimum value to be the optimal inter prediction mode. The motion prediction/compensation unit 76 then supplies the prediction image generated in the optimal inter prediction mode and its cost function value to the prediction image selecting unit 77.
[Description of intra prediction processing]
Next, the intra prediction processing in step S31 in Figure 17 will be described with reference to the flowchart in Figure 18. Note that, in the example in Figure 18, the case of a luminance signal will be described as an example.
In step S25 in Figure 16 described above, the rate control unit 78 supplies the quantization parameter to be applied to the current block. In step S41, the quantization parameter buffer 92 obtains the quantization parameter for the current block from the rate control unit 78 and stores it.
In step S42, the prediction image generation unit 82 selects one intra prediction mode from among the intra prediction modes of 4 × 4 pixels, 8 × 8 pixels and 16 × 16 pixels. The intra prediction mode information of the selected mode is stored in the prediction mode buffer 91.
The low-pass filter setting unit 93 reads out the intra prediction mode information from the prediction mode buffer 91, and reads out the quantization parameter value from the quantization parameter buffer 92. In step S43, the low-pass filter setting unit 93 then sets, from among the filter coefficients calculated for each slice and stored in the filter coefficient memory 94, the filter coefficient corresponding to this intra prediction mode and quantization parameter. The set filter coefficient is supplied to the neighboring pixel setting unit 81.
In step S44, the neighboring pixel setting unit 81 performs filtering processing on the neighboring pixel values of the current block using the set filter coefficient, and supplies the neighboring pixel values subjected to the filtering processing to the prediction image generation unit 82.
In step S45, the prediction image generation unit 82 performs intra prediction on the current block in the intra prediction mode selected in step S42, using the neighboring pixel values subjected to the filtering processing, and generates a prediction image.
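A minimal sketch of steps S44 and S45 is given below. The coefficient values c0, c1, c2 and the offset are placeholders standing in for whatever the low-pass filter setting unit 93 selects from the filter coefficient memory 94 for the current intra prediction mode and quantization parameter, and the DC mode is used only because it is the simplest prediction to write down; the sketch is illustrative, not the device's exact processing.
```python
import numpy as np

def filter_neighbors(neighbors, c0, c1, c2, offset):
    """Step S44: apply a 3-tap filter plus an offset to the neighbouring
    pixel line; the two end samples are left unfiltered for simplicity."""
    out = neighbors.astype(float).copy()
    for i in range(1, len(neighbors) - 1):
        out[i] = (c0 * neighbors[i - 1] +
                  c1 * neighbors[i] +
                  c2 * neighbors[i + 1] + offset)
    return np.clip(np.round(out), 0, 255)

def dc_prediction(top, left, block_size=4):
    """Step S45 (DC mode only): predict every pixel of the current block
    as the mean of the filtered top and left neighbours."""
    dc = np.round((top[:block_size].sum() + left[:block_size].sum())
                  / (2 * block_size))
    return np.full((block_size, block_size), dc)

if __name__ == "__main__":
    top = np.array([100, 102, 101, 99, 98, 97, 96, 95])
    left = np.array([101, 103, 100, 98])
    # Placeholder coefficients; the real values come from the memory 94.
    top_f = filter_neighbors(top, c0=0.25, c1=0.5, c2=0.25, offset=0.0)
    left_f = filter_neighbors(left, c0=0.25, c1=0.5, c2=0.25, offset=0.0)
    print(dc_prediction(top_f, left_f))
```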
The image to be intra predicted, read out from the picture rearranging buffer 62, the prediction image generated by the prediction image generation unit 82, and its intra prediction mode information are supplied to the optimal prediction mode determination unit 83.
In step S46, the optimal prediction mode determination unit 83 uses the supplied information to calculate the cost function value of the intra prediction mode in which the prediction image has been generated. Here, the calculation of the cost function value is performed based on one of the techniques of the High Complexity mode or the Low Complexity mode. These modes are determined in the JM (Joint Model), which is the reference software of the H.264/AVC format.
Specifically, in the High Complexity mode, the encoding processing is tentatively performed for all of the candidate prediction modes, as the processing of step S45. A cost function value expressed by the following expression (73) is calculated for each of the prediction modes, and the prediction mode giving the minimum value thereof is selected as the optimal prediction mode.
Cost(Mode) = D + λR ... (73)
Here, D denotes the difference (distortion) between the original image and the decoded image, R denotes the generated code amount including the orthogonal transform coefficients, and λ denotes the Lagrange multiplier given as a function of the quantization parameter QP.
On the other hand, in the Low Complexity mode, a prediction image is generated and header bits such as motion vector information, prediction mode information and flag information are calculated for all of the candidate prediction modes, as the processing of step S45. A cost function value expressed by the following expression (74) is calculated for each of the prediction modes, and the prediction mode giving the minimum value thereof is selected as the optimal prediction mode.
Cost(Mode) = D + QPtoQuant(QP) + Header_Bit ... (74)
Here, D denotes the difference (distortion) between the original image and the decoded image, Header_Bit denotes the header bits for the prediction mode, and QPtoQuant is a function given as a function of the quantization parameter QP.
In the Low Complexity mode, only the prediction image is generated for all of the prediction modes, and there is no need to perform the encoding processing and decoding processing, so the amount of computation can be reduced.
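The two mode-decision costs of expressions (73) and (74) can be written directly as small functions. In the sketch below, the distortion D is taken as a sum of squared (or absolute) differences and λ is derived from QP with a relation commonly used in H.264/AVC reference implementations; both choices, and the QPtoQuant mapping, are assumptions made for illustration rather than a statement of the exact values used by the device, and the expressions are followed exactly as stated above.
```python
import numpy as np

def high_complexity_cost(original, decoded, generated_bits, qp):
    """Expression (73): Cost(Mode) = D + lambda * R, where R includes
    the orthogonal transform coefficients."""
    d = float(np.sum((original.astype(float) - decoded) ** 2))
    lam = 0.85 * 2.0 ** ((qp - 12) / 3.0)   # assumed lambda(QP) relation
    return d + lam * generated_bits

def low_complexity_cost(original, predicted, header_bits, qp):
    """Expression (74): Cost(Mode) = D + QPtoQuant(QP) + Header_Bit.
    Only the prediction image is generated, so D is measured against it."""
    d = float(np.sum(np.abs(original.astype(float) - predicted)))
    qp_to_quant = 2.0 ** ((qp - 12) / 3.0)  # assumed monotone QP mapping
    return d + qp_to_quant + header_bits

def choose_best_mode(costs):
    """Select the prediction mode giving the minimum cost value."""
    return min(costs, key=costs.get)

if __name__ == "__main__":
    costs = {"vertical": 812.0, "horizontal": 790.5, "dc": 805.2}
    print(choose_best_mode(costs))   # -> "horizontal"
```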
In step S47, the optimal prediction mode determination unit 83 determines whether the processing has been completed for all of the intra prediction modes. That is, in step S47, it is determined whether the processing of steps S42 through S46 has been performed for all of the intra prediction modes of 4 × 4 pixels, 8 × 8 pixels and 16 × 16 pixels.
When it is determined in step S47 that the processing has not yet been completed for all of the intra prediction modes, the processing returns to step S42 and the subsequent processing is repeated.
When it is determined in step S47 that the processing has been completed for all of the intra prediction modes, the processing proceeds to step S48. In step S48, the optimal prediction mode determination unit 83 determines the intra prediction mode whose calculated cost function value is the minimum to be the optimal intra prediction mode.
The prediction image of the optimal intra prediction mode and its corresponding cost function value are supplied to the prediction image selecting unit 77.
When the prediction image generated in the optimal intra prediction mode has been selected by the prediction image selecting unit 77, the optimal prediction mode determination unit 83 supplies the information indicating the optimal intra prediction mode to the lossless coding unit 66. This information is then encoded at the lossless coding unit 66 and added to the header information of the compressed image (step S23 in Figure 16 described above).
Note that the filter coefficients calculated by the learning processing and stored in the filter coefficient memory 94 are likewise stored in the picture decoding device 151 in Figure 22, which will be described later, so the set filter coefficients do not have to be added to the header information of the compressed image and transmitted.
Accordingly, in the case of H.264/AVC, there are 51 quantization parameters and nine intra prediction modes for 4 × 4 pixels and 8 × 8 pixels, and when combinations of these are considered, an enormous number of filter coefficients, 51 × 9 = 459, is needed. Information on such an enormous number of filter coefficients does not have to be sent to the decoding side, so the processing can be realized without increasing the overhead of coefficient information.
[Description of inter motion prediction processing]
Next, the inter motion prediction processing in step S32 in Figure 17 will be described with reference to the flowchart in Figure 19.
In step S61, the motion prediction/compensation unit 76 determines a motion vector and a reference image for each of the eight inter prediction modes made up of 16 × 16 pixels through 4 × 4 pixels. That is, a motion vector and a reference image are determined for the block to be processed in each of the inter prediction modes.
In step S62, the motion prediction/compensation unit 76 performs motion prediction and compensation processing on the reference image for each of the eight inter prediction modes made up of 16 × 16 pixels through 4 × 4 pixels, based on the motion vectors determined in step S61. By this motion prediction and compensation processing, a prediction image is generated in each of the inter prediction modes.
In step S63, the motion prediction/compensation unit 76 generates motion vector information to be added to the compressed image, regarding the motion vectors determined for each of the eight inter prediction modes made up of 16 × 16 pixels through 4 × 4 pixels.
The generated motion vector information is also used when the cost function values are calculated in the following step S64, and, when the corresponding prediction image has ultimately been selected by the prediction image selecting unit 77, the generated motion vector information is output to the lossless coding unit 66 together with the prediction mode information and the reference frame information.
In step S64, the motion prediction/compensation unit 76 calculates the cost function value expressed by expression (73) or expression (74) described above for each of the eight inter prediction modes made up of 16 × 16 pixels through 4 × 4 pixels. The cost function values calculated here are used when the optimal inter prediction mode is determined in step S33 in Figure 17 described above.
Next, as the second method for calculating the optimum filter coefficients, an example of the following case will be described with reference to Figure 20: the case where online processing is performed, that is, where optimum filter coefficients are calculated successively for each slice.
In this case, the filter coefficients calculated for each slice at the encoding side need to be sent to the decoding side, and sending filter coefficients broken down into a large number of cases leads to deterioration of the coding efficiency. Accordingly, only one filter coefficient is transmitted for a slice, or only one filter coefficient is transmitted for each prediction mode of each block size, or only one filter coefficient is transmitted for each type of prediction mode, such as horizontal prediction or vertical prediction.
Moreover, in the case of the offline processing described above, an example of using the intra prediction mode and the quantization parameter as the parameters for calculating the filter coefficients was described. In the case of online processing, on the other hand, a large number of parameters for calculating the filter coefficients would increase the amount of processing, so an example of using only the intra prediction mode as the parameter will be described with Figure 20. Although description thereof is omitted, only the quantization parameter may of course be used, or both parameters may be used.
[Another configuration example of the intra prediction unit and the neighboring pixel interpolation filter switching unit]
Figure 20 is a block diagram illustrating another configuration example of the intra prediction unit 74 and the neighboring pixel interpolation filter switching unit 75 in the case where online processing is performed to calculate the optimum filter coefficients successively for each slice.
In the example in Figure 20, a switch 101 is inserted between the intra prediction unit 74 and the neighboring pixel interpolation filter switching unit 75, and, unlike the case in Figure 14, the intra prediction unit 74 performs intra prediction twice by the switch 101 being turned on and off. That is, with the switch 101 in the off state, the intra prediction unit 74 performs the intra prediction defined in H.264/AVC, and the filter coefficients suitable for the intra prediction are calculated. With the switch 101 in the on state, intra prediction is performed with the filter coefficient set by the neighboring pixel interpolation filter switching unit 75 out of the calculated filter coefficients.
The intra prediction unit 74 in Figure 20 is made up of a neighboring pixel setting unit 111, a prediction image generation unit 112 and an optimal prediction mode determination unit 113.
The neighboring pixel interpolation filter switching unit 75 is made up of a prediction mode buffer 121, an optimum filter calculation unit 122 and a low-pass filter setting unit 123.
The neighboring pixel values of all the current blocks of the current slice to be subjected to intra prediction are supplied from the frame memory 72 to the neighboring pixel setting unit 111. In the case of Figure 20, illustration of the switch 73 is also omitted. Note that, in the case of intra prediction, pixel values not subjected to the deblocking filtering of the deblocking filter 71 are used as the neighboring pixel values.
With the switch 101 in the off state, the neighboring pixel setting unit 111 filters the neighboring pixel values of the current block from the frame memory 72 using only the filter coefficients for the intra prediction mode defined in H.264/AVC, and supplies them to the prediction image generation unit 112. That is, only in the case of the 8 × 8 prediction modes described above with expressions (14) through (24) are the neighboring pixel values subjected to the filtering processing supplied to the prediction image generation unit 112. In all other cases, the neighboring pixel values of the current block from the frame memory 72 are supplied to the prediction image generation unit 112 without change.
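For reference, a simplified version of the fixed smoothing that H.264/AVC defines for the 8 × 8-pixel intra prediction modes (the filtering referred to by expressions (14) through (24)) is sketched below. It applies the well-known [1, 2, 1]/4 kernel to the interior reference samples; the boundary-sample special cases of the standard are collapsed into simple edge replication here, so this is an approximation rather than the normative filter.
```python
import numpy as np

def h264_8x8_reference_smoothing(ref):
    """Apply the [1, 2, 1]/4 low-pass filter used for Intra_8x8 reference
    samples: out[i] = (ref[i-1] + 2*ref[i] + ref[i+1] + 2) >> 2.
    Edge samples reuse their own value as the missing neighbour
    (a simplification of the standard's boundary handling)."""
    ref = np.asarray(ref, dtype=np.int32)
    padded = np.concatenate(([ref[0]], ref, [ref[-1]]))
    return (padded[:-2] + 2 * padded[1:-1] + padded[2:] + 2) >> 2

if __name__ == "__main__":
    reference_row = [100, 104, 96, 90, 92, 95, 99, 103, 101]
    print(h264_8x8_reference_smoothing(reference_row))
```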
With the switch 101 in the on state, filter coefficients are supplied from the low-pass filter setting unit 123 to the neighboring pixel setting unit 111. Accordingly, the neighboring pixel setting unit 111 filters the neighboring pixel values of the current block from the frame memory 72 using the filter coefficients set by the low-pass filter setting unit 123, and supplies the neighboring pixel values subjected to the filtering processing to the prediction image generation unit 112.
The prediction image generation unit 112 performs intra prediction on the current block in all of the candidate intra prediction modes using the neighboring pixel values from the neighboring pixel setting unit 111, and generates prediction images. The generated prediction images are supplied to the optimal prediction mode determination unit 113 together with the intra prediction mode information.
The image to be intra predicted, read out from the picture rearranging buffer 62, the prediction images generated by the prediction image generation unit 112, and their intra prediction mode information are supplied to the optimal prediction mode determination unit 113.
The optimal prediction mode determination unit 113 uses the supplied information to calculate the cost function values of the intra prediction modes in which the prediction images have been generated, and determines the intra prediction mode giving the minimum value among the calculated cost function values to be the optimal intra prediction mode.
With the switch 101 in the off state, the optimal prediction mode determination unit 113 supplies the information of the optimal intra prediction mode to the prediction mode buffer 121. With the switch 101 in the on state, the optimal prediction mode determination unit 113 supplies the prediction image of the optimal intra prediction mode and the corresponding cost function value to the prediction image selecting unit 77.
Moreover, when the prediction image generated in the optimal intra prediction mode has been selected by the prediction image selecting unit 77, the optimal prediction mode determination unit 113 supplies the information indicating the optimal intra prediction mode to the lossless coding unit 66.
The prediction mode buffer 121 stores the intra prediction mode information from the optimal prediction mode determination unit 113.
The image to be intra predicted, read out from the picture rearranging buffer 62, and the neighboring pixel values of the current block from the frame memory 72 are supplied to the optimum filter calculation unit 122. The optimum filter calculation unit 122 reads out the intra prediction mode of every block included in the current slice from the prediction mode buffer 121. The optimum filter calculation unit 122 then uses this information to calculate the optimum filter coefficients for the intra prediction modes of the current slice, as described above with reference to Figure 15, and supplies the calculated filter coefficients to the low-pass filter setting unit 123.
The low-pass filter setting unit 123 sets the filter coefficient for the current block out of the calculated filter coefficients of the current slice, turns on the terminal of the switch 101, and supplies the set filter coefficient to the neighboring pixel setting unit 111. The low-pass filter setting unit 123 also supplies the filter coefficients for the current slice to the lossless coding unit 66.
[Another description of intra prediction processing]
Next, the intra prediction processing performed by the neighboring pixel interpolation filter switching unit 75 and the intra prediction unit 74 in Figure 20 will be described with reference to the flowchart in Figure 21. Note that this intra prediction processing is another example of the intra prediction processing in step S31 in Figure 17.
First, the switch 101 is in the off state. The neighboring pixel values of all the current blocks of the current slice to be subjected to intra prediction are supplied from the frame memory 72 to the neighboring pixel setting unit 111. The neighboring pixel setting unit 111 performs filtering processing on the neighboring pixel values of the current block from the frame memory 72 using only the filter coefficients for the 8 × 8-pixel intra prediction modes defined in H.264/AVC, and supplies them to the prediction image generation unit 112. That is, in the case of the other intra prediction modes, the neighboring pixel values of the current block from the frame memory 72 are supplied to the prediction image generation unit 112 without change.
In step S101, the prediction image generation unit 112 performs intra prediction processing on all of the blocks included in the current slice. That is, the prediction image generation unit 112 performs intra prediction in each of the intra prediction modes using the neighboring pixel values of the current block from the neighboring pixel setting unit 111, and generates prediction images.
The image to be intra predicted, read out from the picture rearranging buffer 62, the prediction images generated by the prediction image generation unit 112, and their intra prediction mode information are supplied to the optimal prediction mode determination unit 113.
In step S102, the optimal prediction mode determination unit 113 uses the supplied information to calculate the cost function values of expression (73) or expression (74) described above for all of the intra prediction modes in which the prediction images have been generated.
In step S103, the optimal prediction mode determination unit 113 determines the intra prediction mode whose cost function of expression (73) or expression (74) is the smallest to be the optimal intra prediction mode, and supplies the information of the determined intra prediction mode to the prediction mode buffer 121.
The image to be intra predicted, read out from the picture rearranging buffer 62, and the neighboring pixel values of the current block from the frame memory 72 are supplied to the optimum filter calculation unit 122. The optimum filter calculation unit 122 reads out the intra prediction mode of every block included in the current slice from the prediction mode buffer 121.
In step S104, the optimum filter calculation unit 122 uses this information to calculate the filter coefficients that minimize the residual of the entire current slice, as the optimum filter coefficients for each intra prediction mode of the current slice. The filter coefficients described above with reference to Figure 15 are supplied to the low-pass filter setting unit 123.
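One way to realize the minimization in step S104 is an ordinary linear least-squares fit: for every block of the slice coded in a given intra prediction mode, the unknowns (c0, c1, c2, offset) enter the filtered neighbouring pixels, and hence the prediction, linearly, so the coefficients that minimize the residual of the whole slice can be obtained from the normal equations. The sketch below assumes the caller has already expanded each block into rows of a design matrix A (the contribution of each unfiltered neighbour sample to a predicted pixel) and a target vector b (the original pixel values); how A and b are built depends on the prediction mode and is only hinted at here, so this is a sketch under stated assumptions rather than the device's exact derivation.
```python
import numpy as np

def optimum_filter_coefficients(design_rows, targets):
    """Solve min || A @ [c0, c1, c2, offset] - b ||^2 over the slice.

    design_rows : (num_samples, 4) array; columns hold the linear
                  contribution of n[i-1], n[i], n[i+1] and a constant 1
                  (for the offset) to each predicted pixel.
    targets     : (num_samples,) array of original pixel values.
    """
    a = np.asarray(design_rows, dtype=float)
    b = np.asarray(targets, dtype=float)
    coeffs, *_ = np.linalg.lstsq(a, b, rcond=None)
    c0, c1, c2, offset = coeffs
    return c0, c1, c2, offset

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic data whose underlying relation is 0.25/0.5/0.25, offset 2.
    neighbours = rng.integers(0, 256, size=(500, 3)).astype(float)
    a = np.hstack([neighbours, np.ones((500, 1))])
    b = a @ np.array([0.25, 0.5, 0.25, 2.0]) + rng.normal(0, 1, 500)
    print(optimum_filter_coefficients(a, b))
```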
The low-pass filter setting unit 123 sets the filter coefficient corresponding to the current block out of the calculated filter coefficients of the current slice, turns on the terminal of the switch 101, and supplies the set filter coefficient to the neighboring pixel setting unit 111.
In step S105, the neighboring pixel setting unit 111 performs filtering processing on the neighboring pixel values of the current block from the frame memory 72 using the filter coefficient set by the low-pass filter setting unit 123.
The neighboring pixel values subjected to the filtering processing are supplied to the prediction image generation unit 112. In step S106, the prediction image generation unit 112 performs intra prediction again on all of the blocks included in the current slice using the filtered neighboring pixel values, and generates prediction images. The generated prediction images are supplied to the optimal prediction mode determination unit 113 together with the intra prediction mode information.
With the switch 101 in the on state, the optimal prediction mode determination unit 113 supplies the prediction image of the optimal intra prediction mode and the corresponding cost function value to the prediction image selecting unit 77.
In step S22 in Figure 16 described above, the prediction image selecting unit 77 determines one of the optimal intra prediction mode and the optimal inter prediction mode to be the optimal prediction mode, and supplies the selection information of the prediction image.
In step S107, the optimal prediction mode determination unit 113 determines whether the prediction image of the optimal intra prediction mode has been selected, in accordance with the selection information of the prediction image. When it is determined in step S107 that the prediction image of the optimal intra prediction mode has been selected, the processing proceeds to step S108.
In step S108, the optimal prediction mode determination unit 113 supplies the intra prediction mode information to the lossless coding unit 66. Note that, when the filter coefficients for the current slice have not yet been supplied, the filter coefficients from the optimum filter calculation unit 122 are also supplied to the lossless coding unit 66.
When it is determined in step S107 that the prediction image of the optimal intra prediction mode has not been selected, the intra prediction processing ends.
Note that further optimized filter coefficients can be obtained by repeating the processing of steps S104 through S106 described above.
The encoded compressed image is transmitted via a predetermined transmission path and decoded by a picture decoding device.
[Configuration example of the picture decoding device]
Figure 22 illustrates the configuration of an embodiment of a picture decoding device serving as the image processing apparatus to which the present invention has been applied.
The picture decoding device 151 is made up of a storage buffer 161, a lossless decoding unit 162, an inverse quantization unit 163, an inverse orthogonal transform unit 164, a computing unit 165, a deblocking filter 166, a picture rearranging buffer 167, a D/A converting unit 168, a frame memory 169, a switch 170, an intra prediction unit 171, a neighboring pixel interpolation filter switching unit 172, a motion prediction/compensation unit 173 and a switch 174.
The storage buffer 161 stores the transmitted compressed images. The lossless decoding unit 162 decodes information supplied from the storage buffer 161 and encoded by the lossless coding unit 66 in Fig. 1, in a format corresponding to the coding format of the lossless coding unit 66. The inverse quantization unit 163 performs inverse quantization of the image decoded by the lossless decoding unit 162, in a format corresponding to the quantization format of the quantization unit 65 in Fig. 1. The inverse orthogonal transform unit 164 performs an inverse orthogonal transform of the output of the inverse quantization unit 163, in a format corresponding to the orthogonal transform format of the orthogonal transform unit 64 in Fig. 1.
The inverse-orthogonal-transformed output is decoded by being added, at the computing unit 165, to the prediction image supplied from the switch 174. The deblocking filter 166 removes block distortion of the decoded image, then supplies it to the frame memory 169 for storage, and also outputs it to the picture rearranging buffer 167.
The picture rearranging buffer 167 performs rearranging of images. Specifically, the order of frames rearranged into the encoding order by the picture rearranging buffer 62 in Fig. 1 is rearranged into the original display order. The D/A converting unit 168 performs digital/analog conversion of the image supplied from the picture rearranging buffer 167, and outputs it to an unillustrated display for display.
The switch 170 reads out the images to be inter processed and the images to be referenced from the frame memory 169 and outputs them to the motion prediction/compensation unit 173, and also reads out the images to be used for intra prediction from the frame memory 169 and supplies them to the intra prediction unit 171.
Information indicating the intra prediction mode, obtained by decoding the header information, is supplied from the lossless decoding unit 162 to the intra prediction unit 171. Based on this information, the intra prediction unit 171 generates a prediction image by performing filtering processing on the neighboring pixel values and intra prediction using the filter coefficients set by the neighboring pixel interpolation filter switching unit 172, and outputs the generated prediction image to the switch 174.
At least one of the information indicating the intra prediction mode and the information of the quantization parameter, obtained by decoding the header information encoded at the picture coding device 51, is supplied from the lossless decoding unit 162 to the neighboring pixel interpolation filter switching unit 172. In the same way as the neighboring pixel interpolation filter switching unit 75 in Fig. 1, the neighboring pixel interpolation filter switching unit 172 stores filter coefficients corresponding to at least one of the quantization parameter and the intra prediction mode, obtained by learning at a learning device 251 in Figure 28, which will be described later.
The neighboring pixel interpolation filter switching unit 172 sets the filter coefficient corresponding to at least one of the quantization parameter and the intra prediction mode from the lossless decoding unit 162. For each slice, the neighboring pixel interpolation filter switching unit 172 supplies the set filter coefficient to the intra prediction unit 171.
Note that, for the neighboring pixel interpolation filter switching unit 172, filter coefficients learned offline in advance are stored. However, note that, when filter coefficients are calculated online by the neighboring pixel interpolation filter switching unit 75 in Fig. 1, these filter coefficients are transmitted to it for each slice, for example. In this case, the neighboring pixel interpolation filter switching unit 172 uses the filter coefficients decoded by the lossless decoding unit 162.
The information obtained by decoding the header information (prediction mode information, motion vector information and reference frame information) is supplied from the lossless decoding unit 162 to the motion prediction/compensation unit 173. When information indicating an inter prediction mode has been supplied, the motion prediction/compensation unit 173 performs motion prediction and compensation processing on the image based on the motion vector information and reference frame information, and generates a prediction image. The motion prediction/compensation unit 173 outputs the prediction image generated in the inter prediction mode to the switch 174.
The switch 174 selects the prediction image generated by the motion prediction/compensation unit 173 or the intra prediction unit 171 and supplies it to the computing unit 165.
Note that, with the picture coding device 51 in Fig. 1, intra prediction processing is performed in all of the intra prediction modes for prediction mode determination based on the cost functions. On the other hand, with the picture decoding device 151, intra prediction processing is performed based only on the information of the intra prediction mode sent thereto by encoding.
[Configuration example of the intra prediction unit and the neighboring pixel interpolation filter switching unit]
Figure 23 is a block diagram illustrating a detailed configuration example of the neighboring pixel interpolation filter switching unit and the intra prediction unit. Note that the functional blocks in Figure 23 correspond to the functional blocks in the case of the offline processing of the picture coding device 51 shown in Figure 14.
In the case of the example in Figure 23, the intra prediction unit 171 is made up of a prediction image generation unit 181 and a neighboring pixel setting unit 182. The neighboring pixel interpolation filter switching unit 172 is made up of a prediction mode buffer 191, a quantization parameter buffer 192 and a low-pass filter setting unit 193. The low-pass filter setting unit 193 has a built-in filter coefficient memory 194.
The intra prediction mode information from the lossless decoding unit 162 and the neighboring pixel values subjected to filtering processing from the neighboring pixel setting unit 182 are supplied to the prediction image generation unit 181. The prediction image generation unit 181 performs intra prediction in the intra prediction mode from the lossless decoding unit 162 using the neighboring pixel values supplied thereto, generates a prediction image, and supplies the generated prediction image to the switch 174.
The neighboring pixel values of the current block to be intra predicted are supplied from the frame memory 169 to the neighboring pixel setting unit 182. In the case of Figure 23, illustration of the switch 170 is omitted, but in practice the neighboring pixel values are supplied from the frame memory 169 to the neighboring pixel setting unit 182 via the switch 170.
The neighboring pixel setting unit 182 performs filtering processing on the neighboring pixel values of the current block from the frame memory 169 using the filter coefficient set by the low-pass filter setting unit 193, and supplies the neighboring pixel values subjected to the filtering processing to the prediction image generation unit 181.
The prediction mode buffer 191 stores the intra prediction mode information from the lossless decoding unit 162. The quantization parameter buffer 192 stores the quantization parameter from the lossless decoding unit 162.
The low-pass filter setting unit 193 reads out the intra prediction mode information of the current block from the prediction mode buffer 191, and reads out the quantization parameter corresponding to the current block from the quantization parameter buffer 192. The low-pass filter setting unit 193 sets the filter coefficient corresponding to this information out of the filter coefficients stored in the built-in filter coefficient memory 194, and supplies the set filter coefficient to the neighboring pixel setting unit 182.
In the same way as the filter coefficient memory 94 in Figure 14, the filter coefficient memory 194 stores the filter coefficients corresponding to the intra prediction modes and quantization parameters obtained by learning at the learning device in Figure 28, which will be described later.
For example, the filter coefficients are calculated and stored for each slice, as described above with reference to Figure 15. Note that, for the filter coefficient memory 194 as well, the filter coefficients are held as n-bit values (where n is an integer) according to the register length of the processor.
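A minimal sketch of how the filter coefficient memory 194 might be organized is given below: one entry per (intra prediction mode, quantization parameter) pair, each coefficient held as an n-bit fixed-point integer as mentioned above. The table layout, the choice of n = 8 with 6 fractional bits, and the placeholder values are assumptions made purely for illustration.
```python
# Assumed layout: coefficients stored as 8-bit signed fixed point with
# 6 fractional bits, indexed by (intra prediction mode, quantization
# parameter). The values below are placeholders, not learned results.
FRACTIONAL_BITS = 6

def store_coefficients(memory, mode, qp, float_coeffs):
    """Quantize (c0, c1, c2, offset) to fixed point and store them."""
    memory[(mode, qp)] = tuple(
        int(round(c * (1 << FRACTIONAL_BITS))) for c in float_coeffs)

def load_coefficients(memory, mode, qp):
    """Step S174-style lookup: return the coefficients for the current
    block's intra prediction mode and quantization parameter."""
    fixed = memory[(mode, qp)]
    return tuple(c / (1 << FRACTIONAL_BITS) for c in fixed)

if __name__ == "__main__":
    filter_coefficient_memory = {}
    store_coefficients(filter_coefficient_memory, mode=0, qp=28,
                       float_coeffs=(0.25, 0.5, 0.25, 0.0))
    print(load_coefficients(filter_coefficient_memory, mode=0, qp=28))
```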
[Description of decoding processing of the picture decoding device]
Next, the decoding processing performed by the picture decoding device 151 will be described with reference to the flowchart in Figure 24.
In step S131, the storage buffer 161 stores the transmitted images. In step S132, the lossless decoding unit 162 decodes the compressed images supplied from the storage buffer 161. That is, the I pictures, P pictures and B pictures encoded by the lossless coding unit 66 in Fig. 1 are decoded.
At this time, the motion vector information, reference frame information, prediction mode information (information indicating an intra prediction mode or an inter prediction mode), quantization parameter information, flag information and so forth are also decoded.
Specifically, when the prediction mode information is intra prediction mode information, the prediction mode information is supplied to the intra prediction unit 171 and the neighboring pixel interpolation filter switching unit 172. Also, when quantization parameter information has been decoded, it is supplied to the neighboring pixel interpolation filter switching unit 172 as well. When the prediction mode information is inter prediction mode information, the reference frame information and motion vector information corresponding to the prediction mode information are supplied to the motion prediction/compensation unit 173.
In step S133, the inverse quantization unit 163 performs inverse quantization of the transform coefficients decoded by the lossless decoding unit 162, with a characteristic corresponding to the characteristic of the quantization unit 65 in Fig. 1. In step S134, the inverse orthogonal transform unit 164 performs an inverse orthogonal transform of the transform coefficients inverse-quantized by the inverse quantization unit 163, with a characteristic corresponding to the characteristic of the orthogonal transform unit 64 in Fig. 1. Thus, the difference information corresponding to the input of the orthogonal transform unit 64 in Fig. 1 (the output of the computing unit 63) has been decoded.
In step S135, the computing unit 165 adds the prediction image, which is input via the switch 174 and selected in the processing of step S139 described later, to the difference information. The original image is thus decoded. In step S136, the deblocking filter 166 filters the image output from the computing unit 165, whereby block distortion is removed. In step S137, the frame memory 169 stores the filtered image.
In step S138, the intra prediction unit 171 and the motion prediction/compensation unit 173 each perform the corresponding image prediction processing in accordance with the prediction mode information supplied from the lossless decoding unit 162.
That is, when intra prediction mode information has been supplied from the lossless decoding unit 162, the intra prediction unit 171 performs intra prediction processing in the intra prediction mode. At this time, the intra prediction unit 171 uses the filter coefficient set by the neighboring pixel interpolation filter switching unit 172 to perform filtering processing on the neighboring pixels and intra prediction processing.
Details of the prediction processing in step S138 will be described later with reference to Figure 25. By this processing, the prediction image generated by the intra prediction unit 171 or the prediction image generated by the motion prediction/compensation unit 173 is supplied to the switch 174.
In step S139, the switch 174 selects a prediction image. That is, the prediction image generated by the intra prediction unit 171 or the prediction image generated by the motion prediction/compensation unit 173 is supplied. The supplied prediction image is therefore selected, supplied to the computing unit 165, and, in step S135 as described above, added to the output of the inverse orthogonal transform unit 164.
In step S140, the picture rearranging buffer 167 performs rearranging. Specifically, the order of frames rearranged for encoding by the picture rearranging buffer 62 of the picture coding device 51 is rearranged into the original display order.
In step S141, the D/A converting unit 168 performs digital/analog conversion of the image from the picture rearranging buffer 167. This image is output to an unillustrated display, and the image is displayed.
[Description of prediction processing]
Next, the prediction processing in step S138 in Figure 24 will be described with reference to the flowchart in Figure 25.
In step S171, the prediction image generation unit 181 determines whether the current block has been intra coded. When the intra prediction mode information has been supplied from the lossless decoding unit 162 to the prediction image generation unit 181, the prediction image generation unit 181 determines in step S171 that the current block has been intra coded, and the processing proceeds to step S172.
In step S172, the prediction image generation unit 181 receives and obtains the intra prediction mode information from the lossless decoding unit 162. At this time, the intra prediction mode information is also supplied to and stored in the prediction mode buffer 191.
Also, when quantization parameter information from the lossless decoding unit 162 is supplied to the quantization parameter buffer 192, the quantization parameter buffer 192 obtains and stores the quantization parameter in step S173.
The low-pass filter setting unit 193 reads out the intra prediction mode information of the current block from the prediction mode buffer 191, and reads out the quantization parameter for the current block from the quantization parameter buffer 192. In step S174, the low-pass filter setting unit 193 sets the filter coefficient for the neighboring pixels corresponding to this information, out of the filter coefficients for each slice stored in the built-in filter coefficient memory 194. The set filter coefficient is supplied to the neighboring pixel setting unit 182.
In step S175, the neighboring pixel setting unit 182 performs filtering processing on the neighboring pixel values of the current block from the frame memory 169 using the filter coefficient set by the low-pass filter setting unit 193, and supplies the neighboring pixel values subjected to the filtering processing to the prediction image generation unit 181.
In step S176, the prediction image generation unit 181 performs intra prediction in the intra prediction mode obtained in step S172, using the neighboring pixel values supplied from the neighboring pixel setting unit 182, and generates a prediction image. The generated prediction image is supplied to the switch 174.
On the other hand, when it is determined in step S171 that intra coding has not been performed, the processing proceeds to step S177.
When the image to be processed is an image to be inter processed, the inter prediction mode information, reference frame information and motion vector information are supplied from the lossless decoding unit 162 to the motion prediction/compensation unit 173. In step S177, the motion prediction/compensation unit 173 obtains the inter prediction mode information, reference frame information, motion vector information and so forth from the lossless decoding unit 162.
In step S178, the motion prediction/compensation unit 173 then performs inter motion prediction. That is, when the image to be processed is an image to be subjected to inter prediction processing, the necessary images are read out from the frame memory 169 and supplied to the motion prediction/compensation unit 173 via the switch 170. In step S178, the motion prediction/compensation unit 173 performs motion prediction in the inter prediction mode based on the motion vector obtained in step S177, and generates a prediction image. The generated prediction image is output to the switch 174.
[Another configuration example of the intra prediction unit and the neighboring pixel interpolation filter switching unit]
Figure 26 is a block diagram illustrating a detailed configuration example of the neighboring pixel interpolation filter switching unit and the intra prediction unit. Note that the functional blocks in Figure 26 correspond to the functional blocks in the case of the online processing of the picture coding device 51 shown in Figure 20.
In the case of the example in Figure 26, the intra prediction unit 171 is made up of the neighboring pixel setting unit 182 and the prediction image generation unit 181 in Figure 23. The neighboring pixel interpolation filter switching unit 172 is made up of the prediction mode buffer 191 in Figure 23, an interpolation filter buffer 201 and a low-pass filter setting unit 202. Note that, in the example in Figure 26, parts corresponding to the case in Figure 23 are denoted by the corresponding reference numerals and basically perform the same processing, so description thereof will be omitted.
In the case of Figure 26, the filter coefficients calculated for the current slice are encoded and transmitted from the picture coding device 51. Accordingly, the lossless decoding unit 162 decodes them along with the other information and supplies them to the interpolation filter buffer 201 of the neighboring pixel interpolation filter switching unit 172.
The interpolation filter buffer 201 obtains the filter coefficients for the current slice from the lossless decoding unit 162 and stores them.
The low-pass filter setting unit 202 reads out the intra prediction mode information of the current block from the prediction mode buffer 191. The low-pass filter setting unit 202 reads out the filter coefficient corresponding to the read-out intra prediction mode from the filter coefficients for the current slice stored in the interpolation filter buffer 201, and sets it as the filter coefficient for the current block. The set filter coefficient is supplied to the neighboring pixel setting unit 182.
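In this online configuration the decoder does not consult a pre-learned table; it simply keeps the coefficient sets decoded from the bitstream for the current slice and picks one by the intra prediction mode alone. A minimal sketch under that assumption (one coefficient set per intra prediction mode, placeholder values):
```python
class InterpolationFilterBuffer:
    """Holds the filter coefficient sets decoded for the current slice
    (step S183); one set per intra prediction mode is assumed here."""
    def __init__(self):
        self.per_mode = {}

    def store_slice_coefficients(self, coeffs_by_mode):
        self.per_mode = dict(coeffs_by_mode)

    def coefficients_for(self, intra_mode):
        # Step S184: the low-pass filter setting unit selects the set
        # matching the current block's intra prediction mode.
        return self.per_mode[intra_mode]

if __name__ == "__main__":
    buf = InterpolationFilterBuffer()
    # Placeholder values standing in for coefficients decoded from the
    # slice header; they are not actual learned coefficients.
    buf.store_slice_coefficients({0: (0.25, 0.5, 0.25, 0.0),
                                  1: (0.2, 0.6, 0.2, 1.0)})
    print(buf.coefficients_for(1))
```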
[Another description of prediction processing]
Next, the prediction processing in the case of the intra prediction unit 171 and the neighboring pixel interpolation filter switching unit 172 in Figure 26 will be described with reference to the flowchart in Figure 27. Note that this prediction processing is another example of the prediction processing in step S138 in Figure 24. Also, the processing of steps S181, S182 and S185 through S188 in Figure 27 is basically the same as that of steps S171, S172 and S175 through S178 in Figure 25, so detailed description thereof will be omitted.
In step S181, the prediction image generation unit 181 determines whether the current block has been intra coded. When the intra prediction mode information has been supplied from the lossless decoding unit 162 to the prediction image generation unit 181, the prediction image generation unit 181 determines in step S181 that the current block has been intra coded, and the processing proceeds to step S182.
In step S182, the prediction image generation unit 181 receives and obtains the intra prediction mode information from the lossless decoding unit 162. At this time, the intra prediction mode information is also supplied to and stored in the prediction mode buffer 191.
Also, when the information of the filter coefficients for the current slice is supplied from the lossless decoding unit 162 to the interpolation filter buffer 201, the interpolation filter buffer 201 obtains the filter coefficients for the current slice in step S183 and stores them. Note that the filter coefficients are supplied for each slice.
The low-pass filter setting unit 202 reads out the intra prediction mode information for the current block from the prediction mode buffer 191. In step S184, the low-pass filter setting unit 202 sets, out of the filter coefficients for the current slice stored in the interpolation filter buffer 201, the filter coefficient corresponding to the intra prediction mode of the current block as the filter coefficient for the neighboring pixels. The set filter coefficient is supplied to the neighboring pixel setting unit 182.
In step S185, the neighboring pixel setting unit 182 performs filtering processing on the neighboring pixel values of the current block from the frame memory 169 using the filter coefficient set by the low-pass filter setting unit 202, and supplies the neighboring pixel values subjected to the filtering processing to the prediction image generation unit 181.
In step S186, the prediction image generation unit 181 performs intra prediction in the intra prediction mode obtained in step S182, using the neighboring pixel values supplied from the neighboring pixel setting unit 182, and generates a prediction image. The generated prediction image is supplied to the switch 174.
On the other hand, when it is determined in step S181 that intra coding has not been performed, the processing proceeds to step S187.
In step S187, the motion prediction/compensation unit 173 obtains the inter prediction mode information, reference frame information, motion vector information and so forth from the lossless decoding unit 162.
In step S188, the motion prediction/compensation unit 173 performs inter motion prediction. By this processing, the generated prediction image is output to the switch 174.
Thus, with the picture coding device 51 in Fig. 1 and the picture decoding device 151 in Figure 22, filtering processing is performed, prior to intra prediction, on the neighboring pixels used for intra prediction, using filter coefficients set adaptively to the image. For example, the filter coefficients are set according to the intra prediction mode or the quantization parameter.
Noise removal corresponding to the image and to the bit rate can thereby be performed. As a result, prediction efficiency can be improved.
Figure 28 illustrates the configuration of an embodiment of a learning device to which the present invention has been applied. In the example in Figure 28, the learning device 251 performs learning processing of the filter coefficients using training image signals.
Note that a training image signal is a test image for obtaining filter coefficients, and, for example, a standard sequence used for standardization of image compression coding, available at www.vqeg.org, may be used. Alternatively, input images corresponding to each application may be used. For example, in a case where the input is a camera signal, learning may be performed using baseband signals imaged with a CCD or CMOS sensor.
The learning device 251 in Figure 28 has in common with the picture coding device 51 in Fig. 1 the A/D converting unit 61, the picture rearranging buffer 62, the computing unit 63, the orthogonal transform unit 64, the quantization unit 65, the lossless coding unit 66, the storage buffer 67, the inverse quantization unit 68, the inverse orthogonal transform unit 69, the computing unit 70, the deblocking filter 71, the frame memory 72, the switch 73, the intra prediction unit 74, the motion prediction/compensation unit 76, the prediction image selecting unit 77 and the rate control unit 78.
The learning device 251 differs from the picture coding device 51 in Fig. 1 in that training image signals are used as the signals to be used, and in that a neighboring pixel interpolation filter calculation unit 261 is included instead of the neighboring pixel interpolation filter switching unit 75.
Specifically, with the learning device 251, learning is performed using only the blocks included in I pictures. Alternatively, with the learning device 251, learning is performed using only the blocks in intra macroblocks included in B pictures and P pictures. The former requires a smaller amount of computation for learning than the latter. Also, in the case of the former, the coefficients obtained for the blocks included in I pictures may be applied only to the blocks included in I pictures, or may also be applied to the intra macroblocks included in B pictures and P pictures.
That is, with the learning device 251, learning is performed only by intra prediction using the intra prediction unit 74. Accordingly, the motion prediction/compensation unit 76 effectively does not operate.
Also, the neighboring pixel interpolation filter calculation unit 261 in Figure 29 has in common with the neighboring pixel interpolation filter switching unit 75 in Figure 20 the prediction mode buffer 121, the optimum filter calculation unit 122 and the low-pass filter setting unit 123.
On the other hand, the neighboring pixel interpolation filter calculation unit 261 in Figure 29 differs from the neighboring pixel interpolation filter switching unit 75 in Figure 20 in that a filter coefficient storage unit 271 is added, and in that the quantization parameter from the rate control unit 78 is supplied to the optimum filter calculation unit 122.
Specifically, in the example in Figure 29, in the same way as in the case of the example in Figure 20, the switch 101 is provided between the intra prediction unit 74 and the neighboring pixel interpolation filter calculation unit 261, and the intra prediction unit 74 performs intra prediction twice according to the switch 101 being turned on or off.
That is, with the switch 101 in the off state, the intra prediction unit 74 performs the intra prediction defined in H.264/AVC, and filter coefficients optimal for the intra prediction mode and the quantization parameter are calculated for each slice. The filter coefficients calculated for each slice are stored in the filter coefficient storage unit 271. Then, with the switch 101 in the on state, intra prediction is performed with the filter coefficient set by the neighboring pixel interpolation filter calculation unit 261 out of the filter coefficients calculated for each slice.
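The two-pass behaviour controlled by the switch 101 can be summarized as the following outline. The routines referenced inside it (standard-filter intra prediction, the least-squares fit of the earlier sketch, and the filtered second pass) are placeholders standing in for the processing of the intra prediction unit 74, the optimum filter calculation unit 122 and the low-pass filter setting unit 123; this is a sketch of the control flow only, not the device's exact implementation.
```python
def learn_slice(blocks, qp, fit_coefficients, intra_predict):
    """Outline of the per-slice learning pass of the learning device 251.

    blocks          : iterable of (original_block, neighbour_pixels)
    fit_coefficients: placeholder least-squares routine (see earlier sketch)
    intra_predict   : placeholder; returns (best_mode, prediction) for a block
    """
    # Pass 1 (switch 101 off): H.264/AVC-defined intra prediction,
    # recording the chosen intra prediction mode of every block.
    first_pass = []
    for original, neighbours in blocks:
        mode, _ = intra_predict(original, neighbours, coefficients=None)
        first_pass.append((original, neighbours, mode))

    # Per (intra mode, QP): fit the coefficients minimizing the residual
    # of the entire slice, and store them (filter coefficient storage 271).
    coefficient_table = {}
    modes = {mode for _, _, mode in first_pass}
    for mode in modes:
        samples = [(o, n) for o, n, m in first_pass if m == mode]
        coefficient_table[(mode, qp)] = fit_coefficients(samples)

    # Pass 2 (switch 101 on): intra prediction again, now with the
    # learned coefficients applied to the neighbouring pixels.
    for original, neighbours, mode in first_pass:
        intra_predict(original, neighbours,
                      coefficients=coefficient_table[(mode, qp)])
    return coefficient_table
```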
The filter coefficients stored in this filter coefficient storage unit 271 are stored, via a recording medium, a network or the like, in the filter coefficient memory 94 (Figure 14) of the picture coding device 51 in Fig. 1 and the filter coefficient memory 194 (Figure 23) of the picture decoding device 151 in Figure 22.
[Description of intra prediction processing in learning processing]
Next, the intra prediction processing performed by the learning device 251 in Figure 28, as one step of the learning processing, will be described with reference to the flowchart in Figure 30. Note that, as for the learning processing, the learning device 251 basically performs the same processing as the encoding processing in Figure 16, except that the prediction processing in step S21 is replaced with the intra prediction processing of step S30.
In addition, the step S201 to S203 and S206 to S209 in Figure 30 essentially perform with step S101 to S103 andThe identical processing of S105 to S108, thus, it will be omitted and repeated.That is, in step S204 in Figure 30, it is best to filterWave computing unit 122 calculates the filtering system for making each intra prediction mode of the smallest current clip of the residual error of whole fragmentSeveral and corresponding quantization parameter, as optimum filtering coefficient.Calculated filter factor is supplied to filter factor storage unit271。
In step S205, filter factor storage unit 271 stores the filtering supplied from optimum filtering computing unit 122Coefficient.
Filter factor of the low-pass filtering setting unit 123 from the current clip being stored in filter factor storage unit 271Middle setting corresponds to the filter factor of current block, connects the terminal of switch 101, and the filter factor of setting is supplied to neighbourNearly pixel setting unit 111.
Accordingly, in step S206, filtering processing is performed on the neighboring pixel values of the current block from the frame memory 72 using the set filter coefficient.
Of course, in the same way as with the example in Figure 21, further optimized filter coefficients may be obtained by repeating the above-described steps S204 through S207.
As described above, with the learning device 251, processing identical to the encoding processing that is actually used is executed using training image signals, and the filter coefficients calculated thereby are stored in the filter coefficient storage unit 271. Accordingly, optimum filter coefficients can be obtained.
The filter coefficients stored in this filter coefficient storage unit 271 are stored, via a storage medium, a network, or the like, in the filter coefficient memory 94 (Figure 14) of the picture coding device 51 in Fig. 1 and in the filter coefficient memory 194 (Figure 23) of the picture decoding apparatus 151 in Figure 22.
Also, with the learning device 251, as described above, the coefficients obtained for blocks included in I pictures (or for intra macroblocks included in B pictures and P pictures) may be applied only to blocks included in I pictures. Alternatively, the coefficients may be applied not only to blocks included in I pictures but also to intra macroblocks included in B pictures and P pictures.
Accordingly, high coding efficiency can be realized with the picture coding device 51 in Fig. 1 and the picture decoding apparatus 151 in Figure 22.
Note that the orthogonal transform unit 64 and inverse orthogonal transformation unit 69 of the picture coding device 51 in Fig. 1 and the inverse orthogonal transformation unit 164 of the picture decoding apparatus 151 in Figure 22 described above perform the orthogonal transform/inverse orthogonal transform defined in H.264/AVC. Alternatively, an arrangement may be made in which the orthogonal transform unit 64 and inverse orthogonal transformation unit 69 of the picture coding device 51 in Fig. 1 and the inverse orthogonal transformation unit 164 of the picture decoding apparatus 151 in Figure 22 perform the orthogonal transform/inverse orthogonal transform proposed in non-patent literature 1.
Accordingly, the coding efficiency of the format proposed in non-patent literature 1 can be further improved.
Note that while an example of performing intra prediction has been described in the above description, the present invention may also be applied to intra prediction in the re prediction proposed in non-patent literature 2.
<2. second embodiment>
[other configuration example of picture coding device]
Figure 31 shows the configuration of another embodiment of a picture coding device serving as an image processing apparatus to which the present invention has been applied.
The picture coding device 351 and the picture coding device 51 in Fig. 1 have in common the A/D converting unit 61, picture rearranging buffer 62, computing unit 63, orthogonal transform unit 64, quantifying unit 65, lossless coding unit 66, storage buffer 67, inverse quantization unit 68, inverse orthogonal transformation unit 69, computing unit 70, de-blocking filter 71, frame memory 72, switch 73, intraprediction unit 74, motion prediction/compensating unit 76, forecast image selecting unit 77, and rate control unit 78.
Also, the picture coding device 351 differs from the picture coding device 51 in Fig. 1 in that the neighborhood pixels interpolation filtering switch unit 75 has been omitted, and in that a re prediction unit 361 and a neighborhood pixels interpolation filtering switch unit 362 have been added.
That is to say, with the example in Figure 31, the intraprediction unit 74 performs intra prediction according to the H.264/AVC format.
On the other hand, the motion prediction/compensating unit 76 detects the motion vectors of all candidate inter prediction modes based on the image to be subjected to inter processing and the reference image, performs compensation processing on the reference image based on the motion vectors, and generates a prediction image.
The motion prediction/compensating unit 76 supplies, to the re prediction unit 361, the detected motion vector information, information of the image to be subjected to inter processing (address and so forth), and the primary residual, which is the difference between the image for inter prediction and the generated prediction image.
The motion prediction/compensating unit 76 determines the optimal intra prediction mode in the re prediction by comparing the secondary residuals from the re prediction unit 361. Also, the motion prediction/compensating unit 76 determines whether to encode the secondary residual or the primary residual by comparing the secondary residual with the primary residual. Note that this processing is performed for all candidate inter prediction modes.
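As a rough sketch of the decision described above between encoding the primary residual and the secondary residual, the following hedged Python fragment uses the sum of absolute values as a simple stand-in for the cost function value; the actual cost function of the text (expression (73) or (74)) is not reproduced here.

```python
import numpy as np

def choose_residual(primary_residual, secondary_residual):
    """Pick whichever residual is cheaper to encode; the sum of absolute values
    is used here as a simple stand-in for the cost function value."""
    cost_primary = int(np.abs(primary_residual).sum())
    cost_secondary = int(np.abs(secondary_residual).sum())
    if cost_secondary < cost_primary:
        return secondary_residual, True    # True: the re prediction flag would be set
    return primary_residual, False
```

Applied per candidate inter prediction mode, the returned flag corresponds to whether the re prediction flag would be set for that mode.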
The motion prediction/compensating unit 76 calculates cost function values for all candidate inter prediction modes. At this time, the cost function value is calculated using whichever of the primary residual and the secondary residual has been determined for each inter prediction mode. The motion prediction/compensating unit 76 determines the prediction mode that yields the minimum value among the calculated cost function values to be the optimal inter prediction mode.
The motion prediction/compensating unit 76 supplies the prediction image generated in the optimal inter prediction mode (or the difference between the image for inter prediction and the secondary residual) and its cost function value to the forecast image selecting unit 77. In the case that the forecast image selecting unit 77 has selected the prediction image generated in the optimal inter prediction mode, the motion prediction/compensating unit 76 outputs information indicating the optimal inter prediction mode to the lossless coding unit 66. At this time, the motion vector information, reference frame information, a re prediction flag indicating that re prediction is to be performed, information of the intra prediction mode in the re prediction, and so forth, are also output to the lossless coding unit 66.
The lossless coding unit 66 also subjects the information from the motion prediction/compensating unit 76 to lossless encoding processing such as variable-length coding or arithmetic coding, and inserts it into the header of the compressed image.
Based on the motion vector information from the motion prediction/compensating unit 76 and the information of the image to be subjected to inter processing, the re prediction unit 361 reads, from the frame memory 72, the current neighboring pixels adjacent to the current block to be subjected to inter processing. Also, the re prediction unit 361 reads, from the frame memory 72, the reference neighboring pixels adjacent to the reference block associated with the current block by the motion vector information.
The re prediction unit 361 performs re prediction processing. Re prediction processing is processing in which intra prediction is performed between the primary residual and the difference between the current neighboring pixels and the reference neighboring pixels, thereby generating second-order difference (secondary residual) information.
The re prediction processing will now be described with reference to Figure 32.
With the example in Figure 32, a current frame and a reference frame are shown, with a current block A shown in the current frame.
In the case that a motion vector mv (mv_x, mv_y) has been obtained for the current block A between the reference frame and the current frame, the difference information (residual) between the current block A and the block associated with the current block A by the motion vector mv is calculated.
With the re prediction system, not only is the difference information relating to the current block A calculated, but also the difference information between the neighboring pixel group R adjacent to the current block A and the neighboring pixel group R1 associated with the neighboring pixel group R by the motion vector mv is calculated.
That is to say, the coordinates of the neighboring pixel group R are obtained from the upper-left coordinates (x, y) of the current block A. Also, the coordinates of the neighboring pixel group R1 are obtained from the upper-left coordinates (x+mv_x, y+mv_y) of the block associated with the current block A by the motion vector mv. The difference information of the neighboring pixel groups is calculated from these coordinate values.
With the re prediction system, intra prediction according to the H.264/AVC format is performed between the difference information relating to the current block calculated in this way and the difference information relating to the neighboring pixels, thereby generating second-order difference information. The generated second-order difference information is subjected to orthogonal transform and quantization, encoded along with the compressed image, and transmitted to the decoding side.
Prior to this re prediction, the re prediction unit 361 performs filtering processing on the difference between the current neighboring pixels and the reference neighboring pixels used for the intra prediction, using the filter coefficient set by the neighborhood pixels interpolation filtering switch unit 362. The re prediction unit 361 then performs re prediction processing using the filtered difference between the current neighboring pixels and the reference neighboring pixels, and outputs the second-order difference information (secondary residual) to the motion prediction/compensating unit 76.
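The following is a minimal Python sketch, under simplifying assumptions, of the re prediction processing described with reference to Figure 32: the primary residual of the current block is formed, the differences between the neighboring pixel group R and the motion-compensated group R1 are formed, those differences are optionally low-pass filtered with a switchable 3-tap coefficient, and a single representative intra prediction mode (DC) is applied to the difference signal to obtain the second-order difference. Border handling, the full set of intra prediction modes, and the helper names are assumptions for illustration only.

```python
import numpy as np

def filter3(line, taps=(1, 2, 1)):
    """3-tap low-pass filter applied to a line of (difference) values."""
    c0, c1, c2 = taps
    p = np.pad(np.asarray(line, dtype=np.int64), 1, mode="edge")
    s = c0 + c1 + c2
    return (c0 * p[:-2] + c1 * p[1:-1] + c2 * p[2:] + s // 2) // s

def neighbor_lines(frame, x, y, n=4):
    """Top neighbors (n+1 pixels including the top-left) and left neighbors of
    the n x n block whose upper-left corner is (x, y); no border handling."""
    f = np.asarray(frame, dtype=np.int64)
    return f[y - 1, x - 1:x + n], f[y:y + n, x - 1]

def secondary_prediction(cur, ref, x, y, mv, taps=(1, 2, 1), use_filter=True, n=4):
    """Second-order difference of the n x n block at (x, y) with motion vector mv."""
    mx, my = mv
    cur = np.asarray(cur, dtype=np.int64)
    ref = np.asarray(ref, dtype=np.int64)
    primary = cur[y:y + n, x:x + n] - ref[y + my:y + my + n, x + mx:x + mx + n]

    top_r, left_r = neighbor_lines(cur, x, y, n)              # group R
    top_r1, left_r1 = neighbor_lines(ref, x + mx, y + my, n)  # group R1
    d_top, d_left = top_r - top_r1, left_r - left_r1          # neighbor differences
    if use_filter:                                            # switchable filtering
        d_top, d_left = filter3(d_top, taps), filter3(d_left, taps)

    # one representative intra prediction mode (DC) applied to the difference signal
    dc = int(round((d_top[1:].sum() + d_left.sum()) / (2.0 * n)))
    return primary - dc                                       # secondary residual
```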
That is to say, the re prediction unit 361 is configured so as to include the intraprediction unit 74 and so forth shown in Figure 14.
The neighborhood pixels interpolation filtering switch unit 362 is configured basically in the same way as the neighborhood pixels interpolation filtering switch unit 75 in Fig. 1, and performs the same processing. That is to say, the neighborhood pixels interpolation filtering switch unit 362 sets a filter coefficient according to the intra prediction mode information from the re prediction unit 361 and the quantization parameter from the rate control unit 78, and supplies the set filter coefficient to the re prediction unit 361.
Note that the encoding processing of the picture coding device 351 in Figure 31 differs from the encoding processing in Figure 16 performed by the picture coding device 51 in Fig. 1 only with regard to the following intra processing and motion prediction processing, and the other processing is basically the same, so description thereof will be omitted.
That is to say, with the picture coding device 351 in Figure 31, intra prediction according to the H.264/AVC format is performed as the intra processing. Also, as the motion prediction processing, second-order difference information is generated in the motion prediction processing using the filter coefficient set by the neighborhood pixels interpolation filtering switch unit 362. The difference information with the better coding efficiency is selected out of the first-order difference information and the second-order difference information, and the optimal prediction mode is determined by comparing cost function values.
A picture decoding apparatus that receives the compressed image encoded by this picture coding device 351 and decodes it will be described with reference to Figure 33.
[other configuration example of picture decoding apparatus]
Figure 33 shows the configuration of another embodiment of a picture decoding apparatus serving as an image processing apparatus to which the present invention has been applied.
The picture decoding apparatus 401 and the picture decoding apparatus 151 in Figure 22 have in common the storage buffer 161, lossless decoding unit 162, inverse quantization unit 163, inverse orthogonal transformation unit 164, computing unit 165, de-blocking filter 166, picture rearranging buffer 167, D/A converting unit 168, frame memory 169, switch 170, intraprediction unit 171, motion prediction/compensating unit 173, and switch 174.
Also, the picture decoding apparatus 401 differs from the picture decoding apparatus 151 in Figure 22 in that the neighborhood pixels interpolation filtering switch unit 172 has been omitted, and in that a re prediction unit 411 and a neighborhood pixels interpolation filtering switch unit 412 have been added.
That is to say, information indicating the intra prediction mode, obtained by decoding the header information, is supplied from the lossless decoding unit 162 to the intraprediction unit 171. Based on this information, the intraprediction unit 171 generates a prediction image and outputs the generated prediction image to the switch 174.
Of the information obtained by decoding the header information, the prediction mode information, motion vector information, reference frame information, and so forth are supplied from the lossless decoding unit 162 to the motion prediction/compensating unit 173. Also, in the case that re prediction processing has been applied to the current block, a re prediction flag indicating that re prediction is to be performed and the intra prediction mode information of the re prediction are also supplied from the lossless decoding unit 162 to the motion prediction/compensating unit 173.
In the case of determining that re prediction processing has been applied, the motion prediction/compensating unit 173 controls the re prediction unit 411 so that re prediction is performed in the intra prediction mode indicated by the intra prediction mode information of the re prediction.
The motion prediction/compensating unit 173 performs motion prediction and compensation processing on the image based on the motion vector information and the reference frame information, and generates a prediction image. That is to say, the prediction image of the current block is generated using the pixel values of the reference block associated with the current block. The motion prediction/compensating unit 173 then adds the prediction difference value from the re prediction unit 411 to the generated prediction image, and outputs these to the switch 174.
The re prediction unit 411 performs re prediction using the difference between the current neighboring pixels and the reference neighboring pixels read from the frame memory 169. Prior to this re prediction, the re prediction unit 411 performs filtering processing on the difference between the current neighboring pixels and the reference neighboring pixels, using the filter coefficient set by the neighborhood pixels interpolation filtering switch unit 412. The re prediction unit 411 then performs re prediction processing using the filtered difference between the current neighboring pixels and the reference neighboring pixels, and outputs the obtained second-order difference information (secondary residual) to the motion prediction/compensating unit 173.
That is to say, the re prediction unit 411 is configured so as to include the intraprediction unit 171 and so forth shown in Figure 26.
The neighborhood pixels interpolation filtering switch unit 412 is configured basically in the same way as the neighborhood pixels interpolation filtering switch unit 172. That is to say, the neighborhood pixels interpolation filtering switch unit 412 sets a filter coefficient corresponding to at least one of the quantization parameter and the intra prediction mode from the lossless decoding unit 162. The neighborhood pixels interpolation filtering switch unit 412 supplies the set filter coefficient to the re prediction unit 411.
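As a small illustrative sketch (not the actual switching rule of the text), the coefficient selection at the decoding side might be organized as a lookup keyed by a coarse quantization parameter bucket and the intra prediction mode of the re prediction, falling back to a default coefficient when no entry exists; the table contents and the threshold below are assumed values.

```python
# Hypothetical coefficient table keyed by (QP bucket, intra prediction mode);
# the values, the bucketing, and the threshold are assumptions for illustration.
COEFF_TABLE = {
    (0, 0): (1, 2, 1), (0, 2): (1, 6, 1),
    (1, 0): (3, 10, 3), (1, 2): (1, 2, 1),
}

def select_coefficients(qp, intra_mode, qp_threshold=26):
    """Map the quantization parameter to a coarse bucket and look up the taps
    for the intra mode, falling back to the default {1, 2, 1} when absent."""
    bucket = 0 if qp < qp_threshold else 1
    return COEFF_TABLE.get((bucket, intra_mode), (1, 2, 1))
```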
Note that the decoding processing of the picture decoding apparatus 401 in Figure 33 differs from the decoding processing in Figure 24 performed by the picture decoding apparatus 151 in Figure 22 only with regard to the following intra processing and motion prediction processing, and the other processing is basically the same, so description thereof will be omitted.
That is to say, with the picture decoding apparatus 401 in Figure 33, intra prediction according to the H.264/AVC format is performed as the intra processing. Also, as the motion prediction processing, re prediction (intra prediction) is performed in the motion prediction processing using the filter coefficient set by the neighborhood pixels interpolation filtering switch unit 412, and second-order difference information is generated.
The present invention may thus also be applied to intra prediction in re prediction processing such as described above.
Note that in the above description, an example has been described in which the neighboring pixels used for intra prediction are subjected to filtering processing using a filter coefficient adaptively set for the image, prior to intra prediction.
Now, the noise included in the neighboring pixels used for intra prediction differs according to encoding conditions such as the content of the image, the quantization parameter, and so forth. Accordingly, there are blocks for which the coding efficiency is improved by performing the filtering processing performed with the H.264/AVC format, for example, and blocks for which it is not.
In either case, when intra prediction based on 8×8 blocks is performed on a macroblock with the H.264/AVC format, blanket filtering processing is performed on all blocks, so blocks for which the coding efficiency consequently deteriorates can appear.
Accordingly, an example of a case of turning on/off the filtering processing performed on the neighboring pixels used for intra prediction will be described below.
<3. 3rd embodiment>
[other configuration example of picture coding device]
Figure 34 shows the configuration of another embodiment of a picture coding device serving as an image processing apparatus to which the present invention has been applied.
The picture coding device 451 and the picture coding device 51 in Fig. 1 have in common the A/D converting unit 61, picture rearranging buffer 62, computing unit 63, orthogonal transform unit 64, quantifying unit 65, lossless coding unit 66, storage buffer 67, inverse quantization unit 68, inverse orthogonal transformation unit 69, computing unit 70, de-blocking filter 71, frame memory 72, switch 73, intraprediction unit 74, motion prediction/compensating unit 76, forecast image selecting unit 77, and rate control unit 78.
Also, the picture coding device 451 differs from the picture coding device 51 in Fig. 1 in that a neighborhood pixels interpolation filtering control unit 461 is used instead of the neighborhood pixels interpolation filtering switch unit 75.
That is to say, the neighborhood pixels interpolation filtering control unit 461 performs control to turn on or off the blanket filtering processing which, with the H.264/AVC format, is performed on the neighboring pixels of all blocks when intra prediction based on 8×8 blocks is performed on a macroblock. Note that while with the H.264/AVC format the filtering processing is performed only for the intra prediction based on 8×8 blocks, with the neighborhood pixels interpolation filtering control unit 461, this filtering processing is also performed for intra 4×4 and intra 16×16.
The on/off control signal from the neighborhood pixels interpolation filtering control unit 461 is supplied to the intraprediction unit 74.
The intraprediction unit 74 performs intra prediction processing for all candidate intra prediction modes based on the image to be subjected to intra prediction read out from the picture rearranging buffer 62 and the reference image supplied from the frame memory 72. At this time, at the intraprediction unit 74, intra prediction is performed with the filtering processing turned on or turned off prior to the intra prediction, according to the control signal from the neighborhood pixels interpolation filtering control unit 461, and the intra prediction mode having the smaller calculated cost function value resulting therefrom is used.
Also, the intraprediction unit 74 generates a flag indicating whether the filtering processing is on or off. In the case that the forecast image selecting unit 77 has selected the prediction image generated in the optimal intra prediction mode, this flag information is supplied to the lossless coding unit 66 together with the information indicating the optimal intra prediction mode.
[configuration example of intraprediction unit]
Figure 35 is a block diagram illustrating a detailed configuration example of the intraprediction unit 74 in Figure 34. In the case of the example in Figure 35, the intraprediction unit 74 is made up of a prediction image generating unit 471, a cost function value generating unit 472, and a mode and on/off flag generating unit 473.
The neighboring pixel values of the current block for intra prediction are supplied from the frame memory 72 to the prediction image generating unit 471. In the case of Figure 35, the switch 73 is omitted from the drawing, but in actual practice, the neighboring pixel values are supplied from the frame memory 72 to the prediction image generating unit 471 via the switch 73. Note that in the case of intra prediction, pixel values not subjected to deblocking filtering by the de-blocking filter 71 are used as the neighboring pixel values.
Based on the control signal from the neighborhood pixels interpolation filtering control unit 461, the prediction image generating unit 471 performs intra prediction in all candidate intra prediction modes, either performing filtering processing on the neighboring pixel values or not performing it, and generates prediction images. The on/off control by the neighborhood pixels interpolation filtering control unit 461 indicated by the control signal is performed in increments of blocks or in increments of macroblocks, as described later with reference to Figure 36 through Figure 38.
The prediction image generating unit 471 supplies the generated prediction image pixel values and the intra prediction mode information thereof to the cost function value generating unit 472.
Original image pixel values are supplied from the picture rearranging buffer 62 to the cost function value generating unit 472. For the cases of the filtering processing being on and off, the cost function value generating unit 472 calculates cost function values for each intra prediction mode using the original image pixel values and the prediction image pixel values. The cost function value generating unit 472 supplies the calculated cost function values, prediction image pixel values, and intra prediction mode information to the mode and on/off flag generating unit 473.
The mode and on/off flag generating unit 473 uses the cost function values from the cost function value generating unit 472 to determine the optimal intra prediction mode and to set the filtering processing on or off, and generates on/off flag information indicating whether the filtering processing is on or off.
The mode and on/off flag generating unit 473 supplies the prediction image pixel values of the optimal intra prediction mode to the forecast image selecting unit 77. In the case that the forecast image selecting unit 77 has selected the prediction image of the optimal intra prediction mode, the mode and on/off flag generating unit 473 supplies the information indicating the optimal intra prediction mode and the corresponding on/off flag information to the lossless coding unit 66.
Note that other than the intra prediction described next, the processing performed by the picture coding device 451 is basically the same as the processing of the picture coding device 51 in Fig. 1, so repetitive description will be omitted.
Next, the intra prediction processing performed by the intraprediction unit 74 in Figure 34 in the case of performing on/off control in increments of blocks will be described with reference to the flowchart in Figure 36. Note that this processing is another example of the intra prediction processing in step S31 in Figure 17, and with the example in Figure 36, an example of intra 4×4 will be described. Also, hereinafter, the filtering processing on/off may be referred to simply as filter on/off.
In step S401, the cost function value generating unit 472 generates cost function values of the current block for each of the nine kinds of intra prediction modes shown in Fig. 3 or Fig. 4.
That is to say, the neighboring pixel values of the current block for intra prediction are supplied from the frame memory 72 to the prediction image generating unit 471. The prediction image generating unit 471 performs intra prediction in each of the nine kinds of intra prediction modes shown in Fig. 3 or Fig. 4, and generates prediction images of the current block.
At this time, a control signal to the effect that filtering processing is not to be performed on the neighboring pixels is supplied from the neighborhood pixels interpolation filtering control unit 461, and the prediction image generating unit 471 performs intra prediction without filtering processing being performed on the neighboring pixels. Note that an arrangement may be made here in which a control signal to the effect that filtering processing is to be performed is supplied. However, the control is not made differently for the nine kinds of modes, such as on for vertical and off for horizontal; rather, the same control regarding performing or not performing is made for all nine kinds of modes. Also, note that not performing the filtering processing for all of the modes here allows a smaller amount of computation.
The prediction image generating unit 471 supplies the generated prediction image pixel values and the intra prediction mode information to the cost function value generating unit 472. With the filter off, for each of the intra prediction modes, the cost function value generating unit 472 calculates the cost function value shown in the above-described expression (73) or expression (74), using the original image pixel values from the picture rearranging buffer 62 and the prediction image pixel values. The cost function value generating unit 472 supplies the calculated cost function values, prediction image pixel values, and intra prediction mode information to the mode and on/off flag generating unit 473.
In step S402, the mode and on/off flag generating unit 473 uses the cost function values from the cost function value generating unit 472 to select the optimal intra prediction mode for the current block. The selected intra prediction mode information is supplied to the neighborhood pixels interpolation filtering control unit 461.
In step S403, the neighborhood pixels interpolation filtering control unit 461 causes the cost function value generating unit 472 to generate the cost function values of the selected intra prediction mode for the cases of the filter being on and off. Note that the cost function value with the filter off has been generated in step S401, so in actual practice, the cost function value with the filter on is generated in step S403.
That is to say, the neighborhood pixels interpolation filtering control unit 461 supplies a filter-on control signal and the selected intra prediction mode information to the prediction image generating unit 471. The prediction image generating unit 471 performs filtering processing on the neighboring pixels used for the selected intra prediction mode, performs intra prediction in the selected intra prediction mode, and generates a prediction image of the current block.
The prediction image generating unit 471 supplies the generated prediction image pixel values and the selected intra prediction mode information to the cost function value generating unit 472. With the filter on, for the selected intra prediction mode, the cost function value generating unit 472 calculates the cost function value shown in the above-described expression (73) or expression (74), using the original image pixel values from the picture rearranging buffer 62 and the prediction image pixel values. The cost function value generating unit 472 supplies the calculated cost function value and prediction image pixel values to the mode and on/off flag generating unit 473.
In step S404, the mode and on/off flag generating unit 473 determines the filter on/off for the current block by comparing the cost function values of the selected intra prediction mode with the filter on and off. That is to say, in the case that the cost function value with the filter on is smaller, filter on is determined for the current block, and in the case that the cost function value with the filter off is smaller, filter off is determined for the current block. The mode and on/off flag generating unit 473 then supplies the determined prediction image pixel values to the forecast image selecting unit 77.
In step S405, the mode and on/off flag generating unit 473 generates an on/off flag indicating the on or off determined for the current block in step S404. For example, in the case of filter on, the on/off flag value is 1, and in the case of filter off, the on/off flag value is 0.
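The flow of steps S401 through S405 can be summarized by the following Python sketch; evaluate_cost is a hypothetical callback standing in for the generation of the cost function value of the current block for a given intra prediction mode and filter state.

```python
def intra_decision_per_block(evaluate_cost, modes=range(9)):
    """Sketch of Figure 36: evaluate_cost(mode, filter_on) is assumed to return
    the cost function value of the current block for that combination."""
    costs_off = {m: evaluate_cost(m, False) for m in modes}   # step S401
    best_mode = min(costs_off, key=costs_off.get)             # step S402
    cost_on = evaluate_cost(best_mode, True)                  # step S403
    onoff_flag = 1 if cost_on < costs_off[best_mode] else 0   # steps S404/S405
    return best_mode, onoff_flag
```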
In the case that the prediction image in an intra prediction mode has been selected in step S22 in Figure 16 described above, the generated on/off flag information is supplied to the lossless coding unit 66 together with the information indicating the optimal intra prediction mode. The supplied information is encoded in step S23 in Figure 16, added to the header of the compressed image, and transmitted to the decoding side.
Next, another example of the intra prediction processing performed by the intraprediction unit 74 in Figure 34 in the case of performing on/off control in increments of blocks will be described with reference to the flowchart in Figure 37. In the case of the example in Figure 37, an example of intra 4×4 will also be described.
In step S421, the cost function value generating unit 472 generates the cost function values of the current block for the cases of the filter being on and off, for each of the intra prediction modes.
That is to say, the neighboring pixel values of the current block for intra prediction are supplied from the frame memory 72 to the prediction image generating unit 471. The prediction image generating unit 471 performs intra prediction in each of the nine kinds of intra prediction modes shown in Fig. 3 or Fig. 4, and generates prediction images of the current block.
At this time, first, a control signal to the effect that filtering processing is not to be performed on the neighboring pixels is supplied from the neighborhood pixels interpolation filtering control unit 461, and the prediction image generating unit 471 performs intra prediction in each intra prediction mode without filtering processing being performed on the neighboring pixels, and generates prediction images. Further, a control signal to the effect that filtering processing is to be performed on the neighboring pixels is supplied from the neighborhood pixels interpolation filtering control unit 461, and the prediction image generating unit 471 performs intra prediction in each intra prediction mode with filtering processing having been performed on the neighboring pixels, and generates prediction images.
The prediction image generating unit 471 supplies the information of each intra prediction mode for the cases of the filter being on and off and the corresponding prediction image pixel values to the cost function value generating unit 472. For each of the cases of the filter being off and on, and for each of the intra prediction modes, the cost function value generating unit 472 calculates the cost function value shown in the above-described expression (73) or expression (74), using the original image pixel values from the picture rearranging buffer 62 and the prediction image pixel values. For each of the cases of the filter being off and on, the cost function value generating unit 472 supplies the calculated cost function values, prediction image pixel values, and intra prediction mode information to the mode and on/off flag generating unit 473.
In step S422, the mode and on/off flag generating unit 473 uses the cost function values from the cost function value generating unit 472 to determine, for each intra prediction mode, whether the filter should be on or off for the current block.
In step S423, the mode and on/off flag generating unit 473 selects the optimal intra prediction mode for the current block from among the intra prediction modes for which the filter on/off has been determined.
In step S424, the mode and on/off flag generating unit 473 generates on/off flag information indicating the state of the filter (on or off) for the selected intra prediction mode. In the case that the prediction image in an intra prediction mode has been selected in step S22 in Figure 16 described above, the generated on/off flag information is supplied to the lossless coding unit 66 together with the information indicating the optimal intra prediction mode. The supplied information is encoded in step S23 in Figure 16, added to the header of the compressed image, and transmitted to the decoding side.
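For comparison, the flow of steps S421 through S424 decides the filter on/off per mode first and then selects the mode, as in the following sketch using the same hypothetical evaluate_cost callback.

```python
def intra_decision_joint(evaluate_cost, modes=range(9)):
    """Sketch of Figure 37: decide filter on/off per mode first (steps S421/S422),
    then select the best mode among those decisions (steps S423/S424)."""
    per_mode = {}
    for m in modes:
        c_off, c_on = evaluate_cost(m, False), evaluate_cost(m, True)
        per_mode[m] = (c_on, 1) if c_on < c_off else (c_off, 0)
    best_mode = min(per_mode, key=lambda m: per_mode[m][0])
    return best_mode, per_mode[best_mode][1]                  # (mode, on/off flag)
```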
Next, the intra prediction processing performed by the intraprediction unit 74 in Figure 34 in the case of performing on/off control in increments of macroblocks will be described with reference to the flowchart in Figure 38.
Note that this processing is another example of the intra prediction processing in step S31 in Figure 17, and with the example in Figure 38, an example of intra 4×4 will also be described.
In step S451, the neighborhood pixels interpolation filtering control unit 461 fixes the filter for the entire macroblock to off or on. In this case, the neighborhood pixels interpolation filtering control unit 461 fixes the filter to off, and supplies a filter-off control signal to the prediction image generating unit 471. The filter may be fixed to either on or off, but fixing it to off can be realized with a smaller amount of computation.
In step S452, the intraprediction unit 74 determines the intra prediction mode for each block. That is to say, the neighboring pixel values of the current block for intra prediction are supplied from the frame memory 72 to the prediction image generating unit 471. The prediction image generating unit 471 performs intra prediction in each of the nine kinds of intra prediction modes shown in Fig. 3 or Fig. 4, and generates prediction images of the current block.
At this time, first, a control signal to the effect that filtering processing is not to be performed on the neighboring pixels is supplied from the neighborhood pixels interpolation filtering control unit 461, and the prediction image generating unit 471 performs intra prediction in each intra prediction mode without performing filtering processing on the neighboring pixels, and generates prediction images. The prediction image generating unit 471 supplies the generated prediction image pixel values and the intra prediction mode information thereof to the cost function value generating unit 472.
With the filter off, for each of the intra prediction modes, the cost function value generating unit 472 calculates the cost function value shown in the above-described expression (73) or expression (74), using the original image pixel values from the picture rearranging buffer 62 and the prediction image pixel values. With the filter off, the cost function value generating unit 472 supplies the calculated cost function values, prediction image pixel values, and intra prediction mode information to the mode and on/off flag generating unit 473.
The mode and on/off flag generating unit 473 uses the cost function values from the cost function value generating unit 472 to determine the optimal intra prediction mode for each block. The determined intra prediction mode information is supplied to the neighborhood pixels interpolation filtering control unit 461.
In step S453, the neighborhood pixels interpolation filtering control unit 461 causes the cost function value generating unit 472 to generate cost function values for the entire macroblock with the filter on and off. Note that the cost function values with the filter off for the optimal intra prediction mode of each block within the macroblock (that is, of the entire macroblock) have been generated in step S452. Accordingly, in actual practice, the cost function values of the entire macroblock with the filter on are generated in step S453.
That is to say, the neighborhood pixels interpolation filtering control unit 461 supplies a filter-on control signal and the information of the intra prediction mode determined for each block to the prediction image generating unit 471. The prediction image generating unit 471 performs filtering processing on the neighboring pixel values used for the determined intra prediction mode, performs intra prediction in the determined intra prediction mode, and generates a prediction image of the current block.
The prediction image generating unit 471 supplies the generated prediction image pixel values and the determined intra prediction mode information to the cost function value generating unit 472. With the filter on, for the determined intra prediction mode, the cost function value generating unit 472 calculates the cost function value shown in the above-described expression (73) or expression (74), using the original image pixel values from the picture rearranging buffer 62 and the prediction image pixel values. For each of the cases of the filter being off and on, the cost function value generating unit 472 supplies the calculated cost function values, prediction image pixel values, and intra prediction mode information to the mode and on/off flag generating unit 473.
In step S454, the mode and on/off flag generating unit 473 compares the cost function values of all the blocks within the macroblock from the cost function value generating unit 472 for the cases of the filter being on and off, and determines which of filter on/off to apply to the entire macroblock.
In step S455, the mode and on/off flag generating unit 473 generates, for the entire macroblock, an on/off flag indicating the on or off determined in step S454. The generated on/off flag information is supplied to the lossless coding unit 66 for each macroblock. The supplied information is encoded in step S23 in Figure 16, added to the header of the compressed image, and transmitted to the decoding side.
As described above, the control of the filter on/off (turning on or off) may be performed in increments of blocks, or may be performed in increments of macroblocks. Note that while the prediction precision of the intra prediction processing can be improved by performing on/off control in increments of blocks, the amount of information necessary for transmitting the flag information for each block increases. Conversely, in the case of control being performed in increments of macroblocks, the improvement in prediction precision is lower than with control performed in increments of blocks, but one piece of flag information per macroblock suffices, so the increase in the amount of flag information can be kept small.
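As a rough numerical illustration (an assumed configuration, not taken from the text): with 4×4 luminance blocks, one 16×16 macroblock contains 16 blocks, so block-increment control transmits 16 on/off flags per macroblock, whereas macroblock-increment control transmits only one, a 16-fold reduction in flag overhead in exchange for a coarser on/off decision.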
While an example of the luminance signal has been described in the above description, this may also be applied to intra prediction regarding color difference signals. Also, the filter coefficients of the filtering processing to be controlled are not restricted to the three taps {1, 2, 1} // 4 in the H.264/AVC format, and this may be applied to coefficients of any tap length set with the picture coding device 51 in Fig. 1.
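In the case of the default three-tap coefficients, the filtering applied to the line of neighboring pixels when the filter is on can be sketched as follows; this is a simplified stand-in for the 8×8 intra smoothing of the H.264/AVC format and, for brevity, leaves the end pixels unchanged.

```python
import numpy as np

def smooth_neighbors_121(neighbors):
    """Default three-tap {1, 2, 1} // 4 smoothing of the neighboring pixel line,
    leaving the two end pixels unchanged for brevity."""
    p = np.asarray(neighbors, dtype=np.int64)
    out = p.copy()
    out[1:-1] = (p[:-2] + 2 * p[1:-1] + p[2:] + 2) >> 2
    return out

# e.g. smooth_neighbors_121([80, 82, 90, 120, 124, 126, 200, 202, 204])
```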
That is to say, in the case of the filter being on, filtering processing with the filter coefficients set by the picture coding device 51 in Fig. 1 may also be performed.
A picture decoding apparatus that receives the compressed image encoded by this picture coding device 451 and decodes it will be described with reference to Figure 39.
[other configuration example of picture decoding apparatus]
Figure 39 illustrates the configuration of another embodiment of a picture decoding apparatus serving as an image processing apparatus to which the present invention has been applied.
The picture decoding apparatus 501 and the picture decoding apparatus 151 in Figure 22 have in common the storage buffer 161, lossless decoding unit 162, inverse quantization unit 163, inverse orthogonal transformation unit 164, computing unit 165, de-blocking filter 166, picture rearranging buffer 167, D/A converting unit 168, frame memory 169, switch 170, intraprediction unit 171, motion prediction/compensating unit 173, and switch 174.
Also, the picture decoding apparatus 501 differs from the picture decoding apparatus 151 in Figure 22 in that a neighborhood pixels interpolation filtering control unit 511 is provided instead of the neighborhood pixels interpolation filtering switch unit 172.
That is to say, information indicating the intra prediction mode, obtained by decoding the header information, is supplied from the lossless decoding unit 162 to the intraprediction unit 171. Based on this information, the intraprediction unit 171 generates a prediction image and outputs the generated prediction image to the switch 174. At this time, prior to the intra prediction, the intraprediction unit 171 performs (or does not perform) filtering processing on the neighboring pixel values according to the control signal from the neighborhood pixels interpolation filtering control unit 511.
In accordance with the encoding by the picture coding device 451, the on/off flag information for each macroblock or each block is supplied from the lossless decoding unit 162 to the neighborhood pixels interpolation filtering control unit 511.
The neighborhood pixels interpolation filtering control unit 511 supplies a control signal to the effect that filtering processing is to be performed or is not to be performed to the intraprediction unit 171, according to the supplied on/off flag information.
Note that with the picture coding device 451 in Figure 34, both the filter on and off cases are tried, and intra prediction processing is performed having selected, by the cost function values, whichever yields the higher coding efficiency. On the other hand, with the picture decoding apparatus 501, the filter on or off is controlled based on the on/off flag information encoded and transmitted, and intra prediction processing is performed.
[configuration example of intraprediction unit and neighborhood pixels interpolation filtering control unit]
Figure 40 is a block diagram illustrating a detailed configuration example of the neighborhood pixels interpolation filtering control unit and the intraprediction unit.
In the case of the example in Figure 40, the intraprediction unit 171 is made up of a prediction mode buffer 521 and a prediction image generating unit 522. The neighborhood pixels interpolation filtering control unit 511 is made up of a flag buffer 531 and a control signal generating unit 532.
Intra prediction mode information from the lossless decoding unit 162 is supplied to the prediction mode buffer 521. The neighboring pixel values of the current block for intra prediction are supplied from the frame memory 169 to the prediction image generating unit 522. In the case of Figure 40, the switch 170 is also omitted from the drawing, but in actual practice, the neighboring pixel values are supplied from the frame memory 169 to the prediction image generating unit 522 via the switch 170.
The prediction image generating unit 522 reads out the intra prediction mode information of the current block from the prediction mode buffer 521, performs intra prediction on the current block in the read intra prediction mode, and generates a prediction image. Prior to this intra prediction, the prediction image generating unit 522 performs filtering processing on the neighboring pixel values from the frame memory 169 according to the control signal from the control signal generating unit 532.
The on/off flag information for each macroblock or each block is supplied from the lossless decoding unit 162 to the flag buffer 531. The control signal generating unit 532 reads out the corresponding on/off flag from the flag buffer 531, generates a control signal indicating whether or not to perform filtering processing for each block, and supplies the generated control signal to the prediction image generating unit 522.
Note that other than the prediction processing described next, the processing performed by the picture decoding apparatus 501 is basically the same as the processing of the picture decoding apparatus 151 in Figure 22, so repetitive description will be omitted.
[description of prediction processing]
Next, the prediction processing of the picture decoding apparatus 501 in Figure 39 will be described with reference to the flowchart in Figure 41. Note that this prediction processing is another example of the prediction processing in step S138 in Figure 24.
In step S501, the prediction image generating unit 522 determines whether or not the current block has been intra encoded. The intra prediction mode information is supplied from the lossless decoding unit 162 to the prediction mode buffer 521, and is read out by the prediction image generating unit 522. Accordingly, in step S501, the prediction image generating unit 522 determines that the current block has been intra encoded, and the processing advances to step S502.
In step S502, the prediction image generating unit 522 obtains the intra prediction mode information of the prediction mode buffer 521.
Also, when the on/off flag information from the lossless decoding unit 162 is supplied to the flag buffer 531, the flag buffer 531 obtains the on/off flag in step S503 and stores it.
The control signal generating unit 532 reads out the on/off flag corresponding to the current block from the flag buffer 531, and determines in step S504 whether or not the on/off flag is 1. In the case of determining in step S504 that the on/off flag is 1, that is, that the filtering processing is on, the control signal generating unit 532 supplies a control signal to the prediction image generating unit 522 so that the filtering processing is performed.
In accordance with the control signal, in step S505, the prediction image generating unit 522 performs filtering processing on the neighboring pixels using the filter coefficient. In step S506, the prediction image generating unit 522 performs intra prediction using the neighboring pixel values that have been subjected to the filtering processing, and generates a prediction image.
On the other hand, in the case of determining in step S504 that the on/off flag is not 1, that is, that the filtering processing is off, the filtering processing of step S505 is skipped, and the processing advances to step S506.
In step S506, the prediction image generating unit 522 performs intra prediction using the neighboring pixel values from the frame memory 169, and generates a prediction image.
The prediction image generated in step S506 is supplied to the switch 174.
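The decoding-side flow of steps S503 through S506 can be summarized by the following Python sketch, reduced to the vertical and DC intra prediction modes for brevity; the helper names are assumptions made for illustration.

```python
import numpy as np

def smooth_121(line):
    p = np.asarray(line, dtype=np.int64)
    out = p.copy()
    out[1:-1] = (p[:-2] + 2 * p[1:-1] + p[2:] + 2) >> 2
    return out

def decode_intra_block(intra_mode, onoff_flag, top, left):
    """Steps S503-S506 in outline: filter the neighbors only when the received
    on/off flag is 1, then intra-predict the 4x4 block (vertical or DC shown)."""
    top, left = np.asarray(top, dtype=np.int64), np.asarray(left, dtype=np.int64)
    if onoff_flag == 1:                       # steps S504/S505
        top, left = smooth_121(top), smooth_121(left)
    if intra_mode == 0:                       # vertical, step S506
        return np.tile(top[:4], (4, 1))
    dc = int(round((top[:4].sum() + left[:4].sum()) / 8.0))   # DC, step S506
    return np.full((4, 4), dc, dtype=np.int64)
```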
On the other hand, in the case of determining in step S501 that intra encoding has not been performed, the processing advances to step S507.
In step S507, the motion prediction/compensating unit 173 performs inter motion prediction. That is to say, in the case that the image to be processed is an image to be subjected to inter prediction processing, the necessary images are read out from the frame memory 169 and supplied to the motion prediction/compensating unit 173 via the switch 170. In step S508, the motion prediction/compensating unit 173 performs motion prediction in the inter prediction mode based on the motion vector obtained in step S507, and generates a prediction image. The generated prediction image is output to the switch 174.
As described above, with the picture coding device 451 and the picture decoding apparatus 501, the on and off of the filtering processing on the neighboring pixels used for intra prediction is controlled, and filtering processing is not performed for blocks where coding efficiency would deteriorate. Accordingly, coding efficiency can be improved.
Note that while an example of performing intra prediction has been described in the above description, the on/off control of the filtering processing may also be applied to the intra prediction in the re prediction described above with reference to Figure 32.
<4. fourth embodiment>
[other configuration example of picture coding device]
Figure 42 shows the configuration of another embodiment of a picture coding device serving as an image processing apparatus to which the present invention has been applied.
The picture coding device 551 and the picture coding device 451 in Figure 34 have in common the A/D converting unit 61, picture rearranging buffer 62, computing unit 63, orthogonal transform unit 64, quantifying unit 65, lossless coding unit 66, storage buffer 67, inverse quantization unit 68, inverse orthogonal transformation unit 69, computing unit 70, de-blocking filter 71, frame memory 72, switch 73, intraprediction unit 74, motion prediction/compensating unit 76, forecast image selecting unit 77, and rate control unit 78.
Also, the picture coding device 551 differs from the picture coding device 451 in Figure 34 in that the neighborhood pixels interpolation filtering control unit 461 has been omitted, and in that the re prediction unit 361 in Figure 31 and a neighborhood pixels interpolation filtering control unit 561 have been added.
That is to say, with the example in Figure 42, the intraprediction unit 74 performs intra prediction according to the H.264/AVC format.
On the other hand, the motion prediction/compensating unit 76 detects the motion vectors of all candidate inter prediction modes based on the image to be subjected to inter processing and the reference image, performs compensation processing on the reference image based on the motion vectors, and generates a prediction image.
The motion prediction/compensating unit 76 supplies, to the re prediction unit 361, the detected motion vector information, information of the image to be subjected to inter processing (address and so forth), and the primary residual, which is the difference between the image for inter prediction and the generated prediction image.
The motion prediction/compensating unit 76 determines the optimal intra prediction mode in the re prediction by comparing the secondary residuals from the re prediction unit 361. Also, the motion prediction/compensating unit 76 determines whether to encode the secondary residual or the primary residual by comparing the secondary residual with the primary residual. Note that this processing is performed for all candidate inter prediction modes.
The motion prediction/compensating unit 76 calculates cost function values for all candidate inter prediction modes. At this time, the cost function value is calculated using whichever of the primary residual and the secondary residual has been determined for each inter prediction mode. The motion prediction/compensating unit 76 determines the prediction mode that yields the minimum value among the calculated cost function values to be the optimal inter prediction mode.
The motion prediction/compensating unit 76 supplies the prediction image generated in the optimal inter prediction mode (or the difference between the image for inter prediction and the secondary residual) and its cost function value to the forecast image selecting unit 77. In the case that the forecast image selecting unit 77 has selected the prediction image generated in the optimal inter prediction mode, the motion prediction/compensating unit 76 outputs information indicating the optimal inter prediction mode to the lossless coding unit 66.
At this time, the motion vector information, reference frame information, a re prediction flag indicating that re prediction is to be performed, information of the intra prediction mode in the re prediction, and so forth, are also output to the lossless coding unit 66. The lossless coding unit 66 also subjects the information from the motion prediction/compensating unit 76 to lossless encoding processing such as variable-length coding or arithmetic coding, and inserts it into the header of the compressed image.
Based on the motion vector information from the motion prediction/compensating unit 76 and the information of the image to be subjected to inter processing, the re prediction unit 361 reads, from the frame memory 72, the current neighboring pixels adjacent to the current block to be subjected to inter processing. Also, the re prediction unit 361 reads, from the frame memory 72, the reference neighboring pixels adjacent to the reference block associated with the current block by the motion vector information.
The re prediction unit 361 performs the re prediction processing described above with reference to Figure 32. Re prediction processing is processing in which intra prediction is performed between the primary residual and the difference between the current neighboring pixels and the reference neighboring pixels, thereby generating second-order difference (secondary residual) information.
Note, however, that prior to the re prediction, the re prediction unit 361 in Figure 42 performs (or does not perform) filtering processing on the difference between the reference neighboring pixels and the current neighboring pixels used for inter prediction, according to the control signal from the neighborhood pixels interpolation filtering control unit 561. The re prediction unit 361 then performs re prediction processing using the filtered (or unfiltered) difference between the current neighboring pixels and the reference neighboring pixels, and outputs the obtained second-order difference information (secondary residual) to the motion prediction/compensating unit 76. At this time, the re prediction unit 361 also outputs on/off flag information indicating whether or not filtering processing has been performed to the motion prediction/compensating unit 76.
That is to say, the re prediction unit 361 includes the intraprediction unit 74 shown in Figure 35.
The neighborhood pixels interpolation filtering control unit 561 is configured basically in the same way as the neighborhood pixels interpolation filtering control unit 461, and performs the same processing. That is to say, the neighborhood pixels interpolation filtering control unit 561 supplies, to the re prediction unit 361, a control signal for controlling whether or not filtering processing is performed, in increments of blocks or in increments of macroblocks.
Note that other than the following intra processing and motion prediction processing, the processing performed by the picture coding device 551 in Figure 42 is basically the same as the processing of the picture coding device 451 in Figure 34 (that is, the encoding processing in Figure 16), so description thereof will be omitted.
That is to say, with the picture coding device 551 in Figure 42, intra prediction according to the H.264/AVC format is performed as the intra processing. Also, as the motion prediction processing, filtering processing is controlled in the motion prediction processing according to the control signal from the neighborhood pixels interpolation filtering control unit 561, and filtered (or unfiltered) second-order difference information is generated. The difference information with the better coding efficiency is selected out of the first-order difference information and the second-order difference information, and the cost function values are compared, thereby determining the optimal inter prediction mode.
A picture decoding apparatus that receives the compressed image encoded by this picture coding device 551 and decodes it will be described with reference to Figure 43.
[the other configurations example of picture decoding apparatus]
Figure 43 diagram is used as matching for another embodiment for applying the picture decoding apparatus of image processing apparatus of the inventionIt sets.
Picture decoding apparatus 501 in picture decoding apparatus 601 and Figure 39 is had in common that comprising storage buffer161, lossless decoding unit 162, inverse quantization unit 163, inverse orthogonal transformation unit 164, computing unit 165, de-blocking filter166, picture rearranges buffer 167, D/A converting unit 168, frame memory 169, switch 170, intraprediction unit171, motion prediction/compensating unit 173 and switch 174
In addition, picture decoding apparatus 501 in picture decoding apparatus 601 and Figure 39 the difference is that, neighbour is omittedNearly pixel interpolating filtering control unit 511, also, re prediction unit 411 and neighborhood pixels the interpolation filter being added in Figure 33Wave control unit 611.
That is, information indicating the intra prediction mode, obtained by decoding the header information, is supplied from the lossless decoding unit 162 to the intraprediction unit 171. Based on that information, the intraprediction unit 171 generates a prediction image and outputs the generated prediction image to the switch 174.
Of the information obtained by decoding the header information, the prediction mode information, motion vector information, reference frame information, and so forth are supplied from the lossless decoding unit 162 to the motion prediction/compensating unit 173. In the case where re prediction processing has been applied to the current block, the re prediction flag and the intra prediction mode information of the re prediction are also supplied from the lossless decoding unit 162 to the motion prediction/compensating unit 173.
Upon determining that re prediction processing has been applied, the motion prediction/compensating unit 173 controls the re prediction unit 411 so that re prediction is performed in the intra prediction mode indicated by the intra prediction mode information of the re prediction.
The motion prediction/compensating unit 173 subjects the image to motion prediction and compensation processing based on the motion vector information and the reference frame information, and generates a prediction image. That is, the prediction image of the current block is generated using the pixel values of the reference block associated with the current block. The motion prediction/compensating unit 173 then adds the prediction difference values from the re prediction unit 411 to the generated prediction image, and outputs these to the switch 174.
The re prediction unit 411 performs re prediction using the difference between the current neighborhood pixels and the reference neighborhood pixels read out from the frame memory 169. Note, however, that in the case of having received, from the neighborhood pixels interpolation filtering control unit 611, a control signal which effects control to perform the filtering processing, the re prediction unit 411 performs the filtering processing on the difference between the current neighborhood pixels and the reference neighborhood pixels before the re prediction. The re prediction unit 411 then performs the re prediction processing using the filtered difference between the current neighborhood pixels and the reference neighborhood pixels, and outputs the obtained second order difference information (secondary residual) to the motion prediction/compensating unit 173.
Note that in the case of having received, from the neighborhood pixels interpolation filtering control unit 611, a control signal which effects control not to perform the filtering processing, the re prediction unit 411 does not perform the filtering processing, and performs the re prediction processing using the unfiltered difference between the current neighborhood pixels and the reference neighborhood pixels.
That is, re prediction unit 411 is configured to include intraprediction unit 171 shown in Figure 40.
The neighborhood pixels interpolation filtering control unit 611 is configured in basically the same manner as the neighborhood pixels interpolation filtering control unit 511 in Figure 39, and performs basically the same processing. That is, of the information obtained by decoding the header information, the on/off flag information is supplied from the lossless decoding unit 162 to the neighborhood pixels interpolation filtering control unit 611. The neighborhood pixels interpolation filtering control unit 611 supplies a control signal according to the on/off flag information, so that the re prediction unit 411 either performs or does not perform the filtering processing on the neighborhood pixels.
Note that, except for the intra processing and the motion prediction processing described below, the processing performed by the picture decoding device 601 in Figure 43 is basically the same as that of the picture decoding device 501 in Figure 39 (that is, the decoding processing in Figure 24), so description thereof will be omitted.
That is, with the picture decoding device 601 in Figure 43, intra prediction according to the H.264/AVC format is performed as the intra processing. As the motion prediction processing, the filtering processing is controlled according to the control signal from the neighborhood pixels interpolation filtering control unit 611, re prediction (intra prediction) is performed, and second order difference information is generated.
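For reference, the decoder-side path may be sketched as follows (Python, reusing the dc_intra_predict stand-in and the assumed 3-tap filter from the encoder-side sketch above; the on/off flag and the secondary residual are taken as already parsed from the stream, since the actual syntax is that described earlier).

import numpy as np

def reconstruct_with_re_prediction(primary_prediction, secondary_residual,
                                   cur_neighbors, ref_neighbors, filter_on_flag):
    # filter_on_flag is the on/off flag recovered from the decoded header information
    neighbor_diff = np.asarray(cur_neighbors, dtype=np.int32) - np.asarray(ref_neighbors, dtype=np.int32)
    if filter_on_flag:
        # mirror the encoder-side smoothing of the neighbor differences
        padded = np.pad(neighbor_diff, 1, mode='edge')
        neighbor_diff = (padded[:-2] + 2 * padded[1:-1] + padded[2:] + 2) // 4
    re_prediction = dc_intra_predict(neighbor_diff, secondary_residual.shape[0])
    primary_residual = secondary_residual + re_prediction
    return primary_prediction + primary_residual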
The on/off control of the filtering processing described above in conjunction with the re prediction processing is similarly applicable to intra prediction.
Note that while the above description has been made regarding the case where the size of a macroblock is 16 × 16 pixels, the present invention is also applicable to the extended macroblock sizes described in Non-Patent Literature 3.
Figure 44 is a diagram illustrating an example of extended macroblock sizes. With Non-Patent Literature 3, the macroblock size is extended up to 32 × 32 pixels.
The upper tier in Figure 44 shows, in order from the left, macroblocks made up of 32 × 32 pixels divided into blocks (partitions) of 32 × 32 pixels, 32 × 16 pixels, 16 × 32 pixels, and 16 × 16 pixels. The middle tier in Figure 44 shows, in order from the left, blocks made up of 16 × 16 pixels divided into blocks (partitions) of 16 × 16 pixels, 16 × 8 pixels, 8 × 16 pixels, and 8 × 8 pixels. The lower tier in Figure 44 shows, in order from the left, blocks made up of 8 × 8 pixels divided into blocks (partitions) of 8 × 8 pixels, 8 × 4 pixels, 4 × 8 pixels, and 4 × 4 pixels.
In other words, a macroblock of 32 × 32 pixels can be processed with the blocks of 32 × 32 pixels, 32 × 16 pixels, 16 × 32 pixels, and 16 × 16 pixels shown in the upper tier in Figure 44.
The block of 16 × 16 pixels shown at the right of the upper tier can be processed with the blocks of 16 × 16 pixels, 16 × 8 pixels, 8 × 16 pixels, and 8 × 8 pixels shown in the middle tier, in the same manner as with the H.264/AVC format.
The block of 8 × 8 pixels shown at the right of the middle tier can be processed with the blocks of 8 × 8 pixels, 8 × 4 pixels, 4 × 8 pixels, and 4 × 4 pixels shown in the lower tier, in the same manner as with the H.264/AVC format.
With the extended macroblock sizes, by employing such a hierarchical structure, larger blocks are defined as a superset of blocks of 16 × 16 pixels or smaller, while maintaining compatibility with the H.264/AVC format.
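This hierarchy can be summarized with a small table-driven sketch (Python; the partition lists are taken directly from the description of Figure 44 above, while the function itself is only an illustrative way of enumerating them).

PARTITIONS = {
    (32, 32): [(32, 32), (32, 16), (16, 32), (16, 16)],  # extended tier
    (16, 16): [(16, 16), (16, 8), (8, 16), (8, 8)],      # as in H.264/AVC
    (8, 8):   [(8, 8), (8, 4), (4, 8), (4, 4)],          # as in H.264/AVC
}

def all_block_sizes(top=(32, 32)):
    # collect every block size reachable from the top-level macroblock
    sizes, stack = set(), [top]
    while stack:
        block = stack.pop()
        for shape in PARTITIONS.get(block, []):
            if shape not in sizes:
                sizes.add(shape)
                stack.append(shape)
    return sorted(sizes, reverse=True)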
The filter coefficient setting, the filter coefficient calculation, and the on/off control of the filtering processing according to the present invention are also applicable to the extended macroblock sizes proposed as described above.
Description has been made so far regarding the case of using the H.264/AVC format as the coding format, but the present invention is not limited to this, and another coding format/decoding format which performs prediction using neighborhood pixels (for example, intra prediction or re prediction) may be used.
Note that the present invention may be applied to picture coding devices and picture decoding devices used when receiving, via network media such as satellite broadcasting, cable television, the Internet, cellular phones, and so forth, image information (bit streams) compressed by orthogonal transform such as discrete cosine transform and by motion compensation, as with MPEG, H.26x, and so forth. The present invention may also be applied to picture coding devices and picture decoding devices used when processing image information on storage media such as optical discs, magnetic disks, and flash memory. Further, the present invention may be applied to motion prediction compensation devices included in such picture coding devices and picture decoding devices.
The above-described series of processing may be executed by hardware, or may be executed by software. In the case of executing the series of processing by software, a program constituting the software is installed in a computer. Here, examples of the computer include a computer built into dedicated hardware, and a general-purpose personal computer capable of executing various functions by having various programs installed therein.
Figure 45 is a block diagram illustrating a configuration example of the hardware of a computer which executes the above-described series of processing by a program.
In the computer, a CPU (central processing unit) 801, ROM (read-only memory) 802, and RAM (random access memory) 803 are mutually connected by a bus 804.
An input/output interface 805 is further connected to the bus 804. An input unit 806, an output unit 807, a storage unit 808, a communication unit 809, and a drive 810 are connected to the input/output interface 805.
The input unit 806 is made up of a keyboard, a mouse, a microphone, and so forth. The output unit 807 is made up of a display, a speaker, and so forth. The storage unit 808 is made up of a hard disk, nonvolatile memory, and so forth. The communication unit 809 is made up of a network interface and so forth. The drive 810 drives a removable medium 811 such as a magnetic disk, an optical disc, a magneto-optical disk, or semiconductor memory.
With the computer configured as described above, the CPU 801 loads, for example, a program stored in the storage unit 808 into the RAM 803 via the input/output interface 805 and the bus 804 and executes the program, whereby the above-described series of processing is performed.
The program executed by the computer (CPU 801) can be provided by being recorded in the removable medium 811 serving as a packaged medium or the like. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
With the computer, the program can be installed in the storage unit 808 via the input/output interface 805 by mounting the removable medium 811 on the drive 810. The program can also be received at the communication unit 809 via a wired or wireless transmission medium and installed in the storage unit 808.
Alternatively, the program can be installed beforehand in the ROM 802 or the storage unit 808.
Note that the program executed by the computer may be a program in which processing is performed in time sequence following the order described in this specification, or may be a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
Embodiments of the present invention are not limited to the above-described embodiments, and various modifications may be made without departing from the essence of the present invention.
For example, the above-described picture coding devices 51, 351, 451, and 551 and picture decoding devices 151, 401, 501, and 601 may be applied to optional electronic devices. Examples thereof will be described below.
Figure 46 is a block diagram illustrating a main configuration example of a television receiver using the picture decoding device to which the present invention is applied.
The television receiver 1300 shown in Figure 46 includes a terrestrial tuner 1313, a video decoder 1315, a video signal processing circuit 1318, a graphics generation circuit 1319, a panel drive circuit 1320, and a display panel 1321.
The terrestrial tuner 1313 receives broadcast wave signals of terrestrial analog broadcasting via an antenna, demodulates the broadcast wave signals, obtains video signals, and supplies these to the video decoder 1315.
The video decoder 1315 subjects the video signals supplied from the terrestrial tuner 1313 to decoding processing, and supplies the obtained digital component signals to the video signal processing circuit 1318.
The video signal processing circuit 1318 subjects the video data supplied from the video decoder 1315 to predetermined processing such as noise removal, and supplies the obtained video data to the graphics generation circuit 1319.
The graphics generation circuit 1319 generates video data of a program to be displayed on the display panel 1321, or image data resulting from processing based on an application supplied via a network, and supplies the generated video data or image data to the panel drive circuit 1320. The graphics generation circuit 1319 also performs processing such as generating video data (graphics) for displaying a screen used by the user for item selection and the like, superimposing it on the video data of the program as appropriate, and supplying the video data thus obtained to the panel drive circuit 1320.
The panel drive circuit 1320 drives the display panel 1321 based on the data supplied from the graphics generation circuit 1319, and displays the video of the program and the above-described various screens on the display panel 1321.
The display panel 1321 is made up of an LCD (liquid crystal display) and so forth, and displays the video of the program and so forth according to the control of the panel drive circuit 1320.
The television receiver 1300 also includes an audio A/D (analog/digital) conversion circuit 1314, an audio signal processing circuit 1322, an echo cancellation/audio synthesizing circuit 1323, an audio amplifier circuit 1324, and a speaker 1325.
The terrestrial tuner 1313 obtains not only video signals but also audio signals by demodulating the received broadcast wave signals. The terrestrial tuner 1313 supplies the obtained audio signals to the audio A/D conversion circuit 1314.
The audio A/D conversion circuit 1314 subjects the audio signals supplied from the terrestrial tuner 1313 to A/D conversion processing, and supplies the obtained digital audio signals to the audio signal processing circuit 1322.
The audio signal processing circuit 1322 subjects the audio data supplied from the audio A/D conversion circuit 1314 to predetermined processing such as noise removal, and supplies the obtained audio data to the echo cancellation/audio synthesizing circuit 1323.
The echo cancellation/audio synthesizing circuit 1323 supplies the audio data supplied from the audio signal processing circuit 1322 to the audio amplifier circuit 1324.
The audio amplifier circuit 1324 subjects the audio data supplied from the echo cancellation/audio synthesizing circuit 1323 to D/A conversion processing and amplifier processing, adjusts it to a predetermined volume, and then outputs the audio from the speaker 1325.
The television receiver 1300 further includes a digital tuner 1316 and an MPEG decoder 1317.
The digital tuner 1316 receives broadcast wave signals of digital broadcasting (terrestrial digital broadcasting, BS (broadcasting satellite)/CS (communications satellite) digital broadcasting) via the antenna, demodulates these to obtain MPEG-TS (Moving Picture Experts Group-Transport Stream), and supplies this to the MPEG decoder 1317.
The MPEG decoder 1317 descrambles the scrambling applied to the MPEG-TS supplied from the digital tuner 1316, and extracts a stream containing the data of the program to be played (viewed). The MPEG decoder 1317 decodes the audio packets making up the extracted stream and supplies the obtained audio data to the audio signal processing circuit 1322, and also decodes the video packets making up the stream and supplies the obtained video data to the video signal processing circuit 1318. The MPEG decoder 1317 further supplies EPG (electronic program guide) data extracted from the MPEG-TS to a CPU 1332 via an unshown path.
The television receiver 1300 uses the above-described picture decoding device 151, 401, 501, or 601 as the MPEG decoder 1317 which decodes the video packets in this way. Accordingly, in the same manner as with the picture decoding devices 151 and 401, the MPEG decoder 1317 switches the filter coefficients according to the quantization parameter and the prediction mode, and performs the filtering processing on the neighborhood pixels before the intra prediction. Alternatively, in the same manner as with the picture decoding devices 501 and 601, the MPEG decoder 1317 controls, based on the on/off flag, whether or not the filtering processing is performed on the neighborhood pixels before the intra prediction. Thus, coding efficiency can be improved.
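As a rough illustration of this coefficient switching, the following sketch looks up 3-tap filter taps keyed by the quantization parameter and the intra prediction mode; the table contents, the fallback taps, and the 3-tap form are assumptions made for the sketch, and the actual coefficients are those set or calculated as described earlier.

import numpy as np

def select_filter_taps(quantization_parameter, intra_mode, coefficient_table):
    # look up the taps for this (quantization parameter, intra prediction mode) pair,
    # falling back to a generic [1, 2, 1] smoothing filter when no entry exists
    return coefficient_table.get((quantization_parameter, intra_mode), (1, 2, 1))

def filter_neighborhood_pixels(neighbors, taps):
    c0, c1, c2 = taps
    total = c0 + c1 + c2
    padded = np.pad(np.asarray(neighbors, dtype=np.int64), 1, mode='edge')
    return (c0 * padded[:-2] + c1 * padded[1:-1] + c2 * padded[2:] + total // 2) // total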
The video data supplied from the MPEG decoder 1317 is subjected to predetermined processing at the video signal processing circuit 1318, in the same manner as with the video data supplied from the video decoder 1315. The video data subjected to the predetermined processing then has generated video data and the like superimposed thereupon as appropriate at the graphics generation circuit 1319, is supplied to the display panel 1321 via the panel drive circuit 1320, and its image is displayed on the display panel 1321.
The audio data supplied from the MPEG decoder 1317 is subjected to predetermined processing at the audio signal processing circuit 1322, in the same manner as with the audio data supplied from the audio A/D conversion circuit 1314. The audio data subjected to the predetermined processing is then supplied to the audio amplifier circuit 1324 via the echo cancellation/audio synthesizing circuit 1323, and is subjected to D/A conversion processing and amplifier processing. As a result, audio adjusted to a predetermined volume is output from the speaker 1325.
The television receiver 1300 also includes a microphone 1326 and an A/D conversion circuit 1327.
The A/D conversion circuit 1327 receives the audio signals of the user collected by the microphone 1326 provided to the television receiver 1300 for audio conversation. The A/D conversion circuit 1327 subjects the received audio signals to A/D conversion processing, and supplies the obtained digital audio data to the echo cancellation/audio synthesizing circuit 1323.
In the case where audio data of the user (user A) of the television receiver 1300 has been supplied from the A/D conversion circuit 1327, the echo cancellation/audio synthesizing circuit 1323 performs echo cancellation with the audio data of the user A taken as a target. After the echo cancellation, the echo cancellation/audio synthesizing circuit 1323 outputs the audio data obtained by synthesizing the audio data of the user A with other audio data and the like, from the speaker 1325 via the audio amplifier circuit 1324.
The television receiver 1300 further includes an audio codec 1328, an internal bus 1329, SDRAM (synchronous dynamic random access memory) 1330, flash memory 1331, a CPU 1332, a USB (universal serial bus) I/F 1333, and a network I/F 1334.
The A/D conversion circuit 1327 receives the audio signals of the user collected by the microphone 1326 provided to the television receiver 1300 for audio conversation. The A/D conversion circuit 1327 subjects the received audio signals to A/D conversion processing, and supplies the obtained digital audio data to the audio codec 1328.
The audio codec 1328 converts the audio data supplied from the A/D conversion circuit 1327 into data of a predetermined format for transmission via a network, and supplies this to the network I/F 1334 via the internal bus 1329.
The network I/F 1334 is connected to the network via a cable mounted on a network terminal 1335. For example, the network I/F 1334 transmits the audio data supplied from the audio codec 1328 to another device connected to that network. Also, for example, the network I/F 1334 receives, via the network terminal 1335, audio data transmitted from another device connected thereto via the network, and supplies this to the audio codec 1328 via the internal bus 1329.
The audio codec 1328 converts the audio data supplied from the network I/F 1334 into data of a predetermined format, and supplies this to the echo cancellation/audio synthesizing circuit 1323.
The echo cancellation/audio synthesizing circuit 1323 performs echo cancellation with the audio data supplied from the audio codec 1328 taken as a target, and outputs the data of the audio obtained by synthesizing that audio data with other audio data and the like, from the speaker 1325 via the audio amplifier circuit 1324.
The SDRAM 1330 stores various types of data necessary for the CPU 1332 to perform processing.
The flash memory 1331 stores the program executed by the CPU 1332. The program stored in the flash memory 1331 is read out by the CPU 1332 at predetermined timing, such as when the television receiver 1300 is activated. EPG data obtained via digital broadcasting, data obtained from a predetermined server via the network, and so forth are also stored in the flash memory 1331.
For example, MPEG-TS including content data obtained from a predetermined server via the network under the control of the CPU 1332 is stored in the flash memory 1331. The flash memory 1331 supplies that MPEG-TS to the MPEG decoder 1317 via the internal bus 1329, for example, under the control of the CPU 1332.
The MPEG decoder 1317 processes that MPEG-TS in the same manner as with the MPEG-TS supplied from the digital tuner 1316. In this way, the television receiver 1300 can receive content data made up of video, audio, and so forth via the network, decode it using the MPEG decoder 1317, display the video, and output the audio.
The television receiver 1300 also includes a light receiving unit 1337 for receiving infrared signals transmitted from a remote controller 1351.
The light receiving unit 1337 receives infrared rays from the remote controller 1351, and outputs a control code representing the content of the user operation, obtained by demodulation, to the CPU 1332.
The CPU 1332 executes the program stored in the flash memory 1331 according to the control code and the like supplied from the light receiving unit 1337, and controls the overall operation of the television receiver 1300. The CPU 1332 and the units of the television receiver 1300 are connected via unshown paths.
The USB I/F 1333 performs transmission/reception of data with external devices of the television receiver 1300 connected via a USB cable mounted on a USB terminal 1336. The network I/F 1334 is connected to the network via the cable mounted on the network terminal 1335, and also performs transmission/reception of data other than audio data with various devices connected to the network.
The television receiver 1300 uses the picture decoding device 151, 401, 501, or 601 as the MPEG decoder 1317, whereby coding efficiency can be improved. As a result, the television receiver 1300 can obtain decoded images with higher precision at higher speed from the broadcast wave signals received via the antenna or from the content data obtained via the network, and display these.
Figure 47 is a block diagram illustrating a main configuration example of a cellular phone using the picture coding device and the picture decoding device to which the present invention is applied.
The cellular phone 1400 shown in Figure 47 includes a main control unit 1450 configured to integrally control the units, a power supply circuit unit 1451, an operation input control unit 1452, an image encoder 1453, a camera I/F unit 1454, an LCD control unit 1455, an image decoder 1456, a multiplexing/separating unit 1457, a record/playback unit 1462, a modulation/demodulation circuit unit 1458, and an audio codec 1459. These are mutually connected via a bus 1460.
The cellular phone 1400 also includes operation keys 1419, a CCD (charge coupled device) camera 1416, a liquid crystal display 1418, a storage unit 1423, a transmission/reception circuit unit 1463, an antenna 1414, a microphone (MIC) 1421, and a speaker 1417.
Upon the end-call and power key being turned on by a user operation, the power supply circuit unit 1451 supplies power to the units from a battery pack, thereby activating the cellular phone 1400 into an operational state.
Based on the control of the main control unit 1450 made up of a CPU, ROM, RAM, and so forth, the cellular phone 1400 performs various operations, such as transmission/reception of audio signals, transmission/reception of e-mail and image data, image shooting, data recording, and so forth, in various modes such as a voice call mode and a data communication mode.
For example, in the voice call mode, the cellular phone 1400 converts the audio signals collected by the microphone (MIC) 1421 into digital audio data with the audio codec 1459, subjects these to spectrum spread processing at the modulation/demodulation circuit unit 1458, and subjects these to digital/analog conversion processing and frequency conversion processing at the transmission/reception circuit unit 1463. The cellular phone 1400 transmits the signal for transmission obtained by that conversion processing to an unshown base station via the antenna 1414. The signal for transmission (audio signal) transmitted to the base station is supplied to the cellular phone of the communication partner via the public telephone network.
Also, for example, in the voice call mode, the cellular phone 1400 amplifies the reception signal received at the antenna 1414 at the transmission/reception circuit unit 1463, further subjects this to frequency conversion processing and analog/digital conversion processing, subjects this to spectrum inverse spread processing at the modulation/demodulation circuit unit 1458, and converts this into an analog audio signal with the audio codec 1459. The cellular phone 1400 outputs the analog audio signal obtained by that conversion from the speaker 1417.
Further, for example, in the case of transmitting e-mail in the data communication mode, the cellular phone 1400 accepts, at the operation input control unit 1452, the text data of the e-mail input by operation of the operation keys 1419. The cellular phone 1400 processes that text data at the main control unit 1450, and displays it as an image on the liquid crystal display 1418 via the LCD control unit 1455.
The cellular phone 1400 also generates e-mail data at the main control unit 1450 based on the text data accepted by the operation input control unit 1452, user instructions, and so forth. The cellular phone 1400 subjects that e-mail data to spectrum spread processing at the modulation/demodulation circuit unit 1458, and subjects it to digital/analog conversion processing and frequency conversion processing at the transmission/reception circuit unit 1463. The cellular phone 1400 transmits the signal for transmission obtained by that conversion processing to an unshown base station via the antenna 1414. The signal for transmission (e-mail) transmitted to the base station is supplied to a predetermined destination via the network, a mail server, and so forth.
Also, for example, in the case of receiving e-mail in the data communication mode, the cellular phone 1400 receives the signal transmitted from the base station via the antenna 1414 with the transmission/reception circuit unit 1463, amplifies it, and further subjects it to frequency conversion processing and analog/digital conversion processing. The cellular phone 1400 subjects that reception signal to spectrum inverse spread processing at the modulation/demodulation circuit unit 1458 to restore the original e-mail data. The cellular phone 1400 displays the restored e-mail data on the liquid crystal display 1418 via the LCD control unit 1455.
Note that the cellular phone 1400 can record (store) the received e-mail data in the storage unit 1423 via the record/playback unit 1462.
The storage unit 1423 is an optional rewritable recording medium. The storage unit 1423 may be semiconductor memory such as RAM or built-in flash memory, may be a hard disk, or may be a removable medium such as a magnetic disk, a magneto-optical disk, an optical disc, USB memory, or a memory card. It goes without saying that the storage unit 1423 may be other than these.
Further, for example, in the case of transmitting image data in the data communication mode, the cellular phone 1400 generates image data by imaging at the CCD camera 1416. The CCD camera 1416 includes optical devices such as a lens and an aperture, and a CCD serving as a photoelectric conversion device; it images a subject, converts the intensity of the received light into an electrical signal, and generates image data of an image of the subject. That image data is compression-encoded at the image encoder 1453, via the camera I/F unit 1454, using a predetermined coding format such as MPEG2 or MPEG4, whereby the image data is converted into encoded image data.
The cellular phone 1400 uses the above-described picture coding device 51, 351, 451, or 551 as the image encoder 1453 which performs such processing. Accordingly, in the same manner as with the picture coding devices 51 and 351, the image encoder 1453 sets the filter coefficients according to the quantization parameter and the prediction mode, and performs the filtering processing on the neighborhood pixels before the intra prediction. Alternatively, in the same manner as with the picture coding devices 451 and 551, the image encoder 1453 controls whether or not the filtering processing is performed on the neighborhood pixels before the intra prediction. Thus, coding efficiency can be improved.
Note that, at the same time, while shooting with the CCD camera 1416, the cellular phone 1400 performs analog/digital conversion, at the audio codec 1459, of the audio collected at the microphone (MIC) 1421, and further encodes it.
The cellular phone 1400 multiplexes, at the multiplexing/separating unit 1457, the encoded image data supplied from the image encoder 1453 and the digital audio data supplied from the audio codec 1459 using a predetermined method. The cellular phone 1400 subjects the multiplexed data obtained as a result to spectrum spread processing at the modulation/demodulation circuit unit 1458, and subjects it to digital/analog conversion processing and frequency conversion processing at the transmission/reception circuit unit 1463. The cellular phone 1400 transmits the signal for transmission obtained by that conversion processing to an unshown base station via the antenna 1414. The signal for transmission (image signal) transmitted to the base station is supplied to the communication partner via the network and so forth.
Note that in the case of not transmitting image data, the cellular phone 1400 can also display the image data generated at the CCD camera 1416 on the liquid crystal display 1418 via the LCD control unit 1455 rather than the image encoder 1453.
Also, for example, in the case of receiving data of a moving image file linked to a simple website or the like in the data communication mode, the cellular phone 1400 receives the signal transmitted from the base station at the transmission/reception circuit unit 1463 via the antenna 1414, amplifies it, and further subjects it to frequency conversion processing and analog/digital conversion processing.
The cellular phone 1400 subjects the received signal to spectrum inverse spread processing at the modulation/demodulation circuit unit 1458 to restore the original multiplexed data. The cellular phone 1400 separates that multiplexed data into encoded image data and audio data at the multiplexing/separating unit 1457.
The cellular phone 1400 decodes the encoded image data at the image decoder 1456 using a decoding format corresponding to a predetermined coding format such as MPEG2 or MPEG4, thereby generating moving image data for playback, and displays this on the liquid crystal display 1418 via the LCD control unit 1455. Thus, for example, the moving image data included in the moving image file linked to the simple website is displayed on the liquid crystal display 1418.
The cellular phone 1400 uses the above-described picture decoding device 151, 401, 501, or 601 as the image decoder 1456 which performs such processing. Accordingly, in the same manner as with the picture decoding devices 151 and 401, the image decoder 1456 switches the filter coefficients according to the quantization parameter and the prediction mode, and performs the filtering processing on the neighborhood pixels before the intra prediction. Alternatively, in the same manner as with the picture decoding devices 501 and 601, the image decoder 1456 controls, based on the on/off flag, whether or not the filtering processing is performed on the neighborhood pixels before the intra prediction. Thus, coding efficiency can be improved.
At this time, simultaneously, the cellular phone 1400 converts the digital audio data into an analog audio signal at the audio codec 1459, and outputs this from the speaker 1417. Thus, for example, the audio data included in the moving image file linked to the simple website is played.
Note that, in the same manner as with e-mail, the cellular phone 1400 can record (store) the received data linked to the simple website or the like in the storage unit 1423 via the record/playback unit 1462.
In addition, the cellular phone 1400 analyzes, at the main control unit 1450, a two-dimensional code obtained by imaging with the CCD camera 1416, whereby the information recorded in the two-dimensional code can be obtained.
Further, the cellular phone 1400 can communicate with an external device using infrared rays at an infrared communication unit 1481.
By using the picture coding device 51, 351, 451, or 551 as the image encoder 1453, the cellular phone 1400 can, for example, improve the coding efficiency of the encoded data generated by encoding the image data generated at the CCD camera 1416. As a result, the cellular phone 1400 can provide encoded data (image data) with excellent coding efficiency to other devices.
Also, by using the picture decoding device 151, 401, 501, or 601 as the image decoder 1456, the cellular phone 1400 can generate prediction images with high precision. As a result, for example, the cellular phone 1400 can obtain decoded images with higher precision from a moving image file linked to a simple website, and display these.
Note that description has been made so far wherein the cellular phone 1400 uses the CCD camera 1416, but the cellular phone 1400 may use an image sensor employing CMOS (complementary metal oxide semiconductor) (a CMOS image sensor) instead of the CCD camera 1416. In this case as well, the cellular phone 1400 can image a subject and generate image data of an image of the subject in the same manner as with the case of using the CCD camera 1416.
Further, while description has been made so far regarding the cellular phone 1400, the picture coding devices 51, 351, 451, and 551 and the picture decoding devices 151, 401, 501, and 601 can be applied, in the same manner as with the cellular phone 1400, to any kind of device having imaging functions and communication functions similar to those of the cellular phone 1400, such as a PDA (personal digital assistant), a smartphone, a UMPC (ultra mobile personal computer), a netbook, or a notebook personal computer.
Figure 48 is a block diagram illustrating a main configuration example of a hard disk recorder using the picture coding device and the picture decoding device to which the present invention is applied.
The hard disk recorder (HDD recorder) 1500 shown in Figure 48 is a device which stores, in a built-in hard disk, the audio data and video data of broadcast programs included in broadcast wave signals (television signals) transmitted from a satellite, a terrestrial antenna, or the like and received by a tuner, and provides the stored data to the user at timing according to the user's instructions.
For example, the hard disk recorder 1500 can extract audio data and video data from the broadcast wave signals, decode these as appropriate, and store these in the built-in hard disk. Also, for example, the hard disk recorder 1500 can obtain audio data and video data from another device via a network, decode these as appropriate, and store these in the built-in hard disk.
Further, for example, the hard disk recorder 1500 decodes the audio data and video data recorded in the built-in hard disk, supplies these to a monitor 1560, and displays the images on the screen of the monitor 1560. The hard disk recorder 1500 can also output the audio from the speaker of the monitor 1560.
For example, the hard disk recorder 1500 decodes audio data and video data extracted from the broadcast wave signals obtained via the tuner, or audio data and video data obtained from another device via the network, supplies these to the monitor 1560, and displays the images on the screen of the monitor 1560. The hard disk recorder 1500 can also output the audio from the speaker of the monitor 1560.
It goes without saying that operations other than these can also be performed.
As shown in Figure 48, the hard disk recorder 1500 includes a receiving unit 1521, a demodulation unit 1522, a demultiplexer 1523, an audio decoder 1524, a video decoder 1525, and a recorder control unit 1526. The hard disk recorder 1500 further includes EPG data memory 1527, program memory 1528, work memory 1529, a display converter 1530, an OSD (on-screen display) control unit 1531, a display control unit 1532, a record/playback unit 1533, a D/A converter 1534, and a communication unit 1535.
The display converter 1530 includes a video encoder 1541. The record/playback unit 1533 includes an encoder 1551 and a decoder 1552.
The receiving unit 1521 receives infrared signals from a remote controller (not shown), converts these into electrical signals, and outputs these to the recorder control unit 1526. The recorder control unit 1526 is made up of, for example, a microprocessor and so forth, and executes various processing according to a program stored in the program memory 1528. At this time, the recorder control unit 1526 uses the work memory 1529 as needed.
The communication unit 1535 is connected to the network, and performs communication processing with another device via the network. For example, the communication unit 1535 is controlled by the recorder control unit 1526 to communicate with a tuner (not shown), and mainly outputs channel selection control signals to the tuner.
The demodulation unit 1522 demodulates the signal supplied from the tuner, and outputs this to the demultiplexer 1523. The demultiplexer 1523 separates the data supplied from the demodulation unit 1522 into audio data, video data, and EPG data, and outputs these to the audio decoder 1524, the video decoder 1525, and the recorder control unit 1526, respectively.
The audio decoder 1524 decodes the input audio data using, for example, the MPEG format, and outputs this to the record/playback unit 1533. The video decoder 1525 decodes the input video data using, for example, the MPEG format, and outputs this to the display converter 1530. The recorder control unit 1526 supplies the input EPG data to the EPG data memory 1527 for storage.
The display converter 1530 encodes, with the video encoder 1541, the video data supplied from the video decoder 1525 or the recorder control unit 1526 into video data conforming to, for example, the NTSC (National Television Standards Committee) format, and outputs this to the record/playback unit 1533. The display converter 1530 also converts the screen size of the video data supplied from the video decoder 1525 or the recorder control unit 1526 into a size corresponding to the size of the monitor 1560. The display converter 1530 further converts the video data whose screen size has been converted into video data conforming to the NTSC format with the video encoder 1541, converts this into an analog signal, and outputs this to the display control unit 1532.
Under the control of the recorder control unit 1526, the display control unit 1532 superimposes the OSD signal output from the OSD (on-screen display) control unit 1531 on the video signal input from the display converter 1530, and outputs this to the display of the monitor 1560 for display.
The audio data output from the audio decoder 1524 is converted into an analog signal with the D/A converter 1534 and supplied to the monitor 1560. The monitor 1560 outputs this audio signal from a built-in speaker.
The record/playback unit 1533 includes a hard disk as a recording medium in which video data, audio data, and so forth are recorded.
For example, the record/playback unit 1533 encodes, with the encoder 1551, the audio data supplied from the audio decoder 1524 using the MPEG format. The record/playback unit 1533 also encodes, with the encoder 1551, the video data supplied from the video encoder 1541 of the display converter 1530 using the MPEG format. The record/playback unit 1533 synthesizes the encoded data of that audio data and the encoded data of that video data with a multiplexer. The record/playback unit 1533 channel-codes and amplifies the synthesized data, and writes that data into the hard disk via a recording head.
The record/playback unit 1533 plays the data recorded in the hard disk via a playback head, amplifies it, and separates it into audio data and video data with a demultiplexer. The record/playback unit 1533 decodes the audio data and video data with the decoder 1552 using the MPEG format. The record/playback unit 1533 performs digital/analog conversion on the decoded audio data, and outputs this to the speaker of the monitor 1560.
The record/playback unit 1533 also performs digital/analog conversion on the decoded video data, and outputs this to the display of the monitor 1560.
The recorder control unit 1526 reads out the latest EPG data from the EPG data memory 1527 based on the user instruction indicated by the infrared signal from the remote controller received via the receiving unit 1521, and supplies this to the OSD control unit 1531. The OSD control unit 1531 generates image data corresponding to the input EPG data, and outputs this to the display control unit 1532. The display control unit 1532 outputs the video data input from the OSD control unit 1531 to the display of the monitor 1560 for display. Thus, an EPG (electronic program guide) is displayed on the display of the monitor 1560.
The hard disk recorder 1500 can also obtain various types of data, such as video data, audio data, EPG data, and so forth, supplied from another device via a network such as the Internet.
The communication unit 1535 is controlled by the recorder control unit 1526 to obtain encoded data of video data, audio data, EPG data, and so forth transmitted from another device via the network, and supplies this to the recorder control unit 1526. For example, the recorder control unit 1526 supplies the obtained encoded data of video data and audio data to the record/playback unit 1533 and stores it in the hard disk. At this time, the recorder control unit 1526 and the record/playback unit 1533 may perform processing such as re-encoding as needed.
The recorder control unit 1526 also decodes the obtained encoded data of video data and audio data, and supplies the obtained video data to the display converter 1530. In the same manner as with the video data supplied from the video decoder 1525, the display converter 1530 processes the video data supplied from the recorder control unit 1526, supplies this to the monitor 1560 via the display control unit 1532, and displays its image.
Alternatively, an arrangement may be made wherein, in accordance with this image display, the recorder control unit 1526 supplies the decoded audio data to the monitor 1560 via the D/A converter 1534 and outputs its audio from the speaker.
Further, the recorder control unit 1526 decodes the encoded data of the obtained EPG data, and supplies the decoded EPG data to the EPG data memory 1527.
The hard disk recorder 1500 thus configured uses the picture decoding device 151, 401, 501, or 601 as the video decoder 1525, the decoder 1552, and the decoder housed in the recorder control unit 1526. Accordingly, in the same manner as with the picture decoding devices 151 and 401, the video decoder 1525, the decoder 1552, and the decoder housed in the recorder control unit 1526 switch the filter coefficients according to the quantization parameter and the prediction mode, and perform the filtering processing on the neighborhood pixels before the intra prediction. Alternatively, in the same manner as with the picture decoding devices 501 and 601, the video decoder 1525, the decoder 1552, and the decoder housed in the recorder control unit 1526 control, based on the on/off flag, whether or not the filtering processing is performed on the neighborhood pixels before the intra prediction. Thus, coding efficiency can be improved.
Accordingly, the hard disk recorder 1500 can generate prediction images with high precision. As a result, the hard disk recorder 1500 can obtain decoded images with higher precision from, for example, encoded data of video data received via the tuner, encoded data of video data read out from the hard disk of the record/playback unit 1533, or encoded data of video data obtained via the network, and display these on the monitor 1560.
The hard disk recorder 1500 also uses the picture coding device 51, 351, 451, or 551 as the encoder 1551. Accordingly, in the same manner as with the picture coding devices 51 and 351, the encoder 1551 sets the filter coefficients according to the quantization parameter and the prediction mode, and performs the filtering processing on the neighborhood pixels before the intra prediction. Alternatively, in the same manner as with the picture coding devices 451 and 551, the encoder 1551 controls whether or not the filtering processing is performed on the neighborhood pixels before the intra prediction. Thus, coding efficiency can be improved.
Thus, for example, the hard disk recorder 1500 can improve the coding efficiency of the encoded data recorded in the hard disk. As a result, the hard disk recorder 1500 can use the storage region of the hard disk more efficiently.
Note that description has been made so far regarding the hard disk recorder 1500 which records video data and audio data in a hard disk, but it goes without saying that any kind of recording medium may be employed. For example, even with a recorder employing a recording medium other than a hard disk, such as flash memory, an optical disc, or video tape, the picture coding devices 51, 351, 451, and 551 and the picture decoding devices 151, 401, 501, and 601 can be applied thereto in the same manner as with the case of the above-described hard disk recorder 1500.
Figure 49 is a block diagram illustrating a main configuration example of a camera using the picture coding device and the picture decoding device to which the present invention is applied.
The camera 1600 shown in Figure 49 images a subject, displays the image of the subject on an LCD 1616, and records it in a recording medium 1633 as image data.
A lens block 1611 inputs light (that is, video of the subject) to a CCD/CMOS 1612. The CCD/CMOS 1612 is an image sensor employing a CCD or CMOS, converts the intensity of the received light into an electrical signal, and supplies this to a camera signal processing unit 1613.
The camera signal processing unit 1613 converts the electrical signal supplied from the CCD/CMOS 1612 into Y, Cr, and Cb color difference signals, and supplies these to an image signal processing unit 1614. Under the control of a controller 1621, the image signal processing unit 1614 subjects the image signals supplied from the camera signal processing unit 1613 to predetermined image processing, or encodes the image signals with an encoder 1641 using, for example, the MPEG format. The image signal processing unit 1614 supplies the encoded data generated by encoding the image signals to a decoder 1615. Further, the image signal processing unit 1614 obtains data for display generated at an on-screen display (OSD) 1620, and supplies this to the decoder 1615.
In the above-described processing, the camera signal processing unit 1613 utilizes DRAM (dynamic random access memory) 1618 connected via a bus 1617 as appropriate, and holds image data, encoded data obtained by encoding that image data, and so forth in that DRAM 1618.
The decoder 1615 decodes the encoded data supplied from the image signal processing unit 1614, and supplies the obtained image data (decoded image data) to the LCD 1616. The decoder 1615 also supplies the data for display supplied from the image signal processing unit 1614 to the LCD 1616. The LCD 1616 synthesizes the image of the decoded image data and the image of the data for display supplied from the decoder 1615 as appropriate, and displays the synthesized image.
Under the control of the controller 1621, the on-screen display 1620 outputs data for display, such as a menu screen or icons made up of symbols, characters, or figures, to the image signal processing unit 1614 via the bus 1617.
Based on signals indicating the content of commands issued by the user using an operating unit 1622, the controller 1621 executes various processing, and also controls the image signal processing unit 1614, the DRAM 1618, an external interface 1619, the on-screen display 1620, a media drive 1623, and so forth via the bus 1617. Programs, data, and so forth necessary for the controller 1621 to execute the various processing are stored in flash ROM 1624.
For example, the controller 1621 can encode the image data stored in the DRAM 1618, or decode the encoded data stored in the DRAM 1618, instead of the image signal processing unit 1614 or the decoder 1615. At this time, the controller 1621 may perform encoding/decoding processing using the same format as the encoding/decoding format of the image signal processing unit 1614 and the decoder 1615, or may perform encoding/decoding processing using a format which neither the image signal processing unit 1614 nor the decoder 1615 can handle.
Also, for example, in the case where the start of image printing has been instructed from the operating unit 1622, the controller 1621 reads out image data from the DRAM 1618, and supplies this, via the bus 1617, to a printer 1634 connected to the external interface 1619 for printing.
Further, for example, in the case where image recording has been instructed from the operating unit 1622, the controller 1621 reads out encoded data from the DRAM 1618, and supplies this, via the bus 1617, to the recording medium 1633 mounted on the media drive 1623 for storage.
The recording medium 1633 is an optional readable/writable removable medium, for example, a magnetic disk, a magneto-optical disk, an optical disc, or semiconductor memory. It goes without saying that the type of removable medium is also optional, and accordingly it may be a tape device, a disk, or a memory card. It goes without saying that it may also be a non-contact IC card or the like.
Alternatively, the media drive 1623 and the recording medium 1633 may be configured to be integrated into a non-portable recording medium, for example, a built-in hard disk drive, an SSD (solid state drive), or the like.
The external interface 1619 is made up of, for example, a USB input/output terminal and so forth, and is connected to the printer 1634 in the case of performing printing of images. A drive 1631 is also connected to the external interface 1619 as needed, a removable medium 1632 such as a magnetic disk, an optical disc, or a magneto-optical disk is mounted as appropriate, and a computer program read out therefrom is installed in the flash ROM 1624 as needed.
Further, the external interface 1619 includes a network interface connected to a predetermined network such as a LAN or the Internet. For example, according to instructions from the operating unit 1622, the controller 1621 can read out encoded data from the DRAM 1618 and supply this from the external interface 1619 to another device connected via the network. The controller 1621 can also obtain, via the external interface 1619, encoded data or image data supplied from another device via the network, and hold this in the DRAM 1618 or supply this to the image signal processing unit 1614.
The camera 1600 thus configured uses the picture decoding device 151, 401, 501, or 601 as the decoder 1615.
Accordingly, in the same manner as with the picture decoding devices 151 and 401, the decoder 1615 switches the filter coefficients according to the quantization parameter and the prediction mode, and performs the filtering processing on the neighborhood pixels before the intra prediction. Alternatively, in the same manner as with the picture decoding devices 501 and 601, the decoder 1615 controls, based on the on/off flag, whether or not the filtering processing is performed on the neighborhood pixels before the intra prediction. Thus, coding efficiency can be improved.
Accordingly, the camera 1600 can generate prediction images with high precision. As a result, for example, the camera 1600 can obtain decoded images with higher precision from the image data generated at the CCD/CMOS 1612, the encoded data of video data read out from the DRAM 1618 or the recording medium 1633, or the encoded data of video data obtained via the network, and display these on the LCD 1616.
The camera 1600 also uses the picture coding device 51, 351, 451, or 551 as the encoder 1641.
Accordingly, in the same manner as with the picture coding devices 51 and 351, the encoder 1641 sets the filter coefficients according to the quantization parameter and the prediction mode, and performs the filtering processing on the neighborhood pixels before the intra prediction. Alternatively, in the same manner as with the picture coding devices 451 and 551, the encoder 1641 controls whether or not the filtering processing is performed on the neighborhood pixels before the intra prediction. Thus, coding efficiency can be improved.
Thus, for example, the camera 1600 can improve the coding efficiency of the encoded data recorded in the DRAM 1618 or the recording medium 1633. As a result, the camera 1600 can use the storage region of the DRAM 1618 or the recording medium 1633 more efficiently.
Note that the decoding method of the picture decoding devices 151, 401, 501, and 601 may be applied to the decoding processing performed by the controller 1621. Similarly, the coding method of the picture coding devices 51, 351, 451, and 551 may be applied to the encoding processing performed by the controller 1621.
Further, the image data imaged by the camera 1600 may be moving images or may be still images.
It goes without saying that the picture coding devices 51, 351, 451, and 551 and the picture decoding devices 151, 401, 501, and 601 may also be applied to devices or systems other than the above-described devices.
Reference signs list
51 picture coding device
66 lossless coding unit
74 intraprediction unit
75 neighborhood pixels interpolation filtering switch unit
81 neighborhood pixels setting unit
82 prediction image generating unit
83 optimum prediction mode determination unit
91 prediction mode buffer
92 quantization parameter buffer
93 low-pass filter setting unit
94 filter coefficient memory
111 neighborhood pixels setting unit
112 prediction image generating unit
113 optimum prediction mode determination unit
121 prediction mode buffer
122 optimum filter calculating unit
151 picture decoding device
162 lossless decoding unit
171 intraprediction unit
172 neighborhood pixels interpolation filtering switch unit
181 prediction image generating unit
182 neighborhood pixels setting unit
191 prediction mode buffer
192 quantization parameter buffer
193 low-pass filter setting unit
194 filter coefficient memory
202 low-pass filter setting unit
251 learning device
261 neighborhood pixels interpolation filter calculating unit
271 filter coefficient storage unit
351 picture coding device
361 re prediction unit
362 neighborhood pixels interpolation filtering switch unit
401 picture decoding device
411 re prediction unit
412 neighborhood pixels interpolation filtering switch unit
451 picture coding device
461 neighborhood pixels interpolation filtering control unit
501 picture decoding device
511 neighborhood pixels interpolation filtering control unit
551 picture coding device
561 neighborhood pixels interpolation filtering control unit
601 picture decoding device
611 neighborhood pixels interpolation filtering control unit