torch.Tensor.scatter_
- Tensor.scatter_(dim, index, src, *, reduce=None) → Tensor
Writes all values from the tensor src into self at the indices specified in the index tensor. For each value in src, its output index is specified by its index in src for dimension != dim and by the corresponding value in index for dimension = dim.

For a 3-D tensor, self is updated as:

self[index[i][j][k]][j][k] = src[i][j][k]  # if dim == 0
self[i][index[i][j][k]][k] = src[i][j][k]  # if dim == 1
self[i][j][index[i][j][k]] = src[i][j][k]  # if dim == 2
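To make the update rule concrete, here is a minimal pure-Python sketch of the dim == 0 case on nested lists (no torch dependency; scatter_dim0 is a hypothetical helper name, not a torch API):

```python
# Sketch of the dim == 0 rule for a 2-D tensor:
#   self[index[i][j]][j] = src[i][j]
# scatter_dim0 is a hypothetical illustration helper.

def scatter_dim0(self_rows, index, src):
    """Apply self[index[i][j]][j] = src[i][j] in place."""
    for i in range(len(index)):
        for j in range(len(index[i])):
            self_rows[index[i][j]][j] = src[i][j]
    return self_rows

# Mirrors the first REPL example on this page: a (3, 5) destination
# of zeros, src = [[1, 2, 3, 4, 5]], index = [[0, 1, 2, 0]].
out = scatter_dim0(
    [[0] * 5 for _ in range(3)],
    [[0, 1, 2, 0]],
    [[1, 2, 3, 4, 5]],
)
print(out)  # [[1, 0, 0, 4, 0], [0, 2, 0, 0, 0], [0, 0, 3, 0, 0]]
```

Note that only index.size(1) == 4 columns of src are consumed, which is why src's fifth element never appears in the output.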
This is the reverse operation of the manner described in gather().

It is also required that index.size(d) <= src.size(d) for all dimensions d, and that index.size(d) <= self.size(d) for all dimensions d != dim. Note that input and index do not broadcast against each other on NPUs, so when running on NPUs, input and index must have the same number of dimensions; standard broadcasting occurs in all other cases. Moreover, as for gather(), the values of index must be between 0 and self.size(dim) - 1 inclusive.

Warning
When indices are not unique, the behavior is non-deterministic (one of the values from src will be picked arbitrarily) and the gradient will be incorrect (it will be propagated to all locations in the source that correspond to the same index)!

Note
The backward pass is implemented only for src.shape == index.shape.

Additionally accepts an optional reduce argument that allows specification of an optional reduction operation, which is applied to all values in the tensor src into self at the indices specified in the index tensor. For each value in src, the reduction operation is applied to an index in self which is specified by its index in src for dimension != dim and by the corresponding value in index for dimension = dim.

Given a 3-D tensor and reduction using the multiplication operation, self is updated as:

self[index[i][j][k]][j][k] *= src[i][j][k]  # if dim == 0
self[i][index[i][j][k]][k] *= src[i][j][k]  # if dim == 1
self[i][j][index[i][j][k]] *= src[i][j][k]  # if dim == 2
Reducing with the addition operation is the same as using scatter_add_().

Warning

The reduce argument with Tensor src is deprecated and will be removed in a future PyTorch release. Please use scatter_reduce_() instead for more reduction options.

- Parameters
dim (int) – the axis along which to index
index (LongTensor) – the indices of elements to scatter; can be either empty or of the same dimensionality as src. When empty, the operation returns self unchanged.
src (Tensor) – the source element(s) to scatter.
- Keyword Arguments
reduce (str, optional) – reduction operation to apply, can be either 'add' or 'multiply'.
Example:
>>> src = torch.arange(1, 11).reshape((2, 5))
>>> src
tensor([[ 1,  2,  3,  4,  5],
        [ 6,  7,  8,  9, 10]])
>>> index = torch.tensor([[0, 1, 2, 0]])
>>> torch.zeros(3, 5, dtype=src.dtype).scatter_(0, index, src)
tensor([[1, 0, 0, 4, 0],
        [0, 2, 0, 0, 0],
        [0, 0, 3, 0, 0]])
>>> index = torch.tensor([[0, 1, 2], [0, 1, 4]])
>>> torch.zeros(3, 5, dtype=src.dtype).scatter_(1, index, src)
tensor([[1, 2, 3, 0, 0],
        [6, 7, 0, 0, 8],
        [0, 0, 0, 0, 0]])
>>> torch.full((2, 4), 2.).scatter_(1, torch.tensor([[2], [3]]),
...            1.23, reduce='multiply')
tensor([[2.0000, 2.0000, 2.4600, 2.0000],
        [2.0000, 2.0000, 2.0000, 2.4600]])
>>> torch.full((2, 4), 2.).scatter_(1, torch.tensor([[2], [3]]),
...            1.23, reduce='add')
tensor([[2.0000, 2.0000, 3.2300, 2.0000],
        [2.0000, 2.0000, 2.0000, 3.2300]])
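The reduce='multiply' and reduce='add' calls in the example can be reproduced with an explicit loop. Below is a minimal pure-Python sketch for dim == 1 with a scalar source value (scatter_reduce_dim1 is a hypothetical helper for illustration, not a torch API):

```python
# Sketch of self[i][index[i][j]] OP= value for dim == 1, where the
# scalar value stands in for a src tensor filled with that value.
# scatter_reduce_dim1 is a hypothetical illustration helper.

def scatter_reduce_dim1(self_rows, index, value, reduce):
    for i in range(len(index)):
        for j in range(len(index[i])):
            col = index[i][j]
            if reduce == "multiply":
                self_rows[i][col] *= value
            elif reduce == "add":
                self_rows[i][col] += value
            else:
                raise ValueError("reduce must be 'add' or 'multiply'")
    return self_rows

# Mirrors torch.full((2, 4), 2.).scatter_(1, [[2], [3]], 1.23,
# reduce='multiply') from the example above.
base = [[2.0] * 4 for _ in range(2)]
out = scatter_reduce_dim1(base, [[2], [3]], 1.23, "multiply")
print(out)  # [[2.0, 2.0, 2.46, 2.0], [2.0, 2.0, 2.0, 2.46]]
```

Every destination element not named in index keeps its original value, which is why the untouched entries stay at 2.0.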
- scatter_(dim, index, value, *, reduce=None) → Tensor
Writes the value from value into self at the indices specified in the index tensor. This operation is equivalent to the previous version, with the src tensor filled entirely with value.

- Parameters
dim (int) – the axis along which to index
index (LongTensor) – the indices of elements to scatter; can be either empty or of the same dimensionality as src. When empty, the operation returns self unchanged.
value (Scalar) – the value to scatter.
- Keyword Arguments
reduce (str, optional) – reduction operation to apply, can be either 'add' or 'multiply'.
Example:
>>> index = torch.tensor([[0, 1]])
>>> value = 2
>>> torch.zeros(3, 5).scatter_(0, index, value)
tensor([[2., 0., 0., 0., 0.],
        [0., 2., 0., 0., 0.],
        [0., 0., 0., 0., 0.]])
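As stated above, the scalar form behaves exactly like the tensor form with src filled entirely with value. A minimal pure-Python sketch of the dim == 0 case (scatter_value_dim0 is a hypothetical name, not a torch API):

```python
# Sketch of self[index[i][j]][j] = value for dim == 0, i.e. the
# tensor form with src filled entirely with value.
# scatter_value_dim0 is a hypothetical illustration helper.

def scatter_value_dim0(self_rows, index, value):
    for i in range(len(index)):
        for j in range(len(index[i])):
            self_rows[index[i][j]][j] = value
    return self_rows

# Mirrors the REPL example above: a (3, 5) destination of zeros,
# index = [[0, 1]], value = 2.
out = scatter_value_dim0([[0.0] * 5 for _ in range(3)], [[0, 1]], 2)
print(out[0])  # [2, 0.0, 0.0, 0.0, 0.0]
print(out[1])  # [0.0, 2, 0.0, 0.0, 0.0]
```

Because index has shape (1, 2), only columns 0 and 1 are written; the third row is never touched.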