
torch.Tensor.scatter_

Tensor.scatter_(dim, index, src, *, reduce=None) → Tensor

Writes all values from the tensor src into self at the indices specified in the index tensor. For each value in src, its output index is specified by its index in src for dimension != dim and by the corresponding value in index for dimension = dim.

For a 3-D tensor, self is updated as:

self[index[i][j][k]][j][k] = src[i][j][k]  # if dim == 0
self[i][index[i][j][k]][k] = src[i][j][k]  # if dim == 1
self[i][j][index[i][j][k]] = src[i][j][k]  # if dim == 2

This is the reverse operation of the manner described in gather().
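To illustrate this inverse relationship, the following sketch (with made-up example values) gathers elements along dim=1 and then scatters them back to the positions the same index tensor points at:

```python
import torch

# gather() reads src[i][index[i][j]] along dim=1; scatter_() writes those
# values back to the locations the index selects. Example values only.
src = torch.tensor([[10, 20, 30],
                    [40, 50, 60]])
index = torch.tensor([[2, 0],
                      [1, 2]])
gathered = torch.gather(src, 1, index)
restored = torch.zeros_like(src).scatter_(1, index, gathered)
print(gathered)  # the values pulled out of src
print(restored)  # those values written back at the indexed positions
```

Positions never touched by index stay at their initial value (zero here), which is why `restored` is not identical to `src`.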

It is also required that index.size(d) <= src.size(d) for all dimensions d, and that index.size(d) <= self.size(d) for all dimensions d != dim. Note that index and src do not broadcast against each other.

Moreover, as for gather(), the values of index must be between 0 and self.size(dim) - 1 inclusive.

Warning

When indices are not unique, the behavior is non-deterministic (one of the values from src will be picked arbitrarily) and the gradient will be incorrect (it will be propagated to all locations in the source that correspond to the same index)!

Note

The backward pass is implemented only for src.shape == index.shape.
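As a minimal sketch (example shapes and values) of the supported case, gradients flow to src when its shape matches index; with unique indices, each src element is written exactly once, so each receives a gradient of 1 from a sum:

```python
import torch

# src.shape == index.shape, so the backward pass is supported.
# Indices within each row are unique, so the gradient is well defined.
src = torch.ones(2, 3, requires_grad=True)
index = torch.tensor([[0, 1, 2],
                      [2, 0, 1]])
out = torch.zeros(2, 3).scatter_(1, index, src)
out.sum().backward()
print(src.grad)  # each src element contributed once to the sum
```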

Additionally accepts an optional reduce argument that allows specification of an optional reduction operation, which is applied to all values from the tensor src as they are written into self at the indices specified in index. For each value in src, the reduction operation is applied to an index in self which is specified by its index in src for dimension != dim and by the corresponding value in index for dimension = dim.

Given a 3-D tensor and reduction using the multiplication operation, self is updated as:

self[index[i][j][k]][j][k] *= src[i][j][k]  # if dim == 0
self[i][index[i][j][k]][k] *= src[i][j][k]  # if dim == 1
self[i][j][index[i][j][k]] *= src[i][j][k]  # if dim == 2

Reducing with the addition operation is the same as using scatter_add_().
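A small sketch (example values) demonstrating that equivalence; note that the reduce= form with a Tensor src is deprecated, as the warning below explains:

```python
import torch

# reduce='add' accumulates duplicate indices instead of overwriting,
# matching scatter_add_. Example values only; the reduce= spelling with a
# Tensor src is deprecated and may emit a warning.
index = torch.tensor([[0, 1, 0, 1],
                      [3, 2, 3, 2]])
src = torch.arange(1., 9.).reshape(2, 4)
a = torch.ones(2, 4).scatter_(1, index, src, reduce='add')
b = torch.ones(2, 4).scatter_add_(1, index, src)
print(a)  # duplicate column indices accumulate into the same slot
```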

Warning

The reduce argument with Tensor src is deprecated and will be removed in a future PyTorch release. Please use scatter_reduce_() instead for more reduction options.

Parameters
  • dim (int) – the axis along which to index

  • index (LongTensor) – the indices of elements to scatter, can be either empty or of the same dimensionality as src. When empty, the operation returns self unchanged.

  • src (Tensor) – the source element(s) to scatter.

Keyword Arguments

reduce (str, optional) – reduction operation to apply, can be either 'add' or 'multiply'.

Example:

>>> src = torch.arange(1, 11).reshape((2, 5))
>>> src
tensor([[ 1,  2,  3,  4,  5],
        [ 6,  7,  8,  9, 10]])
>>> index = torch.tensor([[0, 1, 2, 0]])
>>> torch.zeros(3, 5, dtype=src.dtype).scatter_(0, index, src)
tensor([[1, 0, 0, 4, 0],
        [0, 2, 0, 0, 0],
        [0, 0, 3, 0, 0]])
>>> index = torch.tensor([[0, 1, 2], [0, 1, 4]])
>>> torch.zeros(3, 5, dtype=src.dtype).scatter_(1, index, src)
tensor([[1, 2, 3, 0, 0],
        [6, 7, 0, 0, 8],
        [0, 0, 0, 0, 0]])
>>> torch.full((2, 4), 2.).scatter_(1, torch.tensor([[2], [3]]),
...                                 1.23, reduce='multiply')
tensor([[2.0000, 2.0000, 2.4600, 2.0000],
        [2.0000, 2.0000, 2.0000, 2.4600]])
>>> torch.full((2, 4), 2.).scatter_(1, torch.tensor([[2], [3]]),
...                                 1.23, reduce='add')
tensor([[2.0000, 2.0000, 3.2300, 2.0000],
        [2.0000, 2.0000, 2.0000, 3.2300]])
scatter_(dim, index, value, *, reduce=None) → Tensor

Writes the value from value into self at the indices specified in the index tensor. This operation is equivalent to the previous version, with the src tensor filled entirely with value.

Parameters
  • dim (int) – the axis along which to index

  • index (LongTensor) – the indices of elements to scatter, can be either empty or of the same dimensionality as src. When empty, the operation returns self unchanged.

  • value (Scalar) – the value to scatter.

Keyword Arguments

reduce (str, optional) – reduction operation to apply, can be either 'add' or 'multiply'.

Example:

>>> index = torch.tensor([[0, 1]])
>>> value = 2
>>> torch.zeros(3, 5).scatter_(0, index, value)
tensor([[2., 0., 0., 0., 0.],
        [0., 2., 0., 0., 0.],
        [0., 0., 0., 0., 0.]])