
GroupNormalization layer

GroupNormalization class

keras.layers.GroupNormalization(
    groups=32,
    axis=-1,
    epsilon=0.001,
    center=True,
    scale=True,
    beta_initializer="zeros",
    gamma_initializer="ones",
    beta_regularizer=None,
    gamma_regularizer=None,
    beta_constraint=None,
    gamma_constraint=None,
    **kwargs
)

Group normalization layer.

Group Normalization divides the channels into groups and computes within each group the mean and variance for normalization. Empirically, its accuracy is more stable than batch norm across a wide range of small batch sizes, provided the learning rate is adjusted linearly with the batch size.
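
For example, the following minimal sketch (shapes and values are illustrative, not from the docs) applies the layer to a random feature tensor:

import numpy as np
import keras

# 64 channels split into 32 groups of 2 channels each (64 is divisible by 32).
x = np.random.normal(size=(2, 8, 64)).astype("float32")  # (batch, steps, channels)
layer = keras.layers.GroupNormalization(groups=32)
y = layer(x)
print(y.shape)  # (2, 8, 64): same shape as the input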

Relation to Layer Normalization: If the number of groups is set to 1, then this operation becomes nearly identical to Layer Normalization (see the Layer Normalization docs for details).
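
A quick numerical check of this relation (an illustrative sketch, not from the docs; the two outputs agree up to small numerical differences):

import numpy as np
import keras

x = np.random.normal(size=(4, 16)).astype("float32")
gn = keras.layers.GroupNormalization(groups=1, axis=-1)
ln = keras.layers.LayerNormalization(axis=-1)
g = keras.ops.convert_to_numpy(gn(x))
l = keras.ops.convert_to_numpy(ln(x))
print(np.abs(g - l).max())  # close to 0 (both layers default to epsilon=1e-3)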

Relation to Instance Normalization: If the number of groups is set to the input dimension (the number of groups is equal to the number of channels), then this operation becomes identical to Instance Normalization. You can achieve this via groups=-1.
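
As an illustrative sketch (the input shape is an assumption, not from the docs), with groups=-1 each channel of an image batch is normalized over its own spatial positions:

import numpy as np
import keras

x = np.random.normal(size=(2, 32, 32, 16)).astype("float32")  # NHWC image batch
inorm = keras.layers.GroupNormalization(groups=-1, axis=-1)  # groups == channels
y = keras.ops.convert_to_numpy(inorm(x))
print(y.mean(axis=(1, 2))[0, :3])  # per-channel means over H and W, close to 0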

Arguments

  • groups: Integer, the number of groups for Group Normalization. Can be in the range [1, N] where N is the input dimension. The input dimension must be divisible by the number of groups. Defaults to 32.
  • axis: Integer or List/Tuple. The axis or axes to normalize across. Typically, this is the features axis/axes. The left-out axes are typically the batch axis/axes. -1 is the last dimension in the input. Defaults to -1.
  • epsilon: Small float added to variance to avoid dividing by zero. Defaults to 1e-3.
  • center: If True, add offset of beta to normalized tensor. If False, beta is ignored. Defaults to True.
  • scale: If True, multiply by gamma. If False, gamma is not used. When the next layer is linear (also e.g. relu), this can be disabled since the scaling will be done by the next layer. Defaults to True.
  • beta_initializer: Initializer for the beta weight. Defaults to zeros.
  • gamma_initializer: Initializer for the gamma weight. Defaults to ones.
  • beta_regularizer: Optional regularizer for the beta weight. None by default.
  • gamma_regularizer: Optional regularizer for the gamma weight. None by default.
  • beta_constraint: Optional constraint for the beta weight. None by default.
  • gamma_constraint: Optional constraint for the gamma weight. None by default.
  • **kwargs: Base layer keyword arguments (e.g. name and dtype).

Input shape

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape

Same shape as input.
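
Putting the arguments together, here is a hedged sketch (the layer sizes and the regularizer choice are assumptions, not from the docs) of the layer inside a small model:

import keras

model = keras.Sequential([
    keras.layers.Input(shape=(28, 28, 64)),
    keras.layers.GroupNormalization(
        groups=16,  # the 64 input channels must be divisible by the group count
        epsilon=1e-3,
        beta_regularizer=keras.regularizers.L2(1e-4),
    ),
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(10),
])
model.summary()  # the normalization layer preserves its input shape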

Reference

  • Yuxin Wu & Kaiming He, 2018. Group Normalization. (https://arxiv.org/abs/1803.08494)
