Instance Normalization

Instance Normalization is a normalization technique commonly used in tasks like style transfer and image generation. It normalizes the features of each sample independently per channel, using statistics computed over the spatial dimensions only.

Instance normalization is applied per instance and per channel. For an input $x$ of shape $(N, C, H, W)$, the mean and variance are computed over the spatial dimensions of each (sample, channel) pair:

$$\mu_{nc} = \frac{1}{HW} \sum_{h=1}^{H} \sum_{w=1}^{W} x_{nchw}, \qquad \sigma_{nc}^2 = \frac{1}{HW} \sum_{h=1}^{H} \sum_{w=1}^{W} \left(x_{nchw} - \mu_{nc}\right)^2$$

and each feature is normalized as

$$\hat{x}_{nchw} = \frac{x_{nchw} - \mu_{nc}}{\sqrt{\sigma_{nc}^2 + \epsilon}}$$

Then optionally apply learned affine parameters:

$$y_{nchw} = \gamma_c \, \hat{x}_{nchw} + \beta_c$$

Where:

  • $\epsilon$ is a small number to avoid division by zero.
  • $\gamma$, $\beta$ are learnable scale and shift parameters (per channel).
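As a sketch, the per-channel statistics can be computed by hand and checked against PyTorch's `nn.InstanceNorm2d` (assuming its documented default `eps=1e-5`):

```python
import torch
import torch.nn as nn

x = torch.randn(2, 3, 4, 4)  # (batch, channel, height, width)

# Manual instance norm: statistics over the spatial dims of each (sample, channel)
mean = x.mean(dim=(2, 3), keepdim=True)                # shape (2, 3, 1, 1)
var = x.var(dim=(2, 3), unbiased=False, keepdim=True)  # biased variance, as in the layer
x_hat = (x - mean) / torch.sqrt(var + 1e-5)

# Built-in layer (affine=False by default, eps=1e-5)
ref = nn.InstanceNorm2d(num_features=3)(x)

print(torch.allclose(x_hat, ref, atol=1e-5))  # True
```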

Key differences from other normalizations:

  • Compared to BatchNorm, instance norm doesn’t compute statistics across the batch — each sample is normalized using only its own spatial statistics, so its behavior is the same at training and inference time.
  • It improves style invariance and is often used in generative networks.
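The contrast with BatchNorm can be seen in which axes each layer averages over — a quick sketch with an illustrative input shape:

```python
import torch

x = torch.randn(8, 3, 32, 32)  # (batch, channel, height, width)

# BatchNorm2d uses one mean/var per channel, shared across the whole batch
bn_mean = x.mean(dim=(0, 2, 3))  # shape (3,)

# InstanceNorm2d uses one mean/var per (sample, channel) pair
in_mean = x.mean(dim=(2, 3))     # shape (8, 3)

print(tuple(bn_mean.shape), tuple(in_mean.shape))  # (3,) (8, 3)
```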
# PyTorch example
import torch
import torch.nn as nn

norm = nn.InstanceNorm2d(num_features=3)  # for 3-channel images; affine=False by default
out = norm(torch.randn(8, 3, 32, 32))     # input: (batch, channel, height, width)
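If the learnable affine parameters γ and β are wanted, pass `affine=True` — a minimal sketch:

```python
import torch
import torch.nn as nn

norm = nn.InstanceNorm2d(num_features=3, affine=True)
out = norm(torch.randn(8, 3, 32, 32))

# γ (weight) and β (bias) are one scalar per channel
print(tuple(norm.weight.shape), tuple(norm.bias.shape))  # (3,) (3,)
```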