pyiqa.losses

Package Contents

Classes

CharbonnierLoss: Charbonnier loss (a robust, differentiable variant of L1Loss).
L1Loss: L1 (mean absolute error, MAE) loss.
MSELoss: MSE (L2) loss.
WeightedTVLoss: Weighted TV loss.
EMDLoss: EMD (earth mover distance) loss.
PLCCLoss: PLCC loss, derived from Pearson's linear correlation coefficient.
NiNLoss: NiN (Norm in Norm) loss.

class pyiqa.losses.CharbonnierLoss(loss_weight=1.0, reduction='mean', eps=1e-12)[source]

Bases: torch.nn.Module

Charbonnier loss, a robust and differentiable variant of L1Loss.

Described in “Deep Laplacian Pyramid Networks for Fast and Accurate Super-Resolution”.

Args:

loss_weight (float): Loss weight for Charbonnier loss. Default: 1.0.
reduction (str): Specifies the reduction to apply to the output. Supported choices are ‘none’ | ‘mean’ | ‘sum’. Default: ‘mean’.
eps (float): A value used to control the curvature near zero. Default: 1e-12.

forward(pred, target, weight=None, **kwargs)[source]
Args:

pred (Tensor): of shape (N, C, H, W). Predicted tensor.
target (Tensor): of shape (N, C, H, W). Ground truth tensor.
weight (Tensor, optional): of shape (N, C, H, W). Element-wise weights. Default: None.
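
A minimal usage sketch based on the signatures above; the tensors are random placeholders and follow the (N, C, H, W) shape documented for forward().

import torch
from pyiqa.losses import CharbonnierLoss

# Random stand-ins for a restored image batch and its ground truth, shape (N, C, H, W).
pred = torch.rand(4, 3, 64, 64, requires_grad=True)
target = torch.rand(4, 3, 64, 64)

criterion = CharbonnierLoss(loss_weight=1.0, reduction='mean', eps=1e-12)
loss = criterion(pred, target)  # scalar, since reduction='mean'
loss.backward()                 # smooth near zero thanks to eps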

class pyiqa.losses.L1Loss(loss_weight=1.0, reduction='mean')[source]

Bases: torch.nn.Module

L1 (mean absolute error, MAE) loss.

Args:

loss_weight (float): Loss weight for L1 loss. Default: 1.0.
reduction (str): Specifies the reduction to apply to the output. Supported choices are ‘none’ | ‘mean’ | ‘sum’. Default: ‘mean’.

forward(pred, target, weight=None, **kwargs)[source]
Args:

pred (Tensor): of shape (N, C, H, W). Predicted tensor.
target (Tensor): of shape (N, C, H, W). Ground truth tensor.
weight (Tensor, optional): of shape (N, C, H, W). Element-wise weights. Default: None.
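
A short sketch of the same interface with an element-wise weight map; the particular mask below is only an illustration, not something the class requires.

import torch
from pyiqa.losses import L1Loss

pred = torch.rand(2, 3, 32, 32)
target = torch.rand(2, 3, 32, 32)

# Optional element-wise weights with the same shape as pred/target;
# here the left half of every image is down-weighted as an example.
weight = torch.ones(2, 3, 32, 32)
weight[..., :16] = 0.5

criterion = L1Loss(loss_weight=1.0, reduction='mean')
loss = criterion(pred, target, weight=weight)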

class pyiqa.losses.MSELoss(loss_weight=1.0, reduction='mean')[source]

Bases: torch.nn.Module

MSE (L2) loss.

Args:

loss_weight (float): Loss weight for MSE loss. Default: 1.0.
reduction (str): Specifies the reduction to apply to the output. Supported choices are ‘none’ | ‘mean’ | ‘sum’. Default: ‘mean’.

forward(pred, target, weight=None, **kwargs)[source]
Args:

pred (Tensor): of shape (N, C, H, W). Predicted tensor.
target (Tensor): of shape (N, C, H, W). Ground truth tensor.
weight (Tensor, optional): of shape (N, C, H, W). Element-wise weights. Default: None.
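
A sketch of the 'none' reduction, which returns the per-element loss map instead of a single averaged value.

import torch
from pyiqa.losses import MSELoss

pred = torch.rand(2, 3, 32, 32)
target = torch.rand(2, 3, 32, 32)

criterion = MSELoss(loss_weight=1.0, reduction='none')
loss_map = criterion(pred, target)  # shape (2, 3, 32, 32), one value per element
print(loss_map.shape)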

class pyiqa.losses.WeightedTVLoss(loss_weight=1.0, reduction='mean')[source]

Bases: L1Loss

Weighted TV loss.

Args:

loss_weight (float): Loss weight. Default: 1.0.

forward(pred, weight=None)[source]
Args:

pred (Tensor): of shape (N, C, H, W). Predicted tensor.
weight (Tensor, optional): of shape (N, C, H, W). Element-wise weights. Default: None.
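
A sketch of the TV regularizer; note that forward() takes only the image (plus an optional weight map), since total variation needs no ground-truth tensor.

import torch
from pyiqa.losses import WeightedTVLoss

# A single image batch of shape (N, C, H, W); TV loss penalizes differences
# between neighbouring pixels, so there is no target argument.
pred = torch.rand(1, 3, 64, 64, requires_grad=True)

criterion = WeightedTVLoss(loss_weight=0.01)
tv = criterion(pred)                                       # unweighted TV loss
tv_masked = criterion(pred, weight=torch.ones_like(pred))  # with a weight map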

class pyiqa.losses.EMDLoss(loss_weight=1.0, r=2, reduction='mean')[source]

Bases: torch.nn.Module

EMD (earth mover distance) loss.

forward(pred, target, weight=None, **kwargs)[source]
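
The documented signature does not fix the input shapes, so the sketch below assumes pred and target are per-image score distributions of shape (N, num_bins), e.g. 10-bin quality histograms; that shape is an assumption, not something stated above.

import torch
from pyiqa.losses import EMDLoss

# Assumption: probability distributions over score bins, shape (N, num_bins),
# each row summing to 1.
pred = torch.softmax(torch.randn(4, 10), dim=1)
target = torch.softmax(torch.randn(4, 10), dim=1)

criterion = EMDLoss(loss_weight=1.0, r=2, reduction='mean')
loss = criterion(pred, target)
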
class pyiqa.losses.PLCCLoss(loss_weight=1.0)[source]

Bases: torch.nn.Module

PLCC loss, derived from Pearson's linear correlation coefficient.

forward(pred, target)[source]
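
forward() takes only pred and target; the sketch below assumes they are 1-D batches of predicted and ground-truth quality scores, which is an assumption based on how correlation-style losses are typically used.

import torch
from pyiqa.losses import PLCCLoss

# Assumption: one scalar quality score per image, shape (N,).
pred = torch.randn(16)
target = torch.randn(16)

criterion = PLCCLoss(loss_weight=1.0)
loss = criterion(pred, target)  # scalar driven by the linear correlation between pred and target
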
class pyiqa.losses.NiNLoss(loss_weight=1.0, p=1, q=2)[source]

Bases: torch.nn.Module

NiN (Norm in Norm) loss.

Reference:

This loss can be described as the L1 distance between normalize(pred - pred_mean) and normalize(target - target_mean).

forward(pred, target)[source]
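
A sketch in the same spirit: both inputs are treated as batches of quality scores that are mean-centered and normalized before being compared as described above; the 1-D shape is an assumption.

import torch
from pyiqa.losses import NiNLoss

# Assumption: batches of predicted and ground-truth scores, shape (N,).
# Normalization is over the batch, so N should be greater than 1.
pred = torch.randn(32)
target = torch.randn(32)

criterion = NiNLoss(loss_weight=1.0, p=1, q=2)
loss = criterion(pred, target)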