Memorization-Informed Frechet Inception Distance (MiFID)¶
Module Interface¶
- class torchmetrics.image.mifid.MemorizationInformedFrechetInceptionDistance(feature=2048, reset_real_features=True, normalize=False, cosine_distance_eps=0.1, **kwargs)[source]¶
Calculate Memorization-Informed Frechet Inception Distance (MIFID).
MIFID is an improved variant of the Frechet Inception Distance (FID) that penalizes memorization of the training set by the generator. It is calculated as

\[MIFID = \frac{FID(F_{real}, F_{fake})}{M(F_{real}, F_{fake})}\]

where \(FID\) is the normal FID score and \(M\) is the memorization penalty. The memorization penalty essentially corresponds to the average minimum cosine distance between the features of the real and fake distributions.
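A minimal NumPy sketch of this penalty may help make the idea concrete. This is a simplification of the internal computation (for instance, the library also takes the absolute value of the cosine similarity), and the `memorization_penalty` name is invented for illustration:

```python
import numpy as np

def memorization_penalty(real_feats, fake_feats, eps=0.1):
    """Mean minimum cosine distance between fake and real feature rows.

    If the mean exceeds eps, no memorization is assumed and the
    penalty collapses to 1 (i.e. MIFID reduces to plain FID).
    """
    real = real_feats / np.linalg.norm(real_feats, axis=1, keepdims=True)
    fake = fake_feats / np.linalg.norm(fake_feats, axis=1, keepdims=True)
    dist = 1.0 - fake @ real.T          # (n_fake, n_real) cosine distances
    mean_min = dist.min(axis=1).mean()  # closest real neighbour per fake sample
    return mean_min if mean_min < eps else 1.0
```

Since MIFID divides FID by this penalty, generated features that sit unusually close to real ones (a small penalty) inflate the score.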
Using the default feature extraction (Inception v3 using the original weights from fid ref2), the input is expected to be mini-batches of 3-channel RGB images of shape (3 x H x W). If the argument `normalize` is `True`, images are expected to be of dtype `float` with values in the `[0, 1]` range; if `normalize` is set to `False`, images are expected to be of dtype `uint8` with values in the `[0, 255]` range. All images will be resized to 299 x 299, which is the size of the original training data. The boolean flag `real` determines whether the images should update the statistics of the real distribution or the fake distribution.
Hint: Using this metric requires that `scipy` is installed. Either install as `pip install torchmetrics[image]` or `pip install scipy`.
Hint: Using this metric with the default feature extractor requires that `torch-fidelity` is installed. Either install as `pip install torchmetrics[image]` or `pip install torch-fidelity`.

As input to `forward` and `update` the metric accepts the following input:

- `imgs` (`Tensor`): tensor with images fed to the feature extractor
- `real` (`bool`): bool indicating if `imgs` belong to the real or the fake distribution
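The two dtype conventions for `imgs` can be illustrated as follows (shapes are arbitrary examples, and NumPy is used here purely for illustration):

```python
import numpy as np

# normalize=False (default): uint8 images with values in [0, 255]
imgs_uint8 = np.random.randint(0, 256, size=(4, 3, 299, 299), dtype=np.uint8)

# normalize=True: float images with values in [0, 1]
imgs_float = imgs_uint8.astype(np.float32) / 255.0
```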
As output of `forward` and `compute` the metric returns the following output:

- `mifid` (`Tensor`): float scalar tensor with mean MIFID value over samples
- Parameters:
  - `feature` (`Union[int, Module]`) – Either an integer or an `nn.Module`:
    - an integer indicates the Inception v3 feature layer to choose. Can be one of the following: 64, 192, 768, 2048
    - an `nn.Module` for using a custom feature extractor. Expects that its forward method returns an `(N, d)` matrix where `N` is the batch size and `d` is the feature size.
  - `reset_real_features` (`bool`) – Whether to also reset the real features. Since in many cases the real dataset does not change, the features can be cached to avoid the cost of recomputing them. Set this to `False` if your dataset does not change.
  - `normalize` (`bool`) – Whether to normalize the input images. If `True`, the input is expected to be in the `[0, 1]` range and is converted to `uint8` internally. If `False`, the input is expected to already be in the `[0, 255]` range and of type `uint8`. If a custom feature extractor is used, this argument is ignored.
  - `cosine_distance_eps` (`float`) – Epsilon value for the cosine distance. If the cosine distance is larger than this value, it is set to 1 and thus ignored in the MIFID calculation.
  - `kwargs` (`Any`) – Additional keyword arguments; see Advanced metric settings for more info.
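For the `nn.Module` option, any module whose forward method returns an `(N, d)` tensor will do. A hypothetical minimal extractor is sketched below; the `PooledFeatures` name and its pooling scheme are invented for illustration and are not part of the library:

```python
import torch
from torch import nn

class PooledFeatures(nn.Module):
    """Toy feature extractor: per-channel global average pooling
    followed by a linear projection to a 64-dim feature vector."""

    def __init__(self, channels=3, dim=64):
        super().__init__()
        self.proj = nn.Linear(channels, dim)

    def forward(self, imgs):
        # imgs: (N, C, H, W) -> per-channel mean -> (N, C) -> (N, d)
        pooled = imgs.float().mean(dim=(2, 3))
        return self.proj(pooled)

# It would then be passed to the metric constructor, e.g.:
# mifid = MemorizationInformedFrechetInceptionDistance(feature=PooledFeatures())
```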
- Raises:
  - `RuntimeError` – If `torch` version is less than 1.10
  - `ValueError` – If `feature` is set to an `int` and `torch-fidelity` is not installed
  - `ValueError` – If `feature` is set to an `int` not in `[64, 192, 768, 2048]`
  - `TypeError` – If `feature` is not a `str`, `int` or `torch.nn.Module`
  - `ValueError` – If `reset_real_features` is not a `bool`
- Example::

  >>> import torch
  >>> from torchmetrics.image.mifid import MemorizationInformedFrechetInceptionDistance
  >>> mifid = MemorizationInformedFrechetInceptionDistance(feature=64)
  >>> # generate two slightly overlapping image intensity distributions
  >>> imgs_dist1 = torch.randint(0, 200, (100, 3, 299, 299), dtype=torch.uint8)
  >>> imgs_dist2 = torch.randint(100, 255, (100, 3, 299, 299), dtype=torch.uint8)
  >>> mifid.update(imgs_dist1, real=True)
  >>> mifid.update(imgs_dist2, real=False)
  >>> mifid.compute()
  tensor(3003.3691)
- plot(val=None, ax=None)[source]¶
Plot a single or multiple values from the metric.
- Parameters:
  - `val` (`Union[Tensor, Sequence[Tensor], None]`) – Either a single result from calling `metric.forward` or `metric.compute`, or a list of these results. If no value is provided, will automatically call `metric.compute` and plot that result.
  - `ax` (`Optional[Axes]`) – A matplotlib axis object. If provided, will add plot to that axis.
- Returns:
  Figure and Axes object
- Raises:
  - `ModuleNotFoundError` – If matplotlib is not installed
>>> # Example plotting a single value
>>> import torch
>>> from torchmetrics.image.mifid import MemorizationInformedFrechetInceptionDistance
>>> imgs_dist1 = torch.randint(0, 200, (100, 3, 299, 299), dtype=torch.uint8)
>>> imgs_dist2 = torch.randint(100, 255, (100, 3, 299, 299), dtype=torch.uint8)
>>> metric = MemorizationInformedFrechetInceptionDistance(feature=64)
>>> metric.update(imgs_dist1, real=True)
>>> metric.update(imgs_dist2, real=False)
>>> fig_, ax_ = metric.plot()
>>> # Example plotting multiple values
>>> import torch
>>> from torchmetrics.image.mifid import MemorizationInformedFrechetInceptionDistance
>>> imgs_dist1 = lambda: torch.randint(0, 200, (100, 3, 299, 299), dtype=torch.uint8)
>>> imgs_dist2 = lambda: torch.randint(100, 255, (100, 3, 299, 299), dtype=torch.uint8)
>>> metric = MemorizationInformedFrechetInceptionDistance(feature=64)
>>> values = []
>>> for _ in range(3):
...     metric.update(imgs_dist1(), real=True)
...     metric.update(imgs_dist2(), real=False)
...     values.append(metric.compute())
...     metric.reset()
>>> fig_, ax_ = metric.plot(values)