To enforce the disentanglement of codes, InfoGAN proposed a regularizer to the original loss function that maximizes the mutual information between the latent codes c and the generator output, G(z, c):

min_G max_D V_I(D, G) = V(D, G) − λI(c; G(z, c)) (Equation 6.1.3)

where V(D, G) is the original GAN value function and λ is a small positive constant that weighs the regularizer.
The regularizer forces the generator to consider the latent codes when it formulates a function that synthesizes the fake images. In the field of information theory, the mutual information between the latent codes c and the generator output G(z, c) is defined as:

I(c; G(z, c)) = H(c) − H(c | G(z, c)) (Equation 6.1.4)
where H(c) is the entropy of the latent code c and H(c | G(z, c)) is the conditional entropy of c after observing the output of the generator, G(z, c). Entropy is a measure of the uncertainty of a random variable or an event. For example, a statement like "the sun rises in the east" has low entropy, whereas winning the jackpot in the lottery has high entropy.
In Equation 6.1.4, maximizing the mutual information means minimizing H(c | G(z, c)), or decreasing the uncertainty in the latent code upon observing the generated output. This makes sense since, for example, in the...