How do we apply the Law of the Unconscious Statistician to Ian Goodfellow's original GANs value function formulation?
In this brief post we want to prove a passage from the original GANs paper by Ian Goodfellow. Specifically, we want to prove that the following equation is satisfied:
$$\int_z p_z(z)\log\big(1-D(G(z))\big)\,dz=\int_x p_g(x)\log\big(1-D(x)\big)\,dx$$

For a continuous random variable $z$, let $x=G(z)$ and $z=G^{-1}(x)$, and suppose that $G$ is differentiable and that its inverse $G^{-1}$ is monotonically increasing (so that $G'>0$). By the rule for differentiating inverse functions we have

$$\frac{dz}{dx}\cdot\frac{dx}{dz}=1$$

$$\frac{dz}{dx}=\frac{1}{\frac{dx}{dz}}$$

$$\frac{dz}{dx}=\frac{1}{\frac{d\,G(G^{-1}(x))}{d\,G^{-1}(x)}}$$

$$\frac{dz}{dx}=\frac{1}{G'(G^{-1}(x))}$$

$$dz=\frac{1}{G'(G^{-1}(x))}\,dx$$

so that by a change of variables we can rewrite everything as a function of $x$:

$$\int_{-\infty}^{\infty}p_z(z)\log\big(1-D(G(z))\big)\,dz=\int_{-\infty}^{\infty}p_z(z)\,\frac{1}{G'(G^{-1}(x))}\,\log\big(1-D(x)\big)\,dx$$

Now, notice that the cumulative distribution function $P_X(x):\mathbb{R}^n\to[0,1]$ is

$$P_X(x)=\Pr(X\le x)=\Pr(X_1\le x_1,X_2\le x_2,\dots,X_n\le x_n),$$

and we observe that:
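As a sanity check on the change of variables, here is a small numerical sketch (not from the original post) with hypothetical choices: $z\sim\mathcal{N}(0,1)$, the monotonically increasing generator $G(z)=e^z$, and the stand-in discriminator $D(x)=x/(1+x)$, which maps $x>0$ into $(0,1)$. Both sides of the rewritten integral are estimated with a simple midpoint rule:

```python
import math

# Hypothetical choices for illustration (not the GAN paper's actual G and D):
# z ~ N(0, 1), G(z) = exp(z) (monotonically increasing, differentiable),
# and D(x) = x / (1 + x), a stand-in discriminator with values in (0, 1).
def p_z(z):
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

def G(z):
    return math.exp(z)

def G_inv(x):
    return math.log(x)

def G_prime(z):
    return math.exp(z)

def D(x):
    return x / (1.0 + x)

def integrate(f, a, b, n=200_000):
    # Plain midpoint rule on [a, b].
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Left-hand side: integral over z of p_z(z) * log(1 - D(G(z))).
lhs = integrate(lambda z: p_z(z) * math.log(1.0 - D(G(z))), -8.0, 8.0)

# Right-hand side after the substitution x = G(z), dz = dx / G'(G^{-1}(x)):
# integral over x of p_z(G^{-1}(x)) * (1 / G'(G^{-1}(x))) * log(1 - D(x)).
rhs = integrate(
    lambda x: p_z(G_inv(x)) / G_prime(G_inv(x)) * math.log(1.0 - D(x)),
    math.exp(-8.0), math.exp(8.0),
)

print(lhs, rhs)  # the two estimates should agree closely
```

Any other monotonically increasing, differentiable $G$ would do; the point is only that the two integrals compute the same number.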
$$P_X(x)=\Pr(X\le x)=\Pr(G(Z)\le x)=\Pr(Z\le G^{-1}(x))=P_Z(G^{-1}(x))$$

From here

$$P_X(x)=P_Z(G^{-1}(x))=P_Z(z)$$

We take the derivative w.r.t. $x$:

$$\frac{dP_X(x)}{dx}=\frac{dP_Z(z)}{dx}$$

$$\frac{dP_X(x)}{dx}=\frac{dP_Z(z)}{dz}\,\frac{dz}{dx}$$

$$p_X(x)=p_z(z)\,\frac{1}{G'(G^{-1}(x))}$$

For clarity we rename $p_X(x)$ as $p_g(x)$, since it represents the distribution learned by our generator $G$, and we have

$$p_g(x)=p_z(z)\,\frac{1}{G'(G^{-1}(x))}$$

Substituting this back into the change-of-variables integral above gives

$$\int_{-\infty}^{\infty}p_z(z)\,\frac{1}{G'(G^{-1}(x))}\,\log\big(1-D(x)\big)\,dx=\int_{-\infty}^{\infty}p_g(x)\log\big(1-D(x)\big)\,dx,$$

which is exactly the equality we set out to prove.
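The density identity $p_g(x)=p_z(G^{-1}(x))\,/\,G'(G^{-1}(x))$ can also be checked by simulation. Below is a hedged sketch (again with the hypothetical $G(z)=e^z$ and $z\sim\mathcal{N}(0,1)$, not anything from the paper): the fraction of samples $X=G(Z)$ falling in an interval $[a,b]$ should match the integral of the derived density $p_g$ over that interval.

```python
import math
import random

random.seed(0)  # deterministic run for reproducibility

# Hypothetical monotonically increasing generator for illustration.
def G(z):
    return math.exp(z)

def G_inv(x):
    return math.log(x)

def G_prime(z):
    return math.exp(z)

def p_z(z):
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

def p_g(x):
    # Density of X = G(Z) per the derivation: p_z(G^{-1}(x)) / G'(G^{-1}(x)).
    z = G_inv(x)
    return p_z(z) / G_prime(z)

# Monte Carlo: fraction of samples X = G(Z) landing in [a, b].
a, b = 0.5, 2.0
n = 400_000
hits = sum(1 for _ in range(n) if a <= G(random.gauss(0.0, 1.0)) <= b)
empirical = hits / n

# Midpoint-rule integral of the derived density p_g over [a, b].
m = 10_000
h = (b - a) / m
theoretical = sum(p_g(a + (i + 0.5) * h) for i in range(m)) * h

print(empirical, theoretical)  # the two probabilities should be close
```

With $G(z)=e^z$ this $p_g$ is the lognormal density, so the agreement is exactly the textbook change-of-density formula at work.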
For attribution, please cite this work as
Bonvini (2020, Dec. 11). Last Week's Potatoes: LOTUS Theorem in original GANs formulation.. Retrieved from https://lastweekspotatoes.com/posts/2021-07-22-lotus-theorem-in-original-gans-formulation/
BibTeX citation
@misc{bonvini2020lotus,
author = {Bonvini, Andrea},
title = {Last Week's Potatoes: LOTUS Theorem in original GANs formulation.},
url = {https://lastweekspotatoes.com/posts/2021-07-22-lotus-theorem-in-original-gans-formulation/},
year = {2020}
}