In this work we describe the mathematical background of Generative Adversarial Networks (GANs). We discuss the optimization criteria of the Jensen-Shannon GAN, the f-GAN, the Wasserstein GAN and the Sobolev GAN. For the f-GAN we derive a variational formulation, which is closely related to the Wasserstein and Sobolev GAN criteria. The focus of the thesis then shifts towards Sobolev GANs. We show that optimizing the criterion belonging to these GANs can be reduced to a density estimation problem. The main result of this thesis is the convergence rate of Sobolev GANs when the data lies on a simple manifold. The manifold used is a toy model and not realistic, but it gives insight into the robustness of Sobolev GANs when data lies on a more general manifold. We use kernel density estimators to establish this convergence rate. When the function space over which the GAN is optimized is the $L^2$-Sobolev space, we find the following rates: for $\beta \in \mathbb{R}_+$ with $0 < \beta < d/2$ we obtain the rate $n^{-\beta/(d+3\eta)}$, where $\eta \in \mathbb{R}_+$ is an arbitrarily small number and $d$ is the full ambient dimension of the observations, and for $\beta > d/2$ we obtain the parametric rate $n^{-1/2}$. The rates do not depend on the dimension $d_0$ of the manifold on which the data lies.
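For readability, the two regimes of the main result can be displayed side by side. This restates the rates above with no new assumptions; $\mathrm{rate}(n)$ is only an informal label introduced here, not notation from the thesis.

\[
\mathrm{rate}(n) =
\begin{cases}
n^{-\beta/(d+3\eta)}, & 0 < \beta < d/2 \quad (\eta > 0 \text{ arbitrarily small}),\\[2pt]
n^{-1/2}, & \beta > d/2 \quad \text{(parametric rate)}.
\end{cases}
\]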