Generative Adversarial Networks (GANs), introduced by Ian Goodfellow and colleagues in 2014, are a compelling neural network architecture in which two networks, a Generator and a Discriminator, compete to improve data generation. Despite their innovative design, GANs face practical challenges such as mode collapse, vanishing gradients, and unstable convergence, many of which trace back to the original min-max loss. Variants such as the Non-Saturating GAN loss, along with alternative formulations like the Wasserstein GAN (WGAN) and the Conditional GAN (CGAN), modify the loss framework to provide more stable training and more diverse outputs. These alternatives mitigate the failure modes of the original formulation and illustrate how the evolution of GAN loss functions has shaped their practical applications.
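To make the vanishing-gradient problem concrete, here is a minimal sketch (the function names and the numerical-gradient helper are illustrative, not from any particular library) comparing the original saturating generator loss, log(1 - D(G(z))), with the non-saturating variant, -log(D(G(z))). Early in training the Discriminator confidently rejects fakes, so D(G(z)) is near zero; the saturating loss then yields an almost flat gradient, while the non-saturating loss still provides a strong learning signal:

```python
import math

def saturating_g_loss(d_fake):
    # Original min-max generator loss: log(1 - D(G(z)))
    return math.log(1.0 - d_fake)

def non_saturating_g_loss(d_fake):
    # Non-saturating variant: -log(D(G(z)))
    return -math.log(d_fake)

def grad(f, x, eps=1e-6):
    # Central-difference numerical derivative w.r.t. the
    # discriminator's score on a generated sample
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

# Early in training: D confidently rejects fakes, D(G(z)) ~= 0.01
weak = 0.01
g_sat = grad(saturating_g_loss, weak)       # about -1.01: weak signal
g_ns = grad(non_saturating_g_loss, weak)    # about -100: strong signal
print(g_sat, g_ns)
```

Both losses push D(G(z)) toward 1, but the gradient magnitude of the non-saturating loss grows as the Generator gets worse, which is exactly the regime where it needs the most correction.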