PromoterGPT is a project that uses a decoder-only transformer to generate new DNA promoter sequences, the regulatory regions that control gene expression. The goal is to teach the model to write plausible DNA by training it on 200-base-pair promoter sequences, tokenized into overlapping 3-base k-mers drawn from a custom vocabulary (a sketch of this step follows below). The model itself is a deliberately small GPT-2 variant, with two transformer layers and eight attention heads, trained to predict the next k-mer in a sequence. After training, the model generates novel sequences, which are evaluated for biological plausibility by checking their GC content and scanning for sequence motifs common in natural promoters. The generated sequences show realistic GC content and contain such motifs, suggesting the model has learned some of the grammatical rules of DNA. Future directions include testing the synthetic sequences' functionality in biological systems and experimenting with other model architectures and genomic regions.
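
To make the tokenization step concrete, here is a minimal sketch of overlapping k-mer splitting and vocabulary building. The function names and the special tokens are illustrative assumptions, not the project's actual code:

```python
def kmerize(seq: str, k: int = 3) -> list[str]:
    """Split a DNA sequence into overlapping k-mers (stride 1)."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def build_vocab(sequences: list[str], k: int = 3) -> dict[str, int]:
    """Map every k-mer observed in the corpus to an integer id."""
    vocab = {"<pad>": 0, "<bos>": 1, "<eos>": 2}  # assumed special tokens
    for seq in sequences:
        for kmer in kmerize(seq, k):
            vocab.setdefault(kmer, len(vocab))
    return vocab

# A 200-bp promoter yields 200 - 3 + 1 = 198 overlapping 3-mers.
promoter = "ATGCGTATAAAGGC"  # toy sequence for illustration
print(kmerize(promoter)[:5])  # ['ATG', 'TGC', 'GCG', 'CGT', 'GTA']
```

With k = 3 there are only 4³ = 64 possible k-mers, so the vocabulary stays tiny compared with a natural-language tokenizer.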
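
The small-scale model setup (two layers, eight heads) can be expressed with a standard GPT-2 configuration. This is a hedged sketch using the Hugging Face `transformers` API; the embedding width and context length are assumptions, and the project's actual hyperparameters may differ:

```python
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config(
    vocab_size=67,    # 64 possible 3-mers + 3 assumed special tokens
    n_positions=256,  # room for the 198 k-mer tokens of a 200-bp promoter
    n_embd=128,       # assumed embedding width (must divide evenly by n_head)
    n_layer=2,        # two decoder layers, per the write-up
    n_head=8,         # eight attention heads, per the write-up
)
model = GPT2LMHeadModel(config)
print(f"{model.num_parameters():,} parameters")  # small-scale by design
```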
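
Finally, a minimal sketch of the two plausibility checks: GC content and a scan for a well-known promoter motif. The TATA-box pattern below is a standard consensus; which motifs the project actually scans for is an assumption here:

```python
import re

def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

TATA_BOX = re.compile(r"TATA[AT]A[AT]")  # classic TATA-box consensus

generated = "GGCCTATAAATAGCGC"  # toy generated sequence for illustration
print(f"GC content: {gc_content(generated):.2f}")
print("TATA box found" if TATA_BOX.search(generated) else "no TATA box")
```

Generated sequences whose GC fraction falls in the range typical of real promoters, and which contain motifs like this one, are taken as evidence that the model has picked up genuine promoter structure rather than random base statistics.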