Building low-code applications with large language models (LLMs) on AWS EKS has become more accessible thanks to tools like Flowise, LocalAI, and Pulumi, which simplify the creation, deployment, and management of AI-powered workflows without requiring extensive coding knowledge. Flowise offers a drag-and-drop interface for orchestrating LLM workflows, while LocalAI provides a GPU-free, open-source alternative to OpenAI's API for running LLMs locally. Pulumi handles the cloud infrastructure deployment with minimal code.

The process involves setting up a low-code LLM application with Flowise and LocalAI, deploying it on AWS EKS with Pulumi and TypeScript, and creating a chatbot workflow that can be exposed via an API. This approach lets developers and AI enthusiasts experiment with LLMs efficiently and securely, particularly in environments where data privacy is paramount. The tools also support rapid prototyping, testing, and iteration of LLM models and workflows, and make it easy to tear down the infrastructure when it is no longer needed.
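To make the deployment step concrete, here is a minimal Pulumi TypeScript sketch of what provisioning an EKS cluster and running Flowise on it might look like. It is illustrative only: the resource names, instance sizes, and replica counts are assumptions, not values from the original setup, and it omits LocalAI and any production concerns such as persistence or ingress authentication.

```typescript
// Hypothetical sketch: provision an EKS cluster with Pulumi's @pulumi/eks
// component and deploy the Flowise container into it. All names and sizes
// here are illustrative assumptions.
import * as eks from "@pulumi/eks";
import * as k8s from "@pulumi/kubernetes";

// A small managed EKS cluster (adjust instance type and scaling to taste).
const cluster = new eks.Cluster("flowise-cluster", {
    instanceType: "t3.medium",
    desiredCapacity: 2,
    minSize: 1,
    maxSize: 3,
});

// Target the new cluster with a Kubernetes provider built from its kubeconfig.
const provider = new k8s.Provider("eks-provider", {
    kubeconfig: cluster.kubeconfig,
});

const appLabels = { app: "flowise" };

// Run Flowise (listens on port 3000 by default) as a single-replica Deployment.
const deployment = new k8s.apps.v1.Deployment("flowise", {
    spec: {
        selector: { matchLabels: appLabels },
        replicas: 1,
        template: {
            metadata: { labels: appLabels },
            spec: {
                containers: [{
                    name: "flowise",
                    image: "flowiseai/flowise:latest",
                    ports: [{ containerPort: 3000 }],
                }],
            },
        },
    },
}, { provider });

// Expose the Flowise UI and API through a LoadBalancer Service.
const service = new k8s.core.v1.Service("flowise-svc", {
    spec: {
        type: "LoadBalancer",
        selector: appLabels,
        ports: [{ port: 80, targetPort: 3000 }],
    },
}, { provider });

// Export the kubeconfig and the public endpoint of the service.
export const kubeconfig = cluster.kubeconfig;
export const endpoint =
    service.status.loadBalancer.ingress[0].hostname;
```

Running `pulumi up` would stand this stack up, and `pulumi destroy` tears everything down again when you are done experimenting, which is the flexibility the text refers to.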