Human language is inherently ambiguous and variable, which makes it difficult for large language models (LLMs) to reliably produce structured data, such as JSON extracted from unstructured text like receipts. Constrained generation addresses this by guaranteeing that the model emits only valid output, for example by setting the probability of any token that would violate the target format to zero. Guardrails, a tool built on constrained generation, lets users define schemas and output formats, simplifying the conversion of unstructured text into structured data while preserving validity. The technique does have limitations, most notably sensitivity to tokenizer differences and added latency when inference runs on a remote model. Even so, Guardrails provides a streamlined interface for both local and remote models, and supports alternatives such as function calling and prompt engineering where constrained decoding falls short; related tools like JSONFormer further strengthen JSON generation capabilities.
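As a minimal sketch of the "zero out invalid tokens" idea (not the actual Guardrails or JSONFormer implementation), the example below decodes against a toy vocabulary and a hand-written grammar for objects of the form `{"total": <digit>}`. Every name here (`allowed_next`, `constrained_decode`, the vocabulary, the scores) is hypothetical and chosen for illustration: the unconstrained model always prefers the invalid token `cat`, but masking its probability to zero forces a valid JSON object.

```python
import json

# Hypothetical token scores from a toy "model". Unconstrained argmax
# would always pick 'cat', which can never start or extend valid JSON.
SCORES = {'{': 0.01, '}': 0.25, '"total"': 0.02, ':': 0.02,
          '0': 0.05, '1': 0.1, '7': 0.2, 'cat': 0.5}

def allowed_next(prefix: str) -> set:
    """Hand-written grammar: which tokens may legally follow this prefix?"""
    if prefix == '':
        return {'{'}
    if prefix == '{':
        return {'"total"'}
    if prefix == '{"total"':
        return {':'}
    if prefix == '{"total":':
        return {'0', '1', '7'}            # value must start with a digit
    if prefix.rstrip('017') == '{"total":' and prefix != '{"total":':
        return {'0', '1', '7', '}'}       # more digits, or close the object
    return set()                          # complete: nothing may follow

def constrained_decode(scores: dict, prefix: str):
    """Zero the probability of grammar-invalid tokens, then take argmax."""
    legal = allowed_next(prefix)
    masked = {tok: (p if tok in legal else 0.0) for tok, p in scores.items()}
    if sum(masked.values()) == 0.0:
        return None                       # no legal continuation: stop
    return max(masked, key=masked.get)

# Greedy decoding loop under the constraint.
out = ''
while (tok := constrained_decode(SCORES, out)) is not None:
    out += tok

print(out)                    # {"total":7}
print(json.loads(out))        # parses cleanly: {'total': 7}
```

Real constrained-generation libraries apply the same masking at the logit level inside the model's sampling loop, where the grammar is derived from a user-supplied schema rather than hard-coded; the complications mentioned above (tokenizer variations, remote-inference latency) arise because the mask must be computed per token, per step.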