A new LangChain template, built in collaboration with the GPT Researcher team, offers a research assistant that diverges from the typical chat-based user experience (UX) of many language model applications. Instead of answering in a single conversational turn, the template produces long-form research reports: it generates sub-questions from the user's query, retrieves and summarizes relevant documents for each, and combines the summaries into a final report, all composed from LangChain components. Unlike chat applications, which are bound by latency expectations and often need a human in the loop for accuracy, this approach allows longer-running, more autonomous operation and yields high-quality outputs that users can inspect and modify. Integration with LangSmith adds observability, letting users trace each sub-question, retrieval, and summarization step of a run. The template uses OpenAI and Tavily, a search engine optimized for AI workloads, by default, but it can be customized to work with other models and data sources. This shift toward non-chat, longer-running applications reflects growing demand for AI solutions that prioritize quality over speed, and a broader trend toward more autonomous agent frameworks.
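To make the generate-search-summarize-write flow concrete, here is a minimal sketch of that pipeline in LangChain, assuming the langchain-openai and langchain-community packages and OPENAI_API_KEY / TAVILY_API_KEY in the environment. The prompts, model name, and the research helper are illustrative choices, not the template's exact code.

```python
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4", temperature=0)
search = TavilySearchResults(max_results=3)

# 1. Turn the user's question into a handful of search sub-questions.
subq_chain = (
    ChatPromptTemplate.from_template(
        "Write 3 search queries, one per line, that together answer: {question}"
    )
    | llm
    | StrOutputParser()
)

# 2. Summarize the documents retrieved for one sub-question.
summarize_chain = (
    ChatPromptTemplate.from_template(
        "Summarize the following sources as they relate to '{query}':\n\n{sources}"
    )
    | llm
    | StrOutputParser()
)

# 3. Combine the per-question summaries into a long-form report.
report_chain = (
    ChatPromptTemplate.from_template(
        "Using these research notes, write a detailed report answering "
        "'{question}':\n\n{notes}"
    )
    | llm
    | StrOutputParser()
)

def research(question: str) -> str:
    """Illustrative end-to-end run: sub-questions -> search -> summaries -> report."""
    queries = subq_chain.invoke({"question": question}).splitlines()
    notes = []
    for query in filter(None, (q.strip() for q in queries)):
        results = search.invoke(query)  # list of {"url": ..., "content": ...} dicts
        sources = "\n\n".join(f"{r['url']}\n{r['content']}" for r in results)
        notes.append(summarize_chain.invoke({"query": query, "sources": sources}))
    return report_chain.invoke({"question": question, "notes": "\n\n".join(notes)})

print(research("How do transformer language models handle long context?"))
```

Because each step is an LCEL runnable, swapping Tavily for another retriever or data source is a local change, and enabling LangSmith tracing (for example, setting LANGCHAIN_TRACING_V2=true with a LANGCHAIN_API_KEY) records every sub-question, search call, and summary for later inspection.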