Agentic AI at Scale: Marsh McLennan Saves 1M+ Hours
Blog post from Predibase
Large Language Models (LLMs) offer significant potential for enterprise efficiency, productivity, and savings, but fine-tuning is essential for specific use cases, as demonstrated by Marsh McLennan's implementation of LenAI. By working with Predibase to fine-tune their models, Marsh McLennan improved the accuracy and responsiveness of LenAI, an AI assistant designed to leverage institutional knowledge and provide industry expertise. This customization addressed challenges like intent recognition, which was problematic with off-the-shelf models such as GPT-3.5, and resulted in a 7-12% increase in accuracy and reduced latency.

Under the leadership of Chief Information and Operations Officer Paul Beswick, the company adopted a proactive approach to generative AI, deploying APIs and launching LenAI to its global workforce, which now handles around 20 million requests annually.

The success of LenAI has translated into a productivity boost, saving over 1 million hours in its first year, and has empowered employees to experiment with the technology and innovate further. By democratizing access to AI tools and fostering a culture of innovation, Marsh McLennan continues to expand LenAI's capabilities and transform enterprise knowledge management.