Company
Infor
Date Published
-
Author
-
Word count
918
Language
English
Hacker News points
None

Summary

Infor has integrated generative AI capabilities into its cloud-based multi-tenant solutions, using the Infor OS platform to deliver a unified cloud experience that enhances functionality, security, and system interoperability for users, developers, and businesses. The company transitioned its chat assistant, Coleman DA, from Amazon Lex to a more flexible, LLM-powered platform that can handle complex queries, generate dynamic content, provide intelligent automation, and integrate seamlessly with ML models, APIs, and CloudSuite applications across the ecosystem. Infor's Generative AI team built the platform on Amazon Bedrock with three key components: GenAI Embedded Experiences, which embed generative AI features directly into applications; GenAI Knowledge Hub, which enhances document retrieval; and GenAI Assistant, which provides more intelligent, context-aware interactions. LangGraph has been instrumental to Infor's multi-agent workflows, providing a flexible yet structured approach to managing complex interactions. The company strengthened LLM observability and compliance with LangSmith tracing, which enables monitoring of inference performance, model behavior and quality, data and model integrity, compliance and security, and transparency and accountability. The generative AI initiative helps Infor maintain its innovative edge, build customer confidence in its enterprise solutions, streamline report generation, automate content creation, and improve knowledge retrieval, while empowering customers to apply AI to their businesses and customize AI agents for their own use cases.
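
The summarized post does not include code, but a minimal sketch of the kind of LangGraph multi-agent routing it describes might look like the following: a router node dispatches a request to one of two downstream agents. The state fields, node names, and routing rule are illustrative assumptions, not Infor's implementation; in practice each node would call a Bedrock-hosted model (for example via the langchain-aws integration).

# Minimal LangGraph sketch (illustrative only, assuming LangGraph's StateGraph API):
# a router node sends a request to a hypothetical knowledge-retrieval agent or a
# report-generation agent.
from typing import TypedDict

from langgraph.graph import StateGraph, START, END

class AssistantState(TypedDict):
    question: str   # the user's request
    route: str      # which agent the router selected
    answer: str     # the final response

def router(state: AssistantState) -> dict:
    # Placeholder routing rule; a real system would classify the request
    # with an LLM (e.g. a Bedrock-hosted model).
    route = "knowledge" if "how" in state["question"].lower() else "report"
    return {"route": route}

def knowledge_agent(state: AssistantState) -> dict:
    # Stand-in for retrieval against something like the GenAI Knowledge Hub.
    return {"answer": f"[retrieved answer for: {state['question']}]"}

def report_agent(state: AssistantState) -> dict:
    # Stand-in for automated report/content generation.
    return {"answer": f"[generated report for: {state['question']}]"}

builder = StateGraph(AssistantState)
builder.add_node("router", router)
builder.add_node("knowledge_agent", knowledge_agent)
builder.add_node("report_agent", report_agent)

builder.add_edge(START, "router")
builder.add_conditional_edges(
    "router",
    lambda state: state["route"],
    {"knowledge": "knowledge_agent", "report": "report_agent"},
)
builder.add_edge("knowledge_agent", END)
builder.add_edge("report_agent", END)

graph = builder.compile()

if __name__ == "__main__":
    # With LANGSMITH_TRACING=true and LANGSMITH_API_KEY set in the environment,
    # this invocation is traced to LangSmith automatically, which is the kind of
    # run-level observability the summary refers to.
    print(graph.invoke({"question": "How do I configure the expense report?"}))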