New Relic is extending its observability experience with a new offering that helps artificial intelligence (AI) and machine learning (ML) teams break down visibility silos. It gives AI/ML and DevOps teams a single place to monitor and visualize critical signals such as recall, precision, and model accuracy alongside their apps and infrastructure.

Monitoring machine learning models is crucial for several reasons, including performance tracking, data quality assurance, compliance and fairness, user experience, and business impact. Functional monitoring assesses the performance and accuracy of a model's predictive capabilities, while operational monitoring covers the deployment and runtime aspects of ML models, focusing on how they interact with the production environment.

By using New Relic for ML model performance monitoring, development and data science teams can bring their own ML data into New Relic, create custom dashboards, apply predictive alerts, review ML model telemetry data, collaborate in a production environment, and make data-driven decisions. The platform also supports best practices such as choosing appropriate metrics, real-time monitoring, monitoring data quality, alerting and notification systems, resource monitoring, automated retraining, and fairness monitoring, so teams get instant value from their model telemetry.

New Relic's open-source ecosystem offers flexible quickstarts to start getting value from ML model data faster, with integrations for leading data science platforms such as AWS SageMaker, DataRobot (Algorithmia), Aporia, Superwise, Comet, DAGsHub, Mona, and TruEra.
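As a concrete illustration of the functional-monitoring signals mentioned above, here is a minimal sketch of how accuracy, precision, and recall can be computed for a batch of binary predictions. This is plain Python for clarity; in practice a library such as scikit-learn provides the same metrics.

```python
def binary_classification_metrics(y_true, y_pred):
    """Return accuracy, precision, and recall for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    # Guard against division by zero when a class is never predicted/present.
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall}

# Example batch: ground-truth labels vs. model predictions.
metrics = binary_classification_metrics(
    y_true=[1, 0, 1, 1, 0, 0, 1, 0],
    y_pred=[1, 0, 0, 1, 0, 1, 1, 0],
)
print(metrics)  # {'accuracy': 0.75, 'precision': 0.75, 'recall': 0.75}
```

Signals like these, computed per scoring batch or time window, are exactly the values a team would chart on a dashboard and alert on when they drift.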