Ably has released Moderation for Ably Chat, a feature that filters inappropriate content and protects users in real time. The moderation engine runs in two modes, before-publish and after-publish, so developers can choose the approach that best suits their needs. It integrates with Hive, a moderation provider specializing in AI-based content review, through two options: Hive model-only (before publish) and Hive Dashboard (after publish). For teams that want full control, or that need a moderation provider that is not natively supported, Ably also supports custom moderation via AWS Lambda. Moderation is fully integrated into the Ably dashboard and message pipeline, letting developers apply rules, control retry policies, track errors, and combine automation with human review for more nuanced decisions. The feature is available now in Ably Chat, enabling developers to keep their rooms safer without slowing down their apps.
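To illustrate the custom moderation path, the sketch below shows what an AWS Lambda handler for reviewing chat messages might look like. The request and response shapes, field names, and the blocklist check are illustrative assumptions rather than Ably's actual contract; the exact payload a rule receives and the verdict format it must return are defined in Ably's moderation documentation.

```typescript
// A minimal sketch of a custom moderation Lambda, under assumed payload and
// response shapes. Consult Ably's moderation docs for the real contract.

// Assumed shape of the message forwarded to the Lambda for review.
interface ModerationRequest {
  roomId: string;
  message: {
    clientId: string;
    text: string;
  };
}

// Assumed verdict returned to the moderation pipeline.
interface ModerationResponse {
  action: 'accept' | 'reject';
  reason?: string;
}

// Hypothetical blocklist standing in for a call to a real moderation provider.
const BLOCKED_TERMS = ['spamword', 'slur-example'];

export const handler = async (
  event: ModerationRequest
): Promise<ModerationResponse> => {
  const text = event.message.text.toLowerCase();

  // Reject the message if it contains a blocked term.
  const hit = BLOCKED_TERMS.find((term) => text.includes(hit ?? term));
  if (hit) {
    return { action: 'reject', reason: `blocked term: ${hit}` };
  }

  // Otherwise let the message through unchanged.
  return { action: 'accept' };
};
```

With a handler along these lines attached to a before-publish rule, clearly unacceptable messages can be stopped before they reach the room, while borderline cases can still be escalated to human review as described above.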